#!/usr/bin/env python3
# Copyright (c) 2018-2020 The Bitcoin Core developers
# Distributed under the MIT software license, see the accompanying
# file COPYING or http://www.opensource.org/licenses/mit-license.php.
"""Useful util functions for testing the wallet"""
from collections import namedtuple
from test_framework.address import (
    key_to_p2pkh,
    key_to_p2sh_p2wpkh,
    key_to_p2wpkh,
    script_to_p2sh,
    script_to_p2sh_p2wsh,
    script_to_p2wsh,
)
from test_framework.script import (
    CScript,
    OP_0,
    OP_2,
    OP_3,
    OP_CHECKMULTISIG,
    OP_CHECKSIG,
    OP_DUP,
    OP_EQUAL,
    OP_EQUALVERIFY,
    OP_HASH160,
    hash160,
    sha256,
)
from test_framework.util import hex_str_to_bytes
Key = namedtuple('Key', ['privkey',
                         'pubkey',
                         'p2pkh_script',
                         'p2pkh_addr',
                         'p2wpkh_script',
                         'p2wpkh_addr',
                         'p2sh_p2wpkh_script',
                         'p2sh_p2wpkh_redeem_script',
                         'p2sh_p2wpkh_addr'])

Multisig = namedtuple('Multisig', ['privkeys',
                                   'pubkeys',
                                   'p2sh_script',
                                   'p2sh_addr',
                                   'redeem_script',
                                   'p2wsh_script',
                                   'p2wsh_addr',
                                   'p2sh_p2wsh_script',
                                   'p2sh_p2wsh_addr'])

def get_key(node):
    """Generate a fresh key on node

    Returns a named tuple of privkey, pubkey and all address and scripts."""
    addr = node.getnewaddress()
    pubkey = node.getaddressinfo(addr)['pubkey']
    pkh = hash160(hex_str_to_bytes(pubkey))
    return Key(privkey=node.dumpprivkey(addr),
               pubkey=pubkey,
               p2pkh_script=CScript([OP_DUP, OP_HASH160, pkh, OP_EQUALVERIFY, OP_CHECKSIG]).hex(),
               p2pkh_addr=key_to_p2pkh(pubkey),
               p2wpkh_script=CScript([OP_0, pkh]).hex(),
               p2wpkh_addr=key_to_p2wpkh(pubkey),
               p2sh_p2wpkh_script=CScript([OP_HASH160, hash160(CScript([OP_0, pkh])), OP_EQUAL]).hex(),
               p2sh_p2wpkh_redeem_script=CScript([OP_0, pkh]).hex(),
               p2sh_p2wpkh_addr=key_to_p2sh_p2wpkh(pubkey))

def get_multisig(node):
    """Generate a fresh 2-of-3 multisig on node

    Returns a named tuple of privkeys, pubkeys and all address and scripts."""
    addrs = []
    pubkeys = []
    for _ in range(3):
        addr = node.getaddressinfo(node.getnewaddress())
        addrs.append(addr['address'])
        pubkeys.append(addr['pubkey'])
    script_code = CScript([OP_2] + [hex_str_to_bytes(pubkey) for pubkey in pubkeys] + [OP_3, OP_CHECKMULTISIG])
    witness_script = CScript([OP_0, sha256(script_code)])
    return Multisig(privkeys=[node.dumpprivkey(addr) for addr in addrs],
                    pubkeys=pubkeys,
                    p2sh_script=CScript([OP_HASH160, hash160(script_code), OP_EQUAL]).hex(),
                    p2sh_addr=script_to_p2sh(script_code),
                    redeem_script=script_code.hex(),
                    p2wsh_script=witness_script.hex(),
                    p2wsh_addr=script_to_p2wsh(script_code),
                    p2sh_p2wsh_script=CScript([OP_HASH160, witness_script, OP_EQUAL]).hex(),
                    p2sh_p2wsh_addr=script_to_p2sh_p2wsh(script_code))

def test_address(node, address, **kwargs):
    """Get address info for `address` and test whether the returned values are as expected."""
    addr_info = node.getaddressinfo(address)
    for key, value in kwargs.items():
        if value is None:
            if key in addr_info.keys():
                raise AssertionError("key {} unexpectedly returned in getaddressinfo.".format(key))
        elif addr_info[key] != value:
            raise AssertionError("key {} value {} did not match expected value {}".format(key, addr_info[key], value))
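The witness-script hashing in get_multisig can be mirrored without the test framework. As a minimal sketch (assuming only the standard library; the function name is illustrative, not part of the framework), a segwit v0 P2WSH scriptPubKey is OP_0 followed by a 32-byte push of the SHA256 of the witness script:

```python
import hashlib

def p2wsh_script_hex(witness_script: bytes) -> str:
    # 0x00 is OP_0, 0x20 pushes the 32-byte SHA256 of the witness script --
    # the same layout CScript([OP_0, sha256(script_code)]) serializes to above.
    return '0020' + hashlib.sha256(witness_script).hexdigest()
```

The resulting hex string is what get_multisig stores as `p2wsh_script`.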
|
The LOD announced at huddle that management will not be calling no-shows anymore. This announcement was made along with a reminder that 3 NC/NS in a row is voluntary job abandonment, and automatic termination.
I can understand that it might be labor-intensive to call people who can't properly keep track of their work schedules, especially if it happens very often, and at least part of the motivation for the change in policy might be to reduce payroll, and possibly staff, this time of year. However, I personally know of at least two examples where calling or following up on the unexplained absence of an otherwise reliable worker either saved someone's life or helped solve a crime. At the very least, even reliable workers who write down and double-check their schedules each week can get mixed up now and then, especially TM's who can have highly variable schedules from week-to-week.
I can predict that the first time Spot fails to follow up on someone who is living alone, doesn't have many close friends or relatives looking after them, and is later discovered to be face down on the kitchen floor knocked unconscious, with a broken hip, or even dead, there will be serious bad publicity, possibly even a lawsuit. Even worse than an accident or illness, someone might have been murdered or kidnapped, and abandoning that person to their fate without any followup would allow the investigative trail to go cold.
What is the policy and practice at your store?
Our store started this same policy around a year ago. At first no one liked it because, as you said, calling does save people's jobs, but now we all just accept it. In fact, I had completely forgotten that we used to call people until this thread came up. A side effect of this policy is that if you have some good friends at Spot, they will call/text you instead if they know you should be there with them.
During one really heavily shopped/understaffed holiday season, I missed half of a shift because I overslept. When you work till 1-2am and are scheduled to be back by 9-10am, it helps to know the store cares to call you.
And there was a 3 month period where, on multiple occasions, I was supposed to be off on the printed schedule but was somehow scheduled on the computer. The only way I knew I was scheduled was the LOD calling me.
To me this is going to lower morale even more and show that management doesn't care about its team members.
I know there were a couple incidents at my store that the reason why a TM didn't show up for their shift was that they were in the ER either overwhelmingly sick or injured in a car crash. If someone from the store didn't call, they wouldn't have known.
Our store is the same way. I am sure that if there was a legitimate reason why it happened, MAYBE it would slide. There is one thing that does concern me, though. In the mornings, when the overnight LOD opens the door for the 7am & 7:30am TM's, sometimes they open it late, and I know you have 5 mins to clock in or it counts you as late. But if the door isn't opened at the right time, or if the time clock is offline, you will be counted as late. I have had a few TM's mention this to me, and it has happened to me also. On my 6 month evaluation, I was "clocked in" as being late 16 times. When my ETL-GE told me this, I started laughing. Seriously, I did. I was never given a verbal warning, a written, or a CCA for those so-called "16 times", and I'm sure if I had been I would have been terminated, seeing as how I run the front and am considered a role model for our team. I did my own little investigating, and of those 16 times, 7 were punch corrected, 5 were because the LOD was late to the door, and the other 3 I was actually late, one being because I was in a car accident on my way to work. Should I have been spoken to about this on my evaluation as if I was doing something wrong? Because that is exactly how it was made out to be.
it's to get rid of team members plain and simple. My store quit doing it around a year ago as well.
It's also to minimize favoritism.
it's lame though, it definitely came in handy a few times when the ETL called and woke me up. It was also nice having a TL who gave a **** enough to text and wake me up. But of course he had to be fired!
Even so, I feel there are times when we should call. For instance, if it's not like a team member to no-show or to be late, then I'd probably give them a call. But I'm not calling someone who is consistently late or has attendance issues. Ultimately, if I call you, you're being coached anyway for not showing up. But it is the Team Member's responsibility to show up for their scheduled shifts.
Exactly, my store still calls NCNSs, but we don't call people who have clear attendance issues. People make mistakes, it's called "being human".
However, even if my store adopted this stupid policy, I'd still shoot a text to any TMs that I knew didn't have attendance issues. I'm all about following the rules, but some rules are stupid. I'll coach you for being late, but I'm not going to just lose good TMs b/c they're human beings.
Walking the plank for NCNS is a good thing!
We rolled this out like 3 years ago. We were told it wasn't brand to call anybody, whether it's a NCNS or they're 15 minutes late. If it's someone I know and I have their number in my phone, I'll sneak off and give a shout, but that's it.
Same here. We aren't allowed to call either.
We don't call anyone - from my point of view it's not brand to remind someone to work.
I have been with Target going on 10 years, and it has always been our policy that it is "NOT BRAND" to call. I was always told it was Target policy and they were not allowed to, even if it was someone who had never missed a day in 2 or 3 years and then failed to show up for work. No call from Target management.
If it's someone I know and I have their number in my phone, I'll sneak off and give a shout, but that's it.
I can't count the number of times I've had the Starbucks TL do this for a team member, or I've done it myself. I wonder if there would be any repercussions from leadership if they knew you did this though. Sure it would be very hard for them to find out, but still.
The other morning our new ETL-AP was supposed to be LOD and didn't come in, and they called him... wonder if that applies to ETL's... probably not.
this is our policy but it's not strictly followed. I will say that the TM's who have attendance issues are never called if they're late or NCNS, but if it's someone who is ALWAYS on time or ALWAYS there, then they will call them.
At my store we're never supposed to call. I think it's ridiculous, especially if the TM in question normally has very good attendance.
Then set an alarm? Seriously, I stay up pretty late every night (usually go to sleep between 2:30 and 4) and I've never missed an opening shift. Heck, I've never missed a PA shift that starts at 6. I set my main alarm and my phone alarm.
We do not call either.... However, I'm lucky enough to have had a smart ETL who decided to call and check to see if everything was ok. As a diabetic, I am so grateful there was someone to help me, or the situation could have been much worse. Other than these extreme circumstances, though, I'm on board with not calling no-shows. Let's get the irresponsible team members out of here.
If it's someone who hasn't had prior attendance issues, we'll call because it may've been an honest oversight or to see if something's wrong.
If it's a regular offender, they'll call just to hear their latest excuse, or leave a message, so as to coach 'em out the door.
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
#
# Copyright (C) 2017 Google
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** AUTO GENERATED CODE ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file at
# https://www.github.com/GoogleCloudPlatform/magic-modules
#
# ----------------------------------------------------------------------------
from __future__ import absolute_import, division, print_function
__metaclass__ = type
################################################################################
# Documentation
################################################################################
ANSIBLE_METADATA = {'metadata_version': '1.1', 'status': ["preview"], 'supported_by': 'community'}
DOCUMENTATION = '''
---
module: gcp_compute_route_info
description:
- Gather info for GCP Route
- This module was called C(gcp_compute_route_facts) before Ansible 2.9. The usage
  has not changed.
short_description: Gather info for GCP Route
version_added: 2.7
author: Google Inc. (@googlecloudplatform)
requirements:
- python >= 2.6
- requests >= 2.18.4
- google-auth >= 1.3.0
options:
  filters:
    description:
    - A list of filter value pairs. Available filters are listed here U(https://cloud.google.com/sdk/gcloud/reference/topic/filters).
    - Each additional filter in the list will be added as an AND condition (filter1
      and filter2).
    type: list
extends_documentation_fragment: gcp
'''
EXAMPLES = '''
- name: get info on a route
  gcp_compute_route_info:
    filters:
    - name = test_object
    project: test_project
    auth_kind: serviceaccount
    service_account_file: "/tmp/auth.pem"
'''
RETURN = '''
resources:
  description: List of resources
  returned: always
  type: complex
  contains:
    destRange:
      description:
      - The destination range of outgoing packets that this route applies to.
      - Only IPv4 is supported.
      returned: success
      type: str
    description:
      description:
      - An optional description of this resource. Provide this property when you create
        the resource.
      returned: success
      type: str
    name:
      description:
      - Name of the resource. Provided by the client when the resource is created.
        The name must be 1-63 characters long, and comply with RFC1035. Specifically,
        the name must be 1-63 characters long and match the regular expression `[a-z]([-a-z0-9]*[a-z0-9])?`
        which means the first character must be a lowercase letter, and all following
        characters must be a dash, lowercase letter, or digit, except the last character,
        which cannot be a dash.
      returned: success
      type: str
    network:
      description:
      - The network that this route applies to.
      returned: success
      type: dict
    priority:
      description:
      - The priority of this route. Priority is used to break ties in cases where
        there is more than one matching route of equal prefix length.
      - In the case of two routes with equal prefix length, the one with the lowest-numbered
        priority value wins.
      - Default value is 1000. Valid range is 0 through 65535.
      returned: success
      type: int
    tags:
      description:
      - A list of instance tags to which this route applies.
      returned: success
      type: list
    nextHopGateway:
      description:
      - URL to a gateway that should handle matching packets.
      - 'Currently, you can only specify the internet gateway, using a full or partial
        valid URL: * U(https://www.googleapis.com/compute/v1/projects/project/global/gateways/default-internet-gateway)
        * projects/project/global/gateways/default-internet-gateway * global/gateways/default-internet-gateway
        .'
      returned: success
      type: str
    nextHopInstance:
      description:
      - URL to an instance that should handle matching packets.
      - 'You can specify this as a full or partial URL. For example: * U(https://www.googleapis.com/compute/v1/projects/project/zones/zone/)
        instances/instance * projects/project/zones/zone/instances/instance * zones/zone/instances/instance
        .'
      returned: success
      type: dict
    nextHopIp:
      description:
      - Network IP address of an instance that should handle matching packets.
      returned: success
      type: str
    nextHopVpnTunnel:
      description:
      - URL to a VpnTunnel that should handle matching packets.
      returned: success
      type: dict
    nextHopNetwork:
      description:
      - URL to a Network that should handle matching packets.
      returned: success
      type: str
'''
################################################################################
# Imports
################################################################################
from ansible.module_utils.gcp_utils import navigate_hash, GcpSession, GcpModule, GcpRequest
import json
################################################################################
# Main
################################################################################
def main():
    module = GcpModule(argument_spec=dict(filters=dict(type='list', elements='str')))

    if module._name == 'gcp_compute_route_facts':
        module.deprecate("The 'gcp_compute_route_facts' module has been renamed to 'gcp_compute_route_info'", version='2.13')

    if not module.params['scopes']:
        module.params['scopes'] = ['https://www.googleapis.com/auth/compute']

    return_value = {'resources': fetch_list(module, collection(module), query_options(module.params['filters']))}
    module.exit_json(**return_value)


def collection(module):
    return "https://www.googleapis.com/compute/v1/projects/{project}/global/routes".format(**module.params)


def fetch_list(module, link, query):
    auth = GcpSession(module, 'compute')
    return auth.list(link, return_if_object, array_name='items', params={'filter': query})


def query_options(filters):
    if not filters:
        return ''

    if len(filters) == 1:
        return filters[0]
    else:
        queries = []
        for f in filters:
            # For multiple queries, all queries should have ()
            if f[0] != '(' and f[-1] != ')':
                queries.append("(%s)" % ''.join(f))
            else:
                queries.append(f)

        return ' '.join(queries)


def return_if_object(module, response):
    # If not found, return nothing.
    if response.status_code == 404:
        return None

    # If no content, return nothing.
    if response.status_code == 204:
        return None

    try:
        module.raise_for_status(response)
        result = response.json()
    except getattr(json.decoder, 'JSONDecodeError', ValueError) as inst:
        module.fail_json(msg="Invalid JSON response with error: %s" % inst)

    if navigate_hash(result, ['error', 'errors']):
        module.fail_json(msg=navigate_hash(result, ['error', 'errors']))

    return result


if __name__ == "__main__":
    main()
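The filter-combination behaviour of query_options can be illustrated with a minimal standalone sketch (independent of the Ansible runtime; the function name here is illustrative):

```python
def combine_filters(filters):
    # Mirrors query_options: no filters -> empty string, a single filter
    # passes through unchanged, multiple filters are each parenthesised
    # (unless already wrapped) and space-joined, which the GCP filter
    # grammar treats as an AND of the conditions.
    if not filters:
        return ''
    if len(filters) == 1:
        return filters[0]
    return ' '.join(f if (f[0] == '(' or f[-1] == ')') else '(%s)' % f
                    for f in filters)
```

So `['name = a', 'zone = b']` becomes `(name = a) (zone = b)`, i.e. "name = a AND zone = b" in the filter grammar.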
|
David brings to RPG and GAWCO 35 years of multi-discipline experience in the power and petrochemical industry, spanning an early career as a nuclear pipe welder and, later, managing projects at nuclear power plants across the globe. Most recently he served as co-founder and VP of one of the most successful and recognized welding and machining service companies in the industry, Carolina Energy Solutions, recently acquired by Westinghouse Electric Corporation. Now, as President of RPG and GAWCO, David brings his extensive knowledge of what it takes to make Riley Power Group and Great American Welding Company excel as the best welding and machining service providers in the power and petrochem industry: names that can be trusted to provide the highest standards of excellence, integrity, safety and innovation.
|
"""Extract files from processing run into output directory, organized by sample.
"""
import os
import shutil
from bcbio import utils
from bcbio.log import logger
from bcbio.upload import shared
def copy_finfo(finfo, storage_dir, pass_uptodate=False):
    """Copy a file into the output storage directory.
    """
    if "sample" in finfo:
        out_file = os.path.join(storage_dir, "%s-%s%s%s" % (finfo["sample"], finfo["ext"],
                                                            "-" if (".txt" in finfo["type"]) else ".",
                                                            finfo["type"]))
    else:
        out_file = os.path.join(storage_dir, os.path.basename(finfo["path"]))
    out_file = os.path.abspath(out_file)
    if not shared.up_to_date(out_file, finfo):
        logger.info("Storing in local filesystem: %s" % out_file)
        shutil.copy(finfo["path"], out_file)
        return out_file
    if pass_uptodate:
        return out_file
def copy_finfo_directory(finfo, storage_dir):
    """Copy a directory into the final output directory.
    """
    out_dir = os.path.abspath(os.path.join(storage_dir, finfo["ext"]))
    if not shared.up_to_date(out_dir, finfo):
        logger.info("Storing directory in local filesystem: %s" % out_dir)
        if os.path.exists(out_dir):
            shutil.rmtree(out_dir)
        shutil.copytree(finfo["path"], out_dir)
        for tmpdir in ["tx", "tmp"]:
            if os.path.exists(os.path.join(out_dir, tmpdir)):
                shutil.rmtree(os.path.join(out_dir, tmpdir))
        os.utime(out_dir, None)
    return out_dir
def update_file(finfo, sample_info, config, pass_uptodate=False):
    """Update the file in local filesystem storage.
    """
    # skip if we have no directory to upload to
    if "dir" not in config:
        return
    if "sample" in finfo:
        storage_dir = utils.safe_makedir(os.path.join(config["dir"], finfo["sample"]))
    elif "run" in finfo:
        storage_dir = utils.safe_makedir(os.path.join(config["dir"], finfo["run"]))
    else:
        raise ValueError("Unexpected input file information: %s" % finfo)
    if finfo.get("type") == "directory":
        return copy_finfo_directory(finfo, storage_dir)
    else:
        return copy_finfo(finfo, storage_dir, pass_uptodate=pass_uptodate)
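The naming scheme inside copy_finfo is compact but non-obvious. A minimal standalone sketch of just the naming logic (the finfo dicts below are made up for illustration; only the format string is taken from copy_finfo):

```python
import os

def storage_name(finfo, storage_dir):
    # Mirrors copy_finfo's naming: "<sample>-<ext>.<type>", except that
    # text-like types (type containing ".txt") join ext and type with "-"
    # instead of ".". Files without a "sample" key keep their base name.
    if "sample" in finfo:
        sep = "-" if ".txt" in finfo["type"] else "."
        return os.path.join(storage_dir, "%s-%s%s%s" % (finfo["sample"], finfo["ext"], sep, finfo["type"]))
    return os.path.join(storage_dir, os.path.basename(finfo["path"]))
```

For example, a BAM for sample "S1" with ext "ready" lands at `S1-ready.bam`, while a `db.txt` type would produce `S1-<ext>-db.txt`.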
|
Far from a passing trend, the recent discovery of a fire pit in a cave near Tel Aviv, Israel suggests that humans have been gathering around fires as far back as 300,000 years ago. What is hot now is installing a pre-fabricated outdoor fire pit or fireplace kit.
These kits come in styles and sizes that accommodate any landscape and can be finished with a cast veneer stone of your choice, making each one a custom piece that matches your home’s style and décor. They can easily transform a dark and unused corner of your yard into a warm and inviting centerpiece. From a formal hearth-and-mantel model to a more rustic camp fire pit, history tells us these are sure to be a favorite gathering spot for family and friends.
Installed seating refers to incorporating a seat or ledge into a permanent wall structure. Homeowners with larger families, or those who frequently entertain, are now incorporating seating into their patio designs. By adding seating into a retaining wall, or by adding a seat-height ledge into a dividing wall, you can really expand your seating capacity.
Add pillows or cushions at a moment’s notice to accommodate extra guests. Outdoor fabrics have come a long way since the 1960s. High-end textile designers and manufacturers are promoting chic collections in vibrant patterns and easy-care fabrics. These new, softer fabrics are virtually indistinguishable from their indoor counterparts and are perfect for adding texture and color while cozying up hard lines.
By integrating a feature such as a fire pit, fountain or bar, these easily become a garden destination spot all on their own.
Health-conscious and eco-friendly dwellers all over the country have taken to incorporating edible plants into their garden landscaping designs. Homeowners concerned with food safety can now grow their own organic produce with sustainable gardening. Good choices for beginners are tomatoes, lettuce, peppers, squash and spinach.
When considering adding edible plants into your plan, think first about sunlight and time. If your available space spends a good part of the day in the shade, and you’re busy with other commitments, then it may not be wise to begin with vegetables.
Not ready for the commitment? Try adding herbs into the mix. Basil, sage, thyme and rosemary are amongst the easiest to grow and can be added to larger plant groupings as well as container gardens. A window box arrangement with mint is perfect for summertime drinks and helps to contain the mint, which can easily overtake a garden bed. Even if you don’t typically add fresh herbs to your cooking, smaller, lower-growing herbs make great ground cover and are both beautiful and fragrant.
So whether you are in the process of planning a new outdoor living space or revamping an existing one, consider adding a new element or feature into the mix. While some trends come and go, these are some that have stood the test of time and are sure to be enjoyed for years to come.
|
# -*- coding: utf-8 -*-
import time
from datetime import datetime, timedelta
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import TimeoutException
from com.ericsson.xn.commons.funcutils import find_single_widget, find_all_widgets, \
ne_category_by_ne_type, get_widget_ignore_refrence_error
from com.ericsson.xn.commons import test_logger as test
from com.ericsson.xn.commons.selfutils import compare_lists
def to_pm_management_page(driver):
    '''
    This function has been abandoned.
    :param driver:
    :return:
    '''
    test.info('To the PmManagement page...')
    identifier = (By.XPATH, "//div[@class='ebLayout-Navigation']/div/div[1]/span")
    find_single_widget(driver, 10, identifier).click()
    identifier = (By.XPATH, "//div[@class='ebBreadcrumbs-list']/ul/li[3]/a")
    find_single_widget(driver, 10, identifier).click()
def check_in_correct_pm_page(driver):
    '''
    This function worked on early editions and has been abandoned.
    :param driver:
    :return:
    '''
    id_search_btn = (By.ID, "idBtn-search")
    b_validate = False
    try:
        find_single_widget(driver, 10, id_search_btn)
        b_validate = True
    except TimeoutException:
        # page not loaded
        return False
    if b_validate:
        # check if in the correct page
        # id_navi = (By.XPATH, "//div[@class='ebLayout-Navigation']/div")
        # navi = find_single_widget(driver, 10, id_navi)
        id_divs = (By.XPATH, "//div[@class='ebLayout-Navigation']/div/div")
        children_divs = find_all_widgets(driver, 20, id_divs)
        str_last_navi = find_single_widget(children_divs[-1], 10, (By.XPATH, ".//a")).get_attribute('innerHTML').\
            encode('utf-8').strip()
        # logger.info(children_divs[-2].get_attribute('innerHTML').encode('utf-8'))
        lis = find_all_widgets(children_divs[-2], 10, (By.XPATH, ".//div/ul/li"))
        for li in lis:
            str_a_li = find_single_widget(li, 10, (By.XPATH, ".//a")).get_attribute('innerHTML').encode('utf-8').strip()
            if str_last_navi == str_a_li:
                return True
        # current page not in parent navigation
        return False
def to_pm_management_page_by_url(driver, ne_type, server_info, to_url_pre='#network-overview/pm-management/'):
    test.info('Will navigate to the PmManagement page...')
    base_url = 'http://' + server_info.getProperty('host') + ':' + str(server_info.getProperty('port')) + \
        server_info.getProperty('preurl')
    test.info('Base URL is: ' + base_url)
    to_url = base_url + (to_url_pre + 'pm-' + ne_category_by_ne_type(ne_type) + '/' + 'pm-' + ne_type).lower()
    test.info('To URL: ' + to_url)
    driver.get(to_url)
    make_sure_in_pm_page(driver)
def make_sure_in_pm_page(driver):
    # btn id: ebBtnSearch
    id_btn_interface = (By.ID, 'ebBtnSearch')
    try:
        find_single_widget(driver, 5, id_btn_interface)
        test.error('Page redirected to the interface management page, critical error!')
    except TimeoutException:
        id_query_btn = (By.ID, "idBtn-search")
        try:
            pm_query_btn = find_single_widget(driver, 10, id_query_btn)
            if pm_query_btn:
                test.passed('Found the query button of PM Management page, check passed.')
        except TimeoutException:
            test.failed('Cannot find the query button of PM Management page.')
def make_in_correct_tab(driver, prefix, postfix):
    id_tabs = (By.XPATH, "//div[@class='ebTabs']/div[1]/div[2]/div")
    tabs = find_all_widgets(driver, 10, id_tabs)
    for tab in tabs:
        if prefix + postfix == tab.get_attribute('innerHTML').encode('utf-8').strip():
            if not tab.get_attribute('class').encode('utf-8').find('ebTabs-tabItem_selected_true') > -1:
                tab.click()
                wait_noti_widget_show(driver)
            test.info('Now in TAB: ' + prefix + postfix)
def wait_noti_widget_show(driver, wait_time=10):
    id_div = (By.XPATH, "//div[@class='noti']/div")
    try:
        find_single_widget(driver, wait_time, id_div)
        test.info('Query result notification shown up.')
    except TimeoutException:
        test.warning('Query result notification did not show up; the case may or may not fail later.')
def to_tab_by_ne_type(driver, ne_type, logger):
    ne_type = ne_type.strip().upper()
    # map each NE type to its tab position; unknown types default to the first tab
    tab_indexes = {'PGW': 1, 'SGW': 2, 'SGSN': 3, 'MME': 4, 'SBC': 5, 'OCGAS': 6}
    tab_index = tab_indexes.get(ne_type, 1)
    identifier = (By.XPATH, "//div[@class='ebTabs-tabArea']/div[" + str(tab_index) + "]")
    find_single_widget(driver, 10, identifier).click()
    # wait for the notification, maximum 10 seconds
    try:
        identifier = (By.XPATH, "//div[@class='noti']/div")
        WebDriverWait(driver, 10).until(EC.presence_of_element_located(identifier))
    except TimeoutException:
        pass
def init_and_search(driver, ne_name, end_time=None, start_time=None):
    # select the given nename
    select_given_ne_name(driver, ne_name)
    # select the correct time
    if end_time is not None:
        test.info('Query end time point set to: ' + end_time.strftime('%H%M%S'))
        id_end_time = (By.XPATH, "//div[@class='endtime']/div/span/input")
        find_single_widget(driver, 10, id_end_time).click()
        set_time_for_query(driver, end_time)
    if start_time is not None:
        test.info('Query start time point set to: ' + start_time.strftime('%H%M%S'))
        id_start_time = (By.XPATH, "//div[@class='starttime']/div/span/input")
        find_single_widget(driver, 10, id_start_time).click()
        set_time_for_query(driver, start_time)
    # click the query button
    id_query_btn = (By.ID, "idBtn-search")
    find_single_widget(driver, 10, id_query_btn).click()
    # wait for the result table body, maximum 20 seconds
    id_body_date = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/tbody")
    find_single_widget(driver, 20, id_body_date)
def wait_until_pm_date_show_up(driver, ne_name, wait_time=720):
    select_given_ne_name(driver, ne_name)
    end_time = datetime.now() + timedelta(seconds=wait_time)
    while datetime.now() < end_time:
        id_query_btn = (By.ID, "idBtn-search")
        find_single_widget(driver, 10, id_query_btn).click()
        id_body_date = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/tbody")
        try:
            find_single_widget(driver, 10, id_body_date)
            test.passed('Successfully found the counters data.')
            return
        except TimeoutException:
            pass
    test.error('Waited for ' + str(wait_time) + ' seconds but could not find any PM data.')
def select_given_ne_name(driver, ne_name):
    identifier = (By.XPATH, "//div[@class='pmcommonarea']/div/div[2]/div[1]/div[2]/input")
    input_ne_name = find_single_widget(driver, 10, identifier)
    if not '' == input_ne_name.get_attribute('value').strip():
        input_ne_name.click()
        find_single_widget(driver, 10, (By.ID, "btnAllLeft")).click()
    else:
        input_ne_name.click()
    id_table_candidate = (By.XPATH, "//div[@class='ebLayout-candidateEnbs']/div[2]/div/div[3]/div/div/div/table")
    table_candidate = find_single_widget(driver, 20, id_table_candidate)
    id_input_search = (By.XPATH, ".//thead/tr[2]/th[2]/input")
    candi_input = find_single_widget(table_candidate, 10, id_input_search)
    candi_input.clear()
    candi_input.send_keys(ne_name.strip())
    time.sleep(1.0)
    id_checkbox = (By.XPATH, ".//tbody/tr[1]/td[1]/div/div/input")
    left_checkbox = find_single_widget(table_candidate, 10, id_checkbox)
    if not left_checkbox.is_selected():
        left_checkbox.click()
    # select to right
    id_arrow_to_right = (By.ID, "btnRight")
    find_single_widget(driver, 10, id_arrow_to_right).click()
    # time.sleep(5.0)
    # close select ne dialog
    id_btn_choose_ne = (By.CLASS_NAME, "choose")
    find_single_widget(driver, 10, id_btn_choose_ne).click()
    # WebDriverWait(driver, 10).until(EC.element_to_be_clickable(id_btn_choose_ne))
def wait_until_rounds_ok(driver, rows, rows_of_page, rows_each_period):
'''
This function will check the number of rows that we need to check the PM.
:param driver:
:param rows:
:param rows_of_page:
:param dict_additional:
:param ne_type:
:return: None
'''
id_tbdoy_trs = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/tbody/tr")
# if dict_additional.has_key('check_rows'):
# rows = dict_additional['check_rows']
# if not 0 == rows % dict_additional['number_of_lic']:
# test.error('Number of checked rows should be integer multiples of number of LICs.')
t_start = datetime.now()
# Note that most of the PM need T2-T1, for Node like SBC, we may wait 5 minutes more since SBC don't need T2-T1
# t_end = t_start + timedelta(minutes=5 * (rows // dict_additional['number_of_lic'] + 1) + 2)
t_end = t_start + timedelta(minutes=5 * (rows // rows_each_period + 1) + 2)
while datetime.now() < t_end:
# click the query button
id_query_btn = (By.ID, "idBtn-search")
find_single_widget(driver, 10, id_query_btn).click()
time.sleep(.1)
try:
i_page = rows / rows_of_page
tgt_page_number = i_page if 0 == rows % rows_of_page else i_page + 1
id_tgt_pager = (By.XPATH, ("//div[@class='page']/ul/li[2]/ul/li[" + str(tgt_page_number) + "]"))
time.sleep(.1)
tgt_pager = get_widget_ignore_refrence_error(driver, id_tgt_pager)
if not tgt_pager.get_attribute('class').find('ebPagination-entryAnchor_current') > -1:
tgt_pager.click()
trs = find_all_widgets(driver, 20, id_tbdoy_trs)
if rows % rows_of_page <= len(trs):
test.passed('All the data that we need are ready now.')
return
except TimeoutException:
pass
time.sleep(.5)
test.failed('It seems that the data we need has not been collected as expected, case may fail in later steps.')
def check_pm_rows_updated(driver, ne_type, dict_counters, rows_of_page, dict_additional):
'''
The main function that checks the PM data accuracy. It first checks the data of each row,
then checks that the minutes of the GUI time are a multiple of 5,
then checks the LICs if the node has multiple LICs.
:param driver: selenium webdriver instance
:param ne_type: the NE's type
:param dict_counters: the base counter values in a dictionary
:param rows_of_page: how many rows each page has on the GUI, default is 10
:param dict_additional: additional information used for special nodes (number_of_lic: how many LICs a node
has), (check_rows: how many rows will be checked; if this value exists, only this number of rows is checked,
otherwise the number of rows checked equals the size of dict_counters)
:return: None
'''
check_rounds = dict_additional['check_rounds']
number_of_rows_be_checked = check_rounds * dict_additional['number_of_lic']
wait_until_rounds_ok(driver, number_of_rows_be_checked, 10, dict_additional['number_of_lic'])
is_m_lics = dict_additional['number_of_lic'] > 1
list_returns = []
id_table = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table")
id_header_trs = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/thead/tr/th")
ths = find_all_widgets(driver, 20, id_header_trs)
list_headers = []
for th in ths:
list_headers.append(th.get_attribute('innerHTML').encode('utf-8').strip())
# if not 0 == number_of_rows_be_checked % dict_additional['number_of_lic']:
# test.error('Number of checked rows should be integer multiples of number of LICs.')
for row_index in range(1, number_of_rows_be_checked + 1):
# check_pm_by_row returns [gui_datetime, lic_name] as a list
list_returns.append(check_pm_by_row(driver, id_table, row_index, ne_type, dict_counters, rows_of_page,
list_headers, is_m_lics))
# check GUI time and lic_name
lic_from_gui = []
if number_of_rows_be_checked != len(list_returns):
test.failed('Number of rows to be checked mismatches the number we expected.')
else:
number_of_lic = dict_additional['number_of_lic']
for i in range(0, len(list_returns), number_of_lic):
for j in range(number_of_lic):
lic_from_gui.append(list_returns[i + j][1])
gui_time_and_lic_name = list_returns[i + j]
if gui_time_and_lic_name[0] is not None and 0 == gui_time_and_lic_name[0].minute % 5:
test.passed('Row ' + str(i + j) + ' GUI time is correct, is: ' +
gui_time_and_lic_name[0].strftime('%Y-%m-%d %H:%M'))
else:
test.failed('Row ' + str(i + j) + ' GUI time is not a multiple of 5, is: ' +
('None' if gui_time_and_lic_name[0] is None else gui_time_and_lic_name[0].strftime('%Y-%m-%d %H:%M')))
if is_m_lics:
msg = 'Node has more than one LIC, '
if list_returns[i][0] == list_returns[i + j][0]:
msg += ' different LICs have the same report time.'
test.passed(msg)
else:
msg += ' different LICs don\'t have the same report time.'
test.failed(msg)
if i + number_of_lic < len(list_returns):
# the pre-condition of this check point is: the GUI lists data in descending order by datetime
if 300 == (list_returns[i][0] - list_returns[i + number_of_lic][0]).seconds:
test.passed('Report delta time is 5 minutes.')
else:
test.failed('Report delta time is not 5 minutes.')
# if checked 1 hour PM and node has many LICs, will check the LIC
if 12 == int(number_of_rows_be_checked / dict_additional['number_of_lic']):
if is_m_lics:
expected_lic = [t.split('-', 1)[1].strip() for t in sorted(dict_counters)]
if compare_lists(expected_lic, lic_from_gui):
test.passed('Lic check passed.')
else:
test.failed('Lic check failed, E: ' + str(expected_lic) + ', G: ' + str(lic_from_gui))
# else will check single LIC node, will not cover this edition
def check_me_counters(driver, ne_type, counters_expected, rows_of_page, dict_me_add, me_types):
'''
This function checks the ME counters; the first edition assumes only one record every 5 minutes.
:param ne_type: the type of the node
:param counters_expected: node ME counters that will be checked against the counters on the GUI
:param dict_me_add: additional information (check_rounds: how many rounds will be checked), (
rows_each_period: how many rows each period, default is 1; this parameter is for later extension)
:return: None: the function is for automation testing, critical errors will cause the program to exit immediately
'''
checked_rounds = dict_me_add['check_rounds']
number_of_rows_be_checked = checked_rounds * dict_me_add['rows_each_period']
wait_until_rounds_ok(driver, number_of_rows_be_checked, 10, dict_me_add['rows_each_period'])
list_returns = []
id_table = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table")
id_header_trs = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/thead/tr/th")
ths = find_all_widgets(driver, 20, id_header_trs)
list_headers = []
for th in ths:
list_headers.append(th.get_attribute('innerHTML').encode('utf-8').strip())
# number_of_rows_be_checked = len(counters_expected)
# if dict_me_add.has_key('check_rounds'):
# if not 0 == number_of_rows_be_checked % dict_me_add['number_of_lic']:
# test.error('Number of checked rows should be integer multiples of number of LICs.')
for row_index in range(1, number_of_rows_be_checked + 1):
# check_me_single_row returns the GUI datetime
time_of_gui = check_me_single_row(driver, id_table, row_index, ne_type, counters_expected,
rows_of_page, list_headers, me_types)
list_returns.append(time_of_gui)
if number_of_rows_be_checked != len(list_returns):
test.failed('Number of rows checked mismatches the number we expected.')
else:
for i in range(len(list_returns)):
if list_returns[i] is not None and 0 == list_returns[i].minute % 5:
test.passed('Row ' + str(i) + ' GUI time is correct, is: ' +
list_returns[i].strftime('%Y-%m-%d %H:%M'))
else:
test.failed('Row ' + str(i) + ' GUI time is not a multiple of 5, is: ' +
('None' if list_returns[i] is None else list_returns[i].strftime('%Y-%m-%d %H:%M')))
if i + 1 < len(list_returns):
if 300 == (list_returns[i] - list_returns[i + 1]).seconds:
test.passed('Report delta time is 5 minutes.')
else:
test.failed('Report delta time is not 5 minutes.')
def check_pm_rows(driver, logger, ne_type, dict_counters, rows_of_page, dict_additional):
bool_overall = True
list_time = []
id_table = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table")
id_header_trs = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/thead/tr/th")
ths = find_all_widgets(driver, 20, id_header_trs)
list_headers = []
for th in ths:
list_headers.append(th.get_attribute('innerHTML').encode('utf-8').strip())
# table = find_single_widget(driver, 10, id_table)
rounds = len(dict_counters)
if 'SBC' == ne_type:
if 'rounds' in dict_additional:
rounds = dict_additional['rounds']
for i in range(1, rounds + 1):
bool_row, gui_time = check_pm_by_row(driver, id_table, logger, i, ne_type, dict_counters, rows_of_page, list_headers)
list_time.append(gui_time)
if not bool_row:
bool_overall = False
logger.error('Row ' + str(i) + " check FAILED. Check the log for detailed information.")
else:
logger.info('Row ' + str(i) + " check PASSED.")
if bool_overall:
if len(list_time) < 1:
bool_overall = False
logger.error('Failed: 0 rounds of PM checked, this does not make any sense.')
elif len(list_time) < 2:
if 'OCGAS' == ne_type:
bool_overall = False
logger.error('Failed: Node OCGAS is supposed to have two LICs, there is only one record of PM Data.')
elif list_time[0] is None:
bool_overall = False
logger.error('Failed: Fail to get the PM data time.')
else:
if 0 != list_time[0].minute % 5:
bool_overall = False
logger.error('Failed: PM Data time is not multiples of 5.')
else:
if ne_type in ['SGW', 'PGW', 'SGSN', 'MME', 'SBC']:
for i in range(0, len(list_time) - 1):
if list_time[i] is None or list_time[i + 1] is None:
bool_overall = False
logger.error('Failed: Fail to get the PM data time.')
break
else:
if 0 != list_time[i].minute % 5 or 0 != list_time[i + 1].minute % 5:
bool_overall = False
logger.error('Failed: PM Data time is not multiples of 5.')
break
if 300 != abs((list_time[i] - list_time[i + 1]).seconds):
bool_overall = False
logger.error('Failed: PM period is not 5 minutes.')
break
elif 'OCGAS' == ne_type:
for i in range(0, len(list_time), 2):
if i != len(list_time) - 2:
if list_time[i] is None or list_time[i + 1] is None or list_time[i + 2] is None:
bool_overall = False
logger.error('Failed: Fail to get the PM data time.')
break
else:
if list_time[i] != list_time[i + 1]:
bool_overall = False
logger.error('Failed: Two LICs of Node OCGAS should be the same.')
break
else:
if 0 != list_time[i].minute % 5 or 0 != list_time[i + 2].minute % 5:
bool_overall = False
logger.error('Failed: PM Data time is not multiples of 5.')
break
elif 300 != abs((list_time[i] - list_time[i + 2]).seconds):
bool_overall = False
logger.error('Failed: PM period is not 5 minutes. ' + str(list_time[i]) + ' '
+ str(list_time[i + 2]))
break
logger.info('GUI times: ' + ', '.join([str(t) for t in list_time]))
if bool_overall:
logger.info("Overall PASSED.")
else:
logger.error("Overall FAILED.")
def check_pm_by_row(driver, id_table, index_row, ne_type, dict_counters, rows_of_page, list_headers, is_m_lics):
test.info('Start to check row: ' + str(index_row))
make_sure_is_correct_page(driver, index_row, rows_of_page)
try:
gui_index_row = rows_of_page if 0 == index_row % rows_of_page else index_row % rows_of_page
id_tr = (By.XPATH, ".//tbody/tr[" + str(gui_index_row) + "]")
table = find_single_widget(driver, 10, id_table)
time.sleep(.5)
tr = find_single_widget(table, 10, id_tr)
gui_str_time = find_single_widget(tr, 10, (By.XPATH, ".//td[2]")).get_attribute('innerHTML').encode('utf-8')
gui_time = datetime.strptime(gui_str_time.strip(), "%Y-%m-%d %H:%M")
expected_counter_id = str(gui_time.minute)
id_lic_name = (By.XPATH, ".//td[3]")
lic_name = find_single_widget(tr, 5, id_lic_name).get_attribute('innerHTML').encode('utf-8')
if is_m_lics:
expected_counter_id = str(gui_time.minute) + '-' + lic_name
list_row = dict_counters[expected_counter_id].split(',')
for i in range(len(list_row)):
try:
id_counter = (By.XPATH, ".//td[" + str(i + 4) + "]")
gui_counter = find_single_widget(tr, 5, id_counter).get_attribute('innerHTML').encode('utf-8')
i_gui_counter = int(gui_counter)
except Exception as e:
i_gui_counter = None
msg = list_headers[1] + ": " + gui_str_time.strip() + ",\t" + list_headers[2] + ": " + lic_name + "; " \
+ list_headers[i + 3] + ", GUI is " + str(i_gui_counter) + ",\tExpected is " + str(list_row[i]) + "."
if int(list_row[i].strip()) == i_gui_counter:
test.passed(msg)
else:
test.failed(msg)
return [gui_time, lic_name]
except Exception as e:
test.error("Test failed, ERROR: " + str(e))
def check_me_single_row(driver, id_table, index_row, ne_type, dict_counters, rows_of_page, list_headers, me_types):
test.info('Start to check ME row: ' + str(index_row))
make_sure_is_correct_page(driver, index_row, rows_of_page)
try:
gui_index_row = rows_of_page if 0 == index_row % rows_of_page else index_row % rows_of_page
id_tr = (By.XPATH, ".//tbody/tr[" + str(gui_index_row) + "]")
table = find_single_widget(driver, 10, id_table)
time.sleep(.5)
tr = find_single_widget(table, 10, id_tr)
gui_str_time = find_single_widget(tr, 10, (By.XPATH, ".//td[2]")).get_attribute('innerHTML').encode('utf-8')
gui_time = datetime.strptime(gui_str_time.strip(), "%Y-%m-%d %H:%M")
expected_counter_id = str(gui_time.minute)
# id_lic_name = (By.XPATH, ".//td[3]")
# lic_name = find_single_widget(tr, 5, id_lic_name).get_attribute('innerHTML').encode('utf-8')
list_row = dict_counters[expected_counter_id].split(',')
list_types = me_types['counter_types'].split(',')
for i in range(len(list_row)):
try:
id_counter = (By.XPATH, ".//td[" + str(i + 3) + "]")
gui_counter = find_single_widget(tr, 5, id_counter).get_attribute('innerHTML').encode('utf-8')
# i_gui_counter = int(gui_counter) if 'int' == list_types[i].lower().strip()
if 'int' == list_types[i].lower().strip():
i_gui_counter = int(gui_counter)
i_expected = int(list_row[i].strip())
elif 'float' == list_types[i].lower().strip():
i_gui_counter = float(gui_counter)
i_expected = float(list_row[i].strip())
else:
test.error('Unknown counter type of me counters.')
except Exception as e:
i_gui_counter = None
# fall back to the raw expected string so the mismatch is still reported below
i_expected = list_row[i].strip()
msg = list_headers[1] + ": " + gui_str_time.strip() + ",\t" + list_headers[i + 2] + ", GUI is " \
+ str(i_gui_counter) + ",\tExpected is " + str(list_row[i]) + "."
if i_expected == i_gui_counter:
test.passed(msg)
else:
test.failed(msg)
return gui_time
except Exception as e:
test.error("Test failed, ERROR: " + str(e))
def to_second_page(driver, logger):
id_next_page = (By.XPATH, "//div[@class='page']/ul/li[3]")
pager = find_single_widget(driver, 10, id_next_page)
# driver.execute_script("arguments[0].scrollIntoView(true);", tds[12])
driver.execute_script("arguments[0].scrollIntoView(true);", pager)
pager.click()
# wait for the notification, maximum 10 seconds
id_body_date = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/tbody")
find_single_widget(driver, 10, id_body_date)
# time.sleep(2.0)
def make_sure_is_correct_page(driver, row_index, rows_of_page):
"""
This function handle the situation that we need to paginate to different to check the PM datas.
:param rows_of_page: how many rows each page has
:param driver: selenium instance
:param row_index: row index of the GUI
:return: None
"""
i_page = row_index // rows_of_page
tgt_page_number = i_page if 0 == row_index % rows_of_page else i_page + 1
id_tgt_pager = (By.XPATH, ("//div[@class='page']/ul/li[2]/ul/li[" + str(tgt_page_number) + "]"))
tgt_pager = find_single_widget(driver, 10, id_tgt_pager)
if not tgt_pager.get_attribute('class').find('ebPagination-entryAnchor_current') > -1:
tgt_pager.click()
# wait for the notification, maximum 10 seconds
id_body_date = (By.XPATH, "//div[@class='ebTabs']/div[2]/div/div/div/div/table/tbody")
find_single_widget(driver, 10, id_body_date)
test.info('Now in page ' + str(tgt_page_number) + '.')
def set_time_for_query(driver, date_time):
# first edition will only set the time part
id_time_holder = (By.XPATH, "//div[@data-namespace='ebTimePicker']")
time_holder = find_single_widget(driver, 10, id_time_holder)
id_hour = (By.XPATH, ".//table[1]/tbody/tr/td[2]/div[2]/input")
hour_input = find_single_widget(time_holder, 10, id_hour)
hour_input.clear()
hour_input.send_keys(date_time.hour)
id_minute = (By.XPATH, ".//table[2]/tbody/tr/td[2]/div[2]/input")
minute_input = find_single_widget(time_holder, 10, id_minute)
minute_input.clear()
minute_input.send_keys(date_time.minute)
id_second = (By.XPATH, ".//table[3]/tbody/tr/td[2]/div[2]/input")
second_input = find_single_widget(time_holder, 10, id_second)
second_input.clear()
# second_input.send_keys(date_time.second)
second_input.send_keys(0)
id_ok_btn = (By.XPATH, "//div[@class='ebDialogBox-actionBlock']/button[1]")
find_single_widget(driver, 10, id_ok_btn).click()
|
The OODA loop is a well-established concept, originally from the military, that is often used in security. OODA stands for Observe, Orient, Decide, Act.
OODA is an iterative process because after each action you need to observe your results and any new opposing action. The idea is that if you can consistently get to the action faster than your opponent you can beat them. It is typically described using an airplane dogfight analogy – airplanes try to turn more quickly and sharply than their opponent in order to get off a shot. But, as you turn faster and faster the g-forces build and at this point the ever faster OODA loop is more like a centrifuge crushing us. We need to break out of the loop and find a new way to play the security game.
Lately, every time I hear about the OODA loop I think of the Ouroboros, a snake eating its own tail. Defenders react faster, so attackers do too and they get sneakier, and so on and so on. Like many similar technological races it is hard to get more than an incremental and temporary advantage. It seems to take an enormous amount of effort and money to make even the smallest improvements, and even the biggest and best prepared companies are still regularly falling victim to cyber attacks. It takes revolutionary improvements to make substantial differences to the balance between attackers and defenders.
Consider a typical OODA loop scenario. An attacker sends a phishing email, and someone clicks on a link to a bad website which infects the user’s computer. At some point that attack is detected (observed), often well after the attacker has had a chance to move laterally through the organization and establish a presence. The victimized enterprise then turns their attention to gathering more data about what has happened and which systems have been compromised, and quickly decides on a plan of action. Finally they start to try to clean up infected systems and prevent further compromise. Of course the attacker may notice that they have been observed and start taking countermeasures at the same time.
How much better would it be if many of these attacks could be stopped or remediated without detection? Could we skip the “OOD” in most cases and move directly to Acting frequently and repeatedly? That can only work if the cost of remediating potentially infected systems can be reduced by many orders of magnitude. Conventionally it might take an IT person an hour or more to clean up a single desktop. If we want to do that every day on every machine, or maybe more often, the cost has to be almost zero.
Virtualization and containerization make it possible to automate this kind of process very effectively. Images of the system, component, or application can be created in a known good and clean condition. The VM or container can be quickly, easily, and cheaply deleted and re-created from that image. That efficiency makes it possible to take this remediation action quite frequently. It is common for such systems to restart the image daily and some do so every few minutes.
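The destroy-and-recreate cycle described above is straightforward to automate. Below is a minimal Python sketch of such a loop; the docker CLI calls, container names, and image tag are illustrative assumptions, and the destroy/create steps are injectable so the scheduling logic can be exercised without Docker installed.

```python
import subprocess
import time

def docker_destroy(name):
    # Remove the (possibly compromised) container; ignore "not found" errors.
    subprocess.run(["docker", "rm", "-f", name], check=False)

def docker_create(name, image):
    # Recreate the container from the known-good image.
    subprocess.run(["docker", "run", "-d", "--name", name, image], check=True)

def remediation_loop(names, image, interval_s, rounds,
                     destroy=docker_destroy, create=docker_create):
    """Reset every container on a fixed schedule, whether or not
    anything suspicious was ever observed."""
    reset = []
    for _ in range(rounds):
        for name in names:
            destroy(name)
            create(name, image)
            reset.append(name)
        time.sleep(interval_s)
    return reset
```

In production the interval would match the tolerance for losing in-memory state; with stateless, image-based services it can be minutes.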
Another advantage of bypassing the observe/detect phase is the ability to be secure in the face of undetectable malware. Current generation security tools have a dismal track record for detecting sophisticated attacks. Web interactions require scanners to allow or deny content in milliseconds making detection particularly difficult. With automated restart of the images, all malware gets cleaned up, including the stuff that managed to evade all the scanners.
Virtualization or containerization of small individual applications provides many advantages over virtualizing the whole system. Isolating individual applications limits the amount of data and resources at risk between the time of any infection and when it is remediated. Strict isolation of the application from file systems, networks, and hardware prevents attacks from reaching their objectives of capturing information or inflicting damage.
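One concrete way to apply that isolation with containers is to launch each application with its filesystem, privileges, and network locked down. The helper below merely assembles an illustrative `docker run` command line; the flag set is an assumption to be adapted per application, not a complete hardening recipe.

```python
def isolated_run_args(name, image, allow_network=False):
    """Build a docker run invocation that boxes in a single application:
    read-only root filesystem, no privilege escalation, all kernel
    capabilities dropped, and (by default) no network access at all."""
    args = ["docker", "run", "-d", "--name", name,
            "--read-only",                         # attacks cannot persist to the rootfs
            "--cap-drop", "ALL",                   # no kernel capabilities
            "--security-opt", "no-new-privileges"] # block setuid escalation
    if not allow_network:
        args += ["--network", "none"]              # cannot exfiltrate captured data
    return args + [image]
```

Combined with the frequent rebuild cycle, even a successful exploit has little it can steal, little it can reach, and little time before the box it landed in is destroyed.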
Applications are too large, too complex, and evolve too quickly to be free of major vulnerabilities any time soon. And attackers continue to develop new tools and techniques to evade immediate detection. This has resulted in businesses spending untold sweat and treasure trying to race faster and faster around an ever tightening OODA loop. If they continue they will be eating themselves alive like the mythical snake.
Isolating those vulnerable applications in highly restricted boxes which are then frequently destroyed and rebuilt whether or not anything has been detected, is an important approach for robust security and survivability in today’s modern threat environment. It can allow us to break out of the OODA loop and cut straight to taking effective action.
|
#!/usr/bin/env python
import json
import multiprocessing
import os
import random
import threading
import webbrowser
# third-party imports
import tornado.ioloop
import tornado.web
import tornado.websocket
import tornado.httpclient
import tornado.gen
# local imports
import settings
import LikedSavedDatabase
from downloaders import redditUserImageScraper
from utils import utilities
# Require a username and password in order to use the web interface. See ReadMe.org for details.
#enable_authentication = False
enable_authentication = True
useSSL = True
if enable_authentication:
import PasswordManager
# List of valid user ids (used to compare user cookie)
authenticated_users = []
# If "next" isn't specified from login, redirect here after login instead
landingPage = "/"
class SessionData:
def __init__(self):
# Just in case, because tornado is multithreaded
self.lock = threading.Lock()
self.randomHistory = []
self.randomHistoryIndex = -1
self.favorites = []
self.favoritesIndex = 0
self.currentImage = None
self.randomImageFilter = ''
self.filteredImagesCache = []
self.currentDirectoryPath = ''
self.currentDirectoryCache = []
self.directoryFilter = ''
def acquire(self):
self.lock.acquire()
def release(self):
self.lock.release()
# user id : session data
userSessionData = {}
videoExtensions = ('.mp4', '.webm')
supportedExtensions = ('.gif', '.jpg', '.jpeg', '.png', '.mp4', '.webm', '.riff')
savedImagesCache = []
def generateSavedImagesCache(outputDir):
global savedImagesCache
# Clear cache in case already created
savedImagesCache = []
print('Creating content cache...', flush=True)
for root, dirs, files in os.walk(outputDir):
for file in files:
if file.endswith(supportedExtensions):
savedImagesCache.append(os.path.join(root, file))
print('Finished creating content cache ({} images/videos)'.format(len(savedImagesCache)))
def getRandomImage(filteredImagesCache=None, randomImageFilter=''):
if not savedImagesCache:
generateSavedImagesCache(settings.settings['Output_dir'])
if filteredImagesCache:
randomImage = random.choice(filteredImagesCache)
else:
randomImage = random.choice(savedImagesCache)
print('\tgetRandomImage(): Chose random image {} (filter {})'.format(randomImage, randomImageFilter))
serverPath = utilities.outputPathToServerPath(randomImage)
return randomImage, serverPath
#
# Tornado handlers
#
# See https://github.com/tornadoweb/tornado/blob/stable/demos/blog/blog.py
# https://www.tornadoweb.org/en/stable/guide/security.html
def login_get_current_user(handler):
if enable_authentication:
cookie = handler.get_secure_cookie("user")
if cookie in authenticated_users:
return cookie
else:
print("Bad/expired cookie received")
return None
else:
return "authentication_disabled"
class AuthHandler(tornado.web.RequestHandler):
def get_current_user(self):
return login_get_current_user(self)
class LoginHandler(AuthHandler):
def get(self):
if not enable_authentication:
self.redirect("/")
else:
if PasswordManager.havePasswordsBeenSet():
self.render("templates/Login.html",
next=self.get_argument("next", landingPage),
xsrf_form_html=self.xsrf_form_html())
else:
# New password setup
self.render("templates/LoginCreate.html",
next=self.get_argument("next", landingPage),
xsrf_form_html=self.xsrf_form_html())
def post(self):
global authenticated_users
# Test password
print("Attempting to authorize user {}...".format(self.get_argument("name")))
if enable_authentication and PasswordManager.verify(self.get_argument("password")):
# Generate new authenticated user session
randomGenerator = random.SystemRandom()
cookieSecret = str(randomGenerator.getrandbits(128))
authenticated_user = self.get_argument("name") + "_" + cookieSecret
authenticated_user = authenticated_user.encode()
authenticated_users.append(authenticated_user)
# Set the cookie on the user's side
self.set_secure_cookie("user", authenticated_user)
print("Authenticated user {}".format(self.get_argument("name")))
# Let them in
self.redirect(self.get_argument("next", landingPage))
else:
print("Refused user {} (password doesn't match any in database)".format(self.get_argument("name")))
self.redirect("/login")
class LogoutHandler(AuthHandler):
@tornado.web.authenticated
def get(self):
global authenticated_users
if enable_authentication:
print("User {} logging out".format(self.current_user))
if self.current_user in authenticated_users:
authenticated_users.remove(self.current_user)
self.redirect("/login")
else:
self.redirect("/")
class SetPasswordHandler(AuthHandler):
def get(self):
pass
def post(self):
if not enable_authentication:
self.redirect("/")
else:
print("Attempting to set password")
if PasswordManager.havePasswordsBeenSet():
print("Rejected: Password has already been set!")
elif self.get_argument("password") != self.get_argument("password_verify"):
print("Rejected: password doesn't match verify field!")
else:
PasswordManager.createPassword(self.get_argument("password"))
print("Success: Set password")
self.redirect("/login")
class AuthedStaticHandler(tornado.web.StaticFileHandler):
def get_current_user(self):
return login_get_current_user(self)
@tornado.web.authenticated
def prepare(self):
pass
class HomeHandler(AuthHandler):
@tornado.web.authenticated
def get(self):
self.render('webInterface/index.html')
def settingsToHtmlForm():
settingsInputs = []
for sectionSettingsPair in settings.settingsStructure:
settingsInputs.append('<h2>{}</h2>'.format(sectionSettingsPair[0]))
for sectionOption in sectionSettingsPair[1]:
option = None
optionComment = ''
if type(sectionOption) == tuple:
option = sectionOption[0]
optionComment = '<p class="optionComment">{}</p>'.format(sectionOption[1])
else:
option = sectionOption
if type(settings.settings[option]) == bool:
settingsInputs.append('''<input type="checkbox" id="{option}" name="{option}" value="{optionValue}" {checkedState} />
<label for="{option}">{optionName}</label>{comment}
<br />'''
.format(option=option, optionName=option.replace('_', ' '),
comment=optionComment,
checkedState=('checked' if settings.settings[option] else ''),
optionValue=('1' if settings.settings[option] else '0')))
elif type(settings.settings[option]) == int:
settingsInputs.append('''<label for="{option}">{optionName}</label>
<input type="number" id="{option}" name="{option}" value="{optionValue}" />{comment}
<br />'''
.format(option=option, optionName=option.replace('_', ' '), comment=optionComment,
optionValue=settings.settings[option]))
elif type(settings.settings[option]) == str:
settingsInputs.append('''<label for="{option}">{optionName}</label>
<input type="{type}" id="{option}" name="{option}" value="{optionValue}" />{comment}
<br />'''
.format(option=option, optionName=option.replace('_', ' '),
comment=optionComment, optionValue=settings.settings[option],
type=('password' if 'secret' in option.lower() or 'password' in option.lower() else 'text')))
return ''.join(settingsInputs)
unsupportedSubmissionShownColumns = ['title',
'bodyUrl',
'reasonForFailure']
unsupportedSubmissionColumnLabels = ['Retry', 'Source', 'Title',
'Content URL',
'Reason for Failure']
class UnsupportedSubmissionsHandler(AuthHandler):
def unsupportedSubmissionToTableColumns(self, unsupportedSubmission):
rowHtml = ''
rowHtml += '\t<td><input type="checkbox" name="shouldRetry" value="{}"/></td>\n'.format(unsupportedSubmission['id'])
# Special case source cell
rowHtml += '\t<td><a href="{}">{}</a></td>\n'.format(
'https://reddit.com{}'.format(unsupportedSubmission['postUrl']) if unsupportedSubmission['source'] == 'reddit'
else unsupportedSubmission['postUrl'],
unsupportedSubmission['source'])
for columnName in unsupportedSubmissionShownColumns:
if 'url' in columnName[-3:].lower():
rowHtml += '\t<td><a href="{}">Content</a></td>\n'.format(unsupportedSubmission['bodyUrl'])
else:
rowHtml += '\t<td>{}</td>\n'.format(unsupportedSubmission[columnName])
return rowHtml
def createTableHeader(self):
tableHeaderHtml = '<thead>\n<tr class="header">\n'
for columnName in unsupportedSubmissionColumnLabels:
tableHeaderHtml += '<th>{}</th>'.format(columnName)
tableHeaderHtml += '</tr>\n</thead>\n<tbody>\n'
return tableHeaderHtml
def getPendingFixups(self):
fixupHtml = ''
missingPixivSubmissions = LikedSavedDatabase.db.getMissingPixivSubmissionIds()
if len(missingPixivSubmissions):
if not fixupHtml:
fixupHtml += "<h2>Download missing content</h2>"
fixupHtml += '<p>There was an error which caused {} Pixiv submissions to not be downloaded.</p>'.format(len(missingPixivSubmissions))
fixupHtml += '<button id="FixupPixiv" onclick="fixupPixiv()">Download missing Pixiv submissions</button>'
fixupHtml += '<p>You should only need to do this once. The code error has been fixed.</p>'
return fixupHtml
@tornado.web.authenticated
def get(self):
unsupportedSubmissionsListHtml = self.createTableHeader()
unsupportedSubmissions = LikedSavedDatabase.db.getAllUnsupportedSubmissions()
for i, unsupportedSubmission in enumerate(reversed(unsupportedSubmissions)):
unsupportedSubmissionsListHtml += ('<tr class="{}">{}</tr>\n'
.format('even' if i % 2 == 0 else 'odd',
self.unsupportedSubmissionToTableColumns(unsupportedSubmission)))
unsupportedSubmissionsListHtml += '</tbody>\n'
self.render("templates/UnsupportedSubmissions.html",
unsupported_submissions_html=unsupportedSubmissionsListHtml,
length_unsupported_submissions=len(unsupportedSubmissions),
fixup_html=self.getPendingFixups())
class SettingsHandler(AuthHandler):
def doSettings(self, afterSubmit):
htmlSettingsForm = settingsToHtmlForm()
settingsFilename = settings.getSettingsFilename()
self.render("templates/Settings.html",
status_html=('<p><b>Settings updated</b></p>' if afterSubmit else ''),
settings_filename=settingsFilename,
settings_form_html=htmlSettingsForm,
xsrf_form_html=self.xsrf_form_html())
@tornado.web.authenticated
def get(self):
self.doSettings(False)
@tornado.web.authenticated
def post(self):
currentOutputDir = settings.settings['Output_dir']
print('Received new settings')
for option in settings.settings:
newValue = self.get_argument(option, None)
if not newValue:
# It's okay if it's a boolean because POST doesn't send unchecked checkboxes
# This means the user set the value to false
if type(settings.settings[option]) == bool:
settings.settings[option] = False
else:
print('Warning: Option {} unset! The settingsStructure might be out of sync.'
'\n\tIgnore this if the field is intentionally empty'.format(option))
else:
# All false bools are handled in the above if block, so we know they're true here
if type(settings.settings[option]) == bool:
newValue = True
elif type(settings.settings[option]) == int:
newValue = int(newValue)
settings.settings[option] = newValue
# print('\tSet {} = {}'.format(option, newValue))
# Write out the new settings
settings.writeServerSettings()
# Respond with a settings page saying we've updated the settings
self.doSettings(True)
# Refresh the cache in case the output directory changed
if currentOutputDir != settings.settings['Output_dir']:
generateSavedImagesCache(settings.settings['Output_dir'])
class RandomImageBrowserWebSocket(tornado.websocket.WebSocketHandler):
connections = set()
def cacheFilteredImages(self):
# Clear the cache
self.sessionData.filteredImagesCache = []
if not self.sessionData.randomImageFilter:
return
randomImageFilterLower = self.sessionData.randomImageFilter.lower()
for imagePath in savedImagesCache:
if randomImageFilterLower in imagePath.lower():
self.sessionData.filteredImagesCache.append(imagePath)
print('\tFiltered images with "{}"; {} images matching filter'
.format(self.sessionData.randomImageFilter,
len(self.sessionData.filteredImagesCache)))
def changeCurrentDirectory(self, newDirectory):
self.sessionData.currentDirectoryPath = newDirectory
dirList = os.listdir(self.sessionData.currentDirectoryPath)
filteredDirList = []
for fileOrDir in dirList:
# The script spits out a lot of .json files the user probably doesn't want to see
if (not fileOrDir.endswith('.json')
and (not self.sessionData.directoryFilter
or self.sessionData.directoryFilter.lower() in fileOrDir.lower())):
filteredDirList.append(fileOrDir)
self.sessionData.currentDirectoryCache = sorted(filteredDirList)
def open(self):
global userSessionData
currentUser = login_get_current_user(self)
if not currentUser:
# Failed authorization
return None
self.connections.add(self)
if currentUser not in userSessionData:
newSessionData = SessionData()
userSessionData[currentUser] = newSessionData
self.sessionData = userSessionData[currentUser]
self.sessionData.acquire()
# Set up the directory cache with the top-level output
self.changeCurrentDirectory(settings.settings['Output_dir'])
self.sessionData.release()
def on_message(self, message):
currentUser = login_get_current_user(self)
if not currentUser:
# Failed authorization
return None
print('RandomImageBrowserWebSocket: Received message ', message)
parsedMessage = json.loads(message)
command = parsedMessage['command']
print('RandomImageBrowserWebSocket: Command ', command)
action = ''
self.sessionData.acquire()
"""
Random Image Browser
"""
if command == 'imageAddToFavorites':
if self.sessionData.currentImage:
self.sessionData.favorites.append(self.sessionData.currentImage)
self.sessionData.favoritesIndex = len(self.sessionData.favorites) - 1
LikedSavedDatabase.db.addFileToCollection(self.sessionData.currentImage[1], "Favorites")
if command == 'nextFavorite':
self.sessionData.favoritesIndex += 1
if self.sessionData.favoritesIndex >= 0 and self.sessionData.favoritesIndex < len(self.sessionData.favorites):
action = 'setImage'
fullImagePath, serverImagePath = self.sessionData.favorites[self.sessionData.favoritesIndex]
else:
self.sessionData.favoritesIndex = len(self.sessionData.favorites) - 1
if len(self.sessionData.favorites):
action = 'setImage'
fullImagePath, serverImagePath = self.sessionData.favorites[self.sessionData.favoritesIndex]
if command == 'previousFavorite' and len(self.sessionData.favorites):
action = 'setImage'
if self.sessionData.favoritesIndex > 0:
self.sessionData.favoritesIndex -= 1
fullImagePath, serverImagePath = self.sessionData.favorites[self.sessionData.favoritesIndex]
if command == 'nextImage':
action = 'setImage'
if self.sessionData.randomHistoryIndex == -1 or self.sessionData.randomHistoryIndex >= len(self.sessionData.randomHistory) - 1:
fullImagePath, serverImagePath = getRandomImage(self.sessionData.filteredImagesCache, self.sessionData.randomImageFilter)
self.sessionData.randomHistory.append((fullImagePath, serverImagePath))
self.sessionData.randomHistoryIndex = len(self.sessionData.randomHistory) - 1
else:
self.sessionData.randomHistoryIndex += 1
fullImagePath, serverImagePath = self.sessionData.randomHistory[self.sessionData.randomHistoryIndex]
if command == 'previousImage':
action = 'setImage'
if self.sessionData.randomHistoryIndex > 0:
self.sessionData.randomHistoryIndex -= 1
fullImagePath, serverImagePath = self.sessionData.randomHistory[self.sessionData.randomHistoryIndex]
if command in ['nextImageInFolder', 'previousImageInFolder'] and len(self.sessionData.randomHistory):
fullImagePath, serverImagePath = self.sessionData.currentImage
folder = fullImagePath[:fullImagePath.rfind('/')]
imagesInFolder = []
for root, dirs, files in os.walk(folder):
for file in files:
if file.endswith(supportedExtensions):
imagesInFolder.append(os.path.join(root, file))
utilities.sort_naturally(imagesInFolder)
currentImageIndex = imagesInFolder.index(fullImagePath)
if currentImageIndex >= 0:
action = 'setImage'
nextImageIndex = currentImageIndex + (1 if command == 'nextImageInFolder' else -1)
if nextImageIndex == len(imagesInFolder):
nextImageIndex = 0
if nextImageIndex < 0:
nextImageIndex = len(imagesInFolder) - 1
fullImagePath = imagesInFolder[nextImageIndex]
serverImagePath = utilities.outputPathToServerPath(fullImagePath)
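The in-folder navigation above wraps the index manually at both ends of the list. The same wraparound can be written with a modulo; this hypothetical helper is equivalent to the explicit checks:

```python
def step_index(current, count, forward=True):
    # Stepping past either end of the list lands on the opposite end,
    # exactly like the explicit wraparound above
    return (current + (1 if forward else -1)) % count
```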
if command == 'setFilter':
newFilter = parsedMessage['filter']
if newFilter != self.sessionData.randomImageFilter:
self.sessionData.randomImageFilter = newFilter
self.cacheFilteredImages()
"""
Directory browser
"""
if command == 'setDirectoryFilter':
newFilter = parsedMessage['filter']
if newFilter != self.sessionData.directoryFilter:
self.sessionData.directoryFilter = newFilter
# Refresh cache with new filter
self.changeCurrentDirectory(self.sessionData.currentDirectoryPath)
action = 'sendDirectory'
if command == 'listCurrentDirectory':
action = 'sendDirectory'
if command == 'changeDirectory':
            # Reset the filter (chances are the user only wanted to filter at one level)
            self.sessionData.directoryFilter = ''
            self.changeCurrentDirectory('{}/{}'.format(self.sessionData.currentDirectoryPath, parsedMessage['path']))
action = 'sendDirectory'
if command == 'directoryUp':
# Don't allow going higher than output dir
if self.sessionData.currentDirectoryPath != settings.settings['Output_dir']:
upDirectory = (settings.settings['Output_dir'] +
self.sessionData.currentDirectoryPath[len(settings.settings['Output_dir'])
: self.sessionData.currentDirectoryPath.rfind('/')])
                # Reset the filter (chances are the user only wanted to filter at one level)
self.sessionData.directoryFilter = ''
self.changeCurrentDirectory(upDirectory)
action = 'sendDirectory'
if command == 'directoryRoot':
            # Reset the filter (chances are the user only wanted to filter at one level)
self.sessionData.directoryFilter = ''
self.changeCurrentDirectory(settings.settings['Output_dir'])
action = 'sendDirectory'
"""
Actions
"""
# Only send a response if needed
if action == 'setImage':
# Stupid hack
if serverImagePath.endswith(videoExtensions):
action = 'setVideo'
self.sessionData.currentImage = (fullImagePath, serverImagePath)
responseMessage = ('{{"responseToCommand":"{}", "action":"{}", "fullImagePath":"{}", "serverImagePath":"{}"}}'
.format(command, action, fullImagePath, serverImagePath))
self.write_message(responseMessage)
if action == 'sendDirectory':
directoryList = ''
for path in self.sessionData.currentDirectoryCache:
isSupportedFile = path.endswith(supportedExtensions)
isFile = '.' in path
if path.endswith(videoExtensions):
fileType = 'video'
elif isSupportedFile:
fileType = 'image'
elif isFile:
fileType = 'file'
else:
fileType = 'dir'
serverPath = 'output' + self.sessionData.currentDirectoryPath[len(settings.settings['Output_dir']):] + '/' + path
directoryList += '{{"path":"{}", "type":"{}", "serverPath":"{}"}},'.format(path, fileType, serverPath)
# Do directoryList[:-1] (yuck) to trim the final trailing comma because JSON doesn't like it
responseMessage = ('{{"responseToCommand":"{}", "action":"{}", "directoryList":[{}]}}'
.format(command, action, directoryList[:-1]))
self.write_message(responseMessage)
self.sessionData.release()
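The handlers above assemble JSON responses by hand with `str.format`, which breaks if a path contains a double quote or backslash. A sketch of a safer builder using `json.dumps` (the helper is hypothetical, not part of this file):

```python
import json

def make_response(command, action, **fields):
    # json.dumps handles all escaping, so arbitrary file paths are safe
    payload = {'responseToCommand': command, 'action': action}
    payload.update(fields)
    return json.dumps(payload)
```

The result round-trips cleanly even for awkward filenames.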
def on_close(self):
self.connections.remove(self)
scriptPipeConnection = None
scriptProcess = None
def startScript(functionToRun, args=None):
global scriptPipeConnection, scriptProcess
# Script already running
if scriptProcess and scriptProcess.is_alive():
return
scriptPipeConnection, childConnection = multiprocessing.Pipe()
if not args:
scriptProcess = multiprocessing.Process(target=functionToRun,
args=(childConnection,))
else:
scriptProcess = multiprocessing.Process(target=functionToRun,
args=(childConnection, args,))
scriptProcess.start()
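startScript hands the child one end of a `multiprocessing.Pipe` so the script can stream progress back to the server. The same protocol in miniature, using a thread in place of the scraper process so the sketch runs quickly and without forking:

```python
import multiprocessing
import threading

def fake_script(conn, args=None):
    # Stand-in for the downloader: report progress, then close the pipe
    conn.send('progress: done')
    conn.close()

parentConnection, childConnection = multiprocessing.Pipe()
worker = threading.Thread(target=fake_script, args=(childConnection,))
worker.start()
message = parentConnection.recv()   # blocks until the child sends
worker.join()
```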
runScriptWebSocketConnections = set()
class RunScriptWebSocket(tornado.websocket.WebSocketHandler):
def open(self):
if not login_get_current_user(self):
return None
global runScriptWebSocketConnections
runScriptWebSocketConnections.add(self)
def on_message(self, message):
if not login_get_current_user(self):
return None
print('RunScriptWebSocket: Received message ', message)
parsedMessage = json.loads(message)
command = parsedMessage['command']
print('RunScriptWebSocket: Command ', command)
if scriptProcess and scriptProcess.is_alive():
print('RunScriptWebSocket: Script already running')
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format('A download process is already running. Please wait until it completes.\\n',
'printMessage'))
            self.write_message(responseMessage)
            return
if command == 'runScript':
print('RunScriptWebSocket: Starting script')
startScript(redditUserImageScraper.runLikedSavedDownloader)
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format('Running downloader.\\n', 'printMessage'))
self.write_message(responseMessage)
elif command == 'retrySubmissions':
print('RunScriptWebSocket: Starting script')
if parsedMessage['submissionsToRetry']:
submissionIds = []
for submissionId in parsedMessage['submissionsToRetry']:
submissionIds.append(int(submissionId))
startScript(redditUserImageScraper.saveRequestedSubmissions,
submissionIds)
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format('Running downloader.\\n', 'printMessage'))
self.write_message(responseMessage)
else:
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format('No content selected.\\n', 'printMessage'))
self.write_message(responseMessage)
# Fix the non-unique filenames error
elif command == 'fixupPixivSubmissions':
print('RunScriptWebSocket: Starting pixiv fixup')
missingPixivSubmissions = LikedSavedDatabase.db.getMissingPixivSubmissionIds()
missingPixivSubmissionIds = []
for missingPixivSubmission in missingPixivSubmissions:
missingPixivSubmissionIds.append(int(missingPixivSubmission['id']))
# print(missingPixivSubmissionIds)
startScript(redditUserImageScraper.saveRequestedSubmissions, missingPixivSubmissionIds)
            responseMessage = ('{{"message":"{}", "action":"{}"}}'
                               .format('Running downloader to download {} missing pixiv submissions.\\n'
                                       .format(len(missingPixivSubmissions)),
                                       'printMessage'))
            self.write_message(responseMessage)
elif command == 'explicitDownloadUrls':
print('RunScriptWebSocket: Starting script')
if parsedMessage['urls']:
urls = []
urlLines = parsedMessage['urls'].split('\n')
for line in urlLines:
# TODO: It would be a good idea to do some validation here, and maybe even regex extract URLs
urls.append(line)
print(urls)
startScript(redditUserImageScraper.saveRequestedUrls, urls)
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format('Running downloader.\\n', 'printMessage'))
self.write_message(responseMessage)
else:
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format('No URLs provided.\\n', 'printMessage'))
self.write_message(responseMessage)
else:
print('RunScriptWebSocket: Error: Received command not understood')
def on_close(self):
global runScriptWebSocketConnections
runScriptWebSocketConnections.remove(self)
def updateScriptStatus():
global scriptPipeConnection
# If no pipe or no data to receive from pipe, we're done
# Poll() is non-blocking whereas recv is blocking
try:
if (not runScriptWebSocketConnections
or not scriptPipeConnection
or not scriptPipeConnection.poll()):
return
except OSError:
scriptPipeConnection = None
return
try:
pipeOutput = scriptPipeConnection.recv()
if pipeOutput:
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format(pipeOutput.replace('\n', '\\n').replace('\t', ''),
'printMessage'))
for client in runScriptWebSocketConnections:
client.write_message(responseMessage)
if redditUserImageScraper.scriptFinishedSentinel in pipeOutput:
# Script finished; refresh image cache
print('Refreshing cache due to script finishing')
generateSavedImagesCache(settings.settings['Output_dir'])
responseMessage = ('{{"action":"{}"}}'
.format('scriptFinished'))
for client in runScriptWebSocketConnections:
client.write_message(responseMessage)
scriptPipeConnection.close()
except EOFError:
scriptPipeConnection = None
print("Lost connection to subprocess!")
responseMessage = ('{{"message":"{}", "action":"{}"}}'
.format("Downloader encountered a problem. Check your server output.",
'printMessage'))
for client in runScriptWebSocketConnections:
client.write_message(responseMessage)
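The key detail above is `poll()` versus `recv()`: poll answers "is anything waiting?" without blocking, so the periodic callback never stalls the server, while recv would block until data arrives. A minimal demonstration:

```python
import multiprocessing

parent, child = multiprocessing.Pipe()
no_data = parent.poll()       # False: returns immediately, nothing sent yet
child.send('status update')
has_data = parent.poll()      # True: a message is now waiting
message = parent.recv()       # safe to call, recv will not block here
```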
#
# Startup
#
def make_app():
# Each time the server starts up, invalidate all cookies
randomGenerator = random.SystemRandom()
cookieSecret = str(randomGenerator.getrandbits(128))
return tornado.web.Application([
# Home page
(r'/', HomeHandler),
# Login
(r'/login', LoginHandler),
(r'/logout', LogoutHandler),
(r'/setPassword', SetPasswordHandler),
# Configure the script
(r'/settings', SettingsHandler),
# Handles messages for run script
(r'/runScriptWebSocket', RunScriptWebSocket),
# Handles messages for randomImageBrowser
(r'/randomImageBrowserWebSocket', RandomImageBrowserWebSocket),
(r'/unsupportedSubmissions', UnsupportedSubmissionsHandler),
#
# Static files
#
(r'/webInterface/(.*)', AuthedStaticHandler, {'path' : 'webInterface'}),
# Don't change this "output" here without changing the other places as well
(r'/output/(.*)', AuthedStaticHandler, {'path' : settings.settings['Output_dir']}),
# Files served regardless of whether the user is authenticated. Only login page resources
# should be in this folder, because anyone can see them
(r'/webInterfaceNoAuth/(.*)', tornado.web.StaticFileHandler, {'path' : 'webInterfaceNoAuth'}),
],
xsrf_cookies=True,
cookie_secret=cookieSecret,
login_url="/login")
if __name__ == '__main__':
print('Loading settings...')
settings.getSettings()
print('Content output directory: ' + settings.settings['Output_dir'])
if not settings.settings['Output_dir']:
print('WARNING: No output directory specified! This will probably break things')
if not savedImagesCache:
generateSavedImagesCache(settings.settings['Output_dir'])
LikedSavedDatabase.initializeFromSettings(settings.settings)
# Backwards compatibility: Read the old .json files into the database. This can be slow for old
# repositories, so only do it once
if not settings.settings['Database_Has_Imported_All_Submissions']:
# Also scan output_dir because Metadata_output_dir was a late addition
LikedSavedDatabase.importFromAllJsonInDir(settings.settings['Output_dir'])
LikedSavedDatabase.importFromAllJsonInDir(settings.settings['Metadata_output_dir'])
settings.settings['Database_Has_Imported_All_Submissions'] = True
settings.writeServerSettings()
print('Successfully imported "All" Submissions into database')
if not settings.settings['Database_Has_Imported_Unsupported_Submissions']:
LikedSavedDatabase.importUnsupportedSubmissionsFromAllJsonInDir(settings.settings['Output_dir'])
LikedSavedDatabase.importUnsupportedSubmissionsFromAllJsonInDir(settings.settings['Metadata_output_dir'])
print('Removing Unsupported Submissions which have file associations')
LikedSavedDatabase.db.removeUnsupportedSubmissionsWithFileAssociations()
settings.settings['Database_Has_Imported_Unsupported_Submissions'] = True
settings.writeServerSettings()
print('Successfully imported Unsupported Submissions into database')
# TODO
# if not settings.settings['Database_Has_Imported_Comments']:
# LikedSavedDatabase.importFromAllJsonInDir(settings.settings['Output_dir'])
# settings.settings['Database_Has_Imported_Comments'] = True
# This isn't pretty, but it'll get the job done
    with open('webInterface/webSocketSettings.js', 'w') as webSocketSettings:
        webSocketSettings.write('useSSL = {};'.format('true' if useSSL else 'false'))
port = settings.settings['Port'] if settings.settings['Port'] else 8888
print('\nStarting Content Collector Server on port {}...'.format(port))
app = make_app()
# Generating a self-signing certificate:
# openssl req -x509 -nodes -days 365 -newkey rsa:1024 -keyout certificates/server_jupyter_based.crt.key -out certificates/server_jupyter_based.crt.pem
# (from https://jupyter-notebook.readthedocs.io/en/latest/public_server.html)
# I then had to tell Firefox to trust this certificate even though it is self-signing (because
# I want a free certificate for this non-serious project)
if useSSL:
if os.path.exists("certificates/liked_saved_server.crt.pem"):
app.listen(port, ssl_options={"certfile":"certificates/liked_saved_server.crt.pem",
"keyfile":"certificates/liked_saved_server.crt.key"})
# For backwards compatibility
elif os.path.exists("certificates/server_jupyter_based.crt.pem"):
app.listen(port, ssl_options={"certfile":"certificates/server_jupyter_based.crt.pem",
"keyfile":"certificates/server_jupyter_based.crt.key"})
else:
print('\n\tERROR: Certificates non-existent! Run ./Generate_Certificates.sh to create them')
else:
# Show the warning only if SSL is not enabled
print('\n\tWARNING: Do NOT run this server on the internet (e.g. port-forwarded)'
' nor when\n\t connected to an insecure LAN! It is not protected against malicious use.\n')
app.listen(port)
if settings.settings['Launch_Browser_On_Startup']:
browseUrl ="{}://localhost:{}".format('https' if useSSL else 'http', port)
print("Attempting to launch user's default browser to {}".format(browseUrl))
webbrowser.open(browseUrl)
ioLoop = tornado.ioloop.IOLoop.current()
updateStatusCallback = tornado.ioloop.PeriodicCallback(updateScriptStatus, 100)
updateStatusCallback.start()
ioLoop.start()
|
#! /usr/bin/env python
# D.J. Bennett
# 26/05/2014
"""
setup.py for pglt
"""
import os
import pglt
from setuptools import setup, find_packages
# PACKAGE INFO
PACKAGES = find_packages()
PACKAGE_DIRS = [p.replace(".", os.path.sep) for p in PACKAGES]
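The comprehension above maps each dotted package name from `find_packages()` to its on-disk path. For example:

```python
import os

packages = ['pglt', 'pglt.tools', 'pglt.tools.setup']
package_dirs = [p.replace('.', os.path.sep) for p in packages]
# 'pglt.tools.setup' becomes the path pglt/tools/setup on POSIX systems
```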
# SETUP
setup(
name="pglt",
version=pglt.__version__,
author="Dominic John Bennett",
author_email="dominic.john.bennett@gmail.com",
description=("pG-lt: An automated pipeline for phylogeney generation."),
license="LICENSE.txt",
keywords="phylogenetics ecology evolution conservation",
url="https://github.com/DomBennett/MassPhylogenyEstimation",
packages=PACKAGES,
package_dir=dict(zip(PACKAGES, PACKAGE_DIRS)),
package_data={'pglt': ['parameters.csv', 'gene_parameters.csv',
'dependencies.p']},
scripts=['run_pglt.py', 'pglt_set_dependencies.py'],
test_suite='tests',
long_description=pglt.__doc__,
classifiers=[
"Development Status :: 4 - Beta",
"Topic :: Scientific/Engineering :: Bio-Informatics",
"Programming Language :: Python :: 2.7",
"License :: OSI Approved :: GNU General Public License v2 (GPLv2)",
],
install_requires=['setuptools', 'taxon_names_resolver', 'biopython',
'dendropy', 'numpy', 'tabulate'],
)
print('''
Congratulations -- you've installed pglt!
Now use `pglt_set_dependencies.py` to add external programs in order to run
the pipeline.
''')
|
"""
Spectral data representation.
"""
from __future__ import (absolute_import, division, print_function,
unicode_literals)
from .core import Data
from .atomic import get_atomdat
from astropy.units import (erg, km, cm, s, angstrom, spectral,
spectral_density, Quantity, UnitsError)
from astropy.constants import m_e, c, e
from astropy.table import Table, Column
from math import pi, sqrt, exp, log10
from warnings import warn
import numpy as np
__all__ = ['Spectrum2D', 'Spectrum1D', 'Absorber', 'EmissionLine']
e2_me_c = (e.esu ** 2 / (m_e.cgs * c.cgs)).to(cm ** 2 / s)
c_kms = c.to(km / s)
atomdat = None
def find_bin_edges(bin_centres):
"""
Find the bin edges given the bin centres.
Parameters
----------
bin_centres : array, shape (N,)
The bin centres.
Returns
-------
bins : array, shape (N + 1,)
The bin edges.
"""
if not isinstance(bin_centres, np.ndarray):
bin_centres = np.asarray(bin_centres)
edges = bin_centres[:-1] + 0.5 * (bin_centres[1:] - bin_centres[:-1])
bins = np.concatenate(([2 * bin_centres[0] - edges[0]], edges,
[2 * bin_centres[-1] - edges[-1]]))
return bins
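For uniformly spaced centres the edges land half a bin either side of each centre, giving one more edge than there are centres. A pure-Python mirror of the function above, included so the check runs standalone:

```python
def find_bin_edges(centres):
    # Midpoints between neighbours, extrapolated one half-bin at each end
    edges = [a + 0.5 * (b - a) for a, b in zip(centres, centres[1:])]
    return [2 * centres[0] - edges[0]] + edges + [2 * centres[-1] - edges[-1]]

bins = find_bin_edges([1.0, 2.0, 3.0])
# Spacing of 1 gives edges at 0.5, 1.5, 2.5, 3.5
```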
class Spectrum2D(object):
"""
A 2D spectrum.
Parameters
----------
dispersion : `astropy.units.Quantity` or array, shape (N,)
Spectral dispersion axis.
data : array, shape (N, M)
The spectral data.
unit : `astropy.units.UnitBase` or str, optional
Unit for the dispersion axis.
"""
def __init__(self, dispersion, data, unit=None):
self.dispersion = Quantity(dispersion, unit=unit)
if unit is not None:
self.wavelength = self.dispersion.to(angstrom)
else:
self.wavelength = self.dispersion
self.data = data
class Spectrum1D(object):
"""
A 1D spectrum. Assumes wavelength units unless otherwise specified.
Parameters
----------
dispersion : `astropy.units.Quantity` or array
Spectral dispersion axis.
flux : `igmtools.data.Data`, `astropy.units.Quantity` or array
Spectral flux. Should have the same length as `dispersion`.
error : `astropy.units.Quantity` or array, optional
Error on each flux value.
continuum : `astropy.units.Quantity` or array, optional
An estimate of the continuum flux.
mask : array, optional
Mask for the spectrum. The values must be False where valid and True
where not.
unit : `astropy.units.UnitBase` or str, optional
Spectral unit.
dispersion_unit : `astropy.units.UnitBase` or str, optional
Unit for the dispersion axis.
meta : dict, optional
Meta data for the spectrum.
"""
def __init__(self, dispersion, flux, error=None, continuum=None,
mask=None, unit=None, dispersion_unit=None, meta=None):
_unit = flux.unit if unit is None and hasattr(flux, 'unit') else unit
if isinstance(error, (Quantity, Data)):
if error.unit != _unit:
raise UnitsError('The error unit must be the same as the '
'flux unit.')
error = error.value
elif isinstance(error, Column):
if error.unit != _unit:
raise UnitsError('The error unit must be the same as the '
'flux unit.')
error = error.data
# Set zero error elements to NaN:
if error is not None:
zero = error == 0
error[zero] = np.nan
# Mask these elements:
if mask is not None:
self.mask = mask | np.isnan(error)
else:
self.mask = np.isnan(error)
# If dispersion is a `Quantity`, `Data`, or `Column` instance with the
# unit attribute set, that unit is preserved if `dispersion_unit` is
        # None, but overridden otherwise
self.dispersion = Quantity(dispersion, unit=dispersion_unit)
if dispersion_unit is not None:
self.wavelength = self.dispersion.to(angstrom)
else:
# Assume wavelength units:
self.wavelength = self.dispersion
self.flux = Data(flux, error, unit)
if continuum is not None:
self.continuum = Quantity(continuum, unit=unit)
else:
self.continuum = None
self.meta = meta
@classmethod
def from_table(cls, table, dispersion_column, flux_column,
error_column=None, continuum_column=None, unit=None,
dispersion_unit=None):
"""
Initialises a `Spectrum1D` object from an `astropy.table.Table`
instance.
Parameters
----------
table : `astropy.table.Table`
Contains information used to construct the spectrum. Must have
columns for the dispersion axis and the spectral flux.
dispersion_column : str
Name for the dispersion column.
flux_column : str
Name for the flux column.
error_column : str, optional
Name for the error column.
continuum_column : str, optional
Name for the continuum column.
unit : `astropy.units.UnitBase` or str, optional
Spectral unit.
dispersion_unit : `astropy.units.UnitBase` or str, optional
Unit for the dispersion axis.
"""
dispersion = Quantity(table[dispersion_column])
flux = Quantity(table[flux_column])
if error_column is not None:
error = Quantity(table[error_column])
else:
error = None
if continuum_column is not None:
continuum = Quantity(table[continuum_column])
else:
continuum = None
meta = table.meta
mask = table.mask
return cls(dispersion, flux, error, continuum, mask, unit,
dispersion_unit, meta)
def write(self, *args, **kwargs):
"""
Write the spectrum to a file. Accepts the same arguments as
`astropy.table.Table.write`
"""
if self.dispersion.unit is None:
label_string = 'WAVELENGTH'
else:
if self.dispersion.unit.physical_type == 'length':
label_string = 'WAVELENGTH'
elif self.dispersion.unit.physical_type == 'frequency':
label_string = 'FREQUENCY'
elif self.dispersion.unit.physical_type == 'energy':
label_string = 'ENERGY'
else:
raise ValueError('unrecognised unit type')
t = Table([self.dispersion, self.flux, self.flux.uncertainty.value],
names=[label_string, 'FLUX', 'ERROR'])
t['ERROR'].unit = t['FLUX'].unit
if self.continuum is not None:
t['CONTINUUM'] = self.continuum
t.write(*args, **kwargs)
def plot(self, **kwargs):
"""
Plot the spectrum. Accepts the same arguments as
`igmtools.plot.Plot`.
"""
from ..plot import Plot
p = Plot(1, 1, 1, **kwargs)
p.axes[0].plot(self.dispersion.value, self.flux.value,
drawstyle='steps-mid')
if self.flux.uncertainty is not None:
p.axes[0].plot(self.dispersion.value, self.flux.uncertainty.value,
drawstyle='steps-mid')
p.tidy()
p.display()
def normalise_to_magnitude(self, magnitude, band):
"""
Normalises the spectrum to match the flux equivalent to the
given AB magnitude in the given passband.
Parameters
----------
magnitude : float
AB magnitude.
band : `igmtools.photometry.Passband`
The passband.
"""
from ..photometry import mag2flux
mag_flux = mag2flux(magnitude, band)
spec_flux = self.calculate_flux(band)
norm = mag_flux / spec_flux
self.flux *= norm
def calculate_flux(self, band):
"""
Calculate the mean flux for a passband, weighted by the response
and wavelength in the given passband.
Parameters
----------
band : `igmtools.photometry.Passband`
The passband.
Returns
-------
flux : `astropy.units.Quantity`
The mean flux in erg / s / cm^2 / Angstrom.
Notes
-----
This function does not calculate an uncertainty.
"""
if (self.wavelength[0] > band.wavelength[0] or
self.wavelength[-1] < band.wavelength[-1]):
warn('Spectrum does not cover the whole bandpass, '
'extrapolating...')
dw = np.median(np.diff(self.wavelength.value))
spec_wavelength = np.arange(
band.wavelength.value[0],
band.wavelength.value[-1] + dw, dw) * angstrom
spec_flux = np.interp(spec_wavelength, self.wavelength,
self.flux.value)
else:
spec_wavelength = self.wavelength
spec_flux = self.flux.value
i, j = spec_wavelength.searchsorted(
Quantity([band.wavelength[0], band.wavelength[-1]]))
wavelength = spec_wavelength[i:j]
flux = spec_flux[i:j]
dw_band = np.median(np.diff(band.wavelength))
dw_spec = np.median(np.diff(wavelength))
if dw_spec.value > dw_band.value > 20:
            warn('Spectrum wavelength sampling interval {0:.2f}, but bandpass '
                 'sampling interval {1:.2f}'.format(dw_spec, dw_band))
# Interpolate the spectrum to the passband wavelengths:
flux = np.interp(band.wavelength, wavelength, flux)
band_transmission = band.transmission
wavelength = band.wavelength
else:
# Interpolate the band transmission to the spectrum wavelengths:
band_transmission = np.interp(
wavelength, band.wavelength, band.transmission)
# Weight by the response and wavelength, appropriate when we're
# counting the number of photons within the band:
flux = (np.trapz(band_transmission * flux * wavelength, wavelength) /
np.trapz(band_transmission * wavelength, wavelength))
flux *= erg / s / cm ** 2 / angstrom
return flux
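The trapz expression above computes a response- and wavelength-weighted mean flux density (the wavelength weight accounts for photon counting within the band). A sanity check of the weighting with plain sums in place of trapezoids: a flat spectrum must come back unchanged.

```python
wavelength = [4000.0 + 10.0 * i for i in range(101)]
transmission = [1.0] * len(wavelength)   # idealised flat bandpass
flux = [2.0] * len(wavelength)           # flat spectrum, 2 units everywhere

numerator = sum(t * f * w for t, f, w in zip(transmission, flux, wavelength))
denominator = sum(t * w for t, w in zip(transmission, wavelength))
mean_flux = numerator / denominator      # weighting preserves a constant
```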
def calculate_magnitude(self, band, system='AB'):
"""
Calculates the magnitude in a given passband.
band : `igmtools.photometry.Passband`
The passband.
system : {`AB`, `Vega`}
Magnitude system.
Returns
-------
magnitude : float
Magnitude in the given system.
"""
if system not in ('AB', 'Vega'):
raise ValueError('`system` must be one of `AB` or `Vega`')
f1 = self.calculate_flux(band)
if f1 > 0:
magnitude = -2.5 * log10(f1 / band.flux[system])
if system == 'Vega':
# Add 0.026 because Vega has V = 0.026:
magnitude += 0.026
else:
magnitude = np.inf
return magnitude
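The magnitude formula used above is m = -2.5 log10(f / f_zp), so a factor of 100 in flux corresponds to exactly 5 magnitudes. A stdlib sketch of that relation (the function name is invented for illustration):

```python
from math import log10

def magnitude_from_flux_ratio(flux, zeropoint_flux):
    # m = -2.5 log10(f / f_zp): brighter sources have smaller magnitudes
    return -2.5 * log10(flux / zeropoint_flux)

# A source 100x fainter than the zeropoint flux is 5 magnitudes fainter
faint = magnitude_from_flux_ratio(1.0, 100.0)
```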
def apply_extinction(self, EBmV):
"""
Apply Milky Way extinction.
Parameters
----------
EBmV : float
Colour excess.
"""
from astro.extinction import MWCardelli89
tau = MWCardelli89(self.wavelength, EBmV=EBmV).tau
self.flux *= np.exp(-tau)
if self.continuum is not None:
self.continuum *= np.exp(-tau)
def rebin(self, dispersion):
"""
Rebin the spectrum onto a new dispersion axis.
Parameters
----------
dispersion : float, `astropy.units.Quantity` or array
The dispersion for the rebinned spectrum. If a float, assumes a
linear scale with that bin size.
"""
if isinstance(dispersion, float):
dispersion = np.arange(
self.dispersion.value[0], self.dispersion.value[-1],
dispersion)
old_bins = find_bin_edges(self.dispersion.value)
new_bins = find_bin_edges(dispersion)
widths = np.diff(old_bins)
old_length = len(self.dispersion)
new_length = len(dispersion)
i = 0 # index of old array
j = 0 # index of new array
# Variables used for rebinning:
df = 0.0
de2 = 0.0
nbins = 0.0
flux = np.zeros_like(dispersion)
error = np.zeros_like(dispersion)
# Sanity check:
if old_bins[-1] < new_bins[0] or new_bins[-1] < old_bins[0]:
raise ValueError('Dispersion scales do not overlap!')
# Find the first contributing old pixel to the rebinned spectrum:
if old_bins[i + 1] < new_bins[0]:
# Old dispersion scale extends lower than the new one. Find the
# first old bin that overlaps with the new scale:
while old_bins[i + 1] < new_bins[0]:
i += 1
i -= 1
elif old_bins[0] > new_bins[j + 1]:
# New dispersion scale extends lower than the old one. Find the
# first new bin that overlaps with the old scale:
while old_bins[0] > new_bins[j + 1]:
                flux[j] = np.nan
                error[j] = np.nan
j += 1
j -= 1
l0 = old_bins[i] # lower edge of contributing old bin
while True:
h0 = old_bins[i + 1] # upper edge of contributing old bin
h1 = new_bins[j + 1] # upper edge of jth new bin
if h0 < h1:
# Count up the decimal number of old bins that contribute to
# the new one and start adding up fractional flux values:
if self.flux.uncertainty.value[i] > 0:
bin_fraction = (h0 - l0) / widths[i]
nbins += bin_fraction
# We don't let `Data` handle the error propagation here
# because a sum of squares will not give us what we
# want, i.e. 0.25**2 + 0.75**2 != 0.5**2 + 0.5**2 != 1**2
df += self.flux.value[i] * bin_fraction
de2 += self.flux.uncertainty.value[i] ** 2 * bin_fraction
l0 = h0
i += 1
if i == old_length:
break
else:
# We have all but one of the old bins that contribute to the
# new one, so now just add the remaining fraction of the new
# bin to the decimal bin count and add the remaining
# fractional flux value to the sum:
if self.flux.uncertainty.value[i] > 0:
bin_fraction = (h1 - l0) / widths[i]
nbins += bin_fraction
df += self.flux.value[i] * bin_fraction
de2 += self.flux.uncertainty.value[i] ** 2 * bin_fraction
if nbins > 0:
# Divide by the decimal bin count to conserve flux density:
flux[j] = df / nbins
error[j] = sqrt(de2) / nbins
else:
flux[j] = 0.0
error[j] = 0.0
df = 0.0
de2 = 0.0
nbins = 0.0
l0 = h1
j += 1
if j == new_length:
break
if hasattr(self.dispersion, 'unit'):
dispersion = Quantity(dispersion, self.dispersion.unit)
if hasattr(self.flux, 'unit'):
flux = Data(flux, error, self.flux.unit)
# Linearly interpolate the continuum onto the new dispersion scale:
if self.continuum is not None:
continuum = np.interp(dispersion, self.dispersion, self.continuum)
else:
continuum = None
return self.__class__(dispersion, flux, continuum=continuum)
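The rebinning loop conserves flux density by accumulating overlap-weighted sums and dividing by the fractional bin count, with errors added in quadrature under the same weights. The core arithmetic for one new bin fed by two fully contained old bins of equal width:

```python
from math import sqrt

old_flux = [1.0, 3.0]
old_error = [0.5, 0.5]
fractions = [1.0, 1.0]   # each old bin lies entirely inside the new bin

nbins = sum(fractions)
df = sum(f * w for f, w in zip(old_flux, fractions))
de2 = sum(e ** 2 * w for e, w in zip(old_error, fractions))

new_flux = df / nbins          # the mean flux density is conserved
new_error = sqrt(de2) / nbins  # quadrature sum, scaled by the bin count
```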
class Absorber(object):
"""
Class representation of an absorber.
Parameters
----------
identifier : str
Name of the ion, molecule or isotope, e.g. `HI`.
redshift : float, optional
Redshift of the absorber.
logn : float, optional
Log10 column density (cm^-2).
b : float, optional
Doppler broadening parameter (km/s).
covering_fraction : float, optional
Covering fraction.
atom : `igmtools.data.AtomDat`, optional
Atomic data.
"""
def __init__(self, identifier, redshift=None, logn=None, b=None,
covering_fraction=1, atom=None):
if atom is None:
atom = get_atomdat()
self.identifier = identifier
self.transitions = atom[identifier]
self.redshift = redshift
self.logn = logn
self.b = b
self.covering_fraction = covering_fraction
def __repr__(self):
return 'Absorber({0}, z={1:.2f}, logN={2:.2f}, b={3})'.format(
self.identifier, self.redshift, self.logn, int(self.b))
@classmethod
def from_tau_peak(cls, transition, tau, b):
"""
Initialise an absorber from the optical depth at line centre and
        Doppler broadening parameter of a given transition.
Parameters
----------
transition : str
            Name of the transition, e.g. `HI 1215`
tau : float
Optical depth at the line centre.
b : float
Doppler broadening parameter (km/s).
"""
atom = get_atomdat()
transition = atom.get_transition(transition)
if isinstance(b, Quantity):
b = b.to(cm / s)
else:
b = (b * km / s).to(cm / s)
wavelength = transition.wavelength.to(cm)
osc = transition.osc
column = tau * b / (sqrt(pi) * e2_me_c * osc * wavelength)
logn = log10(column.value)
        # b was converted to cm/s above; store it back in km/s, the
        # units documented by __init__
        return cls(identifier=transition.parent, logn=logn,
                   b=b.to(km / s).value)
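`from_tau_peak` inverts the line-centre optical depth relation tau_0 = sqrt(pi) (e^2 / m_e c) N f lambda / b. A plain-float sanity check of that inversion, using standard CGS constants rather than anything from this module:

```python
from math import sqrt, pi, log10

E2_ME_C = 8.44e-3  # e^2 / (m_e c) in CGS units (cm^2 / s)

def logn_from_tau_peak(tau, b_kms, wavelength_angstrom, osc):
    """Log10 column density (cm^-2) from the optical depth at
    line centre, Doppler b (km/s), rest wavelength (Angstrom) and
    oscillator strength."""
    b = b_kms * 1e5                          # km/s -> cm/s
    wavelength = wavelength_angstrom * 1e-8  # Angstrom -> cm
    column = tau * b / (sqrt(pi) * E2_ME_C * osc * wavelength)
    return log10(column)
```

For HI Lyman-alpha (f = 0.4164, 1215.67 Angstrom), tau_0 = 1 at b = 20 km/s corresponds to log N(HI) of roughly 13.4, a familiar benchmark for Lyman-alpha forest lines.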
def optical_depth(self, dispersion):
"""
Calculates the optical depth profile for a given spectral
dispersion array.
Parameters
----------
dispersion : array
Spectral dispersion.
Returns
-------
tau : array
The optical depth profile.
"""
from ..calculations import optical_depth, tau_peak
if isinstance(dispersion, Quantity):
dispersion = dispersion.to(angstrom)
elif hasattr(dispersion, 'unit'):
if dispersion.unit is not None:
dispersion = dispersion.to(angstrom)
else:
dispersion = Quantity(dispersion, unit=angstrom)
velocity_range = ([-20000, 20000] * km / s if self.logn > 18
else [-1000, 1000] * km / s)
# Select only transitions with redshifted central wavelengths inside
# `dispersion` +/- 500 km/s:
rest_wavelengths = Quantity([t.wavelength for t in self.transitions])
observed_wavelengths = rest_wavelengths * (1 + self.redshift)
wmin = dispersion[0] * (1 - 500 * km / s / c_kms)
        wmax = dispersion[-1] * (1 + 500 * km / s / c_kms)
in_range = ((observed_wavelengths >= wmin) &
(observed_wavelengths <= wmax))
transitions = np.array(self.transitions)[in_range]
tau = np.zeros_like(dispersion.value)
for i, transition in enumerate(transitions):
tau_max = tau_peak(transition, self.logn, self.b)
if 1 - exp(-tau_max) < 1e-3:
continue
observed_wavelength = transition.wavelength * (1 + self.redshift)
dv = ((dispersion - observed_wavelength) /
observed_wavelength * c_kms)
i0, i1 = dv.searchsorted(velocity_range)
tau0 = optical_depth(dv[i0:i1], transition, self.logn, self.b)
tau[i0:i1] += tau0
return tau
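The per-transition loop above converts the dispersion array to velocity offsets from the redshifted line centre, then uses `searchsorted` to restrict evaluation to a velocity window. A self-contained sketch of that windowing step (names are illustrative):

```python
import numpy as np

C_KMS = 299792.458  # speed of light in km/s

def velocity_window(wavelengths, line_centre, vmin=-1000.0, vmax=1000.0):
    """Return slice indices (i0, i1) of `wavelengths` lying within
    [vmin, vmax] km/s of `line_centre`; wavelengths must be sorted
    ascending, so the velocity offsets are too."""
    dv = (wavelengths - line_centre) / line_centre * C_KMS
    i0, i1 = dv.searchsorted([vmin, vmax])
    return i0, i1
```

Because `dv` is monotonic whenever the dispersion array is, `searchsorted` finds the window boundaries in O(log n) without scanning the whole array.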
class EmissionLine(object):
"""
Class representation of an emission line and its properties.
Parameters
----------
wavelength : float
Rest frame wavelength of the line in Angstrom.
redshift : float
Redshift of the emission line.
flux : `igmtools.data.Data`, optional
Integrated line flux.
cont : `igmtools.data.Data`, optional
Continuum flux at the line centre.
eqw : `igmtools.data.Data`, optional
Equivalent width of the line.
"""
def __init__(self, wavelength, redshift, flux=None, cont=None, eqw=None):
from ..calculations import comoving_distance
if flux and not isinstance(flux, Data):
raise ValueError('flux must be an instance of a Data object')
if cont and not isinstance(cont, Data):
raise ValueError('cont must be an instance of a Data object')
if eqw and not isinstance(eqw, Data):
raise ValueError('eqw must be an instance of a Data object')
if isinstance(wavelength, Quantity):
self.wavelength = wavelength.to(angstrom)
else:
self.wavelength = wavelength * angstrom
self.redshift = redshift
self.wavelength_observed = self.wavelength * (1 + self.redshift)
if flux:
self._flux = flux.to(erg / cm ** 2 / s, equivalencies=spectral())
self.rflux = self._flux * (1 + redshift) ** 2
distance = comoving_distance(self.redshift).cgs
self.luminosity = 4 * pi * distance ** 2 * self._flux
if cont and eqw:
self._cont = cont.to(
erg / cm ** 2 / s / angstrom,
equivalencies=spectral_density(self.wavelength_observed))
self.rcont = self._cont * (1 + redshift) ** 3
self._eqw = eqw.to(angstrom)
self.reqw = self._eqw / (1 + redshift)
elif cont and not eqw:
self._cont = cont.to(
erg / cm ** 2 / s / angstrom,
equivalencies=spectral_density(self.wavelength_observed))
self.rcont = self._cont * (1 + redshift) ** 3
self._eqw = self._flux / self._cont
self.reqw = self._eqw / (1 + redshift)
elif eqw and not cont:
self._eqw = eqw.to(angstrom)
self.reqw = self._eqw / (1 + redshift)
self._cont = self._flux / self._eqw
self.rcont = self._cont * (1 + redshift) ** 3
else:
self._eqw = eqw
self.reqw = None
self._cont = cont
self.rcont = None
elif cont:
self._cont = cont.to(
erg / cm ** 2 / s / angstrom,
equivalencies=spectral_density(self.wavelength_observed))
self.rcont = self._cont * (1 + redshift) ** 3
if eqw:
self._eqw = eqw.to(angstrom)
self.reqw = self._eqw / (1 + redshift)
self._flux = self._cont * self._eqw
self.rflux = self._flux * (1 + redshift) ** 2
distance = comoving_distance(self.redshift).cgs
self.luminosity = 4 * pi * distance ** 2 * self._flux
else:
self._eqw = eqw
self.reqw = None
self._flux = flux
self.rflux = None
self.luminosity = None
elif eqw:
self._eqw = eqw.to(angstrom)
self.reqw = self._eqw / (1 + redshift)
self._flux = flux
self.rflux = None
self._cont = cont
self.rcont = None
self.luminosity = None
else:
self._flux = flux
self.rflux = None
self._cont = cont
self.rcont = None
self._eqw = eqw
self.reqw = None
self.luminosity = None
@property
def flux(self):
return self._flux
@flux.setter
def flux(self, value):
from ..calculations import comoving_distance
if not isinstance(value, Data):
raise ValueError('flux must be an instance of a Data object')
self._flux = value.to(erg / cm ** 2 / s, equivalencies=spectral())
self.rflux = self._flux * (1 + self.redshift) ** 2
distance = comoving_distance(self.redshift).cgs
self.luminosity = 4 * pi * distance ** 2 * self._flux
@property
def cont(self):
return self._cont
@cont.setter
def cont(self, value):
if not isinstance(value, Data):
raise ValueError('cont must be an instance of a Data object')
self._cont = value.to(
erg / cm ** 2 / s / angstrom,
equivalencies=spectral_density(self.wavelength_observed))
self.rcont = self._cont * (1 + self.redshift) ** 3
@property
def eqw(self):
return self._eqw
@eqw.setter
def eqw(self, value):
if not isinstance(value, Data):
raise ValueError('eqw must be an instance of a Data object')
self._eqw = value.to(angstrom)
self.reqw = self._eqw / (1 + self.redshift)
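The branches in `EmissionLine.__init__` all encode the same three scalings: observed line flux to rest frame via (1+z)^2, continuum flux density via (1+z)^3, and equivalent width via 1/(1+z), with the third quantity derived from the other two via eqw = flux / cont. A unit-free sketch of that bookkeeping (names are illustrative):

```python
def rest_frame(redshift, flux=None, cont=None, eqw=None):
    """Given any two of (flux, cont, eqw), derive the third via
    eqw = flux / cont, then apply the standard (1+z) scalings."""
    if flux is not None and cont is not None and eqw is None:
        eqw = flux / cont
    elif flux is not None and eqw is not None and cont is None:
        cont = flux / eqw
    elif cont is not None and eqw is not None and flux is None:
        flux = cont * eqw
    zp1 = 1 + redshift
    return {'rflux': flux * zp1 ** 2 if flux is not None else None,
            'rcont': cont * zp1 ** 3 if cont is not None else None,
            'reqw': eqw / zp1 if eqw is not None else None}
```

Note the scalings are mutually consistent: rflux / rcont = (flux / cont) / (1+z) = reqw, so the rest-frame quantities still satisfy the equivalent-width relation.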
|
import subprocess
import sys
import string
import os
def start_carbon_cache_instance(name):
path = os.path.realpath(__file__)
subprocess.call(["python", "{0}/carbon-cache.py".format(os.path.dirname(path)), "--instance={0}".format(name), "start"])
def stop_carbon_cache_instance(name):
path = os.path.realpath(__file__)
subprocess.call(["python", "{0}/carbon-cache.py".format(os.path.dirname(path)), "--instance={0}".format(name), "stop"])
def usage():
print("carbon_cache [start/stop] [instance name type: letter or number] [number of instances]")
print("instance names should be continuous")
print("For example: 1, 2, 3,... or a, b, c,...")
print("Usage: python carbon_cache start n 5")
def main():
if len(sys.argv) < 4:
print("Too few arguments")
usage()
return
if len(sys.argv) > 4:
print("Too many arguments")
usage()
return
if sys.argv[1] not in ['start', 'stop']:
print("Wrong operation! start or stop only!")
        return
if sys.argv[2] not in ['n', 'l']:
print("Wrong Type! l or n only!")
return
num = int(sys.argv[3])
if sys.argv[1] == 'start':
func = start_carbon_cache_instance
else:
func = stop_carbon_cache_instance
if sys.argv[2] == 'n':
for i in range(num):
func(i + 1)
else:
li = list(string.ascii_lowercase)[:num]
for i in li:
func(i)
if __name__ == '__main__':
main()
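The instance names handed to `func` are either the integers 1..N (type `n`) or the first N lowercase letters (type `l`). A small self-contained helper illustrating that naming scheme (not part of the script above):

```python
import string

def instance_names(name_type, count):
    """Return the carbon-cache instance names for the given type:
    'n' -> '1', '2', ..., 'l' -> 'a', 'b', ..."""
    if name_type == 'n':
        return [str(i + 1) for i in range(count)]
    return list(string.ascii_lowercase)[:count]
```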
|
from scikits.talkbox.features.mfcc import *
__author__ = 'zhangxulong'
def mfcc(input, nwin=800, nfft=512, fs=8000, nceps=13):
# MFCC parameters: taken from auditory toolbox
over = 160
# Pre-emphasis factor (to take into account the -6dB/octave rolloff of the
# radiation at the lips level)
prefac = 0.97
# lowfreq = 400 / 3.
lowfreq = 133.33
# highfreq = 6855.4976
linsc = 200 / 3.
logsc = 1.0711703
nlinfil = 13
nlogfil = 27
nfil = nlinfil + nlogfil
w = hamming(nwin, sym=0)
fbank = trfbank(fs, nfft, lowfreq, linsc, logsc, nlinfil, nlogfil)[0]
# ------------------
# Compute the MFCC
# ------------------
extract = preemp(input, prefac)
framed = segment_axis(extract, nwin, over) * w
# Compute the spectrum magnitude
spec = np.abs(fft(framed, nfft, axis=-1))
# Filter the spectrum through the triangle filterbank
mspec = np.log10(np.dot(spec, fbank.T))
# Use the DCT to 'compress' the coefficients (spectrum -> cepstrum domain)
ceps = dct(mspec, type=2, norm='ortho', axis=-1)[:, :nceps]
return ceps, mspec, spec
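The front end of the function above is pre-emphasis followed by windowed framing (`segment_axis` with `nwin=800` and overlap `over=160` gives a hop of 640 samples). A self-contained NumPy version of those two steps, with the filterbank and DCT omitted:

```python
import numpy as np

def preemphasis(signal, coeff=0.97):
    """y[n] = x[n] - coeff * x[n-1]; boosts high frequencies to
    offset the spectral roll-off of speech."""
    return np.append(signal[0], signal[1:] - coeff * signal[:-1])

def frame_signal(signal, frame_len=800, hop=640):
    """Split into overlapping frames and apply a Hamming window.
    hop = nwin - over = 800 - 160 with the defaults above."""
    n_frames = 1 + (len(signal) - frame_len) // hop
    window = np.hamming(frame_len)
    return np.array([signal[i * hop:i * hop + frame_len] * window
                     for i in range(n_frames)])
```

From here the pipeline is: magnitude FFT per frame, log of the mel filterbank energies, then a type-II DCT keeping the first `nceps` coefficients.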
|
Theranos has issued corrections for thousands of blood tests and has voided two years' worth of results from one of its devices.
It's the latest setback for the embattled startup, which is facing federal probes.
Theranos spokeswoman Brooke Buchanan said Wednesday that no patients suffered harm due to the inaccurate tests, citing an analysis conducted by the company. She declined to provide details of the analysis.
The move to correct the test results, first reported by the Wall Street Journal, comes as Theranos faces scrutiny from The Centers for Medicare and Medicaid Services (CMS), the Securities and Exchange Commission and the U.S. Attorney's Office for the Northern District of California.
Buchanan said Theranos informed CMS of the voided tests and the corrected results. CMS couldn't be reached for comment late Wednesday.
The company has annulled blood-test results from its "Edison" devices from 2014 and 2015, Buchanan said. She declined to provide details on the other tests that were corrected.
"Excellence in quality and patient safety is our top priority and we've taken comprehensive corrective measures to address the issues CMS raised in their observations," Buchanan said. "As these matters are currently under review, we have no further comment at this time."
Theranos has positioned itself as a provider of cheaper, more efficient alternatives to traditional medical tests. It claimed it could process up to 70 lab tests on just a few drops of blood. It was once valued at more than $9 billion and billed as a classic industry disruptor.
But in October, a scathing report in the Wall Street Journal called much of its technology and testing methods -- including the Edison devices -- into question, prompting wider scrutiny.
The WSJ reported that Theranos' Edison machines were "the main basis" for the company's $9 billion valuation in a 2014 funding round.
Theranos has said previously that Edison "is only one of many proprietary devices used as part of Theranos proprietary technologies."
|
#!/usr/bin/env python
import os
import sys
import time
sys.path.append('/usr/local/lib/tummy-backup/www/lib')
sys.path.append('/usr/local/lib/tummy-backup/lib')
import tbsupp
import cherrypy
import cherrypy.lib.auth_basic
from genshi.template import TemplateLoader
from formencode import Invalid, Schema, validators
emptyPng = (
'\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR\x00\x00\x00\x03\x00\x00\x00'
'\x03\x08\x02\x00\x00\x00\xd9J"\xe8\x00\x00\x00\x01sRGB\x00\xae\xce'
'\x1c\xe9\x00\x00\x00\x14IDAT\x08\xd7c```\xf8\xff\xff?\x03\x9cB\xe1'
'\x00\x00\xb3j\x0b\xf52-\x07\x95\x00\x00\x00\x00IEND\xaeB`\x82')
loader = TemplateLoader(
os.path.join(os.path.dirname(__file__), 'templates'),
auto_reload=True)
def dbconnect():
from psycopgwrap import Database as db
db.connect()
return db
def db_close():
from psycopgwrap import Database as db
db.close()
cherrypy.tools.db_close = cherrypy.Tool('on_end_request', db_close)
def url(*args, **kwargs):
    if (len(args) == 1 and len(kwargs) == 0
            and isinstance(args[0], basestring)):
return cherrypy.url(args[0])
else:
import routes
return cherrypy.url(routes.url_for(*args, **kwargs))
def flash_present():
return 'flash' in cherrypy.session
def flash_get():
data = cherrypy.session.get('flash')
del(cherrypy.session['flash'])
return data
def contextFromLocals(locals):
from genshi.template import Context
data = {
'url': url, 'cherrypy': cherrypy,
'current_dashboard': {},
'current_config': {},
'current_indexdetails': {},
'current_newbackup': {},
'host_menu': False,
'flash_present': flash_present,
'flash_get': flash_get, }
for key, value in locals.items():
if key.startswith('_'):
continue
if key == 'self':
continue
data[key] = value
return(Context(**data))
def processform(validator, defaults, form, list_fields=[]):
'''Simple form validator, see `sysconfig()` for example use. Any field
names listed in `list_fields` will always be converted to a list before
processing.
'''
if cherrypy.request.method != 'POST':
return {}, defaults
# convert fields that may be lists to always be lists
for field in list_fields:
if field in form and not isinstance(form[field], (tuple, list)):
form[field] = (form[field], )
try:
return {}, validator.to_python(form)
except Invalid, e:
return e.unpack_errors(), form
def validateTime(s):
import re
if not re.match(r'^[0-9][0-9]:[0-9][0-9](:[0-9][0-9])?$', s):
raise ValueError('Invalid time format.')
return(s)
def validateExcludeRule(s):
import re
if not re.match(
            r'^(include|exclude|merge|dir-merge|\+|1)\s+\S.*$', s):
raise ValueError('Invalid exclude rule format.')
return(s)
def validateHostname(s):
import re
if not re.match(r'^[-a-zA-Z0-9._]+$', s):
raise ValueError('Invalid hostname.')
return(s)
def validateFailureWarn(s):
from psycopg2 import DataError
db = dbconnect()
if s == '':
return None
try:
db.queryone("SELECT %s::INTERVAL", s)
except DataError:
raise ValueError(
'Must be empty (to use default) '
'or a PostgreSQL interval like "3 days".')
return s
def validateServername(s):
db = dbconnect()
row = db.queryone(
"SELECT COUNT(*) FROM backupservers WHERE hostname = %s", s)
if row[0] != 1:
raise ValueError('Invalid backup server name.')
return(s)
class NewHostValidator(Schema):
hostname = validators.Wrapper(
validate_python=validateHostname, not_empty=True)
servername = validators.Wrapper(
validate_python=validateServername, not_empty=True)
class HostConfigValidator(Schema):
from formencode.foreach import ForEach
new_exclude_priority = ForEach(
validators.Int(min=0, max=9, not_empty=True), convert_to_list=True)
new_exclude_rule = ForEach(validators.Wrapper(
validate_python=validateExcludeRule,
not_empty=True), convert_to_list=True)
use_global_excludes = validators.StringBoolean(if_missing=False)
rsync_do_compress = validators.StringBoolean(if_missing=False)
active = validators.StringBoolean(if_missing=False)
rsync_bwlimit = validators.Int(not_empty=False, min=0)
priority = validators.Int(not_empty=True, min=0, max=10)
retain_daily = validators.Int(not_empty=True, min=0)
retain_weekly = validators.Int(not_empty=True, min=0)
retain_monthly = validators.Int(not_empty=True, min=0)
window_start_time = validators.Wrapper(validate_python=validateTime)
window_end_time = validators.Wrapper(validate_python=validateTime)
failure_warn = validators.Wrapper(
validate_python=validateFailureWarn, not_empty=False)
class SystemConfigValidator(Schema):
mail_to = validators.Email()
failure_warn = validators.Wrapper(
validate_python=validateFailureWarn, not_empty=False)
rsync_timeout = validators.Int(not_empty=True, min=60, max=160000)
rsync_username = validators.String(
not_empty=True, strip=True, min=1, max=40)
class Root(object):
def __init__(self):
pass
@cherrypy.expose
def index(self):
current_dashboard = {'class': 'current'}
db = dbconnect()
hosts_needing_attention = tbsupp.getHostsNeedingAttention(db)
hosts = list(db.query(
"SELECT *, "
" (SELECT COUNT(*) FROM backups "
"WHERE host_id = hosts.id AND generation = 'daily' "
"AND successful = 't') AS daily_count, "
" (SELECT COUNT(*) FROM backups "
"WHERE host_id = hosts.id AND generation = 'weekly' "
"AND successful = 't') AS weekly_count, "
" (SELECT COUNT(*) FROM backups "
"WHERE host_id = hosts.id AND generation = 'monthly' "
"AND successful = 't') AS monthly_count, "
"(SELECT NOW() - MAX(start_time) FROM backups "
"WHERE host_id = hosts.id "
"AND successful = 't') AS last_backup "
"FROM hosts "
"ORDER BY hostname"))
active_backups = list(db.query(
"SELECT * FROM backups, hosts "
"WHERE backups.backup_pid IS NOT NULL "
"AND hosts.id = backups.host_id "
"ORDER BY backups.start_time"))
title = 'Tummy-Backup'
graphdate = ''
graphdatajs = ''
graphdatajsmax = ''
for row in db.query(
"SELECT sample_date, AVG(usage_pct) AS usage_pct, "
"MAX(usage_pct) as max_usage_pct "
"FROM serverusage GROUP BY sample_date ORDER BY sample_date;"):
graphdate += '"%s",' % row['sample_date'].strftime('%Y-%m-%d')
graphdatajs += '%.1f,' % row['usage_pct']
graphdatajsmax += '%.1f,' % row['max_usage_pct']
graphdate += ''
graphdatajs += ''
graphdatajsmax += ''
tmpl = loader.load('index.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def detailedindex(self):
import tbsupp
current_indexdetails = {'class': 'current'}
db = dbconnect()
hosts_needing_attention = tbsupp.getHostsNeedingAttention(db)
hosts = list(db.query(
"SELECT *, "
" (SELECT COUNT(*) FROM backups "
"WHERE host_id = hosts.id AND generation = 'daily' "
"AND successful = 't') AS daily_count, "
" (SELECT COUNT(*) FROM backups "
"WHERE host_id = hosts.id AND generation = 'weekly' "
"AND successful = 't') AS weekly_count, "
" (SELECT COUNT(*) FROM backups "
"WHERE host_id = hosts.id AND generation = 'monthly' "
"AND successful = 't') AS monthly_count, "
"(SELECT NOW() - MAX(start_time) FROM backups "
"WHERE host_id = hosts.id "
"AND successful = 't') AS last_backup "
"FROM hosts "
"ORDER BY hostname"))
title = 'Detailed Index'
tmpl = loader.load('index-detailed.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def sysconfig(self, **kwargs):
db = dbconnect()
title = 'Tummy-Backup Configuration'
current_config = {'class': 'current'}
config = db.queryone("SELECT * FROM config")
errors, formdata = processform(SystemConfigValidator(), config, kwargs)
if cherrypy.request.method == 'POST' and not errors:
for field in [
'mail_to', 'failure_warn', 'rsync_timeout',
'rsync_username']:
db.query(
"UPDATE config SET %s = %%s" % field, formdata[field])
db.commit()
cherrypy.session['flash'] = 'Settings saved successfully.'
raise cherrypy.HTTPRedirect(url('/config'))
tmpl = loader.load('sysconfig.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def hostsearch(self, **kwargs):
db = dbconnect()
hostname = kwargs['hostname']
hosts = list(db.query(
"SELECT hostname FROM hosts WHERE hostname ~* %s", hostname))
# redirect if only one match
if len(hosts) == 1:
raise cherrypy.HTTPRedirect(url(str(
'/hosts/%s/' % hosts[0]['hostname'])))
# return search results page
if len(hosts) > 1:
title = 'Host Search: %s' % (hostname,)
tmpl = loader.load('host-search-list.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
# error page if not found
title = 'No Hosts Found'
cherrypy.response.status = 404
tmpl = loader.load('host-search-notfound.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def host(self, hostname):
db = dbconnect()
title = 'Host %s' % hostname
host = db.queryone("SELECT * FROM hosts WHERE hostname = %s", hostname)
# search for host if we didn't find an exact match above
if not host:
hosts = list(db.query(
"SELECT hostname FROM hosts WHERE hostname ~* %s", hostname))
# redirect if only one match
if len(hosts) == 1:
raise cherrypy.HTTPRedirect(url(str(
'/hosts/%s/' % hosts[0]['hostname'])))
# return search results page
if len(hosts) > 1:
title = 'Host Search: %s' % (hostname,)
tmpl = loader.load('host-search-list.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
# error page if not found
title = 'No Hosts Found'
cherrypy.response.status = 404
tmpl = loader.load('host-search-notfound.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
server = db.queryone(
"SELECT * from backupservers WHERE id = %s",
host['backupserver_id'])
backups = db.query(
"SELECT backups.* FROM backups, hosts "
"WHERE backups.host_id = hosts.id AND hosts.hostname = %s "
"ORDER BY backups.id DESC", hostname)
mostrecentbackup = db.queryone(
"SELECT * FROM backups "
"WHERE host_id = %s ORDER BY id DESC LIMIT 1", host['id'])
from tbsupp import describe_rsync_exit_code # NOQA
dategraphdatajs = ''
datagraphdatajs = ''
snapgraphdatajs = ''
maxSize = 0
for row in db.query(
"SELECT "
"sample_date, SUM(used_by_dataset) AS used_by_dataset, "
"SUM(used_by_snapshots) AS used_by_snapshots "
"FROM backupusage "
"WHERE host_id = %s "
"GROUP BY sample_date ORDER BY sample_date;", host['id']):
dategraphdatajs += '"%s",' % (
row['sample_date'].strftime('%Y-%m-%d'))
datagraphdatajs += '%d,' % row['used_by_dataset']
snapgraphdatajs += '%d,' % row['used_by_snapshots']
total = row['used_by_snapshots'] + row['used_by_dataset']
if total > maxSize:
maxSize = total
        if datagraphdatajs == '':
            # no usage samples yet; show a single zero point for today
            dategraphdatajs = '"%s"' % time.strftime('%Y/%m/%d')
            datagraphdatajs = '0,'
            snapgraphdatajs = '0,'
yticks = '['
sizel = [
[0, 'B'], [1024, 'KiB'], [1024 ** 2, 'MiB'], [1024 ** 3, 'GiB'],
[1024 ** 4, 'TiB'], [1024 ** 5, 'PiB'], [1024 ** 6, 'EiB']]
for size, ext in sizel:
if maxSize >= size:
order = size
suffix = ext
if maxSize > 0:
val = float(maxSize) / float(order)
else:
val = 0
tweak = 10
rounding = 3
deci = 1
for testmax, testtweak, testround, testdeci in [
[10, 1, 2, 0], [100, 0.1, 1, 0]]:
if val >= testmax:
tweak = testtweak
rounding = testround
deci = testdeci
peak = (int(val * tweak) + rounding) / float(tweak)
yticks += "[0, '0'],"
yticks += "[%d,'%0.*f %s']," % (
peak * order * 0.25, deci, peak * 0.25, suffix)
yticks += "[%d,'%0.*f %s']," % (
peak * order * 0.5, deci, peak * 0.5, suffix)
yticks += "[%d,'%0.*f %s']," % (
peak * order * 0.75, deci, peak * 0.75, suffix)
yticks += "[%d,'%0.*f %s']," % (peak * order, deci, peak, suffix)
yticks += ']'
tmpl = loader.load('host.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def hostconfig(self, hostname, **kwargs):
import datetime
db = dbconnect()
title = 'Configure Host %s' % (hostname,)
now = datetime.datetime.now()
host = db.queryone("SELECT * FROM hosts WHERE hostname = %s", hostname)
if not host:
title = 'Invalid Host'
tmpl = loader.load('hostconfig-invalidhost.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
excludes = db.query(
"SELECT * FROM excludes "
"WHERE (host_id IS NULL AND %s::BOOLEAN) OR host_id = %s "
"ORDER BY priority", host['use_global_excludes'], host['id'])
# strip off any trailing form list entries
while (kwargs.get('new_exclude_priority')
and not kwargs['new_exclude_priority'][-1]
and kwargs.get('new_exclude_rule')
and not kwargs['new_exclude_rule'][-1]):
kwargs['new_exclude_priority'] = kwargs[
'new_exclude_priority'][:-1]
kwargs['new_exclude_rule'] = kwargs['new_exclude_rule'][:-1]
if (not kwargs.get('new_exclude_priority')
and not kwargs.get('new_exclude_rule')):
if 'new_exclude_priority' in kwargs:
del(kwargs['new_exclude_priority'])
if 'new_exclude_rule' in kwargs:
del(kwargs['new_exclude_rule'])
errors, formdata = processform(
HostConfigValidator(),
dict(new_exclude_priority='', new_exclude_rule='', **dict(host)),
dict([
(key, kwargs[key]) for key in kwargs.keys()
if not key.startswith('delete_')]),
['new_exclude_priority', 'new_exclude_rule'])
if cherrypy.request.method == 'POST' and not errors:
for field in [
'active', 'use_global_excludes', 'retain_daily',
'retain_weekly', 'retain_monthly', 'rsync_do_compress',
'rsync_bwlimit', 'priority',
'window_start_time', 'window_end_time', 'failure_warn']:
db.query(
"UPDATE hosts SET %s = %%s WHERE id = %%s" % field,
formdata[field], host['id'])
if (formdata['new_exclude_priority']
and formdata['new_exclude_rule']):
for priority, rule in zip(
formdata['new_exclude_priority'],
formdata['new_exclude_rule']):
db.query(
"INSERT INTO excludes "
"( host_id, priority, rsync_rule ) "
"VALUES ( %s, %s, %s )", host['id'], priority, rule)
deleteList = [
int(x.split('_', 1)[1]) for x in kwargs.keys()
if x.startswith('delete_')]
for deleteId in deleteList:
db.query(
"DELETE FROM excludes WHERE id = %s AND host_id = %s",
deleteId, host['id'])
db.commit()
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, clienthostname=hostname),
'updatekey\0%s\n' % (hostname,))
if output.template_error():
cherrypy.session['flash'] = (
'Error saving settings: %s' % output.template_error())
else:
cherrypy.session['flash'] = 'Settings saved successfully.'
raise cherrypy.HTTPRedirect('config')
rules_data = []
if 'new_exclude_rule' in errors or 'new_exclude_priority' in errors:
import itertools
rules_data = list(itertools.izip_longest(
formdata.get('new_exclude_priority', []),
errors.get('new_exclude_priority', []),
formdata.get('new_exclude_rule', []),
errors.get('new_exclude_rule', [])))
tmpl = loader.load('hostconfig.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def hostdestroy(self, hostname, **kwargs):
db = dbconnect()
title = 'Destroy Host %s' % (hostname,)
host = db.query("SELECT * FROM hosts WHERE hostname = %s", hostname)[0]
if cherrypy.request.method == 'POST':
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, clienthostname=hostname),
'destroyhost\0%s\n' % (hostname,))
if output.template_error():
cherrypy.session['flash'] = (
'Error destroying host: %s' % output.template_error())
raise cherrypy.HTTPRedirect(url('/'))
tmpl = loader.load('hostdestroy.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def backupdestroy(self, hostname, backupid, **kwargs):
db = dbconnect()
title = 'Destroy Backup %s for %s' % (backupid, hostname)
host = db.queryone("SELECT * FROM hosts WHERE hostname = %s", hostname)
backup = db.queryone(
"SELECT * FROM backups WHERE id = %s", backupid)
if cherrypy.request.method == 'POST':
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, backupid=backupid),
'destroybackup\0%s\n' % (backupid,))
if output.template_error():
cherrypy.session['flash'] = (
'Error destroying backup: %s' % output.template_error())
raise cherrypy.HTTPRedirect('..')
tmpl = loader.load('backupdestroy.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def backupctl(self, hostname, action):
db = dbconnect()
command = 'startbackup'
if action.startswith('Kill'):
command = 'killbackup'
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, clienthostname=hostname),
'%s\0%s\n' % (command, hostname))
# wait for child to finish
output.stdout.read()
if output.template_error():
cherrypy.session['flash'] = (
'Error sending command: %s' % output.template_error())
else:
cherrypy.session['flash'] = (
'Backup "%s" message sent' % command[:-6])
raise cherrypy.HTTPRedirect('.')
@cherrypy.expose
def backuplogfiles(self, hostname, backupid, **kwargs):
from genshi.core import Markup
db = dbconnect()
title = 'Logs for Backup %s of %s' % (backupid, hostname)
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, backupid=backupid),
'logfiles\0%s\n' % backupid)
logfileoutput = Markup(output.stdout.read())
webcmd_error = output.template_error()
tmpl = loader.load('backuplogfiles.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def backupajax(self, hostname, backup, **kwargs):
cherrypy.response.headers['Content-Type'] = 'application/json'
db = dbconnect()
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, clienthostname=hostname),
'fsbrowsejson\0%s\0%s\n' % (backup, kwargs['key']))
return(output.persistent_stdout())
@cherrypy.expose
def backuprecover(self, hostname, backupid, **kwargs):
db = dbconnect()
title = 'Recovery %s for %s' % (backupid, hostname)
if cherrypy.request.method == 'POST':
import pickle
try:
import json
except ImportError:
import simplejson as json
recoverylist = json.loads(kwargs['recoverylist'])
db = dbconnect()
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, backupid=backupid),
('createtar\0%s\n' % backupid) + pickle.dumps(recoverylist))
filename = 'recovery-%s-%s.tar.gz' % (hostname, backupid)
cherrypy.response.headers['Content-Type'] = 'application/x-tar'
cherrypy.response.headers[
'Content-Disposition'] = 'attachment; filename="%s"' % filename
return output.persistent_stdout()
tmpl = loader.load('backup-recovery.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def backup(self, hostname, backupid):
db = dbconnect()
title = 'Backup %s for %s' % (backupid, hostname)
host = db.queryone("SELECT * FROM hosts WHERE hostname = %s", hostname)
backup = db.queryone(
"SELECT * FROM backups "
"WHERE host_id = %s AND backups.id = %s ",
host['id'], int(backupid))
from tbsupp import describe_rsync_exit_code # NOQA
tmpl = loader.load('backup.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def hostkeys(self, hostname):
db = dbconnect()
title = 'SSH Key for %s' % (hostname,)
output = tbsupp.runWebCmd(
tbsupp.lookupBackupServer(db, clienthostname=hostname),
'hostkey\0%s\n' % hostname)
key = output.stdout.read()
webcmd_error = output.template_error()
tmpl = loader.load('host-keys.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
@cherrypy.expose
def newbackup(self, **kwargs):
db = dbconnect()
title = 'Backup New Host'
current_newbackup = {'class': 'current'}
errors, formdata = processform(
NewHostValidator(), dict(hostname='', servername=''), kwargs)
if cherrypy.request.method == 'POST' and not errors:
db = dbconnect()
output = tbsupp.runWebCmd(tbsupp.lookupBackupServer(
db, backupservername=formdata['servername']),
'newbackup\0%s\n' % formdata['hostname'])
# wait for child to finish
output.stdout.read()
if output.template_error():
cherrypy.session['flash'] = (
'Error creating backup: %s' % output.template_error())
raise cherrypy.HTTPRedirect(url(str(
'/hosts/%s/config' % formdata['hostname'])))
# get list of servers and post-process elements
backupservers = []
for server in list(db.query("""SELECT backupservers.*,
(SELECT usage_pct FROM serverusage
WHERE server_id = backupservers.id
ORDER BY id DESC LIMIT 1) AS usage_pct
FROM backupservers ORDER BY hostname""")):
data = {'usage_pct_string': '', 'selected': None}
data.update(server)
if server['usage_pct']:
data['usage_pct_string'] = ' (%d%%)' % server['usage_pct']
backupservers.append(data)
# select the element with the lowest usage
try:
min_usage = min([
x['usage_pct'] for x in backupservers
if x['usage_pct'] is not None])
[x for x in backupservers if x['usage_pct'] == min_usage][0][
'selected'] = True
except ValueError:
min_usage = None
# hide server list if there's only one server
show_serverlist = {}
if len(backupservers) == 1:
show_serverlist = {'class': 'hidden'}
tmpl = loader.load('newbackup.html')
return tmpl.generate(contextFromLocals(locals())).render(
'html', doctype='html')
def checkpassword(realm, user, passwd):
import crypt
db = dbconnect()
data = db.query(
'SELECT cryptedpassword FROM users '
'WHERE name = %s AND cryptedpassword IS NOT NULL', user)
if data:
data = list(data)
if not data or not data[0]:
        return False
return data[0][0] == crypt.crypt(passwd, data[0][0])
def routes():
root = Root()
d = cherrypy.dispatch.RoutesDispatcher()
mapper = d.mapper
mapper.explicit = False
mapper.minimization = False
d.connect(
'backup', r'/hosts/{hostname}/{backupid}/', root, action='backup')
d.connect(
'backuplogfiles', r'/hosts/{hostname}/{backupid}/logfiles', root,
action='backuplogfiles')
d.connect(
'backupajax', r'/hosts/{hostname}/{backup}/ajax', root,
action='backupajax')
d.connect(
'backuprecover', r'/hosts/{hostname}/{backupid}/recover', root,
action='backuprecover')
d.connect(
'backupdestroy', r'/hosts/{hostname}/{backupid}/destroy', root,
action='backupdestroy')
d.connect('host', r'/hosts/{hostname}/', root, action='host')
d.connect('hostsearch', r'/hostsearch', root, action='hostsearch')
d.connect(
'hostconfig', r'/hosts/{hostname}/config', root, action='hostconfig')
d.connect(
'hostdestroy', r'/hosts/{hostname}/destroy', root,
action='hostdestroy')
d.connect(
'hostkeys', r'/hosts/{hostname}/keys', root, action='hostkeys')
d.connect(
'backupctrl', r'/hosts/{hostname}/backupctl', root, action='backupctl')
d.connect('newbackup', r'/newbackup', root, action='newbackup')
d.connect(
'detailedindex', r'/index-detailed/', root, action='detailedindex')
d.connect('sysconfig', r'/config', root, action='sysconfig')
d.connect('index', r'/', root)
return d
def config():
# Some global configuration; note that this could be moved into a
# configuration file
cherrypy.config.update({
'tools.encode.on': True,
'tools.encode.encoding': 'utf-8',
'tools.decode.on': True,
'tools.trailing_slash.on': True,
'tools.staticdir.root': os.path.abspath(os.path.dirname(__file__)),
'server.socket_host': '0.0.0.0',
'server.socket_port': 8080,
'log.screen': False,
'log.error_file': '/tmp/site.log',
'environment': 'production',
'show_tracebacks': True,
})
config = {
'/': {
'request.dispatch': routes(),
'tools.auth_basic.on': True,
'tools.auth_basic.realm': 'tummy-backup',
'tools.auth_basic.checkpassword': checkpassword,
'tools.db_close.on': True,
},
'/static': {
'tools.staticdir.on': True,
'tools.staticdir.dir': 'static'
},
}
    return config
def setup_server():
cfg = config()
app = cherrypy.tree.mount(None, config=cfg)
return (cfg, app)
if __name__ == '__main__':
cfg = setup_server()[0]
cfg.update({'log.screen': True})
cherrypy.quickstart(config=cfg)
else:
sys.stdout = sys.stderr
cfg = config()
cherrypy.config.update({
'log.error_file': '/tmp/error.log',
})
# basic auth is implemented in Apache, couldn't get it working here
cfg['/']['tools.auth_basic.on'] = False
cfg['/']['tools.sessions.on'] = True
cfg['/']['tools.sessions.storage_type'] = 'file'
cfg['/']['tools.sessions.storage_path'] = '/tmp/'
cfg['/']['tools.sessions.timeout'] = 600 # in minutes
application = cherrypy.Application(
None, script_name='/tummy-backup', config=cfg)
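The `checkpassword()` handler above verifies a password by re-running `crypt.crypt()` with the stored hash as the salt and comparing the result to the stored hash. The same verify-by-recompute pattern can be sketched portably with `hashlib`; the `salted_hash` and `verify` names below are illustrative, not part of this codebase:

```python
import hashlib
import os

def salted_hash(password, salt=None):
    # Derive a salted digest; the salt is stored alongside the digest,
    # mirroring how crypt(3) embeds the salt in its output string.
    salt = salt if salt is not None else os.urandom(8).hex()
    digest = hashlib.sha256((salt + password).encode('utf-8')).hexdigest()
    return '%s$%s' % (salt, digest)

def verify(password, stored):
    # Re-hash the candidate with the stored salt and compare, just as
    # checkpassword() re-crypts with the stored hash as the salt.
    salt = stored.split('$', 1)[0]
    return salted_hash(password, salt) == stored
```

In use, `verify('s3cret', salted_hash('s3cret'))` is true while any other password fails, without the database ever holding the plaintext.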
|
from props.graph_representation.node import isProp, isTime
class Propagate:
"""
class to bunch together all function of propagation on a digraph
Mainly in order to store the graph as a member which all these functions can edit.
"""
def __init__(self,graph):
self.gr = graph
self.applyPropagation()
    def applyPropagation(self):
        """
        Apply closure propagation algorithms on the graph until no
        further changes occur (a fixed point).
        """
        change = True
        while change:
            change = self.propagateFeatures()
    def propagateFeatures(self):
        """
        Handle propagating features between nodes of the graph.
        @rtype bool
        @return True iff this function has changed the graph in some way
        """
        ret = False
        for curNode in self.gr.nodes():
            # for each node in the graph
            curNodeNeighbours = self.gr.neighbors(curNode)
            for curPropagateNode in curNode.propagateTo:
                # for each of its propagated nodes
                curPropagateNodeNeighbourIds = [cpn.uid for cpn in self.gr.neighbors(curPropagateNode)]
                for curNeighbour in curNodeNeighbours:
                    if isProp(curNeighbour) or isTime(curNeighbour):
                        # for each *prop* or *time* neighbour
                        if curNeighbour.uid not in curPropagateNodeNeighbourIds:
                            # if it is not already a neighbour of the propagated node, add it
                            self.gr.add_edge(edge=(curPropagateNode, curNeighbour),
                                             label=self.gr.edge_label((curNode, curNeighbour)))
                            # mark that a change was made to the graph
                            ret = True
        return ret
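The fixed-point loop above (repeat the feature pass until it reports no change) is independent of the props graph type. A minimal standalone sketch of the same idea, using a plain adjacency dict and a `propagate_to` map (both names hypothetical), shows why iterating to a fixed point matters when propagation targets themselves propagate onward:

```python
def propagate(adj, propagate_to):
    # adj: node -> set of neighbour nodes.
    # propagate_to: node -> list of nodes that should inherit this
    # node's neighbours. Repeats until no edge is added, so chains
    # like a -> b -> c are fully closed regardless of visit order.
    changed = True
    while changed:
        changed = False
        for node, targets in propagate_to.items():
            for target in targets:
                missing = adj.get(node, set()) - adj.get(target, set())
                if missing:
                    adj.setdefault(target, set()).update(missing)
                    changed = True
    return adj

adj = {'a': {'p1'}, 'b': set(), 'c': set()}
result = propagate(adj, {'a': ['b'], 'b': ['c']})
```

Here `'p1'` reaches both `'b'` and `'c'` only because the loop runs to a fixed point; a single pass in an unlucky order would stop after copying it to `'b'`.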
|
|
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file 'data/liveusb-creator.ui'
#
# Created: Fri Dec 12 13:46:45 2014
# by: PyQt4 UI code generator 4.10.2
#
# WARNING! All changes made in this file will be lost!
from PyQt4 import QtCore, QtGui
try:
_fromUtf8 = QtCore.QString.fromUtf8
except AttributeError:
def _fromUtf8(s):
return s
try:
_encoding = QtGui.QApplication.UnicodeUTF8
def _translate(context, text, disambig):
return QtGui.QApplication.translate(context, text, disambig, _encoding)
except AttributeError:
def _translate(context, text, disambig):
return QtGui.QApplication.translate(context, text, disambig)
class Ui_MainWindow(object):
def setupUi(self, MainWindow):
MainWindow.setObjectName(_fromUtf8("MainWindow"))
MainWindow.resize(690, 538)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(MainWindow.sizePolicy().hasHeightForWidth())
MainWindow.setSizePolicy(sizePolicy)
MainWindow.setMinimumSize(QtCore.QSize(477, 519))
MainWindow.setLayoutDirection(QtCore.Qt.LeftToRight)
self.centralwidget = QtGui.QWidget(MainWindow)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Maximum, QtGui.QSizePolicy.Maximum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(10)
sizePolicy.setHeightForWidth(self.centralwidget.sizePolicy().hasHeightForWidth())
self.centralwidget.setSizePolicy(sizePolicy)
self.centralwidget.setMinimumSize(QtCore.QSize(477, 519))
self.centralwidget.setLayoutDirection(QtCore.Qt.LeftToRight)
self.centralwidget.setAutoFillBackground(False)
self.centralwidget.setObjectName(_fromUtf8("centralwidget"))
self.gridLayout = QtGui.QGridLayout(self.centralwidget)
self.gridLayout.setContentsMargins(0, 0, 0, -1)
self.gridLayout.setObjectName(_fromUtf8("gridLayout"))
self.horizontalLayout_9 = QtGui.QHBoxLayout()
self.horizontalLayout_9.setSpacing(0)
self.horizontalLayout_9.setObjectName(_fromUtf8("horizontalLayout_9"))
self.label = QtGui.QLabel(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Minimum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label.sizePolicy().hasHeightForWidth())
self.label.setSizePolicy(sizePolicy)
self.label.setStyleSheet(_fromUtf8("background-image: url(:/liveusb-header-bg.png);"))
self.label.setText(_fromUtf8(""))
self.label.setPixmap(QtGui.QPixmap(_fromUtf8(":/liveusb-header-left.png")))
self.label.setScaledContents(False)
self.label.setAlignment(QtCore.Qt.AlignLeading|QtCore.Qt.AlignLeft|QtCore.Qt.AlignTop)
self.label.setMargin(-1)
self.label.setObjectName(_fromUtf8("label"))
self.horizontalLayout_9.addWidget(self.label)
self.label_2 = QtGui.QLabel(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Minimum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.label_2.sizePolicy().hasHeightForWidth())
self.label_2.setSizePolicy(sizePolicy)
self.label_2.setLayoutDirection(QtCore.Qt.RightToLeft)
self.label_2.setStyleSheet(_fromUtf8("background-image: url(:/liveusb-header-bg.png);"))
self.label_2.setText(_fromUtf8(""))
self.label_2.setPixmap(QtGui.QPixmap(_fromUtf8(":/liveusb-header-right.png")))
self.label_2.setScaledContents(False)
self.label_2.setAlignment(QtCore.Qt.AlignLeading|QtCore.Qt.AlignLeft|QtCore.Qt.AlignTop)
self.label_2.setMargin(-1)
self.label_2.setObjectName(_fromUtf8("label_2"))
self.horizontalLayout_9.addWidget(self.label_2)
self.gridLayout.addLayout(self.horizontalLayout_9, 0, 0, 1, 1)
self.verticalLayout = QtGui.QVBoxLayout()
self.verticalLayout.setContentsMargins(9, -1, 9, -1)
self.verticalLayout.setObjectName(_fromUtf8("verticalLayout"))
self.horizontalLayout_4 = QtGui.QHBoxLayout()
self.horizontalLayout_4.setObjectName(_fromUtf8("horizontalLayout_4"))
self.groupBox = QtGui.QGroupBox(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.groupBox.sizePolicy().hasHeightForWidth())
self.groupBox.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setPointSize(9)
self.groupBox.setFont(font)
self.groupBox.setObjectName(_fromUtf8("groupBox"))
self.horizontalLayout_7 = QtGui.QHBoxLayout(self.groupBox)
self.horizontalLayout_7.setObjectName(_fromUtf8("horizontalLayout_7"))
self.isoBttn = QtGui.QPushButton(self.groupBox)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.isoBttn.sizePolicy().hasHeightForWidth())
self.isoBttn.setSizePolicy(sizePolicy)
self.isoBttn.setObjectName(_fromUtf8("isoBttn"))
self.horizontalLayout_7.addWidget(self.isoBttn)
self.horizontalLayout_4.addWidget(self.groupBox)
self.downloadGroup = QtGui.QGroupBox(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.downloadGroup.sizePolicy().hasHeightForWidth())
self.downloadGroup.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setPointSize(9)
self.downloadGroup.setFont(font)
self.downloadGroup.setObjectName(_fromUtf8("downloadGroup"))
self.gridLayout_5 = QtGui.QGridLayout(self.downloadGroup)
self.gridLayout_5.setObjectName(_fromUtf8("gridLayout_5"))
self.horizontalLayout_6 = QtGui.QHBoxLayout()
self.horizontalLayout_6.setObjectName(_fromUtf8("horizontalLayout_6"))
self.downloadCombo = QtGui.QComboBox(self.downloadGroup)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.downloadCombo.sizePolicy().hasHeightForWidth())
self.downloadCombo.setSizePolicy(sizePolicy)
self.downloadCombo.setObjectName(_fromUtf8("downloadCombo"))
self.horizontalLayout_6.addWidget(self.downloadCombo)
self.refreshReleasesButton = QtGui.QPushButton(self.downloadGroup)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Minimum, QtGui.QSizePolicy.Minimum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.refreshReleasesButton.sizePolicy().hasHeightForWidth())
self.refreshReleasesButton.setSizePolicy(sizePolicy)
self.refreshReleasesButton.setText(_fromUtf8(""))
icon = QtGui.QIcon()
icon.addPixmap(QtGui.QPixmap(_fromUtf8(":/refresh.png")), QtGui.QIcon.Normal, QtGui.QIcon.Off)
self.refreshReleasesButton.setIcon(icon)
self.refreshReleasesButton.setFlat(True)
self.refreshReleasesButton.setObjectName(_fromUtf8("refreshReleasesButton"))
self.horizontalLayout_6.addWidget(self.refreshReleasesButton)
self.gridLayout_5.addLayout(self.horizontalLayout_6, 0, 0, 1, 1)
self.horizontalLayout_4.addWidget(self.downloadGroup)
self.verticalLayout.addLayout(self.horizontalLayout_4)
self.horizontalLayout_3 = QtGui.QHBoxLayout()
self.horizontalLayout_3.setObjectName(_fromUtf8("horizontalLayout_3"))
self.groupBox_2 = QtGui.QGroupBox(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.groupBox_2.sizePolicy().hasHeightForWidth())
self.groupBox_2.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setPointSize(9)
self.groupBox_2.setFont(font)
self.groupBox_2.setObjectName(_fromUtf8("groupBox_2"))
self.gridLayout_3 = QtGui.QGridLayout(self.groupBox_2)
self.gridLayout_3.setObjectName(_fromUtf8("gridLayout_3"))
self.horizontalLayout = QtGui.QHBoxLayout()
self.horizontalLayout.setObjectName(_fromUtf8("horizontalLayout"))
self.driveBox = QtGui.QComboBox(self.groupBox_2)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.driveBox.sizePolicy().hasHeightForWidth())
self.driveBox.setSizePolicy(sizePolicy)
self.driveBox.setEditable(False)
self.driveBox.setInsertPolicy(QtGui.QComboBox.InsertAtTop)
self.driveBox.setDuplicatesEnabled(False)
self.driveBox.setObjectName(_fromUtf8("driveBox"))
self.horizontalLayout.addWidget(self.driveBox)
self.refreshDevicesButton = QtGui.QPushButton(self.groupBox_2)
self.refreshDevicesButton.setText(_fromUtf8(""))
self.refreshDevicesButton.setIcon(icon)
self.refreshDevicesButton.setFlat(True)
self.refreshDevicesButton.setObjectName(_fromUtf8("refreshDevicesButton"))
self.horizontalLayout.addWidget(self.refreshDevicesButton)
self.gridLayout_3.addLayout(self.horizontalLayout, 0, 0, 1, 1)
self.horizontalLayout_3.addWidget(self.groupBox_2)
self.overlayTitle = QtGui.QGroupBox(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.overlayTitle.sizePolicy().hasHeightForWidth())
self.overlayTitle.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setPointSize(8)
self.overlayTitle.setFont(font)
self.overlayTitle.setObjectName(_fromUtf8("overlayTitle"))
self.gridLayout_4 = QtGui.QGridLayout(self.overlayTitle)
self.gridLayout_4.setObjectName(_fromUtf8("gridLayout_4"))
self.overlaySlider = QtGui.QSlider(self.overlayTitle)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.overlaySlider.sizePolicy().hasHeightForWidth())
self.overlaySlider.setSizePolicy(sizePolicy)
self.overlaySlider.setMaximum(2047)
self.overlaySlider.setOrientation(QtCore.Qt.Horizontal)
self.overlaySlider.setTickPosition(QtGui.QSlider.NoTicks)
self.overlaySlider.setObjectName(_fromUtf8("overlaySlider"))
self.gridLayout_4.addWidget(self.overlaySlider, 0, 0, 1, 1)
self.horizontalLayout_3.addWidget(self.overlayTitle)
self.verticalLayout.addLayout(self.horizontalLayout_3)
self.groupBox_3 = QtGui.QGroupBox(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.groupBox_3.sizePolicy().hasHeightForWidth())
self.groupBox_3.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setPointSize(9)
self.groupBox_3.setFont(font)
self.groupBox_3.setObjectName(_fromUtf8("groupBox_3"))
self.gridLayout_2 = QtGui.QGridLayout(self.groupBox_3)
self.gridLayout_2.setObjectName(_fromUtf8("gridLayout_2"))
self.horizontalLayout_5 = QtGui.QHBoxLayout()
self.horizontalLayout_5.setObjectName(_fromUtf8("horizontalLayout_5"))
self.nonDestructiveButton = QtGui.QRadioButton(self.groupBox_3)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.nonDestructiveButton.sizePolicy().hasHeightForWidth())
self.nonDestructiveButton.setSizePolicy(sizePolicy)
self.nonDestructiveButton.setChecked(True)
self.nonDestructiveButton.setObjectName(_fromUtf8("nonDestructiveButton"))
self.horizontalLayout_5.addWidget(self.nonDestructiveButton)
self.destructiveButton = QtGui.QRadioButton(self.groupBox_3)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.destructiveButton.sizePolicy().hasHeightForWidth())
self.destructiveButton.setSizePolicy(sizePolicy)
self.destructiveButton.setObjectName(_fromUtf8("destructiveButton"))
self.horizontalLayout_5.addWidget(self.destructiveButton)
self.gridLayout_2.addLayout(self.horizontalLayout_5, 0, 0, 1, 1)
self.verticalLayout.addWidget(self.groupBox_3)
self.textEdit = QtGui.QTextEdit(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.textEdit.sizePolicy().hasHeightForWidth())
self.textEdit.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setPointSize(8)
self.textEdit.setFont(font)
self.textEdit.setReadOnly(True)
self.textEdit.setObjectName(_fromUtf8("textEdit"))
self.verticalLayout.addWidget(self.textEdit)
self.progressBar = QtGui.QProgressBar(self.centralwidget)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Expanding, QtGui.QSizePolicy.Preferred)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.progressBar.sizePolicy().hasHeightForWidth())
self.progressBar.setSizePolicy(sizePolicy)
self.progressBar.setProperty("value", 0)
self.progressBar.setObjectName(_fromUtf8("progressBar"))
self.verticalLayout.addWidget(self.progressBar)
self.startButton = QtGui.QPushButton(self.centralwidget)
self.startButton.setEnabled(True)
sizePolicy = QtGui.QSizePolicy(QtGui.QSizePolicy.Preferred, QtGui.QSizePolicy.Fixed)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(self.startButton.sizePolicy().hasHeightForWidth())
self.startButton.setSizePolicy(sizePolicy)
self.startButton.setObjectName(_fromUtf8("startButton"))
self.verticalLayout.addWidget(self.startButton)
self.gridLayout.addLayout(self.verticalLayout, 1, 0, 1, 1)
MainWindow.setCentralWidget(self.centralwidget)
self.retranslateUi(MainWindow)
QtCore.QMetaObject.connectSlotsByName(MainWindow)
def retranslateUi(self, MainWindow):
MainWindow.setWindowTitle(_translate("MainWindow", "Fedora Live USB Creator", None))
self.groupBox.setWhatsThis(_translate("MainWindow", "This button allows you to browse for an existing Live CD ISO that you have previously downloaded. If you do not select one, a release will be downloaded for you automatically.", None))
self.groupBox.setTitle(_translate("MainWindow", "Use existing Live CD", None))
self.isoBttn.setText(_translate("MainWindow", "Browse", None))
self.isoBttn.setShortcut(_translate("MainWindow", "Alt+B", None))
self.downloadGroup.setWhatsThis(_translate("MainWindow", "If you do not select an existing Live CD, the selected release will be downloaded for you.", None))
self.downloadGroup.setTitle(_translate("MainWindow", "Download Fedora", None))
self.groupBox_2.setWhatsThis(_translate("MainWindow", "This is the USB stick that you want to install your Live CD on. This device must be formatted with the FAT filesystem.", None))
self.groupBox_2.setTitle(_translate("MainWindow", "Target Device", None))
self.overlayTitle.setWhatsThis(_translate("MainWindow", "By allocating extra space on your USB stick for a persistent overlay, you will be able to store data and make permanent modifications to your live operating system. Without it, you will not be able to save data that will persist after a reboot.", "comment!"))
self.overlayTitle.setTitle(_translate("MainWindow", "Persistent Storage (0 MB)", None))
self.groupBox_3.setTitle(_translate("MainWindow", "Method", None))
self.nonDestructiveButton.setToolTip(_translate("MainWindow", "This method uses the \'cp\' command to copy the files from the ISO on to your USB key, without deleting any existing files.", None))
self.nonDestructiveButton.setText(_translate("MainWindow", "Non-destructive (cp)", None))
        self.destructiveButton.setToolTip(_translate("MainWindow", "This method uses the \'dd\' command to copy the ISO directly to your USB device, destroying any pre-existing data/partitions. This method tends to be more reliable with regard to booting, especially with UEFI systems. This method also works with DVD images.", None))
self.destructiveButton.setText(_translate("MainWindow", "Overwrite device (dd)", None))
self.textEdit.setWhatsThis(_translate("MainWindow", "This is the status console, where all messages get written to.", None))
self.progressBar.setWhatsThis(_translate("MainWindow", "This is the progress bar that will indicate how far along in the LiveUSB creation process you are", None))
self.startButton.setWhatsThis(_translate("MainWindow", "This button will begin the LiveUSB creation process. This entails optionally downloading a release (if an existing one wasn\'t selected), extracting the ISO to the USB device, creating the persistent overlay, and installing the bootloader.", None))
self.startButton.setText(_translate("MainWindow", "Create Live USB", None))
import resources_rc
|
Why team up with a broker?
You wish to sell a property but don't know where to start? Are you looking for a real estate broker who can offer you a turnkey service and assist you in all stages of the real estate transaction?
Look no further: I am the person for you! I will accompany you from the beginning to the end of your real estate transaction. I will first complete a personalized marketing plan and estimate the market value of your property, which will be highly visible because it will be displayed on more than 100 popular Internet sites around the world, including Realtor (MLS / SIA) and Centris. I will also take care of coordinating all calls and visits, freeing you from these time-consuming tasks. You will also benefit from my judicious advice and negotiation skills, and I will handle all the technical and legal aspects of the promise to purchase, as well as accompany you to the notary.
Are you wondering what happens if you find the buyer yourself? You will simply pay me a reduced commission, and I will do exactly the same job as if I had found the buyer myself!
Do you want to buy real estate? No problem, I can also offer you a turnkey service. Contact me now to learn more or to book an appointment.
The HONORS prize is awarded to Proprio Direct brokers whose results are in the top 20% compared to the results of all Proprio Direct brokers.
|
#!/usr/bin/env python
#
# Copyright 2013 Free Software Foundation, Inc.
#
# This file is part of GNU Radio
#
# SPDX-License-Identifier: GPL-3.0-or-later
#
#
import numpy, sys
from matplotlib import pyplot
from gnuradio import digital
from .soft_dec_lut_gen import soft_dec_table, calc_soft_dec_from_table, calc_soft_dec
from .psk_constellations import psk_4_0, psk_4_1, psk_4_2, psk_4_3, psk_4_4, psk_4_5, psk_4_6, psk_4_7, sd_psk_4_0, sd_psk_4_1, sd_psk_4_2, sd_psk_4_3, sd_psk_4_4, sd_psk_4_5, sd_psk_4_6, sd_psk_4_7
from .qam_constellations import qam_16_0, sd_qam_16_0
def test_qpsk(i, sample, prec):
qpsk_const_list = [psk_4_0, psk_4_1, psk_4_2, psk_4_3,
psk_4_4, psk_4_5, psk_4_6, psk_4_7]
qpsk_lut_gen_list = [sd_psk_4_0, sd_psk_4_1, sd_psk_4_2, sd_psk_4_3,
sd_psk_4_4, sd_psk_4_5, sd_psk_4_6, sd_psk_4_7]
constel, code = qpsk_const_list[i]()
qpsk_lut_gen = qpsk_lut_gen_list[i]
rot_sym = 1
side = 2
width = 2
c = digital.constellation_rect(constel, code, rot_sym,
side, side, width, width)
# Get max energy/symbol in constellation
constel = c.points()
Es = max([numpy.sqrt(constel_i.real**2 + constel_i.imag**2) for constel_i in constel])
#table = soft_dec_table_generator(qpsk_lut_gen, prec, Es)
table = soft_dec_table(constel, code, prec)
c.gen_soft_dec_lut(prec)
#c.set_soft_dec_lut(table, prec)
y_python_gen_calc = qpsk_lut_gen(sample, Es)
y_python_table = calc_soft_dec_from_table(sample, table, prec, Es)
y_python_raw_calc = calc_soft_dec(sample, constel, code)
y_cpp_table = c.soft_decision_maker(sample)
y_cpp_raw_calc = c.calc_soft_dec(sample)
return (y_python_gen_calc, y_python_table, y_python_raw_calc,
y_cpp_table, y_cpp_raw_calc, constel, code, c)
def test_qam16(i, sample, prec):
sample = sample / 1
qam_const_list = [qam_16_0, ]
qam_lut_gen_list = [sd_qam_16_0, ]
constel, code = qam_const_list[i]()
qam_lut_gen = qam_lut_gen_list[i]
rot_sym = 4
side = 2
width = 2
c = digital.constellation_rect(constel, code, rot_sym,
side, side, width, width)
# Get max energy/symbol in constellation
constel = c.points()
Es = max([abs(constel_i) for constel_i in constel])
#table = soft_dec_table_generator(qam_lut_gen, prec, Es)
table = soft_dec_table(constel, code, prec, 1)
#c.gen_soft_dec_lut(prec)
c.set_soft_dec_lut(table, prec)
y_python_gen_calc = qam_lut_gen(sample, Es)
y_python_table = calc_soft_dec_from_table(sample, table, prec, Es)
y_python_raw_calc = calc_soft_dec(sample, constel, code, 1)
y_cpp_table = c.soft_decision_maker(sample)
y_cpp_raw_calc = c.calc_soft_dec(sample)
return (y_python_gen_calc, y_python_table, y_python_raw_calc,
y_cpp_table, y_cpp_raw_calc, constel, code, c)
if __name__ == "__main__":
index = 0
prec = 8
x_re = 2*numpy.random.random()-1
x_im = 2*numpy.random.random()-1
x = x_re + x_im*1j
#x = -1 + -0.j
    if 1:  # set to 0 to exercise the QAM16 path below instead of QPSK
y_python_gen_calc, y_python_table, y_python_raw_calc, \
y_cpp_table, y_cpp_raw_calc, constel, code, c \
= test_qpsk(index, x, prec)
else:
y_python_gen_calc, y_python_table, y_python_raw_calc, \
y_cpp_table, y_cpp_raw_calc, constel, code, c \
= test_qam16(index, x, prec)
k = numpy.log2(len(constel))
print("Sample: ", x)
print("Python Generator Calculated: ", (y_python_gen_calc))
print("Python Generator Table: ", (y_python_table))
print("Python Raw calc: ", (y_python_raw_calc))
print("C++ Table calc: ", (y_cpp_table))
print("C++ Raw calc: ", (y_cpp_raw_calc))
fig = pyplot.figure(1)
sp1 = fig.add_subplot(1,1,1)
    sp1.plot([pt.real for pt in constel],
             [pt.imag for pt in constel], 'bo')
    sp1.plot(x.real, x.imag, 'ro')
    sp1.set_xlim([-1.5, 1.5])
    sp1.set_ylim([-1.5, 1.5])
    fill = int(numpy.log2(len(constel)))
    for i, pt in enumerate(constel):
        sp1.text(1.2*pt.real, 1.2*pt.imag, bin(code[i])[2:].zfill(fill),
                 ha='center', va='center', size=18)
pyplot.show()
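For reference, the per-bit soft decisions these tests compare can be sketched independently of GNU Radio. For each bit position, take the difference between the minimum squared distance to constellation points carrying a 0 and those carrying a 1 (the max-log approximation of the log-likelihood ratio; the library's `calc_soft_dec` may use an exact sum instead). The QPSK mapping below is illustrative, not one of the `psk_4_*` definitions:

```python
import math

def soft_decisions(sample, points, codes):
    # points: complex constellation points; codes: their integer bit labels.
    # Returns one soft value per bit, positive when bit = 1 is more likely.
    nbits = int(math.log2(len(points)))
    dists = [abs(sample - p) ** 2 for p in points]
    out = []
    for bit in range(nbits):
        d0 = min(d for d, c in zip(dists, codes) if not (c >> bit) & 1)
        d1 = min(d for d, c in zip(dists, codes) if (c >> bit) & 1)
        out.append(d0 - d1)  # > 0 favours bit = 1
    return out

# Gray-coded QPSK: four points and their 2-bit labels (illustrative mapping)
qpsk = [1+1j, -1+1j, -1-1j, 1-1j]
labels = [0, 1, 3, 2]
llrs = soft_decisions(0.9 + 0.8j, qpsk, labels)
```

A sample near `1+1j` (label 0) yields two negative soft values, since both bits favour 0; the magnitudes grow as the sample moves away from the decision boundaries.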
|
Sheep exist in astonishing variety beyond the typical concept of “fluffy white sheep in a field.” I had a notion to make something unusual, but nevertheless real. So I researched rarer breeds, and settled upon the Balwen Welsh Mountain sheep because I really like their markings. Prior to research, I’d never heard of this breed. They originated in the British Isles, particularly in Wales. Here I’ve chosen to create a ewe with her newborn lamb, ready to nurse.
Balwen sheep are born black, their fleece later fading to brown in sunlight. They have a white blaze on the face, four white feet (referred to as socks), and white covering the last half or more of the tail, which is normally left undocked.
Now, before things get entirely too serious… the magical sheep reappear in a silly parade of Blooming Ewes sporting their colorful finery, from floral bonnets to striped stockings and beautiful booties (of the footwear sort!). (P.S. They are bright for sure, but not neon, as I suspect my monitor is showing me.)
Appearing soon at Black Sheep Gathering 2013. Come see us!
How does one purchase one of these whimsical sheep and what size are they?
Please excuse my late reply – I was on a long sabbatical to prep and sell a house (mission accomplished, and I'm ba-ack).
My creations are sold only at local wool festivals in the Portland, Oregon area, where we have a large and lively fiber community – large enough that I can scarcely produce enough to fill each show booth, and I never did manage to populate an Etsy page. That's a good thing, because Etsy is open 24/7 year round, and I'm just trying to keep up with shows every few months.
The sheep tend to be about 5–6″ long and about 6–7″ tall, with exceptions: the Blooming Ewes are larger by a few inches. Having just resumed production after a very long absence, I am also creating some miniature sizes.
As always, new ideas appear alongside now staple favorites.
|
#!/usr/bin/env python
'''
netmiko demo: show the logging/buffer config on two lab routers,
push changes from a command file, then show the config again
'''
# imports
try:
import textwrap
import time
import netmiko
except ImportError:
    print("Could not import a required module.\nExiting")
raise SystemExit
# Variables
RTR1 = {
'host':'pynet-rtr1',
'device_type':'cisco_ios',
'ip':'184.105.247.70',
'username':'pyclass',
'password':'88newclass',
'secret':'',
'port':22,
'timeout':60
}
RTR2 = {
'host':'pynet-rtr2',
'device_type':'cisco_ios',
'ip':'184.105.247.71',
'username':'pyclass',
'password':'88newclass',
'secret':'',
'port':22,
'timeout':60
}
NEW_BUFFER = str(((time.localtime()[3] * 60 + time.localtime()[4]) * 60) + time.localtime()[5] + 4096)  # seconds since midnight + 4096; defined but never referenced below
COMMAND_FILE = 'exer08_commands.txt'
DEVICE_LIST = [
RTR1,
RTR2
]
LINE_INDENT = 8
LINE_WIDTH = 100
def main():
'''
main app
'''
for device_to_change in DEVICE_LIST:
# make connection to device
dev_connection = netmiko.ConnectHandler(**device_to_change)
# check config before change
        print(device_to_change['host'])
        command_result = dev_connection.send_command('sh run | inc buff|logging con')
        for result_line in command_result.splitlines():
            print(textwrap.fill(
                result_line,
                width=LINE_WIDTH,
                initial_indent=' ' * LINE_INDENT,
                subsequent_indent=' ' * (LINE_INDENT + 4)
            ))
# execute commands
        dev_connection.send_config_from_file(config_file=COMMAND_FILE)
# check config after change
        command_result = dev_connection.send_command('sh run | inc buff|logging con')
        for result_line in command_result.splitlines():
            print(textwrap.fill(
                result_line,
                width=LINE_WIDTH,
                initial_indent=' ' * LINE_INDENT,
                subsequent_indent=' ' * (LINE_INDENT + 4)
            ))
if __name__ == "__main__":
main()
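The contents of exer08_commands.txt are not shown; for a script that then inspects the "buff" and "logging con" lines of the running config, a plausible (entirely hypothetical) command file would contain IOS configuration lines such as:

```text
logging buffered 64000
no logging console
```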
|
Neal’s Yard is a British health and beauty brand focused on organic skincare and creating the most natural remedies possible. In 1981, the first Neal’s Yard Apothecary store was opened in Covent Garden by Romy Fraser. Not long after, the iconic Frankincense Nourishing Cream was born in 1983. Neal’s Yard now boasts an impressive range of anti-ageing skincare, men’s skincare, and make-up.
|
#!/usr/bin/env python
# File: collect_stamps.py
# Created on: Fri 15 Jun 2012 10:11:00 AM CDT
# Last Change: Mon 18 Jun 2012 11:10:48 AM CDT
# Purpose of script: <+INSERT+>
# Author: Steven Boada
from mk_galaxy_struc import mk_galaxy_struc
galaxies = mk_galaxy_struc()
f1 = open('lowMass.list','wt')
f2 = open('medMass_lowicd.list','wt')
f3 = open('medMass_highicd.list','wt')
f4 = open('highMass.list','wt')
f1.writelines('#field #ID #ICD_IH #MASS #SPIRAL #ELLIPTICAL #UNCERTAIN\n')
f2.writelines('#field #ID #ICD_IH #MASS #SPIRAL #ELLIPTICAL #UNCERTAIN\n')
f3.writelines('#field #ID #ICD_IH #MASS #SPIRAL #ELLIPTICAL #UNCERTAIN\n')
f4.writelines('#field #ID #ICD_IH #MASS #SPIRAL #ELLIPTICAL #UNCERTAIN\n')
for i in range(len(galaxies)):
if galaxies[i].ston_I >= 30.0:
if galaxies[i].Mass <= 1e9:
f1.writelines(str(galaxies[i].field)+' '+str(galaxies[i].ID)+\
' '+str(galaxies[i].ICD_IH)+' '+str(galaxies[i].Mass)+\
' '+str(galaxies[i].Spiral)+' '+str(galaxies[i].Elliptical)+\
' '+str(galaxies[i].Uncertain)+'\n')
        elif 1e9 < galaxies[i].Mass <= 1e11:
if galaxies[i].ICD_IH <= 0.05:
f2.writelines(str(galaxies[i].field)+' '+str(galaxies[i].ID)+\
' '+str(galaxies[i].ICD_IH)+' '+str(galaxies[i].Mass)+\
' '+str(galaxies[i].Spiral)+' '+str(galaxies[i].Elliptical)+\
' '+str(galaxies[i].Uncertain)+'\n')
else:
f3.writelines(str(galaxies[i].field)+' '+str(galaxies[i].ID)+\
' '+str(galaxies[i].ICD_IH)+' '+str(galaxies[i].Mass)+\
' '+str(galaxies[i].Spiral)+' '+str(galaxies[i].Elliptical)+\
' '+str(galaxies[i].Uncertain)+'\n')
        elif galaxies[i].Mass > 1e11:
f4.writelines(str(galaxies[i].field)+' '+str(galaxies[i].ID)+\
' '+str(galaxies[i].ICD_IH)+' '+str(galaxies[i].Mass)+\
' '+str(galaxies[i].Spiral)+' '+str(galaxies[i].Elliptical)+\
' '+str(galaxies[i].Uncertain)+'\n')
f1.close()
f2.close()
f3.close()
f4.close()
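The four nearly identical writelines blocks invite refactoring. Below is a sketch of two helpers that could replace them; the Galaxy namedtuple is a hypothetical stand-in for the objects returned by mk_galaxy_struc(), which is not shown here:

```python
from collections import namedtuple

# Hypothetical stand-in for the objects returned by mk_galaxy_struc();
# only the attributes the script touches are modeled.
Galaxy = namedtuple('Galaxy',
                    'field ID ICD_IH Mass Spiral Elliptical Uncertain ston_I')

def format_row(g):
    """Render one whitespace-separated output line for a galaxy."""
    fields = (g.field, g.ID, g.ICD_IH, g.Mass,
              g.Spiral, g.Elliptical, g.Uncertain)
    return ' '.join(str(v) for v in fields) + '\n'

def pick_bin(g):
    """Return the output-file index (0-3) matching the mass/ICD cuts above."""
    if g.Mass <= 1e9:
        return 0                                # lowMass
    if g.Mass <= 1e11:
        return 1 if g.ICD_IH <= 0.05 else 2     # medMass, low/high icd
    return 3                                    # highMass
```

With these, the main loop reduces to opening the four files in a list and calling `files[pick_bin(g)].write(format_row(g))` for each galaxy passing the signal-to-noise cut.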
|
from django.shortcuts import render, get_object_or_404, redirect
from django.core.paginator import Paginator, EmptyPage, PageNotAnInteger
from taggit.models import Tag
from .models import Post, Page
POSTS_PER_PAGE = 5
def index(request):
"""Displays first page with latest posts"""
return index_pagination(request, 1)
def index_pagination(request, pagination):
"""Displays n-th page with latest posts"""
page = int(pagination)
if not request.user.is_authenticated:
posts_published = Post.objects.filter(published=True).order_by('-pub_date')
else:
posts_published = Post.objects.order_by('-pub_date')
paginator = Paginator(posts_published, POSTS_PER_PAGE)
try:
posts = paginator.page(page)
except PageNotAnInteger:
posts = paginator.page(1)
except EmptyPage:
posts = paginator.page(paginator.num_pages)
context = {
'posts': posts
}
return render(request, 'blog/index.html', context)
def post(request, post_slug):
"""Displays single post"""
if not request.user.is_authenticated:
query = Post.objects.filter(published=True)
else:
query = Post.objects.all()
post = get_object_or_404(query, slug=post_slug)
context = {'post': post}
return render(request, 'blog/post.html', context)
def tag(request, tag_slug):
"""Displays first page with posts with given tag"""
return tag_pagination(request, tag_slug, 1)
def tag_pagination(request, tag_slug, pagination):
"""Displays n-th page with posts with given tag"""
page = int(pagination)
if not request.user.is_authenticated:
posts_with_tag = Post.objects.filter(published=True).filter(tags__slug__in=[tag_slug]).order_by('-pub_date').all()
else:
posts_with_tag = Post.objects.filter(tags__slug__in=[tag_slug]).order_by('-pub_date').all()
paginator = Paginator(posts_with_tag, POSTS_PER_PAGE)
try:
posts = paginator.page(page)
except PageNotAnInteger:
posts = paginator.page(1)
except EmptyPage:
posts = paginator.page(paginator.num_pages)
    tag = get_object_or_404(Tag, slug=tag_slug)
context = {
'posts': posts,
'tag': tag
}
return render(request, 'blog/tag.html', context)
def page(request, page_slug):
"""Displays page"""
    page = get_object_or_404(Page, slug=page_slug)
context = {'page': page}
return render(request, 'blog/page.html', context)
def youtube(request):
return redirect('https://www.youtube.com/channel/UCHPUGfK2zW0VUNN2SgCHsXg')
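For context, a matching URLconf could wire these views up. The project's actual urls.py is not shown, so every pattern and route name below is an assumption, not the real configuration:

```python
# blog/urls.py -- hypothetical routing for the views above
from django.urls import path
from . import views

urlpatterns = [
    path('', views.index, name='index'),
    path('page/<int:pagination>/', views.index_pagination, name='index_pagination'),
    path('post/<slug:post_slug>/', views.post, name='post'),
    path('tag/<slug:tag_slug>/', views.tag, name='tag'),
    path('tag/<slug:tag_slug>/page/<int:pagination>/', views.tag_pagination,
         name='tag_pagination'),
    path('youtube/', views.youtube, name='youtube'),
    path('<slug:page_slug>/', views.page, name='page'),  # catch-all, keep last
]
```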
|
Explore the medium of etching with a variety of tools and techniques. Learn to work with acid, hard ground, soft ground, drypoint, aquatints, chine-collé and other techniques and acquire the skills to transfer your printing plates onto paper using a printing press. Whether you are producing traditional editioned prints or investigating other methods of printing, explore the exciting range of results that are possible through the etching process. With self-generated ideas, enjoy creating one or a number of bodies of work.
Angus Fisher is a practicing printmaker who is skilled in using intricate, traditional techniques to represent the natural world. Angus is a graduate of the National Art School and is currently represented by Australian Galleries where he exhibits regularly. He has worked as an artist and teacher around Australia and as an archaeological illustrator in Greece. Angus is currently working towards an exhibition in Sydney as well as the creation of a limited edition series of books. Watch Angus producing his prints here.
|
# This program is to count instances of particular values or names in a file.
# It requires two inputs at the command line, the first is the file being read, and the second
# is the file to be written with the keys and values in a csv file.
# Written by Robert Tyx wyu3@cdc.gov on 3/3/2014
# File: instance_counter.py
import csv # Comma Separated variable module
import sys # system module
def count_names(lines):                 # count occurrences of each stripped line
    result = {}                         # create a dictionary called result
    for name in lines:                  # at every new line, do the following
        name = name.strip()             # strip off whitespace and the newline
        if name in result:              # if the name is already a key in the dictionary
            result[name] += 1           # add one to its count
        else:                           # otherwise do the following
            result[name] = 1            # add the name with a count of one
    return result                       # return the dictionary to the caller
if __name__ == '__main__':
    with open(sys.argv[1]) as reader:   # open the file named on the command line
        lines = reader.readlines()      # read each line into lines
    count = count_names(lines)          # call function count_names
    for name in count:                  # print all keys and counts to the screen
        print(name, count[name])
    # write each key and count to the second file named on the command line
    with open(sys.argv[2], 'w', newline='') as csvfile:
        writer = csv.writer(csvfile)
        for key, value in count.items():
            writer.writerow([key, value])
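For what it's worth, the hand-rolled counting loop above is exactly what collections.Counter provides in the standard library; an equivalent sketch:

```python
from collections import Counter

def count_names(lines):
    """Equivalent of the manual dictionary loop, using collections.Counter."""
    return Counter(name.strip() for name in lines)

# Counter behaves like a dict of name -> count:
# count_names(['a\n', 'b\n', 'a\n']) == {'a': 2, 'b': 1}
```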
|
One of my blog readers asked me to write more about goal-setting and performance against goals. In response, I studied the work of University of Maryland’s Edwin Locke and University of Toronto’s Gary Latham, two renowned researchers on goal-setting. Here is a summary.
Goals can help direct: A person’s goals should direct his/her attention, effort, and action toward goal-relevant actions at the expense of less-relevant actions.
Goals can help motivate: A person’s goals can motivate him/her to pursue specific outcomes. The person can be motivated only when his/her goals are sufficiently challenging and can nudge him/her to put in special efforts.
Goals can help persist: A person is likely to persist at his/her efforts when his/her goal is worthy enough to attain.
Goals can trigger learning: Goals can either activate a person’s knowledge and skills that are relevant to performance or induce the person to acquire such knowledge or skills.
Specific, difficult, but attainable goals lead to better performance than easy, vague, or abstract goals such as the general-purpose exhortation to “do your best.” Hard goals motivate because they require a person to achieve more in order to be content with his/her own performance.
Goal specificity and performance share a positive, linear relationship. When a person’s goals are specific, they direct and energize his/her behavior far more effectively than when they are vague and unspecific.
Performance is directly proportional to the difficulty of a goal as long as a person is committed to the goal, has the requisite ability and resources to achieve the goal, and does not have conflicting goals.
Taking on excess work without access to the necessary resources to realize the goals (“overload”) can moderate the effects of goals.
A team performs best when the goals of the individuals on the team are compatible with the team's goal. Therefore, when an individual's goals are incompatible with his/her team's, his/her contribution to the team will be subpar.
The goal need not be in focal awareness all the time. Once a goal is accepted and understood, it resides in the periphery of the person’s consciousness and serves to guide and give meaning to his/her actions.
While long-term goals are relevant and helpful, most people find short-term goals more effective because they channel a person’s immediate and direct efforts and provide quick feedback. This suggests that it’s best to divide long-term goals into concrete short-term objectives.
Self-efficacy plays a key role in the achievement of goals. A person is much more likely to buy into and pursue goals if he/she believes himself/herself to be competent enough to reach those goals. The most effective goals must therefore embrace a person’s strengths—such goals help him/her strive towards success by leveraging the best of who he/she is and what he/she can do.
One reason a person may lack self-efficacy is his/her past failures with undertaking similar goals. Such a person may believe that he/she may never reach his/her goals and should first undertake a series of small, near-term goals instead of difficult, distant goals. The person’s success with a series of smaller goals can boost his/her confidence and can inspire him/her to undertake larger goals. For example, a chain-smoker will find the goal of smoking cessation daunting. He should therefore focus on smaller goals like gradually cutting down the number of cigarettes he smokes every day. Experiences of goal achievement can build up momentum to tackle the larger goal.
Goals are not effective by themselves. Feedback is the most important moderator of goal-setting because it tracks the progress of performance towards goals and creates new sub-goals. If a person finds his/her progress towards a goal unsatisfactory, the feedback he/she receives can drive corrective efforts to develop new skills or pursue the goal in a new way.
|
"""
Generate http://plantuml.com/sequence-diagram
"""
import logging
from . import formatting, plantuml_text_encoding
from .seqdiag_model import Category
logger = logging.getLogger(__name__)
MSG_TO_TEXTUAL_REPR = {
Category.request: '"{source}" -> "{destination}": {text}\n',
Category.response: '"{destination}" <-- "{source}": {text}\n',
}
NOTE_LOCATION = {
Category.request: 'right',
Category.response: 'left',
}
def html_image(messages):
"""
Generate an HTML img element with an SVG sequence diagram
"""
logger.debug('Generating sequence diagram')
textual_repr = _generate_textual_representation(messages)
encoded_repr = plantuml_text_encoding.encode(textual_repr)
html = '<img src="http://www.plantuml.com/plantuml/svg/%s">' % encoded_repr
return html
def _generate_textual_representation(messages):
textual_repr = ''
for msg in messages:
textual_repr += MSG_TO_TEXTUAL_REPR[msg.category].format(
source=_sanitize(msg.src),
destination=_sanitize(msg.dst),
text=msg.text)
formatters = [
formatting.prettify_json, formatting.shorten_long_strings
]
for fmt in formatters:
fmt(msg)
        if msg.note:
            textual_repr += ('note ' + NOTE_LOCATION[msg.category] + '\n' +
                             _indent(msg.note) + '\nend note\n')
return textual_repr
def _sanitize(participant):
return participant.replace('"', "'")
def _indent(text):
return ' ' + '\n '.join(text.splitlines())
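To make the generated PlantUML text concrete, here is a standalone sketch of the message formatting; the Category enum and the Msg tuple are stand-ins for the real seqdiag_model types, which are not shown here:

```python
from collections import namedtuple
from enum import Enum

class Category(Enum):
    request = 1
    response = 2

# Minimal stand-in for the real message objects from seqdiag_model.
Msg = namedtuple('Msg', 'category src dst text')

MSG_TO_TEXTUAL_REPR = {
    Category.request: '"{source}" -> "{destination}": {text}\n',
    Category.response: '"{destination}" <-- "{source}": {text}\n',
}

def to_plantuml(messages):
    """Concatenate one PlantUML arrow line per message."""
    return ''.join(MSG_TO_TEXTUAL_REPR[m.category].format(
        source=m.src, destination=m.dst, text=m.text) for m in messages)

print(to_plantuml([
    Msg(Category.request, 'client', 'server', 'GET /users'),
    Msg(Category.response, 'client', 'server', '200 OK'),
]))
# prints:
# "client" -> "server": GET /users
# "server" <-- "client": 200 OK
```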
|
Special Announcement: our first community project for 2013 is about to launch on 17 March 2013 – the Mikode Music Competition 2013, an online talent contest open to all music genres: solo or groups, vocals and/or instrumentals, originals or covers. First prize is a professional recording at AAA Studios, located at the Caloundra Music Academy in Caloundra, QLD, Australia, plus online promotion of your CD for one year – see the demo page CLICK HERE. 10% of revenue from your CD sales will be donated to a school which you nominate. The winner will be decided by public votes.
|
# -*- coding: utf-8 -*-
##############################################################################
#
# OpenERP, Open Source Management Solution
# Copyright (C) 2004-2009 Tiny SPRL (<http://tiny.be>).
# Copyright (C) 2010-2014 OpenERP s.a. (<http://openerp.com>).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
"""
The PostgreSQL connector is a connectivity layer between the OpenERP code and
the database, *not* a database abstraction toolkit. Database abstraction is what
the ORM does, in fact.
"""
from contextlib import contextmanager
from functools import wraps
import logging
import urlparse
import uuid
import psycopg2.extras
import psycopg2.extensions
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT, ISOLATION_LEVEL_READ_COMMITTED, ISOLATION_LEVEL_REPEATABLE_READ
from psycopg2.pool import PoolError
psycopg2.extensions.register_type(psycopg2.extensions.UNICODE)
_logger = logging.getLogger(__name__)
types_mapping = {
'date': (1082,),
'time': (1083,),
'datetime': (1114,),
}
def unbuffer(symb, cr):
if symb is None:
return None
return str(symb)
def undecimalize(symb, cr):
if symb is None:
return None
return float(symb)
for name, typeoid in types_mapping.items():
psycopg2.extensions.register_type(psycopg2.extensions.new_type(typeoid, name, lambda x, cr: x))
psycopg2.extensions.register_type(psycopg2.extensions.new_type((700, 701, 1700,), 'float', undecimalize))
import tools
from tools.func import frame_codeinfo
from datetime import datetime as mdt
from datetime import timedelta
import threading
from inspect import currentframe
import re
re_from = re.compile('.* from "?([a-zA-Z_0-9]+)"? .*$')
re_into = re.compile('.* into "?([a-zA-Z_0-9]+)"? .*$')
sql_counter = 0
class Cursor(object):
"""Represents an open transaction to the PostgreSQL DB backend,
acting as a lightweight wrapper around psycopg2's
``cursor`` objects.
``Cursor`` is the object behind the ``cr`` variable used all
over the OpenERP code.
.. rubric:: Transaction Isolation
One very important property of database transactions is the
level of isolation between concurrent transactions.
The SQL standard defines four levels of transaction isolation,
ranging from the most strict *Serializable* level, to the least
strict *Read Uncommitted* level. These levels are defined in
terms of the phenomena that must not occur between concurrent
transactions, such as *dirty read*, etc.
In the context of a generic business data management software
such as OpenERP, we need the best guarantees that no data
    corruption can ever be caused by simply running multiple
transactions in parallel. Therefore, the preferred level would
be the *serializable* level, which ensures that a set of
transactions is guaranteed to produce the same effect as
running them one at a time in some order.
However, most database management systems implement a limited
serializable isolation in the form of
`snapshot isolation <http://en.wikipedia.org/wiki/Snapshot_isolation>`_,
providing most of the same advantages as True Serializability,
with a fraction of the performance cost.
With PostgreSQL up to version 9.0, this snapshot isolation was
the implementation of both the ``REPEATABLE READ`` and
``SERIALIZABLE`` levels of the SQL standard.
As of PostgreSQL 9.1, the previous snapshot isolation implementation
was kept for ``REPEATABLE READ``, while a new ``SERIALIZABLE``
level was introduced, providing some additional heuristics to
detect a concurrent update by parallel transactions, and forcing
one of them to rollback.
OpenERP implements its own level of locking protection
for transactions that are highly likely to provoke concurrent
updates, such as stock reservations or document sequences updates.
Therefore we mostly care about the properties of snapshot isolation,
but we don't really need additional heuristics to trigger transaction
rollbacks, as we are taking care of triggering instant rollbacks
ourselves when it matters (and we can save the additional performance
hit of these heuristics).
As a result of the above, we have selected ``REPEATABLE READ`` as
the default transaction isolation level for OpenERP cursors, as
it will be mapped to the desired ``snapshot isolation`` level for
all supported PostgreSQL version (8.3 - 9.x).
Note: up to psycopg2 v.2.4.2, psycopg2 itself remapped the repeatable
read level to serializable before sending it to the database, so it would
actually select the new serializable mode on PostgreSQL 9.1. Make
sure you use psycopg2 v2.4.2 or newer if you use PostgreSQL 9.1 and
the performance hit is a concern for you.
.. attribute:: cache
Cache dictionary with a "request" (-ish) lifecycle, only lives as
long as the cursor itself does and proactively cleared when the
cursor is closed.
This cache should *only* be used to store repeatable reads as it
ignores rollbacks and savepoints, it should not be used to store
*any* data which may be modified during the life of the cursor.
"""
IN_MAX = 1000 # decent limit on size of IN queries - guideline = Oracle limit
def check(f):
@wraps(f)
def wrapper(self, *args, **kwargs):
if self._closed:
msg = 'Unable to use a closed cursor.'
if self.__closer:
msg += ' It was closed at %s, line %s' % self.__closer
raise psycopg2.OperationalError(msg)
return f(self, *args, **kwargs)
return wrapper
def __init__(self, pool, dbname, dsn, serialized=True):
self.sql_from_log = {}
self.sql_into_log = {}
# default log level determined at cursor creation, could be
# overridden later for debugging purposes
self.sql_log = _logger.isEnabledFor(logging.DEBUG)
self.sql_log_count = 0
# avoid the call of close() (by __del__) if an exception
# is raised by any of the following initialisations
self._closed = True
self.__pool = pool
self.dbname = dbname
# Whether to enable snapshot isolation level for this cursor.
# see also the docstring of Cursor.
self._serialized = serialized
self._cnx = pool.borrow(dsn)
self._obj = self._cnx.cursor()
if self.sql_log:
self.__caller = frame_codeinfo(currentframe(), 2)
else:
self.__caller = False
self._closed = False # real initialisation value
self.autocommit(False)
self.__closer = False
self._default_log_exceptions = True
self.cache = {}
def __build_dict(self, row):
return {d.name: row[i] for i, d in enumerate(self._obj.description)}
def dictfetchone(self):
row = self._obj.fetchone()
return row and self.__build_dict(row)
def dictfetchmany(self, size):
return map(self.__build_dict, self._obj.fetchmany(size))
def dictfetchall(self):
return map(self.__build_dict, self._obj.fetchall())
def __del__(self):
if not self._closed and not self._cnx.closed:
# Oops. 'self' has not been closed explicitly.
# The cursor will be deleted by the garbage collector,
# but the database connection is not put back into the connection
# pool, preventing some operation on the database like dropping it.
# This can also lead to a server overload.
msg = "Cursor not closed explicitly\n"
if self.__caller:
msg += "Cursor was created at %s:%s" % self.__caller
else:
msg += "Please enable sql debugging to trace the caller."
_logger.info(msg)
self._close(True)
@check
def execute(self, query, params=None, log_exceptions=None):
if '%d' in query or '%f' in query:
_logger.info("SQL queries cannot contain %%d or %%f anymore. Use only %%s:\n%s" % query,
exc_info=_logger.isEnabledFor(logging.DEBUG))
if params and not isinstance(params, (tuple, list, dict)):
_logger.info("SQL query parameters should be a tuple, list or dict; got %r", params)
raise ValueError("SQL query parameters should be a tuple, list or dict; got %r" % (params,))
if self.sql_log:
now = mdt.now()
try:
params = params or None
res = self._obj.execute(query, params)
except psycopg2.ProgrammingError, pe:
if self._default_log_exceptions if log_exceptions is None else log_exceptions:
_logger.info("Programming error: %s, in query %s", pe, query)
raise
except Exception:
if self._default_log_exceptions if log_exceptions is None else log_exceptions:
_logger.info("bad query: %s", self._obj.query or query)
raise
# simple query count is always computed
self.sql_log_count += 1
# advanced stats only if sql_log is enabled
if self.sql_log:
delay = mdt.now() - now
delay = delay.seconds * 1E6 + delay.microseconds
_logger.debug("query: %s", self._obj.query)
res_from = re_from.match(query.lower())
if res_from:
self.sql_from_log.setdefault(res_from.group(1), [0, 0])
self.sql_from_log[res_from.group(1)][0] += 1
self.sql_from_log[res_from.group(1)][1] += delay
res_into = re_into.match(query.lower())
if res_into:
self.sql_into_log.setdefault(res_into.group(1), [0, 0])
self.sql_into_log[res_into.group(1)][0] += 1
self.sql_into_log[res_into.group(1)][1] += delay
return res
def split_for_in_conditions(self, ids):
"""Split a list of identifiers into one or more smaller tuples
safe for IN conditions, after uniquifying them."""
return tools.misc.split_every(self.IN_MAX, set(ids))
def print_log(self):
global sql_counter
if not self.sql_log:
return
def process(type):
sqllogs = {'from': self.sql_from_log, 'into': self.sql_into_log}
sum = 0
if sqllogs[type]:
sqllogitems = sqllogs[type].items()
sqllogitems.sort(key=lambda k: k[1][1])
_logger.debug("SQL LOG %s:", type)
sqllogitems.sort(lambda x, y: cmp(x[1][0], y[1][0]))
for r in sqllogitems:
delay = timedelta(microseconds=r[1][1])
_logger.debug("table: %s: %s/%s", r[0], delay, r[1][0])
sum += r[1][1]
sqllogs[type].clear()
sum = timedelta(microseconds=sum)
_logger.debug("SUM %s:%s/%d [%d]", type, sum, self.sql_log_count, sql_counter)
sqllogs[type].clear()
process('from')
process('into')
self.sql_log_count = 0
self.sql_log = False
@check
def close(self):
return self._close(False)
def _close(self, leak=False):
global sql_counter
if not self._obj:
return
del self.cache
if self.sql_log:
self.__closer = frame_codeinfo(currentframe(), 3)
# simple query count is always computed
sql_counter += self.sql_log_count
# advanced stats only if sql_log is enabled
self.print_log()
self._obj.close()
# This force the cursor to be freed, and thus, available again. It is
# important because otherwise we can overload the server very easily
# because of a cursor shortage (because cursors are not garbage
# collected as fast as they should). The problem is probably due in
# part because browse records keep a reference to the cursor.
del self._obj
self._closed = True
# Clean the underlying connection.
self._cnx.rollback()
if leak:
self._cnx.leaked = True
else:
chosen_template = tools.config['db_template']
templates_list = tuple(set(['template0', 'template1', 'postgres', chosen_template]))
keep_in_pool = self.dbname not in templates_list
self.__pool.give_back(self._cnx, keep_in_pool=keep_in_pool)
@check
def autocommit(self, on):
if on:
isolation_level = ISOLATION_LEVEL_AUTOCOMMIT
else:
# If a serializable cursor was requested, we
            # use the appropriate PostgreSQL isolation level
            # that maps to snapshot isolation.
# For all supported PostgreSQL versions (8.3-9.x),
# this is currently the ISOLATION_REPEATABLE_READ.
# See also the docstring of this class.
# NOTE: up to psycopg 2.4.2, repeatable read
# is remapped to serializable before being
# sent to the database, so it is in fact
# unavailable for use with pg 9.1.
isolation_level = \
ISOLATION_LEVEL_REPEATABLE_READ \
if self._serialized \
else ISOLATION_LEVEL_READ_COMMITTED
self._cnx.set_isolation_level(isolation_level)
@check
def commit(self):
""" Perform an SQL `COMMIT`
"""
return self._cnx.commit()
@check
def rollback(self):
""" Perform an SQL `ROLLBACK`
"""
return self._cnx.rollback()
def __enter__(self):
""" Using the cursor as a contextmanager automatically commits and
closes it::
with cr:
cr.execute(...)
# cr is committed if no failure occurred
# cr is closed in any case
"""
return self
def __exit__(self, exc_type, exc_value, traceback):
if exc_type is None:
self.commit()
self.close()
@contextmanager
@check
def savepoint(self):
"""context manager entering in a new savepoint"""
name = uuid.uuid1().hex
self.execute('SAVEPOINT "%s"' % name)
try:
yield
self.execute('RELEASE SAVEPOINT "%s"' % name)
except:
self.execute('ROLLBACK TO SAVEPOINT "%s"' % name)
raise
@check
def __getattr__(self, name):
return getattr(self._obj, name)
class TestCursor(Cursor):
""" A cursor to be used for tests. It keeps the transaction open across
several requests, and simulates committing, rolling back, and closing.
"""
def __init__(self, *args, **kwargs):
super(TestCursor, self).__init__(*args, **kwargs)
# in order to simulate commit and rollback, the cursor maintains a
# savepoint at its last commit
self.execute("SAVEPOINT test_cursor")
# we use a lock to serialize concurrent requests
self._lock = threading.RLock()
def acquire(self):
self._lock.acquire()
def release(self):
self._lock.release()
def force_close(self):
super(TestCursor, self).close()
def close(self):
if not self._closed:
self.rollback() # for stuff that has not been committed
self.release()
def autocommit(self, on):
_logger.debug("TestCursor.autocommit(%r) does nothing", on)
def commit(self):
self.execute("RELEASE SAVEPOINT test_cursor")
self.execute("SAVEPOINT test_cursor")
def rollback(self):
self.execute("ROLLBACK TO SAVEPOINT test_cursor")
self.execute("SAVEPOINT test_cursor")
class PsycoConnection(psycopg2.extensions.connection):
pass
class ConnectionPool(object):
""" The pool of connections to database(s)
Keep a set of connections to pg databases open, and reuse them
to open cursors for all transactions.
The connections are *not* automatically closed. Only a close_db()
can trigger that.
"""
def locked(fun):
@wraps(fun)
def _locked(self, *args, **kwargs):
self._lock.acquire()
try:
return fun(self, *args, **kwargs)
finally:
self._lock.release()
return _locked
def __init__(self, maxconn=64):
self._connections = []
self._maxconn = max(maxconn, 1)
self._lock = threading.Lock()
def __repr__(self):
used = len([1 for c, u in self._connections[:] if u])
count = len(self._connections)
return "ConnectionPool(used=%d/count=%d/max=%d)" % (used, count, self._maxconn)
def _debug(self, msg, *args):
_logger.debug(('%r ' + msg), self, *args)
@locked
def borrow(self, dsn):
# free dead and leaked connections
for i, (cnx, _) in tools.reverse_enumerate(self._connections):
if cnx.closed:
self._connections.pop(i)
self._debug('Removing closed connection at index %d: %r', i, cnx.dsn)
continue
if getattr(cnx, 'leaked', False):
delattr(cnx, 'leaked')
self._connections.pop(i)
self._connections.append((cnx, False))
_logger.info('%r: Free leaked connection to %r', self, cnx.dsn)
for i, (cnx, used) in enumerate(self._connections):
if not used and cnx._original_dsn == dsn:
try:
cnx.reset()
except psycopg2.OperationalError:
self._debug('Cannot reset connection at index %d: %r', i, cnx.dsn)
# psycopg2 2.4.4 and earlier do not allow closing a closed connection
if not cnx.closed:
cnx.close()
continue
self._connections.pop(i)
self._connections.append((cnx, True))
self._debug('Borrow existing connection to %r at index %d', cnx.dsn, i)
return cnx
if len(self._connections) >= self._maxconn:
# try to remove the oldest connection not used
for i, (cnx, used) in enumerate(self._connections):
if not used:
self._connections.pop(i)
if not cnx.closed:
cnx.close()
self._debug('Removing old connection at index %d: %r', i, cnx.dsn)
break
else:
# note: this code is called only if the for loop has completed (no break)
raise PoolError('The Connection Pool Is Full')
try:
result = psycopg2.connect(dsn=dsn, connection_factory=PsycoConnection)
except psycopg2.Error:
_logger.info('Connection to the database failed')
raise
result._original_dsn = dsn
self._connections.append((result, True))
self._debug('Create new connection')
return result
@locked
def give_back(self, connection, keep_in_pool=True):
self._debug('Give back connection to %r', connection.dsn)
for i, (cnx, used) in enumerate(self._connections):
if cnx is connection:
self._connections.pop(i)
if keep_in_pool:
self._connections.append((cnx, False))
self._debug('Put connection to %r in pool', cnx.dsn)
else:
self._debug('Forgot connection to %r', cnx.dsn)
cnx.close()
break
else:
            raise PoolError('This connection does not belong to the pool')
@locked
def close_all(self, dsn=None):
count = 0
last = None
for i, (cnx, used) in tools.reverse_enumerate(self._connections):
if dsn is None or cnx._original_dsn == dsn:
cnx.close()
last = self._connections.pop(i)[0]
count += 1
_logger.info('%r: Closed %d connections %s', self, count,
(dsn and last and 'to %r' % last.dsn) or '')
class Connection(object):
""" A lightweight instance of a connection to postgres
"""
def __init__(self, pool, dbname, dsn):
self.dbname = dbname
self.dsn = dsn
self.__pool = pool
def cursor(self, serialized=True):
cursor_type = serialized and 'serialized ' or ''
_logger.debug('create %scursor to %r', cursor_type, self.dsn)
return Cursor(self.__pool, self.dbname, self.dsn, serialized=serialized)
def test_cursor(self, serialized=True):
cursor_type = serialized and 'serialized ' or ''
_logger.debug('create test %scursor to %r', cursor_type, self.dsn)
return TestCursor(self.__pool, self.dbname, self.dsn, serialized=serialized)
# serialized_cursor is deprecated - cursors are serialized by default
serialized_cursor = cursor
def __nonzero__(self):
"""Check if connection is possible"""
try:
_logger.info("__nonzero__() is deprecated. (It is too expensive to test a connection.)")
cr = self.cursor()
cr.close()
return True
except Exception:
return False
def dsn(db_or_uri):
"""parse the given `db_or_uri` and return a 2-tuple (dbname, uri)"""
if db_or_uri.startswith(('postgresql://', 'postgres://')):
# extract db from uri
us = urlparse.urlsplit(db_or_uri)
if len(us.path) > 1:
db_name = us.path[1:]
elif us.username:
db_name = us.username
else:
db_name = us.hostname
return db_name, db_or_uri
_dsn = ''
for p in ('host', 'port', 'user', 'password'):
cfg = tools.config['db_' + p]
if cfg:
_dsn += '%s=%s ' % (p, cfg)
return db_or_uri, '%sdbname=%s' % (_dsn, db_or_uri)
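The URI branch of `dsn()` above falls back from the path to the username to the hostname when extracting a database name. A small sketch of that fallback order using Python 3's `urllib.parse` (the code above uses the Python 2 `urlparse` module):

```python
from urllib.parse import urlsplit

def db_name_from_uri(db_or_uri):
    # Same fallback order as dsn() above: URI path, then username,
    # then hostname.
    us = urlsplit(db_or_uri)
    if len(us.path) > 1:
        return us.path[1:]
    if us.username:
        return us.username
    return us.hostname

name_from_path = db_name_from_uri('postgresql://user:pw@localhost:5432/mydb')
name_from_user = db_name_from_uri('postgresql://odoo@localhost')
```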
_Pool = None
def db_connect(to, allow_uri=False):
global _Pool
if _Pool is None:
_Pool = ConnectionPool(int(tools.config['db_maxconn']))
db, uri = dsn(to)
if not allow_uri and db != to:
raise ValueError('URI connections not allowed')
return Connection(_Pool, db, uri)
def close_db(db_name):
""" You might want to call openerp.modules.registry.RegistryManager.delete(db_name) along this function."""
global _Pool
if _Pool:
_Pool.close_all(dsn(db_name)[1])
def close_all():
global _Pool
if _Pool:
_Pool.close_all()
|
"A particularly controversial topic in current education policy is the expansion of the charter school sector. This paper analyzes the spillover effects of charter schools on traditional public school (TPS) students in New York City. I exploit variation in both the timing of charter school entry and distance to the nearest charter school to obtain credibly causal estimates of the impacts of charter schools on TPS student performance, and I am among the first to estimate the impacts of charter school co-location. I further add to the literature by exploring potential mechanisms for these findings with school-level data on per-pupil expenditures (PPE) and on parent and teacher perceptions of schools. Briefly, I find that charter schools significantly increase TPS student performance in both English Language Arts and math and decrease the probability of grade retention. Effects increase with charter school proximity and are largest in TPSs co-located with charter schools. Potential explanations for improved performance include increased PPE, higher academic expectations, greater student engagement, and a more respectful and safe school environment after charter entry. The findings suggest that more charter schools in NYC may be beneficial at the margin and that co-location may be mutually beneficial for charter and traditional public schools."
|
#!/usr/bin/env python
from os.path import exists
from setuptools import setup
import re
version_raw = open('glopen/_version.py').read()
version_regex = r"^__version__ = ['\"]([^'\"]*)['\"]"
version_result = re.search(version_regex, version_raw, re.M)
if version_result:
version_string = version_result.group(1)
else:
    raise RuntimeError("Unable to find version string in glopen/_version.py.")
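The multiline regex above pulls the version string out of the version file without importing it. A quick self-contained check of that pattern (the `sample` string is a hypothetical stand-in for the contents of `glopen/_version.py`):

```python
import re

version_regex = r"^__version__ = ['\"]([^'\"]*)['\"]"

# Hypothetical contents of a version file such as glopen/_version.py:
sample = "# placeholder comment\n__version__ = '1.2.3'\n"

# re.M lets ^ match at the start of each line, not just the string.
match = re.search(version_regex, sample, re.M)
version = match.group(1) if match else None
```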
setup(name='glopen',
version=version_string,
description='Open-like interface to globus remotes',
url='http://github.com/maxhutch/glopen/',
author='https://raw.github.com/maxhutch/glopen/master/AUTHORS.md',
author_email='maxhutch@gmail.com',
maintainer='Max Hutchinson',
maintainer_email='maxhutch@gmail.com',
license='MIT',
keywords='globus ssh open',
install_requires=list(open('requirements.txt').read().strip()
.split('\n')),
long_description=(open('README.rst').read() if exists('README.rst')
else ''),
packages=['glopen'],
zip_safe=True)
|
This book and its prose are conceptualized with great thought, illuminating, bringing clear images to the mind, and materialized well at the end. There is a quote I came across where some highly-regarded author expressed that writing a book is the closest association a man can have to that of delivering a child. Well, as a man, the mental application of peeling back the layers upon layers found in the story, and the differentiation of the characters “en masse”, gave me the perception that the author wanted me to feel her pain. The production of this book made me feel that this would have been an excruciating pregnancy complete with endless mind-altering labour pains and intense contractions. The suffering was most definitely transferable and my sympathies are in order, but man, what a beautiful baby. After this experience I am never having a child, sorry, sorry, let me rephrase that, I am never writing a novel that requires such quality. Just imagining the author’s sleepless nights, abruptly waking up to thoughts of filling plot holes, tying up any and all loose ends, the totality of the revision process, makes me need a bottle of Jose Cuervo upon contemplation of these thoughts. The author’s ability to reconstruct and weave a tale like this is an accomplishment in itself that must have required great patience and attention to detail that all aspiring authors can and should admire.
Welcome to Vietnam, a climate known for its enveloping heat, the vibrancy of colour, and the unpredictability often found in the land of the dragon people. As The Heart Bones Break allows you a rare window into the mind of a young conflicted boy named Thong Tran as he grows into a man with few cares, where the only consistency in his life is that of national and internal disparities. As the French would say, “c’est la vie”; for Thong, conflict is a way of life. States of confusion are initiated at conception; gradually he becomes disturbed by the imposition of the American forces and their crass and boorish behaviour. Despite all of these perpetual problems, in combination with the love he has for his country, Thong Tran is driven at an early age toward the life that can only be achieved by assuming a life in enemy territory. His only problem, and a big one at that, is figuring out, in the process of his new life, the true intentions of his own nation and everyone around him at a time when survival is left not only to the fittest, but to the most desperate and judicious.
I initially read the first person narrative of this book because it was suggested by the author’s website for North American comfort and perspective. I understand it is not commonplace within literary fiction, but readers should be encouraged to widen their scope, not narrow it. To me it was a veiled shot, I am sure it was not intended with any malice, but to offer a more accessible, less academic, and narrowed viewpoint kind of made me want to read the book the way it was intended. Unlucky for me at the time, but beneficial for me in the long run, my first person e-book edition went blank after the thirtieth page. Scrapping a book after starting it is hard so I felt compelled to download the author’s standard second person point of view edition. I will admit, this did take some adjustments but the differing frames-of-reference made you feel like you were sitting at every seat of the family holiday dinner table, Uncle Jim’s turtleneck wearing, eggnog induced stupor and all.
War is often a very physical form of destruction. From a metaphysical standpoint, As The Heart Bones Break engages the audience in the ways war can unsettle the spirit and reshape the mental state of innocent civilians. The book constantly reminds the reader of the unrest found in the spiritual world. The author showcases this through graveyards having to constantly be remade, ancestral temples being constructed, and the presence of kites being flown in the sky. The insinuation is that through all this time the war is not over, death is lurking, and peace in Vietnam is long sought to this day. From a mental standpoint, this book is not for the ever-paranoid. With the constant presence of double agents and strategic espionage, life in Vietnam was very much like the slogan from the X-Files: “Don’t Trust Anyone.” Constant influential ploys on the Vietnamese state of mind were the attacks on typical masculine downfalls by way of perceived riches from wealthy men, offerings of freedom for you and yours, and the utilization of female wiles. While reading you may find yourself looking over your shoulder or adopting a “crooked eye” as a permanent fixture of your visage. This book thrives on accepting the superficial while learning through experience how to “peel back the layers” in order to figure out the motivation of individuals and the outcomes that will follow.
|
import click
from collections.abc import Mapping, Sequence
from collections import OrderedDict, defaultdict
import json
import os
import re
import sys
from .common.formatter import sort_atml3
if sys.version_info >= (3, 5):
    from math import gcd
else:
    from fractions import gcd
@click.group()
@click.argument('input', type=click.File('r'))
@click.pass_obj
def analyze(obj, input):
"""
analyze a directory of atml3 files
"""
print(input.name)
try:
        fcontents = json.load(input, object_pairs_hook=OrderedDict)
obj['input'] = fcontents
except Exception as e:
print(e)
@analyze.command()
@click.pass_obj
def show(obj):
"""
merely print the contents of the dict
"""
    print(json.dumps(obj['input'], indent=4))
@click.group(chain=True, invoke_without_command=True)
@click.argument('path', type=click.Path(exists=True))
@click.pass_context
def atml3file(ctx, path):
ctx.obj = {}
if not os.path.isdir(path):
click.echo('{} is not a directory'.format(path), err=True)
return
def atml3file_iter(path):
for root, dirs, files in os.walk(path):
for name in files:
if name[-6:].lower() == '.atml3':
fname = os.path.join(root, name)
try:
with open(fname, 'r') as f:
fcontents = f.read()
                        atml3 = json.loads(fcontents,
                                           object_pairs_hook=OrderedDict)
item = {
'fcontents': fcontents,
'atml3': atml3,
'filename': fname,
}
yield item
except Exception as e:
item = {
'filename': fname,
'atml3': {},
'fcontents': '',
'error': str(e),
}
yield item
@atml3file.resultcallback()
def process_commands(processors, path):
def echo_name(iterator):
for item in iterator:
click.echo(item['filename'])
yield item
iterator = atml3file_iter(path)
for processor in processors:
iterator = processor(iterator)
for item in iterator:
pass
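`process_commands` above chains each command's processor around the file iterator and then drains the result, so every command sees every file exactly once. The pattern can be sketched in isolation with stub processors (the names below are illustrative, not part of the tool):

```python
def source():
    # Stand-in for atml3file_iter(): yields one dict per file.
    yield {'filename': 'a.atml3'}
    yield {'filename': 'b.atml3'}

def counter(results):
    # Like collect-errors: wraps the stream, reports only after the
    # whole pipeline has been drained.
    def processor(iterator):
        n = 0
        for item in iterator:
            n += 1
            yield item
        results.append(n)
    return processor

def tagger(iterator):
    # A second chained processor that annotates each item in place.
    for item in iterator:
        item['seen'] = True
        yield item

counts = []
iterator = source()
for processor in (counter(counts), tagger):
    iterator = processor(iterator)

items = list(iterator)  # drain, as the final loop in process_commands does
```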
@atml3file.command('collect-errors')
def atml3_errors():
def processor(iterator):
counter = 0
fails = 0
errors = []
for item in iterator:
counter += 1
if 'error' in item:
fails += 1
errors.append('{}: {}'.format(item['filename'], item['error']))
yield item
click.echo('\n\n{}/{} failed to validate:'.format(fails, counter))
click.echo('\n'.join(errors))
return processor
@atml3file.command('keys')
@click.pass_obj
def atml3_rootkeys(obj):
obj['allkeys'] = set()
obj['orderings'] = set()
def processor(iterator):
for item in iterator:
for key in item['atml3'].keys():
obj['allkeys'].add(key)
orderstring = '", "'.join(item['atml3'].keys())
obj['orderings'].add('"{}"'.format(orderstring))
yield item
click.echo('\n')
click.echo('all keys:\n"{}"'.format('", "'.join(obj['allkeys'])))
click.echo('all orderings:')
orderings = list(obj['orderings'])
orderings.sort()
for line in orderings:
click.echo(line)
return processor
@atml3file.command('reindent')
def atml3_reindent():
def processor(iterator):
for item in iterator:
if item.get('atml3', '') and item.get('filename', ''):
filename = item['filename']
atml3 = item['atml3']
click.echo('writing reindented file to {}'.format(filename))
with open(filename, 'w') as f:
json.dump(atml3, f, indent=2)
yield item
return processor
@atml3file.command('showkey')
@click.argument('key')
def atml3_print_keyvalue(key):
def processor(iterator):
for item in iterator:
if key in item['atml3']:
click.echo('{} {}: "{}"'.format(item['filename'], key, item['atml3'][key]))
yield item
return processor
@atml3file.command('sort')
def atml3_sort():
def processor(iterator):
for item in iterator:
if item['atml3']:
print(json.dumps(sort_atml3(item['atml3']), indent=2))
yield item
return processor
@atml3file.command('guess-indentation')
def atml3_guess_indentation():
def processor(iterator):
for item in iterator:
if item['fcontents']:
tabdivisor = 0
spacedivisor = 0
for line in item['fcontents'].split('\n'):
tabs = 0
spaces = 0
for letter in line:
if letter == ' ':
spaces += 1
elif letter == '\t':
tabs += 1
else:
break
tabdivisor = gcd(tabdivisor, tabs)
spacedivisor = gcd(spacedivisor, spaces)
if spacedivisor > 0:
click.echo('{}: {} spaces'.format(item['filename'], spacedivisor))
elif tabdivisor > 0:
click.echo('{}: {} tabs'.format(item['filename'], tabdivisor))
yield item
return processor
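The guess-indentation command relies on the fact that the gcd of all leading-whitespace run lengths recovers the indentation unit. A spaces-only sketch of that idea (the original also tracks tabs separately):

```python
from math import gcd

def guess_space_indent(text):
    # Fold gcd over the leading-space count of every line; the result
    # is the largest unit that divides all indentation levels.
    divisor = 0
    for line in text.split('\n'):
        leading = len(line) - len(line.lstrip(' '))
        divisor = gcd(divisor, leading)
    return divisor

sample = '{\n  "a": {\n    "b": 1\n  }\n}'
indent = guess_space_indent(sample)
```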
class TypeSuggester(object):
def __init__(self):
self.suggestions = defaultdict(int)
def suggest(self, typename):
self.suggestions[typename] += 1
def print_suggestions(self):
d = self.suggestions
total = sum(d.values())
sorter = ((k, d[k]) for k in sorted(d, key=d.get, reverse=True))
key, value = list(sorter)[0]
print('{}% {}'.format(value / total * 100.0,
key))
def highest_suggestion(self):
d = self.suggestions
sorter = ((k, d[k]) for k in sorted(d, key=d.get, reverse=True))
key, _ = list(sorter)[0]
return key
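The sort-by-count idiom in `TypeSuggester` (`sorted(d, key=d.get, reverse=True)`) is worth seeing in isolation; a minimal self-contained version of the vote-and-rank logic:

```python
from collections import defaultdict

# Tally type "votes" the same way TypeSuggester.suggest() does.
suggestions = defaultdict(int)
for typename in ['string', 'numeric', 'string', 'string']:
    suggestions[typename] += 1

# Rank keys by count, highest first, as highest_suggestion() does.
ranked = sorted(suggestions, key=suggestions.get, reverse=True)
top = ranked[0]
share = suggestions[top] / sum(suggestions.values()) * 100.0
```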
keys = defaultdict(TypeSuggester)
@atml3file.command('find-data')
def atml3_find_data():
data_re = re.compile(r'\#[^) ,]+')
def recurse_data(item, path):
global keys
if isinstance(item, str):
if path.endswith('mappingExpression') or path.endswith('truthExpression'):
variables = data_re.findall(item)
if variables:
for key in variables:
if 'str({})'.format(key) in item:
keys[key].suggest('string')
                        elif 'numeric({})'.format(key) in item:
                            keys[key].suggest('numeric')
                        else:
                            keys[key]  # record the key even without a type vote
elif isinstance(item, Mapping):
for key, value in item.items():
recurse_data(value,
'{}.{}'.format(path,
key))
elif isinstance(item, Sequence):
for pos, element in enumerate(item):
recurse_data(element,
'{}.{}'.format(path,
pos))
def processor(iterator):
global keys
for item in iterator:
if item['atml3']:
atml3 = item['atml3']
recurse_data(atml3, '')
yield item
schema = {
"$schema": "http://json-schema.org/draft-04/schema#",
"type": "object",
"properties": {
},
}
list_re = re.compile(r'\[\d+\]$')
for key in sorted(keys.keys()):
isarray = False
if '.' in key:
prop = schema['properties']
accessor = key.lstrip('#').split('.')
for snippet in accessor:
match = list_re.search(snippet)
if match:
isarray = True
# is a list
key_without_index = snippet[:snippet.rfind('[')]
prop[key_without_index] = {
'type': 'array',
'items': {
'type': 'object',
'properties': {}
}
}
prop = prop[key_without_index]['items']['properties']
else:
prop[snippet] = {
'type': 'object',
'properties': {}
}
prop = prop[snippet]['properties']
if isarray:
print(json.dumps(schema, indent=2, sort_keys=True))
return processor
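The schema-building loop at the end of find-data turns dotted accessor paths like `#order.items[0].price` into a nested "properties" tree, treating a trailing `[N]` as an array. A self-contained sketch of that traversal (it shares the original's behavior of overwriting entries for shared prefixes, and leaf types are left as objects):

```python
import re

list_re = re.compile(r'\[\d+\]$')

def build_schema(dotted_keys):
    # Walk each dotted accessor, descending into (and creating)
    # "properties" dicts; a trailing [N] marks an array of objects.
    schema = {'type': 'object', 'properties': {}}
    for key in dotted_keys:
        prop = schema['properties']
        for snippet in key.lstrip('#').split('.'):
            if list_re.search(snippet):
                name = snippet[:snippet.rfind('[')]
                prop[name] = {'type': 'array',
                              'items': {'type': 'object', 'properties': {}}}
                prop = prop[name]['items']['properties']
            else:
                prop[snippet] = {'type': 'object', 'properties': {}}
                prop = prop[snippet]['properties']
    return schema

schema = build_schema(['#order.items[0].price'])
```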
|
Jeezus! :o – Thar She Blows!
I bet daddy bought this brand-new vehicle extra for his little princess so she’ll be safe and sound on the road. He didn’t know that he raised the biggest safety hazard in his own house.
I don’t understand why everyone is getting dumber. Or is it just that the internet is so damn good at spotlighting stupidity?
Hey, btw, are you by any chance the Sean of Assholes Watching Movies? Love your reviews.
|
import logging
from eemeter.weather.location import (
zipcode_to_usaf_station,
zipcode_to_tmy3_station,
)
from eemeter.weather.noaa import ISDWeatherSource
from eemeter.weather.tmy3 import TMY3WeatherSource
logger = logging.getLogger(__name__)
def get_weather_source(project):
''' Finds most relevant WeatherSource given project site.
Parameters
----------
project : eemeter.structures.Project
Project for which to find weather source data.
Returns
-------
weather_source : eemeter.weather.ISDWeatherSource
Closest data-validated weather source in the same climate zone as
project ZIP code, if available.
'''
zipcode = project.site.zipcode
station = zipcode_to_usaf_station(zipcode)
if station is None:
logger.error(
"Could not find ISD station for zipcode {}."
.format(zipcode)
)
return None
logger.info(
"Mapped ZIP code {} to ISD station {}"
.format(zipcode, station)
)
try:
weather_source = ISDWeatherSource(station)
except ValueError:
logger.error(
"Could not create ISDWeatherSource for station {}."
.format(station)
)
return None
logger.info("Created ISDWeatherSource using station {}".format(station))
return weather_source
def get_weather_normal_source(project):
''' Finds most relevant WeatherSource given project site.
Parameters
----------
project : eemeter.structures.Project
Project for which to find weather source data.
Returns
-------
weather_source : eemeter.weather.TMY3WeatherSource
Closest data-validated weather source in the same climate zone as
project ZIP code, if available.
'''
zipcode = project.site.zipcode
station = zipcode_to_tmy3_station(zipcode)
if station is None:
logger.error(
"Could not find appropriate TMY3 station for zipcode {}."
.format(zipcode)
)
return None
logger.info(
"Mapped ZIP code {} to TMY3 station {}"
.format(zipcode, station)
)
try:
weather_normal_source = TMY3WeatherSource(station)
except ValueError:
logger.error(
"Could not create TMY3WeatherSource for station {}."
.format(station)
)
return None
logger.info("Created TMY3WeatherSource using station {}".format(station))
return weather_normal_source
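Both functions above follow the same lookup-then-construct pattern: map the ZIP code to a station, bail out with a logged error if none is found, then try the constructor and treat `ValueError` as failure. A hypothetical generalization, with stub lookup/factory callables standing in for the eemeter helpers:

```python
import logging

logger = logging.getLogger(__name__)

def get_source(zipcode, lookup, factory):
    # `lookup` maps a ZIP code to a station id (or None); `factory`
    # builds the weather source and may raise ValueError on bad data.
    station = lookup(zipcode)
    if station is None:
        logger.error("Could not find station for zipcode %s.", zipcode)
        return None
    try:
        return factory(station)
    except ValueError:
        logger.error("Could not create source for station %s.", station)
        return None

# Stub mapping and factory, standing in for the real helpers:
stations = {'60601': '725300'}
found = get_source('60601', stations.get, lambda s: {'station': s})
missing = get_source('99999', stations.get, lambda s: {'station': s})
```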
|
|
import csv
from datetime import datetime as dt
import logging
import sys
from parsers import BaseParser
_logger = logging.getLogger('yabfd.' + __name__)
class Parser(BaseParser):
    def __init__(self, name, blacklist, bandate=dt.max, hitweight=sys.maxsize):
super(Parser, self).__init__(name)
self.load_logs([blacklist])
self.weight = hitweight
self.date = bandate
def _parse(self, blacklist):
_logger.debug('%s reading %r.', self, blacklist)
        r = csv.reader(open(blacklist, 'r', newline=''))
for row in r:
try:
host = row.pop(0)
            except IndexError:
_logger.error('Blacklist %r malformed at line %d, skipping.',
blacklist, r.line_num)
continue
date = dt.strptime(row.pop(0), '%Y-%m-%d').date() if row else dt.max
weight = int(row.pop(0)) if row else self.weight
yield (date, host, weight)
_logger.debug('%s read %d hosts from %r.', self, r.line_num, blacklist)
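The row layout the parser expects is `host[,date[,weight]]`, with the date and weight optional. A self-contained sketch of that parsing, reading from an in-memory string instead of a blacklist file:

```python
import csv
import io
from datetime import datetime as dt

def parse_rows(text, default_weight=10):
    # Same row layout as the parser above: host[,date[,weight]].
    reader = csv.reader(io.StringIO(text))
    for row in reader:
        if not row:
            continue
        host = row.pop(0)
        date = dt.strptime(row.pop(0), '%Y-%m-%d').date() if row else dt.max
        weight = int(row.pop(0)) if row else default_weight
        yield (date, host, weight)

rows = list(parse_rows("10.0.0.1,2024-01-31,5\n10.0.0.2\n"))
```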
|
We offer service of process services, which includes delivering subpoenas and serving lawsuits to people and businesses located in Portland, New York. Our fees are fair and we guarantee results. Our statewide New York process services are performed by local process servers who are familiar with most locations. A.C.E Process Servers provide prompt and efficient services you can depend on, guaranteed.
To arrange service of your lawsuit or delivery of your subpoena in Portland, New York call or email us for an immediate response. Call, 800 987-4680 or email contact@serveservices.com.
|
# -*- encoding: utf-8 -*-
##############################################################################
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see http://www.gnu.org/licenses/.
#
##############################################################################
from openerp import models, fields, api
class MrpProductProduce(models.TransientModel):
_inherit = 'mrp.product.produce'
def default_lot_id(self):
if 'active_id' in self.env.context and (
self.env.context.get('active_model') == 'mrp.production'):
production_obj = self.env['mrp.production']
production = production_obj.browse(self.env.context['active_id'])
return production.mapped('move_lines2.prod_parent_lot')[:1]
lot_id = fields.Many2one(default=default_lot_id)
@api.multi
def do_produce(self):
track_lot_obj = self.env['mrp.track.lot']
result = super(MrpProductProduce, self).do_produce()
production = self.env['mrp.production'].browse(
self.env.context['active_id'])
for data in self:
if data.lot_id:
for move in production.move_lines2:
if not move.prod_parent_lot:
move.prod_parent_lot = data.lot_id.id
track_lot_obj.create(
{'component': move.product_id.id,
'component_lot': move.restrict_lot_id.id,
'product': production.product_id.id,
'product_lot': data.lot_id.id,
'production': production.id,
'st_move': move.id})
return result
|
There’s no doubt about it – maritime law is particularly complex, and many people do not even realize that there are specific laws governing accidents that occur on the water. It’s unfortunate that this area of law is so poorly understood, because it affects plenty of Floridians. With 1,197 miles of coastline, 922,597 registered boats, and an estimated $12 billion economic contribution from the boating industry in 2019 (and growing), not even counting the cruise industry, there is no denying that the ocean is essential to many Floridians’ lives.
Maritime law impacts almost all Florida residents in some way–whether they are vacationing on a cruise ship, working on an oil rig or fishing vessel, riding a ferry, or taking a private boat out for a recreational weekend trip, maritime law is an important body of law in the state of Florida. You might be surprised to learn that maritime law (also referred to as admiralty law) doesn’t just apply to the ocean – in some cases, it can extend to accidents and injuries that occur on or around any body of water.
If you have sustained injuries while enjoying the Fort Lauderdale waterways, Atlantic Ocean, Gulf of Mexico, or while on a cruise ship, you need a certified maritime lawyer you can trust to advocate on your behalf. Our skilled team of maritime lawyers has over 40 years of experience fighting for clients who have been injured on cruise ships and on the water, and is ready to go to work on your case today.
Our modern maritime legal system can be traced back to 1920, when Congress passed the Jones Act. The Jones Act is a measure designed to protect anyone who works at sea if they are injured during the course of their employment. It allows plaintiffs to bring a personal injury action in federal court in order to collect compensation if their injury was caused by the negligence of their employer.
Although recent highly publicized cases have drawn more attention to the issue, people usually embark on a cruise excited and looking forward to a relaxing vacation—not worrying that they may suffer some sort of injury while on board the ship. Cruises are supposed to be fun and stress-free – you lounge on deck, enjoy the onboard entertainment, and visit new places. What’s there to worry about?
In early 2013, a Carnival Cruise ship proved that everything that can go wrong will go wrong when an engine room fire shut down its power, propulsion, sewer, and air-conditioning systems and left passengers stranded at sea for four days. Lately, that story has gotten even more outrageous as evidence has emerged that Carnival knew of the fire risk before this ship set sail. The high-profile Carnival incident has led many people to ask just how common cruise accidents are and what kinds of injuries cruise ship guests can suffer, even if their ship doesn’t encounter any major issues.
Drowning. A child falls into the pool and drowns.
Slip and falls. It’s not unusual for decks on cruise ships to become slick, especially near swimming pools, and if passengers fall and seriously injure themselves, they may not be able to get medical attention until the next port.
Injuries due to fires. Between 1990 and 2011, there were 79 reported fires on board cruise ships, and studies have shown that several fires occur aboard cruise ships every year. Fires can seriously damage the ship and put passengers at risk, and they can be particularly dangerous for the crew members who have to put the fire out.
Food poisoning. If food isn’t stored properly on board the cruise ship, or if a refrigeration unit stops working, guests may spend an unfortunate amount of their trip confined to their rooms.
Infections. When conditions on a cruise ship become unsanitary, as they did on the beleaguered Carnival cruise ship mentioned above, the risks of passengers getting an infection increases significantly.
Medical malpractice. You don’t have access to a full hospital when you’re on a cruise ship – medical staff and resources are limited, so if you do have to seek medical attention, you might not get the quality of care that you need.
Wrongful death. Deaths onboard cruise ships are relatively rare, thankfully, but they still happen. For example, in 2010, two passengers aboard a Louis Cruise Lines ship were killed after 26-foot waves that the crew had not anticipated slammed into the vessel.
If you are injured on a cruise ship, a ruined vacation might be the least of your worries. Injuries can have long-lasting effects, especially if you don’t receive proper treatment in a timely manner. Some injuries even prevent people from returning to work or being able to enjoy their lives to the extent they did before the accident.
Injuries sustained at port or on a dock.
Because maritime law is such a complex topic, you shouldn’t try to file a lawsuit and build your case all on your own. You also shouldn’t just walk into the nearest law office to find an attorney to represent you. Few attorneys have experience in this area of law, and the state of Florida actually requires that lawyers become certified to practice in this field. If you work with an attorney who doesn’t understand this unique branch of the law, you’ll have a difficult time winning your case.
Lawlor, White and Murphey has over 40 years of experience with maritime litigation – and that can make all the difference for your case. To learn more about what we can do for you, fill out this online contact form to schedule an appointment with one of our certified maritime lawyers. Initial consultations are free, and if you decide to work with us, you won’t have to pay us anything until we recover the compensation you deserve.
FAQ: Are there any other laws besides the Jones Act that might impact my case when I was injured on the water?
Yes, and the laws that may apply will depend upon the specific facts of your case. For example, if you were injured while working as a dockworker, the Longshore and Harbor Worker’s Compensation Act may apply, and if you were injured while working on an oil rig, the Outer Continental Shelf Lands Act may apply. Schedule an appointment with our experienced lawyers to determine which body of law may apply in your specific case.
FAQ: What special timing considerations may apply if my injury was sustained while vacationing on a cruise?
Cruise ships insert a myriad of fine print into their tickets, but you often do not actually see the fine print, because many cruise lines now simply refer you to a website that outlines the terms and conditions of your cruise. In some cases, cruise lines require that passengers who are injured while on board provide notice of that injury to the cruise line within as little as six months in order to protect their right to recover compensation from the cruise line itself. Further, some cruise lines require that the injury be reported or documented while you are still on board the ship. Because of these restrictions, and because they can vary based on which cruise line operates your vessel, it is important to contact a certified maritime lawyer with experience handling cruise ship injuries as soon as possible, even if you have not yet departed the ship.
|
from test.support import check_warnings
import cgi
import os
import sys
import tempfile
import unittest
import warnings
from collections import namedtuple
from io import StringIO, BytesIO
class HackedSysModule:
# The regression test will have real values in sys.argv, which
# will completely confuse the test of the cgi module
argv = []
stdin = sys.stdin
cgi.sys = HackedSysModule()
class ComparableException:
def __init__(self, err):
self.err = err
def __str__(self):
return str(self.err)
def __eq__(self, anExc):
if not isinstance(anExc, Exception):
return NotImplemented
return (self.err.__class__ == anExc.__class__ and
self.err.args == anExc.args)
def __getattr__(self, attr):
return getattr(self.err, attr)
def do_test(buf, method):
env = {}
if method == "GET":
fp = None
env['REQUEST_METHOD'] = 'GET'
env['QUERY_STRING'] = buf
elif method == "POST":
fp = BytesIO(buf.encode('latin-1')) # FieldStorage expects bytes
env['REQUEST_METHOD'] = 'POST'
env['CONTENT_TYPE'] = 'application/x-www-form-urlencoded'
env['CONTENT_LENGTH'] = str(len(buf))
else:
raise ValueError("unknown method: %s" % method)
try:
return cgi.parse(fp, env, strict_parsing=1)
except Exception as err:
return ComparableException(err)
parse_strict_test_cases = [
("", ValueError("bad query field: ''")),
("&", ValueError("bad query field: ''")),
("&&", ValueError("bad query field: ''")),
(";", ValueError("bad query field: ''")),
(";&;", ValueError("bad query field: ''")),
# Should the next few really be valid?
("=", {}),
("=&=", {}),
("=;=", {}),
    # The rest seem to make sense
("=a", {'': ['a']}),
("&=a", ValueError("bad query field: ''")),
("=a&", ValueError("bad query field: ''")),
("=&a", ValueError("bad query field: 'a'")),
("b=a", {'b': ['a']}),
("b+=a", {'b ': ['a']}),
("a=b=a", {'a': ['b=a']}),
("a=+b=a", {'a': [' b=a']}),
("&b=a", ValueError("bad query field: ''")),
("b&=a", ValueError("bad query field: 'b'")),
("a=a+b&b=b+c", {'a': ['a b'], 'b': ['b c']}),
("a=a+b&a=b+a", {'a': ['a b', 'b a']}),
("x=1&y=2.0&z=2-3.%2b0", {'x': ['1'], 'y': ['2.0'], 'z': ['2-3.+0']}),
("x=1;y=2.0&z=2-3.%2b0", {'x': ['1'], 'y': ['2.0'], 'z': ['2-3.+0']}),
("x=1;y=2.0;z=2-3.%2b0", {'x': ['1'], 'y': ['2.0'], 'z': ['2-3.+0']}),
("Hbc5161168c542333633315dee1182227:key_store_seqid=400006&cuyer=r&view=bustomer&order_id=0bb2e248638833d48cb7fed300000f1b&expire=964546263&lobale=en-US&kid=130003.300038&ss=env",
{'Hbc5161168c542333633315dee1182227:key_store_seqid': ['400006'],
'cuyer': ['r'],
'expire': ['964546263'],
'kid': ['130003.300038'],
'lobale': ['en-US'],
'order_id': ['0bb2e248638833d48cb7fed300000f1b'],
'ss': ['env'],
'view': ['bustomer'],
}),
("group_id=5470&set=custom&_assigned_to=31392&_status=1&_category=100&SUBMIT=Browse",
{'SUBMIT': ['Browse'],
'_assigned_to': ['31392'],
'_category': ['100'],
'_status': ['1'],
'group_id': ['5470'],
'set': ['custom'],
})
]
def norm(seq):
return sorted(seq, key=repr)
def first_elts(list):
return [p[0] for p in list]
def first_second_elts(list):
return [(p[0], p[1][0]) for p in list]
def gen_result(data, environ):
encoding = 'latin-1'
fake_stdin = BytesIO(data.encode(encoding))
fake_stdin.seek(0)
form = cgi.FieldStorage(fp=fake_stdin, environ=environ, encoding=encoding)
result = {}
for k, v in dict(form).items():
result[k] = isinstance(v, list) and form.getlist(k) or v.value
return result
class CgiTests(unittest.TestCase):
def test_parse_multipart(self):
fp = BytesIO(POSTDATA.encode('latin1'))
env = {'boundary': BOUNDARY.encode('latin1'),
'CONTENT-LENGTH': '558'}
result = cgi.parse_multipart(fp, env)
expected = {'submit': [b' Add '], 'id': [b'1234'],
'file': [b'Testing 123.\n'], 'title': [b'']}
self.assertEqual(result, expected)
def test_fieldstorage_properties(self):
fs = cgi.FieldStorage()
self.assertFalse(fs)
self.assertIn("FieldStorage", repr(fs))
self.assertEqual(list(fs), list(fs.keys()))
fs.list.append(namedtuple('MockFieldStorage', 'name')('fieldvalue'))
self.assertTrue(fs)
def test_fieldstorage_invalid(self):
self.assertRaises(TypeError, cgi.FieldStorage, "not-a-file-obj",
environ={"REQUEST_METHOD":"PUT"})
self.assertRaises(TypeError, cgi.FieldStorage, "foo", "bar")
fs = cgi.FieldStorage(headers={'content-type':'text/plain'})
self.assertRaises(TypeError, bool, fs)
def test_escape(self):
# cgi.escape() is deprecated.
with warnings.catch_warnings():
            warnings.filterwarnings('ignore', r'cgi\.escape',
                                    DeprecationWarning)
self.assertEqual("test & string", cgi.escape("test & string"))
self.assertEqual("<test string>", cgi.escape("<test string>"))
self.assertEqual(""test string"", cgi.escape('"test string"', True))
def test_strict(self):
for orig, expect in parse_strict_test_cases:
# Test basic parsing
d = do_test(orig, "GET")
self.assertEqual(d, expect, "Error parsing %s method GET" % repr(orig))
d = do_test(orig, "POST")
self.assertEqual(d, expect, "Error parsing %s method POST" % repr(orig))
env = {'QUERY_STRING': orig}
fs = cgi.FieldStorage(environ=env)
if isinstance(expect, dict):
# test dict interface
self.assertEqual(len(expect), len(fs))
self.assertCountEqual(expect.keys(), fs.keys())
##self.assertEqual(norm(expect.values()), norm(fs.values()))
##self.assertEqual(norm(expect.items()), norm(fs.items()))
self.assertEqual(fs.getvalue("nonexistent field", "default"), "default")
# test individual fields
for key in expect.keys():
expect_val = expect[key]
self.assertIn(key, fs)
if len(expect_val) > 1:
self.assertEqual(fs.getvalue(key), expect_val)
else:
self.assertEqual(fs.getvalue(key), expect_val[0])
def test_log(self):
cgi.log("Testing")
cgi.logfp = StringIO()
cgi.initlog("%s", "Testing initlog 1")
cgi.log("%s", "Testing log 2")
self.assertEqual(cgi.logfp.getvalue(), "Testing initlog 1\nTesting log 2\n")
if os.path.exists(os.devnull):
cgi.logfp = None
cgi.logfile = os.devnull
cgi.initlog("%s", "Testing log 3")
self.addCleanup(cgi.closelog)
cgi.log("Testing log 4")
def test_fieldstorage_readline(self):
# FieldStorage uses readline, which has the capacity to read all
# contents of the input file into memory; we use readline's size argument
# to prevent that for files that do not contain any newlines in
# non-GET/HEAD requests
class TestReadlineFile:
def __init__(self, file):
self.file = file
self.numcalls = 0
def readline(self, size=None):
self.numcalls += 1
if size:
return self.file.readline(size)
else:
return self.file.readline()
def __getattr__(self, name):
file = self.__dict__['file']
a = getattr(file, name)
if not isinstance(a, int):
setattr(self, name, a)
return a
f = TestReadlineFile(tempfile.TemporaryFile("wb+"))
self.addCleanup(f.close)
f.write(b'x' * 256 * 1024)
f.seek(0)
env = {'REQUEST_METHOD':'PUT'}
fs = cgi.FieldStorage(fp=f, environ=env)
self.addCleanup(fs.file.close)
# if we're not chunking properly, readline is only called twice
# (by read_binary); if we are chunking properly, it will be called 5 times
# as long as the chunksize is 1 << 16.
self.assertGreater(f.numcalls, 2)
f.close()
def test_fieldstorage_multipart(self):
#Test basic FieldStorage multipart parsing
env = {
'REQUEST_METHOD': 'POST',
'CONTENT_TYPE': 'multipart/form-data; boundary={}'.format(BOUNDARY),
'CONTENT_LENGTH': '558'}
fp = BytesIO(POSTDATA.encode('latin-1'))
fs = cgi.FieldStorage(fp, environ=env, encoding="latin-1")
self.assertEqual(len(fs.list), 4)
expect = [{'name':'id', 'filename':None, 'value':'1234'},
{'name':'title', 'filename':None, 'value':''},
{'name':'file', 'filename':'test.txt', 'value':b'Testing 123.\n'},
{'name':'submit', 'filename':None, 'value':' Add '}]
for x in range(len(fs.list)):
for k, exp in expect[x].items():
got = getattr(fs.list[x], k)
self.assertEqual(got, exp)
def test_fieldstorage_multipart_leading_whitespace(self):
env = {
'REQUEST_METHOD': 'POST',
'CONTENT_TYPE': 'multipart/form-data; boundary={}'.format(BOUNDARY),
'CONTENT_LENGTH': '560'}
# Add some leading whitespace to our post data that will cause the
# first line to not be the innerboundary.
fp = BytesIO(b"\r\n" + POSTDATA.encode('latin-1'))
fs = cgi.FieldStorage(fp, environ=env, encoding="latin-1")
self.assertEqual(len(fs.list), 4)
expect = [{'name':'id', 'filename':None, 'value':'1234'},
{'name':'title', 'filename':None, 'value':''},
{'name':'file', 'filename':'test.txt', 'value':b'Testing 123.\n'},
{'name':'submit', 'filename':None, 'value':' Add '}]
for x in range(len(fs.list)):
for k, exp in expect[x].items():
got = getattr(fs.list[x], k)
self.assertEqual(got, exp)
def test_fieldstorage_multipart_non_ascii(self):
#Test basic FieldStorage multipart parsing
env = {'REQUEST_METHOD':'POST',
'CONTENT_TYPE': 'multipart/form-data; boundary={}'.format(BOUNDARY),
'CONTENT_LENGTH':'558'}
for encoding in ['iso-8859-1','utf-8']:
fp = BytesIO(POSTDATA_NON_ASCII.encode(encoding))
fs = cgi.FieldStorage(fp, environ=env,encoding=encoding)
self.assertEqual(len(fs.list), 1)
expect = [{'name':'id', 'filename':None, 'value':'\xe7\xf1\x80'}]
for x in range(len(fs.list)):
for k, exp in expect[x].items():
got = getattr(fs.list[x], k)
self.assertEqual(got, exp)
def test_fieldstorage_multipart_maxline(self):
# Issue #18167
maxline = 1 << 16
self.maxDiff = None
def check(content):
data = """---123
Content-Disposition: form-data; name="upload"; filename="fake.txt"
Content-Type: text/plain
%s
---123--
""".replace('\n', '\r\n') % content
environ = {
'CONTENT_LENGTH': str(len(data)),
'CONTENT_TYPE': 'multipart/form-data; boundary=-123',
'REQUEST_METHOD': 'POST',
}
self.assertEqual(gen_result(data, environ),
{'upload': content.encode('latin1')})
check('x' * (maxline - 1))
check('x' * (maxline - 1) + '\r')
check('x' * (maxline - 1) + '\r' + 'y' * (maxline - 1))
def test_fieldstorage_multipart_w3c(self):
# Test basic FieldStorage multipart parsing (W3C sample)
env = {
'REQUEST_METHOD': 'POST',
'CONTENT_TYPE': 'multipart/form-data; boundary={}'.format(BOUNDARY_W3),
'CONTENT_LENGTH': str(len(POSTDATA_W3))}
fp = BytesIO(POSTDATA_W3.encode('latin-1'))
fs = cgi.FieldStorage(fp, environ=env, encoding="latin-1")
self.assertEqual(len(fs.list), 2)
self.assertEqual(fs.list[0].name, 'submit-name')
self.assertEqual(fs.list[0].value, 'Larry')
self.assertEqual(fs.list[1].name, 'files')
files = fs.list[1].value
self.assertEqual(len(files), 2)
expect = [{'name': None, 'filename': 'file1.txt', 'value': b'... contents of file1.txt ...'},
{'name': None, 'filename': 'file2.gif', 'value': b'...contents of file2.gif...'}]
for x in range(len(files)):
for k, exp in expect[x].items():
got = getattr(files[x], k)
self.assertEqual(got, exp)
def test_fieldstorage_part_content_length(self):
BOUNDARY = "JfISa01"
POSTDATA = """--JfISa01
Content-Disposition: form-data; name="submit-name"
Content-Length: 5
Larry
--JfISa01"""
env = {
'REQUEST_METHOD': 'POST',
'CONTENT_TYPE': 'multipart/form-data; boundary={}'.format(BOUNDARY),
'CONTENT_LENGTH': str(len(POSTDATA))}
fp = BytesIO(POSTDATA.encode('latin-1'))
fs = cgi.FieldStorage(fp, environ=env, encoding="latin-1")
self.assertEqual(len(fs.list), 1)
self.assertEqual(fs.list[0].name, 'submit-name')
self.assertEqual(fs.list[0].value, 'Larry')
def test_fieldstorage_as_context_manager(self):
fp = BytesIO(b'x' * 10)
env = {'REQUEST_METHOD': 'PUT'}
with cgi.FieldStorage(fp=fp, environ=env) as fs:
content = fs.file.read()
self.assertFalse(fs.file.closed)
self.assertTrue(fs.file.closed)
self.assertEqual(content, 'x' * 10)
with self.assertRaisesRegex(ValueError, 'I/O operation on closed file'):
fs.file.read()
_qs_result = {
'key1': 'value1',
'key2': ['value2x', 'value2y'],
'key3': 'value3',
'key4': 'value4'
}
def testQSAndUrlEncode(self):
data = "key2=value2x&key3=value3&key4=value4"
environ = {
'CONTENT_LENGTH': str(len(data)),
'CONTENT_TYPE': 'application/x-www-form-urlencoded',
'QUERY_STRING': 'key1=value1&key2=value2y',
'REQUEST_METHOD': 'POST',
}
v = gen_result(data, environ)
self.assertEqual(self._qs_result, v)
def testQSAndFormData(self):
data = """---123
Content-Disposition: form-data; name="key2"
value2y
---123
Content-Disposition: form-data; name="key3"
value3
---123
Content-Disposition: form-data; name="key4"
value4
---123--
"""
environ = {
'CONTENT_LENGTH': str(len(data)),
'CONTENT_TYPE': 'multipart/form-data; boundary=-123',
'QUERY_STRING': 'key1=value1&key2=value2x',
'REQUEST_METHOD': 'POST',
}
v = gen_result(data, environ)
self.assertEqual(self._qs_result, v)
def testQSAndFormDataFile(self):
data = """---123
Content-Disposition: form-data; name="key2"
value2y
---123
Content-Disposition: form-data; name="key3"
value3
---123
Content-Disposition: form-data; name="key4"
value4
---123
Content-Disposition: form-data; name="upload"; filename="fake.txt"
Content-Type: text/plain
this is the content of the fake file
---123--
"""
environ = {
'CONTENT_LENGTH': str(len(data)),
'CONTENT_TYPE': 'multipart/form-data; boundary=-123',
'QUERY_STRING': 'key1=value1&key2=value2x',
'REQUEST_METHOD': 'POST',
}
result = self._qs_result.copy()
result.update({
'upload': b'this is the content of the fake file\n'
})
v = gen_result(data, environ)
self.assertEqual(result, v)
def test_deprecated_parse_qs(self):
# this func is moved to urllib.parse, this is just a sanity check
with check_warnings(('cgi.parse_qs is deprecated, use urllib.parse.'
'parse_qs instead', DeprecationWarning)):
self.assertEqual({'a': ['A1'], 'B': ['B3'], 'b': ['B2']},
cgi.parse_qs('a=A1&b=B2&B=B3'))
def test_deprecated_parse_qsl(self):
# this func is moved to urllib.parse, this is just a sanity check
with check_warnings(('cgi.parse_qsl is deprecated, use urllib.parse.'
'parse_qsl instead', DeprecationWarning)):
self.assertEqual([('a', 'A1'), ('b', 'B2'), ('B', 'B3')],
cgi.parse_qsl('a=A1&b=B2&B=B3'))
def test_parse_header(self):
self.assertEqual(
cgi.parse_header("text/plain"),
("text/plain", {}))
self.assertEqual(
cgi.parse_header("text/vnd.just.made.this.up ; "),
("text/vnd.just.made.this.up", {}))
self.assertEqual(
cgi.parse_header("text/plain;charset=us-ascii"),
("text/plain", {"charset": "us-ascii"}))
self.assertEqual(
cgi.parse_header('text/plain ; charset="us-ascii"'),
("text/plain", {"charset": "us-ascii"}))
self.assertEqual(
cgi.parse_header('text/plain ; charset="us-ascii"; another=opt'),
("text/plain", {"charset": "us-ascii", "another": "opt"}))
self.assertEqual(
cgi.parse_header('attachment; filename="silly.txt"'),
("attachment", {"filename": "silly.txt"}))
self.assertEqual(
cgi.parse_header('attachment; filename="strange;name"'),
("attachment", {"filename": "strange;name"}))
self.assertEqual(
cgi.parse_header('attachment; filename="strange;name";size=123;'),
("attachment", {"filename": "strange;name", "size": "123"}))
self.assertEqual(
cgi.parse_header('form-data; name="files"; filename="fo\\"o;bar"'),
("form-data", {"name": "files", "filename": 'fo"o;bar'}))
BOUNDARY = "---------------------------721837373350705526688164684"
POSTDATA = """-----------------------------721837373350705526688164684
Content-Disposition: form-data; name="id"
1234
-----------------------------721837373350705526688164684
Content-Disposition: form-data; name="title"
-----------------------------721837373350705526688164684
Content-Disposition: form-data; name="file"; filename="test.txt"
Content-Type: text/plain
Testing 123.
-----------------------------721837373350705526688164684
Content-Disposition: form-data; name="submit"
Add\x20
-----------------------------721837373350705526688164684--
"""
POSTDATA_NON_ASCII = """-----------------------------721837373350705526688164684
Content-Disposition: form-data; name="id"
\xe7\xf1\x80
-----------------------------721837373350705526688164684
"""
# http://www.w3.org/TR/html401/interact/forms.html#h-17.13.4
BOUNDARY_W3 = "AaB03x"
POSTDATA_W3 = """--AaB03x
Content-Disposition: form-data; name="submit-name"
Larry
--AaB03x
Content-Disposition: form-data; name="files"
Content-Type: multipart/mixed; boundary=BbC04y
--BbC04y
Content-Disposition: file; filename="file1.txt"
Content-Type: text/plain
... contents of file1.txt ...
--BbC04y
Content-Disposition: file; filename="file2.gif"
Content-Type: image/gif
Content-Transfer-Encoding: binary
...contents of file2.gif...
--BbC04y--
--AaB03x--
"""
if __name__ == '__main__':
unittest.main()
|
Ever since the new Chevrolet Cruze Diesel debuted in 2013, we have known it was a winner. (It’s the only non-hybrid car available today that gets such high gas mileage, and it’s a blast to drive.) And now USA Today agrees, having just voted it “One of the Best Cars from 2013”.
“The low-speed torque that’s inherent in a diesel is well-suited to much U.S. driving, and can make the daily slog a bit more exciting. In fact, the diesel’s the quickest Cruze, Chevy says, clipping off the 0-to-60 dash notably faster than the gasoline models.”
Click here to learn more about the new Chevy Cruze.
The new Chevrolet Cruze Turbo Diesel has arrived at Mirak Chevrolet, a Boston Chevrolet dealer in Arlington, MA.
The new 2014 Chevy Cruze Turbo Diesel gets 46 highway miles to the gallon, yet it has 150 horsepower.
The Cruze Clean Turbo Diesel hits the Boston Area roads running as the first clean diesel car ever produced by a U.S. automaker and will offer an EPA-estimated 46 MPG highway. It has a starting MSRP of $24,885*.
Clean diesels generate at least 90% less Nitrogen Oxide (NOx) and particulate emissions when compared to previous-generation diesels.
It all starts with a 2.0L turbocharged clean diesel engine designed in Italy, built in Germany and installed in the Cruze at our factory in Lordstown, Ohio. The 2014 Cruze Clean Turbo Diesel comes equipped with the very latest clean diesel technology which helps reduce emissions while also boasting 148 horsepower and 258 lb.-ft. of “low-end” torque that customers of our Boston Area Chevrolet dealership will have to experience to believe.
Cruze Clean Turbo Diesel emissions are below strict U.S. environmental standards. In fact, to even qualify for clean diesel classification, Cruze had to meet modern Tier 2 Bin 5 Emission Standards —a new generation of stringent clean diesel standards.
Cruze Clean Turbo Diesel is equipped with technology including exhaust gas recirculation, selective catalyst reduction, a particulate filter and advanced fuel system components. All these technologies working together allow the diesel engine in Cruze to perform with very low Nitrogen Oxide (NOx) emissions and very low particulate emissions.
While the Cruze Clean Turbo Diesel certainly packs a cleaner kick, it’s designed to reduce the noise and vibration often associated with earlier generation diesel cars.
Right down to the sporty 17-inch alloy wheels, chances are, Boston Area drivers never felt something so fun to drive with this kind of efficiency.
The Cruze Clean Turbo Diesel hits the Boston Area roads running as the first clean diesel car ever produced by a U.S. automaker and will offer an EPA-estimated 46 MPG highway. It will have a starting MSRP of $24,885*.
|
#! /usr/bin/env python
#
# PyWeather
# (c) 2010 Patrick C. McGinty <pyweather@tuxcoder.com>
# (c) 2005 Christopher Blunck <chris@wxnet.org>
#
# You're welcome to redistribute this software under the
# terms of the GNU General Public Licence version 2.0
# or, at your option, any higher version.
#
# You can read the complete GNU GPL in the file COPYING
# which should come along with this software, or visit
# the Free Software Foundation's WEB site http://www.fsf.org
#
import os
from distutils.core import setup
import weather as pkg
name = pkg.__name__
def _read(*path_name):
    with open(os.path.join(os.path.dirname(__file__), *path_name)) as f:
        return f.read()
setup(name=name,
version=pkg.__version__,
license="GNU GPL",
description=pkg.__doc__,
long_description=_read('README'),
author="Patrick C. McGinty, Christopher Blunck",
author_email="pyweather@tuxcoder.com, chris@wxnet.org",
url="http://github.com/cmcginty/PyWeather",
download_url="https://github.com/cmcginty/PyWeather/archive/%s.zip" %
pkg.__version__,
packages=[
name,
name + '.services',
name + '.stations',
name + '.units',
],
scripts=['scripts/weatherpub.py'],
)
|
Found in FNA Volume 8. Treatment on page 307. Mentioned on page 306.
Plants annual. Stems mostly erect, 1–3(–5) dm. Leaves alternate or opposite; blade elliptic to narrowly lanceolate, 4.5–8 × 1–3.5 mm. Pedicels erect in fruit, 2.5–4 mm, mostly equaling or shorter than subtending leaf. Flowers: sepals (4–)5, calyx divided nearly to base, 2–2.5 mm, equaling or longer than corolla, margins entire, narrowly or not scarious, apex acuminate; petals 5, corolla white, salverform (almost rotate), 1.5–2.3 mm. Capsules 1–1.5 mm. Seeds 5–12.
Fla., Mexico, West Indies (Jamaica), Central America, South America (Colombia, Ecuador).
|
#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
Description:
Design a stack that supports push, pop, top, and retrieving the minimum element in constant time.
push(x) -- Push element x onto stack.
pop() -- Removes the element on top of the stack.
top() -- Get the top element.
getMin() -- Retrieve the minimum element in the stack.
Example:
MinStack minStack = new MinStack();
minStack.push(-2);
minStack.push(0);
minStack.push(-3);
minStack.getMin(); --> Returns -3.
minStack.pop();
minStack.top(); --> Returns 0.
minStack.getMin(); --> Returns -2.
Tags: Stack, Design
'''
class MinStack(object):
    # O(1) time per operation, O(n) extra space; only new minimums are stored
def __init__(self):
"""
initialize your data structure here.
"""
self.stack = []
self.minstack = []
def push(self, x):
"""
:type x: int
:rtype: void
"""
self.stack.append(x)
if not self.minstack:
self.minstack.append(x)
elif x <= self.minstack[-1]:
self.minstack.append(x)
def pop(self):
"""
:rtype: void
"""
if self.stack:
x = self.stack.pop()
if x == self.minstack[-1]:
self.minstack.pop()
def top(self):
"""
:rtype: int
"""
if self.stack:
return self.stack[-1]
def getMin(self):
"""
:rtype: int
"""
if self.minstack != []:
return self.minstack[-1]
class MinStack2(object):
def __init__(self):
"""
initialize your data structure here.
"""
self.stack, self.minstack = [], []
def push(self, x):
"""
:type x: int
:rtype: void
"""
self.stack.append(x)
if self.minstack != []:
if x < self.minstack[-1][0]:
self.minstack.append([x, 1])
elif x == self.minstack[-1][0]:
self.minstack[-1][1] += 1
else:
self.minstack.append([x, 1])
def pop(self):
"""
:rtype: void
"""
x = self.stack.pop()
if x == self.minstack[-1][0]:
self.minstack[-1][1] -= 1
if self.minstack[-1][1] == 0:
self.minstack.pop()
def top(self):
"""
:rtype: int
"""
return self.stack[-1]
def getMin(self):
"""
:rtype: int
"""
if self.minstack != []:
return self.minstack[-1][0]
# Your MinStack object will be instantiated and called as such:
# obj = MinStack()
# obj.push(x)
# obj.pop()
# param_3 = obj.top()
# param_4 = obj.getMin()
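The docstring's example sequence can be traced with a condensed, self-contained sketch of the same two-stack idea (module-level functions instead of the class, for brevity):

```python
# Condensed two-stack min-tracking: `mins` grows only when a new value
# is <= the current minimum, so its top is always the stack minimum.
stack, mins = [], []

def push(x):
    stack.append(x)
    if not mins or x <= mins[-1]:
        mins.append(x)

def pop():
    if stack:
        x = stack.pop()
        if x == mins[-1]:
            mins.pop()

push(-2); push(0); push(-3)
assert mins[-1] == -3    # getMin() --> Returns -3
pop()
assert stack[-1] == 0    # top()    --> Returns 0
assert mins[-1] == -2    # getMin() --> Returns -2
```

Using `<=` (rather than `<`) when pushing onto `mins` is what keeps duplicate minimums correct: each copy of the minimum gets its own entry, so popping one copy does not lose the others.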
|
Description Chair featuring velvety fabric. Antiqued brass nail heads accent the frame along with weathered hickory stained legs and base. Pillow included. Seat height is 19".
|
#!/usr/bin/env python
import glob
import random
import cgi
import cgitb
import zipfile
import bz2
import tarfile
import subprocess
cgitb.enable()
#files = glob.glob("hands/*")
# Yes this is stupid. But so is the zip format - it doesn't compress between
# files. The tarfile module doesn't allow streaming out single files.
#smallblob = bz2.BZ2File("hands.zip.bz2")
#blob = zipfile.ZipFile("hands.zip")
#blob = tarfile.open("hands.tar.bz2")
#hand = random.choice(blob.getnames())
hands = subprocess.check_output(["tar", "tf", "hands.tar.gz"]).splitlines()
hand = random.choice(hands)
#hand = 'hands/hand1346582280164'
# maybe filter out empty files?
form = cgi.FieldStorage()
score = form.getfirst('score', 0)
try:
score = int(score)
except (TypeError, ValueError):
score = 0
target = '/cgi-bin/gamestate.py?hand=%s' % hand[6:]
if score:
target += '&score=%d' % score
hands = form.getfirst('hands', 0)
try:
hands = int(hands)
except (TypeError, ValueError):
hands = 0
if hands:
target += '&hands=%d' % hands
print 'Status: 200 OK' # This seems to make the browser hide the real URL... good for hiding the score
print 'Content-Type: text/plain'
print 'Location: %s' % target
print
# Should put a HTML redirect here
|
Wonderful set of six Georges Briard designed mid-century rocks glasses. Clear 13-ounce double rocks glasses with elegant frosted white doves of peace outlined in 22k gold. Beautiful to enjoy holiday cocktails in, and to raise a toast to good will on earth. They measure 4 1/8" high, 4" wide.
|
from bitwise import *
def split(l, c=0):
"""Split the given list in two."""
return (l[: int(len(l)/2)], l[int(len(l)/2) : None if c == 0 else c])
def main():
print("This program excecutes (Non-)restoring division algorithm.\n")
print("The formula it's going to calculate is: X / Y = ?")
print("Choose which algorithm (R)estoring or (N)on-restoring [r/n]: ", end="")
while True:
inp = str(input())[0]
if inp in ["n", "N"]:
algorithm = "n"
break
elif inp in ["r", "R"]:
algorithm = "r"
break
else:
print("Input R or N. ", end="")
print("Input the bit length of SECOND variable Y: ", end="")
ylen = int(input())
xlen = ylen * 2
print("(The bit length of X is: len(Y)*2 = %d)" % xlen)
print("Input the number of first variable X: ", end="")
x = int(input())
if x < 0:
x = TwoComp( ("{0:0%db}" % xlen).format(x) ) #Calculate the two's complement number of x
else:
x = ("{0:0%db}" % xlen).format(x) #Convert to bits and assign directly
print("Input the number of second variable Y: ", end="")
y = int(input())
if y < 0:
y = TwoComp( ("{0:0%db}" % ylen).format(y) )
else:
y = ("{0:0%db}" % ylen).format(y)
n = ylen
c = ""
#----- Prepare internal variables -----#
print("Internal variables:")
print("X = %s %s" % (x[:ylen], x[ylen:]))
print("Y =", y)
print("n =", n)
print("")
#----- Algorithm start -----#
print("#Algorithm start: %s\n" % ("Restoring" if algorithm == "r" else "Non-restoring"))
if not "1" in y:
print("Y is zero. Aborting.")
return
print("X1 = X1 - Y\t\t", end="")
x = BitAdd(x, TwoComp(y) + GenZeroStr(ylen), xlen)
print("X = %s %s" % split(x))
if x[0] == "0":
print("X1 is positive or zero. Aborting.")
return
x = BitShift(x, -1)
print("[X1][X2][C] << 1\tX = %s %sC" % split(x, -1))
print("X1 = X1 + Y\t\t", end="")
x = BitAdd(x, y + GenZeroStr(ylen), xlen)
print("X = %s %sC" % split(x, -1))
print("n = n - 1 = %d" % (n-1))
n -= 1
#--- Go into the loop --- #
print("\n#Into the loop...\n")
if algorithm == "r":
pass
elif algorithm == "n":
for i in range(n): # X1 != 0
print("Step %d:" % (i+1))
if x[0] == "0": # X1 >= 0
c = "1"
print("X1 >= 0 -> c = 1", end="")
x = x[:-1] + c #[X1][X2][C] << 1
print("\tX = %s %s" % split(x))
x = BitShift(x, -1) #Shift bits leftward
print("[X1][X2][C] << 1\tX = %s %sC" % split(x, -1))
x = BitAdd(x, TwoComp(y) + GenZeroStr(ylen), xlen) #X1 = X1 - Y
print("X1 = X1 - Y\t\tX = %s %sC" % split(x, -1))
else:
c = "0"
print("X1 < 0 -> c = 0", end="")
x = x[:-1] + c
print("\t\tX = %s %s" % split(x))
x = BitShift(x, -1)
print("[X1][X2][C] << 1\tX = %s %sC" % split(x, -1))
x = BitAdd(x, y + GenZeroStr(ylen), xlen) #X1 = X1 + Y
print("X1 = X1 + Y\t\tX = %s %sC" % split(x, -1))
print("")
if x[0] == "0": # X1 >= 0
print("X1 >= 0 -> C = 1")
c = "1"
x = x[:-1] + c
else:
print("X1 < 0 -> C = 0")
c = "0"
x = x[:-1] + c
x = BitAdd(x, y + GenZeroStr(ylen), xlen)
print("X1 = X1 + Y")
print("X = %s %s" % split(x))
print("")
print("The answer is: R = %s, Q = %s" % split(x))
if __name__ == "__main__":
main()
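The script relies on helpers imported from a local `bitwise` module that is not shown here. The following is a minimal sketch of plausible implementations, inferred purely from the call sites above; the names, signatures, and exact semantics are assumptions, not the actual module:

```python
# Plausible stand-ins for the `bitwise` helpers used above (assumptions
# inferred from how main() calls them; not the actual module).

def GenZeroStr(n):
    """n zero bits as a string."""
    return "0" * n

def TwoComp(bits):
    """Two's complement of a bit string, same width."""
    n = len(bits)
    return format(((1 << n) - int(bits, 2)) % (1 << n), "0%db" % n)

def BitAdd(a, b, n):
    """Add two bit strings modulo 2**n; result is n bits wide."""
    return format((int(a, 2) + int(b, 2)) % (1 << n), "0%db" % n)

def BitShift(bits, k):
    """Logical shift of a fixed-width bit string (negative k shifts left, zero-filled)."""
    n = len(bits)
    v = int(bits, 2) << -k if k < 0 else int(bits, 2) >> k
    return format(v % (1 << n), "0%db" % n)
```

With these definitions, `BitAdd(x, TwoComp(y) + GenZeroStr(ylen), xlen)` implements the `X1 = X1 - Y` step as addition of the two's complement, and `BitShift(x, -1)` performs the left shift whose vacated low bit is later overwritten with the quotient bit `c`.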
|
A few years ago, a UCLA study on customer service revealed that within the first seven seconds inside your door, customers decide one of three things: they like you, they don’t like you, or they don’t care whether they like you or not (in other words, they’re indifferent toward you). Once people have made up their minds, of course, they behave accordingly. This is more proof, in a large body of scientific and anecdotal evidence, of the power of first impressions. A new employee’s first impression of the organization, the people, and the job is equally powerful in influencing her or his attitudes and behaviors. Clearly, shaping the new employee’s first impression is a matter of utmost importance. A well-planned and well-executed orientation is your best tool for this purpose.
The basic function of an orientation is to provide a new employee with a context for 1) interpreting events and information, and 2) deciding how to respond or behave. A good orientation, in other words, helps everything make sense. When things make sense to us (i.e., when we understand), we feel more confident. Confidence, in turn, leads to a positive attitude. And a positive attitude has at least two things going for it: 1) it increases our capacity for learning, and 2) it’s infectious, and therefore self-reinforcing. Put these two attributes together, and you’ve got something pretty cool happening for your organization: people are learning and feeling positive about it, and those positive feelings make people want to learn more, which makes them feel more positive, and so on. Now here’s a contagious condition that most managers would welcome in their shelter! Heck, imagine the ramifications of a learning and positivity epidemic?!
Believe it or not, the topics within an orientation that can initiate this kind of positive learning cycle are pretty basic. These include the history of the organization and the movement, the mission and vision of the organization, the organization’s philosophies and values, policies, and safety protocols. Additionally, facility tours, employee introductions, job descriptions, and training overviews are usually part of an orientation. Now you may be thinking that your orientations routinely include all of these topics, but you’ve yet to see that “positive learning cycle” take off in your organization. That’s because these things constitute the “what” of an orientation, but “what” is only half the story. To utilize orientations to an organization’s greatest advantage, we need to focus on the “how” of the orientation. In this article, we’ll explore designing orientations as a team development and learning activity. In subsequent articles, we’ll look at specific exercises to create more meaningful, interactive orientation activities for each of the basic elements of orientation.
It’s helpful to think of orientation not so much as a one-time event, but as a process, one that ideally begins the moment the new employee enters the organization and continues throughout the first few weeks, or even months, of her or his tenure. Everyone on staff can play a role in a new person’s orientation. In fact, your team will benefit tremendously by sharing in the responsibilities of creating and implementing a good orientation. In essence, the orientation says “welcome, we’re glad you’re here” over and over to your new employee. It also communicates that you value your employees enough to invest a lot of time in helping them succeed.
The place to begin, of course, is planning. Ask yourself (and your staff) some basic questions in order to help you develop the learning objectives of your orientation. In other words, what do you want people to be able to do once they’ve completed their orientation? For example, at the Monadnock Humane Society where I used to work, we set the following learning objectives for our orientation program: “Participants will be able to… 1) represent the organization accurately, 2) identify and use communication channels appropriately, 3) direct the public to MHS resources appropriately and effectively, and 4) exemplify the MHS internal “humane community” culture.” Once you’ve established your learning objectives, it’s easier to select topics and design learning activities to bring your orientation to life.
Nearly all of the topics and activities of orientations are opportunities for practicing teamwork and team learning. Making orientation a team activity spreads the work and responsibility among a number of players. Additionally, team orientation provides multiple opportunities for refreshing staff knowledge, improving team communication and work skills, and reviewing and improving your day-to-day practices.
To get started engaging all of your team in orientation, ask them to revisit their own experiences of being new. Here’s an exercise to help with this.
1) How did you share the news of your new job and with whom?
2) What was the best part of being new?
3) What would you have really wanted to know or understand about the organization, the people here, and your job when you first started?
A) How can we capture and support the enthusiasm of new people?
B) How do we want new people to feel here? How can we help them feel this way?
D) What topics are important to include in orientation?
This conversation with your staff will both facilitate the development of better content for your orientations and immediately initiate more involvement from staff members in the orientation of new employees.
While you’re in the process of developing a new, more thorough orientation process, remember that orientation is a lot about providing a helpful framework – so give new employees some essential pieces of that framework right away…such as a quick tour of the facility, introductions to other staff (especially those staff who the employee can go to with questions), and copies of the organization’s mission, personnel policies, and safety protocol. As the days and weeks unfold, you and your staff can engage in other orientation activities to integrate new employees into the team and the work. Designing good orientations does take a good deal of time and energy, but the benefits to your new employees, your team, your day to day activities, and your overall mission are well worth the investment.
|
size(600, 1000)
# pathmatics functions
nofill()
stroke(0)
def label(s,x,y):
"""put a black label preserving fill color."""
push()
c = fill()
fill(0)
text(s,x,y)
fill(c)
pop()
def circlepath(x, y, r):
"""Make a circle with curveto."""
    r2 = r * 0.5555  # control-point offset; the exact quarter-circle Bezier constant is 4*(sqrt(2)-1)/3 ~= 0.5523
autoclosepath(close=True)
beginpath(x, y-r)
curveto(x+r2, y-r, x+r, y-r2, x+r, y)
curveto(x+r, y+r2, x+r2, y+r, x , y+r)
curveto(x-r2, y+r, x-r, y+r2, x-r, y)
curveto(x-r, y-r2, x-r2, y-r, x, y-r)
return endpath(draw=False)
# normal
c1 = circlepath( 200, 100, 100)
c2 = circlepath( 300, 100, 100)
drawpath(c1)
drawpath(c2)
label("Normal", 420, 100)
print "Path c1 intersects path c2:", c1.intersects(c2)
# flatness controls the precision of path flattening (slider default 0.6, range 0.1-5.0)
var("flatness", NUMBER, 0.6, 0.1, 5.0)
print "flatness:", flatness
# union
c1 = circlepath( 200, 300, 100)
c2 = circlepath( 300, 300, 100)
drawpath(c1.union(c2, flatness=flatness))
label("Union", 420, 300)
# difference
c1 = circlepath( 200, 500, 100)
c2 = circlepath( 300, 500, 100)
drawpath(c1.difference(c2, flatness=flatness))
label("Difference", 420, 500)
# intersect
c1 = circlepath( 200, 700, 100)
c2 = circlepath( 300, 700, 100)
drawpath(c1.intersect(c2, flatness=flatness))
label("Intersection", 420, 700)
# xor
fill(0)
c1 = circlepath( 200, 900, 100)
c2 = circlepath( 300, 900, 100)
drawpath(c1.xor(c2, flatness=flatness))
label("XOR", 420, 900)
|
Product code: 22848 Category: Mobile Accessories.
The back of anyone’s phone is prime real estate for any promotion! A PopSocket sticks flat to the back of a phone, tablet, or case with its rinsable, repositionable adhesive gel. Once extended, the PopSocket becomes a media stand for the device or a grip for photos and texting, and can be lowered for hands-free video chats. The possibilities are endless with PopSockets, and with full colour imprinting, so are the imprint possibilities. Best of all, PopSockets can be used on any brand of phone. Instructions included.
Size (expanded): 1.53 in. x 1.53 in. x 0.9 in.
Setup charges: $123.75. Pricing includes a four colour imprint on one location.
*May vary according to product availability from supplier. **Price subject to change without notice.
|
#!/usr/bin/python
# -*- coding: utf-8 -*-
import re
import os
import time
import argparse
import subprocess as sp
from datetime import datetime

from colorama import init, Fore, Back
from HTMLParser import HTMLParser

import app_setup as app
import cli_browser
import cli_download_manager as idm
import helper_functions as helpers
import hook_system_variables as hook
import os_operations as op
import package_manager as manager

# The helper_functions module was imported twice under two names; keep both
# aliases, since the code below refers to "helpers" and "helper" interchangeably.
helper = helpers

init()  # enable ANSI colour handling (required on Windows)
class cmd_line(object):
    def __init__(self):
        self.__parser = argparse.ArgumentParser(
            prog=hook.application_name,
            description=hook.application_description,
            #epilog=Back.BLUE + hook.additional_description + Back.RESET,
            usage=hook.usage_syntax,
            conflict_handler='resolve'
        )
        # ArgumentParser(version=...) is deprecated since Python 2.7;
        # register an explicit --version action instead.
        self.__parser.add_argument(
            '--version', action='version',
            version=Fore.YELLOW + hook.application_name + ' ' + hook.application_version + Fore.RESET)
def __cmd_init(self, _interactive=False, _pversion=False, _commands=None):
created_time = datetime.today().strftime('%H:%M:%S - %b, %d %Y')
print '[+] : Created at {0}'.format(created_time)
print '[+] : Installed packages {0}'.format(0)
print '[+] : Updates at the working directory {0}'.format(0)
packages = []
if _interactive:
print
want_more = 'Y'
while want_more in ('y', 'Y'):
repository = raw_input("Offer your package name: ")
if repository in ('q', 'quit', 'exit'):
break
print
cmd_browser = cli_browser.cli_browser()
cmd_browser.setRequestedURL("https://github.com/search?q={0}&type=Repositories&ref=searchresults".format(repository))
response = cmd_browser.submit()
repos_list = cmd_browser.parseResponse(response)
parser = HTMLParser()
length = len(repos_list)
for repo_index in range(length):
tpl = repos_list[repo_index]
print parser.unescape("[{0:2}] : {1} {2}".format((repo_index+1), tpl[0][1:], '- '+re.sub(r'<em>|</em>', '', tpl[2]).strip()))
if length > 0:
print
package_number = -1
while package_number < 0 or package_number > length:
try:
_input = raw_input("Choose your package number (1: DEFAULT, 0: IGNORE): ")
package_number = int(_input)
except ValueError:
package_number = 1
if package_number == 0:
continue
package_name = repos_list[(package_number-1)][0][1:]
package_version = '*'
if _pversion:
cmd_browser.setRequestedURL('https://github.com/{0}/tags'.format(package_name))
response = cmd_browser.submit()
versions = cmd_browser.parseVersions(response)
if len(versions) > 0:
for vr in versions:
print vr, ', ',
print
                            package_version = raw_input("Choose your package version (latest: DEFAULT): ")
if package_version == '':
package_version = versions[0]
else:
                            print Back.RED + "There are no releases" + Back.RESET
po = {"package": package_name, "version": package_version}
packages.append(po)
print
print "[+] : {0} added to your target packages".format(package_name)
print
else:
print "There is no package named: {0}".format(repository)
want_more = raw_input("Do you want more packages (y/N): ")
cmd_browser.closeConnections()
else:
_packages = _commands[1:]
for pkg in _packages:
dtls = pkg.split(':')
packages.append({"package": dtls[0], "version": dtls[1]})
#print
#d.pretty_print(packages)
self.__setup_workspace(packages, {"created_at": created_time, "installed_packages": [], "workspace_updates": []})
def __cmd_self_install(self):
app.setup()
def __cmd_create(self, _args, _version=False, _foundation=False, _bootstrap=False):
step = ""
if not _version:
if len(_args) < 3:
print Fore.YELLOW + "Not enough arguments" + Fore.RESET
return
else:
if len(_args) < 4:
print Fore.YELLOW + "Not enough arguments" + Fore.RESET
return
project_type = _args[0]
project_name = _args[2]
if project_type != "dynamic" and project_type != "static":
print Back.RED + " Unrecognized command " + Back.RESET
return
if project_type == "dynamic":
try:
step = "php"
print "checking if php is installed on the system ..."
sp.check_call(['php', '-v'])
print "php is verified successfully"
print "downloading composer to your current working directory ..."
if not op.is_exits('composer.phar'):
ps = sp.Popen(('php', '-r', "readfile('https://getcomposer.org/installer');"), stdout=sp.PIPE)
output = sp.check_output('php', stdin=ps.stdout)
ps.wait()
print "composer is successfully downloaded, you can use separately by typing composer.phar [options] command [arguments] in your command prompt"
else:
print "composer downloading operation is canceled, composer is found in your current working directory"
if not _version:
sp.call(['php', 'composer.phar', 'create-project', _args[1], project_name])
elif _version:
sp.call(['php', 'composer.phar', 'create-project', _args[1], project_name, _args[3]])
            except (sp.CalledProcessError, OSError):
if step == "php":
print Fore.YELLOW + "php is not found in your system's environment path, you may first setup your local development environment and try again" + Fore.RESET
elif step == "composer":
print Fore.YELLOW + "composer is not found in your system's environment path, use --getcomposer option instead" + Fore.RESET
elif project_type == "static":
pass
def __cmd_add(self, _args):
file_name = 'hook.json'
if not op.is_exits(file_name):
print "hook can't find " + file_name
return
if len(_args) == 0:
return
require_list = helpers.load_json_file(file_name)
for arg in _args:
try:
repository, version = arg.split(':')
if not helper.is_repository(repository):
print repository + " is not a valid repository name"
continue
package = {"package": repository, "version": version}
require_list['require'].append(package)
except ValueError:
print arg + " is not a valid argument"
pass
helper.prettify(require_list)
op.create_file(file_name, _content=helper.object_to_json(require_list))
def __cmd_install(self):
browser_object = cli_browser.cli_browser()
browser_connection = browser_object.getHttpConnection()
download_manager = idm.download_manager()
download_manager.plugInBrowserWithDownloadManager(browser_connection)
if not op.is_exits('hook.json'):
print "hook can't find hook.json"
return
require_list = helpers.load_json_file('hook.json')['require']
#helpers.prettify(list)
urls = []
repositories = []
for pkg in require_list:
name = pkg['package']
version = ''
if pkg['version'] == '*':
browser_object.setRequestedURL('https://github.com/{0}/tags'.format(name))
response = browser_object.submit()
versions_list = browser_object.parseVersions(response)
if len(versions_list) == 0:
print Back.RED+'No releases for {0}'.format(name)+Back.RESET
version = 'master'
else:
version = versions_list[0]
else:
version = pkg['version']
#url = 'https://github.com/fabpot/Twig/archive/v1.16.0.zip'
url = 'https://github.com/{0}/archive/{1}.zip'.format(name, version)
repositories.append(name)
#print url
urls.append(url)
download_manager.startQueue(urls, _repositories=repositories)
browser_connection.close()
browser_object.closeConnections()
def __cmd_search(self, _package, _surfing=False, _current="1", _av_pages=-1):
if len(_package) == 0:
print "No package is provided"
return
cmd_browser = cli_browser.cli_browser()
while True:
current = _current
av_pages = _av_pages
prompt_message = "Choose your package number (1: DEFAULT, q: QUIT): "
repository = _package[0]
cmd_browser.setRequestedURL("https://github.com/search?q={0}&p={1}&type=Repositories&ref=searchresults".format(repository, current))
response = ''
new_line = ''
while True:
response = cmd_browser.submit()
if response is not None:
if new_line == "#":
print "\n"
break
new_line = "#"
#cmd_browser.closeConnections()
#cmd_browser = cli_browser.cli_browser()
#cmd_browser.setRequestedURL("https://github.com/search?q={0}&p={1}&type=Repositories&ref=searchresults".format(repository, current))
repos_list = cmd_browser.parseResponse(response)
parser = HTMLParser()
length = len(repos_list)
if length > 0:
print Fore.BLUE + "{0:6} {1}\n".format("Num", "Repository - Description") + Fore.RESET
for repo_index in range(length):
tpl = repos_list[repo_index]
print parser.unescape("[{0:2}] : {1} {2}".format((repo_index+1), tpl[0][1:], '- '+re.sub(r'<em>|</em>', '', tpl[2]).strip()))
if length > 0:
if _surfing:
#print
current = cmd_browser.getCurrentPage(response)
print "\nCurrent page: {0}".format(current)
print "Available pages: ",
pages = cmd_browser.parsePagination(response)
if av_pages == -1 and pages is not None:
#print pages[-1]
#print pages
av_pages = int(pages[-1])
#if pages is not None:
print av_pages,
else:
if av_pages == -1:
av_pages = 1
print av_pages if av_pages != -1 else 1,
prompt_message = "\nChoose your (package number/action) (1: DEFAULT, p: PREVIOUS, n: NEXT, r: RESET, q: QUIT): "
#print
package_number = -1
_input = ''
try:
print
_input = raw_input(prompt_message)
package_number = int(_input)
except ValueError:
#print av_pages
if _surfing and _input in ('p', 'n', 'r', 'q') and 0 < int(current) <= av_pages:
print
if _input == 'p':
_current = str(int(current)-1)
_av_pages = av_pages
elif _input == 'n':
crnt = int(current)+1
if crnt > av_pages:
crnt = av_pages
_current = str(crnt)
_av_pages = av_pages
elif _input == 'r':
_current = '1'
_av_pages = av_pages
else:
print "Hook is quitting ..."
cmd_browser.closeConnections()
return
continue
elif _input == 'q':
print "\nHook is quitting ..."
cmd_browser.closeConnections()
return
else:
package_number = 1
if package_number < 0:
package_number = 1
elif package_number > length:
package_number = length
package_name = repos_list[(package_number-1)][0][1:]
cmd_browser.setRequestedURL('https://github.com/{0}/tags'.format(package_name))
response = cmd_browser.submit()
versions = cmd_browser.parseVersions(response)
print "\n" + Back.BLUE + package_name + Back.RESET + " versions" + "\n"
if len(versions) > 0:
for vr in versions:
print vr, ', ',
else:
                    print Back.RED + "There are no releases" + Back.RESET
else:
print "There is no package named: {0}".format(Fore.YELLOW + repository + Fore.RESET)
break
cmd_browser.closeConnections()
def __cmd_list(self):
manager.get_list_of_installed_packages()
def __cmd_info(self):
print "information"
def __cmd_update(self, args):
if not args:
browser_object = cli_browser.cli_browser()
browser_connection = browser_object.getHttpConnection()
download_manager = idm.download_manager()
download_manager.plugInBrowserWithDownloadManager(browser_connection)
installed_list = manager.get_installed_packages()
length = len(installed_list)
for package_index in range(length):
pkg = installed_list[package_index]
name, version = re.search(r'(.+?)\-([\d\w\.]*)\.zip', pkg['package'], re.IGNORECASE).groups()
#print
browser_object.setRequestedURL('https://github.com/{0}/tags'.format(pkg['repository']))
response = browser_object.submit()
versions_list = browser_object.parseVersions(response)
                if len(versions_list) == 0:
                    if version == 'master':
                        print Fore.GREEN + name + Fore.RESET + " is already up-to-date"
                    else:
                        # No tagged releases on the remote: nothing to compare against.
                        print Back.RED + 'No releases for {0}'.format(name) + Back.RESET
                    continue
                if versions_list[0] == version or versions_list[0] == 'v' + version:
                    print Fore.GREEN + name + Fore.RESET + " is already up-to-date"
                    continue
print 'Update process: ' + Fore.YELLOW + version + Fore.RESET + ' -> ' + Fore.GREEN + versions_list[0] + Fore.RESET
message = " is going to be updated to " + Fore.GREEN + name + ' (' + versions_list[0] + ')' + Fore.RESET
print "\t" + Fore.YELLOW + "{0} ({1})".format(name, version) + Fore.RESET + message
url = 'https://github.com/{0}/archive/{1}.zip'.format(pkg['repository'], versions_list[0])
download_manager.startQueue(url, _params={"repository": pkg['repository'], "type": "update", "old_pkg": pkg['package']})
browser_connection.close()
browser_object.closeConnections()
return
package = args[0]
matching_list = manager.match_package(package)
if len(matching_list) > 0:
print Back.BLUE + " Package(s) matching " + Back.RESET + " ({0})\n".format(package)
self.__update_helper_interface(manager.match_package(package))
else:
print Fore.YELLOW + "No package matches " + Fore.RESET + "({0})\n".format(package)
def __update_helper_interface(self, installed_list):
browser_object = cli_browser.cli_browser()
browser_connection = browser_object.getHttpConnection()
download_manager = idm.download_manager()
download_manager.plugInBrowserWithDownloadManager(browser_connection)
length = len(installed_list)
_input = ''
try:
print Fore.BLUE+"{0:4} {1:28}{2:28}{3:28}".format("Num", "Installed at", "Name", "Version")+Fore.RESET
print
for item_index in range(length):
pkg = installed_list[item_index]
installed_at = pkg['installed_at']
name, version = re.search(r'(.+?)\-([\d\w\.]*)\.zip', pkg['package'], re.IGNORECASE).groups()
print "[{0:2}] {1:28}{2:28}{3:28}".format((item_index+1), installed_at, name, version)
print
while True:
_input = raw_input("Choose your package number (1: DEFAULT, q: QUIT): ")
if _input == "":
_input = 1
package_index = int(_input)
if 0 < package_index <= length:
pkg = installed_list[package_index-1]
name, version = re.search(r'(.+?)\-([\d\w\.]*)\.zip', pkg['package'], re.IGNORECASE).groups()
print
browser_object.setRequestedURL('https://github.com/{0}/tags'.format(pkg['repository']))
response = browser_object.submit()
versions_list = browser_object.parseVersions(response)
                    if len(versions_list) == 0:
                        if version == 'master':
                            print name + " is already up-to-date"
                        else:
                            # No tagged releases on the remote: nothing to update to.
                            print Back.RED + 'No releases for {0}'.format(pkg['repository']) + Back.RESET
                        browser_connection.close()
                        browser_object.closeConnections()
                        return
                    if versions_list[0] == version or versions_list[0] == 'v' + version:
                        print name + " is already up-to-date"
                        browser_connection.close()
                        browser_object.closeConnections()
                        return
print 'Update process: ' + Fore.YELLOW + version + Fore.RESET + ' -> ' + Fore.GREEN + versions_list[0] + Fore.RESET
message = " is going to be updated to " + Fore.GREEN + name + ' (' + versions_list[0] + ')' + Fore.RESET + ". Are you sure (y,N): "
while True:
confirmation = raw_input("\n\t" + Fore.YELLOW + "{0} ({1})".format(name, version) + Fore.RESET + message)
if confirmation in ('y', 'Y', 'yes'):
url = 'https://github.com/{0}/archive/{1}.zip'.format(pkg['repository'], versions_list[0])
download_manager.startQueue(url, _params={"repository": pkg['repository'], "type": "update", "old_pkg": pkg['package']})
browser_connection.close()
browser_object.closeConnections()
print "\nUpdate action on "+name
break
elif confirmation in ('', 'n', 'N', 'no'):
print "\nOperation is canceled"
print "Hook is quitting"
break
break
except ValueError:
if _input not in ('q', 'quit'):
print "No value was specified"
print "Hook is quitting"
except AttributeError as e:
print e.message
browser_connection.close()
browser_object.closeConnections()
def __uninstall_helper_interface(self, installed_list):
length = len(installed_list)
_input = ''
try:
print Fore.BLUE+"{0:4} {1:28}{2:28}{3:28}".format("Num", "Installed at", "Name", "Version")+Fore.RESET
print
for item_index in range(length):
pkg = installed_list[item_index]
installed_at = pkg['installed_at']
name, version = re.search(r'(.+?)\-([\d\w\.]*)\.zip', pkg['package'], re.IGNORECASE).groups()
print "[{0:2}] {1:28}{2:28}{3:28}".format((item_index+1), installed_at, name, version)
print
while True:
_input = raw_input("Choose your package number (1: DEFAULT, q: QUIT): ")
if _input == "":
_input = 1
package_index = int(_input)
if 0 < package_index <= length:
pkg = installed_list[package_index-1]
name, version = re.search(r'(.+?)\-([\d\w\.]*)\.zip', pkg['package'], re.IGNORECASE).groups()
print
print Back.RED + " DANGER ZONE " + Back.RESET + " Selected[{0}]".format(Fore.YELLOW + name + ' (' + version + ')' + Fore.RESET)
while True:
confirmation = raw_input("\n\t" + Fore.RED + "{0} ({1})".format(name, version) + Fore.RESET + " is going to be deleted. Are you sure (y,N): ")
if confirmation in ('y', 'Y', 'yes'):
manager.uninstall_package(name, version)
print "Delete action on "+name
break
elif confirmation in ('', 'n', 'N', 'no'):
print "\nOperation is canceled"
print "Hook is quitting"
break
break
except ValueError:
if _input not in ('q', 'quit'):
print "No value was specified"
print "Hook is quitting"
except AttributeError as e:
print e.message
def __cmd_uninstall(self, _packages=None):
if _packages is None or _packages == []:
installed_list = manager.get_installed_packages()
if len(installed_list) > 0:
self.__uninstall_helper_interface(installed_list)
else:
print Fore.YELLOW + "No packages were installed yet" + Fore.RESET
return
item_to_uninstall = _packages[0]
matching_list = manager.match_package(item_to_uninstall)
if len(matching_list) > 0:
print Back.BLUE + " Package(s) matching " + Back.RESET + " ({0})\n".format(item_to_uninstall)
self.__uninstall_helper_interface(manager.match_package(item_to_uninstall))
else:
print Fore.YELLOW + "No package matches " + Fore.RESET + "({0})\n".format(item_to_uninstall)
def __cmd_profile(self):
if not op.is_exits('.hook/workspace_settings.json'):
manager.settings_not_found_error_print()
return
if not op.is_exits("hook.json"):
print "You're not in the working directory. Switch to the working directory and try again"
return
settings = helper.load_json_file('.hook/workspace_settings.json')
print Back.BLUE + " Workspace: " + Back.RESET + " %0.2f MB\n" % op.get_folder_size(os.getcwd())['mb']
print "\t" + Fore.BLUE + "Created at:" + Fore.RESET + " {0}\n".format(settings['created_at'])
print Back.BLUE + " Components: " + Back.RESET + " %0.1f MB\n" % op.get_folder_size('components')['mb']
components = os.listdir('components')
print "\t" + Fore.BLUE+"{0:32}{1:14}{2:14}\n".format("Name", "Size(mb)", "Size(kb)")+Fore.RESET
for item in components:
size = op.get_folder_size('components/' + item)
print "\t" + "{0:32}{1:14}{2:14}".format(item, ("%0.2f" % size['mb']), ("%d" % size['kb']))
def __cmd_home(self, _repository):
url = ''
if helper.is_ssh_url(_repository):
url = 'https://github.com/' + re.search(r':(.+?).git$', _repository, re.IGNORECASE).group(1)
elif helper.is_http_url(_repository):
url = _repository
elif helper.is_repository(_repository):
url = 'https://github.com/' + _repository
if url == '':
print "No proper information was given"
return
cmd_browser = cli_browser.cli_browser()
cmd_browser.setRequestedURL(url)
print Fore.GREEN + 'Requesting' + Fore.RESET + ' -> ' + url
response = cmd_browser.submit(_return_status_code=True)
try:
response_status = int(response)
print Fore.YELLOW + str(response_status) + Fore.RESET + ': ' + cmd_browser.status_code_desc(response_status)
except ValueError as e:
print Fore.GREEN + 'Opening' + Fore.RESET + ' -> ' + url + ' in the default web browser'
op.open_url(url)
def __cache_list(self):
cache_list = op.list_dir(op.get_home() + op.separator() + hook.data_storage_path)
length = len(cache_list)
print Fore.BLUE + "{0:4} {1:35}{2:10}".format("Num", "File name", "Type") + Fore.RESET
print
for index in range(length):
try:
cached_file = cache_list[index]
name, type = re.search(r'(.+?)\.(zip|rar|gzip|bzip2|tar)', cached_file, re.IGNORECASE).groups()
print "[{0:2}] {1:35}{2:10}".format((index+1), name, type)
except Exception:
pass
def __cache_remove(self, _args):
separator = op.separator()
cache_path = op.get_home() + separator + hook.data_storage_path
cache_list = op.list_dir(cache_path)
if len(_args) > 0:
file_name = _args[0]
matching_list = manager.match_package_by_list(cache_list, file_name)
length = len(matching_list)
if length == 0:
print Fore.YELLOW + 'No file matches ' + Fore.RESET + "({0})\n".format(file_name)
return
"""
if length == 1:
file_name = matching_list[0]
print Back.RED + " DANGER ZONE " + Back.RESET
while True:
confirmation = raw_input("\n\t" + Fore.RED + file_name + Fore.RESET + " is going to be deleted. Are you sure (y,N): ")
if confirmation in ('y', 'Y', 'yes'):
op.remove_file(cache_path + separator + file_name)
print "\n" + Fore.YELLOW + file_name + Fore.RESET + " has been deleted"
break
elif confirmation in ('', 'n', 'N', 'no'):
print "\nOperation is canceled"
print "Hook is quitting"
break
return
"""
_input = ''
try:
print Back.BLUE + ' File(s) matching ' + Back.RESET + " ({0})".format(file_name)
print
print Fore.BLUE + "{0:4} {1:30}".format('Num', 'File name') + Fore.RESET
print
for index in range(length):
print "[{0:2}] {1:30}".format((index + 1), matching_list[index])
print
while True:
_input = raw_input("Choose your file number (1: DEFAULT, q: QUIT): ")
if _input == "":
_input = 1
file_index = int(_input)
if 0 < file_index <= length:
file_name = matching_list[(file_index - 1)]
print "\n" + Back.RED + " WARNING " + Back.RESET + " Selected[{0}]".format(Fore.YELLOW + file_name + Fore.RESET)
while True:
confirmation = raw_input("\n\t" + Fore.RED + file_name + Fore.RESET + " is going to be deleted. Are you sure (y,N): ")
if confirmation in ('y', 'Y', 'yes'):
op.remove_file(cache_path + separator + file_name)
print "\n" + Fore.YELLOW + file_name + Fore.RESET + " has been deleted"
break
elif confirmation in ('', 'n', 'N', 'no'):
print "\nOperation is canceled"
print "Hook is quitting"
break
break
except ValueError:
if _input not in ('q', 'quit'):
print "No value was specified"
print "Hook is quitting"
except AttributeError as e:
print e.message
return
_input = ''
try:
length = len(cache_list)
print Fore.BLUE + "{0:4} {1:30}".format('Num', 'File name') + Fore.RESET
print
for index in range(length):
print "[{0:2}] {1:30}".format((index + 1), cache_list[index])
print
while True:
_input = raw_input("Choose your file number (1: DEFAULT, q: QUIT): ")
if _input == "":
_input = 1
file_index = int(_input)
if 0 < file_index <= length:
file_name = cache_list[(file_index - 1)]
print "\n" + Back.RED + " WARNING " + Back.RESET + " Selected[{0}]".format(Fore.YELLOW + file_name + Fore.RESET)
while True:
confirmation = raw_input("\n\t" + Fore.RED + file_name + Fore.RESET + " is going to be deleted. Are you sure (y,N): ")
if confirmation in ('y', 'Y', 'yes'):
op.remove_file(cache_path + separator + file_name)
print "\n" + Fore.YELLOW + file_name + Fore.RESET + " has been deleted"
break
elif confirmation in ('', 'n', 'N', 'no'):
print "\nOperation is canceled"
print "Hook is quitting"
break
break
except ValueError:
if _input not in ('q', 'quit'):
print "No value was specified"
print "Hook is quitting"
except AttributeError as e:
print e.message
def __cache_info(self):
separator = op.separator()
cache_path = op.get_home() + separator + hook.data_storage_path
cache_list = op.list_dir(cache_path)
length = len(cache_list)
print Fore.BLUE + "{0:4} {1:35}{2:8}{3:28}{4:14}{5:14}".format("Num", "File name", "Type", "Downloaded at", "Size(mb)", "Size(kb)") + Fore.RESET
print
for index in range(length):
try:
cached_file = cache_list[index]
name, type = re.search(r'(.+?)\.(zip|rar|gzip|bzip2|tar)', cached_file, re.IGNORECASE).groups()
file_size = op.get_file_size(cache_path + separator + cached_file)
t = os.path.getmtime(cache_path + separator + cached_file) # returns seconds
m = time.strftime("%H:%M:%S - %b, %d %Y", time.gmtime(t))
print "[{0:2}] {1:35}{2:8}{3:28}{4:14}{5:14}".format((index+1), name, type, m, ("%0.2f" % file_size['mb']), ("%d" % file_size['kb']))
except Exception:
pass
def __cache_rename(self, _args):
old_name = _args[0]
new_name = _args[1]
separator = op.separator()
cache_path = op.get_home() + separator + hook.data_storage_path
cache_list = op.list_dir(cache_path)
matching_list = manager.match_package_by_list(cache_list, old_name)
length = len(matching_list)
if length == 0:
print Fore.YELLOW + 'No file matches ' + Fore.RESET + "({0})\n".format(old_name)
return
if length == 1:
old_name = matching_list[0]
oldname_file_extension = manager.get_file_extension(old_name)
newname_file_extension = manager.get_file_extension(new_name)
extension = manager.choose_extension(oldname_file_extension, newname_file_extension)
if extension is not None:
new_name += '.'+extension
op.rename_file(cache_path + separator + old_name, cache_path + separator + new_name)
print Fore.YELLOW + old_name + Fore.RESET + ' renamed -> ' + Fore.GREEN + new_name + Fore.RESET
return
_input = ''
try:
print Back.BLUE + ' File(s) matching ' + Back.RESET + " ({0})".format(old_name)
print
print Fore.BLUE + "{0:4} {1:30}".format('Num', 'File name') + Fore.RESET
print
for index in range(length):
print "[{0:2}] {1:30}".format((index + 1), matching_list[index])
print
while True:
_input = raw_input("Choose your file number (1: DEFAULT, q: QUIT): ")
if _input == "":
_input = 1
file_index = int(_input)
if 0 < file_index <= length:
old_name = matching_list[(file_index - 1)]
print "\n" + Back.RED + " WARNING " + Back.RESET + " Selected[{0}]".format(Fore.YELLOW + old_name + Fore.RESET)
while True:
confirmation = raw_input("\n\tAre you sure (y,N): ")
if confirmation in ('y', 'Y', 'yes'):
oldname_file_extension = manager.get_file_extension(old_name)
newname_file_extension = manager.get_file_extension(new_name)
extension = manager.choose_extension(oldname_file_extension, newname_file_extension)
if extension is not None:
new_name += '.'+extension
op.rename_file(cache_path + separator + old_name, cache_path + separator + new_name)
print "\n" + Fore.YELLOW + old_name + Fore.RESET + ' renamed -> ' + Fore.GREEN + new_name + Fore.RESET
break
elif confirmation in ('', 'n', 'N', 'no'):
print "\nOperation is canceled"
print "Hook is quitting"
break
break
except ValueError:
if _input not in ('q', 'quit'):
print "No value was specified"
print "Hook is quitting"
except AttributeError as e:
print e.message
def __cache_help(self):
print "usage: hook cache <command> [<args>] [<options>]\n"
print "Commands:"
print "{0}{1:19}{2}".format((" " * 2), 'help', "show this help message")
print "{0}{1:19}{2}".format((" " * 2), 'list', "list cached files")
print "{0}{1:19}{2}".format((" " * 2), 'info', "show basic information about cached files")
print "{0}{1:19}{2}".format((" " * 2), 'remove', "remove selected file from cache")
print "{0}{1:19}{2}".format((" " * 2), 'rename', "rename selected cached file")
print "{0}{1:19}{2}".format((" " * 2), 'register', "register package manually in your local cache")
print "{0}{1:19}{2}".format((" " * 2), 'repackage', "modify and repackage an already register package in your cache")
print "{0}{1:19}{2}".format((" " * 2), 'load', "load a package to your current working directory")
def __cache_register(self, _args):
separator = op.separator()
cache_path = op.get_home() + separator + hook.data_storage_path
if len(_args) > 0:
str_list = filter(None, _args[0].split(separator))
_foldername = _args[0]
_zipfilename = str_list[-1]
print _zipfilename
op.compress_folder(_foldername, cache_path + separator + _zipfilename + '.zip')
else:
print Fore.YELLOW + "Not enough arguments" + Fore.RESET
def __cache_repackage(self, _args):
print 'repackage'
def __cache_load(self, _args):
if not op.is_exits('.hook/workspace_settings.json'):
manager.settings_not_found_error_print()
return
separator = op.separator()
cache_path = op.get_home() + separator + hook.data_storage_path
if len(_args) > 0:
_filename = _args[0]
if manager.is_in_cache(_filename + '.zip'):
download_manager = idm.download_manager()
download_manager.load_from_cache(_filename + '.zip', op.get_file_size(cache_path + separator + _filename + '.zip')['kb'])
manager.register_installed_package(_filename + '.zip', _filename + '/' + _filename)
else:
print Fore.YELLOW + _filename + " is not registered in the cache" + Fore.RESET
else:
print Fore.YELLOW + "Not enough arguments" + Fore.RESET
def __cmd_cache(self, cache_cmd):
"""
cache related commands: list, remove, info, rename
"""
if cache_cmd[0] == 'list':
self.__cache_list()
elif cache_cmd[0] == 'remove':
self.__cache_remove(cache_cmd[1:])
elif cache_cmd[0] == 'info':
self.__cache_info()
elif cache_cmd[0] == 'rename':
self.__cache_rename(cache_cmd[1:])
elif cache_cmd[0] == 'help':
self.__cache_help()
elif cache_cmd[0] == 'register':
self.__cache_register(cache_cmd[1:])
elif cache_cmd[0] == 'repackage':
self.__cache_repackage(cache_cmd[1:])
elif cache_cmd[0] == 'load':
self.__cache_load(cache_cmd[1:])
else:
print Back.RED + " Unrecognized command " + Back.RESET
self.__cache_help()
def __cmd_download(self, _args):
browser_object = cli_browser.cli_browser()
browser_connection = browser_object.getHttpConnection()
download_manager = idm.download_manager()
download_manager.plugInBrowserWithDownloadManager(browser_connection)
urls = []
repositories = []
if len(_args) > 0:
repository = ''
version = ''
for arg in _args:
if helper.is_http_url(arg):
info = helper.http_url_package(arg)
repository = info[0] + '/' + info[1]
version = '*'
elif helper.is_ssh_url(arg):
info = helper.ssh_url_package(arg)
repository = info[0] + '/' + info[1]
version = '*'
elif helper.is_repository(arg):
repository = arg
version = '*'
else:
try:
repository, version = arg.split(':')
except ValueError:
print arg + " is not a valid argument"
continue
if version == '*':
browser_object.setRequestedURL('https://github.com/{0}/tags'.format(repository))
response = browser_object.submit()
versions_list = browser_object.parseVersions(response)
if len(versions_list) == 0:
print Back.RED + 'No releases for {0}'.format(repository) + Back.RESET
version = 'master'
else:
version = versions_list[0]
#url = 'https://github.com/fabpot/Twig/archive/v1.16.0.zip'
url = 'https://github.com/{0}/archive/{1}.zip'.format(repository, version)
repositories.append(repository)
#print url
urls.append(url)
download_manager.startQueue(urls, _repositories=repositories, _params={'type': 'download'})
browser_connection.close()
browser_object.closeConnections()
else:
print Fore.YELLOW + "Not enough arguments" + Fore.RESET
def __initCommands(self):
self.__parser.add_argument('commands', nargs="*")
self.__parser.add_argument('self-install', help="Setup working environment of hook it self", nargs="?")
self.__parser.add_argument('init', help="Interactively create a hook.json file", nargs="?")
self.__parser.add_argument('create', help="Create dynamic/static project with a specific package", nargs="*")
self.__parser.add_argument('download', help="Download package in your current working directory", nargs="?")
self.__parser.add_argument('add', help="Add a package(s) to your hook.json", nargs="*")
#self.__parser.add_argument('create', help="Setting up environment for the project", nargs="*")
self.__parser.add_argument("install", help="Install a package(s) locally", nargs="*")
self.__parser.add_argument("search", help="Search for a package by name", nargs="?")
self.__parser.add_argument("list", help="List local packages", nargs="?")
self.__parser.add_argument("info", help="Show information of a particular package", nargs="?")
self.__parser.add_argument("update", help="Update a local package", nargs="?")
self.__parser.add_argument("uninstall", help="Remove a local package", nargs="?")
self.__parser.add_argument("profile", help="Show memory usage of the working directory", nargs="?")
self.__parser.add_argument("home", help="Open a package's homepage in your default browser", nargs="?")
self.__parser.add_argument("cache", help="Manage hook cache", nargs="?")
def __initOptions(self):
#self.__parser.add_argument("-f", "--force", help="Makes various commands more forceful", action="store_true")
self.__parser.add_argument("-j", "--json", help="Output consumable JSON", action="store_true")
self.__parser.add_argument("-i", "--interactive", help="Makes various commands work interactively", action="store_true")
self.__parser.add_argument("-p", "--pversion", help="Request a specific version of packages", action="store_true")
self.__parser.add_argument("-s", "--surf", help="Paginate the packages list", action="store_true")
self.__parser.add_argument("-c", "--getversion", help="Specify your target version", action="store_true")
self.__parser.add_argument("-b", "--getbootstrap", help="Set up a web application using Twitter Bootstrap", action="store_true")
self.__parser.add_argument("-f", "--getfoundation", help="Set up a web application using Zurb Foundation", action="store_true")
def __setup_workspace(self, _packages, _settings):
op.create_directory('.hook')
op.create_file('.hook/workspace_settings.json', op.object_to_json(_settings))
#op.hide_directory('.hook')
op.show_directory('.hook')
op.create_directory('components')
op.generate_json_file("hook.json", _packages)
print "Initialized empty HooK workspace in {0}".format(op.get_current_path())
print "Generating hook.json ..."
def __is_workspace_setup(self):
hook_exists = op.is_exits('.hook')
workspace_settings_exists = op.is_exits('.hook/workspace_settings.json')
components_exists = op.is_exits('components')
hook_json = op.is_exits('hook.json')
if hook_exists and workspace_settings_exists and components_exists and hook_json:
return True
return False
def __parseArguments(self):
return self.__parser.parse_args()
def initializeCommandLineTool(self):
self.__initCommands()
self.__initOptions()
return self.__parseArguments()
def logoPrint(self, _logo=''):
init()
if _logo == 'init':
print hook.application_logo
return
print
def execute(self, args):
try:
commands = args.commands
self.logoPrint(commands[0])
if commands[0] == 'init':
if not self.__is_workspace_setup():
self.__cmd_init(args.interactive, args.pversion, commands)
else:
print "Workspace is already set up"
elif commands[0] == 'self-install':
self.__cmd_self_install()
elif commands[0] == 'create':
self.__cmd_create(commands[1:], args.getversion, args.getfoundation, args.getbootstrap)
elif commands[0] == 'download':
self.__cmd_download(commands[1:])
elif commands[0] == 'add':
self.__cmd_add(commands[1:])
elif commands[0] == 'install':
self.__cmd_install()
elif commands[0] == 'search':
self.__cmd_search(commands[1:], args.surf)
elif commands[0] == 'list':
self.__cmd_list()
elif commands[0] == 'info':
self.__cmd_info()
elif commands[0] == 'update':
if not self.__is_workspace_setup():
manager.settings_not_found_error_print()
return
self.__cmd_update(commands[1:])
elif commands[0] == 'uninstall':
if not self.__is_workspace_setup():
manager.settings_not_found_error_print()
return
self.__cmd_uninstall(commands[1:])
elif commands[0] == 'profile':
self.__cmd_profile()
elif commands[0] == 'home':
self.__cmd_home(commands[1])
elif commands[0] == 'cache':
try:
self.__cmd_cache(commands[1:])
except Exception:
print Fore.YELLOW + "Not enough arguments" + Fore.RESET
else:
self.__parser.print_help()
except IndexError:
print "\n" + Back.RED + Fore.WHITE + ' Not enough arguments ' + Fore.RESET + Back.RESET
self.__parser.print_help()
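The download loop above resolves each non-URL argument either to a bare `owner/repo` (wildcard version `*`) or to an `owner/repo:version` pair split on the colon. That branch can be sketched standalone (URL handling omitted; this variant splits only on the first colon, slightly more forgiving than the original's bare `split(':')`):

```python
# Minimal sketch of the non-URL argument parsing above: 'owner/repo' gets
# the wildcard version '*', 'owner/repo:version' is split on the first colon.
def parse_package_arg(arg):
    if ':' in arg:
        repository, version = arg.split(':', 1)
    else:
        repository, version = arg, '*'
    return repository, version

print(parse_package_arg('fabpot/Twig:v1.16.0'))  # ('fabpot/Twig', 'v1.16.0')
print(parse_package_arg('fabpot/Twig'))          # ('fabpot/Twig', '*')
```

A wildcard version then triggers the GitHub tags lookup shown above, falling back to `master` when a repository has no releases.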
|
import numpy as np
import unittest
from caffe2.python import core, workspace, muji, test_util
@unittest.skipIf(not workspace.has_gpu_support, "no gpu")
class TestMuji(test_util.TestCase):
def RunningAllreduceWithGPUs(self, gpu_ids, allreduce_function):
"""A base function to test different scenarios."""
net = core.Net("mujitest")
for id in gpu_ids:
net.ConstantFill(
[],
"testblob_gpu_" + str(id),
shape=[1, 2, 3, 4],
value=float(id + 1),
device_option=muji.OnGPU(id)
)
allreduce_function(
net, ["testblob_gpu_" + str(i)
for i in gpu_ids], "_reduced", gpu_ids
)
workspace.RunNetOnce(net)
target_value = sum(gpu_ids) + len(gpu_ids)
all_blobs = workspace.Blobs()
all_blobs.sort()
for blob in all_blobs:
print('{} {}'.format(blob, workspace.FetchBlob(blob)))
for idx in gpu_ids:
blob = workspace.FetchBlob("testblob_gpu_" + str(idx) + "_reduced")
np.testing.assert_array_equal(
blob,
target_value,
err_msg="gpu id %d of %s" % (idx, str(gpu_ids))
)
def testAllreduceFallback(self):
self.RunningAllreduceWithGPUs(
list(range(workspace.NumGpuDevices())), muji.AllreduceFallback
)
def testAllreduceSingleGPU(self):
for i in range(workspace.NumGpuDevices()):
self.RunningAllreduceWithGPUs([i], muji.Allreduce)
def testAllreduceWithTwoGPUs(self):
pattern = workspace.GetGpuPeerAccessPattern()
if pattern.shape[0] >= 2 and np.all(pattern[:2, :2]):
self.RunningAllreduceWithGPUs([0, 1], muji.Allreduce2)
else:
print('Skipping allreduce with 2 gpus. Not peer access ready.')
def testAllreduceWithFourGPUs(self):
pattern = workspace.GetGpuPeerAccessPattern()
if pattern.shape[0] >= 4 and np.all(pattern[:4, :4]):
self.RunningAllreduceWithGPUs([0, 1, 2, 3], muji.Allreduce4)
else:
print('Skipping allreduce with 4 gpus. Not peer access ready.')
def testAllreduceWithFourGPUsAndTwoGroups(self):
pattern = workspace.GetGpuPeerAccessPattern()
if pattern.shape[0] >= 4 and np.all(pattern[:2, :2]) and np.all(pattern[2:4, 2:4]):
self.RunningAllreduceWithGPUs([0, 1, 2, 3], muji.Allreduce4Group2)
else:
print('Skipping allreduce with 4 gpus and 2 groups. Not peer access ready.')
def testAllreduceWithEightGPUs(self):
pattern = workspace.GetGpuPeerAccessPattern()
if (
pattern.shape[0] >= 8 and np.all(pattern[:4, :4]) and
np.all(pattern[4:, 4:])
):
self.RunningAllreduceWithGPUs(
list(range(8)), muji.Allreduce8)
else:
print('Skipping allreduce with 8 gpus. Not peer access ready.')
if __name__ == '__main__':
unittest.main()
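The guards in the grouped-allreduce tests above check peer access within each GPU group rather than across the whole device set. The same check can be exercised on a hand-built pattern (the matrix below is illustrative, not a real device query):

```python
import numpy as np

# Block-diagonal peer-access pattern: GPUs 0-1 see each other, GPUs 2-3 see
# each other, but the two pairs cannot peer across.
pattern = np.array([[1, 1, 0, 0],
                    [1, 1, 0, 0],
                    [0, 0, 1, 1],
                    [0, 0, 1, 1]], dtype=bool)

# Allreduce4Group2 only needs access within each pair...
group2_ok = pattern.shape[0] >= 4 and np.all(pattern[:2, :2]) and np.all(pattern[2:4, 2:4])
# ...while Allreduce4 needs the full 4x4 block.
full4_ok = pattern.shape[0] >= 4 and np.all(pattern[:4, :4])
print(group2_ok, full4_ok)  # True False
```

On hardware with this topology, `testAllreduceWithFourGPUsAndTwoGroups` would run while `testAllreduceWithFourGPUs` would skip.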
|
You'll be the hit of school drop-off in our outstanding 2014 Toyota Sienna LE proudly displayed in Salsa Red Pearl. Powered by a 3.5 Liter V6 that offers 266hp while tethered to a 6 Speed Automatic transmission that rewards you with easy passing maneuvers. With a smooth ride that is low on road noise, our Front Wheel Drive sets the stage for a serene trip with your family and rewards you with near 25mpg. Plus, you'll be completely equipped to tow a trailer up to 3,500 pounds! The aerodynamic exterior of the Sienna LE is accented by alloy wheels, a roof rack, and child-friendly dual power sliding doors.
Inside our LE, you'll immediately notice the great visibility and that the distinctive swept shape of the dashboard makes the spacious cabin feel even roomier. The ergonomically designed leather-trimmed seats offer power lumbar support to keep you comfortable on those long trips and with three-zone automatic climate control, everyone will be content. Thoughtfully designed with your hectic schedule in mind, our LE also has a central display with hands-free phone capability and music streaming via Bluetooth, available satellite radio and an iPod interface. Second-row captain's chairs tip up to allow easier access to the third row and help to maximize legroom for taller passengers. Of course, you'll have ample cargo space for all of your family's gear so enjoy your trip.
|
import datetime

from tkinter import *
from Bill import *
from consumption_parser import foods, drinks
import CafeDB as db
class GameSlot(Frame):
def __init__(self, game_slots_frame, slot_number, game_info):
Frame.__init__(self, game_slots_frame, highlightbackground="black", highlightthickness=3)
self.game_info = game_info
self.slot_number = slot_number
self.game_type = game_info['type']
self.game_status = 0
self.number_of_players = 0
self.time_passed_in_sec = IntVar()
self.bill = Bill()
self.clickable_chidren = []
self.set_inner_widgets()
def set_bill(self, bill):
self.bill = bill
self.pay_bill_button.pack(side=RIGHT)
self.add_extra_button.pack(side=LEFT)
self.transact_game_slot_button.pack(side=LEFT)
def set_inner_widgets(self):
self.pack_propagate(0)
self.top_frame = Frame(self)
self.top_frame.pack(side=TOP, fill=X)
self.clickable_chidren.append(self.top_frame)
self.middle_frame = Frame(self)
self.middle_frame.pack(side=BOTTOM, fill=X)
self.clickable_chidren.append(self.middle_frame)
self.bottom_frame = Frame(self)
self.bottom_frame.pack(side=BOTTOM)
self.clickable_chidren.append(self.bottom_frame)
self.top_left_frame = Frame(self.top_frame, highlightthickness=1)
self.top_left_frame.pack(side=LEFT)
self.clickable_chidren.append(self.top_left_frame)
self.top_right_frame = Frame(self.top_frame, highlightthickness=1)
self.top_right_frame.pack(side=RIGHT)
self.clickable_chidren.append(self.top_right_frame)
self.middle_left_frame = Frame(self.middle_frame, highlightthickness=1)
self.middle_left_frame.pack(side=LEFT)
self.clickable_chidren.append(self.middle_left_frame)
self.middle_right_frame = Frame(self.middle_frame, highlightthickness=1)
self.middle_right_frame.pack(side=RIGHT)
self.clickable_chidren.append(self.middle_right_frame)
# number_label = Label(self.top_left_frame, text=self.slot_number, font=("Helvetica", 16))
# number_label.pack(side=LEFT)
self.game_slot_name_label = Label(self.top_left_frame, text=self.game_info['name'], font=("Helvetica", 16))
self.game_slot_name_label.pack(side=LEFT)
self.clickable_chidren.append(self.game_slot_name_label)
# game_type_label = Label(self.top_right_frame, text=self.game_type['name'], font=("Helvetica", 16))
# game_type_label.pack(side=RIGHT)
self.number_of_players_var = IntVar(self.middle_left_frame)
self.number_of_players_option = OptionMenu(self.middle_left_frame, self.number_of_players_var, *[1, 2, 3, 4])
self.number_of_players_var.set(2)
self.number_of_players_option.pack(side=LEFT)
self.start_button = Button(self.middle_left_frame, text="Başlat", fg="green", font=("Helvetica", 12))
self.start_button.bind("<Button-1>", self.start)
self.start_button.pack(side=LEFT)
# removed due to customer feedback
# self.change_number_of_players_button = Button(self.middle_left_frame, text="Değiştir", fg="orange", font=("Helvetica", 12))
# self.change_number_of_players_button.bind("<Button-1>", self.change_number_of_players)
self.finish_button = Button(self.middle_right_frame, text="Bitir", fg="red", font=("Helvetica", 12))
self.finish_button.bind("<Button-1>", self.finish)
self.pay_bill_button = Button(self.middle_right_frame, text="Kapat", fg="red", font=("Helvetica", 12))
self.pay_bill_button.bind("<Button-1>", self.pay_bill)
self.game_status_text = StringVar()
self.game_status_text.set(str(self.bill.get_total_charge(self.game_type, self.number_of_players, self.time_passed_in_sec.get())) + " / " + str(self.number_of_players))
self.charge_label = Label(self.bottom_frame, textvariable=self.game_status_text, font=("Helvetica", 26))
self.clickable_chidren.append(self.charge_label)
self.add_extra_button = Button(self.middle_left_frame, text="Ekle", fg="purple", font=("Helvetica", 10))
self.add_extra_button.bind("<Button-1>", self.add_extra)
# The transact button is referenced elsewhere in this class but was never
# created; created here with a placeholder label (its handler is assumed to
# be bound externally).
self.transact_game_slot_button = Button(self.middle_left_frame, text="Aktar", fg="blue", font=("Helvetica", 10))
## ## debug
## self.time_label = Label(self.top_left_frame, textvariable=self.time_passed_in_sec, font=("Helvetica", 16))
## self.time_label.pack(side=LEFT)
## ## debug
def second_hit(self):
if self.game_status == 1:
self.time_passed_in_sec.set(self.time_passed_in_sec.get() + 1)
self.game_status_text.set(str(self.bill.get_total_charge(self.game_type, self.number_of_players, self.time_passed_in_sec.get())) + " / " + str(self.number_of_players))
self.after(1000, self.second_hit)
def set_clicked(self):
self.config(highlightbackground="red")
def set_released(self):
self.config(highlightbackground="black")
def start(self, event):
self.game_status = 1
self.bill.is_active = True
self.bill.startingTime = datetime.datetime.now()
self.set_start_ui()
self.update() # update before delay
self.time_passed_in_sec.set(-1)
self.second_hit() # delay
def set_start_ui(self):
self.charge_label.config(fg="red")
self.number_of_players_option.forget()
self.start_button.pack_forget()
self.number_of_players = self.number_of_players_var.get()
self.game_status_text.set(str(self.bill.get_total_charge(self.game_type, self.number_of_players, self.time_passed_in_sec.get())) + " / " + str(self.number_of_players))
self.charge_label.pack()
self.pay_bill_button.pack_forget()
self.transact_game_slot_button.forget()
self.finish_button.pack(side=RIGHT)
#self.change_number_of_players_button.pack(side=LEFT)
self.add_extra_button.pack(side=LEFT)
# removed due to customer feedback
# def change_number_of_players(self, event):
# self.bill.add_game(self.game_type, self.number_of_players, self.time_passed_in_sec.get())
# self.number_of_players = self.number_of_players_var.get()
# self.time_passed_in_sec.set(0)
# self.game_status_text.set(str(self.bill.get_total_charge(self.game_type, self.number_of_players, self.time_passed_in_sec.get())) + " / " + str(self.number_of_players))
def finish(self, event):
self.bill.add_game(self.game_type, self.number_of_players, self.time_passed_in_sec.get())
self.game_status = 0
self.number_of_players = 0
self.set_finish_ui()
def set_finish_ui(self):
self.charge_label.config(fg="black")
self.finish_button.pack_forget()
self.pay_bill_button.pack(side=RIGHT)
#self.change_number_of_players_button.forget()
self.game_status_text.set(self.bill.total_charge)
self.add_extra_button.forget()
self.number_of_players_option.pack(side=LEFT)
self.after(1000, lambda: self.start_button.pack(side=LEFT))
self.add_extra_button.pack(side=LEFT)
self.transact_game_slot_button.pack(side=LEFT)
def pay_bill(self, event):
self.set_pay_bill_ui()
self.bill.endingTime = datetime.datetime.now()
db.saveBill(self.bill)
self.bill = Bill()
def set_pay_bill_ui(self):
self.pay_bill_button.pack_forget()
self.charge_label.pack_forget()
self.add_extra_button.forget()
self.transact_game_slot_button.forget()
def add_extra(self, event):
menu = Toplevel()
menu.title("Ekle...")
row = 0
column = 0
for extra in foods:
# PRINT(extra['name'] + " row: " + str(row) + " column: " + str(column))
food_button = Button(menu, text=extra['name']+": "+str(extra['charge']), anchor=W, font=("Helvetica", 8), command=lambda extra=extra:self.bill.add_extra(extra))
food_button.grid(row=row, column=column, sticky=W+E+S+N)
row += 1
if row > 20:
row = 0
column += 1
row = 0
column += 1
for extra in drinks:
drink_button = Button(menu, text=extra['name']+": "+str(extra['charge']), anchor=W, font=("Helvetica", 8), command=lambda extra=extra:self.bill.add_extra(extra))
drink_button.grid(row=row, column=column, sticky=W+E+S+N)
row += 1
if row > 20:
row = 0
column += 1
add_other_button = Button(menu, text="Başka...", fg="blue", font=("Helvetica", 12), command=self.add_other_popup)
add_other_button.grid(columnspan=2, sticky=W+E+S+N)
quit_button = Button(menu, text="Kapat", fg="red", font=("Helvetica", 12), command=menu.destroy)
quit_button.grid(columnspan=2, sticky=W+E+S+N)
def add_other_popup(self):
menu = Toplevel()
menu.title("Başka ekle...")
desc_label = Label(menu, text="Açıklama:")
charge_label = Label(menu, text="Ücret:")
description = StringVar()
desc_entry = Entry(menu, textvariable=description)
charge = DoubleVar()
charge_entry = Entry(menu, textvariable=charge)
desc_label.grid(row=0)
charge_label.grid(row=1)
desc_entry.grid(row=0, column=1)
charge_entry.grid(row=1, column=1)
submit_button = Button(menu, text="Ekle", command=lambda: self.bill.add_other(description.get(), float(charge.get())))
submit_button.grid(columnspan=2)
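The add_extra menu above lays its food and drink buttons out column by column, wrapping to a new column once a column holds 21 rows. That layout rule, sketched standalone:

```python
def grid_positions(n_items, max_row=20):
    """Fill a column top to bottom; wrap to the next column once the row
    index passes max_row (mirrors the foods loop in add_extra)."""
    positions = []
    row = column = 0
    for _ in range(n_items):
        positions.append((row, column))
        row += 1
        if row > max_row:
            row = 0
            column += 1
    return positions

pos = grid_positions(25)
print(pos[0], pos[20], pos[21])  # (0, 0) (20, 0) (0, 1)
```

The drinks loop then starts from a fresh column (`row = 0; column += 1`) so foods and drinks never share a column.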
|
A challenging sportive set in border country between England and Wales. An area known as the Marches, taken from the Medieval title Lord of the Marches, offers a ‘secret county’ of traffic-free roads.
For event organisers Kilotogo, Shropshire is a haven for the cyclist, a county that makes the perfect venue for the third of their four sportives being covered by Cycling Weekly, the fourth being The Tour of the Peak on Saturday, October 24. Even though this is their first event in the area, a long history of riding and training here meant they were sure of the possibilities it offered.
Bishop’s Castle with its modern sports centre is event HQ and the epicentre for accessing the area’s hills. The number and severity of its gradients is the main feature of Kilotogo’s Wild Edric Sportive. As a consequence most people were in the granny ring within 200 metres of the start and that was just the High Street!
As for the climb of Asterton bank, by this point it was reckoned that 80 per cent of the 150 riders took to walking. With at least four more 20 per cent climbs to follow and countless lesser ones, a lot of people found the route tougher than expected and everybody was unanimous in their opinion that this was the hardest sportive so far.
This is, of course, a double-edged sword. One of the appeals of sportives in the first place was that they were within the capabilities of non-racing cyclists and you could set your own tempo; now it seems we have come full circle and people are being deterred because events have become too hard.
‘Traffic-free roads’ is a phrase used by many promoters of sportives, but in the case of Geoff Saxon’s description it was justified. Trouble was, this being Shropshire, no one told the wildlife. A family of pheasants, sheep, rabbits, several dogs, and a cat, all decided they would make up for that deficit and do their best to dump me on the floor, even a man with a ladder tried to get in on the act.
Due to the heavy rain this part of the world has experienced recently, the added impediments of mud, rocks, gravel and forestry debris littered the lanes, accumulating at worst on tightening unsighted bends that had been preceded by frighteningly steep descents. All this meant one of the most interesting rides I’d had for a long while.
When the lack of hazards permitted, which usually coincided with the summit of some horrendously steep hill, and once my heart rate and vision allowed, the reward for all this seemingly hazardous violation of health and safety was the views of Shropshire. 'You can't eat scenery' as they say, but if that were the case you could dine out forever on a landscape such as this. Unfortunately in my case a little bit of indigestion worked its way in.
A combination of the mistaken belief that I could ride anything on racing gears and what the French call a 'jour sans' meant that the finishing line, hot shower and cup of tea, in that order, couldn't have come soon enough. Read the promotional leaflet carefully and definitely put this one in the diary for next year.
|
# Copyright 2021 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Util classes and functions."""
from absl import logging
import tensorflow as tf
# pylint: disable=g-direct-tensorflow-import
from tensorflow.python.training.tracking import tracking
class VolatileTrackable(tracking.AutoTrackable):
"""A util class to keep Trackables that might change instances."""
def __init__(self, **kwargs):
for k, v in kwargs.items():
setattr(self, k, v)
def reassign_trackable(self, **kwargs):
for k, v in kwargs.items():
delattr(self, k) # untrack this object
setattr(self, k, v) # track the new object
class CheckpointWithHooks(tf.train.Checkpoint):
"""Same as tf.train.Checkpoint but supports hooks.
In progressive training, use this class instead of tf.train.Checkpoint.
Since the network architecture changes during progressive training, we need to
prepare something (like switch to the correct architecture) before loading the
checkpoint. This class supports a hook that will be executed before checkpoint
loading.
"""
def __init__(self, before_load_hook, **kwargs):
self._before_load_hook = before_load_hook
super(CheckpointWithHooks, self).__init__(**kwargs)
# override
def read(self, save_path, options=None):
self._before_load_hook(save_path)
logging.info('Ran before_load_hook.')
super(CheckpointWithHooks, self).read(save_path=save_path, options=options)
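The before-load hook above runs before tf.train.Checkpoint's own read. The same pattern in plain Python, with a dict-backed Store standing in for the checkpoint (a sketch; Store is not part of the module above):

```python
class Store:
    """Stand-in for tf.train.Checkpoint: writes and reads key-value pairs."""
    def __init__(self):
        self._data = {}
    def write(self, key, value):
        self._data[key] = value
    def read(self, key):
        return self._data[key]

class StoreWithHooks(Store):
    """Run a caller-supplied hook before every read, as CheckpointWithHooks does."""
    def __init__(self, before_load_hook):
        self._before_load_hook = before_load_hook
        super().__init__()
    def read(self, key):
        self._before_load_hook(key)  # e.g. switch to the right architecture
        return super().read(key)

calls = []
store = StoreWithHooks(calls.append)
store.write('ckpt-1', 42)
print(store.read('ckpt-1'), calls)  # 42 ['ckpt-1']
```

In progressive training the hook would rebuild the network for the stage recorded at `save_path` before the restore touches any variables.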
|
This ride is a fully-supported event that covers the entire 140 mile length of Southeast Utah’s Kokopelli trail. Depending on fitness levels and group goals this ride can range from three to five days. The trip starts just outside of Moab, Utah at the Slick Rock parking lot, and ends in Loma, Colorado.
The spring or fall months offer the most ideal weather and trail conditions for this ride due to the potential for high temperatures and snow that can be encountered in the La Sal Mountains. Although you will start and end the ride at approximately 4,500 feet above sea level, the trail climbs and winds through various desert and high alpine environments, with a total of approximately 16,000 feet of total elevation gain.
© South Park Adventure Guides - All rights reserved.
|
"""Provides argument parser."""
# Copyright (c) 2014 - I.T. Dev Ltd
#
# This file is part of MCVirt.
#
# MCVirt is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 2 of the License, or
# (at your option) any later version.
#
# MCVirt is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with MCVirt. If not, see <http://www.gnu.org/licenses/>
import argparse
import os
from mcvirt.exceptions import (ArgumentParserException,
AuthenticationError)
from mcvirt.client.rpc import Connection
from mcvirt.system import System
from mcvirt.parser_modules.virtual_machine.start_parser import StartParser
from mcvirt.parser_modules.virtual_machine.stop_parser import StopParser
from mcvirt.parser_modules.virtual_machine.reset_parser import ResetParser
from mcvirt.parser_modules.virtual_machine.shutdown_parser import ShutdownParser
from mcvirt.parser_modules.virtual_machine.create_parser import CreateParser
from mcvirt.parser_modules.virtual_machine.delete_parser import DeleteParser
from mcvirt.parser_modules.clear_method_lock_parser import ClearMethodLockParser
from mcvirt.parser_modules.iso_parser import IsoParser
from mcvirt.parser_modules.virtual_machine.register_parser import RegisterParser
from mcvirt.parser_modules.virtual_machine.unregister_parser import UnregisterParser
from mcvirt.parser_modules.virtual_machine.update_parser import UpdateParser
from mcvirt.parser_modules.virtual_machine.migrate_parser import MigrateParser
from mcvirt.parser_modules.virtual_machine.info_parser import InfoParser
from mcvirt.parser_modules.permission_parser import PermissionParser
from mcvirt.parser_modules.network_parser import NetworkParser
from mcvirt.parser_modules.hard_drive_parser import HardDriveParser
from mcvirt.parser_modules.group_parser import GroupParser
from mcvirt.parser_modules.user_parser import UserParser
from mcvirt.parser_modules.virtual_machine.list_parser import ListParser
from mcvirt.parser_modules.virtual_machine.duplicate_parser import DuplicateParser
from mcvirt.parser_modules.virtual_machine.clone_parser import CloneParser
from mcvirt.parser_modules.virtual_machine.move_parser import MoveParser
from mcvirt.parser_modules.cluster_parser import ClusterParser
from mcvirt.parser_modules.storage_parser import StorageParser
from mcvirt.parser_modules.node_parser import NodeParser
from mcvirt.parser_modules.verify_parser import VerifyParser
from mcvirt.parser_modules.resync_parser import ResyncParser
from mcvirt.parser_modules.drbd_parser import DrbdParser
from mcvirt.parser_modules.virtual_machine.backup_parser import BackupParser
from mcvirt.parser_modules.virtual_machine.lock_parser import LockParser
from mcvirt.parser_modules.watchdog_parser import WatchdogParser
class ThrowingArgumentParser(argparse.ArgumentParser):
"""Override the ArgumentParser class, in order to change the handling of errors."""
def error(self, message):
"""Override the error function."""
# Force the argument parser to throw an MCVirt exception on error.
print '\nError: %s\n' % message
self.print_help()
raise ArgumentParserException(message)
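The same error-override pattern can be shown with only the standard library (ParserError below stands in for ArgumentParserException, which lives in mcvirt.exceptions):

```python
import argparse

class ParserError(Exception):
    pass

class ThrowingParser(argparse.ArgumentParser):
    def error(self, message):
        # argparse normally prints usage and calls sys.exit(2); raise instead
        raise ParserError(message)

parser = ThrowingParser(prog='demo')
parser.add_argument('--count', type=int, required=True)
try:
    parser.parse_args(['--count', 'abc'])
except ParserError as exc:
    print('caught:', exc)
```

Because error() no longer exits the process, an interactive shell (like the MCVirt shell below) can catch the exception and keep running.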
class Parser(object):
"""Provide an argument parser for MCVirt."""
AUTH_FILE = '.mcvirt-auth'
def __init__(self, verbose=True):
"""Configure the argument parser object."""
self.print_output = []
self.username = None
self.session_id = None
self.rpc = None
self.auth_cache_file = os.getenv('HOME') + '/' + self.AUTH_FILE
self.verbose = verbose
self.parent_parser = ThrowingArgumentParser(add_help=False)
self.global_option = self.parent_parser.add_argument_group('Global optional arguments')
self.global_option.add_argument('--username', '-U', dest='username',
help='MCVirt username')
self.global_option.add_argument('--password', dest='password',
help='MCVirt password')
self.global_option.add_argument('--cache-credentials', dest='cache_credentials',
action='store_true',
help=('Store the session ID, so it can be used for '
'multiple MCVirt calls.'))
self.global_option.add_argument('--ignore-failed-nodes', dest='ignore_failed_nodes',
help='Ignores nodes that are inaccessible',
action='store_true')
self.global_option.add_argument('--accept-failed-nodes-warning',
dest='accept_failed_nodes_warning',
help=argparse.SUPPRESS, action='store_true')
self.global_option.add_argument('--ignore-drbd', dest='ignore_drbd',
help='Ignores Drbd state', action='store_true')
argparser_description = "\nMCVirt - Managed Consistent Virtualisation\n\n" + \
'Manage the MCVirt host'
argparser_epilog = "\nFor more information, see http://mcvirt.itdev.co.uk\n"
# Create an argument parser object
self.parser = ThrowingArgumentParser(description=argparser_description,
epilog=argparser_epilog,
formatter_class=argparse.RawDescriptionHelpFormatter)
self.subparsers = self.parser.add_subparsers(dest='action', metavar='Action',
help='Action to perform')
# Add arguments for starting a VM
StartParser(self.subparsers, self.parent_parser)
# Add arguments for stopping a VM
StopParser(self.subparsers, self.parent_parser)
# Add arguments for resetting a VM
ResetParser(self.subparsers, self.parent_parser)
# Add arguments for shutting down a VM
ShutdownParser(self.subparsers, self.parent_parser)
# Add arguments for fixing deadlock on a vm
ClearMethodLockParser(self.subparsers, self.parent_parser)
# Add arguments for ISO functions
IsoParser(self.subparsers, self.parent_parser)
# Add arguments for managing users
UserParser(self.subparsers, self.parent_parser)
# Add arguments for creating a VM
CreateParser(self.subparsers, self.parent_parser)
# Get arguments for deleting a VM
DeleteParser(self.subparsers, self.parent_parser)
RegisterParser(self.subparsers, self.parent_parser)
UnregisterParser(self.subparsers, self.parent_parser)
# Get arguments for updating a VM
UpdateParser(self.subparsers, self.parent_parser)
PermissionParser(self.subparsers, self.parent_parser)
GroupParser(self.subparsers, self.parent_parser)
# Create subparser for network-related commands
NetworkParser(self.subparsers, self.parent_parser)
# Get arguments for getting VM information
InfoParser(self.subparsers, self.parent_parser)
# Get arguments for listing VMs
ListParser(self.subparsers, self.parent_parser)
# Get arguments for cloning a VM
CloneParser(self.subparsers, self.parent_parser)
# Get arguments for duplicating a VM
DuplicateParser(self.subparsers, self.parent_parser)
# Get arguments for migrating a VM
MigrateParser(self.subparsers, self.parent_parser)
# Create sub-parser for moving VMs
MoveParser(self.subparsers, self.parent_parser)
# Create sub-parser for cluster-related commands
ClusterParser(self.subparsers, self.parent_parser)
StorageParser(self.subparsers, self.parent_parser)
HardDriveParser(self.subparsers, self.parent_parser)
# Create subparser for commands relating to the local node configuration
NodeParser(self.subparsers, self.parent_parser)
# Create sub-parser for VM verification
VerifyParser(self.subparsers, self.parent_parser)
# Create sub-parser for VM Disk resync
ResyncParser(self.subparsers, self.parent_parser)
# Create sub-parser for Drbd-related commands
DrbdParser(self.subparsers, self.parent_parser)
# Create sub-parser for watchdog
WatchdogParser(self.subparsers, self.parent_parser)
# Create sub-parser for backup commands
BackupParser(self.subparsers, self.parent_parser)
# Create sub-parser for managing VM locks
LockParser(self.subparsers, self.parent_parser)
self.exit_parser = self.subparsers.add_parser('exit', help='Exits the MCVirt shell',
parents=[self.parent_parser])
def print_status(self, status):
"""Print if the user has specified that the parser should print statuses."""
if self.verbose:
print status
else:
self.print_output.append(status)
def check_ignore_failed(self, args):
"""Check ignore failed."""
if args.ignore_failed_nodes:
# If the user has specified to ignore the cluster,
# print a warning and confirm the user's answer
if not args.accept_failed_nodes_warning:
self.print_status(('WARNING: Running MCVirt with --ignore-failed-nodes'
' can leave the cluster in an inconsistent state!'))
continue_answer = System.getUserInput('Would you like to continue? (Y/n): ')
if continue_answer.strip() != 'Y':
self.print_status('Cancelled...')
return
return True
return False
def authenticate_saved_session(self, ignore_cluster):
"""Attempt to authenticate using saved session."""
# Try logging in with saved session
auth_session = None
try:
with open(self.auth_cache_file, 'r') as cache_fh:
auth_username = cache_fh.readline().strip()
auth_session = cache_fh.readline().strip()
except IOError:
pass
if auth_session:
try:
self.rpc = Connection(username=auth_username, session_id=auth_session,
ignore_cluster=ignore_cluster)
self.session_id = self.rpc.session_id
self.username = self.rpc.username
except AuthenticationError:
# If authentication fails with the cached session,
# print an error, attempt to remove the session file and
# remove the rpc connection
self.print_status('Authentication error occurred when using saved session.')
try:
os.remove(self.auth_cache_file)
except OSError:
pass
self.rpc = None
def authenticate_username_password(self, args, ignore_cluster):
"""Authenticate using username and password."""
# Check if user/password have been passed. Else, ask for them.
username = args.username if args.username else System.getUserInput(
'Username: '
).rstrip()
if args.password:
password = args.password
else:
password = System.getUserInput(
'Password: ', password=True
).rstrip()
self.rpc = Connection(username=username, password=password,
ignore_cluster=ignore_cluster)
self.session_id = self.rpc.session_id
self.username = self.rpc.username
def store_cached_session(self, args):
"""Store session details in temporary file"""
# If successfully authenticated then store session ID and username in auth file
if args.cache_credentials:
try:
with open(self.auth_cache_file, 'w') as cache_fh:
cache_fh.write("%s\n%s" % (self.rpc.username, self.rpc.session_id))
except OSError:
pass
def parse_arguments(self, script_args=None):
"""Parse arguments and performs actions based on the arguments."""
# If arguments have been specified, split, so that
# an array is sent to the argument parser
if script_args is not None:
script_args = script_args.split()
args = self.parser.parse_args(script_args)
ignore_cluster = self.check_ignore_failed(args)
if self.session_id and self.username:
self.rpc = Connection(username=self.username, session_id=self.session_id,
ignore_cluster=ignore_cluster)
else:
# Obtain connection to Pyro server
if not (args.password or args.username):
self.authenticate_saved_session(ignore_cluster)
if not self.rpc:
self.authenticate_username_password(args, ignore_cluster)
self.store_cached_session(args)
if args.ignore_drbd:
self.rpc.ignore_drbd()
# If a custom parser function has been defined, used this and exit
# instead of running through (old) main parser workflow
if hasattr(args, 'func'):
args.func(args=args, p_=self)
else:
raise ArgumentParserException('No handler registered for parser')
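The `args.func` dispatch above works because each sub-parser registers a handler, typically via argparse's `set_defaults`. A minimal, self-contained sketch of that pattern (the `list` command and handler are illustrative, not MCVirt's actual API):

```python
import argparse

def handle_list(args):
    # Each handler receives the parsed namespace, mirroring args.func(args=args, ...)
    return 'listing vms (verbose=%s)' % args.verbose

parser = argparse.ArgumentParser(prog='demo')
subparsers = parser.add_subparsers(dest='command')
list_parser = subparsers.add_parser('list', help='List VMs')
list_parser.add_argument('--verbose', action='store_true')
list_parser.set_defaults(func=handle_list)

args = parser.parse_args(['list', '--verbose'])
if hasattr(args, 'func'):
    result = args.func(args)
else:
    raise SystemExit('No handler registered for parser')
print(result)
```

This keeps the top-level parser ignorant of individual commands: adding a new sub-command only requires a new parser class that calls `set_defaults` with its own handler.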
|
G21.5-0.9 is a particularly nice example of a composite supernova remnant, shown in the left panel of a Chandra X-ray image. The spherical supernova shell glows faintly in X-rays; the center of the remnant is filled by a pulsar wind nebula of highly relativistic electrons with strong X-ray emission. From Gaensler and Slane (2006).
The standard candle of VHE astronomy, the Crab Nebula, has served for decades as a yardstick at almost all wavelengths, and yet it is a very peculiar object, harboring the most energetic and one of the youngest pulsars of our Galaxy. In the VHE domain, H.E.S.S. has revealed more than a dozen pulsar wind nebulae (PWN), either firmly established as such or compelling candidates (e.g. SOM 9/05, SOM 4/06, SOM 5/06, SOM 6/07). Almost all of these are middle-aged (from a few kyr up to 100 kyr, except MSH 15-52) and exhibit an extended VHE nebula with an offset between the pulsar position and the nebula center, quite different from the Crab Nebula, which is point-like for gamma-ray instruments. The remnant G21.5-0.9 (Wilson and Weiler 1976) was previously classified as one of the roughly ten Crab-like SNRs, and was predicted as early as 1995 to be a gamma-ray source with a hard spectrum and a flux of a few % of the Crab flux (de Jager et al.). Chandra images (Safi-Harb et al., 2001; Bocchino et al., 2005) beautifully reveal the composite nature of the remnant: a spherical shell of 4' diameter tracing the supernova blast wave, and a pulsar wind nebula about 1' across of high-energy electrons accelerated by a pulsar created in the explosion (top figure and Fig. 1). Finally, in 2005, the 61.8 ms pulsar PSR J1833-1034, with a spin-down power of Ė = 3.3 × 10^37 erg/s and a characteristic age of 4.9 kyr, was discovered through its faint pulsed radio emission (Gupta et al., 2005; Camilo et al., 2006). PSR J1833-1034 in G21.5-0.9 is the second most energetic pulsar known in the Galaxy.
Data obtained in the H.E.S.S. Galactic Plane Survey (SOM 12/07) now appear to confirm G21.5-0.9 as a VHE gamma-ray source and a companion of the Crab Nebula. At the 2007 ICRC in Merida, evidence for the source HESS J1833-105 was presented (Djannati-Atai et al., 2007), detected at a level of more than 6 standard deviations in about 20 h of observations (Fig. 2). The location of the source is determined with a precision of better than 1' and coincides with the position of PSR J1833-1034 in G21.5-0.9. On the scale of the H.E.S.S. angular resolution of 4', the source appears point-like. The gamma-ray flux, measured between 200 GeV and 5 TeV, corresponds to about 2% of that of the Crab Nebula, close to the predicted value. Why is G21.5-0.9, like the Crab Nebula, much more compact than most other PWN seen by H.E.S.S.? One reason is obviously their relatively young age, which has given the nebulae less time to expand. Most likely more important is the high luminosity of the pulsar, which generates relatively high magnetic fields in the nebula, resulting in strong synchrotron radiation losses and hence short lifetimes - and short range - of the accelerated electrons. G21.5-0.9 and the Crab Nebula are therefore rather inefficient gamma-ray sources, in the sense that synchrotron radiation losses dominate over inverse-Compton production of high-energy gamma rays: in the Crab Nebula the synchrotron luminosity is about 100 times the gamma-ray luminosity; in G21.5-0.9 it is higher by a factor of 30.
"New Companions for the lonely Crab? VHE emission from young pulsar wind nebulae revealed by H.E.S.S."
Fig. 1: Zoomed Chandra image of the somewhat irregular pulsar wind nebula in G21.5-0.9. From Camilo et al. (2006).
Fig. 2: Smoothed VHE gamma-ray map of the region around G21.5-0.9. Contour lines show significance contours for 4, 5 and 6 standard deviations. The location of the pulsar PSR J1833-1034 is indicated by the black triangle, the centroid of the VHE gamma-ray source by the black error bars. Note the difference in scale compared to the upper images; the source size is consistent with the angular resolution of the telescopes of about 4'. Preliminary.
|
# -*- coding: utf-8 -*-
# This file is a part of MediaDrop (http://www.mediadrop.net),
# Copyright 2009-2015 MediaDrop contributors
# For the exact contribution history, see the git revision log.
# The source code in this file is dual licensed under the MIT license or
# the GPLv3 or (at your option) any later version.
# See LICENSE.txt in the main project directory, for more information.
from decimal import Decimal
import simplejson
from simplejson.encoder import JSONEncoderForHTML
from sqlalchemy.orm.properties import NoneType
__all__ = ['InlineJS', 'Script', 'Scripts']
class Script(object):
def __init__(self, url, async=False, key=None):
self.url = url
self.async = async
self.key = key
def render(self):
async = self.async and ' async="async"' or ''
return '<script src="%s"%s type="text/javascript"></script>' % (self.url, async)
def __unicode__(self):
return self.render()
def __repr__(self):
return 'Script(%r, async=%r, key=%r)' % (self.url, self.async, self.key)
def __eq__(self, other):
# Note that two Script instances are considered equal when they
# point to the same URL; the async attribute is deliberately not
# checked, so the same source is never included twice.
if not hasattr(other, 'url'):
return False
return self.url == other.url
def __ne__(self, other):
return not (self == other)
class InlineJS(object):
def __init__(self, code, key=None, params=None):
self.code = code
self.key = key
self.params = params
def as_safe_json(self, s):
return simplejson.dumps(s, cls=JSONEncoderForHTML)
def _escaped_parameters(self, params):
escaped_params = dict()
for key, value in params.items():
if isinstance(value, (bool, NoneType)):
# this condition must come first because "1 == True" in Python
# but "1 !== true" in JavaScript and the "int" check below
# would pass True unmodified
escaped_params[key] = self.as_safe_json(value)
elif isinstance(value, (int, long, float)):
# use these numeric values directly as format string
# parameters - they are mapped to JS types perfectly and don't
# need any escaping.
escaped_params[key] = value
elif isinstance(value, (basestring, dict, tuple, list, Decimal)):
escaped_params[key] = self.as_safe_json(value)
else:
klassname = value.__class__.__name__
raise ValueError('unknown type %s' % klassname)
return escaped_params
def render(self):
js = self.code
if self.params is not None:
js = self.code % self._escaped_parameters(self.params)
return '<script type="text/javascript">%s</script>' % js
def __unicode__(self):
return self.render()
def __repr__(self):
return 'InlineJS(%r, key=%r)' % (self.code, self.key)
def __eq__(self, other):
# extremely simple equality check: two InlineJS instances are equal if
# the code is exactly the same! No trimming of whitespaces or any other
# analysis is done.
if not hasattr(other, 'render'):
return False
return self.render() == other.render()
def __ne__(self, other):
return not (self == other)
class SearchResult(object):
def __init__(self, item, index):
self.item = item
self.index = index
class ResourcesCollection(object):
def __init__(self, *args):
self._resources = list(args)
def replace_resource_with_key(self, new_resource):
result = self._find_resource_with_key(new_resource.key)
if result is None:
raise AssertionError('No script with key %r' % new_resource.key)
self._resources[result.index] = new_resource
def render(self):
markup = u''
for resource in self._resources:
markup = markup + resource.render()
return markup
def __len__(self):
return len(self._resources)
# --- internal api ---------------------------------------------------------
def _get(self, resource):
result = self._find_resource(resource)
if result is not None:
return result
raise AssertionError('Resource %r not found' % resource)
def _get_by_key(self, key):
result = self._find_resource_with_key(key)
if result is not None:
return result
raise AssertionError('No script with key %r' % key)
def _find_resource(self, a_resource):
for i, resource in enumerate(self._resources):
if resource == a_resource:
return SearchResult(resource, i)
return None
def _find_resource_with_key(self, key):
for i, resource in enumerate(self._resources):
if resource.key == key:
return SearchResult(resource, i)
return None
class Scripts(ResourcesCollection):
def add(self, script):
if script in self._resources:
if not hasattr(script, 'async'):
return
# in case the same script is added twice and only one should be
# loaded asynchronously, use the non-async variant to be on the safe
# side
older_script = self._get(script).item
older_script.async = older_script.async and script.async
return
self._resources.append(script)
def add_all(self, *scripts):
for script in scripts:
self.add(script)
# --- some interface polishing ---------------------------------------------
@property
def scripts(self):
return self._resources
def replace_script_with_key(self, script):
self.replace_resource_with_key(script)
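To see these collection semantics in action (equality by URL, merging of the async flag, replacement by key), here is a simplified, self-contained sketch. It is not MediaDrop's actual implementation, and it renames the `async` attribute to `async_` so it also runs on Python 3, where `async` is a keyword:

```python
class Script:
    def __init__(self, url, async_=False, key=None):
        self.url = url
        self.async_ = async_
        self.key = key

    def __eq__(self, other):
        # Equality by URL only, mirroring the dedup rule above.
        return getattr(other, 'url', None) == self.url

class Scripts:
    def __init__(self, *scripts):
        self._resources = list(scripts)

    def add(self, script):
        if script in self._resources:
            existing = self._resources[self._resources.index(script)]
            # If either copy is non-async, keep the safe (blocking) variant.
            existing.async_ = existing.async_ and script.async_
            return
        self._resources.append(script)

    def replace_script_with_key(self, new_script):
        for i, resource in enumerate(self._resources):
            if resource.key == new_script.key:
                self._resources[i] = new_script
                return
        raise AssertionError('No script with key %r' % new_script.key)

scripts = Scripts()
scripts.add(Script('/a.js', async_=True))
scripts.add(Script('/a.js', async_=False))   # duplicate URL: collapsed, async dropped
scripts.add(Script('/b.js', key='player'))
scripts.replace_script_with_key(Script('/b2.js', key='player'))
```

The key-based replacement is what lets a theme or plugin swap in its own variant of a registered script without disturbing the load order.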
|
On Tuesday night (starting at 12:01 am Wednesday, July 29th) we will be upgrading over 7,600 Thomson/Technicolor DCM475 internet cable modems. The v2.51 firmware being deployed fixes an issue with the current firmware which would cause the device to become non-functional as we begin to upgrade our network to 24 downstream channels, starting in August. This new firmware has been thoroughly tested and addresses this critical issue.
We have been made aware of an issue that may occur if the modem goes offline due to an interruption in RF signal. In the lab environment, we have seen that the modem will reconnect by itself after an indeterminate amount of time (between 2 and 8 hours). Technicolor has acknowledged that this is a known issue, but does not currently have a fix in place. Shaw Engineering has confirmed that there is a workaround: a hard power cycle will cause the modem to reconnect almost immediately after boot-up. At this time, we would ask that you work with your customers and recommend this workaround should an incident occur.
Our teams will continue to work with Technicolor to resolve this issue; we would also ask your departments to work with Technicolor in finding a permanent fix.
We will continue to provide updates as the situation unfolds.
|
#!/usr/bin/env python
#
# gen-javahl-errors.py: Generate a Java class containing an enum for the
# C error codes
#
# ====================================================================
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
# ====================================================================
#
import sys, os
try:
from svn import core
except ImportError as e:
sys.stderr.write("ERROR: Unable to import Subversion's Python bindings: '%s'\n" \
"Hint: Set your PYTHONPATH environment variable, or adjust your " \
"PYTHONSTARTUP\nfile to point to your Subversion install " \
"location's svn-python directory.\n" % e)
sys.stderr.flush()
sys.exit(1)
def get_errors():
errs = {}
for key in vars(core):
if key.find('SVN_ERR_') == 0:
try:
val = int(vars(core)[key])
errs[val] = key
except (TypeError, ValueError):
# skip members that are not plain integer error codes
pass
return errs
def gen_javahl_class(error_codes, output_filename):
jfile = open(output_filename, 'w')
jfile.write(
"""/** ErrorCodes.java - This file is autogenerated by gen-javahl-errors.py
*/
package org.tigris.subversion.javahl;
/**
* Provide mappings from error codes generated by the C runtime to meaningful
* Java values. For a better description of each error, please see
* svn_error_codes.h in the C source.
*/
public class ErrorCodes
{
""")
keys = sorted(error_codes.keys())
for key in keys:
# Format the code name to be more Java-esque
code_name = error_codes[key][8:].replace('_', ' ').title().replace(' ', '')
code_name = code_name[0].lower() + code_name[1:]
jfile.write(" public static final int %s = %d;\n" % (code_name, key))
jfile.write("}\n")
jfile.close()
if __name__ == "__main__":
if len(sys.argv) > 1:
output_filename = sys.argv[1]
else:
output_filename = os.path.join('..', '..', 'subversion', 'bindings',
'javahl', 'src', 'org', 'tigris',
'subversion', 'javahl', 'ErrorCodes.java')
gen_javahl_class(get_errors(), output_filename)
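The `code_name` transformation above maps C-style constants to Java-style camelCase identifiers. A standalone check of that string logic:

```python
def to_java_name(c_name):
    # Strip the 'SVN_ERR_' prefix, title-case the words, then lower the first letter.
    name = c_name[8:].replace('_', ' ').title().replace(' ', '')
    return name[0].lower() + name[1:]

print(to_java_name('SVN_ERR_BAD_FILENAME'))   # badFilename
print(to_java_name('SVN_ERR_FS_NOT_FOUND'))   # fsNotFound
```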
|
The Robert E. Russ Award recognizes one individual for outstanding contributions to the Ruston-Lincoln business community or overall economic development of the area, in honor of the founder of Ruston, Louisiana, Robert E. Russ. This award is one of the highest honors given by the Ruston-Lincoln Chamber of Commerce (RLCOC) to an individual. Nominations will be accepted through December 3, 2018. The award will be presented at the Chamber’s 100th Annual Banquet on the evening of Tuesday, January 21, 2020.
This award will be open to any resident of Lincoln Parish, male or female. It is not required that this person be a member of the RLCOC.
The nominee cannot be a paid professional leader, an elected official who currently holds office, or a current member of the RLCOC Board of Directors.
The nominee must have made several contributions to the betterment of the business community.
An individual can be nominated in multiple and successive years but can only be honored by the award one time.
It is the selection committee’s responsibility to review the nominations and select a recipient based on the nominee’s outstanding personal contributions toward the business and/or economic advancement or development of the region during the year. However, the committee may also take into account the individual’s activities over the last three years.
It is not required that the Russ Award be given every year.
For more information contact Ivana Flowers at the Chamber, 255-2031 or email iflowers@rustonlincoln.org.
2007 John F. Emory, Sr.
1989 William A. Marbury, Jr.
1988 Clarence E. Faulk, Jr.
|
# ===============================================================================
# Copyright 2015 Jake Ross
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ===============================================================================
# ============= enthought library imports =======================
from traits.api import HasTraits, Str, Int, List
# ============= standard library imports ========================
# ============= local library imports ==========================
class LayoutItem(HasTraits):
    row = Int
    column = Int
    kind = Str
    identifier = Str
class FigureLayout(HasTraits):
    rows = Int(1)
    columns = Int(2)
    fixed = Str('cols')
    # add_item appends to this trait; it was previously undeclared
    items = List(LayoutItem)
    def add_item(self, kind):
        self.items.append(LayoutItem(kind=kind))
# ============= EOF =============================================
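`FigureLayout` fixes one grid dimension (`'cols'` by default) and lets items fill in. A tiny illustrative helper (not part of pychron) showing how row/column positions could be derived for n items with a fixed column count:

```python
def grid_positions(n_items, columns):
    """Assign (row, column) positions filling a fixed number of columns."""
    return [(i // columns, i % columns) for i in range(n_items)]

print(grid_positions(5, 2))  # [(0, 0), (0, 1), (1, 0), (1, 1), (2, 0)]
```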
|
You’ve loaded your best product content in Wetu, bumped up your Content Rating, and you regularly update and refresh it. Fabulous! 🎉 Now Tour Operators, DMCs and Travel Agents can use it in their itineraries. Ensure your content travels even further by linking your iBrochure and/or Digital Catalogue in your email signature. It’s that easy!
Simply copy the hyperlink(s) and attach them to your signature to supplement your marketing efforts.⛓ Here’s a reminder of how to create that hyperlink. Contact support@wetu.com if you need help.
|
import sys
sys.path.append('/home/jwalker/dynamics/python/atmos-tools')
sys.path.append('/home/jwalker/dynamics/python/atmos-read')
import numpy as np
import xarray as xray
import pandas as pd
import matplotlib.pyplot as plt
import atmos as atm
import merra
from indices import onset_HOWI, summarize_indices, plot_index_years
# ----------------------------------------------------------------------
# Compute HOWI indices (Webster and Fasullo 2003)
datadir = atm.homedir() + 'datastore/merra/daily/'
datafile = datadir + 'merra_vimt_ps-300mb_apr-sep_1979-2014.nc'
lat1, lat2 = -20, 30
lon1, lon2 = 40, 100
with xray.open_dataset(datafile) as ds:
uq_int = ds['uq_int'].load()
vq_int = ds['vq_int'].load()
npts = 100
#npts = 50
pre_days = 'May 18-24'
post_days = 'June 8-14'
namestr = 'HOWI_%dpts_' % npts
exts = ['png', 'eps']
isave = True
howi, ds = onset_HOWI(uq_int, vq_int, npts)
# ----------------------------------------------------------------------
# MAPS
# ----------------------------------------------------------------------
# Plot climatological VIMT composites
lat = atm.get_coord(ds, 'lat')
lon = atm.get_coord(ds, 'lon')
x, y = np.meshgrid(lon, lat)
axlims = (lat1, lat2, lon1, lon2)
plt.figure(figsize=(12,10))
plt.subplot(221)
m = atm.init_latlon(lat1, lat2, lon1, lon2)
m.quiver(x, y, ds['uq_bar_pre'], ds['vq_bar_pre'])
plt.title(pre_days + ' VIMT Climatology')
plt.subplot(223)
m = atm.init_latlon(lat1, lat2, lon1, lon2)
m.quiver(x, y, ds['uq_bar_post'], ds['vq_bar_post'])
plt.title(post_days + ' VIMT Climatology')
# Plot difference between pre- and post- composites
plt.subplot(222)
m = atm.init_latlon(lat1, lat2, lon1, lon2)
#m, _ = atm.pcolor_latlon(ds['vimt_bar_diff'], axlims=axlims, cmap='hot_r')
m.quiver(x, y, ds['uq_bar_diff'], ds['vq_bar_diff'])
plt.title(post_days + ' minus ' + pre_days + ' VIMT Climatology')
# Top N difference vectors
plt.subplot(224)
m, _ = atm.pcolor_latlon(ds['vimt_bar_diff_masked'],axlims=axlims, cmap='hot_r')
plt.title('Magnitude of Top %d Difference Fluxes' % npts)
# Plot vector VIMT fluxes for a few individual years
ylist = [0, 1, 2, 3]
plt.figure(figsize=(12, 10))
for yr in ylist:
plt.subplot(2, 2, yr + 1)
m = atm.init_latlon(lat1, lat2, lon1, lon2)
m.quiver(x, y, ds['uq'][yr].mean(dim='day'), ds['vq'][yr].mean(dim='day'))
m.contour(x, y, ds['mask'].astype(float), [0.99], colors='red')
plt.title('%d May-Sep VIMT Fluxes' % ds.year[yr])
# ----------------------------------------------------------------------
# TIMESERIES
# ----------------------------------------------------------------------
onset = howi.onset
retreat = howi.retreat
length = retreat - onset
days = howi.day
years = howi.year.values
yearstr = '%d-%d' % (years[0], years[-1])
nroll = howi.attrs['nroll']
# Timeseries with and without rolling mean
def index_tseries(days, ind, ind_roll, titlestr):
plt.plot(days, ind, label='daily')
plt.plot(days, ind_roll, label='%d-day rolling' % nroll)
plt.grid()
plt.legend(loc='lower right')
plt.title(titlestr)
plt.figure(figsize=(12, 10))
plt.subplot(221)
index_tseries(days, ds.howi_clim_norm, ds.howi_clim_norm_roll,
'HOWI ' + yearstr + ' Climatology')
for yr in [0, 1, 2]:
plt.subplot(2, 2, yr + 2)
index_tseries(days, ds.howi_norm[yr], ds.howi_norm_roll[yr],
'HOWI %d' % years[yr])
# ----------------------------------------------------------------------
# Onset and retreat indices
summarize_indices(years, onset, retreat, 'HOWI')
# ----------------------------------------------------------------------
# Plot timeseries of each year
plot_index_years(howi)
# ----------------------------------------------------------------------
# Save figures
if isave:
for ext in exts:
atm.savefigs(namestr, ext)
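The HOWI index above is smoothed with an `nroll`-day rolling mean before plotting and onset detection. A minimal standalone sketch of that smoothing using pandas (the toy series and 3-day window here are illustrative):

```python
import numpy as np
import pandas as pd

days = np.arange(1, 11)
index = pd.Series([0, 1, 0, 2, 1, 3, 2, 4, 3, 5], index=days, dtype=float)
nroll = 3
# Centered rolling mean; endpoints without a full window come out as NaN
index_roll = index.rolling(nroll, center=True).mean()
print(index_roll.values)
```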
|
A family wins a high-tech computerized house. 13-year-old Ben (Ryan Merriman) keeps the family well-organized and on track so his widower father won't start dating other women ... he thinks they don't need a new mother to replace their late mother. Ben is always entering contests...and enters one in which the winner receives a brand-new house that is totally computerized--the computer runs everything.
Jessica Steen plays the woman who created the computerized house. When Ben's father starts dating her, Ben gets upset because he doesn't want a new mom.
Then the computerized P.A.T. (Katey Sagal) turns into a hologram and decides to protect her family from the nasty outside world (after monitoring all-news TV networks) by locking them inside FOREVER.
Ben: Take it easy, Pat.
Pat: Easy? You think I wouldn't want to take it easy? But NO, I'm too busy keeping up with the Coopers. Slaving away in a hot control room, making your lives perfect. And you think you can bring another woman into my domain? I don't think so, buster.
Nick: We can't stay here forever, PAT.
Pat: Why not? Haven't I given you everything you've needed?
Ben: We need fresh air and exercise.
Pat: But I can give you synthetic air, and virtual exercise.
Pat: Haven't you heard of home schooling?
Ben: Yes, but not where the home is the one doing the schooling.
Nick: What about friends? Kids can't survive without friends.
Pat: We can be each other's best friends.
* When PAT analyzes the Coopers' DNA, she is able to tell their ages and whether they have any injuries. Neither piece of information can actually be determined from DNA.
* Katey Sagal's father, Boris Sagal, was a famous movie director who met his end while filming the 1982 four-hour TV movie "World War III", when he walked into a helicopter blade on location and was fatally injured. Her current TV series is "8 Simple Rules for Dating My Teenage Daughter", which starred John Ritter until his unexpected death from a heart attack.
|
# -*- coding: utf-8 -*-
"""
Created on Tue Jul 04 09:21:54 2017
@author: Enrique Alejandro
Description: this library contains the core algorithms for numerical simulations
You need to have installed:
numba --> this can be easily installed if you have the anaconda distribution via pip: pip install numba
"""
from __future__ import division, print_function, absolute_import, unicode_literals
import numpy as np
from numba import jit
import sys
sys.path.append('d:\github\pycroscopy')
from pycroscopy.simulation.afm_calculations import amp_phase, e_diss, v_ts
def verlet(zb, Fo1, Fo2, Fo3, Q1, Q2, Q3, k_L1, k_L2, k_L3, time, z1, z2,z3, v1,v2,v3, z1_old, z2_old, z3_old, Fts, dt,
fo1, fo2, fo3, f1, f2, f3):
"""This function performs verlet algorithm (central difference) for numerical integration.
It integrates the differential equations of three harmonic oscillator equations (each corresponding to a distinct
cantilever eigenmode)
This function does not assume ideal Euler-Bernoulli scaling but instead the cantilever parameters are passed to the
function
The dynamics of the cantilever are assumed to be contained in the first three flexural modes
This function will be called each simulation timestep by a main wrap around function which will contain the specific
contact-mechanics model.
Parameters:
----------
zb : float
z equilibrium position (average tip position with respect to the sample)
Fo1 : float
amplitude of the sinusoidal excitation force term (driving force) for the first eigenmode
Fo2 : float
amplitude of the sinusoidal excitation force term (driving force) for the second eigenmode
Fo3 : float
amplitude of the sinusoidal excitation force term (driving force) for the third eigenmode
Q1 : float
first eigenmode's quality factor
Q2 : float
second eigenmode's quality factor
Q3 : float
third eigenmode's quality factor
k_L1 : float
1st eigenmode's stiffness
k_L2 : float
2nd eigenmode's stiffness
k_L3 : float
3rd eigenmode's stiffness
z1 : float
instant 1st eigenmode deflection contribution
z2 : float
instant 2nd eigenmode deflection contribution
z3 : float
instant 3rd eigenmode deflection contribution
v1 : float
instant 1st eigenmode velocity
v2 : float
instant 2nd eigenmode velocity
v3 : float
instant 3rd eigenmode velocity
z1_old : float
instant 1st eigenmode deflection contribution corresponding to previous timestep
z2_old : float
instant 2nd eigenmode deflection contribution corresponding to previous timestep
z3_old : float
instant 3rd eigenmode deflection contribution corresponding to previous timestep
Fts : float
tip-sample interacting force
dt : float
simulation timestep
fo1 : float
1st eigenmode resonance frequency
fo2 : float
2nd eigenmode resonance frequency
fo3 : float
3rd eigenmode resonance frequency
f1 : float
1st sinusoidal excitation frequency
f2 : float
2nd sinusoidal excitation frequency
f3 : float
3rd sinusoidal excitation frequency
Returns:
-------
tip: float
instant tip position for new simulation timestep
z1 : float
instant 1st eigenmode deflection contribution for new simulation timestep
z2 : float
instant 2nd eigenmode deflection contribution for new simulation timestep
z3 : float
instant 3rd eigenmode deflection contribution for new simulation timestep
v1 : float
instant 1st eigenmode velocity for new simulation timestep
v2 : float
instant 2nd eigenmode velocity for new simulation timestep
v3 : float
instant 3rd eigenmode velocity for new simulation timestep
z1_old : float
instant 1st eigenmode deflection contribution corresponding to current timestep
z2_old : float
instant 2nd eigenmode deflection contribution corresponding to current timestep
z3_old : float
instant 3rd eigenmode deflection contribution corresponding to current timestep
"""
# TODO: Simplify inputs and outputs for this function. Consider wrapping up parameters for each eigenmode into an object or use lists for all k, Q, fo, etc.
a1 = ( -z1 - v1/(Q1*(fo1*2*np.pi)) + ( Fo1*np.cos((f1*2*np.pi)*time) + Fo2*np.cos((f2*2*np.pi)*time) + Fo3*np.cos((f3*2*np.pi)*time) + Fts)/k_L1 )* (fo1*2.0*np.pi)**2
a2 = ( -z2 - v2/(Q2*(fo2*2*np.pi)) + ( Fo1*np.cos((f1*2*np.pi)*time) + Fo2*np.cos((f2*2*np.pi)*time) + Fo3*np.cos((f3*2*np.pi)*time) + Fts)/k_L2 )* (fo2*2.0*np.pi)**2
a3 = ( -z3 - v3/(Q3*(fo3*2*np.pi)) + ( Fo1*np.cos((f1*2*np.pi)*time) + Fo2*np.cos((f2*2*np.pi)*time) + Fo3*np.cos((f3*2*np.pi)*time) + Fts)/k_L3 )* (fo3*2.0*np.pi)**2
# Verlet algorithm (central difference) to calculate position of the tip
z1_new = 2*z1 - z1_old + a1*pow(dt, 2)
z2_new = 2*z2 - z2_old + a2*pow(dt, 2)
z3_new = 2*z3 - z3_old + a3*pow(dt, 2)
# central difference to calculate velocities
v1 = (z1_new - z1_old)/(2*dt)
v2 = (z2_new - z2_old)/(2*dt)
v3 = (z3_new - z3_old)/(2*dt)
# Updating z1_old and z1 for the next run
z1_old = z1
z1 = z1_new
z2_old = z2
z2 = z2_new
z3_old = z3
z3 = z3_new
tip = z1 + z2 + z3 + zb
return tip, z1, z2, z3, v1, v2, v3, z1_old, z2_old, z3_old
numba_verlet = jit()(verlet)  # jit-compiled version; applying jit outside the definition keeps the plain Python verlet available while numba_verlet is the accelerated one to call
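As a sanity check, the same central-difference update can be applied to a single free harmonic oscillator, whose exact solution is a cosine; after integrating one full period the scheme should return to the starting amplitude. This reduced sketch (one mode, no damping, no driving or tip-sample force) is for illustration only:

```python
import numpy as np

def sho_verlet(fo, dt, n_steps):
    """Integrate z'' = -(2*pi*fo)**2 * z with the central-difference (Verlet) update."""
    w = 2.0 * np.pi * fo
    z, v = 1.0, 0.0                                   # unit amplitude, at rest
    z_old = z - v * dt + 0.5 * (-w**2 * z) * dt**2    # second-order startup step
    for _ in range(n_steps):
        a = -w**2 * z
        z_new = 2.0 * z - z_old + a * dt**2           # same update as in verlet above
        z_old, z = z, z_new
    return z

fo = 50e3                    # 50 kHz, a plausible first eigenmode frequency
dt = 1.0 / (fo * 1000)       # 1000 steps per oscillation period
z_end = sho_verlet(fo, dt, 1000)   # integrate exactly one period
print(z_end)                 # close to 1.0: the tip returns to its starting amplitude
```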
def gen_maxwell_lr(G, tau, R, dt, startprint, simultime, fo1, fo2, fo3, k_m1, k_m2, k_m3, A1, A2, A3, zb, printstep = 1, Ge = 0.0, Q1=100, Q2=200, Q3=300, H=2.0e-19):
"""This function is designed for multifrequency simulation performed over a Generalized Maxwell (Wiechert) viscoelastic surface.
The contact mechanics are performed over the framework of Lee and Radok (Lee, E. Ho, and Jens Rainer Maria Radok. "The contact problem for viscoelastic bodies." Journal of Applied Mechanics 27.3 (1960): 438-444.)
The cantilever dynamics are assumed to be contained in the first three eigenmodes.
The numerical integration is performed with the aid of the verlet function(defined above)
Parameters:
----------
G : numpy.ndarray
moduli of the springs in the Maxwell arms of a generalized Maxwell model (also called Wiechert model)
tau: numpy.ndarray
relaxation times of the Maxwell arms
R : float
tip radius
dt : float
simulation timestep
fo1 : float
1st eigenmode resonance frequency
fo2 : float
2nd eigenmode resonance frequency
fo3 : float
3rd eigenmode resonance frequency
k_m1 : float
1st eigenmode's stiffness
k_m2 : float
2nd eigenmode's stiffness
k_m3 : float
3rd eigenmode's stiffness
A1 : float
target oscillating amplitude of 1st cantilever eigenmode
A2 : float
target oscillating amplitude of 2nd cantilever eigenmode
A3 : float
target oscillating amplitude of 3rd cantilever eigenmode
zb : float
cantilever equilibrium position (average tip-sample distance)
printstep : float, optional
how often the data will be stored, default is timestep
Ge : float, optional
rubbery modulus, the default value is zero
Q1 : float, optional
first eigenmode's quality factor
Q2 : float, optional
second eigenmode's quality factor
Q3 : float, optional
third eigenmode's quality factor
H : float, optional
Hamaker constant
Returns:
-------
np.array(t_a) : numpy.ndarray
time trace
np.array(tip_a) : numpy.ndarray
array containing the tip trajectory
np.array(Fts_a) : numpy.ndarray
array containing the tip-sample interacting force
np.array(xb_a) : numpy.ndarray
numpy array containing the instant position of the viscoelastic surface
"""
# TODO: Simplify inputs for this function. Consider using lists for all k, Q, fo, etc.
G_a = []
tau_a = []
"""
this loop ensures that tau contains no values smaller than ten times the timestep,
which would make the numerical integration unstable
"""
for i in range(len(G)):
if tau[i] > dt*10.0:
G_a.append(G[i])
tau_a.append(tau[i])
G = np.array(G_a)
tau = np.array(tau_a)
f1 = fo1
f2 = fo2
f3 = fo3
"""
Calculating the force amplitude to achieve the given free amplitude from amplitude response of tip excited
oscillator
"""
# Amplitude of 1st mode's force to achieve target amplitude based on amplitude response of a tip excited harmonic oscillator:
Fo1 = k_m1*A1/(fo1*2*np.pi)**2*(((fo1*2*np.pi)**2 - (f1*2*np.pi)**2 )**2 + (fo1*2*np.pi*f1*2*np.pi/Q1)**2)**0.5
# Amplitude of 2nd mode's force to achieve target amplitude based on amplitude response of a tip excited harmonic oscillator:
Fo2 = k_m2*A2/(fo2*2*np.pi)**2*(((fo2*2*np.pi)**2 - (f2*2*np.pi)**2 )**2 + (fo2*2*np.pi*f2*2*np.pi/Q2)**2)**0.5
# Amplitude of 3rd mode's force to achieve target amplitude based on amplitude response of a tip excited harmonic oscillator
Fo3 = k_m3*A3/(fo3*2*np.pi)**2*(((fo3*2*np.pi)**2 - (f3*2*np.pi)**2 )**2 + (fo3*2*np.pi*f3*2*np.pi/Q3)**2)**0.5
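Since each eigenmode is driven at its own resonance (f1 = fo1, f2 = fo2, f3 = fo3), the amplitude-response expression above collapses to Fo = k*A/Q. A minimal standalone check with hypothetical cantilever values (not taken from any particular simulation):

```python
import numpy as np

# hypothetical first-eigenmode parameters (illustrative only)
k, A, fo, Q = 10.0, 50e-9, 300e3, 150.0
f = fo  # driven exactly at resonance

# same amplitude-response formula used for Fo1, Fo2 and Fo3 above
Fo = k*A/(fo*2*np.pi)**2*(((fo*2*np.pi)**2 - (f*2*np.pi)**2)**2
                          + (fo*2*np.pi*f*2*np.pi/Q)**2)**0.5

# at resonance the expression reduces to k*A/Q
assert np.isclose(Fo, k*A/Q)
```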
a = 0.2e-9 # interatomic distance
eta = tau*G
Gg = Ge
for i in range(len(tau)):
Gg = Gg + G[i]
t_a = []
Fts_a = []
xb_a = []
tip_a = []
printcounter = 1
if printstep == 1:
printstep = dt
t = 0.0 # initializing time
Fts = 0.0
xb = 0.0
pb = 0.0
pc, pc_rate = np.zeros(len(tau)), np.zeros(len(tau))
xc, xc_rate = np.zeros(len(tau)), np.zeros(len(tau))
alfa = 16.0/3.0*np.sqrt(R)
# Initializing Verlet variables
z1, z2, z3, v1, v2, v3, z1_old, z2_old, z3_old = 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0
sum_Gxc = 0.0
sum_G_pb_pc = 0.0
while t < simultime:
t = t + dt
tip, z1, z2, z3, v1, v2, v3, z1_old, z2_old, z3_old = numba_verlet(zb, Fo1, Fo2, Fo3, Q1, Q2, Q3, k_m1, k_m2, k_m3, t, z1, z2,z3, v1,v2,v3, z1_old, z2_old, z3_old, Fts, dt, fo1,fo2,fo3, f1,f2,f3)
if t > ( startprint + printstep*printcounter):
t_a.append(t)
Fts_a.append(Fts)
xb_a.append(xb)
tip_a.append(tip)
printcounter = printcounter + 1
sum_Gxc = 0.0
sum_G_pb_pc = 0.0
if tip > xb: # apparent non-contact
for i in range(len(tau)):
sum_Gxc = sum_Gxc + G[i]*xc[i]
if sum_Gxc/Gg > tip: # contact, the sample surface surpassed the tip on its way up
xb = tip
pb = (-xb)**1.5
for i in range(len(tau)):
sum_G_pb_pc = sum_G_pb_pc + G[i]*(pb - pc[i])
Fts = alfa*( Ge*pb + sum_G_pb_pc )
# get position of dashpots
for i in range(len(tau)):
pc_rate[i] = G[i]/eta[i] * (pb - pc[i])
pc[i] = pc[i] + pc_rate[i]*dt
xc[i] = -(pc[i])**(2.0/3)
else: # true non-contact
xb = sum_Gxc/Gg
Fts = 0.0
for i in range(len(tau)):
xc_rate[i] = G[i]*(xb-xc[i])/eta[i]
xc[i] = xc[i] + xc_rate[i]*dt
pc[i] = (-xc[i])**(3.0/2) #debugging
else: # contact region
xb = tip
pb = (-xb)**1.5
for i in range(len(tau)):
sum_G_pb_pc = sum_G_pb_pc + G[i]*(pb - pc[i])
Fts = alfa*( Ge*pb + sum_G_pb_pc )
# get position of dashpots
for i in range(len(tau)):
pc_rate[i] = G[i]/eta[i] * (pb - pc[i])
pc[i] = pc[i] + pc_rate[i]*dt
xc[i] = -(pc[i])**(2.0/3)
# MAKING CORRECTION TO INCLUDE VDW ACCORDING TO DMT THEORY
if tip > xb: # overall non-contact
Fts = -H*R/( 6.0*( (tip-xb) + a )**2 )
else:
Fts = Fts - H*R/(6.0*a**2)
return np.array(t_a), np.array(tip_a), np.array(Fts_a), np.array(xb_a)
GenMaxwell_jit = jit()(gen_maxwell_lr) # this line should stay outside the function so that numba compilation and simulation acceleration work properly
def dynamic_spectroscopy(G, tau, R, dt, startprint, simultime, fo1, fo2, fo3, k_m1, k_m2, k_m3, A1, A2, A3, printstep = 1, Ge = 0.0, Q1=100, Q2=200, Q3=300, H=2.0e-19, z_step = 1):
"""This function is designed for tapping mode spectroscopy to obtain amplitude and phase curves as the cantilever is approached towards the surface.
The contact mechanics are performed over the framework of Lee and Radok for viscoelastic indentation (Lee, E. Ho, and Jens Rainer Maria Radok. "The contact problem for viscoelastic bodies." Journal of Applied Mechanics 27.3 (1960): 438-444.)
Parameters:
----------
G : numpy.ndarray
moduli of the springs in the Maxwell arms of a generalized Maxwell model (also called Wiechert model)
tau: numpy.ndarray
relaxation times of the Maxwell arms
R : float
tip radius
dt : float
simulation timestep
startprint : float
time at which the simulation results start to be saved
simultime : float
total simulation time
fo1 : float
1st eigenmode resonance frequency
fo2 : float
2nd eigenmode resonance frequency
fo3 : float
3rd eigenmode resonance frequency
k_m1 : float
1st eigenmode's stiffness
k_m2 : float
2nd eigenmode's stiffness
k_m3 : float
3rd eigenmode's stiffness
A1 : float
target oscillating amplitude of 1st cantilever eigenmode
A2 : float
target oscillating amplitude of 2nd cantilever eigenmode
A3 : float
target oscillating amplitude of 3rd cantilever eigenmode
printstep : float, optional
how often the data will be stored, default is timestep
Ge : float, optional
rubbery modulus, the default value is zero
Q1 : float, optional
first eigenmode's quality factor
Q2 : float, optional
second eigenmode's quality factor
Q3 : float, optional
third eigenmode's quality factor
H : float, optional
Hamaker constant
z_step : float, optional
cantilever equilibrium spatial step between runs. The smaller this number, the more runs but slower the simulation
Returns:
-------
np.array(amp) : numpy.ndarray
array containing the reduced amplitudes at different cantilever equilibrium positions
np.array(phase) : numpy.ndarray
array containing the phase shifts obtained at different cantilever equilibrium positions
np.array(zeq) : numpy.ndarray
array containing the approaching cantilever equilibrium positions
np.array(Ediss) : numpy.ndarray
array containing the values of dissipated energy
np.array(Virial) : numpy.ndarray
array containing the values of the virial of the interaction
np.array(peakF) : numpy.ndarray
array containing the values of peak force
np.array(maxdepth) : numpy.ndarray
array containing the values of maximum indentation
np.array(t_a) : numpy.ndarray
time trace
np.array(tip_a) : numpy.ndarray
2D array containing the tip trajectory for each run
np.array(Fts_a) : numpy.ndarray
2D array containing the tip-sample interacting force for each run
np.array(xb_a) : numpy.ndarray
2D array containing the instant position of the viscoelastic surface for each run
"""
if z_step == 1:
z_step = A1*0.05 #default value is 5% of the free oscillation amplitude
zeq = []
peakF = []
maxdepth = []
amp = []
phase = []
Ediss = []
Virial = []
tip_a = []
Fts_a = []
xb_a = []
zb = A1*1.1
while zb > 0.0:
t, tip, Fts, xb = GenMaxwell_jit(G, tau, R, dt, startprint, simultime, fo1, fo2, fo3, k_m1, k_m2,k_m3, A1, A2, A3, zb, printstep, Ge, Q1, Q2, Q3, H)
A,phi = amp_phase(t, tip, fo1)
Ets = e_diss(tip, Fts, dt, fo1)
fts_peak = Fts[np.argmax(Fts)]
tip_depth = xb[np.argmax(tip)] -xb[np.argmin(tip)]
Vts = v_ts(tip-zb, Fts, dt)
#Attaching single values to lists
zeq.append(zb)
peakF.append(fts_peak)
maxdepth.append(tip_depth)
amp.append(A)
phase.append(phi)
Ediss.append(Ets)
Virial.append(Vts)
#attaching 1D arrays to lists
tip_a.append(tip)
Fts_a.append(Fts)
xb_a.append(xb)
zb -= z_step
return np.array(amp), np.array(phase), np.array(zeq), np.array(Ediss), np.array(Virial), np.array(peakF), np.array(maxdepth), t, np.array(tip_a), np.array(Fts_a), np.array(xb_a)
def verlet_FS(y_t, Q1, Q2, Q3, k1, k2, k3, time, z1, z2,z3, v1,v2,v3, z1_old, z2_old, z3_old, Fts, dt, fo1,fo2,fo3, Fb1=0.0, Fb2=0.0, Fb3=0.0):
"""This function performs verlet algorithm (central difference) for numerical integration of the AFM cantilever dynamics.
The equations of motion are for a based excited cantilever, to be used for a static force spectroscopy simulation
It integrates the differential equations of three harmonic oscillators (each corresponding to a distinct
cantilever eigenmode)
The dynamics of the cantilever are assumed to be contained in the first three flexural modes
This function will be called each simulation timestep by a main wrap around function which will contain the specific
contact-mechanics model.
Parameters:
----------
y_t : float
z equilibrium position (average tip position with respect to the sample)
Q1 : float
first eigenmode's quality factor
Q2 : float
second eigenmode's quality factor
Q3 : float
third eigenmode's quality factor
k1 : float
1st eigenmode's stiffness
k2 : float
2nd eigenmode's stiffness
k3 : float
3rd eigenmode's stiffness
time : float
instant time of the simulation
z1 : float
instant 1st eigenmode deflection contribution
z2 : float
instant 2nd eigenmode deflection contribution
z3 : float
instant 3rd eigenmode deflection contribution
v1 : float
instant 1st eigenmode velocity
v2 : float
instant 2nd eigenmode velocity
v3 : float
instant 3rd eigenmode velocity
z1_old : float
instant 1st eigenmode deflection contribution corresponding to previous timestep
z2_old : float
instant 2nd eigenmode deflection contribution corresponding to previous timestep
z3_old : float
instant 3rd eigenmode deflection contribution corresponding to previous timestep
Fts : float
tip-sample interacting force
dt : float
simulation timestep
fo1 : float
1st eigenmode resonance frequency
fo2 : float
2nd eigenmode resonance frequency
fo3 : float
3rd eigenmode resonance frequency
Fb1 : float, optional
amplitude of the 1st eigenmode Brownian force (associated to thermal noise)
Fb2 : float, optional
amplitude of the 2nd eigenmode Brownian force (associated to thermal noise)
Fb3 : float, optional
amplitude of the 3rd eigenmode Brownian force (associated to thermal noise)
Returns:
-------
tip: float
instant tip position for new simulation timestep
z1 : float
instant 1st eigenmode deflection contribution for new simulation timestep
z2 : float
instant 2nd eigenmode deflection contribution for new simulation timestep
z3 : float
instant 3rd eigenmode deflection contribution for new simulation timestep
v1 : float
instant 1st eigenmode velocity for new simulation timestep
v2 : float
instant 2nd eigenmode velocity for new simulation timestep
v3 : float
instant 3rd eigenmode velocity for new simulation timestep
z1_old : float
instant 1st eigenmode deflection contribution corresponding to current timestep
z2_old : float
instant 2nd eigenmode deflection contribution corresponding to current timestep
z3_old : float
instant 3rd eigenmode deflection contribution corresponding to current timestep
"""
a1 = ( - z1 - v1*1.0/(fo1*2*np.pi*Q1) + y_t + Fts/k1 + Fb1/k1) *(2.0*np.pi*fo1)**2
a2 = ( - z2 - v2*1.0/(fo2*2*np.pi*Q2) + Fts/k2 + Fb2/k2) *(2.0*np.pi*fo2)**2
a3 = ( - z3 - v3*1.0/(fo3*2*np.pi*Q3) + Fts/k3 + Fb3/k3) *(2.0*np.pi*fo3)**2
#Verlet algorithm (central difference) to calculate position of the tip
z1_new = 2*z1 - z1_old + a1*pow(dt, 2)
z2_new = 2*z2 - z2_old + a2*pow(dt, 2)
z3_new = 2*z3 - z3_old + a3*pow(dt, 2)
#central difference to calculate velocities
v1 = (z1_new - z1_old)/(2*dt)
v2 = (z2_new - z2_old)/(2*dt)
v3 = (z3_new - z3_old)/(2*dt)
#Updating z1_old and z1 for the next run
z1_old = z1
z1 = z1_new
z2_old = z2
z2 = z2_new
z3_old = z3
z3 = z3_new
tip = z1 + z2 + z3
return tip, z1, z2, z3, v1, v2, v3, z1_old, z2_old, z3_old
numba_verlet_FS = jit()(verlet_FS)
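The central-difference update used in verlet_FS (z_new = 2*z - z_old + a*dt**2) can be sanity-checked on an undamped harmonic oscillator, where it should reproduce the analytic cosine solution; a self-contained sketch with arbitrary values:

```python
import numpy as np

w = 2*np.pi                     # 1 Hz oscillator, z'' = -w**2 * z
dt = 1e-4
z, z_old = 1.0, np.cos(-w*dt)   # z(0) = 1 and the exact previous step
for _ in range(10000):          # integrate one full period (t = 1 s)
    a = -w**2 * z
    z_new = 2*z - z_old + a*dt**2   # same central-difference step as verlet_FS
    z_old, z = z, z_new
# after one period the oscillator should return near z = 1
assert abs(z - 1.0) < 1e-3
```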
def sfs_genmaxwell_lr(G, tau, R, dt, simultime, y_dot, y_t_initial, k_m1, fo1, Ge = 0.0, Q1=100, printstep = 1, H = 2.0e-19, Q2=200, Q3=300, startprint = 1, vdw = 1):
"""This function is designed for force spectroscopy over a Generalized Maxwel surface
The contact mechanics are performed within the framework of Lee and Radok, and thus strictly apply only to the approach portion
Parameters:
----------
G : numpy.ndarray
moduli of the springs in the Maxwell arms of a generalized Maxwell model (also called Wiechert model)
tau: numpy.ndarray
relaxation times of the Maxwell arms
R : float
tip radius
dt : float
simulation timestep
simultime : float
total simulation time
y_dot: float
approach velocity of the cantilever's base towards the sample
y_t_initial: float
initial position of the cantilever base with respect to the sample
k_m1 : float
1st eigenmode's stiffness
fo1 : float
1st eigenmode resonance frequency
Ge : float, optional
equilibrium modulus of the material, default value is zero
Q1 : float, optional
1st eigenmode quality factor
printstep : int, optional
if value is 1 the data will be printed with step equal to dt
H : float, optional
Hamaker constant
Q2 : float, optional
2nd eigenmode quality factor
Q3 : float, optional
3rd eigenmode quality factor
startprint : float, optional
when the simulation starts getting printed
vdw : int, optional
if value is 1 van der Waals forces are neglected
Returns:
-------
np.array(t_a) : numpy.ndarray
time trace
np.array(tip_a) : numpy.ndarray
tip position in force spectroscopy simulation
np.array(Fts_a) : numpy.ndarray
tip-sample force interaction in the force spectroscopy simulation
np.array(xb_a) : numpy.ndarray
viscoelastic sample position in the simulation
np.array(defl_a) : numpy.ndarray
cantilever deflection during the force spectroscopy simulation
np.array(zs_a) : numpy.ndarray
z-sensor position (cantilever base position)
"""
G_a = []
tau_a = []
"""
this loop makes sure the passed tau contains no values close to or below the timestep, which would make the numerical integration unstable
"""
for i in range(len(G)):
if tau[i] > dt*10.0:
G_a.append(G[i])
tau_a.append(tau[i])
G = np.array(G_a)
tau = np.array(tau_a)
a = 0.2e-9 # intermolecular distance
fo2 = 6.27*fo1 # resonance frequency of the second eigenmode (value taken from Garcia, R., & Herruzo, E. T. (2012). The emergence of multifrequency force microscopy. Nature nanotechnology, 7(4), 217-226.)
fo3 = 17.6*fo1 # resonance frequency of the third eigenmode (value taken from Garcia, R., & Herruzo, E. T. (2012). The emergence of multifrequency force microscopy. Nature nanotechnology, 7(4), 217-226.)
k_m2 = k_m1*(fo2/fo1)**2
k_m3 = k_m1*(fo3/fo1)**2
if startprint == 1: # default: start saving results when the cantilever base reaches the sample
startprint = y_t_initial/y_dot
eta = tau*G
Gg = Ge
for i in range(len(tau)): # this loop looks redundant, but replacing it with Gg = Ge + sum(G[:]) conflicts with numba and makes the simulation very slow
Gg = Gg + G[i]
t_a = []
Fts_a = []
xb_a = []
tip_a = []
defl_a = []
zs_a = []
printcounter = 1
if printstep == 1: # default printstep equals the timestep
printstep = dt
t = 0.0 #initializing time
Fts = 0.0
xb = 0.0
pb = 0.0
pc, pc_rate = np.zeros(len(tau)), np.zeros(len(tau))
alfa = 16.0/3.0*np.sqrt(R) #cell constant, related to tip geometry
#Initializing Verlet variables
z2, z3, v2, v3, z2_old, z3_old = 0.0, 0.0, 0.0, 0.0, 0.0, 0.0
v1 = y_dot
z1 = y_t_initial
z1_old = y_t_initial
while t < simultime:
t = t + dt
y_t = - y_dot*t + y_t_initial #Displacement of the base (z_sensor position)
tip, z1, z2, z3, v1, v2, v3, z1_old, z2_old, z3_old = numba_verlet_FS(y_t, Q1, Q2, Q3, k_m1, k_m2, k_m3, t, z1, z2,z3, v1,v2,v3, z1_old, z2_old, z3_old, Fts, dt, fo1,fo2,fo3)
defl = tip - y_t
if t > ( startprint + printstep*printcounter):
defl_a.append(defl)
zs_a.append(y_t)
t_a.append(t)
Fts_a.append(Fts)
xb_a.append(xb)
tip_a.append(tip)
printcounter += 1
sum_G_pc = 0.0
sum_G_pb_pc = 0.0
if tip > xb: # apparent non-contact
for i in range(len(tau)):
sum_G_pc = sum_G_pc + G[i]*pc[i]
if sum_G_pc/Gg > tip: #contact, the sample surface surpassed the tip
xb = tip
pb = (-xb)**1.5
for i in range(len(tau)):
sum_G_pb_pc = sum_G_pb_pc + G[i]*(pb - pc[i])
Fts = alfa*( Ge*pb + sum_G_pb_pc )
else: #true non contact
pb = sum_G_pc/Gg
xb = pb**(2.0/3)
Fts = 0.0
else: #contact region
xb = tip
pb = (-xb)**1.5
for i in range(len(tau)):
sum_G_pb_pc = sum_G_pb_pc + G[i]*(pb - pc[i])
Fts = alfa*( Ge*pb + sum_G_pb_pc )
# get position of dashpots
for i in range(len(tau)):
pc_rate[i] = G[i]/eta[i] * (pb - pc[i])
pc[i] = pc[i] + pc_rate[i]*dt
if vdw != 1:
#MAKING CORRECTION TO INCLUDE VDW ACCORDING TO DMT THEORY
if tip > xb: #overall non-contact
Fts = -H*R/( 6.0*( (tip-xb) + a )**2 )
else:
Fts = Fts - H*R/(6.0*a**2)
return np.array(t_a), np.array(tip_a), np.array(Fts_a), np.array(xb_a), np.array(defl_a), np.array(zs_a)
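The dashpot update used in both simulation loops, pc_rate = G/eta*(pb - pc) with eta = G*tau, is a forward-Euler step of first-order relaxation toward pb with time constant tau. A standalone check against the exponential solution, with arbitrary arm values:

```python
import numpy as np

G, tau = 1e6, 1e-3     # hypothetical arm modulus and relaxation time
eta = G*tau
dt = tau/1000.0
pb = 1.0               # hold the driving term constant
pc, t = 0.0, 0.0
while t < 5*tau:
    pc += G/eta*(pb - pc)*dt   # same forward-Euler step as pc_rate above
    t += dt
# analytic solution of first-order relaxation: pc = pb*(1 - exp(-t/tau))
assert abs(pc - pb*(1 - np.exp(-t/tau))) < 1e-2
```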
def compliance_maxwell(G, tau , Ge = 0.0, dt = 1, simul_t = 1, lw=0):
"""This function returns the numerical compliance of a Generalized Maxwell model.
This numerical compliance is useful for interconversion from Gen Maxwell model to generalized Voigt model
Parameters:
----------
G : numpy.ndarray
moduli of the springs in the Maxwell arms of a generalized Maxwell model (also called Wiechert model)
tau: numpy.ndarray
relaxation times of the Maxwell arms
Ge : float, optional
equilibrium modulus of the material, default value is zero
dt : float, optional
simulation timestep
simul_t : float, optional
total simulation time
lw : int, optional
flag to return calculated compliance with logarithmic weight
Returns:
----------
np.array(t_r) : numpy.ndarray
array containing the time trace
np.array(J_r) : numpy.ndarray
array containing the calculated creep compliance
"""
if dt == 1: # if the timestep is not user defined, it is given as a fraction of the lowest characteristic time
dt = tau[0]/100.0
if simul_t == 1: # if the simulation time is not defined, it is set relative to the largest characteristic time
simul_t = tau[len(tau)-1]*10.0e3
G_a = []
tau_a = []
"""
this loop makes sure the passed tau contains no values close to or below the timestep, which would make the numerical integration unstable
"""
for i in range(len(G)):
if tau[i] > dt*10.0:
G_a.append(G[i])
tau_a.append(tau[i])
G = np.array(G_a)
tau = np.array(tau_a)
Gg = Ge
for i in range(len(tau)): # this loop looks redundant, but replacing it with Gg = Ge + sum(G[:]) conflicts with numba and makes the simulation very slow
Gg = Gg + G[i]
eta = tau*G
Jg =1.0/Gg #glassy compliance
N = len(tau)
Epsilon_visco = np.zeros(N) #initial strain
Epsilon_visco_dot = np.zeros(N) #initial strain velocity
t_r = [] #creating list with unknown number of elements
J_r = [] #creating list with unknown number of elements
time = 0.0
J_t = Jg #initial compliance
print_counter = 1
tr = dt #printstep
while time < simul_t: #CREEP COMPLIANCE SIMULATION, ADVANCING IN TIME
time = time + dt
sum_Gn_EpsVisco_n = 0.0 # this sum has to be reset to zero every timestep
for n in range(0,N):
Epsilon_visco_dot[n] = G[n]*(J_t - Epsilon_visco[n])/eta[n]
Epsilon_visco[n] = Epsilon_visco[n] + Epsilon_visco_dot[n]*dt
sum_Gn_EpsVisco_n = sum_Gn_EpsVisco_n + G[n]*Epsilon_visco[n]
J_t = (1 + sum_Gn_EpsVisco_n)/Gg
if time >= print_counter*tr and time < simul_t:
t_r.append(time)
J_r.append(J_t)
print_counter += 1
if lw != 0: #if logarithmic weight is activated, the data will be appended weighted logarithmically
if print_counter == 10:
tr = tr*10
print_counter = 1
return np.array(t_r), np.array(J_r)
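For a standard linear solid (one Maxwell arm G1, tau1 plus an equilibrium spring Ge), the recursion in compliance_maxwell should take the creep compliance from the glassy value 1/Gg to the rubbery plateau 1/Ge. A self-contained check with hypothetical moduli:

```python
# standard linear solid: one Maxwell arm (G1, tau1) plus equilibrium spring Ge
G1, tau1, Ge = 2.0, 1.0, 1.0
Gg = Ge + G1
eta = G1*tau1
dt = tau1/1000.0
eps_v, J_t, t = 0.0, 1.0/Gg, 0.0       # start at the glassy compliance
while t < 50*tau1:
    eps_v += G1*(J_t - eps_v)/eta*dt   # same update as Epsilon_visco above
    J_t = (1 + G1*eps_v)/Gg
    t += dt
# creep compliance relaxes from 1/Gg to the rubbery value 1/Ge
assert abs(J_t - 1.0/Ge) < 1e-3
```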
def relaxation_voigt(J, tau, Jg, phi_f = 0.0, dt = 1, simul_t = 1, lw = 0):
"""This function returns the numerical relaxation modulus of a Generalized Voigt model
This numerical relaxation modulus is useful for interconversion from Gen Maxwell model to generalized Voigt model
Parameters:
----------
J : numpy.ndarray
compliances of the springs in the Voigt units of a generalized Voigt model
tau: numpy.ndarray
retardation times of the Voigt units
Jg : float
glassy compliance of the material
dt : float, optional
simulation timestep
simul_t : float, optional
total simulation time
lw : int, optional
flag to return the calculated relaxation modulus with logarithmic weighting
Returns:
----------
np.array(t_r) : numpy.ndarray
array containing the time trace
np.array(G_r) : numpy.ndarray
array containing the calculated relaxation modulus
"""
if dt == 1: # if the timestep is not user defined, it is given as a fraction of the lowest characteristic time
dt = tau[0]/100.0
if simul_t == 1: # if the simulation time is not defined, it is set relative to the largest characteristic time
simul_t = tau[len(tau)-1]*10.0e3
J_a = []
tau_a = []
"""
this loop makes sure the passed tau contains no values close to or below the timestep, which would make the numerical integration unstable
"""
for i in range(len(J)):
if tau[i] > dt*10.0:
J_a.append(J[i])
tau_a.append(tau[i])
J = np.array(J_a)
tau = np.array(tau_a)
Gg = 1.0/Jg
N = len(tau)
phi = J/tau
#Defining initial conditions
x = np.zeros(N)
x_dot = np.zeros(N)
t_r = [] #creating list with unknown number of elements
G_r = [] #creating list with unknown number of elements
time = 0.0
G_t = Gg #initial relaxation modulus
print_counter = 1
tr = dt #printstep
while time < simul_t: #RELAXATION MODULUS SIMULATION, ADVANCING IN TIME
time = time + dt
k = len(tau) - 1
while k > -1:
if k == len(tau) - 1:
x_dot[k] = G_t*phi[k]
else:
x_dot[k] = G_t*phi[k] + x_dot[k+1]
k -=1
for i in range(len(tau)):
x[i] = x[i] + x_dot[i]*dt
G_t = Gg*(1.0-x[0])
if time >= print_counter*tr and time <simul_t:
t_r.append(time)
G_r.append(G_t)
print_counter += 1
if lw != 0: #if logarithmic weight is activated, the data will be appended weighted logarithmically
if print_counter == 10:
tr = tr*10
print_counter = 1
return np.array(t_r), np.array(G_r)
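For reference, the relaxation modulus this interconversion targets has a closed form for the generalized Maxwell model, G(t) = Ge + sum_i Gi*exp(-t/tau_i), with glassy limit Ge + sum(Gi) and rubbery plateau Ge; a one-arm example with arbitrary values:

```python
import numpy as np

Ge, G1, tau1 = 1.0, 2.0, 1.0
t = np.linspace(0.0, 10.0, 101)
G_t = Ge + G1*np.exp(-t/tau1)        # closed-form relaxation modulus, one arm
assert np.isclose(G_t[0], Ge + G1)   # glassy limit at t = 0
assert abs(G_t[-1] - Ge) < 1e-4      # rubbery plateau at long times
```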
Then said the king to the servants, Bind him hand and foot, and take him away, and cast him into outer darkness; there shall be weeping and gnashing of teeth.
Cast him into outer darkness - See the notes at Matthew 8:12. This, without doubt, refers to the future punishment of the hypocrite, Matthew 23:23-33; Matthew 24:51.
The provision made for perishing souls in the gospel, is represented by a royal feast made by a king, with eastern liberality, on the marriage of his son. Our merciful God has not only provided food, but a royal feast, for the perishing souls of his rebellious creatures. There is enough and to spare, of every thing that can add to our present comfort and everlasting happiness, in the salvation of his Son Jesus Christ. The guests first invited were the Jews. When the prophets of the Old Testament prevailed not, nor John the Baptist, nor Christ himself, who told them the kingdom of God was at hand, the apostles and ministers of the gospel were sent, after Christ's resurrection, to tell them it was come, and to persuade them to accept the offer. The reason why sinners come not to Christ and salvation by him, is, not because they cannot, but because they will not. Making light of Christ, and of the great salvation wrought out by him, is the damning sin of the world. They were careless. Multitudes perish for ever through mere carelessness, who show no direct aversion, but are careless as to their souls. Also the business and profit of worldly employments hinder many in closing with the Saviour. Both farmers and merchants must be diligent; but whatever we have of the world in our hands, our care must be to keep it out of our hearts, lest it come between us and Christ. The utter ruin coming upon the Jewish church and nation, is here represented. Persecution of Christ's faithful ministers fills up the measure of guilt of any people. The offer of Christ and salvation to the Gentiles was not expected; it was such a surprise as it would be to wayfaring men, to be invited to a royal wedding-feast. The design of the gospel is to gather souls to Christ; all the children of God scattered abroad, Joh 10:16; 11:52. The case of hypocrites is represented by the guest that had not on a wedding-garment. 
It concerns all to prepare for the scrutiny; and those, and those only, who put on the Lord Jesus, who have a Christian temper of mind, who live by faith in Christ, and to whom he is all in all, have the wedding-garment. The imputed righteousness of Christ, and the sanctification of the Spirit, are both alike necessary. No man has the wedding-garment by nature, or can form it for himself. The day is coming, when hypocrites will be called to account for all their presumptuous intruding into gospel ordinances, and usurpation of gospel privileges. Take him away. Those that walk unworthy of Christianity, forfeit all the happiness they presumptuously claimed. Our Saviour here passes out of the parable into that which it teaches. Hypocrites go by the light of the gospel itself down to utter darkness. Many are called to the wedding-feast, that is, to salvation, but few have the wedding-garment, the righteousness of Christ, the sanctification of the Spirit. Then let us examine ourselves whether we are in the faith, and seek to be approved by the King.
13. Take him away. Men are excluded from the kingdom of heaven as a result of their own wrong choices. Thus it was with the five foolish virgins (see on ch. 25:11, 12). The man in the parable was able to enter the hall only by virtue of the royal invitation, but he alone was responsible for his being put out. No man can save himself, but he can bring condemnation on himself. Conversely, God is able to “save … to the uttermost” (Heb. 7:25), but He does not arbitrarily condemn any, or deny them entrance into the kingdom.
Outer darkness. See chs. 8:12; 25:30. This is the darkness of oblivion, of eternal separation from God, of annihilation. In the parable the darkness was all the more palpable in contrast with the brilliant light of the wedding chamber.
Gnashing of teeth. See on ch. 8:12.
This chapter is based on Matthew 22:1-14.
#!/usr/bin/python
"""
Extracting features from VTK input files.
Authors:
- Forrest Sheng Bao (forrest.bao@gmail.com) http://fsbao.net
Copyright 2012, Mindboggle team (http://mindboggle.info), Apache v2.0 License
For algorithmic details, please check:
Forrest S. Bao, et al., Automated extraction of nested sulcus features from human brain MRI data,
IEEE EMBC 2012, San Diego, CA
Dependencies:
python-vtk: vtk's official Python binding
numpy
io_vtk : under mindboggle/utils
"""
from numpy import mean, std, median, array, zeros, eye, flatnonzero, sign, matrix, zeros_like
import os.path
import cPickle
#import io_vtk # Assuming io_vtk is in PYTHONPATH
from mindboggle.utils import io_vtk
import sys
#-----------------Begin function definitions-------------------------------------------------------------
def fcNbrLst(FaceDB, Hemi):
'''Get a neighbor list of faces, also the vertex not shared with current face
Data structure:
NbrLst: a list of size len(FaceDB)
NbrLst[i]: two lists of size 3 each.
NbrLst[i][0] = [F0, F1, F2]: F0 is the neighbor of face i facing V0 where [V0, V1, V2] is face i. And so forth.
NbrLst[i][1] = [V0p, V1p, V2p]: V0p is the vertex of F0 that is not shared with face i
'''
NbrFile = Hemi + '.fc.nbr'
if os.path.exists(NbrFile):
#return fileio.loadFcNbrLst(NbrFile)
print "loading face nbr lst from:" , NbrFile
Fp = open(NbrFile, 'r')
NbrLst = cPickle.load(Fp)
Fp.close()
return NbrLst
print "calculating face neighbor list"
FaceNo = len(FaceDB)
NbrLst = []
[NbrLst.append([[-1,-1,-1], [-1,-1,-1]]) for i in xrange(FaceNo)]
Done =[]
[Done.append(0) for i in xrange(FaceNo)]
for i in xrange(0, FaceNo):
# for i in xrange(0, 2600+1):
# print i
Face = FaceDB[i]
# [V0, V1, V2] = Face
# Found = 0 # if Found == 1, no need to try other faces
for j in xrange(i+1, FaceNo):
AnotherFace = FaceDB[j]
for Idx in xrange(0,2):
ChkFc1 = Face[Idx]
for ChkFc2 in Face[Idx+1:3]:
if ChkFc1 in AnotherFace:
if ChkFc2 in AnotherFace:
NbrID1 = 3 - Face.index(ChkFc1) - Face.index(ChkFc2) # determine it's F0, F1 or F2.
NbrLst[i][0][NbrID1] = j
NbrID2 = 3 - AnotherFace.index(ChkFc1) - AnotherFace.index(ChkFc2) # determine it's F0, F1 or F2.
NbrLst[j][0][NbrID2] = i
# Vp1 = AnotherFace[NbrID2]# determine V{0,1,2}p
# Vp2 = Face[NbrID1]# determine V{0,1,2}p
NbrLst[i][1][NbrID1] = AnotherFace[NbrID2]
NbrLst[j][1][NbrID2] = Face[NbrID1]
Done[i] += 1
Done[j] += 1
if Done[i] ==3:
break # all three neighbors of Face have been found
Fp = open(NbrFile, 'w')
# Commented 2011-11-27 23:54
# for i in xrange(0, len(FaceDB)
# for j in NbrLst[i]:
# Fp.write(str(j[0]) + '\t' + str(j[1]) + '\t' + str(j[2]) + '\t')
# Fp.write('\n')
# End of Commented 2011-11-27 23:54
cPickle.dump(NbrLst, Fp)
Fp.close()
return NbrLst
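The O(n^2) pairwise scan above can be replaced by a single pass that hashes each shared edge; this is not the module's implementation, just a sketch of the alternative (Python 3 syntax):

```python
def face_edge_map(faces):
    """Map each undirected edge to the list of faces containing it (sketch)."""
    edge_to_faces = {}
    for fi, (v0, v1, v2) in enumerate(faces):
        for edge in ((v0, v1), (v1, v2), (v0, v2)):
            edge_to_faces.setdefault(frozenset(edge), []).append(fi)
    return edge_to_faces

# two triangles sharing the edge (1, 2) are neighbors of each other
e2f = face_edge_map([[0, 1, 2], [1, 2, 3]])
assert e2f[frozenset((1, 2))] == [0, 1]
```

Faces listed under the same edge key are neighbors, so the full neighbor list falls out of one linear pass over the faces.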
def vrtxNbrLst(VrtxNo, FaceDB, Hemi):
"""Given the number of vertexes and the list of faces, find the neighbors of each vertex, in list formate.
"""
NbrFile = Hemi + '.vrtx.nbr'
if os.path.exists(NbrFile):
#return fileio.loadVrtxNbrLst(NbrFile) # change to cPickle
print "Loading vertex nbr lst from:", NbrFile
Fp = open(NbrFile, 'r') # need to use cPickle
NbrLst = cPickle.load(Fp)
Fp.close()
return NbrLst
print "Calculating vertex neighbor list"
NbrLst = [[] for i in xrange(0, VrtxNo)]
for Face in FaceDB:
[V0, V1, V2] = Face
if not V1 in NbrLst[V0]:
NbrLst[V0].append(V1)
if not V2 in NbrLst[V0]:
NbrLst[V0].append(V2)
if not V0 in NbrLst[V1]:
NbrLst[V1].append(V0)
if not V2 in NbrLst[V1]:
NbrLst[V1].append(V2)
if not V0 in NbrLst[V2]:
NbrLst[V2].append(V0) # fixed: this previously appended V1, so V0 was never added to V2's neighbor list
if not V1 in NbrLst[V2]:
NbrLst[V2].append(V1)
Fp = open(NbrFile, 'w') # need to use cPickle
# Commented 2011-11-27 23:54
# for i in xrange(0, VrtxNo):
# [Fp.write(str(Vrtx) + '\t') for Vrtx in NbrLst[i]]
# Fp.write('\n')
# End of Commented 2011-11-27 23:54
cPickle.dump(NbrLst, Fp)
Fp.close()
return NbrLst
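An equivalent construction using sets avoids the repeated membership tests; a minimal Python 3 sketch (not the module's implementation):

```python
def vertex_neighbors(n_vertices, faces):
    """Build a vertex adjacency list from triangle faces (sketch)."""
    nbr = [set() for _ in range(n_vertices)]
    for v0, v1, v2 in faces:
        nbr[v0].update((v1, v2))
        nbr[v1].update((v0, v2))
        nbr[v2].update((v0, v1))
    return [sorted(s) for s in nbr]

# two triangles sharing the edge (1, 2)
nbrs = vertex_neighbors(4, [[0, 1, 2], [1, 2, 3]])
assert nbrs[0] == [1, 2]
assert nbrs[1] == [0, 2, 3]
assert nbrs[2] == [0, 1, 3]
assert nbrs[3] == [1, 2]
```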
def compnent(FaceDB, Basin, NbrLst, PathHeader):
'''Get connected component, in each of all basins, represented as faces and vertex clouds
Parameters
-----------
NbrLst : list
neighbor list of faces, NOT VERTEXES
PathHeader : header of the path to save component list
'''
FcCmpntFile = PathHeader + '.cmpnt.face'
VrtxCmpntFile = PathHeader + '.cmpnt.vrtx'
if os.path.exists(FcCmpntFile) and os.path.exists(VrtxCmpntFile):
# return fileio.loadCmpnt(FcCmpntFile), fileio.loadCmpnt(VrtxCmpntFile)
print "Loading Face Components from:", FcCmpntFile
Fp = open(FcCmpntFile, 'r')
FcCmpnt = cPickle.load(Fp)
Fp.close()
print "Loading Vertex Components from:", VrtxCmpntFile
Fp = open(VrtxCmpntFile, 'r')
VrtxCmpnt = cPickle.load(Fp)
Fp.close()
return FcCmpnt, VrtxCmpnt
print "calculating face and vertex components"
Visited = [False for i in xrange(0, len(Basin))]
FcCmpnt, VrtxCmpnt = [], []
while not allTrue(Visited):
Seed = dfsSeed(Visited, Basin)# first basin face that is not True in Visited
# print Seed
Visited, FcMbr, VrtxMbr = dfs(Seed, Basin, Visited, NbrLst, FaceDB) # DFS to find all connected members from the Seed
FcCmpnt.append(FcMbr)
VrtxCmpnt.append(VrtxMbr)
# fileio.writeCmpnt(FcCmpnt, FcCmpntFile)
# fileio.writeCmpnt(VrtxCmpnt, VrtxCmpntFile)
Fp = open(FcCmpntFile, 'w')
cPickle.dump(FcCmpnt, Fp)
Fp.close()
Fp = open(VrtxCmpntFile, 'w')
cPickle.dump(VrtxCmpnt, Fp)
Fp.close()
return FcCmpnt, VrtxCmpnt
def judgeFace1(FaceID, FaceDB, CurvatureDB, Threshold = 0):
"""Check whether a face satisfies the zero-order criterion
If all three vertexes of a face have negative curvature, return True. O/w, False.
Input
======
FaceID: integer
the ID of a face, indexing from 0
FaceDB: list
len(FaceDB) == number of faces in the hemisphere
FaceDB[i]: a 1-D list of the IDs of three vertexes that consist of the i-th face
CurvatureDB: list
len(CurvatureDB) == number of vertexes in the hemisphere
CurvatureDB[i]: integer, the curvature of the i-th vertex
"""
[V0, V1, V2] = FaceDB[FaceID]
##
# if (CurvatureDB[V0] > Threshold) and (CurvatureDB[V1] > Threshold) and (CurvatureDB[V2] > Threshold):
# return True
# else:
# return False
##
if (CurvatureDB[V0] <= Threshold) or (CurvatureDB[V1] <= Threshold) or (CurvatureDB[V2] <= Threshold):
return False
else:
return True
def basin(FaceDB, CurvatureDB, Threshold = 0):
'''Given a list of faces and per-vertex curvature value, return a list of faces comprising basins
'''
Basin = []
Left = []
for FaceID in xrange(0, len(FaceDB)):
if judgeFace1(FaceID, FaceDB, CurvatureDB, Threshold = Threshold):
Basin.append(FaceID)
else:
Left.append(FaceID)
return Basin, Left
def allTrue(List):
'''Return True if every element of a logical list is True.
'''
# for Bool in List:
# if not Bool:
# return False
# return True
return all(List)
def dfsSeed(Visited, Basin):
'''Given a list of faces comprising the basins, find a face that has not been visited which will be used as the seeding point for DFS.
'''
for i in xrange(0, len(Visited)):
if not Visited[i]:
return Basin[i]
def dfs(Seed, Basin, Visited, NbrLst, FaceDB):
'''Return all members (faces and vertexes) of the connected component that can be found by DFS from a given seed point
Parameters
-----------
NbrLst : list
neighbor list of faces, NOT VERTEXES
'''
Queue = [Seed]
FcMbr = [] # members that are faces of this connected component
VrtxMbr = [] # members that are vertex of this connected component
while Queue != []:
# print Queue
Seed = Queue.pop()
if Seed in Basin:
if not Visited[Basin.index(Seed)]:
Visited[Basin.index(Seed)] = True
FcMbr.append(Seed)
for Vrtx in FaceDB[Seed]:
if not (Vrtx in VrtxMbr):
VrtxMbr.append(Vrtx)
Queue += NbrLst[Seed][0]
return Visited, FcMbr, VrtxMbr
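The stack-based component search in dfs() can be sketched in isolation over a made-up face-adjacency list (face IDs and neighbors here are illustrative, not real mesh data):

```python
def connected_component(seed, members, nbr_lst):
    # pop a face, mark it visited if it belongs to the member set,
    # and push its neighbors; repeat until the stack is empty
    stack, visited = [seed], set()
    while stack:
        face = stack.pop()
        if face in members and face not in visited:
            visited.add(face)
            stack.extend(nbr_lst[face])
    return sorted(visited)

# faces 0-1-2 are mutually connected; face 3 is not a member; face 4 is isolated
nbr_lst = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2], 4: []}
members = {0, 1, 2, 4}
component = connected_component(0, members, nbr_lst)
```

Starting from face 0, the search reaches faces 1 and 2 but stops at face 3 (not a member) and never touches the isolated face 4.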
def pmtx(Adj):
'''Print a matrix as shown in MATLAB stdio
'''
for j in xrange(0,25):
print j,
print '\n'
for i in xrange(0, 25):
print i,
for j in xrange(0, 25):
print Adj[i,j],
print '\n'
def all_same(items):
return all(x == items[0] for x in items)
def univariate_pits(CurvDB, VrtxNbrLst, VrtxCmpnt, Thld):
'''Finding pits using one variable, e.g., depth.
'''
print "Extracting pits"
# Stack, P, Child, M, B, End, L = [], [], {}, -1, [], {}, 10
C = [-1 for i in xrange(0, len(VrtxNbrLst))]
Child = {}
End = {}
M = -1
B = []
for Cmpnt in VrtxCmpnt: # for each component
Curv=dict([(i, CurvDB[i]) for i in Cmpnt])
Stack = []
for Vrtx, Cvtr in sorted(Curv.iteritems(), key=lambda (k,v): (v,k)):
Stack.append(Vrtx)
Visited = []
while len(Stack) >0:
Skip_This_Vrtx = False # updated Forrest 2012-02-12, skip vertexes whose neighbors are not in the component to denoise
Vrtx = Stack.pop()
WetNbr = []
NbrCmpnt = []
for Nbr in list(set(VrtxNbrLst[Vrtx])):
if not Nbr in Cmpnt:
Skip_This_Vrtx = True
                if Nbr in Visited: # this condition may be replaced by checking C[Vrtx] == -1
WetNbr.append(Nbr)
if C[Nbr] != -1:
NbrCmpnt.append(C[Nbr])
if Skip_This_Vrtx :
continue
Visited.append(Vrtx)
if len(WetNbr) == 1: # if the vertex has one neighbor that is already wet
[Nbr] = WetNbr
if End[C[Nbr]]:
C[Vrtx] = Child[C[Nbr]]
else:
C[Vrtx] = C[Nbr]
# print C[Nbr], "==>", C[V]
elif len(WetNbr) >1 and all_same(NbrCmpnt): # if the vertex has more than one neighbors which are in the same component
if End[NbrCmpnt[0]]:
C[Vrtx] = Child[NbrCmpnt[0]]
else:
C[Vrtx] = NbrCmpnt[0]
elif len(WetNbr) >1 and not all_same(NbrCmpnt): # if the vertex has more than one neighbors which are NOT in the same component
M += 1
C[Vrtx] = M
for Nbr in WetNbr:
Child[C[Nbr]] = M
End[C[Nbr]] = True
End[M] = False
# elif : # the vertex's neighbor are not fully in the component
else:
M += 1
if CurvDB[Vrtx] > Thld:
B.append(Vrtx)
End[M] = False
C[Vrtx] = M
return B, C, Child
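The local-minimum idea behind univariate_pits() can be reduced to its core: a vertex is a pit candidate when its map value exceeds the threshold and exceeds that of every neighbor. A minimal sketch on toy data (values and neighbor lists are invented for illustration, without the component bookkeeping above):

```python
def local_minimum_pits(depth, nbr_lst, threshold=0):
    # a pit is a vertex deeper (larger map value) than all its neighbors
    pits = []
    for v, nbrs in enumerate(nbr_lst):
        if depth[v] > threshold and all(depth[v] > depth[n] for n in nbrs):
            pits.append(v)
    return pits

depth = [0.1, 2.5, 0.3, 1.8, 0.2]
nbr_lst = [[1], [0, 2], [1, 3], [2, 4], [3]]
pits = local_minimum_pits(depth, nbr_lst, threshold=0.5)
```

Vertices 1 and 2.5 deep and 3 at 1.8 deep each dominate their neighbors, so both are reported.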
def clouchoux(MCurv, GCurv):
'''Judge whether a vertex is a pit in Clouchoux's definition
Parameters
===========
MCurv: float
mean curvature of a vertex
H in Clouchoux's paper
GCurv: float
        Gaussian curvature of a vertex
K in Clouchoux's paper
Returns
========
True if this is a pit. False, otherwise.
Notes
=========
    (Since Joachim's code updates all the time, these settings have to be updated accordingly)
    In Clouchoux's paper, the following definitions are used:
    H > 0, K > 0: pit, in Clouchoux's paper
    H < 0, K > 0: peak, in Clouchoux's paper
    If features are computed by ComputePricipalCurvature(),
    use these settings to get proper pits:
H > 3, K < 0 (curvatures not normalized)
H > 0.2, K < 0 (curvatures normalized)
'''
# if (MCurv > 3) and (GCurv < 0):
if (MCurv > 0.2) and (GCurv < 0):
return True
else:
return False
def clouchoux_pits(Vertexes, MCurv, GCurv):
'''Extract pits using Clouchoux's definition
'''
Pits = []
for i in xrange(len(Vertexes)):
if clouchoux(MCurv[i], GCurv[i]):
Pits.append(i)
print len(Pits), "Pits found"
return Pits
def getBasin_and_Pits(Maps, Mesh, SulciVTK, PitsVTK, SulciThld = 0, PitsThld = 0, Quick=False, Clouchoux=False, SulciMap='depth'):
'''Extracting basin and pits (either local minimum approach or Clouchoux's)
Parameters
=============
Maps: dictionary
Keys are map names, e.g., depth or curvatures.
Values are per-vertex maps, e.g., curvature map.
Mesh: 2-tuple of lists
the first list has coordinates of vertexes while the second defines triangles on the mesh
This is a mandatory surface, normally a non-inflated surface.
    SulciVTK: string
        path of the VTK file into which sulci are written
    PitsVTK: string
        path of the VTK file into which pits are written
SulciThld: float
the value to threshold the surface to separate sulci and gyri
PitsThld: float
vertexes deeper than this value can be considered as pits
Quick: Boolean
If true, extract sulci only (no component ID, only thresholding), skipping pits and later fundi.
Clouchoux: Boolean
If true, extract pits using Clouchoux's definition. O/w, local minimum approach.
SulciMap: string
The map to be used to get sulci
by default, 'depth'
'''
def write_surface_with_LUTs(File, Points, Faces, Maps):
"""Like write_scalars in io_vtk but no writing of vertices
"""
print "writing sulci into VTK file:", File
Fp = open(File,'w')
io_vtk.write_header(Fp)
io_vtk.write_points(Fp, Points)
io_vtk.write_faces(Fp, Faces)
if len(Maps) > 0:
# Make sure that LUTs is a list of lists
Count = 0
for LUT_name, LUT in Maps.iteritems():
if Count == 0 :
io_vtk.write_scalars(Fp, LUT, LUT_name)
else:
io_vtk.write_scalars(Fp, LUT, LUT_name, begin_scalars=False)
Count += 1
Fp.close()
return None
def write_pits_without_LUTs(File, Points, Indexes):
"""Like write_scalars in io_vtk but no writing of vertices
"""
print "writing pits into VTK file:", File
Fp = open(File,'w')
io_vtk.write_header(Fp)
io_vtk.write_points(Fp, Points)
io_vtk.write_vertices(Fp, Indexes)
Fp.close()
return None
print "\t thresholding the surface using threshold = ", SulciThld
[Vertexes, Faces] = Mesh
MapBasin = Maps[SulciMap]
Basin, Gyri = basin(Faces, Maps[SulciMap], Threshold = SulciThld)
if not Quick:
LastSlash = len(SulciVTK) - SulciVTK[::-1].find('/')
Hemi = SulciVTK[:SulciVTK[LastSlash:].find('.')+LastSlash]# path up to which hemisphere, e.g., /home/data/lh
VrtxNbr = vrtxNbrLst(len(Vertexes), Faces, Hemi)
FcNbr = fcNbrLst(Faces, Hemi)
FcCmpnt, VrtxCmpnt = compnent(Faces, Basin, FcNbr, ".".join([Hemi, SulciMap, str(SulciThld)]))
CmpntLUT = [-1 for i in xrange(len(MapBasin))]
for CmpntID, Cmpnt in enumerate(VrtxCmpnt):
for Vrtx in Cmpnt:
CmpntLUT[Vrtx] = CmpntID
Maps['CmpntID'] = CmpntLUT
if Clouchoux:
Pits = clouchoux_pits(Vertexes, Maps['meancurv'], Maps['gausscurv'])
else: # local minimum approach
MapPits = Maps[SulciMap] # Users will get the option to select pits extraction map in the future.
Pits, Parts, Child = univariate_pits(MapPits, VrtxNbr, VrtxCmpnt, PitsThld)
Maps['hierarchy'] = Parts
else:
print "\t\t Thresholding the surface to get sulci only."
    Faces = [map(int, i) for i in Faces]  # temporary fix; it won't cause a precision problem because sys.maxint is ~10^18
Vertexes = map(list, Vertexes)
write_surface_with_LUTs(SulciVTK, Vertexes, [Faces[i] for i in Basin], Maps)
if Quick:
sys.exit()
write_pits_without_LUTs(PitsVTK, Vertexes, Pits)
# output tree hierarchies of basal components
# print "writing hierarchies of basal components"
# WetFile = PrefixExtract + '.pits.hier'
# WetP = open(WetFile,'w')
# for LowComp, HighComp in Child.iteritems():
# WetP.write(str(LowComp) + '\t' + str(HighComp) + '\n')
# WetP.close()
# end of output tree hierarchies of basal components
# End of Get pits Forrest 2011-05-30 10:16
# a monolithic code output each component
# Dic = {}
# for CID, Cmpnt in enumerate(FcCmpnt):
# Dic[CID] = len(Cmpnt)
#
# #Dic = sorted(Dic.iteritems(), key= lambda (k,v,) : (v,k))
# Counter = 1
# for CID, Size in sorted(Dic.iteritems(), key=lambda (k,v): (v,k)):
## print Size
# Rank = len(FcCmpnt) - Counter +1
# Fp = open(BasinFile + '.' + SurfFile[-1*SurfFile[::-1].find('.'):] + '.' + str(Rank) +'-th.vtk','w')
# Vertex, Face = fileio.readSurf(SurfFile)
# FundiList = FcCmpnt[CID]
# libvtk.wrtFcFtr(Fp, Vertex, Face, FundiList)
# Fp.close()
# Counter += 1
# a monolithic code output each component
#---------------End of function definitions---------------------------------------------------------------
|
Lot 1: International "Rhapsody" Sterling Silver Flatware.
Lot 2: (2) 19th C. English Sterling Silver Serving Spoons.
Lot 3: 19th C. English Sterling Silver Grape Scissors.
Lot 4: Amboss Rostfrei Stainless Steel Flatware.
Lot 5: 14K Yellow Gold & Diamond Cluster Ring.
Lot 6: 14K Yellow Gold Ring with Red Stone.
Lot 7: 14K White Gold Ring with 3 Diamonds.
Lot 8: 14K Yellow Gold Ring with Lavender Cabochon.
Lot 9: 14K Yellow Gold Heart Ring.
Lot 10: 14K Yellow Gold Ring with Accent Diamonds.
Lot 11: Ladies' Platinum and Diamond Ring.
Lot 12: Pair of 14K Gold & Diamond Cluster Earrings.
Lot 13: 14K Yellow Gold Necklace with Pendant.
Lot 14: 14K Yellow Gold Necklace with Heart Shape Pendant.
Lot 15: 14K Yellow Gold Necklace with 14K Pendant.
Lot 16: 14K Yellow Gold Necklace with 14K & Red Stone Pendant.
Lot 17: Lady's Platinum & Diamond Solitaire Ring.
Lot 18: 14K Yellow Gold Bracelet.
Lot 19: 14K Yellow Gold Tennis Bracelet.
Lot 20: Ladies' Vintage Hamilton Wristwatch in 14K Case.
Lot 21: Gold and Diamond Fancy Dress Ring.
Lot 22: 14K Yellow Gold Bracelet with 3 Blue Stones.
Lot 23: Lady's Omega LadyMatic Wristwatch.
Lot 24: (2) Chinese Carved Jade Pendants.
Lot 25: Le Corbusier LC4 Chaise Lounge by Cassina.
Lot 26: Pair of Industria Argentina Modernist Chairs.
Lot 27: (4) Harry Bertoia Diamond Chairs.
Lot 28: Modernist Brazilian Rosewood Curved Bench / Daybed.
Lot 31: 1999 Harley Davidson XL 1200S Motorcycle.
Lot 32: Nicolas De Stael Limited Edition Lithograph.
Lot 33: (6) Volumes, Herbier De La France, 1784.
Lot 34: Album with Early 20th C. Continental Photographs.
Lot 35: Antiquarian French Book, New Geography of France.
Lot 36: (6) French Vintage Children's Books.
Lot 37: (3) Volumes, Queen Summer or The Journey of the Lily and the Rose.
Lot 38: (2) Antiquarian Children's Books.
Lot 39: One Volume, Die Mark Brandenburg.
Lot 40: Cilicia Tarsos, Satrap Datames (378 - 362 BC), AR Stater.
Lot 41: Sicily, Syracuse AR tetradrachm, (485-478 BC).
Lot 42: Phoenicia Tyre Shekel (126/5 BC - 67/8 AD).
Lot 43: Ionia Ephesos Tetradrachm.
Lot 44: Ancient Celtic Heavy Bronze Bracelet.
Lot 45: Cilicia Tarsos Stater (379 - 374 BC).
Lot 46: Pamphylia Aspendos AR Stater (410 - 385 BC).
Lot 47: Alexander II Russian Porcelain Tea Set for One.
Lot 48: (11) WST Bavaria Porcelain Cabinet Plates.
Lot 49: Group of (35) Copeland Spode Porcelain Rose Plates.
Lot 50: Royal Doulton "Old Colony" Porcelain Dinner Service, 78 pieces.
Lot 51: (2) Sets of Continental Porcelain Plates.
Lot 52: Sokolov Palekh Russian Lacquer Box.
Lot 53: Lebedeva Palekh Russian Lacquer Box.
Lot 54: Vaneshov & Bikova Palekh Russian Lacquer Boxes.
Lot 55: (2) Palekh Russian lacquer boxes.
Lot 56: Kumchavova Fedoskino Russian Lacquer Box.
Lot 57: (2) Myzhnikova Palekh Russian Lacquer Boxes.
Lot 58: Merkulova Fedoskino Russian Lacquer Box.
Lot 59: (2) Palekh Russian Red Lacquer Boxes.
Lot 60: (4) Tonner Limited Edition Dolls.
Lot 61: (4) Tonner Matt O'Neill Dolls.
Lot 62: 2 Volumes, Chinese Woodblock prints, Rong Bao Zhai, 1935.
Lot 63: Chinese Rosewood Altar Table.
Lot 64: Chinese Carved Rosewood Circular Side Table.
Lot 65: Chinese Elm Wood Plank Seat Chair.
Lot 66: Chinese Carved Rectangular Side Table.
Lot 67: (4) Chinese Carved Jade and Hard Stone Items.
Lot 68: South Asian Metal Buddhist Figure.
Lot 69: (2) Chinese Carved Wood Bird Cages.
Lot 70: Liu Xun, Ink and Color on Paper.
Lot 71: (10) Minton Porcelain Dinner Plates.
Lot 72: Shreve & Co. Sterling Silver Trumpet Vase.
Lot 73: (28) Volumes, Works of Charles Dickens, 19th C.
Lot 74: Persian Hand Knotted Rug.
Lot 75: Hand Knotted Oriental Rug.
Lot 76: (2) Ralph Lauren RRL Women's Faded Denim jackets.
Lot 77: (2) Ralph Lauren RRL Men's Faded Denim jackets.
Lot 78: Ralph Lauren Polo Sport Mountain Spa Poncho.
Lot 79: Ralph Lauren Polo Country Wool Knit Sweater.
Lot 80: Ralph Lauren Polo Sport Mountain Spa Poncho.
Lot 81: (2) Western Style Leather Vests, one RRL.
Lot 82: (2) Ralph Lauren Double RL Hard Cover Catalogs.
Lot 83: Caucasian Hand Knotted Rug.
Lot 84: Hand Knotted Persian Rug.
Lot 85: Hand Knotted Persian Rug.
Lot 86: Canada General Services Medal, Fenian Raid.
Lot 87: Pair of 14K Gold & Diamond Earrings.
Lot 88: 14K Yellow Bracelet with Diamond and Blue Stones.
Lot 89: 14K Gold, Diamond & Ruby Bracelet and Earrings.
Lot 90: Indian Gold Pine Cone Necklace.
Lot 91: 14K Yellow Gold Link Bracelet with Diamond Accents.
Lot 92: 14K Yellow Gold Necklace and Blue Stone Pendant.
Lot 93: Indian Gold Pendant Necklace and Earrings.
Lot 94: 21K Yellow Gold Open Work Bracelet.
Lot 95: Indian Gold and Dual Strand Necklace.
Lot 96: Pair of Indian Gold Bangles with Blue Stones.
Lot 97: Indian Gold Necklace with Ruby Accent Pendant.
Lot 98: Pair of Indian Gold Earrings.
Lot 99: Pair of Indian Gold Bangles.
Lot 100: 14K White Gold and Solitaire Diamond Ring.
Lot 101: Luigi Kasimir Color Lithograph.
Lot 102: Luigi Kasimir Color Lithograph.
Lot 103: Bernard Shepro Oil on Board.
Lot 104: Charles Weigel Oil on Board.
Lot 105: John Masefield Watercolor, Seascape.
Lot 106: Leroy Neiman Lithograph.
Lot 107: Luigi Kasimir Artist Proof Lithograph.
Lot 108: T. Sebot Oil on Canvas.
Lot 109: Bernard Shepro Oil on Board.
Lot 110: Luigi Kasimir Artist Proof Lithograph.
Lot 111: Franklin 150HP 6-Cylinder Aircraft Engine in Parts.
Lot 112: (2) Marvel Schebler Aircraft Carburetors.
Lot 113: Group of Aircraft Instruments.
Lot 114: Group: Stinson 108 Aircraft Fuselage Parts and Wheels.
Lot 115: Chelsea Ship's Bell Clock.
Lot 116: (4) Tonner Doll Outfits, Memoirs of a Geisha 2006.
Lot 117: (15) Ashton-Drake Gene Doll Outfits.
Lot 118: (5) Limited Edition Barbie Fashion Model Dolls.
Lot 119: (3) Tonner 2006 Memoirs of a Geisha Dolls.
Lot 120: (15) Ashton-Drake Gene Doll Outfits.
Lot 121: (4) Barbie Pink Label "On Location" Doll Sets.
Lot 122: (8) Tyler Wentworth Collection Doll Outfits.
Lot 123: (5) Limited Edition Barbie Fashion Model Dolls.
Lot 125: Drop Crystal 7-light Chandelier.
Lot 126: Russian Gold and Lacquer Pendant / Brooch.
Lot 127: Russian Filigree Box with Palekh Lacquer Medallion.
Lot 128: Palekh Russian Lacquer Box.
Lot 129: Zverkova Palekh Russian Box.
Lot 130: Vanyashov Palekh Russian Box.
|
"""Modoboa API models."""
from dateutil.relativedelta import relativedelta
from django.db import models
from django.utils import timezone
class ModoboaInstanceManager(models.Manager):
"""Custom manager for ModoboaInstance."""
def active(self):
"""Return active instances (last_request <= 1 month)."""
return self.get_queryset().filter(
last_request__gte=timezone.now() - relativedelta(months=1))
class ModoboaInstance(models.Model):
"""A model to represent a modoboa instance."""
hostname = models.CharField(max_length=255)
ip_address = models.GenericIPAddressField()
known_version = models.CharField(max_length=30)
created = models.DateTimeField(auto_now_add=True)
last_request = models.DateTimeField(auto_now=True)
# Statistics
domain_counter = models.PositiveIntegerField(default=0)
domain_alias_counter = models.PositiveIntegerField(default=0)
mailbox_counter = models.PositiveIntegerField(default=0)
alias_counter = models.PositiveIntegerField(default=0)
# Used extensions
extensions = models.ManyToManyField("ModoboaExtension", blank=True)
objects = ModoboaInstanceManager()
def __str__(self):
return "[{0}] {1} -> {2}".format(
self.ip_address, self.hostname, self.known_version)
class ModoboaExtension(models.Model):
"""A modoboa extension with its latest version."""
name = models.CharField(max_length=255, unique=True)
version = models.CharField(max_length=30)
def __str__(self):
return self.name
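The one-month cutoff in `ModoboaInstanceManager.active()` relies on dateutil's `relativedelta`, which clamps to the last valid day of the target month instead of overflowing, unlike naive "subtract 30 days" arithmetic. A minimal sketch (the dates are illustrative):

```python
from datetime import datetime
from dateutil.relativedelta import relativedelta

now = datetime(2020, 3, 31)
# one calendar month earlier; February 2020 has 29 days, so the
# day is clamped to the 29th rather than spilling into March
cutoff = now - relativedelta(months=1)
```

Instances whose `last_request` falls on or after `cutoff` would be considered active.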
|
Catholic Hospice is a community based not-for-profit organization dedicated to meeting the needs of people in South Florida with a serious illness through optimal service delivery. We incorporate a biopsychosocial-spiritual model of care which involves medical supervision, pain and symptom management, emotional support, and spiritual guidance. We treat the whole person as we emphasize quality of life, instead of duration. For more information, please contact (305) 822-2380.
|
from util import OfcDict
from elements import Colour, ColoursList, LineStyle
from values import ValuesList, ShapePointsList, Value, BarValue
import conf
class SeriesList(list):
colours = conf.colours
value_colours = {
'positive': '#009900',
'negative': '#990000',
'zero': '#000099',
}
def append(self, series):
if series.colorize_series:
series['colour'] = self.colours[len(self)%len(self.colours)]
super(SeriesList, self).append(series)
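The colour assignment in `SeriesList.append()` is a simple round-robin: the i-th appended series takes `colours[i % len(colours)]`, wrapping around when there are more series than palette entries. In isolation (palette values are made up):

```python
colours = ['#ff0000', '#00ff00', '#0000ff']
# the index modulo the palette length cycles through the colours
assigned = [colours[i % len(colours)] for i in range(5)]
```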
class Series(OfcDict):
types = {
'type': str,
'alpha': float,
'colour': Colour,
'gradient-fill': str,
'halo-size': int,
'width': int,
'dot-size': int,
'text': str,
'font-size': int,
'values': ValuesList,
'line-style': LineStyle,
'tip': str,
'no-labels': bool,
'loop': bool,
'on-click': str,
}
value_cls = Value
def __init__(self, dictionary, colorize_series=True):
self.colorize_series = colorize_series
super(Series, self).__init__(dictionary)
class OutlineSeries(Series):
types = {
'outline-colour': Colour
}
# TODO! Keys in bar stacks
class Line(Series):
colors = {
'p': '#009900',
'z': '#000099',
'n': '#990000',
}
COLORIZE_NONE, COLORIZE_NEGATIVES, \
COLORIZE_ZEROS, COLORIZE_ZEROS_NEGATIVES, \
COLORIZE_POSITIVES, COLORIZE_POSITIVES_NEGATIVES, \
        COLORIZE_POSITIVES_ZEROS, COLORIZE_ALL = range(8)
def __init__(self, dictionary, colorize_values=COLORIZE_ALL, **kwargs):
self.colorize_values = colorize_values
dictionary['type'] = dictionary.get('type', 'line')
super(Line, self).__init__(dictionary, **kwargs)
def _process_values(self, values):
#return values
return self.colorized_values(values)
def colorized_values(self, values, colors=None):
if not colors:
colors = self.colors
        colorize_positives = bool(self.colorize_values // 4) and 'p' in colors
        colorize_zeros = bool((self.colorize_values % 4) // 2) and 'z' in colors
        colorize_negatives = bool(self.colorize_values % 2) and 'n' in colors
for k in range(len(values)):
value = values[k]
if isinstance(value, Value):
num_value = float(value['value'])
else:
num_value = float(value)
if num_value < 0:
if colorize_negatives:
values[k] = self.colorize_value(value, colors['n'])
elif num_value > 0:
if colorize_positives:
values[k] = self.colorize_value(value, colors['p'])
else:
values[k] = self.colorize_value(value, colors['z'])
return values
def colorize_value(self, value, color):
if isinstance(value, self.value_cls):
value['colour'] = color
return value
else:
return self.value_cls({'value': value, 'colour': color})
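The `colorize_values` mode decoded in `colorized_values()` is effectively a 3-bit mask: bit 2 selects positives, bit 1 zeros, and bit 0 negatives, matching the `COLORIZE_*` constants (`COLORIZE_NONE == 0` through `COLORIZE_ALL == 7`). A standalone sketch of that decoding:

```python
def decode_colorize_mode(mode):
    # mode is a 3-bit mask; mirror the arithmetic used above
    positives = bool(mode // 4)
    zeros = bool((mode % 4) // 2)
    negatives = bool(mode % 2)
    return positives, zeros, negatives

all_on = decode_colorize_mode(7)    # COLORIZE_ALL
pos_neg = decode_colorize_mode(5)   # COLORIZE_POSITIVES_NEGATIVES
```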
class LineDot(Line):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'line_dot'
super(LineDot, self).__init__(dictionary, **kwargs)
class LineHollow(Line):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'line_hollow'
super(LineHollow, self).__init__(dictionary, **kwargs)
class Bar(Series):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'bar'
super(Bar, self).__init__(dictionary, **kwargs)
class BarFilled(OutlineSeries):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'bar_filled'
super(BarFilled, self).__init__(dictionary, **kwargs)
class BarGlass(Series):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'bar_glass'
kwargs['colorize_series'] = kwargs.get('colorize_series', False)
super(BarGlass, self).__init__(dictionary, **kwargs)
class Bar3d(Series):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'bar_3d'
super(Bar3d, self).__init__(dictionary, **kwargs)
class BarSketch(OutlineSeries):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'bar_sketch'
super(BarSketch, self).__init__(dictionary, **kwargs)
class HBar(Series):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'hbar'
super(HBar, self).__init__(dictionary, **kwargs)
class BarStack(Series):
types = {
'colours': ColoursList,
}
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'bar_stack'
super(BarStack, self).__init__(dictionary, **kwargs)
class AreaLine(Series):
types = {
'fill-alpha': float,
'fill': Colour,
}
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'area_line'
super(AreaLine, self).__init__(dictionary, **kwargs)
class AreaHollow(AreaLine):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'area_hollow'
super(AreaHollow, self).__init__(dictionary, **kwargs)
class Pie(Series):
types = {
'start-angle': int,
'animate': bool,
'colours': ColoursList,
'label-colour': Colour,
}
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'pie'
dictionary['colours'] = conf.colours
super(Pie, self).__init__(dictionary, **kwargs)
class Scatter(Series):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'scatter'
super(Scatter, self).__init__(dictionary, **kwargs)
class ScatterLine(Series):
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'scatter_line'
super(ScatterLine, self).__init__(dictionary, **kwargs)
class Shape(OfcDict):
types = {
'type': str,
'colour': Colour,
'values': ShapePointsList
}
def __init__(self, dictionary, **kwargs):
dictionary['type'] = 'shape'
super(Shape, self).__init__(dictionary, **kwargs)
|
by Guest Contributor Jeff Spross.
Agreement in Denmark’s parliament this week cleared the way for passage of climate targets that would outstrip the recent goals set by the European Union.
The bill would establish a legally binding requirement that Denmark cut its greenhouse gas emissions by 40 percent below 1990 levels by 2020, and that the government return to the question every five years to set new 10-year targets. The legislation would also establish a Climate Council — modelled on a similar body in Britain — to advise the government on the best ways to continue reducing Denmark’s reliance on fossil fuels.
The bill is backed by the Social Democrats, the Conservative People’s Party, the Socialist People’s Party, and the Red-Green Alliance.
Denmark’s present and former governments have already committed the country to a goal of 100 percent renewable energy generation by 2050, and the new bill is seen as a concrete step to achieving that goal.
A 40 percent reduction from 1990 levels by 2020 is on par with carbon emission cuts the National Research Council advised America to take on in 2010. It’s also noticeably more ambitious than the target the European Parliament recently passed — to cut emissions 40 percent below 1990 levels by 2030 — for the European Union as a whole.
The broader EU target remains non-binding until it’s approved by the governments of the individual countries that make up the group. And debate remains on exactly how the target should be divvied up amongst the member states. So Denmark moving forward with more ambitious cuts at the individual level would put it ahead of the curve set by most of its peers on the Continent.
Denmark is also part of the Emissions Trading System (ETS), Europe’s cap-and-trade system for cutting carbon emissions, which will likely serve as the main driver of both Denmark’s reductions and the EU’s as a whole. Unfortunately, the unexpected drop in economic activity from the 2008 recession, along with some inherent design flaws, drove the price of carbon permits under the ETS to remarkable lows. That removed the incentive for firms in Europe to cut their carbon emissions, leaving the entire system stalled in limbo.
The ETS’ problems have served as a learning experience for other, newer cap-and-trade systems like California’s. And reforms are in the works in the EU to get the ETS back on its feet.
Meanwhile, Denmark has already been making substantial progress on the climate front. According to numbers that Responding to Climate Change pulled from the Danish Energy Agency, renewable energy accounted for 43.1 percent of Denmark’s domestic electricity supply in 2012, and for 25.8 percent of all energy consumption in the country that year. The year before that, renewables provided 23.1 percent of Denmark’s electricity consumption.
This article, Denmark Is About To Outstrip European Climate Goals, is syndicated from Clean Technica and is posted here with permission.
|
from os.path import dirname, join
from sqlalchemy import create_engine, MetaData, Table, Column, Integer, \
String, Text
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import create_session, scoped_session
from wtforms import Form, TextField, TextAreaField, validators
from glashammer.application import make_app
from glashammer.utils import local, local_manager, get_app, redirect, \
url_for, run_very_simple, render_response, Response
FOLDER = dirname(__file__)
Base = declarative_base()
metadata = Base.metadata
db = scoped_session(lambda: create_session(local.application.sqla_db_engine,
autocommit=False), local_manager.get_ident)
# Table, Class and Mapper - declarative style
class Note(Base):
""" Represents a note """
__tablename__ = 'notes'
id = Column(Integer, primary_key=True)
title = Column(String(150))
note = Column(Text)
importance = Column(String(20))
def __init__(self, title, text, importance=None):
self.title = title
self.note = text
self.importance = importance
# The form
class NotesForm(Form):
"""Add/edit form for notes"""
title = TextField(u'Title:', [validators.length(min=4, max=150)])
note = TextAreaField(u'Note:', [validators.length(min=4, max=500)])
importance = TextField(u'Importance:')
# The views
def index_view(req):
notes = db.query(Note).order_by(Note.id.desc()).all()
form = NotesForm(req.form)
return render_response('notes_index.jinja', notes=notes, form=form)
def add_edit_view(req, nid=None):
if nid is None:
form = NotesForm(req.form)
# Validate form
if req.method == 'POST' and form.validate():
# No validation errors, save note and redirect to success page
note = Note(
req.form.get('title'),
req.form.get('note'),
req.form.get('importance')
)
db.add(note)
db.commit()
return redirect(url_for('example/success'))
return render_response('notes_add.jinja', form=form)
else:
# Find note
note = db.query(Note).get(nid)
# Check if note exists
if note is None:
return Response('Not Found', status=404)
# Form with values
form = NotesForm(req.form,
title = note.title,
note = note.note,
importance = note.importance
)
# Validate form
if req.method == 'POST' and form.validate():
# No validation errors, update note and redirect to success page
note.title = req.form.get('title')
note.note = req.form.get('note')
note.importance = req.form.get('importance')
db.add(note)
db.commit()
return redirect(url_for('example/success'))
return render_response('notes_edit.jinja', note=note, form=form)
def add_success_view(req):
return render_response('notes_success.jinja')
# Setup
def _get_default_db_uri(app):
db_file = join(app.instance_dir, 'gh.sqlite')
return 'sqlite:///' + db_file
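The URI built by `_get_default_db_uri()` follows SQLAlchemy's SQLite convention: `'sqlite:///'` plus an absolute path yields four slashes in total. A self-contained sketch (the directory name is illustrative, and a POSIX path separator is assumed):

```python
from os.path import join

def sqlite_uri(instance_dir, name='gh.sqlite'):
    # 'sqlite:///' + '/abs/path' -> 'sqlite:////abs/path'
    return 'sqlite:///' + join(instance_dir, name)

uri = sqlite_uri('/var/app')
```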
def setup(app):
# Setting up our database
app.add_config_var('sqla_db_uri', str, _get_default_db_uri(app))
app.sqla_db_engine = create_engine(app.cfg['sqla_db_uri'],
convert_unicode=True)
metadata.bind = app.sqla_db_engine
# Function to be run during data setup phase
app.add_data_func(init_data)
# Add the template searchpath
app.add_template_searchpath(FOLDER)
# Add bundles
from glashammer.bundles.htmlhelpers import setup_htmlhelpers
app.add_setup(setup_htmlhelpers)
# Urls
app.add_url('/', 'example/index', view=index_view)
app.add_url('/add', 'example/add', view=add_edit_view)
app.add_url('/add/success', 'example/success', view=add_success_view)
app.add_url('/edit/<int:nid>', 'example/edit', view=add_edit_view)
# Static files
app.add_shared('files', join(FOLDER, 'static'))
def init_data(app):
engine = get_app().sqla_db_engine
metadata.create_all(engine)
# Used by gh-admin
def create_app():
return make_app(setup, FOLDER)
if __name__ == '__main__':
app = create_app()
run_very_simple(app)
|
“These markets are particularly important to us, as 40 percent of Uber Eats customers in these areas are part of smaller towns,” said Kiran Vinta, head of launch and expansion at Uber Eats U.S.
Starting this year, the company introduced a “self-signup” option, letting restaurants apply for consideration to be added to the network. Uber Eats also reaches out to restaurants to include them.
Uber’s popular rideshare service helps establish a foundation upon which the food delivery can grow, Vinta said. Many Uber drivers drive for both services, he said.
“Restaurants in general want new customers, so as they come to us they want to be able to tap into the large userbase we have and the marketing power of our app,” Vinta said.
Uber Eats was founded in 2014.
|
# <copyright>
# (c) Copyright 2017 Hewlett Packard Enterprise Development LP
#
# This program is free software: you can redistribute it and/or modify it
# under the terms of the GNU General Public License as published by the
# Free Software Foundation, either version 3 of the License, or (at your
# option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General
# Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# </copyright>
class Setting:
def __init__(self, key, default, description, isFlag, short=None):
self.key = key
self.value = default
self.description = description
self.isFlag = isFlag
self.short = short
if short is None:
self.short = description
    def keys(self):
        if hasattr(self.value, "keys"):
            return self.value.keys()
        return []
def __getitem__(self, key):
return self.value[key].value
def __setitem__(self, key, value):
self.value[key].value = value
def getObject(self, key):
return self.value[key]
class Settings:
def initSettings(self, root, settingsDict):
# Settings are in the form [default value, help text, isFlag, short text]
root.value = {}
        for (key, value) in settingsDict.iteritems():
            if not isinstance(value, dict):
                if len(value) == 3:
                    root.value[key] = Setting(
                        key, value[0], value[1], value[2])
                else:
                    root.value[key] = Setting(
                        key, value[0], value[1], value[2], value[3])
            else:
                # nested settings: create a container Setting, then recurse
                root.value[key] = Setting(key, {}, key, False)
                self.initSettings(root.value[key], value)
def __init__(self, settingsSeed):
self.allsettings = Setting("<root>", {}, "Root", False, "Root")
self.initSettings(self.allsettings, settingsSeed)
def __getitem__(self, key):
return self.lookupSetting(key).value
def __setitem__(self, key, value):
try:
self.lookupSetting(key).value = value
        except KeyError:
            keylist = key.split(':')
            parent = ':'.join(keylist[:-1])
            self.lookupSetting(parent).value[keylist[-1]] = Setting(
                key, value, "", False)
def getObject(self, key):
return self.lookupSetting(key)
def keys(self):
return self.allsettings.value.keys()
def lookupSetting(self, key):
if len(key) == 0:
return self.allsettings
keylist = key.split(':')
keypath = keylist[:-1]
keyelem = keylist[-1]
current = self.allsettings
for part in keypath:
current = current.value[part]
return current.value[keyelem]
def getDescription(self, key):
setting = self.lookupSetting(key)
return setting.description
def isFlag(self, key):
setting = self.lookupSetting(key)
return setting.isFlag
def appendSettings(self, key, newSettings):
"""This will add settings for your CLI to use:
newSettings will be in a
dict(-name-, list(-default-, -description-, -flag-))
form
Where:
name is the option from the command line, file, or json input
default is the default value
description is the description of the setting
flag is a True/False boolean
True means that it doesn't expect a parameter on the CLI
False means a parameter is expected
"""
if len(key) == 0:
self.initSettings(self.allsettings, newSettings)
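The colon-separated key traversal performed by `lookupSetting()` can be shown standalone over a plain nested dict (the tree contents are example data only):

```python
def lookup(root, key):
    # walk the nested mapping one colon-separated segment at a time
    node = root
    for part in key.split(':'):
        node = node[part]
    return node

tree = {'logging': {'level': 'info', 'file': {'path': '/tmp/app.log'}}}
level = lookup(tree, 'logging:level')
path = lookup(tree, 'logging:file:path')
```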
|
Louis de Blois, or as he was more often called, Ludovicus Blosius, originally trained for court life but left at the age of fourteen to pursue a life in the monastery of Liesse, becoming its abbot by the age of twenty-four. There are several accounts of how he left court: some say that he left by choice, others that it was the result of an accident for which he needed an operation. He was asked by the surgeons what shape he would prefer the incision to be and unexpectedly opted for a cross. This was taken as a sign and the rest, as they say, was history. Now, I don’t know if that was true, but it seemed too good a story not to note down.
He is best remembered now as a prolific and greatly learned author whose various religious works spread across Europe, impressing both devout Roman Catholic scholars and more secular laymen such as Gladstone and Coleridge.
He was an extremely pious man and did not seem to suffer from ambition or pride. Indeed, he stoutly refused Charles V, who offered him the prestigious archbishopric of Cambrai, desiring instead to remain peacefully at his monastery until his death in 1566, ensuring its continued reform and its journey away from the greed and laxity of the past.
|
import glob
import operator
import os
from magichour.api.local.modelgen.preprocess import log_cardinality
from magichour.api.local.sample.steps.evalapply import evalapply_step
from magichour.api.local.sample.steps.event import event_step
from magichour.api.local.sample.steps.genapply import genapply_step
from magichour.api.local.sample.steps.genwindow import genwindow_step
from magichour.api.local.sample.steps.preprocess import preprocess_step
from magichour.api.local.sample.steps.template import template_step
from magichour.api.local.util.log import get_logger, log_time
from magichour.api.local.util.namedtuples import strTimedEvent
from magichour.api.local.util.pickl import write_pickle_file
from magichour.validate.datagen.eventgen import auditd
logger = get_logger(__name__)
def get_auditd_templates(auditd_templates_file):
"""
Helper function to read in auditd type to id mapping and return lookup dictionary
"""
    auditd_templates = {}
    with open(auditd_templates_file) as f:
        for line in f:
            type_field, template_id = line.rstrip().split(',')
            auditd_templates[type_field] = int(template_id)
    return auditd_templates
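For illustration, the mapping file is a two-column CSV of `<auditd type>,<numeric id>`; the sample rows below are hypothetical:

```python
# Hypothetical sample of the auditd templates CSV parsed by
# get_auditd_templates: each line is "<auditd type>,<numeric id>".
sample = "SYSCALL,1\nEXECVE,2\nPATH,3"

mapping = {}
for line in sample.splitlines():
    type_field, template_id = line.rstrip().split(',')
    mapping[type_field] = int(template_id)

print(mapping)  # {'SYSCALL': 1, 'EXECVE': 2, 'PATH': 3}
```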
@log_time
def run_pipeline(options):
read_lines_kwargs = {'transforms_file': options.transforms_file,
'gettime_auditd': options.auditd,
'type_template_auditd': options.auditd_templates_file,
'ts_start_index': options.ts_start_index,
'ts_end_index': options.ts_end_index,
'ts_format': options.ts_format,
'skip_num_chars': options.skip_num_chars,
'mp': options.mp, }
loglines = []
log_files = []
if options.data_file:
log_files.append(options.data_file)
if options.data_dir:
log_files.extend(glob.glob(os.path.join(options.data_dir, '*')))
if not log_files or (not options.data_file and not options.data_dir):
raise RuntimeError('No input specified/available')
for log_file in log_files:
loglines.extend(
preprocess_step(
log_file,
**read_lines_kwargs))
# count cardinality; print unique lines if verbose and there are actually
# transforms to apply
log_cardinality(loglines,
get_item=operator.attrgetter('processed'),
item_title='Transform',
verbose=options.verbose and options.transforms_file)
if options.save_intermediate:
transformed_lines_file = os.path.join(
options.pickle_cache_dir, "transformed_lines.pickle")
write_pickle_file(loglines, transformed_lines_file)
if read_lines_kwargs.get('type_template_auditd'):
# Read in auditd template definitions
templates = get_auditd_templates(options.auditd_templates_file)
else:
# Generate templates
if options.template_gen == 'logcluster':
logcluster_kwargs = {"support": str(options.template_support)}
templates = template_step(
loglines, "logcluster", **logcluster_kwargs)
elif options.template_gen == 'stringmatch':
templates = template_step(loglines, "stringmatch") # WIP
else:
raise NotImplementedError(
'%s Template generation method not implemented' %
options.template_gen)
if options.save_intermediate:
templates_file = os.path.join(
options.pickle_cache_dir, "templates.pickle")
write_pickle_file(templates, templates_file)
log_cardinality(templates,
item_key=operator.attrgetter('id'),
item_title='Template',
verbose=options.verbose)
timed_templates = genapply_step(loglines, templates, **read_lines_kwargs)
if options.save_intermediate:
timed_templates_file = os.path.join(
options.pickle_cache_dir, "timed_templates.pickle")
write_pickle_file(timed_templates, timed_templates_file)
modelgen_windows = genwindow_step(timed_templates,
window_size=options.gwindow_time,
tfidf_threshold=options.gtfidf_threshold)
if options.save_intermediate:
modelgen_windows_file = os.path.join(
options.pickle_cache_dir, "modelgen_windows.pickle")
write_pickle_file(modelgen_windows, modelgen_windows_file)
if options.event_gen == 'fp-growth':
fp_growth_kwargs = {
"min_support": options.min_support,
"iterations": options.iterations,
"tfidf_threshold": options.tfidf_threshold}
gen_events = event_step(
modelgen_windows,
"fp_growth",
**fp_growth_kwargs)
elif options.event_gen == 'paris':
paris_kwargs = {
"r_slack": options.r_slack,
"num_iterations": options.num_iterations,
"tau": options.tau}
gen_events = event_step(
modelgen_windows,
"paris",
**paris_kwargs) # WIP
elif options.event_gen == 'glove':
glove_kwargs = {
'num_components': options.num_components,
'glove_window': options.glove_window,
'epochs': options.epochs}
gen_events = event_step(
modelgen_windows,
"glove",
verbose=options.verbose,
**glove_kwargs)
elif options.event_gen == 'auditd':
# ignore timed_templates and modelgen_window and pass templates to
# auditd-specific event generator
gen_events = auditd.event_gen(templates)
else:
raise NotImplementedError('%s Not implemented' % options.event_gen)
if options.save_intermediate:
events_file = os.path.join(options.pickle_cache_dir, "events.pickle")
write_pickle_file(gen_events, events_file)
logger.info("Discovered events: %d" % len(gen_events))
if options.verbose:
# Print events and their templates
if read_lines_kwargs.get('type_template_auditd'):
template_list = [(templates[template], template)
for template in templates]
else:
template_list = [(template.id, template) for template in templates]
template_d = {
template_id: template for (
template_id,
template) in template_list}
e = []
for event in sorted(gen_events, key=lambda event: event.id):
ts = ["event_id: %s" % event.id]
for template_id in sorted(event.template_ids):
ts.append("%s: %s" % (template_id, template_d[template_id]))
e.append(ts)
from pprint import pformat
logger.info("\n" + pformat(e))
# compute how many times each template was used (i.e. how many events
# each template appears in)
event_templates = (
template_d[template_id] for event in gen_events for template_id in event.template_ids)
log_cardinality(
event_templates,
item_title='EventTemplate',
item_key=operator.attrgetter('id'),
verbose=options.verbose)
timed_events = evalapply_step(
gen_events,
timed_templates,
window_time=options.awindow_time,
mp=options.mp)
if options.save_intermediate:
timed_events_file = os.path.join(
options.pickle_cache_dir, "timed_events.pickle")
write_pickle_file(timed_events, timed_events_file)
logger.info("Timed events: %d" % len(timed_events))
log_cardinality(
timed_events,
item_title='TimedEvent',
get_item=operator.attrgetter('event_id'),
verbose=options.verbose)
if options.verbose > 1:
# Print timed event summary for -vv
# sort timed_templates in ascending time order
for te in timed_events:
te.timed_templates.sort(key=lambda tt: tt.ts)
        if options.sort_events_key == 'time':
            # sort timed events in ascending time order (of their first
            # occurring timed_template)
            def timed_event_key(te):
                return te.timed_templates[0].ts
        else:
            # sort timed events by event id, then by time order
            def timed_event_key(te):
                return (te.event_id, te.timed_templates[0].ts)
timed_events.sort(key=timed_event_key)
e = []
for event in timed_events:
s = strTimedEvent(event)
e.append(s)
logger.info("\n" + pformat(e))
logger.info("Done!")
def main():
from argparse import ArgumentParser
import sys
logger.info('args: %s', ' '.join(sys.argv[1:]))
# NOTE: parser.add_argument() default value: default=None
parser = ArgumentParser()
parser.add_argument(
'-f',
'--data-file',
dest="data_file",
help="Input log file")
parser.add_argument(
'-d',
'--data-dir',
dest="data_dir",
help="Input log directory")
parser.add_argument(
'-t',
'--transforms-file',
dest="transforms_file",
help="Transforms mapping file")
parser.add_argument(
'--template-gen',
choices=[
'logcluster',
'stringmatch'],
required=True)
parser.add_argument(
'--event-gen',
choices=[
'fp-growth',
'paris',
'glove',
'auditd'],
required=True)
source_args = parser.add_argument_group('Source-specific Parameters')
source_args.add_argument(
'--auditd',
default=False,
help='Input is auditd logs',
action="store_true") # for now, this just means read auditd-timestamps
source_args.add_argument(
'--auditd_templates_file',
dest="auditd_templates_file",
help="CSV Mapping Auditd types to ids (if not specified Templates will be auto-generated)")
    source_args.add_argument(
        '--skip_num_chars',
        default=0,
        type=int,
        help='skip characters at beginning of each line')
    source_args.add_argument(
        '--ts_start_index',
        default=0,
        type=int,
        help='start of timestamp (after skipping)')
    source_args.add_argument(
        '--ts_end_index',
        default=12,
        type=int,
        help='end of timestamp (after skipping)')
source_args.add_argument(
'--ts_format',
help='datetime.strptime() format string')
control_args = parser.add_argument_group('General Control Parameters')
    control_args.add_argument(
        '-w',
        '--gwindow_time',
        default=60,
        type=int,
        help='Event model generate window size (seconds)')
    control_args.add_argument(
        '--gtfidf_threshold',
        default=None,  # default = don't apply tfidf to model generation
        type=float,
        help='Event model generation tf_idf threshold')
    control_args.add_argument(
        '--awindow_time',
        default=60,
        type=int,
        help='Event application window size (seconds)')
    control_args.add_argument(
        '--template-support',
        default=50,
        type=int,
        help='# occurrences required to generate a Template')
control_args.add_argument(
'--mp',
default=False,
help='Turn on multi-processing',
action='store_true')
fp_growth_args = parser.add_argument_group('FP-Growth Control Parameters')
    fp_growth_args.add_argument('--min_support', default=0.03, type=float, help='?')
    fp_growth_args.add_argument(
        '--iterations',
        default=-1,
        type=int,
        help='Number of itemsets to produce (-1 == all)')
    fp_growth_args.add_argument('--tfidf_threshold', default=0, type=float, help='?')
paris_args = parser.add_argument_group('PARIS Control Parameters')
    paris_args.add_argument(
        '--r_slack',
        default=0,
        type=float,
        help='cost function parameter')
    paris_args.add_argument('--num_iterations', default=3, type=int, help='?')
    paris_args.add_argument(
        '--tau',
        default=1.0,
        type=float,
        help='cost function parameter')
glove_args = parser.add_argument_group('Glove Control Parameters')
    glove_args.add_argument(
        '--num_components',
        default=16,
        type=int,
        help='?')
    glove_args.add_argument(
        '--glove_window',
        default=10,
        type=int,
        help='?')
    glove_args.add_argument(
        '--epochs',
        default=20,
        type=int,
        help='?')
optional_args = parser.add_argument_group('Debug Arguments')
optional_args.add_argument(
'-v',
'--verbose',
dest='verbose',
default=False,
action="count",
help="Print definitions")
optional_args.add_argument(
'--sort-events-key',
choices=[
'time',
'event'],
default='time',
help="Sort events by time or event-id.")
optional_args.add_argument(
"--save-intermediate",
dest="save_intermediate",
default=False,
action="store_true",
help="Save intermediate files which may result in large files")
optional_args.add_argument(
'--pickle_dir',
dest="pickle_cache_dir",
help="Directory for intermediate files")
options = parser.parse_args()
run_pipeline(options)
if __name__ == "__main__":
main()
|
Worley, now in his fourth year at the helm of GWF, led the organization’s efforts to pass the Georgia Outdoor Stewardship Amendment. The amendment, which passed in November 2018 with overwhelming support from voters, provides a minimum of ten years of dedicated funding for conservation and is touted as a game-changer for conservation in the state.
Wrigley, entering his third year as Chancellor of the University System of Georgia, oversees 26 colleges and universities. A lifelong sportsman and avid wingshooter, Wrigley has served multiple terms on the GWF board, with the most recent beginning in 2015. His commitment to the core mission of GWF, along with his vast network of contacts throughout the state, has made him a great asset to the Federation.
Read more at Georgia Trend.
|
# encoding: utf-8
import os
from django.utils.translation import ugettext_lazy as _
from django.db import models
from django.conf import settings
from django.core.files.storage import FileSystemStorage
from django_extensions.db.fields import *
from django.db.models.signals import post_delete, pre_save
from django.dispatch import receiver
from taggit.managers import TaggableManager
from filer.fields.image import FilerImageField
class Image(models.Model):
uuid = UUIDField()
created = CreationDateTimeField()
updated = ModificationDateTimeField()
image = FilerImageField(blank=True, null=True)
caption = models.TextField(_('caption'), blank=True, null=True)
link = models.URLField(_('link'), blank=True, null=True)
gallery = models.ForeignKey('Gallery', related_name='images', null=True, blank=True)
class Meta:
verbose_name = _('image')
verbose_name_plural = _('images')
db_table = 'gallery_images'
ordering = ('-created',)
get_latest_by = 'created'
def __unicode__(self):
return self.image.file.name
def image_tag(self):
return u'<img src="%s" width="400px" />' % (settings.FILES_URL + self.image.file.name)
image_tag.short_description = 'Image'
image_tag.allow_tags = True
class Gallery(models.Model):
uuid = UUIDField()
created = CreationDateTimeField()
updated = ModificationDateTimeField()
name = models.CharField(max_length=1024)
class Meta:
verbose_name = _('gallery')
verbose_name_plural = _('galleries')
db_table = 'gallery_galleries'
ordering = ('-created',)
def __unicode__(self):
return self.name
    @property
    def image_list(self):
        # NOTE: naming this property `images` would shadow the reverse
        # manager created by related_name='images' on Image and recurse
        # infinitely, so it gets a distinct name.
        return self.images.all()
|
It's interesting how life gives you unexpected surprises. I remember visiting Mount Rushmore for the first time when I was about twelve years old. Never did I imagine that I'd be writing books about it!
It all started back in 2004. A woman from the Mount Rushmore Society (who ended up being my editor) asked me if I knew of anyone who might be interested in writing a children's book about the memorial.
So far, I've written four books about Mount Rushmore. Who Carved the Mountain? The Story of Mount Rushmore is a picture book with beautiful illustrations by Renee Graef. Face to Face with Mount Rushmore is a book for older kids and includes tons of questions to help readers think about Mount Rushmore from several perspectives.
I also love to answer questions about Mount Rushmore, so if there's something you'd like to know that hasn't been covered, please shoot me an email.
|
# -*- coding: utf-8 -*-
""" Neo4j GraphDB flask connector """
import socket
import neo4j
from neomodel import db, config
from flask_ext import BaseExtension, get_logger
from rapydo.utils.logs import re_obscure_pattern
log = get_logger(__name__)
class NeomodelClient():
def __init__(self, db):
self.db = db
    def cypher(self, query):
        """ Execute normal neo4j queries """
        try:
            results, meta = db.cypher_query(query)
        except Exception as e:
            raise Exception(
                "Failed to execute Cypher Query: %s\n%s" % (query, str(e)))
        # log.debug("Graph query.\nResults: %s\nMeta: %s" % (results, meta))
        return results
class NeoModel(BaseExtension):
def set_connection_exception(self):
return (
socket.gaierror,
neo4j.bolt.connection.ServiceUnavailable
)
def custom_connection(self, **kwargs):
if len(kwargs) > 0:
variables = kwargs
else:
variables = self.variables
self.uri = "bolt://%s:%s@%s:%s" % \
(
# User:Password
variables.get('user', 'neo4j'),
variables.get('password'),
# Host:Port
variables.get('host'),
variables.get('port'),
)
log.very_verbose("URI IS %s" % re_obscure_pattern(self.uri))
config.DATABASE_URL = self.uri
# Ensure all DateTimes are provided with a timezone
# before being serialised to UTC epoch
config.FORCE_TIMEZONE = True # default False
db.url = self.uri
db.set_connection(self.uri)
client = NeomodelClient(db)
return client
# return db
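The URI assembly in custom_connection can be sketched standalone; the credential values below are hypothetical:

```python
# Standalone sketch of the bolt URI assembly performed in
# custom_connection above; these credential values are made up.
variables = {'user': 'neo4j', 'password': 'secret',
             'host': 'localhost', 'port': 7687}

uri = "bolt://%s:%s@%s:%s" % (
    variables.get('user', 'neo4j'),   # User
    variables.get('password'),        # Password
    variables.get('host'),            # Host
    variables.get('port'),            # Port
)
print(uri)  # bolt://neo4j:secret@localhost:7687
```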
|
The InTouch Level “T” Series is built on RIO Products’ ConnectCore technology.
RIO Products is continuing to expand its offerings built with ConnectCore technology by introducing the InTouch Level “T” Series.
They are fast-sinking level tips that use tungsten powder for density, offering a supple tip that won’t kink yet sinks ‘like a rock’. Because they are built on RIO’s ultra-low-stretch ConnectCore, the tips are extremely sensitive to soft grabs, ensuring fast, solid hook sets.
Each tip is colour-coded for quick identification: the T-8 in deep red, the T-11 in dark green, the T-14 in dark blue, the T-17 in gray and the T-20 in black.
The tips come in 30-foot sections with both a front and back welded loop for quick changes, and in a 500-foot bulk spool designed to be cut and turned to suit individual fishing and casting requirements.
A new, easy-to-weld coating allows anglers and retailers to make fast, neat welded loops for each fishing situation.
The 30-foot tips have an MSRP of $29.95, and the 500-foot spool retails at $249.95.
|
#!/usr/bin/python
import urllib2
import urllib
import re
from BeautifulSoup import BeautifulSoup
import requests
# from stegano import slsb
import random
import os
import json
#gets pic from reddit.com/r/pics and saves as temp.png
def getpic():
    page = BeautifulSoup(urllib2.urlopen("http://www.reddit.com/r/pics/new"))
    i = re.findall('(?<=<a class="thumbnail " href=")([^"]*)', str(page))
    if i:
        # bound the random index by the number of matches found
        num = random.randint(0, len(i) - 1)
        url = i[num]
#adds jpg to end of file if need be
if url.find(".jpg") == -1 and url.find(".png") == -1:
url = url + ".jpg"
print url
r = urllib2.urlopen(url)
f = open('images/temp.jpg', 'wb+')
f.write(r.read())
f.close()
else:
print "shits broke yo"
#stegs some stuff into image and saves image
def stegstuffs():
os.system("steghide embed -ef cryptmessage.txt -cf images/temp.jpg -p asd\n 2>errors")
    # 'rw' is not a valid file mode; read the log, then clear it if needed
    f = open('errors', 'r')
    contents = f.read()
    f.close()
    if "steghide" in contents:
        open('errors', 'w').close()  # truncate the error log
        raise Exception("lol error")
#os.system("steghide extract -sf images/temp.jpg -p asd\n")
#message = "(signature) #Internal intergrity check\n(signature of previous instruction)\n(signature of next instructions):(signature):(signature):(signature):....\n(pool index):(pool boolean) #Experimental pool finder and adhoc'r\n\n(Next location):(location):(location):(location):(location):(location):(location):...\n(previous instruction location) #doubly linked list\n(ghost pool) #backup pool also triply linked list :)\n\n(instructions)#one per line\nS #A single Captial S represents beginning of instruction set\n(1instruction)\n(2instruction)\n(3instruction)\n(4instruction)\n ...\nE #A single Captial E represents end of instruction set\n\n(key to decrypt next message) # optional can help obstuficate and make the process more difficult"
#secret = slsb.hide("images/temp.jpg", message)
#secret.save("images/tempsteg.jpg")
#datagram = str(slsb.reveal("images/tempsteg.jpg"))
#uploads image to imgur (need to figure out how imgur.com/upload post stuff works)
class banned(Exception):
def __init__(self, value):
self.value = value
def __str__(self):
return repr(self.value)
def uploadpic():
url = 'http://imgur.com/upload'
payload = {'file': open('images/temp.jpg', 'rb')}
headers = {'Host': 'imgur.com',
'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64; rv:19.0) Gecko/20100101 Firefox/19.0',
'Accept':'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
'Accept-Language': 'en-US,en;q=0.5',
'Accept-Encoding': 'gzip, deflate',
'DNT': '1',
'Connection': 'keep-alive'}
r = requests.post(url, files=payload, headers=headers)
if str(r.text).find("Exception_OverUploadLimits") > 1:
raise banned(1)
test = json.loads(r.text)
print test['data']['hash']
def encrypt_RSA(key, message):
'''
param: public_key_loc Path to public key
param: message String to be encrypted
return base64 encoded encrypted string
'''
from M2Crypto import RSA, BIO
#key = open(public_key_loc, "r").read()
pubkey = str(key).encode('utf8')
bio = BIO.MemoryBuffer(pubkey)
rsa = RSA.load_pub_key_bio(bio)
encrypted = rsa.public_encrypt(message, RSA.pkcs1_oaep_padding)
return encrypted.encode('base64')
def hash_and_encrypt_Mod(message):
"""
Encrypts datagram payload. Also prepends hash of plaintext for integrity checking.
"""
import hashlib
s = hashlib.sha512()
s.update(message)
hash = s.hexdigest()
fo = open("rootkeys/public.txt", "rb")
publickey = fo.read()
fo.close()
crypttext = encrypt_RSA(publickey,message)
fo = open("cryptmessage.txt","wb")
fo.write(hash + crypttext)
fo.close()
return crypttext
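The integrity framing in hash_and_encrypt_Mod — a SHA-512 hex digest of the plaintext prepended to the ciphertext — can be sketched with the standard library alone (the RSA step is omitted here):

```python
# Stdlib-only sketch of the integrity framing used by hash_and_encrypt_Mod:
# a SHA-512 hex digest of the plaintext is prepended to the payload.
import hashlib

message = b"example datagram payload"
digest = hashlib.sha512(message).hexdigest()  # 128 hex characters
framed = digest + message.decode()

# A receiver splits off the first 128 characters and re-hashes to verify.
received_hash, received_body = framed[:128], framed[128:]
assert received_hash == hashlib.sha512(received_body.encode()).hexdigest()
print("integrity check passed")
```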
while True:
try:
getpic()
except:
continue
break
while True:
try:
example_datagram = '''{
"prev_sig":"(signature of previous instruction)",
"pool_num":"(pool number)",
"instrs":"c3R1ZmYgPSAnaGVsbG8gd29ybGQnDQppZiBzdHVmZls6Nl0gPT0gJ2hlbGxvJzoNCiAgICBwcmludCAnaXQgaXMgY29vbCcNCmVsc2U6DQogICAgcHJpbnQgJ2dvZGRhbWl0Jw==",
"maint_instrs":"c3R1ZmYgPSAnaGVsbG8gd29ybGQnDQppZiBzdHVmZls6Nl0gPT0gJ2hlbGxvJzoNCiAgICBwcmludCAnaXQgaXMgY29vbCcNCmVsc2U6DQogICAgcHJpbnQgJ2dvZGRhbWl0Jw==",
"next_key":"(keyifweneedit)",
"next_locs":[
[{"check_time":"(EPOCH time)"}, {"loc":"(location)"},{"sig":"(signature)"}],
[{"check_time":"(EPOCH time)"}, {"loc":"(location)"},{"sig":"(signature)"}],
[{"check_time":"(EPOCH time)"}, {"loc":"(location)"},{"sig":"(signature)"}],
[{"check_time":"(EPOCH time)"}, {"loc":"(location)"},{"sig":"(signature)"}]
],
"prev_loc":"(previous location)",
"ghost_pool_locs":[
{"loc":"(location)"},
{"loc":"(location)"}
],
"gen_time":"1395082897"
}
'''
hash_and_encrypt_Mod(example_datagram)
stegstuffs()
except:
continue
break
while True:
    try:
        uploadpic()
    except banned:
        # no point retrying while over the upload limit
        print "we got banned yo"
        break
    except:
        continue
    break
#cleansup
#os.remove("images/temp.png")
#os.remove("images/tempsteg.png")
|
This photo is a JPEG with a resolution of 637 x 1203 pixels and a file size of only 130 KB.
This backyard has a natural area that is typically used as a garden, planted with various kinds of plants that add decorative value to the property and can create a stunning effect. In the latest home designs, the garden usually covers two areas: the front and the back of the house.
Each section can be maximized into a lovely, appealing garden with its own features and functions, designed to the needs of each home. Wildlife is one part of the backyard that can make the whole house look more beautiful and gorgeous. Unfortunately, there are still many people who give little thought to designing the backyard, so the home looks less lovely and appealing from the outside.
To create a modern front garden, there are several appealing ideas you can use, so the garden is not just a green place for plants to grow well but also provides good aesthetic value to the front of the home, becoming an added benefit to the house with its naturalness.
|
# This file is subject to the terms and conditions of the GPLv3 (see file 'LICENSE' as part of this source code package)
u"""
This file holds all the functions and types necessary for the aggregate likelihood model.
"""
__author__ = "code@fungs.de"
from .. import common, types
import numpy as np
from sys import argv, exit, stdin, stdout, stderr
class AggregateData(list): # TODO: rename CompositeData
def __init__(self, *args, **kwargs):
super(AggregateData, self).__init__(*args, **kwargs)
# try:
# self.sizes = np.asarray(sizes, dtype=self.size_type)
# except TypeError:
# self.sizes = np.fromiter(sizes, dtype=self.size_type)[:, np.newaxis]
def deposit(self, features):
# self.names.append(name)
for d, f in zip(self, features):
# print(d, f)
d.deposit(f)
def prepare(self):
for d in self:
d.prepare()
return self
@property
def num_data(self):
if not super(AggregateData, self).__len__():
return 0
# print(self, file=stderr)
num = self[0].num_data
assert num == len(self[0])
for l in self[1:]:
assert(l.num_data == num)
return num
@property
def num_features(self):
return super(AggregateData, self).__len__()
size_type = types.seqlen_type
class AggregateModel(list): # TODO: rename CompositeModel, implement update() and maximize_likelihood()
def __init__(self, *args, **kw):
super(AggregateModel, self).__init__(*args, **kw)
self.beta_correction = 1.0
@property
def names(self):
if len(self) > 1:
component_names = list(zip(*[m.names for m in self]))
return [",".join(t) for t in component_names]
return self[0].names
@property
def num_components(self): # transitional
if not len(self):
return 0
cluster_num = self[0].num_components
assert np.all(np.equal(cluster_num, [model.num_components for model in self[1:]]))
return cluster_num
def log_likelihood(self, data):
#assert self.weights.size == len(self)
ll_scale = np.asarray([m.stdev if m.stdev > 0.0 else 0.1 for m in self]) # stdev of zero is not allowed, quick workaround!
ll_weights = self.beta_correction*(ll_scale.sum()/ll_scale.size**2)/ll_scale
ll_per_model = np.asarray([w*m.log_likelihood(d) for (m, d, w) in zip(self, data, ll_weights)]) # TODO: reduce memory usage, de-normalize scale
s = np.mean(np.exp(ll_per_model), axis=1) # TODO: remove debug calculations
l = np.sum(ll_per_model, axis=1, dtype=types.large_float_type) # TODO: remove debug calculations
for m, mvec, lvec in zip(self, s, l): # TODO: save memory
stderr.write("LOG %s: average likelihood %s *** %s\n" % (m._short_name, common.pretty_probvector(mvec), common.pretty_probvector(lvec)))
loglike = np.sum(ll_per_model, axis=0, keepdims=False) # TODO: serialize
return loglike
    def maximize_likelihood(self, data, responsibilities, weights, cmask=None):
        loglikelihood = np.zeros(shape=(data.num_data, self.num_components), dtype=types.logprob_type)
        return_value = True
        for m, d in zip(self, data):
            ret, ll = m.maximize_likelihood(d, responsibilities, weights, cmask)
            loglikelihood += ll  # accumulate the per-model log-likelihoods
            return_value = return_value and ret
        return return_value, loglikelihood
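The per-model weighting used in log_likelihood above — inverse-stdev weights scaled by sum(scale)/n² — can be checked numerically; the stdev values here are made up:

```python
# Numeric sketch of the inverse-stdev weighting computed in
# log_likelihood above; the per-model stdevs are hypothetical.
import numpy as np

ll_scale = np.array([0.5, 1.0, 2.0])   # per-model stdevs (made up)
beta_correction = 1.0
ll_weights = beta_correction * (ll_scale.sum() / ll_scale.size**2) / ll_scale

# Each weight times its stdev is the same constant sum(scale)/n^2,
# i.e. weights are proportional to 1/stdev.
assert np.allclose(ll_weights * ll_scale, ll_scale.sum() / ll_scale.size**2)
print(ll_weights)
```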
|
(Family Features) All lawns are not the same, and using the right equipment to maintain the lawn is important. After all, unlocking the potential of the lawn can result in significant advantages, from curb appeal to relaxing in the backyard to hosting gatherings with family and friends. Here are three key areas to consider with zero-turn mowers, an option that’s gaining popularity.
Many homeowners may have a push mower, a lawn tractor or both – and technology on lawn equipment has come a long way over the years. Today’s lawn tractors incorporate fuel-injected engines or feature Bluetooth technology that connects with a smartphone.
However, a third lawn mowing option is gaining in popularity – the zero-turn mower. It’s called a zero-turn mower because its design allows users to make zero-degree turns – allowing for easier maneuverability around trees, shrubs, lawn decorations and other obstacles typically encountered when mowing.
Zero-turn mowers are becoming more popular for homeowners because they are comfortable, have great handling and can mow lawns quickly and efficiently.
Cub Cadet offers its RZT SX Series that includes a more familiar steering wheel versus the typical lap-bar controls that traditionally have been seen on zero-turns.
The steering wheel option offers patented SynchroSteer® technology, providing unparalleled stability on hills and terrain with no turf damage. With four-wheel control, the RZT SX delivers incredible handling and unmatched stability on hills. The line is available in a variety of cutting widths from 42 to 54 inches.
The RZT SX includes a premium cushioned seat with armrests, while a storage console and charging station provide extra room to bring all the necessities. Mowing near dusk? Built-in LED lights let homeowners mow when the sun isn’t on their side.
All of Cub Cadet’s residential zero-turn mowers feature a 3-year limited warranty and are supported by the strong nationwide network of locally owned Cub Cadet Independent Retailers ready to provide advice and support whenever needed.
Young girls enjoying a swim in their family's saltwater pool!
(BPT) - It was supposed to be a community swimming pool, but many people stayed away because they couldn't tolerate the biting, nose-curdling odor of chlorine. Others experienced breathing and skin problems.
So the Evergreen Commons senior center in Holland, Michigan, converted its 65,000-gallon chlorine pool into a saltwater pool. People who had stayed away are now coming back, getting exercise and therapy, while socializing with others.
The senior center is hardly alone. Across the country, traditional chlorine pools are being converted into saltwater pools, sometimes called saline pools.
Swimmers noticed the difference right away after the switch, making their pool experience much more enjoyable. The new system also meant softer water without harsh chemicals that sometimes required a shower to wash off.
* Simplified, more convenient maintenance. Saltwater pool owners don't have to buy, transport, store and handle hazardous chlorine chemicals. This saves time and money.
* Water that's gentle on skin, eyes, nose and hair. Saltwater pools have approximately one-tenth the salinity of ocean water and about one-third the salinity of human tears, with no unpleasant chlorine smell.
* A more environmentally friendly approach. Routine pool maintenance doesn't involve the handling and storage of manufactured chlorine and lessens the need for other potentially hazardous chemicals.
Saltwater pools use a generator to convert the salt into mild chlorine that keeps the pool free of harmful bacteria. This chlorine is added to the water at a constant rate, displacing the bad smell and burning irritation we normally associate with chlorine and maintaining the right amount. Once the chlorine sanitizes the pool it converts back to salt. The process continues, over and over again, conserving the salt and keeping sanitizer levels balanced.
The technology for a saltwater pool was first developed in Australia in the 1960s and today more than 80 percent of all pools Down Under use this system. In the United States, saltwater pools first began to see use in the 1980s and have grown exponentially in popularity. According to data published in Pool & Spa News, today there are more than 1.4 million saltwater pools in operation nationwide and an estimated 75 percent of all new in-ground pools are saltwater, compared with only 15 percent in 2002.
The other good news for homeowners and pool managers is that pool salt is far cheaper than traditional chlorine. This is a big reason why so many hotels and water parks in the United States have already made the switch. The initial cost of constructing and installing an electrolytic converter is small and is easily made up in maintenance savings. Even converting an existing chlorine pool to a saltwater pool can pay off quickly.
(BPT) - Everyone looks forward to a shift into summer mode, with its sun-soaked days, flower-scented breezes and velvety nights under the stars.
Now's the time to take advantage of a golden opportunity right outside your door. Celebrate summer and all it offers by recreating your outdoor space. Whether it’s a balcony, a patio or a deck, a few touches are all you need to turn it into a highly functional living space.
When done right, a patio makeover is like a boost to your home's square footage of living space without straining your wallet. Think of the patio as a summer room, and the possibilities really open up.
One approach to making your summer room perfect is to start by thinking of your needs and what you love to do. If you could add any room to your home, what would it be, and what would you use it for? Then, turn to a resource like Big Lots, which offers everything you need to build that summer room, while keeping you within budget.
Transform any outdoor space into a relaxing oasis that’s perfect for unwinding and summer daydreams. Your key piece is a comfortable outdoor couch you can really sink into.
It's always best to start with a neutral-colored cushion, then work in accessories and accent pieces, including side tables, brightly colored throw pillows and outdoor lanterns, to make the space feel extra homey, just like an authentic living room. These decor pieces can be easily and affordably switched out year after year to make your outdoor space feel fresh and new.
With an easy assembly gazebo, you can also keep the space cool and comfortable in the heat of the day.
Finally, if your patio faces an open or public area, a row of evergreens planted in large colorful pots will transform it into an intimate space with a perfect touch of nature.
With the right pieces, you can set the scene for any gathering of friends and family. Start with ample seating. Add to the traditional living room setup with an outdoor cushioned bench, and position some accent chairs and tables in a nook or two for conversation clusters. A patterned outdoor rug also helps to delineate spaces on a large patio or deck.
A fire pit always creates a natural centerpiece and gathering spot. Some designs take this up a notch and incorporate the soft glow of fire right into a tabletop, making it easy to talk long into the summer night. As a finishing touch, be sure to have a wireless speaker and playlist ready to set the mood with music.
The downside of summer living is building up unwanted heat in the kitchen from cooking dinner. The best solution is to take it outside. Don’t limit the grill to weekends and burgers and brats. Explore the many grilling recipes out there to expand your repertoire. While you're doing this, set up your patio as an outdoor cooking station that’s ready to go whenever you’re ready to start cooking.
Set up a sturdy table for prepping veggies and meats and a selection of lightweight, outdoor serving dishes. The variety of colors and designs are endless and add to the space aesthetic. (Just as you would indoors, make sure the surface is clean before you get started.) Use colorful crates to keep grilling tools, potholders and outdoor dishes and glassware organized and handy.
Finally, pick up some bright-colored pots that match your decor style and plant rosemary, parsley, basil and other herbs so they’re within easy reach to add fresh flavors to your grilled fish and chicken, as well as those tasty summer veggies. These potted plants also make for beautiful, easy centerpieces.
Dining outside is a fun and relaxing way to enjoy food as well as the company of your family and friends. When you choose a patio table, choose one with ample seating, and keep things comfortable and colorful with waterproof cushions. Umbrellas can throw shade on a sun-drenched deck or patio, making daytime dining (or your morning coffee time) more pleasant and easy on the eyes. If you’re looking for something different, a patio umbrella outfitted with lights on the underside will let you linger over dinner longer.
Creating beautiful outdoor spaces can be easy, but it also doesn’t have to break the budget. Big Lots has a wide variety of affordable patio furniture and décor so you can create an outdoor scene with indoor style. Visit BigLots.com or a store near you for all your patio and outdoor accessory needs.