- Self-closing tags in **JSX**
- `null` value for *Controlled Components* in **React**
- Using `false` in **JSX**
- Expose Component Functions in **React**
- Dealing with `this.props.children`
```c++
/*
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */

#include "sfntly/port/memory_output_stream.h"

namespace sfntly {

MemoryOutputStream::MemoryOutputStream() {
}

MemoryOutputStream::~MemoryOutputStream() {
}

void MemoryOutputStream::Write(ByteVector* buffer) {
  store_.insert(store_.end(), buffer->begin(), buffer->end());
}

void MemoryOutputStream::Write(ByteVector* buffer,
                               int32_t offset,
                               int32_t length) {
  assert(buffer);
  if (offset >= 0 && length > 0) {
    store_.insert(store_.end(),
                  buffer->begin() + offset,
                  buffer->begin() + offset + length);
  } else {
#if !defined(SFNTLY_NO_EXCEPTION)
    throw IndexOutOfBoundException();
#endif
  }
}

void MemoryOutputStream::Write(byte_t* buffer, int32_t offset, int32_t length) {
  assert(buffer);
  if (offset >= 0 && length > 0) {
    store_.insert(store_.end(), buffer + offset, buffer + offset + length);
  } else {
#if !defined(SFNTLY_NO_EXCEPTION)
    throw IndexOutOfBoundException();
#endif
  }
}

void MemoryOutputStream::Write(byte_t b) {
  store_.push_back(b);
}

byte_t* MemoryOutputStream::Get() {
  if (store_.empty()) {
    return NULL;
  }
  return &(store_[0]);
}

size_t MemoryOutputStream::Size() {
  return store_.size();
}

}  // namespace sfntly
```
```c
/* */
#include <stdbool.h>
#include "interpreter.h"

static int trueValue = 1;
static int falseValue = 0;

/* structure definitions */
const char StdboolDefs[] = "typedef int bool;";

/* creates various system-dependent definitions */
void StdboolSetupFunc(Picoc *pc)
{
    /* defines */
    VariableDefinePlatformVar(pc, NULL, "true", &pc->IntType,
        (union AnyValue *)&trueValue, false);
    VariableDefinePlatformVar(pc, NULL, "false", &pc->IntType,
        (union AnyValue *)&falseValue, false);
    VariableDefinePlatformVar(pc, NULL, "__bool_true_false_are_defined",
        &pc->IntType, (union AnyValue *)&trueValue, false);
}
```
{{DISPLAYTITLE:C16H18O9}}
The molecular formula C16H18O9 may refer to:

Chlorogenic acid (3-O-caffeoylquinic acid or 3-CQA)
Cryptochlorogenic acid (4-O-caffeoylquinic acid or 4-CQA)
Neochlorogenic acid (5-O-caffeoylquinic acid or 5-CQA)
Scopolin
Where We Started is a 2013 American romantic drama film directed by Christopher J. Hansen, starring Matthew Brumlow and Cora Vander Broek.

Cast
Matthew Brumlow as Will Shelton
Cora Vander Broek as Nora Van Der Graf
Stan Denman as Liquor Store Clerk
Nellsyn Hill as Restaurant Waitress
Mallory Olivier as Diner Waitress

Release
The film received a limited theatrical release on 2 May 2014.

Reception
Sherilyn Connelly of LA Weekly wrote that "It all feels carefully scripted and rehearsed, with actors boasting terrific chemistry, who never seem like they're improvising or veering into mumblecore territory", and that it "will resonate with anyone who's ever clicked with the right person at the wrong time, or has wondered what it might be like." Danny King of The Village Voice wrote that the film on the whole "remains an engaging platform for its actors, and Hansen's ability to maximize their work." Gary Goldstein of the Los Angeles Times wrote that "There's a kind of bland realism to writer-director Chris Hansen's long night's journey into morning. Some may recognize their own feelings of longing and regret. But for all the emotional onion-peeling here, little is revealed that's surprising, unique or particularly deep." Daniel M. Gold of The New York Times wrote that while the film "has its authentic moments", the relationship between Will and Nora "feels all too glib, a beat out of sync."

References

External links

American romantic drama films
2013 romantic drama films
2013 films
Gary Michael Gray (born February 23, 1945) is an American former basketball player who played as a guard in the NBA.

Early years
Gray was born in Fort Cobb, Oklahoma. He is Native American, of the Delaware Nation. Following graduation from Fort Cobb High School in 1963, he attended Oklahoma City University, where he led the team to the All-College Tournament championship in 1966. His OCU Chiefs made the 1966 NCAA Men's Basketball Tournament. Gray was named an Academic All-American for 1966–1967 by the College Sports Information Directors of America (CoSIDA).

Professional basketball career
Gray was drafted in the third round of the 1967 NBA draft by the Cincinnati Royals. He was also selected in the 1967 American Basketball Association draft by the Dallas Chaparrals. He spent the 1967–68 season with the Royals, averaging 2.4 points per game in limited playing time. He was later selected by the Milwaukee Bucks in the 1968 NBA expansion draft.

Retirement
Gray was inducted into the OCU Basketball Hall of Fame in 1986.

References

1945 births
Living people
American men's basketball players
Basketball players from Oklahoma
Cincinnati Royals draft picks
Cincinnati Royals players
Delaware Nation people
Native American basketball players
Oklahoma City Stars men's basketball players
People from Caddo County, Oklahoma
Shooting guards
20th-century Native Americans
21st-century Native Americans
```python
#!/usr/bin/env python
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

"""Utility functions for Windows builds.

These functions are executed via gyp-win-tool when using the ninja generator.
"""

from __future__ import print_function

import os
import re
import shutil
import subprocess
import stat
import string
import sys

BASE_DIR = os.path.dirname(os.path.abspath(__file__))
PY3 = bytes != str

# A regex matching an argument corresponding to the output filename passed to
# link.exe.
_LINK_EXE_OUT_ARG = re.compile('/OUT:(?P<out>.+)$', re.IGNORECASE)


def main(args):
  executor = WinTool()
  exit_code = executor.Dispatch(args)
  if exit_code is not None:
    sys.exit(exit_code)


class WinTool(object):
  """This class performs all the Windows tooling steps. The methods can either
  be executed directly, or dispatched from an argument list."""

  def _UseSeparateMspdbsrv(self, env, args):
    """Allows to use a unique instance of mspdbsrv.exe per linker instead of a
    shared one."""
    if len(args) < 1:
      raise Exception("Not enough arguments")

    if args[0] != 'link.exe':
      return

    # Use the output filename passed to the linker to generate an endpoint name
    # for mspdbsrv.exe.
    endpoint_name = None
    for arg in args:
      m = _LINK_EXE_OUT_ARG.match(arg)
      if m:
        endpoint_name = re.sub(r'\W+', '',
                               '%s_%d' % (m.group('out'), os.getpid()))
        break

    if endpoint_name is None:
      return

    # Adds the appropriate environment variable. This will be read by link.exe
    # to know which instance of mspdbsrv.exe it should connect to (if it's
    # not set then the default endpoint is used).
    env['_MSPDBSRV_ENDPOINT_'] = endpoint_name

  def Dispatch(self, args):
    """Dispatches a string command to a method."""
    if len(args) < 1:
      raise Exception("Not enough arguments")

    method = "Exec%s" % self._CommandifyName(args[0])
    return getattr(self, method)(*args[1:])

  def _CommandifyName(self, name_string):
    """Transforms a tool name like recursive-mirror to RecursiveMirror."""
    return name_string.title().replace('-', '')

  def _GetEnv(self, arch):
    """Gets the saved environment from a file for a given architecture."""
    # The environment is saved as an "environment block" (see CreateProcess
    # and msvs_emulation for details). We convert to a dict here.
    # Drop last 2 NULs, one for list terminator, one for trailing vs. separator.
    pairs = open(arch).read()[:-2].split('\0')
    kvs = [item.split('=', 1) for item in pairs]
    return dict(kvs)

  def ExecStamp(self, path):
    """Simple stamp command."""
    open(path, 'w').close()

  def ExecRecursiveMirror(self, source, dest):
    """Emulation of rm -rf out && cp -af in out."""
    if os.path.exists(dest):
      if os.path.isdir(dest):
        def _on_error(fn, path, excinfo):
          # The operation failed, possibly because the file is set to
          # read-only. If that's why, make it writable and try the op again.
          if not os.access(path, os.W_OK):
            os.chmod(path, stat.S_IWRITE)
          fn(path)
        shutil.rmtree(dest, onerror=_on_error)
      else:
        if not os.access(dest, os.W_OK):
          # Attempt to make the file writable before deleting it.
          os.chmod(dest, stat.S_IWRITE)
        os.unlink(dest)

    if os.path.isdir(source):
      shutil.copytree(source, dest)
    else:
      shutil.copy2(source, dest)

  def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args):
    """Filter diagnostic output from link that looks like:
    '   Creating library ui.dll.lib and object ui.dll.exp'
    This happens when there are exports from the dll or exe.
    """
    env = self._GetEnv(arch)
    if use_separate_mspdbsrv == 'True':
      self._UseSeparateMspdbsrv(env, args)
    link = subprocess.Popen([args[0].replace('/', '\\')] + list(args[1:]),
                            shell=True,
                            env=env,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT)
    out, _ = link.communicate()
    if PY3:
      out = out.decode('utf-8')
    for line in out.splitlines():
      if (not line.startswith('   Creating library ') and
          not line.startswith('Generating code') and
          not line.startswith('Finished generating code')):
        print(line)
    return link.returncode

  def ExecLinkWithManifests(self, arch, embed_manifest, out, ldcmd, resname,
                            mt, rc, intermediate_manifest, *manifests):
    """A wrapper for handling creating a manifest resource and then executing
    a link command."""
    # The 'normal' way to do manifests is to have link generate a manifest
    # based on gathering dependencies from the object files, then merge that
    # manifest with other manifests supplied as sources, convert the merged
    # manifest to a resource, and then *relink*, including the compiled
    # version of the manifest resource. This breaks incremental linking, and
    # is generally overly complicated. Instead, we merge all the manifests
    # provided (along with one that includes what would normally be in the
    # linker-generated one, see msvs_emulation.py), and include that into the
    # first and only link. We still tell link to generate a manifest, but we
    # only use that to assert that our simpler process did not miss anything.
    variables = {
      'python': sys.executable,
      'arch': arch,
      'out': out,
      'ldcmd': ldcmd,
      'resname': resname,
      'mt': mt,
      'rc': rc,
      'intermediate_manifest': intermediate_manifest,
      'manifests': ' '.join(manifests),
    }
    add_to_ld = ''
    if manifests:
      subprocess.check_call(
          '%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo '
          '-manifest %(manifests)s -out:%(out)s.manifest' % variables)
      if embed_manifest == 'True':
        subprocess.check_call(
            '%(python)s gyp-win-tool manifest-to-rc %(arch)s %(out)s.manifest'
            ' %(out)s.manifest.rc %(resname)s' % variables)
        subprocess.check_call(
            '%(python)s gyp-win-tool rc-wrapper %(arch)s %(rc)s '
            '%(out)s.manifest.rc' % variables)
        add_to_ld = ' %(out)s.manifest.res' % variables
    subprocess.check_call(ldcmd + add_to_ld)

    # Run mt.exe on the theoretically complete manifest we generated, merging
    # it with the one the linker generated to confirm that the linker
    # generated one does not add anything. This is strictly unnecessary for
    # correctness, it's only to verify that e.g. /MANIFESTDEPENDENCY was not
    # used in a #pragma comment.
    if manifests:
      # Merge the intermediate one with ours to .assert.manifest, then check
      # that .assert.manifest is identical to ours.
      subprocess.check_call(
          '%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo '
          '-manifest %(out)s.manifest %(intermediate_manifest)s '
          '-out:%(out)s.assert.manifest' % variables)
      assert_manifest = '%(out)s.assert.manifest' % variables
      our_manifest = '%(out)s.manifest' % variables
      # Load and normalize the manifests. mt.exe sometimes removes whitespace,
      # and sometimes doesn't unfortunately.
      with open(our_manifest, 'rb') as our_f:
        with open(assert_manifest, 'rb') as assert_f:
          our_data = our_f.read().translate(None, string.whitespace)
          assert_data = assert_f.read().translate(None, string.whitespace)
      if our_data != assert_data:
        os.unlink(out)
        def dump(filename):
          sys.stderr.write('%s\n-----\n' % filename)
          with open(filename, 'rb') as f:
            sys.stderr.write(f.read() + '\n-----\n')
        dump(intermediate_manifest)
        dump(our_manifest)
        dump(assert_manifest)
        sys.stderr.write(
            'Linker generated manifest "%s" added to final manifest "%s" '
            '(result in "%s"). '
            'Were /MANIFEST switches used in #pragma statements? ' % (
              intermediate_manifest, our_manifest, assert_manifest))
        return 1

  def ExecManifestWrapper(self, arch, *args):
    """Run manifest tool with environment set. Strip out undesirable warning
    (some XML blocks are recognized by the OS loader, but not the manifest
    tool)."""
    env = self._GetEnv(arch)
    popen = subprocess.Popen(args, shell=True, env=env,
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    out, _ = popen.communicate()
    if PY3:
      out = out.decode('utf-8')
    for line in out.splitlines():
      if line and 'manifest authoring warning 81010002' not in line:
        print(line)
    return popen.returncode

  def ExecManifestToRc(self, arch, *args):
    """Creates a resource file pointing a SxS assembly manifest.
    |args| is tuple containing path to resource file, path to manifest file
    and resource name which can be "1" (for executables) or "2" (for DLLs)."""
    manifest_path, resource_path, resource_name = args
    with open(resource_path, 'wb') as output:
      output.write('#include <windows.h>\n%s RT_MANIFEST "%s"' % (
        resource_name, os.path.abspath(manifest_path).replace('\\', '/')))

  def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl,
                      *flags):
    """Filter noisy filenames output from MIDL compile step that isn't
    quietable via command line flags.
    """
    args = ['midl', '/nologo'] + list(flags) + [
        '/out', outdir,
        '/tlb', tlb,
        '/h', h,
        '/dlldata', dlldata,
        '/iid', iid,
        '/proxy', proxy,
        idl]
    env = self._GetEnv(arch)
    popen = subprocess.Popen(args, shell=True, env=env,
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    out, _ = popen.communicate()
    if PY3:
      out = out.decode('utf-8')
    # Filter junk out of stdout, and write filtered versions. Output we want
    # to filter is pairs of lines that look like this:
    # Processing C:\Program Files (x86)\Microsoft SDKs\...\include\objidl.idl
    # objidl.idl
    lines = out.splitlines()
    prefixes = ('Processing ', '64 bit Processing ')
    processing = set(os.path.basename(x)
                     for x in lines if x.startswith(prefixes))
    for line in lines:
      if not line.startswith(prefixes) and line not in processing:
        print(line)
    return popen.returncode

  def ExecAsmWrapper(self, arch, *args):
    """Filter logo banner from invocations of asm.exe."""
    env = self._GetEnv(arch)
    popen = subprocess.Popen(args, shell=True, env=env,
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    out, _ = popen.communicate()
    if PY3:
      out = out.decode('utf-8')
    for line in out.splitlines():
      if (not line.startswith('Microsoft (R) Macro Assembler') and
          not line.startswith(' Assembling: ') and
          line):
        print(line)
    return popen.returncode

  def ExecRcWrapper(self, arch, *args):
    """Filter logo banner from invocations of rc.exe. Older versions of RC
    don't support the /nologo flag."""
    env = self._GetEnv(arch)
    popen = subprocess.Popen(args, shell=True, env=env,
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    out, _ = popen.communicate()
    if PY3:
      out = out.decode('utf-8')
    for line in out.splitlines():
      if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler')
          and line):
        print(line)
    return popen.returncode

  def ExecActionWrapper(self, arch, rspfile, *dir):
    """Runs an action command line from a response file using the environment
    for |arch|. If |dir| is supplied, use that as the working directory."""
    env = self._GetEnv(arch)
    # TODO(scottmg): This is a temporary hack to get some specific variables
    # through to actions that are set after gyp-time. path_to_url
    for k, v in os.environ.items():
      if k not in env:
        env[k] = v
    args = open(rspfile).read()
    dir = dir[0] if dir else None
    return subprocess.call(args, shell=True, env=env, cwd=dir)

  def ExecClCompile(self, project_dir, selected_files):
    """Executed by msvs-ninja projects when the 'ClCompile' target is used to
    build selected C/C++ files."""
    project_dir = os.path.relpath(project_dir, BASE_DIR)
    selected_files = selected_files.split(';')
    ninja_targets = [os.path.join(project_dir, filename) + '^^'
                     for filename in selected_files]
    cmd = ['ninja.exe']
    cmd.extend(ninja_targets)
    return subprocess.call(cmd, shell=True, cwd=BASE_DIR)


if __name__ == '__main__':
  sys.exit(main(sys.argv[1:]))
```
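Two of the small transforms in the tool above are easy to miss in the flow of subprocess plumbing: the dispatcher turns a hyphenated command name like `recursive-mirror` into the method name `RecursiveMirror`, and the mspdbsrv logic derives an identifier-safe endpoint name from link.exe's `/OUT:` argument plus the process ID. A minimal standalone sketch of just those two transforms (the function names `commandify_name` and `endpoint_name_for` are illustrative, not part of gyp's API):

```python
import re

# Same pattern the tool uses to find link.exe's output argument.
_LINK_EXE_OUT_ARG = re.compile('/OUT:(?P<out>.+)$', re.IGNORECASE)

def commandify_name(name_string):
    # 'recursive-mirror' -> 'RecursiveMirror', so the dispatcher can look up
    # the matching Exec* method via getattr().
    return name_string.title().replace('-', '')

def endpoint_name_for(args, pid):
    # Scan the argument list for /OUT:<file>, then strip non-word characters
    # from '<file>_<pid>' to get an identifier-safe endpoint name.
    for arg in args:
        m = _LINK_EXE_OUT_ARG.match(arg)
        if m:
            return re.sub(r'\W+', '', '%s_%d' % (m.group('out'), pid))
    return None

print(commandify_name('recursive-mirror'))                   # RecursiveMirror
print(endpoint_name_for(['/nologo', '/OUT:ui.dll'], 1234))   # uidll_1234
```

Note that `re.sub(r'\W+', '', ...)` keeps the underscore (it is a word character), so the PID stays visibly separated from the output name.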
```ruby
class GithubRepo < ApplicationRecord
  belongs_to :user

  serialize :info_hash, Hash

  validates :name, :url, :github_id_code, presence: true
  validates :url, url: true, uniqueness: true
  validates :github_id_code, uniqueness: true

  scope :featured, -> { where(featured: true) }

  before_destroy :clear_caches
  after_save :clear_caches

  # Update existing repository or create a new one with given params.
  # Repository is searched by either GitHub ID or URL.
  def self.upsert(user, **params)
    repo = user.github_repos
      .where(github_id_code: params[:github_id_code])
      .or(user.github_repos.where(url: params[:url]))
      .first
    repo ||= new(params.merge(user_id: user.id))
    repo.update(params)
    repo
  end

  def self.update_to_latest
    ids = where(updated_at: ...26.hours.ago).ids.map { |id| [id] }
    GithubRepos::RepoSyncWorker.perform_bulk(ids)
  end

  private

  def clear_caches
    return if user.blank?

    user.touch
    cache_bust = EdgeCache::Bust.new
    cache_bust.call(user.path)
    cache_bust.call("#{user.path}?i=i")
    cache_bust.call("#{user.path}/?i=i")
  end
end
```
```java
/*
    This file is part of the iText (R) project.
    Authors: Apryse Software.

    This program is offered under a commercial and under the AGPL license.
    For commercial licensing, contact us at path_to_url
    For AGPL licensing, see below.

    AGPL licensing:
    This program is free software: you can redistribute it and/or modify
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
    along with this program. If not, see <path_to_url
 */
package com.itextpdf.bouncycastle.asn1;

import com.itextpdf.commons.bouncycastle.asn1.IDERSequence;

import org.bouncycastle.asn1.ASN1Encodable;
import org.bouncycastle.asn1.ASN1EncodableVector;
import org.bouncycastle.asn1.DERSequence;

/**
 * Wrapper class for {@link DERSequence}.
 */
public class DERSequenceBC extends ASN1SequenceBC implements IDERSequence {
    /**
     * Creates new wrapper instance for {@link DERSequence}.
     *
     * @param derSequence {@link DERSequence} to be wrapped
     */
    public DERSequenceBC(DERSequence derSequence) {
        super(derSequence);
    }

    /**
     * Creates new wrapper instance for {@link DERSequence}.
     *
     * @param vector {@link ASN1EncodableVector} to create {@link DERSequence}
     */
    public DERSequenceBC(ASN1EncodableVector vector) {
        super(new DERSequence(vector));
    }

    /**
     * Creates new wrapper instance for {@link DERSequence}.
     *
     * @param encodable {@link ASN1Encodable} to create {@link DERSequence}
     */
    public DERSequenceBC(ASN1Encodable encodable) {
        super(new DERSequence(encodable));
    }

    /**
     * Gets actual org.bouncycastle object being wrapped.
     *
     * @return wrapped {@link DERSequence}.
     */
    public DERSequence getDERSequence() {
        return (DERSequence) getEncodable();
    }
}
```
Lotus ononopsis is a species of legume in the family Fabaceae. It is found only in Yemen. Its natural habitats are subtropical or tropical dry forests and subtropical or tropical dry lowland grassland.

References

ononopsis
Endemic flora of Socotra
Least concern plants
Taxonomy articles created by Polbot
Taxa named by Isaac Bayley Balfour
```qmake
TEMPLATE = app
TARGET = test_matrix_vector

!include (configuration.pri)

HEADERS += \
    ../../../test/utils.hpp

SOURCES += \
    ../../../test/test_matrix_vector.cpp

INCLUDEPATH += \
    ../../../include
```
The Empire Air Training Scheme (EATS) was a policy designed to train Royal Australian Air Force pilots for eventual transfer into the Royal Air Force during World War II. The scheme was devised after the British Empire proved unable to supply enough pilots and aircraft for the Royal Air Force. In Australia the scheme would eventually branch out and provide the training of pilots for deployment in the Pacific War.

Background
In the period of rearmament preceding World War II, the Royal Air Force estimated that it would need to acquire 50,000 new pilots annually in order to keep the RAF sufficiently supplied. While planners were confident that the industrial capacity of the British Empire would be capable of producing a sufficient number of planes, it became clear that there was a shortage of able fliers. As the war in Europe drew closer, it was estimated that Britain could muster only 22,000 pilots annually. In response to this shortage, the British government instituted a plan to levy pilots from the dominions, referred to as the British Commonwealth Air Training Plan. The plan called for the establishment of a pool of recruits in the dominions from which the RAF could siphon replacement pilots. The government of Australia accepted the plan for three years and began making preparations to adopt it. Under the plan, dubbed the Empire Air Training Scheme in Australia, 50,000 aircrew would be trained in the dominions. Australia planned to provide 28,000 aircrew under the scheme, accounting for 36% of the total number of proposed aircrew. Basic flying courses officially began 29 April 1940. The first Australian pilots departed for Canada on 14 November 1940, from where they would be transferred to Britain and funneled into the RAF.

Empire Air Training Scheme
Following the signing of the plan, a massive construction and recruitment campaign was launched to increase the number of Australian pilots.
The scheme would ultimately cost Australia about £100,000,000 for its commitments. The RAAF built air and ground training schools, airfields, and specialized flying academies. While originally designed only to train aircrew, the scheme was soon modified by the Australian government to compensate for the unique situation Australia found itself in. Wartime demands and restrictions also led to shortages, as funds and resources were needed for home defence. When German strategic bombing of British factories reduced the number of serviceable aircraft in Britain, the Australian government appropriated funds from EATS to establish the Department of Aircraft Production, the precursor to the Government Aircraft Factories. Following the opening of the Pacific War in 1941, the number of Australian aircrews being transferred to the European theatre greatly decreased as the RAAF prepared to counter the armed forces of the Empire of Japan. A series of Japanese air raids greatly increased the need for a large force of combat-ready pilots and aircraft in Australia.

Schools established by EATS
The following types of schools were established as part of EATS:
Initial Training
Elementary Flying Training
Service Flying Training
Air Navigation
Air Observer
Bombing and Gunnery
Wireless Air Gunnery

A memorial to 5 Service Flying Training School RAAF, part of the Empire Air Training Scheme, was dedicated at Uranquinty on 19 September 1999.

References

External links
A 1940s Flight Global article about pilots from the Commonwealth

Australia in World War II
Pacific Ocean theatre of World War II
World War II
British Empire in World War II
Croydon Park may refer to:

Croydon Park, New South Wales, a suburb of Sydney
Croydon Park, South Australia, a suburb of Adelaide
Croydon Park Public School, a school in Croydon Park, Sydney
```c
/* packet-dcerpc-rs_replist.c
 *
 * Routines for dcerpc RepServer Calls
 * This information is based off the released idl files from opengroup.
 * ftp://ftp.opengroup.org/pub/dce122/dce/src/security.tar.gz security/idl/rs_repadm.idl
 *
 * Wireshark - Network traffic analyzer
 * By Gerald Combs <gerald@wireshark.org>
 *
 * This program is free software; you can redistribute it and/or
 * as published by the Free Software Foundation; either version 2
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 *
 * along with this program; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
 */

#include "config.h"

#include <epan/packet.h>
#include "packet-dcerpc.h"

void proto_register_rs_replist (void);
void proto_reg_handoff_rs_replist (void);

static int proto_rs_replist = -1;
static int hf_rs_replist_opnum = -1;

static gint ett_rs_replist = -1;

static e_guid_t uuid_rs_replist = { 0x850446b0, 0xe95b, 0x11CA,
    { 0xad, 0x90, 0x08, 0x00, 0x1e, 0x01, 0x45, 0xb1 } };
static guint16 ver_rs_replist = 2;

static dcerpc_sub_dissector rs_replist_dissectors[] = {
    { 0, "rs_replist_add_replica", NULL, NULL},
    { 1, "rs_replist_replace_replica", NULL, NULL},
    { 2, "rs_replist_delete_replica", NULL, NULL},
    { 3, "rs_replist_read", NULL, NULL},
    { 4, "rs_replist_read_full", NULL, NULL},
    { 5, "rs_replist_add_replica", NULL, NULL},
    { 6, "rs_replist_replace_replica", NULL, NULL},
    { 7, "rs_replist_delete_replica", NULL, NULL},
    { 8, "rs_replist_read", NULL, NULL},
    { 9, "rs_replist_read_full", NULL, NULL},
    { 0, NULL, NULL, NULL }
};

void
proto_register_rs_replist (void)
{
    static hf_register_info hf[] = {
        { &hf_rs_replist_opnum,
          { "Operation", "rs_replist.opnum", FT_UINT16, BASE_DEC,
            NULL, 0x0, NULL, HFILL }},
    };

    static gint *ett[] = {
        &ett_rs_replist,
    };

    proto_rs_replist = proto_register_protocol (
        "DCE/RPC Repserver Calls", "RS_REPLIST", "rs_replist");

    proto_register_field_array (proto_rs_replist, hf, array_length (hf));
    proto_register_subtree_array (ett, array_length (ett));
}

void
proto_reg_handoff_rs_replist (void)
{
    /* Register the protocol as dcerpc */
    dcerpc_init_uuid (proto_rs_replist, ett_rs_replist,
                      &uuid_rs_replist, ver_rs_replist,
                      rs_replist_dissectors, hf_rs_replist_opnum);
}

/*
 * Editor modelines - path_to_url
 *
 * Local variables:
 * c-basic-offset: 8
 * tab-width: 8
 * indent-tabs-mode: t
 * End:
 *
 * vi: set shiftwidth=8 tabstop=8 noexpandtab:
 * :indentSize=8:tabSize=8:noTabs=false:
 */
```
The African dusky shrew or African foggy shrew (Crocidura caliginea) is a species of shrew. It is native to the Democratic Republic of the Congo, where it lives in forests.

References

Crocidura
Mammals of the Democratic Republic of the Congo
Mammals described in 1916
Taxonomy articles created by Polbot
Northeastern Congolian lowland forests
Endemic fauna of the Democratic Republic of the Congo
```swift
import Foundation

extension GraphAPI.BackingState {
  /**
   An adapter method which takes a `BackingState` and converts it to a `GraphAPI.BackingState?` object.

   - parameter backingState: `BackingState` object that needs to be converted to be `GraphAPI` compatible.
   */
  static func from(_ backingState: BackingState) -> GraphAPI.BackingState? {
    return GraphAPI.BackingState(rawValue: backingState.rawValue)
  }
}
```
```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

namespace AssetBundleBrowser.AssetBundleDataSource
{
    internal class ABDataSourceProviderUtility
    {
        private static List<Type> s_customNodes;

        internal static List<Type> CustomABDataSourceTypes
        {
            get
            {
                if (s_customNodes == null)
                {
                    s_customNodes = BuildCustomABDataSourceList();
                }
                return s_customNodes;
            }
        }

        private static List<Type> BuildCustomABDataSourceList()
        {
            var properList = new List<Type>();
            properList.Add(null); // empty spot for "default"

            var x = AppDomain.CurrentDomain.GetAssemblies();
            foreach (var assembly in x)
            {
                try
                {
                    var list = new List<Type>(
                        assembly
                            .GetTypes()
                            .Where(t => t != typeof(ABDataSource))
                            .Where(t => typeof(ABDataSource).IsAssignableFrom(t)));
                    for (int count = 0; count < list.Count; count++)
                    {
                        if (list[count].Name == "AssetDatabaseABDataSource")
                            properList[0] = list[count];
                        else if (list[count] != null)
                            properList.Add(list[count]);
                    }
                }
                catch (System.Exception)
                {
                    // assembly which raises exception on the GetTypes() call - ignore it
                }
            }
            return properList;
        }
    }
}
```
```html <html> <head> <meta http-equiv="Content-Type" content="text/html; charset=US-ASCII"> <title>Riemann Zeta Function</title> <link rel="stylesheet" href="../../math.css" type="text/css"> <meta name="generator" content="DocBook XSL Stylesheets V1.79.1"> <link rel="home" href="../../index.html" title="Math Toolkit 2.6.0"> <link rel="up" href="../zetas.html" title="Zeta Functions"> <link rel="prev" href="../zetas.html" title="Zeta Functions"> <link rel="next" href="../expint.html" title="Exponential Integrals"> </head> <body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF"> <table cellpadding="2" width="100%"><tr> <td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../../boost.png"></td> <td align="center"><a href="../../../../../../index.html">Home</a></td> <td align="center"><a href="../../../../../../libs/libraries.htm">Libraries</a></td> <td align="center"><a href="path_to_url">People</a></td> <td align="center"><a href="path_to_url">FAQ</a></td> <td align="center"><a href="../../../../../../more/index.htm">More</a></td> </tr></table> <hr> <div class="spirit-nav"> <a accesskey="p" href="../zetas.html"><img src="../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../zetas.html"><img src="../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../index.html"><img src="../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="../expint.html"><img src="../../../../../../doc/src/images/next.png" alt="Next"></a> </div> <div class="section"> <div class="titlepage"><div><div><h3 class="title"> <a name="math_toolkit.zetas.zeta"></a><a class="link" href="zeta.html" title="Riemann Zeta Function">Riemann Zeta Function</a> </h3></div></div></div> <h5> <a name="math_toolkit.zetas.zeta.h0"></a> <span class="phrase"><a name="math_toolkit.zetas.zeta.synopsis"></a></span><a class="link" 
href="zeta.html#math_toolkit.zetas.zeta.synopsis">Synopsis</a> </h5> <pre class="programlisting"><span class="preprocessor">#include</span> <span class="special">&lt;</span><span class="identifier">boost</span><span class="special">/</span><span class="identifier">math</span><span class="special">/</span><span class="identifier">special_functions</span><span class="special">/</span><span class="identifier">zeta</span><span class="special">.</span><span class="identifier">hpp</span><span class="special">&gt;</span> </pre> <pre class="programlisting"><span class="keyword">namespace</span> <span class="identifier">boost</span><span class="special">{</span> <span class="keyword">namespace</span> <span class="identifier">math</span><span class="special">{</span> <span class="keyword">template</span> <span class="special">&lt;</span><span class="keyword">class</span> <span class="identifier">T</span><span class="special">&gt;</span> <a class="link" href="../result_type.html" title="Calculation of the Type of the Result"><span class="emphasis"><em>calculated-result-type</em></span></a> <span class="identifier">zeta</span><span class="special">(</span><span class="identifier">T</span> <span class="identifier">z</span><span class="special">);</span> <span class="keyword">template</span> <span class="special">&lt;</span><span class="keyword">class</span> <span class="identifier">T</span><span class="special">,</span> <span class="keyword">class</span> <a class="link" href="../../policy.html" title="Chapter&#160;18.&#160;Policies: Controlling Precision, Error Handling etc">Policy</a><span class="special">&gt;</span> <a class="link" href="../result_type.html" title="Calculation of the Type of the Result"><span class="emphasis"><em>calculated-result-type</em></span></a> <span class="identifier">zeta</span><span class="special">(</span><span class="identifier">T</span> <span class="identifier">z</span><span class="special">,</span> <span class="keyword">const</span> <a 
class="link" href="../../policy.html" title="Chapter&#160;18.&#160;Policies: Controlling Precision, Error Handling etc">Policy</a><span class="special">&amp;);</span> <span class="special">}}</span> <span class="comment">// namespaces</span> </pre> <p> The return type of these functions is computed using the <a class="link" href="../result_type.html" title="Calculation of the Type of the Result"><span class="emphasis"><em>result type calculation rules</em></span></a>: the return type is <code class="computeroutput"><span class="keyword">double</span></code> if T is an integer type, and T otherwise. </p> <p> The final <a class="link" href="../../policy.html" title="Chapter&#160;18.&#160;Policies: Controlling Precision, Error Handling etc">Policy</a> argument is optional and can be used to control the behaviour of the function: how it handles errors, what level of precision to use etc. Refer to the <a class="link" href="../../policy.html" title="Chapter&#160;18.&#160;Policies: Controlling Precision, Error Handling etc">policy documentation for more details</a>. 
</p> <h5> <a name="math_toolkit.zetas.zeta.h1"></a> <span class="phrase"><a name="math_toolkit.zetas.zeta.description"></a></span><a class="link" href="zeta.html#math_toolkit.zetas.zeta.description">Description</a> </h5> <pre class="programlisting"><span class="keyword">template</span> <span class="special">&lt;</span><span class="keyword">class</span> <span class="identifier">T</span><span class="special">&gt;</span> <a class="link" href="../result_type.html" title="Calculation of the Type of the Result"><span class="emphasis"><em>calculated-result-type</em></span></a> <span class="identifier">zeta</span><span class="special">(</span><span class="identifier">T</span> <span class="identifier">z</span><span class="special">);</span> <span class="keyword">template</span> <span class="special">&lt;</span><span class="keyword">class</span> <span class="identifier">T</span><span class="special">,</span> <span class="keyword">class</span> <a class="link" href="../../policy.html" title="Chapter&#160;18.&#160;Policies: Controlling Precision, Error Handling etc">Policy</a><span class="special">&gt;</span> <a class="link" href="../result_type.html" title="Calculation of the Type of the Result"><span class="emphasis"><em>calculated-result-type</em></span></a> <span class="identifier">zeta</span><span class="special">(</span><span class="identifier">T</span> <span class="identifier">z</span><span class="special">,</span> <span class="keyword">const</span> <a class="link" href="../../policy.html" title="Chapter&#160;18.&#160;Policies: Controlling Precision, Error Handling etc">Policy</a><span class="special">&amp;);</span> </pre> <p> Returns the <a href="path_to_url" target="_top">zeta function</a> of z: </p> <p> <span class="inlinemediaobject"><img src="../../../equations/zeta1.svg"></span> </p> <p> <span class="inlinemediaobject"><img src="../../../graphs/zeta1.svg" align="middle"></span> </p> <p> <span class="inlinemediaobject"><img src="../../../graphs/zeta2.svg" 
align="middle"></span> </p> <h5> <a name="math_toolkit.zetas.zeta.h2"></a> <span class="phrase"><a name="math_toolkit.zetas.zeta.accuracy"></a></span><a class="link" href="zeta.html#math_toolkit.zetas.zeta.accuracy">Accuracy</a> </h5> <p> The following table shows the peak errors (in units of epsilon) found on various platforms with various floating point types, along with comparisons to the <a href="path_to_url" target="_top">GSL-1.9</a> and <a href="path_to_url" target="_top">Cephes</a> libraries. Unless otherwise specified any floating point type that is narrower than the one shown will have <a class="link" href="../relative_error.html#math_toolkit.relative_error.zero_error">effectively zero error</a>. </p> <div class="table"> <a name="math_toolkit.zetas.zeta.table_zeta"></a><p class="title"><b>Table&#160;6.73.&#160;Error rates for zeta</b></p> <div class="table-contents"><table class="table" summary="Error rates for zeta"> <colgroup> <col> <col> <col> <col> <col> </colgroup> <thead><tr> <th> </th> <th> <p> Microsoft Visual C++ version 12.0<br> Win32<br> double </p> </th> <th> <p> GNU C++ version 5.1.0<br> linux<br> double </p> </th> <th> <p> GNU C++ version 5.1.0<br> linux<br> long double </p> </th> <th> <p> Sun compiler version 0x5130<br> Sun Solaris<br> long double </p> </th> </tr></thead> <tbody> <tr> <td> <p> Zeta: Random values greater than 1 </p> </td> <td> <p> <span class="blue">Max = 0.836&#949; (Mean = 0.093&#949;)</span> </p> </td> <td> <p> <span class="blue">Max = 0&#949; (Mean = 0&#949;)</span><br> <br> (<span class="emphasis"><em>GSL 1.16:</em></span> Max = 8.69&#949; (Mean = 1.03&#949;))<br> (<span class="emphasis"><em>Cephes:</em></span> <span class="red">Max = 4.49e+33&#949; (Mean = 6.85e+32&#949;) <a class="link" href="../logs_and_tables/logs.html#your_sha256_hashvalues_greater_than_1">And other failures.</a>)</span> </p> </td> <td> <p> <span class="blue">Max = 0.846&#949; (Mean = 0.0833&#949;)</span><br> <br> (<span 
class="emphasis"><em>&lt;tr1/cmath&gt;:</em></span> Max = 5.45&#949; (Mean = 1&#949;)) </p> </td> <td> <p> <span class="blue">Max = 0.846&#949; (Mean = 0.0743&#949;)</span> </p> </td> </tr> <tr> <td> <p> Zeta: Random values less than 1 </p> </td> <td> <p> <span class="blue">Max = 7.03&#949; (Mean = 2.98&#949;)</span> </p> </td> <td> <p> <span class="blue">Max = 0&#949; (Mean = 0&#949;)</span><br> <br> (<span class="emphasis"><em>GSL 1.16:</em></span> Max = 137&#949; (Mean = 13.8&#949;))<br> (<span class="emphasis"><em>Cephes:</em></span> <span class="red">Max = +INF&#949; (Mean = +INF&#949;) <a class="link" href="../logs_and_tables/logs.html#your_sha256_hashvalues_less_than_1">And other failures.</a>)</span> </p> </td> <td> <p> <span class="blue">Max = 7.03&#949; (Mean = 2.71&#949;)</span><br> <br> (<span class="emphasis"><em>&lt;tr1/cmath&gt;:</em></span> Max = 538&#949; (Mean = 59.3&#949;)) </p> </td> <td> <p> <span class="blue">Max = 70.1&#949; (Mean = 17.1&#949;)</span> </p> </td> </tr> <tr> <td> <p> Zeta: Values close to and greater than 1 </p> </td> <td> <p> <span class="blue">Max = 0.994&#949; (Mean = 0.421&#949;)</span> </p> </td> <td> <p> <span class="blue">Max = 0&#949; (Mean = 0&#949;)</span><br> <br> (<span class="emphasis"><em>GSL 1.16:</em></span> Max = 7.73&#949; (Mean = 4.07&#949;))<br> (<span class="emphasis"><em>Cephes:</em></span> <span class="red">Max = 6.77e+15&#949; (Mean = 1.52e+15&#949;) <a class="link" href="../logs_and_tables/logs.html#your_sha256_hashclose_to_and_greater_than_1">And other failures.</a>)</span> </p> </td> <td> <p> <span class="blue">Max = 0.995&#949; (Mean = 0.5&#949;)</span><br> <br> (<span class="emphasis"><em>&lt;tr1/cmath&gt;:</em></span> Max = 1.9e+06&#949; (Mean = 5.11e+05&#949;)) </p> </td> <td> <p> <span class="blue">Max = 0.995&#949; (Mean = 0.5&#949;)</span> </p> </td> </tr> <tr> <td> <p> Zeta: Values close to and less than 1 </p> </td> <td> <p> <span class="blue">Max = 0.991&#949; (Mean = 0.375&#949;)</span> 
</p> </td> <td> <p> <span class="blue">Max = 0&#949; (Mean = 0&#949;)</span><br> <br> (<span class="emphasis"><em>GSL 1.16:</em></span> Max = 0.991&#949; (Mean = 0.28&#949;))<br> (<span class="emphasis"><em>Cephes:</em></span> <span class="red">Max = 8.66e+15&#949; (Mean = 1.9e+15&#949;) <a class="link" href="../logs_and_tables/logs.html#your_sha256_hashclose_to_and_less_than_1">And other failures.</a>)</span> </p> </td> <td> <p> <span class="blue">Max = 0.998&#949; (Mean = 0.508&#949;)</span><br> <br> (<span class="emphasis"><em>&lt;tr1/cmath&gt;:</em></span> Max = 8.53e+06&#949; (Mean = 1.87e+06&#949;)) </p> </td> <td> <p> <span class="blue">Max = 0.998&#949; (Mean = 0.568&#949;)</span> </p> </td> </tr> <tr> <td> <p> Zeta: Integer arguments </p> </td> <td> <p> <span class="blue">Max = 6.5&#949; (Mean = 2.17&#949;)</span> </p> </td> <td> <p> <span class="blue">Max = 0&#949; (Mean = 0&#949;)</span><br> <br> (<span class="emphasis"><em>GSL 1.16:</em></span> Max = 3.75&#949; (Mean = 1.1&#949;))<br> (<span class="emphasis"><em>Cephes:</em></span> <span class="red">Max = +INF&#949; (Mean = +INF&#949;) <a class="link" href="../logs_and_tables/logs.html#your_sha256_hash_arguments">And other failures.</a>)</span> </p> </td> <td> <p> <span class="blue">Max = 9&#949; (Mean = 3.06&#949;)</span><br> <br> (<span class="emphasis"><em>&lt;tr1/cmath&gt;:</em></span> Max = 70.3&#949; (Mean = 17.4&#949;)) </p> </td> <td> <p> <span class="blue">Max = 21&#949; (Mean = 7.13&#949;)</span> </p> </td> </tr> </tbody> </table></div> </div> <br class="table-break"><h5> <a name="math_toolkit.zetas.zeta.h3"></a> <span class="phrase"><a name="math_toolkit.zetas.zeta.testing"></a></span><a class="link" href="zeta.html#math_toolkit.zetas.zeta.testing">Testing</a> </h5> <p> The tests for these functions come in two parts: basic sanity checks use spot values calculated using <a href="path_to_url" target="_top">Mathworld's online evaluator</a>, while accuracy checks use high-precision test values 
calculated at 1000-bit precision with <a href="path_to_url" target="_top">NTL::RR</a> and this implementation. Note that the generic and type-specific versions of these functions use differing implementations internally, so this gives us reasonably independent test data. Using our test data to test other "known good" implementations also provides an additional sanity check. </p> <h5> <a name="math_toolkit.zetas.zeta.h4"></a> <span class="phrase"><a name="math_toolkit.zetas.zeta.implementation"></a></span><a class="link" href="zeta.html#math_toolkit.zetas.zeta.implementation">Implementation</a> </h5> <p> All versions of these functions first use the usual reflection formulas to make their arguments positive: </p> <p> <span class="inlinemediaobject"><img src="../../../equations/zeta3.svg"></span> </p> <p> The generic versions of these functions are implemented using the series: </p> <p> <span class="inlinemediaobject"><img src="../../../equations/zeta6.svg"></span> </p> <p> When the significand (mantissa) size is recognised (currently for 53, 64 and 113-bit reals, plus single-precision 24-bit handled via promotion to double) then a series of rational approximations <a class="link" href="../sf_implementation.html#math_toolkit.sf_implementation.rational_approximations_used">devised by JM</a> are used. </p> <p> For 0 &lt; z &lt; 1 the approximating form is: </p> <p> <span class="inlinemediaobject"><img src="../../../equations/zeta4.svg"></span> </p> <p> For a rational approximation R(1-z) and a constant C. </p> <p> For 1 &lt; z &lt; 4 the approximating form is: </p> <p> <span class="inlinemediaobject"><img src="../../../equations/zeta5.svg"></span> </p> <p> For a rational approximation R(n-z) and a constant C and integer n. 
</p>
<p>
For z &gt; 4 the approximating form is:
</p>
<p>
&#950;(z) = 1 + e<sup>R(z - n)</sup>
</p>
<p>
For a rational approximation R(z-n) and integer n; note that the accuracy required for R(z-n) is not full machine precision, but an absolute error of &#949;/R(0). This saves us quite a few digits when dealing with large z, especially when &#949; is small.
</p>
<p>
Finally, there are some special cases for integer arguments: there are closed forms for negative or even integers:
</p>
<p>
<span class="inlinemediaobject"><img src="../../../equations/zeta7.svg"></span>
</p>
<p>
<span class="inlinemediaobject"><img src="../../../equations/zeta8.svg"></span>
</p>
<p>
<span class="inlinemediaobject"><img src="../../../equations/zeta9.svg"></span>
</p>
<p>
and for positive odd integers we simply cache pre-computed values, as these are of great benefit to some infinite series calculations.
</p>
</div>
<table xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" width="100%"><tr>
<td align="left"></td>
<td align="right"><div class="copyright-footer">Agrawal, Anton Bikineev, Paul A. Bristow, Marco Guazzone, Christopher Kormanyos, Hubert Holin, Bruno Lalande, John Maddock, Jeremy Murphy, Johan R&#229;de, Gautam Sewani, Benjamin Sobotta, Nicholas Thompson, Thijs van den Berg, Daryle Walker and Xiaogang Zhang<p>
Distributed under the Boost Software License, Version 1.0. (See accompanying file LICENSE_1_0.txt or copy at <a href="path_to_url" target="_top">path_to_url</a>)
</p>
</div></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="../zetas.html"><img src="../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../zetas.html"><img src="../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../index.html"><img src="../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="../expint.html"><img src="../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
</body>
</html>
```
```java /* * Use of this source code is governed by the GPL v3 license * that can be found in the LICENSE file. */ package de.neemann.digital.draw.shapes; import de.neemann.digital.core.BitsException; import de.neemann.digital.core.element.ElementAttributes; import de.neemann.digital.core.element.Keys; import de.neemann.digital.core.element.PinDescriptions; import de.neemann.digital.draw.elements.IOState; import de.neemann.digital.draw.elements.Pin; import de.neemann.digital.draw.elements.Pins; import de.neemann.digital.draw.graphics.*; import static de.neemann.digital.draw.shapes.GenericShape.SIZE; import static de.neemann.digital.draw.shapes.GenericShape.SIZE2; /** * The Splitter shape */ public class SplitterShape implements Shape { private final PinDescriptions inputs; private final PinDescriptions outputs; private final int length; private final int spreading; private Pins pins; /** * Creates a new instance * * @param attr the attributes * @param inputs the inputs * @param outputs the outputs * @throws BitsException BitsException */ public SplitterShape(ElementAttributes attr, PinDescriptions inputs, PinDescriptions outputs) throws BitsException { this.inputs = inputs; this.outputs = outputs; spreading = attr.get(Keys.SPLITTER_SPREADING); length = (Math.max(inputs.size(), outputs.size()) - 1) * spreading * SIZE + 2; } @Override public Pins getPins() { if (pins == null) { pins = new Pins(); for (int i = 0; i < inputs.size(); i++) pins.add(new Pin(new Vector(0, i * spreading * SIZE), inputs.get(i))); for (int i = 0; i < outputs.size(); i++) pins.add(new Pin(new Vector(SIZE, i * spreading * SIZE), outputs.get(i))); } return pins; } @Override public Interactor applyStateMonitor(IOState ioState) { return null; } @Override public void drawTo(Graphic graphic, Style heighLight) { for (int i = 0; i < inputs.size(); i++) { Vector pos = new Vector(-2, i * spreading * SIZE - 3); graphic.drawText(pos, inputs.get(i).getName(), Orientation.RIGHTBOTTOM, Style.SHAPE_SPLITTER); 
graphic.drawLine(new Vector(0, i * spreading * SIZE), new Vector(SIZE2, i * spreading * SIZE), Style.NORMAL); } for (int i = 0; i < outputs.size(); i++) { Vector pos = new Vector(SIZE + 2, i * spreading * SIZE - 3); graphic.drawText(pos, outputs.get(i).getName(), Orientation.LEFTBOTTOM, Style.SHAPE_SPLITTER); graphic.drawLine(new Vector(SIZE, i * spreading * SIZE), new Vector(SIZE2, i * spreading * SIZE), Style.NORMAL); } graphic.drawPolygon(new Polygon(true) .add(SIZE2 - 2, -2) .add(SIZE2 + 2, -2) .add(SIZE2 + 2, length) .add(SIZE2 - 2, length), Style.FILLED); } } ```
```python
#!/usr/bin/env python3
# ################################################################
# All rights reserved.
#
# This source code is licensed under both the BSD-style license (found in the
# LICENSE file in the root directory of this source tree) and the GPLv2 (found
# in the COPYING file in the root directory of this source tree).
# You may select, at your option, one of the above-listed licenses.
# ##########################################################################
# Rate limiter, replacement for pv
# this rate limiter does not "catch up" after a blocking period
# Limitations:
# - only accepts limit speed in MB/s

import sys
import time

MB = 1024 * 1024
rate = float(sys.argv[1]) * MB  # target throughput in bytes per second
start = time.time()
total_read = 0

# sys.stderr.close()  # remove error message, for Ctrl+C

try:
    buf = " "
    while len(buf):
        now = time.time()
        # read only what the elapsed interval allows: at least 1 byte,
        # at most 1 MB, with no catch-up after a blocking period
        to_read = max(int(rate * (now - start)), 1)
        max_buf_size = 1 * MB
        to_read = min(to_read, max_buf_size)
        start = now

        buf = sys.stdin.buffer.read(to_read)
        sys.stdout.buffer.write(buf)
except (KeyboardInterrupt, BrokenPipeError):
    pass
```
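The chunk-size computation at the heart of the loop above can be isolated and checked on its own. `chunk_size` is a hypothetical helper name introduced for this sketch; it is not part of the script:

```python
MB = 1024 * 1024


def chunk_size(rate_bytes_per_s: float, elapsed_s: float, cap: int = MB) -> int:
    """Bytes the limiter will request for one interval.

    Mirrors the loop body above: at least 1 byte (so progress is always
    made), at most `cap`, and proportional only to the elapsed interval,
    so there is no catch-up after a blocking period.
    """
    return min(max(int(rate_bytes_per_s * elapsed_s), 1), cap)


# At 10 MB/s, a 0.05 s interval allows half a megabyte.
print(chunk_size(10 * MB, 0.05))  # 524288
```

Because the interval is measured from the previous iteration only, a long stall in the downstream pipe does not produce a burst afterwards — which is exactly the "does not catch up" behaviour noted in the script header.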
The Ristić Palace (Serbian Cyrillic: Ристићева палата; Macedonian: Ристиќева палата, Ristikjeva palata) is a monumental building on Macedonia Square in Skopje, North Macedonia; the 'СКОПСКО' sign on its roof advertises Skopsko, a popular local beer brand. The palace stands on the southern side of the Vardar river, in the southern part of Macedonia Square. Just to the east is the birthplace of Mother Teresa, and to the south are the Memorial House of Mother Teresa and the headquarters of the Ministry of Transport and Communications of North Macedonia. The palace was built in 1926 and is currently used as an office block.

History

Built in 1926 by Vladislav Ristić (Serbian Cyrillic: Владислав Ристић), a Serbian pharmacist, the building housed offices on the ground floor, while the Ristiḱ family lived on the upper floors; today it is a complex of business offices. The palace is one of the few large buildings in Skopje from that period to have survived the 1963 Skopje earthquake, which struck what was then part of SFR Yugoslavia and is now North Macedonia. The architectural design is credited to Dragutin Maslać, and the construction, including the building's sculptural elements, to Danilo Stanković. The palace was at one time threatened with demolition because some 50 square meters of it were alleged to be illegal construction, but the government of Skopje passed a law preserving it as a Cultural Heritage landmark.

Description

The palace was named after its owner, Vladislav Ristić. Designed by Dragutin Maslać, it is typical of the buildings erected by wealthy businessmen of Skopje. The ground floor formerly housed business premises, the basement was used as storerooms, and the upper floors served as the residence of the owner and his family.
Even at the time it was built, the building had many modern facilities, such as a refrigerator that ran on ice, telephones, and a toilet and bathroom attached to each room.

The Ristiḱ Palace is a cream and beige painted building; above the basement it comprises the ground floor, a first floor, a midsection of three floors, and an attic and roof section that carries the sign on top. The ground floor today contains shops, and flowers are sold to the right of the building. Two of the rooms on the second floor (the lowest floor of the three-floor midsection) have small balconies overlooking the square; their railings are low but elegantly designed. Above the two balconies, between the second and third floors and again between the third and fourth floors, are symmetrical sculptural ornaments painted white/cream, two side by side above each window, eight in total facing the square.

The building's designers and architects were aware of the seismic conditions of the area, based on past experience of damaging earthquakes in metropolitan cities like Skopje, and they therefore took the expected earthquake magnitudes into account as design factors. This is one reason the palace survived the 1963 Skopje earthquake, a magnitude 6.1 event on July 26, 1963 that killed 1,070 people and injured many more. The Ristiḱ Palace was one of the few buildings left standing when nearly 70% of the buildings in the city were destroyed. Most buildings in Skopje had been built with vertical load-bearing walls, which is one reason cited for the collapses; the materials used in construction are another. The palace, by contrast, was built to modern standards.
The Ikonomov House, built in 1922 by the architect Boris Dutov, and the Todorov House, built in 1927 by the architect Novakovic, stand in the same area, which became the elite quarter of Skopje.
```tsx
import React from 'react'
// import { map, redirect, route } from 'navi'
// import Form from '../styled/Form'
// import { RoutingContext } from '../types/RoutingContext'

// export default map(async (request, context: RoutingContext) => {
//   if (context.currentUser) {
//     return redirect('/')
//   }

//   if (request.method === 'post') {
//     let { email, password } = request.body
//     try {
//       await request.serializeEffectToHistory(() =>
//         context.firebase.auth.signInWithEmailAndPassword(email, password)
//       )
//       return redirect('/')
//     }
//     catch (error) {
//       return route({
//         error,
//         view: <Login />
//       })
//     }
//   }

//   return route({
//     view: <Login />
//   })
// })

// function Login() {
//   return (
//     <Form method='post'>
//       <h1>Need a new Password?</h1>
//       <Form.Errors />
//       <Form.Field
//         label='Your email'
//         name='email'
//         validate={value =>
//           value === '' ? 'Please enter your email.' : undefined
//         }
//       />
//       <Form.SubmitButton>
//         Request a new Password
//       </Form.SubmitButton>
//     </Form>
//   )
// }
```
```c++ //your_sha256_hash--------------------------------------- // ChakraCore/Pal // Contains portions (c) copyright Microsoft, portions copyright (c) the .NET Foundation and Contributors // and edits (c) copyright the ChakraCore Contributors. // See THIRD-PARTY-NOTICES.txt in the project root for .NET Foundation license //your_sha256_hash--------------------------------------- /*++ Module Name: init/pal.cpp Abstract: Implementation of PAL exported functions not part of the Win32 API. --*/ #include "pal/thread.hpp" #include "pal/synchobjects.hpp" #include "pal/procobj.hpp" #include "pal/cs.hpp" #include "pal/file.hpp" #include "pal/map.hpp" #include "../objmgr/shmobjectmanager.hpp" #include "pal/palinternal.h" #include "pal/dbgmsg.h" #include "pal/shmemory.h" #include "pal/process.h" #include "../thread/procprivate.hpp" #include "pal/module.h" #include "pal/virtual.h" #include "pal/misc.h" #include "pal/utils.h" #include "pal/debug.h" #include "pal/locale.h" #include "pal/init.h" #if HAVE_MACH_EXCEPTIONS #include "../exception/machexception.h" #else #include "../exception/signal.hpp" #endif #include <stdlib.h> #include <unistd.h> #include <pwd.h> #include <errno.h> #include <sys/types.h> #include <sys/param.h> #include <sys/resource.h> #include <sys/stat.h> #include <limits.h> #include <string.h> #include <fcntl.h> #if HAVE_POLL #include <poll.h> #else #include "pal/fakepoll.h" #endif // HAVE_POLL #if defined(__APPLE__) #include <sys/sysctl.h> int CacheLineSize; #endif //__APPLE__ #ifdef __APPLE__ #include <mach-o/dyld.h> #endif // __APPLE__ using namespace CorUnix; // // $$TODO The C++ compiler doesn't like pal/cruntime.h so duplicate the // necessary prototype here // extern "C" BOOL CRTInitStdStreams( void ); SET_DEFAULT_DEBUG_CHANNEL(PAL); Volatile<INT> init_count PAL_GLOBAL = 0; Volatile<BOOL> shutdown_intent PAL_GLOBAL = 0; Volatile<LONG> g_chakraCoreInitialized PAL_GLOBAL = 0; static BOOL g_fThreadDataAvailable = FALSE; static pthread_mutex_t 
init_critsec_mutex = PTHREAD_MUTEX_INITIALIZER;

/* critical section to protect access to init_count. This is allocated on the
   very first PAL_Initialize call, and is freed afterward. */
static PCRITICAL_SECTION init_critsec = NULL;

static LPWSTR INIT_FormatCommandLine (int argc, const char * const *argv);
static LPWSTR INIT_FindEXEPath(LPCSTR exe_name);

#ifdef _DEBUG
extern void PROCDumpThreadList(void);
#endif

/*++
Function:
  Initialize

Abstract:
  Common PAL initialization function.

Return:
  0 if successful
  -1 if it failed

--*/
static int
Initialize()
{
    PAL_ERROR palError = ERROR_GEN_FAILURE;
    CPalThread *pThread = NULL;
    CSharedMemoryObjectManager *pshmom = NULL;
    int retval = -1;
    bool fFirstTimeInit = false;

    /* the first ENTRY within the first call to PAL_Initialize is a special
       case, since debug channels are not initialized yet. So in that case the
       ENTRY will be called after the DBG channels initialization */
    ENTRY_EXTERNAL("PAL_Initialize()\n");

    /* First, initialize the last error */
    SetLastError(ERROR_GEN_FAILURE);

    // prevent unreasonable stack limits. (otherwise affects mmap calls later)
#if !defined(__IOS__) && !defined(__ANDROID__)
#if defined (_AMD64_) || defined (_M_ARM64)
    const rlim_t maxStackSize = 8 * 1024 * 1024; // CC Max stack size
#else
    const rlim_t maxStackSize = 2 * 1024 * 1024; // CC Max stack size
#endif
    struct rlimit rl;
    int err = getrlimit(RLIMIT_STACK, &rl);
    if (!err)
    {
        if (rl.rlim_cur > maxStackSize)
        {
            rl.rlim_cur = maxStackSize;
            err = setrlimit(RLIMIT_STACK, &rl);
            _ASSERTE(err == 0 && "Well, the environment has a strange stack limit \
                and setrlimit call failed to fix that");
        }
    }
#endif // !__IOS__ && !__ANDROID__

    CriticalSectionSubSysInitialize();

    if(NULL == init_critsec)
    {
        pthread_mutex_lock(&init_critsec_mutex); // prevents race condition of two threads
                                                 // initializing the critical section.
        if(NULL == init_critsec)
        {
            static CRITICAL_SECTION temp_critsec;

            // Want this critical section to NOT be internal to avoid the use of unsafe region markers.
InternalInitializeCriticalSectionAndSpinCount(&temp_critsec, 0, false); if(NULL != InterlockedCompareExchangePointer(&init_critsec, &temp_critsec, NULL)) { // Another thread got in before us! shouldn't happen, if the PAL // isn't initialized there shouldn't be any other threads WARN("Another thread initialized the critical section\n"); InternalDeleteCriticalSection(&temp_critsec); } } pthread_mutex_unlock(&init_critsec_mutex); } InternalEnterCriticalSection(pThread, init_critsec); // here pThread is always NULL if (init_count == 0) { // Set our pid. gPID = getpid(); fFirstTimeInit = true; // Initialize the TLS lookaside cache if (FALSE == TLSInitialize()) { goto done; } // Initialize the environment. if (FALSE == MiscInitialize()) { goto done; } // Initialize debug channel settings before anything else. // This depends on the environment, so it must come after // MiscInitialize. if (FALSE == DBG_init_channels()) { goto done; } #if _DEBUG // Verify that our page size is what we think it is. If it's // different, we can't run. if (VIRTUAL_PAGE_SIZE != getpagesize()) { ASSERT("VIRTUAL_PAGE_SIZE is incorrect for this system!\n" "Change include/pal/virtual.h and clr/src/inc/stdmacros.h " "to reflect the correct page size of %d.\n", getpagesize()); } #endif // _DEBUG /* initialize the shared memory infrastructure */ if (!SHMInitialize()) { ERROR("Shared memory initialization failed!\n"); goto CLEANUP0; } // // Initialize global process data // palError = InitializeProcessData(); if (NO_ERROR != palError) { ERROR("Unable to initialize process data\n"); goto CLEANUP1; } #if HAVE_MACH_EXCEPTIONS // Mach exception port needs to be set up before the thread // data or threads are set up. 
if (!SEHInitializeMachExceptions()) { ERROR("SEHInitializeMachExceptions failed!\n"); palError = ERROR_GEN_FAILURE; goto CLEANUP1; } #endif // HAVE_MACH_EXCEPTIONS // // Allocate the initial thread data // palError = CreateThreadData(&pThread); if (NO_ERROR != palError) { ERROR("Unable to create initial thread data\n"); goto CLEANUP1a; } PROCAddThread(pThread, pThread); // // It's now safe to access our thread data // g_fThreadDataAvailable = TRUE; // // Initialize module manager // if (FALSE == LOADInitializeModules()) { ERROR("Unable to initialize module manager\n"); palError = ERROR_INTERNAL_ERROR; goto CLEANUP1b; } // // Initialize the object manager // pshmom = InternalNew<CSharedMemoryObjectManager>(); if (NULL == pshmom) { ERROR("Unable to allocate new object manager\n"); palError = ERROR_OUTOFMEMORY; goto CLEANUP1b; } palError = pshmom->Initialize(); if (NO_ERROR != palError) { ERROR("object manager initialization failed!\n"); InternalDelete(pshmom); goto CLEANUP1b; } g_pObjectManager = pshmom; // // Initialize the synchronization manager // g_pSynchronizationManager = CPalSynchMgrController::CreatePalSynchronizationManager(); if (NULL == g_pSynchronizationManager) { palError = ERROR_NOT_ENOUGH_MEMORY; ERROR("Failure creating synchronization manager\n"); goto CLEANUP1c; } } else { pThread = InternalGetCurrentThread(); } palError = ERROR_GEN_FAILURE; if (init_count == 0) { // // Create the initial process and thread objects // palError = CreateInitialProcessAndThreadObjects(pThread); if (NO_ERROR != palError) { ERROR("Unable to create initial process and thread objects\n"); goto CLEANUP5; } #if !HAVE_MACH_EXCEPTIONS if(!SEHInitializeSignals()) { goto CLEANUP5; } #endif palError = ERROR_GEN_FAILURE; if (FALSE == TIMEInitialize()) { ERROR("Unable to initialize TIME support\n"); goto CLEANUP6; } /* Initialize the File mapping critical section. 
*/ if (FALSE == MAPInitialize()) { ERROR("Unable to initialize file mapping support\n"); goto CLEANUP6; } /* create file objects for standard handles */ if(!FILEInitStdHandles()) { ERROR("Unable to initialize standard file handles\n"); goto CLEANUP13; } if (FALSE == CRTInitStdStreams()) { ERROR("Unable to initialize CRT standard streams\n"); goto CLEANUP15; } TRACE("First-time PAL initialization complete.\n"); init_count++; /* Set LastError to a non-good value - functions within the PAL startup may set lasterror to a nonzero value. */ SetLastError(NO_ERROR); retval = 0; } else { init_count++; // Behave the same wrt entering the PAL independent of whether this // is the first call to PAL_Initialize or not. The first call implied // PAL_Enter by virtue of creating the CPalThread for the current // thread, and its starting state is to be in the PAL. (void)PAL_Enter(PAL_BoundaryTop); TRACE("Initialization count increases to %d\n", init_count.Load()); SetLastError(NO_ERROR); retval = 0; } goto done; /* No cleanup required for CRTInitStdStreams */ CLEANUP15: FILECleanupStdHandles(); CLEANUP13: VIRTUALCleanup(); MAPCleanup(); CLEANUP6: CLEANUP5: PROCCleanupInitialProcess(); CLEANUP1d: // Cleanup synchronization manager CLEANUP1c: // Cleanup object manager CLEANUP1b: // Cleanup initial thread data CLEANUP1a: // Cleanup global process data CLEANUP1: SHMCleanup(); CLEANUP0: ERROR("PAL_Initialize failed\n"); SetLastError(palError); done: #ifdef PAL_PERF if( retval == 0) { PERFEnableProcessProfile(); PERFEnableThreadProfile(FALSE); PERFCalibrate("Overhead of PERF entry/exit"); } #endif InternalLeaveCriticalSection(pThread, init_critsec); if (fFirstTimeInit && 0 == retval) { _ASSERTE(NULL != pThread); } if (retval != 0 && GetLastError() == ERROR_SUCCESS) { ASSERT("returning failure, but last error not set\n"); } LOGEXIT("PAL_Initialize returns int %d\n", retval); return retval; } /*++ Function: PAL_InitializeChakraCore Abstract: A replacement for PAL_Initialize when starting 
the host process that hosts ChakraCore

    This routine also makes sure the pseudo dynamic libraries like PALRT
    have their initialization methods called.

Return:
  ERROR_SUCCESS if successful
  An error code, if it failed

--*/
#if defined(ENABLE_CC_XPLAT_TRACE) || defined(DEBUG)
bool PAL_InitializeChakraCoreCalled = false;
#endif

int
PALAPI
PAL_InitializeChakraCore()
{
    // this is not thread safe but PAL_InitializeChakraCore is per process
    // besides, calling Jsrt initializer function is thread safe
    if (init_count > 0) return ERROR_SUCCESS;

#if defined(ENABLE_CC_XPLAT_TRACE) || defined(DEBUG)
    PAL_InitializeChakraCoreCalled = true;
#endif

    if (Initialize())
    {
        return GetLastError();
    }

    if (FALSE == VIRTUALInitialize())
    {
        ERROR("Unable to initialize virtual memory support\n");
        return ERROR_GEN_FAILURE;
    }

    // Check for a repeated call (this is a no-op).
    if (InterlockedIncrement(&g_chakraCoreInitialized) > 1)
    {
        PAL_Enter(PAL_BoundaryTop);
        return ERROR_SUCCESS;
    }

    return ERROR_SUCCESS;
}

/*++
Function:
  PAL_IsDebuggerPresent

Abstract:
  This function should be used to determine if a debugger is attached to the
  process.
--*/
PALIMPORT
BOOL
PALAPI
PAL_IsDebuggerPresent()
{
#if defined(__LINUX__)
    BOOL debugger_present = FALSE;
    char buf[2048];

    int status_fd = open("/proc/self/status", O_RDONLY);
    if (status_fd == -1)
    {
        return FALSE;
    }
    ssize_t num_read = read(status_fd, buf, sizeof(buf) - 1);

    if (num_read > 0)
    {
        static const char TracerPid[] = "TracerPid:";
        char *tracer_pid;

        buf[num_read] = '\0';
        tracer_pid = strstr(buf, TracerPid);
        if (tracer_pid)
        {
            debugger_present = !!atoi(tracer_pid + sizeof(TracerPid) - 1);
        }
    }

    close(status_fd);
    return debugger_present;
#elif defined(__APPLE__)
    struct kinfo_proc info = {};
    size_t size = sizeof(info);
    int mib[4] = { CTL_KERN, KERN_PROC, KERN_PROC_PID, getpid() };
    int ret = sysctl(mib, sizeof(mib)/sizeof(*mib), &info, &size, NULL, 0);

    if (ret == 0)
        return ((info.kp_proc.p_flag & P_TRACED) != 0);

    return FALSE;
#else
    return FALSE;
#endif
}

/*++
Function:
  PAL_Shutdown

Abstract:
  This function shuts down the PAL WITHOUT exiting the current process.
--*/
void
PALAPI
PAL_Shutdown(
    void)
{
    TerminateCurrentProcessNoExit(FALSE /* bTerminateUnconditionally */);
}

/*++
Function:
  PAL_Terminate

Abstract:
  This function is called when a thread has finished using the PAL library.
  It shuts down PAL and exits the current process.
--*/
void
PALAPI
PAL_Terminate(
    void)
{
    PAL_TerminateEx(0);
}

/*++
Function:
  PAL_TerminateEx

Abstract:
  This function is called when a thread has finished using the PAL library.
  It shuts down PAL and exits the current process with the specified exit
  code.
--*/ void PALAPI PAL_TerminateEx( int exitCode) { ENTRY_EXTERNAL("PAL_TerminateEx()\n"); if (NULL == init_critsec) { /* note that these macros probably won't output anything, since the debug channels haven't been initialized yet */ ASSERT("PAL_Initialize has never been called!\n"); LOGEXIT("PAL_Terminate returns.\n"); } PALSetShutdownIntent(); LOGEXIT("PAL_TerminateEx is exiting the current process.\n"); exit(exitCode); } /*++ Function: PALIsThreadDataInitialized Returns TRUE if startup has reached a point where thread data is available --*/ BOOL PALIsThreadDataInitialized() { return g_fThreadDataAvailable; } /*++ Function: PALCommonCleanup Utility function to prepare for shutdown. --*/ void PALCommonCleanup() { static bool cleanupDone = false; if (!cleanupDone) { cleanupDone = true; PALSetShutdownIntent(); // // Let the synchronization manager know we're about to shutdown // CPalSynchMgrController::PrepareForShutdown(); #ifdef _DEBUG PROCDumpThreadList(); #endif } } /*++ Function: PALShutdown sets the PAL's initialization count to zero, so that PALIsInitialized will return FALSE. called by PROCCleanupProcess to tell some functions that the PAL isn't fully functional, and that they should use an alternate code path (no parameters, no return value) --*/ void PALShutdown() { init_count = 0; } BOOL PALIsShuttingDown() { /* ROTORTODO: This function may be used to provide a reader/writer-like mechanism (or a ref counting one) to prevent PAL APIs that need to access PAL runtime data, from working when PAL is shutting down. Each of those APIs should acquire read access while executing. The shutting down code would acquire a write lock, i.e. suspending any new incoming reader, and waiting for the current readers to be done.
That would allow us to get rid of the dangerous suspend-all-other-threads at shutdown time */ return shutdown_intent; } void PALSetShutdownIntent() { /* ROTORTODO: See comment in PALIsShuttingDown */ shutdown_intent = TRUE; } /*++ Function: PALInitLock Take the initialization critical section (init_critsec). necessary to serialize TerminateProcess along with PAL_Terminate and PAL_Initialize (no parameters) Return value : TRUE if critical section existed (and was acquired) FALSE if critical section doesn't exist yet --*/ BOOL PALInitLock(void) { if(!init_critsec) { return FALSE; } CPalThread * pThread = (PALIsThreadDataInitialized() ? InternalGetCurrentThread() : NULL); InternalEnterCriticalSection(pThread, init_critsec); return TRUE; } /*++ Function: PALInitUnlock Release the initialization critical section (init_critsec). (no parameters, no return value) --*/ void PALInitUnlock(void) { if(!init_critsec) { return; } CPalThread * pThread = (PALIsThreadDataInitialized() ? InternalGetCurrentThread() : NULL); InternalLeaveCriticalSection(pThread, init_critsec); } /*++ Function: INIT_FormatCommandLine [Internal] Abstract: This function converts an array of arguments (argv) into a Unicode command-line for use by GetCommandLineW Parameters : int argc : number of arguments in argv char **argv : argument list in an array of NULL-terminated strings Return value : pointer to Unicode command line. This is a buffer allocated with malloc; caller is responsible for freeing it with free() Note : not all peculiarities of Windows command-line processing are supported; -what is supported : -arguments with white-space must be double quoted (we'll just double-quote all arguments to simplify things) -some characters must be escaped with \ : particularly, the double-quote, to avoid confusion with the double-quotes at the start and end of arguments, and \ itself, to avoid confusion with escape sequences.
-what is not supported: -under Windows, \\ is interpreted as an escaped \ ONLY if it's followed by an escaped double-quote \". \\\" is passed to argv as \", but \\a is passed to argv as \\a... there may be other similar cases -there may be other characters which must be escaped --*/ static LPWSTR INIT_FormatCommandLine (int argc, const char * const *argv) { LPWSTR retval; LPSTR command_line=NULL, command_ptr; LPCSTR arg_ptr; INT length, i,j; BOOL bQuoted = FALSE; /* list of characters that need to be escaped with \ when building the command line. currently " and \ */ LPCSTR ESCAPE_CHARS="\"\\"; /* allocate temporary memory for the string. Play it safe : double the length of each argument (in case they're composed exclusively of escaped characters), and add 3 (for the double-quotes and separating space). This is temporary anyway, we return a LPWSTR */ length=0; for(i=0; i<argc; i++) { TRACE("argument %d is %s\n", i, argv[i]); length+=3; length+=strlen(argv[i])*2; } command_line = reinterpret_cast<LPSTR>(InternalMalloc(length)); if(!command_line) { ERROR("couldn't allocate memory for command line!\n"); return NULL; } command_ptr=command_line; for(i=0; i<argc; i++) { /* double-quote at beginning of argument containing at least one space */ for(j = 0; (argv[i][j] != 0) && (!isspace((unsigned char) argv[i][j])); j++); if (argv[i][j] != 0) { *command_ptr++='"'; bQuoted = TRUE; } /* process the argument one character at a time */ for(arg_ptr=argv[i]; *arg_ptr; arg_ptr++) { /* if character needs to be escaped, prepend a \ to it. */ if( strchr(ESCAPE_CHARS,*arg_ptr)) { *command_ptr++='\\'; } /* now we can copy the actual character over.
*/ *command_ptr++=*arg_ptr; } /* double-quote at end of argument; space to separate arguments */ if (bQuoted == TRUE) { *command_ptr++='"'; bQuoted = FALSE; } *command_ptr++=' '; } /* replace the last space with a NULL terminator */ command_ptr--; *command_ptr='\0'; /* convert to Unicode */ i = MultiByteToWideChar(CP_ACP, 0,command_line, -1, NULL, 0); if (i == 0) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(command_line); return NULL; } retval = reinterpret_cast<LPWSTR>(InternalMalloc((sizeof(WCHAR)*i))); if(retval == NULL) { ERROR("can't allocate memory for Unicode command line!\n"); InternalFree(command_line); return NULL; } if(!MultiByteToWideChar(CP_ACP, 0,command_line, i, retval, i)) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(retval); retval = NULL; } else TRACE("Command line is %s\n", command_line); InternalFree(command_line); return retval; } /*++ Function: INIT_FindEXEPath Abstract: Determine the full, canonical path of the current executable by searching $PATH. Parameters: LPCSTR exe_name : file to search for Return: pointer to buffer containing the full path. This buffer must be released by the caller using free() Notes : this function assumes that "exe_name" is in Unix style (no \) Notes 2: This doesn't handle the case of directories with the desired name (and directories are usually executable...) 
--*/ static LPWSTR INIT_FindEXEPath(LPCSTR exe_name) { #ifndef __APPLE__ CHAR real_path[PATH_MAX+1]; LPSTR env_path; LPSTR path_ptr; LPSTR cur_dir; INT exe_name_length; BOOL need_slash; LPWSTR return_value; INT return_size; struct stat theStats; /* if a path is specified, only search there */ if(strchr(exe_name, '/')) { if ( -1 == stat( exe_name, &theStats ) ) { ERROR( "The file does not exist\n" ); return NULL; } if ( UTIL_IsExecuteBitsSet( &theStats ) ) { if(!realpath(exe_name, real_path)) { ERROR("realpath() failed!\n"); return NULL; } return_size=MultiByteToWideChar(CP_ACP,0,real_path,-1,NULL,0); if ( 0 == return_size ) { ASSERT("MultiByteToWideChar failure\n"); return NULL; } return_value = reinterpret_cast<LPWSTR>(InternalMalloc((return_size*sizeof(WCHAR)))); if ( NULL == return_value ) { ERROR("Not enough memory to create full path\n"); return NULL; } else { if(!MultiByteToWideChar(CP_ACP, 0, real_path, -1, return_value, return_size)) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(return_value); return_value = NULL; } else { TRACE("full path to executable is %s\n", real_path); } } return return_value; } } /* no path was specified : search $PATH */ env_path=MiscGetenv("PATH"); if(!env_path || *env_path=='\0') { WARN("$PATH isn't set.\n"); goto last_resort; } /* get our own copy of env_path so we can modify it */ env_path=InternalStrdup(env_path); if(!env_path) { ERROR("Not enough memory to copy $PATH!\n"); return NULL; } exe_name_length=strlen(exe_name); cur_dir=env_path; while(cur_dir) { LPSTR full_path; struct stat theStats; /* skip all leading ':' */ while(*cur_dir==':') { cur_dir++; } if(*cur_dir=='\0') { break; } /* cut string at next ':' */ path_ptr=strchr(cur_dir, ':'); if(path_ptr) { /* check if we need to add a '/' between the path and filename */ need_slash=(*(path_ptr-1))!='/'; /* NULL_terminate path element */ *path_ptr++='\0'; } else { /* check if we need to add a '/' between the path and filename */ 
need_slash=(cur_dir[strlen(cur_dir)-1])!='/'; } TRACE("looking for %s in %s\n", exe_name, cur_dir); /* build tentative full file name */ int iLength = (strlen(cur_dir)+exe_name_length+2); full_path = reinterpret_cast<LPSTR>(InternalMalloc(iLength)); if(!full_path) { ERROR("Not enough memory!\n"); break; } if (strcpy_s(full_path, iLength, cur_dir) != SAFECRT_SUCCESS) { ERROR("strcpy_s failed!\n"); InternalFree(full_path); InternalFree(env_path); return NULL; } if(need_slash) { if (strcat_s(full_path, iLength, "/") != SAFECRT_SUCCESS) { ERROR("strcat_s failed!\n"); InternalFree(full_path); InternalFree(env_path); return NULL; } } if (strcat_s(full_path, iLength, exe_name) != SAFECRT_SUCCESS) { ERROR("strcat_s failed!\n"); InternalFree(full_path); InternalFree(env_path); return NULL; } /* see if file exists AND is executable */ if ( -1 != stat( full_path, &theStats ) ) { if( UTIL_IsExecuteBitsSet( &theStats ) ) { /* generate canonical path */ if(!realpath(full_path, real_path)) { ERROR("realpath() failed!\n"); InternalFree(full_path); InternalFree(env_path); return NULL; } InternalFree(full_path); return_size = MultiByteToWideChar(CP_ACP,0,real_path,-1,NULL,0); if ( 0 == return_size ) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(env_path); return NULL; } return_value = reinterpret_cast<LPWSTR>(InternalMalloc((return_size*sizeof(WCHAR)))); if ( NULL == return_value ) { ERROR("Not enough memory to create full path\n"); InternalFree(env_path); return NULL; } if(!MultiByteToWideChar(CP_ACP, 0, real_path, -1, return_value, return_size)) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(return_value); return_value = NULL; } else { TRACE("found %s in %s; real path is %s\n", exe_name, cur_dir,real_path); } InternalFree(env_path); return return_value; } } /* file doesn't exist : keep searching */ InternalFree(full_path); /* path_ptr is NULL if there's no ':' after this directory */ cur_dir=path_ptr; } InternalFree(env_path); TRACE("No %s found in $PATH 
(%s)\n", exe_name, MiscGetenv("PATH")); last_resort: /* last resort : see if the executable is in the current directory. This is possible if it comes from a exec*() call. */ if(0 == stat(exe_name,&theStats)) { if ( UTIL_IsExecuteBitsSet( &theStats ) ) { if(!realpath(exe_name, real_path)) { ERROR("realpath() failed!\n"); return NULL; } return_size = MultiByteToWideChar(CP_ACP,0,real_path,-1,NULL,0); if (0 == return_size) { ASSERT("MultiByteToWideChar failure\n"); return NULL; } return_value = reinterpret_cast<LPWSTR>(InternalMalloc((return_size*sizeof(WCHAR)))); if (NULL == return_value) { ERROR("Not enough memory to create full path\n"); return NULL; } else { if(!MultiByteToWideChar(CP_ACP, 0, real_path, -1, return_value, return_size)) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(return_value); return_value = NULL; } else { TRACE("full path to executable is %s\n", real_path); } } return return_value; } else { ERROR("found %s in current directory, but it isn't executable!\n", exe_name); } } else { TRACE("last resort failed : executable %s is not in the current " "directory\n",exe_name); } ERROR("executable %s not found anywhere!\n", exe_name); return NULL; #else // !__APPLE__ // On the Mac we can just directly ask the OS for the executable path. 
CHAR exec_path[PATH_MAX+1]; LPWSTR return_value; INT return_size; uint32_t bufsize = sizeof(exec_path); if (_NSGetExecutablePath(exec_path, &bufsize)) { ASSERT("_NSGetExecutablePath failure\n"); return NULL; } return_size = MultiByteToWideChar(CP_ACP,0,exec_path,-1,NULL,0); if (0 == return_size) { ASSERT("MultiByteToWideChar failure\n"); return NULL; } return_value = reinterpret_cast<LPWSTR>(InternalMalloc((return_size*sizeof(WCHAR)))); if (NULL == return_value) { ERROR("Not enough memory to create full path\n"); return NULL; } else { if(!MultiByteToWideChar(CP_ACP, 0, exec_path, -1, return_value, return_size)) { ASSERT("MultiByteToWideChar failure\n"); InternalFree(return_value); return_value = NULL; } else { TRACE("full path to executable is %s\n", exec_path); } } return return_value; #endif // !__APPLE__ } ```
The 1965 American Road Race of Champions was the second running of the SCCA National Championship Runoffs. It took place on 27 and 28 November 1965 at Daytona International Speedway, on 3.1-mile and 1.6-mile courses. Although the National Championship had been cancelled, the ARRC was still not a championship race, as National Championships were awarded to Divisional winners. The most competitive drivers from SCCA's seven divisions were invited to the event. Changes for 1965 1965 saw several changes in SCCA's class structure. Formula Libre was split into Formula A, for under-3-litre racing engines, and Formula B, for 1.6-litre production engines. Formula Junior was replaced by Formula C for 1.1-litre racing engines. New cars were homologated for the Production classes, including the new Porsche 911. Some other cars were reclassified, for example the Austin-Healey 100-6. Race results Sources: Class winners for multi-class races in bold. Race 1 - H Production The first race, held on November 27, was the H Production race. It was held on the 1.6-mile course for 45 minutes and 32 laps. Race 2 - H Modified The H Modified race was held on November 27 on the 1.6-mile course for 45 minutes and 35 laps. Race 3 - G Production The G Production race was held on November 27 on the 1.6-mile course for 45 minutes and 33 laps. Race 4 - G Modified The G Modified race was held on November 27 on the 1.6-mile course for 45 minutes and 36 laps. Race 5 - F Production The F Production race was held on November 27 on the 1.6-mile course for 45 minutes and 34 laps. Race 6 - E Production The E Production race was held on November 27 on the 1.6-mile course for 45 minutes and 34 laps. Race 7 - C & D Production C Production and D Production cars raced in a multi-class race held on November 28 on the 3.1-mile course for 45 minutes and 24 laps. Race 8 - Formula A, B & C Formula B and Formula C cars raced in a multi-class race held on November 28 on the 1.6-mile course for 45 minutes and 38 laps.
Although the 3-litre Formula A had already been introduced, the number of cars was so small that no Formula A drivers were invited. Race 9 - Formula Vee The Formula Vee race was held on November 28 on the 1.6-mile course for 45 minutes and 33 laps. Race 10 - A & B Production A Production and B Production drivers raced in a multi-class race held on November 28 on the 3.1-mile course for 45 minutes and 25 laps. Race 11 - C, D, E & F Modified C Modified, D Modified, E Modified and F Modified drivers raced in a multi-class race held on November 28 on the 3.1-mile course for 45 minutes and 25 laps. Notes H Production Turgeon and Barton are both listed as sixth-place qualifiers. Only 15 cars qualified for the race, but Brownfield and Garrison are listed as 16th-place qualifiers. E Production Only 18 cars qualified for the race, but Zitza and Collins are listed as 19th- and 20th-place qualifiers. Formula A, B & C Only 19 cars qualified for the race, but SCCA's website lists 20th- and 21st-place qualifiers. Some other sources list Bunn as driving a Lola T55. A & B Production Heinz's qualifying position is unknown, but the 16th-place qualifier is also unknown, so Heinz was most likely the last qualifier in his class. C, D, E & F Modified Only 25 cars qualified for the race, but most sources list 32nd- through 36th-place qualifiers. Non-finishers' lap counts are listed differently in different sources. References SCCA National Sports Car Championship SCCA National Championship Runoffs 1965 in American motorsport
```markdown # An Introduction To `aima-python` The [aima-python](path_to_url repository implements, in Python code, the algorithms in the textbook *[Artificial Intelligence: A Modern Approach](path_to_url A typical module in the repository has the code for a single chapter in the book, but some modules combine several chapters. See [the index](path_to_url#index-of-code) if you can't find the algorithm you want. The code in this repository attempts to mirror the pseudocode in the textbook as closely as possible and to stress readability foremost; if you are looking for high-performance code with advanced features, there are other repositories for you. For each module, there are three or four files, for example: - [**`nlp.py`**](path_to_url Source code with data types and algorithms for natural language processing; functions have docstrings explaining their use. - [**`nlp.ipynb`**](path_to_url A notebook like this one; gives more detailed examples and explanations of use. - [**`nlp_apps.ipynb`**](path_to_url A Jupyter notebook that gives example applications of the code. - [**`tests/test_nlp.py`**](path_to_url Test cases, used to verify the code is correct, and also useful to see examples of use. There is also an [aima-java](path_to_url repository, if you prefer Java. ## What version of Python? The code is tested in Python [3.4](path_to_url and [3.5](path_to_url If you try a different version of Python 3 and find a problem, please report it as an [Issue](path_to_url We recommend the [Anaconda](path_to_url distribution of Python 3.5. It comes with additional tools like the powerful IPython interpreter, the Jupyter Notebook and many helpful packages for scientific computing. After installing Anaconda, you will be ready to run all the code and all the IPython notebooks. ## IPython notebooks The IPython notebooks in this repository explain how to use the modules, and give examples of usage. You can use them in three ways: 1. View static HTML pages.
(Just browse to the [repository](path_to_url and click on a `.ipynb` file link.) 2. Run, modify, and re-run code, live. (Download the repository (by [zip file](path_to_url or by `git` commands), start a Jupyter notebook server with the shell command "`jupyter notebook`" (issued from the directory where the files are), and click on the notebook you want to interact with.) 3. Binder - Click on the binder badge on the [repository](path_to_url main page to open the notebooks in an executable environment, online. This method does not require any extra installation. The code can be executed and modified from the browser itself. Note that this is an unstable option; there is a chance the notebooks will never load. You can [read about notebooks](path_to_url and then [get started](path_to_url``` ```markdown # Helpful Tips Most of these notebooks start by importing all the symbols in a module:``` ```python from logic import * ``` ```markdown From there, the notebook alternates explanations with examples of use. You can run the examples as they are, and you can modify the code cells (or add new cells) and run your own examples. If you have some really good examples to add, you can make a github pull request. If you want to see the source code of a function, you can open a browser or editor and see it in another window, or from within the notebook you can use the IPython magic function `%psource` (for "print source") or the function `psource` from `notebook.py`. Also, if the algorithm has pseudocode available, you can read it by calling the `pseudocode` function with the name of the algorithm passed as a parameter.``` ```python %psource WalkSAT ``` ```python from notebook import psource, pseudocode psource(WalkSAT) pseudocode("WalkSAT") ``` ```markdown Or see an abbreviated description of an object with a trailing question mark:``` ```python WalkSAT? ``` ```markdown # Authors This notebook is written by [Chirag Vertak](path_to_url and [Peter Norvig](path_to_url```
```xml <?xml version="1.0" encoding="UTF-8" ?> <!-- --> <!DOCTYPE ldml SYSTEM "../../dtd/cldr/common/dtd/ldml.dtd"> <ldml> <identity> <version number="$Revision$"/> <language type="ru"/> </identity> <rbnf> <rulesetGrouping type="SpelloutRules"> <ruleset type="lenient-parse" access="private"> <rbnfrule value="0">&amp;[last primary ignorable ] ' ' ',' '-' '';</rbnfrule> </ruleset> </rulesetGrouping> </rbnf> </ldml> ```
Javier Molina Casillas (born January 2, 1990) is an American professional boxer. As an amateur, he won the 2007 U.S. National Championships at the age of 17 and represented the United States the following year at the 2008 Beijing Olympics. Personal life Molina's father, Miguel, had a successful amateur boxing career in Ciudad Juárez, Mexico, before he migrated to the United States. His older brother Carlos is a highly regarded prospect with a 17-1-1 record, and his twin brother, Oscar Molina, fights for the Mexican Olympic team. Amateur career Molina, whose Vicente Fernández ring-entrance song "No Me Se Rajar" reflects the macho culture that prevails in Mexico, finished his amateur career with a record of 111-12. He won a bronze medal at the 2006 Cadet World Championships at lightweight and a national title at the 2006 Junior Olympic International Invitational. He knocked down Karl Dargan (a two-time 141-pound U.S. champion and winner of the 2007 Pan American Games) twice at the U.S. championships. He then won against Jeremy Bryan and Dan O'Connor, followed by Brad Solomon in the finals, to win the junior welterweight title. At the World Championships in 2007, he beat Azerbaijan's Emil Maharramov, the 2005 bronze medalist, 27-10, but lost to England's 2008 Olympian Bradley Saunders. 2008 Olympics At the Olympic qualifier, Molina beat Myke Carvalho and then sealed his qualification with a win over Canada's Kevin Bizier. He lost his Olympic debut 1:14 to Boris Georgiev of Bulgaria. According to at least one doctor, it was a fight that never should have taken place. After it was over, Coach Dan Campbell said Molina had gone into the bout with a small hole in his lung, which allowed air to seep out beneath the skin. Professional career Molina is signed to the promotional company Goossen Tutor. In his third fight, he scored a second-round TKO over veteran Miguel Garcia.
Professional boxing record |- style="margin:0.5em auto; font-size:95%;" |align="center" colspan=8|22 Wins (9 knockouts), 4 Losses, 0 Draws |- style="margin:0.5em auto; font-size:95%;" |align=center style="border-style: none none solid solid; background: #e3e3e3"|Res. |align=center style="border-style: none none solid solid; background: #e3e3e3"|Record |align=center style="border-style: none none solid solid; background: #e3e3e3"|Opponent |align=center style="border-style: none none solid solid; background: #e3e3e3"|Type |align=center style="border-style: none none solid solid; background: #e3e3e3"|Rd., Time |align=center style="border-style: none none solid solid; background: #e3e3e3"|Date |align=center style="border-style: none none solid solid; background: #e3e3e3"|Location |align=center style="border-style: none none solid solid; background: #e3e3e3"|Notes |-align=center |Loss || 22-4-0 ||align=left| Jesus Alejandro Ramos |UD || 10 || May 1, 2021 ||align=left| Dignity Health Sports Park, Carson |align=left| |-align=center |Loss || 22-3-0 ||align=left| José Pedraza |UD || 10 || September 19, 2020 ||align=left| The Bubble, Las Vegas |align=left| |-align=center |Win || 22-2-0 ||align=left| Amir Imam |UD || 8 || February 22, 2020 ||align=left| MGM Grand Garden Arena, Paradise |align=left| |-align=center |Win || 21-2-0 ||align=left| Hiroki Okada |KO || 1 (10) || November 2, 2019 ||align=left| Dignity Health Sports Park, Carson |align=left| |-align=center |Win || 20-2-0 ||align=left| Manuel Mendez |UD || 8 || August 17, 2019 ||align=left| Banc of California Stadium, Los Angeles |align=left| |-align=center |Win || 19-2-0 ||align=left| Abdiel Ramírez |UD || 8 || March 23, 2019 ||align=left| The Hangar, Costa Mesa |align=left| |-align=center |Win || 18-2-0 ||align=left| Jessie Roman |UD || 8 || June 1, 2018 ||align=left| Belasco Theater, Los Angeles |align=left| |-align=center |Loss || 17-2-0
||align=left| Jamal James |UD || 10 (10) || January 19, 2016 ||align=left| Club Nokia, Los Angeles |align=left| |-align=center |Win || 17-1-0 ||align=left| Lenwood Dozier |RTD || 7 (10) || October 13, 2015 ||align=left| Little Creek Casino Resort, Shelton |align=left| |-align=center |Win || 16-1-0 ||align=left| Luis Prieto |SD || 6 (6) || November 1, 2014 ||align=left| Arena Coliseo, Mexico City |align=left| |-align=center |Win || 15-1-0 ||align=left| Jorge Pimentel |KO || 3 (8) || September 6, 2014 ||align=left| Gimnasio Miguel Hidalgo, Puebla |align=left| |-align=center |Win || 14-1-0 ||align=left| Francisco Javier Parra |KO || 1 (6) || June 8, 2013 ||align=left| Villa Charra, Tijuana |align=left| |-align=center |Win || 13-1-0 ||align=left| Joseph Elegele |UD || 8 (8) || March 9, 2013 ||align=left| The Hangar, Costa Mesa |align=left| |-align=center |Win || 12-1-0 ||align=left| Fernando Silva |MD || 6 (6) || November 24, 2012 ||align=left|Gimnasio Municipal "Jose Neri Santos", Ciudad Juárez |align=left| |-align=center |Win || 11-1-0 ||align=left| Octavio Narvaez |TKO || 3 (6) || June 22, 2012 ||align=left| Soboba Casino, San Jacinto |align=left| |-align=center |Win || 10-1-0 ||align=left| Alberto Herrera |UD || 6 (6) || January 20, 2012 ||align=left| Pearl Theater, Paradise |align=left| |-align=center |Loss || 9-1-0 ||align=left| Artemio Reyes |UD || 8 (8) || October 28, 2011 ||align=left| Bally's Event Center, Atlantic City |align=left| |-align=center |Win || 9-0-0 ||align=left| John Revish |UD || 6 (6) || September 15, 2011 ||align=left| County Coliseum, El Paso |align=left| |-align=center |Win || 8-0-0 ||align=left| Hector Alatorre |UD || 6 (6) || June 24, 2011 ||align=left| Pechanga Resort and Casino, Temecula |align=left| |-align=center |Win || 7-0-0 ||align=left| David Lopez |UD ||
6 (6) || May 27, 2011 ||align=left|Reno Events Center, Reno |align=left| |-align=center |Win || 6-0-0 ||align=left| Danny Diaz |UD || 4 (4) || May 14, 2011 ||align=left| Home Depot Center, Carson |align=left| |-align=center |Win || 5-0-0 ||align=left| Francisco Ríos |UD || 4 (4) || November 27, 2010 ||align=left|Oracle Arena, Oakland |align=left| |-align=center |Win || 4-0-0 ||align=left| Antonio Arauz |TKO || 1 (0:39) || October 7, 2010 ||align=left| Tachi Palace Hotel & Casino, Lemoore |align=left| |-align=center |Win || 3-0-0 ||align=left| Miguel Garcia |TKO || 2 (2:42) || November 27, 2009 ||align=left|Pechanga Resort and Casino, Temecula |align=left| |-align=center |Win || 2-0-0 || align=left| Gerald Valdez |TKO || 2 (2:39) || April 23, 2009 ||align=left| Tachi Palace Hotel & Casino, Lemoore |align=left| |-align=center |Win || 1-0-0 || align=left| Jaime Cabrera |TKO || 2 (1:50) || March 27, 2009 ||align=left| Nokia Theater, Los Angeles |align=left|Pro Debut |-align=center References External links Javier Molina's Amateur Record American boxers of Mexican descent Boxers at the 2008 Summer Olympics Living people 1990 births Winners of the United States Championship for amateur boxers Olympic boxers for the United States Mexican emigrants to the United States American twins American male boxers People from Commerce, California Welterweight boxers
Ahmad Maulana Putra (born 27 July 1988) is an Indonesian professional footballer who plays as a defensive midfielder for Liga 3 club Adhyaksa Farmel. His friends nicknamed him "the Indonesian Marouane Fellaini" because his hairstyle resembles that of the Belgian footballer. Club career Semen Padang He signed for Semen Padang to play in Liga 2 in the 2020 season. The season was suspended on 27 March 2020 due to the COVID-19 pandemic, then abandoned and declared void on 20 January 2021. Hizbul Wathan FC In 2021, Ahmad Maulana signed a contract with Indonesian Liga 2 club Hizbul Wathan. He made his league debut on 27 September against Persijap Jepara at the Manahan Stadium, Surakarta. References External links 1988 births Men's association football midfielders Indonesian men's footballers Indonesian Premier Division players Liga 1 (Indonesia) players Liga 2 (Indonesia) players PSMS Medan players Persikabo Bogor players Persires Rengat players Deltras F.C. players Persiba Balikpapan players PSM Makassar players Madura United F.C. players Sriwijaya F.C. players Borneo F.C. Samarinda players Bali United F.C. players Sulut United F.C. players Semen Padang F.C. players Persekat Tegal players Hizbul Wathan F.C. players PSKC Cimahi players Footballers from Medan Living people 21st-century Indonesian people
The Taylor County School District is a public school district in Taylor County, Georgia, United States, based in Butler. It serves the communities of Butler and Reynolds. Schools The Taylor County School District has two elementary schools, one middle school, and one high school. Elementary schools Taylor County Primary School Taylor County Upper Elementary Middle school Taylor County Middle School High school Taylor County High School References External links School districts in Georgia (U.S. state) Education in Taylor County, Georgia
```php <?php declare(strict_types=1); /** */ namespace OCP; use OCP\Federation\ICloudFederationFactory; use OCP\Federation\ICloudFederationProviderManager; use OCP\Log\ILogFactory; use OCP\Security\IContentSecurityPolicyManager; use Psr\Container\ContainerInterface; /** * This is a tagging interface for the server container * * The interface currently extends IContainer, but this interface is deprecated as of Nextcloud 20, * thus this interface won't extend it anymore once that interface is removed. So migrate to the ContainerInterface * only. * * @deprecated 20.0.0 * * @since 6.0.0 */ interface IServerContainer extends ContainerInterface, IContainer { /** * The calendar manager will act as a broker between consumers for calendar information and * providers which actually deliver the calendar information. * * @return \OCP\Calendar\IManager * @since 13.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCalendarManager(); /** * The calendar resource backend manager will act as a broker between consumers * for calendar resource information and providers which actually deliver the resource information. * * @return \OCP\Calendar\Resource\IBackend * @since 14.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCalendarResourceBackendManager(); /** * The calendar room backend manager will act as a broker between consumers * for calendar room information and providers which actually deliver the room information. * * @return \OCP\Calendar\Room\IBackend * @since 14.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCalendarRoomBackendManager(); /** * The contacts manager will act as a broker between consumers for contacts information and * providers which actually deliver the contact information.
* * @return \OCP\Contacts\IManager * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getContactsManager(); /** * The current request object holding all information about the request currently being processed * is returned from this method. * In case the current execution was not initiated by a web request null is returned * * @return \OCP\IRequest * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getRequest(); /** * Returns the preview manager which can create preview images for a given file * * @return \OCP\IPreview * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getPreviewManager(); /** * Returns the tag manager which can get and set tags for different object types * * @see \OCP\ITagManager::load() * @return \OCP\ITagManager * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getTagManager(); /** * Returns the root folder of ownCloud's data directory * * @return \OCP\Files\IRootFolder * @since 6.0.0 - between 6.0.0 and 8.0.0 this returned \OCP\Files\Folder * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getRootFolder(); /** * Returns a view to ownCloud's files folder * * @param string $userId user ID * @return \OCP\Files\Folder * @since 6.0.0 - parameter $userId was added in 8.0.0 * @see getUserFolder in \OCP\Files\IRootFolder * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getUserFolder($userId = null); /** * Returns a user manager * * @return \OCP\IUserManager * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getUserManager(); /** 
* Returns a group manager * * @return \OCP\IGroupManager * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getGroupManager(); /** * Returns the user session * * @return \OCP\IUserSession * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getUserSession(); /** * Returns the navigation manager * * @return \OCP\INavigationManager * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getNavigationManager(); /** * Returns the config manager * * @return \OCP\IConfig * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getConfig(); /** * Returns a Crypto instance * * @return \OCP\Security\ICrypto * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCrypto(); /** * Returns a Hasher instance * * @return \OCP\Security\IHasher * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getHasher(); /** * Returns a SecureRandom instance * * @return \OCP\Security\ISecureRandom * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getSecureRandom(); /** * Returns a CredentialsManager instance * * @return \OCP\Security\ICredentialsManager * @since 9.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCredentialsManager(); /** * Returns the app config manager * * @return \OCP\IAppConfig * @since 7.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getAppConfig(); /** * @return \OCP\L10N\IFactory * @since 
8.2.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getL10NFactory(); /** * get an L10N instance * @param string $app appid * @param string $lang * @return \OCP\IL10N * @since 6.0.0 - parameter $lang was added in 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getL10N($app, $lang = null); /** * @return \OC\Encryption\Manager * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getEncryptionManager(); /** * @return \OC\Encryption\File * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getEncryptionFilesHelper(); /** * @return \OCP\Encryption\Keys\IStorage * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getEncryptionKeyStorage(); /** * Returns the URL generator * * @return \OCP\IURLGenerator * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getURLGenerator(); /** * Returns an ICache instance * * @return \OCP\ICache * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCache(); /** * Returns an \OCP\CacheFactory instance * * @return \OCP\ICacheFactory * @since 7.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getMemCacheFactory(); /** * Returns the current session * * @return \OCP\ISession * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getSession(); /** * Returns the activity manager * * @return \OCP\Activity\IManager * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch 
it through \Psr\Container\ContainerInterface::get */ public function getActivityManager(); /** * Returns the database connection * * @return \OCP\IDBConnection * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getDatabaseConnection(); /** * Returns an avatar manager, used for avatar functionality * * @return \OCP\IAvatarManager * @since 6.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getAvatarManager(); /** * Returns a job list for controlling background jobs * * @return \OCP\BackgroundJob\IJobList * @since 7.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getJobList(); /** * Returns a logger instance * * @return \OCP\ILogger * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getLogger(); /** * Returns a log factory instance * * @return ILogFactory * @since 14.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getLogFactory(); /** * Returns a router for generating and matching URLs * * @return \OCP\Route\IRouter * @since 7.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getRouter(); /** * Get the certificate manager * * @return \OCP\ICertificateManager * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCertificateManager(); /** * Returns an instance of the HTTP client service * * @return \OCP\Http\Client\IClientService * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getHTTPClientService(); /** * Get the active event logger * * @return \OCP\Diagnostics\IEventLogger * 
@since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getEventLogger(); /** * Get the active query logger * * The returned logger only logs data when debug mode is enabled * * @return \OCP\Diagnostics\IQueryLogger * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getQueryLogger(); /** * Get the manager for temporary files and folders * * @return \OCP\ITempManager * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getTempManager(); /** * Get the app manager * * @return \OCP\App\IAppManager * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getAppManager(); /** * Get the webroot * * @return string * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getWebRoot(); /** * @return \OCP\Files\Config\IMountProviderCollection * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getMountProviderCollection(); /** * Get the IniWrapper * * @return \bantu\IniGetWrapper\IniGetWrapper * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getIniWrapper(); /** * @return \OCP\Command\IBus * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCommandBus(); /** * Creates a new mailer * * @return \OCP\Mail\IMailer * @since 8.1.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getMailer(); /** * Get the locking provider * * @return \OCP\Lock\ILockingProvider * @since 8.1.0 * @deprecated 20.0.0 have it 
injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getLockingProvider(); /** * @return \OCP\Files\Mount\IMountManager * @since 8.2.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getMountManager(); /** * Get the MimeTypeDetector * * @return \OCP\Files\IMimeTypeDetector * @since 8.2.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getMimeTypeDetector(); /** * Get the MimeTypeLoader * * @return \OCP\Files\IMimeTypeLoader * @since 8.2.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getMimeTypeLoader(); /** * Get the Notification Manager * * @return \OCP\Notification\IManager * @since 9.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getNotificationManager(); /** * @return \OCP\Comments\ICommentsManager * @since 9.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCommentsManager(); /** * Returns the system-tag manager * * @return \OCP\SystemTag\ISystemTagManager * * @since 9.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getSystemTagManager(); /** * Returns the system-tag object mapper * * @return \OCP\SystemTag\ISystemTagObjectMapper * * @since 9.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getSystemTagObjectMapper(); /** * Returns the share manager * * @return \OCP\Share\IManager * @since 9.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getShareManager(); /** * @return IContentSecurityPolicyManager * @since 9.0.0 * @deprecated 17.0.0 Use the 
AddContentSecurityPolicyEvent */ public function getContentSecurityPolicyManager(); /** * @return \OCP\IDateTimeZone * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getDateTimeZone(); /** * @return \OCP\IDateTimeFormatter * @since 8.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getDateTimeFormatter(); /** * @return \OCP\Federation\ICloudIdManager * @since 12.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCloudIdManager(); /** * @return \OCP\GlobalScale\IConfig * @since 14.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getGlobalScaleConfig(); /** * @return ICloudFederationFactory * @since 14.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCloudFederationFactory(); /** * @return ICloudFederationProviderManager * @since 14.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getCloudFederationProviderManager(); /** * @return \OCP\Remote\Api\IApiFactory * @since 13.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getRemoteApiFactory(); /** * @return \OCP\Remote\IInstanceFactory * @since 13.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getRemoteInstanceFactory(); /** * @return \OCP\Files\Storage\IStorageFactory * @since 15.0.0 * @deprecated 20.0.0 have it injected or fetch it through \Psr\Container\ContainerInterface::get */ public function getStorageFactory(); } ```
```go package semver import ( "errors" "fmt" "strconv" "strings" ) const ( numbers string = "0123456789" alphas = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ-" alphanum = alphas + numbers ) // SpecVersion is the latest fully supported spec version of semver var SpecVersion = Version{ Major: 2, Minor: 0, Patch: 0, } // Version represents a semver compatible version type Version struct { Major uint64 Minor uint64 Patch uint64 Pre []PRVersion Build []string //No Precedence } // Version to string func (v Version) String() string { b := make([]byte, 0, 5) b = strconv.AppendUint(b, v.Major, 10) b = append(b, '.') b = strconv.AppendUint(b, v.Minor, 10) b = append(b, '.') b = strconv.AppendUint(b, v.Patch, 10) if len(v.Pre) > 0 { b = append(b, '-') b = append(b, v.Pre[0].String()...) for _, pre := range v.Pre[1:] { b = append(b, '.') b = append(b, pre.String()...) } } if len(v.Build) > 0 { b = append(b, '+') b = append(b, v.Build[0]...) for _, build := range v.Build[1:] { b = append(b, '.') b = append(b, build...) } } return string(b) } // FinalizeVersion discards prerelease and build number and only returns // major, minor and patch number. func (v Version) FinalizeVersion() string { b := make([]byte, 0, 5) b = strconv.AppendUint(b, v.Major, 10) b = append(b, '.') b = strconv.AppendUint(b, v.Minor, 10) b = append(b, '.') b = strconv.AppendUint(b, v.Patch, 10) return string(b) } // Equals checks if v is equal to o. func (v Version) Equals(o Version) bool { return (v.Compare(o) == 0) } // EQ checks if v is equal to o. func (v Version) EQ(o Version) bool { return (v.Compare(o) == 0) } // NE checks if v is not equal to o. func (v Version) NE(o Version) bool { return (v.Compare(o) != 0) } // GT checks if v is greater than o. func (v Version) GT(o Version) bool { return (v.Compare(o) == 1) } // GTE checks if v is greater than or equal to o. func (v Version) GTE(o Version) bool { return (v.Compare(o) >= 0) } // GE checks if v is greater than or equal to o. 
func (v Version) GE(o Version) bool { return (v.Compare(o) >= 0) } // LT checks if v is less than o. func (v Version) LT(o Version) bool { return (v.Compare(o) == -1) } // LTE checks if v is less than or equal to o. func (v Version) LTE(o Version) bool { return (v.Compare(o) <= 0) } // LE checks if v is less than or equal to o. func (v Version) LE(o Version) bool { return (v.Compare(o) <= 0) } // Compare compares Versions v to o: // -1 == v is less than o // 0 == v is equal to o // 1 == v is greater than o func (v Version) Compare(o Version) int { if v.Major != o.Major { if v.Major > o.Major { return 1 } return -1 } if v.Minor != o.Minor { if v.Minor > o.Minor { return 1 } return -1 } if v.Patch != o.Patch { if v.Patch > o.Patch { return 1 } return -1 } // Quick comparison if a version has no prerelease versions if len(v.Pre) == 0 && len(o.Pre) == 0 { return 0 } else if len(v.Pre) == 0 && len(o.Pre) > 0 { return 1 } else if len(v.Pre) > 0 && len(o.Pre) == 0 { return -1 } i := 0 for ; i < len(v.Pre) && i < len(o.Pre); i++ { if comp := v.Pre[i].Compare(o.Pre[i]); comp == 0 { continue } else if comp == 1 { return 1 } else { return -1 } } // If all shared prerelease identifiers are equal, the version with additional identifiers is greater if i == len(v.Pre) && i == len(o.Pre) { return 0 } else if i == len(v.Pre) && i < len(o.Pre) { return -1 } else { return 1 } } // IncrementPatch increments the patch version func (v *Version) IncrementPatch() error { v.Patch++ return nil } // IncrementMinor increments the minor version func (v *Version) IncrementMinor() error { v.Minor++ v.Patch = 0 return nil } // IncrementMajor increments the major version func (v *Version) IncrementMajor() error { v.Major++ v.Minor = 0 v.Patch = 0 return nil } // Validate validates v and returns an error in case of an invalid version func (v Version) Validate() error { // Major, Minor, Patch already validated using uint64 for _, pre := range v.Pre { if !pre.IsNum { // Numeric prerelease versions already uint64 if len(pre.VersionStr) == 0 { 
return fmt.Errorf("Prerelease can not be empty %q", pre.VersionStr) } if !containsOnly(pre.VersionStr, alphanum) { return fmt.Errorf("Invalid character(s) found in prerelease %q", pre.VersionStr) } } } for _, build := range v.Build { if len(build) == 0 { return fmt.Errorf("Build meta data can not be empty %q", build) } if !containsOnly(build, alphanum) { return fmt.Errorf("Invalid character(s) found in build meta data %q", build) } } return nil } // New is like Parse but returns a pointer to the validated Version, or an error func New(s string) (*Version, error) { v, err := Parse(s) vp := &v return vp, err } // Make is an alias for Parse: it parses a version string and returns a validated Version or an error func Make(s string) (Version, error) { return Parse(s) } // ParseTolerant allows for certain version specifications that do not strictly adhere to semver // specs to be parsed by this library. It does so by normalizing versions before passing them to // Parse(). It currently trims spaces, removes a "v" prefix, adds a 0 patch number to versions // with only major and minor components specified, and removes leading 0s. func ParseTolerant(s string) (Version, error) { s = strings.TrimSpace(s) s = strings.TrimPrefix(s, "v") // Split into major.minor.(patch+pr+meta) parts := strings.SplitN(s, ".", 3) // Remove leading zeros. for i, p := range parts { if len(p) > 1 { p = strings.TrimLeft(p, "0") if len(p) == 0 || !strings.ContainsAny(p[0:1], "0123456789") { p = "0" + p } parts[i] = p } } // Fill up shortened versions. 
if len(parts) < 3 { if strings.ContainsAny(parts[len(parts)-1], "+-") { return Version{}, errors.New("Short version cannot contain PreRelease/Build meta data") } for len(parts) < 3 { parts = append(parts, "0") } } s = strings.Join(parts, ".") return Parse(s) } // Parse parses version string and returns a validated Version or error func Parse(s string) (Version, error) { if len(s) == 0 { return Version{}, errors.New("Version string empty") } // Split into major.minor.(patch+pr+meta) parts := strings.SplitN(s, ".", 3) if len(parts) != 3 { return Version{}, errors.New("No Major.Minor.Patch elements found") } // Major if !containsOnly(parts[0], numbers) { return Version{}, fmt.Errorf("Invalid character(s) found in major number %q", parts[0]) } if hasLeadingZeroes(parts[0]) { return Version{}, fmt.Errorf("Major number must not contain leading zeroes %q", parts[0]) } major, err := strconv.ParseUint(parts[0], 10, 64) if err != nil { return Version{}, err } // Minor if !containsOnly(parts[1], numbers) { return Version{}, fmt.Errorf("Invalid character(s) found in minor number %q", parts[1]) } if hasLeadingZeroes(parts[1]) { return Version{}, fmt.Errorf("Minor number must not contain leading zeroes %q", parts[1]) } minor, err := strconv.ParseUint(parts[1], 10, 64) if err != nil { return Version{}, err } v := Version{} v.Major = major v.Minor = minor var build, prerelease []string patchStr := parts[2] if buildIndex := strings.IndexRune(patchStr, '+'); buildIndex != -1 { build = strings.Split(patchStr[buildIndex+1:], ".") patchStr = patchStr[:buildIndex] } if preIndex := strings.IndexRune(patchStr, '-'); preIndex != -1 { prerelease = strings.Split(patchStr[preIndex+1:], ".") patchStr = patchStr[:preIndex] } if !containsOnly(patchStr, numbers) { return Version{}, fmt.Errorf("Invalid character(s) found in patch number %q", patchStr) } if hasLeadingZeroes(patchStr) { return Version{}, fmt.Errorf("Patch number must not contain leading zeroes %q", patchStr) } patch, err := 
strconv.ParseUint(patchStr, 10, 64) if err != nil { return Version{}, err } v.Patch = patch // Prerelease for _, prstr := range prerelease { parsedPR, err := NewPRVersion(prstr) if err != nil { return Version{}, err } v.Pre = append(v.Pre, parsedPR) } // Build meta data for _, str := range build { if len(str) == 0 { return Version{}, errors.New("Build meta data is empty") } if !containsOnly(str, alphanum) { return Version{}, fmt.Errorf("Invalid character(s) found in build meta data %q", str) } v.Build = append(v.Build, str) } return v, nil } // MustParse is like Parse but panics if the version cannot be parsed. func MustParse(s string) Version { v, err := Parse(s) if err != nil { panic(`semver: Parse(` + s + `): ` + err.Error()) } return v } // PRVersion represents a PreRelease Version type PRVersion struct { VersionStr string VersionNum uint64 IsNum bool } // NewPRVersion creates a new valid prerelease version func NewPRVersion(s string) (PRVersion, error) { if len(s) == 0 { return PRVersion{}, errors.New("Prerelease is empty") } v := PRVersion{} if containsOnly(s, numbers) { if hasLeadingZeroes(s) { return PRVersion{}, fmt.Errorf("Numeric PreRelease version must not contain leading zeroes %q", s) } num, err := strconv.ParseUint(s, 10, 64) // Might never be hit, but just in case if err != nil { return PRVersion{}, err } v.VersionNum = num v.IsNum = true } else if containsOnly(s, alphanum) { v.VersionStr = s v.IsNum = false } else { return PRVersion{}, fmt.Errorf("Invalid character(s) found in prerelease %q", s) } return v, nil } // IsNumeric checks if prerelease-version is numeric func (v PRVersion) IsNumeric() bool { return v.IsNum } // Compare compares two PreRelease Versions v and o: // -1 == v is less than o // 0 == v is equal to o // 1 == v is greater than o func (v PRVersion) Compare(o PRVersion) int { if v.IsNum && !o.IsNum { return -1 } else if !v.IsNum && o.IsNum { return 1 } else if v.IsNum && o.IsNum { if v.VersionNum == o.VersionNum { return 0 } else 
if v.VersionNum > o.VersionNum { return 1 } else { return -1 } } else { // both are Alphas if v.VersionStr == o.VersionStr { return 0 } else if v.VersionStr > o.VersionStr { return 1 } else { return -1 } } } // PreRelease version to string func (v PRVersion) String() string { if v.IsNum { return strconv.FormatUint(v.VersionNum, 10) } return v.VersionStr } func containsOnly(s string, set string) bool { return strings.IndexFunc(s, func(r rune) bool { return !strings.ContainsRune(set, r) }) == -1 } func hasLeadingZeroes(s string) bool { return len(s) > 1 && s[0] == '0' } // NewBuildVersion creates a new valid build version func NewBuildVersion(s string) (string, error) { if len(s) == 0 { return "", errors.New("Buildversion is empty") } if !containsOnly(s, alphanum) { return "", fmt.Errorf("Invalid character(s) found in build meta data %q", s) } return s, nil } // FinalizeVersion returns the major, minor and patch number only and discards // prerelease and build number. func FinalizeVersion(s string) (string, error) { v, err := Parse(s) if err != nil { return "", err } v.Pre = nil v.Build = nil finalVer := v.String() return finalVer, nil } ```
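The prerelease ordering implemented by `PRVersion.Compare` above follows SemVer 2.0.0 precedence: numeric identifiers always rank below alphanumeric ones, numeric identifiers compare as integers, and alphanumeric identifiers compare bytewise (ASCII order). A minimal standalone sketch of that rule, assuming nothing beyond the standard library — `comparePRIdent` is a hypothetical helper for illustration, not part of the package API:

```go
// Sketch of the SemVer 2.0.0 prerelease-identifier precedence rule that
// PRVersion.Compare implements. comparePRIdent is a hypothetical standalone
// helper: it returns -1, 0 or 1 for one pair of dot-separated identifiers.
package main

import (
	"fmt"
	"strconv"
)

func comparePRIdent(a, b string) int {
	an, aerr := strconv.ParseUint(a, 10, 64)
	bn, berr := strconv.ParseUint(b, 10, 64)
	aNum, bNum := aerr == nil, berr == nil
	if aNum && !bNum {
		return -1 // numeric identifiers have lower precedence than alphanumeric ones
	}
	if !aNum && bNum {
		return 1
	}
	if aNum && bNum {
		// both numeric: compare as integers, not as strings
		switch {
		case an < bn:
			return -1
		case an > bn:
			return 1
		}
		return 0
	}
	// both alphanumeric: bytewise comparison, which is ASCII sort order here
	switch {
	case a < b:
		return -1
	case a > b:
		return 1
	}
	return 0
}

func main() {
	fmt.Println(comparePRIdent("alpha", "1"))   // 1: alphanumeric > numeric
	fmt.Println(comparePRIdent("2", "11"))      // -1: integer comparison, not lexical
	fmt.Println(comparePRIdent("beta", "beta")) // 0
}
```

Note that `"2"` ranks below `"11"` numerically even though it is greater lexically — this is exactly why `PRVersion` parses numeric identifiers into `VersionNum` instead of comparing raw strings.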
```java /* __ __ _ _ _ | \/ | ___ __ _ __ _| |__ __ _ ___| |_ ___ _ __ __| | | |\/| |/ _ \/ _` |/ _` | '_ \ / _` / __| __/ _ \ '__/ _` | | | | | __/ (_| | (_| | |_) | (_| \__ \ || __/ | | (_| | |_| |_|\___|\__, |\__,_|_.__/ \__,_|___/\__\___|_| \__,_| |___/ Perpetrated by tonikelope since 2016 */ package com.tonikelope.megabasterd; import static com.tonikelope.megabasterd.MainPanel.*; import static com.tonikelope.megabasterd.MiscTools.*; import java.awt.Dialog; import java.io.File; import java.util.ArrayList; import java.util.Collections; import java.util.Enumeration; import java.util.HashMap; import java.util.List; import java.util.Map; import static java.util.logging.Level.SEVERE; import java.util.logging.Logger; import javax.swing.JComponent; import javax.swing.JOptionPane; import javax.swing.JTree; import javax.swing.tree.DefaultTreeModel; import javax.swing.tree.TreeNode; /** * * @author tonikelope */ public class FolderLinkDialog extends javax.swing.JDialog { private final String _link; private boolean _download; private final List<HashMap> _download_links; private long _total_space; private int _mega_error; private volatile boolean working = false; private volatile boolean exit = false; @Override public void dispose() { file_tree.setModel(null); super.dispose(); } public List<HashMap> getDownload_links() { return Collections.unmodifiableList(_download_links); } public boolean isDownload() { return _download; } public int isMega_error() { return _mega_error; } /** * Creates new form FolderLink * * @param parent * @param link */ public FolderLinkDialog(MainPanelView parent, boolean modal, String link) { super(parent, modal); _mega_error = 0; _total_space = 0L; _download = false; _download_links = new ArrayList<>(); _link = link; MiscTools.GUIRunAndWait(() -> { initComponents(); updateFonts(this, GUI_FONT, parent.getMain_panel().getZoom_factor()); translateLabels(this); file_tree.setRootVisible(false); node_bar.setIndeterminate(true); 
folder_link_label.setText(link); restore_button.setVisible(false); final Dialog tthis = this; THREAD_POOL.execute(() -> { _loadMegaDirTree(); if (_mega_error == 0) { _genDownloadLiks(); MiscTools.GUIRun(() -> { dance_button.setText(LabelTranslatorSingleton.getInstance().translate("Let's dance, baby")); pack(); }); } else if (_mega_error == -18) { MiscTools.GUIRun(() -> { JOptionPane.showMessageDialog(tthis, LabelTranslatorSingleton.getInstance().translate("MEGA FOLDER TEMPORARILY UNAVAILABLE!"), "Error", JOptionPane.ERROR_MESSAGE); setVisible(false); }); } else if (_mega_error == -16) { MiscTools.GUIRun(() -> { JOptionPane.showMessageDialog(tthis, LabelTranslatorSingleton.getInstance().translate("MEGA FOLDER BLOCKED/DELETED"), "Error", JOptionPane.ERROR_MESSAGE); setVisible(false); }); } else { MiscTools.GUIRun(() -> { JOptionPane.showMessageDialog(tthis, LabelTranslatorSingleton.getInstance().translate("MEGA FOLDER LINK ERROR!"), "Error", JOptionPane.ERROR_MESSAGE); setVisible(false); }); } }); pack(); }); } /** * This method is called from within the constructor to initialize the form. * WARNING: Do NOT modify this code. The content of this method is always * regenerated by the Form Editor. 
*/ @SuppressWarnings("unchecked") // <editor-fold defaultstate="collapsed" desc="Generated Code">//GEN-BEGIN:initComponents private void initComponents() { file_tree_scrollpane = new javax.swing.JScrollPane(); skip_button = new javax.swing.JButton(); link_detected_label = new javax.swing.JLabel(); dance_button = new javax.swing.JButton(); folder_link_label = new javax.swing.JLabel(); warning_label = new javax.swing.JLabel(); skip_rest_button = new javax.swing.JButton(); restore_button = new javax.swing.JButton(); total_space_label = new javax.swing.JLabel(); node_bar = new javax.swing.JProgressBar(); setDefaultCloseOperation(javax.swing.WindowConstants.DO_NOTHING_ON_CLOSE); setTitle("FolderLink"); addWindowListener(new java.awt.event.WindowAdapter() { public void windowClosing(java.awt.event.WindowEvent evt) { formWindowClosing(evt); } }); file_tree.setFont(new java.awt.Font("Dialog", 0, 18)); // NOI18N javax.swing.tree.DefaultMutableTreeNode treeNode1 = new javax.swing.tree.DefaultMutableTreeNode("root"); file_tree.setModel(new javax.swing.tree.DefaultTreeModel(treeNode1)); file_tree.setDoubleBuffered(true); file_tree.setEnabled(false); file_tree_scrollpane.setViewportView(file_tree); skip_button.setFont(new java.awt.Font("Dialog", 1, 18)); // NOI18N skip_button.setIcon(new javax.swing.ImageIcon(getClass().getResource("/images/icons8-trash-can-30.png"))); // NOI18N skip_button.setText("REMOVE THIS"); skip_button.setDoubleBuffered(true); skip_button.setEnabled(false); skip_button.addActionListener(new java.awt.event.ActionListener() { public void actionPerformed(java.awt.event.ActionEvent evt) { skip_buttonActionPerformed(evt); } }); link_detected_label.setFont(new java.awt.Font("Dialog", 1, 24)); // NOI18N link_detected_label.setIcon(new javax.swing.ImageIcon(getClass().getResource("/images/icons8-folder-30.png"))); // NOI18N link_detected_label.setText("Folder link detected!"); link_detected_label.setDoubleBuffered(true); dance_button.setBackground(new 
java.awt.Color(102, 204, 255)); dance_button.setFont(new java.awt.Font("Dialog", 1, 22)); // NOI18N dance_button.setForeground(new java.awt.Color(255, 255, 255)); dance_button.setText("Loading..."); dance_button.setDoubleBuffered(true); dance_button.setEnabled(false); dance_button.addActionListener(new java.awt.event.ActionListener() { public void actionPerformed(java.awt.event.ActionEvent evt) { dance_buttonActionPerformed(evt); } }); folder_link_label.setFont(new java.awt.Font("Dialog", 1, 18)); // NOI18N folder_link_label.setText("jLabel2"); folder_link_label.setDoubleBuffered(true); warning_label.setFont(new java.awt.Font("Dialog", 0, 16)); // NOI18N warning_label.setText("If you DO NOT want to transfer some folder or file you can REMOVE it (to select several items at the same time use CTRL + LMOUSE)."); warning_label.setDoubleBuffered(true); warning_label.setEnabled(false); skip_rest_button.setFont(new java.awt.Font("Dialog", 1, 18)); // NOI18N skip_rest_button.setIcon(new javax.swing.ImageIcon(getClass().getResource("/images/icons8-trash-can-30.png"))); // NOI18N skip_rest_button.setText("REMOVE ALL EXCEPT THIS"); skip_rest_button.setDoubleBuffered(true); skip_rest_button.setEnabled(false); skip_rest_button.addActionListener(new java.awt.event.ActionListener() { public void actionPerformed(java.awt.event.ActionEvent evt) { skip_rest_buttonActionPerformed(evt); } }); restore_button.setFont(new java.awt.Font("Dialog", 1, 14)); // NOI18N restore_button.setIcon(new javax.swing.ImageIcon(getClass().getResource("/images/icons8-undelete-30.png"))); // NOI18N restore_button.setText("Restore folder data"); restore_button.setDoubleBuffered(true); restore_button.addActionListener(new java.awt.event.ActionListener() { public void actionPerformed(java.awt.event.ActionEvent evt) { restore_buttonActionPerformed(evt); } }); total_space_label.setFont(new java.awt.Font("Dialog", 1, 32)); // NOI18N total_space_label.setForeground(new java.awt.Color(0, 0, 255)); 
total_space_label.setText("[---]"); total_space_label.setDoubleBuffered(true); total_space_label.setEnabled(false); javax.swing.GroupLayout layout = new javax.swing.GroupLayout(getContentPane()); getContentPane().setLayout(layout); layout.setHorizontalGroup( layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING) .addGroup(layout.createSequentialGroup() .addContainerGap() .addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING) .addComponent(link_detected_label, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE) .addGroup(layout.createSequentialGroup() .addComponent(folder_link_label, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE) .addGap(29, 29, 29) .addComponent(restore_button)) .addComponent(node_bar, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE) .addComponent(file_tree_scrollpane, javax.swing.GroupLayout.Alignment.TRAILING) .addComponent(warning_label, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE) .addComponent(total_space_label, javax.swing.GroupLayout.Alignment.TRAILING, javax.swing.GroupLayout.DEFAULT_SIZE, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE) .addGroup(layout.createSequentialGroup() .addComponent(skip_rest_button) .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.UNRELATED) .addComponent(skip_button) .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED, javax.swing.GroupLayout.DEFAULT_SIZE, Short.MAX_VALUE) .addComponent(dance_button))) .addContainerGap()) ); layout.setVerticalGroup( layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING) .addGroup(javax.swing.GroupLayout.Alignment.TRAILING, layout.createSequentialGroup() .addContainerGap() .addComponent(link_detected_label) .addGap(8, 8, 8) .addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE) 
.addComponent(folder_link_label) .addComponent(restore_button)) .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED) .addComponent(file_tree_scrollpane, javax.swing.GroupLayout.DEFAULT_SIZE, 289, Short.MAX_VALUE) .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED) .addComponent(node_bar, javax.swing.GroupLayout.PREFERRED_SIZE, 14, javax.swing.GroupLayout.PREFERRED_SIZE) .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED) .addComponent(total_space_label) .addPreferredGap(javax.swing.LayoutStyle.ComponentPlacement.RELATED) .addGroup(layout.createParallelGroup(javax.swing.GroupLayout.Alignment.LEADING) .addGroup(javax.swing.GroupLayout.Alignment.TRAILING, layout.createParallelGroup(javax.swing.GroupLayout.Alignment.BASELINE) .addComponent(skip_rest_button) .addComponent(skip_button) .addComponent(dance_button)) .addGroup(javax.swing.GroupLayout.Alignment.TRAILING, layout.createSequentialGroup() .addComponent(warning_label) .addGap(49, 49, 49))) .addContainerGap()) ); pack(); }// </editor-fold>//GEN-END:initComponents private void skip_buttonActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_skip_buttonActionPerformed if (deleteSelectedTreeItems(file_tree)) { file_tree.setEnabled(false); node_bar.setVisible(true); skip_rest_button.setEnabled(false); skip_button.setEnabled(false); THREAD_POOL.execute(() -> { MiscTools.resetTreeFolderSizes(((MegaMutableTreeNode) file_tree.getModel().getRoot())); MiscTools.calculateTreeFolderSizes(((MegaMutableTreeNode) file_tree.getModel().getRoot())); _genDownloadLiks(); MiscTools.GUIRun(() -> { restore_button.setVisible(true); file_tree.setEnabled(true); file_tree.setModel(new DefaultTreeModel((TreeNode) file_tree.getModel().getRoot())); boolean root_childs = ((TreeNode) file_tree.getModel().getRoot()).getChildCount() > 0; dance_button.setEnabled(root_childs); skip_button.setEnabled(root_childs); skip_rest_button.setEnabled(root_childs); }); }); } 
}//GEN-LAST:event_skip_buttonActionPerformed private void dance_buttonActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_dance_buttonActionPerformed _download = true; dispose(); }//GEN-LAST:event_dance_buttonActionPerformed private void skip_rest_buttonActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_skip_rest_buttonActionPerformed if (deleteAllExceptSelectedTreeItems(file_tree)) { file_tree.setEnabled(false); node_bar.setVisible(true); skip_rest_button.setEnabled(false); skip_button.setEnabled(false); THREAD_POOL.execute(() -> { MiscTools.resetTreeFolderSizes(((MegaMutableTreeNode) file_tree.getModel().getRoot())); MiscTools.calculateTreeFolderSizes(((MegaMutableTreeNode) file_tree.getModel().getRoot())); _genDownloadLiks(); MiscTools.GUIRunAndWait(() -> { restore_button.setVisible(true); file_tree.setEnabled(true); file_tree.setModel(new DefaultTreeModel((TreeNode) file_tree.getModel().getRoot())); boolean root_childs = ((TreeNode) file_tree.getModel().getRoot()).getChildCount() > 0; dance_button.setEnabled(root_childs); skip_button.setEnabled(root_childs); skip_rest_button.setEnabled(root_childs); }); }); } }//GEN-LAST:event_skip_rest_buttonActionPerformed private void restore_buttonActionPerformed(java.awt.event.ActionEvent evt) {//GEN-FIRST:event_restore_buttonActionPerformed restore_button.setText(LabelTranslatorSingleton.getInstance().translate("Restoring data, please wait...")); file_tree.setEnabled(false); restore_button.setEnabled(false); dance_button.setEnabled(false); node_bar.setVisible(true); node_bar.setIndeterminate(true); skip_button.setEnabled(false); skip_rest_button.setEnabled(false); THREAD_POOL.execute(() -> { _loadMegaDirTree(); _genDownloadLiks(); MiscTools.GUIRun(() -> { restore_button.setVisible(false); restore_button.setText(LabelTranslatorSingleton.getInstance().translate("Restore folder data")); boolean root_childs = ((TreeNode) file_tree.getModel().getRoot()).getChildCount() > 0; for (JComponent c : 
new JComponent[]{restore_button, dance_button, skip_button, skip_rest_button, file_tree}) { c.setEnabled(root_childs); } skip_button.setEnabled(root_childs); skip_rest_button.setEnabled(root_childs); }); }); }//GEN-LAST:event_restore_buttonActionPerformed private void formWindowClosing(java.awt.event.WindowEvent evt) {//GEN-FIRST:event_formWindowClosing // TODO add your handling code here: if (working && JOptionPane.showConfirmDialog(this, "EXIT?") == 0) { dispose(); exit = true; } else if (!working) { dispose(); exit = true; } }//GEN-LAST:event_formWindowClosing private int _loadMegaDirTree() { try { working = true; HashMap<String, Object> folder_nodes; MegaAPI ma = new MegaAPI(); String folder_id = findFirstRegex("#F!([^!]+)", _link, 1); String subfolder_id = null; if (folder_id.contains("@")) { String[] fids = folder_id.split("@"); folder_id = fids[0]; subfolder_id = fids[1]; } int r = -1; if (ma.existsCachedFolderNodes(folder_id)) { r = JOptionPane.showConfirmDialog(this, "Do you want to use FOLDER CACHED VERSION?\n\n(It could speed up the loading of very large folders)", "FOLDER CACHE", JOptionPane.YES_NO_OPTION); } if (r == 0) { MiscTools.GUIRun(() -> { folder_link_label.setText(_link + " (CACHED VERSION)"); }); } String folder_key = findFirstRegex("#F![^!]+!(.+)", _link, 1); folder_nodes = ma.getFolderNodes(folder_id, folder_key, node_bar, (r == 0)); MegaMutableTreeNode root = null; final int nodos_totales = folder_nodes.size(); MiscTools.GUIRun(() -> { node_bar.setIndeterminate(false); node_bar.setMaximum(nodos_totales); node_bar.setValue(0); }); int conta_nodo = 0; for (Object o : folder_nodes.values()) { if (exit) { return 1; } conta_nodo++; int c = conta_nodo; MiscTools.GUIRun(() -> { node_bar.setValue(c); }); HashMap<String, Object> current_hashmap_node = (HashMap<String, Object>) o; MegaMutableTreeNode current_node; if (current_hashmap_node.get("jtree_node") == null) { current_node = new MegaMutableTreeNode(current_hashmap_node); 
current_hashmap_node.put("jtree_node", current_node); } else { current_node = (MegaMutableTreeNode) current_hashmap_node.get("jtree_node"); } String parent_id = (String) current_hashmap_node.get("parent"); String current_id = (String) current_hashmap_node.get("h"); boolean ignore_node = false; do { if ((subfolder_id == null && folder_nodes.get(parent_id) != null) || (subfolder_id != null && !subfolder_id.equals(current_id) && folder_nodes.get(parent_id) != null)) { HashMap<String, Object> parent_hashmap_node = (HashMap) folder_nodes.get(parent_id); MegaMutableTreeNode parent_node; if (parent_hashmap_node.get("jtree_node") == null) { parent_node = new MegaMutableTreeNode(parent_hashmap_node); parent_hashmap_node.put("jtree_node", parent_node); } else { parent_node = (MegaMutableTreeNode) parent_hashmap_node.get("jtree_node"); } parent_node.add(current_node); parent_id = (String) parent_hashmap_node.get("parent"); current_node = parent_node; } else if (subfolder_id != null && subfolder_id.equals(current_id)) { root = current_node; } else if (subfolder_id != null && folder_nodes.get(parent_id) == null) { ignore_node = true; } else if (subfolder_id == null && folder_nodes.get(parent_id) == null) { root = current_node; } } while (current_node != root && !ignore_node); } MiscTools.GUIRun(() -> { node_bar.setIndeterminate(true); }); if (root != null) { MiscTools.sortTree(root); MiscTools.calculateTreeFolderSizes(root); } if (root == null) { LOG.log(SEVERE, null, "MEGA FOLDER ERROR (EMPTY?)"); _mega_error = 2; } else { root.setParent(null); final JTree ftree = file_tree; final MegaMutableTreeNode roott = root; MiscTools.GUIRunAndWait(() -> { node_bar.setIndeterminate(true); ftree.setModel(new DefaultTreeModel(roott)); ftree.setRootVisible(roott != null ? 
roott.getChildCount() > 0 : false); ftree.setEnabled(true); }); } } catch (MegaAPIException mex) { LOG.log(SEVERE, null, mex); _mega_error = mex.getCode(); } catch (Exception ex) { LOG.log(SEVERE, null, ex); _mega_error = 1; } working = false; return 0; } private void _genDownloadLiks() { MiscTools.GUIRun(() -> { working = true; _download_links.clear(); MegaMutableTreeNode root = (MegaMutableTreeNode) file_tree.getModel().getRoot(); Enumeration files_tree = root.depthFirstEnumeration(); total_space_label.setText("[---]"); THREAD_POOL.execute(() -> { String folder_id = findFirstRegex("#F!([^!]+)", _link, 1); if (folder_id.contains("@")) { String[] fids = folder_id.split("@"); folder_id = fids[0]; } _total_space = 0L; while (files_tree.hasMoreElements()) { MegaMutableTreeNode node = (MegaMutableTreeNode) files_tree.nextElement(); if (node.isLeaf() && node != root && ((HashMap<String, Object>) node.getUserObject()).get("size") != null) { String path = ""; Object[] object_path = node.getUserObjectPath(); for (Object p : object_path) { path += File.separator + ((Map<String, Object>) p).get("name"); } path = path.replaceAll("^/+", "").replaceAll("^\\+", "").trim(); String url = "path_to_url#N!" + ((Map<String, Object>) node.getUserObject()).get("h") + "!" 
+ ((Map<String, Object>) node.getUserObject()).get("key") + "###n=" + folder_id; HashMap<String, Object> download_link = new HashMap<>(); download_link.put("url", url); download_link.put("filename", cleanFilePath(path)); download_link.put("filekey", ((Map<String, Object>) node.getUserObject()).get("key")); download_link.put("filesize", ((Map<String, Object>) node.getUserObject()).get("size")); _total_space += (long) download_link.get("filesize"); _download_links.add(download_link); } else if (node.isLeaf() && node != root) { String path = ""; Object[] object_path = node.getUserObjectPath(); for (Object p : object_path) { path += File.separator + ((Map<String, Object>) p).get("name"); } path = path.replaceAll("^/+", "").replaceAll("^\\+", "").trim(); HashMap<String, Object> download_link = new HashMap<>(); download_link.put("url", "*"); download_link.put("filename", cleanFilePath(path)); download_link.put("type", ((HashMap<String, Object>) node.getUserObject()).get("type")); _download_links.add(download_link); } } MiscTools.GUIRunAndWait(() -> { total_space_label.setText("[" + formatBytes(_total_space) + "]"); for (JComponent c : new JComponent[]{dance_button, warning_label, skip_button, skip_rest_button, total_space_label}) { c.setEnabled(root.getChildCount() > 0); } node_bar.setVisible(false); working = false; }); }); }); } // Variables declaration - do not modify//GEN-BEGIN:variables private javax.swing.JButton dance_button; private final javax.swing.JTree file_tree = new javax.swing.JTree(); private javax.swing.JScrollPane file_tree_scrollpane; private javax.swing.JLabel folder_link_label; private javax.swing.JLabel link_detected_label; private javax.swing.JProgressBar node_bar; private javax.swing.JButton restore_button; private javax.swing.JButton skip_button; private javax.swing.JButton skip_rest_button; private javax.swing.JLabel total_space_label; private javax.swing.JLabel warning_label; // End of variables declaration//GEN-END:variables private static 
final Logger LOG = Logger.getLogger(FolderLinkDialog.class.getName()); } ```
Bess Eaton or Bess Eaton Management LLC is a small chain of coffee shops based in Rhode Island and Connecticut, serving doughnuts, bagels, and muffins. It started in 1953, grew to over 50 shops throughout southern New England, and was sold to other chains following bankruptcy in 2004. The chain reopened in 2011 under new ownership. As of 2018, there are four locations.

History

The Bess Eaton Donut Flour Company was founded in 1953 by Angelo (Bangy) Gencarelli Jr. and was known for its coffee and hand-cut donuts. The corporate headquarters were located in Westerly, Rhode Island, with up to 56 retail shops spread between Rhode Island, Massachusetts, and Connecticut. At one time, it was Rhode Island's seventh-largest private employer, with 750 workers at its peak and 650 at the time of the chain's sale. Throughout the chain's 50-year history, the company was privately held by the Gencarelli family. In its last year of operations, the firm focused on wholesale business and non-store locations to boost profits, but it was ultimately sold to Tim Hortons of Canada. Leading up to the company's sale, then-CEO Louis A. Gencarelli, Sr. made headlines by printing Biblical scripture verses on the company's cups and product packaging.

2004 closing

In its last decade of operation, the Bess Eaton Donut Flour Company faced many internal difficulties, including claims from its own management of financial improprieties. The firm sold its retail division in mid-2004 following bankruptcy litigation. With a reported $35 million bid, the fast-food chain Wendy's International Corporation prevailed over the Dunkin' Donuts chain in their competition to purchase the 48 defunct Bess Eaton stores and other assets. Within two months of the acquisition, Wendy's had converted 42 of those stores to its Tim Hortons brand. In conjunction with the sale, Bess Eaton closed its production facility and corporate headquarters. Wendy's sold Tim Hortons in 2006 but kept ownership of the Bess Eaton trademark.
2011 reopening

From 2008 to 2010, Tim Hortons closed the doors on all shops in Southern New England, including all of the Bess Eaton shops it had acquired. In late January 2011, David Liguori registered a limited liability company under the name Bess Eaton Management LLC with the Rhode Island Secretary of State's Office. He also acquired the Bess Eaton trademark under his company name, according to the U.S. Patent and Trademark Office, and acquired the leases of several of the closed Tim Hortons locations. There are currently shops in Westerly, Wakefield, and Galilee, Rhode Island, and Pawcatuck, Connecticut. The Galilee location is open seasonally to serve travelers waiting to board the Block Island Ferry. The shops use all the original recipes for coffee and pastries.

Original management

Louis A. Gencarelli Sr. - CEO 1982–2002; 2003–2004
George Cioe - CEO 2002–2003
Angelo Gencarelli Jr. - CEO and founder 1953–1982
Paul Gencarelli - CFO and son of Louis Gencarelli Sr.

References

Doughnut shops in the United States
Restaurants in Rhode Island
Restaurants established in 1953
Restaurant chains in the United States
Fast-food chains of the United States
1953 establishments in Rhode Island
Re-established companies
Restaurants disestablished in 2004
2004 disestablishments in Rhode Island
Defunct restaurant chains in the United States
```python
from __future__ import division, absolute_import, print_function

import os
import pytest
from numpy.testing import assert_equal
from . import util


def _path(*a):
    return os.path.join(*((os.path.dirname(__file__),) + a))


class TestSizeSumExample(util.F2PyTest):
    sources = [_path('src', 'size', 'foo.f90')]

    @pytest.mark.slow
    def test_all(self):
        r = self.module.foo([[]])
        assert_equal(r, [0], repr(r))

        r = self.module.foo([[1, 2]])
        assert_equal(r, [3], repr(r))

        r = self.module.foo([[1, 2], [3, 4]])
        assert_equal(r, [3, 7], repr(r))

        r = self.module.foo([[1, 2], [3, 4], [5, 6]])
        assert_equal(r, [3, 7, 11], repr(r))

    @pytest.mark.slow
    def test_transpose(self):
        r = self.module.trans([[]])
        assert_equal(r.T, [[]], repr(r))

        r = self.module.trans([[1, 2]])
        assert_equal(r, [[1], [2]], repr(r))

        r = self.module.trans([[1, 2, 3], [4, 5, 6]])
        assert_equal(r, [[1, 4], [2, 5], [3, 6]], repr(r))

    @pytest.mark.slow
    def test_flatten(self):
        r = self.module.flatten([[]])
        assert_equal(r, [], repr(r))

        r = self.module.flatten([[1, 2]])
        assert_equal(r, [1, 2], repr(r))

        r = self.module.flatten([[1, 2, 3], [4, 5, 6]])
        assert_equal(r, [1, 2, 3, 4, 5, 6], repr(r))
```
```java
/*
 * one or more contributor license agreements. See the NOTICE file distributed
 * with this work for additional information regarding copyright ownership.
 */
package io.camunda.zeebe.engine.processing.streamprocessor;

import io.camunda.zeebe.protocol.impl.record.UnifiedRecordValue;
import io.camunda.zeebe.stream.api.records.TypedRecord;

/**
 * Some commands are distributed to different partitions. During distribution the command gets
 * written on the other partitions. Depending on whether it is distributed, the behavior of the
 * processor may change slightly. For example, if it was distributed before, we don't want to
 * distribute it a second time.
 *
 * <p>This interface provides some convenience for commands that get distributed. Instead of
 * checking if the command was distributed in the processor directly, the interface takes care of
 * it.
 *
 * @param <T> the type of the record value
 */
public interface DistributedTypedRecordProcessor<T extends UnifiedRecordValue>
    extends TypedRecordProcessor<T> {

  @Override
  default void processRecord(final TypedRecord<T> command) {
    if (command.isCommandDistributed()) {
      processDistributedCommand(command);
    } else {
      processNewCommand(command);
    }
  }

  /**
   * Process a command that is not distributed yet.
   *
   * @param command the not yet distributed command to process
   */
  void processNewCommand(final TypedRecord<T> command);

  /**
   * Process a command that has been distributed. Be aware to not distribute it again!
   *
   * @param command the already distributed command to process
   */
  void processDistributedCommand(final TypedRecord<T> command);
}
```
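A minimal, self-contained sketch of the routing that the default `processRecord` method performs. The `Command` and `DistributedProcessor` types below are simplified stand-ins invented for illustration; they are not the real Zeebe `TypedRecord`/`TypedRecordProcessor` types.

```java
// Simplified stand-ins for the Zeebe types, for illustration only.
public class DistributionSketch {

    interface Command {
        boolean isCommandDistributed();
    }

    interface DistributedProcessor {
        // Mirrors the default processRecord above: route on the distribution flag.
        default String process(Command command) {
            return command.isCommandDistributed()
                ? processDistributedCommand(command)
                : processNewCommand(command);
        }

        String processNewCommand(Command command);

        String processDistributedCommand(Command command);
    }

    public static void main(String[] args) {
        DistributedProcessor processor = new DistributedProcessor() {
            @Override
            public String processNewCommand(Command command) {
                return "new command (would now be distributed)";
            }

            @Override
            public String processDistributedCommand(Command command) {
                return "distributed command (must not be distributed again)";
            }
        };

        // A fresh command on the original partition takes the "new" path...
        System.out.println(processor.process(() -> false));
        // ...while the same command arriving on another partition is already
        // flagged as distributed and takes the other path.
        System.out.println(processor.process(() -> true));
    }
}
```

With the real interface, the two handlers would typically write follow-up events and, in `processNewCommand`, trigger the distribution itself; the string return values here exist only to make the routing observable.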
```go // // // path_to_url // // Unless required by applicable law or agreed to in writing, software // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. package zetcd // This file describes the on-wire format. type Xid int32 type Op int32 type ZXid int64 type Sid int64 type Ver int32 // version type ACL struct { Perms int32 Scheme string ID string } type CheckVersionRequest pathVersionRequest type ConnectRequest struct { ProtocolVersion int32 LastZxidSeen ZXid TimeOut int32 SessionID Sid Passwd []byte } type ConnectResponse struct { ProtocolVersion int32 TimeOut int32 SessionID Sid Passwd []byte } type CreateRequest struct { Path string Data []byte Acl []ACL Flags int32 } type CreateResponse pathResponse type CloseRequest struct{} type CloseResponse struct{} type auth struct { Type int32 Scheme string Auth []byte } type SetAuthRequest auth type SetAuthResponse struct{} type SetWatchesRequest struct { RelativeZxid ZXid DataWatches []string ExistWatches []string ChildWatches []string } type SetWatchesResponse struct{} type MultiHeader struct { Type Op Done bool Err ErrCode } type MultiRequestOp struct { Header MultiHeader Op interface{} } type MultiRequest struct { Ops []MultiRequestOp DoneHeader MultiHeader } type MultiResponseOp struct { Header MultiHeader String string Stat *Stat } type MultiResponse struct { Ops []MultiResponseOp DoneHeader MultiHeader } type GetChildren2Request pathWatchRequest type GetChildren2Response struct { Children []string Stat Stat } type GetDataRequest pathWatchRequest type GetDataResponse struct { Data []byte Stat Stat } type DeleteRequest pathVersionRequest type DeleteResponse struct{} type ExistsRequest pathWatchRequest type ExistsResponse statResponse type GetAclRequest pathRequest type GetAclResponse struct { Acl []ACL Stat Stat } type SetAclRequest struct { Path string Acl []ACL Version Ver } type SetAclResponse statResponse type GetChildrenRequest pathWatchRequest type GetChildrenResponse struct { Children 
[]string } type SyncRequest pathRequest type SyncResponse pathResponse type PingRequest struct{} type PingResponse struct{} type SetDataRequest struct { Path string Data []byte Version Ver } type SetDataResponse statResponse type Stat struct { // Czxid is the zxid change that caused this znode to be created. Czxid ZXid // Mzxid is The zxid change that last modified this znode. Mzxid ZXid // Ctime is milliseconds from epoch when this znode was created. Ctime int64 // Mtime is The time in milliseconds from epoch when this znode was last modified. Mtime int64 Version Ver // The number of changes to the data of this znode. Cversion Ver // The number of changes to the children of this znode. Aversion Ver // The number of changes to the ACL of this znode. EphemeralOwner Sid // The session id of the owner of this znode if the znode is an ephemeral node. If it is not an ephemeral node, it will be zero. DataLength int32 // The length of the data field of this znode. NumChildren int32 // The number of children of this znode. Pzxid ZXid // last modified children } type WatcherEvent struct { Type EventType State State Path string } type pathWatchRequest struct { Path string Watch bool } type pathResponse struct { Path string } type pathVersionRequest struct { Path string Version Ver } type statResponse struct { Stat Stat } type requestHeader struct { Xid Xid Opcode Op } type ResponseHeader struct { Xid Xid Zxid ZXid Err ErrCode } type pathRequest struct { Path string } ```
```cpp
#pragma once

#include <array>
#include <base/types.h>

namespace Poco { namespace Net { class IPAddress; } }

namespace DB
{

/// Convert IP address to raw binary with IPv6 data (big endian). If it's an IPv4, map it to IPv6.
/// Saves result into the first 16 bytes of `res`.
void IPv6ToRawBinary(const Poco::Net::IPAddress & address, char * res);

/// Convert IP address to 16-byte array with IPv6 data (big endian). If it's an IPv4, map it to IPv6.
std::array<char, 16> IPv6ToBinary(const Poco::Net::IPAddress & address);

/// Returns a reference to a 16-byte array containing a mask with the first `prefix_len` bits set to `1`
/// and the remaining `128 - prefix_len` bits set to `0`.
/// The reference is valid during all program execution time.
/// Values of prefix_len greater than 128 are interpreted as exactly 128.
const std::array<uint8_t, 16> & getCIDRMaskIPv6(UInt8 prefix_len);

/// Check that the address is contained in the CIDR range.
bool matchIPv4Subnet(UInt32 addr, UInt32 cidr_addr, UInt8 prefix);
bool matchIPv6Subnet(const uint8_t * addr, const uint8_t * cidr_addr, UInt8 prefix);

}
```
T-Kernel is an open-source real-time operating system (RTOS) designed for 32-bit microcontrollers. It is standardized by the T-Engine Forum, which distributes it under a T-License agreement. There is also a corresponding Micro T-Kernel (μT-Kernel) implementation designed for embedded systems with 16-bit or 8-bit microcontrollers.

History

In 1984, professor Ken Sakamura started The Real-time Operating system Nucleus (TRON) project at the University of Tokyo, with the goal of designing an open real-time operating system (RTOS) kernel. The TRON framework defines a complete architecture for the different computing units. Industrial TRON (ITRON) is the most popular TRON architecture; promotion of the ITRON specification was carried out by the various companies selling commercial implementations. T-Kernel is the name of the specification and, at the same time, of a single implementation based on the authorized source code available from the T-Engine Forum for free under the T-License. T-Engine is arguably the most advanced ubiquitous computing platform in the world.

In 1989, Matsushita Electric Industrial Co., Ltd., now known as Panasonic Corporation, introduced a TRON PC. This personal computer had an 8 MHz Intel 80286 chip and only 2 MB of memory, but it could display moving video. It also had a dual-boot system that could run both the TRON OS and DOS. Although the Japanese government once announced it would use the TRON PC in Japanese schools, the plan was dropped, partly due to economic issues with the United States. But ITRON survived, and today it is used in many devices, household appliances, automobile electronics, robots, some satellites, and in factory automation systems in China. Embedded system developers claim that ITRON is the number one OS for embedded chips in both Japan and the United States.
Overview

To make it easy to distribute middleware, T-Kernel has separate specifications for subsystems and device drivers, suitable for different types of middleware APIs. A real-time OS appropriate for an individual application can be created by combining middleware, called a T-Kernel Extension, with the T-Kernel. T-Monitor initializes the computer hardware and handles the interrupts set up at the start. T-Monitor lessens the hardware dependency of T-Kernel and improves application portability.

T-Kernel consists of the following three functional components:

T-Kernel/OS (operating system)
This offers the basic functions of a real-time operating system.

T-Kernel/SM (system manager)
This offers functions including system memory management and address space management, in order to manage middleware such as device drivers and subsystems.

T-Kernel/DS (debugger support)
This offers functions for the debuggers used in development tools.

Development environment

eBinder from eSol Corporation is one commonly used integrated development environment (IDE) for software cross-development targeting T-Kernel. The current release of T-Kernel 2.0 is distributed with a plug-in for the Eclipse IDE. Also, a version of T-Kernel that runs on a QEMU-based emulator, and the QEMU-based emulator itself, are available, so that testing, training, and development can be done on a PC without target hardware. It is supported by popular SSL/TLS libraries such as wolfSSL.

See also
ThreadX

References

External links
TRON Forum
Sakamura home page
ITRON Project Archive
Introducing the μT-Kernel
Information about T-Engine, T-Kernel, and μT-Kernel Programming

Embedded operating systems
TRON project
```fortran *> \brief \b SPORFS * * =========== DOCUMENTATION =========== * * Online html documentation available at * path_to_url * *> \htmlonly *> Download SPORFS + dependencies *> <a href="path_to_url"> *> [TGZ]</a> *> <a href="path_to_url"> *> [ZIP]</a> *> <a href="path_to_url"> *> [TXT]</a> *> \endhtmlonly * * Definition: * =========== * * SUBROUTINE SPORFS( UPLO, N, NRHS, A, LDA, AF, LDAF, B, LDB, X, * LDX, FERR, BERR, WORK, IWORK, INFO ) * * .. Scalar Arguments .. * CHARACTER UPLO * INTEGER INFO, LDA, LDAF, LDB, LDX, N, NRHS * .. * .. Array Arguments .. * INTEGER IWORK( * ) * REAL A( LDA, * ), AF( LDAF, * ), B( LDB, * ), * $ BERR( * ), FERR( * ), WORK( * ), X( LDX, * ) * .. * * *> \par Purpose: * ============= *> *> \verbatim *> *> SPORFS improves the computed solution to a system of linear *> equations when the coefficient matrix is symmetric positive definite, *> and provides error bounds and backward error estimates for the *> solution. *> \endverbatim * * Arguments: * ========== * *> \param[in] UPLO *> \verbatim *> UPLO is CHARACTER*1 *> = 'U': Upper triangle of A is stored; *> = 'L': Lower triangle of A is stored. *> \endverbatim *> *> \param[in] N *> \verbatim *> N is INTEGER *> The order of the matrix A. N >= 0. *> \endverbatim *> *> \param[in] NRHS *> \verbatim *> NRHS is INTEGER *> The number of right hand sides, i.e., the number of columns *> of the matrices B and X. NRHS >= 0. *> \endverbatim *> *> \param[in] A *> \verbatim *> A is REAL array, dimension (LDA,N) *> The symmetric matrix A. If UPLO = 'U', the leading N-by-N *> upper triangular part of A contains the upper triangular part *> of the matrix A, and the strictly lower triangular part of A *> is not referenced. If UPLO = 'L', the leading N-by-N lower *> triangular part of A contains the lower triangular part of *> the matrix A, and the strictly upper triangular part of A is *> not referenced. 
*> \endverbatim *> *> \param[in] LDA *> \verbatim *> LDA is INTEGER *> The leading dimension of the array A. LDA >= max(1,N). *> \endverbatim *> *> \param[in] AF *> \verbatim *> AF is REAL array, dimension (LDAF,N) *> The triangular factor U or L from the Cholesky factorization *> A = U**T*U or A = L*L**T, as computed by SPOTRF. *> \endverbatim *> *> \param[in] LDAF *> \verbatim *> LDAF is INTEGER *> The leading dimension of the array AF. LDAF >= max(1,N). *> \endverbatim *> *> \param[in] B *> \verbatim *> B is REAL array, dimension (LDB,NRHS) *> The right hand side matrix B. *> \endverbatim *> *> \param[in] LDB *> \verbatim *> LDB is INTEGER *> The leading dimension of the array B. LDB >= max(1,N). *> \endverbatim *> *> \param[in,out] X *> \verbatim *> X is REAL array, dimension (LDX,NRHS) *> On entry, the solution matrix X, as computed by SPOTRS. *> On exit, the improved solution matrix X. *> \endverbatim *> *> \param[in] LDX *> \verbatim *> LDX is INTEGER *> The leading dimension of the array X. LDX >= max(1,N). *> \endverbatim *> *> \param[out] FERR *> \verbatim *> FERR is REAL array, dimension (NRHS) *> The estimated forward error bound for each solution vector *> X(j) (the j-th column of the solution matrix X). *> If XTRUE is the true solution corresponding to X(j), FERR(j) *> is an estimated upper bound for the magnitude of the largest *> element in (X(j) - XTRUE) divided by the magnitude of the *> largest element in X(j). The estimate is as reliable as *> the estimate for RCOND, and is almost always a slight *> overestimate of the true error. *> \endverbatim *> *> \param[out] BERR *> \verbatim *> BERR is REAL array, dimension (NRHS) *> The componentwise relative backward error of each solution *> vector X(j) (i.e., the smallest relative change in *> any element of A or B that makes X(j) an exact solution). 
*> \endverbatim *> *> \param[out] WORK *> \verbatim *> WORK is REAL array, dimension (3*N) *> \endverbatim *> *> \param[out] IWORK *> \verbatim *> IWORK is INTEGER array, dimension (N) *> \endverbatim *> *> \param[out] INFO *> \verbatim *> INFO is INTEGER *> = 0: successful exit *> < 0: if INFO = -i, the i-th argument had an illegal value *> \endverbatim * *> \par Internal Parameters: * ========================= *> *> \verbatim *> ITMAX is the maximum number of steps of iterative refinement. *> \endverbatim * * Authors: * ======== * *> \author Univ. of Tennessee *> \author Univ. of California Berkeley *> \author Univ. of Colorado Denver *> \author NAG Ltd. * *> \ingroup porfs * * ===================================================================== SUBROUTINE SPORFS( UPLO, N, NRHS, A, LDA, AF, LDAF, B, LDB, X, $ LDX, FERR, BERR, WORK, IWORK, INFO ) * * -- LAPACK computational routine -- * -- LAPACK is a software package provided by Univ. of Tennessee, -- * -- Univ. of California Berkeley, Univ. of Colorado Denver and NAG Ltd..-- * * .. Scalar Arguments .. CHARACTER UPLO INTEGER INFO, LDA, LDAF, LDB, LDX, N, NRHS * .. * .. Array Arguments .. INTEGER IWORK( * ) REAL A( LDA, * ), AF( LDAF, * ), B( LDB, * ), $ BERR( * ), FERR( * ), WORK( * ), X( LDX, * ) * .. * * ===================================================================== * * .. Parameters .. INTEGER ITMAX PARAMETER ( ITMAX = 5 ) REAL ZERO PARAMETER ( ZERO = 0.0E+0 ) REAL ONE PARAMETER ( ONE = 1.0E+0 ) REAL TWO PARAMETER ( TWO = 2.0E+0 ) REAL THREE PARAMETER ( THREE = 3.0E+0 ) * .. * .. Local Scalars .. LOGICAL UPPER INTEGER COUNT, I, J, K, KASE, NZ REAL EPS, LSTRES, S, SAFE1, SAFE2, SAFMIN, XK * .. * .. Local Arrays .. INTEGER ISAVE( 3 ) * .. * .. External Subroutines .. EXTERNAL SAXPY, SCOPY, SLACN2, SPOTRS, SSYMV, $ XERBLA * .. * .. Intrinsic Functions .. INTRINSIC ABS, MAX * .. * .. External Functions .. LOGICAL LSAME REAL SLAMCH EXTERNAL LSAME, SLAMCH * .. * .. Executable Statements .. 
*
*     Test the input parameters.
*
      INFO = 0
      UPPER = LSAME( UPLO, 'U' )
      IF( .NOT.UPPER .AND. .NOT.LSAME( UPLO, 'L' ) ) THEN
         INFO = -1
      ELSE IF( N.LT.0 ) THEN
         INFO = -2
      ELSE IF( NRHS.LT.0 ) THEN
         INFO = -3
      ELSE IF( LDA.LT.MAX( 1, N ) ) THEN
         INFO = -5
      ELSE IF( LDAF.LT.MAX( 1, N ) ) THEN
         INFO = -7
      ELSE IF( LDB.LT.MAX( 1, N ) ) THEN
         INFO = -9
      ELSE IF( LDX.LT.MAX( 1, N ) ) THEN
         INFO = -11
      END IF
      IF( INFO.NE.0 ) THEN
         CALL XERBLA( 'SPORFS', -INFO )
         RETURN
      END IF
*
*     Quick return if possible
*
      IF( N.EQ.0 .OR. NRHS.EQ.0 ) THEN
         DO 10 J = 1, NRHS
            FERR( J ) = ZERO
            BERR( J ) = ZERO
   10    CONTINUE
         RETURN
      END IF
*
*     NZ = maximum number of nonzero elements in each row of A, plus 1
*
      NZ = N + 1
      EPS = SLAMCH( 'Epsilon' )
      SAFMIN = SLAMCH( 'Safe minimum' )
      SAFE1 = REAL( NZ )*SAFMIN
      SAFE2 = SAFE1 / EPS
*
*     Do for each right hand side
*
      DO 140 J = 1, NRHS
*
         COUNT = 1
         LSTRES = THREE
   20    CONTINUE
*
*        Loop until stopping criterion is satisfied.
*
*        Compute residual R = B - A * X
*
         CALL SCOPY( N, B( 1, J ), 1, WORK( N+1 ), 1 )
         CALL SSYMV( UPLO, N, -ONE, A, LDA, X( 1, J ), 1, ONE,
     $               WORK( N+1 ), 1 )
*
*        Compute componentwise relative backward error from formula
*
*        max(i) ( abs(R(i)) / ( abs(A)*abs(X) + abs(B) )(i) )
*
*        where abs(Z) is the componentwise absolute value of the matrix
*        or vector Z.  If the i-th component of the denominator is less
*        than SAFE2, then SAFE1 is added to the i-th components of the
*        numerator and denominator before dividing.
*
         DO 30 I = 1, N
            WORK( I ) = ABS( B( I, J ) )
   30    CONTINUE
*
*        Compute abs(A)*abs(X) + abs(B).
*
         IF( UPPER ) THEN
            DO 50 K = 1, N
               S = ZERO
               XK = ABS( X( K, J ) )
               DO 40 I = 1, K - 1
                  WORK( I ) = WORK( I ) + ABS( A( I, K ) )*XK
                  S = S + ABS( A( I, K ) )*ABS( X( I, J ) )
   40          CONTINUE
               WORK( K ) = WORK( K ) + ABS( A( K, K ) )*XK + S
   50       CONTINUE
         ELSE
            DO 70 K = 1, N
               S = ZERO
               XK = ABS( X( K, J ) )
               WORK( K ) = WORK( K ) + ABS( A( K, K ) )*XK
               DO 60 I = K + 1, N
                  WORK( I ) = WORK( I ) + ABS( A( I, K ) )*XK
                  S = S + ABS( A( I, K ) )*ABS( X( I, J ) )
   60          CONTINUE
               WORK( K ) = WORK( K ) + S
   70       CONTINUE
         END IF
         S = ZERO
         DO 80 I = 1, N
            IF( WORK( I ).GT.SAFE2 ) THEN
               S = MAX( S, ABS( WORK( N+I ) ) / WORK( I ) )
            ELSE
               S = MAX( S, ( ABS( WORK( N+I ) )+SAFE1 ) /
     $             ( WORK( I )+SAFE1 ) )
            END IF
   80    CONTINUE
         BERR( J ) = S
*
*        Test stopping criterion. Continue iterating if
*           1) The residual BERR(J) is larger than machine epsilon, and
*           2) BERR(J) decreased by at least a factor of 2 during the
*              last iteration, and
*           3) At most ITMAX iterations tried.
*
         IF( BERR( J ).GT.EPS .AND. TWO*BERR( J ).LE.LSTRES .AND.
     $       COUNT.LE.ITMAX ) THEN
*
*           Update solution and try again.
*
            CALL SPOTRS( UPLO, N, 1, AF, LDAF, WORK( N+1 ), N, INFO )
            CALL SAXPY( N, ONE, WORK( N+1 ), 1, X( 1, J ), 1 )
            LSTRES = BERR( J )
            COUNT = COUNT + 1
            GO TO 20
         END IF
*
*        Bound error from formula
*
*        norm(X - XTRUE) / norm(X) .le. FERR =
*        norm( abs(inv(A))*
*           ( abs(R) + NZ*EPS*( abs(A)*abs(X)+abs(B) ))) / norm(X)
*
*        where
*          norm(Z) is the magnitude of the largest component of Z
*          inv(A) is the inverse of A
*          abs(Z) is the componentwise absolute value of the matrix or
*             vector Z
*          NZ is the maximum number of nonzeros in any row of A, plus 1
*          EPS is machine epsilon
*
*        The i-th component of abs(R)+NZ*EPS*(abs(A)*abs(X)+abs(B))
*        is incremented by SAFE1 if the i-th component of
*        abs(A)*abs(X) + abs(B) is less than SAFE2.
*
*        Use SLACN2 to estimate the infinity-norm of the matrix
*           inv(A) * diag(W),
*        where W = abs(R) + NZ*EPS*( abs(A)*abs(X)+abs(B) )))
*
         DO 90 I = 1, N
            IF( WORK( I ).GT.SAFE2 ) THEN
               WORK( I ) = ABS( WORK( N+I ) ) + REAL( NZ )*EPS*WORK( I )
            ELSE
               WORK( I ) = ABS( WORK( N+I ) ) + REAL( NZ )*EPS*WORK( I )
     $                     + SAFE1
            END IF
   90    CONTINUE
*
         KASE = 0
  100    CONTINUE
         CALL SLACN2( N, WORK( 2*N+1 ), WORK( N+1 ), IWORK,
     $                FERR( J ),
     $                KASE, ISAVE )
         IF( KASE.NE.0 ) THEN
            IF( KASE.EQ.1 ) THEN
*
*              Multiply by diag(W)*inv(A**T).
*
               CALL SPOTRS( UPLO, N, 1, AF, LDAF, WORK( N+1 ), N,
     $                      INFO )
               DO 110 I = 1, N
                  WORK( N+I ) = WORK( I )*WORK( N+I )
  110          CONTINUE
            ELSE IF( KASE.EQ.2 ) THEN
*
*              Multiply by inv(A)*diag(W).
*
               DO 120 I = 1, N
                  WORK( N+I ) = WORK( I )*WORK( N+I )
  120          CONTINUE
               CALL SPOTRS( UPLO, N, 1, AF, LDAF, WORK( N+1 ), N,
     $                      INFO )
            END IF
            GO TO 100
         END IF
*
*        Normalize error.
*
         LSTRES = ZERO
         DO 130 I = 1, N
            LSTRES = MAX( LSTRES, ABS( X( I, J ) ) )
  130    CONTINUE
         IF( LSTRES.NE.ZERO )
     $      FERR( J ) = FERR( J ) / LSTRES
*
  140 CONTINUE
*
      RETURN
*
*     End of SPORFS
*
      END
```
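The componentwise relative backward error BERR computed above is not Fortran-specific. Here is a minimal pure-Python sketch of the same formula, max(i) |r_i| / (|A|·|x| + |b|)_i with r = b − A·x; it deliberately omits the SAFE1/SAFE2 safeguards for near-zero denominators that the LAPACK routine applies:

```python
def componentwise_backward_error(A, x, b):
    """max_i |r_i| / (|A||x| + |b|)_i, with r = b - A x (the BERR formula above)."""
    n = len(b)
    berr = 0.0
    for i in range(n):
        r_i = b[i] - sum(A[i][k] * x[k] for k in range(n))
        denom = sum(abs(A[i][k]) * abs(x[k]) for k in range(n)) + abs(b[i])
        berr = max(berr, abs(r_i) / denom)
    return berr

# An (essentially) exact solution of a small SPD system has tiny backward error:
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = [1.0 / 11.0, 7.0 / 11.0]  # solves A x = b exactly in rationals
assert componentwise_backward_error(A, x, b) < 1e-12
```

A BERR on the order of machine epsilon means the computed x exactly solves a nearby system; the iterative refinement loop above stops once BERR no longer shrinks.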
```python
import pytest

from helpers.cluster import ClickHouseCluster

cluster = ClickHouseCluster(__file__)
node1 = cluster.add_instance("node1")


@pytest.fixture(scope="module")
def start_cluster():
    try:
        cluster.start()
        yield cluster
    finally:
        cluster.shutdown()


def test_attach_without_checksums(start_cluster):
    node1.query(
        "CREATE TABLE test (date Date, key Int32, value String) Engine=MergeTree ORDER BY key PARTITION by date"
    )

    node1.query(
        "INSERT INTO test SELECT toDate('2019-10-01'), number, toString(number) FROM numbers(100)"
    )

    assert node1.query("SELECT COUNT() FROM test WHERE key % 10 == 0") == "10\n"

    node1.query("ALTER TABLE test DETACH PARTITION '2019-10-01'")

    assert node1.query("SELECT COUNT() FROM test WHERE key % 10 == 0") == "0\n"
    assert node1.query("SELECT COUNT() FROM test") == "0\n"

    # to be sure the output is not empty
    node1.exec_in_container(
        [
            "bash",
            "-c",
            'find /var/lib/clickhouse/data/default/test/detached -name "checksums.txt" | grep -e ".*" ',
        ],
        privileged=True,
        user="root",
    )

    node1.exec_in_container(
        [
            "bash",
            "-c",
            'find /var/lib/clickhouse/data/default/test/detached -name "checksums.txt" -delete',
        ],
        privileged=True,
        user="root",
    )

    node1.query("ALTER TABLE test ATTACH PARTITION '2019-10-01'")

    assert node1.query("SELECT COUNT() FROM test WHERE key % 10 == 0") == "10\n"
    assert node1.query("SELECT COUNT() FROM test") == "100\n"

    node1.query("DROP TABLE test")
```
Iris subg. Nepalensis is a subgenus of Iris, also known as the 'Himalayan irises'. It was formerly the genus Junopsis. The irises have fleshy roots very similar to those of a daylily (Hemerocallis). They are best grown in a semi-shady spot in a bulb frame. Most species in the subgenus are found in the Himalayas and the Yunnan region. Only four species are known:

Iris decora Wall.
Iris collettii Hook.
Iris staintonii H.Hara
Iris barbatula Noltie & K.Y.Guan

Iris decora
This is the best known of the species. It has many synonyms: Evansia nepalensis (Klatt), Iris nepalensis (D.Don), Iris nepalensis var. khasiana (Baker), Iris sulcata (Wall.), Iris yunnanensis (H.Lév.), Junopsis decora (Wall.) Wern.Schulze, Neubeckia decora (Wall.) Klatt and Neubeckia sulcata (Klatt).

It was first published in the British Flower Garden Series 2 in 1829, and described by Nathaniel Wallich in his book Plantae Asiaticae Rariores in 1830. It was later published in the Journal of the Royal Horticultural Society in 1969.

It is hardy to USDA Zone 3. It also requires frequent watering while in growth. It is sometimes confused with Iris leptophylla (in Iris subg. Scorpiris).

It has a rhizome covered in bristly fibres, similar in form to the roots of Hemerocallis. It reaches a height of 10–30 cm. It has 3–7 flowers per stem in the summer (June in the UK), which are approximately 4–5 cm in diameter. They come in a range of colours between pale bluish lavender and deep reddish purple. The perianth tube measures 3.5–5 cm. The falls are up to 3.5 cm long. The blade has an orange-yellow central ridge that becomes white or purple at the apex. It has a whitish claw with purple veins. The leaves reach up to 30 cm at flowering time and then grow to 45–60 cm, becoming longer than the flowering stem. The strongly ribbed leaves can be 2–8 mm wide.

Iris decora was found in 1832 on grassy hillsides on plateaus, open stony pastures, and cliffs at 2800–3100 m above sea level.
It can be found in the Himalayas from Kashmir to China: in Sichuan, Xizang (Tibet) and Yunnan, and in Bhutan, northern India and Nepal in the western and central Himalayas.

A white-flowered form from the Yunnan region has been described as Iris decora var. leucantha by D. Dong & Y. T. Zhao (Bull. Bot. Res., Harbin 18: 150.) in 1998.

Iris collettii
It was found in 1909 in northern Burma, Thailand, Tibet and the provinces of Yunnan and Sichuan (in China). It was named after Sir Henry Collett (1836–1901), who collected plants in most of those regions. It has been found growing in various habitats, including wood edges, clearings, shrubby areas, and sunny grasslands. It can grow at altitudes of up to 3400 m above sea level.

It has 3–7 lilac-blue flowers on a 5–15 cm tall stem. The flower has a very long neck, similar to a crocus. It generally flowers in May–June. The flower has an orange caterpillar-like beard on the midrib. It also has ribbed, grey-green leaves which extend after blooming up to .

Two varieties have been described: Iris collettii var. collettii and Iris collettii var. acaulis. Iris collettii var. acaulis (Noltie) was described in the New Plantsman (magazine) in 1995. It was found at above sea level, in the provinces of Sichuan and Yunnan in China.

Iris staintonii
Originally found in 1974 in Nepal, it normally has a single mauve flower (about 3 cm) with a bearded fall, marked with white. It is deemed a rare plant in Nepal. It was first published by Hiroshi Hara in the Journal of Japanese Botany in 1974. It was given to Kew Gardens by an Oxford University team in 1992. Other mentions: Hara, H. et al. 1978–1982, An enumeration of the flowering plants of Nepal; Mathew, B. 1981, The Iris, 134.

Iris barbatula
A recent discovery, it was described by Henry John Noltie and K.Y.Guan in 1995 in the New Plantsman 2: 137, and was collected from north-west Yunnan. It has been found in open grassy areas and forest clearings, and on grassy plateaus at above sea level.
It has three long-tubed purple to dark-violet flowers, which are about 5 cm across, and has a short subterranean stem. Unusually, it also has a fimbriate (fringed), almost beard-like crest. It flowers between May and July. It has leaves that grow 9–19 cm tall and 2–5 mm wide. It tends to form small clumps of bulbs after several years.

References

Iris (plant)
Plant subgenera
Flora of East Himalaya
Flora of West Himalaya
Flora of Nepal
Garden plants of Asia
A list of films produced in Italy in 1946 (see 1946 in film):

References

External links
Italian films of 1946 at the Internet Movie Database

Italian 1946 Films
```csharp
// --------------------------------------------------------------------------------------------------------------------
// A (very) simple network interpolation script, using Lerp().
//
// This will lag behind, compared to the moving cube on the controlling client.
// Actually, we deliberately lag behind a bit more, to avoid stops, if updates arrive late.
//
// This script does not hide loss very well and might stop the local cube.
// --------------------------------------------------------------------------------------------------------------------

using UnityEngine;

[RequireComponent(typeof(PhotonView))]
public class CubeLerp : Photon.MonoBehaviour, IPunObservable
{
    private Vector3 latestCorrectPos;
    private Vector3 onUpdatePos;
    private float fraction;

    public void Start()
    {
        this.latestCorrectPos = transform.position;
        this.onUpdatePos = transform.position;
    }

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.isWriting)
        {
            Vector3 pos = transform.localPosition;
            Quaternion rot = transform.localRotation;
            stream.Serialize(ref pos);
            stream.Serialize(ref rot);
        }
        else
        {
            // Receive latest state information
            Vector3 pos = Vector3.zero;
            Quaternion rot = Quaternion.identity;
            stream.Serialize(ref pos);
            stream.Serialize(ref rot);

            this.latestCorrectPos = pos;                // save this to move towards it in FixedUpdate()
            this.onUpdatePos = transform.localPosition; // we interpolate from here to latestCorrectPos
            this.fraction = 0;                          // reset the fraction we already moved. see Update()

            transform.localRotation = rot;              // this sample doesn't smooth rotation
        }
    }

    public void Update()
    {
        if (this.photonView.isMine)
        {
            return; // if this object is under our control, we don't need to apply received position-updates
        }

        // We get 10 updates per sec. Sometimes a few less or one or two more, depending on variation of lag.
        // Due to that we want to reach the correct position in a little over 100ms. We get a new update then.
        // This way, we can usually avoid a stop of our interpolated cube movement.
        //
        // Lerp() gets a fraction value between 0 and 1. This is how far we went from A to B.
        //
        // So in 100 ms, we want to move from our previous position to the latest known.
        // Our fraction variable should reach 1 in 100ms, so we should multiply deltaTime by 10.
        // We want it to take a bit longer, so we multiply with 9 instead!
        this.fraction = this.fraction + Time.deltaTime * 9;
        transform.localPosition = Vector3.Lerp(this.onUpdatePos, this.latestCorrectPos, this.fraction); // set our pos between A and B
    }
}
```
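The catch-up arithmetic in Update() is independent of any engine. A minimal sketch in plain Python (a hypothetical one-axis lerp, assuming a fixed 60 fps frame time): because the fraction grows by 9 per second, it exceeds 1 after roughly 111 ms, and a clamped lerp then holds the object at the target until the next network update resets the fraction.

```python
def lerp(a, b, t):
    # Clamped linear interpolation, like Unity's Vector3.Lerp on one axis.
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

# Simulate catching up to a network update, stepping at 60 fps (dt = 1/60 s)
# with the fraction rate of 9/s used in the script above.
pos, start, target, fraction = 0.0, 0.0, 10.0, 0.0
dt = 1.0 / 60.0
for _ in range(7):  # 7 frames ~= 117 ms, enough for the fraction to pass 1
    fraction += dt * 9
    pos = lerp(start, target, fraction)
assert pos == 10.0  # fraction exceeded 1, so the lerp clamped at the target
```

With an update rate of 10 Hz, a fresh target normally arrives just before the clamp kicks in, which is exactly the small deliberate lag the comments describe.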
The men's singles badminton event at the 2018 Commonwealth Games was held from 10 to 15 April 2018 at the Carrara Sports and Leisure Centre on the Gold Coast, Australia. The defending gold medalist, Parupalli Kashyap of India, did not defend his title. The athletes were drawn into a straight knockout bracket. The draw for the competition was conducted on 2 April 2018.

Seeds
The seeds for the tournament were:

 (silver medalist)
 (gold medalist)
 (fourth place)
 (bronze medalist)
 (quarter-finals)
 (round of 16)
 (round of 32)
 (quarter-finals)

Results

Finals

Top half
Section 1
Section 2

Bottom half
Section 3
Section 4

References

Men's singles
The Ernest Malinowski Monument is a sculpture located at the highest point of Ticlio, a mountain pass in the Department of Lima, Peru, within the Chicla District. It is dedicated to Ernest Malinowski, an engineer who designed the Ferrocarril Central Andino railway in Peru. The monument was designed by sculptor Gustaw Zemła and unveiled on 2 March 1999, on the hundredth anniversary of Malinowski's death. It stands at 4,818 m (15,807 ft) above mean sea level, which makes it one of the highest-placed monuments in the world.

History
The monument was designed by sculptor Gustaw Zemła and dedicated to Ernest Malinowski, an engineer who designed the Ferrocarril Central Andino railway in Peru. It was unveiled on 2 March 1999, on the hundredth anniversary of Malinowski's death.

Characteristics
The monument is located at the highest point of Ticlio, a mountain pass in the Department of Lima, Peru, within the Chicla District, which is also the highest point of the Ferrocarril Central Andino railway. It stands at 4,818 m (15,807 ft) above mean sea level, making it one of the highest-placed monuments in the world.

It consists of a granite cuboid pedestal, which features the coats of arms of Peru and Poland and inscriptions in Spanish and Polish which translate to "Polish engineer, Peruvian patriot, hero of the Defence of Callao of 1866, the designer and constructor of the Ferrocarril Central Andino [Andean Central Railway]". On top is a granite cylinder with its flat side facing forward, bearing a bronze relief of the face of Ernest Malinowski, to whom the monument was dedicated. The monument is 7 m (22.97 ft) tall.

Notes

References

1979 establishments in Peru
1999 sculptures
Buildings and structures completed in 1999
Buildings and structures in Lima Region
Monuments and memorials in Peru
Sculptures of men
```php
<?php
/*
 *
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 */

namespace Google\Service\Apigee\Resource;

use Google\Service\Apigee\GoogleCloudApigeeV1ListDeploymentsResponse;

/**
 * The "deployments" collection of methods.
 * Typical usage is:
 *  <code>
 *   $apigeeService = new Google\Service\Apigee(...);
 *   $deployments = $apigeeService->organizations_sharedflows_revisions_deployments;
 *  </code>
 */
class OrganizationsSharedflowsRevisionsDeployments extends \Google\Service\Resource
{
  /**
   * Lists all deployments of a shared flow revision.
   * (deployments.listOrganizationsSharedflowsRevisionsDeployments)
   *
   * @param string $parent Required. Name of the API proxy revision for which to
   * return deployment information in the following format:
   * `organizations/{org}/sharedflows/{sharedflow}/revisions/{rev}`.
   * @param array $optParams Optional parameters.
   * @return GoogleCloudApigeeV1ListDeploymentsResponse
   * @throws \Google\Service\Exception
   */
  public function listOrganizationsSharedflowsRevisionsDeployments($parent, $optParams = [])
  {
    $params = ['parent' => $parent];
    $params = array_merge($params, $optParams);
    return $this->call('list', [$params], GoogleCloudApigeeV1ListDeploymentsResponse::class);
  }
}

// Adding a class alias for backwards compatibility with the previous class name.
class_alias(OrganizationsSharedflowsRevisionsDeployments::class, 'your_sha256_hashDeployments');
```
The British Columbia Men's Premier League is a provincial rugby union competition currently contested by twelve clubs in British Columbia, Canada and one in the U.S. state of Washington. The BC Premier League is organized by the British Columbia Rugby Union. The league currently consists of teams from the Lower Mainland and Vancouver Island. Clubs play each other twice throughout the regular season. The top six teams are then seeded and qualify for two rounds of playoffs. The top two teams from the semi-finals then face each other at the BC Rugby Club Finals in May.

History
The teams compete for the prestigious Rounsefell Cup. The trophy was donated by F.W. Rounsefell, a Vancouver insurance broker and financier and former British Columbia rugby star. The Rounsefell Cup was first awarded to the Central Athletic Club in March 1922. The cup has been competed for annually ever since.

2018–19 Premier League Teams

Past Champions
1922 - Central Athletic Club
1923 - Vancouver Rowing Club
1924 - UBC Thunderbirds
1925 - James Bay Athletic Association
1926 - Ex King George
1927 - Ex King George
1928 - Ex King George
1929 - Meraloma Athletic Club
1930 - UBC Thunderbirds
1931 - Ex King George & The Canadian Scottish Regiment (Princess Mary's)
1932 - Not contested
1933 - North Shore All Blacks
1934 - North Shore All Blacks
1935 - North Shore All Blacks
1936 - Vancouver Rowing Club & The 5th Regiment
1937 - North Shore All Blacks
1938 - James Bay Athletic Association
1939 - Meraloma Athletic Club & JBAA
1940 - James Bay Athletic Association
1941 - Meraloma Athletic Club
1942 - Ex Lord Byng
1943 - Royal Canadian Naval College
1944 - Army Victoria
1945 - UBC Thunderbirds
1946 - James Bay Athletic Association
1947 - UBC Thunderbirds
1948 - North Shore All Blacks
1949 - Ex South Burnaby
1950 - Ex Britannia (Brit Lions Rugby Club)
1951 - Vindex RFC
1952 - Vindex RFC
1953 - Vindex RFC
1954 - Meraloma Athletic Club
1955 - North Shore All Blacks
1956 - Kats Rugby Club
1957 - Kats Rugby Club
1958 - Kats Rugby Club
1959 - Kats Rugby Club
1960 - Oak Bay Wanderers
1961 - Kats Rugby Club
1962 - Kats Rugby Club & Oak Bay Wanderers
1963 - Kats Rugby Club & JBAA
1964 - Kats Rugby Club
1965 - Meraloma Athletic Club
1966 - Kats Rugby Club
1967 - Meraloma Athletic Club
1968 - Kats Rugby Club
1969 - Kats Rugby Club
1970 - Kats Rugby Club
1971 - University of Victoria Vikes
1972 - Meraloma Athletic Club
1973 - Meraloma Athletic Club
1974 - James Bay Athletic Association
1975 - James Bay Athletic Association
1976 - James Bay Athletic Association
1977 - James Bay Athletic Association
1978 - James Bay Athletic Association
1979 - James Bay Athletic Association
1980 - James Bay Athletic Association
1981 - UBC Old Boys Ravens
1982 - James Bay Athletic Association
1983 - Meraloma Athletic Club
1984 - Ex Britannia (Brit Lions Rugby Club)
1985 - UBC Old Boys Ravens
1986 - Meraloma Athletic Club
1987 - Meraloma Athletic Club
1988 - Meraloma Athletic Club
1989 - James Bay Athletic Association
1990 - UBC Old Boys Ravens
1991 - UBC Old Boys Ravens
1992 - James Bay Athletic Association
1993 - James Bay Athletic Association
1994 - UBC Old Boys Ravens
1995 - Vancouver Rowing Club
1996 - James Bay Athletic Association
1997 - Cowichan RFC
1998 - Cowichan RFC
1999 - James Bay Athletic Association
2000 - Castaway Wanderers RFC
2001 - Castaway Wanderers RFC
2002 - Castaway Wanderers RFC
2003 - University of Victoria Vikes
2004 - Capilano RFC
2005 - Capilano RFC
2006 - James Bay Athletic Association
2007 - James Bay Athletic Association
2008 - James Bay Athletic Association
2009 - Meraloma Athletic Club
2010 - University of Victoria Vikes
2011 - Castaway Wanderers RFC
2012 - Capilano RFC
2013 - James Bay Athletic Association
2014 - James Bay Athletic Association
2015 - UBC Thunderbirds
2016 - UBC Thunderbirds
2017 - UBC Thunderbirds
2018 - UBC Old Boys Ravens
2019 - UBC Old Boys Ravens
2020 - Not contested due to COVID-19
2021 - Not contested due to COVID-19
2022 - UBC Thunderbirds
2023 - UBC Thunderbirds

Awards
Player of the Year

See also
Rugby Canada
British Columbia Rugby Union
Coastal Cup

References

External links
Rugby Canada
British Columbia Rugby Union

Rugby union leagues in Canada
Rugby union in British Columbia
Rugby union
Lucius Vitellius (before 7 BC – AD 51) was the youngest of four sons of the procurator Publius Vitellius and the only one who did not die through politics. He was consul three times, which was unusual during the Roman Empire for someone who was not a member of the imperial family: the first time in the year 34 as the colleague of Paullus Fabius Persicus; the second in 43 as the colleague of the emperor Claudius; the third in 47, again as the colleague of the emperor Claudius.

Career
Under the emperor Tiberius, he was consul, and the following year, 35, became governor of Syria. He deposed Pontius Pilate in 36 after complaints from the people of Samaria. He supported the emperor Caligula, and was a favorite of the emperor Claudius' wife Valeria Messalina. During Claudius' reign, he was consul again twice, and governed Rome while the emperor was absent on his invasion of Britain. Around the time that Claudius married Agrippina the Younger (in 47, 48 or 49), Vitellius served as censor.

Josephus, in his Antiquities of the Jews, records that he wrote to Tiberius to request that the Jewish high-priestly robe be allowed back under Jewish control, and this request was granted. He wielded great influence and was known for his outstanding character, though at one time a senator accused him of treason. He died of paralysis in 51. Lucius received a state funeral and had a statue on the rostra bearing the inscription "steadfastly loyal to the Emperor".

Family
Lucius married Sextilia, a reputable woman from a distinguished family, who gave birth to two sons: Aulus Vitellius Germanicus (the short-lived emperor in 69) and Lucius Vitellius.

In fiction
Vitellius is a prominent character in Robert Graves's novel Claudius the God as an intimate friend of Claudius.

References

External links
Lucius Vitellius entry in historical sourcebook by Mahlon H.
Smith
Livius.org: Lucius Vitellius

1st-century BC births
51 deaths
Year of birth uncertain
1st-century Roman governors of Syria
Ancient Roman equites
Lucius
1st-century Romans
Imperial Roman consuls
Ancient Roman censors
Roman governors of Syria
```javascript OC.L10N.register( "updatenotification", { "{version} is available. Get more information on how to update." : "{version} . .", "Channel updated" : " ", "Update notifications" : " ", "The update server could not be reached since %d days to check for new updates." : " %d .", "Please check the Nextcloud and server log files for errors." : " Nextcloud .", "Update to %1$s is available." : " %1$s.", "Update for {app} to version %s is available." : " {app} %s.", "Update notification" : " ", "Displays update notifications for Nextcloud and provides the SSO for the updater." : " Nextcloud SSO ( ) .", "Update" : "", "The version you are running is not maintained anymore. Please make sure to update to a supported version as soon as possible." : " . .", "View in store" : " ", "Open updater" : " ", "Download now" : " ", "What's new?" : " ?", "View changelog" : " ", "The update check is not yet finished. Please refresh the page." : " . .", "Your version is up to date." : " .", "You can always update to a newer version. But you can never downgrade to a more stable version." : " . .", "Notify members of the following groups about available updates:" : " :", "The selected update channel does not support updates of the server." : " .", "A new version is available: <strong>{newVersionString}</strong>" : " : <strong>{newVersionString}</strong>", "Please make sure your config.php does not set <samp>appstoreenabled</samp> to false." : " config.php <samp>appstoreenabled</samp> false.", "Stable" : "", "Beta" : "", "Update channel:" : " :", "Checked on {lastCheckedDate}" : " {lastCheckedDate}" }, "nplurals=3; plural=(n == 1 && n % 1 == 0) ? 0 : (n == 2 && n % 1 == 0) ? 1: (n % 10 == 0 && n % 1 == 0 && n > 10) ? 2 : 3;"); ```
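The last argument registers the plural-form rule for this locale. As a sketch of how such a C-style ternary expression maps a count n to a plural index, here is a direct Python translation of the rule above (the `n % 1 == 0` clauses test that n is an integer, which is always true for whole counts):

```python
def plural_index(n):
    # Mirrors the rule from the file above:
    # (n == 1 && n % 1 == 0) ? 0 : (n == 2 && n % 1 == 0) ? 1
    #   : (n % 10 == 0 && n % 1 == 0 && n > 10) ? 2 : 3
    if n == 1 and n % 1 == 0:
        return 0
    if n == 2 and n % 1 == 0:
        return 1
    if n % 10 == 0 and n % 1 == 0 and n > 10:
        return 2
    return 3

assert [plural_index(n) for n in (1, 2, 20, 7)] == [0, 1, 2, 3]
```

A translation framework evaluates this index for each count and picks the matching translated string from the plural-form array.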
2017 Kashiwa Reysol season.

J1 League

References

External links
J.League official site

Kashiwa Reysol
Kashiwa Reysol seasons
IJD or ijd may refer to:

Indian Journal of Dermatology, a bimonthly peer-reviewed open-access medical journal published in India
International Journal of Dermatology, a peer-reviewed monthly journal covering all aspects of dermatology
IJD, the FAA LID code for Windham Airport, Connecticut, United States
```c++
//  Boost string_algo library join.hpp header file  ---------------------------//

//
//  (See accompanying file LICENSE_1_0.txt or copy at
//          path_to_url
//  See path_to_url for updates, documentation, and revision history.

#ifndef BOOST_STRING_JOIN_HPP
#define BOOST_STRING_JOIN_HPP

#include <boost/algorithm/string/config.hpp>
#include <boost/algorithm/string/detail/sequence.hpp>
#include <boost/range/value_type.hpp>
#include <boost/range/as_literal.hpp>

/*! \file
    Defines join algorithm.

    Join algorithm is a counterpart to split algorithms.
    It joins strings from a 'list' by adding user defined separator.
    Additionally there is a version that allows simple filtering
    by providing a predicate.
*/

namespace boost {
    namespace algorithm {

//  join --------------------------------------------------------------//

        //! Join algorithm
        /*!
            This algorithm joins all strings in a 'list' into one long string.
            Segments are concatenated by given separator.

            \param Input A container that holds the input strings. It must be a container-of-containers.
            \param Separator A string that will separate the joined segments.
            \return Concatenated string.

            \note This function provides the strong exception-safety guarantee
        */
        template< typename SequenceSequenceT, typename Range1T>
        inline typename range_value<SequenceSequenceT>::type
        join(
            const SequenceSequenceT& Input,
            const Range1T& Separator)
        {
            // Define working types
            typedef typename range_value<SequenceSequenceT>::type ResultT;
            typedef typename range_const_iterator<SequenceSequenceT>::type InputIteratorT;

            // Parse input
            InputIteratorT itBegin=::boost::begin(Input);
            InputIteratorT itEnd=::boost::end(Input);

            // Construct container to hold the result
            ResultT Result;

            // Append first element
            if(itBegin!=itEnd)
            {
                detail::insert(Result, ::boost::end(Result), *itBegin);
                ++itBegin;
            }

            for(;itBegin!=itEnd; ++itBegin)
            {
                // Add separator
                detail::insert(Result, ::boost::end(Result), ::boost::as_literal(Separator));
                // Add element
                detail::insert(Result, ::boost::end(Result), *itBegin);
            }

            return Result;
        }

// join_if ----------------------------------------------------------//

        //! Conditional join algorithm
        /*!
            This algorithm joins all strings in a 'list' into one long string.
            Segments are concatenated by given separator. Only segments that
            satisfy the predicate will be added to the result.

            \param Input A container that holds the input strings. It must be a container-of-containers.
            \param Separator A string that will separate the joined segments.
            \param Pred A segment selection predicate
            \return Concatenated string.

            \note This function provides the strong exception-safety guarantee
        */
        template< typename SequenceSequenceT, typename Range1T, typename PredicateT>
        inline typename range_value<SequenceSequenceT>::type
        join_if(
            const SequenceSequenceT& Input,
            const Range1T& Separator,
            PredicateT Pred)
        {
            // Define working types
            typedef typename range_value<SequenceSequenceT>::type ResultT;
            typedef typename range_const_iterator<SequenceSequenceT>::type InputIteratorT;

            // Parse input
            InputIteratorT itBegin=::boost::begin(Input);
            InputIteratorT itEnd=::boost::end(Input);

            // Construct container to hold the result
            ResultT Result;

            // Roll to the first element that will be added
            while(itBegin!=itEnd && !Pred(*itBegin)) ++itBegin;

            // Add this element
            if(itBegin!=itEnd)
            {
                detail::insert(Result, ::boost::end(Result), *itBegin);
                ++itBegin;
            }

            for(;itBegin!=itEnd; ++itBegin)
            {
                if(Pred(*itBegin))
                {
                    // Add separator
                    detail::insert(Result, ::boost::end(Result), ::boost::as_literal(Separator));
                    // Add element
                    detail::insert(Result, ::boost::end(Result), *itBegin);
                }
            }

            return Result;
        }

    } // namespace algorithm

    // pull names to the boost namespace
    using algorithm::join;
    using algorithm::join_if;

} // namespace boost

#endif  // BOOST_STRING_JOIN_HPP
```
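For comparison, join_if's selection semantics (the separator appears only between segments that pass the predicate, which is why the loop first rolls forward to the first accepted element) can be sketched in a few lines of Python:

```python
def join_if(parts, sep, pred):
    # Keep only segments satisfying pred, then concatenate with sep
    # between kept segments only (mirroring the roll-to-first loop above).
    return sep.join(p for p in parts if pred(p))

# Empty segments are filtered out, so no doubled separators appear:
assert join_if(["a", "", "bb", "", "c"], "-", bool) == "a-bb-c"
assert join_if([], "-", bool) == ""
```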
The Rogans Hill railway line was a short-lived railway line in the north-western suburbs of Sydney, Australia.

History
A steam tramway opened between Parramatta and Baulkham Hills in 1902, and was extended to Castle Hill in 1910, carrying passengers and produce to and from the area. The tramway departed from Argyle Street in Parramatta and ran north along Church Street to Northmead, then along Windsor Road and Old Northern Road to Castle Hill.

In 1919, the NSW government decided to convert the tramway into a railway to encourage the subdivision of estates for residential use. This involved building a new railway from the Main Western line at Westmead to Northmead on a new right-of-way, and then converting the tramway to railway standard along the existing route to Castle Hill. The new section between Westmead and Northmead was built in 1922, and the line opened to traffic to Castle Hill in 1923. It was extended to Rogans Hill in 1924 on a new right-of-way.

Stations were built at Mons Road (on the corner of Old Windsor Road), Northmead (on the corner of Briens Road and Windsor Road), Moxhams Road (at Windsor Road), Model Farms Road, Junction Road, Baulkham Hills, Cross Street, Southleigh (at Excelsior and Old Northern Roads), Parsonage Road, Castle Hill and Rogans Hill. The line was single track throughout, and ran alongside Windsor and Old Northern Roads between Northmead and Castle Hill. An island platform and crossing loop was provided at Baulkham Hills station. Most of the stations were short wooden platforms about 20 metres (66 ft) long. An office, waiting room and signal box were provided on the island platform at Baulkham Hills.

Passenger service initially consisted of a steam locomotive (20 Class) hauling 3 wooden passenger cars. In later years, CPH railmotors were used.

The line proved to be unsuccessful – unlike the tramway, goods traffic was not carried and the stations were too sparsely spread to be as convenient as the tram it replaced.
The rise of motor traffic on the adjacent roadway, which was not divided from the railway, also contributed to the line's demise. Passengers preferred the new and faster motor buses, which could take them directly to businesses in Parramatta, and the line closed on 31 January 1932.

The district that the line served is now substantially developed, and is a region of Sydney deficient in fixed-rail public transport infrastructure. A railway to the Hills District was opened in May 2019 to remedy this, but following a different alignment.

What remains
Little trace remains of the line, the route having been absorbed by road widening and residential development. The abutments and two concrete piers for the rail bridge over Toongabbie Creek still stand between Westmead and Northmead. There is also a well preserved wall of the cutting in the council car park off Raemot Lane in Baulkham Hills. Rails remain in the pavement near Castle Hill Bus interchange. A plaque has now been erected, on sleepers and old rails, at the site of the Castle Hill railway station, with pictures of the line during its 30-year history, and a short history of it.

See also
Railways in Sydney
Sydney Metro Northwest

References

Closed railway lines in Sydney
Railway lines opened in 1902
Railway lines closed in 1932
Standard gauge railways in Australia
The Hills Shire
1902 establishments in Australia
1932 disestablishments in Australia
```c++
////////////////////////////////////////////////////////////////////////////////////////////////////
//
//  Project:  Embedded Learning Library (ELL)
//  File:     SGDTrainer.h (trainers)
//  Authors:  Ofer Dekel
//
////////////////////////////////////////////////////////////////////////////////////////////////////

#pragma once

#include "ITrainer.h"

#include <predictors/include/LinearPredictor.h>

#include <data/include/Dataset.h>
#include <data/include/Example.h>

#include <cstddef>
#include <memory>
#include <random>
#include <string>

namespace ell
{
namespace trainers
{
    /// <summary> Parameters for the stochastic gradient descent trainer. </summary>
    struct SGDTrainerParameters
    {
        double regularization;
        std::string randomSeedString;
    };

    /// <summary>
    /// Implements the averaged stochastic gradient descent algorithm on an L2 regularized empirical
    /// loss. This class must have a derived class that implements DoFirstStep(), DoNextStep(), and CalculatePredictors().
    /// </summary>
    class SGDTrainerBase : public ITrainer<predictors::LinearPredictor<double>>
    {
    public:
        using PredictorType = predictors::LinearPredictor<double>;

        /// <summary> Sets the trainer's dataset. </summary>
        ///
        /// <param name="anyDataset"> A dataset. </param>
        void SetDataset(const data::AnyDataset& anyDataset) override;

        /// <summary> Updates the state of the trainer by performing a learning epoch. </summary>
        void Update() override;

        /// <summary> Returns the averaged predictor. </summary>
        ///
        /// <returns> A const reference to the averaged predictor. </returns>
        const predictors::LinearPredictor<double>& GetPredictor() const override { return GetAveragedPredictor(); }

    protected:
        // Instances of the base class cannot be created directly
        SGDTrainerBase(std::string randomSeedString);
        virtual void DoFirstStep(const data::AutoDataVector& x, double y, double weight) = 0;
        virtual void DoNextStep(const data::AutoDataVector& x, double y, double weight) = 0;
        virtual const PredictorType& GetAveragedPredictor() const = 0;

        data::AutoSupervisedDataset _dataset;
        std::default_random_engine _random;
        bool _firstIteration = true;
    };

    //
    // SGDTrainer - Stochastic Gradient Descent
    //

    /// <summary> Implements the steps of a simple sgd linear trainer. </summary>
    ///
    /// <typeparam name="LossFunctionType"> Loss function type. </typeparam>
    template <typename LossFunctionType>
    class SGDTrainer : public SGDTrainerBase
    {
    public:
        using SGDTrainerBase::PredictorType;

        /// <summary> Constructs an SGD linear trainer. </summary>
        ///
        /// <param name="lossFunction"> The loss function. </param>
        /// <param name="parameters"> The training parameters. </param>
        SGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters);

        /// <summary> Returns a const reference to the last predictor. </summary>
        ///
        /// <returns> A const reference to the last predictor. </returns>
        const PredictorType& GetLastPredictor() const { return _lastPredictor; }

        /// <summary> Returns a const reference to the averaged predictor. </summary>
        ///
        /// <returns> A const reference to the averaged predictor. </returns>
        const PredictorType& GetAveragedPredictor() const override { return _averagedPredictor; }

    protected:
        void DoFirstStep(const data::AutoDataVector& x, double y, double weight) override;
        void DoNextStep(const data::AutoDataVector& x, double y, double weight) override;

    private:
        LossFunctionType _lossFunction;
        SGDTrainerParameters _parameters;

        double _t = 0; // step counter;

        PredictorType _lastPredictor;
        PredictorType _averagedPredictor;

        void ResizeTo(const data::AutoDataVector& x);
    };

    //
    // SparseDataSGDTrainer - Sparse Data Stochastic Gradient Descent
    //

    /// <summary> Implements the steps of Sparse Data Stochastic Gradient Descent. </summary>
    ///
    /// <typeparam name="LossFunctionType"> Loss function type. </typeparam>
    template <typename LossFunctionType>
    class SparseDataSGDTrainer : public SGDTrainerBase
    {
    public:
        using SGDTrainerBase::PredictorType;

        /// <summary> Constructs an instance of SparseDataSGDTrainer. </summary>
        ///
        /// <param name="lossFunction"> The loss function. </param>
        /// <param name="parameters"> The training parameters. </param>
        SparseDataSGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters);

        /// <summary> Returns a const reference to the last predictor. </summary>
        ///
        /// <returns> A const reference to the last predictor. </returns>
        const PredictorType& GetLastPredictor() const;

        /// <summary> Returns a const reference to the averaged predictor. </summary>
        ///
        /// <returns> A const reference to the averaged predictor. </returns>
        const PredictorType& GetAveragedPredictor() const override;

    protected:
        void DoFirstStep(const data::AutoDataVector& x, double y, double weight) override;
        void DoNextStep(const data::AutoDataVector& x, double y, double weight) override;

    private:
        LossFunctionType _lossFunction;
        SGDTrainerParameters _parameters;

        // these variables follow the notation in path_to_url
        math::ColumnVector<double> _v; // gradient sum - weights
        math::ColumnVector<double> _u; // harmonic-weighted gradient sum - weights
        double _t = 0; // step counter
        double _a = 0; // gradient sum - bias
        double _h = 0; // harmonic number
        double _c = 0; // 1/t-weighted sum of _a

        // these variables are mutable because we calculate them in a lazy manner (only when `GetPredictor() const` is called)
        mutable PredictorType _lastPredictor;
        mutable PredictorType _averagedPredictor;

        void ResizeTo(const data::AutoDataVector& x);
    };

    //
    // SparseDataCenteredSGDTrainer - Sparse Data Centered Stochastic Gradient Descent
    //

    /// <summary> Implements the steps of Sparse Data Centered Stochastic Gradient Descent. </summary>
    ///
    /// <typeparam name="LossFunctionType"> Loss function type. </typeparam>
    template <typename LossFunctionType>
    class SparseDataCenteredSGDTrainer : public SGDTrainerBase
    {
    public:
        using SGDTrainerBase::PredictorType;

        /// <summary> Constructs an instance of SparseDataCenteredSGDTrainer. </summary>
        ///
        /// <param name="lossFunction"> The loss function. </param>
        /// <param name="center"> The center (mean) of the training set. </param>
        /// <param name="parameters"> Trainer parameters. </param>
        SparseDataCenteredSGDTrainer(const LossFunctionType& lossFunction, math::RowVector<double> center, const SGDTrainerParameters& parameters);

        /// <summary> Returns a const reference to the last predictor. </summary>
        ///
        /// <returns> A const reference to the last predictor. </returns>
        const PredictorType& GetLastPredictor() const;

        /// <summary> Returns a const reference to the averaged predictor. </summary>
        ///
        /// <returns> A const reference to the averaged predictor.
</summary> /// /// <returns> A const reference to the averaged predictor. </returns> const PredictorType& GetAveragedPredictor() const override; protected: void DoFirstStep(const data::AutoDataVector& x, double y, double weight) override; void DoNextStep(const data::AutoDataVector& x, double y, double weight) override; private: LossFunctionType _lossFunction; SGDTrainerParameters _parameters; // these variables follow the notation in path_to_url math::ColumnVector<double> _v; // gradient sum - weights math::ColumnVector<double> _u; // harmonic-weighted gradient sum - weights double _t = 0; // step counter double _a = 0; // gradient sum - bias double _h = 0; // harmonic number double _c = 0; // 1/t-weighted sum of _a double _z = 0; double _r = 0; double _s = 0; math::RowVector<double> _center; double _theta; // these variables are mutable because we calculate them in a lazy manner (only when `GetPredictor() const` is called) mutable PredictorType _lastPredictor; mutable PredictorType _averagedPredictor; void ResizeTo(const data::AutoDataVector& x); }; // // MakeTrainer helper functions // /// <summary> Makes a SGD linear trainer. </summary> /// /// <typeparam name="LossFunctionType"> Type of loss function to use. </typeparam> /// <param name="lossFunction"> The loss function. </param> /// <param name="parameters"> The trainer parameters. </param> /// /// <returns> A linear trainer </returns> template <typename LossFunctionType> std::unique_ptr<trainers::ITrainer<predictors::LinearPredictor<double>>> MakeSGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters); /// <summary> Makes a SparseDataSGD linear trainer. </summary> /// /// <typeparam name="LossFunctionType"> Type of loss function to use. </typeparam> /// <param name="lossFunction"> The loss function. </param> /// <param name="parameters"> The trainer parameters. 
</param> /// /// <returns> A linear trainer </returns> template <typename LossFunctionType> std::unique_ptr<trainers::ITrainer<predictors::LinearPredictor<double>>> MakeSparseDataSGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters); /// <summary> Makes a SparseDataCenteredSGD linear trainer. </summary> /// /// <typeparam name="LossFunctionType"> Type of loss function to use. </typeparam> /// <param name="lossFunction"> The loss function. </param> /// <param name="center"> The center (mean) of the training set. </param> /// <param name="parameters"> The trainer parameters. </param> /// /// <returns> A linear trainer </returns> template <typename LossFunctionType> std::unique_ptr<trainers::ITrainer<predictors::LinearPredictor<double>>> MakeSparseDataCenteredSGDTrainer(const LossFunctionType& lossFunction, math::RowVector<double> center, const SGDTrainerParameters& parameters); } // namespace trainers } // namespace ell #pragma region implementation #include <cmath> #include <data/include/DataVector.h> #include <data/include/DataVectorOperations.h> #include <data/include/Dataset.h> #include <math/include/VectorOperations.h> namespace ell { namespace trainers { // the code in this file follows the notation and pseudocode in path_to_url // // SGDTrainer // template <typename LossFunctionType> SGDTrainer<LossFunctionType>::SGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters) : SGDTrainerBase(parameters.randomSeedString), _lossFunction(lossFunction), _parameters(parameters) { } template <typename LossFunctionType> void SGDTrainer<LossFunctionType>::DoFirstStep(const data::AutoDataVector& x, double y, double weight) { DoNextStep(x, y, weight); } template <typename LossFunctionType> void SGDTrainer<LossFunctionType>::DoNextStep(const data::AutoDataVector& x, double y, double weight) { ResizeTo(x); ++_t; // Predict double p = _lastPredictor.Predict(x); // calculate the loss derivative double g = weight * 
_lossFunction.GetDerivative(p, y); // get abbreviated names auto& lastW = _lastPredictor.GetWeights(); double& lastB = _lastPredictor.GetBias(); // update the (last) predictor double scaleCoefficient = 1.0 - 1.0 / _t; lastW *= scaleCoefficient; lastB *= scaleCoefficient; const double lambda = _parameters.regularization; double updateCoefficient = -g / (lambda * _t); lastW.Transpose() += updateCoefficient * x; lastB += updateCoefficient; // get abbreviated names auto& averagedW = _averagedPredictor.GetWeights(); double& averagedB = _averagedPredictor.GetBias(); // update the average predictor averagedW *= scaleCoefficient; averagedB *= scaleCoefficient; averagedW += 1.0 / _t * lastW; averagedB += lastB / _t; } template <typename LossFunctionType> void SGDTrainer<LossFunctionType>::ResizeTo(const data::AutoDataVector& x) { auto xSize = x.PrefixLength(); if (xSize > _lastPredictor.Size()) { _lastPredictor.Resize(xSize); _averagedPredictor.Resize(xSize); } } // // SparseDataSGDTrainer // template <typename LossFunctionType> SparseDataSGDTrainer<LossFunctionType>::SparseDataSGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters) : SGDTrainerBase(parameters.randomSeedString), _lossFunction(lossFunction), _parameters(parameters) { } template <typename LossFunctionType> void SparseDataSGDTrainer<LossFunctionType>::DoFirstStep(const data::AutoDataVector& x, double y, double weight) { ResizeTo(x); _t = 1.0; double g = weight * _lossFunction.GetDerivative(0, y); _v.Transpose() += g * x; _a += g; _c = _a; _h = 1.0; } template <typename LossFunctionType> void SparseDataSGDTrainer<LossFunctionType>::DoNextStep(const data::AutoDataVector& x, double y, double weight) { ResizeTo(x); ++_t; // apply the predictor const double lambda = _parameters.regularization; double d = x * _v; double p = -(d + _a) / (lambda * (_t - 1.0)); // get the derivative double g = weight * _lossFunction.GetDerivative(p, y); // update _v.Transpose() += g * x; _a += g; 
_u.Transpose() += _h * g * x; _c += _a / _t; _h += 1.0 / _t; } template <typename LossFunctionType> auto SparseDataSGDTrainer<LossFunctionType>::GetLastPredictor() const -> const PredictorType& { const double lambda = _parameters.regularization; _lastPredictor.Resize(_v.Size()); auto& w = _lastPredictor.GetWeights(); // define last predictor based on _v, _a, _t w.Reset(); w += (-1 / (lambda * _t)) * _v; _lastPredictor.GetBias() = -_a / (lambda * _t); return _lastPredictor; } template <typename LossFunctionType> auto SparseDataSGDTrainer<LossFunctionType>::GetAveragedPredictor() const -> const PredictorType& { const double lambda = _parameters.regularization; _averagedPredictor.Resize(_v.Size()); auto& w = _averagedPredictor.GetWeights(); // define averaged predictor based on _v, _h, _u, _t w.Reset(); w += -_h / (lambda * _t) * _v; w += 1 / (lambda * _t) * _u; _averagedPredictor.GetBias() = -_c / (lambda * _t); return _averagedPredictor; } template <typename LossFunctionType> inline void SparseDataSGDTrainer<LossFunctionType>::ResizeTo(const data::AutoDataVector& x) { auto xSize = x.PrefixLength(); if (xSize > _v.Size()) { _v.Resize(xSize); _u.Resize(xSize); } } // // SparseDataCenteredSGDTrainer // template <typename LossFunctionType> SparseDataCenteredSGDTrainer<LossFunctionType>::SparseDataCenteredSGDTrainer(const LossFunctionType& lossFunction, math::RowVector<double> center, const SGDTrainerParameters& parameters) : SGDTrainerBase(parameters.randomSeedString), _lossFunction(lossFunction), _parameters(parameters), _center(std::move(center)) { _theta = 1 + _center.Norm2Squared(); } template <typename LossFunctionType> void SparseDataCenteredSGDTrainer<LossFunctionType>::DoFirstStep(const data::AutoDataVector& x, double y, double weight) { ResizeTo(x); _t = 1.0; // first, perform the standard SparseDataSGD step double g = weight * _lossFunction.GetDerivative(0, y); _v.Transpose() += g * x; _a += g; _c = _a; _h = 1.0; // next, perform the special steps needed for 
centering double q = x * _center.Transpose(); _z = g * q; _r = _a * _theta - _z; _s = _r; } template <typename LossFunctionType> void SparseDataCenteredSGDTrainer<LossFunctionType>::DoNextStep(const data::AutoDataVector& x, double y, double weight) { ResizeTo(x); ++_t; // apply the predictor const double lambda = _parameters.regularization; double d = x * _v; double q = x * _center.Transpose(); double p = -(d + _r - _a * q) / (lambda * (_t - 1.0)); // get the derivative double g = weight * _lossFunction.GetDerivative(p, y); // apply the SparseDataSGD update _v.Transpose() += g * x; _a += g; _u.Transpose() += _h * g * x; _c += _a / _t; _h += 1.0 / _t; // next, perform the special steps needed for centering _z += g * q; _r = _a * _theta - _z; _s += _r / _t; } template <typename LossFunctionType> auto SparseDataCenteredSGDTrainer<LossFunctionType>::GetLastPredictor() const -> const PredictorType& { const double lambda = _parameters.regularization; _lastPredictor.Resize(_v.Size()); auto& w = _lastPredictor.GetWeights(); w += (-1 / (lambda * _t)) * _v; _lastPredictor.GetBias() = -_a / (lambda * _t); return _lastPredictor; } template <typename LossFunctionType> auto SparseDataCenteredSGDTrainer<LossFunctionType>::GetAveragedPredictor() const -> const PredictorType& { const double lambda = _parameters.regularization; const double coeff = 1.0 / (lambda * _t); _averagedPredictor.Resize(_v.Size()); auto& w = _averagedPredictor.GetWeights(); // define last predictor based on _v, _u, _c w.Reset(); w += -_h * coeff * _v; w += coeff * _u; w += _c * coeff * _center.Transpose(); _averagedPredictor.GetBias() = -_s * coeff; return _averagedPredictor; } template <typename LossFunctionType> inline void SparseDataCenteredSGDTrainer<LossFunctionType>::ResizeTo(const data::AutoDataVector& x) { auto xSize = x.PrefixLength(); if (xSize > _v.Size()) { _v.Resize(xSize); _u.Resize(xSize); } } // // Helper functions // template <typename LossFunctionType> 
std::unique_ptr<ITrainer<predictors::LinearPredictor<double>>> MakeSGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters) { return std::make_unique<SGDTrainer<LossFunctionType>>(lossFunction, parameters); } template <typename LossFunctionType> std::unique_ptr<ITrainer<predictors::LinearPredictor<double>>> MakeSparseDataSGDTrainer(const LossFunctionType& lossFunction, const SGDTrainerParameters& parameters) { return std::make_unique<SparseDataSGDTrainer<LossFunctionType>>(lossFunction, parameters); } template <typename LossFunctionType> std::unique_ptr<ITrainer<predictors::LinearPredictor<double>>> MakeSparseDataCenteredSGDTrainer(const LossFunctionType& lossFunction, math::RowVector<double> center, const SGDTrainerParameters& parameters) { return std::make_unique<SparseDataCenteredSGDTrainer<LossFunctionType>>(lossFunction, std::move(center), parameters); } } // namespace trainers } // namespace ell #pragma endregion implementation ```
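The update in `SGDTrainer<...>::DoNextStep` above keeps both the last predictor and a running average, rescaling the average by `scaleCoefficient = 1 - 1/t` on every step. As a quick sanity check on why that recurrence yields the "averaged" predictor, here is a standalone sketch (plain Python, not ELL code; the iterate values are made up) showing that the recurrence reproduces the plain arithmetic mean of the iterates:

```python
# Incremental averaging as used in SGDTrainer<...>::DoNextStep:
#   averaged_t = (1 - 1/t) * averaged_{t-1} + (1/t) * last_t
# which is exactly the running arithmetic mean of the iterates.

iterates = [3.0, 5.0, 1.0, 7.0]  # hypothetical predictor values w_1..w_4

averaged = 0.0
for t, w in enumerate(iterates, start=1):
    scale_coefficient = 1.0 - 1.0 / t  # mirrors scaleCoefficient in the C++ code
    averaged = scale_coefficient * averaged + w / t

assert abs(averaged - sum(iterates) / len(iterates)) < 1e-12
```

The same identity is what lets `SparseDataSGDTrainer` defer the averaging work: it accumulates the harmonic-weighted sums `_u`, `_h`, and `_c` during training and only materializes the averaged predictor lazily in `GetAveragedPredictor()`.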
Slovak nationalism is an ethnic nationalist ideology that asserts that the Slovaks are a nation and promotes the cultural unity of the Slovaks.

History

Modern Slovak nationalism first arose in the 19th century in response to the Magyarization of Slovak-inhabited territories in the Kingdom of Hungary. It was based on two main ideas: a historical state right grounded in continuity with the early medieval Great Moravian Empire, and an identity associated with the Slavs.

Ethnic nationalism and civic nationalism

During the century-long period spanning from Slovakia's semi-independence as part of democratic Czechoslovakia in 1918 to the liberal democratic independent republic of the early 2020s, Slovak nationalism has gradually evolved into several distinct ideological strands. One is the continued ethnic nationalism, focused mainly on the Slovak ethnic majority and on Slovakia as a primarily Slovak nation state. This nationalism occurs in both moderate and radical forms. The other major strand is civic nationalism, which takes a more patriotic perspective, viewing Slovakia as the homeland of both the majority ethnicity (Slovaks) and all of Slovakia's ethnic, religious and other minorities. Though the more traditional ethnic nationalism is still influential in Slovakia, modern civic nationalism has gradually grown in significance and forms the dominant view in today's Slovakia.

Due to Slovakia's long historical existence without its own separate and sovereign national government, the various expressions of Slovak nationalism have continued to develop in complex ways. The civic nationalist view of the country developed and grew in significance especially during the democratic eras of the country's history, particularly after Slovakia's full independence in 1993.
With the country's democratic independence issue resolved, the focus of both ethnic nationalists and civic nationalists has shifted away from the original aims of Slovak nationalism as established in the 19th century (i.e. the achievement of political and cultural representation and self-governance). Slovakia's transition from a repressed society with a Soviet-imposed totalitarian government and planned economy (1948–1989) back to a liberal-democratic society with a mixed-market economy and membership in the EU (after 1989 and 1993) has also influenced the nature of Slovak nationalism.

Despite certain domestic tensions in the 1990s, a civic nationalist view of domestic minorities ultimately became the dominant perspective after the 1990s. This coincided with a lasting improvement of relations with some neighbouring countries (Hungary, Ukraine) by the early 2000s, as well as with a general trend of civic nationalist views being very supportive of an active, democratically minded and transparency-focused civic society in Slovakia during the early 21st century. Support for civic society has also been voiced among more moderate ethnic nationalists, though unambiguous support for a modern civic society remains more contested among contemporary ethnic nationalists.

Radical civic nationalism is almost non-existent in contemporary Slovakia, whereas radical ethnic nationalism is represented by a fairly vocal minority in Slovak politics and certain parts of Slovak society. A populist political approach occurs among both ethnic nationalist and civic nationalist political parties and other groups.
Slovak nationalist parties Slovak National Party (1871–1938) Slovak People's Party (1913–1945) Juriga's Slovak People's Party (1925–1938) Slovak National Party (1989–present) People's Party – Movement for a Democratic Slovakia (1991–2014) Direction – Social Democracy (1999–present) True Slovak National Party (2001–2005) Conservative Democrats of Slovakia (2008–2014) Kotleba – People's Party Our Slovakia (2010–present) We Are Family (2015–present) See also Nationalism
```c
/*
 *
 */

// The LL layer for ESP32-C5 LP_AON register operations

#pragma once

#include <stdlib.h>
#include "soc/soc.h"
#include "soc/lp_aon_struct.h"
#include "hal/misc.h"
#include "esp32c5/rom/rtc.h"

#ifdef __cplusplus
extern "C" {
#endif

/**
 * @brief Get ext1 wakeup source status
 * @return The lower 8 bits of the returned value are the bitmap of
 *         the wakeup source status, bit 0~7 corresponds to LP_IO 0~7
 */
static inline uint32_t lp_aon_ll_ext1_get_wakeup_status(void)
{
    return HAL_FORCE_READ_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_status);
}

/**
 * @brief Clear the ext1 wakeup source status
 */
static inline void lp_aon_ll_ext1_clear_wakeup_status(void)
{
    HAL_FORCE_MODIFY_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_status_clr, 1);
}

/**
 * @brief Set the wake-up LP_IOs of the ext1 wake-up source
 * @param io_mask wakeup LP_IO bitmap, bit 0~7 corresponds to LP_IO 0~7
 * @param level_mask LP_IO wakeup level bitmap, bit 0~7 corresponds to LP_IO 0~7 wakeup level.
 *        If a bit is set to 0, the corresponding wakeup level will be low;
 *        if a bit is set to 1, the corresponding wakeup level will be high
 */
static inline void lp_aon_ll_ext1_set_wakeup_pins(uint32_t io_mask, uint32_t level_mask)
{
    uint32_t wakeup_sel_mask = HAL_FORCE_READ_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_sel);
    wakeup_sel_mask |= io_mask;
    HAL_FORCE_MODIFY_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_sel, wakeup_sel_mask);

    uint32_t wakeup_level_mask = HAL_FORCE_READ_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_lv);
    wakeup_level_mask |= io_mask & level_mask;
    wakeup_level_mask &= ~(io_mask & ~level_mask);
    HAL_FORCE_MODIFY_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_lv, wakeup_level_mask);
}

/**
 * @brief Clear all ext1 wakeup-source settings
 */
static inline void lp_aon_ll_ext1_clear_wakeup_pins(void)
{
    HAL_FORCE_MODIFY_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_sel, 0);
}

/**
 * @brief Get ext1 wakeup source setting
 * @return The lower 8 bits of the returned value are the bitmap of
 *         the wakeup source status, bit 0~7 corresponds to LP_IO 0~7
 */
static inline uint32_t lp_aon_ll_ext1_get_wakeup_pins(void)
{
    return HAL_FORCE_READ_U32_REG_FIELD(LP_AON.ext_wakeup_cntl, ext_wakeup_sel);
}

/**
 * @brief ROM obtains the wake-up type through LP_AON_STORE9_REG[0].
 *        Set the flag to inform the ROM of the sleep type.
 * @param dslp true: deep sleep, false: light sleep
 */
static inline void lp_aon_ll_inform_wakeup_type(bool dslp)
{
    if (dslp) {
        REG_SET_BIT(RTC_SLEEP_MODE_REG, BIT(0));    /* Tell rom to run deep sleep wake stub */
    } else {
        REG_CLR_BIT(RTC_SLEEP_MODE_REG, BIT(0));    /* Tell rom to run light sleep wake stub */
    }
}

/**
 * @brief Get the flag that marks whether LP CPU is awakened by ETM
 *
 * @return Return true if lpcore is woken up by soc_etm
 */
static inline bool lp_aon_ll_get_lpcore_etm_wakeup_flag(void)
{
    return REG_GET_BIT(LP_AON_LPCORE_REG, LP_AON_LPCORE_ETM_WAKEUP_FLAG);
}

/**
 * @brief Clear the flag that marks whether LP CPU is awakened by soc_etm
 */
static inline void lp_aon_ll_clear_lpcore_etm_wakeup_flag(void)
{
    REG_SET_BIT(LP_AON_LPCORE_REG, LP_AON_LPCORE_ETM_WAKEUP_FLAG_CLR);
}

#ifdef __cplusplus
}
#endif
```
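The read-modify-write sequence in `lp_aon_ll_ext1_set_wakeup_pins` touches only the bits selected by `io_mask`: it sets the level bits where `level_mask` is 1 and clears them where it is 0, while the settings of unselected pins survive. A small standalone model of that mask arithmetic (plain Python, not ESP-IDF code):

```python
# Standalone model of the bit manipulation in lp_aon_ll_ext1_set_wakeup_pins
# (plain Python, not ESP-IDF code): only the bits selected by io_mask change.

def set_levels(current_lv: int, io_mask: int, level_mask: int) -> int:
    current_lv |= io_mask & level_mask      # selected pins that wake on high -> set bit
    current_lv &= ~(io_mask & ~level_mask)  # selected pins that wake on low  -> clear bit
    return current_lv & 0xFF                # only LP_IO 0~7 exist

# LP_IO 0 and 2 selected; IO0 wakes on high, IO2 on low.
# IO5's previous setting (bit 5) is untouched.
assert set_levels(0b0010_0100, io_mask=0b0000_0101, level_mask=0b0000_0001) == 0b0010_0001
```

The two-step form avoids clearing level bits of pins that are not being configured in this call, which is why the C code reads the register field back before modifying it.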
Bigger than Us may refer to:

Bigger than Us (album), a 2001 album by Aurora
"Bigger than Us" (Michael Rice song), a 2019 song by Michael Rice representing the UK in the Eurovision Song Contest 2019
"Bigger than Us" (White Lies song), a 2011 song by White Lies
"Bigger than Us", a song by Miley Cyrus from her album Hannah Montana 2: Meet Miley Cyrus
"Bigger than Us", a song by Josh Groban on the 2018 album Bridges
Bigger Than Us (documentary), a 2021 documentary film directed by Flore Vasseur
Ryan Adam Rene Jean Spilborghs (born September 5, 1979) is an American baseball broadcaster for AT&T SportsNet Rocky Mountain and SiriusXM's MLB Network Radio, and a former professional baseball outfielder. He played in Major League Baseball (MLB) for the Colorado Rockies and in Nippon Professional Baseball (NPB) for the Saitama Seibu Lions.

Baseball career

College

Spilborghs played college ball at the University of California, Santa Barbara, where he was all-Big West Conference in 2001. He also played for the Madison Mallards during the summer of 2001.

Colorado Rockies

Spilborghs was drafted by the Colorado Rockies in the 7th round of the 2002 MLB Draft. Between 2002 and 2005, he played for the Tri-City Dust Devils, Asheville Tourists, Visalia Oaks, and Tulsa Drillers.

He made his Major League debut for the Rockies on July 16, 2005, against the Cincinnati Reds and recorded his first hit in that game, a single to right field off Todd Coffey. That was the only game he played for the Rockies that year; he spent the rest of the year in AAA with the Colorado Springs Sky Sox, where he hit .338 in 68 games. He rejoined the Rockies in 2006 and hit his first home run on May 29 off Jim Brower of the San Diego Padres.

He began the 2007 season at Triple-A with the Sky Sox, after being beaten out for a roster spot by veteran Steve Finley. After Finley was released by the Rockies, Spilborghs returned to the team. In 2008, he made the team out of spring training, serving as the fourth outfielder.

On August 24, 2009, in the 14th inning against the San Francisco Giants with the Rockies down 4–2, Spilborghs homered off pitcher Merkin Valdéz for the first walk-off grand slam in Rockies history. He held this Rockies record alone for over 11 years, until Charlie Blackmon, also wearing jersey number 19, hit a walk-off grand slam against the Los Angeles Angels on September 11, 2020.

On December 12, 2011, Spilborghs was non-tendered by the Rockies and became a free agent.
Cleveland Indians

Spilborghs signed a minor league contract with the Cleveland Indians on January 20, 2012. He also received an invitation to spring training. He failed to make the team and was assigned to the AAA Columbus Clippers, where he hit .250 in 21 games.

Texas Rangers

On May 4, 2012, Spilborghs was traded to the Texas Rangers organization for cash considerations and played for the Triple-A Round Rock Express. In 103 games with Round Rock, he hit .295.

Saitama Seibu Lions

On December 6, 2012, he agreed to a one-year contract with the Saitama Seibu Lions of the Japanese Pacific League.

Broadcasting

On February 6, 2014, it was announced that Spilborghs had joined the Root Sports Rocky Mountain broadcasting team, where his primary role is sideline reporting during games; however, he occasionally provides in-booth color commentary. He is an analyst for Rockies pregame and postgame shows, as well as for other programs on the network. Spilborghs currently co-hosts (with CJ Nitkowski) the "Loud Outs" program that airs Saturdays on SiriusXM's MLB Network Radio. In February 2023, Spilborghs announced he was joining Apple TV's Friday night baseball coverage.

Personal life

Spilborghs is of mixed descent: his father is Belgian and his mother was Guatemalan. On July 20, 2009, Ryan's wife gave birth to their daughter and first child, Kierra. Her middle name, Esperanza, was chosen after his mother, who died during spring training earlier in the year.
References External links 1979 births Living people American expatriate baseball players in Japan American people of Belgian descent American people of Guatemalan descent Asheville Tourists players Baseball players from Santa Barbara, California Colorado Rockies announcers Colorado Rockies players Colorado Springs Sky Sox players Columbus Clippers players Madison Mallards players Major League Baseball broadcasters Major League Baseball outfielders Round Rock Express players Saitama Seibu Lions players Sportspeople of Guatemalan descent Tri-City Dust Devils players Tulsa Drillers players UC Santa Barbara Gauchos baseball players Visalia Oaks players
Robert Vaughan may refer to: Politicians Robert Vaughan (MP for Grampound) (fl. 1554-55), English Member of Parliament for Grampound Robert Vaughan (MP for New Radnor) (fl. 1524-75 or later), represented New Radnor (UK Parliament constituency) Sir Robert Vaughan, 2nd Baronet (1768–1843), British Member of Parliament for Merioneth Others Bob Vaughan (born 1945), British mathematician Robert Charles Vaughan (1883–1966), Canadian railway executive Robert Vaughan (antiquary) (died 1667), Welsh antiquary and manuscript collector Robert Vaughan (author) (born 1930s), American author Robert Vaughan (cricketer) (1834-1865), Australian cricketer Robert Vaughan (minister) (1795–1868), English minister of the Congregationalist communion Robert Alfred Vaughan (1823–1857), English Congregationalist minister and author, son of Robert Vaughan (1795–1868) Robert Charles Vaughan (businessman), former Canadian National Railways executive Robert E. Vaughan (1888–1969), American head football coach for the Wabash College Little Giants Robert Vaughn (Montana rancher) (Robert Vaughan, 1836–1918), Welsh-American Montanan rancher and pioneer See also Robert Vaughan Gorle (1896–1937), British Army officer Robert Vaughn (disambiguation) Robert Charles Vaughan (disambiguation)
Gate Dancer (1981–1998) was an American Thoroughbred racehorse best known as a winner of an American Classic Race, the Preakness Stakes, and for his part in a three-horse finish in the inaugural running of the Breeders' Cup Classic.

Bred in Florida by William R. Davis, Gate Dancer was a son of Sovereign Dancer, in turn a son of the great Northern Dancer. He was out of the mare Sun Gate, whose sire was Bull Lea, a five-time Leading sire in North America. He was owned by Ken Opstein and trained by Jack Van Berg. On the racetrack, the high-strung colt became distressed by the sounds of the crowd until his trainer devised a hood for his head with earmuffs that minimized the noise.

Racing career

Early career

In June 1983, Gate Dancer won his two-year-old racing debut at Ak-Sar-Ben Racetrack in Omaha, Nebraska. He raced once more in Omaha, then competed twice in California. Of his four starts that year, he ended up with two wins and two seconds.

In his three-year-old season, Gate Dancer was aimed toward the Kentucky Derby. Staying in California, in February 1984 he ran second in the El Camino Real Derby at Bay Meadows and two weeks later won an allowance race at Santa Anita Park. In March, he finished second in the San Felipe and third in the Santa Catalina Stakes. Gate Dancer then had another third in April's Arkansas Derby behind Althea, whose winning time equaled the Oaklawn Park track record for 1⅛ miles. Althea, the 1983 U.S. Champion 2-Year-Old Filly, became the heavy betting favorite going into the Kentucky Derby.

1984 U.S. Triple Crown

Ridden by Eddie Delahoussaye in the Kentucky Derby, Gate Dancer was sent off at longshot odds of 19:1. Starting at the far outside in post position twenty, he immediately ran into difficulty but by the mile pole had moved up to ninth place, with race favorite Althea tiring badly and dropping out of contention. Gate Dancer tried in vain to catch the leaders down the homestretch but veered in, bumping another horse several times.
He finished fourth behind Claiborne Farm's winning colt, Swale. Following an interference complaint over the bumping incident, Churchill Downs stewards set Gate Dancer back to fifth place. For the Preakness Stakes, jockey Ángel Cordero Jr. was aboard Gate Dancer. This time, the colt had a clean start in the much smaller field of nine and won the second leg of the Triple Crown series. Swale, the heavily favored Derby winner, finished well back in seventh place. However, in the grueling 1½ mile Belmont Stakes, Gate Dancer moved into contention as they headed into the homestretch but after making a charge at the front-running Swale, he tired and dropped back to finish sixth. Breeders' Cup Classic Following his loss in the Belmont Stakes, Gate Dancer's handlers brought him back to his first home at Ak-Sar-Ben Racetrack in Nebraska, where in August he won the 1984 Omaha Gold Cup. In his next start in September, he set a new track record for a mile and a quarter in winning the 1984 Super Derby at Louisiana Downs. In November, he was shipped to Hollywood Park Racetrack for the inaugural running of the Breeders' Cup Classic. Gate Dancer, Wild Again, and Slew o' Gold battled head to head through the stretch and all the way to the finish line. Wild Again was on the rail, with Gate Dancer on the outside and Slew o' Gold in close quarters between his rivals. The stretch run contained bumping, with Wild Again coming out on top and Gate Dancer crossing the wire second. However, Gate Dancer was disqualified for interfering with Slew o' Gold and moved down to third place, while Slew o' Gold officially finished second. At the end of 1984, Jack Van Berg was voted the Eclipse Award for Outstanding Trainer. 1985 and 1986 In 1985, Gate Dancer started ten times, winning once and capturing second and third place on three occasions each. Back for his second attempt in the Breeders' Cup Classic, he finished second to the Darby Dan Farm colt Proud Truth. 
Raced three times in 1986, Gate Dancer earned a fifth-place finish in the Santa Anita Handicap and a third in the Widener. In the final race of his career, he came in second in the Oaklawn Handicap.

Career as a sire

Retired to Florida, Gate Dancer initially stood at stud at owner Kenneth Opstein's Good Chance Farm near Ocala in south Marion County, Florida, and eventually ended up at Silverleaf Farm near Summerfield. Although he sired 27 stakes winners, none achieved the same level of racing success as Gate Dancer.

On March 6, 1998, after a long struggle with laminitis, he was humanely euthanized. He was buried at Johnson Hollow Farm, near Oxford, Florida. Nebraska artist Gwen G. Sides painted a portrait of Gate Dancer.

Breeding

References

Gate Dancer's offspring at the Triple Crown database by Kathleen Irwin and Joy Reeves
Video: Winning 1984 Ak-Sar-Ben Omaha Gold Cup

1981 racehorse births
Racehorses bred in Florida
Racehorses trained in the United States
Preakness Stakes winners
American Grade 1 Stakes winners
Thoroughbred family 3-l
```csharp
using System.Threading.Tasks;
using FluentAssertions;
using Xunit;

namespace CSharpFunctionalExtensions.Tests.ResultTests.Extensions
{
    public class CheckIfTests_Task : CheckIfTestsBase
    {
        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_executes_func_result_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool> result = Result.SuccessIf(isSuccess, condition, ErrorMessage);

            var returned = await result.AsTask().CheckIf(condition, Task_Func_Result);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_executes_func_result_K_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool> result = Result.SuccessIf(isSuccess, condition, ErrorMessage);

            var returned = await result.AsTask().CheckIf(condition, Task_Func_Result_K);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_executes_func_result_K_E_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool, E> result = Result.SuccessIf(isSuccess, condition, E.Value);

            var returned = await result.AsTask().CheckIf(condition, Task_Func_Result_K_E);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_executes_func_unitresult_E_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool, E> result = Result.SuccessIf(isSuccess, condition, E.Value);

            var returned = await result.AsTask().CheckIf(condition, Task_Func_UnitResult_E);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_executes_unitresult_E_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            UnitResult<E> result = UnitResult.SuccessIf(isSuccess, E.Value);

            var returned = await result.AsTask().CheckIf(condition, Task_Func_UnitResult_E);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_with_predicate_executes_func_result_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool> result = Result.SuccessIf(isSuccess, condition, ErrorMessage);

            var returned = await result.AsTask().CheckIf(Predicate, Task_Func_Result);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_with_predicate_executes_func_result_K_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool> result = Result.SuccessIf(isSuccess, condition, ErrorMessage);

            var returned = await result.AsTask().CheckIf(Predicate, Task_Func_Result_K);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_with_predicate_executes_func_result_K_E_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool, E> result = Result.SuccessIf(isSuccess, condition, E.Value);

            var returned = await result.AsTask().CheckIf(Predicate, Task_Func_Result_K_E);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_with_predicate_executes_func_unitresult_E_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            Result<bool, E> result = Result.SuccessIf(isSuccess, condition, E.Value);

            var returned = await result.AsTask().CheckIf(Predicate, Task_Func_UnitResult_E);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }

        [Theory]
        [InlineData(true, true)]
        [InlineData(true, false)]
        [InlineData(false, true)]
        [InlineData(false, false)]
        public async Task CheckIf_Task_with_predicate_executes_unitresult_E_conditionally_and_returns_self(bool isSuccess, bool condition)
        {
            UnitResult<E> result = UnitResult.SuccessIf(isSuccess, E.Value);

            var returned = await result.AsTask().CheckIf(Predicate(condition), Task_Func_UnitResult_E);

            actionExecuted.Should().Be(isSuccess && condition);
            result.Should().Be(returned);
        }
    }
}
```
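Each of the theories above checks the same two-part contract: the callback passed to `CheckIf` runs only when the result is a success *and* the condition (or predicate) holds, and the original result is always returned unchanged. A minimal Python sketch of that contract — a simplified illustration only, not the actual CSharpFunctionalExtensions API:

```python
# Simplified model (illustration only, not the real C# library) of the
# CheckIf contract asserted by the tests above.

class Result:
    """A success/failure container with a CheckIf-style combinator."""

    def __init__(self, is_success, value=None, error=None):
        self.is_success = is_success
        self.value = value
        self.error = error

    def check_if(self, condition, func):
        # The side effect runs only on success AND when the condition holds;
        # the original result is returned either way (result == returned).
        if self.is_success and condition:
            func(self.value)
        return self

executed = []
ok = Result(True, value=42)
failed = Result(False, error="boom")

assert ok.check_if(True, executed.append) is ok          # func runs
assert ok.check_if(False, executed.append) is ok         # condition false: skipped
assert failed.check_if(True, executed.append) is failed  # failure: skipped
assert executed == [42]
```

The predicate overloads are the same idea with `condition` replaced by `predicate(self.value)`.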
```pod
=pod

=head1 NAME

DSA_meth_new, DSA_meth_free, DSA_meth_dup, DSA_meth_get0_name,
DSA_meth_set1_name, DSA_meth_get_flags, DSA_meth_set_flags,
DSA_meth_get0_app_data, DSA_meth_set0_app_data, DSA_meth_get_sign,
DSA_meth_set_sign, DSA_meth_get_sign_setup, DSA_meth_set_sign_setup,
DSA_meth_get_verify, DSA_meth_set_verify, DSA_meth_get_mod_exp,
DSA_meth_set_mod_exp, DSA_meth_get_bn_mod_exp, DSA_meth_set_bn_mod_exp,
DSA_meth_get_init, DSA_meth_set_init, DSA_meth_get_finish,
DSA_meth_set_finish, DSA_meth_get_paramgen, DSA_meth_set_paramgen,
DSA_meth_get_keygen, DSA_meth_set_keygen - Routines to build up DSA methods

=head1 SYNOPSIS

 #include <openssl/dsa.h>

 DSA_METHOD *DSA_meth_new(const char *name, int flags);

 void DSA_meth_free(DSA_METHOD *dsam);

 DSA_METHOD *DSA_meth_dup(const DSA_METHOD *meth);

 const char *DSA_meth_get0_name(const DSA_METHOD *dsam);
 int DSA_meth_set1_name(DSA_METHOD *dsam, const char *name);

 int DSA_meth_get_flags(const DSA_METHOD *dsam);
 int DSA_meth_set_flags(DSA_METHOD *dsam, int flags);

 void *DSA_meth_get0_app_data(const DSA_METHOD *dsam);
 int DSA_meth_set0_app_data(DSA_METHOD *dsam, void *app_data);

 DSA_SIG *(*DSA_meth_get_sign(const DSA_METHOD *dsam))
         (const unsigned char *, int, DSA *);
 int DSA_meth_set_sign(DSA_METHOD *dsam,
                       DSA_SIG *(*sign) (const unsigned char *, int, DSA *));

 int (*DSA_meth_get_sign_setup(const DSA_METHOD *dsam))
         (DSA *, BN_CTX *, BIGNUM **, BIGNUM **);
 int DSA_meth_set_sign_setup(DSA_METHOD *dsam,
                             int (*sign_setup) (DSA *, BN_CTX *,
                                                BIGNUM **, BIGNUM **));

 int (*DSA_meth_get_verify(const DSA_METHOD *dsam))
         (const unsigned char *, int, DSA_SIG *, DSA *);
 int DSA_meth_set_verify(DSA_METHOD *dsam,
                         int (*verify) (const unsigned char *, int,
                                        DSA_SIG *, DSA *));

 int (*DSA_meth_get_mod_exp(const DSA_METHOD *dsam))
         (DSA *dsa, BIGNUM *rr, BIGNUM *a1, BIGNUM *p1, BIGNUM *a2,
          BIGNUM *p2, BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *in_mont);
 int DSA_meth_set_mod_exp(DSA_METHOD *dsam,
                          int (*mod_exp) (DSA *dsa, BIGNUM *rr, BIGNUM *a1,
                                          BIGNUM *p1, BIGNUM *a2, BIGNUM *p2,
                                          BIGNUM *m, BN_CTX *ctx,
                                          BN_MONT_CTX *mont));

 int (*DSA_meth_get_bn_mod_exp(const DSA_METHOD *dsam))
         (DSA *dsa, BIGNUM *r, BIGNUM *a, const BIGNUM *p, const BIGNUM *m,
          BN_CTX *ctx, BN_MONT_CTX *mont);
 int DSA_meth_set_bn_mod_exp(DSA_METHOD *dsam,
                             int (*bn_mod_exp) (DSA *dsa, BIGNUM *r, BIGNUM *a,
                                                const BIGNUM *p, const BIGNUM *m,
                                                BN_CTX *ctx, BN_MONT_CTX *mont));

 int (*DSA_meth_get_init(const DSA_METHOD *dsam))(DSA *);
 int DSA_meth_set_init(DSA_METHOD *dsam, int (*init)(DSA *));

 int (*DSA_meth_get_finish(const DSA_METHOD *dsam)) (DSA *);
 int DSA_meth_set_finish(DSA_METHOD *dsam, int (*finish) (DSA *));

 int (*DSA_meth_get_paramgen(const DSA_METHOD *dsam))
         (DSA *, int, const unsigned char *, int, int *, unsigned long *,
          BN_GENCB *);
 int DSA_meth_set_paramgen(DSA_METHOD *dsam,
                           int (*paramgen) (DSA *, int, const unsigned char *,
                                            int, int *, unsigned long *,
                                            BN_GENCB *));

 int (*DSA_meth_get_keygen(const DSA_METHOD *dsam)) (DSA *);
 int DSA_meth_set_keygen(DSA_METHOD *dsam, int (*keygen) (DSA *));

=head1 DESCRIPTION

The B<DSA_METHOD> type is a structure used for the provision of custom DSA
implementations. It provides a set of functions used by OpenSSL for the
implementation of the various DSA capabilities. See the L<dsa(3)> page for more
information.

DSA_meth_new() creates a new B<DSA_METHOD> structure. It should be given a
unique B<name> and a set of B<flags>. The B<name> should be a NULL terminated
string, which will be duplicated and stored in the B<DSA_METHOD> object. It is
the caller's responsibility to free the original string. The flags will be used
during the construction of a new B<DSA> object based on this B<DSA_METHOD>. Any
new B<DSA> object will have those flags set by default.

DSA_meth_dup() creates a duplicate copy of the B<DSA_METHOD> object passed as a
parameter. This might be useful for creating a new B<DSA_METHOD> based on an
existing one, but with some differences.
DSA_meth_free() destroys a B<DSA_METHOD> structure and frees up any memory
associated with it.

DSA_meth_get0_name() will return a pointer to the name of this DSA_METHOD. This
is a pointer to the internal name string and so should not be freed by the
caller. DSA_meth_set1_name() sets the name of the DSA_METHOD to B<name>. The
string is duplicated and the copy is stored in the DSA_METHOD structure, so the
caller remains responsible for freeing the memory associated with the name.

DSA_meth_get_flags() returns the current value of the flags associated with
this DSA_METHOD. DSA_meth_set_flags() provides the ability to set these flags.

The functions DSA_meth_get0_app_data() and DSA_meth_set0_app_data() provide the
ability to associate implementation specific data with the DSA_METHOD. It is
the application's responsibility to free this data before the DSA_METHOD is
freed via a call to DSA_meth_free().

DSA_meth_get_sign() and DSA_meth_set_sign() get and set the function used for
creating a DSA signature respectively. This function will be called in response
to the application calling DSA_do_sign() (or DSA_sign()). The parameters for
the function have the same meaning as for DSA_do_sign().

DSA_meth_get_sign_setup() and DSA_meth_set_sign_setup() get and set the
function used for precalculating the DSA signature values B<k^-1> and B<r>.
This function will be called in response to the application calling
DSA_sign_setup(). The parameters for the function have the same meaning as for
DSA_sign_setup().

DSA_meth_get_verify() and DSA_meth_set_verify() get and set the function used
for verifying a DSA signature respectively. This function will be called in
response to the application calling DSA_do_verify() (or DSA_verify()). The
parameters for the function have the same meaning as for DSA_do_verify().
DSA_meth_get_mod_exp() and DSA_meth_set_mod_exp() get and set the function used
for computing the following value:

 rr = a1^p1 * a2^p2 mod m

This function will be called by the default OpenSSL method during verification
of a DSA signature. The result is stored in the B<rr> parameter. This function
may be NULL.

DSA_meth_get_bn_mod_exp() and DSA_meth_set_bn_mod_exp() get and set the
function used for computing the following value:

 r = a ^ p mod m

This function will be called by the default OpenSSL function for
DSA_sign_setup(). The result is stored in the B<r> parameter. This function may
be NULL.

DSA_meth_get_init() and DSA_meth_set_init() get and set the function used for
creating a new DSA instance respectively. This function will be called in
response to the application calling DSA_new() (if the current default
DSA_METHOD is this one) or DSA_new_method(). The DSA_new() and DSA_new_method()
functions will allocate the memory for the new DSA object, and a pointer to
this newly allocated structure will be passed as a parameter to the function.
This function may be NULL.

DSA_meth_get_finish() and DSA_meth_set_finish() get and set the function used
for destroying an instance of a DSA object respectively. This function will be
called in response to the application calling DSA_free(). A pointer to the DSA
to be destroyed is passed as a parameter. The destroy function should be used
for DSA implementation specific clean up. The memory for the DSA itself should
not be freed by this function. This function may be NULL.

DSA_meth_get_paramgen() and DSA_meth_set_paramgen() get and set the function
used for generating DSA parameters respectively. This function will be called
in response to the application calling DSA_generate_parameters_ex() (or
DSA_generate_parameters()). The parameters for the function have the same
meaning as for DSA_generate_parameters_ex().
DSA_meth_get_keygen() and DSA_meth_set_keygen() get and set the function used
for generating a new DSA key pair respectively. This function will be called in
response to the application calling DSA_generate_key(). The parameter for the
function has the same meaning as for DSA_generate_key().

=head1 RETURN VALUES

DSA_meth_new() and DSA_meth_dup() return the newly allocated DSA_METHOD object
or NULL on failure.

DSA_meth_get0_name() and DSA_meth_get_flags() return the name and flags
associated with the DSA_METHOD respectively.

All other DSA_meth_get_*() functions return the appropriate function pointer
that has been set in the DSA_METHOD, or NULL if no such pointer has yet been
set.

DSA_meth_set1_name() and all DSA_meth_set_*() functions return 1 on success or
0 on failure.

=head1 SEE ALSO

L<dsa(3)>, L<DSA_new(3)>, L<DSA_generate_parameters(3)>,
L<DSA_generate_key(3)>, L<DSA_dup_DH(3)>, L<DSA_do_sign(3)>,
L<DSA_set_method(3)>, L<DSA_SIG_new(3)>, L<DSA_sign(3)>, L<DSA_size(3)>,
L<DSA_get0_pqg(3)>

=head1 HISTORY

The functions described here were added in OpenSSL 1.1.0.

=head1 COPYRIGHT

You can obtain a copy in the file LICENSE in the source distribution or at
L<path_to_url>.

=cut
```
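The two helper computations described in the man page (`rr = a1^p1 * a2^p2 mod m` for verification, and `r = a^p mod m` for `DSA_sign_setup()`) are ordinary modular exponentiations. A short Python sketch using the built-in three-argument `pow()` makes the formulas concrete; this illustrates the math only, not the OpenSSL implementation:

```python
# Illustration of the two modular-exponentiation hooks from the DSA_METHOD
# man page, using Python's built-in modular pow(). Not OpenSSL code.

def dsa_mod_exp(a1, p1, a2, p2, m):
    # rr = a1^p1 * a2^p2 mod m  (used during DSA signature verification)
    return (pow(a1, p1, m) * pow(a2, p2, m)) % m

def dsa_bn_mod_exp(a, p, m):
    # r = a^p mod m  (used by the default DSA_sign_setup())
    return pow(a, p, m)

# Cross-check against the naive (non-modular) computation.
assert dsa_mod_exp(2, 10, 3, 4, 1000) == (2**10 * 3**4) % 1000
assert dsa_bn_mod_exp(5, 3, 7) == (5**3) % 7
```

A custom `mod_exp` callback exists precisely so an engine can substitute a faster (e.g. Montgomery-based) routine for this computation while preserving the same result.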
```yaml
# Note: On most ST development boards, external clock "HSE 8MHz" is provided thanks to ST-Link
# via its MCO line. On some boards, the ST-Link MCO solder bridge is not set out of the box.
# To reflect this constraint on such boards, a specific fixture "mco_sb_closed" is provided.
# To run HSE tests on these boards:
# - add the solder bridge
# - add the fixture in map file
common:
  timeout: 5
  tags:
    - clock_control
tests:
  drivers.clock.stm32_clock_configuration.common_core.l4_l5.sysclksrc_pll_48_msi_4:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/pll_48_msi_4.overlay"
    platform_allow:
      - disco_l475_iot1
      - nucleo_l4r5zi
      - stm32l562e_dk
    integration_platforms:
      - disco_l475_iot1
  drivers.clock.stm32_clock_configuration.common_core.l4_l5.sysclksrc_pll_64_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/pll_64_hsi_16.overlay"
    platform_allow:
      - disco_l475_iot1
      - nucleo_l4r5zi
      - stm32l562e_dk
    integration_platforms:
      - disco_l475_iot1
  drivers.clock.stm32_clock_configuration.common_core.sysclksrc_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/hsi_16.overlay"
    platform_allow:
      - disco_l475_iot1
      - nucleo_l4r5zi
      - stm32l562e_dk
      - nucleo_wb55rg
      - nucleo_wl55jc
    integration_platforms:
      - disco_l475_iot1
  drivers.clock.stm32_clock_configuration.common_core.sysclksrc_msi_48:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/msi_range11.overlay"
    platform_allow:
      - disco_l475_iot1
      - nucleo_l4r5zi
      - stm32l562e_dk
      - nucleo_wl55jc
      - nucleo_wb55rg
    integration_platforms:
      - disco_l475_iot1
  drivers.clock.stm32_clock_configuration.common_core.l4_l5.sysclksrc_hse_8.fixup:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/hse_8.overlay"
    platform_allow:
      - disco_l475_iot1
      - nucleo_l4r5zi
      - stm32l562e_dk
    harness: ztest
    harness_config:
      fixture: mco_sb_closed
    integration_platforms:
      - disco_l475_iot1
  drivers.clock.stm32_clock_configuration.common_core.l4_l5.sysclksrc_pll_64_hse_8.fixup:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/pll_64_hse_8.overlay"
    platform_allow:
      - disco_l475_iot1
      - nucleo_l4r5zi
      - stm32l562e_dk
    harness: ztest
    harness_config:
      fixture: mco_sb_closed
    integration_platforms:
      - disco_l475_iot1
  drivers.clock.stm32_clock_configuration.common_core.g0.sysclksrc_pll_64_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/pll_64_hse_8.overlay"
    platform_allow: nucleo_g071rb
    harness: ztest
    harness_config:
      fixture: mco_sb_closed
    integration_platforms:
      - nucleo_g071rb
  drivers.clock.stm32_clock_configuration.common_core.g0.sysclksrc_hsi_g0_16_div_2:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hsi_g0_16_div_2.overlay"
    platform_allow: nucleo_g071rb
    integration_platforms:
      - nucleo_g071rb
  drivers.clock.stm32_clock_configuration.common_core.g0.sysclksrc_hsi_g0_16_div_4:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hsi_g0_16_div_4.overlay"
    platform_allow: nucleo_g071rb
    integration_platforms:
      - nucleo_g071rb
  drivers.clock.stm32_clock_configuration.common_core.g4.sysclksrc_pll_64_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/pll_64_hsi_16.overlay"
    platform_allow: nucleo_g474re
    integration_platforms:
      - nucleo_g474re
  drivers.clock.stm32_clock_configuration.common_core.g0.sysclksrc_pll_g0_64_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/pll_g0_64_hsi_16.overlay"
    platform_allow: nucleo_g071rb
    integration_platforms:
      - nucleo_g071rb
  drivers.clock.stm32_clock_configuration.common_core.g4.sysclksrc_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hsi_16.overlay"
    platform_allow: nucleo_g474re
    integration_platforms:
      - nucleo_g474re
  drivers.clock.stm32_clock_configuration.common_core.g0.sysclksrc_hsi_g0_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hsi_g0_16.overlay"
    platform_allow: nucleo_g071rb
    integration_platforms:
      - nucleo_g071rb
  drivers.clock.stm32_clock_configuration.common_core.g4.sysclksrc_hse_24:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hse_24.overlay"
    platform_allow: nucleo_g474re
  drivers.clock.stm32_clock_configuration.common_core.g4.sysclksrc_hse_24.css:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hse_24.overlay;boards/hse_css.overlay"
    platform_allow: nucleo_g474re
  drivers.clock.stm32_clock_configuration.common_core.l0_l1.sysclksrc_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/clear_msi.overlay;boards/hse_8.overlay"
    platform_allow:
      - nucleo_l152re
      - nucleo_l073rz
    integration_platforms:
      - nucleo_l152re
  drivers.clock.stm32_clock_configuration.common_core.l0_l1.sysclksrc_pll_32_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/pll_32_hse_8.overlay"
    platform_allow:
      - nucleo_l152re
      - nucleo_l073rz
    integration_platforms:
      - nucleo_l152re
  drivers.clock.stm32_clock_configuration.common_core.l0_l1.sysclksrc_pll_32_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/pll_32_hsi_16.overlay"
    platform_allow:
      - nucleo_l152re
      - nucleo_l073rz
    integration_platforms:
      - nucleo_l152re
  drivers.clock.stm32_clock_configuration.common_core.l0_l1.sysclksrc_msi_range6:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/msi_range6.overlay"
    platform_allow:
      - nucleo_l152re
      - nucleo_l073rz
    integration_platforms:
      - nucleo_l152re
  drivers.clock.stm32_clock_configuration.common_core.wl.sysclksrc_pll_48_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/pll_48_hsi_16.overlay"
    platform_allow: nucleo_wl55jc
    integration_platforms:
      - nucleo_wl55jc
  drivers.clock.stm32_clock_configuration.common_core.wl.sysclksrc_pll_48_hse_32:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/wl_pll_48_hse_32.overlay"
    platform_allow: nucleo_wl55jc
    integration_platforms:
      - nucleo_wl55jc
  drivers.clock.stm32_clock_configuration.common_core.wl.sysclksrc_hse_32:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/wl_32_hse.overlay"
    platform_allow: nucleo_wl55jc
    integration_platforms:
      - nucleo_wl55jc
  drivers.clock.stm32_clock_configuration.common_core.wb.sysclksrc_hse_32:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/hse_32.overlay"
    platform_allow: nucleo_wb55rg
    integration_platforms:
      - nucleo_wb55rg
  drivers.clock.stm32_clock_configuration.common_core.wb.sysclksrc_pll_48_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/wb_pll_48_hsi_16.overlay"
    platform_allow: nucleo_wb55rg
    integration_platforms:
      - nucleo_wb55rg
  drivers.clock.stm32_clock_configuration.common_core.wb.sysclksrc_pll_64_hse_32:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/wb_pll_64_hse_32.overlay"
    platform_allow: nucleo_wb55rg
    integration_platforms:
      - nucleo_wb55rg
  drivers.clock.stm32_clock_configuration.common_core.wb.sysclksrc_pll_48_msi_4:
    extra_args: DTC_OVERLAY_FILE="boards/clear_clocks.overlay;boards/wb_pll_48_msi_4.overlay"
    platform_allow: nucleo_wb55rg
    integration_platforms:
      - nucleo_wb55rg
  drivers.clock.stm32_clock_configuration.common_core.f0_f3.sysclksrc_hsi_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/hsi_8.overlay"
    platform_allow:
      - nucleo_f091rc
      - stm32f3_disco
    integration_platforms:
      - nucleo_f091rc
  drivers.clock.stm32_clock_configuration.common_core.f0_f3.sysclksrc_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/hse_8_bypass.overlay"
    platform_allow:
      - nucleo_f091rc
      - stm32f3_disco
    integration_platforms:
      - nucleo_f091rc
  drivers.clock.stm32_clock_configuration.common_core.f0_f3.sysclksrc_pll_32_hsi_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/f0_f3_pll_32_hsi_8.overlay"
    platform_allow:
      - nucleo_f091rc
      - stm32f3_disco
    integration_platforms:
      - nucleo_f091rc
  drivers.clock.stm32_clock_configuration.common_core.f0_f3.sysclksrc_pll_32_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/f0_f3_pll_32_hse_8.overlay"
    platform_allow:
      - nucleo_f091rc
      - stm32f3_disco
    integration_platforms:
      - nucleo_f091rc
  drivers.clock.stm32_clock_configuration.common_core.f1.sysclksrc_hsi_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/hsi_8.overlay"
    platform_allow: nucleo_f103rb
    integration_platforms:
      - nucleo_f103rb
  drivers.clock.stm32_clock_configuration.common_core.f1.sysclksrc_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/hse_8.overlay"
    platform_allow: nucleo_f103rb
    integration_platforms:
      - nucleo_f103rb
  drivers.clock.stm32_clock_configuration.common_core.f1.sysclksrc_pll_64_hsi_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/f1_pll_64_hsi_8.overlay"
    platform_allow: nucleo_f103rb
    integration_platforms:
      - nucleo_f103rb
  drivers.clock.stm32_clock_configuration.common_core.f1.sysclksrc_pll_64_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f0_f1_f3_clocks.overlay;boards/f1_pll_64_hse_8.overlay"
    platform_allow: nucleo_f103rb
    integration_platforms:
      - nucleo_f103rb
  drivers.clock.stm32_clock_configuration.common_core.f2_f4_f7.sysclksrc_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f2_f4_f7_clocks.overlay;boards/hsi_16.overlay"
    platform_allow:
      - nucleo_f207zg
      - nucleo_f429zi
      - nucleo_f446re
      - nucleo_f746zg
    integration_platforms:
      - nucleo_f207zg
  drivers.clock.stm32_clock_configuration.common_core.f2_f4_f7.sysclksrc_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f2_f4_f7_clocks.overlay;boards/hse_8.overlay"
    platform_allow:
      - nucleo_f207zg
      - nucleo_f429zi
      - nucleo_f446re
      - nucleo_f746zg
    integration_platforms:
      - nucleo_f207zg
  drivers.clock.stm32_clock_configuration.common_core.f2_f4_f7.sysclksrc_pll_64_hsi_16:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f2_f4_f7_clocks.overlay;boards/f2_f4_f7_pll_64_hsi_16.overlay"
    platform_allow:
      - nucleo_f207zg
      - nucleo_f429zi
      - nucleo_f446re
      - nucleo_f746zg
    integration_platforms:
      - nucleo_f207zg
  drivers.clock.stm32_clock_configuration.common_core.f2_f4_f7.sysclksrc_pll_64_hse_8:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f2_f4_f7_clocks.overlay;boards/f2_f4_f7_pll_64_hse_8.overlay"
    platform_allow:
      - nucleo_f207zg
      - nucleo_f429zi
      - nucleo_f446re
      - nucleo_f746zg
    integration_platforms:
      - nucleo_f207zg
  drivers.clock.stm32_clock_configuration.common_core.f2_f4_f7.sysclksrc_pll_100_hsi_16_ahb2:
    extra_args: DTC_OVERLAY_FILE="boards/clear_f2_f4_f7_clocks.overlay;boards/f2_f4_f7_pll_100_hsi_16_ahb_2.overlay"
    platform_allow:
      - nucleo_f207zg
      - nucleo_f429zi
      - nucleo_f446re
      - nucleo_f746zg
    integration_platforms:
      - nucleo_f207zg
```
```smalltalk
"
Please describe the package using the class comment of the included manifest class.

The manifest class also includes other additional metadata for the package. These
metadata are used by other tools such as the SmalllintManifestChecker and the
critics Browser
"
Class {
	#name : 'ManifestBaselineOfIDE',
	#superclass : 'PackageManifest',
	#category : 'BaselineOfIDE-Manifest',
	#package : 'BaselineOfIDE',
	#tag : 'Manifest'
}

{ #category : 'code-critics' }
ManifestBaselineOfIDE class >> ruleLongMethodsRuleV1FalsePositive [

	<ignoreForCoverage>
	^ #(#(#(#RGMethodDefinition #(#BaselineOfIDE #baseline: #false)) #'2023-11-21T08:59:08.032603+01:00') )
]
```
The Institute of Mission Helpers of the Sacred Heart is a Catholic religious congregation for women, founded in 1890 in Baltimore, Maryland. Initially established to provide religious education for Black children, their apostolate developed to address the needs of the neglected poor in general. Their emphasis is on catechetical and social work.

History

Founding

Around 1888, widowed Catholic convert Anna Frances Hartwell moved from Chicago to Baltimore to do social work and conduct catechism classes for the city's black population, at the request of John R. Slattery, superior general of the Josephites. She and four other women formed a religious community under the name of Mission Helpers, Daughters of the Holy Ghost, dedicated to providing religious education for black people. The convent was on Biddle Street in Baltimore. They were known as "the Tan Sisters" for their tan habits. In 1893 Hartwell assumed the title Mother Joseph.

Baltimore native Mary Frances Cunningham discovered that the black children in her southwest neighborhood were excluded from religious education classes at St. Martin's Church. She began to teach the children, first on the church steps, and then in the basement. Cunningham joined the Mission Helpers in 1891, taking the name Sister Mary Demetrias.

One of the group's first efforts was to open an industrial school for girls to teach workplace skills so that they could help support their families. They also established a professional laundry that provided employment for local women.

In 1895, the name of the institute was changed to 'Mission Helpers of the Sacred Heart'. Their ministry was expanded to assisting the neglected poor regardless of race. Hence, their field of missionary and catechetical labour was greatly broadened. Slattery, whose apostolate was focused on the black population, was unenthusiastic about this change in direction.
The Mission Helpers consulted James Gibbons, Archbishop of Baltimore, and in 1896 he assigned Fr Peter Tarro to replace Slattery as spiritual director. They established daycare centers for the children of working mothers, and in 1896 provided catechism classes to Italian immigrants. In 1897, at the request of Cardinal Gibbons, they opened St. Francis Xavier's school for the deaf. This was the first such Catholic institution in the ecclesiastical province of Baltimore.

Expansion

In 1902 the sisters established a foundation in Puerto Rico and opened a school for the deaf there also. This was a heavy undertaking, as the demands on the sisters for missionary and catechetical work in Puerto Rico were very great, and the need urgent.

They were the first community of Catholic sisters in the Marianas and Micronesia, arriving on Guam in 1905. The government welcomed the presence of American sisters who could teach English and assist in its public health efforts. However, much of the work was not the ministry for which their community was founded, and some of the sisters had not adapted well to the tropical climate. They were recalled to Baltimore in 1908.

At the first general chapter of the institute, held on 5 November 1906, a constitution was adopted, and a superior general and her assistants were elected. At this first election Mother M. Demetrias was chosen as mother general. The community was then officially declared canonically organized. On account of their missionary labours the sisters were unable to keep up the work of perpetual adoration; consequently, it was decided to restrict it to the First Fridays.

In 1922, they purchased the Boyce mansion from the Deford family in West Towson, Maryland. A portion of the land was sold in 1981 to the Blakehurst senior community, and a new motherhouse built next door. In New York, the sisters ran St. Pascal Day Nursery in Manhattan, and the Mount Mongola summer camp in Ellenville.
They established a house in Venezuela in 1962. Sister Rosalia Walsh developed the "Adaptive Way", a method to teach religion in a way appropriate to a child's age. The program has been adopted by catechists in a number of dioceses.

Present day

The Mission Helpers are based in West Towson, Maryland. Their annual crab feast at the Towson American Legion has become a local tradition. They also host an annual flea market at the Mission Helpers Center. The Center is home to the Asylee Women's Enterprise, a support group for women waiting to apply for asylum.

The sisters continue to work in parishes, hospitals, nursing homes, senior communities and college campuses in the United States, Puerto Rico, and Venezuela. The sisters are active in the Diocese of Orlando.

References

Sources
"Religious Orders of Women in the United States", Elinor Tong Dehey, ed. (Indiana: W.B. Conkey, 1913), 295-298

External links
Congregation website

Catholic female orders and societies
African-American Roman Catholicism
Catholic Church in the United States
Religious organizations established in 1890
Catholic organizations established in the 19th century
Christianity in Baltimore
Religious orders
Catholic orders and societies
History of women in Maryland
```objective-c
//
//  YPPartitionViewController.m
//  Wuxianda
//
//  Created by MichaelPPP on 16/6/8.
//

#import "YPPartitionViewController.h"

@interface YPPartitionViewController ()

@end

@implementation YPPartitionViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view from its nib.
}

@end
```
```smalltalk
Class {
	#name : 'CDFluidClassParserTest',
	#superclass : 'TestCase',
	#category : 'ClassParser-Tests',
	#package : 'ClassParser-Tests'
}

{ #category : 'running' }
CDFluidClassParserTest >> classDefinitionParserClass [

	^ CDFluidClassDefinitionParser
]

{ #category : 'tests - (r) simple class definition' }
CDFluidClassParserTest >> testClassSideDefinitionIsClassSide [

	| def |
	def := self classDefinitionParserClass parse: 'Object class << Point class slot: { }'.
	self assert: def isClassSide
]

{ #category : 'tests - (r) class side' }
CDFluidClassParserTest >> testClassSideEmpty [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object class << AlignmentMorph class '.
	def := parser parse: defString.
	self assert: def class equals: CDMetaclassDefinitionNode
]

{ #category : 'tests - (r) class side' }
CDFluidClassParserTest >> testClassSideWithTraits [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object class << AlignmentMorph class traits: TableRotate classTrait; slots: { #x . #y}'.
	def := parser parse: defString.
	self assert: def class equals: CDMetaclassDefinitionNode.
	self assert: def hasTraitComposition.
	self assert: def traitDefinition class equals: CDClassTraitNode.
	self assert: def traitDefinition name equals: #TableRotate.
	self assert: def slots first name equals: #x
]

{ #category : 'tests - (r) class variables' }
CDFluidClassParserTest >> testComplexClassVariables [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject sharedVariables: { #A => ClassVar default: 5 }; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def sharedVariables first class equals: CDSharedVariableNode.
	self assert: def sharedVariables first name equals: #A.
	self assert: def sharedVariables first variableClassName equals: #ClassVar
]

{ #category : 'tests - (r) class variables' }
CDFluidClassParserTest >> testComplexClassVariablesCascade [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject sharedVariables: { #A => ClassVar default: 5; default2: 4 }; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def sharedVariables first class equals: CDSharedVariableNode.
	self assert: def sharedVariables first name equals: #A.
	self assert: def sharedVariables first variableClassName equals: #ClassVar
]

{ #category : 'tests - (r) slots' }
CDFluidClassParserTest >> testComplexSlots [

	| parser defString def slot |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject slots: { #inst => Slot default: 5 }; package: #MyPackage'.
	def := parser parse: defString.
	slot := def slots first.
	self assert: slot name equals: #inst.
	self assert: slot node selector equals: #default:.
	self assert: slot variableClassName equals: #Slot
]

{ #category : 'tests - (r) slots' }
CDFluidClassParserTest >> testComplexSlotsCascade [

	| parser defString def slot |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject slots: { #inst => Slot default: 5; default2: 4}; package: #MyPackage'.
	def := parser parse: defString.
	slot := def slots first.
	self assert: slot name equals: #inst.
	self assert: slot node messages first selector equals: #default:.
	self assert: slot variableClassName equals: #Slot
]

{ #category : 'tests - (r) slots' }
CDFluidClassParserTest >> testComplexSlotsClass [

	| parser defString def slot |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject slots: { #inst => Slot }; package: #MyPackage'.
	def := parser parse: defString.
	slot := def slots first.
	self assert: slot name equals: #inst.
	self assert: slot variableClassName equals: #Slot
]

{ #category : 'tests - (r) class variables' }
CDFluidClassParserTest >> testEmptyClassVariable [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject sharedVariables: { }; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def sharedVariables isEmpty
]

{ #category : 'tests - (r) slots' }
CDFluidClassParserTest >> testEmptySlots [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject slots: {}; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def slots isEmpty
]

{ #category : 'tests - (r) kinds' }
CDFluidClassParserTest >> testEphemeronSubclass [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject layout: EphemeronLayout; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def layoutClass equals: EphemeronLayout
]

{ #category : 'tests - (r) simple class definition' }
CDFluidClassParserTest >> testInstanceDefinitionIsInstanceSide [

	| def |
	def := self classDefinitionParserClass parse: 'Object << #Point package: ''Kernel-BasicObjects'''.
	self assert: def isInstanceSide
]

{ #category : 'tests - (r) kinds' }
CDFluidClassParserTest >> testNormalSubclass [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject layout: FixedLayout; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def layoutClass equals: FixedLayout
]

{ #category : 'tests - (r) sharedPools' }
CDFluidClassParserTest >> testSharedPools [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject sharedPools: {TextConstants}; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def sharedPools first name equals: 'TextConstants'
]

{ #category : 'tests - (r) class variables' }
CDFluidClassParserTest >> testSimpleClassVariableClass [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject sharedVariables: { #A => ClassVar }; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def sharedVariables first name equals: #A.
	self assert: def sharedVariables first variableClassName equals: #ClassVar.
	self assert: def sharedVariables first class equals: CDSharedVariableNode
]

{ #category : 'tests - (r) class variables' }
CDFluidClassParserTest >> testSimpleClassVariables [

	| parser defString def |
	parser := self classDefinitionParserClass new.
	defString := 'Object << #MyObject sharedVariables: { #A . #B }; package: #MyPackage'.
	def := parser parse: defString.
	self assert: def sharedVariables first name equals: #A.
	self assert: def sharedVariables second name equals: #B.
	self assert: def sharedVariables first variableClassName equals: #ClassVariable.
	self assert: def sharedVariables second variableClassName equals: #ClassVariable
]

{ #category : 'tests - (r) simple class definition' }
CDFluidClassParserTest >> testSimpleDefinition [

	| def |
	def := self classDefinitionParserClass parse: 'Object << #Point package: ''Kernel-BasicObjects'''.
	self assert: def className equals: #Point
]

{ #category : 'tests - (r) simple class definition' }
CDFluidClassParserTest >> testSimpleDefinitionClassNode [

	| def |
	def := self classDefinitionParserClass parse: 'Object << #Point package: ''Kernel-BasicObjects'''.
	self assert: def classNameNode className equals: #Point.
	"The following cannot work
	self assert: def classNameNode binding value equals: Point.
	because binding is defined as

	existingBindingIfAbsent: aBlock
		| binding |
		binding := originalNode methodNode compilationContext environment bindingOf: className.
^ binding ifNil: aBlock " ] { #category : 'tests - (r) simple class definition' } CDFluidClassParserTest >> testSimpleDefinitionPackageIsCorrect [ | def | def := self classDefinitionParserClass parse: 'Object << #Point package: ''Kernel-BasicObjects'''. self assert: def packageName equals: 'Kernel-BasicObjects' ] { #category : 'tests - (r) simple class definition' } CDFluidClassParserTest >> testSimpleDefinitionSuperclassName [ | def | def := self classDefinitionParserClass parse: 'Object << #Point package: ''Kernel-BasicObjects'''. self assert: def superclassName equals: 'Object' ] { #category : 'tests - (r) slots' } CDFluidClassParserTest >> testSimpleSlots [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject slots: { #a. #b }; package: #MyPackage'. def := parser parse: defString. self assert: def slots size equals: 2. self assert: def slots first name equals: #a. self assert: def slots second name equals: #b. self assert: def slots first variableClassName equals: #InstanceVariableSlot. self assert: def slots second variableClassName equals: #InstanceVariableSlot ] { #category : 'tests - (r) tags' } CDFluidClassParserTest >> testTag [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject tag: ''tag1''; package: #MyPackage'. def := parser parse: defString. self assert: def tag name equals: 'tag1' ] { #category : 'tests - (r) traits' } CDFluidClassParserTest >> testTraitAlias [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject traits: MyTrait @ {#foo -> #bar}; package: #MyPackage'. def := parser parse: defString. self assert: def traitDefinition class equals: CDTraitAliasNode. self assert: (def traitDefinition aliases values) equals: #(bar). self assert: (def traitDefinition aliases keys) equals: #(foo). 
self assert: def traitDefinition subject name equals: #MyTrait ] { #category : 'tests - (r) traits' } CDFluidClassParserTest >> testTraitEmpty [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject uses: {}; package: #MyPackage'. def := parser parse: defString. self assert: def traitDefinition equals: nil ] { #category : 'tests - (r) traits' } CDFluidClassParserTest >> testTraitPlainSimple [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject traits: MyTrait; package: #MyPackage'. def := parser parse: defString. self assert: def traitDefinition name equals: #MyTrait ] { #category : 'tests - (r) traits' } CDFluidClassParserTest >> testTraitSequence [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject traits: MyTrait + (AnotherTrait - {#selector} @ {#selector1 -> #selector}); package: #MyPackage'. def := parser parse: defString. self assert: def traitDefinition class equals: CDTraitCompositionSequenceNode. self assert: def traitDefinition sequence size equals: 2. self assert: (def traitDefinition sequence second aliases values) equals: #(selector). self assert: (def traitDefinition sequence second aliases keys) equals: #(selector1). self assert: def traitDefinition sequence first name equals: #MyTrait ] { #category : 'tests - (r) class variables' } CDFluidClassParserTest >> testUnrestrictedClassVariable [ | orginalSetting parser defString def | orginalSetting := CDFluidClassDefinitionParser unrestrictedVariableDefinitions. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: true. parser := self classDefinitionParserClass new. defString := 'Object << #MyObject sharedVariables: { ClassVariable named: #A }; package: #MyPackage'. def := parser parse: defString. self assert: def sharedVariables first class equals: CDSharedVariableNode. self assert: def sharedVariables first name equals: #A. 
self assert: def sharedVariables first variableClassName equals: #ClassVariable. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: orginalSetting ] { #category : 'tests - (r) class variables' } CDFluidClassParserTest >> testUnrestrictedClassVariableSimple [ | orginalSetting parser defString def | orginalSetting := CDFluidClassDefinitionParser unrestrictedVariableDefinitions. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: true. parser := self classDefinitionParserClass new. defString := 'Object << #MyObject sharedVariables: { #A }; package: #MyPackage'. def := parser parse: defString. self assert: def sharedVariables first class equals: CDSharedVariableNode. self assert: def sharedVariables first name equals: #A. self assert: def sharedVariables first variableClassName equals: #ClassVariable. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: orginalSetting ] { #category : 'tests - (r) slots' } CDFluidClassParserTest >> testUnrestrictedSlot [ | orginalSetting parser defString def | orginalSetting := CDFluidClassDefinitionParser unrestrictedVariableDefinitions. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: true. parser := self classDefinitionParserClass new. defString := 'Object << #MyObject slots: { InstanceVariableSlot named: #a. #b }; package: #MyPackage'. def := parser parse: defString. self assert: def slots size equals: 2. self assert: def slots first name equals: #a. self assert: def slots second name equals: #b. self assert: def slots first variableClassName equals: #InstanceVariableSlot. self assert: def slots second variableClassName equals: #InstanceVariableSlot. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: orginalSetting ] { #category : 'tests - (r) slots' } CDFluidClassParserTest >> testUnrestrictedSlotsSimple [ | orginalSetting parser defString def | orginalSetting := CDFluidClassDefinitionParser unrestrictedVariableDefinitions. 
CDFluidClassDefinitionParser unrestrictedVariableDefinitions: true. parser := self classDefinitionParserClass new. defString := 'Object << #MyObject slots: { #a. #b }; package: #MyPackage'. def := parser parse: defString. self assert: def slots size equals: 2. self assert: def slots first name equals: #a. self assert: def slots second name equals: #b. self assert: def slots first variableClassName equals: #InstanceVariableSlot. self assert: def slots second variableClassName equals: #InstanceVariableSlot. CDFluidClassDefinitionParser unrestrictedVariableDefinitions: orginalSetting ] { #category : 'tests - (r) kinds' } CDFluidClassParserTest >> testVariableByteSubclass [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject layout: ByteLayout; package: #MyPackage'. def := parser parse: defString. self assert: def layoutClass equals: ByteLayout ] { #category : 'tests - (r) kinds' } CDFluidClassParserTest >> testVariableSubclass [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject layout: VariableLayout; package: #MyPackage'. def := parser parse: defString. self assert: def layoutClass equals: VariableLayout ] { #category : 'tests - (r) kinds' } CDFluidClassParserTest >> testVariableWordSubclass [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject layout: WordLayout; package: #MyPackage'. def := parser parse: defString. self assert: def layoutClass equals: WordLayout ] { #category : 'tests - (r) kinds' } CDFluidClassParserTest >> testWeakSubclass [ | parser defString def | parser := self classDefinitionParserClass new. defString := 'Object << #MyObject layout: WeakLayout; package: #MyPackage'. def := parser parse: defString. self assert: def layoutClass equals: WeakLayout ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB [ | dict searcher | searcher := RBParseTreeSearcher new. 
searcher matches: '`superklass << `#ClassName slots: {}; sharedVariables: {}; package: ''''' do: [ :aNode :answer | dict:= searcher context ]. dict := searcher executeTree: (RBParser parseExpression: 'Object << #MyClass slots: {}; sharedVariables: {}; package: ''''') ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB10 [ | searcher kind | searcher := RBParseTreeSearcher new. searcher matches: 'Trait << `#traitSymbol' do: [:aNode :answer | kind := #traitInstance ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | kind := #traitClass ]; matches: '`@tm << `#symb' do: [:aNode :answer | kind := #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | kind := #class ]. searcher executeTree: (RBParser parseExpression: ' Trait << TViewModelMock3 classTrait ') . self assert: kind equals: #traitClass. "reference to TViewModelMock3 is just in the string, add it here so we can find it" TViewModelMock3. ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB10WithError [ | searcher kind | searcher := RBParseTreeSearcher new. searcher matches: 'Trait << `#traitSymbol' do: [:aNode :answer | kind := #traitInstance ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | kind := #traitClass ]; matches: '`@tm << `#symb' do: [:aNode :answer | kind := #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | kind := #class ]. searcher executeTree: (RBParser parseExpression: ' Trait << TViewModelMock3 class ') . self assert: kind isNil. "reference to TViewModelMock3 is just in the string, add it here so we can find it" TViewModelMock3. ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB3 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. searcher matches: '^self' do: [:aNode :answer | coll add: aNode ]; matches: '^`@anything' do: [:aNode :answer | coll add: aNode]. searcher executeTree: (RBParser parseMethod: 'foo |tmp| tmp := 22. ^ 42'). 
self assert: coll size equals: 1 ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB4 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. searcher matches: '`@tm := `@val' do: [:aNode :answer | coll add: aNode ]; matches: '^`@anything' do: [:aNode :answer | coll add: aNode]. searcher executeTree: (RBParser parseMethod: 'foo | tmp | tmp := 22. tmp := 55. ^ 42'). self assert: coll size equals: 3 ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB5 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. searcher matches: '`@tm << `#symb' do: [:aNode :answer | coll add: #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | coll add: #class ]; matches: 'Trait << `#traitSymbol' do: [:aNode :answer | coll add: #traitInstance ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | coll add: #traitClass ]. searcher executeTree: (RBParser parseExpression: ' Object << #Point slots: { #x . #y }; package: ''Foo'' ') . self assert: coll first equals: #instance. self assert: coll size equals: 1 ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB6 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. searcher matches: '`@tm << `#symb' do: [:aNode :answer | coll add: #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | coll add: #class ]; matches: 'Trait << `#traitSymbol' do: [:aNode :answer | coll add: #traitInstance ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | coll add: #traitClass ]. searcher executeTree: (RBParser parseExpression: ' Object class << #Point class slots: { #x . #y }; package: ''Foo'' ') . self assert: coll first equals: #class. self assert: coll size equals: 1 ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB7 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. 
searcher matches: '`@tm << `#symb' do: [:aNode :answer | coll add: #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | coll add: #class ]; matches: 'Trait << `#traitSymbol' do: [:aNode :answer | coll add: #traitInstance ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | coll add: #traitClass ]. searcher executeTree: (RBParser parseExpression: ' Trait << #TPoint classTrait slots: { #x . #y }; package: ''Foo'' ') . self assert: coll first equals: #traitClass. self assert: coll size equals: 1 ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB8 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. searcher matches: 'Trait << `#traitSymbol' do: [:aNode :answer | coll add: #traitInstance ]; matches: '`@tm << `#symb' do: [:aNode :answer | coll add: #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | coll add: #class ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | coll add: #traitClass ]. searcher executeTree: (RBParser parseExpression: ' Trait << #Point slots: { #x . #y }; package: ''Foo'' ') . self assert: coll size equals: 1. self assert: coll first equals: #traitInstance ] { #category : 'tests - rb xp' } CDFluidClassParserTest >> testWithRB9 [ | searcher coll| searcher := RBParseTreeSearcher new. coll := OrderedCollection new. searcher matches: 'Trait << `#traitSymbol' do: [:aNode :answer | coll add: #traitInstance ]; matches: '`@tm << `#symb' do: [:aNode :answer | coll add: #instance ]; matches: '`@tm class << `@symb class' do: [:aNode :answer | coll add: #class ]; matches: 'Trait << `@symb classTrait' do: [:aNode :answer | coll add: #traitClass ]. searcher executeTree: (RBParser parseExpression: ' Object << #MyObject sharedVariables: { #A . #B }; package: ''MyPackage'' ') . self assert: coll size equals: 1. self assert: coll first equals: #instance ] ```
```javascript
/**
 * Helper for resolving environment specific configuration files.
 *
 * It resolves .env files that are supported by the `dotenv` library.
 *
 * Please read the application configuration docs for more info.
 */

import appRootDir from 'app-root-dir';
import dotenv from 'dotenv';
import fs from 'fs';
import path from 'path';
import ifElse from '../../shared/utils/logic/ifElse';
import removeNil from '../../shared/utils/arrays/removeNil';
import { log } from '../../internal/utils';

// PRIVATES

function registerEnvFile() {
  const DEPLOYMENT = process.env.DEPLOYMENT;
  const envFile = '.env';

  // This is the order in which we will try to resolve an environment configuration
  // file.
  const envFileResolutionOrder = removeNil([
    // Is there an environment config file at the app root?
    // This always takes preference.
    // e.g. /projects/react-universally/.env
    path.resolve(appRootDir.get(), envFile),
    // Is there an environment config file at the app root for our target
    // environment name?
    // e.g. /projects/react-universally/.env.staging
    ifElse(DEPLOYMENT)(path.resolve(appRootDir.get(), `${envFile}.${DEPLOYMENT}`)),
  ]);

  // Find the first env file path match.
  const envFilePath = envFileResolutionOrder.find(filePath => fs.existsSync(filePath));

  // If we found an env file match, register it.
  if (envFilePath) {
    // eslint-disable-next-line no-console
    log({
      title: 'server',
      level: 'special',
      message: `Registering environment variables from: ${envFilePath}`,
    });
    dotenv.config({ path: envFilePath });
  }
}

// Ensure that we first register any environment variables from an existing
// env file.
registerEnvFile();

// EXPORTED HELPERS

/**
 * Gets a string environment variable by the given name.
 *
 * @param {String} name - The name of the environment variable.
 * @param {String} defaultVal - The default value to use.
 *
 * @return {String} The value.
 */
export function string(name, defaultVal) {
  return process.env[name] || defaultVal;
}

/**
 * Gets a number environment variable by the given name.
 *
 * @param {String} name - The name of the environment variable.
 * @param {number} defaultVal - The default value to use.
 *
 * @return {number} The value.
 */
export function number(name, defaultVal) {
  return process.env[name] ? parseInt(process.env[name], 10) : defaultVal;
}

/**
 * Gets a boolean environment variable by the given name.
 *
 * Accepts 'true' or '1' as truthy values.
 *
 * @param {String} name - The name of the environment variable.
 * @param {boolean} defaultVal - The default value to use.
 *
 * @return {boolean} The value.
 */
export function bool(name, defaultVal) {
  return process.env[name]
    ? process.env[name] === 'true' || process.env[name] === '1'
    : defaultVal;
}
```
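The three getters are small enough to exercise in isolation. Below is a minimal, standalone sketch of their fallback behaviour — it re-declares them locally rather than importing the module, and the variable names (`DEMO_PORT`, `DEMO_SSR`, `DEMO_HOST`) are made up for the example:

```javascript
// Standalone re-implementation of the getters above, with the .env file
// resolution and logging omitted.
const string = (name, defaultVal) => process.env[name] || defaultVal;
const number = (name, defaultVal) =>
  (process.env[name] ? parseInt(process.env[name], 10) : defaultVal);
const bool = (name, defaultVal) =>
  (process.env[name]
    ? process.env[name] === 'true' || process.env[name] === '1'
    : defaultVal);

// Simulate values as dotenv would have registered them (always strings).
process.env.DEMO_PORT = '1337';
process.env.DEMO_SSR = '1';

console.log(string('DEMO_HOST', 'localhost')); // unset, so the default wins
console.log(number('DEMO_PORT', 3000));        // parsed from the string '1337'
console.log(bool('DEMO_SSR', false));          // '1' is treated as true
```

Note that every value read from `process.env` is a string, which is why the numeric and boolean getters have to parse rather than just return the raw value.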
```kotlin package mega.privacy.android.app.domain.usecase.shares import mega.privacy.android.domain.entity.node.NodeId import nz.mega.sdk.MegaShare /** * Get a list with the active and pending outbound sharings for a MegaNode */ fun interface GetOutShares { /** * Get a list with the active and pending outbound sharings for a MegaNode * @param nodeId the [NodeId] of the node to get the outbound sharings * @return a list of [MegaShare] of the outbound sharings of the node */ suspend operator fun invoke(nodeId: NodeId): List<MegaShare>? } ```
```python # coding=utf-8 """Trakt checker module.""" from __future__ import unicode_literals import datetime import logging import time from builtins import object from builtins import str from json.decoder import JSONDecodeError from medusa import app, db, ui from medusa.common import ARCHIVED, DOWNLOADED, Quality, SKIPPED, SNATCHED, SNATCHED_BEST, SNATCHED_PROPER, WANTED from medusa.helper.common import episode_num from medusa.helpers.externals import show_in_library from medusa.helpers.trakt import create_episode_structure, create_show_structure, get_trakt_user from medusa.indexers.config import EXTERNAL_IMDB, EXTERNAL_TRAKT, indexerConfig from medusa.indexers.imdb.api import ImdbIdentifier from medusa.indexers.utils import get_trakt_indexer from medusa.logger.adapters.style import BraceAdapter from medusa.search.queue import BacklogQueueItem from medusa.show.show import Show from requests.exceptions import RequestException from trakt import sync, tv from trakt.errors import TraktException log = BraceAdapter(logging.getLogger(__name__)) log.logger.addHandler(logging.NullHandler()) def set_episode_to_wanted(show, season, episode): """Set an episode to wanted, only if it is currently skipped.""" # Episode must be loaded from DB to get current status and not default blank episode status ep_obj = show.get_episode(season, episode) if ep_obj: with ep_obj.lock: if ep_obj.status != SKIPPED or ep_obj.airdate == datetime.date.fromordinal(1): log.info("Not setting episode '{show}' {ep} to WANTED because current status is not SKIPPED " "or it doesn't have a valid airdate", {'show': show.name, 'ep': episode_num(season, episode)}) return log.info("Setting episode '{show}' {ep} to wanted", { 'show': show.name, 'ep': episode_num(season, episode) }) # figure out what segment the episode is in and remember it so we can backlog it ep_obj.status = WANTED # As we created the episode and updated the status, need to save to DB ep_obj.save_to_db() cur_backlog_queue_item = 
BacklogQueueItem(show, [ep_obj]) app.search_queue_scheduler.action.add_item(cur_backlog_queue_item) log.info("Starting backlog search for '{show}' {ep} because some episodes were set to wanted", { 'show': show.name, 'ep': episode_num(season, episode) }) class TraktChecker(object): """Trakt checker class.""" def __init__(self): """Initialize the class.""" self.todo_wanted = [] self.show_watchlist = [] self.episode_watchlist = [] self.collection_list = [] self.amActive = False def run(self, force=False): """Run Trakt Checker.""" self.amActive = True # add shows from Trakt watchlist if app.TRAKT_SYNC_WATCHLIST: self.todo_wanted = [] # its about to all get re-added if len(app.ROOT_DIRS) < 2: log.warning('No default root directory') ui.notifications.error('Unable to add show', 'You do not have any default root directory. ' 'Please configure in general settings!') return try: self.sync_watchlist() self.sync_library() except (TraktException, RequestException, JSONDecodeError) as error: log.exception('Trakt exception while running trakt_checker.\nError: {error}', {'error': error}) self.amActive = False def find_show(self, indexerid, indexer): """Find show in Trakt library.""" trakt_library = [] try: trakt_library = sync.get_collection('shows') except (TraktException, RequestException, JSONDecodeError) as error: log.info('Unable to retrieve shows from Trakt collection. Error: {error!r}', {'error': error}) if not trakt_library: log.info('No shows found in your Trakt library. 
Nothing to sync') return trakt_show = [show for show in trakt_library if get_trakt_indexer(indexer) and show.ids['ids'].get(get_trakt_indexer(indexer)) and indexerid in [show.ids['ids'].get(get_trakt_indexer(indexer))]] return trakt_show if trakt_show else None def remove_show_trakt_library(self, show_obj): """Remove show from trakt library.""" if self.find_show(show_obj.indexerid, show_obj.indexer): # Check if TRAKT supports that indexer if not get_trakt_indexer(show_obj.indexer): return log.info("Removing '{show}' from Trakt library", {'show': show_obj.name}) # Remove all episodes from the Trakt collection for this show try: self.remove_episode_trakt_collection(filter_show=show_obj) except (TraktException, RequestException, JSONDecodeError) as error: log.info("Unable to remove all episodes from show '{show}' from Trakt library. Error: {error!r}", { 'show': show_obj.name, 'error': error }) try: sync.remove_from_collection(create_show_structure(show_obj)) except (TraktException, RequestException, JSONDecodeError) as error: log.info("Unable to remove show '{show}' from Trakt library. Error: {error!r}", { 'show': show_obj.name, 'error': error }) def add_show_trakt_library(self, show_obj): """Add show to trakt library.""" if self.find_show(show_obj.indexerid, show_obj.indexer): return # Check if TRAKT supports that indexer if not get_trakt_indexer(show_obj.indexer): return log.info("Adding show '{show}' to Trakt library", {'show': show_obj.name}) try: result = sync.add_to_collection(create_show_structure(show_obj)) except (TraktException, RequestException, JSONDecodeError) as error: log.info("Unable to add show '{show}' to Trakt library. 
Error: {error!r}", { 'show': show_obj.name, 'error': error }) return if result and (result.get('added') or result.get('existing')): return True return False def sync_library(self): """Sync Trakt library.""" if app.TRAKT_SYNC and app.USE_TRAKT: log.debug('Syncing Trakt collection') if self._get_show_collection(): self.add_episode_trakt_collection() if app.TRAKT_SYNC_REMOVE: self.remove_episode_trakt_collection() log.debug('Synced Trakt collection') def remove_episode_trakt_collection(self, filter_show=None): """Remove episode from trakt collection. For episodes that no longer have a media file (location) :param filter_show: optional. Only remove episodes from trakt collection for given shows """ if not (app.TRAKT_SYNC_REMOVE and app.TRAKT_SYNC and app.USE_TRAKT): return params = [] main_db_con = db.DBConnection() statuses = [DOWNLOADED, ARCHIVED] sql_selection = 'SELECT s.indexer, s.startyear, s.indexer_id, s.show_name,' \ 'e.season, e.episode, e.status ' \ 'FROM tv_episodes AS e, tv_shows AS s WHERE e.indexer = s.indexer AND ' \ 's.indexer_id = e.showid and e.location = "" ' \ 'AND e.status in ({0})'.format(','.join(['?'] * len(statuses))) if filter_show: sql_selection += ' AND s.indexer_id = ? AND e.indexer = ?' 
params = [filter_show.series_id, filter_show.indexer] sql_result = main_db_con.select(sql_selection, statuses + params) if not sql_result: return episodes = [] shows = {} for cur_episode in sql_result: # Check if TRAKT supports that indexer if not get_trakt_indexer(cur_episode['indexer']): continue show_id = cur_episode['indexer'], cur_episode['indexer_id'] episode = cur_episode['season'], cur_episode['episode'] if show_id not in shows: shows[show_id] = [] shows[show_id].append(episode) media_object_shows = [] for show_id in shows: episodes = [] show_obj = Show.find_by_id(app.showList, show_id[0], show_id[1]) for season, episode in shows[show_id]: if not self._check_list( indexer=show_obj.indexer, indexer_id=show_obj.series_id, season=season, episode=episode, list_type='Collection' ): continue log.info("Removing episode '{show}' {ep} from Trakt collection", { 'show': show_obj.name, 'ep': episode_num(season, episode) }) episodes.append(show_obj.get_episode(season, episode)) media_object_shows.append(create_episode_structure(show_obj, episodes)) try: sync.remove_from_collection({'shows': media_object_shows}) self._get_show_collection() except (TraktException, RequestException, JSONDecodeError) as error: log.info('Unable to remove episodes from Trakt collection. Error: {error!r}', { 'error': error }) def add_episode_trakt_collection(self): """Add all existing episodes to Trakt collections. 
For episodes that have a media file (location) """ if not(app.TRAKT_SYNC and app.USE_TRAKT): return main_db_con = db.DBConnection() statuses = [DOWNLOADED, ARCHIVED] sql_selection = 'SELECT s.indexer, s.startyear, s.indexer_id, s.show_name, e.season, e.episode ' \ 'FROM tv_episodes AS e, tv_shows AS s ' \ 'WHERE e.indexer = s.indexer AND s.indexer_id = e.showid ' \ "AND e.status in ({0}) AND e.location <> ''".format(','.join(['?'] * len(statuses))) sql_result = main_db_con.select(sql_selection, statuses) if not sql_result: return episodes = [] shows = {} for cur_episode in sql_result: # Check if TRAKT supports that indexer if not get_trakt_indexer(cur_episode['indexer']): continue show_id = cur_episode['indexer'], cur_episode['indexer_id'] episode = cur_episode['season'], cur_episode['episode'] if show_id not in shows: shows[show_id] = [] shows[show_id].append(episode) media_object_shows = [] for show_id in shows: episodes = [] show_obj = Show.find_by_id(app.showList, show_id[0], show_id[1]) for season, episode in shows[show_id]: if not self._check_list( indexer=show_obj.indexer, indexer_id=show_obj.series_id, season=season, episode=episode, list_type='Collection' ): continue log.info("Adding episode '{show}' {ep} to Trakt collection", { 'show': show_obj.name, 'ep': episode_num(season, episode) }) episodes.append(show_obj.get_episode(season, episode)) media_object_shows.append(create_episode_structure(show_obj, episodes)) try: sync.add_to_collection({'shows': media_object_shows}) self._get_show_collection() except (TraktException, RequestException, JSONDecodeError) as error: log.info('Unable to add episodes to Trakt collection. 
Error: {error!r}', {'error': error}) def sync_watchlist(self): """Sync Trakt watchlist.""" if app.USE_TRAKT and app.TRAKT_SYNC_WATCHLIST: log.debug('Syncing Trakt Watchlist') self.remove_from_library() if self._get_show_watchlist(): log.debug('Syncing shows from Trakt watchlist to library') self.sync_trakt_shows() if app.TRAKT_SYNC_TO_WATCHLIST: log.debug('Syncing shows from library to Trakt watchlist') self.add_show_watchlist() if self._get_episode_watchlist(): log.debug('Syncing episodes from Trakt watchlist to library') self.remove_episode_watchlist() self.sync_trakt_episodes() log.debug('Syncing episodes from library to trakt watchlist') self.add_episode_watchlist() log.debug('Synced Trakt watchlist') def remove_episode_watchlist(self): """Remove episode from Trakt watchlist.""" if not (app.TRAKT_SYNC_WATCHLIST and app.USE_TRAKT): return main_db_con = db.DBConnection() statuses = [DOWNLOADED, ARCHIVED] sql_selection = 'SELECT s.indexer, s.startyear, s.indexer_id, s.show_name, e.season, e.episode ' \ 'FROM tv_episodes AS e, tv_shows AS s ' \ 'WHERE e.indexer = s.indexer ' \ 'AND s.indexer_id = e.showid AND e.status in ({0})'.format(','.join(['?'] * len(statuses))) sql_result = main_db_con.select(sql_selection, statuses) if not sql_result: return episodes = [] shows = {} for cur_episode in sql_result: # Check if TRAKT supports that indexer if not get_trakt_indexer(cur_episode['indexer']): continue show_id = cur_episode['indexer'], cur_episode['indexer_id'] episode = cur_episode['season'], cur_episode['episode'] if show_id not in shows: shows[show_id] = [] shows[show_id].append(episode) media_object_shows = [] for show_id in shows: episodes = [] show_obj = Show.find_by_id(app.showList, show_id[0], show_id[1]) for season, episode in shows[show_id]: if not self._check_list( indexer=show_obj.indexer, indexer_id=show_obj.series_id, season=season, episode=episode, list_type='Collection' ): continue log.info("Removing episode '{show}' {ep} from Trakt watchlist", { 
'show': show_obj.name,
                    'ep': episode_num(season, episode)
                })
                episodes.append(show_obj.get_episode(season, episode))

            media_object_shows.append(create_episode_structure(show_obj, episodes))

        try:
            sync.remove_from_watchlist({'shows': media_object_shows})
            self._get_episode_watchlist()
        except (TraktException, RequestException, JSONDecodeError) as error:
            log.info('Unable to remove episodes from Trakt watchlist. Error: {error!r}', {
                'error': error
            })

    def add_episode_watchlist(self):
        """Add episode to Trakt watchlist."""
        if not (app.TRAKT_SYNC_WATCHLIST and app.USE_TRAKT):
            return

        main_db_con = db.DBConnection()
        statuses = [SNATCHED, SNATCHED_BEST, SNATCHED_PROPER, WANTED]
        sql_selection = 'SELECT s.indexer, s.startyear, s.indexer_id, s.show_name, e.season, e.episode ' \
                        'FROM tv_episodes AS e, tv_shows AS s ' \
                        'WHERE e.indexer = s.indexer AND s.indexer_id = e.showid AND s.paused = 0 ' \
                        'AND e.status in ({0})'.format(','.join(['?'] * len(statuses)))
        sql_result = main_db_con.select(sql_selection, statuses)

        if not sql_result:
            return

        episodes = []
        shows = {}
        for cur_episode in sql_result:
            # Check if TRAKT supports that indexer
            if not get_trakt_indexer(cur_episode['indexer']):
                continue

            show_id = cur_episode['indexer'], cur_episode['indexer_id']
            episode = cur_episode['season'], cur_episode['episode']

            if show_id not in shows:
                shows[show_id] = []

            shows[show_id].append(episode)

        media_object_shows = []
        for show_id in shows:
            episodes = []
            show_obj = Show.find_by_id(app.showList, show_id[0], show_id[1])
            for season, episode in shows[show_id]:
                if not self._check_list(
                    indexer=show_obj.indexer, indexer_id=show_obj.series_id,
                    season=season, episode=episode, list_type='Collection'
                ):
                    continue

                log.info("Adding episode '{show}' {ep} to Trakt watchlist", {
                    'show': show_obj.name,
                    'ep': episode_num(season, episode)
                })
                episodes.append(show_obj.get_episode(season, episode))

            media_object_shows.append(create_episode_structure(show_obj, episodes))

        try:
            sync.add_to_watchlist({'shows':
media_object_shows}) self._get_episode_watchlist() except (TraktException, RequestException, JSONDecodeError) as error: log.info('Unable to add episode to Trakt watchlist. Error: {error!r}', { 'error': error }) def add_show_watchlist(self): """Add show to Trakt watchlist. It will add all shows from Medusa library """ if not (app.TRAKT_SYNC_WATCHLIST and app.USE_TRAKT): return if not app.showList: return trakt_show_objects = [] for show_obj in app.showList: if not self._check_list(show_obj=show_obj, list_type='Show'): log.info("Adding show '{show}' to Trakt watchlist", {'show': show_obj.name}) trakt_show_objects.append(create_show_structure(show_obj)) if trakt_show_objects: try: sync.add_to_watchlist({'shows': trakt_show_objects}) except (TraktException, RequestException, JSONDecodeError) as error: log.info('Unable to add shows to Trakt watchlist. Error: {error!r}', {'error': error}) self._get_show_watchlist() def remove_from_library(self): """Remove show from Medusa library if it is ended/completed.""" if not (app.TRAKT_SYNC_WATCHLIST and app.USE_TRAKT and app.TRAKT_REMOVE_SHOW_FROM_APPLICATION): return log.debug('Retrieving ended/completed shows to remove from Medusa') if not app.showList: return for show in app.showList: if show.status == 'Ended': trakt_id = show.externals.get('trakt_id', None) if not (trakt_id or show.imdb_id): log.info("Unable to check Trakt progress for show '{show}' " 'because Trakt|IMDB ID is missing. Skipping', {'show': show.name}) continue try: trakt_show = tv.TVShow(str(trakt_id or ImdbIdentifier(show.imdb_id).imdb_id)) progress = trakt_show.progress except (TraktException, RequestException, JSONDecodeError) as error: log.info("Unable to check if show '{show}' is ended/completed. 
Error: {error!r}", {
                        'show': show.name,
                        'error': error
                    })
                    continue
                else:
                    if progress and progress.get('aired', True) == progress.get('completed', False):
                        app.show_queue_scheduler.action.removeShow(show, full=True)
                        log.info("Show '{show}' has been queued to be removed from Medusa library", {
                            'show': show.name
                        })

    def sync_trakt_shows(self):
        """Sync Trakt shows watchlist."""
        if not self.show_watchlist:
            log.info('No shows found in your Trakt watchlist. Nothing to sync')
            return

        trakt_default_indexer = int(app.TRAKT_DEFAULT_INDEXER)
        for trakt_show in self.show_watchlist:
            if trakt_show.year and trakt_show.ids['ids']['slug'].endswith(str(trakt_show.year)):
                show_name = f'{trakt_show.title} ({trakt_show.year})'
            else:
                show_name = trakt_show.title

            show = None
            indexer = None
            for i in indexerConfig:
                trakt_indexer = get_trakt_indexer(i)
                indexer_id = trakt_show.ids['ids'].get(trakt_indexer)
                if not indexer_id:
                    continue
                indexer = indexerConfig[i]['id']
                show = show_in_library(i, indexer_id)
                # show = Show.find_by_id(app.showList, indexer, indexer_id)
                if show:
                    break

            if not show:
                # If can't find with available indexers try IMDB
                trakt_indexer = get_trakt_indexer(EXTERNAL_IMDB)
                indexer_id = trakt_show.ids['ids'].get(trakt_indexer)
                show = Show.find_by_id(app.showList, EXTERNAL_IMDB, indexer_id)

            if not show:
                # If can't find with available indexers try TRAKT
                trakt_indexer = get_trakt_indexer(EXTERNAL_TRAKT)
                indexer_id = trakt_show.ids['ids'].get(trakt_indexer)
                show = Show.find_by_id(app.showList, EXTERNAL_TRAKT, indexer_id)

            if show:
                continue

            # If we don't have an indexer id for the trakt default indexer, skip it.
            indexer_id = trakt_show.ids['ids'].get(get_trakt_indexer(trakt_default_indexer))
            if not indexer_id:
                log.info(
                    'Cannot add show {show_name}, as trakt does not have an {indexer} id for this show.',
                    {'show_name': show_name, 'indexer': get_trakt_indexer(trakt_default_indexer)}
                )
                continue

            if int(app.TRAKT_METHOD_ADD) != 2:
                self.add_show(trakt_default_indexer, indexer_id, show_name, SKIPPED)
            else:
                self.add_show(trakt_default_indexer, indexer_id, show_name, WANTED)

            if int(app.TRAKT_METHOD_ADD) == 1 and indexer:
                new_show = Show.find_by_id(app.showList, indexer, indexer_id)
                if new_show:
                    set_episode_to_wanted(new_show, 1, 1)
                else:
                    log.warning('Unable to find the newly added show. '
                                'Pilot will be set to wanted in the next Trakt run')
                    self.todo_wanted.append((indexer, indexer_id, 1, 1))

        log.debug('Synced shows with Trakt watchlist')

    def sync_trakt_episodes(self):
        """Sync Trakt episodes watchlist."""
        if not self.episode_watchlist:
            log.info('No episodes found in your Trakt watchlist. Nothing to sync')
            return

        added_shows = []
        trakt_default_indexer = int(app.TRAKT_DEFAULT_INDEXER)

        for watchlist_item in self.episode_watchlist:
            trakt_show = watchlist_item.show
            trakt_episode = watchlist_item.episode
            trakt_season = watchlist_item.season

            show = None
            for i in indexerConfig:
                trakt_indexer = get_trakt_indexer(i)
                if not trakt_indexer:
                    continue

                indexer_id = trakt_show['ids'].get(trakt_indexer)
                indexer = indexerConfig[i]['id']
                show = Show.find_by_id(app.showList, indexer, indexer_id)
                if show:
                    break

            if not show:
                # If can't find with available indexers try IMDB
                trakt_indexer = get_trakt_indexer(EXTERNAL_IMDB)
                indexer_id = trakt_show['ids'].get(trakt_indexer)
                show = Show.find_by_id(app.showList, EXTERNAL_IMDB, indexer_id)

            if not show:
                # If can't find with available indexers try TRAKT
                trakt_indexer = get_trakt_indexer(EXTERNAL_TRAKT)
                indexer_id = trakt_show['ids'].get(trakt_indexer)
                show = Show.find_by_id(app.showList, EXTERNAL_TRAKT, indexer_id)

            # If can't find show add with default trakt indexer
            if not show:
                trakt_indexer = get_trakt_indexer(trakt_default_indexer)
                indexer_id = trakt_show['ids'].get(trakt_indexer)
                # Only add show if we didn't add it before
                if indexer_id not in added_shows:
                    self.add_show(trakt_default_indexer, indexer_id, trakt_show['title'], SKIPPED)
                    added_shows.append(indexer_id)
            elif not trakt_season == 0 and not show.paused:
                set_episode_to_wanted(show, trakt_season, trakt_episode)

        log.debug('Synced episodes with Trakt watchlist')

    @staticmethod
    def add_show(indexer, indexer_id, show_name, status):
        """Add a new show with default settings."""
        if Show.find_by_id(app.showList, EXTERNAL_IMDB, indexer_id):
            return

        root_dirs = app.ROOT_DIRS
        location = root_dirs[int(root_dirs[0]) + 1] if root_dirs else None

        if location:
            log.info("Adding show '{show}' using indexer: '{indexer_name}' and ID: {id}", {
                'show': show_name,
                'indexer_name': indexerConfig[indexer]['identifier'],
                'id': indexer_id
            })

            allowed, preferred = Quality.split_quality(int(app.QUALITY_DEFAULT))
            quality = {'allowed': allowed, 'preferred': preferred}

            app.show_queue_scheduler.action.addShow(indexer, indexer_id, None,
                                                    default_status=status,
                                                    quality=quality,
                                                    season_folders=int(app.SEASON_FOLDERS_DEFAULT),
                                                    paused=app.TRAKT_START_PAUSED,
                                                    default_status_after=status,
                                                    root_dir=location)
            tries = 0
            while tries < 3:
                if Show.find_by_id(app.showList, indexer, indexer_id):
                    return
                # Wait before show gets added and refreshed
                time.sleep(60)
                tries += 1
            log.warning("Error creating show '{show}'. Please check logs", {
                'show': show_name
            })
            return
        else:
            log.warning("Error creating show '{show}' folder. 
No default root directory", { 'show': show_name }) return def manage_new_show(self, show): """Set episodes to wanted for the recently added show.""" log.debug("Checking for wanted episodes for show '{show}' in Trakt watchlist", {'show': show.name}) episodes = [i for i in self.todo_wanted if i[0] == show.indexer and i[1] == show.indexerid] for episode in episodes: self.todo_wanted.remove(episode) set_episode_to_wanted(show, episode[2], episode[3]) def _check_list(self, show_obj=None, indexer=None, indexer_id=None, season=None, episode=None, list_type=None): """Check if we can find the show in the Trakt watchlist|collection list.""" def match_trakt_by_id(trakt_show, medusa_show): """Try to match the trakt show object to a Medusa show.""" trakt_supported_indexer = get_trakt_indexer(show_obj.indexer) if trakt_supported_indexer and getattr(trakt_show, trakt_supported_indexer) == medusa_show.indexerid: return True # Try to match by imdb_id if getattr(trakt_show, 'imdb') == ImdbIdentifier(medusa_show.imdb_id).imdb_id: return True return False if 'Collection' == list_type: trakt_indexer = get_trakt_indexer(indexer) for collected_show in self.collection_list: if not getattr(collected_show, trakt_indexer) == indexer_id: continue if hasattr(collected_show, 'seasons'): for season_item in collected_show.seasons: for episode_item in season_item.episodes: trakt_season = season_item.number trakt_episode = episode_item.number if trakt_season == season and trakt_episode == episode: return True else: return False elif 'Show' == list_type: for watchlisted_show in self.show_watchlist: if match_trakt_by_id(watchlisted_show, show_obj): return True return False else: trakt_indexer = get_trakt_indexer(indexer) for watchlisted_episode in self.episode_watchlist: if watchlisted_episode.season == season and \ watchlisted_episode.episode == episode and \ watchlisted_episode['ids'].get(trakt_indexer) == indexer_id: return True return False def _get_show_watchlist(self): """Get shows 
watchlist.""" user = get_trakt_user() self.show_watchlist = user.watchlist_shows return self.show_watchlist def _get_episode_watchlist(self): """Get episodes watchlist.""" try: self.episode_watchlist = sync.get_watchlist('episodes') except (TraktException, RequestException, JSONDecodeError) as error: log.info(u'Unable to retrieve episodes from Trakt watchlist. Error: {error!r}', {'error': error}) return False return True def _get_show_collection(self): """Get show collection.""" try: self.collection_list = sync.get_collection('shows') except (TraktException, RequestException, JSONDecodeError) as error: log.info('Unable to retrieve shows from Trakt collection. Error: {error!r}', {'error': error}) return False return True @staticmethod def trakt_bulk_data_generate(trakt_data): """Build the JSON structure to send back to Trakt.""" unique_shows = {} unique_seasons = {} for indexer_id, indexer, show_name, start_year, season, episode in trakt_data: if indexer_id not in unique_shows: unique_shows[indexer_id] = {'title': show_name, 'year': start_year, 'ids': {}, 'seasons': []} unique_shows[indexer_id]['ids'][get_trakt_indexer(indexer)] = indexer_id unique_seasons[indexer_id] = [] # Get the unique seasons per Show for indexer_id, indexer, show_name, start_year, season, episode in trakt_data: if season not in unique_seasons[indexer_id]: unique_seasons[indexer_id].append(season) # build the query show_list = [] seasons_list = {} for searched_show in unique_shows: show = [] seasons_list[searched_show] = [] for searched_season in unique_seasons[searched_show]: episodes_list = [] for indexer_id, indexer, show_name, start_year, season, episode in trakt_data: if season == searched_season and indexer_id == searched_show: episodes_list.append({'number': episode}) show = unique_shows[searched_show] show['seasons'].append({'number': searched_season, 'episodes': episodes_list}) if show: show_list.append(show) post_data = {'shows': show_list} return post_data ```
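The `trakt_bulk_data_generate` helper above folds flat (show, season, episode) rows into the nested shows → seasons → episodes structure that Trakt's sync endpoints expect. A minimal standalone sketch of the same grouping, using hypothetical sample rows and a plain `bulk_data_generate` function name (not part of Medusa's API):

```python
# Sketch of the grouping performed by trakt_bulk_data_generate: flat
# (indexer_id, indexer_name, title, year, season, episode) rows are folded
# into Trakt's nested shows -> seasons -> episodes payload.
# The sample rows and indexer name below are illustrative only.

def bulk_data_generate(rows):
    shows = {}
    for indexer_id, indexer_name, title, year, season, episode in rows:
        # One entry per show, keyed by its indexer id
        show = shows.setdefault(indexer_id, {
            'title': title, 'year': year,
            'ids': {indexer_name: indexer_id}, 'seasons': []
        })
        # Find the season entry, or create it if this is the first episode seen
        for season_entry in show['seasons']:
            if season_entry['number'] == season:
                break
        else:
            season_entry = {'number': season, 'episodes': []}
            show['seasons'].append(season_entry)
        season_entry['episodes'].append({'number': episode})
    return {'shows': list(shows.values())}

rows = [
    (301824, 'tvdb', 'Example Show', 2015, 1, 1),
    (301824, 'tvdb', 'Example Show', 2015, 1, 2),
    (301824, 'tvdb', 'Example Show', 2015, 2, 1),
]
payload = bulk_data_generate(rows)
```

Unlike the original, which makes three passes over the input (unique shows, unique seasons, then episodes), this sketch builds the same shape in a single pass.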
```java /* * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER. * * This code is free software; you can redistribute it and/or modify it * published by the Free Software Foundation. * * This code is distributed in the hope that it will be useful, but WITHOUT * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or * version 2 for more details (a copy is included in the LICENSE file that * accompanied this code). * * 2 along with this work; if not, write to the Free Software Foundation, * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA. * * Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA * or visit www.oracle.com if you need additional information or have any * questions. */ package org.graalvm.visualizer.data; import java.lang.annotation.Retention; import java.lang.annotation.RetentionPolicy; /** * Used to suppress <a href="path_to_url">FindBugs</a> warnings. */ @Retention(RetentionPolicy.CLASS) public @interface SuppressFBWarnings { /** * The set of FindBugs * <a href="path_to_url">warnings</a> that are to be * suppressed in annotated element. The value can be a bug category, kind or pattern. */ String[] value(); /** * Reason why the warning is suppressed. */ String justification(); } ```
Franklin is a rural town in Macon County, Alabama, United States. As of the 2020 census, the population was 590. History and educational legacy The Muscogee (Creek) people had long been cultivating lands in this area, producing crops of maize, squash and beans (the Three Sisters), and tobacco, used primarily for ritual purposes. Osceola (1804-1838), who became well known as a leader of the Seminole people in Florida, was born to a Creek woman at Red Creek, 10 miles from the Tallapoosa River. He was of mixed race but identified as Creek; the people have a matrilineal kinship system. Franklin has been home to many churches for more than 200 years. In the late 18th and early 19th centuries, a Methodist Missionary Church operated here for the Creek. It had two cemeteries, one for whites and one for the Creek. James McQueen, a Scots trader who lived here and married a Creek woman, was great-grandfather of Osceola. McQueen is buried in the Indian cemetery. After the Creek were forced to cede their lands, European Americans developed the area for cotton cultivation. They depended on the labor of enslaved African Americans, many of whom were initially transported to this region from the Upper South in the domestic slave trade. Cotton continued as the chief commodity crop after the Civil War. Residents established Franklin School by the 1890s, teaching grades 1–11 of white students. By the mid-1930s, the upper grades had been moved to another facility, and it held grades 1–6. Northern and southern classrooms were adjoined by a common auditorium. The school's original water source was a spring near the buildings. A well was later dug in the front yard of the school, with a hand pump to get water. Heat was provided by a wood-burning potbelly stove. Each student brought a stick of wood every morning to burn in the stove. The school closed in 1942, and its 75–80 students transferred to Tuskegee schools. 
After the school closed, the northern classroom was moved to its current location and converted to a community center. The rest of the school was torn down. In the mid-20th century, musician Hank Williams Sr. often performed at dances at the community center. Upon Franklin's incorporation in 1977, the town began using the community center building as the town hall. A mile north of Town Hall lie the remnants of what is rumored to be the first school in Macon County. Harris Barrett School was built in 1903 with handmade bricks made by students of the Tuskegee Normal School (now Tuskegee University), under the direction of Booker T. Washington. In the segregated system of public facilities, the Barrett School was reserved for African-American students, who were mostly descendants of freedmen in this rural area. Both the Barrett School and the Tuskegee Institute played a major role in education in the Franklin community. They operated an experimental farm on the west side of Baldwin Farm Road. Booker T. Washington and George Washington Carver were both active in farming in Franklin, and assisted farmers both black and white. Harris Barrett School was restored and is operated as a historic museum; it is located at the corner of Co. Rd. 27 and 36. Geography Franklin is located at (32.455388, -85.802884). According to the U.S. Census Bureau, the town has a total area of , all land. Demographics 2000 census As of the census of 2000, there were 145 people, 59 households, and 44 families residing in the town. The population density was . There were 73 housing units at an average density of . The racial makeup of the town was 56.38% Black or African American and 43.62% White. 0.67% of the population were Hispanic or Latino of any race. There were 59 households, of which 28.1% had children under the age of 18 living with them, 51.6% were married couples living together, 15.6% had a female householder with no husband present, and 25.0% were non-families. 
21.9% of all households were made up of individuals, and 10.9% had someone living alone who was 65 years of age or older. The average household size was 2.33 and the average family size was 2.69. In the town, the population was spread out, with 20.1% under the age of 18, 10.7% from 18 to 24, 18.1% from 25 to 44, 33.6% from 45 to 64, and 17.4% who were 65 years of age or older. The median age was 47 years. For every 100 females, there were 96.1 males. For every 100 females age 18 and over, there were 88.9 males. The median income for a household in the town was $45,923, and the median income for a family was $53,111. Males had a median income of $43,840 versus $40,744 for females. The per capita income for the town was $45,495. 4.9% of the population were living below the poverty line. 9.7% of those were over the age of 64. Government The Town of Franklin has a mayor-council form of government, with a mayor and five councilpersons who are elected every four years. The town operates its own police department and volunteer fire department, as well as a water system. Mayor Henry Peavy, Mayor Pro-Tem David Clinkscales, Council Members Alvin Sears, Memphis Boston, Rheba Knoxx, Robert T. Perry Town Clerk/Treasurer Micha Segrest Chief of Police James Chris Johnson, Jr. Fire Chief Scott Cooper References Towns in Macon County, Alabama Towns in Alabama Columbus metropolitan area, Georgia
George John Bailey (born 7 September 1982) is a former Australian cricketer, who played all formats for the national team and captained the team in limited-over formats. Domestically, Bailey played for the Tasmanian cricket team in all three domestic state competitions (the Sheffield Shield, One-Day Cup and KFC Twenty20 Big Bash) as well as the Hobart Hurricanes and Melbourne Stars in the Twenty20 Big Bash's successor, the KFC Big Bash League. He has also played in the Indian Premier League and T20 Blast, and in Scotland with Grange Cricket Club. Bailey was a member of the Australian team that won the 2015 Cricket World Cup. Bailey was appointed as Twenty20 captain of the Australian national team in 2012, succeeding Cameron White prior to the two match series against India that ended 1–1. He became the second ever Australian to captain an international game without having played an international game before, after Dave Gregory in the first ever Test match. On 1 May 2013, Bailey was appointed the vice-captain of the Australian ODI team for the 2013 ICC Champions Trophy. He captained the Australian team in India in ODI in the absence of Michael Clarke. In November 2013, Bailey was named in the Australian team for the 2013–14 Ashes series against England. He played all five matches of the series, but was subsequently dropped from the Test team. In the 2017–18 season, Bailey won his first Ricky Ponting Medal for Tasmania's best player in the previous season. He was appointed as the chief selector of Cricket Australia in August 2021. Early life and education Bailey is the great-great-grandson of George Herbert Bailey, who represented Tasmania in 15 first-class matches, and the great-grandson of Keith Bailey, who represented Tasmania in two first-class matches. He was born and raised in Launceston, Tasmania. He attended the Launceston Church Grammar School, where he was school captain and graduated in 2000. 
He then studied business at the University of Tasmania, and resided at Jane Franklin Hall. Bailey graduated with a Graduate Certificate of Management in 2016, and is currently completing a Master of Business Administration degree at the University. Domestic and T20 franchise career A destructive striker who can change a match within a few overs, Bailey arrived as a state one-day player at the age of 19 after playing his junior cricket with the South Launceston Cricket Club. Bailey was first selected to play for Tasmania in 2005/06, due to injuries to regular players, and he was given an extended stint in the first-class team, scoring 778 Pura Cup runs, including three centuries, and earning a second invitation to the Academy. Talk of the state leadership and possible national team representation began that summer, after his highest score of 155 against South Australia, an innings that formed part of a state-record fourth-wicket partnership of 292 with Travis Birt. Another highlight came shortly before the 2006/07 season, when he bludgeoned 136 from 65 balls for the Academy against a Zimbabwe Board XI. Bailey is a former national under-19 player. Further prominent performances in the coming seasons saw Bailey play for Australia against the All Star team in the All Star Twenty20 match in 2009. Bailey was appointed as the permanent captain of Tasmania for the 2009/10 season, replacing Daniel Marsh. In February 2011, Bailey led Tasmania to a five-wicket Sheffield Shield win over Victoria where he scored an unbeaten 160. Needing 130 in the final session, he and James Faulkner pushed the Tigers past the total in the 91st over of play on the final day to lift Tasmania to second on the table behind New South Wales. He captained Tasmania to its second Sheffield Shield title against New South Wales at Bellerive Oval in 2010/11. In the 2011/12 Ryobi Cup final in Adelaide, Bailey showed he was made of stern stuff; he scored 101 and was out in the last over. 
But although Tasmania tied with South Australia, they lost the title because South Australia finished top of the ladder that season. In 2012, he was signed by the Melbourne Stars for the first season of the Big Bash League. Bailey scored 114 runs at an average of 19 for the Melbourne Stars in the Big Bash League. In 2016, Bailey was signed by the Rising Pune Supergiants, a new Indian Premier League team, as a replacement for Faf du Plessis, who was ruled out due to a finger injury. This was Bailey's third IPL team after Chennai Super Kings and Kings XI Punjab, where he had been captain for the last two years before the franchise decided to release him; he found no buyer at the 2016 auction. In June 2019, he was selected to play for the Montreal Tigers franchise team in the 2019 Global T20 Canada tournament. International career In early 2010, Bailey was called up for the ODIs in New Zealand when Michael Clarke returned home for personal reasons – but did not win a cap. He subsequently had to wait until 2012 to make his international debut. When he did, he did so as captain of the Australian national Twenty20 team, succeeding Cameron White prior to the two-match series against India, which ended 1–1. He became the second ever Australian to captain an international game, without having played an international game before, after Dave Gregory in the first ever Test match. When he walked out as leader for the T20 at Sydney's Stadium Australia and the match at the MCG, Bailey was in charge of a new-look side. The fast-bowling allrounder James Faulkner was on his debut, the batsman Travis Birt had earned a recall nearly two years after his last international appearance and Brad Hogg had returned after retiring in 2008. There had been criticism over his appointment. 
Bailey's highest score in the shortest format at the time of his appointment had been 60 and he had made only one T20 half-century in the previous three seasons, but he said that, batting at No. 5, opportunities were often limited. He led Australia to victory in his first game, presiding over a 31-run defeat of India after promoting Matthew Wade to the opener's post, where Wade scored 72. In the second match of the series Australia lost and the series was levelled. He made some hasty decisions, such as sending Matthew Wade in at No. 6 given the fact that in the first match he had opened and made 72. He still contributed with 32 and his opening replacement Aaron Finch top-scored with 36, but Shaun Marsh at first drop failed to score and the batting was weakened with the allrounders Daniel Christian and James Faulkner left out. He was named in the ODI squad for the West Indies tour. As a result of an injury to Australian captain Michael Clarke, he was included in the first ODI of the series and Bailey top scored with 48 runs in Australia's 204 for 8. He was the third highest run scorer in the series with 172 runs behind Kieron Pollard and Michael Hussey. He also scored his first half-century. During a T20 series against West Indies, he scored 45 runs in two matches. Like the India series, the West Indies series ended in a tie. Australia won the first match by 8 wickets but lost the second match by 14 runs. At the start of the Ireland and England tour it was decided that Bailey would not receive a central contract for the 2012/13 season, despite being captain of the national T20I team. In the fifth ODI Bailey produced some excellent cricket, rounding off his efforts with a 46 from 41 balls to ensure that Australia posted a troubling total for England. He was named as the ODI captain of the team in the absence of captain Michael Clarke and vice-captain Shane Watson. He led three ODIs in the five-match series, winning two and losing one. 
He made an impressive 89 during the win at the MCG to give a good first impression as captain. Bailey scored his maiden ODI century in a match against the West Indies. Australia had been in a difficult situation at 56/4 when Bailey came in, but his 125 not out from 110 balls took Australia to a challenging total of 266. During the 2013 ODI series against India, Bailey scored a total of 478 runs, setting a record for the most runs by any batsman in a bilateral series. With one match remaining, he had broken the previous record of 467 set by Zimbabwe's Hamilton Masakadza in a five-match series against Kenya in 2009, which in turn was broken by India's Rohit Sharma in the same series. In the sixth match of the series, Bailey posted an innings of 156, and in doing so went past 1,500 ODI runs in only his 32nd innings. Only Hashim Amla has done it faster, in 30 innings. Bailey became only the ninth Australian to score 1000 or more ODI runs in a calendar year. He was also named in the ODI XI by Cricinfo for his performances in 2013. In November 2013, Bailey became Australia's 436th Test cap in the Ashes and was presented with the baggy green before the start of play by former captain Mark Taylor. In December 2013, Bailey hit 28 runs off an over from James Anderson in the Third Ashes Test at the WACA Ground, equalling Brian Lara's record for the most runs off an over in Test cricket. He was also awarded the Men's ODI Player of the Year at the Allan Border Medal ceremony by the CA in 2014. On 7 September 2014, George Bailey resigned as the captain of the Australian T20I team to focus solely on the 2015 ODI World Cup. In August 2017, he was named in a World XI side to play three Twenty20 International matches against Pakistan in the 2017 Independence Cup in Lahore. In the third match of the series, Bailey undertook wicketkeeping duties. Channel Nine controversy During the 2012–13 summer, George Bailey led a one-day Australian team lacking draw-cards David Warner and Shane Watson. 
This led to criticism from Channel Nine, who broadcast the game. Bailey defended the side at a press conference, saying Channel Nine were motivated in part by a desire to talk down the game and thus pay a cheaper price for the TV rights: I can probably understand it coming from Channel Nine. I think they're about to go into negotiations for the TV rights. I think that was a pretty tactical move to try to talk down one-day cricket and what the Australian team's putting out. But it's still called the Australian cricket team. Channel Nine's executive director of cricket, and former NSW player Brad McNamara, angrily denied this: Nowhere has Channel Nine ever talked the one-day game down, nowhere have we ever said this is a 'B team'. It's rubbish and George should stick to playing cricket and leave (television) rights to the people who know what they're talking about. I reckon he's got his hands full as it is. He needs to concentrate on staying in the side. And he needs to understand where his money's coming from. Without the TV rights deal, George is probably working in a coalmine or flipping burgers at McDonald's. Cricket writer Jarrod Kimber later argued that this caused a permanent schism between Bailey and Channel Nine. He says it was brought to a head during the 2013–14 Ashes test in Sydney, when Bailey failed to make a fifty in two attempts: It seemed that no one in the Channel Nine box could make a comment about him that wasn't negative. His feet, hands, technique and temperament were questioned. His second-innings 46 was not enough. And they weren't always wrong. It just seemed kind of mean. Especially when at the back of the press box some seemed happy when he was out. But it went deeper than McNamara's comments. Bailey had made mistakes in his career. He hadn't made enough first-class runs. He hadn't come into the team as a young man. He came into the captaincy without playing a game. He came into the Test team because of one-day runs. 
He was everything old-school cricket didn't like. A thinking cricketer who had never demanded inclusion, but who had been included regardless. For old-school types like Ian Chappell, he was pretty much everything he didn't like. And Chappell wasn't just turning on Bailey because of his stoush with Channel Nine. He had not liked Bailey for a long time. In 2021 George Bailey became chief selector for the Australian men's international cricket team. References External links George Bailey's Official Website 1982 births Living people Australia One Day International cricketers Australia Test cricketers Australia Twenty20 International cricketers Australian cricket captains Australian cricketers Chennai Super Kings cricketers Cricketers at the 2015 Cricket World Cup Cricketers from Launceston, Tasmania Hampshire cricketers Hobart Hurricanes cricketers Melbourne Stars cricketers Middlesex cricketers Punjab Kings cricketers Rising Pune Supergiant cricketers Scotland cricketers Sussex cricketers Tasmania cricketers World XI Twenty20 International cricketers Australia national cricket team selectors
A Time to Remember is the 2009 double album recording of the show of the same name by The Dubliners, recorded in Vienna. First performed in Vicar Street, Dublin on 4 July 2009 and later taken on tour around Europe, it was conceived as a tribute to the group's deceased members. The show features the group playing along live to video and audio performances featuring former members Ciarán Bourke, Luke Kelly and Ronnie Drew, as well as performances from The Dubliners' then-current lineup. This is the last Dubliners release featuring Barney McKenna, who died in 2012. It is also the last album to feature a founding member, as none of the band's founding members are still alive following McKenna's death. Track list Disc 1: "Introduction" "Fermoy Lassies/Sporting Paddy" "The Banks Of The Roses" "The Ferryman" "Three Score And Ten" "The Belfast Hornpipe/The Swallow’s Tail" "For What Died The Sons Of Róisín" "Maids When You’re Young" "The Nightingale" "Luke’s Gravestone" "Kelly the Boy from Killanne" "The Black Velvet Band" "The Town I Loved So Well" "Cooley’s Reel/The Dawn/The Mullingar Races" "The Auld Triangle" Disc 2: "All For Me Grog" "Remembering Ciarán" "Preab San Ól" "Peggy Lettermore" "St. Patrick’s Cathedral" "I Wish I Had Someone To Love Me" "Ronnie’s Heaven" "McAlpine's Fusiliers" "Fáinne Geal An Lae" "Finnegan's Wake" "The Marino Waltz" "Dirty Old Town" "Whiskey In The Jar" "The Wild Rover" "Molly Malone" The Dubliners live albums 2009 live albums
Clarence Leonard "Sal" Walker (13 December 1898 – 30 April 1957) was a South African bantamweight professional boxer who competed in the early 1920s. He won the gold medal at the 1920 Summer Olympics, defeating Chris Graham in the final. He was born in Port Elizabeth, and died in Roodepoort, Gauteng. His paternal grandfather was from Scotland. Olympic results Defeated Alfons Bouwens (Belgium) Defeated Edwart Hartman (United States) Defeated George McKenzie (Great Britain) Defeated Chris Graham (Canada) References External links 1898 births 1957 deaths Sportspeople from Port Elizabeth Cape Colony people Bantamweight boxers Olympic boxers for South Africa Olympic gold medalists for South Africa Boxers at the 1920 Summer Olympics Olympic medalists in boxing Medalists at the 1920 Summer Olympics South African male boxers South African people of Scottish descent White South African people Cape Colony sportspeople
LB, lb or lb. may refer to: Businesses and organizations L Brands, an American clothing retailer Lane Bryant, a plus-size clothing retailer Laurier Brantford, a satellite campus of Wilfrid Laurier University in Brantford, Ontario, Canada Movement for Unification (), a nationalist Albanian political party in Kosovo Ljubljana Bank (), a bank named after and based in Ljubljana, Slovenia that operated in SFR Yugoslavia airline (IATA code) Left Bank (online edition), a Ukrainian online newspaper Places Labrador (former postal abbreviation) Lebanon (ISO 3166-1 alpha-2 country code) Long Beach, California Los Baños, Laguna (an abbreviation commonly used to address the town of Los Baños) Science and technology Mathematics and computing .lb, the Internet country code top-level domain (ccTLD) for Lebanon Lattice Boltzmann methods, a class of computational fluid dynamics (CFD) methods for fluid simulation Liberty BASIC, a programming language Binary logarithm Lower bound, a mathematical concept in order theory Units of measurement Pound (mass), abbreviation derived from Latin libra Pound-force Other uses in science and technology "L" shaped electrical conduit body with the outlet in the Back ("LB") Lysogeny broth (also known as Luria or Luria-Bertani broth), a microbial growth medium Sport Left back, a defensive position in association football Linebacker, a position in American and Canadian football Other uses LB (car ferries), one of several ferries on the HH Ferry route between Elsinore, Denmark and Helsingborg, Sweden Luxembourgish language (ISO 639 alpha-2 code) Letterboxing (filming)
The Thumb SC is a bass guitar manufactured by the Warwick company. It is the first Warwick bass with the single-cut design. History Following the increasing popularity of the "Single Cut" bass design, Warwick decided to keep up with the other companies and build its own single-cut bass. The idea originated on the official Warwick forum, where one user posted a mockup of a single-cut Thumb that the other members found promising. This prompted Florin Barbu, the forum host, to contact Hans-Peter Wilfer to see whether the idea could be realized. Wilfer suggested organizing a forum contest in which users would design a "Warwick-like" single-cut bass using modern Warwick specifications. Run in a tournament-like fashion, the contest began in January 2008. On 7 July 2009, Barbu announced that the users Zsolt Ferenczi and Laszlo Demeter were the winning team, and their model became a real bass in the Warwick line-up. The Thumb SC was officially presented at NAMM 2010. Concept The Thumb SC's shape is based on the original design of the Thumb NT. The woods it is constructed from, however, set it apart from the Thumb NT: it has a US Swamp Ash body with a Bubinga Pommelé top, a Tigerstripe Ebony fingerboard, and a Flame Maple neck. Hardware/pickups & Electronics The Thumb SC uses the standard Warwick hardware in black. Like the 6-string Thumb NT, the Thumb SC has two active MEC humbuckers nestled near the bridge and a 3-band active EQ. Unlike the Thumb NT, however, it adds two mini toggle switches for coil splitting. References Official Warwick Site Thumb SC announcement Single Cut Contest Electric bass guitars
```c++
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.

#ifndef libXCB_hpp
#define libXCB_hpp

#include <xcb/shm.h>
#include <xcb/xcb.h>

struct LibXcbExports
{
	LibXcbExports() {}
	LibXcbExports(void *libxcb, void *libshm);

	// All members default to nullptr so a failed symbol lookup is detectable.
	xcb_void_cookie_t (*xcb_create_gc)(xcb_connection_t *c, xcb_gcontext_t cid, xcb_drawable_t drawable, uint32_t value_mask, const void *value_list) = nullptr;
	int (*xcb_flush)(xcb_connection_t *c) = nullptr;
	xcb_void_cookie_t (*xcb_free_gc)(xcb_connection_t *c, xcb_gcontext_t gc) = nullptr;
	uint32_t (*xcb_generate_id)(xcb_connection_t *c) = nullptr;
	xcb_get_geometry_cookie_t (*xcb_get_geometry)(xcb_connection_t *c, xcb_drawable_t drawable) = nullptr;
	xcb_get_geometry_reply_t *(*xcb_get_geometry_reply)(xcb_connection_t *c, xcb_get_geometry_cookie_t cookie, xcb_generic_error_t **e) = nullptr;
	xcb_void_cookie_t (*xcb_put_image)(xcb_connection_t *c, uint8_t format, xcb_drawable_t drawable, xcb_gcontext_t gc, uint16_t width, uint16_t height, int16_t dst_x, int16_t dst_y, uint8_t left_pad, uint8_t depth, uint32_t data_len, const uint8_t *data) = nullptr;
	xcb_void_cookie_t (*xcb_copy_area)(xcb_connection_t *conn, xcb_drawable_t src_drawable, xcb_drawable_t dst_drawable, xcb_gcontext_t gc, int16_t src_x, int16_t src_y, int16_t dst_x, int16_t dst_y, uint16_t width, uint16_t height) = nullptr;
	xcb_void_cookie_t (*xcb_free_pixmap)(xcb_connection_t *conn, xcb_pixmap_t pixmap) = nullptr;
	xcb_query_extension_reply_t *(*xcb_get_extension_data)(xcb_connection_t *c, xcb_extension_t *extension) = nullptr;
	int (*xcb_connection_has_error)(xcb_connection_t *c) = nullptr;
	uint32_t (*xcb_get_maximum_request_length)(xcb_connection_t *c) = nullptr;
	xcb_shm_query_version_cookie_t (*xcb_shm_query_version)(xcb_connection_t *c) = nullptr;
	xcb_shm_query_version_reply_t *(*xcb_shm_query_version_reply)(xcb_connection_t *c, xcb_shm_query_version_cookie_t cookie, xcb_generic_error_t **e) = nullptr;
	xcb_void_cookie_t (*xcb_shm_attach)(xcb_connection_t *c, xcb_shm_seg_t shmseg, uint32_t shmid, uint8_t read_only) = nullptr;
	xcb_void_cookie_t (*xcb_shm_detach)(xcb_connection_t *c, xcb_shm_seg_t shmseg) = nullptr;
	xcb_void_cookie_t (*xcb_shm_create_pixmap)(xcb_connection_t *c, xcb_pixmap_t pid, xcb_drawable_t drawable, uint16_t width, uint16_t height, uint8_t depth, xcb_shm_seg_t shmseg, uint32_t offset) = nullptr;

	xcb_extension_t *xcb_shm_id = nullptr;
};

class LibXCB
{
public:
	bool isPresent()
	{
		return loadExports() != nullptr;
	}

	LibXcbExports *operator->();

private:
	LibXcbExports *loadExports();
};

extern LibXCB libXCB;

#endif  // libXCB_hpp
```
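A `loadExports()` implementation for a struct like `LibXcbExports` typically resolves each function pointer at runtime with `dlopen`/`dlsym`. Below is a hedged sketch of that pattern, not the actual SwiftShader/libXCB code: the names `MathExports` and `load_math` are invented for this example, and libm stands in for `libxcb.so.1` so the sketch runs without an X server.

```cpp
#include <dlfcn.h>

// Invented illustration: a struct of function pointers populated by dlsym
// lookups on a dlopen handle, mirroring how a LibXcbExports-style loader
// presumably fills in its members.
struct MathExports {
    double (*cos_fn)(double) = nullptr;  // stays nullptr if the lookup fails
    double (*sin_fn)(double) = nullptr;
};

inline MathExports load_math() {
    MathExports exports;
    void *lib = dlopen("libm.so.6", RTLD_NOW);  // analogous to loading libxcb.so.1
    if (!lib) {
        return exports;  // an isPresent()-style check would report false here
    }
    exports.cos_fn = reinterpret_cast<double (*)(double)>(dlsym(lib, "cos"));
    exports.sin_fn = reinterpret_cast<double (*)(double)>(dlsym(lib, "sin"));
    return exports;
}
```

Callers check each pointer against `nullptr` before use, which is why the header defaults every member to `nullptr`: a partially loaded library degrades gracefully instead of crashing on an unresolved symbol.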
Avraham Barkai (1921 in Berlin – 29 February 2020 in Kibbutz Lehavot HaBashan) was a German-born Israeli historian and researcher of antisemitism. He died at age 99 on 29 February 2020 in Lehavot HaBashan. Publications Barkai, Avraham. From Boycott to Annihilation: The Economic Struggle of German Jews, 1933-1943. The Tauber Institute for the Study of European Jewry series, 11. Hanover NH: Published for Brandeis University Press by University Press of New England, 1989. (held in over 400 US libraries according to WorldCat) translation of his Vom Boykott zur "Entjudung". Barkai, Avraham. Nazi Economics: Ideology, Theory, and Policy. New Haven: Yale University Press, 1990. (held in over 500 US libraries according to WorldCat) (translation of his Wirtschaftssystem des Nationalsozialismus.) Barkai, Avraham. Branching Out: German-Jewish Immigration to the United States, 1820-1914. Ellis Island series. New York: Holmes & Meier, 1994. (held in over 350 US libraries according to WorldCat) Barkai, Avraham, and Schoschanna Barkai-Lasker. Jüdische Minderheit und Industrialisierung: Demographie, Berufe, und Einkommem der Juden in Westdeutschland 1850-1914. Schriftenreihe wissenschaftlicher Abhandlungen des Leo Baeck Instituts, Bd. 46. Tübingen: J.S.B. Mohr (P. Siebeck), 1988. Barkai, Avraham. "Wehr dich!": der Centralverein deutscher Staatsbürger jüdischen Glaubens (C.V.) 1893-1938. München: Beck, 2002. Barkai, Avraham. Hoffnung und Untergang: Studien zur deutsch-jüdischen Geschichte des 19. und 20. Jahrhunderts. Hamburger Beiträge zur Sozial- und Zeitgeschichte, Bd. 36. Hamburg: Christians, 1998. Barkai, Avraham. "Das Wirtschaftssystem des Nationalsozialismus: der historische und ideologische Hintergrund, 1933-1936". Koln: Verlag Wissenschaft und Politik, c1977. Barkai, Avraham. Vom Boykott zur "Entjudung": der wirtschaftliche Existenzkampf der Juden im Dritten Reich, 1933-1943. Frankfurt am Main : Fischer, 1988. Barkai, Avraham. 
Oscar Wassermann und die Deutsche Bank : Bankier in schwierigen Zeiten. München: Beck, c2005. (cl.) References Israeli historians Jewish historians German emigrants to Israel Writers from Berlin 1921 births 2020 deaths
Merchants Trust () is a large British investment trust dedicated to investments in higher yielding FTSE 100 companies. Established in 1889 by Robin Benson, the company is listed on the London Stock Exchange and is a constituent of the FTSE 250 Index. The chairman is Simon Fraser. References External links Merchants Trust website Investment trusts of the United Kingdom
Hasan (Hassan) Hourani (, 1974 – August 6, 2003) was a Palestinian artist, born in Hebron, Palestine. He attended the College of Fine Art in Baghdad, Iraq from 1993-97. In 2001 he arrived in New York City and presented his one-man show "One Day, One Night" in the UN building. He then studied at the Art Students League of New York and continued to live in the city for several years. His work has been exhibited in Palestine, Iraq, Egypt, Jordan, South Korea, New York and Houston. In 2003, he returned home for a visit. Like nearly all West Bank Palestinians, he had been barred from traveling across the Green Line to see the sea for many years; but during this trip home, he was able to visit the Mediterranean shore. On August 6, 2003 he went swimming with his young nephew Samer Abu Ajamieh and their girlfriends from Ramallah, and both drowned near the Port of Jaffa. Hassan Everywhere At the time of his death, Hourani had completed only 10 of the 40 drawings that make up his whimsical children's book, Hassan Everywhere, in which the character Rihan roams the world in search of the rose of love. That year, his drawings were exhibited at Al-Hoash's grand opening in Jerusalem, and the next year A. M. Qattan Foundation, a Palestinian cultural foundation, established the Hassan Hourani Young Artist of the Year Award. In 2004 the Qattan Foundation compiled his completed stories and half-rendered drawings, and published Hassan Everywhere. Hassan's friends are the birds, bees, fishes in the sea and fearful beasts, as he journeys alone but makes his home everywhere. The Paltel Virtual Gallery of Birzeit University writes that Hassan embodies the local as well as the world traveller, from ancient Egypt to the rooftops of New York: "Hassan rides the waves, is fed by the birds, flies on his magic bicycle, sits on the rooftops, always looking on to see the panorama of the world. 
Finally, the freedom of flight and travel of Hassan in 'Hassan Everywhere' carries particular resonance in the context of the confinement of Palestinians for whom such freedom is a dream." Dorit Rabinyan Dorit Rabinyan's novel Gader Haya (גדר חיה, "Hedgerow"; published in English as All the Rivers), 2014, is dedicated to her former lover, the artist Hassan Hourani, who drowned in 2003. She wrote a poignant farewell to him in The Guardian. References External links International artist database Brief Bio Birzeit university catalog of Hassan Everywhere drawings Artist authors 1974 births 2003 deaths Deaths by drowning People from Hebron Accidental deaths in Israel Palestinian children's writers Palestinian contemporary artists
Matheu is a town in the Escobar Partido of the Buenos Aires Province, Argentina. Escobar Partido Populated places in Buenos Aires Province
The Saltcellar with Portuguese Figures is a salt cellar in carved ivory, made in the Kingdom of Benin in West Africa in the 16th century for the European market. It is attributed to an unknown master or workshop who has been given the name Master of the Heraldic Ship by art historians. It depicts four Portuguese figures: two of higher class and two others who are possibly guards protecting them. In the 16th century Portuguese visitors ordered ivory salt cellars and ivory spoons like this one; this Afro-Portuguese salt cellar in particular was carved in the style of a Benin court ivory, comparable to the famous Benin bronzes and Benin ivory masks. Ivories of this kind were commissioned and exported initially from Sierra Leone and later from Benin City, Nigeria. During the age of exploration, European powers expanded their trade and worked to establish trade posts in the New World, Africa, the Middle East and Asia. Portuguese sailors disembarked from their caravels to buy goods for trading, such as ivory and gold. These goods moved from local markets to colonial outposts, on to Portugal, and then into European markets. During the 16th and 17th centuries, the countries that participated in colonialism reaped the economic benefits of this international trade. The salt cellar was probably carved for a Portuguese nobleman to place on his dining table. It is one of four almost identical pieces, probably made as a set; the other three are now in European museums. Ivory salt cellars and ivory spoons like the Sapi-Portuguese Ivory Spoon, also in the Metropolitan, were common pieces of art that Portuguese sailors brought back from West Africa. There are no records of the order for this commission, but it is believed that a Benin ivory carver produced it in the Benin Kingdom, in modern-day Nigeria. Description The figures, in high relief, form a circle around the shaft of the elephant tusk, supporting the bowl at the top used to hold the salt.
The amount and type of decoration indicates that this piece was created at a Benin court. Two of the four male figures are clearly of a higher rank, probably from a higher class: they are more elaborately carved and shown frontally, while the other two have less ornament and are shown in profile. The men on the front and on the back are dressed in elaborate clothes with cross necklaces, showing they are European Christians; in addition they wear hats and hold spears in their left hands. The style used to carve the ivory piece may be intended to be somewhat grotesque. In Afro-Portuguese ivories there are three elements considered fundamental to African art: a focus on the human figure, an enunciation of the parts, and a preference for pure geometric forms. Individuals are presented as the main subject in African art, usually depicting an important figure such as royalty or a deity; this is shown in the ivory salt cellar and in other Benin bronzes. Each man's face, with its long beard and deep-set eyes, is larger in relation to his body, while the overall proportions remain coherent. The geometric patterning of the men's clothing is repeated in the sockets of the spears. Background The kingdom of Benin existed in the southwestern region of Nigeria, in the modern Edo state. According to scholars, the kingdom of Benin (also known as the Edo Kingdom, or the Benin Empire) originated around the year 900 under the Ogiso kings; between the eleventh and thirteenth centuries a member of the Oba dynasty took control of the state. This dynasty ruled until the British occupied the kingdom of Benin on 9 February 1897. The kingdom reached its peak under Ewuare the Great, who ruled from 1440 to 1473, expanded its borders, and introduced wood and ivory carving to the kingdom.
One of the first recorded visits to Benin City was made by the Portuguese explorer João Afonso de Aveiro in 1486. After contact with the Portuguese, the Benin Kingdom established a strong mercantile relationship with Portugal and later with other European states. They traded slaves and Beninese products such as ivory, pepper, gold and palm oil for European goods such as manillas, metals and guns. The two states also established diplomatic relations in the late 15th century: the Oba sent an ambassador to Lisbon, and the king of Portugal sent Christian missionaries to Benin City in 1486. References Ivory works of art Benin art Salts
```go
package repositories

import (
	"context"
	"fmt"
	"os"

	"github.com/pkg/errors"
	"github.com/rockbears/log"

	"github.com/ovh/cds/sdk"
	cdslog "github.com/ovh/cds/sdk/log"
)

var vcsPublicKeys map[string][]sdk.Key

func (s *Service) processCheckout(ctx context.Context, op *sdk.Operation) error {
	gitRepo, _, currentBranch, err := s.processGitClone(ctx, op)
	if err != nil {
		return sdk.WrapError(err, "unable to process gitclone")
	}
	log.Debug(ctx, "processCheckout> repo cloned with current branch: %s", currentBranch)

	// Clean up uncommitted changes if any exist
	if err := gitRepo.ResetHard(ctx, "HEAD"); err != nil {
		return sdk.WithStack(err)
	}
	log.Debug(ctx, "processCheckout> repo reset to HEAD")

	if op.Setup.Checkout.Tag != "" {
		log.Debug(ctx, "processCheckout> fetching tag %s from %s", op.Setup.Checkout.Tag, op.URL)
		if err := gitRepo.FetchRemoteTag(ctx, "origin", op.Setup.Checkout.Tag); err != nil {
			return sdk.WithStack(err)
		}
		log.Info(ctx, "processCheckout> repository %s ready on tag '%s'", op.URL, op.Setup.Checkout.Tag)
	} else {
		if op.Setup.Checkout.Branch == "" {
			op.Setup.Checkout.Branch = op.RepositoryInfo.DefaultBranch
		}
		log.Debug(ctx, "processCheckout> fetching branch %s from %s", op.Setup.Checkout.Branch, op.URL)
		if err := gitRepo.FetchRemoteBranch(ctx, "origin", op.Setup.Checkout.Branch); err != nil {
			return sdk.WithStack(err)
		}

		// Check commit
		if op.Setup.Checkout.Commit == "" {
			// Reset HARD to the latest commit of the remote branch (don't use pull because there can be conflicts if the remote was forced)
			log.Debug(ctx, "processCheckout> resetting the branch %s from remote", op.Setup.Checkout.Branch)
			if err := gitRepo.ResetHard(ctx, "origin/"+op.Setup.Checkout.Branch); err != nil {
				return sdk.WithStack(err)
			}
		} else {
			currentCommit, err := gitRepo.LatestCommit(ctx)
			if err != nil {
				return sdk.WithStack(err)
			}
			if currentCommit.LongHash != op.Setup.Checkout.Commit {
				// Not the same commit, pull and reset HARD the commit
				log.Debug(ctx, "processCheckout> resetting the branch %s from remote", op.Setup.Checkout.Branch)
				if err := gitRepo.ResetHard(ctx, "origin/"+op.Setup.Checkout.Branch); err != nil {
					return sdk.WithStack(err)
				}
				log.Debug(ctx, "processCheckout> resetting commit %s", op.Setup.Checkout.Commit)
				if err := gitRepo.ResetHard(ctx, op.Setup.Checkout.Commit); err != nil {
					return sdk.WithStack(err)
				}
			}
		}
	}

	if op.Setup.Checkout.GetMessage {
		currentCommit, err := gitRepo.LatestCommit(ctx)
		if err != nil {
			return err
		}
		op.Setup.Checkout.Result.CommitMessage = currentCommit.Subject
	}

	if op.Setup.Checkout.ProcessSemver {
		describe, err := gitRepo.Describe(ctx, nil)
		if err != nil {
			log.ErrorWithStackTrace(ctx, errors.Wrap(err, "git describe failed"))
		} else if describe.Semver != nil {
			op.Setup.Checkout.Result.Semver.Current = describe.SemverString
			op.Setup.Checkout.Result.Semver.Next = describe.Semver.IncMinor().String()
		}
	}

	if op.Setup.Checkout.CheckSignature && (op.Setup.Checkout.Commit != "" || op.Setup.Checkout.Tag != "") {
		var gpgKeyID string
		if op.Setup.Checkout.Tag != "" {
			log.Debug(ctx, "retrieve gpg key id from tag %s", op.Setup.Checkout.Tag)
			// Check tag signature
			t, err := gitRepo.GetTag(ctx, op.Setup.Checkout.Tag)
			if err != nil {
				return sdk.WithStack(err)
			}
			gpgKeyID = t.GPGKeyID
		} else {
			log.Debug(ctx, "retrieve gpg key id from commit %s", op.Setup.Checkout.Commit)
			c, err := gitRepo.GetCommit(ctx, op.Setup.Checkout.Commit)
			if err != nil {
				return sdk.WithStack(err)
			}
			gpgKeyID = c.GPGKeyID
		}

		if gpgKeyID == "" {
			op.Setup.Checkout.Result.CommitVerified = false
			op.Setup.Checkout.Result.Msg = "commit not signed"
			return nil
		}

		ctx = context.WithValue(ctx, cdslog.GpgKey, gpgKeyID)
		op.Setup.Checkout.Result.SignKeyID = gpgKeyID

		// Search for public key on vcsserver
		var publicKey string
		vcsKeys, has := vcsPublicKeys[op.VCSServer]
		if has {
			for _, k := range vcsKeys {
				if k.KeyID == gpgKeyID {
					publicKey = k.Public
					break
				}
			}
		}

		// If no key was found, try to get it from a user
		if publicKey == "" {
			// Retrieve gpg public key
			userKey, err := s.Client.UserGpgKeyGet(ctx, gpgKeyID)
			if err != nil {
				op.Setup.Checkout.Result.CommitVerified = false
				op.Setup.Checkout.Result.Msg = fmt.Sprintf("commit signed but key %s not found in CDS: %v", gpgKeyID, err)
				return nil
			}
			publicKey = userKey.PublicKey
		}

		// Import gpg public key
		fileName, _, err := sdk.ImportGPGKey(os.TempDir(), gpgKeyID, publicKey)
		if err != nil {
			return err
		}
		log.Debug(ctx, "key: %s, fileName: %s imported", gpgKeyID, fileName)

		// Check commit signature
		if op.Setup.Checkout.Tag != "" {
			if _, err := gitRepo.VerifyTag(ctx, op.Setup.Checkout.Tag); err != nil {
				op.Setup.Checkout.Result.CommitVerified = false
				op.Setup.Checkout.Result.Msg = fmt.Sprintf("%v", err)
				return nil
			}
		} else {
			if err := gitRepo.VerifyCommit(ctx, op.Setup.Checkout.Commit); err != nil {
				op.Setup.Checkout.Result.CommitVerified = false
				op.Setup.Checkout.Result.Msg = fmt.Sprintf("%v", err)
				return nil
			}
		}
		op.Setup.Checkout.Result.CommitVerified = true
	}

	if op.Setup.Checkout.GetChangeSet {
		op.Setup.Checkout.Result.Files = make(map[string]sdk.OperationChangetsetFile)
		computeFromLastCommit := false
		if op.Setup.Checkout.ChangeSetCommitSince != "" {
			files, err := gitRepo.DiffSinceCommit(ctx, op.Setup.Checkout.ChangeSetCommitSince)
			if err != nil {
				log.ErrorWithStackTrace(ctx, err)
				computeFromLastCommit = true
			} else {
				for k, v := range files {
					op.Setup.Checkout.Result.Files[k] = sdk.OperationChangetsetFile{
						Filename: v.Filename,
						Status:   v.Status,
					}
				}
			}
		} else {
			computeFromLastCommit = true
		}

		if computeFromLastCommit {
			commitWithDiffs, err := gitRepo.GetCommitWithDiff(ctx, op.Setup.Checkout.Commit)
			if err != nil {
				return err
			}
			for k, v := range commitWithDiffs.Files {
				op.Setup.Checkout.Result.Files[k] = sdk.OperationChangetsetFile{
					Filename: v.Filename,
					Status:   v.Status,
				}
			}
		}
	}

	log.Info(ctx, "processCheckout> repository %s ready", op.URL)
	return nil
}
```
Stargel Peak (, ) is the sharp rocky peak rising to 1434 m at the north extremity of Ivanili Heights on Oscar II Coast in Graham Land, Antarctica. It is linked by Okorsh Saddle to Foster Plateau to the north, and surmounts Brenitsa Glacier to the west and Rogosh Glacier to the east. The feature is named after the settlement of Stargel in Western Bulgaria. Location Stargel Peak is located at , which is 8.55 km west-northwest of Mount Persenk in Lovech Heights, 14.15 km north-northwest of Skilly Peak, and 9.95 km east of Mount Quandary. The feature was mapped by the British in 1978. Maps British Antarctic Territory. Scale 1:200000 topographic map. DOS 610 Series, Sheet W 64 60. Directorate of Overseas Surveys, Tolworth, UK, 1978. Antarctic Digital Database (ADD). Scale 1:250000 topographic map of Antarctica. Scientific Committee on Antarctic Research (SCAR). Since 1993, regularly upgraded and updated. Notes References Stargel Peak. SCAR Composite Antarctic Gazetteer. Bulgarian Antarctic Gazetteer. Antarctic Place-names Commission. (details in Bulgarian, basic data in English) External links Stargel Peak. Copernix satellite image Mountains of Graham Land Oscar II Coast Bulgaria and the Antarctic
Darley Park is a neighborhood in east Baltimore, Maryland. References Neighborhoods in Baltimore East Baltimore
```c++
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.

#include "paddle/phi/kernels/sparse_weight_embedding_grad_kernel.h"

#include "paddle/phi/backends/cpu/cpu_context.h"
#include "paddle/phi/common/data_type.h"
#include "paddle/phi/core/kernel_registry.h"
#include "paddle/phi/core/utils/data_type.h"
#include "paddle/phi/kernels/funcs/embedding_util.h"

namespace phi {

template <typename T, typename Context>
struct SparseWeightEmbeddingGradCPUFunctor {
  SparseWeightEmbeddingGradCPUFunctor(const Context& dev_ctx,
                                      const DenseTensor& input,
                                      const SelectedRows& weight,
                                      const DenseTensor& out_grad,
                                      int64_t padding_idx,
                                      DenseTensor* weight_grad)
      : dev_ctx_(dev_ctx),
        input_(input),
        weight_(weight),
        out_grad_(out_grad),
        weight_grad_(weight_grad),
        padding_idx_(padding_idx) {}

  template <typename IdT>
  void apply() {
    DDim table_dim = weight_.dims();

    auto ids = CopyIdsToVector<IdT, int64_t>(input_);
    auto ids_num = static_cast<int64_t>(ids.size());

    // Since paddings are not trainable and fixed in forward, the gradient of
    // paddings makes no sense and we don't deal with it in backward.
    {
      auto* d_output = &out_grad_;
      // auto d_table = weight_grad_;
      auto* ids_data = ids.data();

      int64_t N = table_dim[0];
      int64_t D = table_dim[1];

      auto* d_output_data = d_output->template data<T>();

      dev_ctx_.template Alloc<T>(weight_grad_);
      auto* d_table_data = weight_grad_->data<T>();

      memset(d_table_data, 0, weight_grad_->numel() * sizeof(T));

      for (int64_t i = 0; i < ids_num; ++i) {
        if (padding_idx_ != kNoPadding && ids_data[i] == padding_idx_) {
          // the gradient of padding_idx should be 0, already done by memset,
          // so do nothing.
        } else {
          PADDLE_ENFORCE_LT(
              ids_data[i],
              N,
              common::errors::InvalidArgument(
                  "Variable value (input) of "
                  "OP(paddle.nn.functional.embedding) "
                  "expected >= 0 and < %ld, but got %ld. Please check input "
                  "value.",
                  N,
                  ids_data[i]));
          PADDLE_ENFORCE_GE(
              ids_data[i],
              0,
              common::errors::InvalidArgument(
                  "Variable value (input) of "
                  "OP(paddle.nn.functional.embedding) "
                  "expected >= 0 and < %ld, but got %ld. Please check input "
                  "value.",
                  N,
                  ids_data[i]));
          for (int j = 0; j < D; ++j) {
            d_table_data[ids_data[i] * D + j] += d_output_data[i * D + j];
          }
        }
      }
    }
  }

 private:
  const Context& dev_ctx_;
  const DenseTensor& input_;
  const SelectedRows& weight_;
  const DenseTensor& out_grad_;
  DenseTensor* weight_grad_;
  int64_t padding_idx_;
};

template <typename T, typename Context>
struct SparseWeightEmbeddingSparseGradCPUFunctor {
  SparseWeightEmbeddingSparseGradCPUFunctor(const Context& dev_ctx,
                                            const DenseTensor& input,
                                            const SelectedRows& weight,
                                            const DenseTensor& out_grad,
                                            int64_t padding_idx,
                                            SelectedRows* weight_grad)
      : dev_ctx_(dev_ctx),
        input_(input),
        weight_(weight),
        out_grad_(out_grad),
        weight_grad_(weight_grad),
        padding_idx_(padding_idx) {}

  template <typename IdT>
  void apply() {
    DDim table_dim = weight_.dims();

    auto ids = CopyIdsToVector<IdT, int64_t>(input_);
    auto ids_num = static_cast<int64_t>(ids.size());

    // Since paddings are not trainable and fixed in forward, the gradient of
    // paddings makes no sense and we don't deal with it in backward.
    auto* d_table = weight_grad_;
    auto* d_output = &out_grad_;
    d_table->set_rows(ids);

    auto* d_table_value = d_table->mutable_value();
    d_table_value->Resize({ids_num, table_dim[1]});

    dev_ctx_.template Alloc<T>(d_table_value);

    d_table->set_height(table_dim[0]);

    auto* d_output_data = d_output->template data<T>();
    auto* d_table_data = d_table_value->template data<T>();

    auto d_output_dims = d_output->dims();
    auto d_output_dims_2d =
        common::flatten_to_2d(d_output_dims, d_output_dims.size() - 1);
    PADDLE_ENFORCE_EQ(d_table_value->dims(),
                      d_output_dims_2d,
                      common::errors::InvalidArgument(
                          "ShapeError: The shape of lookup_table@Grad and "
                          "output@Grad should be same. "
                          "But received lookup_table@Grad's shape = [%s], "
                          "output@Grad's shape = [%s].",
                          d_table_value->dims(),
                          d_output_dims_2d));
    memcpy(d_table_data, d_output_data, sizeof(T) * d_output->numel());
  }

 private:
  const Context& dev_ctx_;
  const DenseTensor& input_;
  const SelectedRows& weight_;
  const DenseTensor& out_grad_;
  SelectedRows* weight_grad_;
  int64_t padding_idx_;
};

template <typename T, typename Context>
void SparseWeightEmbeddingGradKernel(const Context& ctx,
                                     const DenseTensor& input,
                                     const SelectedRows& weight,
                                     const DenseTensor& out_grad,
                                     int64_t padding_idx,
                                     DenseTensor* weight_grad) {
  SparseWeightEmbeddingGradCPUFunctor<T, Context> functor(
      ctx, input, weight, out_grad, padding_idx, weight_grad);
  if (input.dtype() == phi::DataType::INT32) {
    functor.template apply<int>();
  } else if (input.dtype() == phi::DataType::INT64) {
    functor.template apply<int64_t>();
  } else {
    PADDLE_THROW(common::errors::Unimplemented(
        "embedding input only supports int32 and int64"));
  }
}

template <typename T, typename Context>
void SparseWeightEmbeddingSparseGradKernel(const Context& ctx,
                                           const DenseTensor& input,
                                           const SelectedRows& weight,
                                           const DenseTensor& out_grad,
                                           int64_t padding_idx,
                                           SelectedRows* weight_grad) {
  SparseWeightEmbeddingSparseGradCPUFunctor<T, Context> functor(
      ctx, input, weight, out_grad, padding_idx, weight_grad);
  if (input.dtype() == phi::DataType::INT32) {
    functor.template apply<int>();
  } else if (input.dtype() == phi::DataType::INT64) {
    functor.template apply<int64_t>();
  } else {
    PADDLE_THROW(common::errors::Unimplemented(
        "embedding input only supports int32 and int64"));
  }
}

}  // namespace phi

PD_REGISTER_KERNEL(sparse_weight_embedding_grad,
                   CPU,
                   ALL_LAYOUT,
                   phi::SparseWeightEmbeddingGradKernel,
                   float,
                   double,
                   phi::dtype::bfloat16) {}

PD_REGISTER_KERNEL(sparse_weight_embedding_sparse_grad,
                   CPU,
                   ALL_LAYOUT,
                   phi::SparseWeightEmbeddingSparseGradKernel,
                   float,
                   double,
                   phi::dtype::bfloat16) {}
```
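Stripped of the Paddle tensor machinery, the dense-gradient functor reduces to a scatter-add: each row of `out_grad` is accumulated into the `weight_grad` row selected by the corresponding id, with the `padding_idx` row left at zero. A minimal standalone sketch of that loop follows; the function name `EmbeddingGrad` and the flat `std::vector` layout are invented for illustration and are not part of the Paddle API.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical standalone version of the dense embedding-gradient loop:
// d_table[ids[i]] += d_output[i], skipping padding ids. The table is N x D,
// row-major, matching the pointer arithmetic in the kernel above.
constexpr int64_t kNoPad = -1;  // stand-in for Paddle's kNoPadding

std::vector<float> EmbeddingGrad(const std::vector<int64_t>& ids,
                                 const std::vector<float>& d_output,  // ids.size() rows of D
                                 int64_t N, int64_t D,
                                 int64_t padding_idx = kNoPad) {
  std::vector<float> d_table(N * D, 0.0f);  // zero-init, like the memset
  for (std::size_t i = 0; i < ids.size(); ++i) {
    if (padding_idx != kNoPad && ids[i] == padding_idx) {
      continue;  // the gradient of padding_idx stays 0
    }
    for (int64_t j = 0; j < D; ++j) {
      d_table[ids[i] * D + j] += d_output[i * D + j];
    }
  }
  return d_table;
}
```

Repeated ids accumulate, which is the key difference from the sparse variant above: the sparse path instead emits one `SelectedRows` row per lookup (duplicates included) and leaves the summation to whoever consumes the sparse gradient.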
```makefile ################################################################################ # # taglib # ################################################################################ TAGLIB_VERSION = 1.13 TAGLIB_SITE = path_to_url TAGLIB_INSTALL_STAGING = YES TAGLIB_LICENSE = LGPL-2.1 or MPL-1.1 TAGLIB_LICENSE_FILES = COPYING.LGPL COPYING.MPL TAGLIB_CPE_ID_VENDOR = taglib ifeq ($(BR2_PACKAGE_ZLIB),y) TAGLIB_DEPENDENCIES += zlib TAGLIB_CONF_OPTS += -DWITH_ZLIB=ON else TAGLIB_CONF_OPTS += -DWITH_ZLIB=OFF endif define TAGLIB_REMOVE_DEVFILE rm -f $(TARGET_DIR)/usr/bin/taglib-config endef TAGLIB_POST_INSTALL_TARGET_HOOKS += TAGLIB_REMOVE_DEVFILE $(eval $(cmake-package)) ```
Once machos 2 () is a 2019 Peruvian sports comedy film directed by Aldo Miyashiro and written by Miyashiro and Marco Paulo Melendez. It is a sequel to the 2017 Peruvian film Once machos. It stars Aldo Miyashiro, Pietro Sibille, André Silva and Cristian Rivero. It premiered on February 14, 2019, in Peruvian theaters. Synopsis The Once Machos must face off on the field against the "Villain", who has kidnapped their children; only a victory will let them see their children alive again. Cast The actors participating in this film are: Aldo Miyashiro as Alejandro Pietro Sibille as 'Mono' Cristian Rivero as Cris Erika Villalobos as Beatriz André Silva as Andy Junior Silva as Junior Andrés Salas as 'Chato' Sebastian Monteghirfo as Sebas Pablo Villanueva “Melcochita” as 'Huapayita' Gilberto Nué as Gil Wendy Vásquez as Tatiana Natalie Vértiz as Natalia Production The filming of the film began on July 10, 2018, and ended at the end of August 2018. Reception Once machos 2 drew more than 67,000 viewers on its first day in theaters. In its opening weekend, the film drew 179,000 viewers to the theater. By the end of the year, the film had attracted more than 650,000 spectators, becoming the most watched Peruvian film of 2019. Future After the success of the sequel, a third installment was announced, with filming set to begin in 2020, along with a new series titled Once machos, la serie (Eleven males, the series) set to premiere the same year; however, neither was ever released, and it is unknown what happened to these projects. In October 2021, a real soccer team called Once Machos FC was formed. References External links 2019 films 2019 comedy films Peruvian sports comedy films 2010s Spanish-language films 2010s Peruvian films Films set in Peru Films shot in Peru Films about friendship Films about kidnapping Films about sportspeople Peruvian sequel films
is a Japanese writer and director of Japanese animation. He is the creator of titles such as Wicked City, Ninja Scroll, and Vampire Hunter D: Bloodlust. Biography Kawajiri was born on November 18, 1950 and grew up in Yokohama, Kanagawa Prefecture, Japan. After he graduated from high school in 1968, he worked as an animator at Mushi Production Animation until it closed in 1972. He then joined Madhouse as one of the four co-founders, and in the 1970s was promoted to animation director. He finally debuted as a film director with 1984's Lensman: Secret of The Lens, directing jointly with the more experienced Kazuyuki Hirokawa (Kawajiri also did the character design along with Kazuo Tomizawa). Gaining an interest in darker animation, he next directed The Running Man. Afterwards, he was instructed to make a 35-minute short based on Hideyuki Kikuchi's novels, which was released as Wicked City. After completing it, however, his producers were so impressed that he was asked to make it a feature-length film. Kawajiri enjoyed the dark tone, and agreed to manage and complete the film within a year. That same year he began to work for the Original Video Animation market debuting with "The Phoenix". From 1987 he also wrote his own scripts. Wicked City received critical and commercial success when released in 1987, giving Kawajiri more creative freedom. He began scripting and designing his own film set in feudal Japan. The result, Ninja Scroll, about the Japanese folk hero Jubei Yagyu, was soon released. After the Western release in 1996, Kawajiri's status as a director received international recognition. He was asked in 2002 to direct a segment, titled Program, of The Animatrix, considered a showcase of the best directors of Japanese animation. Before The Animatrix, he also directed Vampire Hunter D: Bloodlust, which was based on a novel by Hideyuki Kikuchi. Kawajiri directed Highlander: The Search for Vengeance. It was released on DVD on 5 June 2007. 
According to an Ain't It Cool News interview with producer Galen Walker, Kawajiri disliked the fact that 7–8 minutes of added scenes, along with the opening exposition text sequence, were removed when the film was released, but the director's cut would include the footage. Kawajiri has script approval for a sequel to Ninja Scroll, which was listed as being in pre-production with no specific release date as of 2010. Filmography Films Lensman: Secret of The Lens (1984) – director, storyboard, character design, key animation Wicked City (1987) – director, screenwriter (as Kisei Choo), character design, animation director Neo Tokyo (1987) (Segment: "The Running Man") – director, screenwriter, character design, animation director Demon City Shinjuku (1988) – director, character design A Wind Named Amnesia (1990) – screenwriter, supervision Ninja Scroll (1993) – director, screenwriter, original work, original character design Vampire Hunter D: Bloodlust (2000) – director, screenwriter, storyboard The Animatrix (2003) – director (Segment: "Program"), screenwriter (Segments: "World Record" and "Program") Azumi 2: Death or Love (2005) – screenwriter Highlander: The Search for Vengeance (2007) – director, storyboard, key animation OVAs The Phoenix -Space- (1987) – director Goku: Midnight Eye (1989) – director Cyber City Oedo 808 (1990-1991) – director, character design The Cockpit (1994) (Segment: "Slipstream") – director, screenwriter, character design, animation director Biohunter (1995) – screenwriter, supervision, key animation Birdy the Mighty (1996-1997) – director, screenwriter, storyboard, key animation Batman: Gotham Knight (2008) (Segment: "Deadshot") – co-director (uncredited) TV series X (2001) – director, script, storyboard Ninja Scroll: The Series (2003) – original creator Other work Dororo (1969) – in-between animation Cleopatra (1970) – in-between animation Tomorrow's Joe (1970) – in-between animation, key animation New Moomin (1972) – key animation Science Ninja 
Team Gatchaman (1972) – key animation Dokonjō Gaeru (1972) – key animation Aim for the Ace! (1973) – ending Illustration, key animation Jungle Kurobe (1973) – key animation Samurai Giants (1973) – key animation Judo Sanka (1974) – key animation Hajime Ningen Gyatoruz (1974) – key animation The Fire G-Man (1975) – key animation Adventures of Ganba (1975) – key animation Demon Dragon of the Heavens Gaiking (1976) – key animation Manga Sekai Mukashi Banashi (1976) – episode director, character design, key animation, background art Manga Nihon Mukashi Banashi (1976) – episode director, character design (uncredited), key animation, background art Jetter Mars (1977) – key animation Future Boy Conan (1978) – key animation Animation Kikō Marco Polo no Bōken (1979) – storyboard, key animation Botchan (1980) – key animation The Fantastic Adventures of Unico (1981) – key animation, setting (assistance) The Sea Prince and the Fire Child (1981) – key animation The Door Into Summer (1981) – layout Wandering Cloud (1982) – storyboard, layout, key animation (uncredited) Unico in the Island of Magic (1983) – layout, key animation Barefoot Gen (1983) – key animation Harmagedon (1983) – key animation Georgie! (1983) – opening animation Stop!! Hibari-kun! (1983) – episode director The Dagger of Kamui (1985) – key animation Barefoot Gen 2 (1986) – key animation Toki no Tabibito -Time Stranger- (1986) – key animation Junk Boy (1987) – special CF Bride of Deimos (1988) – key animation Legend of the Forest (1988) – key animation Legend of the Galactic Heroes (1988) – guest character design Kimba the White Lion (1989) – character design Record of Lodoss War (1990) – key animation Zetsuai 1989 (1992) – storyboard Phantom Quest Corp. 
(1994) – opening animation Bronze: Cathexis (1994) – storyboard Azuki-chan (1995) – character design Memories (1995) (Segment: "Stink Bomb") – supervision, key animation X (1996) – key animation Todd McFarlane's Spawn (1997) – main title director Master Keaton (1998) – screenwriter, storyboard Pet Shop of Horrors (1999) – storyboard Cardcaptor Sakura Movie 2: The Sealed Card (2000) – storyboard, key animation Party 7 (2000) – key animation Metropolis (2001) – key animation, layout (assistance) Ogawa No Medaka (2002) – key animation Space Pirate Captain Herlock: The Endless Odyssey (2002) – storyboard Gokusen (2004) – storyboard Devil May Cry: The Animated Series (2007) – key animation Shigurui (2007) – storyboard Redline (2009) – 1st key animation Iron Man (2010) – storyboard Wolverine (2011) – storyboard The Tibetan Dog (2011) – key animation Kaiji: Against All Rules (2011) – storyboard X-Men (2011) – storyboard Blade (2011) – storyboard Chihayafuru (2011) – storyboard Black Jack: Dezaki's Final Chapter (2011) – key animation Btooom! (2012) – storyboard Chihayafuru 2 (2013) – storyboard Iron Man: Rise of Technovore (2013) – storyboard Ace of Diamond (2014) – storyboard Overlord (2015) – storyboard Rokka: Braves of the Six Flowers (2015) – storyboard One Punch Man (2015) – storyboard Garo: Crimson Moon (2015) – storyboard Alderamin on the Sky (2016) – storyboard All Out!! (2016) – storyboard ACCA: 13-Territory Inspection Dept. (2017) – storyboard Marvel Future Avengers (2017) – storyboard Rage of Bahamut: Virgin Soul (2017) – storyboard Overlord II (2018) – storyboard Mr. 
Tonegawa: Middle Management Blues (2018) – storyboard Attack on Titan Season 3 Part 1 (2018) – storyboard Boogiepop and Others (2019) – storyboard Demon Slayer: Kimetsu no Yaiba (2019) – storyboard No Guns Life (2019) – storyboard Blade of the Immortal (2019) – storyboard Chihayafuru 3 (2020) – storyboard Deca-Dence (2020) – storyboard Jujutsu Kaisen (2020) – storyboard Beastars 2nd Season (2021) – storyboard Sonny Boy (2021) – storyboard Platinum End (2022) – storyboard Police in a Pod (2022) – storyboard Kin no Kuni Mizu no Kuni (2023) – storyboard Books Arctic Luko (北極のルーコ). Chobunsha, 1989. Vampire Hunter D: Bloodlust Storyboard Collection (川尻善昭「バンパイアハンターD」絵コンテ集). Asahi Sonorama, 2001. References External links Yoshiaki Kawajiri anime at Media Arts Database (http://mediaarts-db.jp/an/anime_series?utf8=✓&asf%5Bkeyword%5D=川尻善昭&asf%5Bmedia%5D%5B%5D=tv_a&asf%5Bmedia%5D%5B%5D=tv_sp&asf%5Bmedia%5D%5B%5D=movie&asf%5Bmedia%5D%5B%5D=ova&asf%5Bmedia%5D%5B%5D=event&asf%5Bmedia%5D%5B%5D=personal&asf%5Bmedia%5D%5B%5D=etc&asf%5Bmedia%5D%5B%5D=blank) Filmography: Yoshiaki Kawajiri at Sakuga@Wiki (Japanese) Interview with Yoshiaki Kawajiri at The Animatrix Official Website 1950 births Anime directors Japanese animators Japanese animated film directors Horror film directors Living people Madhouse (company) people Artists from Yokohama Japanese storyboard artists
Brigadier-General Ernest Berdoe Wilkinson (1864-1946) was an English soldier who after service in Asia and Africa commanded a brigade of the British Army on the Western Front in the First World War. Life Born on 10 March 1864 in Castleknock, Ireland, he was the fourth son of Lieutenant-Colonel Berdoe Amherst Wilkinson (1827-1895) and his wife Frances Neale (1830-1891). After initially serving in the militia, in 1885 he was commissioned in the regular army as a lieutenant with the Lincolnshire Regiment. Following service in Burma, in 1897 he was attached to the Egyptian Army and served in Sudan, being rewarded in 1900 with the Egyptian Order of the Medjidie and promotion to brevet major. In 1907, as a full major aged 43, he retired and took the post of Director of Agriculture and Forests for Sudan. With the outbreak of the First World War in 1914, he was recalled to service and given command of the 8th (Service) Battalion of the Lincolnshire Regiment. On 3 September 1915, he was promoted to head the 62nd Brigade, which was sent into the bloody Battle of Loos. He survived, but in 1916 was replaced and sent back to the Remount Department in the UK. In 1917 he was awarded the Egyptian Order of the Nile. He died on 11 April 1946 in Bramley, Surrey, at the age of 82. Family On 17 June 1905, in Folkestone, Kent, he married Harriet Emma Eccles (1863-1925). No children are known. He was the uncle of Dermot William Berdoe-Wilkinson (1882-1955), his executor, who served as High Sheriff of Surrey. References 1864 births 1946 deaths British Army personnel of the Mahdist War Sudanese civil servants British Army personnel of World War I Recipients of the Order of the Medjidie
In theoretical computer science, nondeterministic constraint logic is a combinatorial system in which an orientation is given to the edges of a weighted undirected graph, subject to certain constraints. One can change this orientation by steps in which a single edge is reversed, subject to the same constraints. Determining whether there exists a sequence of moves that reverses a specified edge has been proven PSPACE-complete for the constraint logic problem and its variants, which makes them very useful for showing that various games and puzzles are PSPACE-hard or PSPACE-complete. This is a form of reversible logic in that each sequence of edge orientation changes can be undone. The hardness of this problem has been used to prove that many games and puzzles have high game complexity. Constraint graphs In the simplest version of nondeterministic constraint logic, each edge of an undirected graph has weight either one or two. (The weights may also be represented graphically by drawing edges of weight one as red and edges of weight two as blue.) The graph is required to be a cubic graph: each vertex is incident to three edges, and additionally each vertex should be incident to an even number of red edges. The edges are required to be oriented in such a way that at least two units of weight are oriented towards each vertex: there must be either at least one incoming blue edge, or at least two incoming red edges. An orientation can change by steps in which a single edge is reversed, respecting these constraints. More general forms of nondeterministic constraint logic allow a greater variety of edge weights, more edges per vertex, and different thresholds for how much incoming weight each vertex must have. A graph with a system of edge weights and vertex thresholds is called a constraint graph. 
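The inflow rule above is easy to check mechanically. The following Python sketch illustrates it; the representation — edges as (u, v, weight) triples with an orientation array recording which endpoint each edge points toward — is an assumption made for illustration, not notation from the literature.

```python
def is_valid(vertices, edges, orientation):
    """Inflow check: every vertex must receive at least two units of weight.

    edges: list of (u, v, weight) triples (weight 1 = red, 2 = blue);
    orientation[i] is the endpoint that edge i currently points toward.
    """
    inflow = {v: 0 for v in vertices}
    for i, (_u, _v, w) in enumerate(edges):
        inflow[orientation[i]] += w
    return all(inflow[v] >= 2 for v in vertices)


def legal_moves(vertices, edges, orientation):
    """Indices of edges whose reversal yields another valid configuration."""
    moves = []
    for i, (u, v, _w) in enumerate(edges):
        flipped = list(orientation)
        # reverse edge i: point it at its other endpoint
        flipped[i] = u if orientation[i] == v else v
        if is_valid(vertices, edges, flipped):
            moves.append(i)
    return moves
```

For instance, on K4 with all blue (weight-two) edges — a cubic graph in which every vertex meets an even number (zero) of red edges — an orientation is valid exactly when every vertex has at least one incoming edge.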
The restricted case where the edge weights are all one or two, the vertices require two units of incoming weight, and the vertices all have three incident edges with an even number of red edges, is called an and/or constraint graph. The reason for the name is that the two possible types of vertex in an and/or constraint graph behave in some ways like an AND gate and an OR gate in Boolean logic. A vertex with two red edges and one blue edge behaves like an AND gate in that it requires both red edges to point inwards before the blue edge can be made to point outwards. A vertex with three blue edges behaves like an OR gate, with two of its edges designated as inputs and the third as an output, in that it requires at least one input edge to point inwards before the output edge can be made to point outwards. Typically, constraint logic problems are defined around finding valid configurations of constraint graphs. Constraint graphs are undirected graphs with two types of edges: red edges with weight one and blue edges with weight two. We use constraint graphs as computation models, where we think of the entire graph as a machine. A configuration of the machine consists of the graph along with a specific orientation of its edges. We call a configuration valid if it satisfies the inflow constraint: each vertex must have an incoming weight of at least two. In other words, the sum of the weights of the edges oriented towards a given vertex must be at least two. We also define a move in a constraint graph to be the action of reversing the orientation of an edge, such that the resulting configuration is still valid. Formal definition of the Constraint logic problem Suppose we are given a constraint graph, a starting configuration, and an ending configuration. 
This problem asks whether there exists a sequence of valid moves that takes the graph from the starting configuration to the ending configuration. This problem is PSPACE-complete for 3-regular or max-degree 3 graphs. The reduction follows from QSAT and is outlined below. Variants Planar Non-Deterministic Constraint Logic The above problem is PSPACE-complete even if the constraint graph is planar, i.e. the graph can be drawn in such a way that no two edges cross each other. This reduction follows from Planar QSAT. Edge Reversal This problem is a special case of the previous one. It asks, given a constraint graph, whether it is possible to reverse a specified edge by a sequence of valid moves; equivalently, whether there is a sequence of valid moves whose last move reverses the desired edge. This problem has also been proven to be PSPACE-complete for 3-regular or max-degree 3 graphs. Constraint Graph Satisfaction Given an undirected constraint graph, this problem asks whether there exists an orientation of the edges that satisfies the inflow constraints. This problem has been proven to be NP-complete. Hard problems The following problems, on and/or constraint graphs and their orientations, are PSPACE-complete: Given an orientation and a specified edge, testing whether there is a sequence of steps from the given orientation that eventually reverses that edge. Testing whether one orientation can be changed into another one by a sequence of steps. Given two edges with specified directions, testing whether there are two orientations for the whole graph, one having the specified direction on the first edge and the other having the specified direction on the second, that can be transformed into each other by a sequence of steps. The proof that these problems are hard involves a reduction from quantified Boolean formulas, based on the logical interpretation of and/or constraint graphs. 
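On small instances, the edge-reversal question can be decided by brute-force search over the space of valid configurations. The sketch below (with a hypothetical tuple-based encoding of orientations) is meant only to make the problem statement concrete, not to suggest an efficient algorithm — none is expected, given PSPACE-completeness.

```python
from collections import deque

def edge_reversible(vertices, edges, start, target):
    """Can edge `target` ever be reversed, starting from orientation `start`,
    by a sequence of valid moves?  Breadth-first search over the valid
    configurations; exponential in the number of edges (illustration only)."""
    def valid(ori):
        # inflow constraint: every vertex receives weight >= 2
        inflow = {v: 0 for v in vertices}
        for i, (_u, _v, w) in enumerate(edges):
            inflow[ori[i]] += w
        return all(inflow[v] >= 2 for v in vertices)

    start = tuple(start)
    seen = {start}
    queue = deque([start])
    while queue:
        ori = queue.popleft()
        for i, (u, v, _w) in enumerate(edges):
            nxt = list(ori)
            nxt[i] = u if ori[i] == v else v  # reverse edge i
            nxt = tuple(nxt)
            if not valid(nxt):
                continue
            if i == target:
                return True  # a valid sequence can end by reversing target
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False
```

The search returns true as soon as it reaches any configuration from which flipping the target edge is itself a valid move, which matches the "last move reverses the desired edge" formulation.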
The proof requires additional gadgets for simulating quantifiers and for converting signals carried on red edges into signals carried on blue edges (or vice versa), all of which can be accomplished by combinations of and-vertices and or-vertices. These problems remain PSPACE-complete even for and/or constraint graphs that form planar graphs. The proof of this involves the construction of crossover gadgets that allow two independent signals to cross each other. It is also possible to impose an additional restriction while preserving the hardness of these problems: each vertex with three blue edges can be required to be part of a triangle with a red edge. Such a vertex is called a protected or, and it has the property that (in any valid orientation of the whole graph) it is not possible for both of the blue edges in the triangle to be directed inwards. This restriction makes it easier to simulate these vertices in hardness reductions for other problems. Additionally, the constraint graphs can be required to have bounded bandwidth, and the problems on them will still remain PSPACE-complete. Proof of PSPACE-hardness The reduction follows from QSAT. In order to embed a QSAT formula, we need to create AND, OR, NOT, UNIVERSAL, EXISTENTIAL, and Converter (to change color) gadgets in the constraint graph. The idea goes as follows: An AND vertex is a vertex with two incident red edges (inputs) and one incident blue edge (output). An OR vertex is a vertex with three incident blue edges (two inputs, one output). The other gadgets can also be created in this manner. The full construction is available on Erik Demaine's website, where it is also explained interactively. Applications The original applications of nondeterministic constraint logic used it to prove the PSPACE-completeness of sliding block puzzles such as Rush Hour and Sokoban. 
To do so, one needs only to show how to simulate edges and edge orientations, and-vertices, and protected-or vertices in these puzzles. Nondeterministic constraint logic has also been used to prove the hardness of reconfiguration versions of classical graph optimization problems, including independent set, vertex cover, and dominating set, on planar graphs of bounded bandwidth. In these problems, one must change one solution to the given problem into another by moving one vertex at a time into or out of the solution set while maintaining the property that at all times the remaining vertices form a solution. Reconfiguration 3SAT Given a 3-CNF formula and two satisfying assignments, this problem asks whether it is possible to find a sequence of steps that takes us from one assignment to the other, where in each step we are allowed to flip the value of a single variable. This problem can be shown PSPACE-complete via a reduction from the Non-deterministic Constraint Logic problem. Sliding-Block Puzzles This problem asks whether we can reach a desired configuration in a sliding block puzzle given an initial configuration of the blocks. This problem is PSPACE-complete, even if the rectangles are dominoes. Rush Hour This problem asks whether we can reach the victory condition of a Rush Hour puzzle given an initial configuration. This problem is PSPACE-complete, even if the blocks have size 1 × 2. Dynamic Map Labeling Given a static map, this problem asks whether there is a smooth dynamic labeling. This problem is also PSPACE-complete. References PSPACE-complete problems Computational problems in graph theory Reversible computing Logical calculi Reconfiguration
Santa Silvia is a 20th-century parochial church and titular church in southwest Rome, dedicated to Saint Silvia (6th century AD, mother of Gregory the Great). History The church was built in 1963–1968. On 21 February 2001, it was made a titular church to be held by a cardinal-priest. Cardinal-protectors Jānis Pujats (2001–present) References Titular churches Santa Silvia Roman Catholic churches completed in 1968 20th-century Roman Catholic church buildings in Italy
Chloe Woodruff (born July 21, 1987) is an American cross-country cyclist. She placed 14th in the women's cross-country race at the 2016 Summer Olympics. She has qualified to represent the United States at the 2020 Summer Olympics. References 1987 births Living people American female cyclists Olympic cyclists for the United States Cyclists at the 2016 Summer Olympics 21st-century American women
The 2014–15 Coppin State Eagles men's basketball team represented Coppin State University during the 2014–15 NCAA Division I men's basketball season. The Eagles, led by first-year head coach Michael Grant, played their home games at the Physical Education Complex and were members of the Mid-Eastern Athletic Conference. They finished the season 8–23, 6–10 in MEAC play, to tie for ninth place. They advanced to the quarterfinals of the MEAC tournament, where they lost to North Carolina Central. Roster Schedule |- !colspan=9 style="background:#333399; color:#CFB53B;"| Regular season |- !colspan=9 style="background:#333399; color:#CFB53B;"| MEAC tournament References Coppin State Eagles men's basketball seasons
```ruby # typed: true # rubocop:todo Sorbet/StrictSigil # frozen_string_literal: true require "cache_store" # # {DescriptionCacheStore} provides methods to fetch and mutate formula descriptions used # by the `brew desc` and `brew search` commands. # class DescriptionCacheStore < CacheStore # Inserts a formula description into the cache if it does not exist or # updates the formula description if it does exist. # # @param formula_name [String] the name of the formula to set # @param description [String] the description from the formula to set # @return [nil] def update!(formula_name, description) database.set(formula_name, description) end # Delete the formula description from the {DescriptionCacheStore}. # # @param formula_name [String] the name of the formula to delete # @return [nil] def delete!(formula_name) database.delete(formula_name) end # If the database is empty `update!` it with all known formulae. # # @return [nil] def populate_if_empty!(eval_all: Homebrew::EnvConfig.eval_all?) return unless eval_all return unless database.empty? Formula.all(eval_all:).each { |f| update!(f.full_name, f.desc) } end # Use an update report to update the {DescriptionCacheStore}. # # @param report [Report] an update report generated by cmd/update.rb # @return [nil] def update_from_report!(report) unless Homebrew::EnvConfig.eval_all? database.clear! return end return populate_if_empty! if database.empty? return if report.empty? renamings = report.select_formula_or_cask(:R) alterations = report.select_formula_or_cask(:A) + report.select_formula_or_cask(:M) + renamings.map(&:last) update_from_formula_names!(alterations) delete_from_formula_names!(report.select_formula_or_cask(:D) + renamings.map(&:first)) end # Use an array of formula names to update the {DescriptionCacheStore}. # # @param formula_names [Array] the formulae to update # @return [nil] def update_from_formula_names!(formula_names) unless Homebrew::EnvConfig.eval_all? database.clear! 
return end return populate_if_empty! if database.empty? formula_names.each do |name| update!(name, Formula[name].desc) rescue FormulaUnavailableError, *FormulaVersions::IGNORED_EXCEPTIONS delete!(name) end end # Use an array of formula names to delete them from the {DescriptionCacheStore}. # # @param formula_names [Array] the formulae to delete # @return [nil] def delete_from_formula_names!(formula_names) return if database.empty? formula_names.each(&method(:delete!)) end alias delete_from_cask_tokens! delete_from_formula_names! # `select` from the underlying database. def select(&block) database.select(&block) end end # # {CaskDescriptionCacheStore} provides methods to fetch and mutate cask descriptions used # by the `brew desc` and `brew search` commands. # class CaskDescriptionCacheStore < DescriptionCacheStore # If the database is empty `update!` it with all known casks. # # @return [nil] def populate_if_empty!(eval_all: Homebrew::EnvConfig.eval_all?) return unless eval_all return unless database.empty? Cask::Cask.all(eval_all:) .each { |c| update!(c.full_name, [c.name.join(", "), c.desc.presence]) } end # Use an update report to update the {CaskDescriptionCacheStore}. # # @param report [Report] an update report generated by cmd/update.rb # @return [nil] def update_from_report!(report) unless Homebrew::EnvConfig.eval_all? database.clear! return end return populate_if_empty! if database.empty? return if report.empty? alterations = report.select_formula_or_cask(:AC) + report.select_formula_or_cask(:MC) update_from_cask_tokens!(alterations) delete_from_cask_tokens!(report.select_formula_or_cask(:DC)) end # Use an array of cask tokens to update the {CaskDescriptionCacheStore}. # # @param cask_tokens [Array] the casks to update # @return [nil] def update_from_cask_tokens!(cask_tokens) unless Homebrew::EnvConfig.eval_all? database.clear! return end return populate_if_empty! if database.empty? 
cask_tokens.each do |token| c = Cask::CaskLoader.load(token) update!(c.full_name, [c.name.join(", "), c.desc.presence]) rescue Cask::CaskUnavailableError, *FormulaVersions::IGNORED_EXCEPTIONS delete!(c.full_name) if c.present? end end end ```
```typescript import { useEffect, useState } from 'react'; export function useReadyState({ requestId, protocol }: { requestId: string; protocol: 'curl' | 'webSocket' }): boolean { const [readyState, setReadyState] = useState<boolean>(false); // get readyState when requestId or protocol changes useEffect(() => { let isMounted = true; const fn = async () => { window.main[protocol].readyState.getCurrent({ requestId }) .then((currentReadyState: boolean) => { isMounted && setReadyState(currentReadyState); }); }; fn(); return () => { isMounted = false; }; }, [protocol, requestId]); // listen for readyState changes useEffect(() => { let isMounted = true; // @ts-expect-error -- we use a dynamic channel here const unsubscribe = window.main.on(`${protocol}.${requestId}.readyState`, (_, incomingReadyState: boolean) => { isMounted && setReadyState(incomingReadyState); }); return () => { isMounted = false; unsubscribe(); }; }, [protocol, requestId]); return readyState; } ```
```java /* * * * path_to_url * * Unless required by applicable law or agreed to in writing, software * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */ package reactor.netty.http.server; import io.netty.handler.codec.http.HttpMethod; import org.junit.jupiter.api.Test; import org.mockito.Mockito; import org.reactivestreams.Publisher; import org.reactivestreams.Subscriber; import org.reactivestreams.Subscription; import reactor.netty.NettyOutbound; import reactor.test.StepVerifier; import java.net.URISyntaxException; import java.nio.file.Path; import java.nio.file.Paths; import java.time.Duration; import static org.assertj.core.api.Assertions.assertThat; /** * Tests for {@link DefaultHttpServerRoutes}. * * @author Sascha Dais * @since 1.0.8 */ class DefaultHttpServerRoutesTest { @Test void directoryRouteTest() throws URISyntaxException { HttpServerRequest request = Mockito.mock(HttpServerRequest.class); Mockito.when(request.paramsResolver(Mockito.any())).thenReturn(request); Mockito.when(request.uri()).thenReturn("/test"); Mockito.when(request.method()).thenReturn(HttpMethod.GET); Subscription subscription = Mockito.mock(Subscription.class); NettyOutbound outbound = Mockito.mock(NettyOutbound.class); Mockito.doAnswer(invocation -> { Subscriber<Void> subscriber = invocation.getArgument(0); subscriber.onSubscribe(subscription); subscriber.onNext(null); subscriber.onComplete(); return null; }).when(outbound).subscribe(Mockito.any()); HttpServerResponse response = Mockito.mock(HttpServerResponse.class); Mockito.when(response.sendFile(Mockito.any())).thenReturn(outbound); Path resource = Paths.get(getClass().getResource("/public").toURI()); DefaultHttpServerRoutes routes = new DefaultHttpServerRoutes(); HttpServerRoutes route = routes.directory("/test", resource); Publisher<Void> publisher = route.apply(request, response); assertThat(publisher).isNotNull(); StepVerifier.create(publisher) .expectNextMatches(p -> true) .expectComplete() 
.verify(Duration.ofMillis(200)); } } ```
```php <?php /* * * * path_to_url * * Unless required by applicable law or agreed to in writing, software * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the */ namespace Google\Service\GamesManagement; class ProfileSettings extends \Google\Model { /** * @var string */ public $kind; /** * @var bool */ public $profileVisible; /** * @param string */ public function setKind($kind) { $this->kind = $kind; } /** * @return string */ public function getKind() { return $this->kind; } /** * @param bool */ public function setProfileVisible($profileVisible) { $this->profileVisible = $profileVisible; } /** * @return bool */ public function getProfileVisible() { return $this->profileVisible; } } // Adding a class alias for backwards compatibility with the previous class name. class_alias(ProfileSettings::class, 'Google_Service_GamesManagement_ProfileSettings'); ```
```objective-c // // ZFADControlView.h // ZFPlayer_Example // // Created by on 2019/5/15. // #import <UIKit/UIKit.h> #import <ZFPlayer/ZFPlayerMediaControl.h> NS_ASSUME_NONNULL_BEGIN @interface ZFADControlView : UIView <ZFPlayerMediaControl> @property (nonatomic, copy) void(^skipCallback)(void); @property (nonatomic, copy) void(^fullScreenCallback)(void); @end NS_ASSUME_NONNULL_END ```