# Laravel-OCI8 CHANGELOG
## [Unreleased]
## [v8.5.0] - 2021-09-18
- Allow top-level application to dynamically set its own database `$config[]` parameters [#667]
- Fix `checkMultipleHostDsn` SERVICE_NAME bug caused by inconsistent database config. #670
## [v8.4.1] - 2021-07-10
- Fix non-existent global constants. #661
## [v8.4.0] - 2021-04-23
- Add Oci8Driver for the new DBAL implementation. #648
- Fix #641
## [v8.3.0] - 2021-01-22
- Add PHP8 support. [#619]
- Fix [#624]
## [v8.2.3] - 2021-01-06
- Quote column name "id" so it is not affected by `PDO::ATTR_CASE`. [#623]
## [v8.2.2] - 2020-12-08
- Query builder fixes and tests. [#615]
## [v8.2.1] - 2020-12-07
- Fix query builder bulk insert. [#612]
- Fix [#558].
## [v8.2.0] - 2020-12-06
- Improve pagination performance. [#611]
- Fixes [#563]
## [v8.1.3] - 2020-12-06
- Fix Model::create() with guarded property. [#609]
- Fix [#596]
## [v8.1.2] - 2020-12-06
- Fix database presence verifier. [#607]
- Revert [#598]
- Fixes [#601], [#602]
- Use orchestra testbench for tests
## [v8.1.1] - 2020-11-21
- Implement case insensitive function-based unique index. [#599]
## [v8.1.0] - 2020-11-20
- Enable oracle case insensitive searches. [#598]
- Fix database presence validation issue (unique, exists, etc).
- Removes the dependency on OracleUserProvider.
## [v8.0.1] - 2020-09-23
- Fix [#590] WhereIn query with more than 2k++ records. [#591], credits to [@bioleyl].
## [v8.0.0] - 2020-09-09
- Add support for Laravel 8.
## [v7.0.1] - 2020-06-18
- Fix pagination aggregate count. [#570]
## [v7.0.0] - 2020-03-04
- Add support for Laravel 7 [#565].
- Fix [#564].
## [v6.1.0] - 2020-02-11
- Add support for joinSub. [#551], credits to [@mozgovoyandrey].
- Add joinSub tests [#560].
- Apply StyleCI laravel preset changes.
## [v6.0.4] - 2019-11-26
- Wrap sequence name with schema prefix if set. [#535]
- Fix [#523].
## [v6.0.3] - 2019-11-26
- `saveLob` - Parameter should start at 1. [#543], credits to [@jeidison].
## [v6.0.2] - 2019-10-18
- Fix bug from pull request [#532] [#538], credits to [@dantesCode].
## [v6.0.1] - 2019-10-11
- Fix whereInRaw and whereNotInRaw Grammar. [#532], credits to [@dantesCode].
- Fix [#464], [#405], [#73].
## [v6.0.0] - 2019-09-05
- Laravel 6 support. [#505]
- Allow custom sequence name on model nextValue. [#511]
## [v5.8.2] - 2019-06-25
- Added illuminate/auth as dependency in composer.json [#508], credits to [@tumainimosha]
## [v5.8.1] - 2019-04-24
- Fix stripping of AS from table name. [#504]
- Facilitate wallet support. [#474]
- Fix changelog dates & Update license to 2019 [#498]
## [v5.7.3] - 2019-02-19
- Fix [#485] - Preventing ORA-00933 when using fromSub method. [#486], credits to [@renanwilliam].
## [v5.7.2] - 2018-09-29
- Added Support for Oracle Edition Based Redefinition [#439][#465], credits to [@Adam2Marsh]
## [v5.7.1] - 2018-09-20
- Fix paginate(1) SQL. [#461]
- Fix [#458].
## [v5.7.0] - 2018-09-05
- Add support for Laravel 5.7 [#457], credits to [@gredimano]
## [v5.6.3] - 2018-05-07
- Add support for migrate:fresh command. [#437]
- Fix [#435].
## [v5.6.2] - 2018-05-05
- Escape ENUM column name to avoid problems with reserved words [#432], credits to [@Stolz].
- Fixes issue [#431].
## [v5.6.1] - 2018-04-17
- Fix [#413], binding issue.
## [v5.6.0] - 2018-02-16
- Add support for Laravel 5.6.
- Fix compatibility with PHP 7.2.
- Fix Declaration of causedByLostConnection [#407], credits to [@FabioSmeriglio].
- Fix [#406], [#404].
- Added more options to Sequence Create Method [#355], credits to [@nikklass].
[Unreleased]: https://github.com/yajra/laravel-oci8/compare/v8.5.0...8.x
[v8.5.0]: https://github.com/yajra/laravel-oci8/compare/v8.4.1...v8.5.0
[v8.4.1]: https://github.com/yajra/laravel-oci8/compare/v8.4.0...v8.4.1
[v8.4.0]: https://github.com/yajra/laravel-oci8/compare/v8.3.0...v8.4.0
[v8.3.0]: https://github.com/yajra/laravel-oci8/compare/v8.2.3...v8.3.0
[v8.2.3]: https://github.com/yajra/laravel-oci8/compare/v8.2.2...v8.2.3
[v8.2.2]: https://github.com/yajra/laravel-oci8/compare/v8.2.1...v8.2.2
[v8.2.1]: https://github.com/yajra/laravel-oci8/compare/v8.2.0...v8.2.1
[v8.2.0]: https://github.com/yajra/laravel-oci8/compare/v8.1.3...v8.2.0
[v8.1.3]: https://github.com/yajra/laravel-oci8/compare/v8.1.2...v8.1.3
[v8.1.2]: https://github.com/yajra/laravel-oci8/compare/v8.1.1...v8.1.2
[v8.1.1]: https://github.com/yajra/laravel-oci8/compare/v8.1.0...v8.1.1
[v8.1.0]: https://github.com/yajra/laravel-oci8/compare/v8.0.1...v8.1.0
[v8.0.1]: https://github.com/yajra/laravel-oci8/compare/v8.0.0...v8.0.1
[v8.0.0]: https://github.com/yajra/laravel-oci8/compare/v7.0.1...v8.0.0
[v7.0.1]: https://github.com/yajra/laravel-oci8/compare/v7.0.0...v7.0.1
[v7.0.0]: https://github.com/yajra/laravel-oci8/compare/v6.1.0...v7.0.0
[v6.1.0]: https://github.com/yajra/laravel-oci8/compare/v6.0.4...v6.1.0
[v6.0.4]: https://github.com/yajra/laravel-oci8/compare/v6.0.3...v6.0.4
[v6.0.3]: https://github.com/yajra/laravel-oci8/compare/v6.0.2...v6.0.3
[v6.0.2]: https://github.com/yajra/laravel-oci8/compare/v6.0.1...v6.0.2
[v6.0.1]: https://github.com/yajra/laravel-oci8/compare/v6.0.0...v6.0.1
[v6.0.0]: https://github.com/yajra/laravel-oci8/compare/v5.8.2...v6.0.0
[v5.8.2]: https://github.com/yajra/laravel-oci8/compare/v5.8.1...v5.8.2
[v5.8.1]: https://github.com/yajra/laravel-oci8/compare/v5.8.0...v5.8.1
[v5.8.0]: https://github.com/yajra/laravel-oci8/compare/v5.7.3...v5.8.0
[v5.7.3]: https://github.com/yajra/laravel-oci8/compare/v5.7.2...v5.7.3
[v5.7.2]: https://github.com/yajra/laravel-oci8/compare/v5.7.1...v5.7.2
[v5.7.1]: https://github.com/yajra/laravel-oci8/compare/v5.7.0...v5.7.1
[v5.7.0]: https://github.com/yajra/laravel-oci8/compare/v5.6.3...v5.7.0
[v5.6.3]: https://github.com/yajra/laravel-oci8/compare/v5.6.2...v5.6.3
[v5.6.2]: https://github.com/yajra/laravel-oci8/compare/v5.6.1...v5.6.2
[v5.6.1]: https://github.com/yajra/laravel-oci8/compare/v5.6.0...v5.6.1
[v5.6.0]: https://github.com/yajra/laravel-oci8/compare/v5.5.7...v5.6.0
[#355]: https://github.com/yajra/laravel-oci8/pull/355
[#407]: https://github.com/yajra/laravel-oci8/pull/407
[#432]: https://github.com/yajra/laravel-oci8/pull/432
[#437]: https://github.com/yajra/laravel-oci8/pull/437
[#457]: https://github.com/yajra/laravel-oci8/pull/457
[#461]: https://github.com/yajra/laravel-oci8/pull/461
[#439]: https://github.com/yajra/laravel-oci8/pull/439
[#465]: https://github.com/yajra/laravel-oci8/pull/465
[#486]: https://github.com/yajra/laravel-oci8/pull/486
[#491]: https://github.com/yajra/laravel-oci8/pull/491
[#504]: https://github.com/yajra/laravel-oci8/pull/504
[#474]: https://github.com/yajra/laravel-oci8/pull/474
[#498]: https://github.com/yajra/laravel-oci8/pull/498
[#508]: https://github.com/yajra/laravel-oci8/pull/508
[#505]: https://github.com/yajra/laravel-oci8/pull/505
[#511]: https://github.com/yajra/laravel-oci8/pull/511
[#532]: https://github.com/yajra/laravel-oci8/pull/532
[#538]: https://github.com/yajra/laravel-oci8/pull/538
[#543]: https://github.com/yajra/laravel-oci8/pull/543
[#535]: https://github.com/yajra/laravel-oci8/pull/535
[#551]: https://github.com/yajra/laravel-oci8/pull/551
[#560]: https://github.com/yajra/laravel-oci8/pull/560
[#565]: https://github.com/yajra/laravel-oci8/pull/565
[#570]: https://github.com/yajra/laravel-oci8/pull/570
[#591]: https://github.com/yajra/laravel-oci8/pull/591
[#598]: https://github.com/yajra/laravel-oci8/pull/598
[#599]: https://github.com/yajra/laravel-oci8/pull/599
[#607]: https://github.com/yajra/laravel-oci8/pull/607
[#609]: https://github.com/yajra/laravel-oci8/pull/609
[#611]: https://github.com/yajra/laravel-oci8/pull/611
[#612]: https://github.com/yajra/laravel-oci8/pull/612
[#615]: https://github.com/yajra/laravel-oci8/pull/615
[#623]: https://github.com/yajra/laravel-oci8/pull/623
[#619]: https://github.com/yajra/laravel-oci8/pull/619
[#624]: https://github.com/yajra/laravel-oci8/issues/624
[#558]: https://github.com/yajra/laravel-oci8/issues/558
[#563]: https://github.com/yajra/laravel-oci8/issues/563
[#596]: https://github.com/yajra/laravel-oci8/issues/596
[#602]: https://github.com/yajra/laravel-oci8/issues/602
[#601]: https://github.com/yajra/laravel-oci8/issues/601
[#590]: https://github.com/yajra/laravel-oci8/issues/590
[#564]: https://github.com/yajra/laravel-oci8/issues/564
[#523]: https://github.com/yajra/laravel-oci8/issues/523
[#413]: https://github.com/yajra/laravel-oci8/issues/413
[#406]: https://github.com/yajra/laravel-oci8/issues/406
[#404]: https://github.com/yajra/laravel-oci8/issues/404
[#431]: https://github.com/yajra/laravel-oci8/issues/431
[#435]: https://github.com/yajra/laravel-oci8/issues/435
[#458]: https://github.com/yajra/laravel-oci8/issues/458
[#485]: https://github.com/yajra/laravel-oci8/issues/485
[#464]: https://github.com/yajra/laravel-oci8/issues/464
[#405]: https://github.com/yajra/laravel-oci8/issues/405
[#73]: https://github.com/yajra/laravel-oci8/issues/73
[@FabioSmeriglio]: https://github.com/FabioSmeriglio
[@nikklass]: https://github.com/nikklass
[@Stolz]: https://github.com/Stolz
[@gredimano]: https://github.com/gredimano
[@Adam2Marsh]: https://github.com/Adam2Marsh
[@renanwilliam]: https://github.com/renanwilliam
[@tumainimosha]: https://github.com/tumainimosha
[@dantesCode]: https://github.com/dantesCode
[@jeidison]: https://github.com/jeidison
[@mozgovoyandrey]: https://github.com/mozgovoyandrey
[@bioleyl]: https://github.com/bioleyl
|
require 'nokogiri'
require 'minitest/autorun'
def find_posts(dir)
posts = []
Dir.each_child(dir) do |child|
posts += find_posts(dir + "/#{child}") if File.directory?(dir + "/#{child}")
posts.append(dir + "/#{child}") if child.end_with?('.html')
end
posts
end
class TestTags < Minitest::Test
def test_all_posts_have_tags_not_tag1
year_dirs = Dir.children('_site').select { |f| f.start_with?('20') }
year_dirs.each do |year_dir|
posts = find_posts("_site/#{year_dir}")
posts.each do |post|
doc = Nokogiri::HTML.parse(File.read(post))
tags = doc.css('a.post_tag')
# Make sure all posts have tags
assert tags.length.positive?
tags.each do |tag|
tag.children.each do |child|
# Make sure the template tag `tag1` is not used
assert child.content.include?('tag1') == false
end
end
end
end
end
end
|
// code-examples/AdvOOP/ui3/button.scala
package ui3
class Button(val label: String) extends Widget with Clickable {
def click() = {
// Logic to give the appearance of clicking a button...
}
def draw() = {
// Logic to draw the button on the display, web page, etc.
}
override def toString() =
"(button: label=" + label + ", " + super.toString() + ")"
}
|
import os
import json
import numpy as np
import pandas as pd
import FeaturesStack as FS
def calculate_gmdh_model(img_f):
if task_type == "1":
if sensor_type == "convex":
prob = (
-0.946477
+ img_f["std_vert"] * np.cbrt(img_f["P95(1)_vert"]) * 0.0171222
+ np.power(img_f["balx2_hor"], 3)
* np.sin(img_f["dif12_hor"])
* (-1.583e-05)
+ img_f["P5_vert"] * np.cos(img_f["pair6664_vert"]) * (-0.007739)
+ np.cbrt(img_f["x2_vert"]) * np.cbrt(img_f["balx2_vert"]) * 0.0831053
+ np.cos(img_f["pair3947_hor"]) * np.cos(img_f["dif12_vert"]) * 0.413282
+ np.cos(img_f["pair4639_hor"])
* np.cos(img_f["pair6967_vert"])
* (-0.141326)
+ np.cbrt(img_f["maxfreq_hor"])
* np.cbrt(img_f["mean(1)_vert"])
* 0.396514
+ np.cos(img_f["pair4639_hor"])
* np.arctan(img_f["pair5555_vert"])
* 0.123721
+ np.sqrt(img_f["pair5045_hor"])
* np.cos(img_f["pair4846_hor"])
* (-0.110306)
+ np.sqrt(img_f["maxfreq_orig"])
* np.power(img_f["balx2_hor"], 3)
* 1.51139e-05
+ img_f["dif13_vert"] * np.cbrt(img_f["x2_orig"]) * 0.0276597
)
elif sensor_type == "linear":
prob = (
0.521463
+ np.cos(img_f["fractal_dim"])
* np.arctan(img_f["pair1526_hor"])
* (-0.510109)
+ np.cbrt(img_f["x2_orig"]) * np.arctan(img_f["std(3)_hor"]) * 0.320271
+ np.sin(img_f["Q1_vert"]) * np.cos(img_f["skew(2)_vert"]) * 0.347042
+ np.cbrt(img_f["median(2)_hor"]) * np.cos(img_f["Q3_vert"]) * 0.120014
+ np.sin(img_f["x1_orig"]) * np.sin(img_f["pair5050_vert"]) * 0.149371
+ np.power(img_f["kurt(1)_hor"], 2)
* np.cos(img_f["pair2820_hor"])
* 0.107874
+ np.power(img_f["pair4845_vert"], 3)
* np.cos(img_f["mean(3)_vert"])
* 1.95106e-05
+ np.cos(img_f["mean(3)_vert"])
* np.arctan(img_f["mean(2)_hor"])
* (-0.115669)
)
elif sensor_type == "reinforced_linear":
prob = (
0.564665
+ np.cbrt(img_f["pair2420_hor"])
* np.arctan(img_f["P5(1)_hor"])
* (-0.185308)
+ np.sin(img_f["std_hor"]) * np.sin(img_f["pair5359_vert"]) * 0.529036
+ np.cos(img_f["range_vert"])
* np.cos(img_f["pair7878_vert"])
* (-0.326662)
+ np.sin(img_f["pair6574_vert"])
* np.cos(img_f["Q3(1)_hor"])
* (-0.337944)
+ np.cos(img_f["IQR_vert"])
* np.cos(img_f["median(2)_vert"])
* (-0.237002)
+ np.sin(img_f["pair5359_vert"])
* np.cos(img_f["median(2)_vert"])
* (-0.118517)
+ np.cos(img_f["median(2)_vert"])
* np.arctan(img_f["P5(1)_hor"])
* 0.138423
+ np.cos(img_f["pair6574_vert"])
* np.arctan(img_f["pair5649_vert"])
* 0.051217
+ np.sin(img_f["pair5359_vert"])
* np.arctan(img_f["x2_vert"])
* 0.296591
+ img_f["dif23_vert"] * np.cos(img_f["dif23_vert"]) * 0.914249
)
else:
prob = 0
elif task_type == "2":
prob = 0
else:
prob = 0
return prob, 1 if prob < 0.5 else 2
def forest_prediction(img_f):
if task_type == "1":
with open(os.path.join(cur_dir, "SystemBack/SelfOrganizationForests/" + sensor_type + ".json")) as f:
forest = json.load(f)
ypl = [] # y_pred list
for obj in forest:
tree = pd.DataFrame(obj["tree"])
leaf = 1
index = 0
flag = False
y_pred = 0
while not flag:
node = tree.loc[index]
if node["side"] == 1:
if img_f[node["feature"]] < node["threshold"]:
y_pred = 1
else:
y_pred = 2
else:
if img_f[node["feature"]] < node["threshold"]:
y_pred = 2
else:
y_pred = 1
try:
index = np.where(
(tree["previous_leaf"] == leaf)
& (tree["previous_direction"] == y_pred)
)[0][0]
leaf = tree.loc[index]["leaf_number"]
except IndexError:
# no matching child node: we reached a leaf of the tree
flag = True
ypl.append(y_pred)
ypl = np.asarray(ypl)
ypl_sum = np.sum(ypl == 1) + np.sum(ypl == 2)
if np.sum(ypl == 1) > np.sum(ypl == 2):
y_pred = 1
forest_prob = (np.sum(ypl == 1) / ypl_sum) * 100
else:
y_pred = 2
forest_prob = (np.sum(ypl == 2) / ypl_sum) * 100
elif task_type == "2":
forest_prob = 0
y_pred = 0
else:
forest_prob = 0
y_pred = 0
return forest_prob, y_pred
def get_mean_signs(img_f):
if task_type == "1":
if sensor_type == "convex":
feature1, feature2, feature3 = (
"cbrt(P95(1)_vert)",
"cos(dif12_vert)",
"std_vert",
)
threshold1, threshold2, threshold3 = (
5.0132979349645845,
0.6306169224667781,
7.127663290343068,
)
value1, value2, value3 = (
np.cbrt(img_f["P95(1)_vert"]),
np.cos(img_f["dif12_vert"]),
img_f["std_vert"],
)
if value1 < threshold1:
res1 = "Печень в норме"
else:
res1 = "Печень не в норме"
if value2 < threshold2:
res2 = "Печень не в норме"
else:
res2 = "Печень в норме"
if value3 < threshold3:
res3 = "Печень в норме"
else:
res3 = "Печень не в норме"
elif sensor_type == "linear":
feature1, feature2, feature3 = (
"cbrt(x2_orig)",
"arctan(pair1526_hor)",
"cos(fractal_dim)",
)
threshold1, threshold2, threshold3 = (
0.6440777961495892,
1.3522438545232742,
0.41596845937104104,
)
value1, value2, value3 = (
np.cbrt(img_f["x2_orig"]),
np.arctan(img_f["pair1526_hor"]),
np.cos(img_f["fractal_dim"]),
)
if value1 < threshold1:
res1 = "Печень в норме"
else:
res1 = "Печень не в норме"
if value2 < threshold2:
res2 = "Печень не в норме"
else:
res2 = "Печень в норме"
if value3 < threshold3:
res3 = "Печень не в норме"
else:
res3 = "Печень в норме"
elif sensor_type == "reinforced_linear":
feature1, feature2, feature3 = (
"cos(range_vert)",
"cbrt(pair2420_hor)",
"sin(pair5359_vert)",
)
threshold1, threshold2, threshold3 = (
0.9998433086476912,
1.6407957194770635,
-0.5549728719823037,
)
value1, value2, value3 = (
np.cos(img_f["range_vert"]),
np.cbrt(img_f["pair2420_hor"]),
np.sin(img_f["pair5359_vert"]),
)
if value1 < threshold1:
res1 = "Печень в норме"
else:
res1 = "Печень не в норме"
if value2 < threshold2:
res2 = "Печень не в норме"
else:
res2 = "Печень в норме"
if value3 < threshold3:
res3 = "Печень не в норме"
else:
res3 = "Печень в норме"
else:
feature1, feature2, feature3 = "", "", ""
threshold1, threshold2, threshold3 = 0, 0, 0
value1, value2, value3 = 0, 0, 0
res1, res2, res3 = 0, 0, 0
elif task_type == "2":
feature1, feature2, feature3 = "", "", ""
threshold1, threshold2, threshold3 = 0, 0, 0
value1, value2, value3 = 0, 0, 0
res1, res2, res3 = 0, 0, 0
else:
feature1, feature2, feature3 = "", "", ""
threshold1, threshold2, threshold3 = 0, 0, 0
value1, value2, value3 = 0, 0, 0
res1, res2, res3 = 0, 0, 0
return [
{"feature": feature1, "threshold": threshold1, "value": value1, "result": res1},
{"feature": feature2, "threshold": threshold2, "value": value2, "result": res2},
{"feature": feature3, "threshold": threshold3, "value": value3, "result": res3},
]
def get_all_features():
with open(os.path.join(cur_dir, "SystemBack/Features/", filename)) as f:
feature_names = json.load(f)["features"]
with open(os.path.join(cur_dir, "SystemBack/BestGrad/", filename)) as f:
best_grad = json.load(f)
with open(os.path.join(cur_dir, "SystemBack/MaxFeatures/", filename)) as f:
best_pairs = json.load(f)
img_f = []
# fractal dimension of image
img_f.append(FS.mink_val(path))
# initial matrix
init_matrix = np.concatenate(FS.get_greyscale_matrix(path), axis=None)
img_f.append((np.sum(init_matrix == np.amin(init_matrix)) / init_matrix.size) * 100)
img_f.append((np.sum(init_matrix == np.amax(init_matrix)) / init_matrix.size) * 100)
# glcm
glcm = FS.get_glcm(init_matrix)
img_f = FS.get_x1x2x3(
glcm, img_f, best_grad["initstandard"], best_grad["initbalanced"]
)
# horizontal differential matrix
img_f, diff_matrix = FS.get_norm_features(
FS.get_greyscale_matrix(path),
img_f,
"hor",
best_grad["horstandard"],
best_grad["horbalanced"],
best_pairs["hor"],
flag=True,
)
# vertical differential matrix
img_f = FS.get_norm_features(
FS.get_greyscale_matrix(path),
img_f,
"vert",
best_grad["vertstandard"],
best_grad["vertbalanced"],
best_pairs["vert"],
)
return pd.DataFrame([img_f], columns=feature_names).iloc[0], diff_matrix
def get_classification_results(parameters):
# task_type: 1 - normal/pathology, 2 - fibrosis stage
global sensor_type, path, task_type, cur_dir, filename
sensor_type, path, task_type = (
parameters["sensor_type"],
parameters["path"],
parameters["task_type"],
)
cur_dir, filename = parameters["cur_dir"], parameters["filename"]
(
img_f,
diff_matrix,
) = get_all_features()  # img_f - image features
# GMDH model (МГУА)
gmdh_prob, gmdh_liver_class = calculate_gmdh_model(img_f)
if gmdh_prob > 1 or gmdh_prob < 0:
gmdh_prob = 100
elif gmdh_liver_class == 2:
gmdh_prob = round(gmdh_prob * 100, 1)
elif gmdh_liver_class == 1:
gmdh_prob = round((1 - gmdh_prob) * 100, 1)
gmdh_result = "Печень в норме" if gmdh_liver_class == 1 else "Печень не в норме"
# Self-organization forest
forest_prob, forest_liver_class = forest_prediction(img_f)
forest_result = "Печень в норме" if forest_liver_class == 1 else "Печень не в норме"
# Thresholds of the three best features
mean_signs = get_mean_signs(img_f)
return (
{
"gmdh_result": gmdh_result,
"gmdh_probability": gmdh_prob,
"forest_result": forest_result,
"forest_probability": forest_prob,
"mean_signs": mean_signs,
},
diff_matrix,
)
|
UPGRADE FROM 1.x to 2.0
=======================
### Exceptions namespaces
All exceptions of this bundle were moved to the `Debril\RssAtomBundle\Exception` namespace.
Feed exceptions now live in the `Debril\RssAtomBundle\Exception\FeedException` namespace.
Before:
```php
use Debril\RssAtomBundle\Driver\DriverUnreachableResourceException;
use Debril\RssAtomBundle\Protocol\Parser\ParserException;
use Debril\RssAtomBundle\Exception\FeedCannotBeReadException;
use Debril\RssAtomBundle\Exception\FeedNotFoundException;
use Debril\RssAtomBundle\Exception\FeedNotModifiedException;
use Debril\RssAtomBundle\Exception\FeedServerErrorException;
use Debril\RssAtomBundle\Exception\FeedForbiddenException;
```
After:
```php
use Debril\RssAtomBundle\Exception\DriverUnreachableResourceException;
use Debril\RssAtomBundle\Exception\ParserException;
use Debril\RssAtomBundle\Exception\FeedException\FeedCannotBeReadException;
use Debril\RssAtomBundle\Exception\FeedException\FeedNotFoundException;
use Debril\RssAtomBundle\Exception\FeedException\FeedNotModifiedException;
use Debril\RssAtomBundle\Exception\FeedException\FeedServerErrorException;
use Debril\RssAtomBundle\Exception\FeedException\FeedForbiddenException;
```
|
for (study in c("SDY212", "SDY400", "SDY404")) {
fn.ge = file.path(PROJECT_DIR, "generated_data", "HIPC",
paste0(study, "_GE_matrix_gene.txt"))
dat = fread(fn.ge, data.table = F)
fn.si = file.path(PROJECT_DIR, "generated_data", "HIPC",
paste0(study, "_sample_info.txt"))
info = fread(fn.si)
si.d0 = info$time == "d0"
dat.d0 = dat[,c(1,which(si.d0)+1)]
info.d0 = info[si.d0,]
fn.ge.d0 = sub(".txt", "_day0.txt", fn.ge)
fn.si.d0 = sub(".txt", "_day0.txt", fn.si)
fwrite(dat.d0, fn.ge.d0, sep="\t", quote=T)
fwrite(info.d0, fn.si.d0, sep="\t", quote=T)
si.hl = info.d0$Response %in% c("low","high")
dat.hl = dat.d0[,c(1,which(si.hl)+1)]
info.hl = info.d0[si.hl,]
fn.ge.hl = sub(".txt", "_ResponseLoHi.txt", fn.ge.d0)
fn.si.hl = sub(".txt", "_ResponseLoHi.txt", fn.si.d0)
fwrite(dat.hl, fn.ge.hl, sep="\t", quote=T)
fwrite(info.hl, fn.si.hl, sep="\t", quote=T)
}
|
{-|
Module : Test.Problem.ProblemExpr.Class
Description : The ProblemExpr Class tests
Copyright : (c) Andrew Burnett, 2014-2015
Maintainer : andyburnett88@gmail.com
Stability : experimental
Portability : -
Exports the testing functionality of the ProblemExpr Class
-}
module Test.Problem.ProblemExpr.Class (
tests
) where
import HSat.Problem.Instances.CNF.Internal
import HSat.Problem.ProblemExpr.Class
import Test.Problem.Instances.CNF.Internal (genCNF)
import TestUtils
name :: String
name = "Class"
tests :: TestTree
tests =
testGroup name [
testGroup "fromProblemExpr" [
testFromProblemExpr1
]
]
testFromProblemExpr1 :: TestTree
testFromProblemExpr1 =
testProperty ("fromProblemExpr (ProblemExpr cnf) " `equiv` " cnf") $ property
(\cnf ->
case fromProblemExpr $ ProblemExpr cnf of
Just cnf'@CNF{} -> cnf === cnf'
Nothing -> counterexample "Incorrect value from fromProblemExpr" False
)
instance Arbitrary ProblemExpr where
arbitrary = oneof [
ProblemExpr <$> sized genCNF
]
shrink problemExpr =
case fromProblemExpr problemExpr of
Just cnf@CNF{} -> map ProblemExpr . shrink $ cnf
Nothing -> []
|
window.onload = function() {
var m1 = new Matrix($('#matrix1'), 20, 20);
m1.construct();
var s1 = new Snake(m1, 200);
s1.setSnake();
var m2 = new Matrix($('#matrix2'), 20, 20);
m2.construct();
var s2 = new Snake(m2, 200);
s2.setSnake();
/*var m3 = new Matrix($('#matrix3'), 20, 20);
m3.construct();
var s3 = new Snake(m3, 200);
s3.setSnake();*/
}
|
let cosmos = require('../../cosmos');
let generateToken = () => {
let ALPHA_NUM = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789abcdefghijklmnopqrstuvwxyz";
let token = '';
for (let i = 0; i < 16; ++i)
token += ALPHA_NUM.charAt(Math.floor(Math.random() * ALPHA_NUM.length));
return token;
};
let getNodeDetails = (txHash, cb) => {
cosmos.call('verifyHash', {
hash: txHash
}, (error, result) => {
console.log(error, result);
if (error) cb(error);
else {
let data = Buffer.from(result.result.data, 'base64');
data = JSON.parse(data.toString()).value;
result = {
accountAddress: data.From,
IP: data.Ip,
pricePerGB: parseFloat(data.PricePerGb),
encMethod: data.EncMethod,
description: data.description,
moniker: data.moniker,
location: {
latitude: parseFloat(data.Location.Latitude) / Math.pow(10, 6),
longitude: parseFloat(data.Location.Longitude) / Math.pow(10, 6),
city: data.Location.City,
country: data.Location.Country
},
netSpeed: {
download: parseFloat(data.NetSpeed.DownloadSpeed),
upload: parseFloat(data.NetSpeed.UploadSpeed)
},
nodeType: data.NodeType,
version: data.Version
};
cb(null, result);
}
});
};
module.exports = {
generateToken,
getNodeDetails
};
|
// Copyright 2021 bilibili-base
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package bootstrap
import (
"github.com/spf13/pflag"
pluginsgrpc "github.com/bilibili-base/powermock/pkg/pluginregistry/grpc"
pluginshttp "github.com/bilibili-base/powermock/pkg/pluginregistry/http"
pluginscript "github.com/bilibili-base/powermock/pkg/pluginregistry/script"
pluginssimple "github.com/bilibili-base/powermock/pkg/pluginregistry/simple"
pluginredis "github.com/bilibili-base/powermock/pkg/pluginregistry/storage/redis"
pluginrediscluster "github.com/bilibili-base/powermock/pkg/pluginregistry/storage/rediscluster"
"github.com/bilibili-base/powermock/pkg/util"
)
// PluginConfig defines the plugin config
type PluginConfig struct {
Redis *pluginredis.Config
RedisCluster *pluginrediscluster.Config
Simple *pluginssimple.Config
GRPC *pluginsgrpc.Config
HTTP *pluginshttp.Config
Script *pluginscript.Config
}
// NewPluginConfig is used to create plugin config
func NewPluginConfig() *PluginConfig {
return &PluginConfig{
Redis: pluginredis.NewConfig(),
RedisCluster: pluginrediscluster.NewConfig(),
Simple: pluginssimple.NewConfig(),
GRPC: pluginsgrpc.NewConfig(),
HTTP: pluginshttp.NewConfig(),
Script: pluginscript.NewConfig(),
}
}
// RegisterFlagsWithPrefix is used to register flags
func (c *PluginConfig) RegisterFlagsWithPrefix(prefix string, f *pflag.FlagSet) {
c.Redis.RegisterFlagsWithPrefix(prefix+"plugin.", f)
c.RedisCluster.RegisterFlagsWithPrefix(prefix+"plugin.", f)
c.Simple.RegisterFlagsWithPrefix(prefix+"plugin.", f)
c.GRPC.RegisterFlagsWithPrefix(prefix+"plugin.", f)
c.HTTP.RegisterFlagsWithPrefix(prefix+"plugin.", f)
c.Script.RegisterFlagsWithPrefix(prefix+"plugin.", f)
}
// Validate is used to validate config and returns error on failure
func (c *PluginConfig) Validate() error {
return util.ValidateConfigs(
c.Redis,
c.RedisCluster,
c.Simple,
c.GRPC,
c.HTTP,
c.Script,
)
}
|
import logging
from flask_sqlalchemy import sqlalchemy
from init import DecorationForm, config
from utilities.scm_enums import ErrorCodes
from utilities.scm_exceptions import ScmException
from utilities.scm_logger import ScmLogger
class DecorationFormRepository:
logger = ScmLogger(__name__)
def __init__(self, db):
self.db = db
def get_all_decoration_forms(self):
return DecorationForm.query. \
order_by(DecorationForm.name). \
all()
def get_paginated_decoration_forms(self,
page,
per_page,
search_text):
decoration_form_recs = DecorationForm.query
if search_text is not None and search_text != '':
search_pattern = '%' + search_text + '%'
decoration_form_recs = decoration_form_recs.filter(DecorationForm.name.ilike(search_pattern))
decoration_form_recs = decoration_form_recs.order_by(DecorationForm.name)
return decoration_form_recs.paginate(page, per_page, error_out=False)
def get_decoration_form(self, id):
return DecorationForm.query.filter(DecorationForm.id == id).first()
def add_decoration_form(self,
name,
description):
try:
decoration_form_rec = DecorationForm(name=name, description=description)
self.db.session.add(decoration_form_rec)
self.db.session.flush()
return decoration_form_rec.id
except sqlalchemy.exc.SQLAlchemyError as ex:
message = 'Error: failed to add decoration form. Details: %s' % (str(ex))
DecorationFormRepository.logger.error(message)
raise ScmException(ErrorCodes.ERROR_ADD_DECORATION_FORM_FAILED, message)
def update_decoration_form(self,
decoration_form_id,
name,
description):
decoration_form_rec = self.get_decoration_form(decoration_form_id)
decoration_form_rec.name = name
decoration_form_rec.description = description
self.db.session.flush()
|
## listbox filtered by entry
If there is no text in the entry, all items are displayed.

If there is text in the entry, only matching elements are displayed.
The match is checked anywhere inside the item (like `"%TEXT%"` in a database query).

|
---
layout: mobile-pe
title: ThingsBoard PE Mobile Application
description:
---
<section id="intro">
<main>
<h1><a href="/docs/pe/mobile/">ThingsBoard PE Mobile Application</a> is an open-source <a href="https://github.com/thingsboard/flutter_thingsboard_pe_app">project</a> based on Flutter</h1>
<h1 class="second">Build your own advanced IoT mobile application with minimum coding efforts</h1>
<h1 class="second">Powered by ThingsBoard PE IoT Platform</h1>
</main>
</section>
<section class="features">
<main>
<div class="features-top">
<div class="background">
<div class="main1"></div><div class="small1"></div><div class="small2"></div><div class="small3"></div>
</div>
<div class="block">
<div class="feature-des"><h2>Home screen with flexible navigation</h2>
<p>Browse everything from home screen. Use Platform to configure dashboard icons, order and visibility.</p>
<a class="read-more-button" href="/docs/pe/mobile/customize-dashboards/">Read more<img class="arrow first" src="/images/pe/read-more-arrow.svg"><img class="arrow second" src="/images/pe/read-more-arrow.svg"><img class="arrow third" src="/images/pe/read-more-arrow.svg"></a>
</div>
<div class="preview">
<div class="mobile-frame ios">
<img class="phone-bg points" src="/images/mobile/pe/mobile-bg-pe.svg">
<img class="phone-bg web flexible" src="/images/mobile/pe/pe-flexible-nav.svg">
<div class="phone-shadow pe"></div>
<div class="frame-image">
<img src="/images/mobile/pe/browse-dashboards-frame.png">
</div>
<div class="frame-video">
<video autoplay loop preload="auto" muted playsinline>
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/browse-dashboards.mp4" type="video/mp4">
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/browse-dashboards.webm" type="video/webm">
</video>
</div>
</div>
</div>
</div>
<div class="block vis">
<div class="preview">
<div class="mobile-frame ios">
<img class="phone-bg points right" src="/images/mobile/pe/mobile-bg-pe.svg">
<img class="phone-bg web right w-label" src="/images/mobile/pe/pe-white-labeling.svg">
<div class="phone-shadow right pe"></div>
<div class="frame-image">
<img src="/images/mobile/pe/white-labeling-frame.png">
</div>
<div class="frame-video">
<video autoplay loop preload="auto" muted playsinline>
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/white-labeling.mp4" type="video/mp4">
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/white-labeling.webm" type="video/webm">
</video>
</div>
</div>
</div>
<div class="feature-des"><h2>White-labeling</h2>
<p>Rebrand your mobile app interface with your company or product logo and color scheme using ThingsBoard PE <a href="/docs/pe/user-guide/white-labeling/">white-labeling</a> feature.</p>
<a class="read-more-button" href="/docs/pe/mobile/white-labeling/">Read more<img class="arrow first" src="/images/pe/read-more-arrow.svg"><img class="arrow second" src="/images/pe/read-more-arrow.svg"><img class="arrow third" src="/images/pe/read-more-arrow.svg"></a>
</div>
</div>
</div>
</main>
</section>
<section class="features">
<main>
<div class="features-top">
<div class="background">
<div class="main2"></div><img src="/images/grid.svg"><div class="small4"></div><div class="small5"></div>
</div>
<div class="block dark">
<div class="feature-des"><h2>Self-registration</h2>
<p>Setup sign-up page for your customers using ThingsBoard PE <a href="/docs/pe/user-guide/self-registration/">self-registration</a> feature.</p>
<a class="read-more-button" href="/docs/pe/mobile/self-registration/">Read more<img class="arrow first" src="/images/pe/read-more-arrow.svg"><img class="arrow second" src="/images/pe/read-more-arrow.svg"><img class="arrow third" src="/images/pe/read-more-arrow.svg"></a>
</div>
<div class="preview">
<div class="mobile-frame ios">
<img class="phone-bg points" src="/images/mobile/pe/mobile-bg-pe.svg">
<img class="phone-bg web self-reg" src="/images/mobile/pe/pe-self-registration.svg">
<div class="phone-shadow pe"></div>
<div class="frame-image">
<img src="/images/mobile/pe/self-registration-frame.png">
</div>
<div class="frame-video">
<video autoplay loop preload="auto" muted playsinline>
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/self-registration.mp4" type="video/mp4">
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/self-registration.webm" type="video/webm">
</video>
</div>
</div>
</div>
</div>
<div class="block micro">
<div class="preview">
<div class="mobile-frame ios">
<img class="phone-bg points right" src="/images/mobile/pe/mobile-bg-pe.svg">
<img class="phone-bg web right alarms-m" src="/images/mobile/pe/pe-alarms-m.svg">
<div class="phone-shadow right pe"></div>
<div class="frame-image">
<img src="/images/mobile/pe/manage-alarms-frame.png">
</div>
<div class="frame-video">
<video autoplay loop preload="auto" muted playsinline>
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/manage-alarms.mp4" type="video/mp4">
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/manage-alarms.webm" type="video/webm">
</video>
</div>
</div>
</div>
<div class="feature-des"><h2>Simple and convenient alarms management</h2>
<p>Manage alarms in one place. Navigate to related dashboards configured on the Platform.</p>
<a class="read-more-button" href="/docs/pe/mobile/alarm-dashboard/">Read more<img class="arrow first" src="/images/pe/read-more-arrow.svg"><img class="arrow second" src="/images/pe/read-more-arrow.svg"><img class="arrow third" src="/images/pe/read-more-arrow.svg"></a>
</div>
</div>
</div>
</main>
</section>
<section class="features">
<main>
<div class="features-top">
<div class="background">
<div class="main3"></div><div class="small6"></div><div class="small7"></div><div class="small8"></div>
</div>
<div class="block">
<div class="feature-des"><h2>Structured devices navigation</h2>
<p>Browse devices grouped by type and online status. In ThingsBoard, assign a device-specific dashboard and image.</p>
<a class="read-more-button" href="/docs/pe/mobile/customize-devices/">Read more<img class="arrow first" src="/images/pe/read-more-arrow.svg"><img class="arrow second" src="/images/pe/read-more-arrow.svg"><img class="arrow third" src="/images/pe/read-more-arrow.svg"></a>
</div>
<div class="preview">
<div class="mobile-frame ios">
<img class="phone-bg points" src="/images/mobile/pe/mobile-bg-pe.svg">
<img class="phone-bg web devices-nav" src="/images/mobile/pe/pe-devices-nav.svg">
<div class="phone-shadow pe"></div>
<div class="frame-image">
<img src="/images/mobile/pe/navigate-devices-frame.png">
</div>
<div class="frame-video">
<video autoplay loop preload="auto" muted playsinline>
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/navigate-devices.mp4" type="video/mp4">
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/navigate-devices.webm" type="video/webm">
</video>
</div>
</div>
</div>
</div>
<div class="block micro">
<div class="preview act">
<div class="mobile-frame ios">
<img class="phone-bg points" src="/images/mobile/pe/mobile-bg-pe.svg">
<img class="phone-bg web mobile-act" src="/images/mobile/pe/pe-mobile-act.svg">
<div class="phone-shadow pe"></div>
<div class="frame-image">
<img src="/images/mobile/pe/mobile-actions-frame.png">
</div>
<div class="frame-video">
<video autoplay loop preload="auto" muted playsinline>
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/mobile-actions.mp4" type="video/mp4">
<source src="https://s3-us-west-1.amazonaws.com/tb-videos/mobile/pe/mobile-actions.webm" type="video/webm">
</video>
</div>
</div>
</div>
<div class="feature-des"><h2>Rich set of mobile actions</h2>
<p>Use your mobile device to take a photo, scan a QR code, update the location, and more from within a dashboard. Extend these actions with your own processing logic using ThingsBoard.</p>
<a class="read-more-button" href="/docs/pe/mobile/mobile-actions/">Read more<img class="arrow first" src="/images/pe/read-more-arrow.svg"><img class="arrow second" src="/images/pe/read-more-arrow.svg"><img class="arrow third" src="/images/pe/read-more-arrow.svg"></a>
</div>
</div>
<div class="background bottom">
<div class="bottom"></div><div class="small9"></div>
</div>
</div>
</main>
</section>
<section id="bottom">
<main>
<a href="/docs/pe/mobile/getting-started/" class="getting-started">Getting started</a>
</main>
</section>
|
export * from './product-deletion.model';
export * from './product-dto.model';
export * from './product-modifitaction.model';
export * from './product.model';
|
function createmesh!(data::Dict{Symbol, Any}, params::Parameters)
el = 0
seg = data[:segments]
sgmedium = data[:media][data[:medium]] / 231. # in [in^3]
for i in 1:length(seg)
mat = data[:materials][seg[i].material]
radius = seg[i].od / 2.0
dm2 = seg[i].od^2 - seg[i].id^2
dm4 = seg[i].od^4 - seg[i].id^4
dp2 = seg[i].od^2 + seg[i].id^2
if typeof(seg[i]) in [Bit, Stabilizer, Bentsub, Rig]
if (el+1) < data[:noofelements] && radius > params.nodes[el+1].stringradius
params.nodes[el+1].fc = seg[i].fc
params.nodes[el+1].stringradius = radius
end
else
nels = floor(Int64, 1.0+data[:ratio]*data[:segments][i].length/data[:segments][i].od)
dl = seg[i].length
elength = 12.0 * dl / nels
# Cross sectional area: csa = pi * d^2 / 4
csa = pi * dm2 / 4.0
sgstring = mat[:sg]
eweight = elength * csa * (sgstring - sgmedium)
# See http://en.wikipedia.org/wiki/Area_moment_of_inertia
# Area moment of inertia: Pi * r^4 / 4 or Pi * d^4 / 64
# See http://en.wikipedia.org/wiki/Polar_moment_of_inertia
# Polar moment of inertia: Pi * r^4 / 2 or Pi * d^4 / 32
ea = csa * mat[:ez]
er = pi * mat[:er] * dm4 / 64
gj = pi * mat[:g] * dm4 / 32
# g is in [m / sec^2], 32.2 [ft/sec^2] or 386.4 [in/sec^2]
# In US-units the assumption is that sgstring & sgmedium are forces [lbf],
# not [lbm].
#println([nels, elength, sgstring, sgmedium, eweight, ea, eiz, eir, gj])
emass = elength * csa * sgstring
mmass = elength * pi * seg[i].id^2 * sgmedium / 4.0
# See Wikipedia, list of moments of inertia:
# http://en.wikipedia.org/wiki/List_of_moments_of_inertia
# Iz = elementaxialmoment = elementmass * (od^2 + id^2) / 8
# Ix = Iy = (1 / 12) * emass * ( 3 * (od^2 + id^2) / 4 + elength^2)
# = emass * elength^2 / 12 + elementaxialmoment / 2
einertialmass = emass + mmass
eaxialmoment = emass * dp2 / 8.0
eradialmoment = emass * elength^2 / 12.0 + eaxialmoment / 2.0
for j in 1:nels
el += 1
# params.nodes[1] is usually the bit or a 1st support for deflection testing
# params.nodes[1] = 0.0, add length of element.
params.nodes[el+1].z = params.nodes[el].z + elength
if typeof(seg[i]) in [Pipe, Collar]
params.elements[el].ea = ea
params.elements[el].er = er
params.elements[el].gj = gj
else
# Adjust these values for Stabilizers. :od is not the whole story.
params.elements[el].ea = 0.58 * ea
params.elements[el].er = 0.68 * er
params.elements[el].gj = 0.36 * gj
end
# Use largest stringRadius if 2 elements differ.
if radius > params.nodes[el].stringradius
params.nodes[el].fc = seg[i].fc
end
params.nodes[el].stringradius = max(radius, params.nodes[el].stringradius)
params.nodes[el+1].fc = seg[i].fc
params.nodes[el+1].stringradius = radius
            # The pattern is always the same: assign 0.5 to this node and 0.5 to the next node;
            # during the next loop iteration, add 0.5 of the next element.
params.nodes[el].weight += eweight / 2.0
params.nodes[el+1].weight = eweight / 2.0
params.nodes[el].mass += emass / 2.0
params.nodes[el+1].mass = emass / 2.0
params.nodes[el].massinertia += einertialmass / 2.0
params.nodes[el+1].massinertia = einertialmass / 2.0
params.nodes[el].axialinertia += eaxialmoment / 2.0
params.nodes[el+1].axialinertia = eaxialmoment / 2.0
params.nodes[el].radialinertia += eradialmoment / 2.0
params.nodes[el+1].radialinertia = eradialmoment / 2.0
# Set element data if it is not a "(Exc)Stab(ilizer)" or "NBS" or "Bit" or "Rig"
if typeof(seg[i]) in [Pipe, Collar]
params.elements[el].fc = seg[i].fc
params.elements[el].length = elength
params.elements[el].radius = radius
params.elements[el].od = seg[i].od
params.elements[el].id = seg[i].id
params.elements[el].mass = emass
end
end
end
end
end
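The hollow-cylinder section formulas referenced in the comments above (cross-sectional area, area moment, polar moment) can be sanity-checked in isolation. A minimal Python sketch, where the helper name and units are hypothetical and not part of this codebase:

```python
import math

# Hedged sketch of the section-property formulas used in createmesh!,
# for a hollow circular pipe with outer diameter od and inner diameter id_
# (consistent units assumed, e.g. inches).
def section_properties(od: float, id_: float) -> dict:
    dm2 = od**2 - id_**2
    dm4 = od**4 - id_**4
    return {
        # Cross-sectional area of the annulus: pi * (od^2 - id^2) / 4
        "csa": math.pi * dm2 / 4.0,
        # Area moment of inertia: pi * (od^4 - id^4) / 64
        "area_moment": math.pi * dm4 / 64.0,
        # Polar moment of inertia: pi * (od^4 - id^4) / 32
        "polar_moment": math.pi * dm4 / 32.0,
    }

props = section_properties(5.0, 3.0)  # e.g. a 5" x 3" collar
```

Note that the polar moment is exactly twice the area moment for a circular section, which is a quick consistency check on the constants 64 and 32 in the Julia code.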
|
<?php
namespace Riconect\MailerBundle\Service;
/**
* Class MongoDBSpool
*/
class MongoDBSpool extends \Swift_ConfigurableSpool
{
    /**
     * @var object $dm
     */
protected $dm;
/**
* @var string $documentClass
*/
protected $documentClass;
/**
* @var boolean $keep_emails
*/
protected $keep_emails = false;
/**
*
*/
public function __construct($doctrine_mongodb, $config = [])
{
$this->dm = $doctrine_mongodb->getManager();
$this->documentClass = $config['message_class'];
$this->keep_emails = $config['keep_sent_emails'];
}
/**
* Starts Spool mechanism.
*/
public function start()
{
}
/**
* Stops Spool mechanism.
*/
public function stop()
{
}
/**
* Tests if Spool mechanism has started.
*
* @return boolean
*/
public function isStarted()
{
return true;
}
/**
* Queues a message.
*
* @param \Swift_Mime_Message $message The message to store
* @return boolean
* @throws \Exception
*/
public function queueMessage(\Swift_Mime_Message $message)
{
$document = new $this->documentClass;
$document->setMessage(serialize($message));
$document->setCreated(new \DateTime());
$document->setStatus('message');
$dm = $this->dm;
try
{
$dm->persist($document);
$dm->flush();
}
catch (\Exception $e)
{
throw $e;
}
return true;
}
/**
* Sends messages using the given transport instance.
*
* @param \Swift_Transport $transport A transport instance
* @param string[] &$failedRecipients An array of failures by-reference
*
* @return int The number of sent emails
*/
public function flushQueue(\Swift_Transport $transport, &$failedRecipients = null)
{
// TODO Fetch messages with status 'error' and 'sending' (stuck for some time) for retry.
if (!$transport->isStarted())
{
$transport->start();
}
$limit = $this->getMessageLimit() ? $this->getMessageLimit() : 1000;
$messages = $this->dm->createQueryBuilder($this->documentClass)
->hydrate(false)
->field('status')->equals('message')
->sort('created', 'asc')
->limit($limit)
->getQuery()
->execute();
if (empty($messages)) {
return 0;
}
$failedRecipients = (array) $failedRecipients;
$count = 0;
$time = time();
foreach ($messages as $message)
{
$this->status($message['_id'], 'sending');
$email = unserialize($message['message']);
try
{
$count += $transport->send($email, $failedRecipients);
$this->status($message['_id'], 'complete');
}
catch (\Exception $e)
{
$this->status($message['_id'], 'error');
}
if ($this->getTimeLimit() && (time() - $time) >= $this->getTimeLimit())
{
break;
}
}
if (!$this->keep_emails)
{
// Delete sent emails.
$this->dm->createQueryBuilder($this->documentClass)
->remove()
->field('status')->equals('complete')
->getQuery()
->execute();
}
return $count;
}
/**
* Change status of Message Document
*
* @param string $message_id Document Id
* @param string $status Status to be changed
*/
private function status($message_id, $status)
{
$this->dm->createQueryBuilder($this->documentClass)
->update()
->field('_id')->equals($message_id)
->field('status')->set($status)
->getQuery()
->execute();
}
}
|
package cz.tacr.elza.dataexchange.input.sections.context;
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;
import org.apache.commons.lang3.StringUtils;
import org.apache.commons.lang3.Validate;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import cz.tacr.elza.core.data.RuleSet;
import cz.tacr.elza.core.data.StaticDataProvider;
import cz.tacr.elza.dataexchange.input.DEImportException;
import cz.tacr.elza.dataexchange.input.context.ImportInitHelper;
import cz.tacr.elza.dataexchange.input.context.ObservableImport;
import cz.tacr.elza.dataexchange.input.sections.SectionProcessedListener;
import cz.tacr.elza.dataexchange.input.storage.StorageManager;
import cz.tacr.elza.domain.ApScope;
import cz.tacr.elza.domain.ArrChange;
import cz.tacr.elza.domain.ArrFund;
import cz.tacr.elza.domain.ParInstitution;
import cz.tacr.elza.repository.InstitutionRepository;
import cz.tacr.elza.schema.v2.FundInfo;
import cz.tacr.elza.service.ArrangementService;
/**
* Context for new funds or subtrees of import node.
*/
public class SectionsContext {
private static final Logger LOG = LoggerFactory.getLogger(SectionsContext.class);
private final List<SectionProcessedListener> sectionProcessedListeners = new LinkedList<>();
private final StorageManager storageManager;
private final int batchSize;
private final ArrChange createChange;
private final ApScope importScope;
private final ImportPosition importPosition;
private final StaticDataProvider staticData;
private final ImportInitHelper initHelper;
private SectionContext currentSection;
public SectionsContext(StorageManager storageManager, int batchSize, ArrChange createChange, ApScope importScope,
ImportPosition importPosition, StaticDataProvider staticData, ImportInitHelper initHelper) {
this.storageManager = storageManager;
this.batchSize = batchSize;
this.createChange = createChange;
this.importScope = importScope;
this.importPosition = importPosition;
this.staticData = staticData;
this.initHelper = initHelper;
}
public ImportPosition getImportPostition() {
return importPosition;
}
public void init(ObservableImport observableImport) {
// NOP
}
public void registerSectionProcessedListener(SectionProcessedListener sectionProcessedListener) {
sectionProcessedListeners.add(sectionProcessedListener);
}
/**
* Prepare context for new section.
*
     * Method endSection has to be called when the section is finished.
*
* @param ruleSetCode
* Rules for section
*/
public void beginSection(String ruleSetCode) {
Validate.isTrue(currentSection == null);
// find rule system
if (StringUtils.isEmpty(ruleSetCode)) {
throw new DEImportException("Rule set code is empty");
}
RuleSet ruleSet = staticData.getRuleSetByCode(ruleSetCode);
if (ruleSet == null) {
throw new DEImportException("Rule set not found, code:" + ruleSetCode);
}
// create current section
SectionContext section = new SectionContext(storageManager, batchSize, createChange,
ruleSet.getEntity(), staticData, initHelper);
// set subsection root adapter when position present
if (importPosition != null) {
String fundRuleSetCode = importPosition.getFundVersion().getRuleSet().getCode();
if (!ruleSetCode.equals(fundRuleSetCode)) {
throw new DEImportException("Rule set must match with fund, subsection code:" + ruleSetCode
+ ", fund code:" + fundRuleSetCode);
}
section.setRootAdapter(new SubsectionRootAdapter(importPosition, createChange, initHelper));
}
// set current section
currentSection = section;
}
public void setFundInfo(FundInfo fundInfo) {
Validate.notNull(currentSection);
if (importPosition != null) {
LOG.warn("Fund info will be ignored during subsection import");
} else {
prepareNewFundRootAdapter(fundInfo, currentSection);
}
}
public SectionContext getCurrentSection() {
Validate.notNull(currentSection);
return currentSection;
}
public void endSection() {
Validate.notNull(currentSection);
currentSection.storeNodes();
// notify listeners
List<SectionProcessedListener> listeners = new ArrayList<>(sectionProcessedListeners);
listeners.forEach(l -> l.onSectionProcessed(currentSection));
// close & clear current section
currentSection.close();
currentSection = null;
}
/* internal methods */
private void prepareNewFundRootAdapter(FundInfo fundInfo, SectionContext sectionCtx) {
InstitutionRepository instRepo = initHelper.getInstitutionRepository();
ArrangementService arrService = initHelper.getArrangementService();
if (StringUtils.isBlank(fundInfo.getN())) {
throw new DEImportException("Fund name must be set");
}
ParInstitution institution = instRepo.findByInternalCode(fundInfo.getIc());
if (institution == null) {
throw new DEImportException("Institution not found, internal code:" + fundInfo.getIc());
}
Integer fundNum = null;
if (fundInfo.getNum() != null) {
fundNum = fundInfo.getNum().intValue();
}
ArrFund fund = arrService.createFund(fundInfo.getN(), fundInfo.getC(), institution,
fundNum, fundInfo.getTr(), fundInfo.getMrk());
arrService.addScopeToFund(fund, importScope);
FundRootAdapter adapter = new FundRootAdapter(fund,
sectionCtx.getRuleSet(),
sectionCtx.getCreateChange(),
arrService);
currentSection.setRootAdapter(adapter);
}
}
|
{-# LANGUAGE CPP #-}
{-# LANGUAGE Rank2Types #-}
{-# LANGUAGE TypeOperators #-}
-- | Resources used by node and ways to deal with them.
module Pos.Launcher.Resource
(
-- * Full resources
NodeResources (..)
, hoistNodeResources
, allocateNodeResources
, releaseNodeResources
, bracketNodeResources
-- * Smaller resources
, loggerBracket
, bracketKademlia
, bracketTransport
) where
import Universum hiding (bracket, finally)
import Control.Concurrent.STM (newEmptyTMVarIO, newTBQueueIO)
import Data.Tagged (untag)
import qualified Data.Time as Time
import Formatting (sformat, shown, (%))
import Mockable (Bracket, Catch, Mockable, Production (..),
Throw, bracket, throw)
import Network.QDisc.Fair (fairQDisc)
import qualified Network.Transport as NT (closeTransport)
import Network.Transport.Abstract (Transport, hoistTransport)
import Network.Transport.Concrete (concrete)
import qualified Network.Transport.TCP as TCP
import System.IO (BufferMode (..), Handle, hClose,
hSetBuffering)
import qualified System.Metrics as Metrics
import System.Wlog (CanLog, LoggerConfig (..), WithLogger,
getLoggerName, logError, prefixB,
productionB, releaseAllHandlers,
setupLogging, showTidB, usingLoggerName)
import Pos.Binary ()
import Pos.Block.Slog (mkSlogContext)
import Pos.Client.CLI.Util (readLoggerConfig)
import Pos.Configuration
import Pos.Context (ConnectedPeers (..), NodeContext (..),
StartTime (..))
import Pos.Core (HasConfiguration, Timestamp,
gdStartTime, genesisData)
import Pos.DB (MonadDBRead, NodeDBs)
import Pos.DB.DB (initNodeDBs)
import Pos.DB.Rocks (closeNodeDBs, openNodeDBs)
import Pos.Delegation (DelegationVar, mkDelegationVar)
import Pos.DHT.Real (KademliaDHTInstance, KademliaParams (..),
startDHTInstance, stopDHTInstance)
import Pos.Infra.Configuration (HasInfraConfiguration)
import Pos.Launcher.Param (BaseParams (..), LoggingParams (..),
NodeParams (..), TransportParams (..))
import Pos.Lrc.Context (LrcContext (..), mkLrcSyncData)
import Pos.Network.Types (NetworkConfig (..), Topology (..))
import Pos.Shutdown.Types (ShutdownContext (..))
import Pos.Slotting (SlottingContextSum (..), SlottingData,
mkNtpSlottingVar, mkSimpleSlottingVar)
import Pos.Ssc.Class (SscConstraint, SscParams,
sscCreateNodeContext)
import Pos.Ssc.Extra (SscState, mkSscState)
import Pos.Ssc.GodTossing.Configuration (HasGtConfiguration)
import Pos.StateLock (newStateLock)
import Pos.Txp (GenericTxpLocalData (..),
mkTxpLocalData, recordTxpMetrics)
#ifdef WITH_EXPLORER
import Pos.Explorer (explorerTxpGlobalSettings)
#else
import Pos.Txp (txpGlobalSettings)
#endif
import Pos.Launcher.Mode (InitMode, InitModeContext (..),
runInitMode)
import Pos.Update.Context (mkUpdateContext)
import qualified Pos.Update.DB as GState
import Pos.Util (newInitFuture)
import Pos.WorkMode (TxpExtra_TMP)
#ifdef linux_HOST_OS
import qualified System.Systemd.Daemon as Systemd
import qualified System.Wlog as Logger
#endif
-- Remove this once there's no #ifdef-ed Pos.Txp import
{-# ANN module ("HLint: ignore Use fewer imports" :: Text) #-}
----------------------------------------------------------------------------
-- Data type
----------------------------------------------------------------------------
-- | This data type contains all resources used by node.
data NodeResources ssc m = NodeResources
{ nrContext :: !(NodeContext ssc)
, nrDBs :: !NodeDBs
, nrSscState :: !(SscState ssc)
, nrTxpState :: !(GenericTxpLocalData TxpExtra_TMP)
, nrDlgState :: !DelegationVar
, nrTransport :: !(Transport m)
, nrJLogHandle :: !(Maybe Handle)
-- ^ Handle for JSON logging (optional).
, nrEkgStore :: !Metrics.Store
}
hoistNodeResources ::
forall ssc n m. Functor m
=> (forall a. n a -> m a)
-> NodeResources ssc n
-> NodeResources ssc m
hoistNodeResources nat nr =
nr {nrTransport = hoistTransport nat (nrTransport nr)}
----------------------------------------------------------------------------
-- Allocation/release/bracket
----------------------------------------------------------------------------
-- | Allocate all resources used by node. They must be released eventually.
allocateNodeResources
:: forall ssc m.
( SscConstraint ssc
, HasConfiguration
, HasNodeConfiguration
, HasInfraConfiguration
-- FIXME avieth
-- 'HasGtConfiguration' arises from 'initNodeDBs', where that constraint
-- in turn arises from 'prepareGStateDB', which is in fact tied to
-- godtossing.
-- So the 'forall ssc' here is misleading. This only works for
-- godtossing. The dependency was hidden before, where the godtossing
-- data was all delivered by global mutable variables. That's to say,
-- 'allocateNodeResources' had a hidden assumption that somebody will
-- fill in the required godtossing data, probably using 'unsafePerformIO'.
, HasGtConfiguration
)
=> Transport m
-> NetworkConfig KademliaDHTInstance
-> NodeParams
-> SscParams ssc
-> Production (NodeResources ssc m)
allocateNodeResources transport networkConfig np@NodeParams {..} sscnp = do
db <- openNodeDBs npRebuildDb npDbPathM
(futureLrcContext, putLrcContext) <- newInitFuture "lrcContext"
(futureSlottingVar, putSlottingVar) <- newInitFuture "slottingVar"
(futureSlottingContext, putSlottingContext) <- newInitFuture "slottingContext"
let putSlotting sv sc = do
putSlottingVar sv
putSlottingContext sc
initModeContext = InitModeContext
db
futureSlottingVar
futureSlottingContext
futureLrcContext
runInitMode initModeContext $ do
initNodeDBs @ssc
nrEkgStore <- liftIO $ Metrics.newStore
txpVar <- mkTxpLocalData -- doesn't use slotting or LRC
let ancd =
AllocateNodeContextData
{ ancdNodeParams = np
, ancdSscParams = sscnp
, ancdPutSlotting = putSlotting
, ancdNetworkCfg = networkConfig
, ancdEkgStore = nrEkgStore
, ancdTxpMemState = txpVar
}
ctx@NodeContext {..} <- allocateNodeContext ancd
putLrcContext ncLrcContext
setupLoggers $ bpLoggingParams npBaseParams
dlgVar <- mkDelegationVar @ssc
sscState <- mkSscState @ssc
let nrTransport = transport
nrJLogHandle <-
case npJLFile of
Nothing -> pure Nothing
Just fp -> do
h <- openFile fp WriteMode
liftIO $ hSetBuffering h NoBuffering
return $ Just h
return NodeResources
{ nrContext = ctx
, nrDBs = db
, nrSscState = sscState
, nrTxpState = txpVar
, nrDlgState = dlgVar
, ..
}
-- | Release all resources used by node. They must be released eventually.
releaseNodeResources ::
forall ssc m. ( )
=> NodeResources ssc m -> Production ()
releaseNodeResources NodeResources {..} = do
releaseAllHandlers
whenJust nrJLogHandle (liftIO . hClose)
closeNodeDBs nrDBs
releaseNodeContext nrContext
-- | Run computation which requires 'NodeResources' ensuring that
-- resources will be released eventually.
bracketNodeResources :: forall ssc m a.
( SscConstraint ssc
, MonadIO m
, HasConfiguration
, HasNodeConfiguration
, HasInfraConfiguration
, HasGtConfiguration
)
=> NodeParams
-> SscParams ssc
-> (HasConfiguration => NodeResources ssc m -> Production a)
-> Production a
bracketNodeResources np sp k = bracketTransport tcpAddr $ \transport ->
bracketKademlia (npBaseParams np) (npNetworkConfig np) $ \networkConfig ->
    bracket (allocateNodeResources transport networkConfig np sp) releaseNodeResources $ \nodeRes -> do
-- Notify systemd we are fully operative
notifyReady
k nodeRes
where
tcpAddr = tpTcpAddr (npTransport np)
----------------------------------------------------------------------------
-- Logging
----------------------------------------------------------------------------
getRealLoggerConfig :: MonadIO m => LoggingParams -> m LoggerConfig
getRealLoggerConfig LoggingParams{..} = do
let cfgBuilder = productionB
<> showTidB
<> maybe mempty prefixB lpHandlerPrefix
cfg <- readLoggerConfig lpConfigPath
pure $ cfg <> cfgBuilder
setupLoggers :: MonadIO m => LoggingParams -> m ()
setupLoggers params = setupLogging =<< getRealLoggerConfig params
-- | RAII for Logging.
loggerBracket :: LoggingParams -> IO a -> IO a
loggerBracket lp = bracket_ (setupLoggers lp) releaseAllHandlers
----------------------------------------------------------------------------
-- NodeContext
----------------------------------------------------------------------------
data AllocateNodeContextData ssc = AllocateNodeContextData
{ ancdNodeParams :: !NodeParams
, ancdSscParams :: !(SscParams ssc)
, ancdPutSlotting :: (Timestamp, TVar SlottingData) -> SlottingContextSum -> InitMode ssc ()
, ancdNetworkCfg :: NetworkConfig KademliaDHTInstance
, ancdEkgStore :: !Metrics.Store
, ancdTxpMemState :: !(GenericTxpLocalData TxpExtra_TMP)
}
allocateNodeContext
:: forall ssc .
(HasConfiguration, HasNodeConfiguration, HasInfraConfiguration, SscConstraint ssc)
=> AllocateNodeContextData ssc
-> InitMode ssc (NodeContext ssc)
allocateNodeContext ancd = do
let AllocateNodeContextData { ancdNodeParams = np@NodeParams {..}
, ancdSscParams = sscnp
, ancdPutSlotting = putSlotting
, ancdNetworkCfg = networkConfig
, ancdEkgStore = store
, ancdTxpMemState = TxpLocalData {..}
} = ancd
ncLoggerConfig <- getRealLoggerConfig $ bpLoggingParams npBaseParams
ncStateLock <- newStateLock
ncStateLockMetrics <- liftIO $ recordTxpMetrics store txpMemPool
lcLrcSync <- mkLrcSyncData >>= newTVarIO
ncSlottingVar <- (gdStartTime genesisData,) <$> mkSlottingVar
ncSlottingContext <-
case npUseNTP of
True -> SCNtp <$> mkNtpSlottingVar
False -> SCSimple <$> mkSimpleSlottingVar
putSlotting ncSlottingVar ncSlottingContext
ncUserSecret <- newTVarIO $ npUserSecret
ncBlockRetrievalQueue <- liftIO $ newTBQueueIO blockRetrievalQueueSize
ncRecoveryHeader <- liftIO newEmptyTMVarIO
ncProgressHeader <- liftIO newEmptyTMVarIO
ncShutdownFlag <- newTVarIO False
ncStartTime <- StartTime <$> liftIO Time.getCurrentTime
ncLastKnownHeader <- newTVarIO Nothing
ncUpdateContext <- mkUpdateContext
ncSscContext <- untag @ssc sscCreateNodeContext sscnp
ncSlogContext <- mkSlogContext store
-- TODO synchronize the NodeContext peers var with whatever system
-- populates it.
peersVar <- newTVarIO mempty
let ctx shutdownQueue =
NodeContext
{ ncConnectedPeers = ConnectedPeers peersVar
, ncLrcContext = LrcContext {..}
, ncShutdownContext = ShutdownContext ncShutdownFlag shutdownQueue
, ncNodeParams = np
#ifdef WITH_EXPLORER
, ncTxpGlobalSettings = explorerTxpGlobalSettings
#else
, ncTxpGlobalSettings = txpGlobalSettings
#endif
, ncNetworkConfig = networkConfig
, ..
}
-- TODO bounded queue not necessary.
ctx <$> liftIO (newTBQueueIO maxBound)
releaseNodeContext :: forall ssc m . MonadIO m => NodeContext ssc -> m ()
releaseNodeContext _ = return ()
-- Create new 'SlottingVar' using data from DB. Probably it would be
-- good to have it in 'infra', but it's complicated.
mkSlottingVar :: (MonadIO m, MonadDBRead m) => m (TVar SlottingData)
mkSlottingVar = newTVarIO =<< GState.getSlottingData
----------------------------------------------------------------------------
-- Kademlia
----------------------------------------------------------------------------
createKademliaInstance ::
(HasNodeConfiguration, MonadIO m, Mockable Catch m, Mockable Throw m, CanLog m)
=> BaseParams
-> KademliaParams
-> Word16 -- ^ Default port to bind to.
-> m KademliaDHTInstance
createKademliaInstance BaseParams {..} kp defaultPort =
usingLoggerName (lpRunnerTag bpLoggingParams) (startDHTInstance instConfig defaultBindAddress)
where
instConfig = kp {kpPeers = ordNub $ kpPeers kp ++ defaultPeers}
defaultBindAddress = ("0.0.0.0", defaultPort)
-- | RAII for 'KademliaDHTInstance'.
bracketKademliaInstance
:: (HasNodeConfiguration, MonadIO m, Mockable Catch m, Mockable Throw m, Mockable Bracket m, CanLog m)
=> BaseParams
-> KademliaParams
-> Word16 -- ^ Default port to bind to.
-> (KademliaDHTInstance -> m a)
-> m a
bracketKademliaInstance bp kp defaultPort action =
bracket (createKademliaInstance bp kp defaultPort) stopDHTInstance action
-- | The 'NodeParams' contain enough information to determine whether a Kademlia
-- instance should be brought up. Use this to safely acquire/release one.
bracketKademlia
:: (HasNodeConfiguration, MonadIO m, Mockable Catch m, Mockable Throw m, Mockable Bracket m, CanLog m)
=> BaseParams
-> NetworkConfig KademliaParams
-> (NetworkConfig KademliaDHTInstance -> m a)
-> m a
bracketKademlia bp nc@NetworkConfig {..} action = case ncTopology of
-- cases that need Kademlia
TopologyP2P{topologyKademlia = kp, ..} ->
bracketKademliaInstance bp kp ncDefaultPort $ \kinst ->
k $ TopologyP2P{topologyKademlia = kinst, ..}
TopologyTraditional{topologyKademlia = kp, ..} ->
bracketKademliaInstance bp kp ncDefaultPort $ \kinst ->
k $ TopologyTraditional{topologyKademlia = kinst, ..}
TopologyRelay{topologyOptKademlia = Just kp, ..} ->
bracketKademliaInstance bp kp ncDefaultPort $ \kinst ->
k $ TopologyRelay{topologyOptKademlia = Just kinst, ..}
TopologyCore{topologyOptKademlia = Just kp, ..} ->
bracketKademliaInstance bp kp ncDefaultPort $ \kinst ->
k $ TopologyCore{topologyOptKademlia = Just kinst, ..}
-- cases that don't
TopologyRelay{topologyOptKademlia = Nothing, ..} ->
k $ TopologyRelay{topologyOptKademlia = Nothing, ..}
TopologyCore{topologyOptKademlia = Nothing, ..} ->
k $ TopologyCore{topologyOptKademlia = Nothing, ..}
TopologyBehindNAT{..} ->
k $ TopologyBehindNAT{..}
TopologyAuxx{..} ->
k $ TopologyAuxx{..}
where
k topology = action (nc { ncTopology = topology })
data MissingKademliaParams = MissingKademliaParams
deriving (Show)
instance Exception MissingKademliaParams
----------------------------------------------------------------------------
-- Transport
----------------------------------------------------------------------------
createTransportTCP
:: (HasNodeConfiguration, MonadIO n, MonadIO m, WithLogger m, Mockable Throw m)
=> TCP.TCPAddr
-> m (Transport n, m ())
createTransportTCP addrInfo = do
loggerName <- getLoggerName
let tcpParams =
(TCP.defaultTCPParameters
{ TCP.transportConnectTimeout =
Just $ fromIntegral networkConnectionTimeout
, TCP.tcpNewQDisc = fairQDisc $ \_ -> return Nothing
-- Will check the peer's claimed host against the observed host
-- when new connections are made. This prevents an easy denial
-- of service attack.
, TCP.tcpCheckPeerHost = True
, TCP.tcpServerExceptionHandler = \e ->
usingLoggerName (loggerName <> "transport") $
logError $ sformat ("Exception in tcp server: " % shown) e
})
transportE <-
liftIO $ TCP.createTransport addrInfo tcpParams
case transportE of
Left e -> do
logError $ sformat ("Error creating TCP transport: " % shown) e
throw e
Right transport -> return (concrete transport, liftIO $ NT.closeTransport transport)
-- | RAII for 'Transport'.
bracketTransport
:: (HasNodeConfiguration, MonadIO m, MonadIO n, Mockable Throw m, Mockable Bracket m, WithLogger m)
=> TCP.TCPAddr
-> (Transport n -> m a)
-> m a
bracketTransport tcpAddr k =
bracket (createTransportTCP tcpAddr) snd (k . fst)
-- | Notify process manager tools like systemd the node is ready.
-- Available only on Linux for systems where `libsystemd-dev` is installed.
-- It defaults to a noop for all the other platforms.
notifyReady :: (MonadIO m, WithLogger m) => m ()
#ifdef linux_HOST_OS
notifyReady = do
res <- liftIO Systemd.notifyReady
case res of
Just () -> return ()
Nothing -> Logger.logWarning "notifyReady failed to notify systemd."
#else
notifyReady = return ()
#endif
|
class Questionnaire < ActiveRecord::Base
attr_accessible :parent_id, :question
def self.get_root_question_id
    1 # TODO: should actually fetch the root question instead of hard-coding it
end
end
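The hard-coded root lookup above could instead derive the root from the data. A plain-Ruby sketch (no ActiveRecord) of the intended lookup, where the hash layout is a hypothetical stand-in for the questionnaires table rather than the actual schema:

```ruby
# Hypothetical in-memory stand-in for the questionnaires table:
# the root question is the one with no parent.
questions = [
  { id: 1, parent_id: nil, question: "Root?" },
  { id: 2, parent_id: 1,   question: "Child" },
]

root_id = questions.find { |q| q[:parent_id].nil? }[:id]
```

With ActiveRecord this would correspond to something like `where(parent_id: nil)`, assuming the root row really is the only one without a parent.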
|
#!/bin/bash
if which flow >/dev/null 2>&1
then
echo 0 > ~/install-exit-status
else
echo "ERROR: Open Porous Media (OPM) is not found on the system. The flow binary could not be found in the PATH. Binaries for RHEL / Ubuntu and download instructions can be found @ https://opm-project.org/"
echo 2 > ~/install-exit-status
exit
fi
tar -xf opm-data-norne-202007.tar.xz
tar -xf Norne-4C.tar.gz
echo "#!/bin/sh
NPROC=\$2
MPIRUN_AS_ROOT_ARG=\"--allow-run-as-root\"
if [ \`whoami\` != \"root\" ]
then
MPIRUN_AS_ROOT_ARG=\"\"
fi
if [ \$1 = \"flow_mpi_norne\" ]
then
cd norne
nice mpirun \$MPIRUN_AS_ROOT_ARG -np \$NPROC --report-bindings \$HOSTFILE flow NORNE_ATW2013.DATA --tolerance-mb=1e-5 --max-strict-iter=4 > \$LOG_FILE 2>&1
elif [ \$1 = \"flow_mpi_norne_4c\" ]
then
cd Norne-4C
nice mpirun \$MPIRUN_AS_ROOT_ARG -np \$NPROC --report-bindings \$HOSTFILE flow NORNE_ATW2013_4C_MSW.DATA --tolerance-mb=1e-5 --max-strict-iter=4 > \$LOG_FILE 2>&1
fi
flow --version > ~/pts-footnote
# echo \$? > ~/test-exit-status" > opm
chmod +x opm
|
from HRMSystem.models import HRMSystem
from django.contrib import admin
@admin.register(HRMSystem)
class HRMSystemAdmin(admin.ModelAdmin):
pass
|
#!/usr/bin/env python
# -*- coding:utf-8 -*-
"""
Date: 2022/5/23 14:05
Desc: 东方财富网-数据中心-特色数据-停复牌信息
http://data.eastmoney.com/tfpxx/
"""
import pandas as pd
import requests
def stock_tfp_em(date: str = "20220523") -> pd.DataFrame:
"""
东方财富网-数据中心-特色数据-停复牌信息
http://data.eastmoney.com/tfpxx/
:param date: specific date as "2020-03-19"
:type date: str
:return: 停复牌信息表
:rtype: pandas.DataFrame
"""
url = "https://datacenter-web.eastmoney.com/api/data/v1/get"
params = {
"sortColumns": "SUSPEND_START_DATE",
"sortTypes": "-1",
"pageSize": "500",
"pageNumber": "1",
"reportName": "RPT_CUSTOM_SUSPEND_DATA_INTERFACE",
"columns": "ALL",
"source": "WEB",
"client": "WEB",
"filter": f"""(MARKET="全部")(DATETIME='{"-".join([date[:4], date[4:6], date[6:]])}')""",
}
r = requests.get(url, params=params)
data_json = r.json()
temp_df = pd.DataFrame(data_json["result"]["data"])
temp_df.reset_index(inplace=True)
temp_df["index"] = temp_df.index + 1
temp_df.columns = [
"序号",
"代码",
"名称",
"停牌时间",
"停牌截止时间",
"停牌期限",
"停牌原因",
"所属市场",
"停牌开始日期",
"预计复牌时间",
"-",
"-",
]
temp_df = temp_df[
["序号", "代码", "名称", "停牌时间", "停牌截止时间", "停牌期限", "停牌原因", "所属市场", "预计复牌时间"]
]
temp_df["停牌时间"] = pd.to_datetime(temp_df["停牌时间"]).dt.date
temp_df["停牌截止时间"] = pd.to_datetime(temp_df["停牌截止时间"]).dt.date
temp_df["预计复牌时间"] = pd.to_datetime(temp_df["预计复牌时间"]).dt.date
return temp_df
if __name__ == "__main__":
stock_tfp_em_df = stock_tfp_em(date="20220523")
print(stock_tfp_em_df)
|
package com.example;
import net.thauvin.erik.semver.Version;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.text.SimpleDateFormat;
@Version(properties = "version.properties")
//@Version(
// properties = "example.properties",
// keysPrefix = "example.",
// preReleaseKey = "release",
// buildMetaKey = "meta")
public class Example {
public static void main(String... args) throws IOException {
final SimpleDateFormat sdf = new SimpleDateFormat("EEE, d MMM yyyy 'at' HH:mm:ss z");
System.out.println("-----------------------------------------------------");
System.out.println(" Version: " + GeneratedVersion.PROJECT + ' ' + GeneratedVersion.VERSION);
System.out.println(" Built on: " + sdf.format(GeneratedVersion.BUILDDATE));
System.out.println(" Major: " + GeneratedVersion.MAJOR);
System.out.println(" Minor: " + GeneratedVersion.MINOR);
System.out.println(" Patch: " + GeneratedVersion.PATCH);
System.out.println(" PreRelease: " + GeneratedVersion.PRERELEASE);
System.out.println(" BuildMetaData: " + GeneratedVersion.BUILDMETA);
System.out.println("-----------------------------------------------------");
if (args.length == 1) {
final Path path = Paths.get(args[0]);
if (Files.exists(path)) {
final List<String> content = Files.readAllLines(path);
System.out.println("> cat " + path.getFileName());
for (final String line : content) {
System.out.println(line);
}
}
}
}
}
|
ALTER TABLE `lh_chat_online_user`
ADD `device_type` tinyint(1) NOT NULL DEFAULT '0' COMMENT '';
ALTER TABLE `lh_abstract_proactive_chat_invitation`
ADD INDEX `show_on_mobile` (`show_on_mobile`);
|
PROGRAM TSRFAC
C
C Define the error file, the Fortran unit number, the workstation type,
C and the workstation ID to be used in calls to GKS routines.
C
C PARAMETER (IERRF=6, LUNIT=2, IWTYPE=1, IWKID=1) ! NCGM
C PARAMETER (IERRF=6, LUNIT=2, IWTYPE=8, IWKID=1) ! X Windows
C PARAMETER (IERRF=6, LUNIT=2, IWTYPE=11, IWKID=1) ! PDF
C PARAMETER (IERRF=6, LUNIT=2, IWTYPE=20, IWKID=1) ! PostScript
C
PARAMETER (IERRF=6, LUNIT=2, IWTYPE=1, IWKID=1)
C
C OPEN GKS, OPEN WORKSTATION OF TYPE 1, ACTIVATE WORKSTATION
C
CALL GOPKS (IERRF, ISZDM)
CALL GOPWK (IWKID, LUNIT, IWTYPE)
CALL GACWK (IWKID)
C
C INVOKE DEMO DRIVER
C
CALL SRFAC(IERR)
C
C DEACTIVATE AND CLOSE WORKSTATION, CLOSE GKS.
C
CALL GDAWK (IWKID)
CALL GCLWK (IWKID)
CALL GCLKS
C
STOP
END
C
SUBROUTINE SRFAC (IERROR)
C
C PURPOSE To provide a simple demonstration of SRFACE.
C
C USAGE CALL SRFAC (IERROR)
C
C ARGUMENTS
C
C ON OUTPUT IERROR
C An integer variable
C = 0, if the test was successful,
C = 1, the test was not successful.
C
C I/O If the test is successful, the message
C
C SRFACE TEST EXECUTED--SEE PLOT TO CERTIFY
C
C is printed on unit 6. In addition, 2
C frames are produced on the machine graphics
C device. In order to determine if the test
C was successful, it is necessary to examine
C the plots.
C
C PRECISION Single
C
C LANGUAGE FORTRAN 77
C
C REQUIRED ROUTINES SRFACE
C
C REQUIRED GKS LEVEL 0A
C
C ALGORITHM The function
C
C                         Z(X,Y) = .25*(X + Y + 1./((X-.1)**2+Y**2+.09)
C                                  -1./((X+.1)**2+Y**2+.09))
C
C for X = -1. to +1. in increments of .1, and
C Y = -1.2 to +1.2 in increments of .1,
C                         is computed.  Then, entries EZSRFC and SRFACE
C are called to generate surface plots of Z.
C
C HISTORY                 SRFACE was first written in April 1979 and
C converted to FORTRAN 77 and GKS in March 1984.
C
C XX contains the X-direction coordinate values for Z(X,Y); YY contains
C the Y-direction coordinate values for Z(X,Y); Z contains the function
C values; S contains values for the line of sight for entry SRFACE;
C WORK is a work array; ANGH contains the angle in degrees in the X-Y
C plane to the line of sight; and ANGV contains the angle in degrees
C from the X-Y plane to the line of sight.
C
REAL XX(21) ,YY(25) ,Z(21,25) ,S(6) ,
1 WORK(1096)
C
DATA S(1), S(2), S(3), S(4), S(5), S(6)/
1 -8.0, -6.0, 3.0, 0.0, 0.0, 0.0/
C
DATA ANGH/45./, ANGV/15./
C
C Specify coordinates for plot titles. The values CX and CY
C define the center of the title string in a 0. to 1. range.
C
DATA CX/.5/, CY/.9/
C
C Initialize the error parameter.
C
IERROR = 0
C
C Fill the XX and YY coordinate arrays as well as the Z function array.
C
DO 20 I=1,21
X = .1*REAL(I-11)
XX(I) = X
DO 10 J=1,25
Y = .1*REAL(J-13)
YY(J) = Y
Z(I,J) = (X+Y+1./((X-.1)**2+Y**2+.09)-
1 1./((X+.1)**2+Y**2+.09))*.25
10 CONTINUE
20 CONTINUE
C
C Select the normalization transformation 0.
C
CALL GSELNT(0)
C
C
C Frame 1 -- The EZSRFC entry.
C
C Add the plot title using GKS calls.
C
C Set the text alignment to center the string in horizontal and vertical
C
CALL GSTXAL(2,3)
C
C Set the character height.
C
CALL GSCHH(.016)
C
C Write the text.
C
CALL GTX(CX,CY,'DEMONSTRATION PLOT FOR EZSRFC ENTRY OF SRFACE')
C
CALL EZSRFC (Z,21,25,ANGH,ANGV,WORK)
C
C
C Frame 2 -- The SRFACE entry.
C
C Add the plot title.
C
C Set the text alignment to center the string in horizontal and vertical
C
CALL GSTXAL(2,3)
C
C Set the character height.
C
CALL GSCHH(.016)
C
C Write the text.
C
CALL GTX(CX,CY,'DEMONSTRATION PLOT FOR SRFACE ENTRY OF SRFACE')
C
CALL SRFACE (XX,YY,Z,WORK,21,21,25,S,0.)
C
C This routine automatically generates frame advances.
C
WRITE (6,1001)
C
RETURN
C
1001 FORMAT (' SRFACE TEST EXECUTED--SEE PLOT TO CERTIFY')
C
END
|
#include "usr_misc.h"
#include "shell.h"
#include "string.h"
#include "mpconfigport.h"
#if MICROPY_PY_THREAD
#include "mpthreadport.h"
#endif
#ifdef MICROPY_USING_FILESYSTEM
#include "vfs_fs.h"
#include "vfs_posix.h"
#include "usr_general.h"
#endif
#ifdef MICROPY_USING_AMS
#include "ams.h"
#ifndef MICROPY_USING_FILESYSTEM
#error "The application management system (AMS) requires a filesystem; please enable the filesystem!"
#endif
#else
#include <os_util.h>
#endif
//#if SHELL_TASK_STACK_SIZE < 2336
//#error "SHELL_TASK_STACK_SIZE need more than 2336 bytes if use microPython"
//#endif
os_err_t usr_task_delete(os_task_t *task)
{
os_err_t ret = os_task_destroy(task);
micropy_file_exit();
return ret;
}
#ifdef MICROPY_USING_FILESYSTEM
#define MICROPY_OPEN_BLOCK_DEVICE_TIMES 5
#ifdef BOARD_ATK_APOLLO
#include <fal/fal.h>
static int mp_fal_mount(const char *part_name, const char *mount_point, const char *fs_type)
{
int32_t ret = MP_ERROR;
if (fal_blk_device_create(part_name))
{
        mp_log("Created a block device on the %s partition of flash successfully.\r\n", part_name);
}
else
{
mp_log("Can't create a block device on '%s' partition.\r\n", part_name);
return ret;
}
ret = vfs_mount(part_name, mount_point, fs_type, 0, 0);
if (ret == 0)
{
mp_log("filesystem mount successful.\r\n");
}
else
{
mp_log("filesystem mount fail.\r\n");
        ret = vfs_mkfs(fs_type, part_name);
if(ret != 0)
{
mp_err("Failed to make file system!");
}
ret = vfs_mount(part_name, mount_point, fs_type, 0, 0);
if(ret != 0)
{
mp_err("Failed to mount file system!");
}
}
return ret;
}
#endif
static int fs_dev_link(int argc, char **argv)
{
char *file_sys_device = MICROPY_FS_DEVICE_NAME;
if (argc == 2)
{
file_sys_device = argv[1];
mp_log("file_sys_device:%s", file_sys_device);
}
#ifdef BOARD_ATK_APOLLO
mp_fal_mount(file_sys_device, "/", "fat");
#else
int ret = 0;
int i = 0;
    /* --todo-- workaround: on the Explorer 407 (探索者407) board, sd0 is sometimes not yet registered when the filesystem is mounted */
for (i = 0; i < MICROPY_OPEN_BLOCK_DEVICE_TIMES; i++)
{
ret = mpy_usr_driver_open(file_sys_device);
if (ret == 0)
{
mp_log("Try[%d] open device[%s] success.", i, file_sys_device);
break;
}
os_task_tsleep(100);
}
if (ret != 0)
{
mp_err("Failed to open device[%s].", file_sys_device);
return OS_ERROR;
}
    /* Mount the file system from the TF card (sd0) or internal flash (W25Q64) */
ret = vfs_mount(file_sys_device, "/", "fat", 0, 0);
if (ret != 0)
{
mp_log("For the first time mount file system on device[%s] failed, try again.",
file_sys_device);
os_task_tsleep(1);
ret = vfs_mkfs("fat" ,file_sys_device);
if(ret != 0)
{
mp_err("Failed to make file system!\n");
return OS_ERROR;
}
ret = vfs_mount(file_sys_device, "/", "fat", 0, 0);
if(ret != 0)
{
mp_err("Failed to mount file system!\n");
return OS_ERROR;
}
}
#endif
mp_log("File system initialized!");
os_task_tsleep(500);
return OS_EOK;
}
#endif
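`fs_dev_link` above retries opening the block device up to `MICROPY_OPEN_BLOCK_DEVICE_TIMES` times (it may not be registered yet), then mounts and, on failure, formats once and mounts again. That retry/mkfs/mount flow can be sketched as (Python; all names are illustrative):

```python
def mount_with_retry(open_dev, mount, mkfs, tries=5):
    # Retry opening the device (it may not be registered yet), then
    # mount; if the first mount fails, format once and mount again.
    for _ in range(tries):
        if open_dev() == 0:
            break
    else:
        return "open-failed"
    if mount() == 0:
        return "mounted"
    if mkfs() != 0:
        return "mkfs-failed"
    return "mounted" if mount() == 0 else "mount-failed"

# Simulated device: first two opens fail, mount succeeds after mkfs.
opens = iter([1, 1, 0])
mounts = iter([1, 0])
print(mount_with_retry(lambda: next(opens), lambda: next(mounts), lambda: 0))
```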
static int register_mpy_log_api(void)
{
#ifdef MICROPY_USING_AMS
struct ams_misc_fun *misc_fun = ams_port_get_misc_structure();
misc_fun->sm_fun->print = os_kprintf;
#ifdef MICROPY_USING_DEBUG_MODE
misc_fun->sm_fun->model = MODEL_LOG_BASE;
#else
misc_fun->sm_fun->model = MODEL_LOG_DEBUG;
#endif
#else
struct model_misc_fun * misc_api = model_get_misc_structure();
misc_api->print = os_kprintf;
#ifdef MICROPY_USING_DEBUG_MODE
misc_api->model = MODEL_LOG_BASE;
#else
misc_api->model = MODEL_LOG_DEBUG;
#endif
#endif
return 0;
}
OS_PREV_INIT(register_mpy_log_api, OS_INIT_SUBLEVEL_LOW);
static int micropy_start(void)
{
int ret = MP_EOK;
#ifdef MICROPY_USING_AMS
init_ams();
ret = start_ams_component();
mp_log("ams_device:%s, ret: %d", mp_misc_get_dev_name(AMS_DEVICE_ID), ret);
#endif
#ifdef MICROPY_USING_FILESYSTEM
ret = fs_dev_link(0, NULL);
#endif
return ret;
}
OS_APP_INIT(micropy_start, OS_INIT_SUBLEVEL_LOW);
os_task_t *g_micropy_task = NULL;
char g_script_name[40] = {0};
static int save_file(char *file)
{
    if (!file)
    {
        return MP_ERROR;
    }
    /* Clear the whole buffer, then copy with an explicit bound to avoid overflow. */
    memset(g_script_name, 0, sizeof(g_script_name));
    strncpy(g_script_name, file, sizeof(g_script_name) - 1);
    return MP_EOK;
}
int run_mpy(int argc, char **argv)
{
char *file = NULL;
if (argc == 1)
{
#ifdef MICROPYTHON_USING_REPL
mpy_repl_entry();
#else
        mp_err("The MicroPython REPL mode is disabled.");
#endif
return MP_EOK;
}
file = argv[1];
if (strncmp(file, "stop", 4) == 0)
{
if (g_micropy_task)
{
usr_task_delete(g_micropy_task);
}
else
{
mp_log("No app running!");
}
return MP_EOK;
}
save_file(file);
g_micropy_task = os_task_create("mpy-task",
(fun_0_1_t)mpy_file_entry,
g_script_name,
8192,
                                    19); /* tshell's priority is 20; use 19 here so data reads/writes are serviced in time */
if (!g_micropy_task)
{
mp_err("Failed to create micropython task !");
return MP_ERROR;
}
os_task_startup(g_micropy_task);
return MP_EOK;
}
SH_CMD_EXPORT(mpy, run_mpy, "Run/stop python file or enter MicroPython repl");
OS_PREV_INIT(mpycall_device_list_init, OS_INIT_SUBLEVEL_MIDDLE);
|
mod fixtures;
use assert_cmd::prelude::*;
use assert_fs::fixture::TempDir;
use fixtures::{port, tmpdir, Error};
use rstest::rstest;
use std::process::{Command, Stdio};
use std::thread::sleep;
use std::time::Duration;
#[rstest(headers,
case(vec!["x-info: 123".to_string()]),
case(vec!["x-info1: 123".to_string(), "x-info2: 345".to_string()])
)]
fn custom_header_set(tmpdir: TempDir, port: u16, headers: Vec<String>) -> Result<(), Error> {
let mut child = Command::cargo_bin("miniserve")?
.arg(tmpdir.path())
.arg("-p")
.arg(port.to_string())
.args(headers.iter().flat_map(|h| vec!["--header", h]))
.stdout(Stdio::null())
.spawn()?;
sleep(Duration::from_secs(1));
let resp = reqwest::blocking::get(format!("http://localhost:{}", port).as_str())?;
for header in headers {
let mut header_split = header.splitn(2, ':');
let header_name = header_split.next().unwrap();
let header_value = header_split.next().unwrap().trim();
assert_eq!(resp.headers().get(header_name).unwrap(), header_value);
}
child.kill()?;
Ok(())
}
|
#!/bin/bash
nodes=1
ppn=4
let nmpi=$nodes*$ppn
tstamp=`date +%m_%d_%H_%M_%S`
#--------------------------------------
cat >batch.job <<EOF
#BSUB -o %J.out
#BSUB -e %J.err
#BSUB -R "span[ptile=${ppn}]"
#BSUB -R "select[ngpus=6] rusage[ngpus_shared=20]"
#BSUB -R "select[type=RHEL7_4]"
#BSUB -env "LSB_START_JOB_MPS=N"
#BSUB -R "affinity[core(11):distribute=pack]"
#BSUB -n ${nmpi}
#BSUB -x
#BSUB -q excl_ws_dd21
#BSUB -W 15
#---------------------------------------
ulimit -s 10240
ulimit -c 1000
/opt/ibm/spectrum_mpi/bin/mpirun -aff off -tag-output -np $nmpi ./snap 4rank.in mout.4rank.$tstamp
EOF
#---------------------------------------
bsub <batch.job
|
package enterpriseAndMobile.dto;
import enterpriseAndMobile.model.Round;
import java.util.List;
public class QuizDto {
private String name;
private boolean enabled;
private List<Round> rounds;
public QuizDto() {
}
public QuizDto(String name, boolean enabled) {
this.name = name;
this.enabled = enabled;
}
public QuizDto(String name, boolean enabled, List<Round> rounds) {
this.name = name;
this.enabled = enabled;
this.rounds = rounds;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public boolean isEnabled() {
return enabled;
}
public void setEnabled(boolean enabled) {
this.enabled = enabled;
}
public List<Round> getRounds() { return rounds; }
public void setRounds(List<Round> rounds) { this.rounds = rounds;
}
}
|
//
// UITextView_twPlaceHolder.h
// TWTool_Example
//
// Created by TW on 2021/3/11.
// Copyright © 2021 tanwang11. All rights reserved.
//
#import <UIKit/UIKit.h>
NS_ASSUME_NONNULL_BEGIN
IB_DESIGNABLE
// IB_DESIGNABLE lets a custom UIView render a live preview in Interface Builder.
// IBInspectable exposes a custom UIView property in Interface Builder's Attributes inspector.
@interface UITextView_twPlaceHolder : UITextView
@property (nonatomic, retain) IBInspectable NSString * placeholder;
@property (nonatomic, retain) IBInspectable UIColor * placeholderColor;
@property (nonatomic, strong) UILabel * placeHolderLabel;
@property (nonatomic ) CGPoint placeHolderOrigin;
-(void)textChanged:(NSNotification* _Nullable)notification;
@end
NS_ASSUME_NONNULL_END
|
# SPDX-License-Identifier: BSD-3-Clause
# Copyright (c) 2021 Scipp contributors (https://github.com/scipp)
# @file
# @author Neil Vaytet
import pytest
from pathlib import Path
from tempfile import TemporaryDirectory
import numpy as np
import scipp as sc
from ..factory import make_dense_data_array, make_dense_dataset, \
make_binned_data_array
from .plot_helper import plot
import matplotlib
matplotlib.use('Agg')
def test_plot_2d():
da = make_dense_data_array(ndim=2)
plot(da)
plot(da, resampling_mode='sum')
plot(da, resampling_mode='mean')
def test_plot_2d_dataset():
plot(make_dense_dataset(ndim=2))
def test_plot_2d_with_variances():
plot(make_dense_data_array(ndim=2, with_variance=True))
def test_plot_2d_with_log():
da = make_dense_data_array(ndim=2)
plot(da, norm='log')
plot(da, norm='log', resampling_mode='sum')
plot(da, norm='log', resampling_mode='mean')
def test_plot_2d_with_log_and_variances():
da = make_dense_data_array(ndim=2, with_variance=True)
plot(da, norm='log')
plot(da, norm='log', resampling_mode='sum')
plot(da, norm='log', resampling_mode='mean')
def test_plot_2d_with_vmin_vmax():
da = make_dense_data_array(ndim=2)
plot(da, vmin=0.1 * da.unit, vmax=0.9 * da.unit)
def test_plot_2d_with_unit():
plot(make_dense_data_array(ndim=2, unit=sc.units.kg))
def test_plot_2d_with_vmin_vmax_with_log():
da = make_dense_data_array(ndim=2)
plot(da, vmin=0.1 * da.unit, vmax=0.9 * da.unit, norm='log')
def test_plot_2d_with_log_scale_x():
plot(make_dense_data_array(ndim=2), scale={'xx': 'log'})
def test_plot_2d_with_log_scale_y():
plot(make_dense_data_array(ndim=2), scale={'yy': 'log'})
def test_plot_2d_with_log_scale_xy():
plot(make_dense_data_array(ndim=2), scale={'xx': 'log', 'yy': 'log'})
def test_plot_2d_with_aspect():
plot(make_dense_data_array(ndim=2), aspect='equal')
plot(make_dense_data_array(ndim=2), aspect='auto')
def test_plot_2d_with_grid():
plot(make_dense_data_array(ndim=2), grid=True)
def test_plot_2d_with_nan():
da = make_dense_data_array(ndim=2)
da.values[0, 0] = np.nan
plot(da)
def test_plot_2d_with_nan_with_log():
da = make_dense_data_array(ndim=2)
da.values[0, 0] = np.nan
plot(da, norm='log')
def test_plot_2d_with_cmap():
plot(make_dense_data_array(ndim=2), cmap='jet')
def test_plot_2d_with_labels():
plot(make_dense_data_array(ndim=2, labels=True), labels={'xx': 'lab'})
def test_plot_2d_with_attrs():
plot(make_dense_data_array(ndim=2, attrs=True), labels={'xx': 'attr'})
def test_plot_2d_with_filename():
with TemporaryDirectory() as dirname:
plot(make_dense_data_array(ndim=2),
filename=Path(dirname) / 'image.pdf',
close=False)
def test_plot_2d_with_bin_edges():
plot(make_dense_data_array(ndim=2, binedges=True))
def test_plot_2d_with_masks():
plot(make_dense_data_array(ndim=2, masks=True))
def test_plot_2d_with_masks_and_labels():
plot(make_dense_data_array(ndim=2, masks=True, labels=True), labels={'xx': 'lab'})
def test_plot_2d_with_non_regular_bin_edges():
da = make_dense_data_array(ndim=2, binedges=True)
da.coords['xx'].values = da.coords['xx'].values**2
plot(da)
def test_plot_2d_with_non_regular_bin_edges_resolution():
da = make_dense_data_array(ndim=2, binedges=True)
da.coords['xx'].values = da.coords['xx'].values**2
plot(da, resolution=128)
def test_plot_2d_with_non_regular_bin_edges_with_masks():
da = make_dense_data_array(ndim=2, masks=True, binedges=True)
da.coords['xx'].values = da.coords['xx'].values**2
plot(da)
def test_plot_variable_2d():
N = 50
v2d = sc.Variable(dims=['yy', 'xx'], values=np.random.rand(N, N), unit='K')
plot(v2d)
def test_plot_ndarray_2d():
plot(np.random.random([10, 50]))
def test_plot_dict_of_ndarrays_2d():
plot({'a': np.arange(50).reshape(5, 10), 'b': np.random.random([30, 40])})
def test_plot_from_dict_variable_2d():
plot({'dims': ['yy', 'xx'], 'values': np.random.random([20, 10])})
def test_plot_from_dict_data_array_2d():
plot({
'data': {
'dims': ['yy', 'xx'],
'values': np.random.random([20, 10])
},
'coords': {
'xx': {
'dims': ['xx'],
'values': np.arange(11)
},
'yy': {
'dims': ['yy'],
'values': np.arange(21)
}
}
})
def test_plot_string_and_vector_axis_labels_2d():
N = 10
M = 5
vecs = []
for i in range(N):
vecs.append(np.random.random(3))
da = sc.DataArray(data=sc.Variable(dims=['yy', 'xx'],
values=np.random.random([M, N]),
unit='counts'),
coords={
'xx':
sc.vectors(dims=['xx'], values=vecs, unit='m'),
'yy':
sc.Variable(dims=['yy'],
values=['a', 'b', 'c', 'd', 'e'],
unit='m')
})
plot(da)
def test_plot_2d_with_dimension_of_size_1():
N = 10
M = 1
x = np.arange(N, dtype=np.float64)
y = np.arange(M, dtype=np.float64)
z = np.arange(M + 1, dtype=np.float64)
d = sc.Dataset()
d['a'] = sc.Variable(dims=['yy', 'xx'],
values=np.random.random([M, N]),
unit=sc.units.counts)
d['b'] = sc.Variable(dims=['zz', 'xx'],
values=np.random.random([M, N]),
unit=sc.units.counts)
d.coords['xx'] = sc.Variable(dims=['xx'], values=x, unit=sc.units.m)
d.coords['yy'] = sc.Variable(dims=['yy'], values=y, unit=sc.units.m)
d.coords['zz'] = sc.Variable(dims=['zz'], values=z, unit=sc.units.m)
plot(d['a'])
plot(d['b'])
def test_plot_2d_with_dimension_of_size_2():
a = sc.DataArray(data=sc.zeros(dims=['yy', 'xx'], shape=[2, 4]),
coords={
'xx': sc.Variable(dims=['xx'], values=[1, 2, 3, 4]),
'yy': sc.Variable(dims=['yy'], values=[1, 2])
})
plot(a)
def test_plot_2d_ragged_coord():
plot(make_dense_data_array(ndim=2, ragged=True))
def test_plot_2d_ragged_coord_bin_edges():
plot(make_dense_data_array(ndim=2, ragged=True, binedges=True))
def test_plot_2d_ragged_coord_with_masks():
plot(make_dense_data_array(ndim=2, ragged=True, masks=True))
def test_plot_2d_with_labels_but_no_dimension_coord():
da = make_dense_data_array(ndim=2, labels=True)
del da.coords['xx']
plot(da, labels={'xx': 'lab'})
def test_plot_2d_with_decreasing_edges():
a = sc.DataArray(data=sc.Variable(dims=['yy', 'xx'],
values=np.arange(12).reshape(3, 4)),
coords={
'xx': sc.Variable(dims=['xx'], values=[4, 3, 2, 1]),
'yy': sc.Variable(dims=['yy'], values=[1, 2, 3])
})
plot(a)
def test_plot_2d_binned_data():
da = make_binned_data_array(ndim=2)
plot(da)
plot(da, resampling_mode='sum')
plot(da, resampling_mode='mean')
# Try without event-coord so implementation cannot use `histogram`
for dim in ['xx', 'yy']:
copy = da.copy()
del copy.bins.coords[dim]
# With edge coord, cannot use `bin` directly
plot(copy)
copy.coords[dim] = copy.coords[dim][dim, 1:]
plot(copy)
def test_plot_2d_binned_data_non_counts():
da = make_binned_data_array(ndim=2)
da.bins.unit = 'K'
plot(da)
# Try without event-coord so implementation cannot use `histogram`
for dim in ['xx', 'yy']:
copy = da.copy()
del copy.bins.coords[dim]
# With edge coord, cannot use `bin` directly
plot(copy)
copy.coords[dim] = copy.coords[dim][dim, 1:]
plot(copy)
def test_plot_2d_binned_data_float32_coord():
da = make_binned_data_array(ndim=2)
da.bins.coords['xx'] = da.bins.coords['xx'].astype('float32')
plot(da)
# Try without event-coord so implementation cannot use `histogram`
for dim in ['xx', 'yy']:
copy = da.copy()
del copy.bins.coords[dim]
# With edge coord, cannot use `bin` directly
plot(copy)
copy.coords[dim] = copy.coords[dim][dim, 1:]
plot(copy)
def test_plot_2d_binned_data_datetime64():
da = make_binned_data_array(ndim=2, masks=True)
start = sc.scalar(np.datetime64('now'))
offset = (1000 * da.coords['xx']).astype('int64')
offset.unit = 's'
da.coords['xx'] = start + offset
offset = (1000 * da.bins.coords['xx']).astype('int64') * sc.scalar(1, unit='s/m')
da.bins.coords['xx'] = start + offset
plot(da)
# Try without event-coord so implementation cannot use `histogram`
for dim in ['xx', 'yy']:
copy = da.copy()
del copy.bins.coords[dim]
# With edge coord, cannot use `bin` directly
plot(copy)
copy.coords[dim] = copy.coords[dim][dim, 1:]
plot(copy)
def test_plot_3d_binned_data_where_outer_dimension_has_no_event_coord():
data = make_binned_data_array(ndim=2, masks=True)
data = sc.concatenate(data, data * sc.scalar(2.0), 'run')
plot_obj = sc.plot(data)
plot_obj.widgets._controls['run']['slider'].value = 1
plot_obj.close()
def test_plot_3d_binned_data_where_inner_dimension_has_no_event_coord():
data = make_binned_data_array(ndim=2)
data = sc.concatenate(data, data * sc.scalar(2.0), 'run')
plot(sc.transpose(data, dims=['yy', 'xx', 'run']))
def test_plot_2d_binned_data_with_variances():
plot(make_binned_data_array(ndim=2, with_variance=True))
def test_plot_2d_binned_data_with_variances_resolution():
plot(make_binned_data_array(ndim=2, with_variance=True), resolution=64)
def test_plot_2d_binned_data_with_masks():
da = make_binned_data_array(ndim=2, masks=True)
p = da.plot()
unmasked = p.view.figure.image_values.get_array()
da.masks['all'] = da.data.bins.sum() == da.data.bins.sum()
p = da.plot()
# Bin masks are *not* applied
assert np.allclose(p.view.figure.image_values.get_array(), unmasked)
assert not np.isclose(p.view.figure.image_values.get_array().sum(), 0.0)
def test_plot_customized_mpl_axes():
da = make_dense_data_array(ndim=2)
plot(da, title='MyTitle', xlabel='MyXlabel', ylabel='MyYlabel')
def test_plot_access_ax_and_fig():
da = make_dense_data_array(ndim=2)
out = sc.plot(da, title='MyTitle')
out.ax.set_xlabel('MyXlabel')
out.fig.set_dpi(120.)
out.close()
def test_plot_2d_int32():
plot(make_dense_data_array(ndim=2, dtype=sc.dtype.int32))
def test_plot_2d_int64_with_unit():
plot(make_dense_data_array(ndim=2, unit='K', dtype=sc.dtype.int64))
def test_plot_2d_int_coords():
N = 20
M = 10
da = sc.DataArray(data=sc.Variable(dims=['yy', 'xx'],
values=np.random.random([M, N]),
unit='K'),
coords={
'xx': sc.arange('xx', N + 1, unit='m'),
'yy': sc.arange('yy', M, unit='m')
})
plot(da)
def test_plot_2d_datetime():
time = sc.array(dims=['time'],
values=np.arange(np.datetime64('2017-01-01T12:00:00'),
np.datetime64('2017-01-01T12:00:00.0001')))
N, M = time.sizes['time'], 200
da = sc.DataArray(data=sc.array(dims=['time', 'xx'],
values=np.random.normal(0, 1, (N, M))),
coords={
'time': time,
'xx': sc.Variable(dims=['xx'], values=np.linspace(0, 10, M))
})
da.plot().close()
def test_plot_redraw_dense():
da = make_dense_data_array(ndim=2, unit='K')
p = sc.plot(da)
before = p.view.figure.image_values.get_array()
da *= 5.0
p.redraw()
assert np.allclose(p.view.figure.image_values.get_array(), 5.0 * before)
p.close()
def test_plot_redraw_dense_int64():
da = make_dense_data_array(ndim=2, unit='K', dtype=sc.dtype.int64)
p = sc.plot(da)
before = p.view.figure.image_values.get_array()
da *= 5
p.redraw()
assert np.allclose(p.view.figure.image_values.get_array(), 5 * before)
p.close()
def test_plot_redraw_counts():
da = make_dense_data_array(ndim=2, unit='counts')
p = sc.plot(da)
before = p.view.figure.image_values.get_array()
da *= 5.0
p.redraw()
assert np.allclose(p.view.figure.image_values.get_array(), 5.0 * before)
p.close()
def test_plot_redraw_binned():
da = make_binned_data_array(ndim=2)
p = sc.plot(da, resolution=64)
before = p.view.figure.image_values.get_array()
da *= 5.0
p.redraw()
assert np.allclose(p.view.figure.image_values.get_array(), 5.0 * before)
p.close()
@pytest.mark.skip(reason="Require in-place concatenate")
def test_plot_redraw_binned_concat_inplace():
a = make_binned_data_array(ndim=2)
pa = sc.plot(a, resolution=64)
asum = pa.view.figure.image_values.get_array().sum()
b = make_binned_data_array(ndim=2)
pb = sc.plot(b, resolution=64)
bsum = pb.view.figure.image_values.get_array().sum()
    # TODO would need to change data inplace rather than replacing
    a.data = a.bins.concatenate(other=b).data
pa.redraw()
assert np.isclose(pa.view.figure.image_values.get_array().sum(), asum + bsum)
pa.close()
pb.close()
def test_plot_various_2d_coord():
def make_array(dims, coord_name):
return sc.DataArray(data=sc.fold(sc.arange('xx', 2 * 10), 'xx', {
dims[0]: 10,
dims[1]: 2
}),
coords={
coord_name:
sc.fold(0.1 * sc.arange('xx', 20), 'xx', {
dims[0]: 10,
dims[1]: 2
})
})
# Dimension coord for xx
a = make_array(['xx', 'yy'], 'xx')
plot(a)
# Dimension coord for xx
b = make_array(['yy', 'xx'], 'xx')
plot(b)
# Non-dim coord for yy
c = make_array(['xx', 'yy'], 'zz')
plot(c)
plot(c, labels={'yy': 'zz'})
|
package com.kotlinconf.library.domain.repository
import com.github.aakira.napier.Napier
import dev.bluefalcon.*
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.GlobalScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.isActive
import kotlinx.coroutines.launch
import com.kotlinconf.library.domain.UI
import com.kotlinconf.library.domain.entity.BeaconInfo
class SpotSearchRepository(
context: ApplicationContext,
private val gameDataRepository: GameDataRepository
) : BlueFalconDelegate {
private val bf: BlueFalcon = BlueFalcon(context, null)
private val lastBeaconRssi = mutableMapOf<String, Int>()
init {
this.bf.delegates.add(this)
}
fun startScanning() {
if (this.bf.isScanning) {
return
}
Napier.d(message = "starting search...")
this.doScanning()
}
fun stopScanning() {
this.bf.stopScanning()
}
fun isScanning(): Boolean {
return this.bf.isScanning
}
fun restartScanning() {
Napier.d(">>> SCANNING RESTARTED")
this.doScanning()
}
private fun doScanning() {
GlobalScope.launch(Dispatchers.UI) {
while (isActive) {
if (tryStartScan())
break
delay(500)
}
}
}
private fun tryStartScan(): Boolean {
try {
this.bf.scan()
return true
} catch (error: Throwable) {
return when (error) {
is BluetoothUnsupportedException,
is BluetoothNotEnabledException,
is BluetoothResettingException,
is BluetoothUnknownException -> {
Napier.e(message = "known BT exception, try again later", throwable = error)
false
}
else -> {
Napier.e(message = "fail scan", throwable = error)
true
}
}
}
}
private fun sendBeaconInfo(bluetoothPeripheral: BluetoothPeripheral) {
val name: String = bluetoothPeripheral.name ?: return
val rssi: Int = bluetoothPeripheral.rssi?.toInt() ?: return
val processedRssi = if (rssi == 127) {
lastBeaconRssi[name] ?: return
} else {
lastBeaconRssi[name] = rssi
rssi
}
val beaconInfo = BeaconInfo(name = name, rssi = processedRssi)
GlobalScope.launch(Dispatchers.UI) {
Napier.d("beaconInfo: $beaconInfo")
gameDataRepository.beaconsChannel.send(beaconInfo)
}
}
override fun didDiscoverDevice(bluetoothPeripheral: BluetoothPeripheral) {
this.sendBeaconInfo(bluetoothPeripheral)
}
override fun didConnect(bluetoothPeripheral: BluetoothPeripheral) {}
override fun didDisconnect(bluetoothPeripheral: BluetoothPeripheral) {}
override fun didDiscoverServices(bluetoothPeripheral: BluetoothPeripheral) {}
override fun didDiscoverCharacteristics(bluetoothPeripheral: BluetoothPeripheral) {}
override fun didCharacteristcValueChanged(
bluetoothPeripheral: BluetoothPeripheral,
bluetoothCharacteristic: BluetoothCharacteristic
) {
}
override fun didUpdateMTU(bluetoothPeripheral: BluetoothPeripheral) {}
override fun didRssiUpdate(bluetoothPeripheral: BluetoothPeripheral) {
this.sendBeaconInfo(bluetoothPeripheral)
}
override fun scanDidFailed(error: Throwable) {
Napier.e("scan failed: $error")
doScanning()
}
}
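In `sendBeaconInfo` above, an RSSI of 127 means "reading unavailable", so the repository falls back to the last good value recorded per beacon. A minimal sketch of that fallback logic (Python; names are illustrative):

```python
# Sketch of the RSSI fallback in SpotSearchRepository.sendBeaconInfo:
# 127 means "RSSI unavailable", so reuse the last good reading per beacon.
last_rssi = {}

def process_rssi(name, rssi):
    if rssi == 127:
        return last_rssi.get(name)  # None if we never saw a good reading
    last_rssi[name] = rssi
    return rssi

print(process_rssi("beacon-a", -60))  # first good reading
print(process_rssi("beacon-a", 127))  # falls back to -60
print(process_rssi("beacon-b", 127))  # no history yet -> None
```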
|
http POST localhost:8080/player email=andrewrademacher@gmail.com fortune="Where is Waldo?"
http localhost:8080/fortune/trade/andrewrademacher@gmail.com/andrewrademacher@gmail.com
http localhost:8080/fortune/trade/andrewrademacher@gmail.com/andrewrademacher@gmail.com
|
using System;
using NUnit.Framework;
using Shouldly;
using SqlPad.Oracle.DatabaseConnection;
using SqlPad.Oracle.DataDictionary;
#if ORACLE_MANAGED_DATA_ACCESS_CLIENT
using Oracle.ManagedDataAccess.Types;
#else
using Oracle.DataAccess.Types;
#endif
namespace SqlPad.Oracle.Test
{
public class MiscellaneousTest
{
[TestFixture]
public class OracleExtensionsTest
{
[Test]
public void TestQuotedToSimpleIdentifierStartingWithNonLetter()
{
const string quotedIdentifier = "\"_IDENTIFIER\"";
var simpleIdentifier = quotedIdentifier.ToSimpleIdentifier();
simpleIdentifier.ShouldBe(quotedIdentifier);
}
[Test]
public void TestQuotedToSimpleIdentifierContainingDash()
{
const string quotedIdentifier = "\"DASH-COLUMN\"";
var simpleIdentifier = quotedIdentifier.ToSimpleIdentifier();
simpleIdentifier.ShouldBe(quotedIdentifier);
}
[Test]
public void TestQuotedToSimpleIdentifierOfSingleLetterIdentifier()
{
const string quotedIdentifier = "\"x\"";
var simpleIdentifier = quotedIdentifier.ToSimpleIdentifier();
simpleIdentifier.ShouldBe(quotedIdentifier);
}
[Test]
public void TestNormalStringPlainText()
{
const string literal = "'some''text'";
var text = literal.ToPlainString();
text.ShouldBe("some'text");
}
[Test]
public void TestQuotedStringPlainText()
{
const string literal = "q'|some''text|'";
var text = literal.ToPlainString();
text.ShouldBe("some''text");
}
[Test]
public void TestInvalidQuotedStringPlainText()
{
const string literal = "q'|some''text'";
var text = literal.ToPlainString();
text.ShouldBe("some''text");
}
}
[TestFixture]
public class OracleStatementTest
{
private static readonly OracleSqlParser Parser = OracleSqlParser.Instance;
[Test]
public void TestTryGetPlSqlUnitNameFromCreateProcedure()
{
var statement = Parser.Parse("CREATE PROCEDURE TEST_SCHEMA.TEST_PROCEDURE")[0];
OracleStatement.TryGetPlSqlUnitName(statement, out var identifier).ShouldBeTrue();
identifier.Owner.ShouldBe("TEST_SCHEMA");
identifier.Name.ShouldBe("TEST_PROCEDURE");
}
[Test]
public void TestTryGetPlSqlUnitNameFromCreateFunction()
{
var statement = Parser.Parse("CREATE FUNCTION TEST_SCHEMA.TEST_FUNCTION")[0];
OracleStatement.TryGetPlSqlUnitName(statement, out var identifier).ShouldBeTrue();
identifier.Owner.ShouldBe("TEST_SCHEMA");
identifier.Name.ShouldBe("TEST_FUNCTION");
}
[Test]
public void TestTryGetPlSqlUnitNameFromCreateTable()
{
var statement = Parser.Parse("CREATE TABLE TEST_SCHEMA.TEST_TABLE")[0];
OracleStatement.TryGetPlSqlUnitName(statement, out var _).ShouldBeFalse();
}
[Test]
public void TestFeedbackMessage()
{
var statement = (OracleStatement)Parser.Parse("CREATE PROCEDURE TEST_SCHEMA.TEST_PROCEDURE")[0];
var message = statement.BuildExecutionFeedbackMessage(null, false);
message.ShouldBe("Procedure created. ");
}
[Test]
public void TestCompilationErrorFeedbackMessage()
{
var statement = (OracleStatement)Parser.Parse("CREATE FUNCTION TEST_SCHEMA.TEST_FUNCTION")[0];
var message = statement.BuildExecutionFeedbackMessage(null, true);
message.ShouldBe("Function created with compilation errors. ");
}
}
[TestFixture]
public class OracleValueAggregatorTest
{
[Test]
public void TestNumberAggregation()
{
var aggregator = new OracleValueAggregator();
aggregator.AddValue(1);
aggregator.AddValue(2);
aggregator.AggregatedValuesAvailable.ShouldBeTrue();
aggregator.LimitValuesAvailable.ShouldBeTrue();
aggregator.Average.ShouldBe(new OracleNumber(new OracleDecimal(1.5m)));
aggregator.Sum.ShouldBe(new OracleNumber(new OracleDecimal(3m)));
aggregator.Minimum.ShouldBe(new OracleNumber(new OracleDecimal(1m)));
aggregator.Maximum.ShouldBe(new OracleNumber(new OracleDecimal(2m)));
aggregator.Mode.Value.ShouldBeNull();
aggregator.Mode.Count.ShouldBeNull();
aggregator.Median.ShouldBe(new OracleNumber(new OracleDecimal(1.5m)));
aggregator.Count.ShouldBe(2);
aggregator.DistinctCount.ShouldBe(2);
}
[Test]
public void TestDateAggregation()
{
var aggregator = new OracleValueAggregator();
aggregator.AddValue(new DateTime(2016, 6, 11));
aggregator.AddValue(new DateTime(2016, 6, 12));
aggregator.AggregatedValuesAvailable.ShouldBeFalse();
aggregator.LimitValuesAvailable.ShouldBeTrue();
aggregator.Average.ShouldBeNull();
aggregator.Sum.ShouldBeNull();
aggregator.Minimum.ShouldBe(new OracleDateTime(2016, 6, 11, 0, 0, 0));
aggregator.Maximum.ShouldBe(new OracleDateTime(2016, 6, 12, 0, 0, 0));
aggregator.Mode.Value.ShouldBeNull();
aggregator.Mode.Count.ShouldBeNull();
aggregator.Median.ShouldBeNull();
aggregator.Count.ShouldBe(2);
aggregator.DistinctCount.ShouldBe(2);
}
[Test]
public void TestIntervalAggregation()
{
var aggregator = new OracleValueAggregator();
var oneYear = new OracleIntervalYearToMonth(new OracleIntervalYM(1, 0));
aggregator.AddValue(oneYear);
aggregator.AddValue(oneYear);
var twoYear = new OracleIntervalYearToMonth(new OracleIntervalYM(2, 0));
aggregator.AddValue(twoYear);
aggregator.AddValue(twoYear);
aggregator.AggregatedValuesAvailable.ShouldBeTrue();
aggregator.LimitValuesAvailable.ShouldBeTrue();
aggregator.Average.ShouldBe(new OracleIntervalYearToMonth(new OracleIntervalYM(1, 6)));
aggregator.Sum.ShouldBe(new OracleIntervalYearToMonth(new OracleIntervalYM(6, 0)));
aggregator.Minimum.ShouldBe(oneYear);
aggregator.Maximum.ShouldBe(twoYear);
aggregator.Mode.Value.ShouldBeNull();
aggregator.Mode.Count.ShouldBeNull();
aggregator.Median.ShouldBe(new OracleIntervalYearToMonth(new OracleIntervalYM(1, 6)));
aggregator.Count.ShouldBe(4);
aggregator.DistinctCount.ShouldBe(2);
}
[Test]
public void TestMultipleTypes()
{
var aggregator = new OracleValueAggregator();
aggregator.AddValue(1);
aggregator.AddValue(1);
aggregator.AddValue("string");
aggregator.AddValue("string");
aggregator.AddValue(null);
aggregator.AggregatedValuesAvailable.ShouldBeFalse();
aggregator.LimitValuesAvailable.ShouldBeFalse();
aggregator.Average.ShouldBeNull();
aggregator.Sum.ShouldBeNull();
aggregator.Minimum.ShouldBeNull();
aggregator.Maximum.ShouldBeNull();
aggregator.Mode.Value.ShouldBeNull();
aggregator.Mode.Count.ShouldBeNull();
aggregator.Median.ShouldBeNull();
aggregator.Count.ShouldBe(4);
aggregator.DistinctCount.ShouldBeNull();
}
[Test]
public void TestDistinctStrings()
{
var aggregator = new OracleValueAggregator();
aggregator.AddValue("value1");
aggregator.AddValue("value2");
aggregator.AddValue("value1");
aggregator.AggregatedValuesAvailable.ShouldBeFalse();
aggregator.LimitValuesAvailable.ShouldBeFalse();
aggregator.Average.ShouldBeNull();
aggregator.Sum.ShouldBeNull();
aggregator.Minimum.ShouldBeNull();
aggregator.Maximum.ShouldBeNull();
aggregator.Mode.Value.ShouldBeNull();
aggregator.Mode.Count.ShouldBeNull();
aggregator.Median.ShouldBeNull();
aggregator.Count.ShouldBe(3);
aggregator.DistinctCount.ShouldBe(2);
}
[Test]
public void TestIntervalDayToSecondMode()
{
var aggregator = new OracleValueAggregator();
var oneDay = new OracleIntervalDayToSecond(new OracleIntervalDS(1d));
aggregator.AddValue(oneDay);
aggregator.AddValue(oneDay);
var twoDays = new OracleIntervalDayToSecond(new OracleIntervalDS(2d));
aggregator.AddValue(twoDays);
aggregator.Mode.Value.ShouldBe(oneDay);
aggregator.Mode.Count.ShouldBe(2);
}
[Test]
public void TestDateMode()
{
var aggregator = new OracleValueAggregator();
var date = new OracleDateTime(new OracleDate(2016, 6, 22));
aggregator.AddValue(date);
aggregator.AddValue(date);
aggregator.Mode.Value.ShouldBe(date);
aggregator.Mode.Count.ShouldBe(2);
}
}
}
}
|
---
title: Agenda
inshort: To-do list for life & work [Wunderlist]
translator: Microsoft Cognitive Services
---
From work to play, Agenda is the easiest way to get things done, every day.
|
import router from './router';
import getQueryStringParams from '../shared/getQueryStringParams';
import serializeParam from '../shared/serializeParam';
import serializeSearch from '../shared/serializeSearch';
import segments, {resetSegments} from './segments';
const paramsProxyHandler = {
set(target, name, value) {
const serializedValue = serializeParam(value);
target[name] = serializedValue;
const search = serializeSearch(target);
router.url = router.path + (search ? '?' : '') + search;
return true;
},
get(target, name) {
if(target[name] === false) return false;
if(segments[name] === false) return false;
return target[name] || segments[name] || '';
}
}
const params = {...window.params};
delete window.params;
const proxy = new Proxy(params, paramsProxyHandler);
export function updateParams(query) {
resetSegments();
const delta = getQueryStringParams(query);
for(const key of Object.keys({...delta, ...params})) {
params[key] = delta[key];
}
return proxy;
}
export default proxy;
|
import 'package:flutter/material.dart';
import 'package:tool/widgets/decoration.dart';
extension IntExtensions on int? {
/// Returns this int if it is not null, otherwise the given fallback [value].
int validate({int value = 0}) {
return this ?? value;
}
/// Leaves given height of space
Widget get height => SizedBox(height: this?.toDouble());
/// Leaves given width of space
Widget get width => SizedBox(width: this?.toDouble());
/// HTTP status code
bool isSuccessful() => validate() >= 200 && validate() <= 206;
BorderRadius borderRadius([double? val]) => radius(val);
/// Returns microseconds duration
/// ```dart
/// 5.microseconds
/// ```
Duration get microseconds => Duration(microseconds: validate());
/// Returns milliseconds duration
/// ```dart
/// 5.milliseconds
/// ```
Duration get milliseconds => Duration(milliseconds: validate());
/// Returns seconds duration
/// ```dart
/// 5.seconds
/// ```
Duration get seconds => Duration(seconds: validate());
/// Returns minutes duration
/// ```dart
/// 5.minutes
/// ```
Duration get minutes => Duration(minutes: validate());
/// Returns hours duration
/// ```dart
/// 5.hours
/// ```
Duration get hours => Duration(hours: validate());
/// Returns days duration
/// ```dart
/// 5.days
/// ```
Duration get days => Duration(days: validate());
/// Returns Size
Size get size => Size(validate().toDouble(), validate().toDouble());
}
|
module TIPS.RowColumn where
import qualified Data.ByteString.Char8 as B
import Data.Maybe (fromJust)
import Control.Monad (replicateM)
import Data.List (transpose)
getInts :: IO [Int]
getInts = map (fst . fromJust . B.readInt) . B.words <$> B.getLine
--input
--3 5 4
--11100
--10001
--00111
main = do
[h,w,k] <- getInts
sn <- replicateM h $ map (:[]) <$> getLine
let rowCount = map length $ filter (=="1") <$> sn
let columnCount = map length $ filter (=="1") <$> transpose sn
print rowCount
print columnCount
|
#pragma once
#include "demhandlers.h"
#include "base/array.h"
namespace DemMsg
{
struct Dem_DataTables
{
static const int DATA_MAX_LENGTH = 256 * 1024;
Array<uint8_t> data;
};
}
DECLARE_DEM_HANDLERS(Dem_DataTables);
|
use super::*;
/// A rectangle around some area.
///
/// By convention, coordinates **include** the top-left and **exclude** the
/// bottom-right.
#[derive(Debug, Clone, Copy, Default, PartialEq, Eq, PartialOrd, Ord, Hash)]
#[repr(C)]
pub struct RECT {
pub left: LONG,
pub top: LONG,
pub right: LONG,
pub bottom: LONG,
}
pub type LPRECT = *mut RECT;
impl RECT {
pub fn top_left_mut(&mut self) -> &mut POINT {
unsafe { &mut *(self as *mut RECT as *mut POINT) }
}
pub fn bottom_right_mut(&mut self) -> &mut POINT {
unsafe { &mut *(self as *mut RECT as *mut POINT).add(1) }
}
}
|
---
layout: post
comments: true
title: "WeeChat: easy instructions for using SASL"
category: [english]
tags: [irc, english]
redirect_from:
- /weechat-sasl.html
- /english/2015/03/26/weechat-sasl-simply.html
---
This seems to confuse many WeeChat users, so I will try to explain it more
simply, since I keep repeating myself about this same thing everywhere.
SASL is a mechanism for identifying to services on IRC automatically, even
before you are visible to the network.
* * * * *
First, set the mechanism to PLAIN if it is set to anything else:
```
/set irc.server_default.sasl_mechanism PLAIN
```
PLAIN is a simple "login using username and password" mechanism that sends
the username and password in plaintext. This isn't an issue if you also use
SSL (as you should) and trust the server (and
**use a different password everywhere**).
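WeeChat builds the response for you; purely to illustrate why PLAIN needs SSL, here is a minimal Python sketch of the RFC 4616 payload the client ends up sending (the credentials are merely base64-encoded, not encrypted):

```python
import base64

def sasl_plain_response(username: str, password: str) -> str:
    # RFC 4616: authorization identity, authentication identity and
    # password joined with NUL bytes, then base64-encoded for the wire.
    payload = "\0".join(["", username, password]).encode("utf-8")
    return base64.b64encode(payload).decode("ascii")

print(sasl_plain_response("alice", "hunter2"))  # AGFsaWNlAGh1bnRlcjI=
```

Anyone who can read the connection can trivially decode this, which is why SSL matters.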
Then simply set your username and password:
```
/unset irc.server.NETWORK.sasl_mechanism
/set irc.server.NETWORK.sasl_username REGISTERED_NICKNAME
/set irc.server.NETWORK.sasl_password PASSWORD
/save
```
*Replace NETWORK with the name of the network as it appears in WeeChat, for
example `liberachat`.*
And now, after `/reconnect`, you should be identified automatically using
SASL. You should also make sure that you use SSL.
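If identification still fails after `/reconnect`, it can help to review all SASL-related options for the network at once; WeeChat's `/set` accepts wildcards:

```
/set irc.server.NETWORK.sasl*
```

This lists the mechanism, username, and password options side by side so a typo stands out.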
## Using SSL
Change your address to use the SSL port and enable SSL for the network:
```
/set irc.server.liberachat.addresses irc.libera.chat/6697
/set irc.server.liberachat.ssl on
/save
```
*Note: SSL does nothing until you `/reconnect`.*
*6697 is the [standard SSL port](https://tools.ietf.org/html/rfc7194).*
liberachat has a valid SSL certificate, but if it didn't, you would have two
choices:
1. Trust the fingerprints manually using
`irc.server.NETWORK.ssl_fingerprint`, see [this post].
2. Disable SSL certificate checking using
`/set irc.server.NETWORK.ssl_verify off` **NOT RECOMMENDED**, see
[this post].
[this post]: {% post_url blog/2015-02-24-znc160-ssl %}
|
package me.toptas.fancyshowcase
import android.view.animation.AlphaAnimation
internal class FadeInAnimation : AlphaAnimation(0f, 1f) {
init {
fillAfter = true
duration = 400
}
}
|
package chat.rocket.android.authentication.signup.ui
import android.app.Activity
import android.content.Intent
import android.os.Bundle
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import androidx.core.content.ContextCompat
import androidx.core.view.ViewCompat
import androidx.core.view.isVisible
import androidx.fragment.app.Fragment
import chat.rocket.android.R
import chat.rocket.android.analytics.AnalyticsManager
import chat.rocket.android.analytics.event.ScreenViewEvent
import chat.rocket.android.authentication.signup.presentation.SignupPresenter
import chat.rocket.android.authentication.signup.presentation.SignupView
import chat.rocket.android.helper.saveCredentials
import chat.rocket.android.util.extension.asObservable
import chat.rocket.android.util.extensions.inflate
import chat.rocket.android.util.extensions.isEmail
import chat.rocket.android.util.extensions.showToast
import chat.rocket.android.util.extensions.textContent
import chat.rocket.android.util.extensions.ui
import dagger.android.support.AndroidSupportInjection
import io.reactivex.disposables.CompositeDisposable
import io.reactivex.rxkotlin.Observables
import kotlinx.android.synthetic.main.fragment_authentication_sign_up.*
import javax.inject.Inject
fun newInstance() = SignupFragment()
internal const val SAVE_CREDENTIALS = 1
class SignupFragment : Fragment(), SignupView {
@Inject
lateinit var presenter: SignupPresenter
@Inject
lateinit var analyticsManager: AnalyticsManager
private val editTextsDisposable = CompositeDisposable()
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
AndroidSupportInjection.inject(this)
}
override fun onCreateView(
inflater: LayoutInflater,
container: ViewGroup?,
savedInstanceState: Bundle?
): View? = container?.inflate(R.layout.fragment_authentication_sign_up)
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
subscribeEditTexts()
setupOnClickListener()
analyticsManager.logScreenView(ScreenViewEvent.SignUp)
}
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
if (resultCode == Activity.RESULT_OK) {
if (data != null) {
if (requestCode == SAVE_CREDENTIALS) {
showMessage(getString(R.string.msg_credentials_saved_successfully))
}
}
}
}
override fun onDestroyView() {
super.onDestroyView()
unsubscribeEditTexts()
}
private fun setupOnClickListener() =
ui {
button_register.setOnClickListener {
presenter.signup(
text_name.textContent,
text_username.textContent,
text_password.textContent,
text_email.textContent
)
}
}
override fun enableButtonRegister() {
context?.let {
ViewCompat.setBackgroundTintList(
button_register, ContextCompat.getColorStateList(it, R.color.colorAccent)
)
button_register.isEnabled = true
}
}
override fun disableButtonRegister() {
context?.let {
ViewCompat.setBackgroundTintList(
button_register,
ContextCompat.getColorStateList(it, R.color.colorAuthenticationButtonDisabled)
)
button_register.isEnabled = false
}
}
override fun showLoading() {
ui {
disableUserInput()
view_loading.isVisible = true
}
}
override fun hideLoading() {
ui {
view_loading.isVisible = false
enableUserInput()
}
}
override fun showMessage(resId: Int) {
ui {
showToast(resId)
}
}
override fun showMessage(message: String) {
ui {
showToast(message)
}
}
override fun showGenericErrorMessage() = showMessage(getString(R.string.msg_generic_error))
override fun saveSmartLockCredentials(id: String, password: String) {
activity?.saveCredentials(id, password)
}
private fun subscribeEditTexts() {
editTextsDisposable.add(
Observables.combineLatest(
text_name.asObservable(),
text_username.asObservable(),
text_password.asObservable(),
text_email.asObservable()
) { text_name, text_username, text_password, text_email ->
return@combineLatest (
text_name.isNotBlank() &&
text_username.isNotBlank() &&
text_password.isNotBlank() &&
text_email.isNotBlank() &&
text_email.toString().isEmail()
)
}.subscribe { isValid ->
if (isValid) {
enableButtonRegister()
} else {
disableButtonRegister()
}
})
}
private fun unsubscribeEditTexts() = editTextsDisposable.clear()
private fun enableUserInput() {
text_name.isEnabled = true
text_username.isEnabled = true
text_password.isEnabled = true
text_email.isEnabled = true
enableButtonRegister()
}
private fun disableUserInput() {
disableButtonRegister()
text_name.isEnabled = false
text_username.isEnabled = false
text_password.isEnabled = false
text_email.isEnabled = false
}
}
|
package io.github.voidcontext.easyvalidator
import DSL._
import org.scalatest._
import scala.util.{Try}
class DSLSpec extends FlatSpec with TryValues {
def testValidValue[T](result: Try[T], num: T) = assert(result.success.value == num)
def testInvalidValue[T](result: Try[T], errMsg: String) = assert(
result.failure.exception.getMessage == errMsg
)
val intVal = 10
val floatVal = 11.3f
val doubleVal = 11.3
val stringVal = "foo bar"
"`greater than` expression" should "validate int values" in {
testValidValue[Int] (intVal is greater than 5, intVal)
testInvalidValue[Int] (intVal is greater than 19, "10 is not greater than 19")
}
it should "validate float values" in {
testValidValue[Float] (floatVal is greater than 5, floatVal)
testInvalidValue[Float] (floatVal is greater than 11.5f, "11.3 is not greater than 11.5")
}
it should "validate double values" in {
testValidValue[Double] (doubleVal is greater than 5, doubleVal)
testInvalidValue[Double] (doubleVal is greater than 11.5, "11.3 is not greater than 11.5")
}
"`less than` expression" should "validate int values" in {
testValidValue[Int] (intVal is less than 11, intVal)
testInvalidValue[Int] (intVal is less than 5, "10 is not less than 5")
}
it should "validate float values" in {
testValidValue[Float] (floatVal is less than 11.300001f, floatVal)
testInvalidValue[Float] (floatVal is less than 11.2f, "11.3 is not less than 11.2")
}
it should "validate double values" in {
testValidValue[Double] (doubleVal is less than 15, doubleVal)
testInvalidValue[Double] (doubleVal is less than 3.45, "11.3 is not less than 3.45")
}
"`is` expression" should "validate int values" in {
testValidValue[Int] (intVal is 10, intVal)
testInvalidValue[Int] (intVal is 5, "10 is not equal to 5")
}
it should "validate float values" in {
testValidValue[Float] (floatVal is 11.3f, floatVal)
testInvalidValue[Float] (floatVal is 11.2f, "11.3 is not equal to 11.2")
}
it should "validate double values" in {
testValidValue[Double] (doubleVal is 11.3, doubleVal)
testInvalidValue[Double] (doubleVal is 3.45, "11.3 is not equal to 3.45")
}
"`is not` expression" should "validate int values" in {
testValidValue[Int] (intVal isNot 5, intVal)
testInvalidValue[Int] (intVal isNot 10, "10 is equal to 10")
}
it should "validate float values" in {
testValidValue[Float] (floatVal isNot 11.299f, floatVal)
testInvalidValue[Float] (floatVal isNot 11.3f, "11.3 is equal to 11.3")
}
it should "validate double values" in {
testValidValue[Double] (doubleVal isNot 11.301, doubleVal)
testInvalidValue[Double] (doubleVal isNot 11.3, "11.3 is equal to 11.3")
}
"`is longer than`" should "validate strings" in {
testValidValue[String](stringVal is longer than 3, stringVal)
testInvalidValue[String](stringVal is longer than 7, s"$stringVal's length is not greater than 7")
}
"`is shorter than`" should "validate strings" in {
testValidValue[String](stringVal is shorter than 10, stringVal)
testInvalidValue[String](stringVal is shorter than 7, s"$stringVal's length is not less than 7")
}
"`matches`" should "validate strings" in {
testValidValue[String](stringVal matchesRegex """foo.*""", stringVal)
testInvalidValue[String](stringVal matchesRegex """bar.*""", s"$stringVal doesn't match bar.*")
}
"`isNot empty`" should "validate strings" in {
testValidValue[String](stringVal isNot empty, stringVal)
testInvalidValue[String]("" isNot empty, "String is empty")
}
}
|
class Guest
def name
'Guest Visitor'.freeze
end
def email
'unknown@domain.com'.freeze
end
def appear
end
def disappear
end
def away
end
end
|
module Docks
module Tags
class Link < Base
def initialize
@name = :link
@synonyms = [:see]
@multiline = false
@multiple_allowed = true
end
def process(symbol)
symbol.update(@name) do |links|
Array(links).map { |link| OpenStruct.new name_and_parenthetical(link, :url, :caption) }
end
end
end
end
end
|
// Generated on Thu Nov 09 17:15:14 MSK 2006
// DTD/Schema : http://www.springframework.org/schema/jee
package com.intellij.spring.model.xml.jee;
import javax.annotation.Nonnull;
import com.intellij.psi.PsiClass;
import com.intellij.util.xml.DomElement;
import com.intellij.util.xml.GenericAttributeValue;
import com.intellij.util.xml.GenericDomValue;
import com.intellij.util.xml.Required;
/**
* http://www.springframework.org/schema/jee:ejbType interface.
*/
public interface SpringEjb extends DomElement, JndiLocated {
/**
* Returns the value of the lookup-home-on-startup child.
* <pre>
* <h3>Attribute null:lookup-home-on-startup documentation</h3>
* Controls whether the lookup of the EJB home object is performed
* immediately on startup (if true, the default), or on first access
* (if false).
*
* </pre>
* @return the value of the lookup-home-on-startup child.
*/
@Nonnull
GenericAttributeValue<Boolean> getLookupHomeOnStartup();
/**
* Returns the value of the cache-home child.
* <pre>
* <h3>Attribute null:cache-home documentation</h3>
* Controls whether the EJB home object is cached once it has been located.
*
* </pre>
* @return the value of the cache-home child.
*/
@Nonnull
GenericAttributeValue<Boolean> getCacheHome();
/**
* Returns the value of the business-interface child.
* <pre>
* <h3>Attribute null:business-interface documentation</h3>
* The business interface of the EJB being proxied.
*
* </pre>
* @return the value of the business-interface child.
*/
@Nonnull
@Required
GenericAttributeValue<PsiClass> getBusinessInterface();
/**
* Returns the value of the jndi-name child.
* <pre>
* <h3>Attribute null:jndi-name documentation</h3>
* The JNDI name to look up.
*
* </pre>
* @return the value of the jndi-name child.
*/
@Nonnull
@Required
GenericAttributeValue<String> getJndiName();
/**
* Returns the value of the resource-ref child.
* <pre>
* <h3>Attribute null:resource-ref documentation</h3>
* Controls whether the lookup occurs in a J2EE container, i.e. if the
* prefix "java:comp/env/" needs to be added if the JNDI name doesn't
* already contain it.
*
* </pre>
* @return the value of the resource-ref child.
*/
@Nonnull
GenericAttributeValue<Boolean> getResourceRef();
/**
* Returns the value of the environment child.
* <pre>
* <h3>Element http://www.springframework.org/schema/jee:environment documentation</h3>
* The newline-separated, key-value pairs for the JNDI environment
* (in standard Properties format, namely 'key=value' pairs)
*
* </pre>
* @return the value of the environment child.
*/
@Nonnull
GenericDomValue<String> getEnvironment();
}
|
import tensorflow as tf
import config
def load_dataset():
mnist = tf.keras.datasets.mnist
train, test = mnist.load_data(path=config.DATA_PATH)
return train, test
|
RSpec.describe XLSXToHTML do
let(:xlsx_path) { fixture_path('with_static_headers.xlsx') }
it 'has a version number' do
expect(described_class::VERSION).not_to be nil
end
it '.convert' do
expect { described_class.convert(xlsx_path) }.not_to raise_error
end
end
|
subroutine getvort()
!Calculates the vorticity on the edge of the object.
!The vorticity is taken to be a constant on the surface:
! vortcoeff on the top edge, and -vortcoeff on the bottom edge
use vars
use parallel
implicit none
integer :: i, j
call getv() !get the velocity
do j=1,ny
do i=1,nx
if (boundary(i,j) .ne. 0) vort(i,j) = real(boundary(i,j))*vortcoeff
enddo
enddo
vort(nx+1,:) = 0.
!transfer halo values for the vorticity
call haloswap(vort)
end subroutine
|
package com.amsavarthan.apps.media_toolbox.whatsapp.home
interface OnClickDownloadListener {
fun onClickDownload()
}
|
#!/bin/sh
# Create Package
aws cloudformation package \
--template-file template.yml \
--output-template-file template-output.yml \
--s3-bucket alexa-sounds-of-rain-package \
--profile xblood
# Deploy
aws cloudformation deploy \
--template-file template-output.yml \
--stack-name alexa-sounds-of-rain-sam \
--parameter-overrides \
SoundFileBaseUrl=${ALEXA_SOUNDS_OF_RAIN_SOUND_FILE_BASE_URL} \
SoundFileBaseName=${ALEXA_SOUNDS_OF_RAIN_SOUND_FILE_BASE_NAME} \
AppId=${ALEXA_SOUNDS_OF_RAIN_APP_ID} \
Stage=${ALEXA_SOUNDS_OF_RAIN_STAGE} \
--capabilities CAPABILITY_IAM \
--profile xblood
# Set Stack Policy
aws cloudformation set-stack-policy \
--stack-name alexa-sounds-of-rain-sam \
--stack-policy-body=file://./template-stack-policy.json \
--profile xblood
|
import 'package:app/app/route/approute.dart';
import 'package:app/common/utils/resultutils.dart';
import 'package:app/generated/l10n.dart';
import 'package:app/module/minus/result/data/model/resultitemmodel.dart';
import 'package:flutter/material.dart';
import 'package:get/get.dart';
import 'package:intl/intl.dart';
class HasiTestItem extends StatelessWidget {
final ResultItemModel result;
HasiTestItem(this.result);
@override
Widget build(BuildContext context) {
final date = DateTime.parse(result.testAt);
return GestureDetector(
onTap: () => Get.toNamed(AppRoute.minusResultPage, arguments: result),
child: Container(
height: 120,
padding: EdgeInsets.symmetric(horizontal: 16, vertical: 16),
margin: EdgeInsets.only(bottom: 16),
decoration: BoxDecoration(
color: Colors.white,
borderRadius: BorderRadius.circular(8),
),
child: Row(
children: <Widget>[
_Score(result, true),
SizedBox(width: 16),
_Score(result, false),
Spacer(),
Column(
crossAxisAlignment: CrossAxisAlignment.end,
children: <Widget>[
Text(S.of(context).tanggal),
Spacer(),
Text(
date.day.toString(),
style: TextStyle(
fontWeight: FontWeight.bold,
fontSize: 36,
),
),
Text(
DateFormat.yMMMM().format(date),
style: TextStyle(
fontWeight: FontWeight.bold,
fontSize: 12,
),
),
],
)
],
),
),
);
}
}
class _Score extends StatelessWidget {
final ResultItemModel item;
final bool isLeft;
_Score(this.item, this.isLeft);
@override
Widget build(BuildContext context) {
return Column(
crossAxisAlignment: CrossAxisAlignment.start,
children: <Widget>[
Text(isLeft ? S.of(context).kiriResult : S.of(context).kananResult),
Expanded(
child: Container(
decoration: BoxDecoration(
color: ResultUtils.defineColor(context, isLeft ? item.leftScore : item.rightScore, item.total),
borderRadius: BorderRadius.circular(8),
),
child: AspectRatio(
aspectRatio: 1,
child: Center(
child: Row(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.end,
children: <Widget>[
Text(
"${isLeft ? item.leftScore : item.rightScore}",
style: TextStyle(
color: Colors.white,
fontWeight: FontWeight.bold,
fontSize: 24,
),
),
Text(
"/${item.total}",
style: TextStyle(
color: Colors.white,
fontWeight: FontWeight.bold,
fontSize: 10,
),
),
],
)),
),
),
),
],
);
}
}
|
<?php
namespace App\Http\Controllers;
use App\Services\ShopService;
class ShopController
{
private $shopService;
public function __construct(ShopService $shopService)
{
$this->shopService = $shopService;
}
public function enterShop()
{
$items = $this->shopService->getShopableItems();
foreach ($items as $category => $categoryItems) {
foreach ($categoryItems as $itemName => $item) {
$items[$category][$itemName]['nicePrice']
= round($item['price'] / 100, 2);
}
}
return view('shop', [
'items' => $items
]);
}
}
|
#!/usr/bin/env bash
set -x
CONFIG_FILE="configs/video/cityscapes/memory_cam_4_r50-d16_769x769_80k_cityscapes_video.py"
CONFIG_PY="${CONFIG_FILE##*/}"
CONFIG="${CONFIG_PY%.*}"
WORK_DIR="./work_dirs/${CONFIG}_k64_s4_concat"
#WORK_DIR="./work_dirs/test"
#WORK_DIR="./work_dirs/memory_r18-d16_769x769_80k_cityscapes_video_k64_s4"
#CONFIG_FILE="${WORK_DIR}/memory_r18-d16_769x769_80k_cityscapes_video.py"
SHOW_DIR="${WORK_DIR}/show"
TMPDIR="${WORK_DIR}/tmp"
CHECKPOINT="${WORK_DIR}/latest.pth"
RESULT_FILE="${WORK_DIR}/result.pkl"
#CHECKPOINT="${WORK_DIR}/iter_36000.pth,${WORK_DIR}/iter_40000.pth,${WORK_DIR}/iter_32000.pth"
GPUS=4
PORT=${PORT:-29511}
if [ ! -d "${WORK_DIR}" ]; then
mkdir -p "${WORK_DIR}"
cp "${CONFIG_FILE}" $0 "${WORK_DIR}"
fi
echo -e "\nconfig file: ${CONFIG}\n"
export PYTHONPATH="$(dirname "$0")/..":$PYTHONPATH
RANDOM_SEED=0
export CUDA_VISIBLE_DEVICES=0,1,2,3
#export CUDA_VISIBLE_DEVICES=4,5,6,7
#export CUDA_VISIBLE_DEVICES=1
# training
echo -e '\nDistributed Training\n'
python -m torch.distributed.launch --nproc_per_node=$GPUS --master_port=$PORT \
./tools/train.py ${CONFIG_FILE} \
--seed $RANDOM_SEED \
--launcher 'pytorch' \
--work-dir $WORK_DIR
# --resume-from $CHECKPOINT \
echo -e "\nWork Dir: ${WORK_DIR}.\n"
# evaluation
#echo -e "\nEvaluating ${WORK_DIR}\n"
#python -m torch.distributed.launch --nproc_per_node=$GPUS --master_port=$PORT \
# ./tools/test.py \
# ${CONFIG_FILE} \
# ${CHECKPOINT} \
# --launcher 'pytorch' \
# --work-dir $WORK_DIR \
# --eval mIoU \
# --tmpdir $TMPDIR \
# --out $RESULT_FILE \
#
#echo -e "\nWork Dir: ${WORK_DIR}.\n"
#
#python ./tools/test.py \
# ${CONFIG_FILE} \
# ${CHECKPOINT} \
# --test-fps \
# --test-shape 2048 1024 \
# visualization
#echo -e '\nVisualization.\n'
#if [ -d "${SHOW_DIR}" ]; then
# rm -rf "${SHOW_DIR}"
# mkdir "${SHOW_DIR}"
#fi
#python ./tools/test.py \
# ${CONFIG_FILE} \
# ${CHECKPOINT} \
# --show-dir $SHOW_DIR \
|
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}
module Zmora.Queue where
import qualified Data.ByteString as BS
import qualified Data.ByteString.Lazy as BL
import Data.Int (Int64)
import Data.ProtocolBuffers
import Data.Serialize (runGetLazy, runPutLazy)
import qualified Data.Text as T
import GHC.Generics (Generic)
--
-- Data model
--
data File = File
{ name :: Optional 1 (Value T.Text)
, content :: Optional 2 (Value BS.ByteString)
} deriving (Generic, Show)
data Test = Test
{ testId :: Optional 1 (Value Int64)
, input :: Optional 2 (Value T.Text)
, output :: Optional 3 (Value T.Text)
, timeLimit :: Optional 4 (Value Int64)
, ramLimit :: Optional 5 (Value Int64)
} deriving (Generic, Show)
data Task = Task
{ taskId :: Optional 1 (Value Int64)
, configuration :: Optional 2 (Value T.Text)
, files :: Repeated 3 (Message File)
, tests :: Repeated 4 (Message Test)
} deriving (Generic, Show)
data TaskResult = TaskResult
{ resultId :: Optional 1 (Value Int64)
, compilationLog :: Optional 2 (Value T.Text)
, testResults :: Repeated 3 (Message TestResult)
} deriving (Generic, Show)
data TestResult = TestResult
{ sourceTestId :: Optional 1 (Value Int64)
, status :: Optional 2 (Enumeration Status)
, executionTime :: Optional 3 (Value Int64)
, ramUsage :: Optional 4 (Value Int64)
} deriving (Generic, Show)
data Status
= OK
| RTE
| MEM
| TLE
| ANS
| CME
deriving (Enum, Show)
--
-- Serialization
--
instance Encode File
instance Encode Test
instance Encode Task
instance Encode TaskResult
instance Encode TestResult
serialize
:: Encode a
=> a -> BL.ByteString
serialize = runPutLazy . encodeMessage
--
-- Deserialization
--
instance Decode File
instance Decode Test
instance Decode Task
instance Decode TaskResult
instance Decode TestResult
deserialize
:: Decode a
=> BL.ByteString -> Either String a
deserialize = runGetLazy decodeMessage
|
package isamrs.tim21.klinika.dto;
import java.util.List;
import isamrs.tim21.klinika.domain.Sifarnik;
public class PosetaDTO3 {
private String opis;
private Long posetaId;
private Sifarnik bolest;
private List<Sifarnik> lekovi;
private String dioptrija;
private String krvnaGrupa;
private Integer visina;
private Integer tezina;
public PosetaDTO3(){
//default constructor
}
public PosetaDTO3(
String opis, Long posetaId, Sifarnik selectedDijagnoza, List<Sifarnik> selectedLekovi,
String dioptrija, String krvnaGrupa, Integer visina, Integer tezina ){
this.opis = opis;
this.posetaId = posetaId;
this.bolest = selectedDijagnoza;
this.lekovi = selectedLekovi;
this.dioptrija = dioptrija;
this.krvnaGrupa = krvnaGrupa;
this.visina = visina;
this.tezina = tezina;
}
public Long getPosetaId() {
return this.posetaId;
}
public void setPosetaId(Long posetaId) {
this.posetaId = posetaId;
}
public Sifarnik getBolest() {
return this.bolest;
}
public void setBolest(Sifarnik bolest) {
this.bolest = bolest;
}
public List<Sifarnik> getLekovi() {
return this.lekovi;
}
public void setLekovi(List<Sifarnik> lekovi) {
this.lekovi = lekovi;
}
public String getOpis() {
return this.opis;
}
public void setOpis(String opis) {
this.opis = opis;
}
public String getDioptrija() {
return this.dioptrija;
}
public void setDioptrija(String dioptrija) {
this.dioptrija = dioptrija;
}
public String getKrvnaGrupa() {
return this.krvnaGrupa;
}
public void setKrvnaGrupa(String krvnaGrupa) {
this.krvnaGrupa = krvnaGrupa;
}
public Integer getVisina() {
return this.visina;
}
public void setVisina(Integer visina) {
this.visina = visina;
}
public Integer getTezina() {
return this.tezina;
}
public void setTezina(Integer tezina) {
this.tezina = tezina;
}
}
|
var classv8_1_1internal_1_1_s_c_table_reference =
[
[ "SCTableReference", "classv8_1_1internal_1_1_s_c_table_reference.html#a3d27c68aadf4366d32e8391634559250", null ],
[ "address", "classv8_1_1internal_1_1_s_c_table_reference.html#a9ef72f9ebe8c83cf2706d426b8cb942e", null ],
[ "StubCache", "classv8_1_1internal_1_1_s_c_table_reference.html#a9dd0864bf7d020620606b5f3e1a0452f", null ],
[ "address_", "classv8_1_1internal_1_1_s_c_table_reference.html#adda6883a1d47763a5ba23dda0edf6e4f", null ]
];
|
<?php
namespace App\Http\Livewire\Importer;
use Elasticsearch\ClientBuilder;
use Illuminate\Http\File;
use Illuminate\Support\Facades\Storage;
use Illuminate\Validation\ValidationException;
use Livewire\Component;
class Import extends Component
{
public $csvUrl;
public $supportedColumns = [
'date',
'state',
'newDeaths'
];
public $csvColumns = [];
public $columnMappings = [];
public $state = [
'original' => '',
'mapped' => ''
];
public function import()
{
$fileHeaders = get_headers($this->csvUrl, true);
if (($fileHeaders['Content-Length'] ?? 0) > 10 * 1024 * 1024) {
throw ValidationException::withMessages([
'csvUrl' => 'The file is too big.'
]);
}
Storage::put('data.csv', file_get_contents($this->csvUrl));
if (($handle = fopen(Storage::path('data.csv'), "r")) !== false) {
// The first CSV row holds the column headers.
if (($data = fgetcsv($handle, 1000, ",")) !== false) {
$this->csvColumns = $data;
}
fclose($handle);
}
// $client = ClientBuilder::create()->build();
}
public function addMapping()
{
$this->columnMappings[$this->state['mapped']] = $this->state['original'];
$this->reset('state');
}
public function render()
{
return view('livewire.importer.import');
}
}
|
import 'package:flutter/material.dart';
import 'package:get/get.dart';
import 'package:getx_route_navigation/home.dart';
import 'package:getx_route_navigation/next.dart';
import 'package:getx_route_navigation/unknown_route.dart';
void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
@override
Widget build(BuildContext context) {
return GetMaterialApp(
debugShowCheckedModeBanner: false,
title: 'Route Navigation with Name',
initialRoute: "/",
defaultTransition: Transition.zoom,
getPages: [
GetPage(name: '/', page: () => MyApp()),
GetPage(name: '/home', page: () => MyHome()),
// GetPage(name: '/nextScreen', page: () => NextScreen(), transition: Transition.rightToLeft),
GetPage(name: '/nextScreen/:someValue', page: () => NextScreen(), transition: Transition.rightToLeft),
],
unknownRoute: GetPage(name: "/notfound", page: () => UnknownRoute()),
home: Scaffold(
appBar: AppBar(title: Text("Route Navigation with Name"),),
body: Center(
child: Column(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.center,
children: [
ElevatedButton(
onPressed: () {
// Get.to(
// MyHome(),
// // To show as a full screen dialog
// fullscreenDialog: true,
// // To animate the navigation screen with duration
// transition: Transition.zoom,
// duration: Duration(milliseconds: 1000),
// );
// Go to the next screen with no option to go back
// Get.off(MyHome());
// Go to the next screen and remove all previous routes
// Get.offAll(MyHome());
// Go to next screen with arguments / data
// Get.to(MyHome(), arguments: "Arguments from Main Screen");
// var data = await Get.to(MyHome());
// print("Received data is $data");
// Get.toNamed("/home");
// Go to the next screen with no option to go back
// Get.offNamed("/home");
// Go to next screen with arguments / data
Get.toNamed("/home?name=Ashraf uddin&email=ashrafuanhid@gmail.com");
},
child: Text("Go to Home"),
)
],
),
),
),
);
}
}
|
package havis.middleware.ale.core.manager;
import org.junit.Assert;
import org.junit.Test;
public class ACTest {
@Test
public void getStandardTest(){
Assert.assertEquals("1.1", AC.getStandardVersion());
}
}
|
package openfoodfacts.github.scrachx.openfood.features.changelog
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.TextView
import androidx.recyclerview.widget.RecyclerView
import openfoodfacts.github.scrachx.openfood.R
class ChangelogAdapter(private val items: List<ChangelogListItem>) : RecyclerView.Adapter<RecyclerView.ViewHolder>() {
companion object {
private const val VIEW_TYPE_HEADER = 1
private const val VIEW_TYPE_ITEM = 2
}
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): RecyclerView.ViewHolder {
val inflater = LayoutInflater.from(parent.context)
return when (viewType) {
VIEW_TYPE_HEADER -> HeaderViewHolder(inflater.inflate(R.layout.view_changelog_item_header, parent, false))
VIEW_TYPE_ITEM -> ItemViewHolder(inflater.inflate(R.layout.view_changelog_item, parent, false))
else -> throw IllegalStateException("Unexpected value: $viewType")
}
}
override fun onBindViewHolder(holder: RecyclerView.ViewHolder, position: Int) {
when (holder.itemViewType) {
VIEW_TYPE_HEADER -> (holder as HeaderViewHolder).bind((items[position] as ChangelogListItem.Header))
VIEW_TYPE_ITEM -> (holder as ItemViewHolder).bind((items[position] as ChangelogListItem.Item))
else -> throw IllegalStateException("Unexpected value: " + holder.itemViewType)
}
}
override fun getItemCount(): Int = items.size
override fun getItemViewType(position: Int): Int {
return if (items[position] is ChangelogListItem.Header) VIEW_TYPE_HEADER else VIEW_TYPE_ITEM
}
private class HeaderViewHolder(view: View) : RecyclerView.ViewHolder(view) {
private val versionLabel: TextView = view.findViewById(R.id.changelog_list_header_version)
private val dateLabel: TextView = view.findViewById(R.id.changelog_list_header_date)
fun bind(item: ChangelogListItem.Header) {
versionLabel.text = item.version
dateLabel.text = item.date
}
}
private class ItemViewHolder(view: View) : RecyclerView.ViewHolder(view) {
private val itemLabel: TextView = itemView.findViewById(R.id.changelog_list_item_label)
fun bind(item: ChangelogListItem.Item) {
itemLabel.text = item.description
}
}
}
|
<?php
declare(strict_types = 1);
namespace app\modules\dynamic_attributes\models\references;
use app\components\pozitronik\references\models\CustomisableReference;
use app\models\relations\RelUsersAttributesTypes;
/**
* This is the model class for table "ref_attributes_types".
*
* @property int $id
* @property string $name
* @property string $color
* @property int $deleted
* @property-read int $usedCount Number of objects that use this reference value
*/
class RefAttributesTypes extends CustomisableReference {
public $menuCaption = 'Типы отношений атрибутов';
public $menuIcon = false;
/**
* {@inheritdoc}
*/
public static function tableName():string {
return 'ref_attributes_types';
}
/**
* @return int
*/
public function getUsedCount():int {
return (int)RelUsersAttributesTypes::find()->where(['type' => $this->id])->count();
}
}
|
<?php
namespace App\Http\Controllers\Admin;
use App\Http\Controllers\Controller;
use App\Models\Why_choose;
use Illuminate\Http\Request;
use Carbon\Carbon;
use Image;
class Why_chooseController extends Controller
{
public function __construct()
{
$this->middleware('auth');
}
// view page
public function index()
{
return view('admin.why_choose', [
'why_chooses' => Why_choose::paginate(10),
]);
}
// insert page
public function why_choose_add()
{
return view('admin.why_choose_add');
}
// insert
public function insert(Request $req)
{
$heading = $req->heading;
$content = $req->content;
$photo = $req->file('photo');
$created_at = Carbon::now();
$req->validate([
'heading' => 'required',
'content' => 'required',
'photo' => 'required|file|image|mimes:jpeg,jpg,png',
]);
$id = Why_choose::insertGetId([
"heading" => $heading,
"content" => $content,
"created_at" => $created_at,
]);
$photo = $req->file('photo');
$photo_extention = $photo->getClientOriginalExtension();
$photo_name = "why_choose_" .$id ."." . $photo_extention;
Image::make($photo)->save(base_path('public/uploads/why_choose/' . $photo_name));
Why_choose::find($id)->update([
"photo" => $photo_name,
]);
return back()->with('success', 'Why choose item added successfully');
}
// edit page
public function edit_page($id)
{
return view('admin.why_choose_edit', [
'why_choose' => Why_choose::find($id),
]);
}
// edit
public function edit(Request $req)
{
$id = $req->id;
$heading = $req->heading;
$content = $req->content;
$photo = $req->file('photo');
$req->validate([
'heading' => 'required',
'content' => 'required',
]);
if($photo){
$req->validate([
'photo' => 'required|file|image|mimes:jpeg,jpg,png',
]);
}
Why_choose::find($id)->update([
"heading" => $heading,
"content" => $content,
]);
// photo update
if($photo){
$old_photo_name = Why_choose::find($id)->photo;
$old_photo_path = base_path('public/uploads/why_choose/' . $old_photo_name);
if(file_exists($old_photo_path)){
unlink($old_photo_path);
}
$photo_extention = $photo->getClientOriginalExtension();
// Rebuild the name from the id; appending to the old name would stack extensions (e.g. "why_choose_5.png.jpg")
$photo_name = "why_choose_" . $id . "." . $photo_extention;
Image::make($photo)->save(base_path('public/uploads/why_choose/' . $photo_name));
Why_choose::find($id)->update([
"photo" => $photo_name,
]);
}
return back()->with('success', 'Why choose item updated successfully');
}
// p_delete single
public function delete($id)
{
$photo = Why_choose::find($id)->photo;
$photo_path = base_path('public/uploads/why_choose/' . $photo);
if(file_exists($photo_path)){
unlink($photo_path);
}
Why_choose::find($id)->forceDelete();
return back()->with('success', 'Why choose item deleted successfully');
}
}
|
package pattern
import models.{Customer, Employee}
import scala.util.Random
object PatternMatchingExample extends App {
val x = Random.nextInt(8)
x match {
case 0 => println("zero")
case 1 => println("one")
case 2 => println("two")
case 3 => println("three")
case _ => println("more than three")
}
(1, true, "test") match {
case (_, false, _) => println("false")
case (_, _, "production") => println("production")
case (_, true, _) => println("true")
case _ => println("other")
}
val person = Customer(1L, "Michael", "Angel")
person match {
case Customer(_, "Michael", _) => println("It's Michael.")
case _ => println("Someone else")
}
person match {
case p: Employee => println("Employee") // compiler will warn because person is Customer
case p: Customer if p.firstName == "Michael" => println("It's Michael.")
case _ => println("Unemployed")
}
val intSeq = 10 :: 20 :: 30 :: Nil
intSeq match {
case Seq(_, second, _*) => Option(second)
case _ => None
}
}
|
#!/bin/bash
# Set up vnc server
# https://askubuntu.com/questions/328240/assign-vnc-password-using-script
myuser="econ-ark"
mypass="kra-noce"
echo "$mypass" > /tmp/vncpasswd # First is the read-write password
echo "$myuser" >> /tmp/vncpasswd # Next is the read-only password (useful for sharing screen with students)
[[ -e /home/$myuser/.vnc ]] && rm -Rf /home/$myuser/.vnc # If a previous version exists, delete it
mkdir /home/$myuser/.vnc
vncpasswd -f < /tmp/vncpasswd > /home/$myuser/.vnc/passwd # Create encrypted versions
# Give the files the right permissions
chown -R $myuser:$myuser /home/$myuser/.vnc
chmod 0600 /home/$myuser/.vnc/passwd
touch /home/$myuser/.bash_aliases
echo '# If not already running, launch the vncserver whenever an interactive shell starts' >> /home/$myuser/.bash_aliases
echo 'pgrep x0vncserver > /dev/null' >> /home/$myuser/.bash_aliases
echo '[[ $? -eq 1 ]] && (x0vncserver -display :0 -PasswordFile=/home/'$myuser'/.vnc/passwd >> /dev/null 2>&1 &)' >> /home/$myuser/.bash_aliases
|
namespace AlbedoTeam.Sdk.Authentication.Internals
{
using System;
using System.Linq;
using Abstractions;
internal class AuthenticationConfigurator : IAuthenticationConfigurator
{
public IAuthenticationOptions Options { get; private set; }
public IAuthenticationConfigurator SetOptions(Action<IAuthenticationOptions> configureOptions)
{
IAuthenticationOptions options = new AuthenticationOptions();
configureOptions.Invoke(options);
if (string.IsNullOrWhiteSpace(options.AuthServerUrl))
throw new InvalidOperationException("Cannot set up authentication without a valid AuthServer URL");
if (string.IsNullOrWhiteSpace(options.AuthServerId))
throw new InvalidOperationException("Cannot set up authentication without a valid AuthServer ID");
if (string.IsNullOrWhiteSpace(options.Audience))
throw new InvalidOperationException("Cannot set up authentication without a valid Audience");
// if (options.AllowedOrigins is null || !options.AllowedOrigins.Any())
// throw new InvalidOperationException("Can not setup the authentication without allowed origins");
Options = options;
return this;
}
}
}
|
(cl:defpackage cacc_msgs-msg
(:use )
(:export
"<CACCCONTROLPACKET>"
"CACCCONTROLPACKET"
"<CACCMPCPARAM>"
"CACCMPCPARAM"
"<CACCMPCSTATE>"
"CACCMPCSTATE"
"<CACCSENSORPACKET>"
"CACCSENSORPACKET"
"<CACCSTATEPACKET>"
"CACCSTATEPACKET"
"<PRARXPARAM>"
"PRARXPARAM"
))
|
import {Vessel} from "../../../../backend-api/identity-registry/autogen/model/Vessel";
import {VesselAttribute} from "../../../../backend-api/identity-registry/autogen/model/VesselAttribute";
import AttributeNameEnum = VesselAttribute.AttributeNameEnum;
import {EnumsHelper} from "../../../../shared/enums-helper";
export interface VesselAttributeViewModel extends VesselAttribute {
attributeNameText?:string;
}
// TODO maybe this should just be a helper.service instead. Or maybe just static methods if no object state is needed
export class VesselViewModel {
private attributes:Array<VesselAttributeViewModel>;
private vessel:Vessel;
constructor(vessel:Vessel) {
this.vessel = vessel;
this.generateAttributes();
}
public static getAllPossibleVesselAttributes(): Array<VesselAttributeViewModel> {
let attributes:Array<VesselAttributeViewModel> = [];
let attributeKeysAndValues = EnumsHelper.getKeysAndValuesFromEnum(AttributeNameEnum);
attributeKeysAndValues.forEach(enumKeyAndValue => {
let vesselAttribute:VesselAttributeViewModel = {
attributeValue: '',
attributeName: enumKeyAndValue.value,
attributeNameText: VesselViewModel.getTextForVesselAttributeNameEnum(enumKeyAndValue.value)
};
attributes.push(vesselAttribute);
});
return attributes;
}
public static convertVesselsToViewModels(vessels:Array<Vessel>):Array<VesselViewModel> {
let viewModels:Array<VesselViewModel> = [];
if (vessels) {
vessels.forEach(vessel => {
viewModels.push(new VesselViewModel(vessel));
});
}
return viewModels;
}
public getVessel():Vessel {
return this.vessel;
}
public getAttributeViewModels():Array<VesselAttributeViewModel> {
return this.attributes;
}
private generateAttributes() {
this.attributes = [];
if (this.vessel.attributes) {
this.vessel.attributes.forEach(attribute => {
this.attributes.push(this.attributeViewModelFromAttribute(attribute));
});
}
}
private attributeViewModelFromAttribute(attribute:VesselAttribute): VesselAttributeViewModel {
let attributeViewModel: VesselAttributeViewModel = attribute;
attributeViewModel.attributeNameText = VesselViewModel.getTextForVesselAttributeNameEnum(attribute.attributeName);
return attributeViewModel;
}
private static getTextForVesselAttributeNameEnum(vesselAttributeEnum:AttributeNameEnum):string {
let text = '';
switch (vesselAttributeEnum) {
case AttributeNameEnum.AisClass: {
text = 'AIS class';
break;
}
case AttributeNameEnum.Callsign: {
text = 'Call sign';
break;
}
case AttributeNameEnum.Flagstate: {
text = 'Flag state';
break;
}
case AttributeNameEnum.ImoNumber: {
text = 'IMO number';
break;
}
case AttributeNameEnum.MmsiNumber: {
text = 'MMSI number';
break;
}
case AttributeNameEnum.PortOfRegister: {
text = 'Port of register';
break;
}
default : {
text = AttributeNameEnum[vesselAttributeEnum];
if (!text) {
text = ''+ vesselAttributeEnum;
}
}
}
return text;
}
}
|
# Copyright (c) Microsoft Corporation
# Licensed under the MIT License.
name = 'raiutils'
_major = '0'
_minor = '0'
_patch = '1'
version = '{}.{}.{}'.format(_major, _minor, _patch)
|
import autocomplete, { AutocompleteItem } from 'ag-grid-autocomplete-editor/autocompleter/autocomplete'
describe('autocomplete end-to-end autoselect tests', () => {
it('should not autoselect first when outside click is detected', function () {
const inputText = 'United'
cy.fixture('selectDatas/united.json').as('selectData')
// @ts-ignore
cy.visit('./cypress/static/autocomplete-test-sandbox.html')
// Get the input element and setup autocomplete to it
cy.get('#autocompleter').then((indexQueryElement) => {
const { selectData } = this
autocomplete({
autoselectfirst: true,
fetch(search: string, update: <AutocompleteItem>(items: AutocompleteItem[] | false) => void) {
update(selectData)
},
onSelect(item: AutocompleteItem | undefined) {
if (item && item.label) {
indexQueryElement.val(item.label)
} else {
indexQueryElement.val('')
}
},
strict: true,
input: <HTMLInputElement>indexQueryElement.get(0),
})
})
// Type some text into the autocompleter input field
cy.get('#autocompleter').type(inputText)
// Should show the select list on the page
cy.get('.autocomplete')
cy.get('html').realClick()
cy.get('.autocomplete').should('not.exist')
cy.get('#autocompleter').then((indexQueryElement) => {
expect(indexQueryElement.val()).to.be.equal(inputText)
})
})
it('should not autoselect first when escape key is sent', function () {
const inputText = 'United'
cy.fixture('selectDatas/united.json').as('selectData')
// @ts-ignore
cy.visit('./cypress/static/autocomplete-test-sandbox.html')
// Get the input element and setup autocomplete to it
cy.get('#autocompleter').then((indexQueryElement) => {
const { selectData } = this
autocomplete({
autoselectfirst: true,
fetch(search: string, update: <AutocompleteItem>(items: AutocompleteItem[] | false) => void) {
update(selectData)
},
onSelect(item: AutocompleteItem | undefined) {
if (item && item.label) {
indexQueryElement.val(item.label)
} else {
indexQueryElement.val('')
}
},
strict: true,
input: <HTMLInputElement>indexQueryElement.get(0),
})
})
// Type some text into the autocompleter input field
cy.get('#autocompleter').type(inputText)
// Should show the select list on the page
cy.get('.autocomplete')
// Should close the list with escape key
cy.get('#autocompleter').type('{esc}')
cy.get('.autocomplete').should('not.exist')
cy.get('#autocompleter').then((indexQueryElement) => {
expect(indexQueryElement.val()).to.be.equal('')
})
})
it('should autoselect first when enter key is sent', function () {
const inputText = 'United'
cy.fixture('selectDatas/united.json').as('selectData')
// @ts-ignore
cy.visit('./cypress/static/autocomplete-test-sandbox.html')
// Get the input element and setup autocomplete to it
cy.get('#autocompleter').then((indexQueryElement) => {
const { selectData } = this
autocomplete({
autoselectfirst: true,
fetch(search: string, update: <AutocompleteItem>(items: AutocompleteItem[] | false) => void) {
update(selectData)
},
onSelect(item: AutocompleteItem | undefined) {
if (item && item.label) {
indexQueryElement.val(item.label)
} else {
indexQueryElement.val('')
}
},
strict: true,
input: <HTMLInputElement>indexQueryElement.get(0),
})
})
// Type some text into the autocompleter input field
cy.get('#autocompleter').type(inputText)
// Should show the select list on the page
cy.get('.autocomplete > :nth-child(1)').then((indexQueryElement) => {
expect(indexQueryElement.hasClass('selected')).to.be.equal(true)
})
// Should select the highlighted first item with the enter key
cy.get('#autocompleter').type('{enter}')
cy.get('.autocomplete').should('not.exist')
cy.get('#autocompleter').then((indexQueryElement) => {
const { selectData } = this
expect(indexQueryElement.val()).to.be.equal(selectData[0].label)
})
})
})
|
package io.homeassistant.companion.android.domain.authentication
enum class SessionState {
ANONYMOUS,
CONNECTED,
}
|
# 3.1.1 Adapting the domain for persistence

When persisting objects to a database, it's generally a good idea to have a field that uniquely identifies each object. The Ingredient class already has an id field, but you need to add id fields to Taco and Order as well.

Moreover, it can be useful to know when a Taco was created and when an Order was placed, so you also need to add a field to each object to capture the date and time it was saved. The following listing shows the new id and createdAt fields needed in the Taco class.

{% code title="Listing 3.3 Adding id and timestamp fields to the Taco class" %}
```java
@Data
public class Taco {
    private Long id;
    private Date createdAt;
    ...
}
```
{% endcode %}

Because Lombok automatically generates accessor methods, there's nothing to do beyond declaring the id and createdAt properties; appropriate getter and setter methods will be generated as needed. The Order class needs similar changes, as shown here:

```java
@Data
public class Order {
    private Long id;
    private Date placedAt;
    ...
}
```

Again, Lombok generates the accessor methods automatically, so these declarations are the only changes required. (If for some reason you choose not to use Lombok, you'll need to write these methods yourself.)

The domain classes are now ready for persistence. Let's see how to use JdbcTemplate to read and write them to the database.
|
// Copyright 2020 The SQLFlow Authors. All rights reserved.
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package response
import (
"regexp"
"strings"
"github.com/golang/protobuf/proto"
pb "sqlflow.org/sqlflow/pkg/proto"
)
// CompoundResponses Compounds response
type CompoundResponses struct {
responses []*pb.Response
errMessages []string
}
// New returns CompoundResponses with step index
func New() *CompoundResponses {
return &CompoundResponses{
responses: []*pb.Response{},
}
}
// AppendMessage append message
func (r *CompoundResponses) AppendMessage(message string) error {
res, e := pb.EncodeMessage(message)
if e != nil {
return e
}
r.responses = append(r.responses, res)
return nil
}
// AppendProtoMessages appends the message with protobuf message format
func (r *CompoundResponses) AppendProtoMessages(messages []string) error {
// unmarshal pb.Response from proto message with text format
out, errMsgs, e := unMarshalProtoMessages(messages)
if e != nil {
return e
}
r.responses = append(r.responses, out...)
r.errMessages = append(r.errMessages, errMsgs...)
return nil
}
// ErrorMessage returns the error message as string
func (r *CompoundResponses) ErrorMessage() string {
return strings.Join(r.errMessages, "\n")
}
// Response returns the compounded Response
func (r *CompoundResponses) Response(jobID, stepID, stepPhase string, eof bool) *pb.FetchResponse {
return NewFetchResponse(NewFetchRequest(jobID, stepID, stepPhase), eof, r.responses)
}
// ResponseWithStepComplete returns the compounded Response at the end of step
func (r *CompoundResponses) ResponseWithStepComplete(jobID, stepID, stepPhase string, eof bool) *pb.FetchResponse {
eoe := &pb.Response{Response: &pb.Response_Eoe{Eoe: &pb.EndOfExecution{}}}
r.responses = append(r.responses, eoe)
return r.Response(jobID, stepID, stepPhase, eof)
}
func unMarshalProtoMessages(messages []string) ([]*pb.Response, []string, error) {
responses := []*pb.Response{}
errMessages := []string{}
for _, msg := range messages {
msg = strings.TrimSpace(msg)
if isHTMLCode(msg) {
r, e := pb.EncodeMessage(msg)
if e != nil {
return nil, errMessages, e
}
responses = append(responses, r)
// HTML snippets are not protobuf text, so skip the unmarshal below
continue
}
response := &pb.Response{}
if e := proto.UnmarshalText(msg, response); e != nil {
// skip this line if it's not protobuf message
continue
}
// TODO(yancey1989): Add an Error proto message which contains error code and error message
if response.GetMessage() != nil {
errMessages = append(errMessages, response.GetMessage().Message)
} else {
responses = append(responses, response)
}
}
return responses, errMessages, nil
}
func isHTMLCode(code string) bool {
//TODO(yancey1989): support more lines HTML code e.g.
//<div>
// ...
//</div>
re := regexp.MustCompile(`<div.*?>.*</div>`)
return re.MatchString(code)
}
// NewFetchRequest returns a FetchRequest
func NewFetchRequest(workflowID, stepID, stepPhase string) *pb.FetchRequest {
return &pb.FetchRequest{
Job: &pb.Job{
Id: workflowID,
},
StepId: stepID,
StepPhase: stepPhase,
}
}
// NewFetchResponse returns a FetchResponse
func NewFetchResponse(newReq *pb.FetchRequest, eof bool, responses []*pb.Response) *pb.FetchResponse {
return &pb.FetchResponse{
UpdatedFetchSince: newReq,
Eof: eof,
Responses: &pb.FetchResponse_Responses{
Response: responses,
},
}
}
|
const pipe: import('ts-functionaltypes').Pipe = (...functions) => (arg) => functions.reduce(
(result, fn) => fn(result),
arg
);
export default pipe;
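// A quick usage sketch (untyped re-implementation for illustration only, the
// names below are hypothetical): `pipe` folds its argument through the supplied
// functions left to right, so the first function listed runs first.

```javascript
// Minimal untyped re-implementation of the pipe above, for illustration only.
const pipeDemo = (...functions) => (arg) => functions.reduce(
    (result, fn) => fn(result),
    arg
);

// Functions apply left to right: addOne runs before double.
const addOne = (x) => x + 1;
const double = (x) => x * 2;
const addThenDouble = pipeDemo(addOne, double);
console.log(addThenDouble(3)); // (3 + 1) * 2 = 8
```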
|
package csc207.fall2018.gamecentreapp.slidingtiles;
import android.support.annotation.NonNull;
import java.util.NoSuchElementException;
import java.util.Observable;
import java.io.Serializable;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
/**
* The sliding tiles board.
*/
public class Board extends Observable implements Serializable, Iterable<Tile> {
/**
* The number of rows.
*/
public int NUM_ROWS;
/**
* The number of columns.
*/
public int NUM_COLS;
/**
* The tiles on the board in row-major order.
*/
private Tile[][] tiles;
/**
* The constructor of the Board class.
*
* @param tiles the tiles.
* @param size the size.
*/
Board(List<Tile> tiles, int size) {
NUM_COLS = size;
NUM_ROWS = size;
this.tiles = new Tile[NUM_ROWS][NUM_COLS];
Iterator<Tile> iter = tiles.iterator();
for (int row = 0; row != size; row++) {
for (int col = 0; col != size; col++) {
this.tiles[row][col] = iter.next();
}
}
}
/**
* Return the number of tiles on the board.
*
* @return the number of tiles on the board
*/
int numTiles() {
return NUM_COLS * NUM_ROWS;
}
/**
* Return the tile at (row, col)
*
* @param row the tile row
* @param col the tile column
* @return the tile at (row, col)
*/
public Tile getTile(int row, int col) {
return tiles[row][col];
}
/**
* Swap the tiles at (row1, col1) and (row2, col2)
*
* @param row1 the first tile row
* @param col1 the first tile col
* @param row2 the second tile row
* @param col2 the second tile col
*/
void swapTiles(int row1, int col1, int row2, int col2) {
Tile tile1 = getTile(row1, col1);
Tile tile2 = getTile(row2, col2);
tiles[row1][col1] = tile2;
tiles[row2][col2] = tile1;
setChanged();
notifyObservers();
}
@Override
public String toString() {
return "Board{" +
"tiles=" + Arrays.toString(tiles) +
'}';
}
@NonNull
@Override
public Iterator<Tile> iterator() {
return new TileIterator();
}
/**
* A new TileIterator that iterates the tiles in the Board.
*/
private class TileIterator implements Iterator<Tile> {
/**
* The index of the next tile in the Board.
*/
private int nextIndex = 0;
@Override
public boolean hasNext() {
return nextIndex != numTiles();
}
@Override
public Tile next() {
if (this.hasNext()) {
Tile result = getTile(nextIndex / NUM_COLS, nextIndex % NUM_COLS);
nextIndex++;
return result;
}
throw new NoSuchElementException();
}
}
}
|
from data_processing import DataProcessing as DP
if __name__ == '__main__':
dp = DP(cut_level='word')
dp.dataset_dir('usrfiles', 'newfiles')
|
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
exports.myKeyBindingFn = undefined;
exports.findInlineTeXEntities = findInlineTeXEntities;
exports.changeDecorator = changeDecorator;
var _draftJs = require('draft-js');
var hasCommandModifier = _draftJs.KeyBindingUtil.hasCommandModifier;
var myKeyBindingFn = exports.myKeyBindingFn = function myKeyBindingFn(getEditorState) {
return function (e) {
// I would have preferred CTRL+$ over CTRL+M, but that seems
// a bit tricky because Chrome handles e.key poorly.
// if (e.key === '$' && hasCommandModifier(e))
if (e.keyCode === /* m */77 && hasCommandModifier(e)) {
return 'insert-texblock';
}
// if (e.key === /* $ */ '$' /* && hasCommandModifier(e)*/) {
// const c = getEditorState().getCurrentContent()
// const s = getEditorState().getSelection()
// if (!s.isCollapsed()) return 'insert-inlinetex'
// const bk = s.getStartKey()
// const b = c.getBlockForKey(bk)
// const offset = s.getStartOffset() - 1
// if (b.getText()[offset] === '\\') {
// return `insert-char-${e.key}`
// }
// return 'insert-inlinetex'
// }
// if (e.key === '*') {
// return 'test'
// }
// cursor handling for when it sits next to a formula
if (e.key === 'ArrowRight' || e.key === 'ArrowLeft') {
var d = e.key === 'ArrowRight' ? 'r' : 'l';
var s = getEditorState().getSelection();
var c = getEditorState().getCurrentContent();
if (!s.isCollapsed()) {
return undefined;
}
var offset = s.getStartOffset();
var blockKey = s.getStartKey();
var cb = c.getBlockForKey(blockKey);
if (cb.getLength() === offset && d === 'r') {
var b = c.getBlockAfter(blockKey);
if (b && b.getType() === 'atomic' && b.getData().get('mathjax')) {
return 'update-texblock-' + d + '-' + b.getKey();
}
}
if (offset === 0 && d === 'l') {
var _b = c.getBlockBefore(blockKey);
if (_b && _b.getType() === 'atomic' && _b.getData().get('mathjax')) {
return 'update-texblock-' + d + '-' + _b.getKey();
}
}
var ek = cb.getEntityAt(offset - (e.key === 'ArrowLeft' ? 1 : 0));
if (ek && c.getEntity(ek).getType() === 'INLINETEX') {
return 'update-inlinetex-' + d + '-' + ek;
}
}
return (0, _draftJs.getDefaultKeyBinding)(e);
};
};
function findInlineTeXEntities(contentBlock, callback, contentState) {
contentBlock.findEntityRanges(function (character) {
var entityKey = character.getEntity();
return entityKey !== null && contentState.getEntity(entityKey).getType() === 'INLINETEX';
}, callback);
}
function changeDecorator(editorState, decorator) {
return _draftJs.EditorState.create({
allowUndo: true,
currentContent: editorState.getCurrentContent(),
decorator: decorator,
selection: editorState.getSelection()
});
}
|
import React, { Component } from 'react';
import Searchbar from '../components/Eventspage/Searchbar/Searchbar'
import Actionsbar from '../components/Actionsbar/Actionsbar'
import ProcessEventList from '../components/Eventspage/Eventslist/ProcessEventList'
class Eventspage extends Component {
state = {
activeTab: "explore",
processInstanceInfoList: [
],
processDetails: [
],
events: [
{id: "550e8400-e29b-11d4-a716-446655440000", name: "ProcessX", status: 22},
{id: "550e8400-e29b-11d4-a716-446655440001", name: "ProcessY", status: 44},
{id: "550e8400-e29b-11d4-a716-446655440002", name: "ProcessZ", status: 445}
]
}
setProcessInstanceInfoList(processInstanceInfos : any) {
this.setState({
processInstanceInfoList: processInstanceInfos
})
}
setProcessDetails(processDetails : any) {
this.setState({
processDetails: processDetails
})
}
changeTab(tabName: string) {
this.setState({
activeTab: tabName
})
}
render() {
let activeTabContent;
if (this.state.activeTab === "explore") {
activeTabContent =
<div>
<Searchbar/>
<ProcessEventList events={this.state.events}/>
</div>
} else if (this.state.activeTab === "usermanagment"){
activeTabContent = <div>This is the usermanagement tab</div>
} else if (this.state.activeTab === "deadletters"){
activeTabContent = <div>This is the deadletters tab</div>
}
return (
<div className="Eventspage">
<Actionsbar changed={this.changeTab.bind(this)}/>
{activeTabContent}
</div>
);
}
}
export default Eventspage
|
package net.ndrei.teslacorelib.netsync
import net.minecraftforge.fml.common.network.simpleimpl.IMessage
/**
* Created by CF on 2017-06-28.
*/
interface ITeslaCorePackets {
fun send(message: IMessage)
fun sendToServer(message: IMessage)
}
|
<?php
require 'view/template/nav.php';
require 'view/template/header.php';
?>
<article>
<h3 class="text-center my-4">Votre compte et vos dernières opérations</h3>
<div class="container">
<div class="row">
<div class="card col-10 col-md-5 mx-auto my-4" style="width: 18rem;">
<div class="card-header">
<?= $show_account_single[0]->getAccountType()?>
</div>
<ul class="list-group list-group-flush">
<?php foreach ($show_account_single as $key => $account): ?>
<li class="list-group-item">Votre solde : <?=$account->getAmountA()?> euro</li>
<li class="list-group-item">Enregistré le : <?=$account->getOpeningDate()?></li>
<?php endforeach; ?>
</ul>
<a href="index.php" class="btn btn-primary">Accueil</a>
</div>
<table class="table col-10 col-md-5 mx-auto my-4">
<thead class="thead-dark">
<tr>
<th scope="col">Opérations</th>
<th scope="col">Montant</th>
<th scope="col">Date</th>
<th scope="col">Label</th>
</tr>
</thead>
<tbody>
<?php foreach ($show_operations as $key => $operation): ?>
<tr>
<th scope="row"><?=$operation->getOperationType()?></th>
<td><?=$operation->getAmountO()?></td>
<td><?=$operation->getRegistered()?></td>
<td><?=$operation->getLabel()?></td>
</tr>
<?php endforeach; ?>
</tbody>
</table>
</div>
</div>
</article>
<?php
require 'view/template/footer.php';
?>
|
/**
* Copyright (C) 2012 - present by OpenGamma Inc. and the OpenGamma group of companies
*
* Please see distribution for license.
*/
package com.opengamma.analytics.financial.model.finitedifference;
/**
* @param <T> The type of the PDE coefficients
*/
public interface PDE1DSolver<T extends PDE1DCoefficients> {
PDEResults1D solve(PDE1DDataBundle<T> pdeData);
//void visit(T coeff);
}
|
import React, { Component } from 'react';
import { Button, Form, Grid, Header, Message, Segment, Icon } from 'semantic-ui-react'
import { connect } from 'react-redux';
import { fetchPiece } from '../store/piece'
import { updatePieceThunk, deletePieceThunk, fetchAllPieces } from '../store/pieces'
class Modal extends Component {
constructor(props) {
super(props)
this.state = {
category: '',
item: '',
quantity: ''
}
this.handleCategoryChange = this.handleCategoryChange.bind(this)
this.handleItemChange = this.handleItemChange.bind(this)
this.handleQuantityChange = this.handleQuantityChange.bind(this)
}
  componentDidMount() {
    this.props.loadPieceData(this.props.xCoord, this.props.yCoord)
  }
handleCategoryChange(event) {
event.preventDefault()
this.setState({ category: event.target.value.toLowerCase() })
}
handleItemChange(event) {
event.preventDefault()
this.setState({ item: event.target.value.toLowerCase() })
}
handleQuantityChange(event) {
event.preventDefault()
this.setState({ quantity: event.target.value.toLowerCase() })
}
//COME BACK TO FIX THIS LATER Quantity is not working
render() {
let pieceId = this.props.piece.id
let pieceX = this.props.piece.positionX;
let pieceY = this.props.piece.positionY;
return (
<div>
<div id="modal">
<Grid
textAlign='center'
style={{ height: '100%' }}
verticalAlign='middle' >
<Grid.Column style={{ maxWidth: 450 }}>
<Header as='h2' color='olive' textAlign='center'>
Add Inventory to Crate
</Header>
<Form onSubmit={(event) => this.props.inventorySubmission(event, pieceId)}>
<label>Category</label>
<Form.Input
name='category'
placeholder={this.props.piece.category || this.state.category}
value={this.state.category}
onChange={this.handleCategoryChange} />
<label>Item</label>
<Form.Input
name='item'
placeholder={this.props.piece.item || this.state.item}
value={this.state.item}
onChange={this.handleItemChange} />
<label>Quantity</label>
<Form.Input
name='quantity'
placeholder={this.props.piece.quantity || this.state.quantity}
value={this.state.quantity}
onChange={this.handleQuantityChange} />
<Button type='submit'>Submit</Button>
</Form>
<Button type='submit' onClick={() => this.props.deleteCrate(pieceId)}>Delete Crate</Button>
</Grid.Column>
</Grid>
</div>
<div id="modalBackground" onClick={this.props.closeModal}>
</div>
</div>
)
}
}
const mapState = ({ piece }) => ({ piece })
const mapDispatch = (dispatch, ownProps) => {
return {
    loadPieceData(x, y) {
      dispatch(fetchPiece(x, y))
    },
inventorySubmission(event, id) {
event.preventDefault();
dispatch(updatePieceThunk({
category: event.target.category.value,
item: event.target.item.value,
quantity: event.target.quantity.value,
pieceId: id
      }, ownProps.xCoord, ownProps.yCoord)).then(() => ownProps.closeModal())
}
}
}
export default connect(mapState, mapDispatch)(Modal)
|
#!/bin/bash
helpstr="$0 <tarball> (fedora|centos) <output-dir>"
if [ $# -ne 3 ]; then
    echo "$helpstr"
    exit 1
fi
tarball=$1
dist=""
case $2 in
"fedora")
dist="fedora-24-x86_64"
;;
"centos")
dist="epel-7-x86_64"
;;
*)
echo "Unknown dist $2"
echo $helpstr
exit 1
;;
esac
outdir=$3
if [ -f "$outdir" ]; then
    echo "$outdir is not a directory"
    exit 1
fi
if [ ! -e "$outdir" ]; then
    mkdir -p "$outdir"
fi
tmpdir=$(mktemp -d)
echo "Creating SRPM"
rpmbuild --define "_topdir $tmpdir" -ts $tarball || exit 1
srpm=$tmpdir/SRPMS/*
echo "Building RPMS using mock"
/usr/bin/mock -r $dist --resultdir=$outdir --rebuild $srpm || exit 1
echo "Cleaning up"
rm -rf $tmpdir
echo "Created RPMs from $tarball in $outdir"
|
#!/bin/bash
# Copyright 2012 Cloudera Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# This script can be executed in two ways:
# 1) Without any command line parameters - A normal data load will happen where data is
# generated as needed, generally by issuing 'INSERT INTO <table> SELECT *' commands.
# 2) With a command line parameter pointing to a test-warehouse snapshot file - In this
# case the snapshot file contents will be copied into HDFS prior to calling the data load
# scripts. This speeds up overall data loading time because it usually means only the
# table metadata needs to be created.
#
# For more information look at testdata/bin/load-test-warehouse-snapshot.sh and
# bin/load-data.py
# Exit on error.
set -e
. ${IMPALA_HOME}/bin/impala-config.sh > /dev/null 2>&1
SKIP_METADATA_LOAD=0
SKIP_SNAPSHOT_LOAD=0
SNAPSHOT_FILE=""
LOAD_DATA_ARGS=""
JDBC_URL="jdbc:hive2://localhost:11050/default;"
DATA_LOADING_LOG_DIR=${IMPALA_TEST_CLUSTER_LOG_DIR}/data_loading
mkdir -p ${DATA_LOADING_LOG_DIR}
while [ -n "$*" ]
do
case $1 in
-skip_metadata_load)
SKIP_METADATA_LOAD=1
;;
-skip_snapshot_load)
SKIP_SNAPSHOT_LOAD=1
;;
-snapshot_file)
SNAPSHOT_FILE=${2-}
      if [ ! -f "$SNAPSHOT_FILE" ]; then
echo "-snapshot_file does not exist: $SNAPSHOT_FILE"
exit 1;
fi
shift;
;;
-help|-h|*)
echo "create-load-data.sh : Creates data and loads from scratch"
echo "[-skip_metadata_load] : Skips loading of metadata"
echo "[-skip_snapshot_load] : Assumes that the snapshot is already loaded"
echo "[-snapshot_file] : Loads the test warehouse snapshot into hdfs"
exit 1;
;;
esac
shift;
done
if [[ $SKIP_METADATA_LOAD -eq 0 && "$SNAPSHOT_FILE" = "" ]]; then
echo "Loading Hive Builtins"
${IMPALA_HOME}/testdata/bin/load-hive-builtins.sh
echo "Generating HBase data"
${IMPALA_HOME}/testdata/bin/create-hbase.sh &> ${DATA_LOADING_LOG_DIR}/create-hbase.log
echo "Creating /test-warehouse HDFS directory"
hadoop fs -mkdir /test-warehouse
elif [ $SKIP_SNAPSHOT_LOAD -eq 0 ]; then
echo Loading hdfs data from snapshot: $SNAPSHOT_FILE
${IMPALA_HOME}/testdata/bin/load-test-warehouse-snapshot.sh "$SNAPSHOT_FILE"
# Don't skip the metadata load if a schema change is detected.
if ! ${IMPALA_HOME}/testdata/bin/check-schema-diff.sh; then
echo "Schema change detected, metadata will be loaded."
SKIP_METADATA_LOAD=0
fi
else
# hdfs data already exists, don't load it.
echo Skipping loading data to hdfs.
fi
function load-custom-schemas {
echo LOADING CUSTOM SCHEMAS
SCHEMA_SRC_DIR=${IMPALA_HOME}/testdata/data/schemas
SCHEMA_DEST_DIR=/test-warehouse/schemas
# clean the old schemas directory.
hadoop fs -rm -r -f ${SCHEMA_DEST_DIR}
hadoop fs -mkdir ${SCHEMA_DEST_DIR}
hadoop fs -put $SCHEMA_SRC_DIR/zipcode_incomes.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/unsupported.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/map.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/array.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/struct.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/alltypestiny.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/malformed_decimal_tiny.parquet ${SCHEMA_DEST_DIR}/
hadoop fs -put $SCHEMA_SRC_DIR/decimal.parquet ${SCHEMA_DEST_DIR}/
# CHAR and VARCHAR tables written by Hive
hadoop fs -mkdir -p /test-warehouse/chars_formats_avro_snap/
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/chars-formats.avro \
/test-warehouse/chars_formats_avro_snap
hadoop fs -mkdir -p /test-warehouse/chars_formats_parquet/
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/chars-formats.parquet \
/test-warehouse/chars_formats_parquet
hadoop fs -mkdir -p /test-warehouse/chars_formats_text/
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/chars-formats.txt \
/test-warehouse/chars_formats_text
}
function load-data {
WORKLOAD=${1}
EXPLORATION_STRATEGY=${2:-"core"}
TABLE_FORMATS=${3:-}
MSG="Loading workload '$WORKLOAD'"
ARGS=("--workloads $WORKLOAD")
MSG+=" Using exploration strategy '$EXPLORATION_STRATEGY'"
ARGS+=("-e $EXPLORATION_STRATEGY")
  if [ -n "$TABLE_FORMATS" ]; then
MSG+=" in table formats '$TABLE_FORMATS'"
ARGS+=("--table_formats $TABLE_FORMATS")
fi
  if [ -n "$LOAD_DATA_ARGS" ]; then
ARGS+=("$LOAD_DATA_ARGS")
fi
# functional-query is unique. The dataset name is not the same as the workload name.
if [ "${WORKLOAD}" = "functional-query" ]; then
WORKLOAD="functional"
fi
# Force load the dataset if we detect a schema change.
if ! ${IMPALA_HOME}/testdata/bin/check-schema-diff.sh $WORKLOAD; then
ARGS+=("--force")
echo "Force loading $WORKLOAD because a schema change was detected"
fi
LOG_FILE=${DATA_LOADING_LOG_DIR}/data-load-${WORKLOAD}-${EXPLORATION_STRATEGY}.log
echo "$MSG. Logging to ${LOG_FILE}"
# Use unbuffered logging by executing with 'python -u'
python -u ${IMPALA_HOME}/bin/load-data.py ${ARGS[@]} &> ${LOG_FILE}
}
function cache-test-tables {
echo CACHING tpch.nation AND functional.alltypestiny
# uncaching the tables first makes this operation idempotent.
${IMPALA_HOME}/bin/impala-shell.sh -q "alter table functional.alltypestiny set uncached"
${IMPALA_HOME}/bin/impala-shell.sh -q "alter table tpch.nation set uncached"
${IMPALA_HOME}/bin/impala-shell.sh -q "alter table tpch.nation set cached in 'testPool'"
${IMPALA_HOME}/bin/impala-shell.sh -q\
"alter table functional.alltypestiny set cached in 'testPool'"
}
function load-aux-workloads {
echo LOADING AUXILIARY WORKLOADS
LOG_FILE=${DATA_LOADING_LOG_DIR}/data-load-auxiliary-workloads-core.log
rm -f $LOG_FILE
# Load all the auxiliary workloads (if any exist)
if [ -d ${IMPALA_AUX_WORKLOAD_DIR} ] && [ -d ${IMPALA_AUX_DATASET_DIR} ]; then
python -u ${IMPALA_HOME}/bin/load-data.py --workloads all\
--workload_dir=${IMPALA_AUX_WORKLOAD_DIR}\
--dataset_dir=${IMPALA_AUX_DATASET_DIR}\
--exploration_strategy=core ${LOAD_DATA_ARGS} &>> $LOG_FILE
else
    echo "Skipping load of auxiliary workloads because directories do not exist"
fi
}
function copy-auth-policy {
echo COPYING AUTHORIZATION POLICY FILE
hadoop fs -rm -f ${FILESYSTEM_PREFIX}/test-warehouse/authz-policy.ini
hadoop fs -put ${IMPALA_HOME}/fe/src/test/resources/authz-policy.ini \
${FILESYSTEM_PREFIX}/test-warehouse/
}
function copy-and-load-dependent-tables {
# COPY
# TODO: The multi-format table will move these files. So we need to copy them to a
# temporary location for that table to use. Should find a better way to handle this.
echo COPYING AND LOADING DATA FOR DEPENDENT TABLES
hadoop fs -rm -r -f /test-warehouse/alltypesmixedformat
hadoop fs -rm -r -f /tmp/alltypes_rc
hadoop fs -rm -r -f /tmp/alltypes_seq
hadoop fs -mkdir -p /tmp/alltypes_seq/year=2009
hadoop fs -mkdir -p /tmp/alltypes_rc/year=2009
hadoop fs -cp /test-warehouse/alltypes_seq/year=2009/month=2/ /tmp/alltypes_seq/year=2009
hadoop fs -cp /test-warehouse/alltypes_rc/year=2009/month=3/ /tmp/alltypes_rc/year=2009
# Create a hidden file in AllTypesSmall
hadoop fs -rm -f /test-warehouse/alltypessmall/year=2009/month=1/_hidden
hadoop fs -rm -f /test-warehouse/alltypessmall/year=2009/month=1/.hidden
hadoop fs -cp /test-warehouse/zipcode_incomes/DEC_00_SF3_P077_with_ann_noheader.csv \
/test-warehouse/alltypessmall/year=2009/month=1/_hidden
hadoop fs -cp /test-warehouse/zipcode_incomes/DEC_00_SF3_P077_with_ann_noheader.csv \
/test-warehouse/alltypessmall/year=2009/month=1/.hidden
# For tables that rely on loading data from local fs test-warehouse
# TODO: Find a good way to integrate this with the normal data loading scripts
beeline -n $USER -u "${JDBC_URL}" -f\
${IMPALA_HOME}/testdata/bin/load-dependent-tables.sql
}
function create-internal-hbase-table {
echo CREATING INTERNAL HBASE TABLE
# TODO: For some reason DROP TABLE IF EXISTS sometimes fails on HBase if the table does
# not exist. To work around this, disable exit on error before executing this command.
# Need to investigate this more, but this works around the problem to unblock automation.
set +o errexit
beeline -n $USER -u "${JDBC_URL}" -e\
"DROP TABLE IF EXISTS functional_hbase.internal_hbase_table"
echo "disable 'functional_hbase.internal_hbase_table'" | hbase shell
echo "drop 'functional_hbase.internal_hbase_table'" | hbase shell
set -e
# Used by CatalogTest to confirm that non-external HBase tables are identified
# correctly (IMP-581)
# Note that the usual 'hbase.table.name' property is not specified to avoid
# creating tables in HBase as a side-effect.
cat > /tmp/create-hbase-internal.sql << EOF
CREATE TABLE functional_hbase.internal_hbase_table(key int, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val");
EOF
beeline -n $USER -u "${JDBC_URL}" -f /tmp/create-hbase-internal.sql
rm -f /tmp/create-hbase-internal.sql
}
function load-custom-data {
echo LOADING CUSTOM DATA
# Load the index files for corrupted lzo data.
hadoop fs -rm -f /test-warehouse/bad_text_lzo_text_lzo/bad_text.lzo.index
hadoop fs -put ${IMPALA_HOME}/testdata/bad_text_lzo/bad_text.lzo.index \
/test-warehouse/bad_text_lzo_text_lzo/
hadoop fs -rm -r -f /bad_text_lzo_text_lzo/
hadoop fs -mv /test-warehouse/bad_text_lzo_text_lzo/ /
# Cleanup the old bad_text_lzo files, if they exist.
hadoop fs -rm -r -f /test-warehouse/bad_text_lzo/
# Index all lzo files in HDFS under /test-warehouse
${IMPALA_HOME}/testdata/bin/lzo_indexer.sh /test-warehouse
hadoop fs -mv /bad_text_lzo_text_lzo/ /test-warehouse/
# IMPALA-694: data file produced by parquet-mr version 1.2.5-cdh4.5.0
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/bad_parquet_data.parquet \
/test-warehouse/bad_parquet_parquet
# Data file produced by parquet-mr with repeated values (produces 0 bit width dictionary)
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/repeated_values.parquet \
/test-warehouse/bad_parquet_parquet
# IMPALA-720: data file produced by parquet-mr with multiple row groups
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/multiple_rowgroups.parquet \
/test-warehouse/bad_parquet_parquet
# IMPALA-1401: data file produced by Hive 13 containing page statistics with long min/max
# string values
hadoop fs -put -f ${IMPALA_HOME}/testdata/data/long_page_header.parquet \
/test-warehouse/bad_parquet_parquet
# Remove all index files in this partition.
hadoop fs -rm /test-warehouse/alltypes_text_lzo/year=2009/month=1/*.lzo.index
# Add a sequence file that only contains a header (see IMPALA-362)
hadoop fs -put -f ${IMPALA_HOME}/testdata/tinytable_seq_snap/tinytable_seq_snap_header_only \
/test-warehouse/tinytable_seq_snap
beeline -n $USER -u "${JDBC_URL}" -f\
${IMPALA_HOME}/testdata/avro_schema_resolution/create_table.sql
}
function build-and-copy-hive-udfs {
# Build the test Hive UDFs
pushd ${IMPALA_HOME}/tests/test-hive-udfs
mvn clean package
popd
# Copy the test UDF/UDA libraries into HDFS
${IMPALA_HOME}/testdata/bin/copy-udfs-udas.sh -build
}
function copy-and-load-ext-data-source {
# Copy the test data source library into HDFS
${IMPALA_HOME}/testdata/bin/copy-data-sources.sh
# Create data sources table.
${IMPALA_HOME}/bin/impala-shell.sh -f\
${IMPALA_HOME}/testdata/bin/create-data-source-table.sql
}
# Enable debug logging.
set -x
# For kerberized clusters, use kerberos
if ${CLUSTER_DIR}/admin is_kerberized; then
LOAD_DATA_ARGS="${LOAD_DATA_ARGS} --use_kerberos --principal=${MINIKDC_PRINC_HIVE}"
fi
# Start Impala
${IMPALA_HOME}/bin/start-impala-cluster.py -s 3 --log_dir=${DATA_LOADING_LOG_DIR}
${IMPALA_HOME}/testdata/bin/setup-hdfs-env.sh
if [ $SKIP_METADATA_LOAD -eq 0 ]; then
  # load custom schemas
load-custom-schemas
# load functional/tpcds/tpch
load-data "functional-query" "exhaustive"
load-data "tpch" "core"
load-data "tpcds" "core"
load-aux-workloads
copy-and-load-dependent-tables
load-custom-data
${IMPALA_HOME}/testdata/bin/create-table-many-blocks.sh -p 1234 -b 1
elif [ "${TARGET_FILESYSTEM}" = "hdfs" ]; then
echo "Skipped loading the metadata. Loading HBase."
load-data "functional-query" "core" "hbase/none"
fi
build-and-copy-hive-udfs
# Configure alltypes_seq as a read-only table. This is required for fe tests.
hadoop fs -chmod -R 444 ${FILESYSTEM_PREFIX}/test-warehouse/alltypes_seq/year=2009/month=1
hadoop fs -chmod -R 444 ${FILESYSTEM_PREFIX}/test-warehouse/alltypes_seq/year=2009/month=3
if [ "${TARGET_FILESYSTEM}" = "hdfs" ]; then
# Caching tables in s3 returns an IllegalArgumentException, see IMPALA-1714
cache-test-tables
# TODO: Modify the .sql file that creates the table to take an alternative location into
# account.
copy-and-load-ext-data-source
${IMPALA_HOME}/testdata/bin/split-hbase.sh > /dev/null 2>&1
create-internal-hbase-table
fi
# TODO: Investigate why all stats are not preserved. Theoretically, we only need to
# recompute stats for HBase.
${IMPALA_HOME}/testdata/bin/compute-table-stats.sh
copy-auth-policy
|
package wallet
import (
"fmt"
"time"
"github.com/bitmark-inc/bitmark-wallet/tx"
"github.com/boltdb/bolt"
"github.com/bitmark-inc/bitmarkd/util"
)
var (
	ErrAccountBucketNotExisted = fmt.Errorf("account bucket does not exist")
	ErrUTXOBucketNotExisted    = fmt.Errorf("utxo bucket does not exist")
)
func packUTXOs(utxos tx.UTXOs) []byte {
b := make([]byte, 0)
for _, utxo := range utxos {
if utxo == nil {
continue
}
hashLen := len(utxo.TxHash)
b = append(b, util.ToVarint64(uint64(hashLen))...)
b = append(b, utxo.TxHash...)
b = append(b, util.ToVarint64(uint64(utxo.TxIndex))...)
b = append(b, util.ToVarint64(utxo.Value)...)
}
return b
}
func unpackUTXOs(b []byte) tx.UTXOs {
utxos := make([]*tx.UTXO, 0)
offset := 0
for offset < len(b) {
txLen, txStart := util.FromVarint64(b[offset:])
txEnd := txStart + int(txLen)
txHash := b[offset+txStart : offset+txEnd]
offset += txEnd
txIndex, n := util.FromVarint64(b[offset:])
offset += n
val, n := util.FromVarint64(b[offset:])
utxos = append(utxos, &tx.UTXO{
TxHash: txHash,
TxIndex: uint32(txIndex),
Value: val,
})
offset += n
}
return utxos
}
type AccountStore interface {
GetLastIndex() (uint64, error)
SetLastIndex(uint64) error
GetAllUTXO() (map[string]tx.UTXOs, error)
GetUTXO(address string) (tx.UTXOs, error)
SetUTXO(address string, utxo tx.UTXOs) error
Close()
}
// BoltAccountStore is an account store using boltdb.
// The wallet data is organized as follows:
// + bucket (pubkey of coin_account)
// + bucket ("utxo")
// - address : txs
// - lastIndex : varint
type BoltAccountStore struct {
account string
db *bolt.DB
}
func (b BoltAccountStore) Close() {
b.db.Close()
}
func (b BoltAccountStore) GetLastIndex() (uint64, error) {
	var buf []byte
	if err := b.db.View(func(tx *bolt.Tx) error {
		bucket := tx.Bucket([]byte(b.account))
		if bucket == nil {
			return ErrAccountBucketNotExisted
		}
		buf = bucket.Get([]byte("lastIndex"))
		return nil
	}); err != nil {
		return 0, err
	}
	index, _ := util.FromVarint64(buf)
	return index, nil
}
func (b BoltAccountStore) SetLastIndex(index uint64) error {
	buf := util.ToVarint64(index)
	return b.db.Update(func(tx *bolt.Tx) error {
		bucket := tx.Bucket([]byte(b.account))
		if bucket == nil {
			return ErrAccountBucketNotExisted
		}
		return bucket.Put([]byte("lastIndex"), buf)
	})
}
func (b BoltAccountStore) GetAllUTXO() (map[string]tx.UTXOs, error) {
utxos := make(map[string]tx.UTXOs)
if err := b.db.View(func(tx *bolt.Tx) error {
bucket := tx.Bucket([]byte(b.account))
if bucket == nil {
return ErrAccountBucketNotExisted
}
utxoBkt := bucket.Bucket([]byte("utxo"))
if utxoBkt == nil {
return ErrUTXOBucketNotExisted
}
err := utxoBkt.ForEach(func(address, tx []byte) error {
txs := unpackUTXOs(tx)
utxos[string(address)] = txs
return nil
})
return err
}); err != nil {
return nil, err
}
return utxos, nil
}
func (b BoltAccountStore) GetUTXO(address string) (tx.UTXOs, error) {
var utxos tx.UTXOs
if err := b.db.View(func(tx *bolt.Tx) error {
bucket := tx.Bucket([]byte(b.account))
if bucket == nil {
return ErrAccountBucketNotExisted
}
utxoBkt := bucket.Bucket([]byte("utxo"))
if utxoBkt == nil {
return ErrUTXOBucketNotExisted
}
utxos = unpackUTXOs(utxoBkt.Get([]byte(address)))
return nil
}); err != nil {
return nil, err
}
return utxos, nil
}
func (b BoltAccountStore) SetUTXO(address string, utxos tx.UTXOs) error {
	// Return the transaction's error instead of silently dropping it.
	return b.db.Update(func(tx *bolt.Tx) error {
		bucket := tx.Bucket([]byte(b.account))
		if bucket == nil {
			return ErrAccountBucketNotExisted
		}
		utxoBkt := bucket.Bucket([]byte("utxo"))
		if utxoBkt == nil {
			return ErrUTXOBucketNotExisted
		}
		buf := packUTXOs(utxos)
		if len(buf) == 0 {
			return utxoBkt.Delete([]byte(address))
		}
		return utxoBkt.Put([]byte(address), buf)
	})
}
func NewBoltAccountStore(filename, account string) (*BoltAccountStore, error) {
db, err := bolt.Open(filename, 0600, &bolt.Options{Timeout: 1 * time.Second})
if err != nil {
return nil, err
}
tx, err := db.Begin(true)
if err != nil {
return nil, err
}
defer tx.Rollback()
// root bucket of an account
root, err := tx.CreateBucketIfNotExists([]byte(account))
if err != nil {
return nil, err
}
// utxo bucket of an account
_, err = root.CreateBucketIfNotExists([]byte("utxo"))
if err != nil {
return nil, err
}
if err := tx.Commit(); err != nil {
return nil, err
}
return &BoltAccountStore{
account: account,
db: db,
}, nil
}
|
module SkypekitPure
class Request
    def initialize
      @tokens = ['B']
      @encoders = {}
      @encoders['i'] = method(:encode_varint)
      @encoders['u'] = method(:encode_varuint)
      @encoders['e'] = method(:encode_varuint)
      @encoders['o'] = method(:encode_varuint)
      @encoders['O'] = method(:encode_objectid)
      @encoders['S'] = method(:encode_string)
      @encoders['X'] = method(:encode_string)
      @encoders['f'] = method(:encode_string)
      @encoders['B'] = method(:encode_string)
    end
def add_parm(kind, tag, val)
if val.is_a?(Hash)
@tokens << ord_method('[')
self.encode_varuint(tag)
encoder = @encoders[kind]
val.each do |elem|
if kind != 'b'
@tokens << ord_method(kind)
            encoder.call(elem)
else
if elem
@tokens << ord_method('T')
else
@tokens << ord_method('F')
end
end
end
@tokens << ord_method(']')
elsif kind != 'b'
@tokens << ord_method(kind)
if tag == 0
@oid = val.object_id
end
self.encode_varuint(tag)
        @encoders[kind].call(val)
else
if val
@tokens << ord_method('T')
else
@tokens << ord_method('F')
end
self.encode_varuint(tag)
end
self
end
private
def encode_varint(number)
if number >= 0
number = number << 1
else
number = (number << 1) ^ (~0)
end
self.encode_varuint(number)
end
def encode_varuint(number)
while 1 do
towrite = number & 0x7f
number = number >> 7
if number == 0
@tokens << towrite
break
end
@tokens << (0x80|towrite)
end
end
def encode_objectid(val)
unless val
self.encode_varuint(0)
else
self.encode_varuint(val.object_id)
end
end
    def encode_string(val)
      length = val.length
      self.encode_varuint(length)
      if length > 0
        @tokens += val.split("").map { |a| ord_method(a) }
      end
    end
def ord_method(str)
str.respond_to?('ord') ? str.ord : str.unpack('c')[0]
end
end
end
|
export {UploadContainerComponent} from './component';
export {FileUploadComponent} from './file-upload/component';
export {UploadService} from './service';
export * from './types';
|
import { promiseError } from '@kwsites/promise-result';
import { createTestContext, newSimpleGit, SimpleGitTestContext } from '../__fixtures__';
describe('progress-monitor', () => {
let context: SimpleGitTestContext;
beforeEach(async () => context = await createTestContext());
it('detects successful completion', async () => {
const git = newSimpleGit(context.root);
expect(await promiseError(git.init())).toBeUndefined();
});
});
|
import { Component, OnInit, ElementRef, ViewChild } from '@angular/core';
import { User, PROFILES } from '../../Entitity/User'
import {FormGroup, FormBuilder, Validators} from '@angular/forms'
import { HttpClient } from '@angular/common/http'
@Component({
selector: 'app-create-user',
templateUrl: './create-user.component.html',
styleUrls: ['./create-user.component.sass']
})
export class CreateUserComponent implements OnInit {
heading = "Cadastrar usuário."
subheading = "Cadastro de usuários de acesso ao sistema"
icon = 'pe-7s-users icon-gradient bg-tempting-azure';
profiles:Array<String>
formUser:FormGroup
userCreated:boolean = false
errorMessage:String
@ViewChild("video")
public video: ElementRef;
constraints = {
audio: false,
video: { width: 64, height: 64 }
}
@ViewChild("canvas")
public canvas: ElementRef;
  constructor(private formBuilder: FormBuilder, private http: HttpClient) { }
  submited: Boolean = false
  invalidEmail(){
    return this.submited && this.formUser.controls.email.errors
  }
  invalidFullName(){
    return this.submited && this.formUser.controls.name.errors
  }
  invalidPassword(){
    return this.submited && this.formUser.controls.password.errors
  }
  invalidPasswordConfimation(){
    return this.submited && this.formUser.controls.password.value != this.formUser.controls.passwordReapt.value
  }
  ngOnInit() {
    this.profiles = PROFILES
    this.formUser = this.formBuilder.group({
email:['', [Validators.required, Validators.email]],
name: ['', [Validators.required, Validators.minLength(3)]],
password: ['', [Validators.required]],
passwordReapt: ['', [Validators.required]],
profile: ['ADMIN', [Validators.required]],
img_profile: ['']
})
}
public ngAfterViewInit() {
if(navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
navigator.mediaDevices.getUserMedia(this.constraints).then(stream => {
if ('srcObject' in this.video.nativeElement) {
this.video.nativeElement.srcObject = stream
} else {
this.video.nativeElement.src = window.URL.createObjectURL(stream);
}
});
}
}
  stopStream(){
    this.video.nativeElement.pause()
  }
  profileUrl:any
  Capturar(){
    this.canvas.nativeElement.getContext("2d").drawImage(this.video.nativeElement, 0, 0, 133, 133);
    this.stopStream()
    this.profileUrl = this.canvas.nativeElement.toDataURL('image/jpeg')
    this.formUser.controls.img_profile.setValue(this.profileUrl)
  }
onSubmit(){
this.userCreated = false
this.submited = true
if(this.formUser.invalid){
return;
}else{
delete this.formUser.value['passwordReapt']
var user = new User(this.formUser.value)
this.http.post('https://pfc2020-api.herokuapp.com/api/user', user).subscribe((data)=>{
this.userCreated = true
setTimeout(() => {
this.userCreated = false
}, 3000);
}, (error) =>{
this.errorMessage = error.message
setTimeout(()=>{
this.errorMessage = null
}, 3000)
})
}
}
}
|
# Concatenate

## Description

Concatenates the two supplied strings.

## Inputs

### a

The first string.

### b

The second string.

## Outputs

### result

The string representing the combination of **a** + **b**.

## Detail
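A minimal sketch of the node's behavior, written here in Python purely for illustration (the node itself is not tied to any particular language):

```python
def concatenate(a: str, b: str) -> str:
    """Return the string a followed by the string b."""
    return a + b

# Example: combining a greeting
result = concatenate("Hello, ", "world")
print(result)  # Hello, world
```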
|
# Create a new dashboard with free_text widget
require "datadog_api_client"
api_instance = DatadogAPIClient::V1::DashboardsAPI.new
body = DatadogAPIClient::V1::Dashboard.new({
title: "Example-Create_a_new_dashboard_with_free_text_widget",
description: nil,
widgets: [
DatadogAPIClient::V1::Widget.new({
layout: DatadogAPIClient::V1::WidgetLayout.new({
x: 0,
y: 0,
width: 24,
height: 6,
}),
definition: DatadogAPIClient::V1::FreeTextWidgetDefinition.new({
type: DatadogAPIClient::V1::FreeTextWidgetDefinitionType::FREE_TEXT,
text: "Example free text",
color: "#4d4d4d",
font_size: "auto",
text_align: DatadogAPIClient::V1::WidgetTextAlign::LEFT,
}),
}),
],
template_variables: [],
layout_type: DatadogAPIClient::V1::DashboardLayoutType::FREE,
is_read_only: false,
notify_list: [],
})
p api_instance.create_dashboard(body)
|
using Android.App;
using Android.Content;
namespace LocalNotifications.Droid
{
[BroadcastReceiver(Enabled = true, Label = "Reboot complete receiver")]
[IntentFilter(new[] { Android.Content.Intent.ActionBootCompleted })]
public class BootReceiver : BroadcastReceiver
{
public override void OnReceive(Context context, Intent intent)
{
            if (intent.Action == Intent.ActionBootCompleted)
{
// Recreate alarms
}
}
}
}
|
package com.annimon.ownlang.modules.forms;
import com.annimon.ownlang.lib.Arguments;
import com.annimon.ownlang.lib.Value;
import java.awt.BorderLayout;
import java.awt.CardLayout;
import java.awt.FlowLayout;
import java.awt.GridLayout;
import javax.swing.BoxLayout;
/**
* Functions for working with layout managers.
*/
public final class LayoutManagers {
private LayoutManagers() { }
static Value borderLayout(Value[] args) {
Arguments.checkOrOr(0, 2, args.length);
int hgap = (args.length == 2) ? args[0].asInt() : 0;
int vgap = (args.length == 2) ? args[1].asInt() : 0;
return new LayoutManagerValue(
new BorderLayout(hgap, vgap)
);
}
static Value boxLayout(Value[] args) {
Arguments.checkOrOr(1, 2, args.length);
int axis = (args.length == 2) ? args[1].asInt() : BoxLayout.PAGE_AXIS;
return new LayoutManagerValue(
new BoxLayout(((JPanelValue) args[0]).panel, axis)
);
}
static Value cardLayout(Value[] args) {
Arguments.checkOrOr(0, 2, args.length);
int hgap = (args.length == 2) ? args[0].asInt() : 0;
int vgap = (args.length == 2) ? args[1].asInt() : 0;
return new LayoutManagerValue(
new CardLayout(hgap, vgap)
);
}
static Value gridLayout(Value[] args) {
Arguments.checkRange(0, 4, args.length);
int rows = 1, cols = 0, hgap = 0, vgap = 0;
switch (args.length) {
case 1:
rows = args[0].asInt();
break;
case 2:
rows = args[0].asInt();
cols = args[1].asInt();
break;
case 3:
rows = args[0].asInt();
cols = args[1].asInt();
hgap = args[2].asInt();
break;
case 4:
rows = args[0].asInt();
cols = args[1].asInt();
hgap = args[2].asInt();
vgap = args[3].asInt();
break;
}
return new LayoutManagerValue(
new GridLayout(rows, cols, hgap, vgap)
);
}
static Value flowLayout(Value[] args) {
Arguments.checkRange(0, 3, args.length);
final int align, hgap, vgap;
switch (args.length) {
case 1:
align = args[0].asInt();
hgap = 5;
vgap = 5;
break;
case 2:
align = FlowLayout.CENTER;
hgap = args[0].asInt();
vgap = args[1].asInt();
break;
case 3:
align = args[0].asInt();
hgap = args[1].asInt();
vgap = args[2].asInt();
break;
default:
align = FlowLayout.CENTER;
hgap = 5;
vgap = 5;
break;
}
return new LayoutManagerValue(
new FlowLayout(align, hgap, vgap)
);
}
}
|
#!/bin/bash
# num_vms must be provided in the environment
num_vms=${num_vms:?num_vms is not set}
remote_script='vm_remote_update_hostname_script.sh'
port=3022
for i in $(seq 1 "$num_vms"); do
    sshpass -p "password" ssh -o "StrictHostKeyChecking no" -p "$port" user@127.0.0.1 \
        "echo password | sudo -S bash $remote_script $i"
    port=$((port + 1))
done
|
# Partitura Tutorial
A quick introduction to symbolic music processing with partitura:
## Installation
The easiest way to install the package is via `pip` from
[PyPI (Python Package Index)](https://pypi.python.org/pypi):
```shell
pip install partitura
```
This will install the latest release of the package and will install all
dependencies automatically.
**To install the latest development version:**
```shell
pip install git+https://github.com/CPJKU/partitura.git@develop
```
## QuickStart
The following code loads the contents of an example MusicXML file included in
the package:
```python
import partitura
my_xml_file = partitura.EXAMPLE_MUSICXML
part = partitura.load_musicxml(my_xml_file)
```
### Import other formats
For **MusicXML** files do:
```python
import partitura
my_xml_file = partitura.EXAMPLE_MUSICXML
part = partitura.load_musicxml(my_xml_file)
```
For **Kern** files do:
```python
import partitura
my_kern_file = partitura.EXAMPLE_KERN
part = partitura.load_kern(my_kern_file)
```
For **MEI** files do:
```python
import partitura
my_mei_file = partitura.EXAMPLE_MEI
part = partitura.load_mei(my_mei_file)
```
|
package hydra.prototyping
import hydra.core.*
def freeVariables[a](term: Term[a]): Set[Variable] = {
  def free(bound: Set[Variable], t: Term[a]): List[Variable] = t.data match
case Expression.application(Application(t1, t2)) => free(bound, t1) ++ free(bound, t2)
case Expression.literal(_) => List()
case Expression.element(_) => List()
case Expression.function(fun) => functionFree(bound, fun)
case Expression.list(els) => els.flatMap(t => free(bound, t)).toList
case Expression.map(m) => (m map { case (k, v) => free(bound, k) ++ free(bound, v) }).toList.flatten
case Expression.optional(m) => (m map {t => free(bound, t)}) getOrElse List()
case Expression.record(fields) => fields.flatMap(f => free(bound, f.term)).toList
case Expression.set(els) => els.flatMap(t => free(bound, t)).toList
case Expression.typeAbstraction(TypeAbstraction(v, term)) => free(bound, term)
case Expression.typeApplication(TypeApplication(lhs, rhs)) => free(bound, lhs)
case Expression.union(f) => free(bound, f.term)
case Expression.variable(v) => if bound.contains(v) then List() else List(v)
def functionFree(bound: Set[Variable], fun: Function[a]): List[Variable] = fun match
case Function.cases(cases) => cases.flatMap(f => free(bound, f.term)).toList
case Function.compareTo(t) => free(bound, t)
case Function.data() => List()
case Function.lambda(Lambda(v, t)) => free(bound + v, t)
case Function.primitive(_) => List()
case Function.projection(_) => List()
free(Set(), term).toSet
}
/**
* Whether a term is closed, i.e. represents a complete program
*/
def termIsClosed[a](term: Term[a]): Boolean = freeVariables(term).isEmpty
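For a self-contained illustration of the same free-variable computation, here is a minimal sketch over a toy lambda-calculus AST (introduced here for illustration, not hydra's `Term` types):

```scala
// Toy AST: variables, lambda abstraction, and application.
sealed trait Expr
case class Var(name: String) extends Expr
case class Lam(param: String, body: Expr) extends Expr
case class App(fn: Expr, arg: Expr) extends Expr

// Collect variables that occur free, threading the set of bound names.
def freeVars(e: Expr, bound: Set[String] = Set.empty): Set[String] = e match
  case Var(n)    => if bound.contains(n) then Set.empty else Set(n)
  case Lam(p, b) => freeVars(b, bound + p)
  case App(f, a) => freeVars(f, bound) ++ freeVars(a, bound)

// A closed expression has no free variables.
def isClosed(e: Expr): Boolean = freeVars(e).isEmpty

@main def demo(): Unit =
  println(freeVars(App(Var("f"), Var("x")))) // Set(f, x)
  println(isClosed(Lam("x", Var("x"))))      // true
```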
|
<?php
namespace App\Controller;
use App\Utils\MarkdownContext;
use App\Utils\MarkdownConverter;
use Embed\Embed;
use Embed\Exceptions\InvalidUrlException;
use Sensio\Bundle\FrameworkExtraBundle\Configuration\IsGranted;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
/**
* Helpers for Ajax-related stuff.
*
* @IsGranted("ROLE_USER")
*/
class AjaxController {
/**
* JSON action for retrieving link titles.
*
* - 200 - Found a title
* - 400 - Bad URL
* - 404 - No title found
*
* @param Request $request
*
* @return Response
*/
public function fetchTitle(Request $request) {
$url = $request->request->get('url');
try {
$title = Embed::create($url)->getTitle();
            if ($title === null || $title === '') {
return new JsonResponse(null, 404);
}
return new JsonResponse(['title' => $title]);
} catch (InvalidUrlException $e) {
return new JsonResponse(['error' => $e->getMessage()], 400);
}
}
public function markdownPreview(
Request $request,
MarkdownConverter $converter,
MarkdownContext $context
) {
$markdown = $request->request->get('markdown', '');
$options = $context->getContextAwareOptions();
return new Response($converter->convertToHtml($markdown, $options));
}
}
|
#! /usr/bin/env ruby -S rspec
require 'spec_helper_acceptance'
describe 'is_a function', :unless => UNSUPPORTED_PLATFORMS.include?(fact('operatingsystem')) do
it 'should match a string' do
pp = <<-EOS
if 'hello world'.is_a(String) {
notify { 'output correct': }
}
EOS
apply_manifest(pp, :catch_failures => true) do |r|
expect(r.stdout).to match(/Notice: output correct/)
end
end
  it 'should not match an integer as a string' do
pp = <<-EOS
if 5.is_a(String) {
notify { 'output wrong': }
}
EOS
apply_manifest(pp, :catch_failures => true) do |r|
expect(r.stdout).not_to match(/Notice: output wrong/)
end
end
end
|
namespace HandyControlDemo.UserControl;
public partial class ToolBarDemoCtl
{
public ToolBarDemoCtl()
{
InitializeComponent();
}
}
|