# NAT firewall with round-robin load balancing using FreeBSD's pf

<a href="https://portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fpf-freebsd-setup%2Fazuredeploy.json" target="_blank">
    <img src="http://azuredeploy.net/deploybutton.png"/>
</a>
<a href="http://armviz.io/#/?load=https%3A%2F%2Fraw.githubusercontent.com%2FAzure%2Fazure-quickstart-templates%2Fmaster%2Fpf-freebsd-setup%2Fazuredeploy.json" target="_blank">
    <img src="http://armviz.io/visualizebutton.png"/>
</a>

This template deploys a NAT firewall with round-robin load balancing using FreeBSD's pf on Azure, for a common web server scenario in which two FreeBSD virtual machines run the Nginx web server. Since the front-end VM acting as the NAT has 2 NICs, please refer [**HERE**](https://docs.microsoft.com/en-us/azure/virtual-machines/virtual-machines-windows-sizes) to choose a suitable VM size. After the template deploys successfully, you can access Nginx from a browser using the public IP of the front-end VM.
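For reference, round-robin redirection in pf is expressed with an `rdr ... round-robin` rule. The snippet below is a minimal, hypothetical `pf.conf` sketch of the kind of configuration such a front-end VM could use — the interface name and back-end IPs are illustrative assumptions, not values taken from the template:

```
# pf.conf sketch (hypothetical names/addresses, not from the template)
ext_if = "hn0"                          # external NIC (assumed name)
web_servers = "{ 10.0.0.5, 10.0.0.6 }"  # back-end Nginx VMs (assumed IPs)

# redirect inbound HTTP to the back-ends, alternating round-robin
rdr pass on $ext_if proto tcp from any to ($ext_if) port 80 -> $web_servers round-robin

# NAT outbound traffic from the back-end subnet through the external NIC
nat on $ext_if from !($ext_if) -> ($ext_if)
```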
# Fibonacci Number (Fn)

Compare the performance of computing Fn via the dynamic-programming vs. the recursive approach. The shortest of several timed runs is used to compare runtime performance.

## Implementation

```python
def Fn_recursion(n):
    if n == 0 or n == 1:
        return n
    else:
        return Fn_recursion(n-1) + Fn_recursion(n-2)

def Fn_dynamic(n):
    if n < 1:
        return n  # guard: memo[1] below requires n >= 1
    memo = [0 for _ in range(n+1)]
    memo[0], memo[1] = 0, 1
    for i in range(2, n+1):
        memo[i] = memo[i-1] + memo[i-2]
    return memo[n]
```

## Execution Result

```bash
python3 fibonacci.py
```

Result:

```
n = 20
Fn_dynamic: 20 -> 6765
Fn_dynamic: 20 -> 6765
Fn_dynamic: 20 -> 6765
Fn_dynamic: 20 -> 6765
Fn_dynamic: 20 -> 6765
Algorithm: Fn_dynamic. Minimum execution time: 0.00012869801139459014
Fn_recursion: 20 -> 6765
Fn_recursion: 20 -> 6765
Fn_recursion: 20 -> 6765
Fn_recursion: 20 -> 6765
Fn_recursion: 20 -> 6765
Algorithm: Fn_recursion. Minimum execution time: 0.011802783992607147
```
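The minimum-of-several-runs timing shown above can be reproduced with the standard-library `timeit` module. This is only a sketch of such a harness — the actual `fibonacci.py` is not included in this README, so the looping details (five runs per algorithm, minimum reported) are assumptions inferred from the output:

```python
import timeit

def Fn_recursion(n):
    if n == 0 or n == 1:
        return n
    return Fn_recursion(n - 1) + Fn_recursion(n - 2)

def Fn_dynamic(n):
    if n < 1:
        return n  # guard: memo[1] below requires n >= 1
    memo = [0] * (n + 1)
    memo[1] = 1
    for i in range(2, n + 1):
        memo[i] = memo[i - 1] + memo[i - 2]
    return memo[n]

n = 20
print(f"n = {n}")
for fn in (Fn_dynamic, Fn_recursion):
    # Five timed runs; the minimum is the least noisy estimate of the true cost.
    times = timeit.repeat(lambda: print(f"{fn.__name__}: {n} -> {fn(n)}"),
                          number=1, repeat=5)
    print(f"Algorithm: {fn.__name__}. Minimum execution time: {min(times)}")
```

Taking the minimum rather than the mean filters out interference from other processes, which can only make a run slower, never faster.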
# pairix

[![Codacy Badge](https://api.codacy.com/project/badge/Grade/6e73d29f423f49ff9293f60b25b4778a)](https://www.codacy.com/app/SooLee/pairix?utm_source=github.com&utm_medium=referral&utm_content=4dn-dcic/pairix&utm_campaign=badger) [![Build Status](https://travis-ci.org/4dn-dcic/pairix.svg?branch=master)](https://travis-ci.org/4dn-dcic/pairix)

* Pairix is a tool for indexing and querying a block-compressed text file containing pairs of genomic coordinates.
* Pairix is a stand-alone C program that was written on top of tabix (https://github.com/samtools/tabix) as a tool for the 4DN-standard pairs file format describing Hi-C data: [pairs_format_specification.md](pairs_format_specification.md)
* However, Pairix can be used as a generic tool for indexing and querying any bgzipped text file containing genomic coordinates, for either 2D or 1D indexing and querying.
* For example, given the custom text file below, you may want to extract specific lines from the Pairs file further below. An `awk` command would read the Pairs file from beginning to end. Pairix instead creates an index and uses it to access the file from a relevant position by taking advantage of bgzf compression, allowing fast queries on large files.

**Some custom text file**
```
chr1 10000 20000 chr2 30000 50000 3.5
chr1 30000 40000 chr3 10000 70000 4.6
```

**Pairs format**
```
## pairs format v1.0
#sorted: chr1-chr2-pos1-pos2
#shape: upper triangle
#chromsize: chr1 249250621
#chromsize: chr2 243199373
#chromsize: chr3 198022430
...
#genome_assembly: hg38
#columns: readID chr1 pos1 chr2 pos2 strand1 strand2
EAS139:136:FC706VJ:2:2104:23462:197393 chr1 10000 chr1 20000 + +
EAS139:136:FC706VJ:2:8762:23765:128766 chr1 50000 chr1 70000 + +
EAS139:136:FC706VJ:2:2342:15343:9863 chr1 60000 chr2 10000 + +
EAS139:136:FC706VJ:2:1286:25:275154 chr1 30000 chr3 40000 + -
```

* Bgzip can be found either in *this repo* or in https://github.com/samtools/tabix (original).
## Table of contents
* [Availability](#availability)
* [Input file format](#input-file-format)
* [Pairix](#pairix)
  * [Installation](#installation-for-pairix)
  * [Usage](#usage-for-pairix)
    * [Compression](#compression)
    * [Indexing](#indexing)
    * [Querying](#querying)
    * [List of chromosome pairs](#list-of-chromosome-pairs)
    * [Total linecount](#total-linecount)
  * [Examples](#usage-examples-for-pairix)
* [Pypairix](#pypairix)
  * [Installation](#installation-for-pypairix)
  * [Examples](#usage-examples-for-pypairix)
* [Rpairix](#rpairix)
* [Utils](#utils)
  * [bam2pairs](#bam2pairs)
  * [process_merged_nodup.sh](#process_merged_nodupsh)
  * [process_old_merged_nodup.sh](#process_old_merged_nodupsh)
  * [merged_nodup2pairs.pl](#merged_nodup2pairspl)
  * [old_merged_nodup2pairs.pl](#old_merged_nodup2pairspl)
  * [fragment_4dnpairs.pl](#fragment_4dnpairspl)
  * [duplicate_header_remover.pl](#duplicate_header_removerpl)
  * [column_remover.pl](#column_removerpl)
  * [Pairs_merger](#pairs_merger)
    * [Usage](#usage-for-pairs_merger)
    * [Examples](#usage-examples-for-pairs_merger)
  * [merge-pairs.sh](#merge-pairssh)
    * [Usage](#usage-for-merge-pairssh)
    * [Examples](#usage-examples-for-merge-pairssh)
  * [Streamer_1d](#streamer_1d)
    * [Usage](#usage-for-streamer-1d)
    * [Examples](#usage-examples-for-streamer-1d)
* [Note](#note)
* [Version history](#version-history)

<br>

## Availability
* Pairix is available as a stand-alone command-line program, a python library (pypairix), and an R package (Rpairix, https://github.com/4dn-dcic/Rpairix).
* Various utils including `bam2pairs`, `merged_nodups2pairs.pl`, `pairs_merger` etc. are available within this repo.
* The `bgzip` program provided as part of the repo is identical to the original program in https://github.com/samtools/tabix.

<br>

## Input file format
* For 2D indexing, the input file of paired coordinates must first be sorted by the two chromosome columns and then by the first genomic position column. For 1D indexing, the file must be sorted by a chromosome column and then by a position column.
* The input file must be compressed using bgzip and may be either tab-delimited or space-delimited.
* The resulting index file is given the extension `.px2`.

<br>

## Pairix

### Installation for pairix
```
git clone https://github.com/4dn-dcic/pairix
cd pairix
make
# Add the bin path to PATH for pairix, bgzip, pairs_merger and streamer_1d.
# In order to use utils, add the util path to PATH.
# In order to use bam2pairs, add util/bam2pairs to PATH.
# e.g.: PATH=~/git/pairix/bin/:~/git/pairix/util:~/git/pairix/util/bam2pairs:$PATH
```
Alternatively, `conda install pairix` can be used to install both Pairix and Pypairix together. This requires Anaconda or Miniconda.

If you get an error message saying zlib cannot be found, install zlib as below before running `make`.
```
# ubuntu
sudo apt-get install zlib1g-dev
# centos
sudo yum install zlib-devel
```

<br>

### Usage for pairix

#### Compression
```
bgzip textfile
```

#### Indexing
```
pairix textfile.gz   # for a recognized file extension
pairix -p <preset> textfile.gz
pairix -s<chr1_column> [-d<chr2_column>] -b<pos1_start_column> -e<pos1_end_column> [-u<pos2_start_column> -v<pos2_end_column>] [-T] textfile.gz   # u, v are required for a full 2D query.
```
* Column indices are 1-based.
* Use the `-T` option for a space-delimited file.
* Use the `-f` option to overwrite an existing index file.
* Presets can be used for indexing: `pairs`, `merged_nodups`, `old_merged_nodups` for 2D indexing; `gff`, `vcf`, `bed`, `sam` for 1D indexing. Default is `pairs`.
* For the recognized file extensions `.pairs.gz`, `.vcf.gz`, `.gff.gz`, `.bed.gz` and `.sam.gz`, the `-p` option can be dropped.
* A custom column specification (-s, -d, -b, -e, -u, -v) overrides file extension recognition. The custom specification must always include at least chr1_column (-s).
* Use the `-w <character>` option to change the region split character (default '|'; see the [Querying](#querying) section below for details). This is useful when your chromosome names contain the '|' character (e.g. `-w '^'`).

#### Querying
```
pairix textfile.gz region1 [region2 [...]]

## A region is in the following format.
# for a 1D-indexed file
pairix textfile.gz '<chr>:<start>-<end>' '<chr>:<start>-<end>' ...

# for a 2D-indexed file
pairix [-a] textfile.gz '<chr1>:<start1>-<end1>|<chr2>:<start2>-<end2>' ...   # make sure to quote so '|' is not interpreted as a pipe.
pairix [-a] textfile.gz '*|<chr2>:<start2>-<end2>'          # a wild card is accepted for a 1D query on a 2D-indexed file
pairix [-a] textfile.gz '<chr1>:<start1>-<end1>|*'          # a wild card is accepted for a 1D query on a 2D-indexed file
```
* The `-a` option (auto-flip) flips the query when a given chromosome pair doesn't exist.
```
pairix -a samples/test_4dn.pairs.gz 'chrY|chrX' | head
SRR1658581.13808070 chrX 359030 chrY 308759 - +
SRR1658581.1237993 chrX 711481 chrY 3338402 + -
SRR1658581.38694206 chrX 849049 chrY 2511913 - -
SRR1658581.6691868 chrX 1017548 chrY 967955 + -
SRR1658581.2398986 chrX 1215519 chrY 569356 - +
SRR1658581.21090183 chrX 1406586 chrY 2621557 + -
SRR1658581.35447261 chrX 1501769 chrY 1458068 + -
SRR1658581.26384827 chrX 1857703 chrY 1807309 - +
SRR1658581.13824346 chrX 2129016 chrY 2411576 - -
SRR1658581.6160690 chrX 2194708 chrY 2485859 - -
```
* Using a file listing query regions:
```
pairix -L textfile.gz regionfile1 [regionfile2 [...]]   # a region file contains one region string per line
```
* The default region split character is '|', which can be changed with the `-w` option when building an index.

#### List of chromosome pairs
This command prints out all chromosome pairs in the file.
```
pairix -l textfile.gz
```

#### Total linecount
This is equivalent to, but much faster than, `gunzip -c | wc -l`.
```
pairix -n textfile.gz
```

#### Print out the region split character
By default '|' is used to split the two genomic regions, but in some cases a different character is used, and it is stored in the index. This command prints out the character used for a specific pairs file.
```
pairix -W textfile.gz
```

#### Print out the number of bgzf blocks that span each chromosome pair
This command prints out the number of bgzf blocks for all chromosome pairs.
```
pairix -B textfile.gz
```

<br>

### Usage examples for pairix

#### Preparing a 4dn-style pairs file
This is a double-chromosome-block sorted test file (columns 2 and 4 are chromosomes (chr1 and chr2), column 3 is the position of the first coordinate (pos1)).
```
# sorting & bgzipping
sort -k2,2 -k4,4 -k3,3n -k5,5n samples/4dn.bsorted.chr21_22_only.pairs | bgzip -c > samples/4dn.bsorted.chr21_22_only.pairs.gz

# indexing
pairix -f samples/4dn.bsorted.chr21_22_only.pairs.gz
# The above command is equivalent to:
pairix -f -s2 -b3 -e3 -d4 -u5 samples/4dn.bsorted.chr21_22_only.pairs.gz
# The above command is also equivalent to:
pairix -f -p pairs samples/4dn.bsorted.chr21_22_only.pairs.gz   # The pairs extension .pairs.gz is automatically recognized.
```

#### Preparing a double-chromosome-block sorted `merged_nodups.txt` file (Juicer-style pairs file)
Columns 2 and 6 are chromosomes (chr1 and chr2), and column 3 is the position of the first coordinate (pos1).
```
# sorting & bgzipping
sort -t' ' -k2,2 -k6,6 -k3,3n -k7,7n merged_nodups.txt | bgzip -c > samples/merged_nodups.space.chrblock_sorted.subsample3.txt.gz

# indexing
pairix -f -p merged_nodups samples/merged_nodups.space.chrblock_sorted.subsample3.txt.gz
# The above command is equivalent to:
pairix -f -s2 -d6 -b3 -e3 -u7 -T samples/merged_nodups.space.chrblock_sorted.subsample3.txt.gz
```

#### Querying

A semi-2D query with two chromosomes:
```
pairix samples/test_4dn.pairs.gz 'chr21|chr22'
SRR1658581.33025893 chr21 9712946 chr22 21680462 - +
SRR1658581.9428560 chr21 10774937 chr22 37645396 - +
SRR1658581.8816993 chr21 11171003 chr22 33169971 + +
SRR1658581.10673815 chr21 16085548 chr22 35451128 + -
SRR1658581.2504661 chr21 25672432 chr22 21407301 - -
SRR1658581.40524826 chr21 28876237 chr22 42449178 + +
SRR1658581.8969171 chr21 33439464 chr22 43912252 - -
SRR1658581.6842680 chr21 35467614 chr22 33478115 + +
SRR1658581.15363628 chr21 37956917 chr22 21286436 - -
SRR1658581.3572823 chr21 40651454 chr22 41358228 - -
SRR1658581.50137399 chr21 42446807 chr22 49868647 - +
SRR1658581.11358652 chr21 43768599 chr22 40759935 - -
SRR1658581.4127782 chr21 45142744 chr22 36929446 + +
SRR1658581.38401094 chr21 46989766 chr22 45627553 - +
SRR1658581.34261420 chr21 48113817 chr22 51138644 + -
```

A semi-2D query with a chromosome and a range:
```
pairix samples/test_4dn.pairs.gz 'chr21:10000000-20000000|chr22'
SRR1658581.9428560 chr21 10774937 chr22 37645396 - +
SRR1658581.8816993 chr21 11171003 chr22 33169971 + +
SRR1658581.10673815 chr21 16085548 chr22 35451128 + -
```

A full 2D query with two ranges:
```
pairix samples/test_4dn.pairs.gz 'chr21:10000000-20000000|chr22:30000000-35000000'
SRR1658581.8816993 chr21 11171003 chr22 33169971 + +
```

A full 2D multi-query:
```
pairix samples/test_4dn.pairs.gz 'chr21:10000000-20000000|chr22:30000000-35000000' 'chrX:100000000-110000000|chrX:150000000-170000000'
SRR1658581.8816993 chr21 11171003 chr22 33169971 + +
SRR1658581.39700722 chrX 100748075 chrX 154920234 + +
SRR1658581.36337371 chrX 104718152 chrX 151646254 + -
SRR1658581.49591338 chrX 104951264 chrX 154363440 + +
SRR1658581.46205223 chrX 105732382 chrX 155162659 + -
SRR1658581.32048997 chrX 107326643 chrX 151899433 - +
```

A wild-card 2D query:
```
pairix samples/test_4dn.pairs.gz 'chr21:9000000-9700000|*'
SRR1658581.18102003 chr21 9582382 chr21 9733996 + +
SRR1658581.10121427 chr21 9665774 chr4 49203518 + -
SRR1658581.1019708 chr21 9496682 chr4_gl000193_random 48672 + +
SRR1658581.44516250 chr21 9662891 chr6 7280832 - +
SRR1658581.15515341 chr21 9549471 chr9 68384076 + +
SRR1658581.51399686 chr21 9687495 chrUn_gl000221 87886 + +
SRR1658581.25532108 chr21 9519859 chrUn_gl000226 6821 + -
SRR1658581.22081000 chr21 9659013 chrUn_gl000232 19626 - -
SRR1658581.34308344 chr21 9532618 chrX 61793091 - +
```
```
pairix samples/test_4dn.pairs.gz '*|chr21:9000000-9700000'
SRR1658581.21313395 chr1 25612365 chr21 9679403 + -
SRR1658581.46040617 chr1 143255816 chr21 9663103 + +
SRR1658581.54790470 chr14 101961336 chr21 9481250 + +
SRR1658581.38248307 chr18 18518988 chr21 9452846 - +
SRR1658581.9143926 chr2 90452598 chr21 9486716 + -
```

A symmetric query: a 1D query on a 2D-indexed file is interpreted as a symmetric 2D query. The two commands below are equivalent.
```
pairix samples/test_4dn.pairs.gz 'chr22:50000000-60000000'
SRR1658581.11011611 chr22 50224888 chr22 50225362 + -
SRR1658581.37423580 chr22 50528835 chr22 50529355 + -
SRR1658581.20673732 chr22 50638372 chr22 51062837 + -
SRR1658581.38906907 chr22 50661701 chr22 50813018 + -
SRR1658581.7631402 chr22 50767962 chr22 50773437 - +
SRR1658581.31517355 chr22 50910780 chr22 50911083 + -
SRR1658581.31324262 chr22 50991542 chr22 50991895 + -
SRR1658581.46124457 chr22 51143411 chr22 51143793 + -
SRR1658581.23040702 chr22 51229529 chr22 51229809 + -
```
```
pairix samples/test_4dn.pairs.gz 'chr22:50000000-60000000|chr22:50000000-60000000'
SRR1658581.11011611 chr22 50224888 chr22 50225362 + -
SRR1658581.37423580 chr22 50528835 chr22 50529355 + -
SRR1658581.20673732 chr22 50638372 chr22 51062837 + -
SRR1658581.38906907 chr22 50661701 chr22 50813018 + -
SRR1658581.7631402 chr22 50767962 chr22 50773437 - +
SRR1658581.31517355 chr22 50910780 chr22 50911083 + -
SRR1658581.31324262 chr22 50991542 chr22 50991895 + -
SRR1658581.46124457 chr22 51143411 chr22 51143793 + -
SRR1658581.23040702 chr22 51229529 chr22 51229809 + -
```

Query using a region file:
```
cat samples/test.regions
chr1:1-50000|*
*|chr1:1-50000
chr2:1-20000|*
*|chr2:1-20000

cat samples/test.regions2
chrX:100000000-110000000|chrY
chr19:1-300000|chr19

bin/pairix -L samples/test_4dn.pairs.gz samples/test.regions samples/test.regions2
SRR1658581.49364897 chr1 36379 chr20 62713042 + +
SRR1658581.31672330 chr1 12627 chr9 23963238 + -
SRR1658581.22713561 chr1 14377 chrX 107423076 - +
SRR1658581.31992022 chrX 108223782 chrY 5017118 - -
SRR1658581.55524746 chr19 105058 chr19 105558 + -
```

#### 1D indexing on a regular vcf file, bgzipped
1D indexing:
```
pairix -f samples/SRR1171591.variants.snp.vqsr.p.vcf.gz
# The above command is equivalent to:
pairix -f -s1 -b2 -e2 samples/SRR1171591.variants.snp.vqsr.p.vcf.gz
# The above command is also equivalent to:
pairix -f -p vcf samples/SRR1171591.variants.snp.vqsr.p.vcf.gz   # The extension .vcf.gz is automatically recognized.
```

1D query:
```
pairix samples/SRR1171591.variants.snp.vqsr.p.vcf.gz chr10:1-4000000
chr10 3463966 . C T 51.74 PASS AC=2;AF=1.00;AN=2;DB;DP=2;Dels=0.00;FS=0.000;HaplotypeScore=0.0000;MLEAC=2;MLEAF=1.00;MQ=50.00;MQ0=0;POSITIVE_TRAIN_SITE;QD=25.87;VQSLOD=7.88;culprit=FS GT:AD:DP:GQ:PL 1/1:0,2:2:6:79,6,0
chr10 3978708 rs29320259 T C 1916.77 PASS AC=2;AF=1.00;AN=2;BaseQRankSum=1.016;DB;DP=67;Dels=0.00;FS=0.000;HaplotypeScore=629.1968;MLEAC=2;MLEAF=1.00;MQ=50.00;MQ0=0;MQRankSum=0.773;POSITIVE_TRAIN_SITE;QD=28.61;ReadPosRankSum=0.500;VQSLOD=3.29;culprit=FS GT:AD:DP:GQ:PL 1/1:3,64:67:70:1945,70,0
chr10 3978709 . G A 1901.77 PASS AC=2;AF=1.00;AN=2;BaseQRankSum=0.677;DB;DP=66;Dels=0.00;FS=0.000;HaplotypeScore=579.9049;MLEAC=2;MLEAF=1.00;MQ=50.00;MQ0=0;MQRankSum=0.308;POSITIVE_TRAIN_SITE;QD=28.81;ReadPosRankSum=0.585;VQSLOD=3.24;culprit=FS GT:AD:DP:GQ:PL 1/1:3,63:66:73:1930,73,0
```

<br>

## Pypairix

### Installation for pypairix
```
# to install the python module pypairix,
pip install pypairix
# you may need to install python-dev for some ubuntu releases.

# or
cd pairix
python setup.py install

# testing the python module
python test/test.py
```
Alternatively, `conda install pairix` can be used to install both Pairix and Pypairix together. This requires Anaconda or Miniconda.

<br>

### Usage examples for pypairix
```
# to import and use the python module pypairix, add the following to your python script.
import pypairix

# 2D query usage example 1, with query2D(chrom, start, end, chrom2, start2, end2)
tb = pypairix.open("textfile.gz")
it = tb.query2D(chrom, start, end, chrom2, start2, end2)
for x in it:
    print(x)

# 2D query usage example 1 with autoflip, query2D(chrom, start, end, chrom2, start2, end2, 1)
# Autoflip: if the queried chromosome pair does not exist in the pairs file, query the flipped pair.
tb = pypairix.open("textfile.gz")
it = tb.query2D(chrom2, start2, end2, chrom1, start1, end1, 1)
for x in it:
    print(x)

# 2D query usage example 2, with querys2D(querystr)
tb = pypairix.open("textfile.gz")
querystr = '{}:{}-{}|{}:{}-{}'.format(chrom, start, end, chrom2, start2, end2)
it = tb.querys2D(querystr)
for x in it:
    print(x)

# 2D query usage example with a wild card
tb = pypairix.open("textfile.gz")
querystr = '{}:{}-{}|*'.format(chrom, start, end)
it = tb.querys2D(querystr)
for x in it:
    print(x)

# 2D query usage example 3 with autoflip, querys2D(querystr, 1)
# Autoflip: if the queried chromosome pair does not exist in the pairs file, query the flipped pair.
tb = pypairix.open("textfile.gz")
querystr = '{}:{}-{}|{}:{}-{}'.format(chrom2, start2, end2, chrom, start, end)
it = tb.querys2D(querystr, 1)
for x in it:
    print(x)

# 1D query on a 2D-indexed file
tb = pypairix.open("textfile.gz")
querystr = '{}:{}-{}'.format(chrom, start, end)
it = tb.querys2D(querystr)
# The above two lines are equivalent to the following:
# querystr = '{}:{}-{}|{}:{}-{}'.format(chrom, start, end, chrom, start, end)
# it = tb.querys2D(querystr)
for x in it:
    print(x)

# 1D query on a 1D-indexed file, example 1
tb = pypairix.open("textfile.gz")
it = tb.query(chrom, start, end)
for x in it:
    print(x)

# 1D query on a 1D-indexed file, example 2
tb = pypairix.open("textfile.gz")
querystr = '{}:{}-{}'.format(chrom, start, end)
it = tb.querys(querystr)
for x in it:
    print(x)

# get the list of (chr-pair) blocks
tb = pypairix.open("textfile.gz")
chrplist = tb.get_blocknames()
print(str(chrplist))

# get the column indices (0-based)
tb = pypairix.open("textfile.gz")
print( tb.get_chr1_col() )
print( tb.get_chr2_col() )
print( tb.get_startpos1_col() )
print( tb.get_startpos2_col() )
print( tb.get_endpos1_col() )
print( tb.get_endpos2_col() )

# check if a key exists (a key is a chromosome pair for a 2D-indexed file, or a chromosome for a 1D-indexed file)
tb = pypairix.open("textfile.gz")
print( tb.exists("chr1|chr2") )    # 1 if it exists, 0 if not.
print( tb.exists2("chr1", "chr2") ) # 1 if it exists, 0 if not.

# get the header
tb = pypairix.open("textfile.gz")
print( tb.get_header() )

# get the chromsizes
tb = pypairix.open("textfile.gz")
print( tb.get_chromsize() )

# get the number of bgzf blocks that span a given chromosome pair
tb = pypairix.open("textfile.gz")
print( tb.bgzf_block_count("chr1", "chr2") )

# check if an indexed file is a triangle
tb = pypairix.open("textfile.gz")
print( tb.check_triangle() )
```

<br>

## Rpairix
* Rpairix is an R package for reading pairix-indexed pairs files.
It has its own repo: https://github.com/4dn-dcic/Rpairix

<br>

## Utils

### bam2pairs
* This script converts a bam file to a 4dn-style pairs file, sorted and indexed.
* See [util/bam2pairs/README.md](util/bam2pairs/README.md) for more details.

### process_merged_nodup.sh
* This script sorts, bgzips and indexes a newer version of the `merged_nodups.txt` file, with strand1 as the first column.
```
Usage: process_merged_nodup.sh <merged_nodups.txt>
```

### process_old_merged_nodup.sh
* This script sorts, bgzips and indexes an old version of the `merged_nodups.txt` file, with readID as the first column.
```
Usage: process_old_merged_nodup.sh <merged_nodups.txt>
```

### merged_nodup2pairs.pl
* This script converts Juicer's `merged_nodups.txt` format to the 4dn-style pairs format. It requires the pairix and bgzip binaries in PATH.
```
Usage: merged_nodup2pairs.pl <input_merged_nodups.txt> <chromsize_file> <output_prefix>
```
* An example output file (bgzipped and indexed) looks like the below.
```
## pairs format v1.0
#sorted: chr1-chr2-pos1-pos2
#shape: upper triangle
#chromsize: 1 249250621
#chromsize: 2 243199373
...
#columns: readID chr1 pos1 chr2 pos2 strand1 strand2 frag1 frag2
SRR1658650.8850202.2/2 1 16944943 1 151864549 - + 45178 333257
SRR1658650.8794979.1/1 1 21969282 1 50573348 - - 59146 140641
SRR1658650.6209714.1/1 1 31761397 1 32681095 - + 88139 90865
SRR1658650.6348458.2/2 1 40697468 1 40698014 + - 113763 113763
SRR1658650.12316544.1/1 1 41607001 1 41608253 + + 116392 116398
```

### old_merged_nodup2pairs.pl
* This script converts Juicer's old `merged_nodups.txt` format to the 4dn-style pairs format. It requires the pairix and bgzip binaries in PATH.
```
Usage: old_merged_nodup2pairs.pl <input_merged_nodups.txt> <output_prefix>
```
* An example output file (bgzipped and indexed) looks like the below.
```
## pairs format v1.0
#sorted: chr1-chr2-pos1-pos2
#shape: upper triangle
#chromsize: 1 249250621
#chromsize: 2 243199373
...
#columns: readID chr1 pos1 chr2 pos2 strand1 strand2 frag1 frag2
SRR1658650.8850202.2/2 1 16944943 1 151864549 - + 45178 333257
SRR1658650.8794979.1/1 1 21969282 1 50573348 - - 59146 140641
SRR1658650.6209714.1/1 1 31761397 1 32681095 - + 88139 90865
SRR1658650.6348458.2/2 1 40697468 1 40698014 + - 113763 113763
SRR1658650.12316544.1/1 1 41607001 1 41608253 + + 116392 116398
```

### fragment_4dnpairs.pl
* This script adds juicer-style fragment information to a 4DN-DCIC-style pairs file.
```
Usage: gunzip -c <input.pairs.gz> | fragment_4dnpairs.pl [--allow-replacement] - <out.pairs> <juicer-style-restriction-site-file>
```

### duplicate_header_remover.pl
* This script removes duplicate headers from a pairs file (either ungzipped or streamed). This is useful when you have accidentally created a pairs file with duplicate headers. The order of the headers doesn't change, and duplicates don't have to be on consecutive lines.
```
Usage: gunzip -c <input.pairs.gz> | duplicate_header_remover.pl - | bgzip -c > <out.pairs.gz>
```

### column_remover.pl
* This script removes columns from a pairs file (either ungzipped or streamed).
```
# The following removes multiple columns from both header and content. The columns to be removed are referred to by column name.
Usage: gunzip -c <input.pairs.gz> | column_remover.pl - <colname1> [<colname2> ...] | bgzip -c > <out.pairs.gz>

# The following removes a single column from the content only. The column to be removed is referred to by column index (0-based).
Usage: gunzip -c <input.pairs.gz> | column_remover.pl --do-not-fix-header - <colindex> | bgzip -c > <out.pairs.gz>
```

### Pairs_merger
Pairs_merger is a tool that merges indexed pairs files that are already sorted, creating a sorted output pairs file. Pairs_merger uses a k-way merge sort starting with k file streams. Specifically, it loops over a merged iterator composed of a dynamically sorted array of k iterators.
It neither requires additional memory nor produces any temporary files.

#### Usage for pairs_merger
```
pairs_merger <in1.gz> <in2.gz> <in3.gz> ... > <out.txt>   # Each of the input files must have a .px2 index file.
bgzip out.txt

## or pipe to bgzip
pairs_merger <in1.gz> <in2.gz> <in3.gz> ... | bgzip -c > <out.gz>

# To index the output file as well,
# use the appropriate options according to the output file format.
pairix [options] out.gz
```

#### Usage examples for pairs_merger
```
bin/pairs_merger samples/merged_nodups.space.chrblock_sorted.subsample2.txt.gz samples/merged_nodups.space.chrblock_sorted.subsample3.txt.gz | bin/bgzip -c > out.gz
bin/pairix -f -p merged_nodups out.gz
# The above command is equivalent to:
bin/pairix -f -s2 -d6 -b3 -e3 -u7 -T out.gz
```

### merge-pairs.sh
Merge-pairs.sh is a merger specifically for the 4DN pairs format. This merger is header-friendly. The input pairs files do not need to be indexed, but they must be sorted properly.

#### Usage for merge-pairs.sh
```
# The following command creates outprefix.pairs.gz and outprefix.pairs.gz.px2, given in1.pairs.gz, in2.pairs.gz, ....
merge-pairs.sh <outprefix> <in1.pairs.gz> <in2.pairs.gz> ...
```

#### Usage examples for merge-pairs.sh
```
util/merge-pairs.sh output sample1.pairs.gz sample2.pairs.gz
```

### Streamer_1d
Streamer_1d is a tool that converts a 2D-sorted pairs file into a 1D-sorted stream (sorted by chr1-chr2-pos1-pos2 -> sorted by chr1-pos1). This tool uses a k-way merge sort on k file pointers into the same input file, operating in a streaming fashion without producing any temporary files. Currently it is actually slower than unix sort and is therefore *not recommended*.
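The k-way merge idea behind pairs_merger and streamer_1d can be illustrated with a short Python sketch. This is not pairix code: it uses `heapq.merge` from the standard library, and the three in-memory lists of (chr1, chr2, pos1) keys are hypothetical stand-ins for the k sorted file streams.

```python
import heapq

# Three hypothetical, already-sorted "streams" of (chr1, chr2, pos1) keys,
# standing in for the k indexed pairs files that pairs_merger reads.
stream_a = [("chr1", "chr1", 100), ("chr1", "chr2", 50)]
stream_b = [("chr1", "chr1", 120), ("chr2", "chr2", 10)]
stream_c = [("chr1", "chr2", 40)]

# heapq.merge keeps one pending item per stream in a small heap and
# repeatedly emits the smallest: O(N log k) time, O(k) extra memory,
# and no temporary files -- the same shape as pairix's merged iterator.
merged = list(heapq.merge(stream_a, stream_b, stream_c))
print(merged)
# [('chr1', 'chr1', 100), ('chr1', 'chr1', 120), ('chr1', 'chr2', 40),
#  ('chr1', 'chr2', 50), ('chr2', 'chr2', 10)]
```

Because only one item per stream is held at a time, memory use is independent of the total number of pairs, which is why merging pre-sorted files scales to arbitrarily large inputs.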
#### Usage for streamer_1d
```
streamer_1d <in.2d.gz> > out.1d.pairs
streamer_1d <in.2d.gz> | bgzip -c > out.1d.pairs.gz
```

#### Usage examples for streamer_1d
```
bin/streamer_1d samples/merged_nodups.space.chrblock_sorted.subsample2.txt.gz | bgzip -c > out.1d.pairs.gz
```

#### FAQ for streamer_1d
The tool creates many file pointers for the input file, which is equivalent to opening many files simultaneously. Your OS may have a limit on the number of files that can be open at a time; for example, on Mac El Capitan and Sierra it is set to 256 by default. This is usually enough, but if the number of chromosomes in your pairs file is larger than or close to this limit, the tool may produce an error message saying the file limit is exceeded. You can increase this limit outside the program. For example, on Mac El Capitan and Sierra, the following command raises the limit to 2000.
```
# view the limits
ulimit -a

# raise the limit to 2000
ulimit -n 2000
```

<br>

## Note
* If the chromosome pair blocks are ordered so that the first coordinate is always smaller than the second ('upper triangle'), a lower-triangle query returns an empty result. For example, if there is a block with chr1='6' and chr2='X', but not one with chr1='X' and chr2='6', then the query 'X|6' will not return any result; the search is not symmetric. However, the `-a` option for `pairix` or the `flip` option for `pypairix` turns on autoflip, which searches for '6|X' if 'X|6' doesn't exist.
* Tabix and pairix indices are not cross-compatible.

<br>

## Version history

### 0.3.7
* The integer overflow issue with get_linecount of pypairix is now fixed (no need to re-index).
* Fixed an issue where autoflip causes a segmentation fault or returns an empty result on some systems. This affects `pairix -Y`, `pairix -a`, `pypairix.check_triangle()` and `pypairix.querys2D()`. It does not affect the results on the 4DN Data Portal.
### 0.3.6
* The line count (`pairix -n`) integer overflow issue has been fixed. The index structure has changed. Indices generated by the previous versions (0.2.5 ~ 0.3.3, 0.3.4 ~ 0.3.5) are auto-detected and can still be used (backward compatible).

### 0.3.5
* Backward compatibility added: an index generated by a previous version (0.2.5 ~ 0.3.3) can now be auto-detected and used by Pairix.

### 0.3.4
* The maximum chromosome size allowed is now 2^30 instead of 2^29 with the new index. *Index structure changed.*

### 0.3.3
* The problem of `pypairix` `get_blocknames` crashing python when called twice is now fixed.

### 0.3.2
* The `pairix -Y` option is now available to check whether a pairix-indexed file is a triangle (i.e. each chromosome pair occurs in only one direction; e.g. if chr1|chr2 exists, chr2|chr1 doesn't).
* The `pypairix` `check_triangle` function is likewise available to check whether a pairix-indexed file is a triangle.
* The `pairix -B` option is now listed as part of the usage.

### 0.3.1
* The `pairix -B` option is now available to print out the number of bgzf blocks for each chromosome (pair).
* The same function is available for pypairix (`bgzf_block_count`).

### 0.3.0
* The problem with `fragment_4dnpairs.pl` adding an extra column is now fixed.
* 1D querying on 2D data now works with `pypairix` (function `querys2D`).

### 0.2.9
* `pairix` can now take a 1D query on 2D data, e.g. `pairix file.gz 'chr22:50000-60000'` is equivalent to `pairix file.gz 'chr22:50000-60000|chr22:50000-60000'` if file.gz is 2D-indexed.

### 0.2.8
* `pairix` now has the option `-a` (autoflip) that flips the query in case the matching chromosome pair doesn't exist in the file.
* `pairix` now has the option `-W` that prints out the region split character used for indexing a specific file.
* `merge-pairs.sh` is now included in `util`.
* ~~`pairix` can now take a 1D query on 2D data, e.g. `pairix file.gz 'chr22:50000-60000'` is equivalent to `pairix file.gz 'chr22:50000-60000|chr22:50000-60000'` if file.gz is 2D-indexed.~~ This currently does not work.

### 0.2.6
* Two utils are added for pairs file format correction: `duplicate_header_remover.pl` and `column_remover.pl`.

### 0.2.5
* `pairix` now has the option `-w`, which specifies the region split character (default '|') during indexing. A query string should use this character as a separator.
* `pypairix` likewise now has a parameter `region_split_character` in the function `build_index` (default '|').
* `juicer_shortform2pairs.pl` is now available in the `util` folder.
* *Index structure changed* - please re-index if you're using an older version of the index.

### 0.2.4
* Updated the magic number for the new index, to avoid crashes caused by a different index structure.

### 0.2.3
* The total linecount is now stored in the index, with virtually no added runtime or memory for indexing (use `pairix -n` or `pypairix` `get_linecount` to retrieve the total line count).
* Index structure changed - please re-index if you're using an older version of the index.

### 0.2.2
* Fixed the -s option still not working in `old_merged_nodups2pairs.pl`.

### 0.2.1
* Fixed a newly introduced error in `fragment_4dnpairs.pl`.

### 0.2.0
* Fixed the --split option still not working in `merged_nodups2pairs.pl` and `old_merged_nodups2pairs.pl`.

### 0.2.0
* Fixed an issue of not excluding chromosomes that are not in the chromsize file in `merged_nodups2pairs.pl` and `old_merged_nodups2pairs.pl`.
* Fixed the --split option not working in `merged_nodups2pairs.pl` and `old_merged_nodups2pairs.pl`.
* A mapq filtering option (-m|--mapq) is added to `merged_nodups2pairs.pl` and `old_merged_nodups2pairs.pl`.

### 0.1.8
* `util` now has an (updated) `fragment_4dnpairs.pl` script, which allows adding juicer fragment index information to a 4DN-DCIC-style pairs file.

### 0.1.7
* The pairix index now contains a pairix-specific magic number that differentiates it from a tabix index.
* Pypairix produces a better error message when the index file doesn't exist (instead of throwing a segfault).

### 0.1.6
* `merged_nodup2pairs.pl` and `old_merged_nodup2pairs.pl` now take a chromsize file and add the chromsizes to the output file header. The upper triangle is also defined according to the chromsize file.
* `bam2pairs`: the option description in the printed usage and the command field in the output pairs file have now been fixed (-l instead of -5 for the opposite effect).
* `pairix`: the command `pairix --help` now exits 0 after printing usage (`pairix` alone exits 1 as before).

### 0.1.5
* `pypairix`: function `build_index` now has an option `zero`, which creates a zero-based index (default 1-based).
* `bam2pairs`: now adds chromsizes to the header. It optionally takes a chromsize file to define mate ordering and filter chromosomes. If a chromsize file is not fed, the mate ordering is alphanumeric.
* `pypairix`: functions `get_header` and `get_chromsize` are added.
* The pairs format now has chromsizes in the header as a requirement.

### 0.1.4
* Fixed the usage printout for `merged_nodup2pairs.pl` and `old_merged_nodup2pairs.pl`.
* `pypairix`: function `exists2` is added.

### 0.1.3
* Added build_index methods. Now you can build index files (.px2) using the command line, R, or Python.
  + R: px_build_index(<filename>) : see Rpairix
  + Python: pypairix.build_index(<filename>)
* Tests updated.
* pairs_format_specification updated.

### 0.1.2
* A custom column set now overrides file extension recognition.
* bam2pairs now records the 5' end of a read for position (instead of leftmost).
* Version number in the binaries fixed.

### 0.1.1
* Now all source files are in `src/`.
* `pypairix`: function `exists` is added.
* `pairix`: indexing presets (-p option) now include `pairs`, `merged_nodups`, `old_merged_nodups`. It also automatically recognizes the extension `.pairs.gz`.
* merged_nodups.tab examples are now deprecated (since the original space-delimited files can be recognized as well).
* `pairs_merger`: memory error fixed.
* Tests updated.
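The 1D-to-2D query expansion (version 0.2.9) and the autoflip behavior (`-a` / `flip`) described above can be illustrated with a small pure-Python sketch. This does not use pairix itself; the helper names `expand_1d` and `flip_2d` are made up for illustration.

```python
def expand_1d(query: str) -> str:
    """Expand a 1D query into the equivalent 2D query for a 2D-indexed file.
    'chr22:50000-60000' -> 'chr22:50000-60000|chr22:50000-60000'
    A query that already contains '|' is returned unchanged."""
    return query if "|" in query else f"{query}|{query}"


def flip_2d(query: str) -> str:
    """Swap the two mates of a 2D query, as autoflip does when the
    requested chromosome pair is absent from the file ('X|6' -> '6|X')."""
    left, right = query.split("|", 1)
    return f"{right}|{left}"


# A query for X|6 falls back to 6|X under autoflip.
print(expand_1d("chr22:50000-60000"))  # chr22:50000-60000|chr22:50000-60000
print(flip_2d("X:1-100|6:1-100"))      # 6:1-100|X:1-100
```

Note that without autoflip the search is not symmetric: only the stored orientation of a chromosome pair returns results.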
<properties pageTitle="Install the Azure CLI | Microsoft Azure" description="Install the Azure command-line interface (Azure CLI) for Mac, Linux, and Windows to use Azure services" editor="" manager="timlt" documentationCenter="" authors="dlepow" services="" tags="azure-resource-manager,azure-service-management"/>

<tags ms.service="multiple" ms.workload="multiple" ms.tgt_pltfrm="command-line-interface" ms.devlang="na" ms.topic="article" ms.date="09/18/2015" ms.author="danlep"/>

# Install the Azure CLI

This article describes how to install the Azure command-line interface (Azure CLI). The Azure CLI provides a set of open-source, shell-based commands for creating and managing resources in Microsoft Azure.

[AZURE.INCLUDE [learn-about-deployment-models](../includes/learn-about-deployment-models-both-include.md)]

The Azure CLI is written in JavaScript and requires [Node.js](https://nodejs.org). It is implemented using the [Azure SDK for Node.js](https://github.com/azure/azure-sdk-for-node) and released under an Apache 2.0 license. The project repository is at [https://github.com/WindowsAzure/azure-sdk-tools-xplat](https://github.com/azure/azure-xplat-cli).

> [AZURE.NOTE] If you have already installed the Azure CLI, connect it to your Azure resources. For more information, see [Connect to your Azure subscription](xplat-cli-connect.md#configure).

<a id="install"></a>
## How to: Install the Azure CLI

There are several ways to install the Azure CLI:

1. Use an installer
2. Install Node.js and npm and use the **npm install** command
3. Run the Azure CLI as a Docker container

Once the Azure CLI is installed, you can use the **azure** command from your command-line interface (Bash, Terminal, Command prompt, and so on) to access the Azure CLI commands.

## Use an installer

The following installer packages are available:

* [Windows installer][windows-installer]
* [OS X installer](http://go.microsoft.com/fwlink/?LinkId=252249)
* [Linux installer][linux-installer]

## Install Node.js and npm

If Node.js is already installed on your system, use the following command to install the Azure CLI:

    npm install azure-cli -g

> [AZURE.NOTE] On Linux distributions, you might need to use `sudo` to run the __npm__ command successfully.

### Install Node.js and npm on Linux distributions that use the [dpkg](http://en.wikipedia.org/wiki/Dpkg) package manager

The most commonly used distributions use either the [Advanced Packaging Tool (apt)](http://en.wikipedia.org/wiki/Advanced_Packaging_Tool) or other tools based on the `.deb` package format. Examples include Ubuntu and Debian. Most of the newer distributions require installing **nodejs-legacy** to get a properly configured **npm** tool for installing the Azure CLI. The following code shows the commands that correctly install **npm** on Ubuntu 14.04.

    sudo apt-get install nodejs-legacy
    sudo apt-get install npm
    sudo npm install -g azure-cli

Some of the older distributions, such as Ubuntu 12.04, require installing the current binary distribution of Node.js. The following code shows how to do this by installing and using **curl**.
> [AZURE.NOTE] The commands shown come from the Joyent installation instructions, which you can find [here](https://github.com/joyent/node/wiki/installing-node.js-via-package-manager). When piping to **sudo**, you should always review the scripts you are installing and verify that they behave as you expect before running them with **sudo**. With great power comes great responsibility.

    sudo apt-get install curl
    curl -sL https://deb.nodesource.com/setup | sudo bash -
    sudo apt-get install -y nodejs
    sudo npm install -g azure-cli

### Install Node.js and npm on Linux distributions that use the [rpm](http://en.wikipedia.org/wiki/RPM_Package_Manager) package manager

Installing Node.js on rpm-based distributions requires enabling the EPEL repository. The following code shows the recommended way to install on CentOS 7. (Note that the "-" (hyphen) in the first line is important!)

    su -
    yum update
    yum upgrade -y
    yum install epel-release
    yum install nodejs
    yum install npm
    npm install -g azure-cli

### Install Node.js and npm on Windows and Mac OS X

You can install Node.js and npm using the installers from [Nodejs.org](https://nodejs.org/download/) on Windows and OS X. You might need to restart your computer to complete the installation.
Verify that Node.js and npm were installed correctly by opening a command window and entering:

    npm -v

If the version of the installed npm is displayed, you can go ahead and install the Azure CLI with

    npm install -g azure-cli

At the end of the installation, you should see output that looks something like this:

    azure-cli@0.8.0 ..\node_modules\azure-cli
    |-- easy-table@0.0.1
    |-- eyes@0.1.8
    |-- xmlbuilder@0.4.2
    |-- colors@0.6.1
    |-- node-uuid@1.2.0
    |-- async@0.2.7
    |-- underscore@1.4.4
    |-- tunnel@0.0.2
    |-- omelette@0.1.0
    |-- github@0.1.6
    |-- commander@1.0.4 (keypress@0.1.0)
    |-- xml2js@0.1.14 (sax@0.5.4)
    |-- streamline@0.4.5
    |-- winston@0.6.2 (cycle@1.0.2, stack-trace@0.0.7, async@0.1.22, pkginfo@0.2.3, request@2.9.203)
    |-- kuduscript@0.1.2 (commander@1.1.1, streamline@0.4.11)
    |-- azure@0.7.13 (dateformat@1.0.2-1.2.3, envconf@0.0.4, mpns@2.0.1, mime@1.2.10, validator@1.4.0, xml2js@0.2.8, wns@0.5.3, request@2.25.0)

> [AZURE.NOTE] On Linux systems, you can also install the Azure CLI by building it from the [source code](http://go.microsoft.com/fwlink/?linkid=253472). For more information about building from source, see the INSTALL file included in the archive.

## Use a Docker container

On a Docker host, run the following command:

```
docker run -it microsoft/azure-cli
```

## Run Azure CLI commands

Once the Azure CLI is installed, you can use the **azure** command from your command-line interface (Bash, Terminal, Command prompt, and so on) to access the Azure CLI commands. For example, to run the help command on Windows, open a command window and enter:

```
c:> azure help
```

You're all set.
Next, you can [connect to your Azure subscription from the Azure CLI](xplat-cli-connect.md) and start using the **azure** commands.

<a id="additional-resources"></a>
## Additional resources

* [Using the Azure CLI with Resource Manager commands][cliarm]
* [Using the Azure CLI with (classic) Service Management commands][cliasm]
* To learn more about the Azure CLI, download the source code, report problems, or contribute to the project, visit the [GitHub repository for the Azure CLI](https://github.com/azure/azure-xplat-cli).
* If you have problems using the Azure CLI or Azure, visit the [Azure forums](http://social.msdn.microsoft.com/Forums/windowsazure/home).

[mac-installer]: http://go.microsoft.com/fwlink/?LinkId=252249
[windows-installer]: http://go.microsoft.com/?linkid=9828653&clcid=0x409
[linux-installer]: http://go.microsoft.com/fwlink/?linkid=253472
[cliasm]: virtual-machines/virtual-machines-command-line-tools.md
[cliarm]: virtual-machines/xplat-cli-azure-resource-manager.md

<!---HONumber=Oct15_HO3-->
---
description: "-target:winexe (C# compiler options)"
title: "-target:winexe (C# compiler options)"
ms.date: 07/20/2015
f1_keywords:
- /target:winexe
helpviewer_keywords:
- /target compiler options [C#], /target:winexe
- -target compiler options [C#], /target:winexe
- target compiler options [C#], /target:winexe
ms.assetid: b5a0619c-8caa-46a5-a743-1cf68408ad7a
ms.openlocfilehash: 6e14a2aac427c7adfd69f66eaf624816b75f6ea2
ms.sourcegitcommit: 5b475c1855b32cf78d2d1bbb4295e4c236f39464
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 09/24/2020
ms.locfileid: "91168927"
---
# <a name="-targetwinexe-c-compiler-options"></a>-target:winexe (C# compiler options)

The **-target:winexe** option causes the compiler to create an executable Windows program (EXE).

## <a name="syntax"></a>Syntax

```console
-target:winexe
```

## <a name="remarks"></a>Remarks

The executable file is created with the .exe extension. A Windows program is one that provides a user interface, either from the .NET library or with the Windows APIs.

Use [-target:exe](./target-exe-compiler-option.md) to create a console application.

Unless otherwise specified with the [-out](./out-compiler-option.md) option, the output file name takes the name of the input file that contains the [Main](../../programming-guide/main-and-command-args/index.md) method.

When specified at the command line, all files up to the next **-out** or [-target](./target-compiler-option.md) option are used to create the Windows program.

Exactly one **Main** method is required in the source code files that are compiled into an .exe file. The [-main](./main-compiler-option.md) option lets you specify which class contains the **Main** method, in cases where your code has more than one class with a **Main** method.
### <a name="to-set-this-compiler-option-in-the-visual-studio-development-environment"></a>To set this compiler option in the Visual Studio development environment

1. Open the project's **Properties** page.

2. Click the **Application** property page.

3. Modify the **Output type** property.

For information about how to set this compiler option programmatically, see <xref:VSLangProj80.ProjectProperties3.OutputType%2A>.

## <a name="example"></a>Example

Compile `in.cs` into a Windows program:

```console
csc -target:winexe in.cs
```

## <a name="see-also"></a>See also

- [-target (C# compiler options)](./target-compiler-option.md)
- [C# compiler options](./index.md)
EPESI CHANGELOG
===============

(Dev) means that this change is significant only for developers.

RELEASE 1.8.2-20180413
----------------------
### Fixed
- Error: "Undefined index: name" in dashboard
- BBcode display
- watchdog applet caused epesi error

### Feature
- Double clock
- HTML5 Notification
- rebuild:all command
- remove:all command
- create test module command
- possibility to set tooltip for leightbox buttons
- TCPDF from composer (dev)

RELEASE 1.8.2-20171019
----------------------
### Fixed
- fixed to use openssl if mcrypt is not loaded
- fixed charset exporting
- fixed XSS vulnerabilities in Attachments, Meetings, Calendar, Perspective display
- added XSS purifier on recordset update

### Feature
- added phpmoney library

RELEASE 1.8.2-20170830
----------------------
### Fixed
- notes file leightbox in display and view
- avoid error on deleting page splits (georgehristov)

### Feature
- Access class for RecordBrowser (georgehristov)
- set caption for recordbrowser (georgehristov)
- menu caching (georgehristov)
- allow access for recursive permissions if set to empty (georgehristov)

RELEASE 1.8.2-20170826
----------------------
### Fixed
- sql error in commondata
- copy patch for csv export params array to commondata to RB install
- csv exporting ascii translation
- get old link if exists for attachments
- sorting menu error if non array passed
- limit xss protection to utils_attachment
- add xss protection for notes in view and history mode
- check if mod_rewrite supported for RC multiwin support

### Feature
- set watchdog applet title for better segregation
- file field email actionbar in leightbox

RELEASE 1.8.2-20170730
----------------------
### Fixed
- fixed notes displaying html tags
- added safe html class for clearing html from xss
- introduced htmlpurifier to safe html class
- fixed roundcube long sorting by date by changing to arrival

RELEASE 1.8.2-20170701
----------------------
### Fixed
- column visibility if no records (georgehristov)
- invalid "from" header in mails from epesi, missing name before email address
- timeless tasks deadline
- fixed xss vulnerabilities
- file attachments display (georgehristov)
- properly display overflow for dropzone file field (georgehristov)
- quickjump error
- avoid exception and display missing file text in case file hash not found
- fixed quick new records

### Changed
- enable adding help with installation of field (georgehristov)
- execute in order of module priority (georgehristov)
- pass tab as argument on processing callback (georgehristov)

RELEASE 1.8.2-20170430
----------------------
### Added
- File field available in RB
- Files panel for administrator
- Codeception for unit testing (Dev)
- Psy shell (Dev)
- Console command to create patch (Dev)

### Fixed
- Allow attachments to be encrypted using openssl if mcrypt not installed (Dev)
- Notice for non well formatted numeric values on php >= 7.0 (Dev)
- Using $this in non object on php >= 7.0 (Dev)
- Block cross-origin framing (Dev)
- Watchdog notify employees about phone calls
- Date display for empty value
- RB filter issue after refresh (Dev)
- Block the view of the record during add
- Issue with crits validation
- Update composer.json to work on windows (Dev)
- Checking add crits - restore original behavior
- Update get_element_by_array callback type - call with reference as argument (Dev)
- JS currency calculation method - issue occurred when value was smaller than 0.1
- Update watchdog record access
- Update update_access - ignore 'grant' and 'restrict' ids - throw exception on non numeric id (Dev)
- Extend login audit hostname col length
- Description fields to allow use comma in quoted string
- Check if method exists in update.php script - for older versions of epesi (Dev)
- Renaming fields without DB entries (georgehristov)
- Add possibility to open multiple roundcube accounts
- Logged out user was still in the whoisonline applet
- Set default cache ttl to 24h (Dev)
- Quickjump feature (Dev)
- Improve RB search code (Dev)
- Commondata crits - return empty set if records not matched
- Set tray filters with fixed RB method
- Allow to override saved filter values
- Quickform row/column display templates
- Module create command - add version method to install (Dev)
- Session locking issue (Dev)
- Do not require linked fields in RBO select field (Dev)
- Module uninstall console command - fix messages (Dev)
- Display users online count (georgehristov)
- Next CSV export modification based on AJB comments (praski)
- Crits validation - add preg_quote
- Display callbacks with PHP code not working (Dev)
- QFfield callback with PHP code embedded calling (Dev)
- Issue with bbcode when callback is not callable (Dev)
- Issue with calendar when module with event handler has been disabled
- Set order of watchdog notifications

### Changed
- Filestorage and db structure has been refactored (Dev)
- Fileupload has been refactored - added Dropzone (Dev)
- Attachments patch for new filestorage (Dev)
- Update PHPExcel library (Dev)
- Remove json encode/decode service lib (Dev)
- Enable for custom field_id different than field name (georgehristov)
- Use custom class to detect if field is auto-hidden (georgehristov)
- Enable for custom icons not related to the parent module
- Allow to use like special characters in commondata crits (Dev)
- Establish get_mime_type as static method (georgehristov)
- Introduce the "more" option to linked_label and linked_text (georgehristov)
- Improve field access selection (georgehristov)
- Load all similar tooltips at the same time (georgehristov)
- Change method of tooltip_id generation (georgehristov)

### Important
Since adding the new file field to RB, utils_filestorage and utils_fileupload have been heavily refactored. Any module that has been using them might have to write a patch to work with this update. I.e. Utils_FilestorageCommon::meta accepts the id from the new utils_filestorage table.
RELEASE 1.8.1-20161121
-------
### Added
- Allow to define fields to use as default record label
- Allow to assign multiple contacts to company by related field (georgehristov)
- New console commands:
  * create backups of db (mysql only) and files
  * enable all disabled modules at once
  * create module files (Dev)
- Show recordset caption and record id if label would be empty
- Show requirement of PHP gd library during installation

### Fixed
- Mobile edit issues
- MySQL syntax error during watchdog cleanup
- Decrease RB search index size - do not store empty strings
- Allow empty option key in Leightbox Prompt
- Issue when creating a new phonecall from company addon
- Editing select field - populate with proper label
- Like operator for date fields - issue with mysql collation
- Selection crits
- Resetting advanced order in GB/RB
- Record labels with empty values - sometimes unwanted html appeared

### Changed
- Always use default linked label for select field suggestbox
- Error reporting - better console readability

RELEASE 1.8.1-20161118
-------
Replaced by rev. 20161121. The PhpFastCache library added in this revision requires PHP 5.5, but EPESI works fine with PHP 5.4. The cache engine has been reverted to the old one in rev. 20161121.
RELEASE 1.8.0-20160926
-------
- Allow to add help text to each field in RB
- Allow to sort by calculated field when it references a sort column
- Disable modules when files are not available
- Improve commondata ordering (georgehristov)
- Add note password hint
- RB js indexer - change run interval
- Fix RB record picker code issues (georgehristov)
- PHPMailer - do not use auto tls
- Replace deprecated mysql driver with mysqli
- Rewrite edit_history.js to jquery (georgehristov)
- Fix autosuggestbox - github issue #82
- Fix attachments get unlink error
- Fix crits validation in PHP
- Fix watchdog's subscriber list - do not show contacts without access
- Cleanup watchdog notifications
- Add custom access callbacks to RB (Dev)
- Fix notes access check in search results with rb custom access callback
- Fix uncompress errors from database
- Fix not like rule handling made from js query builder
- Allow to search with nested selects using like operator
- Fix rb search and permission issues
- Fix new record rules check and show better message about issues
- Fix automulti suggestbox - better search for records
- Fix CSV export memory limit issues
- Disallow php code as callbacks by default
- Fix chained select contacts sort
- Fix special values replacement in crits
- Fix JS Query Builder integration - proper queries for multiple values
- New global search engine
- Fix file uploads - remove temporary files
- Fix default dashboard - use default settings when user is not allowed to edit dashboard
- Add option to use EPESI as email handling application in browsers
- Add custom field template (georgehristov)
- Improvements in Utils/Tooltip and LeightboxPrompt (georgehristov)
- Fix Roundcube's addressbook permissions to contacts/companies
- Update Roundcube version to v1.2.1
- Unify contact or company field type with standard multiselect field
- Add separate RB/Filters module (georgehristov)
- Add separate display callback methods (georgehristov)
- Filter by favorite, subscribed and recent in JS Query Builder
- Add user version of JS Query Builder - respecting permissions
- Add option to search by date fields with datepicker
- Several other minor improvements and fixes

RELEASE 1.7.2-20160314
-------
- Fix birthday applet
- Fix RB search and sort for calculated and select fields
- Fix crits - allow to query by date field with LIKE operator
- Fix calendar event drags - revert to original position on invalid drop
- Fix tasks printout from calendar
- Remove unsupported CRM/Assets module
- Allow to block network update with .noupdate file or when .git exists
- Fix RB permission rules save on PostgreSQL
- Fix permission editor issue after editing rule
- Fix WhoIsOnline module installation
- Whitelabel integration fixes
- Fix Roundcube memcache
- Fix sending email with bug report - use files absolute path
- Fix tabbed browser - pick last tab if trying to select page out of range
- Fix Query Builder integration with QuickForm
- Allow to disable expandable rows per RB instance
- Add missing Exception class
- Reorganize installation files and modules installed by default

RELEASE 1.7.1-20160127
-------
- Fix agenda sort order
- Fix RB browse search logic - search for all words in record
- Fix sales opportunity and related notes integration
- Fix quick search - save categories selection
- SQL_TIMES - show caller in debug (georgehristov)
- Fix update.php argv handling
- Fix donate links, credits page, year in scripts
- Remove unused modules and code
- Add japanese, chinese and korean font support to PDF printing
- Fix code in RB search to use merge_crits
- Change mail applet refresh rate
- Fix client IP address recognition in autologin feature
- Update autologin feature - clean unused tokens, rolling token
- Fix file cache implementation
- Improve update.php script to skip some versions and create backups
- Fix Currency field issues with empty value
- New - pass RB table name as param to display callbacks (Dev)
- Adapt some code to PHP 7
- Add activity report for each user as addon
- Fix rb search indexes patch - add checkpoints
- Fix attachments - do not store edited_on changes in history entries
- Fix timeless task switch for am/pm clock
- Fix and improve switch_to_addon in RB and TabbedBrowser
- Improve notes watchdog notification message - link to parent record
- Fix automulti field freeze
- Enable custom label for attachment addon (Dev)
- Fix Query Builder integration - translation methods
- Fix user param for user settings save method
- Fix hide/show filters behavior
- Fix access restrictions in RB queries using nested queries
- Fix cache key for building queries - include tab alias
- Add method to RB - get_record_vals (Dev) (georgehristov)
- Fix crits to words - better translations and support all operators
- Fix currency value parser - parse negative values
- Update PHPMailer to 5.2
- Allow to use self signed certificates for mail server settings
- Fix frozen autoselect field value
- Fix words map patch - check if indexes exist, truncate large index
- Remove update EPESI from /admin
- Add currency filter to currencies field (Dev)
- Fix RB subqueries performance issues
- Fix column width persistence in addons
- Fix confirmLeave feature with recordpicker opened
- Fix cloned values in processing callback
- Fix recordpicker's record label
- Improve query builder date filters
- Fix filtering by user id in recordbrowser permissions
- Fix decimal point issues in currency field
- New feature - code to auto hide some fields depending on the other value (georgehristov)
- Fix RB crits for negation in multiselects
- Add generic method to create linked field value (georgehristov)
- Fix birthdays applet
- Update Roundcube to 1.1.4
- Fix RB navigation issues
- Fix column widths saving in GB (georgehristov)
- Update translations

RELEASE 1.7.0-20151201
-------
- Fix error reporting issue - use absolute path to error file
- Fix tooltip issues
- Add confirmation before sending shoutbox message to all (georgehristov)
- Fix multiple memcache sessions on one server
- Fix records permission check for select fields
- Remove watchdog notifications for forbidden notes
- Add record processing for permanent record delete
- Send watchdog notifications with cron
- Fix select filter values formatting (georgehristov)
- SQL fixes for PostgreSQL
- Commondata arbitrary sort order (georgehristov)
- Add epesi_log method and log errors to file (Dev)
- New crits and query building mechanism (Dev)
- Option to set custom QFfield callback code from GUI
- Fix recent settings in RecordBrowser
- Improve phone prefix check
- Fix filter labels to get custom caption properly
- Remove preload images cache
- Fix performance issues in watchdog desktop notifications
- Rewrite desktop notifications module (georgehristov)
- Add mail indicator desktop notifications
- Allow to set number of watchdog notifications per applet
- Show record subscribers on watchdog eye icon
- Enable watchdog for email records
- Add filestorage module
- Save filters for calendar
- Enable advanced search for long text fields
- Add option to decide who can manage applets on dashboard
- Update JQuery to 1.11
- Phonecalls - pass related record to notes
- Improve RecordBrowser and Watchdog performance
- Query Building with JQuery QueryBuilder
- RB - add new fields before first page split
- Fix selected currency reset on form validation
- Fix reload mails action in Roundcube
- Fix email archive - do not attach to disabled addresses
- Add option to search companies by tax id field
- Deactivate user when contact is deleted
- Add Telegram integration to pass notifications to users - http://telegram.org
- Remove not working google docs integration
- Update Roundcube to version 1.1.3
- Translations update
- Other fixes and improvements

RELEASE 1.6.5-20150324
-------
- Fix leightbox prompt id collision
- Fix timestamp field layout
- Fix print templates enabling
- Fix printouts caching by browser
- Fix JS errors related to form focus
- Fix autoloader to use absolute path - fixes Roundcube issues
- Fix calendar event span issues
- Rename Roundcube's archive folders to not use EPESI word
- Clear xcache on module update/install and themeup
- Create function to return default CRM priority and use it for defaults
- Fix RB patches order for update from older versions
- Add method to filter blocked fields from record array (Dev)
- Fix events permission issues in Activities addon and calendar
- Update TCPDF fonts - fixes Chrome blank printout issue
- Clear global cache on themeup
- Fix order by currency field
- Fix filters for currency field on PostgreSQL
- Update CKEditor to version 4.4.7
- Fix memcache session locking issues
- Fix RB crits issue with empty multiselect rule

RELEASE 1.6.4-20150316
-------
- Change cookie expiration time to 7 days for maintenance mode
- Improve update process - make sure maintenance mode is on during patches
- Set default customer for phonecalls, tasks and meetings
- Fix toolbar mode switch in ckeditor
- Add global cache stored in the file (Dev)
- Update ckeditor to the latest version
- Replace CustomRoundcubeAddons with built-in related mechanism
- Fix Roundcube archive copy/paste
- Add option to export RB report to csv file
- Fix issues in currencies admin panel for PostgreSQL database
- Register custom Prototype events also in jQuery (Dev)
- Add selection rules to RecordBrowser
- Currency field - fix frozen value display
- Add option to disable module without uninstall
- Improve /admin script to manage disabled modules
- Base/Print - add more document config params to PDF
- Base/Print - pass printer classname to document_config method (Dev)
- Fix GenericBrowser's default template when expandable is disabled
- Changes in expandable calculation to wrap long text fields
- GenericBrowser - Fix admin access param to get_records method
- RecordBrowser - Fix RB access rules - edit form and add check
- Improve Demo mode security issue check for Base/Print module
- Add filtering in RB for currency, integer and float fields
- Add submodule concept and clean module manager code (Dev)
- Add shared module variables concept (Dev)
- Add custom port for SQL server during installation
- Add resizable columns to GB (georgehristov)
- Add small delay to load tooltips with AJAX
- Add record info tooltips for every default linked label in RB
- Fix access to csv export
- Remove addons during recordset uninstallation
- Add global function to get client IP address
- Translate watchdog email notifications to user's language
- Separate and improve watchdog email template
- Confirm leaving edit form
- Fix bugs in: RB search, module instance name
- Fix recurrent calls to get_val
- Improve select field labels to retrieve nested select values
- Fix PostgreSQL database creation - quote db name

RELEASE 1.6.3-20150107
-------
- Base/Print add method to obtain all templates (Dev)
- Fix recordset uninstall issue caused by words map table constraint
- Add desktop notifications for Shoutbox, Watchdog and Messenger modules (georgehristov)
- Add desktop notifications possibility (Dev) (georgehristov)
- Login audit - obtain real IP for proxy connections
- Improve module dependencies issue reporting
- Add method Utils_CurrencyFieldCommon::get_all_currencies (Dev)
- Add printer document config (Dev)
- Rename recordset_printer RB patch to fix updates from older EPESI versions
- Whitelabel fixes - replace EPESI text with constant
- Include autonumber fields in default description callback
- Fix duplicate tooltip on field's label - RecordBrowser
- Fix popup calendar event
- Fix locking issues with RB indexer
- Make fields management the first tab in RecordBrowser's admin
- Sort patches by name (not by path), when no date is supplied in the filename
- Rewrite to JQuery: Utils/Watchdog, Utils/Tooltip (georgehristov)
- Base/Print - change PrintingHandler::output_document to public (Dev)
- Fix fields editor in RecordBrowser - issues with select/multiselect
- Fix display_phone to not create links when nolink is true
- Fix translations in Access Restrictions admin panel
- Fix fields processing order for new fields with *position* set
- Fix date/time crits issues
- Rewrite Session class
- Fix issues with crm_company_contact field edit
- Fix records indexing - create labels with *nolink* param
- Fix Base/Print - buffer PDF output to append footers just once
- Change icon for drag and drop fields sorting
- Add default currency concept
- Add new processing modes to RecordBrowser: edited, deleted, restored
- Fix access issues for autocomplete fields
- Fix desktop notifications to not show shoutbox notifications every time
- Create link to record for autonumber fields
- Add option to jump to new record or not in RB object (Dev)
- Fix indexing when autonumber field has changed
- Fix fields position numbers when removing field
- Keep autonumber position during field edit
- Add method to clear search index for certain tab (Dev)
- Add button to clear search index in RB admin
- Update RoundCube to 1.0.3
- Add related records concept to meetings, tasks and phone calls
- Change table width in RecordBrowser Reports printout
- Fix undefined index issue in RB Reports
- Fix update script to detect glob errors
- Fix admin access - check for method with method_exists instead of is_callable
- Fix turkish language issues
- Fix year bug in QuickForm
- Allow to set custom caption for every field in RB

RELEASE 1.6.2-20141020
-------
- Fix like operator on date fields - used by birthdays applet

RELEASE 1.6.2-20141017
-------
- Fix user activity report issues
- Update AdoDB to 5.19
- Fix Roundcube cache issue
- Decrypt note in view - allows to enter crypted note from search
- Fix autoselect filter issue
- Fix tax_id label
- Search file downloads just by token
- Index records for search without cron
- Fix RB select field edit issues
- Fix some RB field edit issues
- Fix handling of relative date crits
- Add new processing callback: browse (Dev)
- Fix time intervals in
meetings - ESS - test connection before registration - Functions to check database type (Dev) - Extract SimpleLogin class from admin tools for easy login (Dev) - Fix setup script for PHP >= 5.6 - Fix blank index page issue - Fix bad character at the bottom of the page - Make display_as_row to wrap fields - Improve module install failure message - Add method to remove access rules by definition to RecordBrowser (Dev) - Keep form field focus on soft refresh - Include Utils/Tray module (Dev) - Reopen leightbox when error occured in a form - Add function to replace Base_Box main (Dev) - Admin tools - add Update Manager to download updates - Fix translation module to not grow custom translations files - Update translations RELEASE 1.6.1-20140913 ------- - Fix dashboard applets removal - Add field to select from multiple recordsets in RecordBrowser - Fix attachments PHP 5.3 code issue - Fix RoundCube addressbook contacts search - Set Contacts/Access Manager as read-only - Fix translation functions in Attachments - Allow negative integer numbers in RecordBrowser - Set "Company Name" and "Tax ID" fields as unique - Fix mobile RB edit bug - Fix Base/Theme get_icon function - Add Cron Management to Administrator Panel - Add Custom Recordsets tool to RecordBrowser - Fix cron CLI detection - Add who made last edit in attachments display - Fix Email applet issue with password encoding - Fix unique email rule - Add time and timestamp fields to RB GUI admin - Add datepicker placeholder text - Fix phonecalls template - Add ability to sort RB fields with drag n drop - Update ckeditor version to the most recent - Add button to switch full toolbar in ckeditor - Improve patches util error reporting in admin tools - Maintain QFfields callbacks order during position change - Configurable edited_on field format in Attachments - Allow to disable expandable rows in user settings - Search improvements - optimization, set defaults, disable certain recordsets - Attachments - do not show 
password in decrypt form - RecordBrowser - save filters per user - RecordBrowser - do not show filter for blocked field - RecordBrowser - add ability to print any record - Improve Translation panel - Fix watchdog notification for notes - Update translations - Fix bbcodes in attachments - RecordBrowser - allow multiple additional actions methods RELEASE 1.6.0-20140710 ------- - New attachments based on RecordBrowser - Add exception handling - Fix Base/Print uninstall method - Fix attachments when mcrypt module is not loaded - Do not show files in attachments when note is not decrypted - Add DEBUG_JS option for better js errors handling - Add option to forbid autologin - Add another admin access level to control ban and autologin - Do not generate watchdog notification when user doesn't have view access to modified field - Fix Roundcube rc_contactgroups reference - Fix RecordBrowser's field tooltip for select and multiselect fields - Fix Month View applet issue related to the daylight saving shift - Fix new langpack rule issue. 
- Remove duplicated codes from countries list and calling codes - Move jump to id setting to database (remove function Utils_RecordBrowser::disable_jump_to_record) - Add option to run update procedure from commandline interface - Add maintenance mode - Add Utils_CurrencyFieldCommon::parse_currency method - Improve RB uninstall method to remove processing callbacks and others - Add option to create mailto: links even when RoundCube accounts are set - Time management for patches - Allow patches to save some state and run from that place - Update process reinvented to match new patches with restart RELEASE 1.5.6-20140305 ------- - Fix Base/Print filename suffix - Fix not working RoundCube due to not loaded DBSession class RELEASE 1.5.6-20140303 ---------------------- - Crypted notes - New module to generate printouts - Change cron mechanism - Trigger error, when patch has failed during update - Fix HomePage template installation - Add mod_alias rules to show 404 on .svn and .git directories - Set read-only attribute in commondata - Fix access restrictions and use proper data directory in check.php script - Fix logo file in Utils/FrontPage - Properly sanitize language variable in setup.php script - Fix get_access method to respect temporary user_id changes - Fix icon in RecordBrowser for different template - Extend session_id length - Allow filtering of custom status in task applet - Fix commondata edit form - do not allow to override values - Remove unused code that caused performance issues in CRM/Filters - Do not validate form in RB during soft submit - Fix Related Notes company addon - Fix module_manager to generate proper list of module requirements - Fix some issues in reset pass script - Fix TCPDF top margin when logo is set but it's hidden - Update translations RELEASE 1.5.5-20140113 ---------------------- - Fix recurrent meeting issue in Activities tab - [Forum thread](http://forum.epe.si/viewtopic.php?f=6&t=2023) - Fix "Paste company data" button - [Forum 
thread](http://forum.epe.si/viewtopic.php?f=6&t=2026) - Add option to use "Reply-to" header in SMTP settings - Fix BBCode url matching - Remove ckeditor's internal save button, that wasn't used - Fix moving notes - some rare issue with directories - Fix deleting files upon note removal - Update RoundCube version to 0.9.5 - Fix dashboard's tab management - Fix RecordBrowserReports column summary to not show last row doubled - Fix wrong time and date in mobile view - [Forum thread](http://forum.epe.si/viewtopic.php?f=6&t=1925) - Add new possible Home Page - company of current user - Check access when copying company data into contact - Clean up include path - Fix creating new contact - [Forum thread](http://forum.epe.si/viewtopic.php?f=6&t=2082) - Fix calendar event with duration less than 1h - Several fixes for PostgreSQL engine - Fix broken Contact's template (#2) - Fix printing all records from RecordBrowser - Fix watchdog email notifications (#3) - Update translations RELEASE 1.5.4-rev11060 (20131015) --------------------------------- - update translations - bugfixes to problems reported since original 1.5.4 release RELEASE 1.5.4-rev11044 (20131014) --------------------------------- - RoundCube 0.9.4 **Warning** New RoundCube client requires PDO extension enabled in php.ini and PHP 5.2.1 or greater. 
When using MySQL database it **requires PHP version 5.3 or higher** - fixed bugs in RecordBrowser and Attachments - changed admin view for currencies - do not report E_DEPRECATED errors - PHP 5.5.x [deprecates some features](http://php.net/manual/en/migration55.deprecated.php) used by Smarty templating engine - EPESI - RoundCube archiving fixes - RoundCube imap cache fixes - fix RecordBrowser's field edit error when param is empty - use reply-to header as default when sending emails from EPESI - fix time issues in mobile view - [Forum thread](http://forum.epe.si/viewtopic.php?f=6&t=1925#p7132) - improve CSV export RELEASE 1.5.3-rev10944 (20130709) --------------------------------- - fix calendar month view in certain timezone configuration - [forum thread](http://forum.epe.si/viewtopic.php?f=6&t=1523&p=5959#p5959) - fix adding new record - rare issue - add patch to create one of the ban variables - sometimes after installation admin could get error "undefined variable" - fix template html for launchpad - fix deprecated hook name in RoundCube EPESI plugin - fix leightbox js issues - fix searching for a lot of records - sort meetings in activities tab - fix issues with field names in record's history - add filtering for currency field - RBO - add set_style method for field definition, add get_access method to recordset - fix add note from table view and record view - [forum thread](http://forum.epe.si/viewtopic.php?f=6&t=1760) - updated translations RELEASE 1.5.2-rev10766 (20130513) --------------------------------- - Full version of CKEditor included. - Fixed bugs: - commondata field created by user was causing error during search - [Forum thread](http://forum.epe.si/viewtopic.php?f=6&t=1678) - tooltips in calendar events were broken - [Polish forum thread](http://forum.epe.si/viewtopic.php?f=25&t=1685) - print browse mode of company or contact field didn't indicate record type. 
- Icon of company or contact field has been changed to text indicator ([Company] / [Contact]) in some places. It's related to third bug listed above. RELEASE 1.5.1-rev10757 (20130508) --------------------------------- - A new version of CKEditor - Fixed bug in Utils/Attachments - user was unable to edit note using Firefox. Now notes edit box is always on top of the notes. - Updated translations RELEASE 1.5.0-rev10738 (20130424) ------------------------------- USER PERSPECTIVE - new RoundCube email client - new CKEditor version - modern look & feel - click2fill appearance and help improvements - multiple attachments per note - shoutbox improvements - click to address person, changed user labels, tab+enter to send - company or contact suggestbox - show icon based on type, always display several records from both recordsets - watchdog - subscribe to categories (by default only for managers) - sort mails archived in EPESI by thread ADMIN PERSPECTIVE - User ban system improvements and restore controls in administrator panel - add option to disable EPESI store to faster module administration launch - changed install process - allow translating from first screen - allow run /admin tools before Base installation - add option to set security in smtp server settings - improved RecordBrowser fields administration - changed HomePage mechanism - allow to set default home page for specific group of users - link from Administrator panel to /admin tools - add EPESI shell in /admin tools - disabled by default - add patch utility in /admin tools DEVELOPERS PERSPECTIVE - RecordBrowser - allow disable "jump to record" - RecordBrowser - add autonumbering field type - new types for RBO - company, contact, employee, company or contact, email, time, phone - allow to translate strings from smarty templates SYSTEM - RoundCube 0.8.2 with several EPESI integration fixes - CKEditor 4.0.2 - optimize startup time - allow to translate /admin tools - interactive help system - fixed automulti 
suggestboxes to display all selected fields - attachments bug fixes - display errors by default (config.php) - RecordBrowser - fix permission check issues - fix search engine for contacts and companies - partial rewrite to jQuery (we are going to remove Prototype) - several PostgreSQL fixes (thanks to forum user - pedro42) - fixed EpesiStore on PHP 5.2.6 - [php.net](https://bugs.php.net/bug.php?id=45028) - add option to store session in files instead of database - appearance bug fixes - translations improved - more string have been marked to translate - clean up some parts of code IMPORTANT NOTES - PHP 5.2.0 is not supported due to bug in json_decode function. (PHP >= 5.2.1 and PHP < 5.2.0 works)
e16085c45e75ad35c91994a12a1ffde1c3ada24a
6,049
md
Markdown
Vitae/VH/47.md
alfonsodepalencia/alfonsodepalencia.github.io
13bd39b3ae941809940d740274c58fd6e2501d56
[ "MIT" ]
---
layout: edition
titulo: Vida de Hanníbal
permalink: /Vitae/VH/47
paginas: <li><a href="1.html">1</a></li><li><a href="2.html">2</a></li><li><a href="3.html">3</a></li><li><a href="4.html">4</a></li><li><a href="5.html">5</a></li><li><a href="6.html">6</a></li><li><a href="7.html">7</a></li><li><a href="8.html">8</a></li><li><a href="9.html">9</a></li><li><a href="10.html">10</a></li><li><a href="11.html">11</a></li><li><a href="12.html">12</a></li><li><a href="13.html">13</a></li><li><a href="14.html">14</a></li><li><a href="15.html">15</a></li><li><a href="16.html">16</a></li><li><a href="17.html">17</a></li><li><a href="18.html">18</a></li><li><a href="19.html">19</a></li><li><a href="20.html">20</a></li><li><a href="21.html">21</a></li><li><a href="22.html">22</a></li><li><a href="23.html">23</a></li><li><a href="24.html">24</a></li><li><a href="25.html">25</a></li><li><a href="26.html">26</a></li><li><a href="27.html">27</a></li><li><a href="28.html">28</a></li><li><a href="29.html">29</a></li><li><a href="30.html">30</a></li><li><a href="31.html">31</a></li><li><a href="32.html">32</a></li><li><a href="33.html">33</a></li><li><a href="34.html">34</a></li><li><a href="35.html">35</a></li><li><a href="36.html">36</a></li><li><a href="37.html">37</a></li><li><a href="38.html">38</a></li><li><a href="39.html">39</a></li><li><a href="40.html">40</a></li><li><a href="41.html">41</a></li><li><a href="42.html">42</a></li><li><a href="43.html">43</a></li><li><a href="44.html">44</a></li><li><a href="45.html">45</a></li><li><a href="46.html">46</a></li><li><a href="47.html">47</a></li><li><a href="48.html">48</a></li><li><a href="49.html">49</a></li><li><a href="50.html">50</a></li><li><a href="51.html">51</a></li><li><a href="52.html">52</a></li><li><a href="53.html">53</a></li><li><a href="54.html">54</a></li><li><a href="55.html">55</a></li><li><a href="56.html">56</a></li><li><a href="57.html">57</a></li><li><a href="58.html">58</a></li><li><a href="59.html">59</a></li><li><a href="60.html">60</a></li><li><a href="61.html">61</a></li><li><a href="62.html">62</a></li><li><a href="63.html">63</a></li><li><a href="64.html">64</a></li><li><a href="65.html">65</a></li><li><a href="66.html">66</a></li><li><a href="67.html">67</a></li><li><a href="68.html">68</a></li><li><a href="69.html">69</a></li><li><a href="70.html">70</a></li><li><a href="71.html">71</a></li><li><a href="72.html">72</a></li><li><a href="73.html">73</a></li><li><a href="74.html">74</a></li><li><a href="75.html">75</a></li><li><a href="76.html">76</a></li><li><a href="77.html">77</a></li><li><a href="78.html">78</a></li><li><a href="79.html">79</a></li><li><a href="80.html">80</a></li><li><a href="81.html">81</a></li><li><a href="82.html">82</a></li><li><a href="83.html">83</a></li><li><a href="84.html">84</a></li><li><a href="85.html">85</a></li><li><a href="86.html">86</a></li><li><a href="87.html">87</a></li><li><a href="88.html">88</a></li><li><a href="89.html">89</a></li><li><a href="90.html">90</a></li><li><a href="91.html">91</a></li><li><a href="92.html">92</a></li><li><a href="93.html">93</a></li><li><a href="94.html">94</a></li><li><a href="95.html">95</a></li><li><a href="96.html">96</a></li>
panel_left: |
  <p><span class="seg">47.1</span> Postero die frequens senatus Hannibali datus est a Campanis, in quo ille gratissimis uerbis audientium impleuit aures, multa promittens multa suadens, quae facile credebant Campani, <span class="tooltip">proptereaque<span class="tooltiptext">Preterea quȩ <span class="siglas">U</span> </span></span> sibi opinionis errore de Italiae principatu sperabant. <span class="seg">2</span> Quare ita turpiter se submiserunt Poeno, <span class="tooltip">ut<span class="tooltiptext">et <span class="siglas">F</span> </span></span> quasi libertatis obliti, non socium in <span class="tooltip">urbem<span class="tooltiptext">urbe <span class="siglas">U</span> </span></span>, sed dominum accepisse uiderentur. <span class="seg">3</span> Quin etiam praeter alia petenti Hannibali sibi tradi Decium Magium <span class="tooltip">principem<span class="tooltiptext"><span class="om"><i>om. </i></span> <span class="siglas">U</span> </span></span> factionis aduersae non modo seruili decreto est senatus assensus, <span class="seg">4</span> sed etiam passus est, ut spectante populo catenis uinctus in castra duceretur uir antiquae societatis memor reique publicae magis quam barbaris gentibus affectus ciuis. </p>
panel_right: |
  <p><span class="seg">47.1</span> Otro día diose a <span class="persName">Hanníbal</span> grand concurso del senado capuano, y en aquel ayuntamiento él con muy gratas palabras pudo contentar y hinchir las orejas de los oyentes, prometiendo muchas cosas y muchas amonestando, y los capuanos prestamente las creyeron porque, errados en su opinión, speravan aver el principado de Ytalia. <span class="seg">2</span> Y d’esta causa tan torpemente se sometieron al carthaginés que, quasi olvidada su libertad, pareçía averle ellos reçebido en la çibdad no por compañero, sino por señor. <span class="seg">3</span> Y aun allende de otras cosas, demandando <span class="persName">Hanníbal</span> que le entregassen a Decio Magio, principal del otro vando contrario, no solamente por servil decreto el senado lo consentió, <span class="seg">4</span> mas aun padeçió que a vista <a href="../public/images/1491/174r.png" target="new"><img class="facs" src="https://alfonsodepalencia.github.io/Vitae/public/images/facs_icon.jpg"/></a>[174r,a] del pueblo llevassen atado en cadenas al real aquel varón por se acordar más de la antigua compañía y del dever de la cosa pública como çibdadano affecçionado al bien que de se contentar de las gentes bárbaras.</p>
---

Technical Description
e160e6bdc9a374937e743e58f58f86852511d423
501
md
Markdown
routing-hands-on/02/01_hands-on/README.md
Paalchrb/golang-web
53bffcf8273aefbfbd73d8809037b96b0b6eb5fb
[ "MIT" ]
33
2021-01-05T06:34:47.000Z
2022-03-30T09:48:52.000Z
routing-hands-on/02/01_hands-on/README.md
Paalchrb/golang-web
53bffcf8273aefbfbd73d8809037b96b0b6eb5fb
[ "MIT" ]
10
2021-04-20T18:55:56.000Z
2022-03-05T07:27:02.000Z
# Create a basic server using TCP

The server should use net.Listen to listen on port 8080. Remember to close the listener using defer.

Remember that from the "net" package you first need to LISTEN, then you need to ACCEPT an incoming connection.

Now write a response back on the connection. Use io.WriteString to write the response: I see you connected. Remember to close the connection.

Once you have all of that working, run your TCP server and test it from telnet (telnet localhost 8080).
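The steps above can be sketched as follows. This is a minimal sketch, not the official solution: the accept loop runs in a goroutine and a small self-test client stands in for telnet so the program exercises itself and exits.

```go
package main

import (
	"bufio"
	"fmt"
	"io"
	"log"
	"net"
)

// greeting is the response the exercise asks the server to send.
const greeting = "I see you connected.\n"

func main() {
	// LISTEN on TCP port 8080; defer closing the listener.
	li, err := net.Listen("tcp", "localhost:8080")
	if err != nil {
		log.Fatal(err)
	}
	defer li.Close()

	// ACCEPT incoming connections in the background, write the
	// response with io.WriteString, and close each connection.
	go func() {
		for {
			conn, err := li.Accept()
			if err != nil {
				return // listener was closed
			}
			io.WriteString(conn, greeting)
			conn.Close()
		}
	}()

	// Self-test in place of telnet: dial the server and read one line.
	conn, err := net.Dial("tcp", "localhost:8080")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	line, err := bufio.NewReader(conn).ReadString('\n')
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(line)
}
```

For the exercise itself, drop the self-test client, keep the accept loop in main, and connect with `telnet localhost 8080` instead.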
33.4
110
0.778443
eng_Latn
0.999532
e161a6d84e5560b7a97177fae9fef6c70e269893
118
md
Markdown
README.md
ace03uec/build-raft
b294871d7e6121a6bc6dc9703fc132e26a941172
[ "MIT" ]
# build-raft

Trying to implement the Raft paper. A few sample projects are expected before implementing the paper as-is.
e161f58f5cda2b27d2a2117e5cd5837408816a4d
952
md
Markdown
docs/introduction/aPage.md
BSafes-Help/BSafes-Help.github.io
fe9d49f7a586a71baeb330f94771a7142613a688
[ "MIT" ]
docs/introduction/aPage.md
BSafes-Help/BSafes-Help.github.io
fe9d49f7a586a71baeb330f94771a7142613a688
[ "MIT" ]
1
2021-05-11T16:29:52.000Z
2021-05-11T16:29:52.000Z
---
layout: page
title: A Page
parent: Introduction
nav_order: 10
---

# A Page
{: .no_toc }

## Table of contents
{: .no_toc .text-delta }

1. TOC
{:toc}

---

## A sample page

![](https://statics.bsafes.com/samplePage.png)

On a page, you can have tags, a title, content, a photo gallery, file attachments, and comments. In the content, you can include text, images, videos, etc. BSafes provides an intuitive rich-text editor for editing content.

Every bit of data on a page is encrypted on your device with your own key before being sent to the cloud, so no one else can see your data, not even us at BSafes.

Every page update creates a new page version, so you can always see what the page looked like in previous versions. Version control is very useful for team collaboration.

## Encrypted Data Snippet

![](https://statics.bsafes.com/encryptedDataSnippet.png)

You can see your encrypted data snippet by clicking on the lens in the lower-left corner of a page.
e16225fe8813e032e300dd31f99f9ef1a4b46582
652
md
Markdown
README.md
Lucker25/MMM-SonosController
ff3c1c28b4e3e1e11e1a7ebdafa44d0a1e0385bb
[ "MIT" ]
    {
        module: "MMM-SonosController",
        header: "SONOS",
        position: "top_right",
        config: {
            showFavorites: true, // shows favorites, or doesn't
        }
    },

SonosController is based on the MMM-Sonos module by tbouron (https://github.com/tbouron/MMM-Sonos). The NodeHelper has been edited a little bit; the MMM-SonosController is completely new.

The module adds control functionality to the MagicMirror module. Right now it only works with single Sonos rooms (I only have a one-room setup), and only for the selection part where you can pick a playlist/radio stream.

It's mainly tested with Spotify and German radio streams (TuneIn). I'll probably add Amazon playlists soon.
e1638315ce7f01ee189dd490c3ecdc5a5fe7e057
885
md
Markdown
_posts/2011-04-21-document-management-system.md
BlogToolshed50/wellnessmanager.info
1fe9abf2247ee71f84e83dde192888813364ebb8
[ "CC-BY-4.0" ]
---
id: 101
title: Document Management System
date: 2011-04-21T13:26:15+00:00
author: admin
layout: post
guid: http://www.wellnessmanager.info/2011/04/21/document-management-system/
permalink: /2011/04/21/document-management-system/
categories:
  - General
---

As a business owner who wants to secure your business documents and is looking for the best software product, a [document management system](http://www.dokmee.net/) is the right choice for you. Dokmee.net offers Dokmee, document management software that helps you organize, secure, and manage your business documents without any problem. They also offer different editions, such as a client version and a web version, so you can select one based on your business needs. Dokmee is easy to use, and you can access your documents online from anywhere at any time. Use their document management software and enjoy all of its facilities.
e16428ab51905cd773c688770dfc7e97e36f89b4
30
md
Markdown
README.md
Mohinesh27/timepass
d837358f54682ce2d7477d480f39bc7d8484d5bf
[ "MIT" ]
# timepass

Ionic Test Project
e164c21a8dbaa951da15b92a82dbc31a3a40e3bb
8,675
md
Markdown
vendor/acquia/blt/readme/testing.md
dragos-dumi/mysite-artifact
07f77b16aa3ee8d8dc1b159ac66b91e17af6f4df
[ "Apache-2.0" ]
1
2019-01-18T01:18:33.000Z
2019-01-18T01:18:33.000Z
# Testing Software testing has been around for decades, and it has been proven to provide many crucial benefits, including: * Reduce the number of bugs and regressions * Increase project velocity (in the long run) * Improves accuracy of scheduling estimates * Saves time and money * Increase user trust and satisfaction You should use automated testing. Do not fall prey to common rationalizations and excuses relating to insufficient time, money, or resources. Time spent developing tests is repaid ten fold. > Quality is the ally of schedule and cost, not their adversary. If we have to sacrifice quality to meet schedule, it’s because we are doing the job wrong from the very beginning. > > -- <cite>James A. Ward</cite> > The bitterness of poor quality remains long after the sweetness of meeting the schedule has been forgotten. > > -- <cite>Karl Wiegers</cite> That being said, two important pitfalls should be acknowledged: 1. It is possible to do automated testing incorrectly such that it is too expensive. See [Why Test Automation Costs Too Much](http://testobsessed.com/2010/07/why-test-automation-costs-too-much/). 2. It is possible to write automated tests that have little value. To avoid these pitfalls, follow the best practices outlined in sections below. ### Resources * [Realizing quality improvement through test driven development](http://research.microsoft.com/en-us/groups/ese/nagappan_tdd.pdf) * [Why Test Automation Costs Too Much](http://testobsessed.com/2010/07/why-test-automation-costs-too-much/). ## Test directory structure This directory contains all projects tests, grouped by testing technology. For all configuration related to builds that actually run these tests, please see the [build](/build) directory. tests ├── behat - contains all Behat tests │ ├── features │ │ ├── bootstrap │ │ └── Example.feature │ ├── behat.yml - contains behat configuration common to all behat profiles. 
│ └── integration.yml - contains behat configuration for the integration profile, which is used to run tests on the integration environment. ├── jmeter - contains all jMeter tests └── phpunit - contains all PHP Unit tests ## Executing tests Before attempting to execute any tests, verify that composer dependencies are built by running `composer install` in the project root. The following testing commands are available: * `blt tests:all` * `blt tests:behat` * `blt tests:phpunit` * `blt tests:security-updates` ### Modifying test targets See [Extending BLT](extending-blt.md#target-configuration) for more information on overriding default configuration values. For more information on the commands, run: * `./vendor/bin/phpunit --help` * `./vendor/bin/behat --help` ## Behat The high-level purpose BDD is to create a strong connection between business requirements and the actual tests. Behat tests should mirror ticket acceptance criteria as closely as possible. Consequently, proper Behat tests should be written using business domain language. The test should be comprehensible by the stakeholder and represent a clear business value. It should represent a typical user behavior and need not be an exhaustive representation of all possible scenarios. See referenced materials for more information on BDD best practices. ### Testing individual features or scenarios To execute a single feature: blt tests:behat -D behat.paths=${PWD}/tests/behat/features/Examples.feature # Relative paths are assumed to be relative to tests/behat/features. blt tests:behat -D behat.paths=Examples.feature To execute a single scenario: blt tests:behat -D behat.paths=${PWD}/tests/behat/features/Examples.feature:4 # Relative paths are assumed to be relative to tests/behat/features. blt tests:behat -D behat.paths=Examples.feature:4 Where "4" is the line number of the scenario in the feature file. 
To execute the tests directly (without BLT) see the following examples: * `./vendor/bin/behat -c tests/behat/local.yml tests/behat/features/Examples.feature -p local` ### Configuration Configuration for the BLT Behat commands is stored in the `behat` configuration variable. You can modify the behavior of the BLT `tests:behat` target by customizing this configuration. See [Extending BLT](extending-blt.md) for more information on overriding configuration variables. Behat's own configuration is defined in the following files: * tests/behat/behat.yml * tests/behat/example.local.yml * tests/behat/local.yml #### Screenshots for failed steps BLT includes the Behat [ScreenshotExtension](https://github.com/elvetemedve/behat-screenshot), configured by default to store a screenshot of any failed step locally. You can configure the extension globally under the `Bex\Behat\ScreenshotExtension` key in `tests/behat/behat.yml`, or override locally inside `tests/behat.local.yml`. Read through the [ScreenshotExtension documentation](https://github.com/elvetemedve/behat-screenshot#configuration) to discover how to change where images are saved, disable the extension, or change the screenshot taking mode. ### Best practices * Behat tests must be used behaviorally. I.E., they must use business domain language. * Each test should be isolated. E.g., it should not depend on conditions created by another test. In pratice, this means: * Resetting testing environment via CI after test suite runs * Defining explicit cleanup tasks in features * @todo add examples of good and bad features ### Common mistakes * Writing Behat tests that do not use business domain language. * Tests are not sufficiently isolated. Making tests interdependent diminishes their value! * Writing tests that are exhaustive of all scenarios rather than representative of a typical scenario. * Writing Behat tests when a unit test should be employed. 
### Resources

* [Cucumber - Where to start?](https://github.com/cucumber/cucumber/wiki/Cucumber-Backgrounder#where-to-start) Note that Cucumber is simply a Ruby-based BDD library, whereas Behat is a PHP-based BDD library. Best practices for test writing apply to both.
* [The training wheels came off](http://aslakhellesoy.com/post/11055981222/the-training-wheels-came-off)

## PHPUnit

Project-level, functional PHPUnit tests are included in `tests/phpunit`. Any PHPUnit tests that affect specific modules or application-level features should be placed in the same directory as that module, not in this directory.

### Best practices

* Tests should not contain any control statements.
* Be careful to make both positive and negative assertions of expectations.
* @todo add examples of good and bad tests

### Common mistakes

* Writing unit tests that are not independent.
* Making unit tests too large. Tests should be small and granular.
* Asserting only positive conditions. Negative assertions should also be made.

### Resources

* [Drupal's implementation of PHPUnit](https://www.drupal.org/phpunit)
* [Presentations on PHPUnit](https://phpunit.de/presentations.html)
* [Test Driven Development: By Example (book)](http://www.amazon.com/dp/0321146530)
* [xUnit Test Patterns: Refactoring Test Code (book for the really serious)](http://amazon.com/dp/0131495054)
* [Unit testing: Why bother?](http://soundsoftware.ac.uk/unit-testing-why-bother/)

### Configuration

You can customize the `tests:phpunit` command by [customizing the configuration values](extending-blt.md#modifying-blt-configuration) for the `phpunit` key.

Each row under the `phpunit` key should contain a combination of the following properties:

* config: path to either the core phpunit configuration file (docroot/core/phpunit.xml.dist) or a custom one. If left blank, no configuration will be loaded with the unit test.
* path: the path to the custom phpunit test
* group: run tests only tagged with a specific `@group`
* exclude: run tests excluding any tagged with this `@group`
* filter: allows a text filter for tests; see the [documentation](https://phpunit.de/manual/current/en/textui.htm) for more

```yml
phpunit:
  - config: ${docroot}/core/phpunit.xml.dist
    group: 'example'
  - config: ${docroot}/core/phpunit.xml.dist
    exclude: 'mylongtest'
    group: 'example'
  - config: ${docroot}/core/phpunit.xml.dist
    path: ${docroot}/modules/custom/example
```

## Frontend Testing

BLT supports a `frontend-test` target that can be used to execute a variety of testing frameworks. Examples may include Jest, Jasmine, Mocha, Chai, etc.

### Configuration

You can [customize the configuration values](extending-blt.md#modifying-blt-configuration) for the `frontend-test` key to enable this capability of BLT.
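As a sketch of what that might look like, the `frontend-test` key could be configured along these lines. Note that the `dir` and `command` key names below follow the pattern of the `phpunit` example above but are assumptions, not BLT's documented schema; check the Extending BLT documentation for the authoritative keys:

```yml
# Hypothetical example: run a Jest suite from a custom theme directory
# whenever BLT's frontend-test target fires.
frontend-test:
  dir: ${docroot}/themes/custom/mytheme
  command: 'npm run test'
```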
44.948187
333
0.769222
eng_Latn
0.993491
e1656f12d4152b285e5e73f3f1acef399b83b30a
8,828
md
Markdown
health/nutrition/nutrition.md
cthoyt/knowledge
955955aefb92c95af8d9937d68a7bebcb0333057
[ "CC-BY-4.0" ]
null
null
null
health/nutrition/nutrition.md
cthoyt/knowledge
955955aefb92c95af8d9937d68a7bebcb0333057
[ "CC-BY-4.0" ]
null
null
null
health/nutrition/nutrition.md
cthoyt/knowledge
955955aefb92c95af8d9937d68a7bebcb0333057
[ "CC-BY-4.0" ]
null
null
null
# Nutrition

[This book](https://www.goodreads.com/book/show/25663961-how-not-to-die) contains all the evidence needed to prove that a whole-food, plant-based diet is the best thing you can do to mitigate disease and live a healthy life. The author also has a [website](https://nutritionfacts.org) where he blogs about all things nutrition and health.

Knowing this, I eat a [whole-food, plant-based diet](foods.md), [supplementing](supplements.md) where necessary. I only drink water and tea, with the occasional coffee. No sugar in food/drinks.

I love exploring [new and interesting vegan recipes](recipes.md).

## Notes

- Minimize high-GI (glycemic index) foods. They spike blood sugar and insulin.
- Eat many healthy fats: avocados, flax seed oil, etc.
- Eat lots of high-quality protein.
- [The “best” diet is a theme: an emphasis on vegetables, fruits, whole grains, beans, lentils, nuts, seeds, and plain water for thirst.](http://www.grubstreet.com/2018/03/ultimate-conversation-on-healthy-eating-and-nutrition.html)
- [Proteins stimulate IGF-1](https://nutritionfacts.org/video/protein-intake-and-igf-1-production/)
- Avoid dairy.
- The Academy of Nutrition and Dietetics, the largest professional organization for nutrition and dietary professionals, says [plant-based diets can be safe and healthy at all stages of life](https://www.ncbi.nlm.nih.gov/m/pubmed/27886704/). Additionally, a sizeable percentage of Seventh Day Adventists eat a plant-based diet, and population studies show that [Adventists have overall excellent health](https://en.m.wikipedia.org/wiki/Adventist_Health_Studies). Evidence and scientific consensus are solidly on the side of plant-based diets being healthy.
- [So think about it this way, all of the animals people eat are herbivores, they eat plants only. All of the nutrients in animal products come from the plants they eat. All you're doing is cutting out the middle man and putting those plant nutrients directly into your body. On the B12 issue, most animal feeds are supplemented with it because they no longer graze on pasture where they would get it naturally. So again, you can take the supplement yourself or through an intermediary.](https://www.reddit.com/r/PlantBasedDiet/comments/ahyhaf/i_am_conflicted_on_who_is_correct_regarding_a/)
- Final note: according to Michael Pollan, unless you are actively ill with diabetes, heart disease, or some other ailment this diet can help, there is a negligible difference between a full whole-foods vegan diet and someone who eats up to 10% of their calories from animal products. Personally I wouldn't, for moral reasons. It's anecdotal, but I reversed pre-diabetes and lost 40 lbs eating this way, and I have all of my blood levels done yearly. ALL of my markers are significantly better than before I was plant-based.
- Highlights from the [high-carb diet may explain why Okinawans live so long article](http://www.bbc.com/future/story/20190116-a-high-carb-diet-may-explain-why-okinawans-live-so-long):
    - [strong social bonds → observed to be beneficial to bodily defence against stress](https://news.ycombinator.com/item?id=18953408)
    - sweet potato, rather than rice, is the high-consumption food in the Okinawan diet
    - high engagement in agriculture and fishing jobs → high physical activity
    - possible effects of genetics — high presence of FOXO3 gene
    - 10-to-1 carbs-to-protein consumption ratio, like in other particularly-long-lived populations – linked studies support the conclusion, though too early to judge definitively
    - most of the diet is vegetables and fruits; meats and fish are rare
    - calorie consumption is, on average, 13% lower than the general population
    - studies suggest that plant-based protein intake has a more positive effect on the human organism than meat- and fish-based
    - their diet is not the "elixir of youth", as multiple interacting factors may be in play
- [We can argue about carbohydrates endlessly, and I have to no avail in the past, but there is no dietary need for sugar at all. We tolerate some of it, but the less of it you eat the better off your body will be.](https://news.ycombinator.com/item?id=25570376)
- [The structure of amino acids requires nitrogen, and there is no nitrogen in fatty acids - only carbon and hydrogen. However, we can recycle nitrogen that is already in the body. There has been a documented fast of over a year for an obese person with no documented ill effects.](https://www.reddit.com/r/nutrition/comments/mfqcxw/can_the_human_body_produce_a_protein_from_stored/)

## Links

- [The Last Conversation You’ll Ever Need to Have About Eating Right](http://www.grubstreet.com/2018/03/ultimate-conversation-on-healthy-eating-and-nutrition.html)
- [Nutrition Facts](https://nutritionfacts.org/) - Has a bias towards vegan foods, but a whole-food, plant-based diet is really the way to go.
- [Examine](https://examine.com/) - Unbiased source on nutrition and supplements.
- [Joel Fuhrman - How Processed Food is Killing Us and What We Can Do About It (2018)](https://www.youtube.com/watch?v=gBGnX8aLc6A)
- [Lighter](https://www.lighter.world/welcome) - Shows you what food to buy and how to throw great meals together, based on the recommendations of food leaders.
- [Latest Low-Carb Study: All Politics, No Science (2018)](https://www.psychologytoday.com/us/blog/diagnosis-diet/201809/latest-low-carb-study-all-politics-no-science)
- [HN: I am conflicted on who is correct regarding a healthy diet. Need help (2019)](https://news.ycombinator.com/item?id=18953398#18953437)
- [The Growth Of Mental Illness Cause By These Foods (2018)](https://www.youtube.com/watch?v=D98KeBAuxzc)
- [Plant Positive](http://plantpositive.com/) - Making the case for plant-based nutrition.
- [I am conflicted on who is correct regarding a healthy diet. Need help (2019)](https://www.reddit.com/r/PlantBasedDiet/comments/ahyhaf/i_am_conflicted_on_who_is_correct_regarding_a/)
- [Dr. Rhonda Patrick - Why Eating Fish, But Not Omega-3 Supplements, Can Help Prevent Alzheimer’s (2018)](https://overcast.fm/+GMuFZBqY0)
- [Blueberries for a Diabetic Diet and DNA Repair (2019)](https://www.youtube.com/watch?v=CDNyZeD87oc)
- [Best foods to encourage healthy/good gut bacteria? (2019)](https://www.reddit.com/r/Nootropics/comments/chouz4/best_foods_to_encourage_healthygood_gut_bacteria/)
- [Who can share strategies about how you optimize the positive impact of your diet on your microbiome (and thus your health)? (2019)](https://www.reddit.com/r/nutrition/comments/dhq295/who_can_share_strategies_about_how_you_optimize/)
- [What makes a food "inflammatory" and what foods cause systemic inflammation?](https://www.reddit.com/r/nutrition/comments/diw5r9/what_makes_a_food_inflammatory_and_what_foods/)
- [Grim Grains Nutrition guide](https://grimgrains.com/#nutrition)
- [Ask HN: Which is the best book on nutrition? (2019)](https://news.ycombinator.com/item?id=21800737)
- [Ask HN: Most sustainable diet long term? (2019)](https://news.ycombinator.com/item?id=19660819)
- [Zoe](https://joinzoe.com/) - Find the best foods to optimize your metabolism.
- [How to Find Your Best Diet (2020)](https://www.gq.com/story/how-to-find-your-best-diet)
- [What are some underrated or lesser known nutritional tips or changes people can make nutritionally to improve their health? (2020)](https://www.reddit.com/r/nutrition/comments/gkmp2t/what_are_some_underrated_or_lesser_known/)
- [Nutrition Courses](https://www.futurelearn.com/subjects/healthcare-medicine-courses/nutrition)
- [Precision Nutrition](https://www.precisionnutrition.com/) - Nutrition certification, coaching & software.
- [Precision Nutrition course](https://www.precisionnutrition.com/nutrition-coaching-free-course)
- [What are the best websites for someone that is trying to learn nutrition? (2020)](https://www.reddit.com/r/nutrition/comments/gw8kt0/what_are_the_best_websites_for_someone_that_is/)
- [Is sugar bad for you if your body needs the calories? (2020)](https://www.reddit.com/r/nutrition/comments/gxv7ya/is_sugar_bad_for_you_if_your_body_needs_the/)
- [Why is there no definitive healthy Human Diet? (2020)](https://www.reddit.com/r/nutrition/comments/h7jxvp/why_is_there_no_definitive_healthy_human_diet/)
- [Summary of US nutrition guidelines (2020)](https://news.ycombinator.com/item?id=25570551)
- [Nutrition Reddit Wiki - Suggested Reading](https://www.reddit.com/r/nutrition/wiki/books)
- [If calories = energy, why do some high calorie things not make you satiated and energized while low calorie things do? (2021)](https://www.reddit.com/r/nutrition/comments/messhn/if_calories_energy_why_do_some_high_calorie/)
- [Nutrition Reddit FAQ](https://www.reddit.com/r/nutrition/wiki/faq)
- [Best place to get nutrition data (2021)](https://www.reddit.com/r/nutrition/comments/mg70il/best_place_to_get_nutrition_data/)
131.761194
591
0.786248
eng_Latn
0.950397
e165998bcd100218e710ca3e3486440e2b863ed2
6,246
md
Markdown
treebanks/grc_proiel/grc_proiel-dep-advmod.md
emmettstr/docs
2d0376d6e07f3ffa828f6152d12cf260a530c64d
[ "Apache-2.0" ]
204
2015-01-20T16:36:39.000Z
2022-03-28T00:49:51.000Z
treebanks/grc_proiel/grc_proiel-dep-advmod.md
emmettstr/docs
2d0376d6e07f3ffa828f6152d12cf260a530c64d
[ "Apache-2.0" ]
654
2015-01-02T17:06:29.000Z
2022-03-31T18:23:34.000Z
treebanks/grc_proiel/grc_proiel-dep-advmod.md
emmettstr/docs
2d0376d6e07f3ffa828f6152d12cf260a530c64d
[ "Apache-2.0" ]
200
2015-01-16T22:07:02.000Z
2022-03-25T11:35:28.000Z
---
layout: base
title: 'Statistics of advmod in UD_Ancient_Greek-PROIEL'
udver: '2'
---

## Treebank Statistics: UD_Ancient_Greek-PROIEL: Relations: `advmod`

This relation is universal.

12089 nodes (6%) are attached to their parents as `advmod`.

9780 instances of `advmod` (81%) are right-to-left (child precedes parent).
Average distance between parent and child is 2.35859045413186.

The following 23 pairs of parts of speech are connected with `advmod`: <tt><a href="grc_proiel-pos-VERB.html">VERB</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (8086; 67% instances), <tt><a href="grc_proiel-pos-VERB.html">VERB</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (1048; 9% instances), <tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (880; 7% instances), <tt><a href="grc_proiel-pos-NOUN.html">NOUN</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (691; 6% instances), <tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (288; 2% instances), <tt><a href="grc_proiel-pos-PRON.html">PRON</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (279; 2% instances), <tt><a href="grc_proiel-pos-AUX.html">AUX</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (205; 2% instances), <tt><a href="grc_proiel-pos-NOUN.html">NOUN</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (167; 1% instances), <tt><a href="grc_proiel-pos-PROPN.html">PROPN</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (116; 1% instances), <tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (108; 1% instances), <tt><a href="grc_proiel-pos-NUM.html">NUM</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (49; 0% instances), <tt><a href="grc_proiel-pos-ADP.html">ADP</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (38; 0% instances), <tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (38; 0% instances), <tt><a href="grc_proiel-pos-CCONJ.html">CCONJ</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (27; 0% instances), <tt><a href="grc_proiel-pos-DET.html">DET</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (26; 0% instances), <tt><a href="grc_proiel-pos-AUX.html">AUX</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (15; 0% instances), <tt><a href="grc_proiel-pos-PROPN.html">PROPN</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (11; 0% instances), <tt><a href="grc_proiel-pos-PRON.html">PRON</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (7; 0% instances), <tt><a href="grc_proiel-pos-INTJ.html">INTJ</a></tt>-<tt><a href="grc_proiel-pos-ADV.html">ADV</a></tt> (5; 0% instances), <tt><a href="grc_proiel-pos-INTJ.html">INTJ</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (2; 0% instances), <tt><a href="grc_proiel-pos-CCONJ.html">CCONJ</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (1; 0% instances), <tt><a href="grc_proiel-pos-NUM.html">NUM</a></tt>-<tt><a href="grc_proiel-pos-ADJ.html">ADJ</a></tt> (1; 0% instances), <tt><a href="grc_proiel-pos-VERB.html">VERB</a></tt>-<tt><a href="grc_proiel-pos-CCONJ.html">CCONJ</a></tt> (1; 0% instances).
~~~ conllu
# visual-style 5	bgColor:blue
# visual-style 5	fgColor:white
# visual-style 6	bgColor:blue
# visual-style 6	fgColor:white
# visual-style 6 5 advmod	color:blue
1	ὃ	ὁ	PRON	Pp	Case=Nom|Gender=Masc|Number=Sing|Person=3|PronType=Prs	9	nsubj	_	ref=1.10.1
2	μὲν	μέν	ADV	Df	_	9	discourse	_	ref=1.10.1
3	δὴ	δή	ADV	Df	_	9	discourse	_	ref=1.10.1
4	ὡς	ὡς	SCONJ	G-	_	6	mark	_	ref=1.10.1
5	οὐκ	οὐ	ADV	Df	Polarity=Neg	6	advmod	_	ref=1.10.1
6	ἐδύνατο	δύναμαι	VERB	V-	Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Past|VerbForm=Fin|Voice=Mid	9	advcl	_	ref=1.10.1
7	διαφυγεῖν	διαφεύγω	VERB	V-	Aspect=Perf|Mood=Ind|Number=Sing|Person=1|Tense=Past|VerbForm=Fin|Voice=Act	6	xcomp	_	ref=1.10.1
8	ἦν	εἰμί	AUX	V-	Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Past|VerbForm=Fin|Voice=Act	9	cop	_	ref=1.10.1|LId=1
9	ἕτοιμος	ἕτοιμος	ADJ	A-	Case=Nom|Degree=Pos|Gender=Masc|Number=Sing	0	root	_	ref=1.10.1
~~~


~~~ conllu
# visual-style 8	bgColor:blue
# visual-style 8	fgColor:white
# visual-style 9	bgColor:blue
# visual-style 9	fgColor:white
# visual-style 9 8 advmod	color:blue
1	ὡς	ὡς	SCONJ	G-	_	4	mark	_	ref=1.112.2
2	δὲ	δέ	ADV	Df	_	9	discourse	_	ref=1.112.2
3	οὐκ	οὐ	ADV	Df	Polarity=Neg	4	advmod	_	ref=1.112.2
4	ἔπειθε	πείθω	VERB	V-	Aspect=Imp|Mood=Ind|Number=Sing|Person=3|Tense=Past|VerbForm=Fin|Voice=Act	9	advcl	_	ref=1.112.2
5	ἄρα	ἆρα	ADV	Df	_	4	discourse	_	ref=1.112.2
6	τὸν	ὁ	DET	S-	Case=Acc|Definite=Def|Gender=Masc|Number=Sing|PronType=Dem	7	det	_	ref=1.112.2
7	ἄνδρα	ἀνήρ	NOUN	Nb	Case=Acc|Gender=Masc|Number=Sing	4	obj	_	ref=1.112.2
8	δευτέρα	δεύτερος	ADJ	Mo	Case=Acc|Gender=Neut|Number=Plur	9	advmod	_	ref=1.112.2
9	λέγει	λέγω	VERB	V-	Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin|Voice=Act	0	root	_	ref=1.112.2
10	ἡ	ὁ	DET	S-	Case=Nom|Definite=Def|Gender=Fem|Number=Sing|PronType=Dem	11	det	_	ref=1.112.2
11	γυνὴ	γυνή	NOUN	Nb	Case=Nom|Gender=Fem|Number=Sing	9	nsubj	_	ref=1.112.2
12	τάδε	ὅδε	ADJ	Pd	Case=Acc|Gender=Neut|Number=Plur	9	obj	_	ref=1.112.2
~~~


~~~ conllu
# visual-style 9	bgColor:blue
# visual-style 9	fgColor:white
# visual-style 10	bgColor:blue
# visual-style 10	fgColor:white
# visual-style 10 9 advmod	color:blue
1	ἄλλα	ἄλλος	ADJ	A-	Case=Acc|Degree=Pos|Gender=Neut|Number=Plur	3	amod	_	ref=1.16.2
2	δὲ	δέ	ADV	Df	_	4	discourse	_	ref=1.16.2
3	ἔργα	ἔργον	NOUN	Nb	Case=Acc|Gender=Neut|Number=Plur	4	obj	_	ref=1.16.2
4	ἀπεδέξατο	ἀποδέχομαι	VERB	V-	Aspect=Perf|Mood=Ind|Number=Sing|Person=3|Tense=Past|VerbForm=Fin|Voice=Mid	0	root	_	ref=1.16.2
5	ἐὼν	εἰμί	AUX	V-	Case=Nom|Gender=Masc|Number=Sing|Tense=Pres|VerbForm=Part|Voice=Act	8	cop	_	ref=1.16.2|LId=1
6	ἐν	ἐν	ADP	R-	_	8	case	_	ref=1.16.2
7	τῇ	ὁ	DET	S-	Case=Dat|Definite=Def|Gender=Fem|Number=Sing|PronType=Dem	8	det	_	ref=1.16.2
8	ἀρχῇ	ἀρχή	NOUN	Nb	Case=Dat|Gender=Fem|Number=Sing	4	advcl	_	ref=1.16.2
9	ἀξιαπηγητότατα	ἀξιαφήγητος	ADV	Df	Degree=Sup	10	advmod	_	ref=1.16.2
10	τάδε	ὅδε	ADJ	Pd	Case=Acc|Gender=Neut|Number=Plur	3	appos	_	ref=1.16.2
~~~
78.075
2,921
0.700608
yue_Hant
0.744578
e16682b75c3f193665b0007fa63099784b5556e7
1,651
md
Markdown
CONTRIBUTING.md
Praqma/Praqmatic-Automated-Changelog
128f4863df99026c928f2f6a006ec9d9d5925c6b
[ "MIT" ]
20
2016-04-06T09:19:10.000Z
2021-05-02T11:46:57.000Z
CONTRIBUTING.md
Praqma/Praqmatic-Automated-Changelog
128f4863df99026c928f2f6a006ec9d9d5925c6b
[ "MIT" ]
123
2016-01-24T21:44:03.000Z
2021-10-01T11:36:38.000Z
CONTRIBUTING.md
Praqma/Praqmatic-Automated-Changelog
128f4863df99026c928f2f6a006ec9d9d5925c6b
[ "MIT" ]
10
2015-05-12T13:25:35.000Z
2020-08-27T18:40:48.000Z
# Contributing

* Simple factual changes, typos, wording etc. can be done using pull requests without any further planning.
* You're also welcome to use a GitHub pull request as a collaboration request, for just sharing thoughts etc. Please include a proper description when opening the PR.

## Conceptual and larger changes

We follow our pragmatic workflow using GitHub issues, labels and milestones. Read these two blog posts:

* http://www.praqma.com/stories/a-pragmatic-workflow/
* http://www.praqma.com/stories/milestones-and-officehours

In real life ...

* create an issue explaining your idea
* if you want to work on an issue, it needs to be prioritized by a PO

_If you want to change something - you will always need to interact with a product owner for conceptual and larger changes._

## Releases

A release is a tagged version of the repository following semantic versioning. See [/docs/versioning.md](/docs/versioning.md)

## Responsibilities

Product owners:

* Bue Petersen (GitHub handle: @buep)
* Peers: Jan Krag (GitHub handle: @JKrag), Claus Schneider (GitHub handle: @bicschneider)

_Product owners_ take the daily responsibility for any change:

* ensuring that changes conform and agree with our roadmap and our testing strategy
* accepting pull requests
* managing and planning issues
* making sure site- and office-specific details are cleared with the relevant persons

The product owner will make sure changes comply with the roadmap and that the concept owners agree. The _concept owners_ have the overall vision and make decisions at the roadmap level, but on a daily basis the product owners carry out the decisions.

Concept owner:

* Bue Petersen
33.693878
164
0.783162
eng_Latn
0.997546
e166c13ef3b437dc62a18ba814d2b34749dc47fe
711
md
Markdown
docs/api/@remirror/core-extensions/core-extensions.bulletlistextension.md
fullstackio/remirror
4508a47a5f95d31444fb14f5a9907b902a5c1f36
[ "MIT" ]
1
2021-05-22T06:22:01.000Z
2021-05-22T06:22:01.000Z
docs/api/@remirror/core-extensions/core-extensions.bulletlistextension.md
fullstackio/remirror
4508a47a5f95d31444fb14f5a9907b902a5c1f36
[ "MIT" ]
null
null
null
docs/api/@remirror/core-extensions/core-extensions.bulletlistextension.md
fullstackio/remirror
4508a47a5f95d31444fb14f5a9907b902a5c1f36
[ "MIT" ]
null
null
null
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@remirror/core-extensions](./core-extensions.md) &gt; [BulletListExtension](./core-extensions.bulletlistextension.md)

## BulletListExtension class

<b>Signature:</b>

```typescript
export declare class BulletListExtension extends NodeExtension
```

## Methods

| Method | Modifiers | Description |
| --- | --- | --- |
| [commands({ type, schema })](./core-extensions.bulletlistextension.commands.md) |  |  |
| [inputRules({ type })](./core-extensions.bulletlistextension.inputrules.md) |  |  |
| [keys({ type, schema })](./core-extensions.bulletlistextension.keys.md) |  |  |
33.857143
143
0.66526
yue_Hant
0.319095
e166fc6e395122639cd1244e68b31c401f551b37
11,430
md
Markdown
_posts/web-development/javascript/2019-07-30-js-async-await.md
joshua1988/joshua1988.github.io
c3464b05b196182dc542ee3b6d45b0b44ff24aab
[ "MIT" ]
31
2017-08-14T07:37:58.000Z
2022-03-07T00:32:17.000Z
_posts/web-development/javascript/2019-07-30-js-async-await.md
joshua1988/joshua1988.github.io
c3464b05b196182dc542ee3b6d45b0b44ff24aab
[ "MIT" ]
3
2017-06-05T11:09:37.000Z
2022-03-22T00:08:35.000Z
_posts/web-development/javascript/2019-07-30-js-async-await.md
joshua1988/joshua1988.github.io
c3464b05b196182dc542ee3b6d45b0b44ff24aab
[ "MIT" ]
33
2017-08-14T06:55:30.000Z
2022-03-21T23:51:35.000Z
--- layout: article title: "자바스크립트 async와 await" date: 2019-07-30 11:59:13 +0900 categories: [web-development, javascript] excerpt: "(중급) 자바스크립트 개발자를 위한 async, await 사용법 설명. 쉽게 알아보는 자바스크립트 async await 개념, 사용법, 예제 코드, 예외 처리 방법" image: teaser: posts/web/javascript/async-await.png credit: Hackernoon #name of the person or site you want to credit creditlink: https://hackernoon.com/javascript-async-await-the-good-part-pitfalls-and-how-to-use-9b759ca21cda #url to their site or licensing locale: "ko" # 리플 옵션 comments: true tags: - async await - async await 예제 - async await 사용법 - vue async await - 자바스크립트 async await - javascript 비동기를 동기로 - 자바스크립트 - 자바스크립트 기초 - 자바스크립트 기초 예제 - 자바스크립트 기본 - 자바스크립트 강좌 - 자바스크립트 비동기 - 자바스크립트 비동기 처리 - 자바스크립트 비동기 프로그래밍 - 자바스크립트 비동기 함수 - 자바스크립트 입문 - 자바스크립트 초급 - 자바스크립트 시작하기 - 자바스크립트 코딩 면접 - javascript 코딩 면접 - 자바스크립트 입문 책 - 자바스크립트 서적 - 캡틴판교 - 장기효 - 인프런 - 패스트 캠퍼스 --- {% include toc.html %} ## 들어가며 안녕하세요. 오랜만에 글을 올립니다. 작년에 Promise 글을 작성할 때까지만 해도 Async 편을 작성하는 데까지 이렇게 오랜 시간이 걸릴 거라고는 생각 못 했네요 :) 기존 글에 많은 응원과 댓글 남겨주셔서 감사하게 받아보고 있습니다. 계속 글 구성만 고민하다가 이제서야 글을 작성합니다. 이번 글에서 살펴볼 내용은 자바스크립트의 비동기 처리 시리즈의 마지막 연재물 async & await 문법입니다. 처음 접하시는 분들이 최대한 이해하기 쉽게 코드와 글을 풀어서 작성했으니 재밌게 읽으셨으면 좋겠습니다 :) 그리고, 이번 글을 읽으시려면 꼭 [비동기 처리 및 콜백 함수](https://joshua1988.github.io/web-development/javascript/javascript-asynchronous-operation/)와 [Promise](https://joshua1988.github.io/web-development/javascript/promise-for-beginners/)에 대해 이해하고 계셔야 합니다. 만약 아직 개념을 이해하지 못하셨다면 글을 꼭 읽어보시고 오시는 걸 추천드립니다. 그럼 재밌게 읽으세요! :) ## async & await는 뭔가요? async와 await는 자바스크립트의 비동기 처리 패턴 중 가장 최근에 나온 문법입니다. 기존의 비동기 처리 방식인 콜백 함수와 프로미스의 단점을 보완하고 개발자가 읽기 좋은 코드를 작성할 수 있게 도와주죠. ## 개발자에게 읽기 좋은 코드란? 처음 프로그래밍을 배웠을 때 아래와 같이 변수와 조건문을 사용하셨던 기억이 있으시죠? ```js var user = { id: 1, name: 'Josh' }; if (user.id === 1) { console.log(user.name); // Josh } ``` 이 코드는 `user`라는 변수에 객체를 할당한 뒤 조건문으로 사용자의 아이디를 확인하고 콘솔에 사용자의 `name`을 찍는 간단한 코드입니다. 우리는 이렇게 위에서부터 아래로 한 줄 한 줄 차근히 읽으면서 사고하는 것이 편합니다. 그렇게 프로그래밍을 배웠으니까요. 
## 그래서 읽기 좋은 코드와 async & await가 무슨 상관이죠? 조금 전에 읽고 이해한 방식대로 코드를 구성하는 것이 async, await 문법의 목적입니다. 다음 코드를 한번 볼까요? ```js var user = fetchUser('domain.com/users/1'); if (user.id === 1) { console.log(user.name); } ``` `fetchUser()`라는 메서드를 호출하면 앞에서 봤던 코드처럼 사용자 객체를 반환한다고 해보겠습니다. 그리고 여기서 `fetchUser()` 메서드가 서버에서 사용자 정보를 가져오는 HTTP 통신 코드라고 가정한다면 위 코드는 async & await 문법이 적용된 형태라고 보셔도 됩니다. 이게 대체 무슨 말인지 아래에서 함께 알아보겠습니다 :) ## async & await 맛보기 먼저 앞에서 살펴본 코드를 `logName()`이라는 간단한 함수로 감싸보겠습니다. ```js function logName() { var user = fetchUser('domain.com/users/1'); if (user.id === 1) { console.log(user.name); } } ``` 이제 위 함수를 실행하면 아까와 동일하게 코드가 동작할 겁니다. 자 그리고 여기서 아래와 같이 `async`와 `await`를 추가해주면 ```js async function logName() { var user = await fetchUser('domain.com/users/1'); if (user.id === 1) { console.log(user.name); } } ``` 짜잔. 이게 바로 async await 코드입니다. 혹시 아직 이해가 정확히 안 가더라도 걱정 마세요. 지금부터 차근히 살펴볼게요! :) ## async & await 적용된 코드와 그렇지 않은 코드 자 저희가 조금 전에 본 코드가 대체 어떤 의미인지 한번 알아보겠습니다. 먼저 아까 살펴봤던 `logName()` 함수 코드를 다시 보겠습니다. ```js function logName() { var user = fetchUser('domain.com/users/1'); if (user.id === 1) { console.log(user.name); } } ``` 여기서 `fetchUser()`라고 하는 코드는 서버에서 데이터를 받아오는 HTTP 통신 코드라고 가정했습니다. 일반적으로 자바스크립트의 비동기 처리 코드는 아래와 같이 콜백을 사용해야지 코드의 실행 순서를 보장받을 수 있죠. ```js function logName() { // 아래의 user 변수는 위의 코드와 비교하기 위해 일부러 남겨놓았습니다. var user = fetchUser('domain.com/users/1', function(user) { if (user.id === 1) { console.log(user.name); } }); } ``` 이미 위와 같이 콜백으로 비동기 처리 코드를 작성하는 게 익숙하신 분들이라면 문제가 없겠지만, 이 사고방식에 익숙하지 않은 분들은 고개가 갸우뚱할 겁니다. 그래서 저희가 처음 프로그래밍을 배웠던 그때 그 사고로 돌아가는 것이죠. 아래와 같이 간단하게 생각하자구요. ```js // 비동기 처리를 콜백으로 안해도 된다면.. function logName() { var user = fetchUser('domain.com/users/1'); if (user.id === 1) { console.log(user.name); } } ``` 서버에서 사용자 데이터를 불러와서 변수에 담고, 사용자 아이디가 1이면 사용자 이름을 출력한다. 
이렇게 하려면 async await만 붙이시면 됩니다 :) ```js // async & await 적용 후 async function logName() { var user = await fetchUser('domain.com/users/1'); if (user.id === 1) { console.log(user.name); } } ``` ※참고: 만약 위의 콜백 함수 코드가 와닿지 않는 분들은 [비동기 처리와 콜백 함수](https://joshua1988.github.io/web-development/javascript/javascript-asynchronous-operation/) 글을 꼭 다시 읽어보시고 오세요. ## async & await 기본 문법 이제 async await의 기본 문법을 알아보겠습니다. ```js async function 함수명() { await 비동기_처리_메서드_명(); } ``` 먼저 함수의 앞에 `async` 라는 예약어를 붙입니다. 그러고 나서 함수의 내부 로직 중 HTTP 통신을 하는 비동기 처리 코드 앞에 `await`를 붙입니다. 여기서 주의하셔야 할 점은 비동기 처리 메서드가 꼭 프로미스 객체를 반환해야 `await`가 의도한 대로 동작합니다. 일반적으로 `await`의 대상이 되는 비동기 처리 코드는 [Axios](https://github.com/axios/axios) 등 프로미스를 반환하는 API 호출 함수입니다. ## async & await 간단한 예제 그럼 문법을 좀 더 정확하게 이해하기 위해서 간단한 async await 코드를 보겠습니다. ```js function fetchItems() { return new Promise(function(resolve, reject) { var items = [1,2,3]; resolve(items) }); } async function logItems() { var resultItems = await fetchItems(); console.log(resultItems); // [1,2,3] } ``` 먼저 `fetchItems()` 함수는 프로미스 객체를 반환하는 함수입니다. 프로미스는 "[자바스크립트 비동기 처리를 위한 객체](https://joshua1988.github.io/web-development/javascript/promise-for-beginners/#promise%EA%B0%80-%EB%AD%94%EA%B0%80%EC%9A%94)"라고 배웠었죠. `fetchItems()` 함수를 실행하면 프로미스가 이행(Resolved)되며 결과 값은 `items` 배열이 됩니다. 그리고 이제 `logItems()` 함수를 보겠습니다. `logItems()` 함수를 실행하면 `fetchItems()` 함수의 결과 값인 `items` 배열이 `resultItems` 변수에 담깁니다. 따라서, 콘솔에는 `[1,2,3]`이 출력되죠. `await`를 사용하지 않았다면 데이터를 받아온 시점에 콘솔을 출력할 수 있게 콜백 함수나 `.then()`등을 사용해야 했을 겁니다. 하지만 async await 문법 덕택에 비동기에 대한 사고를 하지 않아도 되는 것이죠. 
※참고: 만약 위 코드가 왜 비동기 처리 코드인지 잘 이해가 안 가신다면 `fetchItems()`를 아래의 함수들로 바꿔서 실행해보셔도 괜찮습니다 :) ```js // HTTP 통신 동작을 모방한 코드 function fetchItems() { return new Promise(function(resolve, reject) { setTimeout(function() { var items = [1,2,3]; resolve(items) }, 3000); }); } // jQuery ajax 코드 function fetchItems() { return new Promise(function(resolve, reject) { $.ajax('domain.com/items', function(response) { resolve(response); }); }); } ``` ## async & await 실용 예제 async & await 문법이 가장 빛을 발하는 순간은 여러 개의 비동기 처리 코드를 다룰 때입니다. 아래와 같이 각각 *사용자*와 *할 일 목록*을 받아오는 HTTP 통신 코드가 있다고 하겠습니다. ```js function fetchUser() { var url = 'https://jsonplaceholder.typicode.com/users/1' return fetch(url).then(function(response) { return response.json(); }); } function fetchTodo() { var url = 'https://jsonplaceholder.typicode.com/todos/1'; return fetch(url).then(function(response) { return response.json(); }); } ``` 위 함수들을 실행하면 각각 사용자 정보와 할 일 정보가 담긴 프로미스 객체가 반환됩니다. 자 이제 이 두 함수를 이용하여 할 일 제목을 출력해보겠습니다. 살펴볼 예제 코드의 로직은 아래와 같습니다. 1. `fetchUser()`를 이용하여 사용자 정보 호출 2. 받아온 사용자 아이디가 `1`이면 할 일 정보 호출 3. 받아온 할 일 정보의 제목을 콘솔에 출력 그럼 코드를 보겠습니다. ```js async function logTodoTitle() { var user = await fetchUser(); if (user.id === 1) { var todo = await fetchTodo(); console.log(todo.title); // delectus aut autem } } ``` `logTodoTitle()`를 실행하면 콘솔에 *delectus aut autem*가 출력될 것입니다. 위 비동기 처리 코드를 만약 콜백이나 프로미스로 했다면 훨씬 더 코드가 길어졌을 것이고 인덴팅 뿐만 아니라 가독성도 좋지 않았을 겁니다. 이처럼 async await 문법을 이용하면 기존의 비동기 처리 코드 방식으로 사고하지 않아도 되는 장점이 생깁니다. ※참고: 위 함수에서 사용한 `fetch()` API는 크롬과 같은 최신 브라우저에서만 동작합니다. 브라우저 지원 여부는 다음 링크로 확인해보세요. [fetch API 브라우저 지원표](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) ## async & await 예외 처리 async & await에서 예외를 처리하는 방법은 바로 [try catch](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/try...catch)입니다. 프로미스에서 에러 처리를 위해 `.catch()`를 사용했던 것처럼 async에서는 `catch {}` 를 사용하시면 됩니다. 조금 전 코드에 바로 `try catch` 문법을 적용해보겠습니다. 
```js async function logTodoTitle() { try { var user = await fetchUser(); if (user.id === 1) { var todo = await fetchTodo(); console.log(todo.title); // delectus aut autem } } catch (error) { console.log(error); } } ``` 위의 코드를 실행하다가 발생한 네트워크 통신 오류뿐만 아니라 간단한 타입 오류 등의 일반적인 오류까지도 `catch`로 잡아낼 수 있습니다. 발견된 에러는 `error` 객체에 담기기 때문에 에러의 유형에 맞게 에러 코드를 처리해주시면 됩니다. ## 마무리 여태까지 살펴본 내용으로 감을 좀 잡으셨나요? 늘 처음 보는 문법은 완전하게 이해하는데 시간이 필요합니다. 실제로 서비스를 만드실 때 위 내용을 적용해보시면 더 쉽게 체득하실 수 있을거에요. 앞으로 더 프런트엔드의 이벤트와 데이터 처리가 많아질 것이기 때문에 async await에 대해서 정확히 알아놓으시면 도움이 많이 되실 겁니다. 그럼 재밌게 코딩하세요! 감사합니다 😄 <!-- ## async & await 레이스 컨디션 ## async & await 내부 구조 --> #### 다른 시리즈물 확인하기 - [1탄 - 자바스크립트 비동기 처리와 콜백 함수](https://joshua1988.github.io/web-development/javascript/javascript-asynchronous-operation/) - [2탄 - 자바스크립트 Promise 쉽게 이해하기](https://joshua1988.github.io/web-development/javascript/promise-for-beginners/) ## 글보다 더 쉽게 배우는 온라인 강좌 좀 더 친절하고 상세한 설명을 원하신다면 아래 강좌를 이용해보시는 것도 좋을 것 같아요 😄 <figure class="third"> <a href="https://www.inflearn.com/course/Age-of-Vuejs?inst=72986832&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/lv1.png"></a> <a href="https://www.inflearn.com/course/vue-pwa-vue-js-중급?inst=dd3b6c65&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/lv2.png"></a> <a href="https://www.inflearn.com/course/vue-js?inst=c76b3a50&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/lv3.png"></a> <figcaption>인프런 온라인 강의 : Vue.js 시작하기 / Vue.js 중급 / Vue.js 완벽 가이드</figcaption> </figure> <figure class="third"> <a href="https://www.inflearn.com/course/vue-js-끝내기-캡틴판교?inst=2071ec73&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url 
}}/images/posts/web/inflearn/lv4.png"></a> <a href="https://www.inflearn.com/course/프런트엔드-웹팩?inst=747606f7&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/webpack.png"></a> <a href="https://www.inflearn.com/course/pwa?utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/pwa.jpg"></a> <figcaption>인프런 온라인 강의 : Vue.js 끝장내기 / 프런트엔드 개발자를 위한 웹팩 / PWA 시작하기</figcaption> </figure> <figure class="third"> <a href="https://www.inflearn.com/course/타입스크립트-입문?inst=f1ae9299&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/ts1.png"></a> <a href="https://www.inflearn.com/course/타입스크립트-실전?inst=e5a8f85e&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/ts2.png"></a> <a href="https://www.inflearn.com/course/vue-ts?inst=0ced8395&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/vue-ts.png"></a> <figcaption>인프런 온라인 강의 : 타입스크립트 입문 / 실전 프로젝트로 배우는 타입스크립트 / Vue.js + TypeScript 완벽 가이드</figcaption> </figure> <figure class="third"> <a href="https://www.inflearn.com/course/타입스크립트-입문?inst=f1ae9299&utm_source=blog&utm_medium=githubio&utm_campaign=captianpangyo&utm_term=banner" target="_blank"><img src="{{ site.url }}/images/posts/web/inflearn/typescript-beginner-kor.png"></a> <figcaption>인프런 온라인 강의 : 타입스크립트 입문</figcaption> </figure> ## 이해가 잘 안되시나요? 방송에서 직접 물어보세요 :) 매주 토요일 오후 3시 30분에 유튜브 라이브 방송을 진행합니다. 프런트엔드 개발 관련해서 아무거나 여쭤보실 수 있으세요 :) <a href="https://www.youtube.com/watch?v=fL39Yg2H0ig" target="_blank">프런트엔드 개발 상담소 바로가기</a>
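The try/catch section above shows errors from awaited calls being caught like synchronous exceptions. A minimal self-contained sketch of that behavior (the stubbed `fetchUser`, its error message, and the fallback string are invented for illustration; it stands in for a failed HTTP call):

```javascript
// Minimal sketch of async/await error handling.
// fetchUser is a stub that rejects, simulating a failed network request.
function fetchUser() {
  return Promise.reject(new Error('network error'));
}

async function logUserName() {
  try {
    const user = await fetchUser();
    return user.name;
  } catch (error) {
    // the rejected promise surfaces here as a thrown error
    return 'fallback: ' + error.message;
  }
}

logUserName().then(function (name) {
  console.log(name); // prints "fallback: network error"
});
```

Because `await` rethrows a rejection as an ordinary exception, the same `catch` block handles both network failures and plain runtime errors.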
30.48
282
0.694401
kor_Hang
0.99999
e167c9938efa618ba49f312cfb5831ef7b3ce08e
20
md
Markdown
README.md
SevcanAlkan/CSharp-OpenGL-Demo
2604c7f5061d519b00810bccd7208ce6a157179c
[ "MIT" ]
null
null
null
README.md
SevcanAlkan/CSharp-OpenGL-Demo
2604c7f5061d519b00810bccd7208ce6a157179c
[ "MIT" ]
null
null
null
README.md
SevcanAlkan/CSharp-OpenGL-Demo
2604c7f5061d519b00810bccd7208ce6a157179c
[ "MIT" ]
null
null
null
# CSharp-OpenGL-Demo
20
20
0.8
kor_Hang
0.713635
e167d8ca36f9d1f27cdc1fdf7fdad40e26bcb5e4
164
md
Markdown
Packs/CommonTypes/ReleaseNotes/3_2_4.md
cstone112/content
7f039931b8cfc20e89df52d895440b7321149a0d
[ "MIT" ]
2
2021-12-06T21:38:24.000Z
2022-01-13T08:23:36.000Z
Packs/CommonTypes/ReleaseNotes/3_2_4.md
cstone112/content
7f039931b8cfc20e89df52d895440b7321149a0d
[ "MIT" ]
87
2022-02-23T12:10:53.000Z
2022-03-31T11:29:05.000Z
Packs/CommonTypes/ReleaseNotes/3_2_4.md
cstone112/content
7f039931b8cfc20e89df52d895440b7321149a0d
[ "MIT" ]
2
2022-01-05T15:27:01.000Z
2022-02-01T19:27:43.000Z
#### Incident Fields Added xsoar marketplace for these fields: - **Dst Ports** - **Process Path** - **SHA256** - **Src Ports** - **Alert Action** - **Source IPs**
16.4
41
0.621951
eng_Latn
0.437957
e167e22fd07dd9a56d37eab7b0f3e085fb3b3631
2,978
md
Markdown
docs/t-sql/functions/connections-transact-sql.md
thiagoamc/sql-docs.pt-br
32e5d2a16f76e552e93b54b343566cd3a326b929
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/t-sql/functions/connections-transact-sql.md
thiagoamc/sql-docs.pt-br
32e5d2a16f76e552e93b54b343566cd3a326b929
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/t-sql/functions/connections-transact-sql.md
thiagoamc/sql-docs.pt-br
32e5d2a16f76e552e93b54b343566cd3a326b929
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: '@@CONNECTIONS (Transact-SQL) | Microsoft Docs' ms.custom: ms.date: 09/18/2017 ms.prod: sql-non-specified ms.prod_service: sql-database ms.service: ms.component: t-sql|functions ms.reviewer: ms.suite: sql ms.technology: - database-engine ms.tgt_pltfrm: ms.topic: language-reference f1_keywords: - '@@CONNECTIONS' - '@@CONNECTIONS_TSQL' dev_langs: - TSQL helpviewer_keywords: - '@@CONNECTIONS function' - connections [SQL Server], number of - connections [SQL Server], attempted - number of connection attempts - attempted connections ms.assetid: c59836a8-443c-4b9a-8b96-8863ada97ac7 caps.latest.revision: author: edmacauley ms.author: edmaca manager: craigg ms.workload: Inactive ms.openlocfilehash: 8422da0d4e550c99fac6c9659f771ab98196cb4a ms.sourcegitcommit: 45e4efb7aa828578fe9eb7743a1a3526da719555 ms.translationtype: HT ms.contentlocale: pt-BR ms.lasthandoff: 11/21/2017 --- # <a name="x40x40connections-transact-sql"></a>&#x40;&#x40;CONNECTIONS (Transact-SQL) [!INCLUDE[tsql-appliesto-ss2008-xxxx-xxxx-xxx-md](../../includes/tsql-appliesto-ss2008-xxxx-xxxx-xxx-md.md)] Retorna o número de tentativas de conexão, bem-sucedidas ou não, desde a última inicialização do [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)]. ![Ícone de link do tópico](../../database-engine/configure-windows/media/topic-link.gif "Topic link icon") [Convenções da sintaxe Transact-SQL](../../t-sql/language-elements/transact-sql-syntax-conventions-transact-sql.md) ## <a name="syntax"></a>Sintaxe ```sql @@CONNECTIONS ``` ## <a name="return-types"></a>Tipos de retorno **inteiro** ## <a name="remarks"></a>Remarks Conexões são diferentes de usuários. Por exemplo, aplicativos podem abrir várias conexões em [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] sem que o usuário perceba. Para exibir um relatório que contém várias estatísticas do [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)], incluindo tentativas de conexão, execute **sp_monitor**. 
@@MAX_CONNECTIONS é o número máximo de conexões permitidas simultaneamente com o servidor. @@CONNECTIONS é incrementado a cada tentativa de logon e, portanto, @@CONNECTIONS pode ser maior que @@MAX_CONNECTIONS. ## <a name="examples"></a>Exemplos O exemplo a seguir mostra o retorno do número de tentativas de logon a partir da data e hora atuais. ```sql SELECT GETDATE() AS 'Today''s Date and Time', @@CONNECTIONS AS 'Login Attempts'; ``` [!INCLUDE[ssResult](../../includes/ssresult-md.md)] ```sql Today's Date and Time Login Attempts ---------------------- -------------- 12/5/2006 10:32:45 AM 211023 ``` ## <a name="see-also"></a>Consulte também [Funções estatísticas do sistema &#40;Transact-SQL&#41;](../../t-sql/functions/system-statistical-functions-transact-sql.md) [sp_monitor &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/sp-monitor-transact-sql.md)
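The remarks above note that @@CONNECTIONS counts every login attempt and can therefore exceed @@MAX_CONNECTIONS. A small sketch (plain T-SQL, assuming only a running SQL Server instance) that returns both values side by side:

```sql
-- Compare cumulative login attempts with the configured connection ceiling
SELECT @@CONNECTIONS     AS 'Login Attempts',
       @@MAX_CONNECTIONS AS 'Max Simultaneous Connections';
```

The first column grows with every attempted login since the last restart, while the second is a fixed configuration limit.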
35.879518
222
0.718939
por_Latn
0.490951
e16819e24d7b3f8747083bf6d52900d8652850a6
161
md
Markdown
README.md
chriswmann/rr-lon-stock-price-gru
74f3bf057c592573c5affaf8794ea94a6edb6d7d
[ "MIT" ]
null
null
null
README.md
chriswmann/rr-lon-stock-price-gru
74f3bf057c592573c5affaf8794ea94a6edb6d7d
[ "MIT" ]
null
null
null
README.md
chriswmann/rr-lon-stock-price-gru
74f3bf057c592573c5affaf8794ea94a6edb6d7d
[ "MIT" ]
null
null
null
# RR Stock Price GRU A gated recurrent unit (GRU) network which attempts (poorly) to defy the efficient-market hypothesis and predict the price of Rolls-Royce shares.
53.666667
139
0.807453
eng_Latn
0.993508
e16874e4db8bbf3e6eed0f5ab6b643cbd3ac6ff2
875
md
Markdown
README.md
douglasramos/handwritten-digits-detection
4ea3175479e2258527c533d927d27cd27d1aea4b
[ "MIT" ]
null
null
null
README.md
douglasramos/handwritten-digits-detection
4ea3175479e2258527c533d927d27cd27d1aea4b
[ "MIT" ]
null
null
null
README.md
douglasramos/handwritten-digits-detection
4ea3175479e2258527c533d927d27cd27d1aea4b
[ "MIT" ]
null
null
null
# Project This project, still in development, is based on the Python 2 scripts from the book **neuralnetworksanddeeplearning.com** by **Michael Nielsen**, and also on **Michael Daniel Dobrzanski**'s scripts, which are the same as the book's code, but in Python 3. The aim of this project is to reproduce and evolve what was done by the aforementioned works, achieving better performance and accuracy, and making it a software product, with a user interface and new functionalities. ## Road Map - Finish the basic neural network - Improve the accuracy with some other techniques - Read the digit from a generic photo - Read a sequence of digits from a generic photo - Extend to the alphabet - Build a web or desktop user interface ## License Copyright © 2019 Douglas Ramos Distributed under the MIT License, with due credit to the work on which this project was based.
36.458333
467
0.774857
eng_Latn
0.999674
e168b8360a42f15e28fd6244fade1a6c5440cb84
8,407
md
Markdown
bigquery-wordcount/README.md
RajeshThallam/spark-on-k8s-gcp-examples
e5c20182e04d158fb6f6367cdd6a78437748e8f3
[ "Apache-2.0" ]
38
2018-01-13T14:58:13.000Z
2022-01-28T20:41:13.000Z
bigquery-wordcount/README.md
RajeshThallam/spark-on-k8s-gcp-examples
e5c20182e04d158fb6f6367cdd6a78437748e8f3
[ "Apache-2.0" ]
null
null
null
bigquery-wordcount/README.md
RajeshThallam/spark-on-k8s-gcp-examples
e5c20182e04d158fb6f6367cdd6a78437748e8f3
[ "Apache-2.0" ]
27
2018-01-13T14:57:47.000Z
2022-03-29T23:06:12.000Z
# Spark Word Count using BigQuery and Google Cloud Storage This package contains two variants of a Spark word count example, namely, `BigQueryWordCountToGCS` and `BigQueryWordCountToBigQuery`. Both variants use the [BigQuery](https://cloud.google.com/dataproc/docs/connectors/bigquery) and [GCS](https://cloud.google.com/dataproc/docs/connectors/cloud-storage) connectors. Both variants read input data from a BigQuery table such as `publicdata:samples.shakespeare`. The two variants differ in where they write the output data to. One version writes its output to a user-specified GCS bucket at path `/spark/output/wordcount`, and the other writes to a user-specified BigQuery table. Both variants of the example require a GCP service account with the appropriate IAM roles to read and write GCS buckets and objects and to create, read, and write BigQuery datasets and tables. ## Build To build the example, run the following command: ``` mvn clean package ``` This will create a single jar under `target/` named `bigquery-wordcount-<version>-jar-with-dependencies.jar` with the necessary dependencies. This is the jar to be used as the `<application-jar>` in `spark-submit` and must be accessible locally by the driver and executors at runtime. There are two ways of making the jar available locally to the driver and executors. ## Making the Jar Available to the Driver and Executors There are two ways of running this example on [Spark on Kubernetes](https://github.com/apache-spark-on-k8s/spark), depending on how the example jar is shipped. ### Staging The Example Jar using the Resource Staging Server [Spark on Kubernetes](https://github.com/apache-spark-on-k8s/spark) ships with a [Resource Staging Server](https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html#dependency-management) that can be used to stage resources such as jars and files local to the submission machine.
The Spark submission client uploads the resources to the Resource Staging Server, from where they are downloaded by the init-container into the Spark driver and executor Pods so they can be used by the driver and executors. To use it, the Resource Staging Server needs to be deployed to the Kubernetes cluster and the Spark configuration property `spark.kubernetes.resourceStagingServer.uri` needs to be set accordingly. Please refer to the [documentation](https://apache-spark-on-k8s.github.io/userdocs/running-on-kubernetes.html#dependency-management) for more details on how to deploy and use the Resource Staging Server. An example `spark-submit` command when using this option looks like the following: ``` bin/spark-submit \ --deploy-mode cluster \ --master k8s://https://192.168.99.100:8443 \ --kubernetes-namespace default \ --class spark.bigquery.example.wordcount.BigQueryWordCount \ --conf spark.executor.instances=1 \ --conf spark.app.name=bigquery-wordcount \ --conf spark.kubernetes.driver.docker.image=<driver image> \ --conf spark.kubernetes.executor.docker.image=<executor image> \ --conf spark.kubernetes.initcontainer.docker.image=<init-container image> \ --conf spark.kubernetes.driver.secrets.<GCP service account secret name>=<mount path> \ --conf spark.kubernetes.executor.secrets.<GCP service account secret name>=<mount path> \ --conf spark.hadoop.fs.gs.project.id=<GCP project ID> \ --conf spark.hadoop.fs.gs.system.bucket=<Root GCS bucket to use for temporary working and output directories> \ --conf spark.hadoop.google.cloud.auth.service.account.enable=true \ --conf spark.hadoop.google.cloud.auth.service.account.json.keyfile=<Path to GCS service account Json key file> \ --conf spark.kubernetes.resourceStagingServer.uri=<resource staging server URI> \ local:///opt/spark/examples/jars/bigquery-wordcount-1.0-SNAPSHOT-jar-with-dependencies.jar \ publicdata:samples.shakespeare ``` ### Putting The Example Jar into Custom Spark Driver and Executor Images For users who 
prefer not using the Resource Staging Server, an alternative way is to put the example jar into custom-built Spark driver and executor Docker images. Typically the jar gets copied into the `examples/jars` directory of an unzipped Spark distribution, from where the Docker images are to be built. The entire `examples/jars` directory gets copied into the driver and executor images. When using this option, the `<application-jar>` is in the form of `local:///opt/spark/examples/jars/bigquery-wordcount-<version>-jar-with-dependencies.jar`, where the `local://` scheme is needed and means the jar is local to the driver and executor containers. An example `spark-submit` command when using this option looks like the following: ``` bin/spark-submit \ --deploy-mode cluster \ --master k8s://https://192.168.99.100:8443 \ --kubernetes-namespace default \ --class spark.bigquery.example.wordcount.BigQueryWordCount \ --conf spark.executor.instances=1 \ --conf spark.app.name=bigquery-wordcount \ --conf spark.kubernetes.driver.docker.image=<driver image> \ --conf spark.kubernetes.executor.docker.image=<executor image> \ --conf spark.kubernetes.initcontainer.docker.image=<init-container image> \ --conf spark.kubernetes.driver.secrets.<GCP service account secret name>=<mount path> \ --conf spark.kubernetes.executor.secrets.<GCP service account secret name>=<mount path> \ --conf spark.hadoop.fs.gs.project.id=<GCP project ID> \ --conf spark.hadoop.fs.gs.system.bucket=<Root GCS bucket to use for temporary working and output directories> \ --conf spark.hadoop.google.cloud.auth.service.account.enable=true \ --conf spark.hadoop.google.cloud.auth.service.account.json.keyfile=<Path to GCS service account Json key file> \ local:///opt/spark/examples/jars/bigquery-wordcount-1.0-SNAPSHOT-jar-with-dependencies.jar \ publicdata:samples.shakespeare ``` ## Using the BigQuery/GCS Connectors As mentioned above, this example requires a GCP service account Json key file mounted into the driver and
executor containers as a Kubernetes secret volume. Please refer to [Authenticating to Cloud Platform with Service Accounts](https://cloud.google.com/kubernetes-engine/docs/tutorials/authenticating-to-cloud-platform) for detailed information on how to get a service account Json key file and how to create a secret out of it. The service account must have the appropriate roles and permissions to read and write GCS buckets and objects and to create, read, and write BigQuery datasets and tables. As the example commands above show, users can request the secret to be mounted into the driver and executor containers using the following Spark configuration properties: ``` --conf spark.kubernetes.driver.secrets.<GCP service account secret name>=<mount path> \ --conf spark.kubernetes.executor.secrets.<GCP service account secret name>=<mount path> \ ``` The BigQuery and GCS connectors are special in how they use the service account Json key file to authenticate with the BigQuery and GCS services. Specifically, the following two Spark configuration properties must be set: ``` --conf spark.hadoop.google.cloud.auth.service.account.enable=true \ --conf spark.hadoop.google.cloud.auth.service.account.json.keyfile=<Path to GCS service account Json key file> ``` Both connectors also require that the user specify a GCP project ID for billing and a GCS bucket used for temporary storage of data exported from the input BigQuery table and for output, which can be done using the following Spark configuration properties.
``` --conf spark.hadoop.fs.gs.project.id=<GCP project ID> --conf spark.hadoop.fs.gs.system.bucket=<Root GCS bucket to use for temporary working and output directories> ``` ## Monitoring and Checking Logs [Spark on Kubernetes](https://github.com/apache-spark-on-k8s/spark) jobs create a driver Pod and one or more executor Pods named after the Spark application name specified by `spark.app.name`, with a suffix `-driver` for the driver Pod and `-exec-<executor ID>` for the executor Pods. The logs of a driver or executor Pod can be checked using `kubectl logs <pod name>`. ## Known Issues ### Guava Version The Spark on Kubernetes distribution (e.g., the latest release) comes with `guava-14.0.1.jar` under `jars`, which is older than the version used and needed by the GCS/BigQuery connectors. To fix this issue, replace `guava-14.0.1.jar` with a newer version, e.g., `19.0`.
82.421569
1,003
0.779826
eng_Latn
0.979389
e169209145356cc74bd5bb456d994c45869927c6
2,309
md
Markdown
README.md
sinneb/pyo-patcher
764e2a55642520a1e9186976af718cbf86f2d8d5
[ "MIT" ]
5
2019-02-17T09:25:42.000Z
2021-11-15T10:07:13.000Z
README.md
sinneb/pyo-patcher
764e2a55642520a1e9186976af718cbf86f2d8d5
[ "MIT" ]
null
null
null
README.md
sinneb/pyo-patcher
764e2a55642520a1e9186976af718cbf86f2d8d5
[ "MIT" ]
null
null
null
# pyo-patcher ![](http://sinneb.net/pyo-patcher/pyo-patcher-11april.png) Check out the [wiki](https://github.com/sinneb/pyo-patcher/wiki/Welcome-to-the-pyo-patcher-wiki) for step-by-step usage instructions. pyo-patcher is a visual programming environment to create [pyo](http://ajaxsoundstudio.com/software/pyo/) DSP scripts. It runs on a modified version of [Node-RED](https://nodered.org/): flow-based programming for the Internet of Things. This modified version allows, among other things, multiple inputs, does not save state (for multi-user online use), and displays port names. The pyo flows are almost standard Node-RED JSON file exports, which, upon download, are picked up by a local Python compiler script. The compiler translates the JSON flow to a pyo script and runs it. The download - compile - run cycle typically takes less than a second to run, which provides almost-instant feedback to the user. Check it out! You have to run pyo-patcher yourself; the online version is not available anymore. Have "compiler.py" running on your system and point your browser download location to the "livepatches" subfolder. Compiler.py watches this folder for new files. As of now (9th April) there is a nice number of generators available, two effects, MIDI possibilities and a new arithmetic function (multiple). ## Local install - Install Node-RED - Grab the pyo-patcher repo (git clone https://github.com/sinneb/pyo-patcher.git) - Change to the pyo-patcher dir (cd pyo-patcher) - Find your local Node-RED install and its red.min.js file (find /. -name red.min.js) - Overwrite that file (cp node-red-changes/red.js /usr/local/lib/node_modules/node-red/public/red/red.min.js). The modified file is not minified, but the default install uses this file. - Copy the pyo nodes to the local Node-RED folder (cp -r nodes/ ~/.node-red/) - Run Node-RED (node-red) - Open the menu and click "Manage palette". Disable all but "node-red".
Open the nodes under "node-red" and disable the non-pyo ones (all the non-capitalized ones, except "out"). Once I move the pyo nodes to their own library, this process will become a lot easier. - Start the local "compiler.py" (python compiler.py) - Refresh your browser, build something interesting and hit "Deploy". The current flow will be downloaded, picked up, compiled and run.
100.391304
701
0.77003
eng_Latn
0.987522
e16971c33106aa639f91f04722aef0abb97f0997
1,087
md
Markdown
documentation/dataModel.AutonomousMobileRobot/Material/doc/spec.md
Dalma-Systems/broker
07b3f1c2fddc9a4b2e0e731548bd61c5e9bbb67e
[ "MIT" ]
null
null
null
documentation/dataModel.AutonomousMobileRobot/Material/doc/spec.md
Dalma-Systems/broker
07b3f1c2fddc9a4b2e0e731548bd61c5e9bbb67e
[ "MIT" ]
null
null
null
documentation/dataModel.AutonomousMobileRobot/Material/doc/spec.md
Dalma-Systems/broker
07b3f1c2fddc9a4b2e0e731548bd61c5e9bbb67e
[ "MIT" ]
null
null
null
Material: - description: > ## Description This entity contains a harmonised description of a Material. This entity is associated with Warehouse entity. - properties: - batch: - x-ngsi: - type: "Property" - model: "https://schema.org/Text" - type: "string" - description: > A sequence of characters that define the batch identification - mType: - x-ngsi: - type: "Property" - model: "https://schema.org/Text" - type: "string" - description: > A sequence of characters that define the material type - quantity: - x-ngsi: - type: "Property" - model: "https://schema.org/Integer" - type: "integer" - format: "int32" - description: > Number of items of this material available - refWarehouse: - x-ngsi: - type: "Property" - model: "https://schema.org/URL" - type: "string" - format: "URL" - description: > The URL holding the warehouse that has the materials
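As an illustration of the attributes listed above, a Material entity in NGSI-v2 normalized form might look like the following (the entity id, attribute values, and warehouse URL are all invented for illustration):

```json
{
  "id": "urn:ngsi-ld:Material:001",
  "type": "Material",
  "batch": { "type": "Text", "value": "BATCH-2020-04" },
  "mType": { "type": "Text", "value": "steel-beam" },
  "quantity": { "type": "Integer", "value": 42 },
  "refWarehouse": {
    "type": "URL",
    "value": "https://broker.example.com/v2/entities/urn:ngsi-ld:Warehouse:001"
  }
}
```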
28.605263
76
0.559338
eng_Latn
0.919367
e1697bd81590a5f8c71893b7e0207157d5b26e3a
589
md
Markdown
docs/api-reference/classes/Generic/GenericToString.md
stianol/crmscript
be1ad4f3a967aee2974e9dc7217255565980331e
[ "MIT" ]
null
null
null
docs/api-reference/classes/Generic/GenericToString.md
stianol/crmscript
be1ad4f3a967aee2974e9dc7217255565980331e
[ "MIT" ]
null
null
null
docs/api-reference/classes/Generic/GenericToString.md
stianol/crmscript
be1ad4f3a967aee2974e9dc7217255565980331e
[ "MIT" ]
null
null
null
--- uid: crmscript_ref_Generic_GenericToString_Generic_generic title: Generic.GenericToString(Generic generic) intellisense: Generic.GenericToString langref: 1 sortOrder: 130 keywords: GenericToString(Generic) so.topic: reference --- # Generic.GenericToString(Generic generic) Explicit downcast from a generic to a String. If the generic does not represent the correct type, an exception is thrown. Together with `getTypeName()`, this function can be used to get an explicit typed variable. * **generic:** Generic The variable to downcast * **Returns:** String The variable as a String
34.647059
213
0.79966
eng_Latn
0.90722
e169c661ae30a0630b274c5e8b181e4f6cfb558f
2,539
md
Markdown
README.md
google/prog-edu-assistant-quizzes
6aa3744af39ed91eb9a7648f0d5ed7375c74ce20
[ "CC-BY-4.0" ]
4
2021-09-09T06:34:33.000Z
2022-02-02T05:04:07.000Z
README.md
google/prog-edu-assistant-quizzes
6aa3744af39ed91eb9a7648f0d5ed7375c74ce20
[ "CC-BY-4.0" ]
null
null
null
README.md
google/prog-edu-assistant-quizzes
6aa3744af39ed91eb9a7648f0d5ed7375c74ce20
[ "CC-BY-4.0" ]
3
2021-05-12T02:22:36.000Z
2021-10-21T02:52:33.000Z
# Quizzes for Programming Education Assistant This repository contains quizzes for a project of [programming education assistant tools](https://github.com/google/prog-edu-assistant). The tools can add autograding capability to Python programming courses using Jupyter or Colab notebooks. Note that this repository only includes quizzes, and the source code of the tools should go to [the tool's repository](https://github.com/google/prog-edu-assistant). ## Who is this project for? The main target audience is teaching staff who develop programming courses using Colab Python notebooks. The quizzes provided in this repository can be added to existing courses and made autocheckable by students. The main focus is Japanese universities, so the quizzes provided in this repository are mostly in Japanese. ## How to integrate autograder to your course If you have a course based on Jupyter notebooks and want to integrate the autochecking tests, there are multiple ways in which the autochecking tests can be run. See [README of the tool](https://github.com/google/prog-edu-assistant/blob/main/README.md#how-to-integrate-autograder-to-your-course) for details. The quizzes in this repository assume that autochecking runs inside the student notebook. ## How to generate student notebooks TODO(salikh): Replace with the packaged scripts once the PyPI package includes the scripts. 1. Clone the tools project ``` git clone http://github.com/google/prog-edu-assistant ``` 2. Create a virtual Python3 environment and activate it. ``` virtualenv -p python3 venv source venv/bin/activate ``` 3. Install the tool dependencies. ``` pip install -r prog-edu-assistant/python/colab/requirements.txt pip install prog_edu_assistant_tools ``` 4. Convert the instructor notebook to student format.
``` python prog-edu-assistant/python/colab/convert_to_student.py \ --master_notebook instructor/python-intro.ipynb \ --output_student_notebook student/python-intro-student.ipynb ``` ## Development environment setup Follow [the tools guideline](https://github.com/google/prog-edu-assistant/blob/main/SETUP.md) if you want to modify the tools. ## License This work is licensed under a CC BY 4.0 license. See [LICENSE](LICENSE) for details. ## Disclaimer This project is not an official Google project. It is not supported by Google and Google specifically disclaims all warranties as to its quality, merchantability, or fitness for a particular purpose.
33.853333
162
0.779835
eng_Latn
0.992134
e16a341c75dcda9fb752bb1d9813ad5110d28a0a
6,394
md
Markdown
docs/framework/wcf/extending/specifying-a-custom-crypto-algorithm.md
mattia-lunardi/docs.it-it
b9909895e77ae22ac89a7cc8dc6ea289e49ce0b3
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/wcf/extending/specifying-a-custom-crypto-algorithm.md
mattia-lunardi/docs.it-it
b9909895e77ae22ac89a7cc8dc6ea289e49ce0b3
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/wcf/extending/specifying-a-custom-crypto-algorithm.md
mattia-lunardi/docs.it-it
b9909895e77ae22ac89a7cc8dc6ea289e49ce0b3
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Specifica di un algoritmo di crittografia personalizzato ms.date: 03/30/2017 ms.assetid: d662a305-8e09-451d-9a59-b0f12b012f1d ms.openlocfilehash: 5c7bddb7e6e1696ea1cb4f8359e34a51a89fce40 ms.sourcegitcommit: 6b308cf6d627d78ee36dbbae8972a310ac7fd6c8 ms.translationtype: MT ms.contentlocale: it-IT ms.lasthandoff: 01/23/2019 ms.locfileid: "54537686" --- # <a name="specifying-a-custom-crypto-algorithm"></a>Specifica di un algoritmo di crittografia personalizzato WCF consente di specificare un algoritmo di crittografia personalizzato da usare per crittografare i dati o calcolare le firme digitali. A tale scopo, attenersi alla procedura seguente: 1. Derivare una classe da <xref:System.ServiceModel.Security.SecurityAlgorithmSuite>. 2. Registrare l'algoritmo. 3. Configurare l'associazione con la classe derivata da <xref:System.ServiceModel.Security.SecurityAlgorithmSuite>. ## <a name="derive-a-class-from-securityalgorithmsuite"></a>Derivare una classe da SecurityAlgorithmSuite. <xref:System.ServiceModel.Security.SecurityAlgorithmSuite> è una classe di base astratta che consente di specificare l'algoritmo da usare per eseguire diverse operazioni relative alla sicurezza. Ad esempio, calcolare un hash per una firma digitale o crittografare un messaggio. Nel codice seguente viene illustrato come derivare una classe da <xref:System.ServiceModel.Security.SecurityAlgorithmSuite>. 
```csharp public class MyCustomAlgorithmSuite : SecurityAlgorithmSuite { public override string DefaultAsymmetricKeyWrapAlgorithm { get { return SecurityAlgorithms.RsaOaepKeyWrap; } } public override string DefaultAsymmetricSignatureAlgorithm { get { return SecurityAlgorithms.RsaSha1Signature; } } public override string DefaultCanonicalizationAlgorithm { get { return SecurityAlgorithms.ExclusiveC14n; ; } } public override string DefaultDigestAlgorithm { get { return SecurityAlgorithms.MyCustomHashAlgorithm; } } public override string DefaultEncryptionAlgorithm { get { return SecurityAlgorithms.Aes128Encryption; } } public override int DefaultEncryptionKeyDerivationLength { get { return 128; } } public override int DefaultSignatureKeyDerivationLength { get { return 128; } } public override int DefaultSymmetricKeyLength { get { return 128; } } public override string DefaultSymmetricKeyWrapAlgorithm { get { return SecurityAlgorithms.Aes128Encryption; } } public override string DefaultSymmetricSignatureAlgorithm { get { return SecurityAlgorithms.HmacSha1Signature; } } public override bool IsAsymmetricKeyLengthSupported(int length) { return length >= 1024 && length <= 4096; } public override bool IsSymmetricKeyLengthSupported(int length) { return length >= 128 && length <= 256; } } ``` ## <a name="register-the-custom-algorithm"></a>Registrare l'algoritmo personalizzato La registrazione può essere eseguita in un file di configurazione o nel codice imperativo. La registrazione di un algoritmo personalizzato viene eseguita creando un mapping tra una classe che implementa un provider di servizi di crittografia e un alias. L'alias viene quindi mappato a un URI che viene usato per specificare l'algoritmo nell'associazione del servizio WCF. 
Nel frammento di configurazione seguente viene illustrato come registrare un algoritmo personalizzato in config: ```xml <configuration> <mscorlib> <cryptographySettings> <cryptoNameMapping> <cryptoClasses> <cryptoClass SHA256CSP="System.Security.Cryptography.SHA256CryptoServiceProvider, System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" /> </cryptoClasses> <nameEntry name="http://constoso.com/CustomAlgorithms/CustomHashAlgorithm" class="SHA256CSP" /> </cryptoNameMapping> </cryptographySettings> </mscorlib> </configuration> ``` La sezione sotto il <`cryptoClasses`> elemento crea il mapping tra il SHA256CryptoServiceProvider e l'alias "SHA256CSP". Il <`nameEntry`> elemento crea il mapping tra l'alias "SHA256CSP" e l'URL specificato (http://constoso.com/CustomAlgorithms/CustomHashAlgorithm ). Per registrare l'algoritmo personalizzato nel codice usare il metodo <xref:System.Security.Cryptography.CryptoConfig.AddAlgorithm(System.Type,System.String[])>. Questo metodo crea entrambi i mapping. Nell'esempio seguente viene illustrato come chiamare questo metodo: ``` // Register the custom URI string defined for the hashAlgorithm in MyCustomAlgorithmSuite class to create the // SHA256CryptoServiceProvider hash algorithm object. CryptoConfig.AddAlgorithm(typeof(SHA256CryptoServiceProvider), "http://constoso.com/CustomAlgorithms/CustomHashAlgorithm"); ``` ## <a name="configure-the-binding"></a>Configurare l'associazione L'associazione si configura specificando la classe derivata da <xref:System.ServiceModel.Security.SecurityAlgorithmSuite> nelle impostazioni di associazione, come indicato nel frammento di codice seguente: ```csharp WSHttpBinding binding = new WSHttpBinding(); binding.Security.Message.AlgorithmSuite = new MyCustomAlgorithmSuite(); ``` Per un esempio di codice completo, vedere la [agilità di crittografia nella sicurezza WCF](../../../../docs/framework/wcf/samples/cryptographic-agility-in-wcf-security.md) esempio. 
## <a name="see-also"></a>Vedere anche - [Protezione di servizi e client](../../../../docs/framework/wcf/feature-details/securing-services-and-clients.md) - [Protezione dei servizi](../../../../docs/framework/wcf/securing-services.md) - [Panoramica della sicurezza](../../../../docs/framework/wcf/feature-details/security-overview.md) - [Concetti relativi alla sicurezza](../../../../docs/framework/wcf/feature-details/security-concepts.md)
48.075188
487
0.707382
ita_Latn
0.7425
e16ac8884066ee42207506ed0f54cbe2c29cf1be
1,203
md
Markdown
week10.md
CS395-BinX/CS395-BinX.github.io
fff0403fdb6cb1d4e6f5842989160e21eecb15e3
[ "MIT" ]
3
2022-01-01T18:36:53.000Z
2022-01-28T20:17:04.000Z
week10.md
CS395-BinX/CS395-BinX.github.io
fff0403fdb6cb1d4e6f5842989160e21eecb15e3
[ "MIT" ]
null
null
null
week10.md
CS395-BinX/CS395-BinX.github.io
fff0403fdb6cb1d4e6f5842989160e21eecb15e3
[ "MIT" ]
2
2021-12-29T21:28:29.000Z
2021-12-30T12:18:40.000Z
# Week 10: Patching Binaries and Hooking ## Lecture Video 1 [![Watch here](http://img.youtube.com/vi/rYrP3YoC_jE/0.jpg)](https://www.youtube.com/watch?v=rYrP3YoC_jE) ## Lecture Video 2 [![Watch here](http://img.youtube.com/vi/3_48XNYh7oo/0.jpg)](https://www.youtube.com/watch?v=3_48XNYh7oo) Get the slides for both [here](https://github.com/CS395-BinX/CS395-BinX.github.io/blob/main/week10/Week%2010%20Lecture.pdf) ## Demos Get them [here](https://github.com/CS395-BinX/CS395-BinX.github.io/tree/main/week10/demos) ## Optional Material ### Intuition Behind Exploitation [This is a great video](https://www.youtube.com/watch?v=akCce7vSSfw) that sums up everything that we've learned so far, and it also helps give you an intuition for binary exploitation. ### From Zero to Zero Day If you're interested in doing research in the field of security, then [this video is for you](https://www.youtube.com/watch?v=xp1YDOtWohw&t=1632s). This is a talk about how an eighteen-year-old went from doing basic stack overflows to finding zero days. He even provides a working demo of the exploit he created and how he went about finding the vulnerability. ## Assignments Complete the [final](./final.html)
46.269231
360
0.758936
eng_Latn
0.860395
e16b76bad26c58e43ea69aa2ee4a63c8efa2e785
1,506
md
Markdown
dynamicsax2012-technet/employeetimeregistrationworkflowhelper-breakfromwork-field-microsoft-dynamics-commerce-runtime-workflow.md
s0pach/DynamicsAX2012-technet
8412306681e6b914ebcfad0a9ee05038474ef1e6
[ "CC-BY-4.0", "MIT" ]
1
2020-06-16T22:06:04.000Z
2020-06-16T22:06:04.000Z
dynamicsax2012-technet/employeetimeregistrationworkflowhelper-breakfromwork-field-microsoft-dynamics-commerce-runtime-workflow.md
s0pach/DynamicsAX2012-technet
8412306681e6b914ebcfad0a9ee05038474ef1e6
[ "CC-BY-4.0", "MIT" ]
null
null
null
dynamicsax2012-technet/employeetimeregistrationworkflowhelper-breakfromwork-field-microsoft-dynamics-commerce-runtime-workflow.md
s0pach/DynamicsAX2012-technet
8412306681e6b914ebcfad0a9ee05038474ef1e6
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: EmployeeTimeRegistrationWorkflowHelper.BreakFromWork Field (Microsoft.Dynamics.Commerce.Runtime.Workflow) TOCTitle: BreakFromWork Field ms:assetid: F:Microsoft.Dynamics.Commerce.Runtime.Workflow.EmployeeTimeRegistrationWorkflowHelper.BreakFromWork ms:mtpsurl: https://technet.microsoft.com/library/microsoft.dynamics.commerce.runtime.workflow.employeetimeregistrationworkflowhelper.breakfromwork(v=AX.60) ms:contentKeyID: 62210433 author: Khairunj ms.date: 05/18/2015 mtps_version: v=AX.60 f1_keywords: - Microsoft.Dynamics.Commerce.Runtime.Workflow.EmployeeTimeRegistrationWorkflowHelper.BreakFromWork dev_langs: - CSharp - C++ - VB --- # BreakFromWork Field Constant to break for work defined in AX. **Namespace:**  [Microsoft.Dynamics.Commerce.Runtime.Workflow](microsoft-dynamics-commerce-runtime-workflow-namespace.md) **Assembly:**  Microsoft.Dynamics.Commerce.Runtime.Workflow (in Microsoft.Dynamics.Commerce.Runtime.Workflow.dll) ## Syntax ``` vb 'Declaration Public Const BreakFromWork As String 'Usage Dim value As String value = EmployeeTimeRegistrationWorkflowHelper.BreakFromWork ``` ``` csharp public const string BreakFromWork ``` ``` c++ public: literal String^ BreakFromWork ``` ## See Also #### Reference [EmployeeTimeRegistrationWorkflowHelper Class](employeetimeregistrationworkflowhelper-class-microsoft-dynamics-commerce-runtime-workflow.md) [Microsoft.Dynamics.Commerce.Runtime.Workflow Namespace](microsoft-dynamics-commerce-runtime-workflow-namespace.md)
28.415094
156
0.820717
yue_Hant
0.794089
e16c4d5dcd30ca92bd93fd5aeaba4e36d3097b3d
498
md
Markdown
content/api/ng-common/common.clickoutsidedirective.clickoutsidecondition.md
ressurectit/ressurectit.github.io
09ed543e50e9b35594333afe6e98d79687849b04
[ "MIT" ]
null
null
null
content/api/ng-common/common.clickoutsidedirective.clickoutsidecondition.md
ressurectit/ressurectit.github.io
09ed543e50e9b35594333afe6e98d79687849b04
[ "MIT" ]
null
null
null
content/api/ng-common/common.clickoutsidedirective.clickoutsidecondition.md
ressurectit/ressurectit.github.io
09ed543e50e9b35594333afe6e98d79687849b04
[ "MIT" ]
null
null
null
<!-- Do not edit this file. It is automatically generated by API Documenter. --> [Home](./index.md) &gt; [@anglr/common](./common.md) &gt; [ClickOutsideDirective](./common.clickoutsidedirective.md) &gt; [clickOutsideCondition](./common.clickoutsidedirective.clickoutsidecondition.md) ## ClickOutsideDirective.clickOutsideCondition property Variable that is used for displaying element that handles click outside <b>Signature:</b> ```typescript clickOutsideCondition: boolean; ```
35.571429
203
0.753012
eng_Latn
0.962645
e16c4f3956a78cc32c42bcdfe16006b8a9815290
2,419
md
Markdown
posts/2018-02-23.md
ab22375/mb-blog
0de45cf0d8b2f3f581fa2bd5d2e998bf36f36412
[ "MIT" ]
null
null
null
posts/2018-02-23.md
ab22375/mb-blog
0de45cf0d8b2f3f581fa2bd5d2e998bf36f36412
[ "MIT" ]
null
null
null
posts/2018-02-23.md
ab22375/mb-blog
0de45cf0d8b2f3f581fa2bd5d2e998bf36f36412
[ "MIT" ]
1
2021-03-17T02:19:05.000Z
2021-03-17T02:19:05.000Z
--- title: MB 高性能ツインヘッダー date: '2018-02-23' img: ../img/posts/2018-02-23/MB-R800-Komatsu-PC200-Bolivia-rocks-granite-2-_640x400.jpg img_w: 640 img_h: 400 dsc: MB-R800 ボリビア 標高4000m の都市にて original: https://www.mbcrusher.com/ja/jp/%E3%81%8A%E7%9F%A5%E3%82%89%E3%81%9B/news/mb-%E9%AB%98%E6%80%A7%E8%83%BD%E3%83%84%E3%82%A4%E3%83%B3%E3%83%98%E3%83%83%E3%83%80%E3%83%BC --- <img src="../img/posts/2018-02-23/Cantiere1.1280x600.jpg" alt="Cantiere1.1280x600" class="rounded-2xl" /> ## MB-R800 ボリビア 標高4000m の都市にて <img src="../img/posts/2018-02-23/Cantiere2.640x400.jpg" alt="Cantiere2.640x400" class="rounded-2xl" /> ## 価値ある都市 ポトシ ボリビアの都市ポトシは今でも「大変価値のあるもの」を表す代名詞として使われています。ポトシの街を見て元鉱夫たちの話を聞けば、ポトシがかつて世界有数の美しい巨大金鉱であったことがうかがえます。またポトシは人が暮らす都市としては世界最高の標高4090mを誇ります。1987年には世界遺産に登録され、現在も錫(すず)の採掘が行われています。 <img src="../img/posts/2018-02-23/Satellite.640x400.jpg" alt="Satellite.640x400" class="rounded-2xl" /> ## セクレ〜ポトシ間パイプラインプロジェクト その標高4000m以上の地でMBツインヘッダーは巨大なプロジェクトに参加したのです。ボリビアの都市セクレとポトシのガス供給パイプラインの拡張工事です。セクレ・マリアカとポトシ・カラチパンパの間57キロにも及ぶ工事です。この工事が成功すればポトシに天然ガスを十分に供給でき商業や産業の活性化に繋がります。 <iframe width="750" height="422" src="https://www.youtube.com/embed/gpmr8QGikMk" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe> <img src="../img/posts/2018-02-23/MBTeamBolivia.640x400.jpg" alt="MBTeamBolivia.640x400" class="rounded-2xl" /> ## どんな状況でも活躍するツインヘッダー MBツインヘッダーはコマツPC200の重機に取付け工事に使用されました。ツインヘッダーの砕く対象物は石とみかげ石の混合物で非常に固く、また地理的にも難しい現場でした。アンデス山脈の中にありマイナス8度まで下がる極寒の中で工事は行われました。 そんな過酷な状況でもMBツインヘッダーは従来の力を発揮しました。それが100% メイドインイタリーが誇る品質です。MBツインヘッダーは、油圧ショベルのパラメーターが正常に作動しなかった場合のセキュリティーシステムを搭載しています。 また様々な強度の材質を砕く際、それぞれに適したパワーを自動的に使い分け破砕することができ、油圧ショベルのアームへの負担を減らし作業を確実に遂行することができます。 <img src="../img/posts/2018-02-23/BF80.3-Caterpillar-Bolivia.640x400.jpg" alt="BF80.3 Caterpillar Bolivia.640x400" class="rounded-2xl" /> 2枚のドラムは現場で取り外すことができ、特許取得のツインモーターシステムは2つの刃にそれぞれ力を供給することができます。また現場ではMBバケットクラッシャーBF80.3も活躍しました。 
ツインヘッダーが砕いた石をさらに破砕しパイプライン工事で再利用したのです。 <img src="../img/posts/2018-02-23/MB-R800_KomatsuPC200_Bolivia_rocks_granite_640x400 (1).jpg" alt="MB R800 KomatsuPC200 Bolivia rocks granite 640x400 (1)" class="rounded-2xl" /> ## MB-Rについて MB-Rツインヘッダーは3モデルあります。MB-R700は最小サイズで6トン〜13トンの重機に取付可能です。MB-R800は重さ1トン、10トン〜22トンの重機に取付られます。最大サイズのMB-R900は19トン〜35トンの重機に取付可能です。
35.573529
219
0.784208
yue_Hant
0.332809
e16cd2b2ceed5f3e25e126103ef07e6d46d928a0
15,352
md
Markdown
docs/visual-basic/programming-guide/com-interop/troubleshooting-interoperability.md
Youssef1313/docs.it-it
15072ece39fae71ee94a8b9365b02b550e68e407
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/programming-guide/com-interop/troubleshooting-interoperability.md
Youssef1313/docs.it-it
15072ece39fae71ee94a8b9365b02b550e68e407
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/programming-guide/com-interop/troubleshooting-interoperability.md
Youssef1313/docs.it-it
15072ece39fae71ee94a8b9365b02b550e68e407
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Risoluzione dei problemi relativi all'interoperabilità ms.date: 07/20/2015 helpviewer_keywords: - interop, deploying assemblies - assemblies [Visual Basic] - interop, installing assemblies that share components - COM objects, troubleshooting - interop, sharing components - troubleshooting interoperability [Visual Basic] - interoperability, troubleshooting - COM interop [Visual Basic], troubleshooting - assemblies [Visual Basic], deploying - troubleshooting Visual Basic, interoperability - interop assemblies - interoperability, sharing components - shared components, using with assemblies ms.assetid: b324cc1e-b03c-4f39-aea6-6a6d5bfd0e37 ms.openlocfilehash: 344c180cf0b9426898e17b45db768a337fd45beb ms.sourcegitcommit: 17ee6605e01ef32506f8fdc686954244ba6911de ms.translationtype: MT ms.contentlocale: it-IT ms.lasthandoff: 11/22/2019 ms.locfileid: "74338677" --- # <a name="troubleshooting-interoperability-visual-basic"></a>Risoluzione dei problemi relativi all'interoperabilità (Visual Basic) Quando si interagisce tra COM e il codice gestito del .NET Framework, è possibile che si verifichino uno o più dei seguenti problemi comuni. ## <a name="vbconinteroperabilitymarshalinganchor1"></a>Marshalling di interoperabilità In alcuni casi, potrebbe essere necessario utilizzare tipi di dati che non fanno parte del .NET Framework. Gli assembly di interoperabilità gestiscono la maggior parte del lavoro per gli oggetti COM, ma potrebbe essere necessario controllare i tipi di dati utilizzati quando gli oggetti gestiti vengono esposti a COM. Ad esempio, le strutture nelle librerie di classi devono specificare il `BStr` tipo non gestito sulle stringhe inviate a oggetti COM creati da Visual Basic 6,0 e versioni precedenti. In questi casi, è possibile usare l'attributo <xref:System.Runtime.InteropServices.MarshalAsAttribute> per far sì che i tipi gestiti vengano esposti come tipi non gestiti. 
## <a name="vbconinteroperabilitymarshalinganchor2"></a>Esportazione di stringhe a lunghezza fissa in codice non gestito In Visual Basic 6,0 e versioni precedenti, le stringhe vengono esportate in oggetti COM come sequenze di byte senza un carattere di terminazione null. Per la compatibilità con altri linguaggi, Visual Basic .NET include un carattere di terminazione durante l'esportazione delle stringhe. Il modo migliore per risolvere questa incompatibilità consiste nell'esportare stringhe che non dispongono del carattere di terminazione come matrici di `Byte` o `Char`. ## <a name="vbconinteroperabilitymarshalinganchor3"></a>Esportazione di gerarchie di ereditarietà Le gerarchie di classi gestite vengono appiattite quando vengono esposte come oggetti COM. Se, ad esempio, si definisce una classe base con un membro e quindi si eredita la classe di base in una classe derivata esposta come oggetto COM, i client che utilizzano la classe derivata nell'oggetto COM non saranno in grado di utilizzare i membri ereditati. È possibile accedere ai membri della classe di base dagli oggetti COM solo come istanze di una classe di base e quindi solo se la classe di base viene creata anche come oggetto COM. ## <a name="overloaded-methods"></a>Metodi di overload Sebbene sia possibile creare metodi di overload con Visual Basic, non sono supportati da COM. Quando una classe che contiene metodi di overload viene esposta come oggetto COM, vengono generati nuovi nomi di metodo per i metodi di overload. Si consideri, ad esempio, una classe che dispone di due overload del metodo `Synch`. Quando la classe viene esposta come oggetto COM, i nuovi nomi dei metodi generati potrebbero essere `Synch` e `Synch_2`. La ridenominazione può causare due problemi per i consumer dell'oggetto COM. 1. I client potrebbero non prevedere i nomi dei metodi generati. 2. 
I nomi dei metodi generati nella classe esposti come oggetto COM possono essere modificati quando vengono aggiunti nuovi overload alla classe o alla relativa classe di base. Questo può causare problemi di controllo delle versioni. Per risolvere entrambi i problemi, assegnare a ogni metodo un nome univoco, anziché usare l'overload, quando si sviluppano oggetti che verranno esposti come oggetti COM. ## <a name="vbconinteroperabilitymarshalinganchor4"></a>Utilizzo di oggetti COM tramite assembly di interoperabilità Gli assembly di interoperabilità vengono usati quasi come se fossero sostituzioni di codice gestito per gli oggetti COM che rappresentano. Tuttavia, poiché si tratta di wrapper e non di oggetti COM effettivi, esistono alcune differenze tra l'utilizzo degli assembly di interoperabilità e degli assembly standard. Queste aree di differenza includono l'esposizione delle classi e i tipi di dati per i parametri e i valori restituiti. ## <a name="vbconinteroperabilitymarshalinganchor5"></a>Classi esposte come interfacce e classi A differenza delle classi negli assembly standard, le classi COM vengono esposte negli assembly di interoperabilità sia come interfaccia sia come classe che rappresenta la classe COM. Il nome dell'interfaccia è identico a quello della classe COM. Il nome della classe di interoperabilità è identico a quello della classe COM originale, ma con la parola "class" accodata. Si supponga, ad esempio, di disporre di un progetto con un riferimento a un assembly di interoperabilità per un oggetto COM. Se la classe COM è denominata `MyComClass`, IntelliSense e il Visualizzatore oggetti mostrano un'interfaccia denominata `MyComClass` e una classe denominata `MyComClassClass`. ## <a name="vbconinteroperabilitymarshalinganchor6"></a>Creazione di istanze di una classe .NET Framework In genere, si crea un'istanza di una classe .NET Framework usando l'istruzione `New` con un nome di classe. 
La presenza di una classe COM rappresentata da un assembly di interoperabilità è un caso in cui è possibile utilizzare l'istruzione `New` con un'interfaccia. A meno che non si usi la classe COM con un'istruzione `Inherits`, è possibile usare l'interfaccia esattamente come si farebbe con una classe. Il codice seguente illustra come creare un oggetto `Command` in un progetto che contiene un riferimento all'oggetto COM della libreria Microsoft ActiveX Data Objects 2,8: [!code-vb[VbVbalrInterop#20](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#20)] Tuttavia, se si utilizza la classe COM come base per una classe derivata, è necessario utilizzare la classe di interoperabilità che rappresenta la classe COM, come nel codice seguente: [!code-vb[VbVbalrInterop#21](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#21)] > [!NOTE] > Gli assembly di interoperabilità implementano in modo implicito interfacce che rappresentano classi COM. Non provare a usare l'istruzione `Implements` per implementare queste interfacce. in caso contrario, verrà generato un errore. ## <a name="vbconinteroperabilitymarshalinganchor7"></a>Tipi di dati per parametri e valori restituiti A differenza dei membri degli assembly standard, i membri dell'assembly di interoperabilità possono avere tipi di dati diversi da quelli usati nella dichiarazione dell'oggetto originale. Sebbene gli assembly di interoperabilità convertano in modo implicito i tipi COM in tipi di Common Language Runtime compatibili, è necessario prestare attenzione ai tipi di dati utilizzati da entrambi i lati per evitare errori di Runtime. Ad esempio, negli oggetti COM creati in Visual Basic 6,0 e versioni precedenti, i valori di tipo `Integer` assumono il .NET Framework tipo equivalente `Short`. È consigliabile usare il Visualizzatore oggetti per esaminare le caratteristiche dei membri importati prima di usarli. 
## <a name="vbconinteroperabilitymarshalinganchor8"></a>Metodi COM a livello di modulo La maggior parte degli oggetti COM viene utilizzata creando un'istanza di una classe COM utilizzando la parola chiave `New` e chiamando quindi i metodi dell'oggetto. Un'eccezione a questa regola riguarda gli oggetti COM che contengono `AppObj` o `GlobalMultiUse` classi COM. Tali classi sono simili ai metodi a livello di modulo in Visual Basic le classi .NET. Visual Basic 6,0 e versioni precedenti creano in modo implicito istanze di tali oggetti per la prima volta che si chiama uno dei relativi metodi. Ad esempio, in Visual Basic 6,0 è possibile aggiungere un riferimento alla libreria di oggetti Microsoft DAO 3,6 e chiamare il metodo `DBEngine` senza prima creare un'istanza: ```vb Dim db As DAO.Database ' Open the database. Set db = DBEngine.OpenDatabase("C:\nwind.mdb") ' Use the database object. ``` Visual Basic .NET richiede di creare sempre istanze di oggetti COM prima di poter usare i relativi metodi. Per usare questi metodi in Visual Basic, dichiarare una variabile della classe desiderata e usare la parola chiave New per assegnare l'oggetto alla variabile oggetto. È possibile utilizzare la parola chiave `Shared` quando si desidera assicurarsi che venga creata una sola istanza della classe. [!code-vb[VbVbalrInterop#23](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#23)] ## <a name="vbconinteroperabilitymarshalinganchor9"></a>Errori non gestiti nei gestori eventi Un problema di interoperabilità comune implica errori nei gestori eventi che gestiscono gli eventi generati dagli oggetti COM. Tali errori vengono ignorati a meno che non si verifichino in modo specifico gli errori usando le istruzioni `On Error` o `Try...Catch...Finally`. Ad esempio, l'esempio seguente è riportato da un progetto Visual Basic .NET che contiene un riferimento all'oggetto COM della libreria Microsoft ActiveX Data Objects 2,8. 
[!code-vb[VbVbalrInterop#24](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#24)] Questo esempio genera un errore come previsto. Tuttavia, se si prova lo stesso esempio senza il blocco `Try...Catch...Finally`, l'errore viene ignorato come se fosse stata usata l'istruzione `OnError Resume Next`. Senza la gestione degli errori, la divisione per zero non riesce automaticamente. Poiché tali errori non generano mai errori di eccezione non gestiti, è importante usare una forma di gestione delle eccezioni nei gestori eventi che gestiscono gli eventi dagli oggetti COM. ### <a name="understanding-com-interop-errors"></a>Informazioni sugli errori di interoperabilità COM Senza la gestione degli errori, le chiamate di interoperabilità generano spesso errori che forniscono scarse informazioni. Quando possibile, usare la gestione degli errori strutturati per fornire altre informazioni sui problemi che si verificano. Questo può essere particolarmente utile quando si esegue il debug delle applicazioni. Ad esempio: [!code-vb[VbVbalrInterop#25](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#25)] È possibile trovare informazioni quali la descrizione dell'errore, HRESULT e l'origine di errori COM esaminando il contenuto dell'oggetto eccezione. ## <a name="vbconinteroperabilitymarshalinganchor10"></a>Problemi di controllo ActiveX La maggior parte dei controlli ActiveX che funzionano con Visual Basic 6,0 funzionano con Visual Basic .NET senza problemi. Le eccezioni principali sono controlli contenitore o controlli che contengono visivamente altri controlli. Di seguito sono riportati alcuni esempi di controlli precedenti che non funzionano correttamente con Visual Studio: - Controllo frame Microsoft Forms 2,0 - Controllo di scorrimento, noto anche come controllo di selezione - Controllo scheda Sheridan Esistono solo alcune soluzioni alternative per i problemi di controllo ActiveX non supportati. 
È possibile eseguire la migrazione di controlli esistenti a Visual Studio se si è proprietari del codice sorgente originale. In caso contrario, è possibile verificare che i fornitori di software siano aggiornati. Versioni di controlli compatibili con .NET per sostituire i controlli ActiveX non supportati. ## <a name="vbconinteroperabilitymarshalinganchor11"></a>Passaggio delle proprietà di sola lettura dei controlli ByRef Visual Basic .NET a volte genera errori COM, ad esempio "Error 0x800A017F CTL_E_SETNOTSUPPORTED", quando si passano `ReadOnly` proprietà di alcuni controlli ActiveX meno recenti come parametri `ByRef` ad altre procedure. Chiamate di procedura analoghe da Visual Basic 6,0 non generano un errore e i parametri vengono considerati come se fossero stati passati per valore. Il messaggio di errore Visual Basic .NET indica che si sta tentando di modificare una proprietà che non dispone di una proprietà `Set` routine. Se si ha accesso alla routine chiamata, è possibile evitare questo errore usando la parola chiave `ByVal` per dichiarare i parametri che accettano `ReadOnly` proprietà. Ad esempio: [!code-vb[VbVbalrInterop#26](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#26)] Se non si ha accesso al codice sorgente per la routine chiamata, è possibile forzare il passaggio della proprietà in base al valore aggiungendo un set aggiuntivo di parentesi quadre alla procedura chiamante. Ad esempio, in un progetto che contiene un riferimento all'oggetto COM della libreria Microsoft ActiveX Data Objects 2,8, è possibile usare: [!code-vb[VbVbalrInterop#27](~/samples/snippets/visualbasic/VS_Snippets_VBCSharp/VbVbalrInterop/VB/Class1.vb#27)] ## <a name="vbconinteroperabilitymarshalinganchor12"></a>Distribuzione degli assembly che espongono l'interoperabilità La distribuzione di assembly che espongono interfacce COM presenta alcune esigenze specifiche. 
Ad esempio, un potenziale problema si verifica quando applicazioni separate fanno riferimento allo stesso assembly COM. Questa situazione è comune quando viene installata una nuova versione di un assembly e un'altra applicazione usa ancora la versione precedente dell'assembly. Se si disinstalla un assembly che condivide una DLL, è possibile renderlo involontariamente non disponibile agli altri assembly. Per evitare questo problema, è necessario installare assembly condivisi nella global assembly cache (GAC) e usare un MergeModule per il componente. Se non è possibile installare l'applicazione nella GAC, è necessario installarla in CommonFilesFolder in una sottodirectory specifica della versione. Gli assembly non condivisi devono trovarsi side-by-side nella directory con l'applicazione chiamante. ## <a name="see-also"></a>Vedere anche - <xref:System.Runtime.InteropServices.MarshalAsAttribute> - [Interoperabilità COM](../../../visual-basic/programming-guide/com-interop/index.md) - [Tlbimp.exe (utilità di importazione della libreria dei tipi)](../../../framework/tools/tlbimp-exe-type-library-importer.md) - [Tlbexp.exe (utilità di esportazione della libreria dei tipi)](../../../framework/tools/tlbexp-exe-type-library-exporter.md) - [Procedura dettagliata: Implementazione dell'ereditarietà con gli oggetti COM](../../../visual-basic/programming-guide/com-interop/walkthrough-implementing-inheritance-with-com-objects.md) - [Istruzione Inherits](../../../visual-basic/language-reference/statements/inherits-statement.md) - [Global Assembly Cache](../../../framework/app-domains/gac.md)
111.246377
707
0.803348
ita_Latn
0.998876
e16e8a2790c0098083fc36eee3013c28b0200f8a
1,429
md
Markdown
configs/yolof/README.md
Brym-Gyimah/mmdetection
d5d749afe57c77e2ec4500395faed3566fdfedae
[ "Apache-2.0" ]
314
2020-06-26T20:41:19.000Z
2022-03-30T06:05:47.000Z
configs/yolof/README.md
Joker-co/mmdet_pro
96abfd90cf0e38c5ce398795f949e9328eb85c1b
[ "Apache-2.0" ]
48
2021-07-06T07:17:12.000Z
2022-03-14T11:38:36.000Z
configs/yolof/README.md
Joker-co/mmdet_pro
96abfd90cf0e38c5ce398795f949e9328eb85c1b
[ "Apache-2.0" ]
54
2021-07-07T08:40:49.000Z
2022-03-16T05:02:35.000Z
# You Only Look One-level Feature ## Introduction <!-- [ALGORITHM] --> ``` @inproceedings{chen2021you, title={You Only Look One-level Feature}, author={Chen, Qiang and Wang, Yingming and Yang, Tong and Zhang, Xiangyu and Cheng, Jian and Sun, Jian}, booktitle={IEEE Conference on Computer Vision and Pattern Recognition}, year={2021} } ``` ## Results and Models | Backbone | Style | Epoch | Lr schd | Mem (GB) | box AP | Config | Download | |:---------:|:-------:|:-------:|:-------:|:--------:|:------:|:------:|:--------:| | R-50-C5 | caffe | Y | 1x | 8.3 | 37.5 | [config](https://github.com/open-mmlab/mmdetection/tree/master/configs/yolof/yolof_r50_c5_8x8_1x_coco.py) |[model](https://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth) &#124; [log](https://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427.log.json) | **Note**: 1. We find that the performance is unstable and may fluctuate by about 0.3 mAP. mAP 37.4 ~ 37.7 is acceptable in YOLOF_R_50_C5_1x. Such fluctuation can also be found in the [original implementation](https://github.com/chensnathan/YOLOF). 2. In addition to instability issues, sometimes there are large loss fluctuations and NAN, so there may still be problems with this project, which will be improved subsequently.
54.961538
458
0.69909
eng_Latn
0.762816
e16efd5f8269b2e6fe3dccca45db1e306db28cbe
7,549
md
Markdown
_posts/2021-08-02-Libraries-Qs.md
charlo66609/CompPhys
a28ad04f44e314adf238d4ea9dc8d73af4bce2ce
[ "CC-BY-4.0" ]
null
null
null
_posts/2021-08-02-Libraries-Qs.md
charlo66609/CompPhys
a28ad04f44e314adf238d4ea9dc8d73af4bce2ce
[ "CC-BY-4.0" ]
17
2021-10-06T08:11:43.000Z
2021-11-24T13:46:08.000Z
_posts/2021-08-02-Libraries-Qs.md
charlo66609/CompPhys
a28ad04f44e314adf238d4ea9dc8d73af4bce2ce
[ "CC-BY-4.0" ]
1
2021-11-03T10:11:28.000Z
2021-11-03T10:11:28.000Z
--- toc: false layout: post title: Libraries - quick test hide: true --- ## Exploring the Math Module 1. What function from the `math` module can you use to calculate a square root *without* using `sqrt`? 2. Since the library contains this function, why does `sqrt` exist? {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> 1. Using `help(math)` we see that we've got `pow(x,y)` in addition to `sqrt(x)`, so we could use `pow(x, 0.5)` to find a square root. 2. The `sqrt(x)` function is arguably more readable than `pow(x, 0.5)` when implementing equations. Readability is a cornerstone of good programming, so it makes sense to provide a special function for this specific common case. Also, the design of Python's `math` library has its origin in the C standard, which includes both `sqrt(x)` and `pow(x,y)`, so a little bit of the history of programming is showing in Python's function names. </details> {::options parse_block_html="false" /} ## Locating the Right Module You want to select a random character from a string: ~~~python bases = 'ACTTGCTTGAC' ~~~ 1. Which [standard library][stdlib] module could help you? 2. Which function would you select from that module? Are there alternatives? 3. Try to write a program that uses the function. {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> The [random module](randommod) seems like it could help you. The string has 11 characters, each having a positional index from 0 to 10. You could use `random.randrange` function (or the alias `random.randint` if you find that easier to remember) to get a random integer between 0 and 10, and then pick out the character at that position: ~~~python from random import randrange random_index = randrange(len(bases)) print(bases[random_index]) ~~~ or more compactly: ~~~python from random import randrange print(bases[randrange(len(bases))]) ~~~ Perhaps you found the `random.sample` function? 
It allows for slightly less typing: ~~~python from random import sample print(sample(bases, 1)[0]) ~~~ Note that this function returns a list of values. We will learn about lists in episode 11. There's also other functions you could use, but with more convoluted code as a result. </details> {::options parse_block_html="false" /} ## Jigsaw Puzzle (Parson's Problem) Programming Example Rearrange the following statements so that a random DNA base is printed and its index in the string. Not all statements may be needed. Feel free to use/add intermediate variables. ~~~python bases="ACTTGCTTGAC" import math import random ___ = random.randrange(n_bases) ___ = len(bases) print("random base ", bases[___], "base index", ___) ~~~ {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> ~~~python import math import random bases = "ACTTGCTTGAC" n_bases = len(bases) idx = random.randrange(n_bases) print("random base", bases[idx], "base index", idx) ~~~ </details> {::options parse_block_html="false" /} ## When Is Help Available? When a colleague of yours types `help(math)`, Python reports an error: ~~~output NameError: name 'math' is not defined ~~~ What has your colleague forgotten to do? {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> Importing the math module (`import math`) </details> {::options parse_block_html="false" /} ## Importing With Aliases 1. Fill in the blanks so that the program below prints `90.0`. 2. Rewrite the program so that it uses `import` *without* `as`. 3. Which form do you find easier to read? 
~~~python import math as m angle = ____.degrees(____.pi / 2) print(____) ~~~ {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> ~~~python import math as m angle = m.degrees(m.pi / 2) print(angle) ~~~ can be written as ~~~python import math angle = math.degrees(math.pi / 2) print(angle) ~~~ Since you just wrote the code and are familiar with it, you might actually find the first version easier to read. But when trying to read a huge piece of code written by someone else, or when getting back to your own huge piece of code after several months, non-abbreviated names are often easier, except where there are clear abbreviation conventions. </details> {::options parse_block_html="false" /} ## There Are Many Ways To Import Libraries! Match the following print statements with the appropriate library calls. Print commands: 1. `print("sin(pi/2) =",sin(pi/2))` 2. `print("sin(pi/2) =",m.sin(m.pi/2))` 3. `print("sin(pi/2) =",math.sin(math.pi/2))` Library calls: 1. `from math import sin,pi` 2. `import math` 3. `import math as m` 4. `from math import *` {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> 1. Library calls 1 and 4. In order to directly refer to `sin` and `pi` without the library name as prefix, you need to use the `from ... import ...` statement. Whereas library call 1 specifically imports the two functions `sin` and `pi`, library call 4 imports all functions in the `math` module. 2. Library call 3. Here `sin` and `pi` are referred to with a shortened library name `m` instead of `math`. Library call 3 does exactly that using the `import ... as ...` syntax - it creates an alias for `math` in the form of the shortened name `m`. 3. Library call 2. Here `sin` and `pi` are referred to with the regular library name `math`, so the regular `import ...` call suffices. </details> {::options parse_block_html="false" /} ## Importing Specific Items 1. 
Fill in the blanks so that the program below prints `90.0`. 2. Do you find this version easier to read than preceding ones? 3. Why *wouldn't* programmers always use this form of `import`? ~~~python ____ math import ____, ____ angle = degrees(pi / 2) print(angle) ~~~ {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> ~~~python from math import degrees, pi angle = degrees(pi / 2) print(angle) ~~~ Most likely you find this version easier to read since it's less dense. The main reason not to use this form of import is to avoid name clashes. For instance, you wouldn't import `degrees` this way if you also wanted to use the name `degrees` for a variable or function of your own. Or if you were to also import a function named `degrees` from another library. </details> {::options parse_block_html="false" /} ## Reading Error Messages 1. Read the code below and try to identify what the errors are without running it. 2. Run the code, and read the error message. What type of error is it? ~~~python from math import log log(0) ~~~ {::options parse_block_html="true" /} <details> <summary markdown="span">Show answer</summary> 1. The logarithm of `x` is only defined for `x > 0`, so 0 is outside the domain of the function. 2. You get an error of type "ValueError", indicating that the function received an inappropriate argument value. The additional message "math domain error" makes it clearer what the problem is. </details> {::options parse_block_html="false" /} [pypi]: https://pypi.python.org/pypi/ [stdlib]: https://docs.python.org/3/library/ [randommod]: https://docs.python.org/3/library/random.html --- See [the notebook](https://nu-cem.github.io/CompPhys/2021/08/02/Libraries.html). Back to [Python part two](https://nu-cem.github.io/CompPhys/2021/08/02/Python_basics_two.html). ---
24.832237
105
0.724599
eng_Latn
0.988018
e16f2094c07f85ceb49fe0135e1b200a40962003
3,607
md
Markdown
desktop-src/WmiSdk/swbemdatetime-setfiletime.md
Lectem/win32
03fb201e1ea36e353c335ff063ed993fb5993e61
[ "CC-BY-4.0", "MIT" ]
null
null
null
desktop-src/WmiSdk/swbemdatetime-setfiletime.md
Lectem/win32
03fb201e1ea36e353c335ff063ed993fb5993e61
[ "CC-BY-4.0", "MIT" ]
null
null
null
desktop-src/WmiSdk/swbemdatetime-setfiletime.md
Lectem/win32
03fb201e1ea36e353c335ff063ed993fb5993e61
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- Description: Converts a date in the string FILETIME format to the CIM datetime format. ms.assetid: e375afda-5e94-46d6-b1ac-e801e0f4a620 ms.tgt_platform: multiple title: SWbemDateTime.SetFileTime method (Wbemdisp.h) ms.topic: reference ms.date: 05/31/2018 topic_type: - APIRef - kbSyntax api_name: - SWbemDateTime.SetFileTime - ISWbemDateTime.SetFileTime - ISWbemDateTime.SetFileTime api_type: - COM api_location: - Wbemdisp.dll --- # SWbemDateTime.SetFileTime method The **SetFileTime** method of the [**SWbemDateTime**](swbemdatetime.md) object converts a date in the string **FILETIME** format to the [CIM datetime](date-and-time-format.md) format. The **FILETIME** format is a 64-bit datetime structure that represents the number of 100-nanosecond units since the beginning of January 1, 1601. Windows Management Instrumentation (WMI) treats **FILETIME** values as string representations of unsigned 64-bit numbers. For the syntax explanation, see [Document Conventions for the Scripting API](document-conventions-for-the-scripting-api.md). ## Syntax ```VB SWbemDateTime.SetFileTime( _ ByVal strFileTime, _ [ ByVal bIsLocal ] _ ) ``` ## Parameters <dl> <dt> *strFileTime* \[in\] </dt> <dd> **FILETIME** value used to set the object. </dd> <dt> *bIsLocal* \[in, optional\] </dt> <dd> If **TRUE**, *strFileTime* is interpreted as a local time. The Coordinated Universal Time (UTC) property contains the local time converted to the correct UTC offset. When *bIsLocal* is **FALSE**, then *strFileTime* is converted directly into a UTC value with an offset of 0 (zero). </dd> </dl> ## Return value This method does not return a value. ## Error codes After completing the **SetFileTime** method, the [Err](/previous-versions//sbf5ze0e(v=vs.85)) object may contain the error code in the following list. <dl> <dt> **wbemErrInvalidSyntax** - 2147749921 (0x80041021) </dt> <dd> The format of *strFileTime* is not valid. 
</dd> </dl> ## Remarks After a successful call to **SetFileTime**, the [**datetime**](datetime.md) value is always interpreted as an absolute (**datetime**) value, and [**IsInterval**](swbemdatetime-isinterval.md) is set to **FALSE**. ## Examples For examples of using the [**SWbemDateTime**](swbemdatetime.md) object to convert CIM [**DATETIME**](datetime.md) values to and from either the **FILETIME** format or the **VT\_DATE** format, see [WMI Tasks: Dates and Times](wmi-tasks--dates-and-times.md). For a description of the CIM **DATETIME** format, see [Date and Time Format](date-and-time-format.md). ## Requirements | Requirement | Value | |-------------------------------------|-----------------------------------------------------------------------------------------| | Minimum supported client<br/> | Windows Vista<br/> | | Minimum supported server<br/> | Windows Server 2008<br/> | | Header<br/> | <dl> <dt>Wbemdisp.h</dt> </dl> | | Type library<br/> | <dl> <dt>Wbemdisp.tlb</dt> </dl> | | DLL<br/> | <dl> <dt>Wbemdisp.dll</dt> </dl> | | CLSID<br/> | CLSID\_SWbemDateTime<br/> | | IID<br/> | IID\_ISWbemDateTime<br/> | ## See also <dl> <dt> [**SWbemDateTime.SetVarDate**](swbemdatetime-setvardate.md) </dt> <dt> [**SWbemDateTime**](swbemdatetime.md) </dt> <dt> [**DATETIME**](datetime.md) </dt> </dl>
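The **FILETIME** arithmetic described above (100-nanosecond units since January 1, 1601) can be sketched outside of WMI; the helper below is illustrative Python, not part of the Scripting API:

```python
from datetime import datetime, timedelta, timezone

# FILETIME values count 100-ns "ticks" from this epoch.
FILETIME_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(filetime_str):
    """Convert a FILETIME string (100-ns ticks since 1601-01-01 UTC)
    to a timezone-aware datetime."""
    ticks = int(filetime_str)  # WMI passes FILETIME as a string
    return FILETIME_EPOCH + timedelta(microseconds=ticks // 10)

# 116444736000000000 ticks is the Unix epoch, 1970-01-01 00:00:00 UTC.
print(filetime_to_datetime("116444736000000000"))
```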
31.365217
359
0.622955
eng_Latn
0.560747
e16f9beb11b1b3de7f8d8f9c80fd550552c8236e
11,184
md
Markdown
README.md
textcreationpartnership/A34255
327b766aed8cdde720f73d6ac9c9db1c60e46e0d
[ "CC0-1.0" ]
null
null
null
README.md
textcreationpartnership/A34255
327b766aed8cdde720f73d6ac9c9db1c60e46e0d
[ "CC0-1.0" ]
null
null
null
README.md
textcreationpartnership/A34255
327b766aed8cdde720f73d6ac9c9db1c60e46e0d
[ "CC0-1.0" ]
null
null
null
#A Confession of faith put forth by the elders and brethren of many congregations of Christians (baptized upon profession of their faith) in London and the country.# A Confession of faith put forth by the elders and brethren of many congregations of Christians (baptized upon profession of their faith) in London and the country. ##General Summary## **Links** [TCP catalogue](http://www.ota.ox.ac.uk/tcp/) • [HTML](http://tei.it.ox.ac.uk/tcp/Texts-HTML/free/A34/A34255.html) • [EPUB](http://tei.it.ox.ac.uk/tcp/Texts-EPUB/free/A34/A34255.epub) • [Page images (Historical Texts)](https://historicaltexts.jisc.ac.uk/eebo-08939927e) **Availability** To the extent possible under law, the Text Creation Partnership has waived all copyright and related or neighboring rights to this keyboarded and encoded edition of the work described above, according to the terms of the CC0 1.0 Public Domain Dedication (http://creativecommons.org/publicdomain/zero/1.0/). This waiver does not extend to any page images or other supplementary files associated with this work, which may be protected by copyright or other license restrictions. Please go to https://www.textcreationpartnership.org/ for more information about the project. **Major revisions** 1. __2012-06__ __TCP__ *Assigned for keying and markup* 1. __2012-06__ __Apex CoVantage__ *Keyed and coded from ProQuest page images* 1. __2013-01__ __Colm MacCrossan__ *Sampled and proofread* 1. __2013-01__ __Colm MacCrossan__ *Text and markup reviewed and edited* 1. __2013-02__ __pfs__ *Batch review (QC) and XML conversion* ##Content Summary## #####Front##### A CONFESSION OF FAITH. Put forth by the ELDERS and BRETHREN Of many CONGREGATIONS OF Chriſtians (bap 1. TO THE Judicious and Impartial READER 1. THE CONTENTS. #####Body##### 1. A Confeſsion of FAITH. _ CHAP. I. Of the Holy Scriptures. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. II. Of God and of the Holy Trinity. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. III. 
Of Gods Decree. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. IV. Of Creation. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. V. Of Divine Providence. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. VI. Of the fall of Man, of Sin, and of the Puniſhment thereof. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. VII. Of Gods Covenant. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. VIII. Of Chriſt the Mediator. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. IX. Of Free Will. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. X. Of Effectual Calling. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XI. Of Juſtification. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XII. Of Adoption. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XIII. Of Sanctification. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XIV. Of Saving Faith. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XV. Of Repentance unto Life and Salvation. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XVI. Of Good Works. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XVII. Of Perſeverance of the Saints. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XVIII. Of the Aſſurance of Grace and Salvation. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XIX. Of the Law of God. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. 20. Of the Goſpel, and of the extent of the Grace thereof. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXI. Of Chriſtian Liberty and Liberty of Conſcience. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXII. Of Religious Worſhip and the Sabbath Day. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXIII. Of Lawful Oaths and Vows. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXIV. Of the Civil Magiſtrate. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXV. Of Marriage. * Of the Old Teſtament. * Of the new Teſtament. 
_ CHAP. XXVI. Of the Church. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXVII. Of the Communion of Saints. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXVIII. Of Baptiſm and the Lords Supper. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXIX. Of Baptiſm. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXX. Of the Lords Supper. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXXI. Of the State of Man after Death and of the Reſurrection of the Dead. * Of the Old Teſtament. * Of the new Teſtament. _ CHAP. XXXII. Of the Laſt Judgement. * Of the Old Teſtament. * Of the new Teſtament. #####Back##### 1. AN APPENDIX. **Types of content** * Oh, Mr. Jourdain, there is **prose** in there! There are 5 **omitted** fragments! @__reason__ (5) : illegible (1), foreign (4) • @__resp__ (5) : #MURP (1), #OXF (4) • @__extent__ (1) : 1 letter (1) **Character listing** |Text|string(s)|codepoint(s)| |---|---|---| |Latin-1 Supplement|âèòá|226 232 242 225| |Latin Extended-A|ſ|383| |Latin Extended-B|Ʋ|434| |General Punctuation|•|8226| |CJKSymbolsandPunctuation|〈〉|12296 12297| ##Tag Usage Summary## ###Header Tag Usage### |No|element name|occ|attributes| |---|---|---|---| |1.|__availability__|1|| |2.|__biblFull__|1|| |3.|__change__|5|| |4.|__date__|8| @__when__ (1) : 2013-12 (1)| |5.|__edition__|1|| |6.|__editionStmt__|1|| |7.|__editorialDecl__|1|| |8.|__encodingDesc__|1|| |9.|__extent__|2|| |10.|__fileDesc__|1|| |11.|__idno__|6| @__type__ (6) : DLPS (1), STC (2), EEBO-CITATION (1), OCLC (1), VID (1)| |12.|__keywords__|1| @__scheme__ (1) : http://authorities.loc.gov/ (1)| |13.|__label__|5|| |14.|__langUsage__|1|| |15.|__language__|1| @__ident__ (1) : eng (1)| |16.|__listPrefixDef__|1|| |17.|__note__|4|| |18.|__notesStmt__|2|| |19.|__p__|11|| |20.|__prefixDef__|2| @__ident__ (2) : tcp (1), char (1) • @__matchPattern__ (2) : ([0-9\-]+):([0-9IVX]+) (1), (.+) (1) • @__replacementPattern__ (2) : 
http://eebo.chadwyck.com/downloadtiff?vid=$1&page=$2 (1), https://raw.githubusercontent.com/textcreationpartnership/Texts/master/tcpchars.xml#$1 (1)| |21.|__profileDesc__|1|| |22.|__projectDesc__|1|| |23.|__pubPlace__|2|| |24.|__publicationStmt__|2|| |25.|__publisher__|2|| |26.|__ref__|1| @__target__ (1) : http://www.textcreationpartnership.org/docs/. (1)| |27.|__revisionDesc__|1|| |28.|__seriesStmt__|1|| |29.|__sourceDesc__|1|| |30.|__term__|1|| |31.|__textClass__|1|| |32.|__title__|3|| |33.|__titleStmt__|2|| ###Text Tag Usage### |No|element name|occ|attributes| |---|---|---|---| |1.|__back__|1|| |2.|__bibl__|2|| |3.|__body__|1|| |4.|__cell__|14| @__cols__ (1) : 2 (1) • @__rows__ (1) : 2 (1)| |5.|__desc__|5|| |6.|__div__|39| @__type__ (39) : title_page (1), to_the_reader (1), table_of_contents (1), confession_of_faith (1), chapter (32), part (2), appendix (1) • @__n__ (32) : 1 (1), 2 (1), 3 (1), 4 (1), 5 (1), 6 (1), 7 (1), 8 (1), 9 (1), 10 (1), 11 (1), 12 (1), 13 (1), 14 (1), 15 (1), 16 (1), 17 (1), 18 (1), 19 (1), 20 (1), 21 (1), 22 (1), 23 (1), 24 (1), 25 (1), 26 (1), 27 (1), 28 (1), 29 (1), 30 (1), 31 (1), 32 (1)| |7.|__front__|1|| |8.|__g__|813| @__ref__ (813) : char:EOLhyphen (812), char:V (1)| |9.|__gap__|5| @__reason__ (5) : illegible (1), foreign (4) • @__resp__ (5) : #MURP (1), #OXF (4) • @__extent__ (1) : 1 letter (1)| |10.|__head__|38|| |11.|__hi__|449|| |12.|__item__|33|| |13.|__list__|1|| |14.|__note__|496| @__n__ (493) : a (35), b (35), c (34), d (35), e (36), f (34), g (34), h (30), i (27), k (24), l (21), m (20), n (18), o (17), p (12), q (13), r (13), s (10), t (10), u (9), x (8), ſ (1), y (7), z (6), * (3), w (1) • @__place__ (496) : margin (496)| |15.|__p__|218| @__n__ (116) : 1 (24), 2 (24), 3 (22), 4 (15), 5 (10), 6 (6), 7 (5), 8 (3), 9 (1), 10 (1), 11 (1), 12 (1), 13 (1), 14 (1), 15 (1)| |16.|__pb__|162| @__facs__ (162) : tcp:42012:1 (2), tcp:42012:2 (2), tcp:42012:3 (2), tcp:42012:4 (2), tcp:42012:5 (2), tcp:42012:6 (2), tcp:42012:7 (2), 
tcp:42012:8 (2), tcp:42012:9 (2), tcp:42012:10 (2), tcp:42012:11 (2), tcp:42012:12 (2), tcp:42012:13 (2), tcp:42012:14 (2), tcp:42012:15 (2), tcp:42012:16 (2), tcp:42012:17 (2), tcp:42012:18 (2), tcp:42012:19 (2), tcp:42012:20 (2), tcp:42012:21 (2), tcp:42012:22 (2), tcp:42012:23 (2), tcp:42012:24 (2), tcp:42012:25 (2), tcp:42012:26 (2), tcp:42012:27 (2), tcp:42012:28 (2), tcp:42012:29 (2), tcp:42012:30 (2), tcp:42012:31 (2), tcp:42012:32 (2), tcp:42012:33 (2), tcp:42012:34 (2), tcp:42012:35 (2), tcp:42012:36 (2), tcp:42012:37 (2), tcp:42012:38 (2), tcp:42012:39 (2), tcp:42012:40 (2), tcp:42012:41 (2), tcp:42012:42 (2), tcp:42012:43 (2), tcp:42012:44 (2), tcp:42012:45 (2), tcp:42012:46 (2), tcp:42012:47 (2), tcp:42012:48 (2), tcp:42012:49 (2), tcp:42012:50 (2), tcp:42012:51 (2), tcp:42012:52 (2), tcp:42012:53 (2), tcp:42012:54 (2), tcp:42012:55 (2), tcp:42012:56 (2), tcp:42012:57 (2), tcp:42012:58 (2), tcp:42012:59 (2), tcp:42012:60 (2), tcp:42012:61 (2), tcp:42012:62 (2), tcp:42012:63 (2), tcp:42012:64 (2), tcp:42012:65 (2), tcp:42012:66 (2), tcp:42012:67 (3), tcp:42012:68 (3), tcp:42012:69 (3), tcp:42012:70 (3), tcp:42012:71 (2), tcp:42012:72 (2), tcp:42012:73 (2), tcp:42012:74 (2), tcp:42012:75 (2), tcp:42012:76 (2), tcp:42012:77 (2), tcp:42012:78 (2), tcp:42012:79 (2) • @__rendition__ (4) : simple:additions (4) • @__n__ (145) : 1 (1), 2 (1), 3 (1), 4 (1), 5 (1), 6 (1), 7 (1), 8 (1), 9 (1), 10 (1), 11 (1), 12 (1), 13 (1), 14 (1), 15 (1), 16 (1), 17 (1), 18 (1), 19 (1), 20 (1), 21 (1), 22 (1), 23 (1), 24 (1), 25 (1), 26 (1), 27 (1), 28 (1), 29 (1), 30 (1), 31 (1), 32 (1), 33 (1), 34 (1), 35 (1), 36 (1), 37 (1), 38 (1), 39 (1), 40 (1), 41 (1), 42 (1), 43 (1), 44 (1), 45 (1), 46 (1), 47 (1), 48 (1), 49 (1), 50 (1), 51 (1), 52 (1), 53 (1), 54 (1), 55 (1), 56 (1), 57 (1), 58 (1), 59 (1), 60 (1), 61 (1), 62 (1), 63 (1), 64 (1), 65 (1), 66 (1), 67 (1), 68 (1), 69 (1), 70 (1), 71 (1), 72 (1), 73 (1), 74 (1), 75 (1), 76 (1), 77 (1), 78 (1), 79 (1), 80 (1), 81 (1), 82 
(1), 83 (1), 84 (1), 85 (1), 86 (1), 87 (1), 88 (1), 89 (1), 90 (1), 91 (1), 92 (1), 93 (1), 94 (1), 95 (1), 96 (1), 97 (1), 98 (1), 99 (1), 100 (1), 101 (1), 102 (1), 103 (1), 104 (1), 105 (1), 106 (1), 107 (1), 109 (1), 110 (1), 111 (1), 112 (1), 113 (1), 114 (1), 115 (1), 116 (1), 117 (1), 118 (1), 119 (2), 120 (2), 121 (1), 122 (1), 123 (2), 124 (2), 125 (1), 126 (1), 127 (1), 128 (1), 129 (1), 130 (1), 131 (1), 132 (1), 133 (1), 134 (1), 135 (1), 136 (1), 137 (1), 138 (1), 139 (1), 140 (1), 141 (1), 142 (1)| |17.|__q__|2|| |18.|__row__|8|| |19.|__table__|1|| |20.|__trailer__|1||
35.392405
2,672
0.599607
yue_Hant
0.424389
e16fc590675e0f41ad3b8841ef51fb7cccbb5287
3,069
md
Markdown
dynamicsax2012-technet/create-a-request-for-quotation-from-a-purchase-requisition.md
MicrosoftDocs/DynamicsAX2012-technet
4e3ffe40810e1b46742cdb19d1e90cf2c94a3662
[ "CC-BY-4.0", "MIT" ]
9
2019-01-16T13:55:51.000Z
2021-11-04T20:39:31.000Z
dynamicsax2012-technet/create-a-request-for-quotation-from-a-purchase-requisition.md
MicrosoftDocs/DynamicsAX2012-technet
4e3ffe40810e1b46742cdb19d1e90cf2c94a3662
[ "CC-BY-4.0", "MIT" ]
265
2018-08-07T18:36:16.000Z
2021-11-10T07:15:20.000Z
dynamicsax2012-technet/create-a-request-for-quotation-from-a-purchase-requisition.md
MicrosoftDocs/DynamicsAX2012-technet
4e3ffe40810e1b46742cdb19d1e90cf2c94a3662
[ "CC-BY-4.0", "MIT" ]
32
2018-08-09T22:29:36.000Z
2021-08-05T06:58:53.000Z
--- title: Create a request for quotation from a purchase requisition TOCTitle: Create a request for quotation from a purchase requisition ms:assetid: 9691cc43-d2a7-481a-adc7-7623a8342c5d ms:mtpsurl: https://technet.microsoft.com/library/Gg232234(v=AX.60) ms:contentKeyID: 36058635 author: Khairunj ms.date: 03/25/2015 mtps_version: v=AX.60 f1_keywords: - quotes - Classes.PurchRFQTableMap2LineParametersForm - Forms.PurchReqCopyRFQ - Forms.PurchRFQCaseLineCopy - RFQ - Forms.PurchCreateRFQCase - procurement - purchase requisition - request for quotation - purchase requisitions - requests for quotations - Forms.PurchRFQCaseTableListPage - Forms.PurchCopying - Forms.PurchReqTableListPage - Forms.PurchRFQReplyFields - Menu_Items.Display.PurchRFQEditLines - quote - request for quotations - requests for quotation - RFQs audience: Application User ms.search.region: Global --- # Create a request for quotation from a purchase requisition [!INCLUDE[archive-banner](includes/archive-banner.md)] _**Applies To:** Microsoft Dynamics AX 2012 R3, Microsoft Dynamics AX 2012 R2, Microsoft Dynamics AX 2012 Feature Pack, Microsoft Dynamics AX 2012_ You can create an RFQ from a purchase requisition only when the status of the purchase requisition is **Pending approval**, because the lines in the purchase requisition are updated automatically as you accept lines from RFQ replies (bids) from vendors. You cannot complete, reject, approve, or perform any other actions on the purchase requisition while the RFQ is in progress. When you accept an RFQ reply that has a type of **Purchase requisition**, the RFQ reply lines update the purchase requisition lines with the following information: - Unit price - Discount percentage - Discount amount - Purchase charges - Line charges - Vendor - External number - External description Use this procedure to create a request for quotation (RFQ) from a purchase requisition or a project purchase requisition. 1. 
Click **Procurement and sourcing** \> **Common** \> **Purchase requisitions** \> **All purchase requisitions**. –or– Click **Project management and accounting** \> **Common** \> **Item tasks** \> **Project purchase requisitions**. 2. Select the purchase requisition to copy to an RFQ. 3. On the **Action Pane**, on the **Purchase requisition** tab, in the **Actions** group, click **Create request for quotation**. 4. In the **Select the lines to copy to the request for quotation** form, select the check boxes next to the lines to copy to the RFQ. 5. Click **OK** to create an RFQ that has a type of **Purchase requisition**. The information from the selected purchase requisition lines is copied to the RFQ. All vendors from the purchase requisition become vendors for the RFQ. The status of the purchase requisition changes to **Pending request for quotation**. You cannot perform any action on the purchase requisition until the RFQ process is completed. ## See also [About requests for quotation](about-requests-for-quotation.md)
34.1
253
0.766699
eng_Latn
0.977545
e170ae55e1f820bd40b27c061f8a55901c8b3ccb
1,692
md
Markdown
README.md
marvinlenk/pbs_qlist
0644ba0a6a41f1aa5a1396e9d63c7c906ba6c81d
[ "BSD-2-Clause" ]
null
null
null
README.md
marvinlenk/pbs_qlist
0644ba0a6a41f1aa5a1396e9d63c7c906ba6c81d
[ "BSD-2-Clause" ]
null
null
null
README.md
marvinlenk/pbs_qlist
0644ba0a6a41f1aa5a1396e9d63c7c906ba6c81d
[ "BSD-2-Clause" ]
null
null
null
# pbs_qlist A Uni-Bonn BAF-Cluster PBS qstat parser for Python3 Props to Oliver Evans and JLT for providing the basis of this program: https://stackoverflow.com/questions/26104116/qstat-and-long-job-names Takes the qstat -x output and parses it to a nicer and cleaner output. Options are explained by qlist --help How to get it working on your system: 1. Copy the qlist file to your local binary folder (e.g. ~/.local/bin). 2. Make sure the path is in your $PATH variable - if not do the following: Change your terminal profile to include the path, e.g. add export PATH=~/.local/bin:$PATH to your ~/.bash_profile file (create one if it doesn't exist) 3. Change the path after #! to your Python3.X in the qlist file. If you are not sure where your Python sits, use: which python3 You may have to plug in the command you use to open Python, but make sure it is version 3.0 or newer !!! 4. Make it executable via chmod +x qlist 5. You might want to restart the terminal session for bash to recognize the new command. Tada! Now it should work. Please note that the command sometimes takes more time than expected and is usually slower than qstat itself. I guess this is due to lustre performance problems, since the code is not CPU heavy at all. # qint With this bash command you can very easily start an interactive job on the same node as the job specified. To use this program, basically repeat the same steps as for qlist but without the python part (since it is bash). Usage: qint 142213116 Will start an interactive session requesting 1gb RAM and 5 mins walltime on the node that runs job 142213116. Replace the number by any PBS session ID of your desire.
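Steps 1-4 above can be sketched as a shell snippet. Note the `printf` line only creates a placeholder script so the sketch is self-contained — copy the real `qlist` file there instead:

```shell
# Sketch of the install steps, assuming ~/.local/bin as the target.
mkdir -p "$HOME/.local/bin"
# Placeholder standing in for the real qlist script:
printf '#!/usr/bin/env python3\nprint("qlist placeholder")\n' > "$HOME/.local/bin/qlist"
chmod +x "$HOME/.local/bin/qlist"
# Make the folder visible to this shell session:
export PATH="$HOME/.local/bin:$PATH"
command -v qlist
```

To make the PATH change permanent, add the `export PATH=...` line to your `~/.bash_profile` as described in step 2.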
40.285714
113
0.76182
eng_Latn
0.999342
e172a4fb763feacef4b9c4c984c59fe1806750d7
3,538
md
Markdown
IntegrationTests/tests_04_performance/test_01_resources/README.md
MrLotU/swift-nio
f8fa6c9960d7a15e8600c81e796e0f845fd43d2e
[ "Apache-2.0" ]
7,148
2018-03-01T05:30:05.000Z
2022-03-31T15:52:05.000Z
IntegrationTests/tests_04_performance/test_01_resources/README.md
MrLotU/swift-nio
f8fa6c9960d7a15e8600c81e796e0f845fd43d2e
[ "Apache-2.0" ]
1,433
2018-03-01T07:12:09.000Z
2022-03-28T11:22:20.000Z
IntegrationTests/tests_04_performance/test_01_resources/README.md
MrLotU/swift-nio
f8fa6c9960d7a15e8600c81e796e0f845fd43d2e
[ "Apache-2.0" ]
667
2018-03-01T05:39:16.000Z
2022-03-26T10:41:49.000Z
# Allocation Counting Test This briefly describes how the allocation counting test works. ## How does it work? This is possibly the simplest implementation that counts memory allocations (`malloc` and friends) and frees (mostly `free`). It just maintains two atomic variables which count the number of mallocs and the number of frees respectively. We run a simple HTTP1 example -- 1000 requests and responses generated by a simple SwiftNIO based client and server -- and then evaluate the number of mallocs and frees. The difference `mallocs - frees` should be pretty much 0 and the number of `mallocs` should remain stable (or decrease) across commits. We can't establish a perfect baseline as the exact number of allocations depends on your operating system, libc and Swift version. ### How are the functions hooked? Usually in UNIX it's enough to just define a function, for example ```C void free(void *ptr) { ... } ``` in the main binary and all modules will use this `free` function instead of the real one from the `libc`. For Linux, this is exactly what we're doing, the `bootstrap` binary defines such a `free` function in its `main.c`. On Darwin (macOS/iOS/...) however that is not the case and you need to use [dyld's interpose feature](https://books.google.co.uk/books?id=K8vUkpOXhN4C&lpg=PA73&ots=OMjhRWWwUu&dq=dyld%20interpose&pg=PA73#v=onepage&q=dyld%20interpose&f=false). The odd thing is that dyld's interposing _only_ works if it's in a `.dylib` and not from a binary's main executable. Therefore we need to build a slightly strange SwiftPM package: - `bootstrap`: The main executable's main module (written in C) so we can hook the `free` function on Linux. - `BootstrapSwift`: A SwiftPM module (written in Swift) called in from `bootstrap` which implements the actual SwiftNIO benchmark (and therefore depends on the `NIO` module). 
- `HookedFunctions`: A separate SwiftPM package that builds a shared library (`.so` on Linux, `.dylib` on Darwin) which contains the `replacement_malloc`, `replacement_free`, etc functions which just increment atomic integers representing the number of operations. On Darwin, we use `DYLD_INTERPOSE` in this module, interposing libc functions with our `replacement_` functions. This needs to be a separate SwiftPM package as otherwise its code would just live inside of the `bootstrap` executable and the dyld interposing feature wouldn't work. - `AtomicCounter`: SwiftPM package (written in C) that implements the atomic counters. It needs to be a separate package as both `BootstrapSwift` (to read the allocation counter) as well as `HookedFunctions` (to increment the allocation counter) depend on it. ## What benchmark is run? We run a single TCP connection over which 1000 HTTP requests are made by a client written in NIO, responded to by a server also written in NIO. We re-run the benchmark 10 times and return the lowest number of allocations that has been made. ## Why do I have to set a baseline? By default this test should always succeed as it doesn't actually compare the number of allocations to a certain number. The reason is that this number varies ever so slightly between operating systems and Swift versions. At the time of writing on macOS we got roughly 326k allocations and on Linux 322k allocations for 1000 HTTP requests & responses. To set a baseline simply run ```bash export MAX_ALLOCS_ALLOWED_1000_reqs_1_conn=327000 ``` or similar to set the maximum number of allocations allowed. If the benchmark exceeds these allocations the test will fail.
95.621622
673
0.782928
eng_Latn
0.999172
e172ab81e9d01f2cde4c21310be0c54cdb48e98a
3,587
md
Markdown
docs/help.md
MojixCoder/authx
0bb6e70dd4e3c64eaa4e994d761ceb0e3a4f85c2
[ "MIT" ]
null
null
null
docs/help.md
MojixCoder/authx
0bb6e70dd4e3c64eaa4e994d761ceb0e3a4f85c2
[ "MIT" ]
null
null
null
docs/help.md
MojixCoder/authx
0bb6e70dd4e3c64eaa4e994d761ceb0e3a4f85c2
[ "MIT" ]
null
null
null
# Help AuthX - Get Help 🦥 Do you like **AuthX**? Would you like to help AuthX, other users, and the author? Or would you like to get help with **AuthX**? There are very simple ways to help (several involve just one or two clicks). And there are several ways to get help too. ## Follow AuthX's Author <a href="https://twitter.com/THyasser1" class="external-link" target="_blank">Follow @THyasser1 on **Twitter**</a> to get the latest news about **AuthX**. ## Star **AuthX** in GitHub You can "star" AuthX in GitHub (clicking the star button at the top right): <a href="https://github.com/yezz123/AuthX" class="external-link" target="_blank">https://github.com/yezz123/AuthX</a>. ⭐️ By adding a star, other users will be able to find it more easily and see that it has already been useful for others. ## Watch the GitHub repository for releases You can "watch" AuthX in GitHub (clicking the "watch" button at the top right): <a href="https://github.com/yezz123/AuthX" class="external-link" target="_blank">https://github.com/yezz123/AuthX</a>. 👀 There you can select __"Releases only"__. By doing so, you will receive notifications (in your email) whenever there's a new release (a new version) of **AuthX** with bug fixes and new features. ## Help the author with issues in GitHub You can see <a href="https://github.com/yezz123/AuthX/issues" class="external-link" target="_blank">existing issues</a> and try and help others, most of the time they are questions that you might already know the answer to. 🤓 ## Watch the GitHub repository You can "watch" AuthX in GitHub (clicking the "watch" button at the top right): <a href="https://github.com/yezz123/AuthX" class="external-link" target="_blank">https://github.com/yezz123/AuthX</a>. 👀 If you select "Watching" instead of "Releases only" you will receive notifications when someone creates a new issue. Then you can try and help them solve those issues.
## Create issues You can <a href="https://github.com/yezz123/AuthX/issues/new/choose" class="external-link" target="_blank">create a new issue</a> in the GitHub repository, for example to: * Ask a **question** or ask about a **problem**. * Suggest a new **feature**. **Note**: if you create an issue, then I'm going to ask you to also help others. 😉 ## Create a Pull Request You can [contribute](contributing.md) to the source code with Pull Requests, for example: * To fix a typo you found on the documentation. * To help [translate the documentation](contributing.md) to your language. * You can also help to review the translations created by others. * To propose new documentation sections. * To fix an existing issue/bug. * To add a new feature. ## Sponsor the author You can also financially support the author (me) through <a href="https://paypal.me/yassertahiri?locale.x=en_US" class="external-link" target="_blank">Paypal sponsors</a>. There you could buy me a [coffee ☕️](https://www.buymeacoffee.com/tahiri) to say thanks. 😄 And you can also become a Silver or Gold sponsor for AuthX. 🏅🎉 ## Sponsor the tools that power AuthX As you have seen in the documentation, AuthX stands on the shoulders of giants, FastAPI, Starlette and Pydantic. You can also sponsor: * <a href="https://github.com/sponsors/tiangolo" class="external-link" target="_blank">Sebastián Ramírez (FastAPI)</a> * <a href="https://github.com/sponsors/samuelcolvin" class="external-link" target="_blank">Samuel Colvin (Pydantic)</a> * <a href="https://github.com/sponsors/encode" class="external-link" target="_blank">Encode (Starlette, Uvicorn)</a> --- Thanks! 🚀
42.702381
227
0.735712
eng_Latn
0.983598
e172e6ed923f1efadf2a5acde9ce00597d70625c
6,441
md
Markdown
data/blog/javascript-property.md
uglyduck1104/duck-s-study-blog
7481cc3c3b5f45292e1de08528682d6f9f420813
[ "MIT" ]
null
null
null
data/blog/javascript-property.md
uglyduck1104/duck-s-study-blog
7481cc3c3b5f45292e1de08528682d6f9f420813
[ "MIT" ]
1
2022-03-30T00:18:36.000Z
2022-03-30T00:18:36.000Z
data/blog/javascript-property.md
uglyduck1104/duck-s-study-blog
7481cc3c3b5f45292e1de08528682d6f9f420813
[ "MIT" ]
null
null
null
---
title: Property Attributes
date: '2022-05-16'
tags: ['javascript']
draft: false
summary: When it creates a property, the JavaScript engine automatically defines property attributes — values representing the property's state — with default values
layout: PostSimple
authors: ['default']
---

# Property Attributes

## Property Attributes and Property Descriptor Objects

> When it creates a property, the JavaScript engine automatically defines property attributes — values that represent the property's state — with default values.

### Property attributes

- Property attributes are *internal slots*: internal state values managed by the JavaScript engine
- They cannot be accessed directly, but can be inspected indirectly with the `Object.getOwnPropertyDescriptor` method

### `Object.getOwnPropertyDescriptor`

```javascript
const person = {
  name: 'Lee'
};

console.log(Object.getOwnPropertyDescriptor(person, 'name'));
// {value: "Lee", writable: true, enumerable: true, configurable: true}
```

- Returns a *property descriptor object* for a single property
- Returns `undefined` for properties that do not exist or are inherited
- First argument: a `reference to the object`
- Second argument: the `property key as a string`
- `Object.getOwnPropertyDescriptors`
  - Introduced in ES8; returns property descriptor objects holding the attribute information for *all* of an object's own properties

## Data Properties and Accessor Properties

### Data properties

- Ordinary properties consisting of a `key` and a `value`

| Property attribute | Descriptor object property | Description |
|--------------------|----------------------------|-------------|
| [[Value]] | value | - The value returned when the property value is `accessed through the property key`<br/>- Assigning a value through the property key reassigns [[Value]] |
| [[Writable]] | writable | - Indicates whether the property value is `writable` (boolean)<br/>- When [[Writable]] is `false`, the property's [[Value]] cannot be changed: it becomes a `read-only property` |
| [[Enumerable]] | enumerable | - Indicates whether the property is `enumerable` (boolean)<br/>- When [[Enumerable]] is `false`, the property `cannot be enumerated` by `for ... in` loops, the `Object.keys` method, etc. |
| [[Configurable]] | configurable | - Indicates whether the property can be `redefined` (boolean)<br/>- When [[Configurable]] is `false`, deleting the property and `changing its attribute values are forbidden`<br/>- As an exception, while [[Writable]] is true, changing [[Value]] and setting [[Writable]] to false are still allowed |

```javascript
const person = {
  name: 'Lee'
};

// Obtain the property descriptor object holding the attribute information
console.log(Object.getOwnPropertyDescriptor(person, 'name'));
// {value: "Lee", writable: true, enumerable: true, configurable: true}
```

- When a property is created, [[Value]] is initialized with the property value
- [[Writable]], [[Enumerable]] and [[Configurable]] are initialized to true

### Accessor properties

- Properties with no value of their own; they consist of `accessor functions` called when reading or storing another data property's value

| Property attribute | Descriptor object property | Description |
|--------------------|----------------------------|-------------|
| [[Get]] | get | - Called when a data property's value is `read`<br/>- Accessing the value through the property key calls the value of the [[Get]] attribute, i.e. the `getter` function |
| [[Set]] | set | - Called when a data property's value is `stored`<br/>- Storing a value through the property key calls the value of the [[Set]] attribute, i.e. the `setter` function |
| [[Enumerable]] | enumerable | - Same as for data properties |
| [[Configurable]] | configurable | - Same as for data properties |

- Accessor properties hold no value themselves; they are involved only when reading or storing values

### Distinguishing accessor properties from data properties

```javascript
Object.getOwnPropertyDescriptor(Object.prototype, '__proto__');
// {get: f, set: f, enumerable: false, configurable: true}

Object.getOwnPropertyDescriptor(function() {}, 'prototype');
// {value: {...}, writable: true, enumerable: false, configurable: false}
```

- `__proto__` on ordinary objects is an accessor property
- `prototype` on function objects is a data property

## Defining Properties

- Adding a new property, defining its attributes, redefining it, and so on
- Lets you specify whether a property's value may be updated, whether it is enumerable, and whether it may be redefined

### `Object.defineProperty`

- `Defines` a property's attributes
- Takes an object reference, a property key string, and a descriptor object as arguments

| Descriptor object property | Corresponding attribute | Default when omitted |
|----------------------------|-------------------------|----------------------|
| value | [[Value]] | undefined |
| get | [[Get]] | undefined |
| set | [[Set]] | undefined |
| writable | [[Writable]] | false |
| enumerable | [[Enumerable]] | false |
| configurable | [[Configurable]] | false |

- Defines only one property at a time

```javascript
const person = {}

// Defining data properties
Object.defineProperty(person, 'firstName', {
  value: 'Ungmo',
  writable: true,
  enumerable: true,
  configurable: true
});

Object.defineProperty(person, 'lastName', {
  value: 'Lee',
});

// Defining an accessor property
Object.defineProperty(person, 'fullName', {
  // getter function
  get() {
    return `${this.firstName} ${this.lastName}`;
  },
  // setter function
  set(name) {
    [this.firstName, this.lastName] = name.split(' ');
  },
  enumerable: true,
  configurable: true
});
```

- To define several properties at once, use `Object.defineProperties`

## Preventing Object Mutation

- Objects are mutable values, so they can be changed directly without reassignment
  - Property `addition, deletion and update`
  - Attributes can be `redefined` with the Object.defineProperty and Object.defineProperties methods

| Kind | Method | Add property | Delete property | Read value | Write value | Redefine attributes |
|------|--------|--------------|-----------------|------------|-------------|---------------------|
| Prevent extensions | Object.preventExtensions | X | O | O | O | O |
| Seal | Object.seal | X | X | O | O | X |
| Freeze | Object.freeze | X | X | O | X | X |

### `Object.preventExtensions` (prevent extensions)

- On a non-extensible object, property `addition is forbidden`
- Addition is forbidden, but `deletion is still possible`
- To check whether an object is extensible:
  - `Object.isExtensible(object)` → Boolean

### `Object.seal` (seal)

- Forbids property `addition` and `deletion`, and attribute `redefinition`
- A sealed object allows only `reading` and `writing`
  - configurable → false
- To check whether an object is sealed:
  - `Object.isSealed(object)` → Boolean

### `Object.freeze` (freeze)

- Forbids property `addition` and `deletion`, attribute `redefinition`, and value `updates`
- A frozen object allows only `reading`
- To check whether an object is frozen:
  - `Object.isFrozen(object)` → Boolean

> The methods above provide only `shallow` protection: they restrict the object's own direct properties and `have no effect on nested objects`.

> **Referenced**

- 이응모, 『모던 자바스크립트 Deep Dive』, 위키북스(2022.4.25), 220 ~ 233p
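To make the shallow-protection caveat above concrete, here is a small sketch (plain Node.js assumed; the `deepFreeze` helper is illustrative, not part of the language):

```javascript
const person = Object.freeze({ name: 'Lee', address: { city: 'Seoul' } });

// The object itself is frozen, but its nested object is not (shallow freeze)
console.log(Object.isFrozen(person));         // true
console.log(Object.isFrozen(person.address)); // false

// Mutating the nested object therefore still works
person.address.city = 'Busan';
console.log(person.address.city);             // 'Busan'

// To freeze nested objects as well, freeze recursively (a minimal sketch)
function deepFreeze(obj) {
  Object.keys(obj).forEach((key) => {
    if (typeof obj[key] === 'object' && obj[key] !== null) deepFreeze(obj[key]);
  });
  return Object.freeze(obj);
}

const frozen = deepFreeze({ a: { b: 1 } });
console.log(Object.isFrozen(frozen.a));       // true
```

Note that assigning to a frozen property is silently ignored in sloppy mode but throws a `TypeError` in strict mode.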
35.585635
228
0.487191
kor_Hang
0.999988
e17348cda4cd8489bbd36e069fe333a4d1752353
4,043
md
Markdown
node_modules/@storybook/ui/node_modules/telejson/README.md
rkuma238/React_UI
75a2c01ff79256bb2782d8ad9f37ebdaa5eb0fe1
[ "MIT" ]
2
2016-09-17T20:20:45.000Z
2018-04-14T06:07:48.000Z
node_modules/@storybook/ui/node_modules/telejson/README.md
rkuma238/React_UI
75a2c01ff79256bb2782d8ad9f37ebdaa5eb0fe1
[ "MIT" ]
2
2018-01-25T16:38:09.000Z
2018-01-25T16:40:32.000Z
node_modules/@storybook/ui/node_modules/telejson/README.md
rkuma238/React_UI
75a2c01ff79256bb2782d8ad9f37ebdaa5eb0fe1
[ "MIT" ]
null
null
null
# TeleJSON

A library for teleporting rich data to another place.

## Install

```sh
yarn add telejson
```

## What it can do, what it can't do

`JSON.parse` & `JSON.stringify` are limited by design, because JSON has no data formats for things like:

- Date
- Function
- Class
- Symbol
- etc.

JSON also doesn't support cyclic data structures.

This library allows you to pass in data with all of the above properties. It transforms those properties into something allowed by the JSON spec while stringifying, and converts them back — including cyclic data structures — when parsing.

When parsing, **class instances** will be given the class's name again. The prototype isn't copied over.

**Functions** are supported: they are stringified and will be eval-ed when called. This lazy eval is important for performance. The eval happens via `eval()`. Functions are stripped of comments and whitespace.

> Obviously, calling the function will only really work as expected if the function was pure to begin with.

**Regular expressions** just work.

**Symbol** will be re-created with the same string (resulting in a similar, but different, symbol).

**Dates** are parsed back into actual Date objects.

## API

You have 2 choices:

```js
import { stringify, parse } from 'telejson';

const Foo = function () {};

const root = {
  date: new Date('2018'),
  regex1: /foo/,
  regex2: /foo/g,
  regex3: new RegExp('foo', 'i'),
  fn1: () => 'foo',
  fn2: function fn2() {
    return 'foo';
  },
  Foo: new Foo(),
};

// something cyclic
root.root = root;

const stringified = stringify(root);
const parsed = parse(stringified);
```

Note that `stringify` and `parse` do not conform to the `JSON.stringify` or `JSON.parse` API: they take a data object and an options object.

OR you can use the `replacer` and `reviver`:

```js
import { replacer, reviver } from 'telejson';
import data from 'somewhere';

const stringified = JSON.stringify(data, replacer(), 2);
const parsed = JSON.parse(stringified, reviver());
```

Notice that both `replacer` and `reviver` need to be **called**! The following will NOT work:

```js
const stringified = JSON.stringify(data, replacer, 2);
const parsed = JSON.parse(stringified, reviver);
```

## Options

You either pass the options object to `replacer`, or as a second argument to `stringify`:

```js
replacer({ maxDepth: 10 });
stringify(data, { maxDepth: 10 });
```

### replacer

`maxDepth`: controls how deep to keep stringifying. When the max depth is reached, objects will be replaced with `"[Object]"` and arrays with `"[Array(<length>)]"`. The default value is `10`. This option is really useful if your object is huge/complex and you don't care about the deeply nested data.

`space`: controls how to prettify the output string. The default value is `undefined`: no white space is used. Only relevant when using `stringify`.

`allowFunction`: When set to false, functions will not be serialized. (default = true)

`allowRegExp`: When set to false, regular expressions will not be serialized. (default = true)

`allowClass`: When set to false, class instances will not be serialized. (default = true)

`allowDate`: When set to false, Date objects will not be serialized. (default = true)

`allowUndefined`: When set to false, `undefined` will not be serialized. (default = true)

`allowSymbol`: When set to false, Symbols will not be serialized. (default = true)

### reviver

`lazyEval`: When set to false, lazy eval will be disabled. (default = true)

Note: disabling lazy eval will affect performance. Consider disabling it only if you truly need to.

## Requirements

`telejson` depends on the collection type `Map`. If you support older browsers and devices which may not yet provide it natively (e.g. IE < 11) or which have non-compliant implementations (e.g. IE 11), consider including a global polyfill in your bundled application, such as `core-js` or `babel-polyfill`.

## Contributing

If you have any suggestions, please open an issue. All contributions are welcome!

### run tests:

```sh
yarn test
```
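The reason the parentheses matter is that `replacer` and `reviver` are factories: calling them *returns* the function that `JSON.stringify` / `JSON.parse` expect. A minimal illustration of the pattern using only the built-in `JSON` API (the `replacerFactory` name is hypothetical — this is not telejson's actual implementation):

```javascript
// A factory: calling it RETURNS the replacer function JSON.stringify expects.
// Passing the factory itself (without calling it) would make JSON.stringify
// treat the factory as the replacer, which does something entirely different.
const replacerFactory = () => (key, value) =>
  typeof value === 'function' ? value.toString() : value;

const data = { fn: () => 'foo', n: 1 };

const stringified = JSON.stringify(data, replacerFactory(), 2);
const parsed = JSON.parse(stringified);

console.log(typeof parsed.fn); // 'string' — the function's source text
console.log(parsed.n);         // 1
```

telejson's real replacer additionally handles dates, regexes, symbols, class instances, and cycles, but the factory shape is the same.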
28.076389
309
0.730398
eng_Latn
0.996171
e173529fcbcb4751ab31b858d1832f7d8b3477d4
1,072
md
Markdown
developers/api/methods/staking-related-methods/hmy_getstake.md
metronotes-testing/docs-developer
9232844cb6fc75d68ab42069c433d29305f8b6fb
[ "MIT" ]
null
null
null
developers/api/methods/staking-related-methods/hmy_getstake.md
metronotes-testing/docs-developer
9232844cb6fc75d68ab42069c433d29305f8b6fb
[ "MIT" ]
null
null
null
developers/api/methods/staking-related-methods/hmy_getstake.md
metronotes-testing/docs-developer
9232844cb6fc75d68ab42069c433d29305f8b6fb
[ "MIT" ]
null
null
null
# hmy\_getStake

## API v1

### Parameters

1. `String` - validator one address \("one1..."\)

### Returns

* `String` - the validator's stake, in 0x (hex) format

### Sample Curl Request

```bash
curl -d '{
    "id": "1",
    "jsonrpc": "2.0",
    "method": "hmy_getStake",
    "params": [
        "one1z05g55zamqzfw9qs432n33gycdmyvs38xjemyl",
        "latest"
    ]
}' -H "Content-Type: application/json" -X POST "http://localhost:9500"
```

**Sample Curl Response**

```javascript
{
    "jsonrpc": "2.0",
    "id": 1,
    "result": "0x6046f35fca29af800"
}
```

## API v2

### Parameters

1. `String` - validator one address \("one1..."\)

### Returns

* `uint64` - the validator's stake

### Sample Curl Request

```bash
curl -d '{
    "id": "1",
    "jsonrpc": "2.0",
    "method": "hmyv2_getStake",
    "params": [
        "one1z05g55zamqzfw9qs432n33gycdmyvs38xjemyl"
    ]
}' -H "Content-Type: application/json" -X POST "http://localhost:9500"
```

**Sample Curl Response**

```javascript
{
    "jsonrpc": "2.0",
    "id": 1,
    "result": "100000000000000"
}
```
15.314286
70
0.58209
kor_Hang
0.239562
e17369209c70716273dc61ca9ca858df8e93b1cd
385
md
Markdown
examples/downhill-bike-physics-demo/README.md
GDevelopApp/GDevelop-examples
14acafa79be1d116c30bb93c0a72bbf9c93571ff
[ "MIT" ]
17
2021-05-31T22:02:27.000Z
2022-03-09T03:34:36.000Z
examples/downhill-bike-physics-demo/README.md
GDevelopApp/GDevelop-examples
14acafa79be1d116c30bb93c0a72bbf9c93571ff
[ "MIT" ]
160
2021-06-01T22:48:08.000Z
2022-03-30T23:54:29.000Z
examples/downhill-bike-physics-demo/README.md
GDevelopApp/GDevelop-examples
14acafa79be1d116c30bb93c0a72bbf9c93571ff
[ "MIT" ]
8
2021-06-01T07:40:09.000Z
2021-09-23T04:09:15.000Z
Try to cross the finish line as fast as possible without falling over! And remember to pedal safe!

Starting with this template, use the Physics Engine to build an advanced simulation of a bike with a person pedaling on it! You will discover how to join objects, apply basic physics behaviors to your scene, and change your camera settings to adapt to what's happening during the game.
96.25
284
0.805195
eng_Latn
0.999712
e17458ce0a421499acfc28f550dc01bec359a1a0
888
md
Markdown
_posts/2018-05-23-minutes-template-representation.md
ga4gh-cp/ga4gh-cp.github.io
c9798b95d6ecca31c7b03d45cc682fe8228ea617
[ "MIT" ]
null
null
null
_posts/2018-05-23-minutes-template-representation.md
ga4gh-cp/ga4gh-cp.github.io
c9798b95d6ecca31c7b03d45cc682fe8228ea617
[ "MIT" ]
3
2019-04-01T16:00:58.000Z
2019-05-06T17:17:50.000Z
_posts/2018-05-23-minutes-template-representation.md
ga4gh-cp/ga4gh-cp.github.io
c9798b95d6ecca31c7b03d45cc682fe8228ea617
[ "MIT" ]
3
2018-05-29T13:35:54.000Z
2019-11-07T16:26:30.000Z
---
title: 'Representation Minutes Archive'
date: 2018-05-23
layout: default
author: mbaudis
permalink: /minutes-representation.html
category:
  - representation
tags:
  - featured
---

## GA4GH::CP {{ page.title }}

The (combined) current meeting minutes are published and accessible [here](https://docs.google.com/document/d/1Qfms-6C8z1sFcjbhtcdpeUeAyeFF6vmGjX7sGCV3DEs/edit) for review and comments.

### Archive

{% for item in site.categories.representation %}
  {% if item.tags contains 'minutes' %}
    {% assign currentyear = item.date | date: "%Y" %}
    {% if currentyear != currentdate %}
      <h2 id="y{{ currentyear }}">{{ currentyear }}</h2>
      {% assign currentdate = currentyear %}
    {% endif %}
    <div class="excerpt">
      {{ item.excerpt }}
      <p>{{ item.date | date: "%Y-%m-%d" }}: <a href="{{ item.url | relative_url }}">more ...</a></p>
    </div>
  {% endif %}
{% endfor %}
27.75
189
0.658784
eng_Latn
0.668178
e1746cd3f07a8bd120557f79cdf0b2b7d6b99f25
10,889
md
Markdown
sourceCodeAnalysis/builtin-type/slice.md
DSZSN/Go-Notes
4ae9c4e0893c9533070606e598145c30e12a6281
[ "Apache-2.0" ]
85
2019-01-09T09:30:36.000Z
2021-12-29T23:32:15.000Z
sourceCodeAnalysis/builtin-type/slice.md
DSZSN/Go-Notes
4ae9c4e0893c9533070606e598145c30e12a6281
[ "Apache-2.0" ]
2
2019-01-09T09:53:13.000Z
2019-07-29T08:42:54.000Z
sourceCodeAnalysis/builtin-type/slice.md
DSZSN/Go-Notes
4ae9c4e0893c9533070606e598145c30e12a6281
[ "Apache-2.0" ]
8
2019-01-03T10:42:19.000Z
2021-05-28T13:42:22.000Z
# slice

The slice type is a composite type made up of a header and an underlying array. The header consists of three fields — ptr, len and cap — 24 bytes in total. ptr points to the underlying byte array that stores the data, len is the number of elements in the slice, and cap is the length of the underlying array that ptr points to.

## The slice type

A slice is usually created with make(type, len, cap). make is a built-in that allocates and initializes a slice object and returns a value of that type.

When creating a slice with make, if cap is a small constant expression, make usually allocates on the goroutine stack (no garbage collection needed). If the requested size is large — e.g. a large constant (64*1024) or a variable (i+1) — the slice is allocated on the heap instead (via makeslice).

```
+---------------------+
|  Ptr |  len  |  cap |
+---------------------+
   |
   |    +-------------------+
   +--->| 0 | 0 | 0 | 0 |
        +-------------------+
```

### slice header

```go
type slice struct {
    array unsafe.Pointer // points to the underlying array
    len   int            // slice length
    cap   int            // underlying array length
}
```

### cap example 1: constant cap

```go
//go:noinline
func f1() {
    s := make([]int, 0, 128)
    _ = s
}
```

```
$> go build -gcflags "-l -m -N" -o test main.go
# command-line-arguments
./main.go:10:14: main make([]int, 0, 128) does not escape   // no escape
```

### cap example 2: large constant cap

```go
//go:noinline
func f2() {
    s := make([]int, 0, 10000)
    _ = s
}
```

```
$> go build -gcflags "-l -m -N" -o test main.go
# command-line-arguments
./main.go:10:14: make([]int, 0, 10000) escapes to heap   // make escaped to the heap
```

### cap example 3: variable cap

Here cap is the variable i, a runtime expression: the compiler would have to perform data-flow analysis to prove that i has the value 1, so the slice escapes.

```go
func f3() {
    var i int = 1
    s := make([]int, 0, i)
    _ = s
}
```

```
$> go build -gcflags "-l -m -N" -o test main.go
# command-line-arguments
./main.go:10:14: make([]int, 0, i) escapes to heap   // make escaped to the heap
```

## Creating a slice

```go
func makeslice(et *_type, len, cap int) slice {
    // maxElements: the maximum cap that can be allocated for this element type
    maxElements := maxSliceCap(et.size)

    // Guard against len being so large that it exceeds the addressable
    // memory range; see issue 4085.
    if len < 0 || uintptr(len) > maxElements {
        panicmakeslicelen()
    }

    if cap < len || uintptr(cap) > maxElements {
        panicmakeslicecap()
    }

    // Create the slice (allocated on the heap)
    p := mallocgc(et.size*uintptr(cap), et, true)
    return slice{p, len, cap}
}
```

### slice length

```go
var maxElems = [...]uintptr{
    ^uintptr(0),
    maxAlloc / 1, maxAlloc / 2, maxAlloc / 3, maxAlloc / 4,
    maxAlloc / 5, maxAlloc / 6, maxAlloc / 7, maxAlloc / 8,
    maxAlloc / 9, maxAlloc / 10, maxAlloc / 11, maxAlloc / 12,
    maxAlloc / 13, maxAlloc / 14, maxAlloc / 15, maxAlloc / 16,
    maxAlloc / 17, maxAlloc / 18, maxAlloc / 19, maxAlloc / 20,
    maxAlloc / 21, maxAlloc / 22, maxAlloc / 23, maxAlloc / 24,
    maxAlloc / 25, maxAlloc / 26, maxAlloc / 27, maxAlloc / 28,
    maxAlloc / 29, maxAlloc / 30, maxAlloc / 31, maxAlloc / 32,
}

// maxAlloc is the maximum allocation size
maxAlloc = (1 << heapAddrBits) - (1-_64bit)*1

// maxSliceCap returns the maximum capacity for a slice.
func maxSliceCap(elemsize uintptr) uintptr {
    // If elemsize is smaller than len(maxElems), read the limit from the table
    if elemsize < uintptr(len(maxElems)) {
        return maxElems[elemsize]
    }
    // Otherwise derive the limit from the element size
    return maxAlloc / elemsize
}
```

## Memory allocation

This part really belongs to the memory allocator, but since the examples above call mallocgc, it is covered here in advance.

```go
func mallocgc(size uintptr, typ *_type, needzero bool) unsafe.Pointer {
    // A zero-size allocation returns the address of zerobase
    if size == 0 {
        return unsafe.Pointer(&zerobase)
    }

    // If debug.sbrk is non-zero, allocate on a simple persistent allocator
    // (used for function/type/debug-related allocations).
    // debug.sbrk defaults to 0, in which case the regular allocator is used.
    if debug.sbrk != 0 {
        align := uintptr(16)
        if typ != nil {
            align = uintptr(typ.align)
        }
        return persistentalloc(size, align, &memstats.other_sys)
    }

    // assistG is the G to charge for this allocation, or nil if
    // GC is not currently active.
    var assistG *g
    if gcBlackenEnabled != 0 {
        // Charge the current user G for this allocation.
        assistG = getg()
        if assistG.m.curg != nil {
            assistG = assistG.m.curg
        }
        // Charge the allocation against the G. We'll account
        // for internal fragmentation at the end of mallocgc.
        assistG.gcAssistBytes -= int64(size)

        if assistG.gcAssistBytes < 0 {
            // This G is in debt. Assist the GC to correct
            // this before allocating. This must happen
            // before disabling preemption.
            gcAssistAlloc(assistG)
        }
    }

    // Set mp.mallocing to keep from being preempted by GC.
    // Get the m (execution thread) of the current goroutine.
    mp := acquirem()
    // Check whether this m is already in the middle of an allocation
    if mp.mallocing != 0 {
        throw("malloc deadlock")
    }
    // Check whether we are allocating on the signal goroutine
    if mp.gsignal == getg() {
        throw("malloc during signal")
    }
    // Take the allocation flag; other allocations on this m now block
    mp.mallocing = 1

    shouldhelpgc := false
    dataSize := size
    c := gomcache()
    var x unsafe.Pointer
    noscan := typ == nil || typ.kind&kindNoPointers != 0

    // Tiny allocator: a small-object allocator.
    // When the requested size is below maxSmallSize, several tiny objects can
    // share a single memory block; the block is freed once none of its
    // sub-objects are reachable.
    // The object must not contain pointer-typed sub-objects, otherwise the
    // tiny allocator is not used (mainly to avoid wasting memory).
    // The tiny allocator is mainly used for small strings and escaped
    // variables. In stress tests against json workloads it reduced the
    // allocation count by roughly 12% and the heap size by roughly 20%.
    // maxSmallSize defaults to 32KB ("small objects").
    if size <= maxSmallSize {
        // The object contains no pointers and is smaller than maxTinySize
        // (16 bytes by default, tunable)
        if noscan && size < maxTinySize {
            // Align the offset
            off := c.tinyoffset
            if size&7 == 0 {
                off = round(off, 8)
            } else if size&3 == 0 {
                off = round(off, 4)
            } else if size&1 == 0 {
                off = round(off, 2)
            }
            // tiny points to the start of the current tiny block; if it is 0,
            // no block has been allocated yet.
            // 1. offset (bytes already used) + new object size <= tiny block size
            // 2. a tiny block exists
            // If both hold, the new object fits into the existing tiny block.
            if off+size <= maxTinySize && c.tiny != 0 {
                x = unsafe.Pointer(c.tiny + off) // slice header ptr points here
                c.tinyoffset = off + size        // update the tiny offset
                c.local_tinyallocs++             // counter
                mp.mallocing = 0
                releasem(mp)                     // release the m's allocation flag
                return x
            }
            // Allocate a new tiny block
            span := c.alloc[tinySpanClass]
            v := nextFreeFast(span)
            x = unsafe.Pointer(v) // slice header ptr points to the new block
            (*[2]uint64)(x)[0] = 0
            (*[2]uint64)(x)[1] = 0
            size = maxTinySize
        } else {
            // These two tables map an object size to its size-class index:
            // size_to_class8 for objects smaller than 1KB,
            // size_to_class128 for objects of 1–32KB.
            var sizeclass uint8
            if size <= smallSizeMax-8 {
                sizeclass = size_to_class8[(size+smallSizeDiv-1)/smallSizeDiv]
            } else {
                sizeclass = size_to_class128[(size-smallSizeMax+largeSizeDiv-1)/largeSizeDiv]
            }
            // class_to_size maps a size class (its index in the global class
            // table) to the amount of memory it occupies
            size = uintptr(class_to_size[sizeclass])

            // Request a span of this size class from the heap
            spc := makeSpanClass(sizeclass, noscan)
            span := c.alloc[spc]
            // Try to grab a free slot from the cache
            v := nextFreeFast(span)
            if v == 0 {
                // On failure, refill the cache with a fresh span and retry.
                // Before refilling, the cache is checked for free memory once
                // more. Filling the cache may trigger GC.
                v, span, shouldhelpgc = c.nextFree(spc)
            }
            // slice header ptr points to the underlying array
            x = unsafe.Pointer(v)
        }
    } else {
        // Large objects are allocated directly from the heap
        var s *mspan
        shouldhelpgc = true
        systemstack(func() {
            s = largeAlloc(size, needzero, noscan)
        })
        s.freeindex = 1
        s.allocCount = 1
        // slice header ptr points to the underlying array
        x = unsafe.Pointer(s.base())
        size = s.elemsize
    }
    return x
}
```

## Growing a slice

When a slice runs out of capacity, growslice is called to grow it.

### Example 1:

```go
func main() {
    s := make([]int, 0, 2)
    var a []int = []int{1, 2, 3, 4, 5, 6}
    s = append(s, a...)
}
```

* Disassembly

```
0x000000000104abfc <+300>: mov QWORD PTR [rsp+0x8],rdx
0x000000000104ac01 <+305>: mov QWORD PTR [rsp+0x10],rcx
0x000000000104ac06 <+310>: mov QWORD PTR [rsp+0x18],rax
0x000000000104ac0b <+315>: mov rax,QWORD PTR [rsp+0x48]
0x000000000104ac10 <+320>: mov QWORD PTR [rsp+0x20],rax
0x000000000104ac15 <+325>: call 0x10333c0 <runtime.growslice>
```

### Growslice source

growslice is only called from append() when the current cap is not enough. The function returns at least the new slice's capacity; the new slice's len stays the old slice's len (not the new capacity), and the position for the newly appended elements is computed from it.

SSA was introduced in Go 1.7 as Go's new compiler backend. The SSA backend would prefer growslice to return just the old length, or only a pointer, to save stack space; for now the old calling convention is still used — worth tracking as growslice evolves.

```go
// et  : slice element type
// old : the old slice
// cap : the required capacity
func growslice(et *_type, old slice, cap int) slice {
    // element type of size 0
    if et.size == 0 {
        // append cannot create a slice with len > 0 around a nil pointer.
        // In this case no underlying array is needed; return a slice
        // around the zero base pointer.
        return slice{unsafe.Pointer(&zerobase), old.len, cap}
    }

    newcap := old.cap
    doublecap := newcap + newcap
    if cap > doublecap {
        // required cap > twice the old cap: use the required cap
        newcap = cap
    } else {
        if old.len < 1024 {
            // required cap <= twice the old cap, and old len < 1024:
            // the new cap is twice the old cap
            newcap = doublecap
        } else {
            // old len >= 1024: grow the cap by 25% per iteration
            for 0 < newcap && newcap < cap {
                newcap += newcap / 4
            }
            // if the computation overflowed, fall back to the requested cap
            if newcap <= 0 {
                newcap = cap
            }
        }
    }

    // The switch below computes the memory sizes before and after growth,
    // with fast paths for a few common element sizes.
    // overflow : whether the requested cap would overflow the heap
    // lenmem   : memory size of the old slice's len
    // newlenmem: memory size of the grown slice's len
    // capmem   : memory size of the grown slice's cap
    var overflow bool
    var lenmem, newlenmem, capmem uintptr
    switch {
    case et.size == 1:
        lenmem = uintptr(old.len)
        newlenmem = uintptr(cap)
        // round up to a size class; mallocgc allocates this much for the
        // new underlying array
        capmem = roundupsize(uintptr(newcap))
        overflow = uintptr(newcap) > maxAlloc
        newcap = int(capmem)
    case et.size == sys.PtrSize:
        lenmem = uintptr(old.len) * sys.PtrSize
        newlenmem = uintptr(cap) * sys.PtrSize
        capmem = roundupsize(uintptr(newcap) * sys.PtrSize)
        overflow = uintptr(newcap) > maxAlloc/sys.PtrSize
        newcap = int(capmem / sys.PtrSize)
    case isPowerOfTwo(et.size):
        // e.g. int64, uint64, ...
        var shift uintptr
        if sys.PtrSize == 8 {
            // sys.Ctz64 counts the trailing zero bits of et.size
            // 16 => 10000 => 4
            // 4 & 63 => 4
            shift = uintptr(sys.Ctz64(uint64(et.size))) & 63
        } else {
            shift = uintptr(sys.Ctz32(uint32(et.size))) & 31
        }
        lenmem = uintptr(old.len) << shift
        newlenmem = uintptr(cap) << shift
        capmem = roundupsize(uintptr(newcap) << shift)
        overflow = uintptr(newcap) > (maxAlloc >> shift)
        newcap = int(capmem >> shift)
    default:
        lenmem = uintptr(old.len) * et.size
        newlenmem = uintptr(cap) * et.size
        capmem = roundupsize(uintptr(newcap) * et.size)
        overflow = uintptr(newcap) > maxSliceCap(et.size)
        newcap = int(capmem / et.size)
    }

    // The cap overflows when any of the following holds:
    // 1. the grown cap is smaller than the old cap
    // 2. the grown cap exceeds maxSliceCap(et.size)
    // 3. the grown cap's memory exceeds the heap's addressable range
    if cap < old.cap || overflow || capmem > maxAlloc {
        panic(errorString("growslice: cap out of range"))
    }

    var p unsafe.Pointer
    // Reallocate the memory on the heap
    if et.kind&kindNoPointers != 0 {
        // element type contains no pointers
        p = mallocgc(capmem, nil, false)
        memmove(p, old.array, lenmem) // copy old data into the new array
        // Only clear the region that will not be overwritten
        memclrNoHeapPointers(add(p, newlenmem), capmem-newlenmem)
    } else {
        p = mallocgc(capmem, et, true)
        memmove(p, old.array, lenmem)
    }

    // Return the grown slice
    return slice{p, old.len, newcap}
}
```

## Creating a slice via convT2Eslice

Besides make, a slice can also be created through convT2Eslice.

```go
var a []int
```

When source code like the above is compiled and the slice value is stored into an empty interface, the runtime creates the slice copy through the runtime.convT2Eslice function.

```go
func convT2Eslice(t *_type, elem unsafe.Pointer) (e eface) {
    var x unsafe.Pointer
    // reinterpret elem as a slice
    if v := *(*slice)(elem); uintptr(v.array) == 0 {
        // if the slice's array pointer is nil, return the zero value
        x = unsafe.Pointer(&zeroVal[0])
    } else {
        // allocate memory and copy the slice header
        x = mallocgc(t.size, t, true)
        *(*slice)(x) = *(*slice)(elem)
    }
    // return the eface
    e._type = t
    e.data = x
    return
}
```
24.037528
139
0.650381
yue_Hant
0.211289
e1754b7b25ebb9b93b8cbd13b538142db9f7772e
122
md
Markdown
README.md
viking-sudo-rm/personal-website
a3e3538702423e8c048b14a3a131db5a1a0bc97f
[ "MIT" ]
null
null
null
README.md
viking-sudo-rm/personal-website
a3e3538702423e8c048b14a3a131db5a1a0bc97f
[ "MIT" ]
null
null
null
README.md
viking-sudo-rm/personal-website
a3e3538702423e8c048b14a3a131db5a1a0bc97f
[ "MIT" ]
null
null
null
My personal website, built using the [Academic](https://sourcethemes.com/academic/) theme for [Hugo](https://gohugo.io/).
61
121
0.745902
eng_Latn
0.53624
e176075c9fbafa41b0fae5976984464bc058f228
29
md
Markdown
README.md
Sly-Ry/pizza-hunt
86fcfc51f1332ec5bb3cd649c9b1b75c2b449965
[ "MIT" ]
null
null
null
README.md
Sly-Ry/pizza-hunt
86fcfc51f1332ec5bb3cd649c9b1b75c2b449965
[ "MIT" ]
5
2022-03-06T18:55:57.000Z
2022-03-10T22:09:33.000Z
README.md
Sly-Ry/pizza-hunt
86fcfc51f1332ec5bb3cd649c9b1b75c2b449965
[ "MIT" ]
null
null
null
# pizza-hunt

MongoDB web app
9.666667
15
0.758621
hun_Latn
0.299594
e176ef7691bc759e0c5d86e750d6368db9ebf90b
99
md
Markdown
playground/cargo-scripts/README.md
dougtq/rust-lang
b95727426da36f7fce8d2b32395ce25d14094d5f
[ "MIT" ]
151
2019-09-03T16:46:32.000Z
2022-03-16T06:17:51.000Z
playground/cargo-scripts/README.md
dougtq/rust-lang
b95727426da36f7fce8d2b32395ce25d14094d5f
[ "MIT" ]
3
2020-05-17T10:28:08.000Z
2021-02-25T02:46:28.000Z
playground/cargo-scripts/README.md
dougtq/rust-lang
b95727426da36f7fce8d2b32395ce25d14094d5f
[ "MIT" ]
46
2019-03-20T21:07:40.000Z
2022-03-11T14:20:42.000Z
### Install

```bash
cargo install cargo-script
```

### Run

```bash
chmod +x now.crs
./now.crs
```
9
26
0.59596
eng_Latn
0.422139
e1770b7ed75a8077ce1b88b058b514d85b5c97bc
2,310
md
Markdown
packages/secret-key/CHANGELOG.md
qiwi/masker
d157019f08213d4f2c33b79c717a8cca33cf22a1
[ "MIT" ]
7
2020-09-24T21:56:01.000Z
2022-03-24T20:31:42.000Z
packages/secret-key/CHANGELOG.md
qiwi/masker
d157019f08213d4f2c33b79c717a8cca33cf22a1
[ "MIT" ]
66
2020-06-24T15:34:48.000Z
2021-08-30T08:49:24.000Z
packages/secret-key/CHANGELOG.md
qiwi/masker
d157019f08213d4f2c33b79c717a8cca33cf22a1
[ "MIT" ]
null
null
null
## @qiwi/masker-secret-key [1.0.6](https://github.com/qiwi/masker/compare/@qiwi/masker-secret-key@1.0.5...@qiwi/masker-secret-key@1.0.6) (2021-11-18)

### Performance Improvements

* update deps ([e2e0f2d](https://github.com/qiwi/masker/commit/e2e0f2d9020d8f53d9e67d748a0566030ad367f6))

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.13.2
* **@qiwi/masker-plain:** upgraded to 1.2.17

## @qiwi/masker-secret-key [1.0.5](https://github.com/qiwi/masker/compare/@qiwi/masker-secret-key@1.0.4...@qiwi/masker-secret-key@1.0.5) (2021-11-04)

### Bug Fixes

* update deps, fix some vuls ([d303201](https://github.com/qiwi/masker/commit/d303201ab664ad185d0e64243301796611041274))

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.13.1
* **@qiwi/masker-plain:** upgraded to 1.2.16

## @qiwi/masker-secret-key [1.0.4](https://github.com/qiwi/masker/compare/@qiwi/masker-secret-key@1.0.3...@qiwi/masker-secret-key@1.0.4) (2021-07-18)

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.13.0
* **@qiwi/masker-plain:** upgraded to 1.2.15

## @qiwi/masker-secret-key [1.0.3](https://github.com/qiwi/masker/compare/@qiwi/masker-secret-key@1.0.2...@qiwi/masker-secret-key@1.0.3) (2021-07-11)

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.12.1
* **@qiwi/masker-plain:** upgraded to 1.2.14

## @qiwi/masker-secret-key [1.0.2](https://github.com/qiwi/masker/compare/@qiwi/masker-secret-key@1.0.1...@qiwi/masker-secret-key@1.0.2) (2021-07-10)

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.12.0
* **@qiwi/masker-plain:** upgraded to 1.2.13

## @qiwi/masker-secret-key [1.0.1](https://github.com/qiwi/masker/compare/@qiwi/masker-secret-key@1.0.0...@qiwi/masker-secret-key@1.0.1) (2021-07-09)

### Bug Fixes

* **common:** fix pipe opts resolver ([0c20d21](https://github.com/qiwi/masker/commit/0c20d2138f2d8e8319ca492077c2e6795b7c768b))

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.11.0
* **@qiwi/masker-plain:** upgraded to 1.2.12

# @qiwi/masker-secret-key 1.0.0 (2021-07-07)

### Bug Fixes

* **secret-ky:** fix pkg name ([f837351](https://github.com/qiwi/masker/commit/f837351077a16a4f08d7e4560608b7ac54203337))

### Dependencies

* **@qiwi/masker-common:** upgraded to 1.10.1
* **@qiwi/masker-plain:** upgraded to 1.2.11
23.814433
149
0.68658
eng_Latn
0.080379
e177939a21d208bbeb25eef41eedc51a3a70abfe
13,661
md
Markdown
README.md
flex-development/mango
57cb472477ba47ffe7ef92761aabf4f68c7b83b6
[ "BSD-3-Clause" ]
null
null
null
README.md
flex-development/mango
57cb472477ba47ffe7ef92761aabf4f68c7b83b6
[ "BSD-3-Clause" ]
null
null
null
README.md
flex-development/mango
57cb472477ba47ffe7ef92761aabf4f68c7b83b6
[ "BSD-3-Clause" ]
null
null
null
# :mango: Mango MongoDB query plugin and repository API for in-memory object collections [![TypeScript](https://badgen.net/badge/-/typescript?icon=typescript&label)](https://www.typescriptlang.org/) [![tested with jest](https://img.shields.io/badge/tested_with-jest-99424f.svg)](https://github.com/facebook/jest) ## Overview [Getting Started](#getting-started) [Installation](#installation) [Usage](#usage) [Built With](#built-with) [Contributing](docs/CONTRIBUTING.md) ## Getting Started MongoDB query plugin and repository API for in-memory object collections. - run aggregation pipelines - execute searches (with query criteria **and** URL queries) - parse and convert URL query objects and strings - perform CRUD operations on repositories - validate collection objects ## Installation 1. Create or edit an `.npmrc` file with the following information: ```utf-8 @flex-development:registry=https://npm.pkg.github.com/ ``` 2. Add project to `dependencies` ```zsh yarn add @flex-development/mango # or npm i @flex-development/mango ``` ## Usage [Configuration](#configuration) [Mango Finder](#mango-finder) [Mango Repository](#mango-repository) [Mango Validator](#mango-validator) ### Configuration #### Environment Variables - `DEBUG`: Toggle [debug][4] logs from the `mango` namespace - `DEBUG_COLORS`: Toggle [debug][4] log namespace colors #### Mingo The `MangoFinder` and `MangoFinderAsync` plugins integrate with [mingo][5], a MongoDB query language for in-memory objects, to support aggregation pipelines and executing searches. Operators loaded by Mango can be viewed in the [config](src/config/mingo.ts) file. If additional operators are needed, load them _before_ [creating a new plugin](#creating-a-new-mango-plugin). 
#### TypeScript For shorter import paths, TypeScript users can add the following aliases: ```json { "compilerOptions": { "paths": { "@mango": ["node_modules/@flex-development/mango/index"], "@mango/*": ["node_modules/@flex-development/mango/*"] } } } ``` These aliases will be used in following code examples. ### Mango Finder The Mango Finder plugins allow users to run aggregation pipelines and execute searches against in-memory object collections. Query documents using a URL query, or search for them using a query criteria and options object. #### Plugin Documentation - [`AbstractMangoFinder`](src/abstracts/mango-finder.abstract.ts) - [`MangoFinderAsync`](src/plugins/mango-finder-async.plugin.ts) - [`MangoFinder`](src/plugins/mango-finder.plugin.ts) ```typescript /** * `AbstractMangoFinder` plugin interface. * * Used to define class contract of `MangoFinder`, `MangoFinderAsync`, and * possible derivatives. * * See: * * - https://github.com/kofrasa/mingo * - https://github.com/fox1t/qs-to-mongo * * @template D - Document (collection object) * @template U - Name of document uid field * @template P - Search parameters (query criteria and options) * @template Q - Parsed URL query object * * @extends IAbstractMangoFinderBase */ export interface IAbstractMangoFinder< D extends ObjectPlain = ObjectUnknown, U extends string = DUID, P extends MangoSearchParams<D> = MangoSearchParams<D>, Q extends MangoParsedUrlQuery<D> = MangoParsedUrlQuery<D> > extends IAbstractMangoFinderBase<D, U> { aggregate( pipeline?: OneOrMany<AggregationStages<D>> ): OrPromise<AggregationPipelineResult<D>> find(params?: P): OrPromise<DocumentPartial<D, U>[]> findByIds(uids?: UID[], params?: P): OrPromise<DocumentPartial<D, U>[]> findOne(uid: UID, params?: P): OrPromise<DocumentPartial<D, U> | null> findOneOrFail(uid: UID, params?: P): OrPromise<DocumentPartial<D, U>> query(query?: Q | string): OrPromise<DocumentPartial<D, U>[]> queryByIds( uids?: UID[], query?: Q | string ): 
OrPromise<DocumentPartial<D, U>[]> queryOne( uid: UID, query?: Q | string ): OrPromise<DocumentPartial<D, U> | null> queryOneOrFail(uid: UID, query?: Q | string): OrPromise<DocumentPartial<D, U>> setCache(collection?: D[]): OrPromise<MangoCacheFinder<D>> uid(): string } /** * Base `AbstractMangoFinder` plugin interface. * * Used to define properties of `MangoFinder`, `MangoFinderAsync`, and * possible derivatives. * * @template D - Document (collection object) * @template U - Name of document uid field */ export interface IAbstractMangoFinderBase< D extends ObjectPlain = ObjectUnknown, U extends string = DUID > { readonly cache: Readonly<MangoCacheFinder<D>> readonly logger: Debugger readonly mingo: typeof mingo readonly mparser: IMangoParser<D> readonly options: MangoFinderOptions<D, U> } ``` #### Documents A document is an object from an in-memory collection. Each document should have a unique identifier (uid). By default, this value is assumed to map to the `id` field of each document, but can be changed via the [plugin settings](#plugin-settings). ```typescript import type { MangoParsedUrlQuery, MangoSearchParams } from '@mango/types' export interface IPerson { email: string first_name: string last_name: string } export type PersonUID = 'email' export type PersonParams = MangoSearchParams<IPerson> export type PersonQuery = MangoParsedUrlQuery<IPerson> ``` #### Creating a New Finder Both the `MangoFinder` and `MangoFinderAsync` plugins accept an `options` object thats gets passed down to the [mingo][5] and [qs-to-mongo][6] modules. 
Via the options dto, you can:

- set the initial collection cache
- set the uid field for each document
- set date fields and fields searchable by text

```typescript
import { MangoFinder, MangoFinderAsync } from '@mango'
import type { MangoFinderOptionsDTO } from '@mango/dto'

const options: MangoFinderOptionsDTO<IPerson, PersonUID> = {
  cache: {
    collection: [
      { email: 'nmaxstead0@arizona.edu', first_name: 'Nate', last_name: 'Maxstead' },
      { email: 'rbrisseau1@sohu.com', first_name: 'Roland', last_name: 'Brisseau' },
      { email: 'ksmidmoor2@sphinn.com', first_name: 'Kippar', last_name: 'Smidmoor' },
      { email: 'gdurnford3@360.cn', first_name: 'Godfree', last_name: 'Durnford' },
      { email: 'mfauguel4@webnode.com', first_name: 'Madelle', last_name: 'Fauguel' }
    ]
  },
  mingo: { idKey: 'email' },
  parser: {
    fullTextFields: ['first_name', 'last_name']
  }
}

export const PeopleFinder = new MangoFinder<IPerson, PersonUID>(options)
export const PeopleFinderA = new MangoFinderAsync<IPerson, PersonUID>(options)
```

**Note**: All properties are optional.

To learn more about [qs-to-mongo][6] options, see [Options][8] from the package documentation. Note that the `objectIdFields` and `parameters` options are not accepted by the [`MangoParser`](src/mixins/mango-parser.mixin.ts).

### Mango Repository

The Mango Repositories extend the [Mango Finder](#mango-finder) plugins and allow users to perform write operations on an object collection.

#### Repository Documentation

- [`AbstractMangoRepository`](src/abstracts/mango-repo.abstract.ts)
- [`MangoRepositoryAsync`](src/repositories/mango-async.repository.ts)
- [`MangoRepository`](src/repositories/mango.repository.ts)

```typescript
/**
 * `AbstractMangoRepository` class interface.
 *
 * Used to define class contract of `MangoRepository`, `MangoRepositoryAsync`,
 * and possible derivatives.
 *
 * @template E - Entity
 * @template U - Name of entity uid field
 * @template P - Repository search parameters (query criteria and options)
 * @template Q - Parsed URL query object
 *
 * @extends IAbstractMangoFinder
 * @extends IAbstractMangoRepositoryBase
 */
export interface IAbstractMangoRepository<
  E extends ObjectPlain = ObjectUnknown,
  U extends string = DUID,
  P extends MangoSearchParams<E> = MangoSearchParams<E>,
  Q extends MangoParsedUrlQuery<E> = MangoParsedUrlQuery<E>
> extends Omit<IAbstractMangoFinder<E, U, P, Q>, 'cache' | 'options'>,
    IAbstractMangoRepositoryBase<E, U> {
  clear(): OrPromise<boolean>
  create(dto: CreateEntityDTO<E>): OrPromise<E>
  delete(uid?: OneOrMany<UID>, should_exist?: boolean): OrPromise<UID[]>
  patch(uid: UID, dto?: PatchEntityDTO<E>, rfields?: string[]): OrPromise<E>
  setCache(collection?: E[]): OrPromise<MangoCacheRepo<E>>
  save(dto?: OneOrMany<EntityDTO<E>>): OrPromise<E[]>
}

/**
 * Base `AbstractMangoRepository` class interface.
 *
 * Used to define properties of `MangoRepository`, `MangoRepositoryAsync`,
 * and possible derivatives.
 *
 * @template E - Entity
 * @template U - Name of entity uid field
 *
 * @extends IAbstractMangoFinderBase
 */
export interface IAbstractMangoRepositoryBase<
  E extends ObjectPlain = ObjectUnknown,
  U extends string = DUID
> extends IAbstractMangoFinderBase<E, U> {
  readonly cache: MangoCacheRepo<E>
  readonly options: MangoRepoOptions<E, U>
  readonly validator: IMangoValidator<E>
}
```

#### Modeling Entities

Before creating a new repository, a model needs to be created. For the next set of examples, the model `User` will be used.

```typescript
import { IsStrongPassword, IsUnixTimestamp } from '@mango/decorators'
import type { MangoParsedUrlQuery, MangoSearchParams } from '@mango/types'
import {
  IsEmail,
  IsNotEmpty,
  IsOptional,
  IsPhoneNumber,
  IsString
} from 'class-validator'
import type { IPerson, PersonUID } from './people'

export interface IUser extends IPerson {
  created_at: number
  password: string
  phone?: string
  updated_at?: number
}

export type UserParams = MangoSearchParams<IUser>
export type UserQuery = MangoParsedUrlQuery<IUser>

export class User implements IUser {
  @IsUnixTimestamp()
  created_at: IUser['created_at']

  @IsEmail()
  email: IUser['email']

  @IsString()
  @IsNotEmpty()
  first_name: IUser['first_name']

  @IsString()
  @IsNotEmpty()
  last_name: IUser['last_name']

  @IsStrongPassword()
  password: IUser['password']

  @IsOptional()
  @IsPhoneNumber()
  phone?: IUser['phone']

  @IsOptional()
  @IsUnixTimestamp()
  updated_at: IUser['updated_at']
}
```

For more information about validation decorators, see the [class-validator][3] package. Mango also exposes a set of [custom decorators](src/decorators/index.ts).

#### Creating a New Repository

The `MangoRepository` class accepts an `options` object that gets passed down to the [`MangoFinder`](#mango-finder) and [`MangoValidator`](#mango-validator).

```typescript
import { MangoRepository, MangoRepositoryAsync } from '@mango'
import type { MangoRepoOptionsDTO } from '@mango/dtos'

const options: MangoRepoOptionsDTO<IUser, PersonUID> = {
  cache: { collection: [] },
  mingo: { idKey: 'email' },
  parser: {
    fullTextFields: ['first_name', 'last_name']
  },
  validation: {
    enabled: true,
    transformer: {},
    validator: {}
  }
}

export const UsersRepo = new MangoRepository<IUser, PersonUID>(User, options)
export const UsersRepoA = new MangoRepositoryAsync<IUser, PersonUID>(
  User,
  options
)
```

See [Mango Validator](#mango-validator) for more information about `validation` options.

### Mango Validator

The `MangoValidator` mixin allows for **decorator-based** model validation. Under the hood, it uses [class-transformer-validator][1].

#### Validator Documentation

- [`MangoValidator`](src/mixins/mango-validator.mixin.ts)

```typescript
/**
 * `MangoValidator` mixin interface.
 *
 * @template E - Entity
 */
export interface IMangoValidator<E extends ObjectPlain = ObjectUnknown> {
  readonly enabled: boolean
  readonly model: ClassType<E>
  readonly model_name: string
  readonly tvo: Omit<MangoValidatorOptions, 'enabled'>
  readonly validator: typeof transformAndValidate
  readonly validatorSync: typeof transformAndValidateSync

  check<V extends unknown = ObjectPlain>(value?: V): Promise<E | V>
  checkSync<V extends unknown = ObjectPlain>(value?: V): E | V
  handleError(error: Error | ValidationError[]): Exception
}
```

Each [repository](#mango-repository) has its own validator, but the validator can be used standalone as well.

```typescript
import { MangoValidator } from '@mango'
import type { MangoValidatorOptions } from '@mango/types'

const options: MangoValidatorOptions = {
  transformer: {},
  validator: {}
}

export const UsersValidator = new MangoValidator<IUser>(User, options)
```

Validation options will be merged with the following object:

```typescript
import type { TVODefaults } from '@mango/types'

/**
 * @property {TVODefaults} TVO_DEFAULTS - `class-transformer-validator` options
 * @see https://github.com/MichalLytek/class-transformer-validator
 */
export const TVO_DEFAULTS: TVODefaults = Object.freeze({
  transformer: {},
  validator: {
    enableDebugMessages: true,
    forbidNonWhitelisted: true,
    stopAtFirstError: false,
    validationError: { target: false, value: true },
    whitelist: true
  }
})
```

## Built With

- [class-transformer-validator][1] - Plugin for [class-transformer][2] and [class-validator][3]
- [debug][4] - Debugging utility
- [mingo][5] - MongoDB query language for in-memory objects
- [qs-to-mongo][6] - Parse and convert URL queries into MongoDB query criteria and options
- [uuid][7] - Generate RFC-compliant UUIDs

[1]: https://github.com/MichalLytek/class-transformer-validator
[2]: https://github.com/typestack/class-transformer
[3]: https://github.com/typestack/class-validator
[4]: https://github.com/visionmedia/debug
[5]: https://github.com/kofrasa/mingo
[6]: https://github.com/fox1t/qs-to-mongo
[7]: https://github.com/uuidjs/uuid
[8]: https://github.com/fox1t/qs-to-mongo#options
---
title: File-snapshot backups for database files in Azure | Microsoft Docs
description: SQL Server file-snapshot backup uses Azure snapshots to provide fast backups and faster restores for database files stored in the Azure Blob storage service.
ms.custom: ''
ms.date: 05/23/2016
ms.prod: sql
ms.prod_service: backup-restore
ms.reviewer: ''
ms.technology: backup-restore
ms.topic: conceptual
ms.assetid: 17a81fcd-8dbd-458d-a9c7-2b5209062f45
author: MikeRayMSFT
ms.author: mikeray
ms.openlocfilehash: fe74f56af2726a32d6216852ca2d8dec341ee6dd
ms.sourcegitcommit: 04cf7905fa32e0a9a44575a6f9641d9a2e5ac0f8
ms.translationtype: HT
ms.contentlocale: it-IT
ms.lasthandoff: 10/07/2020
ms.locfileid: "91809442"
---
# <a name="file-snapshot-backups-for-database-files-in-azure"></a>File-snapshot backups for database files in Azure

[!INCLUDE [SQL Server](../../includes/applies-to-version/sqlserver.md)]

[!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] file-snapshot backup uses Azure snapshots to provide nearly instantaneous backups and faster restores for database files stored using the Azure Blob storage service. This capability simplifies your backup and restore policies. For a live demonstration, see the [point-in-time restore demo](https://channel9.msdn.com/Blogs/Windows-Azure/File-Snapshot-Backups-Demo). For more information about storing database files with the Azure Blob storage service, see [SQL Server data files in Microsoft Azure](../../relational-databases/databases/sql-server-data-files-in-microsoft-azure.md).

![architecture diagram for snapshot backup](../../relational-databases/backup-restore/media/snapshotbackups.PNG "architecture diagram for snapshot backup")

**Download**

- To download [!INCLUDE[ssSQL15](../../includes/sssql15-md.md)], go to the **[Evaluation Center](https://www.microsoft.com/evalcenter/evaluate-sql-server-2016)**.
- If you have an Azure account, click **[here](https://azure.microsoft.com/services/virtual-machines/sql-server/)** to quickly create a virtual machine with [!INCLUDE[ssCurrent](../../includes/sscurrent-md.md)] already installed.

## <a name="using-azure-snapshots-to-back-up-database-files-stored-in-azure"></a>Using Azure snapshots to back up database files stored in Azure

### <a name="what-is-a-ssnoversion-file-snapshot-backup"></a>What is a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] file-snapshot backup

A file-snapshot backup consists of a set of Azure snapshots of the blobs that contain the database files, plus a backup file that contains pointers to those file snapshots. Each file snapshot is stored in the container with the base blob. You can specify that the backup file itself be written to URL, disk, or tape. Backup to URL is recommended. For more information about backup, see [BACKUP &#40;Transact-SQL&#41;](../../t-sql/statements/backup-transact-sql.md). For more information about backup to URL, see [SQL Server backup to URL](../../relational-databases/backup-restore/sql-server-backup-to-url.md).
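In Transact-SQL terms, a file-snapshot backup is the same `BACKUP ... TO URL` statement used for streaming backup to URL, with the `FILE_SNAPSHOT` option added. A minimal form is shown below; complete, commented examples appear later in this topic:

```sql
-- Minimal form: the backup file is written to the URL, and an Azure snapshot
-- is taken of every blob that holds a data or log file of the database.
BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016.bak'
WITH FILE_SNAPSHOT;
```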
![feature architecture for snapshots](../../relational-databases/backup-restore/media/snapshotbackups-flat.png "feature architecture for snapshots")

If you delete the blob, the backup set is no longer valid, and you cannot delete a blob that contains file snapshots (unless you expressly choose to delete a blob together with all of its file snapshots). Also, deleting a database or a data file does not delete the base blob or any of its file snapshots, and deleting the backup file does not delete any of the file snapshots in the backup set. To delete a file-snapshot backup set, use the **sys.sp_delete_backup** system stored procedure.

**Full database backup:** Performing a full database backup using file-snapshot backup creates an Azure snapshot of each data and log file that makes up the database, establishes the transaction log backup chain, and writes the location of the file snapshots into the backup file.

**Transaction log backup:** Performing a transaction log backup using file-snapshot backup creates a file snapshot of each database file (not just the transaction log), records the file-snapshot location information into the backup file, and truncates the transaction log file.

> [!IMPORTANT]
> After the initial full backup that is required to establish the transaction log backup chain (which can be a file-snapshot backup), you only need to perform transaction log backups, because each transaction log file-snapshot backup set contains file snapshots of all database files and can be used to perform a database restore or a log restore.

After the initial full database backup, no further full or differential backups are needed, because the Azure Blob storage service maintains the differences between each file snapshot and the current state of the base blob for each database file.

> [!NOTE]
> For a tutorial on using SQL Server 2016 with the Microsoft Azure Blob storage service, see [Tutorial: Use the Microsoft Azure Blob storage service with SQL Server 2016 databases](../tutorial-use-azure-blob-storage-service-with-sql-server-2016.md)

### <a name="restore-using-file-snapshot-backups"></a>Restore using file-snapshot backups

Because each file-snapshot backup set contains a file snapshot of each database file, a restore process requires at most two adjacent file-snapshot backup sets. This is true regardless of whether the backup set comes from a full database backup or a log backup. It is a very different process from a restore performed with traditional streaming backup files: with traditional streaming backup, the restore process requires the use of an entire chain of backup sets — the full backup, a differential backup, and one or more transaction log backups. The recovery portion of the restore process remains the same regardless of whether the restore uses a file-snapshot backup or a streaming backup set.

**To the time of any backup set:** To perform a RESTORE DATABASE operation that restores a database to the time of a specific file-snapshot backup set, only that specific backup set is required, along with the base blobs themselves. Because a transaction log file-snapshot backup set can be used to perform a RESTORE DATABASE operation, you generally use a transaction log backup set for this type of operation and only rarely use a full database backup set. An example demonstrating this technique is provided at the end of this topic.

**To a point in time between two file-snapshot backup sets:** To perform a RESTORE DATABASE operation that restores a database to a specific point in time between two adjacent transaction log backup sets, you need only those two transaction log backup sets (the ones before and after the point in time to which you want to restore the database). To do so, perform a RESTORE DATABASE WITH NORECOVERY operation using the transaction log file-snapshot backup set from the earlier point in time, and then perform a RESTORE LOG WITH RECOVERY operation using the transaction log file-snapshot backup set from the later point in time, using the STOPAT argument to specify the point in time at which to stop recovery from the transaction log backup. An example demonstrating this technique is provided at the end of this topic. For a live demonstration, see the [point-in-time restore demo](https://channel9.msdn.com/Blogs/Windows-Azure/File-Snapshot-Backups-Demo).

### <a name="file-backup-set-maintenance"></a>File backup set maintenance

**Deleting a file-snapshot backup set:** You cannot overwrite a file-snapshot backup set using the FORMAT argument. The FORMAT argument is not allowed, to avoid leaving orphaned file snapshots that were created with the original file-snapshot backup. To delete a file-snapshot backup set, use the **sys.sp_delete_backup** system stored procedure. This stored procedure deletes the backup file and the file snapshots that make up the backup set. Using any other method to delete a file-snapshot backup set may delete the backup file without deleting the file snapshots in the backup set.

**Deleting orphaned backup file snapshots:** Orphaned file snapshots may be left behind if the backup file was deleted without using the **sys.sp_delete_backup** system stored procedure, or if a database or database file was deleted while the blobs containing the database or database file had backup file snapshots associated with them. To identify file snapshots that may be orphaned, use the **sys.fn_db_backup_file_snapshots** system function to list all file snapshots of the database files. To identify the file snapshots that are part of a specific file-snapshot backup set, use the RESTORE FILELISTONLY statement. You can then use the **sys.sp_delete_backup_file_snapshot** system stored procedure to delete an individual orphaned backup file snapshot. Examples of using this system function and these system stored procedures are provided at the end of this topic.

For more information, see [sp_delete_backup &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/snapshot-backup-sp-delete-backup.md), [sys.fn_db_backup_file_snapshots &#40;Transact-SQL&#41;](../../relational-databases/system-functions/sys-fn-db-backup-file-snapshots-transact-sql.md), [sp_delete_backup_file_snapshot &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/snapshot-backup-sp-delete-backup-file-snapshot.md), and [RESTORE FILELISTONLY &#40;Transact-SQL&#41;](../../t-sql/statements/restore-statements-filelistonly-transact-sql.md).

### <a name="considerations-and-limitations"></a>Considerations and limitations

**Premium storage:** When using premium storage, the following limitations apply:

- The backup file itself cannot be stored using premium storage.
- Backups cannot be taken more frequently than every 10 minutes.
- A maximum of 100 snapshots can be stored.
- RESTORE WITH MOVE is required.
- For more information about premium storage, see [Premium Storage: High-performance storage for Azure virtual machine workloads](/azure/virtual-machines/disks-types)

**Single storage account:** The file-snapshot blobs and the target blobs must use the same storage account.

**Bulk-logged recovery model:** When using the bulk-logged recovery model and working with a transaction log backup that contains minimally logged transactions, you cannot perform a log restore (including point-in-time recovery) using that transaction log backup. Instead, perform a database restore to the time of the file-snapshot backup set. This limitation is the same as the limitation with streaming backup.

**Online restore:** When using file-snapshot backup, you cannot perform an online restore. For more information about online restore, see [Online restore &#40;SQL Server&#41;](../../relational-databases/backup-restore/online-restore-sql-server.md).

**Billing:** When using SQL Server file-snapshot backup, data changes will incur additional charges. For more information, see [Understanding how snapshots accrue charges](/rest/api/storageservices/Understanding-How-Snapshots-Accrue-Charges).

**Archiving:** If you wish to archive a file-snapshot backup, you can use either blob archive or streaming backup. To archive to blob storage, copy the snapshots in the file-snapshot backup set to separate blobs. To archive as a streaming backup, restore the file-snapshot backup as a new database and then perform a regular streaming backup with compression and/or encryption, and retain it for as long as desired, independently of the base blobs.

> [!IMPORTANT]
> Maintaining multiple file-snapshot backups involves only minimal performance overhead. Maintaining an excessive number of file-snapshot backups, however, can affect the I/O performance of the database. We recommend maintaining only the file-snapshot backups required to support your recovery point objective.

## <a name="backing-up-the-database-and-log-using-a-file-snapshot-backup"></a>Backing up the database and log using a file-snapshot backup

The following example uses file-snapshot backup to back up the AdventureWorks2016 sample database to URL.

```
-- To permit log backups, before the full database backup, modify the database
-- to use the full recovery model.
USE master;
GO
ALTER DATABASE AdventureWorks2016
SET RECOVERY FULL;
GO
-- Back up the full AdventureWorks2016 database.
BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016.bak'
WITH FILE_SNAPSHOT;
GO
-- Back up the AdventureWorks2016 log using a time stamp in the backup file name.
DECLARE @Log_Filename AS VARCHAR (300);
SET @Log_Filename = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016_Log_'+
REPLACE (REPLACE (REPLACE (CONVERT (VARCHAR (40), GETDATE (), 120), '-','_'),':', '_'),' ', '_') + '.trn';
BACKUP LOG AdventureWorks2016
TO URL = @Log_Filename WITH FILE_SNAPSHOT;
GO
```

## <a name="restoring-from-a-ssnoversion-file-snapshot-backup"></a>Restoring from a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] file-snapshot backup

The following example shows how to restore the AdventureWorks2016 database using a transaction log file-snapshot backup set, and demonstrates a recovery operation. Note that you can restore a database from a single transaction log file-snapshot backup set.

```
RESTORE DATABASE AdventureWorks2016
FROM URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016_2015_05_18_16_00_00.trn'
WITH RECOVERY, REPLACE;
GO
```

## <a name="restoring-from-a-ssnoversion-file-snapshot-backup-to-a-point-in-time"></a>Restoring from a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] file-snapshot backup to a point in time

The following example shows how to restore AdventureWorks2016 to its state at a specified point in time using two transaction log file-snapshot backup sets, and demonstrates a recovery operation.

```
RESTORE DATABASE AdventureWorks2016
FROM URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016_2015_05_18_16_00_00.trn'
WITH NORECOVERY,REPLACE;
GO
RESTORE LOG AdventureWorks2016
FROM URL = 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016_2015_05_18_18_00_00.trn'
WITH RECOVERY,STOPAT = 'May 18, 2015 5:35 PM';
GO
```

## <a name="deleting-a-database-file-snapshot-backup-set"></a>Deleting a database file-snapshot backup set

To delete a file-snapshot backup set, use the **sys.sp_delete_backup** system stored procedure. Specify the database name so that the system can verify that the specified file-snapshot backup set is in fact a backup for the specified database. If you do not specify a database name, the specified backup set and its file snapshots are deleted without that validation. For more information, see [sp_delete_backup &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/snapshot-backup-sp-delete-backup.md).

> [!WARNING]
> If you attempt to delete a file-snapshot backup set using another method, such as the Microsoft Azure management portal or the Azure storage viewer in SQL Server Management Studio, the file snapshots in the backup set will not be deleted. These tools delete only the backup file itself, which contains the pointers to the file snapshots in the file-snapshot backup set. To identify backup file snapshots that remain after a backup file has been deleted incorrectly, use the **sys.fn_db_backup_file_snapshots** system function, and then use the **sys.sp_delete_backup_file_snapshot** system stored procedure to delete an individual backup file snapshot.

The following example shows how to delete the specified file-snapshot backup set, including the backup files and the file snapshots that make up the specified backup set.

```
sys.sp_delete_backup 'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016.bak', 'adventureworks2016' ;
GO
```

## <a name="viewing-database-backup-file-snapshots"></a>Viewing database backup file snapshots

To view the file snapshots of the base blob for each database file, use the **sys.fn_db_backup_file_snapshots** system function, which lets you view all backup file snapshots of each base blob for a database stored using the Azure Blob storage service. A primary use case for this function is to identify backup file snapshots of a database that remain when the backup file for a file-snapshot backup set is deleted using a mechanism other than the **sys.sp_delete_backup** system stored procedure. To determine which backup file snapshots are part of intact backup sets and which are not, use the **RESTORE FILELISTONLY** statement to list the file snapshots that belong to each backup file. For more information, see [sys.fn_db_backup_file_snapshots &#40;Transact-SQL&#41;](../../relational-databases/system-functions/sys-fn-db-backup-file-snapshots-transact-sql.md) and [RESTORE FILELISTONLY &#40;Transact-SQL&#41;](../../t-sql/statements/restore-statements-filelistonly-transact-sql.md).

The following example returns the list of all backup file snapshots for the specified database.

```
--Either specify the database name or set the database context
USE AdventureWorks2016
select * from sys.fn_db_backup_file_snapshots (null) ;
GO
select * from sys.fn_db_backup_file_snapshots ('AdventureWorks2016') ;
GO
```

## <a name="deleting-an-individual-database-backup-file-snapshot"></a>Deleting an individual database backup file snapshot

To delete an individual backup file snapshot of a database's base blob, use the **sys.sp_delete_backup_file_snapshot** system stored procedure. A primary use case for this system stored procedure is to delete orphaned file snapshots that remain after a backup file is deleted with a method other than the **sys.sp_delete_backup** system stored procedure. For more information, see [sp_delete_backup_file_snapshot &#40;Transact-SQL&#41;](../../relational-databases/system-stored-procedures/snapshot-backup-sp-delete-backup-file-snapshot.md).

> [!WARNING]
> Deleting an individual file snapshot that is part of a file-snapshot backup set invalidates the backup set.

The following example shows how to delete the specified backup file snapshot. The URL for the specified backup is obtained using the **sys.fn_db_backup_file_snapshots** system function.

```
sys.sp_delete_backup_file_snapshot N'adventureworks2016', N'https://<mystorageaccountname>.blob.core.windows.net/<mycontainername>/AdventureWorks2016Data.mdf?snapshot=2015-05-29T21:31:31.6502195Z';
GO
```

## <a name="see-also"></a>See also

[Tutorial: Use the Microsoft Azure Blob storage service with SQL Server 2016 databases](../tutorial-use-azure-blob-storage-service-with-sql-server-2016.md)
# war007

======== THIS GAME MAY HURT YOUR FEELINGS ========

War007 is a game created in Python using the pygame module. The game was created by [Eruptedlava / Sarthak Sharma](https://t.me/Eruptedlava).

## Installation

Use the package manager [pip](https://pip.pypa.io/en/stable/) to install pygame.

```bash
pip install pygame
```

## Usage

### Windows users can run the game directly by double-clicking main.py

#### Linux users: first open your terminal in the war007 folder, then execute the following command

```
python3 main.py
```

## Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

## License

MIT License

Copyright (c) [2021] War_007

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "war 007"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
--- title: tempdb database description: The tempdb database in Parallel Data Warehouse. author: mzaman1 ms.prod: sql ms.technology: data-warehouse ms.topic: conceptual ms.date: 04/17/2018 ms.author: murshedz ms.reviewer: martinle ms.custom: seo-dt-2019 ms.openlocfilehash: 3772e2b4cabac84c00854eba85f7a0c2a33d48bc ms.sourcegitcommit: e042272a38fb646df05152c676e5cbeae3f9cd13 ms.translationtype: MT ms.contentlocale: ru-RU ms.lasthandoff: 04/27/2020 ms.locfileid: "74400141" --- # <a name="tempdb-database-in-parallel-data-warehouse"></a>tempdb database in Parallel Data Warehouse The **tempdb** database is the SQL Server PDW system database that stores local temporary tables for user databases. Temporary tables are often used to improve query performance. For example, you can use a temporary table to split up a script and reuse computed data. For more information about system databases, see [System databases](system-databases.md). ## <a name="key-terms-and-concepts"></a><a name="Basics"></a>Key terms and concepts *Local temporary table* A *local temporary table* uses the # prefix before the table name and is a temporary table created by the local user session. Each session can access data only in the local temporary tables of its own session. Each session can view the metadata of local temporary tables in all sessions. For example, all sessions can view the metadata of all local temporary tables with the query `SELECT * FROM tempdb.sys.tables`. Global temporary table *Global temporary tables*, supported in SQL Server with the ## syntax, are not supported in this release of SQL Server PDW. pdwtempdb **pdwtempdb** is the database that stores local temporary tables. PDW does not implement temporary tables by using the SQL Server **tempdb** database. 
Instead, PDW stores them in a database named pdwtempdb. This database exists on each cluster node and is invisible to the user through the PDW interfaces. In the Admin Console, on the Storage page, these appear in the PDW system database named **tempdb-SQL**. tempdb **tempdb** is the SQL Server tempdb database. It uses minimal logging. SQL Server uses the tempdb database on the compute nodes to store temporary tables that are needed during SQL Server operations. SQL Server PDW drops tables from **tempdb** in the following cases: - A DROP TABLE statement runs. - The session disconnects. Only the temporary tables for that session are dropped. - The appliance shuts down. - The control node has a cluster failover. ## <a name="general-remarks"></a>General remarks SQL Server PDW performs the same operations on temporary tables as on permanent tables unless explicitly stated otherwise. For example, the data in local temporary tables, just like in permanent tables, is either distributed or replicated across the compute nodes. ## <a name="limitations-and-restrictions"></a><a name="LimitationsRestrictions"></a>Limitations and restrictions Limitations for the SQL Server PDW **tempdb** database. You *cannot:* - Create a global temporary table that begins with ##. - Back up or restore **tempdb**. - Change permissions on **tempdb** with **GRANT**, **DENY**, or **REVOKE** statements. - Run **DBCC SHRINKLOG** on **tempdb**. - Run DDL operations on **tempdb**. There are a few exceptions. For more information, see the following list of limitations and restrictions for local temporary tables. Limitations for local temporary tables. 
You *cannot:* - Rename a temporary table. - Create partitions, views, or nonclustered indexes on a temporary table. You can use the **ALTER INDEX** statement to rebuild the clustered index of a table that was created with one. - Change permissions on temporary tables by using GRANT, DENY, or REVOKE statements. - Run Database Console Commands on temporary tables. - Use the same name for two or more temporary tables within the same batch. If multiple local temporary tables are used in a batch, they must have unique names. If multiple sessions use the same batch and create the same local temporary table, SQL Server PDW internally appends a numeric suffix to the local temporary table name to maintain a unique name for each local temporary table. > [!NOTE] > You *can* create and update statistics on a temporary table. You can use the **ALTER INDEX** statement to rebuild the clustered index. ## <a name="permissions"></a>Permissions Any user can create temporary objects in tempdb. Unless additional permissions are granted, users can access only the objects that they own. It is possible to revoke the connect permission on tempdb to prevent a user from using it, but this is not recommended because some routines require tempdb in order to work. ## <a name="related-tasks"></a><a name="RelatedTasks"></a>Related tasks |Tasks|Description| |---------|---------------| |Create a table in **tempdb**.|You can create a user temporary table with the CREATE TABLE and CREATE TABLE AS SELECT statements. For more information, see 
[CREATE TABLE](../t-sql/statements/create-table-azure-sql-data-warehouse.md) and [CREATE TABLE AS SELECT](../t-sql/statements/create-table-as-select-azure-sql-data-warehouse.md).| |View a list of existing tables in **tempdb**.|`SELECT * FROM tempdb.sys.tables;`| |View a list of existing columns in **tempdb**.|`SELECT * FROM tempdb.sys.columns;`| |View a list of existing objects in **tempdb**.|`SELECT * FROM tempdb.sys.objects;`| <!-- MISSING LINKS ## See Also [Common Metadata Query Examples &#40;SQL Server PDW&#41;](../sqlpdw/common-metadata-query-examples-sql-server-pdw.md) -->
65.777778
453
0.777488
rus_Cyrl
0.950719
e179dddd876a3ee44970c2053f37120353a86b5c
485
md
Markdown
kdocs/-kores/com.koresframework.kores.base/-class-declaration/-builder/comments.md
koresframework/Kores
b6ab31b1d376ab501fd9f481345c767cb0c37d04
[ "MIT-0", "MIT" ]
1
2019-09-17T16:36:51.000Z
2019-09-17T16:36:51.000Z
kdocs/-kores/com.koresframework.kores.base/-class-declaration/-builder/comments.md
koresframework/Kores
b6ab31b1d376ab501fd9f481345c767cb0c37d04
[ "MIT-0", "MIT" ]
8
2020-12-12T06:48:34.000Z
2021-08-15T22:34:49.000Z
kdocs/-kores/com.koresframework.kores.base/-class-declaration/-builder/comments.md
koresframework/Kores
b6ab31b1d376ab501fd9f481345c767cb0c37d04
[ "MIT-0", "MIT" ]
null
null
null
//[Kores](../../../../index.md)/[com.koresframework.kores.base](../../index.md)/[ClassDeclaration](../index.md)/[Builder](index.md)/[comments](comments.md) # comments [jvm]\ open override fun [comments](comments.md)(value: [Comments](../../../com.koresframework.kores.base.comment/-comments/index.md)): [ClassDeclaration.Builder](index.md) See CommentHolder.comments [jvm]\ var [comments](comments.md): [Comments](../../../com.koresframework.kores.base.comment/-comments/index.md)
40.416667
165
0.698969
yue_Hant
0.163127
e17ac23bcf5edf111e2a8b663bd2262c07205e01
1,237
md
Markdown
_posts/2017-03-19-Houghton-Galina-Long-Sleeves-FloorLength-AlinePrincess.md
novstylessee/novstylessee.github.io
4c99fd7f6148fa475871b044a67df2ac0151b9ba
[ "MIT" ]
null
null
null
_posts/2017-03-19-Houghton-Galina-Long-Sleeves-FloorLength-AlinePrincess.md
novstylessee/novstylessee.github.io
4c99fd7f6148fa475871b044a67df2ac0151b9ba
[ "MIT" ]
null
null
null
_posts/2017-03-19-Houghton-Galina-Long-Sleeves-FloorLength-AlinePrincess.md
novstylessee/novstylessee.github.io
4c99fd7f6148fa475871b044a67df2ac0151b9ba
[ "MIT" ]
null
null
null
--- layout: post date: 2017-03-19 title: "Houghton Galina Long Sleeves Floor-Length Aline/Princess" category: Houghton tags: [Houghton,Aline/Princess ,V-neck,Floor-Length,Long Sleeves] --- ### Houghton Galina Just **$259.99** ### Long Sleeves Floor-Length Aline/Princess <table><tr><td>BRANDS</td><td>Houghton</td></tr><tr><td>Silhouette</td><td>Aline/Princess </td></tr><tr><td>Neckline</td><td>V-neck</td></tr><tr><td>Hemline/Train</td><td>Floor-Length</td></tr><tr><td>Sleeve</td><td>Long Sleeves</td></tr></table> <a href="https://www.readybrides.com/en/houghton/35461-houghton-galina.html"><img src="//img.readybrides.com/74166/houghton-galina.jpg" alt="Houghton Galina" style="width:100%;" /></a> <!-- break --><a href="https://www.readybrides.com/en/houghton/35461-houghton-galina.html"><img src="//img.readybrides.com/74167/houghton-galina.jpg" alt="Houghton Galina" style="width:100%;" /></a> <a href="https://www.readybrides.com/en/houghton/35461-houghton-galina.html"><img src="//img.readybrides.com/74165/houghton-galina.jpg" alt="Houghton Galina" style="width:100%;" /></a> Buy it: [https://www.readybrides.com/en/houghton/35461-houghton-galina.html](https://www.readybrides.com/en/houghton/35461-houghton-galina.html)
72.764706
246
0.719483
yue_Hant
0.818133
e17acee6e2f98c16746c7277984f457aa19bac9e
2,670
md
Markdown
BasicRecordingApi/README.md
jinwkim90/Step_C_ByGoogle
6a1bfc99d799712ae060c422942eef5987d6ccdf
[ "Apache-2.0" ]
null
null
null
BasicRecordingApi/README.md
jinwkim90/Step_C_ByGoogle
6a1bfc99d799712ae060c422942eef5987d6ccdf
[ "Apache-2.0" ]
null
null
null
BasicRecordingApi/README.md
jinwkim90/Step_C_ByGoogle
6a1bfc99d799712ae060c422942eef5987d6ccdf
[ "Apache-2.0" ]
null
null
null
Android Fit Recording Api Sessions Sample ============ A simple example of how to use the Recording API on the Android Fit platform. - Android API Level > 9 - Android Build Tools v23 - Android Support Repository - Register a Google Project with an Android client per getting started instructions http://developers.google.com/fit/android/get-started Getting Started --------------- This sample uses the Gradle build system. To build this project, use the "gradlew build" command or use "Import Project" in Android Studio. NOTE: You must register an Android client underneath a Google Project in order for the Google Fit API to become available for your app. The process ensures your app has proper consent screen information for users to accept, among other things required to access Google APIs. See the instructions for more details: http://developers.google.com/fit/android/get-started Support ------- The most common problem using these samples is a SIGN_IN_FAILED exception. Users can experience this after selecting a Google Account to connect to the FIT API. If you see the following in logcat output then make sure to register your Android app underneath a Google Project as outlined in the instructions for using this sample at: http://developers.google.com/fit/android/get-started `10-26 14:40:37.082 1858-2370/? E/MDM: [138] b.run: Couldn't connect to Google API client: ConnectionResult{statusCode=API_UNAVAILABLE, resolution=null, message=null}` Use the following channels for support: - Google+ Community: https://plus.google.com/communities/103314459667402704958 - Stack Overflow: http://stackoverflow.com/questions/tagged/android If you've found an error in this sample, please file an issue: https://github.com/googlesamples/android-fitness/issues Patches are encouraged, and may be submitted according to the instructions in CONTRIB.md. License ------- Copyright 2014 Google, Inc. Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. 
See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to you under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
43.064516
167
0.788764
eng_Latn
0.977134
e17ad4fe2b2b955059bf5d8e28471b0a4ea197a7
365
md
Markdown
README.md
setsuodu/CommandPattern
c66de7baed4a6f0c4f94302f4a6e151cc1852b61
[ "MIT" ]
null
null
null
README.md
setsuodu/CommandPattern
c66de7baed4a6f0c4f94302f4a6e151cc1852b61
[ "MIT" ]
null
null
null
README.md
setsuodu/CommandPattern
c66de7baed4a6f0c4f94302f4a6e151cc1852b61
[ "MIT" ]
null
null
null
# Lockstep by CommandPattern ## Tools - Unity 2018.4.0f1 - VS2019 - LiteNetLib 1. Command pattern core: Execute, Undo 2. Lockstep core: client-side logic, frame replay 3. Advanced lockstep: prediction, rollback ## //TODO: - [x] 1. Save and parse frame data to implement replay - [x] 2. Build the server (LiteNetLib) - [x] 3. PvP lobby and rooms - [ ] 4. Server-side lockstep logic - [ ] 5. Rewrite the command pattern - [ ] 6. Import external assets and solve problems encountered in real development - [ ] 7. Reconnect after disconnection - [ ] 8. Prediction and rollback - [ ] 9. Summary of logging and anti-cheat mechanisms - [ ] 10. Summary of handling network jitter and smoothing
14.6
28
0.569863
yue_Hant
0.503138
e17adbcbe3681fde5fb7a6506b0cf61fb63181cd
18,444
md
Markdown
ce/unified-service-desk/admin/unified-service-desk-configurations-bpa.md
mairaw/dynamics-365-customer-engagement
18b45fa62f6af559f6f272575878c21ab279638c
[ "CC-BY-4.0", "MIT" ]
null
null
null
ce/unified-service-desk/admin/unified-service-desk-configurations-bpa.md
mairaw/dynamics-365-customer-engagement
18b45fa62f6af559f6f272575878c21ab279638c
[ "CC-BY-4.0", "MIT" ]
null
null
null
ce/unified-service-desk/admin/unified-service-desk-configurations-bpa.md
mairaw/dynamics-365-customer-engagement
18b45fa62f6af559f6f272575878c21ab279638c
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: "Unified Service Desk configurations | MicrosoftDocs" description: "Learn about the Unified Service Desk configurations that you make in Microsoft Dataverse on which the Best Practices Analyzer performs analysis and displays a report." author: mh-jaya ms.author: v-jmh manager: shujoshi ms.date: 04/24/2018 ms.topic: article ms.service: dynamics-365-customerservice --- # [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] Configurations [!INCLUDE[cc-data-platform-banner](../../includes/cc-data-platform-banner.md)] [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] analyzes the [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] configurations that you make in Microsoft Dataverse. ## Internal WPF Hosting Type [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for **Internal WPF** hosting type hosted controls that you configure in [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] and displays a warning when one or more Internal WPF hosting type hosted controls are configured. We recommend that you move the **Hosting Type** of all the hosted controls of component type **CRM Page** or **Standard Web Application** from **Internal WPF** to **IE Process**. ### Mitigation For all the hosted controls of component type **CRM Page** or **Standard Web Application**, we recommend that you move the hosting type from **Internal WPF** to **IE Process**. 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Hosted Controls**. 3. Select the applicable hosted controls from the list. <br/>You can change the hosting type for only CRM Page and Standard Web Application hosted controls. 4. In the **Hosting Type** list, select **IE Process**. 5. Select **Save**. 
## Action Calls in PageLoadComplete Event [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks and displays a warning message when you associate any action calls with the **PageLoadComplete** event. Action calls that are associated with the **PageLoadComplete** event occur several times per page load when an iFrame or frame is used on the CRM entity forms. [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when **PageLoadComplete** is replaced with a **BrowserDocumentComplete** or **DataReady** event if you're using this event for CRM entity forms. ### Mitigation Replace the **PageLoadComplete** event with a **BrowserDocumentComplete** event or **DataReady** event if you're using it for CRM entity forms. > [!Note] > The **DataReady** event is available for use in [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] > 3.0 or later. ## Action Calls with Events [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks and displays a warning when you associate 10 or more action calls with the events listed in the following table. |Event|Description| |-----------|------------| |**DesktopReady**|A **DesktopReady** event occurs on startup after all the desktop initialization is complete and the connections to the server are established.| |**SessionNew**|A **SessionNew** event occurs when a new session is created.| |**SessionActivated**|A **SessionActivated** event occurs when a new session is activated.| |**SessionDeactivated**|A **SessionDeactivated** event occurs when a new session is deactivated.| |**SessionClosed**|A **SessionClosed** event occurs when a session is closed.| [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when you associate 10 or fewer actions with any event. 
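As a rough illustration of what this check counts, the sketch below scans a set of (event, action call) bindings and flags any event that has more than 10 action calls bound to it. The event names come from the table above; the action-call names and the in-memory "export" format are invented for illustration and are not how Best Practices Analyzer actually reads a configuration.

```python
from collections import Counter

# Hypothetical export of a Unified Service Desk configuration:
# (event, action call) pairs. Action-call names are invented.
bindings = [("DesktopReady", f"Action{i}") for i in range(12)] + [
    ("SessionNew", "CreateSession"),
    ("SessionClosed", "CloseSession"),
]

def events_over_threshold(bindings, limit=10):
    """Return {event: count} for events with more action calls than `limit`."""
    counts = Counter(event for event, _ in bindings)
    return {event: n for event, n in counts.items() if n > limit}

print(events_over_threshold(bindings))  # only DesktopReady exceeds the limit
```

With these sample bindings, only **DesktopReady** (12 action calls) is flagged, mirroring the warning the analyzer would raise for that event.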
### Mitigation [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] recommends optimizing to associate 10 or fewer action calls with **DesktopReady**, **SessionNew**, **SessionActivated**, **SessionDeactivated**, and **SessionClosed** events. ## Number of Navigation Rules [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the number of navigation rules that you configure in [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] and displays a warning message when the value is more than **50**. [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when 50 or fewer navigation rules are configured. ### Mitigation [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] recommends optimizing the number of navigation rules in the range between 0 and 50. ## Show Script Errors (ShowScriptErrors) [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **ShowScriptErrors** value and displays a warning message when the value is set to **true**. When you enable Show Script Errors, you often see pop-up error messages. To work without interruption, you can set the **ShowScriptErrors** value to **false**. ### Mitigation Set **ShowScriptErrors** to **false**: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. In the list of options, select **ShowScriptErrors**. 4. In the **Value** field, select **false**. 5. Select **Save**. 
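For administrators who script option changes like this one rather than clicking through the UI, an option update can in principle be expressed as a Dataverse Web API record update. The sketch below only builds the request URL and JSON body — it sends nothing. The organization URL, the `uii_options` entity set name, and the `uii_value` attribute are assumptions for illustration; verify them against your organization's metadata before use.

```python
import json

API_BASE = "https://contoso.crm.dynamics.com/api/data/v9.0"  # hypothetical org URL

def build_option_update(option_id: str, value: str):
    """Build the (URL, JSON body) for a PATCH to a USD option record.

    The 'uii_options' entity set and 'uii_value' attribute are assumptions,
    not a documented contract; check your organization's schema first.
    """
    url = f"{API_BASE}/uii_options({option_id})"
    body = json.dumps({"uii_value": value})
    return url, body

url, body = build_option_update("00000000-0000-0000-0000-000000000001", "false")
print(url)
print(body)
```

Actually sending the request (with OAuth headers and `If-Match`) is deliberately left out; the point is only the shape of the PATCH payload.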
[!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [Manage Options for [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]](https://docs.microsoft.com/dynamics365/customer-engagement/unified-service-desk/admin/manage-options-unified-service-desk) ## Client Caching [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the client caching option and displays a warning message when the field is blank. You can use client caching to reduce the amount of bandwidth required during [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] startup. ### Mitigation Enable client caching: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. To create a new option, select **New** on the command bar. 4. For the new option, select **ClientCacheVersionNumber** in the **Name** box, and then type an alphanumeric number in the **Value** box. <br/>The alphanumeric value is used as the cache key for [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]. The alphanumeric value can be anything, but it must be unique each time you change it. 5. Select **Save**. > [!Note] > When an agent launches the [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] client again, client caching isn't used. However, it doesn't delete or refresh the client cache store for the agent. When you remove the **DisableCaching** key for the agent, the agent returns to using the previously stored client cache store. 
[!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [Enable client caching](/dynamics365/customer-engagement/unified-service-desk/admin/configure-client-caching-unified-service-desk) ## Maximum Number of Sessions (maxNumberOfSessions) [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **maxNumberOfSessions** option and displays a warning message or an error message in accordance with the following table. |Severity| maxNumberOfSessions | |---------|---------------------| | Error | 0 | | Error | More than 5 | | Warning | 4 or 5 | **maxNumberOfSessions** indicates the maximum number of simultaneous sessions that each user can open using [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]. [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when the **maxNumberOfSessions** value is more than **0** and less than or equal to **3**. ### Mitigation Set the **maxNumberOfSessions** value to less than or equal to **3**: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. In the list of options, select **maxNumberOfSessions**. 4. In the **Value** field, type **3**. 5. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [Manage Options for Unified Service Desk](/dynamics365/customer-engagement/unified-service-desk/admin/manage-options-unified-service-desk) ## Help Improve [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] (HelpImproveUSD) [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the `HelpImproveUSD` User Interface Integration (UII) option and displays a warning message when the value is set to **false**. 
By using the Help Improve [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] program, organization-wide agents can send improvement program information to Microsoft. To enable the program, activate (set to **true**) the **HelpImproveUSD** option. ### Mitigation Set `HelpImproveUSD` to **true**: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. To create a new option, select **New** on the command bar. 4. On the New Option page, select **HelpImproveUSD** in the **Global Options** list. 5. In the **Value** field, select **true**. 6. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [Enable client caching](/dynamics365/customer-engagement/unified-service-desk/admin/configure-client-caching-unified-service-desk) ## Internet Explorer Pooling (InternetExplorerPooling) [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **InternetExplorerPooling** UII option and displays a warning message when the value is set to **false**. Internet Explorer Pooling enhances performance of CRM entity page loading in [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)].<br> [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when the **InternetExplorerPooling** option is set to **true**. You must configure the **InternetExplorerPooling** option to set it to **true**. ### Mitigation Set the **InternetExplorerPooling** option to **true**: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. On the **Active UII Options** page, select **New**. 4. In the **Global Option** field, select **Others**. 5. In the **Name** field, enter **InternetExplorerPooling**. 6. 
In the **Value** field, select **true**. 7. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [**Enable Internet Explorer pooling**](/dynamics365/customer-engagement/unified-service-desk/admin/performance-enhancement-crm-entity-page-loads) ## Activity Tracking Enabled [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **Activity Tracking Enabled** option and displays a warning message when the option is disabled. By using Activity Tracking Enabled options, you can track all the events for audit and diagnostic purposes. ### Mitigation Enable the **Activity Tracking Enabled** option: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Audit & Diagnostics Settings** 3. In the **Audit Settings** section, select the **Activity Tracking Enabled** check box. 4. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [**Auditing**](/dynamics365/customer-engagement/unified-service-desk/admin/configure-auditing-diagnostics-unified-service-desk) ## Diagnostic Tracking Enabled [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **Diagnostic Tracking Enabled** option and displays a warning message when the option is disabled. By using **Diagnostic Tracking Enabled**, you can record operational events and errors in the [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] log files. ### Mitigation Enable the **Diagnostic Tracking Enabled** option: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Audit & Diagnostics Settings** 3. In the **Diagnostic Settings** section, select the **Diagnostic Tracking Enabled** check box. 4. Select **Save**. 
[!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [**Diagnostics**](/dynamics365/customer-engagement/unified-service-desk/admin/configure-auditing-diagnostics-unified-service-desk) ## Enable Exit Monitoring [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **Enable Exit Monitoring** option and displays a warning message when the option is disabled. **Enable Exit Monitoring** collects diagnostics logs and exit logs in the event of an exception in the [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] client. ### Mitigation Enable the **Enable Exit Monitoring** option: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Audit & Diagnostics Settings** 3. In the **Diagnostic Settings** section, select the **Enable Exit Monitoring** check box. 4. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [**Diagnostics**](/dynamics365/customer-engagement/unified-service-desk/admin/configure-auditing-diagnostics-unified-service-desk) ## Enable Crash Dump Generation [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **Enable Crash Dump Generation** option and displays a warning message when the option is disabled. By using Enable Crash Dump Generation, you can collect dump files during a fatal exception of [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]. ### Mitigation Enable the **Enable Crash Dump Generation** option: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Audit & Diagnostics Settings** 3. In the **Diagnostic Settings** section, select the **Enable Crash Dump Generation** check box. 4. Select **Save**. 
[!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [**Diagnostics**](/dynamics365/customer-engagement/unified-service-desk/admin/configure-auditing-diagnostics-unified-service-desk) ## Internet Explorer Webpage Recovery (IEWebPageRecovery) [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **IEWebPageRecovery** UII option and displays an error message when the option is set to **false**. **IEWebPageRecovery** is a UII option that a system administrator can modify. The option helps recover unresponsive Internet Explorer webpages. [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when the **IEWebPageRecovery** option is set to **true**. ### Mitigation Set the `IEWebPageRecovery` option to **true**. 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. In the list of options, select **IEWebPageRecovery**. 4. In the **Value** field, select **true**. 5. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [Manage Options for Unified Service Desk](/dynamics365/customer-engagement/unified-service-desk/admin/manage-options-unified-service-desk) ## Process Termination Threshold (ProcessTerminationThreshold) [!INCLUDE[pn-best-practices-analyzer](../../includes/pn-best-practices-analyzer.md)] checks for the **ProcessTerminationThreshold** UII option and displays an error message when the value is set to **0**. **ProcessTerminationThreshold** indicates the timeout period for the duration (in milliseconds) that the [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] monitoring process (usdmp.exe) waits before terminating an unresponsive Internet Explorer process, which also causes [!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] to become unresponsive. 
[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)] works best when the **ProcessTerminationThreshold** option is set in the range **0** to **30000**. ### Mitigation Set the **ProcessTerminationThreshold** value in the range **0** to **30000**: 1. Sign in to the Dynamics 365 instance. 2. Go to **Settings** > **[!INCLUDE[pn_unified_service_desk](../../includes/pn-unified-service-desk.md)]** > **Options** 3. In the list of options, select **ProcessTerminationThreshold**. 4. In the **Value** field, type a value between **0** and **30000**. 5. Select **Save**. [!INCLUDE[proc_more_information](../../includes/proc-more-information.md)] [Manage Options for Unified Service Desk](/dynamics365/customer-engagement/unified-service-desk/admin/manage-options-unified-service-desk) ## See also [Analyze best practices in Unified Service Desk](../admin/analyze-best-practices-unified-service-desk.md) [Download and install Best Practices Analyzer](../admin/download-install-best-practices-analyzer.md) [Read Best Practices Analyzer report](../admin/read-best-practices-analyzer-report.md) [List of rule categories and parameters](../admin/compliance-categories-parameters-bpa.md) [System configurations](../admin/system-configurations-bpa.md) [Internet Explorer settings](../admin/internet-explorer-settings-bpa.md) [!INCLUDE[footer-include](../../includes/footer-banner.md)]
60.671053
388
0.74859
eng_Latn
0.882471
e17b1eedd67483a2e0eaeacae8b6c82584189612
998
md
Markdown
sentinel/java-client/README.md
MajescoIndia/redis-poc
03df78297c161868b08909af5bb42f6f6ab1abaa
[ "Apache-2.0" ]
null
null
null
sentinel/java-client/README.md
MajescoIndia/redis-poc
03df78297c161868b08909af5bb42f6f6ab1abaa
[ "Apache-2.0" ]
null
null
null
sentinel/java-client/README.md
MajescoIndia/redis-poc
03df78297c161868b08909af5bb42f6f6ab1abaa
[ "Apache-2.0" ]
null
null
null
## Introduction : A Java client for a Redis master-slave replication cluster in sentinel mode. This POC has been tested with Redis 3.2 on Windows. ## How to run : ### Assumption Cluster of two Redis nodes (one master, one slave) [ Replace the below IPs in the configuration files with your actual IPs] 172.17.192.194 (master) 172.17.192.191 (slave) ### Steps : * Copy the `config/master` directory to the master node of Redis. * Copy the `config/slave` directory to the slave node of Redis. * Run the below commands on the master : redis-server master/redis.conf redis-server master/sentinel.conf --sentinel * Run the below commands on the slave : redis-server slave/redis.conf redis-server slave/sentinel.conf --sentinel * Run `RedisClusterApplicationTests` from JUnit/Maven (replace the IPs in Config.java with actual IPs). mvn test ## References : * http://redis.io/topics/sentinel * https://seanmcgary.com/posts/how-to-build-a-fault-tolerant-redis-cluster-with-sentinel
22.681818
126
0.716433
eng_Latn
0.744088
e17b4054d4bdc5de16b27f95487a6fca9a756153
1,130
md
Markdown
staticDecor/node_modules/pageres/node_modules/viewport-list/readme.md
jiumx60rus/grishyGhost
c56241304da11b9a1307c6261cca50f7546981b1
[ "MIT" ]
null
null
null
staticDecor/node_modules/pageres/node_modules/viewport-list/readme.md
jiumx60rus/grishyGhost
c56241304da11b9a1307c6261cca50f7546981b1
[ "MIT" ]
null
null
null
staticDecor/node_modules/pageres/node_modules/viewport-list/readme.md
jiumx60rus/grishyGhost
c56241304da11b9a1307c6261cca50f7546981b1
[ "MIT" ]
null
null
null
# viewport-list [![Build Status](http://img.shields.io/travis/kevva/viewport-list.svg?style=flat)](https://travis-ci.org/kevva/viewport-list)

> Return a list of devices and their viewports

## Install

```
$ npm install --save viewport-list
```

## Usage

Pass in an optional array of device names from [this list](http://viewportsizes.com).

```js
var viewport = require('viewport-list');

viewport(['iphone 4s'], function (err, items) {
    console.log(items);
    //=> [{name: 'iphone 4s', platform: 'iOS', os: '4.3.5', size: '320x480', release: '2011-10'}]
});
```

## API

### viewport([items], callback)

#### items

Type: `array`
Default: `[]`

An array of device names to fetch.

#### callback(err, ret)

Type: `function`

##### ret

Type: `array`

An array of objects containing devices and their attributes.

## CLI

```
$ npm install --global viewport-list
```

```bash
$ viewport-list --help

Usage
  $ viewport-list [device]
  $ viewport-list < <file>

Example
  $ viewport-list iphone4 iphone5
  $ viewport-list < devices.txt
```

## License

MIT © [Kevin Mårtensson](https://github.com/kevva)
15.915493
141
0.650442
eng_Latn
0.652831
e17c47d1476a47e6e9ee36190880c763a672c1ca
23
md
Markdown
README.md
AndrewByles/AndrewByles.github.io
dedd1bd22362a99481b1d59c4887f7db388a6c53
[ "MIT" ]
null
null
null
README.md
AndrewByles/AndrewByles.github.io
dedd1bd22362a99481b1d59c4887f7db388a6c53
[ "MIT" ]
null
null
null
README.md
AndrewByles/AndrewByles.github.io
dedd1bd22362a99481b1d59c4887f7db388a6c53
[ "MIT" ]
null
null
null
# AndrewByles.github.io
23
23
0.826087
vie_Latn
0.289854
e17dad30a73f256cc6d62bb459a038b6a7d3349d
592
md
Markdown
PRIVACY.md
akshay2000/MonocleGiraffe
7e5f769f5a74b9742d5743db0a7e61d3e57afa31
[ "Apache-2.0" ]
7
2016-07-11T08:13:50.000Z
2021-04-09T01:47:15.000Z
PRIVACY.md
akshay2000/MonocleGiraffe
7e5f769f5a74b9742d5743db0a7e61d3e57afa31
[ "Apache-2.0" ]
1
2016-07-15T05:58:59.000Z
2016-07-16T13:16:29.000Z
PRIVACY.md
akshay2000/MonocleGiraffe
7e5f769f5a74b9742d5743db0a7e61d3e57afa31
[ "Apache-2.0" ]
9
2016-01-28T18:36:12.000Z
2018-01-09T15:51:57.000Z
# Privacy Policy

This document details how a user's details are used by Monocle Giraffe (henceforth referred to as the app).

## User handles

The Imgur user handle or user name of a person is stored on the device when a user logs into Imgur using Monocle Giraffe. This information is stored as plain text.

## Passwords

The app does not store users' passwords. Credentials entered by the user while using the app are immediately exchanged for app-specific tokens using Imgur's [OAuth](https://api.imgur.com/oauth2) API.

## Other information

No other identifiable information is stored by the app.
49.333333
199
0.79223
eng_Latn
0.999671
e17e05ff319fca6b086b0393242ed27dafd94710
841
md
Markdown
README.md
unbit/uwsgi-node-rpc-server
4eff9e23da641973c2ae01eecbca1b0792e11722
[ "MIT" ]
3
2015-01-20T06:59:22.000Z
2021-01-13T01:20:48.000Z
README.md
unbit/uwsgi-node-rpc-server
4eff9e23da641973c2ae01eecbca1b0792e11722
[ "MIT" ]
null
null
null
README.md
unbit/uwsgi-node-rpc-server
4eff9e23da641973c2ae01eecbca1b0792e11722
[ "MIT" ]
null
null
null
uwsgi-node-rpc-server
=====================

A simple uwsgi-RPC server written in node.js

write your rpc functions
------------------------

Save it as server.js (change the listening port at the end if needed):

``` js
var uwsgi = require('./uwsgi_rpcserver.js');

rpc_functions = {
    'hello': function() { return "Hello World !!!"; },
    'sum': function(x, y) { return (parseInt(x)+parseInt(y)) + ''; },
};

uwsgi.listen(rpc_functions, 3000);
```

run your rpc server
-------------------

``` sh
node server.js
```

call functions from your uWSGI applications
-------------------------------------------

``` py
# python
import uwsgi
uwsgi.rpc('127.0.0.1:3000', 'hello')
uwsgi.rpc('127.0.0.1:3000', 'sum', '17', '30')
```

``` pl
# perl
uwsgi::rpc('127.0.0.1:3000', 'hello')
uwsgi::rpc('127.0.0.1:3000', 'sum', '17', '30')
```
19.113636
72
0.548157
eng_Latn
0.396079
e17e54f514bc539d4032c86eafa5e50e331bc843
1,460
md
Markdown
ShellCheck/SC2222.md
r3yn0ld4/docs-for-code-review-tools
a1590fce3b30891679373ec284787b227b21df05
[ "MIT" ]
4
2019-07-17T18:16:06.000Z
2021-03-28T23:53:10.000Z
ShellCheck/SC2222.md
Acidburn0zzz/docs-for-code-review-tools
9659492c76b988e14363dced6c2ab5f86fcdd6e0
[ "MIT" ]
null
null
null
ShellCheck/SC2222.md
Acidburn0zzz/docs-for-code-review-tools
9659492c76b988e14363dced6c2ab5f86fcdd6e0
[ "MIT" ]
5
2018-09-29T17:02:14.000Z
2021-12-26T16:53:04.000Z
Pattern: `case` never matches because of a previous pattern

Issue: -

## Description

You have specified multiple patterns in a `case` statement, where one will always override the other.

Example of **incorrect** code:

```sh
case "$1" in
  -?) echo "Usage: $0 [-n]";;
  -n) echo "Hello World";;
  *) exit 1;;
esac
```

In the example, `-?` actually matches a dash followed by any character, such as `-n`. This means that the later `-n` branch will never trigger. In this case, the correct solution is to escape the `-\?` so that it doesn't match `-n`.

Another common reason for this is accidentally duplicating a branch. In this case, fix or delete the duplicate branch.

Example of **correct** code:

```sh
case "$1" in
  -\?) echo "Usage: $0 [-n]";;
  -n) echo "Hello World";;
  *) exit 1;;
esac
```

## Exceptions

One could argue that having `-*|--*) echo "Invalid flag";` is a readability issue, even though the second pattern follows from the first. In this case, you can either rearrange the pattern from most to least specific, i.e. `--*|-*)`, or ignore the error.

When ignoring this error, remember that ShellCheck directives have to go in front of the `case` statement, and not in front of the branch:

    # shellcheck disable=SC2221,SC2222
    case "$1" in
      -n) ...;;  # no directive here
      -*|--*) echo "Unknown flag" ;;
    esac

## Further Reading

* [ShellCheck - SC2222](https://github.com/koalaman/shellcheck/wiki/SC2222)
30.416667
253
0.675342
eng_Latn
0.997987
e17e5c21a2a821320373d430c1edd75dbef2ef34
598
md
Markdown
TODO.md
cdelamarre/esoldb
d32e1e37212ad1a3a7070816a8a6167e9877a1eb
[ "MIT" ]
1
2019-11-21T11:26:47.000Z
2019-11-21T11:26:47.000Z
TODO.md
cdelamarre/esoldb
d32e1e37212ad1a3a7070816a8a6167e9877a1eb
[ "MIT" ]
null
null
null
TODO.md
cdelamarre/esoldb
d32e1e37212ad1a3a7070816a8a6167e9877a1eb
[ "MIT" ]
null
null
null
~~1° Add a script that runs after `composer install` and builds, if necessary:
- the config/packages/[dev|prod|test]/esolDb.yml files
- the Resources/sql directory~~

2° Make it compatible with pdo_mysql and pdo_pgsql

3° Implement printSqlr

4° Clean up the public and private members:
Conn.php
~~Esol.db.php~~
Params.php
Sqlr.php

~~5° Test the mysqli and pgsql extensions~~

6° Document the public functions:
Conn.php
~~Esol.db.php~~
Params.php
Sqlr.php

7° Handle the dev, prod environments

~~8° Add the ability to pass an array of keys as a parameter~~
33.222222
95
0.722408
fra_Latn
0.704236
e17e8a42b5dbc7804711d0939e4d99daf626d77e
3,562
md
Markdown
_posts/2020-12-21-wangdan1989.md
NodeBE4/opinion
81a7242230f02459879ebc1f02eb6fc21507cdf1
[ "MIT" ]
21
2020-07-20T16:10:55.000Z
2022-03-14T14:01:14.000Z
_posts/2020-12-21-wangdan1989.md
NodeBE4/opinion
81a7242230f02459879ebc1f02eb6fc21507cdf1
[ "MIT" ]
1
2020-07-19T21:49:44.000Z
2021-09-16T13:37:28.000Z
_posts/2020-12-21-wangdan1989.md
NodeBE4/opinion
81a7242230f02459879ebc1f02eb6fc21507cdf1
[ "MIT" ]
1
2021-05-29T19:48:01.000Z
2021-05-29T19:48:01.000Z
---
author: wangdan1989
categories:
- Twitter
date: 2020-12-27
from: https://twitter.com/wangdan1989/status/1342594110021115904
layout: post
tags:
- Twitter
title: 'Twitter @王丹: 2020-12-21~2020-12-27'
---

An incurable idealist; committed to being a moderate, firm, and constructive member of the political opposition; hoping that a future China can rebuild its political order and the order of everyday life.

* This will become a table of contents (this text will be scrapped).
{:toc}

### 1: [2020-12-26 06:12:34+08:00 Tweet](https://twitter.com/wangdan1989/status/1342594110021115904)

Re @PacianoDuran Don't bother with them.

### 2: [2020-12-26 07:37:16+08:00 Tweet](https://twitter.com/wangdan1989/status/1342615424253440000)

Alibaba under investigation; Zhejiang's leaders immediately change face and declare their support for the Party's anti-monopoly drive. <br><br>When the CCP shouts "anti-monopoly", it is nothing but a thief crying "stop thief".<br><br>What in the world is more monopolistic than the CCP? Every inch of land is theirs; thought, culture, and historical truth must all be monopolized, to say nothing of political rights. The CCP monopolizes whatever it sees.<br><br>If it wants to fight monopoly, let the CCP castrate itself first. <a href="https://tw.appledaily.com/international/20201226/EDWBBKMZ6FGV7EFRQOOKU7TU5M/?utm_source=facebook&utm_medium=social&utm_campaign=twad_article_share&utm_content=share_link&fbclid=IwAR0gWCvZqzT7mVBLjwL1jmzRX9sPCPbNEQhtvb4jQMUiBCT1fqG_QMhsQp8&utm_source=twitter&utm_medium=social&utm_campaign=twad_article_share&utm_content=share_link" target="_blank" rel="noopener noreferrer">https://tw.appledaily.com/international/20201226/EDWBBKMZ6FGV7EFRQOOKU7TU5M/?utm_source=facebook&utm_medium=social&utm_campaign=twad_article_share&utm_content=share_link&fbclid=IwAR0gWCvZqzT7mVBLjwL1jmzRX9sPCPbNEQhtvb4jQMUiBCT1fqG_QMhsQp8&utm_source=twitter&utm_medium=social&utm_campaign=twad_article_share&utm_content=share_link</a>

### 3: [2020-12-26 10:19:07+08:00 Tweet](https://twitter.com/wangdan1989/status/1342656154569236481)

China's "post-90s" are a special group in contemporary China. Born after the largest political storm in Communist China's recent history (the 1989 democracy movement and the June Fourth crackdown), they have enjoyed the full dividends of reform, opening-up, and globalization, yet hold only a hazy view of the history of the CCP's harsh rule. As the coming stewards of Chinese society, their outlook deserves to be heard and observed. <a href="https://youtu.be/JerjJkGLFgg" target="_blank" rel="noopener noreferrer">https://youtu.be/JerjJkGLFgg</a>

### 4: [2020-12-26 10:22:34+08:00 Tweet](https://twitter.com/wangdan1989/status/1342657022886604800)

Interview with Su Xiaokang: "Gui Tui Mo" (The Devil Turns the Millstone) <a href="https://youtu.be/k0FtZK6hYNI" target="_blank" rel="noopener noreferrer">https://youtu.be/k0FtZK6hYNI</a>

### 5: [2020-12-27 00:18:06+08:00 Tweet](https://twitter.com/wangdan1989/status/1342867292837851137)

Alibaba under the CCP's "anti-monopoly" investigation; Wang Dan: a thief crying "stop thief" should castrate itself first <a href="https://tw.appledaily.com/international/20201226/U5OTMDKVPRESNIUJU6ENYVRXDM/?utm_source=twitter&utm_medium=social&utm_campaign=twad_article_share&utm_content=share_link" target="_blank" rel="noopener noreferrer">https://tw.appledaily.com/international/20201226/U5OTMDKVPRESNIUJU6ENYVRXDM/?utm_source=twitter&utm_medium=social&utm_campaign=twad_article_share&utm_content=share_link</a>

### 6: [2020-12-27 11:04:54+08:00 Tweet](https://twitter.com/wangdan1989/status/1343030066440187907)

"This Week in Current Affairs" special program live-stream notice:<br><br> Mr. Hung Che-sheng was a senior figure of the overseas Taiwan independence movement who helped many political prisoners inside China. He was a man who pursued democracy his whole life and never gave up.<br><br> This week's program will live-stream the online memorial that the "Chinese Scholars Association" is holding for Mr. Hung; everyone is welcome to join.<br><br> Time: 10 p.m. tonight (12/27), Beijing/Taipei time. <a href="https://www.youtube.com/channel/UCAbpBUsrolyu6HvwPcmhXZw" target="_blank" rel="noopener noreferrer">https://www.youtube.com/channel/UCAbpBUsrolyu6HvwPcmhXZw</a>

### 7: [2020-12-27 20:52:40+08:00 Tweet](https://twitter.com/wangdan1989/status/1343177982672822274)

The memorial for Mr. Hung Che-sheng formally begins at 10:30 tonight.<br><br>It will be live-streamed on the "Wang Dan Academy" channel. <a href="https://youtube.com/channel/UCAbpBUsrolyu6HvwPcmhXZw" target="_blank" rel="noopener noreferrer">https://youtube.com/channel/UCAbpBUsrolyu6HvwPcmhXZw</a>

### 8: [2020-12-27 21:01:34+08:00 Tweet](https://twitter.com/wangdan1989/status/1343180220292083713)

Re 9:30 this morning, US Eastern time

### 9: [2020-12-27 22:34:21+08:00 Tweet](https://twitter.com/wangdan1989/status/1343203572335439880)

The live stream has begun. <a href="https://youtu.be/y5xTK4pptjQ" target="_blank" rel="noopener noreferrer">https://youtu.be/y5xTK4pptjQ</a>
65.962963
857
0.797305
yue_Hant
0.535895
e17f8c5d85e5ca4f413565a999861e26d293a591
3,638
md
Markdown
CONTRIBUTING.md
Shahil98/SE_Fall20_Project-1
c01966f819a6ae12a3639dfdbaaac2fb7ebf7ebb
[ "MIT" ]
1
2020-10-22T05:30:15.000Z
2020-10-22T05:30:15.000Z
CONTRIBUTING.md
Shahil98/SE_Fall20_Project-1
c01966f819a6ae12a3639dfdbaaac2fb7ebf7ebb
[ "MIT" ]
16
2020-10-22T15:49:39.000Z
2020-10-28T02:20:55.000Z
CONTRIBUTING.md
Shahil98/SE_Fall20_Project-1
c01966f819a6ae12a3639dfdbaaac2fb7ebf7ebb
[ "MIT" ]
2
2020-09-28T23:24:15.000Z
2021-09-27T23:41:00.000Z
# Contributing

When you want to contribute to this repository, please send us an email and tell us what you want to change.

### Table Of Contents

[Purpose of Contributing](#Purpose-of-Contributing)

[Code of Conduct](#Code-of-Conduct)

[Pull Request Submission Guidelines](#Pull-Request-Submission-Guidelines)

[Style Guide](#Style-guide)

## ⛄ Purpose of Contributing

* Maintain our product quality.
* Fix bugs and problems.
* Engage the community in working toward the best possible product.

## ❄️ Code of Conduct

All contributors should abide by the [code of conduct](CODE-OF-CONDUCT.md). Please read this carefully before contributing.

## ⚡ Pull Request Submission Guidelines

### 💻 Install Git

First things first, you should install and configure [Git](https://git-scm.com/) on your local machine. [Set Up Git](https://docs.github.com/en/github/getting-started-with-github/quickstart) is a good resource for getting set up.

### 🍴 Fork Our Repository

To contribute code to our product, you must have a GitHub account so you can push code to your own fork and open Pull Requests in the [GitHub repository](https://github.com/Shahil98/SE_Fall20_Project-1)

### 👔 Work On Your Own Branch

Once you have the code locally on disk, you can get started. We advise not working directly on the master branch, but creating a separate branch for each issue you are working on. That way you can easily switch between different pieces of work, and you can update each one for the latest changes on upstream master individually.

### 📝 Write Code

When writing code, just follow our Python and JavaScript style guides. If anything about the style is unclear, looking at existing code might help you understand it better.

## 💬 Style Guide

### Python Style Guide

* Use [Flake8](https://flake8.pycqa.org/en/latest/) as the linter for Python code
* Give variable names as words in lowercase separated by underscores, e.g.
'variable_name'
* Give function names as words in lowercase separated by underscores, e.g. 'function_name'
* Use upper camel case to write class names, e.g. 'MyClass'
* Try to select class, function and variable names that are meaningful wherever possible instead of using names like x, y, a, etc.
* Comment as much as you can. Write comments for each function stating what it does, what its input parameters are and what its outputs are. An example of comments in a function can be seen below:

```
def add_function(num1, num2):
    """
    Function to add two integers.

    Parameters
    ----------
    num1: int
        first number
    num2: int
        second number

    Returns
    -------
    sum
        an integer which is the sum of input integers
    """

    sum = num1 + num2
    return(sum)
```

### JavaScript Style Guide

* Give variable names as words in lowercase separated by underscores, e.g. 'model_json'
* Use lower camel case to write function names, e.g. 'createKerasModel'
* Try to select function and variable names that are meaningful wherever possible instead of using names like x, y, a, etc.
* Use [js-beautify](https://github.com/beautify-web/js-beautify) as a code formatter. We are using the 'Beautify' extension in VS Code, which implements js-beautify, and the 'HTML-CSS-JS Prettify' extension in the Sublime text editor, which implements js-beautify.

### Git Commit Messages

* Use the present tense ("Add feature" not "Added feature")
* Use the imperative mood ("Move cursor to..." not "Moves cursor to...")
* Limit the first line to 72 characters or less

### Documentation Style Guide

* Use [Markdown](https://daringfireball.net/projects/markdown/)
44.91358
328
0.742716
eng_Latn
0.995924
e17ff81034d81e2a9e4f61e02b0594a2d9d853e9
2,350
md
Markdown
presentations/README.md
butlife/teach.github.com
e63fc94fc708b9f42f3f4c8fdf6641e374f19bbb
[ "CC-BY-3.0" ]
1
2019-03-17T23:41:01.000Z
2019-03-17T23:41:01.000Z
presentations/README.md
butlife/teach.github.com
e63fc94fc708b9f42f3f4c8fdf6641e374f19bbb
[ "CC-BY-3.0" ]
1
2015-08-20T21:14:21.000Z
2017-01-25T17:33:46.000Z
presentations/README.md
butlife/teach.github.com
e63fc94fc708b9f42f3f4c8fdf6641e374f19bbb
[ "CC-BY-3.0" ]
1
2019-12-12T03:17:35.000Z
2019-12-12T03:17:35.000Z
# HydeSlides

## Building from Existing Chapters

Creating a new slide deck (within teach.github.com) from existing slide "chapters" is easy:

1. Create a new HTML file in /presentations
2. Add YAML front matter with `layout`, `title`, `chapters` fields

`layout` must be set to `slidedeck`; `title` can be whatever you like; `chapters` is a quoted, comma separated list of tag names of slides in the _posts directory.

## Creating new Chapter Content

All "chapter" content for teach.github.com HydeSlides is located in the `_posts` directory. Subfolders of markdown files are used only for ease of human-readable organization.

A chapter consists of a `_posts/<yourchapter>` and markdown files consisting of four YAML front matter fields: `chapter`, `layout`, `title`, `tags`.

* `chapter` serves as the string for the cover slide auto-built by HydeSlides
* `layout` must be set to slide
* `title` must be a string or, to hide the slide header, an empty string, e.g. `''`
* `tags`, for simplicity's sake, is only assigned one value, usually the same name as the chapter folder

## Notes & What's Next Support

### Notes

Speaker notes, shown only on the "split" screen displayed by pressing the S key, are included on slides in an HTML-wrapped element with `class="notes"`.

{% include hydeslides/notes-open.html %}
Oh hey, these are some notes. They'll be hidden in your presentation, but you can see them if you open the speaker notes window (hit 's' on your keyboard).
{% include hydeslides/notes-close.html %}

### Slide Deck "What's Next" Feature

Pressing S will launch the slide deck What's Next with presenter notes (if any are in the original slide markdown). One idiosyncrasy of the core RevealJS behavior is that browser focus must be on the main slide deck window when navigating slides.
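For illustration, the front matter for a new deck in /presentations might look like the fragment below; the title and the chapter tag names are made up, and would be replaced with tags that actually exist in the _posts directory:

```yaml
---
layout: slidedeck
title: Git Foundations
chapters: "git-basics, branching, remotes"
---
```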
## Theming /dependencies

* SASS theming is found under `/dependencies/github/css` and controls all RevealJS and slide presentation overrides
* Graphical and JS dependencies are centrally stored in `/dependencies`
* Assets used throughout any slide deck should be stored in `/assets`

## To Do

* Download Google Fonts for offline slide use
* Simplify layouts.scss styles

---

Thanks go out to the contributors of HydeSlides' core components and features: Tom Preston-Werner for Jekyll, Hakim El Hattab for Reveal-JS, and Dave Gandy for Font-Awesome.
45.192308
176
0.760851
eng_Latn
0.997919
e180013d7344388c3b2c01a8416bf180b41b6e27
1,299
md
Markdown
docs/exceptions/DeltaConcurrentModificationException.md
japila-books/delta-lake-internals
58c82cb8189e954ce8a2d85a535c0d5c4fbad5d9
[ "Apache-2.0" ]
100
2020-01-02T20:11:12.000Z
2022-03-28T15:04:39.000Z
docs/exceptions/DeltaConcurrentModificationException.md
jaceklaskowski/delta-lake-internals
5be53a77a5418c0b771d0e446ff884dacb7fd5da
[ "Apache-2.0" ]
1
2022-03-16T10:53:04.000Z
2022-03-18T08:00:58.000Z
docs/exceptions/DeltaConcurrentModificationException.md
japila-books/delta-lake-internals
58c82cb8189e954ce8a2d85a535c0d5c4fbad5d9
[ "Apache-2.0" ]
24
2020-01-24T22:43:37.000Z
2022-03-13T14:55:58.000Z
# DeltaConcurrentModificationException

`DeltaConcurrentModificationException` is an extension of the `ConcurrentModificationException` ([Java]({{ java.api }}/java.base/java/util/ConcurrentModificationException.html)) abstraction for [commit conflict exceptions](#implementations).

!!! note
    There are two `DeltaConcurrentModificationException` abstractions in two different packages:

    * `io.delta.exceptions`
    * `org.apache.spark.sql.delta` (obsolete since 1.0.0)

## Implementations

* [ConcurrentAppendException](ConcurrentAppendException.md)
* [ConcurrentDeleteDeleteException](ConcurrentDeleteDeleteException.md)
* [ConcurrentDeleteReadException](ConcurrentDeleteReadException.md)
* [ConcurrentTransactionException](ConcurrentTransactionException.md)
* [ConcurrentWriteException](ConcurrentWriteException.md)
* [MetadataChangedException](MetadataChangedException.md)
* [ProtocolChangedException](ProtocolChangedException.md)

## Creating Instance

`DeltaConcurrentModificationException` takes the following to be created:

* <span id="message"> Error Message

??? note "Abstract Class"
    `DeltaConcurrentModificationException` is an abstract class and cannot be created directly. It is created indirectly for the [concrete DeltaConcurrentModificationExceptions](#implementations).
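Because every commit-conflict exception extends `ConcurrentModificationException`, client code can handle all of the subtypes listed above with a single catch block. The sketch below is self-contained for illustration only: `FakeConcurrentAppendException` is a hypothetical stand-in for a real conflict exception such as `ConcurrentAppendException`, which in Delta Lake lives in the `io.delta.exceptions` package.

```java
import java.util.ConcurrentModificationException;

public class ConflictHandlingDemo {
    // Hypothetical stand-in for a concrete commit-conflict exception
    static class FakeConcurrentAppendException extends ConcurrentModificationException {
        FakeConcurrentAppendException(String message) { super(message); }
    }

    public static void main(String[] args) {
        try {
            // A transaction commit that loses a conflict check would throw here
            throw new FakeConcurrentAppendException("files were added by a concurrent update");
        } catch (ConcurrentModificationException e) {
            // One handler covers every commit-conflict subtype, e.g. to retry the transaction
            System.out.println("conflict: " + e.getMessage());
        }
    }
}
```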
44.793103
241
0.821401
kor_Hang
0.527198
e1802806c9164b82012c0ed89887244e726f2059
696
md
Markdown
osc/docs/RouteLight.md
jerome-jutteau/osc-sdk-go
d654809d06993841a5c3ad7d619118df159b04e3
[ "BSD-3-Clause" ]
null
null
null
osc/docs/RouteLight.md
jerome-jutteau/osc-sdk-go
d654809d06993841a5c3ad7d619118df159b04e3
[ "BSD-3-Clause" ]
null
null
null
osc/docs/RouteLight.md
jerome-jutteau/osc-sdk-go
d654809d06993841a5c3ad7d619118df159b04e3
[ "BSD-3-Clause" ]
null
null
null
# RouteLight

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**DestinationIpRange** | **string** | The IP range used for the destination match, in CIDR notation (for example, 10.0.0.0/24). | [optional]
**RouteType** | **string** | The type of route (always &#x60;static&#x60;). | [optional]
**State** | **string** | The current state of the static route (&#x60;pending&#x60; \| &#x60;available&#x60; \| &#x60;deleting&#x60; \| &#x60;deleted&#x60;). | [optional]

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
49.714286
174
0.599138
eng_Latn
0.335853
e180b1a463cf97337f20da25283ba68b7336c0f2
583
md
Markdown
mocha_tests/compression/en_gen_tn/content/08/04.md
unfoldingWord-dev/node-resource-container
20c4b7bfd2fa3f397ee7e0e743567822912c305b
[ "MIT" ]
1
2016-12-15T03:59:00.000Z
2016-12-15T03:59:00.000Z
mocha_tests/compression/en_gen_tn/content/08/04.md
unfoldingWord-dev/node-resource-container
20c4b7bfd2fa3f397ee7e0e743567822912c305b
[ "MIT" ]
1
2016-12-16T18:41:20.000Z
2016-12-16T18:41:20.000Z
mocha_tests/compression/out/en_gen_tn.reopened/content/08/04.md
unfoldingWord-dev/node-resource-container
20c4b7bfd2fa3f397ee7e0e743567822912c305b
[ "MIT" ]
null
null
null
# came to rest

"landed" or "stopped on solid ground"

# in the seventh month, on the seventeenth day of the month ... tenth month

Since Moses wrote this book it is possible he is referring to the seventh month and tenth month of the Hebrew calendar. But this is uncertain. (See: [[:en:ta:vol2:translate:translate_hebrewmonths]] and [[:en:ta:vol2:translate:translate_ordinal]])

# On the first day of the month

"on the first day of the tenth month"

# appeared

This can be made more explicit as "appeared above the surface of the water." (See: [[:en:ta:vol1:translate:figs_explicit]])
38.866667
246
0.754717
eng_Latn
0.999396
e180f9a40790251c58ff402547321e879c273343
2,599
md
Markdown
detailedDesign.md
miladi75/exercise-6-gruppe21
186c55aadaeff6aea93f827aa3312fccf8d7a5b6
[ "Unlicense" ]
null
null
null
detailedDesign.md
miladi75/exercise-6-gruppe21
186c55aadaeff6aea93f827aa3312fccf8d7a5b6
[ "Unlicense" ]
null
null
null
detailedDesign.md
miladi75/exercise-6-gruppe21
186c55aadaeff6aea93f827aa3312fccf8d7a5b6
[ "Unlicense" ]
1
2021-03-12T20:37:23.000Z
2021-03-12T20:37:23.000Z
# Detailed Synchronizer module design

Our design philosophy is to have minimal fault-handling and simple module design, which we aim to achieve by having an order synchronizer module which handles the "difficult" task of synchronizing and delegating hall orders between the different elevators. This module should be able to handle most failures, including elevators disconnecting, motors losing power and programs crashing. It should also be able to automatically get an elevator up to speed with the current state of the system after a restart/reconnect.

The main part of the module will consist of a floor-order state machine, and functions to handle hall-order state events. Each elevator will have a state for each of the hall call buttons, with four different possible state values. All elevators will broadcast their floor-order state machines, and the different elevators communicate by comparing their floor-order states. To make the system robust, the ways in which a floor-order state can change are strictly limited.<br>

## Synchronizer FSM

<img width="614" alt="OrderFSMfigure" src="https://user-images.githubusercontent.com/61008623/110940089-a5ea4b00-8336-11eb-81e3-7a7c611ab66f.png">

- Table with one entry for each hall button type. Four possible states: None - New - Handling - Complete
- The synchronizer state is broadcast by all elevators.
- Transitions go from left to right, except when an order changes from Complete to None.<br>

None -> New
- A button is pressed
- Another syncFSM is in the New or Handling state

New -> Handling
- The elevator with the lowest cost function takes the order and turns on the light.
- The order's time-to-complete is exceeded

Handling -> Complete
- The transition from Handling to Complete occurs when the door opens at the desired floor. The order will be in the Complete state for ~100 milliseconds

New -> Complete
- The syncFSM sees that another elevator has completed the order

Complete -> None
- The transition happens after x ms.
This should be enough time for all elevators to change to Complete.

## Error handling

If an elevator does not have the lowest cost function, it remains in the New state and starts a timer. If the order is not completed before the timeout, another elevator takes the order. This ensures that the orders will be served if an elevator disconnects, is obstructed, etc.

The syncFSM is initialized with all orders as None. An elevator is initialized when it reconnects after being disconnected. As the None state has no impact on the other syncFSMs, elevators disconnecting and reconnecting will not affect the existing orders.
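As a rough illustration (not the project's actual code), the per-button state machine and the transitions described above could be sketched as follows; the method names are made up, and the triggers they model are noted in the comments:

```java
public class HallOrderFsmDemo {
    enum OrderState { NONE, NEW, HANDLING, COMPLETE }

    static class HallOrderFsm {
        OrderState state = OrderState.NONE;

        // None -> New: a button press, or another syncFSM broadcasts New/Handling
        void registerOrder()     { if (state == OrderState.NONE) state = OrderState.NEW; }
        // New -> Handling: this elevator wins the cost-function bid, or the timeout fires
        void takeOrder()         { if (state == OrderState.NEW) state = OrderState.HANDLING; }
        // Handling -> Complete: the door opens at the ordered floor
        void doorOpenedAtFloor() { if (state == OrderState.HANDLING) state = OrderState.COMPLETE; }
        // New -> Complete: another elevator reports the order as completed
        void peerCompleted()     { if (state == OrderState.NEW) state = OrderState.COMPLETE; }
        // Complete -> None: the short grace period has elapsed
        void graceTimerExpired() { if (state == OrderState.COMPLETE) state = OrderState.NONE; }
    }

    public static void main(String[] args) {
        HallOrderFsm fsm = new HallOrderFsm();
        fsm.registerOrder();       // None -> New
        fsm.takeOrder();           // New -> Handling
        fsm.doorOpenedAtFloor();   // Handling -> Complete
        fsm.graceTimerExpired();   // Complete -> None
        System.out.println(fsm.state); // -> NONE
    }
}
```

Because each transition checks the current state before acting, an out-of-order event (for example, a peer completion arriving after the grace timer) is simply ignored, which matches the "strictly limited transitions" rule above.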
70.243243
341
0.799923
eng_Latn
0.999429
e181a65eea3dcd99d82139ab067fd61b37f412a4
17
md
Markdown
README.md
Godluck41/QRcode-scanner-
014550193baf96b744ed020b5be8fdd395623fdb
[ "Apache-2.0" ]
null
null
null
README.md
Godluck41/QRcode-scanner-
014550193baf96b744ed020b5be8fdd395623fdb
[ "Apache-2.0" ]
null
null
null
README.md
Godluck41/QRcode-scanner-
014550193baf96b744ed020b5be8fdd395623fdb
[ "Apache-2.0" ]
null
null
null
# QRcode-scanner-
17
17
0.764706
eng_Latn
0.244979
e18226cbae9a9de04ddddc8b388e39c4a93ed4d4
8,223
md
Markdown
README.md
pixel-perfect-metodology/gemini
61e1b3ba12d5787001daa92b643a854c4d5ce759
[ "MIT" ]
1,359
2015-04-16T14:33:48.000Z
2022-03-03T00:38:27.000Z
README.md
pixel-perfect-metodology/gemini
61e1b3ba12d5787001daa92b643a854c4d5ce759
[ "MIT" ]
754
2015-04-16T11:34:11.000Z
2021-06-29T07:09:39.000Z
README.md
pixel-perfect-metodology/gemini
61e1b3ba12d5787001daa92b643a854c4d5ce759
[ "MIT" ]
199
2015-04-23T08:01:05.000Z
2022-03-19T07:29:09.000Z
# Gemini

[![npm](https://img.shields.io/npm/v/gemini.svg?maxAge=2592000)](https://www.npmjs.com/package/gemini) [![Build Status](https://travis-ci.org/gemini-testing/gemini.svg?branch=master)](https://travis-ci.org/gemini-testing/gemini) [![Coverage Status](https://img.shields.io/coveralls/gemini-testing/gemini.svg)](https://coveralls.io/r/gemini-testing/gemini) [![Join the chat at https://gitter.im/gemini-testing/gemini](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/gemini-testing/gemini?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Stories on waffle.io](https://img.shields.io/badge/waffle-dashboard-green.svg)](http://waffle.io/gemini-testing/gemini)

[Gemini](https://github.com/gemini-testing/gemini) is a utility for regression testing the visual appearance of web pages.

Gemini allows you to:

* Work with different browsers:
  - Google Chrome (tested in latest version)
  - Mozilla Firefox (tested in latest version)
  - IE8+
  - Opera 12+
* Test separate sections of a web page
* Include the `box-shadow` and `outline` properties when calculating element position and size
* Ignore some special case differences between images (rendering artifacts, text caret, etc.)
* Gather CSS test coverage statistics

**Gemini** was created at [Yandex](http://www.yandex.com/) and is especially useful to UI library developers.
## Quick start

### Installing

```
npm install -g gemini
npm install -g selenium-standalone
selenium-standalone install
```

### Configuring

Put the `.gemini.js` file in the root of your project:

```javascript
module.exports = {
    rootUrl: 'http://yandex.ru',
    gridUrl: 'http://127.0.0.1:4444/wd/hub',

    browsers: {
        chrome: {
            desiredCapabilities: {
                browserName: 'chrome'
            }
        }
    }
};
```

### Writing tests

Write a test and put it in the `gemini` folder in the root of your project:

```javascript
gemini.suite('yandex-search', (suite) => {
    suite.setUrl('/')
        .setCaptureElements('.home-logo')
        .capture('plain');
});
```

### Saving reference images

You have written a new test and should save a reference image for it:

```
gemini update
```

### Running tests

Start `selenium-standalone` in a separate tab before running the tests:

```
selenium-standalone start
```

Run gemini tests:

```
gemini test
```

## Dependencies

Required software:

1. WebDriver server implementation. There are several options:

   - [Selenium Server](http://docs.seleniumhq.org/download/) — for testing in different browsers. Launch with the `selenium-standalone start` command (if you get an error like "No Java runtime present, requesting install.", install the [Java Development Kit (JDK)](https://www.oracle.com/technetwork/java/javase/downloads/index.html) for your OS).
   - [ChromeDriver](https://sites.google.com/a/chromium.org/chromedriver/) — for testing in Google Chrome. Launch with the `chromedriver --port=4444 --url-base=wd/hub` command.
   - [PhantomJS](http://phantomjs.org/) — launch with the `phantomjs --webdriver=4444` command.
   - Cloud WebDriver services, such as [SauceLabs](http://saucelabs.com/) or [BrowserStack](http://www.browserstack.com/)

2. Compiler with support for C++11 (`GCC@4.6` or higher). This is a [png-img](https://github.com/gemini-testing/png-img) requirement. Compiling on Windows machines requires the [node-gyp prerequisites](https://github.com/nodejs/node-gyp#on-windows).
## Installing

To install the utility, use the [npm](https://www.npmjs.org/) `install` command:

```sh
npm install -g gemini
```

Global installation is used for launching commands.

## Configuring

**Gemini** is configured using a config file at the root of the project. Gemini can use one of the following files:

* `.gemini.conf.js`
* `.gemini.conf.json`
* `.gemini.conf.yml`
* `.gemini.js`
* `.gemini.json`
* `.gemini.yml`

Let's say we want to run our tests only in the locally installed `PhantomJS`. In this case, the minimal configuration file will only need to have the root URL of your web app and the WebDriver capabilities of `PhantomJS`. For example,

```yaml
rootUrl: http://yandex.com
browsers:
  PhantomJS:
    desiredCapabilities:
      browserName: phantomjs
```

Also, you need to run `PhantomJS` manually in `WebDriver` mode:

```
phantomjs --webdriver=4444
```

If you are using a remote WebDriver server, you can specify its URL with the `gridUrl` option:

```yaml
rootUrl: http://yandex.com
gridUrl: http://selenium.example.com:4444/wd/hub
browsers:
  chrome:
    desiredCapabilities:
      browserName: chrome
      version: "45.0"

  firefox:
    desiredCapabilities:
      browserName: firefox
      version: "39.0"
```

You can also set up each browser to have its own node:

```yaml
rootUrl: http://yandex.com
browsers:
  chrome:
    gridUrl: http://chrome-node.example.com:4444/wd/hub
    desiredCapabilities:
      browserName: chrome
      version: "45.0"

  firefox:
    gridUrl: http://firefox-node.example.com:4444/wd/hub
    desiredCapabilities:
      browserName: firefox
      version: "39.0"
```

### Other configuration options

[See the details](doc/config.md) of the config file structure and available options.

## Writing tests

Each of the blocks that are being tested may be in one of the determined states. States are tested with the help of chains of step-by-step actions declared in a block's test suites.
For example, let's write a test for a search block at [yandex.com](http://www.yandex.com):

```javascript
gemini.suite('yandex-search', function(suite) {
    suite.setUrl('/')
        .setCaptureElements('.search2__input')
        .capture('plain')
        .capture('with text', function(actions, find) {
            actions.sendKeys(find('.search2__input .input__control'), 'hello gemini');
        });
});
```

We are creating a new test suite `yandex-search`, assuming that we will capture the `.search2__input` element from the root URL `http://yandex.com`.

We know that the block has two states:

* `plain` — right after the page is loaded
* `with text` — with the `hello gemini` text inserted into `.search2__input .input__control`

States are executed one after another in the order in which they are defined, without the browser reloading in between.

[See the details](doc/tests.md) of test creation methods.

## Using CLI

To complete the test creation procedure, you need to take reference shots using the following command:

```
gemini update [paths to test suites]
```

To launch a test (to compare the current state of a block with a reference shot), use the command:

```
gemini test [paths to test suites]
```

[See the details](doc/commands.md) of interaction with CLI and available options.

## GUI

You can use the `Gemini` graphical user interface instead of the command line. It is located in the [gemini-gui](https://github.com/gemini-testing/gemini-gui) package and must be installed additionally:

```
npm install -g gemini-gui
```

GUI advantages:

* Handy preview of reference shots
* Clear real-time demonstration of the differences between a reference shot and the current state of a block
* Easy to update reference shots

## Plugins

Gemini can be extended with plugins. You can choose from the [existing plugins](https://www.npmjs.com/browse/keyword/gemini-plugin) or [write your own](doc/plugins.md).
To use a plugin, install and enable it in your `.gemini.yml`:

```yaml
system:
  plugins:
    some-awesome-plugin:
      plugin-option: value
```

## HTML report

To see the difference between the current state of a block and a reference picture more clearly, use the [HTML reporter](https://github.com/gemini-testing/html-reporter), a plugin for Gemini. This plugin produces an HTML report that displays the reference image, the current image and the differences between them, for each state in each browser. When all tests are completed, you will see a link to the HTML report.

## Programmatic API

To use Gemini in your scripts or build tools, you can use the experimental [programmatic API](doc/programmatic-api.md).

## Events

To learn more about all events in Gemini, see the [events documentation](doc/events.md).
27.228477
283
0.719932
eng_Latn
0.931251
e182c2d1c7fa74842ba6120fdc66e1709edccace
289
md
Markdown
src/main/resources/docs/description/DM_BOXED_PRIMITIVE_FOR_PARSING.md
codacy/codacy-spotbugs
f360b79a4e79f20a3c3458bb7d6648f744eb5bbc
[ "Apache-2.0" ]
null
null
null
src/main/resources/docs/description/DM_BOXED_PRIMITIVE_FOR_PARSING.md
codacy/codacy-spotbugs
f360b79a4e79f20a3c3458bb7d6648f744eb5bbc
[ "Apache-2.0" ]
5
2019-07-18T15:08:02.000Z
2022-02-02T17:42:24.000Z
src/main/resources/docs/description/DM_BOXED_PRIMITIVE_FOR_PARSING.md
codacy/codacy-spotbugs
f360b79a4e79f20a3c3458bb7d6648f744eb5bbc
[ "Apache-2.0" ]
3
2019-07-18T14:58:51.000Z
2020-02-17T11:45:23.000Z
# [Boxing/unboxing to parse a primitive](https://spotbugs.readthedocs.io/en/latest/bugDescriptions.html#DM_BOXED_PRIMITIVE_FOR_PARSING)

A boxed primitive is created from a String just to extract the unboxed primitive value. It is more efficient to call the static `parseXXX` method directly.
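A minimal sketch of the flagged pattern and its fix (class and variable names are illustrative only):

```java
public class BoxedParsing {
    public static void main(String[] args) {
        // Flagged pattern: a boxed Integer is allocated, then immediately unboxed
        int slow = Integer.valueOf("42").intValue();

        // Preferred: parse straight to the primitive, no intermediate box
        int fast = Integer.parseInt("42");

        System.out.println(slow == fast); // prints: true
    }
}
```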
72.25
135
0.816609
eng_Latn
0.935732
e182c72d5b9f71247d28349a4bada23ab23d4027
402
md
Markdown
README.md
JustGhostz/inStorm
a774f0b17f7fb744dfe603b7568e96f52ba16eff
[ "MIT" ]
6
2020-10-26T10:31:43.000Z
2021-12-27T17:53:10.000Z
README.md
JustGhostz/inStorm
a774f0b17f7fb744dfe603b7568e96f52ba16eff
[ "MIT" ]
null
null
null
README.md
JustGhostz/inStorm
a774f0b17f7fb744dfe603b7568e96f52ba16eff
[ "MIT" ]
null
null
null
# instorm

Instagram cracker written in the Go language.

![alt text](img/1.png)
![alt text](img/2.png)

# installation

```
git clone https://github.com/justghostz/instorm.git ~/go/src/instorm
```

# libraries

```
go get github.com/fatih/color
go get github.com/google/uuid
```

# run

```
go run instorm
```

# build

```
cd ~/go/src/instorm && go build -o build/instorm instorm
```

# about

instagram: @0xhades
15.461538
68
0.68408
eng_Latn
0.33635
e182f77ef75ad37d37af903e5410bde4385b7567
12,326
md
Markdown
specs/Parking/OnStreetParking/doc/spec.md
mathi123/dataModels
0317d4beb6dd0c9af0f4fead925e40d4f0286edf
[ "MIT" ]
null
null
null
specs/Parking/OnStreetParking/doc/spec.md
mathi123/dataModels
0317d4beb6dd0c9af0f4fead925e40d4f0286edf
[ "MIT" ]
null
null
null
specs/Parking/OnStreetParking/doc/spec.md
mathi123/dataModels
0317d4beb6dd0c9af0f4fead925e40d4f0286edf
[ "MIT" ]
null
null
null
# On Street Parking

## Description

A site or open space zone, on street (metered or not), with direct access from a road, intended for parking vehicles. In DATEX II version 2.3 terminology it corresponds to an *UrbanParkingSite* of type *onStreetParking*. A data dictionary for DATEX II terms can be found at [http://datexbrowser.tamtamresearch.com/](http://datexbrowser.tamtamresearch.com/).

## Data Model

+ `id` : Unique identifier.

+ `type` : Entity type. It must be equal to `OnStreetParking`.

+ `dateCreated` : Entity's creation timestamp.
    + Attribute type: [DateTime](https://schema.org/DateTime)
    + Read-Only. Automatically generated.

+ `dateModified` : Last update timestamp of this entity.
    + Attribute type: [DateTime](https://schema.org/DateTime)
    + Read-Only. Automatically generated.

+ `category` : Street parking category.
    + Attribute type: List of [Text](http://schema.org/Text)
    + Allowed values:
        + (`forDisabled`, `forResidents`, `forLoadUnload`, `onlyWithPermit`, `forELectricalCharging`)
        + (`free`, `feeCharged`)
        + (`blueZone`, `greenZone`)
        + (`taxiStop`)
        + (`shortTerm`, `mediumTerm`)
        + Any value not covered by the above enumeration and meaningful for the application.
    + Mandatory

+ `location` : Geolocation of the parking site represented by a GeoJSON (Multi)Polygon.
    + Attribute type: `geo:json`.
    + Normative References: [https://tools.ietf.org/html/rfc7946](https://tools.ietf.org/html/rfc7946)
    + Mandatory if `address` is not defined.

+ `address` : Registered onstreet parking civic address.
    + Normative References: [https://schema.org/address](https://schema.org/address)
    + Mandatory if `location` is not defined.

+ `name` : Name given to the onstreet parking zone.
    + Normative References: [https://schema.org/name](https://schema.org/name)
    + Mandatory

+ `chargeType` : Type of charge(s) performed by the parking site.
    + Attribute type: List of [Text](http://schema.org/Text)
    + Allowed values: Some of those defined by the DATEX II version 2.3 *ChargeTypeEnum* enumeration:
        + (`flat`, `minimum`, `maximum`, `additionalIntervalPrice`, `seasonTicket`, `temporaryPrice`, `firstIntervalPrice`, `annualPayment`, `monthlyPayment`, `free`, `unknown`, `other`)
        + Any other application-specific value
    + Mandatory

+ `requiredPermit` : This attribute captures what permit(s) might be needed to park at this site. The semantics is that at least *one of* these permits is needed to park. When a permit is composed of more than one item (an *and* combination), the items can be combined with a ",". For instance, "residentPermit,disabledPermit" states that both a resident permit and a disabled permit are needed to park. If empty or `null`, no permit is needed.
    + Attribute type: List of [Text](http://schema.org/Text)
    + Allowed values: The following, defined by the *PermitTypeEnum* enumeration of DATEX II version 2.3:
        + oneOf (`fairPermit`, `governmentPermit`, `residentPermit`, `disabledPermit`, `blueZonePermit`, `careTakingPermit`, `carpoolingPermit`, `carSharingPermit`, `emergencyVehiclePermit`, `maintenanceVehiclePermit`, `roadWorksPermit`, `taxiPermit`, `transportationPermit`, `noPermitNeeded`)
        + Any other application-specific value
    + Mandatory. It can be `null`.

+ `permitActiveHours` : This attribute allows capturing situations when a permit is only needed at specific hours or days of the week. It is a structured value which must contain a subproperty for each required permit, indicating when the permit is active. If nothing is specified (or `null`) for a permit, the permit is always required. A `null` or empty object means always active. The syntax must be conformant with the schema.org [opening hours specification](https://schema.org/openingHours). For instance, a blue zone which is only active on weekdays will be encoded as `"blueZonePermit": "Mo,Tu,We,Th,Fr,Sa 09:00-20:00"`.
    + Attribute type: [StructuredValue](http://schema.org/StructuredValue)
    + Mandatory. It can be `null`.

+ `allowedVehicleType` : Vehicle type allowed (only one per on street parking).
    + Attribute type: [Text](http://schema.org/Text)
    + Allowed Values: The following values defined by *VehicleTypeEnum*, [DATEX II version 2.3](http://www.datex2.eu/sites/www.datex2.eu/files/DATEXIISchema_2_2_2_1.zip):
        + (`bicycle`, `bus`, `car`, `caravan`, `carWithCaravan`, `carWithTrailer`, `constructionOrMaintenanceVehicle`, `lorry`, `moped`, `motorcycle`, `motorcycleWithSideCar`, `motorscooter`, `tanker`, `trailer`, `van`, `anyVehicle`)
    + Mandatory

+ `maximumParkingDuration` : Maximum allowed stay at the site, encoded as an ISO8601 duration. A `null` or empty value indicates an indefinite duration.
    + Attribute type: [Text](http://schema.org/Text)
    + Optional

+ `usageScenario` : Usage scenario. Gives more details about the `category` attribute.
    + Attribute type: List of [Text](http://schema.org/Text)
    + Allowed values: Those defined by the enumeration *ParkingUsageScenarioEnum* of DATEX II version 2.3:
        + (`parkAndRide`, `parkAndCycle`, `parkAndWalk`, `kissAndRide`, `liftshare`, `carSharing`, `vehicleLift`, `loadingBay`, `dropOff`, `overnightParking`, `other`)
        + Or any other value useful for the application and not covered above.
    + Optional

+ `description` : Description of the onstreet parking zone.
    + Normative References: [https://schema.org/description](https://schema.org/description)
    + Optional

+ `areBordersMarked` : Denotes whether parking spots are delimited (with blank lines or similar) or not.
    + Attribute type: [Boolean](https://schema.org/Boolean)
    + Optional

+ `totalSpotNumber` : The total number of spots offered by this parking site. This number can be difficult to obtain for those parking locations on which spots are not clearly marked by lines.
    + Attribute type: [Number](http://schema.org/Number)
    + Allowed values: Any positive integer number or 0.
    + Normative references: DATEX II version 2.3 attribute *parkingNumberOfSpaces* of the *ParkingRecord* class.
    + Optional

+ `availableSpotNumber` : The number of spots available globally, including reserved spaces, such as those for disabled people, long term parkers and so on. This might be harder to estimate at those parking locations on which spot borders are not clearly marked by lines.
    + Attribute type: [Number](http://schema.org/Number)
    + Allowed values: A positive integer number, including 0. It must be lower than or equal to `totalSpotNumber`.
    + Metadata:
        + `timestamp` : Timestamp of the last attribute update
            + Type: [DateTime](https://schema.org/DateTime)
    + Optional

+ `extraSpotNumber` : The number of extra spots *available*, i.e. free. Extra spots are those reserved for special purposes and usually require a permit. Permit details will be found at parking group level (entity of type `ParkingGroup`). This value must aggregate free spots from all groups devoted to special parking conditions.
    + Attribute type: [Number](http://schema.org/Number)
    + Allowed values: A positive integer number, including 0. `extraSpotNumber` plus `availableSpotNumber` must be lower than or equal to `totalSpotNumber`.
    + Metadata:
        + `timestamp` : Timestamp of the last attribute update
            + Type: [DateTime](https://schema.org/DateTime)

+ `occupancyDetectionType` : Occupancy detection method(s).
    + Attribute type: List of [Text](http://schema.org/Text)
    + Allowed values: The following from DATEX II version 2.3 *OccupancyDetectionTypeEnum*:
        + (`none`, `balancing`, `singleSpaceDetection`, `modelBased`, `manual`)
        + Or any other application-specific value
    + Mandatory

+ `parkingMode` : Parking mode(s).
    + Attribute type: List of [Text](http://schema.org/Text)
    + Allowed values: Those defined by the DATEX II version 2.3 *ParkingModeEnum* enumeration:
        + (`perpendicularParking`, `parallelParking`, `echelonParking`)
    + Optional

+ `averageSpotWidth` : The average width of parking spots.
    + Attribute type: [Number](http://schema.org/Number)
    + Default unit: Meters
    + Optional

+ `averageSpotLength` : The average length of parking spots.
    + Attribute type: [Number](http://schema.org/Number)
    + Default unit: Meters
    + Optional

+ `acceptedPaymentMethod` : Accepted payment method(s).
    + Normative references: [https://schema.org/acceptedPaymentMethod](https://schema.org/acceptedPaymentMethod)
    + Optional

+ `image` : A URL containing a photo of this parking site.
    + Normative References: [https://schema.org/image](https://schema.org/image)
    + Optional

+ `refParkingSpot` : Individual parking spots belonging to this on street parking site.
    + Attribute type: List of references to [ParkingSpot](../../ParkingSpot/doc/spec.md)
    + Optional

+ `refParkingGroup` : Reference to the parking group(s) (if any) belonging to this onstreet parking zone.
    + Attribute type: List of references to [ParkingGroup](../../ParkingGroup/doc/spec.md)
    + Optional

+ `areaServed` : Area served by this onstreet parking. Precise semantics can depend on the application or target city. For instance, it can be a neighbourhood, borough or district.
    + Attribute type: [Text](http://schema.org/Text)
    + Optional

**Note**: JSON Schemas only capture the NGSI simplified representation. This means that to test the JSON schema examples with a [FIWARE NGSI version 2](http://fiware.github.io/specifications/ngsiv2/stable) API implementation, you need to use the `keyValues` mode (`options=keyValues`).

## Examples of use

An on street parking which contains a group of parking spots reserved for disabled people. At the root entity level it is announced that special parking spots for disabled people are present and that two of them are free.

Main `OnStreetParking` entity:
```
{
    "id": "santander:daoiz_velarde_1_5",
    "type": "OnStreetParking",
    "category": ["blueZone", "shortTerm", "forDisabled"],
    "allowedVehicleType": "car",
    "chargeType": ["temporaryFee"],
    "requiredPermit": ["blueZonePermit", "disabledPermit"],
    "permitActiveHours": {
        "blueZonePermit": "Mo, Tu, We, Th, Fr, Sa 09:00-20:00"
    },
    "maximumParkingDuration": "PT2H",
    "availableSpotNumber": 3,
    "totalSpotNumber": 6,
    "extraSpotNumber": 2,
    "dateModified": "2016-06-02T09:25:55.00Z",
    "location": {
        "type": "Polygon",
        "coordinates": [
            [
                [-3.80356167695194, 43.46296641666926],
                [-3.803161973253841, 43.46301091092682],
                [-3.803147082548618, 43.462879859445884],
                [-3.803536474744068, 43.462838666196674],
                [-3.80356167695194, 43.46296641666926]
            ]
        ]
    },
    "areaServed": "Zona Centro",
    "refParkingGroup": ["daoiz-velarde-1-5-main", "daoiz-velarde-1-5-disabled"]
}
```

Two different parking groups are needed in this case:

A/ Subrogated `ParkingGroup` which gives details about the regular parking spots:

```
{
    "id": "daoiz-velarde-1-5-main",
    "type": "ParkingGroup",
    "category": ["onstreet", "blueZone", "shortTerm"],
    "allowedVehicleType": "car",
    "chargeType": ["temporaryFee"],
    "refParkingSite": "daoiz-velarde-1-5",
    "totalSpotNumber": 4,
    "availableSpotNumber": 1,
    "requiredPermit": "blueZonePermit"
    /* Other required attributes */
}
```

B/ Subrogated `ParkingGroup`. `refParkingSite` is a pointer to the root entity. All the parking spots are free:

```
{
    "id": "daoiz-velarde-1-5-disabled",
    "type": "ParkingGroup",
    "category": ["onstreet", "blueZone", "shortTerm", "onlyDisabled"],
    "allowedVehicleType": "car",
    "chargeType": ["temporaryFee"],
    "refParkingSite": "daoiz-velarde-1-5",
    "description": "Two parking spots reserved for disabled people",
    "totalSpotNumber": 2,
    "availableSpotNumber": 2,
    "requiredPermit": "disabledPermit,blueZonePermit"
    /* Other required attributes */
}
```

## Test it with a real service

## Open issues

+ How to model tariffs
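The spot-count invariants stated in the data model (availableSpotNumber not exceeding totalSpotNumber, and extraSpotNumber plus availableSpotNumber not exceeding totalSpotNumber) can be checked with a short sketch. Python is chosen here purely for illustration; the attribute names follow the data model above.

```python
def validate_spot_counts(entity: dict) -> list:
    """Return a list of violation messages (empty when the entity is valid)."""
    errors = []
    total = entity.get("totalSpotNumber")
    available = entity.get("availableSpotNumber")
    extra = entity.get("extraSpotNumber")

    # totalSpotNumber: any positive integer or 0
    if total is not None and total < 0:
        errors.append("totalSpotNumber must be a positive integer or 0")
    # availableSpotNumber must be lower than or equal to totalSpotNumber
    if available is not None and total is not None and available > total:
        errors.append("availableSpotNumber must be <= totalSpotNumber")
    # extraSpotNumber + availableSpotNumber must be <= totalSpotNumber
    if (extra is not None and available is not None and total is not None
            and extra + available > total):
        errors.append("extraSpotNumber + availableSpotNumber must be <= totalSpotNumber")
    return errors


# The example entity above satisfies all three constraints
example = {"totalSpotNumber": 6, "availableSpotNumber": 3, "extraSpotNumber": 2}
print(validate_spot_counts(example))  # prints: []
```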
46.513208
143
0.691222
eng_Latn
0.859725
e183370876a457b9f5bf62985fee2cad3e233b24
18
md
Markdown
README.md
sevanspowell/hie-nix-wrapper
8442b101121a9026e0180619d3ac34a7c9038203
[ "BSD-3-Clause" ]
null
null
null
README.md
sevanspowell/hie-nix-wrapper
8442b101121a9026e0180619d3ac34a7c9038203
[ "BSD-3-Clause" ]
null
null
null
README.md
sevanspowell/hie-nix-wrapper
8442b101121a9026e0180619d3ac34a7c9038203
[ "BSD-3-Clause" ]
null
null
null
# hie-nix-wrapper
9
17
0.722222
eng_Latn
0.634886
e1848032a26ef7c8828311a7bd417f415266958c
180
md
Markdown
content/blog/3/3.md
olavea/3.8.2_Gro_Grevling_og_Herr_Havre_Rev_meets_Lillian_5_years
1bc098f971a168aad11da4fcf7b87edd85baa45e
[ "MIT" ]
null
null
null
content/blog/3/3.md
olavea/3.8.2_Gro_Grevling_og_Herr_Havre_Rev_meets_Lillian_5_years
1bc098f971a168aad11da4fcf7b87edd85baa45e
[ "MIT" ]
null
null
null
content/blog/3/3.md
olavea/3.8.2_Gro_Grevling_og_Herr_Havre_Rev_meets_Lillian_5_years
1bc098f971a168aad11da4fcf7b87edd85baa45e
[ "MIT" ]
null
null
null
---
title: "3"
date: "2015-05-06T23:46:37.121Z"
---

![Geir Gliser'n Grevling & Herr Havre Rev](./image001.jpg)

<!-- ![Geir Gliser'n Grevling & Herr Havre Rev](./image002.jpg) -->
22.5
67
0.627778
kor_Hang
0.132724
e1856b032c37cd6ff04f1b034c2d982afe9e8924
719
md
Markdown
latex/README.md
kancheng/kan_pku_beamer_theme_advanced
34bcb993a3d8cb69ab527fd5581e5ef9e82d12e7
[ "MIT" ]
null
null
null
latex/README.md
kancheng/kan_pku_beamer_theme_advanced
34bcb993a3d8cb69ab527fd5581e5ef9e82d12e7
[ "MIT" ]
null
null
null
latex/README.md
kancheng/kan_pku_beamer_theme_advanced
34bcb993a3d8cb69ab527fd5581e5ef9e82d12e7
[ "MIT" ]
null
null
null
# PKU-Beamer-Theme

A PKU Beamer theme for academic reports, theses and talks.

# Demo

<img src="img/demo1.jpg" >
<img src="img/demo2.jpg" >
<img src="img/3.jpg" >
<img src="img/4.jpg" >

# Document

Check out the tutorial in [How_to_do_pku_beamer_theme](How_to_do_pku_beamer_theme.pdf) for details.

# Plug-in

- Table generator: https://www.tablesgenerator.com/
- Formula writer: https://mathpix.com/
- Bibliography (bib) manager: https://dblp.uni-trier.de/
- Common LaTeX commands for figures and tables: https://en.wikibooks.org/wiki/LaTeX/Floats,_Figures_and_Captions#Tip

# Overleaf Template

Coming Soon!

# Acknowledgements

This repository is based on [THU-Beamer-Theme](https://github.com/Trinkle23897/THU-Beamer-Theme), and you may refer to it for more details about the code.
31.26087
154
0.746871
yue_Hant
0.652494
e185729c79a27ba9385b0799af8092b15bb3334c
24,776
md
Markdown
README.md
Azure/azure-notificationhubs-ios
35042e4dbe20d1f332d6e2b4fbb1c51285412a31
[ "Apache-2.0" ]
27
2019-04-17T01:31:05.000Z
2021-11-10T19:01:42.000Z
README.md
Azure/azure-notificationhubs-ios
35042e4dbe20d1f332d6e2b4fbb1c51285412a31
[ "Apache-2.0" ]
52
2019-02-26T16:43:14.000Z
2022-02-28T02:43:43.000Z
README.md
Azure/azure-notificationhubs-ios
35042e4dbe20d1f332d6e2b4fbb1c51285412a31
[ "Apache-2.0" ]
19
2019-02-27T11:20:22.000Z
2022-01-31T12:45:45.000Z
[![framework-docs](https://github.com/Azure/azure-notificationhubs-ios/workflows/framework-docs/badge.svg)](https://github.com/Azure/azure-notificationhubs-ios/actions?query=workflow%3Aframework-docs)
[![analyze-test](https://github.com/Azure/azure-notificationhubs-ios/workflows/analyze-test/badge.svg)](https://github.com/Azure/azure-notificationhubs-ios/actions?query=workflow%3Aanalyze-test)

# Microsoft Azure Notification Hubs SDK for Apple

Microsoft Azure Notification Hubs provide a multiplatform, scaled-out push infrastructure that enables you to send mobile push notifications from any backend (in the cloud or on-premises) to any mobile platform. To learn more, visit our [Developer Center](https://azure.microsoft.com/en-us/documentation/services/notification-hubs).

The Azure Notification Hubs SDK for Apple provides capabilities for registering your device and receiving push notifications on macOS and iOS, including tvOS, watchOS and Mac Catalyst.

## Getting Started

The Azure Notification Hubs SDK can be added to your app via CocoaPods, Carthage, Swift Package Manager, or by manually adding the binaries to your project. We have a number of sample applications available, written in both Swift and Objective-C, to help you get started for iOS with Mac Catalyst support and for macOS, with SwiftUI samples coming soon.

**This introduces a new API as of version 3.0. The usage of `SBNotificationHub` with registrations is still supported, but discouraged, as the new `MSNotificationHub` uses the Installation API and modern Apple APIs.**

1. NH Sample App for iOS/Mac Catalyst ([Swift](SampleNHAppSwift) | [Objective-C](SampleNHAppObjC))
2. NH Sample App for macOS ([Swift](SampleNHAppMacSwift) | [Objective-C](SampleNHAppMacObjC))
3. NH Sample App for SwiftUI ([iOS](SampleNHAppSwiftUI) | [macOS](SampleNHAppMacSwiftUI))
4.
   NH Sample Legacy App using Legacy APIs ([Swift](SampleNHAppLegacySwift) | [Objective-C](SampleNHAppLegacyObjC))

### Integration with CocoaPods

Add the following to your `Podfile` to pull in the Azure Notification Hubs SDK:

```ruby
pod 'AzureNotificationHubs-iOS'
```

Run `pod install` to install the pod and then open your project workspace in Xcode.

### Integration with Carthage

Below are the steps on how to integrate the Azure Notification Hubs SDK in your Xcode project using Carthage version 0.30 or higher.

Add the following to your `Cartfile` to include the GitHub repository:

```ruby
# Gets the latest release
github "Azure/azure-notificationhubs-ios"
```

You can also specify a particular version of the Azure Notification Hubs SDK, such as 3.1.4:

```ruby
# Get version in the format of X.X.X, such as 3.1.4
github "Azure/azure-notificationhubs-ios" ~> 3.1.4
```

Once you have this, run `carthage update`. This will fetch the SDK and put it into the `Carthage/Checkouts` folder. Open Xcode and drag `WindowsAzureMessaging.framework` from `Carthage/Builds/iOS` for iOS or `Carthage/Builds/macOS` for macOS. Ensure the app target is checked during the import.

### Integration via Swift Package Manager

The Azure Notification Hubs SDK also supports the Swift Package Manager. To integrate, use the following steps:

1. From the Xcode menu, click File > Swift Packages > Add Package Dependency.
2. In the dialog, enter the repository URL: `https://github.com/Azure/azure-notificationhubs-ios.git`
3. In Version, select Up to Next Major and take the default option.
4. Choose WindowsAzureMessaging in the Package Product column.

### Integration via copying binaries

The Azure Notification Hubs SDK can also be added manually by downloading the release from GitHub on the [Azure Notification Hubs SDK Releases](https://github.com/Azure/azure-notificationhubs-ios/releases) page. The SDK supports the use of XCFramework.
If you want to integrate XCFrameworks into your project, download WindowsAzureMessaging-SDK-Apple-XCFramework.zip from the releases page and unzip it. The resulting folder contents are not platform-specific; instead, the folder contains the XCFramework. Inside you will see a folder called WindowsAzureMessaging-SDK-Apple that contains the framework files for each platform. Copy the framework to a desired location and then add it to Xcode. Ensure the app target is checked during the import.

### Initializing the SDK

To get started with the SDK, you need to configure your Azure Notification Hub with your Apple credentials. To integrate the SDK, you will need the name of the hub as well as a connection string from your Access Policies. Note that you only need the "Listen" permission to intercept push notifications.

You can then import the headers for Swift:

```swift
import WindowsAzureMessaging
```

And for Objective-C as well:

```objc
#import <WindowsAzureMessaging/WindowsAzureMessaging.h>
```

Then we can initialize the SDK with our hub name and connection string. This will automatically register the device using the [Installation API](https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-push-notification-registration-management#installations) with your device token. Using Swift, we can use the `start` method, which then starts the installation and device registration process for push notifications.

Swift:

```swift
let connectionString = "<connection-string>"
let hubName = "<hub-name>"

MSNotificationHub.start(connectionString: connectionString, hubName: hubName)
```

With Objective-C, it is largely the same, calling the `startWithConnectionString` method.

Objective-C:

```objc
NSString *connectionString = @"<connection-string>";
NSString *hubName = @"<hub-name>";

[MSNotificationHub startWithConnectionString:connectionString hubName:hubName];
```

By default, the SDK will initialize with the `UNAuthorizationOptions` for alert, badge and sound. However, if you wish to change that, you can use the `startWithConnectionString:hubName:options` method, specifying which options you wish to use.

Swift:

```swift
// Create with alert, badge and sound
let hubOptions = MSNotificationHubOptions(withOptions: [.alert, .badge, .sound])

// Start SDK
MSNotificationHub.start(connectionString: connectionString!, hubName: hubName!, options: hubOptions!)
```

Objective-C:

```objc
// Create with alert, badge and sound
MSNotificationHubOptions *hubOptions = [[MSNotificationHubOptions alloc] initWithAuthorizationOptions:(UNAuthorizationOptions)(UNAuthorizationOptionAlert | UNAuthorizationOptionSound | UNAuthorizationOptionBadge)];

// Start SDK
[MSNotificationHub startWithConnectionString:connectionString hubName:hubName options:hubOptions];
```

### Intercepting Push Notifications

You can set up a delegate to be notified whenever a push notification is received in the foreground or a background push notification has been tapped by the user.

To get started with intercepting push notifications, implement the `MSNotificationHubDelegate` protocol, and use the `MSNotificationHub.setDelegate` method to set the delegate implementation.

Swift:

```swift
class SetupViewController: MSNotificationHubDelegate // And other imports

// Set up the delegate
MSNotificationHub.setDelegate(self)

// Implement the method
func notificationHub(_ notificationHub: MSNotificationHub!, didReceivePushNotification notification: MSNotificationHubMessage!) {
    let title = notification.title ?? ""
    let body = notification.body ??
"" if (UIApplication.shared.applicationState == .background) { NSLog("Notification received in background: title:\"\(title)\" body:\"\(body)\"") } else { let alertController = UIAlertController(title: title, message: body, preferredStyle: .alert) alertController.addAction(UIAlertAction(title: "OK", style: .cancel)) self.present(alertController, animated: true) } } ``` Objective-C: ```objc @interface SetupViewController <MSNotificationHubDelegate /* Other protocols */> // Set up the delegate [MSNotificationHub setDelegate:self]; // Implement the method - (void)notificationHub:(MSNotificationHub *)notificationHub didReceivePushNotification:(MSNotificationHubMessage *)notification { NSString *title = notification.title ?: @""; NSString *body = notification.body ?: @""; if ([[UIApplication sharedApplication] applicationState] == UIApplicationStateBackground) { NSLog(@"Notification received in the background: title: %@ body: %@", title, body); } else { UIAlertController *alertController = [UIAlertController alertControllerWithTitle:notification.title message:notification.body preferredStyle:UIAlertControllerStyleAlert]; [alertController addAction:[UIAlertAction actionWithTitle:@"OK" style:UIAlertActionStyleCancel handler:nil]]; [self presentViewController:alertController animated:YES completion:nil]; } } ``` ### Tag Management One of the ways to target a device or set of devices is through the use of [tags](https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-tags-segment-push-message#tags), where you can target a specific tag, or a tag expression. The Azure Notification Hub SDK for Apple handles this through top level methods that allow you to add, clear, remove and get all tags for the current installation. In this example, we can add some recommended tags such as the app language preference, and device country code. Swift: ```swift // Get language and country code for common tag values let language = Bundle.main.preferredLocalizations.first! 
let countryCode = NSLocale.current.regionCode!

// Create tags with type_value format
let languageTag = "language_" + language
let countryCodeTag = "country_" + countryCode

MSNotificationHub.addTags([languageTag, countryCodeTag])
```

Objective-C:

```objc
// Get language and country code for common tag values
NSString *language = [[[NSBundle mainBundle] preferredLocalizations] objectAtIndex:0];
NSString *countryCode = [[NSLocale currentLocale] countryCode];

// Create tags with type_value format
NSString *languageTag = [NSString stringWithFormat:@"language_%@", language];
NSString *countryCodeTag = [NSString stringWithFormat:@"country_%@", countryCode];

[MSNotificationHub addTags:@[languageTag, countryCodeTag]];
```

### Template Management

With [Azure Notification Hub Templates](https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-templates-cross-platform-push-messages), you can enable a client application to specify the exact format of the notifications it wants to receive. This is useful when you want to create a more personalized notification, with string replacement to fill the values. The Installation API [allows multiple templates](https://docs.microsoft.com/en-us/azure/notification-hubs/notification-hubs-push-notification-registration-management#templates) for each installation, which gives you greater power to target your users with rich messages.

For example, we can create a template with a body, some headers, and some tags.

Swift:

```swift
// Get language and country code for common tag values
let language = Bundle.main.preferredLocalizations.first!
let countryCode = NSLocale.current.regionCode!
// Create tags with type_value format
let languageTag = "language_" + language
let countryCodeTag = "country_" + countryCode

let body = "{\"aps\": {\"alert\": \"$(message)\"}}"

let template = MSInstallationTemplate()
template.body = body
template.addTags([languageTag, countryCodeTag])

MSNotificationHub.setTemplate(template, forKey: "template1")
```

Objective-C:

```objc
NSString *language = [[[NSBundle mainBundle] preferredLocalizations] objectAtIndex:0];
NSString *countryCode = [[NSLocale currentLocale] countryCode];

// Create tags with type_value format
NSString *languageTag = [NSString stringWithFormat:@"language_%@", language];
NSString *countryCodeTag = [NSString stringWithFormat:@"country_%@", countryCode];

NSString *body = @"{\"aps\": {\"alert\": \"$(message)\"}}";

MSInstallationTemplate *template = [MSInstallationTemplate new];
template.body = body;
[template addTags:@[languageTag, countryCodeTag]];

[MSNotificationHub setTemplate:template forKey:@"template1"];
```

### Push to User Management

The SDK supports the ability to associate a user with an installation. This allows you to target all devices associated with a particular user ID. The user's identity set through the SDK can be whatever the developer wants it to be: the user's name, email address, phone number, or some other unique identifier. This is supported through `MSNotificationHub` and the `setUserId` method.

Swift:

```swift
let userId = "iosUser123"
MSNotificationHub.setUserId(userId)
```

Objective-C:

```objc
NSString *userId = @"iosUser123";
[MSNotificationHub setUserId:userId];
```

To target a particular user on the backend, you can specify a tag such as `$UserId:{VALUE}`, where VALUE is the user name you have specified, just as you can target an installation using the `$InstallationId:{VALUE}` tag.
### Intercepting Installation Management

The SDK handles saving the installation for you; however, we provide hooks where you can intercept both a successful installation save and any failure through the `MSInstallationLifecycleDelegate` protocol. This has two methods: `didSaveInstallation` for successful saves, and `didFailToSaveInstallation` for any failures. We can implement this to have our own logging, for example.

Swift:

```swift
// Set the delegate
MSNotificationHub.setLifecycleDelegate(self)

// Handle success
func notificationHub(_ notificationHub: MSNotificationHub!, didSave installation: MSInstallation!) {
    let installationId = installation.installationId
    NSLog("Successful save with Installation ID: \"\(installationId)\"")
}

// Handle failure
func notificationHub(_ notificationHub: MSNotificationHub!, didFailToSave installation: MSInstallation!, withError error: Error!) {
    NSLog("Failed to save installation")
}
```

Objective-C:

```objc
// Set the delegate
[MSNotificationHub setLifecycleDelegate:self];

// Handle success
- (void)notificationHub:(MSNotificationHub *)notificationHub didSaveInstallation:(MSInstallation *)installation {
    NSLog(@"Successful save with Installation ID: %@", installation.installationId);
}

// Handle failure
- (void)notificationHub:(MSNotificationHub *)notificationHub didFailToSaveInstallation:(MSInstallation *)installation withError:(NSError *)error {
    NSLog(@"Failed to save installation with error %@", [error localizedDescription]);
}
```

### Enriching Installations

The SDK will update the installation on the device any time you change its properties, such as adding a tag or an installation template. Before the installation is sent to the backend, you can intercept it to modify anything, for example if you wish to add or modify tags. This is implemented in the `MSInstallationEnrichmentDelegate` protocol with a single method, `willEnrichInstallation`.
Swift: ```swift // Set the delegate MSNotificationHub.setEnrichmentDelegate(self) // Handle the enrichment func notificationHub(_ notificationHub: MSNotificationHub!, willEnrichInstallation installation: MSInstallation!) { installation.addTag("customTag") } ``` Objective-C: ```objc // Set the delegate [MSNotificationHub setEnrichmentDelegate:self]; // Handle the enrichment - (void)notificationHub:(MSNotificationHub *)notificationHub willEnrichInstallation:(MSInstallation *)installation { // Add another tag [installation addTag:@"customTag"]; } ``` ### Saving Installations to a Custom Backend The Azure Notification Hubs SDK will save the installation to our backend by default. If, however, you wish to skip our backend and store it on your backend, we support that through the `MSInstallationManagementDelegate` protocol. This has a method to save the installation `willUpsertInstallation`, passing in the installation, and then a completion handler is called with either an error if unsuccessful, or nil if successful. To set the delegate, instead of specifying the connection string and hub name, you specify the installation manager with `startWithInstallationManagement` Swift: ```swift // Set the delegate MSNotificationHub.startWithInstallationManagement(self) func notificationHub(_ notificationHub: MSNotificationHub!, willUpsertInstallation installation: MSInstallation!, withCompletionHandler completionHandler: @escaping (NSError?) 
-> Void) {
    // Save to your own backend

    // Call the completion handler with no error if successful
    completionHandler(nil);
}
```

Objective-C:

```objc
// Set the delegate
[MSNotificationHub startWithInstallationManagement:self];

// Save to your own backend
- (void)notificationHub:(MSNotificationHub *)notificationHub willUpsertInstallation:(MSInstallation *)installation completionHandler:(void (^)(NSError *_Nullable))completionHandler {
    // Save to your own backend

    // Call the completion handler with no error if successful
    completionHandler(nil);
}
```

### Disabling Automatic Swizzling

By default, the SDK swizzles methods to automatically intercept calls to `UIApplicationDelegate`/`NSApplicationDelegate` for registering and intercepting push notifications, as well as `UNUserNotificationCenterDelegate` methods. Note this is only available for iOS, watchOS, and Mac Catalyst. This is not supported on macOS and tvOS.

#### Disabling UIApplicationDelegate/NSApplicationDelegate

1. Open the project's Info.plist
2. Add the `NHAppDelegateForwarderEnabled` key and set the value to 0. This disables the `UIApplicationDelegate`/`NSApplicationDelegate` auto-forwarding to `MSNotificationHub`.
3. Implement the `UIApplicationDelegate`/`NSApplicationDelegate` methods for push notifications.

Implement the `application:didRegisterForRemoteNotificationsWithDeviceToken:` callback and the `application:didFailToRegisterForRemoteNotificationsWithError:` callback in your AppDelegate to register for push notifications.

In the code below, if on macOS, not Mac Catalyst, replace `UIApplication` with `NSApplication`.
Swift: ```swift func application(_ application: UIApplication, didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) { // Pass the device token to MSNotificationHub MSNotificationHub.didRegisterForRemoteNotifications(withDeviceToken: deviceToken) } func application(_ application: UIApplication, didFailToRegisterForRemoteNotificationsWithError error: Error) { // Pass the error to MSNotificationHub MSNotificationHub.didFailToRegisterForRemoteNotificationsWithError(error) } ``` Objective-C: ```objc - (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken { // Pass the device token to MSNotificationHub [MSNotificationHub didRegisterForRemoteNotificationsWithDeviceToken:deviceToken]; } - (void)application:(UIApplication *)application didFailToRegisterForRemoteNotificationsWithError:(NSError *)error { // Pass the error to MSNotificationHub [MSNotificationHub didFailToRegisterForRemoteNotificationsWithError:error]; } ``` 4. Implement the callback to receive push notifications Implement the application:didReceiveRemoteNotification:fetchCompletionHandler callback to forward push notifications to MSNotificationHub In the code below, if on macOS, not Mac Catalyst, replace `UIApplication` with `NSApplication`. 
Swift:

```swift
func application(_ application: UIApplication, didReceiveRemoteNotification userInfo: [AnyHashable : Any], fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {

    // Forward to MSNotificationHub
    MSNotificationHub.didReceiveRemoteNotification(userInfo)

    // Complete handling the notification
    completionHandler(.noData)
}
```

Objective-C:

```objc
- (void)application:(UIApplication *)application didReceiveRemoteNotification:(NSDictionary *)userInfo fetchCompletionHandler:(void (^)(UIBackgroundFetchResult))completionHandler {

    // Forward to MSNotificationHub
    [MSNotificationHub didReceiveRemoteNotification:userInfo];

    // Complete handling the notification
    completionHandler(UIBackgroundFetchResultNoData);
}
```

#### Disabling UNUserNotificationCenterDelegate

1. Open the project's Info.plist
2. Add the `NHUserNotificationCenterDelegateForwarderEnabled` key and set the value to 0. This disables the UNUserNotificationCenterDelegate auto-forwarding to `MSNotificationHub`.
3. Implement UNUserNotificationCenterDelegate callbacks and pass the notification's payload to `MSNotificationHub`.

Swift:

```swift
@available(iOS 10.0, *)
func userNotificationCenter(_ center: UNUserNotificationCenter, willPresent notification: UNNotification, withCompletionHandler completionHandler: @escaping (UNNotificationPresentationOptions) -> Void) {
    //...

    // Pass the notification payload to MSNotificationHub
    MSNotificationHub.didReceiveRemoteNotification(notification.request.content.userInfo)

    // Complete handling the notification
    completionHandler([])
}

@available(iOS 10.0, *)
func userNotificationCenter(_ center: UNUserNotificationCenter, didReceive response: UNNotificationResponse, withCompletionHandler completionHandler: @escaping () -> Void) {
    //...
// Pass the notification payload to MSNotificationHub MSNotificationHub.didReceiveRemoteNotification(response.notification.request.content.userInfo) // Complete handling the notification completionHandler() } ``` Objective-C: ```objc - (void)userNotificationCenter:(UNUserNotificationCenter *)center willPresentNotification:(UNNotification *)notification withCompletionHandler:(void (^)(UNNotificationPresentationOptions options))completionHandler API_AVAILABLE(ios(10.0), tvos(10.0), watchos(3.0)) { //... // Pass the notification payload to MSNotificationHub [MSNotificationHub didReceiveRemoteNotification:notification.request.content.userInfo]; // Complete handling the notification completionHandler(UNNotificationPresentationOptionNone); } - (void)userNotificationCenter:(UNUserNotificationCenter *)center didReceiveNotificationResponse:(UNNotificationResponse *)response withCompletionHandler:(void (^)(void))completionHandler API_AVAILABLE(ios(10.0), tvos(10.0), watchos(3.0)) { //... [MSNotificationHub didReceiveRemoteNotification:response.notification.request.content.userInfo]; // Complete handling the notification completionHandler(); } ``` ## Useful Resources * Tutorials and product overview are available at [Microsoft Azure Notification Hubs Developer Center](https://azure.microsoft.com/en-us/documentation/services/notification-hubs). * Our product team actively monitors the [Notification Hubs Developer Forum](http://social.msdn.microsoft.com/Forums/en-US/notificationhubs/) to assist you with any troubles. ## Contributing This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). 
Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA. All Objective-C files follow LLVM coding style (with a few exceptions) and are formatted accordingly. To format your changes, make sure you have the clang-format tool. It can be installed with Homebrew using the command `brew install clang-format`. Once you have installed clang-format, run `./clang-format-changed-files.sh` from the repository root - this will format all files that have changes against the remote `main` branch (it will also perform a git fetch). This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
---

*Source: `ValentijnG/formats` — `docs/extensions/css.md` (MIT)*
# .css

item | info
--- | ---
types | [Markup](../dataTypes/markup.md), [Text (plain)](../dataTypes/textPlain.md)
formats | [CSS](../fileFormats/css.md)
variants | file
info | [`extension/css`]({{fileinfo}}/css)
---

*Source: `s-kawara/powerapps-docs.ja-jp` — `powerapps-docs/developer/model-driven-apps/clientapi/reference/Xrm-WebApi/includes/execute-description.md` (CC-BY-4.0, MIT)*
Executes a single action, function, or CRUD operation.
---

*Source: `its4zahoor/docs` — `docs/resources/podcasts.md` (MIT)*
---
id: podcasts
title: Podcasts
---

import { Image } from '@site/src/components/image'

## Managing Component Architecture

<div class="podcast-card">
  <a href="https://www.podbean.com/pu/pbblog-3m5bz-f0593">
    <Image src="/img/podcasts/modern-web.jpg" alt="Modern Web" padding={20} width="50%" />
  </a>
  <div>
    <p>
      In this episode, Tracy Lee interviews Debbie O'Brien about Bit. We learn how to use Bit to manage component architecture, think in components, and how this can help you build a more scalable, reusable codebase and work across teams.
    </p>
    <p>
      <a href="https://www.podbean.com/pu/pbblog-3m5bz-f0593">Modern Web</a> - March 2021
    </p>
  </div>
</div>

## The Component Marketplace

<div class="podcast-card">
  <a href="https://www.heavybit.com/library/podcasts/jamstack-radio/ep-59-the-component-marketplace-with-alexander-karan-of-climateclever/">
    <Image src="/img/podcasts/jamstack-radio.jpg" alt="Jamstack Radio" padding={20} width="50%" />
  </a>
  <div>
    <p>
      In episode 59 of JAMstack Radio, Brian Douglas talks with Alexander Karan of ClimateClever. They discuss front-end microservices, breaking down developer projects into individual components, and building applications with Bit.
    </p>
    <p>
      <a href="https://www.heavybit.com/library/podcasts/jamstack-radio/ep-59-the-component-marketplace-with-alexander-karan-of-climateclever/">HeavyBit JAMStack Radio</a> - February 2021
    </p>
  </div>
</div>

## From Nuxt to React

<div class="podcast-card">
  <a href="https://devchat.tv/views-on-vue/vue-142-from-nuxt-to-react-catching-up-with-debbie-obrien/">
    <Image src="/img/podcasts/views-on-vue.jpg" alt="Views on Vue" padding={20} width="50%" />
  </a>
  <div>
    <p>
      Lindsay and Steve talk with Debbie about her new position, about what Bit is, and how they are bringing a new approach to component development. We also talk about how Debbie is having to learn React and ways we learn new frameworks and libraries.
    </p>
    <p>
      <a href="https://devchat.tv/views-on-vue/vue-142-from-nuxt-to-react-catching-up-with-debbie-obrien/">Views on Vue</a> - March 2021
    </p>
  </div>
</div>
---

*Source: `aldelmerico/aldelmerico.github.io` — `cv.md` (MIT)*
---
layout: page
title: CV
---

This is where my cv lives
---

*Source: `hariom282538/golang-mongo-pool` — `README.md` (MIT)*
# golang-mongo-pool

This repository was created for learning about database connection pooling. It uses the official Go mongo-driver for MongoDB. For installation, please go to your GOPATH and enter the following command:

```shell
go get github.com/Mohammed-Aadil/golang-mongo-pool
```

The pooling class requires the following functions:

1. `Init(string)`
2. `CreateConnection() (*mongo.Client, error)`
3. `GetDatabase() (*mongo.Database, error)`
4. `GetBusyConnectionsCount() int`
5. `GetMaxConnections() int`
6. `GetMinConnections() int`
7. `GetOpenConnections() []*mongo.Client`
8. `GetOpenConnectionsCount() int`
9. `GetPoolName() string`
10. `GetTimeOut() time.Duration`
11. `SetErrorOnBusy()`
12. `SetPoolSize(uint32, uint32)`
13. `TerminateConnection(*mongo.Client)`

## Init(string)
This function initializes important MongoDB configuration, e.g. the connection client.

## CreateConnection() (*mongo.Client, error)
This function acts as the pool's getObject method. It returns a new connection when one is required on calling.

## GetDatabase() (*mongo.Database, error)
This function is called to initialize the database and return a db instance from the mongo client.

## GetBusyConnectionsCount() int
This function returns the number of busy connections.

## GetMaxConnections() int
This function returns the maximum number of connections allowed in the pool.

## GetMinConnections() int
This function returns the minimum number of connections allowed in the pool.

## GetOpenConnections() []*mongo.Client
This function returns all the open connection instances in the pool.

## GetOpenConnectionsCount() int
This function returns the number of open connections in the pool.

## GetPoolName() string
This function returns the pool name. The default is `root`.

## GetTimeOut() time.Duration
This function returns the timeout duration for connections in the pool.

## SetErrorOnBusy()
If this function is called, `CreateConnection()` will return a `MongoPoolError` when all connections are busy.

## SetPoolSize(uint32, uint32)
This will resize the mongo pool without affecting currently open connections. **This function still needs to be implemented. All PRs for this are welcome.**

## TerminateConnection(*mongo.Client)
This will terminate the connection and release the corresponding object from the pool.
---

*Source: `yorkhuang-au/yorkhuang-au.github.io` — `_posts/2015-03-23-Affinity.md` (MIT)*
---
layout: post
title: Implementations of Affinity in R, Hadoop Mapreduce and Spark
comments: true
---

Affinity analysis is a technique that discovers co-occurrence relationships among activities performed by specific individuals or groups. In retail it is used to perform basket analysis. Basket analysis may tell a retailer that customers often purchase shampoo and conditioner together. [1]

This blog tries to implement this analysis using several popular techniques: R, Scala & Scalding in Apache Hadoop, and Scala & Apache Spark. The idea can also be implemented with other techniques such as Java Mapreduce, Python, etc.

Let's look at the problem below.

## Problem

The sales data are recorded in a csv file. There are three fields: orderid, product and quantity. We will find out the quantity that every group of products is purchased together (in the same order). A small sample of the data is listed.

```
sales.csv
orderid, product, quantity -- Header not included
2,orange,5
2,grape,5
1,orange,6
1,peach,7
2,apple,3
2,peach,4
3,apple,3
3,peach,5
2,orange,6
1,grape,4
```

## Algorithms

We are going to use the Split-Apply-Combine method to solve the problem. Split-Apply-Combine is a common data manipulation pattern which has three phases:

* Splitting data by the value of one or more variables
* Applying a function to each chunk of data independently
* Combining the data back into one piece

For our task:

* Split the sales data by orderid. Create a list of orderid -> (product, quantity)
* Calculate all the product combinations and their quantities in every order. Create a (product group, quantity) list.
* Combine the (product group, quantity) lists across all the orders and sum up by grouping on the product groups. This gives the final (product group, quantity) list.

## R Implementation

In R, there are many ways and packages that can do split-apply-combine. Here, we use the split, lapply and aggregate functions. The comments in the code clearly explain the algorithm.
```r
## Affinity.R
## Calculate affinity from a sales file for n-product group
##
affinityR <- function(fileName,n=2) {
  ## This function creates all the n-product combinations in an order.
  ## For every n-product group, sort the product in order, calculate
  ## the quantity.
  ##
  combinations <- function(df,n) {
    ## Only calculate for orders having n and more products
    if( nrow(df) >=n ) {
      ## Create all the possible n-product group
      t1 <- combn(split(df[,2:3], df["product"]),n)
      t2 <- do.call(rbind.data.frame,lapply(1:ncol(t1),
        function(x) {
          p <- sapply(1:n, function(y) t1[[y,x]][["product"]] )
          c( sort(p), list( prod( sapply(1:n, function(z) t1[[z,x]] [["quantity"]]) )))
        } ) )
      names(t2) <- c(sapply(1:n, function(x) paste("product", x, sep="")), "quantity")
      t2
    }
    else
      list()
  }

  ## Read data
  data <- read.csv(fileName, colClasses=c("integer", "character", "integer"), header=FALSE)
  names(data) <- c("orderid","product","quantity")

  ## Group by orderid and quantity to ensure one product only occurs
  ## once in a single orderid.
  ## This step can be bypassed if the original data has satisfied
  ## this requirement.
  d1 <- aggregate(data$quantity, by=list(data$orderid, data$product), FUN=sum)
  names(d1) <- c("orderid","product","quantity")

  ## Split by orderid and then apply the combinations function on
  ## each group
  d2 <- do.call(rbind.data.frame, lapply(split(d1, d1["orderid"]), combinations, n=n))

  ## Sum up all groups
  d3 <- aggregate( d2$quantity, by=d2[,sapply(1:n, function(x) paste("product", x, sep=""))], FUN=sum)
  names(d3)[ length(names(d3)) ] <- "quantity"

  ## Output
  d3[ with(d3, order(-quantity)), ]
}
```

In the code, the R function split() is used to split the data. Then lapply() applies the combinations() UDF to create all product groups in an order. At the end, the R function aggregate() is called to combine the results.
Some results:

```
> source('C:/Users/yhuang/git/blog/affinity/AffinityR/affinity.R')
> affinityR("C:/Users/yhuang/git/blog/affinity/data/sales.csv", 2)
  product1 product2 quantity
4   orange    peach       86
1    grape   orange       79
3    grape    peach       48
2    apple   orange       33
5    apple    peach       27
6    apple    grape       15
> affinityR("C:/Users/yhuang/git/blog/affinity/data/sales.csv", 3)
  product1 product2 product3 quantity
1    grape   orange    peach      388
4    apple    grape   orange      165
2    apple   orange    peach      132
3    apple    grape    peach       60
>
```

## Hadoop Mapreduce Implementation

When the data size is huge, we may think of a Big Data solution, such as Apache Hadoop. Hadoop uses the Mapreduce framework for distributed computing. A Hadoop Mapreduce job contains a map phase and a reduce phase. There are many ways to create a Mapreduce program. Here Scala and Scalding are used, as I think this option shows the beauty of functional programming and makes the solution more compact than native Java Mapreduce. The source code is below.

```scala
//
// AffinityHadoop.scala
//
import com.twitter.scalding.{Job, Args, TextLine, Csv}
import cascading.pipe.Pipe

class AffinityHadoop(args : Args) extends Job(args) {
  val logSchema = List('orderid, 'product, 'quantity)
  val input = Csv("/home/yhuang/blog/affinity/data/sales.csv", ",", logSchema).read
  val n = args("n").toInt
  val output = affinityHadoop(input, n)
  output.write( Csv("/home/yhuang/blog/affinity/data/affinity_hadoop.csv"))

  def affinityHadoop(pipe:Pipe, n:Int) = {
    pipe
      // Sum quantity by orderid and product
      .groupBy(('orderid, 'product))( group => group.sum[Int]('quantity -> 'quantity) )
      // Group product and quantity by order
      .groupBy('orderid) { group => group.toList[(String, Int)]( ('product, 'quantity)-> 'prods ) }
      // Calculate all combinations in every groups
      // Calculate quantity for every combinations
      .flatMapTo('prods -> ('pg, 'quantity) ){ prods: List[(String,Int)] =>
        prods.combinations(n)
          .map{ c => c.sortBy(_._1)
            .foldLeft(("",1)) ( (prev: (String,Int), cur :(String, Int)) =>
              (prev._1 + "|" + cur._1, prev._2*cur._2 ) )
          }
      }
      // Sum quantity by product group
      .groupBy('pg)(_.sum[Int]('quantity -> 'quantity))
      .groupAll { _.sortBy('quantity).reverse}
  }
}
```

In the Scala code above, groupBy() and map() are used to implement the Split-Apply-Combine algorithm. There is no clear boundary between the phases. When the data file is read, a Scalding Pipe object is created, which contains the data in records. When groupBy() is used, the data is split into groups. Then the group functions can be used to implement the Apply stage or the Combine stage.

The results:

```
// n=2
|orange|peach,86
|grape|orange,79
|grape|peach,48
|apple|orange,33
|apple|peach,27
|apple|grape,15

// n=3
|grape|orange|peach,388
|apple|grape|orange,165
|apple|orange|peach,132
|apple|grape|peach,60
```

## Spark Implementation

Another distributed computing option is Apache Spark. Spark can cache intermediate results in memory and so can achieve very fast speeds. Scala is used in this example. The code is below.
```scala
//
// AffinitySpark.scala
//
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.rdd._

object AffinitySpark {
  def main(args: Array[String]) {
    val inputFile = args(0)
    val n = args(1).trim().toInt
    val conf = new SparkConf().setAppName("Affinity").setMaster("local")
    val sc = new SparkContext(conf)
    val input = sc.textFile(inputFile)
    val output = affinitySpark(input, n)
    output.foreach(println)
  }

  def affinitySpark(d1 : RDD[String], n:Int) = {
    // Split csv into fields, key is order id no & product id
    val d2 = d1.map( line => {
      val t = line.split(",")
      ((t(0).trim.toInt, t(1).trim), t(2).trim.toInt)
    })

    // Combine same product in same order
    val d3 = d2.reduceByKey( (a,b) => a+b)

    // Convert the key to order id
    val d4 = d3.map( x => (x._1._1, (x._1._2, x._2)))

    // Create all combinations in every orders
    val d5 = d4.groupByKey()
    val d6 = d5.map( x => x._2.toList.combinations(n) )

    // In every order, create the product set as key and quantity
    val d7 = d6.flatMap(y => y.map( c => {
      c.sortBy(_._1)
        .foldLeft(("",1)) ( (prev: (String,Int), cur :(String, Int)) =>
          (prev._1 + "|" + cur._1, prev._2*cur._2 ) )
    } ))

    // Sum up by every product sets
    val d8 = d7.reduceByKey((a,b) => a+b).collect().sortBy(_._2).reverse
    d8
  }
}
```

The Spark APIs look similar to the Scalding APIs. The comments and code are pretty self-contained.

Some results:

```
// n=2
(|orange|peach,86)
(|grape|orange,79)
(|grape|peach,48)
(|apple|orange,33)
(|apple|peach,27)
(|apple|grape,15)

// n=3
(|grape|orange|peach,388)
(|apple|grape|orange,165)
(|apple|orange|peach,132)
(|apple|grape|peach,60)
```

## Discussion

I had tried to use MDX and DAX in SQL Server Analysis Services to implement basket affinity in a project. It was not a pleasant experience anyway :). The Split-Apply-Combine methodology makes the problem quite straightforward.

Quite often, we use R for ad-hoc analysis and Hadoop for huge amounts of data and batch processing. Spark, known for its speed, is a good option for real-time / high-speed environments. But this is out of the scope of this post.

Three options are provided to try to adopt the same method in different environments. Please note that the implementations in this post may not be fully efficient; for example, in the Spark code, no caching is used. Also, I have not had a chance to test and compare with a big dataset on a cluster. If anyone can have a try, please kindly provide your feedback.

All the code in this post can be found in my [github](https://github.com/yorkhuang-au/blog/tree/master/affinity).

## References:

[1] [Wikipedia page](http://en.wikipedia.org/wiki/Affinity_analysis)

{% include twitter_plug.html %}
---

*Source: `glowreeyah/ruby_tic_tac_toe` — `README.md` (MIT)*
# ruby_tic_tac_toe [![Contributors][contributors-shield]][contributors-url] [![Forks][forks-shield]][forks-url] [![Stargazers][stars-shield]][stars-url] [![Issues][issues-shield]][issues-url] > This game is a virtual representation of the classic Tic-Tac-Toe game. > All public class methods are tested with RSpec. ![Tic tac toe game](https://lemmoscripts.com/wp/wp-content/uploads/2018/09/tic-tac-toe-capture-2.gif) <!-- *** Thanks for checking out this README Template. If you have a suggestion that would *** make this better, please fork the repo and create a pull request or simply open *** an issue with the tag "enhancement". *** Thanks again! Now go create something AMAZING! :D --> <!-- PROJECT SHIELDS --> <!-- *** I'm using markdown "reference style" links for readability. *** Reference links are enclosed in brackets [ ] instead of parentheses ( ). *** See the bottom of this document for the declaration of the reference variables *** for contributors-url, forks-url, etc. This is an optional, concise syntax you may use. *** https://www.markdownguide.org/basic-syntax/#reference-style-links --> <!-- PROJECT LOGO --> <br /> <p align="center"> <a href="https://github.com/praz99/ruby_tic_tac_toe"> </a> <br /> <a href="https://github.com/praz99/ruby_tic_tac_toe"><strong>Explore the docs »</strong></a> <br /> <br /> <a href="https://github.com/praz99/ruby_tic_tac_toe/issues">Report Bug</a> · <a href="https://github.com/praz99/ruby_tic_tac_toe/issues">Request Feature</a> </p> <!-- TABLE OF CONTENTS --> <!-- ABOUT THE PROJECT --> ## Rules For Tic-Tac-Toe 1. The game is played on a grid that is 3 by 3 squares. 2. It is a game of two players that use X and O to mark moves in empty squares. 3. The first player who succeeds in placing three of their marks in a horizontal, vertical, or diagonal row is the winner. 4. The game is over when all 9 squares are full. If no player has 3 marks in a row, the game ends in a tie. 5. 
Below is a virtual implementation where marks can be placed by selecting from numbers 1 to 9 when prompted.

| (1) | (2) | (3) |
-------------------
| (4) | (5) | (6) |
-------------------
| (7) | (8) | (9) |
-------------------

6. A winning position would look like the image below at the end. As soon as someone gets three of their marks in a row, the game ends.

| O | X | O |
-------------------
| X | X | X | --- *Player 'X' wins with this row!*
-------------------
| O | O | O |
-------------------

For a detailed explanation, please have a look at this [guide](https://www.wikihow.com/Play-Tic-Tac-Toe).

## Getting Started

### Prerequisites

Ruby installed on local machine
Text editor (preferably: VSCode, Atom, Sublime)

### Starting the game

1. If you have installed `Ruby` on your machine: Clone the project into your local machine using the `git clone` command or download the zip file. Go into the project directory using the `cd directory name` command. Open your terminal and type the `bin/main.rb` command. Enter the players' names and select the letters the players play with. On the displayed board, choose the cells by numbers between 1 and 9 to mark your selected letter ('X' or 'O').
2. If you have not installed `Ruby`, please install `Ruby` and repeat step 1.
3. Run ```rspec``` in the terminal of your root directory to run the described tests, or run ```rspec --format doc``` to see a more detailed description of the written tests.

## Installation

Contributions, issues and feature requests are welcome! Start by:

* Forking the project
* Cloning the project to your local machine
* `cd` into the project directory
* Run `git checkout -b your-branch-name`
* Make your contributions
* Push your branch up to your forked repository
* Open a Pull Request with a detailed description to the development branch of the original project for a review

### Built With

This project was built using these technologies.
* Ruby * RSpec * Visual Studio Code <!-- CONTACT --> ## Contributors 👤 **Prajwal Thapa** - LinkedIn: [Prazwalthapa](www.linkedin.com/in/prazwal-thapa/) - GitHub: [@praz99](https://github.com/praz99) - E-mail: t.prazwal@gmail.com 👤 **Glory David** - LinkedIn: [Glorydavid](https://www.linkedin/in/glory-david/) - GitHub: [@glowreeyah](https://github.com/glowreeyah) - E-mail: glodave99@gmail.com - Twitter: [@gloweeeyah](https://twitter.com/gloweeeyah) 👤 **Marylene Sawyer** - Github: [@Bluette1](https://github.com/Bluette1) - Twitter: [@MaryleneSawyer](https://twitter.com/MaryleneSawyer) - Linkedin: [Marylene Sawyer](https://www.linkedin.com/in/marylene-sawyer-b4ba1295/) ## Show your support Give a ⭐️ if you like this project! <!-- ACKNOWLEDGEMENTS --> ## Acknowledgements * The Odin Project * [Microverse](https://www.microverse.org/) <!-- MARKDOWN LINKS & IMAGES --> <!-- https://www.markdownguide.org/basic-syntax/#reference-style-links --> [contributors-shield]: https://img.shields.io/github/contributors/praz99/ruby_tic_tac_toe.svg?style=flat-square [contributors-url]: https://github.com/praz99/ruby_tic_tac_toe/graphs/contributors [forks-shield]: https://img.shields.io/github/forks/praz99/ruby_tic_tac_toe.svg?style=flat-square [forks-url]: https://github.com/praz99/ruby_tic_tac_toe/network/members [stars-shield]: https://img.shields.io/github/stars/praz99/ruby_tic_tac_toe.svg?style=flat-square [stars-url]: https://github.com/praz99/ruby_tic_tac_toe/stargazers [issues-shield]: https://img.shields.io/github/issues/praz99/ruby_tic_tac_toe.svg?style=flat-square [issues-url]: https://github.com/praz99/ruby_tic_tac_toe/issues
39.342466
174
0.68663
eng_Latn
0.890671
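The 1-9 cell numbering and the three-in-a-row win condition described in the tic-tac-toe README above can be sketched as follows. This is a minimal illustration in Python (the original project is written in Ruby); the function and variable names are my own, not from the project:

```python
# Map the 1-9 cell numbering onto the board and check the
# three-in-a-row win condition from the README above.

WIN_LINES = [
    (1, 2, 3), (4, 5, 6), (7, 8, 9),   # rows
    (1, 4, 7), (2, 5, 8), (3, 6, 9),   # columns
    (1, 5, 9), (3, 5, 7),              # diagonals
]

def winner(board):
    """board maps a cell number (1-9) to 'X', 'O', or nothing.

    Returns the winning mark, or None if no line is complete.
    """
    for line in WIN_LINES:
        marks = {board.get(cell) for cell in line}
        if len(marks) == 1 and None not in marks:
            return marks.pop()
    return None
```

With the winning position shown in the README (`X` holding the middle row), `winner` returns `'X'`.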
e188adb23f23102cf41cc3cc2fb99ebd619a31d2
134
md
Markdown
_includes/04-lists.md
zeinabzamani/markdown-portfolio
2b1f02aa9edcee6a1fe159d6b96b52058b2ffdf4
[ "MIT" ]
null
null
null
_includes/04-lists.md
zeinabzamani/markdown-portfolio
2b1f02aa9edcee6a1fe159d6b96b52058b2ffdf4
[ "MIT" ]
5
2021-02-08T21:38:50.000Z
2021-02-08T22:42:00.000Z
_includes/04-lists.md
zeinabzamani/markdown-portfolio
2b1f02aa9edcee6a1fe159d6b96b52058b2ffdf4
[ "MIT" ]
null
null
null
### Here is a list of my favorite things * Outdoors * Hiking * Rock climbing * Making crafts * Traveling to historical places
19.142857
40
0.701493
eng_Latn
0.995541
e189099ebb78ca6190b124bb903cbaf2f1c5ef4d
560
md
Markdown
vector/back.md
aditya041997/30-seconds-of-cpp
df6e750de7b26bc401ea75860be1e6ec1df34454
[ "MIT" ]
1,210
2019-03-22T18:53:59.000Z
2022-03-31T07:47:34.000Z
vector/back.md
aditya041997/30-seconds-of-cpp
df6e750de7b26bc401ea75860be1e6ec1df34454
[ "MIT" ]
217
2019-04-05T09:07:26.000Z
2022-01-29T22:42:37.000Z
vector/back.md
aditya041997/30-seconds-of-cpp
df6e750de7b26bc401ea75860be1e6ec1df34454
[ "MIT" ]
657
2019-03-17T15:04:47.000Z
2022-03-31T05:05:40.000Z
# back **Description :** Returns a reference to the last element in the vector. **Example** : ```cpp #include <iostream> #include <vector> // Initialize int vector std::vector<int> myVector = {1, 2, 3, 4, 5}; // Save reference to the last element int& lastEle = myVector.back(); // Last element is 5 std::cout << myVector.back() << " "; // Change the last element by changing the reference variable lastEle lastEle = 10; // Last element is 10 std::cout << myVector.back() << " "; ``` **[Run Code](https://rextester.com/RQEY20725)**
20.740741
73
0.589286
eng_Latn
0.840531
e189e91f42d0128439d35121de380ae84e29bc22
2,120
md
Markdown
request-data/README.md
salingers/java-ee-8-mvc-master
8ae2e06b0bc768130990c490b78ec57b61683551
[ "Apache-2.0" ]
24
2015-09-03T10:53:46.000Z
2018-10-03T00:48:30.000Z
request-data/README.md
salingers/java-ee-8-mvc-master
8ae2e06b0bc768130990c490b78ec57b61683551
[ "Apache-2.0" ]
1
2016-03-09T17:50:41.000Z
2016-03-09T17:50:41.000Z
request-data/README.md
salingers/java-ee-8-mvc-master
8ae2e06b0bc768130990c490b78ec57b61683551
[ "Apache-2.0" ]
20
2015-09-09T15:13:04.000Z
2020-02-17T14:46:23.000Z
Java EE MVC 1.0 - Working with request data ============= This example project shows various ways of accessing request data in MVC Controllers. It shows: #### How to work with query parameters using `@QueryParam` Explained in: [Java EE MVC: Working with Query Parameters][1]. Related classes: * [QueryParamsController.java][2] * [QueryParamsFieldController.java][3] #### How to work with path parameters using `@Path` and `@PathParam` Explained in: [Java EE 8 MVC: Working with Path Parameters][5]. Related classes: * [PathParamsController.java][4] #### How to work with form parameters using `@FormParam` Explained in: [Java EE 8 MVC: Working with form parameters][6] Related classes: * [FormParamsController.java][7] * [FormParamsFieldController.java][8] #### How to work with bean parameters using `@BeanParam` Explained in: [Java EE 8 MVC: Working with bean parameters][9] Related classes: * [BeanParamsController.java][10] [1]: http://www.mscharhag.com/java-ee-mvc/query-parameters [2]: https://github.com/mscharhag/java-ee-8-mvc/blob/master/request-data/src/main/java/com/mscharhag/javaee8/mvc/requestparams/QueryParamsController.java [3]: https://github.com/mscharhag/java-ee-8-mvc/blob/master/request-data/src/main/java/com/mscharhag/javaee8/mvc/requestparams/QueryParamsFieldController.java [4]: https://github.com/mscharhag/java-ee-8-mvc/blob/master/request-data/src/main/java/com/mscharhag/javaee8/mvc/requestparams/PathParamsController.java [5]: http://www.mscharhag.com/java-ee-mvc/path-parameters [6]: http://www.mscharhag.com/java-ee-mvc/form-parameters [7]: https://github.com/mscharhag/java-ee-8-mvc/blob/master/request-data/src/main/java/com/mscharhag/javaee8/mvc/requestparams/FormParamsController.java [8]: https://github.com/mscharhag/java-ee-8-mvc/blob/master/request-data/src/main/java/com/mscharhag/javaee8/mvc/requestparams/FormParamsFieldController.java [9]: http://www.mscharhag.com/java-ee-mvc/bean-parameters [10]: 
https://github.com/mscharhag/java-ee-8-mvc/blob/master/request-data/src/main/java/com/mscharhag/javaee8/mvc/requestparams/bean/BeanParamsController.java
37.857143
158
0.769811
eng_Latn
0.180555
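The Java EE MVC record above lists controllers that use `@QueryParam` to bind request data. As a language-neutral sketch of what that annotation does at runtime — extract named parameters from a URL's query string — here is a minimal Python illustration (this is not JAX-RS; the URL and parameter names are made up for the example):

```python
from urllib.parse import urlparse, parse_qs

def query_params(url):
    """Return the query parameters of a URL as a dict of value lists,
    roughly the data a JAX-RS runtime hands to @QueryParam bindings."""
    return parse_qs(urlparse(url).query)

params = query_params("http://example.com/search?name=john&page=2")
# params["name"] == ["john"], params["page"] == ["2"]
```

Each value is a list because a query string may repeat the same parameter name.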
e18a5ff950a4023a159d735a168d509e5d8d764c
1,610
md
Markdown
README.md
aguisset/Magic-8-ball
f906d7b8a9adab1f9deb90ed2dd7ccbb4643c0d3
[ "MIT" ]
null
null
null
README.md
aguisset/Magic-8-ball
f906d7b8a9adab1f9deb90ed2dd7ccbb4643c0d3
[ "MIT" ]
null
null
null
README.md
aguisset/Magic-8-ball
f906d7b8a9adab1f9deb90ed2dd7ccbb4643c0d3
[ "MIT" ]
null
null
null
# Magic 8 Ball Companion iOS project from Angela Bauer's tutorial. The goal is to create an app with a Magic 8 Ball which will make all the difficult decisions for you. ## Getting started These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system. ### Prerequisites Before you begin, you should already have Xcode downloaded and set up correctly. Follow this guide to learn how: [Setting up Xcode](https://developer.apple.com/xcode/) ### Installing 1. Download the Magic-8-ball project source. You can do this either by forking and cloning the repository (recommended if you plan on pushing changes) or by downloading it as a ZIP file and extracting it. OR ```$ git clone https://github.com/aguisset/Magic-8-ball.git``` 2. Navigate to the unzipped folder and open the project's `.xcodeproj` file from the folder. 3. Build the project (⌘+B) and check for any errors. 4. Run the app (⌘+R) and test it. Here is an example of the end result: ![app screen shot](https://github.com/aguisset/Magic-8-ball/blob/master/Documentation/screen-1.png) ## Built With * [Xcode](https://developer.apple.com/xcode/) - The IDE used to build iOS applications. ## Versioning I use [Git](https://git-scm.com/) for versioning on my machine ## Authors * **Abdoul Guisset** ### Contributors * **Angela Bauer**: Starter companion code. ## License This project is licensed under the MIT License - see the [LICENSE.md](LICENSE.md) file for details
32.2
206
0.748447
eng_Latn
0.992212
e18b9adad4e6e60c838d3924db4c2f16aef86f20
2,126
md
Markdown
docs/update_salamander.md
jlewi/course-v3
253e26759e2145fdc2848c48087120d19f670485
[ "Apache-2.0" ]
10
2019-10-21T08:04:17.000Z
2020-09-20T14:28:56.000Z
docs/update_salamander.md
jlewi/course-v3
253e26759e2145fdc2848c48087120d19f670485
[ "Apache-2.0" ]
5
2021-09-27T22:01:25.000Z
2022-02-27T10:26:46.000Z
docs/update_salamander.md
jlewi/course-v3
253e26759e2145fdc2848c48087120d19f670485
[ "Apache-2.0" ]
2
2020-02-19T12:44:58.000Z
2020-12-14T16:40:27.000Z
--- title: Returning to Salamander keywords: sidebar: home_sidebar --- To return to your notebook, the basic steps will be: 1. Start your instance 1. Update the course repo 1. Update the fastai library 1. When done, shut down your instance ## Step by step guide ### Start your instance Sign in to [salamander](https://salamander.ai/) and choose the instance you want to start, then click on the button 'Start Server'. <img alt="" src="/images/salamander/start.png" class="screenshot"> Wait about a minute for the server to start. You'll see the status go through several stages (written in orange) until it's ready like this: <img alt="" src="/images/salamander/ready.png" class="screenshot"> You can then either ssh in by copying the command in the second box, or click on the 'Jupyter Notebook' button to head directly to your notebooks. ### Update the course repo To update the course repo, you will need to be in a terminal. If you used the `ssh` method, you're already there; if you clicked on the 'Jupyter Notebook' button, launch a new terminal from the jupyter notebook menu. <img alt="" src="/images/gradient/terminal.png" class="screenshot"> This will open a new window, in which you should run these two commands: ```bash cd course-v3 git pull ``` <img alt="" src="/images/gradient/update.png" class="screenshot"> This should give you the latest version of the course notebooks. If you modified some of the notebooks in course-v3/nbs directly, GitHub will probably throw you an error. You should type `git stash` to remove your local changes. Remember you should always work on a copy of the lesson notebooks. ### Update the fastai library To update the fastai library, open the terminal like before and type ```bash source activate fastai conda install -c fastai fastai ``` ### Stop your instance Once you're finished, go back to the [salamander page](https://salamander.ai/) and click on the 'Stop Server' button next to your instance. 
<img alt="" src="/images/salamander/stop.png" class="screenshot"> **It's not enough to just close your browser or turn off your own computer.**
34.852459
287
0.751176
eng_Latn
0.994186
e18c17cd22110cd503813448fcbcce63dcfa345b
889
md
Markdown
v-ref/freetype_freetype.md
shanghuo/v-ref
40dae3836cec710b76378efbdbea232210e8b29e
[ "MIT" ]
1
2019-07-14T11:41:53.000Z
2019-07-14T11:41:53.000Z
v-ref/freetype_freetype.md
shanghuo/v-ref
40dae3836cec710b76378efbdbea232210e8b29e
[ "MIT" ]
1
2019-07-21T13:38:33.000Z
2019-07-22T03:28:09.000Z
v-ref/freetype_freetype.md
shanghuo/v-ref
40dae3836cec710b76378efbdbea232210e8b29e
[ "MIT" ]
null
null
null
--- permalink: /v-ref/TODO.html title: "V Language - TODO" description: "V Language - TODO - This is an emerging language; although it is sometimes unstable and many features are still being implemented, you have to believe in the power of the open-source community! It is here, and it is changing things! —— V lang" lang: "zh_CN" --- # V Language - TODO [Home](/) [Documentation](/docs.html) [Manual Contents](/menu/v.html) ## new_context() TODO ### Syntax ``` pub fn new_context(cfg gg.Cfg) *Context { ``` Parameter|Description ---|--- cfg|TODO **Return value** TODO **Note** TODO ### Example ``` TODO ``` The above code outputs ``` TODO ``` ## draw_text() TODO ### Syntax ``` pub fn (ctx mut Context) draw_text(_x, _y int, text string, cfg gx.TextCfg) { ``` Parameter|Description ---|--- |TODO _x|TODO _y|TODO text|TODO cfg|TODO **Note** TODO ### Example ``` TODO ``` The above code outputs ``` TODO ``` ## draw_text_def() TODO ### Syntax ``` pub fn (ctx mut Context) draw_text_def(x, y int, text string) { ``` Parameter|Description ---|--- |TODO x|TODO y|TODO text|TODO **Note** TODO ### Example ``` TODO ``` The above code outputs ``` TODO ``` <script src="/script.js"></script>
7.867257
96
0.573678
yue_Hant
0.398497
e18ebfcc2a4584774cb3ac8b5d760e2155775cc6
1,761
md
Markdown
README.md
SammyEnigma/flat-gui
9ebabb9dbc2aa4c98fc077a3b2893dd1f42572d2
[ "MIT" ]
58
2018-02-05T07:49:20.000Z
2022-02-02T14:47:45.000Z
README.md
SammyEnigma/flat-gui
9ebabb9dbc2aa4c98fc077a3b2893dd1f42572d2
[ "MIT" ]
3
2019-03-19T21:24:03.000Z
2020-12-26T15:14:09.000Z
README.md
SammyEnigma/flat-gui
9ebabb9dbc2aa4c98fc077a3b2893dd1f42572d2
[ "MIT" ]
13
2018-08-10T14:42:00.000Z
2022-02-08T20:26:12.000Z
![flatgui cover](https://i.stack.imgur.com/uAn8u.png?raw=true "FlatGUI Cover") # FlatGUI _A Qt C++ Library_ **Note:** I am currently reworking this repository. Some links might not be operational yet or might not have their complete or final content. Please be patient! ## What is FlatGUI? FlatGUI is an open-source library created to support the development of non-native looking graphical user interfaces for [Qt Widgets](https://doc.qt.io/qt-5/qtwidgets-index.html)-based desktop applications. It serves as an extension to Qt's functionality, providing a component development kit (CDK) for the creation of custom widgets, as well as a collection of ready-to-use GUI components. ## Who might find FlatGUI useful? If you are a die-hard fan of the [Qt Widgets](https://doc.qt.io/qt-5/qtwidgets-index.html) technology and QML is not your preferred language, though you still would like to create desktop applications with non-native looking graphical user interfaces, then FlatGUI is made just for you. ## How to use FlatGUI? To use FlatGUI you need the [Qt Library](https://www.qt.io/) and a C++ compiler for your platform. - Get started [here](https://www.scopchanov.de/projects/flatgui/get-started/) - Read the [API documentation](https://doc.scopchanov.de/flatgui/flatgui-module.html) - Browse the [examples](https://doc.scopchanov.de/flatgui/examples.html) ## How to contribute? Check out the [contribution guide](CONTRIBUTING.md). ## Disclaimer The code in this repository is **not** an official release by the Qt company or any other organization. It is written solely by me, Michael Scopchanov, in my spare time, based on my personal programming experience, as well as on my understanding of the intended way of using the Qt library.
55.03125
396
0.770017
eng_Latn
0.989909
e18f9d4bc56f945736efa53f14995eb8f5e42169
1,922
md
Markdown
AlchemyInsights/consistencyguidsourceanchor-behavior.md
isabella232/OfficeDocs-AlchemyInsights-pr.da-DK
a907697f48db2dc57c19d7e003d92831c111566e
[ "CC-BY-4.0", "MIT" ]
2
2020-05-19T19:06:02.000Z
2020-09-17T11:26:05.000Z
AlchemyInsights/consistencyguidsourceanchor-behavior.md
isabella232/OfficeDocs-AlchemyInsights-pr.da-DK
a907697f48db2dc57c19d7e003d92831c111566e
[ "CC-BY-4.0", "MIT" ]
2
2022-02-09T06:59:12.000Z
2022-02-09T06:59:36.000Z
AlchemyInsights/consistencyguidsourceanchor-behavior.md
isabella232/OfficeDocs-AlchemyInsights-pr.da-DK
a907697f48db2dc57c19d7e003d92831c111566e
[ "CC-BY-4.0", "MIT" ]
2
2019-10-11T18:36:50.000Z
2021-10-09T10:49:57.000Z
--- title: ConsistencyGuid/sourceAnchor behavior ms.author: pebaum author: pebaum manager: scotv ms.date: 04/21/2020 ms.audience: Admin ms.topic: article ms.service: o365-administration ROBOTS: NOINDEX, NOFOLLOW localization_priority: Normal ms.collection: Adm_O365 ms.custom: '' ms.assetid: 6a44f797-acc7-4cbe-aa5a-47e2581fabf5 ms.openlocfilehash: 9b5765ff3c59b1312bead41a45a53478a96260df0567f006ab93c3ccfaf4be64 ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175 ms.translationtype: MT ms.contentlocale: da-DK ms.lasthandoff: 08/05/2021 ms.locfileid: "54044334" --- # <a name="consistencyguid--sourceanchor-behavior"></a>ConsistencyGuid/sourceAnchor behavior Azure AD Connect (version 1.1.524.0 and later) now facilitates the use of msDS-ConsistencyGuid as the sourceAnchor attribute. When using this feature, Azure AD Connect automatically configures the synchronization rules to: - Use msDS-ConsistencyGuid as the sourceAnchor attribute for User objects. ObjectGUID is used for other object types. - For a given on-premises AD User object whose msDS-ConsistencyGuid attribute is not populated, Azure AD Connect writes its objectGUID value back to the msDS-ConsistencyGuid attribute in on-premises Active Directory. Once the msDS-ConsistencyGuid attribute is populated, Azure AD Connect exports the object to Azure AD. **Note:** Once an on-premises AD object has been imported into Azure AD Connect (that is, imported into the AD connector space and projected into the metaverse), you can no longer change its sourceAnchor value. To specify the sourceAnchor value for a given on-premises AD object, configure its msDS-ConsistencyGuid attribute before it is imported into Azure AD Connect. For more information about sourceAnchor and ConsistencyGuid, see: [Azure AD Connect: Design concepts](https://docs.microsoft.com/azure/active-directory/connect/active-directory-aadconnect-design-concepts)
54.914286
363
0.819979
dan_Latn
0.913495
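The write-back rule described in the Azure AD Connect record above — if a user's msDS-ConsistencyGuid is empty, copy the objectGUID into it, and thereafter treat it as the immutable sourceAnchor — can be sketched as follows. This is a hypothetical Python illustration of the described logic only, not the real Azure AD Connect API; the dictionary-based user representation is an assumption made for the example:

```python
def ensure_source_anchor(user):
    """Sketch of the documented behavior: populate msDS-ConsistencyGuid
    from objectGUID when it is not already set, then use it as the
    sourceAnchor. Once set, the value is never changed."""
    if not user.get("msDS-ConsistencyGuid"):
        # Write-back step: copy objectGUID into the empty attribute.
        user["msDS-ConsistencyGuid"] = user["objectGUID"]
    # An already-populated value is left untouched (immutable anchor).
    return user["msDS-ConsistencyGuid"]
```

A user imported with a pre-populated msDS-ConsistencyGuid keeps that value even if its objectGUID differs, matching the note that the sourceAnchor cannot change after import.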