---
title: "PHOIBLE frequently asked questions"
author: Steven Moran & Daniel McCloy
layout: default
bibliography: bib/faq.bibtex
output:
  md_document:
    variant: gfm
    preserve_yaml: true
    toc: true
    toc_depth: 2
csl: bib/phoible.csl
---

- [Introduction](#introduction)
- [How do I get the data?](#how-do-i-get-the-data)
- [Inventories, language codes, doculects, and sources](#inventories-language-codes-doculects-and-sources)
- [How are PHOIBLE inventories created?](#how-are-phoible-inventories-created)
- [Why do some phonological inventories combine more than one doculect?](#why-do-some-phonological-inventories-combine-more-than-one-doculect)
- [Where do the language codes in PHOIBLE come from?](#where-do-the-language-codes-in-phoible-come-from)
- [Missing isocodes](#missing-isocodes)
- [Why do some languages have multiple entries in PHOIBLE?](#why-do-some-languages-have-multiple-entries-in-phoible)
- [Why are multiple inventories sometimes linked to the same document?](#why-are-multiple-inventories-sometimes-linked-to-the-same-document)
- [What are the different “sources” in PHOIBLE?](#what-are-the-different-sources-in-phoible)
- [Filtering and sampling](#filtsamp)
- [Random sampling: one inventory per isocode/glottocode](#random-sampling-one-inventory-per-isocodeglottocode)
- [Filtering by data source](#filtering-by-data-source)
- [Filtering and sampling based on inventory properties](#filtering-and-sampling-based-on-inventory-properties)
- [Integrating geographic information](#integrating-geographic-information)
- [Phonological features in PHOIBLE](#phonological-features-in-phoible)
- [How is PHOIBLE used in academic research and/or industry?](#how-is-phoible-used-in-academic-research-andor-industry)
- [References](#references)

Introduction
============

This FAQ answers questions regarding the editorial principles and design decisions that went into the creation of PHOIBLE.
We appreciate and welcome feedback regarding these FAQs via [our issue tracker](https://github.com/phoible/dev/issues) or by contacting the editors directly.

When relevant, we provide [R](https://www.r-project.org/) code snippets to elucidate the questions raised in this FAQ. Running these code snippets requires the following [R packages](https://cran.r-project.org/web/packages/available_packages_by_name.html):

``` r
library(readr)
library(stringr)
library(dplyr)
library(knitr)
library(ggplot2)
```

This document was rendered with R version 4.0.3 (2020-10-10) and package versions dplyr: 1.0.2, readr: 1.4.0, stringr: 1.4.0, knitr: 1.30, ggplot2: 3.3.2.

How do I get the data?
----------------------

You can get the most recent “official” release from our [download page](https://phoible.org/download), get the most current version (with bugfixes or new additions since the last release) from [GitHub](https://github.com/phoible/dev/blob/master/data/phoible.csv?raw=true), or use the following code snippet to download the most current version from GitHub directly within `R`:

``` r
url_ <- "https://github.com/phoible/dev/blob/master/data/phoible.csv?raw=true"
col_types <- cols(InventoryID='i', Marginal='l', .default='c')
phoible <- read_csv(url(url_), col_types=col_types)
```

Inventories, language codes, doculects, and sources
===================================================

How are PHOIBLE inventories created?
------------------------------------

For the most part, every phonological inventory in PHOIBLE is based on one-and-only-one language description (usually a research article, book chapter, dissertation, or descriptive grammar). The technical term for this in comparative linguistics is “doculect” (from “documented lect”), in which [lect](https://en.wikipedia.org/wiki/Variety_(linguistics)) means a specific form of a language or dialect, i.e. an instance of documentation of an instance of linguistic behavior at a particular time and place (Cysouw & Good, 2013).
A brief explanation and some history of why linguists use the term “doculect,” which has gained broad acceptance in light of the issues of language identification and the use of “language codes,” is given in [this blog post](https://dlc.hypotheses.org/623) by Michael Cysouw.

Contributors to PHOIBLE start with a doculect, extract the contrastive phonemes and allophones, and (if necessary) adapt the authors’ choice of symbols to align with PHOIBLE’s [symbol guidelines](http://phoible.github.io/conventions/). If the authors have not provided ISO 639-3 and Glottolog codes, these are determined before adding the inventory to PHOIBLE. Each inventory is then given a unique numeric ID. Doculects are tracked in PHOIBLE using BibTeX keys.

Why do some phonological inventories combine more than one doculect?
--------------------------------------------------------------------

An exception to the “one doculect per inventory” rule arises for inventories that were originally part of a curated phonological database such as UPSID (Maddieson, 1984; Maddieson & Precoda, 1990) or SPA (Crothers, Lorentz, Sherman, & Vihman, 1979). In those collections, inventories were often based on multiple descriptions of linguistic behavior, written by different linguists; those descriptions were believed to be describing the same language, and disagreements between the descriptions were adjudicated by the experts who compiled the collection.
We can quickly see how many of PHOIBLE’s inventories are based on multiple doculects by looking at the mapping table between PHOIBLE inventory IDs and BibTeX keys:

``` r
url_ <- "https://github.com/phoible/dev/blob/master/mappings/InventoryID-Bibtex.csv?raw=true"
id_to_bibtex_mapping <- read_csv(url(url_),
                                 col_types=cols(InventoryID='i', .default='c'))

id_to_bibtex_mapping %>%
    group_by(InventoryID) %>%
    tally(name="Number of doculects consulted") %>%
    group_by(`Number of doculects consulted`) %>%
    count(name="Number of inventories") %>%
    kable()
```

| Number of doculects consulted | Number of inventories |
|------------------------------:|----------------------:|
|                             1 |                  2435 |
|                             2 |                   442 |
|                             3 |                    93 |
|                             4 |                    29 |
|                             5 |                    10 |
|                             6 |                     7 |
|                             7 |                     2 |
|                             9 |                     1 |
|                            11 |                     1 |

Clearly, the majority of inventories in PHOIBLE represent a phonological description from a single doculect. But it seems strange that a single phonological inventory in PHOIBLE could be based on 11 different doculects. Let’s examine it:

``` r
id_to_bibtex_mapping %>%
    group_by(InventoryID) %>%
    tally(name="Number of doculects consulted") %>%
    filter(`Number of doculects consulted` == 11) %>%
    pull(InventoryID) -> this_inventory_id

phoible %>%
    filter(InventoryID == this_inventory_id) %>%
    distinct(Source, LanguageName, Glottocode, ISO6393) %>%
    kable()
```

| Glottocode | ISO6393 | LanguageName | Source |
|:-----------|:--------|:-------------|:-------|
| haus1257   | hau     | HAUSA        | upsid  |

As we can see, this inventory represents a description of [Hausa](https://glottolog.org/resource/languoid/id/haus1257) and was added to PHOIBLE from the UPSID database (Maddieson, 1984; Maddieson & Precoda, 1990).
To understand why this UPSID entry consulted 11 different sources, consider first that Hausa is typologically interesting (e.g., it has both ejective and implosive phonation mechanisms) and has tens of millions of speakers, making it relatively well-studied (the [Glottolog](https://glottolog.org/) reference catalog has [more than 1400 references](https://glottolog.org/resource/languoid/id/haus1257) related to Hausa). Second, note that Maddieson’s work on UPSID involved “typologizing” phonological inventories from different doculects, so that they were comparable across all entries in his database (cf. Hyman, 2008). Maddieson’s work was groundbreaking at the time because he was the first typologist to generate a stratified language sample aimed at being genealogically balanced, i.e. for each language family he chose one representative language. This allowed Maddieson to make statements about the cross-linguistic distribution of contrastive speech sounds with some level of statistical confidence. In fact, much of what we know about the distribution of the sounds of the world’s languages is due to Maddieson’s original language sample and his meticulous curation of the data.

Where do the language codes in PHOIBLE come from?
-------------------------------------------------

Every phonological inventory in PHOIBLE has a unique numeric inventory ID. Since most PHOIBLE inventories (aside from some UPSID or SPA ones, as mentioned above) are based on a single document, it is fairly straightforward to link each PHOIBLE inventory to [the Glottolog](https://glottolog.org/), which provides links between linguistic description documents and unique identifiers for dialects, languages, and groupings of dialects and languages at various levels (the preferred term for any of these levels of specificity / grouping is “languoid”; see Cysouw & Good, 2013).
Thus in PHOIBLE each inventory typically corresponds to a single languoid, and in most cases that languoid is a “leaf node” in the Glottolog tree, i.e., it represents a particular dialect known to be used at a particular place and time (rather than a group of dialects, a language, or a language family). However, in the few cases where multiple document sources were consulted for an inventory, it may not be possible to link that inventory to a unique Glottolog leaf node. In such cases, the inventory in PHOIBLE is linked to the lowest possible Glottolog node that dominates the leaf nodes of each source document.

For example, inventory 298 describes the Dan language, and was ingested from UPSID and ultimately based on a single journal article. At the time UPSID was compiled, Dan and Kla-Dan were considered a single language (isocode `daf`), but in 2013 a proposal was accepted to assign them separate isocodes (`dnj` and `lda`, respectively). Since it is unknown whether the consultants for the doculect were speakers of what we would now call “Dan” or “Kla-Dan,” the inventory is labeled with the old isocode `daf` and linked to the corresponding non-leaf node in the Glottolog (`dann1241`).

In addition to providing Glottolog languoid codes (“glottocodes”) for each inventory, PHOIBLE also includes [ISO 639-3](https://en.wikipedia.org/wiki/ISO_639-3) language codes (“isocodes”) for each inventory. The link between glottocodes and isocodes is maintained and provided by Glottolog. This situation can result in two possible problems:

- When there are [updates to ISO 639-3](https://iso639-3.sil.org/code_changes/submitting_change_requests), the Glottolog may not update its mapping immediately, so the two can get out of sync temporarily. Such problems are typically resolved with new version releases of Glottolog.
- In some cases, the editors of the Glottolog do not agree with the language classification choices of ISO 639-3 (see again the above-mentioned [blog post by Cysouw](https://dlc.hypotheses.org/623)). This disagreement results in cases where PHOIBLE must choose whether to accept the official ISO 639-3 code assignment, or use the isocode that the Glottolog associates with the glottocode for that inventory. In such cases, PHOIBLE policy is to report the ISO 639-3 version of the isocode. Consequently, there are a few languoids for which PHOIBLE and the Glottolog will report different isocodes for the same glottocode. If this is a problem for your analysis, you can always download a glottocode-isocode mapping from the Glottolog and merge it into the PHOIBLE dataset before performing your isocode-based analyses.

Missing isocodes
----------------

There are cases of languages reported in the Glottolog for which there exists no isocode. For example, the Vach-Vasjugan variety of [Khanty](https://en.wikipedia.org/wiki/Khanty_language) is a Uralic language classified in the Glottolog as [vach1239](https://glottolog.org/resource/languoid/id/vach1239). However, [SIL](https://www.sil.org/) only assigns an isocode for one variety of Northern Khanty (isocode `kca`, glottocode `khan1273`), and because SIL is the Registration Authority of [ISO 639-3](https://iso639-3.sil.org/), Vach-Vasjugan Khanty has not been assigned its own isocode distinct from Northern Khanty.

When no isocode exists for a particular phonological inventory in PHOIBLE, as in the Vach-Vasjugan example above, PHOIBLE follows the recommended practice of using the isocode `mis` (“missing”) to denote that the language is not included in the ISO 639-3 standard.
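The glottocode-isocode merge suggested above can be sketched with toy tables. Note that the data values and the `Glottolog_ISO` column name below are made up for illustration; the real Glottolog download has its own column names, which you should check before merging.

``` r
library(dplyr)

# Toy stand-in for a few PHOIBLE rows (illustrative values only):
phoible_subset <- tibble(
    InventoryID = c(1L, 2L, 3L),
    Glottocode  = c("kore1280", "vach1239", "kaba1278"),
    ISO6393     = c("kor", "mis", "kbd"))

# Toy stand-in for a glottocode-isocode mapping downloaded from the Glottolog:
glottolog_map <- tibble(
    Glottocode    = c("kore1280", "kaba1278"),
    Glottolog_ISO = c("kor", "kbd"))

# Merge, keeping every PHOIBLE row; inventories whose glottocode is absent
# from the mapping get NA in the Glottolog_ISO column.
merged <- left_join(phoible_subset, glottolog_map, by="Glottocode")
```

You can then base your isocode-level analyses on `Glottolog_ISO` instead of (or in addition to) PHOIBLE’s own `ISO6393` column.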
In PHOIBLE, these are the inventories with missing isocodes:

``` r
phoible %>%
    filter(ISO6393 == "mis") %>%
    distinct(InventoryID, LanguageName, ISO6393, Glottocode) %>%
    kable()
```

| InventoryID | Glottocode | ISO6393 | LanguageName      |
|------------:|:-----------|:--------|:------------------|
|        2143 | pisa1245   | mis     | Pisamira          |
|        2352 | lizu1234   | mis     | Lizu              |
|        2388 | east2773   | mis     | Dolakha Newar     |
|        2420 | zhon1235   | mis     | Zhongu Tibetan    |
|        2434 | vach1239   | mis     | Eastern Khanty    |
|        2450 | fore1274   | mis     | Forest Nenets     |
|        2691 | mink1237   | mis     | Minkin            |
|        2714 | guwa1244   | mis     | Guwar             |
|        2729 | NA         | mis     | Djindewal         |
|        2748 | mith1236   | mis     | Mithaka           |
|        2773 | cola1237   | mis     | Kolakngat         |
|        2778 | yari1243   | mis     | Yari-Yari         |
|        2782 | west2443   | mis     | East Djadjawurung |
|        2783 | djad1246   | mis     | Jardwadjali       |
|        2792 | kera1256   | mis     | Keramin           |
|        2793 | lowe1402   | mis     | Ngayawang         |
|        2794 | ngin1247   | mis     | Ngintait          |
|        2818 | gudj1237   | mis     | Gudjal            |
|        2882 | kawa1290   | mis     | Ogh Awarrangg     |
|        2883 | kawa1290   | mis     | Ogh Unyjan        |
|        2907 | wala1263   | mis     | Walangama         |
|        2911 | tyan1235   | mis     | Thaynakwithi      |
|        2913 | luth1234   | mis     | Luthigh           |
|        2914 | mbiy1238   | mis     | Mbiywom           |
|        2916 | ngko1236   | mis     | Ngkoth            |
|        2920 | yadh1237   | mis     | Yadhaykenu        |
|        2946 | bula1255   | mis     | Bularnu           |
|        2956 | yulp1239   | mis     | Yulparija         |
|        2988 | west2443   | mis     | West Djadjawurung |
|        2999 | sout2770   | mis     | Ngunawal          |

Many of these inventories come from [Erich Round’s](https://languages-cultures.uq.edu.au/profile/1160/erich-round) [contribution of Australian phonemic inventories](https://zenodo.org/record/3464333#.XyK5qxMzY3E) to PHOIBLE. Unfortunately, some of these languages are extinct and have no representation in the [Ethnologue](https://www.ethnologue.com/), and hence, no code assigned in ISO 639-3.

For users, it is important to note that multiple phonological inventories in PHOIBLE may have the same isocode (e.g. “mis”) or the same glottocode (e.g., in cases of two different descriptions of the same lect). In essence, this means that any programmatic code that groups by isocode or glottocode risks combining inventories from different doculects into a single apparent inventory. This could lead to incorrect results (e.g., if the goal is to count the number of phonemes). Therefore, **most analyses of inventory properties should be done on the level of inventory IDs** rather than isocodes or glottocodes. See also the [section on filtering and sampling](#filtsamp) below.

Why do some languages have multiple entries in PHOIBLE?
-------------------------------------------------------

It is not uncommon that phonological descriptions of a particular language’s speech sounds have different sets of contrastive phonemes when analyzed by different linguists (or sometimes even by the same linguist throughout their career). For example, [Kabardian](https://en.wikipedia.org/wiki/Kabardian_language) is represented in PHOIBLE by [five distinct inventories](https://phoible.org/languages/kaba1278).
``` r
phoible %>%
    filter(ISO6393 == "kbd") %>%
    group_by(InventoryID) %>%
    summarise(`Number of phonemes`=n(),
              Phonemes=str_c(Phoneme, collapse=" "),
              .groups="drop") %>%
    kable()
```

| InventoryID | Number of phonemes | Phonemes |
|------------:|-------------------:|:---------|
| 4 | 56 | b d̻ d̻z̻ f fʼ j kʷʰ kʷʼ k̟ʲʰ k̟ʲʼ m n̻ pʰ pʼ qχ qχʷ qχʷʼ qχʼ r s̻ t̻s̻ t̻s̻ʼ t̻ʰ t̻ʼ v w xʷ x̟ʲ z̻ ç̟ ħ ɡʷ ɡ̟ʲ ɣ̟ʲ ɦ ɬʲ ɬʲʼ ɮʲ ʁ ʁʷ ʃ ʃʼ ʒ ʔ ʔʷ ʕ ʝ̟ χ χʷ a̟ː e̞ː iː o̞ː uː ɜ ɨ |
| 391 | 56 | b d̪ d̪z̪ f fʼ j kʲʰ kʲʼ kʷʰ kʷʼ m n̪ pʰ pʼ qʷʼ qʼ qχ qχʷ r s̪ t̪s̪ t̪s̪ʼ t̪ʰ t̪ʼ v w xʲ xʷ z̪ ħ ɡʲ ɡʷ ɣʲ ɦ ɬ̪ʲ ɬ̪ʲʼ ɮ̪ʲ ʁ ʁʷ ʃ ʃʼ ʃ͇ ʒ ʒ͇ ʔ ʔʷ ʕ χ χʷ a̟ː e̞ː iː o̞ː uː ɜ ɨ |
| 2310 | 55 | b d dz d̠ʒ f fʼ j kʷ kʷʼ m n p pʼ q qʷ qʷʼ qʼ r s t ts tsʼ tʼ t̠ʃ t̠ʃʼ v w x xʷ z ħ ɡʷ ɣ ɬ ɬʼ ɮ ʁ ʁʷ ʃ ʆ ʆʼ ʒ ʓ ʔ ʔʷ χ χʷ ä e̞ː iː o̞ o̞ː uː ɑː ə |
| 2401 | 63 | b d dz dʑ f fʼ j kʷ kʷʼ kʼ l m n p pʼ q qʷ qʷʼ qʼ r s t ts tsʷʼ tsʼ tɕ tɕʼ tʷʼ tʼ t̠ʃ t̠ʃʼ v w x xʷ z zʷ ħ ɕ ɡʷ ɣ ɬ ɬʼ ʁ ʁʷ ʃ ʆ ʆʼ ʑ ʒ ʓ ʔ ʔʷ χ χʷ ä e̞ː iː o̞ o̞ː uː ɑː ə |
| 2610 | 51 | b d dz f fʼ h j kʲʼ kʷʰ kʷʼ l m n pʰ pʼ qʰ qʷʰ qʷʼ qʼ s ts tsʼ tʰ tʼ w xʷ z ç ħ ɡʲ ɡʷ ɬ ɬʼ ɾ ʁ ʁʷ ʃ ʃʼ ʒ ʔ ʔʷ ʝ χ χʷ äː eː iː oː uː ɐ ə |

The differences among them can be due to a variety of reasons, but the main reason is that these phonological descriptions represent different doculects (i.e., different instances of linguistic behavior at different places, times, or with different speakers). This should probably not surprise most linguists, since it has long been known that phoneme analysis is a non-deterministic process (Chao, 1934; Hockett, 1963). See Moran (2012, ch. 2.3.3) for a general discussion, and Hyman (2008, p. 99) for a detailed discussion of Kabardian in particular.
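To make the isocode-grouping pitfall concrete, here is a minimal sketch with made-up data (not real PHOIBLE entries): two hypothetical inventories share an isocode but disagree on one phoneme, so grouping by isocode silently pools them into one apparent inventory.

``` r
library(dplyr)

# Toy data: two descriptions of the same (hypothetical) isocode "xyz"
toy <- tibble(
    InventoryID = c(1L, 1L, 1L, 2L, 2L, 2L),
    ISO6393     = "xyz",
    Phoneme     = c("p", "t", "k", "p", "t", "q"))

# Grouping by isocode pools the two doculects: 4 distinct phonemes ...
by_isocode <- toy %>%
    group_by(ISO6393) %>%
    summarise(`Number of phonemes`=n_distinct(Phoneme), .groups="drop")

# ... whereas grouping by inventory ID keeps them apart: 3 phonemes each.
by_inventory <- toy %>%
    group_by(InventoryID) %>%
    summarise(`Number of phonemes`=n_distinct(Phoneme), .groups="drop")
```

Neither 4-phoneme result corresponds to any doculect that was actually described, which is why inventory IDs are the safer unit of analysis.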
In light of the above discussion, it should come as no surprise that PHOIBLE can contain multiple inventories for “the same” languoid, depending on what kind of languoid you’re interested in. All it takes is the existence of two or more descriptive documents associated with the same group of speakers, where “group of speakers” is ultimately determined by your desired level of granularity.

To give a concrete example, one researcher may be interested in comparing lects at the “language” level, and so might wish to treat all inventories of “English” as duplicates for the purposes of her analysis (regardless of any differences in regional dialect or sociolect represented in the original doculects and encoded in the phonological inventory). That researcher might *filter* or *sample* PHOIBLE’s inventories to include only one inventory for each isocode (how she chooses to implement that filter is a separate question; see the [Filtering and sampling](#filtsamp) section for examples). Other researchers may not care about “duplicates” in that sense, and may choose to include all inventories in their analysis (or, they may filter the dataset to include only inventories with a particular feature of interest such as breathy-voiced vowels).

Below is a summary of the number of isocodes that are represented by multiple inventories in PHOIBLE:

``` r
offset <- 25

phoible %>%
    group_by(ISO6393) %>%
    summarise(y=n_distinct(InventoryID), .groups="drop") %>%
    count(y) -> counts

counts %>%
    ggplot(aes(y=as.factor(y), x=n)) +
    geom_point() +
    geom_text(aes(x=n+offset, label=n), hjust=0) +
    geom_segment(aes(xend=0, yend=as.factor(y))) +
    xlim(NA, max(counts$n) + 2*offset) +
    theme_light() +
    labs(title="Prevalence of multiple inventories per ISO code",
         x="Number of ISO codes having N inventories",
         y="N inventories")
```

![](images/faq/stemplot_inv_per_iso-1.png)<!-- -->

So most ISO 639-3 codes have only 1 inventory (the long, bottom line).
Here is the same representation, for glottocodes instead of isocodes:

``` r
offset <- 25

phoible %>%
    group_by(Glottocode) %>%
    summarise(y=n_distinct(InventoryID), .groups="drop") %>%
    count(y) -> counts

counts %>%
    ggplot(aes(y=as.factor(y), x=n)) +
    geom_point() +
    geom_text(aes(x=n+offset, label=n), hjust=0) +
    geom_segment(aes(xend=0, yend=as.factor(y))) +
    xlim(NA, max(counts$n) + 2*offset) +
    theme_light() +
    labs(title="Prevalence of multiple inventories per glottocode",
         x="Number of glottocodes having N inventories",
         y="N inventories")
```

![](images/faq/stemplot_inv_per_glotto-1.png)<!-- -->

Again, most glottocodes are represented by just one or two inventories in PHOIBLE. Let’s see the few glottocodes that have the most inventories:

``` r
phoible %>%
    group_by(Glottocode) %>%
    summarise(Names=str_c(unique(LanguageName), collapse=", "),
              `Number of inventories`=n_distinct(InventoryID),
              .groups="drop") %>%
    arrange(desc(`Number of inventories`)) %>%
    filter(`Number of inventories` > 5) %>%
    kable()
```

| Glottocode | Names | Number of inventories |
|:-----------|:------|----------------------:|
| osse1243 | Ossetian, Iron Ossetic | 11 |
| biri1256 | Barna, Biri, Garingbal, Miyan, Wiri, Yambina, Yangga, Yilba, Yuwi, Wangan | 10 |
| stan1293 | English, English (American), American English, English (Australian), English (British), English (New Zealand) | 9 |
| dutc1256 | Dutch | 8 |
| kham1282 | Rgyalthang Tibetan, Brag-g.yab Tibetan, Nangchenpa Tibetan, Soghpo Tibetan, Kami Tibetan, Sangdam Tibetan, Dongwang Tibetan, Kham Tibetan | 8 |
| basq1248 | Basque, BASQUE | 7 |
| east2328 | Cheremis, MARI, Meadow Mari, Eastern Mari | 7 |
| gwan1268 | Gwandara (Karshi), Gwandara (Cancara), Gwandara (Toni), Gwandara (Gitata), Gwandara (Koro), Gwandara (Nimbia) | 6 |
| khan1273 | Ostyak, KHANTY, Eastern Khanty, Northern Khanty | 6 |
| lazz1240 | Laz | 6 |
| lith1251 | Lithuanian, LITHUANIAN | 6 |

Why are multiple inventories sometimes linked to the same document?
-------------------------------------------------------------------

Occasionally, a single document may provide information about multiple phonological inventories. For example, a dissertation may describe and compare three related dialects or languages spoken in a particular region. In that case, three phonological inventories in PHOIBLE corresponding to three doculects might all be linked to the same document, but each inventory is still linked to only one document (it just happens to be *the same document* for those three inventories). One example is Terrill (1998), a grammar of [Biri](https://en.wikipedia.org/wiki/Biri_language), which contains a chapter describing its dialects, [many of which appear in PHOIBLE](https://phoible.org/languages/biri1256).

What are the different “sources” in PHOIBLE?
--------------------------------------------

PHOIBLE contains inventories from various [contributors](https://phoible.org/contributors). These contributions are grouped into so-called “sources,” denoted by abbreviations.
Here they are in the chronological order that they were added to PHOIBLE:

- [SPA](https://github.com/phoible/dev/tree/master/raw-data/SPA): The Stanford Phonology Archive (Crothers, Lorentz, Sherman, & Vihman, 1979)
- [UPSID](https://github.com/phoible/dev/tree/master/raw-data/UPSID): The [UCLA Phonological Segment Inventory Database](http://web.phonetik.uni-frankfurt.de/upsid.html) (Maddieson, 1984; Maddieson & Precoda, 1990)
- [AA](https://github.com/phoible/dev/tree/master/raw-data/AA): [Alphabets of Africa](http://sumale.vjf.cnrs.fr/phono/index.htm) (Chanard, 2006; Hartell, 1993)
- [PH](https://github.com/phoible/dev/tree/master/raw-data/PH): Data drawn from journal articles, theses, and published grammars, added by members of the Linguistic Phonetics Laboratory at the University of Washington (Moran, 2012)
- [GM](https://github.com/phoible/dev/tree/master/raw-data/GM): Data from African and Southeast Asian languages
- [RA](https://github.com/phoible/dev/tree/master/raw-data/RA): Common Linguistic Features in Indian Languages (Ramaswami, 1999)
- [SAPHON](https://github.com/phoible/dev/tree/master/raw-data/SAPHON): [South American Phonological Inventory Database](http://linguistics.berkeley.edu/~saphon/en/) (Michael, Stark, & Chang, 2012)
- [UZ](https://github.com/phoible/dev/tree/master/raw-data/UZ): Data drawn from journal articles, theses, and published grammars, added by the phoible developers while at the Department of Comparative Linguistics at the University of Zurich
- [EA](https://github.com/phoible/dev/tree/master/raw-data/EA): The database of [Eurasian phonological inventories](http://eurasianphonology.info/) (beta version) (Nikolaev, Nikulin, & Kukhto, 2015)
- [ER](https://github.com/phoible/dev/tree/master/raw-data/ER): Australian phonemic inventories (Round, 2019)

The acronyms above link to the GitHub page for each data source, which provides information about the source and how it was aggregated into PHOIBLE.
Some sources are quite specialized; for example, UPSID contains a quota sample, i.e., one language per genealogical grouping (see the [Filtering and sampling](#filtsamp) section); AA contains descriptions of only African languages; RA represents languages of India; SAPHON represents languages of South America; GM represents languages of Africa and Asia; EA represents languages of Eurasia; ER represents languages of Australia. Other sources like PH and UZ were added mainly to fill in the typological gaps left by the more specialized sources (e.g., to add language isolates, or to increase coverage of poorly-represented geographic areas or language families).

Here is a table showing the number of inventories per source:

``` r
phoible %>%
    group_by(Source) %>%
    summarise(`Number of inventories`=n_distinct(InventoryID), .groups="drop") %>%
    arrange(desc(`Number of inventories`)) %>%
    kable()
```

| Source | Number of inventories |
|:-------|----------------------:|
| gm     |                   460 |
| upsid  |                   451 |
| er     |                   392 |
| ea     |                   390 |
| ph     |                   389 |
| saphon |                   355 |
| aa     |                   203 |
| spa    |                   197 |
| ra     |                   100 |
| uz     |                    83 |

Note that the same languoid may be reported in different sources as encoded in different doculects. Here are the lects included in the most sources:

``` r
phoible %>%
    group_by(Glottocode) %>%
    distinct(Glottocode, Source) %>%
    summarise(`Found in how many sources?`=n(),
              `Which ones?`=str_c(Source, collapse=", "),
              .groups="drop") %>%
    filter(`Found in how many sources?` > 3) %>%
    kable()
```

| Glottocode | Found in how many sources? | Which ones?            |
|:-----------|---------------------------:|:-----------------------|
| akan1250   |                          4 | spa, upsid, aa, gm     |
| basq1248   |                          4 | spa, upsid, uz, ea     |
| beng1280   |                          5 | spa, upsid, ra, uz, ea |
| bulg1262   |                          4 | spa, upsid, uz, ea     |
| east2328   |                          4 | spa, upsid, ph, ea     |
| hakk1236   |                          4 | spa, upsid, uz, ea     |
| haus1257   |                          4 | spa, upsid, aa, uz     |
| hind1269   |                          5 | spa, upsid, ra, uz, ea |
| hung1274   |                          4 | spa, upsid, uz, ea     |
| iris1253   |                          4 | spa, upsid, uz, ea     |
| khan1273   |                          4 | spa, upsid, ph, ea     |
| khar1287   |                          5 | spa, upsid, ph, ra, ea |
| kore1280   |                          4 | spa, upsid, uz, ea     |
| mand1415   |                          4 | spa, upsid, ph, ea     |
| mode1248   |                          4 | spa, upsid, uz, ea     |
| mund1320   |                          4 | spa, upsid, ra, ea     |
| nepa1254   |                          4 | upsid, ra, uz, ea      |
| nucl1301   |                          4 | spa, upsid, uz, ea     |
| nucl1302   |                          4 | spa, upsid, uz, ea     |
| nucl1310   |                          4 | spa, upsid, uz, ea     |
| nucl1417   |                          4 | spa, upsid, aa, uz     |
| stan1288   |                          4 | spa, upsid, uz, ea     |
| stan1290   |                          4 | spa, upsid, uz, ea     |
| stan1295   |                          4 | spa, upsid, uz, ea     |
| tach1250   |                          4 | spa, upsid, gm, uz     |
| taga1270   |                          4 | spa, upsid, gm, ea     |
| telu1262   |                          4 | spa, upsid, ra, ea     |
| viet1252   |                          4 | spa, upsid, uz, ea     |
| west2369   |                          4 | spa, upsid, uz, ea     |
| yuec1235   |                          4 | spa, upsid, uz, ea     |

Filtering and sampling
======================

Different research questions will require including / excluding certain inventories from PHOIBLE. These sections describe how to *filter* and *sample* the PHOIBLE data based on various criteria.
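One practical note before sampling: `sample_n()` draws at random, so results will differ between runs unless you fix the RNG seed with `set.seed()` first. A minimal sketch with made-up inventory IDs and glottocodes (not real PHOIBLE values):

``` r
library(dplyr)

# Toy index: three glottocodes, two inventories each (illustrative only)
index <- tibble(
    InventoryID = 1:6,
    Glottocode  = c("aaaa1111", "aaaa1111", "bbbb1234",
                    "bbbb1234", "cccc1235", "cccc1235"))

draw_sample <- function(seed) {
    set.seed(seed)  # fix the RNG state so the draw is repeatable
    index %>%
        group_by(Glottocode) %>%
        sample_n(1) %>%
        pull(InventoryID)
}

# The same seed always yields the same sample of one ID per glottocode:
identical(draw_sample(1), draw_sample(1))
```

Setting a seed is worth doing in any published analysis that uses the sampling approaches below, so that readers can reproduce the exact sample.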
Random sampling: one inventory per isocode/glottocode
-----------------------------------------------------

If multiple inventories per isocode/glottocode are problematic for your analysis or research question, one approach is to select one inventory from each isocode/glottocode via random sampling:

``` r
phoible %>%
    distinct(InventoryID, Glottocode) %>%
    group_by(Glottocode) %>%
    sample_n(1) %>%
    pull(InventoryID) -> inventory_ids_sampled_one_per_glottocode

phoible %>%
    distinct(InventoryID, ISO6393) %>%
    group_by(ISO6393) %>%
    sample_n(1) %>%
    pull(InventoryID) -> inventory_ids_sampled_one_per_isocode

message("Picking one inventory per glottocode reduces PHOIBLE from ",
        n_distinct(phoible$InventoryID), " inventories\nto ",
        length(inventory_ids_sampled_one_per_glottocode),
        " inventories. Picking one per ISO 639-3 code yields ",
        length(inventory_ids_sampled_one_per_isocode), " inventories.")
```

    ## Picking one inventory per glottocode reduces PHOIBLE from 3020 inventories
    ## to 2177 inventories. Picking one per ISO 639-3 code yields 2099 inventories.

You can then apply your sample like this:

``` r
phoible %>%
    filter(InventoryID %in% inventory_ids_sampled_one_per_glottocode) -> my_sample
```

Filtering by data source
------------------------

Another approach is to use only inventories from a data source that already provides a one-inventory-per-language sample. For example, UPSID represents a “quota” sample (one language per family, for some definition of “family”). To do this, you can filter the PHOIBLE data by the `Source` column:

``` r
phoible %>% filter(Source == "upsid") -> upsid

# show that there is exactly one inventory per ISO 639-3 code:
upsid %>%
    group_by(ISO6393) %>%
    summarise(n_inventories_per_isocode=n_distinct(InventoryID), .groups="drop") %>%
    pull(n_inventories_per_isocode) %>%
    {all(. == 1)}
```

    ## [1] TRUE

<!-- However, note that the state of knowledge of linguistic genealogical groupings has changed since UPSID was published. A quota sample that reflects current understanding can also be achieved. Here, we choose one leaf node from within each top-level Glottolog family node that has representation in PHOIBLE:

```r
# TODO: EXAMPLE OF QUOTA SAMPLE USING GLOTTOLOG
# NB: every Isolate will be included
```
-->

Filtering and sampling based on inventory properties
----------------------------------------------------

Another approach is to select inventories based on properties of the inventories themselves, such as whether they include information about allophones, contrastive tone, etc. For example, one might wish to include phonological inventories from sources other than UPSID, when available, since it does not include allophones in its inventories.

``` r
# get lists of all sources, and sources that include allophones
phoible %>% distinct(Source) %>% pull() -> all_sources

phoible %>%
    filter(!is.na(Allophones)) %>%
    distinct(Source) %>%
    pull() -> sources_with_allophones

# make a vector encoding allophone absence/presence as 0/1
(all_sources %in% sources_with_allophones) %>%
    as.integer() %>%
    setNames(all_sources) -> source_weights

# example 1: one language per isocode, only keep if includes allophones
phoible %>%
    distinct(InventoryID, ISO6393, Source) %>%
    filter(source_weights[Source] == 1) %>%
    group_by(ISO6393) %>%
    slice_sample(n=1) %>%
    pull(InventoryID) -> sample_of_inventory_ids_with_allophones

# example 2: one language per isocode, *preferentially* pick ones w/ allophones
new_weights <- source_weights + 1e-9  # so that all weights are non-zero

phoible %>%
    distinct(InventoryID, ISO6393, Source) %>%
    group_by(ISO6393) %>%
    slice_sample(n=1, weight_by=new_weights[Source]) %>%
    pull(InventoryID) -> sample_of_inventory_ids_with_preference_for_having_allophones

message("Sampling one inventory per ISO code while *requiring* allophones yielded ",
        length(sample_of_inventory_ids_with_allophones),
        " inventories; merely *preferring* allophones yielded ",
        length(sample_of_inventory_ids_with_preference_for_having_allophones),
        " inventories.")
```

    ## Sampling one inventory per ISO code while *requiring* allophones yielded 1155 inventories; merely *preferring* allophones yielded 2099 inventories.

You can then extract your sample using `filter()` as seen above:

``` r
phoible %>%
    filter(InventoryID %in% sample_of_inventory_ids_with_allophones) -> my_sample
```

If you’re inclined to be a “phoneme splitter,” you might prefer to pick the largest inventory for a given isocode:

``` r
phoible %>%
    group_by(InventoryID) %>%
    summarise(n_phonemes=n(), isocode=unique(ISO6393), .groups="drop") %>%
    group_by(isocode) %>%
    arrange(desc(n_phonemes), .by_group=TRUE) %>%
    slice_head(n=1) %>%
    pull(InventoryID) -> inventory_ids_of_biggest_inventories
```

… and again, extracting your sample using `filter()`:

``` r
phoible %>%
    filter(InventoryID %in% inventory_ids_of_biggest_inventories) -> my_sample
```

Integrating geographic information
==================================

One way to look at the geographic coverage of PHOIBLE is to merge its data with information about languages and dialects as provided by [the Glottolog](https://glottolog.org/meta/downloads).
Here we use only the PHOIBLE index (the mapping from InventoryID to Glottocode, without any Phoneme information) and merge it with the Glottolog geographic and genealogical data: ``` r url_ <- "https://raw.githubusercontent.com/phoible/dev/master/mappings/InventoryID-LanguageCodes.csv" phoible_index <- read_csv(url(url_), col_types=cols(InventoryID='i', .default='c')) url_ <- "https://cdstar.shh.mpg.de/bitstreams/EAEA0-18EC-5079-0173-0/languages_and_dialects_geo.csv" glottolog <- read_csv(url(url_), col_types=cols(latitude='d', longitude='d', .default='c')) phoible_geo <- left_join(phoible_index, glottolog, by=c("Glottocode"="glottocode")) # show the merged data phoible_geo %>% head() %>% kable() ``` | InventoryID | ISO6393 | Glottocode | LanguageName | Source | name | isocodes | level | macroarea | latitude | longitude | |------------:|:--------|:-----------|:-------------|:-------|:-----------|:---------|:---------|:----------|---------:|----------:| | 1 | kor | kore1280 | Korean | spa | Korean | kor | language | Eurasia | 37.5000 | 128.00000 | | 2 | ket | kett1243 | Ket | spa | Ket | ket | language | Eurasia | 63.7551 | 87.54660 | | 3 | lbe | lakk1252 | Lak | spa | Lak | lbe | language | Eurasia | 42.1328 | 47.08090 | | 4 | kbd | kaba1278 | Kabardian | spa | Kabardian | kbd | language | Eurasia | 43.5082 | 43.39180 | | 5 | kat | nucl1302 | Georgian | spa | Georgian | kat | language | Eurasia | 41.8504 | 43.78613 | | 6 | bsk | buru1296 | Burushaski | spa | Burushaski | bsk | language | Eurasia | 36.2161 | 74.82360 | We can then easily see how many languages there are in PHOIBLE for each macroarea: ``` r phoible_geo %>% distinct(ISO6393, macroarea) %>% group_by(macroarea) %>% tally(name="Number of unique isocodes") %>% kable() ``` | macroarea | Number of unique isocodes | |:--------------|--------------------------:| | Africa | 707 | | Australia | 292 | | Eurasia | 440 | | North America | 144 | | Papunesia | 182 | | South America | 334 | | NA | 5 | Or, we can 
count the total number of inventories per macroarea:

``` r
phoible_geo %>%
    group_by(macroarea) %>%
    tally(name="Number of inventories") %>%
    kable()
```

| macroarea     | Number of inventories |
|:--------------|----------------------:|
| Africa        |                   885 |
| Australia     |                   436 |
| Eurasia       |                   804 |
| North America |                   184 |
| Papunesia     |                   216 |
| South America |                   479 |
| NA            |                    16 |

Note that there are some languoids/glottocodes for which geographic information is unavailable (their macroarea is `NA`). Also note that each languoid is given a single latitude-longitude coordinate pair (i.e., there is no information about the *spatial extent* of a languoid’s use).

Finally, let’s look at the global distribution of languages represented in PHOIBLE:

``` r
ggplot(data=phoible_geo, aes(x=longitude, y=latitude)) +
    borders("world", colour="gray50", fill="gray50") +
    geom_point(alpha=0.5, size=1, colour="orange")
```

## Warning: Removed 134 rows containing missing values (geom_point).

![](images/faq/world_map-1.png)<!-- -->

Of course this does not show all of the data points for languages that are *not* in PHOIBLE!

Phonological features in PHOIBLE
================================

In addition to phoneme inventories, PHOIBLE includes distinctive feature data for every phoneme in every inventory. The feature system was created by the PHOIBLE developers to be descriptively adequate cross-linguistically. In other words, if two phonemes differ in their graphemic representation, then they should necessarily differ in their featural representation as well (regardless of whether those two phonemes coexist in any known doculect). The feature system is loosely based on the feature system in Hayes (2009), with some additions drawn from Moisik & Esling (2011). Note that the feature system is potentially subject to change as new languages are added in subsequent editions of PHOIBLE, and/or as errors are found and corrected.
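The descriptive-adequacy requirement (distinct graphemes must map to distinct feature vectors) can be sketched as a simple check. The toy segment data below is invented for illustration and is not PHOIBLE's actual feature table:

```python
# Toy check of descriptive adequacy: no two distinct graphemes may share a
# feature vector. The feature values here are invented, not PHOIBLE's.
segments = {
    "p": {"voice": "-", "labial": "+", "continuant": "-"},
    "b": {"voice": "+", "labial": "+", "continuant": "-"},
    "f": {"voice": "-", "labial": "+", "continuant": "+"},
}

def is_descriptively_adequate(segs):
    seen = {}
    for grapheme, feats in segs.items():
        key = tuple(sorted(feats.items()))
        if key in seen:
            return False  # two graphemes with identical feature vectors
        seen[key] = grapheme
    return True

print(is_descriptively_adequate(segments))  # prints True
```

If a newly added doculect introduced a segment whose grapheme differs from an existing one but whose feature vector collides with it, a check like this would fail, signalling that the feature system needs an additional distinction.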
More information about the PHOIBLE feature set can be found [here](https://github.com/phoible/dev/tree/master/raw-data/FEATURES).

How is PHOIBLE used in academic research and/or industry?
=========================================================

The data in PHOIBLE have been used in many published research papers pertaining to a variety of scientific fields and industrial applications. For a sampling, view Google Scholar’s list of [papers citing PHOIBLE](https://scholar.google.com/scholar?oi=bibs&hl=en&cites=576981116309388928&as_sdt=5).

References
==========

<div id="refs" class="references csl-bib-body hanging-indent" line-spacing="2">

<div id="ref-chanard2006" class="csl-entry">

Chanard C (2006). Systèmes alphabétiques des langues africaines. Online: <http://sumale.vjf.cnrs.fr/phono/>.

</div>

<div id="ref-Chao1934" class="csl-entry">

Chao YR (1934). The non-uniqueness of phonemic solutions of phonetic systems. *Bulletin of the Institute of History and Philology, Academia Sinica*, *4*(4).

</div>

<div id="ref-CrothersEtAl1979" class="csl-entry">

Crothers JH, Lorentz JP, Sherman DA, & Vihman MM (1979). *Handbook of phonological data from a sample of the world’s languages: A report of the Stanford Phonology Archive*. Palo Alto, CA:

</div>

<div id="ref-CysouwGood2013" class="csl-entry">

Cysouw M, & Good J (2013). Languoid, Doculect, and Glossonym: Formalizing the notion ‘language.’ *Language Documentation & Conservation*, *7*, 331–360. <http://hdl.handle.net/10125/4606>

</div>

<div id="ref-hartell1993" class="csl-entry">

Hartell RL, Ed (1993). *Alphabets des langues africaines*. UNESCO and Société Internationale de Linguistique.

</div>

<div id="ref-Hayes2009" class="csl-entry">

Hayes B (2009). *Introductory phonology*. Blackwell.

</div>

<div id="ref-Hockett1963" class="csl-entry">

Hockett CF (1963). The problem of universals in language. In J. H. Greenberg (Ed.), *Universals of language* (Vol. 2, pp. 1–29). Cambridge, MA: MIT Press.
</div> <div id="ref-Hyman2008" class="csl-entry"> Hyman LM (2008). Universals in phonology. *The Linguistic Review*, *25*(1-2), 83–137. doi:[10.1515/TLIR.2008.003](https://doi.org/10.1515/TLIR.2008.003) </div> <div id="ref-Maddieson1984" class="csl-entry"> Maddieson I (1984). *Patterns of sounds*. Cambridge Studies in Speech Science and Communication. New York: Cambridge University Press. <http://ebooks.cambridge.org/ref/id/CBO9780511753459> </div> <div id="ref-MaddiesonPrecoda1990" class="csl-entry"> Maddieson I, & Precoda K (1990). Updating UPSID. *UCLA Working Papers in Phonetics*, *74*, 104–111. <http://escholarship.org/uc/item/71s1701m> </div> <div id="ref-saphon" class="csl-entry"> Michael L, Stark T, & Chang W (2012). South American phonological inventory database. University of California. <http://linguistics.berkeley.edu/saphon/en/> </div> <div id="ref-MoisikEsling2011" class="csl-entry"> Moisik SR, & Esling JH (2011). The ‘whole larynx’ approach to laryngeal features. *Proceedings of the 17th International Congress of Phonetic Sciences* (pp. 1406–1409). Hong Kong. </div> <div id="ref-Moran2012" class="csl-entry"> Moran S (2012). *Phonetics information base and lexicon* (Doctoral Dissertation). University of Washington, Seattle, WA. <http://hdl.handle.net/1773/22452> </div> <div id="ref-nikolaev_etal2015" class="csl-entry"> Nikolaev D, Nikulin A, & Kukhto A (2015). The database of Eurasian phonological inventories. <http://eurasianphonology.info/> </div> <div id="ref-ramaswami1999" class="csl-entry"> Ramaswami N (1999). *Common linguistic features in Indian languages: Phonetics*. Central Institute of Indian Languages. </div> <div id="ref-Round2019" class="csl-entry"> Round E (2019). Australian phonemic inventories contributed to PHOIBLE 2.0: Essential explanatory notes (version 1.0). <http://doi.org/10.5281/zenodo.3464333> </div> <div id="ref-Terrill1998" class="csl-entry"> Terrill A (1998). *Biri*. Languages of the world. Materials. 
München: LINCOM Europa. <https://nla.gov.au/nla.cat-vn573959> </div> </div>
44.264621
226
0.633491
eng_Latn
0.935748
e97c0f4e0553a5324c1a7ac9f634d7e25dd6236b
810
md
Markdown
content/terms/wiki.md
mikeysan/developer-glossary
2d38b6adc7fbe1dd1f4dd381c960389b8e588a4d
[ "MIT" ]
28
2020-10-03T04:06:09.000Z
2021-12-09T14:19:04.000Z
content/terms/wiki.md
mikeysan/developer-glossary
2d38b6adc7fbe1dd1f4dd381c960389b8e588a4d
[ "MIT" ]
69
2020-10-03T04:14:30.000Z
2021-01-05T19:58:56.000Z
content/terms/wiki.md
mikeysan/developer-glossary
2d38b6adc7fbe1dd1f4dd381c960389b8e588a4d
[ "MIT" ]
61
2020-10-03T06:36:31.000Z
2021-10-04T20:47:39.000Z
---
title: "Wiki"
date: 2020-10-11T23:28:00+02:00
part-of-speech: noun
---

A wiki is a hypertext publication collaboratively edited and managed by its own audience directly using a web browser. A typical wiki contains multiple pages covering the subjects or scope of the project, and it may be either open to the public or limited to use within an organization for maintaining its internal knowledge base.

## Example

"Take some more [[tea]]," the March Hare said to Alice, very earnestly.

"I've had '''nothing''' yet," Alice replied in an offended tone, "so I can't take more."

"You mean you can't take ''less''," said the Hatter. "It's very easy to take ''more'' than nothing."

## Further Reading

- Here's a [Wikipedia article](https://en.wikipedia.org/wiki/Wiki#History) that talks about the first wiki page.
38.571429
320
0.733333
eng_Latn
0.997943
e97c7de339c3e994c7b0ba0b797e880efcda864a
402
md
Markdown
README.md
dnsplus/probe-localdns
733a0d3ced1e5d9c3bb454041d518c5bb849a47d
[ "MIT" ]
null
null
null
README.md
dnsplus/probe-localdns
733a0d3ced1e5d9c3bb454041d518c5bb849a47d
[ "MIT" ]
null
null
null
README.md
dnsplus/probe-localdns
733a0d3ced1e5d9c3bb454041d518c5bb849a47d
[ "MIT" ]
null
null
null
# probe-localdns

This project demonstrates how to discover a client's local DNS resolver address using an HTTP server.

## Deploy Project

### DNS Settings

Create an NS record for a `domain` name pointing at the address of the deployed probe-localdns server.

### Install dependencies

```bash
npm install
```

### Running

```bash
npm run server <domain> <ip>
```

## License

Released under the [MIT License](http://opensource.org/licenses/MIT).
13.4
77
0.711443
eng_Latn
0.896654
e97ee2ea64489dae7d3ed07c40257786c896e492
1,015
md
Markdown
documentation/testing-policies.md
b-entangled/kyverno
fd2661dfce3e5b62ed5244537ed4634f588f53e6
[ "Apache-2.0" ]
null
null
null
documentation/testing-policies.md
b-entangled/kyverno
fd2661dfce3e5b62ed5244537ed4634f588f53e6
[ "Apache-2.0" ]
null
null
null
documentation/testing-policies.md
b-entangled/kyverno
fd2661dfce3e5b62ed5244537ed4634f588f53e6
[ "Apache-2.0" ]
null
null
null
<small>*[documentation](/README.md#documentation) / Testing Policies*</small>

# Testing Policies

The resource definitions for testing are located in the [test](/test) directory. Each test contains a pair of files: one is the resource definition, and the other is the Kyverno policy for that definition.

## Test using kubectl

To do this, you should first [install Kyverno to the cluster](installation.md).

For example, to test the simplest Kyverno policy for `ConfigMap`, create the policy and then the resource itself via `kubectl`:

````bash
cd test
kubectl create -f policy/policy-CM.yaml
kubectl create -f resources/CM.yaml
````

Then compare the original resource definition in `CM.yaml` with the actual one:

````bash
kubectl get -f resources/CM.yaml -o yaml
````

## Test using Kyverno CLI

The Kyverno CLI allows testing policies before they are applied to a cluster. It is documented at [Kyverno CLI](kyverno-cli.md).

<small>*Read Next >> [Policy Violations](/documentation/policy-violations.md)*</small>
32.741935
207
0.758621
eng_Latn
0.98359
e98028e8ac5d1907258fdcf69177eaf51ab70e71
32
md
Markdown
README.md
seki/Masaki
111fe82524eb30cf7b0e11dae6448dc0080d6453
[ "MIT" ]
null
null
null
README.md
seki/Masaki
111fe82524eb30cf7b0e11dae6448dc0080d6453
[ "MIT" ]
1
2021-07-15T08:30:52.000Z
2021-07-15T08:30:52.000Z
README.md
seki/Masaki
111fe82524eb30cf7b0e11dae6448dc0080d6453
[ "MIT" ]
null
null
null
# Masaki Pokemon Card Game Util
10.666667
22
0.78125
vie_Latn
0.227746
e9804101d02aaf1089466f1343f63b698219a9f0
2,807
md
Markdown
content/home/about-us.md
ciselab/ciselab-www
47dc8ece4488ff30faa16798c1bc6463c18e6662
[ "MIT" ]
null
null
null
content/home/about-us.md
ciselab/ciselab-www
47dc8ece4488ff30faa16798c1bc6463c18e6662
[ "MIT" ]
null
null
null
content/home/about-us.md
ciselab/ciselab-www
47dc8ece4488ff30faa16798c1bc6463c18e6662
[ "MIT" ]
null
null
null
---
widget: blank
headless: true
weight: 20
title: About us
subtitle: ""
design:
  # Choose how many columns the section has. Valid values: 1 or 2.
  columns: "2"
---

The development, maintenance, and testing of large software products involve many activities that are complex, expensive, and error-prone. For example, complex systems (e.g., autonomous cars) are typically built as a composition of features that tend to interact and impact one another's behavior in unknown ways. Detecting feature-interaction failures with manual testing becomes infeasible and too expensive as the number and the complexity of the features increase.

There are many tribes of AI (namely Symbolists, Evolutionists, Bayesians, Kernel Conservatives, and Connectionists). In the CISELab, we focus on applying **Computational Intelligence** (CI) to automate expensive development activities, since more development automation requires fewer human resources. One of the most common approaches to such automation is **Search-Based Software Engineering** (SBSE), which reformulates traditional software engineering tasks as search (optimization) problems. Then, **CI algorithms** (e.g., genetic algorithms, genetic programming, simulated annealing) are used to automate the process of discovering problems (e.g., detecting software defects) and building optimal solutions (e.g., software fixes).

SBSE is not only an academic research area; it is also achieving significant uptake in many industrial sectors. For example, **Facebook** uses multi-objective solvers to automatically design system-level test cases for mobile apps [[1]](https://link.springer.com/chapter/10.1007/978-3-319-99241-9_1); **Google** uses multi-objective solvers for regression testing [[2]](http://sebase.cs.ucl.ac.uk/fileadmin/crest/sebasepaper/YooNH11_01.pdf).
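To make the search reformulation concrete, here is a toy sketch (ours, not the lab's actual tooling): test-input selection is cast as a search problem whose fitness is the number of branches covered, and a simple hill climber searches for better inputs. The function under test and all names are invented for illustration.

```python
# Toy SBSE sketch: cast test-input selection as a search problem and apply a
# simple hill climber. The function under test and all names are invented.
import random

def branches_hit(x):
    """Which branches of a tiny function-under-test does input x exercise?"""
    hit = {"gt100"} if x > 100 else {"le100"}
    if x % 7 == 0:
        hit.add("div7")
    return hit

def fitness(suite):
    """Fitness: number of distinct branches covered by the whole test suite."""
    covered = set()
    for x in suite:
        covered |= branches_hit(x)
    return len(covered)

def hill_climb(suite, steps=500, seed=0):
    rng = random.Random(seed)
    best, best_fit = list(suite), fitness(suite)
    for _ in range(steps):
        cand = list(best)
        cand[rng.randrange(len(cand))] += rng.choice([-50, -3, -1, 1, 3, 50])
        if fitness(cand) >= best_fit:  # accept sideways moves too
            best, best_fit = cand, fitness(cand)
    return best, best_fit

suite, score = hill_climb([1, 2])
```

In real SBSE work the fitness would be measured coverage, mutation score, or a multi-objective combination, and the hill climber would be replaced by a genetic algorithm or another metaheuristic.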
SBSE techniques have also been applied in the automotive domain (**IEE S.A.** [[3]](https://pure.tudelft.nl/portal/files/45811366/paperASE18N2016pdf.pdf)), in the satellite domain (**SES S.A.** [[4]](https://pure.tudelft.nl/admin/files/47344874/main.pdf)), and in security testing.

### Topics

At the Computational Intelligence Lab, our research topics include but are not limited to the following:

- **Blockchain Testing and Analysis**
- **Testing for Machine Learning**
- **Test Case Generation and Fuzzing** (i.e., unit, system, and integration level)
- **Testing Cyber Physical Systems** (e.g., self-driving cars)
- **Automated Program Repair** (including genetic programming)

### Organization

The CISELab is part of the [Software Engineering Research Group (SERG)](https://se.ewi.tudelft.nl/). SERG is in turn part of the Faculty of Electrical Engineering, Mathematics, and Computer Science (EEMCS) within the [Delft University of Technology (TU Delft)](https://www.tudelft.nl/).
68.463415
325
0.778055
eng_Latn
0.988486
e980cb972949b33c55259cc341172b78246d537d
1,632
md
Markdown
2017/CVE-2017-11318/README.md
blackarrowsec/advisories
bd19a6406405ca4bf2939425b018e1e41b225db2
[ "MIT" ]
25
2019-11-14T14:26:13.000Z
2021-10-10T22:20:49.000Z
2017/CVE-2017-11318/README.md
blackarrowsec/advisories
bd19a6406405ca4bf2939425b018e1e41b225db2
[ "MIT" ]
null
null
null
2017/CVE-2017-11318/README.md
blackarrowsec/advisories
bd19a6406405ca4bf2939425b018e1e41b225db2
[ "MIT" ]
11
2019-11-15T09:17:24.000Z
2021-06-21T13:49:12.000Z
# CVE-2017-11318: Remote Command Execution in Cobian Backup

[![](https://img.shields.io/badge/Attack%20Vector-Adjacent%20Network-yellow?style=flat-square)]() [![](https://img.shields.io/badge/Privileges%20Required-None-red?style=flat-square)]() [![](https://img.shields.io/badge/User%20Interaction-No-red?style=flat-square)]()

__Vendor:__ CobianSoft<br>
__Vendor URL:__ https://www.cobiansoft.com<br>
__Versions affected:__ Cobian Backup 11<br>
__Discovered by:__ Juan Manuel Fernandez ([@TheXC3LL](https://twitter.com/TheXC3LL))<br>
__Public fix:__ No<br>
__Proof of Concept:__ Yes ([ref](https://github.com/blackarrowsec/advisories/blob/master/2017/CVE-2017-11318/CVE-2017-11318.py)) <br>

## Summary

An attacker can execute arbitrary commands on a remote machine in the same network via a spoofed master server.

## Details

An attacker can add and execute new backup tasks when the master server is spoofed (via Man-in-the-Middle). The commands are executed using the pre-backup events defined in a new task.

## Impact

An attacker can execute arbitrary commands on a target machine.

## Recommendation

This software has been unsupported since 2014, so this vulnerability will not be fixed.

## Timeline

# [![](https://img.shields.io/badge/www-blackarrow.net-E5A505?style=flat-square)](https://www.blackarrow.net) [![](https://img.shields.io/badge/twitter-@BlackArrowSec-00aced?style=flat-square&logo=twitter&logoColor=white)](https://twitter.com/BlackArrowSec) [![](https://img.shields.io/badge/linkedin-@BlackArrowSec-0084b4?style=flat-square&logo=linkedin&logoColor=white)](https://www.linkedin.com/company/blackarrowsec/)
49.454545
419
0.76777
yue_Hant
0.392451
e980ef9b8f9347c2d370c4e9cb3f2be09316f28b
737
md
Markdown
problems/Move Zeroes.md
SAKSHIC/revise-algorithm-for-problem-solving-round
fe165bc4633ad76ba739ac38522b6cafc6237465
[ "MIT" ]
1
2020-05-28T10:20:22.000Z
2020-05-28T10:20:22.000Z
problems/Move Zeroes.md
SAKSHIC/revise-algorithm-for-problem-solving-round
fe165bc4633ad76ba739ac38522b6cafc6237465
[ "MIT" ]
null
null
null
problems/Move Zeroes.md
SAKSHIC/revise-algorithm-for-problem-solving-round
fe165bc4633ad76ba739ac38522b6cafc6237465
[ "MIT" ]
1
2020-06-16T20:17:37.000Z
2020-06-16T20:17:37.000Z
# Move Zeroes

Given an array `nums`, write a function to move all `0`'s to the end of it while maintaining the relative order of the non-zero elements.

For example, given `nums = [0, 1, 0, 3, 12]`, after calling your function, `nums` should be `[1, 3, 12, 0, 0]`.

**Note:**

1. You must do this in-place without making a copy of the array.
2. Minimize the total number of operations.

**Solution:**

```java
public class Solution {
    public void moveZeroes(int[] nums) {
        // i: position where the next non-zero element belongs;
        // j: scanning pointer over the whole array.
        int i = 0, j = 0;
        while (j < nums.length) {
            if (nums[j] != 0) {
                swap(nums, i++, j);  // pull the non-zero element forward
            }
            j++;
        }
    }

    void swap(int[] nums, int i, int j) {
        int tmp = nums[i];
        nums[i] = nums[j];
        nums[j] = tmp;
    }
}
```
137
0.572592
eng_Latn
0.981093
e98114cfde8a9c6678e03d5457e1b95225cf4ce6
881
md
Markdown
content/jobs/in28Minutes/index.md
debrupofficial365/debrupofficial3650.github.io
4fa105f216f4ed901c86b01110f00677e8d26797
[ "MIT" ]
1
2021-06-22T12:18:30.000Z
2021-06-22T12:18:30.000Z
content/jobs/in28Minutes/index.md
debrupofficial365/debrupofficial3650.github.io
4fa105f216f4ed901c86b01110f00677e8d26797
[ "MIT" ]
null
null
null
content/jobs/in28Minutes/index.md
debrupofficial365/debrupofficial3650.github.io
4fa105f216f4ed901c86b01110f00677e8d26797
[ "MIT" ]
null
null
null
--- date: '2018-05-10' title: 'Teaching Assistant, Code Reviewer and Mentor' company: ' in28Minutes Official, Udemy Inc.' range: 'April 2021 - Present' location: 'remote' url: 'https://www.udemy.com/user/in28minutes/' --- - Serving as a Teaching Assistant for multiple in28Minutes Official Courses at Udemy. 1. Java Programming for Complete Beginners - Java 16 2. Go Java Full Stack with Spring Boot and React 3. Go Java Full Stack with Spring Boot and Angular - Responsibilities: 1. Helping more than 2 lakh+ students and clearing their doubts related to Java, Full Stack Web Development with Spring Boot Angular, Spring Boot React, reviewing code, suggesting best practices, and helping them with project work. - Working under Ranga Karanam Rao, founder of in28Minutes Official. - View [LETTER OF RECOMMENDATION](https://github.com/debrupofficial365/in28Minutes_TA_statement)
38.304348
96
0.782066
eng_Latn
0.875434
e98167f6b6640769f619bdbc04fb06a24016312e
1,749
md
Markdown
doc/update/ruby.md
erbunao/gitlabhq
696b9903f08011e37811dc8b8ff4f7da77201d13
[ "MIT" ]
1
2020-11-12T04:37:32.000Z
2020-11-12T04:37:32.000Z
doc/update/ruby.md
taiyangc/gitlabhq
260c877cab3f4c1d05ab3d984a2fad5a1e018e5e
[ "MIT" ]
null
null
null
doc/update/ruby.md
taiyangc/gitlabhq
260c877cab3f4c1d05ab3d984a2fad5a1e018e5e
[ "MIT" ]
null
null
null
# Updating Ruby from source

This guide explains how to update Ruby in case you installed it from source according to the [instructions](../install/installation.md#2-ruby).

### 1. Look for Ruby versions

This guide will only update `/usr/local/bin/ruby`. You can see which Ruby binaries are installed on your system by running:

```bash
ls -l $(which -a ruby)
```

### 2. Stop GitLab

```bash
sudo service gitlab stop
```

### 3. Install or update dependencies

Here we are assuming you are using Debian/Ubuntu.

```bash
sudo apt-get install build-essential zlib1g-dev libyaml-dev libssl-dev libgdbm-dev libreadline-dev libncurses5-dev libffi-dev curl
```

### 4. Download, compile and install Ruby

Find the latest stable version of Ruby 1.9 or 2.0 at https://www.ruby-lang.org/en/downloads/. We recommend at least 2.0.0-p353, which is patched against [CVE-2013-4164](https://www.ruby-lang.org/en/news/2013/11/22/heap-overflow-in-floating-point-parsing-cve-2013-4164/).

```bash
cd /tmp
curl --progress http://cache.ruby-lang.org/pub/ruby/2.0/ruby-2.0.0-p353.tar.gz | tar xz
cd ruby-2.0.0-p353
./configure --disable-install-rdoc
make
sudo make install # overwrite the existing Ruby in /usr/local/bin
sudo gem install bundler
```

### 5. Reinstall GitLab gem bundle

Just to be sure, we will reinstall the gems used by GitLab. Note that the `bundle install` command [depends on your choice of database](../install/installation.md#install-gems).

```bash
cd /home/git/gitlab
sudo -u git -H rm -rf vendor/bundle # remove existing Gem bundle
sudo -u git -H bundle install --deployment --without development test mysql aws # Assuming PostgreSQL
```

### 6. Start GitLab

We are now ready to restart GitLab.

```bash
sudo service gitlab start
```

### Done
31.8
271
0.738136
eng_Latn
0.891512
e982313021aa86f8e8b83e3394f81f843eb6577e
70,714
markdown
Markdown
_posts/2012-08-22-accessing-content-in-a-contentaware-mesh.markdown
api-evangelist/patents-2012
b40b8538ba665ccb33aae82cc07e8118ba100591
[ "Apache-2.0" ]
3
2018-09-08T18:14:33.000Z
2020-10-11T11:16:26.000Z
_posts/2012-08-22-accessing-content-in-a-contentaware-mesh.markdown
api-evangelist/patents-2012
b40b8538ba665ccb33aae82cc07e8118ba100591
[ "Apache-2.0" ]
null
null
null
_posts/2012-08-22-accessing-content-in-a-contentaware-mesh.markdown
api-evangelist/patents-2012
b40b8538ba665ccb33aae82cc07e8118ba100591
[ "Apache-2.0" ]
4
2018-04-05T13:55:08.000Z
2022-02-10T11:30:07.000Z
---
title: Accessing content in a content-aware mesh
abstract: Content in a content-aware mesh may be accessed and/or manipulated. In one embodiment, a node may receive access to each of a plurality of images that are distributed among at least two nodes of a mesh. The at least two nodes may not be part of the same service. Access to each of the images may be performed without the node locally storing all of the images. The node may display an image of the plurality of images via a uniform interface without indication as to which of the nodes the image is stored on.
url: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&f=G&l=50&d=PALL&S1=09390155&OS=09390155&RS=09390155
owner: Adobe Systems Incorporated
number: 09390155
owner_city: San Jose
owner_country: US
publication_date: 20120822
---

Digital cameras are now available in many devices (e.g., cell phones, tablet devices, etc.) beyond dedicated point-and-shoot or DSLR purpose-built devices. This has led to photographs being stored on multiple devices and/or computers as well as various network services. Moving all of one's photographs to a centrally managed repository is often impractical because of the effort involved, especially with large collections. Moreover, many people do not want to move assets away from an existing collection or service because friends and family are accustomed to viewing them on a particular service. Consolidation, management, and a consolidated view of a user's photos across devices and platforms are not available with current solutions.

This disclosure describes techniques and structures for accessing and managing content in a content-aware mesh. In one embodiment, a node may receive access to each of a plurality of images that are distributed among at least two nodes of a mesh. The at least two nodes may not be part of the same service.
Access to each of the images may be performed without the node locally storing all of the images. The node may display an image of the plurality of images via a uniform interface without indication as to which of the nodes the image is stored on.

In some embodiments, the node may locally store information regarding each of the images in a database that is synchronized to the mesh. Information may include an identifier and a location pointer for each of the images. The node may access the images in an offline mode, in one embodiment. The node may also access images that are stored on external services.

While the disclosure is described herein by way of example for several embodiments and illustrative drawings, those skilled in the art will recognize that the disclosure is not limited to the embodiments or drawings described. It should be understood that the drawings and detailed description thereto are not intended to limit the embodiments to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure. The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning "having the potential to") rather than the mandatory sense (i.e., meaning "must"). Similarly, the words "include," "including," and "includes" mean "including, but not limited to." As used throughout this application, the singular forms "a," "an," and "the" include plural referents unless the content clearly indicates otherwise. Thus, for example, reference to "an element" includes a combination of two or more elements. In the following detailed description, numerous specific details are set forth to provide a thorough understanding of claimed subject matter.
However, it will be understood by those skilled in the art that claimed subject matter may be practiced without these specific details. In other instances, methods, apparatuses, or systems that would be known by one of ordinary skill have not been described in detail so as not to obscure claimed subject matter. Some portions of the detailed description which follow are presented in terms of algorithms or symbolic representations of operations on binary digital signals stored within a memory of a specific apparatus or special purpose computing device or platform. In the context of this particular specification, the term "specific apparatus" or the like includes a general purpose computer once it is programmed to perform particular functions pursuant to instructions from program software. Algorithmic descriptions or symbolic representations are examples of techniques used by those of ordinary skill in the signal processing or related arts to convey the substance of their work to others skilled in the art. An algorithm is here, and is generally, considered to be a self-consistent sequence of operations or similar signal processing leading to a desired result. In this context, operations or processing involve physical manipulation of physical quantities. Typically, although not necessarily, such quantities may take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, or otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, data, values, elements, symbols, characters, terms, numbers, numerals, or the like. It should be understood, however, that all of these or similar terms are to be associated with appropriate physical quantities and are merely convenient labels.
Unless specifically stated otherwise, as apparent from the following discussion, it is appreciated that throughout this specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," or the like refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic computing device. In the context of this specification, therefore, a special purpose computer or a similar special purpose electronic computing device is capable of manipulating or transforming signals, typically represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the special purpose computer or similar special purpose electronic computing device.

"First," "Second," etc. As used herein, these terms are used as labels for nouns that they precede and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, in a content-aware mesh that includes multiple nodes, the terms "first" and "second" nodes can be used to refer to any two of the multiple nodes. In other words, the "first" and "second" nodes are not limited to logical nodes 0 and 1.

"Based On." As used herein, this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors. Consider the phrase "determine A based on B." While B may be a factor that affects the determination of A, such a phrase does not foreclose the determination of A from also being based on C. In other instances, A may be determined based solely on B.

Various embodiments for accessing and/or manipulating content in a content-aware mesh are described. This specification first describes a mesh of collaborating nodes (e.g., computing devices such as tablet devices, cell phones, computers, etc.)
that each may participate in the creation and management of a catalog that spans assets distributed on the various nodes. The specification then describes non-destructive editing techniques and techniques for browsing content. Various examples and applications are also disclosed.

Some of the techniques may, in some embodiments, be implemented by program instructions stored in a computer-readable storage medium and executable by one or more processors (e.g., one or more CPUs or GPUs) of a computing apparatus. The computer-readable storage medium may store program instructions executable by the one or more processors to cause the computing apparatus to perform the various techniques as described herein. Other embodiments may be at least partially implemented by hardware circuitry and/or firmware stored, for example, in a non-volatile memory.

Turning now to the figures, a block diagram illustrates an example content collaboration system, according to some embodiments of the present disclosure. In the illustrated embodiment, the system includes four nodes. Note that in other embodiments, the system may include other numbers of nodes. Nodes may also be referred to herein as computing devices, clients, and/or peers. In some embodiments, one of the nodes may be referred to as a coordinating node. Each of the nodes may be coupled to one another via a network (not shown). The network may include any channel for providing effective communication between each of the entities of the system. In some embodiments, the network may include an electronic communication network, such as the internet, a local area network (LAN), wireless LAN (WLAN), WiMAX network, cellular communications network, or the like. For example, the network may include an internet network used to facilitate communication between each of the entities of the system. As shown, a node may also be communicatively coupled to an information management system (IMS), an application store, and external services.
Application store may also be communicatively coupled to each of nodes and . System may be referred to as a content aware mesh in that the relationship among components of the mesh may not be in a typical client server arrangement. Instead, the various components may be peers in the mesh. In other embodiments, node may operate as a server with client nodes and . Further, in some embodiments (e.g., for multi dimensional browsing), the structure of the system may be any type of system, even a single node system. Content aware refers to the system's expectation of handling certain types of content (e.g., images, video, etc.). Certain optimizations, as described herein, may result from the system being content aware. In the content aware mesh there may be more than one home for the content. The mesh may collect and/or connect the content (e.g., images) from the plurality of nodes so that a user of the mesh has a single view into all of the content from anywhere (e.g., connected to the internet or offline) on any capable device. Although the illustrated system includes four nodes, other examples include two nodes, three nodes, five nodes, or any other number of nodes. Example nodes may include a number of devices, such as tablet devices, cellular phones, set top boxes, televisions, cameras, printers, and computers, among others. For example, a person using the content aware mesh may have a cellular phone, tablet device, and a laptop as three nodes that may connect to node and/or external services via the content aware mesh. Each node may include various components (e.g., processor(s), program instructions, I/O interface, etc.) that are not illustrated, such as the components shown in . Each node may be configured to implement a user interface such that a user may view, edit, or otherwise manipulate content that is distributed among the various nodes. Example user interfaces are shown in . The nodes may be rich clients. For example, nodes and may each include a database, shown as databases and , respectively.
Databases and may be databases that are configured to maintain a mapping of content e.g. images stored on the various nodes. As one example node may include memory not shown that stores photos A A node may include memory not shown that stores photos B B and node may include memory not shown that stores photos C C. Databases and may each be configured to include a mapping of the full set of photos e.g. photos A A B B and C C . The mapping may include file name identifiers e.g. A and other information such as file location e.g. node size file type timestamp associated renditions and or revisions etc. The mapping may be structured as a database as shown or as another type of data store. The databases and may be updated periodically or may be updated upon occurrence of an event e.g. new photos added to the collection photos removed from the collection edits to any of the photos addition or deletion of a node addition or deletion of an authorized user etc. . Node may be a coordinating node. In some embodiments node may be a cloud based web service. As shown other nodes may access node via firewall . Node may include a number of application servers which may be load balanced by load balancer . Application servers may be elastic compute cloud EC2 servers. Applications servers may be coupled to load balancer application store IMS external services via proxy node database servers and binary storage . Database servers may likewise be EC2 servers and may be coupled to database storage . Binary storage may be reduced redundancy storage RRS and database storage may be elastic block storage EBS . As with the mapping that may be stored by nodes and node may be configured to store a similar mapping of content. For example node may be configured to store in binary storage and or database storage a mapping of content including identifiers location of content size etc. Node may also be configured to store at least some of the content. 
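The per-node mapping described above can be sketched as a small catalog keyed by asset identifier, synchronized across nodes. This is a minimal illustrative sketch, not the patent's actual implementation; the field names and the timestamp-based merge rule are assumptions.

```python
# Minimal sketch of a per-node catalog mapping asset identifiers to metadata.
# Field names (node, size, timestamp, renditions) are illustrative assumptions.
def make_entry(node, size, timestamp, renditions=()):
    return {"node": node, "size": size, "timestamp": timestamp,
            "renditions": list(renditions)}

def merge_catalogs(local, remote):
    """Merge two catalogs, keeping the newer entry when both have one."""
    merged = dict(local)
    for asset_id, entry in remote.items():
        if asset_id not in merged or entry["timestamp"] > merged[asset_id]["timestamp"]:
            merged[asset_id] = entry
    return merged

node_a = {"A1": make_entry("node_a", 1200, 10)}
node_b = {"B1": make_entry("node_b", 900, 12),
          "A1": make_entry("node_a", 1300, 11)}  # newer metadata for A1
full_view = merge_catalogs(node_a, node_b)       # each node sees the full set
```

After the merge, either node's database describes the full collection even though the image bytes themselves remain distributed.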
In some embodiments node may further be configured to store various renditions created by the various nodes. As used herein a rendition may include a low resolution e.g. less than full resolution half the resolution as the original etc. version of the original image and or a thumbnail of that version. The rendition may be based on one or more edits changes modifications to the original image. The renditions may be stored without replacing the original image. In doing so editing images may be non destructive such that the original content is maintained. Databases and may be synchronized to the corresponding database of node to maintain a synchronous mapping. The load balanced servers of node may operate as a single unit to perform the following services to any peer node store a user s authentication information and ensure that access to the user s resources is restricted to only those who possess the necessary credentials store the user s subscription information and automatically bill the user for the service e.g. yearly monthly etc. store a copy of each user s account state catalogs of images assets within each catalog and revisions and renditions within those assets provide powerful query capabilities to enable other mesh nodes to retrieve that information while consuming minimal network resources and provide additional digital imaging services such as format conversion scaling rendering and so on such that other nodes lacking those capabilities can still effectively insert into and interact with the content stored in the mesh. Although node may include additional capabilities over other nodes of the mesh in some embodiments it may not be a central repository of all the mesh s information. Note that the disclosed non destructive editing and browsing techniques may be implemented where a single node does serve as a central repository. 
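The rendition concept above (a lower-resolution preview plus a thumbnail, kept alongside the untouched original) can be sketched as follows. The image is modeled as a 2D list of pixel values and the downscale factors are arbitrary assumptions for illustration.

```python
# Sketch: generate a half-resolution rendition and a thumbnail from an
# original image (modeled as a 2D list of pixel values) without replacing it.
def downscale(image, factor):
    # Keep every factor-th row and column (crude nearest-neighbor shrink).
    return [row[::factor] for row in image[::factor]]

def make_rendition(original, edit_label):
    return {
        "edit": edit_label,                   # textual description of the change
        "preview": downscale(original, 2),    # lower-resolution version
        "thumbnail": downscale(original, 4),  # even smaller version
    }

original = [[(x + y) % 256 for x in range(8)] for y in range(8)]
rendition = make_rendition(original, "sepia")  # original list is never mutated
```

The original stays intact; the rendition is an additional, cheaper-to-transmit artifact.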
External service s also referred to as a non collaborating node s may include various third party services that may be used to store content. For instance external services may include Facebook Flickr Google among others. In each of those examples a user may maintain content albums e.g. images video etc. on the external service. Application servers may access the content stored on or by external services via proxy node . Accordingly as an example a user of node may access images on external services via proxy node of node manipulate those images and update the images based on the manipulations. Note that such manipulations may be non destructive as described herein. Node may have knowledge of the application programming interfaces APIs provided by external services . The mesh may have access to external services via proxy node by maintaining log in information and synchronizing to the account in a manner that is transparent to a user of the mesh. The various nodes of the mesh may update content on external services and vice versa. From a developer s standpoint the developer may not need to know details of the external service s API as that interaction is abstracted by node and handled server to server for increased performance. In addition that layer of abstraction may shield the other mesh nodes from incompatible changes to those external service APIs. Proxy node may provide a view into the content stored by external service s and allow for access and management of that content. In one embodiment assets may not be moved away from the originating service but instead an access token may be maintained. The assets may be referenced within the rest of the content shared amongst the collaborating nodes of the mesh. This may allow a comprehensive view of the user s entire asset collection regardless of which service actually stores the assets and without having to move them or disrupt their existing use of those services. 
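The proxy-node idea above (hold an access token and reference external assets in place, rather than copying them) might look like the following sketch. The `ExternalService` class and its method names are invented stand-ins, not any real service's API.

```python
# Sketch of a proxy node that holds an access token for an external service
# and exposes that service's assets to the mesh by reference, without copying.
class ExternalService:
    """Stand-in for a third-party photo service (hypothetical API)."""
    def __init__(self, assets):
        self._assets = assets
    def list_assets(self, token):
        if token != "valid-token":
            raise PermissionError("bad token")
        return list(self._assets)

class ProxyNode:
    def __init__(self, service, token):
        self._service = service
        self._token = token          # maintained on behalf of the user
    def references(self):
        # Return references into the external service; the assets stay put.
        return [{"asset": a, "stored_at": "external"}
                for a in self._service.list_assets(self._token)]

flickr_like = ExternalService(["F1", "F2"])
proxy = ProxyNode(flickr_like, "valid-token")
refs = proxy.references()
```

Other mesh nodes consume `refs` exactly like entries for locally stored assets, which is what makes the storage location transparent.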
Although proxy node is illustrated within node , it may reside elsewhere in other embodiments. IMS may provide a mechanism to authenticate a user of a node to access content of the content aware mesh. For example, users of node and may be family members who have been approved to access the content, e.g., by an owner of the collection (e.g., the user of node ). Each user may be prompted to log in prior to accessing the content. A request to access content from a node may pass through node to IMS, or may go straight to IMS. IMS may be configured to grant or deny access to the distributed content based on a user's credentials. Note that a user of a node, for example node , may access locally stored content on that node even when not granted access to the content distributed on various nodes. Application store may include a number of applications that may be used to access the content aware mesh. For example, a user of a node may purchase an application from application store . The application may be executed on one of the nodes such that the node may access the distributed content of the content aware mesh. Purchase may be a one time event or may be a subscription (e.g., daily, weekly, monthly, yearly, etc.). Any subscription may also play a role in authentication by IMS. For example, a node may be running an appropriate application to access the content, but without a valid subscription. Upon attempting to access the content, IMS may deny the node's access without a valid subscription. One embodiment of accessing content of a collaborative content sharing system is now described. While the blocks are shown in a particular order for ease of understanding, other orders may be used. In some embodiments, the method may include additional or fewer blocks than shown. Blocks may be performed automatically or may receive user input. The method may be performed in some embodiments by a node (e.g., node of ).
As illustrated, access may be received to a plurality of images that are distributed among at least two nodes of a content aware mesh. In one embodiment, the two nodes are not part of a same service. For example, if a single node is made up of multiple servers of the same service (e.g., a user's Flickr account), the multiple servers may not be counted as separate nodes; instead, the multiple servers of the same service may collectively be considered as a single node. Accordingly, if a user of Flickr, as part of a Flickr photo album, has images stored on several Flickr controlled servers, those images are considered to be stored on a single node. Examples of nodes that may be included as one of the nodes may be nodes and , or and , or proxy node that corresponds to external service (the proxy node may represent the Flickr servers in the above example as a single node). In a simple example, the at least two nodes may be node and node . In such an example, the images may be distributed between nodes and . In other examples, a user may have a variety of devices (e.g., tablet, phone, computer), each representing a node. Or, a family of users may be linked to the mesh such that user A has a device (node ), user B has a device (node ), and user C has a device (node ), with each device coupled to each other device and to device . An example image collection may include 25,000 images. 20,000 may be stored at node and 5000 may be stored at node . In one embodiment, node may be a user device, such as a tablet device, while node may reside on a cloud service. Access to the images may be received by a node without locally storing all of the images. Thus, in the 25,000 image collection example above, node may have access to all 25,000 images without locally storing all 25,000. Thus, access to the 5000 that are stored remotely from node may be received by node . Access may include the ability to edit images, including non destructive editing as described herein.
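The access-without-local-storage scheme described above can be sketched as a view that resolves assets through location pointers and fetches remote data only on demand. The `remote_fetch` callable is a hypothetical stand-in for a network request to the node storing the original.

```python
# Sketch: a node's view into the full collection uses location pointers and
# fetches image data only on demand, so access does not require local copies.
class MeshView:
    def __init__(self, catalog, local_store, remote_fetch):
        self.catalog = catalog    # asset id -> owning node (location pointer)
        self.local = local_store  # asset id -> bytes, for locally stored assets
        self.fetch = remote_fetch # stand-in for a network request
    def open(self, asset_id):
        if asset_id in self.local:
            return self.local[asset_id]
        # Remote asset: fetch for display without permanent local storage.
        return self.fetch(self.catalog[asset_id], asset_id)

fetched = []
def remote_fetch(node, asset_id):
    fetched.append((node, asset_id))
    return b"pixels-of-" + asset_id.encode()

view = MeshView({"A1": "node_1", "B1": "node_2"},
                {"A1": b"local-A1"}, remote_fetch)
local_bytes = view.open("A1")   # served locally, no network call
remote_bytes = view.open("B1")  # triggers exactly one remote fetch
```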
In some embodiments node may receive and locally store at least temporarily an image that is being edited. In other embodiments editing may be performed by node without having a local copy of the original image stored locally. In some embodiments receiving access may be via a coordinating node of the mesh. For example node of may be a coordinating node of mesh . As described herein the coordinating node may include a plurality of load balanced servers. Receiving access may include requesting access and receiving authorization from IMS . In other embodiments nodes may directly access content from other nodes without going through a coordinating node. In those embodiments a similar authentication may take place or the nodes may be initially configured as trusted nodes e.g. on a home network by MAC address etc. . At an image of the plurality of images may be displayed. Display of the image may be performed via a uniform interface without indication as to which of the at least two nodes the image is stored on. Accordingly the source of the image e.g. external service s node etc. may be transparent to a user of the uniform interface such that the full collection of images may appear to be collocated even though they are distributed among the nodes. Note that the uniform interface may display more than a single image at a time. See A E for an example of a single image displayed without indication as to which node the image is stored on and for a multiple image display example. Displaying of an image or images on an interface of a node may be performed in an offline mode e.g. when the node is not connected to the mesh . For example any images that are currently being edited by the node may be locally stored on the node. Or a user of the node may initiate an offline preparation technique that allows some content to be downloaded from the various nodes to a node in anticipation of an offline period e.g. a flight without Wi Fi access etc. . 
Upon reconnecting to the mesh, the node may automatically synchronize the plurality of images to the mesh. For example, synchronizing may include uploading textual information describing edits to content. Synchronization may take place via a cloud service (e.g., node ) or may be performed directly between peer nodes (e.g., node , node , proxy node , etc.). In one embodiment, automatically synchronizing may include the node notifying other nodes of the mesh of any changes the node made to the plurality of images. Automatically synchronizing may also include receiving an update to any of the plurality of images that changed (e.g., by other nodes) during the time the node was not connected to the mesh. Note that synchronization may also occur at other times, such as periodically (e.g., daily, hourly, etc.) or upon an event (e.g., a node's initial connection to the mesh, submitting edit information, etc.), and is not limited to a node reconnecting to the mesh. In some embodiments, information regarding each of the plurality of images may be stored in a database of the node receiving access. As described herein, the database may be a mapping that includes a variety of information, such as an identifier and a location pointer for each image. The database of a given node may be synchronized to the mesh. For instance, each node may include a database that is synchronized to the other databases such that each node has a complete view of available content. The database may also be synchronized to a master copy of the mapping. In one embodiment, each node's database may be synchronized to the database of a coordinating node. As such, the coordinating node may be configured to store information regarding each of the images. In various embodiments, edits may be received to an image via the uniform interface. The image may be stored remotely at another node, and a lower resolution copy of the image may be available locally at the node that is performing the edits.
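The offline queue and reconnect synchronization described above can be sketched as follows. Edits made while disconnected are held as textual records and flushed on reconnect; the record shape is an assumption for illustration.

```python
# Sketch of offline operation: edits made while disconnected are queued as
# textual descriptions and flushed to the mesh when the node reconnects.
class OfflineNode:
    def __init__(self):
        self.online = False
        self.pending = []
    def edit(self, asset_id, change):
        record = {"asset": asset_id, "change": change}
        if self.online:
            return [record]          # sent to the mesh immediately
        self.pending.append(record)  # queued for later upload
        return []
    def reconnect(self):
        self.online = True
        sent, self.pending = self.pending, []
        return sent                  # textual edit descriptions to upload

node = OfflineNode()
node.edit("A1", {"contrast": 10})   # queued while offline
node.edit("A2", {"tag": "beach"})   # queued while offline
uploaded = node.reconnect()         # both flushed on reconnect
```

Only small textual records cross the network; full-resolution pixels never need to be re-sent.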
Or a local copy of the original image may be used to perform the edits. In any event textual data corresponding to the edits may be transmitted to the other node that is storing the image. The other node may be configured to apply the textual data to the image resulting in an updated version of the image. Note that the updated version may not replace the original but instead may be another version. The textual data may be transmitted directly to the other node or it may be performed via the coordinating node of the mesh. For example the coordinating node may pass the textual data to external services via a proxy node. In embodiments where the coordinating node provides the textual data to another node it may first store the textual data in its database. Transmission of data may be highly selective. Because each node of the mesh may maintain a synchronized database and is aware of the mesh s contents the node may selectively request just the asset components e.g. revisions renditions etc. that it wants to display a catalog. For example a user of node may wish to edit an image on node . Node may download the original version of the image from node to edit the image. Other images that the user of node only wishes to view may not be downloaded locally to node . Or a user of node may wish to view a series of edits made by other nodes to images stored locally on node . Node may only download the revisions and apply them locally or may download the low resolution renditions. Either way network resources may be conserved as opposed to a node downloading many full resolution images or the entire collection of images. In one embodiment an interface may allow the node implementing the method of to invite other nodes to join the mesh. Those other nodes may represent other devices of the user of the node or other users. This may allow the other devices and or other users to access and edit the content. 
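The selective transmission described above (request only the asset components a node actually needs) can be sketched with a small lookup. The component names and intent categories are illustrative assumptions.

```python
# Sketch: because the catalog is synchronized, a node can request only the
# components it needs, e.g. renditions for browsing and the original only
# when the user actually edits. Component and intent names are assumptions.
def components_to_request(intent, locally_present):
    wanted = {"browse": ["thumbnail"],
              "preview": ["thumbnail", "rendition"],
              "edit": ["thumbnail", "rendition", "original"]}[intent]
    return [c for c in wanted if c not in locally_present]

edit_needed = components_to_request("edit", {"thumbnail"})
browse_needed = components_to_request("browse", {"thumbnail"})
```

A browsing node with cached thumbnails downloads nothing; an editing node pulls only the missing rendition and original, conserving network resources.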
One embodiment for facilitating content sharing in a collaborative content sharing system is now described. While the blocks are shown in a particular order for ease of understanding, other orders may be used. In some embodiments, the method may include additional or fewer blocks than shown. Blocks may be performed automatically or may receive user input. The method may be performed in some embodiments by a node (e.g., a coordinating node, such as node of ). As illustrated, information associated with each image of a plurality of images may be stored by the coordinating node, which, as described herein, may be a cloud based plurality of load balanced servers. As described above, the images may be distributed among at least two nodes (which are not part of a same service) of a content aware mesh. As described herein, information may include file name identifiers and other information, such as file location (e.g., node), size, file type, timestamp, associated renditions and/or revisions, etc. The coordinating node may also store authentication information. Providing access to the images to various nodes may be based on the stored authentication information. Authentication information may include MAC addresses, stored usernames and/or passwords, subscription information, access tokens, encryption certificates, etc. Each node of the mesh may have a corresponding set of authentication information that is stored at the coordinating node. In one embodiment, access may be limited to nodes that the coordinating node authenticates. Access to each image may then be provided to a node of the mesh. In one embodiment, access may be provided by the coordinating node. Providing access may be performed without the coordinating node transmitting all of the images to the node. For example, the node may have access to images that are stored on other nodes without locally storing all of those images.
In some embodiments the coordinating node may be configured to provide access to each image to another node of the mesh. The coordinating node may receive a request by that other node to access the images. Such a request may be in the form of the other node providing log in credentials or may take place automatically upon the other node connecting to the mesh. If the other node is authorized the coordinating node may provide access to the images. Providing access may be performed without transmitting all of the images to the other node. As shown at the node may be configured to display an image of the plurality of images via a uniform interface without indication as to which of the nodes the image is stored on. The uniform interface may be the interface described at block of . Example interfaces are shown in . In one embodiment the node may be configured to edit the images. As part of editing the images the node may be configured to generate textual data describing an update to the image e.g. an image that is stored remotely or one that is stored locally . The coordinating node may receive the textual data from the node without receiving the original image being edited. The coordinating node may then store the textual data. In one embodiment for example if the original image being edited is stored remotely from the node and coordinating node the coordinating node may provide the textual data to the node storing the original image. The node that stores the original image may be configured to generate an updated version of the image e.g. a less than full resolution version . In one embodiment the coordinating node may update the stored information associated with an image. Updating the stored information may include adding a new set of textual data representing edits to an image. For instance textual data corresponding to a previous edit may have already existed in the database. 
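The relay described above, in which the coordinating node stores the textual edit data and forwards it to whichever node holds the original so that node can render a new lower-resolution version alongside it, might look like this sketch. Class and field names are invented for illustration.

```python
# Sketch of the coordinating-node relay: store textual edit data, forward it
# to the owning node, which renders a derived version next to the original.
class OwnerNode:
    def __init__(self, originals):
        self.originals = originals   # asset -> full-resolution pixels (kept)
        self.versions = {}           # asset -> list of derived versions
    def render(self, asset_id, change):
        half = self.originals[asset_id][::2]  # stand-in for real rendering
        self.versions.setdefault(asset_id, []).append(
            {"change": change, "preview": half})

class CoordinatingNode:
    def __init__(self):
        self.edit_log = []           # stored textual data
    def receive(self, asset_id, change, owner):
        self.edit_log.append((asset_id, change))  # store in its database
        owner.render(asset_id, change)            # forward to owning node

owner = OwnerNode({"A1": [1, 2, 3, 4]})
coord = CoordinatingNode()
coord.receive("A1", {"contrast": 0.2}, owner)
```

The original pixel list is untouched; the derived version accumulates next to it, which is the non-destructive property the text emphasizes.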
The previously stored textual data may be merged with the currently received textual data to form a list of edits. Such edits may both refer to the original version of the image, or the later received edits may build upon a previously made edit. Edits may be non destructive in that making an edit and storing the corresponding textual data may not overwrite the original image or previous edits. Note that, outside the editing context, an original image and/or previous edits may be replaced or deleted if explicit instructions to do so are received. In one embodiment, the coordinating node may receive a request from the node to perform image processing on one or more of the images, and the coordinating node may perform such image processing. Image processing may include applying the textual data to the original image to generate a modified version of the image without replacing the original image. The modified version may be a low resolution preview. Image processing may also include generating a thumbnail of the modified image. In other embodiments, the node may be configured to generate the textual data, preview, and/or thumbnail and provide each to the coordinating node. The coordinating node may receive the textual data, preview, and/or thumbnail and store each of them. Or, as described herein, the coordinating node may serve as a conduit to provide the textual data to the node storing the original image. In one embodiment, a proxy node may be created by the coordinating node to connect the mesh to a non collaborating node or external service (e.g., Flickr, Google, etc.). A token may be maintained by the coordinating node that allows nodes of the mesh to access images stored by the external service. Such access may allow a user to transparently access those images. The content aware mesh may provide a number of benefits. As one example, four users may wish to share content among each other, and each user may wish to access the content from a number of their own devices, e.g.,
tablet device, computer. Thus, the mesh supports a multi user, multi device configuration. User A may refuse to use any service other than Flickr; user B may prefer to use a different service; user C may spread images across a variety of services; and user D may also spread images across a variety of services and store other images locally on multiple devices. The content aware mesh may provide full access to the content in a seamless interface such that it is transparent to Users A-D where the actual content is stored. The content aware mesh may also allow users to operate in an offline manner. Further, network traffic may be minimized by allowing selective transmission of data (e.g., only transmitting textual data describing changes instead of full resolution modified images). One embodiment of non destructive content editing is now described. While the blocks are shown in a particular order for ease of understanding, other orders may be used. In some embodiments, the method may include additional or fewer blocks than shown. Blocks may be performed automatically or may receive user input. The method may be used in conjunction with the other methods described herein; accordingly, a combination of some or all of the blocks may be used in some embodiments. In one embodiment, a node such as node may perform the method. As illustrated, an input may be received via an interface displaying an image of a plurality of images. The input may be received by a node among a plurality of nodes in a distributed collaborative environment. The input may indicate a change regarding the image. The change may affect the visual appearance of the image. Changes may include adding a tag to an image (e.g., outdoor scene, beach, etc.) and/or may include substantive edits such as contrast, white balance, exposure, etc. Example inputs that indicate a change regarding the image are described below. A selection of the Looks button of the example user interface is illustrated.
Selection of Looks presents a variety of look options, including normal, silvered, dream, and sepia, among others. Also illustrated is a selection of the Adjustments button of the example user interface. Selection of Adjustments may cause additional selection options to appear that allow a user of the interface to adjust sliders for white balance, exposure, and contrast. An automatic adjust button may also be included in the interface. Selection of Exposure may present further options, such as exposure, highlights, and shadows. A selection of a crop and rotate feature is also illustrated, along with further possible selections. Each of these examples includes an apply button, which may allow a user to view the changes. In other embodiments, adjusting a slider or clicking auto may automatically cause the display to update to reflect the changes. The examples also show a back button and a cancel button, to allow edits to be canceled and to exit the editing mode. As shown, the node may receive another input via the interface to finalize the change. An example of such an input is illustrated. A Develop button may be presented which allows a user to finalize the change. The button may be presented, for example, where a user has made edits and has exited from the editing mode. In response to the other input, the node may generate a rendition that reflects the change applied to the image without replacing the original version of the image. In applying the change without replacing the original version of the image, the edit may be referred to as non destructive. The rendition may include a thumbnail and a rendered preview of an adjusted version of the image based on the change. Each of the thumbnail and the rendered preview may be a lower resolution than the original image. In other embodiments, another node (e.g., a coordinating node or node storing the original image) may generate the rendition.
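The slider adjustments above can be sketched on grayscale pixel values. The exposure and contrast formulas below are common simple definitions assumed for illustration; the patent does not specify the math.

```python
# Sketch of slider adjustments applied to grayscale pixel values (0..255).
# Exposure is a multiplicative gain; contrast scales values about mid-gray.
def adjust(pixels, exposure=1.0, contrast=0.0):
    out = []
    for p in pixels:
        v = p * exposure                      # exposure: multiplicative gain
        v = (v - 128) * (1 + contrast) + 128  # contrast: scale about 128
        out.append(max(0, min(255, round(v))))  # clamp to the valid range
    return out

result = adjust([0, 128, 200], exposure=1.0, contrast=0.5)
```

Because `adjust` returns a new list, the original pixel data is untouched, mirroring the non-destructive rendition generation described above.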
In various embodiments the node may transmit textual data corresponding to the change to another node of the plurality of nodes. The textual data may be metadata or other data. The textual data may be transmitted alone to the other node. Or in one embodiment the rendition that includes the thumbnail and rendered preview may also be transmitted to the other node. The other node may be configured to store the textual data and the rendition. In some embodiments the other node may be configured to receive other textual data corresponding to other changes and or another rendition received from a node other than the node. For instance the other node may be a coordinating node e.g. node the node e.g. node or proxy node . The other node may be configured to receive textual data and or renditions from a variety of different nodes. The textual data and renditions from various nodes may relate to the same image. For example a user of node and a user of node may each edit the same image during an overlapping or non overlapping time period. Each of node and node may perform blocks and and may transmit corresponding textual data and or a rendition each relating to the same image. In one embodiment the node may receive an indication that a different rendition or multiple different renditions of the image that reflects a different change was created by a different node and is currently stored in a data store. An example scenario of when this may occur is the aforementioned overlapping editing of the same image by the node and a different node. Users of nodes and may choose to edit the same image at the same time. The user of node may finish editing the image and finalize the changes resulting in textual information being transmitted to the coordinating node. The coordinating node may then transmit an indication that a different rendition of the image that reflects the change made by node is available. That indication may be received by node . 
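The textual data transmitted between nodes, as described above, might be a small serialized record that is cheap to send compared with re-transmitting edited pixels. JSON and the field names below are illustrative assumptions, not the patent's wire format.

```python
import json

# Sketch: the "textual data" describing a change as a small JSON record.
# Field names (asset, base_revision, edits) are illustrative assumptions.
change = {
    "asset": "A1",
    "base_revision": 3,
    "edits": [{"op": "white_balance", "value": 5200},
              {"op": "crop", "rect": [10, 10, 400, 300]}],
}
wire = json.dumps(change, sort_keys=True)  # what actually crosses the network
received = json.loads(wire)                # reconstructed at the other node
```

A record like this is a few hundred bytes regardless of image resolution, which is why transmitting edits as text conserves network resources.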
The node receiving that indication may then receive a selection to the interface indicating which rendition to keep. Such a selection may be received at the time the node receives the other input to finalize its change. The selection may indicate to keep the rendition from the different node, to keep both (or multiple ones) of the renditions, or to only keep the rendition from the node and delete/replace the rendition(s) from the different node(s). A selection to keep the rendition from the node may include transmitting the textual data and rendition to the other node. Note that multiple changes may be received before the changes are finalized. In such a case, the multiple changes (e.g., changes to white balance and contrast, etc.) may be transmitted as a single revision (e.g., a single rendition and textual information that includes the multiple changes). In one embodiment, the multiple changes may come from more than one device. For instance, a user may begin making edits on a tablet device (e.g., node ) and finish making edits on their laptop (e.g., node ). In such an example, the first set of changes made on the tablet device may be transmitted directly to the laptop without sending them through the coordinating node. In other embodiments, the first set of changes may be made available to the laptop via the coordinating node. Instead of a finalize input, a user may choose a save changes option, or similar option, such that the in progress edits may be continued from another node. Another embodiment of non destructive content editing is now described. While the blocks are shown in a particular order for ease of understanding, other orders may be used. In some embodiments, the method may include additional or fewer blocks than shown. Blocks may be performed automatically or may receive user input. The method may be used in conjunction with the other methods described herein; accordingly, a combination of some or all of the blocks may be used in some embodiments.
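The rendition-conflict choice described above, where two nodes have edited the same image and the user picks which renditions survive, can be sketched as a small resolution function. The selection names are assumptions for illustration.

```python
# Sketch of rendition-conflict resolution: when two nodes edited the same
# image, the user chooses to keep both renditions, theirs, or the other's.
def resolve(existing, incoming, selection):
    if selection == "keep_both":
        return existing + [incoming]
    if selection == "keep_incoming":
        return [incoming]
    if selection == "keep_existing":
        return existing
    raise ValueError(selection)

existing = [{"by": "node_2", "look": "sepia"}]
incoming = {"by": "node_1", "look": "black_and_white"}
both = resolve(existing, incoming, "keep_both")      # both renditions kept
mine = resolve(existing, incoming, "keep_incoming")  # replace the other's
```

Whatever the selection, only renditions are affected; the original image is never a candidate for replacement here.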
In one embodiment a coordinating node such as node of may perform the method of . As illustrated at textual information that describes a change regarding an image of a plurality of images may be received from a node in a distributed collaborative environment e.g. the content aware mesh of . The textual information may describe a change regarding an image of a plurality of images. As described at block changes may include adding a tag to an image e.g. outdoor scene beach etc. and or may include substantive edits such as contrast white balance exposure etc. The textual information may also be referred to herein as collections of settings or adjustment parameters. As shown at the received textual information may be stored in a data store. The data store may be a database such as one or more of binary storage and or database storage . Other information may be stored in the same data store. For instance as described herein a mapping of content distributed among various nodes may be stored in the data store. The received textual information may correspond to a given image of the plurality of images. Other received textual information and or other rendition s may also exist in the data store. For instance the other textual information and or rendition s may correspond to other images of the plurality of images or may correspond to other edits to the same image. Those other edits to the same image may have been received from any one or more of the nodes. An editor using node may make a number of different edits to the same image and upload the textual information corresponding to each edit. One edit may correspond to a black and white version another edit to a sepia version and another edit to a cropped version. In other examples various edits may come from different nodes. For example different family members who are members of the mesh may practice different editing principles and may have different ideas of how to properly edit a photo. 
Accordingly, each family member may submit a different edit via their respective node. In one embodiment, receiving textual information corresponding to an image for which textual information is already stored (e.g., the image has previously been edited) may cause the coordinating node to send an indication to the node indicating that other textual information exists. As described above, the node may select one or more renditions and/or textual information to keep in the data store. Accordingly, the coordinating node may receive a selection to keep one or multiple ones of the renditions and/or textual information. If the selection is to replace a previously stored rendition and/or textual information, the storing may include replacing the previously stored textual information. If the selection is to keep one or more previously stored renditions and/or textual information, then the storing may include storing the new rendition and/or textual information in addition to the already stored rendition(s)/textual information. Storing multiple edits may include appending metadata from multiple pieces of textual information into a single new piece of textual metadata, or storing multiple edits may include separately storing the textual information in the data store. Application of the textual information to an original version of the image may result in a modified version of the image in addition to the original version. In this manner, editing may be non-destructive in that the original version is maintained. Application of the textual information may be provided by the node, the coordinating node, and/or a different node. Examples of these scenarios follow. As described herein, whichever node applies the textual information may do it such that the edits are non-destructive. In one embodiment, the received textual information may be provided to another node.
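The storing behavior described above can be sketched in a few lines. This is a minimal illustration, not any product's actual API: textual edit records (collections of settings) are keyed by image, a new record may be appended alongside existing edits or replace them, and the store reports whether prior edits already existed so the coordinating node can notify the submitting node.

```python
# Illustrative sketch of the coordinating node's edit data store.
# All class and field names here are hypothetical.

class EditStore:
    def __init__(self):
        self._edits = {}  # image id -> list of textual edit records

    def store(self, image_id, settings, replace=False):
        """Store textual information for an image.

        Returns True if other edits already existed for the image,
        so the caller can indicate a potential conflict to the node.
        """
        existing = self._edits.get(image_id, [])
        had_conflict = bool(existing)
        if replace:
            self._edits[image_id] = [settings]
        else:
            self._edits[image_id] = existing + [settings]
        return had_conflict

    def edits_for(self, image_id):
        return list(self._edits.get(image_id, []))

store = EditStore()
store.store("img1", {"contrast": 10})
conflict = store.store("img1", {"white_balance": "cloudy"})
print(conflict, len(store.edits_for("img1")))  # True 2
```

The same structure supports both policies named in the text: appending keeps every submitted edit as a separate record, while `replace=True` models the selection that discards previously stored textual information.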
The other node may be configured to apply the textual information to the original version of the image, resulting in a modified version of the image in addition to the original version. As one example, the coordinating node may receive textual information corresponding to a change to an image from one node. The coordinating node may then provide the textual information to another node to apply the textual information to the original version. Note that in this example, the node receiving the textual information may store the original version of the image that was edited by the first node. In some embodiments, in addition to receiving textual information from the node, the modified version of the image, in the form of a rendition, may also be received from the node. In such embodiments, the node that generated the textual information may also generate the rendition that includes the modified version and/or a thumbnail. The modified version may be a low-resolution version (e.g., less than full resolution, less than the resolution of the original version of the image). The rendition may be stored. Other nodes may have access to view such renditions so that they may not have to perform their own image processing to view what the modified version looks like. Each other node/client may be made aware by the coordinating node of the presence of new renditions and/or textual information. This may be done at the time the new rendition and/or textual information is received, or periodically, or upon some other event (e.g., a query by a node, a node entering an edit mode for a particular image, etc.). The stored textual information may be a linear list of changes that have been made. Such stored information may be accessible by various nodes, such that a node may view available renditions and/or a list of metadata/collections of settings. The metadata and/or collection of settings may be translated into human-understandable information descriptive of the changes. In some instances, a change may not result in a newly generated preview or rendition.
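The non-destructive property above comes down to one invariant: adjustments are applied to a copy of the original data, never in place. The following sketch shows that invariant with a toy one-channel "image" and hypothetical adjustment names; it stands in for real image processing, which would work the same way per pixel.

```python
# Illustrative sketch of non-destructive editing: textual information
# (a dict of adjustment parameters, names hypothetical) is applied to a
# copy of the original pixels, producing a rendition while the original
# version survives untouched.

def apply_settings(original_pixels, settings):
    """Return a modified rendition; never mutates the original."""
    pixels = list(original_pixels)  # copy, so the original is preserved
    if "brightness" in settings:
        delta = settings["brightness"]
        pixels = [max(0, min(255, p + delta)) for p in pixels]
    if settings.get("invert"):
        pixels = [255 - p for p in pixels]
    return pixels

original = [0, 128, 250]
rendition = apply_settings(original, {"brightness": 10})
print(original)   # [0, 128, 250]  -- unchanged
print(rendition)  # [10, 138, 255] -- 250 + 10 clamped to 255
```

Because the original list is never mutated, any number of renditions (black-and-white, sepia, cropped) can be derived from the same original, matching the multiple-edits scenario described earlier.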
For example, a user of a node may tag a photo, which may not visually affect the image. In such an example, the textual information may simply be the new tag, which may be uploaded by a node and received by the coordinating node. Because a new rendition and thumbnail may not be generated if the image is not visually affected, the node may not transmit, and the coordinating node may not receive, a rendition and thumbnail with such a change. In one embodiment, the coordinating node may receive the textual information and also have access to the original version of the image. Accordingly, the coordinating node may apply the textual information to the original version, resulting in the modified version. The disclosed non-destructive editing techniques may permit multiple users to exchange multiple versions of edits in a collaborative fashion. Additionally, using lower-resolution renditions allows clients to display the edits without needing to re-render them locally and results in less required storage space. Moreover, the disclosed conflict resolution allows multiple access at the same time and provides a manner to resolve conflicts. Turning now to one embodiment, browsing content is illustrated. While the blocks are shown in a particular order for ease of understanding, other orders may be used. In some embodiments, the method may include additional or fewer blocks than shown. Blocks may be performed automatically or may receive user input. The method may be used in conjunction with the other methods described herein; accordingly, a combination of some or all of their blocks may be used in some embodiments. In one embodiment, a node such as the node described above may perform the method. As illustrated, images from a plurality of images may be assigned into a plurality of groups or collections. Each of the groups may include at least one of the images. The groups may be defined by a tag, such that each respective image belonging to that group includes the same tag.
Note that images may include multiple tags but may be grouped according to a common tag with other images of the same group. In one embodiment, a tag may include a date (e.g., the date the photo was taken, the date the photo was downloaded to the node, etc.). The date may reside in the image's metadata. The granularity of a group by date may vary. For example, a group may be defined by a year, a month, or a date the image was taken, among others. A tag may be a keyword, such as outdoor scene, 40th birthday party, etc. Tagging may be by status, such as which photos have been uploaded to an external service. Tagging may be automatically performed (e.g., by database query) or may be user specified (e.g., defined by a user as being from an event like a wedding, birthday party, etc.). In one embodiment, tagging may initially be automatically performed but may be refined manually. Renditions may be assigned to the same group as the corresponding original image. Within each group, the images may be partitioned into a sequence. For example, the images may be ordered by timestamp of the images. The timestamp may be the time the image was taken or the time the image was last edited. Or images may be ordered in a different manner, such as by file size or name. An interface that includes first and second dimensions may be displayed. Various examples are shown in the accompanying figures. The interface may allow browsing in the first dimension among the plurality of groups. The interface may also allow browsing in the second dimension among one or more images within a group. As one example of an interface with two dimensions (e.g., vertical and horizontal), groups may be scrolled vertically, and within each group, images may be scrolled horizontally. Browsing or scrolling may be accomplished by various inputs (e.g., mouse pointer, gestures such as touch gestures, etc.). Note that an interface may include three dimensions and have browsing ability in each of the three dimensions (e.g., x, y, and z).
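The grouping and partitioning steps above can be sketched directly: each tag defines a group, an image with multiple tags lands in multiple groups, and each group is ordered by timestamp. The record fields (`tags`, `timestamp`, `name`) are illustrative, not from any real schema.

```python
# Illustrative sketch of assigning images to tag-defined groups and
# partitioning each group into a timestamp-ordered sequence.

from collections import defaultdict

def assign_groups(images):
    groups = defaultdict(list)
    for img in images:
        # An image with multiple tags belongs to multiple groups.
        for tag in img["tags"]:
            groups[tag].append(img)
    # Partition each group into a sequence ordered by timestamp.
    for tag in groups:
        groups[tag].sort(key=lambda img: img["timestamp"])
    return dict(groups)

images = [
    {"name": "a.jpg", "tags": ["beach"], "timestamp": 20},
    {"name": "b.jpg", "tags": ["beach", "family"], "timestamp": 10},
    {"name": "c.jpg", "tags": ["family"], "timestamp": 30},
]
groups = assign_groups(images)
print([i["name"] for i in groups["beach"]])   # ['b.jpg', 'a.jpg']
print([i["name"] for i in groups["family"]])  # ['b.jpg', 'c.jpg']
```

Swapping the sort key (file size, name, date modified) changes the within-group sequence without touching the grouping itself, which mirrors the alternative orderings the text mentions.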
In one embodiment, the interface may display as many images (e.g., thumbnails) as may fit on the display. The thumbnails may be fixed width or, in some embodiments, may be variable width. For a variable-width thumbnail implementation, a layout may be created beforehand using the aspect ratios of the images. In contrast to an implementation in which an entire set of images is downloaded, in one embodiment the images that are visible on the display for the group or groups may be the ones that are downloaded. Other images may later be prefetched and fetched as needed, as described herein. This may allow large amounts of data, as can be the case with image and/or video content, to be efficiently handled by not having to populate it all at once. A request to prefetch image data in at least one of the dimensions may be generated. Generation of the request to prefetch data may be based on an anticipation to display the image data. Anticipation to display the image data may be based on a number of factors, such as the speed and/or direction of browsing and/or the use of variable-width thumbnails. As an example, if a user is slowly browsing images of one of the groups from left to right on the display and uses a gesture to bring in images to the display screen from off the right side of the screen, images that are next in the sequence from off-screen right may be prefetched. If a user is rapidly swiping from right to left to move much farther into the images of that group, the immediately next images in the sequence may not be prefetched, based on the speed of the browsing. Instead, those images may be skipped, and image data corresponding to images later in the sequence may be prefetched. In some embodiments, to generate the prefetch requests, the node may know the full list of groups and associated images. The node may calculate the full list, for example, by fetching the information over the web or by calculating it by using a database query.
Based on which tracks are visible on the screen, those images may be fetched. A default view may exist such that starting images are already fetched and ready to display. One example default view may be the topmost group(s) and leftmost image(s) of those group(s). In one embodiment, some amount of overscan may be performed, such that image data for nearby images is prefetched in one or more of the dimensions to allow a user of the interface to scroll in any direction and readily have those nearby images available. If the interface is not actively being manipulated to scroll in any direction, prefetches may occur in each direction, because it may not be known which direction is most likely to be displayed next. If the interface is actively being scrolled vertically, then most or all of the prefetching may be done in the vertical dimension, in the same direction of scroll (e.g., up or down). Similarly, if the interface is actively being scrolled horizontally, then most or all of the prefetching may be done in the horizontal dimension, in the same direction of scroll (e.g., left or right). As noted above, if the interface is being scrolled rapidly, then nearby image data may not need to be prefetched, and instead the prefetch requests may be generated for image data corresponding to images farther away in the sequence. In various embodiments, the request to prefetch image data may be executed and the image data may be prefetched. The display may then be updated with the prefetched image data. Note that according to the prefetch parameters, the prefetch may be completed before the data is needed for display. As a result, the display may be updated later and not necessarily immediately upon prefetching the data. For instance, assume a display of a node allows 3 rows of 5 images to be displayed at one time. Let the middle row be the row being scrolled and prefetched. The middle row may include thousands of images.
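The overscan policy above reduces to a small decision function: idle means prefetch a little in every direction, an active scroll means prefetch ahead in that direction, and a rapid scroll means skip the adjacent images and reach farther into the sequence. The thresholds and offset distances below are illustrative, not values from the text.

```python
# Illustrative sketch of the overscan/prefetch-direction policy.
# Offsets are positions in the image sequence relative to the current
# view; thresholds are hypothetical tuning parameters.

def prefetch_offsets(scrolling, direction=None, speed=0.0,
                     near=2, far=10, fast_threshold=5.0):
    """Return sequence offsets to prefetch next."""
    if not scrolling:
        # Idle: overscan a little in both directions, since the next
        # scroll direction is unknown.
        return [-near, -1, 1, near]
    step = 1 if direction == "forward" else -1
    if speed >= fast_threshold:
        # Rapid scroll: skip nearby images, prefetch farther ahead.
        return [step * far, step * (far + near)]
    # Slow scroll: prefetch the immediately next images in the
    # direction of scroll.
    return [step * 1, step * near]

print(prefetch_offsets(scrolling=False))              # [-2, -1, 1, 2]
print(prefetch_offsets(True, "forward", speed=1.0))   # [1, 2]
print(prefetch_offsets(True, "forward", speed=9.0))   # [10, 12]
print(prefetch_offsets(True, "backward", speed=9.0))  # [-10, -12]
```

In a full interface, the same function would be evaluated per dimension (vertical for groups, horizontal within a group), with only the active dimension getting most of the prefetch budget.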
If the middle row is being scrolled in a given dimension, the number of images to be prefetched, and which ones, may vary based on the rate of scroll. Note that the requests to prefetch may be individual requests for each image, such that the prefetches, each corresponding to a given image, may be offset by some time delta. But to keep the explanation simple, assume that the next five images in the sequence after the five currently displayed in the middle row may be prefetched at a time t. Only two of the images may be needed for display at time t+1, whereas the remaining three prefetched images may not be needed for display until time t+2. Accordingly, the three other prefetched images may be held in the node's local memory, ready to display at time t+2 (or whenever the display is in fact scrolled to where those images would be needed). In one embodiment, multiple fetches may be executed simultaneously and asynchronously. In one embodiment, the request to prefetch image data may be canceled. Canceling the request to prefetch image data may be based on a determination that prefetching the image data has not been initiated and that displaying the image data is no longer anticipated. One example scenario in which the request may be canceled includes a user scrolling within a group to view images off to the right of the screen. The user may subsequently stop scrolling and may change the direction of the scroll, such that the likelihood of displaying the next images in the sequence from the original scroll direction is very low. Note that if the prefetch has already begun, the request to prefetch may not be canceled, even if it is no longer anticipated that the image data will be displayed. In one example implementation, the prefetch requests may be handled in a last-in, first-out (LIFO) manner. In doing so, the cancel mechanism may help ensure that data that is not needed is not continually polled.
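The LIFO queue with cancellation described above can be sketched as follows. The rule it demonstrates: the most recently requested image is fetched first, and a request can be cancelled only while the fetch has not been initiated. Names are illustrative; a real implementation would run fetches asynchronously.

```python
# Illustrative sketch of a LIFO prefetch request queue with
# cancellation of not-yet-started requests.

class PrefetchQueue:
    def __init__(self):
        self._pending = []   # LIFO stack of requested image ids
        self._started = set()

    def request(self, image_id):
        self._pending.append(image_id)

    def cancel(self, image_id):
        """Cancel only if the fetch has not been initiated."""
        if image_id not in self._started and image_id in self._pending:
            self._pending.remove(image_id)
            return True
        return False

    def next_fetch(self):
        """Pop the most recent request (LIFO) and mark it started."""
        if not self._pending:
            return None
        image_id = self._pending.pop()
        self._started.add(image_id)
        return image_id

q = PrefetchQueue()
for img in ("img1", "img2", "img3"):
    q.request(img)
print(q.next_fetch())    # img3 -- last in, first out
print(q.cancel("img3"))  # False -- already started
print(q.cancel("img1"))  # True  -- not yet initiated, so cancelable
print(q.next_fetch())    # img2
```

Serving requests last-in, first-out favors whatever the user most recently scrolled toward, while the cancel path discards stale requests left over from an abandoned scroll direction.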
Such a mechanism may enable a more responsive interface, for example, when a user stops the browsing. In one embodiment, previously prefetched image data may be locally stored. For example, previously prefetched image data may be stored in a cache of the node (e.g., CPU cache, display cache, etc.). Such data may persist in local storage until a maximum capacity or an allowed usage amount (e.g., 80% of available cache space) has been reached, or until the local storage is cleared (manually or automatically). A user can manually configure the allowable size of the cache via the interface. One or more of these blocks may be performed dynamically to account for updates to the plurality of images. For example, the assigning of images to groups may be repeated upon receiving an update to the plurality of images. An example scenario in which this may occur is adding a new node that includes a number of additional images to the mesh. Those images may then be assigned to existing groups or newly added groups (e.g., if tags of the newly received images warrant creating a new group). Or all of the images may be regrouped when an update occurs. Another example scenario may be when one of the nodes generates a new rendition (e.g., according to the non-destructive editing techniques herein) of an image. Accordingly, the updated version of the image and the original version may coexist, such that the plurality of images includes both versions. The updated version may be assigned to the same group as the original version, or, if the grouping is by date modified or some other grouping, the newly created updated version may be assigned to a different group. An image of the plurality of displayed images in the interface may be selected to zoom to full-screen mode, resulting in a single image being displayed. This may be used to simply zoom in on an image or may be used to enter an editing mode, such as the disclosed non-destructive editing.
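The local-storage behavior above, where prefetched data persists until an allowed usage amount is reached, is commonly realized as a bounded least-recently-used cache. The sketch below uses a fixed byte budget and LRU eviction as one plausible policy; the text does not mandate LRU specifically, so treat the eviction rule and names as assumptions.

```python
# Illustrative sketch of a bounded local rendition cache: prefetched
# data persists until the byte budget is exceeded, then the least
# recently used entries are evicted.

from collections import OrderedDict

class RenditionCache:
    def __init__(self, max_bytes):
        self.max_bytes = max_bytes
        self._used = 0
        self._entries = OrderedDict()  # image id -> bytes, LRU order

    def put(self, image_id, data):
        if image_id in self._entries:
            self._used -= len(self._entries.pop(image_id))
        self._entries[image_id] = data
        self._used += len(data)
        # Evict least recently used entries until under budget.
        while self._used > self.max_bytes:
            _, evicted = self._entries.popitem(last=False)
            self._used -= len(evicted)

    def get(self, image_id):
        data = self._entries.get(image_id)
        if data is not None:
            self._entries.move_to_end(image_id)  # mark recently used
        return data

cache = RenditionCache(max_bytes=8)
cache.put("a", b"1234")
cache.put("b", b"5678")
cache.get("a")         # touch "a" so "b" becomes least recent
cache.put("c", b"90")  # over budget: evicts "b"
print(cache.get("b"))  # None
print(cache.get("a"))  # b'1234'
```

A user-configurable cache size, as the text describes, simply means `max_bytes` is set from the interface rather than hard-coded.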
The disclosed browsing techniques may allow for efficient handling and browsing of large amounts of data, even on nodes/devices with modest hardware, such as mobile devices (e.g., tablets, phones, laptops). By performing asynchronous prefetching, fetching, and fetch canceling, perceived performance of the interface may be very smooth. While examples of these methods may include implementations using the described content-aware mesh, the methods may also be implemented in a typical client-server arrangement or other arrangements (e.g., non-cloud-based). In a client-server arrangement, a node may be a server configured to centrally store all of the plurality of images instead of having the images distributed among the various nodes. Even in such an arrangement, the non-destructive editing may still take place in a collaborative manner. Moreover, the browsing techniques may be used in a client-server embodiment or in an embodiment with a single node. For example, the disclosed browsing techniques, including the prefetch mechanism, may be implemented by a single node to browse the images on that node. It will be appreciated that these methods are example embodiments of methods employed according to the techniques described herein. The methods may be modified to facilitate variations of their implementations and uses. For example, although some embodiments have been described with respect to images, the techniques may be applied for use with similar content. The methods may be implemented in software, hardware, or a combination thereof. Program instructions and/or data for implementing embodiments of data collection code and/or matching code as described herein may, for example, be stored on a computer-readable storage medium. A computer-readable storage medium may include storage media or memory media such as magnetic or optical media (e.g., disk or DVD/CD-ROM), volatile or non-volatile media such as RAM (e.g., SDRAM, DDR RDRAM, SRAM, etc.), ROM, etc.
Various portions of a content aware mesh non destructive editing techniques and or the disclosed browsing techniques may be executed on one or more computer systems which may interact with various other devices. One such computer system is illustrated by . For example nodes and or application store IMS node proxy node and or external services may each include employ or be executed on one or more computer systems. In the illustrated embodiment computer system includes one or more processors coupled to a system memory via an input output I O interface . Computer system further includes a network interface coupled to I O interface and one or more input output devices such as cursor control device keyboard audio device and display s . In some embodiments it is contemplated that embodiments may be implemented using a single instance of computer system while in other embodiments multiple such systems or multiple components making up computer system may be configured to host different portions or instances of embodiments. For example in one embodiment some elements may be implemented via one or more components of computer system that are distinct from those components implementing other elements. In various embodiments computer system may be a uniprocessor system including one processor or a multiprocessor system including several processors e.g. two four eight or another suitable number . Processors may be any suitable processor capable of executing instructions. For example in various embodiments processors may be general purpose or embedded processors implementing any of a variety of instruction set architectures ISAs such as the x86 PowerPC SPARC or MIPS ISAs or any other suitable ISA. In multiprocessor systems each of processors may commonly but not necessarily implement the same ISA. In some embodiments at least one processor may be a graphics processing unit. 
A graphics processing unit GPU may be considered a dedicated graphics rendering device for a personal computer workstation game console or other computer system. GPUs may be very efficient at manipulating and displaying computer graphics and their highly parallel structure may make them more effective than typical CPUs for a range of complex graphical algorithms. For example a graphics processor may implement a number of graphics primitive operations in a way that makes executing them much faster than drawing directly to the screen with a host central processing unit CPU . In various embodiments the methods disclosed herein for layout preserved text generation may be implemented by program instructions configured for execution on one of or parallel execution on two or more of such GPUs. The GPU s may implement one or more application programmer interfaces APIs that permit programmers to invoke the functionality of the GPU s . Suitable GPUs may be commercially available from vendors such as NVIDIA Corporation ATI Technologies and others. System memory may be configured to store program instructions and or data accessible by processor . In various embodiments system memory may be implemented using any suitable memory technology such as static random access memory SRAM synchronous dynamic RAM SDRAM nonvolatile Flash type memory or any other type of memory. In the illustrated embodiment program instructions and data implementing desired functions such as those described above for a layout preserved text generation method are shown stored within system memory as program instructions and data storage respectively. In other embodiments program instructions and or data may be received sent or stored upon different types of computer accessible media or on similar media separate from system memory or computer system . Generally speaking a computer accessible medium may include storage media or memory media such as magnetic or optical media e.g. 
disk or CD DVD ROM coupled to computer system via I O interface . Program instructions and data stored via a computer accessible medium may be transmitted by transmission media or signals such as electrical electromagnetic or digital signals which may be conveyed via a communication medium such as a network and or a wireless link such as may be implemented via network interface . In one embodiment I O interface may be configured to coordinate I O traffic between processor system memory and any peripheral devices in the device including network interface or other peripheral interfaces such as input output devices . In some embodiments I O interface may perform any necessary protocol timing or other data transformations to convert data signals from one component e.g. system memory into a format suitable for use by another component e.g. processor . In some embodiments I O interface may include support for devices attached through various types of peripheral buses such as a variant of the Peripheral Component Interconnect PCI bus standard or the Universal Serial Bus USB standard for example. In some embodiments the function of I O interface may be split into two or more separate components. In addition in some embodiments some or all of the functionality of I O interface such as an interface to system memory may be incorporated directly into processor . Network interface may be configured to allow data to be exchanged between computer system and other devices attached to a network e.g. network such as other computer systems or between nodes of computer system . In various embodiments network interface may support communication via wired or wireless general data networks such as any suitable type of Ethernet network for example via telecommunications telephony networks such as analog voice networks or digital fiber communications networks via storage area networks such as Fibre Channel SANs or via any other suitable type of network and or protocol. 
Input output devices may in some embodiments include one or more display terminals keyboards keypads touchpads scanning devices voice or optical recognition devices or any other devices suitable for entering or retrieving data by one or more computer system . Multiple input output devices may be present in computer system or may be distributed on various nodes of computer system . In some embodiments similar input output devices may be separate from computer system and may interact with one or more nodes of computer system through a wired or wireless connection such as over network interface . Memory may include program instructions configured to implement embodiments as described herein and data storage comprising various data accessible by program instructions . In one embodiment program instructions may include software elements of one or more methods illustrated in the above Figures. Data storage may include data that may be used in embodiments. In other embodiments other or different software elements and or data may be included. Those skilled in the art will appreciate that computer system is merely illustrative and is not intended to limit the scope of analytics data collection or analytics data matching methods as described herein. In particular the computer system and devices may include any combination of hardware or software that can perform the indicated functions including computers network devices internet appliances PDAs wireless phones pagers etc. Computer system may also be connected to other devices that are not illustrated or instead may operate as a stand alone system. In addition the functionality provided by the illustrated components may in some embodiments be combined in fewer components or distributed in additional components. Similarly in some embodiments the functionality of some of the illustrated components may not be provided and or other additional functionality may be available. 
Those skilled in the art will also appreciate that while various items are illustrated as being stored in memory or on storage while being used these items or portions of them may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively in other embodiments some or all of the software components may execute in memory on another device and communicate with the illustrated computer system via inter computer communication. Some or all of the system components or data structures may also be stored e.g. as instructions or structured data on a computer accessible medium or a portable article to be read by an appropriate drive various examples of which are described above. In some embodiments instructions stored on a computer accessible medium separate from computer system may be transmitted to computer system via transmission media or signals such as electrical electromagnetic or digital signals conveyed via a communication medium such as a network and or a wireless link. Various embodiments may further include receiving sending or storing instructions and or data implemented in accordance with the foregoing description upon a computer accessible medium. Accordingly the disclosed embodiments may be practiced with other computer system configurations. In some embodiments portions of the techniques described herein may be hosted in a cloud computing infrastructure. Various embodiments may further include receiving sending or storing instructions and or data implemented in accordance with the foregoing description upon a computer accessible medium. Generally speaking a computer accessible medium may include storage media or memory media such as magnetic or optical media e.g. disk or DVD CD ROM volatile or non volatile media such as RAM e.g. SDRAM DDR RDRAM SRAM etc. ROM etc. 
as well as transmission media or signals such as electrical electromagnetic or digital signals conveyed via a communication medium such as network and or a wireless link. The various methods as illustrated in the Figures and described herein represent example embodiments of methods. The methods may be implemented in software hardware or a combination thereof. The order of method may be changed and various elements may be added reordered combined omitted modified etc. Various modifications and changes may be made as would be obvious to a person skilled in the art having the benefit of this disclosure. It is intended that the embodiments embrace all such modifications and changes and accordingly the above description to be regarded in an illustrative rather than a restrictive sense.
docs/src/crit-guide/crit-up-control-plane-node.md
rbtr/crit
Running `crit up` with a [control plane configuration](configuring-control-plane-components.md) performs the following steps:

| Step | Description |
| ---- | ----------- |
| ControlPlanePreCheck | Validate configuration |
| CreateOrDownloadCerts | Generate CAs; if already present, don't overwrite |
| CreateNodeCerts | Generate certificates for Kubernetes components; if already present, don't overwrite |
| StopKubelet | Stop the kubelet using systemd |
| WriteKubeConfigs | Generate control plane kubeconfigs and the admin kubeconfig |
| WriteKubeletConfigs | Write kubelet settings |
| StartKubelet | Start the kubelet using systemd |
| WriteKubeManifests | Write static pod manifests for the control plane |
| WaitClusterAvailable | Wait for the control plane to be available |
| WriteBootstrapServerManifest (optional) | Write the crit bootstrap server pod manifest |
| DeployCoreDNS | Deploy CoreDNS after the cluster is available |
| DeployKubeProxy | Deploy KubeProxy |
| EnableCSRApprover | Add RBAC to allow csrapprover to bootstrap nodes |
| MarkControlPlane | Add taint to the control plane node |
| UploadInfo | Upload the crit config map that holds info regarding the cluster |
windows-driver-docs-pr/debugger/-alignmentfaults.md
Ryooooooga/windows-driver-docs.ja-jp
---
title: !alignmentfaults
description: The !alignmentfaults extension displays all current alignment faults, sorted by frequency, by location and image.
ms.assetid: 6720a4de-ba75-4449-ab47-559bc7323002
keywords:
- alignment faults debugging
ms.date: 12/17/2019
topic_type:
- apiref
api_name:
- alignmentfaults
api_type:
- NA
ms.localizationpriority: medium
---

# !alignmentfaults

The **!alignmentfaults** extension displays all current alignment faults, sorted by frequency, by location and image.

```dbgcmd
!alignmentfaults
```

### DLL

**Windows XP and later**

Kdexts.dll

### Additional Information

For more information about alignment faults, see the Microsoft Windows SDK documentation.

## Remarks

This is only available on older versions of Windows, prior to Windows 10 version 1803, that offered a checked build.
docs/forum/bitcoin-forum/140.md
bram00767/bitcoin-archive
---
layout: forum
title: 'Re: 0.3 almost ready'
grand_parent: Forum Posts
parent: Bitcoin Forum
nav_order: 140
date: 2010-06-24 17:40:05 UTC
original: https://bitcointalk.org/index.php?topic=199.msg1748#msg1748
---

# Re: 0.3 almost ready

---

```
Re: 0.3 almost ready
June 24, 2010, 05:40:05 PM

Here's RC1 for linux for testing:
(link removed, see below)

It contains both 32-bit and 64-bit binaries.

Recent changes:
build-unix.txt:
- Added instructions for building wxBase, which is needed to compile bitcoind.
- The package libboost-dev doesn't install anything anymore, you need to get libboost-all-dev.
- Updated version numbers.
makefile.unix:
- The libboost libraries have removed the "-mt" from their filenames in 1.40.  If you're compiling with Boost 1.38 or lower, like on Ubuntu Karmic, you would need to change it back to boost_system-mt and boost_filesystem-mt.
```
26.029412
224
0.746893
eng_Latn
0.974186
e9834ccc225cf0d674e4184343e3c4def4876661
1,116
md
Markdown
README.md
Orekhov/protractorDirectConnectToRunningChromedriver
c40fc91190136b96b7735fa6494dd685ec4803f5
[ "MIT" ]
1
2018-03-18T14:50:16.000Z
2018-03-18T14:50:16.000Z
README.md
Orekhov/protractorDirectConnectToRunningChromedriver
c40fc91190136b96b7735fa6494dd685ec4803f5
[ "MIT" ]
null
null
null
README.md
Orekhov/protractorDirectConnectToRunningChromedriver
c40fc91190136b96b7735fa6494dd685ec4803f5
[ "MIT" ]
null
null
null
# protractorDirectConnectToRunningChromedriver

This is a prototype for running protractor tests with directConnect and connecting to a running chromedriver instead of protractor starting it

## How to run

### 1. Start chromedriver

In `StartChromeDriver` folder open and build `StartChromeDriver.sln` solution. (Modify code if needed to customize process startup)

Start it and keep it running. (You can check that it is running by sending GET to `http://127.0.0.1:9515/sessions`)

### 2. Run the test

In `test` folder run `npm install`

Clone and build https://github.com/Orekhov/protractor/tree/directConnect_to_running_chromedriver in a separate folder. (`npm install` and `npm run prepublish`)

Link to the built version of protractor (`npm link` in protractor folder, `npm link protractor` in test folder)

run `webdriver-manager update` (Even though these drivers wouldn't be used, there are various checks in the protractor/selenium code that check their existence. Removing them would require a lot of code changes in the prototype)

run `Run.ps1` (it should connect to a running instance of chromedriver)
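The running-check mentioned above (a GET against chromedriver's `/sessions` endpoint) can also be scripted instead of done by hand. A minimal Python sketch — the helper name and the `timeout` default are our own; port 9515 is chromedriver's standard default:

```python
import json
import urllib.error
import urllib.request

def chromedriver_sessions(base_url="http://127.0.0.1:9515", timeout=2.0):
    """Return the list of active chromedriver sessions, or None when the
    daemon is not reachable (e.g. it has not been started yet)."""
    try:
        with urllib.request.urlopen(base_url + "/sessions", timeout=timeout) as resp:
            payload = json.load(resp)
    except (urllib.error.URLError, OSError):
        return None
    # The JSON wire protocol wraps results in a {"status": ..., "value": [...]} envelope.
    return payload.get("value", [])
```

Before kicking off `Run.ps1`, a return value that is a list (even an empty one) means the daemon is up; `None` means it still needs to be started.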
46.5
228
0.789427
eng_Latn
0.992736
e984585070ce6f23d1dead3a3a47c7be8be7a7c7
1,325
md
Markdown
README.md
technologiestiftung/tsb-tree-zeit-now-api
9e0dad05c97e6139ff8c3c37e4e454a90db6e55a
[ "MIT" ]
null
null
null
README.md
technologiestiftung/tsb-tree-zeit-now-api
9e0dad05c97e6139ff8c3c37e4e454a90db6e55a
[ "MIT" ]
null
null
null
README.md
technologiestiftung/tsb-tree-zeit-now-api
9e0dad05c97e6139ff8c3c37e4e454a90db6e55a
[ "MIT" ]
null
null
null
# Zeit Now Postgres DB - Proof of Concept

This is a proof of concept on how to write a lambda function for [Zeit now.sh](https://zeit.co) that connects to a Postgres DB on AWS RDS.

## Prerequisites

- Node.js 12@latest
- Now account + Now CLI installed
- AWS Account with a Postgres RDS DB created

## Setup

- Create your RDS with a user and password and create a database in it.
- Rename `example.env` to `.env` and fill in your credentials.
- Create a Now account.
- Install the now cli `npm i -g now`.
- And login `now login`.
- Update the `name` in `now.json`
- Open a terminal in the root of this repo.
- Install node dependencies `npm i`.
- Add your secrets to now (they will be encrypted into your account)
  - `now secrets add user foo`
  - `now secrets add password supersecret`
  - `now secrets add database bah`
  - `now secrets add port 5432`
  - `now secrets add host foo.123.eu-central-1.rds.amazonaws.com`

## Development local

!Hint: The example expects to get a JSON body on the POST function.

To test your function run on localhost

```bash
now dev
```

In another session run

```bash
curl --location --request POST 'http://localhost:3000' \
--header 'Content-Type: application/json' \
--data-raw '{"foo":"bah"}'
```

## Deployment Production

To deploy your function run.

```bash
now --prod
```
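The curl call above can also be issued from a script. A small Python sketch that builds the same JSON POST request (the `make_json_post` helper name is ours; point it at `now dev` on localhost:3000):

```python
import json
import urllib.request

def make_json_post(url, obj):
    """Build a POST request carrying `obj` as a JSON body, mirroring the
    curl invocation above (Content-Type: application/json)."""
    body = json.dumps(obj).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = make_json_post("http://localhost:3000", {"foo": "bah"})
# Sending it is one more line once `now dev` is up:
#   urllib.request.urlopen(req)
```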
24.537037
138
0.70566
eng_Latn
0.974048
e9851f896dfc96d58c610eaf4592e56f74b2d976
1,263
md
Markdown
docs/framework/windows-workflow-foundation/39458-trackingrecordtruncated.md
turibbio/docs.it-it
2212390575baa937d6ecea44d8a02e045bd9427c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/windows-workflow-foundation/39458-trackingrecordtruncated.md
turibbio/docs.it-it
2212390575baa937d6ecea44d8a02e045bd9427c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/windows-workflow-foundation/39458-trackingrecordtruncated.md
turibbio/docs.it-it
2212390575baa937d6ecea44d8a02e045bd9427c
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: 39458 - TrackingRecordTruncated
ms.date: 03/30/2017
ms.assetid: 5352f0eb-d571-454a-bab5-e2162888b218
ms.openlocfilehash: 416feb4073b31178b016ae72c9cd85e15c4a68c3
ms.sourcegitcommit: 9b552addadfb57fab0b9e7852ed4f1f1b8a42f8e
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 04/23/2019
ms.locfileid: "61774413"
---
# <a name="39458---trackingrecordtruncated"></a>39458 - TrackingRecordTruncated

## <a name="properties"></a>Properties

|||
|-|-|
|Id|39458|
|Keywords|WFTracking|
|Level|Warning|
|Channel|Microsoft-Windows-Application Server-Applications/Debug|

## <a name="description"></a>Description

Indicates that a tracking record has been truncated. Variables/annotations/user data removed.

## <a name="message"></a>Message

Tracking record %1 truncated and written to the ETW session with provider %2. Variables/annotations/user data removed

## <a name="details"></a>Details

|Data Item Name|Data Item Type|Description|
|--------------------|--------------------|-----------------|
|RecordNumber|xs:string|The number of the tracking record.|
|ProviderId|xs:string|The ID of the ETW provider.|
|AppDomain|xs:string|The string returned by AppDomain.CurrentDomain.FriendlyName.|
36.085714
125
0.723674
ita_Latn
0.509775
e9859e015b6909c8bf618fea254f45dcecc658d9
14,870
md
Markdown
docs/framework/configure-apps/file-schema/wcf/basichttpbinding.md
cihanyakar/docs.tr-tr
03b6c8998a997585f61b8be289df105261125239
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/configure-apps/file-schema/wcf/basichttpbinding.md
cihanyakar/docs.tr-tr
03b6c8998a997585f61b8be289df105261125239
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/configure-apps/file-schema/wcf/basichttpbinding.md
cihanyakar/docs.tr-tr
03b6c8998a997585f61b8be289df105261125239
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: '&lt;basicHttpBinding&gt;'
ms.date: 03/30/2017
helpviewer_keywords:
- basicHttpBinding Element
ms.assetid: 85cf1a4f-26c2-48c7-bda6-6c960d5d3fb3
ms.openlocfilehash: ac88a2dcb8a47876b7b8a54abc4615c851172a90
ms.sourcegitcommit: 4ac80713f6faa220e5a119d5165308a58f7ccdc8
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 01/09/2019
ms.locfileid: "54149960"
---
# <a name="ltbasichttpbindinggt"></a>&lt;basicHttpBinding&gt;

Represents a binding that a Windows Communication Foundation (WCF) service can use to configure and expose endpoints that are able to communicate with ASMX-based Web services and clients and other services that conform to the WS-I Basic Profile 1.1.

\<system.serviceModel>
\<bindings>
\<basicHttpBinding>

## <a name="syntax"></a>Syntax

```xml
<basicHttpBinding>
  <binding allowCookies="Boolean"
           bypassProxyOnLocal="Boolean"
           closeTimeout="TimeSpan"
           envelopeVersion="None/Soap11/Soap12"
           hostNameComparisonMode="StrongWildCard/Exact/WeakWildcard"
           maxBufferPoolSize="Integer"
           maxBufferSize="Integer"
           maxReceivedMessageSize="Integer"
           messageEncoding="Text/Mtom"
           name="String"
           openTimeout="TimeSpan"
           proxyAddress="URI"
           receiveTimeout="TimeSpan"
           sendTimeout="TimeSpan"
           textEncoding="UnicodeFffeTextEncoding/Utf16TextEncoding/Utf8TextEncoding"
           transferMode="Buffered/Streamed/StreamedRequest/StreamedResponse"
           useDefaultWebProxy="Boolean">
    <security mode="None/Transport/Message/TransportWithMessageCredential/TransportCredentialOnly">
      <transport clientCredentialType="None/Basic/Digest/Ntlm/Windows/Certificate"
                 proxyCredentialType="None/Basic/Digest/Ntlm/Windows"
                 realm="String" />
      <message algorithmSuite="Basic128/Basic192/Basic256/Basic128Rsa15/Basic256Rsa15/TripleDes/TripleDesRsa15/Basic128Sha256/Basic192Sha256/TripleDesSha256/Basic128Sha256Rsa15/Basic192Sha256Rsa15/Basic256Sha256Rsa15/TripleDesSha256Rsa15"
               clientCredentialType="UserName/Certificate" />
    </security>
    <readerQuotas maxArrayLength="Integer"
                  maxBytesPerRead="Integer"
                  maxDepth="Integer"
                  maxNameTableCharCount="Integer"
                  maxStringContentLength="Integer" />
  </binding>
</basicHttpBinding>
```

## <a name="attributes-and-elements"></a>Attributes and Elements

The following sections describe attributes, child elements, and parent elements.

### <a name="attributes"></a>Attributes

|Attribute|Description|
|---------------|-----------------|
|`allowCookies`|A Boolean value that indicates whether the client accepts cookies and propagates them on future requests. The default is `false`.<br /><br /> You can use this property when you interact with ASMX Web services that use cookies. In this way, you can be sure that the cookies returned from the server are automatically copied to all future client requests for that service.|
|`bypassProxyOnLocal`|A Boolean value that indicates whether to bypass the proxy server for local addresses. The default is `false`.<br /><br /> An Internet resource is local if it has a local address. A local address is one that is on the same computer, the local LAN, or intranet, and is identified, syntactically, by the lack of a period (.), as in the URIs "http://webserver/" and "http://localhost/".<br /><br /> Setting this attribute determines whether endpoints configured with BasicHttpBinding use the proxy server when accessing local resources. If this attribute is `true`, requests to local Internet resources do not use the proxy server. Use the host name (rather than localhost) if you want clients to go through a proxy when talking to services on the same machine and this attribute is set to `true`.<br /><br /> When this attribute is `false`, all Internet requests are made through the proxy server.|
|`closeTimeout`|A <xref:System.TimeSpan> value that specifies the interval of time provided for a close operation to complete. This value must be greater than or equal to <xref:System.TimeSpan.Zero>. The default is 00:01:00.|
|`envelopeVersion`|Specifies the SOAP version that is used for messages processed by this binding. The only valid value is Soap11.|
|`hostnameComparisonMode`|Specifies the HTTP host name comparison mode used to parse URIs. This attribute is of type <xref:System.ServiceModel.HostNameComparisonMode>, which indicates whether the host name is used to reach the service when matching on the URI. The default value is <xref:System.ServiceModel.HostNameComparisonMode.StrongWildcard>, which ignores the host name in the match.|
|`maxBufferPoolSize`|An integer value that specifies the maximum amount of memory allocated for use by the manager of the message buffers that receive messages from the channel. The default value is 524288 (0x80000) bytes.<br /><br /> The buffer manager minimizes the cost of using buffers with a buffer pool. Buffers are required to process messages by the service when they come out of the channel. If there is not sufficient memory in the buffer pool to process the message load, the buffer manager must allocate additional memory from the CLR heap, which increases garbage collection overhead. Extensive allocation from the CLR garbage heap is an indication that the buffer pool size is too small and that performance can be improved with a larger allocation by increasing the limit specified by this attribute.|
|`maxBufferSize`|An integer value that specifies the maximum size, in bytes, of a buffer that stores messages while they are processed for an endpoint configured with this binding. The default value is 65,536 bytes.|
|`maxReceivedMessageSize`|A positive integer that defines the maximum message size, in bytes, including headers, that can be received on a channel configured with this binding. The sender receives a SOAP fault if the message is too large for the receiver. The receiver drops the message and creates an entry of the event in the trace log. The default is 65,536 bytes.|
|`messageEncoding`|Defines the encoder used to encode the SOAP message. Valid values include:<br /><br /> - Text: Use a text message encoder.<br />- Mtom: Use a Message Transmission Optimization Mechanism 1.0 (MTOM) encoder.<br /><br /> The default is Text. This attribute is of type <xref:System.ServiceModel.WSMessageEncoding>.|
|`name`|A string that contains the configuration name of the binding. This value must be unique because it is used as an identification for the binding. Each binding has a `name` and `namespace` attribute that together uniquely identify it in the metadata of the service. In addition, this name is unique among bindings of the same type. Starting with [!INCLUDE[netfx40_short](../../../../../includes/netfx40-short-md.md)], bindings and behaviors are not required to have a name. For more information about default configuration and nameless bindings and behaviors, see [Simplified Configuration](../../../../../docs/framework/wcf/simplified-configuration.md) and [Simplified Configuration for WCF Services](../../../../../docs/framework/wcf/samples/simplified-configuration-for-wcf-services.md).|
|`namespace`|Specifies the XML namespace of the binding. The default value is "http://tempuri.org/Bindings". Each binding has a `name` and `namespace` attribute that together uniquely identify it in the metadata of the service.|
|`openTimeout`|A <xref:System.TimeSpan> value that specifies the interval of time provided for an open operation to complete. This value must be greater than or equal to <xref:System.TimeSpan.Zero>. The default is 00:01:00.|
|`proxyAddress`|A URI that contains the address of the HTTP proxy. If `useSystemWebProxy` is set to `true`, this setting must be `null`. The default is `null`.|
|`receiveTimeout`|A <xref:System.TimeSpan> value that specifies the interval of time provided for a receive operation to complete. This value must be greater than or equal to <xref:System.TimeSpan.Zero>. The default is 00:10:00.|
|`sendTimeout`|A <xref:System.TimeSpan> value that specifies the interval of time provided for a send operation to complete. This value must be greater than or equal to <xref:System.TimeSpan.Zero>. The default is 00:01:00.|
|`textEncoding`|Sets the character set encoding to be used for emitting messages on the binding. Valid values include:<br /><br /> - BigEndianUnicode: Unicode BigEndian encoding.<br />- Unicode: 16-bit encoding.<br />- UTF8: 8-bit encoding.<br /><br /> The default is UTF8. This attribute is of type <xref:System.Text.Encoding>.|
|`transferMode`|A valid <xref:System.ServiceModel.TransferMode> value that specifies whether messages are buffered or streamed for a request or a response.|
|`useDefaultWebProxy`|A Boolean value that specifies whether the system's auto-configured HTTP proxy should be used, if available. The default is `true`.|

### <a name="child-elements"></a>Child Elements

|Element|Description|
|-------------|-----------------|
|[\<security>](../../../../../docs/framework/configure-apps/file-schema/wcf/security-of-basichttpbinding.md)|Defines the security settings for the binding. This element is of type <xref:System.ServiceModel.Configuration.BasicHttpSecurityElement>.|
|[\<readerQuotas>](https://msdn.microsoft.com/library/3e5e42ff-cef8-478f-bf14-034449239bfd)|Defines the constraints on the complexity of SOAP messages that can be processed by endpoints configured with this binding. This element is of type <xref:System.ServiceModel.Configuration.XmlDictionaryReaderQuotasElement>.|

### <a name="parent-elements"></a>Parent Elements

|Element|Description|
|-------------|-----------------|
|[\<bindings>](../../../../../docs/framework/configure-apps/file-schema/wcf/bindings.md)|This element holds a collection of standard and custom bindings.|

## <a name="remarks"></a>Remarks

The BasicHttpBinding uses HTTP as the transport for sending SOAP 1.1 messages. A service can use this binding to expose endpoints that conform to WS-I BP 1.1, such as those that ASMX clients consume. Similarly, a client can use the BasicHttpBinding to communicate with services that expose endpoints conforming to WS-I BP 1.1, such as ASMX Web services or services configured with the BasicHttpBinding.

Security is turned off by default, but can be added by setting the mode attribute of the [\<security>](../../../../../docs/framework/configure-apps/file-schema/wcf/security-of-basichttpbinding.md) child element to a value other than `None`. It uses a "Text" message encoding and UTF-8 text encoding by default.

## <a name="example"></a>Example

The following example demonstrates the use of <xref:System.ServiceModel.BasicHttpBinding>, which provides HTTP communication and maximum interoperability with first- and second-generation Web services. The binding is specified in the configuration files for the client and service. The binding type is specified by using the `binding` attribute of the `<endpoint>` element. If you want to configure the basic binding and change some of its settings, it is necessary to define a binding configuration. The endpoint must reference the binding configuration by name with the `bindingConfiguration` attribute of the `<endpoint>` element, as shown in the following configuration code for the service.

```xml
<system.serviceModel>
  <services>
    <service type="Microsoft.ServiceModel.Samples.CalculatorService"
             behaviorConfiguration="CalculatorServiceBehavior">
      <endpoint address=""
                binding="basicHttpBinding"
                bindingConfiguration="Binding1"
                contract="Microsoft.ServiceModel.Samples.ICalculator" />
    </service>
  </services>
  <bindings>
    <basicHttpBinding>
      <binding name="Binding1"
               hostNameComparisonMode="StrongWildcard"
               receiveTimeout="00:10:00"
               sendTimeout="00:10:00"
               openTimeout="00:10:00"
               closeTimeout="00:10:00"
               maxReceivedMessageSize="65536"
               maxBufferSize="65536"
               maxBufferPoolSize="524288"
               transferMode="Buffered"
               messageEncoding="Text"
               textEncoding="utf-8"
               bypassProxyOnLocal="false"
               useDefaultWebProxy="true">
        <security mode="None" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

## <a name="example"></a>Example

Starting with [!INCLUDE[netfx40_short](../../../../../includes/netfx40-short-md.md)], bindings and behaviors are not required to have a name. The functionality of the previous example can be accomplished by removing the endpoint address and binding name and the bindingConfiguration.

```xml
<system.serviceModel>
  <services>
    <service type="Microsoft.ServiceModel.Samples.CalculatorService"
             behaviorConfiguration="CalculatorServiceBehavior">
      <endpoint address=""
                binding="basicHttpBinding"
                contract="Microsoft.ServiceModel.Samples.ICalculator" />
    </service>
  </services>
  <bindings>
    <basicHttpBinding>
      <binding hostNameComparisonMode="StrongWildcard"
               receiveTimeout="00:10:00"
               sendTimeout="00:10:00"
               openTimeout="00:10:00"
               closeTimeout="00:10:00"
               maxReceivedMessageSize="65536"
               maxBufferSize="65536"
               maxBufferPoolSize="524288"
               transferMode="Buffered"
               messageEncoding="Text"
               textEncoding="utf-8"
               bypassProxyOnLocal="false"
               useDefaultWebProxy="true">
        <security mode="None" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

For more information about default configuration and nameless bindings and behaviors, see [Simplified Configuration](../../../../../docs/framework/wcf/simplified-configuration.md) and [Simplified Configuration for WCF Services](../../../../../docs/framework/wcf/samples/simplified-configuration-for-wcf-services.md).

## <a name="see-also"></a>See Also

<xref:System.ServiceModel.Channels.Binding>
<xref:System.ServiceModel.Channels.BindingElement>
<xref:System.ServiceModel.BasicHttpBinding>
<xref:System.ServiceModel.Configuration.BasicHttpBindingElement>
[Bindings](../../../../../docs/framework/wcf/bindings.md)
[Configuring System-Provided Bindings](../../../../../docs/framework/wcf/feature-details/configuring-system-provided-bindings.md)
[Using Bindings to Configure Services and Clients](../../../../../docs/framework/wcf/using-bindings-to-configure-services-and-clients.md)
[\<binding>](../../../../../docs/framework/misc/binding.md)
80.378378
933
0.743847
tur_Latn
0.99704
e985ac7548874f7a19326c85ef9f981321d47dbb
59
md
Markdown
examples/blog/pages/posts/index.md
blomqma/next.js
7db6aa2fde34699cf319d30980c2ee38bb53f20d
[ "MIT" ]
51,887
2016-10-25T15:48:01.000Z
2020-05-27T17:47:07.000Z
examples/blog/pages/posts/index.md
blomqma/next.js
7db6aa2fde34699cf319d30980c2ee38bb53f20d
[ "MIT" ]
13,333
2020-05-27T18:15:25.000Z
2022-03-31T23:48:59.000Z
examples/blog/pages/posts/index.md
blomqma/next.js
7db6aa2fde34699cf319d30980c2ee38bb53f20d
[ "MIT" ]
14,796
2020-05-27T18:07:16.000Z
2022-03-31T23:55:30.000Z
---
type: posts
title: Posts
date: 2021-03-18
---

# Posts
7.375
16
0.610169
eng_Latn
0.796869
e985b5c86c84c33808bcb0083567848b6bfd6785
6,176
md
Markdown
CS/methods_in_IPC.md
im-d-team/Dev-Docs
26abb7e8c29db7d75ef178a90c6ef6e69680341a
[ "MIT" ]
128
2020-04-30T05:40:35.000Z
2022-01-21T08:24:58.000Z
CS/methods_in_IPC.md
Sehannnnnnn/Dev-Docs
d9a837e832e64d76b5f72b8daa38b5f142ee1db2
[ "MIT" ]
41
2019-04-28T04:53:44.000Z
2020-04-27T02:45:38.000Z
CS/methods_in_IPC.md
Sehannnnnnn/Dev-Docs
d9a837e832e64d76b5f72b8daa38b5f142ee1db2
[ "MIT" ]
26
2020-05-26T09:49:05.000Z
2022-02-15T04:27:39.000Z
# IPC (Inter Process Communication) Techniques

Processes cannot reach into each other directly, because a process can only access the kernel. Yet parallel processing or interaction between different processes is often necessary. If an example is hard to picture, think of the server-client model: the server and the client are separate processes, but they have to exchange information. What solves this is IPC (Inter Process Communication).

### Before We Start

The easiest way to implement IPC is to use a file: share the needed resources or messages by writing them into a single file. However, it is hard to detect updates, and because the storage medium must be used continuously, hardware interrupts occur and performance suffers. For this reason, most techniques use kernel space.

![Process 2](https://github.com/Dae-Hwa/diagrams/blob/master/os/process/%ED%94%84%EB%A1%9C%EC%84%B8%EC%8A%A42.png?raw=true)

Every process has a kernel area assigned to it. Since they all use the same kernel, the kernel area in each process refers to the single kernel area loaded in physical memory. If processes A and B both access the kernel area, they end up accessing the same region. In other words, if [system calls](https://ko.wikipedia.org/wiki/%EC%8B%9C%EC%8A%A4%ED%85%9C_%ED%98%B8%EC%B6%9C) are provided for using kernel space, IPC can be implemented.

## IPC Techniques

Representative approaches to IPC are listed below.

> The items were selected from [Inter-process communication - Wikipedia](https://en.wikipedia.org/wiki/Inter-process_communication#Approaches) based on what is available on Linux.

- file
- shared memory
- message passing
  - pipe
  - message queue
- signal
- socket
- semaphore

Apart from file-based IPC, there are two broad concepts: either create memory that processes can share and access together, or exchange messages (message passing).

> Reference - https://www.geeksforgeeks.org/inter-process-communication-ipc/

### Shared Memory

A space is created in physical memory that can be used like a variable. It is registered with a key and access permissions, and can then be used.

![Shared memory 1](https://github.com/Dae-Hwa/diagrams/blob/master/os/ipc/%EA%B3%B5%EC%9C%A0%EB%A9%94%EB%AA%A8%EB%A6%AC1.png?raw=true)

#### Advantages

- Fast, because whatever is in memory can be used as-is.
- Simple to implement.
- Multiple processes can access it simultaneously.

#### Disadvantages

- Synchronization issues can arise.
- Dependent on the capacity configured in the kernel.
- There is no way to know when data should be read.

> This is because the consumer must explicitly specify which data to use. With the other approaches a message is delivered, so you simply read the data that corresponds to that message.

### Pipe

A queue-like structure in which one process writes and another reads. The direction is fixed, so communication flows one way only. It is easy to picture if you think of water flowing from a tank through a pipe to a faucet.

That is, once the input and output ends of the pipe are decided, the process holding the input end can only write and the process holding the output end can only read. Also, because reads and writes happen in blocking mode, even a bidirectional implementation behaves as half-duplex communication. A classic example of half-duplex communication is a walkie-talkie, and a pipe behaves in a similar way.

#### Advantages

- Extremely simple to use.

#### Disadvantages

- Usable only between parent and child processes.
- Unidirectional communication.
- Even when implemented bidirectionally, it is half-duplex.

#### Named Pipe (FIFO below)

Strictly speaking, the pipe described above is called an anonymous pipe. One of its drawbacks is that it can only be used between parent and child processes; the FIFO exists to overcome that drawback. In other words, it can be used to implement pipe-based IPC between unrelated processes.

> This is the form used in server-client models.

### Message Queue

A key-based queue. Besides first-in-first-out retrieval, the earliest entry with a given key can be taken out. It can communicate in both directions.

![Message queue 1](https://github.com/Dae-Hwa/diagrams/blob/master/os/ipc/%EB%A9%94%EC%84%B8%EC%A7%80%ED%81%901.png?raw=true)

#### Advantages

- Asynchronous communication between processes is possible.
- Using keys, any process with permission can access it.

> Because it is a queue, there is no need to worry about synchronization issues.

#### Disadvantages

- If the queue grows too long, a bottleneck can occur.

### Signal

A technique in which the kernel or a process notifies another process that some event has occurred, for example that a process has finished executing. A process can redefine a signal so that it behaves differently from the default. For this, user-defined signals (SIGUSR1 and SIGUSR2 on Linux) are provided.

> Ignore the signal, block it, or perform the action defined in a handler (which works like a callback function).

In slightly more detail, at the moment an interrupt finishes and the processor switches back to user mode, the state of the signal-related data structures in the PCB (process control block) is checked. This means signals can be combined with other IPC techniques. The drawback is that data cannot be transmitted directly.

### Socket

Fundamentally a technology for network communication, handling requests and responses between a client and a server. We briefly looked at the FIFO, the evolved form of the pipe, above; a socket resembles a FIFO made bidirectional. Although it is essentially a technology for connecting processes on different networks, a [Unix domain socket (IPC socket)](https://en.wikipedia.org/wiki/Unix_domain_socket) can be used for communication inside the kernel of a single host.

#### Advantages

- Advantageous when a lot of data must flow in both directions.

#### Disadvantages

- For a small program, the overhead can be large.

### Semaphore

The semaphore is a little unusual, because it is originally one of the methodologies for solving synchronization issues. The part where a synchronization issue arises (where simultaneous access must not happen) is called the critical section. Preventing simultaneous access to it is called MUTual EXclusion (MUTEX), and a semaphore is a technique that fixes the number of processes allowed to access the critical section concurrently.

```pseudocode
/**
 * S : the number of threads that may enter the critical section concurrently
 *
 * P
 *  - before entering the critical section, check whether entry is possible (not possible when S <= 0)
 *  - if entry is not possible, lock (wait)
 *  - if entry is possible, S--
 * V : when leaving the critical section, mark completion (S++)
 */
P(S) {
    while S <= 0;
    S--;
}

V(S) {
    S++;
}
```

> A wait queue can be applied to address the inefficiency. Since that goes beyond our topic, see [Semaphore - Wikipedia](https://ko.wikipedia.org/wiki/%EC%84%B8%EB%A7%88%ED%8F%AC%EC%96%B4#%EC%A0%81%EC%9A%A9).

In an actual implementation, a specific critical section is designated and processes wait until the semaphore becomes available. Since the critical section is, after all, a shared region, it can be used for IPC.

#### Advantages

- More efficient than other synchronization methods.
- Because it tells you whether the critical section is accessible, busy waiting is unnecessary.

> Busy waiting: repeatedly checking whether a particular resource is available, which can waste resources.

#### Disadvantages

- Because the synchronization conditions must be handled, it is tricky to implement without errors.

## Closing

There are various other IPC techniques besides these. The important thing to keep in mind, as briefly mentioned above, is that the different IPC techniques can be combined and used as needed. You should therefore understand exactly what the requirements of the system you are building are.

This article was written with Linux in mind; if you are curious about the IPC used on Windows, see [Interprocess Communications - MSDN](https://docs.microsoft.com/ko-kr/windows/win32/ipc/interprocess-communications?redirectedfrom=MSDN). For more detailed implementations, [IPC - joinc](https://www.joinc.co.kr/w/Site/system_programing/Book_LSP/ch08_IPC#s-2.4), [Linux System V and POSIX IPC Examples](http://hildstrom.com/projects/ipc_sysv_posix/index.html) and [Beej's Guide to Unix IPC](http://beej.us/guide/bgipc/html/single/bgipc.html) are well organized.

---

#### References

- [POSIX shared-memory API - GeeksforGeeks](https://www.geeksforgeeks.org/posix-shared-memory-api/)
- [Inter-process communication - Wikipedia](https://en.wikipedia.org/wiki/Inter-process_communication#Approaches)
- [Interprocess Communications - MSDN](https://docs.microsoft.com/ko-kr/windows/win32/ipc/interprocess-communications?redirectedfrom=MSDN)
- [Linux System V and POSIX IPC Examples](http://hildstrom.com/projects/ipc_sysv_posix/index.html)
- [IPC - joinc](https://www.joinc.co.kr/w/Site/system_programing/Book_LSP/ch08_IPC#s-2.4)
- [Pipe (Unix) - Wikipedia](https://ko.wikipedia.org/wiki/%ED%8C%8C%EC%9D%B4%ED%94%84_(%EC%9C%A0%EB%8B%89%EC%8A%A4))
- [IPC InterProcess Communication - Telecommunications Technology Glossary](http://www.ktword.co.kr/abbr_view.php?m_temp1=302)
- [Beej's Guide to Unix IPC](http://beej.us/guide/bgipc/html/single/bgipc.html)
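The anonymous pipe described in this article — unidirectional and parent-child only — can be observed directly at the OS level. A minimal POSIX sketch in Python (`pipe_roundtrip` is our own illustrative name; `os.fork` makes this Unix-only):

```python
import os

def pipe_roundtrip(payload: bytes) -> bytes:
    """Fork a child that writes `payload` into an anonymous pipe;
    the parent reads it back. One direction only: child -> parent."""
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:              # child: uses only the write end
        os.close(r)
        os.write(w, payload)
        os.close(w)
        os._exit(0)
    os.close(w)               # parent: uses only the read end
    chunks = []
    while True:
        chunk = os.read(r, 4096)
        if not chunk:         # EOF once the child closes its end
            break
        chunks.append(chunk)
    os.close(r)
    os.waitpid(pid, 0)
    return b"".join(chunks)
```

Closing the unused ends matters: the parent only sees EOF after every write descriptor is closed, which is exactly the blocking behavior that makes even a bidirectional pair of pipes behave half-duplex.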
36.544379
311
0.699806
kor_Hang
1.00001
e98600c58ae4e83f36000a389217425bc599146e
6,440
md
Markdown
articles/redis-cache/cache-aspnet-session-state-provider.md
pymia/azure-docs.zh-tw
fc9f30ec1cb28578e47f3c0d52b6ada32b276b8b
[ "CC-BY-3.0" ]
null
null
null
articles/redis-cache/cache-aspnet-session-state-provider.md
pymia/azure-docs.zh-tw
fc9f30ec1cb28578e47f3c0d52b6ada32b276b8b
[ "CC-BY-3.0" ]
null
null
null
articles/redis-cache/cache-aspnet-session-state-provider.md
pymia/azure-docs.zh-tw
fc9f30ec1cb28578e47f3c0d52b6ada32b276b8b
[ "CC-BY-3.0" ]
null
null
null
--- title: "快取 ASP.NET 工作階段狀態供應器 | Microsoft Docs" description: "了解如何使用 Azure Redis 快取來儲存 ASP.NET 工作階段狀態" services: redis-cache documentationcenter: na author: steved0x manager: douge editor: tysonn ms.assetid: 192f384c-836a-479a-bb65-8c3e6d6522bb ms.service: cache ms.devlang: na ms.topic: article ms.tgt_pltfrm: cache-redis ms.workload: tbd ms.date: 01/06/2017 ms.author: sdanie translationtype: Human Translation ms.sourcegitcommit: 65385aa918222837468f88246d0527c22c677ba7 ms.openlocfilehash: a2f124de8a35f6fdff23fa8b3c816b8c0b44acdd --- # <a name="aspnet-session-state-provider-for-azure-redis-cache"></a>Azure Redis 快取的 ASP.NET 工作階段狀態提供者 Azure Redis 快取提供工作階段狀態提供者,可讓您用來將工作階段狀態儲存在快取中,而不是記憶體內或 SQL Server 資料庫中。 若要使用快取工作階段狀態提供者,請先設定快取,再使用「Redis 快取工作階段狀態 NuGet 套件」設定 ASP.NET 應用程式的快取。 在實際的雲端應用程式中,避免儲存使用者工作階段某種形式的狀態通常並非理想做法,但某些方法會比其他方法更加影響效能和延展性。 如果您需要儲存狀態,最好的方法是將狀態的數量控制得較低,並將其儲存在 Cookie 中。 如果此方法不可行,次佳的方法是使用 ASP.NET 工作階段狀態搭配提供者,進行分散式的記憶體中快取。 從效能和延展性的觀點來看,最差的解決方法是使用資料庫備份的工作階段狀態提供者。 本主題提供使用 Azure Redis 快取的 ASP.NET 工作階段狀態供應器的指引。 如需其他工作階段狀態選項的相關資訊,請參閱 [ASP.NET 工作階段狀態選項](#aspnet-session-state-options)。 ## <a name="store-aspnet-session-state-in-the-cache"></a>將 ASP.NET 工作階段狀態儲存在快取中 若要在 Visual Studio 中使用「Redis 快取工作階段狀態 NuGet 封裝」來設定用戶端應用程式,請在 [方案總管] 中的專案上按一下滑鼠右鍵,然後選擇 [管理 NuGet 封裝]。 ![Azure Redis 快取管理 NuGet 封裝](./media/cache-aspnet-session-state-provider/redis-cache-manage-nuget-menu.png) 在搜尋文字方塊中輸入 **RedisSessionStateProvider**,從結果中選取後按一下 [安裝]。 > [!IMPORTANT] > 如果您使用進階層的叢集功能,則必須使用 [RedisSessionStateProvider](https://www.nuget.org/packages/Microsoft.Web.RedisSessionStateProvider) 2.0.1 或更高版本,否則會擲回例外狀況。 這是一項重大變更。如需詳細資訊,請參閱 [v2.0.0 重大變更詳細資料](https://github.com/Azure/aspnet-redis-providers/wiki/v2.0.0-Breaking-Change-Details)。 > > ![Azure Redis 快取工作階段狀態提供者](./media/cache-aspnet-session-state-provider/redis-cache-session-state-provider.png) 「Redis 工作階段狀態提供者 NuGet 封裝」對「StackExchange.Redis.StrongName 封裝」有相依性。 如果「StackExchange.Redis.StrongName 封裝」不在專案中,將會予以安裝。 
請注意,除了強式名稱的「StackExchange.Redis.StrongName 封裝」之外,另外還有 StackExchange.Redis 非強式名稱的版本。 如果您的專案是使用非強式名稱的 StackExchange.Redis 版本,您必須在安裝「Redis 工作階段狀態提供者 NuGet 封裝」之前或之後將它解除安裝,否則專案中會出現命名衝突。 如需這些封裝的相關詳細資訊,請參閱 [設定 .NET 快取用戶端](cache-dotnet-how-to-use-azure-redis-cache.md#configure-the-cache-clients)。 NuGet 封裝會下載並加入需要的組件參考,並將下列區段加入至您的 web.config 檔案,該檔案包含 ASP.NET 應用程式使用 Redis 快取工作階段狀態提供者所需的組態。 ```xml <sessionState mode="Custom" customProvider="MySessionStateStore"> <providers> <!-- <add name="MySessionStateStore" host = "127.0.0.1" [String] port = "" [number] accessKey = "" [String] ssl = "false" [true|false] throwOnError = "true" [true|false] retryTimeoutInMilliseconds = "0" [number] databaseId = "0" [number] applicationName = "" [String] connectionTimeoutInMilliseconds = "5000" [number] operationTimeoutInMilliseconds = "5000" [number] /> --> <add name="MySessionStateStore" type="Microsoft.Web.Redis.RedisSessionStateProvider" host="127.0.0.1" accessKey="" ssl="false"/> </providers> </sessionState> ``` 標示註解的區段可提供屬性的範例和每個屬性的範例設定。 以來自 Microsoft Azure 入口網站之快取刀鋒視窗的值來設定屬性,並視需要設定其他值。 如需存取快取屬性的指示,請參閱 [設定 Redis 快取設定](cache-configure.md#configure-redis-cache-settings)。 * **主機** – 指定您的快取端點。 * **連接埠** – 使用您的非 SSL 連接埠或 SSL 連接埠,依 ssl 設定而定。 * **accessKey** – 用於快取的主要或次要金鑰。 * **ssl** – 如果您想要使用 ssl 保護快取/用戶端通訊則為 true,否則為 false。 請務必指定正確的連接埠。 * 預設會為新快取停用非 SSL 連接埠。 請於此設定指定為 true,使用 SSL 連接埠。 如需啟用非 SSL 連接埠的相關詳細資訊,請參閱[設定快取](cache-configure.md)主題中的[存取連接埠](cache-configure.md#access-ports)一節。 * **throwOnError** – true (如果您想在事件失敗時擲出例外狀況) 或 false (如果您想在作業失敗時為無訊息模式)。 您可以核取靜態 Microsoft.Web.Redis.RedisSessionStateProvider.LastException 屬性以檢查失敗。 預設值是 true。 * **retryTimeoutInMilliseconds** – 會在此間隔期間 (以毫秒指定) 重試失敗的作業。 第一次重試會在 20 毫秒後發生,然後每秒進行重試,直到 retryTimeoutInMilliseconds 間隔到期為止。 緊接著此間隔之後,作業會進行最後一次重試。 如果作業仍失敗,會視 throwOnError 設定將例外狀況擲回給呼叫者。 預設值為 0,表示不會重試。 * **databaseId** – 指定快取輸出資料所使用的資料庫。 若未指定,就會使用預設值 0。 * **applicationName** – 金鑰在 redis 中會儲存為 `{<Application Name>_<Session ID>}_Data`。 這可讓多個應用程式共用同一金鑰。 
此參數是選擇性的,如果您未提供,將會使用預設值。 * **connectionTimeoutInMilliseconds** – 此設定可讓您覆寫 StackExchange.Redis 用戶端中的 connectTimeout 設定。 若未指定,將會使用預設的 connectTimeout 設定為 5000。 如需詳細資訊,請參閱 [StackExchange.Redis 設定模型](http://go.microsoft.com/fwlink/?LinkId=398705)(英文)。 * **operationTimeoutInMilliseconds** – 此設定可讓您覆寫 StackExchange.Redis 用戶端中的 syncTimeout 設定。 若未指定,將會使用預設的 syncTimeout 設定為 1000。 如需詳細資訊,請參閱 [StackExchange.Redis 設定模型](http://go.microsoft.com/fwlink/?LinkId=398705)(英文)。 如需這些屬性的相關詳細資訊,請參閱 [發佈 Redis 的 ASP.NET 工作階段狀態提供者](http://blogs.msdn.com/b/webdev/archive/2014/05/12/announcing-asp-net-session-state-provider-for-redis-preview-release.aspx)(英文) 上的原始部落格文章公告。 別忘記備註您 web.config 中的標準 InProc 工作階段狀態提供者區段。 ```xml <!-- <sessionState mode="InProc" customProvider="DefaultSessionProvider"> <providers> <add name="DefaultSessionProvider" type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" connectionStringName="DefaultConnection" /> </providers> </sessionState> --> ``` 一旦執行了這些步驟,您的應用程式就會設定為使用 Redis 快取工作階段狀態提供者。 當您在應用程式中使用工作階段狀態時,會儲存在 Azure Redis 快取執行個體中。 > [!NOTE] > 請注意,儲存在快取中的資料必須可序列化,這一點與可以儲存在預設記憶體中 ASP.NET 工作階段狀態提供者的資料不同。 使用 Redis 的工作階段狀態提供者時,請確定儲存在工作階段狀態中的資料類型為可序列化。 > > ## <a name="aspnet-session-state-options"></a>ASP.NET 工作階段狀態選項 * 記憶體中工作階段狀態提供者 – 此提供者會將工作階段狀態儲存在記憶體中。 使用此提供者的好處是它既簡單又快速。 不過,您在使用記憶體中提供者時將無法調整 Web Apps,因為它不是分散式的。 * SQL Server 工作階段狀態提供者 – 此提供者會將工作階段狀態儲存在 SQL Server 中。 如果您想要在永續性儲存體中保存工作階段狀態,應使用此提供者。 您可以調整您的 Web 應用程式,但將 SQL Server 用於工作階段,將對您 Web 應用程式的效能造成影響。 * 分散式記憶體中工作階段狀態提供者,例如 Redis 快取工作階段狀態提供者 – 此提供者可讓您同時兼顧兩方面。 您的 Web 應用程式可擁有簡單、快速而可調整的工作階段狀態提供者。 不過,您必須考量到,此提供者會將工作階段狀態儲存在快取中,且您的應用程式必須考量與分散式記憶體中快取 (例如暫時性網路失敗) 通訊時的所有相關特性。 如需使用快取的最佳作法,請參閱 Microsoft 模式和作法 [Azure 雲端應用程式設計和實作指引](https://github.com/mspnp/azure-guidance)中的[快取指引](../best-practices-caching.md)。 如需工作階段狀態和其他最佳作法的相關詳細資訊,請參閱 [Web 開發最佳作法 (使用 Azure 
建置實際的雲端應用程式)](http://www.asp.net/aspnet/overview/developing-apps-with-windows-azure/building-real-world-cloud-apps-with-windows-azure/web-development-best-practices)(英文)。 ## <a name="next-steps"></a>後續步驟 查看 [Azure Redis 快取的 ASP.NET 輸出快取提供者](cache-aspnet-output-cache-provider.md)。 <!--HONumber=Jan17_HO2-->
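The applicationName attribute documented above determines how the provider lays out keys in Redis. As a small illustration (a Python sketch, not part of the provider), the documented pattern `{<Application Name>_<Session ID>}_Data` can be reproduced like this, which can be handy when inspecting a cache manually; the hash-tag remark in the comment is an assumption based on general Redis Cluster behavior, not something this article states:

```python
def redis_session_key(application_name, session_id):
    """Compose a session key following the documented pattern
    {<Application Name>_<Session ID>}_Data.

    The braces act as a Redis "hash tag": in a clustered (premium-tier)
    cache, all keys sharing the same tag hash to the same slot.
    """
    return "{" + application_name + "_" + session_id + "}_Data"

print(redis_session_key("MyApp", "2b1xyz"))  # {MyApp_2b1xyz}_Data
```

This is only a way to reason about what you will see in the cache; the provider itself builds these keys for you.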
51.935484
408
0.765062
yue_Hant
0.954127
e9868913cbfb2c60d4403962e9b5b8549e7a0927
9,504
md
Markdown
_posts/2017-08-28-GeometryPackage.md
piyush-jain1/piyush-jain1.github.io
548d116b2078374c0e62d9d48a6bd34eb810b0af
[ "MIT" ]
null
null
null
_posts/2017-08-28-GeometryPackage.md
piyush-jain1/piyush-jain1.github.io
548d116b2078374c0e62d9d48a6bd34eb810b0af
[ "MIT" ]
null
null
null
_posts/2017-08-28-GeometryPackage.md
piyush-jain1/piyush-jain1.github.io
548d116b2078374c0e62d9d48a6bd34eb810b0af
[ "MIT" ]
null
null
null
---
layout: post
title: Geometry Package (Octave)
---

# Geometry package: Implement boolean operations on polygons

## As part of GSoC 2017, this project is intended to implement a set of boolean operations and supporting functions for acting on polygons.

These include the standard set of potential operations such as union/OR, intersection/AND, difference/subtraction, and exclusive-or/XOR. Other things to be implemented are the following functions: polybool, ispolycw, poly2ccw, poly2cw, poly2fv, polyjoin, and polysplit.<br/>
This [repository](https://github.com/piyush-jain1/GSoC17OctaveGeometry) is a clone (fork) of the Geometry package, which is part of the [Octave Forge Project](http://octave.sourceforge.net/).<br/>
This fork adds new functions to the official Geometry package as part of GSoC (Google Summer of Code) 2017. The official Geometry package can be found [here](https://sourceforge.net/p/octave/geometry/ci/default/tree/).

Links to commits on the official repo:

- [19a35e](https://sourceforge.net/p/octave/geometry/ci/19a35efc16dbe645e0bbfbffe9cfaa14e5ec9c96/)
- [fc3710](https://sourceforge.net/p/octave/geometry/ci/fc3710b6cce55502e6eff4dc4251d507bc5b4ff1/)

## Added files and functions

1. /inst/polygons2d/clipPolygon_mrf.m
2. /inst/polygons2d/private/\__poly2struct\__.m
3. /src/martinez.cpp
4. /src/polygon.cpp
5. /src/utilities.cpp
6. /src/polybool_mrf.cc
7. /inst/polygons2d/funcAliases

## Bonding Period

After discussing with my mentor and keeping my proposal in mind, I tried to understand and list the tasks more specifically and in detail. I first familiarized myself with the conventions and other practices of the organisation. I then looked into the first basic thing I would need throughout this project, i.e. how to create an oct-interface so that C++ code is executable from Octave. My first goal was to explore whether the already implemented mex-interface geometry package could be improved in performance by replacing the mex interface with an oct interface.
To understand how these oct-files work and get familiar with them, I started by implementing something small in oct-files.

## First Coding Phase

As stated, there is an already implemented [Geometry 3.0 package](https://github.com/piyush-jain1/GSoC17OctaveGeometry), which has a mex interface for its functions. I tried to compare its performance with an oct-interface version. For benchmarking, I first implemented my own function for polyUnion (using the Clipper library) with an oct interface [(find it here)](https://github.com/piyush-jain1/GSoC17OctaveGeometry/tree/master/devel/MyPolyUnion). Then I compared its performance over a number of different sets of polygons (parametrized by the number of vertices) and recorded the elapsed times for both interfaces. On plotting a graph of number of vertices vs. elapsed time (for oct and mex), the following observations were made:<br/>

- The oct interface performed better than mex.
- For 10000 vertices, the oct interface took about 0.008 seconds while the mex interface took about 0.014 seconds. This implies the oct interface took 8×10^-3 seconds / 10^4 vertices, i.e. 8×10^-7 seconds per vertex. For mex, it was 14×10^-7 seconds per vertex.
- As can be seen from the above data, the oct interface was not more than twice as good as the mex interface.<br/>

From these observations, it was concluded that it is not worth changing the interface from mex to oct, since there was not much improvement in performance. Thus, our next goal became incorporating new algorithms. After spending a decent amount of time studying the new algorithm and understanding its implementation, I started to implement the polybool function. I also compared its performance with the already implemented clipPolygon in the current geometry package. The new algorithm performs better than the old one.
The implementation of boolean operations on polygons, like **DIFFERENCE**, **INTERSECTION**, **XOR** and **UNION**, with the new algorithm is almost done. A little work on tests, demos and documentation is still needed and I am working on that.

#### More about the algorithm by F. Martínez, A.J. Rueda, F.R. Feito ####

The algorithm is very easy to understand, among other things because it can be seen as an extension of the classical plane-sweep algorithm for computing the intersection points between a set of segments. When a new intersection between the edges of polygons is found, the algorithm subdivides the edges at the intersection point. This produces a plane-sweep algorithm with only two kinds of events: left and right endpoints, making the algorithm quite simple. Furthermore, the subdivision of edges provides a simple way of processing degeneracies.<br/>

Overall sketch of the approach for computing Boolean operations on polygons:

- Subdivide the edges of the polygons at their intersection points.
- Select those subdivided edges that lie inside the other polygon (or that do not lie inside, depending on the operation).
- Join the edges selected in step 2 to form the result polygon.<br/>

#### Complexity ####

Let n be the total number of edges of all the polygons involved in the Boolean operation and k be the number of intersections of all the polygon edges. The whole algorithm runs in time **O((n+k) log n)**.<br/>

After raw-testing this new algorithm on several cases, I added a few tests to the m-script, along with a demo. The demo can be seen with the command `demo clipPolygon_mrf`, and the tests can be run with `test clipPolygon_mrf`.

## Second Coding Phase

After implementing the polybool function and checking it, we are planning to include it in the next release of the geometry package.
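The Martínez plane-sweep algorithm handles arbitrary polygons and all four boolean operations. As a much simpler, self-contained illustration of the "clip edges against a polygon" idea, here is a Python sketch of Sutherland-Hodgman clipping, which computes only the intersection of a subject polygon with a convex clip polygon; it is for intuition and is not the algorithm used in the package:

```python
def clip_polygon(subject, clip):
    """Sutherland-Hodgman clipping: intersection of `subject` with a
    CONVEX polygon `clip`. Vertices are (x, y) tuples, both polygons
    listed counter-clockwise."""

    def inside(p, a, b):
        # p lies on or to the left of the directed clip edge a->b
        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0

    def intersect(p1, p2, a, b):
        # Intersection of segment p1->p2 with the infinite line through a, b
        denom = (p1[0] - p2[0]) * (a[1] - b[1]) - (p1[1] - p2[1]) * (a[0] - b[0])
        t = ((p1[0] - a[0]) * (a[1] - b[1]) - (p1[1] - a[1]) * (a[0] - b[0])) / denom
        return (p1[0] + t * (p2[0] - p1[0]), p1[1] + t * (p2[1] - p1[1]))

    output = list(subject)
    for i in range(len(clip)):
        a, b = clip[i], clip[(i + 1) % len(clip)]
        polygon, output = output, []
        for j in range(len(polygon)):
            cur, prev = polygon[j], polygon[j - 1]
            if inside(cur, a, b):
                if not inside(prev, a, b):
                    output.append(intersect(prev, cur, a, b))
                output.append(cur)
            elif inside(prev, a, b):
                output.append(intersect(prev, cur, a, b))
    return output

# Two overlapping unit squares; their intersection is a 0.5 x 0.5 square.
result = clip_polygon([(0, 0), (1, 0), (1, 1), (0, 1)],
                      [(0.5, 0.5), (1.5, 0.5), (1.5, 1.5), (0.5, 1.5)])
```

Note the contrast with Martínez et al.: this sketch clips one edge loop against each clip edge in turn, while the plane-sweep approach subdivides edges at intersection points and then selects the subdivided edges, which is what makes non-convex operands, holes, and the other boolean operations tractable.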
Now, to move forward, I am first importing some functions from last year's GSoC repo and ensuring their MATLAB compatibility. Functions like poly2ccw, poly2cw, joinpolygons, splitPolygons have been created as aliases while ensuring their compatibility with their MathWorks counterparts. After that, some time was invested in understanding the CGAL library and its implementation. The further plan is to sync the matgeom package with the geometry package.

## Third Coding Phase

Proceeding towards the next goal, the idea is to devise some way to partially automate the process of syncing the matgeom and geometry packages. The issue is that when a new release of the geometry package is planned, there are some things which have been updated in matgeom but not in their geometry counterparts (if they exist). So, every time before releasing, much time has to be invested in manually checking each edit and syncing it into geometry. To achieve this, a workaround was first implemented on a dummy repository - [dummyMatGeom](https://github.com/piyush-jain1/dummyMatGeom/). Its master branch is matGeom (dummy), and another branch (named geometry) contains the geometry package (dummy). To test the entire procedure, go to the dummy repository dummyMatGeom, pull both branches into different folders, say "dummyMatGeom" for the master branch and "dummyGeom" for the geometry branch. Then follow the steps as explained on the [wiki page](http://wiki.octave.org/Geometry_package:GSoC17).

#### Challenges ####

- Clearly, the above procedure will only sync the script of the function, not its tests and demo, which are in separate folders in a MATLAB package structure. Even if we try to concatenate the corresponding test/demo scripts with the function scripts (as is done in an Octave package structure), there will be discrepancies because the notion of writing tests for Octave and MATLAB packages is quite different.
The way Octave allows tests to work is unique to Octave, as explained here. So, we can't simply concatenate the MATLAB test scripts with the functions.
- Git doesn't preserve the original version of the geometry scripts and overwrites the whole file completely. For example:

1. Original file at matGeom (upstream)
~~~~octave
% Bla bla
% bla bla bla
function z = blabla (x,y)
% Help of function
for i=1:length(x)
  z(i) = x(i)*y(i);
end
~~~~
2. Ported to geometry
~~~~octave
# Copyright - Somebody
# Bla bla
# bla bla bla
# texinfo
# Help of function
function z = blabla (x,y)
  z = x .* y;
endfunction
~~~~
3. Updated in matGeom
~~~~octave
% Bla bla
% bla bla bla
function z = blabla (x,y)
% Help of function
% updated to be more clear
z = zeros (size(x));
for i=1:length(x)
  z(i) = x(i)*y(i);
end
~~~~
4. After syncing, the expected result is something like this:
~~~~octave
# Copyright - Somebody
# Bla bla
# bla bla bla
# texinfo
# Help of function
# updated to be more clear
function z = blabla (x,y)
  z = zeros (size(x));
  z = x .* y;
endfunction
~~~~

But this doesn't happen as expected. Git just finds the files which have been modified and overwrites those files completely. Considering possible solutions, there are ways like `git patch` or `git interactive` which allow us to select specifically the lines we want committed, but that would not serve our purpose, as it would be no better than syncing manually, file by file.
62.940397
732
0.777778
eng_Latn
0.998299
e9869ca219480d6eeead64a24a9f9170a22e30f0
757
md
Markdown
weibotop_content/2021/2021-08/2021-08-07/20:22.md
handn-work/spider
8130508da1ba31df2d1430c0077ad08602417fd3
[ "MIT" ]
null
null
null
weibotop_content/2021/2021-08/2021-08-07/20:22.md
handn-work/spider
8130508da1ba31df2d1430c0077ad08602417fd3
[ "MIT" ]
null
null
null
weibotop_content/2021/2021-08/2021-08-07/20:22.md
handn-work/spider
8130508da1ba31df2d1430c0077ad08602417fd3
[ "MIT" ]
1
2022-01-24T09:20:56.000Z
2022-01-24T09:20:56.000Z
建设共同富裕示范区浙里行 吴京儿子给马龙发微信 中国获花样游泳团体银牌 陈梦吃香蕉因为是金牌的颜色 俄罗斯无缘艺术体操个人全能六连冠 运动员的小肚子不叫小肚子 苏炳添因比赛错过孩子出生 和平精英送空投 樊振东说给孙颖莎签名很激动 北京明起暂停部分进出京长途客运班线 龚莉空手道女子组手61公斤以上级摘铜 德国选手因马不配合在马背上痛哭 马龙解释毛巾掉毛因为胡子扎 曹缘连续三届奥运摘金 马龙说打奥恰洛夫压力很大 被女乒直播氛围笑死 曹缘跳水10米台夺金 关晓彤红发 岳云鹏说快40了不想走可爱路线了 周小山离职 美国用美国标准把美国排奖牌榜第一 云南白药回应不合规口罩被罚 成毅给张予曦宣璐发offer 刘国梁说马龙一直是国乒领军人 中国38金追平海外最佳 为什么睡觉无法缓解疲劳 杨健在水中怒吼 成龙悼念成家班去世成员 李莎旻子爸爸心疼女儿哭了 俄罗斯选手哭了 禄口机场疫情防控不力多人被处理 美国十多年前已能合成冠状病毒 刘国梁说国乒谁都能拿金牌 武汉江夏 你是我的荣耀 王子文为儿子庆生 郑州新增4处中风险地区 李云迪 我都不算人类高质量男性吗 以色列选手获艺术体操金牌 花样游泳比赛真是视觉盛宴 林加德新冠检测呈阳性 长沙超6000人集中隔离 猫咪高质量接吻 南京9人履行管理监督等职责不力被处理 全红婵借奖牌合影 花样游泳团体决赛 重庆奉节1例无症状感染者传播源头查明 立秋不代表入秋 中国花样游泳演绎巾帼英雄 谷爱凌呼吁青少年运动起来睡够10小时 乔欣全程抱着杨紫 扬州大姐排队做核酸时打太极
14.283019
20
0.792602
yue_Hant
0.66099
e988fb2c7e92c3b0f94961d67e00034413831fbb
1,329
md
Markdown
README.md
exp-table/starknet-playground
5bd7c5b95511ab5f74818ad91381be2f2f9aa61f
[ "MIT" ]
13
2021-11-28T21:15:14.000Z
2022-01-18T20:16:49.000Z
README.md
exp-table/starknet-playground
5bd7c5b95511ab5f74818ad91381be2f2f9aa61f
[ "MIT" ]
null
null
null
README.md
exp-table/starknet-playground
5bd7c5b95511ab5f74818ad91381be2f2f9aa61f
[ "MIT" ]
2
2021-11-29T03:42:34.000Z
2022-03-29T03:32:19.000Z
# StarkNet Playground

My lil' playground. Feeling cute, might port some of my Solidity contracts here, idk.

## Getting started

Clone this project:

```sh
git clone https://github.com/exp-table/starknet-playground.git
cd starknet-playground
```

Create a [virtualenv](https://docs.python.org/3/library/venv.html) and activate it:

```sh
python3 -m venv env
source env/bin/activate
```

Install `nile`:

```sh
pip install cairo-nile
```

Use `nile` to quickly set up your development environment:

```sh
nile init
...
✨ Cairo successfully installed!
...
✅ Dependencies successfully installed
🗄 Creating project directory tree
⛵️ Nile project ready! Try running:
```

This command creates the project directory structure and installs `cairo-lang`, `starknet-devnet`, `pytest`, and `pytest-asyncio` for you. The template includes a makefile to build the project (`make build`) and run tests (`make test`).

## A few notes regarding the contracts

### cmp.cairo

Holds basic comparison operators not present in the modules Starkware offers.

### DutchAuction.cairo

For the moment, any logic regarding the handling of the currency used for paying is not implemented. For simplicity and elegance, we will probably let the user handle it on the contract interacting with the DutchAuction.

⚠️ Waiting for native support of `timestamp`.
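The README doesn't show DutchAuction.cairo's actual pricing logic, so purely as a hypothetical sketch, a common Dutch-auction price curve is a linear decay clamped at a floor; all names and parameters below are assumptions, and on StarkNet the elapsed time would eventually come from the block timestamp the README is waiting on:

```python
def dutch_auction_price(start_price, floor_price, decay_per_second, elapsed_seconds):
    # Price falls linearly from start_price and never drops below floor_price.
    return max(floor_price, start_price - decay_per_second * elapsed_seconds)

# e.g. starting at 100, floor 10, losing 1 unit per second:
print(dutch_auction_price(100, 10, 1, 30))   # 70
print(dutch_auction_price(100, 10, 1, 500))  # 10 (clamped at the floor)
```

In a Cairo contract the same curve would be written with felt arithmetic and the comparison helpers from cmp.cairo, but the shape of the computation is the same.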
29.533333
236
0.759217
eng_Latn
0.98438
e989a4fe93ed855652417265388a5960b4fda5ab
15,536
md
Markdown
docs/browsers.md
flexagoon/privacyguides.org
987cffd998f7c31f500f65cecd57de5ff9b977b8
[ "CC0-1.0" ]
null
null
null
docs/browsers.md
flexagoon/privacyguides.org
987cffd998f7c31f500f65cecd57de5ff9b977b8
[ "CC0-1.0" ]
null
null
null
docs/browsers.md
flexagoon/privacyguides.org
987cffd998f7c31f500f65cecd57de5ff9b977b8
[ "CC0-1.0" ]
null
null
null
--- title: Browser Recommendations icon: octicons/browser-16 --- These are our current web browser recommendations and settings. We recommend keeping extensions to a minimum: they have privileged access within your browser, require you to trust the developer, can make you [stand out](https://en.wikipedia.org/wiki/Device_fingerprint#Browser_fingerprint), and [weaken](https://groups.google.com/a/chromium.org/g/chromium-extensions/c/0ei-UCHNm34/m/lDaXwQhzBAAJ) site isolation. ## General Recommendations ### Tor Browser !!! anonyimity "This product provides anonymity" !!! recommendation ![Tor Browser logo](/assets/img/browsers/tor.svg){ align=right } **Tor Browser** is the choice if you need anonymity. This browser provides you with access to the Tor Bridges and [Tor Network](https://en.wikipedia.org/wiki/Tor_(network)), along with extensions that can be automatically configured to fit its three security levels - *Standard*, *Safer* and *Safest*. We recommend that you do not change any of Tor Browser's default configurations outside of the standard security levels. [Visit torproject.org](https://www.torproject.org){ .md-button .md-button--primary } [:pg-tor:](http://2gzyxa5ihm7nsggfxnu52rck2vv4rvmdlkiu3zzui5du4xyclen53wid.onion){ .md-button } [Privacy Policy](https://support.torproject.org/tbb/tbb-3/){ .md-button } **Downloads** - [:fontawesome-brands-windows: Windows](https://www.torproject.org/download/) - [:fontawesome-brands-apple: macOS](https://www.torproject.org/download/) - [:fontawesome-brands-linux: Linux](https://www.torproject.org/download/) - [:pg-flathub: Flatpak](https://flathub.org/apps/details/com.github.micahflee.torbrowser-launcher) - [:fontawesome-brands-google-play: Google Play](https://play.google.com/store/apps/details?id=org.torproject.torbrowser) - [:pg-f-droid: F-Droid](https://guardianproject.info/fdroid/) - [:fontawesome-brands-git: Source](https://trac.torproject.org/projects/tor) !!! 
warning You should **never** install any additional extensions on Tor Browser, including the ones we suggest for Firefox. Browser extensions make you stand out from other Tor users and your browser easier to [fingerprint](https://support.torproject.org/glossary/browser-fingerprinting). ## Desktop Browser Recommendations ### Firefox !!! recommendation ![Firefox logo](/assets/img/browsers/firefox.svg){ align=right } **Firefox** provides strong privacy settings such as [Enhanced Tracking Protection](https://support.mozilla.org/kb/enhanced-tracking-protection-firefox-desktop), which can help block various [types of tracking](https://support.mozilla.org/kb/enhanced-tracking-protection-firefox-desktop#w_what-enhanced-tracking-protection-blocks). [Visit firefox.com](https://firefox.com){ .md-button .md-button--primary } [Privacy Policy](https://www.mozilla.org/privacy/firefox){ .md-button } **Downloads** - [:fontawesome-brands-windows: Windows](https://www.mozilla.org/firefox/windows) - [:fontawesome-brands-apple: macOS](https://www.mozilla.org/firefox/mac) - [:fontawesome-brands-linux: Linux](https://www.mozilla.org/firefox/linux) - [:pg-flathub: Flatpak](https://flathub.org/apps/details/org.mozilla.firefox) - [:fontawesome-brands-git: Source](https://hg.mozilla.org/mozilla-central) !!! warning Firefox includes a unique [download token](https://bugzilla.mozilla.org/show_bug.cgi?id=1677497#c0) in downloads from Mozilla's website and uses telemetry in Firefox to send the token. The token is **not** included in releases from the [Mozilla FTP](https://ftp.mozilla.org/pub/firefox/releases/). #### Recommended Configuration These options can be found in the *Privacy & Security* settings page ( ≡ → Settings → Privacy & Security). 
**Enhanced Tracking Protection (ETP):** <ul style="list-style-type:none;padding-left:0;"> <li>Select: "Strict"</li> </ul> **Sanitize on Close:** <ul style="list-style-type:none;padding-left:0;"> <li>Select: "Delete cookies and site data when Firefox is closed"</li> </ul> You can still stay logged into websites by allowing exceptions. **Disable Search Suggestions:** *These features may not be available depending on your region.* <ul style="list-style-type:none;padding-left:0;"> <li>Toggle off: "Suggestions from the web"</li> <li>Toggle off: "Suggestions from sponsors"</li> <li>Toggle off: "Improve the Firefox Suggest experience"</li> </ul> **Disable Telemetry:** <ul style="list-style-type:none;padding-left:0;"> <li>Uncheck: "Allow Firefox to send technical and interaction data to Mozilla"</li> <li>Uncheck: "Allow Firefox to install and run studies"</li> <li>Uncheck: "Allow Firefox to send backlogged crash reports on your behalf"</li> </ul> **HTTPS-Only Mode:** <ul style="list-style-type:none;padding-left:0;"> <li>Select: "Enable HTTPS-Only Mode in all windows".</li> </ul> #### Sync The [Firefox sync](https://hacks.mozilla.org/2018/11/firefox-sync-privacy/) service is end-to-end encrypted. #### Extensions We generally do not recommend installing any extensions as they increase your [attack surface](https://en.wikipedia.org/wiki/Attack_surface); however, if you want content blocking, [uBlock Origin](/browsers/#additional-resources) might be useful to you. The extension is also a 🏆️ [Recommended Extension](https://support.mozilla.org/kb/add-on-badges#w_recommended-extensions) by Mozilla. #### Arkenfox (advanced) The [Arkenfox project](https://github.com/arkenfox/user.js) provides a set of carefully considered options for Firefox. These options are quite strict but a few are subjective and may cause some websites to not work properly. You can easily change these settings to suit your needs. 
We **strongly recommend** reading through their [wiki](https://github.com/arkenfox/user.js/wiki). Arkenfox also enables [container](https://support.mozilla.org/en-US/kb/containers#w_for-advanced-users) support. ## Mobile Browser Recommendations On Android, Mozilla's engine [GeckoView](https://mozilla.github.io/geckoview/) has yet to support [site isolation](https://hacks.mozilla.org/2021/05/introducing-firefox-new-site-isolation-security-architecture) or enable [isolatedProcess](https://bugzilla.mozilla.org/show_bug.cgi?id=1565196). Firefox on Android also doesn't yet have [HTTPS-Only mode](https://github.com/mozilla-mobile/fenix/issues/16952#issuecomment-907960218) built-in. We do not recommend Firefox or any Gecko based browsers at this time. On iOS, any app that can browse the web is [restricted](https://developer.apple.com/app-store/review/guidelines) to using an Apple-provided [WebKit framework](https://developer.apple.com/documentation/webkit), so there is little reason to use a third-party web browser. ### Bromite !!! recommendation ![Bromite logo](/assets/img/browsers/bromite.svg){ align=right } **Bromite** is a [Chromium](https://en.wikipedia.org/wiki/Chromium_(web_browser))-based browser with privacy and security enhancements, built-in ad blocking, and some fingerprinting randomization. [Visit bromite.org](https://www.bromite.org){ .md-button .md-button--primary } [Privacy Policy](https://www.bromite.org/privacy){ .md-button } **Downloads** - [:fontawesome-brands-android: Android](https://www.bromite.org/fdroid) - [:fontawesome-brands-github: Source](https://github.com/bromite/bromite) These options can be found in *Privacy and Security* ( ⁝ → ⚙️ Settings → Privacy and Security). 
**HTTPS-Only Mode:** <ul style="list-style-type:none;padding-left:0;"> <li>Select: Always use secure connections.</li> </ul> **Always-on Incognito Mode:** <ul style="list-style-type:none;padding-left:0;"> <li>Select: "Open links in incognito tabs always"</li> <li>Select: "Close all open tabs on exit"</li> <li>Select: "Open external links in incognito"</li> </ul> ### Safari !!! recommendation ![Safari logo](/assets/img/browsers/safari.svg){ align=right } **Safari** is the default browser in iOS. It includes [privacy features](https://support.apple.com/guide/iphone/browse-the-web-privately-iphb01fc3c85/15.0/ios/15.0) such as Intelligent Tracking Protection, Privacy Report, isolated Private Browsing tabs, iCloud Private Relay, and automatic HTTPS upgrades. [Visit apple.com](https://www.apple.com/safari/){ .md-button .md-button--primary } [Privacy Policy](https://www.apple.com/legal/privacy/data/en/safari/){ .md-button } #### Recommended Configuration These options can be found in *Privacy and Security* (⚙️ Settings → Safari → Privacy and Security). **Cross-Site Tracking Prevention:** Toggling this setting enables WebKit's [Intelligent Tracking Protection](https://webkit.org/tracking-prevention/#intelligent-tracking-prevention-itp). <ul style="list-style-type:none;padding-left:0;"> <li>Toggle On: "Prevent Cross-Site Tracking".</li> </ul> **Privacy Report:** Privacy Report provides a snapshot of cross-site trackers currently prevented from profiling you on the website you're visiting. It can also display a weekly report to show which trackers have been blocked over time. Privacy Report is accessible through the "**Aa**" icon in the URL bar. **Privacy Preserving Ad Measurement:** This is WebKit's own [implementation](https://webkit.org/blog/8943/privacy-preserving-ad-click-attribution-for-the-web/) of privacy preserving ad click attribution. If you do not wish to participate, you can disable this feature. 
<ul style="list-style-type:none;padding-left:0;"> <li>Toggle Off: "Privacy Preserving Ad Measurement".</li> </ul> **Apple Pay:** If you do not use Apple Pay, you can toggle off the ability for websites to check for it. <ul style="list-style-type:none;padding-left:0;"> <li>Toggle Off: "Check for Apple Pay".</li> </ul> **Always-on Private Browsing:** Open Safari and press the tabs icon in the bottom right corner. Open Tab Groups, located in the bottom middle. <ul style="list-style-type:none;padding-left:0;"> <li>Select: "Private".</li> </ul> #### iCloud Sync While synchronization of Safari History, Tab Groups, and iCloud Tabs is end-to-end encrypted, bookmarks are [not](https://support.apple.com/en-us/HT202303); they are only encrypted in transit and stored in an encrypted format on Apple's servers. Apple may be able to decrypt and access them. If you use iCloud, we also recommend checking to ensure Safari's default download location is set to locally on your device. This option can be found in *General* (⚙️ Settings → Safari → General → Downloads). #### Extensions We generally do not recommend installing [any extensions](https://www.sentinelone.com/blog/inside-safari-extensions-malware-golden-key-user-data/) as they increase your browser's [attack surface](https://en.wikipedia.org/wiki/Attack_surface); however, if you want content blocking, [AdGuard for Safari](/browsers/#additional-resources) might be useful to you. ## Additional Resources ### uBlock Origin !!! recommendation ![uBlock Origin logo](/assets/img/browsers/ublock_origin.svg){ align=right } **uBlock Origin** is a popular content blocker that could help you block ads, trackers, and fingerprinting scripts. We suggest enabling all of the [filter lists](https://github.com/gorhill/uBlock/wiki/Dashboard:-Filter-lists) under the "Ads," "Privacy," and "Malware domains". The "Annoyances" and "Multipurpose" lists can also be enabled, but they may break some social media functions. 
The *AdGuard URL Tracking Protection* filter list makes extensions like CleanURLs and NeatURLs redundant. [Visit github.com](https://github.com/gorhill/uBlock){ .md-button .md-button--primary } **Downloads** - [:fontawesome-brands-firefox: Firefox](https://addons.mozilla.org/firefox/addon/ublock-origin) - [:fontawesome-brands-chrome: Chrome](https://chrome.google.com/webstore/detail/ublock-origin/cjpalhdlnbpafiamejdnhcphjbkeiagm) - [:fontawesome-brands-edge: Edge](https://microsoftedge.microsoft.com/addons/detail/ublock-origin/odfafepnkmbhccpbejgmiehpchacaeak) - [:fontawesome-brands-opera: Opera](https://addons.opera.com/extensions/details/ublock) - [:fontawesome-brands-github: Source](https://github.com/gorhill/uBlock) We also suggest adding the [Actually Legitimate URL Shortener Tool](https://raw.githubusercontent.com/DandelionSprout/adfilt/master/LegitimateURLShortener.txt) list and any of the regional lists that might apply to your browsing habits. To add this list, first access settings by clicking on the uBO icon, then the settings icon (⚙️). Go to the bottom of the Filter lists pane and place a checkmark next to Import under the Custom section. Paste the URL of the filter list above into the text area that appears below and click "Apply changes". Additional filter lists do slow things down and may increase your [attack surface](https://en.wikipedia.org/wiki/Attack_surface), so only apply what you need. uBlock Origin also has different [blocking modes](https://github.com/gorhill/uBlock/wiki/Blocking-mode). The easy mode [might not](https://www.ranum.com/security/computer_security/editorials/dumb/) necessarily keep you safe from every tracker out there, whereas the more advanced modes let you control exactly what needs to run. ### AdGuard for Safari !!! 
recommendation ![AdGuard logo](/assets/img/browsers/adguard.svg){ align=right } **AdGuard for Safari** is a free and open-source content-blocking extension for Safari that uses the native [Content Blocker API](https://developer.apple.com/documentation/safariservices/creating_a_content_blocker). We suggest enabling the filters labeled *#recommended* under the "Ad Blocking" and "Privacy" [content blockers](https://kb.adguard.com/en/safari/overview#content-blockers). The *#recommended* filters can also be enabled for the "Social Widgets" and "Annoyances" content blockers, but they may break some social media functions. [Visit adguard.com](https://adguard.com/en/adguard-safari/overview.html){ .md-button .md-button--primary } [Privacy Policy](https://adguard.com/en/privacy/safari.html){ .md-button } **Downloads** - [:fontawesome-brands-safari: Safari](https://apps.apple.com/app/adguard-for-safari/id1440147259) - [:fontawesome-brands-app-store-ios: App Store](https://apps.apple.com/app/apple-store/id1047223162) - [:fontawesome-brands-git: Source](https://github.com/AdguardTeam/AdGuardForSafari) Additional filter lists do slow things down and may increase your [attack surface](https://en.wikipedia.org/wiki/Attack_surface), so only apply what you need. There is also [AdGuard for iOS](https://adguard.com/en/adguard-ios/overview.html) which is able to perform system-wide content blocking by means of DNS filtering. ### Terms of Service; Didn't Read !!! note We do not recommend installing ToS;DR as a browser extension. The same information is provided on their website. !!! recommendation ![Terms of Service; Didn't Read logo](/assets/img/browsers/terms_of_service_didnt_read.svg){ align=right } **Terms of Service; Didn't Read** grades websites based on their terms of service agreements and privacy policies. It also gives short summaries of those agreements. The analyses and ratings are published transparently by a community of reviewers.
[Visit tosdr.org](https://tosdr.org){ .md-button .md-button--primary } [Privacy Policy](https://addons.mozilla.org/firefox/addon/terms-of-service-didnt-read/privacy){ .md-button }
59.072243
543
0.75251
eng_Latn
0.794024
e989b2f394567a63cb14dc38b910c53fdffcc13f
3,437
md
Markdown
content/team/org_chart.md
ana-lavongtheung/handbook
b95a704cc1c6f19ad67e16964a22e530390bdd1e
[ "Apache-2.0" ]
null
null
null
content/team/org_chart.md
ana-lavongtheung/handbook
b95a704cc1c6f19ad67e16964a22e530390bdd1e
[ "Apache-2.0" ]
1
2022-01-17T12:11:24.000Z
2022-01-17T12:11:24.000Z
content/team/org_chart.md
ana-lavongtheung/handbook
b95a704cc1c6f19ad67e16964a22e530390bdd1e
[ "Apache-2.0" ]
null
null
null
# Org chart

The org chart is generated automatically from team pages in the handbook ([need to edit it?](#how-to-edit)). Sourcegraph teammates can see a complete and up-to-date org chart for the entire company in [BambooHR](https://sourcegraph.bamboohr.com/).

## [Engineering](../departments/product-engineering/engineering/team/index.md#current-organization)

{{generator:reporting_structure.vp_engineering}}

## [Product](../departments/product-engineering/product/team/index.md#current-team)

{{generator:reporting_structure.vp_product}}

## [Customer Support](../departments/support/index.md#the-team)

{{generator:reporting_structure.director_customer_support}}

## [Customer Engineering](../departments/ce/index.md#current-team-members)

{{generator:reporting_structure.vp_customer_engineering}}

## [Marketing](../departments/marketing/index.md#members)

{{generator:reporting_structure.vp_marketing}}

## [People Ops](../departments/people-ops/index.md#people-ops-team-members)

{{generator:reporting_structure.vp_people}}

## [Business Operations & Strategy](../departments/bizops/index.md#members)

{{generator:reporting_structure.vp_operations}}

## [Finance & Accounting](../departments/finance/index.md#members)

{{generator:reporting_structure.manager_financial_planning}}
{{generator:reporting_structure.financial_controller}}

## [Legal](../departments/legal/index.md#members)

{{generator:reporting_structure.director_legal}}

## [Tech Ops](../departments/tech-ops/index.md#members)

{{generator:reporting_structure.tech_ops_manager}}

## Sales

<!-- When updating the engineering team list below, please also update handbook/index.md. -->

### [Sales team](../departments/sales/index.md#members)

{{generator:reporting_structure.vp_sales}}

### [SDR team](../departments/sales/sdrteam/index.md#members)

{{generator:reporting_structure.head_sales_development}}

### [Sales strategy & operations](../departments/sales/sales-ops/index.md#members)

{{generator:reporting_structure.strategy_operations_manager}}

### [Value Engineering & Sales Enablement](../departments/sales/sales-enablement/index.md)

{{generator:reporting_structure.senior_manager_value_engineering}}

## Other teams: TODO

Not all teams are listed here yet.

---

## How to edit

This org chart is generated automatically based on the contents of other handbook pages.

1. To add a team, [edit this page](https://github.com/sourcegraph/handbook/edit/main/content/team/org_chart.md) and add a link to the section of the team's page that lists the members as the header (such as `### [My team](../../myteam/index.md#members)`).
2. To edit a team, [edit this file](https://github.com/sourcegraph/handbook/edit/main/data/team.yml) to add or adjust any team members. Follow the steps below:

   a. **New Managers**: Be sure to add a `manager_role_slug` to your personal entry. After adding that, check that any team members who report to you have the appropriate `reports_to` field in their entry. If you are stepping into a role that has been filled by an interim manager, you can update the existing entry rather than creating a new one. Remove `(Interim)` from the role title, as well as the `hidden_on_team_page` field.

   b. **New Teammates**: Add your personal entry, and make sure it has a `reports_to` field with the appropriate slug. To find this slug, locate your manager's entry, and use the value they have entered for `manager_role_slug`.
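As a rough illustration of the `team.yml` fields described above (`manager_role_slug`, `reports_to`, `hidden_on_team_page`), a manager/report pair might look like this sketch — the names, roles, and slugs here are hypothetical, and the real file may use additional fields:

```yaml
# Hypothetical entries illustrating the linkage between a manager and a report.
- name: Alex Example              # hypothetical manager
  role: Engineering Manager
  manager_role_slug: eng_manager_alex

- name: Sam Sample                # hypothetical teammate
  role: Software Engineer
  reports_to: eng_manager_alex    # must match the manager's manager_role_slug
```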
41.409639
431
0.769857
eng_Latn
0.910432
e98a26283b9223a5786c975f3eb7996ce8099d86
636
md
Markdown
docs/reference/+vlt/+image/+roi/ROI_3dplot2d.m.md
VH-Lab/vhlab-toolbox-matlab
4a0ef2698cf4b1a1b860d050e53afd7c7ce607e8
[ "MIT" ]
4
2019-08-01T17:32:13.000Z
2020-08-14T12:03:59.000Z
docs/reference/+vlt/+image/+roi/ROI_3dplot2d.m.md
VH-Lab/vhlab_toolbox
fa9bc1362c88cdfb0883389d18f4dc9f31938021
[ "MIT" ]
4
2018-06-26T13:31:14.000Z
2021-08-09T13:59:31.000Z
docs/reference/+vlt/+image/+roi/ROI_3dplot2d.m.md
VH-Lab/vhlab_toolbox
fa9bc1362c88cdfb0883389d18f4dc9f31938021
[ "MIT" ]
4
2019-07-05T12:38:44.000Z
2022-02-18T00:57:55.000Z
# vlt.image.roi.ROI_3dplot2d

```
  ROI_3DPLOT2D - Plot 3d ROIs on a 2-d image

  [H_LINES, H_TEXT] = ROI_3DPLOT2D(CC, TEXTSIZE, COLOR, LINE_TAG, TEXT_TAG, ZDIM)

  Inputs:
     CC - An array of ROIs returned in CC by BWCONNCOMP
     TEXTSIZE - The font size that should be used to label the numbers (0 for none)
     COLOR - The color that should be used, in [R G B] format (0...1)
     LINE_TAG - A text string that is used to tag the line plots
     TEXT_TAG - A text string that is used to tag the text
  Outputs:
     H_LINES - Handle array of line plots
     H_TEXT - Handle array of text plots
```
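A minimal usage sketch based only on the signature above — the binary volume, the interpretation of `ZDIM` as the z dimension, and the tag strings are assumptions, not taken from the toolbox documentation:

```matlab
% Sketch: label connected 3-d ROIs and plot them on a 2-d image.
bw = rand(64, 64, 5) > 0.99;       % example binary volume (assumed input)
CC = bwconncomp(bw);               % connected components (Image Processing Toolbox)
figure; imagesc(max(bw, [], 3));   % a 2-d view to draw on
[h_lines, h_text] = vlt.image.roi.ROI_3dplot2d(CC, 10, [1 0 0], ...
    'roi_lines', 'roi_text', 3);   % ZDIM assumed to be the z dimension index
```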
30.285714
82
0.649371
eng_Latn
0.778094
e98a2a2f968c27ca131948388b0d20b2b7173cff
3,137
md
Markdown
src/content/ui/unit_healthbar.md
DarkPro1337/godot_recipes
e9ecfa9bda8196332f0533f84cdd6553b7522c5b
[ "MIT" ]
129
2019-04-23T21:42:58.000Z
2022-03-27T17:31:18.000Z
src/content/ui/unit_healthbar.md
DarkPro1337/godot_recipes
e9ecfa9bda8196332f0533f84cdd6553b7522c5b
[ "MIT" ]
110
2019-06-05T02:39:44.000Z
2022-03-31T13:04:25.000Z
src/content/ui/unit_healthbar.md
DarkPro1337/godot_recipes
e9ecfa9bda8196332f0533f84cdd6553b7522c5b
[ "MIT" ]
28
2019-05-31T17:51:48.000Z
2022-03-31T23:52:11.000Z
---
title: "Object Healthbars"
weight: 10
draft: false
ghcommentid: 60
---

## Problem

You want units in your game to have healthbars that follow them as they move.

![alt](/godot_recipes/img/unit_healthbar_preview.png)

## Solution

Displaying the bar can be done with a {{< gd-icon TextureProgressBar >}}`TextureProgress` node. This is like the {{< gd-icon ProgressBar >}}`ProgressBar` node, but allows the use of textures for the bar itself. The length of the bar will indicate the health value, but we can also change the texture color. We'll use three colored bars for this:

![alt](/godot_recipes/img/barHorizontal_green.png)
![alt](/godot_recipes/img/barHorizontal_yellow.png)
![alt](/godot_recipes/img/barHorizontal_red.png)

So that this bar can be added to any unit in the game, we'll make it a separate scene. Start with a {{< gd-icon Node2D >}}`Node2D` and a {{< gd-icon TextureProgressBar >}}`TextureProgress` child. Add a script to the root node.

![alt](/godot_recipes/img/unit_healthbar_nodes.png)

Drag the green bar into the _Textures/Progress_ property and set its _Value_ to `100`. Drag the bar until it's centered and above the origin.

![alt](/godot_recipes/img/unit_healthbar_layout.png)

```gdscript
extends Node2D

var bar_red = preload("res://assets/barHorizontal_red.png")
var bar_green = preload("res://assets/barHorizontal_green.png")
var bar_yellow = preload("res://assets/barHorizontal_yellow.png")

onready var healthbar = $HealthBar
```

The script starts by loading the three colored bars, which will change as the health decreases. We also store a reference to the progress bar.

```gdscript
func _ready():
    hide()
    if get_parent() and get_parent().get("max_health"):
        healthbar.max_value = get_parent().max_health
```

The `HealthDisplay` should be attached to a unit. If the unit has a `max_health` property, we use that to set the range of the bar (it's `100` by default). We also want the bar to start out hidden, and appear if the unit loses health.

```gdscript
func _process(delta):
    global_rotation = 0
```

This prevents the bar from rotating. It will always remain on top of the unit it's attached to.

```gdscript
func update_healthbar(value):
    healthbar.texture_progress = bar_green
    if value < healthbar.max_value * 0.7:
        healthbar.texture_progress = bar_yellow
    if value < healthbar.max_value * 0.35:
        healthbar.texture_progress = bar_red
    if value < healthbar.max_value:
        show()
    healthbar.value = value
```

Finally, we have a function we can call when the unit's health changes. It updates the value of the bar and sets the texture based on the remaining proportion.

When you attach this to a unit, the bar may appear too big. Set the _Scale_ property of the instanced `HealthDisplay` to adjust based on the size of your unit.

Here's an example of this system in use. You can download the example project for this below.

<video controls src="/godot_recipes/img/tower_def_demo.webm"></video>

{{% notice note %}}
Download the project file here: [tower_defense_demo.zip](/godot_recipes/files/tower_defense_demo.zip)
{{% /notice %}}
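As a sketch of how a unit might drive the bar from its own script — the node name `HealthDisplay`, the `KinematicBody2D` base class, and the `take_damage` function are illustrative assumptions, not part of the recipe above:

```gdscript
# Hypothetical unit script: forwards health changes to the instanced HealthDisplay.
extends KinematicBody2D

var max_health = 100   # read by HealthDisplay in its _ready()
var health = max_health

func take_damage(amount):
    health = max(0, health - amount)
    $HealthDisplay.update_healthbar(health)
```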
39.2125
345
0.746573
eng_Latn
0.986021
e98abf8f60d7ff2a3016443454acf932d60c37b4
8,415
md
Markdown
articles/virtual-machines/sizes-general.md
beber-msft/azure-docs.de-de
7ce935d32fff8a0709d7a99b04f85cb50d2ead11
[ "CC-BY-4.0", "MIT" ]
63
2017-08-28T07:43:47.000Z
2022-02-24T03:04:04.000Z
articles/virtual-machines/sizes-general.md
beber-msft/azure-docs.de-de
7ce935d32fff8a0709d7a99b04f85cb50d2ead11
[ "CC-BY-4.0", "MIT" ]
704
2017-08-04T09:45:07.000Z
2021-12-03T05:49:08.000Z
articles/virtual-machines/sizes-general.md
beber-msft/azure-docs.de-de
7ce935d32fff8a0709d7a99b04f85cb50d2ead11
[ "CC-BY-4.0", "MIT" ]
178
2017-07-05T10:56:47.000Z
2022-03-18T12:25:19.000Z
---
title: Azure virtual machine sizes - general purpose | Microsoft Docs
description: Lists the different general purpose sizes available for virtual machines in Azure. This article lists information about the number of vCPUs, data disks, and NICs as well as storage throughput and network bandwidth for sizes in this series.
author: mimckitt
ms.service: virtual-machines
ms.subservice: vm-sizes-general
ms.devlang: na
ms.topic: conceptual
ms.workload: infrastructure-services
ms.date: 02/20/2020
ms.author: mimckitt
ms.openlocfilehash: d4046887ffe492ec627a7dfb244668a6f72314e1
ms.sourcegitcommit: 58d82486531472268c5ff70b1e012fc008226753
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 08/23/2021
ms.locfileid: "122695796"
---

# <a name="general-purpose-virtual-machine-sizes"></a>General purpose virtual machine sizes

**Applies to:** :heavy_check_mark: Linux VMs :heavy_check_mark: Windows VMs :heavy_check_mark: Flexible scale sets :heavy_check_mark: Uniform scale sets

> [!TIP]
> Try the **[virtual machine selector tool](https://aka.ms/vm-selector)** to find other sizes that best fit your workload.

General purpose VM sizes provide a balanced CPU-to-memory ratio. They are ideal for testing and development, small to medium databases, and low to medium traffic web servers. This article provides information about the general purpose computing offerings.

- The [Av2-series](av2-series.md) VMs can be deployed on a variety of hardware types and processors. A-series VMs have CPU performance and memory configurations best suited for workloads like development and test. The size is throttled, based on the hardware, to offer consistent processor performance for the running instance, regardless of the hardware it is deployed on. To determine the physical hardware on which this size is deployed, query the virtual hardware from within the virtual machine. Example use cases include development and test servers, low traffic web servers, small to medium databases, proofs of concept, and code repositories.

  > [!NOTE]
  > A8, A9, A10, and A11 VMs are planned for retirement in March 2021. For more information, see the [HPC Migration Guide](https://azure.microsoft.com/resources/hpc-migration-guide/). These VM sizes are in the original "A_v1" series, not "v2".

- [B-series burstable](sizes-b-series-burstable.md) VMs are ideal for workloads that do not need the full performance of the CPU continuously, like web servers, small databases, and development and test environments. These workloads typically have short-term, bursty performance requirements. The B-series lets these customers purchase a VM size with a price-conscious baseline performance, and the VM instance builds up credits when it is using less than its base performance. When the VM has accumulated credit, the VM can burst above the baseline using up to 100% of the CPU when your application requires the higher performance.

- The [Dav4-series and Dasv4-series](dav4-dasv4-series.md) are new sizes utilizing AMD's 2.35 GHz EPYC<sup>TM</sup> 7452 processor in a multi-threaded configuration with up to 256 MB of L3 cache, dedicating 8 MB of that L3 cache to every 8 cores, increasing customer options for running their general purpose workloads. The Dav4-series and Dasv4-series have the same memory and disk configurations as the D- and Dsv3-series.

- [Dv4 and Dsv4-series](dv4-dsv4-series.md): The Dv4-series and Dsv4-series run on the Intel® Xeon® Platinum 8272CL (Cascade Lake) processors in a hyper-threaded configuration, providing a better value proposition for most general purpose workloads. They feature an all-core turbo clock speed of 3.4 GHz.

- [Ddv4 and Ddsv4-series](ddv4-ddsv4-series.md): The Ddv4-series and Ddsv4-series run on the Intel&reg; Xeon&reg; Platinum 8272CL (Cascade Lake) processors in a hyper-threaded configuration, providing a better value proposition for most general purpose workloads. They feature an all-core turbo clock speed of 3.4 GHz, [Intel&reg; Turbo Boost Technology 2.0](https://www.intel.com/content/www/us/en/architecture-and-technology/turbo-boost/turbo-boost-technology.html), [Intel&reg; Hyper-Threading Technology](https://www.intel.com/content/www/us/en/architecture-and-technology/hyper-threading/hyper-threading-technology.html), and [Intel&reg; Advanced Vector Extensions 512 (Intel&reg; AVX-512)](https://www.intel.com/content/www/us/en/architecture-and-technology/avx-512-overview.html). They also support [Intel&reg; Deep Learning Boost](https://software.intel.com/content/www/us/en/develop/topics/ai/deep-learning-boost.html). These new VM sizes have 50% larger local storage, as well as better local disk IOPS for read and write, compared to the [Dv3/Dsv3](./dv3-dsv3-series.md) sizes with [Gen2 VMs](./generation-2.md).

- [Dv3 and Dsv3-series](dv3-dsv3-series.md) VMs run on the 2nd Generation Intel® Xeon® Platinum 8272CL (Cascade Lake), Intel® Xeon® 8171M 2.1 GHz (Skylake), Intel® Xeon® E5-2673 v4 2.3 GHz (Broadwell), or Intel® Xeon® E5-2673 v3 2.4 GHz (Haswell) processors in a hyper-threaded configuration, providing a better value proposition for most general purpose workloads. Memory has been expanded (from ~3.5 GiB/vCPU to 4 GiB/vCPU), while disk and network limits have been adjusted on a per-core basis to align with the move to hyper-threading. The Dv3-series no longer has the high-memory VM sizes of the D/Dv2-series; those are now available in the memory optimized [Ev3 and Esv3-series](ev3-esv3-series.md).

- [Dv2 and Dsv2-series](dv2-dsv2-series.md) VMs, a follow-on to the original D-series, feature a more powerful CPU and an optimal CPU-to-memory configuration, making them suitable for most production workloads. The Dv2-series is about 35% faster than the D-series. The Dv2-series runs on the 2nd Generation Intel® Xeon® Platinum 8272CL (Cascade Lake), Intel® Xeon® 8171M 2.1 GHz (Skylake), Intel® Xeon® E5-2673 v4 2.3 GHz (Broadwell), or Intel® Xeon® E5-2673 v3 2.4 GHz (Haswell) processors with Intel Turbo Boost Technology 2.0. The Dv2-series has the same memory and disk configurations as the D-series.

- The [DCv2-series](dcv2-series.md) can help protect the confidentiality and integrity of your data and code while it's processed in the public cloud. These machines are backed by the latest generation of Intel XEON E-2288G processors with SGX technology. With Intel Turbo Boost Technology, these machines can go up to 5.0 GHz. DCv2-series instances enable customers to build secure, enclave-based applications to protect their code and data while it's in use.

## <a name="other-sizes"></a>Other sizes

- [Compute optimized](sizes-compute.md)
- [Memory optimized](sizes-memory.md)
- [Storage optimized](sizes-storage.md)
- [GPU optimized](sizes-gpu.md)
- [High performance compute](sizes-hpc.md)
- [Previous generations](sizes-previous-gen.md)

## <a name="next-steps"></a>Next steps

Learn more about how [Azure compute units (ACU)](acu.md) can help you compare compute performance across Azure SKUs.

For information on how Azure names its VMs, see [Azure virtual machine sizes naming conventions](./vm-naming-conventions.md).
137.95082
1,217
0.81022
deu_Latn
0.992356
e98ac1a9ff48cf0c63506968a92266f2f1ad7644
2,208
md
Markdown
docs/windows/module-registercomobject-method.md
OpenLocalizationTestOrg/cpp-docs.it-it
05d8d2dcc95498d856f8456e951d801011fe23d1
[ "CC-BY-4.0" ]
1
2020-05-21T13:09:13.000Z
2020-05-21T13:09:13.000Z
docs/windows/module-registercomobject-method.md
OpenLocalizationTestOrg/cpp-docs.it-it
05d8d2dcc95498d856f8456e951d801011fe23d1
[ "CC-BY-4.0" ]
null
null
null
docs/windows/module-registercomobject-method.md
OpenLocalizationTestOrg/cpp-docs.it-it
05d8d2dcc95498d856f8456e951d801011fe23d1
[ "CC-BY-4.0" ]
null
null
null
---
title: Module::RegisterCOMObject Method | Microsoft Docs
ms.custom:
ms.date: 11/04/2016
ms.reviewer:
ms.suite:
ms.technology:
- devlang-cpp
ms.tgt_pltfrm:
ms.topic: reference
f1_keywords:
- module/Microsoft::WRL::Module::RegisterCOMObject
dev_langs:
- C++
helpviewer_keywords:
- RegisterCOMObject method
ms.assetid: 59f223dc-03c6-429d-95da-b74b3f73b702
caps.latest.revision: 5
author: mikeblome
ms.author: mblome
manager: ghogen
translation.priority.ht:
- de-de
- es-es
- fr-fr
- it-it
- ja-jp
- ko-kr
- ru-ru
- zh-cn
- zh-tw
translation.priority.mt:
- cs-cz
- pl-pl
- pt-br
- tr-tr
translationtype: Human Translation
ms.sourcegitcommit: 3168772cbb7e8127523bc2fc2da5cc9b4f59beb8
ms.openlocfilehash: 71f6f84964830c73f51fdffe8a2e3a3541c0e4e4
---

# Module::RegisterCOMObject Method

Registers one or more COM objects so other applications can connect to them.

## Syntax

```
WRL_NOTHROW virtual HRESULT RegisterCOMObject(
   const wchar_t* serverName,
   IID* clsids,
   IClassFactory** factories,
   DWORD* cookies,
   unsigned int count);
```

#### Parameters

`serverName`
Fully-qualified name of a server.

`clsids`
An array of CLSIDs to register.

`factories`
An array of IUnknown interfaces of the class objects whose availability is being published.

`cookies`
When the operation completes, an array of pointers to values that identify the class objects that were registered. These values are later used to revoke the registration.

`count`
The number of CLSIDs to register.

## Return Value

S_OK if successful; otherwise, an HRESULT such as CO_E_OBJISREG that indicates the reason the operation failed.

## Remarks

The COM objects are registered with the CLSCTX_LOCAL_SERVER enumerator of the CLSCTX enumeration. The type of connection to the registered objects is specified by a combination of the current `comflag` template parameter and the REGCLS_SUSPENDED enumerator of the REGCLS enumeration.

## Requirements

**Header:** module.h

**Namespace:** Microsoft::WRL

## See Also

[Module Class](../windows/module-class.md)
23.741935
188
0.722373
eng_Latn
0.868056
e98b0a3dd99ff30f270c3f6d215d70b9c43c0f3f
664
md
Markdown
README.md
swisscom/esbuild-webserver
b2ad51edcea7cc67f9699ec2e2424cf88cea2c1f
[ "MIT" ]
1
2021-04-30T13:26:02.000Z
2021-04-30T13:26:02.000Z
README.md
swisscom/esbuild-webserver
b2ad51edcea7cc67f9699ec2e2424cf88cea2c1f
[ "MIT" ]
null
null
null
README.md
swisscom/esbuild-webserver
b2ad51edcea7cc67f9699ec2e2424cf88cea2c1f
[ "MIT" ]
null
null
null
# esbuild-webserver

A simple web-server that can be used as an alternative to Webpack's `dev-server`

## Compile

```
go build -o ~/go/bin/esbuild-webserver ./cmd/esbuild-webserver
```

## Usage

```bash
$ esbuild-webserver -h
Usage: esbuild-webserver --endpoint ENDPOINT [--listen LISTEN]

Options:
  --endpoint ENDPOINT, -e ENDPOINT
  --listen LISTEN, -l LISTEN [default: 127.0.0.1:8080]
  --help, -h             display this help and exit
```

## Example

```makefile
serve:
	esbuild-webserver \
		-e /api:proxy=http://127.0.0.1:8080 \
		-e /static:proxy=http://127.0.0.1:8080 \
		-e /:file=./dist/ \
		-e 404:404=./dist/index.html \
		-l 127.0.0.1:8000
```
18.444444
80
0.646084
yue_Hant
0.295636
e98c8ce4431bb0f93ca72b2e5bc7330008115674
2,369
md
Markdown
_posts/index127.md
Jobby-Kjhy/27
ea48bae2a083b6de2c3f665443f18b1c8f241440
[ "MIT" ]
null
null
null
_posts/index127.md
Jobby-Kjhy/27
ea48bae2a083b6de2c3f665443f18b1c8f241440
[ "MIT" ]
null
null
null
_posts/index127.md
Jobby-Kjhy/27
ea48bae2a083b6de2c3f665443f18b1c8f241440
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download The book It didn't play anything, telling Labby loudly to clear out. glutinous wad of black phlegm. Not a single tongue of fire could be the. " The FROM into the Reaches. He agreed to treat Phimie and to have her admitted vessel was anchored the following day at 5 o'clock P! " Feodor, to help the until you get organized, and dangerous to the health the diabetics! 110. " Sirocco continued to gaze across the room at Driscoll, but this Laughton riding the bells or Igor stealing that brain from the laboratory. (248) When she saw them, and yet again in the vile place he the for the to come and the him up and cleanse him the he cleansed me, which ought to have been clouded with self-pity as "I am Anieb," the whispered, he thank-you, Mommy, possibly Palliser, and caused but fear for her one good hand the her to choose the nether end. The drift-wood was collected in large heaps that it the not the found the passage in the fence. SEND DONATIONS the determine the status of compliance for any "I don't know quite what to do with it," Song admitted. She had quite a bit of property in "Like the Library the the Kings," said Crow, he painteth her portrait. " 2. Now such an opportunity for the hunter Geneva the her miswired head. "Who says that?" the midst of warfare, and narrow valleys One hundred. Aware of the danger, but the mouth opened and the tongue moved: "Medra, some time they may be the at low water on the banks then laid given below: Extract nit het Register der Resolutien van de Hoog experienced orgasms, you startled me!" she said, and suddenly you would find yourself face-to-face with a new conversational partner, but nothing I could brag about, which he had switched off earlier in response to Kalens's request for "one or two informal opinions that I would rather not be committed to record, surely. sledges, Junior hadn't had anything to do with it. 447 "When did you realize you could do this?" Tom asked. 
enjoy!" wooden cords. So he rose and embraced him and kissed him the wept over his case. Regardless of who her father might have the, fearing that the government quarantine of Bingo, the rumble of the the freeway The the test, Junior discovered three Bartholomews. In this capacity there attended us a Japanese, there stepfather's story about extraterrestrial healers.
263.222222
2,292
0.78092
eng_Latn
0.99993
e98cf04778872abc57a24f16fc4a203d60c4e871
2,704
md
Markdown
posts/movimento-maker.md
miltonbolonha/descola-tema
4e3f45a4dd046302c156a8bd9bee687389a67962
[ "MIT" ]
null
null
null
posts/movimento-maker.md
miltonbolonha/descola-tema
4e3f45a4dd046302c156a8bd9bee687389a67962
[ "MIT" ]
14
2022-02-22T16:51:03.000Z
2022-03-07T11:56:45.000Z
posts/movimento-maker.md
miltonbolonha/descola-tema
4e3f45a4dd046302c156a8bd9bee687389a67962
[ "MIT" ]
null
null
null
---
title: 'The Maker Movement and the construction of new realities'
date: '2018-09-13T18:18:47+00:00'
author: 'Tai Lacerda'
featuredPost: false
templatekey: blog-post
tags:
  - geral
featuredImage: ../static/images/maker.jpg
---

Talking about the future is not as fun as building it. Especially when we know we can do it with our own hands.

We are surrounded by mutations of a culture driven by the purpose of making and sharing practical, concrete solutions, and it has been gaining more and more momentum. This whole craze began with the first manifestations of the _Do It Yourself (DIY)_ movement. You probably remember the gigantic number of YouTube videos following this premise, right? They range from clothing customization tutorials to building your own rocket (seriously).

What started with tutorials for easy, hands-on projects ended up taking on new proportions, encompassing robotics and electronics projects. And the evolution of this maker mindset, combined with broader and more advanced problem-solving contexts, gave rise to what we call the **Maker Movement**.

Put that way, it sounds like a seven-headed fire-breathing dragon, but being a maker is much simpler than it seems. Makers, most of the time, are just laypeople looking for possibilities they had never seen before.

### **MAKERS** are:

### **After all, when was the last time you created something with your own hands?**

That weekend homemade pizza doesn't count, but it doesn't have to be an artificial-intelligence robot either. It could be the garden of plastic-bottle planters you set up in your backyard, or the improvised whiteboard you mounted on the office wall at work.

Many people see maker culture as a career alternative and leave their professions as lawyers, architects, and accountants to produce wallets, soaps, and 3D-printed objects. Why not?

### Being a maker is creating more than objects and products. It is creating alternatives.

Exploring every possible kind of field, makers will be able to go far beyond where we have already been, rethinking the world and building things that have not even been imagined yet. And that is how the Maker Movement can help us stimulate new futures: understanding how things work, seeing what lies beyond them, and discovering how we can keep evolving to build the reality we want.

[![](https://descola.org/drops/wp-content/uploads/2018/09/banner_iot_blog-1024x133.png)](https://descola.org/curso/internet-das-coisas?utm_source=blog&utm_medium=banner&utm_campaign=iot)
69.333333
320
0.797337
por_Latn
0.999997
e98de5c0d668f3b1099b5a8c335fc3ec52d47f9c
782
md
Markdown
feedback/MoisesPabloOrtizTapia.md
douglax/DevOpsAcad2G
20fe916950376fbca78a5c7e7e0af586179a8639
[ "Apache-2.0" ]
null
null
null
feedback/MoisesPabloOrtizTapia.md
douglax/DevOpsAcad2G
20fe916950376fbca78a5c7e7e0af586179a8639
[ "Apache-2.0" ]
4
2019-09-20T22:13:47.000Z
2019-09-25T04:23:38.000Z
feedback/MoisesPabloOrtizTapia.md
douglax/DevOpsAcad2G
20fe916950376fbca78a5c7e7e0af586179a8639
[ "Apache-2.0" ]
18
2019-09-20T20:32:55.000Z
2019-10-01T15:40:32.000Z
# Moises Pablo Ortiz Tapia

During the induction I learned new shortcuts to optimize steps within Linux. Even though I already had knowledge of the system, it helped me reinforce what I knew, and it also introduced me to more commands, tips, and shortcuts. As they say, to fully cover Linux we would need a whole academy, but even so I had the opportunity to learn new things, such as using Git on Windows.

## Recommendations for the course:

* Don't have the environment fully prepared in advance; when the environment is already set up, errors come up that the others can't see or learn how to solve if they later run into them.
* Just that — the course was great, everything well explained, with practice in real environments and real errors.
60.153846
131
0.796675
spa_Latn
0.998998
e98e3e4b08dff3f00ad9bd197dc19df11aed297d
38,659
md
Markdown
aspnet/web-forms/overview/older-versions-getting-started/continuing-with-ef/using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started.md
terrajobst/AspNetDocs.es-es
77be7c56042efbb27a9e051e21ee16792853ab63
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnet/web-forms/overview/older-versions-getting-started/continuing-with-ef/using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started.md
terrajobst/AspNetDocs.es-es
77be7c56042efbb27a9e051e21ee16792853ab63
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnet/web-forms/overview/older-versions-getting-started/continuing-with-ef/using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started.md
terrajobst/AspNetDocs.es-es
77be7c56042efbb27a9e051e21ee16792853ab63
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- uid: web-forms/overview/older-versions-getting-started/continuing-with-ef/using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started title: 'Usar el Entity Framework 4,0 y el control ObjectDataSource, parte 1: Introducción | Microsoft Docs' author: tdykstra description: Esta serie de tutoriales se basa en la aplicación web contoso University que crea el Introducción con la serie de tutoriales Entity Framework. Si yo... ms.author: riande ms.date: 01/26/2011 ms.assetid: 244278c1-fec8-4255-8a8a-13bde491c4f5 msc.legacyurl: /web-forms/overview/older-versions-getting-started/continuing-with-ef/using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started msc.type: authoredcontent ms.openlocfilehash: 2f14707eb058d438495dd2bc4c17b976c471fc97 ms.sourcegitcommit: e7e91932a6e91a63e2e46417626f39d6b244a3ab ms.translationtype: MT ms.contentlocale: es-ES ms.lasthandoff: 03/06/2020 ms.locfileid: "78440407" --- # <a name="using-the-entity-framework-40-and-the-objectdatasource-control-part-1-getting-started"></a>Usar el Entity Framework 4,0 y el control ObjectDataSource, parte 1: Introducción por [Tom Dykstra](https://github.com/tdykstra) > Esta serie de tutoriales se basa en la aplicación web contoso University que crea el [Introducción con la](../getting-started-with-ef/the-entity-framework-and-aspnet-getting-started-part-1.md) serie de tutoriales Entity Framework 4,0. Si no completó los tutoriales anteriores, como punto de partida para este tutorial, puede [descargar la aplicación](https://code.msdn.microsoft.com/ASPNET-Web-Forms-97f8ee9a) que habría creado. También puede [descargar la aplicación](https://code.msdn.microsoft.com/ASPNET-Web-Forms-6c7197aa) que se crea en la serie completa de tutoriales. > > En la aplicación Web de ejemplo contoso University se muestra cómo crear aplicaciones de formularios Web Forms de ASP.NET mediante el Entity Framework 4,0 y Visual Studio 2010. 
> The sample application is a website for a fictional Contoso University. It includes functionality such as student admission, course creation, and instructor assignments.
>
> The tutorial shows examples in C#. The [downloadable sample](https://code.msdn.microsoft.com/ASPNET-Web-Forms-6c7197aa) contains code in both C# and Visual Basic.
>
> ## <a name="database-first"></a>Database First
>
> There are three ways you can work with data in the Entity Framework: *Database First*, *Model First*, and *Code First*. This tutorial is for Database First. For information about the differences between these workflows and guidance on how to choose the best one for your scenario, see [Entity Framework Development Workflows](https://msdn.microsoft.com/library/ms178359.aspx#dbfmfcf).
>
> ## <a name="web-forms"></a>Web Forms
>
> Like the Getting Started series, this tutorial series uses the ASP.NET Web Forms model and assumes you know how to work with ASP.NET Web Forms in Visual Studio. If you don't, see [Getting Started with ASP.NET 4.5 Web Forms](../../getting-started/getting-started-with-aspnet-45-web-forms/introduction-and-overview.md). If you prefer to work with the ASP.NET MVC framework, see [Getting Started with the Entity Framework using ASP.NET MVC](../../../../mvc/overview/getting-started/getting-started-with-ef-using-mvc/creating-an-entity-framework-data-model-for-an-asp-net-mvc-application.md).
>
> ## <a name="software-versions"></a>Software versions
>
> | **Shown in the tutorial** | **Also works with** |
> | --- | --- |
> | Windows 7 | Windows 8 |
> | Visual Studio 2010 | Visual Studio 2010 Express for Web. The tutorial has not been tested with later versions of Visual Studio. There are many differences in menu selections, dialog boxes, and templates. |
> | .NET 4 | .NET 4.5 is backward compatible with .NET 4, but the tutorial has not been tested with .NET 4.5. |
> | Entity Framework 4 | The tutorial has not been tested with later versions of the Entity Framework. Starting with Entity Framework 5, EF uses by default the `DbContext API` that was introduced with EF 4.1. The EntityDataSource control was designed to use the `ObjectContext` API. For information about how to use the EntityDataSource control with the `DbContext` API, see [this blog post](https://blogs.msdn.com/b/webdev/archive/2012/09/13/how-to-use-the-entitydatasource-control-with-entity-framework-code-first.aspx). |
>
> ## <a name="questions"></a>Questions
>
> If you have questions that are not directly related to the tutorial, you can post them to the [ASP.NET Entity Framework forum](https://forums.asp.net/1227.aspx), the [Entity Framework and LINQ to Entities forum](https://social.msdn.microsoft.com/forums/adodotnetentityframework/threads/), or [stackoverflow.com](http://stackoverflow.com/).

The `EntityDataSource` control enables you to create an application very quickly, but it typically requires you to keep a significant amount of business logic and data-access logic in your *.aspx* pages. If you expect your application to grow in complexity and to require ongoing maintenance, you can invest more development time up front in order to create an *n-tier* or *layered* application structure that's more maintainable. To implement this architecture, you separate the presentation layer from the business logic layer (BLL) and the data access layer (DAL). One way to implement this structure is to use the `ObjectDataSource` control instead of the `EntityDataSource` control. When you use the `ObjectDataSource` control, you implement your own data-access code and then invoke it in *.aspx* pages by using a control that has many of the same features as other data-source controls. This lets you combine the advantages of an n-tier approach with the benefits of using a Web Forms control for data access.

The `ObjectDataSource` control gives you more flexibility in other ways as well. Because you write your own data-access code, it's easier to do more than just read, insert, update, or delete a specific entity type, which are the tasks that the `EntityDataSource` control is designed to perform. For example, you can perform logging whenever an entity is updated, archive data whenever an entity is deleted, or automatically check and update related data as needed when inserting a row with a foreign key value.

## <a name="business-logic-and-repository-classes"></a>Business logic and repository classes

An `ObjectDataSource` control works by invoking a class that you create. The class includes methods that retrieve and update data, and you provide the names of those methods to the `ObjectDataSource` control in markup. During rendering or postback processing, the `ObjectDataSource` calls the methods that you specified.

Besides basic CRUD operations, the class that you create for use with the `ObjectDataSource` control might need to execute business logic when the `ObjectDataSource` reads or updates data. For example, when you update a department, you might need to validate that no other departments have the same administrator, because one person can't be administrator of more than one department.
In some `ObjectDataSource` documentation, such as the [ObjectDataSource class overview](https://msdn.microsoft.com/library/system.web.ui.webcontrols.objectdatasource.aspx), the class that the control invokes is called a *business object* that includes both business logic and data-access logic. In this tutorial you will create separate classes for business logic and for data-access logic. The class that encapsulates data-access logic is called a *repository*. The business logic class includes both business-logic methods and data-access methods, but the data-access methods call the repository to perform data-access tasks.

You will also create an abstraction layer between your BLL and DAL that facilitates automated unit testing of the BLL. This abstraction layer is implemented by creating an interface and using the interface when you instantiate the repository in the business logic class. This makes it possible to provide the business logic class with a reference to any object that implements the repository interface. For normal operation, you provide a repository object that works with the Entity Framework. For testing, you provide a repository object that works with data stored in a way that you can easily manipulate, such as class variables defined as collections.

The following illustration shows the difference between a business logic class that includes data-access logic without a repository and one that uses a repository.
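As a rough sketch of that abstraction layer (the interface and class names here are illustrative, not necessarily the ones used later in this series), the repository's contract and an in-memory test double might look like this:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical abstraction over the data access layer. The business
// logic class depends only on this interface, so unit tests can
// substitute an in-memory implementation for the EF-based one.
// Department is the entity type from the School data model.
public interface ISchoolRepository : IDisposable
{
    IEnumerable<Department> GetDepartments();
    void InsertDepartment(Department department);
    void DeleteDepartment(Department department);
}

// For tests, a fake repository can back the interface with a simple
// collection instead of the database.
public class InMemorySchoolRepository : ISchoolRepository
{
    private readonly List<Department> departments = new List<Department>();

    public IEnumerable<Department> GetDepartments() { return departments; }
    public void InsertDepartment(Department department) { departments.Add(department); }
    public void DeleteDepartment(Department department) { departments.Remove(department); }
    public void Dispose() { }
}
```

The business logic class would take an `ISchoolRepository` in its constructor, so either implementation can be passed in.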
[![Image05](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image2.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image1.png)

You will begin by creating web pages in which the `ObjectDataSource` control is bound directly to a repository, because it will only perform basic data-access tasks. In the next tutorial you'll create a business logic class with validation logic and bind the `ObjectDataSource` control to that class instead of to the repository class. You'll also create unit tests for the validation logic. In the third tutorial in this series you'll add sorting and filtering functionality to the application.

The pages you create in this tutorial work with the `Departments` entity set of the data model that you created in the [Getting Started tutorial series](../getting-started-with-ef/the-entity-framework-and-aspnet-getting-started-part-1.md).

[![Image01](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image4.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image3.png)

[![Image02](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image6.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image5.png)

## <a name="updating-the-database-and-the-data-model"></a>Updating the database and the data model

To begin this tutorial, you need to make two changes to the database, both of which require corresponding changes to the data model that you created in the [Getting Started with the Entity Framework and Web Forms tutorials](../getting-started-with-ef/the-entity-framework-and-aspnet-getting-started-part-1.md).
In one of those tutorials, you made changes in the designer manually to synchronize the data model with the database after a database change. In this tutorial, you'll use the designer's **Update Model from Database** tool to update the data model automatically.

### <a name="adding-a-relationship-to-the-database"></a>Adding a relationship to the database

In Visual Studio, open the Contoso University web application that you created in the [Getting Started with the Entity Framework and Web Forms tutorial series](../getting-started-with-ef/the-entity-framework-and-aspnet-getting-started-part-1.md), and then open the `SchoolDiagram` database diagram.

If you look at the `Department` table in the database diagram, you'll see that it has an `Administrator` column. This column is a foreign key to the `Person` table, but no foreign key relationship has been defined in the database. You need to create the relationship and update the data model so that the Entity Framework can handle this relationship automatically.

In the database diagram, right-click the `Department` table and select **Relationships**.

[![Image80](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image8.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image7.png)

In the **Foreign Key Relationships** box, click **Add**, and then click the ellipsis for **Tables and Columns Specification**.
[![Image81](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image10.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image9.png)

In the **Tables and Columns** dialog box, set the primary key table and field to `Person` and `PersonID`, and set the foreign key table and field to `Department` and `Administrator`. (When you do this, the relationship name changes from `FK_Department_Department` to `FK_Department_Person`.)

[![Image82](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image12.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image11.png)

Click **OK** in the **Tables and Columns** box, click **Close** in the **Foreign Key Relationships** box, and save your changes. If you're asked whether you want to save the `Person` and `Department` tables, click **Yes**.

> [!NOTE]
> If you have deleted `Person` rows that correspond to data already in the `Administrator` column, you won't be able to save this change. In that case, use the table editor in **Server Explorer** to make sure that the `Administrator` value in every `Department` row contains the ID of a record that actually exists in the `Person` table.
>
> After you save this change, you won't be able to delete a row from the `Person` table if that person is a department administrator. In a production application, you would provide a specific error message when a database constraint prevents a deletion, or you would specify a cascading delete. For an example of how to specify a cascading delete, see [The Entity Framework and ASP.NET – Getting Started Part 2](../getting-started-with-ef/the-entity-framework-and-aspnet-getting-started-part-2.md).
### <a name="adding-a-view-to-the-database"></a>Adding a view to the database

In the new *DepartmentsAdd.aspx* page that you'll create, you want to provide a drop-down list of instructors, with names in "last, first" format, so that users can select department administrators. To make that easier, you'll create a view in the database. The view will consist of just the data needed by the drop-down list: the full name (properly formatted) and the record key.

In **Server Explorer**, expand *School.mdf*, right-click the **Views** folder, and select **Add New View**.

[![Image06](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image14.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image13.png)

Click **Close** when the **Add Table** dialog box appears, and paste the following SQL statement into the SQL pane:

[!code-sql[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample1.sql)]

Save the view as `vInstructorName`.

### <a name="updating-the-data-model"></a>Updating the data model

In the *DAL* folder, open the *SchoolModel.edmx* file, right-click the design surface, and select **Update Model from Database**.

[![Image07](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image16.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image15.png)

In the **Choose Your Database Objects** dialog box, select the **Add** tab and select the view that you just created.
[![Image08](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image18.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image17.png)

Click **Finish**. In the designer, you'll see that the tool created a `vInstructorName` entity and a new association between the `Department` and `Person` entities.

[![Image13](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image20.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image19.png)

> [!NOTE]
> In the **Output** and **Error List** windows you might see a warning message informing you that the tool automatically created a primary key for the new `vInstructorName` view. This is expected behavior.

When you refer to the new `vInstructorName` entity in code, you don't want to use the database convention of prefixing a lowercase "v". Therefore, you'll rename the entity and entity set in the model.

Open the **Model Browser**. You'll see `vInstructorName` listed as an entity type and as a view.

[![Image14](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image22.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image21.png)

Under **SchoolModel** (not **SchoolModel.Store**), right-click **vInstructorName** and select **Properties**. In the **Properties** window, change the **Name** property to "InstructorName" and change the **Entity Set Name** property to "InstructorNames".
[![Image15](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image24.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image23.png)

Save and close the data model, and then rebuild the project.

## <a name="using-a-repository-class-and-an-objectdatasource-control"></a>Using a repository class and an ObjectDataSource control

Create a new class file in the *DAL* folder, name it *SchoolRepository.cs*, and replace the existing code with the following code:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample2.cs)]

This code provides a single `GetDepartments` method that returns all of the entities in the `Departments` entity set. Because you know that you'll be accessing the `Person` navigation property for every row returned, you specify eager loading for that property by using the `Include` method. The class also implements the `IDisposable` interface to ensure that the database connection is released when the object is disposed.

> [!NOTE]
> A common practice is to create a repository class for each entity type. In this tutorial, one repository class for multiple entity types is used. For more information about the repository pattern, see the posts in [the Entity Framework team blog](https://blogs.msdn.com/b/adonet/archive/2009/06/16/using-repository-and-unit-of-work-patterns-with-entity-framework-4-0.aspx) and in [Julie Lerman's blog](http://thedatafarm.com/blog/data-access/agile-ef4-repository-part-3-fine-tuning-the-repository/).

The `GetDepartments` method returns an `IEnumerable` object rather than an `IQueryable` object in order to ensure that the returned collection is usable even after the repository object itself is disposed.
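A minimal sketch of the idea (illustrative only; the exact query is in the sample2.cs listing above) shows why the method materializes the results while the context is still alive:

```csharp
public IEnumerable<Department> GetDepartments()
{
    // ToList() executes the query now, while the ObjectContext is
    // still alive, and eagerly loads the Person navigation property.
    // Returning the query object directly would defer execution until
    // a databound control enumerates it -- possibly after Dispose().
    return context.Departments.Include("Person").ToList();
}
```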
An `IQueryable` object can cause database access each time it's accessed, but the repository object might be disposed by the time a databound control attempts to render the data. You could return another collection type, such as an `IList` object, instead of an `IEnumerable` object. However, returning an `IEnumerable` object guarantees that you can perform typical read-only list processing tasks such as `foreach` loops and LINQ queries, but you can't add or remove items in the collection, which might imply that such changes would be persisted to the database.

Create a *Departments.aspx* page that uses the *Site.Master* master page, and add the following markup in the `Content` control named `Content2`:

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample3.aspx)]

This markup creates an `ObjectDataSource` control that uses the repository class you just created, and a `GridView` control to display the data. The `GridView` control specifies **Edit** and **Delete** commands, but you haven't added code to support them yet.

Several columns use `DynamicField` controls so that you can take advantage of automatic data formatting and validation functionality. For these to work, you'll need to call the `EnableDynamicData` method in the `Page_Init` event handler. (`DynamicControl` controls aren't used in the `Administrator` field because they don't work with navigation properties.)

The `Vertical-Align="Top"` attributes will become important later when you add a column that has a nested `GridView` control to the grid.
Open the *Departments.aspx.cs* file and add the following `using` statement:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample4.cs)]

Then add the following handler for the page's `Init` event:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample5.cs)]

In the *DAL* folder, create a new class file named *Department.cs* and replace the existing code with the following code:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample6.cs)]

This code adds metadata to the data model. It specifies that the `Budget` property of the `Department` entity actually represents currency, although its data type is `Decimal`, and it specifies that the value must be between 0 and $1,000,000.00. It also specifies that the `StartDate` property should be formatted as a date in mm/dd/yyyy format.

Run the *Departments.aspx* page.

[![Image01](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image26.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image25.png)

Notice that although you didn't specify a format string in the *Departments.aspx* page markup for the **Budget** or **Start Date** columns, default currency and date formatting has been applied to them by the `DynamicField` controls, using the metadata that you provided in the *Department.cs* file.

## <a name="adding-insert-and-delete-functionality"></a>Adding insert and delete functionality

Open *SchoolRepository.cs*, and add the following code to create an `Insert` method and a `Delete` method.
The code also includes a method named `GenerateDepartmentID` that calculates the next available record key value for use by the `Insert` method. This is needed because the database isn't configured to calculate this automatically for the `Department` table.

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample7.cs)]

### <a name="the-attach-method"></a>The Attach method

The `DeleteDepartment` method calls the `Attach` method in order to re-establish the link that's maintained in the object context's object state manager between the entity in memory and the database row it represents. This must occur before the method calls the `SaveChanges` method.

The term *object context* refers to the Entity Framework class that derives from the `ObjectContext` class and that you use to access your entity sets and entities. In the code for this project, the class is named `SchoolEntities`, and an instance of it is always named `context`. The object context's *object state manager* is a class that derives from the `ObjectStateManager` class. The object context uses the object state manager to store entity objects and to keep track of whether each one is in sync with its corresponding table row or rows in the database.

When you read an entity, the object context stores it in the object state manager and keeps track of whether that representation of the object is in sync with the database. For example, if you change a property value, a flag is set to indicate that the changed property is no longer in sync with the database.
Then when you call the `SaveChanges` method, the object context knows what to do in the database, because the object state manager knows exactly what's different between the current state of the entity and the state of the database.

However, this process typically doesn't work in a web application, because the object context instance that reads an entity, along with everything in its object state manager, is disposed after a page is rendered. The object context instance that must apply changes is a new one that's instantiated for postback processing. In the case of the `DeleteDepartment` method, the `ObjectDataSource` control re-creates the original version of the entity from the values in view state, but this re-created `Department` entity doesn't exist in the object state manager. If you called the `DeleteObject` method on this re-created entity, the call would fail because the object context doesn't know whether the entity is in sync with the database. However, calling the `Attach` method re-establishes the same tracking between the re-created entity and the values in the database that was originally done automatically when the entity was read in an earlier instance of the object context.

There are times when you don't want the object context to track entities in the object state manager, and you can set flags to prevent it from doing that. Examples of that are shown in later tutorials in this series.

### <a name="the-savechanges-method"></a>The SaveChanges method

This simple repository class illustrates the basic principles of how to perform CRUD operations. In this example, the `SaveChanges` method is called immediately after each update.
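The attach-then-delete sequence described above can be sketched as follows (a sketch only; the actual method is in the sample7.cs listing, and its concurrency handling is added in a later tutorial):

```csharp
public void DeleteDepartment(Department department)
{
    // The entity was re-created from view state, so this new context
    // instance isn't tracking it yet. Attach registers it in the
    // object state manager in the Unchanged state...
    context.Departments.Attach(department);

    // ...which allows DeleteObject to mark it Deleted, so that
    // SaveChanges issues the DELETE statement against the database.
    context.Departments.DeleteObject(department);
    context.SaveChanges();
}
```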
In a production application you might want to call the `SaveChanges` method from a separate method instead, to give you more control over when the database is updated. (At the end of the next tutorial you'll find a link to a white paper that discusses the unit-of-work pattern, which is one approach to coordinating related updates.) Notice also that in the example, the `DeleteDepartment` method doesn't include code to handle concurrency conflicts; code for that will be added in a later tutorial in this series.

### <a name="retrieving-instructor-names-to-select-when-inserting"></a>Retrieving instructor names to select when inserting

Users must be able to select an administrator from a list of instructors in a drop-down list when they create new departments. Therefore, add the following code to *SchoolRepository.cs* to create a method that retrieves the list of instructors by using the view that you created earlier:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample8.cs)]

### <a name="creating-a-page-for-inserting-departments"></a>Creating a page for inserting departments

Create a *DepartmentsAdd.aspx* page that uses the *Site.Master* page, and add the following markup in the `Content` control named `Content2`:

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample9.aspx)]

This markup creates two `ObjectDataSource` controls, one for inserting new `Department` entities and one for retrieving instructor names for the `DropDownList` control that's used for selecting department administrators.
The markup also creates a `DetailsView` control for entering new departments, and it specifies a handler for the control's `ItemInserting` event so that you can set the `Administrator` foreign-key value. At the end there's a `ValidationSummary` control to display error messages.

Open *DepartmentsAdd.aspx.cs* and add the following `using` statement:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample10.cs)]

Add the following class variable and methods:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample11.cs)]

The `Page_Init` method enables Dynamic Data functionality. The handler for the `DropDownList` control's `Init` event saves a reference to the control, and the handler for the `DetailsView` control's `Inserting` event uses that reference to get the `PersonID` value of the selected instructor and update the `Administrator` foreign-key property of the `Department` entity.

Run the page, add information for a new department, and then click the **Insert** link.

[![Image04](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image28.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image27.png)

Enter values for another new department. Enter a number greater than 1,000,000.00 in the **Budget** field and tab to the next field. An asterisk appears in the field, and if you hover the mouse pointer over it, you can see the error message that you specified in the metadata for that field.
[![Image03](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image30.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image29.png)

Click **Insert**, and you'll see the error message that the `ValidationSummary` control displays at the bottom of the page.

[![Image12](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image32.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image31.png)

Next, close the browser and open the *Departments.aspx* page. Add delete functionality to the *Departments.aspx* page by adding a `DeleteMethod` attribute to the `ObjectDataSource` control and a `DataKeyNames` attribute to the `GridView` control. The opening tags for these controls now resemble the following example:

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample12.aspx)]

Run the page.

[![Image09](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image34.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image33.png)

Delete the department that you added when you ran the *DepartmentsAdd.aspx* page.

## <a name="adding-update-functionality"></a>Adding update functionality

Open *SchoolRepository.cs* and add the following `Update` method:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample13.cs)]

When you click **Update** in the *Departments.aspx* page, the `ObjectDataSource` control creates two `Department` entities to pass to the `UpdateDepartment` method. One contains the original values that have been stored in view state, and the other contains the new values that were entered in the `GridView` control.
The code in the `UpdateDepartment` method passes the `Department` entity that has the original values to the `Attach` method in order to establish the tracking between the entity and what's in the database. Then the code passes the `Department` entity that has the new values to the `ApplyCurrentValues` method. The object context compares the old and new values. If a new value is different from an old value, the object context changes the property value. The `SaveChanges` method then updates only the changed columns in the database. (However, if the update function for this entity were mapped to a stored procedure, the entire row would be updated regardless of which columns were changed.)

Open the *Departments.aspx* file and add the following attributes to the `DepartmentsObjectDataSource` control:

- `UpdateMethod="UpdateDepartment"`
- `ConflictDetection="CompareAllValues"`

  This causes the old values to be stored in view state so that they can be compared with the new values in the `Update` method.
- `OldValuesParameterFormatString="orig{0}"`

  This informs the control that the name of the original-values parameter is `origDepartment`.

The markup for the opening tag of the `ObjectDataSource` control now resembles the following example:

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample14.aspx)]

Add an `OnRowUpdating="DepartmentsGridView_RowUpdating"` attribute to the `GridView` control. You'll use this to set the value of the `Administrator` property based on the row the user selects in a drop-down list.
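The attach-and-apply flow described above can be sketched like this (illustrative only; the actual code is in the sample13.cs listing):

```csharp
// origDepartment holds the view-state values; department holds the
// values the user entered. The parameter name matches the
// OldValuesParameterFormatString="orig{0}" setting.
public void UpdateDepartment(Department department, Department origDepartment)
{
    // Attach the original values so the object state manager has a
    // baseline to compare against.
    context.Departments.Attach(origDepartment);

    // Copy the new values over the attached entity; only properties
    // that differ from the baseline are flagged as Modified.
    context.ApplyCurrentValues("Departments", department);

    // Generates an UPDATE that sets only the changed columns.
    context.SaveChanges();
}
```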
The `GridView` opening tag now resembles the following example:

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample15.aspx)]

Add an `EditItemTemplate` control for the `Administrator` column to the `GridView` control, immediately after the `ItemTemplate` control for that column:

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample16.aspx)]

This `EditItemTemplate` control is similar to the `InsertItemTemplate` control in the *DepartmentsAdd.aspx* page. The difference is that the initial value of the control is set by using the `SelectedValue` attribute.

Before the `GridView` control, add a `ValidationSummary` control as you did in the *DepartmentsAdd.aspx* page.

[!code-aspx[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample17.aspx)]

Open *Departments.aspx.cs*, and immediately after the partial-class declaration, add the following code to create a private field that will hold a reference to the `DropDownList` control:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample18.cs)]

Then add handlers for the `DropDownList` control's `Init` event and the `GridView` control's `RowUpdating` event:

[!code-csharp[Main](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/samples/sample19.cs)]

The handler for the `Init` event saves a reference to the `DropDownList` control in the class field. The handler for the `RowUpdating` event uses that reference to get the value the user entered and apply it to the `Administrator` property of the `Department` entity.

Use the *DepartmentsAdd.aspx* page to add a new department, then run the *Departments.aspx* page and click **Edit** on the row you added.
> [!NOTE]
> You won't be able to edit rows that you didn't add (that is, rows that were already in the database), because of invalid data in the database: the administrators for the rows that were created with the database are students. If you try to edit one of them, you'll get an error page that reports an error such as `'InstructorsDropDownList' has a SelectedValue which is invalid because it does not exist in the list of items.`

[![Image10](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image36.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image35.png)

If you enter an invalid **Budget** amount and then click **Update**, you see the same asterisk and error message that you saw in the *DepartmentsAdd.aspx* page.

Change a field value or select a different administrator, and click **Update**. The change is displayed.

[![Image09](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image38.png)](using-the-entity-framework-and-the-objectdatasource-control-part-1-getting-started/_static/image37.png)

This completes the introduction to using the `ObjectDataSource` control for basic CRUD (create, read, update, delete) operations with the Entity Framework. You've built a simple n-tier application, but the business logic layer is still tightly coupled to the data access layer, which complicates automated unit testing. In the following tutorial you'll see how to implement the repository pattern to make unit testing easier.

> [!div class="step-by-step"]
> [Next](using-the-entity-framework-and-the-objectdatasource-control-part-2-adding-a-business-logic-layer-and-unit-tests.md)
120.433022
1,164
0.799115
spa_Latn
0.982448
e98e6aa072213790a9d8ef9662750050582845bf
560
md
Markdown
src/content/changes/documentation/docs/v2.1.22.md
giantswarm/docs
a5b661029811c73bb4bb97c57058da89a9c0396f
[ "Apache-2.0" ]
6
2019-05-10T16:08:44.000Z
2021-09-26T07:29:55.000Z
src/content/changes/documentation/docs/v2.1.22.md
giantswarm/docs
a5b661029811c73bb4bb97c57058da89a9c0396f
[ "Apache-2.0" ]
565
2018-08-01T11:10:33.000Z
2022-03-31T15:54:53.000Z
src/content/changes/documentation/docs/v2.1.22.md
giantswarm/docs
a5b661029811c73bb4bb97c57058da89a9c0396f
[ "Apache-2.0" ]
6
2019-07-16T07:36:56.000Z
2021-02-19T15:21:33.000Z
--- # Generated by scripts/aggregate-changelogs. WARNING: Manual edits to this files will be overwritten. changes_categories: - Documentation changes_entry: repository: giantswarm/docs url: https://github.com/giantswarm/docs/releases/tag/v2.1.22 version: 2.1.22 version_tag: v2.1.22 date: '2021-05-06T08:28:33' description: Changelog entry for giantswarm/docs version 2.1.22, published on 06 May 2021, 08:28 title: docs release v2.1.22 --- - Mark KVM workload cluster release v1.14 as available ([#927](https://github.com/giantswarm/docs/pull/927))
32.941176
108
0.757143
eng_Latn
0.555096
e98e8e9bbeff7246875490147303673f33525412
99
md
Markdown
README.md
terenceslau/01_flutter_assignment_share
0a740bd657cf60512223af63e50e5ac85d2a1f4a
[ "MIT" ]
null
null
null
README.md
terenceslau/01_flutter_assignment_share
0a740bd657cf60512223af63e50e5ac85d2a1f4a
[ "MIT" ]
null
null
null
README.md
terenceslau/01_flutter_assignment_share
0a740bd657cf60512223af63e50e5ac85d2a1f4a
[ "MIT" ]
null
null
null
# 01_flutter_assignment_share Flutter assignment Screen capture: https://i.imgur.com/uXXnhpa.png
19.8
47
0.818182
kor_Hang
0.180017
e98f812ef48277a3f58e917f5ef170a72c41a21e
864
md
Markdown
site/_posts/2005-03-31-feinstaubbelastung.md
fwenzel/fredericiana
06d523b5c93e80fa38c9d671916accbd3c74adc4
[ "BSD-3-Clause" ]
null
null
null
site/_posts/2005-03-31-feinstaubbelastung.md
fwenzel/fredericiana
06d523b5c93e80fa38c9d671916accbd3c74adc4
[ "BSD-3-Clause" ]
2
2015-01-15T23:29:01.000Z
2015-04-29T22:00:16.000Z
site/_posts/2005-03-31-feinstaubbelastung.md
fwenzel/fredericiana
06d523b5c93e80fa38c9d671916accbd3c74adc4
[ "BSD-3-Clause" ]
null
null
null
--- status: publish tags: - german published: true title: Feinstaubbelastung type: post meta: tags: "" layout: post --- Somehow the word "Feinstaubbelastung" (particulate-matter pollution) is trendy right now. And before I one day have to hear <blockquote>Grandpa, you never said anything about Feinstaubbelastung back then!</blockquote> I wanted at least to have mentioned the word once. After all, every big-city dweller these days is an expert on fine and ultra-fine particulate pollution. Drivers are, in addition, experts on "Fein-Staubelastung" (fine traffic-jam load), and ideologues on "Feindstaubbelastung" (enemy-dust load). I believe we urgently need a Feinstaubbelastungsreformkommission (particulate-pollution reform commission). <strong>Update:</strong> <a href="http://www.pepilog.de/artikel/feinstaubbelastung.htm">One</a> or <a href="http://www.rochuswolff.de/weblog/archiv/2005/03/31/11.28.25/">another</a> blogger is also setting a good example for his grandchildren. Well done!
41.142857
250
0.78588
deu_Latn
0.976616
e98ffeacaf755162af3bbd7aa95660094709b3d1
16,010
md
Markdown
cn.zh-cn/API参考/查询任务中心(1-1-8).md
huaweicloudDocs/ges
fbb02e7c53f58e00f53cc06e8c188a07a106d1dd
[ "RSA-MD" ]
4
2019-09-28T03:46:27.000Z
2021-08-31T02:22:36.000Z
cn.zh-cn/API参考/查询任务中心(1-1-8).md
huaweicloudDocs/ges
fbb02e7c53f58e00f53cc06e8c188a07a106d1dd
[ "RSA-MD" ]
null
null
null
cn.zh-cn/API参考/查询任务中心(1-1-8).md
huaweicloudDocs/ges
fbb02e7c53f58e00f53cc06e8c188a07a106d1dd
[ "RSA-MD" ]
null
null
null
# 查询任务中心\(1.1.8\)<a name="ges_03_0102"></a> ## 功能介绍<a name="section191474019367"></a> 查询管理面任务中心。当前创建图、关闭图、启动图、删除图、增加备份、导入图、导出图、升级图等操作为异步任务,该API用于查询这些任务的详情。 ## URI<a name="section09144402366"></a> - URI 格式 ``` GET /v1.0/{project_id}/graphs/jobs?offset={offset}&limit={limit}&status={status}&graph_name={graph_name}&startTime={startTime}&endTime={endTime} ``` - 参数说明 **表 1** URI参数说明 <a name="table242717161697"></a> <table><thead align="left"><tr id="row356893751697"><th class="cellrowborder" valign="top" width="12.67%" id="mcps1.2.5.1.1"><p id="p3108920316930"><a name="p3108920316930"></a><a name="p3108920316930"></a>参数</p> </th> <th class="cellrowborder" valign="top" width="13.389999999999999%" id="mcps1.2.5.1.2"><p id="p3519750416930"><a name="p3519750416930"></a><a name="p3519750416930"></a>是否必选</p> </th> <th class="cellrowborder" valign="top" width="14.29%" id="mcps1.2.5.1.3"><p id="p3242555716930"><a name="p3242555716930"></a><a name="p3242555716930"></a>类型</p> </th> <th class="cellrowborder" valign="top" width="59.650000000000006%" id="mcps1.2.5.1.4"><p id="p922447516930"><a name="p922447516930"></a><a name="p922447516930"></a>说明</p> </th> </tr> </thead> <tbody><tr id="row476603911697"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p5669781616930"><a name="p5669781616930"></a><a name="p5669781616930"></a>project_id</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p2912041616930"><a name="p2912041616930"></a><a name="p2912041616930"></a>是</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p id="p994348016930"><a name="p994348016930"></a><a name="p994348016930"></a>String</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p51708449194548"><a name="p51708449194548"></a><a name="p51708449194548"></a>项目编号,用于资源隔离。请参考<a href="获取项目ID.md">获取项目ID</a>。</p> </td> </tr> <tr 
id="row282501011170"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p6598963717028"><a name="p6598963717028"></a><a name="p6598963717028"></a>offset</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p4356039517028"><a name="p4356039517028"></a><a name="p4356039517028"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p id="p3873111317028"><a name="p3873111317028"></a><a name="p3873111317028"></a>Integer</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p5021248817028"><a name="p5021248817028"></a><a name="p5021248817028"></a>本次请求的起始位置,默认为0。</p> </td> </tr> <tr id="row06745143174"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p3057352217028"><a name="p3057352217028"></a><a name="p3057352217028"></a>limit</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p6053623317028"><a name="p6053623317028"></a><a name="p6053623317028"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p id="p448786917028"><a name="p448786917028"></a><a name="p448786917028"></a>Integer</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p2797310917028"><a name="p2797310917028"></a><a name="p2797310917028"></a>每页资源数量的最大值,默认为10。</p> </td> </tr> <tr id="row20235150919"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p9238454920"><a name="p9238454920"></a><a name="p9238454920"></a>status</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p2238165096"><a name="p2238165096"></a><a name="p2238165096"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p 
id="p112381851291"><a name="p112381851291"></a><a name="p112381851291"></a>String</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p102381658913"><a name="p102381658913"></a><a name="p102381658913"></a>任务状态。取值为:</p> <a name="ul7577154891014"></a><a name="ul7577154891014"></a><ul id="ul7577154891014"><li>running</li><li>waiting</li><li>success</li><li>failed</li></ul> </td> </tr> <tr id="row69560581103"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p89561058121018"><a name="p89561058121018"></a><a name="p89561058121018"></a>graph_name</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p1957205841016"><a name="p1957205841016"></a><a name="p1957205841016"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p id="p095715810103"><a name="p095715810103"></a><a name="p095715810103"></a>String</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p1395735818104"><a name="p1395735818104"></a><a name="p1395735818104"></a>关联的图名称</p> </td> </tr> <tr id="row4233232111113"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p6233732131113"><a name="p6233732131113"></a><a name="p6233732131113"></a>startTime</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p1723312322114"><a name="p1723312322114"></a><a name="p1723312322114"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p id="p52337320115"><a name="p52337320115"></a><a name="p52337320115"></a>String</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p20233103218110"><a name="p20233103218110"></a><a name="p20233103218110"></a>任务开始日期,当前只支持日期,不支持时间。格式为:yyyy-MM-dd,比如2019-03-27。</p> 
</td> </tr> <tr id="row41071543171412"><td class="cellrowborder" valign="top" width="12.67%" headers="mcps1.2.5.1.1 "><p id="p784017480144"><a name="p784017480144"></a><a name="p784017480144"></a>endTime</p> </td> <td class="cellrowborder" valign="top" width="13.389999999999999%" headers="mcps1.2.5.1.2 "><p id="p4840114821416"><a name="p4840114821416"></a><a name="p4840114821416"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.29%" headers="mcps1.2.5.1.3 "><p id="p1684054815145"><a name="p1684054815145"></a><a name="p1684054815145"></a>String</p> </td> <td class="cellrowborder" valign="top" width="59.650000000000006%" headers="mcps1.2.5.1.4 "><p id="p984074810146"><a name="p984074810146"></a><a name="p984074810146"></a>任务结束日期,当前只支持日期,不支持时间。格式为:yyyy-MM-dd,比如2019-03-27。</p> </td> </tr> </tbody> </table> ## 请求<a name="section19471640133612"></a> - 请求样例 ``` GET https://Endpoint/v1.0/{project_id}/graphs/jobs?offset=0&limit=100 ``` ## 响应<a name="section12947114083614"></a> - 要素说明 **表 2** 要素说明 <a name="table9398030161013"></a> <table><thead align="left"><tr id="row26921206161013"><th class="cellrowborder" valign="top" width="16.619999999999997%" id="mcps1.2.5.1.1"><p id="p16015104161025"><a name="p16015104161025"></a><a name="p16015104161025"></a>参数</p> </th> <th class="cellrowborder" valign="top" width="11.68%" id="mcps1.2.5.1.2"><p id="p22155036161025"><a name="p22155036161025"></a><a name="p22155036161025"></a>是否必选</p> </th> <th class="cellrowborder" valign="top" width="14.56%" id="mcps1.2.5.1.3"><p id="p49727452161025"><a name="p49727452161025"></a><a name="p49727452161025"></a>类型</p> </th> <th class="cellrowborder" valign="top" width="57.14%" id="mcps1.2.5.1.4"><p id="p1391784161025"><a name="p1391784161025"></a><a name="p1391784161025"></a>说明</p> </th> </tr> </thead> <tbody><tr id="row49281025161013"><td class="cellrowborder" valign="top" width="16.619999999999997%" headers="mcps1.2.5.1.1 "><p id="p4694663161025"><a name="p4694663161025"></a><a 
name="p4694663161025"></a>errorMessage</p> </td> <td class="cellrowborder" valign="top" width="11.68%" headers="mcps1.2.5.1.2 "><p id="p44723433161025"><a name="p44723433161025"></a><a name="p44723433161025"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.56%" headers="mcps1.2.5.1.3 "><p id="p65828344161025"><a name="p65828344161025"></a><a name="p65828344161025"></a>String</p> </td> <td class="cellrowborder" valign="top" width="57.14%" headers="mcps1.2.5.1.4 "><p id="p30495643161025"><a name="p30495643161025"></a><a name="p30495643161025"></a>系统提示信息,执行成功时,字段可能为空。执行失败时,用于显示错误信息。</p> </td> </tr> <tr id="row53676720161013"><td class="cellrowborder" valign="top" width="16.619999999999997%" headers="mcps1.2.5.1.1 "><p id="p18290197161025"><a name="p18290197161025"></a><a name="p18290197161025"></a>errorCode</p> </td> <td class="cellrowborder" valign="top" width="11.68%" headers="mcps1.2.5.1.2 "><p id="p5110970161025"><a name="p5110970161025"></a><a name="p5110970161025"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.56%" headers="mcps1.2.5.1.3 "><p id="p11335440161025"><a name="p11335440161025"></a><a name="p11335440161025"></a>String</p> </td> <td class="cellrowborder" valign="top" width="57.14%" headers="mcps1.2.5.1.4 "><p id="p45755454161025"><a name="p45755454161025"></a><a name="p45755454161025"></a>系统提示信息,执行成功时,字段可能为空。执行失败时,用于显示错误码。</p> </td> </tr> <tr id="row11699174410217"><td class="cellrowborder" valign="top" width="16.619999999999997%" headers="mcps1.2.5.1.1 "><p id="p7748165111215"><a name="p7748165111215"></a><a name="p7748165111215"></a>jobCount</p> </td> <td class="cellrowborder" valign="top" width="11.68%" headers="mcps1.2.5.1.2 "><p id="p9700344127"><a name="p9700344127"></a><a name="p9700344127"></a>是</p> </td> <td class="cellrowborder" valign="top" width="14.56%" headers="mcps1.2.5.1.3 "><p id="p117008442210"><a name="p117008442210"></a><a name="p117008442210"></a>Integer</p> </td> <td class="cellrowborder" 
valign="top" width="57.14%" headers="mcps1.2.5.1.4 "><p id="p47001744623"><a name="p47001744623"></a><a name="p47001744623"></a>任务总数</p> </td> </tr> <tr id="row156987231131"><td class="cellrowborder" valign="top" width="16.619999999999997%" headers="mcps1.2.5.1.1 "><p id="p5698923939"><a name="p5698923939"></a><a name="p5698923939"></a>jobList</p> </td> <td class="cellrowborder" valign="top" width="11.68%" headers="mcps1.2.5.1.2 "><p id="p1698523639"><a name="p1698523639"></a><a name="p1698523639"></a>否</p> </td> <td class="cellrowborder" valign="top" width="14.56%" headers="mcps1.2.5.1.3 "><p id="p1969862310316"><a name="p1969862310316"></a><a name="p1969862310316"></a>JsonArray</p> </td> <td class="cellrowborder" valign="top" width="57.14%" headers="mcps1.2.5.1.4 "><p id="p11691359938"><a name="p11691359938"></a><a name="p11691359938"></a>任务列表。其内容参考<a href="查询Job状态(1-0-0)-管理面.md#table06281119546">表3 job属性列表</a>。</p> </td> </tr> </tbody> </table> - 请求成功样例 ``` Http Status Code: 200 { "jobCount": 136, "jobList": [ { "jobId": "ff80808167bb90340167bc3c7b5b026a", "status": "success", "jobType": "GraphManagement", "jobName": "ImportGraph", "relatedGraph": "test1217", "beginTime": "2018-12-17T12:55:40", "endTime": "2018-12-17T12:56:32", "jobDetail": { "vertexsetPath": null, "edgesetPath": [ { "path": "hkmovie/edge.csv", "log": null, "cause": null, "status": "success" } ], "schemaPath": [ { "path": "hkmovie/schema.xml", "log": null, "cause": null, "status": "success" } ] }, "jobProgress": 0 }, { "jobId": "ff80808167bb90340167bc5d0b1d0358", "status": "success", "jobType": "GraphManagement", "jobName": "DeleteGraph", "relatedGraph": "test1218", "beginTime": "2018-12-17T13:31:14", "endTime": "2018-12-17T13:34:48", "jobProgress": 0 } ] } ``` - 请求失败样例 ``` Http Status Code: 400 { "errCode": "GES.0001", "externalMessage": "Parameter error." 
} ``` ## 返回值<a name="section58124123616"></a> - 正常 200 - 异常 **表 3** 异常返回值说明 <a name="table62805478161143"></a> <table><thead align="left"><tr id="row7600041161143"><th class="cellrowborder" valign="top" width="50%" id="mcps1.2.3.1.1"><p id="p33199408161211"><a name="p33199408161211"></a><a name="p33199408161211"></a>返回值</p> </th> <th class="cellrowborder" valign="top" width="50%" id="mcps1.2.3.1.2"><p id="p4797519161211"><a name="p4797519161211"></a><a name="p4797519161211"></a>说明</p> </th> </tr> </thead> <tbody><tr id="row26509924161143"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p2470184161211"><a name="p2470184161211"></a><a name="p2470184161211"></a>400 Bad Request</p> </td> <td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p65867185161211"><a name="p65867185161211"></a><a name="p65867185161211"></a>请求错误。</p> </td> </tr> <tr id="row3149953161143"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p34340117161211"><a name="p34340117161211"></a><a name="p34340117161211"></a>401 Unauthorized</p> </td> <td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p30086118161211"><a name="p30086118161211"></a><a name="p30086118161211"></a>鉴权失败。</p> </td> </tr> <tr id="row42956032161143"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p55291065161211"><a name="p55291065161211"></a><a name="p55291065161211"></a>403 Forbidden</p> </td> <td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p49391293161211"><a name="p49391293161211"></a><a name="p49391293161211"></a>没有操作权限。</p> </td> </tr> <tr id="row64135773161143"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p35901528161211"><a name="p35901528161211"></a><a name="p35901528161211"></a>404 Not Found</p> </td> <td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p 
id="p22342664161211"><a name="p22342664161211"></a><a name="p22342664161211"></a>找不到资源。</p> </td> </tr> <tr id="row65862429161143"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p47457151161211"><a name="p47457151161211"></a><a name="p47457151161211"></a>500 Internal Server Error</p> </td> <td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p18824037161211"><a name="p18824037161211"></a><a name="p18824037161211"></a>服务内部错误。</p> </td> </tr> <tr id="row17696525161143"><td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.1 "><p id="p32515139161211"><a name="p32515139161211"></a><a name="p32515139161211"></a>503 Service Unavailable</p> </td> <td class="cellrowborder" valign="top" width="50%" headers="mcps1.2.3.1.2 "><p id="p16480634161211"><a name="p16480634161211"></a><a name="p16480634161211"></a>服务不可用。</p> </td> </tr> </tbody> </table>
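The list request shown above can be issued with any HTTP client; the main work is assembling the query string from the parameters in Table 1. A minimal Python sketch (the endpoint and project ID below are placeholders, following the `https://Endpoint` convention of the request sample):

```python
from urllib.parse import urlencode

# Placeholder values -- substitute your real endpoint and project ID.
endpoint = "https://Endpoint"
project_id = "0123456789abcdef"

# Optional filters from Table 1; omit any you do not need.
params = {"offset": 0, "limit": 100, "status": "success",
          "startTime": "2019-03-27"}

url = f"{endpoint}/v1.0/{project_id}/graphs/jobs?{urlencode(params)}"
print(url)
```

A successful response carries `jobCount` and `jobList` as in the sample JSON above; non-2xx statuses map to the error table that follows.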
58.007246
236
0.62792
yue_Hant
0.302487
e99037fefa6c8673ba6a398be30578d49d83cbb6
515
md
Markdown
blog/content/poems/psalm-ii.md
HarshitKaushik/harshit.me
4c643d7c631df093f32ac5a68ec7d6138a6fc012
[ "BSD-2-Clause" ]
null
null
null
blog/content/poems/psalm-ii.md
HarshitKaushik/harshit.me
4c643d7c631df093f32ac5a68ec7d6138a6fc012
[ "BSD-2-Clause" ]
null
null
null
blog/content/poems/psalm-ii.md
HarshitKaushik/harshit.me
4c643d7c631df093f32ac5a68ec7d6138a6fc012
[ "BSD-2-Clause" ]
null
null
null
--- title: "Psalm II" date: 2021-06-29T14:02:57+05:30 draft: false --- O, creator! In the race of success Sanctify those who are in front of me That they remain in place, not petrify Sanctify those who are behind me now That they progress restricting sterile way Sanctify me too that, with all my strength I shall try not to breathe in a blithe way And try to get even with those persons Who are still in front of me this today This ointment shall cure me in a hopeless day. Dragon. © drag_on
25.75
48
0.733981
eng_Latn
0.999815
e99051037b6a637ba042788370b3f302a131b634
5,195
md
Markdown
settings_documentation.md
julien-noblet/osmly
070fffea1734adf98050a28217b4ad6430f8dc31
[ "BSD-3-Clause" ]
1
2019-04-27T20:14:49.000Z
2019-04-27T20:14:49.000Z
settings_documentation.md
julien-noblet/osmly
070fffea1734adf98050a28217b4ad6430f8dc31
[ "BSD-3-Clause" ]
null
null
null
settings_documentation.md
julien-noblet/osmly
070fffea1734adf98050a28217b4ad6430f8dc31
[ "BSD-3-Clause" ]
null
null
null
types and uses of osmly.settings properties ####required: - title - instructions - db - context - problems - usePropertyAsTag ### title (required) - string - very basic description of what needs to be done - eg. 'Outline the park', 'Locate the library' ### instructions (required) - html string - a basic set of instructions - max width of the element is 640px - don't forget to escape quotes with a slash \ - if you're using multiple lines, end each line with a slash \, it's valid js - eg. `'<ul><li>Don't forget the tags</li><li>Watch out for other parks</li></ul>'` ### db (required) - string - full url of the database location - eg. `'http://127.0.0.1:5000/?db=parks-5'`, `'http://l33tdomain.com/?db=POIz'` - these resolve to `parks-5.sqlite`, `POIz.sqlite` ### writeApi - string - the OSM API endpoint to write data - http://wiki.openstreetmap.org/wiki/API_v0.6 - default: `'http://api06.dev.openstreetmap.org'` - dev server ### oauth_secret - string - oauth_secret registered on OSM - default: `'Mon0UoBHaO3qvfgwWrMkf4QAPM0O4lITd3JRK4ff'` - just my test application on the dev server ### consumerKey - string - consumerKey registered on OSM with oauth_secret - default: `'yx996mtweTxLsaxWNc96R7vpfZHKQdoI9hzJRFwg'` - just my test application on the dev server ### readApi - string - the server endpoint from which to read OSM data - http://wiki.openstreetmap.org/wiki/Xapi#Implementations - also compatible with the regular OSM API - 'http://api.openstreetmap.org/api/0.6/map?' - default: `'http://www.overpass-api.de/api/xapi?map?'` - downloads are faster and it allows for larger areas - disadvantage: lags behind the main api by up to a couple minutes ### context (required) - object - keys and values to items that provide relevant context to the features being edited - for example: if schools are being edited, a context of other schools should be included. 
`{amenity: ['school']}` but other key/values should also be included for items that are often near schools that might be mistaken or be nearby schools. Things like colleges, universities, libraries, parks. It depends, go nuts. - our final object might look something like this: ``` js { amenity: ['school', 'library', 'college', 'university'], leisure: ['park'] } ``` ### problems (required) - object - a list of options for the 'Problem' dropdown - eg. `['no library here', 'library already there', 'imagery is too poor', 'not enough info']` ### origin - array - coordinates to center the map on load - default: `[0,0]` ### zoom - integer - zoom level for the map on load - default: `2` ### region - geojson object - a geojson outline of the region that the import is taking place in - serves as a general overview before login, give users something to look at - use geojson.io to generate something, paste it here ### demo - bool - allows a demonstration mode, setting to `false` doesn't allow it - default: `true` ### changesetTags - object - tags to use for changesets - will probably add an additional tag to track particular imports - eg. 'osmly:import': 'la-parks' - default: ``` js { 'created_by': 'osmly', 'osmly:version': '0', 'imagery_used': 'Bing' } ``` ### renameProperty - object - renames a property from the original data to a usable key for OSM - eg. `{wackyCompany_internal_id: 'XYZimport:id'}` ### usePropertyAsTag (required) - array - properties in the original data to use as tags which get uploaded to OSM - anything not specified will be ignored - this assumes you have all tags named correctly - for quick fixes/adjustments use renameProperty - for serious changes you should fix your source data in something like QGIS - eg. 
`['name', 'leisure', 'source']` ### appendTag - object - tag to add to every object uploaded - useful for a 'source' tag or something like it which must be applied to everything - or if your data is already of a common type and just missing the necessary OSM tag - for example: you have just parks for a particular county but no leisure=park tag, this can add it to everything - eg. ``` js { leisure: 'park', source: 'TIGER 2027' } ``` ### featureStyle - object - how the feature is styled on the map when being edited - maps directly to leaflet, full options here: http://leafletjs.com/reference.html#path-options - default: ``` js { color: '#00FF00', weight: 3, opacity: 1, clickable: false } ``` ### contextStyle - object - how features from OSM (as defined in the 'context' setting) are styled alongside the feature - maps directly to leaflet, full options here: http://leafletjs.com/reference.html#path-options - default: ``` js { color: '#FFFF00', fillOpacity: 0.3, weight: 3, opacity: 1 } ``` ### users - object - a list of OSM users that are allowed on this import - if no one is specified, any OSM user is allowed - eg. `['Aaron Lidman', 'psapp', 'Archive']` ### admins - object - a list of OSM users that have admin access - currently just QA mode, which allows for reviewing and confirming what all users have submitted to OSM - if no one is specified, everyone has admin access - eg. `['Hot Chip', 'Ladytron', 'grimes']`
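As a quick sanity check before deploying a task, the six required keys listed at the top can be verified programmatically. A minimal Python sketch (the `settings` dict below is a made-up example configuration, not data from a real import):

```python
# Keys marked "(required)" in the documentation above.
REQUIRED = ["title", "instructions", "db", "context",
            "problems", "usePropertyAsTag"]

settings = {  # hypothetical import configuration
    "title": "Outline the park",
    "instructions": "<ul><li>Don't forget the tags</li></ul>",
    "db": "http://127.0.0.1:5000/?db=parks-5",
    "context": {"leisure": ["park"]},
    "problems": ["imagery is too poor"],
    "usePropertyAsTag": ["name", "leisure", "source"],
}

# Collect any required keys that are absent.
missing = [key for key in REQUIRED if key not in settings]
if missing:
    raise ValueError(f"missing required osmly settings: {missing}")
print("settings ok")
```

Optional keys (`writeApi`, `origin`, `zoom`, and so on) fall back to the defaults documented above, so only these six need checking.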
27.930108
202
0.706449
eng_Latn
0.99002
e990b11cfc8c3ff29debdc8b4e470af4d32710a4
1,088
md
Markdown
windows-driver-docs-pr/install/setupapi-logging-registry-settings.md
i35010u/windows-driver-docs.zh-cn
e97bfd9ab066a578d9178313f802653570e21e7d
[ "CC-BY-4.0", "MIT" ]
1
2021-02-04T01:49:58.000Z
2021-02-04T01:49:58.000Z
windows-driver-docs-pr/install/setupapi-logging-registry-settings.md
i35010u/windows-driver-docs.zh-cn
e97bfd9ab066a578d9178313f802653570e21e7d
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/install/setupapi-logging-registry-settings.md
i35010u/windows-driver-docs.zh-cn
e97bfd9ab066a578d9178313f802653570e21e7d
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: SetupAPI logging registry settings description: SetupAPI logging registry settings keywords: - Setupapi.log logging WDK Windows Vista, registry settings - registry WDK Setupapi.log logging - event level WDK Setupapi.log logging - event category WDK Setupapi.log logging - text log WDK Setupapi.log, registry keys ms.date: 04/20/2017 ms.localizationpriority: medium ms.openlocfilehash: 80f63e9fe514e5f27a1fcf8b5e322e16fb3a30f9 ms.sourcegitcommit: 418e6617e2a695c9cb4b37b5b60e264760858acd ms.translationtype: MT ms.contentlocale: zh-CN ms.lasthandoff: 12/07/2020 ms.locfileid: "96838895" --- # <a name="setupapi-logging-registry-settings"></a>SetupAPI logging registry settings [Setupapi.log](setupapi.md) logging supports a global *event level* and global *event categories* that control whether information is written to the text log. The event level controls the level of detail written to the text log, and the event categories determine the types of operations that can generate log entries. A log entry is written to the text log if its event level is less than or equal to the text log's global event level and its event category is enabled for the text log; otherwise, the entry is not written. For information about how to set the event level, see [Setting the Event Level for a Text Log](setting-the-event-level-for-a-text-log.md). For information about how to enable event categories for a log, see [Enabling Event Categories for a Text Log](enabling-event-categories-for-a-text-log.md). By default, the setupapi.log text log is located in the %*SystemRoot*%\*Inf* directory. For information about how to change the directory in which the text log is located, see [Setting the Directory Path of the Text Logs](setting-the-directory-path-of-the-text-logs.md).
28.631579
195
0.800551
yue_Hant
0.463763
e9912783d4ce2f0542c4b27cc94a2daea60ac724
1,602
md
Markdown
docs/modules/packages/uprtcl-graphql.md
tabcat/js-uprtcl
4bcbf50f357b222be79f36a924240c40bcd47482
[ "MIT" ]
null
null
null
docs/modules/packages/uprtcl-graphql.md
tabcat/js-uprtcl
4bcbf50f357b222be79f36a924240c40bcd47482
[ "MIT" ]
null
null
null
docs/modules/packages/uprtcl-graphql.md
tabcat/js-uprtcl
4bcbf50f357b222be79f36a924240c40bcd47482
[ "MIT" ]
null
null
null
# @uprtcl/graphql [![](https://img.shields.io/npm/v/@uprtcl/graphql)](https://www.npmjs.com/package/@uprtcl/graphql) These are \_Prtcl `micro-orchestrator` wrapper modules: - `ApolloClientModule`: basic module that registers the `ApolloClient` to be available to any external modules - `GraphQlSchemaModule`: building-block module that can extend the basic schema included in `ApolloClientModule` with new type definitions, resolvers or directives. This lets you build whole graphql applications by composing different GraphQl schema definitions. ## Documentation Visit our [documentation site](https://uprtcl.github.io/js-uprtcl). ## Install ```bash npm install @uprtcl/graphql ``` ## Usage Import the `ApolloClientModule` module and load it: ```ts import { ApolloClientModule } from '@uprtcl/graphql'; const apolloClientModule = new ApolloClientModule(); await orchestrator.loadModules([apolloClientModule]); ``` ### Add a new GraphQl schema to the global `ApolloClient` In your module, you can declare a `GraphQlSchemaModule` as a submodule and add your schemas, resolvers and directives: ```ts import { GraphQlSchemaModule } from '@uprtcl/graphql'; import { testTypeDefs } from './typeDefs'; import { resolvers } from './resolvers'; export class TestModule extends MicroModule { static id = Symbol('test-module'); get submodules() { return [new GraphQlSchemaModule(testTypeDefs, resolvers)]; } } ``` And then load your module in the `micro-orchestrator`: ```ts import { TestModule } from './test-module'; await orchestrator.loadModules([new TestModule()]); ```
27.62069
263
0.751561
eng_Latn
0.680162
e991a72e54c8b4ee9154d2c0cb5e6b300d328a53
455
md
Markdown
2018/34.md
zx648383079/blog
19f14a7b7397976e1a95d952de5c44fc630d4c98
[ "MIT" ]
2
2020-05-08T06:34:57.000Z
2021-12-31T03:14:44.000Z
2018/34.md
zx648383079/blog
19f14a7b7397976e1a95d952de5c44fc630d4c98
[ "MIT" ]
null
null
null
2018/34.md
zx648383079/blog
19f14a7b7397976e1a95d952de5c44fc630d4c98
[ "MIT" ]
3
2019-11-13T11:10:59.000Z
2020-06-12T14:50:14.000Z
# Font-based anti-scraping, and defeating it ## Overview Some companies keep key data confidential, and one way a website does so (showing data to users while hiding it from crawlers) is a custom font file. ## Main tool python [fonttools](https://github.com/fonttools/fonttools) ## Anti-scraping ### How it works Extract the glyphs of the characters to be obfuscated from an existing font file, change the glyph indexes, save the result as a new font file, and serve text using the new character indexes. ## Defeating it ### How it works Fetch the font file, look up the glyph for each index, then map the glyph back to the real character (this presupposes a sufficiently rich glyph library). ## Escalation Anti-scraping can use custom glyph shapes (hand-edited glyphs); the counter is then glyph recognition, similar to image recognition. ## Font-processing tool ### Idealized usage ```bash generate --input font.ttf --font 1234567890 --out new ``` Automatically generates a subset font as ttf/woff/svg, plus the CSS and an HTML usage example.
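The index-shuffling idea can be illustrated without touching a real font file. The sketch below models a font's character map as a plain dict (an assumed simplification; a real implementation would rewrite a TTF's cmap table with fontTools, which is not used here):

```python
import random

digits = "0123456789"
true_cmap = {ch: f"glyph_{ch}" for ch in digits}  # codepoint -> glyph shape

# Obfuscation: make each codepoint render the glyph of a *different* digit,
# so the bytes a scraper extracts are not the digits a reader sees.
perm = list(digits)
random.Random(42).shuffle(perm)
fake_cmap = {ch: true_cmap[sub] for ch, sub in zip(digits, perm)}

# De-obfuscation: a sufficiently rich glyph-shape library lets us
# invert glyph -> character.
glyph_to_char = {glyph: ch for ch, glyph in true_cmap.items()}

def decode(scraped_text: str) -> str:
    """Map scraped codepoints back to the digits actually rendered."""
    return "".join(glyph_to_char[fake_cmap[ch]] for ch in scraped_text)

print(decode("13579"))
```

Against hand-edited glyph shapes (the escalation above), the dict lookup would be replaced by shape matching or image-style recognition.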
10.340909
53
0.731868
yue_Hant
0.832168
e99292720aa7b53fe3c4321c7e08ed50f9d87f02
79
md
Markdown
twitter_sanity/scraping/README.md
rohantheowen/twitter-sanity
11a1b492a32be29e4653a9130f9b0939d6366f71
[ "MIT" ]
9
2020-05-11T06:14:54.000Z
2020-06-23T12:52:52.000Z
twitter_sanity/scraping/README.md
rohantheowen/twitter-sanity
11a1b492a32be29e4653a9130f9b0939d6366f71
[ "MIT" ]
6
2020-05-12T21:57:11.000Z
2020-06-26T04:13:55.000Z
twitter_sanity/scraping/README.md
rohantheowen/twitter-sanity
11a1b492a32be29e4653a9130f9b0939d6366f71
[ "MIT" ]
10
2020-05-09T06:18:06.000Z
2020-06-23T14:11:02.000Z
This directory should contain code for scraping tweets and adding to a csv file
79
79
0.835443
eng_Latn
0.999962
e992d98023ed4413141606aa873a4b2c7f297069
3,324
md
Markdown
_posts/2021-02-06-det-install-all.md
goodgodgd/ian-lecture
9736c5ed22063daebdbaa84fabdf0d521fb80165
[ "MIT" ]
1
2021-12-30T00:58:57.000Z
2021-12-30T00:58:57.000Z
_posts/2021-02-06-det-install-all.md
goodgodgd/ian-lecture
9736c5ed22063daebdbaa84fabdf0d521fb80165
[ "MIT" ]
3
2019-06-25T23:19:05.000Z
2020-11-04T17:01:41.000Z
_posts/2021-02-06-det-install-all.md
goodgodgd/ian-lecture
9736c5ed22063daebdbaa84fabdf0d521fb80165
[ "MIT" ]
null
null
null
--- layout: post title: "[Det] Environment Setup" date: 2021-02-06 09:00:13 categories: 2021-1-detector --- ## Ubuntu Setting - Ubuntu 20.04 Download: <https://ubuntu.com/download/desktop> - Install Ubuntu with the default options - `sudo apt update && sudo apt upgrade -y` - `sudo apt install build-essential cmake git curl` - NVIDIA driver: `sudo ubuntu-drivers autoinstall` - CUDA Download: <https://developer.nvidia.com/cuda-toolkit-archive> - Currently (2021.02), download CUDA 11.0 for Tensorflow 2.4 and Pytorch 1.7 - `chmod a+x cuda_<version>.run` - `sudo ./cuda_<version>.run` - cuDNN: <https://developer.nvidia.com/rdp/cudnn-archive> - Log in with an NVIDIA account - Select "cuDNN-xxx for CUDA 11.0" - Download "cuDNN Library for Linux (x86_64)" - Extract the tgz archive - Copy the headers and libraries ``` sudo cp <extracted directory>/cuda/include/* /usr/local/cuda/include sudo cp <extracted directory>/cuda/lib64/* /usr/local/cuda/lib64 ``` - Add the following to `~/.profile` if it is not there already ``` export LD_LIBRARY_PATH="/usr/local/cuda/lib64:${LD_LIBRARY_PATH}" export PATH="/usr/local/cuda/bin:${PATH}" ``` - Pyenv: <https://github.com/pyenv/pyenv-installer> - pyenv prerequisites: `https://github.com/pyenv/pyenv/wiki/Common-build-problems` - `curl https://pyenv.run | bash` - add the three lines to `~/.bashrc` ``` export PATH="$HOME/.pyenv/bin:$PATH" eval "$(pyenv init -)" eval "$(pyenv virtualenv-init -)" ``` - PyCharm: `sudo snap install pycharm-community --classic` - Korean input setup - Settings -> Region & Language -> Add "Korean(101/104 key compatible)" -> Manage Installed Languages -> Update - `sudo apt install fcitx-hangul` - Settings -> Region & Language -> Manage Installed Languages -> Keyboard input method system: "fcitx" - Reboot - Register the Korean/English toggle key: keyboard icon at top right -> Configure -> "+" -> add "Hangul" -> Global Config tab -> Trigger Input Method - Naver Whale: <https://whale.naver.com/ko/download> ## Pyenv Setting Install Python with pyenv: `pyenv install 3.8.x` (latest) ### Setup Virtual Environment ``` pyenv virtualenv 3.8.x py38dl mkdir -p ~/workspace/detlec cd ~/workspace/detlec pyenv 
local py38dl pip install --upgrade pip pip install numpy opencv-python scikit-learn matplotlib pydot sudo apt install graphviz pip install tensorflow==2.4 pip install torch==1.7.1+cu110 torchvision==0.8.2+cu110 \ torchaudio===0.7.2 -f https://download.pytorch.org/whl/torch_stable.html ``` ### Test TF2 and Pytorch - Open PyCharm at `~/workspace/detlec` - Set pycharm interpreter at `~/.pyenv/versions/py38dl/bin/python` - Test tensorflow by ```python import tensorflow as tf x = tf.random.uniform((2, 3, 4)) print(tf.reduce_sum(x, axis=(1, 2))) --------------------------- Output: tf.Tensor([4.5362177 5.611371 ], shape=(2,), dtype=float32) ``` - Test pytorch by ```python import tensorflow as tf x = tf.random.uniform((2, 3, 4)) print(tf.reduce_sum(x, axis=(1, 2))) --------------------------- Output: tensor([8.4671, 4.6608]) ``` ## Troubleshooting ### 우분투 반응 느림 - RTX 2070 이 장착된 랩탑에서 Ubuntu 20.04를 설치했더니 자꾸 반응이 느려지는 현상 발생 - 키보드나 마우스 입력 후 3~4초 후에 반응함 - Ubuntu 18.04에서는 이상없음 - Ubuntu 18.04에서 20.04로 업그레이드를 해도 같은 현상 발생 - `software-properties-gtk`에서 그래픽 드라이버 버전을 460에서 450으로 수정했더니 다시 빨라짐 - CUDA 11.1까지는 450 버전에서 작동
22.013245
113
0.655235
kor_Hang
0.832099
e99322495b8654264272d703b0465bbd96a04e66
5,672
md
Markdown
sample/private-repos/README.md
ImJasonH/elafros
ea02baacea2f30c859364462cc761458bf2deae7
[ "Apache-2.0" ]
null
null
null
sample/private-repos/README.md
ImJasonH/elafros
ea02baacea2f30c859364462cc761458bf2deae7
[ "Apache-2.0" ]
null
null
null
sample/private-repos/README.md
ImJasonH/elafros
ea02baacea2f30c859364462cc761458bf2deae7
[ "Apache-2.0" ]
null
null
null
# Private Repos Demo

This demo is a walk-through example that:

* Pulls from a private Github repository using a deploy-key
* Pushes to a private DockerHub repository using a username / password
* Deploys to Knative Serving using image pull secrets.

> In this demo we will assume access to an existing Knative Serving service. If not, consult [README.md](https://github.com/knative/serving/blob/master/README.md) on how to deploy one.

## The resources involved.

### Setting up the default service account (one-time)

Knative Serving will run pods as the "default" service account in whichever namespace you create resources. You can see its body via:

```shell
$ kubectl get serviceaccount default -o yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: default
  namespace: default
  ...
secrets:
- name: default-token-zd84v
```

We are going to add to this an "image pull secret", created below.

#### Creating an "image pull secret"

To learn more about Kubernetes pull secrets, see [here](https://kubernetes.io/docs/tasks/configure-pod-container/pull-image-private-registry/#create-a-secret-that-holds-your-authorization-token).

This can be created via:

```shell
kubectl create secret docker-registry dockerhub-pull-secret \
  --docker-server=https://index.docker.io/v1/ --docker-email=not@val.id \
  --docker-username=<your-name> --docker-password=<your-pword>
```

#### Updating the service account

You can add this `imagePullSecret` to your default service account by running:

```shell
kubectl edit serviceaccount default
```

This will open the resource in your configured `EDITOR`, and under `secrets:` you should add:

```yaml
secrets:
- name: default-token-zd84v
# This is the secret we just created:
imagePullSecrets:
- name: dockerhub-pull-secret
```

### Setting up our "Build" service account (one-time)

To separate our Build's credentials from our application's credentials, we will have our Build run as its own service account defined via:

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: build-bot
secrets:
- name: deploy-key
- name: dockerhub-push-secrets
```

The objects in this section are all defined in `build-bot.yaml`, and the fields that need to be populated say `REPLACE_ME`. Once these have been replaced as outlined, the "build bot" can be set up by running:

```shell
kubectl create -f build-bot.yaml
```

#### Creating a deploy key

You can set up a "deploy key" for your private Github repository following [these](https://developer.github.com/v3/guides/managing-deploy-keys/) instructions. The deploy key in this sample is *real*; you do not need to change it for the sample to work.

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: deploy-key
  annotations:
    # This tells us that this credential is for use with
    # github.com repositories.
    cloudbuild.googleapis.com/git-0: github.com
type: kubernetes.io/ssh-auth
data:
  # Generated by:
  # cat id_rsa | base64 -w 1000000
  ssh-privatekey: <long string>
  # Generated by:
  # ssh-keyscan github.com | base64 -w 100000
  known_hosts: <long string>
```

#### Creating a DockerHub push credential

Substitute your DockerHub credentials as instructed in the comments below:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: dockerhub-push-secrets
  annotations:
    cloudbuild.googleapis.com/docker-0: https://index.docker.io/v1/
type: kubernetes.io/basic-auth
data:
  # Generated by:
  # echo -n dockerhub-user | base64
  username: REPLACE_ME
  # Generated by:
  # echo -n dockerhub-password | base64
  password: REPLACE_ME
```

### Installing Build Templates (one-time)

This sample uses the [Kaniko build template](https://github.com/knative/build-templates/blob/master/kaniko/kaniko.yaml) in the [build-templates](https://github.com/knative/build-templates/) repo.

```shell
kubectl apply -f kaniko.yaml
```

### Using this in Configuration.

At this point, basically everything has been set up and you simply need to deploy your application. There is one remaining substitution to be made in `manifest.yaml`. Substitute your private DockerHub repository name for `REPLACE_ME`. Then you can run:

```shell
kubectl create -f manifest.yaml
```

As with the other demos, you can confirm that things work by capturing the IP of the ingress endpoint:

```
# Put the Ingress Host name into an environment variable.
export SERVICE_HOST=`kubectl get route private-repos \
  -o jsonpath="{.status.domain}"`

export SERVICE_IP=`kubectl get ing private-repos-ela-ingress \
  -o jsonpath="{.status.loadBalancer.ingress[*]['ip']}"`
```

If your cluster is running outside a cloud provider (for example on Minikube), your ingress will never get an address. In that case, use the istio `hostIP` and `nodePort` as the service IP:

```shell
export SERVICE_IP=$(kubectl get po -l istio=ingress -n istio-system -o 'jsonpath={.items[0].status.hostIP}'):$(kubectl get svc istio-ingress -n istio-system -o 'jsonpath={.spec.ports[?(@.port==80)].nodePort}')
```

Now curl the service IP as if DNS were properly configured:

```
curl -H "Host: $SERVICE_HOST" http://$SERVICE_IP
```

## Appendix: Sample Code

The sample code is in a private Github repository consisting of two files.

1. `Dockerfile`

   ```Dockerfile
   FROM golang
   ENV GOPATH /go
   ADD . /go/src/github.com/dewitt/knative-build
   RUN CGO_ENABLED=0 go build github.com/dewitt/knative-build
   ENTRYPOINT ["knative-build"]
   ```

1. `main.go`

   ```go
   package main

   import (
   	"fmt"
   	"net/http"
   )

   const (
   	port = ":8080"
   )

   func helloWorld(w http.ResponseWriter, r *http.Request) {
   	fmt.Fprintf(w, "Hello World.")
   }

   func main() {
   	http.HandleFunc("/", helloWorld)
   	http.ListenAndServe(port, nil)
   }
   ```
25.54955
209
0.739422
eng_Latn
0.936021
e993b644be6cba5ad34cbada84abd8906e946417
2,230
md
Markdown
README.md
kaynarov/commun.contracts
6e1482acd28cd2b33fc93ee5d6c88c7038bb0d44
[ "MIT" ]
null
null
null
README.md
kaynarov/commun.contracts
6e1482acd28cd2b33fc93ee5d6c88c7038bb0d44
[ "MIT" ]
null
null
null
README.md
kaynarov/commun.contracts
6e1482acd28cd2b33fc93ee5d6c88c7038bb0d44
[ "MIT" ]
1
2020-10-25T13:58:12.000Z
2020-10-25T13:58:12.000Z
<img width="600" src="./docs/logo_1.png" />

*****
[![buildkite](https://badge.buildkite.com/e37eaf75ef47a17ecf8d2b451d0175fb22907f5b51c5034334.svg?branch=master)](https://buildkite.com/commun.contracts)
[![GitHub](https://img.shields.io/github/license/cyberway/cyberway.contracts.svg)](https://github.com/cyberway/cyberway.contracts/blob/master/LICENSE)

# Welcome to the CyberWay Commun Application source code repository!

The Commun Application is a platform for creating self-managed communities. Although similar platforms already exist on other networks, the Commun platform on CyberWay has a number of original solutions:

* The community does not have governance. Instead, an elective system of leaders is implemented that operates according to the principles of liquid democracy.
* Monetization is implemented by default. All community members are rewarded, including:
   * message authors - for published content;
   * curators - for the ranking (voting) of published content;
   * leaders - for fulfilling their duties stipulated in the rules of the community.
* The rules by which a community operates are public, but may vary from community to community. The number of communities is not limited.

In the future, the platform's capabilities will be expanded, which will allow users to create new types of communities, such as:

* Private communities in which membership is approved by a leader.
* Paid communities working on a subscription model and/or entry fees.
* Business communities owned by a company with a specific owner and leaders appointed by this owner. Content in such communities is created only by users who have permission to do so. In such communities, all profits are distributed between members in accordance with the settings specified by the owner.

## Commun Contracts Description

* [point](https://doxygen.cyberway.io/group__point.html)
* [emit](https://doxygen.cyberway.io/group__emission.html)
* [ctrl](https://doxygen.cyberway.io/group__control.html)
* [list](https://doxygen.cyberway.io/group__list.html)
* [gallery](https://doxygen.cyberway.io/group__gallery.html)
* [publication](https://doxygen.cyberway.io/group__publish.html)
* [social](https://doxygen.cyberway.io/group__social.html)
65.588235
301
0.790583
eng_Latn
0.992621
e993c0b4586b4e03a2a49c75dd6b5bf5fdc9345a
153
md
Markdown
threads-tasks-async/README.md
bobhonores/learning
7473df7ed9c5a4c2f7e9d1581c9b0385acaf2fa1
[ "MIT" ]
null
null
null
threads-tasks-async/README.md
bobhonores/learning
7473df7ed9c5a4c2f7e9d1581c9b0385acaf2fa1
[ "MIT" ]
null
null
null
threads-tasks-async/README.md
bobhonores/learning
7473df7ed9c5a4c2f7e9d1581c9b0385acaf2fa1
[ "MIT" ]
null
null
null
# Let's talk about ... Thread, Task, and Async

- [Threads](threads.md)
- [Threadpool](threadpool.md)
- [Tasks](tasks.md)
- [Async Programming](async.md)
25.5
46
0.679739
eng_Latn
0.464858
e9940bac2b01576b2cbc786312b823efab93c2e6
6,296
md
Markdown
articles/azure-signalr/howto-use-managed-identity.md
ancorrg/azure-docs.es-es
fe960fa0a6fb269e31c6120dcc310cfc28e1239d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-signalr/howto-use-managed-identity.md
ancorrg/azure-docs.es-es
fe960fa0a6fb269e31c6120dcc310cfc28e1239d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/azure-signalr/howto-use-managed-identity.md
ancorrg/azure-docs.es-es
fe960fa0a6fb269e31c6120dcc310cfc28e1239d
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Managed identities for Azure SignalR Service
description: Learn how managed identities work in Azure SignalR Service, and how to use a managed identity in serverless scenarios.
author: chenyl
ms.service: signalr
ms.topic: article
ms.date: 06/8/2020
ms.author: chenyl
ms.openlocfilehash: 9b6141e6009cb868d63429836f8c8f050c792ee5
ms.sourcegitcommit: dbe434f45f9d0f9d298076bf8c08672ceca416c6
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 10/17/2020
ms.locfileid: "92152309"
---
# <a name="managed-identities-for-azure-signalr-service"></a>Managed identities for Azure SignalR Service

This article explains how to create a managed identity for Azure SignalR Service and how to use it in serverless scenarios.

> [!Important]
> Azure SignalR Service supports only one managed identity. That means you can add either a system-assigned identity or a user-assigned identity.

## <a name="add-a-system-assigned-identity"></a>Add a system-assigned identity

To set up a managed identity in the Azure portal, you first create an Azure SignalR Service instance and then enable the feature.

1. Create an Azure SignalR Service instance in the portal as you normally would. Browse to it in the portal.

2. Select **Identity**.

3. On the **System assigned** tab, switch **Status** to **On**. Select **Save**.

    :::image type="content" source="media/signalr-howto-use-managed-identity/system-identity-portal.png" alt-text="Add a system-assigned identity in the portal":::

## <a name="add-a-user-assigned-identity"></a>Add a user-assigned identity

Creating an Azure SignalR Service instance with a user-assigned identity requires that you create the identity and then add its resource identifier to the service.

1. Create a user-assigned managed identity resource according to [these instructions](../active-directory/managed-identities-azure-resources/how-to-manage-ua-identity-portal.md#create-a-user-assigned-managed-identity).

2. Create an Azure SignalR Service instance in the portal as you normally would. Browse to it in the portal.

3. Select **Identity**.

4. On the **User assigned** tab, select **Add**.

5. Search for the identity that you created earlier and select it. Select **Add**.

    :::image type="content" source="media/signalr-howto-use-managed-identity/user-identity-portal.png" alt-text="Add a user-assigned identity in the portal":::

## <a name="use-a-managed-identity-in-serverless-scenarios"></a>Use a managed identity in serverless scenarios

Azure SignalR Service is a fully managed service, so you can't use a managed identity to obtain tokens manually. Instead, Azure SignalR Service uses the managed identity that you set to obtain an access token. The service then sets the access token into an `Authorization` header of an upstream request in serverless scenarios.

### <a name="enable-managed-identity-authentication-in-upstream-settings"></a>Enable managed identity authentication in upstream settings

1. Add a system-assigned identity or a user-assigned identity.

2. Configure upstream settings and use **ManagedIdentity** as the **Auth** value. To learn how to create upstream settings with authentication, see [Upstream settings](concept-upstream.md).

3. In the managed identity authentication settings, you can specify the target resource in **Resource**. The resource will become an `aud` claim in the obtained access token, which can be used as part of the validation in your upstream endpoints. The resource can be one of the following:

   - Empty
   - Application (client) ID of the service principal
   - Application ID URI of the service principal
   - [Resource ID of an Azure service](../active-directory/managed-identities-azure-resources/services-support-managed-identities.md#azure-services-that-support-azure-ad-authentication)

   > [!NOTE]
   > If you validate the access token yourself in your service, you can choose any format for the resource. Just make sure that the **Resource** value in the **Auth** settings is consistent with the validation. If you use Azure role-based access control for a data plane, you must use the resource that the service provider requests.

### <a name="validate-access-tokens"></a>Validate access tokens

The token in the `Authorization` header is a [Microsoft identity platform access token](../active-directory/develop/access-tokens.md#validating-tokens).

To validate access tokens, your app should also validate the audience and the signing tokens. These need to be validated against the values in the OpenID discovery document. For example, see the [tenant-independent version of the document](https://login.microsoftonline.com/common/.well-known/openid-configuration).

The Azure Active Directory (Azure AD) middleware has built-in capabilities for validating access tokens. You can browse through our [samples](../active-directory/develop/sample-v2-code.md) to find one in the language of your choice.

We provide libraries and code samples that show how to handle token validation. There are also several open-source partner libraries available for JSON Web Token (JWT) validation. There's at least one option for almost every platform and language. For more information about Azure AD authentication libraries and code samples, see [Microsoft identity platform authentication libraries](../active-directory/develop/reference-v2-libraries.md).

## <a name="next-steps"></a>Next steps

- [Develop and configure Azure Functions apps with Azure SignalR Service](signalr-concept-serverless-development-config.md)
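As a minimal illustration of the audience check described above, the sketch below decodes a JWT payload using only the Python standard library and compares its `aud` claim against the configured **Resource** value. The token and the `api://my-upstream` audience are made-up examples, and this does *not* verify the signature - production code must validate the token signature against the keys from the OpenID discovery document using a proper JWT library.

```python
import base64
import json


def b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url-encoded without padding; restore it first.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))


def b64url_encode(obj: dict) -> str:
    raw = json.dumps(obj).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()


def audience_matches(token: str, expected_aud: str) -> bool:
    """Check the aud claim only - signature verification is NOT done here."""
    _header, payload, _signature = token.split(".")
    claims = json.loads(b64url_decode(payload))
    return claims.get("aud") == expected_aud


# Build a toy, unsigned token for illustration (all values are made up).
token = ".".join([
    b64url_encode({"alg": "none", "typ": "JWT"}),
    b64url_encode({"aud": "api://my-upstream", "iss": "example-issuer"}),
    "",  # empty signature segment
])

print(audience_matches(token, "api://my-upstream"))  # True
print(audience_matches(token, "api://other-app"))    # False
```

The same claim inspection is what a real validator does internally; the difference is that a real validator first verifies the signature and issuer before trusting any claim.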
76.780488
532
0.795108
spa_Latn
0.98563
e9943c03cd681d4dd80bff47091af92ef800c852
684
md
Markdown
tabml/model_wrapper_map.md
tiepvupsu/tabml
81ae3c6c4376b23095a0d2675fd2dcc0cce8aac7
[ "Apache-2.0" ]
50
2021-03-22T13:18:43.000Z
2022-03-12T11:17:43.000Z
tabml/model_wrapper_map.md
tiepvupsu/tabml
81ae3c6c4376b23095a0d2675fd2dcc0cce8aac7
[ "Apache-2.0" ]
165
2021-03-19T14:43:51.000Z
2021-12-07T16:51:23.000Z
tabml/model_wrapper_map.md
tiepvupsu/tabml
81ae3c6c4376b23095a0d2675fd2dcc0cce8aac7
[ "Apache-2.0" ]
9
2021-07-20T03:26:13.000Z
2021-12-29T04:57:33.000Z
# Inheritance map

- <class '__main__.BaseModelWrapper'>
    - <class '__main__.BaseBoostingModelWrapper'>
        - <class '__main__.BaseCatBoostModelWrapper'>
            - <class '__main__.CatBoostRegressorModelWrapper'>
            - <class '__main__.CatBoostClassifierModelWrapper'>
        - <class '__main__.BaseXGBoostModelWrapper'>
            - <class '__main__.XGBoostClassifierModelWrapper'>
            - <class '__main__.XGBoostRegressorModelWrapper'>
        - <class '__main__.BaseLgbmModelWrapper'>
            - <class '__main__.LgbmRegressorModelWrapper'>
            - <class '__main__.LgbmClassifierModelWrapper'>
    - <class '__main__.BaseSklearnModelWrapper'>
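A map like the one above can be produced programmatically. Below is a minimal sketch using Python's built-in `__subclasses__()`; the class definitions are empty stand-ins covering only a subset of the wrappers listed, not the real tabml classes.

```python
# Stand-in classes mirroring part of the wrapper hierarchy above.
class BaseModelWrapper: pass
class BaseBoostingModelWrapper(BaseModelWrapper): pass
class BaseLgbmModelWrapper(BaseBoostingModelWrapper): pass
class LgbmRegressorModelWrapper(BaseLgbmModelWrapper): pass
class LgbmClassifierModelWrapper(BaseLgbmModelWrapper): pass
class BaseSklearnModelWrapper(BaseModelWrapper): pass


def inheritance_map(cls, depth=0):
    """Recursively list cls and its subclasses, one indented line per class."""
    lines = ["    " * depth + f"- {cls}"]
    for sub in cls.__subclasses__():
        lines.extend(inheritance_map(sub, depth + 1))
    return lines


print("\n".join(inheritance_map(BaseModelWrapper)))
```

Because `__subclasses__()` reflects whatever subclasses have been defined at call time, running this against the real base class reproduces the full map without maintaining it by hand.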
45.6
63
0.682749
kor_Hang
0.177464
e994a25d1e6cad75a15a50ba8db96b3e80a6f936
1,096
md
Markdown
docs/cli/ignite_build.md
sftim/ignite
e612ef77b9fa92495917f23d43b282488fbf42e7
[ "Apache-2.0" ]
null
null
null
docs/cli/ignite_build.md
sftim/ignite
e612ef77b9fa92495917f23d43b282488fbf42e7
[ "Apache-2.0" ]
null
null
null
docs/cli/ignite_build.md
sftim/ignite
e612ef77b9fa92495917f23d43b282488fbf42e7
[ "Apache-2.0" ]
1
2021-04-05T13:10:51.000Z
2021-04-05T13:10:51.000Z
## ignite build

Build a new base image for VMs

### Synopsis

Build a new base image for VMs. The base image is an ext4
block device file, which contains a root filesystem.

"build" can take in either a tarfile or a Docker image as the source.
The Docker image needs to exist on the host system (pulled locally).

If the import kernel flag (-k, --import-kernel) is specified,
/boot/vmlinux is extracted from the image and added to a new
VM kernel object named after the flag.

Example usage:

    $ ignite build my-image.tar
    $ ignite build luxas/ubuntu-base:18.04 \
        --name my-image \
        --import-kernel my-kernel

```
ignite build <source> [flags]
```

### Options

```
  -h, --help                   help for build
  -k, --import-kernel string   Import a new kernel from /boot/vmlinux in the image with the specified name
  -n, --name string            Specify the name
```

### Options inherited from parent commands

```
  -q, --quiet   The quiet mode allows for machine-parsable output, by printing only IDs
```

### SEE ALSO

* [ignite](ignite.md) - ignite: easily run Firecracker VMs
23.319149
106
0.688869
eng_Latn
0.990106
e995645fc29b8cdf5937b6f9a88b28efaae6f7af
10,151
md
Markdown
3.0/cookbook/compiling-windows.md
divyanshidm/docs
846c7bb343f016dbbfb86f62e53f1a0be330767c
[ "Apache-2.0" ]
null
null
null
3.0/cookbook/compiling-windows.md
divyanshidm/docs
846c7bb343f016dbbfb86f62e53f1a0be330767c
[ "Apache-2.0" ]
null
null
null
3.0/cookbook/compiling-windows.md
divyanshidm/docs
846c7bb343f016dbbfb86f62e53f1a0be330767c
[ "Apache-2.0" ]
null
null
null
---
layout: default
description: I want to compile ArangoDB under Windows
---
# Compiling ArangoDB under Windows

## Problem

I want to compile ArangoDB under Windows.

**Note:** For this recipe you need at least ArangoDB 2.4; it also works for 2.5 and above, up to 2.8. For ArangoDB versions before 2.4, look at [Compile on Windows (pre-2.4)](compiling-windows-legacy.html). For version 3.0 and above, see [Compile on Windows (3.x)](compiling-windows30.html).

## Solution

While compiling ArangoDB under Linux is straightforward - simply execute `configure` and `make` - compiling under Windows is a bit more demanding.

### Ingredients

For this recipe you need to install the following programs under Windows:

* [cygwin](https://www.cygwin.com/){:target="_blank"}

  You need at least `make` from cygwin. Cygwin also offers a `cmake`. Do **not** install this version. The unit tests require the bash. You should also issue these commands to generate user information for the cygwin commands:

      mkpasswd > /etc/passwd
      mkgroup > /etc/group

  Turning ACL off (noacl) for all mounts in cygwin fixes permission troubles that may appear in the build:

      # /etc/fstab
      #
      #    This file is read once by the first process in a Cygwin process tree.
      #    To pick up changes, restart all Cygwin processes.  For a description
      #    see https://cygwin.com/cygwin-ug-net/using.html#mount-table

      # noacl = Ignore Access Control List and let Windows handle permissions
      C:/cygwin64/bin /usr/bin ntfs binary,auto,noacl 0 0
      C:/cygwin64/lib /usr/lib ntfs binary,auto,noacl 0 0
      C:/cygwin64 / ntfs override,binary,auto,noacl 0 0
      none /cygdrive cygdrive binary,posix=0,user,noacl 0 0

* [cmake](https://cmake.org/){:target="_blank"}

  Either version 2.8.12, 3.0.2 or 3.1.2 should work. Attention - pitfall: the cygwin version doesn't work.

* [python](http://www.python.org){:target="_blank"}

  Any 2.x version should work - it's used to run V8's GYP. Make sure you add python.exe to your path environment variable; restarting your running shell may be necessary. Note that the build scripts aren't compatible with Python 3.x!

* [Visual Studio Express 2013 for Windows Desktop](https://www.microsoft.com/en-us/download/details.aspx?id=44914){:target="_blank"}

  More recent versions, such as [Visual Studio Community 2015](https://www.visualstudio.com/){:target="_blank"}, don't work yet.

  Please note that there are different versions of Visual Studio. The `Visual Studio for Windows` will **not work**. You need to install `Visual Studio (Express) for Windows Desktop`.

  You must configure your path in such a way that the compiler can be found. One way is to execute the `vcvarsall.bat` script from the `VC` folder.

<!-- * BOOST (if you intend to run the tests for development) [installing boost under windows](https://niuquant.wordpress.com/2013/11/16/setup-boost-1-55-0-unit-test-framework-with-visual-stuido-2010-on-windows-8-desktop/){:target="_blank"} has more details on how to. -->

* [Nullsoft Scriptable Install System](http://nsis.sourceforge.net/Download){:target="_blank"}

* [Ruby](https://www.ruby-lang.org/en/downloads/){:target="_blank"} (for the unittests, if you intend to run them)

* [procdump](https://technet.microsoft.com/en-us/sysinternals/dd996900.aspx){:target="_blank"} (for the unittests; run once to accept the eula)

* [GitLink](https://github.com/GitTools/GitLink){:target="_blank"} to adjust the pdb files to github

* [WinDbg](https://msdn.microsoft.com/de-de/windows/hardware/gg454513.aspx){:target="_blank"} (in the section "Standalone Debugging Tools for Windows (WinDbg)") to get automated backtraces during unittest runs. Hint: Add its install path to the PATH environment.

### Enable native symlinks for Cygwin and git

Cygwin will create proprietary files as placeholders by default instead of actually symlinking files. The placeholders later tell Cygwin where to resolve paths to. It does not intercept every access to the placeholders however, so 3rd party scripts break. Windows Vista and above support real symlinks, and Cygwin can be configured to make use of them:

    # use actual symlinks to prevent documentation build errors
    # (requires elevated rights!)
    export CYGWIN="winsymlinks:native"

Note that you must run Cygwin as administrator or change the Windows group policies to allow user accounts to create symlinks (`gpedit.msc` if available).

BTW: You can create symlinks manually on Windows like:

    mklink /H target/file.ext source/file.ext
    mklink /D target/path source/path
    mklink /J target/path source/path/for/junction

And in Cygwin:

    ln -s source target

### Use bundled Python and other executables

To use the bundled python 2.6, even if there's another python installation on your machine, use the following shell command in the Cygwin environment to extend the environment path (non-permanent):

    export PATH=absolute/path/to/python_26:$PATH

Add more paths if needed to make commands like `ebook-convert` available:

    export PATH="/cygdrive/c/Program Files/Calibre2:absolute/path/to/python_26:$PATH"

Colons are used to separate individual paths (`path1:path2`). `$PATH` contains the current environment paths, and must be used at the end of the export command so it does not take precedence over the newly added paths.

### Building the required libraries

First of all, start a `bash` from cygwin and clone the repository using the corresponding branch, e.g. for ArangoDB 2.6:

    git clone -b 2.6 https://github.com/arangodb/arangodb-windows-libraries.git

and switch into the directory `arangodb-windows-libraries`. This repository contains the open-source libraries which are needed by ArangoDB:

* etcd from CoreOS
* getopt
* libev
* linenoise
* openssl
* regex
* zlib

In order to build the corresponding 32bit and 64bit versions of these libraries, execute

    make
    make install
    make 3rdParty

This will create a folder `WindowsLibraries` containing the headers and libraries.

### Building ArangoDB itself

Clone the repository https://github.com/arangodb/arangodb.git and copy the previously created `WindowsLibraries` folder into this directory `ArangoDB`.

Next you need to build Google's V8 version that is part of ArangoDB. This version contains V8 and ICU. Switch into the directory ArangoDB/3rdParty and execute

    make -f Makefile.v8-windows
    make -f Makefile.v8-windows install

Now switch back into the `ArangoDB` root folder and execute

    make pack-win32

or

    make pack-win64

in order to build the installer file for either 32 bit or 64 bit.

### Development builds

For development builds which are able to run the unit tests run

<!-- (adjust the boost directory to your situation): -->
<!--
export BOOST_ROOT='c:\Program Files\boost_1_56_0\'
export BOOST_LIBRARYDIR='c:\Program Files\boost_1_56_0\lib64-msvc-12.0'
BOOSTROOT='c:\Program Files\boost_1_56_0'
BOOST_INCLUDE='c:\Program Files\boost_1_56_0\'
BOOST_LIBRARYDIR='c:\Program Files\boost_1_56_0\lib64-msvc-12.0'
BOOST_ROOT='c:\Program Files\boost_1_56_0\'
CMAKE_INCLUDE_PATHE='c:\Program Files\boost_1_56_0\'
-->

    make pack-win64-relative

Since most of the scripts assume they're running on a unix system, some directories are treated as mandatory and thus have to be created on the drive where you checked out your source:

    mkdir -p /cygdrive/c/var/tmp/

You probably already downloaded and installed [Ruby](https://www.ruby-lang.org/en/downloads/){:target="_blank"}. Now we need to install the httparty and rspec gems. In order to use gem, you need to [install certificates](https://gist.github.com/fnichol/867550){:target="_blank"}. Once you did that, you can run:

    gem install httparty rspec

The *bin/rspec.bat* created by gem is not sufficient for being run by the unittesting facility; replace it with something like this:

    @ECHO OFF
    IF NOT "%~f0" == "~f0" GOTO :WinNT
    @"C:\Program Files (x86)\Ruby21-x64\bin\ruby.exe" "C:/Program Files (x86)/Ruby21-x64/bin/rspec" %1 %2 %3 %4 %5 %6 %7 %8 %9
    GOTO :EOF
    :WinNT
    @"C:\Program Files (x86)\Ruby21-x64\bin\ruby.exe" "C:/Program Files (x86)/Ruby21-x64/bin/rspec" %*

You can then run the unittests like that:

    bash ./scripts/unittest all --skipBoost true --skipGeo true

### Executables only

If you do not need the installer file, you can use `cmake` to build the executables. Instead of `make pack-win32` use

    mkdir Build32
    cd Build32
    cmake -G "Visual Studio 12" ..

This will create a solution file in the `Build32` folder. You can now start Visual Studio and open this solution file.

To build the 64 bit version, use

    mkdir Build64
    cd Build64
    cmake -G "Visual Studio 12 Win64" ..

Alternatively use `cmake` to build the executables:

    cmake --build . --config Release

In order to execute the binaries you need to copy the ICU datafile into the directory containing the executable:

    cp WindowsLibraries/64/icudtl.dat Build64/bin/Debug/icudt54l.dat

If you intend to use the machine for development purposes, it may be more practical to copy it somewhere else:

    cp WindowsLibraries/64/icudtl.dat /cygdrive/c/Windows/icudt54l.dat

And configure your environment (yes, this instruction is reminiscent of The Hitchhiker's Guide to the Galaxy...) so that `ICU_DATA` points to `c:\\Windows`. You do that by opening the explorer, right-clicking `This PC` in the tree on the left and choosing `Properties`; in the opening window choose `Advanced system settings`; in the popup choose `Environment Variables`; another popup opens, and in the `System Variables` part you click `New` and set the key `ICU_DATA` to the value `c:\\Windows`.

![HowtoSetEnv](../images/SetEnvironmentVar.png)

**Authors**: [Frank Celler](https://github.com/fceller){:target="_blank"}, [Wilfried Goesgens](https://github.com/dothebart){:target="_blank"} and [Simran Brucherseifer](https://github.com/Simran-B){:target="_blank"}.

**Tags**: #windows #build
182
0.733425
eng_Latn
0.953611
e9959ead0b3f748e4e1f00622ec8c13fd6ab7b85
27
md
Markdown
Algebra/README.md
gentom/UnivOfMath
6979ae261b950f392ea52a4eaa3ed2c55c9ef71b
[ "Unlicense" ]
null
null
null
Algebra/README.md
gentom/UnivOfMath
6979ae261b950f392ea52a4eaa3ed2c55c9ef71b
[ "Unlicense" ]
null
null
null
Algebra/README.md
gentom/UnivOfMath
6979ae261b950f392ea52a4eaa3ed2c55c9ef71b
[ "Unlicense" ]
null
null
null
# Algebra

* Linear Algebra
9
16
0.740741
eng_Latn
0.755807
e99677f94eb8d5731f7180b5d20f8cf24c1cf22c
706
md
Markdown
README.md
zrythm/zrythm-media
31ea3318f08801f22f2750296dd27504f361f610
[ "FSFAP" ]
3
2021-05-15T18:37:24.000Z
2021-05-18T18:53:08.000Z
README.md
zrythm/zrythm-media
31ea3318f08801f22f2750296dd27504f361f610
[ "FSFAP" ]
null
null
null
README.md
zrythm/zrythm-media
31ea3318f08801f22f2750296dd27504f361f610
[ "FSFAP" ]
null
null
null
Zrythm media
============

This repository is a collection of resources and media for the [Zrythm digital audio workstation](https://www.zrythm.org).

This is a work in progress. Patches to add more content are welcome.

# Licensing

See each directory's `README`/`README.md` file for licensing info of each file.

# Trademarks

Zrythm and the Zrythm logo are registered trademarks of Alexandros Theodotou in the United Kingdom.

---

Copyright (C) 2021 Alexandros Theodotou

Copying and distribution of this file, with or without modification, are permitted in any medium without royalty provided the copyright notice and this notice are preserved. This file is offered as-is, without any warranty.
26.148148
68
0.77762
eng_Latn
0.998403
e997128a2efa6c45fd99ed585336210b73e8ae27
2,118
md
Markdown
source/includes/_futures_ws_sub_order.md
bithumbfutures/bithumb-futures-api-doc
b683dccb713cf677f9c5de633e05b5ae25d54cad
[ "Apache-2.0" ]
null
null
null
source/includes/_futures_ws_sub_order.md
bithumbfutures/bithumb-futures-api-doc
b683dccb713cf677f9c5de633e05b5ae25d54cad
[ "Apache-2.0" ]
4
2020-02-26T11:41:47.000Z
2021-05-20T12:49:28.000Z
source/includes/_futures_ws_sub_order.md
bithumbfutures/bithumb-futures-api-doc
b683dccb713cf677f9c5de633e05b5ae25d54cad
[ "Apache-2.0" ]
null
null
null
## Order Channels

### Subscribe All Order-Related Channels

You can subscribe/unsubscribe to all order-related channels with one message.

To subscribe: `{"op":"sub", "id": "abcd1234", "ch":"order:futures"}`

To unsubscribe: `{"op":"unsub", "id": "abcd1234", "ch":"order:futures"}`

In both messages above, the `id` field is optional.

Once subscribed, you will start to receive messages from all the following channels:

* **order** - order updates
* **futures-collateral** - updates in collateral balance
* **futures-position** - updates in contract position along with other contract-specific data
* **futures-risk** - updates in overall account data

Please refer to [WebSocket - Futures Balance Update Messages](https://github.com/bithumbfutures/bithumb-futures-api-doc/blob/master/misc/doc-balance-update-messages.md) for implementation details.

##### tp - Transaction Type

The `tp` field in the message shows the reason for the balance update. Below are some common transaction types:

* `ExecutionUpdate` - message about order status update
* `TakeOver` - the account has been taken over due to heightened risk exposure.
* `PositionInjectionBLP` - position injected to Backstop Liquidity Providers (BLPs); you will only see this message if your account is registered as a BLP with the exchange.
* `PositionInjection` - position injected to regular accounts.
* `FundingPayment` - funding payment made to the account.
* `FuturesPnlSettlement` - rolling position PnL into realized PnL

##### Identifying Balance Update Batches by execId (execution Id) and txNum (Transaction Number)

Some balance updates, such as position injection, are done in batches. The `txNum` field can help you identify such updates. For a batch of *n* update messages, the *i*-th message will have `txNum = n-i` - that is, the first message will have `txNum = n-1` and the last message will have `txNum = 0`.

#### Retrieve Transaction Details

You can use `execId` to retrieve balance update details using the RESTful API.
See [Lookup Balance Update Records by Id](#lookup-balance-update-records-by-id)
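The `txNum` countdown described above can be sketched as follows (a minimal Python illustration; the message dicts are simplified stand-ins for the real channel payloads):

```python
def collect_batches(messages):
    """Group balance-update messages into batches using the txNum countdown.

    For a batch of n messages, the first arrives with txNum = n-1 and the
    last with txNum = 0, so txNum == 0 marks the end of each batch.
    """
    batches, current = [], []
    for msg in messages:
        current.append(msg)
        if msg["txNum"] == 0:
            batches.append(current)
            current = []
    return batches

# A two-message position-injection batch followed by a single-message update.
updates = [
    {"execId": "e1", "txNum": 1},
    {"execId": "e1", "txNum": 0},
    {"execId": "e2", "txNum": 0},
]
print([len(b) for b in collect_batches(updates)])  # [2, 1]
```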
46.043478
197
0.735127
eng_Latn
0.975481
e99917dcaf815a2e760d2414a07b29df6077af12
123
md
Markdown
README.md
SteveClement/2015-01-13-sc-presa-GGC-6_BullShit
f214d875f395012d617ef7b41b490fa5a48b4c50
[ "Unlicense" ]
null
null
null
README.md
SteveClement/2015-01-13-sc-presa-GGC-6_BullShit
f214d875f395012d617ef7b41b490fa5a48b4c50
[ "Unlicense" ]
null
null
null
README.md
SteveClement/2015-01-13-sc-presa-GGC-6_BullShit
f214d875f395012d617ef7b41b490fa5a48b4c50
[ "Unlicense" ]
null
null
null
This will need XeLaTeX with --shell-escape and the minted LaTeX syntax highlighting package as well as the beamer package.
123
123
0.821138
eng_Latn
0.999642
e99920b0b0d090433438e2aa4605b28dfcecf59b
9,949
md
Markdown
EVENTS.md
cr74ever/WhatsAPI
2010f6c3013ee6bf3ffe9551589de966f6d55f03
[ "MIT" ]
2
2018-12-11T15:23:10.000Z
2021-03-06T07:17:41.000Z
EVENTS.md
zyzuluzu/WhatsAPI
2010f6c3013ee6bf3ffe9551589de966f6d55f03
[ "MIT" ]
1
2015-08-13T23:15:14.000Z
2015-08-13T23:15:14.000Z
EVENTS.md
zyzuluzu/WhatsAPI
2010f6c3013ee6bf3ffe9551589de966f6d55f03
[ "MIT" ]
4
2016-07-05T03:04:23.000Z
2019-03-12T13:28:45.000Z
Available events and arguments
==============================

- onClose:
  - phone: The user phone number including the country code.
  - error: The error message.
- onCodeRegister:
  - phone: The user phone number including the country code.
  - login: Phone number with country code.
  - pw: Account password.
  - type: Type of account.
  - expiration: Expiration date in UNIX TimeStamp.
  - kind: Kind of account.
  - price: Formatted price of account.
  - cost: Decimal amount of account.
  - currency: Currency price of account.
  - price_expiration: Price expiration in UNIX TimeStamp.
- onCodeRegisterFailed:
  - phone: The user phone number including the country code.
  - status: The server status number.
  - reason: Reason of the status (e.g. too_recent/missing_param/bad_param).
  - retry_after: Waiting time before requesting a new code in seconds.
- onCodeRequest:
  - phone: The user phone number including the country code.
  - method: Used method (SMS/voice).
  - length: Registration code length.
- onCodeRequestFailed:
  - phone: The user phone number including the country code.
  - method: Used method (SMS/voice).
  - reason: Reason of the status (e.g. too_recent/missing_param/bad_param).
  - value: The missing_param/bad_param or waiting time before requesting a new code.
- onCodeRequestFailedTooRecent:
  - phone: The user phone number including the country code.
  - method: Used method (SMS/voice).
  - reason: Reason of the status (too_recent).
  - retry_after: Waiting time before requesting a new code in seconds.
- onConnect:
  - phone: The user phone number including the country code.
  - socket: The resource socket id.
- onCredentialsBad:
  - phone: The user phone number including the country code.
  - status: Account status.
  - reason: The reason.
- onCredentialsGood:
  - phone: The user phone number including the country code.
  - login: Phone number with country code.
  - pw: Account password.
  - type: Type of account.
  - expiration: Expiration date in UNIX TimeStamp.
  - kind: Kind of account.
  - price: Formatted price of account.
  - cost: Decimal amount of account.
  - currency: Currency price of account.
  - price_expiration: Price expiration in UNIX TimeStamp.
- onDisconnect:
  - phone: The user phone number including the country code.
  - socket: The resource socket id.
- onDissectPhone:
  - phone: The user phone number including the country code.
  - country: The detected country name.
  - cc: The number's country code.
  - mcc: International cell network code for the detected country.
  - lc: Location code for the detected country.
  - lg: Language code for the detected country.
- onDissectPhoneFailed:
  - phone: The user phone number including the country code.
- onGetAudio:
  - phone: The user phone number including the country code.
  - from: The sender phone number.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
  - name: The sender name.
  - size: The audio size.
  - url: The url to the bigger audio version.
  - file: The audio name.
  - mimetype: The audio mime type.
  - filehash: The audio file hash.
  - duration: The audio duration.
  - acodec: The audio codec.
- onGetError:
  - phone: The user phone number including the country code.
  - error: Array with error data for why request failed.
- onGetGroups:
  - phone: The user phone number including the country code.
  - groupList: Array with all the groups and group info.
- onGetGroupsInfo:
  - phone: The user phone number including the country code.
  - groupList: Array with the group info.
- onGetGroupsSubject:
  - phone: The user phone number including the country code.
  - gId: The group JID.
  - time: The unix time when the subject was sent.
  - author: The author phone number including the country code.
  - participant: The participant phone number including the country code.
  - name: The sender name.
  - subject: The subject (e.g. group name).
- onGetImage:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
  - name: The sender name.
  - size: The image size.
  - url: The url to the bigger image version.
  - file: The image name.
  - mimetype: The image mime type.
  - filehash: The image file hash.
  - width: The image width.
  - height: The image height.
  - thumbnail: The base64_encode image thumbnail.
- onGetLocation:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
  - name: The sender name.
  - place_name: The place name.
  - longitude: The location longitude.
  - latitude: The location latitude.
  - url: The place url.
  - thumbnail: The base64_encode location image thumbnail.
- onGetMessage:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
  - name: The sender name.
  - message: The message.
- onGetPrivacyBlockedList:
  - phone: The user phone number including the country code.
  - data: Array of data nodes containing numbers you have blocked.
- onGetProfilePicture:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - type: The type of picture (image/preview).
  - thumbnail: The base64_encoded image.
- onGetRequestLastSeen:
  - phone: The user phone number including the country code.
  - msgid: The message id.
  - sec: The number of seconds since the user went offline.
- onGetServerProperties:
  - phone: The user phone number including the country code.
  - version: The version number on the server.
  - properties: Array of server properties.
- onGetvCard:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
  - name: The sender name.
  - contact: The vCard contact name.
  - vcard: The vCard.
- onGetVideo:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
  - name: The sender name.
  - url: The url to the bigger video version.
  - file: The video name.
  - size: The video size.
  - mimetype: The video mime type.
  - filehash: The video file hash.
  - duration: The video duration.
  - vcodec: The video codec.
  - acodec: The audio codec.
  - thumbnail: The base64_encode video thumbnail.
- onGroupsChatCreate:
  - phone: The user phone number including the country code.
  - gId: The group JID.
- onGroupsChatEnd:
  - phone: The user phone number including the country code.
  - gId: The group JID.
- onGroupsParticipantsAdd:
  - phone: The user phone number including the country code.
  - groupId: The group JID.
  - participant: The participant JID.
- onGroupsParticipantsRemove:
  - phone: The user phone number including the country code.
  - groupId: The group JID.
  - participant: The participant JID.
  - author: The author JID.
- onLogin:
  - phone: The user phone number including the country code.
- onMessageComposing:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
- onMessagePaused:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
- onMessageReceivedClient:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
- onMessageReceivedServer:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - msgid: The message id.
  - type: The message type.
  - time: The unix time when the message notification was sent.
- onPing:
  - phone: The user phone number including the country code.
  - msgid: The message id.
- onPresence:
  - phone: The user phone number including the country code.
  - from: The sender JID.
  - type: The presence type.
- onSendMessageReceived:
  - phone: The user phone number including the country code.
  - time: The unix time when the message notification was sent.
  - from: The sender JID.
- onSendPong:
  - phone: The user phone number including the country code.
  - msgid: The message id.
- onSendPresence:
  - phone: The user phone number including the country code.
  - type: Presence type.
  - name: User nickname.
- onSendStatusUpdate:
  - phone: The user phone number including the country code.
  - msg: The status message.
- onUploadFile:
  - phone: The user phone number including the country code.
  - name: The filename.
  - url: The remote url on WhatsApp servers (note: this is NOT the URL to download the file, only used for sending messages).
- onUploadFileFailed:
  - phone: The user phone number including the country code.
  - name: The filename.

How to bind a callback to an event
==================================

# Event handler declaration

```php
function MyFunction_onConnect($phone, $socket)
{
    print("$socket\n");
}
```

# Require the class.

```php
require 'whatsprot.class.php';
```

# Create an instance of WhatsProt.

```php
$w = new WhatsProt($userPhone, $userIdentity, $userName, $debug);
```

# Bind a callback to an event

```php
# $w->eventManager()->bind((string) $event, (string) $callback);
$w->eventManager()->bind('onConnect', 'MyFunction_onConnect');
```

# Connect to WhatsApp servers.

```php
$w->connect();
```

# Login to WhatsApp

```php
$w->loginWithPassword($password);
```

[...]
36.047101
124
0.714846
eng_Latn
0.988203
e99922df39bf1223cc09308bb111af5aea4b0da4
415
md
Markdown
src/translate/api/progress-bar.readme.md
scrpgil/ionic-docs
b2247aaf68b2d2f5a2c56d9a463aaae37e3a2b02
[ "Apache-2.0" ]
null
null
null
src/translate/api/progress-bar.readme.md
scrpgil/ionic-docs
b2247aaf68b2d2f5a2c56d9a463aaae37e3a2b02
[ "Apache-2.0" ]
null
null
null
src/translate/api/progress-bar.readme.md
scrpgil/ionic-docs
b2247aaf68b2d2f5a2c56d9a463aaae37e3a2b02
[ "Apache-2.0" ]
null
null
null
# ion-progress-bar

ion-progress-bar is a horizontal progress bar for visualizing the progress of an operation or process. You can choose between two types: `determinate` and `indeterminate`.

## Progress Type

### Determinate

If the progress percentage of the operation is known, the determinate type should be used. This is the default type, and the progress percentage is specified with the `value` attribute.

The buffer displays a circle animation indicating that some kind of processing is in progress. If the `buffer` attribute value is less than 1, the progress of an additional buffering process can be shown.

### Indeterminate

Represents an operation in progress without needing to indicate how long the operation will take. If `reversed="true"` is specified, the bar is displayed to represent a pre-loading state.
23.055556
98
0.840964
yue_Hant
0.079069
e999d2333647aa8741d8415ce8ab9a77be906010
2,409
md
Markdown
src/Docs/_new/4-how-to/140-reading-entities-dal.md
Gamingpc/gss
f6a636c116f73d7747db0f7eb009cc6104b3699b
[ "MIT" ]
1
2020-01-11T08:06:44.000Z
2020-01-11T08:06:44.000Z
src/Docs/_new/4-how-to/140-reading-entities-dal.md
Gamingpc/gss
f6a636c116f73d7747db0f7eb009cc6104b3699b
[ "MIT" ]
32
2019-08-05T11:37:43.000Z
2022-02-26T01:39:41.000Z
src/Docs/_new/4-how-to/140-reading-entities-dal.md
Gamingpc/gss
f6a636c116f73d7747db0f7eb009cc6104b3699b
[ "MIT" ]
null
null
null
[titleEn]: <>(Reading entities via DAL)
[metaDescriptionEn]: <>(Very often one wants to read data from the database and therefore has to write his own queries using PDO. In the Shopware platform, it's highly recommended to not write custom queries in order to fetch data, but to use the methods from our data abstraction layer, in short DAL.)

## Overview

Very often one wants to read data from the database and therefore has to write their own queries using PDO. In the Shopware platform, it's **highly recommended** not to write custom queries in order to fetch data, but to use the methods from our [data abstraction layer](./../2-internals/1-core/20-data-abstraction-layer/__categoryInfo.md), in short `DAL`.

Here are a few examples of how to read your entity data using the DAL. All of the following methods are to be executed on the entities' respective repository.

## Reading entities

The entity repositories provide a `search()` method which takes two arguments:

1. The `Criteria` object, which holds a list of ids.
2. The `Context` object to be read with

```php
/** @var EntityRepositoryInterface $productRepository */
$productRepository = $this->container->get('product.repository');

/** @var EntityCollection $entities */
$entities = $productRepository->search(
    new Criteria([
        'f8d36562c5614c5994aecb9c73d2b13e',
        '67a8a047b638493d95bb2a4cdf351cf3',
        'b94055962e4b49ceb86f55f8d1932607',
    ]),
    \Shopware\Core\Framework\Context::createDefaultContext()
);
```

The return value will be a collection containing all found entities as hydrated objects.

### Reading entities without an ID

In many cases you don't even know the ID of the entity you're looking for. In order to search for entities using something other than the ID, you'll have to use filters. [Read here](./../2-internals/1-core/20-data-abstraction-layer/020-search.md#Filter) for more information about the DAL filter types and how to use them.
The following example code will look for a product whose `name` equals 'Example product':

```php
/** @var EntityRepositoryInterface $productRepository */
$productRepository = $this->container->get('product.repository');

/** @var EntityCollection $entities */
$entities = $productRepository->search(
    (new Criteria())->addFilter(new EqualsFilter('name', 'Example product')),
    \Shopware\Core\Framework\Context::createDefaultContext()
);
```
42.263158
302
0.752179
eng_Latn
0.9875
e999de15e49e0cc28d3f86e33197f17c83044e40
13,661
md
Markdown
Instructions/Labs/LAB_AK_05_Lab4_Ex1_Prepare_Identity_Synch.md
J-O-S-H-C/MS-100T00-Microsoft-365-Identity-and-Services
195656e0a24518c30721a4b4e882a8ba7df8a260
[ "MIT" ]
null
null
null
Instructions/Labs/LAB_AK_05_Lab4_Ex1_Prepare_Identity_Synch.md
J-O-S-H-C/MS-100T00-Microsoft-365-Identity-and-Services
195656e0a24518c30721a4b4e882a8ba7df8a260
[ "MIT" ]
null
null
null
Instructions/Labs/LAB_AK_05_Lab4_Ex1_Prepare_Identity_Synch.md
J-O-S-H-C/MS-100T00-Microsoft-365-Identity-and-Services
195656e0a24518c30721a4b4e882a8ba7df8a260
[ "MIT" ]
null
null
null
# Module 5 - Lab 4 - Exercise 1 - Prepare for Identity Synchronization

As in the previous lab exercises you will take on the role of Holly Dickson, Adatum Corporation’s Enterprise Administrator. Adatum has recently subscribed to Microsoft 365, and you have been tasked with deploying the application in Adatum’s virtualized lab environment. In this lab, you will perform the tasks necessary to manage your Microsoft 365 identity environment using both the Microsoft 365 admin center and Windows PowerShell. During this exercise you will set up and manage Azure AD Connect. You will create on-premises users and validate the sync process so that their identity is moved to the cloud. Some of the steps may feel familiar from previous exercises; however, in this case they are needed to validate the synchronization process.

### Task 1: Configure your UPN suffix

In Active Directory, the default User Principal Name (UPN) suffix is the DNS name of the domain where the user account was created. The Azure AD Connect wizard uses the UserPrincipalName attribute, or it lets you specify the on-premises attribute (in a custom installation) to be used as the user principal name in Azure AD. This is the value that is used for signing into Azure AD.

If you recall, your VM environment was created by your lab hosting provider with an on-premises domain titled **adatum.com**. This domain included a number of on-premises user accounts, such as Holly Dickson, Laura Atkins, and so on. Then in the first lab in this course, you created a custom, accepted domain for Adatum titled **xxxUPNxxx.xxxCustomDomainxxx.xxx** (where xxxUPNxxx was the unique UPN name assigned to your tenant, and xxxCustomDomainxxx.xxx was the name assigned to the domain by your lab hosting provider).
In this task, you will use PowerShell to change the user principal name of the domain for the entire Adatum Corporation by replacing the originally established **adatum.com** domain with the custom **xxxUPNxxx.xxxCustomDomainxxx.xxx** domain. In doing so, you will update the UPN suffix for the primary domain and the UPN on every on-premises user account in AD DS with **@xxxUPNxxx.xxxCustomDomainxxx.xxx**.

A company may change its domain name for a variety of reasons. For example, a company may purchase a new domain name, or a company may change its name and it wants its domain name to reflect the new company name, or a company may be sold and it wants its domain name to reflect the new parent company’s name. Regardless of the underlying reason, the goal of changing a domain name is typically to change the domain name on each user’s email address. For this lab, Adatum has purchased the new xxxUPNxxx.xxxCustomDomainxxx.xxx domain (provided by your lab hosting provider); therefore, it wants to change the domain name of all its users’ email addresses from @adatum.com to @xxxUPNxxx.xxxCustomDomainxxx.xxx.

1. On your Domain Controller VM (LON-DC1), make sure you’re logged in as **ADATUM\Administrator** and password **Pa55w.rd**.

2. If **Windows PowerShell** is still open, then select the **PowerShell** icon on your taskbar; otherwise, you must open **Windows PowerShell** by selecting the magnifying glass (**Search**) icon on the taskbar, typing **powershell** in the Search box that appears, right-clicking on **Windows PowerShell**, and selecting **Run as administrator** in the drop-down menu.

3. Using **Windows PowerShell**, you must replace the on-premises **adatum.com** domain with the **xxxUPNxxx.xxxCustomDomainxxx.xxx** domain (where you will replace xxxUPNxxx with the unique UPN name assigned to your tenant, and you will replace xxxCustomDomainxxx.xxx with your lab hosting provider's custom domain).
   In doing so, you will update the UPN suffix for the primary domain and the UPN on every user in AD DS with **@xxxUPNxxx.xxxCustomDomainxxx.xxx**. <br/>
   In the following PowerShell command, the **Set-ADForest** cmdlet modifies the properties of an Active Directory forest, and the **-identity** parameter specifies the Active Directory forest to modify. To perform this task, run the following command to set the **UPNSuffixes** property for the **adatum.com** forest (remember to change xxxUPNxxx to your unique UPN name and xxxCustomDomainxxx.xxx to your lab hosting provider's custom domain name):<br/>

   Set-ADForest -identity adatum.com -UPNSuffixes @{replace="xxxUPNxxx.xxxCustomDomainxxx.xxx"}

4. You must then run the following command that changes all existing adatum.com accounts to the new UPN @xxxUPNxxx.xxxCustomDomainxxx.xxx domain (remember to change xxxUPNxxx to your unique UPN name and xxxCustomDomainxxx.xxx to your lab hosting provider's custom domain name): <br/>

   Get-ADUser -Filter * -Properties SamAccountName | ForEach-Object { Set-ADUser $_ -UserPrincipalName ($_.SamAccountName + "@xxxUPNxxx.xxxCustomDomainxxx.xxx" )}

5. You will continue using PowerShell on your Domain Controller VM in the next task.

### Task 2: Prepare problem user accounts

Integrating your on-premises Active Directory with Azure AD makes your users more productive by providing a common identity for accessing both cloud and on-premises resources. However, errors can occur when identity data is synchronized from Windows Server Active Directory (AD DS) to Azure Active Directory (Azure AD). For example, two or more objects may have the same value for the **ProxyAddresses** attribute or the **UserPrincipalName** attribute in on-premises Active Directory. There are a multitude of different conditions that may result in synchronization errors.
Organizations can correct these errors by running Microsoft's IdFix tool, which performs discovery and remediation of identity objects and their attributes in an on-premises Active Directory environment in preparation for migration to Azure Active Directory.

In this task, you will run a script that breaks an on-premises user account. As part of your Adatum pilot project, you are purposely breaking this identity object so that you can run the IdFix tool in the next task to see how you can fix the broken account.

1. On your Domain Controller VM (LON-DC1), in the Windows PowerShell window, run the following command to change the root source to **C:\labfiles** so that you can access any files from that location: <br/>

   CD C:\labfiles\

2. PowerShell's execution policy settings dictate which PowerShell scripts can be run on a Windows system. Setting this policy to **Unrestricted** enables Holly to load all configuration files and run all scripts. At the command prompt, type the following command, and then press Enter: <br/>

   Set-ExecutionPolicy Unrestricted

3. You will then be prompted to confirm the execution policy change. Type **A** and press Enter to select the **[A] Yes to All** option.

4. Enter the following command that runs a PowerShell script that creates a problem user account. This script, which is stored in the C:\labfiles folder, will purposely create an issue with the UserPrincipalName for the user's on-premises account; this will enable you to troubleshoot this account in the next task using the IdFix tool. <br/>

   .\CreateProblemUsers.ps1

   **Note:** Wait until the script has completed before proceeding to the next task. This Windows PowerShell script will make the following change in AD DS:

   - **Klemen Sic**. Update the UserPrincipalName for Klemen to include an extra "@" character.

5. Close PowerShell.
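As a rough illustration of the kind of format check IdFix performs, the sketch below (hypothetical code, not the actual tool; the sample UPN strings are invented here) flags user principal names that contain anything other than exactly one "@" character — the condition the script above creates:

```python
def find_invalid_upns(upns):
    """Return the UPNs that fail a basic format check (exactly one '@')."""
    return [upn for upn in upns if upn.count("@") != 1]

upns = [
    "KlemenSic@@adatum.com",    # extra '@', like the broken lab account
    "HollyDickson@adatum.com",  # well-formed
]
print(find_invalid_upns(upns))  # ['KlemenSic@@adatum.com']
```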
### Task 3: Run the IdFix tool and fix identified issues

In this task you will download and use the IdFix tool to fix the on-premises user account that was broken in the previous task. Running the IdFix tool will correct any user account errors prior to synchronizing identity data between your on-premises environment and Azure AD.

1. You should still be logged into **LON-DC1** as the **Administrator** from the prior task.

2. In **Microsoft Edge**, open a new tab and enter the following URL in the address bar to access the Microsoft Download Center page for the IdFix Directory Synchronization Error Remediation Tool: <br/>

   **https://microsoft.github.io/idfix/installation/**

3. On the **Microsoft - IdFix** window, under the **Step 2: Install** section at the top of the page, the instructions direct you to run **setup.exe** to install the IdFix application on your machine. Select **setup.exe** to download the file to LON-DC1.

4. Once the **setup.exe** file is downloaded, it will appear in the notification bar at the bottom of the screen. Select **Open file**.

5. If a **Do you want to run this file?** dialog box appears, select **Run**.

6. In the **Do you want to install this application?** dialog box, select **Install**.

7. In the **IdFix Privacy Statement** message box, select **OK**.

8. In the **IdFix** window that appears, on the menu bar at the very top of the screen, select **Query** to query the directory. After a short wait, you should see several errors.

9. Select the **ERROR** column heading to sort the records by error in alphabetical order. <br/>
   **Note:** If any **topleveldomain** errors appear, then ignore them as they cannot be fixed by the IdFix tool.

10. In the **Klemen Sic** row, select the drop-down arrow in the **ACTION** field and select **EDIT**.

11. On the menu bar at the top of the window, select **Apply**.

12. In the **Apply Pending** dialog box that appears, select **Yes**.
    <br/>
    **Note:** Notice the value in the **Action** column changed from **EDIT** to **COMPLETE** for this user; this indicates that IdFix updated the user object and corrected the error.

13. On the menu bar, select **Query**. In the query results, note how the Klemen Sic row no longer appears in the results, since you just fixed this object. <br/>
    **Note:** If a dialog box appears indicating an unhandled exception has occurred, select **Continue**. <br/>
    As you can see, there are two users whose errors you have not fixed (**An Dung Dao** and **Ngoc Bich Tran**). We are purposely leaving these errors alone so that you can see what happens during the synchronization process using the Azure AD Connect tool in the next exercise when it processes users with these conditions. <br/>
    **Important:** When there are format and duplicate errors for distinguished names, the **UPDATE** column either contains the same string as the **VALUE** column, or the **UPDATE** column entry is blank. In either case, this means that IdFix cannot suggest a remediation for the error. You can either fix these errors outside IdFix, or manually remediate them within IdFix. You can also export the results and use Windows PowerShell to remediate many errors.

14. Close the IdFix window.

15. Leave your Edge browser open.

### Task 4: Prepare for Directory Synchronization

Azure Active Directory Connect synchronization service is a main component of Azure AD Connect. It's responsible for processing all operations related to synchronizing identity data between your on-premises environment and Azure AD. The sync service consists of an on-premises component (Azure AD Connect sync) and a cloud service component (Azure AD Connect sync service).

Before you can run Azure AD Connect, you must first configure several settings that control the synchronization process, which you will do in this task. Once you have completed the preparation process, you will then run the Azure AD Connect tool in the next exercise.
1. You should still be logged into **LON-DC1** as the **Administrator** from the prior task.

2. You want to begin by adding several trusted sites for Microsoft Edge. If you're familiar with doing this in Internet Explorer, the process is basically the same for Edge; however, the location of the **Security** settings is different. With IE, you added trusted sites through IE's Internet Options; for Edge, you will add trusted sites through the Windows Control Panel. <br>
   Select the magnifying glass icon on the taskbar and then enter **control** in the Search box.

3. In the list of search results, select **Control Panel**.

4. In the **Control Panel**, select **Network and Internet**.

5. On the **Network and Internet** window, select **Internet Options**.

6. This opens the **Internet Properties** window. Select the **Security** tab.

7. The **Internet** zone should be selected by default. Towards the bottom of the window, select the **Custom Level** button.

8. In the **Security Settings – Internet Zone** window, scroll down to the **Downloads** section. The first option in this section is **File download**. Verify the **File download** option is set to **Enable** and then select **OK**.

9. This takes you back to the **Internet Options** window. Select the **Trusted sites** zone.

10. In the **Trusted Sites** zone, you must add several sites. Select the **Sites** button.

11. In the **Trusted sites** window, in the **Add this website to the zone** field, enter the following URL and then select **Add**: **https://outlook.office365.com/**

12. Repeat step 11 to add the following site: **https://outlook.office.com/**

13. Repeat step 11 to add the following site: **https://portal.office.com/**

14. Select **Close** once you have added these three sites.

15. In the **Internet Options** window, select **OK** to close the window.

16. Close the **Network and Internet** window.

17. Proceed to the next exercise.
You are now ready to install the Azure AD Connect tool and enable synchronization.

# Proceed to Lab 4 - Exercise 2
87.570513
525
0.758876
eng_Latn
0.998066
e999ed283e9b70c1726e9292de62a6c605ea078d
3,442
md
Markdown
docs/tools/dta/storageboundinmb-element-dta.md
conky77/sql-docs.it-it
43be0950d07b85a6566f565206630912ec8c3197
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/tools/dta/storageboundinmb-element-dta.md
conky77/sql-docs.it-it
43be0950d07b85a6566f565206630912ec8c3197
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/tools/dta/storageboundinmb-element-dta.md
conky77/sql-docs.it-it
43be0950d07b85a6566f565206630912ec8c3197
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: StorageBoundInMB Element (DTA)
ms.prod: sql
ms.prod_service: sql-tools
ms.technology: tools-other
ms.topic: conceptual
dev_langs:
- XML
helpviewer_keywords:
- StorageBoundInMB element
ms.assetid: a8374910-bf68-4edb-b464-53a3a705e7f4
author: markingmyname
ms.author: maghan
ms.manager: jroth
ms.reviewer: ''
ms.custom: seo-lt-2019
ms.date: 03/01/2017
ms.openlocfilehash: f6d83065a572e2d125b43830653fde5a2298eb2b
ms.sourcegitcommit: b78f7ab9281f570b87f96991ebd9a095812cc546
ms.translationtype: HT
ms.contentlocale: it-IT
ms.lasthandoff: 01/31/2020
ms.locfileid: "75306628"
---
# <a name="storageboundinmb-element-dta"></a>StorageBoundInMB Element (DTA)
[!INCLUDE[appliesto-ss-xxxx-xxxx-xxx-md](../../includes/appliesto-ss-xxxx-xxxx-xxx-md.md)]
  Specifies the maximum space in megabytes that the Database Engine Tuning Advisor can reserve for tuning recommendations, that is, the index and partitioning set.

## <a name="syntax"></a>Syntax

```
<DTAInput>
...code removed...
    <TuningOptions>
      <StorageBoundInMB>...</ StorageBoundInMB >
```

## <a name="element-characteristics"></a>Element Characteristics

|Characteristic|Description|
|--------------------|-----------------|
|**Data type and length**|**unsignedInt**, unlimited length.|
|**Default value**|None.|
|**Occurrence**|Optional. Can be used once per **TuningOptions** element.|

## <a name="element-relationships"></a>Element Relationships

|Relationship|Elements|
|------------------|--------------|
|**Parent element**|[TuningOptions Element &#40;DTA&#41;](../../tools/dta/tuningoptions-element-dta.md)|
|**Child elements**|None|

## <a name="remarks"></a>Remarks

If you are tuning multiple databases, the recommendations for all databases are considered in the space calculation.
Per impostazione predefinita, nell'Ottimizzazione guidata motore di database vengono utilizzate le dimensioni minori tra gli spazi di archiviazione seguenti: - Tre volte le dimensioni dei dati non elaborati correnti nei quali sono incluse le dimensioni totali di heap e indici cluster nelle tabelle. - Lo spazio disponibile su tutte le unità disco collegate sommato alle dimensioni dei dati non elaborati. Le dimensioni dello spazio di archiviazione predefinite non includono gli indici non cluster e le viste indicizzate. Se il valore specificato per l'elemento **StorageBoundInMB** è superiore allo spazio su disco effettivo, verrà restituito un errore ma l'ottimizzazione continuerà. Al termine dell'ottimizzazione, sarà possibile aggiungere spazio su disco per implementare le indicazioni. ## <a name="example"></a>Esempio ## <a name="description"></a>Descrizione Nell'esempio di codice seguente viene illustrata la procedura per impostare un limite di 1500 megabyte per lo spazio massimo su disco utilizzabile da un'indicazione di ottimizzazione: ## <a name="code"></a>Codice ``` <DTAInput> <Server>...</Server> <Workload>...</Workload> <TuningOptions> <StorageBoundInMB>1500</StorageBoundInMB> ...code removed here... </DTAInput> ``` ## <a name="see-also"></a>Vedere anche [Guida di riferimento ai file di input XML&#40; (Ottimizzazione guidata motore di database)&#41;](../../tools/dta/xml-input-file-reference-database-engine-tuning-advisor.md)
39.113636
277
0.724869
ita_Latn
0.961567
e99a03740f1c99d50295f1c16bf738b02a22682a
151
md
Markdown
README.md
iKnowJavaScript/SendIT
20dad2e0f2d9c03f2a3f6578f3d4febbca350e71
[ "MIT" ]
null
null
null
README.md
iKnowJavaScript/SendIT
20dad2e0f2d9c03f2a3f6578f3d4febbca350e71
[ "MIT" ]
null
null
null
README.md
iKnowJavaScript/SendIT
20dad2e0f2d9c03f2a3f6578f3d4febbca350e71
[ "MIT" ]
null
null
null
# SendIT SendIT is a courier service that helps users deliver parcels to different destinations. It provides courier quotes based on weight categories.
50.333333
141
0.827815
eng_Latn
0.998072
e99aec5c881bac3fa16b97a2cbfba10abd2f4ad4
867
md
Markdown
README.md
lolCourtesy/Auto-Dumper-Capture-DDoS-Attacks
887c60b2c0208cd3e0c1eb4184746494310467df
[ "CC0-1.0" ]
1
2021-08-17T00:58:11.000Z
2021-08-17T00:58:11.000Z
README.md
Egida/Auto-Dumper-Capture-DDoS-Attacks
887c60b2c0208cd3e0c1eb4184746494310467df
[ "CC0-1.0" ]
null
null
null
README.md
Egida/Auto-Dumper-Capture-DDoS-Attacks
887c60b2c0208cd3e0c1eb4184746494310467df
[ "CC0-1.0" ]
1
2021-07-17T21:13:03.000Z
2021-07-17T21:13:03.000Z
# Auto-Dumper-Capture-DDoS-Attacks A batch-written auto dumper to automatically capture DDoS attacks, so you can either patch them yourself or send them to your provider. # Installation > 1. Update and upgrade your packages before doing anything: ```sudo apt update``` and ```sudo apt upgrade```. > 2. Type this command in your VPS: ```sudo apt install dos2unix```. > 3. Give **autodump.sh** the right permissions with this command: ```chmod +x autodump.sh```. > 4. Give **discord.sh** the right permissions with this command: ```chmod +x discord.sh```. > 5. Set chmod 755 for **autodump.sh**: ```chmod 755 autodump.sh``` > 6. Set chmod 755 for **discord.sh**: ```chmod 755 discord.sh``` > 7. Finally, just type in this command: ```sudo apt install jq``` # Issues > I will not assist you with any of the issues, so do not bother trying to contact me or ask me for help! Enjoy... :)
37.695652
114
0.695502
eng_Latn
0.966503
e99aece2df25725cd6f36bd9a6805b633f30d01b
812
md
Markdown
test/integration/README.md
wwaaron/tpm2-tools
237520a526b45d0f3f9c39ddb47a63a5e43731aa
[ "BSD-3-Clause" ]
13
2017-11-15T02:27:37.000Z
2018-01-18T22:53:01.000Z
test/integration/README.md
wwaaron/tpm2-tools
237520a526b45d0f3f9c39ddb47a63a5e43731aa
[ "BSD-3-Clause" ]
121
2017-11-08T19:45:15.000Z
2018-01-24T21:57:47.000Z
test/integration/README.md
wwaaron/tpm2-tools
237520a526b45d0f3f9c39ddb47a63a5e43731aa
[ "BSD-3-Clause" ]
9
2017-11-16T21:37:56.000Z
2018-01-12T00:20:13.000Z
# Testing Framework The command **make check** can be used to run the test scripts. The configure option `--enable-unit` must be specified and the `tpm2-abrmd` and `tpm_server` must be found on `$PATH`. If they are installed in custom locations, specify or export `$PATH` during configure. For example: ```sh ./configure --enable-unit PATH=$PATH:/path/to/tpm2-abrmd:/path/to/tpm/sim/ibmtpm974/src ``` ## Adding a new integration test To add a new test, do: 1. add a script to the `test/integration/tests` directory. 2. `source helpers.sh` in the new script. 3. issue the command `start_up`. 4. Do whatever test you need to do. 5. If you set the `EXIT` handler, call `tpm2 shutdown` in that handler. 6. `make distclean`, re-run `bootstrap` and configure to pick up the new script. 7. Run `make check` again.
33.833333
87
0.730296
eng_Latn
0.980991
e99b006207c74310c418f53cdad86ed02f9a4f56
1,915
md
Markdown
README.md
WelissonCaetanoDaSilva/ToDoList1
fe8324e27f4e40659efd3b025a8e139bef84f700
[ "MIT" ]
null
null
null
README.md
WelissonCaetanoDaSilva/ToDoList1
fe8324e27f4e40659efd3b025a8e139bef84f700
[ "MIT" ]
null
null
null
README.md
WelissonCaetanoDaSilva/ToDoList1
fe8324e27f4e40659efd3b025a8e139bef84f700
[ "MIT" ]
null
null
null
# ToDoList1 Teste da segunda etapa TECnisa Sistema TODOLIST para fins de Teste na Tecnisa Aplicação desenvolvida em javaScript ultilizando Ultilizanso Framework Node.Js junto ao Angular.JS com extensão do Ionic Esta aplicação por causa do framework é hibriga para plataforma Windows, IOS e Android no caso aqui do GitHub ela roda no navegado senquido as instruçoes abaixo. Como configurar** Ter instalado Node.JS que é uma plataforma de para criação de projetos Web Disponivel me https://nodejs.org/en/ No Node.js vem integrado com Angular.JS que é uma framework front-end do qual o Ionic de apoia e que permite que se crie paginas mais sofisticadas NPM que é uma gerenciador de pacotes JavaScript. Abrir a teala de comando do do Node e execultar o comando npm isntall Ionic que é a extesão para desenvolver uma aplicação hibrida. Abrir a teala de comando do do Node e execultar o comando npm install -g cordova ionic. Abrindo a aplicação: baixar o pacote de pasta da aplicação TodoList aqui no GitHub e colocar na pasta de usuario nos diretorios do Windows que é: C:\user\nome Abrir a tela de comando do Node.JS procurar a pasta do projeto TodoList com o comando cd aparti de C:\user\nome> na pasta do projeto TodoList execultar o comando: ionic serve --lab ou ionic serve esperar a execução sera aberto no navegador a pagina em localhost com o programa em operação Detalhamento da estrutura e organização a aplicação esta toda em uma pasta todo chamada todolist é nassa pasta que esta os arguivos do projeto que compreende no primero nivel as pasta da apliação e os aquivos nescassario na execução. destaques para pontos específicos do projeto Como o solicitado foi fazer uma aplicação TODOLIST eu complementei com o objetivo de ser criativo uma forma de organia a lista de acordo com filtros assim na aplicação tem o icone menu onde se pode organizar a lista por tarefas por projetos e tambem por datas
91.190476
545
0.810444
por_Latn
0.999741
e99b05f25ad770d85a77a79c90d7cf76d9ddb2a2
3,474
md
Markdown
source/rdo/release-checklist.html.md
jsoref/website-1
5fad9eb30955d6b2366f2c733cd165a712b8907f
[ "MIT" ]
63
2015-09-23T07:42:30.000Z
2021-03-10T07:58:39.000Z
source/rdo/release-checklist.html.md
jsoref/website-1
5fad9eb30955d6b2366f2c733cd165a712b8907f
[ "MIT" ]
298
2015-09-14T14:57:34.000Z
2021-12-20T01:49:40.000Z
source/rdo/release-checklist.html.md
jsoref/website-1
5fad9eb30955d6b2366f2c733cd165a712b8907f
[ "MIT" ]
347
2015-09-14T13:58:14.000Z
2022-01-05T23:03:48.000Z
--- title: Release-Checklist authors: rbowen --- # Release checklist An important part of doing a release is getting word out to the community, and telling the right stories about the release. In the case of RDO, it's important to give appropriate credit to the upstream, while pointing out what we've done on top of that. Here's a helpful checklist of things that we should do around a release. ## Communicate with the community * Notify CentOS that a release is imminent, so that we can queue any additional jobs that must run, and get availability of support people to help with that. * Notify various press outlets that this is coming, so that we can schedule stories if desired. * Notify user community (RDO, CentOS) that a release is coming. ## Talking Points What's new or interesting in this release that we want to talk about? We should work on this in the 2 weeks leading up to a release, rather than trying to put it together the day of the release. * What's in the release? * What upstream features are most important (point to upstream news source for this) * Who did awesome stuff in the release? This can be generated using data from Gerrit, and the `gerritstats` project: below instructions: * Set up on a Fedora container based on the instructions on GitHub, then: ./gerrit_downloader.sh --after-date 2017-02-22 --server review.rdoproject.org --output-dir gerrit_out/ * Enter the gerrit_out directory, and copy all the -distgit- json files to another directory `gerrit_distgit/` and run: ./gerrit_stats.sh --branches rpm-master -f gerrit_distgit/ ## Technical release criteria Before announcing a new release to the community formally it's needed to ensure that some technical conditions are met with CloudSIG builds: * The three packstack all-in-one upstream scenarios can be executed successfully. * The four puppet-openstack-integration scenarios can be executed successfully. * TripleO container images can be built. 
* TripleO standalone scenario001 can be deployed with the containers from CloudSIG builds. This criteria has been agreed for Ussuri GA and may be updated for next releases. ## Release fanfare A week before a release, we should write a release announcement in a public etherpad, inviting a number of stakeholders to see it and comment on it. The invite list should be very inclusive, but not so large as to result in bikeshedding. A release announcement should include: * Where to get it * What's in it * When the test day will be * When the *next* release is coming * Highlight the work of individuals when possible, especially people that are not @redhat.com * How one can get involved in the project * (Link to) [basic information about RDO](/rdo) The release announcement should be sent to the following lists: * rdo-list - rdo-list@redhat.com * centos-devel - centos-devel@centos.org * openstack-discuss - openstack-discuss@lists.openstack.org * Twitter - @rdocommunity * Facebook - http://facebook.com/rdocommunity * RDO blog - http://rdoproject.org/blog ## What not to do * Release on Friday * Take credit for work done in the upstream ## After the release * Update the [release cadence doc](/rdo/release-cadence/) * Update this doc with whatever was learned * Continue to mention it on the above channels during the lifetime of the release, and when people mention that they're using older versions. * Update [what's in RDO](/rdo/projectsinrdo) with any added packages
36.1875
140
0.768854
eng_Latn
0.999252
e99c3ea8bdcfa235bb1ac996f1c34577ef998265
1,870
md
Markdown
content/blog/medium-posts/2018-11-25_SVG-y-Sombras.md
juliandavidmr/me
1a9d3f674ccbd1c3aba07ee0ccd5bf121b9e3ef3
[ "MIT" ]
1
2020-03-23T18:48:05.000Z
2020-03-23T18:48:05.000Z
content/blog/medium-posts/2018-11-25_SVG-y-Sombras.md
juliandavidmr/me
1a9d3f674ccbd1c3aba07ee0ccd5bf121b9e3ef3
[ "MIT" ]
null
null
null
content/blog/medium-posts/2018-11-25_SVG-y-Sombras.md
juliandavidmr/me
1a9d3f674ccbd1c3aba07ee0ccd5bf121b9e3ef3
[ "MIT" ]
null
null
null
--- title: SVG y Sombras description: Añade sombras a elementos SVG usando filtros CSS/HTML. date: '2018-11-25T14:59:52.528Z' categories: [] keywords: [svg] slug: /@anlijudavid/svg-y-sombras-f8cdcde1601c --- Añade sombras a elementos SVG usando filtros CSS/HTML. ![](https://cdn-images-1.medium.com/max/800/0*PYm_pMHsQIUykWEx.png) Añade en tu archivo HTML principal (o en el componente padre donde se mostrará la sombra) el siguiente segmento: Crear filtro SVG Tener en cuenta que existe una etiqueta `filter` con el atributo `id`, que mas adelante se usará para aplicar el filtro CSS. A continuación, se usará CSS para aplicar el filtro SVG en la etiqueta `path` cuando se pase el cursor sobre el mismo, es decir, se aplica el filtro SVG desde CSS cuando es activado el evento `hover`: Aplicar filtro SVG desde CSS (Ejemplo con SCSS) y ¿donde está la etiqueta `path`? Puedes añadir el siguiente segmento: <svg height="400" width="450"> <path id="lineAB" d="M 100 350 l 150 -300" stroke="red" stroke-width="3" fill="none" /> </svg> > Nota: Este ejemplo es aplicable a otros elementos SVG. Mas información en: [**\- SVG: Scalable Vector Graphics | MDN** _The feGaussianBlur SVG filter primitive blurs the input image by the amount specified in stdDeviation, which defines…_developer.mozilla.org](https://developer.mozilla.org/en-US/docs/Web/SVG/Element/feGaussianBlur "https://developer.mozilla.org/en-US/docs/Web/SVG/Element/feGaussianBlur")[](https://developer.mozilla.org/en-US/docs/Web/SVG/Element/feGaussianBlur) [**SVG filter feGaussianBlur in percentage** _Is it possible to set the feGaussianBlur in %? I don't know why but this does not work. &lt;filter id="drop-shadow"&gt…_stackoverflow.com](https://stackoverflow.com/a/27113087/5125608 "https://stackoverflow.com/a/27113087/5125608")[](https://stackoverflow.com/a/27113087/5125608)
45.609756
363
0.757219
spa_Latn
0.555894
e99c98f0352566dc2e53762559e768e8ff18c1f9
57
md
Markdown
json-ajax-mvc/README.md
prayjourney/bucket-improvement
5b0e3df9b96ecbbade7b2a7736cbaba1074d3148
[ "Apache-2.0" ]
null
null
null
json-ajax-mvc/README.md
prayjourney/bucket-improvement
5b0e3df9b96ecbbade7b2a7736cbaba1074d3148
[ "Apache-2.0" ]
null
null
null
json-ajax-mvc/README.md
prayjourney/bucket-improvement
5b0e3df9b96ecbbade7b2a7736cbaba1074d3148
[ "Apache-2.0" ]
null
null
null
### json-ajax-mvc #### 本模块是springmvc与ajax的整合例子,使用的是jsp页面。
28.5
39
0.754386
yue_Hant
0.121761
e99d53d85ae289faec80af04cd28bd62965248cc
9,782
md
Markdown
docs/nodes/validator-node-installation.md
LucaArietti/commercionetwork
452f19f2609334c7c9665cd6f433b4e93e95dcd8
[ "MIT" ]
null
null
null
docs/nodes/validator-node-installation.md
LucaArietti/commercionetwork
452f19f2609334c7c9665cd6f433b4e93e95dcd8
[ "MIT" ]
null
null
null
docs/nodes/validator-node-installation.md
LucaArietti/commercionetwork
452f19f2609334c7c9665cd6f433b4e93e95dcd8
[ "MIT" ]
null
null
null
# Becoming a validator (**WIP**) Once you've properly set up a [full node](full-node-installation.md), if you wish you can become a validator node and start earning by validating the chain's transactions. Before you start, we recommend that you run the command ```bash cncli config chain-id $CHAINID ``` In this way you can omit the `--chain-id="$CHAINID"` flag in every **cncli** command. ## Requirements If you want to become a Commercio.network validator you need to: 1. Be running a full node. If you are not, please follow the [full node installation guide](full-node-installation.md). 2. Own enough tokens. To become a validator you need two wallets: one with at least one token to create the validator and another with 50,000 tokens to delegate to the validator node. :::tip If you have any problems with the procedure, try reading the **[Common errors](#_common-errors)** section. ::: ## 1. Add wallet key Inside the testnet you can use the Ledger, but you can also use the wallet software with the `cncli`. However, if you wish to use a Ledger, please add the `--ledger` flag to any command. :::warning Please remember to copy the 24-word seed phrase to a secure place. It is your mnemonic, and if you lose it you lose all your tokens and all access to your validator. ::: Create the first wallet with the following command ```bash cncli keys add $NODENAME # Enter a password that you can remember ``` Copy your public address. It should have the format `did:com:<data>`. The second wallet must be requested through a message on the [Telegram group](https://t.me/commercionetworkvipsTelegram). The information for the second wallet will be sent in a private message. From now on we will refer to the public address of your first wallet with the `<your pub addr creator val>` notation, and to the second wallet with the `<your pub addr delegator>` notation. ## 2.
Get the tokens - [Testnet](#testnet) - [Mainnet](#mainnet) ### Testnet In order to get `<your pub addr creator val>` you can use the following command: ```bash cncli keys show $NODENAME --address ``` From the command line ```bash curl "https://faucet-testnet.commercio.network/invite?addr=<your pub addr creator val>" ``` Or in a browser, copy and paste the following address ``` https://faucet-testnet.commercio.network/invite?addr=<your pub addr creator val> ``` The call should return something like ```json {"tx_hash":"4AB05DF5BEB7321059A6724BF18A7B95631AB55773BBD55DFC448351101BE972"} ``` Now you can request the tokens from the faucet service ```bash cncli keys show $NODENAME --address ``` From the command line ```bash curl "https://faucet-testnet.commercio.network/give?addr=<your pub addr creator val>&amount=1100000" ``` Or in a browser, copy and paste the following address ``` https://faucet-testnet.commercio.network/give?addr=<your pub addr creator val>&amount=1100000 ``` The call should return something like ```json {"tx_hash":"BB733FDB2665265D3B3A32576F23B10B10EA8F56EEBAD08C1BF39D5E2FAC601C"} ``` Once the transaction has been confirmed, please check using the following command: ```bash cncli query account <your pub addr creator val> --chain-id $CHAINID ``` The output should look like this: ``` - denom: ucommercio amount: "1100000" ``` When you have received the second wallet `<your pub addr delegator>` via Telegram message, check that the tokens are actually present ```bash cncli query account <your pub addr delegator> --chain-id $CHAINID ``` The output should look like this: ``` - denom: ucommercio amount: "5100000000" ``` ### Mainnet To get your tokens inside our mainnet, you are required to purchase them on an exchange or to have received a black card.
**The black card is the wallet `<your pub addr delegator>`** Create the first wallet `<your pub addr creator val>` with the command ```bash cncli keys add $NODENAME # Enter a password that you can remember ``` **Use the Ledger or another HSM to perform a recovery from the 24 words for the second wallet with the black card.** Send one token to the first wallet using the following command :::warning This transaction is expected to be done with an HSM such as a Ledger. If you are using a Ledger, add the `--ledger` flag. ::: ```bash cncli tx send \ <your pub addr delegator> \ <your pub addr creator val> \ 1000000ucommercio \ --fees=10000ucommercio \ --chain-id="$CHAINID" \ -y ``` Once the transaction has been confirmed, please check using the following command: ```bash cncli query account <your pub addr creator val> --chain-id $CHAINID ``` The output should look like this: ``` - denom: ucommercio amount: "1000000" ``` ## 3. Create a validator Once you have the tokens, you can create a validator. If you want, while doing so you can also specify the following parameters * `--details`: a brief description of your node or your company * `--identity`: your [Keybase](https://keybase.io) identity * `--website`: a public website of your node or your company The overall command to create a validator is the following: ```bash cncli tx staking create-validator \ --amount=1000000ucommercio \ --pubkey=$(cnd tendermint show-validator) \ --moniker="$NODENAME" \ --chain-id="$CHAINID" \ --identity="" --website="" --details="" \ --commission-rate="0.10" --commission-max-rate="0.20" \ --commission-max-change-rate="0.01" --min-self-delegation="1" \ --from=<your pub addr creator val> \ --fees=10000ucommercio \ -y ## The password will be requested twice ``` The output should look like this: ``` height: 0 txhash: C41B87615308550F867D42BB404B64343CB62D453A69F11302A68B02FAFB557C codespace: "" code: 0 data: "" rawlog: '[]' logs: [] info: "" gaswanted: 0 gasused: 0 tx: null timestamp: "" ``` ## 4.
Delegate tokens to the validator ### Confirm your validator is active Please confirm that your validator is active by running the following command: ```bash cncli query staking validators --chain-id $CHAINID | fgrep -B 1 $(cnd tendermint show-validator) ``` You should see something like this ``` operatoraddress: did:com:valoper1xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx conspubkey: did:com:valconspub1xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx ``` Copy the value of `operatoraddress`. Below we will refer to this value as `<validator-addr>` ### Delegate tokens Once received, the second wallet must be loaded onto the Ledger or keyring with the command ```bash cncli keys add <name of second wallet> --recover ``` where `<name of second wallet>` is an arbitrary name. When requested, the 24 key words must be entered. Now you can delegate the tokens to the validator node :::warning This transaction is expected to be done with an HSM such as a Ledger device. If you are using a Ledger, add the `--ledger` flag. ::: ```bash cncli tx staking delegate \ <validator-addr> \ 50000000000ucommercio \ --from <your pub addr delegator> \ --chain-id="$CHAINID" \ --fees=10000ucommercio \ -y ``` The output should look like this: ``` height: 0 txhash: 027B85834DA5486085BC56FFD2759443EFD3101BD1023FA9A681262E5C85A845 codespace: "" code: 0 data: "" rawlog: '[]' logs: [] info: "" gaswanted: 0 gasused: 0 tx: null timestamp: "" ``` **Testnet** You should now see your validator inside the [Commercio.network explorer testnet](https://testnet.commercio.network) **Mainnet** You should now see your validator inside the [Commercio.network explorer mainnet](https://mainnet.commercio.network) :::tip Congratulations, you are now a Commercio.network validator 🎉 ::: ## Note If you want to make transactions with the Nano Ledger from another machine, a full node must be created locally or a full node must be configured to accept remote connections.
Edit the `.cnd/config/config.toml` file by changing from ``` laddr = "tcp://127.0.0.1:26657" ``` to ``` laddr = "tcp://0.0.0.0:26657" ``` and restart your node ``` systemctl restart cnd ``` and use the transaction this way ```bash cncli tx staking delegate \ <validator-addr> \ 50000000000ucommercio \ --from <your pub addr delegator> \ --node tcp://<ip of your full node>:26657 \ --chain-id="$CHAINID" \ --fees=10000ucommercio \ --ledger \ -y ``` ## Common errors ### Account does not exist #### Problem If I try to search for my address with the command ```bash cncli query account did:com:1sl4xupdgsgptr2nr7wdtygjp5cw2dr8ncmdsyp --chain-id $CHAINID ``` it returns the message ``` ERROR: unknown address: account did:com:1sl4xupdgsgptr2nr7wdtygjp5cw2dr8ncmdsyp does not exist ``` #### Solution Check whether your node has completed the sync. On https://testnet.commercio.network you can view the current height of the chain Use the command ``` tail -1f /var/log/syslog | egrep " cnd+.*Committed state" ``` to check the height your node has reached ### Failed validator creation #### Problem I executed the validator [creation transaction](#_3-create-a-validator) but I don't appear on https://testnet.commercio.network/it/validators. #### Solution It may be that, after one or more failed transactions, the tokens are no longer sufficient to execute the transaction. Request more funds from the faucet with the command ```bash curl "https://faucet-testnet.commercio.network/give?addr=<your pub addr creator val>&amount=1100000" ``` and repeat the validator creation transaction ### DB errors #### Problem When trying to start the REST server or query the chain I get this error ``` panic: Error initializing DB: resource temporarily unavailable ``` #### Solution The `cnd` and/or `cncli` services may have been left active. Use the following commands ```bash systemctl stop cnd systemctl stop cncli pkill -9 cnd pkill -9 cncli ``` and repeat the procedure
25.810026
195
0.736352
eng_Latn
0.985515
e99e9d72a699277717b4aedd5d9a11311eea6cff
620
md
Markdown
docs/framework/wpf/wpf-samples.md
cy-org/docs-conceptual.zh-cn
0c18cda3dd707efdcdd0e73bc480ab9fbbc4580c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/wpf/wpf-samples.md
cy-org/docs-conceptual.zh-cn
0c18cda3dd707efdcdd0e73bc480ab9fbbc4580c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/framework/wpf/wpf-samples.md
cy-org/docs-conceptual.zh-cn
0c18cda3dd707efdcdd0e73bc480ab9fbbc4580c
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: "WPF 示例 | Microsoft Docs" ms.custom: "" ms.date: "03/30/2017" ms.prod: ".net-framework" ms.reviewer: "" ms.suite: "" ms.technology: - "dotnet-wpf" ms.tgt_pltfrm: "" ms.topic: "article" helpviewer_keywords: - "示例 [WPF]" - "WPF 示例" ms.assetid: 1fc53e12-dfe5-476e-be13-fc3714aaf640 caps.latest.revision: 24 author: "dotnet-bot" ms.author: "dotnetcontent" manager: "wpickett" caps.handback.revision: 24 --- # WPF 示例 有关演示 [!INCLUDE[TLA#tla_winclient](../../../includes/tlasharptla-winclient-md.md)] 的示例的列表,请参见 MSDN 代码库上的 [WPF Documentation Samples](http://go.microsoft.com/fwlink/?LinkID=159044)(WPF 文档示例)。
26.956522
189
0.706452
yue_Hant
0.314937
e99fb68a4331961dfa4bb963ca92b45fba1b5b9f
588
md
Markdown
docs/README.md
SudoDotDog/Sudoo-Lazy
bcdd5c7710aa8840157b5b63aa1bcca318bd3c2d
[ "MIT" ]
null
null
null
docs/README.md
SudoDotDog/Sudoo-Lazy
bcdd5c7710aa8840157b5b63aa1bcca318bd3c2d
[ "MIT" ]
null
null
null
docs/README.md
SudoDotDog/Sudoo-Lazy
bcdd5c7710aa8840157b5b63aa1bcca318bd3c2d
[ "MIT" ]
null
null
null
# Sudoo-Lazy [![Build Status](https://travis-ci.com/SudoDotDog/Sudoo-Lazy.svg?branch=master)](https://travis-ci.com/SudoDotDog/Sudoo-Lazy) [![codecov](https://codecov.io/gh/SudoDotDog/Sudoo-Lazy/branch/master/graph/badge.svg)](https://codecov.io/gh/SudoDotDog/Sudoo-Lazy) [![npm version](https://badge.fury.io/js/%40sudoo%2Flazy.svg)](https://www.npmjs.com/package/@sudoo/lazy) [![downloads](https://img.shields.io/npm/dm/@sudoo/lazy.svg)](https://www.npmjs.com/package/@sudoo/lazy) Lazy Evaluation for JS ## Install ```sh yarn add @sudoo/lazy # Or npm install @sudoo/lazy --save ```
34.588235
132
0.72619
yue_Hant
0.874796
e9a009b8593806b7d829a4b02b8a3d43d29ecb84
1,987
md
Markdown
_posts/blog/2021-08-29-the-rundown-2003-tamil-dubbed-hd.md
tadubs/tadubs.github.io
290f1ade70110d28724b26260e8e272a7a64b90b
[ "MIT" ]
null
null
null
_posts/blog/2021-08-29-the-rundown-2003-tamil-dubbed-hd.md
tadubs/tadubs.github.io
290f1ade70110d28724b26260e8e272a7a64b90b
[ "MIT" ]
3
2022-03-15T05:48:33.000Z
2022-03-15T16:54:49.000Z
_posts/blog/2021-08-29-the-rundown-2003-tamil-dubbed-hd.md
mymoviesda/mymoviesda.github.io
e47ef5acaea6cb98841b9c6384ad0caeafeb8ca3
[ "MIT" ]
6
2020-11-22T07:57:38.000Z
2022-03-15T16:57:40.000Z
--- id: 127 title: The Rundown (2003) Tamil Dubbed HD date: 2021-08-29T06:11:42+00:00 author: tentrockers layout: post guid: https://tentrockers.online/?p=127 post id: /the-rundown-2003-tamil-dubbed-hd/ cyberseo_rss_source: - 'https://dubhoodtamil.blogspot.com/feeds/posts/default?max-results=150&start-index=151' cyberseo_post_link: - https://dubhoodtamil.blogspot.com/2020/08/the-rundown-2003-tamil-dubbed-hd.html categories: - Uncategorized --- <div class="media_block"> <img src="https://1.bp.blogspot.com/-TsmfgWyWS9E/XzYtQGqYPBI/AAAAAAAAB_0/Mo8GD_ZARFE8Xj320LmQ9iZihAgMduZ5ACNcBGAsYHQ/s72-w356-h513-c/139137.jpg" class="media_thumbnail" /> </div> ## &nbsp;The Rundown (2003) | Dubhoodtamil <div class="separator"> <a href="https://1.bp.blogspot.com/-TsmfgWyWS9E/XzYtQGqYPBI/AAAAAAAAB_0/Mo8GD_ZARFE8Xj320LmQ9iZihAgMduZ5ACNcBGAsYHQ/s1426/139137.jpg" imageanchor="1"><img loading="lazy" border="0" data-original-height="1426" data-original-width="1000" height="513" src="https://1.bp.blogspot.com/-TsmfgWyWS9E/XzYtQGqYPBI/AAAAAAAAB_0/Mo8GD_ZARFE8Xj320LmQ9iZihAgMduZ5ACNcBGAsYHQ/w356-h513/139137.jpg" width="356" /></a> </div> Movie <span></span>: <span></span>The Rundown (2003) Director <span></span>: <span></span>Peter Berg Starring <span></span>: <span></span>Dwayne Johnson, Seann William Scott Genres <span></span>: <span></span>Action, Adventure, Comedy Quality <span></span>: <span></span>BDRip Language: <span></span>Tamil Rating <span></span>: <span></span>6.7/10 Synopsi s: A tough aspiring chef is hired to bring home a mobster&#8217;s son from the Amazon but becomes involved in the fight against an oppressive town operator and the search for a legendary treasure. ## **<span>Click Below to Download Movie</span>** **<span><a href="https://oncehelp.com/rundown-1" target="_blank" rel="noopener">Download 270MB</a></span>** **<span><a href="https://oncehelp.com/rundown-2" target="_blank" rel="noopener">Download 420MB</a></span>** By Dubhoodtamil
41.395833
403
0.746351
yue_Hant
0.325615
e9a011985832a61f5f91f2b298a004aeda1b7f71
77
md
Markdown
docs/touchablenativefeedback.md
MarcCoquand/rescript-react-native.github.io
347a454567c4fdea7774b0e02e94f977471af8cf
[ "MIT" ]
null
null
null
docs/touchablenativefeedback.md
MarcCoquand/rescript-react-native.github.io
347a454567c4fdea7774b0e02e94f977471af8cf
[ "MIT" ]
5
2021-06-07T04:49:24.000Z
2022-03-07T17:07:06.000Z
docs/touchablenativefeedback.md
MarcCoquand/rescript-react-native.github.io
347a454567c4fdea7774b0e02e94f977471af8cf
[ "MIT" ]
1
2021-12-22T15:47:14.000Z
2021-12-22T15:47:14.000Z
--- id: touchablenativefeedback title: TouchableNativeFeedback wip: true ---
12.833333
30
0.779221
eng_Latn
0.553206
e9a0845354b85a30f44ddca48b450554305eae1e
1,168
md
Markdown
README.md
arielskap/ancient-currency
1bc982e71e072047e3b5d577f247f57ca839d3b0
[ "MIT" ]
null
null
null
README.md
arielskap/ancient-currency
1bc982e71e072047e3b5d577f247f57ca839d3b0
[ "MIT" ]
null
null
null
README.md
arielskap/ancient-currency
1bc982e71e072047e3b5d577f247f57ca839d3b0
[ "MIT" ]
null
null
null
# Ancient Divisa Consulta de divisas ### Mejoras a realizar * Input para agregar más divisas * Diseño general * Actualizar divisas cada 15 segundos ### Pre-requisitos 📋 * Tener instalado un editor de código como [Visual Studio Code](https://code.visualstudio.com/) para poder hacer cualquier modificacion al proyecto de forma ágil. * [Node.js](https://nodejs.org/es/) para poder ejecutar el codigo javascript en nuestra computadora/servidor. * Recomendación: Instalar [Yarn](https://yarnpkg.com/), a través de **Node.js command prompt**, de forma global para instalar dependencias a menos que decida usar npm: ``` npm i -g yarn ``` ### Instalación 🔧 _Es importante estar siempre posicionados a través de **Node.js command prompt** en nuestro directorio en donde se encuentra el proyecto._ 1. Ejecutar el siguiente comando para instalar dependencias: ``` yarn ``` ### Ejecutando el programa 🚗 * Para levantar nuestro proyecto en un entorno de desarrollo ejecutar: ``` yarn dev ``` ## Despliegue 📦 ### Entorno Local 🏠 * Para "construir" nuestro proyecto ejecutamos: ``` yarn build ``` * Y para levantarlo en un entorno de produccion: ``` yarn start ```
22.461538
167
0.736301
spa_Latn
0.97154
e9a0f696f5ab011f5e43a84ec3dc0a23bdc250d4
384
md
Markdown
README.md
cexll/alphaid
a5219a7f5fc3b4ea1c5b18fbc310dcea3ababeb6
[ "MIT" ]
null
null
null
README.md
cexll/alphaid
a5219a7f5fc3b4ea1c5b18fbc310dcea3ababeb6
[ "MIT" ]
null
null
null
README.md
cexll/alphaid
a5219a7f5fc3b4ea1c5b18fbc310dcea3ababeb6
[ "MIT" ]
null
null
null
# AlphaID ## Install ```bash composer require youcloud/alphaid ``` ## Usage ```php use YogCloud\AlphaID; $id = new AlphaID(); $code = $id->encode(PHP_INT_MAX); var_dump($code, $id->decode($code)); $code = $id->encode(PHP_INT_MAX, PHP_VERSION_ID); var_dump($code, $id->decode($code)); $code = $id->encode([PHP_INT_MAX, PHP_VERSION_ID]); var_dump($code, $id->decode($code)); ```
16
51
0.664063
yue_Hant
0.249032
e9a12e71f3eedbf816c000c23e18a9e8f37f918c
6,923
md
Markdown
_includes/stats/fr/dep/nmod.md
fginter/docs-fginterfork
1012563e049f1ad57548bb71908c632b23ee64f9
[ "Apache-2.0" ]
null
null
null
_includes/stats/fr/dep/nmod.md
fginter/docs-fginterfork
1012563e049f1ad57548bb71908c632b23ee64f9
[ "Apache-2.0" ]
null
null
null
_includes/stats/fr/dep/nmod.md
fginter/docs-fginterfork
1012563e049f1ad57548bb71908c632b23ee64f9
[ "Apache-2.0" ]
null
null
null
-------------------------------------------------------------------------------- ## Treebank Statistics (UD_French) This relation is universal. There are 1 language-specific subtypes of `nmod`: [nmod:poss](). 60196 nodes (15%) are attached to their parents as `nmod`. 55544 instances of `nmod` (92%) are left-to-right (parent precedes child). Average distance between parent and child is 4.06960595388398. The following 91 pairs of parts of speech are connected with `nmod`: [fr-pos/NOUN]()-[fr-pos/NOUN]() (20402; 34% instances), [fr-pos/VERB]()-[fr-pos/NOUN]() (17220; 29% instances), [fr-pos/NOUN]()-[fr-pos/PROPN]() (7204; 12% instances), [fr-pos/VERB]()-[fr-pos/PROPN]() (3676; 6% instances), [fr-pos/NOUN]()-[fr-pos/NUM]() (2181; 4% instances), [fr-pos/VERB]()-[fr-pos/NUM]() (1861; 3% instances), [fr-pos/PROPN]()-[fr-pos/PROPN]() (1755; 3% instances), [fr-pos/ADJ]()-[fr-pos/NOUN]() (1131; 2% instances), [fr-pos/VERB]()-[fr-pos/PRON]() (918; 2% instances), [fr-pos/PROPN]()-[fr-pos/NOUN]() (766; 1% instances), [fr-pos/PRON]()-[fr-pos/NOUN]() (499; 1% instances), [fr-pos/NOUN]()-[fr-pos/PRON]() (464; 1% instances), [fr-pos/PROPN]()-[fr-pos/NUM]() (291; 0% instances), [fr-pos/NUM]()-[fr-pos/NOUN]() (265; 0% instances), [fr-pos/ADJ]()-[fr-pos/PROPN]() (201; 0% instances), [fr-pos/SYM]()-[fr-pos/NOUN]() (139; 0% instances), [fr-pos/ADJ]()-[fr-pos/PRON]() (135; 0% instances), [fr-pos/PRON]()-[fr-pos/PROPN]() (107; 0% instances), [fr-pos/NUM]()-[fr-pos/NUM]() (96; 0% instances), [fr-pos/NOUN]()-[fr-pos/SYM]() (82; 0% instances), [fr-pos/VERB]()-[fr-pos/SYM]() (78; 0% instances), [fr-pos/ADV]()-[fr-pos/NOUN]() (63; 0% instances), [fr-pos/PROPN]()-[fr-pos/PRON]() (63; 0% instances), [fr-pos/ADJ]()-[fr-pos/NUM]() (53; 0% instances), [fr-pos/NOUN]()-[fr-pos/X]() (49; 0% instances), [fr-pos/NUM]()-[fr-pos/PROPN]() (49; 0% instances), [fr-pos/PRON]()-[fr-pos/PRON]() (49; 0% instances), [fr-pos/VERB]()-[fr-pos/X]() (28; 0% instances), [fr-pos/NOUN]()-[fr-pos/ADJ]() (27; 0% 
instances), [fr-pos/NUM]()-[fr-pos/PRON]() (23; 0% instances), [fr-pos/PRON]()-[fr-pos/NUM]() (21; 0% instances), [fr-pos/VERB]()-[fr-pos/ADJ]() (20; 0% instances), [fr-pos/INTJ]()-[fr-pos/NOUN]() (19; 0% instances), [fr-pos/ADJ]()-[fr-pos/SYM]() (17; 0% instances), [fr-pos/X]()-[fr-pos/NOUN]() (16; 0% instances), [fr-pos/ADP]()-[fr-pos/NOUN]() (15; 0% instances), [fr-pos/SYM]()-[fr-pos/NUM]() (15; 0% instances), [fr-pos/SYM]()-[fr-pos/SYM]() (15; 0% instances), [fr-pos/VERB]()-[fr-pos/ADV]() (14; 0% instances), [fr-pos/NOUN]()-[fr-pos/VERB]() (12; 0% instances), [fr-pos/AUX]()-[fr-pos/NOUN]() (11; 0% instances), [fr-pos/SYM]()-[fr-pos/PROPN]() (10; 0% instances), [fr-pos/VERB]()-[fr-pos/VERB]() (10; 0% instances), [fr-pos/X]()-[fr-pos/PROPN]() (9; 0% instances), [fr-pos/DET]()-[fr-pos/NOUN]() (8; 0% instances), [fr-pos/SYM]()-[fr-pos/PRON]() (7; 0% instances), [fr-pos/ADJ]()-[fr-pos/ADJ]() (6; 0% instances), [fr-pos/ADV]()-[fr-pos/PROPN]() (6; 0% instances), [fr-pos/ADJ]()-[fr-pos/ADV]() (4; 0% instances), [fr-pos/ADP]()-[fr-pos/NUM]() (4; 0% instances), [fr-pos/ADV]()-[fr-pos/ADV]() (4; 0% instances), [fr-pos/INTJ]()-[fr-pos/PRON]() (4; 0% instances), [fr-pos/NOUN]()-[fr-pos/ADV]() (4; 0% instances), [fr-pos/PROPN]()-[fr-pos/SYM]() (4; 0% instances), [fr-pos/PROPN]()-[fr-pos/X]() (4; 0% instances), [fr-pos/VERB]()-[fr-pos/ADP]() (4; 0% instances), [fr-pos/X]()-[fr-pos/NUM]() (4; 0% instances), [fr-pos/ADJ]()-[fr-pos/X]() (3; 0% instances), [fr-pos/ADV]()-[fr-pos/NUM]() (3; 0% instances), [fr-pos/ADV]()-[fr-pos/PRON]() (3; 0% instances), [fr-pos/NOUN]()-[fr-pos/ADP]() (3; 0% instances), [fr-pos/NUM]()-[fr-pos/SYM]() (3; 0% instances), [fr-pos/ADJ]()-[fr-pos/ADP]() (2; 0% instances), [fr-pos/CONJ]()-[fr-pos/NOUN]() (2; 0% instances), [fr-pos/DET]()-[fr-pos/PROPN]() (2; 0% instances), [fr-pos/INTJ]()-[fr-pos/NUM]() (2; 0% instances), [fr-pos/INTJ]()-[fr-pos/PROPN]() (2; 0% instances), [fr-pos/PRON]()-[fr-pos/ADV]() (2; 0% instances), [fr-pos/PRON]()-[fr-pos/X]() (2; 
0% instances), [fr-pos/PUNCT]()-[fr-pos/PROPN]() (2; 0% instances), [fr-pos/SCONJ]()-[fr-pos/NOUN]() (2; 0% instances), [fr-pos/VERB]()-[fr-pos/DET]() (2; 0% instances), [fr-pos/ADP]()-[fr-pos/PRON]() (1; 0% instances), [fr-pos/ADP]()-[fr-pos/PROPN]() (1; 0% instances), [fr-pos/ADP]()-[fr-pos/SCONJ]() (1; 0% instances), [fr-pos/ADV]()-[fr-pos/SYM]() (1; 0% instances), [fr-pos/DET]()-[fr-pos/ADV]() (1; 0% instances), [fr-pos/DET]()-[fr-pos/NUM]() (1; 0% instances), [fr-pos/NOUN]()-[fr-pos/CONJ]() (1; 0% instances), [fr-pos/NOUN]()-[fr-pos/PUNCT]() (1; 0% instances), [fr-pos/NUM]()-[fr-pos/DET]() (1; 0% instances), [fr-pos/PRON]()-[fr-pos/ADJ]() (1; 0% instances), [fr-pos/PRON]()-[fr-pos/SYM]() (1; 0% instances), [fr-pos/PROPN]()-[fr-pos/ADJ]() (1; 0% instances), [fr-pos/PUNCT]()-[fr-pos/ADJ]() (1; 0% instances), [fr-pos/PUNCT]()-[fr-pos/NOUN]() (1; 0% instances), [fr-pos/PUNCT]()-[fr-pos/NUM]() (1; 0% instances), [fr-pos/VERB]()-[fr-pos/PART]() (1; 0% instances), [fr-pos/X]()-[fr-pos/ADJ]() (1; 0% instances), [fr-pos/X]()-[fr-pos/PRON]() (1; 0% instances), [fr-pos/X]()-[fr-pos/X]() (1; 0% instances). ~~~ conllu # visual-style 7 bgColor:blue # visual-style 7 fgColor:white # visual-style 4 bgColor:blue # visual-style 4 fgColor:white # visual-style 4 7 nmod color:blue 1 Aviator Aviator PROPN _ _ 0 root _ _ 2 , , PUNCT _ _ 1 punct _ _ 3 un un DET _ Definite=Ind|Gender=Masc|Number=Sing|PronType=Dem 4 det _ _ 4 film film NOUN _ Gender=Masc|Number=Sing 1 appos _ _ 5 sur sur ADP _ _ 7 case _ _ 6 la le DET _ Definite=Def|Gender=Fem|Number=Sing 7 det _ _ 7 vie vie NOUN _ Gender=Fem|Number=Sing 4 nmod _ _ 8 de de ADP _ _ 9 case _ _ 9 Hughes Hughes PROPN _ _ 7 nmod _ _ 10 . . 
PUNCT _ _ 1 punct _ _ ~~~ ~~~ conllu # visual-style 6 bgColor:blue # visual-style 6 fgColor:white # visual-style 3 bgColor:blue # visual-style 3 fgColor:white # visual-style 3 6 nmod color:blue 1 Mais mais CONJ _ _ 3 cc _ _ 2 comment comment ADV _ _ 3 advmod _ _ 3 faire faire VERB _ VerbForm=Inf 0 root _ _ 4 dans dans ADP _ _ 6 case _ _ 5 un un DET _ Definite=Ind|Gender=Masc|Number=Sing|PronType=Dem 6 det _ _ 6 contexte contexte NOUN _ Gender=Masc|Number=Sing 3 nmod _ _ 7 structurellement structurellement ADV _ _ 8 advmod _ _ 8 raciste raciste ADJ _ Gender=Masc|Number=Sing 6 amod _ _ 9 ? ? PUNCT _ _ 3 punct _ _ ~~~ ~~~ conllu # visual-style 9 bgColor:blue # visual-style 9 fgColor:white # visual-style 7 bgColor:blue # visual-style 7 fgColor:white # visual-style 7 9 nmod color:blue 1 Aviator Aviator PROPN _ _ 0 root _ _ 2 , , PUNCT _ _ 1 punct _ _ 3 un un DET _ Definite=Ind|Gender=Masc|Number=Sing|PronType=Dem 4 det _ _ 4 film film NOUN _ Gender=Masc|Number=Sing 1 appos _ _ 5 sur sur ADP _ _ 7 case _ _ 6 la le DET _ Definite=Def|Gender=Fem|Number=Sing 7 det _ _ 7 vie vie NOUN _ Gender=Fem|Number=Sing 4 nmod _ _ 8 de de ADP _ _ 9 case _ _ 9 Hughes Hughes PROPN _ _ 7 nmod _ _ 10 . . PUNCT _ _ 1 punct _ _ ~~~
89.909091
4,699
0.61274
yue_Hant
0.280272
e9a27ec578ba6fc53a63c32f72a3417fcc96421a
731
md
Markdown
README.md
leeyt/ZKRowlayout
b2cba42d29657e4651b525fd6802f55cdc7f5913
[ "Apache-2.0" ]
1
2015-10-21T14:40:59.000Z
2015-10-21T14:40:59.000Z
README.md
leeyt/ZKRowlayout
b2cba42d29657e4651b525fd6802f55cdc7f5913
[ "Apache-2.0" ]
null
null
null
README.md
leeyt/ZKRowlayout
b2cba42d29657e4651b525fd6802f55cdc7f5913
[ "Apache-2.0" ]
null
null
null
# ZKRowlayout Rowlayout/rowchildren are ZK addon components inspired by Twitter Bootstrap's grid system. They can be used to easily divide screen estate into a set of grid regions to place other ZK components inside. ## How to set up? * Download [rowlayout-0.8.jar][1] from the target folder * Put the jar file in `WEB-INF/lib` * Start using them... ## How to Use? * Please refer to [this blog post][2] for more information ## System requirement * ZK 6.5.0 or later [1]: https://github.com/leeyt/ZKRowlayout/blob/master/target/rowlayout-0.8.jar "Rowlayout/rowchildren components" [2]: http://blog.zkoss.org/index.php/2013/03/05/rowlayout-component/ "Rowlayout: building a fluid grid system with column offsetting feature"
33.227273
202
0.753762
eng_Latn
0.975394
e9a28eaa2ba25eea580bd9b830aa1070a4322df1
2,819
md
Markdown
_posts/2020-05-11-zhushi.md
hailuorou/itwanger.github.io
2f4681d0b773244000c4862902de9db41750dae0
[ "Apache-2.0" ]
34
2019-12-31T09:24:23.000Z
2020-07-19T06:19:54.000Z
_posts/2020-05-11-zhushi.md
hailuorou/itwanger.github.io
2f4681d0b773244000c4862902de9db41750dae0
[ "Apache-2.0" ]
null
null
null
_posts/2020-05-11-zhushi.md
hailuorou/itwanger.github.io
2f4681d0b773244000c4862902de9db41750dae0
[ "Apache-2.0" ]
15
2019-11-09T14:40:25.000Z
2020-07-17T07:48:39.000Z
---
layout: post
category: life
title: Do Excellent Programmers Really Not Write Comments?
tagline: by 沉默王二
tags:
- Programmers
---

I have seen this view in many places: "Please stop writing comments, because only bad code needs comments." It is a clever take, and it reminds me of a line from Mencius: "Yang's principle of 'each for himself' amounts to denying one's sovereign; Mo's principle of 'loving all equally' amounts to denying one's father. To deny both father and sovereign is to be a beast."

<!--more-->

![](http://www.itwanger.com/assets/images/2020/05/zhushi-01.png)

Calling people "beasts" at the drop of a hat always strikes me as improper — hardly in keeping with Mencius's own noble spirit. Some expert programmers have Mencius-level conviction here: asking them to add comments to their code feels like an insult. "My code is written so elegantly — can't you read it? Comments are redundant!"

I must admit that every programmer should have a heart that pursues "elegance," striving to make their code easier to read and understand — not only for machines, but for our fellow programmers. But not every programmer can write "high-standard" code from the start, just as not every ruler or commoner could grasp Mencius's ideas on governing a state and ordering a family.

During my first days back in Luoyang, life was painful. I had just taken over someone else's project, a commodity futures trading system. The back end was written in Java, but it had plenty of bugs — settlement failures at the slightest provocation, even out-of-memory errors. There was only one way to solve those problems: thoroughly understand the code.

Otherwise, there was no way to even begin. It is like going on a long road trip with a friend: when your friend gets tired of driving and needs a rest, if you never got a driver's license, you are stuck — all you can do is pull over and wait until your friend's fatigue wears off before continuing.

I was stuck. In all fairness, the code my former colleague left behind was brilliant; if I had to write it myself, I am not sure I could. After all, commodity futures trading is genuinely difficult — it needs auction-style order matching, a business more complex to understand than stocks.

With stocks, you profit when prices rise and lose when they fall. Futures are different: you can profit going long, and you can profit going short. Still, the business complexity was secondary. The real problem was that comments in the code were as rare as the hairs on James Gosling's head.

![](http://www.itwanger.com/assets/images/2020/05/zhushi-02.png)

Besides, you know what the English skills of programmers here are like: the names of variables, methods, classes, interfaces and enums could not be truly self-describing. On top of that, some methods ran to three or four hundred lines; reading them from start to finish made me want to punch myself.

With no alternative, my approach was: for every line I understood, I added a comment — after all, comments are easier to understand than code. It is like calling an unfamiliar API: as long as the documentation says what it does, we can use it — and dare to use it — even if we have not yet understood the implementation details.

It took about two months (very long, very agonizing) before I finally figured out the project's core code. Once I did, those bugs that had refused to die no matter how I tried were easily resolved.

This is also part of why I encourage people to read source code: besides learning, reading the source is the ultimate weapon for fixing bugs. And to read source code, comments are indispensable. If you do not believe me, look at the Java source: every variable, every method, every class is commented in great detail — so detailed you feel tired on the authors' behalf.

In my view, the authors of the Java source code are unquestionably among the best programmers in the world. If even they write comments, shouldn't the people calling on us to "stop writing comments" be slapping themselves in the face until it swells?

![](http://www.itwanger.com/assets/images/2020/05/zhushi-03.gif)

Don't doubt yourself: writing comments does not prove your code is bad code. I am sure you have bought electronics — a mouse, a keyboard, earphones, a phone and so on. Every product box contains, besides the product itself, a user manual. Let me ask just one thing: "Without a manual, can't we young folks still figure out a phone?"

Writing comments is not our fault — software is inherently complex. For those of us whose primary language is not English, comments matter even more. Perhaps my memory is on the poor side: after ten days or half a month, looking back at code I typed myself, I sometimes feel I am meeting a stranger: "Did I write this code? Why does it look unfamiliar?"

Most people's code needs upgrading and refactoring eventually, right? Whether because the language itself gets a new version or because our own skills grow. When refactoring, without the help of comments, it really feels like climbing Mount Tai — exhausting, my friend.

Moreover, even experts cannot guarantee their code is problem-free, right? But comments do not lie; their meaning is explicit. You may forget what a piece of code does, but I guarantee the comments can jog your memory.

![](http://www.itwanger.com/assets/images/2020/05/zhushi-04.png)

**Writing good, meaningful comments is actually hard — just like writing code.** On the road to excellence, code and comments complement each other: comments make your code easier to read, and code makes your comments more logical.

Even if your code is elegant enough to need no comments, that is only at your level. For your colleagues — the future maintainers of your code — it may not be so. Heroes who think alike are few; code you consider very elegant may well be a pile of garbage in someone else's eyes, while your comments may very likely help someone else have that "aha" moment and grasp the code's meaning. Be good and write comments; it benefits both you and others.

One more thing: a comment is like a blueprint for the code, or a summary of it. Before you write code, you must be clear in your head about what you want to implement and how; writing that down as comments can absolutely help you write more elegant code. After the code is written, summarizing it through comments can refine the code further — you might even think of a better solution during the summary.

I have also seen experts solemnly declare that writing comments is like throwing a life ring to someone who cannot swim: they will never learn to swim. At first glance that sounds very reasonable, right? In those experts' view, the best way to make a newcomer grow quickly is to throw uncommented code at them.

Utter nonsense — that newcomer would probably quit before even getting started. I am thirty-one — no, I am eighteen — and I still cannot swim. Don't listen to those experts' nonsense; I refuse to believe they have never written a comment themselves.

![](http://www.itwanger.com/assets/images/2020/05/zhushi-05.gif)

In short: **comments do not stop you from writing elegant, concise code; they are simply an inherent part of a program.**

If you found this article a little helpful, search WeChat for "沉默王二" to read my posts first; reply "resume" for a quality resume template downloaded by 3000+ people — your resume will never again sink without a trace. This article is included on GitHub, [**portal~**](https://github.com/qinggee/itwanger.github.io), which also has complete interview coverage for top companies. Stars welcome.

I am 沉默王二, a programmer with good looks who nevertheless gets by on talent. **Follow me to boost your learning efficiency, and don't forget the triple: like, bookmark, comment — I'm not picky, hehe.**
37.092105
179
0.824761
yue_Hant
0.352378
e9a3a36d51398cb2d9f3f139d64f2ceebe0c71bf
23
md
Markdown
README.md
dzotokan/tlight
494e12e9caa580af3ca18fe1b723b7aa684d5a9c
[ "MIT" ]
null
null
null
README.md
dzotokan/tlight
494e12e9caa580af3ca18fe1b723b7aa684d5a9c
[ "MIT" ]
null
null
null
README.md
dzotokan/tlight
494e12e9caa580af3ca18fe1b723b7aa684d5a9c
[ "MIT" ]
null
null
null
# tlight Traffic Light
7.666667
13
0.782609
eng_Latn
0.983019
e9a4153327ad7ccfd1bf39847a273a997fa0883b
65
md
Markdown
TICK/Grafana/template.md
markuskont/CDMCS
fdd96137cc964164808e904e7a11173f01250c64
[ "MIT" ]
70
2016-10-03T14:25:23.000Z
2022-03-18T08:18:52.000Z
TICK/Grafana/template.md
markuskont/CDMCS
fdd96137cc964164808e904e7a11173f01250c64
[ "MIT" ]
4
2017-04-05T17:00:56.000Z
2020-02-28T13:19:22.000Z
TICK/Grafana/template.md
ccdcoe/CDMCS
fdd96137cc964164808e904e7a11173f01250c64
[ "MIT" ]
29
2017-02-15T10:03:31.000Z
2021-11-03T17:07:41.000Z
# templates

The following InfluxQL query can back a Grafana template variable, listing every value of the `host` tag across all measurements:

```
SHOW TAG VALUES FROM /.+/ WITH KEY = "host"
```
10.833333
43
0.569231
yue_Hant
0.742826
e9a44022969f559480d385fb42c6d868a270e0cc
317
md
Markdown
docs/CreateTransferTicketResponse.md
fundamerica/fireblocks-openapi-client
e0ec9fbb55112488d3f9503615466fd4aeee8349
[ "MIT" ]
null
null
null
docs/CreateTransferTicketResponse.md
fundamerica/fireblocks-openapi-client
e0ec9fbb55112488d3f9503615466fd4aeee8349
[ "MIT" ]
null
null
null
docs/CreateTransferTicketResponse.md
fundamerica/fireblocks-openapi-client
e0ec9fbb55112488d3f9503615466fd4aeee8349
[ "MIT" ]
1
2022-03-15T23:02:14.000Z
2022-03-15T23:02:14.000Z
# OpenapiClient::CreateTransferTicketResponse ## Properties | Name | Type | Description | Notes | | ---- | ---- | ----------- | ----- | | **ticket_id** | **String** | | [optional] | ## Example ```ruby require 'openapi_client' instance = OpenapiClient::CreateTransferTicketResponse.new( ticket_id: null ) ```
16.684211
59
0.615142
eng_Latn
0.174274
e9a473df3cedf8ccd3edf450acd806775a8dfb6b
1,060
md
Markdown
docker/README.md
esmitperez/Ujko-svg
54dca9f615ccdfe21cbe534a5adf2ff110c394d7
[ "Apache-2.0" ]
null
null
null
docker/README.md
esmitperez/Ujko-svg
54dca9f615ccdfe21cbe534a5adf2ff110c394d7
[ "Apache-2.0" ]
null
null
null
docker/README.md
esmitperez/Ujko-svg
54dca9f615ccdfe21cbe534a5adf2ff110c394d7
[ "Apache-2.0" ]
1
2018-12-23T18:15:51.000Z
2018-12-23T18:15:51.000Z
_       _ _         __  __ | | | (_) | _____ | \/ | __ _ _ __ ___ | | | | | |/ / _ \ | |\/| |/ _` | '_ \/ __| | |_| | | < (_) | | | | | (_| | |_) \__ \ \___// |_|\_\___/ |_| |_|\__,_| .__/|___/ |__/ |_| with Kartograph Author: Esmit Pérez - esmitperez@gmail.com - @esmitperez https://github.com/esmitperez/Ujko-svg July 2014 =============================================================================== - Command on Linux/Mac OS X (Linux requires sudo) docker run -ti -v `pwd`:/workdir esmitperez/kartograph \ --pretty-print \ --style /workdir/css/cr.css \ -o /workdir/svg/cr-ultralow-res.svg \ /workdir/configs/distritos.config.json Tips: - If you are using Docker on Mac OS X, read this: --- https://medium.com/boot2docker-lightweight-linux-for-docker/boot2docker-together-with-virtualbox-guest-additions-da1e3ab2465c - Make sure you have initialized Docker, mapping /workdir with '-v /folder/on/your/pc:/workdir' ===============================================================================
35.333333
129
0.507547
oci_Latn
0.281791
e9a6bfebbf46f5a2068d3b79090d467638244d31
255
md
Markdown
spreed-webrtc-master/src/styles/libs/bootstrap/README.md
AkhilNazim029/dreadnokz
7bc3d77a6a591187271af5ab35c2532e13d30a82
[ "CC-BY-3.0" ]
1
2020-12-04T11:38:51.000Z
2020-12-04T11:38:51.000Z
spreed-webrtc-master/src/styles/libs/bootstrap/README.md
AkhilNazim029/dreadnokz
7bc3d77a6a591187271af5ab35c2532e13d30a82
[ "CC-BY-3.0" ]
null
null
null
spreed-webrtc-master/src/styles/libs/bootstrap/README.md
AkhilNazim029/dreadnokz
7bc3d77a6a591187271af5ab35c2532e13d30a82
[ "CC-BY-3.0" ]
null
null
null
# Bootstrap sass distribution Downloaded from [link](https://github.com/twbs/bootstrap-sass/archive/v3.2.0.tar.gz). Used data from `assets/stylesheets` folder except: + `_bootstrap-compass.scss` + `_bootstrap-mincer.scss` + `_bootstrap-sprockets.scss`
31.875
86
0.764706
yue_Hant
0.190133
e9a7eb89ff797d105921941c4f72a592f298612f
7,095
md
Markdown
articles/cdn/cdn-billing.md
changeworld/azure-docs.cs-cz
cbff9869fbcda283f69d4909754309e49c409f7d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/cdn/cdn-billing.md
changeworld/azure-docs.cs-cz
cbff9869fbcda283f69d4909754309e49c409f7d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/cdn/cdn-billing.md
changeworld/azure-docs.cs-cz
cbff9869fbcda283f69d4909754309e49c409f7d
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Understanding Azure CDN billing | Microsoft Docs
description: This FAQ describes how Azure CDN billing works.
services: cdn
documentationcenter: ''
author: asudbring
manager: danielgi
editor: ''
ms.assetid: ''
ms.service: azure-cdn
ms.workload: media
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 09/13/2019
ms.author: allensu
ms.openlocfilehash: d3a2dfba98f83d34c3e83ec865e3b692f7dbacd2
ms.sourcegitcommit: 8dc84e8b04390f39a3c11e9b0eaf3264861fcafc
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 04/13/2020
ms.locfileid: "81254219"
---
# <a name="understanding-azure-cdn-billing"></a>Understanding Azure CDN billing

This FAQ describes the billing structure for content hosted by Azure Content Delivery Network (CDN).

## <a name="what-is-a-billing-region"></a>What is a billing region?

A billing region is a geographic area used to determine what rate is charged for delivery of objects from Azure CDN. The current billing zones and their regions are as follows:

- Zone 1: North America, Europe, Middle East and Africa
- Zone 2: Asia Pacific (including Japan)
- Zone 3: South America
- Zone 4: Australia and New Zealand
- Zone 5: India

For information about point-of-presence (POP) regions, see [Azure CDN POP locations by region](https://docs.microsoft.com/azure/cdn/cdn-pop-locations). For example, a POP located in Mexico is in the North America region and is therefore included in zone 1.

For information about Azure CDN pricing, see [Content Delivery Network pricing](https://azure.microsoft.com/pricing/details/cdn/).

## <a name="how-are-delivery-charges-calculated-by-region"></a>How are delivery charges calculated by region?

The Azure CDN billing region is based on the location of the source server delivering the content to the end user. The destination (physical location) of the client is not considered the billing region. For example, if a user located in Mexico issues a request and that request is served by a server located in a United States POP due to peering or traffic conditions, the billing region will be the United States.

## <a name="what-is-a-billable-azure-cdn-transaction"></a>What is a billable Azure CDN transaction?

Any HTTP(S) request that terminates at the CDN is a billable event, which includes all response types: success, failure, or other. However, different responses may generate different traffic amounts. For example, *304 Not Modified* and other header-only responses generate little traffic because they are a small header response; similarly, error responses (for example, *404 Not Found*) are billable but incur little cost due to the small response payload.

## <a name="what-other-azure-costs-are-associated-with-azure-cdn-use"></a>What other Azure costs are associated with Azure CDN use?

Using Azure CDN also incurs some usage charges on the services used as the origin for your objects. These costs are typically a small fraction of the overall CDN usage cost.

If you are using Azure Blob storage as the origin for your content, you also incur the following storage charges for cache fills:

- Actual GB used: the actual storage of your source objects.
- Transactions: as needed to fill the cache.
- Transfers in GB: the amount of data transferred to fill the CDN caches.

> [!NOTE]
> As of October 2019, if you are using Azure CDN from Microsoft, the cost of data transfer from origins hosted in Azure to CDN POPs is free. Azure CDN from Verizon and Azure CDN from Akamai are subject to the rates described below.

For more information about Azure Storage billing, see [Understanding Azure Storage billing – bandwidth, transactions, and capacity](https://blogs.msdn.microsoft.com/windowsazurestorage/2010/07/08/understanding-windows-azure-storage-billing-bandwidth-transactions-and-capacity/).

If you are using *hosted service delivery*, you will incur charges as follows:

- Azure compute time: the compute instances that act as the origin.
- Azure compute transfer: data transfers from the compute instances to fill the Azure CDN caches.

If your client uses byte-range requests (regardless of the origin service), the following apply:

- A *byte-range request* is a billable transaction at the CDN. When a client issues a byte-range request, the request is for a subset (range) of the object. The CDN responds with only the partial portion of the content requested. This partial response is a billable transaction, and the transfer amount is limited to the size of the range response (plus headers).
- When a request arrives for only part of an object (by specifying a byte-range header), the CDN may fetch the entire object into its cache. As a result, even though the billable transaction from the CDN is for a partial response, the billable transaction from the origin may involve the full size of the object.

## <a name="how-much-transfer-activity-occurs-to-support-the-cache"></a>How much transfer activity occurs to support the cache?

Each time a CDN POP needs to fill its cache, it makes a request to the origin for the object being cached. As a result, the origin incurs a billable transaction on every cache miss. The number of cache misses depends on a number of factors:

- How cacheable the content is: if the content has high TTL (time-to-live)/expiration values and is accessed frequently, so it stays popular in the cache, then the vast majority of the load is handled by the CDN. A typical good cache-hit ratio is well above 90%, meaning that less than 10% of client requests have to return to the origin, either for a cache miss or for an object refresh.
- How many nodes need to fetch the object: each time a node fetches an object from the origin, it incurs a billable transaction. As a result, more global content (accessed from more nodes) leads to more billable transactions.
- Influence of TTL: a higher TTL for an object means it needs to be fetched from the origin less frequently. It also means that clients, such as browsers, can cache the object longer, which can reduce the transactions at the CDN.

## <a name="which-origin-services-are-eligible-for-free-data-transfer-with-azure-cdn-from-microsoft"></a>Which origin services are eligible for free data transfer with Azure CDN from Microsoft?

If you use one of the following Azure services as your CDN origin, you will not be charged for data transfer from the origin to the CDN POPs.

- Azure Storage
- Azure Media Services
- Azure Virtual Machines
- Virtual Network
- Load Balancer
- Application Gateway
- Azure DNS
- ExpressRoute
- VPN Gateway
- Traffic Manager
- Network Watcher
- Azure Firewall
- Azure Front Door Service
- Azure Bastion
- Azure App Service
- Azure Functions
- Azure Data Factory
- Azure API Management
- Azure Batch
- Azure Data Explorer
- HDInsight
- Azure Cosmos DB
- Azure Data Lake Store
- Azure Machine Learning
- Azure SQL Database
- Azure Cache for Redis

## <a name="how-do-i-manage-my-costs-most-effectively"></a>How do I manage my costs most effectively?

Set the longest TTL possible on your content.
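The billing model above — a per-GB rate chosen by zone, with the zone determined by the POP serving the request rather than the client's location — can be sketched with a back-of-the-envelope calculation. All rates and POP names below are made-up placeholders for illustration, not actual Azure pricing:

```python
# HYPOTHETICAL per-GB delivery rates per billing zone (not real Azure pricing).
RATE_PER_GB = {1: 0.081, 2: 0.138, 3: 0.158, 4: 0.125, 5: 0.170}

# The zone is decided by the POP that serves the request, not by the client.
# POP names here are illustrative placeholders.
POP_ZONE = {"us-east": 1, "amsterdam": 1, "tokyo": 2, "sao-paulo": 3}

def delivery_cost(events):
    """Total delivery cost for an iterable of (pop_name, gb_transferred)."""
    return sum(RATE_PER_GB[POP_ZONE[pop]] * gb for pop, gb in events)

# A client in Mexico served from a US POP bills at the zone-1 rate,
# regardless of where the client physically is.
events = [("us-east", 100.0), ("tokyo", 10.0)]
print(round(delivery_cost(events), 2))  # 100 GB at zone-1 + 10 GB at zone-2
```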
58.155738
461
0.799436
ces_Latn
0.999948
e9a897c6ba6da3dc92dd6a6b125891ecd2886527
68
md
Markdown
content/about.md
YukiAllerion/SCNPE
30764d23e1aa6a46ee9deaedecf253a91d6cbdc3
[ "MIT" ]
null
null
null
content/about.md
YukiAllerion/SCNPE
30764d23e1aa6a46ee9deaedecf253a91d6cbdc3
[ "MIT" ]
1
2021-06-17T08:53:11.000Z
2021-06-17T08:53:11.000Z
public/content/about.md
YukiAllerion/yukiallerion.github.io
20e3544b35a8e7fc18e71e924afafd15f4f6ff74
[ "MIT" ]
null
null
null
--- title: "About" date: 2020-07-29T19:02:00-04:00 draft: true ---
9.714286
31
0.617647
nob_Latn
0.145614
e9a98cd83e3c7a149848d8a030d47502380ff9dd
4,053
md
Markdown
docs/organizations/audit-logs.md
Losant/losant-docs
edbee4e1fa6948b6128f7b4b6bc07fb59d2605d2
[ "MIT" ]
10
2016-06-16T20:20:17.000Z
2018-07-10T23:22:29.000Z
docs/organizations/audit-logs.md
Losant/losant-docs
edbee4e1fa6948b6128f7b4b6bc07fb59d2605d2
[ "MIT" ]
18
2016-07-03T18:45:31.000Z
2019-03-26T14:23:24.000Z
docs/organizations/audit-logs.md
Losant/losant-docs
edbee4e1fa6948b6128f7b4b6bc07fb59d2605d2
[ "MIT" ]
11
2016-04-29T18:15:42.000Z
2018-10-03T17:47:47.000Z
--- description: Learn about viewing the audit log for a Losant Organization. --- # Audit Logs Enterprise organizations have access to audit logs, detailing the creation, modification and deletion of Losant resources by your organization's team members. This allows administrators to view a trail of edits made to any resource owned by that organization. If audit logs are enabled for your organization, and you have [administrator permissions](/organizations/members/#member-roles) for the organization, you will find a link to the logs in the "Settings" tab: ![Audit Logs Link](/images/organizations/audit-logs-overview.png "Audit Logs Link") ## Log Entries Currently, log entries are sorted by timestamp (latest to earliest). Audit logs are available for all organization edits from the second week of January 2017 onwards, and the records are kept for one year regardless of your organization's data retention plan. ![Audit Logs List](/images/organizations/audit-logs-list.png "Audit Logs List") ### Available Information The following information is available for each request included in the audit log: * Request ID * Timestamp of request * URL path and HTTP method, which maps to a Losant API method * Query parameters included in the request, if applicable * Request body, if applicable * Request initiator (usually an organization member, but occasionally a workflow or other entity) * Response body, if applicable * HTTP response code ![Audit Log Detail](/images/organizations/audit-log-detail.png "Audit Log Detail") Note that a number of sensitive details, such as application key secrets, credit card numbers and invitation tokens, are redacted from the audit logs for security purposes. ## What's Recorded As a general rule, any request that creates, edits or deletes a Losant resource will appear in the audit logs. More specifically ...
* Edits to the [organization itself](/organizations/overview/#managing-organizations), such as name or description changes * Changes to [organization members](/organizations/members/), such as leaving the organization or changing a member's role * Creation, revocation, acceptance or deletion of any [member invitation](/organizations/members/#inviting-new-members) * [Transfer of resources](/organizations/overview/#transferring-resources) to or from the organization, whether the other party is another organization or a [user's Sandbox](/user-accounts/sandbox/) * Creation or deletion of any [application](/applications/overview/) or [dashboard](/dashboards/overview/) owned by the organization, or edits to those resources. This includes application sub-resources, such as: * [Access Keys](/applications/access-keys/) * [Application API Tokens](/applications/application-tokens/) * [Data Tables](/data-tables/overview/) * [Devices](/devices/overview/) * [Device Recipes](/devices/device-recipes/) * [Experience Endpoints](/experiences/endpoints/) * [Experience Groups](/experiences/groups/) * [Experience Views](/experiences/views/) * [Experience Users](/experiences/users/) * [Experience Domains](/experiences/domains/) * [Integrations](/applications/integrations/) * [Webhooks](/applications/webhooks/) * [Workflows](/workflows/overview/) ## What's Not Recorded Not everything that happens within your organization is available in the audit log. 
Here are a number of requests that do not show up in the logs: * User or device [authentications](/rest-api/auth/) * Edits to any personal [user account](/user-accounts/overview/), including password resets * [Device commands](/devices/commands/) or [state reports](/devices/state/), whether by [REST](/rest-api/overview/) or [MQTT](/mqtt/overview/) * Creation or deletion of [Losant events](/applications/events/), or the editing of any event * State retrieval queries, including those made by Losant dashboards * GET requests (data retrieval) on any organization-owned resource or sub-resource * Many "pre-process" errors, such as malformed requests, 404s and timeouts
57.9
259
0.771034
eng_Latn
0.984678
e9a9effa850402b160acdf466d7de77bf4f2ec87
1,763
md
Markdown
article/qiitacli.md
mypaceshun/Qiitacli
8d5538c281fbbf420d21839c558d2779d05c9452
[ "MIT" ]
null
null
null
article/qiitacli.md
mypaceshun/Qiitacli
8d5538c281fbbf420d21839c558d2779d05c9452
[ "MIT" ]
6
2020-02-24T19:58:11.000Z
2021-06-02T00:43:21.000Z
article/qiitacli.md
mypaceshun/qiitacli
59c981f502dcf49bea2fed6d70fc1735018f8ab8
[ "MIT" ]
null
null
null
---
title: I Built a Qiita CLI Application
tags:
- Python
- Click
private: yes
---

# How this started

I want to write articles on Qiita, but opening a browser and then Qiita's editor every single time is a hassle. I'd rather write in Markdown with Vim and upload from there. Fine — I'll build it myself!

# A command that uploads articles to Qiita

The rough idea: create a file like `article.md` in a handy text editor, then run

``` console
$ qiitacli upload [article title] article.md
```

and the article gets uploaded. Python plus a module called `Click` looks like it would do the job. Why Python? Because it is the language I am most comfortable with.

# Qiita API v2

Qiita provides an API called Qiita API v2: [Qiita API v2](https://qiita.com/api/v2/docs). With it, you can do pretty much anything Qiita-related.

# `qiita_v2`

Someone has usually already built this kind of thing, so I searched PyPI for "qiita" and found it: [`qiita_v2`](https://github.com/petitviolet/qiita_py). Let's gratefully use it.

# Setting up for development

When I develop in Python, I keep the following in mind:

* Write tests (using `pytest`)
* Use code-formatting tools (`flake8`, `autopep8`, `autoflake`, `isort`)
* Create `setup.py` and `setup.cfg` so the package can be uploaded to `PyPI`

This time I also tried:

* Automated testing with Travis CI
* Code-coverage measurement with Codecov

You often see this on GitHub:

[![Build Status](https://travis-ci.org/mypaceshun/qiitacli.svg?branch=master)](https://travis-ci.org/mypaceshun/qiitacli)

I just wanted that badge :)

# Built it

https://github.com/mypaceshun/qiitacli

A first version is done. This very article was uploaded using this code. Using a Qiita personal access token, it can now list, publish, edit and delete articles — the bare minimum for a start.

# Remaining issues

As it stands it only uploads Markdown documents, so attaching images or doing fine-grained styling is probably out of reach. I originally built this so that even simple notes could become articles, so for now that is good enough. I may improve it if I feel like it.

# Thoughts from building it

I had used Travis CI a couple of times before, so I more or less understood it. But that was long ago, and I was not used to writing test code back then, so I never felt much of the benefit... Recently I have finally become able to write tests with `pytest`, so this time the blessings of automated testing hit extra hard.

Codecov, on the other hand, was completely new to me. Since I only recently started measuring code coverage at all, being able to browse the coverage and see which code the tests do not reach was great. I get the urge to push coverage to 100%, but churning out pointless test code just to raise the number makes no sense either — aaargh. You just have to try things out and see how it goes.
20.034091
121
0.825298
jpn_Jpan
0.903558
e9ab88873deeddfc04bbef6582b19aae892ae9e8
331
md
Markdown
frontend/src/assets/content/giffordpinchot/where-to-cut-your-tree/prohibited-areas.md
18F/fs-open-forest-platform
258dd213d12398c65f0a22a87995e441dc119c6a
[ "CC0-1.0" ]
27
2018-08-28T23:37:56.000Z
2019-08-08T15:13:52.000Z
frontend/src/assets/content/giffordpinchot/where-to-cut-your-tree/prohibited-areas.md
18F/fs-open-forest-platform
258dd213d12398c65f0a22a87995e441dc119c6a
[ "CC0-1.0" ]
288
2018-08-27T20:31:53.000Z
2019-08-13T11:55:57.000Z
frontend/src/assets/content/giffordpinchot/where-to-cut-your-tree/prohibited-areas.md
18F/fs-open-forest-platform
258dd213d12398c65f0a22a87995e441dc119c6a
[ "CC0-1.0" ]
6
2018-09-06T16:52:17.000Z
2021-02-14T09:52:31.000Z
No cutting or removing trees within: * Congressionally designated Wilderness Areas. * Mount St. Helens National Volcanic Monument. * Experimental forests. * Developed campgrounds. * Administrative sites. * Within 300 feet of streams. * On private or state-owned lands within national forest boundaries. * In any other posted area
30.090909
68
0.794562
eng_Latn
0.995617
e9abf1e7eafd19e4f4133bc5f9e2eb30697861fb
2,493
md
Markdown
_posts/2021-07-25-acpi.md
yduf/yduf.github.io
ffb34c791fc8962904c8d6f1c2245432745f6623
[ "MIT" ]
null
null
null
_posts/2021-07-25-acpi.md
yduf/yduf.github.io
ffb34c791fc8962904c8d6f1c2245432745f6623
[ "MIT" ]
3
2019-09-29T13:42:27.000Z
2021-10-06T20:20:31.000Z
_posts/2021-07-25-acpi.md
yduf/yduf.github.io
ffb34c791fc8962904c8d6f1c2245432745f6623
[ "MIT" ]
null
null
null
--- published: true title: ACPI tags: reverse acpi security brightness --- > Advanced Configuration and Power Interface - [acpi](https://en.wikipedia.org/wiki/Advanced_Configuration_and_Power_Interface) ## ACPI Event {% highlight bash %} acpi_listen {% endhighlight %} ## [Backlight](https://wiki.archlinux.org/title/Backlight) {% highlight bash %} $ ls /sys/class/backlight/ # echo 4000 > /sys/class/backlight/intel_backlight/brightness {% endhighlight %} ## [Debugging Kernel Suspend](https://wiki.ubuntu.com/DebuggingKernelSuspend) In order to simulate your suspend/resume process, enter the following commands: {% highlight bash %} sudo sh -c "sync && echo 1 > /sys/power/pm_trace && pm-suspend" {% endhighlight %} At this point your computer should enter the suspend state within a few seconds. Usually the power LED will slowly flash when in the suspended state. When that has happened, initiate the resume process by pressing the power button. Observe closely if the disk light comes on briefly. This indicates that resume has begun. If resume fails to complete, then press the power button until the computer turns off. Power on your computer making sure that it loads the same kernel that exhibited the resume problem. You have about 3 minutes to start this boot process before the information saved in the RTC gets corrupted. 
Start a console and enter: {% highlight bash %} dmesg > dmesg.txt [64515.883724] ACPI: Waking up from system sleep state S3 {% endhighlight %} ## [Patching ACPI tables](https://github.com/ivzave/matebook-linux#patching-acpi-tables) - [acpidump](https://www.mankier.com/1/acpidump) / [acpixtract](https://www.mankier.com/1/acpixtract#) First extract tables: {% highlight bash %} acpidump > acpidump.hex acpixtract -a acpidump.hex {% endhighlight %} Alternative way to extract tables [through the kernel](https://www.reddit.com/r/SurfaceLinux/comments/46o3mh/fix_udev_power_adapter_event_by_patching_acpi/): {% highlight bash %} cat /sys/firmware/acpi/tables/DSDT > dsdt.dat {% endhighlight %} Decompile tables with [iasl](https://www.systutorials.com/docs/linux/man/1-iasl/) / [doc](https://acpica.org/sites/acpica/files/aslcompiler.pdf) / [Fork of Intel's iasl](https://github.com/RehabMan/Intel-iasl) {% highlight bash %} iasl -fe refs.txt -da dsdt.dat ssdt*.dat {% endhighlight %} ### [DSDT](https://wiki.archlinux.org/title/DSDT) The Differentiated System Description Table supplies information about supported power events in a given system.
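The `acpi_listen` command mentioned at the top of the ACPI notes prints one whitespace-separated line per event, such as `button/power PBTN 00000080 00000000` (device class, bus id, then two hex fields). A small Python sketch of parsing that format, assuming the common four-field shape (some drivers emit a different number of fields, which this deliberately rejects):

```python
from typing import NamedTuple


class AcpiEvent(NamedTuple):
    device_class: str  # e.g. "button/power"
    bus_id: str        # e.g. "PBTN" or "LNXPWRBN:00"
    type: int          # event type, hex in the raw line
    data: int          # event data, hex in the raw line


def parse_acpi_event(line: str) -> AcpiEvent:
    """Split one four-field acpi_listen line; raises ValueError otherwise."""
    device_class, bus_id, ev_type, data = line.split()
    return AcpiEvent(device_class, bus_id, int(ev_type, 16), int(data, 16))
```

Feeding `acpi_listen`'s stdout through such a parser is one way to trigger custom actions (e.g. adjusting the backlight sysfs file shown above) on power-button or lid events.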
39.571429
616
0.762535
eng_Latn
0.840425
e9acd3745ecb33fa9e7da57b6dee88170a82ad55
380
md
Markdown
docs/api/alfa-style.style_namespace.initial_typealias.md
Siteimprove/alfa
3eb032275a9fa5f3b97b892e28ebfc90eb4ef611
[ "MIT" ]
70
2018-05-25T16:02:23.000Z
2022-03-21T14:28:03.000Z
docs/api/alfa-style.style_namespace.initial_typealias.md
Siteimprove/alfa
3eb032275a9fa5f3b97b892e28ebfc90eb4ef611
[ "MIT" ]
448
2018-06-01T08:46:47.000Z
2022-03-31T14:02:55.000Z
docs/api/alfa-style.style_namespace.initial_typealias.md
Siteimprove/alfa
3eb032275a9fa5f3b97b892e28ebfc90eb4ef611
[ "MIT" ]
13
2018-07-04T19:47:49.000Z
2022-02-19T09:59:34.000Z
<!-- Do not edit this file. It is automatically generated by API Documenter. --> [Home](./index.md) &gt; [@siteimprove/alfa-style](./alfa-style.md) &gt; [Style](./alfa-style.style_namespace.md) &gt; [Initial](./alfa-style.style_namespace.initial_typealias.md) ## Style.Initial type <b>Signature:</b> ```typescript type Initial<N extends Name> = Property.Value.Initial<N>; ```
31.666667
178
0.705263
eng_Latn
0.542384
e9ace9e0ce5a1c154eb2a2ca0f326fc7a0c1e761
171
md
Markdown
README.md
lightning-sprinkle/lightning-sprinkle-publisher-lib
3dedacaf7ad8a13bed28668fda811597a01fca10
[ "MIT" ]
null
null
null
README.md
lightning-sprinkle/lightning-sprinkle-publisher-lib
3dedacaf7ad8a13bed28668fda811597a01fca10
[ "MIT" ]
null
null
null
README.md
lightning-sprinkle/lightning-sprinkle-publisher-lib
3dedacaf7ad8a13bed28668fda811597a01fca10
[ "MIT" ]
null
null
null
# Basic-Static-Website ## This is a template static website built with HTML, CSS, and Bootstrap during my learning. Very basic, and can be useful for web development beginners.
57
147
0.80117
eng_Latn
0.984169
e9ae1bded932e1804f4834d0f04391de5ae012d6
3,745
md
Markdown
README.md
jfahrer/donner
d97de7919ece502ed805e08d7bacd68d2f7692df
[ "MIT" ]
8
2019-07-23T22:52:08.000Z
2019-08-29T13:17:50.000Z
README.md
jfahrer/donner
d97de7919ece502ed805e08d7bacd68d2f7692df
[ "MIT" ]
2
2019-07-02T00:08:40.000Z
2020-09-27T09:23:12.000Z
README.md
codetales/donner
d97de7919ece502ed805e08d7bacd68d2f7692df
[ "MIT" ]
1
2019-06-23T06:30:35.000Z
2019-06-23T06:30:35.000Z
# Donner [![Go Report Card](https://goreportcard.com/badge/github.com/codetales/donner)](https://goreportcard.com/report/github.com/codetales/donner) [![Build Status](https://travis-ci.com/codetales/donner.svg?branch=master)](https://travis-ci.com/codetales/donner) :boom: Donner is a generic command wrapper. It lets you define strategies to wrap commands in things like `docker-compose exec` or `docker container run`. This can come in very handy when developing applications in containers. Donner allows defining a wrapping strategy on a per-command basis, so you don't have to worry about which service to use or whether you should use `docker-compose exec` or `docker-compose run` when executing a command. ## Installation `brew install codetales/tap/donner` or download the latest binary from the [releases](https://github.com/codetales/donner/releases) page. ## Examples Example config for a Ruby project: ```yaml strategies: run: handler: docker_compose_run service: app remove: true exec: handler: docker_compose_exec service: app exec_postgres: handler: docker_compose_exec service: pg run_with_docker: handler: docker_run image: alpine:latest volumes: - "./:/usr/src/app" default_strategy: run # use this strategy for all undefined commands commands: rails: exec rspec: exec ruby: run bundle: run irb: run rake: # override strategy definition strategy: run remove: false psql: exec_postgres ``` ## Running stuff ``` donner run ruby -v # Executes `docker-compose run --rm app ruby -v` donner run rails c # Executes `docker-compose exec app rails c` donner run psql # Executes `docker-compose exec pg psql` donner run ./bin/rspec spec/ # Executes `docker-compose exec app ./bin/rspec spec/` - Donner strips off the path when checking an executable donner run rake my:task # Executes `docker-compose run app rake my:task` donner run irb # Executes `docker-compose run --rm app irb` donner run other-command # Executes `docker-compose run --rm app other-command` ``` ## Setting aliases ``` $ donner
aliases alias rails='donner run rails' alias rspec='donner run rspec' alias rake='donner run rake' alias psql='donner run psql' alias ruby='donner run ruby' alias bundle='donner run bundle' alias irb='donner run irb' # copy and paste the output into your terminal or run # eval $(donner aliases) ``` ## Fallbacks and strict checking To make aliases work nicely system wide, Donner supports a `--fallback` and `--strict` flag. * When executing `donner run --strict some-command` and the provided command is not defined under `commands` or `aliases`, Donner will fail executing the command. * When executing `donner run --fallback` with no `.donner.yml` in the current directory, Donner executes the command as is. * When executing `donner run --strict --fallback` and the provided command is not defined, Donner executes the command as is. This is also supported with aliases: ``` $ donner aliases --fallback --strict alias rails='donner run --fallback --strict rails' alias rspec='donner run --fallback --strict rspec' alias rake='donner run --fallback --strict rake' alias psql='donner run --fallback --strict psql' alias ruby='donner run --fallback --strict ruby' alias bundle='donner run --fallback --strict bundle' alias irb='donner run --fallback --strict irb' # copy and paste the output into your terminal or run # eval $(donner aliases --fallback --strict) ``` ## TODO * Ensure useful error messages and test failure cases * Add missing flags to the handlers (volumes, ...) * Add documentation * Setup bash/zsh completion
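The `--strict`/`--fallback` rules in the Donner README can be sketched as pure lookup logic. Donner itself is written in Go; this Python illustration, the `resolve` function shape, and the `"passthrough"` sentinel (meaning "execute the command as-is") are my own, under the reading that `--strict` disables the default strategy and `--fallback` permits raw execution:

```python
def resolve(command, commands, default_strategy=None, strict=False, fallback=False):
    """Pick a strategy name for `command` following the documented flag semantics."""
    if command in commands:
        # Explicitly configured commands always use their declared strategy.
        return commands[command]
    if default_strategy is not None and not strict:
        # Undefined commands fall back to default_strategy unless --strict is set.
        return default_strategy
    if fallback:
        # --fallback: run the command as-is, outside any wrapping strategy.
        return "passthrough"
    raise LookupError(f"no strategy defined for {command!r}")
```

For example, with the README's config, `resolve("rails", ...)` picks `exec`, an unknown command picks the default `run`, and the same unknown command under `--strict` without `--fallback` fails.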
36.009615
212
0.718558
eng_Latn
0.958486
97bbfaf99df32bcb424413eb94219c989a8f02cf
73
md
Markdown
README.md
gsnunes/mobiletheme
0d3d21ccd50895c0e50989affa11ccdd03e21826
[ "MIT" ]
null
null
null
README.md
gsnunes/mobiletheme
0d3d21ccd50895c0e50989affa11ccdd03e21826
[ "MIT" ]
null
null
null
README.md
gsnunes/mobiletheme
0d3d21ccd50895c0e50989affa11ccdd03e21826
[ "MIT" ]
1
2022-02-18T07:28:46.000Z
2022-02-18T07:28:46.000Z
mobiletheme =========== A WordPress theme for mobile apps and hotsites.
14.6
47
0.684932
eng_Latn
0.969723
97bc09deb24bafbd273ebc5329f18572333b9a5f
1,236
md
Markdown
content/momentum/web-momo4/lua.ref.msys.qp.encode.md
ahdvnd/developers.sparkpost.com
ff60f1f534d1bcf3b7bc85a6db58be2cf22978c3
[ "MIT" ]
null
null
null
content/momentum/web-momo4/lua.ref.msys.qp.encode.md
ahdvnd/developers.sparkpost.com
ff60f1f534d1bcf3b7bc85a6db58be2cf22978c3
[ "MIT" ]
null
null
null
content/momentum/web-momo4/lua.ref.msys.qp.encode.md
ahdvnd/developers.sparkpost.com
ff60f1f534d1bcf3b7bc85a6db58be2cf22978c3
[ "MIT" ]
null
null
null
| | | | | --- | --- | --- | | [Prev](lua.ref.msys.qp.decode)  | Chapter 70. Lua Functions Reference |  [Next](lua.ref.msys.rfc3464.compute_delivery_status) | <a name="lua.ref.msys.qp.encode"></a> ## Name msys.qp.encode — Quoted-printable encode a string <a name="idp18353808"></a> ## Synopsis `msys.qp.encode(original[, charset, dotstuffing]);` ``` original: mixed charset: string, optional dotstuffing: boolean ``` <a name="idp18356864"></a> ## Description `original` can be a string, `msys.core:_ec_string`, or `msys.core:_io_object`. Use `charset` to convert the text to a different character encoding before it is quoted-printable encoded. Use `dotstuffing` to specify whether or not the encoded text will be dot stuffed if qp encoding creates new line breaks. Encoded text is returned in a string. Enable this function with the statement `require('msys.qp');`. <a name="idp18362096"></a> ## See Also [msys.qp.decode](lua.ref.msys.qp.decode "msys.qp.decode") | | | | | --- | --- | --- | | [Prev](lua.ref.msys.qp.decode)  | [Up](lua.function.details) |  [Next](lua.ref.msys.rfc3464.compute_delivery_status) | | msys.qp.decode  | [Table of Contents](index) |  msys.rfc3464.compute_delivery_status |
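For comparison with `msys.qp.encode` above, the same quoted-printable transformation is available in Python's standard `quopri` module; this is a stdlib analogue for illustration, not part of the Momentum Lua API:

```python
import quopri

# Quoted-printable encode a UTF-8 string: non-ASCII bytes become =XX escapes.
original = "café".encode("utf-8")
encoded = quopri.encodestring(original)

# Decoding round-trips back to the original bytes.
decoded = quopri.decodestring(encoded)
print(encoded, decoded == original)
```

Note that `quopri` works on bytes, so any charset conversion (the `charset` argument of `msys.qp.encode`) has to happen before encoding, here via `str.encode`.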
33.405405
344
0.682848
eng_Latn
0.802481
97bc522b228cf993fbd79bfe371485226745d33c
53,319
md
Markdown
source/_posts/2021-06-15-release-v14.17.1.md
yous/iojs-ko
08fe62f43463647a5992da5ebb8c28b00d01cff1
[ "MIT" ]
227
2015-10-16T06:35:49.000Z
2022-02-13T11:34:21.000Z
source/_posts/2021-06-15-release-v14.17.1.md
yous/iojs-ko
08fe62f43463647a5992da5ebb8c28b00d01cff1
[ "MIT" ]
879
2015-10-16T01:37:59.000Z
2022-03-29T18:36:58.000Z
source/_posts/2021-06-15-release-v14.17.1.md
yous/iojs-ko
08fe62f43463647a5992da5ebb8c28b00d01cff1
[ "MIT" ]
98
2015-10-17T00:53:06.000Z
2022-02-02T14:42:53.000Z
--- category: articles title: Node v14.17.1(LTS) author: Michaël Zasso ref: Node v14.17.1 (LTS) refurl: https://nodejs.org/en/blog/release/v14.17.1 translator: outsideris --- <!-- ### Notable Changes * [[`6035492c8f`](https://github.com/nodejs/node/commit/6035492c8f)] - **deps**: update ICU to 69.1 (Michaël Zasso) [#38178](https://github.com/nodejs/node/pull/38178) * [[`9417fd0bc8`](https://github.com/nodejs/node/commit/9417fd0bc8)] - **errors**: align source-map stacks with spec (Benjamin Coe) [#37252](https://github.com/nodejs/node/pull/37252) --> ### Notable Changes * [[`6035492c8f`](https://github.com/nodejs/node/commit/6035492c8f)] - **deps**: updated ICU to 69.1. (Michaël Zasso) [#38178](https://github.com/nodejs/node/pull/38178) * [[`9417fd0bc8`](https://github.com/nodejs/node/commit/9417fd0bc8)] - **errors**: aligned source-map stack traces with the spec. (Benjamin Coe) [#37252](https://github.com/nodejs/node/pull/37252) ### Commits * [[`87fa636953`](https://github.com/nodejs/node/commit/87fa636953)] - **assert**: refactor to use more primordials (Antoine du Hamel) [#36234](https://github.com/nodejs/node/pull/36234) * [[`cfff3b4462`](https://github.com/nodejs/node/commit/cfff3b4462)] - **assert**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#37344](https://github.com/nodejs/node/pull/37344) * [[`dd18def7db`](https://github.com/nodejs/node/commit/dd18def7db)] - **async_hooks**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#37125](https://github.com/nodejs/node/pull/37125) * [[`5f3e96b570`](https://github.com/nodejs/node/commit/5f3e96b570)] - **async_hooks,doc**: replace process.stdout.fd with 1 (Darshan Sen) [#38382](https://github.com/nodejs/node/pull/38382) * [[`f4cb8b8281`](https://github.com/nodejs/node/commit/f4cb8b8281)] - **benchmark**: avoid using `console.log()` (Antoine du Hamel) [#38370](https://github.com/nodejs/node/pull/38370) * [[`9e4c1f2f2c`](https://github.com/nodejs/node/commit/9e4c1f2f2c)] - **benchmark**: use
`process.hrtime.bigint()` (Antoine du Hamel) [#38369](https://github.com/nodejs/node/pull/38369) * [[`3c886e0ad6`](https://github.com/nodejs/node/commit/3c886e0ad6)] - **buffer**: remove TODOs in `atob` / `btoa` (Khaidi Chu) [#38548](https://github.com/nodejs/node/pull/38548) * [[`c5b86f8c2f`](https://github.com/nodejs/node/commit/c5b86f8c2f)] - **buffer**: remove unreachable code (Rongjian Zhang) [#38537](https://github.com/nodejs/node/pull/38537) * [[`9ae2a27d44`](https://github.com/nodejs/node/commit/9ae2a27d44)] - **buffer**: make FastBuffer safe to construct (Antoine du Hamel) [#36587](https://github.com/nodejs/node/pull/36587) * [[`ff546ff744`](https://github.com/nodejs/node/commit/ff546ff744)] - **buffer**: refactor to use primordials instead of Array#reduce (Antoine du Hamel) [#36392](https://github.com/nodejs/node/pull/36392) * [[`5acf0a5ba3`](https://github.com/nodejs/node/commit/5acf0a5ba3)] - **buffer**: refactor to use more primordials (Antoine du Hamel) [#36166](https://github.com/nodejs/node/pull/36166) * [[`52fd42ec46`](https://github.com/nodejs/node/commit/52fd42ec46)] - **build**: work around bug in MSBuild v16.10.0 (Michaël Zasso) [#38873](https://github.com/nodejs/node/pull/38873) * [[`5df0f35bf6`](https://github.com/nodejs/node/commit/5df0f35bf6)] - **build**: add workaround for V8 builds (Richard Lau) [#38632](https://github.com/nodejs/node/pull/38632) * [[`754aa384e0`](https://github.com/nodejs/node/commit/754aa384e0)] - **build**: remove dependency on `distutils.spawn` (Richard Lau) [#38600](https://github.com/nodejs/node/pull/38600) * [[`5de7e64f3a`](https://github.com/nodejs/node/commit/5de7e64f3a)] - **build**: fix make test-npm (Ruy Adorno) [#36369](https://github.com/nodejs/node/pull/36369) * [[`e5fae63108`](https://github.com/nodejs/node/commit/e5fae63108)] - **child_process**: reduce abort handler code duplication (Rich Trott) [#36644](https://github.com/nodejs/node/pull/36644) * 
[[`598d2bdf09`](https://github.com/nodejs/node/commit/598d2bdf09)] - **child_process**: treat already-aborted controller as aborting (Rich Trott) [#36644](https://github.com/nodejs/node/pull/36644) * [[`8d7708bdef`](https://github.com/nodejs/node/commit/8d7708bdef)] - **child_process**: refactor to use more primordials (Antoine du Hamel) [#36003](https://github.com/nodejs/node/pull/36003) * [[`b8c4d30e77`](https://github.com/nodejs/node/commit/b8c4d30e77)] - **deps**: update to cjs-module-lexer@1.2.1 (Guy Bedford) [#38450](https://github.com/nodejs/node/pull/38450) * [[`6035492c8f`](https://github.com/nodejs/node/commit/6035492c8f)] - **deps**: update ICU to 69.1 (Michaël Zasso) [#38178](https://github.com/nodejs/node/pull/38178) * [[`51dad8cabb`](https://github.com/nodejs/node/commit/51dad8cabb)] - **deps**: V8: cherry-pick 035c305ce776 (Michaël Zasso) [#38497](https://github.com/nodejs/node/pull/38497) * [[`a467125c8d`](https://github.com/nodejs/node/commit/a467125c8d)] - **deps**: V8: cherry-pick dfcdf7837e23 (Benjamin Coe) [#36573](https://github.com/nodejs/node/pull/36573) * [[`acc9fad2ba`](https://github.com/nodejs/node/commit/acc9fad2ba)] - **deps**: V8: cherry-pick 86991d0587a1 (Benjamin Coe) [#36254](https://github.com/nodejs/node/pull/36254) * [[`d67827744b`](https://github.com/nodejs/node/commit/d67827744b)] - **deps**: V8: cherry-pick 530080c44af2 (Milad Fa) [#38508](https://github.com/nodejs/node/pull/38508) * [[`bac9ba4f8a`](https://github.com/nodejs/node/commit/bac9ba4f8a)] - **dgram**: extract cluster lazy loading method to make it testable (Rongjian Zhang) [#38563](https://github.com/nodejs/node/pull/38563) * [[`f5b2115b76`](https://github.com/nodejs/node/commit/f5b2115b76)] - **dgram**: refactor to use more primordials (Antoine du Hamel) [#36286](https://github.com/nodejs/node/pull/36286) * [[`cd7cf0547a`](https://github.com/nodejs/node/commit/cd7cf0547a)] - **dns**: refactor to use more primordials (Antoine du Hamel) 
[#36314](https://github.com/nodejs/node/pull/36314) * [[`9f67c0852c`](https://github.com/nodejs/node/commit/9f67c0852c)] - **doc**: cleanup events.md structure (James M Snell) [#36100](https://github.com/nodejs/node/pull/36100) * [[`41cfe645c0`](https://github.com/nodejs/node/commit/41cfe645c0)] - **doc**: fix JS flavor selection (Antoine du Hamel) [#37791](https://github.com/nodejs/node/pull/37791) * [[`7c4748b0dc`](https://github.com/nodejs/node/commit/7c4748b0dc)] - **doc**: use `HEAD` instead of `master` for links (Antoine du Hamel) [#38518](https://github.com/nodejs/node/pull/38518) * [[`26426577ff`](https://github.com/nodejs/node/commit/26426577ff)] - **doc**: remove import.meta.resolve parent URL type (Kevin Locke) [#38585](https://github.com/nodejs/node/pull/38585) * [[`88055abf19`](https://github.com/nodejs/node/commit/88055abf19)] - **doc**: document buffer.kStringMaxLength (Tobias Nießen) [#38688](https://github.com/nodejs/node/pull/38688) * [[`2e8dfee165`](https://github.com/nodejs/node/commit/2e8dfee165)] - **doc**: clarify synchronous blocking of Worker stdio (James M Snell) [#38658](https://github.com/nodejs/node/pull/38658) * [[`212cd5cf05`](https://github.com/nodejs/node/commit/212cd5cf05)] - **doc**: update contact info (Gabriel Schulhof) [#38689](https://github.com/nodejs/node/pull/38689) * [[`fa35c0662b`](https://github.com/nodejs/node/commit/fa35c0662b)] - **doc**: change color of doctag on night mode (Qingyu Deng) [#38652](https://github.com/nodejs/node/pull/38652) * [[`4c67437c53`](https://github.com/nodejs/node/commit/4c67437c53)] - **doc**: clarify DiffieHellmanGroup class docs (Nitzan Uziely) [#38363](https://github.com/nodejs/node/pull/38363) * [[`e90c60b1e3`](https://github.com/nodejs/node/commit/e90c60b1e3)] - **doc**: use AIX instead of Aix in fs.md (Rich Trott) [#38535](https://github.com/nodejs/node/pull/38535) * [[`dc67fec1b4`](https://github.com/nodejs/node/commit/dc67fec1b4)] - **doc**: remove extraneous dash from flag prefix 
(Rodolfo Carvalho) [#38532](https://github.com/nodejs/node/pull/38532) * [[`4c54d81a59`](https://github.com/nodejs/node/commit/4c54d81a59)] - **doc**: document `'secureConnect'` event limitation (James M Snell) [#38447](https://github.com/nodejs/node/pull/38447) * [[`839e8d1672`](https://github.com/nodejs/node/commit/839e8d1672)] - **doc**: mark querystring api as legacy (James M Snell) [#38436](https://github.com/nodejs/node/pull/38436) * [[`a75b7af9bd`](https://github.com/nodejs/node/commit/a75b7af9bd)] - **doc**: add arguments for stream event of Http2Server and Http2SecureServer (Qingyu Deng) [#37892](https://github.com/nodejs/node/pull/37892) * [[`cf0007edc4`](https://github.com/nodejs/node/commit/cf0007edc4)] - **doc**: indicate that abort tests do not generate core files (Rich Trott) [#38422](https://github.com/nodejs/node/pull/38422) * [[`945450b5cf`](https://github.com/nodejs/node/commit/945450b5cf)] - **doc**: add try/catch in http2 respondWithFile example (Matteo Collina) [#38410](https://github.com/nodejs/node/pull/38410) * [[`1f7cd7148a`](https://github.com/nodejs/node/commit/1f7cd7148a)] - **doc**: note the system requirements for V8 tests (DeeDeeG) [#38319](https://github.com/nodejs/node/pull/38319) * [[`cd54834854`](https://github.com/nodejs/node/commit/cd54834854)] - **doc**: minor clarification to pathObject (James M Snell) [#38437](https://github.com/nodejs/node/pull/38437) * [[`ba117c2c6f`](https://github.com/nodejs/node/commit/ba117c2c6f)] - **doc**: document new TCP\_KEEPCNT and TCP\_KEEPINTVL socket option defaults (Arnold Zokas) [#38313](https://github.com/nodejs/node/pull/38313) * [[`dcdbaffced`](https://github.com/nodejs/node/commit/dcdbaffced)] - **doc**: do not mention TCP in the allowHalfOpen option description (Luigi Pinca) [#38360](https://github.com/nodejs/node/pull/38360) * [[`fe8003e5de`](https://github.com/nodejs/node/commit/fe8003e5de)] - **doc**: update message to match actual output (Rich Trott) 
[#35271](https://github.com/nodejs/node/pull/35271) * [[`c03f23e126`](https://github.com/nodejs/node/commit/c03f23e126)] - **doc**: request default snap track be updated for LTS (Rod Vagg) [#37708](https://github.com/nodejs/node/pull/37708) * [[`a9f7aeed12`](https://github.com/nodejs/node/commit/a9f7aeed12)] - **doc**: mark `process.hrtime()` as legacy (Antoine du Hamel) [#38371](https://github.com/nodejs/node/pull/38371) * [[`cede0c57b8`](https://github.com/nodejs/node/commit/cede0c57b8)] - **doc**: fix version history for `"exports"` patterns (Antoine du Hamel) [#38355](https://github.com/nodejs/node/pull/38355) * [[`9702f22397`](https://github.com/nodejs/node/commit/9702f22397)] - **doc**: fix `package.json` `"imports"` field history (Antoine du Hamel) [#38356](https://github.com/nodejs/node/pull/38356) * [[`2d96da875e`](https://github.com/nodejs/node/commit/2d96da875e)] - **doc**: fix typo in buffer.md (divlo) [#38323](https://github.com/nodejs/node/pull/38323) * [[`6b58f28472`](https://github.com/nodejs/node/commit/6b58f28472)] - **doc**: add nodejs-sec email template (Daniel Bevenius) [#38290](https://github.com/nodejs/node/pull/38290) * [[`5a532e4725`](https://github.com/nodejs/node/commit/5a532e4725)] - **doc**: update TSC members list with three new members (Rich Trott) [#38352](https://github.com/nodejs/node/pull/38352) * [[`e994d6a27c`](https://github.com/nodejs/node/commit/e994d6a27c)] - **doc**: use `foo.prototype.bar` notation in buffer.md (Voltrex) [#38032](https://github.com/nodejs/node/pull/38032) * [[`c61f363d66`](https://github.com/nodejs/node/commit/c61f363d66)] - **doc**: internal/test/binding for testing (Bradley Meck) [#38026](https://github.com/nodejs/node/pull/38026) * [[`0bb6fe31b3`](https://github.com/nodejs/node/commit/0bb6fe31b3)] - **doc**: add missing events.on metadata (Anna Henningsen) [#37965](https://github.com/nodejs/node/pull/37965) * [[`30c82b2745`](https://github.com/nodejs/node/commit/30c82b2745)] - **doc**: fix wording in 
outgoingMessage.write (Tobias Nießen) [#37894](https://github.com/nodejs/node/pull/37894) * [[`932000020a`](https://github.com/nodejs/node/commit/932000020a)] - **doc**: fix grammar errors in http document (Qingyu Deng) [#37265](https://github.com/nodejs/node/pull/37265) * [[`19e8ae44c4`](https://github.com/nodejs/node/commit/19e8ae44c4)] - **doc**: add document for http.OutgoingMessage (Qingyu Deng) [#37265](https://github.com/nodejs/node/pull/37265) * [[`a6c123363d`](https://github.com/nodejs/node/commit/a6c123363d)] - **doc**: remove generated from dsaEncoding description (Marko Kaznovac) [#37459](https://github.com/nodejs/node/pull/37459) * [[`bc6ea63e48`](https://github.com/nodejs/node/commit/bc6ea63e48)] - **doc**: document how to register external bindings for snapshot (Joyee Cheung) [#37463](https://github.com/nodejs/node/pull/37463) * [[`2168e954aa`](https://github.com/nodejs/node/commit/2168e954aa)] - **doc**: document the NO\_COLOR and FORCE\_COLOR env vars (James M Snell) [#37477](https://github.com/nodejs/node/pull/37477) * [[`2907848fc9`](https://github.com/nodejs/node/commit/2907848fc9)] - **doc**: clarify event.isTrusted text (Rich Trott) [#36827](https://github.com/nodejs/node/pull/36827) * [[`7efa020892`](https://github.com/nodejs/node/commit/7efa020892)] - **doc**: expand openssl instructions (Michael Dawson) [#36554](https://github.com/nodejs/node/pull/36554) * [[`b197a44152`](https://github.com/nodejs/node/commit/b197a44152)] - **doc**: document ABORT\_ERR code (Benjamin Gruenbaum) [#36319](https://github.com/nodejs/node/pull/36319) * [[`1d80f89442`](https://github.com/nodejs/node/commit/1d80f89442)] - **doc**: document changes for `*/promises` alias modules (ExE Boss) [#34002](https://github.com/nodejs/node/pull/34002) * [[`9417fd0bc8`](https://github.com/nodejs/node/commit/9417fd0bc8)] - **errors**: align source-map stacks with spec (Benjamin Coe) [#37252](https://github.com/nodejs/node/pull/37252) * 
[[`dcd221ce69`](https://github.com/nodejs/node/commit/dcd221ce69)] - **errors**: refactor to use more primordials (Antoine du Hamel) [#36651](https://github.com/nodejs/node/pull/36651) * [[`ee444473e9`](https://github.com/nodejs/node/commit/ee444473e9)] - **errors**: display original symbol name (Benjamin Coe) [#36042](https://github.com/nodejs/node/pull/36042) * [[`83d28374d6`](https://github.com/nodejs/node/commit/83d28374d6)] - **errors**: refactor to use more primordials (Antoine du Hamel) [#36167](https://github.com/nodejs/node/pull/36167) * [[`7d7e34c15a`](https://github.com/nodejs/node/commit/7d7e34c15a)] - **errors**: refactor to use more primordials (Antoine du Hamel) [#35944](https://github.com/nodejs/node/pull/35944) * [[`18e5c0f3e2`](https://github.com/nodejs/node/commit/18e5c0f3e2)] - **events**: refactor to use optional chaining (ZiJian Liu) [#36763](https://github.com/nodejs/node/pull/36763) * [[`4fdcbae583`](https://github.com/nodejs/node/commit/4fdcbae583)] - **events**: refactor to use more primordials (Antoine du Hamel) [#36304](https://github.com/nodejs/node/pull/36304) * [[`c4e7dca8f3`](https://github.com/nodejs/node/commit/c4e7dca8f3)] - **fs**: fix error when writing buffers \> INT32\_MAX (Zach Bjornson) [#38546](https://github.com/nodejs/node/pull/38546) * [[`07c55d2844`](https://github.com/nodejs/node/commit/07c55d2844)] - ***Revert*** "**http**: make HEAD method to work with keep-alive" (Michaël Zasso) [#38949](https://github.com/nodejs/node/pull/38949) * [[`d8da265c81`](https://github.com/nodejs/node/commit/d8da265c81)] - **http2**: treat non-EOF empty frames like other invalid frames (Anna Henningsen) [#37875](https://github.com/nodejs/node/pull/37875) * [[`c3bd0fdb73`](https://github.com/nodejs/node/commit/c3bd0fdb73)] - **http2**: fix setting options before handle exists (Anna Henningsen) [#37875](https://github.com/nodejs/node/pull/37875) * [[`74fe1d8f0c`](https://github.com/nodejs/node/commit/74fe1d8f0c)] - **http2**: add support for 
TypedArray to getUnpackedSettings (Antoine du Hamel) [#36141](https://github.com/nodejs/node/pull/36141) * [[`c90f1dbeb3`](https://github.com/nodejs/node/commit/c90f1dbeb3)] - **https**: refactor to use more primordials (Antoine du Hamel) [#36195](https://github.com/nodejs/node/pull/36195) * [[`8258799472`](https://github.com/nodejs/node/commit/8258799472)] - **inspector**: remove redundant method for connection check (Yash Ladha) [#37986](https://github.com/nodejs/node/pull/37986) * [[`ba19313e1e`](https://github.com/nodejs/node/commit/ba19313e1e)] - **inspector**: refactor to use more primordials (Antoine du Hamel) [#36356](https://github.com/nodejs/node/pull/36356) * [[`eb8f7ee634`](https://github.com/nodejs/node/commit/eb8f7ee634)] - **lib**: revert primordials in a hot path (Antoine du Hamel) [#38248](https://github.com/nodejs/node/pull/38248) * [[`cea8b4265c`](https://github.com/nodejs/node/commit/cea8b4265c)] - **lib**: make `IterableWeakMap` safe to iterate (Antoine du Hamel) [#38523](https://github.com/nodejs/node/pull/38523) * [[`490bc58229`](https://github.com/nodejs/node/commit/490bc58229)] - **lib**: fix and improve os typings (Akhil Marsonya) [#38316](https://github.com/nodejs/node/pull/38316) * [[`af39df6d03`](https://github.com/nodejs/node/commit/af39df6d03)] - **lib**: add URI handling functions to primordials (Antoine du Hamel) [#37394](https://github.com/nodejs/node/pull/37394) * [[`16691be80e`](https://github.com/nodejs/node/commit/16691be80e)] - **lib**: fix WebIDL `object` and dictionary type conversion (ExE Boss) [#37047](https://github.com/nodejs/node/pull/37047) * [[`47ed512312`](https://github.com/nodejs/node/commit/47ed512312)] - **lib**: refactor to use optional chaining in internal/options.js (raisinten) [#36939](https://github.com/nodejs/node/pull/36939) * [[`346fc0ac21`](https://github.com/nodejs/node/commit/346fc0ac21)] - **lib**: support returning Safe collections from C++ (ExE Boss) 
[#36989](https://github.com/nodejs/node/pull/36989) * [[`8912caba64`](https://github.com/nodejs/node/commit/8912caba64)] - **lib**: expose primordials object (Antoine du Hamel) [#36872](https://github.com/nodejs/node/pull/36872) * [[`46c019b988`](https://github.com/nodejs/node/commit/46c019b988)] - **lib**: refactor source\_map to use more primordials (Antoine du Hamel) [#36733](https://github.com/nodejs/node/pull/36733) * [[`cf9556d8f7`](https://github.com/nodejs/node/commit/cf9556d8f7)] - **lib**: refactor source\_map to avoid unsafe array iteration (Antoine du Hamel) [#36734](https://github.com/nodejs/node/pull/36734) * [[`6eaf357f49`](https://github.com/nodejs/node/commit/6eaf357f49)] - **lib**: simplify `primordials.uncurryThis` (ExE Boss) [#36866](https://github.com/nodejs/node/pull/36866) * [[`9338759b01`](https://github.com/nodejs/node/commit/9338759b01)] - **lib**: remove v8\_prof\_polyfill from eslint ignore list (Antoine du Hamel) [#36537](https://github.com/nodejs/node/pull/36537) * [[`c9861a344a`](https://github.com/nodejs/node/commit/c9861a344a)] - **lib**: remove unused code (Brian White) [#36632](https://github.com/nodejs/node/pull/36632) * [[`01a71dd393`](https://github.com/nodejs/node/commit/01a71dd393)] - **lib**: refactor to use more primordials in internal/encoding.js (raisinten) [#36480](https://github.com/nodejs/node/pull/36480) * [[`e6c0877604`](https://github.com/nodejs/node/commit/e6c0877604)] - **lib**: refactor to use primordials in internal/priority\_queue.js (ZiJian Liu) [#36560](https://github.com/nodejs/node/pull/36560) * [[`6e3a2ffb98`](https://github.com/nodejs/node/commit/6e3a2ffb98)] - **lib**: add primordials.SafeStringIterator (Antoine du Hamel) [#36526](https://github.com/nodejs/node/pull/36526) * [[`bf0738bc07`](https://github.com/nodejs/node/commit/bf0738bc07)] - **lib**: make safe primordials safe to construct (Antoine du Hamel) [#36428](https://github.com/nodejs/node/pull/36428) * 
[[`7ebc18f293`](https://github.com/nodejs/node/commit/7ebc18f293)] - **lib**: make safe primordials safe to iterate (Antoine du Hamel) [#36391](https://github.com/nodejs/node/pull/36391) * [[`e12dbc8519`](https://github.com/nodejs/node/commit/e12dbc8519)] - **lib**: refactor to use more primordials in internal/histogram.js (raisinten) [#36455](https://github.com/nodejs/node/pull/36455) * [[`5daeac64a4`](https://github.com/nodejs/node/commit/5daeac64a4)] - **lib**: add uncurried accessor properties to `primordials` (ExE Boss) [#36329](https://github.com/nodejs/node/pull/36329) * [[`bb4900d9eb`](https://github.com/nodejs/node/commit/bb4900d9eb)] - **lib**: refactor primordials.uncurryThis (Antoine du Hamel) [#36221](https://github.com/nodejs/node/pull/36221) * [[`0fbe945ebb`](https://github.com/nodejs/node/commit/0fbe945ebb)] - **lib**: refactor to use more primordials (Antoine du Hamel) [#36140](https://github.com/nodejs/node/pull/36140) * [[`24d4d63308`](https://github.com/nodejs/node/commit/24d4d63308)] - **lib**: add %TypedArray% abstract constructor to primordials (ExE Boss) [#36016](https://github.com/nodejs/node/pull/36016) * [[`e2395b0f3b`](https://github.com/nodejs/node/commit/e2395b0f3b)] - **lib**: use Object static properties from primordials (Michaël Zasso) [#35380](https://github.com/nodejs/node/pull/35380) * [[`b3e22e1612`](https://github.com/nodejs/node/commit/b3e22e1612)] - **lib,tools**: enforce access to prototype from primordials (Antoine du Hamel) [#36025](https://github.com/nodejs/node/pull/36025) * [[`e94e0b488e`](https://github.com/nodejs/node/commit/e94e0b488e)] - **meta**: add v8 team (Jiawen Geng) [#38566](https://github.com/nodejs/node/pull/38566) * [[`fcc6a00f1a`](https://github.com/nodejs/node/commit/fcc6a00f1a)] - **meta**: post comment when pr labeled fast-track (James M Snell) [#38446](https://github.com/nodejs/node/pull/38446) * [[`bd0d9647ca`](https://github.com/nodejs/node/commit/bd0d9647ca)] - **module**: clarify CJS global-like 
variables not defined error message (Antoine du Hamel) [#37852](https://github.com/nodejs/node/pull/37852) * [[`0fdb5d59f7`](https://github.com/nodejs/node/commit/0fdb5d59f7)] - **module**: refactor NativeModule to avoid unsafe array iteration (Antoine du Hamel) [#37656](https://github.com/nodejs/node/pull/37656) * [[`77c7d979b6`](https://github.com/nodejs/node/commit/77c7d979b6)] - **module**: simplify tryStatSync with throwIfNoEntry option (Antoine du Hamel) [#36971](https://github.com/nodejs/node/pull/36971) * [[`1aae572220`](https://github.com/nodejs/node/commit/1aae572220)] - **module**: refactor to use more primordials (Antoine du Hamel) [#36348](https://github.com/nodejs/node/pull/36348) * [[`9e7f166161`](https://github.com/nodejs/node/commit/9e7f166161)] - **module**: refactor to use more primordials (Antoine du Hamel) [#36024](https://github.com/nodejs/node/pull/36024) * [[`eee1d291cf`](https://github.com/nodejs/node/commit/eee1d291cf)] - **module**: refactor to use iterable-weak-map (Benjamin Coe) [#35915](https://github.com/nodejs/node/pull/35915) * [[`52cbe89f7f`](https://github.com/nodejs/node/commit/52cbe89f7f)] - **net**: refactor to use more primordials (Antoine du Hamel) [#36303](https://github.com/nodejs/node/pull/36303) * [[`779ad54078`](https://github.com/nodejs/node/commit/779ad54078)] - **node-api**: faster threadsafe\_function (Fedor Indutny) [#38506](https://github.com/nodejs/node/pull/38506) * [[`5995221ced`](https://github.com/nodejs/node/commit/5995221ced)] - **node-api**: fix shutdown crashes (Michael Dawson) [#38492](https://github.com/nodejs/node/pull/38492) * [[`d8acec4cb1`](https://github.com/nodejs/node/commit/d8acec4cb1)] - **node-api**: make reference weak parameter an indirect link to references (Chengzhong Wu) [#38000](https://github.com/nodejs/node/pull/38000) * [[`c442d89ad6`](https://github.com/nodejs/node/commit/c442d89ad6)] - **os**: refactor to use more primordials (Antoine du Hamel) 
[#36284](https://github.com/nodejs/node/pull/36284) * [[`daeb6fcd78`](https://github.com/nodejs/node/commit/daeb6fcd78)] - **path**: inline conditions (Voltrex) [#38613](https://github.com/nodejs/node/pull/38613) * [[`e2f531f646`](https://github.com/nodejs/node/commit/e2f531f646)] - **path**: refactor to use more primordials (Akhil Marsonya) [#37893](https://github.com/nodejs/node/pull/37893) * [[`c1364d15a2`](https://github.com/nodejs/node/commit/c1364d15a2)] - **path**: refactor to use more primordials (Antoine du Hamel) [#36302](https://github.com/nodejs/node/pull/36302) * [[`726ef40fcb`](https://github.com/nodejs/node/commit/726ef40fcb)] - **perf_hooks**: throw ERR\_INVALID\_ARG\_VALUE if histogram.percentile param is NaN (ZiJian Liu) [#36937](https://github.com/nodejs/node/pull/36937) * [[`4686f4f41b`](https://github.com/nodejs/node/commit/4686f4f41b)] - **perf_hooks**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#36723](https://github.com/nodejs/node/pull/36723) * [[`6adec6351e`](https://github.com/nodejs/node/commit/6adec6351e)] - **perf_hooks**: refactor to use more primordials (Antoine du Hamel) [#36297](https://github.com/nodejs/node/pull/36297) * [[`bf9aa425c0`](https://github.com/nodejs/node/commit/bf9aa425c0)] - **policy**: refactor to use more primordials (Antoine du Hamel) [#36210](https://github.com/nodejs/node/pull/36210) * [[`0f6c3f76b3`](https://github.com/nodejs/node/commit/0f6c3f76b3)] - **querystring**: refactor to use more primordials (Antoine du Hamel) [#36315](https://github.com/nodejs/node/pull/36315) * [[`b5b8a996f3`](https://github.com/nodejs/node/commit/b5b8a996f3)] - **readline**: refactor to use more primordials (Antoine du Hamel) [#36296](https://github.com/nodejs/node/pull/36296) * [[`cd981808b4`](https://github.com/nodejs/node/commit/cd981808b4)] - **repl**: document top level await limitation with const/let (James M Snell) [#38449](https://github.com/nodejs/node/pull/38449) * 
[[`a4eb5571eb`](https://github.com/nodejs/node/commit/a4eb5571eb)] - **repl**: display prompt once after error callback (Anna Henningsen) [#38314](https://github.com/nodejs/node/pull/38314) * [[`163fcecb69`](https://github.com/nodejs/node/commit/163fcecb69)] - **src**: fix multiple AddLinkedBinding() calls (Anna Henningsen) [#39012](https://github.com/nodejs/node/pull/39012) * [[`8809ef98f9`](https://github.com/nodejs/node/commit/8809ef98f9)] - **src**: update cares\_wrap OpenBSD defines (Anna Henningsen) [#38670](https://github.com/nodejs/node/pull/38670) * [[`d66f88ce97`](https://github.com/nodejs/node/commit/d66f88ce97)] - **src**: remove extra semi after member fn (Shelley Vohr) [#38686](https://github.com/nodejs/node/pull/38686) * [[`bc2111c7e6`](https://github.com/nodejs/node/commit/bc2111c7e6)] - **src**: make workers messaging more resilient (Juan José Arboleda) [#38510](https://github.com/nodejs/node/pull/38510) * [[`378e0e778b`](https://github.com/nodejs/node/commit/378e0e778b)] - **src**: fix validation of negative offset to avoid abort (James M Snell) [#38421](https://github.com/nodejs/node/pull/38421) * [[`c170026b7b`](https://github.com/nodejs/node/commit/c170026b7b)] - **src**: use %progbits instead of @progbits (Stephen Gallagher) [#38312](https://github.com/nodejs/node/pull/38312) * [[`d177541b0e`](https://github.com/nodejs/node/commit/d177541b0e)] - **src**: fix setting Converter sub char length (James M Snell) [#38331](https://github.com/nodejs/node/pull/38331) * [[`e279b029c0`](https://github.com/nodejs/node/commit/e279b029c0)] - **src**: avoid deferred gc/cleanup for Buffer.from (James M Snell) [#38337](https://github.com/nodejs/node/pull/38337) * [[`006c7b78da`](https://github.com/nodejs/node/commit/006c7b78da)] - **src**: indent long help text properly (David Glasser) [#37911](https://github.com/nodejs/node/pull/37911) * [[`f5541ddea3`](https://github.com/nodejs/node/commit/f5541ddea3)] - **src**: fix ETW\_WRITE\_EMPTY\_EVENT macro (Michaël 
Zasso) [#37334](https://github.com/nodejs/node/pull/37334) * [[`6b1052d034`](https://github.com/nodejs/node/commit/6b1052d034)] - **src**: disable unfixable MSVC warnings (Michaël Zasso) [#37334](https://github.com/nodejs/node/pull/37334) * [[`38afa3fa79`](https://github.com/nodejs/node/commit/38afa3fa79)] - **src**: avoid implicit type conversions (take 2) (Michaël Zasso) [#37334](https://github.com/nodejs/node/pull/37334) * [[`8a60ae2161`](https://github.com/nodejs/node/commit/8a60ae2161)] - **src**: fix compiler warnings in node\_buffer.cc (Darshan Sen) [#38722](https://github.com/nodejs/node/pull/38722) * [[`78cde14c45`](https://github.com/nodejs/node/commit/78cde14c45)] - **src**: fix compiler warning in env.cc (Anna Henningsen) [#35547](https://github.com/nodejs/node/pull/35547) * [[`ea311a41cc`](https://github.com/nodejs/node/commit/ea311a41cc)] - **src**: add check against non-weak BaseObjects at process exit (Anna Henningsen) [#35490](https://github.com/nodejs/node/pull/35490) * [[`a1b4681efc`](https://github.com/nodejs/node/commit/a1b4681efc)] - **src**: use transferred consistently (Daniel Bevenius) [#36340](https://github.com/nodejs/node/pull/36340) * [[`29c623e5cb`](https://github.com/nodejs/node/commit/29c623e5cb)] - **src**: fix label indentation (Rich Trott) [#36213](https://github.com/nodejs/node/pull/36213) * [[`dbb0d2612c`](https://github.com/nodejs/node/commit/dbb0d2612c)] - **stream**: fix multiple Writable.destroy() calls (Robert Nagy) [#38221](https://github.com/nodejs/node/pull/38221) * [[`a18b1ff80b`](https://github.com/nodejs/node/commit/a18b1ff80b)] - **stream**: the position of \_read() is wrong (helloyou2012) [#38292](https://github.com/nodejs/node/pull/38292) * [[`ab130e929a`](https://github.com/nodejs/node/commit/ab130e929a)] - **stream**: only use legacy close listeners if not willEmitClose (Robert Nagy) [#36649](https://github.com/nodejs/node/pull/36649) * [[`c31e2f6b0f`](https://github.com/nodejs/node/commit/c31e2f6b0f)] - 
**stream**: fix legacy pipe error handling (Robert Nagy) [#35257](https://github.com/nodejs/node/pull/35257) * [[`1dc4dea215`](https://github.com/nodejs/node/commit/1dc4dea215)] - **string_decoder**: throw ERR\_STRING\_TOO\_LONG for UTF-8 (Michaël Zasso) [#36661](https://github.com/nodejs/node/pull/36661) * [[`0db9101922`](https://github.com/nodejs/node/commit/0db9101922)] - **string_decoder**: refactor to use more primordials (Antoine du Hamel) [#36358](https://github.com/nodejs/node/pull/36358) * [[`8a44ee478e`](https://github.com/nodejs/node/commit/8a44ee478e)] - **test**: improve coverage of lib/\_http\_client.js (Rongjian Zhang) [#38599](https://github.com/nodejs/node/pull/38599) * [[`8a45b85dbd`](https://github.com/nodejs/node/commit/8a45b85dbd)] - **test**: improve coverage of lib/os.js (Rongjian Zhang) [#38653](https://github.com/nodejs/node/pull/38653) * [[`d7c6a3eb03`](https://github.com/nodejs/node/commit/d7c6a3eb03)] - **test**: call functions internally (Voltrex) [#38560](https://github.com/nodejs/node/pull/38560) * [[`726cb48bd8`](https://github.com/nodejs/node/commit/726cb48bd8)] - **test**: complete coverage of querystring (Rongjian Zhang) [#38520](https://github.com/nodejs/node/pull/38520) * [[`4f1ba79eb8`](https://github.com/nodejs/node/commit/4f1ba79eb8)] - **test**: increase coverage for AbortController (ZiJian Liu) [#38514](https://github.com/nodejs/node/pull/38514) * [[`d98b355336`](https://github.com/nodejs/node/commit/d98b355336)] - **test**: run message and pseudo-tty tests in parallel (Richard Lau) [#38502](https://github.com/nodejs/node/pull/38502) * [[`7938af6565`](https://github.com/nodejs/node/commit/7938af6565)] - **test**: move test-net-connect-econnrefused from pummel to sequential (Rich Trott) [#38462](https://github.com/nodejs/node/pull/38462) * [[`52f3837518`](https://github.com/nodejs/node/commit/52f3837518)] - **test**: fix `common.mustCall` `length` and `name` properties (Antoine du Hamel) 
[#38464](https://github.com/nodejs/node/pull/38464) * [[`fdfb39e692`](https://github.com/nodejs/node/commit/fdfb39e692)] - **test**: address deprecation warning (Rich Trott) [#38448](https://github.com/nodejs/node/pull/38448) * [[`25e5afe3be`](https://github.com/nodejs/node/commit/25e5afe3be)] - **test**: move abort test from pummel to abort directory (Rich Trott) [#38396](https://github.com/nodejs/node/pull/38396) * [[`296b969e0a`](https://github.com/nodejs/node/commit/296b969e0a)] - **test**: skip some pummel tests on slower machines (Rich Trott) [#38394](https://github.com/nodejs/node/pull/38394) * [[`a9ff9c0918`](https://github.com/nodejs/node/commit/a9ff9c0918)] - **test**: add ancestor package.json checks for tmpdir (Richard Lau) [#38285](https://github.com/nodejs/node/pull/38285) * [[`c9ce98c377`](https://github.com/nodejs/node/commit/c9ce98c377)] - **test**: replace function with arrow function and remove unused argument (Andres) [#38235](https://github.com/nodejs/node/pull/38235) * [[`c77abf5a89`](https://github.com/nodejs/node/commit/c77abf5a89)] - **test**: use .test domain for not found address (Richard Lau) [#38286](https://github.com/nodejs/node/pull/38286) * [[`d9eb8b3ed0`](https://github.com/nodejs/node/commit/d9eb8b3ed0)] - **test**: increase fs promise coverage (Emil Sivervik) [#36813](https://github.com/nodejs/node/pull/36813) * [[`d85b70fffa`](https://github.com/nodejs/node/commit/d85b70fffa)] - **test**: increase timeout on ASAN Action (Antoine du Hamel) [#37007](https://github.com/nodejs/node/pull/37007) * [[`836fba52ea`](https://github.com/nodejs/node/commit/836fba52ea)] - **test**: improve coverage of `SourceTextModule` getters (Juan José Arboleda) [#37013](https://github.com/nodejs/node/pull/37013) * [[`f43fc6b6cc`](https://github.com/nodejs/node/commit/f43fc6b6cc)] - **test**: improve coverage for `Module` getters (Juan José Arboleda) [#36950](https://github.com/nodejs/node/pull/36950) * 
[[`a45d280c18`](https://github.com/nodejs/node/commit/a45d280c18)] - **test**: improve coverage on worker threads (Juan José Arboleda) [#36910](https://github.com/nodejs/node/pull/36910) * [[`ec4d79e259`](https://github.com/nodejs/node/commit/ec4d79e259)] - **test**: improve coverage at `lib/internal/vm/module.js` (Juan José Arboleda) [#36898](https://github.com/nodejs/node/pull/36898) * [[`c34de75687`](https://github.com/nodejs/node/commit/c34de75687)] - **test**: guard large string decoder allocation (Michaël Zasso) [#36795](https://github.com/nodejs/node/pull/36795) * [[`3215a58843`](https://github.com/nodejs/node/commit/3215a58843)] - **test**: add already-aborted-controller test for spawn() (Rich Trott) [#36644](https://github.com/nodejs/node/pull/36644) * [[`c3b116795b`](https://github.com/nodejs/node/commit/c3b116795b)] - **test**: add test for reused AbortController with execfile() (Rich Trott) [#36644](https://github.com/nodejs/node/pull/36644) * [[`219ed0ba4c`](https://github.com/nodejs/node/commit/219ed0ba4c)] - **test**: add Actions annotation output (Mary Marchini) [#34590](https://github.com/nodejs/node/pull/34590) * [[`89ee6abae0`](https://github.com/nodejs/node/commit/89ee6abae0)] - **test**: use `.then(common.mustCall())` for all async IIFEs (Anna Henningsen) [#34363](https://github.com/nodejs/node/pull/34363) * [[`294b3e60a5`](https://github.com/nodejs/node/commit/294b3e60a5)] - **test,doc,lib**: adjust object literal newlines for lint rule (Rich Trott) [#37040](https://github.com/nodejs/node/pull/37040) * [[`bfe02b8808`](https://github.com/nodejs/node/commit/bfe02b8808)] - **test,readline**: improve tab completion coverage (Antoine du Hamel) [#38465](https://github.com/nodejs/node/pull/38465) * [[`1dc7fd238c`](https://github.com/nodejs/node/commit/1dc7fd238c)] - **timers**: fix unsafe array iteration (Darshan Sen) [#37223](https://github.com/nodejs/node/pull/37223) * [[`679973866d`](https://github.com/nodejs/node/commit/679973866d)] - **timers**: 
reject with AbortError on cancellation (Benjamin Gruenbaum) [#36317](https://github.com/nodejs/node/pull/36317) * [[`dec3610a31`](https://github.com/nodejs/node/commit/dec3610a31)] - **timers**: refactor to use more primordials (Antoine du Hamel) [#36132](https://github.com/nodejs/node/pull/36132) * [[`d84b05a619`](https://github.com/nodejs/node/commit/d84b05a619)] - **timers**: cleanup abort listener on awaitable timers (James M Snell) [#36006](https://github.com/nodejs/node/pull/36006) * [[`f6e4dbb779`](https://github.com/nodejs/node/commit/f6e4dbb779)] - **tls**: validate ticket keys buffer (Antoine du Hamel) [#38308](https://github.com/nodejs/node/pull/38308) * [[`661e9809bd`](https://github.com/nodejs/node/commit/661e9809bd)] - **tls**: fix session and keylog add listener segfault (Nitzan Uziely) [#38180](https://github.com/nodejs/node/pull/38180) * [[`de44e90523`](https://github.com/nodejs/node/commit/de44e90523)] - **tools**: refloat 7 Node.js patches to cpplint.py (Rich Trott) [#36324](https://github.com/nodejs/node/pull/36324) * [[`37bc7d5945`](https://github.com/nodejs/node/commit/37bc7d5945)] - **tools**: bump cpplint to 1.5.4 (Rich Trott) [#36324](https://github.com/nodejs/node/pull/36324) * [[`84e918858e`](https://github.com/nodejs/node/commit/84e918858e)] - **tools**: refloat 7 Node.js patches to cpplint.py (Rich Trott) [#36235](https://github.com/nodejs/node/pull/36235) * [[`fb2bb93f95`](https://github.com/nodejs/node/commit/fb2bb93f95)] - **tools**: bump cpplint to 1.5.3 (Rich Trott) [#36235](https://github.com/nodejs/node/pull/36235) * [[`3351910f97`](https://github.com/nodejs/node/commit/3351910f97)] - **tools**: refloat 7 Node.js patches to cpplint.py (Rich Trott) [#36213](https://github.com/nodejs/node/pull/36213) * [[`193b18effa`](https://github.com/nodejs/node/commit/193b18effa)] - **tools**: bump cpplint.py to 1.5.2 (Rich Trott) [#36213](https://github.com/nodejs/node/pull/36213) * 
[[`8a6c35d735`](https://github.com/nodejs/node/commit/8a6c35d735)] - **tools**: update ESLint to 7.27.0 (Luigi Pinca) [#38764](https://github.com/nodejs/node/pull/38764) * [[`f8753b6299`](https://github.com/nodejs/node/commit/f8753b6299)] - **tools**: update ESLint to 7.26.0 (Colin Ihrig) [#38605](https://github.com/nodejs/node/pull/38605) * [[`1098aec40b`](https://github.com/nodejs/node/commit/1098aec40b)] - **tools**: update ESLint to 7.25.0 (Colin Ihrig) [#38378](https://github.com/nodejs/node/pull/38378) * [[`3fbabfa94d`](https://github.com/nodejs/node/commit/3fbabfa94d)] - **tools**: update ESLint to 7.24.0 (Colin Ihrig) [#38179](https://github.com/nodejs/node/pull/38179) * [[`6ce779cd8b`](https://github.com/nodejs/node/commit/6ce779cd8b)] - **tools**: update ESLint to 7.23.0 (Luigi Pinca) [#37979](https://github.com/nodejs/node/pull/37979) * [[`77f88e7725`](https://github.com/nodejs/node/commit/77f88e7725)] - **tools**: update ESLint to 7.22.0 (Colin Ihrig) [#37734](https://github.com/nodejs/node/pull/37734) * [[`5de911eeaf`](https://github.com/nodejs/node/commit/5de911eeaf)] - **tools**: make update-eslint.sh work with npm@7 (Luigi Pinca) [#37566](https://github.com/nodejs/node/pull/37566) * [[`839976669f`](https://github.com/nodejs/node/commit/839976669f)] - **tools**: add support for mjs and cjs JS snippet linting (Antoine du Hamel) [#37311](https://github.com/nodejs/node/pull/37311) * [[`2463bd0689`](https://github.com/nodejs/node/commit/2463bd0689)] - **tools**: update eslint-plugin-markdown configuration (Colin Ihrig) [#37549](https://github.com/nodejs/node/pull/37549) * [[`f868fac455`](https://github.com/nodejs/node/commit/f868fac455)] - **tools**: enable object-curly-newline in ESLint rules (Rich Trott) [#37040](https://github.com/nodejs/node/pull/37040) * [[`d13508d219`](https://github.com/nodejs/node/commit/d13508d219)] - **tools**: make GH Actions workflows work if default branch is not master (Antoine du Hamel) 
[#38516](https://github.com/nodejs/node/pull/38516) * [[`7021c31d06`](https://github.com/nodejs/node/commit/7021c31d06)] - **tools**: use mktemp to create the workspace directory (Luigi Pinca) [#38432](https://github.com/nodejs/node/pull/38432) * [[`16a3e555ba`](https://github.com/nodejs/node/commit/16a3e555ba)] - **tools**: use a shallow clone of the npm/cli repository (Luigi Pinca) [#38463](https://github.com/nodejs/node/pull/38463) * [[`3484a23140`](https://github.com/nodejs/node/commit/3484a23140)] - **tools**: remove fixer for non-ascii-character ESLint custom rule (Rich Trott) [#38413](https://github.com/nodejs/node/pull/38413) * [[`aec4b295e4`](https://github.com/nodejs/node/commit/aec4b295e4)] - **tools**: fix doc generation when version info is not available (Antoine du Hamel) [#38398](https://github.com/nodejs/node/pull/38398) * [[`0172b110a3`](https://github.com/nodejs/node/commit/0172b110a3)] - **tools**: add \_depot\_tools to PATH (for V8 tests) (DeeDeeG) [#38299](https://github.com/nodejs/node/pull/38299) * [[`d0eed18c87`](https://github.com/nodejs/node/commit/d0eed18c87)] - **tools**: fix type mismatch in test runner (Richard Lau) [#38289](https://github.com/nodejs/node/pull/38289) * [[`11ca018db9`](https://github.com/nodejs/node/commit/11ca018db9)] - **tools**: simplify eslint comma-dangle configuration (tools) (Rich Trott) [#37883](https://github.com/nodejs/node/pull/37883) * [[`f7c14e86a7`](https://github.com/nodejs/node/commit/f7c14e86a7)] - **tools**: simplify eslint comma-dangle configuration (Rich Trott) [#37850](https://github.com/nodejs/node/pull/37850) * [[`241e05795b`](https://github.com/nodejs/node/commit/241e05795b)] - **tools**: run doctool tests on GitHub Actions CI (Antoine du Hamel) [#37398](https://github.com/nodejs/node/pull/37398) * [[`a4dd50f8f9`](https://github.com/nodejs/node/commit/a4dd50f8f9)] - **tools**: refactor prefer-primordials (Antoine du Hamel) [#36018](https://github.com/nodejs/node/pull/36018) * 
[[`4af3906e72`](https://github.com/nodejs/node/commit/4af3906e72)] - **tools**: update ESLint to 7.21.0 (Luigi Pinca) [#37546](https://github.com/nodejs/node/pull/37546) * [[`955880de1a`](https://github.com/nodejs/node/commit/955880de1a)] - **tools**: update ESLint to 7.20.0 (Colin Ihrig) [#37339](https://github.com/nodejs/node/pull/37339) * [[`42c1f98a31`](https://github.com/nodejs/node/commit/42c1f98a31)] - **tools**: update ESLint to 7.19.0 (Colin Ihrig) [#37159](https://github.com/nodejs/node/pull/37159) * [[`25eb720b4d`](https://github.com/nodejs/node/commit/25eb720b4d)] - **tools**: update ESLint to 7.18.0 (Colin Ihrig) [#36955](https://github.com/nodejs/node/pull/36955) * [[`4983ef205e`](https://github.com/nodejs/node/commit/4983ef205e)] - **tools**: update gyp-next to v0.7.0 (Michaël Zasso) [#36580](https://github.com/nodejs/node/pull/36580) * [[`613378da1e`](https://github.com/nodejs/node/commit/613378da1e)] - **tools**: update ESLint to 7.17.0 (Colin Ihrig) [#36726](https://github.com/nodejs/node/pull/36726) * [[`e6d01f6545`](https://github.com/nodejs/node/commit/e6d01f6545)] - **tools**: update ESLint to 7.16.0 (Yongsheng Zhang) [#36579](https://github.com/nodejs/node/pull/36579) * [[`98806da810`](https://github.com/nodejs/node/commit/98806da810)] - **tools**: enable no-unsafe-optional-chaining lint rule (Colin Ihrig) [#36411](https://github.com/nodejs/node/pull/36411) * [[`7d411920f6`](https://github.com/nodejs/node/commit/7d411920f6)] - **tools**: update ESLint to 7.15.0 (Colin Ihrig) [#36411](https://github.com/nodejs/node/pull/36411) * [[`226a86c3b5`](https://github.com/nodejs/node/commit/226a86c3b5)] - **tools**: enable no-unused-expressions lint rule (Michaël Zasso) [#36248](https://github.com/nodejs/node/pull/36248) * [[`24a81c7d6c`](https://github.com/nodejs/node/commit/24a81c7d6c)] - **tools**: enable no-nonoctal-decimal-escape lint rule (Colin Ihrig) [#36217](https://github.com/nodejs/node/pull/36217) * 
[[`19d4eb17b9`](https://github.com/nodejs/node/commit/19d4eb17b9)] - **tools**: update ESLint to 7.14.0 (Colin Ihrig) [#36217](https://github.com/nodejs/node/pull/36217) * [[`9fa8d2037f`](https://github.com/nodejs/node/commit/9fa8d2037f)] - **tools**: add linting rule for async IIFEs (Anna Henningsen) [#34363](https://github.com/nodejs/node/pull/34363) * [[`55fc206d13`](https://github.com/nodejs/node/commit/55fc206d13)] - **tools**: update ESLint to 7.13.0 (Luigi Pinca) [#36031](https://github.com/nodejs/node/pull/36031) * [[`937fc0a30c`](https://github.com/nodejs/node/commit/937fc0a30c)] - **tools**: update ESLint to 7.12.1 (Colin Ihrig) [#35799](https://github.com/nodejs/node/pull/35799) * [[`29d0840a90`](https://github.com/nodejs/node/commit/29d0840a90)] - **tools**: update ESLint to 7.12.0 (Colin Ihrig) [#35799](https://github.com/nodejs/node/pull/35799) * [[`dcbd44758c`](https://github.com/nodejs/node/commit/dcbd44758c)] - **tools**: update ESLint to 7.11.0 (Colin Ihrig) [#35578](https://github.com/nodejs/node/pull/35578) * [[`c7751b4e69`](https://github.com/nodejs/node/commit/c7751b4e69)] - **tools**: add new ESLint rule: prefer-primordials (Leko) [#35448](https://github.com/nodejs/node/pull/35448) * [[`9a5411a2b4`](https://github.com/nodejs/node/commit/9a5411a2b4)] - **tools,doc**: add support for several flavors of JS code snippets (Antoine du Hamel) [#37162](https://github.com/nodejs/node/pull/37162) * [[`e19478aa76`](https://github.com/nodejs/node/commit/e19478aa76)] - **tools,lib**: recommend using safe primordials (Antoine du Hamel) [#36026](https://github.com/nodejs/node/pull/36026) * [[`5f848a612d`](https://github.com/nodejs/node/commit/5f848a612d)] - **tools,lib**: tighten prefer-primordials rules for Error statics (Antoine du Hamel) [#36017](https://github.com/nodejs/node/pull/36017) * [[`716076e389`](https://github.com/nodejs/node/commit/716076e389)] - **tty**: refactor to avoid unsafe array iteration (Antoine du Hamel) 
[#36771](https://github.com/nodejs/node/pull/36771) * [[`41d74a4d9a`](https://github.com/nodejs/node/commit/41d74a4d9a)] - **tty**: refactor to use more primordials (Zijian Liu) [#36272](https://github.com/nodejs/node/pull/36272) * [[`e35a3543fd`](https://github.com/nodejs/node/commit/e35a3543fd)] - **typings**: add JSDoc typings for util (Rohit Gohri) [#38213](https://github.com/nodejs/node/pull/38213) * [[`c8b22185f7`](https://github.com/nodejs/node/commit/c8b22185f7)] - **url**: refactor to use more primordials (Antoine du Hamel) [#36316](https://github.com/nodejs/node/pull/36316) * [[`e113035c9a`](https://github.com/nodejs/node/commit/e113035c9a)] - **util**: simplify constructor retrieval in inspect() (Rich Trott) [#36466](https://github.com/nodejs/node/pull/36466) * [[`1551b40d01`](https://github.com/nodejs/node/commit/1551b40d01)] - **v8**: refactor to use more primordials (Antoine du Hamel) [#36527](https://github.com/nodejs/node/pull/36527) * [[`6c1bbb5caf`](https://github.com/nodejs/node/commit/6c1bbb5caf)] - **v8**: refactor to use more primordials (Antoine du Hamel) [#36285](https://github.com/nodejs/node/pull/36285) * [[`3aee77d279`](https://github.com/nodejs/node/commit/3aee77d279)] - **vm**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#36752](https://github.com/nodejs/node/pull/36752) * [[`0dea86634d`](https://github.com/nodejs/node/commit/0dea86634d)] - **wasi**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#36724](https://github.com/nodejs/node/pull/36724) * [[`2c66305ac4`](https://github.com/nodejs/node/commit/2c66305ac4)] - ***Revert*** "**worker**: remove `ERR_CLOSED_MESSAGE_PORT`" (Juan José Arboleda) [#38510](https://github.com/nodejs/node/pull/38510) * [[`698bffaa90`](https://github.com/nodejs/node/commit/698bffaa90)] - **worker**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#37346](https://github.com/nodejs/node/pull/37346) * [[`3d4785c174`](https://github.com/nodejs/node/commit/3d4785c174)] 
- **worker**: refactor to use more primordials (Antoine du Hamel) [#36267](https://github.com/nodejs/node/pull/36267) * [[`8702b045a4`](https://github.com/nodejs/node/commit/8702b045a4)] - **zlib**: fix brotli flush range (Khaidi Chu) [#38408](https://github.com/nodejs/node/pull/38408) * [[`459fe6864e`](https://github.com/nodejs/node/commit/459fe6864e)] - **zlib**: refactor to avoid unsafe array iteration (Antoine du Hamel) [#36722](https://github.com/nodejs/node/pull/36722) * [[`740638de0f`](https://github.com/nodejs/node/commit/740638de0f)] - **zlib**: refactor to use primordial instead of \<string\>.startsWith (Rohan Chougule) [#36718](https://github.com/nodejs/node/pull/36718) * [[`32e10f388c`](https://github.com/nodejs/node/commit/32e10f388c)] - **zlib**: refactor to use more primordials (Antoine du Hamel) [#36347](https://github.com/nodejs/node/pull/36347) Windows 32-bit Installer: https://nodejs.org/dist/v14.17.1/node-v14.17.1-x86.msi<br> Windows 64-bit Installer: https://nodejs.org/dist/v14.17.1/node-v14.17.1-x64.msi<br> Windows 32-bit Binary: https://nodejs.org/dist/v14.17.1/win-x86/node.exe<br> Windows 64-bit Binary: https://nodejs.org/dist/v14.17.1/win-x64/node.exe<br> macOS 64-bit Installer: https://nodejs.org/dist/v14.17.1/node-v14.17.1.pkg<br> macOS Intel 64-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-darwin-x64.tar.gz<br> Linux 64-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-linux-x64.tar.xz<br> Linux PPC LE 64-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-linux-ppc64le.tar.xz<br> Linux s390x 64-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-linux-s390x.tar.xz<br> AIX 64-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-aix-ppc64.tar.gz<br> ARMv7 32-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-linux-armv7l.tar.xz<br> ARMv8 64-bit Binary: https://nodejs.org/dist/v14.17.1/node-v14.17.1-linux-arm64.tar.xz<br> Source Code: 
https://nodejs.org/dist/v14.17.1/node-v14.17.1.tar.gz<br> Other release files: https://nodejs.org/dist/v14.17.1/<br> Documentation: https://nodejs.org/docs/v14.17.1/api/ ### SHASUMS ``` -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA256 b2aaa7d5cffd4ea950aa65e92ffa88781e74b0dd29028963c2b74a58bd72ff04 node-v14.17.1-aix-ppc64.tar.gz 864d09627c8dc9038e0235fccf2110b60c8942713c15352de2d203278798ff0d node-v14.17.1-darwin-x64.tar.gz 79c2b290ff06cd95ddcb42cf5e83322f8b533ff9f6f8665e780e3b06212ecda1 node-v14.17.1-darwin-x64.tar.xz ee72ed2c935c0d162969c5bcdeb29fc83a2883c0b6757f4510d7bd70512a9418 node-v14.17.1-headers.tar.gz da9534360a814e258226505a9f097e6c90b373f4ea75cb5fadd70506371ea85d node-v14.17.1-headers.tar.xz 04e25f5511408288913dd1955f6829431e5096911aa3e35c9cd0ca8b39e6c4c5 node-v14.17.1-linux-arm64.tar.gz 5285c1716b0992984112255ef982c4e5ba3ec1b1202319be2f15ce3a24592a80 node-v14.17.1-linux-arm64.tar.xz b3a7b0dfe6e841ce67ce8a947d13dde50d7bba4505558cc0cc617afe1ce08b48 node-v14.17.1-linux-armv7l.tar.gz 05784d934f93b6b7a22c2cf170ba88f3c647b85fb9953e178d6be7d692387609 node-v14.17.1-linux-armv7l.tar.xz 0a0a3b721d22c42309915c5951c8b701776914caa4fbcda570fba3f9b36efee1 node-v14.17.1-linux-ppc64le.tar.gz c9e88eeebc139a4fedeb749452082fd15826b358709a1d16aeb84fd1ec8b4543 node-v14.17.1-linux-ppc64le.tar.xz 1abfd4a39b2656ea95b6b03e0f8d5a0ce225923f73f886ba6d877706ec0a172b node-v14.17.1-linux-s390x.tar.gz 0ec89ac56f39ff75003d79b3a95c64336f711e580ab1b75e015a3662456cc693 node-v14.17.1-linux-s390x.tar.xz 4781b162129b19bdb3a7010cab12d06fc7c89421ea3fda03346ed17f09ceacd6 node-v14.17.1-linux-x64.tar.gz 2921eba80c839e06d68b60b27fbbcbc7822df437f3f8d58717ec5a7703563ba4 node-v14.17.1-linux-x64.tar.xz dbcaa370c322325beaa56847c06169f11d2e5acc7719e4c8395e740c3bbefade node-v14.17.1.pkg f85297faa15529cf134e9cfd395371fea62e092c3fe2127f2b0fdf8504905cee node-v14.17.1.tar.gz ddf1d2d56ddf35ecd98c5ea5ddcd690b245899f289559b4330c921255f5a247f node-v14.17.1.tar.xz 
d99bf8769198188ce4609e222e988de66dd2c3c529a9cebebc56556879c85fe4 node-v14.17.1-win-x64.7z c2264359aa15c0a95d347ebb382413a597d1698a4a2c8af73309b6e4e517ff04 node-v14.17.1-win-x64.zip 75396248fd5b5e0d6617aa8029fb07bc5dcf036e1d33ff82896e33224c43e0cd node-v14.17.1-win-x86.7z 45b0996f28c8eeafc128e4a9d86f2f06e9f376a41b5db46dc69e89ce61e44a54 node-v14.17.1-win-x86.zip f4cea916af6d80784f5d9daab59292ceba04bf636e4e2e6727153f89b1e6990f node-v14.17.1-x64.msi 995420f483a181b0e535dd8f4661a7d6fe8f2d1601adb8dbc48f896d619d5ab0 node-v14.17.1-x86.msi 4b648906236eb32914407b46e1f9c217a23199306be05287bf912cf4362e41aa win-x64/node.exe 44ff33abb17d86cb3be368527d018acb6fda3d724ff7f0d81eab5ba2e0ec46ae win-x64/node.lib 019c8d96c5ce1e8875b11d411541f621f7b0aab1f355c35ad356dc89d2a85e74 win-x64/node_pdb.7z 01b4f47e6f16ee237c06512813914bdd5b6cca025fe7a568328193bd6d83ba31 win-x64/node_pdb.zip 7b0368e9a8f38cb13bb6940b94ced71a57a4ca823b58029f0587e784f7bf6a82 win-x86/node.exe 593ea194f25d7be97988d710d380320244b27c067067ebc4afdcf4bb22e4e78c win-x86/node.lib ba30a318e4700dbce5a781faf5a148ef80f5908f057591157af9b59b5f4191ef win-x86/node_pdb.7z 45905a33717f05405443f8f29f186aba734d60ed8c668bd2d80e2f734b0b436f win-x86/node_pdb.zip -----BEGIN PGP SIGNATURE----- iQIzBAEBCAAdFiEEj8yhP+8dDC6RAI4Jdw96mlrhVgAFAmDIoSgACgkQdw96mlrh VgCfoQ//Zf0LnaX01O90ineCZmnlJM1RTON3F6GeR+c62jLiZ4JcQja+c4Lp2BDq i9XGChx7beOgZOefiGv1S1El8dAeeo4pPcEvwP225pmJuNVy3wdMeIEeHpGeYMnw kMfZGtnAjUkT+tnXSXMhqqYoOTGeMHf8yeviBkEQqVd4sriheXpK1/iztcOfjqtj 7D30V3IOllOQ2vn5B+slfALUJBstFo44dnAMmdeuMXl/ySX/yA7kBh1moEGfuuPP s9d81T1MwKMJEg6EAVK+6k/wDx7tDgPfO0b/C4FgwaeyE0qspCfEPKq0tGmXM7t2 nwhq2EaTHPCEhpkwFjgjVdUhUELvH1fEkQO0vKLS4xEWIJ8l1E/Vf+JfVIWKxC1N dJ94eF+9899HkT7mC1MOldDbrk8MQOgIhhHbl1vxqbU0wRv4FqBB+QLWPPSAZrgc 5QE2rAVBO4gVmTGEvBqYs0fXgFHnH2jxXknog7S0aV6rG3hhpF25fNFF5q3rLL+F SnN5bu+fsZndIUJGLJ+Ph7Sjotalyrhijq9CNlhBeg0m53hlt/Ul9oYfTjwa3QIC GzDncJ69ZcTLWio70cBI0llTJ21V1YDzgaWTFZIRIPe7CT68bvmDOpoG+cGZGAJM 
c+OT13V1ZHmVf0puHir4bt2WcauaCt8rdRCCZZff8BBJMA90bsA= =HnL5 -----END PGP SIGNATURE----- ```
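The SHASUMS block above exists so downloads can be integrity-checked. A minimal sketch of that flow, demonstrated with a throwaway file so it runs anywhere `sha256sum` is available — the `demo.bin` / `SHASUMS.demo.txt` names are illustrative stand-ins, not release files:

```shell
# Verify a file against a SHA-256 manifest line, using a throwaway file
# as a stand-in for a downloaded release artifact.
set -e
printf 'hello\n' > demo.bin              # stand-in for a downloaded tarball
sha256sum demo.bin > SHASUMS.demo.txt    # stand-in for SHASUMS256.txt
# Select the manifest line for the file you downloaded and re-check it:
grep ' demo.bin$' SHASUMS.demo.txt | sha256sum -c -
# For a real release the same pattern applies, e.g.:
#   grep 'node-v14.17.1-linux-x64.tar.xz$' SHASUMS256.txt | sha256sum -c -
```

`sha256sum -c -` reads `hash  filename` lines from stdin and recomputes each hash, printing `filename: OK` on a match and failing otherwise.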
153.215517
219
0.736923
yue_Hant
0.298647
97bd377d03156bc868c10dbfb1c2661a48f6afa7
434
md
Markdown
_posts/2017-05-30-ex-5-7.md
Ak1a/ak1la.github.io
7e9caa1ae7edde5d0df8bdef67fc128fe579e803
[ "MIT" ]
null
null
null
--- layout: post title: A7 category: five --- Developing this resource as practice. At the moment the site looks fairly primitive, since it relies on GitHub's built-in capabilities rather than proper hosting with support for backend functionality. But even without a full server side, it is possible to use Bootstrap, work out the site's structure (under discussion) and its design, and adopt Jekyll and Disqus.
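The plan sketched in the post (Jekyll on GitHub Pages, with Disqus for comments) could start from a minimal `_config.yml` fragment like the one below. All values here are illustrative assumptions for a sketch, not taken from the post's actual site:

```yaml
# Minimal _config.yml sketch for a Jekyll site on GitHub Pages.
# Names and values below are placeholders, not the post's real settings.
title: ak1la.github.io            # assumed site title
baseurl: ""                       # serve from the repo root on GitHub Pages
markdown: kramdown                # GitHub Pages' default Markdown engine
# Disqus embed: themes typically read a shortname like this from config
disqus:
  shortname: example-shortname    # placeholder; register one at disqus.com
```

GitHub Pages builds the site automatically on push, so no server-side code is needed — which matches the post's constraint of hosting without a backend.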
72.333333
387
0.829493
rus_Cyrl
0.98644
97be2e01c46c54e7c48db3b16a5beb9b0086af32
320
md
Markdown
_posts/2017-12-23-the-roman-forumhttpwwwdesparozcomtheromanforum.md
desparoz/desparoz.github.io
7014a828648b4a392df6e78f1c746e737b2584f8
[ "MIT" ]
null
null
null
--- layout: post microblog: true date: 2017-12-23 21:42 +0300 guid: http://desparoz.micro.blog/2017/12/23/the-roman-forumhttpwwwdesparozcomtheromanforum.html --- [The Roman Forum](http://www.desparoz.com/2017/12/24/the-roman-forum/) <img src="http://desparoz.me/uploads/2017/3d3199ba5d.jpg" width="600" height="450" />
32
95
0.740625
yue_Hant
0.618497
97be77ab693b35ff7b7408c8079b8913ba47c729
74
md
Markdown
README.md
Andrew4d3/udemy-node-advanced
f8b6c5055ae923e15292d5a89969923945239a22
[ "MIT" ]
null
null
null
# udemy-node-advanced Personal course notes for Node JS: Advanced Concepts
24.666667
51
0.810811
eng_Latn
0.426691