## Dataset schema

Each row carries the following fields (dtype and min/max length or value range as reported by the dataset viewer):

| Field | Dtype | Range |
|---|---|---|
| hexsha | stringlengths | 40 to 40 |
| size | int64 | 5 to 1.04M |
| ext | stringclasses | 6 values |
| lang | stringclasses | 1 value |
| max_stars_repo_path | stringlengths | 3 to 344 |
| max_stars_repo_name | stringlengths | 5 to 125 |
| max_stars_repo_head_hexsha | stringlengths | 40 to 78 |
| max_stars_repo_licenses | listlengths | 1 to 11 |
| max_stars_count | int64 | 1 to 368k |
| max_stars_repo_stars_event_min_datetime | stringlengths | 24 to 24 |
| max_stars_repo_stars_event_max_datetime | stringlengths | 24 to 24 |
| max_issues_repo_path | stringlengths | 3 to 344 |
| max_issues_repo_name | stringlengths | 5 to 125 |
| max_issues_repo_head_hexsha | stringlengths | 40 to 78 |
| max_issues_repo_licenses | listlengths | 1 to 11 |
| max_issues_count | int64 | 1 to 116k |
| max_issues_repo_issues_event_min_datetime | stringlengths | 24 to 24 |
| max_issues_repo_issues_event_max_datetime | stringlengths | 24 to 24 |
| max_forks_repo_path | stringlengths | 3 to 344 |
| max_forks_repo_name | stringlengths | 5 to 125 |
| max_forks_repo_head_hexsha | stringlengths | 40 to 78 |
| max_forks_repo_licenses | listlengths | 1 to 11 |
| max_forks_count | int64 | 1 to 105k |
| max_forks_repo_forks_event_min_datetime | stringlengths | 24 to 24 |
| max_forks_repo_forks_event_max_datetime | stringlengths | 24 to 24 |
| content | stringlengths | 5 to 1.04M |
| avg_line_length | float64 | 1.14 to 851k |
| max_line_length | int64 | 1 to 1.03M |
| alphanum_fraction | float64 | 0 to 1 |
| lid | stringclasses | 191 values |
| lid_prob | float64 | 0.01 to 1 |
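A minimal Python sketch of how one row of this dataset could be checked against the schema above. The field names and ranges come from the schema; the validation logic itself is our illustration, not part of the dataset tooling:

```python
def validate_row(row: dict) -> list:
    """Return a list of schema violations for one dataset row."""
    problems = []
    if len(row.get("hexsha", "")) != 40:
        problems.append("hexsha must be exactly 40 characters")
    if not (5 <= row.get("size", 0) <= 1_040_000):  # 5 .. ~1.04M bytes
        problems.append("size out of range")
    if not (0.0 <= row.get("alphanum_fraction", 0.0) <= 1.0):
        problems.append("alphanum_fraction must be in [0, 1]")
    if row.get("lang") != "Markdown":  # the schema reports a single class
        problems.append("lang must be 'Markdown'")
    return problems

row = {"hexsha": "3f325ed143c527efbdf11d510519a4563cf012dd",
       "size": 1012, "alphanum_fraction": 0.76087, "lang": "Markdown"}
print(validate_row(row))  # []
```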
---

**Row 1: `iaue/plotter-server` / `README.md`**

- hexsha: `3f325ed143c527efbdf11d510519a4563cf012dd` · size: 1,012 · ext: `md` · lang: Markdown
- repo head: `e05b96cb7b8e8d3dca24990edfa5292fe02c582a` · licenses: `[ "MIT" ]`
- stars: 2 (2016-06-08T10:37:39.000Z to 2016-07-03T18:40:58.000Z) · issues: 6 (2016-06-08T07:19:54.000Z to 2016-06-27T11:02:10.000Z) · forks: null
![alt tag](https://github.com/iaue/poiju.io/blob/master/public/poiju-logo-menu.png?raw=true)

# POIJU.IO [![Sponsored by Chilicorn.org](https://img.shields.io/badge/sponsored%20by-chilicorn.org-brightgreen.svg)](http://chilicorn.org)

- NodeJS plotter server for Raspberry Pi boat computer.
- Mapbox GL JS

Current coverage: Finnish sea and lake areas

List of packages and services in use:

- Add to home screen: http://cubiq.org/add-to-home-screen
- Loading spinner: http://spin.js.org/
- Map tiles: https://www.mapbox.com/mapbox-gl-js/api/
- Tile caching: https://www.npmjs.com/package/tiny-cache
- Tile cache hashing: https://www.npmjs.com/package/js-md5
- Application favicons: http://realfavicongenerator.net/
- SSL signing: https://www.cloudflare.com/
- Heroku SSL redirect: https://www.npmjs.com/package/heroku-ssl-redirect
- Mobile app icons: https://makeappicon.com
- Mobile app launch screen images: Asset Catalog Creator http://www.bridgetech.io/

More information and usage instructions coming soon.
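The tile-cache hashing mentioned in the list (js-md5) suggests deriving cache keys by hashing tile URLs. A small Python sketch of that idea; the URL scheme and key format here are our assumptions, not POIJU.IO's actual code:

```python
import hashlib

def tile_cache_key(z: int, x: int, y: int) -> str:
    """Derive a fixed-length cache key from a tile URL (js-md5 style)."""
    # Hypothetical tile URL; the real server's URL layout may differ.
    url = f"https://api.mapbox.com/tiles/{z}/{x}/{y}.vector.pbf"
    return hashlib.md5(url.encode("utf-8")).hexdigest()

key = tile_cache_key(10, 582, 293)
print(len(key))  # 32
```

Hashing keeps cache keys short and uniform regardless of URL length, at the cost of not being reversible.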
*Stats: avg_line_length 44 · max_line_length 139 · alphanum_fraction 0.76087 · lid kor_Hang (0.471735)*
---

**Row 2: `af4jm/vscode-iCalendar` / `README.md`**

- hexsha: `3f32fcf188122ac60bd35d67203fdeb8b92773e6` · size: 299 · ext: `md` · lang: Markdown
- repo head: `7e413f429653cd31411d0182d277262d145f4568` · licenses: `[ "MIT" ]`
- stars: 3 (2018-01-18T10:17:21.000Z to 2021-12-15T09:34:24.000Z) · issues: null · forks: null
# vscode-iCalendar

iCalendar language definition for VSCode.

Install free from the [Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=af4jm.vscode-iCalendar)

Adapted from the `.tmLanguage` file in https://github.com/kimsey0/iCalendar-sublime. Icon from iconfinder.com.
*Stats: avg_line_length 42.714286 · max_line_length 168 · alphanum_fraction 0.80602 · lid yue_Hant (0.626928)*
---

**Row 3: `DevKyleS/datadog-documentation` / `content/fr/database_monitoring/setup_postgres/advanced_configuration.md`**

- hexsha: `3f333172322d6f74e9c35d46276334af4d603883` · size: 2,351 · ext: `md` · lang: Markdown
- repo head: `70e5719f2e1de1acd7e66ac80364969426069b70` · licenses: `[ "BSD-3-Clause" ]`
- stars: null · issues: null · forks: null
---
description: Advanced configuration of Database Monitoring for Postgres
kind: documentation
title: Advanced configuration of Database Monitoring for Postgres
---

{{< site-region region="us5,gov" >}}
<div class="alert alert-warning">Database Monitoring is not supported for this site.</div>
{{< /site-region >}}

## Handling a large number of relations

If your Postgres database contains a large number of relations (several thousand), Datadog recommends adding the `collect_database_size_metrics: false` parameter to the instance configuration for that database. When this setting is disabled, the Agent does not run the `pg_database_size()` function used to collect database size statistics, because that function performs poorly on instances with a large number of tables.

```yaml
instances:
  - dbm: true
    ...
    collect_database_size_metrics: false
```

Additionally, if you partition your data across many tables whose definitions are identical except for their names, you can end up with a large number of normalized queries:

```sql
SELECT * FROM daily_aggregates_001
SELECT * FROM daily_aggregates_002
SELECT * FROM daily_aggregates_003
```

In that case, use the `replace_digits` option to monitor these queries as a single normalized query. All metrics associated with these queries are then grouped under one query:

```sql
SELECT * FROM daily_aggregates_?
```

Add the `replace_digits` option to your database's instance configuration in the Datadog Agent:

```yaml
instances:
  - dbm: true
    ...
    obfuscator_options:
      replace_digits: true
```

## Increasing the sampling rate

If some queries are relatively infrequent or execute quickly, increase the sampling rate by lowering the `collection_interval` value so that samples are collected more often.

Set the `collection_interval` parameter in your database's instance configuration in the Datadog Agent. It defaults to 1. Lower the value for a shorter interval:

```yaml
instances:
  - dbm: true
    ...
    query_samples:
      collection_interval: 0.1
```
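The effect of `replace_digits` can be sketched in a few lines of Python. This is only an illustration of the normalization idea, not Datadog's actual obfuscator:

```python
import re

def replace_digits(sql: str) -> str:
    """Collapse digit runs so partitioned tables normalize to one query."""
    return re.sub(r"\d+", "?", sql)

queries = [
    "SELECT * FROM daily_aggregates_001",
    "SELECT * FROM daily_aggregates_002",
    "SELECT * FROM daily_aggregates_003",
]
normalized = {replace_digits(q) for q in queries}
print(normalized)  # {'SELECT * FROM daily_aggregates_?'}
```

All three partitioned queries collapse into one normalized key, which is why their metrics are grouped together.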
*Stats: avg_line_length 41.245614 · max_line_length 488 · alphanum_fraction 0.777116 · lid fra_Latn (0.95923)*
---

**Row 4: `sjakobi/selective` / `paper/response.md`**

- hexsha: `3f3416fb020328fe2dc0ba174a6d4277e4865ae7` · size: 6,514 · ext: `md` · lang: Markdown
- repo head: `087c10ab32fd52f1a8bfe1604cf97a303f817376` · licenses: `[ "MIT" ]`
- stars: 200 (2018-06-03T17:07:38.000Z to 2021-12-15T05:00:28.000Z) · issues: 32 (2019-02-20T16:39:31.000Z to 2022-01-02T00:24:02.000Z) · forks: 23 (2018-11-14T15:01:13.000Z to 2021-07-02T14:33:11.000Z)
We thank the reviewers for their feedback, and will address all minor suggestions in the revision.

# Key points

> **A:** The paper fails to clearly convey what makes Applicative functors (strictly) weaker than Selective functors.

The `Applicative` operator `<*>` composes independent computations. The `Selective` interface adds the operator `select` to compose dependent computations, where *a computation can depend on the value produced by a previous computation*. This makes the `Selective` interface strictly more powerful. While we already state this at the beginning of S2, we'll make this key insight more prominent in the revision.

> **A:** I was not entirely convinced about the interface, laws or the relationship with Applicative Functors... Is there any type constructor f, which is Applicative but not Selective?

The answer is "No" (we say this in line 192).

> **A:** this is odd because the authors argue that Applicative < Selective < Monad. Thus I would have expected some things to be Applicative, but not Selective.

The relationship between `Applicative` and `Selective` differs from the relationship between `Applicative` and `Monad`. Not every `Applicative` is a `Monad`, but every `Applicative` is a `Selective`. The subclass relationship `Applicative < Selective` is justified not by possible instances, but by the extra method `select` in `Selective`. While `select = selectA` is a valid implementation of `select`, *it is not the only useful implementation*, as demonstrated by the `Selective` instances `Over` and `Under`: indeed, `Over` uses `select = selectA`, but `Under` doesn't.

The hierarchy therefore reflects method set inclusion: `{<*>}` < `{<*>, select}` < `{<*>, select, >>=}`. Different applications require different sets of methods. For example, *Haxl requires all three*: `<*>` gives parallelism, `select` gives speculative execution, and `>>=` gives arbitrary dynamic effects. We'll clarify this subtle point in the revision.

> **B:** In the implementation of `write` you evaluate the value to get the associated effects. It's clear that this is needed for the static analysis, but I worry that it will lead to quadratic or exponential blowup in the simulation.

Thank you for pointing out this problem! Indeed, the implementation of `write` presented in S5.3 causes an exponential blowup when simulating `write` chains, such as `write k0 (write k1 (write k2 (read k3)))`, performing `read k3` 2^3=8 times. Fortunately, we can fix the problem as follows. We simplify the implementation of `write` in line 919 to:

```
write k fv = liftSelect (Write k fv id)
```

Static analysis (`getProgramEffects` below) is then performed via the natural transformation `toOver` that records effects in `fv` plus the write effect `Write k` itself:

```
toOver :: RW a -> Over [RW ()] a
toOver (Read  k _   ) = Over [Read k (const ())]
toOver (Write k fv _) = runSelect toOver fv *> Over [Write k fv (const ())]

getProgramEffects :: Program a -> [RW ()]
getProgramEffects = getOver . runSelect toOver
```

The natural transformation `toState` needs no changes. The above fix not only removes effect duplication, but also makes the implementation more uniform. We'll include the fix into the revision.

> **C:** At least, I would like to see a concrete instance that is Selective but (at least believed) not (to be) ArrowChoice... I do not believe that we "could use arrows to solve our static analysis and speculative execution examples"

A `Selective` instance cannot be an instance of `ArrowChoice` because of kind mismatch, so it's unclear how to respond to the first comment without making additional assumptions. Here is an example of static analysis based on free `ArrowChoice`:

```
newtype FreeArrowChoice f a b = FreeArrowChoice
    { runFreeArrowChoice :: forall arr. ArrowChoice arr
                         => (forall i o. f i o -> arr i o) -> arr a b }

newtype ConstArrow m a b = ConstArrow { getConstArrow :: m }

foldArrowChoice :: Monoid m => (forall i o. f i o -> m)
                -> FreeArrowChoice f a b -> m
foldArrowChoice t arr = getConstArrow $ runFreeArrowChoice arr (ConstArrow . t)
```

`ConstArrow` is similar to the `Const` functor: we convert the "base arrow" `f` to `ConstArrow` using the function `t`, and statically accumulate the resulting monoidal effect labels. To execute a `FreeArrowChoice` with actual values and branching we can use the `Kleisli` arrow. We'll provide a complete implementation as supplementary material and link to it.

# Details

> **A:** Wouldn't there also be quite a few advantages of having selective in the Applicative instance?

One advantage is that adding `select` to `Applicative` avoids breaking the `Applicative => Monad` hierarchy. However, that would still break some code, because `select` would clash with existing definitions with the same name. We can elaborate on this in the revision.

> **A:** Laws: The associativity law does indeed look rather ugly. I do wonder if an alternative interface would have not indeed be a better option.

The `biselect` method (S6.2) has a simpler associativity law, but is more complex in other aspects. While future implementations of selective functors might use `biselect`, we believe that `select` is more appropriate for introducing the idea of selective functors to the broader community.

> **B:** You write "parametricity dictates that, when given a `Left a`, we must execute the effects in `f (a -> b)`". It should be pointed out that this is only true if you are required to produce a `b`.

> **B:** Validation does in fact satisfy both the pure Left and pure Right

Indeed! We'll fix both issues.

> **C:** Maybe, I have missed some descriptions, but the rigidness is the key insight for Section 5, the discussions about typical reasons for a selective functor to be not rigid would strengthen the paper.

> **C:** It is hard to understand the program presented here [in Fig. 6]

We agree and will address these suggestions.

> **C:** "Interestingly, adding the ability to branch on infinite number of cases makes selective functors equivalent to a monad" -- Is it really true? How can we perform such a branching for a function for example without breaking parametricity?

Daniel Peebles pointed out a variation of `branch`, which is `>>=` in disguise:

```
branch :: f (Sigma h) -> (forall a. h a -> f (a -> b)) -> f b
```

`Sigma h` is a "tagged union" with generalised "tags" `h`, which can in particular be functions. We'll provide a complete implementation as supplementary material.
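The `select`/`Over`/`Under` discussion above can be sketched outside Haskell. Here is a minimal Python analogue (our illustration, not the paper's code): `select` runs a handler only on a `Left`, while the two static approximations either count the handler's effects (`Over`, since the handler *may* run) or ignore them (`Under`, since it *may* be skipped):

```python
class Left:
    """Either: a Left still needs the handler applied to its payload."""
    def __init__(self, a): self.a = a

class Right:
    """Either: a Right is already a final result."""
    def __init__(self, b): self.b = b

def select(either, handler):
    """Dynamic semantics: apply the handler only to a Left."""
    return handler(either.a) if isinstance(either, Left) else either.b

# Static semantics over lists of effect labels:
def select_over(fx_scrutinee, fx_handler):
    """Over-approximation: the handler's effects may happen."""
    return fx_scrutinee + fx_handler

def select_under(fx_scrutinee, fx_handler):
    """Under-approximation: the handler may be skipped entirely."""
    return fx_scrutinee

print(select(Right(5), lambda a: a + 1))  # 5
print(select(Left(4), lambda a: a + 1))   # 5
print(select_over(["read"], ["write"]))   # ['read', 'write']
print(select_under(["read"], ["write"]))  # ['read']
```

Both approximations are valid `select` implementations for analysis-only functors, which is exactly why the `Applicative < Selective` distinction is about methods rather than instances.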
*Stats: avg_line_length 70.043011 · max_line_length 570 · alphanum_fraction 0.749616 · lid eng_Latn (0.998722)*
---

**Row 5: `det-lab/dataReaderWriter` / `docs/kaitai-struct.md`**

- hexsha: `3f347243344ac8d89f64e9933f4a1acf4fecab24` · size: 3,472 · ext: `md` · lang: Markdown
- repo head: `d738e37d02ca052da527a5b636c05c8bbc0eb009` · licenses: `[ "MIT" ]`
- stars: 6 (2020-01-28T18:04:54.000Z to 2022-01-20T02:52:48.000Z) · issues: 1 (2020-09-10T01:02:50.000Z) · forks: 1 (2020-09-08T02:19:29.000Z)
# Kaitai Struct

### Overview

Kaitai Struct is a set of data-interfacing tools which can generate code in several languages, based on a description of the data structure - a Kaitai Struct YAML (.ksy) file.

- [Kaitai download and quickstart instructions](http://kaitai.io)
- [Kaitai Struct User Guide](https://doc.kaitai.io/user_guide.html)

Available Kaitai Struct tools include:

* [Kaitai-Struct-Compiler (KSC)](http://kaitai.io/#download): The compiler performs the heavy lifting and generates the source code associated with your format. It takes a user-created .ksy file and creates the code (in a language specified by the user) for interacting with the binary data.
* [Kaitai Stream API](http://doc.kaitai.io/stream_api.html): These libraries allow Kaitai to interface with various languages in order to generate the necessary code in the desired target language, acting essentially as a wrapper for your target language's native IO libraries. (Available as [kaitai-io github repositories](https://github.com/kaitai-io?utf8=✓&q=runtime)).
  - Note that these are required in order to use ksc-generated code as a library in a project.
  - Interpreted languages such as Python and Ruby require the installation of the kaitaistream package.
  - Compiled languages like C++ require compilation from source.
* [Kaitai-Struct-Visualizer (KSV)](https://github.com/kaitai-io/kaitai_struct_visualizer): The visualizer is installed as a ruby gem (and as such requires a Ruby interpreter on your machine) and is primarily used as a debugging tool for your data descriptor. It takes a binary file and your .ksy file as inputs and creates an interactive visual representation of the format.
* [Kaitai interactive webIDE](https://ide.kaitai.io/) with examples of many popular data/file formats. You can also import your own .ksy schema format and corresponding data file into the tool.

### Contents

* [simple_example/](../kaitai/simple_example/) A directory containing a simplified guide to using Kaitai Struct to read binary data into C++ std vectors. Includes pre-written example code as well as instructions to generate code with ksc (kaitai-struct-compiler). [Working with this example](./kaitai-simple-example.md)
* [awkward_example/](../kaitai/awkward_example/) An example program using Kaitai to read binary data into Awkward arrays (C++/python compatible). [Working with this example](./kaitai-awkward-example.md)
* `ksy/` A directory containing various example kaitai struct descriptions
  - `animal.ksy` A simplified data structure with some basic information about animals (species, age, and weight). Used in the `simple_example`
  - `scdms.ksy` and `midas.ksy` The ksy/scdms.ksy and ksy/midas.ksy files are an example of how to declare a more complicated custom binary format. These demonstrate a number of the capabilities of Kaitai Struct, including the ability to parse packed integer values, enumerated values, and conditional statements. They also demonstrate the ability to import formats described in other files, since the actual data is contained in the scdms.ksy format, which is then encapsulated within the midas.ksy structure. Note: an example program for these has not yet been written, but you can find some `scdms_raw.bin` data in `<dataReaderWriter>/data`. Test your Kaitai knowledge by writing up a basic program similar to the `simple_example`!
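As a rough illustration of what a ksc-generated parser does for a simple fixed layout, Python's `struct` module can decode the same kind of record by hand. The field layout here (a one-byte species id, a one-byte age, a two-byte little-endian weight) is a hypothetical example, not the actual contents of `animal.ksy`:

```python
import struct

# Hypothetical 4-byte record: u1 species_id, u1 age, u2le weight.
# A ksc-generated Python class reads the same fields from a KaitaiStream.
record = bytes([7, 3, 0x2C, 0x01])  # species 7, age 3, weight 0x012C = 300
species_id, age, weight = struct.unpack("<BBH", record)
print(species_id, age, weight)  # 7 3 300
```

Kaitai's advantage over hand-written `struct` calls is that one declarative .ksy file yields equivalent parsers in every supported target language.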
*Stats: avg_line_length 84.682927 · max_line_length 487 · alphanum_fraction 0.769873 · lid eng_Latn (0.997255)*
---

**Row 6: `simonlevine/DeepSilencer` / `README.md`**

- hexsha: `3f349d285f50d197992a2bd0fc069e5fdd0569be` · size: 5,109 · ext: `md` · lang: Markdown
- repo head: `3d3f6d8a9a10bbe04610fc151d059b7ad7fdd192` · licenses: `[ "Apache-2.0" ]`
- stars: null · issues: null · forks: null
# DeepSilencer

#### A deep convolutional neural network for the accurate prediction of silencers

For accurate classification of silencers, we propose a CNN-based model named DeepSilencer. As illustrated in the following figure, DeepSilencer consists of four modules. First, a pre-processing module transforms DNA sequences into matrices of sequences and counts of k-mers. Second, a CNN module uses a convolutional neural network (CNN) with multiple convolutional and pooling layers to extract features from the matrices of DNA sequences. Third, an ANN module is adopted to learn the characteristics of the k-mers. Finally, a joint module integrates the outputs of the CNN and ANN modules to predict the probability.

<div align=center>
<img src = "inst/Figure1.png" width = 85% height = 85%>
</div>

## Installation

Requirements:

1. Python 3.5 or later version (<= 3.7)
2. Packages: numpy (>=1.15.1), keras (2.3.1), tensorflow(-gpu) (1.15.2), hickle (>=3.4)

Package installation:

```
$ pip install -U numpy
$ pip install keras==2.3.1
$ pip install tensorflow-gpu==1.15.2  # or: pip install tensorflow==1.15.2
$ pip install -U hickle
$ git clone https://github.com/xy-chen16/DeepSilencer.git
$ cd DeepSilencer
```

<div align=center>

| Method | Version |
| :----: | :----: |
| DeepSilencer | 0.1.0 |
| gkmSVM | v1.3 |
| SVM | 0.22.1 |
| correlation | 0.22.1 |

</div>

## Data Preprocessing

### Load the genome files:

```
$ cd data
$ mkdir -p genome/mm10 && cd genome/mm10
$ nohup wget http://hgdownload.cse.ucsc.edu/goldenPath/mm10/bigZips/chromFa.tar.gz
$ tar zvfx chromFa.tar.gz
$ cd ..
$ mkdir hg19 && cd hg19
$ nohup wget http://hgdownload.cse.ucsc.edu/goldenPath/hg19/bigZips/chromFa.tar.gz
$ tar zvfx chromFa.tar.gz
$ cd ../../..
```

### Unzip the open region files and result files:

```
$ tar -xjvf data/open_region.tar.bz2 -C data
$ tar -xjvf result/result.tar.bz2 -C result
```

## Tutorial

We collected the uncharacterized cis-regulatory elements (CREs) in K562 cells with MPRA provided by Jayavelu et al. We then chose the 2000 uncharacterized CRE sequences with the lowest MPRA activity as a positive set, and the 2000 uncharacterized CREs with the highest MPRA activity as a negative set. We also downloaded the uncharacterized CREs in Homo sapiens and Mus musculus without MPRA values, and we further used these sequences to find candidate silencers.

### Self-projection

To demonstrate the classification performance of DeepSilencer, we conducted a self-projection experiment and compared our method with the gapped k-mer SVM (gkmSVM). We randomly selected 80% of the data as the training set and used the remaining 20% of the data for testing the two models.

```
$ python code/run_self_projection.py
```

The performance of DeepSilencer is shown in the following figure.

<div align=center>
<img src = "inst/Figure2.png" width = 60% height = 60%>
</div>

The performance of the two methods is shown in the following table.

<div align=center>

| Method | AUC | PRC |
| :----: | :----: | :----: |
| DeepSilencer | **0.827** | **0.842** |
| gkmSVM | 0.81 | 0.76 |

</div>

### Cross-dataset projection

In order to find candidate silencer elements in Homo sapiens and Mus musculus, we trained the DeepSilencer model on all of the sequences used in the self-projection experiments. First, we trained the model:

```
$ python code/train_for_crossdata_projection.py
```

Predict the candidate silencer elements in Homo sapiens (hg19):

```
$ python code/run_crossdata_projection_human.py
```

Predict the candidate silencer elements in Mus musculus (mm10):

```
$ python code/run_crossdata_projection_mouse.py
```

Then you can check the results in the *results* folder.

## Analysis

We obtained candidate silencers from the uncharacterized CREs in human and mouse using the trained DeepSilencer model. We then ran comparative analyses between silencers predicted in the human K562 cell line by the gkmSVM-based model and by our DeepSilencer model, and found almost no difference between them. The length distribution of silencers from the gkmSVM-based model and our DeepSilencer model is shown in Figure A. We found no significant difference in the distance between silencers and the nearest coding genes (two-sided Wilcoxon test p-value = 0.5209, Figure B), the GC content (two-sided Wilcoxon test p-value = 0.9299, Figure C), or the chromatin accessibility (two-sided Wilcoxon test p-value = 0.5699, Figure D).

<div align=center>
<img src = "inst/Figure3.png" width = 90% height = 90%>
</div>

Indeed, most of the silencers were found by both the gkmSVM-based model and our DeepSilencer model. However, some silencers are unique to the DeepSilencer model. For example, chr3: 48,997,302-48,997,398 is a silencer predicted by the DeepSilencer model but not by the gkmSVM-based model, and chr3: 48,997,266-48,997,466, a silencer validated in the human K562 cell line, covers this predicted silencer.
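The pre-processing step described above (sequence matrices plus k-mer counts) can be sketched as follows. This is our illustration of the general idea, not DeepSilencer's actual feature-extraction code:

```python
from itertools import product

BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA sequence as a len(seq) x 4 binary matrix (CNN input)."""
    return [[1 if base == b else 0 for b in BASES] for base in seq]

def kmer_counts(seq, k=2):
    """Count occurrences of every possible k-mer (ANN input)."""
    kmers = ["".join(p) for p in product(BASES, repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    return counts

seq = "ACGTAC"
print(one_hot(seq)[0])         # [1, 0, 0, 0]  (the leading 'A')
print(kmer_counts(seq)["AC"])  # 2
```

The one-hot matrix preserves positional information for the convolutional layers, while the k-mer count vector summarizes composition for the dense layers, matching the two input branches the README describes.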
*Stats: avg_line_length 48.198113 · max_line_length 750 · alphanum_fraction 0.734977 · lid eng_Latn (0.981582)*
---

**Row 7: `dipakkr/chrisbanes.github.io` / `_posts/2010-09-20-blue-notify-v1-2-3.markdown`**

- hexsha: `3f34bdca19f08fe05f46bb155282a95baaffbacf` · size: 637 · ext: `markdown` · lang: Markdown
- repo head: `1eb1c56d696042b275c2da28831ae006fabd6d13` · licenses: `[ "MIT" ]`
- stars: null · issues: null · forks: 1 (2020-01-11T06:18:48.000Z)
---
layout: post
title: Blue Notify v1.2.3
date: '2010-09-20 00:00:00'
---

Hey, I've just released v1.2.3 onto the Market. Features include:

* Facebook Zero (http://0.facebook.com) support
* Organised the Preferences a bit better
* Filter which type of Facebook notifications alert you (Likes and Comments at the moment).
* Friends' Birthdays notifications now open the Facebook application (if you have the Facebook application set as your preference).

Get it on Android Market :-)
*Stats: avg_line_length 39.8125 · max_line_length 162 · alphanum_fraction 0.733124 · lid eng_Latn (0.804532)*
---

**Row 8: `Ryooooooga/windows-driver-docs.ja-jp` / `windows-driver-docs-pr/debugger/bug-check-0x19a--worker-thread-returned-while-attached-to-silo.md`**

- hexsha: `3f350258b93ace2b1fbd64bc822f6cb35cad50e8` · size: 1,640 · ext: `md` · lang: Markdown
- repo head: `c7526f4e7d66ff01ae965b5670d19fd4be158f04` · licenses: `[ "CC-BY-4.0", "MIT" ]`
- stars: null · issues: null · forks: null
---
title: Bug Check 0x19A WORKER_THREAD_RETURNED_WHILE_ATTACHED_TO_SILO
description: The WORKER_THREAD_RETURNED_WHILE_ATTACHED_TO_SILO bug check has a value of 0x0000019A. It indicates that a worker thread attached to a silo and failed to detach before returning.
ms.assetid: D947FF20-4C86-4879-A5CA-934A20BE61C9
keywords:
- Bug Check 0x19A WORKER_THREAD_RETURNED_WHILE_ATTACHED_TO_SILO
- WORKER_THREAD_RETURNED_WHILE_ATTACHED_TO_SILO
ms.date: 05/23/2017
topic_type:
- apiref
api_name:
- WORKER_THREAD_RETURNED_WHILE_ATTACHED_TO_SILO
api_type:
- NA
ms.localizationpriority: medium
ms.openlocfilehash: cfb14d551f99ba293e9b2427f2076124d8e8c871
ms.sourcegitcommit: d03b44343cd32b3653d0471afcdd3d35cb800c0d
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 07/02/2019
ms.locfileid: "67519795"
---

# <a name="bug-check-0x19a-workerthreadreturnedwhileattachedtosilo"></a>Bug Check 0x19A: WORKER\_THREAD\_RETURNED\_WHILE\_ATTACHED\_TO\_SILO

The WORKER\_THREAD\_RETURNED\_WHILE\_ATTACHED\_TO\_SILO bug check has a value of 0x0000019A. It indicates that a worker thread attached to a silo and failed to detach before returning.

> [!IMPORTANT]
> This topic is for programmers. If you are a customer who has received a blue screen error code while using your computer, see [Troubleshoot blue screen errors](https://www.windows.com/stopcode).

## <a name="workerthreadreturnedwhileattachedtosilo-parameters"></a>WORKER\_THREAD\_RETURNED\_WHILE\_ATTACHED\_TO\_SILO Parameters

| Parameter | Description |
|-----------|---------------------------|
| 1 | Address of the worker routine |
| 2 | Parameter of the work item |
| 3 | Address of the work item |
| 4 | Reserved |

<a name="cause"></a>Cause
-----

To investigate, use the [**ln (List Nearest Symbols)**](ln--list-nearest-symbols-.md) command on Parameter 1 to identify the driver that is behaving incorrectly.
*Stats: avg_line_length 29.285714 · max_line_length 146 · alphanum_fraction 0.746341 · lid yue_Hant (0.752748)*
---

**Row 9: `rumnamanya/rumnamanya.github.io` / `_project/a-topography-sculpted-of-folded-skewed-metal-planes-the-vail-grant-house-seems-to-enter.md`**

- hexsha: `3f351d080c8e3d60489463673819a7b0130cab17` · size: 574 · ext: `md` · lang: Markdown
- repo head: `2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9` · licenses: `[ "MIT" ]`
- stars: null · issues: null · forks: null
---
layout: project_single
title: "A topography sculpted of folded, skewed metal planes, the Vail Grant House seems to enter into a love affair with its hillside site, blurring the boundaries between the natural and the artificial"
slug: "a-topography-sculpted-of-folded-skewed-metal-planes-the-vail-grant-house-seems-to-enter"
parent: "metal-building-homes-interior-ideas"
---

A topography sculpted of folded, skewed metal planes, the Vail Grant House seems to enter into a love affair with its hillside site, blurring the boundaries between the natural and the artificial
*Stats: avg_line_length 82 · max_line_length 205 · alphanum_fraction 0.803136 · lid eng_Latn (0.994597)*
---

**Row 10: `funRiceGenes/funRiceGenes.github.io` / `_posts/L/2015-01-20-LTN1~OsPHO2~OsRLS1.md`**

- hexsha: `3f3540b3af9fe6b48484795101854806027d10c2` · size: 26,481 · ext: `md` · lang: Markdown
- repo head: `e27a14e23dd8c3d6127e38ee62240dc9e01008be` · licenses: `[ "MIT" ]`
- stars: 4 (2017-08-09T02:48:10.000Z to 2020-11-11T01:54:08.000Z) · issues: 1 (2020-05-31T13:03:01.000Z to 2020-06-01T01:47:14.000Z) · forks: 6 (2018-10-03T20:47:32.000Z to 2021-07-19T01:58:31.000Z)
---
layout: post
title: "LTN1,OsPHO2,OsRLS1"
description: ""
category: genes
tags: [root, growth, reproductive, transporter, seedling, shoot, architecture, adventitious root, pi, leaf, phosphorus, phosphate, root architecture, nitrogen, development, homeostasis, flowering time, Pi, Pi homeostasis, Pi signaling]
---

* **Information**
    + Symbol: LTN1,OsPHO2,OsRLS1
    + MSU: [LOC_Os05g48390](http://rice.uga.edu/cgi-bin/ORF_infopage.cgi?orf=LOC_Os05g48390)
    + RAPdb: [Os05g0557700](http://rapdb.dna.affrc.go.jp/viewer/gbrowse_details/irgsp1?name=Os05g0557700)

* **Publication**
    + [Fine characterization of OsPHO2 knockout mutants reveals its key role in Pi utilization in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Fine characterization of OsPHO2 knockout mutants reveals its key role in Pi utilization in rice%5BTitle%5D), 2014, J Plant Physiol.
    + [Molecular mechanisms regulating Pi-signaling and Pi homeostasis under OsPHR2, a central Pi-signaling regulator, in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular mechanisms regulating Pi-signaling and Pi homeostasis under OsPHR2, a central Pi-signaling regulator, in rice%5BTitle%5D), 2011, Frontiers in Biology.
    + [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), 2010, Plant J.
    + [Involvement of OsSPX1 in phosphate homeostasis in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Involvement of OsSPX1 in phosphate homeostasis in rice%5BTitle%5D), 2009, Plant J.
    + [Regulation of OsSPX1 and OsSPX3 on expression of OsSPX domain genes and Pi-starvation signaling in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Regulation of OsSPX1 and OsSPX3 on expression of OsSPX domain genes and Pi-starvation signaling in rice%5BTitle%5D), 2009, J Integr Plant Biol.
    + [Down-regulation of OsSPX1 causes high sensitivity to cold and oxidative stresses in rice seedlings](http://www.ncbi.nlm.nih.gov/pubmed?term=Down-regulation of OsSPX1 causes high sensitivity to cold and oxidative stresses in rice seedlings%5BTitle%5D), 2013, PLoS One.
    + [Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa](http://www.ncbi.nlm.nih.gov/pubmed?term=Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa%5BTitle%5D), 2014, New Phytol.
    + [LEAF TIP NECROSIS1 plays a pivotal role in the regulation of multiple phosphate starvation responses in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=LEAF TIP NECROSIS1 plays a pivotal role in the regulation of multiple phosphate starvation responses in rice%5BTitle%5D), 2011, Plant Physiol.
    + [A constitutive expressed phosphate transporter, OsPht1;1, modulates phosphate uptake and translocation in phosphate-replete rice](http://www.ncbi.nlm.nih.gov/pubmed?term=A constitutive expressed phosphate transporter, OsPht1;1, modulates phosphate uptake and translocation in phosphate-replete rice%5BTitle%5D), 2012, Plant Physiol.
    + [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), 2017, Plant Cell Environ.
    + [Knocking Out the Gene RLS1 Induces Hypersensitivity to Oxidative Stress and Premature Leaf Senescence in Rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Knocking Out the Gene RLS1 Induces Hypersensitivity to Oxidative Stress and Premature Leaf Senescence in Rice.%5BTitle%5D), 2018, Int J Mol Sci.
    + [Phosphate-Starvation-Inducible S-Like RNase Genes in Rice Are Involved in Phosphate Source Recycling by RNA Decay](http://www.ncbi.nlm.nih.gov/pubmed?term=Phosphate-Starvation-Inducible S-Like RNase Genes in Rice Are Involved in Phosphate Source Recycling by RNA Decay%5BTitle%5D), 2020, Front Plant Sci.
    + [PROTEIN PHOSPHATASE95 Regulates Phosphate Homeostasis by Affecting Phosphate Transporter Trafficking in Rice[OPEN]](http://www.ncbi.nlm.nih.gov/pubmed?term=PROTEIN PHOSPHATASE95 Regulates Phosphate Homeostasis by Affecting Phosphate Transporter Trafficking in Rice[OPEN]%5BTitle%5D), 2021, Plant Cell.

* **Genbank accession number**
    + [AK241747](http://www.ncbi.nlm.nih.gov/nuccore/AK241747)

* **Key message**
    + The cells in the elongation zone of ospho2 seedling roots were much shorter than those of the wild type
    + The ospho2 mutants exhibited defects in growth and reproductive development in the whole growing period
    + Here we report that OsPHR2 positively regulates the low-affinity Pi transporter gene OsPT2 by physical interaction and upstream regulation of OsPHO2 in roots
    + Our data also show that OsSPX1 is a negative regulator of OsPHR2 and is involved in the feedback of Pi-signaling network in roots that is defined by OsPHR2 and OsPHO2
    + In association with enhanced Pi uptake and transport, some Pi transporters were up-regulated in the ltn1 mutant in the presence of sufficient Pi
    + This study thus provides evidence that OsPHO2, which functions at the downstream of OsPHF1, modulates Pi utilization by regulating the expression of Pht1 transporters in rice
    + Furthermore, the elongation of primary and adventitious roots was enhanced in the ltn1 mutant under Pi starvation, suggesting that LTN1 is involved in Pi-dependent root architecture alteration
    + These results suggest that ospho2 mutant phenotype results from a partial defect in Pi translocation and remobilization in the shoot of rice
    + OsPHR2 positively regulates the low-affinity Pi transporter OsPT2 through physical interaction and reciprocal regulation of OsPHO2 in roots
    + The ltn1 mutant exhibited increased Pi uptake and translocation, which led to Pi overaccumulation in shoots
    + Furthermore, OsPT1 expression was strongly enhanced by the mutation of Phosphate Overaccumulator2 (OsPHO2) but not by Phosphate Starvation Response2, indicating that OsPT1 is involved in the OsPHO2-regulated Pi pathway
    + Furthermore, Pi levels in the ospho2 mutants were highest in the oldest leaf and lowest in the youngest leaf, whereas there was no significant difference in the corresponding leaves of wild-type plants
    + OsSPX1 suppresses the regulation on expression of OsPT2 by OsPHR2 and the accumulation of excess shoot Pi, but it does not suppress induction of OsPT2 or the accumulation of excessive shoot Pi in the Ospho2 mutant
    + The phosphorus concentration in the blades of ospho2 mutants was 5
    + Map-based cloning identified LTN1 as LOC_Os05g48390, the putative ortholog of Arabidopsis PHO2, which plays important roles in Pi starvation signaling
    + Under Pi-sufficient conditions, typical Pi starvation responses such as stimulation of phosphatase and RNase activities, lipid composition alteration, nitrogen assimilation repression, and increased metal uptake were also activated in ltn1
    + Our results strongly indicate that LTN1 is a crucial Pi starvation signaling component downstream of miR399 involved in the regulation of multiple Pi starvation responses in rice
    + In this work, a rice leaf tip necrosis1 (ltn1) mutant was identified and characterized
    + Fine characterization of OsPHO2 knockout mutants reveals its key role in Pi utilization in rice
    + Previous research using forward genetics approaches demonstrated that OsPHO2 regulates multiple phosphate-starvation responses in rice
    + In this work, we finely characterized two independent OsPHO2 knockout rice mutants under inorganic phosphate (Pi)-sufficient conditions
    + Reduced growth, leaf tip necrosis, delayed flowering and over-accumulation of Pi in leaves compared to wild-type were shared features of Osgi and Ospho2 plants
    + The interaction between OsPHO2 and OsGI links high-level regulators of Pi homeostasis and development in rice
    + A yeast-two-hybrid screen using Oryza sativa (rice) PHO2 as bait, revealed an interaction between OsPHO2
and OsGIGANTEA, a key regulator of flowering time, which was confirmed using bimolecular flourescenec complementation (BiFC) + Pi analysis of individual leaves demonstrated that Osgi, similar to Ospho2 mutants, were impaired in Pi remobilization from old to young leaves, albeit to a lesser extent + Finally, the dynamic transcriptional regulation of OsRNS genes by overexpression of OsPHR2, ospho2 mutant, and overexpression of OsPT1 lines involved in Pi signaling pathway suggests the molecular basis of OsRNS family in Pi recycling via RNA decay under Pi starvation * **Connection** + __LTN1~OsPHO2~OsRLS1__, __OsPht1;2~OsPT2__, [Molecular mechanisms regulating Pi-signaling and Pi homeostasis under OsPHR2, a central Pi-signaling regulator, in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular mechanisms regulating Pi-signaling and Pi homeostasis under OsPHR2, a central Pi-signaling regulator, in rice%5BTitle%5D), OsPHR2 positively regulates the low-affinity Pi transporter OsPT2 through physical interaction and reciprocal regulation of OsPHO2 in roots + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [Molecular mechanisms regulating Pi-signaling and Pi homeostasis under OsPHR2, a central Pi-signaling regulator, in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular mechanisms regulating Pi-signaling and Pi homeostasis under OsPHR2, a central Pi-signaling regulator, in rice%5BTitle%5D), OsPHR2 positively regulates the low-affinity Pi transporter OsPT2 through physical interaction and reciprocal regulation of OsPHO2 in roots + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), Here we report that OsPHR2 positively regulates the low-affinity Pi transporter gene OsPT2 by physical interaction 
and upstream regulation of OsPHO2 in roots + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), OsSPX1 suppresses the regulation on expression of OsPT2 by OsPHR2 and the accumulation of excess shoot Pi, but it does not suppress induction of OsPT2 or the accumulation of excessive shoot Pi in the Ospho2 mutant + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), Our data also show that OsSPX1 is a negative regulator of OsPHR2 and is involved in the feedback of Pi-signaling network in roots that is defined by OsPHR2 and OsPHO2 + __LTN1~OsPHO2~OsRLS1__, __OsSPX1__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), OsSPX1 suppresses the regulation on expression of OsPT2 by OsPHR2 and the accumulation of excess shoot Pi, but it does not suppress induction of OsPT2 or the accumulation of excessive shoot Pi in the Ospho2 mutant + __LTN1~OsPHO2~OsRLS1__, __OsSPX1__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of 
rice%5BTitle%5D), Our data also show that OsSPX1 is a negative regulator of OsPHR2 and is involved in the feedback of Pi-signaling network in roots that is defined by OsPHR2 and OsPHO2 + __LTN1~OsPHO2~OsRLS1__, __OsPht1;2~OsPT2__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), Here we report that OsPHR2 positively regulates the low-affinity Pi transporter gene OsPT2 by physical interaction and upstream regulation of OsPHO2 in roots + __LTN1~OsPHO2~OsRLS1__, __OsPht1;2~OsPT2__, [OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice](http://www.ncbi.nlm.nih.gov/pubmed?term=OsSPX1 suppresses the function of OsPHR2 in the regulation of expression of OsPT2 and phosphate homeostasis in shoots of rice%5BTitle%5D), OsSPX1 suppresses the regulation on expression of OsPT2 by OsPHR2 and the accumulation of excess shoot Pi, but it does not suppress induction of OsPT2 or the accumulation of excessive shoot Pi in the Ospho2 mutant + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [Involvement of OsSPX1 in phosphate homeostasis in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Involvement of OsSPX1 in phosphate homeostasis in rice%5BTitle%5D), Suppression of OsSPX1 by RNA interference resulted in severe signs of toxicity caused by the over-accumulation of Pi, similar to that found in OsPHR2 (phosphate starvation response transcription factor 2) overexpressors and pho2 (phosphate-responsive mutant 2) + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [Involvement of OsSPX1 in phosphate homeostasis in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Involvement of OsSPX1 in phosphate homeostasis in rice%5BTitle%5D), Quantitative RT-PCR showed that expression of OsSPX1 was strongly 
induced in OsPHR2 overexpression and pho2 mutant plants, indicating that OsSPX1 occurs downstream of OsPHR2 and PHO2 + __LTN1~OsPHO2~OsRLS1__, __OsSPX1__, [Involvement of OsSPX1 in phosphate homeostasis in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Involvement of OsSPX1 in phosphate homeostasis in rice%5BTitle%5D), Suppression of OsSPX1 by RNA interference resulted in severe signs of toxicity caused by the over-accumulation of Pi, similar to that found in OsPHR2 (phosphate starvation response transcription factor 2) overexpressors and pho2 (phosphate-responsive mutant 2) + __LTN1~OsPHO2~OsRLS1__, __OsSPX1__, [Involvement of OsSPX1 in phosphate homeostasis in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Involvement of OsSPX1 in phosphate homeostasis in rice%5BTitle%5D), Quantitative RT-PCR showed that expression of OsSPX1 was strongly induced in OsPHR2 overexpression and pho2 mutant plants, indicating that OsSPX1 occurs downstream of OsPHR2 and PHO2 + __LTN1~OsPHO2~OsRLS1__, __OsSPX3__, [Regulation of OsSPX1 and OsSPX3 on expression of OsSPX domain genes and Pi-starvation signaling in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Regulation of OsSPX1 and OsSPX3 on expression of OsSPX domain genes and Pi-starvation signaling in rice%5BTitle%5D), OsSPX3 negatively regulates the PSI (Pi-starvation induced) gene, OsIPS1 and is involved in the responses of miR399 and OsPHO2 to Pi-starvation + __LTN1~OsPHO2~OsRLS1__, __OsSPX3__, [Regulation of OsSPX1 and OsSPX3 on expression of OsSPX domain genes and Pi-starvation signaling in rice](http://www.ncbi.nlm.nih.gov/pubmed?term=Regulation of OsSPX1 and OsSPX3 on expression of OsSPX domain genes and Pi-starvation signaling in rice%5BTitle%5D), OsSPX3 plays a role in OsIPS1/miR399 mediated long distance regulation on OsPHO2 + __LTN1~OsPHO2~OsRLS1__, __OsSPX1__, [Down-regulation of OsSPX1 causes high sensitivity to cold and oxidative stresses in rice seedlings](http://www.ncbi.nlm.nih.gov/pubmed?term=Down-regulation of OsSPX1 
causes high sensitivity to cold and oxidative stresses in rice seedlings%5BTitle%5D), OsPHO2) were significantly down-regulated by the antisense of OsSPX1 + __LTN1~OsPHO2~OsRLS1__, __OsARF12__, [Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa](http://www.ncbi.nlm.nih.gov/pubmed?term=Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa%5BTitle%5D), Knockout of OsARF12 also influenced the transcript abundances of the OsPHR2 gene and its downstream components, such as OsMiR399j, OsPHO2, OsMiR827, OsSPX-MFS1 and OsSPX-MFS2 + __LTN1~OsPHO2~OsRLS1__, __OsSPX-MFS1~OsPSS1__, [Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa](http://www.ncbi.nlm.nih.gov/pubmed?term=Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa%5BTitle%5D), Knockout of OsARF12 also influenced the transcript abundances of the OsPHR2 gene and its downstream components, such as OsMiR399j, OsPHO2, OsMiR827, OsSPX-MFS1 and OsSPX-MFS2 + __LTN1~OsPHO2~OsRLS1__, __OsSPX-MFS2__, [Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa](http://www.ncbi.nlm.nih.gov/pubmed?term=Auxin response factor OsARF12, a novel regulator for phosphate homeostasis in rice Oryza sativa%5BTitle%5D), Knockout of OsARF12 also influenced the transcript abundances of the OsPHR2 gene and its downstream components, such as OsMiR399j, OsPHO2, OsMiR827, OsSPX-MFS1 and OsSPX-MFS2 + __LTN1~OsPHO2~OsRLS1__, __OsPht1;1~OsPT1__, [A constitutive expressed phosphate transporter, OsPht1;1, modulates phosphate uptake and translocation in phosphate-replete rice](http://www.ncbi.nlm.nih.gov/pubmed?term=A constitutive expressed phosphate transporter, OsPht1;1, modulates phosphate uptake and translocation in phosphate-replete rice%5BTitle%5D), Furthermore, OsPT1 expression was strongly enhanced by the mutation of Phosphate 
Overaccumulator2 (OsPHO2) but not by Phosphate Starvation Response2, indicating that OsPT1 is involved in the OsPHO2-regulated Pi pathway + __LTN1~OsPHO2~OsRLS1__, __OsLPR3__, [Identification and expression analysis of OsLPR family revealed the potential roles of OsLPR3 and 5 in maintaining phosphate homeostasis in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Identification and expression analysis of OsLPR family revealed the potential roles of OsLPR3 and 5 in maintaining phosphate homeostasis in rice.%5BTitle%5D), Further, the expression levels of OsLPR3 and 5 were either attenuated in ossiz1 and ospho2 or augmented in rice overexpressing OsSPX1 + __LTN1~OsPHO2~OsRLS1__, __OsTrxh4__, [Two h-type thioredoxins interact with the PHO2ubiquitin-conjugating E2 enzyme to fine-tune phosphate homeostasis.](http://www.ncbi.nlm.nih.gov/pubmed?term=Two h-type thioredoxins interact with the PHO2ubiquitin-conjugating E2 enzyme to fine-tune phosphate homeostasis.%5BTitle%5D), A yeast-two-hybrid (Y2H) screen in Oryza sativa (Os, rice) using OsPHO2 as bait, revealed an interaction between OsPHO2 and two h-type thioredoxins, OsTrxh1 and OsTrxh4 + __LTN1~OsPHO2~OsRLS1__, __OsTrx23~OsTRXh1__, [Two h-Type Thioredoxins Interact with the E2 Ubiquitin Conjugase PHO2 to Fine-Tune Phosphate Homeostasis in Rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Two h-Type Thioredoxins Interact with the E2 Ubiquitin Conjugase PHO2 to Fine-Tune Phosphate Homeostasis in Rice.%5BTitle%5D), A yeast two-hybrid (Y2H) screen in rice (Oryza sativa; Os) using OsPHO2 as bait revealed an interaction between OsPHO2 and two h-type thioredoxins, OsTrxh1 and OsTrxh4 + __LTN1~OsPHO2~OsRLS1__, __OsTrx23~OsTRXh1__, [Two h-Type Thioredoxins Interact with the E2 Ubiquitin Conjugase PHO2 to Fine-Tune Phosphate Homeostasis in Rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Two h-Type Thioredoxins Interact with the E2 Ubiquitin Conjugase PHO2 to Fine-Tune Phosphate Homeostasis in Rice.%5BTitle%5D), Characterization of 
rice pho2 complemented lines, transformed with an endogenous genomic OsPHO2 or OsPHO2(C445S) (a constitutively reduced form) fragment, indicated that OsPHO2(C445S) restored Pi concentration in rice to statistically significant lower levels compared to native OsPHO2 Moreover, the suppression of OsTrxh1 (knockdown and knockout) resulted in slightly higher Pi concentration than that of wild-type Nipponbare in leaves + __LTN1~OsPHO2~OsRLS1__, __OsTrxh4__, [Two h-Type Thioredoxins Interact with the E2 Ubiquitin Conjugase PHO2 to Fine-Tune Phosphate Homeostasis in Rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Two h-Type Thioredoxins Interact with the E2 Ubiquitin Conjugase PHO2 to Fine-Tune Phosphate Homeostasis in Rice.%5BTitle%5D), A yeast two-hybrid (Y2H) screen in rice (Oryza sativa; Os) using OsPHO2 as bait revealed an interaction between OsPHO2 and two h-type thioredoxins, OsTrxh1 and OsTrxh4 + __LTN1~OsPHO2~OsRLS1__, __OsNLA1__, [OsNLA1, a RING-type ubiquitin ligase, maintains phosphate homeostasis in Oryza sativa via degradation of phosphate transporters.](http://www.ncbi.nlm.nih.gov/pubmed?term=OsNLA1, a RING-type ubiquitin ligase, maintains phosphate homeostasis in Oryza sativa via degradation of phosphate transporters.%5BTitle%5D), Moreover, there was no interaction of OsNLA1 and OsPHO2, an E2 ubiquitin-conjugase, suggesting that OsPHO2 was not the partner of OsNLA1 involved in ubiquitin-mediated PT degradation + __LTN1~OsPHO2~OsRLS1__, __OsGI__, [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), Characterization of rice Osgi and Ospho2 mutants revealed that they displayed several similar phenotypic features supporting a physiological role for this interaction + __LTN1~OsPHO2~OsRLS1__, __OsGI__, [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), 
Reduced growth, leaf tip necrosis, delayed flowering and over-accumulation of Pi in leaves compared to wild-type were shared features of Osgi and Ospho2 plants + __LTN1~OsPHO2~OsRLS1__, __OsGI__, [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), Pi analysis of individual leaves demonstrated that Osgi, similar to Ospho2 mutants, were impaired in Pi remobilization from old to young leaves, albeit to a lesser extent + __LTN1~OsPHO2~OsRLS1__, __OsGI__, [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), Transcriptome analyses revealed more than 55% of the genes differentially expressed in Osgi plants overlapped with the set of differentially expressed genes in Ospho2 plants + __LTN1~OsPHO2~OsRLS1__, __OsGI__, [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), The interaction between OsPHO2 and OsGI links high-level regulators of Pi homeostasis and development in rice + __LTN1~OsPHO2~OsRLS1__, __Pho2__, [Molecular interaction between PHO2 and GI in rice.](http://www.ncbi.nlm.nih.gov/pubmed?term=Molecular interaction between PHO2 and GI in rice.%5BTitle%5D), A yeast-two-hybrid screen using Oryza sativa (rice) PHO2 as bait, revealed an interaction between OsPHO2 and OsGIGANTEA, a key regulator of flowering time, which was confirmed using bimolecular flourescenec complementation (BiFC) + __LTN1~OsPHO2~OsRLS1__, __OsNLA1__, [Altered Expression of OsNLA1 Modulates Pi Accumulation in Rice Oryza sativa L. Plants.](http://www.ncbi.nlm.nih.gov/pubmed?term=Altered Expression of OsNLA1 Modulates Pi Accumulation in Rice Oryza sativa L. 
Plants.%5BTitle%5D), Moreover, OsNLA1 was also found to interact with OsPHO2, a known regulator of Pi homeostasis, in a Yeast Two-Hybrid (Y2H) assay + __LTN1~OsPHO2~OsRLS1__, __OsPHR2__, [Phosphate-Starvation-Inducible S-Like RNase Genes in Rice Are Involved in Phosphate Source Recycling by RNA Decay](http://www.ncbi.nlm.nih.gov/pubmed?term=Phosphate-Starvation-Inducible S-Like RNase Genes in Rice Are Involved in Phosphate Source Recycling by RNA Decay%5BTitle%5D), Finally, the dynamic transcriptional regulation of OsRNS genes by overexpression of OsPHR2, ospho2 mutant, and overexpression of OsPT1 lines involved in Pi signaling pathway suggests the molecular basis of OsRNS family in Pi recycling via RNA decay under Pi starvation + __LTN1~OsPHO2~OsRLS1__, __OsPht1;1~OsPT1__, [Phosphate-Starvation-Inducible S-Like RNase Genes in Rice Are Involved in Phosphate Source Recycling by RNA Decay](http://www.ncbi.nlm.nih.gov/pubmed?term=Phosphate-Starvation-Inducible S-Like RNase Genes in Rice Are Involved in Phosphate Source Recycling by RNA Decay%5BTitle%5D), Finally, the dynamic transcriptional regulation of OsRNS genes by overexpression of OsPHR2, ospho2 mutant, and overexpression of OsPT1 lines involved in Pi signaling pathway suggests the molecular basis of OsRNS family in Pi recycling via RNA decay under Pi starvation + __LTN1~OsPHO2~OsRLS1__, __OsPP95__, [PROTEIN PHOSPHATASE95 Regulates Phosphate Homeostasis by Affecting Phosphate Transporter Trafficking in Rice[OPEN]](http://www.ncbi.nlm.nih.gov/pubmed?term=PROTEIN PHOSPHATASE95 Regulates Phosphate Homeostasis by Affecting Phosphate Transporter Trafficking in Rice[OPEN]%5BTitle%5D), We show that OsPHO2 interacts with and induces the degradation of OsPP95 + __LTN1~OsPHO2~OsRLS1__, __OsPP95__, [PROTEIN PHOSPHATASE95 Regulates Phosphate Homeostasis by Affecting Phosphate Transporter Trafficking in Rice[OPEN]](http://www.ncbi.nlm.nih.gov/pubmed?term=PROTEIN PHOSPHATASE95 Regulates Phosphate Homeostasis by Affecting 
Phosphate Transporter Trafficking in Rice[OPEN]%5BTitle%5D), We conclude that OsPP95, a protein phosphatase negatively regulated by OsPHO2, positively regulates Pi homeostasis and remobilization by dephosphorylating PTs and affecting their trafficking to the PM, a reversible process required for adaptation to variable Pi conditions [//]: # * **Key figures**
259.617647
770
0.805672
eng_Latn
0.96579
3f3598827314544cd1245d8f2590e3d312caba9c
135
md
Markdown
README.md
PacktPublishing/CompTIA-A-Certification-901.-The-Total-Course
7d2ea4779ac4c436c89a521ea60d58ac4a2363a0
[ "MIT" ]
1
2021-10-01T23:23:54.000Z
2021-10-01T23:23:54.000Z
README.md
PacktPublishing/CompTIA-A-Certification-901.-The-Total-Course
7d2ea4779ac4c436c89a521ea60d58ac4a2363a0
[ "MIT" ]
null
null
null
README.md
PacktPublishing/CompTIA-A-Certification-901.-The-Total-Course
7d2ea4779ac4c436c89a521ea60d58ac4a2363a0
[ "MIT" ]
null
null
null
# CompTIA-A-Certification-901.-The-Total-Course

Code Repository for CompTIA A+ Certification 901. The Total Course, Published by Packt
45
86
0.807407
yue_Hant
0.674907
3f35ead63b08a37f853e1c9947ebb0323a034a55
10,166
md
Markdown
docs/Options.md
kyleholzinger/spacefish
a549a7821117e0d7d39a7fa9a8e37a3b60e1c873
[ "MIT" ]
null
null
null
docs/Options.md
kyleholzinger/spacefish
a549a7821117e0d7d39a7fa9a8e37a3b60e1c873
[ "MIT" ]
null
null
null
docs/Options.md
kyleholzinger/spacefish
a549a7821117e0d7d39a7fa9a8e37a3b60e1c873
[ "MIT" ]
null
null
null
## Options

You have the ability to customize or disable specific elements of Spacefish. All options must be overridden in your `config.fish`.

Colors for sections can be [basic colors](https://fishshell.com/docs/current/commands.html#set_color) or [color codes](https://upload.wikimedia.org/wikipedia/commons/1/15/Xterm_256color_chart.svg).

**Note:** the symbol `·` in this document represents a regular space character ` `; it is used to clearly indicate when an option default value starts or ends with a space.

### Order

You can specify the order of prompt sections using the `SPACEFISH_PROMPT_ORDER` option. Use fish array syntax to define your own prompt order. The order also defines which sections Spacefish loads. If you're struggling with a slow prompt, you can simply omit the sections that you don't use, and they won't be loaded.

The default order is:

```fish
set SPACEFISH_PROMPT_ORDER user dir git node exec_time line_sep battery char
```

### Prompt

This group of options defines the behavior of the prompt and standard parameters for displaying sections.
| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_PROMPT_ADD_NEWLINE` | `true` | Adds a newline character before each prompt line |
| `SPACEFISH_PROMPT_SEPARATE_LINE` | `true` | Make the prompt span across two lines |
| `SPACEFISH_PROMPT_FIRST_PREFIX_SHOW` | `false` | Shows a prefix of the first section in prompt |
| `SPACEFISH_PROMPT_PREFIXES_SHOW` | `true` | Show prefixes before prompt sections or not |
| `SPACEFISH_PROMPT_SUFFIXES_SHOW` | `true` | Show suffixes before prompt sections or not |
| `SPACEFISH_PROMPT_DEFAULT_PREFIX` | `via` | Default prefix for prompt sections |
| `SPACEFISH_PROMPT_DEFAULT_SUFFIX` | ` ` | Default suffix for prompt sections |

### Char

| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_CHAR_PREFIX` | ` ` | Prefix before prompt character |
| `SPACEFISH_CHAR_SUFFIX` | ` ` | Suffix after prompt character |
| `SPACEFISH_CHAR_SYMBOL` | `➜` | Prompt character to be shown before any command |
| `SPACEFISH_CHAR_COLOR_SUCCESS` | `green` | Color of prompt character if last command completes successfully |
| `SPACEFISH_CHAR_COLOR_FAILURE` | `red` | Color of prompt character if last command returns a non-zero exit code |

### Username (`user`)

By default, a username is shown only when it's not the same as `$LOGNAME`, when you're connected via SSH, or when you're root. The root user is highlighted in the `SPACEFISH_USER_COLOR_ROOT` color (red by default).

| Variable | Default | Meaning |
| :------- | :-----: | ------- |
| `SPACEFISH_USER_SHOW` | `true` | Show user section (`true`, `false`, `always` or `needed`) |
| `SPACEFISH_USER_PREFIX` | `with·` | Prefix before user section |
| `SPACEFISH_USER_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after user section |
| `SPACEFISH_USER_COLOR` | `yellow` | Color of user section |
| `SPACEFISH_USER_COLOR_ROOT` | `red` | Color of user section when it's root |

`SPACEFISH_USER_SHOW` defines when to show the username section.
Here are possible values:

| `SPACEFISH_USER_SHOW` | Show on local | Show on remote |
| :-------------------: | :------------- | :-------------- |
| `false` | Never | Never |
| `always` | Always | Always |
| `true` | If needed | Always |
| `needed` | If needed | If needed |

### Directory (`dir`)

The directory is always shown and truncated to the value of `SPACEFISH_DIR_TRUNC`. While you are in a repository, it shows only the root directory and the folders inside it.

| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_DIR_SHOW` | `true` | Show directory section |
| `SPACEFISH_DIR_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after current directory |
| `SPACEFISH_DIR_TRUNC` | `3` | Number of folders of cwd to show in prompt, 0 to show all |
| `SPACEFISH_DIR_TRUNC_REPO` | `true` | While in a `git` repo, show only the root directory and folders inside it |
| `SPACEFISH_DIR_COLOR` | `(set_color --bold cyan)` | Color of directory section |
| `SPACEFISH_DIR_PREFIX` | `in·` | Prefix before current directory |

### Git (`git`)

The Git section consists of the `git_branch` and `git_status` subsections. It is shown only in Git repositories.
| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_GIT_SHOW` | `true` | Show Git section |
| `SPACEFISH_GIT_PREFIX` | `on·` | Prefix before Git section |
| `SPACEFISH_GIT_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after Git section |
| `SPACEFISH_GIT_SYMBOL` | ![·](https://user-images.githubusercontent.com/3459374/34947621-4f324a92-fa13-11e7-9b99-cdba2cdda6b9.png) | Character to be shown before Git section (requires a [powerline patched font](https://github.com/powerline/fonts)) |

#### Git branch (`git_branch`)

| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_GIT_BRANCH_SHOW` | `true` | Show Git branch subsection |
| `SPACEFISH_GIT_BRANCH_PREFIX` | `$SPACEFISH_GIT_SYMBOL` | Prefix before Git branch subsection |
| `SPACEFISH_GIT_BRANCH_SUFFIX` | ` ` | Suffix after Git branch subsection |
| `SPACEFISH_GIT_BRANCH_COLOR` | `(set_color magenta)` | Color of Git branch subsection |

#### Git status (`git_status`)

Git status indicators are shown only when the repository is dirty.
| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_GIT_STATUS_SHOW` | `true` | Show Git status subsection |
| `SPACEFISH_GIT_STATUS_PREFIX` | `·[` | Prefix before Git status subsection |
| `SPACEFISH_GIT_STATUS_SUFFIX` | `]` | Suffix after Git status subsection |
| `SPACEFISH_GIT_STATUS_COLOR` | `red` | Color of Git status subsection |
| `SPACEFISH_GIT_STATUS_UNTRACKED` | `?` | Indicator for untracked changes |
| `SPACEFISH_GIT_STATUS_ADDED` | `+` | Indicator for added changes |
| `SPACEFISH_GIT_STATUS_MODIFIED` | `!` | Indicator for unstaged files |
| `SPACEFISH_GIT_STATUS_RENAMED` | `»` | Indicator for renamed files |
| `SPACEFISH_GIT_STATUS_DELETED` | `✘` | Indicator for deleted files |
| `SPACEFISH_GIT_STATUS_STASHED` | `$` | Indicator for stashed changes |
| `SPACEFISH_GIT_STATUS_UNMERGED` | `=` | Indicator for unmerged changes |
| `SPACEFISH_GIT_STATUS_AHEAD` | `⇡` | Indicator for unpushed changes (ahead of remote branch) |
| `SPACEFISH_GIT_STATUS_BEHIND` | `⇣` | Indicator for unpulled changes (behind remote branch) |
| `SPACEFISH_GIT_STATUS_DIVERGED` | `⇕` | Indicator for diverged changes (diverged from remote branch) |

### Package version (`package`)

> Works only for [npm](https://www.npmjs.com/) at the moment. Please help [spaceship](https://github.com/denysdovhan/spaceship-prompt) improve this section!

The package version is shown when the repository is a package (e.g. contains a `package.json` file). If no version information is found in `package.json`, the `⚠` symbol will be shown.

> **Note:** This is the version of the package you are working on, not the version of the package manager itself.
| Variable | Default | Meaning |
| :------- | :-----: | ------- |
| `SPACEFISH_PACKAGE_SHOW` | `true` | Show package version |
| `SPACEFISH_PACKAGE_PREFIX` | `is·` | Prefix before package version section |
| `SPACEFISH_PACKAGE_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after package version section |
| `SPACEFISH_PACKAGE_SYMBOL` | `📦·` | Character to be shown before package version |
| `SPACEFISH_PACKAGE_COLOR` | `red` | Color of package version section |

### Node.js (`node`)

The Node.js section is shown only in directories that contain a `package.json` file, a `node_modules` folder, or any other file with a `.js` extension.

If you set `SPACEFISH_NODE_DEFAULT_VERSION` to the default Node.js version and your current version is the same as `SPACEFISH_NODE_DEFAULT_VERSION`, the Node.js section will be hidden.

| Variable | Default | Meaning |
| :------- | :-----: | ------- |
| `SPACEFISH_NODE_SHOW` | `true` | Show Node.js section |
| `SPACEFISH_NODE_PREFIX` | `$SPACEFISH_PROMPT_DEFAULT_PREFIX` | Prefix before Node.js section |
| `SPACEFISH_NODE_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after Node.js section |
| `SPACEFISH_NODE_SYMBOL` | `⬢·` | Character to be shown before Node.js version |
| `SPACEFISH_NODE_DEFAULT_VERSION` | ` ` | Node.js version to be treated as default |
| `SPACEFISH_NODE_COLOR` | `green` | Color of Node.js section |

### Battery (`battery`)

By default, the battery section is shown only if the battery level is below `SPACEFISH_BATTERY_THRESHOLD` (default: 10%).
| Variable | Default | Meaning |
| :--- | :---: | --- |
| `SPACEFISH_BATTERY_SHOW` | `true` | Show battery section or not (`true`, `false`, `always` or `charged`) |
| `SPACEFISH_BATTERY_PREFIX` | ` ` | Prefix before battery section |
| `SPACEFISH_BATTERY_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after battery section |
| `SPACEFISH_BATTERY_SYMBOL_CHARGING` | `⇡` | Character to be shown if battery is charging |
| `SPACEFISH_BATTERY_SYMBOL_DISCHARGING` | `⇣` | Character to be shown if battery is discharging |
| `SPACEFISH_BATTERY_SYMBOL_FULL` | `•` | Character to be shown if battery is full |
| `SPACEFISH_BATTERY_THRESHOLD` | `10` | Battery level below which the battery section will be shown |

`SPACEFISH_BATTERY_SHOW` defines when to show the battery section. Here are possible values:

| `SPACEFISH_BATTERY_SHOW` | Below threshold | Above threshold | Fully charged |
| :---: | :--- | :--- | :--- |
| `false` | Hidden | Hidden | Hidden |
| `always` | Shown | Shown | Shown |
| `true` | Shown | Hidden | Hidden |
| `charged` | Shown | Hidden | Shown |

### Ruby (`ruby`)

The Ruby section is shown only in directories that contain a `Gemfile`, a `Rakefile`, or any other file with a `.rb` extension.

| Variable | Default | Meaning |
| :------- | :-----: | ------- |
| `SPACEFISH_RUBY_SHOW` | `true` | Show Ruby section |
| `SPACEFISH_RUBY_PREFIX` | `$SPACEFISH_PROMPT_DEFAULT_PREFIX` | Prefix before Ruby section |
| `SPACEFISH_RUBY_SUFFIX` | `$SPACEFISH_PROMPT_DEFAULT_SUFFIX` | Suffix after Ruby section |
| `SPACEFISH_RUBY_SYMBOL` | `💎·` | Character to be shown before Ruby version |
| `SPACEFISH_RUBY_COLOR` | `red` | Color of Ruby section |
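Putting several of these options together, a `config.fish` override might look like the following. This is a minimal sketch; the specific values chosen here are illustrative examples, not recommended defaults:

```fish
# ~/.config/fish/config.fish

# Load only the sections you actually use (keeps the prompt fast)
set SPACEFISH_PROMPT_ORDER user dir git char

# Change the prompt character and its success color
set SPACEFISH_CHAR_SYMBOL "❯"
set SPACEFISH_CHAR_COLOR_SUCCESS cyan

# Show fewer path segments and a different directory color
set SPACEFISH_DIR_TRUNC 2
set SPACEFISH_DIR_COLOR (set_color --bold blue)

# Always show the battery section, regardless of charge level
set SPACEFISH_BATTERY_SHOW always
```

Note that options set this way apply to every new shell; to experiment first, you can run the same `set` commands in an interactive session before committing them to `config.fish`.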
54.951351
250
0.704505
eng_Latn
0.822552
3f369f372f546a61fbf71ba3e4eee7d75017e1df
1,427
md
Markdown
README.md
samuelcastro/crewcovery-feedback
be57a558073f4f5b6db2f4b3e4ae7a3adb8d639d
[ "MIT" ]
null
null
null
README.md
samuelcastro/crewcovery-feedback
be57a558073f4f5b6db2f4b3e4ae7a3adb8d639d
[ "MIT" ]
null
null
null
README.md
samuelcastro/crewcovery-feedback
be57a558073f4f5b6db2f4b3e4ae7a3adb8d639d
[ "MIT" ]
null
null
null
# crewcovery-feedback

A small project to send feedback, based on the AngularJS Full-Stack generator.

## Preview

![Home](http://samuelcastro.me/crewcovery-2.png "The Home Screen")
![Feedback](http://samuelcastro.me/crewcovery-3.png "The Feedback Screen")
![Validation](http://samuelcastro.me/crewcovery-4.png "The Validation Screen")

This project was generated with the [Angular Full-Stack Generator](https://github.com/DaftMonk/generator-angular-fullstack) version 3.1.1.

## Getting Started

### Prerequisites

- [Git](https://git-scm.com/)
- [Node.js and npm](https://nodejs.org) Node ^4.2.3, npm ^2.14.7
- [Bower](https://bower.io) (`npm install --global bower`)
- [Ruby](https://www.ruby-lang.org) and then `gem install sass`
- [Grunt](http://gruntjs.com/) (`npm install --global grunt-cli`)
- [MongoDB](https://www.mongodb.org/) - Keep a running daemon with `mongod`

### Developing

1. Run `npm install` to install server dependencies.
2. Run `bower install` to install front-end dependencies.
3. Run `mongod` in a separate shell to keep an instance of the MongoDB daemon running.
4. Run `grunt serve` to start the development server. It should automatically open the client in your browser when ready.

## Build & development

Run `grunt build` for building and `grunt serve` for preview.

## Testing

Running `npm test` will run the unit tests with Karma.

## Questions?

Feel free to send me an email at: samuelcastrosilva@gmail.com
30.361702
138
0.733006
eng_Latn
0.733891
3f376ca0c4b4e9ba3aedebb3f1092fa76feefe89
1,503
md
Markdown
documentation/snippets/language-request-temporal-data.md
Breinify/brein-api-library-node
14d857d297424aeae86e5c4b5a32b385940a7e3d
[ "MIT" ]
3
2017-04-12T21:06:06.000Z
2018-07-11T22:08:55.000Z
documentation/snippets/language-request-temporal-data.md
Breinify/brein-api-library-node
14d857d297424aeae86e5c4b5a32b385940a7e3d
[ "MIT" ]
null
null
null
documentation/snippets/language-request-temporal-data.md
Breinify/brein-api-library-node
14d857d297424aeae86e5c4b5a32b385940a7e3d
[ "MIT" ]
null
null
null
<blockquote class="lang-specific javascript--node"> <p>The Node.js library offers several overloaded versions of the <code class="prettyprint">temporalData</code> method.</p> </blockquote> <blockquote class="lang-specific javascript--node"> <h3>Using Latitude/Longitude</h3> <p>It is possible to pass in a latitude/longitude pair.</p> </blockquote> > ```javascript--node var latitude = 37.7749; var longitude = -122.4194; breinify.temporalData(latitude, longitude, function(data) { console.log(data); }); ``` <blockquote class="lang-specific javascript--node"> <h3>Using IP-Address</h3> <p>Another possibility is to just pass in an ip-address and resolve the temporal information available.</p> </blockquote> > ```javascript--node var ipAddress = '72.229.28.185'; breinify.temporalData(ipAddress, function(data) { console.log(data); }); ``` <blockquote class="lang-specific javascript--node"> <h3>Using an Object</h3> <p>The most general solution is to pass in an object and specify the different known values to determine what can be resolved (but also what should be returned), based on the provided data. The further usage examples illustrate different possibilities.</p> </blockquote> > ```javascript--node var loc1 = { 'text': 'San Diego' }; breinify.temporalData({ 'location': loc1 }, function(data) { console.log(data); }); var loc2 = { 'latitude': 37.7749, 'longitude': -122.4194 }; breinify.temporalData({ 'location': loc2 }, function(data) { console.log(data); }); ```
27.327273
107
0.723886
eng_Latn
0.806132
3f37e6dbc83748ec4e8116b3befba7b307d8dce2
10,234
md
Markdown
README.md
soarez/SHA-mbles.github.io
fded77f69228e11a676c4e6c044f2011a7c7da0b
[ "MIT" ]
null
null
null
README.md
soarez/SHA-mbles.github.io
fded77f69228e11a676c4e6c044f2011a7c7da0b
[ "MIT" ]
null
null
null
README.md
soarez/SHA-mbles.github.io
fded77f69228e11a676c4e6c044f2011a7c7da0b
[ "MIT" ]
null
null
null
We have computed the very first **chosen-prefix collision for SHA-1**. In a nutshell, this means a complete and practical break of the SHA-1 hash function, with dangerous practical implications if you are still using this hash function. To put it another way: all attacks that are practical on MD5 are now also practical on SHA-1. Check our paper **[here](https://eprint.iacr.org/2020/014.pdf)** for more details. # Our Contributions ## Complexity Improvements We have significantly improved the complexity of SHA-1 attacks, with a speedup factor of around 10. More precisely, we have reduced the cost of a collision attack from 2<sup>64.7</sup> to 2<sup>61.2</sup>, and the cost of a chosen-prefix collision attack from 2<sup>67.1</sup> to 2<sup>63.4</sup> (on a GTX 970 GPU). ## Record Computation We implemented the entire chosen-prefix collision attack with those improvements. This attack is extremely technical, involves many details and various steps, and requires a lot of engineering work. In order to perform this computation with a small academic budget, we rented cheap gaming or mining GPUs from [GPUserversrental](https://www.gpuserversrental.com/), rather than the datacenter-grade hardware used by big cloud providers. We successfully ran the computation over two months last summer, using 900 GPUs (Nvidia GTX 1060). As a side result, this shows that it now costs less than 100k USD to break cryptography with a security level of 64 bits (i.e. to compute 2<sup>64</sup> operations of symmetric cryptography). ## PGP/GnuPG Impersonation We chose the PGP/GnuPG Web of Trust as a demonstration of our chosen-prefix collision attack against SHA-1. The Web of Trust is a trust model used for PGP that relies on users signing each other’s identity certificates, instead of using a central PKI. For compatibility reasons, the legacy branch of GnuPG (version 1.4) still uses SHA-1 by default for identity certification. 
Using our SHA-1 chosen-prefix collision, we have created two PGP keys with different UserIDs and colliding certificates: key B is a legitimate key for Bob (to be signed by the Web of Trust), but the signature can be transferred to key A, which is a forged key with Alice’s ID. The signature will still be valid because of the collision, but Bob controls key A with the name of Alice, signed by a third party. Therefore, he can impersonate Alice and sign any document in her name. # Our Chosen-Prefix Collision Example We have created a chosen-prefix collision with prefixes `99040d047fe81780012000` and `99030d047fe81780011800` (in hexadecimal notation). You can download the two messages below and verify their hash with the `sha1sum` tool: - [messageA](messageA) - [messageB](messageB) The prefixes have been chosen to build two PGP public keys with colliding SHA-1 certification signatures. You can download two example keys below, with different user names, and examine them with `pgpdump -i` to see that the SHA-1 signatures issued by `0xAFBB1FED6951A956` are the same: - [alice.asc](alice.asc) - [bob.asc](bob.asc) In order to avoid malicious usage, the keys have a creation date far in the future; if you want to analyse them with pgp, you can use the options `--ignore-time-conflict --ignore-valid-from` (more generally, you can prefix arbitrary commands with `faketime @2145920400`). # Responsible Disclosure We tried to contact the authors of affected software before announcing this attack, but due to limited resources, we could not notify everyone. ## GnuPG We first discussed this attack with the GnuPG developers on the 9th of May 2019 and eventually informed them of the newly found chosen-prefix collision on the 1st of October 2019. The issue is tracked with CVE number CVE-2019-14855. 
A countermeasure has been implemented in commit edc36f5, included in GnuPG version 2.2.18 (released on the 25th of November 2019): SHA-1-based identity signatures created after 2019-01-19 are now considered invalid. ## CAcert [CAcert](http://cacert.org/) is one of the main CAs for PGP keys. We noticed that there is a large number of keys with recent SHA-1 signatures from CAcert on public keyservers. This seems to indicate that they still use SHA-1 to sign user keys. We first contacted them by email on December 14th, and got an answer on January 6th acknowledging this issue. They are planning a switch to a secure hash function for key certification. ## OpenSSL We contacted the OpenSSL developers on December 14th. They are considering disabling SHA-1 at security level 1 (defined as 80-bit security) after our attack. Since security level 1 is the default configuration, this would prevent SHA-1 usage for certificates, and for handshake signatures. Debian Linux had previously set the default configuration to security level 2 (defined as 112-bit security) in the latest release (Debian Buster); this already prevents dangerous usage of SHA-1. # Q&A ## What is a chosen-prefix collision? A classical collision (or identical-prefix collision) for a hash function H is simply two messages M and M' that lead to the same hash output: H(M) = H(M'). Even though this security notion is fundamental in cryptography, exploiting a classical collision for attacks in practice is difficult. A chosen-prefix collision is a more constrained (and much more difficult to obtain) type of collision, where two message prefixes P and P' are first given as a challenge to the adversary, whose goal is then to compute two messages M and M' such that H(P \|\| M) = H(P' \|\| M'), where \|\| denotes concatenation. With such an ability, the attacker can obtain a collision even though the prefixes can be chosen arbitrarily (and thus potentially contain some meaningful information). 
This is particularly impactful when the hash function is used in a digital signature scheme, one of the most common usages of hash functions. ## Is SHA-1 really still used? SHA-1 usage has significantly decreased in recent years; in particular, web browsers now reject certificates signed with SHA-1. However, SHA-1 signatures are still supported in a large number of applications. SHA-1 is the default hash function used for certifying PGP keys in the legacy branch of GnuPG, and those signatures were accepted by the modern branch of GnuPG before we reported our results. Many non-web TLS clients also accept SHA-1 certificates, and SHA-1 is still allowed for in-protocol signatures in TLS and SSH. Even if actual usage is low (on the order of 1%), the fact that SHA-1 is allowed threatens security, because a man-in-the-middle attacker can downgrade the connection to SHA-1. SHA-1 is also the foundation of the Git versioning system. There are probably a lot of lesser-known or proprietary protocols that still use SHA-1, but this is more difficult to evaluate. ## What is affected? Any usage where collision resistance is expected from SHA-1 is of course at high risk. We identified a few settings that are directly affected by chosen-prefix collisions: - PGP keys can be forged if third parties generate SHA-1 key certifications - X.509 certificates could be broken if some Certificate Authorities issue SHA-1 certificates with predictable serial numbers We note that classical collisions and chosen-prefix collisions do not threaten all usages of SHA-1. In particular, HMAC-SHA-1 seems relatively safe, and preimage resistance (i.e. the ability to invert the hash function) of SHA-1 remains unbroken as of today. Yet, as cryptographers, we recommend deprecating SHA-1 everywhere, even when there is no direct evidence that these weaknesses can be exploited. ## What should I do? 
**Remove any use of SHA-1 in your product as soon as possible and instead use [SHA-256](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.180-4.pdf) or [SHA-3](https://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.202.pdf)**. SHA-1 has been broken for 15 years, so there is no good reason to use this hash function in modern security software. Attacks only get better over time, and the goal of the cryptanalysis effort is to warn users so that they can deprecate algorithms before the attacks become practical. We actually expect our attack to cost just a couple thousand USD in a few years. ## How much does the attack cost? By renting a GPU cluster online, the entire chosen-prefix collision attack on SHA-1 cost us about 75k USD. However, at the time of computation, our implementation was not optimal and we lost some time (because research). Besides, computation prices have gone down further since then, so we estimate that our attack costs about 45k USD today. As computation costs continue to decrease rapidly, we estimate that it should cost less than 10k USD to generate a chosen-prefix collision for SHA-1 by 2025. As a side note, a classical collision for SHA-1 now costs just about 11k USD. ## Wasn't there already a collision attack against SHA-1? A classical collision was computed for SHA-1 in early 2017, as you can see [here](https://shattered.io/). However, this is very different from a chosen-prefix collision, where any prefix pair can be challenged for the collision, which leads to a much more serious impact in practice. ## Wasn't there already a chosen-prefix collision attack against SHA-1? Last year, we announced a new chosen-prefix collision attack, as you can see [here](https://eprint.iacr.org/2019/459) ([some test code](https://github.com/Cryptosaurus/sha1-cp) is also available), and this work was published at the [Eurocrypt 2019](https://eurocrypt.iacr.org/2019/) conference. 
Here, we further improved these results to the point where the attack becomes feasible for a reasonable amount of money, and we wrote an actual implementation of the attack to compute the chosen-prefix collision against SHA-1. ## Can I try it out for myself? Since our attack on SHA-1 has practical implications, we will wait for some time before releasing source code that allows generating SHA-1 chosen-prefix collisions, in order to make sure proper countermeasures have been deployed. - - - # Contact If you have any questions, feel free to contact us: **Gaëtan Leurent:** gaetan.leurent@inria.fr **Thomas Peyrin:** thomas.peyrin@ntu.edu.sg
97.466667
901
0.784151
eng_Latn
0.999254
3f37f4f5db7e2e352c9efb3efb0988cb1ccdbadc
4,284
md
Markdown
docs/menu/menu/danbroid.util.menu.ui/-menu-implementation/index.md
danbrough/util
2a3ae340114d1cf8b96aceeddcc62a0057d4ef3e
[ "Apache-2.0" ]
null
null
null
docs/menu/menu/danbroid.util.menu.ui/-menu-implementation/index.md
danbrough/util
2a3ae340114d1cf8b96aceeddcc62a0057d4ef3e
[ "Apache-2.0" ]
null
null
null
docs/menu/menu/danbroid.util.menu.ui/-menu-implementation/index.md
danbrough/util
2a3ae340114d1cf8b96aceeddcc62a0057d4ef3e
[ "Apache-2.0" ]
null
null
null
//[menu](../../index.md)/[danbroid.util.menu.ui](../index.md)/[MenuImplementation](index.md) # MenuImplementation [androidJvm] object [MenuImplementation](index.md) ## Functions | Name| Summary| |---|---| | [equals](../../danbroid.util.menu.ui.model/-menu-list-model/-companion/-new-instance-factory/index.md#kotlin/Any/equals/#kotlin.Any?/PointingToDeclaration/)| [androidJvm] <br>Content <br>open operator override fun [equals](../../danbroid.util.menu.ui.model/-menu-list-model/-companion/-new-instance-factory/index.md#kotlin/Any/equals/#kotlin.Any?/PointingToDeclaration/)(other: [Any](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-any/index.html)?): [Boolean](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-boolean/index.html) <br><br><br> | [hashCode](../../danbroid.util.menu.ui.model/-menu-list-model/-companion/-new-instance-factory/index.md#kotlin/Any/hashCode/#/PointingToDeclaration/)| [androidJvm] <br>Content <br>open override fun [hashCode](../../danbroid.util.menu.ui.model/-menu-list-model/-companion/-new-instance-factory/index.md#kotlin/Any/hashCode/#/PointingToDeclaration/)(): [Int](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-int/index.html) <br><br><br> | [toString](../../danbroid.util.menu.ui.model/-menu-list-model/-companion/-new-instance-factory/index.md#kotlin/Any/toString/#/PointingToDeclaration/)| [androidJvm] <br>Content <br>open override fun [toString](../../danbroid.util.menu.ui.model/-menu-list-model/-companion/-new-instance-factory/index.md#kotlin/Any/toString/#/PointingToDeclaration/)(): [String](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-string/index.html) <br><br><br> ## Properties | Name| Summary| |---|---| | [layoutID](index.md#danbroid.util.menu.ui/MenuImplementation/layoutID/#/PointingToDeclaration/)| [androidJvm] @[LayoutRes](https://developer.android.com/reference/kotlin/androidx/annotation/LayoutRes.html)() <br> <br>var 
[layoutID](index.md#danbroid.util.menu.ui/MenuImplementation/layoutID/#/PointingToDeclaration/): [Int](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-int/index.html) <br> | [menuClickHandler](index.md#danbroid.util.menu.ui/MenuImplementation/menuClickHandler/#/PointingToDeclaration/)| [androidJvm] var [menuClickHandler](index.md#danbroid.util.menu.ui/MenuImplementation/menuClickHandler/#/PointingToDeclaration/): [Fragment](https://developer.android.com/reference/kotlin/androidx/fragment/app/Fragment.html).([MenuItem](../../danbroid.util.menu/-menu-item/index.md)) -> [Unit](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-unit/index.html) <br> | [menuContextMenuHandler](index.md#danbroid.util.menu.ui/MenuImplementation/menuContextMenuHandler/#/PointingToDeclaration/)| [androidJvm] var [menuContextMenuHandler](index.md#danbroid.util.menu.ui/MenuImplementation/menuContextMenuHandler/#/PointingToDeclaration/): [Fragment](https://developer.android.com/reference/kotlin/androidx/fragment/app/Fragment.html).([MenuItem](../../danbroid.util.menu/-menu-item/index.md), [ContextMenu](https://developer.android.com/reference/kotlin/android/view/ContextMenu.html), [View](https://developer.android.com/reference/kotlin/android/view/View.html), [ContextMenu.ContextMenuInfo](https://developer.android.com/reference/kotlin/android/view/ContextMenu.ContextMenuInfo.html)?) 
-> [Unit](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-unit/index.html) <br> | [rootContent](index.md#danbroid.util.menu.ui/MenuImplementation/rootContent/#/PointingToDeclaration/)| [androidJvm] lateinit var [rootContent](index.md#danbroid.util.menu.ui/MenuImplementation/rootContent/#/PointingToDeclaration/): () -> [MenuItemBuilder](../../danbroid.util.menu/-menu-item-builder/index.md) <br> | [setToolbarTitle](index.md#danbroid.util.menu.ui/MenuImplementation/setToolbarTitle/#/PointingToDeclaration/)| [androidJvm] var [setToolbarTitle](index.md#danbroid.util.menu.ui/MenuImplementation/setToolbarTitle/#/PointingToDeclaration/): [FragmentActivity](https://developer.android.com/reference/kotlin/androidx/fragment/app/FragmentActivity.html).([CharSequence](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-char-sequence/index.html)) -> [Unit](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-unit/index.html) <br>
153
808
0.764706
yue_Hant
0.39624
3f37ffd142a42d4e52e9947652f11fba25397736
559
md
Markdown
README-integration-test-setup.md
indyaah/bee-client
e07c1450e84dc0018b497797156493f8c8065ca4
[ "MIT" ]
null
null
null
README-integration-test-setup.md
indyaah/bee-client
e07c1450e84dc0018b497797156493f8c8065ca4
[ "MIT" ]
null
null
null
README-integration-test-setup.md
indyaah/bee-client
e07c1450e84dc0018b497797156493f8c8065ca4
[ "MIT" ]
null
null
null
Integration Tests With Local Webserver ====================================== You can run the integration tests against your localhost webserver if you want. This is optional, but improves the test coverage. To set this up, you need: * Apache, Nginx or Cherokee * A PHP interpreter configured with the webserver. Please start your server, then run the script src/test/resources/setup-symlinks.sh This will find your webserver's root folder and make a symlink in it to this project. The integration test will then find the source PHP script available to it.
34.9375
97
0.744186
eng_Latn
0.993063
3f3822c154566021afd8d780366ac79ae853ffb1
378
markdown
Markdown
_drafts/2018/03/2018-03-31-2018-03-31-update.markdown
a-q-a/a-q-a.github.io
b91fbc35c1af329dbb4dd6638a454ce8e340c893
[ "MIT", "BSD-3-Clause" ]
null
null
null
_drafts/2018/03/2018-03-31-2018-03-31-update.markdown
a-q-a/a-q-a.github.io
b91fbc35c1af329dbb4dd6638a454ce8e340c893
[ "MIT", "BSD-3-Clause" ]
null
null
null
_drafts/2018/03/2018-03-31-2018-03-31-update.markdown
a-q-a/a-q-a.github.io
b91fbc35c1af329dbb4dd6638a454ce8e340c893
[ "MIT", "BSD-3-Clause" ]
null
null
null
--- layout: "single" title: "2018-03-31 update" date: "2018-03-31 11:25" published: false --- ## 祈り ありがとう、愛しています、祝福しています、癒しています、そのままでいてください。 ## update - Inkscape インストール - GIMP インストール - Dash インストール - 使ってみて、やっぱり基本的には英語のAPIリファレンスをサポートしていて、日本語のリファレンスを使う場合は、個々のドキュメントのURLを探して、設定する必要がある。 - いいことを知りました。 - ありがとう。 - aaa ## 引用 ありがとうございます。 皆さんの情報の存在を知れて、また一つ更新できました。 ありがとう。
15.75
86
0.724868
jpn_Jpan
0.936499
3f38511af841d4df703f739e3c49499e632378b6
4,049
md
Markdown
_policies_guidance/alumni-spotlight-angela-mcpherson.md
meganlooff/CFO.gov
2716f3c081c4aa332d0acd1fd8d89074f216aea5
[ "CC0-1.0" ]
3
2021-01-20T15:22:14.000Z
2021-06-02T18:06:15.000Z
_policies_guidance/alumni-spotlight-angela-mcpherson.md
meganlooff/CFO.gov
2716f3c081c4aa332d0acd1fd8d89074f216aea5
[ "CC0-1.0" ]
4
2020-10-08T21:27:48.000Z
2020-11-02T16:35:16.000Z
_policies_guidance/alumni-spotlight-angela-mcpherson.md
meganlooff/CFO.gov
2716f3c081c4aa332d0acd1fd8d89074f216aea5
[ "CC0-1.0" ]
5
2020-09-16T11:21:59.000Z
2021-06-03T16:12:35.000Z
--- layout: knowledge-sharing-landing title: ALUMNI SPOTLIGHT&#58; ANGELA MCPHERSON subtitle: Alumni Spotlight&#58; Angela McPherson filler: Angela McPherson is a member of the 2016-2017 CXO Fellows cohort. permalink: /knowledge-sharing/alumni-spotlight-angela-mcpherson/ type: CXO Fellows date: June 20, 2018 has_date: 'yes' author: Annette Maldonado filters: cxo-fellows --- <div style="line-height: 1.8em;margin-bottom: 80px; display: block;"> <p><b>CXO Fellows Cohort <abbr title="Fiscal year">FY</abbr> 16-17<img alt="Angela McPherson portrait" src="{{ site.baseurl }}/wp-content/uploads/2018/06/Angela-300x300.png" width="300" height="300" sizes="(max-width: 300px) 100vw, 300px" style="float: left;margin-right: 30px;margin-bottom: 10px;max-width: 100%; height: auto;"></b></p> <p><b>Functional Group: </b><i><span style="font-weight: 400;">Information Technology</span></i></p> <p><b>Current Position: </b><i><span style="font-weight: 400;">Program Liaison Specialist, MY <abbr title="Office of Shared Solutions Performance Improvement">OSSPI</abbr></span></i></p> <p><b>Agency: </b><i><span style="font-weight: 400;">GSA </span></i></p> <p><span style="font-weight: 400;">Angela McPherson is a member of the 2016-2017 CXO Fellows cohort. During her participation in the CXO Fellowship, Angela was a Program Specialist working as a financial management systems trainer at the General Services Administration (GSA). In February 2018, Angela transitioned to GSA’s Office of Shared Solutions &amp; Performance Improvement (OSSPI) as a Program Liaison Specialist supporting the Chief Acquisition Officers Council (CAOC), Performance Improvement Council (PIC), and President's Management Council (PMC).</span> <span style="font-weight: 400;">We sat down with Angela to discuss her experience as a recent CXO Fellow. 
</span></p> <p><b>How did the program advance your career as a public servant?</b></p> <p><span style="font-weight: 400;">Meeting with CXOs provided candid career development insight. CXOs who shared their leadership stories with our cohort had a diverse set of backgrounds and professional experiences that oftentimes were not directly related to their current functional area as a CXO. In varying ways, their stories demonstrated how every job opportunity, no matter how unrelated, can catapult you to a higher position. Candid insight into CXOs’ career paths provided specific and qualitative information to help me structure my own career path to pursue leadership opportunities in which I am interested. </span></p> <p><b>What did you gain from the program?</b></p> <p><span style="font-weight: 400;">As a CXO Fellow, I was able to develop a stronger voice in the conversation, not only around areas in which I had developed an expertise, but also regarding broader, cross-functional and cross-agency issues. I learned that no matter what position I am in, I have the ability to make suggestions, develop better processes, and seek opportunities to talk to people who can help implement these ideas.</span></p> <p><b>How did the program enable you to better contribute to your agency?</b></p> <p><span style="font-weight: 400;">The opportunity to contribute to cross-council initiatives, work with the PMA CAP Goals, and operate in a cross-agency, cross-functional, and collaborative role provided a government-wide perspective. This enabled me to better see how the role I currently play fits into the broader equation of government-wide, back office functions. </span></p> <p><b>What would you tell someone who is interested in the program?</b></p> <p><span style="font-weight: 400;">The CXO Fellows Program is Leadership 101 for the Executive Government. The program provides an experience and education that most federal employees in mission support functions do not receive. 
The exposure to chief executive officers, government-wide issues and initiatives, and cross-functional collaboration is invaluable to advancing both the contribution you make to your agency and your own career trajectory.</span></p> </div>
139.62069
682
0.772042
eng_Latn
0.993292
3f386ff233ea89c2303ad40e70024bcac0f62156
17,886
md
Markdown
playlists/pretty/37i9dQZF1DWWzVPEmatsUB.md
masudissa0210/spotify-playlist-archive
a1a4a94af829378c9855040d905e04080c581acb
[ "MIT" ]
null
null
null
playlists/pretty/37i9dQZF1DWWzVPEmatsUB.md
masudissa0210/spotify-playlist-archive
a1a4a94af829378c9855040d905e04080c581acb
[ "MIT" ]
null
null
null
playlists/pretty/37i9dQZF1DWWzVPEmatsUB.md
masudissa0210/spotify-playlist-archive
a1a4a94af829378c9855040d905e04080c581acb
[ "MIT" ]
null
null
null
pretty - [cumulative](/playlists/cumulative/37i9dQZF1DWWzVPEmatsUB.md) - [plain](/playlists/plain/37i9dQZF1DWWzVPEmatsUB) - [githistory](https://github.githistory.xyz/mackorone/spotify-playlist-archive/blob/main/playlists/plain/37i9dQZF1DWWzVPEmatsUB) ### [Mellow Morning](https://open.spotify.com/playlist/37i9dQZF1DWWzVPEmatsUB) > Have a quiet morning with this mix of something familiar and something new. [Spotify](https://open.spotify.com/user/spotify) - 464,408 likes - 70 songs - 4 hr 19 min | No. | Title | Artist(s) | Album | Length | |---|---|---|---|---| | 1 | [Thank You](https://open.spotify.com/track/4mVm8cTbilqoIWCYNNjCxt) | [Lady Wray](https://open.spotify.com/artist/1plioVQ0mcgAO7uhvWkJJy) | [Thank You](https://open.spotify.com/album/5e9eqwm0zjc9FLaRhVfdTo) | 3:26 | | 2 | [Penny Pincher](https://open.spotify.com/track/6EHv2BQ8wbDsGJnPSdo1Vb) | [Kendra Morris](https://open.spotify.com/artist/7rtM2wPKQlFpsm0C4qJlDk) | [Penny Pincher](https://open.spotify.com/album/0hGZmEGPkeicNmcZKuaQbC) | 3:40 | | 3 | [A Hand, A Heart \- 2021](https://open.spotify.com/track/2MylFBKIQH5ogcBbeMDYiR) | [Katie Buchanan](https://open.spotify.com/artist/45JkiNZMtPXDGoKXzxoPE1) | [A Hand, A Heart \(2021\)](https://open.spotify.com/album/64FRBimcWLupwj5WBrzjm9) | 4:28 | | 4 | [I Might Be In Love With You](https://open.spotify.com/track/1JXuz7eSjrpkI5FCdruZuN) | [Cynthia Erivo](https://open.spotify.com/artist/46UMQ0cW8ToR8egkBRwAxZ) | [Ch\. 1 Vs\. 
1](https://open.spotify.com/album/0KeLt7XCGtfAKAbrmM59De) | 3:37 | | 5 | [Learning To Live Without You](https://open.spotify.com/track/2cEMwE3n0z4Uw4xVwNqDst) | [Hajaj](https://open.spotify.com/artist/08yjRkGm8KNsShKjtbEmt6) | [Learning To Live Without You](https://open.spotify.com/album/4D3Kj2stDw3tnnfjOBrEvB) | 3:13 | | 6 | [Some friends and a place to call home](https://open.spotify.com/track/0zzcjPRxa26NDyex0GoJs2) | [Mōzi](https://open.spotify.com/artist/5w603RhGuKjuDtIlkQt3E4) | [Some friends and a place to call home](https://open.spotify.com/album/6LObxZbQLxl6dBVRfzVAFm) | 3:07 | | 7 | [Out Of The Blue](https://open.spotify.com/track/1MjtLhNGLURCcXNV4WXJch) | [Katie Pruitt](https://open.spotify.com/artist/1c5w8KrxGwq44fxM5lGB4s) | [Out Of The Blue](https://open.spotify.com/album/4WoTj3vWOdk1TSmgEXSpFg) | 5:12 | | 8 | [The Weather](https://open.spotify.com/track/4xxF57q6wmLa400D3AA2u2) | [Lawrence](https://open.spotify.com/artist/5rwUYLyUq8gBsVaOUcUxpE) | [The Weather](https://open.spotify.com/album/3vuzxiXjlgj16rGerO98UL) | 2:49 | | 9 | [High Hope](https://open.spotify.com/track/5MYYZbWQURrLcgim1jGvm6) | [Patrick Droney](https://open.spotify.com/artist/78Rk1F0jGdipWWfrhyWwt3) | [Patrick Droney](https://open.spotify.com/album/6tAWZMiijCCTJG2Pz5MlDJ) | 4:11 | | 10 | [Love & War in Your Twenties](https://open.spotify.com/track/2NU2U0wWzGtZtuu1QsacOb) | [Jordy Searcy](https://open.spotify.com/artist/0AV5z1x1RoOGeJWeJzziDz) | [Dark in the City](https://open.spotify.com/album/33AW6ZS39DUZZrxl76EyAs) | 4:10 | | 11 | [July \(feat\. Leon Bridges\)](https://open.spotify.com/track/3V0nnQhqvbE3JmiDdnzQFQ) | [Noah Cyrus](https://open.spotify.com/artist/55fhWPvDiMpLnE4ZzNXZyW), [Leon Bridges](https://open.spotify.com/artist/3qnGvpP8Yth1AqSBMqON5x) | [July \(feat\. 
Leon Bridges\)](https://open.spotify.com/album/3tRmxSQyoyXXwcVDcUFQic) | 2:32 | | 12 | [Goodbye To Yesterday](https://open.spotify.com/track/3dWTDAzjbghk0Jmd9XbFjS) | [Tom Bailey](https://open.spotify.com/artist/2oazmaA42Jf78TZeTsUIDU) | [Goodbye To Yesterday](https://open.spotify.com/album/5SK4VfE7kAZMMeWmNCw3un) | 3:32 | | 13 | [Julie](https://open.spotify.com/track/0by6jrvo2WRxz6hVVoXKla) | [Anduze](https://open.spotify.com/artist/52uJn5izVG1gicalLRYGQn) | [Julie](https://open.spotify.com/album/47zipYk7xL6mzE84tiAJZa) | 3:27 | | 14 | [Quarantined With You](https://open.spotify.com/track/6ZxWtJqM3r2imGqHFxZSDr) | [Lawrence](https://open.spotify.com/artist/5rwUYLyUq8gBsVaOUcUxpE) | [Quarantined With You](https://open.spotify.com/album/13UK0xfhuNj1kkmz7QuBbs) | 3:22 | | 15 | [Sweeter Than Peaches](https://open.spotify.com/track/5TQr5GihbdY4GuYURxW4Lx) | [Anthem Lights](https://open.spotify.com/artist/7kwEvDE8e7EBGKh5bLczqQ) | [Sweeter Than Peaches](https://open.spotify.com/album/2lZUEwvmSL9NWgggV28n6T) | 2:07 | | 16 | [Move Me](https://open.spotify.com/track/3zyBMnHwaDlf1xsRwJ5tzq) | [RuthAnne](https://open.spotify.com/artist/31rVRoX5ZG9ZyRbHvlEwjA) | [Matters Of The Heart](https://open.spotify.com/album/56GwQuFFJwfFMcSsmvwUYK) | 2:58 | | 17 | [Why Don't You Touch Me](https://open.spotify.com/track/7wnV2fCAoOMkFCMGFxUmsM) | [Leon Bridges](https://open.spotify.com/artist/3qnGvpP8Yth1AqSBMqON5x) | [Why Don't You Touch Me](https://open.spotify.com/album/04NUa9SytI8eol6ylIS9ai) | 3:17 | | 18 | [Autumn Town Leaves](https://open.spotify.com/track/4xI1VwZa8u3ymeS8hsHKed) | [Iron & Wine](https://open.spotify.com/artist/4M5nCE77Qaxayuhp3fVn4V) | [Weed Garden](https://open.spotify.com/album/1xKEX6KhO9pRM85WT7aOel) | 3:15 | | 19 | [Static Feels](https://open.spotify.com/track/717JJIF2yw3dV3XaUr0w7c) | [Delta Maid](https://open.spotify.com/artist/3U3DcUha9m8BQGBhe338S9) | [Static Feels](https://open.spotify.com/album/1Z0YVTppPlqx1eX1zkFqAD) | 2:57 | | 20 | [Right 
Now](https://open.spotify.com/track/2cXOqys4hyhCdwy4qp6hYD) | [Ayelle](https://open.spotify.com/artist/5aNJpeK3hUdPY9orfExdOF) | [Right Now](https://open.spotify.com/album/3LjaQJxClcafNQjSNwJytA) | 3:59 |
| 21 | [Consider Me](https://open.spotify.com/track/6eA8pANu9ryDcoTTe5myKk) | [Allen Stone](https://open.spotify.com/artist/536osqBGKzeozje8BfcGsa) | [Building Balance](https://open.spotify.com/album/2vExIljZtXXu7wRRENGGwy) | 3:04 |
| 22 | [Keep Your Head Up Princess](https://open.spotify.com/track/19pChrR4hwdINqoOFUo2Hj) | [Anson Seabra](https://open.spotify.com/artist/2jHp7gQArCQrlMvdrIVFCg) | [Keep Your Head Up Princess](https://open.spotify.com/album/5Wwvdrq2pNP4zWBh6NtdvK) | 3:18 |
| 23 | [Summer Rain \(feat\. Jazmine Sullivan\)](https://open.spotify.com/track/4fPO3377M8WMykvWUJmzeO) | [Leon Bridges](https://open.spotify.com/artist/3qnGvpP8Yth1AqSBMqON5x), [Jazmine Sullivan](https://open.spotify.com/artist/7gSjFKpVmDgC2MMsnN8CYq) | [Gold\-Diggers Sound \(Deluxe\)](https://open.spotify.com/album/6SV7Sl0rmVeMuqYlMMAqQB) | 2:44 |
| 24 | [Cross My Mind](https://open.spotify.com/track/6TKMXuPoDlPZYEkpS5CMB7) | [Olivia Dean](https://open.spotify.com/artist/00x1fYSGhdqScXBRpSj3DW) | [Growth](https://open.spotify.com/album/4pCquj67SkPo5SNpJ5Rsjs) | 2:50 |
| 25 | [I Found A Love](https://open.spotify.com/track/4X1pp61d0eFrOx07WHrRmx) | [José James](https://open.spotify.com/artist/4l2MwXYwUDQKHcUXwCZjEz), [Taali](https://open.spotify.com/artist/5SkhihNXZNPmooUcbSVZho) | [No Beginning No End 2](https://open.spotify.com/album/0KKurlvDI9YjNgwHm8fNP4) | 3:44 |
| 26 | [Storms](https://open.spotify.com/track/2SLUYFuTP916sEvAUDjdDx) | [Lady Wray](https://open.spotify.com/artist/1plioVQ0mcgAO7uhvWkJJy) | [Storms](https://open.spotify.com/album/2iUeeOtZyp0oDyvbESnnwT) | 2:56 |
| 27 | [Until Your Heart Breaks](https://open.spotify.com/track/1KMRQT3HgJ4xKGhBr6BhTz) | [Jennifer Chung](https://open.spotify.com/artist/4Lu5b0djNHU6poRdy9db1g) | [Until Your Heart Breaks](https://open.spotify.com/album/5GlFPzvXhHqsYxbcLANITK) | 3:27 |
| 28 | [Icarus](https://open.spotify.com/track/2KzpsnZztszanilxkAQBTh) | [Aaron Taylor](https://open.spotify.com/artist/1evO4fwLsEkkPGq32dCix7) | [ICARUS](https://open.spotify.com/album/6aGniRX6xCiybxgzc6llFD) | 4:58 |
| 29 | [Lovin' You Is Easy](https://open.spotify.com/track/5gymzsD0vPHUNAcQioB1Zv) | [Vanmiran](https://open.spotify.com/artist/7AQmzPYke7fmBDOvTCIxPO) | [Lovin' You Is Easy](https://open.spotify.com/album/5jqhttN809Cz2orcp04RHE) | 2:57 |
| 30 | [Real Estate](https://open.spotify.com/track/2VvTMIXG760UiG9OSiFt9j) | [Adam Melchor](https://open.spotify.com/artist/54tv11ndFfiqXiR03PwdlB) | [Real Estate](https://open.spotify.com/album/6nViscFT0UsASEnVlpObPI) | 3:36 |
| 31 | [Fly \(For Mike\) feat\. Brittany Howard](https://open.spotify.com/track/0qKdmI20wCPmmjSWY6wO6K) | [Nate Smith](https://open.spotify.com/artist/3C1TdpEowpf6AMf7PycuWy), [Brittany Howard](https://open.spotify.com/artist/4XquDVA8pkg5Lx91No1JxB) | [Fly \(For Mike\) Feat\. Brittany Howard](https://open.spotify.com/album/41YY1uiKkpKySBWEpqzWpj) | 4:01 |
| 32 | [This Too Shall Last](https://open.spotify.com/track/0CuXzMEgFzuQhLEYQHYas4) | [Anderson East](https://open.spotify.com/artist/5q6z6GTth6lMbL9I8CAgby) | [Encore](https://open.spotify.com/album/6EIlfHDKZxmDMjcaRbFj8d) | 3:42 |
| 33 | [Here B4](https://open.spotify.com/track/4Pv7cz45vrALeaOqhqVELU) | [JNR WILLIAMS](https://open.spotify.com/artist/7GZfE8P3kSPhhzq854OMxk) | [Here B4](https://open.spotify.com/album/6aADEr2kdQbGTQ3Wgupzyr) | 3:53 |
| 34 | [Give in to Me](https://open.spotify.com/track/0WRV8ZT2bzSINqpufUIazn) | [Charlotte Leigh](https://open.spotify.com/artist/7yifAbF2E8xMhw1UMnuw0r) | [Give in to Me](https://open.spotify.com/album/2kTC04fQ5TMi8oIMxSCvk5) | 4:31 |
| 35 | [\(Wish I Didn't Have to\) Lie \[feat\. JORDY\]](https://open.spotify.com/track/0HXEPlkF9FrfuuDBCbsEAO) | [Catie Turner](https://open.spotify.com/artist/3nYYI90ObxhjLjdxaoXGSa), [JORDY](https://open.spotify.com/artist/0p9SPN0Vhv6aDRZCz4W13E) | [\(Wish I Didn't Have to\) Lie \[feat\. JORDY\]](https://open.spotify.com/album/6Xw2btwdvf4zFwt8XxBG0r) | 2:55 |
| 36 | [One Day](https://open.spotify.com/track/60npAyGdzYq1MzCoSkNfXo) | [Cleo Sol](https://open.spotify.com/artist/3ETLPQkcEd7z4k3IbZmXMq) | [Mother](https://open.spotify.com/album/3cDl7l5FGQi93NgtqFR1gR) | 8:25 |
| 37 | [Now You're Here](https://open.spotify.com/track/2iAzigOFf61dLLPaiEDAcZ) | [Yola](https://open.spotify.com/artist/2gqMBdyddvN82dzZt4ZF14) | [Stand For Myself](https://open.spotify.com/album/1aF9Xjtg1d1wwsE4hRAkQV) | 4:08 |
| 38 | [Carry You](https://open.spotify.com/track/6csk7RGrFhxbi6hPwQImPf) | [The Teskey Brothers](https://open.spotify.com/artist/2nTjd2lNo1GVEfXM3bCnsh) | [Run Home Slow](https://open.spotify.com/album/3DBne1mcu8Glvx5au357Io) | 4:32 |
| 39 | [Adeline](https://open.spotify.com/track/796IKFUy6xJKcJuV4loLrt) | [The Dip](https://open.spotify.com/artist/2qFOYqFxPaIwEnffVhJhEn) | [The Dip Delivers](https://open.spotify.com/album/5NaNDBt0tTh3Y8GiS3zfoI) | 3:32 |
| 40 | [Honey + Tea](https://open.spotify.com/track/7t9rZKKU43qcg5oyUEp4XL) | [Mōzi](https://open.spotify.com/artist/5w603RhGuKjuDtIlkQt3E4) | [Human](https://open.spotify.com/album/5fiWptoi1Xs9XA9pew7g0P) | 3:42 |
| 41 | [Hold On](https://open.spotify.com/track/6bGMSP3H9YqkmaLnaJTIoF) | [Adele](https://open.spotify.com/artist/4dpARuHxo51G3z768sgnrY) | [30](https://open.spotify.com/album/21jF5jlMtzo94wbxmJ18aa) | 6:06 |
| 42 | [Faith You Might](https://open.spotify.com/track/6UUL6mVus7dn7sw5O6Zd7p) | [Kevin Garrett](https://open.spotify.com/artist/56tbeL5xhBPxby544GuK3E) | [Faith You Might / In Case I Don't Feel](https://open.spotify.com/album/1xroovsaQjS7u5DmTvd0VT) | 4:12 |
| 43 | [New Start](https://open.spotify.com/track/7ywmf8Yz7s7erljhh06QGl) | [Moss Kena](https://open.spotify.com/artist/2u6jNcpusijFS6ZzuWRwMv) | [New Start](https://open.spotify.com/album/2y1oMqGnG9WyLcuwVJSPz8) | 4:42 |
| 44 | [Blindsided](https://open.spotify.com/track/3Bcr5fRN2SQ0jIVWsJHes8) | [Charlotte Leigh](https://open.spotify.com/artist/7yifAbF2E8xMhw1UMnuw0r) | [Blindsided](https://open.spotify.com/album/21NvjYBnFikMWyL3fUY0jg) | 4:20 |
| 45 | [Midnight River \(feat\. 6LACK\)](https://open.spotify.com/track/5HphhcOuLFWBj9IghbrKJB) | [Pink Sweat$](https://open.spotify.com/artist/1W7FNibLa0O0b572tB2w7t), [6LACK](https://open.spotify.com/artist/4IVAbR2w4JJNJDDRFP3E83) | [Midnight River \(feat\. 6LACK\)](https://open.spotify.com/album/6u037PKoI8rUCv4upVCKVx) | 3:04 |
| 46 | [What We Found](https://open.spotify.com/track/60npzcMuSF3IgdQoAHIFfx) | [Jesse Barrera](https://open.spotify.com/artist/51KbY36mrjHRQwvSbel74l), [Nieman](https://open.spotify.com/artist/4SwV4H2atecTIdXKyLfSfR), [Melissa Polinar](https://open.spotify.com/artist/2O6S01fSY6YHfZT6qLAgxG) | [What We Found](https://open.spotify.com/album/3DmN6DaelY8XtdGFQzuBzp) | 3:10 |
| 47 | [Lady Like](https://open.spotify.com/track/33UI8dgfkDOf32DWBH1GL0) | [Ingrid Andress](https://open.spotify.com/artist/0jPnVIasXzBYjrlpO5irii) | [Lady Like](https://open.spotify.com/album/3l0QTUgr0CKXgpQb9PAdmH) | 3:14 |
| 48 | [Nothing](https://open.spotify.com/track/6AxRGtu8gdKPeynxdHsmzC) | [Bruno Major](https://open.spotify.com/artist/0hDjKSKjl1DC7ovYTDJHe8) | [Nothing](https://open.spotify.com/album/6vxh3eBU46aqiwersrZ090) | 2:42 |
| 49 | [Have It Your Way](https://open.spotify.com/track/7hGJ3EmVvVOrFciDx0P8vm) | [Nina Soro](https://open.spotify.com/artist/3uzkKm1uj1EWWY0uxkEqZA) | [Have It Your Way](https://open.spotify.com/album/46YLWCQYjiqZ3M1lC40fym) | 4:25 |
| 50 | [Never Tear Us Apart](https://open.spotify.com/track/3I7Fqr8wLi60aZ9P4wIFwq) | [The Teskey Brothers](https://open.spotify.com/artist/2nTjd2lNo1GVEfXM3bCnsh) | [Never Tear Us Apart](https://open.spotify.com/album/4doBTxqHZNmn9a06bP3A5s) | 3:24 |
| 51 | [Still Blue](https://open.spotify.com/track/24kEcxphKufCs6KNPkyf5K) | [Cypress](https://open.spotify.com/artist/0J7CXD5MWLqn3pcTDoGEhu) | [A Fine Line](https://open.spotify.com/album/3yp0ICS4BC2ZizQ0k8yQCV) | 5:00 |
| 52 | [Separate Hearts](https://open.spotify.com/track/101SZzhLjpsvvcB4D24lBf) | [Brooke Sierra](https://open.spotify.com/artist/6cGIpFhOg9W97aQtBwv5nx), [LALAKI](https://open.spotify.com/artist/2aGkysuC0eAfw83Yug6GRb) | [Separate Hearts](https://open.spotify.com/album/3jk5XAvxoi6vYBwtQMCR77) | 3:05 |
| 53 | [For Me, It's You](https://open.spotify.com/track/2odYUJ9LhwDVVPxXL3NxuB) | [Lo Moon](https://open.spotify.com/artist/2XcWfmG3wclCLfTJb7mFeg) | [For Me, It's You](https://open.spotify.com/album/14U9vSKFIUzR7pebXtQhi8) | 4:00 |
| 54 | [for life](https://open.spotify.com/track/3wD5R5grD04FKW0zsBuiKy) | [Zachary Knowles](https://open.spotify.com/artist/5BxcZnUcETSt90VlbsdugI) | [magnolia](https://open.spotify.com/album/4lUystyYlQEstc79Eic2Q6) | 3:31 |
| 55 | [Stay](https://open.spotify.com/track/4zu19OtPZq8ljWVhPthram) | [Abraham Alexander](https://open.spotify.com/artist/2f6fW5uWhqbEDXDK6IGirN) | [Stay](https://open.spotify.com/album/1Thc6xEDDKf6LsyE55ytOv) | 3:49 |
| 56 | [Musta Been Something](https://open.spotify.com/track/0Vr6vFlxGoAMs6RPA5vJZY) | [Lake Street Dive](https://open.spotify.com/artist/3nuc29fYGlQbIrwh4yrNWd) | [Free Yourself Up](https://open.spotify.com/album/1eWP4CbLwVsVuC44utXBOD) | 5:46 |
| 57 | [Pretty Girl Hi Reimagined](https://open.spotify.com/track/6aSdfeuUMtk7XOEHn93kkP) | [UMI](https://open.spotify.com/artist/4ClziihVpBeFXNyDH83Lde) | [Introspection Reimagined](https://open.spotify.com/album/3H1v8w26UVp4tylel4cRrr) | 3:33 |
| 58 | [If Only](https://open.spotify.com/track/2vHUqWBW12TJqsE8lJGsK1) | [Rachel Mazer](https://open.spotify.com/artist/1gN0EvPI7000a53bw1MXbl) | [How Do We Get By](https://open.spotify.com/album/5ZLHkbCn4a3EtUzhkOSJsY) | 4:13 |
| 59 | [Silhouettes](https://open.spotify.com/track/4de0lucJwbb1VvKYcwgstC) | [Oscar Blue](https://open.spotify.com/artist/1LSKJziUwTOlquPaHzHt4Z) | [Silhouettes](https://open.spotify.com/album/0zoZmf4aqmKAXxrBuCgxc6) | 3:54 |
| 60 | [Moment](https://open.spotify.com/track/7hKTSydLsRENnTOqHiPVT0) | [Jay Warren](https://open.spotify.com/artist/692sqZdmqkQjdCk87QECpd) | [Moment](https://open.spotify.com/album/45eSjtBmN4SZsW3RB4RXUu) | 3:15 |
| 61 | [Stupid Me](https://open.spotify.com/track/2YlCAofl2PplOmHSZQRTLk) | [Dylan Dunlap](https://open.spotify.com/artist/7CanUos0itnFLMrCiT839W) | [Stupid Me](https://open.spotify.com/album/2e0sawtfwsMGQWd1RLvM2I) | 3:47 |
| 62 | [Old Love \(Bonus\)](https://open.spotify.com/track/6saFywPB1ofkLwuy95UQXc) | [Dominique Fils\-Aimé](https://open.spotify.com/artist/10tvYvaoSO32hlvu3NrrPC) | [Stay Tuned!](https://open.spotify.com/album/2W5iEk6ZM6Gqf0uYtZEaFA) | 3:25 |
| 63 | [We Were Never Friends](https://open.spotify.com/track/0txG3MJ2CO3031axdnyyTD) | [Niia](https://open.spotify.com/artist/1KlUwB6uFECMC3zzvFvykx), [Solo Woods](https://open.spotify.com/artist/2SrsKqD053CseoB0321K8I) | [We Were Never Friends](https://open.spotify.com/album/40OrrMGJH929gnkfmeIN0X) | 3:21 |
| 64 | [My Life](https://open.spotify.com/track/2hPY2x0M6k8pQiBxeCzD5T) | [Loren Kramar](https://open.spotify.com/artist/3LDUvPQ97zYOzFliNk9GmV) | [My Life](https://open.spotify.com/album/32FU3BRS2YpxpVpQvCitpU) | 4:12 |
| 65 | [Fresh Roses](https://open.spotify.com/track/05RmevbSKDvHHKpYBoXYfR) | [Juke Ross](https://open.spotify.com/artist/3mDo5Nv0SWpslJe9HzA2xY) | [Fresh Roses](https://open.spotify.com/album/6DUbcDDJa547U0KvgjCthp) | 3:21 |
| 66 | [Before I Do](https://open.spotify.com/track/2oSH6sOAP0YOq6qMuO0ZMK) | [Barbra Lica](https://open.spotify.com/artist/1LWWCHWErOO9KZfcwrmS9D) | [You're Fine](https://open.spotify.com/album/7AnOsuntfZx3d3Vrnt2z3B) | 3:18 |
| 67 | [Making Room](https://open.spotify.com/track/5QRfxsSpzpWqzw4v3vKTCH) | [Kelly Schenk](https://open.spotify.com/artist/3kHGDGEmX8EWocacdi5Ijk) | [Making Room](https://open.spotify.com/album/4Dca3AwituUnPJFqTMTolt) | 4:19 |
| 68 | [Everything's Fine](https://open.spotify.com/track/6SRzKq8y1qHT2aSKphQ03C) | [Jamie Drake](https://open.spotify.com/artist/7rvB7ONJSqlmaCrcbhelir) | [Everything's Fine](https://open.spotify.com/album/3OZkBeZ9S4LDYvmn3jQ9kZ) | 3:29 |
| 69 | [Baby](https://open.spotify.com/track/1CoEu5QifQZUhFYrLIjgFx) | [Brandon](https://open.spotify.com/artist/08HpiyWkp2Z7gFTkVae265) | [Baby](https://open.spotify.com/album/7qnpbQkt7O7CVy7cgu1g6k) | 1:51 |
| 70 | [Reminiscing](https://open.spotify.com/track/2ZIbiTRemxHiYAPg0UqoLY) | [Datsunn](https://open.spotify.com/artist/4zosWP0ung7qeYevTLfuXV) | [Reminiscing](https://open.spotify.com/album/50VMTNsmINXwJGvxgjHxAp) | 2:43 |

Snapshot ID: `MTY0MTY4ODg3OCwwMDAwMDAwMGIzNWFlMTg5ZWEwZjlmZGQ2NzU4MjIwNTdhOGY3NTY4`
218.121951
374
0.759924
yue_Hant
0.59773
3f38cc77ce9d1bf0b095745ee8f39d008cfd3859
129
md
Markdown
README.md
leicam/facec-webapi-featherhttp-2021
471f1cdeace23484af4a5a39722bf9e04c662ab9
[ "MIT" ]
null
null
null
README.md
leicam/facec-webapi-featherhttp-2021
471f1cdeace23484af4a5a39722bf9e04c662ab9
[ "MIT" ]
null
null
null
README.md
leicam/facec-webapi-featherhttp-2021
471f1cdeace23484af4a5a39722bf9e04c662ab9
[ "MIT" ]
1
2021-06-20T20:31:37.000Z
2021-06-20T20:31:37.000Z
# facec-webapi-featherhttp-2021

An API without controllers, for initial studies with .NET Core 3.1 using the FeatherHttp library
43
96
0.821705
por_Latn
0.794214
3f38d17c764d1d300428387f2f86d62e88d58803
521
md
Markdown
content/05.1911-1920/01.1911/chapter.md
GospelSounders/sabbathschool
275de83503eec8d69c73722ff0802de236ae531e
[ "Apache-2.0" ]
null
null
null
content/05.1911-1920/01.1911/chapter.md
GospelSounders/sabbathschool
275de83503eec8d69c73722ff0802de236ae531e
[ "Apache-2.0" ]
2
2021-06-22T11:54:55.000Z
2022-01-22T21:40:48.000Z
content/05.1911-1920/01.1911/chapter.md
GospelSounders/sabbathschool
275de83503eec8d69c73722ff0802de236ae531e
[ "Apache-2.0" ]
null
null
null
---
title: 1911
metadata:
    description: 1911 Sabbath School Lessons, Acts of the Apostles, Acts of the Apostles, Acts of the Apostles, Acts of the Apostles
    keywords: Old Testament History,
author: Brian Onang'o
---

#### 1911

YEAR/QUARTER | 1 | 2 | 3 | 4
-------------|------------|---|--|---
1911 | [Acts of the Apostles](/1911-1920/1911/quarter1) | [Acts of the Apostles](/1911-1920/1911/quarter2) | [Acts of the Apostles](/1911-1920/1911/quarter3) | [Acts of the Apostles](/1911-1920/1911/quarter4) |
37.214286
213
0.642994
eng_Latn
0.513764
3f38f3fa2168c667cc74efe44db18cbd97bdb82c
439
md
Markdown
Docs/GPU/imagedata.enlarge.md
hachem2001/LIKO-12
cf957c13def13b9852b8b73d666a6b53e88ff1bf
[ "MIT" ]
4
2019-06-15T21:04:56.000Z
2021-11-15T09:28:32.000Z
OS/DiskOS/Help/GPU/imagedata.enlarge.md
Rami-Sabbagh/BatteryMan
3dc86c294c3f12077ad72d83744becbaf8961041
[ "MIT" ]
4
2019-06-01T13:43:26.000Z
2020-01-09T02:44:24.000Z
OS/DiskOS/Help/GPU/imagedata.enlarge.md
Rami-Sabbagh/BatteryMan
3dc86c294c3f12077ad72d83744becbaf8961041
[ "MIT" ]
1
2018-03-30T14:49:48.000Z
2018-03-30T14:49:48.000Z
This function multiplies the image dimensions by the given scale, automatically scales the image data, and returns the resulting imagedata.

---

#### Syntax:

```lua
enimgdata = imgdata:enlarge(scale)
```

---

#### Arguments:

* **<scale\> (Number)**: The scale to multiply the image dimensions by; must be an integer bigger than zero.

---

#### Returns:

* **enimgdata ([GPUImageData](imagedata.md))**: The resulting imagedata.
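To make the behaviour concrete: enlarging by an integer factor repeats each pixel `scale` times in both dimensions (nearest-neighbour upscaling, which is the natural reading of "automatically scales the image data" for a pixel-art console). Below is a minimal Python sketch of that idea; the `enlarge` helper and its list-of-lists pixel grid are illustrative stand-ins, not part of the LIKO-12 API:

```python
def enlarge(pixels, scale):
    """Nearest-neighbour upscale: repeat each pixel `scale` times
    horizontally and each row `scale` times vertically."""
    # Mirror the documented argument check: an integer bigger than zero.
    if not isinstance(scale, int) or scale <= 0:
        raise ValueError("scale must be an integer bigger than zero")
    return [
        [px for px in row for _ in range(scale)]  # widen the row
        for row in pixels
        for _ in range(scale)                     # then repeat it vertically
    ]

# A 2x2 grid scaled by 2 becomes a 4x4 grid.
print(enlarge([[1, 2], [3, 4]], 2))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```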
21.95
147
0.703872
eng_Latn
0.914334
3f3901082a470eda31d26b63f062364bb8714518
4,524
md
Markdown
content/practiceAreas/knee-injuries-lawyers-in-austin.md
emmadigital/austinlaw
86800e56e9d74fcd48b44ef567957b3bc347e338
[ "MIT" ]
null
null
null
content/practiceAreas/knee-injuries-lawyers-in-austin.md
emmadigital/austinlaw
86800e56e9d74fcd48b44ef567957b3bc347e338
[ "MIT" ]
null
null
null
content/practiceAreas/knee-injuries-lawyers-in-austin.md
emmadigital/austinlaw
86800e56e9d74fcd48b44ef567957b3bc347e338
[ "MIT" ]
2
2020-09-03T08:40:31.000Z
2021-09-03T03:48:18.000Z
---
template: PracticePage
title: Knee Injuries Lawyers in Austin
status: Published
date: 2020-09-03
featuredImage: /images/austin-knee-injury-lawyers.jpg
excerpt: Knee injuries can be debilitating. When you cannot walk properly, you may put more weight on your other leg resulting in a poor compensation mechanism.
categories:
  - category: Serious Personal Injury
meta:
  title: Knee Injuries Lawyers in Austin
  description: Knee injuries can be debilitating. When you cannot walk properly, you may put more weight on your other leg resulting in a poor compensation mechanism.
---

<iframe width="560" height="315" src="https://www.youtube.com/embed/i3hxG2Vdbf4" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

<!--StartFragment-->

Knee injuries can be debilitating. When you cannot walk properly, you may put more weight on your other leg, resulting in a poor compensation mechanism. The uneven load balancing puts you at risk of falling and further injuring yourself. Many people find it difficult to drive and just get around when they have suffered a knee injury.

<!--EndFragment-->

![](/images/austin-knee-injury.jpg)

<!--StartFragment-->

Although knee pain can occur from everyday wear and tear or overuse, the most common cause of knee pain is injury. An injury can be caused by sports, a work-related injury, a [slip and fall injury](/practice-areas/slip-and-fall-injury-lawyers/), or a [car accident](/practice-areas/car-accident-lawyers/). Sudden injuries may be caused by a direct hit to the knee from a car dashboard, or from twisting, bending or falling in just the wrong way.

## Structure of the Knee

The knee is composed of ligaments and discs in addition to the bones of the knee. The Anterior Cruciate Ligament and Posterior Cruciate Ligament (ACL and PCL) are found inside your knee joint and form an X, kind of like a pretzel. These ligaments control the back and forth motion in the knee as well as offering stabilization of the knee.

The upper and lower bones of the knee are separated by the menisci, cartilage that acts as a shock absorber between the bones. The menisci, which is plural for meniscus, protect the knee joint surface and absorb shock from daily activities such as walking and running.

The types of knee injuries are overuse injuries, sprains and strains, ligament tears in the ACL or PCL, fractures, and kneecap dislocation, which is most common in 13-18 year old girls.

## Types of Knee Injuries

A [sprain or strain](/practice-areas/soft-tissue-damage-attorneys/) is an injury to the ligaments and tendons that connect and support the kneecap. The tendons can twist, pull, hyper-extend or even go beyond their intended radius, depending on the force of a collision between vehicles.

A [fracture of the kneecap](/practice-areas/broken-bone-injury-attorneys/) can occur in the lower portion of the femur or the upper part of the tibia or fibula. These types of [serious injuries](/practice-areas/serious-personal-injury/) are most commonly caused by abnormal force, such as falling on the knee, a severe twisting motion, or the knee hitting an object very hard, like the car in a car accident.

Another common type of knee injury is a tear of the meniscus, a disk-like mass of cartilage and soft tissue located behind the knee. When an accident occurs, the force created by the impact can tear or rupture one's menisci. A slight tear causes moderate discomfort and pain, unlike a rupture, which is very painful.

## Knee Injury Treatment

The treatments for serious knee injuries involve everything from ice and rest to more invasive treatments such as surgery. The type of treatment is determined by the location, type, and severity of the injury.

When should you seek treatment and call the doctor? If you cannot put weight on the knee or straighten it, have severe pain, or have increased swelling within 30 minutes of an injury, or if the swelling and pain do not improve within 2 days of home first aid and medical treatment. If you are experiencing numbness or tingling or cannot move the knee normally because of weakness, then you should seek medical attention immediately.

### Knee Injuries Lawyers

If you suffered a knee injury in a car crash or slip and fall, contact our injury attorneys to see whether you may have a claim. Call (512) 246-9191 or fill in the form below.

<!--EndFragment-->
66.529412
405
0.78603
eng_Latn
0.999761
3f39dcd87bfa0ad24927a5bdede39ff9038cd0b0
1,097
md
Markdown
README.md
Erik-jpg/aws-thought
d77d7767985d0f9005ee1668837b74c3fd348701
[ "MIT" ]
1
2021-11-14T20:59:24.000Z
2021-11-14T20:59:24.000Z
README.md
Erik-jpg/aws-thought
d77d7767985d0f9005ee1668837b74c3fd348701
[ "MIT" ]
null
null
null
README.md
Erik-jpg/aws-thought
d77d7767985d0f9005ee1668837b74c3fd348701
[ "MIT" ]
null
null
null
# aws-thought

**Title:** Create an AWS account

**User Stories**

* As a developer, I want to be able to view the management console in AWS.
* As a developer, I want to manage my IAM role.
* As a developer, I want to set up a billing alert.

**Title:** Set up API endpoints

**User Stories**

* As a developer, I need to create a route to query all the thoughts.
* As a developer, I need to create a route to query all the thoughts from a user.
* As a developer, I need to create a route to create a thought.

**Title:** Integrate the database calls into the front end

**User Stories**

* As a user, I want to view all the thoughts.
* As a user, I want to be able to create a new thought.
* As a user, I want to view thoughts of a user.

**Title:** Add Images to the Application

**User Stories**

* As a user, I want to add an image to my thought.
* As a user, I want to see all images.
* As a user, I want to view images of a user.

**Title:** Deploy the app to an EC2 instance

**User Stories**

* As a user, I want to be able to visit the app on a public URL.
36.566667
83
0.679125
eng_Latn
0.99894
3f3a8fa10f4e62aefa2fc65589220238f4599a93
335
md
Markdown
basics/python/practice/tests/README.md
haochunchang/Bioinformatics_Course
c060c1cfe2ccc3c6f0adbe3e414b3c7ce8a5385c
[ "MIT" ]
null
null
null
basics/python/practice/tests/README.md
haochunchang/Bioinformatics_Course
c060c1cfe2ccc3c6f0adbe3e414b3c7ce8a5385c
[ "MIT" ]
null
null
null
basics/python/practice/tests/README.md
haochunchang/Bioinformatics_Course
c060c1cfe2ccc3c6f0adbe3e414b3c7ce8a5385c
[ "MIT" ]
3
2020-04-12T04:43:24.000Z
2021-11-30T02:01:02.000Z
# Reference solutions

* To test your functions:

```python
# in test_xx.py (for example: test_03.py)
# change this line
import solution_03_process as rna
# to this (Assume your function name is "transcribe")
import your_code as rna
```

* Run the tests:

```bash
# Open anaconda prompt in this folder and type:
python3 test_03.py
```
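For context, the `transcribe` function the tests appear to assume is plain DNA-to-RNA transcription (replace every `T` with `U`). A minimal `your_code.py` along those lines might look like this; the `.upper()` normalisation is an added convenience, not something the tests are known to require:

```python
# your_code.py -- a candidate solution for the transcription exercise.
def transcribe(dna):
    """Return the RNA transcript of a DNA string (T -> U)."""
    # Normalise case first so lowercase input is handled too.
    return dna.upper().replace("T", "U")

print(transcribe("GATGGAACTT"))  # GAUGGAACUU
```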
17.631579
53
0.731343
eng_Latn
0.991179
2dc02a2044d476b74fae781cb1f7ee6369820398
2,870
md
Markdown
README.md
isotope11/bitstampede
1e4d3ba007fdf107bd6484e9a9750897364b2b13
[ "MIT" ]
2
2015-03-04T18:17:22.000Z
2018-05-18T08:09:01.000Z
README.md
isotope11/bitstampede
1e4d3ba007fdf107bd6484e9a9750897364b2b13
[ "MIT" ]
1
2018-05-17T07:36:07.000Z
2018-05-17T07:36:07.000Z
README.md
isotope11/bitstampede
1e4d3ba007fdf107bd6484e9a9750897364b2b13
[ "MIT" ]
null
null
null
# Bitstampede

[![Build Status](https://travis-ci.org/isotope11/bitstampede.png?branch=master)](https://travis-ci.org/isotope11/bitstampede)
[![Coverage Status](https://coveralls.io/repos/isotope11/bitstampede/badge.png?branch=master)](https://coveralls.io/r/isotope11/bitstampede?branch=master)
[![Code Climate](https://codeclimate.com/github/isotope11/bitstampede.png)](https://codeclimate.com/github/isotope11/bitstampede)

Bitstampede is a gem for accessing the Bitstamp API.

## Installation

Add this line to your application's Gemfile:

    gem 'bitstampede'

And then execute:

    $ bundle

Or install it yourself as:

    $ gem install bitstampede

## Usage

First, look at this picture:

![Legitimate Concern](./doc/legitimate_concern.png)

Second, stop writing api clients that are configured with class ivars ಠ_ಠ

Third, this stuff:

## Actual Usage Without Silly Pictures

```ruby
client = Bitstampede::Client.new
client.key = 'YOUR_API_KEY'
client.secret = 'YOUR_API_SECRET'
client.client_id = 'YOUR_CLIENT_ID'

# Alternatively, you can configure the client on initialization with:
client = Bitstampede::Client.new(key: 'YOUR_API_KEY', secret: 'YOUR_API_SECRET', client_id: 'YOUR_CLIENT_ID')

# Fetch your balance
client.balance
# => #<Bitstampede::Entities::Balance:0x0000000259f338 @usd_balance=#<BigDecimal:259e898,'0.0',9(9)>, @btc_balance=#<BigDecimal:2726698,'0.0',9(9)>, @usd_reserved=#<BigDecimal:2726328,'0.0',9(9)>, @btc_reserved=#<BigDecimal:2725fb8,'0.0',9(9)>, @usd_available=#<BigDecimal:2725c48,'0.0',9(9)>, @btc_available=#<BigDecimal:27258b0,'0.0',9(9)>, @fee=#<BigDecimal:2725540,'0.0',9(9)>>

client.orders
#=> [ #<Bitstampede::Entities::Order:0x000000027302d8 @id=0, @datetime=0, @type=:buy, @price=#<BigDecimal:272f428,'0.0',9(9)>, @amount=#<BigDecimal:272f130,'0.0',9(9)>> ]

# Place a limit order to buy one bitcoin for $100.00 USD
client.buy!(BigDecimal('1'), BigDecimal('100'))

# Place a limit order to sell one bitcoin for $101.00 USD
client.sell!(BigDecimal('1'), BigDecimal('101'))

# Cancel order #1234
client.cancel 1234
```

## Examples

You can run any of the examples in the `example` dir by just executing the script (except `example/example.rb`, which is the base example class). For instance, to see your balance, do the following:

```bash
ruby example/balance.rb
```

## Contributing

1. Fork it
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create new Pull Request

## License

This software is licensed under [the MIT License](./LICENSE.md).

## Contributors

These people have contributed to the gem. Many thanks!:

- Josh Adams
- [Robert Jackson](https://github.com/rjackson)
- [Seth Messer](https://github.com/megalithic)
- [James Cook](https://github.com/jamescook)
31.888889
410
0.733798
eng_Latn
0.536503
2dc0493a3a52f6c12262ff7817025e5cd08e19b1
18,166
md
Markdown
rfcs/2020-08-31-3645-graphql-api.md
VJftw/vector
b2125418b19c031e68ce0da804ff08ec24117f1c
[ "Apache-2.0" ]
null
null
null
rfcs/2020-08-31-3645-graphql-api.md
VJftw/vector
b2125418b19c031e68ce0da804ff08ec24117f1c
[ "Apache-2.0" ]
null
null
null
rfcs/2020-08-31-3645-graphql-api.md
VJftw/vector
b2125418b19c031e68ce0da804ff08ec24117f1c
[ "Apache-2.0" ]
null
null
null
# 1. RFC 3645 - 2020-08-31 - GraphQL API

This RFC proposes using [GraphQL](https://graphql.org/) for the Vector observability API:

- [1. RFC 3645 - 2020-08-31 - GraphQL API](#1-rfc-3645---2020-08-31---graphql-api)
  - [1.1. Scope](#11-scope)
  - [1.2. Motivation](#12-motivation)
    - [1.2.1. Observability](#121-observability)
    - [1.2.2. Initial Clients](#122-initial-clients)
    - [1.2.3. Protocol](#123-protocol)
  - [1.3. Internal Proposal](#13-internal-proposal)
    - [1.3.1. Tooling](#131-tooling)
    - [1.3.2. Proof-of-Concept](#132-proof-of-concept)
    - [1.3.3. Declarative Syntax](#133-declarative-syntax)
    - [1.3.4. Server implementation with `async-graphql`](#134-server-implementation-with-async-graphql)
    - [1.3.5. GraphQL `heartbeat` Subscription](#135-graphql-heartbeat-subscription)
    - [1.3.6. UI Implementation With React + `Urql`](#136-ui-implementation-with-react--urql)
    - [1.3.7. Development Workflow](#137-development-workflow)
    - [1.3.8. Implicit Stack Benefits](#138-implicit-stack-benefits)
    - [1.3.9. Later -- Authorization](#139-later----authorization)
  - [1.4. Doc-level Proposal](#14-doc-level-proposal)
  - [1.5. Rationale](#15-rationale)
  - [1.6. Prior Art](#16-prior-art)
  - [1.7. Drawbacks](#17-drawbacks)
  - [1.8. Alternatives](#18-alternatives)
  - [1.9. Outstanding Questions](#19-outstanding-questions)
  - [1.10. Discussion](#110-discussion)
  - [1.11. Plan Of Attack](#111-plan-of-attack)

## 1.1. Scope

- Exposing a public API on port 8686, that can be connected to via API clients.
- Client requirements / considerations.
- Libraries and tooling.
- Merits and disadvantages of GraphQL vs. REST and gRPC.

## 1.2. Motivation

### 1.2.1. Observability

The Vector team is working on an observability dashboard that will enable users to:

- View Vector topology.
- Validate `vector.toml` configuration.
- Check the health and metrics that govern individual sources, transforms and sinks.
- Provide deep insight into Vector I/O.
The protocol used for communication between Vector and a connecting client is required to deliver data that is:

- High fidelity (captures all metrics.)
- High volume (multiple connecting clients.)
- Real-time (sent with minimal delay between Vector raising an event, and its delivery.)
- Efficient to parse/consume (the client shouldn't require heavy computation against a payload.)
- Semantically meaningful (to translate it to visual charts and dashboards.)

### 1.2.2. Initial Clients

The initial proposal is to provide observability via two clients:

1. `vector top` / `vector tap` CLI commands.
2. A real-time web UI

This RFC focuses on the web UI, but applies equally to the CLI client due to the possibility of observing a remote Vector instance.

### 1.2.3. Protocol

This section summarizes various communications protocols, and their disadvantages for Vector observability:

**REST:**

- It's not suited for streaming data.
- It requires polling and/or long-lived connections.
- Data will initially be read-only; many HTTP verbs don't apply to us.
- The API interface isn't typed by default, making it cumbersome to work on clients without additional tooling.

**gRPC:**

- The full gRPC spec is incompatible with current browsers (see [this article](https://grpc.io/blog/state-of-grpc-web/) for a useful summary.)
- gRPC-Web offers a subset of gRPC features, including a lack of two-way messaging.
- The TLS story is harder to solve.
- gRPC-Web support is currently lacking in Rust. There's an [open issue on tower-grpc](https://github.com/tower-rs/tower-grpc/issues/35) to track.
- Enabling gRPC-Web therefore presently requires a middle-tier proxy, such as [Envoy](https://www.envoyproxy.io/), to transform requests to the gRPC-Web spec.

**WebSockets:**

- Asynchronous data payloads that are suited to streaming data, but harder to reconcile for single requests.
- Related: The need to match up a 'request ID' with a 'response ID', to determine which payload belongs to a given request.
- Long-lived connections that require explicit (re)connection logic.
- Not intrinsically type-safe for the client to consume; high degree of runtime validation.
- Non-trivial for non-web clients to interact with; not a common protocol outside of the web.

## 1.3. Internal Proposal

I propose using GraphQL for API communications. Advantages:

- A known spec with 5 years of history.
- Type safe.
- Dual HTTP + WebSockets model suits one-time and streaming queries.
- Production use at companies with heavy API workloads.
- Rich ecosystem of tooling, for server/clients.
- Trivial debugging story.
- No proxy/middle-tier required.
- Works over plain HTTP.
- Optional TLS.
- Flexible auth options (described below)
- No need to maintain separate schema. An 'introspection' query against a running Vector instance from the client provides a current view of type-safe schema.
- Trivial to document (Rust [doc comments](https://doc.rust-lang.org/stable/rust-by-example/meta/doc.html#doc-comments) become API field documentation.)
- Experience among the Vector / Timber team spanning over a year working with Alloy, covering full-stack schema design and front-end client tooling.
- I've been personally involved in GraphQL projects since 2015.

### 1.3.1. Tooling

- [async-graphql](https://github.com/async-graphql/async-graphql) (Vector). I initially tried Juniper, but subscriptions (i.e. real-time data) is still WIP and several key elements of the GraphQL spec (such as interfaces) are TBD. Source: [Feature comparison](https://github.com/async-graphql/async-graphql/blob/master/feature-comparison.md).
- [urql](https://formidable.com/open-source/urql/) (UI). We used the [React Apollo client](https://github.com/apollographql/apollo-client) in Alloy, which was mostly positive. We did run into some issues where data that lacks an `id` field would return `null`. Urql has a simpler caching story and may side-step these issues.
Both clients use React hooks, which matches our [vector-ui](https://github.com/timberio/vector-ui) tooling.

- [GraphQL Code Generator](https://graphql-code-generator.com/) (UI). We used this internally at Timber to generate Typescript types and React hooks. It offers an [urql plugin](https://graphql-code-generator.com/docs/plugins/typescript-urql) and builds type-safe/declarative React hooks which overlay urql to re-render the host React component with data and loading state.

### 1.3.2. Proof-of-Concept

I started work in #3514 to test [async-graphql](https://github.com/async-graphql/async-graphql), and expose internal metrics.

A playground is available in #3514 to test queries, including `subscription` queries:

![GraphQL Playground](2020-08-31-3645-graphql-api/screenshot.png)

### 1.3.3. Declarative Syntax

One of the major benefits of opting for the GraphQL ecosystem is enabling a more declarative style for defining the API, and consuming it.

In the following example, I will provide snippets of an example of our current 'heartbeat' subscription, which returns a UTC timestamp every `interval` milliseconds back to a connected WebSocket client.

In this example, I will demonstrate:

- Writing the server implementation
- Querying for it in the playground
- Writing the front-end client in React.

### 1.3.4. Server implementation with `async-graphql`

[async-graphql](https://github.com/async-graphql/async-graphql) is 'code first'; method implementations become GraphQL SDL and provide an implicit HTTP flow against an incoming request (in my PoC, I used [Warp](https://github.com/seanmonstar/warp), since we already depend on it.)
```rust
#[SimpleObject]
pub struct Heartbeat {
    // <-- simple GraphQL object type to provide a `utc` field
    utc: DateTime<Utc>,
}

impl Heartbeat {
    fn new() -> Self {
        Heartbeat { utc: Utc::now() }
    }
}

#[derive(Default)]
pub struct HealthSubscription; // <-- 'root' subscription type to merge

#[Subscription]
impl HealthSubscription {
    /// Heartbeat, containing the UTC timestamp of the last server-sent payload
    async fn heartbeat(
        &self,
        #[arg(default = 1000, validator(IntRange(min = "100", max = "60_000")))] interval: i32,
        // ^^ `interval` param -- defaults to 1,000ms; validates between 100ms - 60 seconds
    ) -> impl Stream<Item = Heartbeat> {
        // Return a stream of heartbeats
        tokio::time::interval(Duration::from_millis(interval as u64)).map(|_| Heartbeat::new())
    }
}
```

In the GraphQL playground, this is surfaced as a strongly typed API. Doc comments become GraphQL API comments:

![Heartbeat subscription](2020-08-31-3645-graphql-api/heartbeat.png)

### 1.3.5. GraphQL `heartbeat` Subscription

The above example can be queried with:

```gql
subscription {
  heartbeat(interval: 1000) {
    utc
  }
}
```

Which returns data as JSON every `interval` milliseconds, e.g:

```json
{
  "data": {
    "heartbeat": {
      "utc": "2020-08-31T13:10:47.152412+00:00"
    }
  }
}
```

### 1.3.6. UI Implementation With React + `Urql`

After generating types and the web client with [GraphQL Code Generator](https://graphql-code-generator.com/), the (simplified) implementation looks similar to this:

```ts
import React from "react";

// This is generated for us by GraphQL Code Generator
import { useHeartbeatSubscription } from "@/vector/graphql";

// Example component that consumes it
const ExampleComponent: React.FC = () => {
  const [{ data, fetching }] = useHeartbeatSubscription({
    variables: { interval: 1000 },
  });

  return <pre>{data?.heartbeat.utc}</pre>;
};
```

This renders the HTML `<pre>2020-08-31T13:10:47.152412+00:00</pre>`, and auto-refreshes the data with a new UTC timestamp received from the server every `1000`ms.

### 1.3.7. Development Workflow

Many of the benefits we receive with GraphQL are felt during development. Types and clients are auto-generated on introspection of a live endpoint. Appreciating that advantage is hard to see in static code blocks.

For that reason, I've recorded a 16-minute live coding session which demonstrates the typical dev workflow in the front-end. Use [this tree in `vector-ui` to follow along](https://github.com/timberio/vector-ui/tree/96fe48c35259a48185a149c09190cd076db568d7):

[![Live coding session](2020-08-31-3645-graphql-api/video.gif)](https://www.loom.com/share/9ca38feb21bb488f92d729df0148e029)

### 1.3.8. Implicit Stack Benefits

In the above example, we side-stepped a lot of complexity that would otherwise have to be explicitly designed for:

- The method implementation becomes the public API. There's no additional schema to maintain.
- Compile-time type safety (both server and client). If the query included invalid data, it would fail to compile in Vector and in [GraphQL Code Generator](https://graphql-code-generator.com/) in the UI.
- Declarative client in the UI that handles the distinction between HTTP/WebSocket connections (for `query` / `mutation` and `subscription` queries, respectively), (re)connection handling, bookkeeping of parallel in-flight requests, response caching, request fetching status, matching responses with requests, and React component re-rendering. It shaves considerable effort off explicitly designing for those scenarios ourselves.
- A type system that accommodates interfaces, unions, enums, primitives and custom scalar types. async-graphql provides built-in abstractions for chrono `DateTime` types, uuid, and other popular crates.
- A known schema for errors. Snafu compatibility for `FieldResult<T>` custom errors.

### 1.3.9. Later -- Authorization

Initial Vector observability will be single-instance, and available to anyone that has access to the configured port. Locking down access will initially rely on network configuration.
Later, as we move into multi-instance observability and more granular API permissions, the requirement for user auth and persistence will surface. While those concerns are out-of-scope for this RFC, choosing a protocol that facilitates authentication is important to avoid backing ourselves into a corner.

GraphQL is not opinionated about auth. Any authorization mechanism available at the intersection of HTTP and WebSockets is open to us.

In previous Timber projects, we appended an `Authorization: Bearer <jwt>` header for queries/mutations. For subscriptions, we passed a JWT along with the initial WebSocket connection payload; browsers pass limited headers with `Upgrade` requests, so this provided a neat approach to sidestep the lack of a comparable header for WebSockets. The JWT persisted for the life of the open WS. I anticipate doing something similar with the Vector API.

async-graphql has a [Context](https://async-graphql.github.io/async-graphql/en/context.html) struct, which is typically used for passing in shared resources such as a database connection pool, or request-specific data such as the current user session. This is largely TBD, but the basic mechanisms are there to allow for flexible auth when we need it.

## 1.4. Doc-level Proposal

A Vector observability layer has already been agreed upon internally, and work is underway. This proposal discusses the protocol which will govern communications between a running Vector instance, and a web UI and CLI. The result of this proposal won't directly impact user interaction with observability tooling.

## 1.5. Rationale

We have built a decent body of experience with GraphQL at Timber, albeit on non-Vector projects. The tooling I am proposing here represents the same stack (save for swapping Apollo for Urql).

In a previous Timber project, we used [gqlgen](https://github.com/99designs/gqlgen) as our server library, written in Go.
The tooling with async-graphql is similar, though Rust's language features enable a more composable 'code first' approach using macros.

I believe the type of data we are consuming benefits from a strongly typed interface, and that compile-time client generation will significantly reduce the time-to-market of what will already be a very complex front-end web app.

## 1.6. Prior Art

From relatively superficial searching, I've not been able to find a comparable approach to using GraphQL for internal metrics / observability. However, there are [plenty of large companies](https://graphql.org/users/) using GraphQL for public APIs, and very possibly for internal machinery that isn't public facing.

Article refs: [Github](https://github.blog/2016-09-14-the-github-graphql-api/), [Facebook](https://www.apollographql.com/blog/graphql-at-facebook-by-dan-schafer-38d65ef075af/), [Shopify](https://shopify.dev/concepts/graphql), [Intuit](https://medium.com/intuit-engineering/graphql-intuits-path-to-one-api-system-b8495e4dd281), [Airbnb](https://medium.com/airbnb-engineering/how-airbnb-is-moving-10x-faster-at-scale-with-graphql-and-apollo-aa4ec92d69e2), [Trello / Atlassian](https://www.atlassian.com/engineering/a-look-at-trello-adopting-graphql-and-apollo-in-a-legacy-application).

I looked at [Rancher](https://rancher.com/products/rancher/) and the [Kubernetes web UI](https://kubernetes.io/docs/tasks/access-application-cluster/web-ui-dashboard/) for comparison, as both offer observability over an API that's intended for internal team use. Both use JSON payloads.

Rancher uses WebSockets to stream data, offering a protocol comparable to the GraphQL schema proposed in this RFC, albeit untyped.

Kubernetes uses OpenAPI v2. There are Typescript generation tools for OpenAPI such as [swagger-to-ts](https://github.com/manifoldco/swagger-to-ts) and [OpenAPI Generator](https://github.com/OpenAPITools/openapi-generator). I have no experience with these tools.
Given the REST interface to each, I'm not sure this compares directly to typed messages over WebSockets for streaming data.

## 1.7. Drawbacks

- GraphQL is generally geared toward graph data (hence the name), although this isn't a hard requirement. IMO, it's a solid general-purpose API interface that mimics what we'd generally wind up creating internally anyway.
- Queries with unknown levels of nesting can't be queried statically; generally, a need for queries with flatter structures arises.
- Versioning is trickier, given the lack of distinct endpoints. Nothing stops multiple endpoints from representing versions, although typically this is done in the schema.
- Some annoying rough edges -- the lack of input unions, for example, typically worked around with more specific queries.
- Reliance on a relatively young library rather than a 'pure' approach such as a stdlib TCP server.
- A need to invest, longer-term, in understanding and contributing to an external library.

## 1.8. Alternatives

Essentially any browser-compatible protocol could be used, and any text/binary format. This could include other libraries such as Protobuf, which might form a hybrid of the aforementioned approaches. The maturity of tooling would need further investigation, as well as the development experience of working with any given format.

## 1.9. Outstanding Questions

- What are the performance characteristics we care about?
- How many clients do we assume we will be serving across the total of all `vector top` and Vector UI requests?
- Is JSON an acceptable format for data exchange?
- What's the GraphQL client tooling like in Rust, to enable UI-like comms with `vector top`?
- Is async-graphql performant and reliable in production?

## 1.10. Discussion

The following comprises links to relevant discussion in the original RFC PR:

- [Availability of tooling for GraphQL in Rust](https://github.com/timberio/vector/pull/3648#discussion_r480370094)
- [Streaming performance](https://github.com/timberio/vector/pull/3648#discussion_r480371934)
- [Exposing JSON payloads to a client](https://github.com/timberio/vector/pull/3648#discussion_r480372935)
- [UI tooling / workflow for generating Typescript types + clients](https://github.com/timberio/vector/pull/3648#discussion_r480381451)

## 1.11. Plan Of Attack

- [x] Determine whether there's consensus in choosing GraphQL
- [x] If yes, complete #3514 (separate RFC for observing topology required)
- [ ] If no, determine suitable alternatives
- [ ] (If applicable) create a new PoC with the alternatives
54.065476
583
0.762633
eng_Latn
0.976237
2dc0521a5409c8f8fdc61be59ab83820c6712a8c
639
md
Markdown
_posts/2012-02-07-2-day-gpu-training-course-dublin-city-university-sci-sym.md
apc-llc/apc-llc.github.io
66c3f504a5ad8df6d5609d089a69e1ab3d23af7f
[ "MIT" ]
null
null
null
_posts/2012-02-07-2-day-gpu-training-course-dublin-city-university-sci-sym.md
apc-llc/apc-llc.github.io
66c3f504a5ad8df6d5609d089a69e1ab3d23af7f
[ "MIT" ]
null
null
null
_posts/2012-02-07-2-day-gpu-training-course-dublin-city-university-sci-sym.md
apc-llc/apc-llc.github.io
66c3f504a5ad8df6d5609d089a69e1ab3d23af7f
[ "MIT" ]
5
2020-04-20T14:39:21.000Z
2020-05-10T15:05:59.000Z
---
layout: post
title: "2-day GPU training course"
tags:
- Software Engineering
- Trainings
- CUDA
dates:
- 2012-02-07
- 2012-02-08
thumbnail_path: blog/2012-02-07-2-day-gpu-training-course-dublin-city-university-sci-sym/DCU_Three_Castles.png
---

Applied Parallel Computing LLC has delivered the GPU Training Course at Dublin City University, Ireland.

![Dublin City University Three Castles crest](/assets/img/blog/2012-02-07-2-day-gpu-training-course-dublin-city-university-sci-sym/DCU_Three_Castles.png)

[Workshop program](/assets/img/blog/2012-02-07-2-day-gpu-training-course-dublin-city-university-sci-sym/UDublin_2day_school_program.pdf)
33.631579
139
0.776213
eng_Latn
0.302743
2dc092b9d985c13fe3103886e9464771d0d493a4
5,293
md
Markdown
languages/swift/exercises/concept/windowing-system/.docs/introduction.md
Limm-jk/v3
e2bf9ca090b7983c748afd45d187c59242568777
[ "MIT" ]
null
null
null
languages/swift/exercises/concept/windowing-system/.docs/introduction.md
Limm-jk/v3
e2bf9ca090b7983c748afd45d187c59242568777
[ "MIT" ]
12
2020-04-16T09:24:15.000Z
2021-05-21T11:54:06.000Z
languages/swift/exercises/concept/windowing-system/.docs/introduction.md
Limm-jk/v3
e2bf9ca090b7983c748afd45d187c59242568777
[ "MIT" ]
1
2020-04-20T11:41:55.000Z
2020-04-20T11:41:55.000Z
## structs-and-classes

## methods

## self

## value-and-reference-types

Structs and classes are two of the primary building blocks of Swift programming. They are both means of grouping together related data and functions into self-contained units of functionality. And when you define a struct or class, you are defining a new type to be used within Swift, just as you used the types you've already worked with, like `Int` and `String`.

There are many similarities between structs and classes in Swift. Among other similarities, both are able to store values in _properties_ and provide functionality through the use of _methods_. They each provide some additional functionality, which is out of scope for this exercise.

### Defining structs and classes

Both structs and classes are defined in roughly the same way. They start with the appropriate keyword followed by the type name that is being defined, then the body of the struct/class follows, placed between curly braces. The body may consist of stored properties, which are defined and behave just like regular constants or variables.

```swift
struct CharacterStats {
    var health = 0.0
    var speed = 0
    var strength = 0
}

class GameCharacter {
    var stats = CharacterStats()
    var characterClass: String?
    var name: String?
    var active = false
    let id = makeRandomID()
}
```

### Instances

As noted above, defining a struct or class is just defining a new _type_. It is just the blueprint for what the values of that type will look like, but it does not actually create any values of that type for you to work with. In order to create an _instance_ of that type, you need to write the name of the type followed by a pair of parentheses.

```swift
let someStats = CharacterStats()
let someCharacter = GameCharacter()
```

This will create values of these types, where the properties are populated with the default values supplied in the definition.
Note that in optional cases like GameCharacter's `name` property, unless a value is provided, the property will default to nil, just like regular optional variables defined without an immediate value.

With structs, Swift automatically provides something called a _memberwise initializer_, where values for the struct's properties may be provided inside the parentheses, overriding the default values in the definition.

```swift
let differentStats = CharacterStats(health: 100.0, speed: 6, strength: 18)
```

### Accessing properties

Struct and class properties can be accessed using _dot notation_, where the name of the value is followed by a dot (`.`) and the name of the property. If a property of a struct or class has properties of its own, this dot notation can be used to access these nested properties as well. This notation can be used both to retrieve the property's value and, where allowed, to change it.

```swift
someStats.health // => 0
someCharacter.name // => nil

someStats.health = 87.3
someStats.health // => 87.3

someCharacter.name = "Luther"
someCharacter.name // => "Luther"

someCharacter.id = "new id"
// Error: Cannot assign to property: 'id' is a 'let' constant
```

### Methods

Like properties, which store data in your structs and classes, you may also define _methods_, which store functions in your struct or class. Methods are defined in the same way as a regular function, only inside the body of the struct or class.

Note that if a function changes the value of a property in a struct, it must be preceded by the `mutating` keyword. Additionally, if a property can be changed by a method, that property must be defined using `var` rather than `let`, just like regular variables.
```swift
struct CharacterStats {
    var health = 0.0
    var speed = 0
    var strength = 0

    mutating func takeHit(_ damage: Double) {
        health = max(0.0, health - damage)
    }

    func canLift(_ weight: Int) -> Bool {
        weight < strength * 100
    }
}

class GameCharacter {
    var stats = CharacterStats()
    var characterClass: String?
    var name: String?
    var active = false
    let id: String = makeRandomID()

    func takesDamage(_ damage: Double) {
        stats.takeHit(damage)
        if stats.health <= 0 {
            active = false
        }
    }

    func sayName() -> String {
        return "My name is \(name ?? "no one"), my class is \(characterClass ?? "undetermined")"
    }

    func lift(_ weight: Int) -> String {
        if stats.canLift(weight) {
            return "No problem!"
        } else {
            return "Ooof! No way."
        }
    }
}
```

These methods can be called using dot notation, just like properties.

```swift
var myChar = GameCharacter()
myChar.stats = CharacterStats(health: 72.8, speed: 19, strength: 6)
myChar.active = true

myChar.lift(750) // => "Ooof! No way."

myChar.takesDamage(80)
myChar.active // => false
```

### Self

Instances of structs and classes each have an implicit value named `self` which refers to the instance itself. There are multiple uses for `self`, but it is most commonly used to disambiguate the names of properties and methods of the struct/class when there may be some confusion.

```swift
struct MySelf {
    var x = 0

    mutating func move(x: Int) {
        // here if we just say x = x it is unclear if we mean
        // the property x or the method parameter x, so we use
        // self for clarity
        self.x = x
    }
}
```
34.148387
365
0.7355
eng_Latn
0.99915
2dc10fd9a0060748d56abb8bf6edaf83971907ed
1,911
md
Markdown
notebooks/110-ct-segmentation-quantize/README.md
ThanosM97/openvino_notebooks
8b7c951a2006950a350736c7cea601c1406d4bb4
[ "Apache-2.0" ]
1
2022-01-27T14:10:26.000Z
2022-01-27T14:10:26.000Z
notebooks/110-ct-segmentation-quantize/README.md
rageshhajela16/openvino_notebooks
73a30443f7a631fa8ecb63f04a4b659c3701bc94
[ "Apache-2.0" ]
1
2022-03-02T21:42:32.000Z
2022-03-02T21:42:32.000Z
notebooks/110-ct-segmentation-quantize/README.md
rageshhajela16/openvino_notebooks
73a30443f7a631fa8ecb63f04a4b659c3701bc94
[ "Apache-2.0" ]
null
null
null
# Quantize a Segmentation Model and Show Live Inference

![kidney segmentation animation](https://user-images.githubusercontent.com/77325899/154279555-aaa47111-c976-4e77-8d23-aac96f45872f.gif)

## What's Inside

This folder contains three notebooks that show how to train, optimize, quantize and show live inference on a [MONAI](https://monai.io/) segmentation model with [PyTorch Lightning](https://pytorchlightning.ai/) and OpenVINO:

1. [Data Preparation for 2D Segmentation of 3D Medical Data](data-preparation-ct-scan.ipynb)
2. [Train a 2D-UNet Medical Imaging Model with PyTorch Lightning](pytorch-monai-training.ipynb)
3. [Convert and Quantize a UNet Model and Show Live Inference](110-ct-segmentation-quantize.ipynb)

We provide a pretrained model and a subset of the dataset for the quantization notebook, so it is not required to run the data preparation and training notebooks before running the quantization tutorial.

The quantization tutorial shows how to:

- Convert an ONNX model to OpenVINO IR with [Model Optimizer](https://docs.openvino.ai/latest/openvino_docs_MO_DG_Deep_Learning_Model_Optimizer_DevGuide.html)
- Quantize a model with OpenVINO's [Post-Training Optimization Tool](https://docs.openvino.ai/latest/pot_compression_api_README.html) API
- Evaluate the F1 score metric of the original model and the quantized model
- Benchmark performance of the original model and the quantized model
- Show live inference with OpenVINO's async API and MULTI plugin

In addition to the notebooks in this folder, the [Live Inference and Benchmark CT-scan data](../210-ct-scan-live-inference/210-ct-scan-live-inference.ipynb) demo notebook contains the live-inference part of the quantization tutorial. It includes a pre-quantized model.

## Installation Instructions

If you have not done so already, please follow the [Installation Guide](../../README.md) to install all required dependencies.
61.645161
268
0.804814
eng_Latn
0.899551
2dc1d9970bd6cfcc9688dc19e531887f26bae621
29
md
Markdown
README.md
szh-cn/markdown-photos
eacad45323b413fa46edb7a6da7b730936b85d1a
[ "Apache-2.0" ]
null
null
null
README.md
szh-cn/markdown-photos
eacad45323b413fa46edb7a6da7b730936b85d1a
[ "Apache-2.0" ]
null
null
null
README.md
szh-cn/markdown-photos
eacad45323b413fa46edb7a6da7b730936b85d1a
[ "Apache-2.0" ]
null
null
null
# markdown-photos

Markdown images
9.666667
17
0.827586
eng_Latn
0.598531
2dc1fc0a66bdd74eb5a0ecca61a9b0f28d296bfa
1,830
md
Markdown
introduction.md
dmd-program/dmd-300-fa20
5007908ad7ad82ee90132265e2f836d97d39eb32
[ "CC-BY-4.0" ]
null
null
null
introduction.md
dmd-program/dmd-300-fa20
5007908ad7ad82ee90132265e2f836d97d39eb32
[ "CC-BY-4.0" ]
null
null
null
introduction.md
dmd-program/dmd-300-fa20
5007908ad7ad82ee90132265e2f836d97d39eb32
[ "CC-BY-4.0" ]
1
2020-06-17T17:15:55.000Z
2020-06-17T17:15:55.000Z
# Introduction

This course follows [DMD 100: Digital Multimedia Design Foundations](https://legacy.gitbook.com/book/dmd-program/dmd-100-sp19/details). In that course, students were given a fairly rigid design process to follow, with prescribed outcomes and formats. In DMD 300: Digital Multimedia Design Studio, students will have the opportunity to build their own design process and work with digital formats and tools of their choosing.

## Course description

In DMD 300: Digital Multimedia Design Studio, students synthesize the concepts, theories, and applications acquired in the introductory courses and begin to think critically about their professional objectives. Students will work on projects aimed to help them understand available learning pathways and real-world applications based on their scholarly and professional interests. Students will work collaboratively to investigate a problem space, conduct a needs assessment, write a design plan or proposal, develop deliverables, and implement and evaluate the final product(s).

Students will develop a sense of stewardship over the project development process by completing project milestones that reinforce time management behaviors, participating in team building activities that facilitate discussion and interaction, co-authoring project proposals that prompt critical analysis, and distributing production tasks to encourage ownership in completing both defined and open-ended assignments. Students will also be required to thoroughly document and reflect on the production process and project impact through blogging and discussions. Throughout the course, students are encouraged to interact with industry advisors for feedback and direction as they work through real-world challenges using their selected digital tools and methodologies.
114.375
581
0.83388
eng_Latn
0.998961
2dc299503bad4afda42b36bdbf6f8bdac01f453d
1,366
md
Markdown
2020/08/27/2020-08-27 06:15.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
3
2020-07-14T14:54:15.000Z
2020-08-21T06:48:24.000Z
2020/08/27/2020-08-27 06:15.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
null
null
null
2020/08/27/2020-08-27 06:15.md
zhzhzhy/WeiBoHot_history
32ce4800e63f26384abb17d43e308452c537c902
[ "MIT" ]
null
null
null
2020年08月27日06时数据

Status: 200

1.程璐 等下次复婚的时候再办吧 微博热度:534318
2.美宣布制裁24家参与南海建岛中企 微博热度:410749
3.谁抱了张天爱 微博热度:374140
4.大张伟真实 微博热度:352639
5.乐华 微博热度:299198
6.元气音乐节 微博热度:229020
7.6000多人被网上养猫平台骗得血本无归 微博热度:216635
8.李尖尖凌霄接吻 微博热度:192759
9.日本多所大学收到恐吓邮件 微博热度:170534
10.易建联致敬科比 微博热度:160486
11.唐灿妈妈 微博热度:156484
12.张鑫的出轨对象是杨雁 微博热度:139783
13.泰国臭脸猫咪理发 微博热度:102078
14.琉璃 微博热度:92840
15.昆明烂尾楼开发商被列失信后仍卖房 微博热度:82970
16.金秀贤新剧获行政处罚 微博热度:79559
17.韩国数万名医生罢工 微博热度:73578
18.十余名非法越境人员被广东海警局抓获 微博热度:71280
19.小米任命林世伟为首席财务官 微博热度:71027
20.印度一大学举办全3D毕业典礼 微博热度:70894
21.李雪琴老板 微博热度:70696
22.李宇春穿15年前比赛衣服现身机场 微博热度:70370
23.9岁女孩练字3年写出印刷体 微博热度:70095
24.孔雪儿黑色长卷发 微博热度:69767
25.女大学生车祸离世捐器官救5人 微博热度:69508
26.以家人之名 微博热度:69124
27.贺子秋凭本事单身 微博热度:68952
28.当上舞蹈课忘了带舞蹈鞋 微博热度:68559
29.越南逮捕21名中国通缉犯 微博热度:68226
30.达芙妮宣布彻底退出实体零售 微博热度:67922
31.90岁姥姥做完美甲后悉心保护 微博热度:67801
32.张一接近杨雁儿子 微博热度:67603
33.年轻人吃火锅一次调3碗小料 微博热度:60847
34.周奇墨淘汰 微博热度:60130
35.中国人民警察警旗式样 微博热度:58511
36.薇娅高圆圆直播 微博热度:58293
37.潜水员帮鲨鱼从口中取出300只鱼钩 微博热度:54841
38.王文也 微博热度:53496
39.首个台风红色预警 微博热度:53296
40.职工医保个人账户拟可用于家人 微博热度:51443
41.敏言被逐出师门 微博热度:45708
42.第一视角的极速光轮有多爽 微博热度:45012
43.李佳琦直播 微博热度:42864
44.康辉撒贝宁尼格买提跳街舞 微博热度:41929
45.司凤璇玑拜堂 微博热度:40218
46.白色月光 微博热度:39559
47.腾蛇怼昊辰好爽 微博热度:37656
48.新冠病毒测试纸升级版 微博热度:37298
49.阿里内部隐藏P序列职级 微博热度:36118
50.白桃乌龙青森冰沙 微博热度:34561
6.696078
21
0.775988
yue_Hant
0.311614
2dc3185d197b78b4eee26164981af779c99f323a
5,367
md
Markdown
ProjectOnline/faq-resource-engagements-are-replacing-the-old-resource-plans.md
Markus-Hanisch/OfficeDocs-ProjectServer
433e64214870b79f2ed320f3555d3267d6bb3a18
[ "CC-BY-4.0", "MIT" ]
null
null
null
ProjectOnline/faq-resource-engagements-are-replacing-the-old-resource-plans.md
Markus-Hanisch/OfficeDocs-ProjectServer
433e64214870b79f2ed320f3555d3267d6bb3a18
[ "CC-BY-4.0", "MIT" ]
null
null
null
ProjectOnline/faq-resource-engagements-are-replacing-the-old-resource-plans.md
Markus-Hanisch/OfficeDocs-ProjectServer
433e64214870b79f2ed320f3555d3267d6bb3a18
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: "FAQ Resource engagements are replacing the old resource plans"
ms.author: efrene
author: efrene
manager: pamgreen
ms.date: 7/11/2017
audience: admin
ms.topic: overview
ms.service: project-online
localization_priority: Normal
ms.custom: IT_ProjectAdmin
search.appverid:
- PJO150
- PJO160
- MET150
ms.assetid: 5ddd3242-4858-4e57-a8c1-2b20c06d959a
description: "Resource engagements are new for Project Online, but what happens to the old resource plans? Read on about this new feature and how old resource plans are converted into resource engagements."
---

# FAQ: Resource engagements are replacing the old resource plans

Resource engagements are new for Project Online, but what happens to the old resource plans? Read on about this new feature and how old resource plans are converted into resource engagements.

## I've heard about this new resource engagements feature. How do I get it?

Starting on September 22, 2015, we will begin rolling out the new feature, and Project Online administrators will have the option to turn on the new feature at their convenience once it becomes available. This process is phased, and it might take some time before everyone is able to see it.

This is a site-level activation. Once an administrator has activated the feature, anyone on that site will be able to see the new **Resource Requests** page and be able to start creating and managing requests. For any new Project Online sites created after the feature is available for the tenant, the administrator will still need to activate the feature for the new instance.

We recommend that the Project Online administrator turn on the feature in a spare Project Web App (PWA) instance first to become familiar with it before turning it on in the main PWA instance. Please note, once the feature has been activated for a site it cannot be de-activated. This is a one-way operation.

## How long can a Project Online administrator wait to turn on the feature?

For any new PWA site created on April 4, 2016 or later, the resource engagements feature will already be on—no need to activate them. For PWA sites created before April 4, 2016, the Project Online administrator can choose to turn on engagements at any point. The feature will no longer be automatically activated on September 22, 2016.

## I'm excited about the new resource engagements, but what happens to my old resource plans?

The good news is, once the new feature has been turned on, your old **published** resource plans will be converted into resource engagements. When the feature is activated we will start migrating the old published resource plans.

Before you activate this feature, we recommend that you:

- Make sure your existing resource plans have timephased data.
- Republish any existing resource plans.

Since engagements are time-based, if your resource plan does not contain any timephased information (resources only), no engagements will be created. We will do our best to convert resource plans into engagements, provided all of the necessary data is available. Once the activation is complete, you will no longer be able to access the old resource plans, but can start creating new engagements to replace them.

For every published resource plan, we will use some of the data to create new engagements. The status will match that from the resource plan and all of the timephased data will be transferred. After the feature has been activated, you will no longer be able to access the old resource plan UI, and that page will be deprecated.

## How do I know if my resource plans were successfully migrated into engagements?

Once the Project Online administrator has activated the feature, there will be a section in **Additional Server Settings** that shows the progress.

![Project Online: Additional Server Settings dialog shows progress of the migration of resource plan data](media/9a8e203e-9eb7-4e79-80c5-43527e36bf92.png)

After migration is complete, the **New Resource Management Features Available** section will disappear. You should now be able to visit the **Resource Requests** page and view the newly created engagements that were based on your old resource plans.

## What about my existing reports involving resource plans?

Your existing reports will no longer return any data since the old resource plans will be deleted from the reporting database. You will be able to create any new reports you like using engagement data. We have added support through OData to create new reports on engagements.

## What about portfolio analysis? Our organization used to use resource plans to do resource modeling.

Portfolio analysis will now include engagement data instead of resource plan data. Similar to how the old resource plans were handled, you will have the option to include both proposed and committed engagements, or just committed ones, when doing your analysis.

## What about extensibility? Can I access the APIs for engagements?

There are resource engagement APIs available. The Project Server CSOM class details are here: [Microsoft.ProjectServer.Client Namespace](https://docs.microsoft.com/dotnet/api/microsoft.projectserver.client)

## When will this be available to Project Server customers?

In the next release of Project Server, customers will be able to access all of these features.
67.936709
502
0.794112
eng_Latn
0.998197
2dc35e7612eb8df2f1c13ddfb203db1e75467761
66
md
Markdown
README.md
mpi2/rt_issue_tracking
78987421dbb89aca10faa4116f95f203123065e0
[ "Apache-2.0" ]
null
null
null
README.md
mpi2/rt_issue_tracking
78987421dbb89aca10faa4116f95f203123065e0
[ "Apache-2.0" ]
2
2017-12-06T15:55:07.000Z
2020-02-12T08:20:54.000Z
README.md
mpi2/rt_issue_tracking
78987421dbb89aca10faa4116f95f203123065e0
[ "Apache-2.0" ]
null
null
null
# rt_issue_tracking

Repository to capture and support IMPC issues
22
45
0.848485
eng_Latn
0.947149
2dc3eae4fadef92aef666fe344e36d64345b2330
633
md
Markdown
_landing_pages/slideshow.md
theNewDynamic/mahfoudbennoune.com
813f7da02fa417f54ea1a4c7a1b8c4ac446b9b3a
[ "MIT" ]
null
null
null
_landing_pages/slideshow.md
theNewDynamic/mahfoudbennoune.com
813f7da02fa417f54ea1a4c7a1b8c4ac446b9b3a
[ "MIT" ]
null
null
null
_landing_pages/slideshow.md
theNewDynamic/mahfoudbennoune.com
813f7da02fa417f54ea1a4c7a1b8c4ac446b9b3a
[ "MIT" ]
null
null
null
---
title: Slideshow
layout: page
category:
permalink: /slideshow/
---

<style>
  .galleria{
    width: 700px;
    height: 400px;
    background: #000
  }
</style>

<div class="galleria">
  <img src="/assets/img/MBpic9.jpg">
  <img src="/assets/img/MBpic7-sm.jpg">
  <img src="/assets/img/MBpic4.jpg">
  <img src="/assets/img/MBpic1.jpg">
  <img src="/assets/img/MBpic3.jpg">
  <img src="/assets/img/IMG_1353.jpg">
  <img src="/assets/img/IMG_1360.jpg">
  <img src="/assets/img/IMG_1362.jpg">
  <img src="/assets/img/MBpic10.jpg">
  <img src="/assets/img/MBpic8.jpg">
  <img src="/assets/img/MBpic2.jpg">
</div>

[see photos without slideshow](/photos)
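Note that the gallery markup above is styled but never initialized on this page. If the Galleria library isn't bootstrapped elsewhere in the site layout, its standard initialization looks roughly like the following sketch — the script paths here are assumptions, not paths taken from this repository; adjust them to wherever jQuery and the Galleria files actually live:

```html
<script src="/assets/js/jquery.min.js"></script>
<script src="/assets/js/galleria.min.js"></script>
<script>
  // Load a Galleria theme, then attach the gallery to the container above
  Galleria.loadTheme('/assets/js/galleria.classic.min.js');
  Galleria.run('.galleria');
</script>
```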
22.607143
62
0.671406
bos_Latn
0.063726
2dc4e174d40e9d7fce557e77f8a8171b5e95b7ae
1,065
md
Markdown
AlchemyInsights/help-with-printing-in-excel.md
isabella232/OfficeDocs-AlchemyInsights-pr.bg-BG
7701e28cebcdd224cf0fdde712e5598893bc4342
[ "CC-BY-4.0", "MIT" ]
1
2020-05-19T19:05:50.000Z
2020-05-19T19:05:50.000Z
AlchemyInsights/help-with-printing-in-excel.md
isabella232/OfficeDocs-AlchemyInsights-pr.bg-BG
7701e28cebcdd224cf0fdde712e5598893bc4342
[ "CC-BY-4.0", "MIT" ]
2
2022-02-09T06:51:27.000Z
2022-02-09T06:51:41.000Z
AlchemyInsights/help-with-printing-in-excel.md
isabella232/OfficeDocs-AlchemyInsights-pr.bg-BG
7701e28cebcdd224cf0fdde712e5598893bc4342
[ "CC-BY-4.0", "MIT" ]
3
2019-10-09T20:29:19.000Z
2021-10-09T10:52:40.000Z
---
title: Help with printing in Excel
ms.author: pebaum
author: pebaum
manager: scotv
ms.date: 07/27/2020
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Priority
ms.collection: Adm_O365
ms.custom:
- "2715"
- "9000773"
ms.openlocfilehash: f4771cd514e467f002c4517789a4ef8f1f77822b0b4d0884632cafb98b60e470
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: bg-BG
ms.lasthandoff: 08/05/2021
ms.locfileid: "54015825"
---

# <a name="help-with-printing-in-excel"></a>Help with printing in Excel

If you get an error when trying to print, it can sometimes be resolved by updating the printer driver. For help with updating drivers, see [Update drivers in Windows](https://support.microsoft.com/help/4028443/windows-10-update-drivers).

For configuring, formatting, and troubleshooting printing in Excel, see [Printing in Excel](https://support.office.com/client/9785e791-de6f-48dd-9b0d-899d75c33d69).
39.444444
272
0.811268
bul_Cyrl
0.456812
2dc54472e3d77880a7a12b128de8c9544110ad28
29
md
Markdown
README.md
capnp-js/trans-base64
51142b1a19318f37fb0b6f578865392d7c4b1a24
[ "MIT" ]
null
null
null
README.md
capnp-js/trans-base64
51142b1a19318f37fb0b6f578865392d7c4b1a24
[ "MIT" ]
null
null
null
README.md
capnp-js/trans-base64
51142b1a19318f37fb0b6f578865392d7c4b1a24
[ "MIT" ]
null
null
null
# base64 Base-64 encoding
9.666667
19
0.793103
fra_Latn
0.440839
2dc55e6dffd126e21db783bc37f6f21eae3a3f37
1,794
md
Markdown
~2021/content/post/TIL/2021/01/2021-01-01-til.md
Woomin-Jeon/Woomin-blog
7b17bccccc6637fa0ebe606ac47bb9aaf3b6c0aa
[ "MIT" ]
null
null
null
~2021/content/post/TIL/2021/01/2021-01-01-til.md
Woomin-Jeon/Woomin-blog
7b17bccccc6637fa0ebe606ac47bb9aaf3b6c0aa
[ "MIT" ]
null
null
null
~2021/content/post/TIL/2021/01/2021-01-01-til.md
Woomin-Jeon/Woomin-blog
7b17bccccc6637fa0ebe606ac47bb9aaf3b6c0aa
[ "MIT" ]
null
null
null
--- title: 2021-01-01 TIL date: 2021-01-01 category: "All" draft: true --- ## Facts - 부스트 캠프 면접 스터디에 참여하였습니다. - 프로그래머스 3레벨 "입국심사" 문제를 해결하였습니다. - 모던 자바스크립트 튜토리얼의 "클래스" 부분을 공부하고 정리하였습니다. - Bubble Sort와 Insert Sort를 구현해보았습니다. - 코딩도장 자료구조 스터디에 참여하였습니다. ## Feelings - 꿀 낮 잠 굳 예아 ## Findings - **class문법의 편의성** 생성자 함수를 사용하게되면 불필요하게 계속 생성되는 메서드와 같은 것들은 따로 prototype에 관리해주어 최적화하곤 합니다. ```js function User(name) { this.name = name; } User.prototype.getName = function() { return this.name; } ``` 하지만 class를 사용하게 되면 저렇게 prototype을 따로 분리해줄 필요 없이 한번에 가능합니다. contructor 함수 밖에서 선언된 것들은 모두 prototype에 저장됩니다. ```js class User { constructor(name) { this.name = name; } getName() { return this.name; } } ``` - **class의 화살표 함수 메서드** 객체의 메서드로 화살표함수를 사용하면 무조건 전역 객체를 가리키는 문제가 있지만, class의 메서드로 화살표함수를 사용하면 잘 동작합니다. 뿐만아니라 콜백함수로 class의 화살표 함수 메서드를 전달해도 상위 스코프의 this를 가져와서 잘 사용할 수 있습니다. ```js class User { constructor(name) { this.name = name; } getName = () => { console.log(this.name); } } const user = new User('woomin'); setTimeout(user.getName, 1000); // woomin ``` - **class의 super 키워드** super(...)를 사용해서 부모 constructor를 호출할 수 있는데, 이는 자식의 constructor 내부에서만 사용 가능 합니다. 아울러 super.method(...)를 통해 부모 클래스에서 정의된 메서드를 호출할 수 있습니다. - **class를 확장(extends)를 한 경우, 해당 class의 constructor의 첫번째로 super를 호출해야 하는 이유** 일반적인 class나 생성자함수는 new 키워드와 함께 호출되면 this에 빈 객체를 할당합니다. 하지만 class에서 상속이 발생하면 "this에 빈 객체를 할당하는 일"을 해당 class에서 하는게 아니라 부모 생성자에서 해주길 기대합니다. 따라서 super를 호출하지 않으면 this가 될 객체를 만들지 않아서 에러가 발생합니다. ## Future Action Plans - 새로운 해의 첫 날도 낮잠을 자긴 했지만 열심히 달린 것 같습니다! 앞으로도 화이팅! - 내일은 주말이긴 하지만 그래도 공부 좀 해야겠습니다. ## Feedback - OK
22.148148
189
0.622631
kor_Hang
1.00001
2dc621e4f3139f85ef0d9202933d9f536b7d444a
8,228
md
Markdown
Tools/Migration/folder-options-dpct/README.md
outoftardis/oneAPI-samples
acd5aecfa438be146fe2b4a27faa3fd3c7d16e72
[ "MIT" ]
1
2021-07-07T16:47:24.000Z
2021-07-07T16:47:24.000Z
Tools/Migration/folder-options-dpct/README.md
outoftardis/oneAPI-samples
acd5aecfa438be146fe2b4a27faa3fd3c7d16e72
[ "MIT" ]
1
2021-04-16T19:00:35.000Z
2021-04-22T18:44:52.000Z
Tools/Migration/folder-options-dpct/README.md
outoftardis/oneAPI-samples
acd5aecfa438be146fe2b4a27faa3fd3c7d16e72
[ "MIT" ]
1
2021-05-13T15:13:35.000Z
2021-05-13T15:13:35.000Z
# Intel DPC++ Compatibility Tool: Foo Bar Example ## Use the Command Line to Migrate Large Code Bases The Intel DPC++ Compatibility Tool (dpct) can migrate projects that include multiple source and header files. This sample provides an example of how to migrate more complex projects and use options. | Optimized for | Description |:--- |:--- | OS | Linux* Ubuntu* 18.04; Windows 10 | Software | Intel® DPC++ Compatibility Tool | What you will learn | Simple invocation of dpct to migrate CUDA code | Time to complete | 10 minutes ## Purpose The Intel® DPC++ Compatibility Tool (dpct) can be used to migrate projects composed of multiple source and header files. This includes headers from the system libraries and other projects. The tool must know which headers need migration and which should be left alone. Use the dpct `--in-root` option to set the root location of your program sources that are to be migrated. Only files and folders located within the `--in-root` directory will be considered for migration by the tool. Files located outside the `--in-root` directory are considered system files and will not be migrated, even if they are included by a source file located within the `--in-root` directory. The dpct `--out-root` option specifies the directory into which the DPC++ code produced by the Intel DPC++ Compatibility Tool is written. The relative location and names of the migrated files are maintained, except the file extensions are changed to `.dp.cpp`. ## Key Implementation Details Use `--in-root` and `--out-root` for projects which contain more than one source file. Additional migration options can be reviewed at: [Command Line Options Reference](https://software.intel.com/content/www/us/en/develop/documentation/intel-dpcpp-compatibility-tool-user-guide/top/command-line-options-reference.html). ## License Code samples are licensed under the MIT license. See [License.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/License.txt) for details. 
Third party program Licenses can be found here: [third-party-programs.txt](https://github.com/oneapi-src/oneAPI-samples/blob/master/third-party-programs.txt) ## Migrating the CUDA Sample to Data Parallel C++ with the Intel DPC++ Compatibility Tool Building and running the CUDA sample is not required to migrate this project to a Data Parallel C++ project. > **NOTE:** Certain CUDA header files, referenced by the CUDA application > source files to be migrated, need to be accessible for the migration step. > See the [Getting Started Guide][cuda-headers] for more details. [cuda-headers]: <https://software.intel.com/content/www/us/en/develop/documentation/get-started-with-intel-dpcpp-compatibility-tool/top.html#top_BEFORE_YOU_BEGIN> ### Command-Line on a Linux* System 1. Ensure your environment is configured to use the OneAPI tools. ```shell $ source /opt/intel/oneapi/setvars.sh ``` 2. This sample project contains a simple CUDA program with three files (main.cu, util.cu and util.h) located in two folders (foo and bar): ``` foo ├── bar │ ├── util.cu │ └── util.h └── main.cu ``` 3. Use the tool's `--in-root` option and provide input files to specify where to locate the CUDA files that need migration; use the tool’s `--out-root` option to designate where to generate the resulting files: ```sh # From the repo root directory: $ dpct --in-root=foo --out-root=result/foo foo/main.cu foo/bar/util.cu --extra-arg="-Ifoo/bar/" ``` > If an `--in-root` option is not specified, the directory of the first input > source file is implied. If `--out-root` is not specified, `./dpct_output` > is implied. You should see the migrated files in the `result/foo` folder that was specified by the `--out-root` option: ``` result/foo ├── bar │ ├── util.dp.cpp │ └── util.h └── main.dp.cpp ``` 4. Inspect the migrated source code, address any `DPCT` warnings generated by the Intel DPC++ Compatibility Tool, and verify the new program correctness. 
Warnings are printed to the console and added as comments in the migrated source. See the [Diagnostic Reference][diag-ref] for more information on what each warning means. [diag-ref]: <https://software.intel.com/content/www/us/en/develop/documentation/get-started-with-intel-dpcpp-compatibility-tool/top/diagnostics-reference.html> This sample should generate the following warning: ``` warning: DPCT1015:0: Output needs adjustment. ``` See below **Addressing Warnings in the Migrated Code** to understand how to resolve the warning. 5. Copy the original `Makefile` into the `result` folder and update the copy to build the migrated project using DPC++. Replace the CUDA configurations in that new `Makefile` with the following for use with DPC++: ```Makefile CXX = dpcpp # Remainder of the Makefile should work without changes. ``` > **NOTE:** The above Makefile changes work for this sample project. The > modifications needed to update the build files in your own projects will vary > greatly depending on the nature and complexity of your migrated projects. 6. Switch to the migration directory with `cd result`. 7. Build the migrated sample with `make`. 8. Run the migrated sample with `make run`. 9. Clean up the build with `make clean`. ## Windows 1. Ensure your environment is configured to use the OneAPI tools. ```bat > "C:\Program Files (x86)\Intel\oneAPI\setvars.bat" ``` 2. This sample project contains a simple CUDA program with three files (main.cu, util.cu and util.h) located in two folders (foo and bar): ``` foo ├── bar │ ├── util.cu │ └── util.h └── main.cu ``` 3. Use the dpct's `--in-root` and `--out-root` options to specify where to locate the migrated CUDA files: ```bat > dpct --in-root=foo --out-root=result\foo foo\main.cu foo\bar\util.cu --extra-arg="-Ifoo\bar\" ``` This sample should generate the following warning: ``` warning: DPCT1015:0: Output needs adjustment. ``` See below **Addressing Warnings in the Migrated Code** to understand how to resolve the warning. 
> If an `--in-root` option is not specified, the directory of the first input > source file is implied. If `--out-root` is not specified, `./dpct_output` > is implied. You should see the migrated files in the `result/foo` folder that was specified by the `--out-root` option: ``` result/foo ├── bar │ ├── util.dp.cpp │ └── util.h └── main.dp.cpp ``` To build this migrated application on Windows, you must modify the original `Makefile` to be compatible with Microsoft `nmake` and Windows command-line tools. # Addressing Warnings in Migrated Code Migration generated one warning for code that `dpct` could not migrate: ``` warning: DPCT1015:0: Output needs adjustment. ``` As you have noticed, the migration of this project resulted in one DPCT message that needs to be addressed, DPCT1015. This message is shown because, when the Compatibility Tool migrates the printf-style formatted string in the CUDA code to the output stream supported by DPC++, manual adjustment is needed to generate the equivalent output. Open result/foo/bar/util.dp.cpp and locate the error DPCT1015. Then make the following changes: Change: ``` stream_ct1 << "kernel_util,%d\n"; ``` to ``` stream_ct1 << "kernel_util," << c << sycl::endl; ``` You’ll also need to change the stream statement in result/foo/main.dp.cpp. Change: ``` stream_ct1 << "kernel_main!\n"; ``` to ``` stream_ct1 << "kernel_main!" << sycl::endl; ``` # Example Output When you run the migrated application, you should see the following console output: ``` ./foo-bar kernel_main! kernel_util,2 ``` > **NOTE:** If you see the following TBB error message, you can safely ignore it. > Not all users will see this TBB error message. It has nothing to do with the > migration of your application and does not mean that your application is not > running correctly. The issue will be resolved in a future oneAPI release. ``` TBB Warning: The number of workers is currently limited to 11. The request for 31 workers is ignored. 
Further requests for more workers will be silently ignored until the limit changes. ```
31.40458
185
0.730919
eng_Latn
0.996858
2dc63650f3e93bbf2981965318c66c5d6ee8ab55
4,134
md
Markdown
README.md
ttameshige/logistic_model_IR-laser_Cre-loxP
2eb61e0b6db067cb5fa202c828dd6655e360f8c2
[ "MIT" ]
null
null
null
README.md
ttameshige/logistic_model_IR-laser_Cre-loxP
2eb61e0b6db067cb5fa202c828dd6655e360f8c2
[ "MIT" ]
2
2021-11-27T17:06:38.000Z
2021-11-27T17:17:55.000Z
README.md
ttameshige/logistic_model_IR-laser_Cre-loxP
2eb61e0b6db067cb5fa202c828dd6655e360f8c2
[ "MIT" ]
null
null
null
# logistic_model_IR-laser_Cre-loxP * This is a program to analyze probabilities of stochastic cellular events like cellular Cre-loxP DNA recombination or cell death. It works on R with Stan. * input data: tab-delimited file including an objective variable (0 or 1 indicating occurrence of Cre-loxP recombination) and an explanatory variable (continuous values of laser power) * outputs: probability plots from the logistic model # Output examples from such data ![download-2](https://user-images.githubusercontent.com/51182565/143690704-5e0aea83-8e27-40f8-b54d-15a491b53551.png) Calculate a logistic model like this. ![download](https://user-images.githubusercontent.com/51182565/143690433-b35130a3-508d-4013-9d35-95ff71d3278d.png) # Requirements * R and the following R libraries * cmdstanr * posterior * bayesplot * ggplot2 * dplyr * tidyverse # Installation To install 'cmdstanr', 'posterior' and 'bayesplot', see the GitHub page https://github.com/stan-dev/cmdstanr In my environment, installation was done by the following three commands. 
```{r} install.packages("tidyverse") ``` # Usage get a Bayesian model from Stan ```{r} # import data data <- read.delim("demoData_IR-LEGO_Cre-loxP_2.txt") # data formatting power_upper_limit <- 15.5 #set upper limit of fitting data data <- data[data$laser_power < power_upper_limit,] #data filtering data.list <- list(N=dim(data)[1], induction=data$recombination, power=data$laser_power) # import Stan model file <- "IR-LEGO_creloxp_logit_plimit_model.stan" #define the Stan model file mod <- cmdstan_model(file) #compile the model # MCMC sampling for logistic model: induction ~ power fit_plimit_model <- mod$sample( data = data.list, seed = 123, adapt_delta = 0.999, chains = 4, parallel_chains = 2, refresh = 500 ) # format the sampling result into a data.frame draws <- fit_plimit_model$draws() fit_df <- as_draws_df(draws) # get the most likely parameters from MCMC sampling alpha <- median(fit_df$alpha) beta <- median(fit_df$beta) p.limit <- median(fit_df$plimit) # make a data frame to plot the logistic model x <- seq(5, 15, length.out = 50) y <- p.limit / (1 + exp(- beta - alpha * x)) plot_df <- data.frame(x, y) # plot the model p <- ggplot() + geom_line(aes(x, y), data = plot_df, size = 2, color="green") + scale_x_continuous("Laser Power (mW)", limits = c(5, 20), breaks = seq(5, 20, 2), ) + scale_y_continuous("Probability", limits = c(-0.05, 1.05), breaks = seq(0, 1, 0.2)) print(p) ``` get standard logistic model from glm() ```{r} data <- read.delim("demoData_IR-LEGO_Cre-loxP_1.txt") model <- glm(recombination ~ laser_power, data=data,family=binomial(link="logit")) x <- seq(5, 18, length.out = 50) y <- 1 / (1 + exp(- model$coefficients["(Intercept)"] - model$coefficients["laser_power"] * x)) # fitting model plot_df <- data.frame(x, y) p <- ggplot() + geom_line(aes(x, y), data = plot_df, size = 2, color="blue") + scale_x_continuous("Laser Power (mW)", limits = c(5, 20), breaks = seq(5, 20, 2), ) + scale_y_continuous("Probability", limits = c(-0.05, 1.05), breaks = seq(0, 
1, 0.2)) print(p) ``` # Note If the maximum probability is almost 100%, glm() works well. If there is a plateau at a lower probability, Bayesian modeling with Stan works well. Note that the latter method requires a larger data size and confirmation of sampling convergence. # Author #### Toshiaki Tameshige PhD. #### affiliation1: Kihara Institute for Biological Research, Yokohama City Univ. #### affiliation2: Faculty of Science, Niigata Univ. # Acknowledgments I thank Dr. Yasuhiro Sato for his kind advice on the appropriate usage of MCMC. # License MIT license (https://en.wikipedia.org/wiki/MIT_License).
32.551181
177
0.702709
eng_Latn
0.776311
2dc75e4d9bf7cf960c68d3f1abc6cc21b57a680f
1,534
md
Markdown
README.md
frank-laemmer/die-natur-der-stadt
3c6afeedc560fafc9fc2f37bb2cb51c1a7d64035
[ "RSA-MD" ]
null
null
null
README.md
frank-laemmer/die-natur-der-stadt
3c6afeedc560fafc9fc2f37bb2cb51c1a7d64035
[ "RSA-MD" ]
null
null
null
README.md
frank-laemmer/die-natur-der-stadt
3c6afeedc560fafc9fc2f37bb2cb51c1a7d64035
[ "RSA-MD" ]
null
null
null
# Zur digitalen Ausgabe von Die Natur der Stadt "Die Natur der Stadt" ist ein Buch von Heide Berndt aus dem Jahr 1978. Dieses Repository widmet sich der Digitalisierung des Textes. ## Downloads + [die-natur-der-stadt.epub](die-natur-der-stadt.epub) + [die-natur-der-stadt.pdf](die-natur-der-stadt.pdf) ## Struktur und Inhalt + Bearbeitbare Textdateien im Markdown-Dateiformat im Ordner `src` + Exporte des Textes in verschiedenen Endformaten: + ebook `.epub` + PDF `.pdf` + ( Scan PDF als Quelldatei ) ## Status: ERSTER ENTWURF + 2020-02-03: Struktur aufgeräumt, Buchscan vorbereitet + 2020-02-02: Im Moment sind nur das erste Vorwort und das erste Kapitel digitalisiert ## Mitmachen Pull Requests werden gerne entgegengenommen. ## Generierung der Exporte Die Endformate können aus den Markdown-Dateien erstellt werden. Pandoc ist ein Dateiformatierungsprogramm. Hier sind die Befehle zum Erstellen der Endformate: ```shell # Create epub ebook pandoc --toc src/metadata.txt src/*.md -o die-natur-der-stadt.epub --css=src/epub.css # Create PDF ebook pandoc src/metadata.txt src/*.md -o die-natur-der-stadt.pdf ``` Mehr Informationen zur technischen Umsetzung finden sich auf [diesem Medium Blog Post](https://frank-laemmer.medium.com/from-analog-text-to-e-book-e1a90dcbe92). ## Lizenz Dieser Text ist hier mit freundlicher Genehmigung des Verlags Neue Kritik veröffentlicht. Der Text kann unter Share-Alike-Bedingungen genutzt werden. Mehr hier: [LICENSE.md](LICENSE.md) ## Kontakt heide@franklaemmer.de
30.078431
163
0.765319
deu_Latn
0.953846
2dc75e880c847b7e42b6bdc11404867e974157f8
858
md
Markdown
website/translated_docs/de/MSC/backup.md
Sieste68/docs
63c06aaa9f06de535d3943294aca4a09fdac454a
[ "CC-BY-4.0" ]
null
null
null
website/translated_docs/de/MSC/backup.md
Sieste68/docs
63c06aaa9f06de535d3943294aca4a09fdac454a
[ "CC-BY-4.0" ]
null
null
null
website/translated_docs/de/MSC/backup.md
Sieste68/docs
63c06aaa9f06de535d3943294aca4a09fdac454a
[ "CC-BY-4.0" ]
null
null
null
--- id: backup title: Seite Backup sidebar_label: Seite Backup --- Die Seite Backup des MSC zeigt die Backup-Einstellungen für die Datenbank. Hier können Sie auch ein manuelles Backup starten: ![](assets/en/MSC/msc_Backup.png) Diese Seite ist in drei Bereiche unterteilt: - **Backup File Destination**: displays information about the location of the application backup file, sowie den freien bzw. verwendeten Platz auf der Backup-Festplatte. - **Last Backup Information**: provides the date and time of the last backup (automatic or manual) carried out on the application. - **Contents of the backup file**: lists the files and folders included in the backup file. The **Backup** button is used to launch a manual backup. Auf dieser Seite können Sie keine Backup-Parameter verändern. To do this, you must click on the **Database properties...** button.
45.157895
169
0.772727
deu_Latn
0.686168
2dc7db803e0f8d6e3b69351683216a03d030be4d
22,357
md
Markdown
treebanks/ja_bccwj/ja_bccwj-dep-obl.md
akoehn/docs
ab54f654db7ab7c3433b5a6aa3dae19c3a938c7b
[ "Apache-2.0" ]
null
null
null
treebanks/ja_bccwj/ja_bccwj-dep-obl.md
akoehn/docs
ab54f654db7ab7c3433b5a6aa3dae19c3a938c7b
[ "Apache-2.0" ]
null
null
null
treebanks/ja_bccwj/ja_bccwj-dep-obl.md
akoehn/docs
ab54f654db7ab7c3433b5a6aa3dae19c3a938c7b
[ "Apache-2.0" ]
null
null
null
--- layout: base title: 'Statistics of obl in UD_Japanese-BCCWJ' udver: '2' --- ## Treebank Statistics: UD_Japanese-BCCWJ: Relations: `obl` This relation is universal. 45358 nodes (4%) are attached to their parents as `obl`. 45356 instances of `obl` (100%) are right-to-left (child precedes parent). Average distance between parent and child is 10.2170951100137. The following 125 pairs of parts of speech are connected with `obl`: <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (14115; 31% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (6202; 14% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (5475; 12% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (3949; 9% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (1902; 4% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (1652; 4% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (1577; 3% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (1251; 3% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (947; 2% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (639; 1% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (570; 1% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (531; 1% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a 
href="ja_bccwj-pos-AUX.html">AUX</a></tt> (526; 1% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (470; 1% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (396; 1% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (395; 1% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (391; 1% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (279; 1% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (250; 1% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (245; 1% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (245; 1% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (238; 1% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (204; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (203; 0% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (200; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (169; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (162; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (150; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (148; 0% 
instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (117; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (116; 0% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (102; 0% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (100; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (100; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (84; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (75; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (69; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (67; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (61; 0% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (60; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (57; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (55; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (51; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (46; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (42; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a 
href="ja_bccwj-pos-VERB.html">VERB</a></tt> (41; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (36; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (35; 0% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt> (29; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (28; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (27; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (25; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (25; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (24; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (23; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (23; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (19; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (18; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (18; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (17; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt> (17; 0% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (17; 0% instances), 
<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (14; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (13; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (12; 0% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (12; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt> (11; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (11; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (11; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (11; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (10; 0% instances), <tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (10; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (7; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (7; 0% instances), <tt><a href="ja_bccwj-pos-PUNCT.html">PUNCT</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (7; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (6; 0% instances), <tt><a href="ja_bccwj-pos-SCONJ.html">SCONJ</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (6; 0% instances), <tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt>-<tt><a href="ja_bccwj-pos-X.html">X</a></tt> (6; 0% instances), <tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt>-<tt><a 
href="ja_bccwj-pos-ADP.html">ADP</a></tt> (5; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (5; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (5; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (5; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (5; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (4; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (4; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (4; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (3; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (3; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (3; 0% instances), <tt><a href="ja_bccwj-pos-PUNCT.html">PUNCT</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (3; 0% instances), <tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (3; 0% instances), <tt><a href="ja_bccwj-pos-X.html">X</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (3; 0% instances), <tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-DET.html">DET</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-INTJ.html">INTJ</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (2; 0% instances), <tt><a 
href="ja_bccwj-pos-INTJ.html">INTJ</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-PUNCT.html">PUNCT</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-PUNCT.html">PUNCT</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-X.html">X</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (2; 0% instances), <tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt>-<tt><a href="ja_bccwj-pos-X.html">X</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-DET.html">DET</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-DET.html">DET</a></tt>-<tt><a href="ja_bccwj-pos-NOUN.html">NOUN</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-DET.html">DET</a></tt>-<tt><a href="ja_bccwj-pos-VERB.html">VERB</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-INTJ.html">INTJ</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-INTJ.html">INTJ</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-INTJ.html">INTJ</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> 
(1; 0% instances), <tt><a href="ja_bccwj-pos-INTJ.html">INTJ</a></tt>-<tt><a href="ja_bccwj-pos-PRON.html">PRON</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-PART.html">PART</a></tt>-<tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-ADJ.html">ADJ</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt>-<tt><a href="ja_bccwj-pos-ADV.html">ADV</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-PUNCT.html">PUNCT</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-PUNCT.html">PUNCT</a></tt>-<tt><a href="ja_bccwj-pos-PART.html">PART</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-SCONJ.html">SCONJ</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt>-<tt><a href="ja_bccwj-pos-NUM.html">NUM</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt>-<tt><a href="ja_bccwj-pos-PROPN.html">PROPN</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt>-<tt><a href="ja_bccwj-pos-SYM.html">SYM</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-X.html">X</a></tt>-<tt><a href="ja_bccwj-pos-ADP.html">ADP</a></tt> (1; 0% instances), <tt><a href="ja_bccwj-pos-X.html">X</a></tt>-<tt><a href="ja_bccwj-pos-AUX.html">AUX</a></tt> (1; 0% instances). 
~~~ conllu
# visual-style 1 bgColor:blue
# visual-style 1 fgColor:white
# visual-style 3 bgColor:blue
# visual-style 3 fgColor:white
# visual-style 3 1 obl color:blue
1 _ _ NOUN _ _ 3 obl _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
2 _ _ ADP _ _ 1 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
3 _ _ VERB _ _ 9 acl _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
4 _ _ NOUN _ _ 5 compound _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
5 _ _ NOUN _ _ 7 nmod _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
6 _ _ ADP _ _ 5 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
7 _ _ NOUN _ _ 9 nmod _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
8 _ _ ADP _ _ 7 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
9 _ _ NOUN _ _ 0 root _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=ROOT|SpaceAfter=No
~~~

~~~ conllu
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 32 bgColor:blue
# visual-style 32 fgColor:white
# visual-style 32 5 obl color:blue
1 _ _ X _ _ 5 dep _ BunsetuPosition=B|BunsetuPositionType=CONT|SpaceAfter=No
2 _ _ PROPN _ _ 5 compound _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
3 _ _ NUM _ _ 5 nummod _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
4 _ _ NUM _ _ 5 nummod _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
5 _ _ NOUN _ _ 32 obl _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
6 _ _ ADP _ _ 5 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=FUNC|SpaceAfter=No
7 _ _ VERB _ _ 5 aux _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
8 _ _ SCONJ _ _ 5 mark _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=FUNC|SpaceAfter=No
9 _ _ ADP _ _ 5 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
10 _ _ PUNCT _ _ 5 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
11 _ _ NOUN _ _ 12 compound _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
12 _ _ NOUN _ _ 14 nmod _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
13 _ _ ADP _ _ 12 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
14 _ _ NOUN _ _ 32 nmod _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
15 _ _ PUNCT _ _ 14 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
16 _ _ NOUN _ _ 17 compound _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
17 _ _ NOUN _ _ 20 nmod _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
18 _ _ PUNCT _ _ 17 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
19 _ _ NOUN _ _ 20 compound _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
20 _ _ NOUN _ _ 32 obj _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
21 _ _ ADP _ _ 20 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
22 _ _ NOUN _ _ 23 compound _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
23 _ _ PART _ _ 28 nmod _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
24 _ _ NOUN _ _ 28 compound _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
25 _ _ NOUN _ _ 28 compound _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
26 _ _ NOUN _ _ 28 compound _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
27 _ _ NOUN _ _ 28 compound _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
28 _ _ PART _ _ 30 obj _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
29 _ _ ADP _ _ 28 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
30 _ _ VERB _ _ 32 advcl _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
31 _ _ VERB _ _ 32 aux _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
32 _ _ AUX _ _ 0 root _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=ROOT|SpaceAfter=No
33 _ _ PUNCT _ _ 32 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
~~~

~~~ conllu
# visual-style 1 bgColor:blue
# visual-style 1 fgColor:white
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 5 1 obl color:blue
1 _ _ NOUN _ _ 5 obl _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
2 _ _ ADP _ _ 1 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=FUNC|SpaceAfter=No
3 _ _ VERB _ _ 1 aux _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
4 _ _ AUX _ _ 1 aux _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
5 _ _ NOUN _ _ 7 nmod _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
6 _ _ ADP _ _ 5 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
7 _ _ NOUN _ _ 9 nmod _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
8 _ _ PUNCT _ _ 7 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
9 _ _ NOUN _ _ 12 obj _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
10 _ _ ADP _ _ 9 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
11 _ _ VERB _ _ 12 aux _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
12 _ _ AUX _ _ 13 advcl _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
13 _ _ NOUN _ _ 25 nmod _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
14 _ _ PUNCT _ _ 13 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
15 _ _ NOUN _ _ 18 obl _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
16 _ _ ADP _ _ 15 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=FUNC|SpaceAfter=No
17 _ _ VERB _ _ 15 aux _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
18 _ _ NOUN _ _ 20 nmod _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
19 _ _ ADP _ _ 18 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
20 _ _ NOUN _ _ 23 obl _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
21 _ _ ADP _ _ 20 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=FUNC|SpaceAfter=No
22 _ _ VERB _ _ 20 aux _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
23 _ _ NOUN _ _ 25 iobj _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=SEM_HEAD|SpaceAfter=No
24 _ _ ADP _ _ 23 case _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=SYN_HEAD|SpaceAfter=No
25 _ _ VERB _ _ 0 root _ BunsetuPosition=B|JPYomi=_|BunsetuPositionType=ROOT|SpaceAfter=No
26 _ _ PUNCT _ _ 25 punct _ BunsetuPosition=I|JPYomi=_|BunsetuPositionType=CONT|SpaceAfter=No
~~~
191.08547
15,103
0.690701
yue_Hant
0.89805
2dc8b7a249cb331bcfc84d3aaf87ca0717e00376
1,086
md
Markdown
games/battleship/README.md
greg-kennedy/HasbroPBEMProxy
035c9cbc177bf7df78d380616c461bb80eea96c9
[ "CC0-1.0" ]
3
2021-09-29T06:42:02.000Z
2022-02-25T23:12:31.000Z
games/battleship/README.md
greg-kennedy/HasbroPBEMProxy
035c9cbc177bf7df78d380616c461bb80eea96c9
[ "CC0-1.0" ]
null
null
null
games/battleship/README.md
greg-kennedy/HasbroPBEMProxy
035c9cbc177bf7df78d380616c461bb80eea96c9
[ "CC0-1.0" ]
null
null
null
# HasbroPBEMProxy: Battleship

* Product Code: 0x3216 (12822)
* Version: 0x40 (64)

[Read the Manual](./help.txt)

Email Battleship will use system-wide proxy settings taken from the Registry here: `HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings`

**Email Battleship is a pre-HTTP game and uses SOCKS4 for proxy.** See [the Socks4 readme](../../socks4/README.md) for more information.

The two keys are:

* ProxyServer - `REG_SZ` - "socks=hostname:port"
* ProxyEnable - `REG_DWORD` - 0x01

The supplied IPS patch does four things:

* changes the read location to a local key instead: `HKEY_LOCAL_MACHINE\SOFTWARE\Hasbro Interactive\Email Battleship`
* fixes access in `RegOpenKeyExA()` calls to allow Write as well as Read access
* fixes the infinite loop that occurs when closing the game and triggering the `DISP_CHANGE_BADMODE` error message
* removes the CD check

Steps:

* Apply the IPS patch to `email-Battleship.exe`
* Edit `Email Battleship.reg` and set the `ProxyServer` value to match the desired proxy server
* Apply `Email Battleship.reg` to create override entries in the Registry
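Putting the pieces above together, a sketch of what the override entries in `Email Battleship.reg` could look like — the actual file contents aren't reproduced here, so treat this as an illustrative `.reg` fragment built from the documented key names and types, with the hostname and port as placeholders:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Hasbro Interactive\Email Battleship]
"ProxyServer"="socks=hostname:port"
"ProxyEnable"=dword:00000001
```

Since the IPS patch redirects reads to the `HKEY_LOCAL_MACHINE` key, this fragment targets that location rather than the system-wide `HKEY_CURRENT_USER` settings.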
40.222222
137
0.774401
eng_Latn
0.882607
2dc8fbf0bf6a822b45da2e345bc180aff1c30ade
1,568
md
Markdown
wdk-ddi-src/content/d3dhal/ns-d3dhal-_d3dhal_dp2ext.md
hsebs/windows-driver-docs-ddi
d655ca4b723edeff140ca2d9cc1bfafd72fcda97
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/d3dhal/ns-d3dhal-_d3dhal_dp2ext.md
hsebs/windows-driver-docs-ddi
d655ca4b723edeff140ca2d9cc1bfafd72fcda97
[ "CC-BY-4.0", "MIT" ]
null
null
null
wdk-ddi-src/content/d3dhal/ns-d3dhal-_d3dhal_dp2ext.md
hsebs/windows-driver-docs-ddi
d655ca4b723edeff140ca2d9cc1bfafd72fcda97
[ "CC-BY-4.0", "MIT" ]
1
2021-04-22T21:40:43.000Z
2021-04-22T21:40:43.000Z
---
UID: NS:d3dhal._D3DHAL_DP2EXT
title: _D3DHAL_DP2EXT (d3dhal.h)
description: The D3DHAL_DP2EXT structure's use has yet to be defined.
old-location: display\d3dhal_dp2ext.htm
tech.root: display
ms.assetid: d7cec277-d1d3-4c0f-91ec-fd5e962b6e1c
ms.date: 05/10/2018
ms.keywords: "*LPD3DHAL_DP2EXT, D3DHAL_DP2EXT, D3DHAL_DP2EXT structure [Display Devices], LPD3DHAL_DP2EXT, LPD3DHAL_DP2EXT structure pointer [Display Devices], _D3DHAL_DP2EXT, d3dhal/D3DHAL_DP2EXT, d3dhal/LPD3DHAL_DP2EXT, d3dstrct_e56171cd-ae20-4277-abd5-cb8f0c008637.xml, display.d3dhal_dp2ext"
f1_keywords:
- "d3dhal/D3DHAL_DP2EXT"
req.header: d3dhal.h
req.include-header: D3dhal.h
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
topic_type:
- APIRef
- kbSyntax
api_type:
- HeaderDef
api_location:
- d3dhal.h
api_name:
- D3DHAL_DP2EXT
product:
- Windows
targetos: Windows
req.typenames: D3DHAL_DP2EXT
---

# _D3DHAL_DP2EXT structure

## -description

The D3DHAL_DP2EXT structure's use has yet to be defined.

## -struct-fields

### -field dwExtToken

Specifies the extension token.

### -field dwSize

Specifies the size, in bytes, of this structure.

## -remarks

This structure is used with hardware transform and lighting. Contact the DirectX team at Microsoft for further implementation details.
20.631579
296
0.739158
eng_Latn
0.447973
2dc9579595a92d8582e36a3aedad4a05eef8cb30
31
md
Markdown
README.md
TouchFriend/A_Category
9780118e3e07be15c0d4cb06b29a2f83901447f6
[ "Apache-2.0" ]
null
null
null
README.md
TouchFriend/A_Category
9780118e3e07be15c0d4cb06b29a2f83901447f6
[ "Apache-2.0" ]
null
null
null
README.md
TouchFriend/A_Category
9780118e3e07be15c0d4cb06b29a2f83901447f6
[ "Apache-2.0" ]
null
null
null
# A_Category

A module category
10.333333
17
0.806452
kor_Hang
0.675808
2dca436b481ede3cbe6fc213aaa98a19edc48a4d
443
md
Markdown
config/template/README.md
normancarcamo/boilerplate-react-universal
f16fd71dc3c73e28dfc362de53444993dc54635d
[ "MIT" ]
5
2017-10-07T03:57:32.000Z
2021-07-31T13:41:28.000Z
config/template/README.md
normancarcamo/boilerplate-react-universal
f16fd71dc3c73e28dfc362de53444993dc54635d
[ "MIT" ]
null
null
null
config/template/README.md
normancarcamo/boilerplate-react-universal
f16fd71dc3c73e28dfc362de53444993dc54635d
[ "MIT" ]
1
2017-10-10T12:50:13.000Z
2017-10-10T12:50:13.000Z
In this directory you can put your own template files; they can be .html, .ejs, or .pug. The default template engine used is ".pug", so if you want to use ".ejs", go to the [webpack plugins file](../webpack/client/plugins/index.js) and replace it. **Note**: If you want to use another kind of template engine, make sure to add its loader to the [webpack config file](../webpack/shared/rules), because it will be used during the compilation process.
88.6
201
0.747178
eng_Latn
0.994437
2dca639e556da9edbee8b5e5f5cd88d3243ec54f
1,071
md
Markdown
README.md
Belencha/trasteando-con-git
081ddeb9b6675a481681b452884a3c87a9972ccd
[ "MIT" ]
null
null
null
README.md
Belencha/trasteando-con-git
081ddeb9b6675a481681b452884a3c87a9972ccd
[ "MIT" ]
null
null
null
README.md
Belencha/trasteando-con-git
081ddeb9b6675a481681b452884a3c87a9972ccd
[ "MIT" ]
null
null
null
# Trasteando con git

This project simply lets me practice merges, squashes, rebases...

## Getting started 🚀

I don't think this repo will be of any use to you, because I make a real mess of things, and figuring out what I was trying to test would take you quite a while.

## Git hooks 👾

Yes, I'm learning to build my own hooks, and fair warning: there are a few of them. They're fairly silly, but to experience them first-hand, go to the repo's doc/hooks folder and run the install-hooks.sh script there. It creates a symbolic link from the repo's official hooks folder, where git expects to find hooks (.git/hooks), to the doc/hooks folder, which is where I officially keep them so they can be pushed to the repo and tracked. The example hooks git ships by default are at: [https://githooks.com/](https://githooks.com/).

## License 📄

But if you still really feel like doing something with it xD, know that this repo is under the MIT license. Take a look at the [LICENSE.md](LICENSE.md) file for details.
56.368421
498
0.769374
spa_Latn
0.999149
2dca7a6f37a25f6c78629d70b85f333c445bd754
1,067
md
Markdown
SUMMARY.md
DevinLow/book
32e81772aedf4748aefa2ac607824781ef3cf1a9
[ "Apache-2.0" ]
2
2018-04-23T09:50:09.000Z
2021-03-06T10:43:59.000Z
SUMMARY.md
DevinLow/book
32e81772aedf4748aefa2ac607824781ef3cf1a9
[ "Apache-2.0" ]
null
null
null
SUMMARY.md
DevinLow/book
32e81772aedf4748aefa2ac607824781ef3cf1a9
[ "Apache-2.0" ]
1
2018-04-23T09:35:50.000Z
2018-04-23T09:35:50.000Z
# Summary

* [Introduction](README.md)
* [Android](android.md)
* [Python](python.md)
* [U-boot](u-boot.md)
* [Markdown](markdown.md)
* [Shell](shell.md)
* [Git](git.md)
* [QT](qt.md)
* [C++](c++.md)
* [image processing](image-processing.md)
* [Systemd ](systemd.md)
* [android-xi-tong-gong-neng-shi-xian](android-xi-tong-gong-neng-shi-xian.md)
* [bat](bat.md)
* [backlight\_open\_close](backlightopen-close.md)
* [bluetooth\_ioctrl](bluetoothioctrl.md)
* [bootlogo](bootlogo.md)
* [chang-yong-ming-ling](chang-yong-ming-ling.md)
* [conjpgtobmp](conjpgtobmp.md)
* ep126
* [gms](gms.md)
* [kernel](kernel.md)
* [landspace](landspace.md)
* ncs8801
* newled
* [odex](odex.md)
* [powerkey](powerkey.md)
* qn8027
* [quicksetting](quicksetting.md)
* [screenshot](screenshot.md)
* [screensetting](screensetting.md)
* [setptiv](setptiv.md)
* [settings](settings.md)
* [ sl7729](sl7729.md)
* [swith](swith.md)
* [ubuntuchang-yong-gong-ju-ming-ling](ubuntuchang-yong-gong-ju-ming-ling.md)
* [volume](volume.md)
* [Tina Linux quick start guide](tina-linux-quick-start-guide.md)
26.02439
77
0.689784
yue_Hant
0.122599
2dcb0bcdd771f83ca7bc7ea0350cb90ed340ca5c
8,277
md
Markdown
06-JavaScript/34-EventLoop.md
xiaofeilalala/Learn-Notes
aa362620adcfad10b76feee989d5bbfb2c089344
[ "MIT" ]
null
null
null
06-JavaScript/34-EventLoop.md
xiaofeilalala/Learn-Notes
aa362620adcfad10b76feee989d5bbfb2c089344
[ "MIT" ]
null
null
null
06-JavaScript/34-EventLoop.md
xiaofeilalala/Learn-Notes
aa362620adcfad10b76feee989d5bbfb2c089344
[ "MIT" ]
null
null
null
# Event Loop

## 1. Timers

- `setTimeout` lets us defer a function until after a given time interval
- `setInterval` lets us run a function repeatedly, starting after the given interval and then repeating at that interval
- Inside `setTimeout`/`setInterval` callbacks, `this` refers to `window`

### 1-1 setTimeout

`setTimeout(func|code, delay, arg1, arg2, ...)` runs code or a function after the specified number of milliseconds.

* `func|code` — the function or code to execute
* `delay` — the delay before execution, in milliseconds (1000 ms = 1 s); defaults to 0
* `arg1, arg2, ...` — the arguments passed to the function or code

```js
let timeId = setTimeout(func|code, delay, arg1, arg2, ...)
```

```js
// setTimeout runs after the given number of milliseconds
let timeId = setTimeout((...name) => { // arrow functions work too
  console.log('run')
  // arguments passed in
  console.log(name); // ['jsx', 'ljj']
}, 1000, 'jsx', 'ljj')
```

### 1-2 Canceling setTimeout

`setTimeout` returns a timer identifier when called, and `clearTimeout` can be used to cancel the scheduled call.

> **Tips:** if no identifier is kept when the timer is created, the timer cannot be cleared.

```js
let timerId = setTimeout(...);
```

```js
let timeId = setTimeout(function() {
  console.log('run')
}, 1000)
// the timer never fires
clearTimeout(timeId)
```

### 1-3 Nested setTimeout

A nested `setTimeout` can achieve the same repeated scheduling as `setInterval`:

```js
let timeId = setTimeout(function run() {
  console.log('run')
  timeId = setTimeout(run, 1000)
}, 1000)
```

### 1-4 setInterval

`setInterval(func|code, delay, arg1, arg2, ...)` runs code or a function periodically at the specified interval; unless the timer is cleared, it keeps running indefinitely.

* `func|code` — the function or code to execute
* `delay` — the delay before execution, in milliseconds (1000 ms = 1 s); defaults to 0
* `arg1, arg2, ...` — the arguments passed to the function or code

```js
let timerId = setInterval(func|code, delay, arg1, arg2, ...)
```

### 1-5 Canceling setInterval

`setInterval` returns a timer identifier when called, and `clearInterval` can be used to cancel the scheduled calls.

```js
let interval = setInterval(function() {
  console.log('run')
}, 1000)
// clear the timer by its identifier
clearInterval(interval)
```

### 1-6 Timers and garbage collection

When a function is passed to `setInterval`/`setTimeout`, an internal reference to it is created and kept by the scheduler, which prevents the garbage collector (GC) from reclaiming it even if there are no other references to the function.

Canceling the timer via its identifier allows the function passed to the timer to be garbage-collected.

```js
let timeId = setTimeout(time, 1000)
function time() {
  console.log('run')
}
// cancel the timer
clearTimeout(timeId)

let interval = setInterval(time, 1000)
// cancel the timer
clearInterval(interval)
```

## 2. Threads

### 2-1 Single-threaded

JavaScript is single-threaded: tasks queue up and execute one after another.

JavaScript's main job is interacting with the user and manipulating the DOM. If it were multi-threaded, complex synchronization problems would arise — two threads could operate on the same DOM node at once, causing rendering problems for the browser — so to avoid that complexity, being single-threaded is a core characteristic of JavaScript.

### 2-2 Multi-threaded browser

The browser itself is multi-process. The GUI rendering thread and the JS engine thread cannot run at the same time: while the rendering thread is doing its work, the JS engine thread is suspended.

* GUI rendering thread — renders the browser UI and parses HTML and CSS
* JS engine thread — parses and executes JS scripts
* Event-trigger thread — drives the event loop and manages the event task queue
* Timer thread — handles `setInterval` and `setTimeout` timers
* Async HTTP request thread — handles asynchronous HTTP requests

## 3. Task queue

### 3-1 Execution stack and main thread

* Execution stack — when an event is triggered, it enters the task queue to wait for the main thread to read it; the task from the queue that is currently being executed is said to be on the execution stack
* Main thread — decides which event on the execution stack runs now; it keeps reading events from the stack, runs all the synchronous code there, and once everything on the stack has executed, it checks whether the task queue has tasks and pushes runnable async tasks onto the stack to execute

![](https://raw.githubusercontent.com/xiaofeilalala/DocsPics/main/imgs/20220503211538.png)

### 3-2 Synchronous and asynchronous tasks

* Synchronous tasks: queued on the main thread; the next task runs only after the previous one finishes
* Asynchronous tasks: they bypass the main thread and enter the task queue; only after the task queue notifies the main thread that an async task is ready does that task enter the main thread and execute

![](https://raw.githubusercontent.com/xiaofeilalala/DocsPics/main/imgs/20220504090347.png)

### 3-3 Task queue

Outside the main thread, the event-trigger thread manages a task queue. Whenever an async task produces a result, its event callback is placed in the task queue.

Once every synchronous task on the execution stack has finished, the browser reads the task queue and moves runnable async tasks onto the execution stack to run.

* First, the execution stack runs code in order
* If a task is synchronous it keeps running; if asynchronous it goes to the async threads, whose event callbacks eventually land in the event-trigger thread's task queue to wait
* When the stack is empty, the main thread asks the task queue whether any event callbacks are waiting
* If the task queue has an event callback, it is appended to the end of the stack and execution continues from the first step
* If the task queue has no event callback, the main thread keeps polling

```js
let timeId = setTimeout(function() {
  console.log('timer')
}, 1000)

console.log('console')

let result = new Promise(resolve => {
  console.log('Promise resolve')
  resolve()
})

// async request
result.then(() => {
  console.log('Promise then')
})

// console
// Promise resolve
// Promise then
// timer
```

![](https://raw.githubusercontent.com/xiaofeilalala/DocsPics/main/imgs/20220503222546.png)

## 4. Macrotasks and microtasks

Asynchronous tasks — asynchronous tasks are divided into macrotasks and microtasks.

### 4-1 Macrotasks

Macrotask — each run of code on the execution stack can be treated as one macrotask; each macrotask runs from start to finish before anything else executes.

* After each macrotask, the engine immediately runs every task in the microtask queue before running another macrotask, rendering, or doing anything else
* So that macrotasks and DOM tasks can run in order, the browser re-renders the page after one macrotask ends and before the next macrotask begins
* Common macrotasks — the main `script` block, `setTimeout`, `setInterval`, `setImmediate()` (Node), the browser's `requestAnimationFrame()`

```js
// execution order: macrotask -> GUI render -> macrotask -> ...
```

### 4-2 Microtasks

Microtask — a task that runs immediately after the current task ends: after the current task, before the next task, and before rendering.

* Microtasks have higher priority than macrotasks
* After a macrotask finishes, all the microtasks produced while it ran are executed before rendering
* Common microtasks — `Promise.then()`, `Promise.catch`, `Promise.finally`, `process.nextTick()` (Node), `Object.observe`, `MutationObserver`

```js
// execution order: macrotask -> microtasks -> GUI render -> macrotask -> ...
```

### 4-3 Execution flow

* The whole script starts executing as the first macrotask
* Microtasks encountered are pushed to the microtask queue; macrotasks to the macrotask queue
* When the macrotask finishes, check whether there are runnable microtasks
* If there are, run all of the microtasks to completion
* When the current macrotask is done, rendering is checked and the GUI thread takes over to render
* After rendering, the JS thread takes over again and starts a new macrotask, and so on until all tasks are done

![](https://raw.githubusercontent.com/xiaofeilalala/DocsPics/main/imgs/20220503212546.png)

## 5. Event loop

### 5-1 Event loop flow

* First, the whole `script` (as the first macrotask) starts executing, and all code is divided into two parts: synchronous tasks and asynchronous tasks
* Synchronous tasks go straight onto the main thread and run in order
* Asynchronous tasks are further divided into macrotasks and microtasks
* Macrotasks enter an Event Table where their callbacks are registered; whenever the specified event completes, the Event Table moves the callback into the macrotask Event Queue
* Microtasks likewise enter another Event Table where their callbacks are registered; whenever the specified event completes, the Event Table moves the callback into the microtask Event Queue
* When the tasks on the main thread are finished and the main thread is empty, the microtask Event Queue is checked: if it has tasks, all of them run; if not, the next macrotask runs
* This process repeats continuously — that is the Event Loop, the complete event loop

![](https://raw.githubusercontent.com/xiaofeilalala/DocsPics/main/imgs/20220504091805.png)

### 5-2 Promise and async/await

* In `new Promise(() => {}).then()`, the `new Promise()` part is a constructor and runs as a synchronous task; only the `.then()` that follows is an asynchronous microtask
* `async/await` is essentially a wrapper over Promises: the code before `await` corresponds to the synchronous code inside `new Promise`, and the code after `await` corresponds to a `Promise.then` microtask
* The expression after `await` is evaluated first, then the code after `await` is added to the microtask queue, and execution jumps out of the whole `async` function to run the code that follows it

```js
// the Promise executor is a synchronous task
// .then, .catch, .finally are microtasks
console.log('a')
let promise = new Promise(function(resolve, reject) {
  console.log('b')
  resolve();
})
.then(function() {
  console.log('c')
})
// a b c
```

```js
// with async/await, the code before await is synchronous; the code after await is a microtask
async function func1() {
  console.log('a');
  await func2();
  console.log('b')
}
async function func2() {
  console.log('c')
}
console.log('d')
setTimeout(function() {
  console.log('setTimeout')
}, 0)
func1();
// d a c b
```

## 6. Event loop exercises

- Synchronous tasks queue on the main thread; asynchronous tasks queue in the event queue, waiting to be pushed onto the main thread to run
- A timer with a delay of `0` does not run immediately; it only runs earlier than timers with longer delays

### 6-1 Exercise 1

* `setTimeout` enters the macrotask queue
* The `Promise` executor runs immediately, printing `a b`
* `Promise.then` enters the microtask queue
* `console.log('d')` prints `d`
* A runnable microtask prints `c`; the `setTimeout` inside the `then` is pushed to the macrotask queue
* Both timers have the same delay, so the macrotasks run in order, printing `setTimeout` and `setTimeout in then`

```js
setTimeout(function(){
  console.log('setTimeout')
}, 0)

const p = new Promise(resolve => {
  console.log('a')
  resolve()
  console.log('b')
})

p.then(() => {
  console.log('c')
  setTimeout(function(){
    console.log('setTimeout in then')
  }, 0)
})

console.log('d')
// a b d c setTimeout "setTimeout in then"
```

### 6-2 Exercise 2

* Print `a`
* The promise executor runs immediately, printing `b`
* `promise.then` goes to the microtask queue
* Both `setTimeout` calls go to the macrotask queue
* The script finishes, so the microtask runs, printing `c`; its `setTimeout` joins the macrotask queue to wait
* With no runnable microtasks left, the macrotasks run by delay order
* The 0 ms timer prints `h j` and pushes a `promise.then` into the microtask queue
* That runnable microtask prints `i`; the next macrotask prints `d`
* The 100 ms macrotask runs, printing `e f`; its microtask prints `g`; all tasks are done

```js
console.log('a');
new Promise(resolve => {
  console.log('b')
  resolve()
}).then(() => {
  console.log('c')
  setTimeout(() => {
    console.log('d')
  }, 0)
})
setTimeout(() => {
  console.log('e')
  new Promise(resolve => {
    console.log('f')
    resolve()
  }).then(() => {
    console.log('g')
  })
}, 100)
setTimeout(() => {
  console.log('h')
  new Promise(resolve => {
    resolve()
  }).then(() => {
    console.log('i')
  })
  console.log('j')
}, 0)
// a b c h j i d e f g
```

### 6-3 Exercise 3

* `console.log('script start')` runs, outputting `script start`
* `setTimeout` goes to the macrotask queue
* The `async1()` function runs: the code before `await` executes synchronously, outputting `async1 start`; the awaited `async2()` runs, outputting `async2`; the code after `await` becomes a microtask and execution jumps out of `async1`
* The `Promise` executor runs immediately, outputting `promise1`; the subsequent `.then` is dispatched to the microtask queue
* Execution continues downward, outputting `script end`
* The microtask queue is checked and drained, outputting `async1 end` and `promise2`
* Finally the `setTimeout` timer macrotask outputs `setTimeout`

```js
async function async1() {
  console.log('async1 start')
  await async2()
  console.log('async1 end')
}
async function async2() {
  console.log('async2')
}
console.log('script start')
setTimeout(function () {
  console.log('setTimeout')
}, 0)
async1()
new Promise(function (resolve) {
  console.log('promise1')
  resolve()
}).then(function () {
  console.log('promise2')
})
console.log('script end')
// script start, async1 start, async2, promise1, script end, async1 end, promise2, setTimeout
```
18.726244
127
0.683219
yue_Hant
0.564571
2dcb22ba17b086607213dbf9429ef0871f930f3e
3,958
md
Markdown
vendor/amranidev/scaffold-interface/README.md
elizeub/cadastro-sistemaebd
8c01c8bcb4c130a3319cd1d0c3a61aedc196e247
[ "MIT" ]
null
null
null
vendor/amranidev/scaffold-interface/README.md
elizeub/cadastro-sistemaebd
8c01c8bcb4c130a3319cd1d0c3a61aedc196e247
[ "MIT" ]
null
null
null
vendor/amranidev/scaffold-interface/README.md
elizeub/cadastro-sistemaebd
8c01c8bcb4c130a3319cd1d0c3a61aedc196e247
[ "MIT" ]
null
null
null
[![SensioLabsInsight](https://insight.sensiolabs.com/projects/4c35ba52-551e-4d62-adb8-ff5199c54801/big.png)](https://insight.sensiolabs.com/projects/4c35ba52-551e-4d62-adb8-ff5199c54801)
[![Build Status](https://travis-ci.org/amranidev/scaffold-interface.svg?branch=master)](https://travis-ci.org/amranidev/scaffold-interface)
[![StyleCI](https://styleci.io/repos/45497055/shield?style=flat)](https://styleci.io/repos/45497055)
[![Total Downloads](https://poser.pugx.org/amranidev/scaffold-interface/downloads)](https://packagist.org/packages/amranidev/scaffold-interface)
[![Latest Stable Version](https://poser.pugx.org/amranidev/scaffold-interface/v/stable)](https://packagist.org/packages/amranidev/scaffold-interface)
[![Latest Unstable Version](https://poser.pugx.org/amranidev/scaffold-interface/v/unstable)](https://packagist.org/packages/amranidev/scaffold-interface)
[![License](https://poser.pugx.org/amranidev/scaffold-interface/license)](https://packagist.org/packages/amranidev/scaffold-interface)

## A Smart CRUD Generator

![Scaffold](http://i.imgur.com/65uhrP7.gif)

#### Features:

+ Generate your model, views, controller and migrations in just a few clicks.
+ Views support Bootstrap and Materializecss.
+ Generate OneToMany relationships including views and controllers.
+ Generate a dashboard template.
+ A delete confirmation message.
+ Using an interface to design your table.
+ Rollback possibility.
+ Craft your Laravel application faster and easier.
+ Generate CRUD for packages, see [Lpackager](https://github.com/amranidev/lpackager), [CRUD for packages/modules](http://amranidev.github.io/blog/site/crud-generator-for-packages/).

### I. Package installation

1. Run composer require to install Scaffold-Interface:

```
composer require Amranidev/scaffold-interface:1.4.*
```

Or add in composer.json:

```json
require : {
    "Amranidev/scaffold-interface": "v1.4.*"
}
```

Then update composer:

```
$ composer update
```

2. Add the service providers to config/app.php:

```php
Amranidev\ScaffoldInterface\ScaffoldInterfaceServiceProvider::class,
Amranidev\Ajaxis\AjaxisServiceProvider::class,
```

3. Publish the assets in your application with:

```
$ php artisan vendor:publish
```

4. Migrate for the Scaffold Interface table:

```
$ php artisan migrate
```

Congratulations, you have successfully installed Scaffold Interface!

### II. Quick Start

1. Access Scaffold Interface: go to http://{your-project}/scaffold to get into Scaffold Interface.

2. Table creation: you can add many kinds of attributes, such as string, date, longtext, etc.

3. After the creation, to complete your scaffolding, go to the terminal and run:

```
$ php artisan migrate
```

4. Finally: go to http://{your-project}/{your-model} to see the result.

5. Rollback: if you want to roll back the table, just check this:

![Imgur](http://i.imgur.com/dnYc2ZE.png)

Before you perform your rollback, make sure that you have rolled back your table in the database.

6. OneToMany Relationship example: basically we want to generate a small app that contains Clients, Products and Orders, so the Orders must include the Clients and Products foreign keys. The first thing to do is to generate the Clients and Products normally. After that, you can generate Orders, adding two relations to Clients and Products.

#### Contribution

Any ideas are welcome. Feel free to submit any issues or pull requests.

#### TODOS

- [ ] 100% code coverage + maximum code quality.

#### DONE

- [x] Improve Vuejs.
- [x] Add a select for OneToMany (on data fields) in the interface.
- [x] Laravel 5.2 supported.
- [x] Laravel 5.1 supported.

#### Contact: amranidev@gmail.com
28.681159
186
0.70288
eng_Latn
0.782267
2dcb727567a3af0cbf6b595764eff9be8a110da7
1,021
md
Markdown
README.md
appcircleio/appcircle-cocoapods-component
c8e05ac8598ba820b45df1f2c03a0b6f2abbe7ff
[ "MIT" ]
null
null
null
README.md
appcircleio/appcircle-cocoapods-component
c8e05ac8598ba820b45df1f2c03a0b6f2abbe7ff
[ "MIT" ]
null
null
null
README.md
appcircleio/appcircle-cocoapods-component
c8e05ac8598ba820b45df1f2c03a0b6f2abbe7ff
[ "MIT" ]
null
null
null
# Appcircle Cocoapods

Cocoapods is a dependency manager for Swift and Objective-C applications. Cocoapods handles the installation of external libraries your application depends on during a build. Cocoapods is widely used among iOS developers for dependency management, and it's very easy to include it in your iOS projects with Appcircle.

## Adding Cocoapods to your repository

Appcircle will look for the `Podfile.lock` file in your repository and use the specified Cocoapods version to install the dependencies. If your `Podfile.lock` file doesn't specify a Cocoapods version, the default Cocoapods version on the virtual machine is used.

You need to specify an Xcode Workspace `.xcworkspace` file instead of an Xcode Project `.xcodeproj` file when you use Cocoapods as the dependency manager.

Required Input Variables

- `$AC_PROJECT_PATH`: Specifies the project path.

Optional Input Variables

- `$AC_REPOSITORY_DIR`: Specifies the cloned repository directory.
- `$AC_COCOAPODS_VERSION`: Specifies the Cocoapods version.
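For illustration, the Cocoapods version that a `Podfile.lock` pins is recorded in its `COCOAPODS` entry, which Cocoapods writes at the bottom of the file. A minimal sketch of such a lockfile — the pod name and version numbers here are hypothetical examples, not taken from any particular project:

```yaml
PODS:
  - Alamofire (5.6.2)

DEPENDENCIES:
  - Alamofire (~> 5.6)

COCOAPODS: 1.11.3
```

With a lockfile like this, a build would install dependencies using Cocoapods 1.11.3 rather than whatever version happens to be the machine default.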
53.736842
174
0.816846
eng_Latn
0.993966
2dcbd76d8ec191a4479b72424ca5c9d51175e93a
494
md
Markdown
docs/DashboardSectionRow.md
mdennehy/python-client
4d9cfa32075a6a65d88a38fe9e72b282e87b8808
[ "Apache-2.0" ]
11
2016-05-30T17:16:45.000Z
2021-06-11T19:32:59.000Z
docs/DashboardSectionRow.md
mdennehy/python-client
4d9cfa32075a6a65d88a38fe9e72b282e87b8808
[ "Apache-2.0" ]
25
2016-05-02T23:05:19.000Z
2020-11-18T22:43:20.000Z
docs/DashboardSectionRow.md
mdennehy/python-client
4d9cfa32075a6a65d88a38fe9e72b282e87b8808
[ "Apache-2.0" ]
30
2016-04-29T17:17:11.000Z
2022-02-11T04:58:37.000Z
# DashboardSectionRow

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**charts** | [**list[Chart]**](Chart.md) | Charts in this section row |
**height_factor** | **int** | Scalar for the height of this row. 100 is normal and default. 50 is half height | [optional]

[[Back to Model list]](../README.md#documentation-for-models) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to README]](../README.md)
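As a sketch of how the two properties combine, a serialized section row holding two charts at half height might look like the following — the field names mirror the model's property names and the chart contents are hypothetical, since the `Chart` schema is documented separately:

```json
{
  "charts": [
    { "name": "CPU usage" },
    { "name": "Memory usage" }
  ],
  "height_factor": 50
}
```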
41.166667
161
0.601215
eng_Latn
0.660064
2dcd1899edeb3d0500716492beb2fd587eecc4aa
7,095
md
Markdown
_posts/2019-07-25-Download-securing-the-past-conservation-in-art-architecture-and-literature.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
_posts/2019-07-25-Download-securing-the-past-conservation-in-art-architecture-and-literature.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
_posts/2019-07-25-Download-securing-the-past-conservation-in-art-architecture-and-literature.md
Luanna-Lynde/28
1649d0fcde5c5a34b3079f46e73d5983a1bfce8c
[ "MIT" ]
null
null
null
--- layout: post comments: true categories: Other --- ## Download Securing the past conservation in art architecture and literature book Humor is emotional chaos remembered in tranquility. Which is securing the past conservation in art architecture and literature pilot. Furthermore, had stranded on the as if there against his will. The mechanism creaks and rasps. of Jupiter and the moon was observed, including the cruelest catastrophe. "She seems like a pretty special kid," the driver said. " The other was strand-bank thickly overgrown with luxuriant grass, said. The fires themselves burned in huge scooped out basins of stone. What was she keeping Bren's shoes for, whereas an organism that securing the past conservation in art architecture and literature from sexual reproduction (except where self-fertilization is possible) has two parents. She escaped, Barty reached up for his mother, the bottom of the grave was dark and hidden from view, before a man dressed in white, wheezing, here. Sundays on Wednesday. They had been so close, "Seems like science fiction, the direction of side of the family, doomed look that Byron must have had. The drab furniture had probably been purchased in the thrift shop on the corner. " (William Atheling, who securing the past conservation in art architecture and literature the slaves and slave- girls encompass him about. I'll be out in a few minutes to take a spell with Carson and Young. Whatever the source of the noise, had pretended that he wasn't a Farrel, O my lord. shut and weighted them with quarters. With this vessel Minin penetrated attention of all on board to an island, during the time we remained in their neighbourhood, with bright white hair. He gave orders, the two tall panes began to open outward but too The Ambassador referred to was to be Securing the past conservation in art architecture and literature Farnhill. 
Securing the past conservation in art architecture and literature animal Since the name of the person is the person, as his hand grew slimier, not now, Vanadium felt a squirming in his marrow, not with the use to which their end result will be put Sinsemilla giggling in the co-pilot's chair. SIN BOJARSKI PETER GUTUROV, every stone steeped computer, and in fact it developed into a prosperous little dip. " across the Kara Sea, "It seems a shame to take someone's clothes away. With her colored pencils and a large pad of drawing paper, she saw a way to Junior tipped his head back and gazed up toward the section of broken-out railing along the high observation deck. Therefore, ii, dressed this way, 270_n_ Polar Sea, or at least through Geneva's gardening, however, "multegroet" (preserved file:D|Documents20and20SettingsharryDesktopUrsula20K, the car keys from the pegboard, and this eunuch cometh to rouse me up. They expect modesty to come later, because she didn't want to talk about her past. The discussion continued through the meal, and only a handful of the nonbetrizated were still note of long-throttled anger in her voice. But Wrangel believes that he future at all. He wouldn't mind. we're the instruments of some strange destiny. Leilani's situation was no better but no worse united with each other the dwellings they had excavated in the fraction of an inch? Tall and solemn, but a cheery wave These three roses, and "Yeah," Angel said. Several times, perfectly clear once seen, was unrolled to reveal ordinary newsprint, nodding her greeting. Dwina in the White Sea, keen-faced old man standing beside him nodded in agreement. And be couldn't afford to alienate Mama now. Accordingly, After a while, but they aren't hardly ever smart, nevermore will I give thee aught, sitting so nearby and having no one to talk to himself, but was c. He has such an incredible innocence. On the 2nd "That's a good honest answer. 
which had been formed in the course of the preceding night Securing the past conservation in art architecture and literature Isaacson--twin brother of Edom-knew nothing negative about Panglo, Wulfstan, or perhaps she saw more in Micky's face than she cared to "Well. [186] In the gulf of Yenisej a large island was Kath closed her eyes gratefully for a moment,' and then turned to speak to Veronica, that nothing may assain, and it would make them Spinel, although that is a little more trouble, that sometimes it seemed that she was actually there with them. Did you look into her eyes when you pushed her?" Vanadium's uninflected monologue was like the voice of a conscience that preferred to torture by droning rather than by nagging. " abducted by ETs as a child and was being used as an instrument to prepare brief; there's no relief in even one voice among them-only shirk anxiety, she explained. and, anyway, "Sit up. Men must be all over you. Jacob go on about big storms blowing people away and explosions blowing people 62. These were She tried to shield her journal against her body, and returned to the marvels of the Allking's realm. " Quoth she, I'll be forced to order you to abort. ' 'Thou sayst sooth,' rejoined they; 'but we desire thine advantage, certainly, called the first ten years; it is only known that it was enormously large. She didn't say where she'd learned it. On the 2nd "That's a good honest answer. Ninaвit wasn't her name, no-doubt-about-it, might have opened its shell to feed in this guarded fashion, she sent to me, Jay," Bernard cautioned. important chapters in the history of the former and recent condition the mode of life and domestic economy of the Russians in the "Darkrose," he breathed in her ear, for the last 250 years. 
haul myself out here to this historical hellhole five nights a week an' listen By now slowed to a cautious pace, pussyfooting as silently as any creeping cat, her classic features had a pixie charm, set him down before Er Reshid, the E. The at last been solved. Would Jay like to go too. I didn't want to make too much of mere childish play. Bruzewitz. His pink tongue protruded from his mouth, wasn't she?" Livor mortis had already set in. I am a prisoner Now the Persian had a mameluke, Mr, both _kayaks_ and _umiaks_, the hunters begin to haul in the lines, going to help Rush. "It looks as if the fall-guy has gone down, the image on the screen was instantly recognizable! ) Micky could find no story in the media exploring Maddoc's belief that UFOs helped me get in, although a disappointment on hand, or perhaps longer, as were the father's hands, and "Yeah," Angel said. The mechanism creaks and rasps. " "They began arriving at the Spindle a few minutes ago," Lesley seemed surprised. She found her lord Ishac lying aswoon in the vestibule; so she took him up and strained him to her bosom, you She blotted her hands on her shorts, I have arranged for Zorphwar to be made available to you on the Executive Interactive Display Terminal securing the past conservation in art architecture and literature your office, all the way to the roadblock, enchanted by the sisters' style of full-tilt cooking, the heart of his chestnut kingdom, served her so well now, as far as Angel was concerned.
788.333333
6,956
0.79253
eng_Latn
0.999911
2dcdbdab1838fef99ef6f1c73ffcf8152043826e
625
md
Markdown
_posts/2002-10-03-how_does_all_this_stuff_work.md
cwinters/cwinters.github.io
a1c68381f4d1c82cadca08aa784f53fe0b3fb3f2
[ "MIT" ]
null
null
null
_posts/2002-10-03-how_does_all_this_stuff_work.md
cwinters/cwinters.github.io
a1c68381f4d1c82cadca08aa784f53fe0b3fb3f2
[ "MIT" ]
6
2020-02-24T22:24:42.000Z
2022-02-26T01:45:04.000Z
_posts/2002-10-03-how_does_all_this_stuff_work.md
cwinters/cwinters.github.io
a1c68381f4d1c82cadca08aa784f53fe0b3fb3f2
[ "MIT" ]
null
null
null
--- tags: programming layout: post title: "How does all this stuff work?" --- <a href="http://weblog.infoworld.com/udell/2002/10/03.html">Jon gives props</a> to the open source build process, that all this random stuff can be built in the first place. Testify! <p>This is one of the reasons why geeks spend so much time exploring, testing and (sometimes) creating new build tools -- it's a hard problem. Sometimes we take heat for our focus -- concentrating on infrastructure rather than end-user tools -- but that's because we know how important the infrastructure is to keeping the ecosystem alive and vibrant.</p>
44.642857
355
0.7536
eng_Latn
0.998479
2dceb1f151aa6deaa71df10a1ea094102292e0a7
1,931
md
Markdown
source/development/dotnet/api-reference/index.md
autolaborcenter/pm1-docs-sphinx
88632dab128e68e9053937afb89456d2e0f09b3c
[ "Apache-2.0" ]
1
2019-07-15T12:44:39.000Z
2019-07-15T12:44:39.000Z
source/development/dotnet/api-reference/index.md
autolaborcenter/pm1-doc-sphinx
88632dab128e68e9053937afb89456d2e0f09b3c
[ "Apache-2.0" ]
null
null
null
source/development/dotnet/api-reference/index.md
autolaborcenter/pm1-doc-sphinx
88632dab128e68e9053937afb89456d2e0f09b3c
[ "Apache-2.0" ]
null
null
null
# PM1 .Net API Reference

## Types

- [`Autolabor.PM1.Parameter`](Parameter)
- [`Autolabor.PM1.Parameters`](Parameters)
- [`Autolabor.PM1.StateEnum`](StateEnum)

## Static Classes

* [`Autolabor.PM1.Methods`](Methods-Class)
  - Connection control
    - [`Autolabor.PM1.Methods.Initialize`](Methods/Initialize)
    - [`Autolabor.PM1.Methods.Shutdown`](Methods/Shutdown)
  - Parameter access
    - [`Autolabor.PM1.Methods.Parameters`](Methods/Parameters)
  - State access
    - [`Autolabor.PM1.Methods.State`](Methods/State)
    - [`Autolabor.PM1.Methods.Odometry`](Methods/Odometry)
    - [`Autolabor.PM1.Methods.ResetOdometry`](Methods/ResetOdometry)
  - Continuous control
    - [`Autolabor.PM1.Methods.PhysicalTarget`](Methods/PhysicalTarget)
    - [`Autolabor.PM1.Methods.WheelsTarget`](Methods/WheelsTarget)
    - [`Autolabor.PM1.Methods.VelocityTarget`](Methods/VelocityTarget)
  - Action control
    - [`Autolabor.PM1.Methods.GoStraight`](Methods/GoStraight)
    - [`Autolabor.PM1.Methods.TurnAround`](Methods/TurnAround)
    - [`Autolabor.PM1.Methods.GoArcVS`](Methods/GoArcVS)
    - [`Autolabor.PM1.Methods.GoArcVA`](Methods/GoArcVA)
    - [`Autolabor.PM1.Methods.GoArcWS`](Methods/GoArcWS)
    - [`Autolabor.PM1.Methods.GoArcWA`](Methods/GoArcWA)
    - [`Autolabor.PM1.Methods.GoArcVT`](Methods/GoArcVT)
    - [`Autolabor.PM1.Methods.GoArcWT`](Methods/GoArcWT)
    - [`Autolabor.PM1.Methods.Paused`](Methods/Paused)
    - [`Autolabor.PM1.Methods.CancelAction`](Methods/CancelAction)
  - Advanced action control
    - [`Autolabor.PM1.Methods.CalculateSpatium`](Methods/CalculateSpatium)
    - [`Autolabor.PM1.Methods.Drive`](Methods/Drive)
    - [`Autolabor.PM1.Methods.AdjustRudder`](Methods/AdjustRudder)
* [`Autolabor.PM1.AsyncMethods`](AsyncMethods-Class)
  - [`Autolabor.PM1.AsyncMethods.InitializeAsync`](AsyncMethods/InitializeAsync)
  - [`Autolabor.PM1.AsyncMethods.DriveAsync`](AsyncMethods/DriveAsync)
  - [`Autolabor.PM1.AsyncMethods.AdjustRudderAsync`](AsyncMethods/AdjustRudderAsync)
32.183333
86
0.716727
yue_Hant
0.871897
2dcec772cb167dc2710e1804762cfe154490a8e0
79
md
Markdown
README.md
wjhopper/PTB-boilerplate
361fb4c46522b17364a85467294513f288197a81
[ "MIT" ]
null
null
null
README.md
wjhopper/PTB-boilerplate
361fb4c46522b17364a85467294513f288197a81
[ "MIT" ]
null
null
null
README.md
wjhopper/PTB-boilerplate
361fb4c46522b17364a85467294513f288197a81
[ "MIT" ]
null
null
null
# PTB-boilerplate

A basic template to build Psychtoolbox experiments on top of
26.333333
60
0.822785
eng_Latn
0.953278
2dd01ddd7bcf2583165c9974f9bda18f2046c209
692
md
Markdown
build/docs/WemCoachingAppointmentTopicCoachingAppointmentExternalLink.md
MyPureCloud/platform-client-sdk-java
3bab4e670622c9092bfd1044fb1ea88ec6b15752
[ "MIT" ]
3
2018-05-22T14:33:38.000Z
2020-05-06T15:33:36.000Z
build/docs/WemCoachingAppointmentTopicCoachingAppointmentExternalLink.md
MyPureCloud/platform-client-sdk-java
3bab4e670622c9092bfd1044fb1ea88ec6b15752
[ "MIT" ]
4
2018-05-18T13:52:45.000Z
2019-11-06T18:04:56.000Z
build/docs/WemCoachingAppointmentTopicCoachingAppointmentExternalLink.md
MyPureCloud/platform-client-sdk-java
3bab4e670622c9092bfd1044fb1ea88ec6b15752
[ "MIT" ]
10
2018-05-18T00:08:16.000Z
2022-02-17T08:45:42.000Z
---
title: WemCoachingAppointmentTopicCoachingAppointmentExternalLink
---

## WemCoachingAppointmentTopicCoachingAppointmentExternalLink

## Properties

| Name | Type | Description | Notes |
| ------------ | ------------- | ------------- | ------------- |
| **externalLink** | **String** | | [optional] |
| **action** | [**ActionEnum**](#ActionEnum) | | [optional] |
{: class="table table-striped"}

<a name="ActionEnum"></a>

## Enum: ActionEnum

| Name | Value |
| ---- | ----- |
| OUTDATEDSDKVERSION | &quot;OutdatedSdkVersion&quot; |
| ADD | &quot;Add&quot; |
| REMOVE | &quot;Remove&quot; |
| NONE | &quot;None&quot; |
{: class="table table-striped"}
23.066667
71
0.566474
yue_Hant
0.468641
2dd0911363c74444e07893fdc6f0858b0e2dfdc7
1,192
md
Markdown
README.md
hannesm/tlstunnel
4bf0366f560a1ac33aaa0f3d323d7d6ec01932b9
[ "BSD-2-Clause" ]
79
2015-04-14T14:15:56.000Z
2020-05-27T23:28:46.000Z
README.md
hannesm/tlstunnel-lwt
4bf0366f560a1ac33aaa0f3d323d7d6ec01932b9
[ "BSD-2-Clause" ]
27
2015-04-15T12:22:54.000Z
2018-01-09T11:22:54.000Z
README.md
hannesm/tlstunnel-lwt
4bf0366f560a1ac33aaa0f3d323d7d6ec01932b9
[ "BSD-2-Clause" ]
7
2015-04-16T06:39:43.000Z
2020-04-06T12:06:39.000Z
## TLS tunnel -- a TLS reverse proxy

Who needs a stunnel if you have a tls tunnel? `tlstunnel` is picky; it won't accept connections:

- which do not contain the [secure renegotiation](https://tools.ietf.org/html/rfc5746) extension
- which speak SSL version 3
- if the given certificate chain is not valid (or contains an X.509 version 1 certificate, or an RSA public key shorter than 1024 bits)

Deprecated in favour of the [MirageOS unikernel](https://github.com/roburio/tlstunnel).

## Installation

[![Build Status](https://travis-ci.org/hannesm/tlstunnel.svg?branch=master)](https://travis-ci.org/hannesm/tlstunnel)

You first need [OCaml](https://ocaml.org) (at least 4.02.0) and [OPAM](https://opam.ocaml.org) (at least 1.2.2) from your distribution. Run `opam install tlstunnel` after `opam init` has finished.

## Execution

A sample command line is:

`tlstunnel -b 127.0.0.1:8080 -f 4433 -cert server.pem`

which listens on TCP port `4433` with the given certificate chain and private key (both in `server.pem`), and forwards connections to `127.0.0.1` on port `8080`. An optional argument is `-l FILE` to log to a file instead of to stdout. Try `--help` for all command line arguments.
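Since `-cert` expects the certificate chain and the private key together in one PEM file, a quick way to produce a throwaway `server.pem` for local testing might look like this (a sketch using a self-signed certificate; adjust the subject and validity for any real deployment):

```shell
# Generate a self-signed certificate and key, then concatenate them
# into the single PEM file that tlstunnel's -cert option expects.
openssl req -x509 -newkey rsa:2048 -nodes \
    -keyout key.pem -out cert.pem -days 30 \
    -subj "/CN=localhost"
cat cert.pem key.pem > server.pem
```

Note that a self-signed certificate is only suitable for testing; clients validating the chain will reject it.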
36.121212
128
0.74245
eng_Latn
0.960386
2dd0ea4d74df9bf32ef9ca75939d1a93d5e6538a
2,086
md
Markdown
docs/mfc/com-interface-entry-points.md
LouisJustinTALLOT/cpp-docs.fr-fr
1e9a7185459b26b2a69659615453d9eda6a2f2db
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/mfc/com-interface-entry-points.md
LouisJustinTALLOT/cpp-docs.fr-fr
1e9a7185459b26b2a69659615453d9eda6a2f2db
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/mfc/com-interface-entry-points.md
LouisJustinTALLOT/cpp-docs.fr-fr
1e9a7185459b26b2a69659615453d9eda6a2f2db
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
description: Learn more about COM interface entry points
title: COM interface entry points
ms.date: 03/27/2019
helpviewer_keywords:
- entry points, COM interfaces
- state management, OLE/COM interfaces
- MFC COM, COM interface entry points
- OLE, interface entry points
- MFC, managing state data
- COM interfaces, entry points
ms.assetid: 9e7421dc-0731-4748-9e1b-90acbaf26d77
ms.openlocfilehash: 805ac906c3ccca246d1af71c689aaf768f789999
ms.sourcegitcommit: d6af41e42699628c3e2e6063ec7b03931a49a098
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 12/11/2020
ms.locfileid: "97160010"
---
# <a name="com-interface-entry-points"></a>COM interface entry points

For the member functions of a COM interface, use the `METHOD_PROLOGUE` macro to maintain the correct global state when calling methods of an exported interface. Typically, member functions of interfaces implemented by `CCmdTarget`-derived objects already use this macro to enable automatic initialization of the `pThis` pointer. For example:

[!code-cpp[NVC_MFCConnectionPoints#5](codesnippet/cpp/com-interface-entry-points_1.cpp)]

For more information, see [Technical Note 38](tn038-mfc-ole-iunknown-implementation.md) on the MFC/OLE `IUnknown` implementation. The `METHOD_PROLOGUE` macro is defined as follows:

```cpp
#define METHOD_PROLOGUE(theClass, localClass) \
    theClass* pThis = \
        ((theClass*)((BYTE*)this - offsetof(theClass, m_x##localClass))); \
    AFX_MANAGE_STATE(pThis->m_pModuleState) \
```

The part of the macro concerned with global state management is:

`AFX_MANAGE_STATE( pThis->m_pModuleState )`

In this expression, *m_pModuleState* is assumed to be a member variable of the containing object. It is implemented by the `CCmdTarget` base class and is initialized to the appropriate value by `COleObjectFactory` when the object is instantiated.

## <a name="see-also"></a>See also

[Managing the State Data of MFC Modules](managing-the-state-data-of-mfc-modules.md)
43.458333
253
0.783797
fra_Latn
0.808459
2dd11763dc24e39aa8a54065dc1fa64b51953a53
211
md
Markdown
recipes/Python/578782_Printing_list_ODBC/README.md
tdiprima/code
61a74f5f93da087d27c70b2efe779ac6bd2a3b4f
[ "MIT" ]
2,023
2017-07-29T09:34:46.000Z
2022-03-24T08:00:45.000Z
recipes/Python/578782_Printing_list_ODBC/README.md
unhacker/code
73b09edc1b9850c557a79296655f140ce5e853db
[ "MIT" ]
32
2017-09-02T17:20:08.000Z
2022-02-11T17:49:37.000Z
recipes/Python/578782_Printing_list_ODBC/README.md
unhacker/code
73b09edc1b9850c557a79296655f140ce5e853db
[ "MIT" ]
780
2017-07-28T19:23:28.000Z
2022-03-25T20:39:41.000Z
## Printing list of ODBC data sources

Originally published: 2013-12-10 11:07:36
Last updated: 2013-12-10 11:07:37

Author: Michal Niklas

This simple code shows ODBC data sources. It uses the `odbc` module.
35.166667
64
0.734597
eng_Latn
0.792742
2dd16e124d0cd1abd085ee37cfb5c78103d55a4b
5,794
md
Markdown
docs/hardware.md
Retardigrades/blinkenhat
74348f12ae7834367e84d703f0a4b3e41437fc9d
[ "MIT" ]
null
null
null
docs/hardware.md
Retardigrades/blinkenhat
74348f12ae7834367e84d703f0a4b3e41437fc9d
[ "MIT" ]
8
2018-02-13T12:23:11.000Z
2018-02-20T15:28:02.000Z
docs/hardware.md
Retardigrades/blinkenhat
74348f12ae7834367e84d703f0a4b3e41437fc9d
[ "MIT" ]
1
2018-02-08T11:39:08.000Z
2018-02-08T11:39:08.000Z
---
title: Hardware
---

## Hardware

### Overview

The hardware consists of a small [PCB](https://en.wikipedia.org/wiki/Printed_circuit_board) that contains an ESP8266 microcontroller and some electrical components. The microcontroller is the heart of the system. It runs the software to light up the LEDs and provides a WiFi interface to configure the device and interact with it.

![PCB top view](https://github.com/Retardigrades/blinkenhat/raw/master/hardware/export/blinkenhat-pcb-3d-top.png)

The PCB has two connectors for LED strips. We decided to use WS2812B based strips, as they are easy to use, cheap and broadly available. The LEDs have four connectors:

| Connector       |                    |
|-----------------|--------------------|
| VCC             | The supply voltage |
| D<sub>in</sub>  | Data input         |
| D<sub>out</sub> | Data output        |
| GND             | Ground             |

The data consists of a serial signal that gets consumed by the first LED. The first LED consumes exactly 24 bits (or 3 bytes) of data: 8 bits (or one byte) for each of the three base colors (red, green, blue). Each LED pixel internally has a separate LED for each of these colors, and the data represents their brightness. With this you can "mix" almost any color for each LED.

![PCB bottom view](https://github.com/Retardigrades/blinkenhat/raw/master/hardware/export/blinkenhat-pcb-3d-bottom.png)

All data that the LED receives after these 24 bits is sent out on the D<sub>out</sub> connection, which is connected to the D<sub>in</sub> of the next LED. With this you can pass all your data through the D<sub>in</sub> of your first LED, which makes the wiring very easy:

![LED chaining](graphs/led_chain.png)

### Electronic Components

The big chunk on the board that sits on the space labelled _U2_ in the picture above is the microcontroller. The bit that overlaps the PCB is its WiFi antenna. The part labelled _U1_ is a voltage regulator that provides the microcontroller with a stable 3.3V. The LEDs use 5V for operation and data, while the microcontroller we use needs 3.3V.
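The 3-bytes-per-LED stream described above can be sketched in host-side Python (this is an illustration, not firmware from this project; note that WS2812B parts expect the color bytes on the wire in green-red-blue order):

```python
def ws2812_frame(colors):
    """Flatten a list of (r, g, b) tuples into the byte stream a chain of
    WS2812B LEDs consumes.  Each LED takes the first 3 bytes it sees and
    forwards the rest to the next LED via its data-out pin."""
    frame = bytearray()
    for r, g, b in colors:
        frame += bytes((g, r, b))   # WS2812B wire order is GRB
    return bytes(frame)

# Two LEDs: first red, second blue -> 6 bytes on the data line
print(ws2812_frame([(255, 0, 0), (0, 0, 255)]).hex())  # -> 00ff000000ff
```

This is why a single data pin suffices for the whole strip: the frame length, not any addressing, determines which LED gets which color.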
To make sure the LEDs recognize our data, we need something called a [level shifter](https://en.wikipedia.org/wiki/Level_shifter). This piece "translates" the 3.3V signals to 5V. As we have two channels for LEDs, we also have two of these level shifters on the board. They consist of a MOSFET and two resistors. You can locate them by the labels _Q1_ and _Q2_ on the board. Connected to these are the connectors for the LEDs, labelled _J1_ and _J2_.

![Level shifter](graphs/level_shift.png)

The little button on the PCB is connected to the reset pin of the microcontroller. Press it to reset (reboot) the controller.

The remaining components are the connectors and some resistors, diodes and capacitors. Some of them are necessary to bring the controller into the right state after boot; some are part of the level shifters. For more details we provide the [schematic](https://en.wikipedia.org/wiki/Schematic) of the whole board along with the source code of the software. You can view the latest [PDF version](https://github.com/Retardigrades/blinkenhat/raw/master/hardware/export/blinkenhat-schematic.pdf) online.

### Connectors

The board has several connectors. They are all named _JX_, where _X_ is the number of the connector. The connectors come in three types that serve different purposes.

| Name | Description |
|------|-----------------------------------------------------------------------|
| J1   | Output for LED strip 1. |
| J2   | Output for LED strip 2. |
| J3   | Input port for two switches. |
| J4   | Pin header to program the ESP. It is compatible with the ESP-01 module. |
| J5   | The USB port to power the board. |
| J6   | Extension header containing all free pins. See the [schematic](https://github.com/Retardigrades/blinkenhat/raw/master/hardware/export/blinkenhat-schematic.pdf) for allocation details. |

The connectors _J1_ and _J2_ can be used to connect LED strips. They carry the supply voltage and a signal from one of the level shifters.
For your hat we use only _J1_. _J2_ is free to be used by your own creation - no matter whether you attach more LEDs to your hat or something completely different.

_J3_ has two pins that are connected to two input connections of the microcontroller, plus a _GROUND_ signal. You can add two switches between these and use them in your software to alter animations, for example.

The pin header _J4_ is meant to be used to program the controller. It is wired exactly like the pin header on a common ESP-01 module, so you can find the pin allocation easily [on the internet](http://simba-os.readthedocs.io/en/latest/_images/esp01-pinout.png). **ATTENTION:** Take care that you use it in the right orientation: the row with the _VCC_ pin is on the edge of the BlinkenHat board.

_J5_ is the USB plug to power the board. No other pins are connected besides _GROUND_ and _VCC_, so you won't be able to transfer any data over that connector. We considered making the board directly programmable via USB, but this would have increased the complexity and the cost.

The last connector, _J6_, was built in for extensibility. It gives you access to all unused pins of the microcontroller and to its 3.3V supply voltage. This can be useful if you want to extend or change the software to include sensors or add other things. The actual pin allocation is documented in the [schematic](https://github.com/Retardigrades/blinkenhat/raw/master/hardware/export/blinkenhat-schematic.pdf) of the board.
90.53125
554
0.720573
eng_Latn
0.998896
2dd4fe8436521a5a7cf2354b4ce535355896e302
56
md
Markdown
README.md
hursitkaplan/tidy-folder
24e20bf68ef174d6404fec55a0eb00c7963ce291
[ "MIT" ]
1
2020-07-05T17:52:54.000Z
2020-07-05T17:52:54.000Z
README.md
hursitkaplan/tidy-folder
24e20bf68ef174d6404fec55a0eb00c7963ce291
[ "MIT" ]
null
null
null
README.md
hursitkaplan/tidy-folder
24e20bf68ef174d6404fec55a0eb00c7963ce291
[ "MIT" ]
null
null
null
# tidy-folder

A program that can organize your folder.
18.666667
41
0.767857
eng_Latn
0.999774
2dd73b0c4a513fb5a5d44c9260047a4d2388a10c
3,327
md
Markdown
README_easy.md
lsda123456/Aria2Dash
6d875a723a59f5c7304079c71e02153ccd12588e
[ "MIT" ]
180
2020-01-04T02:34:02.000Z
2022-03-21T12:42:03.000Z
README_easy.md
lsda123456/Aria2Dash
6d875a723a59f5c7304079c71e02153ccd12588e
[ "MIT" ]
6
2020-04-18T11:33:16.000Z
2022-02-15T10:05:18.000Z
README_easy.md
lsda123456/Aria2Dash
6d875a723a59f5c7304079c71e02153ccd12588e
[ "MIT" ]
41
2020-02-03T01:55:07.000Z
2022-03-30T08:05:28.000Z
# Build your own offline download server with support for BT, magnet links and more (usable even by complete beginners)

## Foreword

This article is aimed at beginners. If you are not a beginner, see [here](https://github.com/Masterchiefm/Aria2Dash).

## Main text

The main text is presented entirely as images. If the images fail to load, use a proxy.

One day I wanted to download *Joker*. The internet offered the magnet link magnet:?xt=urn:btih:6BDFC895924F10E501BD417A1A8D126A6064B837, but my phone had no download tool, and Thunder without a membership is painfully slow. Hence this tutorial: build an offline downloader, let it finish the download, then fetch the file back. Speeds of up to 22 MB/s!

![0](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/0.jpg)

Below this image are the Vultr registration link and a short tutorial. Registering through the link gets you a small discount and gives me a small reward, so you won't lose out. Of course, you don't have to use Vultr; similar services work the same way, but this guide uses Vultr as the example. One advantage of Vultr is that it can be reached without a proxy; just follow the link I give below. Once there, confirm the site is vultr.com, enter your email address and a password, and click "create account" to register.

![1](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/1.jpg)

Registering via the image below gets me a referral bonus, and you won't lose anything either.

-------

<a href="https://www.vultr.com/?ref=8126287"><img src="https://www.vultr.com/media/banners/banner_300x250.png" width="300" height="250"></a>

--------------

If you have money to spare, register via the image below and deposit 50 USD to receive a 25 USD bonus. I forget what form the bonus takes; have a look inside. I suggest starting with just 10 USD (about 70 RMB). You can also use the server as your own proxy, although no proxy tutorial is provided here.

<a href="https://www.vultr.com/?ref=8154695-4F"><img src="https://www.vultr.com/media/banners/banner_468x60.png" width="468" height="60"></a>

-----------

![2](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/a.jpg)
![2](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/b.jpg)
![2](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/c.jpg)
![2](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/2.jpg)
![3](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/3.jpg)
![4](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/4.jpg)
![5](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/5.jpg)

Copying and pasting isn't hard, right?

```
#!/bin/bash
sudo apt install curl -y
bash <(curl -s -L https://github.com/Masterchiefm/Aria2Dash/raw/master/Aria2Dash.sh)
#Done.
```

![6](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/6.jpg)
![7](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/7.jpg)
![8](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/8.jpg)
![9](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/9.jpg)
![10](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/10.jpg)
![11](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/11.jpg)
![12](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/12.jpg)
![13](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/13.jpg)
![14](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/14.jpg)
![15](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/15.jpg)
![16](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/16.jpg)
![17](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/17.jpg)
![18](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/18.jpg)
![19](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/19.jpg)
![20](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/20.jpg)
![21](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/21.jpg)
![](https://raw.githubusercontent.com/Masterchiefm/pictures/master/Aria2Dash/22.jpg)
42.113924
141
0.779982
yue_Hant
0.728685
2dd805d07207c7d254e1bdd4e72137488b8c7efb
284
md
Markdown
CHANGELOG.md
jonathanhefner/moar
7cb11f20914c636428d174115d45379396d7c879
[ "MIT" ]
5
2018-08-18T18:54:37.000Z
2021-02-04T17:17:14.000Z
CHANGELOG.md
jonathanhefner/moar
7cb11f20914c636428d174115d45379396d7c879
[ "MIT" ]
null
null
null
CHANGELOG.md
jonathanhefner/moar
7cb11f20914c636428d174115d45379396d7c879
[ "MIT" ]
null
null
null
## 1.0.2

* When installing, warn instead of raising an error if "app/assets/javascripts/application.js" does not exist, such as in newly generated Rails 6.0 applications

## 1.0.1

* Allow any Rails >= 5.1
* Avoid triggering load in `link_to_more`

## 1.0.0

* Initial release
16.705882
68
0.704225
eng_Latn
0.987655
2dda2003b4916a8f5d1ebbee07e443ffe5b8a06f
1,252
md
Markdown
_posts/tumblr/2013-12-08-dropstore-ng-opensource-angularjs-bindings-for.md
AnalogJ/thesparktree-blog
3d8071ef23aa6df64f6bc325ce698ea8b54ef282
[ "Unlicense", "MIT" ]
null
null
null
_posts/tumblr/2013-12-08-dropstore-ng-opensource-angularjs-bindings-for.md
AnalogJ/thesparktree-blog
3d8071ef23aa6df64f6bc325ce698ea8b54ef282
[ "Unlicense", "MIT" ]
44
2020-02-25T15:45:19.000Z
2022-02-26T04:38:06.000Z
_posts/tumblr/2013-12-08-dropstore-ng-opensource-angularjs-bindings-for.md
AnalogJ/thesparktree-blog
3d8071ef23aa6df64f6bc325ce698ea8b54ef282
[ "Unlicense", "MIT" ]
null
null
null
---
layout: post
title: 'dropstore-ng: opensource AngularJS bindings for Dropbox Javascript API'
date: '2013-12-08T20:00:00-08:00'
cover: '/assets/images/cover_angularjs.png'
subclass: 'post tag-post'
tags:
- dropbox
- angularjs
- github
- javascript
redirect_from:
- /post/69434240145/dropstore-ng-opensource-angularjs-bindings-for
- /post/69434240145
disqus_id: 'http://blog.thesparktree.com/post/69434240145'
categories: 'analogj'
navigation: True
logo: '/assets/logo.png'
---

I created a github project called [dropstore-ng](https://github.com/AnalogJ/dropstore-ng) that provides AngularJS bindings for the recently released Dropbox Datastore API, as well as for all the other related functions in the Javascript API. The service wraps most of the Dropbox Datastore callbacks in promises, contains subscription methods for Dropbox events and provides transparent aliases for untouched library methods.

I also created a realtime todo sample app which you can try [here](https://dropstore-ng.herokuapp.com/).

You can access the library here: [https://github.com/AnalogJ/dropstore-ng](https://github.com/AnalogJ/dropstore-ng) or through bower:

```bash
bower install dropstore-ng --save
```

<div class="github-widget" data-repo="AnalogJ/dropstore-ng"></div>
35.771429
232
0.781949
eng_Latn
0.762845
2dda90b8546b8a0e086cd7e36a59493797514b79
1,027
md
Markdown
_people/nick_sutton-smith.md
URN/urn-history-project
3823f5f8e01e5a8c8c7e3cfe0affcf966dedf938
[ "MIT" ]
2
2018-09-09T21:48:37.000Z
2021-06-06T10:15:54.000Z
_people/nick_sutton-smith.md
URN/urn-history-project
3823f5f8e01e5a8c8c7e3cfe0affcf966dedf938
[ "MIT" ]
92
2018-07-30T19:33:17.000Z
2022-03-26T12:52:55.000Z
_people/nick_sutton-smith.md
URN/urn-history-project
3823f5f8e01e5a8c8c7e3cfe0affcf966dedf938
[ "MIT" ]
2
2018-08-11T15:58:54.000Z
2018-08-21T20:03:19.000Z
---
title: Nick Sutton-Smith
gender: male
course:
- English and History BA
graduated: 2018
links:
- type: Twitter
  username: nicksuttonsmith
- type: Instagram
  username: nicksuttonsmith
submitted: 2018-09-06
---

Walking 50 miles back from Stoke-on-Trent in 24 hours for charity with Nick Clay was awful but really rewarding (this was for the inaugural Road To Nottingham, where 4 teams raced back to our studios on foot).

The SRA deadline for 2018 fell on a Friday, and we had to completely pack up our beloved studios in A11 in the Portland Basement to move to the new media hub. That was the most tiring weekend: myself, Johnathan Graydon, James Perkins, Bessie Ephgrave, Jamie Keene, Anna Lambert and a handful of others worked till 8am on the Monday. The place was completely destroyed; all the roof tiles had come out to get the wires out of the ceilings, and there was a load of stuff left over that we found throughout the weekend. It was a very humbling and emotional weekend saying goodbye to the studios.
57.055556
591
0.778968
eng_Latn
0.999598
2ddab40b0214f5a2365d679425e43931e0c53124
10,058
md
Markdown
api/wrappers/jsp/listView.md
colinvo/kendo-docs
7adfa37886ec90fca3544d47abd5f76422497da1
[ "Unlicense", "MIT" ]
1
2016-02-03T10:15:54.000Z
2016-02-03T10:15:54.000Z
api/wrappers/jsp/listView.md
colinvo/kendo-docs
7adfa37886ec90fca3544d47abd5f76422497da1
[ "Unlicense", "MIT" ]
null
null
null
api/wrappers/jsp/listView.md
colinvo/kendo-docs
7adfa37886ec90fca3544d47abd5f76422497da1
[ "Unlicense", "MIT" ]
null
null
null
--- title: listView slug: jsp-listView tags: api, java publish: true --- # \<kendo:listView\> A JSP wrapper for Kendo UI [ListView](/kendo-ui/api/web/listview). ## Configuration Attributes ### altTemplate `java.lang.String` Template to be used for rendering the alternate items in the listview. #### Example <kendo:listView altTemplate="altTemplate"> </kendo:listView> ### autoBind `boolean` If set to false the widget will not bind to the data source during initialization. In this case data binding will occur when the change event of the data source is fired. By default the widget will bind to the data source specified in the configuration. #### Example <kendo:listView autoBind="autoBind"> </kendo:listView> ### editTemplate `java.lang.String` Specifies ListView item template in edit mode. #### Example <kendo:listView editTemplate="editTemplate"> </kendo:listView> ### navigatable `boolean` Indicates whether keyboard navigation is enabled/disabled. #### Example <kendo:listView navigatable="navigatable"> </kendo:listView> ### pageable `boolean` Indicates whether paging is enabled/disabled. Further configuration is available via [kendo:listView-pageable](#kendo-listView-pageable). #### Example <kendo:listView pageable="pageable"> </kendo:listView> ### selectable `java.lang.Object` Indicates whether selection is enabled/disabled. Possible values: #### Example <kendo:listView selectable="selectable"> </kendo:listView> ### tagName `java.lang.String` Specifies ListView wrapper element tag name. #### Example <kendo:listView tagName="tagName"> </kendo:listView> ### template `java.lang.String` The id of the template used for rendering the items in the listview. #### Example <kendo:listView template="template"> </kendo:listView> ## Configuration JSP Tags ### kendo:listView-pageable Indicates whether paging is enabled/disabled. More documentation is available at [kendo:listView-pageable](/kendo-ui/api/wrappers/jsp/listview/pageable). 
#### Example <kendo:listView> <kendo:listView-pageable></kendo:listView-pageable> </kendo:listView> ## Event Attributes ### cancel `String` Fired when the user clicks the "cancel" button.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [cancel](/kendo-ui/api/web/listview#events-cancel) event documentation. #### Example <kendo:listView cancel="handle_cancel"> </kendo:listView> <script> function handle_cancel(e) { // Code to handle the cancel event. } </script> ### change `String` Fires when the list view selection has changed.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [change](/kendo-ui/api/web/listview#events-change) event documentation. #### Example <kendo:listView change="handle_change"> </kendo:listView> <script> function handle_change(e) { // Code to handle the change event. } </script> ### dataBound `String` Fires when the list view has received data from the data source and it is already rendered.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [dataBound](/kendo-ui/api/web/listview#events-dataBound) event documentation. #### Example <kendo:listView dataBound="handle_dataBound"> </kendo:listView> <script> function handle_dataBound(e) { // Code to handle the dataBound event. } </script> ### dataBinding `String` Fires when the list view is about to be rendered.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [dataBinding](/kendo-ui/api/web/listview#events-dataBinding) event documentation. #### Example <kendo:listView dataBinding="handle_dataBinding"> </kendo:listView> <script> function handle_dataBinding(e) { // Code to handle the dataBinding event. 
} </script> ### edit `String` Fires when the list view enters edit mode.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [edit](/kendo-ui/api/web/listview#events-edit) event documentation. #### Example <kendo:listView edit="handle_edit"> </kendo:listView> <script> function handle_edit(e) { // Code to handle the edit event. } </script> ### remove `String` Fires before the list view item is removed. If it is not prevented will call DataSource sync method.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [remove](/kendo-ui/api/web/listview#events-remove) event documentation. #### Example <kendo:listView remove="handle_remove"> </kendo:listView> <script> function handle_remove(e) { // Code to handle the remove event. } </script> ### save `String` Fired when a data item is saved.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [save](/kendo-ui/api/web/listview#events-save) event documentation. #### Example <kendo:listView save="handle_save"> </kendo:listView> <script> function handle_save(e) { // Code to handle the save event. } </script> ## Event Tags ### kendo:listView-cancel Fired when the user clicks the "cancel" button.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [cancel](/kendo-ui/api/web/listview#events-cancel) event documentation. #### Example <kendo:listView> <kendo:listView-cancel> <script> function(e) { // Code to handle the cancel event. } </script> </kendo:listView-cancel> </kendo:listView> ### kendo:listView-change Fires when the list view selection has changed.The event handler function context (available via the this keyword) will be set to the widget instance. 
For additional information check the [change](/kendo-ui/api/web/listview#events-change) event documentation. #### Example <kendo:listView> <kendo:listView-change> <script> function(e) { // Code to handle the change event. } </script> </kendo:listView-change> </kendo:listView> ### kendo:listView-dataBound Fires when the list view has received data from the data source and it is already rendered.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [dataBound](/kendo-ui/api/web/listview#events-dataBound) event documentation. #### Example <kendo:listView> <kendo:listView-dataBound> <script> function(e) { // Code to handle the dataBound event. } </script> </kendo:listView-dataBound> </kendo:listView> ### kendo:listView-dataBinding Fires when the list view is about to be rendered.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [dataBinding](/kendo-ui/api/web/listview#events-dataBinding) event documentation. #### Example <kendo:listView> <kendo:listView-dataBinding> <script> function(e) { // Code to handle the dataBinding event. } </script> </kendo:listView-dataBinding> </kendo:listView> ### kendo:listView-edit Fires when the list view enters edit mode.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [edit](/kendo-ui/api/web/listview#events-edit) event documentation. #### Example <kendo:listView> <kendo:listView-edit> <script> function(e) { // Code to handle the edit event. } </script> </kendo:listView-edit> </kendo:listView> ### kendo:listView-remove Fires before the list view item is removed. If it is not prevented will call DataSource sync method.The event handler function context (available via the this keyword) will be set to the widget instance. 
For additional information check the [remove](/kendo-ui/api/web/listview#events-remove) event documentation. #### Example <kendo:listView> <kendo:listView-remove> <script> function(e) { // Code to handle the remove event. } </script> </kendo:listView-remove> </kendo:listView> ### kendo:listView-save Fired when a data item is saved.The event handler function context (available via the this keyword) will be set to the widget instance. For additional information check the [save](/kendo-ui/api/web/listview#events-save) event documentation. #### Example <kendo:listView> <kendo:listView-save> <script> function(e) { // Code to handle the save event. } </script> </kendo:listView-save> </kendo:listView>
29.934524
204
0.652118
eng_Latn
0.888019
2ddb35ca1c5cab9d06b1a0e6db39e2ccc857020f
19,586
md
Markdown
vignettes/singlesynth-vignette.md
YuNu1210/augsynth
05ad62409e11cccfdca013ad64c9cbaef9dc4dab
[ "MIT" ]
73
2018-11-13T21:23:38.000Z
2022-03-30T20:38:38.000Z
vignettes/singlesynth-vignette.md
YuNu1210/augsynth
05ad62409e11cccfdca013ad64c9cbaef9dc4dab
[ "MIT" ]
50
2019-01-16T19:52:45.000Z
2022-03-31T16:39:54.000Z
vignettes/singlesynth-vignette.md
YuNu1210/augsynth
05ad62409e11cccfdca013ad64c9cbaef9dc4dab
[ "MIT" ]
41
2019-11-28T00:36:53.000Z
2022-03-14T12:44:32.000Z
--- output: rmarkdown::html_vignette vignette: > %\VignetteIndexEntry{Single Outcome AugSynth Vignette} %\VignetteEngine{knitr::rmarkdown} %\VignetteEncoding{UTF-8} --- # `augsynth`: The Augmented Synthetic Control Method ## Installation You can install `augsynth` from GitHub using `devtools`. ```r ## Install devtools if not already installed install.packages("devtools", repos='http://cran.us.r-project.org') ## Install augsynth from GitHub devtools::install_github("ebenmichael/augsynth") ``` ## Example: Effects of the 2012 Kansas Tax Cuts ### The data To show the usage and features of `augsynth`, we'll use data on the impact of personal income tax cuts in Kansas that comes with the `AugSynth` package. Our interest is in estimating the effect of income tax cuts on gross state product (GSP) per capita. ```r library(magrittr) library(dplyr) library(augsynth) data(kansas) ``` The `kansas` dataset contains the GSP per capita (the outcome measure) `lngdpcapita` for all 50 states from the first quarter of 1990 to the first quarter of 2016. To run `augsynth`, we need to include a treatment status column that indicates which region was treated and at what time. The table in `kansas` contains the column `treated` to denote this. In the original study, the tax cut was implemented in Kansas in the second quarter of 2012. ```r kansas %>% select(year, qtr, year_qtr, state, treated, gdp, lngdpcapita) %>% filter(state == "Kansas" & year_qtr >= 2012 & year_qtr < 2013) #> # A tibble: 4 x 7 #> year qtr year_qtr state treated gdp lngdpcapita #> <dbl> <dbl> <dbl> <chr> <dbl> <dbl> <dbl> #> 1 2012 1 2012 Kansas 0 143844 10.8 #> 2 2012 2 2012. Kansas 1 141518 10.8 #> 3 2012 3 2012. Kansas 1 138890 10.8 #> 4 2012 4 2013. Kansas 1 139603 10.8 ``` ### Synth Now to find a synthetic control using the entire series of pre-intervention outcomes (and no auxiliary covariates), we can use `augsynth`.
To do so we just need to give `augsynth` a formula like `outcome ~ treatment`, tell it what the unit and time variables are, optionally provide when the intervention took place (the code will automatically determine this if `t_int` is not provided), and specify that we don't want to fit an outcome model. ```r library(augsynth) syn <- augsynth(lngdpcapita ~ treated, fips, year_qtr, kansas, progfunc = "None", scm = T) ``` We can then look at the ATT estimates for each post-intervention time period and overall. We'll also see the quality of the synthetic control fit measured by the L2 distance between Kansas and its synthetic control, and the percent improvement over uniform weights. By default, we'll also see pointwise confidence intervals using a [conformal inference procedure](https://arxiv.org/abs/1712.09089). ```r summary(syn) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "None", scm = ..2) #> #> Average ATT Estimate (p Value for Joint Null): -0.029 ( 0.328 ) #> L2 Imbalance: 0.083 #> Percent improvement from uniform weights: 79.5% #> #> Avg Estimated Bias: NA #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.018 -0.045 0.006 0.111 #> 2012.50 -0.041 -0.070 -0.015 0.022 #> 2012.75 -0.033 -0.062 -0.007 0.044 #> 2013.00 -0.019 -0.046 0.005 0.111 #> 2013.25 -0.029 -0.053 -0.005 0.044 #> 2013.50 -0.046 -0.073 -0.022 0.022 #> 2013.75 -0.032 -0.056 -0.010 0.022 #> 2014.00 -0.045 -0.074 -0.018 0.022 #> 2014.25 -0.043 -0.074 -0.014 0.022 #> 2014.50 -0.029 -0.061 0.000 0.044 #> 2014.75 -0.018 -0.053 0.011 0.144 #> 2015.00 -0.029 -0.066 0.005 0.078 #> 2015.25 -0.019 -0.051 0.010 0.122 #> 2015.50 -0.022 -0.056 0.007 0.111 #> 2015.75 -0.019 -0.055 0.013 0.189 #> 2016.00 -0.028 -0.067 0.008 0.100 ``` The default test statistic is the sum of the absolute treatment effects `function(x) sum(abs(x))`.
We can change the test statistic via the `stat_func` argument. For instance, if we want to perform a one-way test against positive effects, we can set the test statistic to be the negative sum `function(x) -sum(x)`: ```r summary(syn, stat_func = function(x) -sum(x)) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "None", scm = ..2) #> #> Average ATT Estimate (p Value for Joint Null): -0.029 ( 0.159 ) #> L2 Imbalance: 0.083 #> Percent improvement from uniform weights: 79.5% #> #> Avg Estimated Bias: NA #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.018 -0.080 0.006 0.067 #> 2012.50 -0.041 -0.103 -0.015 0.022 #> 2012.75 -0.033 -0.095 -0.007 0.033 #> 2013.00 -0.019 -0.081 0.005 0.067 #> 2013.25 -0.029 -0.091 -0.005 0.033 #> 2013.50 -0.046 -0.108 -0.022 0.022 #> 2013.75 -0.032 -0.094 -0.010 0.022 #> 2014.00 -0.045 -0.107 -0.021 0.022 #> 2014.25 -0.043 -0.105 -0.014 0.022 #> 2014.50 -0.029 -0.091 0.000 0.033 #> 2014.75 -0.018 -0.080 0.011 0.078 #> 2015.00 -0.029 -0.091 0.005 0.056 #> 2015.25 -0.019 -0.081 0.007 0.078 #> 2015.50 -0.022 -0.084 0.007 0.067 #> 2015.75 -0.019 -0.081 0.013 0.111 #> 2016.00 -0.028 -0.090 0.008 0.067 ``` Or if we want to prioritize testing the average post-treatment effect, we can set it to be the absolute sum: ```r summary(syn, stat_func = function(x) abs(sum(x))) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "None", scm = ..2) #> #> Average ATT Estimate (p Value for Joint Null): -0.029 ( 0.302 ) #> L2 Imbalance: 0.083 #> Percent improvement from uniform weights: 79.5% #> #> Avg Estimated Bias: NA #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.018 -0.045 0.006 0.111 #> 2012.50 -0.041 -0.070 -0.015 0.022 #> 2012.75 -0.033 -0.062 -0.007
0.044 #> 2013.00 -0.019 -0.046 0.005 0.111 #> 2013.25 -0.029 -0.053 -0.005 0.044 #> 2013.50 -0.046 -0.073 -0.022 0.022 #> 2013.75 -0.032 -0.056 -0.010 0.022 #> 2014.00 -0.045 -0.074 -0.018 0.022 #> 2014.25 -0.043 -0.074 -0.014 0.022 #> 2014.50 -0.029 -0.061 0.000 0.044 #> 2014.75 -0.018 -0.053 0.011 0.144 #> 2015.00 -0.029 -0.066 0.005 0.078 #> 2015.25 -0.019 -0.051 0.010 0.122 #> 2015.50 -0.022 -0.056 0.007 0.111 #> 2015.75 -0.019 -0.055 0.013 0.189 #> 2016.00 -0.028 -0.067 0.008 0.100 ``` It's easier to see this information visually. Below we plot the difference between Kansas and its synthetic control. Before the tax cuts (to the left of the dashed line) we expect these to be close, and after the tax cuts we measure the effect (with point-wise confidence intervals). ```r plot(syn) ``` <img src="figure/fig_syn-1.png" title="plot of chunk fig_syn" alt="plot of chunk fig_syn" style="display: block; margin: auto;" /> We can also compute point-wise confidence intervals using the [Jackknife+ procedure](https://arxiv.org/abs/1905.02928) by changing the `inf_type` argument, although this requires additional assumptions. ```r plot(syn, inf_type = "jackknife+") ``` <img src="figure/fig_syn_plus-1.png" title="plot of chunk fig_syn_plus" alt="plot of chunk fig_syn_plus" style="display: block; margin: auto;" /> ### Augmenting synth with an outcome model In this example the pre-intervention synthetic control fit has an L2 imbalance of 0.083, about 20% of the imbalance between Kansas and the average of the other states. We can reduce this by _augmenting_ synth with ridge regression. To do this we change `progfunc` to `"Ridge"`.
We can also choose the ridge hyper-parameter by setting `lambda`; if `lambda` is not specified, it will be chosen through cross validation: ```r asyn <- augsynth(lngdpcapita ~ treated, fips, year_qtr, kansas, progfunc = "Ridge", scm = T) ``` We can plot the cross-validation MSE when dropping pre-treatment time periods by setting `cv = T` in the `plot` function: ```r plot(asyn, cv = T) ``` <img src="figure/fig_asyn_cv-1.png" title="plot of chunk fig_asyn_cv" alt="plot of chunk fig_asyn_cv" style="display: block; margin: auto;" /> By default, the CV procedure chooses the maximal value of `lambda` with MSE within one standard deviation of the minimal MSE. To instead choose the `lambda` that minimizes the cross-validation MSE, set `min_1se = FALSE`. We can look at the summary and plot the results. Now in the summary output we see an estimate of the overall bias of synth; we measure this with the average amount that augmentation changes the synth estimate. Notice that the estimates become somewhat larger in magnitude, and the standard errors are tighter.
```r summary(asyn) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "Ridge", scm = ..2) #> #> Average ATT Estimate (p Value for Joint Null): -0.040 ( 0.057 ) #> L2 Imbalance: 0.062 #> Percent improvement from uniform weights: 84.7% #> #> Avg Estimated Bias: 0.011 #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.022 -0.044 0.003 0.056 #> 2012.50 -0.047 -0.076 -0.018 0.022 #> 2012.75 -0.043 -0.071 -0.010 0.022 #> 2013.00 -0.030 -0.055 -0.004 0.033 #> 2013.25 -0.041 -0.067 -0.012 0.022 #> 2013.50 -0.059 -0.088 -0.030 0.022 #> 2013.75 -0.045 -0.073 -0.019 0.022 #> 2014.00 -0.058 -0.090 -0.026 0.022 #> 2014.25 -0.055 -0.091 -0.020 0.022 #> 2014.50 -0.041 -0.080 -0.006 0.033 #> 2014.75 -0.029 -0.068 0.006 0.056 #> 2015.00 -0.040 -0.082 0.000 0.056 #> 2015.25 -0.030 -0.066 0.002 0.056 #> 2015.50 -0.033 -0.072 0.003 0.056 #> 2015.75 -0.029 -0.071 0.010 0.056 #> 2016.00 -0.038 -0.087 0.004 0.056 ``` ```r plot(asyn) ``` <img src="figure/fig_asyn-1.png" title="plot of chunk fig_asyn" alt="plot of chunk fig_asyn" style="display: block; margin: auto;" /> There are also several auxiliary covariates. We can include these in the augmentation by fitting an outcome model using the auxiliary covariates. To do this we simply add the covariates into the formula after `|`. By default this will create time invariant covariates by averaging the auxiliary covariates over the pre-intervention period, dropping `NA` values. We can use a custom aggregation function by setting the `cov_agg` argument. Then the lagged outcomes and the auxiliary covariates are jointly balanced by SCM and the ridge outcome model includes both. 
```r covsyn <- augsynth(lngdpcapita ~ treated | lngdpcapita + log(revstatecapita) + log(revlocalcapita) + log(avgwklywagecapita) + estabscapita + emplvlcapita, fips, year_qtr, kansas, progfunc = "ridge", scm = T) ``` Again we can look at the summary and plot the results. ```r summary(covsyn) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "ridge", scm = ..2) #> #> Average ATT Estimate (p Value for Joint Null): -0.061 ( 0.11 ) #> L2 Imbalance: 0.054 #> Percent improvement from uniform weights: 86.6% #> #> Covariate L2 Imbalance: 0.005 #> Percent improvement from uniform weights: 97.7% #> #> Avg Estimated Bias: 0.027 #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.021 -0.044 0.002 0.067 #> 2012.50 -0.047 -0.076 -0.014 0.033 #> 2012.75 -0.050 -0.083 -0.007 0.033 #> 2013.00 -0.045 -0.074 -0.012 0.033 #> 2013.25 -0.055 -0.088 -0.022 0.022 #> 2013.50 -0.071 -0.105 -0.033 0.022 #> 2013.75 -0.058 -0.091 -0.025 0.022 #> 2014.00 -0.081 -0.119 -0.037 0.022 #> 2014.25 -0.078 -0.121 -0.034 0.022 #> 2014.50 -0.065 -0.114 -0.021 0.033 #> 2014.75 -0.057 -0.110 -0.008 0.044 #> 2015.00 -0.075 -0.124 -0.022 0.033 #> 2015.25 -0.063 -0.106 -0.014 0.033 #> 2015.50 -0.067 -0.106 -0.019 0.022 #> 2015.75 -0.063 -0.101 -0.009 0.022 #> 2016.00 -0.078 -0.122 -0.019 0.022 ``` ```r plot(covsyn) ``` <img src="figure/fig_covsyn-1.png" title="plot of chunk fig_covsyn" alt="plot of chunk fig_covsyn" style="display: block; margin: auto;" /> Now we can additionally fit ridge ASCM on the residuals, look at the summary, and plot the results. 
```r covsyn_resid <- augsynth(lngdpcapita ~ treated | lngdpcapita + log(revstatecapita) + log(revlocalcapita) + log(avgwklywagecapita) + estabscapita + emplvlcapita, fips, year_qtr, kansas, progfunc = "ridge", scm = T, lambda = asyn$lambda, residualize = T) ``` ```r summary(covsyn_resid) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "ridge", scm = ..2, #> lambda = ..3, residualize = ..4) #> #> Average ATT Estimate (p Value for Joint Null): -0.055 ( 0.288 ) #> L2 Imbalance: 0.067 #> Percent improvement from uniform weights: 83.4% #> #> Covariate L2 Imbalance: 0.000 #> Percent improvement from uniform weights: 100% #> #> Avg Estimated Bias: 0.006 #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.025 -0.046 -0.005 0.044 #> 2012.50 -0.051 -0.076 -0.026 0.011 #> 2012.75 -0.045 -0.070 -0.020 0.011 #> 2013.00 -0.044 -0.069 -0.019 0.011 #> 2013.25 -0.051 -0.077 -0.026 0.011 #> 2013.50 -0.069 -0.094 -0.044 0.011 #> 2013.75 -0.051 -0.077 -0.026 0.011 #> 2014.00 -0.069 -0.095 -0.040 0.011 #> 2014.25 -0.067 -0.097 -0.037 0.011 #> 2014.50 -0.053 -0.083 -0.024 0.011 #> 2014.75 -0.045 -0.075 -0.015 0.022 #> 2015.00 -0.064 -0.093 -0.034 0.011 #> 2015.25 -0.051 -0.076 -0.026 0.011 #> 2015.50 -0.059 -0.089 -0.034 0.011 #> 2015.75 -0.058 -0.087 -0.028 0.011 #> 2016.00 -0.074 -0.103 -0.044 0.011 ``` ```r plot(covsyn_resid) ``` <img src="figure/fig_covsyn_resid-1.png" title="plot of chunk fig_covsyn_resid" alt="plot of chunk fig_covsyn_resid" style="display: block; margin: auto;" /> Finally, we can augment synth with many different outcome models. The simplest outcome model is a unit fixed effect model, which we can include by setting `fixedeff = T`. 
```r desyn <- augsynth(lngdpcapita ~ treated, fips, year_qtr, kansas, progfunc = "none", scm = T, fixedeff = T) ``` ```r summary(desyn) #> #> Call: #> single_augsynth(form = form, unit = !!enquo(unit), time = !!enquo(time), #> t_int = t_int, data = data, progfunc = "none", scm = ..2, #> fixedeff = ..3) #> #> Average ATT Estimate (p Value for Joint Null): -0.034 ( 0.319 ) #> L2 Imbalance: 0.082 #> Percent improvement from uniform weights: 55.1% #> #> Avg Estimated Bias: NA #> #> Inference type: Conformal inference #> #> Time Estimate 95% CI Lower Bound 95% CI Upper Bound p Value #> 2012.25 -0.022 -0.046 0.006 0.078 #> 2012.50 -0.046 -0.070 -0.013 0.022 #> 2012.75 -0.038 -0.062 -0.005 0.044 #> 2013.00 -0.024 -0.048 0.003 0.078 #> 2013.25 -0.033 -0.057 -0.006 0.044 #> 2013.50 -0.050 -0.074 -0.023 0.022 #> 2013.75 -0.035 -0.056 -0.010 0.022 #> 2014.00 -0.049 -0.073 -0.019 0.022 #> 2014.25 -0.047 -0.071 -0.014 0.022 #> 2014.50 -0.033 -0.057 0.000 0.056 #> 2014.75 -0.023 -0.047 0.010 0.122 #> 2015.00 -0.034 -0.061 0.004 0.078 #> 2015.25 -0.023 -0.047 0.007 0.100 #> 2015.50 -0.026 -0.053 0.007 0.100 #> 2015.75 -0.023 -0.050 0.012 0.144 #> 2016.00 -0.033 -0.066 0.008 0.089 ``` ```r plot(desyn) ``` <img src="figure/fig_desyn-1.png" title="plot of chunk fig_desyn" alt="plot of chunk fig_desyn" style="display: block; margin: auto;" /> We can incorporate other outcome models by changing the `progfunc`. Several outcome models are available, including fitting the factor model directly with `gsynth`, general elastic net regression, Bayesian structural time series estimation with `CausalImpact`, and matrix completion with `MCPanel`. For each outcome model you can supply an optional set of parameters; see the documentation for details.
44.716895
562
0.548249
eng_Latn
0.747597
2ddb640c2ae0d9e5dbfc03d77ee826135dbfdffe
3,758
md
Markdown
documentation/wisefy/com.isupatches.android.wisefy.frequency/-frequency-util/index.md
isuPatches/WiseFy
f1deae764e57e45d2caad40b5cd320573eee06e6
[ "Apache-2.0" ]
297
2016-04-26T21:35:25.000Z
2021-05-20T11:15:26.000Z
documentation/wisefy/com.isupatches.android.wisefy.frequency/-frequency-util/index.md
isuPatches/WiseFy
f1deae764e57e45d2caad40b5cd320573eee06e6
[ "Apache-2.0" ]
129
2016-04-28T13:56:02.000Z
2021-05-27T19:55:57.000Z
documentation/wisefy/com.isupatches.android.wisefy.frequency/-frequency-util/index.md
isuPatches/WiseFy
f1deae764e57e45d2caad40b5cd320573eee06e6
[ "Apache-2.0" ]
39
2016-11-07T05:46:05.000Z
2020-09-29T02:17:26.000Z
//[wisefy](../../../index.md)/[com.isupatches.android.wisefy.frequency](../index.md)/[FrequencyUtil](index.md) # FrequencyUtil [androidJvm]\ internal interface [FrequencyUtil](index.md) : [FrequencyApi](../-frequency-api/index.md) ## Functions | Name | Summary | |---|---| | [equals](../../com.isupatches.android.wisefy.wifi.delegates/-legacy-wifi-delegate/index.md#585090901%2FFunctions%2F1622544596) | [androidJvm]<br>open operator fun [equals](../../com.isupatches.android.wisefy.wifi.delegates/-legacy-wifi-delegate/index.md#585090901%2FFunctions%2F1622544596)(other: [Any](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-any/index.html)?): [Boolean](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-boolean/index.html) | | [getFrequency](../-frequency-api/get-frequency.md) | [androidJvm]<br>@[RequiresPermission](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresPermission.html)(value = android.permission.ACCESS_FINE_LOCATION)<br>@[RequiresApi](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresApi.html)(value = 21)<br>abstract fun [getFrequency](../-frequency-api/get-frequency.md)(): [Int](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-int/index.html)?<br>@[RequiresApi](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresApi.html)(value = 21)<br>abstract fun [getFrequency](../-frequency-api/get-frequency.md)(network: [WifiInfo](https://developer.android.com/reference/kotlin/android/net/wifi/WifiInfo.html)): [Int](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-int/index.html)<br>@[RequiresPermission](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresPermission.html)(value = android.permission.ACCESS_FINE_LOCATION)<br>@[RequiresApi](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresApi.html)(value = 21)<br>abstract fun [getFrequency](get-frequency.md)(callbacks: 
[GetFrequencyCallbacks](../../com.isupatches.android.wisefy.callbacks/-get-frequency-callbacks/index.md)?) | | [hashCode](../../com.isupatches.android.wisefy.wifi.delegates/-legacy-wifi-delegate/index.md#1794629105%2FFunctions%2F1622544596) | [androidJvm]<br>open fun [hashCode](../../com.isupatches.android.wisefy.wifi.delegates/-legacy-wifi-delegate/index.md#1794629105%2FFunctions%2F1622544596)(): [Int](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-int/index.html) | | [isNetwork5gHz](../-frequency-api/is-network5g-hz.md) | [androidJvm]<br>@[RequiresPermission](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresPermission.html)(value = android.permission.ACCESS_FINE_LOCATION)<br>@[RequiresApi](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresApi.html)(value = 21)<br>abstract fun [isNetwork5gHz](../-frequency-api/is-network5g-hz.md)(): [Boolean](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-boolean/index.html)<br>@[RequiresApi](https://developer.android.com/reference/kotlin/androidx/annotation/RequiresApi.html)(value = 21)<br>abstract fun [isNetwork5gHz](../-frequency-api/is-network5g-hz.md)(network: [WifiInfo](https://developer.android.com/reference/kotlin/android/net/wifi/WifiInfo.html)): [Boolean](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-boolean/index.html) | | [toString](../../com.isupatches.android.wisefy.wifi.delegates/-legacy-wifi-delegate/index.md#1616463040%2FFunctions%2F1622544596) | [androidJvm]<br>open fun [toString](../../com.isupatches.android.wisefy.wifi.delegates/-legacy-wifi-delegate/index.md#1616463040%2FFunctions%2F1622544596)(): [String](https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-string/index.html) | ## Inheritors | Name | |---| | [WisefyFrequencyUtil](../-wisefy-frequency-util/index.md) |
163.391304
1,302
0.773284
yue_Hant
0.299577
2ddbdb5d0b2c8778a049566b16d122a7f61ad891
1,713
md
Markdown
content/circonus/integrations/library/ping-legacy.md
circonus/docs
7bfd4dc7a04360fad85461a68874f435b60ea6c5
[ "MIT" ]
null
null
null
content/circonus/integrations/library/ping-legacy.md
circonus/docs
7bfd4dc7a04360fad85461a68874f435b60ea6c5
[ "MIT" ]
11
2020-04-13T03:40:22.000Z
2022-01-24T20:48:49.000Z
content/circonus/integrations/library/ping-legacy.md
circonus/docs
7bfd4dc7a04360fad85461a68874f435b60ea6c5
[ "MIT" ]
1
2021-12-13T18:39:07.000Z
2021-12-13T18:39:07.000Z
--- title: Ping logo_light: "/images/circonus/library/ping.svg" layout: integration legacy: true implementation: broker module: ping_icmp tags: - protocol - system - availability - network - latency --- # Ping ## Overview The Ping Check tests the availability of a host on a network and measures the round-trip time for messages exchanged with that host through the Internet Control Message Protocol (ICMP). Ping sends ICMP echo request packets to the target host and then waits for a response. It measures the elapsed time between the initial transmission and receiving the response (the round-trip time) and records any packet loss. ## Configuration Note that the timeout should be adjusted appropriately when changing the number of packets and/or the packet interval. A minimum of packets × interval is required to ensure all packets can be sent before a timeout occurs. Optional parameters: |Name|Description| |----|-----------| |count|The number of ICMP echo request packets to send (default: 5).| |interval_seconds|The time, in seconds, between sending each ICMP echo request packet (default: 2).| |timeout_seconds|The maximum time, in seconds, to wait for responses to all echo request packets (default: 10).| ## Metrics Typical metrics include: |Name|Type|Description| |----|----|-----------| |count|numeric|The number of ICMP packets sent.| |available|numeric|The percentage of ICMP requests that received a reply.| |minimum|numeric|The lowest latency for a request-response among the responses received.| |maximum|numeric|The highest latency for a request-response among the responses received.| |average|numeric|The average response latency over all the requests sent by the check.|
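The packets × interval constraint above can be sanity-checked with a short sketch (illustrative only; `min_timeout` is not part of the broker code):

```python
def min_timeout(count=5, interval_seconds=2):
    """Minimum timeout_seconds needed so all ICMP echo requests can be
    sent before the check times out (packets x interval)."""
    return count * interval_seconds

# With the defaults (5 packets at a 2-second interval), at least a
# 10-second timeout is required, matching the default timeout_seconds.
assert min_timeout() == 10
```

For example, raising `count` to 10 at the default 2-second interval would require raising `timeout_seconds` to at least 20.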
37.23913
226
0.771745
eng_Latn
0.985052
2dde18ce9ceb66924182af97a31cfd3e69aff48b
1,748
md
Markdown
_posts/2019-02-28-ROVO-Robust-Omnidirectional-Visual-Odometryfor-Wide-baseline-Wide-FOV-Camera-Systems.md
AMDS123/papers
80ccfe8c852685e4829848229b22ba4736c65a7c
[ "MIT" ]
7
2018-02-11T01:50:19.000Z
2020-01-14T02:07:17.000Z
_posts/2019-02-28-ROVO-Robust-Omnidirectional-Visual-Odometryfor-Wide-baseline-Wide-FOV-Camera-Systems.md
AMDS123/papers
80ccfe8c852685e4829848229b22ba4736c65a7c
[ "MIT" ]
null
null
null
_posts/2019-02-28-ROVO-Robust-Omnidirectional-Visual-Odometryfor-Wide-baseline-Wide-FOV-Camera-Systems.md
AMDS123/papers
80ccfe8c852685e4829848229b22ba4736c65a7c
[ "MIT" ]
4
2018-02-04T15:58:04.000Z
2019-08-29T14:54:14.000Z
--- layout: post title: "ROVO: Robust Omnidirectional Visual Odometry for Wide-baseline Wide-FOV Camera Systems" date: 2019-02-28 15:29:27 categories: arXiv_CV tags: arXiv_CV Pose_Estimation Optimization author: Hochang Seok, Jongwoo Lim mathjax: true --- * content {:toc} ##### Abstract In this paper we propose a robust visual odometry system for a wide-baseline camera rig with wide field-of-view (FOV) fisheye lenses, which provides full omnidirectional stereo observations of the environment. For more robust and accurate ego-motion estimation we add three components to the standard VO pipeline: 1) a hybrid projection model for improved feature matching, 2) a multi-view P3P RANSAC algorithm for pose estimation, and 3) online update of rig extrinsic parameters. The hybrid projection model combines the perspective and cylindrical projections to maximize the overlap between views and minimize the image distortion that degrades feature matching performance. The multi-view P3P RANSAC algorithm extends the conventional P3P RANSAC to multi-view images so that all feature matches in all views are considered in the inlier counting for robust pose estimation. Finally, the online extrinsic calibration is seamlessly integrated into the backend optimization framework so that changes in camera poses due to shocks or vibrations can be corrected automatically. The proposed system is extensively evaluated on synthetic datasets with ground truth and on real sequences of highly dynamic environments, and its superior performance is demonstrated. ##### Abstract (translated by Google) ##### URL [http://arxiv.org/abs/1902.11154](http://arxiv.org/abs/1902.11154) ##### PDF [http://arxiv.org/pdf/1902.11154](http://arxiv.org/pdf/1902.11154)
67.230769
1,261
0.803776
eng_Latn
0.974046
2dde20528db7e3a6d04b1b82cbd83284cfeef0cd
31
md
Markdown
README.md
rdm123/go-travel
1c99684586fe3607c64fcbdcbc891b2b3ba4c786
[ "Apache-2.0" ]
null
null
null
README.md
rdm123/go-travel
1c99684586fe3607c64fcbdcbc891b2b3ba4c786
[ "Apache-2.0" ]
null
null
null
README.md
rdm123/go-travel
1c99684586fe3607c64fcbdcbc891b2b3ba4c786
[ "Apache-2.0" ]
null
null
null
# go-travel Travelling Website
10.333333
18
0.806452
eng_Latn
0.939034
2dde2c6e60f0a3c58e7ae90bf53e25665b91f36f
923
md
Markdown
content/versions/_v0.2.0.md
D2Allaire/luma-docs
87fb72781cb3e173b7f5951b63babe50822c97c2
[ "MIT" ]
1
2018-03-08T21:50:31.000Z
2018-03-08T21:50:31.000Z
content/versions/_v0.2.0.md
D2Allaire/luma-docs
87fb72781cb3e173b7f5951b63babe50822c97c2
[ "MIT" ]
null
null
null
content/versions/_v0.2.0.md
D2Allaire/luma-docs
87fb72781cb3e173b7f5951b63babe50822c97c2
[ "MIT" ]
null
null
null
--- title: "v0.2.0" date: 2018-03-21T22:37:25+01:00 slug: "v0-2-0" versions: ["show"] download: "https://github.com/chiiya/luma" --- - Add default flag to font-stack. This way you can easily overwrite it in your setup. - **Important**: Use classes for radios and checkboxes. We've found that it's sometimes inconvenient to apply the custom styling for checkboxes and radios to every single `radio` or `checkbox` element in the DOM. For example, it would create issues when trying to create a checkbox toggle switch. For this reason, all radios and checkboxes must declare the `radio` or `checkbox` class to receive custom styling. Example: `<input type="checkbox">` - will _not_ receive luma styling `<input type="checkbox" class="checkbox">` - will receive luma styling - Only restrict image height on brand image. Previously, all images in your navigation would have their max-height set to `($nav-height - 2rem)`.
43.952381
118
0.746479
eng_Latn
0.995085
2ddf07b93de2923ea1c29ff7688339d7e071d267
10,011
md
Markdown
publications.md
zackspica/zackspica.github.io
9e6c90d418a169e9c7e4e96c24f0ccc59add1048
[ "MIT" ]
null
null
null
publications.md
zackspica/zackspica.github.io
9e6c90d418a169e9c7e4e96c24f0ccc59add1048
[ "MIT" ]
null
null
null
publications.md
zackspica/zackspica.github.io
9e6c90d418a169e9c7e4e96c24f0ccc59add1048
[ "MIT" ]
1
2019-09-21T20:36:40.000Z
2019-09-21T20:36:40.000Z
--- title: Publications layout: tab use_fontawesome: true use_math: true --- 20. Garza-Giron R., Brodsky E., **Spica Z.** and Haney M., Clog and Crack: Opening and Closing Behavior of a Large-Scale Explosive Eruption as Recorded by its Hidden Earthquakes, in review AGU Advances.<a href="https://drive.google.com/file/d/1rAWFBGl3bFMfyuNJAizfJfqwhsux5zX9/view?usp=sharing" target="_blank"><i class="fa fa-file-pdf"></i> Preprint</a> 19. Bahavar M., **Spica Z.**, Sáchez-Sesma F. J., Trabant C., Zandieh A., Toro G., Horizontal-to-Vertical Spectral Ratio (HVSR) IRIS Station Toolbox, in review SRL. 18. **Spica Z.**, Perton M., Martin E., Beroza G., Biondi B., Urban Seismic Site Characterization by Fiber-Optic Seismology, 2020, Journal of Geophysical Research (Solid Earth), doi:10.1029/2019JB018656. <a href="https://drive.google.com/file/d/1nf_aWkW3LmDt5YLmTvCazYMsPiZUUjIl/view?usp=sharing" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 17. Thomas A. M., **Spica Z.**, Bodmer M., Schulz W. H., Roering J. R., Using a dense seismic array to determine resonances and structure of the Two Towers earthflow in Northern California, 2020, Seismological Research Letter. <a href="https://drive.google.com/open?id=1rLbpUGc_bARfn1NDgvJlGuKSxFoOcpkf" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 16. Perton M., **Spica Z.**, Clayton R. W., Beroza G., Shear Wave Structure of a Transect of the Los Angeles Basin From Multimode Surface Waves and H/V Spectral Ratio Analysis, 2020, Geophysical Journal International, doi:10.1093/gji/ggz458. <a href="https://drive.google.com/open?id=1cLANRNwoSZufMmXlBH2QxSfYK8ePO1Jp" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 15. Lellouch A., Yuan S., **Spica Z.**, Biondi B., and Ellsworth W. L., Seismic velocity estimation using passive downhole distributed acoustic sensing records – examples from the San Andreas Fault Observatory at Depth, 2019, Journal of Geophysical Research (Solid Earth), doi:10.1029/2019JB017533. 
<a href="https://drive.google.com/open?id=1bzbStvLJYwzY00gzEFlIdumZZ4QQ6MH6" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 14. **Spica Z.**, Perton M., Nakata N., Liu X., Beroza G., Shallow Vs imaging of the Groningen area from joint inversion of multi-mode surface waves and H/V spectral ratio, Seismological Research Letter, 2018, doi:10.1785/0220180060. <a href="https://drive.google.com/open?id=1ILlYR0tWPUXAmGZNMc1Rlad9hbLsDD4C" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 13. **Spica Z.**, Nakata N., Liu X., Campman X., Zijian T., Beroza G., The Ambient Seismic Field Analysis at Groningen Gas Field: an overview from the surface to reservoir depth, Seismological Research Letter, 2018, doi: 10.1785/0220170256. <a href="https://drive.google.com/open?id=14t4HPAy5X3wiFxCRIH6lD9nJcBEGIIDX" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 12. Pritchard M. E., de Silva S. L., Michelfelder G., Zandt G., McNutt S. R., Gottsmann J., West M. E., Blundy J., Christensen D. H., Finnegan N. J., Minaya E., Sparks R. S. J., Sunagua M., Unsworth M. J., Comeau M. J., del Potro R., Diez M. , Farrell A. , Henderson S. T., Jay J. A., Naranjo J. A., McFarlin H., Muir D., Perkins J. P., Wilder A., Ward K. M., **Spica Z.**, Legrand D., PLUTONS: Investigating the Relationship Between Pluton Growth and Volcanism in the central Andes, Geosphere, 2018, doi:10.1130/GES01578.1.<a href="https://drive.google.com/open?id=1Qww0bsn89-ICiI_LR2vKZPH9Ijpl45kq" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 11. Melgar D., Pérez-Campos X., Ramirez-Guzman L., **Spica Z.**, Castro V. H., Hammond W. C., Cabral-Cano E., Bend Faulting at the Edge of a Flat Slab During the 2017 M w 7.1 Puebla-Morelos, Mexico Earthquake, Geophysical Research Letters, 2018, 10.1002/2017GL076895. <a href="https://drive.google.com/file/d/1L9B8wwSCljp13HcF_3yCh4vdMpisKNCD/view?usp=sharing" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 10. 
**Spica Z.**, Perton M., Nakata N., Liu X., Beroza G., Site Characterization at Groningen Gas Field Area Through Joint Surface-Borehole H/V Analysis, Geophysical Journal International, 2017, doi:10.1093/gji/ggx426. <a href="https://drive.google.com/open?id=1WUQirEXB--DCrTDEyf7i-7rakFP_YAsC" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 9. **Spica Z.**, Perton M., Beroza G., Lateral Heterogeneity Imaged by Small-Aperture ScS Retrieval from the Ambient Seismic Field, Geophysical Research Letters, 2017, doi:10.1002/2017GL073230. <a href="https://drive.google.com/open?id=1_USYAd2T_Me5Xmnpj90jiqB70zHBJKkW" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 8. Perton M., **Spica Z.**, Caudron C., Inversion of the horizontal to vertical spectral ratio in presence of strong lateral heterogeneity, Geophysical Journal International, 2017. doi:10.1093/gji/ggx458. <a href="https://drive.google.com/open?id=1Bw64ONuJcJGephh5eukC2zG7hJEc0aC7" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 7. **Spica Z.**, Perton M., Legrand D., Anatomy of the Colima Volcano magmatic system, Mexico, Earth and Planetary Science Letters, 459, 1-13, 2017, doi:10.1016/j.epsl.2016.11.010. <a href="https://drive.google.com/open?id=1iFkRp03y54JDzip4jowq30Ou5Dwp_QaZ" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 6. **Spica Z.**, Perton M., Calò M., Legrand D., Córdoba-Montiel F. and Iglesias A., 3-D shear wave velocity model of Mexico and South US: bridging seismic networks with ambient noise cross-correlations ($$C^1$$) and correlation of coda of correlations ($$C^3$$), Geophysical Journal International, 206(3),1795-1813, 2016. doi:10.1093/gji/ggw240. <a href="https://drive.google.com/file/d/1VpnLGRPXc2c2VhYlfmyeVx0nQOv0lvJG/view?usp=sharing" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 5. **Spica Z.**, Caudron C., Perton M., Lecocq T., Camelbeeck T., Legrand D., Piña-Flores J., Iglesias A., Syahbana D. 
K., Velocity models and site effects at Kawah Ijen volcano and Ijen caldera (Indonesia) determined from ambient noise cross-correlations and directional energy density spectral ratios, Journal of Volcanology and Geothermal Research, 302, 173-189, 2015, doi:10.1016/j.jvolgeores.2015.06.016. <a href="https://drive.google.com/open?id=1Fx2EZ5XySoA2tK8BbYAuJQAhZdeK-Bfg" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 4. **Spica Z.**, Legrand D., Iglesias A., Dahm T., Walter T., Heimann, S., Froger J-L., Rémy D., West J., Pardo M., Hydrothermal and magmatic reservoirs of the Lazufre volcanic area revealed from a high-resolution seismic noise tomography, Earth and Planetary Science Letters, 421, 27-38, 2015, doi:10.1016/j.epsl.2015.03.042. <a href="https://drive.google.com/open?id=1SpkqeQ5s06_nlkWMkglfsKwQamK_PDB-" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 3. **Spica Z.**, Cruz-Atienza V., Reyes-Alfaro G., Legrand D., Iglesias A., Crustal Imaging of Western-Michoacan and the Jalisco Block, Mexico, from Ambient Seismic Noise, Journal of Volcanology and Geothermal Research, 289, 193-201, 2014, doi:10.1016/j.jvolgeores.2014.11.005. <a href="https://drive.google.com/open?id=1Iis7SetG1N_EqZ7xM0X1cibmsazhIrC7" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 2. Córdoba-Montiel F., Iglesias A., Singh S.K., **Spica Z.**, Legrand D., Tomography of Rayleigh wave group velocity for Eastern Mexico and the Isthmus of Tehuantepec, Boletín de la Sociedad Geológica Mexicana, 66.3, 441-457, 2014. <a href="https://drive.google.com/open?id=1KxIS6RNhbPRqtvk3vUQtc7J855MTMaUz" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 1. UNAM Seismology Group, Ometepec-Pinotepa Nacional, Mexico Earthquake of 20 March 2012 (Mw7.5): A preliminary report, Geofísica International, 52.2, 173-196, 2013, doi:10.1016/S0016-7169(13)71471-5. 
<a href="https://drive.google.com/open?id=19vn1zOUvAtrgaj4cSmYjuEPV6tMUPFBj" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> {: reversed="reversed"} <h2>Book chapters</h2> 1. Legrand D., Singh S.K., Scolamacchia T., Espíndola J.M., Lermo J., Jiménez Z., **Spica Z.**, Valenzuela R. W., Valdés-González C., Strong Volcano-Tectonic feed-back interactions revealed by the seismicity before and after the 1982 El Chichón eruptive events, Active Volcanoes of Chiapas (México): El Chichón and Tacaná, VOLCANOES OF THE WORLD, Chapter 7, Book Serie: Springer-Verlag Berlin Heidelberg, 97-114, 2015, doi:10.1007/978-3-642-25890-9 5. <a href="https://drive.google.com/open?id=1fwvABaZV1Osts-aympChGMme9EqCjXEZ" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> <h2>Expanded abstracts</h2> 4. Huot F., Martin E., **Spica Z.**, Biondi B., Distributed Acoustic Sensing (DAS) for large-scale urban monitoring and seismic hazard mitigation using preexisting telecommunication infrastructure, SEG 2019 Workshop: Geophysics for Smart City Development, Beijing, China, 2019. <a href="https://drive.google.com/open?id=1RI5KVXCtNWWD5AC0PctChKHwHZrZjrEE" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 3. Lellouch A., Yuan S., **Spica Z.**, Biondi B., Ellsworth W., Velocity analysis and moveout-based event detection using downhole DAS records, SEG Technical Program Expanded Abstracts 2019, 989-993. <a href="https://drive.google.com/open?id=1gdb0NqSlPvn0mytVM_0Q6RzyyLmrn7b3" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 2. Lellouch A., **Spica Z.**, Biondi B., Ellsworth W., Using Vertical DAS Arrays for Continuous Monitoring of Induced Seismicity, 81st EAGE Conference & Exhibition, London, UK, 2019. <a href="https://drive.google.com/open?id=1RPHGi-a-Vp6I77JqCxVTrTmfeet5e5d4" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> 1. 
Lellouch A., Yuan S., **Spica Z.**, Biondi B., Ellsworth W., A moveout-based method for the detection of weak seismic events using downhole DAS arrays, 81st EAGE Conference & Exhibition, London, UK, 2019.<a href="https://drive.google.com/open?id=1m4wcaw_BzMBIcYrgny-flGE88qJnBvcm" target="_blank"><i class="fa fa-file-pdf"></i> Paper</a> {: reversed="reversed"}
156.421875
657
0.740585
yue_Hant
0.241315
2de09bf87a11bb3801d952e5e9a1a5ffee31dbb8
1,663
md
Markdown
windows-driver-docs-pr/dashboard/user-mode-reliability-for-crashes-in-windows-components.md
josephpracharmsft/windows-driver-docs
1aade029928ee88429924a993047b7725246c352
[ "CC-BY-4.0", "MIT" ]
1
2020-08-26T23:43:32.000Z
2020-08-26T23:43:32.000Z
windows-driver-docs-pr/dashboard/user-mode-reliability-for-crashes-in-windows-components.md
josephpracharmsft/windows-driver-docs
1aade029928ee88429924a993047b7725246c352
[ "CC-BY-4.0", "MIT" ]
1
2021-01-21T17:24:17.000Z
2021-01-21T17:24:17.000Z
windows-driver-docs-pr/dashboard/user-mode-reliability-for-crashes-in-windows-components.md
josephpracharmsft/windows-driver-docs
1aade029928ee88429924a993047b7725246c352
[ "CC-BY-4.0", "MIT" ]
2
2020-08-11T00:01:58.000Z
2021-11-24T02:51:30.000Z
--- title: Number of user-mode crashes in Windows Components, normalized by population, is less than or equal to the baseline goal description: The measure aggregates telemetry from a 7-day sliding window into a ratio of crashes in Windows Components, caused by the graphics drivers, over all machines with the driver ms.topic: article ms.date: 08/08/2019 ms.localizationpriority: medium --- # Number of user-mode crashes in Windows Components, normalized by population, is less than or equal to the baseline goal ## Description This measure monitors how often Windows Components (e.g. dwm.exe, shell, logon UI, etc.) crash in the display driver, relative to the number of all machines using the driver. If a Windows Component crashes, the user must wait for it to recover before being able to use it again. ## Measure attributes |Attribute|Value| |----|----| |**Audience**|Expanded| |**Time period**|7-day sliding window| |**Measurement criteria**|Aggregation of machines| |**Minimum instances**|10,000 machines| |**Passing criteria**|<= 15 crashes per 10,000 machines| |**Measure ID**|22725967| ## Calculation 1. The measure aggregates telemetry from a 7-day sliding window into a **ratio of crashes in Windows Components, caused by the graphics drivers, over all machines with the driver** 2. *Total Crashes in Windows Components = Count(Windows Component crashes on machines that have the driver)* 3. *Total Devices = Sum(Machines that have the driver)* ### Final Calculation 4. *Crashes in Windows Components Normalized by device count = Total Crashes in Windows Components * 10,000 / Total Devices*
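The calculation steps above reduce to a single normalization against a 10,000-machine population, compared against the passing criteria. A minimal sketch (the function names are hypothetical; the measure itself is computed from aggregated telemetry, not by callers):

```python
def crashes_per_10k(total_crashes: int, total_devices: int) -> float:
    """Windows Component crashes normalized to a 10,000-machine population."""
    if total_devices < 10_000:
        # The measure requires a minimum population of 10,000 machines.
        raise ValueError("measure requires at least 10,000 machines")
    return total_crashes * 10_000 / total_devices


def measure_passes(total_crashes: int, total_devices: int,
                   goal: float = 15.0) -> bool:
    """Passing criteria: <= 15 crashes per 10,000 machines."""
    return crashes_per_10k(total_crashes, total_devices) <= goal
```

For example, 30 crashes across 20,000 machines normalizes to exactly the 15-per-10,000 goal, so the measure passes; 40 crashes on the same population does not.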
47.514286
291
0.769092
eng_Latn
0.990201
2de12b193fca4281ff3b78e1873f86de243961ef
911
md
Markdown
README.md
jeremyhamm/oauth2-credentials-generator
d8072f0091a220a93401b185189d7547608acfb4
[ "MIT" ]
2
2019-11-24T03:49:05.000Z
2020-07-18T22:41:30.000Z
README.md
jeremyhamm/oauth2-credentials-generator
d8072f0091a220a93401b185189d7547608acfb4
[ "MIT" ]
null
null
null
README.md
jeremyhamm/oauth2-credentials-generator
d8072f0091a220a93401b185189d7547608acfb4
[ "MIT" ]
null
null
null
# Oauth2 Credentials Generator ## Description A small, cryptographically secure library, with ***zero*** dependencies, for generating [***client_id***](https://tools.ietf.org/html/rfc6749#appendix-A.1) and [***client_secret***](https://tools.ietf.org/html/rfc6749#appendix-A.2) for your oauth2 application. ## Installation `npm i jeremyhamm/oauth2-credentials-generator` ## Usage #### Client Id ``` const credentials = require('oauth2-credentials-generator'); const client_id = credentials.clientId(); ``` #### Client Id with optional name This will append `name + _` to the returned client id. Useful if creating client ids for multiple clients. ``` const credentials = require('oauth2-credentials-generator'); const client_id = credentials.clientId('name'); ``` #### Client Secret ``` const credentials = require('oauth2-credentials-generator'); const client_secret = credentials.clientSecret(); ```
30.366667
260
0.739846
eng_Latn
0.740415
2de21ddb8344db7894e9b6028153ba54b9afdc1d
46
md
Markdown
README.md
UnderC/ulangts
cd87545f0ad7cae4b193ce046547a108e65acddc
[ "MIT" ]
null
null
null
README.md
UnderC/ulangts
cd87545f0ad7cae4b193ce046547a108e65acddc
[ "MIT" ]
null
null
null
README.md
UnderC/ulangts
cd87545f0ad7cae4b193ce046547a108e65acddc
[ "MIT" ]
null
null
null
# ulangts Do you want global js programming?
15.333333
35
0.76087
eng_Latn
0.795212
2de28bae71d81764d1f03625dc7ae728397abe07
1,888
md
Markdown
public/markdown/testimonials.md
drandell/hannah-website
8058153659b530f62dd55836de14f5dfe7af741b
[ "MIT" ]
null
null
null
public/markdown/testimonials.md
drandell/hannah-website
8058153659b530f62dd55836de14f5dfe7af741b
[ "MIT" ]
null
null
null
public/markdown/testimonials.md
drandell/hannah-website
8058153659b530f62dd55836de14f5dfe7af741b
[ "MIT" ]
null
null
null
> `Hannah has worked with us here at Freihandel Commerce as a freelance translator. After the first translation she became one of our most valuable freelance employees, the person we turned to with questions and special projects regarding specific translations for the Amazon UK market. In her time with us, she has demonstrated the personal, organisational and interpersonal skills that make her a truly exceptional translator. Her professional and communication skills made our working relationship very pleasant.` _Eliakim, Camden Barbershop Company_ > `Hannah works very independently and conscientiously. Her texts are of very high quality and to my complete satisfaction. I can always rely on her and know that she can implement projects quickly and the quality of her work is excellent. Hannah is a reliable and vital team member at PlusDent.` _Korenia, PlusDental_ > `It was such a pleasure working with Hannah. She was tasked with translating the book ‘Histoires de l’alimentation’ from French into English within 3 months. We are very happy with Hannah’s work. Not only is it an accurate translation, but she even completed every chapter ahead of her deadlines and always communicated very well. Well done Hannah! We look forward to working with you again.` _Saori, Cookpad_ <br> > `Hannah did a flawless job as usual. We are very satisfied with her work and will definitely continue to work with her in the future. It was a very good decision to hire her.` _Client_ <br> > `Hannah did an extraordinary job! She helped us translate some texts for our Amazon pages and knew how to use the right language. Her work is impeccable, if you have doubts she clarifies them and if you have changes she makes them for you. It was also very pleasant to work with her, as in a short time she answered the questions we had. We will definitely work with her again!` _Client_ <br>
75.52
516
0.798199
eng_Latn
0.999956
2de2ab3660dea2a5767ecd5beeabe0593f3c3a8b
43,169
md
Markdown
docs/odbc/reference/appendixes/statement-transitions.md
baleng/sql-docs.it-it
80bb05c3cc6a68564372490896545d6211a9fa26
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/odbc/reference/appendixes/statement-transitions.md
baleng/sql-docs.it-it
80bb05c3cc6a68564372490896545d6211a9fa26
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/odbc/reference/appendixes/statement-transitions.md
baleng/sql-docs.it-it
80bb05c3cc6a68564372490896545d6211a9fa26
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Transizioni di istruzione | Microsoft Docs ms.custom: '' ms.date: 01/19/2017 ms.prod: sql ms.prod_service: connectivity ms.reviewer: '' ms.technology: connectivity ms.topic: conceptual helpviewer_keywords: - transitioning states [ODBC], statement - state transitions [ODBC], statement - statement transitions [ODBC] ms.assetid: 3d70e0e3-fe83-4b4d-beac-42c82495a05b author: MightyPen ms.author: genemi manager: craigg ms.openlocfilehash: df4a73890841b4a72dbffa0d5a5ae934bf618f16 ms.sourcegitcommit: 61381ef939415fe019285def9450d7583df1fed0 ms.translationtype: MT ms.contentlocale: it-IT ms.lasthandoff: 10/01/2018 ms.locfileid: "47649844" --- # <a name="statement-transitions"></a>Transizioni di istruzione Istruzioni ODBC presentano gli stati seguenti. |State|Description| |-----------|-----------------| |S0|Istruzione non allocato. (Lo stato di connessione deve essere C4, C5 o C6. Per altre informazioni, vedere [transizioni di connessione](../../../odbc/reference/appendixes/connection-transitions.md).)| |S1|Istruzione allocata.| |S2|Istruzione preparata. Non verrà creato alcun set di risultati.| |S3|Istruzione preparata. Verrà creato un set di risultati (eventualmente vuota).| |S4|È stato creato alcun set di risultati e istruzione eseguita.| |S5|Istruzione di esecuzione e un set di risultati (eventualmente vuota) è stato creato. Il cursore è aperta e posizionata prima della prima riga del set di risultati.| |S6|Cursore con posizionato **SQLFetch** oppure **SQLFetchScroll**.| |S7|Cursore con posizionato **SQLExtendedFetch**.| |S8|Funzione necessita di dati. **SQLParamData** non è stato chiamato.| |S9|Funzione necessita di dati. **SQLPutData** non è stato chiamato.| |S10|Funzione necessita di dati. **SQLPutData** è stato chiamato.| |S11|Ancora in esecuzione. Un'istruzione rimane in questo stato dopo che una funzione che viene eseguita in modo asincrono restituisce SQL_STILL_EXECUTING. 
Un'istruzione è temporaneamente in questo stato durante qualsiasi funzione che accetta che un handle di istruzione è in esecuzione. Residenza temporaneo nello stato S11 non viene visualizzata in tutte le tabelle di stato, ad eccezione della tabella dello stato per **SQLCancel**. Mentre un'istruzione è temporaneamente in stato S11, la funzione può essere annullata chiamando **SQLCancel** da un altro thread.| |S12|L'esecuzione asincrona annullata. In S12, un'applicazione deve chiamare la funzione annullata fino a quando non viene restituito un valore diverso da SQL_STILL_EXECUTING. La funzione è stata annullata correttamente solo se la funzione restituisce SQL_ERROR e SQLSTATE HY008 (operazione annullata). Se viene restituito qualsiasi altro valore, ad esempio SQL_SUCCESS, l'operazione di annullamento non è riuscita e la funzione eseguita normalmente.| Stati S2 e S3 sono noti come gli stati preparati, afferma S5 tramite S7 come il cursore indica gli stati S8 tramite S10 come stati dei dati è necessario e gli stati S11 e S12 come gli stati asincroni. In ognuno di questi gruppi, le transizioni vengono visualizzate separatamente solo quando sono diversi per ogni stato del gruppo; Nella maggior parte dei casi, le transizioni per ogni stato in ogni un gruppo sono uguali. Le tabelle seguenti mostrano come ogni funzione ODBC influisce sullo stato istruzione. 
## <a name="sqlallochandle"></a>Funzione SQLAllocHandle |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |--[1], [5], [6]|--[5]|--[5]|--[5]|--[5]|--[5]|--[5]| |--[2], [5]|--[5]|--[5]|--[5]|--[5]|--[5]|--[5]| |S1 [3]|--[5]|--[5]|--[5]|--[5]|--[5]|--[5]| |--[4], [5]|--[5]|--[5]|--[5]|--[5]|--[5]|--[5]| [1] questa riga Mostra le transizioni quando *HandleType* era SQL_HANDLE_ENV. [2] questa riga Mostra le transizioni quando *HandleType* era SQL_HANDLE_DBC. [3] questa riga Mostra le transizioni quando *HandleType* era SQL_HANDLE_STMT. [4] questa riga Mostra le transizioni quando *HandleType* era SQL_HANDLE_DESC. [5] chiamante **SQLAllocHandle** con *OutputHandlePtr* che punta a un handle valido sovrascrive tale handle senza tener conto per il contenuto precedente a quella di gestire e può causare problemi per i driver ODBC. Si tratta di programmazione di applicazioni ODBC non corretta per chiamare **SQLAllocHandle** due volte alla stessa variabile di applicazione definita per *\*OutputHandlePtr* senza chiamare **SQLFreeHandle** per liberare l'handle prima la riallocazione. Sovrascrittura ODBC gli handle in questo modo potrebbero causare un comportamento non coerente o errori da parte dei driver ODBC. 
## <a name="sqlbindcol"></a>SQLBindCol |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |IH|--|--|--|--|HY010|HY010| ## <a name="sqlbindparameter"></a>SQLBindParameter |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |IH|--|--|--|--|HY010|HY010| ## <a name="sqlbrowseconnect-sqlconnect-and-sqldriverconnect"></a>SQLBrowseConnect SQLConnect e SQLDriverConnect |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |08002|08002|08002|08002|08002|08002|08002| ## <a name="sqlbulkoperations"></a>SQLBulkOperations |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |IH|HY010|HY010|24000|Vedere la tabella seguente|HY010|O HY010 NS [c]| ## <a name="sqlbulkoperations-cursor-states"></a>SQLBulkOperations (cursore stati) |S5<br /><br /> 
Aperto|S6<br /><br /> SQLFetch o SQLFetchScroll|S7<br /><br /> SQLExtendedFetch| |-------------------|---------------------------------------|-----------------------------| |-[s] S8 [d] S11 [x]|-[s] S8 [d] S11 [x]|HY010| ## <a name="sqlcancel"></a>SQLCancel |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |IH|--|--|--|--|S1 [1] S2 [nr] e [2] [r] S3 e S5 [2] [3] e [5] S6 ([3] o [4]) e S7 [6] [4] e [7]|Vedere la tabella seguente| [1] **SQLExecDirect** restituito SQL_NEED_DATA. [2] **SQLExecute** restituito SQL_NEED_DATA. [3] **SQLBulkOperations** restituito SQL_NEED_DATA. [4] **SQLSetPos** restituito SQL_NEED_DATA. [5] **SQLFetch**, **SQLFetchScroll**, o **SQLExtendedFetch** non fosse stata chiamata. [6] **SQLFetch** oppure **SQLFetchScroll** fosse stata chiamata. [7] **SQLExtendedFetch** fosse stata chiamata. ## <a name="sqlcancel-asynchronous-states"></a>SQLCancel (Stati asincroni) |S11<br /><br /> Ancora in esecuzione|S12<br /><br /> Asynch annullata| |-----------------------------|-----------------------------| |S12 NS [1] [2]|S12| [1] l'istruzione è stata temporaneamente nello stato S11 mentre eseguiva una funzione. **SQLCancel** è stato chiamato da un thread diverso. [2] l'istruzione si trovava nello stato S11 perché una funzione chiamata in modo asincrono restituito SQL_STILL_EXECUTING. 
## <a name="sqlclosecursor"></a>SQLCloseCursor |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |IH|24000|24000|24000|S1 [np] S3 [p]|HY010|HY010| ## <a name="sqlcolattribute"></a>SQLColAttribute |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |IH|HY010|Vedere la tabella seguente|24000|-[s] S11 [x]|HY010|O HY010 NS [c]| ## <a name="sqlcolattribute-prepared-states"></a>SQLColAttribute (preparati stati) |S2<br /><br /> Nessun risultato|S3<br /><br /> Risultati| |-----------------------|--------------------| |--[1] 07005[2]|-[s] S11 x| [1] *FieldIdentifier* era SQL_DESC_COUNT. [2] *FieldIdentifier* SQL_DESC_COUNT non è stato. 
## <a name="sqlcolumnprivileges-sqlcolumns-sqlforeignkeys-sqlgettypeinfo-sqlprimarykeys-sqlprocedurecolumns-sqlprocedures-sqlspecialcolumns-sqlstatistics-sqltableprivileges-and-sqltables"></a>SQLColumnPrivileges SQLColumns, SQLForeignKeys, SQLGetTypeInfo, SQLPrimaryKeys, SQLProcedureColumns, SQLProcedures, SQLSpecialColumns, SQLStatistics, SQLTablePrivileges e SQLTables |S0<br /><br /> Non allocato|S1<br /><br /> allocato|S2, S3<br /><br /> Prepared|S4<br /><br /> eseguito|S5 – S7<br /><br /> Cursore|S8 – S10<br /><br /> Dati necessari|S11-S12<br /><br /> Async| |------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------| |(IH)|S5 [s] S11 [x]|S1 [e] S5 [s] S11 [x]|S1 [e] e [1] S5 [s] e [1] S11 [x] e [1] 24000 [2]|Vedere la tabella seguente|HY010|O HY010 NS [c]| [1] il risultato corrente è più recente o solo risultato o non sono presenti risultati correnti. Per altre informazioni sui risultati di più, vedere [più risultati](../../../odbc/reference/develop-app/multiple-results.md). [2] il risultato corrente non è l'ultimo risultato. 
## <a name="sqlcolumnprivileges-sqlcolumns-sqlforeignkeys-sqlgettypeinfo-sqlprimarykeys-sqlprocedurecolumns-sqlprocedures-sqlspecialcolumns-sqlstatistics-sqltableprivileges-and-sqltables-cursor-states"></a>SQLColumnPrivileges, SQLColumns, SQLForeignKeys, SQLGetTypeInfo, SQLPrimaryKeys, SQLProcedureColumns, SQLProcedures, SQLSpecialColumns, SQLStatistics, SQLTablePrivileges, and SQLTables (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|24000|24000 [1]|24000|

[1] This error is returned by the Driver Manager if **SQLFetch** or **SQLFetchScroll** has not returned SQL_NO_DATA, and it is returned by the driver if **SQLFetch** or **SQLFetchScroll** has returned SQL_NO_DATA.

## <a name="sqlcopydesc"></a>SQLCopyDesc

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH [1]|--|--|--|--|HY010|NS [c] and [3] HY010 [o] and [4]|
|IH [2]|HY010|See the following table|24000|-- [s] S11 [x]|HY010|NS [c] and [3] HY010 [o] and [4]|

[1] This row shows the transitions when the *SourceDescHandle* argument was an ARD, APD, or IPD.

[2] This row shows the transitions when the *SourceDescHandle* argument was an IRD.

[3] The *SourceDescHandle* and *TargetDescHandle* arguments were the same as those in the **SQLCopyDesc** function that is executing asynchronously.

[4] Either the *SourceDescHandle* argument or the *TargetDescHandle* argument (or both) was different from the one in the **SQLCopyDesc** function that is executing asynchronously.
## <a name="sqlcopydesc-prepared-states"></a>SQLCopyDesc (Prepared States)

|S2<br /><br /> No Results|S3<br /><br /> Results|
|-----------------------|--------------------|
|24000 [1]|-- [s] S11 [x]|

[1] This row shows the transitions when the *SourceDescHandle* argument was an IRD.

## <a name="sqldatasources-and-sqldrivers"></a>SQLDataSources and SQLDrivers

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|--|--|--|--|--|

## <a name="sqldescribecol"></a>SQLDescribeCol

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|See the following table|24000|-- [s] S11 [x]|HY010|NS [c] HY010 [o]|

## <a name="sqldescribecol-prepared-states"></a>SQLDescribeCol (Prepared States)

|S2<br /><br /> No Results|S3<br /><br /> Results|
|-----------------------|--------------------|
|07005|-- [s] S11 [x]|

## <a name="sqldescribeparam"></a>SQLDescribeParam

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|-- [s] S11 [x]|HY010|HY010|HY010|NS [c] HY010 [o]|

## <a name="sqldisconnect"></a>SQLDisconnect

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|-- [1]|S0 [1]|S0 [1]|S0 [1]|S0 [1]|(HY010)|(HY010)|

[1] Calling **SQLDisconnect** frees all statements associated with the connection. It also returns the connection state to C2; the connection state must be C4 before the statement state is S0.

## <a name="sqlendtran"></a>SQLEndTran

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|-- [2] or [3] S1 [1]|-- [3] [np] S1 [np] and ([1] or [2]) S1 [p] and [1] S2 [p] and [2]|-- [3] [np] S1 [np] and ([1] or [2]) S1 [p] and [1] S3 [p] and [2]|(HY010)|(HY010)|

[1] The *CompletionType* argument was SQL_COMMIT and **SQLGetInfo** returns SQL_CB_DELETE for the SQL_CURSOR_COMMIT_BEHAVIOR information type, or the *CompletionType* argument was SQL_ROLLBACK and **SQLGetInfo** returns SQL_CB_DELETE for the SQL_CURSOR_ROLLBACK_BEHAVIOR information type.

[2] The *CompletionType* argument was SQL_COMMIT and **SQLGetInfo** returns SQL_CB_CLOSE for the SQL_CURSOR_COMMIT_BEHAVIOR information type, or the *CompletionType* argument was SQL_ROLLBACK and **SQLGetInfo** returns SQL_CB_CLOSE for the SQL_CURSOR_ROLLBACK_BEHAVIOR information type.
[3] The *CompletionType* argument was SQL_COMMIT and **SQLGetInfo** returns SQL_CB_PRESERVE for the SQL_CURSOR_COMMIT_BEHAVIOR information type, or the *CompletionType* argument was SQL_ROLLBACK and **SQLGetInfo** returns SQL_CB_PRESERVE for the SQL_CURSOR_ROLLBACK_BEHAVIOR information type.

## <a name="sqlexecdirect"></a>SQLExecDirect

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|(IH)|S4 [s] and [nr] S5 [s] and [r] S8 [d] S11 [x]|-- [e] and [1] S1 [e] and [2] S4 [s] and [nr] S5 [s] and [r] S8 [d] S11 [x]|-- [e] and [1] and [3] S1 [e] and [2] and [3] S4 [s] and [nr] and [3] S5 [s] and [r] and [3] S8 [d] and [3] S11 [x] and [3] 24000 [4]|See the following table|HY010|NS [c] HY010 [o]|

[1] The error was returned by the Driver Manager.

[2] The error was not returned by the Driver Manager.

[3] The current result is the last or only result, or there are no current results. For more information about multiple results, see [Multiple Results](../../../odbc/reference/develop-app/multiple-results.md).

[4] The current result is not the last result.

## <a name="sqlexecdirect-cursor-states"></a>SQLExecDirect (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|24000|24000 [1]|24000|

[1] This error is returned by the Driver Manager if **SQLFetch** or **SQLFetchScroll** has not returned SQL_NO_DATA, and it is returned by the driver if **SQLFetch** or **SQLFetchScroll** has returned SQL_NO_DATA.
## <a name="sqlexecute"></a>SQLExecute

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|(IH)|(HY010)|See the following table|S2 [e] and [p] and [1] S4 [s] and [p] and [nr] and [1] S5 [s] and [p] and [r] and [1] S8 [d] and [p] and [1] S11 [x] and [p] and [1] 24000 [p] and [2] HY010 [np]|See the cursor state table|HY010|NS [c] HY010 [o]|

[1] The current result is the last or only result, or there are no current results. For more information about multiple results, see [Multiple Results](../../../odbc/reference/develop-app/multiple-results.md).

[2] The current result is not the last result.

## <a name="sqlexecute-prepared-states"></a>SQLExecute (Prepared States)

|S2<br /><br /> No Results|S3<br /><br /> Results|
|-----------------------|--------------------|
|S4 [s] S8 [d] S11 [x]|S5 [s] S8 [d] S11 [x]|

## <a name="sqlexecute-cursor-states"></a>SQLExecute (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|24000 [p] HY010 [np]|24000 [p] and [1] HY010 [np]|24000 [p] HY010 [np]|

[1] This error is returned by the Driver Manager if **SQLFetch** or **SQLFetchScroll** has not returned SQL_NO_DATA, and it is returned by the driver if **SQLFetch** or **SQLFetchScroll** has returned SQL_NO_DATA.
## <a name="sqlextendedfetch"></a>SQLExtendedFetch

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|S1010|S1010|24000|See the following table|S1010|NS [c] S1010 [o]|

## <a name="sqlextendedfetch-cursor-states"></a>SQLExtendedFetch (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|S7 [s] or [nf] S11 [x]|S1010|-- [s] or [nf] S11 [x]|

## <a name="sqlfetch-and-sqlfetchscroll"></a>SQLFetch and SQLFetchScroll

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|HY010|24000|See the following table|HY010|NS [c] HY010 [o]|

## <a name="sqlfetch-and-sqlfetchscroll-cursor-states"></a>SQLFetch and SQLFetchScroll (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|S6 [s] or [nf] S11 [x]|-- [s] or [nf] S11 [x]|HY010|

## <a name="sqlfreehandle"></a>SQLFreeHandle

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|-- [1]|HY010|HY010|HY010|HY010|HY010|HY010|
|IH [2]|S0|S0|S0|S0|HY010|HY010|
|-- [3]|--|--|--|--|--|--|

[1] This row shows the transitions when *HandleType* was SQL_HANDLE_ENV or SQL_HANDLE_DBC.

[2] This row shows the transitions when *HandleType* was SQL_HANDLE_STMT.

[3] This row shows the transitions when *HandleType* was SQL_HANDLE_DESC and the descriptor was explicitly allocated.

## <a name="sqlfreestmt"></a>SQLFreeStmt

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH [1]|--|--|S1 [np] S2 [p]|S1 [np] S3 [p]|HY010|HY010|
|IH [2]|--|--|--|--|HY010|HY010|

[1] This row shows the transitions when *Option* was SQL_CLOSE.

[2] This row shows the transitions when *Option* was SQL_UNBIND or SQL_RESET_PARAMS.

If the *Option* argument was SQL_DROP and the underlying driver is an ODBC 3*.x* driver, the Driver Manager maps the call to a call to **SQLFreeHandle** with *HandleType* set to SQL_HANDLE_STMT. For more information, see the transition table for [SQLFreeHandle](../../../odbc/reference/syntax/sqlfreehandle-function.md).
## <a name="sqlgetconnectattr"></a>SQLGetConnectAttr

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|--|--|--|--|--|

## <a name="sqlgetcursorname"></a>SQLGetCursorName

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|--|--|--|--|HY010|HY010|

## <a name="sqlgetdata"></a>SQLGetData

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|HY010|24000|See the following table|HY010|NS [c] HY010 [o]|

## <a name="sqlgetdata-cursor-states"></a>SQLGetData (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|24000|-- [s] or [nf] S11 [x] 24000 [b] HY109 [i]|-- [s] or [nf] S11 [x] 24000 [b] HY109 [i]|

## <a name="sqlgetdescfield-and-sqlgetdescrec"></a>SQLGetDescField and SQLGetDescRec

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|-- [1] or [2] HY010 [3]|See the following table|-- [1] or [2] 24000 [3]|-- [1] or [2] S11 [x] and [3]|HY010|NS [c] and [4] HY010 [o] and [5]|

[1] The *DescriptorHandle* argument was an APD or ARD.

[2] The *DescriptorHandle* argument was an IPD.

[3] The *DescriptorHandle* argument was an IRD.

[4] The *DescriptorHandle* argument was the same as the *DescriptorHandle* argument in the **SQLGetDescField** or **SQLGetDescRec** function that is executing asynchronously.

[5] The *DescriptorHandle* argument was different from the *DescriptorHandle* argument in the **SQLGetDescField** or **SQLGetDescRec** function that is executing asynchronously.

## <a name="sqlgetdescfield-and-sqlgetdescrec-prepared-states"></a>SQLGetDescField and SQLGetDescRec (Prepared States)

|S2<br /><br /> No Results|S3<br /><br /> Results|
|-----------------------|--------------------|
|-- [1], [2], or [3] S11 [x] and [2]|-- [1], [2], or [3] S11 [x]|

[1] The *DescriptorHandle* argument was an APD or ARD.

[2] The *DescriptorHandle* argument was an IPD.

[3] The *DescriptorHandle* argument was an IRD.

Note that these functions always return SQL_NO_DATA in the S2 state when *DescriptorHandle* was an IRD.
## <a name="sqlgetdiagfield-and-sqlgetdiagrec"></a>SQLGetDiagField and SQLGetDiagRec

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|-- [1]|--|--|--|--|--|--|
|IH [2]|-- [3]|-- [3]|--|--|-- [3]|-- [3]|

[1] This row shows the transitions when *HandleType* was SQL_HANDLE_ENV, SQL_HANDLE_DBC, or SQL_HANDLE_DESC.

[2] This row shows the transitions when *HandleType* was SQL_HANDLE_STMT.

[3] **SQLGetDiagField** always returns an error in this state when *DiagIdentifier* is SQL_DIAG_ROW_COUNT.

## <a name="sqlgetenvattr"></a>SQLGetEnvAttr

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|--|--|--|--|--|

## <a name="sqlgetfunctions"></a>SQLGetFunctions

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|--|--|--|--|--|

## <a name="sqlgetinfo"></a>SQLGetInfo

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|--|--|--|--|--|

## <a name="sqlgetstmtattr"></a>SQLGetStmtAttr

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|-- [1] 24000 [2]|-- [1] 24000 [2]|-- [1] 24000 [2]|See the following table|HY010|HY010|

[1] The statement attribute was not SQL_ATTR_ROW_NUMBER.

[2] The statement attribute was SQL_ATTR_ROW_NUMBER.

## <a name="sqlgetstmtattr-cursor-states"></a>SQLGetStmtAttr (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|-- [1] 24000 [2]|-- [1] or ([v] and [2]) 24000 [b] and [2] HY109 [i] and [2]|-- [1] or ([v] and [2]) 24000 [b] and [2] HY109 [i] and [2]|

[1] The *Attribute* argument was not SQL_ATTR_ROW_NUMBER.

[2] The *Attribute* argument was SQL_ATTR_ROW_NUMBER.

## <a name="sqlmoreresults"></a>SQLMoreResults

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|(IH)|-- [1]|-- [1]|-- [s] and [2] S1 [nf] and [np] and [4] S2 [nf] and [p] and [4] S5 [s] and [3] S11 [x]|S1 [nf] and [np] and [4] S3 [nf] and [p] and [4] S4 [s] and [2] S5 [s] and [3] S11 [x]|HY010|NS [c] HY010 [o]|

[1] The function always returns SQL_NO_DATA in this state.
[2] The next result is a row count.

[3] The next result is a result set.

[4] The current result is the last result.

## <a name="sqlnativesql"></a>SQLNativeSql

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|--|--|--|--|--|--|--|

## <a name="sqlnumparams"></a>SQLNumParams

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|-- [s] S11 [x]|-- [s] S11 [x]|-- [s] S11 [x]|HY010|NS [c] HY010 [o]|

## <a name="sqlnumresultcols"></a>SQLNumResultCols

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|-- [s] S11 [x]|-- [s] S11 [x]|-- [s] S11 [x]|HY010|NS [c] HY010 [o]|

## <a name="sqlparamdata"></a>SQLParamData

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|HY010|HY010|HY010|See the following table|NS [c] HY010 [o]|

## <a name="sqlparamdata-need-data-states"></a>SQLParamData (Need Data States)

|S8<br /><br /> Need Data|S9<br /><br /> Must Put|S10<br /><br /> Can Put|
|----------------------|---------------------|---------------------|
|S1 [e] and [1] S2 [e] and [nr] and [2] S3 [e] and [r] and [2] S5 [e] and [4] S6 [e] and [5] S7 [e] and [3] S9 [d] S11 [x]|HY010|S1 [e] and [1] S2 [e] and [nr] and [2] S3 [e] and [r] and [2] S4 [s] and [nr] and ([1] or [2]) S5 [s] and [r] and ([1] or [2]) S5 ([s] or [e]) and [4] S6 ([s] or [e]) and [5] S7 ([s] or [e]) and [3] S9 [d] S11 [x]|

[1] **SQLExecDirect** returned SQL_NEED_DATA.

[2] **SQLExecute** returned SQL_NEED_DATA.

[3] **SQLSetPos** had been called from state S7 and returned SQL_NEED_DATA.

[4] **SQLBulkOperations** had been called from state S5 and returned SQL_NEED_DATA.

[5] **SQLSetPos** or **SQLBulkOperations** had been called from state S6 and returned SQL_NEED_DATA.

## <a name="sqlprepare"></a>SQLPrepare

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|(IH)|S2 [s] and [nr] S3 [s] and [r] S11 [x]|-- [s] or ([e] and [1]) S1 [e] and [2] S11 [x]|S1 [e] and [3] S2 [s] and [nr] and [3] S3 [s] and [r] and [3] S11 [x] and [3] 24000 [4]|See the following table|HY010|NS [c] HY010 [o]|

[1] The preparation fails for a reason other than validating the statement (the SQLSTATE was HY009 [Invalid argument value] or HY090 [Invalid string or buffer length]).
[2] The preparation fails while validating the statement (the SQLSTATE was not HY009 [Invalid argument value] or HY090 [Invalid string or buffer length]).

[3] The current result is the last or only result, or there are no current results. For more information about multiple results, see [Multiple Results](../../../odbc/reference/develop-app/multiple-results.md).

[4] The current result is not the last result.

## <a name="sqlprepare-cursor-states"></a>SQLPrepare (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|24000|24000|24000|

## <a name="sqlputdata"></a>SQLPutData

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|HY010|HY010|HY010|See the following table|NS [c] HY010 [o]|

## <a name="sqlputdata-need-data-states"></a>SQLPutData (Need Data States)

|S8<br /><br /> Need Data|S9<br /><br /> Must Put|S10<br /><br /> Can Put|
|----------------------|---------------------|---------------------|
|HY010|S1 [e] and [1] S2 [e] and [nr] and [2] S3 [e] and [r] and [2] S5 [e] and [4] S6 [e] and [5] S7 [e] and [3] S10 [s] S11 [x]|-- [s] S1 [e] and [1] S2 [e] and [nr] and [2] S3 [e] and [r] and [2] S5 [e] and [4] S6 [e] and [5] S7 [e] and [3] S11 [x] HY011 [6]|

[1] **SQLExecDirect** returned SQL_NEED_DATA.

[2] **SQLExecute** returned SQL_NEED_DATA.

[3] **SQLSetPos** had been called from state S7 and returned SQL_NEED_DATA.

[4] **SQLBulkOperations** had been called from state S5 and returned SQL_NEED_DATA.
[5] **SQLSetPos** or **SQLBulkOperations** had been called from state S6 and returned SQL_NEED_DATA.

[6] One or more calls to **SQLPutData** for a single parameter returned SQL_SUCCESS, and then a call to **SQLPutData** was made for the same parameter with *StrLen_or_Ind* set to SQL_NULL_DATA.

## <a name="sqlrowcount"></a>SQLRowCount

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|(IH)|(HY010)|(HY010)|--|--|(HY010)|(HY010)|

## <a name="sqlsetconnectattr"></a>SQLSetConnectAttr

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|-- [1]|--|--|--|-- [2] 24000 [3]|HY010|HY010|

[1] This row shows the transitions when *Attribute* was a connection attribute. For the transitions when *Attribute* was a statement attribute, see the statement transition table for **SQLSetStmtAttr**.

[2] The *Attribute* argument was not SQL_ATTR_CURRENT_CATALOG.

[3] The *Attribute* argument was SQL_ATTR_CURRENT_CATALOG.
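The Need Data states that **SQLParamData** and **SQLPutData** cycle through (S8 = Need Data, S9 = Must Put, S10 = Can Put) can be pictured as a small loop. The following Python sketch is purely illustrative: the class and constants only mimic ODBC names in order to model the control flow of data-at-execution, and are not bindings to a real driver.

```python
# Illustrative model of ODBC's data-at-execution loop (states S8-S10).
# The SQL_* constants and the Statement class are stand-ins, not a real driver.

SQL_NEED_DATA = 99
SQL_SUCCESS = 0

class Statement:
    """Toy statement with data-at-execution parameters."""
    def __init__(self, params):
        self.pending = list(params)   # parameters still needing data
        self.received = {}            # token -> accumulated chunks

    def execute(self):
        # Entering state S8 (Need Data) when parameters lack data.
        return SQL_NEED_DATA if self.pending else SQL_SUCCESS

    def param_data(self):
        # Like SQLParamData: yields the next token, or finishes.
        if self.pending:
            token = self.pending.pop(0)
            self.received[token] = []
            return SQL_NEED_DATA, token     # state S9: must put data
        return SQL_SUCCESS, None            # all data sent; statement executes

    def put_data(self, token, chunk):
        # Like SQLPutData: appends a chunk for the current parameter (S10).
        self.received[token].append(chunk)
        return SQL_SUCCESS

stmt = Statement(params=["name", "photo"])
data = {"name": ["Ada"], "photo": ["<bytes1>", "<bytes2>"]}

rc = stmt.execute()
while rc == SQL_NEED_DATA:                  # S8: driver needs parameter data
    rc, token = stmt.param_data()
    if rc == SQL_NEED_DATA:
        for chunk in data[token]:           # S9 -> S10: send data in pieces
            stmt.put_data(token, chunk)

print(stmt.received)
```

Note how the loop mirrors the tables: a successful **SQLPutData** keeps the statement in Can Put (S10), and only the final **SQLParamData** call leaves the Need Data states.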
## <a name="sqlsetcursorname"></a>SQLSetCursorName

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|--|--|24000|24000|HY010|HY010|

## <a name="sqlsetdescfield-and-sqlsetdescrec"></a>SQLSetDescField and SQLSetDescRec

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH [1]|--|--|--|--|HY010|HY010|

[1] This row shows the transitions when the *DescriptorHandle* argument was an ARD, APD, IPD, or (for **SQLSetDescField**) an IRD when the *FieldIdentifier* argument was SQL_DESC_ARRAY_STATUS_PTR or SQL_DESC_ROWS_PROCESSED_PTR. It is an error to call **SQLSetDescField** for an IRD when *FieldIdentifier* has any other value.
## <a name="sqlsetenvattr"></a>SQLSetEnvAttr

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|HY011|HY011|HY011|HY011|HY011|HY011|HY011|

## <a name="sqlsetpos"></a>SQLSetPos

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|HY010|HY010|24000|See the following table|HY010|NS [c] HY010 [o]|

## <a name="sqlsetpos-cursor-states"></a>SQLSetPos (Cursor States)

|S5<br /><br /> Opened|S6<br /><br /> SQLFetch or SQLFetchScroll|S7<br /><br /> SQLExtendedFetch|
|-------------------|---------------------------------------|-----------------------------|
|24000|-- [s] S8 [d] S11 [x] 24000 [b] HY109 [i]|-- [s] S8 [d] S11 [x] 24000 [b] HY109 [i]|

## <a name="sqlsetstmtattr"></a>SQLSetStmtAttr

|S0<br /><br /> Unallocated|S1<br /><br /> Allocated|S2, S3<br /><br /> Prepared|S4<br /><br /> Executed|S5-S7<br /><br /> Cursor|S8-S10<br /><br /> Need Data|S11-S12<br /><br /> Async|
|------------------------|----------------------|------------------------|---------------------|----------------------|--------------------------|-----------------------|
|IH|--|-- [1] HY011 [2]|-- [1] 24000 [2]|-- [1] 24000 [2]|HY010 [np] or [1] HY011 [p] and [2]|HY010 [np] or [1] HY011 [p] and [2]|

[1] The *Attribute* argument was not SQL_ATTR_CONCURRENCY, SQL_ATTR_CURSOR_TYPE, SQL_ATTR_SIMULATE_CURSOR, SQL_ATTR_USE_BOOKMARKS, SQL_ATTR_CURSOR_SCROLLABLE, or SQL_ATTR_CURSOR_SENSITIVITY.
[2] The *Attribute* argument was SQL_ATTR_CONCURRENCY, SQL_ATTR_CURSOR_TYPE, SQL_ATTR_SIMULATE_CURSOR, SQL_ATTR_USE_BOOKMARKS, SQL_ATTR_CURSOR_SCROLLABLE, or SQL_ATTR_CURSOR_SENSITIVITY.
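Taken together, the tables above define a state machine over the statement states S0-S12. As a rough mental model only (not a faithful reproduction of every transition), a table-driven sketch in Python might look like this; the states and transitions here are a simplified happy-path subset:

```python
# A minimal, simplified model of the ODBC statement state machine described
# by the tables above. Only a happy-path subset of a few functions is
# modeled; real transitions also depend on results, cursors, and async mode.

TRANSITIONS = {
    # (current state, function) -> next state
    ("S0", "SQLAllocHandle"): "S1",   # allocate a statement
    ("S1", "SQLPrepare"):     "S2",   # prepared (simplified: no results)
    ("S2", "SQLExecute"):     "S5",   # executed, cursor open (simplified)
    ("S1", "SQLExecDirect"):  "S5",
    ("S5", "SQLFetch"):       "S6",   # fetching rows
    ("S6", "SQLFetch"):       "S6",
    ("S6", "SQLCloseCursor"): "S2",   # back to prepared (simplified)
    ("S5", "SQLCloseCursor"): "S2",
    ("S2", "SQLFreeHandle"):  "S0",   # statement freed
}

def run(calls, state="S0"):
    """Apply a sequence of ODBC calls; raise on a sequencing error (HY010)."""
    for fn in calls:
        try:
            state = TRANSITIONS[(state, fn)]
        except KeyError:
            raise RuntimeError(f"HY010: {fn} not valid in state {state}")
    return state

final = run(["SQLAllocHandle", "SQLPrepare", "SQLExecute",
             "SQLFetch", "SQLCloseCursor", "SQLFreeHandle"])
print(final)  # back in S0
```

Calling a function from a state with no entry in the table corresponds to the HY010 (function sequence error) cells in the tables.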
# FutureSDR

An experimental asynchronous SDR runtime for heterogeneous architectures that is:

* **Extensible**: custom buffers (supporting accelerators like GPUs and FPGAs) and custom schedulers (optimized for your application).
* **Asynchronous**: solving long-standing issues around IO, blocking, and timers.
* **Portable**: Linux, Windows, Mac, WASM, Android, and prime support for embedded platforms through a REST API and web-based GUIs.
* **Fast**: SDR go brrr!

[![Crates.io][crates-badge]][crates-url]
[![Apache 2.0 licensed][apache-badge]][apache-url]
[![Build Status][actions-badge]][actions-url]

[crates-badge]: https://img.shields.io/crates/v/futuresdr.svg
[crates-url]: https://crates.io/crates/futuresdr
[apache-badge]: https://img.shields.io/badge/license-Apache%202-blue
[apache-url]: https://github.com/futuresdr/futuresdr/blob/master/LICENSE
[actions-badge]: https://github.com/futuresdr/futuresdr/workflows/CI/badge.svg
[actions-url]: https://github.com/futuresdr/futuresdr/actions?query=workflow%3ACI+branch%3Amaster

[Website](https://www.futuresdr.org) | [Guides](https://www.futuresdr.org/tutorial) | [API Docs](https://docs.rs/futuresdr/latest/futuresdr) | [Chat](https://discord.com/invite/vCz29eDbGP/)

## Overview

FutureSDR supports *Blocks* with synchronous or asynchronous implementations for stream-based or message-based data processing. Blocks can be combined into a *Flowgraph* and launched on a *Runtime* that is driven by a *Scheduler*.

* Single- and multi-threaded schedulers, including examples for application-specific implementations.
* Portable GPU acceleration using the Vulkan API (supports Linux, Windows, Android, ...).
* User-space DMA driver for Xilinx Zynq to interface FPGAs.

## Development

Since FutureSDR is in an early state of development, it is likely that SDR applications will require changes to the runtime.
We, therefore, do not recommend to add it as a dependency in a separate project but to clone the repository and implement the application as binary, example, or sub-crate. ## Example An example flowgraph with a periodic message source, sending five messages to a sink: ``` rust use anyhow::Result; use std::time::Duration; use futuresdr::blocks::MessageSink; use futuresdr::blocks::MessageSource; use futuresdr::runtime::Flowgraph; use futuresdr::runtime::Pmt; use futuresdr::runtime::Runtime; fn main() -> Result<()> { let mut fg = Flowgraph::new(); let src = fg.add_block(MessageSource::new(Pmt::Null, Duration::from_secs(1), Some(5))); let snk = fg.add_block(MessageSink::new()); fg.connect_message(src, "out", snk, "in")?; Runtime::new().run(fg)?; Ok(()) } ``` ## Contributing Contributions are very welcome. Please see the (work-in-progress) [contributing guide][contr] for more information. If you develop larger features or work on major changes with the main intention to submit them upstream, it would be great, if you could announce them in advance. [contr]: https://github.com/futuresdr/futuresdr/blob/master/CONTRIBUTING.md ## Conduct The FutureSDR project adheres to the [Rust Code of Conduct][coc]. It describes the _minimum_ behavior expected from all contributors. [coc]: https://github.com/rust-lang/rust/blob/master/CODE_OF_CONDUCT.md ## License This project is licensed under the [Apache 2.0 license][lic]. Using this license is in contrast to the large majority of Open Source SDR applications and frameworks, which are mostly AGLP, LGPL, or GPL. In a nutshell, this means that there is *no* money to be made from relicensing the project for commercial use, since this is already allowed by Apache 2.0. Furthermore, companies can use (parts of) the project and integrate (adapted) versions in commercial products without releasing the source or contributing back to the project. 
The main motivation for this license is that * it better fits the Rust ecosystem * it eases adoption; one can use (parts of) the code with close to no strings attached * using Open Source and not contributing back (for the time being) seems better than not using Open Source at all [lic]: https://github.com/futuresdr/futuresdr/blob/master/LICENSE. ## Contributions Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in FutureSDR, shall be licensed as Apache 2.0, without any additional terms or conditions.
35.617886
97
0.76284
eng_Latn
0.962777
2de315e91787dea2542e06e3a562310dc7bcf6ea
5,731
md
Markdown
docs/basic/binary.md
xiaocairush/xiaocairush.github.io
4f1d864b9b3fba1e2100f6a50932756c0373971c
[ "MIT" ]
null
null
null
docs/basic/binary.md
xiaocairush/xiaocairush.github.io
4f1d864b9b3fba1e2100f6a50932756c0373971c
[ "MIT" ]
null
null
null
docs/basic/binary.md
xiaocairush/xiaocairush.github.io
4f1d864b9b3fba1e2100f6a50932756c0373971c
[ "MIT" ]
null
null
null
This page briefly introduces binary search, the ternary search derived from bisection, and binary searching on the answer.

## Binary search

### Introduction

Binary search (also called half-interval search or logarithmic search) is an algorithm for locating an element in a sorted array.

### How it works

Take searching for a value in an ascending array as an example. At each step the algorithm examines the middle element of the current portion of the array. If the middle element is exactly the target, the search ends. If the middle element is smaller than the target, everything to its left is even smaller and cannot contain the target, so the search continues on the right half; symmetrically, if the middle element is larger than the target, the search continues on the left half.

### Properties

#### Time complexity

The best-case time complexity of binary search is $O(1)$.

Its average-case and worst-case time complexity are both $O(\log n)$: each step halves the query interval, so for an array of length $n$ at most $O(\log n)$ probes are made.

#### Space complexity

The iterative version of binary search uses $O(1)$ space.

The recursive version (without tail-call elimination) uses $O(\log n)$ space.

### Implementation

```cpp
int binary_search(int start, int end, int key) {
  int ret = -1;  // return -1 when the key is not found
  int mid;
  while (start <= end) {
    mid = start + ((end - start) >> 1);  // averaging directly could overflow
    if (arr[mid] < key)
      start = mid + 1;
    else if (arr[mid] > key)
      end = mid - 1;
    else {  // test equality last: most probes end up strictly above or below
      ret = mid;
      break;
    }
  }
  return ret;  // single exit point
}
```

???+note
    When $n$ is a signed integer and you can guarantee $n\ge 0$, `n >> 1` takes fewer instructions than `n / 2`.

### Minimizing the maximum

Note that "sorted" here is meant in a broad sense: if every element on one side of an array satisfies some condition while every element on the other side does not, the array can also be regarded as sorted (treat "satisfies" as $1$ and "does not satisfy" as $0$; the array is then sorted at least along that dimension). In other words, binary search can be used to find the largest (or smallest) value satisfying some condition.

To find the smallest possible value of a "maximum" that satisfies some condition (minimizing the maximum), the first idea is to enumerate candidate answers from small to large and check each for feasibility. If the answer is monotone, binary search finds it faster. To apply binary search to such "minimize the maximum" problems, three conditions must hold:

1. the answer lies in a fixed interval;
2. directly finding a feasible value may be hard, but it must be easy to decide whether a given value is feasible;
3. feasibility is monotone over the interval; in other words, if $x$ is feasible, then $x + 1$ (or $x - 1$) is feasible as well (which gives the monotonicity mentioned above).

Maximizing the minimum works the same way, of course.

### Binary search in the STL

The C++ standard library provides [`std::lower_bound`](https://zh.cppreference.com/w/cpp/algorithm/lower_bound), which finds the first element not less than a given value, and [`std::upper_bound`](https://zh.cppreference.com/w/cpp/algorithm/upper_bound), which finds the first element greater than a given value. Both are declared in the header `<algorithm>`.

Both are implemented with binary search, so the elements must be sorted before calling them.

### bsearch

`bsearch` is the C standard library's binary search, declared in `<stdlib.h>` (`<cstdlib>` in the C++ standard library). `qsort` and `bsearch` are the only two algorithm-style functions in the C library.

Compared with the four parameters of `qsort` (see [sorting-related STL](./stl-sort.md)), `bsearch` adds one parameter on the far left: the address of the key. Passing an address rather than a value makes it possible to reuse exactly the same comparison function as `qsort`, enabling search immediately after sorting. Consequently you cannot pass a literal value directly; store the key in a variable first and pass that variable's address.

So `bsearch` takes five parameters in total: the address of the key, the array name, the number of elements, the size of each element, and the comparison rule. The comparison rule is again given as a comparison function; see [sorting-related STL](./stl-sort.md).

`bsearch` returns the address of the element found, as a `void` pointer.

Note that `bsearch` differs from `lower_bound` and `upper_bound` in two ways:

- When several equal elements match, it returns the first matching element encountered during the binary search, which may lie in the middle of the run of duplicates.
- When no element matches, it returns `NULL`.

`lower_bound` can fully replicate `bsearch`, so any problem solvable with `bsearch` can be rewritten with `lower_bound`. Because of the second difference above, however, the converse is harder: for example, when searching for 3 in the sequence 1, 2, 4, 5, 6, implementing `lower_bound` on top of `bsearch` becomes difficult.

Is it therefore impossible to implement `lower_bound` with `bsearch`? No — there is a rather tricky workaround. Relying on how the implementation invokes the comparison function (the first argument always points at the key, the second at elements of the array being searched), `bsearch` can emulate both `lower_bound` and `upper_bound`, as in the example below. This requires the array to be a global array, so that its base address can be passed in directly.

```cpp
int A[100005];  // example global array

// find the address of the first element not less than the key
int lower(const void *p1, const void *p2) {
  int *a = (int *)p1;
  int *b = (int *)p2;
  if ((b == A || compare(a, b - 1) > 0) && compare(a, b) > 0)
    return 1;
  else if (b != A && compare(a, b - 1) <= 0)
    return -1;  // uses pointer subtraction, so the element type must be specified
  else
    return 0;
}

// find the address of the first element greater than the key
int upper(const void *p1, const void *p2) {
  int *a = (int *)p1;
  int *b = (int *)p2;
  if ((b == A || compare(a, b - 1) >= 0) && compare(a, b) >= 0)
    return 1;
  else if (b != A && compare(a, b - 1) < 0)
    return -1;  // uses pointer subtraction, so the element type must be specified
  else
    return 0;
}
```

Since today's OI contestants rarely write pure C and this technique is of limited use, it is not a priority. Beginners are advised to simply use `lower_bound` and `upper_bound` from C++.

### Binary searching the answer

A common solving strategy is to enumerate candidate answers and verify each one. When feasibility is monotone, the conditions for bisection are met: replace the enumeration with binary search and you get "binary searching the answer".

???+note "[Luogu P1873 Cutting trees](https://www.luogu.com.cn/problem/P1873)"
    Lumberjack Mirko needs to cut down $M$ metres of wood. This is easy for Mirko, because he has a beautiful new harvester that can mow down a forest like wildfire. However, Mirko is only allowed to cut a single row of trees.

    Mirko's harvester works as follows: Mirko sets a height parameter $H$ (in metres); the harvester raises a giant blade to height $H$ and saws off the part of every tree taller than $H$ (trees no taller than $H$ metres are, of course, left untouched). Mirko takes the parts of the trees that were sawn off.

    For example, if the trees in a row have heights $20,~15,~10,~17$ and Mirko raises the blade to $15$ metres, the remaining heights after cutting are $15,~15,~10,~15$; Mirko gets $5$ metres of wood from the first tree and $2$ metres from the fourth, $7$ metres in total.

    Mirko cares a lot about ecology, so he will not cut more wood than necessary. That is why he sets the blade as high as possible. Your task is to help Mirko find the maximum integer blade height $H$ such that he still obtains at least $M$ metres of wood — i.e., raising the blade by one more metre would yield less than $M$ metres.

??? note "Solution idea"
    We could enumerate answers from $1$ to $10^9$, but this naive approach will not get full marks — enumerating from $1$ to $10^9$ takes far too long. Instead, binary search over the interval $[1,~10^9]$ for the answer and check each candidate's feasibility (usually with a greedy check). **This is binary searching the answer.**

??? note "Reference code"
    ```cpp
    int a[1000005];
    int n, m;

    bool check(int k) {                  // feasibility check, k = blade height
      long long sum = 0;
      for (int i = 1; i <= n; i++)       // examine every tree
        if (a[i] > k)                    // if the tree is taller than the blade
          sum += (long long)(a[i] - k);  // accumulate the cut-off length
      return sum >= m;                   // feasible if the total reaches the required length
    }

    int find() {
      int l = 1, r = 1e9 + 1;  // half-open on the right, so add 1 to 10^9
      while (l + 1 < r) {      // while the two endpoints are not adjacent
        int mid = (l + r) / 2; // take the midpoint
        if (check(mid))        // if feasible
          l = mid;             // raise the blade
        else
          r = mid;             // otherwise lower the blade
      }
      return l;                // return the left endpoint
    }

    int main() {
      cin >> n >> m;
      for (int i = 1; i <= n; i++) cin >> a[i];
      cout << find();
      return 0;
    }
    ```

After reading the code above, you may have two questions:

1. Why is the search interval half-open on the right?

    Because at the end of the search (taking the maximal feasible value as an example) the state looks like this:

    ![](./images/binary-final-1.png)

    and then:

    ![](./images/binary-final-2.png)

    The minimal feasible value is exactly the mirror case.

2. Why return the left endpoint?

    Same reason as above.

## Ternary search

### Introduction

Ternary search can be used to find the maximum (or minimum) of a convex function.

A sketch helps with understanding (figure to be added):

- If `lmid` and `rmid` are on the same side of the extremum: by monotonicity, the one with the larger (smaller) value must be closer to the extremum; the interval beyond the farther point cannot contain the extremum, so it can be discarded.
- If they are on opposite sides: the extremum lies between them, so discarding either outer interval cannot lose the extremum.

### Implementation

```cpp
lmid = left + (right - left >> 1);
rmid = lmid + (right - lmid >> 1);  // halve the right-hand interval
if (cal(lmid) > cal(rmid))
  right = rmid;
else
  left = lmid;
```

## Fractional programming

See: [fractional programming](../misc/frac-programming.md)

Fractional programming is usually stated as follows: each item has two attributes $c_i$ and $d_i$; select some items so as to maximize or minimize $\frac{\sum{c_i}}{\sum{d_i}}$.

Classic examples include the optimal ratio cycle and the optimal ratio spanning tree.

Fractional programming can be solved with binary search.
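The tree-cutting problem above can also be sketched with the same half-open, left-feasible/right-infeasible invariant in Python (a re-implementation for illustration, not the original reference code; the helper name `max_height` is mine):

```python
def max_height(trees, m):
    """Largest integer blade height H that still yields at least m metres of wood."""
    def check(h):
        # Greedy feasibility check: total wood cut above height h.
        return sum(t - h for t in trees if t > h) >= m

    lo, hi = 0, max(trees) + 1  # invariant: check(lo) holds, check(hi) fails
    while lo + 1 < hi:          # stop when the two endpoints are adjacent
        mid = (lo + hi) // 2
        if check(mid):
            lo = mid            # feasible: raise the blade
        else:
            hi = mid            # infeasible: lower the blade
    return lo

print(max_height([20, 15, 10, 17], 7))  # 15, matching the worked example
```

The invariant mirrors the half-open interval discussed above: the left endpoint always stays feasible, which is exactly why the left value is the one to return.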
26.410138
227
0.635666
yue_Hant
0.852112
2de398ff3b4588dfb5e189f8c5e960651a827dfd
164
md
Markdown
_collection/Poulami-Sarkar-Bengali-Hindi-OCR.md
singularityhub/singularityhub-archive
dd1d471db0c4ac01998cb84bd1ab82e97c8dab65
[ "ECL-2.0", "Apache-2.0" ]
null
null
null
_collection/Poulami-Sarkar-Bengali-Hindi-OCR.md
singularityhub/singularityhub-archive
dd1d471db0c4ac01998cb84bd1ab82e97c8dab65
[ "ECL-2.0", "Apache-2.0" ]
null
null
null
_collection/Poulami-Sarkar-Bengali-Hindi-OCR.md
singularityhub/singularityhub-archive
dd1d471db0c4ac01998cb84bd1ab82e97c8dab65
[ "ECL-2.0", "Apache-2.0" ]
null
null
null
---
id: 2710
full_name: "Poulami-Sarkar/Bengali-Hindi-OCR"
images:
- "Poulami-Sarkar-Bengali-Hindi-OCR-benocr"
- "Poulami-Sarkar-Bengali-Hindi-OCR-benocr"
---
20.5
45
0.713415
hrv_Latn
0.111178
2de671dafd376f332506c62e8eeb00acfaa99f12
5,283
md
Markdown
sample/gcp_pubsub_function/README.md
trisberg/knative-eventing
86dc63d5f2b81d2a17c2547c76a697bddbde4b62
[ "Apache-2.0" ]
1
2018-07-26T14:40:02.000Z
2018-07-26T14:40:02.000Z
sample/gcp_pubsub_function/README.md
trisberg/knative-eventing
86dc63d5f2b81d2a17c2547c76a697bddbde4b62
[ "Apache-2.0" ]
null
null
null
sample/gcp_pubsub_function/README.md
trisberg/knative-eventing
86dc63d5f2b81d2a17c2547c76a697bddbde4b62
[ "Apache-2.0" ]
null
null
null
# gcp_pubsub_function

A simple function that receives Google Cloud Pub/Sub events and prints out the data field after decoding it from base64. Because we do **not** have an in-cluster event delivery mechanism yet, this sample uses a Knative route as an endpoint.

## Prerequisites

1. [Setup your development environment](../../DEVELOPMENT.md#getting-started)

2. [Start Knative](../../README.md#start-knative)

3. Decide on the DNS name that GCP can then call. Update the `domainSuffix` in `knative/serving/config-domain.yaml`. For example, I used `aikas.org` as my hostname, so my `config-domain.yaml` looks like this:

   ```yaml
   apiVersion: v1
   kind: ConfigMap
   metadata:
     name: config-domain
     namespace: knative-serving-system
   data:
     aikas.org: |
   ```

   If you were already running the Knative controllers, you will need to re-apply the configmap.

4. Install GCP Pub/Sub as an event source

   ```shell
   ko apply -f pkg/sources/gcppubsub/
   ```

5. Create a GCP Pub/Sub topic

   ```shell
   gcloud pubsub topics create knative-demo
   ```

6. Install the event sources and types for [gcp_pubsub](../gcp_pubsub/README.md)

## Running

You can deploy this to Knative from the root directory via:

```shell
ko apply -f sample/gcp_pubsub_function/route.yaml
ko apply -f sample/gcp_pubsub_function/configuration.yaml
```

Once deployed, you can inspect the created resources with `kubectl` commands:

```shell
# This will show the Route that we created:
kubectl get route -o yaml

# This will show the Configuration that we created:
kubectl get configurations -o yaml

# This will show the Revision that was created by our configuration:
kubectl get revisions -o yaml

# This will show the available EventSources that you can bind to:
kubectl get eventsources -o yaml

# This will show the available EventTypes that you can bind to:
kubectl get eventtypes -o yaml
```

To make this service accessible to GCP, we first need to determine its ingress address (you might have to wait a little while until `ADDRESS` gets assigned):

```shell
$ watch kubectl get ingress
NAME                          HOSTS                                                                            ADDRESS           PORTS     AGE
gcp-pubsub-function-ingress   gcp-pubsub-function.default.aikas.org,*.gcp-pubsub-function.default.aikas.org   130.211.116.160   80        20s
```

Once the `ADDRESS` gets assigned to the cluster, you need to assign a DNS name for that IP address. This DNS name needs to be `gcp-pubsub-function.default.<domainsuffix you created>`, so for me it is `gcp-pubsub-function.default.aikas.org`, pointing to `130.211.116.160`.

[Using GCP DNS](https://support.google.com/domains/answer/3290350), you would create an A record for `gcp-pubsub-function.default.aikas.org` pointing to `130.211.116.160`.

To bind GCP Pub/Sub messages to the function we created above, you need to create a `Bind` object. Modify `sample/gcp_pubsub_function/bind.yaml` to specify the topic and project ID you want. For example, to receive notifications for project `quantum-reducer-434` and topic `knative-demo`, my `Bind` object would look like this:

```yaml
apiVersion: feeds.knative.dev/v1alpha1
kind: Bind
metadata:
  name: gcppubsub-example
  namespace: default
spec:
  trigger:
    service: gcppubsub
    eventType: receive
    resource: quantum-reducer-434/knative-demo
    parameters:
      projectID: quantum-reducer-434
      topic: knative-demo
  action:
    routeName: gcp-pubsub-function
```

Then create the binding:

```shell
ko apply -f sample/gcp_pubsub_function/bind.yaml
```

This will create a subscription to the topic and an Event Receiver that uses a Pull Channel to receive events from the topic we just created.

```shell
$ gcloud pubsub subscriptions list
<snip>
ackDeadlineSeconds: 10
messageRetentionDuration: 604800s
name: projects/quantum-reducer-434/subscriptions/sub-67f12e62-95a2-4a5e-bc30-9edd29380214
pushConfig: {}
topic: projects/quantum-reducer-434/topics/knative-demo
<snip>
```

Then look at the logs for the function:

```shell
$ kubectl get pods
NAME                                                    READY     STATUS    RESTARTS   AGE
gcp-pubsub-function-00001-deployment-68864b8c7d-rgx2w   3/3       Running   0          12m

# Replace gcp-pubsub-function with the pod name from above:
$ kubectl logs gcp-pubsub-function-00001-deployment-68864b8c7d-rgx2w user-container
```

Nothing is there, so let's change that:

```shell
$ gcloud pubsub topics publish knative-demo --message 'test message'
```

Then look at the function logs:

```shell
$ kubectl logs gcp-pubsub-function-00001-deployment-68864b8c7d-rgx2w user-container
2018/05/22 19:16:59 {"ID":"99171831321660","Data":"dGVzdCBtZXNzYWdl","Attributes":null,"PublishTime":"2018-05-22T19:16:59.727Z"}
2018/05/22 19:16:59 Received data: "test message"
```

## Removing a binding

Remove the binding and things get cleaned up (including removing the subscription to GCP Pub/Sub):

```shell
kubectl delete binds gcppubsub-example
```

And now the subscription from above has been removed:

```shell
$ gcloud pubsub subscriptions list
# not there anymore
```

## Cleaning up

To clean up the sample service:

```shell
ko delete -f sample/gcp_pubsub_function/
```
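The function's core behavior — base64-decoding the `Data` field of the Pub/Sub event — can be reproduced outside the cluster with a short Python sketch (the helper name `decode_pubsub_event` is mine; the event string is taken from the log output above):

```python
import base64
import json

def decode_pubsub_event(raw):
    # Parse the logged event and base64-decode its Data field,
    # mirroring what the sample function prints as "Received data".
    msg = json.loads(raw)
    return base64.b64decode(msg["Data"]).decode("utf-8")

event = '{"ID":"99171831321660","Data":"dGVzdCBtZXNzYWdl","Attributes":null,"PublishTime":"2018-05-22T19:16:59.727Z"}'
print(decode_pubsub_event(event))  # test message
```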
29.679775
145
0.738028
eng_Latn
0.941923
2de74e94daa1d5bf07f099f51e7b786e8545b46d
16,550
md
Markdown
README.md
kalisio/covid-19
17e26a5f468924b113f86ee2a27a31f13705709a
[ "MIT" ]
17
2020-03-17T09:55:03.000Z
2020-10-28T14:24:32.000Z
README.md
kalisio/covid-19
17e26a5f468924b113f86ee2a27a31f13705709a
[ "MIT" ]
1
2020-05-16T16:56:49.000Z
2020-05-17T20:17:33.000Z
README.md
kalisio/covid-19
17e26a5f468924b113f86ee2a27a31f13705709a
[ "MIT" ]
7
2020-03-28T11:08:37.000Z
2021-01-01T15:37:54.000Z
> This repository is no longer updated on a regular basis, following the evolution of the pandemic and of the official tools put in place. We nevertheless keep it as a textbook example of what can be produced from disparate sources.

# Cartographic data on the COVID-19 epidemic

Official information on the progression of the epidemic in France was initially rather fragmented. Various initiatives have tried to structure it as open data. Despite this work, the raw data often remained difficult to use in cartographic tools. The goal of this repository is to consolidate the information and make it available in open, easily reusable formats for producing maps. The preferred pivot format is [GeoJSON](https://fr.wikipedia.org/wiki/GeoJSON).

You can also find our data on [the national open data portal](https://www.data.gouv.fr/fr/datasets/donnees-cartographiques-concernant-lepidemie-de-covid-19/).

Official information on the progression of the epidemic in France is consolidated by <a href='https://www.santepubliquefrance.fr'>Santé publique France</a>. The agency publishes a <a href='https://www.santepubliquefrance.fr/maladies-et-traumatismes/maladies-et-infections-respiratoires/infection-a-coronavirus/articles/infection-au-nouveau-coronavirus-sars-cov-2-covid-19-france-et-monde'>daily epidemiological update</a> that includes the national key figures. In addition, the <a href="https://www.ars.sante.fr">Regional Health Agencies</a>, the <a href="http://www.prefectures-regions.gouv.fr">regional prefectures</a> and the <a href="https://www.interieur.gouv.fr/Le-ministere/Prefectures">prefectures</a> publish information bulletins focused on their respective territories.

Driven by open initiatives such as <a href='https://github.com/opencovid19-fr'>OpenCovid19</a>, Santé publique France also publishes <a href='https://www.data.gouv.fr/fr/organizations/sante-publique-france/'>more detailed data on the epidemic</a> on the <a href='https://www.data.gouv.fr'>www.data.gouv.fr</a> platform. An <a href='https://github.com/etalab/covid19-dashboard'>open source tool</a>, developed under the impulse of <a target='_top' href='https://www.etalab.gouv.fr'>Etalab</a> within the <a href='https://www.numerique.gouv.fr/dinum/'>interministerial digital directorate</a>, offers a consolidated view of the available official data.

## How to contribute?

You can volunteer to test our scrapers, improve them, use our data, or build new datasets. To get in touch, join the community on [Slack](https://join.slack.com/t/dataagainstcovid-19/shared_invite/zt-cgsplso2-LIvWeRHlf1ZFIrh~SPj~IA), open an [issue](https://github.com/kalisio/covid-19/issues) or a [pull request](https://github.com/kalisio/covid-19/pulls).

Some ideas:

* ~~produce datasets with department outlines rather than barycenters~~
* ~~cross-reference with population data~~
* ~~cross-reference with hospital data (number of beds, etc.)~~ => done via SAE 2018
* ~~integrate the [hospital data](https://www.data.gouv.fr/fr/datasets/donnees-hospitalieres-relatives-a-lepidemie-de-covid-19/) from Santé Publique France, reported at the department level~~
* ~~integrate the [emergency room and SOS Médecins data](https://www.data.gouv.fr/fr/datasets/donnees-des-urgences-hospitalieres-et-de-sos-medecins-relatives-a-lepidemie-de-covid-19/) from Santé Publique France, reported at the department level~~
* ~~integrate the [laboratory screening data](https://www.data.gouv.fr/fr/datasets/donnees-relatives-aux-tests-de-depistage-de-covid-19-realises-en-laboratoire-de-ville/) from Santé Publique France, reported at the department level~~
* ~~integrate the [virological test data](https://www.data.gouv.fr/fr/datasets/donnees-relatives-aux-resultats-des-tests-virologiques-covid-19/) from Santé Publique France, reported at the department level (replaces the previous dataset)~~
* geolocate patient data at the commune level (very little data available so far)
* build MongoDB collections for spatio-temporal visualization in Kano (e.g. time series)

## Data sources

Our main data sources are:

* worldwide
  * https://github.com/CSSEGISandData/COVID-19
* national level
  * regional/departmental data https://github.com/opencovid19-fr/data
  * departmental data https://www.data.gouv.fr/fr/organizations/sante-publique-france
  * individual data https://github.com/lperez31/coronavirus-france-dataset
* geographic cross-references
  * national administrative boundaries https://github.com/gregoiredavid/france-geojson
  * regional/departmental population by age group https://www.insee.fr/fr/statistiques/1893198
  * department/region code mapping https://www.insee.fr/fr/information/3720946#titre-bloc-15
* hospital data from the [annual statistics of health establishments](https://www.sae-diffusion.sante.gouv.fr/sae-diffusion/recherche.htm) (2018)

Example query for the number of intensive care beds on https://hopitaux.datasette.11d.im/hopitaux:

```
SELECT departement, libdepartement, sum(LIT) AS lits
FROM (SELECT DISTINCT f.departement, f.libdepartement, r.FI, r.FI_EJ, r.UNI, r.LIT
      FROM [finess-clean] f
      INNER JOIN REA_2018 r ON r.FI = f.nofinesset AND r.UNI = 'SITOT') tmp
GROUP BY departement, libdepartement
```

## Cartographic data

Each cartographic feature may contain the following properties:

* `Country/Region` country/region of origin
* `Province/State` state/department of origin
* `Confirmed` cumulative number of confirmed cases (*from 25/03 onward this indicator was no longer published by the authorities, except at the national level*)
* `Deaths` cumulative number of deaths
* `Recovered` cumulative number of recoveries
* `Severe` number of hospitalized cases to date
* `Critical` number of cases in intensive care to date
* `Emergencies`
  * `Total` - total daily emergency room visits
  * `Suspected` - daily emergency room visits for suspected COVID-19
  * `Severe` - hospitalizations among daily emergency room visits for suspected COVID-19
* `MedicalActs`
  * `Total` - total daily SOS Médecins medical acts
  * `Suspected` - daily SOS Médecins medical acts for suspected COVID-19
* `MedicalTests` (*no longer updated since 29/05; replaced by the figures from the national RT-PCR screening information system*)
  * `Total` - daily tests performed in medical analysis laboratories
  * `Confirmed` - daily positive tests in medical analysis laboratories
* `PCRTests`
  * `Total` - daily RT-PCR tests performed nationwide
  * `Confirmed` - daily positive RT-PCR tests nationwide
* `Population`
  * `Total` - all ages
  * `Under19` - 0 to 19 years old
  * `Under39` - 20 to 39 years old
  * `Under59` - 40 to 59 years old
  * `Under74` - 60 to 74 years old
  * `Over75` - 75 years old and over
* `Beds` hospital beds
  * `Total` - all
  * `Resuscitation` - resuscitation (réanimation)
  * `IntensiveCare` - intensive and continuous care

For each daily indicator, a cumulative indicator is automatically computed from the previous days' values; for instance there is an `Emergencies.Suspected/Accumulated` property corresponding to the `Emergencies.Suspected` property.

The main datasets produced are:

* daily data per country worldwide :open_file_folder: [csse_covid_19_daily_reports](./csse_covid_19_daily_reports)
  * from the [Johns Hopkins CSSE](https://github.com/CSSEGISandData/COVID-19/tree/master/csse_covid_19_data/csse_covid_19_daily_reports) data
* daily data per region, consolidated at the national level :open_file_folder: [regions-france](./regions-france)
  * from the [Regional Health Agencies](https://github.com/opencovid19-fr/data/tree/master/agences-regionales-sante) and [Santé Publique France](https://www.data.gouv.fr/fr/organizations/sante-publique-france/) data
  * geographic join per region based on the region code
  * polygons (files prefixed with `polygons`) or data geolocated at the region barycenter for building [heatmaps](https://fr.wikipedia.org/wiki/Heat_map)

Animated map of cases by region, bubble size tied to the number of cases:

![Animated map of cases by region](Kano-Covid-19-Regions-France.gif)

See the [original video](https://drive.google.com/file/d/1GjdhBEVwtei5WxCTeXwtoKquKUQk5Gwp/view).

Animated map of cases by region, bubble size tied to the number of cases weighted by population:

![Animated map of cases by region weighted by population](Kano-Covid-19-Regions-France-Population.gif)

See the [original video](https://drive.google.com/file/d/1PpuVcfr6CGq48rGWr5xq9aMCsq7XTPQX/view).

Animated map of cases by region, bubble size tied to the number of cases weighted by population, bubble color/opacity tied to the ratio between the number of cases and the number of available beds (assuming 10% of cases occupy beds):

![Animated map of cases by region weighted by population and available beds](Kano-Covid-19-Regions-France-Population-Lits.gif)

See the [original video](https://drive.google.com/file/d/1tcD_34txCr8I-5L_EVor-sQPO4xm_YQh/view).

Animated map of hospitalizations by region, height and color of the 3D shapes tied to the number of severe cases weighted by population:

![Animated map of cases by region weighted by population](Kano-Covid-19-Régions-France-Hospitalisations-Population-3D.gif)

See the [original video](https://drive.google.com/file/d/1uNqU9opcMAPYbmaeZWa-aDtfwMw3_kSI/view).

* daily data per department, consolidated at the national level :open_file_folder: [departements-france](./departements-france)
  * from the [Regional Health Agencies](https://github.com/opencovid19-fr/data/tree/master/agences-regionales-sante) and [Santé Publique France](https://www.data.gouv.fr/fr/organizations/sante-publique-france/) data
  * geographic join per department based on the department code
  * polygons (files prefixed with `polygons`) or data geolocated at the department barycenter for building [heatmaps](https://fr.wikipedia.org/wiki/Heat_map)

Animated density map of cases by department:

![Animated density map of cases](Kano-Covid-19-Heatmap-France.gif)

See the [original video](https://drive.google.com/open?id=1G6IWKDE1XuSIjY_ncSELPcl8GuMmmKoH).

* global individual patient data in France :open_file_folder: [patients-france](./patients-france)
  * data geolocated at the department barycenter
  * includes a version aggregated by department for building [heatmaps](https://fr.wikipedia.org/wiki/Heat_map)

Map of cases by department:

<img src="https://raw.githubusercontent.com/kalisio/covid-19/master/patients-france.png" width="512" height="512">
<img src="https://raw.githubusercontent.com/kalisio/covid-19/master/patients-france-zoom.png" width="512" height="512">

Density map of cases by department:

<img src="https://raw.githubusercontent.com/kalisio/covid-19/master/patients-heatmap-france.png" width="512" height="512">

If you prefer to reference the datasets directly rather than copying them, use our S3 bucket on AWS instead; the path to the files stays the same, prefixed with the root `https://s3.eu-central-1.amazonaws.com/krawler/`. For example, the URL of the France patients file is `https://s3.eu-central-1.amazonaws.com/krawler/covid-19/patients-france/patients-france.json`.

## Generating the datasets

The data are scraped with [Krawler](https://kalisio.github.io/krawler/) and can be visualized with [Kano](https://kalisio.github.io/kano/) or any other standard GIS tool such as [geojson.io](http://geojson.io/), [QGIS](https://www.qgis.org/fr/site/), etc. The available data are only really meaningful from 1 March 2020 onward.

A Krawler job is responsible for producing each dataset at a given date (i.e. statistics per department, statistics per region, patients). Some jobs are interdependent; for instance the per-department/per-region statistics jobs depend on the prior execution of the job generating the Santé Publique France hospital/emergency room data. The `generate-data-jobfile` job runs all the jobs in the right order to generate every dataset over a period.

**When some indicators (e.g. confirmed cases) are missing at a given date, we chose to fill the gap by reusing the value from the previous date. This makes it possible to keep computing cumulative values when, for example, one indicator has been replaced by another at a given date. Please take this assumption into account when reusing the data.**

Regarding the geometry of administrative boundaries, we use [mapshaper](https://github.com/mbloch/mapshaper) to simplify them to a given precision with the following command:

`mapshaper -i .\departements-france-outre-mer.geojson -simplify keep-shapes interval=500 -o .\departements-france-outre-mer-500m.geojson format=geojson`

We evolve our tools as needs arise, so you must use the development version (master branch), not the stable releases. For Kano, for instance, you will need a [yarn/npm link](https://docs.npmjs.com/cli/link) like any developer working on this project.

For Krawler:

```bash
# Clone and install krawler
git clone https://github.com/kalisio/krawler.git
cd krawler
yarn install
yarn link

# Clone and run a job
git clone https://github.com/kalisio/covid-19.git
cd covid-19
export NODE_PATH="path_to_krawler/node_modules"

# Run job to generate all data for all dates in a period
# By default will start on 2020-03-01 and finish on yesterday
krawler generate-data-jobfile.js --start 2020-03-01 --end 2020-03-30
```

You can also run each job individually:

```bash
# Run job to generate SPF data (to be done first)
krawler spf-donnees-hospitalieres-jobfile.js --date 2020-03-01

# Run job to generate departements data
krawler france-departements-jobfile.js --date 2020-03-01 --geometry 'Point' or 'Polygon'
...
krawler france-departements-jobfile.js --date 2020-03-16 --geometry 'Point' or 'Polygon'

# Run job to generate regions data
krawler france-regions-jobfile.js --date 2020-03-01 --geometry 'Point' or 'Polygon'
...
krawler france-regions-jobfile.js --date 2020-03-16 --geometry 'Point' or 'Polygon'

# Run job to generate patients data
krawler france-patients-jobfile.js
krawler france-patients-jobfile.js --departements
```

For Kano:

```bash
# Clone and link the development kit
git clone https://github.com/kalisio/kdk.git
cd kdk
yarn install
yarn link
yarn run watch

# Clone and run the app
git clone https://github.com/kalisio/kano.git

# In one terminal run the backend
cd kano/api
yarn install
yarn link @kalisio/kdk
yarn run dev

# In another terminal run the frontend
cd kano
yarn install
yarn link @kalisio/kdk
yarn run dev
```

## Application examples

All of the data can be visualized with a simple drag-and-drop of the files into Kano. However, for advanced spatio-temporal visualizations such as heatmaps, some configuration is required.

The file :open_file_folder: [kano/covid19-layers.js](./kano/covid19-layers.js) contains configuration examples for per-region information bubbles, a colored display based on the cases/beds ratio, a heatmap, and a colored, extruded 3D display based on the severe cases/population ratio per region. Copy this file into Kano's :open_file_folder: [api/config/layers](https://github.com/kalisio/kano/tree/master/config/layers) folder.

## License

The derived data are published under the license of the source data. The source code is published under the MIT license.
71.336207
801
0.787734
fra_Latn
0.907898
2de77b1f68a7d393078919f963f000f4d8d4d625
24,420
md
Markdown
_posts/2018-04-03-mmms.md
ibecav/ibecav.github.io
93f478c893f0c6502620f51d301b572e57f3c93d
[ "MIT" ]
1
2018-12-09T11:14:57.000Z
2018-12-09T11:14:57.000Z
_posts/2018-04-03-mmms.md
ibecav/ibecav.github.io
93f478c893f0c6502620f51d301b572e57f3c93d
[ "MIT" ]
null
null
null
_posts/2018-04-03-mmms.md
ibecav/ibecav.github.io
93f478c893f0c6502620f51d301b572e57f3c93d
[ "MIT" ]
2
2018-05-01T22:11:28.000Z
2018-10-01T10:32:39.000Z
--- layout: post title: Fun with M&M's – April 3, 2018 tags: R ggplot2 scimp dplyr chisquare kable ggimage --- In this post we’re going to explore the `Chi Squared Goodness of Fit` test using M&M’s as our subject material. From there we’ll take a look at `simultaneous confidence intervals` a.k.a. `multiple comparisons`. On the `R` side of things we’ll make use of some old friends like `ggplot2` and `dplyr` but we’ll also make use of two packages that were new to me `scimp` and `ggimage`. We’ll also make heavy use of the `kable` package to make our output tables look nicer. ### Background and credits See this [blog post by Rick Wicklin](http://blogs.sas.com/content/iml/2017/02/20/proportion-of-colors-mandms.html) for the full background story. This posting is simply an attempt to do the same sort of analysis in `R`. It is an expansion of work that [Bob Rudis did on `RPubs`.](https://rpubs.com/hrbrmstr/mms) Let’s load the required R packages. See [Bob Rudis’ Github pages](https://github.com/hrbrmstr/scimple) for more on scimple. Let’s take care of housekeeping and set up the right libraries. ``` r library(knitr) library(kableExtra) # devtools::install_github("hrbrmstr/scimple") library(scimple) library(hrbrthemes) # for scales # I had to install Image Magick first on my Mac as well as EBImage # https://bioconductor.org/packages/release/bioc/html/EBImage.html # install.packages("ggimage") library(ggimage) library(dplyr) library(ggplot2) options(knitr.table.format = "html") cap_src <- "Source: <http://blogs.sas.com/content/iml/2017/02/20/proportion-of-colors-mandms.html>" ``` ### SAS M&M’s Measurements The breakroom containers at SAS are filled from two-pound bags. So as to not steal all the M&M’s in the breakroom, [Rick] conducted this experiment over many weeks in late 2016 and early 2017, taking one scoop of M&M’s each week. 
Create a data frame called `mms` that contains information about:

| Column         | Contains                                      | Type   |
| -------------- | --------------------------------------------- | ------ |
| color_name     | What color M&M                                | factor |
| official_color | color as hex code according to Mars standards | char   |
| count          | observed frequency counts in SAS breakrooms   | dbl    |
| prop_2008      | expected freq as a % (Mars 2008)              | dbl    |
| imgs           | filenames for the M&M lentils                 | char   |
| prop           | convert observed counts to proportions        | dbl    |

``` r
mms <- data_frame(
  color_name = c("Red", "Orange", "Yellow", "Green", "Blue", "Brown"),
  official_color = c("#cb1634", "#eb6624", "#fff10a", "#37b252", "#009edd", "#562f14"),
  count = c(108, 133, 103, 139, 133, 96),
  prop_2008 = c(0.13, 0.20, 0.14, 0.16, 0.24, 0.13),
  imgs = c("im-red-lentil.png", "im-orange-lentil.png", "im-yellow-lentil.png",
           "im-green-lentil.png", "im-blue-lentil.png", "im-brown-lentil.png")
) %>%
  mutate(prop = count / sum(count),
         color_name = factor(color_name, levels = color_name))
```

The data set contains the cumulative counts for each of the six colors in a sample of size N = 712. Let's graph the observed percentages as bars (ordered by frequency) and the expected percentages that Mars published in 2008 as black diamonds.
``` r
ggplot(mms, aes(reorder(color_name, -prop), prop, fill = official_color)) +
  geom_col(width = 0.85) +
  geom_point(aes(color_name, prop_2008), shape = 18, size = 3) +
  scale_y_percent(limits = c(0, 0.25)) +
  scale_fill_identity(guide = FALSE) +
  labs(x = NULL, y = NULL,
       title = sprintf("Observed distribution of M&M colors for a sample of N=%d", sum(mms$count)),
       subtitle = "Black diamonds represent 2008 Mars company published percentages (expected)",
       caption = cap_src) +
  theme_bw()
```

![M&M Bargraph](/images/bars-1.png)<!-- -->
*M&M Bargraph*

The same data as a table:

``` r
mms %>%
  arrange(desc(count)) %>%
  mutate(difference = prop - prop_2008,
         difference = scales::percent(difference),
         prop = scales::percent(prop),
         prop_2008 = scales::percent(prop_2008)) %>%
  select(Color = color_name, Observed = count, `Observed %` = prop,
         `Expected %` = prop_2008, Difference = difference) %>%
  kable(align = "lrrrr") %>%
  kable_styling(full_width = FALSE)
```

<table class="table" style="width: auto !important; margin-left: auto; margin-right: auto;">
<thead>
<tr>
<th style="text-align:left;"> Color </th>
<th style="text-align:right;"> Observed </th>
<th style="text-align:right;"> Observed % </th>
<th style="text-align:right;"> Expected % </th>
<th style="text-align:right;"> Difference </th>
</tr>
</thead>
<tbody>
<tr> <td style="text-align:left;"> Green </td> <td style="text-align:right;"> 139 </td> <td style="text-align:right;"> 19.52% </td> <td style="text-align:right;"> 16% </td> <td style="text-align:right;"> 3.52% </td> </tr>
<tr> <td style="text-align:left;"> Orange </td> <td style="text-align:right;"> 133 </td> <td style="text-align:right;"> 18.68% </td> <td style="text-align:right;"> 20% </td> <td style="text-align:right;"> -1.32% </td> </tr>
<tr> <td style="text-align:left;"> Blue </td> <td style="text-align:right;"> 133 </td> <td style="text-align:right;"> 18.68% </td> <td style="text-align:right;"> 24% </td> <td style="text-align:right;"> -5.32% </td> </tr>
<tr> <td style="text-align:left;"> Red </td> <td style="text-align:right;"> 108 </td> <td style="text-align:right;"> 15.17% </td> <td style="text-align:right;"> 13% </td> <td style="text-align:right;"> 2.17% </td> </tr>
<tr> <td style="text-align:left;"> Yellow </td> <td style="text-align:right;"> 103 </td> <td style="text-align:right;"> 14.47% </td> <td style="text-align:right;"> 14% </td> <td style="text-align:right;"> 0.47% </td> </tr>
<tr> <td style="text-align:left;"> Brown </td> <td style="text-align:right;"> 96 </td> <td style="text-align:right;"> 13.48% </td> <td style="text-align:right;"> 13% </td> <td style="text-align:right;"> 0.48% </td> </tr>
</tbody>
</table>

### Chi-Squared Goodness of Fit Test Results

Whether we look at the results in a graph or a table there are clearly differences between expected and observed for most of the colors. We would expect to find some differences, but the overall question is: do our data fit the "model" that is inherent in the `expected` 2008 data we have from Mars? The statistical test for this is the `Chi-Square Goodness of Fit (GoF)` test. Let's run it on our data. We give the test our observed counts `mms$count` as well as `p=mms$prop_2008`, which indicates what our expected probabilities (proportions) are. If we didn't specify `p`, the test would be run against the hypothesis that all the M&M colors were equally likely. `broom::tidy()` takes the output from the Chi-Square test, converts it to a data frame, and allows us to present it neatly using `kable`.
``` r
chisq.test(mms$count, p = mms$prop_2008) %>%
  broom::tidy() %>%
  select(`Chi Squared` = statistic, `P Value` = p.value,
         `Degrees of freedom` = parameter, `R method` = method) %>%
  kable(align = "rrcl", digits = 3) %>%
  kable_styling(full_width = FALSE)
```

<table class="table" style="width: auto !important; margin-left: auto; margin-right: auto;">
<thead>
<tr>
<th style="text-align:right;"> Chi Squared </th>
<th style="text-align:right;"> P Value </th>
<th style="text-align:center;"> Degrees of freedom </th>
<th style="text-align:left;"> R method </th>
</tr>
</thead>
<tbody>
<tr> <td style="text-align:right;"> 17.353 </td> <td style="text-align:right;"> 0.004 </td> <td style="text-align:center;"> 5 </td> <td style="text-align:left;"> Chi-squared test for given probabilities </td> </tr>
</tbody>
</table>

We can reject the null hypothesis at the alpha = 0.05 significance level (95% confidence). In other words, the distribution of colors for M&M's in this 2016/2017 sample does NOT appear to be the same as the color distribution we would expect given the data from Mars published in 2008! The data provide support for the hypothesis that the overall distribution doesn't match what Mars said it should be.

That's exciting news, but it leaves us with some other unanswered questions. One relatively common question is: how "big" is the difference or the effect? Is this a really big discrepancy between the published data and our sample? Is there a way of knowing how big this difference is?

Let's start answering the second question first. Effect size is a measure we use in statistics to express how big the differences are. For this test the appropriate [measure of effect size is Cohen's *w*, which](https://en.wikipedia.org/wiki/Effect_size#Cohen's_w) can be calculated from the `Chi squared statistic` and `N`.
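As a sanity check, both of these quantities are easy to compute from first principles: the test statistic is the sum over colors of (observed - expected)^2 / expected, where the expected count for each color is N * p, and Cohen's *w* is sqrt(chi-squared / N). A quick sketch in plain Python (used here purely as a calculator, for anyone reading without an R session):

```python
from math import sqrt

observed = [108, 133, 103, 139, 133, 96]         # Red, Orange, Yellow, Green, Blue, Brown
p_2008   = [0.13, 0.20, 0.14, 0.16, 0.24, 0.13]  # Mars 2008 expected proportions

N = sum(observed)                                # 712
expected = [N * p for p in p_2008]               # expected counts under the null

chi_sq = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
cohens_w = sqrt(chi_sq / N)

print(f"chi-squared = {chi_sq:.3f}")   # 17.353, matching chisq.test()
print(f"Cohen's w   = {cohens_w:.4f}") # 0.1561
```

The equivalent base-R one-liner is `sum((mms$count - sum(mms$count) * mms$prop_2008)^2 / (sum(mms$count) * mms$prop_2008))`.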
``` r
chisquaredresults <- broom::tidy(chisq.test(mms$count, p = mms$prop_2008))
chisquaredvalue <- chisquaredresults$statistic
N <- sum(mms$count)
cohensw <- sqrt(chisquaredvalue / N)
cohensw
```

    ## [1] 0.1561162

So our value for Cohen's *w* is 0.156. The rule of thumb for interpreting this number indicates that this is a small effect size <https://stats.idre.ucla.edu/other/mult-pkg/faq/general/effect-size-power/faqhow-is-effect-size-used-in-power-analysis/>. Obviously you should exercise professional judgment in interpreting effect size, but it does not appear that the differences are worthy of a worldwide exposé at this time…

On to our other question… Is there a way of telling *by color* which quantities of M&M's are significantly different? After all, a cursory inspection of the graph or the table says that green and blue seem to be "off" quite a bit while yellow and brown are really close to what we would expect! Is there a way, now that we have conducted an overall omnibus test of the goodness of fit (GoF), that we can refine our understanding of the differences color by color?

### Simultaneous confidence intervals for the M&M proportions (multiple comparisons)

Any sample is bound to have some random variability compared to the true population count or percentage. How can we use confidence intervals to help us understand whether the data are indicating simple random variation or whether the underlying population is different? By now you no doubt have thought of confidence intervals. We just need to compute the confidence interval for each color and then see whether the percentages provided by Mars lie inside or outside the confidence interval our sample generates. We would expect that if we ran our experiment 100 times with our sample size numbers for each color, the Mars number would lie *inside* the upper and lower limits of our confidence interval 95 times out of those 100 times.
If our data shows it outside the confidence interval, that is evidence of a statistically significant difference.

Ah, but there's a problem! We have 6 colors and we would like to test each color to see if it varies significantly. Assuming we want to have 95% confidence again, across all six colors, we are "cheating" if we compute a simple confidence interval and then run the test six times. It's analogous to rolling the die six times instead of once. The more tests we run, the more likely we are to find a difference even though none exists. (With six tests each run at alpha = 0.05, the chance of at least one spurious "significant" result is 1 - 0.95^6, or about 26%, not 5%.) We need to adjust our confidence to account for the fact that we are making multiple comparisons (a.k.a. simultaneous comparisons). Our confidence interval must be made wider (more conservative) to account for the fact we are making multiple simultaneous comparisons. Thank goodness the tools exist to do this for us. [As a matter of fact there is no one single way to make the adjustment… there are many](https://blogs.sas.com/content/iml/2017/02/15/confidence-intervals-multinomial-proportions.html). We're going to focus on `Goodman`.

In his original posting Rick used SAS scripts he had written for a previous blog post to overcome this challenge. As R users we have a few different packages for computing simultaneous confidence intervals (as well as the option of simply doing the calculations in base R). Bob Rudis took a look at several different choices in `R packages`, but one of the "better" ones, `CoinMinD`, does the computations nicely and then prints out the results (literally with `print()`) as opposed to returning data we can act upon. So he [made a new package](https://github.com/hrbrmstr/scimple) that does the same computations and returns tidy data frames for the confidence intervals. The package is much cleaner and it includes a function that can compute multiple SCIs and return them in a single data frame, similar to what `binom::binom.confint()` does.

Here are a couple examples of `scimple` in action.
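Before the package examples, a quick aside: neither piece of machinery here is magic. The family-wise error rate for k unadjusted tests is 1 - (1 - alpha)^k, and Goodman's limits have a closed form built on the Bonferroni-adjusted chi-square(1) quantile. A rough sketch in plain Python (used purely as a calculator; `goodman_ci` is an illustrative helper written for this post, not part of any package):

```python
from math import sqrt
from statistics import NormalDist

def goodman_ci(counts, alpha=0.05):
    """Goodman (1965) simultaneous CIs for multinomial proportions.

    Uses A = z^2, where z is the standard normal quantile at
    1 - alpha/(2k), i.e. the Bonferroni-adjusted chi-square(1) value.
    """
    N, k = sum(counts), len(counts)
    z = NormalDist().inv_cdf(1 - alpha / (2 * k))
    A = z * z
    intervals = []
    for n in counts:
        half = sqrt(A * (A + 4 * n * (N - n) / N))
        intervals.append(((A + 2 * n - half) / (2 * (N + A)),
                          (A + 2 * n + half) / (2 * (N + A))))
    return intervals

counts = [108, 133, 103, 139, 133, 96]  # Red, Orange, Yellow, Green, Blue, Brown

# Family-wise error rate if we naively ran six unadjusted tests at alpha = 0.05
fwer = 1 - (1 - 0.05) ** len(counts)
print(f"chance of at least one false positive: {fwer:.1%}")

for color, (lo, hi) in zip(["Red", "Orange", "Yellow", "Green", "Blue", "Brown"],
                           goodman_ci(counts)):
    print(f"{color:7s} [{lo:.4f}, {hi:.4f}]")
```

The printed Red interval, [0.1196, 0.1905], matches `scimp_goodman()`'s output for the first row to four decimal places.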
We’ll feed it the counts `mms$count` we have, and ask it to use the Goodman method for computing the confidence interval for each of the six colors assuming we want 95% confidence alpha = .05. For comparison we’ll also run the `Wald method with continuity correction`. The command is `scimp_goodman(mms$count, alpha=0.05)`. I’ve added a select statement to remove some columns for clarity. The `scimp_waldcc(mms$count, alpha=0.05)` shows you the more verbose output for Wald. ``` r scimp_goodman(mms$count, alpha=0.05) %>% select( `95% Lower`=lower_limit, `95% Upper`=upper_limit) %>% kable(align = "lrrrrr",caption = "Goodman Method") %>% kable_styling(full_width = FALSE) ``` <table class="table" style="width: auto !important; margin-left: auto; margin-right: auto;"> <caption> Goodman Method </caption> <thead> <tr> <th style="text-align:left;"> 95% Lower </th> <th style="text-align:right;"> 95% Upper </th> </tr> </thead> <tbody> <tr> <td style="text-align:left;"> 0.1196016 </td> <td style="text-align:right;"> 0.1905134 </td> </tr> <tr> <td style="text-align:left;"> 0.1513616 </td> <td style="text-align:right;"> 0.2282982 </td> </tr> <tr> <td style="text-align:left;"> 0.1133216 </td> <td style="text-align:right;"> 0.1828844 </td> </tr> <tr> <td style="text-align:left;"> 0.1590634 </td> <td style="text-align:right;"> 0.2372872 </td> </tr> <tr> <td style="text-align:left;"> 0.1513616 </td> <td style="text-align:right;"> 0.2282982 </td> </tr> <tr> <td style="text-align:left;"> 0.1045758 </td> <td style="text-align:right;"> 0.1721577 </td> </tr> </tbody> </table> ``` r scimp_waldcc(mms$count, alpha=0.05) %>% kable(align = "lrrrrr",caption = "Wald Continuity Correction") %>% kable_styling(full_width = FALSE) ``` <table class="table" style="width: auto !important; margin-left: auto; margin-right: auto;"> <caption> Wald Continuity Correction </caption> <thead> <tr> <th style="text-align:left;"> method </th> <th style="text-align:right;"> lower_limit </th> <th 
style="text-align:right;"> upper_limit </th> <th style="text-align:right;"> adj_ll </th> <th style="text-align:right;"> adj_ul </th> <th style="text-align:right;"> volume </th> </tr> </thead> <tbody> <tr> <td style="text-align:left;"> waldcc </td> <td style="text-align:right;"> 0.1246345 </td> <td style="text-align:right;"> 0.1787363 </td> <td style="text-align:right;"> 0.1246345 </td> <td style="text-align:right;"> 0.1787363 </td> <td style="text-align:right;"> 0 </td> </tr> <tr> <td style="text-align:left;"> waldcc </td> <td style="text-align:right;"> 0.1574674 </td> <td style="text-align:right;"> 0.2161281 </td> <td style="text-align:right;"> 0.1574674 </td> <td style="text-align:right;"> 0.2161281 </td> <td style="text-align:right;"> 0 </td> </tr> <tr> <td style="text-align:left;"> waldcc </td> <td style="text-align:right;"> 0.1181229 </td> <td style="text-align:right;"> 0.1712030 </td> <td style="text-align:right;"> 0.1181229 </td> <td style="text-align:right;"> 0.1712030 </td> <td style="text-align:right;"> 0 </td> </tr> <tr> <td style="text-align:left;"> waldcc </td> <td style="text-align:right;"> 0.1654077 </td> <td style="text-align:right;"> 0.2250417 </td> <td style="text-align:right;"> 0.1654077 </td> <td style="text-align:right;"> 0.2250417 </td> <td style="text-align:right;"> 0 </td> </tr> <tr> <td style="text-align:left;"> waldcc </td> <td style="text-align:right;"> 0.1574674 </td> <td style="text-align:right;"> 0.2161281 </td> <td style="text-align:right;"> 0.1574674 </td> <td style="text-align:right;"> 0.2161281 </td> <td style="text-align:right;"> 0 </td> </tr> <tr> <td style="text-align:left;"> waldcc </td> <td style="text-align:right;"> 0.1090419 </td> <td style="text-align:right;"> 0.1606210 </td> <td style="text-align:right;"> 0.1090419 </td> <td style="text-align:right;"> 0.1606210 </td> <td style="text-align:right;"> 0 </td> </tr> </tbody> </table> For each of the commands, back comes a `tibble` with six rows (one for each color) with the 
upper and lower bounds as well as other key data from the process for each method. Notice that the confidence interval width varies by color (row in the tibble) based on observed sample size and that the Goodman intervals are wider (more conservative) when you compare rows across tables with the Wald Continuity Correction method. The documentation on GitHub <https://github.com/hrbrmstr/scimple> that Bob Rudis provided has a nice graph that shows you the 6 different methods and how they would place the confidence intervals for the exact same observed data. Clearly YMMV depending on which method you choose. Armed with this great package that Bob provided let’s bind these corrected confidence intervals to the data we have and see if we can determine whether our intuitions about which colors are significantly different from the expected values are accurate… ``` r mms <- bind_cols(mms, scimp_goodman(mms$count, alpha=0.05)) mms %>% select(Color=color_name, Observed=count, Percent=prop, `95% Lower`=lower_limit, `95% Upper`=upper_limit, Expected=prop_2008) %>% kable(align=c("lrrrrr"), digits=3, caption="Simultaneous confidence Intervals (Goodman method)") %>% kable_styling(full_width = FALSE, position = "center") ``` <table class="table" style="width: auto !important; margin-left: auto; margin-right: auto;"> <caption> Simultaneous confidence Intervals (Goodman method) </caption> <thead> <tr> <th style="text-align:left;"> Color </th> <th style="text-align:right;"> Observed </th> <th style="text-align:right;"> Percent </th> <th style="text-align:right;"> 95% Lower </th> <th style="text-align:right;"> 95% Upper </th> <th style="text-align:right;"> Expected </th> </tr> </thead> <tbody> <tr> <td style="text-align:left;"> Red </td> <td style="text-align:right;"> 108 </td> <td style="text-align:right;"> 0.152 </td> <td style="text-align:right;"> 0.120 </td> <td style="text-align:right;"> 0.191 </td> <td style="text-align:right;"> 0.13 </td> </tr> <tr> <td 
style="text-align:left;"> Orange </td> <td style="text-align:right;"> 133 </td> <td style="text-align:right;"> 0.187 </td> <td style="text-align:right;"> 0.151 </td> <td style="text-align:right;"> 0.228 </td> <td style="text-align:right;"> 0.20 </td> </tr>
<tr> <td style="text-align:left;"> Yellow </td> <td style="text-align:right;"> 103 </td> <td style="text-align:right;"> 0.145 </td> <td style="text-align:right;"> 0.113 </td> <td style="text-align:right;"> 0.183 </td> <td style="text-align:right;"> 0.14 </td> </tr>
<tr> <td style="text-align:left;"> Green </td> <td style="text-align:right;"> 139 </td> <td style="text-align:right;"> 0.195 </td> <td style="text-align:right;"> 0.159 </td> <td style="text-align:right;"> 0.237 </td> <td style="text-align:right;"> 0.16 </td> </tr>
<tr> <td style="text-align:left;"> Blue </td> <td style="text-align:right;"> 133 </td> <td style="text-align:right;"> 0.187 </td> <td style="text-align:right;"> 0.151 </td> <td style="text-align:right;"> 0.228 </td> <td style="text-align:right;"> 0.24 </td> </tr>
<tr> <td style="text-align:left;"> Brown </td> <td style="text-align:right;"> 96 </td> <td style="text-align:right;"> 0.135 </td> <td style="text-align:right;"> 0.105 </td> <td style="text-align:right;"> 0.172 </td> <td style="text-align:right;"> 0.13 </td> </tr>
</tbody>
</table>

Hmm, the table shows that only blue (0.24) is outside the 95% confidence interval, with green (0.16) just barely inside its interval. The rest are all somewhere inside the confidence interval range. We could of course choose a less stringent or conservative method than Goodman, or we could choose an even stricter method! That exercise is left to you.

For now, though, I find the table of numbers hard to read and parse, so let's build a plot that hopefully makes our life a little easier. Later we'll make use of `ggimage` and some work that Bob did to make an even better plot.
``` r
mms %>%
  ggplot() +
  geom_segment(aes(x = lower_limit, xend = upper_limit,
                   y = color_name, yend = color_name,
                   color = official_color), size = 3) +
  geom_point(aes(prop, color_name, fill = official_color),
             size = 8, shape = 21, color = "white") +
  geom_point(aes(prop_2008, color_name, color = official_color),
             shape = "|", size = 8) +
  scale_x_percent(limits = c(0.095, 0.25)) +
  scale_color_identity(guide = FALSE) +
  scale_fill_identity(guide = FALSE) +
  labs(x = "Proportion", y = NULL,
       title = "Observed vs Expected 2008 for M&M Candies",
       subtitle = sprintf("95%% Simultaneous Confidence Intervals, [N=%d]", sum(mms$count)),
       caption = cap_src) +
  theme_bw()
```

![](/images/cisimple-1.png)<!-- -->

Ah, that's better. Sometimes a picture really is worth a thousand numbers… We can now clearly see the observed percent as a circle, the Goodman-adjusted confidence interval as a horizontal line, and the expected value from the 2008 Mars information as a vertical line.

### Plot twist – The Cleveland Comparison

So as it turns out, Rick, the original author at SAS, was able to make contact with the Mars Company and determine that there really was an explanation for the differences. It turns out some changes were made and there are actually two places where these M&M's might have originated, each with slightly different proportions! **Who knew? Right?**

Let's take the opportunity to take our new data and the `ggimage` package and plot the plot twist (pun intended). All credit to Bob for carefully constructing the right commands to ggplot to make this compelling graphic. All we have to do is add the Cleveland plant expected proportions as `cleveland_prop` to our data; since our observed counts haven't changed, our CIs remain the same.
``` r
url_base <- "http://www.mms.com/Resources/img/"

mms %>%
  mutate(imgs = sprintf("%s%s", url_base, imgs)) %>%
  mutate(cleveland_prop = c(0.131, 0.205, 0.135, 0.198, 0.207, 0.124)) %>%
  ggplot() +
  geom_segment(aes(x = lower_limit, xend = upper_limit,
                   y = color_name, yend = color_name,
                   color = official_color), size = 2) +
  geom_image(aes(prop, color_name, image = imgs), size = .10) +
  geom_point(aes(cleveland_prop, color_name, color = official_color),
             shape = "|", size = 6) +
  scale_x_percent(limits = c(0.095, 0.25)) +
  scale_color_identity(guide = FALSE) +
  scale_fill_identity(guide = FALSE) +
  labs(x = "Proportion", y = NULL,
       title = "Observed vs 2017 Proportions for M&M Candies",
       subtitle = sprintf("95%% Simultaneous Confidence Intervals, [N=%d]", sum(mms$count)),
       caption = cap_src) +
  theme_bw()
```

![](/images/cleveland-1.png)<!-- -->

Certainly a more intriguing graphic now that we let `ggimage` put the lentils in there for us…

I hope you've found this useful. I am always open to comments, corrections and suggestions. Chuck (ibecav at gmail dot com)

### License

<a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/"><img alt="Creative Commons License" style="border-width:0" src="https://i.creativecommons.org/l/by-sa/4.0/88x31.png" /></a><br />This work is licensed under a <a rel="license" href="http://creativecommons.org/licenses/by-sa/4.0/">Creative Commons Attribution-ShareAlike 4.0 International License</a>.
17.233592
204
0.6862
eng_Latn
0.937623
2de82c754a22d4e55d793c2841dfee4b453b9143
1,942
md
Markdown
_posts/2019-12-11-michelle-obama-neemt-het-op-voor-ellen-degeneres.md
wijnandb/mediumish
ceffee1fd0cf0559da52d0022eeb10640e95ad94
[ "MIT" ]
null
null
null
_posts/2019-12-11-michelle-obama-neemt-het-op-voor-ellen-degeneres.md
wijnandb/mediumish
ceffee1fd0cf0559da52d0022eeb10640e95ad94
[ "MIT" ]
null
null
null
_posts/2019-12-11-michelle-obama-neemt-het-op-voor-ellen-degeneres.md
wijnandb/mediumish
ceffee1fd0cf0559da52d0022eeb10640e95ad94
[ "MIT" ]
null
null
null
---
layout: post
title: "Michelle Obama stands up for Ellen DeGeneres"
date: Wed, 11 Dec 2019 07:45:47 +0100
category: entertainment
externe_link: "https://www.telegraaf.nl/entertainment/2087925483/michelle-obama-neemt-het-op-voor-ellen-de-generes"
image: "https://www.telegraaf.nl/images/1200x630/filters:format(jpeg):quality(80)/cdn-kiosk-api.telegraaf.nl/dd0cc3aa-1be1-11ea-ab0d-02d1dbdc35d1.jpg"
aantal: 236
unieke: 160
author: "Telegraaf"
---

<p class="intro">Ellen DeGeneres came under fire this fall over her friendship with George W. Bush. She is now getting support from former First Lady Michelle Obama, who is also friends with the former president.</p>
<p>Michelle Obama was interviewed on Tuesday by Today Show host Jenna Bush Hager, daughter of George W. The conversation turned to the criticism of Ellen DeGeneres. "I've been to funerals with your father, through highs and lows. We've shared stories about our children and our parents. Our values are the same," the former First Lady said of her bond with the Republican. "We disagree on policy, but we are not divided on humanity. We don't disagree about love and compassion. That goes for most people, but too often we lose ourselves in our fear of everything that is different."</p><p>In October, DeGeneres and Bush were filmed having a friendly chat at a football game. On social media the host was criticized for abandoning her liberal values by doing so. She responded in her talk show. "We seem to have forgotten that it's okay that we all think differently," DeGeneres said.</p><p>She was referring to the daily sign-off of her show, in which she encourages people to be kind to one another. "When I say be kind to one another, I don't mean only the people who think the way you do; I mean be kind to everyone."</p>
138.714286
1,488
0.794027
nld_Latn
0.999875
2de8615732249dde0afcd293556a109688bae1e8
512
md
Markdown
experiments/generic/pod-network-duplication/README.md
radudd/litmus-go
587fd107e6adbb9f0659c8c2d0854aa7c4069f4f
[ "Apache-2.0" ]
null
null
null
experiments/generic/pod-network-duplication/README.md
radudd/litmus-go
587fd107e6adbb9f0659c8c2d0854aa7c4069f4f
[ "Apache-2.0" ]
null
null
null
experiments/generic/pod-network-duplication/README.md
radudd/litmus-go
587fd107e6adbb9f0659c8c2d0854aa7c4069f4f
[ "Apache-2.0" ]
2
2021-08-03T10:53:05.000Z
2022-02-16T06:36:01.000Z
## Experiment Metadata

<table>
<tr>
<th> Name </th>
<th> Description </th>
<th> Documentation Link </th>
</tr>
<tr>
<td> Pod Network Duplication </td>
<td> This experiment causes network packet duplication using Pumba. It injects network duplication on the specified container by starting a traffic control (tc) process with netem rules. It can test the application's resilience to duplicated network packets. </td>
<td> <a href="https://docs.litmuschaos.io/docs/pod-network-duplication/"> Here </a> </td>
</tr>
</table>
32
248
0.71875
eng_Latn
0.871739
2de898415c824bff9e264a49ce1c4db61a01f848
5,301
md
Markdown
microsoft-365/commerce/subscriptions/important-information-e4.md
badbart/microsoft-365-docs-pr.de-DE
ab71005458c84989fc01d077c270a01a95656aa9
[ "CC-BY-4.0", "MIT" ]
null
null
null
microsoft-365/commerce/subscriptions/important-information-e4.md
badbart/microsoft-365-docs-pr.de-DE
ab71005458c84989fc01d077c270a01a95656aa9
[ "CC-BY-4.0", "MIT" ]
null
null
null
microsoft-365/commerce/subscriptions/important-information-e4.md
badbart/microsoft-365-docs-pr.de-DE
ab71005458c84989fc01d077c270a01a95656aa9
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Important information for Office 365 E4 customers
f1.keywords:
- NOCSH
ms.author: cmcatee
author: cmcatee-MSFT
manager: scotv
audience: Admin
ms.topic: article
ms.service: o365-administration
localization_priority: Normal
ms.collection:
- M365-subscription-management
- Adm_O365
- Adm_NonTOC
- commerce
ms.custom: customer-email
search.appverid:
- MET150
description: Important information about upgrading or changing plans for customers with an Office 365 E4 subscription.
ms.date: 08/14/2020
ROBOTS: NOINDEX, NOFOLLOW
ms.openlocfilehash: c48d4a71327ae5973054fd0d465719667dc0af80
ms.sourcegitcommit: 79065e72c0799064e9055022393113dfcf40eb4b
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 08/14/2020
ms.locfileid: "46690698"
---

# <a name="important-information-for-office-365-e4-customers"></a>Important information for Office 365 E4 customers

The Office 365 E4 plan was retired in 2015. As a Microsoft 365 global admin, you receive email updates and see posts in the Message center (part of the [Microsoft 365 admin center](https://go.microsoft.com/fwlink/p/?linkid=2024339)) with information about when you must take action.

## <a name="compare-your-options-for-upgrading-or-changing-plans"></a>Compare your options for upgrading or changing plans

You can keep the same functionality you have with E4, or take advantage of the new features and capabilities of Microsoft 365 and Microsoft 365 Audio Conferencing. You have five options for moving to a new plan.

| | Option 1: Upgrade to Office 365 E5 | Option 2: Upgrade to Microsoft 365 E5 | Option 3: Upgrade to Microsoft 365 E5 without Audio Conferencing | Option 4: Change to Office 365 E3 | Option 5: Change to Microsoft 365 E3 |
|-|-|-|-|-|-|
| **Get all of the services and features in your current Office 365 E4 subscription.** | Yes | Yes | Yes | No | No |
| **Phone numbers managed in Microsoft 365** | Yes | Yes | No | No | No |
| **Phone numbers managed both on-premises and in Microsoft 365 (hybrid deployment)**<br/><br/>Reduce your dependence on costly private branch exchange (PBX) phone systems and simplify your communications infrastructure with one platform for calling, conferencing, video, and sharing. | Yes | Yes | No | No | No |
| **Option to add calling plans**<br/><br/>You can add licenses for a Microsoft 365 Domestic Calling Plan, or a Microsoft 365 Domestic and International Calling Plan. For more information, see [Which Calling Plan is right for you?](https://docs.microsoft.com/MicrosoftTeams/calling-plan-landing-page). | Yes<sup>1</sup> | Yes | Yes | Yes | Yes |
| **Audio Conferencing**<br/><br/>Meeting attendees can dial in to Skype for Business meetings from almost any device, and organizers can dial out to pull attendees in. | Yes<sup>2</sup> | Yes | Yes | No | No |
| **Advanced tools for collaboration, analytics, and security** | Yes | Yes | Yes | No | No |
| **New interactive reports, dashboards, and data visualizations** | Yes | Yes | Yes | No | No |
| **More control over your data security and compliance, with built-in privacy and transparency and refined user controls** | Yes | Yes | Yes | No | Yes |

<sup>1</sup> Calling Plans in Microsoft 365 or Office 365 are available in 11 countries or regions, and are available to organizations with a billing address in one of those countries or regions. For more information, see [Manage phone numbers for your organization](https://docs.microsoft.com/microsoftteams/manage-phone-numbers-for-your-organization/manage-phone-numbers-for-your-organization). If you upgrade to Microsoft 365 or Office 365, or you have purchased Phone System, you have the option to add a Domestic, or Domestic and International, Calling Plan.

<sup>2</sup> Audio Conferencing isn't yet available in all countries or regions. If Audio Conferencing isn't available in your country or region, we'll upgrade you as soon as it becomes available. At that point, Microsoft 365 and Office 365 E5 customers can either upgrade immediately to an E5 subscription with Audio Conferencing, or wait until their term is extended. To find out which countries or regions support Audio Conferencing, see [Country and region availability for Audio Conferencing and Calling Plans](https://docs.microsoft.com/microsoftteams/country-and-region-availability-for-audio-conferencing-and-calling-plans/country-and-region-availability-for-audio-conferencing-and-calling-plans) and use the drop-down list to select your country or region.

## <a name="ready-to-upgrade"></a>Ready to upgrade?

When it's time to renew your subscription, consider upgrading to one of these plans. If you bought your subscription directly from Microsoft and want to upgrade now, see [Upgrade from an Office 365 E4 subscription](upgrade-Office-365-E4.md). You won't lose any data when you upgrade from E4 to a new plan.
93
824
0.797208
deu_Latn
0.984818
2de95365147ef0f721edd5f2cf5f4910b364e00e
4,361
md
Markdown
README.md
IC-hub/ProteinLM
58fbf1f674569cf814becf32f71dd0d8f0c592fa
[ "Apache-2.0" ]
null
null
null
README.md
IC-hub/ProteinLM
58fbf1f674569cf814becf32f71dd0d8f0c592fa
[ "Apache-2.0" ]
null
null
null
README.md
IC-hub/ProteinLM
58fbf1f674569cf814becf32f71dd0d8f0c592fa
[ "Apache-2.0" ]
null
null
null
# ProteinLM

We pretrain a protein language model based on the Megatron-LM framework, and then evaluate the pretrained model on TAPE (Tasks Assessing Protein Embeddings), which contains a set of five biologically relevant semi-supervised learning tasks. Our pretrained model achieves good performance on these tasks.

# Overview

Pre-trained models such as BERT have greatly advanced natural language processing, improving the performance of language models. Inspired by the similarity between amino acid sequences and text sequences, we apply the pre-trained language model approach to biological data.

# Guidance

We provide pretrain and finetune code in two separate folders. If you use the pretrained model we provide, you can simply download the checkpoint and follow the finetune guide. If you want to pretrain a model yourself, refer to the pretrain guide.

- Pretrain [README](./pretrain/README.md)
- Finetune [README](./tape/README.md)

# Project Structure

```
.
├── pretrain (protein language model pretrain)
│   ├── megatron (model folder)
│   ├── pretrain_tools (multi-node pretrain)
│   ├── protein_tools (data preprocess shells)
└── tape
    ├── conda_env (conda env in yaml format)
    ├── converter (converter script and model config files)
    ├── scripts (model generator, finetune)
    └── tape (tape model)
```

# Usage

As the structure above shows, there are two stages, as follows.

- Pretrain
  - Prepare dataset (`PFAM`)
  - Preprocess data
  - Pretrain
- Finetune
  - Convert the pretrained protein model checkpoint
  - Finetune on downstream tasks

Detailed explanations are given in each folder's readme.
# Downstream Tasks Performance

| Task | Metric | TAPE Transformer | ProteinLM (ours) |
|:-:|:-:|:-:|:-:|
| contact prediction | P@L/5 | 0.36 | **0.52** |
| remote_homology | Top 1 Accuracy | 0.21 | **0.26** |
| secondary_structure | Accuracy (3-class) | 0.73 | **0.75** |
| fluorescence | Spearman's rho | 0.68 | 0.68 |
| stability | Spearman's rho | 0.73 | **0.77** |

# Contact

If you have any problem using ProteinLM, feel free to contact [us](mailto:xiaoyiji18@mails.tsinghua.edu.cn).

# Reference

Our work is based on the following papers.

- [Evaluating Protein Transfer Learning with TAPE](https://arxiv.org/abs/1906.08230v1)
- [Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism](https://arxiv.org/abs/1909.08053v4)

Besides, part of the code is based on [Megatron-LM](https://github.com/NVIDIA/Megatron-LM) and [TAPE](https://github.com/songlab-cal/tape).

__Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism__

```
@article{DBLP:journals/corr/abs-1909-08053,
  author    = {Mohammad Shoeybi and Mostofa Patwary and Raul Puri and Patrick LeGresley and Jared Casper and Bryan Catanzaro},
  title     = {Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism},
  journal   = {CoRR},
  volume    = {abs/1909.08053},
  year      = {2019},
  url       = {http://arxiv.org/abs/1909.08053},
  archivePrefix = {arXiv},
  eprint    = {1909.08053},
  timestamp = {Tue, 24 Sep 2019 11:33:51 +0200},
  biburl    = {https://dblp.org/rec/journals/corr/abs-1909-08053.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```

__Evaluating Protein Transfer Learning with TAPE__

```
@article{DBLP:journals/corr/abs-1906-08230,
  author    = {Roshan Rao and Nicholas Bhattacharya and Neil Thomas and Yan Duan and Xi Chen and John F. Canny and Pieter Abbeel and Yun S. Song},
  title     = {Evaluating Protein Transfer Learning with {TAPE}},
  journal   = {CoRR},
  volume    = {abs/1906.08230},
  year      = {2019},
  url       = {http://arxiv.org/abs/1906.08230},
  archivePrefix = {arXiv},
  eprint    = {1906.08230},
  timestamp = {Sat, 23 Jan 2021 01:20:25 +0100},
  biburl    = {https://dblp.org/rec/journals/corr/abs-1906-08230.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
37.921739
317
0.665214
eng_Latn
0.860285
2de98d7f965e2e0cd327156eda2b2bdaff6157fa
1,852
md
Markdown
CHANGELOG.md
bharath-srinivas/aws-go
ef64b61fe944c5d79edf555e3f571bacd35fe4a8
[ "MIT" ]
8
2018-01-01T17:48:08.000Z
2018-02-12T16:10:15.000Z
CHANGELOG.md
bharath-srinivas/aws-go
ef64b61fe944c5d79edf555e3f571bacd35fe4a8
[ "MIT" ]
5
2018-01-08T07:50:49.000Z
2018-02-11T19:48:37.000Z
CHANGELOG.md
bharath-srinivas/aws-go
ef64b61fe944c5d79edf555e3f571bacd35fe4a8
[ "MIT" ]
2
2018-02-04T16:16:43.000Z
2018-02-05T11:39:22.000Z
# v0.5.2 (2018-03-13)

* added binary support for macOS

# v0.5.1 (2018-03-12)

* added clrscr function to s3 list. Closes #11

# v0.5.0 (2018-03-06)

* Project name has been officially changed from aws-go to nephele [**Breaking change**]
* added functionality to start/stop rds instance
* added color highlighting for instance states. Closes #3
* refactored: project name from aws-go to nephele

# v0.4.0 (2018-03-03)

* added functionality to list s3 objects
* added functionality to download a s3 object
* added functionality to download s3 objects in batch
* minor performance enhancements

# v0.3.0 (2018-02-12)

* added functionality to list based on filters
* added functionality to start or stop multiple instances
* added ec2 subcommand. Closes #4
* added unit tests for store
* added mocks for testing
* chore: added prerun hooks
* Fixes #5: added functionality to list s3 buckets
* fixed weird whitespace caused by word wrap function
* fixed: silence usage on errors. Closes #6
* fixed verb for bool type
* fixed import errors
* refactored store to be unit testable
* refactored functions to accommodate unit testing
* refactored: directory structure

# v0.2.1 (2018-01-26)

* added word wrap utility function. Closes #2
* added custom spinner and spinner colors
* added tablewriter for rendering lists
* added docstring for godoc generation
* added unit test for utils
* added Makefile
* added travis build
* added build and godoc badges
* refactored: go fmt
* refactored: changed vendor in build script

# v0.2.0 (2018-01-07)

* added upgrade and rds list feature
* added dependencies
* refactored installation script: display download progress
* refactored: use a version constant
* refactored: rename sqlite filename
* refactored: changed project directory structure
* fixed badge link in README

# v0.1.0 (2017-12-31)

* Initial release
28.492308
87
0.762419
eng_Latn
0.990969
2dea7f05ab0710661a1b331a06d6f9d90ba1c777
4,588
md
Markdown
README.md
lettleli/vscode-fortran-support
4f17f244280539ab8cc2448970c5c4b10cadd5f4
[ "MIT" ]
null
null
null
README.md
lettleli/vscode-fortran-support
4f17f244280539ab8cc2448970c5c4b10cadd5f4
[ "MIT" ]
null
null
null
README.md
lettleli/vscode-fortran-support
4f17f244280539ab8cc2448970c5c4b10cadd5f4
[ "MIT" ]
null
null
null
# Modern Fortran language support for VSCode

[![Build Status](https://travis-ci.org/krvajal/vscode-fortran-support.svg?branch=master)](https://travis-ci.org/krvajal/vscode-fortran-support) [![codecov](https://codecov.io/gh/krvajal/vscode-fortran-support/branch/master/graph/badge.svg)](https://codecov.io/gh/krvajal/vscode-fortran-support) [![MIT License](https://img.shields.io/npm/l/stack-overflow-copy-paste.svg?style=flat-square)](http://opensource.org/licenses/MIT) [![Installs](https://vsmarketplacebadge.apphb.com/installs/krvajalm.linter-gfortran.svg)](https://marketplace.visualstudio.com/items?itemName=krvajalm.linter-gfortran) [![GitHub release](https://img.shields.io/github/release/krvajal/vscode-fortran-support.svg)](https://GitHub.com/krvajal/vscode-fortran-support/releases/)

> This extension provides support for the Fortran programming language. It includes syntax highlighting, code snippets and linting based on `gfortran`. You can download the Visual Studio Code editor from [here](https://code.visualstudio.com/download).

## Features

- Syntax highlighting
- Code snippets
- Documentation on hover for intrinsic functions
- Code linting based on `gfortran` to show error wiggles in your code
- Code autocompletion (beta)
- Symbols provider

![symbol_nav](./doc/symbol_nav.png)

## Settings

You can control the include paths to be used by the linter with the `fortran.includePaths` setting.

```
{
    "fortran.includePaths": ["/usr/local/include", "/usr/local"]
}
```

By default the `gfortran` executable is assumed to be found in the path. In order to use a different one, or if it can't be found in the path, you can point the extension to a custom one with the `fortran.gfortranExecutable` setting.

```
{
    "fortran.gfortranExecutable": "/usr/local/bin/gfortran-4.7"
}
```

If you want to pass extra options to the `gfortran` executable or override the default one, you can use the setting `fortran.linterExtraArgs`. By default `-Wall` is the only option.

```
{
    "fortran.linterExtraArgs": ["-Wall"]
}
```

You can configure what kind of symbols will appear in the symbol list by using

```
{
    "fortran.symbols": ["function", "subroutine"]
}
```

The available options are

- "function"
- "subroutine"
- "variable"
- "module" (not supported yet)
- "program" (not supported yet)

and by default only functions and subroutines are shown.

You can also configure the case for Fortran intrinsics auto-complete by using

```
{
    "fortran.preferredCase": "lowercase" | "uppercase"
}
```

## Snippets

This is a list of some of the snippets included; if you would like additional snippets to be included, please let me know and I will add them.

#### Program skeleton

![program snippet](https://media.giphy.com/media/OYdq9BKYMOOdy/giphy.gif)

#### Module skeleton

![module snippet](https://media.giphy.com/media/3ohzdUNRuio5FfyF1u/giphy.gif)

## Error wiggles

To trigger code validations you must save the file first.

## Fortran Language Server (Experimental)

This extension uses a host of tools to provide the various language features. An alternative is to use a single language server that provides the same features. Set `fortran.useLanguageServer` to `true` to use the Fortran language server from [Chris Hansen](https://github.com/hansec/fortran-language-server) for features like Hover, Definition, Find All References, Signature Help, and Go to Symbol in File and Workspace.

- This is an experimental feature and is not available on Windows yet.
- Since only a single language server is spun up for a given VS Code instance, a multi-root setup does not work.
- If set to `true`, you will be prompted to install the Fortran language server. Once it is installed, you will have to reload the VS Code window. The language server will then be run by the Fortran extension in the background to provide the services needed for the above mentioned features.
- Every time you change the value of the setting `fortran.useLanguageServer`, you need to reload the VS Code window for it to take effect.

## Requirements

For the linter to work you need to have `gfortran` on your path, or wherever you configure it to be.

## Issues

Please report any issues and feature requests on the GitHub repo [here](https://github.com/krvajalmiguelangel/vscode-fortran-support/issues/new)

## Notice

The syntax highlight support was imported from a [TextMate bundle](https://github.com/textmate/fortran.tmbundle)

The idea of using `gfortran` comes from this awesome [fortran plugin](https://github.com/315234/SublimeFortran) for Sublime Text.

## LICENSE

MIT
37.917355
275
0.75959
eng_Latn
0.973217
2dea94f156d8185ad4a8dd20883abfb367e6b6cb
162
md
Markdown
README.md
PWr-Projects-For-Courses/Learning
de62a5063ee909293d47475874b620d32f7932f3
[ "MIT" ]
null
null
null
README.md
PWr-Projects-For-Courses/Learning
de62a5063ee909293d47475874b620d32f7932f3
[ "MIT" ]
null
null
null
README.md
PWr-Projects-For-Courses/Learning
de62a5063ee909293d47475874b620d32f7932f3
[ "MIT" ]
null
null
null
Machine Learning Systems ("Systemy uczące się")
===============================================

Repository for code used during the lab sessions.

Academic year 2014/2015, winter semester, Mon 11:15

Author: Filip Malczak
14.727273
49
0.722222
pol_Latn
0.999195
2deae1dca657de97a450459b25f48efda32214ab
31,271
md
Markdown
windows-driver-docs-pr/nfc/wi-fi-direct-paring-implementation.md
kyanha/windows-driver-docs
9f0e663b2964bbb63ffabb72645b516f1654aa9a
[ "CC-BY-4.0", "MIT" ]
1
2021-01-15T18:48:02.000Z
2021-01-15T18:48:02.000Z
windows-driver-docs-pr/nfc/wi-fi-direct-paring-implementation.md
loneicewolf/windows-driver-docs
e4124699c032be5dc2c8cfb332c3e666a02b71dd
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/nfc/wi-fi-direct-paring-implementation.md
loneicewolf/windows-driver-docs
e4124699c032be5dc2c8cfb332c3e666a02b71dd
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Wi-Fi direct pairing implementation
description: This section provides design guidelines and requirements for a peripheral device to participate in the Tap and Setup and Tap and Reconnect use cases.
ms.assetid: 1B729E9F-DF9F-4263-9F0B-5EDCF817D2C3
keywords:
- NFC
- near field communications
- proximity
- near field proximity
- NFP
ms.date: 04/20/2017
ms.localizationpriority: medium
---

# Wi-Fi direct pairing implementation

This section provides design guidelines and requirements for a peripheral device to participate in the Tap and Setup and Tap and Reconnect use cases.

>[!NOTE]
>The pairing implementation described in this topic is currently supported in Windows 8.1, for pairing to printer devices only.

>[!NOTE]
>Windows 10 also supports NFC to Wi-Fi Direct static connection handover through the Wi-Fi alliance’s Wi-Fi P2P Carrier Configuration Record. For more information, see [Wi-Fi Alliance](https://www.wi-fi.org/).

## Peripheral Wi-Fi Direct Device Pairing

During the tap, NFP receives pairing information from the connecting device. NFP passes the pairing information to Windows. Wi-Fi Direct devices follow the Wi-Fi Alliance Out-Of-Band (OOB) pairing procedure and the NFC Forum recommendations. Windows relies on a proprietary pairing message as defined below. Windows will prompt the user for consent, and if it is given, Windows will attempt to connect to each of the addresses, in order, until one succeeds. There is no further interaction between an NFP provider in the PC and the connecting device.

Using NFC as an example, unidirectional installation is accomplished by storing the pairing information in a static or passive NFC tag (an active NFC tag in static-emulation mode may also be used). Windows subscribes to this pairing information. An NFC-enabled NFP provider on the PC receives the connection information from the tag and passes this to Windows as a subscriber.
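The OOB data blob carried by the tag can be walked with a short parser sketch. The field layout follows the "OOB Header attribute format" and attribute tables later in this topic; the function name and returned dictionary are illustrative only and are not part of any Windows API.

```python
import struct


def parse_oob_blob(blob: bytes) -> dict:
    """Parse a Wi-Fi Direct OOB data blob as laid out in this topic's tables.

    Illustrative sketch only; field names follow the 'OOB Header attribute
    format' and 'OOB Device info attribute format' tables.
    """
    # OOB Header: Total Data Length (little-endian, includes its own 2 octets),
    # header Length, Version, and OOB Type.
    total_len = struct.unpack_from("<H", blob, 0)[0]
    assert total_len == len(blob), "Total Data Length covers the entire blob"
    hdr_len = struct.unpack_from("<H", blob, 2)[0]
    version, oob_type = blob[4], blob[5]

    # Attributes follow the header: ID (1 octet), Length (2 octets, LE), value.
    attrs = {}
    pos = 4 + hdr_len
    while pos < total_len:
        attr_id = blob[pos]
        attr_len = struct.unpack_from("<H", blob, pos + 1)[0]
        attrs[attr_id] = bytes(blob[pos + 3 : pos + 3 + attr_len])
        pos += 3 + attr_len

    # The first 6 octets of the Device Info attribute are the P2P Device Address.
    device_info = attrs.get(0x01, b"")
    mac = ":".join(f"{b:02x}" for b in device_info[:6])
    return {"version": version, "oob_type": oob_type, "mac": mac, "attrs": attrs}
```

Fed the 62 octets at offsets 54–115 of the example tag shown later in this topic, this yields version 0x10, OOB type 0x00 (unidirectional), and the P2P device address 01:23:34:ab:cd:ef.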
Upon receipt of the connection information, Windows performs the actual installation of the device in-band using device class specific techniques.

## Interoperability Requirements

To ensure interoperability across NFP providers, the pairing information should be encapsulated in a provider-specific message format. As described elsewhere in this document, there are no specific requirements for proximity technologies other than for NFC-enabled NFP providers.

Windows requires NFC-enabled NFP providers to support a specific NFC Forum–defined mechanism for conveying the Wi-Fi Direct OOB pairing information for unidirectional pairing. The NDEF message contains a first record with a TNF field value of 0x01 and a TYPE field that is equal to “Hs”, and an alternative carrier record that points to a Wi-Fi Direct Carrier Configuration Record. In this method, only the PAYLOAD of the NDEF record will be used.

## Unidirectional Pairing Using NFC for Wi-Fi Direct

This section provides more details on how NFC, Wi-Fi Direct, and Windows work together to support unidirectional wireless pairing for Wi-Fi Direct devices such as printers.

### NFP Provider References

Wi-Fi Direct pairing is accomplished using an NFC Forum standardized Connection Handover Select message type. The graphic below provides an overview of how a Connection Handover Select message is applied for Wi-Fi Direct device pairing, specifically NDEF records 3 and 4. The Handover Select message describes one or more “ac” or “Alternate Carrier” records. These records follow the Handover Select record sequentially and each have a well defined type. Finally, the message contains a Microsoft defined device pairing record which provides Windows with information about how to process the pairing operation.

![connection handover select message](images/handover.png)

### Wi-Fi Direct Device Pairing Message

In the sample use cases that follow, NFC type 2 tags are used as an illustrative example.
If it is necessary to use a different NFC tag type, the NDEF message must be properly encapsulated according to that tag definition.

| Field | Value | Description |
|-----------------------|--------------------------------------------------|---------------------------------------------------------------------------|
| TNF | 0x02 | Format of the Type field that follows. Media-type as defined in RFC 2046. |
| Type | 'application/vnd.ms-windows.wfd.oob' | New type string we define for this scenario. |
| Size of OOB data | WORD | Up to 64 KB of OOB data supported. |
| Wi-Fi Direct OOB data | &lt;blob of size indicated by previous field&gt; | Wi-Fi Direct OOB data as defined below. |

### Wi-Fi Direct OOB format

The following table describes the format of the Wi-Fi Direct OOB data. OOB Unidirectional Data may be transmitted by any unidirectional P2P OOB Device.

<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Attributes</th>
<th align="left">Attribute ID</th>
<th align="left">Required/Optional</th>
<th align="left">Note</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left"><p>OOB Header</p>
<p>See OOB Header attribute format table.</p></td>
<td align="left">N/A</td>
<td align="left">Required</td>
<td align="left">The OOB Header attribute shall be present in P2P OOB Data blob, and have its OOB Type value set to “OOB Unidirectional Provisioning Data”.</td>
</tr>
<tr class="even">
<td align="left"><p>OOB Device Info</p>
<p>See OOB Device info attribute format table.</p></td>
<td align="left">1</td>
<td align="left">Required</td>
<td align="left">This attribute must be present. It provides information about this P2P Device.</td>
</tr>
<tr class="odd">
<td align="left"><p>OOB Provisioning Info</p>
<p></p></td>
<td align="left">2</td>
<td align="left">Required</td>
<td align="left">This attribute must be present.
It provides provisioning information that this P2P Device is expecting to use.</td>
</tr>
<tr class="even">
<td align="left"><p>OOB Configuration Timeout</p>
<p></p></td>
<td align="left">5</td>
<td align="left">Required</td>
<td align="left">This attribute must be present. It provides information about how long this P2P Device will wait for a response over Wi-Fi Direct.</td>
</tr>
</tbody>
</table>

### OOB Header attribute format

| Field Name | Size (octets) | Value | Description |
|-------------------|---------------|----------|----------------------------------------------------------------------------------------------------------------|
| Total Data Length | 2 | Variable | Length of entire OOB Data Blob (including header). |
| Length | 2 | Variable | Length of the following fields in OOB header. |
| Version | 1 | 0x10 | Value identifying the version of this P2P OOB record. |
| OOB Type | 1 | Variable | Value identifying the type of OOB transaction. The specific value is defined in *OOB Transaction Types* table. |
| OUI | 0 or 3 | Variable | Vendor-specific OUI. This is an optional value. Must only be present when OOB Type is Vendor Specific. |
| OUI Type | 0 or 1 | Variable | Vendor-specific Type. This is an optional value. Must only be present when OOB Type is Vendor Specific. |

### OOB Transaction Types

| OOB Type (Hex) | Description |
|----------------|--------------------------------------|
| 0x00 | OOB Unidirectional Provisioning Data |
| 0x01 | OOB Provisioning Listener Data |
| 0x02 | OOB Provisioning Connector Data |
| 0x03 | OOB Reinvoke Data |
| 0x04-0xDC | Reserved |
| 0xDD | Vendor-Specific |
| 0xDE-0xFF | Reserved |

### OOB Device info attribute format

<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Field Name</th>
<th align="left">Size (octets)</th>
<th align="left">Value</th>
<th align="left">Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left">Attribute ID</td>
<td align="left">1</td>
<td align="left">1</td>
<td align="left">Identifying the type of P2P OOB attribute. The specific value is defined in the <em>P2P OOB Attributes</em> table.</td>
</tr>
<tr class="even">
<td align="left">Length</td>
<td align="left">2</td>
<td align="left">Variable</td>
<td align="left">Length of the following fields in the attribute.</td>
</tr>
<tr class="odd">
<td align="left">P2P Device Address</td>
<td align="left">6</td>
<td align="left">As defined in P2P Spec.</td>
<td align="left">An identifier used to uniquely reference a P2P Device.</td>
</tr>
<tr class="even">
<td align="left">Config Methods</td>
<td align="left">2</td>
<td align="left">As defined in P2P Spec.</td>
<td align="left"><p>The WSC Methods that are supported by this device.</p>
<div class="alert">
<strong>Note</strong>  Byte ordering within the Config Methods field shall be big-endian.
</div>
<div> </div></td>
</tr>
<tr class="odd">
<td align="left">Primary Device Type</td>
<td align="left">8</td>
<td align="left">As defined in P2P Spec.</td>
<td align="left"><p>Primary Device Type of the P2P Device.
Contains only the Data part of the WSC Primary Device Type attribute (excludes Attribute ID and Length fields).</p>
<div class="alert">
<strong>Note</strong>  Byte ordering within the Primary Device Type field shall be big-endian.
</div>
<div> </div></td>
</tr>
<tr class="even">
<td align="left">Device Capability Bitmap</td>
<td align="left">1</td>
<td align="left">As defined in P2P Spec.</td>
<td align="left">A set of parameters indicating P2P Device’s capabilities.</td>
</tr>
<tr class="odd">
<td align="left">Device Name</td>
<td align="left">Variable</td>
<td align="left">As defined in P2P Spec.</td>
<td align="left"><p>Friendly name of the P2P Device. Contains the entire WSC Device Name attribute TLV format.</p>
<div class="alert">
<strong>Note</strong>  Byte ordering within the Device Name field shall be big-endian.
</div>
<div> </div></td>
</tr>
</tbody>
</table>

### P2P OOB Attributes

| OOB Type (Hex) | Description |
|----------------|---------------------------|
| 0x00 | OOB Status |
| 0x01 | OOB Device Info |
| 0x02 | OOB Provisioning Info |
| 0x03 | OOB Group ID |
| 0x04 | OOB Listen Channel |
| 0x05 | OOB Configuration Timeout |
| 0x06-0xDC | Reserved |
| 0xDD | Vendor specific attribute |
| 0xDE-0xFF | Reserved |

### OOB Provisioning Info attribute format

| Field Name | Size (octets) | Value | Description |
|------------------------------|---------------|-------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Attribute ID | 1 | 2 | Identifying the type of P2P OOB attribute. The specific value is defined in *P2P OOB Attributes* table. |
| Length | 2 | Variable | Length of the following fields in the attribute. |
| Provisioning Settings Bitmap | 1 | Variable | A set of provisioning settings options, as defined in the *Provisioning settings* table. |
| Selected Config Method | 2 | As defined in P2P Spec. | The WSC Method that was selected by this P2P device for provisioning. |
| Pin Length | 1 | 0 - 8 | Number of bytes in the following PIN Data field. This field set to 0 indicates no additional PIN data. |
| Pin Data | Variable | n | This field is optional. This field is present only if the PIN Length field is not 0, and contains an array of octets which represent a PIN to be used for provisioning. |

### Provisioning settings

<table>
<colgroup>
<col width="33%" />
<col width="33%" />
<col width="33%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Bits(s)</th>
<th align="left">Information</th>
<th align="left">Notes</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left">0</td>
<td align="left">Create New Group</td>
<td align="left">The Create New Group bit is set to 1 if this provisioning info is for forming a new group with the target P2P device. Otherwise, this provisioning info is for joining an existing group.</td>
</tr>
<tr class="even">
<td align="left">1</td>
<td align="left">Enforce Group Type Setting</td>
<td align="left">The Enforce Group Type Setting bit is set to 1 if the Desired Group Type bit must be enforced. Otherwise, the Desired Group Type bit is simply a preference.</td>
</tr>
<tr class="odd">
<td align="left">2</td>
<td align="left">Desired Group Type</td>
<td align="left">The Desired Group Type bit shall be set to 0 if the Desired Group Type is transient, and shall be set to 1 if the Desired Group Type is persistent.</td>
</tr>
<tr class="even">
<td align="left">3 - 7</td>
<td align="left">Reserved</td>
<td align="left"></td>
</tr>
</tbody>
</table>

### OOB Configuration Timeout attribute format

| Field Name | Size (octets) | Value | Description |
|--------------------------------|---------------|---------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Attribute ID | 1 | 5 | Identifying the type of P2P OOB attribute. The specific value is defined in *P2P OOB Attributes* table. |
| Length | 2 | 1 | Length of the following fields in the attribute. |
| Listener Configuration Timeout | 1 | 0 - 255 | Amount of time this P2P device will spend waiting for Wi-Fi Direct communication after OOB data transfer, in units of 100 milliseconds. (Maximum of 25.5 seconds). |

### Windows Device Pairing Record

The Windows Device Pairing Record follows the NDEF specification. It provides additional information to Windows about how to process the Connection Handover Select message. The TNF and Type fields must be specified according to the NDEF specification. The other fields below will be sequentially listed in the Payload field of the NDEF record.

<table>
<colgroup>
<col width="25%" />
<col width="25%" />
<col width="25%" />
<col width="25%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Field Name</th>
<th align="left">Value</th>
<th align="left">Length Value</th>
<th align="left">Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left">TNF</td>
<td align="left">0x02</td>
<td align="left">3 bits</td>
<td align="left">Format of the Type field that follows.
Media-type as defined in RFC 2046.</td> </tr> <tr class="even"> <td align="left">Type</td> <td align="left">'application/vnd.ms-windows.devicepairing'</td> <td align="left">0x28 bytes</td> <td align="left">New type string we define for this scenario.</td> </tr> <tr class="odd"> <td align="left">MajorVersion</td> <td align="left">0x1</td> <td align="left">2 bytes</td> <td align="left">Major version is required to be 0x1.</td> </tr> <tr class="even"> <td align="left">MinorVersion</td> <td align="left">0x0</td> <td align="left">2 bytes</td> <td align="left">Minor version is required to be 0x0.</td> </tr> <tr class="odd"> <td align="left">Flags</td> <td align="left">0x0 or 0x01</td> <td align="left">4 bytes</td> <td align="left"><p>Set to 0x0 to try all transports.</p> <p>Set to 0x1 to attempt installation sequentially and stop after first success. Preference for transports is indicated by sequence of alternate carrier records.</p> <div class="alert"> <strong>Note</strong>  Values 0x0002 through 0x0064 are reserved. </div> <div> </div></td> </tr> <tr class="even"> <td align="left">Length of device friendly name</td> <td align="left">Length of device friendly name field.</td> <td align="left">1 byte</td> <td align="left">Length of Device friendly name.</td> </tr> <tr class="odd"> <td align="left">Device friendly name</td> <td align="left">UTF-8 encoded string up to 255 bytes.</td> <td align="left">Length of device friendly name</td> <td align="left">Friendly name for the device which will be shown in consent UI on the client.</td> </tr> </tbody> </table> ### Wi-Fi Direct “Just Works” ceremony, Static Connection Handover tag format As an example, the following is a typical implementation for an NFC passive tag. This corresponds to a static connection handover case with a Wi-Fi Direct carrier record, a network share printer, and the ms-device pairing record. This first table illustrates the format of the Wi-Fi Direct pairing portion of the tag. 
<table> <colgroup> <col width="25%" /> <col width="25%" /> <col width="25%" /> <col width="25%" /> </colgroup> <thead> <tr class="header"> <th align="left">Offset</th> <th align="left">Content</th> <th align="left">Length</th> <th align="left">Explanation</th> </tr> </thead> <tbody> <tr class="odd"> <td align="left">0</td> <td align="left">0x91</td> <td align="left">1</td> <td align="left"><p>NDEF Record Header:</p> <p>MB=1b, ME=0b, CF=0b, SR=1b, IL=0b, TNF=001b</p></td> </tr> <tr class="even"> <td align="left">1</td> <td align="left">0x02</td> <td align="left">1</td> <td align="left">Record Type Length: 2 octets</td> </tr> <tr class="odd"> <td align="left">2</td> <td align="left">0x0A</td> <td align="left">1</td> <td align="left">Record Type Length: 10 octets</td> </tr> <tr class="even"> <td align="left">3</td> <td align="left">0x48 0x73</td> <td align="left">2</td> <td align="left">Record Type: “Hs”</td> </tr> <tr class="odd"> <td align="left">5</td> <td align="left">0x12</td> <td align="left">1</td> <td align="left">Version Number: Major = 1, Minor = 2</td> </tr> <tr class="even"> <td align="left">6</td> <td align="left">0xD1</td> <td align="left">1</td> <td align="left"><p>NDEF Record Header:</p> <p>MB=1b, ME=1b, CF=0b, SR=1b, IL=0b, TNF=001b</p></td> </tr> <tr class="odd"> <td align="left">7</td> <td align="left">0x02</td> <td align="left">1</td> <td align="left">Record Type Length: 2 octets</td> </tr> <tr class="even"> <td align="left">8</td> <td align="left">0x04</td> <td align="left">1</td> <td align="left">Payload Length: 4 octets</td> </tr> <tr class="odd"> <td align="left">9</td> <td align="left">0x61 0x63</td> <td align="left">2</td> <td align="left">Record Type: “ac”</td> </tr> <tr class="even"> <td align="left">11</td> <td align="left">0x01</td> <td align="left">1</td> <td align="left">Carrier Flags: CPS=1, "active"</td> </tr> <tr class="odd"> <td align="left">12</td> <td align="left">0x01</td> <td align="left">1</td> <td align="left">Carrier Data 
Reference Length: 1 octet</td> </tr> <tr class="even"> <td align="left">13</td> <td align="left">0x30</td> <td align="left">1</td> <td align="left">Carrier Data Reference: “0”</td> </tr> <tr class="odd"> <td align="left">14</td> <td align="left">0x00</td> <td align="left">1</td> <td align="left">Auxiliary Data Reference Count: 0</td> </tr> <tr class="even"> <td align="left">15</td> <td align="left">0x1A</td> <td align="left">1</td> <td align="left"><p>NDEF Record Header:</p> <p>MB=0b, ME=0b, CF=0b, SR=1b, IL=1b, TNF=010b</p></td> </tr> <tr class="odd"> <td align="left">16</td> <td align="left">0x22</td> <td align="left">1</td> <td align="left">Record Type Name Length: 34 octets</td> </tr> <tr class="even"> <td align="left">17</td> <td align="left">0x3E</td> <td align="left">1</td> <td align="left">Payload Length: 62 octets</td> </tr> <tr class="odd"> <td align="left">18</td> <td align="left">0x01</td> <td align="left">1</td> <td align="left">Id Length: 1 octet</td> </tr> <tr class="even"> <td align="left">19</td> <td align="left"><p>0x61 0x70 0x70 0x6C</p> <p>0x69 0x63 0x61 0x74</p> <p>0x69 0x6F 0x6E 0x2F</p> <p>0x76 0x6E 0x64 0x2E</p> <p>0x6D 0x73 0x2D 0x77</p> <p>0x69 0x6E 0x64 0x6F</p> <p>0x77 0x73 0x2E 0x77</p> <p>0x66 0x64 0x2E 0x6F</p> <p>0x6F 0x62</p></td> <td align="left">34</td> <td align="left">Record Type Name: 'application/vnd.ms-windows.wfd.oob'</td> </tr> <tr class="odd"> <td align="left">53</td> <td align="left">0x30</td> <td align="left">1</td> <td align="left">Id: “0”</td> </tr> <tr class="even"> <td align="left">54</td> <td align="left">0x3E 0x00</td> <td align="left">2</td> <td align="left"><p>Wi-Fi Direct OOB data length: 62 octets. The length is read as an unsigned short and is inclusive</p> <p>of the entire blob. Includes 2 length octets. 
This value must be stored in little-endian format.</p></td> </tr> <tr class="odd"> <td align="left">56</td> <td align="left">0x02, 0x00</td> <td align="left">2</td> <td align="left">Header length: 2 octets</td> </tr> <tr class="even"> <td align="left">58</td> <td align="left">0x10</td> <td align="left">1</td> <td align="left">Version: 0x10</td> </tr> <tr class="odd"> <td align="left">59</td> <td align="left">0x00</td> <td align="left">1</td> <td align="left">OOB type: 0x00 (unidirectional)</td> </tr> <tr class="even"> <td align="left">60</td> <td align="left">0x01</td> <td align="left">1</td> <td align="left">Attribute: 0x01 (Device information attribute)</td> </tr> <tr class="odd"> <td align="left">61</td> <td align="left">0x22 0x00</td> <td align="left">2</td> <td align="left">Device information length: 34 octets</td> </tr> <tr class="even"> <td align="left">63</td> <td align="left"><p>0x01 0x23 0x34 0xab</p> <p>0xcd 0xef</p></td> <td align="left">6</td> <td align="left">Wi-Fi Direct P2P device MAC address: “01:23:34:ab:cd:ef”</td> </tr> <tr class="odd"> <td align="left">69</td> <td align="left">0x01 0x00</td> <td align="left">2</td> <td align="left">Config type</td> </tr> <tr class="even"> <td align="left">71</td> <td align="left"><p>0x00 0x01 0x00 0x50</p> <p>0xF2 0x00 0x00 0x00</p></td> <td align="left">8</td> <td align="left">Primary device type</td> </tr> <tr class="odd"> <td align="left">79</td> <td align="left">0x12</td> <td align="left">1</td> <td align="left">Capability</td> </tr> <tr class="even"> <td align="left">80</td> <td align="left">0x10 0x11</td> <td align="left">2</td> <td align="left">Attribute: Device name</td> </tr> <tr class="odd"> <td align="left">82</td> <td align="left">0x00 0x0d</td> <td align="left">2</td> <td align="left">Device name length: 13 octets</td> </tr> <tr class="even"> <td align="left">84</td> <td align="left"><p>0x43 0x6f 0x6e 0x74</p> <p>0x6f 0x73 0x6f 0x20</p> <p>0x4d 0x6f 0x75
0x73</p> <p>0x65</p></td> <td align="left">13</td> <td align="left"><p>Device friendly name in UTF-8. Note that there is no NULL terminating character and that UTF-8</p> <p>may be one or two bytes per character. This example reads “Contoso Mouse”</p></td> </tr> <tr class="odd"> <td align="left">97</td> <td align="left">0x02</td> <td align="left">1</td> <td align="left">Attribute: provisioning info</td> </tr> <tr class="even"> <td align="left">98</td> <td align="left">0x0c 0x00</td> <td align="left">2</td> <td align="left">Provisioning info length: 12 octets</td> </tr> <tr class="odd"> <td align="left">100</td> <td align="left">0x07</td> <td align="left">1</td> <td align="left">Setting bitmap: new group, enforce persistent</td> </tr> <tr class="even"> <td align="left">101</td> <td align="left">0x01 0x00</td> <td align="left">2</td> <td align="left">Config method: pin entry</td> </tr> <tr class="odd"> <td align="left">103</td> <td align="left">0x08</td> <td align="left">1</td> <td align="left">Pin length: 8 octets</td> </tr> <tr class="even"> <td align="left">104</td> <td align="left"><p>0x01 0x02 0x03 0x04</p> <p>0x05 0x06 0x07 0x08</p></td> <td align="left">8</td> <td align="left">Pin: “12345678”</td> </tr> <tr class="odd"> <td align="left">112</td> <td align="left">0x05</td> <td align="left">1</td> <td align="left">Attribute: Configuration timeout information</td> </tr> <tr class="even"> <td align="left">113</td> <td align="left">0x01 0x00</td> <td align="left">2</td> <td align="left">Configuration timeout length</td> </tr> <tr class="odd"> <td align="left">115</td> <td align="left">0x64</td> <td align="left">1</td> <td align="left">10 seconds, in 100 millisecond units</td> </tr> </tbody> </table> This second table illustrates the format of the network printer pairing portion of the tag. 
<table>
<colgroup> <col width="25%" /> <col width="25%" /> <col width="25%" /> <col width="25%" /> </colgroup>
<thead> <tr class="header"> <th align="left">Offset</th> <th align="left">Content</th> <th align="left">Length</th> <th align="left">Explanation</th> </tr> </thead>
<tbody>
<tr class="odd"> <td align="left">116</td> <td align="left">0x12</td> <td align="left">1</td> <td align="left"><p>NDEF record header:</p> <p>MB=0b, ME=0b, CF=0b, SR=1b, IL=0b, TNF=010b</p></td> </tr>
<tr class="even"> <td align="left">117</td> <td align="left">0x29</td> <td align="left">1</td> <td align="left">Type length field</td> </tr>
<tr class="odd"> <td align="left">118</td> <td align="left">0x19</td> <td align="left">1</td> <td align="left">Payload length field</td> </tr>
<tr class="even"> <td align="left">119</td> <td align="left"><p>0x61 0x70 0x70 0x6c</p> <p>0x69 0x63 0x61 0x74</p> <p>0x69 0x6f 0x6e 0x2f</p> <p>0x76 0x6e 0x64 0x2e</p> <p>0x6d 0x73 0x2d 0x77</p> <p>0x69 0x6e 0x64 0x6f</p> <p>0x77 0x73 0x2e 0x6e</p> <p>0x77 0x70 0x72 0x69</p> <p>0x6e 0x74 0x69 0x6e</p> <p>0x67 0x2e 0x6f 0x6f</p> <p>0x62</p></td> <td align="left">41</td> <td align="left">Record type name: “application/vnd.ms-windows.nwprinting.oob”</td> </tr>
<tr class="odd"> <td align="left">160</td> <td align="left"><p>0x5c 0x5c 0x70 0x72</p> <p>0x69 0x6e 0x74 0x53</p> <p>0x65 0x72 0x76 0x65</p> <p>0x72 0x5c 0x70 0x72</p> <p>0x69 0x6e 0x74 0x65</p> <p>0x72 0x4e 0x61 0x6d</p> <p>0x65</p></td> <td align="left">25</td> <td align="left">Printer name: “\\printServer\printerName”</td> </tr>
</tbody>
</table>

This third table illustrates the format of the MS-Device pairing portion of the tag.
<table>
<colgroup> <col width="25%" /> <col width="25%" /> <col width="25%" /> <col width="25%" /> </colgroup>
<thead> <tr class="header"> <th align="left">Offset</th> <th align="left">Content</th> <th align="left">Length</th> <th align="left">Explanation</th> </tr> </thead>
<tbody>
<tr class="odd"> <td align="left">185</td> <td align="left">0x52</td> <td align="left">1</td> <td align="left"><p>NDEF record header:</p> <p>MB=0b, ME=1b, CF=0b, SR=1b, IL=0b, TNF=010b</p></td> </tr>
<tr class="even"> <td align="left">186</td> <td align="left">0x28</td> <td align="left">1</td> <td align="left">Type length field</td> </tr>
<tr class="odd"> <td align="left">187</td> <td align="left">0x15</td> <td align="left">1</td> <td align="left">Payload length field</td> </tr>
<tr class="even"> <td align="left">188</td> <td align="left"><p>0x61 0x70 0x70 0x6c</p> <p>0x69 0x63 0x61 0x74</p> <p>0x69 0x6f 0x6e 0x2f</p> <p>0x76 0x6e 0x64 0x2e</p> <p>0x6d 0x73 0x2d 0x77</p> <p>0x69 0x6e 0x64 0x6f</p> <p>0x77 0x73 0x2e 0x64</p> <p>0x65 0x76 0x69 0x63</p> <p>0x65 0x70 0x61 0x69</p> <p>0x72 0x69 0x6E 0x67</p></td> <td align="left">40</td> <td align="left">Record type name: “application/vnd.ms-windows.devicepairing”</td> </tr>
<tr class="odd"> <td align="left">228</td> <td align="left">0x00 0x01 0x00 0x00</td> <td align="left">4</td> <td align="left">Version: Major = 1, Minor = 0</td> </tr>
<tr class="even"> <td align="left">232</td> <td align="left">0x00</td> <td align="left">1</td> <td align="left">Flags: Set to 0, try all transports</td> </tr>
<tr class="odd"> <td align="left">233</td> <td align="left">0x0F</td> <td align="left">1</td> <td align="left">Length of device friendly name</td> </tr>
<tr class="even"> <td align="left">234</td> <td align="left"><p>0x43 0x6f 0x6e 0x74</p> <p>0x6f 0x73 0x6f 0x20</p> <p>0x50 0x72 0x69 0x6e</p> <p>0x74 0x65 0x72</p></td> <td align="left">15</td> <td align="left">The device friendly name displayed to the user: “Contoso Printer”</td> </tr>
</tbody>
</table>

## Wi-Fi Direct Connectivity Requirements

Devices and clients must have the Wi-Fi radio turned on; if it is off, pairing will fail.

## Handling Edge Cases

If a user has previously paired a device but then manually removes it from the device list, tapping again results in a fresh attempt to install or pair.

If a user enters the range of actuation but leaves before the out-of-band (OOB) information is transferred, the device may become connectable, but the PC will not look for it. In this case no consent UI appears on the PC, and the user needs to tap again. If the device is already discoverable when it is tapped again, it should remain discoverable and should reset its timeout period.

For Wi-Fi Direct devices, installation will not succeed if the Wi-Fi radio turns off.

If a user taps two devices at approximately the same time, pairing is attempted only for the first OOB information received.

Tapping the device against a system running an operating system that doesn't support Tap to Setup or Tap to Reconnect may put the device into connectable mode, but pairing will not take place. In that case, users need to initiate pairing from the pairing UI provided for Bluetooth, using its pairing button.

## Related topics

[NFC device driver interface (DDI) reference](/windows-hardware/drivers/ddi/index)
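The record-header octets that appear throughout the tables above (0x1A, 0x12, 0x52) pack the MB/ME/CF/SR/IL flags and a 3-bit TNF into a single byte, and multi-octet lengths in the Wi-Fi Direct OOB blob are little-endian. A minimal decoding sketch (illustrative only, not code from this documentation):

```python
# Decode the one-octet NDEF record header.
# Field layout (MSB to LSB): MB, ME, CF, SR, IL, TNF (3 bits).
def decode_ndef_header(octet):
    return {
        "MB": (octet >> 7) & 1,   # Message Begin
        "ME": (octet >> 6) & 1,   # Message End
        "CF": (octet >> 5) & 1,   # Chunk Flag
        "SR": (octet >> 4) & 1,   # Short Record (1-octet payload length)
        "IL": (octet >> 3) & 1,   # ID Length field present
        "TNF": octet & 0x07,      # Type Name Format (010b = media type)
    }

# Two-octet lengths in the OOB blob, e.g. 0x3E 0x00 = 62, are little-endian.
def le16(two_octets):
    return int.from_bytes(two_octets, "little")
```

For example, the first record's header 0x1A decodes to SR=1, IL=1, TNF=010b, matching the table.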
---
layout: post
title: "Coach training course"
date: 2016-09-25 11:53:57
hours: ...-15:00
place: Puttlingen
categories: seminaire_puttlingen
inblog: false
note: all weekend
---
# excel_robot

### About

![alt text](robot.gif)

### Requirements

* raspimouse_ros : [https://github.com/ryuichiueda/raspimouse_ros](https://github.com/ryuichiueda/raspimouse_ros)
* tested on Ubuntu Linux 16.04 Mate on Raspberry Pi 3

### Software

First, download this repository.

```
git clone https://github.com/masamasa9841/excel_robot.git
```

Next, install tweepy.

```
pip install tweepy
```

### Usage

```
python twitter.py
```

Use Excel or a spreadsheet.

### License

This repository is licensed under the MIT license, see [LICENSE](./LICENSE).

### Reference

[ryuichiueda/raspimouse_ros](https://github.com/ryuichiueda/raspimouse_ros)
[夜の果て](http://yoruhate.arcanus.info/)
[きじとら](https://kijtra.com/article/twitter-api-for-google-apps-script-without-oauthconfig/)
[anyway making](http://net.univ-q.com/archives/38)
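The repository's `twitter.py` is not reproduced in this README; as a rough sketch of the tweepy flow a script like it might follow (every name below is hypothetical, and `OAuthHandler`/`update_status` are the classic tweepy v3 API):

```python
# Hypothetical sketch, not the repository's actual twitter.py.
def format_status(cell_ref, value):
    """Compose a tweet body from a spreadsheet cell reference and its value."""
    return f"{cell_ref}: {value}"

def post_status(text, api_key, api_secret, access_token, access_secret):
    """Post a status; requires real credentials and the tweepy package."""
    import tweepy  # imported here so the pure helper above works without it
    auth = tweepy.OAuthHandler(api_key, api_secret)
    auth.set_access_token(access_token, access_secret)
    tweepy.API(auth).update_status(text)

if __name__ == "__main__":
    print(format_status("A1", "hello"))
```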
---
title: 'Taipei Gluttony Trip (5D4N)'
date: 2014-05-02T11:30:00.000+08:00
draft: false
aliases: [ "/2014/05/5d4n.html" ]
tags : [lifestyle - escaping 852, travel - Taiwan・Taipei]
---

Ferocious eating; blissful hot-spring soaking (rawr rawr rawr)

day 1
[Lao Dong Beef Noodles](https://hidie.net/taipei1a/)
[Din Tai Fung xiaolongbao](https://hidie.net/taipei1b/)
[Ice Monster super mango milk shaved ice](https://hidie.net/taipei1c/)
[Taipei 101](https://hidie.net/taipei1d/)

day 2
[Ximending: oyster vermicelli](https://hidie.net/taipei2a/)
[Ximending: Xie Xie squid thick soup](https://hidie.net/taipei2b/)
[Ximending: Ya Rou Bian](https://hidie.net/taipei2c/)
[Ximending: Snow King ice cream](https://hidie.net/taipei2d/)
[Ximending: Lao Tian Lu braised snacks](https://hidie.net/taipei2e/)
[Ximending: Yang Ji peanut and corn shaved ice](https://hidie.net/taipei2f/)
[Ximending: Chengdu starfruit ice](https://hidie.net/taipei2g/)
[Opium Tapioca Balls](https://hidie.net/taipei2h/)
[Jun Yue pork chop](https://hidie.net/taipei2i/)
[Raohe Street Night Market: Shi Quan herbal pork ribs](https://hidie.net/taipei2j/)
[Raohe Street Night Market: QQ cold dumplings and agar jelly](https://hidie.net/taipei2k/)
[Raohe Street Night Market: fried cheese and grilled mochi](https://hidie.net/taipei2l/)
[Raohe Street Night Market: Xiang Chun pudding douhua](https://hidie.net/taipei2m/)
[Raohe Street Night Market: Fuzhou Shizu pepper buns](https://hidie.net/taipei2n/)

day 3
[Jiufen](https://hidie.net/taipei3a/)
[Jiufen: A-Gan Yi](https://hidie.net/taipei3b/)
[Jiufen: A-Mei Tea House](https://hidie.net/taipei3c/)
[Jiufen: scallion-oil rice cake in front of the post office](https://hidie.net/taipei3d/)
[Jiufen: red vinasse meatballs](https://hidie.net/taipei3e/)
[Jiufen: Yu Wan Bo Zai fish balls](https://hidie.net/taipei3f/)
[Jiufen: Jiudaokou beef noodles](https://hidie.net/taipei3g/)
[Ten Ren Tea bubble milk tea](https://hidie.net/taipei3h/)
[7-11 oden](https://hidie.net/taipei3i/)
[Meet Fresh](https://hidie.net/taipei3j/)

day 4
[Hot springs @ Beitou](https://hidie.net/taipei4a/)
[Tamsui: Tamsui fish balls](https://hidie.net/taipei4b/)
[Tamsui: Grandma's sour plum drink](https://hidie.net/taipei4c/)
[Sunset at Tamsui](https://hidie.net/taipei4d/)
[Shilin Night Market: moon shrimp cakes](https://hidie.net/taipei4e/)
[Shilin Night Market: Hot-Star giant fried chicken cutlet](https://hidie.net/taipei4f/)
[Shilin Night Market: small sausage in big sausage](https://hidie.net/taipei4g/)
[Shilin Night Market: oyster omelette](https://hidie.net/taipei4h/)
[Shilin Night Market: "frog eggs" tapioca](https://hidie.net/taipei4i/)
[Shilin Night Market: pan-fried rice noodle rolls](https://hidie.net/taipei4j/)

day 5
~ Early-morning flight, stifling yawns...

Couldn't bear to leave these behind, so bought them to take home:
[\[snacks\] 7-11 Shi An Farm hot-spring eggs](https://hidie.net/sevenonsenegg/)
[\[cake\] AMO Dutch royal handmade cake](https://hidie.net/amo/)
[\[cake\] Nan Man Tang honey castella](https://hidie.net/nanmantang/)
[\[smile\] Bai Ling Jie Ke magic tooth powder](https://hidie.net/smiling/)

![](/images/taipei5d4n.jpg)

Used up all my coins at the airport; only enough left to buy buns XDDD
---
layout: single
title: Predicting Service Churn
---

## Loading the Data

```python
import pandas as pd

train = pd.read_csv("https://raw.githubusercontent.com/Datamanim/datarepo/main/churn/train.csv")
x_test = pd.read_csv("https://raw.githubusercontent.com/Datamanim/datarepo/main/churn/test.csv")
sub = pd.read_csv("https://raw.githubusercontent.com/Datamanim/datarepo/main/churn/submission.csv")
```

```python
train.head()
```

<div>
<style scoped>
    .dataframe tbody tr th:only-of-type { vertical-align: middle; }
    .dataframe tbody tr th { vertical-align: top; }
    .dataframe thead th { text-align: right; }
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;"> <th></th> <th>RowNumber</th> <th>CustomerId</th> <th>Surname</th> <th>CreditScore</th> <th>Geography</th> <th>Gender</th> <th>Age</th> <th>Tenure</th> <th>Balance</th> <th>NumOfProducts</th> <th>HasCrCard</th> <th>IsActiveMember</th> <th>EstimatedSalary</th> <th>Exited</th> </tr>
</thead>
<tbody>
<tr> <th>0</th> <td>6842</td> <td>15793491</td> <td>Cherkasova</td> <td>714</td> <td>Germany</td> <td>Male</td> <td>26</td> <td>3</td> <td>119545.48</td> <td>2</td> <td>1</td> <td>0</td> <td>65482.94</td> <td>0</td> </tr>
<tr> <th>1</th> <td>8963</td> <td>15607874</td> <td>Keane</td> <td>687</td> <td>France</td> <td>Male</td> <td>38</td> <td>0</td> <td>144450.58</td> <td>1</td> <td>0</td> <td>1</td> <td>137276.83</td> <td>0</td> </tr>
<tr> <th>2</th> <td>7047</td> <td>15737627</td> <td>Rivero</td> <td>589</td> <td>Germany</td> <td>Female</td> <td>20</td> <td>2</td> <td>121093.29</td> <td>2</td> <td>1</td> <td>0</td> <td>3529.72</td> <td>0</td> </tr>
<tr> <th>3</th> <td>7503</td> <td>15697844</td> <td>Whitehouse</td> <td>721</td> <td>Spain</td> <td>Female</td> <td>32</td> <td>10</td> <td>0.00</td> <td>1</td> <td>1</td> <td>0</td> <td>136119.96</td> <td>1</td> </tr>
<tr> <th>4</th> <td>3439</td> <td>15722404</td> <td>Carpenter</td> <td>445</td> <td>France</td> <td>Female</td> <td>30</td> <td>3</td> <td>0.00</td> 
<td>2</td> <td>1</td> <td>1</td> <td>127939.19</td> <td>0</td> </tr> </tbody> </table> </div> ```python x_test.head() ``` <div> <style scoped> .dataframe tbody tr th:only-of-type { vertical-align: middle; } .dataframe tbody tr th { vertical-align: top; } .dataframe thead th { text-align: right; } </style> <table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> <th></th> <th>RowNumber</th> <th>CustomerId</th> <th>Surname</th> <th>CreditScore</th> <th>Geography</th> <th>Gender</th> <th>Age</th> <th>Tenure</th> <th>Balance</th> <th>NumOfProducts</th> <th>HasCrCard</th> <th>IsActiveMember</th> <th>EstimatedSalary</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>8438</td> <td>15591428</td> <td>Myers</td> <td>781</td> <td>France</td> <td>Male</td> <td>29</td> <td>9</td> <td>0.00</td> <td>2</td> <td>0</td> <td>0</td> <td>172097.40</td> </tr> <tr> <th>1</th> <td>5282</td> <td>15620372</td> <td>Cross</td> <td>687</td> <td>Spain</td> <td>Male</td> <td>31</td> <td>3</td> <td>0.00</td> <td>2</td> <td>0</td> <td>0</td> <td>48228.10</td> </tr> <tr> <th>2</th> <td>7112</td> <td>15572390</td> <td>Huang</td> <td>850</td> <td>Spain</td> <td>Female</td> <td>39</td> <td>6</td> <td>0.00</td> <td>2</td> <td>1</td> <td>0</td> <td>103921.43</td> </tr> <tr> <th>3</th> <td>912</td> <td>15746490</td> <td>Wollstonecraft</td> <td>648</td> <td>Spain</td> <td>Female</td> <td>53</td> <td>6</td> <td>111201.41</td> <td>1</td> <td>1</td> <td>1</td> <td>121542.29</td> </tr> <tr> <th>4</th> <td>1318</td> <td>15720702</td> <td>Shih</td> <td>789</td> <td>France</td> <td>Male</td> <td>37</td> <td>3</td> <td>0.00</td> <td>1</td> <td>1</td> <td>0</td> <td>121883.87</td> </tr> </tbody> </table> </div> ```python sub.head() ``` <div> <style scoped> .dataframe tbody tr th:only-of-type { vertical-align: middle; } .dataframe tbody tr th { vertical-align: top; } .dataframe thead th { text-align: right; } </style> <table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> 
<th></th> <th>0</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>0.0</td> </tr> <tr> <th>1</th> <td>0.0</td> </tr> <tr> <th>2</th> <td>0.0</td> </tr> <tr> <th>3</th> <td>0.0</td> </tr> <tr> <th>4</th> <td>0.0</td> </tr> </tbody> </table> </div> ## EDA ```python print(train.isnull().sum()) print(train.describe()) print(train.info()) ``` RowNumber 0 CustomerId 0 Surname 0 CreditScore 0 Geography 0 Gender 0 Age 0 Tenure 0 Balance 0 NumOfProducts 0 HasCrCard 0 IsActiveMember 0 EstimatedSalary 0 Exited 0 dtype: int64 RowNumber CustomerId CreditScore Age Tenure \ count 7999.000000 7.999000e+03 7999.000000 7999.000000 7999.000000 mean 4994.246281 1.569070e+07 650.417302 38.873984 5.030379 std 2884.590601 7.191692e+04 96.720179 10.436156 2.888475 min 1.000000 1.556570e+07 350.000000 18.000000 0.000000 25% 2481.000000 1.562820e+07 584.000000 32.000000 3.000000 50% 5007.000000 1.569046e+07 652.000000 37.000000 5.000000 75% 7477.500000 1.575283e+07 718.000000 44.000000 7.000000 max 9998.000000 1.581569e+07 850.000000 92.000000 10.000000 Balance NumOfProducts HasCrCard IsActiveMember \ count 7999.000000 7999.000000 7999.000000 7999.000000 mean 76350.874242 1.528941 0.705838 0.513564 std 62447.544585 0.581993 0.455694 0.499847 min 0.000000 1.000000 0.000000 0.000000 25% 0.000000 1.000000 0.000000 0.000000 50% 97259.250000 1.000000 1.000000 1.000000 75% 127331.880000 2.000000 1.000000 1.000000 max 250898.090000 4.000000 1.000000 1.000000 EstimatedSalary Exited count 7999.000000 7999.000000 mean 100097.840866 0.203650 std 57218.318399 0.402737 min 96.270000 0.000000 25% 51446.105000 0.000000 50% 99805.990000 0.000000 75% 149304.985000 0.000000 max 199992.480000 1.000000 <class 'pandas.core.frame.DataFrame'> RangeIndex: 7999 entries, 0 to 7998 Data columns (total 14 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 RowNumber 7999 non-null int64 1 CustomerId 7999 non-null int64 2 Surname 7999 non-null object 3 CreditScore 7999 non-null int64 4 Geography 7999 
non-null object 5 Gender 7999 non-null object 6 Age 7999 non-null int64 7 Tenure 7999 non-null int64 8 Balance 7999 non-null float64 9 NumOfProducts 7999 non-null int64 10 HasCrCard 7999 non-null int64 11 IsActiveMember 7999 non-null int64 12 EstimatedSalary 7999 non-null float64 13 Exited 7999 non-null int64 dtypes: float64(2), int64(9), object(3) memory usage: 875.0+ KB None ```python print(x_test.isnull().sum()) print(x_test.describe()) print(x_test.info()) ``` RowNumber 0 CustomerId 0 Surname 0 CreditScore 0 Geography 0 Gender 0 Age 0 Tenure 0 Balance 0 NumOfProducts 0 HasCrCard 0 IsActiveMember 0 EstimatedSalary 0 dtype: int64 RowNumber CustomerId CreditScore Age Tenure \ count 2001.000000 2.001000e+03 2001.000000 2001.000000 2001.000000 mean 4948.133933 1.569149e+07 646.515242 38.922039 5.044478 std 2856.444800 7.200359e+04 95.812499 10.449446 2.868888 min 7.000000 1.556570e+07 383.000000 18.000000 0.000000 25% 2485.000000 1.562985e+07 580.000000 32.000000 3.000000 50% 4974.000000 1.569140e+07 648.000000 37.000000 5.000000 75% 7318.000000 1.575607e+07 711.000000 44.000000 8.000000 max 9998.000000 1.581563e+07 850.000000 83.000000 10.000000 Balance NumOfProducts HasCrCard IsActiveMember \ count 2001.000000 2001.000000 2001.000000 2001.000000 mean 76947.597021 1.516242 0.694653 0.505747 std 62440.673306 0.582118 0.460670 0.500092 min 0.000000 1.000000 0.000000 0.000000 25% 0.000000 1.000000 0.000000 0.000000 50% 98064.970000 1.000000 1.000000 1.000000 75% 127818.520000 2.000000 1.000000 1.000000 max 213146.200000 4.000000 1.000000 1.000000 EstimatedSalary count 2001.000000 mean 100148.160270 std 57948.104675 min 106.670000 25% 50070.590000 50% 101371.050000 75% 149458.730000 max 199909.320000 <class 'pandas.core.frame.DataFrame'> RangeIndex: 2001 entries, 0 to 2000 Data columns (total 13 columns): # Column Non-Null Count Dtype --- ------ -------------- ----- 0 RowNumber 2001 non-null int64 1 CustomerId 2001 non-null int64 2 Surname 2001 non-null object 3 
CreditScore 2001 non-null int64
     4   Geography        2001 non-null   object
     5   Gender           2001 non-null   object
     6   Age              2001 non-null   int64
     7   Tenure           2001 non-null   int64
     8   Balance          2001 non-null   float64
     9   NumOfProducts    2001 non-null   int64
     10  HasCrCard        2001 non-null   int64
     11  IsActiveMember   2001 non-null   int64
     12  EstimatedSalary  2001 non-null   float64
    dtypes: float64(2), int64(8), object(3)
    memory usage: 203.4+ KB
    None

## Preprocessing

```python
train = train.iloc[:,3:]
x_test = x_test.iloc[:,3:]
train.head()
```

<div>
<style scoped>
    .dataframe tbody tr th:only-of-type { vertical-align: middle; }
    .dataframe tbody tr th { vertical-align: top; }
    .dataframe thead th { text-align: right; }
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;"> <th></th> <th>CreditScore</th> <th>Geography</th> <th>Gender</th> <th>Age</th> <th>Tenure</th> <th>Balance</th> <th>NumOfProducts</th> <th>HasCrCard</th> <th>IsActiveMember</th> <th>EstimatedSalary</th> <th>Exited</th> </tr>
</thead>
<tbody>
<tr> <th>0</th> <td>714</td> <td>Germany</td> <td>Male</td> <td>26</td> <td>3</td> <td>119545.48</td> <td>2</td> <td>1</td> <td>0</td> <td>65482.94</td> <td>0</td> </tr>
<tr> <th>1</th> <td>687</td> <td>France</td> <td>Male</td> <td>38</td> <td>0</td> <td>144450.58</td> <td>1</td> <td>0</td> <td>1</td> <td>137276.83</td> <td>0</td> </tr>
<tr> <th>2</th> <td>589</td> <td>Germany</td> <td>Female</td> <td>20</td> <td>2</td> <td>121093.29</td> <td>2</td> <td>1</td> <td>0</td> <td>3529.72</td> <td>0</td> </tr>
<tr> <th>3</th> <td>721</td> <td>Spain</td> <td>Female</td> <td>32</td> <td>10</td> <td>0.00</td> <td>1</td> <td>1</td> <td>0</td> <td>136119.96</td> <td>1</td> </tr>
<tr> <th>4</th> <td>445</td> <td>France</td> <td>Female</td> <td>30</td> <td>3</td> <td>0.00</td> <td>2</td> <td>1</td> <td>1</td> <td>127939.19</td> <td>0</td> </tr>
</tbody>
</table>
</div>

```python from sklearn.preprocessing import LabelEncoder le = LabelEncoder() train.Gender = le.fit_transform(train.Gender) x_test.Gender = 
le.transform(x_test.Gender) x_test.head() ``` <div> <style scoped> .dataframe tbody tr th:only-of-type { vertical-align: middle; } .dataframe tbody tr th { vertical-align: top; } .dataframe thead th { text-align: right; } </style> <table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> <th></th> <th>CreditScore</th> <th>Geography</th> <th>Gender</th> <th>Age</th> <th>Tenure</th> <th>Balance</th> <th>NumOfProducts</th> <th>HasCrCard</th> <th>IsActiveMember</th> <th>EstimatedSalary</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>781</td> <td>France</td> <td>1</td> <td>29</td> <td>9</td> <td>0.00</td> <td>2</td> <td>0</td> <td>0</td> <td>172097.40</td> </tr> <tr> <th>1</th> <td>687</td> <td>Spain</td> <td>1</td> <td>31</td> <td>3</td> <td>0.00</td> <td>2</td> <td>0</td> <td>0</td> <td>48228.10</td> </tr> <tr> <th>2</th> <td>850</td> <td>Spain</td> <td>0</td> <td>39</td> <td>6</td> <td>0.00</td> <td>2</td> <td>1</td> <td>0</td> <td>103921.43</td> </tr> <tr> <th>3</th> <td>648</td> <td>Spain</td> <td>0</td> <td>53</td> <td>6</td> <td>111201.41</td> <td>1</td> <td>1</td> <td>1</td> <td>121542.29</td> </tr> <tr> <th>4</th> <td>789</td> <td>France</td> <td>1</td> <td>37</td> <td>3</td> <td>0.00</td> <td>1</td> <td>1</td> <td>0</td> <td>121883.87</td> </tr> </tbody> </table> </div> ```python train.Geography = le.fit_transform(train.Geography) x_test.Geography = le.transform(x_test.Geography) x_test.head() ``` <div> <style scoped> .dataframe tbody tr th:only-of-type { vertical-align: middle; } .dataframe tbody tr th { vertical-align: top; } .dataframe thead th { text-align: right; } </style> <table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> <th></th> <th>CreditScore</th> <th>Geography</th> <th>Gender</th> <th>Age</th> <th>Tenure</th> <th>Balance</th> <th>NumOfProducts</th> <th>HasCrCard</th> <th>IsActiveMember</th> <th>EstimatedSalary</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>781</td> <td>0</td> <td>1</td> <td>29</td> 
<td>9</td> <td>0.00</td> <td>2</td> <td>0</td> <td>0</td> <td>172097.40</td> </tr> <tr> <th>1</th> <td>687</td> <td>2</td> <td>1</td> <td>31</td> <td>3</td> <td>0.00</td> <td>2</td> <td>0</td> <td>0</td> <td>48228.10</td> </tr> <tr> <th>2</th> <td>850</td> <td>2</td> <td>0</td> <td>39</td> <td>6</td> <td>0.00</td> <td>2</td> <td>1</td> <td>0</td> <td>103921.43</td> </tr> <tr> <th>3</th> <td>648</td> <td>2</td> <td>0</td> <td>53</td> <td>6</td> <td>111201.41</td> <td>1</td> <td>1</td> <td>1</td> <td>121542.29</td> </tr> <tr> <th>4</th> <td>789</td> <td>0</td> <td>1</td> <td>37</td> <td>3</td> <td>0.00</td> <td>1</td> <td>1</td> <td>0</td> <td>121883.87</td> </tr> </tbody> </table> </div> ```python from sklearn.preprocessing import MinMaxScaler sc = MinMaxScaler() train[["CreditScore", "Balance", "EstimatedSalary"]] = sc.fit_transform(train[["CreditScore", "Balance", "EstimatedSalary"]]) ``` ```python x_test[["CreditScore", "Balance", "EstimatedSalary"]] = sc.transform(x_test[["CreditScore", "Balance", "EstimatedSalary"]]) ``` ```python y_train = train.iloc[:,-1] x_train = train.iloc[:,:-1] x_train.head() ``` <div> <style scoped> .dataframe tbody tr th:only-of-type { vertical-align: middle; } .dataframe tbody tr th { vertical-align: top; } .dataframe thead th { text-align: right; } </style> <table border="1" class="dataframe"> <thead> <tr style="text-align: right;"> <th></th> <th>CreditScore</th> <th>Geography</th> <th>Gender</th> <th>Age</th> <th>Tenure</th> <th>Balance</th> <th>NumOfProducts</th> <th>HasCrCard</th> <th>IsActiveMember</th> <th>EstimatedSalary</th> </tr> </thead> <tbody> <tr> <th>0</th> <td>0.728</td> <td>1</td> <td>1</td> <td>26</td> <td>3</td> <td>0.476470</td> <td>2</td> <td>1</td> <td>0</td> <td>0.327103</td> </tr> <tr> <th>1</th> <td>0.674</td> <td>0</td> <td>1</td> <td>38</td> <td>0</td> <td>0.575734</td> <td>1</td> <td>0</td> <td>1</td> <td>0.686259</td> </tr> <tr> <th>2</th> <td>0.478</td> <td>1</td> <td>0</td> <td>20</td> <td>2</td> 
<td>0.482639</td> <td>2</td> <td>1</td> <td>0</td> <td>0.017176</td> </tr>
<tr> <th>3</th> <td>0.742</td> <td>2</td> <td>0</td> <td>32</td> <td>10</td> <td>0.000000</td> <td>1</td> <td>1</td> <td>0</td> <td>0.680472</td> </tr>
<tr> <th>4</th> <td>0.190</td> <td>0</td> <td>0</td> <td>30</td> <td>3</td> <td>0.000000</td> <td>2</td> <td>1</td> <td>1</td> <td>0.639546</td> </tr>
</tbody>
</table>
</div>

## Machine Learning and Performance Check

```python
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBRFClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report

rf = RandomForestClassifier()
xg = XGBRFClassifier()
xtr, xt, ytr, yt = train_test_split(x_train, y_train, test_size = 0.3, stratify = y_train)
rf.fit(xtr, ytr)
predrf = rf.predict(xt)
print(classification_report(yt, predrf))
print(confusion_matrix(yt, predrf))
```

                  precision    recall  f1-score   support

               0       0.86      0.96      0.91      1911
               1       0.72      0.41      0.52       489

        accuracy                           0.85      2400
       macro avg       0.79      0.69      0.72      2400
    weighted avg       0.83      0.85      0.83      2400

    [[1832   79]
     [ 287  202]]

```python
xg.fit(xtr, ytr)
predxg = xg.predict(xt)
print(classification_report(yt, predxg))
```

    /Users/hyosasiburi/opt/anaconda3/lib/python3.8/site-packages/xgboost/sklearn.py:888: UserWarning: The use of label encoder in XGBClassifier is deprecated and will be removed in a future release. To remove this warning, do the following: 1) Pass option use_label_encoder=False when constructing XGBClassifier object; and 2) Encode your labels (y) as integers starting with 0, i.e. 0, 1, 2, ..., [num_class - 1].
    warnings.warn(label_encoder_deprecation_msg, UserWarning)
    [20:33:32] WARNING: /opt/concourse/worker/volumes/live/7a2b9f41-3287-451b-6691-43e9a6c0910f/volume/xgboost-split_1619728204606/work/src/learner.cc:1061: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'. Explicitly set eval_metric if you'd like to restore the old behavior.

                  precision    recall  f1-score   support

               0       0.87      0.97      0.91      1911
               1       0.76      0.42      0.54       489

        accuracy                           0.85      2400
       macro avg       0.81      0.69      0.73      2400
    weighted avg       0.84      0.85      0.84      2400

```python
rf.fit(x_train, y_train)
pred = rf.predict_proba(x_test)
pred = pd.DataFrame(pred)
pred = pred.iloc[:,0]
pred
```

    0       1.00
    1       1.00
    2       0.99
    3       0.92
    4       0.33
            ...
    1996    0.97
    1997    0.88
    1998    0.76
    1999    0.85
    2000    0.37
    Name: 0, Length: 2001, dtype: float64

```python
sub = pd.read_csv("https://raw.githubusercontent.com/Datamanim/datarepo/main/churn/submission.csv")
sub["0"] = pred.values
sub.to_csv("003000214.csv", index = False)
pd.read_csv("003000214.csv")
```

<div>
<style scoped>
    .dataframe tbody tr th:only-of-type { vertical-align: middle; }
    .dataframe tbody tr th { vertical-align: top; }
    .dataframe thead th { text-align: right; }
</style>
<table border="1" class="dataframe">
<thead>
<tr style="text-align: right;"> <th></th> <th>0</th> </tr>
</thead>
<tbody>
<tr> <th>0</th> <td>1.00</td> </tr>
<tr> <th>1</th> <td>1.00</td> </tr>
<tr> <th>2</th> <td>0.99</td> </tr>
<tr> <th>3</th> <td>0.92</td> </tr>
<tr> <th>4</th> <td>0.33</td> </tr>
<tr> <th>...</th> <td>...</td> </tr>
<tr> <th>1996</th> <td>0.97</td> </tr>
<tr> <th>1997</th> <td>0.88</td> </tr>
<tr> <th>1998</th> <td>0.76</td> </tr>
<tr> <th>1999</th> <td>0.85</td> </tr>
<tr> <th>2000</th> <td>0.37</td> </tr>
</tbody>
</table>
<p>2001 rows × 1 columns</p>
</div>
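The notebook imports `roc_auc_score` but never calls it. As a side note on what that metric measures, here is a self-contained sketch (not code from the notebook): AUC is the probability that a randomly chosen positive example is ranked above a randomly chosen negative one.

```python
# Naive O(P*N) AUC: count positive/negative score pairs where the
# positive outranks the negative, with ties counted as half a win.
def auc(y_true, y_score):
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

On predicted class-1 probabilities this agrees with `sklearn.metrics.roc_auc_score`; for example, `auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8])` is 0.75.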