---
title: "Government"
citation: "George Washington"
date: 2020-10-25T12:06:00-05:00
---
"When the people fear the government, there is tyranny; when the government fears the people, there is liberty."
## REDA - Reproducible Electrical Data Analysis
[](https://travis-ci.org/geophysics-ubonn/reda)
[](https://opensource.org/licenses/MIT)
[](https://gitter.im/geophysics-ubonn/reda?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[](https://mybinder.org/v2/gh/geophysics-ubonn/try-reda/master?filepath=reda_test.ipynb)
*Latest release: 0.1.5 (28. June 2020)*
REDA is a scientific Python library for reproducible geoelectrical data
analysis. It aims to provide a unified interface for common and advanced data
processing steps while bridging the gap between a multitude of geoelectric
measurement devices and inversion codes used across the geophysical community.
It offers functionality to import, analyze, process, visualize, and export
geoelectrical data with particular emphasis on time-lapse functionality and
reproducibility. The latter is realized in the form of a logging system, which
keeps track of each individual processing step applied to particular data set
in a human-readable journal. REDA is platform compatible, tested and
open-source under the permissive MIT license. Any contributions from the
community are highly welcome.
REDA is a work-in-progress. Please contact us if you wish to use it or miss a
specific functionality. Please see the
[status page](https://geophysics-ubonn.github.io/reda/about.html#status-of-reda) for more
information.
### Installation
Install latest release from PyPI (https://pypi.org/project/reda/):

```bash
pip install reda
```

Install current development version from git:

```bash
pip install git+https://github.com/geophysics-ubonn/reda
```
or:
```bash
git clone https://github.com/geophysics-ubonn/reda
cd reda
# 1) Install dependencies with pip
pip install -r requirements.txt
# 2) or Install dependencies with conda
conda install --file requirements.txt
pip install .
# alternatively (can lead to versioning problems if multiple (development) versions are installed):
python setup.py install
```
### Documentation
An online version of the docs can be found here:
<https://geophysics-ubonn.github.io/reda>
### Contributing
We look forward to any type of contributions:
* code contribution
* example contributions
* documentation help
* issuing bug reports
If in doubt, use the gitter chat to contact us (click the Gitter badge above to
join the chat).
<!--
## Roadmap
Milestones for beta versions of the EDF framework. For detailed TODO items,
please refer to the TODO section down below.
### 0.1
- proof-of-concept for the ERT container
- proof-of-concept for the SIP container
- importers: Syscal, ABEM (text), SIP-04
- plots: histograms, pseudosections (regular, normal-vs-reciprocal), decay
curves
### 0.1.1
- proof-of-concept for the EIT container
- saving of containers to file
### 0.1.2
- logfile/log of applied filters/apply filters to other data sets
-->
# Developer Profile Generator
</br>
<p align="center">
<img src="https://img.shields.io/github/languages/count/kqarlos/developer-profile-generator?style=for-the-badge" alt="Languages" />
<img src="https://img.shields.io/github/languages/top/kqarlos/developer-profile-generator?style=for-the-badge" alt="Top Language" />
<img src="https://img.shields.io/github/languages/code-size/kqarlos/developer-profile-generator?style=for-the-badge" alt="Code Size" />
<img src="https://img.shields.io/github/repo-size/kqarlos/developer-profile-generator?style=for-the-badge" alt="Repo Size" />
<img src="https://img.shields.io/tokei/lines/github/kqarlos/developer-profile-generator?style=for-the-badge" alt="Total Lines" />
<img src="https://img.shields.io/github/package-json/dependency-version/kqarlos/developer-profile-generator/axios?style=for-the-badge" alt="Axios Version" />
<img src="https://img.shields.io/github/package-json/dependency-version/kqarlos/developer-profile-generator/electron?style=for-the-badge" alt="Electron Version" />
<img src="https://img.shields.io/github/package-json/dependency-version/kqarlos/developer-profile-generator/electron-html-to?style=for-the-badge" alt="Electron- html-to Version" />
<img src="https://img.shields.io/github/package-json/dependency-version/kqarlos/developer-profile-generator/inquirer?style=for-the-badge" alt="Inquirer Version" />
<img src="https://img.shields.io/github/last-commit/kqarlos/developer-profile-generator?style=for-the-badge" alt="Last Commit" />
<img src="https://img.shields.io/github/issues/kqarlos/developer-profile-generator?style=for-the-badge" alt="Issues" />
<img src="https://img.shields.io/github/followers/kqarlos?style=social" alt="Followers" />
</p>
## Description
Generates a GitHub user's profile in PDF format, given a GitHub username and a chosen favorite color.
## Table of Contents
* [Installation](#installation)
* [Usage](#usage)
* [Screenshots](#screenshots)
* [Snippets](#snippets)
* [Credits](#credits)
* [License](#license)
## Installation
Install dependencies and start the application in the command line:

```bash
npm i
node index.js
```
<p align="center">
<a href="https://kqarlos.github.io/developer-profile-generator"><img src="https://img.shields.io/badge/-👉 See Sample Profile-success?style=for-the-badge" alt="Live Site" /></a>
</p>
## Usage
### Screenshots
1. Generated PDF with username: _kqarlos_ and color choice: _red_

2. Command-line prompts with _inquirer_

### Snippets
1. Promise and callback
```javascript
getNumberOfStars(queryURL + "/repos").then(function (stars) {
userInfo = {
name: response.data.name,
location: response.data.location,
githubLink: response.data.html_url,
nOfRepos: response.data.public_repos,
nOfFollowers: response.data.followers,
nFollowing: response.data.following,
blogLink: response.data.blog,
nOfStars: stars,
pImage: response.data.avatar_url,
        favoriteColor: userInput.color,
bio: response.data.bio
}
console.log(userInfo);
generateHTML();
}).catch(function (err) {
console.log(err);
});
```
* This portion of code calls _getNumberOfStars()_, which returns a promise; once the promise resolves, execution continues in the callback function. In other words, the _userInfo_ object is set only after _getNumberOfStars()_ resolves, and once all the user info is set, we call the _generateHTML_ function.
2. _getNumberOfStars(repos_url)_
```javascript
function getNumberOfStars(repos_url) {
return new Promise(function (resolve, reject) {
if (repos_url == null)
return reject(Error("Invalid URL!"));
var count = 0;
axios.get(repos_url).then(function (response) {
response.data.forEach(element => {
count += element.stargazers_count;
});
resolve(count);
}).catch(function (error) {
console.log("Error", error.message);
});
});
}
```
* This function returns a promise because it contains an _axios.get()_ call that runs asynchronously. Returning a promise ensures that whatever code follows this function call executes only after the promise is resolved. This avoids issues such as referencing a variable that has not yet been set.
## Credits
### Author
- 💼 Carlos Toledo: [portfolio](https://kqarlos.github.io/)
- :octocat: Github: [kqarlos](https://www.github.com/kqarlos)
- LinkedIn: [carlos-toledo415](https://www.linkedin.com/in/carlos-toledo415/)
### Built With
</br>
<p align="center">
<a href="https://developer.mozilla.org/en-US/docs/Web/HTML"><img src="https://img.shields.io/badge/-HTML-orange?style=for-the-badge" alt="HMTL" /></a>
<a href="https://developer.mozilla.org/en-US/docs/Web/CSS"><img src="https://img.shields.io/badge/-CSS-blue?style=for-the-badge" alt="CSS" /></a>
<a href="https://www.javascript.com/"><img src="https://img.shields.io/badge/-Javascript-yellow?style=for-the-badge" alt="Javascript" /></a>
<a href="https://getbootstrap.com/"><img src="https://img.shields.io/badge/-Bootstrap-blueviolet?style=for-the-badge" alt="Bootstrap" /></a>
<a href="https://handlebarsjs.com/"><img src="https://img.shields.io/badge/-Inquirer-orange?style=for-the-badge" alt="Inquirer" /></a>
</p>
## License
</br>
<p align="center">
<img align="center" src="https://img.shields.io/github/license/kqarlos/developer-profile-generator?style=for-the-badge" alt="MIT license" />
</p>
# hxgo
A duell environment library for using Go libraries in Haxe. Uses the https://github.com/tardisgo/tardisgo transpiler to compile from Go to Haxe.
# Usage
Add the dependency:

```xml
<duelllib name="hxgo" version="0.0.0+" />
```

Configure the go sources you want to compile to Haxe, example:

```xml
<library-config>
    <hxgo>
        <go-src value="math" />
        <go-src value="strconv"/>
    </hxgo>
</library-config>
```
Use it! example:
```
import tardis.*;
class Main
{
public static function main()
{
Go_math_NNextafter.hx(2.0,3.0);
}
}
```
# Limitations
- Only tested on mac!
- There is a weird problem with big filenames, so if you see something wrong when Tardis runs, it's probably because of that.
# Does the processor have AES-NI enabled in BIOS?
`grep -o aes /proc/cpuinfo | wc -l`
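The same flag check can be scripted; a minimal Python sketch (the hard-coded sample stands in for real `/proc/cpuinfo` content, and the function name is illustrative, not part of any tool mentioned here):

```python
def has_aes_flag(cpuinfo_text):
    """Return True if any core's flags line lists the 'aes' CPU flag."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags") and ":" in line:
            # A flags line looks like: "flags : fpu vme de pse ... aes avx ..."
            if "aes" in line.split(":", 1)[1].split():
                return True
    return False

# The sample stands in for open("/proc/cpuinfo").read() on a real machine
sample = "processor\t: 0\nflags\t\t: fpu vme de pse aes avx\n"
print(has_aes_flag(sample))  # → True
```

Matching whole tokens (rather than substrings) avoids false positives from flags that merely contain "aes".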
# AES-NI enabled in Linux ?
`cpuid | grep -i aes | sort | uniq`
# Is AES-NI optimized driver loaded
`sort -u /proc/crypto | grep module | grep aes`
# Does openssl have aes-ni
`openssl engine`
must show
`(aesni) Intel AES-NI engine`
# higher MB/s if server has AES-NI support
```shell
dd if=/dev/zero count=1000 bs=1M | ssh -l sandeep -c aes128-cbc serverB "cat >/dev/null"
```
# openssl benchmark
compare output of
`openssl speed aes-256-cbc`
versus
`openssl speed -evp aes-256-cbc`
EVP (envelope encryption) functions auto-detect whether AES-NI is supported by the CPU.
# References
1. http://unix.stackexchange.com/questions/14077/how-to-check-that-aes-ni-is-supported-by-my-cpu
2. https://software.intel.com/en-us/articles/intel-advanced-encryption-standard-instructions-aes-ni/
3. https://www.cyberciti.biz/faq/how-to-find-out-aes-ni-advanced-encryption-enabled-on-linux-system/
# gtp3-doc-generator
Uses OpenAI GPT-3 to generate documentation for JavaScript code.
# Clustering Algorithms
[](http://travis-ci.org/rm-hull/clustering)
[](https://coveralls.io/r/rm-hull/clustering?branch=master)
[](https://versions.deps.co/rm-hull/clustering)
[](https://versions.deps.co/rm-hull/clustering)
[](https://clojars.org/rm-hull/clustering)
[]()
Implementation of K-Means, QT and Hierarchical clustering algorithms, in Clojure/Clojurescript.
### Pre-requisites
You will need [Leiningen](https://github.com/technomancy/leiningen) 2.7.1 or above installed.
### Building
To build and install the library locally, run:
```bash
$ cd clustering
$ lein test
$ lein install
```
### Including in your project
There is a version hosted at [Clojars](https://clojars.org/rm-hull/clustering).
For leiningen include a dependency:
```clojure
[rm-hull/clustering "0.2.0"]
```
For maven-based projects, add the following to your `pom.xml`:
```xml
<dependency>
<groupId>rm-hull</groupId>
<artifactId>clustering</artifactId>
<version>0.2.0</version>
</dependency>
```
## API Documentation
See [www.destructuring-bind.org/clustering](http://www.destructuring-bind.org/clustering/) for API details.
## High-level Overview
The main entry point to all the algorithms is the `cluster` function in each of
the `clustering.core.qt`, `clustering.core.k-means` and
`clustering.core.hierarchical` namespaces. Generally all cluster variants
require a distance function and a dataset (sequence/collection/...), where:
* the **distance** function should take two dataset items and perform
some scalar measure of distance between the two points. Typical applications
include euclidean distance, manhattan distance or pearson distance.
* the **dataset** should be a SEQable collection of data points that would be
clustered.
The K-means and hierarchical clustering algorithms also require an
**averaging** function that takes a number of dataset items and creates an
"average" based on those items.
### _N_-dimensional clustering
The below examples show distance and averaging functions on a 1-dimensional
dataset comprising of dates. For most numeric datasets, any of the distance
measures in `clustering.distance` would be sufficient; these implementations
can handle arbitrarily-large _n_-dimensional datasets.
For non-numeric data, it would be necessary to either provide a mapper method
to 'convert' non-numeric values into a meaningful numeric value (and then use
the provided distance functions), or implement your own custom distance function.
Likewise, there exists a generic `clustering.average` namespace that operates
on arbitrarily-large _n_-dimensional numeric datasets.
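For illustration, the distance and averaging measures described above are straightforward to express for numeric *n*-dimensional points; a language-agnostic sketch in Python (not part of the library, which is Clojure):

```python
import math

def euclidean(a, b):
    """Straight-line distance between two n-dimensional points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    """Sum of absolute per-coordinate differences."""
    return sum(abs(x - y) for x, y in zip(a, b))

def average(points):
    """Coordinate-wise mean of n-dimensional points."""
    return tuple(sum(xs) / len(xs) for xs in zip(*points))

print(euclidean((0, 0), (3, 4)))  # → 5.0
print(manhattan((0, 0), (3, 4)))  # → 7
print(average([(0, 0), (2, 4)]))  # → (1.0, 2.0)
```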
## Algorithms
### Quality Threshold (QT) clustering
From: https://sites.google.com/site/dataclusteringalgorithms/quality-threshold-clustering-algorithm-1
1. Initialize the threshold distance allowed for clusters and the
minimum cluster size.
2. Build a candidate cluster for each data point by including the
closest point, the next closest, and so on, until the distance
of the cluster surpasses the threshold.
3. Save the candidate cluster with the most points as the first true
cluster, and remove all points in the cluster from further
consideration.
4. Repeat with the reduced set of points until no more cluster can
be formed having the minimum cluster size.
> **NOTE:** QT clustering is computationally intensive and time
> consuming - increasing the minimum cluster size or increasing
> the number of data points can greatly increase the computational
> time.
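The procedure above reads directly as code; a minimal Python sketch of the QT loop (illustrative only — the library itself is Clojure, and this sketch assumes distinct data points):

```python
def qt_cluster(points, distance, threshold, min_size):
    """Quality-Threshold clustering: returns a list of clusters (lists of points)."""
    points = list(points)
    clusters = []
    while True:
        best = []
        for seed in points:
            # Grow a candidate cluster around `seed`, closest points first,
            # keeping the cluster's diameter within the threshold.
            candidate = [seed]
            for p in sorted(points, key=lambda p: distance(seed, p)):
                if p == seed:
                    continue
                if all(distance(p, q) <= threshold for q in candidate):
                    candidate.append(p)
            if len(candidate) > len(best):
                best = candidate
        if len(best) < min_size:
            return clusters
        # Save the largest candidate and remove its points from consideration
        clusters.append(best)
        points = [p for p in points if p not in best]

dist = lambda a, b: abs(a - b)
print(qt_cluster([1, 2, 3, 50, 51, 52, 99], dist, threshold=5, min_size=3))
# → [[1, 2, 3], [50, 51, 52]]
```

Note the nested loops: this is exactly why QT clustering gets expensive as the dataset grows.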
#### Worked example
We'll start by trying to cluster a simple one-dimensional set of dates:
```clojure
(require '[clustering.core.qt :as qt])
(require '[clj-time.core :refer [after? date-time interval in-days]])
(require '[clj-time.format :refer [unparse formatters]])
(def test-dataset
(hash-set
(date-time 2013 7 21)
(date-time 2013 7 25)
(date-time 2013 7 14)
(date-time 2013 7 31)
(date-time 2013 7 1)
(date-time 2013 8 3)
(date-time 2012 12 26)
(date-time 2012 12 28)
(date-time 2013 1 16)
(date-time 2012 6 2)
(date-time 2012 6 7)
(date-time 2012 6 6)
(date-time 2012 6 9)
(date-time 2012 5 28)))
```
In order to use the QT clustering algorithm, we need to first define some
measure of distance between data-points; this is quite easy for dates:
```clojure
(defn distance [dt-a dt-b]
(if (after? dt-a dt-b)
(distance dt-b dt-a)
(in-days (interval dt-a dt-b))))
(distance
(date-time 2012 12 26)
(date-time 2013 1 16))
; => 21
```
For convenience, let's also define a date formatter:
```clojure
(def fmt (partial unparse (formatters :date)))
(fmt (date-time 2019 2 19))
; => "2019-02-19"
```
To split these into clusters (with a minimum cluster size of 3), grouped
roughly into months:
```clojure
(def groups (qt/cluster distance test-dataset 31 3))
(count groups)
; => 3
(map fmt (sort (groups 0)))
; => ("2013-07-01" "2013-07-14" "2013-07-21" "2013-07-25" "2013-07-31" "2013-08-03")
(map fmt (sort (groups 1)))
; => ("2012-05-28" "2012-06-02" "2012-06-06" "2012-06-07" "2012-06-09")
(map fmt (sort (groups 2)))
; => ("2012-12-26" "2012-12-28" "2013-01-16")
```
### K-Means clustering
K-means clustering aims to partition _n_ observations into _k_ clusters in
which each observation belongs to the cluster with the nearest mean, serving as
a prototype of the cluster. This results in a partitioning of the data space
into Voronoi cells.
K-means clustering is an NP-hard problem, but can be simply implemented using
the iterative refinement technique outlined below.
1. Pick k points called means. This is called initialization.
2. Associate each input point with the mean that is closest to it. We obtain k
clusters of points, and we refer to this process as classifying the points.
3. Update each mean to have the average value of the corresponding cluster.
4. If the k means have significantly changed, go back to step 2. If they did
not, we say that the algorithm converged.
5. The k means represent different clusters -- every point is in the cluster
corresponding to the closest mean.
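The iterative refinement above can be sketched compactly; an illustrative Python version (not the library's Clojure implementation):

```python
import random

def k_means(points, distance, average, k, iterations=20):
    """Lloyd's algorithm: returns k clusters (lists of points)."""
    means = random.sample(points, k)  # 1. initialization
    for _ in range(iterations):
        # 2. classify: attach each point to its nearest mean
        clusters = [[] for _ in means]
        for p in points:
            nearest = min(range(k), key=lambda i: distance(p, means[i]))
            clusters[nearest].append(p)
        # 3. update: move each mean to the average of its cluster
        new_means = [average(c) if c else means[i] for i, c in enumerate(clusters)]
        if new_means == means:  # 4. converged
            break
        means = new_means
    return clusters

random.seed(1)
dist = lambda a, b: abs(a - b)
avg = lambda xs: sum(xs) / len(xs)
clusters = k_means([1, 2, 3, 50, 51, 52], dist, avg, k=2)
print(sorted(sorted(c) for c in clusters))  # → [[1, 2, 3], [50, 51, 52]]
```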
#### Worked Example
Using the same one-dimensional dataset as the previous example, but
instead requiring the `clustering/k-means` namespace:
```clojure
(require '[clustering.core.k-means :as k-means])
(require '[clj-time.core :refer [after? date-time interval in-days]])
(require '[clj-time.format :refer [unparse formatters]])
(require '[clj-time.coerce :refer [to-long from-long]])
(def test-dataset
(hash-set
(date-time 2013 7 21)
(date-time 2013 7 25)
...)))
```
We still need a distance measure between two dates, but additionally, the
K-Means algorithm also requires a function capable of averaging across a
range of dates:
```clojure
(defn distance [dt-a dt-b]
(if (after? dt-a dt-b)
(distance dt-b dt-a)
(in-days (interval dt-a dt-b))))
(defn average [dates]
(from-long
(/ (reduce + (map to-long dates)) (count dates))))
```
The algorithm is initialised with some mean values randomly selected from
the dataset. Note that the desired number of clusters must be specified
up-front (in this case, 3):
```clojure
(def means (k-means/init-means 3 test-dataset))
(def groups (k-means/cluster distance average test-dataset means 0))
(count groups)
; => 3
(map fmt (sort (groups 0)))
; => ("2012-12-26" "2012-12-28" "2013-01-16")
(map fmt (sort (groups 1)))
; => ("2013-07-01" "2013-07-14" "2013-07-21" "2013-07-25" "2013-07-31" "2013-08-03")
(map fmt (sort (groups 2)))
; => ("2012-05-28" "2012-06-02" "2012-06-06" "2012-06-07" "2012-06-09")
```
_(Note: Because of the random initialisation of the means, the cluster orderings
will be different each time evaluation occurs.)_
### Hierarchical clustering
Agglomerative clustering seeks to pair up nearest points (according to a chosen
distance measurement) into a cluster, progressively merging clusters into a
hierarchy, until there only is a single cluster left.
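A compact illustrative sketch in Python (not the library's implementation; it merges the closest pair of clusters, representing each merged cluster by the average of its parts):

```python
def hierarchical(points, distance, average):
    """Agglomerative clustering; returns the hierarchy as nested tuples."""
    # Each entry: (representative_value, tree); leaves are bare points
    clusters = [(p, p) for p in points]
    while len(clusters) > 1:
        # Find the closest pair of cluster representatives
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: distance(clusters[ij[0]][0], clusters[ij[1]][0]),
        )
        (ri, ti), (rj, tj) = clusters[i], clusters[j]
        # Representative of the merge = average of the two parts (a simplification)
        merged = (average([ri, rj]), (ti, tj))
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]
    return clusters[0][1]

dist = lambda a, b: abs(a - b)
avg = lambda xs: sum(xs) / len(xs)
print(hierarchical([1, 2, 10, 11], dist, avg))  # → ((1, 2), (10, 11))
```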
#### Worked Example
As before, using the date example, we need the `distance` and `average` functions
as defined previously:
```clojure
(require '[clustering.core.hierarchical :as hier])
(require '[clustering.data-viz.image :refer :all])
(require '[clustering.data-viz.dendrogram :as dendrogram])
(require '[clj-time.core :refer [after? date-time interval in-days]])
(require '[clj-time.format :refer [unparse formatters]])
(require '[clj-time.coerce :refer [to-long from-long]])
(def test-dataset
(hash-set
(date-time 2013 7 21)
(date-time 2013 7 25)
...)))
(defn distance [dt-a dt-b]
...)
(defn average [dates]
...)
```
Rather than returning a vector of clusters, hierarchical clustering returns a
single cluster object with left and right sub-parts that require recursive
traversal, most easily demonstrated with a suitable data visualization, such as
a [dendrogram](https://en.wikipedia.org/wiki/Dendrogram):
```clojure
(def groups (hier/cluster distance average test-dataset))
(write-png
  "doc/dendrogram.png"
  (dendrogram/->img groups fmt))

(spit
  "doc/dendrogram.svg"
  (dendrogram/->svg groups fmt))
```

## More Examples
Further examples can be found in the
https://github.com/rm-hull/clustering/tree/master/test/clustering/examples
directory.
### Word Similarities
Taking a list of sampled dictionary words and using the Levenshtein distance to
cluster, the hierarchical clustering algorithm produces the following
dendrogram:

Substituting different distance metrics (see [clj-fuzzy](http://yomguithereal.github.io/clj-fuzzy/clojure.html))
would give different (and maybe more interesting) cluster clumps.
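The Levenshtein distance used above is itself easy to sketch; an illustrative Python version of the classic dynamic-programming formulation:

```python
def levenshtein(a, b):
    """Minimum number of single-character edits turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # → 3
```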
### Baseball: Team & League Standard Batting
[baseball-reference.com](http://www.baseball-reference.com/) has lots of
interesting historical statistics for all major league games, one of which is
the [2015 National League](http://www.baseball-reference.com/leagues/NL/2015.shtml#teams_standard_batting::none):
|Tm|#Bat|BatAge|R/G|G|PA|AB|R|H|2B|3B|HR|RBI|SB|CS|BB|SO|BA|OBP|SLG|OPS|OPS+|TB|GDP|HBP|SH|SF|IBB|LOB|
|---|---:|-----:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|
|ARI|50|26.6|4.44|162|6276|5649|720|1494|289|48|154|680|132|44|490|1312|.264|.324|.414|.738|96|2341|134|33|46|57|40|1153|
|ATL|60|28.8|3.54|162|6034|5420|573|1361|251|18|100|548|69|33|471|1107|.251|.314|.359|.674|88|1948|148|44|67|31|39|1145|
|CHC|50|26.9|4.25|162|6200|5491|689|1341|272|30|171|657|95|37|567|1518|.244|.321|.398|.719|97|2186|101|74|32|35|47|1165|
|CIN|50|29.5|3.95|162|6196|5571|640|1382|257|27|167|613|134|38|496|1255|.248|.312|.394|.706|92|2194|112|42|47|40|38|1148|
|COL|51|28.0|4.55|162|6071|5572|737|1479|274|49|186|702|97|43|388|1283|.265|.315|.432|.748|89|2409|114|33|44|34|47|1016|
|LAD|55|29.6|4.12|162|6090|5385|667|1346|263|26|187|638|59|34|563|1258|.250|.326|.413|.739|107|2222|135|60|49|30|31|1121|
|MIA|51|27.9|3.78|162|5988|5463|613|1420|236|40|120|575|112|45|375|1150|.260|.310|.384|.694|91|2096|133|39|71|40|30|1059|
|MIL|49|28.1|4.04|162|6024|5480|655|1378|274|34|145|624|84|29|412|1299|.251|.307|.393|.700|90|2155|130|41|55|34|35|1026|
|NYM|49|28.5|4.22|162|6145|5527|683|1351|295|17|177|654|51|25|488|1290|.244|.312|.400|.712|98|2211|130|68|29|32|42|1098|
|PHI|50|28.0|3.86|162|6053|5529|626|1374|272|37|130|586|88|32|387|1274|.249|.303|.382|.684|86|2110|119|54|53|29|20|1066|
|PIT|46|28.2|4.30|162|6285|5631|697|1462|292|27|140|661|98|45|461|1322|.260|.323|.396|.719|98|2228|115|89|63|41|46|1166|
|SDP|46|27.7|4.01|162|6019|5457|650|1324|260|36|148|623|82|29|426|1327|.243|.300|.385|.685|92|2100|108|40|52|42|22|1028|
|SFG|48|28.9|4.30|162|6153|5565|696|1486|288|39|136|663|93|36|457|1159|.267|.326|.406|.732|102|2260|142|49|45|37|30|1130|
|STL|46|28.4|3.99|162|6139|5484|647|1386|288|39|137|619|69|38|506|1267|.253|.321|.394|.716|95|2163|128|66|39|42|47|1152|
|WSN|44|28.4|4.34|162|6117|5428|703|1363|265|13|177|665|57|23|539|1344|.251|.321|.403|.724|95|2185|129|44|55|51|38|1114|
|LgAvg|48|28.2|4.11|162|6119|5510|666|1396|272|32|152|634|88|35|468|1278|.253|.316|.397|.713|94|2187|125|52|50|38|37|1106|
| |714|28.2|4.11|2430|91790|82652|9996|20947|4076|480|2275|9508|1320|531|7026|19165|.253|.316|.397|.713|94|32808|1878|776|747|575|552|16587|
Using the Euclidean distance function, this yields the following dendrogram:

## References
* https://sites.google.com/site/dataclusteringalgorithms/quality-threshold-clustering-algorithm-1
* https://www.coursera.org/learn/parprog1/programming/akLxD/k-means
* US, UK & German city data derived from http://people.sc.fsu.edu/~jburkardt/datasets/cities/cities.html
* US voting data sourced from https://forge.scilab.org/index.php/p/rdataset/source/tree/master/csv/cluster/votes.repub.csv
## License
The MIT License (MIT)
Copyright (c) 2016 Richard Hull
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
# Record security
# Changelog
## 1.0.5
- Reduces backoff time factor to retrieve results.
- Upgrades internal dependencies.
## 1.0.4
- Add details in the datasource card #130
- Enable WithEvent to send an event to the AWS EventBridge #132
## 1.0.3
Fixes bugs for Endpoint and Assume Role settings.
## 1.0.2
Fixes a bug that prevented null values from being returned in a query.
## 1.0.1
Fixes a bug that prevented creating several data sources of the plugin in the same instance.
## 1.0.0
Initial release.
## 0.4.1
Improved curated dashboard.
## 0.4.0
Allows authentication using AWS Secrets Manager. More bug fixes.
## 0.3.0
Third preview release. Includes curated dashboard.
## 0.2.0
Second release.
## 0.1.0
Initial release.
# githuboauth
GitHub OAuth directive. Allows users of your website to authorize with GitHub.
# Why?
I started doing this to enable GitHub on the [graph drawing libraries](https://github.com/anvaka/graph-drawing-libraries) repository. It turned out the complexity of the "full client side" approach was too high. In the end I stopped using this repository and instead switched to a client-server architecture, where a [Heroku app](https://github.com/anvaka/graph-drawing-stats) pulls all information from GitHub and returns cached responses for [the frontend](http://anvaka.github.io/graph-drawing-libraries/#/all).
Just keep in mind: there are better alternatives to the GitHub authorization problem.
# usage
In your html code:
``` html
<githuboauth clientId='b5926508f327fb8bd01b'
oauthProxy='//ghoauth.herokuapp.com/authenticate/[code]'>
</githuboauth>
```
See the [demo](https://github.com/anvaka/githuboauth/tree/master/demo/basic) for an
end-to-end example.
Unfortunately, GitHub does not provide a way [to authenticate securely](https://developer.github.com/v3/oauth/#web-application-flow)
via the client side only. This means the directive requires an OAuth proxy to
trade your app's [client secret](https://developer.github.com/v3/oauth/#github-redirects-back-to-your-site)
for an access token.
Fear not, seting up oauth proxy is pretty straigtforward. Please check instructions
for [anvaka/gatepicker](https://github.com/anvaka/gatekeeper).
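The server-side exchange such a proxy performs can be sketched as follows. This is a minimal illustration only, not part of githuboauth itself; the endpoint and field names follow GitHub's documented OAuth web application flow, and the client id, secret, and code values are placeholders.

```python
import urllib.parse

# GitHub's token endpoint for the OAuth web application flow.
GITHUB_TOKEN_URL = "https://github.com/login/oauth/access_token"

def build_token_request(client_id: str, client_secret: str, code: str):
    """Build the (url, body) pair the proxy POSTs to trade the temporary
    authorization code (forwarded by the browser) for an access token."""
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "code": code,
    })
    return GITHUB_TOKEN_URL, body

# Placeholder values; only the proxy ever sees the client secret.
url, body = build_token_request("b5926508f327fb8bd01b", "<client-secret>", "<temporary-code>")
print(url)
```

The important property is that the client secret only ever appears on the proxy, never in browser-side code.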
# install
With [npm](https://npmjs.org) do:
```
npm install githuboauth
```
# license
MIT
---
title: DBCC SHRINKLOG (Parallel Data Warehouse) | Microsoft Docs
ms.custom: ''
ms.date: 03/16/2018
ms.prod: sql
ms.reviewer: ''
ms.technology: t-sql
ms.topic: language-reference
dev_langs:
- TSQL
author: pmasl
ms.author: umajay
monikerRange: '>= aps-pdw-2016 || = sqlallproducts-allversions'
ms.openlocfilehash: d2cf30f4f01a30d8171e58cce3052e45fefa6179
ms.sourcegitcommit: b2e81cb349eecacee91cd3766410ffb3677ad7e2
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 02/01/2020
ms.locfileid: "67930314"
---
# <a name="dbcc-shrinklog-parallel-data-warehouse"></a>DBCC SHRINKLOG (Parallel Data Warehouse)
[!INCLUDE[tsql-appliesto-xxxxxx-xxxx-xxxx-pdw-md](../../includes/tsql-appliesto-xxxxxx-xxxx-xxxx-pdw-md.md)]
Shrinks the size of the transaction log, appliance-wide, for the current [!INCLUDE[ssPDW](../../includes/sspdw-md.md)] database. Data is defragmented in order to shrink the transaction log. Over time, the database transaction log can accumulate too many pieces, which prevents it from being used efficiently. Use DBCC SHRINKLOG to reduce this fragmentation and, with it, the log size.
 [Transact-SQL Syntax Conventions (Transact-SQL)](../../t-sql/language-elements/transact-sql-syntax-conventions-transact-sql.md)
## <a name="syntax"></a>Syntax
```sql
DBCC SHRINKLOG
[ ( SIZE = { target_size [ MB | GB | TB ] } | DEFAULT ) ]
[ WITH NO_INFOMSGS ]
[;]
```
## <a name="arguments"></a>Arguments
SIZE = { *target_size* [ MB | **GB** | TB ] } | **DEFAULT**
*target_size* is the desired size of the transaction log, across all Compute nodes, after DBCC SHRINKLOG completes. It is an integer greater than 0.
Log size is specified in megabytes (MB), gigabytes (GB), or terabytes (TB). This is the combined size of all transaction logs across the Compute nodes.
By default, DBCC SHRINKLOG shrinks the transaction log to the log size stored in the metadata of the database. The log size in the metadata is determined by the LOG_SIZE parameter in [CREATE DATABASE (Azure SQL Data Warehouse)](../../t-sql/statements/create-database-azure-sql-data-warehouse.md) or [ALTER DATABASE (Azure SQL Data Warehouse)](../../t-sql/statements/alter-database-azure-sql-data-warehouse.md). DBCC SHRINKLOG shrinks the transaction log when `SIZE=DEFAULT` is specified or when the `SIZE` clause is omitted.
WITH NO_INFOMSGS
Informational messages are not displayed in the DBCC SHRINKLOG results.
## <a name="permissions"></a>Permissions
Requires the ALTER SERVER STATE permission.
## <a name="general-remarks"></a>General remarks
DBCC SHRINKLOG does not change the log size stored in the metadata of the database. The metadata continues to contain the LOG_SIZE parameter that was specified in the CREATE DATABASE or ALTER DATABASE statement.
## <a name="examples"></a>Examples
### <a name="a-shrink-the-transaction-log-to-the-original-size-specified-by-create-database"></a>A. Shrink the transaction log to the original size specified by CREATE DATABASE
Suppose the transaction log for the Addresses database was set to 100 MB when the Addresses database was created; that is, the CREATE DATABASE statement for Addresses specified LOG_SIZE = 100 MB. Now, however, the log has grown to 150 MB and you want to shrink it back to 100 MB.
You can use any of the following statements to try to shrink the transaction log of the Addresses database to its default size of 100 MB. If shrinking the log to 100 MB would cause data loss, DBCC SHRINKLOG shrinks it as far as possible toward 100 MB without losing data.
```sql
USE Addresses;
DBCC SHRINKLOG ( SIZE = 100 MB );
DBCC SHRINKLOG ( SIZE = DEFAULT );
DBCC SHRINKLOG;
```
Article 625
----
Rights of use and habitation are established and lost in the same
manner as usufruct.
---
TOCTitle: 'MS15-016'
Title: 'Microsoft Security Bulletin MS15-016 – Important'
ms:assetid: 'ms15-016'
ms:contentKeyID: 64119058
ms:mtpsurl: 'https://technet.microsoft.com/de-DE/library/ms15-016(v=Security.10)'
---
Microsoft Security Bulletin MS15-016 – Important
================================================
Vulnerability in Microsoft Graphics Component Could Allow Information Disclosure (3029944)
------------------------------------------------------------------------------------------
Published: February 10, 2015
**Version:** 1.0
Executive Summary
-----------------
This security update resolves a privately reported vulnerability in Microsoft Windows. The vulnerability could allow information disclosure if a user browses to a website containing specially crafted TIFF content. The vulnerability would not allow an attacker to execute code or to elevate user rights directly, but it could be used to gather information that could be used to further compromise the affected system.
This security update is rated Important for all supported releases of Microsoft Windows. For more information, see the **Affected Software** section.
The update addresses the vulnerability by correcting how TIFF image format files are processed. For more information about the vulnerability, see the **Vulnerability Information** section.
For more information about this update, see [Microsoft Knowledge Base Article 3029944](https://support.microsoft.com/kb/3029944/de).
Affected Software
-----------------
The following software versions or editions are affected. Versions or editions that are not listed are either past their support life cycle or are not affected. To determine the support life cycle for your software version or edition, visit the [Microsoft Support Lifecycle](http://go.microsoft.com/fwlink/?linkid=21742) website.
<p> </p>
<table style="border:1px solid black;">
<tr>
<td style="border:1px solid black;">
**Operating system**
</td>
<td style="border:1px solid black;" colspan="2">
**Maximum security impact**
</td>
<td style="border:1px solid black;">
**Aggregate severity rating**
</td>
<td style="border:1px solid black;">
**Updates replaced**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows Server 2003**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="2">
[Windows Server 2003 Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=1451edb9-35d8-4ec4-af2d-39e3d2f0dde2)
(3029944)
</td>
<td style="border:1px solid black;">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
None
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="2">
[Windows Server 2003 x64 Edition Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=262719d4-d63d-4d6f-bebb-d8968c016a84)
(3029944)
</td>
<td style="border:1px solid black;">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
None
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="2">
[Windows Server 2003 with SP2 for Itanium-based Systems](http://www.microsoft.com/downloads/details.aspx?familyid=4cd8c182-8eb1-43d2-a1c6-85c50481598b)
(3029944)
</td>
<td style="border:1px solid black;">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
None
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows Vista**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Vista Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=c1d07ccd-0e6f-4c27-82de-b0bfb1288bd8)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Vista x64 Edition Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=6f64af70-2dc7-4d86-88b9-e27dfe8437d8)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows Server 2008**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 for 32-bit Systems Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=a148165e-4efe-4072-839e-19785689238a)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 for x64-based Systems Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=9d8d5127-dfea-4b58-aacf-01d63696ff38)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 for Itanium-based Systems Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=1a096e76-5d8c-43e5-bcdb-6fefadd5793e)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows 7**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows 7 for 32-bit Systems Service Pack 1](http://www.microsoft.com/downloads/details.aspx?familyid=7dbce199-8642-4546-be25-86b54c1aa85a)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows 7 for x64-based Systems Service Pack 1](http://www.microsoft.com/downloads/details.aspx?familyid=67298bc3-04ad-4328-85d9-5fa3fe35474d)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows Server 2008 R2**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 R2 for x64-based Systems Service Pack 1](http://www.microsoft.com/downloads/details.aspx?familyid=e1357b47-00fa-4be2-9f74-05ef5e14e5ce)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 R2 for Itanium-based Systems Service Pack 1](http://www.microsoft.com/downloads/details.aspx?familyid=30b1ae9b-af1c-486f-87e2-ba11b4447c69)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows 8 and Windows 8.1**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows 8 for 32-bit Systems](http://www.microsoft.com/downloads/details.aspx?familyid=ec3489ee-d750-4fc2-bd23-2b0cf7735acc)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows 8 for x64-based Systems](http://www.microsoft.com/downloads/details.aspx?familyid=06e3ef40-7b1b-47c7-aff2-b539dabefbf2)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows 8.1 for 32-bit Systems](http://www.microsoft.com/downloads/details.aspx?familyid=215aee51-61de-4a31-87ce-f7eaa2229e75)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows 8.1 for x64-based Systems](http://www.microsoft.com/downloads/details.aspx?familyid=93235781-1383-4029-a93d-427ab4d6cd96)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows Server 2012 and Windows Server 2012 R2**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2012](http://www.microsoft.com/downloads/details.aspx?familyid=a0e75197-63ed-43e4-8b4c-42ad7ce8b2f2)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2012 R2](http://www.microsoft.com/downloads/details.aspx?familyid=6db1e15c-b51c-4867-bd9e-2e67d034327e)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Windows RT and Windows RT 8.1**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows RT<sup>[1]</sup>
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows RT 8.1<sup>[1]</sup>
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="5">
**Server Core installation option**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 for 32-bit Systems Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=a148165e-4efe-4072-839e-19785689238a) (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 for x64-based Systems Service Pack 2](http://www.microsoft.com/downloads/details.aspx?familyid=9d8d5127-dfea-4b58-aacf-01d63696ff38) (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2008 R2 for x64-based Systems Service Pack 1](http://www.microsoft.com/downloads/details.aspx?familyid=e1357b47-00fa-4be2-9f74-05ef5e14e5ce) (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2012](http://www.microsoft.com/downloads/details.aspx?familyid=a0e75197-63ed-43e4-8b4c-42ad7ce8b2f2) (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
<tr>
<td style="border:1px solid black;">
[Windows Server 2012 R2](http://www.microsoft.com/downloads/details.aspx?familyid=6db1e15c-b51c-4867-bd9e-2e67d034327e) (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;" colspan="2">
Information disclosure
</td>
<td style="border:1px solid black;">
Important
</td>
<td style="border:1px solid black;">
3013126 in [MS14-085](https://technet.microsoft.com/de-de/library/security/ms14-085)
</td>
</tr>
</table>
<sup>[1]</sup>This update is available via [Windows Update](http://update.microsoft.com/microsoftupdate/v6/vistadefault.aspx?ln=de-de) only.
Severity Ratings and Vulnerability Identifiers
----------------------------------------------
The following severity rating assumes the potential maximum impact of the vulnerability. For information regarding the likelihood, within 30 days of this security bulletin's release, of the exploitability of the vulnerability in relation to its severity rating and security impact, please see the Exploitability Index in the [February Bulletin Summary](https://technet.microsoft.com/de-de/library/security/ms15-feb).
<p> </p>
<table style="border:1px solid black;">
<tr>
<td style="border:1px solid black;" colspan="3">
**Severity Rating and Maximum Security Impact by Affected Software**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
**Affected Software**
</td>
<td style="border:1px solid black;">
[**TIFF Processing Information Disclosure Vulnerability – CVE-2015-0061**](http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=cve-2015-0061)
</td>
<td style="border:1px solid black;">
**Aggregate Severity Rating**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows Server 2003**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2003 Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2003 x64 Edition Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2003 with SP2 for Itanium-based Systems
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows Vista**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Vista Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Vista x64 Edition Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows Server 2008**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 for 32-bit Systems Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 for x64-based Systems Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 for Itanium-based Systems Service Pack 2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows 7**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows 7 for 32-bit Systems Service Pack 1
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows 7 for x64-based Systems Service Pack 1
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows Server 2008 R2**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 R2 for x64-based Systems Service Pack 1
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 R2 for Itanium-based Systems Service Pack 1
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows 8 and Windows 8.1**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows 8 for 32-bit Systems
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows 8 for x64-based Systems
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows 8.1 for 32-bit Systems
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows 8.1 for x64-based Systems
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows Server 2012 and Windows Server 2012 R2**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2012
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2012 R2
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Windows RT and Windows RT 8.1**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows RT
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows RT 8.1
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;" colspan="3">
**Server Core installation option**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 for 32-bit Systems Service Pack 2 (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 for x64-based Systems Service Pack 2 (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2008 R2 for x64-based Systems Service Pack 1 (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2012 (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
<tr>
<td style="border:1px solid black;">
Windows Server 2012 R2 (Server Core installation)
(3029944)
</td>
<td style="border:1px solid black;">
**Important**
Information disclosure
</td>
<td style="border:1px solid black;">
**Important**
</td>
</tr>
</table>
Vulnerability Information
-------------------------
TIFF Processing Information Disclosure Vulnerability – CVE-2015-0061
--------------------------------------------------------------------
An information disclosure vulnerability exists when Windows fails to properly handle uninitialized memory while parsing certain specially crafted TIFF image format files. The vulnerability could allow information disclosure if an attacker runs a specially crafted application on an affected system.
An attacker could host a specially crafted website that is designed to exploit this vulnerability and then convince a user to view the website. This could also include compromised websites and websites that accept or host user-provided content or advertisements; such websites could contain specially crafted content that exploits this vulnerability. An attacker would have no way to force users to view such websites, however. Instead, an attacker would have to convince users to visit the website, typically by getting them to click a link in an email or instant message that takes them to the attacker's website. It could also be possible to display specially crafted web content by using banner advertisements or by using other methods to deliver web content to affected systems.
An attacker who successfully exploited this vulnerability could potentially read data that was not intended to be disclosed. Note that the vulnerability would not allow an attacker to execute code or to elevate user rights directly, but it could be used to gather information that could be used to further compromise the affected system. The update addresses the vulnerability by correcting how Windows processes TIFF image format files.
Microsoft received information about this vulnerability through coordinated vulnerability disclosure. When this security bulletin was originally issued, Microsoft had not received any information to indicate that this vulnerability had been publicly used to attack customers.
### Mitigating Factors
Microsoft has not identified any [mitigating factors](https://technet.microsoft.com/de-de/library/security/dn848375.aspx) for this vulnerability.
### Workarounds
Microsoft has not identified any [workarounds](https://technet.microsoft.com/de-de/library/security/dn848375.aspx) for this vulnerability.
Bereitstellung von Sicherheitsupdates
-------------------------------------
Informationen zur Bereitstellung von Sicherheitsupdates finden Sie im Microsoft Knowledge Base-Artikel, auf den [hier](#kbarticle) in der Kurzzusammenfassung verwiesen wird.
Danksagung
----------
Microsoft würdigt die Bemühungen derjenigen Benutzer der Sicherheitscommunity, die uns dabei helfen, Kunden durch eine koordinierte Offenlegung von Sicherheitsanfälligkeiten zu schützen. Weitere Informationen finden Sie unter [Danksagung](https://technet.microsoft.com/de-de/library/security/dn903755.aspx).
Haftungsausschluss
------------------
Die Informationen in der Microsoft Knowledge Base werden wie besehen und ohne jede Gewährleistung bereitgestellt. Microsoft schließt alle anderen Garantien, gleich ob ausdrücklich oder konkludent, einschließlich der Garantien der Handelsüblichkeit oder Eignung für einen bestimmten Zweck aus. In keinem Fall kann Microsoft Corporation und/oder deren jeweilige Lieferanten haftbar gemacht werden für Schäden irgendeiner Art, einschließlich direkter, indirekter, zufällig entstandener Schäden, Folgeschäden, Folgen entgangenen Gewinns oder spezieller Schäden, selbst dann nicht, wenn Microsoft Corporation und/oder deren jeweilige Lieferanten auf die mögliche Entstehung dieser Schäden hingewiesen wurde. Weil in einigen Staaten/Rechtsordnungen der Ausschluss oder die Beschränkung einer Haftung für zufällig entstandene Schäden oder Folgeschäden nicht gestattet ist, gilt die obige Einschränkung eventuell nicht für Sie.
Revisionen
----------
- V1.0 (10. Februar 2015): Bulletin veröffentlicht.
*Seite generiert am 30.01.2015 um 14:59Z-08:00.*
| 30.171926 | 926 | 0.726955 | deu_Latn | 0.509824 |
2649de32ab0a29f68e3864c8b7510ea56a0ad4df | 104 | md | Markdown | doc/howto.md | Apollon77/ioBroker.wiobrowser | 4ca9838570b835df41a50d5ced10d5b9ecdc60ca | [
"MIT"
] | null | null | null | doc/howto.md | Apollon77/ioBroker.wiobrowser | 4ca9838570b835df41a50d5ced10d5b9ecdc60ca | [
"MIT"
] | null | null | null | doc/howto.md | Apollon77/ioBroker.wiobrowser | 4ca9838570b835df41a50d5ced10d5b9ecdc60ca | [
"MIT"
] | null | null | null | 
# ioBroker.iobrowser
## What can you do with this adapter?
## Links | 17.333333 | 40 | 0.692308 | deu_Latn | 0.893044 |
2649e79d7640cebff0de27e3963a74dba8c0191f | 232 | md | Markdown | _news/announcement_3.md | bjralexandru/al-folio | 54657b8b9e0eafd0cfb4806ab2523f975d6fea92 | [
"MIT"
] | null | null | null | _news/announcement_3.md | bjralexandru/al-folio | 54657b8b9e0eafd0cfb4806ab2523f975d6fea92 | [
"MIT"
] | null | null | null | _news/announcement_3.md | bjralexandru/al-folio | 54657b8b9e0eafd0cfb4806ab2523f975d6fea92 | [
"MIT"
] | null | null | null | ---
layout: post
date: 2022-02-19 07:59:00-0400
inline: true
---
Check out my [Capstone Project](https://bjralexandru.github.io/projects) for the Google Data Analytics Certificate and let me know what you think! :sparkles: :smile:
| 29 | 165 | 0.74569 | eng_Latn | 0.6958 |
2649f9e485641f4136497c16b45a7d32efb33054 | 1,370 | md | Markdown | _includes/stats/2011.md | fluca1978/fluca1978.github.io | 799eed9eb899999f0849a4e6db70846e6fbbbdbc | [
"MIT"
] | 1 | 2019-11-13T02:25:42.000Z | 2019-11-13T02:25:42.000Z | _includes/stats/2011.md | fluca1978/fluca1978.github.io | 799eed9eb899999f0849a4e6db70846e6fbbbdbc | [
"MIT"
] | null | null | null | _includes/stats/2011.md | fluca1978/fluca1978.github.io | 799eed9eb899999f0849a4e6db70846e6fbbbdbc | [
"MIT"
] | null | null | null | <a name="2011" />
## 2011
**110 total posts** have been written on 2011.
There have been *43 different tags* used, the most
popular being (sorted by number of posts):
- *programmazione* (33 posts)
- *java* (21 posts)
- *linux* (14 posts)
- *qt* (13 posts)
- *riflessioni* (11 posts)
- *postgresql* (11 posts)
- *freebsd* (10 posts)
- *opensource* (8 posts)
- *kde* (7 posts)
- *eclipse* (6 posts)
- *maven* (5 posts)
- *jsf* (5 posts)
- *opensolaris* (5 posts)
- *gnome* (4 posts)
- *git* (4 posts)
- *pgday* (4 posts)
- *università* (3 posts)
- *research* (3 posts)
- *perl* (2 posts)
- *pcbsd* (2 posts)
- *whitecat* (2 posts)
- *divertimenti* (2 posts)
- *jfk* (2 posts)
- *windows* (2 posts)
- *wtp* (1 posts)
- *arco* (1 posts)
- *sicurezza* (1 posts)
- *itpug* (1 posts)
- *varie* (1 posts)
- *mp3* (1 posts)
- *gmail* (1 posts).<br/>
<br/>
The following is the overall 2011 post ratio by month:
<br/>
<center>
<img src="/images/stats/2011-months.png" alt="2011 post ratio per month" />
</center>
<br/>
<br/>
The following is the overall 2011 post ratio by tag (showing max 30 tags):
<br/>
<center>
<img src="/images/stats/2011-tags.png" alt="2011 post ratio per tag" />
</center>
<br/>
<div align="right">
<small>
Last generated on 2021-01-07 at 14:43
</small>
</div>
<br/>
| 22.096774 | 81 | 0.590511 | eng_Latn | 0.743386 |
264a1febaa4f5773418b5cfc4db824a809fab81b | 52 | md | Markdown | README.md | AbilityProject/CloudLights | 013341cf0ce26c3fc804d5d9158e8b7b3778d0fb | [
"MIT"
] | null | null | null | README.md | AbilityProject/CloudLights | 013341cf0ce26c3fc804d5d9158e8b7b3778d0fb | [
"MIT"
] | null | null | null | README.md | AbilityProject/CloudLights | 013341cf0ce26c3fc804d5d9158e8b7b3778d0fb | [
"MIT"
] | null | null | null | # CloudLights
LED cloud lights in multisensory room
| 17.333333 | 37 | 0.826923 | eng_Latn | 0.765689 |
264a42439a810a984595fc643f0bc5dbdf2d5a93 | 741 | md | Markdown | _fellows/ogdon.md | mozeran/visualizingthefuture | 249a9190dffc5ed6bac81940461b3595bac67473 | [
"MIT"
] | null | null | null | _fellows/ogdon.md | mozeran/visualizingthefuture | 249a9190dffc5ed6bac81940461b3595bac67473 | [
"MIT"
] | null | null | null | _fellows/ogdon.md | mozeran/visualizingthefuture | 249a9190dffc5ed6bac81940461b3595bac67473 | [
"MIT"
] | null | null | null | ---
layout: page
title: Dorothy Ogdon
---
{:height="200px" align="left" style="margin-right:15px; margin-bottom:0px"}
Dorothy Ogdon is the University of Alabama at Birmingham Libraries Emerging Technologies Librarian. She is interested in exploring potential applications for 3D printing, virtual reality, and artificial intelligence in information discovery and data reuse. She was a participant in the 2018 Data Science and Visualization Institute for Librarians at North Carolina State University and is currently coordinating the development of a Virtual Reality Studio and 3D Printing Space at Lister Hill Library of the Health Sciences at the University of Alabama at Birmingham.
| 92.625 | 568 | 0.812416 | eng_Latn | 0.967054 |
264a5a2bcd5cdff7615707067d6f5151472a4dd1 | 70 | md | Markdown | README.md | gusamarante/CorrelationEllipses | 101e80abc7c4df31ddb8a999a8e0c7b0ef1c9296 | [
"MIT"
] | 1 | 2021-12-30T11:55:16.000Z | 2021-12-30T11:55:16.000Z | README.md | gusamarante/CorrelationEllipses | 101e80abc7c4df31ddb8a999a8e0c7b0ef1c9296 | [
"MIT"
] | null | null | null | README.md | gusamarante/CorrelationEllipses | 101e80abc7c4df31ddb8a999a8e0c7b0ef1c9296 | [
"MIT"
] | null | null | null | # CorrelationEllipses
Plot a correlation matrix using ellipses in Python
| 23.333333 | 47 | 0.857143 | eng_Latn | 0.802767 |
264b0062ab88ba8c5e26ef5a2592db52d660af61 | 11,990 | md | Markdown | tools/data/ru.stackoverflow.com/posts/A1036837.md | MSDN-WhiteKnight/answers | 7e87ccf0edd6978802964503aa4ff8efda5ece62 | [
"BSD-3-Clause"
] | null | null | null | tools/data/ru.stackoverflow.com/posts/A1036837.md | MSDN-WhiteKnight/answers | 7e87ccf0edd6978802964503aa4ff8efda5ece62 | [
"BSD-3-Clause"
] | null | null | null | tools/data/ru.stackoverflow.com/posts/A1036837.md | MSDN-WhiteKnight/answers | 7e87ccf0edd6978802964503aa4ff8efda5ece62 | [
"BSD-3-Clause"
] | null | null | null | ---
title: "Answer to \"Why does CreateThread start the function immediately after creation?\""
se.owner.user_id: 240512
se.owner.display_name: "MSDN.WhiteKnight"
se.owner.link: "https://ru.stackoverflow.com/users/240512/msdn-whiteknight"
se.answer_id: 1036837
se.question_id: 1036663
se.post_type: answer
se.is_accepted: True
---
<blockquote>
<p>Background: with every subsequent update, WriteConsoleOutputW takes longer and longer to execute, even though the buffer size does not change. The original idea was to move the output into a separate thread, but it still takes 74-76 ms, which is incredibly long.</p>
</blockquote>
<p>I would like to see on what buffer size and on which OS the output takes that long. On my machine, on Windows 10 with a 100x50 buffer, the average time is 0.1 - 0.2 ms; here is the code for testing:</p>
<pre><code>#include <windows.h>
#include <stdio.h>
const int W = 100; //width of the drawing area
const int H = 50; //height of the drawing area
CHAR_INFO chiBuffer[W*H] = { 0 };
COORD coordBufSize;
COORD coordBufCoord;
HANDLE hStdout;
SMALL_RECT srctWriteRect;
void Render(int pos) {
//draw an asterisk at the given position
for (int i = 0; i < sizeof(chiBuffer) / sizeof(CHAR_INFO); i++) {
chiBuffer[i].Attributes = FOREGROUND_BLUE | FOREGROUND_RED | FOREGROUND_GREEN | BACKGROUND_BLUE;
if (i == pos) chiBuffer[i].Char.UnicodeChar = L'*';
else chiBuffer[i].Char.UnicodeChar = L' ';
}
srctWriteRect.Top = 0;
srctWriteRect.Left = 0;
srctWriteRect.Bottom = H;
srctWriteRect.Right = W;
BOOL fSuccess = WriteConsoleOutputW(
hStdout, // screen buffer to write to
chiBuffer, // buffer to copy from
coordBufSize, // col-row size of chiBuffer
coordBufCoord, // top left src cell in chiBuffer
&srctWriteRect); // dest. screen buffer rectangle
if (!fSuccess)
{
printf("WriteConsoleOutput failed - (%d)\n", GetLastError());
}
}
int main(void)
{
BOOL fSuccess;
hStdout = GetStdHandle(STD_OUTPUT_HANDLE);
coordBufSize.Y = H;
coordBufSize.X = W;
coordBufCoord.X = 0;
coordBufCoord.Y = 0;
// Set the destination rectangle.
srctWriteRect.Top = 0;
srctWriteRect.Left = 0;
srctWriteRect.Bottom = H;
srctWriteRect.Right = W;
    const int N_RUNS = 50; //number of runs within one test
    const int N_TESTS = 50; //number of tests
float tsum = 0;
DWORD t1;
DWORD t2;
Render(0);
int pos = 1;
for (int i = 0; i < N_TESTS; i++) {
t1 = GetTickCount();
for (int j = 0; j < N_RUNS; j++) {
Render(pos);
pos++;
if (pos >= W * H)pos = 0;
}
t2 = GetTickCount();
        tsum += (t2 - t1)/(float)N_RUNS; //determine the rendering duration within one test
}
    //print the average rendering time
wchar_t s[100] = L"";
swprintf(s,100,L"Render time = %.3f ms\n", tsum / (float)N_TESTS);
MessageBoxW(GetConsoleWindow(), s, L"Results", MB_OK);
getchar();
return 0;
}
</code></pre>
<p>But I have heard that on Windows 7 and earlier the console renders more slowly.</p>
<blockquote>
<p>The last thing that came to mind was to split the buffer into several parts and output them from several threads.</p>
</blockquote>
<p>Parallelizing something in order to speed it up is rarely a good idea. To begin with, the code may run on a computer with a single CPU core, in which case the "speedup" will in fact turn into a slowdown. But even if there are several cores, a real effect from parallelism only appears when the threads do not compete for shared resources. In the case of the console, I suspect it has its own synchronization mechanism internally, so there will be no effect.</p>
<blockquote>
<p>However, as I found out, CreateThread starts the function immediately after creation</p>
</blockquote>
<p>That is not quite true. There is the <a href="https://docs.microsoft.com/en-us/windows/win32/api/processthreadsapi/nf-processthreadsapi-createthread" rel="nofollow noreferrer">CREATE_SUSPENDED</a> flag:</p>
<blockquote>
<p>The thread is created in a suspended state, and does not run until the ResumeThread function is called.</p>
</blockquote>
<p>But for your task it will not help in any way.</p>
<hr />
<p>A real optimization that could help here: instead of writing the whole buffer, check which regions have changed and write only those. If the entire screen does not change on every step, this can give a serious speedup.</p>
<hr />
<h2>Addendum</h2>
<p>If you try this on a large buffer and parallelize it like this, with synchronization through events:</p>
<pre><code>#include <windows.h>
#include <stdio.h>
void Render1();
void Render2();
const int W = 640; //width of the drawing area
const int H = 320; //height of the drawing area
CHAR_INFO chiBuffer[W*H] = { 0 };
HANDLE hStdout;
//first part
COORD coordBufSize1;
COORD coordBufCoord1;
SMALL_RECT srctWriteRect1;
//second part
COORD coordBufSize2;
COORD coordBufCoord2;
SMALL_RECT srctWriteRect2;
//events for synchronization
HANDLE evtRendering1;
HANDLE evtRendering2;
HANDLE evtRendered1;
HANDLE evtRendered2;
DWORD WINAPI ThreadProc1(LPVOID lpThreadParameter) {
while (1) {
        WaitForSingleObject(evtRendering1, INFINITE); //wait for the signal
        ResetEvent(evtRendering1);
        //output the first half
        Render1();
        //signal completion
SetEvent(evtRendered1);
}
}
DWORD WINAPI ThreadProc2(LPVOID lpThreadParameter) {
while (1) {
        WaitForSingleObject(evtRendering2, INFINITE); //wait for the signal
        ResetEvent(evtRendering2);
        //output the second half
        Render2();
        //signal completion
SetEvent(evtRendered2);
}
}
void FillBuffer(int pos) {
//draw an asterisk at the given position
for (int i = 0; i < sizeof(chiBuffer) / sizeof(CHAR_INFO); i++) {
chiBuffer[i].Attributes = FOREGROUND_BLUE | FOREGROUND_RED | FOREGROUND_GREEN | BACKGROUND_BLUE;
if (i == pos) chiBuffer[i].Char.UnicodeChar = L'*';
else chiBuffer[i].Char.UnicodeChar = L' ';
}
}
void Render1() {
srctWriteRect1.Top = 0;
srctWriteRect1.Left = 0;
srctWriteRect1.Bottom = H/2;
srctWriteRect1.Right = W;
BOOL fSuccess = WriteConsoleOutputW(
hStdout, // screen buffer to write to
chiBuffer, // buffer to copy from
coordBufSize1, // col-row size of chiBuffer
coordBufCoord1, // top left src cell in chiBuffer
&srctWriteRect1); // dest. screen buffer rectangle
if (!fSuccess)
{
printf("Render1: WriteConsoleOutput failed - (%d)\n", GetLastError());
}
}
void Render2() {
srctWriteRect2.Top = H / 2;
srctWriteRect2.Left = 0;
srctWriteRect2.Bottom = H;
srctWriteRect2.Right = W;
BOOL fSuccess = WriteConsoleOutputW(
hStdout, // screen buffer to write to
&(chiBuffer[W*H/2]), // buffer to copy from
coordBufSize2, // col-row size of chiBuffer
coordBufCoord2, // top left src cell in chiBuffer
&srctWriteRect2); // dest. screen buffer rectangle
if (!fSuccess)
{
printf("Render2: WriteConsoleOutput failed - (%d)\n", GetLastError());
}
}
void Render(BOOL parallel) {
if (parallel != FALSE) {
        //start both threads
SetEvent(evtRendering1);
SetEvent(evtRendering2);
HANDLE ha[2] = { evtRendered1,evtRendered2 };
        //wait for the rendering to finish
DWORD dwWaitResult = WaitForMultipleObjects(
2, // number of handles in array
ha, // array of handles
TRUE, // wait until all are signaled
INFINITE);
switch (dwWaitResult)
{
// All objects were signaled
case WAIT_OBJECT_0: break;
// An error occurred
default:
printf("WaitForMultipleObjects failed (%d)\n", GetLastError());
break;
}
ResetEvent(evtRendered1);
ResetEvent(evtRendered2);
}
else {
Render1();
Render2();
}
}
int main(void)
{
BOOL fSuccess;
hStdout = GetStdHandle(STD_OUTPUT_HANDLE);
coordBufSize1.Y = H/2;
coordBufSize1.X = W;
coordBufCoord1.X = 0;
coordBufCoord1.Y = 0;
coordBufSize2.Y = H / 2;
coordBufSize2.X = W;
coordBufCoord2.X = 0;
coordBufCoord2.Y = 0/*H / 2*/;
// Set the destination rectangle.
srctWriteRect1.Top = 0;
srctWriteRect1.Left = 0;
srctWriteRect1.Bottom = H/2;
srctWriteRect1.Right = W;
srctWriteRect2.Top = H / 2;
srctWriteRect2.Left = 0;
srctWriteRect2.Bottom = H;
srctWriteRect2.Right = W;
    const int N_TESTS = 500; //number of tests
DWORD tsum = 0;
DWORD t1;
DWORD t2;
evtRendering1 = CreateEvent(
NULL, // default security attributes
TRUE, // manual-reset event
FALSE, // initial state is nonsignaled
TEXT("evtRendering1") // object name
);
evtRendering2 = CreateEvent(
NULL, // default security attributes
TRUE, // manual-reset event
FALSE, // initial state is nonsignaled
TEXT("evtRendering2") // object name
);
evtRendered1 = CreateEvent(
NULL, // default security attributes
TRUE, // manual-reset event
FALSE, // initial state is nonsignaled
TEXT("evtRendered1") // object name
);
evtRendered2 = CreateEvent(
NULL, // default security attributes
TRUE, // manual-reset event
FALSE, // initial state is nonsignaled
TEXT("evtRendered2") // object name
);
HANDLE hThread = CreateThread(
NULL, // default security
0, // default stack size
ThreadProc1, // name of the thread function
NULL, // no thread parameters
0, // default startup flags
NULL);
if (hThread == NULL)
{
printf("CreateThread failed (%d)\n", GetLastError());
return 1;
}
hThread = CreateThread(
NULL, // default security
0, // default stack size
ThreadProc2, // name of the thread function
NULL, // no thread parameters
0, // default startup flags
NULL);
if (hThread == NULL)
{
printf("CreateThread failed (%d)\n", GetLastError());
return 1;
}
    /* *** Start of the test *** */
BOOL parallel = TRUE;
FillBuffer(0);
Render(parallel);
int pos = 1;
for (int i = 0; i < N_TESTS; i++) {
FillBuffer(pos);
t1 = GetTickCount();
Render(parallel);
t2 = GetTickCount();
pos++;
if (pos >= W * H)pos = 0;
        tsum += (t2 - t1); //determine the rendering duration within this test
}
    //print the average rendering time
wchar_t s[100] = L"";
swprintf(s,100,L"Render time = %.3f ms\n", tsum / (float)N_TESTS);
MessageBoxW(GetConsoleWindow(), s, L"Results", MB_OK);
getchar();
return 0;
}
</code></pre>
<p>Then the result is:</p>
<ul>
<li>without parallelism: 1.2 ms</li>
<li>with parallelism: 1.6 ms</li>
</ul>
<p>That is, parallelization apparently brings no benefit; on the contrary, it comes out slower because of the overhead of waiting.</p>
| 31.305483 | 460 | 0.618098 | yue_Hant | 0.222423 |
264b4e9faeb52df11e5f75409d299db91e4f0e80 | 282 | md | Markdown | .github/pull_request_template.md | lnsp/gvisor | ac62743e380c90255fbca6b76d899c3b7cf70877 | [
"Apache-2.0"
] | 1 | 2018-10-25T18:54:07.000Z | 2018-10-25T18:54:07.000Z | .github/pull_request_template.md | lnsp/gvisor | ac62743e380c90255fbca6b76d899c3b7cf70877 | [
"Apache-2.0"
] | 54 | 2020-07-28T18:10:55.000Z | 2021-03-31T01:17:53.000Z | .github/pull_request_template.md | LevyForchh/gvisor | a1d7986df9eeff036c3a03a06725d0ad2b2c96c8 | [
"Apache-2.0"
] | 4 | 2021-11-08T11:15:53.000Z | 2022-03-16T08:51:58.000Z | * [ ] Have you followed the guidelines in [CONTRIBUTING.md](../blob/master/CONTRIBUTING.md)?
* [ ] Have you formatted and linted your code?
* [ ] Have you added relevant tests?
* [ ] Have you added appropriate Fixes & Updates references?
* [ ] If yes, please erase all these lines!
| 47 | 92 | 0.70922 | eng_Latn | 0.996229 |
264c247279eef21760fb1a5c1ffdcbb6944d96c9 | 1,030 | md | Markdown | network/protocols/README.md | ryan4yin/knowledge | aaab388548298052159e86f8f1a6cd3b80a93649 | [
"MIT"
] | 29 | 2020-12-23T03:16:06.000Z | 2022-03-04T10:38:49.000Z | network/protocols/README.md | ryan4yin/DevOps | d6e5bfa1c317b69a981e69b451044471c28862a4 | [
"MIT"
] | 6 | 2022-02-22T07:59:11.000Z | 2022-02-23T15:54:01.000Z | network/protocols/README.md | ryan4yin/knowledge | aaab388548298052159e86f8f1a6cd3b80a93649 | [
"MIT"
] | 9 | 2020-12-03T01:41:44.000Z | 2022-03-01T06:18:04.000Z | >2018-03-30
## The Internet protocol layering model
1. **Layer 5 (or OSI layer 7) - Application layer (HTTP/SSH)**: implemented in software
1. **Layer 4 - Transport layer (TCP/UDP)**: implemented in software.
    - The concept of a **"port"** is also implemented at this layer. The network layer only delivers data **host-to-host**; the transport layer uses virtual "ports" to deliver data **program-to-program**.
1. Layer 3 - Network layer: implemented in a mix of software and hardware; arguably the most complex part of the five-layer model.
    1. Data plane: forwarding, implemented in hardware.
        - Function: within a single router, forward a packet from an input port to the appropriate output port. (Here "port" means a physical port on the router, not a TCP/UDP port.)
        - The data plane must run extremely fast, especially on high-speed links; it works on a nanosecond timescale, so a hardware implementation is needed to keep up.
    1. Control plane: routing (path selection), implemented in software.
        - Involves algorithms, usually implemented in software: it computes suitable routing paths and updates the router's forwarding table accordingly. (It also covers some network-management functions.)
        - The control plane's routing algorithms and management functions are far less time-critical than the data plane; they work on a ms/s timescale, which suits a software implementation. (Because the computation is heavy and routers are deployed in a distributed fashion, the algorithm is usually implemented as a distributed one.)

1. Layer 2 - Data link layer: handles communication over a specific link; usually implemented in hardware.
    - Switches forward frames at the link layer based on MAC addresses. (ARP protocol)
1. Layer 1 - Physical layer: handles the conversion between physical signals (electrical, electromagnetic, optical) and layer-2 frames (modems); implemented in hardware.
The OSI seven-layer model adds a presentation layer and a session layer between the application layer and the transport layer; the Internet does not necessarily need these two layers, and when it does, they are usually implemented inside the application itself.
Although the five-layer model matches reality better, most documents still habitually refer to the application layer as "layer 7".
P.S. The advantage of a software implementation is that it is cheap and easy to change, while a hardware implementation is efficient and fast to respond, but hard to change and costly.
| 39.615385 | 102 | 0.730097 | yue_Hant | 0.690405 |
264c38dbe179cca1d55e5047085391c1263539e9 | 18,750 | md | Markdown | WindowsServerDocs/networking/sdn/security/sdn-manage-certs.md | ilchiodi/windowsserverdocs.it-it | c9a108584b6430aed06a10c888377ec29480fd01 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/networking/sdn/security/sdn-manage-certs.md | ilchiodi/windowsserverdocs.it-it | c9a108584b6430aed06a10c888377ec29480fd01 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/networking/sdn/security/sdn-manage-certs.md | ilchiodi/windowsserverdocs.it-it | c9a108584b6430aed06a10c888377ec29480fd01 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Manage certificates for Software Defined Networking
description: You can use this topic to learn how to manage certificates for Network Controller Northbound and Southbound communications when you deploy Software Defined Networking (SDN) in Windows Server 2016 Datacenter.
manager: grcusanz
ms.prod: windows-server
ms.technology: networking-sdn
ms.topic: article
ms.assetid: c4e2f6c7-0364-4bf8-bb66-9af59c0bbd74
ms.author: anpaul
author: AnirbanPaul
ms.date: 08/22/2018
ms.openlocfilehash: 3225b3f5065e49521411b35fa3781338086b4e59
ms.sourcegitcommit: b00d7c8968c4adc8f699dbee694afe6ed36bc9de
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 04/08/2020
ms.locfileid: "80854354"
---
# <a name="manage-certificates-for-software-defined-networking"></a>Manage certificates for Software Defined Networking
>Applies to: Windows Server (Semi-Annual Channel), Windows Server 2016
You can use this topic to learn how to manage certificates for Network Controller Northbound and Southbound communications when you deploy Software Defined Networking \(SDN\) in Windows Server 2016 Datacenter and use System Center Virtual Machine Manager \(SCVMM\) as your SDN management client.
>[!NOTE]
>For overview information about Network Controller, see [Network Controller](../technologies/network-controller/Network-Controller.md).
If you are not using Kerberos to secure Network Controller communication, you can use X.509 certificates for authentication, authorization, and encryption.
SDN in Windows Server 2016 Datacenter supports both self\-signed and Certification Authority \(CA\)\-signed X.509 certificates. This topic provides step-by-step instructions for creating these certificates and applying them to the Network Controller Northbound communication channels with management clients and the Southbound communications with network devices, such as the Software Load Balancer \(SLB\).
When you use certificate\-based authentication, you must enroll one certificate on the Network Controller nodes that is used in the following ways.
1. Encrypting Northbound communication with Secure Sockets Layer \(SSL\) between Network Controller nodes and management clients, such as System Center Virtual Machine Manager.
2. Authentication between Network Controller nodes and Southbound devices and services, such as Hyper-V hosts and Software Load Balancers \(SLBs\).
## <a name="creating-and-enrolling-an-x509-certificate"></a>Creating and Enrolling an X.509 Certificate
You can create and enroll either a self\-signed certificate or a certificate that is issued by a Certification Authority \(CA\).
>[!NOTE]
>When you use SCVMM to deploy Network Controller, you must specify the X.509 certificate that is used to encrypt Northbound communications during the configuration of the Network Controller service template.
The certificate configuration must include the following values.
- The value of the **RestEndPoint** text box must be either the Network Controller Fully Qualified Domain Name \(FQDN\) or IP address.
- The **RestEndPoint** value must match the subject name \(Common Name, CN\) of the X.509 certificate.
### <a name="creating-a-self-signed-x509-certificate"></a>Creating a Self\-Signed X.509 Certificate
You can create a self\-signed X.509 certificate and export it with the private key \(protected with a password\) by following these steps for single node and multiple node deployments of Network Controller.
When you create self\-signed certificates, you can use the following guidelines.
- You can use the IP address of the Network Controller REST Endpoint for the DnsName parameter, but this is not recommended because it requires that the Network Controller nodes all reside within a single management subnet \(for example, on a single rack\)
- For multiple node NC deployments, the DNS name specified will become the FQDN of the Network Controller Cluster \(DNS Host A records are created automatically.\)
- For single node Network Controller deployments, the DNS name can be the Network Controller's host name followed by the full domain name.
#### <a name="multiple-node"></a>Multiple Node
You can use the [New-SelfSignedCertificate](https://technet.microsoft.com/itpro/powershell/windows/pkiclient/new-selfsignedcertificate) Windows PowerShell command to create a self\-signed certificate.
**Syntax**
New-SelfSignedCertificate -KeyUsageProperty All -Provider "Microsoft Strong Cryptographic Provider" -FriendlyName "<YourNCComputerName>" -DnsName @("<NCRESTName>")
**Usage example**
New-SelfSignedCertificate -KeyUsageProperty All -Provider "Microsoft Strong Cryptographic Provider" -FriendlyName "MultiNodeNC" -DnsName @("NCCluster.Contoso.com")
#### <a name="single-node"></a>Single Node
You can use the [New-SelfSignedCertificate](https://technet.microsoft.com/itpro/powershell/windows/pkiclient/new-selfsignedcertificate) Windows PowerShell command to create a self\-signed certificate.
**Syntax**
New-SelfSignedCertificate -KeyUsageProperty All -Provider "Microsoft Strong Cryptographic Provider" -FriendlyName "<YourNCComputerName>" -DnsName @("<NCFQDN>")
**Usage example**
New-SelfSignedCertificate -KeyUsageProperty All -Provider "Microsoft Strong Cryptographic Provider" -FriendlyName "SingleNodeNC" -DnsName @("SingleNodeNC.Contoso.com")
### <a name="creating-a-ca-signed-x509-certificate"></a>Creazione di una CA\-certificato X. 509 firmato
Per creare un certificato utilizzando un'autorità di certificazione, è necessario avere già distribuito un'infrastruttura a chiave pubblica \(PKI\) con servizi certificati Active Directory \(AD CS\).
>[!NOTE]
>È possibile utilizzare CA o strumenti di terze parti, ad esempio OpenSSL, per creare un certificato da utilizzare con il controller di rete. Tuttavia, le istruzioni riportate in questo argomento sono specifiche di Servizi certificati Active Directory. Per informazioni sull'utilizzo di una CA o di uno strumento di terze parti, vedere la documentazione relativa al software in uso.
La creazione di un certificato con una CA include i passaggi seguenti.
1. L'utente o il dominio dell'organizzazione o l'amministratore della sicurezza configura il modello di certificato
2. L'utente o l'amministratore del controller di rete dell'organizzazione o l'amministratore SCVMM richiede un nuovo certificato dalla CA.
#### <a name="certificate-configuration-requirements"></a>Requisiti di configurazione dei certificati
Quando si configura un modello di certificato nel passaggio successivo, verificare che il modello configurato includa gli elementi necessari seguenti.
1. Il nome del soggetto del certificato deve essere il nome di dominio completo dell'host Hyper-V
2. Il certificato deve trovarsi nell'archivio personale del computer locale (My – Cert: \ LocalMachine\My)
3. Il certificato deve disporre di criteri di applicazione di autenticazione server (EKU: 1.3.6.1.5.5.7.3.1) e autenticazione client (EKU: 1.3.6.1.5.5.7.3.2).
>[!NOTE]
>Se l'archivio certificati personale \(My – Cert: \ LocalMachine\My\) nell'host Hyper\-V ha più di un certificato X. 509 con nome soggetto (CN) come nome di dominio completo dell'host \(FQDN\), verificare che il certificato che verrà usato da SDN disponga di una proprietà di utilizzo chiavi avanzata personalizzata con l'OID 1.3.6.1.4.1.311.95.1.1.1. In caso contrario, la comunicazione tra il controller di rete e l'host potrebbe non funzionare.
#### <a name="to-configure-the-certificate-template"></a>Per configurare il modello di certificato
>[!NOTE]
>Prima di eseguire questa procedura, è necessario esaminare i requisiti del certificato e i modelli di certificato disponibili nella console modelli di certificato. È possibile modificare un modello esistente o creare un duplicato di un modello esistente e quindi modificare la copia del modello. È consigliabile creare una copia di un modello esistente.
1. Nel server in cui è installato Servizi certificati Active Directory, nella Server Manager fare clic su **strumenti**e quindi su **autorità di certificazione**. Viene visualizzata l'autorità di certificazione Microsoft Management Console \(MMC\).
2. In MMC fare doppio clic sul nome della CA, fare clic con il pulsante destro del mouse su **modelli di certificato**, quindi scegliere **Gestisci**.
3. Verrà visualizzata la console modelli di certificato. Tutti i modelli di certificato vengono visualizzati nel riquadro dei dettagli.
4. Nel riquadro dei dettagli fare clic sul modello che si desidera duplicare.
5. Fare clic sul menu **azione** , quindi fare clic su **Duplica modello**. Verrà visualizzata la finestra di dialogo **Proprietà** modello.
6. Nella finestra di dialogo **Proprietà** modello, nella scheda **nome soggetto** , fare clic su **specificare nella richiesta**. \(questa impostazione è obbligatoria per i certificati SSL del controller di rete.\)
7. Nella finestra di dialogo **Proprietà** modello, nella scheda **Gestione richiesta** , verificare che sia selezionata l'opzione **Consenti esportazione della chiave privata** . Assicurarsi inoltre che sia selezionata la **firma e** lo scopo della crittografia.
8. Nella finestra di dialogo **Proprietà** modello, nella scheda **estensioni** Selezionare **utilizzo chiave**, quindi fare clic su **modifica**.
9. In **firma**verificare che sia selezionata l'opzione **firma digitale** .
10. Nella finestra di dialogo **Proprietà** modello, nella scheda **estensioni** selezionare criteri di **applicazione**, quindi fare clic su **modifica**.
11. In **criteri di applicazione**assicurarsi che l' **autenticazione client** e **l'autenticazione server** siano elencate.
12. Salvare la copia del modello di certificato con un nome univoco, ad esempio il **modello del controller di rete**.
#### <a name="to-request-a-certificate-from-the-ca"></a>Per richiedere un certificato dalla CA
È possibile utilizzare lo snap-in certificati per richiedere i certificati. È possibile richiedere qualsiasi tipo di certificato preconfigurato e reso disponibile da un amministratore della CA che elabora la richiesta di certificato.
**Gli utenti** o gli **amministratori** locali sono l'appartenenza al gruppo minima richiesta per completare questa procedura.
1. Open the Certificates snap-in for a computer.
2. In the console tree, click **Certificates \(Local Computer\)**. Select the **Personal** certificate store.
3. On the **Action** menu, point to **All Tasks**, and then click **Request New Certificate** to start the Certificate Enrollment wizard. Click **Next**.
4. Select the **Administrator** certificate enrollment policy, and click **Next**.
5. Select the **Active Directory** enrollment policy \(based on the CA template configured in the previous section\).
6. Expand the **Details** section, and configure the following items.
    1. Make sure that **Key usage** includes both **Digital signature** and **Key encipherment**.
    2. Make sure that **Application policies** includes both **Server Authentication** \(1.3.6.1.5.5.7.3.1\) and **Client Authentication** \(1.3.6.1.5.5.7.3.2\).
7. Click **Properties**.
8. On the **Subject** tab, under **Subject name**, in **Type**, select **Common name**. In **Value**, specify the **network controller REST endpoint**.
9. Click **Apply**, and then click **OK**.
10. Click **Enroll**.
In the Certificates MMC, click the Personal store to view the certificate that you enrolled from the CA.
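The same enrollment can also be scripted. The following is a hedged sketch using the **Get-Certificate** cmdlet from the built-in PKI module; the template name and subject name below are placeholders, so substitute the template name and REST endpoint used in your own deployment:

```powershell
# Request a certificate based on the custom template created earlier.
# "NetworkControllerTemplate" and the subject name are example values only.
Get-Certificate -Template "NetworkControllerTemplate" `
    -SubjectName "CN=ncrest.fabrikam.com" `
    -CertStoreLocation "Cert:\LocalMachine\My"
```

Because this cmdlet talks to an Active Directory-joined CA, it must be run on a domain-joined machine with rights to the template.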
## <a name="exporting-and-copying-the-certificate-to-the-scvmm-library"></a>Exporting and copying the certificate to the SCVMM library
After you have created a self-signed or CA-signed certificate, you must export the certificate with the private key \(in .pfx format\) and without the private key \(in Base-64 .cer format\) from the Certificates snap-in.
You must then copy the two exported files to the **ServerCertificate.cr** and **NCCertificate.cr** folders that you specified when you imported the NC service template.
1. Open the Certificates snap-in \(certlm.msc\), and locate the certificate in the Personal certificate store for the local computer.
2. Right-click the certificate, point to **All Tasks**, and then click **Export**. The Certificate Export Wizard appears. Click **Next**.
3. Select the **Yes, export the private key** option, and click **Next**.
4. Choose **Personal Information Exchange - PKCS #12 (.PFX)**, and accept the default to **Include all certificates in the certification path** if possible.
5. Assign the users or groups and a password for the certificate that you are exporting, and click **Next**.
6. On the File to Export page, browse to the location where you want to place the exported file, and give it a name.
7. In the same way, export the certificate in .cer format. Note: To export to .cer format, clear the **Yes, export the private key** option.
8. Copy the .pfx file to the **ServerCertificate.cr** folder.
9. Copy the .cer file to the **NCCertificate.cr** folder.
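The export steps above can also be scripted with the standard **Export-PfxCertificate** and **Export-Certificate** cmdlets. This is an illustrative sketch: the subject name, password, and file paths are example values, and note that **Export-Certificate** writes DER-encoded output by default, so a conversion step such as `certutil -encode` may be needed if a Base-64 .cer file is required:

```powershell
# Locate the certificate in the local machine's Personal store (example subject).
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -eq "CN=ncrest.fabrikam.com" }

# Export with the private key (.pfx).
$pfxPassword = ConvertTo-SecureString -String "ExamplePassword!" -AsPlainText -Force
Export-PfxCertificate -Cert $cert -FilePath "C:\Temp\NC.pfx" -Password $pfxPassword

# Export without the private key; the output is DER-encoded.
Export-Certificate -Cert $cert -FilePath "C:\Temp\NC_der.cer"

# Convert the DER file to Base-64 (.cer) if required.
certutil -encode "C:\Temp\NC_der.cer" "C:\Temp\NC.cer"
```

Copy the resulting .pfx and .cer files into the ServerCertificate.cr and NCCertificate.cr folders as described above.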
When you are done, refresh the folders in the SCVMM library, and verify that these certificates are copied. Continue with the network controller service template configuration and deployment.
## <a name="authenticating-southbound-devices-and-services"></a>Authenticating southbound devices and services
Network controller communication with hosts and SLB MUX devices uses certificates for authentication. Communication with hosts is over the OVSDB protocol, while communication with SLB MUX devices is over the WCF protocol.
### <a name="hyper-v-host-communication-with-network-controller"></a>Hyper-V host communication with the network controller
For communication with Hyper-V hosts over OVSDB, the network controller must present a certificate to the host machines. By default, SCVMM picks up the SSL certificate configured on the network controller and uses it for southbound communication with the hosts.
This is why the SSL certificate must have the Client Authentication EKU configured. This certificate is configured on the "Servers" REST resource \(Hyper-V hosts are represented in the network controller as Server resources\) and can be viewed by running the Windows PowerShell command **Get-NetworkControllerServer**.
The following is a partial example of the Server REST resource.
```json
"resourceId": "host31.fabrikam.com",
"properties": {
    "connections": [
        {
            "managementAddresses": [
                "host31.fabrikam.com"
            ],
            "credential": {
                "resourceRef": "/credentials/a738762f-f727-43b5-9c50-cf82a70221fa"
            },
            "credentialType": "X509Certificate"
        }
    ],
```
For mutual authentication, the Hyper-V host must also have a certificate to communicate with the network controller.
You can enroll the certificate from a certification authority \(CA\). If a CA-based certificate is not found on the host machine, SCVMM creates a self-signed certificate and provisions it on the host machine.
The network controller and Hyper-V host certificates must trust each other. The root certificate of the Hyper-V host certificate must be present in the network controller's Trusted Root Certification Authorities store for the local computer, and vice versa.
When you use self-signed certificates, SCVMM ensures that the required certificates are present in the Trusted Root Certification Authorities store for the local computer.
If you use CA-based certificates for the Hyper-V hosts, you must make sure that the CA root certificate is present in the network controller's Trusted Root Certification Authorities store for the local computer.
### <a name="software-load-balancer-mux-communication-with-network-controller"></a>Software load balancer MUX communication with the network controller
The software load balancer multiplexer \(MUX\) and the network controller communicate over the WCF protocol, using certificates for authentication.
By default, SCVMM picks up the SSL certificate configured on the network controller and uses it for southbound communication with the MUX devices. This certificate is configured on the "NetworkControllerLoadBalancerMux" REST resource and can be viewed by running the PowerShell cmdlet **Get-NetworkControllerLoadBalancerMux**.
Partial example of the MUX REST resource:
```json
"resourceId": "slbmux1.fabrikam.com",
"properties": {
    "connections": [
        {
            "managementAddresses": [
                "slbmux1.fabrikam.com"
            ],
            "credential": {
                "resourceRef": "/credentials/a738762f-f727-43b5-9c50-cf82a70221fa"
            },
            "credentialType": "X509Certificate"
        }
    ],
```
For mutual authentication, you must also have a certificate on the SLB MUX devices. This certificate is configured automatically by SCVMM when you deploy the software load balancer through SCVMM.
>[!IMPORTANT]
>On the host and SLB nodes, it is critical that the Trusted Root Certification Authorities certificate store does not include any certificate for which "Issued to" is not the same as "Issued by". If it does, communication between the network controller and the southbound device fails.
The network controller and SLB MUX certificates must trust each other \(the root certificate of the SLB MUX certificate must be present in the network controller machine's Trusted Root Certification Authorities store, and vice versa\). When you use self-signed certificates, SCVMM ensures that the required certificates are present in the Trusted Root Certification Authorities store for the local computer.
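One possible way to spot offending entries (a manual check, not an official verification tool) is to list every certificate in the local machine's Trusted Root store whose subject differs from its issuer:

```powershell
# Any output here indicates a non-self-signed certificate in the Root store,
# which can break network controller southbound communication.
Get-ChildItem Cert:\LocalMachine\Root |
    Where-Object { $_.Subject -ne $_.Issuer } |
    Format-List Subject, Issuer, Thumbprint
```

Run this on the network controller, host, and SLB MUX nodes, and move any listed intermediate certificates to the Intermediate Certification Authorities store.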
PostgreSQL Anonymizer Development Team
===============================================================================
This is an open project. Feel free to join us and improve this tool. To find out
how you can get involved, please read [CONTRIBUTING.md].
[CONTRIBUTING.md]: CONTRIBUTING.md
Maintainer
-------------------------------------------------------------------------------
* Damien Clochard (@daamien)
Contributors
-------------------------------------------------------------------------------
* Christophe Courtois (@Krysztophe) : Proofreading
* Pierre-Henri Dubois Amy (@theodor_lobster) : Proofreading, GIS
* Thomas Clark (@cthomaspdx) : Pseudonymization
* Joe Auty (@joeauty) : Issue #114
* Bernie Caessens (@bcaessens) : Feedback on the black box method
* Sam Buckingham (@sam.buckingham) : Feedback on the black box method
* Sebastien Delobel (@sdelobel) : Typos
* Nikolay Samokhvalov (@NikolayS) : Documentation
* Travis Miller (@travismiller) : MacOS support
* Jan Birk (@jabi27) : Install on Ubuntu
* Olleg Samoylov (@Splarv) : Issue #87, Bug fixes, Documentation
* Damien Cazeils (www.damiencazeils.com) : Logo
* Ioseph Kim (@i0seph) : Documentation
* Matiss Zarkevics (@leovingi) : Tests on Amazon RDS
* Peter Goodwin (@Postgressor) : Tests
* Tim (@roconda) : Documentation
* Michał Lipka (@michallipka) : Tests and typos
* Thibaut Madeleine (@madtibo) : original idea :-)
| 37.026316 | 80 | 0.616205 | eng_Latn | 0.325561 |
# autorunAsync
`autorunAsync(action: () => void, minimumDelay?: number, scope?)`
Same as `autorun`, except that the `action` is not invoked synchronously, but asynchronously after the given minimum number of milliseconds has passed.
The `action` will be run and observed.
However, when the values it observes change, instead of running the `action` immediately, MobX waits `minimumDelay` milliseconds before re-running the `action`.
If the observed values change multiple times while waiting, the `action` is still triggered only once, so in a sense it achieves an effect similar to a transaction.
This might be useful for things that are expensive and don't need to happen synchronously, such as debounced server communication.
If a scope is given, the `action` is bound to that scope object.
`autorunAsync(debugName: string, action: () => void, minimumDelay?: number, scope?)`
If the first argument passed to `autorunAsync` is a string, it is used as a debug name.
`autorunAsync` returns a disposer function that can be used to cancel the autorun.
```javascript
autorunAsync(() => {
    // Assuming profile.asJson returns an observable json representation of the profile,
    // send it to the server each time it changes, but wait at least 300 milliseconds before sending.
    // When sent, the latest value of profile.asJson will be used.
    sendProfileToServer(profile.asJson);
}, 300);
| 26.321429 | 84 | 0.761194 | yue_Hant | 0.64543 |
---
title: At most one record can be returned by this subquery. (Error 3354)
keywords: jeterr40.chm5003354
f1_keywords:
- jeterr40.chm5003354
ms.prod: access
ms.assetid: 55e2f3f8-26e2-b6d7-809e-e77f11bbb1dd
ms.date: 06/08/2019
ms.localizationpriority: medium
---
# At most one record can be returned by this subquery. (Error 3354)
**Applies to:** Access 2013 | Access 2016
A subquery of this kind cannot return more than one record. Revise the SELECT statement of the subquery to request only one record.
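For example, a subquery used as an expression must yield a single value. The sketch below uses hypothetical `Customers` and `Orders` tables; replacing the bare column subquery with an aggregate such as `MAX` guarantees exactly one record per row:

```sql
-- Raises error 3354 when a customer has more than one order:
SELECT CustomerName,
       (SELECT OrderDate
        FROM Orders
        WHERE Orders.CustomerID = Customers.CustomerID) AS LastOrder
FROM Customers;

-- Aggregating makes the subquery return exactly one value per customer:
SELECT CustomerName,
       (SELECT MAX(OrderDate)
        FROM Orders
        WHERE Orders.CustomerID = Customers.CustomerID) AS LastOrder
FROM Customers;
```

An aggregate is safer here than `TOP 1`, because Access `TOP` can return more than one row when there are ties.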
## See also
- [Access for developers forum](https://social.msdn.microsoft.com/Forums/office/home?forum=accessdev)
- [Access help on support.office.com](https://support.office.com/search/results?query=Access)
- [Access help on answers.microsoft.com](https://answers.microsoft.com/)
- [Access forums on UtterAccess](https://www.utteraccess.com/forum/index.php?act=idx)
- [Access developer and VBA programming help center (FMS)](https://www.fmsinc.com/MicrosoftAccess/developer/)
- [Access posts on StackOverflow](https://stackoverflow.com/questions/tagged/ms-access)
[!include[Support and feedback](~/includes/feedback-boilerplate.md)]
# NOTICE
This directory is deprecated.
Please see [kanon_httpd](https://github.com/Conzxy/kanon_httpd).
---
title: Document.Save method (Word)
keywords: vbawd10.chm158007404
f1_keywords:
- vbawd10.chm158007404
ms.prod: word
api_name:
- Word.Document.Save
ms.assetid: 7e329abc-0530-7016-7712-687de2c780a8
ms.date: 06/08/2017
localization_priority: Normal
---
# Document.Save method (Word)
Saves the specified document.
## Syntax
_expression_. `Save`
_expression_ Required. A variable that represents a **[Document](Word.Document.md)** object.
**Parameters:**
_NoPrompt_ (Optional)
If `true`, then Word automatically saves all documents.
If `false`, then Word prompts the user to save each document that has changed since it was last saved.
_OriginalFormat_ (Optional)
Specifies the way the documents are saved. Can be one of the WdOriginalFormat constants.
## Remarks
If a document has not been saved before, the **Save As** dialog box prompts the user for a file name.
## Example
This example saves the active document if it has changed since it was last saved.
```vb
If ActiveDocument.Saved = False Then ActiveDocument.Save
```
This example saves each document in the **Documents** collection without first prompting the user.
```vb
Documents.Save NoPrompt:=True, _
OriginalFormat:=wdOriginalDocumentFormat
```
## See also
[Document Object](Word.Document.md)
[!include[Support and feedback](~/includes/feedback-boilerplate.md)]
| 20 | 102 | 0.758088 | eng_Latn | 0.895006 |
---
title: Visual Studio Container Tools with ASP.NET Core
author: ghogen
description: Learn how to use Visual Studio Container Tools and Docker for Windows
ms.author: ghogen
ms.date: 02/01/2019
ms.technology: vs-azure
ms.topic: include
---
# Quickstart: Docker in Visual Studio
::: moniker range="vs-2017"
[!include[Visual Studio Container Tools](includes/vs-2017/container-tools.md)]
::: moniker-end
::: moniker range=">= vs-2019"
[!include[Visual Studio Container Tools](includes/vs-2019/container-tools.md)]
::: moniker-end
## Additional resources
* [Container development with Visual Studio](/visualstudio/containers)
* [Troubleshoot Visual Studio development with Docker](troubleshooting-docker-errors.md)
* [Visual Studio Container Tools GitHub repository](https://github.com/Microsoft/DockerTools)
| 27.166667 | 93 | 0.768098 | eng_Latn | 0.56427 |
---
title: Extensions - Azure Database for PostgreSQL - Single Server
description: Learn about the available Postgres extensions in Azure Database for PostgreSQL - Single Server
author: rachel-msft
ms.author: raagyema
ms.service: postgresql
ms.topic: conceptual
ms.date: 10/11/2019
---
# PostgreSQL extensions in Azure Database for PostgreSQL - Single Server
PostgreSQL provides the ability to extend the functionality of your database using extensions. Extensions bundle multiple related SQL objects together in a single package that can be loaded or removed from your database with a single command. After being loaded in the database, extensions function like built-in features.
## How to use PostgreSQL extensions
PostgreSQL extensions must be installed in your database before you can use them. To install a particular extension, run the [CREATE EXTENSION](https://www.postgresql.org/docs/current/sql-createextension.html) command from the psql tool to load the packaged objects into your database.
Azure Database for PostgreSQL supports a subset of key extensions as listed below. This information is also available by running `SELECT * FROM pg_available_extensions;`. Extensions beyond the ones listed are not supported. You cannot create your own extension in Azure Database for PostgreSQL.
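For example, from psql (the extension name below is just an illustration — any extension from the lists that follow works the same way):

```sql
-- Load an extension into the current database.
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- See which extensions the server makes available, and which are installed.
SELECT name, default_version, installed_version
FROM pg_available_extensions
ORDER BY name;
```

`CREATE EXTENSION` must be run in each database where you want the extension's objects to be available.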
## Postgres 11 extensions
The following extensions are available in Azure Database for PostgreSQL servers which have Postgres version 11.
> [!div class="mx-tableFixed"]
> | **Extension**| **Extension version** | **Description** |
> |---|---|---|
> |[address_standardizer](http://postgis.net/docs/Address_Standardizer.html) | 2.5.1 | Used to parse an address into constituent elements. |
> |[address_standardizer_data_us](http://postgis.net/docs/Address_Standardizer.html) | 2.5.1 | Address Standardizer US dataset example|
> |[btree_gin](https://www.postgresql.org/docs/11/btree-gin.html) | 1.3 | support for indexing common datatypes in GIN|
> |[btree_gist](https://www.postgresql.org/docs/11/btree-gist.html) | 1.5 | support for indexing common datatypes in GiST|
> |[citext](https://www.postgresql.org/docs/11/citext.html) | 1.5 | data type for case-insensitive character strings|
> |[cube](https://www.postgresql.org/docs/11/cube.html) | 1.4 | data type for multidimensional cubes|
> |[dblink](https://www.postgresql.org/docs/11/dblink.html) | 1.2 | connect to other PostgreSQL databases from within a database|
> |[dict_int](https://www.postgresql.org/docs/11/dict-int.html) | 1.0 | text search dictionary template for integers|
> |[earthdistance](https://www.postgresql.org/docs/11/earthdistance.html) | 1.1 | calculate great-circle distances on the surface of the Earth|
> |[fuzzystrmatch](https://www.postgresql.org/docs/11/fuzzystrmatch.html) | 1.1 | determine similarities and distance between strings|
> |[hstore](https://www.postgresql.org/docs/11/hstore.html) | 1.5 | data type for storing sets of (key, value) pairs|
> |[hypopg](https://hypopg.readthedocs.io/en/latest/) | 1.1.2 | Hypothetical indexes for PostgreSQL|
> |[intarray](https://www.postgresql.org/docs/11/intarray.html) | 1.2 | functions, operators, and index support for 1-D arrays of integers|
> |[isn](https://www.postgresql.org/docs/11/isn.html) | 1.2 | data types for international product numbering standards|
> |[ltree](https://www.postgresql.org/docs/11/ltree.html) | 1.1 | data type for hierarchical tree-like structures|
> |[orafce](https://github.com/orafce/orafce) | 3.7 | Functions and operators that emulate a subset of functions and packages from commercial RDBMS|
> |[pgaudit](https://www.pgaudit.org/) | 1.3.1 | provides auditing functionality|
> |[pgcrypto](https://www.postgresql.org/docs/11/pgcrypto.html) | 1.3 | cryptographic functions|
> |[pgrouting](https://pgrouting.org/) | 2.6.2 | pgRouting Extension|
> |[pgrowlocks](https://www.postgresql.org/docs/11/pgrowlocks.html) | 1.2 | show row-level locking information|
> |[pgstattuple](https://www.postgresql.org/docs/11/pgstattuple.html) | 1.5 | show tuple-level statistics|
> |[pg_buffercache](https://www.postgresql.org/docs/11/pgbuffercache.html) | 1.3 | examine the shared buffer cache|
> |[pg_partman](https://github.com/pgpartman/pg_partman) | 4.0.0 | Extension to manage partitioned tables by time or ID|
> |[pg_prewarm](https://www.postgresql.org/docs/11/pgprewarm.html) | 1.2 | prewarm relation data|
> |[pg_stat_statements](https://www.postgresql.org/docs/11/pgstatstatements.html) | 1.6 | track execution statistics of all SQL statements executed|
> |[pg_trgm](https://www.postgresql.org/docs/11/pgtrgm.html) | 1.4 | text similarity measurement and index searching based on trigrams|
> |[plpgsql](https://www.postgresql.org/docs/11/plpgsql.html) | 1.0 | PL/pgSQL procedural language|
> |[plv8](https://plv8.github.io/) | 2.3.11 | PL/JavaScript (v8) trusted procedural language|
> |[postgis](https://www.postgis.net/) | 2.5.1 | PostGIS geometry, geography, and raster spatial types and functions|
> |[postgis_sfcgal](https://www.postgis.net/) | 2.5.1 | PostGIS SFCGAL functions|
> |[postgis_tiger_geocoder](https://www.postgis.net/) | 2.5.1 | PostGIS tiger geocoder and reverse geocoder|
> |[postgis_topology](https://postgis.net/docs/Topology.html) | 2.5.1 | PostGIS topology spatial types and functions|
> |[postgres_fdw](https://www.postgresql.org/docs/11/postgres-fdw.html) | 1.0 | foreign-data wrapper for remote PostgreSQL servers|
> |[tablefunc](https://www.postgresql.org/docs/11/tablefunc.html) | 1.0 | functions that manipulate whole tables, including crosstab|
> |[timescaledb](https://docs.timescale.com/latest) | 1.3.2 | Enables scalable inserts and complex queries for time-series data|
> |[unaccent](https://www.postgresql.org/docs/11/unaccent.html) | 1.1 | text search dictionary that removes accents|
> |[uuid-ossp](https://www.postgresql.org/docs/11/uuid-ossp.html) | 1.1 | generate universally unique identifiers (UUIDs)|
## Postgres 10 extensions
The following extensions are available in Azure Database for PostgreSQL servers which have Postgres version 10.
> [!div class="mx-tableFixed"]
> | **Extension**| **Extension version** | **Description** |
> |---|---|---|
> |[address_standardizer](http://postgis.net/docs/Address_Standardizer.html) | 2.5.1 | Used to parse an address into constituent elements. |
> |[address_standardizer_data_us](http://postgis.net/docs/Address_Standardizer.html) | 2.5.1 | Address Standardizer US dataset example|
> |[btree_gin](https://www.postgresql.org/docs/10/btree-gin.html) | 1.3 | support for indexing common datatypes in GIN|
> |[btree_gist](https://www.postgresql.org/docs/10/btree-gist.html) | 1.5 | support for indexing common datatypes in GiST|
> |[chkpass](https://www.postgresql.org/docs/10/chkpass.html) | 1.0 | data type for auto-encrypted passwords|
> |[citext](https://www.postgresql.org/docs/10/citext.html) | 1.4 | data type for case-insensitive character strings|
> |[cube](https://www.postgresql.org/docs/10/cube.html) | 1.2 | data type for multidimensional cubes|
> |[dblink](https://www.postgresql.org/docs/10/dblink.html) | 1.2 | connect to other PostgreSQL databases from within a database|
> |[dict_int](https://www.postgresql.org/docs/10/dict-int.html) | 1.0 | text search dictionary template for integers|
> |[earthdistance](https://www.postgresql.org/docs/10/earthdistance.html) | 1.1 | calculate great-circle distances on the surface of the Earth|
> |[fuzzystrmatch](https://www.postgresql.org/docs/10/fuzzystrmatch.html) | 1.1 | determine similarities and distance between strings|
> |[hstore](https://www.postgresql.org/docs/10/hstore.html) | 1.4 | data type for storing sets of (key, value) pairs|
> |[hypopg](https://hypopg.readthedocs.io/en/latest/) | 1.1.1 | Hypothetical indexes for PostgreSQL|
> |[intarray](https://www.postgresql.org/docs/10/intarray.html) | 1.2 | functions, operators, and index support for 1-D arrays of integers|
> |[isn](https://www.postgresql.org/docs/10/isn.html) | 1.1 | data types for international product numbering standards|
> |[ltree](https://www.postgresql.org/docs/10/ltree.html) | 1.1 | data type for hierarchical tree-like structures|
> |[orafce](https://github.com/orafce/orafce) | 3.7 | Functions and operators that emulate a subset of functions and packages from commercial RDBMS|
> |[pgaudit](https://www.pgaudit.org/) | 1.2 | provides auditing functionality|
> |[pgcrypto](https://www.postgresql.org/docs/10/pgcrypto.html) | 1.3 | cryptographic functions|
> |[pgrouting](https://pgrouting.org/) | 2.5.2 | pgRouting Extension|
> |[pgrowlocks](https://www.postgresql.org/docs/10/pgrowlocks.html) | 1.2 | show row-level locking information|
> |[pgstattuple](https://www.postgresql.org/docs/10/pgstattuple.html) | 1.5 | show tuple-level statistics|
> |[pg_buffercache](https://www.postgresql.org/docs/10/pgbuffercache.html) | 1.3 | examine the shared buffer cache|
> |[pg_partman](https://github.com/pgpartman/pg_partman) | 2.6.3 | Extension to manage partitioned tables by time or ID|
> |[pg_prewarm](https://www.postgresql.org/docs/10/pgprewarm.html) | 1.1 | prewarm relation data|
> |[pg_stat_statements](https://www.postgresql.org/docs/10/pgstatstatements.html) | 1.6 | track execution statistics of all SQL statements executed|
> |[pg_trgm](https://www.postgresql.org/docs/10/pgtrgm.html) | 1.3 | text similarity measurement and index searching based on trigrams|
> |[plpgsql](https://www.postgresql.org/docs/10/plpgsql.html) | 1.0 | PL/pgSQL procedural language|
> |[plv8](https://plv8.github.io/) | 2.1.0 | PL/JavaScript (v8) trusted procedural language|
> |[postgis](https://www.postgis.net/) | 2.4.3 | PostGIS geometry, geography, and raster spatial types and functions|
> |[postgis_sfcgal](https://www.postgis.net/) | 2.4.3 | PostGIS SFCGAL functions|
> |[postgis_tiger_geocoder](https://www.postgis.net/) | 2.4.3 | PostGIS tiger geocoder and reverse geocoder|
> |[postgis_topology](https://postgis.net/docs/Topology.html) | 2.4.3 | PostGIS topology spatial types and functions|
> |[postgres_fdw](https://www.postgresql.org/docs/10/postgres-fdw.html) | 1.0 | foreign-data wrapper for remote PostgreSQL servers|
> |[tablefunc](https://www.postgresql.org/docs/10/tablefunc.html) | 1.0 | functions that manipulate whole tables, including crosstab|
> |[timescaledb](https://docs.timescale.com/latest) | 1.1.1 | Enables scalable inserts and complex queries for time-series data|
> |[unaccent](https://www.postgresql.org/docs/10/unaccent.html) | 1.1 | text search dictionary that removes accents|
> |[uuid-ossp](https://www.postgresql.org/docs/10/uuid-ossp.html) | 1.1 | generate universally unique identifiers (UUIDs)|
## Postgres 9.6 extensions
The following extensions are available in Azure Database for PostgreSQL servers which have Postgres version 9.6.
> [!div class="mx-tableFixed"]
> | **Extension**| **Extension version** | **Description** |
> |---|---|---|
> |[address_standardizer](http://postgis.net/docs/Address_Standardizer.html) | 2.3.2 | Used to parse an address into constituent elements. |
> |[address_standardizer_data_us](http://postgis.net/docs/Address_Standardizer.html) | 2.3.2 | Address Standardizer US dataset example|
> |[btree_gin](https://www.postgresql.org/docs/9.6/btree-gin.html) | 1.0 | support for indexing common datatypes in GIN|
> |[btree_gist](https://www.postgresql.org/docs/9.6/btree-gist.html) | 1.2 | support for indexing common datatypes in GiST|
> |[chkpass](https://www.postgresql.org/docs/9.6/chkpass.html) | 1.0 | data type for auto-encrypted passwords|
> |[citext](https://www.postgresql.org/docs/9.6/citext.html) | 1.3 | data type for case-insensitive character strings|
> |[cube](https://www.postgresql.org/docs/9.6/cube.html) | 1.2 | data type for multidimensional cubes|
> |[dblink](https://www.postgresql.org/docs/9.6/dblink.html) | 1.2 | connect to other PostgreSQL databases from within a database|
> |[dict_int](https://www.postgresql.org/docs/9.6/dict-int.html) | 1.0 | text search dictionary template for integers|
> |[earthdistance](https://www.postgresql.org/docs/9.6/earthdistance.html) | 1.1 | calculate great-circle distances on the surface of the Earth|
> |[fuzzystrmatch](https://www.postgresql.org/docs/9.6/fuzzystrmatch.html) | 1.1 | determine similarities and distance between strings|
> |[hstore](https://www.postgresql.org/docs/9.6/hstore.html) | 1.4 | data type for storing sets of (key, value) pairs|
> |[hypopg](https://hypopg.readthedocs.io/en/latest/) | 1.1.1 | Hypothetical indexes for PostgreSQL|
> |[intarray](https://www.postgresql.org/docs/9.6/intarray.html) | 1.2 | functions, operators, and index support for 1-D arrays of integers|
> |[isn](https://www.postgresql.org/docs/9.6/isn.html) | 1.1 | data types for international product numbering standards|
> |[ltree](https://www.postgresql.org/docs/9.6/ltree.html) | 1.1 | data type for hierarchical tree-like structures|
> |[orafce](https://github.com/orafce/orafce) | 3.7 | Functions and operators that emulate a subset of functions and packages from commercial RDBMS|
> |[pgaudit](https://www.pgaudit.org/) | 1.1.2 | provides auditing functionality|
> |[pgcrypto](https://www.postgresql.org/docs/9.6/pgcrypto.html) | 1.3 | cryptographic functions|
> |[pgrouting](https://pgrouting.org/) | 2.3.2 | pgRouting Extension|
> |[pgrowlocks](https://www.postgresql.org/docs/9.6/pgrowlocks.html) | 1.2 | show row-level locking information|
> |[pgstattuple](https://www.postgresql.org/docs/9.6/pgstattuple.html) | 1.4 | show tuple-level statistics|
> |[pg_buffercache](https://www.postgresql.org/docs/9.6/pgbuffercache.html) | 1.2 | examine the shared buffer cache|
> |[pg_partman](https://github.com/pgpartman/pg_partman) | 2.6.3 | Extension to manage partitioned tables by time or ID|
> |[pg_prewarm](https://www.postgresql.org/docs/9.6/pgprewarm.html) | 1.1 | prewarm relation data|
> |[pg_stat_statements](https://www.postgresql.org/docs/9.6/pgstatstatements.html) | 1.4 | track execution statistics of all SQL statements executed|
> |[pg_trgm](https://www.postgresql.org/docs/9.6/pgtrgm.html) | 1.3 | text similarity measurement and index searching based on trigrams|
> |[plpgsql](https://www.postgresql.org/docs/9.6/plpgsql.html) | 1.0 | PL/pgSQL procedural language|
> |[plv8](https://plv8.github.io/) | 2.1.0 | PL/JavaScript (v8) trusted procedural language|
> |[postgis](https://www.postgis.net/) | 2.3.2 | PostGIS geometry, geography, and raster spatial types and functions|
> |[postgis_sfcgal](https://www.postgis.net/) | 2.3.2 | PostGIS SFCGAL functions|
> |[postgis_tiger_geocoder](https://www.postgis.net/) | 2.3.2 | PostGIS tiger geocoder and reverse geocoder|
> |[postgis_topology](https://postgis.net/docs/Topology.html) | 2.3.2 | PostGIS topology spatial types and functions|
> |[postgres_fdw](https://www.postgresql.org/docs/9.6/postgres-fdw.html) | 1.0 | foreign-data wrapper for remote PostgreSQL servers|
> |[tablefunc](https://www.postgresql.org/docs/9.6/tablefunc.html) | 1.0 | functions that manipulate whole tables, including crosstab|
> |[timescaledb](https://docs.timescale.com/latest) | 1.1.1 | Enables scalable inserts and complex queries for time-series data|
> |[unaccent](https://www.postgresql.org/docs/9.6/unaccent.html) | 1.1 | text search dictionary that removes accents|
> |[uuid-ossp](https://www.postgresql.org/docs/9.6/uuid-ossp.html) | 1.1 | generate universally unique identifiers (UUIDs)|
## Postgres 9.5 extensions
The following extensions are available in Azure Database for PostgreSQL servers which have Postgres version 9.5.
> [!div class="mx-tableFixed"]
> | **Extension**| **Extension version** | **Description** |
> |---|---|---|
> |[address_standardizer](http://postgis.net/docs/Address_Standardizer.html) | 2.3.0 | Used to parse an address into constituent elements. |
> |[address_standardizer_data_us](http://postgis.net/docs/Address_Standardizer.html) | 2.3.0 | Address Standardizer US dataset example|
> |[btree_gin](https://www.postgresql.org/docs/9.5/btree-gin.html) | 1.0 | support for indexing common datatypes in GIN|
> |[btree_gist](https://www.postgresql.org/docs/9.5/btree-gist.html) | 1.1 | support for indexing common datatypes in GiST|
> |[chkpass](https://www.postgresql.org/docs/9.5/chkpass.html) | 1.0 | data type for auto-encrypted passwords|
> |[citext](https://www.postgresql.org/docs/9.5/citext.html) | 1.1 | data type for case-insensitive character strings|
> |[cube](https://www.postgresql.org/docs/9.5/cube.html) | 1.0 | data type for multidimensional cubes|
> |[dblink](https://www.postgresql.org/docs/9.5/dblink.html) | 1.1 | connect to other PostgreSQL databases from within a database|
> |[dict_int](https://www.postgresql.org/docs/9.5/dict-int.html) | 1.0 | text search dictionary template for integers|
> |[earthdistance](https://www.postgresql.org/docs/9.5/earthdistance.html) | 1.0 | calculate great-circle distances on the surface of the Earth|
> |[fuzzystrmatch](https://www.postgresql.org/docs/9.5/fuzzystrmatch.html) | 1.0 | determine similarities and distance between strings|
> |[hstore](https://www.postgresql.org/docs/9.5/hstore.html) | 1.3 | data type for storing sets of (key, value) pairs|
> |[hypopg](https://hypopg.readthedocs.io/en/latest/) | 1.1.1 | Hypothetical indexes for PostgreSQL|
> |[intarray](https://www.postgresql.org/docs/9.5/intarray.html) | 1.0 | functions, operators, and index support for 1-D arrays of integers|
> |[isn](https://www.postgresql.org/docs/9.5/isn.html) | 1.0 | data types for international product numbering standards|
> |[ltree](https://www.postgresql.org/docs/9.5/ltree.html) | 1.0 | data type for hierarchical tree-like structures|
> |[orafce](https://github.com/orafce/orafce) | 3.7 | Functions and operators that emulate a subset of functions and packages from commercial RDBMS|
> |[pgaudit](https://www.pgaudit.org/) | 1.0.7 | provides auditing functionality|
> |[pgcrypto](https://www.postgresql.org/docs/9.5/pgcrypto.html) | 1.2 | cryptographic functions|
> |[pgrouting](https://pgrouting.org/) | 2.3.0 | pgRouting Extension|
> |[pgrowlocks](https://www.postgresql.org/docs/9.5/pgrowlocks.html) | 1.1 | show row-level locking information|
> |[pgstattuple](https://www.postgresql.org/docs/9.5/pgstattuple.html) | 1.3 | show tuple-level statistics|
> |[pg_buffercache](https://www.postgresql.org/docs/9.5/pgbuffercache.html) | 1.1 | examine the shared buffer cache|
> |[pg_partman](https://github.com/pgpartman/pg_partman) | 2.6.3 | Extension to manage partitioned tables by time or ID|
> |[pg_prewarm](https://www.postgresql.org/docs/9.5/pgprewarm.html) | 1.0 | prewarm relation data|
> |[pg_stat_statements](https://www.postgresql.org/docs/9.5/pgstatstatements.html) | 1.3 | track execution statistics of all SQL statements executed|
> |[pg_trgm](https://www.postgresql.org/docs/9.5/pgtrgm.html) | 1.1 | text similarity measurement and index searching based on trigrams|
> |[plpgsql](https://www.postgresql.org/docs/9.5/plpgsql.html) | 1.0 | PL/pgSQL procedural language|
> |[postgis](https://www.postgis.net/) | 2.3.0 | PostGIS geometry, geography, and raster spatial types and functions|
> |[postgis_sfcgal](https://www.postgis.net/) | 2.3.0 | PostGIS SFCGAL functions|
> |[postgis_tiger_geocoder](https://www.postgis.net/) | 2.3.0 | PostGIS tiger geocoder and reverse geocoder|
> |[postgis_topology](https://postgis.net/docs/Topology.html) | 2.3.0 | PostGIS topology spatial types and functions|
> |[postgres_fdw](https://www.postgresql.org/docs/9.5/postgres-fdw.html) | 1.0 | foreign-data wrapper for remote PostgreSQL servers|
> |[tablefunc](https://www.postgresql.org/docs/9.5/tablefunc.html) | 1.0 | functions that manipulate whole tables, including crosstab|
> |[unaccent](https://www.postgresql.org/docs/9.5/unaccent.html) | 1.0 | text search dictionary that removes accents|
> |[uuid-ossp](https://www.postgresql.org/docs/9.5/uuid-ossp.html) | 1.0 | generate universally unique identifiers (UUIDs)|
## pg_stat_statements
The pg_stat_statements extension is preloaded on every Azure Database for PostgreSQL server to provide you a means of tracking execution statistics of SQL statements.
The setting `pg_stat_statements.track`, which controls what statements are counted by the extension, defaults to `top`, meaning all statements issued directly by clients are tracked. The two other tracking levels are `none` and `all`. This setting is configurable as a server parameter through the [Azure portal](https://docs.microsoft.com/azure/postgresql/howto-configure-server-parameters-using-portal) or the [Azure CLI](https://docs.microsoft.com/azure/postgresql/howto-configure-server-parameters-using-cli).
There is a tradeoff between the query execution information pg_stat_statements provides and the impact on server performance as it logs each SQL statement. If you are not actively using the pg_stat_statements extension, we recommend that you set `pg_stat_statements.track` to `none`. Note that some third party monitoring services may rely on pg_stat_statements to deliver query performance insights, so confirm whether this is the case for you or not.
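For example, once the extension is tracking statements, you can ask it for the most expensive ones. The query below is a sketch; the column names match PostgreSQL 9.6/10 (newer major versions rename `total_time` to `total_exec_time`):

```sql
-- Five statements with the highest cumulative execution time
SELECT query, calls, total_time, rows
FROM pg_stat_statements
ORDER BY total_time DESC
LIMIT 5;
```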
## dblink and postgres_fdw
dblink and postgres_fdw allow you to connect from one PostgreSQL server to another, or to another database in the same server. The receiving server needs to allow connections from the sending server through its firewall. When using these extensions to connect between Azure Database for PostgreSQL servers, this can be done by setting "Allow access to Azure services" to ON. This is also needed if you want to use the extensions to loop back to the same server. The "Allow access to Azure services" setting can be found in the Azure portal page for the Postgres server, under Connection Security. Turning "Allow access to Azure services" ON puts all Azure IPs on the allow list.
Currently, outbound connections from Azure Database for PostgreSQL are not supported, except for connections to other Azure Database for PostgreSQL servers.
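As a sketch, a dblink query against a second Azure Database for PostgreSQL server looks like the following. The server name, credentials, and table here are placeholders, not values from this article:

```sql
-- Query a remote table through dblink; the connection string and
-- remote table are illustrative placeholders.
SELECT *
FROM dblink(
  'host=mydemoserver.postgres.database.azure.com user=myadmin@mydemoserver dbname=postgres password=<password>',
  'SELECT id, name FROM mytable'
) AS remote(id int, name text);
```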
## uuid
If you are planning to use `uuid_generate_v4()` from the uuid-ossp extension, consider comparing with `gen_random_uuid()` from the pgcrypto extension for performance benefits.
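A minimal way to try both side by side (assuming both extensions have been created in the database):

```sql
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
CREATE EXTENSION IF NOT EXISTS pgcrypto;

SELECT uuid_generate_v4();  -- uuid-ossp
SELECT gen_random_uuid();   -- pgcrypto alternative to compare against
```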
## pgAudit
The pgAudit extension provides session and object audit logging. To learn how to use this extension in Azure Database for PostgreSQL, visit the [auditing concepts article](concepts-audit.md).
## TimescaleDB
TimescaleDB is a time-series database that is packaged as an extension for PostgreSQL. TimescaleDB provides time-oriented analytical functions, optimizations, and scales Postgres for time-series workloads.
[Learn more about TimescaleDB](https://docs.timescale.com/latest), a registered trademark of [Timescale, Inc.](https://www.timescale.com/)
### Installing TimescaleDB
To install TimescaleDB, you need to include it in the server's shared preload libraries. A change to Postgres's `shared_preload_libraries` parameter requires a **server restart** to take effect. You can change parameters using the [Azure portal](howto-configure-server-parameters-using-portal.md) or the [Azure CLI](howto-configure-server-parameters-using-cli.md).
Using the [Azure portal](https://portal.azure.com/):
1. Select your Azure Database for PostgreSQL server.
2. On the sidebar, select **Server Parameters**.
3. Search for the `shared_preload_libraries` parameter.
4. Select **TimescaleDB**.
5. Select **Save** to preserve your changes. You get a notification once the change is saved.
6. After the notification, **restart** the server to apply these changes. To learn how to restart a server, see [Restart an Azure Database for PostgreSQL server](howto-restart-server-portal.md).
You can now enable TimescaleDB in your Postgres database. Connect to the database and issue the following command:
```sql
CREATE EXTENSION IF NOT EXISTS timescaledb CASCADE;
```
> [!TIP]
> If you see an error, confirm that you [restarted your server](howto-restart-server-portal.md) after saving shared_preload_libraries.
You can now create a TimescaleDB hypertable [from scratch](https://docs.timescale.com/getting-started/creating-hypertables) or migrate [existing time-series data in PostgreSQL](https://docs.timescale.com/getting-started/migrating-data).
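For instance, a minimal hypertable looks like the following; the table and column names are illustrative only:

```sql
CREATE TABLE conditions (
  time        TIMESTAMPTZ      NOT NULL,
  location    TEXT             NOT NULL,
  temperature DOUBLE PRECISION
);

SELECT create_hypertable('conditions', 'time');
```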
## Next steps
If you don't see an extension that you'd like to use, let us know. Vote for existing requests or create new feedback requests in our [feedback forum](https://feedback.azure.com/forums/597976-azure-database-for-postgresql).
# Connecting to a Replica Set Instance<a name="dds_02_0014"></a>
- **[Binding and Unbinding an EIP](binding-and-unbinding-an-eip(replica-set).md)**
- **[Enabling or Disabling SSL](enabling-or-disabling-ssl(replica-set).md)**
- **[Connecting to a DB Instance Through a Client](connecting-to-a-db-instance-through-a-client(replica-set).md)**
The dashboard currently supports user account creation and is able to ingest EFFIS data. Users are also able to save their current view and make it available for other users to select and load.
# Database Structure #
This is an approximate description of the database in use.
For detailed information, use the [export file][].
[export file]:https://github.com/bmstu-iu9/utp2019-8-chat/blob/master/database/9SpT1uQOyM.sql
## Tables ##
### users ###
Authorization data.
| Name | Type | Comment |
| ----- | ----------- | ------------- |
| id | int(11) | ID |
| login | varchar(20) | Login |
| hash | text | Password hash |
| salt | text | Password salt |
### users_data ###
User profile data.
| Name | Type | Comment |
| ----------- | ------------ | ------------------------ |
| id | int(11) | ID |
| nickname | varchar(20) | Nickname |
| permissions | tinyint(4) | Privileges bitmask |
| avatar | varchar(100) | Path to the avatar file |
| meta | json | Metadata |
### chat ###
Channel data.
| Name | Type | Comment |
| ------- | ----------- | ------------------------------- |
| chat_id | int(11) | ID |
| name | varchar(20) | Name |
| user_id | int(11) | ID of the channel's creator (owner) |
| meta | json | Metadata |
### party ###
Data about users in channels. A record like 2-3 means that the user with id 3 has been added to channel 2.
| Name | Type | Comment |
| ------- | ------- | --------------- |
| chat_id | int(11) | Channel ID |
| user_id | int(11) | User ID |
### messages ###
Table with messages from all channels.
| Name | Type | Comment |
| ----------- | ---------- | ------------------ |
| message_id | bigint(20) | Message ID |
| chat_id | bigint(20) | Channel ID |
| user_id | int(11) | Sender ID |
| content | text | Message text |
| date_create | datetime | Date and time sent |
2652d6d15eaea9bc424037dcc3b37b9d7182fbdf | 2,841 | md | Markdown | src/es/2020-02/07/07.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 68 | 2016-10-30T23:17:56.000Z | 2022-03-27T11:58:16.000Z | src/es/2020-02/07/07.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 367 | 2016-10-21T03:50:22.000Z | 2022-03-28T23:35:25.000Z | src/es/2020-02/07/07.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 109 | 2016-08-02T14:32:13.000Z | 2022-03-31T10:18:41.000Z | ---
title: Para Estudiar Y Meditar
date: 15/05/2020
---
Lee Elena de White, El conflicto de los siglos, “El lucero de la Reforma”, pp. 75-90; “Un campeón de la verdad”, pp. 135-156. También lee la parte 4, de los puntos a. al j., del documento: “Métodos de estudio de la Biblia”: https://cort.as/-MdHR.
“En su Palabra, Dios comunicó a los hombres el conocimiento necesario para la salvación. Las Santas Escrituras deben ser aceptadas como una revelación autorizada e infalible de su voluntad. Son la norma del carácter, las reveladoras de doctrinas y las examinadoras de la experiencia. [...] Sin embargo, el hecho de haber revelado Dios su voluntad a los hombres por medio de su Palabra no anuló la necesidad que ellos tienen de la continua presencia y dirección del Espíritu Santo. Por el contrario, el Salvador prometió el Espíritu para abrir la Palabra a sus siervos, para iluminar y aplicar sus enseñanzas. Y, como el Espíritu de Dios fue el que inspiró la Biblia, es imposible que alguna vez las enseñanzas del Espíritu sean contrarias a las de la Palabra” (CS 7).
**Preguntas para dialogar**
`1. Independientemente de la cantidad de traducciones de la Biblia que haya en tu idioma, ¿qué puedes hacer para aprovechar al máximo lo que tienes? ¿Cómo puedes aprender a apreciar la Biblia como la Palabra de Dios y procurar, con fe, obedecer lo que esta enseña?`
`2. Piensa en la diferencia entre lo que la Palabra de Dios enseña acerca del origen de la humanidad (que fuimos creados por Dios en el sexto día de la Creación) y lo que la misma humanidad enseña, bajo el nombre de “ciencia” (que evolucionamos durante miles de millones de años). ¿Qué debería decirnos este enorme contraste entre ambas enseñanzas acerca de la importancia de atenernos a lo que enseña la Biblia y de cuán lejos puede llegar la humanidad cuando se aparta de la Palabra de Dios y de lo que esta enseña claramente?`
`3. ¿Qué herramientas bíblicas tienes disponibles para ayudarte a comprender mejor la Biblia? E incluso, si no tienes herramientas adicionales, ¿cómo puedes aprender a aplicar algunas de las lecciones aprendidas esta semana sobre cómo interpretar la Biblia?`
`4. A los israelitas se les dijo que les enseñen a sus hijos las grandes verdades que se les confiaban y que vez tras vez relatasen las historias de la conducción de Dios en su vida (Deut. 4:9). Más allá del beneficio obvio que implica transmitir la fe, ¿qué tienen la enseñanza y el relato de historias sobre la conducción de Dios en nuestra vida que tienden a aumentar nuestra fe? Es decir, ¿por qué el hecho de compartir la verdad bíblica con los demás también es beneficioso para nosotros?`
---
#### Comentarios Elena G.W
Exaltad a Jesús, “Cristo tomó sobre sí la naturaleza humana”, p. 68;
El conflicto de los siglos, “El lucero de la Reforma”, pp. 75-90. | 109.269231 | 781 | 0.774727 | spa_Latn | 0.999303 |
2653de54294782fcd2490c0d52a0b7f1ef34e62e | 2,921 | md | Markdown | azure/azure-docker.md | AmmarBandukwala/generic-cli-examples | 136d6235fd112cc84be747f76eefe7ed40a4532b | [
"MIT"
] | 1 | 2020-08-20T07:28:50.000Z | 2020-08-20T07:28:50.000Z | azure/azure-docker.md | AmmarBandukwala/generic-cli-examples | 136d6235fd112cc84be747f76eefe7ed40a4532b | [
"MIT"
] | null | null | null | azure/azure-docker.md | AmmarBandukwala/generic-cli-examples | 136d6235fd112cc84be747f76eefe7ed40a4532b | [
"MIT"
] | null | null | null | # Microsoft Azure Cloud Command Line - Snippets
## Docker Image Design Manifesto
1. ***Building Containers***
- Don’t trust arbitrary base images!
- Keep base images small
- Use the Builder Pattern
2. ***Container Internals***
- Use a non-root user inside the container
- Make the file system read-only
- One process per container
- Don’t restart on failure. Crash cleanly instead
- Log everything to stdout and stderr
3. ***Deployments***
- Use the “record” option for easier rollbacks
- Use plenty of descriptive labels
- Use sidecars for Proxies, watchers, etc.
- Don’t use sidecars for bootstrapping!
- Don’t use :latest or no tag
- Readiness and Liveness Probes are your friend
4. ***Services***
- Don’t use type: LoadBalancer
- Type: Nodeport can be “good enough”
- Use Static IPs they are free!
- Map External Services to Internal Ones
5. ***Application Architecture***
- Use Helm Charts
- All Downstream dependencies are unreliable
- Make sure your microservices aren’t too micro
- Use Namespaces to split up your cluster
- Role based Access Control
## Create App Service and Deploy Docker Image
```shell
az webapp create --resource-group {resourcegroupname} --plan {appserviceplanname} --name <app-name> --deployment-container-image-name <registry-name>.azurecr.io/appsvc-custom-image:latest
```
## Configure App Service to foward request to a specified port
```shell
az webapp config appsettings set --resource-group {resourcegroupname} --name <app-name> --settings WEBSITES_PORT=8080
```
## Login to Azure Container Registry (Service Principal via Az Login)
```shell
az acr login -n {azurecontainerregistryname}
```
## Docker commands to build/push/tag a container image
```shell
docker build -t $acc/$repo:$tag
docker push $acc/$repo:$tag
docker pull mcr.microsoft.com/dotnet/core/samples:aspnetapp
docker tag mcr.microsoft.com/dotnet/core/samples:aspnetapp {containerregistryendpoint}/aspnetapp:1.0
docker push {containerregistryendpoint}/aspnetapp:1.0
```
## Azure Kubernetes Service get credentials and load into local cache for kubectl
```shell
az aks get-credentials -g aks -n {aksname}
```
## Kubectl command line create reference to container registry, deploy the configuration via YAML, and view the status
```shell
kubectl get nodes
kubectl create secret docker-registry {azurecontainerregistryname} --docker-server={containerregistryendpoint} --docker-username={dockerusername} --docker-password={dockerpassword} --docker-email={email}
kubectl describe secret
kubectl create -f deployment.yml
# (Watch Service)
kubectl get service/aspnetapp -w
# (List Pods)
kubectl get pods -o wide
# (Evict Users)
kubectl drain <node>
# (Kill Pod)
kubectl delete node <node>
```
## Clear All Docker Images and Running Containers
```shell
docker rm -vf $(docker ps -a -q)
docker rmi -f $(docker images -a -q)
```
| 29.505051 | 203 | 0.737076 | eng_Latn | 0.75004 |
2654147795733daac6d98c5290641a8ce1676e2f | 72 | md | Markdown | README.md | jsnider3/Jelo | 0b9b6b1c67a44fbfa6816ca0a7e4bd55cd1e58b3 | [
"MIT"
] | 4 | 2018-06-12T05:00:12.000Z | 2020-05-15T19:23:14.000Z | README.md | jsnider3/Jelo | 0b9b6b1c67a44fbfa6816ca0a7e4bd55cd1e58b3 | [
"MIT"
] | 1 | 2020-08-09T14:20:55.000Z | 2020-08-09T14:20:55.000Z | README.md | jsnider3/Jelo | 0b9b6b1c67a44fbfa6816ca0a7e4bd55cd1e58b3 | [
"MIT"
] | 1 | 2019-07-25T04:11:40.000Z | 2019-07-25T04:11:40.000Z | # Jelo
Jelo is an open-source implementation of the Elo rating system.
| 18 | 63 | 0.777778 | eng_Latn | 0.993147 |
26545b48947f267658f6932a1473d1c4ec40f807 | 1,894 | md | Markdown | content/post/longest-substring-with-at-most-k-distinct-characters.md | lek-tin/hashnopolis-blog | 4caa0a6a0355d0672d75f0b73153bfda9a55eb1f | [
"MIT"
] | 1 | 2021-02-02T21:44:04.000Z | 2021-02-02T21:44:04.000Z | content/post/longest-substring-with-at-most-k-distinct-characters.md | lek-tin/hashnopolis-blog | 4caa0a6a0355d0672d75f0b73153bfda9a55eb1f | [
"MIT"
] | null | null | null | content/post/longest-substring-with-at-most-k-distinct-characters.md | lek-tin/hashnopolis-blog | 4caa0a6a0355d0672d75f0b73153bfda9a55eb1f | [
"MIT"
] | 1 | 2021-03-21T09:58:42.000Z | 2021-03-21T09:58:42.000Z | ---
title: "Longest Substring With at Most K Distinct Characters"
description: "Some description ..."
authors: ["lek-tin"]
tags: ["leetcode", "hashmap", "substring", "sliding-window"]
categories: ["algorithm"]
date: 2019-03-14T01:03:53-08:00
draft: false
archive: false
---
Given a string, find the length of the longest substring _T_ that contains at most _k_ distinct characters.
### Example 1
```
Input: s = "eceba", k = 2
Output: 3
Explanation: T is "ece" which its length is 3.
```
### Example 2
```
Input: s = "aa", k = 1
Output: 2
Explanation: T is "aa" which its length is 2.
```
### Solution
```java
class Solution {
public int lengthOfLongestSubstringKDistinct(String s, int k) {
Map<Character, Integer> countMap = new HashMap<>();
int left = 0;
int max = 0;
for(int i = 0; i < s.length(); i++) {
// character at the right pointer
char c = s.charAt(i);
countMap.put(c, countMap.getOrDefault(c, 0) + 1);
// make sure map size is valid, no need to check left pointer less than s.length()
// distinct count needs to be <= k
/*
* k = 3, left = 0
* {1, 1, 1, 3, 4, 5}
*/
while (countMap.size() > k) {
// Removes the leftmost character
char leftChar = s.charAt(left);
countMap.put(leftChar, countMap.get(leftChar) - 1);
// If leftChar is no longer one of the distinct numbers, remove it.
if (countMap.get(leftChar) == 0) {
countMap.remove(leftChar);
}
// Leftbound moves right by 1 step.
left++;
}
// Compare previous max = {1, 1, 1, 3, 4}.length and {3, 4, 5}.length
max = Math.max(max, i - left + 1);
}
return max;
}
}
```
| 30.548387 | 107 | 0.539071 | eng_Latn | 0.949854 |
26553caa5ccbbf52cd20c9a0dfa51be8fd94f649 | 197 | md | Markdown | Week 2/LAB4/smallproblem.md | john-ansell/linux-bootcamp | bc161d0101745f48ddf9d1ac4bc57d99f79554ee | [
"MIT"
] | null | null | null | Week 2/LAB4/smallproblem.md | john-ansell/linux-bootcamp | bc161d0101745f48ddf9d1ac4bc57d99f79554ee | [
"MIT"
] | null | null | null | Week 2/LAB4/smallproblem.md | john-ansell/linux-bootcamp | bc161d0101745f48ddf9d1ac4bc57d99f79554ee | [
"MIT"
] | null | null | null | I went through the turorial on ubuntu and apache... good stuff... but I did have a problem viewing the subdomain since i couldn't find the dns name in GCP... Everything worked ok other than that!
| 98.5 | 196 | 0.761421 | eng_Latn | 0.999658 |
2655ba2eee8a7c96394919e5b6add30dd93dccfa | 1,896 | md | Markdown | content/post/2017/2017-12-13-k8s-at-home.md | alecharp/blog | d7e6320e7650bc26e4fc5f8e56a24a86020d3bd9 | [
"MIT"
] | null | null | null | content/post/2017/2017-12-13-k8s-at-home.md | alecharp/blog | d7e6320e7650bc26e4fc5f8e56a24a86020d3bd9 | [
"MIT"
] | null | null | null | content/post/2017/2017-12-13-k8s-at-home.md | alecharp/blog | d7e6320e7650bc26e4fc5f8e56a24a86020d3bd9 | [
"MIT"
] | null | null | null | ---
title: "Kubernetes @ Home"
date: 2017-12-13
---
For many years now, I have a Jenkins running at home, for my personal projects.
The problem is that, it's running on my laptop.
So, I basically have a setup where Jenkins tells me "it works on [your] computer".
This is far from ideal.
## Following home's lab
There are 2 articles I read recently which gave me the motivation to have a proper setup at home.
Few days ago, the excellent Jessie Frazelle wrote a post about labs at home.
If you have read it, pause here and go read it: [Home Lab is the Dopest Lab](https://blog.jessfraz.com/post/home-lab-is-the-dopest-lab/).
Few weeks ago, Carolyn Van Slyck wrote a suite of articles on how she used NUCs to create her DVDs digitalization pipeline. Same thing, go read her articles, they are excellent: [My Little Cluster Story](http://carolynvanslyck.com/blog/2017/10/my-little-cluster/).
## What's the connection?
Well, my Jenkins is my lab.
I use it to build side project, to experiment.
But I don't have it running all the time.
For example, at the time I write this line, I haven't started it for 4weeks..
So I want to setup it correctly.
I could simply buy a NUC and install Jenkins on it.
That would be it.
But I'd like to leverage the NUC to run my side projects as well.
So Kubernetes.
## Why Kubernetes?
First, everyone's speaking about it.
Second, I might need to understand it, use it for my job.
Third, why not?
So, before buying anything, I'll setup a minikube instance on my laptop (again).
Make sure I can really use and understand K8S (as the cool kids say).
Then I'll setup properly a NUC with my Jenkins.
I'll see after that if I need more of them.
So yes, at first it'll be a one node k8s cluster..
I'll try to "document" my experiment here.
It won't be glorious (like if anyone read this..) but it might worth the time, so I make sure I do understand what I'm doing.
| 43.090909 | 264 | 0.748418 | eng_Latn | 0.998873 |
2656de9ce1ad559b923faa80e7a39d6929279261 | 959 | md | Markdown | jobshop-genetic-algorithm/README.md | jacobmeneses/GolangCodes | 16e1641bbb2c9dec264fc323febcb77f19735350 | [
"MIT"
] | null | null | null | jobshop-genetic-algorithm/README.md | jacobmeneses/GolangCodes | 16e1641bbb2c9dec264fc323febcb77f19735350 | [
"MIT"
] | null | null | null | jobshop-genetic-algorithm/README.md | jacobmeneses/GolangCodes | 16e1641bbb2c9dec264fc323febcb77f19735350 | [
"MIT"
] | null | null | null | # Job-Shop Scheduling Problem
An implementation of the clasical job-shop scheduling problem using Genetic algorithm.
# Explanation
The job-shop consists on the following:
We have N number of jobs that must be processed on M machines.
Each job is composed of M number of operations, each with a specific processing time and a number of machine in which have to be processed.
The objective of the problem is to find a sequence of operations for all jobs, such that total processing time of the scheduled operations is the minimal posible.
The operations of the problem have the following conditions:
- The operations have a predefined order of execution.
- Each job is processed on each machine exactly once.
- Only 1 operation per machine is allowed to be processed at any given time.
- Given that the operations of any job has a predefined order, 2 operations of the same job cannot be processed concurrently
The problem has NP-Complete complexity.
| 41.695652 | 162 | 0.798749 | eng_Latn | 0.999837 |
2656e88f6a47b3bae914ac87ce2aa72d1428cead | 3,537 | md | Markdown | _posts/2010-2-19-杭州聚会.md | backup53/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 18 | 2020-01-02T21:43:02.000Z | 2022-02-14T02:40:34.000Z | _posts/2010-2-19-杭州聚会.md | wzxwj/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 3 | 2020-01-01T16:53:59.000Z | 2020-01-05T10:14:11.000Z | _posts/2010-2-19-杭州聚会.md | wzxwj/1984bbs | 152406c37afab79176f0d094de5ac4cb0c780730 | [
"MIT"
] | 13 | 2020-01-20T14:27:39.000Z | 2021-08-16T02:13:21.000Z | ---
layout: default
date: 2010-2-19
title: 杭州聚会
categories: 罗马假日公寓
---
# 杭州聚会
diyzh
天赋人权@diyzh
1楼 大 中 小 发表于 2010-2-19 03:00 只看该作者
杭州聚会
需要参加的,短我你的联系方式,即时通信、邮箱、手机都行,我把日期地点发给你
[ 本帖最后由 diyzh 于 2010-3-9 20:52 编辑 ]
---
[Terminusbot](https://github.com/TerminusBot) 整理,讨论请前往 [2049bbs.xyz](http://2049bbs.xyz/)
---
袖手看热闹
2楼 大 中 小 发表于 2010-2-19 08:36 只看该作者
上海这边的同学可以去围观吗?
袖手看热闹
3楼 大 中 小 发表于 2010-2-19 08:37 只看该作者
大叔、大爷年纪的可以去蹭饭吗?
oxooooooof
4楼 大 中 小 发表于 2010-2-19 10:02 只看该作者
再过几天才回杭州不知道赶的上不
diyzh
天赋人权@diyzh
5楼 大 中 小 发表于 2010-2-19 16:29 只看该作者
引用:
> 原帖由 袖手看热闹 于 2010-2-19 08:36 发表
> 
> 上海这边的同学可以去围观吗?
非常欢迎
diyzh
天赋人权@diyzh
6楼 大 中 小 发表于 2010-2-19 16:33 只看该作者
引用:
> 原帖由 袖手看热闹 于 2010-2-19 08:37 发表
> 
> 大叔、大爷年纪的可以去蹭饭吗?
以前我们一直都是AA的,不过如果你确实有学识,可以免去你的费用,哈哈
diyzh
天赋人权@diyzh
7楼 大 中 小 发表于 2010-2-19 16:35 只看该作者
引用:
> 原帖由 oxooooooof 于 2010-2-19 10:02 发表
> 
> 再过几天才回杭州不知道赶的上不
我也还没回呢,到时DM你
天秤归座剑归位
8楼 大 中 小 发表于 2010-2-20 11:03 只看该作者
支持啊
青春的脚步
人生就像一只牙缸。你可以把它看成是洗具,也可以看成是杯具!~
9楼 大 中 小 发表于 2010-2-20 11:23 只看该作者
文二路那边有家Pub不错,蹭完饭建议去里面小坐。
萧易寒
10楼 大 中 小 发表于 2010-2-20 14:56 只看该作者
上海的,喜欢杭州的外婆家
cpf
11楼 大 中 小 发表于 2010-2-20 15:11 只看该作者
杭州的 能报名否?
diyzh
天赋人权@diyzh
12楼 大 中 小 发表于 2010-2-21 11:39 只看该作者
引用:
> 原帖由 cpf 于 2010-2-20 15:11 发表 
> 杭州的 能报名否?
可以,短我你的即时消息联系方式,QQ Gtalk Skype都行
温克坚
13楼 大 中 小 发表于 2010-2-22 14:05 只看该作者
报个名。
乐在心中
14楼 大 中 小 发表于 2010-2-25 17:24 只看该作者
报个名先。。。。
diyzh
天赋人权@diyzh
15楼 大 中 小 发表于 2010-2-26 14:01 只看该作者
回复 15楼 roc 的话题
还没到清明吧
素。
16楼 大 中 小 发表于 2010-2-26 18:07 只看该作者
绍兴柯桥的路过
青春的脚步
人生就像一只牙缸。你可以把它看成是洗具,也可以看成是杯具!~
17楼 大 中 小 发表于 2010-2-26 19:22 只看该作者
报名
快乐流浪汉
脑力劳动教养所指导员,五毛控 GFW爱好者 低俗控 业余翻墙 长期围观 资深群众 被代表 不明真相
18楼 大 中 小 发表于 2010-3-2 10:52 只看该作者
作为第一届杭州FB发起人,我报名。
麦麦
19楼 大 中 小 发表于 2010-3-2 22:32 只看该作者
大致活动时间?
diyzh
天赋人权@diyzh
20楼 大 中 小 发表于 2010-3-3 19:12 只看该作者
引用:
> 原帖由 麦麦 于 2010-3-2 22:32 发表 
> 大致活动时间?
见私信
[ 本帖最后由 diyzh 于 2010-3-4 11:57 编辑 ]
青春的脚步
人生就像一只牙缸。你可以把它看成是洗具,也可以看成是杯具!~
21楼 大 中 小 发表于 2010-3-4 11:04 只看该作者
作为第一届杭州FB的围观群众,紧跟19楼的脚步去流浪~
左岸←右岸
把你的子宫钉到我的墙上,这样我便会记得你。我们必须走了。明天,明天…
22楼 大 中 小 发表于 2010-3-5 14:40 只看该作者
还可报名??
马克杯
84女女 Twitter:@zhifan
23楼 大 中 小 发表于 2010-3-8 18:58 只看该作者
疯了疯了 我困死了 走路都像踩棉花,不过聚会很high啊~ 好开心~
青春的脚步
人生就像一只牙缸。你可以把它看成是洗具,也可以看成是杯具!~
24楼 大 中 小 发表于 2010-3-8 19:35 只看该作者
YE!~安全到家!这次聚会很成功见到很多达人。很high~ 湿了~
diyzh
天赋人权@diyzh
25楼 大 中 小 发表于 2010-3-9 20:52 只看该作者
引用:
> 原帖由 青春的脚步 于 2010-3-8 19:35 发表
> 
> YE!~安全到家!这次聚会很成功见到很多达人。很high~ 湿了~
蛋定蛋定
| 4.581606 | 89 | 0.541702 | yue_Hant | 0.998895 |
265793e0825cfdd26ad5bc3371e7d7e7acebeae9 | 52 | md | Markdown | README.md | huangxin813/KotlinDemo | f7217e0095220bce063aa8efcb33991287e31e77 | [
"Apache-2.0"
] | null | null | null | README.md | huangxin813/KotlinDemo | f7217e0095220bce063aa8efcb33991287e31e77 | [
"Apache-2.0"
] | null | null | null | README.md | huangxin813/KotlinDemo | f7217e0095220bce063aa8efcb33991287e31e77 | [
"Apache-2.0"
] | null | null | null | # KotlinDemo
a simple demo for leaning kotlin usage
| 17.333333 | 38 | 0.807692 | epo_Latn | 0.402798 |
2657a8b07eea148e7a903254a267a6f1bc117a87 | 3,006 | md | Markdown | README.md | mrkamel/tempfile_for | 93c9634b1a236c54706eda8626acd98b60f81b15 | [
"MIT"
] | 4 | 2015-04-29T02:54:00.000Z | 2016-09-18T03:12:43.000Z | README.md | mrkamel/tempfile_for | 93c9634b1a236c54706eda8626acd98b60f81b15 | [
"MIT"
] | null | null | null | README.md | mrkamel/tempfile_for | 93c9634b1a236c54706eda8626acd98b60f81b15 | [
"MIT"
] | null | null | null |
[](http://travis-ci.org/mrkamel/tempfile_for)
[](https://codeclimate.com/github/mrkamel/tempfile_for)
[](https://gemnasium.com/mrkamel/tempfile_for)
# TempfileFor
Easily create temporary files for in-memory data, modify the file, and get the
full content returned.
## Installation
Add this line to your application's Gemfile:
```
gem "tempfile_for"
```
And then execute:
```
$ bundle
```
Alternatively, you can of course install it without using bundler via:
```
$ gem install tempfile_for
```
## Usage
### for
TempfileFor is a very tiny gem, but it can save you lines of ugly code.
To get a quick introduction into what TempfileFor does, check this out:
```ruby
Tempfile.for("string1") { |tempfile| `echo -n ', string2' >> #{tempfile.path}` }
# => "string1, string2"
```
Say you have some in-memory data, like an image fetched from a URL, and
you want to modify parts of it, e.g. add or remove IPTC tags, or
scale it. Often, the gems used to modify images or other media files
require a path to a file in order to modify it.
This is easy thanks to TempfileFor:
```ruby
data = RestClient.get("http://example.com/image.jpg")
image = Tempfile.for(data) do |tempfile|
  # Modify the tempfile directly or via tempfile.path and get the modified content returned.
  # Tempfile takes care of flushing the file's modifications, rewinding, etc.
end
```
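For illustration, the behaviour can be sketched with only Ruby's standard library. This is a hypothetical re-implementation, not the gem's actual source, and the `tempfile_for` method name is mine, not part of the gem's API:

```ruby
require "tempfile"

# Sketch of what Tempfile.for does under the hood: write the data to a
# tempfile, yield it for modification, then read the (possibly modified)
# content back, preserving the original encoding.
def tempfile_for(data, suffix: "")
  Tempfile.create(["tempfile_for", suffix]) do |tempfile|
    tempfile.binmode
    tempfile.write(data)
    tempfile.flush # make the data visible at tempfile.path
    yield tempfile
    tempfile.rewind # re-read the whole file after modification
    tempfile.read.force_encoding(data.encoding)
  end
end

result = tempfile_for("string1") do |tempfile|
  File.open(tempfile.path, "a") { |f| f.write(", string2") }
end
puts result # => "string1, string2"
```

`Tempfile.create` with a block deletes the file afterwards, which is also why the content has to be read back before the block returns.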
### blank
If you want to use TempfileFor without initial data, simply use:
```ruby
Tempfile.blank { |tempfile| `echo -n data >> #{tempfile.path}` }
```
### Tempfile object
In case you need the tempfile object itself, because, for example, you don't
want to load it into memory, add `:read => false` to either `Tempfile#for` or
`Tempfile#blank`:
```ruby
Tempfile.for("string", :read => false) { |tempfile| ... } # => #<File:/tmp/tempfile...>
Tempfile.blank(:read => false) { |tempfile| ... } # => #<File:/tmp/tempfile...>
```
## Encoding
TempfileFor preserves the encoding of the supplied data. Thus, the following
code
```ruby
Tempfile.for("string1".encode(Encoding::ISO_8859_1)) { ... }
```
will return a string encoded as ISO-8859-1. If you use the blank method, you
can supply the desired encoding via
```ruby
Tempfile.blank(:encoding => Encoding::BINARY) { ... }
```
This will return a string encoded as binary.
## Suffix
You can pass a `:suffix => "..."` option to all methods, such that
```ruby
Tempfile.for("data", :suffix => ".jpg")
Tempfile.blank(:suffix => ".jpg") { ... }
```
will create tempfiles having a `.jpg` suffix.
## Contributing
1. Fork it
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Added some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create new Pull Request
| 25.913793 | 129 | 0.708583 | eng_Latn | 0.864108 |
2657da1207d306084b5f700b0ec682fdc192e2fb | 1,599 | md | Markdown | blog/pikachu.md | Facodeur/blog | 4e2637645403e848992cb65be1f32bb43864fdbe | [
"RSA-MD"
] | null | null | null | blog/pikachu.md | Facodeur/blog | 4e2637645403e848992cb65be1f32bb43864fdbe | [
"RSA-MD"
] | 4 | 2021-09-02T11:11:40.000Z | 2022-02-19T06:24:13.000Z | blog/pikachu.md | sandix34/Blog | f7b61e6998046da07f3af178c223fd8db1801ff4 | [
"MIT"
] | null | null | null | ---
slug: pikachu
date: '2019-02-02'
title: Pikachu top
---
# Pikachu
**Pikachu** (Japanese: ピカチュウ Hepburn: Pikachū, pronounced [pikatɕɯː], English: /ˈpiːkətʃuː/) are a species of Pokémon, fictional creatures that appear in an assortment of video games, animated television shows and movies, trading card games, and comic books licensed by The Pokémon Company, a Japanese corporation. **They are yellow rodent-like creatures with powerful electrical abilities.**

In most vocalized appearances, including the anime and certain video games, they are primarily voiced by _Ikue Ōtani_.
Pikachu will make his first live-action appearance as Detective Pikachu, as part of the main cast, in the upcoming film Pokémon: Detective Pikachu, rendered in CGI and voiced by Ryan Reynolds.
The Pikachu design was conceived by _Atsuko Nishida_ and finalized by Ken Sugimori. Pikachu first appeared in Pokémon Red and Green in Japan, and later in the first internationally released Pokémon video games, Pokémon Red and Blue, for the original Game Boy.
Like other species of Pokémon, **Pikachu are often captured and groomed by humans to fight other Pokémon for sport**. Pikachu are one of the most well-known varieties of Pokémon, largely because Pikachu is a central character in the Pokémon anime series.
**Pikachu is regarded as a major character of the Pokémon franchise as well as its mascot**, and has become an icon of Japanese pop culture in recent years. It is also seen as one of the major mascots for Nintendo...
| 72.681818 | 392 | 0.791745 | eng_Latn | 0.999301 |
26582ed84b9f1ab422909b21b1f24a01c2a6ec62 | 1,572 | md | Markdown | research/index.md | SupernovaTitanium/Blog | 37811e8e991486a6981f567bc384e2e8bf6a14ac | [
"MIT"
] | null | null | null | research/index.md | SupernovaTitanium/Blog | 37811e8e991486a6981f567bc384e2e8bf6a14ac | [
"MIT"
] | 2 | 2021-09-27T21:32:58.000Z | 2022-02-26T03:57:41.000Z | research/index.md | SupernovaTitanium/website | 37811e8e991486a6981f567bc384e2e8bf6a14ac | [
"MIT"
] | null | null | null | ---
layout: research
title: Research
tags: [research]
date: 2018-5-31
---
## 2017
* <span style="color:navy,font-size:30px">Latent Feature Lasso</span> [pdf](http://www.cs.cmu.edu/~eyan/publication/LatentFeatureLasso.pdf){: .btn .btn-info target="_blank"}
[slide](http://www.cs.cmu.edu/~eyan/publication/LFLassoSlide.pdf){: .btn .btn-info target="_blank"}
[poster](http://www.cs.cmu.edu/~eyan/publication/LFLassoPoster.pdf){: .btn .btn-info target="_blank"}
[talk](https://www.dropbox.com/s/ulz420uvym4huac/LFM_ICML_Tak.mp4){: .btn .btn-info target="_blank"}
[code](http://www.cs.cmu.edu/~eyan/code/ConvexBMF.zip){: .btn .btn-info target="_blank"}
Ian E.H. Yen, <u>Wei-Cheng Lee</u>, Sung-En Chang, Arun S. Suggala, Shou-De Lin and Pradeep Ravikumar.
<I>In International Conference on Machine Learning <span style="color:navy">(ICML)</span>, 2017.</I>
## 2018
* <span style="color:navy,font-size:30px">MixLasso: Generalized Mixed Regression via Convex Atomic-Norm Regularization.</span> [pdf](https://papers.nips.cc/paper/8284-mixlasso-generalized-mixed-regression-via-convex-atomic-norm-regularization){: .btn .btn-info target="_blank"}
[poster]({{site.url}}/assets/download/MRLPoster_NIPS.pdf){: .btn .btn-info target="_blank"}
Ian E.H. Yen, <u>Wei-Cheng Lee</u>, Kai Zhong, Sung-En Chang, Pradeep Ravikumar and Shou-De Lin.
<I>In Advances in Neural Information Processing Systems <span style="color:navy">(NIPS)</span>, 2018.</I>
| 60.461538 | 307 | 0.690204 | yue_Hant | 0.262001 |
26589343358de23c7515ae070f5d4d8001216aa6 | 2,633 | md | Markdown | cloudfunctions/ty-service/node_modules/@cloudbase/node-sdk/docs/database/aggregate/stages/bucket.md | qq81860186/tuya-weapp-demo | 4101e9203200a9c5a81931aceb25084ab67626ec | [
"MIT"
] | 137 | 2018-08-11T14:42:03.000Z | 2022-03-11T11:56:50.000Z | cloudfunctions/ty-service/node_modules/@cloudbase/node-sdk/docs/database/aggregate/stages/bucket.md | qq81860186/tuya-weapp-demo | 4101e9203200a9c5a81931aceb25084ab67626ec | [
"MIT"
] | 19 | 2019-09-06T09:58:56.000Z | 2022-03-19T07:35:02.000Z | cloudfunctions/ty-service/node_modules/@cloudbase/node-sdk/docs/database/aggregate/stages/bucket.md | qq81860186/tuya-weapp-demo | 4101e9203200a9c5a81931aceb25084ab67626ec | [
"MIT"
] | 27 | 2019-08-12T03:44:51.000Z | 2021-12-27T12:37:49.000Z | # Aggregate.bucket
### 1. Interface description
Function: an aggregation stage. It divides the input records into different groups according to the given conditions and boundaries; each group is a `bucket`.
Declaration:
`bucket({ groupBy: <expression>, boundaries: [<lowerbound1>, <lowerbound2>, ...], default: <literal>, output: { <output1>: <accumulator expr>, ... <outputN>: <accumulator expr> } })`
Notes:
> Each group is output as a separate record, containing an `_id` field whose value is the lower bound of the group and a `count` field whose value is the number of records in the group. `count` is output by default when `output` is not specified.
> A `bucket` is only output when it contains at least one record.
> Using `bucket` requires at least one of the following conditions to be met, otherwise an error is thrown:
- the value obtained by applying the `groupBy` expression to each input record must be a value within `boundaries`
- a `default` value is specified that lies outside `boundaries`, or whose type differs from that of the `boundaries` elements.
### 2. Input parameters
| Parameter | Type | Required | Description |
| ---------- | ------------------------------ | ---- | ------------ |
| groupBy | [expression](../expression.md) | yes | detailed below |
| boundaries | Array | yes | detailed below |
| default | any | no | detailed below |
| output | Object | no | detailed below |
> `groupBy` is an expression that determines the grouping and is applied to each input record. You can use the `$` prefix followed by the path of the field to group by as the expression. Unless a default value is specified via `default`, every record must contain the specified field, and the field value must be within the range specified by `boundaries`.
> `boundaries` is an array in which each element is the lower bound of a group. At least two boundary values must be specified. The array values must be of the same type and in increasing order.
> `default` is optional. When specified, all records that do not fall into any group go into a default group whose `_id` is the value of `default`. The value of `default` must be less than the minimum value in `boundaries`, or greater than or equal to the maximum value. It may be of a different type than the `boundaries` elements.
> `output` is optional and determines which fields, besides `_id`, the output records contain; the value of each field must be specified with an accumulator expression. When `output` is specified, the default `count` is no longer output automatically and must be specified manually:
```
output: {
count: $.sum(1),
...
<outputN>: <accumulator expr>
}
```
### 3. Return value
| Parameter | Type | Required | Description |
| ---- | ---------------------------- | ---- | -------- |
| - | [Aggregate](../aggregate.md) | yes | the aggregation object |
### 4. Sample code
Assume the collection `items` contains the following records:
```
{
_id: "1",
price: 10
}
{
_id: "2",
price: 50
}
{
_id: "3",
price: 20
}
{
_id: "4",
price: 80
}
{
_id: "5",
price: 200
}
```
Group the above records, putting [0, 50) in one group, [50, 100) in another, and everything else in a third:
```js
const tcb = require('@cloudbase/node-sdk')
const app = tcb.init({
env: 'xxx'
})
const db = app.database()
const $ = db.command.aggregate
exports.main = async (event, context) => {
const res = await db
.collection('items')
.aggregate()
.bucket({
groupBy: '$price',
boundaries: [0, 50, 100],
default: 'other',
output: {
count: $.sum(1),
ids: $.push('$_id')
}
})
.end()
console.log(res.data)
}
```
The result is as follows:
```
[
{
"_id": 0,
"count": 2,
"ids": [
"1",
"3"
]
},
{
"_id": 50,
"count": 2,
"ids": [
"2",
"4"
]
},
{
"_id": "other",
"count": 1,
"ids": [
"5"
]
}
]
```
| 19.07971 | 182 | 0.515002 | yue_Hant | 0.290653 |
2659034025c948a0a3d52d3da546089cd31aad84 | 23 | md | Markdown | README.md | nafiul-araf/Olymics-2021-Analysis | 74a0b074d24f852628b8ecd5b604d5d92ae80a03 | [
"Apache-2.0"
] | 2 | 2021-08-29T16:32:25.000Z | 2022-02-10T16:12:59.000Z | README.md | nafiul-araf/Olymics-2021-Analysis | 74a0b074d24f852628b8ecd5b604d5d92ae80a03 | [
"Apache-2.0"
] | null | null | null | README.md | nafiul-araf/Olymics-2021-Analysis | 74a0b074d24f852628b8ecd5b604d5d92ae80a03 | [
"Apache-2.0"
] | null | null | null | # Olymics-2021-Analysis | 23 | 23 | 0.826087 | kor_Hang | 0.769159 |
2659c1579d2ea5e964fdd1210edf5960044da38e | 6,215 | md | Markdown | _hibahcme/356.md | hixio-mh/website-4 | 943b8ecdb1f7e507abbb404051afbd40d0a855d3 | [
"MIT"
] | 4 | 2018-03-23T08:55:53.000Z | 2018-03-24T15:10:22.000Z | _hibahcme/356.md | hixio-mh/website-4 | 943b8ecdb1f7e507abbb404051afbd40d0a855d3 | [
"MIT"
] | 3 | 2021-12-20T17:56:32.000Z | 2021-12-20T17:59:59.000Z | _hibahcme/356.md | hixio-mh/website-4 | 943b8ecdb1f7e507abbb404051afbd40d0a855d3 | [
"MIT"
] | 5 | 2020-01-01T09:54:05.000Z | 2021-11-23T15:49:11.000Z | ---
nomor: 356
nama: Amida Yusriana
foto:
filename: Foto Amida terbaru.jpg
type: image/jpeg
size: 2545774
url: >-
http://5c4cf848f6454dc02ec8-c49fe7e7355d384845270f4a7a0a7aa1.r53.cf2.rackcdn.com/cb15206f-f75c-4f77-ab32-0d912b0f12a0/Foto%20Amida%20terbaru.jpg
seni: penelitian
pengalaman: 4 tahun
website: >-
https://scholar.google.co.id/citations?user=R4zJJpMAAAAJ&hl=en&oi=ao
;https://www.youtube.com/watch?v=s22NNCvYKoI; berbakti.id
sosmed: '@amidayusriana ; @chat_ter_box'
file:
filename: Menghidupkan Kembali Era 90an Dengan Media Sosial.pdf
type: application/pdf
size: 1016033
url: >-
http://5c4cf848f6454dc02ec8-c49fe7e7355d384845270f4a7a0a7aa1.r53.cf2.rackcdn.com/7ff630a2-27fc-42d1-bb3a-67fe0c660978/Menghidupkan%20Kembali%20Era%2090an%20Dengan%20Media%20Sosial.pdf
proyek: Aku Janda
lokasi: Jawa Tengah (Kota Semarang)
lokasi_id: Q3557
deskripsi: "The Aku Janda project is a project to compile a book and run a roadshow to promote it. The Aku Janda book is a biography collection containing the inspiring stories of 7 women who, for various reasons, decided to divorce and then started a new chapter of life as single parents. The stories told in these 7 accounts will differ from one another and each will have its own character. They are based on the true stories of the contributing women who hold the status of 'janda' (widow/divorcee). The goal of this book project is to remove the negative stigma attached to that status.\r\nThe book will be an anthology involving 7 female students who are dedicated to writing, led by 2 project heads (myself and a friend). Each student will interact with a divorced woman and rewrite her story into an inspiring biography. The students' involvement in this project follows from the project heads' profession as lecturers and is meant to provide a creative space for students who love writing.\r\nThe book will be published in 100 copies (self-published). Some copies will be distributed to institutions with an interest in advancing women. Promotion activities will also be held, such as a public book review, a campus book review, and talk shows on local TV and radio. The book will be handed out at every promotion and book-review event. \r\n"
kategori: riset_kajian_kuratorial
latar_belakang: "The Aku Janda book project started from the experience of the applicant's sister, who was going through the transition towards divorce. During the process, this woman came under a lot of pressure because of her decision to divorce. Many people knew that the reason for the divorce was domestic violence, yet many still blamed her and called her selfish. This contradicts the domestic-violence awareness campaigns that the government has long promoted so that society becomes aware and responsive. In reality, when it happens, women still have to bear a bad stigma. Moreover, much of the existing mainstream media still corners women in various respects, including when a woman becomes a 'janda'. \r\nThe applicant works in gender studies, and many of the applicant's works discuss women's issues in society. The applicant also actively writes popular and academic pieces. Drawing on that experience, the applicant and the team developed the idea for an anthology of biographies titled 'Aku Janda'. The title was chosen to firmly counter the negative stigma. A provocative title is deliberately used to draw the general public's attention. The book-review and outreach format is planned around the applicant's closeness to academia, which favours seminars as a way of broadening the audience's understanding.\r\n"
masalah: "According to data from Statistics Indonesia (Badan Pusat Statistik), the number of divorced couples up to 2015 (the latest year available) reached 347,256. That figure is the highest in the last 5 years. It means there are 347,256 'janda' and 'duda' (divorced women and men) in Indonesia. Unlike the word 'duda', which is often associated with being 'cool', the term 'janda' is mostly associated with negative things in Indonesian society. Expressions such as 'Janda Kembang', 'Janda Anak Satu' and 'Aku Janda Rasa Perawan' make people frown. A 'janda' is often seen as a threat to her surroundings: a snatcher of other people's husbands, a temptress, a woman who may be flirted with, and a woman who failed to keep her marriage together.\r\nThe negative stigma attached to the word 'janda' is deeply rooted and immortalised in various popular works in Indonesian media. This project therefore seeks to change that negative stigma by presenting inspiring biographical stories of 'janda'. The work is expected to communicate clearly that divorce can be the best decision a woman takes in a difficult situation, the effort to rebuild life in a new chapter, and life as a single parent. Beyond the storytelling itself, this attempt to change attitudes will also be conveyed through the material presented at the book reviews and outreach events.\r\n"
durasi: 9 bulan
sukses: >-
  This project attempts to change attitudes within society. According to
  Consumer Behavior, people respond to something on three levels:
  awareness, attitude and action. The Aku Janda book and its promotion
  aim at the attitude level, that is, the agree/disagree and
  right/wrong judgements within a person. The success indicator of this
  project is therefore when participants/readers take the position that
  the word 'janda' must not always be associated with something negative.
  The process starts with recruiting 7 students, brainstorming meetings
  for pre-production (2x), production (5x) and post-production (2x),
  printing 100 copies of the book, and promoting the book through 1
  public book-review event, 2 academic book-review events (UNDIP &
  UDINUS campuses), media promotion on local radio and television (TVKu),
  and 1 book review with a women's solidarity community.
dana: '61'
submission_id: 5a96757253f63e7bc7a12376
---
| 132.234043 | 1,444 | 0.828962 | ind_Latn | 0.975057 |
265a538d62bde97102b30643089765b6a74257e7 | 273 | md | Markdown | README.md | happytm/ESPCoreRules | 625e99d6b864843cd63c17955fc66d470b26ddc0 | [
"MIT"
] | 31 | 2019-12-18T18:42:04.000Z | 2022-02-22T18:45:23.000Z | README.md | happytm/ESPCoreRules | 625e99d6b864843cd63c17955fc66d470b26ddc0 | [
"MIT"
] | 2 | 2021-04-10T19:23:40.000Z | 2022-02-16T02:56:05.000Z | README.md | happytm/ESPCoreRules | 625e99d6b864843cd63c17955fc66d470b26ddc0 | [
"MIT"
] | 7 | 2019-12-20T16:42:26.000Z | 2022-02-01T19:51:57.000Z | # ESPCoreRules
Core Edition, based on ESPEasy stable with some mega patches.
Only rule engine, bare webgui and some plugins.
As of build 5, the ESPNOW protocol is supported for battery-operated devices
Documentation:
https://github.com/SmartNodeRules/Documentation/wiki
| 27.3 | 76 | 0.809524 | eng_Latn | 0.906754 |
265aee1d91192c4fcffdf19f3d68bcc0f7ca5cd3 | 42,360 | md | Markdown | documentation/docs/15-changelog.md | Sonia-corporation/stale | 071f0152925eb0c2b18b63e95a718d16c03dede1 | [
"MIT"
] | 14 | 2021-10-21T21:44:44.000Z | 2022-03-06T16:30:13.000Z | documentation/docs/15-changelog.md | Sonia-corporation/stale | 071f0152925eb0c2b18b63e95a718d16c03dede1 | [
"MIT"
] | 589 | 2021-10-21T22:36:05.000Z | 2022-03-31T19:19:17.000Z | documentation/docs/15-changelog.md | Sonia-corporation/stale | 071f0152925eb0c2b18b63e95a718d16c03dede1 | [
"MIT"
] | null | null | null | ---
id: changelog
title: Changelog
description: |
  The list of all the new features, documentation improvements, performance improvements and bug fixes.
tags:
- Changelog
---
# Changelog
# [1.57.0](https://github.com/Sonia-corporation/stale/compare/1.56.0...1.57.0) (2022-04-18)
### :rocket: Features {#rocket-features}
- **inputs:** add new inputs to only process items with specific assignees ([2fc8bda](https://github.com/Sonia-corporation/stale/commit/2fc8bdaa2bf5d2031c2f9ca111b44322f67f0e4d)), closes [#559](https://github.com/Sonia-corporation/stale/issues/559)
Add the input `issue-only-any-assignees`.
Add the input `pull-request-only-any-assignees`.
You can use them to only process issues and PRs that have one of the specified assignees.
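A minimal workflow sketch using these two inputs (the schedule, the version ref and the assignee names below are assumptions for illustration, not part of the release note):

```yml
name: Stale
on:
  schedule:
    - cron: '0 12 * * *'
jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: Sonia-corporation/stale@latest
        with:
          # Hypothetical assignee names; replace with your own.
          issue-only-any-assignees: |
            C0ZEN
          pull-request-only-any-assignees: |
            C0ZEN
```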
# [1.56.0](https://github.com/Sonia-corporation/stale/compare/1.55.0...1.56.0) (2022-04-17)
### :bug: Bug Fixes {#bug-bug-fixes}
- **inputs:** allow to override the default values of the ignore any milestones inputs ([ca9de12](https://github.com/Sonia-corporation/stale/commit/ca9de1203373e65de5ec7c73e4897675c93c2b5a))
The configuration was missing from the action.yml.
### :rocket: Features {#rocket-features-1}
- **prs:** add new input to ignore all milestones ([a492a64](https://github.com/Sonia-corporation/stale/commit/a492a6464f02131744cd7be3f51f2de5d3121ef3)), closes [#520](https://github.com/Sonia-corporation/stale/issues/520)
Add the `pull-request-ignore-all-milestones` input.
If a pull request has a milestone, it will be ignored from the processing.
# [1.55.0](https://github.com/Sonia-corporation/stale/compare/1.54.0...1.55.0) (2022-04-13)
### :rocket: Features {#rocket-features-2}
- **issues:** add new input to ignore all milestones ([29d8724](https://github.com/Sonia-corporation/stale/commit/29d8724e65f314d4d0456af4f82fe9fd57231c5e)), closes [#519](https://github.com/Sonia-corporation/stale/issues/519)
Add the `issue-ignore-all-milestones` input.
If an issue has a milestone, it will be ignored from the processing.
# [1.54.0](https://github.com/Sonia-corporation/stale/compare/1.53.0...1.54.0) (2022-04-03)
### :rocket: Features {#rocket-features-3}
- **prs:** add new input to ignore any milestones ([769448b](https://github.com/Sonia-corporation/stale/commit/769448b34ea68d2df354ac3ad08edd66b0b44e13)), closes [#518](https://github.com/Sonia-corporation/stale/issues/518)
Add the `pull-request-ignore-any-milestones` input.
# [1.53.0](https://github.com/Sonia-corporation/stale/compare/1.52.0...1.53.0) (2022-04-02)
### :rocket: Features {#rocket-features-4}
- **issues:** add new input to ignore any milestones ([8128bd1](https://github.com/Sonia-corporation/stale/commit/8128bd1ea29e7ebba24e01ad09fbfe6724f330d2)), closes [#517](https://github.com/Sonia-corporation/stale/issues/517)
Add the `issue-ignore-any-milestones` input.
# [1.52.0](https://github.com/Sonia-corporation/stale/compare/1.51.0...1.52.0) (2022-03-03)
### :rocket: Features {#rocket-features-5}
- **input:** add new inputs to ignore any project cards ([#551](https://github.com/Sonia-corporation/stale/issues/551)) ([ba79907](https://github.com/Sonia-corporation/stale/commit/ba799071e7eff6cbe4d99cb2e30a2ef177d4bb93))
Add the `issue-ignore-any-project-cards` input.
Add the `pull-request-ignore-any-project-cards` input.
In addition to the existing input that ignores items when a project card is present, you can now choose a white-list instead.
Any processed item that belongs to one of those project cards will be ignored.
Closes #165
# [1.51.0](https://github.com/Sonia-corporation/stale/compare/1.50.0...1.51.0) (2022-02-20)
### :rocket: Features {#rocket-features-6}
- **milestone:** add new input `pull-request-only-any-milestones` ([6b9aa5a](https://github.com/Sonia-corporation/stale/commit/6b9aa5ad518326929f1531269deabbff2fd7f543)), closes [#516](https://github.com/Sonia-corporation/stale/issues/516)
This new input will let you process only the pull requests that contain one of the specified milestones.
# [1.50.0](https://github.com/Sonia-corporation/stale/compare/1.49.0...1.50.0) (2022-02-20)
### :rocket: Features {#rocket-features-7}
- **milestone:** add new input `issue-only-any-milestones` ([8d3fffd](https://github.com/Sonia-corporation/stale/commit/8d3fffdc039d567043eba50fe027335046e02bbf)), closes [#310](https://github.com/Sonia-corporation/stale/issues/310)
This new input will let you process only the issues that contain one of the specified milestones.
# [1.49.0](https://github.com/Sonia-corporation/stale/compare/1.48.0...1.49.0) (2022-02-19)
### :books: Documentation {#books-documentation}
- **troubleshooting:** add a new page to help troubleshooting ([c67dc84](https://github.com/Sonia-corporation/stale/commit/c67dc843e1985a929bc3053acddc5e596f1b29e4))
Including information about GitHub token missing permissions.
- **api-count:** add info block to explain that a negative count disable the inputs ([ac54236](https://github.com/Sonia-corporation/stale/commit/ac542362e0bd45b33b7568527e6cde9a2c23f7c3))
Using -X for the API queries and mutations count will just disable the feature.
- **github-token:** add information about the required permissions ([63fb15b](https://github.com/Sonia-corporation/stale/commit/63fb15bbcdbd35bc50c99ecd3756bc6f8a294443))
Also explain how the token is used under the hood.
- **troubleshooting:** add more tips ([975f2d7](https://github.com/Sonia-corporation/stale/commit/975f2d7101441a862ea2acb5608cb348253ef205))
- **cache:** add the "Cache" tag for all docs pages related to cache topics ([7b401b7](https://github.com/Sonia-corporation/stale/commit/7b401b7cff0f2142187bb9e89310cca5e9bcf08a))
- **fix:** fix 2 typos in the 1.48.0 changelog ([4145e7b](https://github.com/Sonia-corporation/stale/commit/4145e7b464d47268c483523ef92bd1d907b65a90))
# [1.48.0](https://github.com/Sonia-corporation/stale/compare/1.47.0...1.48.0) (2022-02-19)
### :books: Documentation {#books-documentation-1}
- **cache:** mention in the docs where and why there is some cache ([9ce0445](https://github.com/Sonia-corporation/stale/commit/9ce0445fd2573fa9a720fa3fa8f08108f62286d2))
### :bug: Bug Fixes {#bug-bug-fixes-1}
- **logs:** avoid showing a success log on error while fetching a label ([26c07bf](https://github.com/Sonia-corporation/stale/commit/26c07bf1b29236d5b00bf9776dfb0b725e5c10bd)), closes [#507](https://github.com/Sonia-corporation/stale/issues/507)
When a label could not be found, a log saying the label was loaded was displayed.
It was misleading, and now it's no longer shown on error.
### :zap: Performance Improvements {#zap-performance-improvements}
- **cache:** cache the fetching of labels to reduce the amount of API calls ([9b78d0d](https://github.com/Sonia-corporation/stale/commit/9b78d0dc38774502d56e2fb9f43f724d1cfc6278)), closes [#270](https://github.com/Sonia-corporation/stale/issues/270)
Every request to fetch a single label, like the stale one, will be cached for the whole workflow.
This may drastically reduce the number of API calls made to GitHub.
The rate limits tied to your GitHub token will benefit from this.
# [1.47.0](https://github.com/Sonia-corporation/stale/compare/1.46.0...1.47.0) (2022-02-15)
### :books: Documentation {#books-documentation-2}
- **workflow-testing:** add a section to explain how to enable the actions step debug ([c81331d](https://github.com/Sonia-corporation/stale/commit/c81331d8d27f69f1377f425dbf030ed18993b6f0))
### :rocket: Features {#rocket-features-8}
- **logs:** add more logs to debug the project card features ([ec9d2ce](https://github.com/Sonia-corporation/stale/commit/ec9d2ceb08b4f248431cddd4ba1aecf37d83a0e9)), closes [#498](https://github.com/Sonia-corporation/stale/issues/498)
- **logs:** log the item data before processing (debug) ([2c73a90](https://github.com/Sonia-corporation/stale/commit/2c73a907e990c5aab4ffc6bf5d8bacf0411f0c4c))
This will expose the content fetched from GitHub, which can be helpful to debug.
# [1.46.0](https://github.com/Sonia-corporation/stale/compare/1.45.0...1.46.0) (2022-02-13)
### :bug: Bug Fixes {#bug-bug-fixes-2}
- **logs:** hide the sub-statistics when the count is 0 ([13e2bb1](https://github.com/Sonia-corporation/stale/commit/13e2bb1608d6e3ea014ad079a8a3193cfdc7438d)), closes [#477](https://github.com/Sonia-corporation/stale/issues/477)
It will simply avoid polluting the logs with empty counts.
### :rocket: Features {#rocket-features-9}
- **annotations:** add references to the root cause of the error annotations ([6124e3e](https://github.com/Sonia-corporation/stale/commit/6124e3e713dfc0896bce2a11bcc6cd9b03f8a005)), closes [#457](https://github.com/Sonia-corporation/stale/issues/457)
See AnnotationProperties in https://github.com/actions/toolkit/tree/main/packages/core.
- **annotations:** add references to the root cause of the warning annotations ([4cc9b0c](https://github.com/Sonia-corporation/stale/commit/4cc9b0c55d9ab2cc5ce012985511fbf7bd8bfed8)), closes [#484](https://github.com/Sonia-corporation/stale/issues/484)
See AnnotationProperties in https://github.com/actions/toolkit/tree/main/packages/core.
# [1.45.0](https://github.com/Sonia-corporation/stale/compare/1.44.0...1.45.0) (2022-02-13)
### :books: Documentation {#books-documentation-3}
- **footer:** add a link on the footer to get some help ([4619a5a](https://github.com/Sonia-corporation/stale/commit/4619a5ac96db02797f8cdf641153d887a67c91e2)), closes [#469](https://github.com/Sonia-corporation/stale/issues/469)
- **badge:** add a new page to show our badge ([93e8f80](https://github.com/Sonia-corporation/stale/commit/93e8f80d608f15733baf90e0d79f36bd180ba337)), closes [#456](https://github.com/Sonia-corporation/stale/issues/456)
- **blog:** add new entry to explain the change of UI ([396c1a3](https://github.com/Sonia-corporation/stale/commit/396c1a33f551bb92a5b0f868cb784b25eb96cbf9))
- **readme:** add the sonia changelog badge ([e4d529d](https://github.com/Sonia-corporation/stale/commit/e4d529d43f013e8235d57f4ccf95d14d4720fe09))
- **readme:** add the sonia stale badge ([2ae6eff](https://github.com/Sonia-corporation/stale/commit/2ae6eff367ef0ddece9cba2c8e20dbe1ddf9e880))
- **ui:** change the brand name and the color scheme to follow Sonia style ([3ce0536](https://github.com/Sonia-corporation/stale/commit/3ce0536e843e5f0708b25ce6a4137d96ad14cd27)), closes [#473](https://github.com/Sonia-corporation/stale/issues/473)
The Sonia corporation is a big joke, it's not a corporation and is only about open-source stuff.
But having consistency between the different apps is cool, hence this change.
I am not fond of the pink colour, but whatever, I am just bad at design.
### :rocket: Features {#rocket-features-10}
- **prs:** add new input `pull-request-only-any-project-cards` ([ad82fd8](https://github.com/Sonia-corporation/stale/commit/ad82fd87aebe5caac4eeeef5985bc363f28d3e7b)), closes [#416](https://github.com/Sonia-corporation/stale/issues/416)
This new input allows processing only the PRs that belong to specific projects.
You can link your PRs to a project card.
Pass the names of your projects as a multi-line input.
When this input is set, all PRs without a project card will not be processed.
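A hedged usage sketch of this input (the version ref and the project names below are made up for illustration):

```yml
- uses: Sonia-corporation/stale@latest
  with:
    # Only process PRs whose project card belongs to one of these projects.
    pull-request-only-any-project-cards: |
      Backlog
      Roadmap
```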
# [1.44.0](https://github.com/Sonia-corporation/stale/compare/1.43.2...1.44.0) (2022-02-11)
### :books: Documentation {#books-documentation-4}
- add [@iainlane](https://github.com/iainlane) as a contributor ([aac06f5](https://github.com/Sonia-corporation/stale/commit/aac06f55a4fd21ceeca0e513e835509570a5b8d9))
- add @Sonia-corporation-bot as a contributor ([c73b937](https://github.com/Sonia-corporation/stale/commit/c73b937fc152786f6a8338e17aee7be8438c642e))
- **readme:** add a link to the changelog ([ecc7c62](https://github.com/Sonia-corporation/stale/commit/ecc7c62a9f2be6bc8ca4c4db98d10b6cf742b08a)), closes [#414](https://github.com/Sonia-corporation/stale/issues/414)
- **need-help:** add a new page to explain how to get some help ([1d148e9](https://github.com/Sonia-corporation/stale/commit/1d148e9589e86a0ef7e4df44a443feb73a364cc3)), closes [#415](https://github.com/Sonia-corporation/stale/issues/415)
- **annotations:** add a new page to explain what are the annotations ([c37a183](https://github.com/Sonia-corporation/stale/commit/c37a1834a03a26567bfd02990ace810ca18d95e2)), closes [#448](https://github.com/Sonia-corporation/stale/issues/448)
Explain what to expect to find with the notices, warnings and errors.
- **blog:** add new blog post to introduce the first inclusive input ([f40b60f](https://github.com/Sonia-corporation/stale/commit/f40b60f01b926fe298a25b62c1068f1aefaf15e2)), closes [#458](https://github.com/Sonia-corporation/stale/issues/458)
- **outputs:** add tip about annotations ([36c7378](https://github.com/Sonia-corporation/stale/commit/36c737812292ec561a2b42ef3d215358b39ce8de)), closes [#447](https://github.com/Sonia-corporation/stale/issues/447)
- **stats:** add tip about annotations ([3923187](https://github.com/Sonia-corporation/stale/commit/3923187cf235e69d809bb113bf0c8085f56c1e53))
- update @C0ZEN as a contributor ([578ed8c](https://github.com/Sonia-corporation/stale/commit/578ed8ce5f059be33d1a98e419358a320c7ee8a4))
## [1.43.2](https://github.com/Sonia-corporation/stale/compare/1.43.1...1.43.2) (2022-02-10) {#1432-2022-02-10}
### :books: Documentation {#books-documentation-5}
- **fix:** correctly highlight the examples ([c6fe0d0](https://github.com/Sonia-corporation/stale/commit/c6fe0d022e5e88548136a0d6b870fef192c3d8cf)), closes [#439](https://github.com/Sonia-corporation/stale/issues/439)
### :bug: Bug Fixes {#bug-bug-fixes-3}
- **deps:** update dependency prism-react-renderer to v1.3.1 ([#425](https://github.com/Sonia-corporation/stale/issues/425)) ([f4eae39](https://github.com/Sonia-corporation/stale/commit/f4eae3925b8e696997e11c65e8e3fa45b75cfac8))
Co-authored-by: Renovate Bot <bot@renovateapp.com>
## [1.43.1](https://github.com/Sonia-corporation/stale/compare/1.43.0...1.43.1) (2022-02-09) {#1431-2022-02-09}
### :bug: Bug Fixes {#bug-bug-fixes-4}
- **logs:** properly count the number of processed items within all batches in the logs ([8a17a18](https://github.com/Sonia-corporation/stale/commit/8a17a18df755c446c48ddf8f719bd5afcaedd6be)), closes [#431](https://github.com/Sonia-corporation/stale/issues/431)
Only the first batch was taken into consideration.
The information was lost during the recursive batch processing.
# [1.43.0](https://github.com/Sonia-corporation/stale/compare/1.42.1...1.43.0) (2022-02-09)
### :rocket: Features {#rocket-features-11}
- **annotations:** add notice annotations for all outputs greater than 0 ([f26f219](https://github.com/Sonia-corporation/stale/commit/f26f219f6a6d40717f1fe075c51d19fa62b2aae5)), closes [#429](https://github.com/Sonia-corporation/stale/issues/429)
## [1.42.1](https://github.com/Sonia-corporation/stale/compare/1.42.0...1.42.1) (2022-02-09) {#1421-2022-02-09}
### :bug: Bug Fixes {#bug-bug-fixes-5}
- **annotations:** properly format the error annotations ([13bcd6a](https://github.com/Sonia-corporation/stale/commit/13bcd6a9f70ca44159bbab7e5a8bec78872caaca)), closes [#422](https://github.com/Sonia-corporation/stale/issues/422)
- **annotations:** properly format the warning annotations ([469609c](https://github.com/Sonia-corporation/stale/commit/469609ce0f074aef3929af01c523ff371267474b)), closes [#422](https://github.com/Sonia-corporation/stale/issues/422)
- **annotations:** remove every existing notice annotation ([3c3a2d5](https://github.com/Sonia-corporation/stale/commit/3c3a2d574784f0bdfcb89c70f90fe1c45bbf7844)), closes [#422](https://github.com/Sonia-corporation/stale/issues/422)
The logs were not static (they included some variables) and were broken due to the colours.
# [1.42.0](https://github.com/Sonia-corporation/stale/compare/1.41.0...1.42.0) (2022-02-08)
### :books: Documentation {#books-documentation-6}
- **examples:** add a multiple-cron jobs example ([d57bc19](https://github.com/Sonia-corporation/stale/commit/d57bc196ee7839e2b6ddb843c7677023437c569c))
- **website:** add more tags to existing docs pages ([4c92a80](https://github.com/Sonia-corporation/stale/commit/4c92a8047925923fb452c64070070b93cb432f6b))
- **readme:** fix broken links to inputs inside the readme ([2252af8](https://github.com/Sonia-corporation/stale/commit/2252af8fa2cb8b4280f5c48fa7adb0731bd97823))
### :rocket: Features {#rocket-features-12}
- **issues:** add new input `issue-only-any-project-cards` ([8650e12](https://github.com/Sonia-corporation/stale/commit/8650e12be11a74a8027e8dfcb51e3024f867c26f)), closes [#312](https://github.com/Sonia-corporation/stale/issues/312) [#412](https://github.com/Sonia-corporation/stale/issues/412)
This new input allows processing only the issues that belong to specific projects.
You can link your issues to a project card.
Pass the names of your projects as a multi-line input.
When this input is set, issues without a project card will not be processed.
Note: this is the first input of its kind, but it may not be the last, depending on people's needs!
# [1.41.0](https://github.com/Sonia-corporation/stale/compare/1.40.0...1.41.0) (2022-02-06)
### :books: Documentation {#books-documentation-7}
- **website:** add a changelog page inside the docs ([5cdf738](https://github.com/Sonia-corporation/stale/commit/5cdf7381581014361e2109529d60b533d5e120c8)), closes [#385](https://github.com/Sonia-corporation/stale/issues/385)
# [1.40.0](https://github.com/Sonia-corporation/stale/compare/1.39.0...1.40.0) (2022-02-05)
### :books: Documentation {#books-documentation-8}
- **fix:** fix wrong configuration examples ([98ca681](https://github.com/Sonia-corporation/stale/commit/98ca681065bd39762b32b6db35c7fe6aec9f3550))
# [1.39.0](https://github.com/Sonia-corporation/stale/compare/1.38.0...1.39.0) (2022-02-05)
### :rocket: Features {#rocket-features-13}
- **prs:** add new input `pull-request-limit-api-mutations-count` ([5559871](https://github.com/Sonia-corporation/stale/commit/5559871a1003996b8d6799e935ab559e7ee372cb)), closes [#389](https://github.com/Sonia-corporation/stale/issues/389)
Before processing each pull request, the workflow will check if the input is enabled.
If this is the case, the processing may stop if the statistics for mutations are higher than the limit.
# [1.38.0](https://github.com/Sonia-corporation/stale/compare/1.37.0...1.38.0) (2022-02-05)
### :rocket: Features {#rocket-features-14}
- **issues:** add new input `issue-limit-api-mutations-count` ([14b034a](https://github.com/Sonia-corporation/stale/commit/14b034a428de778d59a6c868b1b9553f39dec3e2)), closes [#376](https://github.com/Sonia-corporation/stale/issues/376)
Before processing each issue, the workflow will check if the input is enabled.
If this is the case, the processing may stop if the statistics for mutations are higher than the limit.
# [1.37.0](https://github.com/Sonia-corporation/stale/compare/1.36.0...1.37.0) (2022-02-02)
### :rocket: Features {#rocket-features-15}
- **issues:** add new input `issue-limit-api-queries-count` ([db5effb](https://github.com/Sonia-corporation/stale/commit/db5effb74d732af800108c6634639cbae6ec9336)), closes [#254](https://github.com/Sonia-corporation/stale/issues/254)
Before processing each issue, the workflow will check if the input is enabled.
If this is the case, the processing may stop if the statistics for queries are higher than the limit.
- **prs:** add new input `pull-request-limit-api-queries-count` ([ee9a293](https://github.com/Sonia-corporation/stale/commit/ee9a29366bf0ef6a3cd0dce7f84b08a4987e1d7a)), closes [#254](https://github.com/Sonia-corporation/stale/issues/254)
Before processing each pull request, the workflow will check if the input is enabled.
If this is the case, the processing may stop if the statistics for queries are higher than the limit.
# [1.36.0](https://github.com/Sonia-corporation/stale/compare/1.35.0...1.36.0) (2022-02-01)
### :books: Documentation {#books-documentation-9}
- **website:** add a search bar ([f32c9dc](https://github.com/Sonia-corporation/stale/commit/f32c9dcd89837a9fb49c5c72fcddb6cc2a04c3b4)), closes [#325](https://github.com/Sonia-corporation/stale/issues/325)
Provided by Algolia
# [1.35.0](https://github.com/Sonia-corporation/stale/compare/1.34.0...1.35.0) (2022-01-31)
### :books: Documentation {#books-documentation-10}
- **website:** add new pages to list the outputs ([54f33b0](https://github.com/Sonia-corporation/stale/commit/54f33b0cea6594753532dc76fb95de707163c425))
- **website:** put the issues and prs inputs into an inputs folder ([c338ab7](https://github.com/Sonia-corporation/stale/commit/c338ab732c666cf82ed789a9053949d7f8509aac))
This will impact all the URLs, adding `inputs/` to their paths.
# [1.34.0](https://github.com/Sonia-corporation/stale/compare/1.33.1...1.34.0) (2022-01-29)
### :bug: Bug Fixes {#bug-bug-fixes-6}
- **deps:** update docusaurus monorepo to v2.0.0-beta.15 (patch) ([#358](https://github.com/Sonia-corporation/stale/issues/358)) ([5bfd537](https://github.com/Sonia-corporation/stale/commit/5bfd537ce39437b0e833ebff60ed7013c8bae109))
Co-authored-by: Renovate Bot <bot@renovateapp.com>
Co-authored-by: TESTELIN Geoffrey <geoffrey.testelin@gmail.com>
### :rocket: Features {#rocket-features-16}
- **outputs:** add 6 new outputs to expose the api calls ([7d85f17](https://github.com/Sonia-corporation/stale/commit/7d85f17505514e84f48cbb5cc8d018f08f2e2cb9))
Add `called-api-issues-count`
Add `called-api-issues-queries-count`
Add `called-api-issues-mutations-count`
Add `called-api-pull-requests-count`
Add `called-api-pull-requests-queries-count`
Add `called-api-pull-requests-mutations-count`
- **stats:** add statistics about mutations and queries from the API ([3ddd3ca](https://github.com/Sonia-corporation/stale/commit/3ddd3cab61a3fc685437049df4b2ccc8e0ec9eea))
- **website:** change the color palette ([6d483e5](https://github.com/Sonia-corporation/stale/commit/6d483e504e248d9f461b72b6a49548085dfb29b5))
- **logs:** display the total of stats for mutations and queries API calls ([86261dd](https://github.com/Sonia-corporation/stale/commit/86261dd190539908105deb5297322f53c62cd6a4))
Created a nested view
## [1.33.1](https://github.com/Sonia-corporation/stale/compare/1.33.0...1.33.1) (2022-01-24) {#1331-2022-01-24}
### :bug: Bug Fixes {#bug-bug-fixes-7}
- **outputs:** expose the outputs as expected ([10e9657](https://github.com/Sonia-corporation/stale/commit/10e965716d3791ca98d8e3907bcea478cc42bf99)), closes [#351](https://github.com/Sonia-corporation/stale/issues/351)
# [1.33.0](https://github.com/Sonia-corporation/stale/compare/1.32.0...1.33.0) (2022-01-24)
### :bug: Bug Fixes {#bug-bug-fixes-8}
- **draft:** only convert to draft when the dry-run is disabled ([67cc535](https://github.com/Sonia-corporation/stale/commit/67cc535beca61bf80bfc7ca4e5ced1d3006abc97)), closes [#346](https://github.com/Sonia-corporation/stale/issues/346)
### :rocket: Features {#rocket-features-17}
- **statistics:** always count the statistics even in dry-run ([cbc41eb](https://github.com/Sonia-corporation/stale/commit/cbc41ebf616ebd19379f47d00c5a74ff95c0e5d7)), closes [#323](https://github.com/Sonia-corporation/stale/issues/323)
This is particularly useful for knowing what to expect from the dry-run.
Newcomers can really take advantage of this feature now.
# [1.32.0](https://github.com/Sonia-corporation/stale/compare/1.31.0...1.32.0) (2022-01-23)
### :books: Documentation {#books-documentation-11}
- **website:** add a new page to explain how the statistics can be helpful ([ae85a4f](https://github.com/Sonia-corporation/stale/commit/ae85a4f7eb3d899b7fd5c3f633fb2481b2379605)), closes [#331](https://github.com/Sonia-corporation/stale/issues/331)
- **website:** change some references of inputs to link to the website ([11f2e62](https://github.com/Sonia-corporation/stale/commit/11f2e62deffea25dd7abe2f0a493d60475286e29))
Also applied the changes in the readme.
### :bug: Bug Fixes {#bug-bug-fixes-9}
- **deps:** update dependency @mdx-js/react to v1.6.22 ([c624394](https://github.com/Sonia-corporation/stale/commit/c6243941e201e0d2fa4f1c462dd57030150efde0))
- **deps:** update react monorepo to v17.0.2 ([ad3fc8f](https://github.com/Sonia-corporation/stale/commit/ad3fc8f22103d6f37373f6a295481a19c34a402c))
# [1.31.0](https://github.com/Sonia-corporation/stale/compare/1.30.0...1.31.0) (2022-01-22)
### :books: Documentation {#books-documentation-12}
- **website:** add explicit docs for every pull requests inputs ([566376d](https://github.com/Sonia-corporation/stale/commit/566376dd181f77e50a81c5c273a723031c583181))
- **website:** add explicit docs for every issues inputs ([12af4f5](https://github.com/Sonia-corporation/stale/commit/12af4f53cb7fc03341304a6375790f6f05aa6525))
# [1.30.0](https://github.com/Sonia-corporation/stale/compare/1.29.0...1.30.0) (2022-01-19)
### :rocket: Features {#rocket-features-18}
- **labels:** add new input to add extra labels after closing an item ([f58a86b](https://github.com/Sonia-corporation/stale/commit/f58a86ba4b27a16961c6e85dc34616ec1fc7ca4b)), closes [#213](https://github.com/Sonia-corporation/stale/issues/213)
The inputs `issue-add-labels-after-close` and `pull-request-add-labels-after-close` were added.
You can now define a list of extra labels to add when the processing closes an item.
# [1.29.0](https://github.com/Sonia-corporation/stale/compare/1.28.0...1.29.0) (2022-01-19)
### :rocket: Features {#rocket-features-19}
- **outputs:** add draft pull requests count output ([675faea](https://github.com/Sonia-corporation/stale/commit/675faeab130b299d095ba2690e3519cd8da37b40))
Add the `draft-pull-requests-count` output.
Add a new statistic to count the number of pull requests converted to draft.
- **draft:** add new input to convert a pull request to draft instead of stale ([1efadfb](https://github.com/Sonia-corporation/stale/commit/1efadfba8034cbf041ba830277791fe9c37c67fd)), closes [#275](https://github.com/Sonia-corporation/stale/issues/275)
The input `pull-request-to-draft-instead-of-stale` was added.
When enabled, the processing will no longer mark the pull request as stale.
It will instead convert the pull request to a draft.
No stale label is added, no stale comment is added, and no extra labels are added.
# [1.28.0](https://github.com/Sonia-corporation/stale/compare/1.27.1...1.28.0) (2022-01-18)
### :rocket: Features {#rocket-features-20}
- **processing:** add new input to enable or disable the processing ([b57eac0](https://github.com/Sonia-corporation/stale/commit/b57eac07802080d49ce925ffdec34fc1fb23049c)), closes [#277](https://github.com/Sonia-corporation/stale/issues/277)
Add the `issue-processing` and `pull-request-processing` inputs.
When enabled, the processing occurs as expected.
When disabled, the processing will be skipped.
## [1.27.1](https://github.com/Sonia-corporation/stale/compare/1.27.0...1.27.1) (2022-01-12) {#1271-2022-01-12}
### :bug: Bug Fixes {#bug-bug-fixes-10}
- **stale:** add in last the stale label to avoid removing the stale the next run ([cb7257d](https://github.com/Sonia-corporation/stale/commit/cb7257daf0d10d778ed05f36f31c2c62fa6fc2a8))
Adds the stale comment and the extra labels before adding the stale label.
This avoids an issue with the update date being more recent than the stale label addition date.
- **deps:** update dependency luxon to v2.3.0 ([b3f91a7](https://github.com/Sonia-corporation/stale/commit/b3f91a744136951fe74f6900bc25d987727d6fc2))
# [1.27.0](https://github.com/Sonia-corporation/stale/compare/1.26.0...1.27.0) (2022-01-10)
### :rocket: Features {#rocket-features-21}
- **logs:** add a log when a statistic is increased ([3885885](https://github.com/Sonia-corporation/stale/commit/388588543dd76c0dfc273ba4bd8cfb8c6cd232c7))
Shows by how much the count was increased, along with the new total count.
- **stats:** add a new statistic for the number of issue added labels ([4805b65](https://github.com/Sonia-corporation/stale/commit/4805b659b2af24fa60055b5842a87219ad688e74))
- **stats:** add a new statistic for the number of PR added labels ([ef5edab](https://github.com/Sonia-corporation/stale/commit/ef5edab8762acd5a9143da35d6240e60324d46ac))
- **label:** add new inputs to add extra labels after marking as stale ([39433b0](https://github.com/Sonia-corporation/stale/commit/39433b0f83ca4e4debd9797a048801a77868199f))
Added the `issue-add-labels-after-stale` input
Added the `pull-request-add-labels-after-stale` input
- **outputs:** add new outputs for the count of added labels ([397a712](https://github.com/Sonia-corporation/stale/commit/397a712bde7fc45494b01b4afdeb2292f19862af))
# [1.26.0](https://github.com/Sonia-corporation/stale/compare/1.25.1...1.26.0) (2022-01-02)
### :rocket: Features {#rocket-features-22}
- **logs:** add counts in the batches and processed items logs ([2022cf0](https://github.com/Sonia-corporation/stale/commit/2022cf0e93c88f0105b8ac2f91cb17e1b9172320))
- **pull-request:** add new input `pull-request-delete-branch-after-close` ([cc53931](https://github.com/Sonia-corporation/stale/commit/cc53931ad5625307f86e8a048c12b483d7f2af99))
When a pull request is closed by the processing, if the option is enabled and the dry-run is disabled, the related branch will be removed.
A new output was also added to track the number of deleted pull request branches.
## [1.25.1](https://github.com/Sonia-corporation/stale/compare/1.25.0...1.25.1) (2022-01-01) {#1251-2022-01-01}
### :bug: Bug Fixes {#bug-bug-fixes-11}
- **processing:** fix an issue with infinite batches processing ([218f164](https://github.com/Sonia-corporation/stale/commit/218f1641633388297bc4302bec4ad71048e89848))
If the number of issues or pull requests to process was higher than 20, the processing ended up in an infinite loop.
This was due to the "endCursor" parameter missing from the endpoint request, which led to an undefined value.
When the cursor is undefined, the first batch is processed again, hence the infinite loop.
# [1.25.0](https://github.com/Sonia-corporation/stale/compare/1.24.0...1.25.0) (2022-01-01)
### :rocket: Features {#rocket-features-23}
- **pull-request:** add new input `pull-request-ignore-draft` ([639e9ba](https://github.com/Sonia-corporation/stale/commit/639e9ba357d41de615e0423f31ac3ecf72e98dcf))
A draft pull request can now be excluded from the processing.
### :zap: Performance Improvements {#zap-performance-improvements-1}
- fix jest performances issues when running locally ([cb4a833](https://github.com/Sonia-corporation/stale/commit/cb4a833196ff07022e3f0f34d840a0e8039f85c0))
# [1.24.0](https://github.com/Sonia-corporation/stale/compare/1.23.0...1.24.0) (2021-12-16)
### :rocket: Features {#rocket-features-24}
- **pr:** add new inputs to support pull requests processing ([ab8f8a0](https://github.com/Sonia-corporation/stale/commit/ab8f8a090db6210c18314c4707bdbb294ce46042))
This is the same implementation as the issues inputs.
From this point, issues and pull requests have the exact same options and features, which closes the alpha state of this action.
# [1.23.0](https://github.com/Sonia-corporation/stale/compare/1.22.0...1.23.0) (2021-12-11)
### :bug: Bug Fixes {#bug-bug-fixes-12}
- **deps:** update dependency luxon to v2.2.0 ([03b420a](https://github.com/Sonia-corporation/stale/commit/03b420a8d0f4a9a8f5f8843a9ca744208f71ae46))
### :rocket: Features {#rocket-features-25}
- **issue:** add new input "issue-ignore-all-project-cards" ([91193f2](https://github.com/Sonia-corporation/stale/commit/91193f25428423e92ce6b87c231d8d99c9512597))
# [1.22.0](https://github.com/Sonia-corporation/stale/compare/1.21.0...1.22.0) (2021-12-07)
### :rocket: Features {#rocket-features-26}
- **debug:** add more logs to debug in case a label does not exist in your repo ([0c00efb](https://github.com/Sonia-corporation/stale/commit/0c00efbe379f6f50124bc7e91efc726c4e73abf1))
# [1.21.0](https://github.com/Sonia-corporation/stale/compare/1.20.0...1.21.0) (2021-12-06)
### :rocket: Features {#rocket-features-27}
- **issue:** add new input issue-close-comment to comment on close ([a026020](https://github.com/Sonia-corporation/stale/commit/a0260200e2149a1be591f2ee7b1e97a6e6089115))
# [1.20.0](https://github.com/Sonia-corporation/stale/compare/1.19.0...1.20.0) (2021-12-05)
### :rocket: Features {#rocket-features-28}
- **issue:** add new input issue-stale-comment to comment on stale ([1a4abcd](https://github.com/Sonia-corporation/stale/commit/1a4abcde462e3b35254b18dedf20a820ab584783))
# [1.19.0](https://github.com/Sonia-corporation/stale/compare/1.18.0...1.19.0) (2021-12-05)
### :rocket: Features {#rocket-features-29}
- **issue:** add new option to ignore some assignees ([6876cfd](https://github.com/Sonia-corporation/stale/commit/6876cfd121e9a8eb7d1972977b43f969c097362c))
# [1.18.0](https://github.com/Sonia-corporation/stale/compare/1.17.0...1.18.0) (2021-11-22)
### :rocket: Features {#rocket-features-30}
- **issue:** add new input to ignore the process based on the creation date ([10b4b64](https://github.com/Sonia-corporation/stale/commit/10b4b6449cb728352b3b1430ac37ece82fa3e986))
# [1.17.0](https://github.com/Sonia-corporation/stale/compare/1.16.0...1.17.0) (2021-11-21)
### :rocket: Features {#rocket-features-31}
- **issue:** add new input to ignore issues with assignees ([a1c168b](https://github.com/Sonia-corporation/stale/commit/a1c168b46bb3fe871f91d77a725363857315cf7e))
# [1.16.0](https://github.com/Sonia-corporation/stale/compare/1.15.0...1.16.0) (2021-11-20)
### :rocket: Features {#rocket-features-32}
- **issue:** avoid closing issues in dry-run mode ([b93d952](https://github.com/Sonia-corporation/stale/commit/b93d952686506ad23a3c0daac1cbeae364eda69a))
# [1.15.0](https://github.com/Sonia-corporation/stale/compare/1.14.0...1.15.0) (2021-11-20)
### :rocket: Features {#rocket-features-33}
- **outputs:** add closed issues count output ([aeeb72e](https://github.com/Sonia-corporation/stale/commit/aeeb72eec75a67bb0ee1dfec36c63174d1983bb2))
- **stats:** add stats about the number of closed issues ([de3d98c](https://github.com/Sonia-corporation/stale/commit/de3d98c7d8fea7b4ded4a2b8c12437a3b36518bd))
- **issue:** close stale issues ([e7f9a54](https://github.com/Sonia-corporation/stale/commit/e7f9a545161b02a61fc957ea6f5c7a917b464f06))
# [1.14.0](https://github.com/Sonia-corporation/stale/compare/1.13.0...1.14.0) (2021-11-20)
### :rocket: Features {#rocket-features-34}
- **issue:** add new input to ignore the process when a label is on an issue ([9c118dc](https://github.com/Sonia-corporation/stale/commit/9c118dc14cc113d2d044285d09517e22d051d9bb))
# [1.13.0](https://github.com/Sonia-corporation/stale/compare/1.12.0...1.13.0) (2021-11-18)
### :rocket: Features {#rocket-features-35}
- **label:** use a better search to find the stale label to avoid mismatch ([73148bd](https://github.com/Sonia-corporation/stale/commit/73148bdee19a2faa69e555e7d6071ccdb98530e8))
# [1.12.0](https://github.com/Sonia-corporation/stale/compare/1.11.0...1.12.0) (2021-11-15)
### :rocket: Features {#rocket-features-36}
- **outputs:** add some outputs to this action ([6c8390f](https://github.com/Sonia-corporation/stale/commit/6c8390fe3b5fceb9e4094d10a87b3c7b2bcdfaec))
# [1.11.0](https://github.com/Sonia-corporation/stale/compare/1.10.0...1.11.0) (2021-11-15)
### :rocket: Features {#rocket-features-37}
- **stats:** add a statistic regarding the unaltered issues ([acca942](https://github.com/Sonia-corporation/stale/commit/acca9429dc3538d9daca48481a2458a678ae2cd6))
- **issue:** use the most recent added stale label event for the check ([36e21fd](https://github.com/Sonia-corporation/stale/commit/36e21fd718938168fbfbef1023eb7972760af0da))
# [1.10.0](https://github.com/Sonia-corporation/stale/compare/1.9.0...1.10.0) (2021-11-14)
### :rocket: Features {#rocket-features-38}
- **logs:** add statistics in the logs ([90fbb7a](https://github.com/Sonia-corporation/stale/commit/90fbb7aa86f381d6152f264e0796cd1cbb453671))
# [1.9.0](https://github.com/Sonia-corporation/stale/compare/1.8.0...1.9.0) (2021-11-14)
### :rocket: Features {#rocket-features-39}
- **logs:** display the api services logs with the issue prefix ([6b35fe8](https://github.com/Sonia-corporation/stale/commit/6b35fe8e7a84ed9a64745b7b4657cf843ce0efd0))
- **logs:** improve the issue logger to always add the issue id as a prefix ([b6390da](https://github.com/Sonia-corporation/stale/commit/b6390da9d3dcd924ee92d3df780f595312234c72))
- **issue:** remove the stale label from the issue when no longer stale ([c128375](https://github.com/Sonia-corporation/stale/commit/c128375998ec04299f21a0e4957a5461af83f564))
- **issue:** remove the stale label when the creation date of the label is sooner than the update date ([b66d9b3](https://github.com/Sonia-corporation/stale/commit/b66d9b3e8b3c2fcfbeaaeb4c40379b16ec0fe464))
- **logs:** round the difference of days in the logs ([6801ee5](https://github.com/Sonia-corporation/stale/commit/6801ee5f3f5b94035d6b0ae702bcdc882bd767c7))
- **issue:** stop processing an issue already stale ([3dbd799](https://github.com/Sonia-corporation/stale/commit/3dbd79911c31f414ae0a955f753a9a4705b43905))
# [1.8.0](https://github.com/Sonia-corporation/stale/compare/1.7.0...1.8.0) (2021-11-13)
### :rocket: Features {#rocket-features-40}
- **issue:** add new input issue-days-before-close to customize the number of days before closing a stale issue ([ed43be6](https://github.com/Sonia-corporation/stale/commit/ed43be60c281f1e23081f4d46633b4375144a02d))
# [1.7.0](https://github.com/Sonia-corporation/stale/compare/1.6.1...1.7.0) (2021-11-13)
### :rocket: Features {#rocket-features-41}
- **issue:** add a new input "issue-days-before-stale" to choose when to stale ([2e92099](https://github.com/Sonia-corporation/stale/commit/2e92099f52fa4e0c90772913b43adca9ed24fcf5))
## [1.6.1](https://github.com/Sonia-corporation/stale/compare/1.6.0...1.6.1) (2021-11-13) {#161-2021-11-13}
### :bug: Bug Fixes {#bug-bug-fixes-13}
- **dry-run:** ignore the stale label addition in dry-run mode ([8fe45d2](https://github.com/Sonia-corporation/stale/commit/8fe45d291f6b2eb9ad9d5a295df2000f64b6d682))
# [1.6.0](https://github.com/Sonia-corporation/stale/compare/1.5.0...1.6.0) (2021-11-12)
### :rocket: Features {#rocket-features-42}
- **issue:** add new input issue-ignore-any-labels to ignore the processing of issues with specific labels ([1c4307d](https://github.com/Sonia-corporation/stale/commit/1c4307d433ce98cd978693821b96d0bd421f78a0))
# [1.5.0](https://github.com/Sonia-corporation/stale/compare/1.4.0...1.5.0) (2021-11-12)
### :rocket: Features {#rocket-features-43}
- **labels:** add a new feature to add a label to an issue ([f93639d](https://github.com/Sonia-corporation/stale/commit/f93639d81d4c7d5b8b54d4be9e62f5fd69c33196))
- **labels:** add a new feature to get a label from the repository ([06eb94e](https://github.com/Sonia-corporation/stale/commit/06eb94ef48adabea10aca1849e4d4cb7a587d012))
- **issue:** add a new option "issue-stale-label" to define the label to use when an issue is stale ([b8a54a8](https://github.com/Sonia-corporation/stale/commit/b8a54a8bc1340438275651dadd56cc38cb4da46d))
- **stale:** add a stale label on issue not updated since 30 days ([ffd6c76](https://github.com/Sonia-corporation/stale/commit/ffd6c768c1f1c93cb2ecbf7e119f34e1412bb7a9))
- **logs:** add logs for the stale label ([b57de32](https://github.com/Sonia-corporation/stale/commit/b57de328a7473354bf69a983c26cbbc28e5a7c88))
- **stale:** stale locally the issues older than 30 days ([c5af87c](https://github.com/Sonia-corporation/stale/commit/c5af87c0207c10f3c13160bcc6978e0ebde7da12))
# [1.4.0](https://github.com/Sonia-corporation/stale/compare/1.3.0...1.4.0) (2021-11-09)
### :rocket: Features {#rocket-features-44}
- **logs:** add a log when the process start ([ec78887](https://github.com/Sonia-corporation/stale/commit/ec78887de476e7fb93a59a7accd5d52f5b9904fb))
- **issues:** add pagination ([4c1395b](https://github.com/Sonia-corporation/stale/commit/4c1395b1bb469740047d3fada26643359d5a43b8))
- **logs:** highlight the number of fetched issues ([df09f18](https://github.com/Sonia-corporation/stale/commit/df09f18ef920c88c7e6131e5825eaa790e6d36a0))
# [1.3.0](https://github.com/Sonia-corporation/stale/compare/1.2.0...1.3.0) (2021-11-07)
### :bug: Bug Fixes {#bug-bug-fixes-14}
- **issue:** display the issue link in purple ([a8f4687](https://github.com/Sonia-corporation/stale/commit/a8f4687ed016117b8a8c2902e6f2d2584d1aad06))
- **log:** display the proper end of tree symbol when logging the inputs ([0b53d66](https://github.com/Sonia-corporation/stale/commit/0b53d66576d032bf8ea9dd0ce99e5a3cc1b2c768))
### :rocket: Features {#rocket-features-45}
- **issue:** skip if locked ([05325a1](https://github.com/Sonia-corporation/stale/commit/05325a196632580f8b8adcae1bf455478a7e1103))
# [1.2.0](https://github.com/Sonia-corporation/stale/compare/1.1.0...1.2.0) (2021-11-07)
### :rocket: Features {#rocket-features-46}
- **logs:** add more logs to track the processing ([80afa14](https://github.com/Sonia-corporation/stale/commit/80afa14a5168b099e4e0db78284938f0d5ca08cb))
- **dry-run:** add new input dry-run and remove the required github-token ([71ad840](https://github.com/Sonia-corporation/stale/commit/71ad84010b317981c0deb3d7953be4b93060a0b6))
# [1.1.0](https://github.com/Sonia-corporation/stale/compare/1.0.0...1.1.0) (2021-11-06)
### :rocket: Features {#rocket-features-47}
- **issues:** fetch the issues to process and log the id ([9c79555](https://github.com/Sonia-corporation/stale/commit/9c79555879f840bbd080315a75d8914acf15ee8b))
- **issues:** log on error while fetching the issues ([495425f](https://github.com/Sonia-corporation/stale/commit/495425f94128b2c4445ae3533a688c2f919bfe9d))
# [1.0.0](https://github.com/Sonia-corporation/stale/compare/...1.0.0) (2021-11-06)
### :rocket: Features {#rocket-features-48}
- **error:** catch and stop the action in case of errors ([b1a8b23](https://github.com/Sonia-corporation/stale/commit/b1a8b23ba908ed77a013ea0eef6d43df5280d500))
- **error:** catch and stop the action in case of errors ([96d5b41](https://github.com/Sonia-corporation/stale/commit/96d5b4182c99296e1c133fe4966576f2de1fe134))
Weibo trending data for 2020-10-12 04:00
Status: 200
1. Qingdao COVID-19 outbreak
Weibo heat index: 236678
2. Taiwan spies
Weibo heat index: 109028
3. Every character in "To Dear Myself" gets a bad ending
Weibo heat index: 108952
4. Curley Gao and Jeff Chang sing a duet
Weibo heat index: 98806
5. Wang Baoqiang's side denies he married Feng Qing
Weibo heat index: 96443
6. "Our Song"
Weibo heat index: 73965
7. S10 group draw
Weibo heat index: 64216
8. CCTV issues a stern warning to Taiwan independence elements
Weibo heat index: 62484
9. Japanese taxis start delivering takeout
Weibo heat index: 60024
10. Kenny Bee and Feng Timo perform "Crazy for Love"
Weibo heat index: 59480
11. Ren Yinpeng
Weibo heat index: 55996
12. Taiwan independence activist secretly filmed armed police assembling in Shenzhen
Weibo heat index: 52642
13. "Love Is Sweet" trailer
Weibo heat index: 52093
14. UK MP wipes his glasses with a face mask during a session
Weibo heat index: 46824
15. Screenwriter of "To Dear Myself"
Weibo heat index: 46688
16. FILA's official Weibo clears out all BTS-related posts
Weibo heat index: 46636
17. Nadal
Weibo heat index: 46588
18. Taiwan independence activists sought to collect Hong Kong black-clad protesters' signatures for Taiwan officials
Weibo heat index: 46486
19. Luo Yonghao
Weibo heat index: 46429
20. Qingdao nucleic acid testing
Weibo heat index: 46378
21. Yan Ni's dreadlocks look
Weibo heat index: 46308
22. 10-year-old boy living alone after losing his family writes a letter asking for help
Weibo heat index: 46266
23. "Day Day Up"
Weibo heat index: 46149
24. Taiwan independence agent spied on armed police assembly in Shenzhen
Weibo heat index: 46085
25. Groomsmen knock the bride's family's door off its hinges during the wedding pickup
Weibo heat index: 45994
26. Sunnee Yang's fans
Weibo heat index: 45971
27. League of Legends
Weibo heat index: 45876
28. Central government supports Shenzhen in implementing a comprehensive reform pilot
Weibo heat index: 45806
29. Hamilton equals Schumacher's record of 91 race wins
Weibo heat index: 45709
30. Car loses control in Weihai, hitting several people
Weibo heat index: 45708
31. Samsung China / BTS
Weibo heat index: 43416
32. Chen Yiming exposes Wang Ziru
Weibo heat index: 41727
33. Tian Ning
Weibo heat index: 41340
34. Lamu Yangzi's real name is Li Jiaqi
Weibo heat index: 39857
35. Fei Qiming explains how he dislocated a joint while fanboying over Yang Mi
Weibo heat index: 35647
36. Li Jian successfully meets his idol Alan Tam
Weibo heat index: 34488
37. "Perfect Mister and Miss Almost"
Weibo heat index: 32301
38. FNC is surrounded by LPL teams
Weibo heat index: 31827
39. Jackson Yee and Liu Yuxin wear the same outfit
Weibo heat index: 30156
40. Gu Xiaoling is willing to wait three years for Lei Haowen
Weibo heat index: 24826
41. Peng Yuchang: the ten-million-follower selfie will have to wait
Weibo heat index: 24181
42. Yun Shu is so good at this
Weibo heat index: 23390
43. State security organs crack hundreds of Taiwan espionage cases
Weibo heat index: 23208
44. How dating me differs from dating other people
Weibo heat index: 23161
45. IKEA Qingdao
Weibo heat index: 23066
46. Yuan Shuai and Jiang Jun's first-snow kiss
Weibo heat index: 21681
47. "Topics in Focus"
Weibo heat index: 17317
48. CPA
Weibo heat index: 17257
49. TES advances as group winner
Weibo heat index: 17054
50. Honor of Kings
Weibo heat index: 16927
] | null | null | null | ---
layout: zh-CN/default
title: Rust Documentation · The Rust Programming Language
---

# Rust Documentation

If you do not know Rust yet, start by reading [The Rust Programming Language][book].
It will give you a clear picture of what kind of language Rust is, how to install it, and its syntax and concepts.
Once you have finished the book, you will be an intermediate Rust developer with a good understanding of the fundamental ideas behind Rust.

## Learn Rust

[The Rust Programming Language][book]: the most comprehensive resource on all Rust topics, and an essential piece of the official documentation.

[Rust by Example][rbe]: a collection of Rust examples on a variety of topics, viewable online.

[The Rustonomicon][nomicon]: a book for advanced Rust engineers, dedicated to explaining how to write unsafe Rust code.

[rust-learning][rust-learning]: a community-maintained collection of resources for learning Rust.

[Frequently Asked Questions][faq]

[book]: https://kaisery.gitbooks.io/rust-book-chinese/content/
[rbe]: https://rustwiki.org/rust-by-example/
[nomicon]: https://doc.rust-lang.org/nomicon/
[rust-learning]: https://github.com/ctjhoa/rust-learning
[faq]: faq.html

## References

[The standard library][api]: documentation for the Rust standard library.

[docs.rs]: documentation for all crates published to [crates.io].

[The Rust Reference][ref]: while Rust does not yet have a language specification, this document describes the language in as much detail as possible. Some of its content may be out of date.

[Syntax Index][syn]: this index contains examples of all of Rust's syntax, cross-referenced with the relevant sections of The Rust Programming Language.

[The Cargo Guide][cargo]: documentation for Cargo, Rust's package manager.

[Compiler Error Index][err]: extended explanations of the error reports produced by the Rust compiler.

[Release Notes][release_notes]: a record of the changes made during each release.

[Platform Support][platform_support]: a list of the different tiers of platform support.

[api]: https://doc.rust-lang.org/std/
[syn]: https://doc.rust-lang.org/book/syntax-index.html
[ref]: https://doc.rust-lang.org/reference
[cargo]: http://doc.crates.io/guide.html
[err]: https://doc.rust-lang.org/error-index.html
[release_notes]: https://github.com/rust-lang/rust/blob/master/RELEASES.md
[docs.rs]: https://docs.rs
[crates.io]: https://crates.io
[platform_support]: https://forge.rust-lang.org/platform-support.html

## Rust Project Policies

[Rust Security Policy][security]: the project's policy for reporting, fixing, and disclosing security-related issues.

[Rust Copyright and Trademark Policy][legal]: Rust's copyright is held by the Rust project developers, and its trademarks are held by Mozilla. This policy describes the permitted uses of the Rust trademarks.

[Code of Conduct][coc]: applies to all Rust community venues, including but not limited to the rust-lang organization on GitHub, the official forums, and the IRC channels.

[security]: security.html
[legal]: legal.html
[coc]: https://www.rust-lang.org/conduct.html

## Nightly and Beta Documentation

In addition to the stable documentation linked above, most official Rust documentation is also available for the [nightly][nightly] and [beta][beta] channels.

[nightly]: https://doc.rust-lang.org/nightly/
[beta]: https://doc.rust-lang.org/beta/

## Localized Documentation

If you are looking for localized material in a language other than English, see [the locale links][locale].

Quick links to the Chinese documentation:

- [Simplified Chinese][locale-zh-CN]
- [Traditional Chinese (Taiwan)][locale-zh-TW]

[locale]: https://github.com/ctjhoa/rust-learning#locale-links
[locale-zh-CN]: https://kaisery.gitbooks.io/rust-book-chinese/content/
[locale-zh-TW]: http://askeing.github.io/rust-book/

# Prime Factors
Compute the prime factors of a given natural number.
A prime number is only evenly divisible by itself and 1.
Note that 1 is not a prime number.
## Example
What are the prime factors of 60?
- Our first divisor is 2. 2 goes into 60, leaving 30.
- 2 goes into 30, leaving 15.
- 2 doesn't go cleanly into 15. So let's move on to our next divisor, 3.
- 3 goes cleanly into 15, leaving 5.
- 3 does not go cleanly into 5. The next possible factor is 4.
- 4 does not go cleanly into 5. The next possible factor is 5.
- 5 does go cleanly into 5.
- We're left only with 1, so now, we're done.
Our successful divisors in that computation represent the list of prime
factors of 60: 2, 2, 3, and 5.
You can check this yourself:
- 2 * 2 * 3 * 5
- = 4 * 15
- = 60
- Success!
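For the C track, the walk-through above translates directly into repeated trial division. The function below is only an illustrative sketch; the name `prime_factors` and its signature are not part of the exercise's generated stub:

```c
/* Collect the prime factors of n into out[] by trial division, mirroring
 * the walk-through above: keep dividing out the smallest divisor that goes
 * in cleanly until only 1 is left. Any divisor that divides cleanly here
 * is prime, because all smaller factors have already been divided out.
 * Returns the number of factors written (at most max). */
int prime_factors(long n, long out[], int max)
{
    int count = 0;
    for (long divisor = 2; n > 1 && count < max; ++divisor) {
        while (n % divisor == 0 && count < max) {
            out[count++] = divisor;  /* record the successful divisor */
            n /= divisor;            /* continue with what is left    */
        }
    }
    return count;
}
```

For 60 this writes 2, 2, 3, 5 and returns 4, matching the example above.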
## Getting Started
Make sure you have read the "Guides" section of the
[C track](https://exercism.io/my/tracks/c) on the Exercism site. This covers
the basic information on setting up the development environment expected
by the exercises.
## Passing the Tests
Get the first test compiling, linking and passing by following the [three
rules of test-driven development][3-tdd-rules].
The included makefile can be used to create and run the tests using the `test`
task.
    make test
Create just the functions you need to satisfy any compiler errors and get the
test to fail. Then write just enough code to get the test to pass. Once you've
done that, move onto the next test.
[3-tdd-rules]: http://butunclebob.com/ArticleS.UncleBob.TheThreeRulesOfTdd
As you progress through the tests, take the time to refactor your
implementation for readability and expressiveness and then go on to the next
test.
Try to use standard C99 facilities in preference to writing your own
low-level algorithms or facilities by hand.
## Source
The Prime Factors Kata by Uncle Bob [http://butunclebob.com/ArticleS.UncleBob.ThePrimeFactorsKata](http://butunclebob.com/ArticleS.UncleBob.ThePrimeFactorsKata)
## Submitting Incomplete Solutions
It's possible to submit an incomplete solution so you can see how others have completed the exercise.

---
plant_id: 257
name:
  common: "Narrow Leaf Mule Ears"
  scientific: "Wyethia angustifolia"
type: "perennial herb"
native_to: "Humboldt"
categories: [humboldt_county_native
            ,cnps_master_inventory
            ,cnps_2022_spring
            ,butterfly
            ]
sun_requirements:
  - "Full Sun"
# min then max in feet
plant_size:
  - height:
      - 3
icon_attribution:
  name: "Calflora"
  url: "https://www.calflora.org/entry/occdetail.html"
icon: "/assets/images/plants/wyethia_angustifolia_icon.jpg"
calscape_link: "https://calscape.org/Wyethia-angustifolia-(Narrow-Leaf-Mule-Ears)"
gardens: [
]
---

# XTools
[FAQ](https://github.com/dannycx/XTools/blob/main/QUESTION.md)

[IPC communication](https://github.com/dannycx/XTools/blob/main/notes/IPC.md)

[Flutter](https://github.com/dannycx/XTools/blob/main/notes/flutter/flutter.md)

[The four major Android components](https://github.com/dannycx/XTools/blob/main/notes/component/component.md)

[Communication](https://github.com/dannycx/XTools/blob/main/notes/communication/communication.md)

[Animation](https://github.com/dannycx/XTools/blob/main/notes/Animation.md)

[Design patterns](https://github.com/dannycx/XTools/blob/main/design/design.md)

[Application framework](https://github.com/dannycx/XTools/blob/main/frame/frame.md)

[Issues](https://github.com/dannycx/XTools/blob/main/notes/issue.md)

## Soft keyboard listener
```kotlin
val context = this
XSoftKeyUtil.startSoftKeyListener(this, object : XSoftKeyChangedCallback {
    override fun softKeyHide(height: Int) {
        Toast.makeText(context, "hide-$height", Toast.LENGTH_SHORT).show()
    }

    override fun softKeyShow(height: Int) {
        Toast.makeText(context, "show-$height", Toast.LENGTH_SHORT).show()
    }
})
```
## Exiting the application
```kotlin
XSystemUtil.getInstance().doubleExit(this)
```
## Removing system apps from a rooted phone

> restart adbd with root permissions
+ adb root
> remount the system partition as writable
+ adb remount
> open a shell
+ adb shell
> go to the system app directory
+ cd system/app/
> go to the built-in (privileged) system app directory
+ cd priv-app/
> look up the apk path of a package
+ adb shell pm path package
> enter the apk directory
+ cd system/priv-app/XXX/
> delete the apk
+ rm xx.apk
> reboot
+ adb reboot
## Gradle commands

* gradlew build --refresh-dependencies
* gradlew :app:dependencies
* =file:///d:/gradle-5.4.1.zip

# battlecard.js-server

---
layout: post
microblog: true
audio:
photo:
date: 2008-07-04 00:00:00 -0000
guid: http://xtof.micro.blog/2008/07/04/t849883686.html
---
In vacation mode: worn out from doing #placenette in #fengshui [tinyurl.com/5rxfwj](http://tinyurl.com/5rxfwj) #maviesansbug

# Example
```dart
import 'package:yak_error_handler/yak_error_handler.dart';
import 'package:yak_runner/yak_runner.dart';
Stream<int> get stream => Stream.fromIterable([for (var i = 0; i < 10; ++i) i]);
final onError = ErrorHandler<AvowError>((_) => print('this is odd!'));
final runner = YakRunnerArg<void, int>(
(i) {
avow(i.isEven);
print(i);
},
exceptionHandler: ExceptionHandler(),
errorHandlers: {onError},
);
void main() => stream.listen(runner);
```
[Jump to Source](https://github.com/iapicca/yak_packages/tree/master/examples/yak_runner)

---
title: About Keyboard Accelerators
description: This topic discusses keyboard accelerators.
ms.assetid: cbf7619d-289d-40c9-9a06-6ce47026d43f
keywords:
- Windows User Interface,user input
- Windows User Interface,keyboard accelerators
- user input,keyboard accelerators
- capturing user input,keyboard accelerators
- keyboard accelerators
- accelerators
- WM_COMMAND message
- WM_SYSCOMMAND message
- accelerator tables
ms.topic: article
ms.date: 05/31/2018
---
# About Keyboard Accelerators
Accelerators are closely related to menus — both provide the user with access to an application's command set. Typically, users rely on an application's menus to learn the command set and then switch over to using accelerators as they become more proficient with the application. Accelerators provide faster, more direct access to commands than menus do. At a minimum, an application should provide accelerators for the more commonly used commands. Although accelerators typically generate commands that exist as menu items, they can also generate commands that have no equivalent menu items.
This section covers the following topics.
- [Accelerator Tables](#accelerator-tables)
- [Accelerator-Table Creation](#accelerator-table-creation)
- [Accelerator Keystroke Assignments](#accelerator-keystroke-assignments)
- [Accelerators and Menus](#accelerators-and-menus)
- [UI State](#ui-state)
## Accelerator Tables
An accelerator table consists of an array of [**ACCEL**](/windows/win32/api/winuser/ns-winuser-accel) structures, each defining an individual accelerator. Each **ACCEL** structure includes the following information:
- The accelerator's keystroke combination.
- The accelerator's identifier.
- Various flags. This includes one that specifies whether the system is to provide visual feedback by highlighting the corresponding menu item, if any, when the accelerator is used.
To process accelerator keystrokes for a specified thread, the developer must call the [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) function in the message loop associated with the thread's message queue. The **TranslateAccelerator** function monitors keyboard input to the message queue, checking for key combinations that match an entry in the accelerator table. When **TranslateAccelerator** finds a match, it translates the keyboard input (that is, the [**WM\_KEYUP**](https://docs.microsoft.com/windows/desktop/inputdev/wm-keyup) and [**WM\_KEYDOWN**](https://docs.microsoft.com/windows/desktop/inputdev/wm-keydown) messages) into a [**WM\_COMMAND**](wm-command.md) or [**WM\_SYSCOMMAND**](wm-syscommand.md) message and then sends the message to the window procedure of the specified window. The following illustration shows how accelerators are processed.

The [**WM\_COMMAND**](wm-command.md) message includes the identifier of the accelerator that caused [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) to generate the message. The window procedure examines the identifier to determine the source of the message and then processes the message accordingly.
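The matching step that [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) performs can be modeled as a scan over such an array. The structure below imitates the Win32 **ACCEL** layout, but the flag values, type names, and the `translate_accelerator` helper are simplified stand-ins for illustration, not the real winuser.h definitions:

```c
#include <stddef.h>

/* Simplified stand-ins for the ACCEL flag bits (illustrative values). */
enum { FVIRTKEY = 0x01, FSHIFT = 0x04, FCONTROL = 0x08, FALT = 0x10 };

typedef struct {
    unsigned char  fVirt; /* key kind plus required modifier flags         */
    unsigned short key;   /* ASCII character or virtual-key code           */
    unsigned short cmd;   /* identifier reported in the WM_COMMAND message */
} Accel;

/* Scan the table for an entry matching the keystroke. On a match, return
 * the command identifier that would be carried in the WM_COMMAND message;
 * return 0 when no accelerator matches, so that normal keyboard
 * processing can continue. */
unsigned short translate_accelerator(const Accel *table, size_t count,
                                     unsigned short key, unsigned char mods)
{
    for (size_t i = 0; i < count; ++i) {
        unsigned char required = table[i].fVirt & (FSHIFT | FCONTROL | FALT);
        if (table[i].key == key && required == mods)
            return table[i].cmd;
    }
    return 0;
}
```

An entry such as `{ FVIRTKEY | FCONTROL, 'S', ID_FILE_SAVE }` would then map CTRL+S to a hypothetical `ID_FILE_SAVE` command identifier.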
Accelerator tables exist at two different levels. The system maintains a single, system-wide accelerator table that applies to all applications. An application cannot modify the system accelerator table. For a description of the accelerators provided by the system accelerator table, see [Accelerator Keystroke Assignments](#accelerator-keystroke-assignments).
The system also maintains accelerator tables for each application. An application can define any number of accelerator tables for use with its own windows. A unique 32-bit handle (**HACCEL**) identifies each table. However, only one accelerator table can be active at a time for a specified thread. The handle to the accelerator table passed to the [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) function determines which accelerator table is active for a thread. The active accelerator table can be changed at any time by passing a different accelerator-table handle to **TranslateAccelerator**.
## Accelerator-Table Creation
Several steps are required to create an accelerator table for an application. First, a resource compiler is used to create accelerator-table resources and to add them to the application's executable file. At run time, the [**LoadAccelerators**](/windows/desktop/api/Winuser/nf-winuser-loadacceleratorsa) function is used to load the accelerator table into memory and retrieve the handle to the accelerator table. This handle is passed to the [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) function to activate the accelerator table.
An accelerator table can also be created for an application at run time by passing an array of [**ACCEL**](/windows/win32/api/winuser/ns-winuser-accel) structures to the [**CreateAcceleratorTable**](/windows/desktop/api/Winuser/nf-winuser-createacceleratortablea) function. This method supports user-defined accelerators in the application. Like the [**LoadAccelerators**](/windows/desktop/api/Winuser/nf-winuser-loadacceleratorsa) function, **CreateAcceleratorTable** returns an accelerator-table handle that can be passed to [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) to activate the accelerator table.
The system automatically destroys accelerator tables loaded by [**LoadAccelerators**](/windows/desktop/api/Winuser/nf-winuser-loadacceleratorsa) or created by [**CreateAcceleratorTable**](/windows/desktop/api/Winuser/nf-winuser-createacceleratortablea). However, an application can free resources while it is running by destroying accelerator tables no longer needed by calling the [**DestroyAcceleratorTable**](/windows/desktop/api/Winuser/nf-winuser-destroyacceleratortable) function.
An existing accelerator table can be copied and modified. The existing accelerator table is copied by using the [**CopyAcceleratorTable**](/windows/desktop/api/Winuser/nf-winuser-copyacceleratortablea) function. After the copy is modified, a handle to the new accelerator table is retrieved by calling [**CreateAcceleratorTable**](/windows/desktop/api/Winuser/nf-winuser-createacceleratortablea). Finally, the handle is passed to [**TranslateAccelerator**](/windows/desktop/api/Winuser/nf-winuser-translateacceleratora) to activate the new table.
## Accelerator Keystroke Assignments
An ASCII character code or a virtual-key code can be used to define the accelerator. An ASCII character code makes the accelerator case sensitive. Thus, using the ASCII "C" character defines the accelerator as ALT+C rather than ALT+c. However, case-sensitive accelerators can be confusing to use. For example, the ALT+C accelerator will be generated if the CAPS LOCK key is down or if the SHIFT key is down, but not if both are down.
Typically, accelerators don't need to be case sensitive, so most applications use virtual-key codes for accelerators rather than ASCII character codes.
Avoid accelerators that conflict with an application's menu mnemonics, because the accelerator overrides the mnemonic, which can confuse the user. For more information about menu mnemonics, see [Menus](menus.md).
If an application defines an accelerator that is also defined in the system accelerator table, the application-defined accelerator overrides the system accelerator, but only within the context of the application. Avoid this practice, however, because it prevents the system accelerator from performing its standard role in the user interface. The system-wide accelerators are described in the following list:
| Key combination  | Action                                                                                                  |
|------------------|---------------------------------------------------------------------------------------------------------|
| ALT+ESC | Switches to the next application. |
| ALT+F4 | Closes an application or a window. |
| ALT+HYPHEN | Opens the **Window** menu for a document window. |
| ALT+PRINT SCREEN | Copies an image in the active window onto the clipboard. |
| ALT+SPACEBAR | Opens the **Window** menu for the application's main window. |
| ALT+TAB | Switches to the next application. |
| CTRL+ESC | Switches to the **Start** menu. |
| CTRL+F4 | Closes the active group or document window. |
| F1 | Starts the application's help file, if one exists. |
| PRINT SCREEN | Copies an image on the screen onto the clipboard. |
| SHIFT+ALT+TAB | Switches to the previous application. The user must press and hold down ALT+SHIFT while pressing TAB. |
## Accelerators and Menus
Using an accelerator is the same as choosing a menu item: Both actions cause the system to send a [**WM\_COMMAND**](wm-command.md) or [**WM\_SYSCOMMAND**](wm-syscommand.md) message to the corresponding window procedure. The **WM\_COMMAND** message includes an identifier that the window procedure examines to determine the source of the message. If an accelerator generated the **WM\_COMMAND** message, the identifier is that of the accelerator. Similarly, if a menu item generated the **WM\_COMMAND** message, the identifier is that of the menu item. Because an accelerator provides a shortcut for choosing a command from a menu, an application usually assigns the same identifier to the accelerator and the corresponding menu item.
An application processes an accelerator [**WM\_COMMAND**](wm-command.md) message in exactly the same way as the corresponding menu item **WM\_COMMAND** message. However, the **WM\_COMMAND** message contains a flag that specifies whether the message originated from an accelerator or a menu item, in case accelerators must be processed differently from their corresponding menu items. The [**WM\_SYSCOMMAND**](wm-syscommand.md) message does not contain this flag.
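For [**WM\_COMMAND**](wm-command.md), the documented packing is that the low-order word of *wParam* carries the identifier and the high-order word carries the source flag (1 for an accelerator, 0 for a menu item). The sketch below models that unpacking outside windows.h, so the macro and type names are local stand-ins rather than the SDK's own:

```c
/* Local stand-ins for the LOWORD/HIWORD macros from windows.h. */
#define LO_WORD(w) ((unsigned)(w) & 0xFFFFu)
#define HI_WORD(w) (((unsigned)(w) >> 16) & 0xFFFFu)

typedef enum { FROM_MENU = 0, FROM_ACCELERATOR = 1 } CommandSource;

/* Unpack a WM_COMMAND wParam: report where the command came from and
 * return the identifier of the menu item or accelerator that caused it. */
unsigned handle_wm_command(unsigned long wParam, CommandSource *source)
{
    *source = HI_WORD(wParam) ? FROM_ACCELERATOR : FROM_MENU;
    return LO_WORD(wParam);
}
```

A window procedure can branch on the returned source when an accelerator must be processed differently from its corresponding menu item.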
The identifier determines whether an accelerator generates a [**WM\_COMMAND**](wm-command.md) or [**WM\_SYSCOMMAND**](wm-syscommand.md) message. If the identifier has the same value as a menu item in the System menu, the accelerator generates a **WM\_SYSCOMMAND** message. Otherwise, the accelerator generates a **WM\_COMMAND** message.
If an accelerator has the same identifier as a menu item and the menu item is grayed or disabled, the accelerator is disabled and does not generate a [**WM\_COMMAND**](wm-command.md) or [**WM\_SYSCOMMAND**](wm-syscommand.md) message. Also, an accelerator does not generate a command message if the corresponding window is minimized.
When the user uses an accelerator that corresponds to a menu item, the window procedure receives the [**WM\_INITMENU**](wm-initmenu.md) and [**WM\_INITMENUPOPUP**](wm-initmenupopup.md) messages as though the user had selected the menu item. For information about how to process these messages, see [Menus](menus.md).
An accelerator that corresponds to a menu item should be included in the text of the menu item.
## UI State
Windows enables applications to hide or show various features in its UI. These settings are known as the UI state. The UI state includes the following settings:
- focus indicators (such as focus rectangles on buttons)
- keyboard accelerators (indicated by underlines in control labels)
A window can send messages to request a change in the UI state, can query the UI state, or enforce a certain state for its child windows. These messages are as follows.
| Message | Description |
|-----------------------------------------------|--------------------------------------------|
| [**WM\_CHANGEUISTATE**](wm-changeuistate.md) | Indicates that the UI state should change. |
| [**WM\_QUERYUISTATE**](wm-queryuistate.md) | Retrieves the UI state for a window. |
| [**WM\_UPDATEUISTATE**](wm-updateuistate.md) | Changes the UI state. |
By default, all child windows of a top-level window are created with the same UI state as their parent.
The system handles the UI state for controls in dialog boxes. At dialog box creation, the system initializes the UI state accordingly. All child controls inherit this state. After the dialog box is created, the system monitors the user's keystrokes. If the UI state settings are hidden and the user navigates using the keyboard, the system updates the UI state. For example, if the user presses the Tab key to move the focus to the next control, the system calls [**WM\_CHANGEUISTATE**](wm-changeuistate.md) to make the focus indicators visible. If the user presses the Alt key, the system calls **WM\_CHANGEUISTATE** to make the keyboard accelerators visible.
If a control supports navigation between the UI elements it contains, it can update its own UI state. The control can call [**WM\_QUERYUISTATE**](wm-queryuistate.md) to retrieve and cache the initial UI state. Whenever the control receives an [**WM\_UPDATEUISTATE**](wm-updateuistate.md) message, it can update its UI state and send a [**WM\_CHANGEUISTATE**](wm-changeuistate.md) message to its parent. Each window will continue to send the message to its parent until it reaches the top-level window. The top-level window sends the **WM\_UPDATEUISTATE** message to the windows in the window tree. If a window does not pass on the **WM\_CHANGEUISTATE** message, it will not reach the top-level window and the UI state will not be updated.

# cvs-tsk-pull-test-results
Service for feeding test results from DynamoDB into DynamicsCE.
General overview:
- ATF engineer performs a vehicle test and records the outcome in the VTA app.
- VTA app inserts the data into the DynamoDB table via the test-results lambda.
- The table is configured to stream activity to the pull-test-results lambda.
- Pull-test-results lambda extracts the results from the data and sends them as events to EventBridge.
- An EventBridge rule inserts the results into DynamicsCE via an OData endpoint.
## Dependencies
The project runs on node 14.x with typescript and serverless framework. For further details about project dependencies, please refer to the `package.json` file.
[nvm](https://github.com/nvm-sh/nvm/blob/master/README.md) is used to manage node versions, with per-project configuration in an `.nvmrc` file.
## Running the project
Before running the project, the dependencies need to be installed using `npm install`. Once the dependencies are installed, you will be required to copy the `.env.example` file to `.env.local` in the root of the project. See these for information about [variables](https://www.serverless.com/framework/docs/providers/aws/guide/variables/) and [environment variables](https://www.serverless.com/framework/docs/environment-variables/) with serverless.
Please note that multiple `.env` files can be created, one per environment. Our current development environment is 'local'.
The application runs on port `:3001` by default.
## Packaging the project locally
The `package` npm script takes a ZIP_NAME variable. To set the variable when running manually use `ZIP_NAME=zipName npm run package`. This will produce a file called zipName.zip.
### Environments
We use the `NODE_ENV` environment variable to set the stage. `NODE_ENV` is set through npm scripts (package.json) to load the relevant `.env.<NODE_ENV>` file from the root folder into the `serverless.yml`.
If no `NODE_ENV` value is provided when running the scripts, it defaults to 'local' and the `.env.local` config file is used.
The default values for 'stage' and 'region' are `'local'`. Please refer to the values provided in the `serverless.yml` file.
The following values can be provided when running the scripts with `NODE_ENV`:
```ts
// ./.env.<NODE_ENV> files
'local'; // used for local development
'development'; // used for development staging, should we wish to require external services
'test'; // used during test scripts, where local services and mocks can be used in conjunction
```
```ts
/** Running serverless offline as an example for a specific stage - 'local'.
* Stage 'local' will be injected in the serverless.yml
**/
NODE_ENV=local serverless offline
```
Further details about environment setup can be found in the provided documentation and `.env.example` file.
All secrets will be stored in `AWS Secrets Manager`.
### Scripts
The following scripts are available, for further information please refer to the project `package.json` file:
- <b>start</b>: `npm start` - _launch serverless offline service_
- <b>dev</b>: `npm run dev` - _run the service and the unit tests in parallel, in_ `--watch` _mode with live reload_
- <b>test</b>: `npm t` - _execute the unit test suite_
- <b>build</b>: `npm run build` - _build the project, transpiling typescript to javascript_
- <b>production build</b>: `npm run package` - _generate the project zip file ready for deployment_
### Offline
Serverless-offline is used to run the project locally. Use `npm run start` script to do so. The endpoints below are available.
```
(POST) http://localhost:3002/2015-03-31/functions/cvs-tsk-pull-test-results-local-pullTestResults/invocations
(POST) http://localhost:3002/2014-11-13/functions/cvs-tsk-pull-test-results-local-pullTestResults/invoke-async/
```
The function expects a DynamoDB stream event.
```json
{
"eventID": "mock-id",
"eventName": "MODIFY",
"eventVersion": "1.1",
"eventSource": "aws:dynamodb",
"awsRegion": "eu-west-1",
"dynamodb": {
"ApproximateCreationDateTime": 1641807422,
"Keys": {
"vin": {
"S": "123456"
},
"testResultId": {
"S": "123"
}
},
"NewImage": {
...
},
"OldImage": {
...
},
"SequenceNumber": "123456789",
"SizeBytes": 3701,
"StreamViewType": "NEW_AND_OLD_IMAGES"
},
"eventSourceARN": "mock-arn"
}
```
Complete examples can be found in the `./tests/unit/data` folder.
### Debugging
Existing configuration to debug the running service has been made available for vscode, please refer to `.vscode/launch.json`. Two jest configurations are also provided which will allow debugging a test or multiple tests.
## Environmental variables
The following variables are supported in the `.env.<NODE_ENV>` file.
- AWS_PROVIDER_PROFILE=default
- AWS_REGION=eu-west-1
- AWS_SERVER_PORT=3009
- AWS_EVENT_BUS_NAME=default
- AWS_EVENT_BUS_SOURCE=eventBusName
## Testing
### Unit
Jest is used for unit testing. Jest mocks have been added for external services and other dependencies when needed. Debugging tests is possible using the two options configured in ./vscode/launch.json `Jest Debug all tests` and `Jest Debug opened file`. Using the Jest vscode extension is also a very good option. Please refer to the [Jest documentation](https://jestjs.io/docs/en/getting-started) for further details.
### Integration
TBC
## Infrastructure
### Release
Releases (tag, release notes, changelog, GitHub release, assets) are automatically managed by [semantic-release](https://semantic-release.gitbook.io/semantic-release/) when pushing (or merging) to the protected `develop` branch. The [semver](https://semver.org/) convention is followed.
Please be familiar with the conventional commit format, as described in the Contributing section below.
Default preset used is angular for conventional commits, please see the [angular conventions](https://github.com/conventional-changelog/commitlint/tree/master/%40commitlint/config-conventional#type-enum).
The `<type>` `'breaking'` in the commit message will trigger a major version bump as well as any of the following text contained in the commit body: `"BREAKING CHANGE", "BREAKING CHANGES", "BREAKING_CHANGES", "BREAKING", "BREAKING_CHANGE"`. Please refer to the `.releaserc.json` file for the full configuration.
The script `npm run release` will automatically trigger the release in CI. To test the release manually, the flags `--dry-run --no-ci` can be passed to the release script.
Publishing and artifacts are managed separately by the pipeline.
## Contributing
To facilitate the standardisation of the code, a few helpers and tools have been adopted for this repository.
### External dependencies
The project has multiple hooks configured using [husky](https://github.com/typicode/husky#readme) which will execute the following scripts: `audit`, `lint`, `build`, `test`, and format your code with [eslint](https://github.com/typescript-eslint/typescript-eslint#readme) and [prettier](https://github.com/prettier/prettier).
You will be required to install [git-secrets](https://github.com/awslabs/git-secrets) (_the brew approach is recommended_) and the DVSA [repo-security-scanner](https://github.com/UKHomeOffice/repo-security-scanner), which runs against your git log history to find accidentally committed passwords and private keys.
We follow the [conventional commit format](https://www.conventionalcommits.org/en/v1.0.0/) when we commit code to the repository and follow the [angular convention](https://github.com/conventional-changelog/commitlint/tree/master/%40commitlint/config-conventional#type-enum).
The type is mandatory and must be all lowercase.
The scope of your commit is also mandatory; it must include your ticket number and be all lowercase. The format for the ticket number can be set in the commitlint.config.js file.
```js
// Please see /commitlint.config.js for customised format
type(scope?): subject
// examples
'chore(cvsb-1234): my commit msg' // pass
'CHORE(cvsb-1234): my commit msg' // will fail
```
### Code standards
#### Toolings
The code uses [eslint](https://eslint.org/docs/user-guide/getting-started), [typescript clean code standards](https://github.com/labs42io/clean-code-typescript) as well as sonarqube for static analysis.
SonarQube is available locally, please follow the instructions below if you wish to run the service locally (brew is the preferred approach):
- _Brew_:
- Install sonarqube using brew
- Change `sonar.host.url` to point to localhost, by default, sonar runs on `http://localhost:9000`
- run the sonar server `sonar start`, then perform your analysis `npm run sonar-scanner`
- _Manual_:
  - Add sonar-scanner to your environment variables: in your `_profile` file, add the line `export PATH=<PATH_TO_SONAR_SCANNER>/sonar-scanner-3.3.0.1492-macosx/bin:$PATH`
  - Start the SonarQube server: `cd <PATH_TO_SONARQUBE_SERVER>/bin/macosx-universal-64 && ./sonar.sh start`
- In the microservice folder run the command: `npm run sonar-scanner`

# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.8.3] - 20-12-2021
### Updated
- Development environment settings
## [1.8.2] - 19-12-2021
### Added
- Error handling in the Storage Service
### Updated
- Resubscription logic in the abstract MQTT client
- Database `TIMESTAMP` types (changed to `TIMESTAMP WITH TIME ZONE`)
- Protobuf contracts
## [1.8.1] - 06-12-2021
### Updated
- Project dependencies
- .NET runtime version (upgrade from .NET 5 to .NET 6)
## [1.8.0] - 07-08-2021
### Updated
- Project dependencies
- Router contract package
## [1.7.0] - 27-05-2021
### Updated
- Project dependencies
### Removed
- Gateway controller
- Router integration
- Authorization services
## [1.6.4] - 27-05-2021
### Updated
- Project dependencies
## [1.6.3] - 24-05-2021
### Updated
- Project dependencies
### Removed
- Router project
- Unused code
- Unused dependencies
## [1.6.2] - 13-04-2021
### Added
- Subscription count tracking
- Log scopes to the Live Data Service log template
- Subscription management service in the Live Data Service
### Updated
- Project dependencies
- Live Data Service log statement template
## [1.6.1] - 04-04-2021
### Added
- Exception handling to services
- Fatal logging in the background service when exceptions are not handled
- Timestamps to the live data service log statements
### Updated
- Access modifier of the background service execute method
- Live data service logging statements
- Project dependencies
## [1.6.0] - 16-03-2021
### Added
- Caching of trigger actions in the Trigger Service
- Additional statistics types
### Updated
- Statistics logging
- Logging in the Storage Service
- Router initialization routine
- The database function `generic_getblobs`
### Removed
- Unused code
- TriggerInvocations table
- TriggerInvocation pgsql functions
## [1.5.0] - 11-03-2021
### Added
- Ability to disable data reloads in the message router
- Sensor commands when removing a user
### Updated
- Trigger related repositories
- Trigger invocation creation database function
- Project dependencies
### Removed
- The ability of the Network API to create trigger invocations
- Unused code
## [1.4.3] - 01-03-2021
### Added
- Storage histogram
- Router command statistics/monitoring
- Trigger execution histogram
- Management SQL functions
### Updated
- Improved router configuration
- Git ignore definition
- Live data service dependencies
- Development configurations
- Network API request auditing
### Removed
- Unused code
- Unused configuration
## [1.4.2] - 26-02-2021
### Added
- Timestamp variable to trigger messages
- Data contexts using native Npgsql functionality
### Updated
- Retarget the authorization context to Npgsql
- Retarget the networking context to Npgsql
- Logging package references
- Hosting package references
### Removed
- Remove references to Entity Framework Core
- Unused code
## [1.4.1] - 22-02-2021
### Updated
- Package dependencies
- Live data service authorization logging
- Live data authorization flow
### Removed
- Live data sensor links
- Networking database integration in the live data service
## [1.4.0] - 16-02-2021
### Added
- Routing cache
- Internal queue metrics
### Updated
- Command subscription QoS levels (from 1 to 3)
- Load tests for new caches
### Removed
- Ingress projects
- API projects (except the network API)
- Common projects (caching)
## [1.3.0] - 06-02-2021
### Updated
- Measurement query result JSON deserialization
- MeasurementQueryResult model annotations
### Removed
- Unused configuration files
- Unused deployment configuration
## [1.2.2] - 05-02-2021
### Updated
- Hardcode swagger and/or open API schemes
### Added
- Set HTTP as possible scheme
- Set HTTPS as possible scheme
## [1.2.1] - 05-02-2021
### Added
- HTTPS swagger support
### Updated
- Network API swagger
- Auth API swagger
- Data API swagger
- Dashboard API swagger
## [1.2.0] - 05-02-2021
### Updated
- Encoding storage conversion
- Sensor creation flow: publish sensor keys on the MQTT broker
### Added
- Ingress router request metrics
- Egress router request metrics
## [1.1.2] - 26-01-2021
### Updated
- Script directory name
- MQTT service project files
- CI/CD pipelines
### Added
- Security policy
### Removed
- CAKE build system
- Unused API code
## [1.1.1] - 26-01-2021
### Updated
- Update the versioning schema:
  - Update version of the MQTT service
  - Update the version of the core APIs
## [1.1.0] - 26-01-2021
### Added
- Network:
  - Network project setup
  - Contracts project
- Solution folders for:
  - Database project
  - General files
  - Message router
  - Network API + Gateway
- Networking database:
  - Trigger administration
  - Sensor link administration
  - Live data service
- Platform:
  - Moved MQTT ingress service
  - Forward MQTT ingress to the HTTP gateway
- API:
  - Refactor SensateService into SensateIoT.API
  - Upgrade APIs to .NET 5
  - Improve swagger documentation
### Removed
- Trigger administration (moved to network project)
- Network API (moved to network project)
- Trigger service (moved to network project)
- Live data service (moved to network project)
## [1.0.0] - 09-10-2020
### Added
- APIs:
  - Network API
  - Data API
  - Authorization API
  - Blob API
  - Dashboard API
- Ingress services
- Data processing
- EU VAT computation for FR SIREN/SIRET
---
layout: blog
category: blog
title: "Adobe Gripe"
date: "2009-05-06T18:45:57+0000"
original_service: tumblr
original_url: "http://blog.benward.me/post/104255431/adobe-gripe-in-the-fine-vein-of-adobe"
tumblr_post_type: photo
atomid: "http://blog.benward.me/post/104255431/adobe-gripe-in-the-fine-vein-of-adobe"
---
<figure class="photo">
<img src="http://benward.me/res/tumblr/media/104255431/0.jpg" alt="">
</figure>
In the fine vein of [Adobe Gripes](http://adobegripes.tumblr.com): What the cocking fuck shit is the default application icon doing in the “Save for Web and Devices” dialog? Would that really have been so hard to remove?
(Photoshop CS3)
---
title: Install new R packages
description: Learn how to use sqlmlutils to install new R packages to an instance of SQL Server Machine Learning Services.
ms.prod: sql
ms.technology: machine-learning
ms.date: 06/04/2020
ms.topic: how-to
author: garyericson
ms.author: garye
ms.reviewer: davidph
ms.custom: seo-lt-2019
monikerRange: ">=sql-server-ver15||>=sql-server-linux-ver15||=azuresqldb-mi-current||=sqlallproducts-allversions"
---
# Install new R packages with sqlmlutils
[!INCLUDE [SQL Server 2019 SQL MI](../../includes/applies-to-version/sqlserver2019-asdbmi.md)]
::: moniker range=">=sql-server-ver15||>=sql-server-linux-ver15||=sqlallproducts-allversions"
This article describes how to use functions in the [**sqlmlutils**](https://github.com/Microsoft/sqlmlutils) package to install new R packages to an instance of [Machine Learning Services on SQL Server](../sql-server-machine-learning-services.md) and on [Big Data Clusters](../../big-data-cluster/machine-learning-services.md). The packages you install can be used in R scripts running in-database using the [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) T-SQL statement.
> [!NOTE]
> The **sqlmlutils** package described in this article is used for adding R packages to SQL Server 2019 or later. For SQL Server 2017 and earlier, see [Install packages with R tools](https://docs.microsoft.com/sql/machine-learning/package-management/install-r-packages-standard-tools?view=sql-server-2017).
::: moniker-end
::: moniker range="=azuresqldb-mi-current||=sqlallproducts-allversions"
This article describes how to use functions in the [**sqlmlutils**](https://github.com/Microsoft/sqlmlutils) package to install new R packages to an instance of [Azure SQL Managed Instance Machine Learning Services](/azure/azure-sql/managed-instance/machine-learning-services-overview). The packages you install can be used in R scripts running in-database using the [sp_execute_external_script](https://docs.microsoft.com/sql/relational-databases/system-stored-procedures/sp-execute-external-script-transact-sql) T-SQL statement.
::: moniker-end
## Prerequisites
- Install [R](https://www.r-project.org) and [RStudio Desktop](https://www.rstudio.com/products/rstudio/download/) on the client computer you use to connect to SQL Server. You can use any R IDE for running scripts, but this article assumes RStudio.
- Install [Azure Data Studio](https://docs.microsoft.com/sql/azure-data-studio/what-is) on the client computer you use to connect to SQL Server. You can use other database management or query tools, but this article assumes Azure Data Studio.
### Other considerations
- Package installation is specific to the SQL instance, database, and user you specify in the connection information you provide to **sqlmlutils**. To use the package in multiple SQL instances or databases, or for different users, you'll need to install the package for each one. The exception is that if the package is installed by a member of `dbo`, the package is *public* and is shared with all users. If a user installs a newer version of a public package, the public package is not affected but that user will have access to the newer version.
- R script running in SQL Server can use only packages installed in the default instance library. SQL Server cannot load packages from external libraries, even if that library is on the same computer. This includes R libraries installed with other Microsoft products.
- On a hardened SQL Server environment, you might want to avoid the following:
- Packages that require network access
- Packages that require elevated file system access
- Packages used for web development or other tasks that don't benefit by running inside SQL Server
## Install sqlmlutils on the client computer
To use **sqlmlutils**, you first need to install it on the client computer you use to connect to SQL Server.
The **sqlmlutils** package depends on the **RODBCext** package, and **RODBCext** depends on a number of other packages. The following procedures install all of these packages in the correct order.
### Install sqlmlutils online
If the client computer has Internet access, you can download and install **sqlmlutils** and its dependent packages online.
1. Download the latest **sqlmlutils** file (`.zip` for Windows, `.tar.gz` for Linux) from https://github.com/Microsoft/sqlmlutils/tree/master/R/dist to the client computer. Don't expand the file.
1. Open a **Command Prompt** and run the following commands to install the packages **RODBCext** and **sqlmlutils**. Substitute the path to the **sqlmlutils** file you downloaded. The **RODBCext** package is found online and installed.
::: moniker range=">=sql-server-ver15||=sqlallproducts-allversions"
```console
R -e "install.packages('RODBCext', repos='https://mran.microsoft.com/snapshot/2019-02-01/')"
R CMD INSTALL sqlmlutils_0.7.1.zip
```
::: moniker-end
::: moniker range=">=sql-server-linux-ver15||=sqlallproducts-allversions"
```console
R -e "install.packages('RODBCext', repos='https://mran.microsoft.com/snapshot/2019-02-01/')"
R CMD INSTALL sqlmlutils_0.7.1.tar.gz
```
::: moniker-end
### Install sqlmlutils offline
If the client computer doesn't have an Internet connection, you need to download the packages **RODBCext** and **sqlmlutils** in advance using a computer that does have Internet access. You then can copy the files to a folder on the client computer and install the packages offline.
The **RODBCext** package has a number of dependent packages, and identifying all dependencies for a package gets complicated. We recommend that you use [**miniCRAN**](https://andrie.github.io/miniCRAN/) to create a local repository folder for the package that includes all the dependent packages.
For more information, see [Create a local R package repository using miniCRAN](create-a-local-package-repository-using-minicran.md).
The **sqlmlutils** package consists of a single file that you can copy to the client computer and install.
On a computer with Internet access:
1. Install **miniCRAN**. See [Install miniCRAN](create-a-local-package-repository-using-minicran.md#install-minicran) for details.
1. In RStudio, run the following R script to create a local repository of the package **RODBCext**. This example assumes the repository will be created in the folder `rodbcext`.
::: moniker range=">=sql-server-ver15||=sqlallproducts-allversions"
```R
CRAN_mirror <- c(CRAN = "https://mran.microsoft.com/snapshot/2019-02-01/")
local_repo <- "rodbcext"
pkgs_needed <- "RODBCext"
pkgs_expanded <- pkgDep(pkgs_needed, repos = CRAN_mirror);
makeRepo(pkgs_expanded, path = local_repo, repos = CRAN_mirror, type = "win.binary", Rversion = "3.5");
```
::: moniker-end
::: moniker range=">=sql-server-linux-ver15||=sqlallproducts-allversions"
```R
CRAN_mirror <- c(CRAN = "https://mran.microsoft.com/snapshot/2019-02-01/")
local_repo <- "rodbcext"
pkgs_needed <- "RODBCext"
pkgs_expanded <- pkgDep(pkgs_needed, repos = CRAN_mirror);
makeRepo(pkgs_expanded, path = local_repo, repos = CRAN_mirror, type = "source", Rversion = "3.5");
```
::: moniker-end
For the `Rversion` value, use the version of R installed on SQL Server. To verify the installed version, use the following T-SQL command.
```sql
EXECUTE sp_execute_external_script @language = N'R'
, @script = N'print(R.version)'
```
1. Download the latest **sqlmlutils** file (`.zip` for Windows, `.tar.gz` for Linux) from [https://github.com/Microsoft/sqlmlutils/tree/master/R/dist](https://github.com/Microsoft/sqlmlutils/tree/master/R/dist). Don't expand the file.
1. Copy the entire **RODBCext** repository folder and the **sqlmlutils** file to the client computer.
On the client computer you use to connect to SQL Server:
1. Open a command prompt.
1. Run the following commands to install **RODBCext** and then **sqlmlutils**. Substitute the full paths to the **RODBCext** repository folder and the **sqlmlutils** file you copied to this computer.
::: moniker range=">=sql-server-ver15||=sqlallproducts-allversions"
```console
R -e "install.packages('RODBCext', repos='rodbcext')"
R CMD INSTALL sqlmlutils_0.7.1.zip
```
::: moniker-end
::: moniker range=">=sql-server-linux-ver15||=sqlallproducts-allversions"
```console
R -e "install.packages('RODBCext', repos='rodbcext')"
R CMD INSTALL sqlmlutils_0.7.1.tar.gz
```
::: moniker-end
## Add an R package on SQL Server
In the following example, you'll add the [**glue**](https://cran.r-project.org/web/packages/glue/) package to SQL Server.
### Add the package online
If the client computer you use to connect to SQL Server has Internet access, you can use **sqlmlutils** to find the **glue** package and any dependencies over the Internet, and then install the package to a SQL Server instance remotely.
1. On the client computer, open RStudio and create a new **R Script** file.
1. Use the following R script to install the **glue** package using **sqlmlutils**. Substitute your own SQL Server database connection information.
```R
library(sqlmlutils)
connection <- connectionInfo(
server = "server",
database = "database",
uid = "username",
pwd = "password")
sql_install.packages(connectionString = connection, pkgs = "glue", verbose = TRUE, scope = "PUBLIC")
```
> [!TIP]
> The **scope** can be either **PUBLIC** or **PRIVATE**. Public scope is useful for the database administrator to install packages that all users can use. Private scope makes the package available only to the user who installs it. If you don't specify the scope, the default scope is **PRIVATE**.
### Add the package offline
If the client computer doesn't have an Internet connection, you can use **miniCRAN** to download the **glue** package using a computer that does have Internet access. You then copy the package to the client computer where you can install the package offline.
See [Install miniCRAN](create-a-local-package-repository-using-minicran.md#install-minicran) for information on installing **miniCRAN**.
On a computer with Internet access:
1. Run the following R script to create a local repository for **glue**. This example creates the repository folder in `c:\downloads\glue`.
::: moniker range=">=sql-server-ver15||=sqlallproducts-allversions"
```R
CRAN_mirror <- c(CRAN = "https://cran.microsoft.com")
local_repo <- "c:/downloads/glue"
pkgs_needed <- "glue"
pkgs_expanded <- pkgDep(pkgs_needed, repos = CRAN_mirror);
makeRepo(pkgs_expanded, path = local_repo, repos = CRAN_mirror, type = "win.binary", Rversion = "3.5");
```
::: moniker-end
::: moniker range=">=sql-server-linux-ver15||=sqlallproducts-allversions"
```R
CRAN_mirror <- c(CRAN = "https://cran.microsoft.com")
local_repo <- "c:/downloads/glue"
pkgs_needed <- "glue"
pkgs_expanded <- pkgDep(pkgs_needed, repos = CRAN_mirror);
makeRepo(pkgs_expanded, path = local_repo, repos = CRAN_mirror, type = "source", Rversion = "3.5");
```
::: moniker-end
For the `Rversion` value, use the version of R installed on SQL Server. To verify the installed version, use the following T-SQL command.
```sql
EXECUTE sp_execute_external_script @language = N'R'
, @script = N'print(R.version)'
```
1. Copy the entire **glue** repository folder (`c:\downloads\glue`) to the client computer. For example, copy it to the folder `c:\temp\packages\glue`.
On the client computer:
1. Open RStudio and create a new **R Script** file.
1. Use the following R script to install the **glue** package using **sqlmlutils**. Substitute your own SQL Server database connection information (if you don't use Windows Authentication, add `uid` and `pwd` parameters).
```R
library(sqlmlutils)
connection <- connectionInfo(
server= "yourserver",
database = "yourdatabase")
localRepo = "c:/temp/packages/glue"
sql_install.packages(connectionString = connection, pkgs = "glue", verbose = TRUE, scope = "PUBLIC", repos=paste0("file:///",localRepo))
```
> [!TIP]
> The **scope** can be either **PUBLIC** or **PRIVATE**. Public scope is useful for the database administrator to install packages that all users can use. Private scope makes the package available only to the user who installs it. If you don't specify the scope, the default scope is **PRIVATE**.
## Use the package
Once the **glue** package is installed, you can use it in an R script in SQL Server with the T-SQL **sp_execute_external_script** command.
1. Open Azure Data Studio and connect to your SQL Server database.
1. Run the following command:
```sql
EXECUTE sp_execute_external_script @language = N'R'
, @script = N'
library(glue)
name <- "Fred"
birthday <- as.Date("2020-06-14")
text <- glue(''My name is {name} '',
''and my birthday is {format(birthday, "%A, %B %d, %Y")}.'')
print(text)
';
```
**Results**
```text
My name is Fred and my birthday is Sunday, June 14, 2020.
```
## Remove the package
If you would like to remove the **glue** package, run the following R script. Use the same **connection** variable you defined earlier.
```R
sql_remove.packages(connectionString = connection, pkgs = "glue", scope = "PUBLIC")
```
## Next steps
- For information about installed R packages, see [Get R package information](r-package-information.md)
- For help in working with R packages, see [Tips for using R packages](tips-for-using-r-packages.md)
- For information about installing Python packages, see [Install Python packages with pip](install-additional-python-packages-on-sql-server.md)
- For more information about SQL Server Machine Learning Services, see [What is SQL Server Machine Learning Services (Python and R)?](../sql-server-machine-learning-services.md)
| 52.167286 | 571 | 0.734483 | eng_Latn | 0.9354 |
# Hepatitis-Identification
AI course assignment for Pak Azkario, part 1.
A demo application implementing a CLIPS-based expert system in the healthcare domain.
Tech Stack:
1. CLIPS: expert system implementation
2. Python (Tkinter): graphical user interface
3. Clipspy: wraps CLIPS for Python
Documentation of Population
=====================================================================
<sup>back to the [Repo Structure](https://github.com/worldbank/LearningPoverty/blob/master/00_documentation/002_repo_structure/Repo_Structure.md) :leftwards_arrow_with_hook:</sup>
Dataset of late-primary-aged population. Long in countrycode and year, wide in population definitions (e.g., 10-14y, primary-aged, etc.) and subgroups (all, male, female). In units, not thousands or millions.
**Metadata** stored in this dataset:
~~~~
sources: World Bank staff estimates using the World Bank's total population and age distributions of the United Nations Population Division's World Population Prospects.
~~~~
About the **18 variables** in this dataset:
~~~~
The variables belong to the following variable classifications:
idvars valuevars traitvars
idvars: countrycode year_population
valuevars: population_fe_10 population_fe_0516 population_fe_primary population_fe_9plus population_ma_10 population_ma_0516 population_ma_primary population_ma_9plus population_all_10 population_all_0516 population_all_primary population_all_9plus population_fe_1014 population_ma_1014 population_all_1014 population_source
traitvars: population_source
. codebook, compact
Variable Obs Unique Mean Min Max Label
---------------------------------------------------------------------------------------------------------------------------------------
countrycode 13237 217 . . . WB country code (3 letters)
year_popul~n 13237 61 2020 1990 2050 Year of population
populat~e_10 11795 6832 320676.8 479 1.33e+07 Female population aged 10 (WB API)
popul~e_0516 11795 9464 3837089 5500 1.43e+08 Female population aged 05-16 (WB API)
po~e_primary 10941 8257 2210692 3477 7.53e+07 Female population primary age, country specific (WB API)
popu~e_9plus 11413 8005 1198257 967 5.14e+07 Female population aged 9 to end of primary, country specific (WB API)
populat~a_10 11795 6836 339959 492 1.42e+07 Male population aged 10 (WB API)
popul~a_0516 11795 9528 4067005 5800 1.60e+08 Male population aged 05-16 (WB API)
po~a_primary 10941 8286 2343564 3858 8.08e+07 Male population primary age, country specific (WB API)
popu~a_9plus 11413 8004 1265978 1007 5.52e+07 Male population aged 9 to end of primary, country specific (WB API)
populat~l_10 11795 7468 660635.8 971 2.75e+07 Total population aged 10 (WB API)
popul~l_0516 11795 10279 7904094 11300 3.04e+08 Total population aged 05-16 (WB API)
po~l_primary 10941 9035 4554256 7335 1.56e+08 Total population primary age, country specific (WB API)
popu~l_9plus 11413 8713 2464235 1974 1.07e+08 Total population aged 9 to end of primary, country specific (WB API)
popul~e_1014 11792 8157 1582125 2300 6.21e+07 Female population between ages 10 to 14 (WB API)
popul~a_1014 11792 8190 1676713 2300 6.72e+07 Male population between ages 10 to 14 (WB API)
popul~l_1014 11792 8890 3258838 4600 1.29e+08 Total population between ages 10 to 14 (WB API)
population~e 13237 1 . . . The source used for population variables
---------------------------------------------------------------------------------------------------------------------------------------
~~~~
| 66.84507 | 324 | 0.631479 | eng_Latn | 0.527865 |
265f56b521312b865e79f14ca83ca70f1de05e91 | 7,235 | md | Markdown | docs/internals/transforms.md | vmchale/hakaru | 78922e13876e449d6812a55a11bf84c8eb0af4d6 | [
"BSD-3-Clause"
] | 327 | 2015-01-03T08:56:51.000Z | 2022-01-24T12:12:06.000Z | docs/internals/transforms.md | zaxtax/hakaru | 03ac5b645815e99437e28d228e6c668753b2640e | [
"BSD-3-Clause"
] | 155 | 2015-05-05T17:57:22.000Z | 2022-03-30T15:43:39.000Z | docs/internals/transforms.md | zaxtax/hakaru | 03ac5b645815e99437e28d228e6c668753b2640e | [
"BSD-3-Clause"
] | 38 | 2015-01-23T16:25:37.000Z | 2021-03-14T15:09:12.000Z | # Program transformations in Hakaru
## Coalesce
Coalesce is an internal transformation that works on the untyped Hakaru AST. It
takes recursive `NAryOp` terms that have the same type and combines them into
a single term. For instance:
```
3.0 + 1.5 + 0.3
```
is parsed as:
```
NaryOp Sum [3.0, NaryOp Sum [1.5, NaryOp Sum [0.3]]]
```
which when coalesced becomes:
```
NaryOp Sum [3.0,1.5,0.3]
```
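The flattening itself is easy to model outside of Hakaru. The following is an illustrative sketch only — a toy Python encoding of n-ary nodes as `(op, args)` tuples, not Hakaru's actual AST types — showing how children that share the parent's operator get merged:

```python
def coalesce(node):
    """Flatten nested n-ary nodes that share the same operator."""
    if not (isinstance(node, tuple) and len(node) == 2):
        return node  # a literal leaf
    op, args = node
    flat = []
    for arg in (coalesce(a) for a in args):
        # A child with the same operator is merged into the parent.
        if isinstance(arg, tuple) and len(arg) == 2 and arg[0] == op:
            flat.extend(arg[1])
        else:
            flat.append(arg)
    return (op, flat)

nested = ("sum", [3.0, ("sum", [1.5, ("sum", [0.3])])])
print(coalesce(nested))  # ('sum', [3.0, 1.5, 0.3])
```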
## Optimizations
The Hakaru AST has a suite of standard compiler optimizations which have
a substantial effect on the runtime of the resulting program.
The current pipeline is described by the `optimizations` variable in
`Language.Hakaru.Syntax.Transforms`.
In order, the optimizations performed are:
1. A-normalization
2. Uniquification of variables (needed for let-floating)
3. Let-floating
4. Common subexpression elimination
5. Pruning of dead binders
6. Uniquification of variables (for the C backend)
7. Constant Propagation
Each pass is described in more detail below.
### A-normalization
Found in `Language.Hakaru.Syntax.ANF`
See **The Essence of Compiling with Continuations** by Flanagan, Sabry, Duba, and
Felleisen.
A-normalization converts expressions into *administrative normal form* (ANF).
This ensures that all intermediate values are named and all arguments to
functions or primitive operations are either literals or variables.
ANF is a common program representation for functional language compilers which
can simplify some compiler passes and make others more effective.
As an example, consider
```
(add1 (let ([x (f y)]) 5))
```
This expression in ANF looks like the following
```
(let ([x (f y)]) (add1 5))
```
which opens up the opportunity for constant folding to eliminate the `(add1 5)`
expression.
This pass exists mostly to simplify the implementation of CSE, but is useful for
other passes as well.
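The idea behind the pass can be sketched for a toy expression language (plain Python tuples, hypothetical names — this is not Hakaru's `ANF` module): every compound argument is bound to a fresh temporary, so operators only ever see literals or variables:

```python
import itertools

counter = itertools.count(1)

def anf(expr, bindings):
    """Return an atom (literal or variable name) for expr, appending
    let-bindings for every compound sub-expression it contains."""
    if not isinstance(expr, tuple):
        return expr  # already atomic
    op, *args = expr
    atoms = [anf(a, bindings) for a in args]  # normalize arguments first
    tmp = f"t{next(counter)}"
    bindings.append((tmp, (op, *atoms)))
    return tmp

bindings = []
result = anf(("add1", ("f", "y")), bindings)
print(bindings, result)
# [('t1', ('f', 'y')), ('t2', ('add1', 't1'))] t2
```

Reading the bindings list top to bottom gives the chain of lets: every intermediate value is named, which is exactly the shape CSE exploits below.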
### Uniquification
Found in `Language.Hakaru.Syntax.Uniquify`
Ensures all variables in the program have unique variable identifiers.
This is not strictly necessary, but simplifies the implementation of other
passes, several of which rely on this property.
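A sketch of the idea (again a toy `("let", var, rhs, body)` encoding, not Hakaru's representation): each binder receives a fresh identifier, and uses are renamed through an environment, so the two branches below no longer share the name `x`:

```python
import itertools

fresh = itertools.count(1)

def uniquify(expr, env=None):
    """Rename every let-binder to a fresh name, renaming its uses to match."""
    env = env or {}
    if isinstance(expr, str):
        return env.get(expr, expr)          # a variable occurrence
    if isinstance(expr, tuple) and expr[0] == "let":
        _, var, rhs, body = expr
        new = f"{var}_{next(fresh)}"
        return ("let", new, uniquify(rhs, env), uniquify(body, {**env, var: new}))
    if isinstance(expr, tuple):
        return (expr[0],) + tuple(uniquify(a, env) for a in expr[1:])
    return expr                              # a literal

# Two branches that each bind `x`:
e = ("pair", ("let", "x", 1, "x"), ("let", "x", 2, "x"))
out = uniquify(e)
print(out)  # ('pair', ('let', 'x_1', 1, 'x_1'), ('let', 'x_2', 2, 'x_2'))
```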
### Let-floating
Found in `Language.Hakaru.Syntax.Hoist`
See **Let-Floating: Moving Bindings to Give Faster Programs** (1996)
by Simon Peyton Jones, Will Partain, and André Santos.
Let-floating alters the bindings structure of the program in order to improve
performance.
Typically, this entails moving definitions into or out of lambda expressions.
When a lambda expression encodes a loop, this effectively accomplishes
loop invariant code motion.
This pass only moves definitions upward in the AST.
For the most part, we are only interested in looping constructs like `summate` and
`product`, and moving `summate` expressions out of other `summate` or `product`
expressions when they do not depend on the index.
This can radically alter the asymptotics of the resulting program, as nested
loops are converted into sequentially executed loops.
The only assumption this pass makes about the input AST is that all variable
identifiers are unique.
This is to handle the case where two branches of a match statement introduce the
same variable.
If both binders are hoisted out of the match statement, one binding will
shadow the other.
This pass, as implemented, unconditionally floats expressions to where their data
dependencies are fulfilled.
This is not safe in a general purpose language, and we may need to layer some
heuristics on top of this pass to make it less aggressive if we end up
introducing performance regressions.
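The asymptotic payoff is easiest to see operationally. In this sketch (plain Python with hypothetical names, standing in for nested `summate`/`product` terms), an inner reduction that does not mention the outer index is computed once instead of once per outer iteration — the nested loop becomes two sequential loops:

```python
def nested(n):
    # Before floating: the inner sum is recomputed for every i.
    total = 0
    for i in range(n):
        inner = sum(j * j for j in range(n))  # independent of i
        total += i + inner
    return total

def floated(n):
    # After floating: the loop-invariant sum is bound once, outside the loop.
    inner = sum(j * j for j in range(n))
    total = 0
    for i in range(n):
        total += i + inner
    return total

assert nested(100) == floated(100)  # same result; O(n^2) vs O(n) additions
```

In Hakaru terms, `inner` plays the role of a `summate` expression hoisted out of an enclosing `summate` or `product` because it does not depend on the index.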
### Common Subexpression Elimination
Found in `Language.Hakaru.Syntax.CSE`
Common subexpression elimination eliminates redundant computation by reusing
results for equivalent expressions.
The current implementation of this pass relies on the program being in ANF.
ANF simplifies the implementation of CSE greatly by ensuring all expressions are
named and that if two expressions may be shared, one of them is let-bound so
that it dominates the other.
In short, ANF reduces CSE to a simple top-down traversal of the AST.
Consider the example
```
(+ (add1 z) (add1 z))
```
Eliminating the common expression `(add1 z)` requires us to traverse the
expression in evaluation order, track expressions which have already been
evaluated, recognize when an expression is duplicated, and introduce it
with a new name that dominates all use sites of that expression.
However, an expression in ANF allows us to perform CSE simply by keeping track
of let-bound expressions and propagating those expressions downward into the
AST.
Consider the example in ANF
```
(let ([t1 (add1 z)])
(let ([t2 (add1 z)])
(+ t1 t2)))
```
To remove the common subexpression, we simply have to note that the `(add1 z)`
bound to `t2` is equivalent to the expression bound to `t1` and replace it with
the variable `t1`.
```
(let ([t1 (add1 z)])
(let ([t2 t1])
(+ t1 t2)))
```
Trivial bindings can then be eliminated, if desired, giving
```
(let ([t1 (add1 z)])
   (+ t1 t1))
```
A major goal of CSE is to clean up any work which is duplicated by the
let-floating pass.
### Pruning
Found in `Language.Hakaru.Syntax.Prune`
This is essentially a limited form of dead code elimination.
If an expression is bound to a variable which is never referenced, then that
expression never needs to be executed, as the language has no side effects.
This pass serves to clean up some of the junk introduced by other passes.
Cases which are handled
1. `(let ([x e1]) e2) => e2 if x not in fv(e2)`
2. `(let ([x e1]) x) => e1`
### Constant Propagation
Found in `Language.Hakaru.Evaluation.ConstantPropagation`
Performs simple constant propagation and constant folding.
The current implementation does not do that much work, mostly just evaluating
primitive operations when their arguments are constant.
## Unused Passes
### Loop Peeling
Found in `Language.Hakaru.Syntax.Unroll`
Loop peeling was an initial attempt at performing loop invariant code motion by
leveraging CSE to do most of the heavy lifting.
Peeling is a common strategy to make other optimization passes "loop-aware".
The idea is to peel off one iteration of a loop and then apply the existing
suite of optimizations.
Consider the following `summate` whose body `e` is some loop-invariant
computation.
```
(summate lo hi (λ x -> e))
```
After peeling we obtain
```
(if (= lo hi)
0
(let ([x lo])
(let ([t1 e])
(let ([t2 (summate (+ lo 1) hi (λ x -> e))])
(+ t1 t2)))))
```
After applying CSE, the loop invariant body is simply reused on each iteration
```
(if (= lo hi)
0
(let ([x lo])
(let ([t1 e])
(let ([t2 (summate (+ lo 1) hi (λ x -> t1))])
(+ t1 t2)))))
```
ANF ensures that all subexpressions in the `e` bound to `t1` are shareable with
the copy of `e` used in the body of the `summate`, allowing us to hoist out
subexpressions of `e` and not just the entire `summate` body.
This pass is currently disabled in favor of the let-floating pass, which does
a better job without causing an exponential blow up in code size.
Some of Hakaru's looping constructs, such as `array`, cannot be peeled, so we
cannot move loop invariant operations out of `array` statements.
This project was bootstrapped with [Create React App](https://github.com/facebook/create-react-app).
## Available Scripts
In the project directory, you can run:
### `npm start`
Runs the app in the development mode.<br>
Open [http://localhost:3000](http://localhost:3000) to view it in the browser.
The page will reload if you make edits.<br>
You will also see any lint errors in the console.
### `npm test`
Launches the test runner in the interactive watch mode.<br>
See the section about [running tests](https://facebook.github.io/create-react-app/docs/running-tests) for more information.
### `npm run build`
Builds the app for production to the `build` folder.<br>
It correctly bundles React in production mode and optimizes the build for the best performance.
The build is minified and the filenames include the hashes.<br>
Your app is ready to be deployed!
See the section about [deployment](https://facebook.github.io/create-react-app/docs/deployment) for more information.
### `npm run eject`
**Note: this is a one-way operation. Once you `eject`, you can’t go back!**
If you aren’t satisfied with the build tool and configuration choices, you can `eject` at any time. This command will remove the single build dependency from your project.
Instead, it will copy all the configuration files and the transitive dependencies (Webpack, Babel, ESLint, etc) right into your project so you have full control over them. All of the commands except `eject` will still work, but they will point to the copied scripts so you can tweak them. At this point you’re on your own.
You don’t have to ever use `eject`. The curated feature set is suitable for small and middle deployments, and you shouldn’t feel obligated to use this feature. However we understand that this tool wouldn’t be useful if you couldn’t customize it when you are ready for it.
## Learn More
You can learn more in the [Create React App documentation](https://facebook.github.io/create-react-app/docs/getting-started).
To learn React, check out the [React documentation](https://reactjs.org/).
## Steps to setup the project
npm install --global create-react-app
cd /path/to/folder
create-react-app my-hello-world
cd my-hello-world
npm start
Remove everything from App.js and add the following:
import React from 'react';
import './App.css';
export default () => (
<div className="App">
<header className="App-header">
<h1 className="App-title">Welcome To My Hello World!</h1>
</header>
<p className="App-intro">
My name is type_your_name_here. Welcome, and hello!
</p>
</div>
);
## Moving on to automated builds and deployments
npm install --global surge
npm run build && surge
yarn global add serve
serve -s build
After committing, run:
npm test
npm run build
npm install --save-dev gh-pages
npm run deploy
---
title: "Topology"
collection: notes
permalink: /notes/2019-12-31-topology
date: 2019-12-31
paperurl:
---
Basic Topology, Countability & Separation Axioms, The Tychonoff Theorems, Metrization Theorems, Complete Metric Space & Function spaces, Baire Spaces & Dimension Theory
There are several textbooks and references that I used to study this subject, which are the following.
* Topology, James Munkres, 2014.
* [Introduction to Topology, MIT OpenCourseWare](https://ocw.mit.edu/courses/mathematics/18-901-introduction-to-topology-fall-2004/index.htm).
* [Lecture notes on Topology, John Rognes, 2010](http://folk.uio.no/rognes/kurs/mat4500h10/topology.pdf).
Click here to see my notes on Topology.
[Topology Summary](http://austinyi.github.io/files/topology.pdf)
# champlain_petclinic
Champlain Final Project 1 420-N52-LA Pet Clinic repo
## Source
This project is based on the spring petclinic microservices (https://github.com/spring-petclinic/spring-petclinic-microservices) implementation.
However, only the customers, visits, vets, and api-gateway services have been retained. In addition, the
Docker setup has been changed.
## Running the project
Once you have cloned the repo (see the setup instructions below), you need to do the following:
### H2 Profile (for testing with H2 database outside of Docker)
```
./gradlew customers-service:build
java -Dspring.profiles.active=h2 -jar customers-service/build/libs/*.jar &
```
(repeat the above two commands for each service you want to compile)
Test with curl:
```
curl localhost:7003/owners | jq
```
To view H2 database in browser:
localhost:7003/h2-console
Database is: jdbc:h2:mem:customers-db (note this db is service specific)
### Docker Profile (for running with Docker or Docker-compose with a MySQL database)
```
./gradlew build
docker-compose build
docker-compose up -d
docker-compose logs -f
```
Test in browser:
```
curl localhost:8080/
```
In terminal:
Check database contents (did the script run)
```
winpty docker-compose exec mysql3 mysql -uuser -p customers-db -e "select * from owners"
winpty docker-compose exec mysql3 mysql -uuser -p customers-db -e "select * from pets"
winpty docker-compose exec mysql3 mysql -uuser -p customers-db -e "select * from types"
```
When all docker containers are up, test with curl:
```
curl localhost:8080/api/gateway/customer/owners | jq
curl localhost:8080/api/gateway/vet/vets | jq
```
## Structure
- Please following standard naming convention as stated in the 'story workflow' section, and don't forget to label your pull requests
- We are all contributing to the same root project. If you break something, it will affect everyone. So, working on your own BRANCH is mandatory.
### Project Structure
- The teams will self-name and come up with a four-letter acronym. The acronym will be called your TEAMTAG in these instructions.
- Each team is responsible for one or more microservices.
- Each team is responsible for implementing the UI for their features.
### Branch Naming
- Branches will be named according to the following convention: type/TEAMTAG-JiraID_Description
I like to break it down into 4 'folders' or types:
- feat/
- bug/
- doc/
- conf/
- After the slash, add your TEAMTAG
- After the TEAMTAG, add a slash and then the JIRA id (it will be something like CPC-4).
- The full branch name would look like this `feat/TEAMA-CPC-4_Add_Test_Scenario_New_Pet` and would be created and navigated to by executing the following command:
```
git checkout -b feat/TEAMA-CPC-4_Add_Test_Scenario_New_Pet
```
### Pull Requests (PR) Naming
- To make it so we can easily search and find pull requests we will adhere to the following standard:
```
feat(TEAMTAG-JiraID): short description
```
- In that example, you would replace TEAMTAG with your team's acronym and the JIRA-TICKET-ID with the id from Jira.
- Keep the parentheses.
- Do not include any capital letters or punctuation in the description
### Pull Request Commit Naming
- This is pretty much the exact same as the Pull Request Naming except that at the end there will be an auto-generated number in parentheses. Please don't delete it. Simply add your stuff before it.
```
feat(TEAMTAG-JiraID): short description (#420)
```
## Setup
- First create an account on GitHub
- Download git https://git-scm.com/downloads
- Go to the official/ main repo https://github.com/cgerard321/champlain_petclinic
- Click the green button 'Code', and copy the given URL
- On your file explorer, navigate to where you want the project, right-click, and select 'git bash here'
- In the terminal window, type 'git clone' and then paste the copied url. (Do not ctrl + v to paste in the git bash terminal, it does not use standard windows encoding and will add extra invisible chars to the command causing it to error out.) It will look like this:
```
git clone https://github.com/cgerard321/champlain_petclinic
```
- The repo on your computer is known as the "local"
- The repo on GitHub is known as the "remote origin" or simply "origin"
- cd into the champlain_petclinic folder on your computer
```
cd champlain_petclinic/
```
To add the main repo as an "upstream" remote, type:
```
git remote add upstream https://github.com/cgerard321/champlain_petclinic.git
```
- If we type `git remote -v` we should see 4 different connections, push and fetch for our upstream and for our origin
- Now that you have setup your clone, move on to the 'story workflow section'
## Story Workflow
- So you've setup your clone of the repo and started your first story. Now what?
- We will first navigate to our project in the file explorer, right-click, and select 'git bash here'
- In the current command line, you should see in parentheses, the branch you are currently on. We want to start this 'new story process' from our origin's main branch.
```
Christine@DESKTOP-2VF5PQD MINGW64 /e/champlain_petclinic (main)
```
- If it says main, great. Skip this next line. If not, type:
```
git checkout main
```
- This will simply transfer us to our origin's main branch
- Next, we will want to update our local project with any code our fellow devs have pushed while we were gone. To do this we must first 'download' the code using the following command:
```
git fetch origin main
```
- We are telling git to download the latest stuff from the main branch on our remote
- Then we want to actually start our story fresh with that code, so we will reset our local environment with that newly fetched code:
```
git reset --hard origin/main
```
- It is also important to note this will reset any uncommitted changes you've made, so keep that in mind. If you are following along and not starting a story from scratch, you might want to rebase instead. More info on rebasing can be found in the 'useful git commands' section
- Now we will want to make a new branch to start working on our feature or bug fix. Simply type:
```
git checkout -b YOUR-BRANCH-NAME
```
- This command is broken down into 2 parts, `checkout` will move you to a given branch the `-b` modifier will create the branch
- You have now created your new branch and are on it. Check the 'structure' section for what you should write in place of YOUR-BRANCH-NAME
- Now it's time to actually write some code. So go start implementing a new feature using TDD. Then come back after you're done.
- So now you have hopefully something done or at least the start to it and want to commit it
- First, we have to stage all edits, additions, and removals
```
git add .
```
- We can also stage specific files with a relative path
```
git add /path/to/file
```
- Next we will commit the code
```
git commit -m "A short description of what work was done in the commit"
```
- After that you might repeat the `git add .` and `git commit` a couple times before your masterpiece is done
- When you are ready to show it to everyone else or if you want to be able to access it on another computer, we have to push it with this command: (it might ask you for login creds)
```
git push
```
- Again this is the same this as saying `git push origin YOUR-BRANCH-NAME` the `origin` and `YOUR-BRANCH-NAME` are implicitly applied
- If git gives you and error here telling you that you need to set the remote as upstream, simply copy/paste the command it gives you. Next time you push on this branch, you won't get this error.
- Imagine at this point that everything in the story is done, and you are ready to get your code reviewed by the other devs. We need to make a pull request to do that
- Go to your origin's github page (or project repo) and make a new pull request. At the top, verify that the branch (thing you want to compare) and base are all coming from and going to the correct place. "Compare" should be YOUR-BRANCH-NAME and "base" should be main.
- Add a title as per the instructions in the 'structure' section, and make sure to add the label on the side bar, indicating which team you are on
- In order to merge this Pull Request (PR), we need two other people to review and approve it. You can get other peoples attention by 'requesting a review' on the side bar or by sending them a DM in slack
- Start by asking people on your team to do the review but don't hesitate to ask someone from a different team if there is an interaction
- Once you've pleased everyone, your code is in prime condition, and you have no merge conflicts, you can finally hit the 'squash and merge' button and set another title. Follow the naming conventions in the 'Pull Request Commit Naming' section of 'Structure'
- Your PR is now merged and everyone can fetch and rebase or pull to see the work you've done
- Congrats. Just repeat this process until the semester is over.
## Merge Conflicts / Updating your Branch
The commands are pretty much the same whether you are updating a branch, or you are trying to fix a merge conflict, except if you are updating you will skip the `git add .` and the `git rebase --continue` because you don't have anything to fix. You will still have to `git push -f`
Here's the scenario: Oh no, you have a merge conflict! This happens when you and another dev are working on the same file and edit the same line or git can't automatically figure out how to add your code and the main code together.
Once you see this error on your pull request, or if you happen to run into it outside of a PR, just follow these easy steps:
- First download the origin main data
```
git fetch origin main
```
- Next we will use the rebase command
```
git rebase origin/main
```
- Git will now replay the commits of your branch on top of the origin main. If you have a merge conflict, the prompt will pause and tell you which files were affected. From there, just navigate to your file and update the code accordingly.
- Once you have fixed all the merge conflicts go back to your terminal and type:
```
git add .
```
- Then
```
git rebase --continue
```
- This command is telling git "ok I've fixed this conflict now move on to the next commit"
- If you have more conflicts, repeat the last couple of steps, until the rebase is complete
- Generally, you can tell the rebase is complete when you look at the branch name in your terminal, and it is the correct branch name i.e. without any extra text or random symbols
- After that, the rebase has made a new local commit with all your changes, only one step left which is to force push:
```
git push -f
```
- This is just shorthand for `git push --force`
- If you don't force push you'll get a bunch of red and yellow text, which looks like you messed up, but it's fine. It didn't actually do anything just redo the command but with the `-f`
- At this point, if you go back to your pull request, you should be able to automatically merge the branch.
## Useful Git Commands
This command lets you see any edited, added, or removed files:
```
git status
```
This will show you the differences between last commit (HEAD is main) and your local repo. Press q when you want to leave:
```
git diff HEAD .
```
This will list all your remotes:
```
git remote -v
```
This will list all your branches and there will be a star next to the branch you are currently on:
```
git branch
```
Reset your current branch to the remote main:
```
git fetch origin main
git reset --hard origin/main
```
If you want to rebase your working branch on top of the remote main:
```
git fetch origin main
git rebase origin/main
```
Switch to a branch:
```
git checkout BRANCH-NAME
```
Creating and switching to a branch:
```
git checkout -b BRANCH-NAME
```
Add all files to be staged:
```
git add .
```
Remove all files from staging area:
```
git reset HEAD .
```
Commit all staged files:
```
git commit -m "My message"
```
Push code to remote repo:
```
git push
```
Push code to remote repo after rebase, use this one carefully:
```
git push --force
```
Select a specific commit and replay it onto a branch (don't include the angle brackets):
```
git cherry-pick <commitId>
```
If you need to quickly change branches but don't want to commit, or you want to transfer work from one branch to another, you can save your in-progress work with git stash. There are a variety of variations you can look up, but for general use, this first command will store the data:
```
git stash
```
The next command will re-apply the data:
```
git stash pop
```
---
---
__system: {"dislikeVariants":["No answer to my question","The recommendations didn't help","The content doesn't match the title","Other"]}
---
# Triggers
{% note warning %}
By default, only the [queue owner](../manager/queue-access.md) can create, edit, and delete triggers.
{% endnote %}
A trigger is a set of [actions](create-trigger.md) on an issue that runs automatically when the specified [conditions](create-trigger.md) are met. For example, if an issue's status changes, or if a particular user subscribes to an issue, a trigger can change the issue's parameters, leave a comment, or send an HTTP request.
Triggers can be used, for example, to automatically [assign issues to assignees](../manager/trigger-examples.md#assign_ticket) or to [send notifications from {{ tracker-name }} to a messenger](../messenger.md).
Each queue has its own set of triggers. Triggers are applied sequentially, in the same order in which they are listed in the queue settings. If necessary, you can [change the order of the triggers](manage-trigger.md) in the list.
[How to create a trigger](create-trigger.md)
[How to edit or delete a trigger](manage-trigger.md)
# schoolcpp
This project is based on an edX C++ course.
The aim of this project is to create three classes:
* A Student Class
* A Teacher Class
* A Course Class
The Course object should contain an array of Student objects so ensure that you create an array inside the Course object to hold Students. A Course object will also contain a single Teacher object. For this assignment, create an array of size 3 for students.
The diagram shows how some objects relate to each other in a program that might be used to maintain class registrations. The term UProgram is used so as not to confuse Program with a computer program. It is meant to represent a program such as Computer Science or Liberal Arts, etc.

The Student and Teacher classes need to have private member variables for first and last names, age, address, city, and phone along with public accessors for these.
Each class needs to have a default constructor and one that sets the values of the member variables when the object is created. Each class should also have a destructor.
Ensure that you are using a header (.h) and an implementation file (.cpp) for each class.
The Teacher class needs to have a method called GradeStudent() that accepts no arguments and returns nothing. Have this method output an appropriate message to the console such as "Student graded".
Add a method to both Student and Teacher called SitInClass(). it should take no arguments and return no arguments but, to illustrate class scope, have the method output, "Sitting at front of class" for the teacher and "Sitting in main theater" for the students.
In the main() method:
1.Instantiate three Student objects called Student1, Student2, and Student3, provide values for the member variables.
2.Instantiate a Course object called Intermediate C++.
3.Add your three students to this Course object.
4.Instantiate at least one Teacher object.
5.Add that Teacher object to your Course object
6.Using cout statements where appropriate, follow these instructions:
a.Output the name of the course
b.Call the GradeStudent() method on the Teacher object
c.Leave your application open and answer the Lab assessment questions
---
title: How to configure published applications
description: How to configure published applications
author: dansimp
ms.assetid: 43a59ff7-5d4e-49dc-84e5-1082bc4dd8f4
ms.reviewer: ''
manager: dansimp
ms.author: dansimp
ms.pagetype: mdop, virtualization
ms.mktglfcycl: deploy
ms.sitesec: library
ms.prod: w10
ms.date: 06/16/2016
ms.openlocfilehash: cb5736382f03e818ef10aa814a8e61044ca2b73a
ms.sourcegitcommit: 354664bc527d93f80687cd2eba70d1eea024c7c3
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 06/26/2020
ms.locfileid: "10810481"
---
# How to configure published applications
Applications that are not compatible with the host operating system can run inside the MED-V workspace and be started from the MED-V workspace in the same way as on the desktop, from the Start menu, or from a local host shortcut. The selected and defined applications are called published applications. The procedures in this section describe how to add and remove published applications.
An application can be published in one of the following ways:
- As an application: select a specific application by typing its command line. Only the selected application is published.
- As a menu: select a folder that contains several applications. All the applications in the folder are published and displayed as a menu.
## <a href="" id="bkmk-addingapublishedapplication"></a>How to Add a Published Application to a MED-V Workspace
**To add an application to the MED-V workspace**
1. Click the MED-V workspace to configure.
2. In the **Applications** pane, in the **Published Applications** section, click **Add** to add a new application.
3. Configure the application properties as described in the following table.
4. On the **Policy** menu, click **Commit**.
**Note**
If you configure Internet Explorer as a published application, to make sure that Web site redirection works correctly, verify that the parameters are not enclosed in parentheses.
**Published application properties**
<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Property</th>
<th align="left">Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left"><p>Enabled</p></td>
<td align="left"><p>Select this check box to enable the published application.</p></td>
</tr>
<tr class="even">
<td align="left"><p>Display name</p></td>
<td align="left"><p>The name of the shortcut in the user's Start menu.</p>
<div class="alert">
<strong>Note</strong><br/><p>The display name is <strong>not</strong> case-sensitive.</p>
</div>
<div>
</div></td>
</tr>
<tr class="odd">
<td align="left"><p>Description</p></td>
<td align="left"><p>A description of the published application, which appears as a tooltip when the user hovers the mouse over the shortcut.</p></td>
</tr>
<tr class="even">
<td align="left"><p>Command line</p></td>
<td align="left"><p>The command used to run the application from the MED-V workspace. The full path is required, and parameters can be passed to the application in the same way as in any other Windows command.</p>
<p>In a revertible MED-V workspace, you can map a network drive with the MapNetworkDrive syntax: "<em>MapNetworkDrive &lt;drive&gt; &lt;path&gt;</em>", for example "<em>MapNetworkDrive t: \tux\date</em>".</p>
<p>For example, to publish Windows Explorer, use the following syntax: "<em>c:\Windows</em>".</p>
<div class="alert">
<strong>Note</strong><br/><p>To use name resolution, you must do one of the following:</p>
</div>
<div>
</div>
<ul>
<li><p>Configure DNS in the MED-V workspace base image.</p></li>
<li><p>Verify that DNS resolution is defined in the host, and configure the workspace to use the host DNS.</p></li>
<li><p>Use the IP address to define the network drive.</p></li>
</ul>
<div class="alert">
<strong>Note</strong><br/><p>If the path contains spaces, the full path must be enclosed in quotation marks.</p>
</div>
<div>
</div>
<div class="alert">
<strong>Note</strong><br/><p>The path must not end with a backslash (\).</p>
</div>
<div>
</div></td>
</tr>
<tr class="odd">
<td align="left"><p>Start menu</p></td>
<td align="left"><p>Select this check box to create a shortcut for the application in the user's Start menu.</p></td>
</tr>
</tbody>
</table>
All published applications appear as shortcuts in the Windows **Start** menu (**Start > All Programs > MED-V Applications**).
## How to Delete a Published Application from a MED-V Workspace
**To delete an application from the MED-V workspace**
1. Click a MED-V workspace.
2. In the **Applications** pane, in the **Published Applications** section, select an application to delete.
3. Click **Remove**.
The application is removed from the list of published applications.
4. On the **Policy** menu, click **Commit**.
## How to Add a Published Menu to a MED-V Workspace
**To add a published menu to the MED-V workspace**
1. Click the MED-V workspace to configure.
2. In the **Applications** pane, in the **Published Menus** section, click **Add** to add a new menu.
3. Configure the menu properties as described in the following table.
4. On the **Policy** menu, click **Commit**.
**Published menu properties**
<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<thead>
<tr class="header">
<th align="left">Propriété</th>
<th align="left">Description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td align="left"><p>Activé</p></td>
<td align="left"><p>Activez cette case à cocher pour activer le menu publié.</p></td>
</tr>
<tr class="even">
<td align="left"><p>Nom d’affichage</p></td>
<td align="left"><p>Nom du raccourci dans le menu Démarrer de l''utilisateur.</p></td>
</tr>
<tr class="odd">
<td align="left"><p>Description</p></td>
<td align="left"><p>La description, qui s’affiche sous la forme d’une info-bulle lorsque le pointeur de la souris'le pointeur de la souris sur le raccourci.</p></td>
</tr>
<tr class="even">
<td align="left"><p>Dossier dans l’espace de travail</p></td>
<td align="left"><p>Sélectionnez le dossier à publier en tant que menu contenant toutes les applications dans le dossier.</p>
<p>Le texte affiché correspond au chemin d’accès relatif du dossier programmes.</p>
<div class="alert">
<strong>Remarque</strong><br/><p>S’il est vide, tous les programmes de l’hôte seront publiés sous forme de menu.</p>
</div>
<div>
</div></td>
</tr>
</tbody>
</table>
Tous les menus publiés apparaissent en tant que raccourcis dans le menu **Démarrer** de Windows (**Démarrez > tous les programmes pour les > applications MED-V**). Vous pouvez modifier le nom du raccourci dans le champ de **dossier raccourcis du menu Démarrer** .
**Remarque**
Lorsque vous configurez deux espaces de travail MED-V, il est recommandé de configurer un autre nom pour le dossier raccourcis du menu Démarrer.
## Supprimer un menu publié d’un espace de travail MED-V
**Pour supprimer un menu publié d’un espace de travail MED-V**
1. Cliquez sur un espace de travail MED-V.
2. Dans le volet **applications** , dans la section **menus publiés** , sélectionnez le menu à supprimer.
3. Cliquez sur **supprimer**.
Le menu est supprimé de la liste des menus publiés.
4. Dans le menu **stratégie** , cliquez sur **valider**.
## Exécution d’une application publiée à partir d’une ligne de commande sur le client
L’administrateur peut exécuter des applications publiées depuis n’importe quel emplacement, par exemple, un raccourci sur le bureau, à l’aide de la commande suivante:
``` syntax
"<Install path>\Manager\KidaroCommands.exe" /run "<published application name>" "<MED-V workspace name>"
```
**Remarque**
L’espace de travail MED-V dans lequel l’application publiée est définie doit être en cours d’exécution.
## Rubriques connexes
[Comment modifier une application publiée avec des paramètres avancés](how-to-edit-a-published-application-with-advanced-settings.md)
[Utilisation de l'interface utilisateur de la console de gestion MED-V](using-the-med-v-management-console-user-interface.md)
[Création d'un espace de travail MED-V](creating-a-med-v-workspacemedv-10-sp1.md)
| 36.816327 | 482 | 0.732816 | fra_Latn | 0.951167 |
Since I had to crank out a web app quickly, I kept the JavaScript tricks to a minimum and focused on just finishing it. I ended up using Bash, Python, and JavaScript all at once, so somehow this turned into a script party. Anyway, let's keep going today.
### So what's next?
Now that the front-end design is roughly done, it's time to design the back end. Yesterday I got as far as receiving data via POST, so now let's write the received data to a file.
```javascript
var firesystem =require('fs');
firesystem.writeFile('../crash.sh', data ,'utf-8', function(error){
    console.log('file write test callback');
});
```
The fs module turned out to be quite easy (?) to use, so I was able to sort out the file-writing part as soon as I got to work.
I wondered whether to use appendFile or writeFile: appendFile writes onto the end of an existing file, while writeFile rewrites the file from scratch.
Now that the file is written successfully, let's move on to the next task.
### Let's build the string...
Next, build the string that will go into the Bash script. It really just runs a Python script, so
```bash
python3 test_script.py argv
```
should be enough... but to make that work, the Python scripts need a cleanup pass first.
There are currently three scripts, and all three of them use InfluxDB.
### Huh? Python comes first then...
It would be convenient to modularize the InfluxDB insert logic and reference it from the three scripts.
First, shape the Python project: let's create the package that will hold InfluxDB.py.

Create the directory structure above and add an ```__init__.py```. That way Python recognizes the directory as a package, so imports work normally.
Other databases are used besides InfluxDB, so the package is named SQLMoudle, in the sense of a place that gathers the query code.
While the InfluxDB module installs, a quick note on using InfluxDB from Python: the official library is documented on [InfluxDB Github](https://github.com/influxdata/influxdb-python).
Install the module with the ```pip install influxdb``` command and you are ready to use it.
```python
from influxdb import InfluxDBClient
client = InfluxDBClient('localhost', 8086, 'root', 'root', 'example')
def create_db(database_name):
client.create_database(database_name)
def write_db(json_data):
client.write_points(json_data)
if __name__ == "__main__":
print('InfluxDB Client...')
```
The basic structure will look like this.
When you first write Python, you may wonder what ```if __name__ == "__main__":``` is for: it simply distinguishes whether this Python script is being run as an imported module or executed directly as a script. It is handy when testing, so it is worth adding whenever you build a module.
The flow works like this: get a client with InfluxDBClient, create the DB, then record to the table with write_db. Note that the data InfluxDB receives is JSON.
```python
def create_db(database_name):
try:
client.create_database(database_name)
except Exception as ex:
print(ex)
```
With this, the exception handling is covered as well.
The finished script looks like the following.
[InfluxDB.py]
```python
from influxdb import InfluxDBClient
client = InfluxDBClient('localhost', 8086, 'root', 'root', 'example')
def create_db(database_name):
try:
client.create_database(database_name)
except Exception as ex:
print(ex)
def write_db(json_data):
client.write_points(json_data)
if __name__ == "__main__":
print('InfluxDB Client...')
```
Now let's write a temporary test script in the parent project folder.
[crash.py]
```python
import SQLMoudle.InfluxDB as influx
import sys
a = sys.argv[1]
b = sys.argv[2]
def __make_json_body__():
json_body = ''
return json_body
if __name__ == "__main__":
influx.create_db('Database name')
json_data = __make_json_body__()
# Run Run Run...
influx.write_db(json_data)
```
Import the Python script we created under SQLMoudle, and import sys along with it to use argv. Here argv[0] holds the file name, so the actual arguments must be read starting from argv[1].
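For example, with a hypothetical invocation:

```python
import sys

# Running `python crash.py foo bar` gives:
#   sys.argv == ['crash.py', 'foo', 'bar']
script_name = sys.argv[0]  # always the script path itself
args = sys.argv[1:]        # real arguments start at index 1
print(script_name, args)
```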
Build ```__make_json_body__``` to shape the JSON data, then call influx.write_db to record it. I won't go into the data-shaping details separately.
```python
tags = {'host': 'fabric-monitoring', "region": "as-seoul"}
fields = {'crash': 0, 'lastday': None}
body = {"measurement": None, "tags": tags, "fields": fields, "time": None}
```
Preparing the body template in advance like this and then editing it like JSON makes the code quite a bit cleaner.
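The deepcopy in the final script is what keeps the shared template clean: a plain assignment (or even a shallow dict.copy) would alias the nested fields dict. A small sketch:

```python
from copy import deepcopy

template = {"fields": {"crash": 0, "lastday": None}}

# deepcopy clones the nested dict too, so the template stays untouched
record = deepcopy(template)
record["fields"]["crash"] = 7

print(template["fields"]["crash"])  # 0 (template unchanged)
print(record["fields"]["crash"])    # 7
```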
```python
import SQLMoudle.InfluxDB as influx
from copy import deepcopy
import sys
a = sys.argv[1]
b = sys.argv[2]
tags = {'host': 'fabric-monitoring', "region": "as-seoul"}
fields = {'crash': 0, 'lastday': None}
body = {"measurement": None, "tags": tags, "fields": fields, "time": None}
def __make_json_body__(measurement, fieldData):
json_body = []
tp = deepcopy(body)
tp['measurement'] = measurement
tp['fields']['crash'] = fieldData[0]
tp['fields']['lastday'] = fieldData[1]
tp['time'] = 0000000
return json_body
if __name__ == "__main__":
influx.create_db('Database name')
json_data = __make_json_body__()
# Run Run Run...
influx.write_db(json_data)
```
This is the final form. For time, use whichever time source you prefer; I get UTC in the way shown below.
```python
from datetime import datetime
dt = datetime.utcnow()
```
From here, the data shaping is up to you.
Either way, we have now written the part that reads the argument values and the part that modularizes InfluxDB and imports it.

To be continued...
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
# Data Management
This tutorial is going to introduce you to the conceptual details of data management like:
* [Loading Data](#loading-data)
* [Deleting Data](#deleting-data)
* [Compacting Data](#compacting-data)
* [Updating Data](#updating-data)
## Loading Data
* **Scenario**
After creating a table, you can load data to the table using the [LOAD DATA](dml-operation-on-carbondata.md) command. The loaded data is available for querying.
When data load is triggered, the data is encoded in CarbonData format and copied into HDFS CarbonData store path (specified in carbon.properties file)
in compressed, multi dimensional columnar format for quick analysis queries. The same command can be used to load new data or to
update the existing data. Only one data load can be triggered for one table. The high cardinality columns of the dictionary encoding are
automatically recognized and these columns will not be used for dictionary encoding.
* **Procedure**
Data loading is a process that involves execution of multiple steps to read, sort and encode the data in CarbonData store format.
Each step is executed on different threads. After data loading process is complete, the status (success/partial success) is updated to
CarbonData store metadata. The table below lists the possible load status.
| Status | Description |
|-----------------|------------------------------------------------------------------------------------------------------------|
| Success | All the data is loaded into table and no bad records found. |
| Partial Success | Data is loaded into table and bad records are found. Bad records are stored at carbon.badrecords.location. |
   In case of failure, the error will be logged in the error log. Details of loads can be seen with the [SHOW SEGMENTS](dml-operation-on-carbondata.md) command. The SHOW SEGMENTS command output consists of:
- SegmentSequenceID
- START_TIME OF LOAD
- END_TIME OF LOAD
- LOAD STATUS
The latest load will be displayed first in the output.
Refer to [DML operations on CarbonData](dml-operation-on-carbondata.md) for load commands.
## Deleting Data
* **Scenario**
If you have loaded wrong data into the table, or too many bad records are present and you want to modify and reload the data, you can delete required data loads.
The load can be deleted using the Segment Sequence Id or if the table contains date field then the data can be deleted using the date field.
If there are some specific records that need to be deleted based on some filter condition(s) we can delete by records.
* **Procedure**
The loaded data can be deleted in the following ways:
* Delete by Segment ID
After you get the segment ID of the segment that you want to delete, execute the delete command for the selected segment.
The status of deleted segment is updated to Marked for delete / Marked for Update.
| SegmentSequenceId | Status | Load Start Time | Load End Time |
|-------------------|-------------------|----------------------|----------------------|
| 0 | Success | 2015-11-19 19:14:... | 2015-11-19 19:14:... |
| 1 | Marked for Update | 2015-11-19 19:54:... | 2015-11-19 20:08:... |
| 2 | Marked for Delete | 2015-11-19 20:25:... | 2015-11-19 20:49:... |
* Delete by Date Field
If the table contains date field, you can delete the data based on a specific date.
* Delete by Record
To delete records from CarbonData table based on some filter Condition(s).
For delete commands refer to [DML operations on CarbonData](dml-operation-on-carbondata.md).
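   As a sketch, the three deletion styles can look like the following (table and column names are hypothetical; see the DML reference for the exact syntax):

   ```sql
   -- Delete by segment ID
   DELETE FROM TABLE table1 WHERE SEGMENT.ID IN (0, 1);

   -- Delete by date field (segments loaded before the given time)
   DELETE FROM TABLE table1 WHERE SEGMENT.STARTTIME BEFORE '2017-06-01 12:05:06';

   -- Delete by record, using a filter condition
   DELETE FROM table1 WHERE column1 = 'value';
   ```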
* **NOTE**:
- When the delete segment DML is called, segment will not be deleted physically from the file system. Instead the segment status will be marked as "Marked for Delete". For the query execution, this deleted segment will be excluded.
- The deleted segment will be deleted physically during the next load operation and only after the maximum query execution time configured using "max.query.execution.time". By default it is 60 minutes.
- If the user wants to force delete the segment physically then he can use CLEAN FILES Command.
Example :
```
CLEAN FILES FOR TABLE table1
```
This DML will physically delete the segment which are "Marked for delete" immediately.
## Compacting Data
* **Scenario**
Frequent data ingestion results in several fragmented CarbonData files in the store directory. Since data is sorted only within each load, the indices perform only within each
load. This means that there will be one index for each load and as number of data load increases, the number of indices also increases. As each index works only on one load,
the performance of indices is reduced. CarbonData provides provision for compacting the loads. Compaction process combines several segments into one large segment by merge sorting the data from across the segments.
* **Procedure**
There are two types of compaction Minor and Major compaction.
- **Minor Compaction**
    In minor compaction, the user can specify how many loads should be merged. Minor compaction is triggered for every data load if the parameter carbon.enable.auto.load.merge is set. If any segments are available to be merged, then compaction will
    run in parallel with the data load. There are 2 levels in minor compaction.
- Level 1: Merging of the segments which are not yet compacted.
- Level 2: Merging of the compacted segments again to form a bigger segment.
- **Major Compaction**
   In major compaction, many segments can be merged into one big segment. The user specifies the compaction size up to which segments can be merged. Major compaction is usually done during off-peak time.
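   As a sketch (table name hypothetical; see the DDL reference for the exact syntax), compaction is typically triggered with:

   ```sql
   -- Merge un-compacted segments according to carbon.compaction.level.threshold
   ALTER TABLE table1 COMPACT 'MINOR';

   -- Merge segments up to carbon.major.compaction.size into one big segment
   ALTER TABLE table1 COMPACT 'MAJOR';
   ```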
   There are a number of parameters related to compaction that can be set in the carbon.properties file:
| Parameter | Default | Application | Description | Valid Values |
|-----------------------------------------|---------|-------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------|
| carbon.compaction.level.threshold | 4, 3 | Minor | This property is for minor compaction which decides how many segments to be merged. Example: If it is set as 2, 3 then minor compaction will be triggered for every 2 segments. 3 is the number of level 1 compacted segment which is further compacted to new segment. | NA |
| carbon.major.compaction.size | 1024 MB | Major | Major compaction size can be configured using this parameter. Sum of the segments which is below this threshold will be merged. | NA |
| carbon.numberof.preserve.segments | 0 | Minor/Major | If the user wants to preserve some number of segments from being compacted then he can set this property. Example: carbon.numberof.preserve.segments=2 then 2 latest segments will always be excluded from the compaction. No segments will be preserved by default. | 0-100 |
| carbon.allowed.compaction.days | 0 | Minor/Major | Compaction will merge the segments which are loaded within the specific number of days configured. Example: If the configuration is 2, then the segments which are loaded in the time frame of 2 days only will get merged. Segments which are loaded 2 days apart will not be merged. This is disabled by default. | 0-100 |
| carbon.number.of.cores.while.compacting | 2 | Minor/Major | Number of cores which is used to write data during compaction. | 0-100 |
For compaction commands refer to [DDL operations on CarbonData](ddl-operation-on-carbondata.md)
## Updating Data
* **Scenario**
Sometimes after the data has been ingested into the System, it is required to be updated. Also there may be situations where some specific columns need to be updated
on the basis of column expression and optional filter conditions.
* **Procedure**
To update we need to specify the column expression with an optional filter condition(s).
For update commands refer to [DML operations on CarbonData](dml-operation-on-carbondata.md).
| 55.526946 | 391 | 0.686509 | eng_Latn | 0.997384 |
# WinForms on .NET Core Roadmap
This roadmap communicates priorities for evolving and extending the scope of WinForms for .NET Core.
At present, our primary focus is enabling the following for .NET Core 3.0:
* Achieve WinForms functional and performance parity compared to .NET Framework
* Publish remaining WinForms components to the repo
* Publish (and write) more WinForms tests to the repo
> Note: There are some specific .NET Framework features that will not be supported, such as hosting WinForms controls in Internet Explorer.
As we complete those goals, we'll update our roadmap to include additional feature/capability areas we will focus on next.
For general information regarding .NET Core plans, see [.NET Core
roadmap](https://github.com/dotnet/core/blob/master/roadmap.md).
## Timelines
| Milestone | Date |
|---|---|
|Initial launch of WinForms on .NET Core repository |Dec 4, 2018|
|Functional parity with .NET Framework WinForms |Q1 2019|
|First version of WinForms on .NET Core |.NET Core 3.0 GA|
|Designer support in Visual Studio|Update to VS 2019|
If you'd like to contribute to WinForms, please take a look at our [Contributing
Guide](Documentation/contributing.md).
## Shorter-Term Feature Backlog
* Port existing functional tests and test infrastructure to this repo
* Add Application property for DPI Awareness setting
## Longer-Term Feature Backlog
* Add WinForms Designer support for .NET Core 3 projects in a Visual Studio 2019 update
* Fix existing scaling bugs in Per Monitor DPI aware applications
* Add a new “clean" way of calculating location/size information in PMA mode.
* Make new projects be per monitor aware
* Add Edge browser control
* Add Data Visualization controls
* Improve accessibility support for some missing UIA interfaces
* Improve performance of WinForms runtime
| 40.511111 | 134 | 0.785518 | eng_Latn | 0.947356 |
# Untitled array in Peripherals Schema Schema
```txt
http://pulp-platform.org/snitch/peripherals.schema.json#/properties/axi_lite_peripherals
```
| Abstract | Extensible | Status | Identifiable | Custom Properties | Additional Properties | Access Restrictions | Defined In |
| :------------------ | :--------- | :------------- | :---------------------- | :---------------- | :-------------------- | :------------------ | :------------------------------------------------------------------------- |
| Can be instantiated | No | Unknown status | Unknown identifiability | Forbidden | Allowed | none | [peripherals.schema.json*](peripherals.schema.json "open original schema") |
## axi_lite_peripherals Type
unknown\[]
## axi_lite_peripherals Constraints
**unique items**: all items in this array must be unique. Duplicates are not allowed.
| 49.25 | 222 | 0.484264 | eng_Latn | 0.738959 |
---
title: Custom tracking schemas for B2B messages
description: Create custom tracking schemas to monitor B2B messages in Azure Logic Apps
services: logic-apps
ms.suite: integration
author: divyaswarnkar
ms.author: divswa
ms.reviewer: jonfan, estfan, logicappspm
ms.topic: article
ms.date: 01/01/2020
ms.openlocfilehash: c82f9cbfaf2e23ddaa5e4b05f4aac4795d3e16a9
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 03/27/2020
ms.locfileid: "76903064"
---
# <a name="create-custom-tracking-schemas-that-monitor-end-to-end-workflows-in-azure-logic-a"></a>Skapa anpassade spårningsscheman som övervakar end-to-end-arbetsflöden i Azure Logic A
Azure Logic Apps har inbyggd spårning som du kan aktivera för delar av arbetsflödet. Du kan dock ställa in anpassad spårning som loggar händelser från början till slutet av arbetsflöden, till exempel arbetsflöden som innehåller en logikapp, BizTalk Server, SQL Server eller något annat lager. Den här artikeln innehåller anpassad kod som du kan använda i lagren utanför logikappen.
## <a name="custom-tracking-schema"></a>Anpassat spårningsschema
```json
{
"sourceType": "",
"source": {
"workflow": {
"systemId": ""
},
"runInstance": {
"runId": ""
},
"operation": {
"operationName": "",
"repeatItemScopeName": "",
"repeatItemIndex": ,
"trackingId": "",
"correlationId": "",
"clientRequestId": ""
}
},
"events": [
{
"eventLevel": "",
"eventTime": "",
"recordType": "",
"record": {}
}
]
}
```
| Property | Required | Type | Description |
|----------|----------|------|-------------|
| sourceType | Yes | String | Type of the run source, with these permitted values: `Microsoft.Logic/workflows`, `custom` |
| source | Yes | String or JToken | If the source type is `Microsoft.Logic/workflows`, the source information must follow this schema. If the source type is `custom`, the schema is a JToken. |
| systemId | Yes | String | Logic app system ID |
| runId | Yes | String | Logic app run ID |
| operationName | Yes | String | Name of the operation, for example an action or trigger |
| repeatItemScopeName | Yes | String | Repeat item name when the action is inside a `foreach` or `until` loop |
| repeatItemIndex | Yes | Integer | Indicates that the action is inside a `foreach` or `until` loop, and is the repeated item's index number. |
| trackingId | No | String | Tracking ID used to correlate the messages |
| correlationId | No | String | Correlation ID used to correlate the messages |
| clientRequestId | No | String | The client can populate this property to correlate messages |
| eventLevel | Yes | String | Level of the event |
| eventTime | Yes | DateTime | Time of the event, in UTC format: *YYYY-MM-DDTHH:MM:SS.00000Z* |
| recordType | Yes | String | Type of the track record, with only this permitted value: `custom` |
| record | Yes | JToken | Custom record type, in JToken format only |
|||||
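A filled-in instance of this schema might look like the following (all IDs, names, and values are hypothetical):

```json
{
  "sourceType": "Microsoft.Logic/workflows",
  "source": {
    "workflow": { "systemId": "/workflows/my-monitoring-workflow" },
    "runInstance": { "runId": "08586676746934337772206998657CU22" },
    "operation": {
      "operationName": "HTTP_Action",
      "repeatItemScopeName": "For_each",
      "repeatItemIndex": 0,
      "trackingId": "c7b9f4a2-0000-0000-0000-000000000000",
      "correlationId": "e2c80000-0000-0000-0000-000000000000",
      "clientRequestId": "d6f00000-0000-0000-0000-000000000000"
    }
  },
  "events": [
    {
      "eventLevel": "Informational",
      "eventTime": "2020-01-01T12:00:00.00000Z",
      "recordType": "custom",
      "record": { "status": "Succeeded" }
    }
  ]
}
```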
## <a name="b2b-protocol-tracking-schemas"></a>B2B protocol tracking schemas
For information about B2B protocol tracking schemas, see:
* [AS2 tracking schemas](../logic-apps/logic-apps-track-integration-account-as2-tracking-schemas.md)
* [X12 tracking schemas](logic-apps-track-integration-account-x12-tracking-schema.md)
## <a name="next-steps"></a>Next steps
* Learn more about [monitoring B2B messages with Azure Monitor logs](../logic-apps/monitor-b2b-messages-log-analytics.md)
] | null | null | null | ---
title: "Use rowset binding (ODBC) | Microsoft Docs"
ms.custom:
ms.date: 03/06/2017
ms.prod: sql-non-specified
ms.prod_service: database-engine, sql-database, sql-data-warehouse, pdw
ms.service:
ms.component: native-client-odbc-how-to
ms.reviewer:
ms.suite: sql
ms.technology:
ms.tgt_pltfrm:
ms.topic: reference
helpviewer_keywords:
- rowset binding [ODBC]
ms.assetid: a7be05f0-6b11-4b53-9fbc-501e591eef09
caps.latest.revision:
author: MightyPen
ms.author: genemi
manager: craigg
ms.workload: Inactive
ms.openlocfilehash: 91d372fbb5e63bff8782eaeef1fe21e510f8578b
ms.sourcegitcommit: 9e6a029456f4a8daddb396bc45d7874a43a47b45
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 01/25/2018
---
# <a name="use-rowset-binding-odbc"></a>Use rowset binding (ODBC)
[!INCLUDE[appliesto-ss-asdb-asdw-pdw-md](../../../includes/appliesto-ss-asdb-asdw-pdw-md.md)]
[!INCLUDE[SNAC_Deprecated](../../../includes/snac-deprecated.md)]
### <a name="to-use-column-wise-binding"></a>To use column-wise binding
1. For each bound column, do the following:
   - Allocate an array of R (or more) column buffers to hold the data values, where R is the number of rows in the rowset.
   - Optionally, allocate an array of R (or more) column buffers to hold the data lengths.
   - Call [SQLBindCol](../../../relational-databases/native-client-odbc-api/sqlbindcol.md) to bind the column's data-value and data-length arrays to the rowset column.
2. Call [SQLSetStmtAttr](../../../relational-databases/native-client-odbc-api/sqlsetstmtattr.md) to set the following attributes:
   - Set SQL_ATTR_ROW_ARRAY_SIZE to the number of rows in the rowset (R).
   - Set SQL_ATTR_ROW_BIND_TYPE to SQL_BIND_BY_COLUMN.
   - Set SQL_ATTR_ROWS_FETCHED_PTR to point to a SQLUINTEGER variable that holds the number of rows fetched.
   - Set SQL_ATTR_ROW_STATUS_PTR to point to an array[R] of SQLUSMALLINT variables that hold the row status indicators.
3. Execute the statement.
4. Each call to [SQLFetch](http://go.microsoft.com/fwlink/?LinkId=58401) or [SQLFetchScroll](../../../relational-databases/native-client-odbc-api/sqlfetchscroll.md) retrieves R rows and transfers the data to the bound columns.
### <a name="to-use-row-wise-binding"></a>To use row-wise binding
1. Allocate an array[R] of structures, where R is the number of rows in the rowset. The structure has one element for each column, and each element has two parts:
   - The first part is a variable of the appropriate data type that holds the column data.
   - The second part is a SQLINTEGER variable that holds the column indicator.
2. Call [SQLSetStmtAttr](../../../relational-databases/native-client-odbc-api/sqlsetstmtattr.md) to set the following attributes:
   - Set SQL_ATTR_ROW_ARRAY_SIZE to the number of rows in the rowset (R).
   - Set SQL_ATTR_ROW_BIND_TYPE to the size of the structure allocated in Step 1.
   - Set SQL_ATTR_ROWS_FETCHED_PTR to point to a SQLUINTEGER variable that holds the number of rows fetched.
   - Set SQL_ATTR_ROW_STATUS_PTR to point to an array[R] of SQLUSMALLINT variables that hold the row status indicators.
3. For each column in the result set, call [SQLBindCol](../../../relational-databases/native-client-odbc-api/sqlbindcol.md) to point the column's data-value and data-length pointers to their variables in the first element of the array of structures allocated in Step 1.
4. Execute the statement.
5. Each call to [SQLFetch](http://go.microsoft.com/fwlink/?LinkId=58401) or [SQLFetchScroll](../../../relational-databases/native-client-odbc-api/sqlfetchscroll.md) retrieves R rows and transfers the data to the bound columns.
## <a name="see-also"></a>See also
[Using Cursors How-to Topics (ODBC)](../../../relational-databases/native-client-odbc-how-to/cursors/using-cursors-how-to-topics-odbc.md)
[How Cursors Are Implemented](../../../relational-databases/native-client-odbc-cursors/implementation/how-cursors-are-implemented.md)
[Use Cursors (ODBC)](../../../relational-databases/native-client-odbc-how-to/cursors/use-cursors-odbc.md)
] | null | null | null | <h1 align="center">Welcome to @esbiya/requests 👋</h1>
<p>
<img alt="Version" src="https://img.shields.io/badge/version-1.0.0-blue.svg?cacheSeconds=2592000" />
<a href="#" target="_blank">
<img alt="License: MIT" src="https://img.shields.io/badge/License-MIT-yellow.svg" />
</a>
</p>
> a network requests lib
## Install
```sh
npm install -g @esbiya/requests
```
## Author
👤 **esbiya**
## Usage
```javascript
const requests = require("@esbiya/requests");
(async function() {
const resp = await requests.get(`https://www.baidu.com/`);
console.log(resp.text);
})();
```
### Using headers
```javascript
async function headerTest() {
const resp = (await requests.get(`http://127.0.0.1:3000/api/v1/header-test`, {
headers: {
"hello": "world",
}
})).json();
return resp["hello"] === "world"
}
```
### Using a proxy
```javascript
async function proxyTest() {
const resp = (await requests.get("http://127.0.0.1:3000/api/v1/proxy-test", {
proxy: "http://127.0.0.1:8888",
verify: false
})).text;
return resp === "127.0.0.1"
}
```
```
Supported proxy types: s5 (socks5) / http / https:
s5: socks5://{username}:{password}@{ip}:{port}
http: http://{ip}:{port}
https: https://{ip}:{port}
```
### Using cookies
```javascript
async function cookieTest() {
const resp = (await requests.get("http://127.0.0.1:3000/api/v1/cookie-test", {
        // Option 1: pass an object
cookies: {
"hello": "world",
"test1": "test2",
},
        // Option 2: pass a string
// cookies: "hello=world; test1=test2",
})).text;
return resp == "hello=world; test1=test2"
}
```
### Disabling redirects
```javascript
async function redirectTest() {
const resp = await requests.get("http://127.0.0.1:3000/api/v1/redirect-test", {
params: {
"redirectUrl": "http://test.demo.com/redirect",
},
followRedirect: false
});
return resp.statusCode === 302 && resp.location() === "http://test.demo.com/redirect"
}
```
### GET params example
```javascript
async function paramsTest() {
let params = {
"hello": "world"
}
const resp = (await requests.get("http://127.0.0.1:3000/api/v1/params-test", {
params: params
})).json();
return resp["hello"] === params["hello"];
}
```
### POST form example
```javascript
async function formTest() {
let form = {
hello: 'world'
}
const resp = (await requests.post("http://127.0.0.1:3000/api/v1/form-test", {
        form: form // or the string "hello=world"
})).json();
return resp["hello"] === form["hello"];
}
```
### POST JSON payload example
```javascript
async function jsonTest() {
let payload = {
hello: 'world'
}
const resp = (await requests.post("http://127.0.0.1:3000/api/v1/json-test", {
json: payload,
headers: { 'Content-Type': 'application/json' },
})).json();
return resp["hello"] === payload["hello"];
}
```
### POST binary example
```javascript
async function binaryTest() {
let body = fs.readFileSync(`test.jpg`);
const resp = await requests.post("http://127.0.0.1:3000/api/v1/binary-test", {
body: body,
});
return resp.text === calculateFileHash("./test.jpg")
}
```
### POST multipart form-data example
```javascript
async function formDataTest() {
const resp = await requests.post("http://127.0.0.1:3000/api/v1/formdata-test", {
headers : { 'Content-Type' : 'multipart/form-data' },
formData: {
img: fs.createReadStream("./test.jpg")
}
});
return resp.text === calculateFileHash("./test.jpg")
}
```
### File download example
```javascript
async function downloadTest() {
(await requests.get("http://127.0.0.1:3000/api/v1/download-test")).saveFile("test1.jpg");
return calculateFileHash("test1.jpg") === calculateFileHash("test.jpg")
}
```
### Session proxy example
```javascript
async function sessionProxyTest() {
const session = requests.session({
proxy: "http://127.0.0.1:8888"
});
const resp = await session.get("http://127.0.0.1:3000/api/v1/proxy-test", { verify: false });
return resp.text === "127.0.0.1"
}
```
### Session cookies example
```javascript
// Usage 1
const session = requests.session();
session.setCookies(`hello=world`, `http://www.baidu.com`);
session.setCookies({
    "hello": "world"
}, `http://www.baidu.com`);
session.setCookies([{
key: "hello",
value: "world",
domain: "www.baidu.com"
}]);
// Usage 2
// const session = requests.session({
// cookies: `xxx=yyy`
// cookies: {
// 'xxx': 'yyy'
// }
// cookies: [{
// key: 'xxx',
// value: 'yyy'
// }]
// });
const resp = await session.get("http://www.baidu.com/");
console.log(resp.text)
```
### Session proxy example (certificate verification is skipped by default when a proxy is set)
```javascript
const session = requests.session({ proxy: `http://127.0.0.1:8888` })
```
### Session keepAlive example
```javascript
const session = requests.session({ keepAlive: true })
```
### Get a parsed HTML document from the response (uses cheerio; for details see the cheerio documentation: https://github.com/cheeriojs/cheerio)
```javascript
async function documentTest() {
const session = requests.session();
const resp = await session.get("http://127.0.0.1:3000/api/v1/input-form-test");
console.log(resp.document())
}
```
### Get the input form fields from the response HTML
```javascript
async function inputFormTest() {
const session = requests.session();
const resp = await session.get("http://127.0.0.1:3000/api/v1/input-form-test");
console.log(resp.inputForm('payForm'))
}
```
### Automatically extract ```<script>var data = JSON.parse('{\"error\":\"\"}'); </script>``` from the response and convert it to standard JSON
```javascript
async function parseJSONTest() {
const session = requests.session();
const resp = await session.get("http://127.0.0.1:3000/api/v1/parse-json-test");
console.log(resp.parseJSON())
}
```
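The extraction that `resp.parseJSON()` performs can be sketched in plain JavaScript. The following standalone function is only an approximation (an assumption, not the library's actual implementation): it finds a `JSON.parse('...')` literal embedded in an HTML response and parses it:

```javascript
// Approximation of resp.parseJSON() (assumption: not the library's actual
// implementation). Finds a JSON.parse('...') call embedded in an HTML
// response and returns the parsed object, or null if none is found.
function extractEmbeddedJSON(html) {
  const match = html.match(/JSON\.parse\('((?:\\.|[^'\\])*)'\)/);
  if (!match) return null;
  // The capture is the body of a JavaScript string literal; remove the
  // backslash escapes before handing it to JSON.parse.
  const unescaped = match[1].replace(/\\(.)/g, "$1");
  return JSON.parse(unescaped);
}

const html = "<script>var data = JSON.parse('{\\\"error\\\":\\\"\\\"}'); </script>";
console.log(extractEmbeddedJSON(html)); // { error: '' }
```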
### Response interface reference
| Interface | Description |
|--- | --- |
| resp.bytes | Response byte stream |
| resp.text | Response text |
| resp.json() | Get the response as standard JSON |
| resp.callbackJSON() | Automatically converts callback({"1": "2"})-style data to standard JSON; a callback name can be given, e.g. resp.callbackJSON('cb') parses ' cb({})' |
| resp.cost() | Request duration in milliseconds (ms) |
| resp.setEncoding('gbk') | Set the response encoding |
| resp.saveFile('test.jpg') | Save the response to a local file |
| resp.location() | Get the redirect URL; parameter load: boolean — true returns the window.location.href=`` style redirect URL, false returns the 302 redirect address (default false) |
| await resp.cookies() | Get the response cookies as an array of touch.Cookie |
| await resp.cookieString() | Get the response cookies as a string, "111=222; 333=444" |
| await resp.cookieMap() | Get the response cookies as standard JSON, {"111": "222"} |
| await resp.cookieArrayMap() | Get the response cookies as an array of standard cookie objects, [{key: "111", value: "222", domain: "xxx"}] |
| resp.content | Response content |
| resp.charset | Response character encoding |
| resp.contentType | Response content type |
| resp.contentLength | Response content length |
| resp.headers | Response headers |
| resp.uri | Response URL |
| resp.httpVersion | HTTP version of the request |
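The behavior of `resp.callbackJSON()` described in the table above can be approximated in plain JavaScript. The sketch below is an assumption, not the library's actual implementation; it strips a JSONP-style wrapper such as `cb({...})` and parses the payload:

```javascript
// Approximation of resp.callbackJSON() (assumption: not the library's
// actual implementation). Strips a JSONP-style wrapper such as cb({...})
// and parses the payload as JSON; returns null if the text doesn't match.
function stripCallback(text, name) {
  const pattern = name
    ? new RegExp("^\\s*" + name + "\\((.*)\\)\\s*;?\\s*$", "s")
    : /^\s*[\w$.]+\((.*)\)\s*;?\s*$/s;
  const match = text.match(pattern);
  return match ? JSON.parse(match[1]) : null;
}

console.log(stripCallback(' cb({"1": "2"})', "cb")); // { '1': '2' }
console.log(stripCallback('callback({"a": 1});'));   // { a: 1 }
```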
## Show your support
Give a ⭐️ if this project helped you!
***
_This README was generated with ❤️ by [readme-md-generator](https://github.com/kefranabg/readme-md-generator)_

---
title: Convert anonymous type to class
description: Learn how to use the Quick Actions and Refactorings menu to convert an anonymous type to a class in Visual Studio.
ms.custom: SEO-VS-2020
ms.date: 03/10/2020
ms.topic: reference
author: mikadumont
ms.author: midumont
manager: jmartens
ms.technology: vs-ide-general
dev_langs:
- CSharp
- VB
ms.workload:
- dotnet
monikerRange: '>= vs-2019'
ms.openlocfilehash: eedab2e2d826b44728b4f29569c9086b70eba4a1
ms.sourcegitcommit: 68897da7d74c31ae1ebf5d47c7b5ddc9b108265b
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 08/13/2021
ms.locfileid: "122123929"
---
# <a name="convert-anonymous-type-to-class"></a>Convert anonymous type to class

This refactoring applies to:

- C#

- Visual Basic

**What:** Converts an anonymous type to a class.

**When:** You have an anonymous type that you want to keep building on as a class.

**Why:** Anonymous types are useful when you only use them locally. As your code grows, it is convenient to have a simple way to promote one to a class.

## <a name="how-to"></a>How-to

1. Place your cursor in an anonymous type.

2. Press **Ctrl**+**.** to open the **Quick Actions and Refactorings** menu.

   ![Convert anonymous type to class](media/convert-anonymous-type-to-class.png)

3. Press **Enter** to accept the refactoring.

   ![Convert anonymous type to class result](media/convert-anonymous-type-to-class-result.png)

## <a name="see-also"></a>See also

- [Refactoring](../refactoring-in-visual-studio.md)
---
title: "Relationships Among MFC Objects"
ms.date: "11/04/2016"
helpviewer_keywords: ["MFC, relationships between key objects", "objects [MFC], relationships", "relationships, MFC objects", "MFC object relationships"]
ms.assetid: 6e8f3b51-e80f-4d88-94c8-4c1e4ee163ad
---
# Relationships Among MFC Objects
To help put the document/view creation process in perspective, consider a running program: a document, the frame window used to contain the view, and the view associated with the document.
- A document keeps a list of the views of that document and a pointer to the document template that created the document.
- A view keeps a pointer to its document and is a child of its parent frame window.
- A document frame window keeps a pointer to its current active view.
- A document template keeps a list of its open documents.
- The application keeps a list of its document templates.
- Windows keeps track of all open windows so it can send messages to them.
These relationships are established during document/view creation. The following table shows how objects in a running program can access other objects. Any object can obtain a pointer to the application object by calling the global function [AfxGetApp](../mfc/reference/application-information-and-management.md#afxgetapp).
### Gaining Access to Other Objects in Your Application
|From object|How to access other objects|
|-----------------|---------------------------------|
|Document|Use [GetFirstViewPosition](../mfc/reference/cdocument-class.md#getfirstviewposition) and [GetNextView](../mfc/reference/cdocument-class.md#getnextview) to access the document's view list.<br /><br /> Call [GetDocTemplate](../mfc/reference/cdocument-class.md#getdoctemplate) to get the document template.|
|View|Call [GetDocument](../mfc/reference/cview-class.md#getdocument) to get the document.<br /><br /> Call [GetParentFrame](../mfc/reference/cwnd-class.md#getparentframe) to get the frame window.|
|Document frame window|Call [GetActiveView](../mfc/reference/cframewnd-class.md#getactiveview) to get the current view.<br /><br /> Call [GetActiveDocument](../mfc/reference/cframewnd-class.md#getactivedocument) to get the document attached to the current view.|
|MDI frame window|Call [MDIGetActive](../mfc/reference/cmdiframewnd-class.md#mdigetactive) to get the currently active [CMDIChildWnd](../mfc/reference/cmdichildwnd-class.md).|
Typically, a frame window has one view, but sometimes, as in splitter windows, the same frame window contains multiple views. The frame window keeps a pointer to the currently active view; the pointer is updated any time another view is activated.
> [!NOTE]
> A pointer to the main frame window is stored in the [m_pMainWnd](../mfc/reference/cwinthread-class.md#m_pmainwnd) member variable of the application object. A call to `OnFileNew` in your override of the `InitInstance` member function of `CWinApp` sets *m_pMainWnd* for you. If you do not call `OnFileNew`, you must set the variable's value in `InitInstance` yourself. (SDI COM component (server) applications may not set the variable if /Embedding is on the command line.) Note that *m_pMainWnd* is now a member of class `CWinThread` rather than `CWinApp`.
## See also
[Document Templates and the Document/View Creation Process](../mfc/document-templates-and-the-document-view-creation-process.md)<br/>
[Document Template Creation](../mfc/document-template-creation.md)<br/>
[Document/View Creation](../mfc/document-view-creation.md)<br/>
[Creating New Documents, Windows, and Views](../mfc/creating-new-documents-windows-and-views.md)
| 80.066667 | 559 | 0.768804 | eng_Latn | 0.944722 |
---
title: 'O2SS0264: Unable to convert cursor or cursor variable as a function or procedure call parameter (Error)'
description: Describes why SQL Server Migration Assistant (SSMA) for Oracle does not convert the PL/SQL block when a cursor or cursor variable is passed as a parameter to a function or procedure call.
author: nahk-ivanov
ms.prod: sql
ms.technology: ssma
ms.devlang: sql
ms.topic: reference
ms.date: 1/22/2020
ms.author: alexiva
ms.openlocfilehash: 6b571193d5c2717815372a8af82af2d6fa2ba345
ms.sourcegitcommit: 33f0f190f962059826e002be165a2bef4f9e350c
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 01/30/2021
ms.locfileid: "99187806"
---
# <a name="o2ss0264-unable-to-convert-cursor-or-cursor-variable-as-a-function-or-procedure-call-parameter-error"></a>O2SS0264: Unable to convert cursor or cursor variable as a function or procedure call parameter (Error)

This article describes why SQL Server Migration Assistant (SSMA) for Oracle does not convert the PL/SQL block when a cursor or cursor variable is passed as a parameter to a function or procedure call.

## <a name="background"></a>Background

A cursor is a mechanism by which you can assign a name to a `SELECT` statement and manipulate the information within that SQL statement. Cursors are used by database programmers to process individual rows returned by database system queries. In Oracle, `SYS_REFCURSOR` is used to pass cursors to and from a stored procedure.

When a cursor or cursor variable is passed as a parameter to a function or procedure call, SSMA cannot convert that statement and generates an error message.

## <a name="example"></a>Example

Consider the following example query, in which we have declared a variable as `SYS_REFCURSOR`:
```sql
CREATE OR REPLACE PROCEDURE p_close_refcursor
(
emp_refcur OUT SYS_REFCURSOR
)
AS
test_cursor SYS_REFCURSOR;
departmentno dept.deptno%TYPE;
BEGIN
OPEN
test_cursor
FOR
SELECT deptno
FROM dept;
LOOP
FETCH test_cursor
INTO departmentno;
EXIT WHEN test_cursor%NOTFOUND;
DBMS_OUTPUT.PUT_LINE(departmentno);
END LOOP;
emp_refcur := test_cursor;
CLOSE test_cursor;
END;
```
We then call this procedure, passing it a variable of type `SYS_REFCURSOR` to receive the cursor:
```sql
DECLARE
emp_cur SYS_REFCURSOR;
BEGIN
p_close_refcursor(emp_cur);
END;
```
When you try to convert the above code in SSMA, it generates the following error message:

> O2SS0264: Unable to convert cursor or cursor variable as a function or procedure call parameter.

## <a name="possible-remedies"></a>Possible Remedies

To resolve this error, you can first convert the procedure (`P_CLOSE_REFCURSOR`) to Transact-SQL using SSMA and make the following changes in the SQL code:

1. When SSMA converts the Oracle procedure to Transact-SQL, it converts the `CURSOR (@emp_refcur)` type to `varchar(8000)`. In SQL Server, the cursor data type can be declared in an `OUTPUT` parameter as follows: `@emp_refcur Cursor Varying OUTPUT`.

2. SSMA also initializes the variable `@emp_refcur` (which is of type `varchar(8000)`) with the value `NULL`. After changing the type, you must remove this initialization by commenting out the statement `SET @emp_refcur = NULL`.

To do this, update the SQL Server Transact-SQL code as follows:
```sql
CREATE PROCEDURE dbo.P_CLOSE_REFCURSOR
@emp_refcur Cursor Varying OUTPUT
AS
BEGIN
-- SET @emp_refcur = NULL
DECLARE
@test_cursor CURSOR,
@departmentno float(53)
SET @test_cursor =
CURSOR FOR
SELECT DEPT.DEPTNO
FROM dbo.DEPT
OPEN @test_cursor
WHILE 1 = 1
BEGIN
FETCH @test_cursor
INTO @departmentno
IF @@FETCH_STATUS <> 0
BREAK
PRINT @departmentno
END
SET @emp_refcur = @test_cursor
CLOSE @test_cursor
DEALLOCATE @test_cursor
END
```
Now you can use the following code to call the procedure above, passing it a cursor variable:
```sql
DECLARE @cursor_variable CURSOR
EXECUTE dbo.P_CLOSE_REFCURSOR @cursor_variable
```
## <a name="related-conversion-messages"></a>Related conversion messages

* [O2SS0094: Unable to convert cursor as parameter](o2ss0094.md)
* [O2SS0157: Dynamic string for OPEN ... FOR not converted](o2ss0157.md)
* [O2SS0245: Cursor conversion in return statements is not supported](o2ss0245.md)
<properties
    pageTitle="Get started with Azure Mobile Engagement for Windows Phone Silverlight apps"
    description="Learn how to use Azure Mobile Engagement for usage analytics and push notifications in Windows Phone Silverlight apps."
services="mobile-engagement"
documentationCenter="windows"
authors="piyushjo"
manager="dwrede"
editor="" />
<tags
ms.service="mobile-engagement"
ms.workload="mobile"
ms.tgt_pltfrm="mobile-windows-phone"
ms.devlang="dotnet"
ms.topic="hero-article"
ms.date="08/19/2016"
ms.author="piyushjo" />
# <a name="get-started-with-azure-mobile-engagement-for-windows-phone-silverlight-apps"></a>Get started with Azure Mobile Engagement for Windows Phone Silverlight apps
[AZURE.INCLUDE [Hero tutorial switcher](../../includes/mobile-engagement-hero-tutorial-switcher.md)]
This topic shows you how to use Azure Mobile Engagement to understand your app usage and send push notifications to segmented users of a Windows Phone Silverlight app.

This tutorial demonstrates a simple broadcast scenario using Mobile Engagement. You create a blank Windows Phone Silverlight app that collects basic data and receives push notifications using the Microsoft Push Notification Service (MPNS).

> [AZURE.NOTE] If you are targeting Windows Phone 8.1 (non-Silverlight), see the [Windows Universal tutorial](mobile-engagement-windows-store-dotnet-get-started.md).

This tutorial requires the following:

+ Visual Studio 2013
+ The [MicrosoftAzure.MobileEngagement] NuGet package

> [AZURE.NOTE] To complete this tutorial, you must have an active Azure account. If you don't have an account, you can create a free trial account in just a couple of minutes. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/?WT.mc_id=A0E0E5C02&returnurl=http%3A%2F%2Fazure.microsoft.com%2Fen-us%2Fdocumentation%2Farticles%2Fmobile-engagement-windows-phone-get-started).
##<a id="setup-azme"></a>Set up Mobile Engagement for your Windows Phone app
[AZURE.INCLUDE [Create Mobile Engagement App in Portal](../../includes/mobile-engagement-create-app-in-portal-new.md)]
##<a id="connecting-app"></a>Connect your app to the Mobile Engagement backend

This tutorial presents a "basic integration," which is the minimal set required to collect data and send a push notification. For the complete integration documentation, see [Mobile Engagement Windows Phone SDK integration](mobile-engagement-windows-phone-sdk-overview.md).

We will create a basic app with Visual Studio to demonstrate the integration.

###<a name="create-a-new-windows-phone-silverlight-project"></a>Create a new Windows Phone Silverlight project

The following steps assume the use of Visual Studio 2015, although the steps are similar in earlier versions of Visual Studio.

1. Start Visual Studio, and on the **Home** page, select **New Project**.

2. In the pop-up window, select **Windows 8** -> **Windows Phone** -> **Blank App (Windows Phone Silverlight)**. Fill in the app **Name** and the **Solution name**, and then click **OK**.

    ![][1]

3. You can choose to target either **Windows Phone 8.0** or **Windows Phone 8.1**.

You have now created a new Windows Phone Silverlight app into which we will integrate the Azure Mobile Engagement SDK.
###<a name="connect-your-app-to-the-mobile-engagement-backend"></a>Connect your app to the Mobile Engagement backend

1. Install the [MicrosoftAzure.MobileEngagement] NuGet package in your project.

2. Open `WMAppManifest.xml` (under the Properties folder) and make sure the following capabilities are present inside the `<Capabilities />` tag (add them if they are not):

        <Capability Name="ID_CAP_NETWORKING" />
        <Capability Name="ID_CAP_IDENTITY_DEVICE" />

    ![][2]

3. Now paste the Mobile Engagement app connection string that you copied earlier into the `Resources\EngagementConfiguration.xml` file, between the `<connectionString>` and `</connectionString>` tags:

    ![][3]
4. In the `App.xaml.cs` file:

    a. Add the `using` statement:

            using Microsoft.Azure.Engagement;

    b. Initialize the SDK in the `Application_Launching` method:

            private void Application_Launching(object sender, LaunchingEventArgs e)
            {
                EngagementAgent.Instance.Init();
            }

    c. Add the following to the `Application_Activated` function:

            private void Application_Activated(object sender, ActivatedEventArgs e)
            {
                EngagementAgent.Instance.OnActivated(e);
            }
##<a id="monitor"></a>Enable real-time monitoring

To start sending data and to ensure that the users are counted as active, you must send at least one screen (activity) to the Mobile Engagement backend.

1. In MainPage.xaml.cs, add the `using` statement:

        using Microsoft.Azure.Engagement;

2. Replace the base class of **MainPage**, which was **PhoneApplicationPage**, with **EngagementPage**.

        class MainPage : EngagementPage

3. In your `MainPage.xaml` file:

    a. Add this to your namespace declarations:

            xmlns:engagement="clr-namespace:Microsoft.Azure.Engagement;assembly=Microsoft.Azure.Engagement.EngagementAgent.WP"

    b. Replace the `phone:PhoneApplicationPage` XML tag name with `engagement:EngagementPage`.

##<a id="monitor"></a>Connect your app with real-time monitoring
[AZURE.INCLUDE [Connect app with real-time monitoring](../../includes/mobile-engagement-connect-app-with-monitor.md)]
##<a id="integrate-push"></a>Enable push notifications and in-app messaging

Mobile Engagement lets you interact with your users and reach them with push notifications and in-app messaging in the context of campaigns. This module is called REACH in the Mobile Engagement portal.

The following sections set up your app to receive them.

###<a name="enable-your-app-to-receive-mpns-push-notifications"></a>Enable your app to receive MPNS push notifications

Add the following new capabilities to your `WMAppManifest.xml` file:

        ID_CAP_PUSH_NOTIFICATION
        ID_CAP_WEBBROWSERCOMPONENT

![][5]
###<a name="initialize-the-reach-sdk"></a>Initialize the Reach SDK

1. In `App.xaml.cs`, call `EngagementReach.Instance.Init();` in the **Application_Launching** function, right after the agent initialization:

        private void Application_Launching(object sender, LaunchingEventArgs e)
        {
            EngagementAgent.Instance.Init();
            EngagementReach.Instance.Init();
        }

2. In `App.xaml.cs`, call `EngagementReach.Instance.OnActivated(e);` in the **Application_Activated** function, right after the agent initialization:

        private void Application_Activated(object sender, ActivatedEventArgs e)
        {
            EngagementAgent.Instance.OnActivated(e);
            EngagementReach.Instance.OnActivated(e);
        }

You are all set. Now let's verify that the basic integration works.

##<a id="send"></a>Send a notification to your app

[AZURE.INCLUDE [Create Windows Push campaign](../../includes/mobile-engagement-windows-push-campaign.md)]

You should now see a notification on your device: shown as an in-app notification if the app is open, or otherwise as a toast notification, like the following:
![][6]
<!-- URLs. -->
[MicrosoftAzure.MobileEngagement]: http://go.microsoft.com/?linkid=9874664
[Mobile Engagement Windows Phone SDK documentation]: ../mobile-engagement-windows-phone-integrate-engagement/
<!-- Images. -->
[1]: ./media/mobile-engagement-windows-phone-get-started/project-properties.png
[2]: ./media/mobile-engagement-windows-phone-get-started/wmappmanifest-capabilities.png
[3]: ./media/mobile-engagement-windows-phone-get-started/add-connection-string.png
[5]: ./media/mobile-engagement-windows-phone-get-started/reach-capabilities.png
[6]: ./media/mobile-engagement-windows-phone-get-started/push-screenshot.png
---
title: "Data Hub Pane"
parent: view-menu
menu_order: 15
description: "Describes the Data Hub pane in Mendix Studio Pro."
tags: ["studio Pro", "data hub", "data hub pane", "data hub catalog"]
---
## 1 Introduction
[Mendix Data Hub](/data-hub/) enables integration of available data sources from the different applications in an organization into your Mendix apps. This means that new apps can be created using shared datasets that are registered in the [Data Hub Catalog](/data-hub/data-hub-catalog/). In Studio Pro, this is possible using the integrated functionality of Data Hub Catalog through the **Data Hub** pane.
{{% alert type="info" %}}
You need a license to use Data Hub in Studio Pro. For further information see [Data Hub License](consumed-odata-service-requirements#license-limitations).
{{% /alert %}}
You can search in the Data Hub Catalog through the **Data Hub** pane to discover data sources that you can use in your app. Via this pane you can add the entities that are exposed in the registered OData services—called **Data Sources** in Data Hub—into your app's domain model. These entities are called [external entities](external-entities) and are different because they enable the connection to the data associated with the entities in the originating app.
{{% alert type="info" %}}
In the Data Hub Catalog, registered published services are referred to as *data sources* and exposed entities will show the **Entity set** name and are called *datasets.*
{{% /alert %}}
To display the **Data Hub** pane, click **View** > **Data Hub**.
## 2 Data Hub Pane Overview
The **Data Hub** pane is used to search the Data Hub Catalog for entities that can be dragged to the domain model and used in your app and also display the external entities and the associated services that are consumed in your current model.
The following functionality is available in the pane:
* [Search](#search) – Enter a search string of alphanumeric characters to search in the Data Hub Catalog. The search will be performed on services, entities, attributes, associations, and descriptions in the Catalog.
* [Filter](#filter) – By default, the search is performed on assets in the **Production** environment. Click the **Filter** icon to include all other environments such as test, acceptance and also the Mendix free app environment **Sandbox** in the search.
* [View information](#viewing) on the service, its entities, attributes, and associations – When you enter a search term and browse through services, you can view various information on them.
* [View services used in your app](#used-in-app) – Services and the entities that are currently being used in your app are displayed in the **Used in your App** section and are indicated with a green check-mark in the search results. For more information, see the [Used in Your App](#used-in-app) section below.
### 2.2 Used in Your App Section {#used-in-app}
When you do not enter search text in the **Data Hub** pane, the **Used in your App** section is displayed. This shows the consumed services and the external entities used in the current app. The list of entities, associations, and attributes for the consumed services is shown in the same way as for the search results:

For more information on how to add entities to your app, see [Adding an External Entity to an App](external-entities#adding-external-entities) section in *External Entities*.
## 3 Searching the Data Hub Catalog {#search}
As you enter a search term, all the items in the Data Hub Catalog satisfying the search string are listed in the search results. This includes words in the service, entity and attribute descriptions, which are not displayed in the **Data Hub** pane. For more information, see the [Selected Asset Details](/data-hub/data-hub-catalog/search#search-details) section in *Search in the Data Hub Catalog*.
You can drag the entity from the search results into your domain model and it will be added to your app and displayed as an [external entity](external-entities).
{{% alert type="info" %}}Services that are set to **not-Discoverable** in the Catalog are not be included in the search results for *any* user including owners of the service. To consume entities from these services owners must ensure that they are [Discoverable](/data-hub/data-hub-catalog/curate#discoverability).{{% /alert %}}
### 3.1 Wildcard Search
You can perform a wildcard search by entering `*` in the search field.
{{% alert type="info" %}}
The search strings must be a minimum of three alphanumeric characters. Punctuation cannot be used as part of the search term except for the wildcard character `*` to perform an "empty" search in the Data Hub Catalog. You cannot use the wildcard in combination with other characters. For further details, see [How to Search for Registered Assets](/data-hub/data-hub-catalog/search).
{{% /alert %}}
### 3.2 Filtering Environments {#filter}
By default, the search is performed on assets in the **Production** environment. To include all other environments such as test, acceptance, and also the Mendix free app environment, **Sandbox** in the search, click the **Filter** icon and check **Show development environments**:
{{% image_container width="300" %}}{{% /image_container %}}
{{% alert type="info" %}}
When the **Show development environments** is checked, all subsequent searches results will also include those in non-production environments.
{{% /alert %}}
## 4 Data Hub Pane Information {#viewing}
The information that is displayed in the **Data Hub** pane either when you enter a search term or when you open the **Used in your App** section is described in the sections below.
### 4.1 Services
The search results and the **Used in your App** section show the following information at a service level:
* **Service name**
* **Application icon** for the service (for example, Mendix, SAP, Siemens Teamcenter, or custom icons)
* **Service version**
* **Environment name** for non-production environments
{{% alert type="info" %}}Only the names of non-production environments are displayed. Services in the **Production** do not show an environment name. {{% /alert %}}
* **Green check-mark** if the service or entity is consumed in the app. If you right-click a consumed service, you can do the following:
* **View in Data Hub Catalog** – click this to go to the **Data Source Details** page in the Data Hub Catalog
* **Go to connection settings** – click this to open the [consumed OData service](consumed-odata-service) document

* **Gray shield icon** shows if the service or entity is validated in the Catalog
* **Update icon** is a blue arrow icon that indicates that there is another version of the consumed service available in the Data Hub. Click to update the service that is consumed in the app to the contract that is now available:

{{% alert type="info" %}}If there is an OData service update available, then the entities that are listed are those that are available in that version of the OData service. These entities are grayed-out to indicate that they cannot be dragged into the domain model, as the *current* contract that is consumed in the app does not have these entities. You must update the contract to the version shown in the search results by clicking the **Update** icon. {{% /alert %}}
{{% alert type="info" %}}The version number that is shown for the OData service is the latest one that is available in the Data Hub Catalog at the service endpoint – in the example above, version 1.0.0 of **BikeVehicleService** is currently consumed in the app, but the contract that is available in the Catalog is different to the one currently consumed.{{% /alert %}}
* **Information icon** allows you to view further details for the service and a link to go directly to the [Service Details](/data-hub/data-hub-catalog/search#search-details) screen in the Data Hub Catalog:

### 4.2 Entities, Attributes, and Associations {#association-attributes}
Entities, attributes, and associations are displayed under the service name.
For any service in the list, you can click **Show details** to see the full list of the exposed entities, associations, and attributes for that service.

{{% alert type="info" %}}The associations and attributes that are not supported in your Mendix app are shown as non-selectable (gray) and will not be included when you drag them into the domain model.{{% /alert %}}
#### 4.2.1 Entity
If you right-click an entity and select **View in Data Hub Catalog**, it will take you to the entity details page in the [Data Hub Catalog](/data-hub/data-hub-catalog/).
If you right-click a consumed entity and select **Go to entity**, it will take you to the entity in the domain model.
#### 4.2.2 Associations
The associations that are exposed in the services are listed before attributes in alphabetical order. You can click on the **+** to see the entity that the association is with.
**Multiple associations** between the same entities are shown before single associations.
In the following example, the entity **Customer** has multiple associations with the entity **Order**; however, these associations are not supported and cannot be used in your app:

#### 4.2.3 Attributes
Attributes for a service are listed in alphabetical order. If you right-click an attribute of a consumed entity and select **Go to attribute**, it takes you to the attribute in the domain model.
Unsupported attributes are grayed out and are not included in your app.
#### 4.2.4 CRUD Capabilities
If the entity, association, or attribute supports **C**reate, **R**ead, **U**pdate, or **D**elete capabilities and it is also supported by Studio Pro, then it is displayed in the **Data Hub** pane.
Entities and associations can have any of the CRUD capabilities, while attributes can only have create and update. For more information on CRUD capabilities, see [Write Data to Another App](/data-hub/write-data/).
## 5 Read More
* [Data Hub Catalog](/data-hub/data-hub-catalog)
* [External Entities](external-entities)
* [Consumed OData Service](consumed-odata-service)
* [How to Consume Registered Assets](/data-hub/data-hub-catalog/consume)
---
title: Datetime
page_title: Datetime Axis
position: 3
---
## Chart for Xamarin.iOS: Datetime Axis
<code>TKChartDateTimeAxis</code> categoric axis is an axis with NSDate values sorted chronologically. It also allows definitions of categories based on specific date time components – year, month, day etc. For example, if data values fall in the range of one year, the points can be plotted in categories for each month. If data values fall in the range of one month, the values can be categorized by days. It also introduces several important properties:
- <code>MajorTickInterval</code> - defines an interval between major axis ticks.
- <code>Baseline</code> - defines how the series data should be aligned. For example: The <code>TKChartBarSeries</code> might render its bars up and down depending on whether its value is greater or less than the baseline value.
- <code>Offset</code> - determines an axis value where the axis is crossed with another axis.
## Configure a TKChartDateTimeAxis##
You can configure a date-time axis by initializing it and setting it as the main x-axis or y-axis of the chart:
```C#
// TKChartDateTimeAxis works with NSDate values; Xamarin.iOS can cast a UTC DateTime to NSDate.
NSDate minDate = (NSDate)new DateTime(2016, 1, 1, 0, 0, 0, DateTimeKind.Utc);
NSDate maxDate = (NSDate)new DateTime(2016, 12, 31, 0, 0, 0, DateTimeKind.Utc);
TKChartDateTimeAxis xAxis = new TKChartDateTimeAxis(minDate, maxDate);
xAxis.MajorTickIntervalUnit = TKChartDateTimeAxisIntervalUnit.Months; // one category per month
xAxis.MajorTickInterval = 1;
```
You can define the axis categories by changing the interval unit property to one of the following values:
* *TKChartDateTimeAxisIntervalUnit.Seconds*
* *TKChartDateTimeAxisIntervalUnit.Minutes*
* *TKChartDateTimeAxisIntervalUnit.Hours*
* *TKChartDateTimeAxisIntervalUnit.Days*
* *TKChartDateTimeAxisIntervalUnit.Weeks*
* *TKChartDateTimeAxisIntervalUnit.Months*
* *TKChartDateTimeAxisIntervalUnit.Years*
* *TKChartDateTimeAxisIntervalUnit.Custom*

## Setting a plotting mode of axis##
The <code>TKChartAxisPlotMode</code> is used by the axis to plot the data. Possible values are <code>TKChartAxisPlotMode.BetweenTicks</code> and <code>TKChartAxisPlotMode.OnTicks</code>. <code>TKChartAxisPlotMode.BetweenTicks</code> plots points in the middle of the range, defined by two ticks. <code>OnTicks</code> plots the points over each tick.
You should use the following lines of code to alter this behavior:
```C#
xAxis.PlotMode = TKChartAxisPlotMode.BetweenTicks;
```

26699b9f126f77505d79e87b0aeb2e8454d58ca2 | 486 | md | Markdown | CHANGELOG.md | michdr/yasod | 5d28385a5e5a66d82e3303e65caf66b7ce1ac514 | [
"MIT"
] | null | null | null | CHANGELOG.md | michdr/yasod | 5d28385a5e5a66d82e3303e65caf66b7ce1ac514 | [
"MIT"
] | null | null | null | CHANGELOG.md | michdr/yasod | 5d28385a5e5a66d82e3303e65caf66b7ce1ac514 | [
"MIT"
] | null | null | null | # Changelog
## [0.1.1] - 2020-11-30
### Changed
- Fix miscellaneous errors from the recent refactoring
## [0.1.0] - 2020-11-30
### Added
- Count object detections' class ids `YasodModel.get_object_detections_class_ids_counts`.
- Count object detections' class names `YasodModel.get_object_detections_class_names_counts`.
## [0.0.2] - 2020-11-22
### Added
- Project metadata in `pyproject.toml`.
## [0.0.1] - 2020-11-22
### Added
- Initial release - containing basic functionality.
---
title: CustBillOfExchangeTrans in Transaction - Common Data Model | Microsoft Docs
description: undefined
author: llawwaii
ms.service: common-data-model
ms.reviewer: deonhe
ms.topic: article
ms.date: 8/7/2020
ms.author: weiluo
---
# Bill of exchange lines in Transaction(CustBillOfExchangeTrans)
Latest version of the JSON entity definition is available on <a href="https://github.com/Microsoft/CDM/tree/master/schemaDocuments/core/operationsCommon/Tables/Finance/AccountsReceivable/Transaction/CustBillOfExchangeTrans.cdm.json" target="_blank">GitHub</a>.
## Traits
<details>
<summary>Traits for this entity are listed below.
</summary>
**is.identifiedBy**
names a specific identity attribute to use with an entity <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>attribute</td><td>[CustBillOfExchangeTrans/(resolvedAttributes)/RecId](#RecId)</td><td>attribute</td><td></td></tr></table>
**is.CDM.entityVersion**
<table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>versionNumber</td><td>"1.1"</td><td>string</td><td>semantic version number of the entity</td></tr></table>
**is.application.releaseVersion**
<table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>releaseVersion</td><td>"10.0.13.0"</td><td>string</td><td>semantic version number of the application introducing this entity</td></tr></table>
**is.localized.displayedAs**
Holds the list of language specific display text for an object. <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>localizedDisplayText</td><td><table><tr><th>languageTag</th><th>displayText</th></tr><tr><td>en</td><td>Bill of exchange lines</td></tr></table></td><td>entity</td><td>a reference to the constant entity holding the list of localized text</td></tr></table>
</details>
## Attributes
|Name|Description|First Included in Instance|
|---|---|---|
|[RecId](#RecId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[AmountCur](#AmountCur)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[BankRemittanceType](#BankRemittanceType)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[BillOfExchangeId](#BillOfExchangeId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[CurrencyCode](#CurrencyCode)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[CustAccount](#CustAccount)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[DueDate](#DueDate)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[ProtestReason](#ProtestReason)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[RemittedToBankAccountId](#RemittedToBankAccountId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[SeqNum](#SeqNum)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Status](#Status)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[TransDate](#TransDate)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Voucher](#Voucher)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[DataAreaId](#DataAreaId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Relationship_BankAccountTableRelationshipId](#Relationship_BankAccountTableRelationshipId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Relationship_CurrencyRelationshipId](#Relationship_CurrencyRelationshipId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Relationship_CustBillOfExchangeJourRelationshipId](#Relationship_CustBillOfExchangeJourRelationshipId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Relationship_CustTableRelationshipId](#Relationship_CustTableRelationshipId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
|[Relationship_CompanyRelationshipId](#Relationship_CompanyRelationshipId)||<a href="CustBillOfExchangeTrans.md" target="_blank">Transaction/CustBillOfExchangeTrans</a>|
### <a href=#RecId name="RecId">RecId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>isPrimaryKey</td><td>true</td></tr><tr><td>dataFormat</td><td>int64</td></tr><tr><td>isReadOnly</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the RecId attribute are listed below.</summary>
**is.dataFormat.integer**
**is.dataFormat.big**
**is.identifiedBy**
names a specific identity attribute to use with an entity <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>attribute</td><td>[CustBillOfExchangeTrans/(resolvedAttributes)/RecId](#RecId)</td><td>attribute</td><td></td></tr></table>
**is.readOnly**
**is.dataFormat.integer**
**is.dataFormat.big**
</details>
### <a href=#AmountCur name="AmountCur">AmountCur</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>decimal</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the AmountCur attribute are listed below.</summary>
**is.dataFormat.numeric.shaped**
for setting the exact precision and scale of numeric values
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.numeric.shaped**
for setting the exact precision and scale of numeric values
</details>
### <a href=#BankRemittanceType name="BankRemittanceType">BankRemittanceType</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>int32</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the BankRemittanceType attribute are listed below.</summary>
**is.dataFormat.integer**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.integer**
</details>
### <a href=#BillOfExchangeId name="BillOfExchangeId">BillOfExchangeId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>string</td></tr></table>
#### Traits
<details>
<summary>List of traits for the BillOfExchangeId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#CurrencyCode name="CurrencyCode">CurrencyCode</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>string</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the CurrencyCode attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#CustAccount name="CustAccount">CustAccount</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>string</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the CustAccount attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#DueDate name="DueDate">DueDate</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>date</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the DueDate attribute are listed below.</summary>
**is.dataFormat.date**
**means.measurement.date**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.date**
</details>
### <a href=#ProtestReason name="ProtestReason">ProtestReason</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>int32</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the ProtestReason attribute are listed below.</summary>
**is.dataFormat.integer**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.integer**
</details>
### <a href=#RemittedToBankAccountId name="RemittedToBankAccountId">RemittedToBankAccountId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>string</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the RemittedToBankAccountId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#SeqNum name="SeqNum">SeqNum</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>int32</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the SeqNum attribute are listed below.</summary>
**is.dataFormat.integer**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.integer**
</details>
### <a href=#Status name="Status">Status</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>int32</td></tr><tr><td>isReadOnly</td><td>true</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Status attribute are listed below.</summary>
**is.dataFormat.integer**
**is.readOnly**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.integer**
</details>
### <a href=#TransDate name="TransDate">TransDate</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>date</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the TransDate attribute are listed below.</summary>
**is.dataFormat.date**
**means.measurement.date**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.date**
</details>
### <a href=#Voucher name="Voucher">Voucher</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>string</td></tr><tr><td>isNullable</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Voucher attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.nullable**
The attribute value may be set to NULL.
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#DataAreaId name="DataAreaId">DataAreaId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>string</td></tr><tr><td>isReadOnly</td><td>true</td></tr></table>
#### Traits
<details>
<summary>List of traits for the DataAreaId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.readOnly**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#Relationship_BankAccountTableRelationshipId name="Relationship_BankAccountTableRelationshipId">Relationship_BankAccountTableRelationshipId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>guid</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Relationship_BankAccountTableRelationshipId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.dataFormat.guid**
**means.identity.entityId**
**is.linkedEntity.identifier**
Marks the attribute(s) that hold foreign key references to a linked (used as an attribute) entity. This attribute is added to the resolved entity to enumerate the referenced entities. <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>entityReferences</td><td><table><tr><th>entityReference</th><th>attributeReference</th></tr><tr><td><a href="../../Bank/Main/BankAccountTable.md" target="_blank">/core/operationsCommon/Tables/Finance/Bank/Main/BankAccountTable.cdm.json/BankAccountTable</a></td><td><a href="../../Bank/Main/BankAccountTable.md#RecId" target="_blank">RecId</a></td></tr></table></td><td>entity</td><td>a reference to the constant entity holding the list of entity references</td></tr></table>
**is.dataFormat.guid**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#Relationship_CurrencyRelationshipId name="Relationship_CurrencyRelationshipId">Relationship_CurrencyRelationshipId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>guid</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Relationship_CurrencyRelationshipId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.dataFormat.guid**
**means.identity.entityId**
**is.linkedEntity.identifier**
Marks the attribute(s) that hold foreign key references to a linked (used as an attribute) entity. This attribute is added to the resolved entity to enumerate the referenced entities. <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>entityReferences</td><td><table><tr><th>entityReference</th><th>attributeReference</th></tr><tr><td><a href="../../../Common/Currency/Group/Currency.md" target="_blank">/core/operationsCommon/Tables/Common/Currency/Group/Currency.cdm.json/Currency</a></td><td><a href="../../../Common/Currency/Group/Currency.md#RecId" target="_blank">RecId</a></td></tr></table></td><td>entity</td><td>a reference to the constant entity holding the list of entity references</td></tr></table>
**is.dataFormat.guid**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#Relationship_CustBillOfExchangeJourRelationshipId name="Relationship_CustBillOfExchangeJourRelationshipId">Relationship_CustBillOfExchangeJourRelationshipId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>guid</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Relationship_CustBillOfExchangeJourRelationshipId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.dataFormat.guid**
**means.identity.entityId**
**is.linkedEntity.identifier**
Marks the attribute(s) that hold foreign key references to a linked (used as an attribute) entity. This attribute is added to the resolved entity to enumerate the referenced entities. <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>entityReferences</td><td><table><tr><th>entityReference</th><th>attributeReference</th></tr><tr><td><a href="CustBillOfExchangeJour.md" target="_blank">/core/operationsCommon/Tables/Finance/AccountsReceivable/Transaction/CustBillOfExchangeJour.cdm.json/CustBillOfExchangeJour</a></td><td><a href="CustBillOfExchangeJour.md#RecId" target="_blank">RecId</a></td></tr></table></td><td>entity</td><td>a reference to the constant entity holding the list of entity references</td></tr></table>
**is.dataFormat.guid**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#Relationship_CustTableRelationshipId name="Relationship_CustTableRelationshipId">Relationship_CustTableRelationshipId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>guid</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Relationship_CustTableRelationshipId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.dataFormat.guid**
**means.identity.entityId**
**is.linkedEntity.identifier**
Marks the attribute(s) that hold foreign key references to a linked (used as an attribute) entity. This attribute is added to the resolved entity to enumerate the referenced entities. <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>entityReferences</td><td><table><tr><th>entityReference</th><th>attributeReference</th></tr><tr><td><a href="../../../Common/Customer/Main/CustTable.md" target="_blank">/core/operationsCommon/Tables/Common/Customer/Main/CustTable.cdm.json/CustTable</a></td><td><a href="../../../Common/Customer/Main/CustTable.md#RecId" target="_blank">RecId</a></td></tr></table></td><td>entity</td><td>a reference to the constant entity holding the list of entity references</td></tr></table>
**is.dataFormat.guid**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
### <a href=#Relationship_CompanyRelationshipId name="Relationship_CompanyRelationshipId">Relationship_CompanyRelationshipId</a>
First included in: Transaction/CustBillOfExchangeTrans (this entity)
#### Properties
<table><tr><th>Name</th><th>Value</th></tr><tr><td>dataFormat</td><td>guid</td></tr></table>
#### Traits
<details>
<summary>List of traits for the Relationship_CompanyRelationshipId attribute are listed below.</summary>
**is.dataFormat.character**
**is.dataFormat.big**
**is.dataFormat.array**
**is.dataFormat.guid**
**means.identity.entityId**
**is.linkedEntity.identifier**
Marks the attribute(s) that hold foreign key references to a linked (used as an attribute) entity. This attribute is added to the resolved entity to enumerate the referenced entities. <table><tr><th>Parameter</th><th>Value</th><th>Data type</th><th>Explanation</th></tr><tr><td>entityReferences</td><td><table><tr><th>entityReference</th><th>attributeReference</th></tr><tr><td><a href="../../Ledger/Main/CompanyInfo.md" target="_blank">/core/operationsCommon/Tables/Finance/Ledger/Main/CompanyInfo.cdm.json/CompanyInfo</a></td><td><a href="../../Ledger/Main/CompanyInfo.md#RecId" target="_blank">RecId</a></td></tr></table></td><td>entity</td><td>a reference to the constant entity holding the list of entity references</td></tr></table>
**is.dataFormat.guid**
**is.dataFormat.character**
**is.dataFormat.array**
</details>
---
title: Tutorial - Configure blue-green deployments for Azure Linux virtual machines
description: In this tutorial, you learn how to set up a continuous deployment (CD) pipeline. The pipeline updates a group of Azure Linux virtual machines by using the blue-green deployment strategy.
author: moala
tags: azure-devops-pipelines
ms.assetid: ''
ms.service: virtual-machines
ms.collection: linux
ms.topic: tutorial
ms.tgt_pltfrm: azure-pipelines
ms.workload: infrastructure
ms.date: 4/10/2020
ms.author: moala
ms.custom: devops
ms.openlocfilehash: 4545891cce926f049673cd2c2380a8309f2e71a1
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 03/30/2021
ms.locfileid: "102552587"
---
# <a name="tutorial---configure-the-blue-green-deployment-strategy-for-azure-linux-virtual-machines"></a>Oktatóanyag – az Azure Linux rendszerű virtuális gépekhez készült Blue-Green üzembe helyezési stratégia konfigurálása
## <a name="infrastructure-as-a-service-iaas---configure-cicd"></a>Infrastruktúra-szolgáltatás (IaaS) – CI/CD konfigurálása
Az Azure-folyamatok teljes körű CI/CD Automation-eszközöket biztosítanak a virtuális gépekhez való üzembe helyezéshez. Az Azure-beli virtuális gépek folyamatos kézbesítési folyamatát a Azure Portal is konfigurálhatja.
Ez a cikk bemutatja, hogyan állíthat be egy olyan CI/CD-folyamatot, amely a MultiMachine-környezetekben a kék-zöld stratégiát használja. A Azure Portal más stratégiákat is támogat, mint például a [Rolling](./tutorial-devops-azure-pipelines-classic.md) és a [Canary](./tutorial-azure-devops-canary-strategy.md).
### <a name="configure-cicd-on-virtual-machines"></a>CI/CD konfigurálása virtuális gépeken
A virtuális gépeket célként adhatja hozzá egy [központi telepítési csoporthoz](/azure/devops/pipelines/release/deployment-groups). Ezután megcélozhatja őket a MultiMachine-frissítésekhez. A gépekre való központi telepítés után megtekintheti a telepítési **előzményeket** egy központi telepítési csoportban. Ez a nézet lehetővé teszi a virtuális gép és a folyamat közötti nyomkövetést, majd a véglegesítés elvégzését.
### <a name="blue-green-deployments"></a>Kék-zöld üzembe helyezések
A kék-zöld telepítés csökkenti az állásidőt egy azonos készenléti környezettel. Egyszerre csak egy környezet él.
Az új kiadásra való felkészülés során végezze el a tesztelés utolsó szakaszát a zöld környezetben. Miután a szoftver működik a zöld környezetben, váltson át a forgalomra, hogy minden bejövő kérelem a zöld környezetbe lépjen. A kék környezet inaktív.
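The traffic switch described above is often implemented by repointing the router or web server at the other environment in a single step. A minimal sketch of the idea using a symlink flip; the paths and directory layout are illustrative assumptions, not part of the Azure Pipelines setup:

```shell
#!/bin/bash
set -euo pipefail

# Two identical environments; only the one behind the "live" link serves traffic.
mkdir -p /tmp/bluegreen/blue /tmp/bluegreen/green
echo "blue"  > /tmp/bluegreen/blue/version
echo "green" > /tmp/bluegreen/green/version

# Blue is live while the new release is staged and tested in green.
ln -sfn /tmp/bluegreen/blue /tmp/bluegreen/live

# Cutover: repoint "live" so all incoming requests go to green; blue goes idle.
ln -sfn /tmp/bluegreen/green /tmp/bluegreen/live
cat /tmp/bluegreen/live/version   # -> green
```

Rolling back is the same flip in reverse, which is why the idle environment is kept intact until the new release is confirmed.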
By using the continuous delivery option in the Azure portal, you can configure blue-green environments for your virtual machines. Here is the step-by-step walkthrough:
1. Jelentkezzen be a Azure Portalba, és navigáljon a virtuális géphez.
1. A virtuális gép beállításainak bal szélső paneljén válassza a **folyamatos kézbesítés** lehetőséget. Ezután válassza a **Konfigurálás** lehetőséget.

1. A konfiguráció panelen válassza az **Azure DevOps Organization** lehetőséget egy meglévő fiók kiválasztásához, vagy hozzon létre újat. Ezután válassza ki azt a projektet, amelyben be szeretné állítani a folyamatot.

1. A központi telepítési csoport a központi telepítési célszámítógépek logikai készlete, amely a fizikai környezeteket jelképezi. Példák a fejlesztési, tesztelési, ellenőrzését és éles környezetekre. Létrehozhat egy új központi telepítési csoportot, vagy kijelölhet egy meglévőt is.
1. Válassza ki a Build folyamatot, amely közzéteszi a virtuális gépre telepítendő csomagot. A közzétett csomagnak rendelkeznie kell egy deploy.ps1 vagy deploy.sh nevű telepítési parancsfájllal a csomag gyökérkönyvtárában található deployscripts mappában. A folyamat futtatja ezt az üzembe helyezési parancsfájlt.
1. A **központi telepítési stratégia** területen válassza a **kék-zöld** lehetőséget.
1. Adjon hozzá egy "kék" vagy "zöld" címkét olyan virtuális gépekhez, amelyek a kék-zöld környezetek részét képezik. Ha egy virtuális gép készenléti szerepkörhöz kapcsolódik, a címke "zöld". Ellenkező esetben címkézse "Blue"-ként.

1. Kattintson az **OK** gombra a folyamatos kézbesítési folyamat konfigurálásához a virtuális gépen való üzembe helyezéshez.

1. Megjelenik a virtuális gép központi telepítési adatai. A hivatkozásra kattintva megtekintheti az Azure DevOps kiadási folyamatát. A folyamat konfigurációjának megtekintéséhez a kiadási folyamatban válassza a **Szerkesztés** lehetőséget. A folyamatnak a következő három fázisa van:
1. Ez a fázis egy üzembe helyezési csoport fázisa. Az alkalmazások üzembe helyezése készenléti virtuális gépekre történik, amelyek "zöld" néven vannak megjelölve.
1. Ebben a fázisban a folyamat szünetel, és megvárja a manuális beavatkozást a Futtatás folytatásához. A felhasználók akkor is folytathatják a folyamatot, ha manuálisan biztosítják a központi telepítés stabilitását a "zöld" címkével jelölt virtuális gépekre.
1. Ez a fázis a virtuális gépek "kék" és "zöld" címkéit cseréli le. Ez biztosítja, hogy a régebbi verziójú alkalmazásokkal rendelkező virtuális gépek már "zöld" címkével legyenek megjelölve. A következő folyamat futtatásakor az alkalmazások a virtuális gépekre lesznek telepítve.

1. Az üzembe helyezési parancsfájl végrehajtása feladat alapértelmezés szerint futtatja az üzembe helyezési parancsfájlt deploy.ps1 vagy deploy.sh. A parancsfájl a közzétett csomag gyökérkönyvtárában lévő deployscripts mappában található. Győződjön meg arról, hogy a kiválasztott build-folyamat közzéteszi a központi telepítést a csomag gyökérkönyvtárában.

## <a name="other-deployment-strategies"></a>Egyéb központi telepítési stratégiák
- [A működés közbeni üzembe helyezési stratégia konfigurálása](./tutorial-devops-azure-pipelines-classic.md)
- [A Kanári-telepítési stratégia konfigurálása](./tutorial-azure-devops-canary-strategy.md)
## <a name="azure-devops-projects"></a>Azure DevOps Projects
Az Azure-t egyszerűen megteheti. A Azure DevOps Projects használatával a következő parancs kiválasztásával megkezdheti az alkalmazás futtatását bármely Azure-szolgáltatásban mindössze három lépésben:
- Alkalmazás nyelve
- Egy futtatókörnyezet
- Azure-szolgáltatás
[További információk](https://azure.microsoft.com/features/devops-projects/).
## <a name="additional-resources"></a>További források
- [Üzembe helyezés az Azure-beli virtuális gépeken Azure DevOps Projects használatával](../../devops-project/azure-devops-project-vms.md)
- [Az alkalmazás folyamatos üzembe helyezésének megvalósítása egy Azure virtuálisgép-méretezési csoportba](/azure/devops/pipelines/apps/cd/azure/deploy-azure-scaleset) | 82.076087 | 416 | 0.819494 | hun_Latn | 1.000007 |
---
title: Compiler Error CS0019
ms.date: 07/20/2015
f1_keywords:
- CS0019
helpviewer_keywords:
- CS0019
ms.assetid: 5a25be41-535b-4850-a230-9a385e01fd20
ms.openlocfilehash: 22060d0e09a53096665cf50f4538cecbc37b9d95
ms.sourcegitcommit: 4a8c2b8d0df44142728b68ebc842575840476f6d
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 03/28/2019
ms.locfileid: "58545661"
---
# <a name="compiler-error-cs0019"></a>Erreur du compilateur CS0019
L’opérateur 'operator' ne peut pas être appliqué aux opérandes de type 'type' et 'type'
Un opérateur binaire s’applique à des types de données qui ne le prennent pas en charge. Par exemple, vous ne pouvez pas utiliser l’opérateur [||](../../../csharp/language-reference/operators/conditional-or-operator.md) sur des chaînes ; par ailleurs, l’opérateur [+](../../../csharp/language-reference/operators/addition-operator.md), [-](../../../csharp/language-reference/operators/subtraction-operator.md), [\<](../../../csharp/language-reference/operators/less-than-operator.md) ou [>](../../../csharp/language-reference/operators/greater-than-operator.md) ne peut pas être employé sur des variables [bool](../../../csharp/language-reference/keywords/bool.md), et l’opérateur [==](../../../csharp/language-reference/operators/equality-operators.md#equality-operator-) ne peut pas être utilisé avec un type `struct` à moins qu’il ne surcharge explicitement cet opérateur.
Si vous rencontrez cette erreur avec un type de classe, cela signifie que la classe ne surcharge pas l’opérateur. Pour plus d’informations, consultez l’article sur le mot clé [operator](../../../csharp/language-reference/keywords/operator.md) et l’article [Opérateurs surchargeables](../../../csharp/programming-guide/statements-expressions-operators/overloadable-operators.md).
## <a name="example"></a>Exemple
Dans l’exemple suivant, l’erreur CS0019 est générée à deux emplacements, car [bool](../../../csharp/language-reference/keywords/bool.md) dans C# n’est pas convertible en [int](../../../csharp/language-reference/keywords/int.md). L’erreur CS0019 est également générée quand l’opérateur de soustraction est appliqué à une chaîne. L’opérateur d’addition (+) peut être utilisé avec des opérandes de chaîne, car il est surchargé par la classe `String` pour permettre la concaténation de chaînes.
```csharp
static void Main()
{
bool result = true;
if (result > 0) //CS0019
{
// Do something.
}
int i = 1;
// You cannot compare an integer and a boolean value.
if (i == true) //CS0019
{
//Do something...
}
// The following use of == causes no error. It is the comparison of
// an integer and a boolean value that causes the error in the
// previous if statement.
if (result == true)
{
//Do something...
}
string s = "Just try to subtract me.";
float f = 100 - s; // CS0019
}
```
## <a name="example"></a>Exemple
Dans l’exemple suivant, la logique conditionnelle doit être spécifiée en dehors de <xref:System.Diagnostics.ConditionalAttribute>. Vous ne pouvez passer qu’un symbole prédéfini à <xref:System.Diagnostics.ConditionalAttribute>.
L’exemple suivant génère l’erreur CS0019.
```csharp
// CS0019_a.cs
// compile with: /target:library
using System.Diagnostics;
public class MyClass
{
[ConditionalAttribute("DEBUG" || "TRACE")] // CS0019
public void TestMethod() {}
// OK
[ConditionalAttribute("DEBUG"), ConditionalAttribute("TRACE")]
public void TestMethod2() {}
}
```
## <a name="see-also"></a>Voir aussi
- [Opérateurs](../../../csharp/programming-guide/statements-expressions-operators/operators.md)
- [Tableau des conversions numériques implicites](../../../csharp/language-reference/keywords/implicit-numeric-conversions-table.md)
# pdf-invoice-converter
# vldsp
vldsp is a wrapper of liquid-dsp (liquid-sdr)
## Install
First, you need to install the `liquid-sdr` and `fftw3` libraries.
```
libliquid-dev - signal processing library for software defined radio (development files)
libliquid2d - signal processing library for software defined radio
libfftw3-dev - Library for computing Fast Fourier Transforms - development
```
Go to command line terminal:
```
sudo apt-get install libliquid-dev libliquid2d libfftw3-dev
```
### <img src="https://seleniumbase.io/img/logo3a.png" title="SeleniumBase" width="28" /> SeleniumBase Workflows
> **Table of Contents / Navigation:**
> - [**CI build**](workflows/python-package.yml)
---
description: SCHEMA_NAME (Transact-SQL)
title: SCHEMA_NAME (Transact-SQL) | Microsoft Docs
ms.custom: ''
ms.date: 03/03/2017
ms.prod: sql
ms.prod_service: database-engine, sql-database, sql-data-warehouse, pdw
ms.reviewer: ''
ms.technology: t-sql
ms.topic: reference
f1_keywords:
- SCHEMA_NAME
- SCHEMA_NAME_TSQL
dev_langs:
- TSQL
helpviewer_keywords:
- SCHEMA_NAME function
- schemas [SQL Server], names
ms.assetid: 20071b77-2b6e-4ce7-a8e3-fa71480baf73
author: julieMSFT
ms.author: jrasnick
monikerRange: '>=aps-pdw-2016||=azuresqldb-current||=azure-sqldw-latest||>=sql-server-2016||>=sql-server-linux-2017||=azuresqldb-mi-current'
ms.openlocfilehash: 52c3d7527494c8bb4ac92a1bd083be7ce0b53908
ms.sourcegitcommit: 33f0f190f962059826e002be165a2bef4f9e350c
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 01/30/2021
ms.locfileid: "99183483"
---
# <a name="schema_name-transact-sql"></a>SCHEMA_NAME (Transact-SQL)
[!INCLUDE [sql-asdb-asdbmi-asa-pdw](../../includes/applies-to-version/sql-asdb-asdbmi-asa-pdw.md)]
Gibt den einer Schema-ID zugeordneten Schemanamen zurück.
 [Transact-SQL-Syntaxkonventionen](../../t-sql/language-elements/transact-sql-syntax-conventions-transact-sql.md)
## <a name="syntax"></a>Syntax
```syntaxsql
SCHEMA_NAME ( [ schema_id ] )
```
[!INCLUDE[sql-server-tsql-previous-offline-documentation](../../includes/sql-server-tsql-previous-offline-documentation.md)]
## <a name="arguments"></a>Argumente
|Begriff|Definition|
|----------|----------------|
|*schema_id*|Die ID des Schemas. *schema_id* ist vom Datentyp **int**. Ist *schema_id* nicht definiert, gibt SCHEMA_NAME den Namen des Standardschemas des Aufrufers zurück.|
## <a name="return-types"></a>Rückgabetypen
**sysname**
Gibt NULL zurück, wenn *schema_id* keine gültige ID ist.
## <a name="remarks"></a>Hinweise
SCHEMA_NAME gibt Namen von Systemschemas und benutzerdefinierten Schemas zurück. SCHEMA_NAME kann in einer SELECT-Liste, in einer WHERE-Klausel und überall dort aufgerufen werden, wo ein Ausdruck zulässig ist.
## <a name="examples"></a>Beispiele
### <a name="a-returning-the-name-of-the-default-schema-of-the-caller"></a>A. Zurückgeben des Namens des Standardschemas des Aufrufers
```sql
SELECT SCHEMA_NAME();
```
### <a name="b-returning-the-name-of-a-schema-by-using-an-id"></a>B. Zurückgeben des Namens eines Schemas mithilfe einer ID
```sql
SELECT SCHEMA_NAME(1);
```
## <a name="see-also"></a>Weitere Informationen
[Ausdrücke (Transact-SQL)](../../t-sql/language-elements/expressions-transact-sql.md)
[SCHEMA_ID (Transact-SQL)](../../t-sql/functions/schema-id-transact-sql.md)
[sys.schemas (Transact-SQL)](../../relational-databases/system-catalog-views/schemas-catalog-views-sys-schemas.md)
[sys.database_principals (Transact-SQL)](../../relational-databases/system-catalog-views/sys-database-principals-transact-sql.md)
[Metadata Functions (Transact-SQL) (Metadatenfunktionen (Transact-SQL))](../../t-sql/functions/metadata-functions-transact-sql.md)
[WHERE (Transact-SQL)](../../t-sql/queries/where-transact-sql.md)
---
Description: The midi/in device class consists of MIDI sequencers that are used for input. You access these devices by using the MIDI functions, which are described in the Platform Software Development Kit (SDK).
ms.assetid: 8997a391-bf61-4ec9-8ffc-fe3e6b92d63a
title: midi/in
ms.topic: article
ms.date: 05/31/2018
---
# midi/in
The midi/in device class consists of MIDI sequencers that are used for input. You access these devices by using the MIDI functions, which are described in the Platform Software Development Kit (SDK).
The [**lineGetID**](/windows/desktop/api/Tapi/nf-tapi-linegetid) and [**phoneGetID**](/windows/desktop/api/Tapi/nf-tapi-phonegetid) functions fill a [**VARSTRING**](/windows/desktop/api/Tapi/ns-tapi-varstring) structure, setting the **dwStringFormat** member to the STRINGFORMAT\_BINARY value and appending this additional member:
``` syntax
DWORD DeviceId; // identifier of MIDI device
```
The **DeviceId** member is the identifier of a closed MIDI device. You use this identifier in a call to the [**midiInOpen**](https://msdn.microsoft.com/en-us/library/Dd798458(v=VS.85).aspx) function to open the device for input. You can use the resulting device handle to record MIDI data from the line or phone device.
For more information about the MIDI functions, see [**Multimedia Functions**](https://msdn.microsoft.com/en-us/library/Dd743586(v=VS.85).aspx).
---
author: vladvino
ms.service: api-management
ms.topic: include
ms.date: 11/09/2018
ms.author: vlvinogr
ms.openlocfilehash: 1858317d40efa59b188ce894534be93a1f11b287
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: nl-NL
ms.lasthandoff: 03/29/2021
ms.locfileid: "96027546"
---
## <a name="how-apim-proxy-server-responds-with-ssl-certificates-in-the-tls-handshake"></a>Hoe APIM Proxy Server reageert met SSL-certificaten in de TLS-handshake
### <a name="clients-calling-with-sni-header"></a>Clientoproepen met SNI-header
Als de klant een of meer aangepaste domeinen heeft geconfigureerd voor Proxy, kan APIM reageren op HTTP-aanvragen van de aangepaste domeinen (bijvoorbeeld contoso.com) en van het standaarddomein (bijvoorbeeld apim-service-name.azure-api.net). Op basis van de informatie in de SNI-header (servernaamindicatie) reageert APIM met het juiste servercertificaat.
### <a name="clients-calling-without-sni-header"></a>Clientoproepen zonder SNI-header
Als de klant een client gebruikt, waardoor de [SNI](https://tools.ietf.org/html/rfc6066#section-3)-header niet wordt verzonden, maakt APIM antwoorden op basis van de volgende logica:
* Als met de service slechts één aangepast domein is geconfigureerd voor Proxy, is het standaardcertificaat het certificaat dat is verleend aan het aangepaste Proxy-domein.
* Als met de service meerdere aangepaste domeinen zijn geconfigureerd voor Proxy (ondersteund in de lagen **Ontwikkelaar** en **Proxy**), kan de klant bepalen welk certificaat het standaardcertificaat is. Als u het standaardcertificaat wilt instellen, moet de eigenschap [defaultSslBinding](/rest/api/apimanagement/2019-12-01/apimanagementservice/createorupdate#hostnameconfiguration) zijn ingesteld op Waar ("defaultSslBinding":"true"). Als de klant de eigenschap niet instelt, is het standaardcertificaat het certificaat dat is verleend aan het standaard-Proxy-domein dat wordt gehost op *.azure-api.net.
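For illustration, a `hostnameConfigurations` fragment of an API Management service resource that marks a custom proxy domain as the default SSL binding might look like the following. The hostname and Key Vault secret URL are placeholders; the property names follow the API Management REST API's hostnameConfiguration schema.

```json
{
  "hostnameConfigurations": [
    {
      "type": "Proxy",
      "hostName": "api.contoso.com",
      "keyVaultId": "https://contoso-vault.vault.azure.net/secrets/api-contoso-com",
      "defaultSslBinding": true,
      "negotiateClientCertificate": false
    }
  ]
}
```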
## <a name="support-for-putpost-request-with-large-payload"></a>Ondersteuning voor PUT/POST-aanvragen met grote nettolading
De APIM Proxy-server biedt ondersteuning voor grote nettoladingen, bij het gebruik van certificaten aan de clientzijde in HTTPS (bijvoorbeeld nettolading > 40 kB). Als u wilt voorkomen dat de aanvraag van de server blijft hangen, kunnen klanten de eigenschap ["negotiateClientCertificate": "true"](/rest/api/apimanagement/2019-12-01/ApiManagementService/CreateOrUpdate#hostnameconfiguration) instellen op de Proxy-hostnaam. Als de eigenschap is ingesteld op Waar, wordt het clientcertificaat aangevraagd op de tijd van de SSL/TLS-verbinding, vóór HTTP-aanvragen worden uitgewisseld. Omdat de instelling van toepassing is op het niveau **Proxy-hostnaam**, vragen alle verbindingsaanvragen om het clientcertificaat. Klanten kunnen maximaal 20 aangepaste domeinen voor Proxy configureren (alleen ondersteund in de laag **Premium**) en omzeilen deze beperking. | 109.814815 | 856 | 0.814165 | nld_Latn | 0.997287 |
# Boku wa Nandomo Umarekawaru

- **type**: light-novel
- **volumes**: 1
- **chapters**: 7
- **original-name**: 僕は何度も生まれ変わる
- **start-date**: 2018-07-01
## Tags
- fantasy
## Authors
- Jyumonji
- Ao (Story)
- da-kuro (Art)
## Links
- [My Anime list](https://myanimelist.net/manga/115144/Boku_wa_Nandomo_Umarekawaru)
# OVA image running a TIG Stack (Telegraf, InfluxDB, and Grafana) on Ubuntu 18.04 LTS
TIG Stack for Cisco Call Manager Perfmon and Risport data
* [Download OVA](https://github.com/sieteunoseis/cucm_tig_dashboard_ubuntu/releases/download/v1.0/cucm_tig_ubuntu.ova)

## Access/Credentials
### Ubuntu OS
* username: dashadmin
* password: *TietERison*
### InfluxDB
* http://localhost:8086/
* username: none
* password: none
### Grafana
* http://localhost:3000/
* username: admin
* password: OmituRepRa
## Ubuntu Settings
### View IP Address
```
$ ip addr show ens33 | grep inet | awk '{ print $2; }' | sed 's/\/.*$//'
```
### Set static IP address
https://www.howtoforge.com/linux-basics-set-a-static-ip-on-ubuntu
### PM2 Commands (Process manager running python scripts)
PM2 is already set up to run scripts against Cisco's [DevNet 12.5 Collaboration Sandbox](https://devnetsandbox.cisco.com/). You'll need to delete all the processes and add new ones for your IP addresses. I created an Excel spreadsheet to help create the command lines for this step. The Jabber Registration example is below. All examples are set to run every 5 minutes.
[Excel Helper](https://github.com/sieteunoseis/cucm_tig_dashboard_ubuntu/blob/master/PM2_Helper.xlsx)
```
$ pm2 [list|ls|status]
$ pm2 flush
$ pm2 log
$ pm2 restart app_name
$ pm2 reload app_name
$ pm2 stop app_name
$ pm2 delete app_name
$ pm2 start perfmon_arg.py --interpreter python3 --name thread --cron '*/5 * * * *' --no-autorestart -- -ip 10.10.20.1 10.10.20.2 -u administrator -p ciscopsdt -c 'Cisco CallManager'
$ pm2 start cisco_axl_jabber.py --interpreter python3 --name jabber_status --cron '*/5 * * * *' --no-autorestart -- -ip 10.10.20.1 -u administrator -p ciscopsdt -v 12.0
$ pm2 save or pm2 set pm2:autodump true
$ pm2 stop all
$ pm2 show <id|name>
$ pm2 startup
$ pm2 monit
```
### SFTP Configuration (This is already preconfigured)
```
$ sudo groupadd sftp
$ sudo nano /etc/ssh/sshd_config
```
Paste at bottom
```
Match group sftp
ChrootDirectory /
X11Forwarding no
AllowTcpForwarding no
```
Add user to group
```
$ sudo usermod -a -G sftp dashadmin
```
## InfluxDB Settings
InfluxDB is already configured with sample data in it from Cisco's [DevNet 12.5 Collaboration Sandbox](https://devnetsandbox.cisco.com/).
```
$ sudo nano /etc/influxdb/influxdb.conf
$ influx -precision rfc3339
> show databases
> CREATE DATABASE cisco_perfmon WITH DURATION 90d
> use cisco_perfmon
> show measurements
> CREATE DATABASE cisco_risport WITH DURATION 90d
> use cisco_risport
> show measurements
```
If you'd like to remove the data and start fresh do the following:
```
$ influx -precision rfc3339
> drop database cisco_perfmon
> CREATE DATABASE cisco_perfmon WITH DURATION 90d
> drop database cisco_risport
> CREATE DATABASE cisco_risport WITH DURATION 90d
> drop database telegraf
> CREATE DATABASE telegraf WITH DURATION 90d
```
## Telegraf Settings
Telegraf config needs to be edited with updated IP addresses. Telegraf is used to pull stats from CUCM via SNMP. This was done due to a bug in the Perfmon API.
[CSCvn19112](https://bst.cloudapps.cisco.com/bugsearch/bug/CSCvn19112/?rfs=iqvred)
```
$ sudo nano /etc/telegraf/telegraf.conf
$ sudo service telegraf restart
$ telegraf -test -config /etc/telegraf/telegraf.conf
$ snmpwalk -c dashboardRO -v 2c 10.10.20.1 -m +CISCO-CCM-MIB 1.3.6.1.4.1.9.9.156
$ cd /usr/share/snmp/mibs
```
Majority of the telegraf configuration was from this guide:
https://angristan.xyz/2018/04/monitoring-telegraf-influxdb-grafana/
## Grafana Settings
Using the default settings created during installation. If you'd like more customization check out:
https://www.digitalocean.com/community/tutorials/how-to-install-and-secure-grafana-on-ubuntu-18-04
Nginx is also install if you want to create a reverse proxy to port 3000.
* [Grafana Config](https://grafana.com/tutorials/run-grafana-behind-a-proxy/#0)
* [NGINX Config](https://grafana.com/tutorials/run-grafana-behind-a-proxy/#1)
## Python Scripts
Scripts are in the /home/dashboard/development directory. They are pulled from the following:
* [Perfmon](https://github.com/sieteunoseis/cisco_perfmon_influxdb)
* [Jabber Risport](https://github.com/sieteunoseis/cisco_risport_influxdb)
Sample python scripts are running via PM2. You can do the following to list them:
```
$ pm2 list
```
## Giving Back
If you would like to support my work and the time I put in creating the code, you can click the image below to get me a coffee. I would really appreciate it (but is not required).
[](https://www.buymeacoffee.com/automatebldrs)
-Jeremy Worden
Enjoy!
# TTAnalysis
R scripts used to build this R Shiny web application are available on [Github](https://github.com/ONYLAB/TTAnalysis).
This app was originally developed by [Osman Yogurtcu](https://github.com/ONYLAB), whom you can contact with questions at <osman.yogurtcu@fda.hhs.gov>.
A team of researchers has contributed to the continual development of the model and app:
* Richard A. Forhsee @FDA/CBER/OBE
* Carlos Villa @FDA/CBER/OBRR
Websites that we used to make this app and recommend:
* https://rstudio.github.io/shinydashboard/ - Shinydashboard is an R package that lets you create nice dashboards with Shiny.
* https://alhill.shinyapps.io/COVID19seir/ - Versatile and sophisticated Shiny-based app for COVID-19 epidemiological modelling from Allison Hill
* https://gallery.shinyapps.io/dist_calc/ - Probability distribution visualization app from Mine Cetinkaya-Rundel
* https://stat.ethz.ch/R-manual/R-devel/library/stats/html/00Index.html - R Stats package
* https://cran.r-project.org/web/packages/flexsurv/index.html - Flexible parametric models for time-to-event data, including the Royston-Parmar spline model, generalized gamma and generalized F distributions.
November 2020, Silver Spring, Maryland, USA
# How the evaluation metrics works
## Spearman Correlation Coefficient (SCC)
SCC evaluates the monotonic relationship between the estimation and gold standard, which is based on the rank for gene isoform abundance rather than the raw data. It is computed as<br>
<br>
where  and  are the ranks of  and , respectively, and  is the covariance of the rank variables, and  are the sample standard deviations of  and , respectively.<br>
<br>
Spearman Correlation Coefficient (SCC) between the estimation and the gold standard. In this example, the SCC reveals that gene CDY1 can be accurately quantified, but gene METTL9 cannot.
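As a sketch of how SCC follows from the formula above, the standalone Python implementation below ranks both vectors (averaging ranks over ties) and then takes the Pearson correlation of the ranks. The toy abundance vectors are made up for illustration.

```python
def _ranks(values):
    # Average ranks (1-based), with ties sharing the mean of their positions.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation of the rank variables rg_x and rg_y.
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry)) / n
    sx = (sum((a - mx) ** 2 for a in rx) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in ry) / n) ** 0.5
    return cov / (sx * sy)

# Toy example: estimates that preserve the gold-standard ordering score 1.0.
gold = [5.0, 1.0, 12.0, 7.0]
estimate = [4.0, 0.5, 20.0, 6.0]
print(spearman(gold, estimate))  # 1.0 (identical ranking)
```

In practice `scipy.stats.spearmanr` computes the same quantity; the manual version just makes the rank-then-correlate definition explicit.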
## Abundance Recovery Rate
Abundance Recovery Rate (ARR) represents the estimated value as a percentage of the real abundance, and is calculated by <br>
<br>
An accurate estimation should have the ARR value close to 100%.
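A one-line Python sketch of ARR as defined above (the TPM values are illustrative):

```python
def abundance_recovery_rate(estimated, true_abundance):
    # ARR = estimated abundance / true abundance, as a percentage.
    return 100.0 * estimated / true_abundance

# An isoform whose true abundance is 50 TPM but is estimated at 40 TPM:
print(abundance_recovery_rate(40.0, 50.0))  # 80.0
```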
## Median Relative Difference
Median Relative Difference (MRD) is the median of the relative differences between estimated and true abundance across all gene isoforms, calculated by<br>
<br>
A small MRD value indicates good performance of abundance estimation.
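A short Python sketch of the MRD computation, taking the relative difference per isoform and then the median (the three-isoform example is made up):

```python
def median_relative_difference(estimates, truths):
    # Relative difference per isoform: |estimate - truth| / truth; MRD is the median.
    diffs = sorted(abs(e - t) / t for e, t in zip(estimates, truths))
    n = len(diffs)
    mid = n // 2
    return diffs[mid] if n % 2 else (diffs[mid - 1] + diffs[mid]) / 2

# Three isoforms, all with true abundance 10; estimates deviate by 0%, 10%, 20%.
print(median_relative_difference([10.0, 11.0, 8.0], [10.0, 10.0, 10.0]))  # 0.1
```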
## Normalized Root Mean Square Error
Normalized Root Mean Square Error (NRMSE) gives a measure of the extent to which the relationship deviates from a one-to-one linear relationship. It can be computed by<br>
<br>
where  is the sample standard deviation of . A good performance of abundance estimation should have a small value of NRMSE.
Since quantification performance could be influenced by the exon-isoform structure and the abundance, we evaluate the quantification of different sets of genes/transcripts with different isoform features, including isoform numbers, exon numbers, gold standard abundance values and a customized statistic K-value representing the complexity of exon-isoform structures.
## Reproducibility
<br>
By fitting the standard deviation versus average isoform abundance into a smooth curve, it can be shown that Method B has a lower standard deviation and higher reproducibility.
## Consistency
<br>
By setting an expression threshold (e.g., 1 in this toy example), we can define which genes are expressed (in blue) and which are not (in yellow). This statistic measures the consistency of the expressed gene sets between replicates.
## Resolution entropy
<br>
(A) Software whose output takes only a few discrete values has lower resolution entropy, as it cannot capture continuous and subtle differences in gene expression. (B) Software with continuous output values has higher resolution entropy.
## Fold-change-based evaluation
<br>
By counting the # of transcripts that are differentially expressed, statistics such as precision, recall, and accuracy can be defined and calculated.
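Once each transcript is counted as a true/false positive/negative with respect to the differential-expression calls, the statistics follow directly. A small Python sketch with hypothetical counts:

```python
def fold_change_metrics(tp, fp, fn, tn):
    # tp/fp/fn/tn: transcripts correctly or incorrectly called differentially
    # expressed (or not), judged against the ground-truth fold changes.
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Hypothetical counts for one quantification tool:
p, r, a = fold_change_metrics(tp=8, fp=2, fn=4, tn=6)
print(p, r, a)  # precision 0.8, recall ~0.667, accuracy 0.7
```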
## Evaluation with different gene features
<br>
Explore the relationship between the above evaluation metrics and important features like K-value, expression level, isoform length, and number of exons by splitting the transcripts into groups on these features.
| 153.783784 | 1,151 | 0.783656 | eng_Latn | 0.789561 |