---
title: Create mixin types using default interface methods
description: Using default interface members you can extend interfaces with optional default implementations for implementers.
ms.technology: csharp-advanced-concepts
ms.date: 10/04/2019
ms.openlocfilehash: 0095d76eadfe0c6a1b30bf8a0c5000509f5e1bf9
ms.sourcegitcommit: 046a9c22487551360e20ec39fc21eef99820a254
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 05/14/2020
ms.locfileid: "83396709"
---
# <a name="tutorial-mix-functionality-in-when-creating-classes-using-interfaces-with-default-interface-methods"></a>Tutorial: Mix functionality in when creating classes using interfaces with default interface methods
Beginning with C# 8.0 on .NET Core 3.0, you can define an implementation when you declare a member of an interface. This feature provides new capabilities where you can define default implementations for features declared in interfaces. Classes can pick when to override the functionality, when to use the default functionality, and when not to declare support for discrete features.
In this tutorial, you'll learn how to:
> [!div class="checklist"]
>
> * Create interfaces with implementations that describe discrete features.
> * Create classes that use the default implementations.
> * Create classes that override some or all of the default implementations.
## <a name="prerequisites"></a>Prerequisites
You'll need to set up your machine to run .NET Core, including the C# 8.0 compiler. The C# 8.0 compiler is available starting with [Visual Studio 2019 version 16.3](https://visualstudio.microsoft.com/downloads/?utm_medium=microsoft&utm_source=docs.microsoft.com&utm_campaign=inline+link&utm_content=download+vs2019) or the [.NET Core 3.0 SDK](https://dotnet.microsoft.com/download/dotnet-core) or later.
## <a name="limitations-of-extension-methods"></a>Limitations of extension methods
One way you can implement behavior that appears as part of an interface is to define [extension methods](../programming-guide/classes-and-structs/extension-methods.md) that provide the default behavior. Interfaces declare a minimum set of members while providing a greater surface area for any class that implements that interface. For example, the extension methods in <xref:System.Linq.Enumerable> provide the implementation for any sequence to be the source of a LINQ query.
Extension methods are resolved at compile time, using the declared type of the variable. Classes that implement the interface can provide a better implementation for any extension method. Variable declarations must match the implementing type to enable the compiler to choose that implementation. When the compile-time type matches the interface, method calls resolve to the extension method. Another concern with extension methods is that those methods are accessible wherever the class containing them is accessible. Classes cannot declare whether they should or should not provide features declared in extension methods.
Starting with C# 8.0, you can declare default implementations as interface methods. Then, every class automatically uses the default implementation. Any class that can provide a better implementation can override the interface method definition with a better algorithm. In one sense, this technique sounds similar to how you could use [extension methods](../programming-guide/classes-and-structs/extension-methods.md).
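As a minimal standalone sketch of the idea (the `IGreeter` types below are illustrative and separate from the tutorial's light-fixture code):

```csharp
using System;

public interface IGreeter
{
    string Name { get; }

    // Default implementation: implementing classes get this for free.
    string Greet() => $"Hello, {Name}!";
}

public class FriendlyGreeter : IGreeter
{
    public string Name => "world";
    // No Greet() here; the interface's default implementation is used.
}

public class LoudGreeter : IGreeter
{
    public string Name => "WORLD";

    // A "better" implementation; note there is no override keyword.
    public string Greet() => $"HELLO, {Name}!!!";
}
```

Note that a default interface member is only reachable through the interface type, for example `IGreeter g = new FriendlyGreeter(); g.Greet();`. This requires C# 8.0 and a runtime that supports default interface members (.NET Core 3.0 or later).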
In this article, you'll learn how default interface implementations enable new scenarios.
## <a name="design-the-application"></a>Design the application
Consider a home automation application. You probably have many different types of lights and indicators that could be used throughout the house. Every light must support APIs to turn it on and off and to report its current state. Some lights and indicators may support other features, such as:
- Turn the light on, then turn it off after a timer expires.
- Blink the light for a period of time.
Some of these extended capabilities could be emulated in devices that support the minimal set. That indicates providing a default implementation. For devices that have more capabilities built in, the device software would use the native capabilities. For the other lights, they can choose to implement the interface and use the default implementation.
Default interface members are a better solution for this scenario than extension methods. Class authors can control which interfaces they choose to implement. Those interfaces they choose are available as methods. In addition, because default interface methods are virtual by default, method dispatch always chooses the implementation in the class.
Let's create the code to demonstrate these differences.
## <a name="create-interfaces"></a>Create interfaces
Start by creating the interface that defines the behavior for all lights:
[!code-csharp[Declare base interface](./snippets/mixins-with-default-interface-methods/UnusedExampleCode.cs?name=SnippetILightInterfaceV1)]
A basic overhead light fixture can implement this interface as shown in the following code:
[!code-csharp[First overhead light](./snippets/mixins-with-default-interface-methods/UnusedExampleCode.cs?name=SnippetOverheadLightV1)]
In this tutorial, the code doesn't drive IoT devices, but emulates those activities by writing messages to the console. You can explore the code without automating your house.
Next, let's define the interface for a light that can automatically turn off after a timeout:
[!code-csharp[pure Timer interface](./snippets/mixins-with-default-interface-methods/UnusedExampleCode.cs?name=SnippetPureTimerInterface)]
You could add a basic implementation to the overhead light, but a better solution is to modify this interface definition to provide a `virtual` default implementation:
[!code-csharp[Timer interface](./snippets/mixins-with-default-interface-methods/ITimerLight.cs?name=SnippetTimerLightFinal)]
By adding that change, the `OverheadLight` class can implement the timer function by declaring support for the interface:
```csharp
public class OverheadLight : ITimerLight { }
```
A different type of light may support a more sophisticated protocol. It can provide its own implementation for `TurnOnFor`, as shown in the following code:
[!code-csharp[Override the timer function](./snippets/mixins-with-default-interface-methods/HalogenLight.cs?name=SnippetHalogenLight)]
Unlike overriding virtual class methods, the declaration of `TurnOnFor` in the `HalogenLight` class does not use the `override` keyword.
## <a name="mix-and-match-capabilities"></a>Mix and match capabilities
The advantages of default interface methods become clearer as you introduce more advanced capabilities. Using interfaces enables you to mix and match capabilities. It also enables each class author to choose between the default implementation and a custom one. Let's add an interface with a default implementation for blinking lights:
[!code-csharp[Define the blinking light interface](./snippets/mixins-with-default-interface-methods/IBlinkingLight.cs?name=SnippetBlinkingLight)]
The default implementation enables any light to blink. The overhead light can add both timer and blink capabilities using the default implementations:
[!code-csharp[Use the default blink function](./snippets/mixins-with-default-interface-methods/OverheadLight.cs?name=SnippetOverheadLight)]
A new type of light, the `LEDLight`, supports both the timer function and the blink function directly. This light style implements both the `ITimerLight` and `IBlinkingLight` interfaces and overrides the `Blink` method:
[!code-csharp[Override the blink function](./snippets/mixins-with-default-interface-methods/LEDLight.cs?name=SnippetLEDLight)]
An `ExtraFancyLight` might support both the blink and timer functions at once:
[!code-csharp[Override the blink and timer function](./snippets/mixins-with-default-interface-methods/ExtraFancyLight.cs?name=SnippetExtraFancyLight)]
The `HalogenLight` you created earlier doesn't support blinking. So, don't add `IBlinkingLight` to its list of supported interfaces.
## <a name="detect-the-light-types-using-pattern-matching"></a>Detect the light types using pattern matching
Next, let's write some test code. You can use C#'s [pattern matching](../pattern-matching.md) feature to determine a light's capabilities by examining which interfaces it supports. The following method exercises the supported capabilities of each light:
[!code-csharp[Test a light's capabilities](./snippets/mixins-with-default-interface-methods/Program.cs?name=SnippetTestLightFunctions)]
The following code in the `Main` method creates each light type in sequence and tests that light:
[!code-csharp[Test a light's capabilities](./snippets/mixins-with-default-interface-methods/Program.cs?name=SnippetMainMethod)]
## <a name="how-the-compiler-determines-best-implementation"></a>How the compiler determines the best implementation
This scenario shows a basic interface without any implementations. Adding a method to the `ILight` interface introduces new complexities. The language rules governing default interface methods minimize the effect on concrete classes that implement multiple derived interfaces. Let's enhance the original interface with a new method to see how that changes its use. Every indicator light can report its power status as an enumerated value:
[!code-csharp[Enumeration for power status](./snippets/mixins-with-default-interface-methods/ILight.cs?name=SnippetPowerStatus)]
The default implementation assumes no power:
[!code-csharp[Report a default power status](./snippets/mixins-with-default-interface-methods/ILight.cs?name=SnippetILightInterface)]
These changes compile cleanly, even though `ExtraFancyLight` declares support for the `ILight` interface and both derived interfaces, `ITimerLight` and `IBlinkingLight`. There's only one "closest" implementation declared in the `ILight` interface. Any class that declares an override becomes the one "closest" implementation. You saw examples in the preceding classes that overrode the members of other derived interfaces.
Avoid overriding the same method in multiple derived interfaces. Doing so creates an ambiguous method call whenever a class implements both derived interfaces. The compiler can't pick a single better method, so it issues an error. For example, if both `IBlinkingLight` and `ITimerLight` implemented an override of `PowerStatus`, `OverheadLight` would need to provide a more specific override. Otherwise, the compiler can't pick between the implementations in the two derived interfaces. You can usually avoid this situation by keeping interface definitions small and focused on one feature. In this scenario, each capability of a light is its own interface; multiple interfaces are only inherited by classes.
This sample shows one scenario where you can define discrete features that can be mixed into classes. You declare any set of supported functionality by declaring which interfaces a class supports. The use of virtual default interface methods enables classes to use or define a different implementation for any or all of the interface methods. This language capability provides new ways to model the real-world systems you're building. Default interface methods provide a clearer way to express related classes that may mix and match different features using virtual implementations of those capabilities.
# MergeSort
This is a Java implementation of the merge sort algorithm for sorting `int` arrays in ascending order.
Running time is O(n log n).
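The source in this repository is the authoritative implementation; as a rough illustration of the top-down merge sort idea (not the exact code in this repo):

```java
import java.util.Arrays;

public class MergeSortSketch {
    // Recursively splits the array, sorts both halves, and merges them.
    public static void sort(int[] arr) {
        if (arr.length < 2) return;
        int mid = arr.length / 2;
        int[] left = Arrays.copyOfRange(arr, 0, mid);
        int[] right = Arrays.copyOfRange(arr, mid, arr.length);
        sort(left);
        sort(right);
        merge(arr, left, right);
    }

    // Merges two sorted halves back into arr in ascending order.
    private static void merge(int[] arr, int[] left, int[] right) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            arr[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) arr[k++] = left[i++];
        while (j < right.length) arr[k++] = right[j++];
    }
}
```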
---
id: 218
title: 'Download Scott Pilgrim vs. the World (2010) Dual Audio {Hindi-English} 480p [500MB] || 720p [1GB]'
date: 2020-10-08T18:41:32+00:00
author: admin
layout: post
guid: https://themoviesflix.co/?p=13244
permalink: /2020/10/08/download-scott-pilgrim-vs-the-world-2010-dual-audio-hindi-english-480p-500mb-720p-1gb/
tdc_dirty_content:
- "1"
tdc_icon_fonts:
- 'a:0:{}'
cyberseo_rss_source:
- https://www.psdly.com/wp-theme/feed?paged=9
- https://themoviesflix.co/feed/?paged=20
cyberseo_post_link:
- https://www.psdly.com/themeforest-maharaj-v2-3-hotel-master-wordpress-theme-21056584
- https://themoviesflix.co/download-scott-pilgrim-vs-the-world-2010-hindi-english-480p-720p/
categories:
- Uncategorized
---
Download **Scott Pilgrim vs. the World (2010)** Dual Audio {Hindi-English} 480p & 720p. This Hollywood movie is available in 480p and 720p for free; just click on the download button below. The movie is based on **Action, Comedy, Fantasy** and is available in Hindi Dubbed dual audio. High-speed download links are below.
**TheMoviesFlix.co** is The Best Website/Platform For Hollywood HD Movies. We Provide Direct Google Drive Download Links For Fast And Secure Downloading. Just Click On Download Button _And Follow Steps To Download And Watch Movies Online For Free_.
<div class="imdbwp imdbwp--movie dark">
<div class="imdbwp__thumb">
<a class="imdbwp__link" target="_blank" title="Scott Pilgrim vs. the World" href="https://www.imdb.com/title/tt0446029/" rel="nofollow noopener noreferrer"><img class="imdbwp__img" src="https://m.media-amazon.com/images/M/MV5BMTkwNTczNTMyOF5BMl5BanBnXkFtZTcwNzUxOTUyMw@@._V1_SX300.jpg" /></a>
</div>
<div class="imdbwp__content">
<div class="imdbwp__header">
<span class="imdbwp__title">Scott Pilgrim vs. the World</span> (2010)
</div>
<div class="imdbwp__belt">
<span class="imdbwp__star">7.5</span><span class="imdbwp__rating"><strong>Rating:</strong> 7.5 / 10 from 371,975 users</span>
</div>
<div class="imdbwp__teaser">
Scott Pilgrim must defeat his new girlfriend’s seven evil exes in order to win her heart.
</div>
</div>
</div>
### Download Scott Pilgrim vs. the World (2010) {Hindi-English} 480p & 720p ~ TheMoviesFlix.co {.has-text-align-center.has-text-color}
### Movie Info: {.has-text-color}
* **Full Name: **Scott Pilgrim vs. the World
* **Language:** Dual Audio (Hindi-English)
* **Release Year: **2010
* **Quality: **480p & 720p Bluray
* **Size: **500MB & 1GB
* **Format: **Mkv
### Storyline: {.has-text-color}
As bass guitarist for a garage-rock band, Scott Pilgrim (Michael Cera) has never had trouble getting a girlfriend; usually, the problem is getting rid of them. But when Ramona Flowers (Mary Elizabeth Winstead) skates into his heart, he finds she has the most troublesome baggage of all: an army of ex-boyfriends who will stop at nothing to eliminate him from her list of suitors.
### Screenshots: {.has-text-color}<figure class="wp-block-image">
 </figure> <figure class="wp-block-image"></figure> <figure class="wp-block-image"></figure> <figure class="wp-block-image"></figure>
<p class="has-text-align-center has-text-color has-medium-font-size">
Download Scott Pilgrim vs. the World (2010) Dual Audio {Hindi-English} 480p [500MB]
</p>
<span class="mb-center maxbutton-3-center"><span class="maxbutton-3-container mb-container"><a class="maxbutton-3 maxbutton maxbutton-post-button" target="_blank" rel="nofollow noopener noreferrer" href="https://coinquint.com/a12800/"><span class="mb-text">Download Links</span></a></span></span>
<p class="has-text-align-center has-text-color has-medium-font-size">
Download Scott Pilgrim vs. the World (2010) Dual Audio {Hindi-English} 720p [1GB]
</p>
<span class="mb-center maxbutton-3-center"><span class="maxbutton-3-container mb-container"><a class="maxbutton-3 maxbutton maxbutton-post-button" target="_blank" rel="nofollow noopener noreferrer" href="https://coinquint.com/a12802/"><span class="mb-text">Download Links</span></a></span></span>
<center>
</center>
<center>
<a href="https://t.me/themoviesflixcom" target="_blank" data-wpel-link="external" rel="nofollow external noopener noreferrer"><button class="button button5">Join Our Telegram</button></a> <a href="https://themoviesflix.co/download-scott-pilgrim-vs-the-world-2010-hindi-english-480p-720p/#" target="_blank" data-wpel-link="external" rel="nofollow external noopener noreferrer"><button class="button button5">Announcements</button></a> <a href="https://themoviesflix.com/how-to-download/" target="_blank" data-wpel-link="external" rel="nofollow external noopener noreferrer"><button class="button button5">How To Download?</button></a> <a href="https://themoviesflix.co/download-scott-pilgrim-vs-the-world-2010-hindi-english-480p-720p/#" target="_blank" data-wpel-link="external" rel="nofollow external noopener noreferrer"><button class="button button5">Report Broken Links</button></a>
</center>
<div class="alert alert-danger">
Please Do Not Use VPN for Downloading Movies From Our Site.
</div>
<div class="alert alert-success">
Click On The Above <strong>Download Button</strong> Download File.
</div>
<div class="alert alert-warning">
If You Find Any Broken Link Then <strong>Report</strong> To Us.
</div>
<div class="alert alert-info">
<strong>Comment</strong> Your Queries And Requests Below In The Comment Box.
</div>
# IEntityLivingBase
A living entity is an entity that has health and can die.
This includes monsters and animals, but also [Players](/Vanilla/Players/IPlayer).
## Importing the package
To avoid unexpected problems (for example, when declaring [Arrays](/AdvancedFunctions/Arrays_and_Loops)), the safest and recommended practice is to import the related package.
`import crafttweaker.entity.IEntityLivingBase;`
## Extending [IEntity](IEntity)
IEntityLivingBase extends [IEntity](IEntity). That means all methods available to [IEntity](IEntity) objects can also be used on IEntityLivingBase objects.
## ZenGetters
| ZenGetter              | Return Type (*can be null*)                           |
| :--------------------: | :---------------------------------------------------: |
| activePotionEffects    | List<[IPotionEffect](/Vanilla/Potions/IPotionEffect)> |
| AIMovementSpeed        | float                                                 |
| arrowsInEntity         | int                                                   |
| attackingEntity        | *IEntityLivingBase*                                   |
| canBreatheUnderwater   | boolean                                               |
| health                 | float                                                 |
| isChild                | boolean                                               |
| isOnLadder             | boolean                                               |
| isUndead               | boolean                                               |
| lastAttackedEntity     | *IEntityLivingBase*                                   |
| lastAttackedEntityTime | int                                                   |
| lastDamageSource       | [IDamageSource](/Vanilla/Damage/IDamageSource)        |
| mainHandHeldItem       | [IItemStack](/Vanilla/Items/IItemStack)               |
| maxHealth              | float                                                 |
| offHandHeldItem        | [IItemStack](/Vanilla/Items/IItemStack)               |
| revengeTarget          | *IEntityLivingBase*                                   |
| totalArmorValue        | int                                                   |
## ZenSetters
| ZenSetter | Parameter Type (*can be null*) |
| :----------------: | :----------------------------: |
| AIMovementSpeed | float |
| arrowsInEntity | int |
| health | float |
| lastAttackedEntity | *IEntityLivingBase* |
| revengeTarget | *IEntityLivingBase* |
## More ZenMethods
- boolean attackEntityFrom(IDamageSource source, float amount) → Somewhat useful...
- boolean canEntityBeSeen([IEntity](IEntity) other);
- boolean hasItemInSlot([IEntityEquipmentSlot](/Vanilla/Entities/IEntityEquipmentSlot) slot);
- boolean isPotionActive([IPotion](/Vanilla/Potions/IPotion) potion) → Returns true if the given potion is active
- boolean isPotionEffectApplicable([IPotionEffect](/Vanilla/Potions/IPotionEffect) potionEffect);
- heal(float amount) → Heals the entity by the given amount of health
- [IEntityAttributeInstance](/Vanilla/Entities/Attributes/IEntityAttributeInstance) getAttribute(String name) → Returns the [entity attribute](/Vanilla/Entities/Attributes/IEntityAttributeInstance) with the given name
- [IItemStack](/Vanilla/Items/IItemStack) getItemInSlot([IEntityEquipmentSlot](/Vanilla/Entities/IEntityEquipmentSlot) slot);
- [IPotionEffect](/Vanilla/Potions/IPotionEffect) getActivePotionEffect(IPotion potion);
- void addPotionEffect([IPotionEffect](/Vanilla/Potions/IPotionEffect) potionEffect);
- void clearActivePotions() → Removes all active [potions](/Vanilla/Potions/IPotion) from the Entity
- void knockBack([IEntity](IEntity) entity, float one, double two, double three);
- void onDeath();
- void onLivingUpdate();
- void setItemToSlot([IEntityEquipmentSlot](/Vanilla/Entities/IEntityEquipmentSlot) slot, [IItemStack](/Vanilla/Items/IItemStack) itemStack);
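As a usage sketch (hedged: the function name is illustrative and the entity is assumed to be obtained elsewhere, e.g. from an event; only members documented above are used):

```zenscript
import crafttweaker.entity.IEntityLivingBase;

// Heals a living entity back to full health when it drops below half health.
function healWhenHurt(entity as IEntityLivingBase) {
    if (entity.health < entity.maxHealth / 2) {
        entity.heal(entity.maxHealth - entity.health);
    }
}
```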
<h1>OOP Advanced Exam – H.E.L.L.</h1>
<p>In a galaxy far away, a civilization called The Lightmen organizes an annual tournament. The participants strive for power through the use of marvelous magical items, in order to win the unnatural fray – H.E.L.L.</p>
<h3>Overview</h3>
<p>Due to the fact that H.E.L.L. has gotten way out of hand and there have been countless... casualties, the Light Council decided to cancel the tournament. But the light people were not very happy with that decision, so the Council had the idea of a mini-game that simulates the tournament. Guess who they hired to write the code for it. That’s right! You!</p>
<h3>Structure</h3>
<p>Here are the entities that should exist as models in your program.</p>
<h4>Heroes</h4>
<p>The main participants of the tournament are <strong>Heroes</strong>. Each hero has <strong>several stats</strong>:</p>
<ul>
<li>Name – a <strong>string</strong>, indicating the <strong>name</strong> of the <strong>hero</strong>.</li>
<li>Strength – an <strong>integer</strong>, indicating the <strong>strength</strong> of the <strong>hero</strong>.</li>
<li>Agility – an <strong>integer</strong>, indicating the <strong>agility</strong> of the <strong>hero</strong>.</li>
<li>Intelligence – an <strong>integer</strong>, indicating the <strong>intelligence</strong> of the <strong>hero</strong>.</li>
<li>HitPoints – an <strong>integer</strong>, indicating the <strong>hit</strong> <strong>points</strong> of the <strong>hero</strong>.</li>
<li>Damage – an <strong>integer</strong>, indicating the <strong>damage</strong> of the <strong>hero</strong>.</li>
</ul>
<p>The heroes also have an <strong>Inventory</strong>. The inventory will be <strong>given to you</strong> in the <strong>skeleton</strong>. <br /> You can check <strong>more info</strong> about it in the <strong>Skeleton</strong> <strong>section</strong>.</p>
<p>There are generally <strong>3</strong> <strong>types</strong> of <strong>heroes – </strong>Barbarian, Assassin, and Wizard.</p>
<h4>Items</h4>
<p>Aside from the heroes there are items.</p>
<p>The items have several properties:</p>
<ul>
<li>Name – a <strong>string</strong>, indicating the <strong>name</strong> of the item.</li>
<li>StrengthBonus – an <strong>integer</strong>, indicating the <strong>strength</strong> <strong>bonus</strong> of the <strong>item</strong>.</li>
<li>AgilityBonus – an <strong>integer</strong>, indicating the <strong>agility bonus</strong> of the <strong>item</strong>.</li>
<li>IntelligenceBonus – an <strong>integer</strong>, indicating the <strong>intelligence bonus</strong> of the <strong>item</strong>.</li>
<li>HitPointsBonus – an <strong>integer</strong>, indicating the <strong>hit points bonus</strong> of the <strong>item</strong>.</li>
<li>DamageBonus – an <strong>integer</strong>, indicating the <strong>damage bonus</strong> of the <strong>item</strong>.</li>
</ul>
<p>There are two types of items – CommonItem and RecipeItem.</p>
<ul>
<li>The CommonItem is just a normal item.</li>
<li>The RecipeItem has RequiredItems – a <strong>collection</strong> of CommonItem.</li>
</ul>
<p>The RecipeItem will be <strong>initialized</strong> with an <strong>additional element</strong> – the RequiredItems. Check in the <strong>Input section</strong> for more info.</p>
<h3>Functionality</h3>
<p>As you can see, the main logic revolves around several entities – the heroes and the items. The heroes have items which increase their stats. The heroes also have an inventory, in which their items are held. There are also recipe items, which have slightly more interesting logic behind them.</p>
<h4>Heroes</h4>
<p>The difference between the 3 types of heroes (<strong>Strength</strong>, <strong>Agility</strong>, <strong>Intelligence</strong>) is the <strong>base stats</strong> they start with. They are constant values.</p>
<table width="1170">
<tbody>
<tr>
<td width="278">
<p>Stats</p>
</td>
<td width="295">
<p>Barbarian</p>
</td>
<td width="293">
<p>Assassin</p>
</td>
<td width="304">
<p>Wizard</p>
</td>
</tr>
<tr>
<td width="278">
<p>Strength</p>
</td>
<td width="295">
<p>90</p>
</td>
<td width="293">
<p>25</p>
</td>
<td width="304">
<p>25</p>
</td>
</tr>
<tr>
<td width="278">
<p>Agility</p>
</td>
<td width="295">
<p>25</p>
</td>
<td width="293">
<p>100</p>
</td>
<td width="304">
<p>25</p>
</td>
</tr>
<tr>
<td width="278">
<p>Intelligence</p>
</td>
<td width="295">
<p>10</p>
</td>
<td width="293">
<p>15</p>
</td>
<td width="304">
<p>100</p>
</td>
</tr>
<tr>
<td width="278">
<p>HitPoints</p>
</td>
<td width="295">
<p>350</p>
</td>
<td width="293">
<p>150</p>
</td>
<td width="304">
<p>100</p>
</td>
</tr>
<tr>
<td width="278">
<p>Damage</p>
</td>
<td width="295">
<p>150</p>
</td>
<td width="293">
<p>300</p>
</td>
<td width="304">
<p>250</p>
</td>
</tr>
</tbody>
</table>
<p><strong>Upon initialization</strong>, each <strong>hero</strong> should be <strong>assigned the values</strong> specified above, <strong>depending</strong> on <strong>his type</strong>.</p>
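<p>The table above can be made concrete with a minimal sketch (the class layout is illustrative; the actual skeleton's types and constructors may differ), passing the constant base stats up from each concrete hero type:</p>

```csharp
using System;

// Minimal sketch of the hero hierarchy; illustrative, not the skeleton's API.
public abstract class Hero
{
    protected Hero(string name, int strength, int agility,
        int intelligence, int hitPoints, int damage)
    {
        Name = name;
        Strength = strength;
        Agility = agility;
        Intelligence = intelligence;
        HitPoints = hitPoints;
        Damage = damage;
    }

    public string Name { get; }
    public int Strength { get; }
    public int Agility { get; }
    public int Intelligence { get; }
    public int HitPoints { get; }
    public int Damage { get; }
}

public class Barbarian : Hero
{
    // Base stats come straight from the table above.
    public Barbarian(string name) : base(name, 90, 25, 10, 350, 150) { }
}

public class Assassin : Hero
{
    public Assassin(string name) : base(name, 25, 100, 15, 150, 300) { }
}

public class Wizard : Hero
{
    public Wizard(string name) : base(name, 25, 25, 100, 100, 250) { }
}
```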
<h4>Items</h4>
<h5>CommonItem</h5>
<p>If a <strong>hero</strong> has a <strong>certain</strong> CommonItem in his <strong>inventory</strong>, <strong>his stats</strong> are <strong>increased</strong> by the <strong>values</strong> of the <strong>stat bonuses</strong> of the <strong>item</strong>, <strong>CORRESPONDINGLY</strong>.</p>
<p><strong>In other words</strong>: If a hero has an <strong>item</strong> with <strong>50 strength bonus</strong>, in his <strong>inventory</strong>, the <strong>hero’s strength</strong> is <strong>increased</strong> by <strong>50</strong>. <br /> If a CommonItem is <strong>removed</strong> from the <strong>inventory</strong>, <strong>all bonuses</strong> from it, are <strong>also removed</strong>.</p>
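<p>One straightforward way to honor this rule is to compute each effective stat as the base value plus the sum of the corresponding bonuses of the common items currently in the inventory, so that removing an item removes its bonuses automatically. A hedged sketch (names are illustrative; only strength is shown):</p>

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class CommonItem
{
    public CommonItem(string name, int strengthBonus)
    {
        Name = name;
        StrengthBonus = strengthBonus;
    }

    public string Name { get; }
    public int StrengthBonus { get; }
    // AgilityBonus, IntelligenceBonus, etc. would follow the same pattern.
}

public class Hero
{
    private const int BaseStrength = 90; // e.g. a Barbarian

    private readonly List<CommonItem> inventory = new List<CommonItem>();

    public void AddItem(CommonItem item) => inventory.Add(item);

    public void RemoveItem(CommonItem item) => inventory.Remove(item);

    // Effective strength = base stat + sum of the item bonuses.
    public int Strength => BaseStrength + inventory.Sum(i => i.StrengthBonus);
}
```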
<h5>RecipeItem</h5>
<p>When a hero has a RecipeItem in his Inventory, it does <strong>NOT</strong> give him <strong>ANY</strong> of its bonuses. The RecipeItem is formed from its RequiredItems.</p>
<p>When a hero <strong>has</strong> <strong>all of the items</strong> that a RecipeItem requires, those items are being <strong>removed</strong> from his <strong>inventory</strong>, <strong>along</strong> with <strong>the</strong> <strong>recipe</strong>, and a CommonItem is put on their place, with the <strong>stats</strong> of the RecipeItem.<br /> As if the items have combined with the recipe in order to create a stronger item.</p>
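<p>The combining step can be sketched as follows (illustrative names, not the skeleton's API): check whether every required item name is present in the inventory, and if so, consume the matched items plus the recipe and add a common item carrying the recipe's stats:</p>

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Item
{
    public Item(string name, int damageBonus)
    {
        Name = name;
        DamageBonus = damageBonus;
    }

    public string Name { get; }
    public int DamageBonus { get; } // other bonuses would follow the same pattern
}

public static class RecipeResolver
{
    // If every required item is present, consume them plus the recipe
    // and add a plain item with the recipe's stats.
    // Returns true when the recipe fired.
    public static bool TryResolve(List<Item> inventory, Item recipe,
        IReadOnlyList<string> requiredNames)
    {
        var matched = new List<Item>();
        foreach (var name in requiredNames)
        {
            var found = inventory.FirstOrDefault(
                i => i.Name == name && !matched.Contains(i));
            if (found == null) return false; // recipe incomplete
            matched.Add(found);
        }

        foreach (var item in matched) inventory.Remove(item);
        inventory.Remove(recipe);
        inventory.Add(new Item(recipe.Name, recipe.DamageBonus));
        return true;
    }
}
```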
<h4>Commands</h4>
<p>There are several commands which are given from the user input, in order to control the game of H.E.L.L. Here you can see how they are formed.</p>
<p>The <strong>parameters</strong> will be given in the <strong>EXACT ORDER</strong>, as the one <strong>specified below</strong>. <br /> You can see the exact input format in the <strong>Input section</strong>.</p>
<p><strong>Each</strong> <strong>command</strong> will <strong>generate an output</strong> <strong>result</strong>, which you must <strong>print</strong>.<br /> You can see the exact output format in the <strong>Output section</strong>.</p>
<h5>Hero Command</h5>
<p><strong>Parameters</strong> – <strong>name</strong> (string), <strong>type</strong> (string).</p>
<p>Creates a Hero of the <strong>given type</strong>, with the <strong>given name</strong>. <br /> The type will either be “Barbarian”, “Assassin” or “Wizard”.</p>
<h5>Item Command</h5>
<p><strong>Parameters</strong> – <strong>name</strong> (string), <strong>heroName</strong> (string), <strong>strengthBonus</strong> (int), <strong>agilityBonus</strong> (int), <strong>intelligenceBonus</strong> (int), <strong>hitpointsBonus</strong> (int), <strong>damageBonus</strong> (int).</p>
<p>Creates a CommonItem with the <strong>given parameters</strong>, and <strong>adds</strong> it to the <strong>inventory</strong> of the <strong>hero</strong> with the <strong>given name</strong>.</p>
<h5>Recipe Command</h5>
<p><strong>Parameters</strong> – <strong>name</strong> (string), <strong>heroName</strong> (string), <strong>strengthBonus</strong> (int), <strong>agilityBonus</strong> (int), <strong>intelligenceBonus</strong> (int), <strong>hitpointsBonus</strong> (int), <strong>damageBonus</strong> (int), <strong>requiredItem1</strong> (string), <strong>requiredItem2</strong> (string). . .</p>
<p>Creates a RecipeItem with the <strong>given parameters</strong>, and <strong>adds</strong> it to the <strong>inventory</strong> of the <strong>hero</strong> with the <strong>given name</strong>.</p>
<p>The <strong>required items</strong> are <strong>given last</strong>, and their <strong>COUNT</strong> is <strong>VARIABLE</strong>. The required items are <strong>given</strong> as <strong>names</strong> of <strong>items</strong>.</p>
<h5>Inspect Command</h5>
<p><strong>Parameters</strong> – <strong>name</strong> (string)</p>
<p><strong>Inspects</strong> the <strong>hero</strong> with the <strong>given name</strong>, providing <strong>information</strong> about his <strong>stats</strong> and <strong>items</strong>.<br /> The command should present information <strong>ONLY</strong> about the CommonItems from the hero.</p>
<h5>Quit</h5>
<p><strong>Quits</strong> the game. . . When that happens, <strong>ALL HEROES</strong> must be <strong>printed</strong> in a specific format.</p>
<h3>Skeleton</h3>
<p>In this section you will be given information about the Skeleton, or the code that has been given to you.</p>
<p>You are allowed to change the <strong>internal</strong> and <strong>private logic</strong> of the <strong>classes</strong> that have been given to you. <br /> In other words, you can change the <strong>body code</strong> and the <strong>definitions</strong> of the <strong>private members</strong> in whatever <br /> way you like.</p>
<p>However. . .</p>
<p>You are <strong>NOT ALLOWED</strong> to <strong>CHANGE</strong> the <strong>Interfaces</strong> that have been provided by the <strong>skeleton</strong> in <strong>ANY way</strong>. <br /> You are <strong>NOT ALLOWED</strong> to <strong>ADD</strong> more <strong>PUBLIC LOGIC</strong> than the <strong>one provided</strong> by the <strong>Interfaces</strong>.</p>
<h4>Interfaces</h4>
<p>You will be given the <strong>DOCUMENTED</strong> <strong>interfaces</strong> for the Hero and Item entities. You should use them when you are implementing your entities.</p>
<p>You will <strong>also be given</strong> an <strong>interface</strong> for the Inventory class, but you will be given the <strong>class itself</strong> too.</p>
<p>You will <strong>also be given</strong> an <strong>annotation</strong>, connected to the Inventory class, which will ease your work, in some way.</p>
<p>Read the documentation of the interfaces to gain basic knowledge of the behavior they define.</p>
<h4>Inventory</h4>
<p>You will be given the Inventory<strong> class</strong>, along with an <strong>Interface</strong> for it.</p>
<p>The Inventory’s main purpose is to <strong>store</strong> the <strong>items</strong> of a particular <strong>hero</strong>.</p>
<p>The Inventory class holds <strong>2 collections</strong> – for the CommonItems and for the RecipeItems. They are being stored in different collections for obvious reasons...</p>
<p>The <strong>collections</strong> are <strong>private</strong>, so <strong>in order</strong> to <strong>add items</strong> to them, the class <strong>exposes 2 methods</strong> for adding elements.</p>
<p>Upon <strong>adding</strong> a RecipeItem or a CommonItem, the Inventory checks <strong>all recipes</strong>, and if <strong>all required items</strong> for a certain recipe <strong>have been gathered</strong>, it combines them with the recipe and creates a CommonItem with the stats of the <strong>corresponding</strong> RecipeItem.</p>
<p>The Inventory also holds several methods, for <strong>extracting the bonuses</strong> from all CommonItems, because only they <strong>give bonuses</strong> to the <strong>hero</strong>.</p>
<p>Your task is to study the code in the Skeleton, and use it in your code, in order to complete the business logic of the program.</p>
<h3>Input</h3>
<p>The input consists of several commands which will be given in the format specified below:</p>
<ul>
<li>Hero {heroName} {heroType}</li>
<li>Item {name} {heroName} {strengthBonus} {agilityBonus} {intelligenceBonus} {hitpointsBonus} {damageBonus}</li>
<li>Recipe {name} {heroName} {strengthBonus} {agilityBonus} {intelligenceBonus} {hitpointsBonus} {damageBonus} {requiredItem1} {requiredItem2}. . .</li>
<li>Inspect {heroName}</li>
<li>Quit</li>
</ul>
<h3>Output</h3>
<p>Each of the commands generates <strong>output</strong>. Here are the <strong>output formats</strong> of each command:</p>
<ul>
<li>Hero Command – registers a hero of the given type, with the given name. Prints the following result:</li>
</ul>
<p>Created {type} - {name}</p>
<p> </p>
<ul>
<li>Item Command – adds a CommonItem to a specified hero.</li>
</ul>
<p>Added item - {itemName} to Hero - {heroName}</p>
<p> </p>
<ul>
<li>Recipe Command – adds a RecipeItem to a specified hero.</li>
</ul>
<p>Added recipe - {recipeName} to Hero - {heroName}</p>
<p> </p>
<ul>
<li>Inspect command – provides <strong>information</strong> about a <strong>hero’s</strong> <strong>stats</strong> and <strong>items</strong>, in the following format:</li>
</ul>
<p>Hero: {heroName}, Class: {heroType}<br /> HitPoints: {hitpoints}, Damage: {damage}<br /> Strength: {strength}<br /> Agility: {agility}<br /> Intelligence: {intelligence}<br /> Items:<br /> ###Item: {item1Name}<br /> ###+{strengthBonus} Strength<br /> ###+{agilityBonus} Agility<br /> ###+{intelligenceBonus} Intelligence<br /> ###+{hitpointsBonus} HitPoints<br /> ###+{damageBonus} Damage<br /> ###Item: {item2Name}<br /> . . .</p>
<ul>
<li>In case the hero <strong>has NO items</strong>, print “Items: None” below the stats.</li>
</ul>
<p> </p>
<ul>
<li>Quit command – <strong>prints</strong> all heroes <strong>ordered</strong> in <strong>descending order</strong> by <strong>the sum</strong> of their (<strong>Strength</strong> + <strong>Agility</strong> + <strong>Intelligence</strong>) and if <strong>2 heroes</strong> have the <strong>SAME SUM</strong>, they should be <strong>ordered</strong> in <strong>descending order</strong> by <strong>the sum</strong> of their (<strong>hitpoints</strong> + <strong>damage</strong>). The format, in which the heroes should be printed is:</li>
</ul>
<ol>
<li>{heroType}: {heroName}<br /> ###HitPoints: {hitpoints}<br /> ###Damage: {damage}<br /> ###Strength: {strength}<br /> ###Agility: {agility}<br /> ###Intelligence: {intelligence}<br /> ###Items: {item1Name}, {item2Name}, {item3Name}. . .<br /> 2. {heroType}: {heroName}<br /> . . .
<ul>
<li>In case the hero <strong>has NO items</strong>, print “Items: None” below the stats.</li>
</ul>
</li>
</ol>
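<p>The ordering rule for Quit maps naturally onto a composed comparator: sort by the primary sum, then by the tie-break sum, and reverse the composed comparator to get descending order on both keys. The Java sketch below uses illustrative names, not the Skeleton’s; a C# variant would use <code>OrderByDescending</code>/<code>ThenByDescending</code> instead.</p>

```java
import java.util.*;
import java.util.stream.Collectors;

class RankedHero {
    final String name;
    final long strength, agility, intelligence, hitPoints, damage;
    RankedHero(String name, long str, long agi, long intel, long hp, long dmg) {
        this.name = name; this.strength = str; this.agility = agi;
        this.intelligence = intel; this.hitPoints = hp; this.damage = dmg;
    }
    long primarySum() { return strength + agility + intelligence; }   // Str + Agi + Int
    long tieBreakSum() { return hitPoints + damage; }                 // HP + Damage
}

class HeroRanking {
    // Descending by primarySum, ties broken descending by tieBreakSum.
    static List<RankedHero> ordered(Collection<RankedHero> heroes) {
        return heroes.stream()
                .sorted(Comparator.comparingLong(RankedHero::primarySum)
                        .thenComparingLong(RankedHero::tieBreakSum)
                        .reversed())
                .collect(Collectors.toList());
    }
}
```

<p>Applied to the second example test, both wizards tie on 450 primary points, so Jefrey (3852 HP + Damage) is printed before Donald (3850).</p>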
<h3>Constraints</h3>
<ul>
<li>The <strong>names</strong> of the <strong>heroes</strong> and the <strong>items</strong> may contain <strong>only Alphanumeric characters</strong>.</li>
<li>The <strong>strengthBonus</strong>, <strong>intelligenceBonus</strong>, <strong>agilityBonus</strong>, <strong>hitpointsBonus</strong>, <strong>damageBonus</strong> <strong>stats</strong> of the <strong>ITEMS</strong><br /> will be <strong>valid integers</strong> in <strong>range [0, 2<sup>30</sup>]</strong>.</li>
<li>There will be <strong>NO invalid input</strong>, like missing arguments from the input or non-existent heroes in the commands, requiring hero names.</li>
</ul>
<p> </p>
<h3>Example Tests</h3>
<table width="1412">
<tbody>
<tr>
<td width="764">
<p><strong>Input</strong></p>
</td>
<td width="648">
<p><strong>Output</strong></p>
</td>
</tr>
<tr>
<td width="764">
<p>Hero Ivan Barbarian</p>
<p>Hero Pesho Assassin</p>
<p>Item Knife Ivan 0 10 0 0 30</p>
<p>Item Stick Ivan 0 0 10 0 5</p>
<p>Recipe Spear Ivan 25 10 10 100 50 Knife Stick</p>
<p>Inspect Ivan</p>
<p>Inspect Pesho</p>
<p>Quit</p>
</td>
<td width="648">
<p>Created Barbarian - Ivan</p>
<p>Created Assassin - Pesho</p>
<p>Added item - Knife to Hero - Ivan</p>
<p>Added item - Stick to Hero - Ivan</p>
<p>Added recipe - Spear to Hero - Ivan</p>
<p>Hero: Ivan, Class: Barbarian</p>
<p>HitPoints: 450, Damage: 200</p>
<p>Strength: 115</p>
<p>Agility: 35</p>
<p>Intelligence: 20</p>
<p>Items:</p>
<p>###Item: Spear</p>
<p>###+25 Strength</p>
<p>###+10 Agility</p>
<p>###+10 Intelligence</p>
<p>###+100 HitPoints</p>
<p>###+50 Damage</p>
<p>Hero: Pesho, Class: Assassin</p>
<p>HitPoints: 150, Damage: 300</p>
<p>Strength: 25</p>
<p>Agility: 100</p>
<p>Intelligence: 15</p>
<p>Items: None</p>
<p>1. Barbarian: Ivan</p>
<p>###HitPoints: 450</p>
<p>###Damage: 200</p>
<p>###Strength: 115</p>
<p>###Agility: 35</p>
<p>###Intelligence: 20</p>
<p>###Items: Spear</p>
<p>2. Assassin: Pesho</p>
<p>###HitPoints: 150</p>
<p>###Damage: 300</p>
<p>###Strength: 25</p>
<p>###Agility: 100</p>
<p>###Intelligence: 15</p>
<p>###Items: None</p>
</td>
</tr>
<tr>
<td width="764">
<p>Hero Donald Wizard</p>
<p>Item Staff Donald 0 10 50 100 100</p>
<p>Item Orb Donald 0 0 100 100 350</p>
<p>Hero Jefrey Wizard</p>
<p>Item Staff Jefrey 0 10 50 100 100</p>
<p>Item Orb Jefrey 0 0 100 100 350</p>
<p>Recipe Oculus Jefrey 100 100 100 1000 2500 Staff Orb</p>
<p>Recipe Oculus Donald 100 100 100 1000 2500 Staff Orb</p>
<p>Item Ring Jefrey 0 0 0 1 1</p>
<p>Quit</p>
</td>
<td width="648">
<p>Created Wizard - Donald</p>
<p>Added item - Staff to Hero - Donald</p>
<p>Added item - Orb to Hero - Donald</p>
<p>Created Wizard - Jefrey</p>
<p>Added item - Staff to Hero - Jefrey</p>
<p>Added item - Orb to Hero - Jefrey</p>
<p>Added recipe - Oculus to Hero - Jefrey</p>
<p>Added recipe - Oculus to Hero - Donald</p>
<p>Added item - Ring to Hero - Jefrey</p>
<p>1. Wizard: Jefrey</p>
<p>###HitPoints: 1101</p>
<p>###Damage: 2751</p>
<p>###Strength: 125</p>
<p>###Agility: 125</p>
<p>###Intelligence: 200</p>
<p>###Items: Oculus, Ring</p>
<p>2. Wizard: Donald</p>
<p>###HitPoints: 1100</p>
<p>###Damage: 2750</p>
<p>###Strength: 125</p>
<p>###Agility: 125</p>
<p>###Intelligence: 200</p>
<p>###Items: Oculus</p>
</td>
</tr>
</tbody>
</table>
<h3>Tasks</h3>
<h4>Task 1: High Quality Structure</h4>
<h5>Refactor the given Skeleton code and use it.</h5>
<p>Apparently, there was a person who tried to write the program before you, but he couldn’t do much, so he was … Detached. But he somehow managed to write the Inventory class. His work, however, is not that trustworthy, so you might have to keep an eye on it for potential <strong>functionality bugs</strong> and things that <strong>do NOT follow </strong>the <strong>good practices</strong> of <strong>Object-Oriented Programming</strong>.</p>
<p>The previous employee left a single <strong>TODO</strong> in the code. It requires you to initialize a CommonItem with the <strong>stat bonuses</strong> of the RecipeItem given as a <strong>parameter</strong> to the <strong>corresponding method</strong>.</p>
<p>Refactor anything, which will <strong>improve</strong> the <strong>code quality</strong>, in your opinion. Be careful <strong>NOT</strong> to <strong>break the code</strong> or one of the <strong>rules</strong> specified in the <strong>Skeleton</strong> <strong>section</strong>.</p>
<p><strong>Implement</strong> the <strong>given</strong> <strong>INTERFACES</strong> in your class definitions, all of them.</p>
<h5>High Quality Code.</h5>
<p>Achieve good separation of concerns using abstractions and interfaces to decouple classes, while reusing code through inheritance and polymorphism. Your classes should have strong cohesion (a single responsibility) and loose coupling (know about as few other classes as possible).</p>
<p>Make sure you <strong>inject</strong> <strong>all</strong> of your class <strong>dependencies</strong> through <strong>interfaces</strong>.</p>
<h5>Reflection.</h5>
<p>Implement the Items property method of the Hero entities, with <strong>reflection</strong>.</p>
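<p>One common way to meet the reflection requirement is to mark the hero’s item-holding fields with the annotation shipped in the Skeleton and collect their values at runtime. The sketch below uses a hypothetical <code>@ItemSlot</code> marker and illustrative class names — substitute the Skeleton’s own annotation and types.</p>

```java
import java.lang.annotation.*;
import java.lang.reflect.Field;
import java.util.*;

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface ItemSlot {} // hypothetical marker; the real Skeleton ships its own annotation

class ItemReflector {
    // Collects the values of all fields marked with @ItemSlot on the given hero.
    static List<Object> collectItems(Object hero) {
        List<Object> items = new ArrayList<>();
        for (Field field : hero.getClass().getDeclaredFields()) {
            if (!field.isAnnotationPresent(ItemSlot.class)) {
                continue; // not an item-holding field
            }
            field.setAccessible(true); // fields are private on the entities
            try {
                Object value = field.get(hero);
                if (value != null) {
                    items.add(value);
                }
            } catch (IllegalAccessException e) {
                throw new IllegalStateException(e);
            }
        }
        return items;
    }
}

class DemoHero { // toy holder used only to demonstrate the lookup
    @ItemSlot private String weapon = "Spear";
    private String notAnItem = "ignored";
}
```

<p>Only annotated fields are returned, so helper fields on the hero never leak into the Items result.</p>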
<h4>Task 2: Correct business logic.</h4>
<p>The given code provides some functionality, but it does not cover the entire task. Implement the rest of the business logic, using the given code, and implement everything following the requirements specification. Check your solutions in the Judge system.</p>
<p>Make sure you have <strong>fixed</strong> <strong>ALL BUGS</strong> in the Inventory logic <strong>before</strong> you <strong>submit your code</strong> in <strong>Judge</strong> or you are sure to get <strong>incorrect results</strong>.</p>
<h4>Task 3: Unit Testing.</h4>
<p>Test <strong>ALL</strong> of the Inventory class’s methods for potential bugs.</p>
<p>You are allowed to use <strong>only</strong> the <strong>classes </strong>and<strong> interfaces</strong>, <strong>PROVIDED BY THE</strong> <strong>Skeleton</strong> in your <strong>unit testing</strong>. If you try to use the classes you have implemented, you will <strong>NOT</strong> <strong>receive any points</strong>.<br /> </p> | 61.299712 | 543 | 0.709276 | eng_Latn | 0.842409 |
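<p>A typical Inventory test follows the arrange-act-assert shape. Because the Skeleton’s classes cannot be reproduced here, the snippet below uses a tiny hypothetical stand-in purely to make that shape concrete — your real tests must exercise the provided Inventory through its interface, as required above.</p>

```java
import java.util.*;

// Hypothetical stand-in so the example compiles on its own;
// real tests must target the Skeleton's Inventory instead.
class TinyInventory {
    private final List<long[]> commonItems = new ArrayList<>(); // {str, agi, intel, hp, dmg}

    void addCommonItem(long str, long agi, long intel, long hp, long dmg) {
        commonItems.add(new long[] {str, agi, intel, hp, dmg});
    }

    long totalStrengthBonus() { return sumAt(0); }
    long totalDamageBonus() { return sumAt(4); }

    private long sumAt(int index) {
        long total = 0;
        for (long[] stats : commonItems) {
            total += stats[index];
        }
        return total;
    }
}

class InventoryBonusTest {
    static void emptyInventoryGivesNoBonuses() {
        TinyInventory inventory = new TinyInventory();   // arrange
        assertEquals(0, inventory.totalStrengthBonus()); // act + assert
    }

    static void bonusesAreSummedAcrossCommonItems() {
        TinyInventory inventory = new TinyInventory();
        inventory.addCommonItem(25, 10, 10, 100, 50); // Spear
        inventory.addCommonItem(0, 0, 0, 1, 1);       // Ring
        assertEquals(25, inventory.totalStrengthBonus());
        assertEquals(51, inventory.totalDamageBonus());
    }

    static void assertEquals(long expected, long actual) {
        if (expected != actual) {
            throw new AssertionError("expected " + expected + " but was " + actual);
        }
    }
}
```

<p>Write one such test per Inventory method, covering both the empty case and the combining path, so the bugs mentioned in Task 2 surface before you submit to Judge.</p>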
b1094b47b9b6f638f43baa63fd6b98e255dc5e3b | 549 | md | Markdown | module-1/StackAPI/README.md | himaggerst/daft-miami-0120-labs | 25192be394acfee438d3be22396d6d6d6cf93b81 | [
"MIT"
] | null | null | null | module-1/StackAPI/README.md | himaggerst/daft-miami-0120-labs | 25192be394acfee438d3be22396d6d6d6cf93b81 | [
"MIT"
] | 3 | 2019-10-28T21:38:48.000Z | 2019-12-17T01:45:37.000Z | module-1/StackAPI/README.md | himaggerst/daft-miami-0120-labs | 25192be394acfee438d3be22396d6d6d6cf93b81 | [
"MIT"
] | 7 | 2020-01-21T17:33:11.000Z | 2020-01-22T02:11:37.000Z | 
# Lab | StackAPI
Let's check if you can handle working with an API.
Use [StackAPI](https://stackapi.readthedocs.io/en/latest/), a Python wrapper for the Stack Exchange API, and answer the following questions:
* Question 1: Find the questions and answers of last month.
* Question 2: Find the most voted question today with at least a score of 5 and tagged with 'python'.
* Question 3: Find the answers with id 6784 and 6473.
## Deliverables
- Submit the `main.ipynb` file with the solutions.
| 39.214286 | 140 | 0.746812 | eng_Latn | 0.986452 |
b10a4ae77700fd0bcabd4c29a6161e7e6683a3aa | 1,094 | md | Markdown | docs/api/@remirror/extension-collaboration/extension-collaboration.collaborationextensionoptions.onsendablereceived.md | jankeromnes/remirror | 95306cee4c76ee9fd7271a0ab6069f0a0a6803d9 | [
"MIT"
] | 1 | 2021-05-22T06:22:01.000Z | 2021-05-22T06:22:01.000Z | docs/api/@remirror/extension-collaboration/extension-collaboration.collaborationextensionoptions.onsendablereceived.md | jankeromnes/remirror | 95306cee4c76ee9fd7271a0ab6069f0a0a6803d9 | [
"MIT"
] | null | null | null | docs/api/@remirror/extension-collaboration/extension-collaboration.collaborationextensionoptions.onsendablereceived.md | jankeromnes/remirror | 95306cee4c76ee9fd7271a0ab6069f0a0a6803d9 | [
"MIT"
] | null | null | null | <!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@remirror/extension-collaboration](./extension-collaboration.md) > [CollaborationExtensionOptions](./extension-collaboration.collaborationextensionoptions.md) > [onSendableReceived](./extension-collaboration.collaborationextensionoptions.onsendablereceived.md)
## CollaborationExtensionOptions.onSendableReceived() method
Called when an an editor transaction occurs and there are changes ready to be sent to the server.
<b>Signature:</b>
```typescript
onSendableReceived(params: OnSendableReceivedParams): void;
```
## Parameters
| Parameter | Type | Description |
| --- | --- | --- |
| params | <code>OnSendableReceivedParams</code> | the sendable and jsonSendable properties which can be sent to your backend |
<b>Returns:</b>
`void`
## Remarks
The callback will receive the `jsonSendable` which can be sent to the server as it is. If you need more control then the `sendable` property can be used to shape the data the way you require.
| 37.724138 | 292 | 0.740402 | eng_Latn | 0.959084 |
b10a97a1540adcd8a013579d865d7adff6f70a02 | 11,439 | md | Markdown | _wiki/human_associated.md | florian-gschwend/mothur.github.io | aaa1bc4382070d1f2233961e0da808b94e8f902d | [
"CC-BY-4.0"
] | 14 | 2020-03-12T12:11:00.000Z | 2021-07-13T21:39:20.000Z | _wiki/human_associated.md | florian-gschwend/mothur.github.io | aaa1bc4382070d1f2233961e0da808b94e8f902d | [
"CC-BY-4.0"
] | 69 | 2020-03-20T15:01:52.000Z | 2022-03-31T18:43:11.000Z | _wiki/human_associated.md | florian-gschwend/mothur.github.io | aaa1bc4382070d1f2233961e0da808b94e8f902d | [
"CC-BY-4.0"
] | 17 | 2020-03-24T13:43:34.000Z | 2022-01-06T20:34:49.000Z | ---
title: 'Human Associated'
redirect_from: '/wiki/Human_Associated'
---
Here is a link to NCBI definition page for the [human\_associated
package](https://www.ncbi.nlm.nih.gov/biosample/docs/packages/MIMARKS.survey.human-associated.4.0/).
## Required
===sample\_name This is your group name. mothur will fill this in for
you, thanks mom!
### description
Description of sample
### sample\_title
The title for your sample.
format: {text}
### seq\_methods
The free form descriptions of methods used to create the sequencing
library.
format: {text}
### organism
You must choose this from the NCBI's list found here. This is
controlled vocabulary. Your choices are: activated carbon metagenome,
activated sludge metagenome, air metagenome, anaerobic digester
metagenome, ant fungus garden metagenome, aquatic metagenome, activated
carbon metagenome, activated sludge metagenome, beach sand metagenome,
biofilm metagenome, biofilter metagenome, biogas fermenter metagenome,
bioreactor metagenome, bioreactor sludge metagenome, clinical
metagenome, coal metagenome, compost metagenome, dust metagenome,
fermentation metagenome, food fermentation metagenome, food metagenome,
freshwater metagenome, freshwater sediment metagenome, groundwater
metagenome, halite metagenome, hot springs metagenome, hydrocarbon
metagenome, hydrothermal vent metagenome, hypersaline lake metagenome,
ice metagenome, indoor metagenome, industrial waste metagenome, mangrove
metagenome, marine metagenome, marine sediment metagenome, microbial mat
metagenome, mine drainage metagenome, mixed culture metagenome, oil
production facility metagenome, paper pulp metagenome, permafrost
metagenome, plastisphere metagenome, power plant metagenome, retting
rhizosphere metagenome, rock metagenome, salt lake metagenome, saltern
metagenome, sediment metagenome, snow metagenome, soil metagenome,
stromatolite metagenome, terrestrial metagenome, tomb wall metagenome,
wastewater metagenome, wetland metagenome, whale fall metagenome, algae
metagenome, ant metagenome, bat metagenome, beetle metagenome, bovine
gut metagenome, bovine metagenome, chicken gut metagenome, coral
metagenome, echinoderm metagenome, endophyte metagenome, epibiont
metagenome, fish metagenome, fossil metagenome, gill metagenome, gut
metagenome, honeybee metagenome, human gut metagenome, human lung
metagenome, human metagenome, human nasal/pharyngeal metagenome, human
oral metagenome, human skin metagenome, insect gut metagenome, insect
metagenome, mollusc metagenome, mosquito metagenome, mouse gut
metagenome, mouse metagenome, mouse skin metagenome, nematode
metagenome, oral metagenome, phyllosphere metagenome, pig metagenome,
plant metagenome, primate metagenome, rat metagenome, root metagenome,
sea squirt metagenome, seed metagenome, shoot metagenome, skin
metagenome, snake metagenome, sponge metagenome, stomach metagenome,
symbiont metagenome, termite gut metagenome, termite metagenome, upper
respiratory tract metagenome, urine metagenome, viral metagenome,
wallaby gut metagenome, wasp metagenome, synthetic metagenome,
metagenome. You can modify your choice after submission.
### collection\_date
Date of sampling, in \\"DD-Mmm-YYYY\\", \\"Mmm-YYYY\\" or
\\"YYYY\\" format (single instance, eg., 05-Oct-1990, Oct-1990 or
1990) or ISO 8601 standard \\"YYYY-mm-dd\\" or
\\"YYYY-mm-ddThh:mm:ss\\" (eg. 1990-11-05 or 1990-11-05T14:41:36)
format: {timestamp}
### env\_biome
Major class of ecologically similar communities of plants, animals, and
other organisms (eg., desert, coral reef).
format: {term}
### env\_feature
Geographical environmental feature (eg., harbor, lake).
format: {term}
### geo\_loc\_name
Geographical origin of the sample; use the appropriate name from this
list [http://www.insdc.org/documents/country-qualifier-vocabulary](http://www.insdc.org/documents/country-qualifier-vocabulary). Use
a colon to separate the country or ocean from more detailed information
about the location, eg \\"Canada: Vancouver\\" or \\"Germany: halfway
down Zugspitze, Alps\\".
format: {term}:{term}:{text}
### lat\_lon
The geographical coordinates of the location where the sample was
collected. Specify as degrees latitude and longitude in format
\\"d\[d.dddd\] N\|S d\[dd.dddd\] W\|E\\", eg, 38.98 N 77.11 W.
format: {float} {float}
### env\_material
The matter displaced by the sample (eg., air, soil, water).
format: {term}
### host
The natural (as opposed to laboratory) host to the organism from which
sample was obtained.
## Optional
### age
The age at the time of sampling; relevant scale depends on species and
study, e.g. could be seconds for amoebae or centuries for trees.
format: {float} {unit}
### amniotic\_fluid\_color
The specification of the color of the amniotic fluid sample.
format: {text}
### blood\_blood\_disord
The history of blood disorders; can include multiple disorders
format: {text}
### body\_mass\_index
The body mass index, calculated as weight/(height)squared
format: {float} {unit}
### body\_product
The substance produced by the plant where the sample was obtained from.
format: {text}
### chem\_administration
The list of chemical compounds administered to the host or site where
sampling occurred, and when (e.g. antibiotics, N fertilizer, air
filter); can include multiple compounds. For Chemical Entities of
Biological Interest ontology (CHEBI) (v1.72), please see
[https://bioportal.bioontology.org/visualize/44603](https://bioportal.bioontology.org/visualize/44603).
format: {term}; {timestamp}
### diet
The type of diet depending on the sample for animals omnivore, herbivore
etc., for humans high-fat, meditteranean etc.; can include multiple diet
types.
format: {text}
### diet\_last\_six\_month
The specification of major diet changes in the last six months, if yes
the change should be specified.
format: {boolean};{text}
### disease
The list of diseases diagnosed; can include multiple diagnoses. the
value of the field depends on host; for humans the terms should be
chosen from DO (Disease Ontology), free text for non-human. For DO
terms, please see
[https://disease-ontology.org](https://disease-ontology.org)
format: {term}
### drug\_usage
Any drug used by subject and the frequency of usage; can include
multiple drugs used
format: {text};{integer}/[year|month|week|day|hour]
### ethnicity
The ethnicity of the subject.
format: {integer|text}
### family\_relationship
The relationships to other samples in the same study; can include
multiple relationships.
format: {text}; {text}
### fetal\_health\_stat
The specification of fetal health status, should also include abortion.
format: {text}
### genotype
The observed genotype.
format: {text}
### gestation\_state
The specification of the gestation state.
format: {text}
### height
The height of subject.
format: {float} {unit}
### hiv\_stat
HIV status of subject, if yes HAART initiation status should also be
indicated as \[yes or no\]
format: {boolean};{boolean}
### host\_body\_temp
The core body temperature of the host when sample was collected.
format: {float} {unit}
### host\_subject\_id
A unique identifier by which each subject can be referred to,
de-identified, e.g. \#131
format: {text}
### ihmc\_medication\_code
can include multiple medication codes
format: {integer}
### kidney\_disord
The history of kidney disorders; can include multiple disorders
format: {text}
### last\_meal
The content of last meal and time since feeding; can include multiple
values.
format: {text};{period}
### maternal\_health\_stat
The specification of the maternal health status.
format: {text}
### medic\_hist\_perform
Whether full medical history was collected.
format: {boolean}
### nose\_throat\_disord
The history of nose-throat disorders; can include multiple disorders.
format: {text}
### occupation
The most frequent job performed by subject.
format: {integer}
### organism\_count
The total count of any organism per gram or volume of sample,should
include name of organism followed by count; can include multiple
organism counts.
format: {text};{float} {unit}
### oxy\_stat\_samp
The oxygenation status of sample.
format: [, 'aerobic', 'anaerobic']
### perturbation
The type of perturbation, e.g. chemical administration, physical
disturbance, etc., coupled with time that perturbation occurred; can
include multiple perturbation types.
format: {text};{interval}
### pet\_farm\_animal
The specification of presence of pets or farm animals in the environment
of subject, if yes the animals should be specified; can include multiple
animals present.
format: {boolean};{text}
### phenotype
Phenotype of sampled organism. For Phenotypic quality Ontology (PATO)
(v1.269) terms, please see
[https://bioportal.bioontology.org/visualize/44601](https://bioportal.bioontology.org/visualize/44601).
format: {term}
### pulmonary\_disord
The history of pulmonary disorders; can include multiple disorders.
format: {text}
### pulse
The resting pulse, measured as beats per minute.
format: {float} {unit}
### rel\_to\_oxygen
Aerobic or anaerobic
format: [, 'aerobe', 'anaerobe', 'facultative', 'microaerophilic', 'microanaerobe', 'obligate aerobe', 'obligate anaerobe']
### samp\_collect\_device
Method or device employed for collecting sample
format: {text}
### samp\_mat\_process
Processing applied to the sample during or after isolation.
format: {text|term}
### samp\_salinity
The sample salinity.
format: {float} {unit}
### samp\_size
The sample size.
format: {float} {unit}
### samp\_store\_dur
The sample storage duration.
format: {interval}
### samp\_store\_loc
The sample storage location. Usually name of a specific freezer/room
format: {text}
### samp\_store\_temp
The sample storage temperature.
format: {float} {unit}
### sex
The physical sex of sampled organism.
format: [, 'male', 'female', 'neuter', 'hermaphrodite', 'not determined']
### smoker
The specification of smoking status.
format: {boolean}
### study\_complt\_stat
The specification of study completion status, if no the reason should be
specified
format: {boolean};[adverse event|non-compliance|lost to follow up|other-specify]
### temp
The temperature of the sample at time of sampling.
format: {float} {unit}
### tissue
Type of tissue the sample was taken from.
### tot\_mass
The total mass of the host at collection, the unit depends on host.
format: {float} {unit}
### travel\_out\_six\_month
The specification of the countries travelled in the last six months; can
include multiple travels.
format: {text}
### twin\_sibling
The specification of twin sibling presence.
format: {boolean}
### urine\_collect\_meth
The specification of urine collection method.
format: [, 'clean catch', 'catheter']
### urogenit\_tract\_disor
The history of urogenitaltract disorders; can include multiple
disorders.
format: {text}
### weight\_loss\_3\_month
The specification of weight loss in the last three months, if yes should
be further specified to include amount of weight loss.
format: {boolean};{float} {unit}
### user\_defined
You may create your own optional fields to describe your sample.
| 24.54721 | 132 | 0.749628 | eng_Latn | 0.963855 |
b10aff70886438032253436b772a574bc6ada517 | 3,588 | md | Markdown | test/snapshots/no-for-loop.js.md | mmkal/eslint-plugin-unicorn | 643169bfc9dbe0ab6b75b79102c6b12ffc620e04 | [
"MIT"
] | null | null | null | test/snapshots/no-for-loop.js.md | mmkal/eslint-plugin-unicorn | 643169bfc9dbe0ab6b75b79102c6b12ffc620e04 | [
"MIT"
] | null | null | null | test/snapshots/no-for-loop.js.md | mmkal/eslint-plugin-unicorn | 643169bfc9dbe0ab6b75b79102c6b12ffc620e04 | [
"MIT"
] | null | null | null | # Snapshot report for `test/no-for-loop.js`
The actual snapshot is saved in `no-for-loop.js.snap`.
Generated by [AVA](https://avajs.dev).
## no-for-loop - #1
> Snapshot 1
`␊
Input:␊
1 | for (let i = 0; i < arr.length; i += 1) {␊
2 | console.log(arr[i])␊
3 | }␊
␊
Output:␊
1 | for (const element of arr) {␊
2 | console.log(element)␊
3 | }␊
␊
Error 1/1:␊
> 1 | for (let i = 0; i < arr.length; i += 1) {␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 2 | console.log(arr[i])␊
| ^^^^^^^^^^^^^^^^^^^^␊
> 3 | }␊
| ^^ Use a `for-of` loop instead of this `for` loop.␊
`
## no-for-loop - #2
> Snapshot 1
`␊
Input:␊
1 | for (let i = 0; i < plugins.length; i++) {␊
2 | let plugin = plugins[i];␊
3 | plugin = calculateSomeNewValue();␊
4 | // ...␊
5 | }␊
␊
Output:␊
1 | for (let plugin of plugins) {␊
2 | plugin = calculateSomeNewValue();␊
3 | // ...␊
4 | }␊
␊
Error 1/1:␊
> 1 | for (let i = 0; i < plugins.length; i++) {␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 2 | let plugin = plugins[i];␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 3 | plugin = calculateSomeNewValue();␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 4 | // ...␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 5 | }␊
| ^^ Use a `for-of` loop instead of this `for` loop.␊
`
## no-for-loop - #3
> Snapshot 1
`␊
Input:␊
1 | for (let i = 0; i < array.length; i++) {␊
2 | var foo = array[i];␊
3 | foo = bar();␊
4 | }␊
␊
Output:␊
1 | for (var foo of array) {␊
2 | foo = bar();␊
3 | }␊
␊
Error 1/1:␊
> 1 | for (let i = 0; i < array.length; i++) {␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 2 | var foo = array[i];␊
| ^^^^^^^^^^^^^^^^^^^^␊
> 3 | foo = bar();␊
| ^^^^^^^^^^^^^^^^^^^^␊
> 4 | }␊
| ^^ Use a `for-of` loop instead of this `for` loop.␊
`
## no-for-loop - #4
> Snapshot 1
`␊
Input:␊
1 | for (let i = 0; i < array.length; i++) {␊
2 | let foo = array[i];␊
3 | }␊
␊
Output:␊
1 | for (let foo of array) {␊
2 | }␊
␊
Error 1/1:␊
> 1 | for (let i = 0; i < array.length; i++) {␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 2 | let foo = array[i];␊
| ^^^^^^^^^^^^^^^^^^^^␊
> 3 | }␊
| ^^ Use a `for-of` loop instead of this `for` loop.␊
`
## no-for-loop - #5
> Snapshot 1
`␊
Input:␊
1 | for (let i = 0; i < array.length; i++) {␊
2 | const foo = array[i];␊
3 | }␊
␊
Output:␊
1 | for (const foo of array) {␊
2 | }␊
␊
Error 1/1:␊
> 1 | for (let i = 0; i < array.length; i++) {␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 2 | const foo = array[i];␊
| ^^^^^^^^^^^^^^^^^^^^^^␊
> 3 | }␊
| ^^ Use a `for-of` loop instead of this `for` loop.␊
`
## no-for-loop - #6
> Snapshot 1
`␊
Input:␊
1 | for (let i = 0; i < array.length; i++) {␊
2 | var foo = array[i], bar = 1;␊
3 | }␊
␊
Output:␊
1 | for (var foo of array) {␊
2 | var bar = 1;␊
3 | }␊
␊
Error 1/1:␊
> 1 | for (let i = 0; i < array.length; i++) {␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 2 | var foo = array[i], bar = 1;␊
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^␊
> 3 | }␊
| ^^ Use a `for-of` loop instead of this `for` loop.␊
`
| 22.708861 | 61 | 0.343645 | eng_Latn | 0.295061 |
b10bd9ffde065f9feebd023c4df9ada0a729d489 | 724 | md | Markdown | README.md | krystiankaluzny/solr-twitter-demo | 1a06a182b23dc5224314005bcd51bc11f4511e86 | [
"MIT"
] | null | null | null | README.md | krystiankaluzny/solr-twitter-demo | 1a06a182b23dc5224314005bcd51bc11f4511e86 | [
"MIT"
] | null | null | null | README.md | krystiankaluzny/solr-twitter-demo | 1a06a182b23dc5224314005bcd51bc11f4511e86 | [
"MIT"
] | null | null | null | # solr-twitter-demo
Apache Solr vs RDBMS - searching text in tweets
Start app and open http://localhost:8081/
Apache Solr stores the tweets core index on the file system,
so before you restart the application delete target/classes/solr/tweets/data.
To search in the RDBMS, a simple SQL LIKE statement was used.
So if you try to find the *abc* phrase, then *abcdef* may be on the list of results.
On the other hand Apache Solr tweets core uses StandardTokenizer to index texts,
so only whole words can be found.
And if you try to find *abc*, then you can get words like *abc*, *ABC*, *aBc*, but not *abcdef*.
This leads to the conclusion that searching for something in the RDBMS and in Apache Solr with the current configuration may give you different results.
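The difference can be sketched in a few lines of JavaScript (an illustration only; Solr's StandardTokenizer and lowercase filtering do more than this simple split):

```javascript
// Substring matching - the behavior of SQL `LIKE '%abc%'`
const likeMatch = (text, phrase) => text.includes(phrase);

// Whole-word matching - a rough stand-in for token-based search with lowercasing
const tokenMatch = (text, word) =>
  text.split(/\W+/).some((t) => t.toLowerCase() === word.toLowerCase());

const tweet = 'abcdef and aBc';
console.log(likeMatch(tweet, 'abc'));     // true, matches inside "abcdef" too
console.log(tokenMatch(tweet, 'abc'));    // true, the whole token "aBc" matches
console.log(tokenMatch('abcdef', 'abc')); // false
```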
| 42.588235 | 136 | 0.774862 | eng_Latn | 0.988907 |
b10bed051871f481c1ec897274f24c4d745e8c98 | 168 | md | Markdown | scenarios/README.md | SarahTV/SSK | ac7f5b7b1f1c02aefcb706abd80178f86c216cf7 | [
"CC-BY-4.0"
] | null | null | null | scenarios/README.md | SarahTV/SSK | ac7f5b7b1f1c02aefcb706abd80178f86c216cf7 | [
"CC-BY-4.0"
] | null | null | null | scenarios/README.md | SarahTV/SSK | ac7f5b7b1f1c02aefcb706abd80178f86c216cf7 | [
"CC-BY-4.0"
] | null | null | null | ### Scenarios:
* Unstable scenarios are marked with the suffix `_unst` and are not displayed in the application.
* Synchronizing GitHub and the app may take some time.
| 33.6 | 97 | 0.77381 | eng_Latn | 0.999888 |
b10c79d7392c20108f26c2878e49c06432fa08fa | 5,515 | md | Markdown | docs/framework/data/adonet/dataset-datatable-dataview/generating-strongly-typed-datasets.md | Graflinger/docs.de-de | 9dfa50229d23e2ee67ef4047b6841991f1e40ac4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/data/adonet/dataset-datatable-dataview/generating-strongly-typed-datasets.md | Graflinger/docs.de-de | 9dfa50229d23e2ee67ef4047b6841991f1e40ac4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/data/adonet/dataset-datatable-dataview/generating-strongly-typed-datasets.md | Graflinger/docs.de-de | 9dfa50229d23e2ee67ef4047b6841991f1e40ac4 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Generating strongly typed DataSets
ms.date: 03/30/2017
dev_langs:
- csharp
- vb
ms.assetid: 54333cbf-bb43-4314-a7d4-6dc1dd1c44b3
ms.openlocfilehash: 25883b7be10c68e527e4e04182b7162574b994d9
ms.sourcegitcommit: 5b6d778ebb269ee6684fb57ad69a8c28b06235b9
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 04/08/2019
ms.locfileid: "59149629"
---
# <a name="generating-strongly-typed-datasets"></a>Generating strongly typed DataSets
Given an XML schema that complies with the XSD (XML Schema Definition Language) standard, you can generate a strongly typed <xref:System.Data.DataSet> using the <legacyBold>XSD.exe</legacyBold> tool provided with the [!INCLUDE[winsdklong](../../../../../includes/winsdklong-md.md)].
(To create an XSD schema from database tables, see <xref:System.Data.DataSet.WriteXmlSchema%2A> or [Work with datasets in Visual Studio](/visualstudio/data-tools/dataset-tools-in-visual-studio).)
The following code shows the syntax for generating a **DataSet** with this tool.
```
xsd.exe /d /l:CS XSDSchemaFileName.xsd /eld /n:XSDSchema.Namespace
```
In this syntax, the `/d` directive tells the tool to generate a **DataSet**, and `/l:` tells the tool which language to use (for example, C# or Visual Basic .NET). The optional `/eld` directive specifies that you can use [!INCLUDE[linq_dataset](../../../../../includes/linq-dataset-md.md)] to query the generated **DataSet**. This option is used when the `/d` option is also specified. For more information, see [Querying Typed DataSets](../../../../../docs/framework/data/adonet/querying-typed-datasets.md). The optional `/n:` directive tells the tool to also generate a namespace for the **DataSet** named **XSDSchema.Namespace**. The output of this command is the file XSDSchemaFileName.cs, which can be compiled and used in an ADO.NET application. The generated code can be compiled as a library or a module.
The following code example shows the syntax used to compile the generated code as a library using the C# compiler (csc.exe).
```
csc.exe /t:library XSDSchemaFileName.cs /r:System.dll /r:System.Data.dll
```
The `/t:` directive tells the tool to compile a library, and the `/r:` directives specify the dependent libraries required for compilation. The output of this command is the file XSDSchemaFileName.dll, which can be passed to the compiler with the `/r:` directive when an ADO.NET application is compiled.
The following code example shows the syntax for accessing, in an ADO.NET application, the namespace that was passed to XSD.exe.
```vb
Imports XSDSchema.Namespace
```
```csharp
using XSDSchema.Namespace;
```
The following code example uses a typed **DataSet** named **CustomerDataSet** to load a list of customers from the **Northwind** database. After the data is loaded using the **Fill** method, the example loops through each customer in the **Customers** table using the typed **CustomersRow** (**DataRow**) object. This provides direct access to the **CustomerID** column, as opposed to going through the **DataColumnCollection**.
```vb
Dim customers As CustomerDataSet= New CustomerDataSet()
Dim adapter As SqlDataAdapter = New SqlDataAdapter( _
"SELECT * FROM dbo.Customers;", _
"Data Source=(local);Integrated " & _
"Security=SSPI;Initial Catalog=Northwind")
adapter.Fill(customers, "Customers")
Dim customerRow As CustomerDataSet.CustomersRow
For Each customerRow In customers.Customers
Console.WriteLine(customerRow.CustomerID)
Next
```
```csharp
CustomerDataSet customers = new CustomerDataSet();
SqlDataAdapter adapter = new SqlDataAdapter(
"SELECT * FROM dbo.Customers;",
"Data Source=(local);Integrated " +
"Security=SSPI;Initial Catalog=Northwind");
adapter.Fill(customers, "Customers");
foreach(CustomerDataSet.CustomersRow customerRow in customers.Customers)
Console.WriteLine(customerRow.CustomerID);
```
The following is the XML schema used in the example.
```xml
<?xml version="1.0" encoding="utf-8"?>
<xs:schema id="CustomerDataSet" xmlns="" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
<xs:element name="CustomerDataSet" msdata:IsDataSet="true">
<xs:complexType>
<xs:choice maxOccurs="unbounded">
<xs:element name="Customers">
<xs:complexType>
<xs:sequence>
<xs:element name="CustomerID" type="xs:string" minOccurs="0" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:choice>
</xs:complexType>
</xs:element>
</xs:schema>
```
## <a name="see-also"></a>See also
- <xref:System.Data.DataColumnCollection>
- <xref:System.Data.DataSet>
- [Typed DataSets](../../../../../docs/framework/data/adonet/dataset-datatable-dataview/typed-datasets.md)
- [DataSets, DataTables, and DataViews](../../../../../docs/framework/data/adonet/dataset-datatable-dataview/index.md)
- [ADO.NET Managed Providers and DataSet Developer Center](https://go.microsoft.com/fwlink/?LinkId=217917)
| 53.028846 | 915 | 0.728196 | deu_Latn | 0.873818 |
b10dd419771eef7ce3e1e701fce1af15c1c02312 | 1,816 | md | Markdown | docs/guide-pl/topic-adding-more-apps.md | kartik-v/yii2-app-practical-a | 253234502e58e7366b99a371fb8472062ceb7b0d | [
"BSD-3-Clause"
] | 33 | 2015-01-08T01:27:59.000Z | 2020-10-11T17:58:59.000Z | docs/guide-pl/topic-adding-more-apps.md | kartik-v/yii2-app-practical-a | 253234502e58e7366b99a371fb8472062ceb7b0d | [
"BSD-3-Clause"
] | 16 | 2015-01-19T05:40:20.000Z | 2020-04-20T11:37:30.000Z | docs/guide-pl/topic-adding-more-apps.md | kartik-v/yii2-app-practical-a | 253234502e58e7366b99a371fb8472062ceb7b0d | [
"BSD-3-Clause"
] | 20 | 2015-01-13T19:02:29.000Z | 2022-09-22T04:34:57.000Z | Adding more applications
=============================
The commonly encountered separation of the front-end part from the back-end part is sometimes not enough. For example,
you may need to carve out yet another application, for, say, a blog. To achieve this:
1. Copy the `frontend` folder to a `blog` folder, `environments/dev/frontend` to `environments/dev/blog`,
   and `environments/prod/frontend` to `environments/prod/blog`.
2. Adjust the namespaces and paths so that they start with `blog` instead of `frontend`.
3. In the `common\config\bootstrap.php` file, add `Yii::setAlias('blog', dirname(dirname(__DIR__)) . '/blog');`.
4. Modify the contents of the `environments/index.php` file (added lines are marked with `+`):
```php
return [
'Development' => [
'path' => 'dev',
'setWritable' => [
'backend/runtime',
'backend/assets',
'frontend/runtime',
'frontend/assets',
+ 'blog/runtime',
+ 'blog/assets',
],
'setExecutable' => [
'yii',
'yii_test',
],
'setCookieValidationKey' => [
'backend/config/main-local.php',
'frontend/config/main-local.php',
+ 'blog/config/main-local.php',
],
],
'Production' => [
'path' => 'prod',
'setWritable' => [
'backend/runtime',
'backend/assets',
'frontend/runtime',
'frontend/assets',
+ 'blog/runtime',
+ 'blog/assets',
],
'setExecutable' => [
'yii',
],
'setCookieValidationKey' => [
'backend/config/main-local.php',
'frontend/config/main-local.php',
+ 'blog/config/main-local.php',
],
],
];
```
| 32.428571 | 118 | 0.552863 | pol_Latn | 0.893895 |
b10dee278e2042b2c23ddd16c79b1ef28557a551 | 1,758 | md | Markdown | collections/_database/2020-01-06-database-solution-01.md | Atercatus/Atercatus.github.io | 5889b1d00aadd999476e1ec9e2d05ebfaaf5a9cb | [
"MIT"
] | null | null | null | collections/_database/2020-01-06-database-solution-01.md | Atercatus/Atercatus.github.io | 5889b1d00aadd999476e1ec9e2d05ebfaaf5a9cb | [
"MIT"
] | 9 | 2020-01-04T17:07:34.000Z | 2020-01-06T08:57:47.000Z | collections/_database/2020-01-06-database-solution-01.md | Atercatus/Atercatus.github.io | 5889b1d00aadd999476e1ec9e2d05ebfaaf5a9cb | [
"MIT"
] | 2 | 2020-02-16T16:24:05.000Z | 2021-02-17T06:31:58.000Z | ---
title: "Introduction to the Optimizer, and Denormalization vs. Join"
excerpt: "Optimizer behavior during a nested loop join, and a comparison of a proper join with denormalization"
last_modified_at: 2020-01-06T12:06:00
---
## The Optimizer during a nested loop join

Assume there are two tables, `dept` and `emp`, as shown above.
Below is an example query for the given situation.
```query
Select B.dname, A.empno, A.ename, A.sal
from emp A, dept B
where A.deptno = B.deptno
```
When performing a nested loop join, in which direction will the Optimizer operate, 1 or 2?
### Case 2
- Reading the 100-row `dept` table first => full scan
- This means turning B.deptno into a constant
- Read the entire `emp` (100,000-row) table while looking for B.deptno (101, 102, ...)
  - At this point, there is no index on the department number (`emp`'s deptno).
- In the end, the 100,000-row `emp` table is full-scanned 100 times
- Amount of I/O blocks = 100,000 X 100 = 10 million I/O blocks
### Case 1
- Reading the 100,000-row `emp` table first => full scan
- This means turning A.deptno into a constant
- Read the entire `dept` table while looking for A.deptno (101, 102, ...)
  - At this point, an index exists on the department number (`dept`'s deptno).
  - Therefore, instead of full-scanning the `dept` table, it fetches the row via RowId table access.
  - If the next employee's deptno is the same, it reads the data already loaded in memory.
- In the end: one full scan of `emp` (100,000 rows) + unique index accesses on `dept`
### Conclusion
Therefore, case 1 performs better than case 2.
Query performance is decided more by I/O than by CPU or memory.
## Join VS Denormalization
This is an example of denormalization that adds dname to the `emp` table.
```
Select A.dname, A.empno, A.ename, A.sal
from emp A
```
- one emp record is 20 bytes => 20 bytes X 100,000
  => total data: about 2MB
- one dept record is 30 bytes => 30 bytes X 100
  => total data: about 3KB
- When performing a hash join or sort-merge join instead of a nested loop join
  - about 2MB + 3KB of I/O
- When adding dname (20 bytes) as a column (denormalization)
  - 4MB
Therefore, the join version performs better.
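The size arithmetic above can be double-checked with a quick calculation (a JavaScript sketch using the numbers from this post):

```javascript
const EMP_ROWS = 100000, EMP_BYTES = 20;  // emp: 100,000 rows of 20 bytes
const DEPT_ROWS = 100, DEPT_BYTES = 30;   // dept: 100 rows of 30 bytes

// Hash / sort-merge join: read each table once
const joinIO = EMP_ROWS * EMP_BYTES + DEPT_ROWS * DEPT_BYTES;

// Denormalization: every emp row grows by a 20-byte dname column
const denormIO = EMP_ROWS * (EMP_BYTES + 20);

console.log(joinIO);   // 2003000 (about 2MB + 3KB)
console.log(denormIO); // 4000000 (about 4MB)
```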
### Conclusion
Denormalization does not unconditionally improve performance. You must perform joins correctly.
When joining, the Optimizer drives from the table that has no index.
| 23.44 | 110 | 0.67975 | kor_Hang | 1.00001 |
b10e76bd35b84998e817a37878b669fbd77d7ca4 | 2,161 | md | Markdown | README.md | ericcornelissen/jekyll-fontello | 53cf04c9e0d02545e01476227203a0a0826058ee | [
"MIT"
] | 3 | 2018-02-23T18:36:30.000Z | 2018-10-03T07:34:49.000Z | README.md | ericcornelissen/jekyll-fontello | 53cf04c9e0d02545e01476227203a0a0826058ee | [
"MIT"
] | 1 | 2019-05-25T16:56:58.000Z | 2019-06-23T19:09:25.000Z | README.md | ericcornelissen/jekyll-fontello | 53cf04c9e0d02545e01476227203a0a0826058ee | [
"MIT"
] | 3 | 2018-04-06T11:53:19.000Z | 2020-02-04T08:25:48.000Z | # jekyll-fontello
[](https://circleci.com/gh/ericcornelissen/jekyll-fontello)
[](https://codecov.io/gh/ericcornelissen/jekyll-fontello)
[](https://codeclimate.com/github/ericcornelissen/jekyll-fontello/maintainability)
[](https://badge.fury.io/rb/jekyll-fontello)
Jekyll plugin that automatically downloads your webfont from Fontello.
## Installation
1. Install the `jekyll-fontello` gem:
```shell
$ gem install jekyll-fontello
```
2. Add `jekyll-fontello` to the list of plugins in `_config.yml`:
```yaml
plugins:
- jekyll-fontello
```
3. Add a Fontello configuration file named `fontello_config.json` to your project.
4. Include the Fontello `.css` file in your pages:
```html
<link href="/fontello/styles/fontello.css" rel="stylesheet" type="text/css">
```
5. Use Fontello icons on your website or blog, for example:
```html
<i class="icon-rocket"></i>
```
## Options
#### Config file
Change the name/path of the Fontello configuration file; the default value is `'fontello_config.json'`.
```yaml
fontello:
config: 'config.json'
```
#### Output fonts
Change the output path of the font files; the default value is `'fontello/fonts'`.
```yaml
fontello:
output_fonts: 'assets/fonts/fontello'
```
#### Output stylesheets
Change the output path of the stylesheet files; the default value is `'fontello/styles'`.
```yaml
fontello:
output_styles: 'styles/fontello'
```
#### Custom fonts path
The path to the font files that should be put in the stylesheets. By default this is computed as the relative path from `output_styles` to `output_fonts`.
```yaml
fontello:
fonts_path: '/assets/fonts/fontello'
```
#### Preprocessor
Change what CSS preprocessor is used; by default no preprocessor is used. Allowed values are `'none'`, `'less'`, and `'scss'`.
```yaml
fontello:
preprocessor: 'scss'
```
| 25.72619 | 176 | 0.736233 | eng_Latn | 0.581118 |
b10edc05aa987810abd14a6c5e858a298ea51e9b | 291 | md | Markdown | activities/value-and-impact/index.md | clausmullie/about | 12a7a2a2f6947aaaea549da11d0bd64356b70a3b | [
"CC0-1.0"
] | null | null | null | activities/value-and-impact/index.md | clausmullie/about | 12a7a2a2f6947aaaea549da11d0bd64356b70a3b | [
"CC0-1.0"
] | null | null | null | activities/value-and-impact/index.md | clausmullie/about | 12a7a2a2f6947aaaea549da11d0bd64356b70a3b | [
"CC0-1.0"
] | null | null | null | ---
type: Index
---
# Value and impact
These are documents and evidence for how we create value and impact.
## Impact
* [Policy documents referring to the Foundation for Public Code](policy-documents.md)
* [All press and media references to the Foundation for Public Code](all-press.md)
| 22.384615 | 85 | 0.749141 | eng_Latn | 0.995087 |
b10ee349468c50f1a4fa03f8a84bc30cb0f4888f | 1,529 | md | Markdown | .github/ISSUE_TEMPLATE/feature-request.md | Arsenal821/incubator-pegasus | da9f3ebeaa2642f887dfc80ead5b0cd39493a5a0 | [
"Apache-2.0"
] | 1,352 | 2017-10-16T03:24:54.000Z | 2020-08-18T04:44:23.000Z | .github/ISSUE_TEMPLATE/feature-request.md | Arsenal821/incubator-pegasus | da9f3ebeaa2642f887dfc80ead5b0cd39493a5a0 | [
"Apache-2.0"
] | 299 | 2017-10-19T05:33:32.000Z | 2020-08-17T09:03:39.000Z | .github/ISSUE_TEMPLATE/feature-request.md | Arsenal821/incubator-pegasus | da9f3ebeaa2642f887dfc80ead5b0cd39493a5a0 | [
"Apache-2.0"
] | 240 | 2017-10-16T05:57:04.000Z | 2020-08-18T10:02:36.000Z | <!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->
---
name: "\U0001F680 Feature Request"
about: I have a suggestion
labels: type/enhancement
---
## Feature Request
**Is your feature request related to a problem? Please describe:**
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
**Describe the feature you'd like:**
<!-- A clear and concise description of what you want to happen. -->
**Describe alternatives you've considered:**
<!-- A clear and concise description of any alternative solutions or features you've considered. -->
**Teachability, Documentation, Adoption, Migration Strategy:**
<!-- If you can, explain some scenarios how users might use this, situations it would be helpful in. Any API designs, mockups, or diagrams are also helpful. -->
| 39.205128 | 160 | 0.76259 | eng_Latn | 0.998134 |
b10f0bab45f28455e4cc43cc8605eb3663f861e8 | 4,863 | md | Markdown | README.md | alexander-matz/tamias | f4c719f46bfe618d9a3b327608d2491fd24cbcc4 | [
"MIT"
] | 1 | 2017-08-10T08:27:34.000Z | 2017-08-10T08:27:34.000Z | README.md | alexander-matz/tamias | f4c719f46bfe618d9a3b327608d2491fd24cbcc4 | [
"MIT"
] | null | null | null | README.md | alexander-matz/tamias | f4c719f46bfe618d9a3b327608d2491fd24cbcc4 | [
"MIT"
] | null | null | null | # What is Tamias?
Tamias is a git server much like gitolite, aimed at small to (maybe) medium teams.
Its goal is to make repository creation and destruction as easy as possible, as well
as to be completely configurable through the command line, with configuration
located in repositories kept to a minimum.
Its architecture is heavily inspired by gitolite.
# Why, though?
I'm an avid fan of simple git solutions like gitolite but found myself frustrated
by having to go through the following workflow just to create a temporary repository:
- `$ git clone git@myserver:gitolite-admin`
- start up an editor with gitolite-admin/conf/gitolite.conf
- modifying that file according to its format to add/remove a repository
- save
- `$ git add -u`
- `$ git commit`
- `$ git push origin master`
Even though it does not seem like much, my opinion is that every additional step in
achieving something discourages users from going through the process.
So adding a repository in tamias is achieved by executing:
`$ ssh git@myserver add <repository>`
Deleting a repository requires executing:
`$ ssh git@myserver rm <repository>`
# Installation
Create a user that acts as your git user.
Be aware that regular ssh access will not be possible after installation of tamias.
After that build a release version of the tamias sources on your server by executing:
`$ dub build --build=release`
Then execute `./tamias install <ssh-key> [ssh-keys...]` on the server.
It is helpful to supply keys with filenames following the format explained in section
'Adding/removing users' so you start with a reasonable configuration of users and
permissions.
# Usage
In addition to the regular git operations clone/pull/fetch/push etc. some additional
ssh commands are available.
They are used as if they were executables on the server.
So you would execute the command 'list' by executing `ssh git@myserver list`.
The following commands are available:
- `version`
- `list`
- `add <repository>`
- `rm <repository>`
- `config <repository> [update ...]`
- `whoami`
# Access control
Access control is kept simple: the only thing that controls what you're able to access
is your roles.
A role is a simpler version of posix groups with mostly the same semantics.
Every user assumes a number of different roles as well as a few special ones.
The special ones are a role that has the same name as their username, as well as the role 'all'.
A repository has a list for each of the permissions 'read', 'write', 'config'.
'read' and 'write' are straight forward and just determine who can pull/push a
repository.
The 'config' permission allows modifying the values for the owner, read, write, and config
permissions.
The role 'staff' is special in that it is effectively the tamias equivalent of a 'root' user
that has all permissions.
The same holds true for the owner of a repository.
## Adding/removing users
Since actual usernames are ignored except when looking up ownership of a repository,
creating a user does not require a lot of work.
On updating the 'keys.git' repository, users are recreated as it looks for
public keys with the extension '.pub'.
Naming follows the same conventions as gitolite:
- key files named `...@<user>.pub` use the part after the last '@' as the username
- if there is no '@' in the filename of the key, everything before the '.pub' is considered the username
## Configuring user roles
Users with write access to the special 'keys.git' repository can modify the roles users
assume.
In addition to the default roles for users (username + 'all'), roles are specified in the
file 'users.conf' in the 'keys.git' repository.
It follows a simple format:
`username : role1 role2 role3 ...`
Whitespace is used to separate roles from one another. Allowed characters for roles are
alphanumeric characters, underscore, and dash.
## Configuring access permissions
If you're either the owner of a repository, have 'config' permission or are 'staff', you
can modify the access permissions of a repository.
The general syntax, as mentioned above, is:
`config <repository> [update1 ...]`
If no updates are supplied, the current configuration is printed.
Multiple updates can be supplied by separating them with whitespace.
An update can either update the owner or one of the permissions.
If you're updating the owner, the following syntax is enforced:
`owner=<new-owner>`
If you update a permission you can either add/remove or set the roles for that permission.
In either case you can pass multiple users separated by commas.
It's most easily explained by some examples:
- Adding the roles 'all' and 'deimos' to roles that can write: `config <repo> write+all,deimos`
- Disallowing 'all' to read the repository: `config <repo> read-all`
- Only allowing 'staff' config access, removing everybody else: `config <repo> config=staff`
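A hypothetical parser for these update strings could look as follows (a JavaScript sketch under assumed semantics; it mirrors none of tamias' actual code, which is written in D):

```javascript
// Parses updates such as "write+all,deimos", "read-all", "config=staff"
// and "owner=<new-owner>"
function parseUpdate(update) {
  const m = update.match(/^(owner|read|write|config)([+\-=])(.+)$/);
  if (!m) throw new Error(`invalid update: ${update}`);
  const [, key, op, value] = m;
  if (key === 'owner') return { key, op: 'set', roles: [value] };
  const ops = { '+': 'add', '-': 'remove', '=': 'set' };
  return { key, op: ops[op], roles: value.split(',') };
}

console.log(parseUpdate('write+all,deimos'));
// { key: 'write', op: 'add', roles: [ 'all', 'deimos' ] }
```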
| 37.992188 | 104 | 0.771335 | eng_Latn | 0.999604 |
b10f3845c1d95716ac820df790da244165e30eef | 284 | md | Markdown | README.md | kshetline/array-buffer-reader | 1916874d56a695a7a09f31ba88f6ba104bbd6eeb | [
"MIT"
] | null | null | null | README.md | kshetline/array-buffer-reader | 1916874d56a695a7a09f31ba88f6ba104bbd6eeb | [
"MIT"
] | 1 | 2021-06-23T14:07:08.000Z | 2021-06-23T14:07:08.000Z | README.md | kshetline/array-buffer-reader | 1916874d56a695a7a09f31ba88f6ba104bbd6eeb | [
"MIT"
] | null | null | null | ## ArrayBuffer Reader
Provides a class for reading a byte array or ArrayBuffer as a binary stream.
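The package's exact API is not shown here, but a reader of this kind can be sketched as follows (an illustration with assumed method names, not this library's real interface):

```javascript
// A minimal binary-stream reader over a byte array or ArrayBuffer
class SimpleBufferReader {
  constructor(source) {
    const buffer =
      source instanceof ArrayBuffer ? source : Uint8Array.from(source).buffer;
    this.view = new DataView(buffer);
    this.offset = 0;
  }
  readUint8() {
    return this.view.getUint8(this.offset++);
  }
  readUint16() { // big-endian by default
    const value = this.view.getUint16(this.offset);
    this.offset += 2;
    return value;
  }
}

const reader = new SimpleBufferReader([0x01, 0x02, 0x03]);
console.log(reader.readUint8());  // 1
console.log(reader.readUint16()); // 515 (0x0203)
```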
## Build
Run `npm run build` to build the project. The build artifacts will be stored in the `dist/` directory.
## Running unit tests
Run `npm run test` to execute the unit tests.
| 23.666667 | 102 | 0.746479 | eng_Latn | 0.998145 |
b10f7b2cf56e84646bb71fa6c490422f3d610933 | 3,009 | md | Markdown | README.md | samsoul007/auto-config-updater | 60aea5b6ca0cf3da24c9a560d8913335046f25fb | [
"MIT"
] | null | null | null | README.md | samsoul007/auto-config-updater | 60aea5b6ca0cf3da24c9a560d8913335046f25fb | [
"MIT"
] | null | null | null | README.md | samsoul007/auto-config-updater | 60aea5b6ca0cf3da24c9a560d8913335046f25fb | [
"MIT"
] | null | null | null | Things on the internet move fast. New versions are being deployed all the time and it is hard to keep track of them. If you have hundreds of services running and some of them require a specific version of an API, you have to either put it in a config file or in the process ENV.
This is not optimal when new versions of your API provider are available weekly or monthly (ex: Facebook Marketing API) and they retire the old version shortly after. How do you reliably change those versions without redeploying everything?
Well with this module you can now "listen" to changes in a remote file and act accordingly. No need to redeploy anything, just make sure to update your code to handle the changes. It can be in auth keys or versions or whatever you see fit.
# installation
`npm install --save auto-config-updater`
# usage
## before you start
This module currently supports only JSON data. Upon being received, the JSON object will be flattened:
```javascript
//If you have this file
{
"test": 123,
"val": {
"a": "value"
}
}
//It will be converted to
{
"test":123,
"val.a": "value"
}
```
When you set up the handler you need to do it on the flat key (ex: if you want to get the `a` value you need to enter `val.a`).
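The flattening shown above can be reproduced with a small helper (a sketch; the module's internal implementation may differ):

```javascript
// Flattens nested objects into dot-separated keys
function flatten(obj, prefix = '') {
  const out = {};
  for (const [key, value] of Object.entries(obj)) {
    const flatKey = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(out, flatten(value, flatKey));
    } else {
      out[flatKey] = value;
    }
  }
  return out;
}

console.log(flatten({ test: 123, val: { a: 'value' } }));
// { test: 123, 'val.a': 'value' }
```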
## setting up a configuration
This module allows multiple variable files to be loaded.
### Amazon S3
```javascript
const CU = require("auto-config-updater")
const AWS = require("aws-sdk");
//setup your AWS SDK
AWS.config.update({...});
/**
* @param {object} s3_object AWS s3 object (new AWS.S3()).
* @param {string} bucket Name of S3 bucket.
* @param {string} key path to file in S3 bucket.
* @param {integer} [refresh=60000] refresh time to look in ms.
* @param {string} [config='default'] name of the config
*/
CU.config.fromS3(new AWS.S3(), "mybucket", "mykey", 5000, "my-versions")
```
### ElasticSearch
```javascript
const CU = require("auto-config-updater")
const { Client } = require('@elastic/elasticsearch')
//setup your ElasticSearch client
const client = new Client({ node: '...:9200' });
/**
* @param {object} client ElasticSearch client.
* @param {string} index Index in ElasticSearch.
* @param {string} type Type in index.
* @param {string} id Id in index and type.
* @param {integer} [refresh=60000] refresh time to look in ms.
* @param {string} [config='default'] name of the config
*/
CU.config.fromES(client, "index", "type", "id", 5000, "my-versions")
```
## Value change handler
When a value has changed, this handler will be triggered. You can add as many handlers as you want on each key.
```javascript
const CU = require("auto-config-updater")
/**
* @param {string} key Key to check on the value of the file.
* @param {function} handler Callback function for the handler.
* @param {string} [config='default'] name of the config
*/
CU.handler.onChange("key",handler, config)
CU.handler.onChange("version",(value, key) => {
console.log("the value has changed:", value, key)
},"my-versions")
```
| 31.34375 | 275 | 0.705882 | eng_Latn | 0.987127 |
b11064428777edb395ccb991f65c26f657862c7d | 414 | md | Markdown | README.md | snapyjs/snapy-filter-obj | d3a9dd437c03f510c47fe79251b30eae29482d59 | [
"MIT"
] | null | null | null | README.md | snapyjs/snapy-filter-obj | d3a9dd437c03f510c47fe79251b30eae29482d59 | [
"MIT"
] | null | null | null | README.md | snapyjs/snapy-filter-obj | d3a9dd437c03f510c47fe79251b30eae29482d59 | [
"MIT"
] | null | null | null | # snapy-filter-obj
Plugin of [snapy](https://github.com/snapyjs/snapy).
Filter properties from a snapshot, like timestamps.
```js
test((snap) => {
snap({
obj: {
nested:{
notIncluded: false,
included: true
},
alsoNotIncluded: false
},
filter: "nested,-nested.notIncluded"
})
})
```
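The `filter` string combines include rules and `-`-prefixed exclude rules; a rough sketch of how such a spec could be interpreted (an assumption about the semantics, not this plugin's actual code):

```javascript
// "nested,-nested.notIncluded" => include `nested`, but drop `nested.notIncluded`
function parseFilter(spec) {
  const include = [];
  const exclude = [];
  for (const part of spec.split(',')) {
    if (part.startsWith('-')) exclude.push(part.slice(1));
    else include.push(part);
  }
  return { include, exclude };
}

console.log(parseFilter('nested,-nested.notIncluded'));
// { include: [ 'nested' ], exclude: [ 'nested.notIncluded' ] }
```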
## License
Copyright (c) 2017 Paul Pflugradt
Licensed under the MIT license.
| 16.56 | 52 | 0.618357 | eng_Latn | 0.849277 |
b111233d8aca7c59521d6c78834e24a7fa7e2c12 | 9,021 | md | Markdown | TODOS.md | eUgEntOptIc44/factbook | 5a3894a34ba5273d74282c9703536cab424a5d46 | [
"CC0-1.0"
] | 94 | 2016-11-05T15:51:50.000Z | 2022-02-04T14:48:58.000Z | TODOS.md | eUgEntOptIc44/factbook | 5a3894a34ba5273d74282c9703536cab424a5d46 | [
"CC0-1.0"
] | 5 | 2021-01-22T14:38:12.000Z | 2022-03-25T16:50:23.000Z | TODOS.md | eUgEntOptIc44/factbook | 5a3894a34ba5273d74282c9703536cab424a5d46 | [
"CC0-1.0"
] | 15 | 2016-12-14T21:08:42.000Z | 2022-03-31T23:04:51.000Z | # Todos
- [ ] use Factbook terminology - why? why not?
- change subsection to field!!
  - change categories.csv to fields.csv (keep the name as it is the field name now, not wrongly the category name)
- change structs too!! - use Section => Category, Subsection => Field
## Add 4(?) attachment (blocks) too?
- Audio - in ?? / National anthem
- PDF - total population growth rate v. urban population growth rate in ?? / ??
- JPG - population pyramid in ?? / ??
- JPG - Area comparison map in ?? / ??
<!-- break -->
Audio in ?? / National anthem
```
<div class="category_data attachment">
<audio type="audio/mp3"
controls="controls"
alt="National Anthem audio file for Austria"
src="../attachments/audios/original/AU.mp3?1538604749">
</audio>
</div>
```
PDF - total population growth rate v. urban population growth rate in ?? / ??
```
<div class="category_data attachment">
<span class="subfield-name">
total population growth rate v. urban population growth rate, 2000-2030:
</span>
<a style="display: inline-block; margin-left: 5px; vertical-align: middle" target="_blank" aria-hidden="true"
href="../attachments/docs/original/urban_AU.pdf?1602698564">
<img style="height:22px;opacity:.5" src="../images/adobe.png" alt="Adobe" />
</a>
<a style="display: inline-block; margin-left: 5px; vertical-align: middle" target="_blank" class="sr-only"
href="../attachments/docs/original/urban_AU.pdf?1602698564">PDF
</a>
</div>
```
JPG - population pyramid in ?? / ??
```
<div class="category_data attachment">
<span class="subfield-name">population pyramid:</span>
<a data-toggle="modal" href="#imageModal5161" class='category_image_link'>
<img alt="population pyramid" src="../attachments/images/thumb/AU_popgraph2020.JPG?1584126705" />
</a>
<div class="modal fade cntryModal" id="imageModal5161" role="dialog"
aria-label='modal dialog used to display image(s) associated with a field'
aria-labelledby='imageModalLabel'>
<div class="wfb-modal-dialog">
<div class="modal-content">
<div class="wfb-modal-header">
<span class="modal-title wfb-title">The World Factbook</span>
<span class='sr-only' id='imageModalLabel'>Field Image Modal</span>
<span style="float: right; margin-top: -4px;">
<button type="button" class="close" title="close" data-dismiss="modal">×</button>
</span>
</div>
<div class="wfb-modal-body">
<div class="region1 geos_title eur_dark">
Europe <strong>::</strong>
<span class="region_name1 countryName">Austria</span>
<span class="btn-print">
<a href="../attachments/images/original/AU_popgraph2020.JPG?1584126705">
<i class="fa fa-print" aria-hidden='true'></i>
<span class='sr-only'>Print</span>
</a>
</span>
</div>
<div class="eur_bkgrnd">
<div class="modalImageBox">
<img class="eur_lgflagborder"
src="../attachments/images/original/AU_popgraph2020.JPG?1584126705" alt="population pyramid">
</div>
<div class="modalImageDesc">
<div class="header"
style="background-image: url(../images/eur_medium.jpg)">
Image Description
</div>
<div class="photogallery_captiontext">
This is the population pyramid for Austria. A population pyramid illustrates the age and sex structure of a country's population and may provide insights about political and social stability, as well as economic development. The population is distributed along the horizontal axis, with males shown on the left and females on the right. The male and female populations are broken down into 5-year age groups represented as horizontal bars along the vertical axis, with the youngest age groups at the bottom and the oldest at the top. The shape of the population pyramid gradually evolves over time based on fertility, mortality, and international migration trends. <br/><br/>For additional information, please see the entry for Population pyramid on the Definitions and Notes page under the References tab.
</div>
</div>
</div>
</div>
</div>
</div>
</div>
```
JPG - Area comparison map in ?? / ??
```
<div class="category_data attachment">
<span class="subfield-name">Area comparison map:</span>
<a data-toggle="modal" href="#imageModal3974" class='category_image_link'>
<img alt="Area comparison map" src="../attachments/images/thumb/AU_area.jpg?1548765584" />
</a>
<div class="modal fade cntryModal" id="imageModal3974" role="dialog"
aria-label='modal dialog used to display image(s) associated with a field'
aria-labelledby='imageModalLabel'>
<div class="wfb-modal-dialog">
<div class="modal-content">
<div class="wfb-modal-header">
<span class="modal-title wfb-title">The World Factbook</span>
<span class='sr-only' id='imageModalLabel'>Field Image Modal</span>
<span style="float: right; margin-top: -4px;">
<button type="button" class="close" title="close" data-dismiss="modal">×</button>
</span>
</div>
<div class="wfb-modal-body">
<div class="region1 geos_title eur_dark">
Europe <strong>::</strong>
<span class="region_name1 countryName">Austria</span>
<span class="btn-print">
<a href="../attachments/images/original/AU_area.jpg?1548765584">
<i class="fa fa-print" aria-hidden='true'></i>
<span class='sr-only'>Print</span>
</a>
</span>
</div>
<div class="eur_bkgrnd">
<div class="modalImageBox">
<img class="eur_lgflagborder"
src="../attachments/images/original/AU_area.jpg?1548765584" alt="Area comparison map">
</div>
<div class="modalImageDesc">
<div class="header"
style="background-image: url(../images/eur_medium.jpg)">
Image Description
</div>
<div class="photogallery_captiontext">
<p>about the size of South Carolina; slightly more than two-thirds the size of Pennsylvania</p>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
## More
in France (fr) check that `++` is inside of the strong tag and not before/outside it ???
```
<div class="category_data note">
<strong>note: </strong>applies to metropolitan France only; for its overseas regions the time difference is UTC-4 for Guadeloupe and Martinique, UTC-3 for French Guiana, UTC+3 for Mayotte, and UTC+4 for Reunion<strong> ++ etymology: </strong>nam
```
- [x] unfancy quotes in text e.g.
```
Following Britain’s victory
=>
Following Britain's victory
```
## strip trailing newlines from data item
see ItemBuilder - strip trailing spaces/newlines ??
see Algeria/Religion for an example with trailing newlines ??
## xx.json - world page
map reference:
- use squeeze to pretty print text e.g.
- remove newlines and extra spaces; now text() looks like
```
"text":
"Political Map of the World , Physical Map of the World ,
Standard Time\n Zones of the World , World Oceans"
```
but it should be
```
"text":
"Political Map of the World, Physical Map of the World,
Standard Time Zones of the World, World Oceans"
```
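A rough sketch of the squeeze step in Python (the project itself is Ruby; `squeeze` here is a hypothetical helper name, not an existing function in the repo):

```python
import re

def squeeze(text):
    # Collapse runs of whitespace (including newlines) into single spaces,
    # then drop the stray space that scraping leaves before commas.
    text = re.sub(r"\s+", " ", text).strip()
    return re.sub(r"\s+,", ",", text)

raw = ("Political Map of the World , Physical Map of the World ,\n"
       "Standard Time\n Zones of the World , World Oceans")
print(squeeze(raw))
# Political Map of the World, Physical Map of the World, Standard Time Zones of the World, World Oceans
```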
## Newlines in Fields ?
### Algeria • Al Jaza'ir
check - includes trailing newlines? - why? strip?
```
**Religions** Muslim (official; predominantly Sunni) 99%, other (includes Christian and Jewish)
```
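A minimal sketch of the cleanup (hypothetical helper name; the real fix would presumably live in ItemBuilder):

```python
def clean_value(value):
    # Strip trailing spaces/newlines that scraping leaves on field values.
    return value.rstrip()

raw = "Muslim (official; predominantly Sunni) 99%, other (includes Christian and Jewish)\n\n"
print(repr(clean_value(raw)))
# 'Muslim (official; predominantly Sunni) 99%, other (includes Christian and Jewish)'
```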
### More Old Todo Notes to Check
```
https://www.cia.gov/library/publications/resources/the-world-factbook/docs/history.html
http://jmatchparser.sourceforge.net/factbook/
print
plus print mysql schema
http://ports.gnu-darwin.org/databases/wfb2sql/work/wfb2sql-0.6/doc/wfb2sql.html
print !!!!!!
``Cyprus'' does not contain correct values for some entries since all values consist of two values:
one for the ``Greek Cypriot area'' and one for the ``Turkish Cypriot area''.
Maybe this data should be handled manually.
http://jmatchparser.sourceforge.net/factbook/
see schema / mysql schema db
The MONDIAL Database - Database and Information Systems
www.dbis.informatik.uni-goettingen.de/Mondial/
add
http://wfb2sql.sourceforge.net/ !!
wfb2sql is a Perl script that extracts information from the CIA World Factbook
and creates SQL statements for IBM DB/2, PostgreSQL or MySQL.
This data builds a perfect database for learning and teaching SQL.
check perl script!!!
see
https://github.com/sayem/worldfactbook/blob/master/lib/worldfactbook/country.rb
add known shortcuts e.g. page.gdp page.gdp_ppp etc ??
print worldfactbook readme!!
create list of
section and subsections !!!
create list of mappings
section ??
subsections ?? to short (all lower case name) ??
use fields.csv or mappings.csv ??
Num,Category,Name,Key
see http://wifo5-04.informatik.uni-mannheim.de/factbook/page/venezuela
```
| 33.043956 | 813 | 0.66844 | eng_Latn | 0.709735 |
b11191bd0a1c97a12ec2aeacf373165c56bc2023 | 6,998 | md | Markdown | sccm/core/get-started/2019/includes/1909/3098816.md | williampruitt/SCCMdocs | 86dd40c8d1805128a6e31f44f7fa22aa19281da5 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-01-05T22:37:08.000Z | 2020-01-05T22:37:08.000Z | sccm/core/get-started/2019/includes/1909/3098816.md | williampruitt/SCCMdocs | 86dd40c8d1805128a6e31f44f7fa22aa19281da5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sccm/core/get-started/2019/includes/1909/3098816.md | williampruitt/SCCMdocs | 86dd40c8d1805128a6e31f44f7fa22aa19281da5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
author: mestew
ms.author: mstewart
ms.prod: configuration-manager
ms.technology: configmgr-other
ms.topic: include
ms.date: 09/26/2019
ms.collection: M365-identity-device-management
---
## <a name="bkmk_OGs"></a> Orchestration Groups
<!--3098816-->
Create an orchestration group to better control the deployment of software updates to devices. Many server administrators need to carefully manage updates for specific workloads, and automate behaviors in between. For example:
- As the software updates administrator, you manage all updates for your organization.
- You have one large collection for all servers and one large collection for all clients. You deploy all updates to these collections.
- The SQL administrators want to control all the software installed on the SQL servers. They want to patch five servers in a specific order. Their current process is to manually stop specific services before installing updates, and then restart the services afterwards.
- You create an orchestration group and add all five SQL servers. You also add pre- and post-scripts, using the PowerShell scripts provided by the SQL administrators.
- During the next update cycle, you create and deploy the software updates as normal to the large collection of servers. The SQL administrators run the deployment, and the orchestration group automates the order and services.
An orchestration group gives you the flexibility to update devices based on a percentage, a specific number, or an explicit order. You can also run a PowerShell script before and after the devices run the update deployment.
Members of an orchestration group can be any Configuration Manager client, not just servers. The orchestration group rules apply to the devices for all software update deployments to any collection that contains an orchestration group member. Other deployment behaviors still apply. For example, maintenance windows and deployment schedules.
> [!NOTE]
> The **Orchestration Groups** feature is the evolution of the [Server Groups](/sccm/sum/deploy-use/service-a-server-group) feature. An orchestration group is a new object in Configuration Manager.
### Prerequisites
- Enable the **Orchestration Groups** feature. For more information, see [Enable optional features](/sccm/core/servers/manage/install-in-console-updates#bkmk_options).
> [!NOTE]
> When you enable **Orchestration Groups**, the site disables the **Server Groups** feature. This behavior avoids any conflicts between the two features.
- To see all of the orchestration groups and updates for those groups, your account needs to be a **Full Administrator**.
### Try it out!
Try to complete the tasks. Then send [Feedback](/sccm/core/understand/find-help#product-feedback) with your thoughts on the feature.
1. In the Configuration Manager console, go to the **Assets and Compliance** workspace, and select the **Orchestration Group** node.
1. In the ribbon, select **Create Orchestration Group** to open the **Create Orchestration Group Wizard**.
1. On the **General** page, give your orchestration group a **Name** and optionally a **Description**.
1. On the **Member Selection** page, first specify the current **Site code**. Then select **Browse** to add device resources as members of this orchestration group. **Search** for devices by name, and then **Add** them. Select **OK** when you finish adding devices to the selected resources list.

1. On the **Rule Selection** page, select one of the following options:
- **Allow a percentage of the machines to be updated at the same time**, then select or enter a number for this percentage. Use this setting to allow for future flexibility of the size of the orchestration group. For example, your orchestration group contains 50 devices, and you set this value to 10. During a software update deployment, Configuration Manager allows five devices to simultaneously run the deployment. If you later increase the size of the orchestration group to 100 devices, then 10 devices update at once.
- **Allow a number of the machines to be updated at the same time**, then select or enter a number for this specific count. Use this setting to always limit to a specific number of devices, whatever the overall size of the orchestration group.
- **Specify the maintenance sequence**, then sort the selected resources in the proper order. Use this setting to explicitly define the order in which devices run the software update deployment.
1. On the **PreScript** page, enter a PowerShell script to run on each device *before* the deployment runs. The script should return a value of `0` for success, or `3010` for success with restart. You can also specify a **Script timeout** value, which fails the script if it doesn't complete in the specified time.
1. On the **PostScript** page, enter a PowerShell script to run on each device *after* the deployment runs. The behavior is otherwise the same as the PreScript.
1. Complete the wizard.
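The batching behavior of the three rule types can be pictured with a small conceptual sketch (an illustration of the math only, not how Configuration Manager actually implements orchestration; the minimum-of-one and rounding-up behavior are assumptions of the sketch):

```python
import math

def simultaneous_updates(rule, value, group_size):
    # How many orchestration group members may run the deployment at once.
    if rule == "percentage":    # value = percent of the group
        return max(1, math.ceil(group_size * value / 100))
    if rule == "number":        # value = fixed device count
        return min(value, group_size)
    if rule == "sequence":      # explicit maintenance order: one at a time
        return 1
    raise ValueError(f"unknown rule: {rule}")

# The 50-device group with a 10% rule from the example above:
print(simultaneous_updates("percentage", 10, 50))   # 5
print(simultaneous_updates("percentage", 10, 100))  # 10
print(simultaneous_updates("number", 5, 100))       # 5
```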
In the **Orchestration Group** node, select an orchestration group. In the ribbon, select **Show Members**. You can see the members of the group, and their orchestration status.
To test the behavior of the orchestration group, [deploy software updates](/sccm/sum/deploy-use/deploy-software-updates) to a collection that contains the members of the orchestration group. Orchestration starts when any client in the group tries to install any software update at deadline or during a maintenance window. It starts for the entire group, and makes sure that the devices update by following the orchestration group rules.
> [!TIP]
> Orchestration groups only apply to software update deployments. They don't apply to other deployments.
### Monitor
Use the following log files on the site server to help monitor and troubleshoot:
- **Policypv.log**: shows that the site targets the orchestration group to the clients
- **SMS_OrchestrationGroup.log**: shows the behaviors of the orchestration group
### Orchestration group known issues
- Don't add a machine to more than one orchestration group.
- When searching a collection to select resources for an orchestration group, only choose **All Desktop and Server Clients**.
- There are several actions currently available on an orchestration group, but only the default **Show Members** action works. This action currently doesn't show the name of the device, only the resource ID.
- The **Orchestration Type** values correspond to the following types:
| Value | Orchestration Type |
|-------|---------|
|**1**|Number|
|**2**|Percentage|
|**3**|Sequence|
- The **Current State** values correspond to the following states:
| Value | Current State |
|-------|---------|
|**1**|Idle|
|**2**|Waiting, the device is waiting its turn|
|**3**|In progress, installing an update|
|**4**|Failed|
|**5**|Reboot pending|
| 66.018868 | 527 | 0.766505 | eng_Latn | 0.998752 |
b111ba04a377b65735a392d7e8a756e653425f8a | 8,548 | md | Markdown | data/readme_files/mps-youtube.pafy.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | [
"MIT"
] | 5 | 2021-05-09T12:51:32.000Z | 2021-11-04T11:02:54.000Z | data/readme_files/mps-youtube.pafy.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | [
"MIT"
] | null | null | null | data/readme_files/mps-youtube.pafy.md | DLR-SC/repository-synergy | 115e48c37e659b144b2c3b89695483fd1d6dc788 | [
"MIT"
] | 3 | 2021-05-12T12:14:05.000Z | 2021-10-06T05:19:54.000Z | .. image:: https://img.shields.io/pypi/v/Pafy.svg
:target: https://pypi.python.org/pypi/pafy
.. image:: https://img.shields.io/pypi/dm/Pafy.svg
:target: https://pypi.python.org/pypi/pafy
.. image:: https://img.shields.io/coveralls/mps-youtube/pafy/develop.svg
:target: https://coveralls.io/r/mps-youtube/pafy?branch=develop
.. image:: https://landscape.io/github/mps-youtube/pafy/develop/landscape.svg
:target: https://landscape.io/github/mps-youtube/pafy/develop
:alt: Code Health
.. image:: https://travis-ci.org/mps-youtube/pafy.svg?branch=develop
:target: https://travis-ci.org/mps-youtube/pafy
.. image:: https://img.shields.io/pypi/wheel/Pafy.svg
:target: http://pythonwheels.com/
:alt: Wheel Status
Features
--------
- Retrieve metadata such as viewcount, duration, rating, author, thumbnail, keywords
- Download video or audio at requested resolution / bitrate / format / filesize
- Command line tool (ytdl) for downloading directly from the command line
- Retrieve the URL to stream the video in a player such as vlc or mplayer
- Works with age-restricted videos and non-embeddable videos
- Small, standalone, single importable module file (pafy.py)
- Select highest quality stream for download or streaming
- Download video only (no audio) in m4v or webm format
- Download audio only (no video) in ogg or m4a format
- Retrieve playlists and playlist metadata
- Works with Python 2.6+ and 3.3+
- Optionally depends on youtube-dl (recommended; more stable)
Documentation
-------------
Full documentation is available at http://pythonhosted.org/pafy
Usage Examples
--------------
Here is how to use the module in your own Python code. For command line tool
(ytdl) instructions, see further below.
.. code-block:: pycon
>>> import pafy
create a video instance from a YouTube url:
.. code-block:: pycon
>>> url = "https://www.youtube.com/watch?v=bMt47wvK6u0"
>>> video = pafy.new(url)
get certain attributes:
.. code-block:: pycon
>>> video.title
'Richard Jones: Introduction to game programming - PyCon 2014'
>>> video.rating
5.0
>>> video.viewcount, video.author, video.length
(1916, 'PyCon 2014', 10394)
>>> video.duration, video.likes, video.dislikes
('02:53:14', 25, 0)
>>> print(video.description)
Speaker: Richard Jones
This tutorial will walk the attendees through development of a simple game using PyGame with time left over for some experimentation and exploration of different types of games.
Slides can be found at: https://speakerdeck.com/pycon2014 and https://github.com/PyCon/2014-slides
list available streams for a video:
.. code-block:: pycon
>>> streams = video.streams
>>> for s in streams:
... print(s)
...
normal:mp4@1280x720
normal:webm@640x360
normal:mp4@640x360
normal:flv@320x240
normal:3gp@320x240
normal:3gp@176x144
show all formats, file-sizes and their download url:
.. code-block:: pycon
>>> for s in streams:
... print(s.resolution, s.extension, s.get_filesize(), s.url)
...
1280x720 mp4 2421958510 https://r1---sn-aiglln7e.googlevideo.com/videoplayba[...]
640x360 webm 547015732 https://r1---sn-aiglln7e.googlevideo.com/videoplaybac[...]
640x360 mp4 470655850 https://r1---sn-aiglln7e.googlevideo.com/videoplayback[...]
320x240 flv 345455674 https://r1---sn-aiglln7e.googlevideo.com/videoplayback[...]
320x240 3gp 208603447 https://r1---sn-aiglln7e.googlevideo.com/videoplayback[...]
176x144 3gp 60905732 https://r1---sn-aiglln7e.googlevideo.com/videoplayback?[...]
get best resolution regardless of file format:
.. code-block:: pycon
>>> best = video.getbest()
>>> best.resolution, best.extension
('1280x720', 'mp4')
get best resolution for a particular file format:
(mp4, webm, flv or 3gp)
.. code-block:: pycon
>>> best = video.getbest(preftype="webm")
>>> best.resolution, best.extension
('640x360', 'webm')
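The selection that ``getbest(preftype=...)`` performs can be pictured with a small standalone sketch (conceptual only — this is not pafy's actual implementation; streams are modeled as plain (resolution, extension) pairs):

```python
def pick_best(streams, preftype=None):
    # streams: (resolution, extension) pairs, e.g. ("640x360", "webm").
    # Highest pixel count wins, optionally restricted to one container format.
    candidates = [s for s in streams if preftype is None or s[1] == preftype]
    def pixels(stream):
        width, height = stream[0].split("x")
        return int(width) * int(height)
    return max(candidates, key=pixels, default=None)

streams = [("1280x720", "mp4"), ("640x360", "webm"), ("640x360", "mp4")]
print(pick_best(streams))                   # ('1280x720', 'mp4')
print(pick_best(streams, preftype="webm"))  # ('640x360', 'webm')
```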
get url, for download or streaming in mplayer / vlc etc:
.. code-block:: pycon
>>> best.url
'http://r12---sn-aig7kner.c.youtube.com/videoplayback?expire=1369...
Download video and show progress:
.. code-block:: pycon
>>> best.download(quiet=False)
3,734,976 Bytes [0.20%] received. Rate: [ 719 KB/s]. ETA: [3284 secs]
Download video, use specific directory and/or filename:
.. code-block:: pycon
>>> filename = best.download(filepath="/tmp/")
>>> filename = best.download(filepath="/tmp/Game." + best.extension)
Get audio-only streams (m4a and/or ogg vorbis):
.. code-block:: pycon
>>> audiostreams = video.audiostreams
>>> for a in audiostreams:
... print(a.bitrate, a.extension, a.get_filesize())
...
256k m4a 331379079
192k ogg 172524223
128k m4a 166863001
128k ogg 108981120
48k m4a 62700449
Download the 2nd audio stream from the above list:
.. code-block:: pycon
>>> audiostreams[1].download()
Get the best quality audio stream:
.. code-block:: pycon
>>> bestaudio = video.getbestaudio()
>>> bestaudio.bitrate
'256'
Download the best quality audio file:
.. code-block:: pycon
>>> bestaudio.download()
show all media types for a video (video+audio, video-only and audio-only):
.. code-block:: pycon
>>> allstreams = video.allstreams
>>> for s in allstreams:
... print(s.mediatype, s.extension, s.quality)
...
normal mp4 1280x720
normal webm 640x360
normal mp4 640x360
normal flv 320x240
normal 3gp 320x240
normal 3gp 176x144
video m4v 1280x720
video webm 1280x720
video m4v 854x480
video webm 854x480
video m4v 640x360
video webm 640x360
video m4v 426x240
video webm 426x240
video m4v 256x144
video webm 256x144
audio m4a 256k
audio ogg 192k
audio m4a 128k
audio ogg 128k
audio m4a 48k
Installation
------------
pafy can be installed using `pip <http://www.pip-installer.org>`_:
.. code-block:: bash
$ [sudo] pip install pafy
or use a `virtualenv <http://virtualenv.org>`_ if you don't want to install it system-wide:
.. code-block:: bash
$ virtualenv venv
$ source venv/bin/activate
$ pip install pafy
Command Line Tool (ytdl) Usage
------------------------------
.. code-block:: bash
usage: ytdl [-h] [-i] [-s]
[-t {audio,video,normal,all} [{audio,video,normal,all} ...]]
[-n N] [-b] [-a]
url
YouTube Download Tool
positional arguments:
url YouTube video URL to download
optional arguments:
-h, --help show this help message and exit
-i Display vid info
-s Display available streams
-t {audio,video,normal,all} [{audio,video,normal,all} ...]
Stream types to display
-n N Specify stream to download by stream number (use -s to
list available streams)
-b Download the best quality video (ignores -n)
-a Download the best quality audio (ignores -n)
ytdl Examples
-------------
Download best available resolution (-b):
.. code-block:: bash
$ ytdl -b "http://www.youtube.com/watch?v=cyMHZVT91Dw"
Download best available audio stream (-a)
(note; the full url is not required, just the video id will suffice):
.. code-block:: bash
$ ytdl -a cyMHZVT91Dw
get video info (-i):
.. code-block:: bash
$ ytdl -i cyMHZVT91Dw
list available download streams:
.. code-block:: bash
$ ytdl cyMHZVT91Dw
Stream Type Format Quality Size
------ ---- ------ ------- ----
1 normal webm [640x360] 33 MB
2 normal mp4 [640x360] 23 MB
3 normal flv [320x240] 14 MB
4 normal 3gp [320x240] 9 MB
5 normal 3gp [176x144] 3 MB
6 audio m4a [48k] 2 MB
7 audio m4a [128k] 5 MB
8 audio ogg [128k] 5 MB
9 audio ogg [192k] 7 MB
10 audio m4a [256k] 10 MB
Download mp4 640x360 (ie. stream number 2):
.. code-block:: bash
$ ytdl -n2 cyMHZVT91Dw
Download m4a audio stream at 256k bitrate:
.. code-block:: bash
$ ytdl -n10 cyMHZVT91Dw
IRC
---
The mps-youtube irc channel (`#mps-youtube` on Freenode) can be used for discussion of pafy.
| 26.546584 | 181 | 0.635938 | eng_Latn | 0.542588 |
b11254c58971d08714b7f58526c4746583afec88 | 2,493 | md | Markdown | README.md | PeanutbutterWarrior/Ultimate-Calculator | db8db2b946269cd0a74eba1530aef5525ce9acff | [
"MIT"
] | null | null | null | README.md | PeanutbutterWarrior/Ultimate-Calculator | db8db2b946269cd0a74eba1530aef5525ce9acff | [
"MIT"
] | null | null | null | README.md | PeanutbutterWarrior/Ultimate-Calculator | db8db2b946269cd0a74eba1530aef5525ce9acff | [
"MIT"
] | null | null | null | <a style="text-decoration:none" href="https://github.com/JordanLeich/Ultimate-Calculator/releases">
<img src="https://img.shields.io/github/release/JordanLeich/Ultimate-Calculator.svg?style=flat-square" alt="Releases" />
</a>
<a style="text-decoration:none" href="https://github.com/JordanLeich/Ultimate-Calculator/contributors/">
<img src="https://img.shields.io/github/contributors/JordanLeich/Ultimate-Calculator?style=flat-square" alt="Contributors" />
</a>
<a style="text-decoration:none" href="https://github.com/JordanLeich/Ultimate-Calculator/stargazers">
<img src="https://img.shields.io/github/stars/JordanLeich/Ultimate-Calculator.svg?style=flat-square" alt="Stars" />
</a>
# Description 💻
This is an incredibly powerful calculator that is capable of many useful day-to-day functions. Such functions include solving basic arithmetic and algebra problems, performing chemistry conversions, and much more. This project also features a fully operational GUI application for those who are not down with the normal text-based version.
# How To Use ⚡
*Step 1*:
Make sure you have pip installed on your OS, you can do it by writing the following command on your terminal.<br/><br/>
```
pip help
```
<br/><br/>
*Step 2*:
If you don't have pip installed , you can install it using the following command.<br/><br/>
```
python get-pip.py
```
<br/>or <br/>
```
python3 get-pip.py
```
<br/><br/>
*Step 3*:
Check the pip version using :<br/><br/>
```
pip -V
```
<br/><br/>
*Step 4*:
Clone the repository by using <br/><br/>
```
git clone https://github.com/JordanLeich/Ultimate-Calculator.git
```
<br/><br/>
*Step 5*:
Install the requirements<br/><br/>
```
pip install -r requirements.txt
```
<br/><br/>
*Step 6*:
Run the project using<br/>
```
python3 main.py
```
# Preview 💻

# GUI Preview 💻

# TODO List ❗
- Check out the [Issues Page](https://github.com/JordanLeich/Ultimate-Calculator/issues/1)
# Contributing 👍
- Check out the [Contribution Page](https://github.com/JordanLeich/Ultimate-Calculator/blob/main/CONTRIBUTING.md) for all guidelines and rules.
# Bug Reporting 🐞
- Check out the [Issues Page](https://github.com/JordanLeich/Ultimate-Calculator/issues/7)

| 37.208955 | 318 | 0.680305 | eng_Latn | 0.503232 |
b113a3293ef603e6c29c7d64d506873bc9df1cfd | 1,241 | md | Markdown | aspnet/web-forms/videos/building-20-applications/lesson-7-databinding-to-user-interface-controls.md | yanshengjie/Docs.zh-cn | 066555ec6b2a12c945b1da413449730a81ce14ed | [
"CC-BY-4.0",
"MIT"
] | null | null | null | aspnet/web-forms/videos/building-20-applications/lesson-7-databinding-to-user-interface-controls.md | yanshengjie/Docs.zh-cn | 066555ec6b2a12c945b1da413449730a81ce14ed | [
"CC-BY-4.0",
"MIT"
] | null | null | null | aspnet/web-forms/videos/building-20-applications/lesson-7-databinding-to-user-interface-controls.md | yanshengjie/Docs.zh-cn | 066555ec6b2a12c945b1da413449730a81ce14ed | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
uid: web-forms/videos/building-20-applications/lesson-7-databinding-to-user-interface-controls
title: '[Lesson 7:] Databinding to User Interface Controls | Microsoft Docs'
author: microsoft
description: In this lesson, you will learn how data can be displayed in an ASP.NET 2.0 web application through some of the new data-bound controls.
ms.author: aspnetcontent
manager: wpickett
ms.date: 11/29/2005
ms.topic: article
ms.assetid: 49625de7-06c3-484c-bd76-d322b9ca41ea
ms.technology: dotnet-webforms
ms.prod: .net-framework
msc.legacyurl: /web-forms/videos/building-20-applications/lesson-7-databinding-to-user-interface-controls
msc.type: video
ms.openlocfilehash: 60c7104990372fb834b88ddd36882c9a570aeb38
ms.sourcegitcommit: f8852267f463b62d7f975e56bea9aa3f68fbbdeb
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 04/06/2018
---
<a name="lesson-7-databinding-to-user-interface-controls"></a>[Lesson 7:] Databinding to User Interface Controls
====================
by [Microsoft](https://github.com/microsoft)
In this lesson, you will learn how data can be displayed in your ASP.NET 2.0 web application through some of the new data-bound controls.
[▶ Watch video (22 minutes)](https://channel9.msdn.com/Blogs/ASP-NET-Site-Videos/lesson-7-databinding-to-user-interface-controls)
> [!div class="step-by-step"]
> [Previous](lesson-6-working-with-stylesheets-and-master-pages.md)
> [Next](lesson-8-working-with-the-gridview-and-formview.md)
| 38.78125 | 122 | 0.769541 | yue_Hant | 0.247986 |
b113bf04610d6395c06a6b39cd51216209f908e8 | 6,420 | md | Markdown | docs/extensibility/internals/source-control-plug-in-architecture.md | doodz/visualstudio-docs.fr-fr | 49c7932ec7a761e4cd7c259a5772e5415253a7a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/extensibility/internals/source-control-plug-in-architecture.md | doodz/visualstudio-docs.fr-fr | 49c7932ec7a761e4cd7c259a5772e5415253a7a5 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2018-10-19T08:00:06.000Z | 2018-10-19T08:00:06.000Z | docs/extensibility/internals/source-control-plug-in-architecture.md | doodz/visualstudio-docs.fr-fr | 49c7932ec7a761e4cd7c259a5772e5415253a7a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "Source control plug-in architecture | Microsoft Docs"
ms.custom:
ms.date: 11/04/2016
ms.reviewer:
ms.suite:
ms.technology: vs-ide-sdk
ms.tgt_pltfrm:
ms.topic: article
helpviewer_keywords: source control plug-ins, architecture
ms.assetid: 35351d4c-9414-409b-98fc-f2023e2426b7
caps.latest.revision: "24"
author: gregvanl
ms.author: gregvanl
manager: ghogen
ms.openlocfilehash: e0cde4ca360aa0059abcbe0b64d63b4a94e85d78
ms.sourcegitcommit: f40311056ea0b4677efcca74a285dbb0ce0e7974
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 10/31/2017
---
# <a name="source-control-plug-in-architecture"></a>Source control plug-in architecture
You can add source control support to the [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] integrated development environment (IDE) by implementing and attaching a source control plug-in. The IDE connects to the source control plug-in through the well-defined Source Control Plug-in API. The IDE exposes the version control capabilities of the source control system by providing a user interface (UI) consisting of toolbars and menu commands. The source control plug-in implements the source control functionality.
## <a name="source-control-plug-in-resources"></a>Source control plug-in resources
 The Source Control Plug-in SDK provides resources to help you create your version control application and connect it to the [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] IDE. It contains the specification of the API that must be implemented by a source control plug-in so that it can be integrated into the [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] IDE. It also contains sample code (written in C++) that implements a skeleton source control plug-in demonstrating an implementation of the essential functions required by the Source Control Plug-in API.
 The Source Control Plug-in API specification lets you take advantage of any source control system of your choice, provided you create a source control DLL with its set of functions implemented in accordance with the required Source Control Plug-in API.
## <a name="components"></a>Components
 The Source Control Adapter Package in the diagram is the component of the IDE that translates the user's request for a source control operation into a call to a function supported by the source control plug-in. For this to work, the IDE and the source control plug-in must carry on an effective dialog that passes information back and forth between the IDE and the plug-in. For this dialog to take place, both must speak the same language. The Source Control Plug-in API described in this documentation is the common vocabulary for that exchange.
 ![Source control plug-in](../../extensibility/internals/media/vs%5Fsourcecontrolarchitecture.gif "Source control plug-in")
Architectural diagram showing the interaction between Visual Studio and a source control plug-in
 As shown in the architectural diagram, the [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] shell, labeled Visual Studio shell in the diagram, hosts the user's working projects and associated components, such as editors and Solution Explorer. The Source Control Adapter Package manages the interaction between the IDE and the source control plug-in. The Source Control Adapter Package provides its own source control user interface. This is the top-level UI that the user interacts with in order to initiate and set the scope of a source control operation.
 The source control plug-in can have its own UI, which can consist of two parts, as shown in the figure. The area labeled "Provider UI" represents the custom UI elements that you, as the creator of a source control plug-in, supply. When the user invokes advanced source control operations, these elements are displayed directly by the source control plug-in. The area labeled "Helper UI" is a set of source control plug-in UI features that are invoked indirectly through the IDE. The source control plug-in passes UI-related messages to the IDE through special callback functions provided by the IDE. The Helper UI allows for more seamless integration with the IDE (often by means of an **Advanced** button) and thereby provides a more unified user experience.
 A source control plug-in cannot make changes to the [!INCLUDE[vsprvs](../../code-quality/includes/vsprvs_md.md)] environment and, consequently, to the Source Control Adapter Package or to the source control UI provided by the IDE. It must make maximum use of the flexibility offered by implementing the various Source Control Plug-in API functions that contribute to an integrated experience for the end user. The reference section of the Source Control Plug-in API documentation includes information about certain advanced source control plug-in capabilities. To take advantage of these capabilities, the source control plug-in must declare its advanced capabilities to the IDE during initialization, and it must implement specific advanced functions for each capability.
## <a name="see-also"></a>See also
 [Source control plug-ins](../../extensibility/source-control-plug-ins.md)
 [Glossary](../../extensibility/source-control-plug-in-glossary.md)
 [Creating a source control plug-in](../../extensibility/internals/creating-a-source-control-plug-in.md)
b114241ab689a7b135d6f72cda9c91efb7c8b4ad | 2,791 | md | Markdown | dynamicsax2012-technet/userlogoffservicerequest-constructor-microsoft-dynamics-commerce-runtime-services-messages.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamicsax2012-technet/userlogoffservicerequest-constructor-microsoft-dynamics-commerce-runtime-services-messages.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamicsax2012-technet/userlogoffservicerequest-constructor-microsoft-dynamics-commerce-runtime-services-messages.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: UserLogOffServiceRequest Constructor (Microsoft.Dynamics.Commerce.Runtime.Services.Messages)
TOCTitle: UserLogOffServiceRequest Constructor
ms:assetid: M:Microsoft.Dynamics.Commerce.Runtime.Services.Messages.UserLogOffServiceRequest.#ctor(Microsoft.Dynamics.Commerce.Runtime.DataModel.Device,System.String,System.String,Microsoft.Dynamics.Commerce.Runtime.DataModel.LogOnConfiguration)
ms:mtpsurl: https://technet.microsoft.com/en-us/library/microsoft.dynamics.commerce.runtime.services.messages.userlogoffservicerequest.userlogoffservicerequest(v=AX.60)
ms:contentKeyID: 65319885
ms.date: 05/18/2015
mtps_version: v=AX.60
f1_keywords:
- Microsoft.Dynamics.Commerce.Runtime.Services.Messages.UserLogOffServiceRequest.#ctor
dev_langs:
- CSharp
- C++
- VB
---
# UserLogOffServiceRequest Constructor
**Namespace:** [Microsoft.Dynamics.Commerce.Runtime.Services.Messages](microsoft-dynamics-commerce-runtime-services-messages-namespace.md)
**Assembly:** Microsoft.Dynamics.Commerce.Runtime.Services.Messages (in Microsoft.Dynamics.Commerce.Runtime.Services.Messages.dll)
## Syntax
``` vb
'Declaration
Public Sub New ( _
device As Device, _
staffId As String, _
authenticationProvider As String, _
logOnConfiguration As LogOnConfiguration _
)
'Usage
Dim device As Device
Dim staffId As String
Dim authenticationProvider As String
Dim logOnConfiguration As LogOnConfiguration
Dim instance As New UserLogOffServiceRequest(device, _
staffId, authenticationProvider, _
logOnConfiguration)
```
``` csharp
public UserLogOffServiceRequest(
Device device,
string staffId,
string authenticationProvider,
LogOnConfiguration logOnConfiguration
)
```
``` c++
public:
UserLogOffServiceRequest(
Device^ device,
String^ staffId,
String^ authenticationProvider,
LogOnConfiguration logOnConfiguration
)
```
#### Parameters
- device
Type: [Microsoft.Dynamics.Commerce.Runtime.DataModel.Device](device-class-microsoft-dynamics-commerce-runtime-datamodel.md)
<!-- end list -->
- staffId
Type: [System.String](https://technet.microsoft.com/en-us/library/s1wwdcbf\(v=ax.60\))
<!-- end list -->
- authenticationProvider
Type: [System.String](https://technet.microsoft.com/en-us/library/s1wwdcbf\(v=ax.60\))
<!-- end list -->
- logOnConfiguration
Type: [Microsoft.Dynamics.Commerce.Runtime.DataModel.LogOnConfiguration](logonconfiguration-enumeration-microsoft-dynamics-commerce-runtime-datamodel.md)
## See Also
#### Reference
[UserLogOffServiceRequest Class](userlogoffservicerequest-class-microsoft-dynamics-commerce-runtime-services-messages.md)
[Microsoft.Dynamics.Commerce.Runtime.Services.Messages Namespace](microsoft-dynamics-commerce-runtime-services-messages-namespace.md)
| 31.011111 | 245 | 0.784307 | yue_Hant | 0.786611 |
b1148073b836ec93dfcb1ca811b4a9790a8684d9 | 68 | md | Markdown | V. Spring Boot Actuator/54.3.7 RabbitMQ Metrics.md | fzgmxd/Spring-Boot-Reference-Guide | ad3f6cb455abc0fe7533b65f05200675fa9888ba | [
"MIT"
] | 238 | 2017-09-27T01:49:04.000Z | 2022-02-22T03:34:11.000Z | V. Spring Boot Actuator/54.3.7 RabbitMQ Metrics.md | fzgmxd/Spring-Boot-Reference-Guide | ad3f6cb455abc0fe7533b65f05200675fa9888ba | [
"MIT"
] | 4 | 2017-09-01T16:05:02.000Z | 2018-08-08T01:58:09.000Z | V. Spring Boot Actuator/54.3.7 RabbitMQ Metrics.md | fzgmxd/Spring-Boot-Reference-Guide | ad3f6cb455abc0fe7533b65f05200675fa9888ba | [
"MIT"
] | 73 | 2017-09-27T01:49:07.000Z | 2022-03-03T14:54:23.000Z | ### 54.3.7 RabbitMQ指标
自动配置将使用名为`RabbitMQ`的度量启用所有可用RabbitMQ连接工厂的检测。
| 17 | 44 | 0.808824 | yue_Hant | 0.806754 |
b115312cb4d1e57e1929a585038b94b93e8e0214 | 569 | md | Markdown | en/messages/joanna-de-angelis/nao-te-digas-decepcionado.md | veevo/drafts | cfe468193c9ed3daad63f7a9dfa7698ffe28a88d | [
"Unlicense"
] | null | null | null | en/messages/joanna-de-angelis/nao-te-digas-decepcionado.md | veevo/drafts | cfe468193c9ed3daad63f7a9dfa7698ffe28a88d | [
"Unlicense"
] | null | null | null | en/messages/joanna-de-angelis/nao-te-digas-decepcionado.md | veevo/drafts | cfe468193c9ed3daad63f7a9dfa7698ffe28a88d | [
"Unlicense"
] | null | null | null | # Do not say you are disappointed
Do not say you are disappointed with your neighbor.
Do not point out his errors.
Act so that you yourself do not disappoint; nor allow yourself errors.
Do not declare: - Now it is too late!
Do not insist: - "I will stay at my post, for I was the one offended!"
Go to the brother who transgressed against you and apologize to him.
Whoever takes the first step arrives first at the goal of Good.
Under no disguise whatsoever should you ever harbor pride, for it is the most skillful enemy lying in wait against your spiritual progress.
# Source
Joanna de Ângelis
| 22.76 | 134 | 0.724077 | por_Latn | 0.999883 |
b11569adb06c00fb930a070d1975390b1b3e3344 | 141 | md | Markdown | README.md | pxd-fed/gulp-es6-babel | 34284c04f3b088867ec4d319bbbbea09825331f9 | [
"MIT"
] | null | null | null | README.md | pxd-fed/gulp-es6-babel | 34284c04f3b088867ec4d319bbbbea09825331f9 | [
"MIT"
] | null | null | null | README.md | pxd-fed/gulp-es6-babel | 34284c04f3b088867ec4d319bbbbea09825331f9 | [
"MIT"
] | null | null | null | # gulp-es6-babel
```
$ npm install -g gulp-cli
```
```
$ npm install
$ gulp
```
## TODO
- Sprite Image
- Watch new files and delete files
| 9.4 | 34 | 0.609929 | eng_Latn | 0.330303 |
b1157508fc082810206206102eee0a9addd7dc34 | 4,167 | md | Markdown | packages/europa-plugin-image/README.md | NotNinja/html.md | 6852bd54f88ccd0aeeac685ccce17bb8a353f462 | [
"MIT"
] | null | null | null | packages/europa-plugin-image/README.md | NotNinja/html.md | 6852bd54f88ccd0aeeac685ccce17bb8a353f462 | [
"MIT"
] | null | null | null | packages/europa-plugin-image/README.md | NotNinja/html.md | 6852bd54f88ccd0aeeac685ccce17bb8a353f462 | [
"MIT"
] | null | null | null | # europa-plugin-image
A [Europa](https://github.com/neocotic/europa) plugin to convert HTML tags to Markdown images.
[](https://github.com/neocotic/europa/actions/workflows/ci.yml)
[](https://github.com/neocotic/europa/raw/main/packages/europa-plugin-image/LICENSE.md)
[](https://npmjs.com/package/europa-plugin-image)
* [Install](#install)
* [Converted Tags](#converted-tags)
* [Examples](#examples)
* [Bugs](#bugs)
* [Contributors](#contributors)
* [License](#license)
## Install
Install using your preferred package manager. For example;
``` bash
$ npm install --save europa-plugin-image
```
Then, in order to activate this plugin;
``` typescript
// Import europa-core implementation (e.g. `europa`, `node-europa`)
import plugin from 'europa-plugin-image';
Europa.registerPlugin(plugin);
// ...
```
However, this plugin belongs to the `europa-preset-default`, which is registered with all Europa Core implementations by default,
so you should not need to do anything to use this plugin.
## Converted Tags
The following HTML tags are converted by this plugin:
* `IMG`
## Examples
### Basic
HTML:
``` html
<img src="https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/europa/europa-banner-250x100.png">
<img src="https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/node-europa/node-europa-banner-377x100.png" alt="Europa Node">
```
Markdown:
``` markdown
![][image1]
![Europa Node][image2]
[image1]: https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/europa/europa-banner-250x100.png
[image2]: https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/node-europa/node-europa-banner-377x100.png
```
### Absolute Option Enabled
Assume the following example is converted on <https://github.com/neocotic/europa>.
Setup:
``` typescript
const europa = new Europa({ absolute: true });
```
HTML:
``` html
<img src="//raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/europa/europa-banner-250x100.png">
<img src="//raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/node-europa/node-europa-banner-377x100.png" alt="Europa Node">
```
Markdown:
``` markdown
![][image1]
![Europa Node][image2]
[image1]: https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/europa/europa-banner-250x100.png
[image2]: https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/node-europa/node-europa-banner-377x100.png
```
### Inline Option Enabled
Setup:
``` typescript
const europa = new Europa({ inline: true });
```
HTML:
``` html
<img src="https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/europa/europa-banner-250x100.png">
<img src="https://raw.githubusercontent.com/neocotic/europa-branding/main/assets/banner/node-europa/node-europa-banner-377x100.png" alt="Europa Node">
```
Markdown:
``` markdown


```
## Bugs
If you have any problems with this Europa plugin or would like to see changes currently in development you can do so
[here](https://github.com/neocotic/europa/issues).
## Contributors
If you want to contribute, you're a legend! Information on how you can do so can be found in
[CONTRIBUTING.md](https://github.com/neocotic/europa/blob/main/CONTRIBUTING.md). We want your suggestions and pull
requests!
A list of Europa contributors can be found in [AUTHORS.md](https://github.com/neocotic/europa/blob/main/AUTHORS.md).
## License
Copyright © 2022 neocotic
See [LICENSE.md](https://github.com/neocotic/europa/raw/main/packages/europa-plugin-image/LICENSE.md) for more information on
our MIT license.
| 31.568182 | 169 | 0.758819 | kor_Hang | 0.20601 |
b1175e755f4caed4efab5acf798bae50399f8038 | 141 | md | Markdown | _posts/0000-01-02-manvi0504.md | manvi0504/github-slideshow | 2e1ad2936ba5e5df5dc068773a4b60bf33e8b8a9 | [
"MIT"
] | null | null | null | _posts/0000-01-02-manvi0504.md | manvi0504/github-slideshow | 2e1ad2936ba5e5df5dc068773a4b60bf33e8b8a9 | [
"MIT"
] | 3 | 2020-12-07T19:39:13.000Z | 2020-12-07T20:40:00.000Z | _posts/0000-01-02-manvi0504.md | manvi0504/github-slideshow | 2e1ad2936ba5e5df5dc068773a4b60bf33e8b8a9 | [
"MIT"
] | null | null | null | layout: slide
title: "Welcome to our second slide!"
---
Your text
If life were predictable it would cease to be life, and be without flavor.
| 23.5 | 74 | 0.751773 | eng_Latn | 0.999551 |
b1180df87f124733c22c1c8e7cfaf6e81977c450 | 1,468 | md | Markdown | datasets/myanmar_news/README.md | MitchellTesla/datasets | bf08ea3f95e8209a7afd2b50410ad5db51409d11 | [
"Apache-2.0"
] | 6 | 2021-05-02T17:08:55.000Z | 2022-03-12T14:02:09.000Z | datasets/myanmar_news/README.md | MitchellTesla/datasets | bf08ea3f95e8209a7afd2b50410ad5db51409d11 | [
"Apache-2.0"
] | null | null | null | datasets/myanmar_news/README.md | MitchellTesla/datasets | bf08ea3f95e8209a7afd2b50410ad5db51409d11 | [
"Apache-2.0"
] | 1 | 2022-03-06T14:14:07.000Z | 2022-03-06T14:14:07.000Z | ---
annotations_creators: []
language_creators: []
languages:
- my
licenses:
- gpl-3.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- topic-classification
paperswithcode_id: null
pretty_name: MyanmarNews
---
# Dataset Card for Myanmar_News
## Dataset Description
- **Repository:** https://github.com/ayehninnkhine/MyanmarNewsClassificationSystem
### Dataset Summary
The Myanmar news dataset contains article snippets in four categories:
Business, Entertainment, Politics, and Sport.
These were collected in October 2017 by Aye Hninn Khine.
### Languages
Myanmar/Burmese language
## Dataset Structure
### Data Fields
- text - text from article
- category - a topic: Business, Entertainment, **Politic**, or **Sport** (note spellings)
### Data Splits
One training set (8,116 total rows)
### Source Data
#### Initial Data Collection and Normalization
Data was collected by Aye Hninn Khine
and shared on GitHub with a GPL-3.0 license.
Multiple text files were consolidated into one labeled CSV file by Nick Doiron.
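Because the consolidated file is a plain labeled CSV with `text` and `category` columns (see Data Fields above), a per-topic row count takes only the standard library. A minimal sketch over made-up sample rows (the real file has 8,116 rows):

```python
import csv
import io
from collections import Counter

# made-up sample rows in the same shape as the consolidated CSV (text, category)
sample_csv = io.StringIO(
    "text,category\n"
    "match report,Sport\n"
    "box office news,Entertainment\n"
    "election update,Politic\n"
    "stock market recap,Business\n"
    "transfer rumours,Sport\n"
)

counts = Counter(row["category"] for row in csv.DictReader(sample_csv))
print(sorted(counts.items()))
# → [('Business', 1), ('Entertainment', 1), ('Politic', 1), ('Sport', 2)]
```

For the real dataset, replace the `StringIO` sample with `open(...)` on the consolidated CSV.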
## Additional Information
### Dataset Curators
Contributors to original GitHub repo:
- https://github.com/ayehninnkhine
### Licensing Information
GPL-3.0
### Citation Information
See https://github.com/ayehninnkhine/MyanmarNewsClassificationSystem
### Contributions
Thanks to [@mapmeld](https://github.com/mapmeld) for adding this dataset.
| 19.064935 | 89 | 0.766349 | eng_Latn | 0.489285 |
b118656d013ea18b535efa8f7181382f8e16ab4c | 228 | md | Markdown | README.md | FranciscoKloganB/learnr | 331a8eec47d60e3a047987b3e11a11891c9c826f | [
"MIT"
] | null | null | null | README.md | FranciscoKloganB/learnr | 331a8eec47d60e3a047987b3e11a11891c9c826f | [
"MIT"
] | 4 | 2022-02-11T03:11:49.000Z | 2022-02-27T19:04:55.000Z | README.md | FranciscoKloganB/learnr | 331a8eec47d60e3a047987b3e11a11891c9c826f | [
"MIT"
] | null | null | null | # Learnr - Storing resources
Small demonstration sticky note app (frontend) using Vue.js
You can run this project locally by running `npm run serve`.
You can visit a live demo by visiting https://learnr-resources.netlify.app
| 28.5 | 74 | 0.780702 | eng_Latn | 0.986393 |
b118f6425fd84c78fad28f61962e70308c0ca9ec | 8,382 | md | Markdown | content/integrations/mysql.md | miketheman/documentation | 3f0b64cbff19db24860ee2812c87dfd1942f5cd9 | [
"BSD-3-Clause"
] | null | null | null | content/integrations/mysql.md | miketheman/documentation | 3f0b64cbff19db24860ee2812c87dfd1942f5cd9 | [
"BSD-3-Clause"
] | null | null | null | content/integrations/mysql.md | miketheman/documentation | 3f0b64cbff19db24860ee2812c87dfd1942f5cd9 | [
"BSD-3-Clause"
] | null | null | null | ---
title: Datadog-MySQL Integration
integration_title: MySQL
kind: integration
git_integration_title: mysql
newhlevel: true
---
# Overview
Connect MySQL to Datadog in order to:
* Visualize your database performance
* Correlate the performance of MySQL with the rest of your applications
# Installation
1. Create a ```datadog``` user with replication rights on your MySQL server with the following command, replacing ```<UNIQUEPASSWORD>``` with a unique password:
sudo mysql -e "CREATE USER 'datadog'@'localhost' IDENTIFIED BY '<UNIQUEPASSWORD>';"
sudo mysql -e "GRANT REPLICATION CLIENT ON *.* TO 'datadog'@'localhost' WITH MAX_USER_CONNECTIONS 5;"
If you'd like to get the full metrics catalog please also grant the following privileges:
sudo mysql -e "GRANT PROCESS ON *.* TO 'datadog'@'localhost';"
sudo mysql -e "GRANT SELECT ON performance_schema.* TO 'datadog'@'localhost';"
2. Verify that the user was created successfully using the following commands, replacing ```<UNIQUEPASSWORD>``` with the password above:
mysql -u datadog --password=<UNIQUEPASSWORD> -e "show status" | \
grep Uptime && echo -e "\033[0;32mMySQL user - OK\033[0m" || \
echo -e "\033[0;31mCannot connect to MySQL\033[0m"
mysql -u datadog --password=<UNIQUEPASSWORD> -e "show slave status" && \
echo -e "\033[0;32mMySQL grant - OK\033[0m" || \
echo -e "\033[0;31mMissing REPLICATION CLIENT grant\033[0m"
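The grep checks above just look for expected variables in the tab-separated `show status` output. Purely as an illustration (the Datadog agent does its own collection), the same pass/fail logic can be sketched in Python:

```python
def parse_show_status(output: str) -> dict:
    """Parse tab-separated `mysql -e "show status"` output into {variable: value}."""
    metrics = {}
    for line in output.strip().splitlines():
        name, sep, value = line.partition("\t")
        if not sep or name == "Variable_name":  # skip malformed lines and the header row
            continue
        metrics[name] = value
    return metrics

# sample output in the shape `mysql --batch` produces
sample = "Variable_name\tValue\nUptime\t474\nThreads_connected\t2\n"
status = parse_show_status(sample)
print("MySQL user - OK" if "Uptime" in status else "Cannot connect to MySQL")  # → MySQL user - OK
```

Feed it the captured output of the `mysql -u datadog ... -e "show status"` command to script the same verification.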
# Configuration
1. Edit the mysql.yaml file in your agent's conf.d directory, replacing ```<UNIQUEPASSWORD>``` with the password used above.
init_config:
instances:
- server: localhost
user: datadog
pass: <UNIQUEPASSWORD>
tags:
- optional_tag1
- optional_tag2
options:
replication: 0
galera_cluster: 1
extra_status_metrics: true
extra_innodb_metrics: true
extra_performance_metrics: true
schema_size_metrics: false
disable_innodb_metrics: false
Agent 5.7 added a new option: `disable_innodb_metrics`. This should only be used with older versions of MySQL without innodb engine support.
See the metrics section below for a list of the new metrics provided by each of these options.
<%= insert_example_links%>
# Validation
To validate your installation and configuration, restart the agent and execute the info command. The output should contain a section similar to the following:
Checks
======
[...]
mysql
-----
- instance #0 [OK]
- Collected 8 metrics & 0 events
# Metrics
<%= get_metrics_from_git()%>
|`extra_status_metrics` adds the following metrics:|
|----------|--------|
| mysql.binlog.cache_disk_use | GAUGE |
| mysql.binlog.cache_use | GAUGE |
| mysql.performance.handler_commit | RATE |
| mysql.performance.handler_delete | RATE |
| mysql.performance.handler_prepare | RATE |
| mysql.performance.handler_read_first | RATE |
| mysql.performance.handler_read_key | RATE |
| mysql.performance.handler_read_next | RATE |
| mysql.performance.handler_read_prev | RATE |
| mysql.performance.handler_read_rnd | RATE |
| mysql.performance.handler_read_rnd_next | RATE |
| mysql.performance.handler_rollback | RATE |
| mysql.performance.handler_update | RATE |
| mysql.performance.handler_write | RATE |
| mysql.performance.opened_tables | RATE |
| mysql.performance.qcache_total_blocks | GAUGE |
| mysql.performance.qcache_free_blocks | GAUGE |
| mysql.performance.qcache_free_memory | GAUGE |
| mysql.performance.qcache_not_cached | RATE |
| mysql.performance.qcache_queries_in_cache | GAUGE |
| mysql.performance.select_full_join | RATE |
| mysql.performance.select_full_range_join | RATE |
| mysql.performance.select_range | RATE |
| mysql.performance.select_range_check | RATE |
| mysql.performance.select_scan | RATE |
| mysql.performance.sort_merge_passes | RATE |
| mysql.performance.sort_range | RATE |
| mysql.performance.sort_rows | RATE |
| mysql.performance.sort_scan | RATE |
| mysql.performance.table_locks_immediate | GAUGE |
| mysql.performance.table_locks_immediate.rate | RATE |
| mysql.performance.threads_cached | GAUGE |
| mysql.performance.threads_created | MONOTONIC |
{:.table}
|`extra_innodb_metrics` adds the following metrics:|
|----------|--------|
| mysql.innodb.active_transactions | GAUGE |
| mysql.innodb.buffer_pool_data | GAUGE |
| mysql.innodb.buffer_pool_pages_data | GAUGE |
| mysql.innodb.buffer_pool_pages_dirty | GAUGE |
| mysql.innodb.buffer_pool_pages_flushed | RATE |
| mysql.innodb.buffer_pool_pages_free | GAUGE |
| mysql.innodb.buffer_pool_pages_total | GAUGE |
| mysql.innodb.buffer_pool_read_ahead | RATE |
| mysql.innodb.buffer_pool_read_ahead_evicted | RATE |
| mysql.innodb.buffer_pool_read_ahead_rnd | GAUGE |
| mysql.innodb.buffer_pool_wait_free | MONOTONIC |
| mysql.innodb.buffer_pool_write_requests | RATE |
| mysql.innodb.checkpoint_age | GAUGE |
| mysql.innodb.current_transactions | GAUGE |
| mysql.innodb.data_fsyncs | RATE |
| mysql.innodb.data_pending_fsyncs | GAUGE |
| mysql.innodb.data_pending_reads | GAUGE |
| mysql.innodb.data_pending_writes | GAUGE |
| mysql.innodb.data_read | RATE |
| mysql.innodb.data_written | RATE |
| mysql.innodb.dblwr_pages_written | RATE |
| mysql.innodb.dblwr_writes | RATE |
| mysql.innodb.hash_index_cells_total | GAUGE |
| mysql.innodb.hash_index_cells_used | GAUGE |
| mysql.innodb.history_list_length | GAUGE |
| mysql.innodb.ibuf_free_list | GAUGE |
| mysql.innodb.ibuf_merged | RATE |
| mysql.innodb.ibuf_merged_delete_marks | RATE |
| mysql.innodb.ibuf_merged_deletes | RATE |
| mysql.innodb.ibuf_merged_inserts | RATE |
| mysql.innodb.ibuf_merges | RATE |
| mysql.innodb.ibuf_segment_size | GAUGE |
| mysql.innodb.ibuf_size | GAUGE |
| mysql.innodb.lock_structs | RATE |
| mysql.innodb.locked_tables | GAUGE |
| mysql.innodb.locked_transactions | GAUGE |
| mysql.innodb.log_waits | RATE |
| mysql.innodb.log_write_requests | RATE |
| mysql.innodb.log_writes | RATE |
| mysql.innodb.lsn_current | RATE |
| mysql.innodb.lsn_flushed | RATE |
| mysql.innodb.lsn_last_checkpoint | RATE |
| mysql.innodb.mem_adaptive_hash | GAUGE |
| mysql.innodb.mem_additional_pool | GAUGE |
| mysql.innodb.mem_dictionary | GAUGE |
| mysql.innodb.mem_file_system | GAUGE |
| mysql.innodb.mem_lock_system | GAUGE |
| mysql.innodb.mem_page_hash | GAUGE |
| mysql.innodb.mem_recovery_system | GAUGE |
| mysql.innodb.mem_thread_hash | GAUGE |
| mysql.innodb.mem_total | GAUGE |
| mysql.innodb.os_file_fsyncs | RATE |
| mysql.innodb.os_file_reads | RATE |
| mysql.innodb.os_file_writes | RATE |
| mysql.innodb.os_log_pending_fsyncs | GAUGE |
| mysql.innodb.os_log_pending_writes | GAUGE |
| mysql.innodb.os_log_written | RATE |
| mysql.innodb.pages_created | RATE |
| mysql.innodb.pages_read | RATE |
| mysql.innodb.pages_written | RATE |
| mysql.innodb.pending_aio_log_ios | GAUGE |
| mysql.innodb.pending_aio_sync_ios | GAUGE |
| mysql.innodb.pending_buffer_pool_flushes | GAUGE |
| mysql.innodb.pending_checkpoint_writes | GAUGE |
| mysql.innodb.pending_ibuf_aio_reads | GAUGE |
| mysql.innodb.pending_log_flushes | GAUGE |
| mysql.innodb.pending_log_writes | GAUGE |
| mysql.innodb.pending_normal_aio_reads | GAUGE |
| mysql.innodb.pending_normal_aio_writes | GAUGE |
| mysql.innodb.queries_inside | GAUGE |
| mysql.innodb.queries_queued | GAUGE |
| mysql.innodb.read_views | GAUGE |
| mysql.innodb.rows_deleted | RATE |
| mysql.innodb.rows_inserted | RATE |
| mysql.innodb.rows_read | RATE |
| mysql.innodb.rows_updated | RATE |
| mysql.innodb.s_lock_os_waits | RATE |
| mysql.innodb.s_lock_spin_rounds | RATE |
| mysql.innodb.s_lock_spin_waits | RATE |
| mysql.innodb.semaphore_wait_time | GAUGE |
| mysql.innodb.semaphore_waits | GAUGE |
| mysql.innodb.tables_in_use | GAUGE |
| mysql.innodb.x_lock_os_waits | RATE |
| mysql.innodb.x_lock_spin_rounds | RATE |
| mysql.innodb.x_lock_spin_waits | RATE |
{:.table}
|`extra_performance_metrics` adds the following metrics:|
|----------|--------|
| mysql.performance.query_run_time.avg | GAUGE |
| mysql.performance.digest_95th_percentile.avg_us | GAUGE |
{:.table}
|`schema_size_metrics` adds the following metric:|
|----------|--------|
| mysql.info.schema.size | GAUGE |
{:.table}
| 37.587444 | 161 | 0.725245 | kor_Hang | 0.348091 |
b119785d814dd6a41d0a6fcb81ff7960dcf7a7fd | 2,370 | md | Markdown | _problems/school/TWOVSTEN.md | captn3m0/codechef | 9b9a127365d1209893e94f8430b909433af6b5f9 | [
"WTFPL"
] | 14 | 2015-11-27T15:49:32.000Z | 2022-02-04T17:31:27.000Z | _problems/school/TWOVSTEN.md | ashrafulislambd/codechef | b192550188e13d7edb211746103fddf049272027 | [
"WTFPL"
] | 40 | 2015-12-16T12:58:07.000Z | 2022-02-02T11:46:05.000Z | _problems/school/TWOVSTEN.md | ashrafulislambd/codechef | b192550188e13d7edb211746103fddf049272027 | [
"WTFPL"
] | 18 | 2015-03-30T09:35:35.000Z | 2020-12-03T14:11:12.000Z | ---
category_name: school
problem_code: TWOVSTEN
problem_name: 'Two vs Ten'
languages_supported:
- C
- CPP14
- JAVA
- PYTH
- 'PYTH 3.5'
- PYPY
- CS2
- 'PAS fpc'
- 'PAS gpc'
- RUBY
- PHP
- GO
- NODEJS
- HASK
- rust
- SCALA
- swift
- D
- PERL
- FORT
- WSPC
- ADA
- CAML
- ICK
- BF
- ASM
- CLPS
- PRLG
- ICON
- 'SCM qobi'
- PIKE
- ST
- NICE
- LUA
- BASH
- NEM
- 'LISP sbcl'
- 'LISP clisp'
- 'SCM guile'
- JS
- ERL
- TCL
- kotlin
- PERL6
- TEXT
- 'SCM chicken'
- CLOJ
- COB
- FS
max_timelimit: '0.5'
source_sizelimit: '50000'
problem_author: altruist_
problem_tester: kingofnumbers
date_added: 20-04-2018
tags:
- altruist_
editorial_url: 'https://discuss.codechef.com/problems/TWOVSTEN'
time:
view_start_date: 1524934802
submit_start_date: 1524934802
visible_start_date: 1524934802
end_date: 1735669800
current: 1525198930
is_direct_submittable: false
layout: problem
---
All submissions for this problem are available.### Read problems statements in [Mandarin chinese](http://www.codechef.com/download/translated/LTIME59/mandarin/TWOVSTEN.pdf), [Russian](http://www.codechef.com/download/translated/LTIME59/russian/TWOVSTEN.pdf) and [Vietnamese](http://www.codechef.com/download/translated/LTIME59/vietnamese/TWOVSTEN.pdf) as well.
Chef Two and Chef Ten are playing a game with a number $X$. In one turn, they can multiply $X$ by $2$. The goal of the game is to make $X$ divisible by $10$. Help the Chefs find the smallest number of turns necessary to win the game (it may be possible to win in zero turns) or determine that it is impossible. ### Input - The first line of the input contains a single integer $T$ denoting the number of test cases. The description of $T$ test cases follows. - The first and only line of each test case contains a single integer denoting the initial value of $X$. ### Output For each test case, print a single line containing one integer — the minimum required number of turns or $-1$ if there is no way to win the game. ### Constraints - $1 \\le T \\le 1000$ - $0 \\le X \\le 10^9$ ### Subtasks \*\*Subtask #1 (100 points):\*\* original constraints ### Example Input ``` 3 10 25 1 ``` ### Example Output ``` 0 1 -1 ```
| 31.6 | 919 | 0.65443 | eng_Latn | 0.809933 |
b11a49f8901d965494467b9ca54a7c73cfeccc8b | 1,163 | md | Markdown | Readme.md | segment-boneyard/go-loggly-cli | cc88f262a7bb226a907af76d3e875797867a9511 | [
"MIT"
] | 7 | 2015-07-01T10:13:16.000Z | 2018-04-05T18:31:24.000Z | Readme.md | segment-boneyard/go-loggly-cli | cc88f262a7bb226a907af76d3e875797867a9511 | [
"MIT"
] | 1 | 2015-01-24T17:07:07.000Z | 2016-09-01T12:30:00.000Z | Readme.md | segmentio/go-loggly-cli | cc88f262a7bb226a907af76d3e875797867a9511 | [
"MIT"
] | 6 | 2015-01-24T17:10:52.000Z | 2018-08-04T19:06:45.000Z |
# Loggly CLI
Loggly search command-line tool.
## Installation
Quick install via go-get:
```
$ go get github.com/segmentio/go-loggly-cli
$ go-loggly-cli --version
```
## Usage
```
Usage: loggly [options] [query...]
Options:
--account <name> account name
--user <name> account username
--pass <word> account password
--size <count> response event count [100]
--from <time> starting time [-24h]
--to <time> ending time [now]
--json output json array of events
--count output total event count
```
## Setup
Loggly's search API requires basic auth credentials, so you _must_ pass
the `--account`, `--user`, and `--pass` flags. To make this less annoying
I suggest creating an alias:
```sh
alias logs='loggly --account segment --user tj --pass something'
```
This is a great place to stick personal defaults as well. Since flags are clobbered
if defined multiple times you can define whatever defaults you'd like here, while
still changing them via `logs`:
```sh
alias logs='loggly --account segment --user tj --pass something --size 5'
```
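For the curious, a Loggly search with basic auth is easy to reproduce outside the CLI. The sketch below builds the request URL and `Authorization` header in Python; note that the endpoint path here is an illustrative assumption, not taken from this tool's source, so consult Loggly's API docs for the real paths:

```python
import base64
from urllib.parse import urlencode

def build_search_request(account, user, password, query, size=100, start="-24h", end="now"):
    # hypothetical endpoint shape; check Loggly's API docs for the real paths
    url = "https://{}.loggly.com/apiv2/search?{}".format(
        account, urlencode({"q": query, "from": start, "until": end, "size": size})
    )
    token = base64.b64encode("{}:{}".format(user, password).encode()).decode()
    return url, {"Authorization": "Basic " + token}

url, headers = build_search_request("segment", "tj", "something", "error")
print(url)  # → https://segment.loggly.com/apiv2/search?q=error&from=-24h&until=now&size=100
```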
## License
MIT | 21.537037 | 84 | 0.649183 | eng_Latn | 0.975958 |
b11a87e1d4afa741708739cef8d82b4b7507525c | 370 | md | Markdown | README.md | MajorBerg/YouTube_Playlist_Reader | 5a282d722c2aaed8f8ade98747a3d77c24afe893 | [
"MIT"
] | null | null | null | README.md | MajorBerg/YouTube_Playlist_Reader | 5a282d722c2aaed8f8ade98747a3d77c24afe893 | [
"MIT"
] | null | null | null | README.md | MajorBerg/YouTube_Playlist_Reader | 5a282d722c2aaed8f8ade98747a3d77c24afe893 | [
"MIT"
] | null | null | null | # YouTube Playlist Reader
The Goal of this project is to read all of the URLs in a YouTube Playlist and then create a playlist for VLC to provide superior control over watching the videos or listening to the music.
## Built With
* [Selenium](https://www.seleniumhq.org/) - The Browser
## Authors
* **Benjamin Salzberg** - [MajorBerg](https://github.com/MajorBerg)
| 30.833333 | 189 | 0.745946 | eng_Latn | 0.972131 |
b11b68c163bab1734c5cecd98876b82b86d0a69d | 28 | md | Markdown | README.md | iisns/iisns.github.io | ef154e4ba5de93c8aac56723f48abd407e513568 | [
"MIT"
] | null | null | null | README.md | iisns/iisns.github.io | ef154e4ba5de93c8aac56723f48abd407e513568 | [
"MIT"
] | null | null | null | README.md | iisns/iisns.github.io | ef154e4ba5de93c8aac56723f48abd407e513568 | [
"MIT"
] | null | null | null | # iisns.github.io
A website
| 9.333333 | 17 | 0.75 | tsn_Latn | 0.391592 |
b11c8a6f17452ff252f506f0ed5b96ff08d50fba | 257 | md | Markdown | _posts/1932-11-19-the-160-foot-alam-r-is.md | MiamiMaritime/miamimaritime.github.io | d087ae8c104ca00d78813b5a974c154dfd9f3630 | [
"MIT"
] | null | null | null | _posts/1932-11-19-the-160-foot-alam-r-is.md | MiamiMaritime/miamimaritime.github.io | d087ae8c104ca00d78813b5a974c154dfd9f3630 | [
"MIT"
] | null | null | null | _posts/1932-11-19-the-160-foot-alam-r-is.md | MiamiMaritime/miamimaritime.github.io | d087ae8c104ca00d78813b5a974c154dfd9f3630 | [
"MIT"
] | null | null | null | ---
title: The 160-foot Alam R is
tags:
- Nov 1932
---
The 160-foot Alam R. is wrecked on a reef near Conception Island [an island around Cuba?].
Newspapers: **Miami Morning News or The Miami Herald**
Page: **1**, Section: **N/A**
| 21.416667 | 92 | 0.614786 | eng_Latn | 0.95627 |
b11ca5e1113262f90c0b076ae8c7492ba7137db1 | 5,309 | md | Markdown | samples/react-page-sections-navigation/README.md | Ramakrishnan24689/sp-dev-fx-webparts | 85c640b56837941e10e2f5e185cbc35b6a94c383 | [
"MIT"
] | 1 | 2022-03-02T19:21:22.000Z | 2022-03-02T19:21:22.000Z | samples/react-page-sections-navigation/README.md | Ramakrishnan24689/sp-dev-fx-webparts | 85c640b56837941e10e2f5e185cbc35b6a94c383 | [
"MIT"
] | null | null | null | samples/react-page-sections-navigation/README.md | Ramakrishnan24689/sp-dev-fx-webparts | 85c640b56837941e10e2f5e185cbc35b6a94c383 | [
"MIT"
] | null | null | null | # Page Sections Navigation
Sample web parts allowing to add sections navigation to the SharePoint page.

## Compatibility




[![SharePoint Server 2016 (Feature Pack 2) incompatible](https://img.shields.io/badge/SharePoint%20Server%202016%20%28Feature%20Pack%202%29-Incompatible-red.svg "SharePoint Server 2016 Feature Pack 2 requires SPFx 1.1")](https://docs.microsoft.com/en-us/office/dev/add-ins/overview/office-add-ins)



## Applies to
* [SharePoint Framework](https://docs.microsoft.com/sharepoint/dev/spfx/sharepoint-framework-overview)
* [Office 365 developer tenant](https://docs.microsoft.com/sharepoint/dev/spfx/set-up-your-developer-tenant)
## Solution
Solution|Author(s)
--------|---------
page-sections-navigation|[Alex Terentiev](https://github.com/AJIXuMuK) (MVP, [Sharepointalist Inc.](http://www.sharepointalist.com), [@alexaterentiev](https://twitter.com/alexaterentiev))
## Version history
Version|Date|Comments
-------|----|--------
1.0|February 27, 2019|Initial release
1.1|March 22, 2019| Update to SPFx 1.8, additional theme, comments
## Minimal Path to Awesome
* clone this repo
* move to right folder
* in the command line run:
* `npm install`
* `gulp bundle --ship`
* `gulp package-solution --ship`
* from the _sharepoint/solution_ folder, deploy the .sppkg file to the App catalog in your tenant
* in the site where you want to test this solution
* add the app named _page-sections-navigation-client-side-solution_
* edit a page
* add _Page Sections Navigation_ web part
* add as much _Page Sections Navigation Anchor_ web parts as you want - each anchor adds an item to the navigation
* configure web parts
> This sample can also be opened with [VS Code Remote Development](https://code.visualstudio.com/docs/remote/remote-overview). Visit https://aka.ms/spfx-devcontainer for further instructions.
## Features
This sample illustrates how to use SharePoint Framework Dynamic Data features to connect web parts on the page.
It also can be used as ready-to-go solution to add page sections navigation to SharePoint pages.
## Custom CSS
The web parts in this sample allow you to use custom CSS to override the styles. You can set the _Custom CSS URL_ property of the *Page Sections Navigation* web part and include CSS classes for both the Navigation and the Anchor in the referenced file.
Please refer to the [custom CSS sample](./assets/psn-custom.css) for an example.
## Help
We do not support samples, but this community is always willing to help, and we want to improve these samples. We use GitHub to track issues, which makes it easy for community members to volunteer their time and help resolve issues.
If you're having issues building the solution, please run [spfx doctor](https://pnp.github.io/cli-microsoft365/cmd/spfx/spfx-doctor/) from within the solution folder to diagnose incompatibility issues with your environment.
You can try looking at [issues related to this sample](https://github.com/pnp/sp-dev-fx-webparts/issues?q=label%3A%22sample%3A%20react-page-sections-navigation%22) to see if anybody else is having the same issues.
You can also try looking at [discussions related to this sample](https://github.com/pnp/sp-dev-fx-webparts/discussions?discussions_q=react-page-sections-navigation) and see what the community is saying.
If you encounter any issues while using this sample, [create a new issue](https://github.com/pnp/sp-dev-fx-webparts/issues/new?assignees=&labels=Needs%3A+Triage+%3Amag%3A%2Ctype%3Abug-suspected%2Csample%3A%20react-page-sections-navigation&template=bug-report.yml&sample=react-page-sections-navigation&authors=@AJIXuMuK&title=react-page-sections-navigation%20-%20).
For questions regarding this sample, [create a new question](https://github.com/pnp/sp-dev-fx-webparts/issues/new?assignees=&labels=Needs%3A+Triage+%3Amag%3A%2Ctype%3Aquestion%2Csample%3A%20react-page-sections-navigation&template=question.yml&sample=react-page-sections-navigation&authors=@AJIXuMuK&title=react-page-sections-navigation%20-%20).
Finally, if you have an idea for improvement, [make a suggestion](https://github.com/pnp/sp-dev-fx-webparts/issues/new?assignees=&labels=Needs%3A+Triage+%3Amag%3A%2Ctype%3Aenhancement%2Csample%3A%20react-page-sections-navigation&template=suggestion.yml&sample=react-page-sections-navigation&authors=@AJIXuMuK&title=react-page-sections-navigation%20-%20).
## Disclaimer
**THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT.**
<img src="https://pnptelemetry.azurewebsites.net/sp-dev-fx-webparts/samples/react-page-sections-navigation" />
| 61.732558 | 364 | 0.780938 | eng_Latn | 0.740382 |
b11e5b95831731c4a3b2d01bd3a5f041f9d5ef1c | 428 | md | Markdown | _posts/2021-07-08/2021-07-06-Innies-are-the-best--lmk-if-you-want-this-video--just-made-it-20210706160523618630.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-07-06-Innies-are-the-best--lmk-if-you-want-this-video--just-made-it-20210706160523618630.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-07-06-Innies-are-the-best--lmk-if-you-want-this-video--just-made-it-20210706160523618630.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | ---
title: "Innies are the best .. lmk if you want this video :) just made it"
metadate: "hide"
categories: [ Pussy ]
image: "https://preview.redd.it/36lbtsah2k971.jpg?auto=webp&s=b520ebd50ba7b5d5787c17b8cf04be51e8f73d68"
thumb: "https://preview.redd.it/36lbtsah2k971.jpg?width=640&crop=smart&auto=webp&s=b1747e32822d455e73ca17dc5a1916e3dae9f73c"
visit: ""
---
Innies are the best .. lmk if you want this video :) just made it
| 42.8 | 124 | 0.757009 | eng_Latn | 0.398955 |
b11e71a9e04f2ccfe927bdc5f8390b4e658b7879 | 2,910 | md | Markdown | source/documentation/iTin.Export.Documentation/Documentation/P_iTin_Export_Model_MailMessageModel_Send.md | iAJTin/iExportEngine | b81fb64c4fd8ebe2f2c22957c9c2f676074054cb | [
"MIT"
] | 40 | 2016-10-25T14:01:03.000Z | 2021-12-07T15:40:54.000Z | source/documentation/iTin.Export.Documentation/Documentation/P_iTin_Export_Model_MailMessageModel_Send.md | iAJTin/iExportEngine | b81fb64c4fd8ebe2f2c22957c9c2f676074054cb | [
"MIT"
] | 1 | 2021-10-08T12:09:38.000Z | 2021-10-08T12:09:38.000Z | source/documentation/iTin.Export.Documentation/Documentation/P_iTin_Export_Model_MailMessageModel_Send.md | iAJTin/iExportEngine | b81fb64c4fd8ebe2f2c22957c9c2f676074054cb | [
"MIT"
] | 5 | 2018-02-01T03:34:30.000Z | 2021-08-16T07:19:23.000Z | # MailMessageModel.Send Property
Gets or sets a value that determines whether to send the message.
**Namespace:** <a href="N_iTin_Export_Model">iTin.Export.Model</a><br />**Assembly:** iTin.Export.Core (in iTin.Export.Core.dll) Version: 2.0.0.0 (2.0.0.0)
## Syntax
**C#**<br />
``` C#
public YesNo Send { get; set; }
```
**VB**<br />
``` VB
Public Property Send As YesNo
Get
Set
```
#### Property Value
Type: <a href="T_iTin_Export_Model_YesNo">YesNo</a><br /><a href="T_iTin_Export_Model_YesNo">Yes</a> if sends the message; otherwise, <a href="T_iTin_Export_Model_YesNo">No</a>. The default is <a href="T_iTin_Export_Model_YesNo">Yes</a>.
## Exceptions
<table><tr><th>Exception</th><th>Condition</th></tr><tr><td>InvalidEnumArgumentException</td><td>The value specified is outside the range of valid values.</td></tr></table>
## Remarks
**ITEE Object Element Usage**<br />
``` XML
<Message Send="Yes|No" ...>
...
</Message>
```
<strong>Compatibility table with native writers.</strong><table><tr><th>Comma-Separated Values<br /><a href="T_iTin_Export_Writers_CsvWriter">CsvWriter</a></th><th>Tab-Separated Values<br /><a href="T_iTin_Export_Writers_TsvWriter">TsvWriter</a></th><th>SQL Script<br /><a href="T_iTin_Export_Writers_SqlScriptWriter">SqlScriptWriter</a></th><th>XML Spreadsheet 2003<br /><a href="T_iTin_Export_Writers_Spreadsheet2003TabularWriter">Spreadsheet2003TabularWriter</a></th></tr><tr><td align="center">X</td><td align="center">X</td><td align="center">X</td><td align="center">X</td></tr></table> A <strong>`X`</strong> value indicates that the writer supports this element.
## Examples
**XML**<br />
``` XML
<Behaviors>
<Downdload LocalCopy="Yes"/>
<TransformFile Execute="Yes" Indented="Yes" Save="Yes" Path="~\Output"/>
<Mail Execute="Yes" Async="Yes" >
<Server>
<Credentials>
<Credential SSL="Yes"
Name="one"
UserName="address@gmail.com"
Password="pwd"
Host="smtp.gmail.com"/>
</Credentials>
</Server>
<Messages>
<Message Credential="one" Send="Yes">
<From Address="emailaddress-one@gmail.com"/>
<To Addresses="emailaddress-two@hotmail.com emailaddress-three@hotmail.com"/>
<CC Addresses="emailaddress-four@hotmail.com emailaddress-five@hotmail.com"/>
<Subject>New report</Subject>
<Body>Hello, this is your report, sending from iTin.Export</Body>
<Attachments>
<Attachment Path="C:\Users\somefile.txt"/>
<Attachment Path="C:\Users\Downloads\Photos Sample.zip"/>
</Attachments>
</Message>
</Messages>
</Mail>
</Behaviors>
```
## See Also
#### Reference
<a href="T_iTin_Export_Model_MailMessageModel">MailMessageModel Class</a><br /><a href="N_iTin_Export_Model">iTin.Export.Model Namespace</a><br /> | 35.925926 | 670 | 0.667354 | yue_Hant | 0.449954 |
b11f69aeea71512a153fd8b0a5b2236368cdb793 | 1,613 | md | Markdown | content/zh/blog/2022/cncf-landscape/index.md | aeraki-framework/website | 8a9244c2b7fd02eb2b8189bb95207d8fe95e9687 | [
"CC-BY-4.0"
] | 1 | 2022-03-14T10:13:58.000Z | 2022-03-14T10:13:58.000Z | content/zh/blog/2022/cncf-landscape/index.md | aeraki-mesh/website | 9d9097a32161d46d26519712a38cf0fa96e7a8b5 | [
"CC-BY-4.0"
] | null | null | null | content/zh/blog/2022/cncf-landscape/index.md | aeraki-mesh/website | 9d9097a32161d46d26519712a38cf0fa96e7a8b5 | [
"CC-BY-4.0"
] | null | null | null | ---
title: Aeraki Mesh Joins the CNCF Cloud Native Landscape
subtitle:
description:
date: 2022-03-02
author: Huabing Zhao
keywords: [aeraki]
---
Aeraki Mesh has recently been officially added to the CNCF Cloud Native Landscape, under the [Service Mesh](https://landscape.cncf.io/card-mode?category=service-mesh&grouping=category) category. The CNCF Landscape helps users understand the concrete software and product choices available at each step of cloud-native practice, so joining the CNCF Landscape means Aeraki Mesh is now formally recognized by the CNCF as a building block of cloud-native best practice.



## What is the CNCF Cloud Native Landscape?

The Cloud Native Computing Foundation (CNCF) is an open source software foundation dedicated to making cloud-native technology universal and sustainable. Cloud-native technology helps enterprises and organizations build and run agile, scalable applications in modern, dynamic environments (such as public, private, and hybrid clouds) through a set of software, specifications, and standards.

CNCF publishes the Cloud Native Landscape to help enterprises and developers quickly understand the full picture of the cloud-native ecosystem and choose appropriate software and tools at each step of their cloud-native practice, which is why it receives so much attention from developers and users.

## What cloud-native problem does Aeraki Mesh solve?

Aeraki Mesh is an open source project in the service mesh space. It addresses the pain point that current service mesh projects only handle the HTTP/gRPC protocols and do not support other open source or proprietary protocols.

Aeraki Mesh helps you manage any layer-7 protocol in a service mesh. Open source protocols such as Dubbo, Thrift, Redis, Kafka, and ZooKeeper are already supported, and you can use the MetaProtocol extension framework provided by Aeraki Mesh to manage layer-7 traffic for proprietary protocols.

Aeraki has already been adopted in several large projects, including 央视频 (CCTV Video), Tencent Music, and 王者破晓, and it stood up to large-scale online traffic during the 2022 Winter Olympics. Aeraki's main features:

* Integrates seamlessly with Istio and is a recommended [Istio Ecosystem](https://istio.io/latest/about/ecosystem/) integration project. You can use Istio + Aeraki to build a full-stack service mesh that manages both HTTP and other layer-7 protocols.
* Supports managing traffic for open source protocols such as Dubbo, Thrift, and Redis in Istio.
* Supports managing proprietary-protocol traffic in Istio with only a few hundred lines of code and no changes to Istio itself.
* Supports request-level load balancing, dynamic routing with arbitrary match conditions, global and local rate limiting, traffic mirroring, and other traffic-management capabilities.
* Provides rich request-level performance metrics, including request latency, errors, and counts, and supports distributed call tracing.

## Want to use Aeraki Mesh or contribute to the community?

Aeraki Mesh is a vendor-neutral open source community that is growing rapidly, and everyone is welcome to join!

Install and try it: https://www.aeraki.net/zh/docs/v1.0/quickstart/

Join a community meeting: https://www.aeraki.net/zh/community/#community-meetings

Give it a star: https://github.com/aeraki-mesh/aeraki
| 36.659091 | 254 | 0.809051 | yue_Hant | 0.912115 |
b11fa99b6b504e918bd796b25722a5eef9d43891 | 2,122 | md | Markdown | docs/usage/custom_cards/custom_card_httpedo13_sun.md | robbinonline/UI | 22462f63a096237c79e24180dd865625f086f0b2 | [
"Apache-2.0"
] | 1 | 2022-01-29T12:42:44.000Z | 2022-01-29T12:42:44.000Z | docs/usage/custom_cards/custom_card_httpedo13_sun.md | robbinonline/UI | 22462f63a096237c79e24180dd865625f086f0b2 | [
"Apache-2.0"
] | 4 | 2021-12-28T13:27:15.000Z | 2022-01-07T13:11:44.000Z | docs/usage/custom_cards/custom_card_httpedo13_sun.md | robbinonline/UI | 22462f63a096237c79e24180dd865625f086f0b2 | [
"Apache-2.0"
] | 2 | 2022-02-23T15:28:41.000Z | 2022-03-24T00:29:43.000Z | ---
title: custom_card_httpedo13_sun
hide:
- toc
---
<!-- markdownlint-disable MD046 -->
# Custom-card "Sun"
The `custom_card_httpedo13_sun` card adapts the `sun card` for minimalist UI.
## Credits
Author: httpedo13 - 2021
Version: 1.0.0
## Changelog
<details>
<summary>1.0.0</summary>
Initial release
</details>
## Requirements
This card uses:
<table>
<tr>
<th>Component / card</th>
<th>required</th>
<th>Link</th>
</tr>
<tr>
<td>Sun integration</td>
<td>yes</td>
<td><a href="https://www.home-assistant.io/integrations/sun/">more info</a></td>
</tr>
<tr>
<td>Sun card</td>
<td>yes</td>
<td><a href="https://github.com/AitorDB/home-assistant-sun-card">more info</a></td>
</tr>
</table>
## Images

## Usage
```yaml
- type: custom:button-card
template: custom_card_httpedo13_sun
variables:
language: 'it'
```
## Variables
The same sun card config.
| Name | Accepted values | Description | Default |
|---------------|----------------------|--------------------------------------|-----------------------------------------------------|
| darkMode | `boolean` | Changes card colors to dark or light | Home assistant dark mode state |
| language | `string`<sup>1</sup> | Changes card language | Home assistant language or english if not supported |
| showAzimuth | `boolean` | Displays azimuth in the footer | `false` |
| showElevation | `boolean` | Displays elevation in the footer | `false` |
| timeFormat | `'12h'`/`'24h'` | Displayed time format | Locale based on Home assistant language |
| title         | `string`             | Card title                           | Doesn't display a title by default                  |
(<sup>1</sup>) Supported languages: `da`, `de`, `en`, `es`, `et`, `fi`, `fr`, `hu`, `it`, `nl`, `pl`, `pt-BR`, `ru`, `sl`, `sv`
| 29.472222 | 143 | 0.50377 | eng_Latn | 0.535218 |
b11ff66f16fe8de82664427752f45e75c982d0ab | 1,539 | md | Markdown | README.md | nju33/postcss-preset | 3c266c8c136a7fed025c474cb1357c84d0fd48ab | [
"MIT"
] | null | null | null | README.md | nju33/postcss-preset | 3c266c8c136a7fed025c474cb1357c84d0fd48ab | [
"MIT"
] | null | null | null | README.md | nju33/postcss-preset | 3c266c8c136a7fed025c474cb1357c84d0fd48ab | [
"MIT"
] | null | null | null | # Postcss preset
[](https://github.com/sindresorhus/xo)
Preset for nju33.
## Install
```bash
yarn add -D @nju33/postcss-preset
```
```js
postcss([
...preset(),
/* other plugins */
])
```
## Plugins
```js
"dependencies": {
"autoprefixer": "^6.7.7",
"css-mqpacker": "^5.0.1",
"cssnano": "^3.10.0",
"postcss-animation": "^0.0.12",
"postcss-assets": "^4.1.0",
"postcss-blokk": "^0.0.2",
"postcss-brand-colors": "^0.4.0",
"postcss-colorblind": "^1.0.0",
"postcss-easings": "^0.3.0",
"postcss-flexbugs-fixes": "^2.1.0",
"postcss-focus": "^1.0.0",
"postcss-font-magician": "^1.6.1",
"postcss-modules": "^0.6.4",
"postcss-octicon": "^0.0.1",
"postcss-resemble-image": "^2.1.1",
"postcss-sprites": "^4.2.0",
"postcss-will-change": "^1.1.0",
"postcss-yu-gothic": "^0.0.5"
},
```
## Options
```js
const defaults = {
// To specify the root path
// (postcssAssets.basePath, postcssAssets.baseURL)
bases: ['.', '/'],
// That of `autoprefixer.browsers`
browsers: ['> 3%', 'last 2 versions'],
// Setting `postcssFontMagician.variants`
variants: {},
// Specify project image directory
// (`postcssAssets.loadPaths`)
imagePaths: [],
// Specify whether to modularize
// (`postcssModules`)
module: false, // or `({fileName, json}) => {}`
// Minify if it is true
// (`cssMqpacker` & `cssnano`)
minify: false // or `true`
}
```
## License
The MIT License (MIT)
Copyright (c) 2017 nju33 <nju33.ki@gmail.com>
| 21.082192 | 109 | 0.60104 | yue_Hant | 0.334053 |
b1207950c914e2b7b2d51f38497d22e0b8337823 | 1,926 | md | Markdown | README.md | nateshao/nateshao-cv | 1ec0b1a83beb61562e74d19ad5cf0ebc8430d2ae | [
"MIT"
] | null | null | null | README.md | nateshao/nateshao-cv | 1ec0b1a83beb61562e74d19ad5cf0ebc8430d2ae | [
"MIT"
] | null | null | null | README.md | nateshao/nateshao-cv | 1ec0b1a83beb61562e74d19ad5cf0ebc8430d2ae | [
"MIT"
] | null | null | null | ## [Java软件工程师简历](http://zhousiwei.gitee.io/cv/)
- [English](README.en.md) | 中文
> **Feel free to use and star this project. If you run into any problems, open an issue and I will do my best to improve it.**
## Introduction
- The feature set is still small; comments and suggestions are welcome!
- **This project is based on: [https://github.com/zhaoky/flqin](https://github.com/zhaoky/flqin "https://github.com/zhaoky/flqin"). Deep respect to the original author.**
## Online preview
> **Preview ➡️ [https://nateshao.gitee.io/cv](https://nateshao.gitee.io/cv)**
## Screenshots





## Local preview
1. Install `node.js/npm`
> See my article ➡️ [Installing node and npm on Linux or Windows](https://www.jianshu.com/p/f8b0a4f7a822)
2. Run
```bash
cd nateshao-cv
npm install
npm run dev
```
## Build and deploy
```bash
npm run build // or: yarn run build
```
## Project support
### [@korey/MVVM](https://github.com/zhaoky/mvvm)
> A simple MVVM framework that currently implements data binding and view refresh. It is still being improved; learning and exchange are welcome.
### [@korey/Fullpage](https://github.com/zhaoky/fullpage)
> A simple full-page framework that currently implements full-screen page flipping via pull up/down, mouse-wheel scrolling, and direct anchor navigation. It is still being improved; learning and exchange are welcome.
**Always working hard, never giving up**
> Not impetuous, not extreme, not stuck in the past; loves programming, back-end development, and new technology, loves sports and travel, with great execution and learning ability!
# About me
- [Personal blog](https://nateshao.gitee.io/)
- [GitHub](https://github.com/nateshao)
- [Gitee](https://gitee.com/nateshao)
- **Main technologies: `Java back-end development`, `WeChat Official Account development`, `open source enthusiast`, `Linux`**
## License
[Apache License](./LICENSE)
## Buy the author a steamed bun 💚
| Alipay | WeChat |
| ------------------------------------------------------------ | ------------------------------------------------------------ |
| <img width="200" height="200" src="https://nateshao.gitee.io/medias/reward/alipay.jpg"/> | <img width="200" height="200" src="https://nateshao.gitee.io/medias/reward/wechat.png"/> |
| 25.342105 | 183 | 0.601246 | yue_Hant | 0.733977 |
b121a2ecae5dbfbd7718ba7c2bc08a7316c7a5ec | 9,179 | md | Markdown | docs/boot2root.md | charlesreid1-docker/wisko | 7b83f40494fb2e90315c2729e561fdb02d082e2b | [
"MIT"
] | null | null | null | docs/boot2root.md | charlesreid1-docker/wisko | 7b83f40494fb2e90315c2729e561fdb02d082e2b | [
"MIT"
] | null | null | null | docs/boot2root.md | charlesreid1-docker/wisko | 7b83f40494fb2e90315c2729e561fdb02d082e2b | [
"MIT"
] | null | null | null | # wisko
## Getting to a Shell
### Hardware
wisko is a higher-end 1-CPU Digital Ocean droplet.
* 1 CPU
* 50 GB SSD
* 2 GB RAM
IP address: 159.65.75.41
### Spin Up Hardware
* Follow the Digital Ocean steps to spin up a droplet in the SF region.
* Once the droplet is created, get the IP address.
* Pro tip: find your ssh key in `~/.ssh/id_rsa.pub` and paste it in during the Droplet creation process to avoid password issues.
### Set DNS Records
Add the following DNS records:
**A Record**:
* Record: none (set to @ automatically)
* Value: 138.68.10.168
**A Record**:
* Record: git
* Value: 138.68.10.168
**CNAME Record**:
* Record:
### Connect to Droplet
Check that the Droplet has been booted and is listening:
```plain
$ ping -c 4 138.68.10.168
PING 138.68.10.168 (138.68.10.168): 56 data bytes
64 bytes from 138.68.10.168: icmp_seq=0 ttl=56 time=28.494 ms
64 bytes from 138.68.10.168: icmp_seq=1 ttl=56 time=26.120 ms
64 bytes from 138.68.10.168: icmp_seq=2 ttl=56 time=25.972 ms
64 bytes from 138.68.10.168: icmp_seq=3 ttl=56 time=27.232 ms
--- 138.68.10.168 ping statistics ---
4 packets transmitted, 4 packets received, 0.0% packet loss
round-trip min/avg/max/stddev = 25.972/26.955/28.494/1.014 ms
```
Now connect to the droplet. By default, you log in as as the root user:
```
$ ssh root@138.68.10.168
[...bunch of login stuff...]
root@ubuntu-s-1vcpu-2gb-sfo2-01:~#
```
Now we're ready to get started.
## Prepare for LAMP
[See DO guide](https://www.digitalocean.com/community/tutorials/how-to-install-linux-apache-mysql-php-lamp-stack-on-ubuntu-16-04)
### Make Normal User
Add a non-root user:
```
adduser melo
passwd melo
usermod -aG sudo melo
```
(While we're at it, create a git user as well for gitea):
```
adduser git
passwd git
```
Prepare to SSH as that user:
```
mkdir /home/melo/.ssh
chown melo:melo /home/melo/.ssh
chmod 700 /home/melo/.ssh
chmod 600 /home/melo/.ssh/authorized_keys
```
SSH as this user in a separate window (keep one window open and logged in as root!), and test sudo abilities:
```
sudo whoami
```
Disable root login via ssh:
```
vim /etc/ssh/sshd_config
```
Change `PermitRootLogin` to `no`.
Restart SSH service:
```
sudo service ssh restart
```
Now log out and log back in as user melo.
### Dotfiles Bootstrap
[Dotfiles link](https://charlesreid1.com:3000/dotfiles/wisko/src/master/install_packages.sh)
Start out with the wisko dotfiles repository:
```
apt-get install git
cd ~
git clone https://charlesreid1.com:3000/dotfiles/wisko.git
cd wisko
```
Prepare for bootstrap:
```
./pre_bootstrap.sh
```
Now bootstrap:
```
./bootstrap.sh
```
Then set the machine name:
```
sudo ./set_machine_name.sh
```
Now log out and log back in.
### Aptitude
Install a bunch of the packages that are needed:
```
cd wisko
sudo ./install_packages.sh
```
Now we're ready to start on the LAMP server.
## Installing Apache
```
sudo apt-get install -y apache2
```
Edit Apache config file:
```
sudo vim /etc/apache2/sites-enabled/000-default.conf
```
Set the server name:
```
ServerName www.allthehatsformaps.com
...
DocumentRoot /var/www/htdocs/
```
Also make note of Apache username, should be `www-data`.
Check for syntax errors:
```
sudo apache2ctl configtest
```
Restart Apache:
```
sudo service apache2 restart
```
## Installing MySQL
```
sudo apt-get install -y mysql-server
```
This will ask you to set a root password.
Once the installation is complete, run:
```
mysql_secure_installation
```
to lock down MySQL. Do not configure `VALIDATE PASSWORD PLUGIN`.
## Installing PHP
```
sudo apt-get install -y php libapache2-mod-php php-mcrypt php-mysql php-cli
```
Make Apache look for `index.php` by editing:
```
/etc/apache2/mods-enabled/dir.conf
```
and changing:
```
DirectoryIndex index.html
```
to this:
```
DirectoryIndex index.php index.html
```
Restart apache:
```
sudo service apache2 restart
```
## Installing Wordpress
[boom](https://codex.wordpress.org/Installing_WordPress#Famous_5-Minute_Installation)
Download latest Wordpress:
```
cd /tmp
wget https://wordpress.org/latest.tar.gz
tar xzf latest.tar.gz
```
This extracts to `wordpress/`. Now open MySQL and create a database for wordpress:
```
$ mysql -u root -p
mysql> CREATE DATABASE wordpress;
mysql> CREATE USER 'wpsql'@'localhost' IDENTIFIED BY "yourpasswordgoeshere";
mysql> GRANT ALL PRIVILEGES ON wordpress.* TO "wpsql"@"localhost";
```
This creates a MySQL user `wpsql` and a MySQL database called `wordpress`.
Configure Wordpress by copying example config file to actual config file:
```
cp wp-config-sample.php wp-config.php
```
Edit `wp-config.php` and change the following:
* DB_NAME
* DB_USER
* DB_PASSWORD
* DB_HOST
* DB_CHARSET
* DB_COLLATE
Use [online secret generator](https://api.wordpress.org/secret-key/1.1/salt/) to set secret key values.
Now move the wordpress folder to your web root:
```
mv wordpress /var/www/htdocs/wordpress/
```
Change permissions so wordpress directory is owned by the Apache user:
```
chown -R www-data:www-data /var/www/htdocs/wordpress
```
Visit the Wordpress site and set it up:
```
<ip-addr-of-machine>/wordpress
```
Once you have set up the Wordpress site, you should protect the `wp-config.php` file:
```
sudo chown melo:melo wp-config.php
```
### FTP Server
To upgrade Wordpress you need an ftp server running. vsftpd is a lightweight ftp server. To install it:
```
sudo apt-get install vsftpd
```
By default, the `local_enable` option is set in `/etc/vsftpd.conf`, meaning you can log in to the ftp server using system credentials.
To check and make sure the process of uploading files via FTP works, try upgrading a plugin or installing/removing a plugin.
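For reference, the relevant part of `/etc/vsftpd.conf` for this setup might look like the sketch below. Only `local_enable` comes from the text above; the other two settings are common choices and are assumptions to verify against your own requirements:

```ini
# /etc/vsftpd.conf (excerpt, illustrative)
local_enable=YES       # log in with system account credentials
write_enable=YES       # allow uploads, e.g. Wordpress plugin files
chroot_local_user=YES  # confine FTP users to their home directories
```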
## Installing Gitea
### Goenv: To Manage Go Version
Start by installing goenv:
```
git clone https://github.com/syndbg/goenv.git ~/.goenv
```
Now add .goenv to your path in .bash_profile so goenv is a command
```
# add these line to ~/.bash_profile
export GOENV_ROOT="$HOME/.goenv"
export PATH="$GOENV_ROOT/bin:$PATH"
```
Now source it:
```
source ~/.bash_profile
```
List latest versions of go:
```
goenv install -l
```
Pick one to install:
```
goenv install 1.7.5
goenv global 1.7.5
```
Now set this as the go version:
```
eval "$(goenv init -)"
```
You will need to execute the above command **each time you wish to use the goenv version of Go.**
(Alternatively, you can add it to `.bashrc` to run it in each new shell.)
### Gitea: Git Web Server
[Installation of gitea from source](https://docs.gitea.io/en-us/install-from-source/)
```
go get -d -u code.gitea.io/gitea
cd $GOPATH/src/code.gitea.io/gitea
```
Now check out the version of gitea that you want to use:
```
git branch -a
git checkout origin/release/v1.2
```
Build with tag `bindata`:
```
TAGS="bindata" make generate build
```
The gitea binary is entirely self-contained. Before you run the binary, create a folder for gitea to use to store repositories (this should be somewhere you have read/write access, like `~/.gitea`):
```
mkdir ~/.gitea
```
We will point gitea to this directory in the next step. To run the binary from the current directory as the current user:
```
./gitea web
```
Now you can navigate to `<server-ip-address>:3000` to set up gitea.
### Set Up Gitea (via Browser)
Visit `<server-ip-address>:3000` in the browser to set up Gitea. You will need to set up a database, and we can use MySQL again. First, create a gitea user in MySQL:
```
$ mysql -u root -p
mysql> CREATE DATABASE gitea;
mysql> CREATE USER 'giteasql'@'localhost' IDENTIFIED BY "yourpasswordgoeshere";
mysql> GRANT ALL PRIVILEGES ON gitea.* TO "giteasql"@"localhost"
```
Now you should be able to punch in all your settings. Make sure you change the address of the app from `localhost:3000` to `<server-ip-address>:3000`.
### Updating Configuration/Templates
The path where gitea is installed is here:
```
/home/melo/gocode/src/code.gitea.io/gitea
```
available as a shortcut in the home directory:
```
$ ll ~/
...
lrwxrwxrwx 1 melo melo 30 Feb 24 22:17 gitea -> gocode/src/code.gitea.io/gitea/
```
Configuration file for gitea is located here:
```
~/gitea/custom/conf/app.ini
```
If you change the config file or any page templates, you will have to re-build the binary for the changes to take effect.
To rebuild the go binary, just set your go version with goenv and re-execute the make command from above:
```
eval "$(goenv init -)"
cd ~/gitea
TAGS="bindata" make generate build
```
### Errors
Note: if you see the following error, check which version of go you are using:
```
go build -i -v -tags 'bindata' -ldflags '-s -w -X "main.Version=1.2.3" -X "main.Tags=bindata"' -o gitea
vendor/code.gitea.io/git/command.go:9:2: cannot find package "context" in any of:
/home/melo/gocode/src/code.gitea.io/gitea/vendor/context (vendor tree)
/usr/lib/go-1.6/src/context (from $GOROOT)
/home/melo/gocode/src/context (from $GOPATH)
Makefile:205: recipe for target 'gitea' failed
make: *** [gitea] Error 1
```
When you run `which go` you should see
```
$ which go
/home/melo/.goenv/shims/go
```
If you see this, you will have problems:
```
$ which go
/usr/bin/go
```
| 19.202929 | 198 | 0.71206 | eng_Latn | 0.836047 |
b122489f402c942b5d23908646ce584a0697f0f6 | 2,770 | md | Markdown | README-zh_CN.md | jsmini/md5 | 2de6257e97f48e554b155a26902416dc26265c57 | [
"MIT"
] | 2 | 2020-07-23T18:53:37.000Z | 2021-01-28T10:11:50.000Z | README-zh_CN.md | sjuhyeon/md5 | 2de6257e97f48e554b155a26902416dc26265c57 | [
"MIT"
] | 1 | 2021-05-07T21:08:00.000Z | 2021-05-07T21:08:00.000Z | README-zh_CN.md | sjuhyeon/md5 | 2de6257e97f48e554b155a26902416dc26265c57 | [
"MIT"
] | 1 | 2022-02-16T17:49:17.000Z | 2022-02-16T17:49:17.000Z | # [md5](https://github.com/jsmini/md5)
[](https://github.com/yanhaijing/jslib-base)
[](https://github.com/jsmini/md5/blob/master/LICENSE)
[](https://travis-ci.org/jsmini/md5)
[](https://coveralls.io/github/jsmini/md5)
[](https://www.npmjs.com/package/@jsmini/md5)
[](http://www.npmtrends.com/@jsmini/md5)
[](http://isitmaintained.com/project/jsmini/base "Percentage of issues still open")
An MD5 hash generation function.
[English](./README.md) | 简体中文
## Compatibility
Unit tests guarantee support for the following environments:
| IE   | CH   | FF   | SF   | OP   | iOS  | Android | Node  |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- | ----- |
| 6+ | 23+ | 4+ | 6+ | 10+ | 5+ | 2.3+ | 0.10+ |
**Note: the compiled code depends on an ES5 environment; to support IE6-8 you need to include [es5-shim](http://github.com/es-shims/es5-shim/) — see the example in [demo/demo-global.html](./demo/demo-global.html)**
## Directory structure
```
.
├── demo           usage demos
├── dist           compiled build output
├── doc            project documentation
├── src            source code
├── test           unit tests
├── CHANGELOG.md   changelog
└── TODO.md        planned features
```
## Usage
Install via npm:
```bash
$ npm install --save @jsmini/md5
```
For a Node environment:
```js
var name = require('@jsmini/md5').name;
```
For webpack and similar environments:
```js
import { name } from '@jsmini/md5';
```
For a RequireJS environment:
```js
requirejs(['node_modules/@jsmini/md5/dist/index.aio.js'], function (jsmini_md5) {
var name = jsmini_md5.name;
})
```
For a browser environment:
```html
<script src="node_modules/@jsmini/md5/dist/index.aio.js"></script>
<script>
var name = jsmini_md5.name;
</script>
```
## Documentation
[API](https://github.com/jsmini/md5/blob/master/doc/api.md)
## Contributing 
Install the dependencies before the first run:
```bash
$ npm install
```
Build the production code in one step:
```bash
$ npm run build
```
Run the unit tests; browser tests must be run manually from `test/browser`:
```bash
$ npm test
```
Update the version number in package.json and README.md, update CHANGELOG.md, then release the new version:
```bash
$ npm run release
```
Publish the new version to npm:
```bash
$ npm publish --access=public
```
Rename the project. This is needed when you first initialize the project, or later when you want to rename it. Modify `fromName` and `toName` in `rename.js`, and the names in the following files are updated automatically:

- the info in README.md
- the info in package.json
- the info in config/rollup.js
- the repository name in test/browser/index.html
```bash
$ npm run rename # rename command
```
## Contributors
[contributors](https://github.com/jsmini/md5/graphs/contributors)
## Changelog
[CHANGELOG.md](https://github.com/jsmini/md5/blob/master/CHANGELOG.md)
## Roadmap
[TODO.md](https://github.com/jsmini/md5/blob/master/TODO.md)
## Who is using
| 21.472868 | 170 | 0.670758 | yue_Hant | 0.63703 |
b1235172b5efde83204b36610aa90b39d4096803 | 602 | md | Markdown | templates/zerver/help/include/install-matrix.md | fearless0307/zulip | 378d14af7ea73a9a83c7245706cd918bec5a37bf | [
"Apache-2.0"
] | 4 | 2019-06-04T09:06:53.000Z | 2019-06-04T09:07:47.000Z | templates/zerver/help/include/install-matrix.md | fearless0307/zulip | 378d14af7ea73a9a83c7245706cd918bec5a37bf | [
"Apache-2.0"
] | 58 | 2018-11-27T15:18:54.000Z | 2018-12-09T13:43:07.000Z | templates/zerver/help/include/install-matrix.md | fearless0307/zulip | 378d14af7ea73a9a83c7245706cd918bec5a37bf | [
"Apache-2.0"
] | 9 | 2019-11-04T18:59:29.000Z | 2022-03-22T17:46:37.000Z | ### Install the bridge software
1. Clone the Zulip API repository, and install its dependencies.
```
git clone https://github.com/zulip/python-zulip-api.git
cd python-zulip-api
python3 ./tools/provision
```
This will create a new Python virtualenv. You'll run the bridge service
inside this virtualenv.
1. Activate the virtualenv by running the `source` command printed
at the end of the output of the previous step.
1. Install the Matrix bridge software in your virtualenv, by running:
```
pip install -r zulip/integrations/matrix/requirements.txt
```
| 27.363636 | 75 | 0.714286 | eng_Latn | 0.966184 |
b123685673058537551a05d407221ddcb868863a | 4,652 | md | Markdown | README.md | Triveni07/Classic-Arcade-Game | c55e51a336f4767f2d026a1013e995645975706c | [
"Unlicense"
] | null | null | null | README.md | Triveni07/Classic-Arcade-Game | c55e51a336f4767f2d026a1013e995645975706c | [
"Unlicense"
] | null | null | null | README.md | Triveni07/Classic-Arcade-Game | c55e51a336f4767f2d026a1013e995645975706c | [
"Unlicense"
] | null | null | null | # Classic Arcade Game Project
### Description:
* In this game there is a Player and Enemies (bugs).
* The goal of the player is to reach the water without colliding with any of the enemies.
* The player can move left, right, up, and down. The enemies move at varying speeds on the paved block portion of the scene.
* Once the player collides with an enemy, the game is reset and the player moves back to the start square. Once the player reaches the water, the game is won.
* The user can start the game by clicking the start button.
* Only one click event is registered, to make sure the user can run only one game session at a time.
* Once the button is clicked, the timer starts and the user can move the player one block at a time with the arrow keys, trying to reach the water level while avoiding bug bites.
* When the player reaches the top of the platform (the water level), the page displays a modal pop-up with a victory message and an option to replay.
* Likewise, when a bug bites (hits) the player, the page displays a modal pop-up with a failure message and buttons to replay, e.g. **Play again** and **Close**.
### Table of Contents -_Functional details:_
* Inside the app.js file, implemented the Player and the Enemy classes, using Object-Oriented JavaScript.
* Three classes have been implemented in total: an `Entity` superclass, plus `player` and `enemy` subclasses that possess their own unique characteristics as well as those inherited from the parent class.
##### The Enemy function, which initiates the Enemy by:
- Loading the image by setting this.sprite to the appropriate image in the image folder
- Setting the Enemy initial location
- Setting the Enemy speed
- The update method for the Enemy
- Updates the Enemy location
- Handles collision with the Player
##### The Player function, which initiates the Player by:
- Loading the image by setting this.sprite to the appropriate image in the image folder
- Setting the Player initial location
- The update method for the Player
- The render method for the Player
- The handleInput method receives the user input, allowedKeys (the key which was pressed), and moves the player according to that input. For example:
 the left key moves the player to the left, the right key to the right, up moves the player up, and down moves the player down.
- The player cannot move off screen.
- When the player reaches the water, the game resets by moving the player back to the initial location and the bugs back to their initial speeds.
- Finally, the player and enemy objects are instantiated.
* Created a new Player object
* Created several new Enemy objects and placed them in an array called allEnemies
##### The global scope functions, which support the game functionality by:
- Updating timer on the screen
- Displaying result message on either failure or success.
- Game reset and start game functionality.
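As a rough, canvas-free sketch of the class structure and collision handling described above (the class shapes, positions, and the collision threshold are illustrative assumptions, not the exact values from app.js):

```javascript
// Illustrative sketch of the Entity/Player/Enemy hierarchy (not the real app.js).
class Entity {
  constructor(x, y, sprite) {
    this.x = x;           // horizontal position in pixels
    this.y = y;           // vertical position in pixels
    this.sprite = sprite; // path of the image drawn by render()
  }
}

class Player extends Entity {
  constructor() {
    super(202, 405, 'images/char-boy.png'); // start square
  }
  reset() {
    // Send the player back to the start square after a collision.
    this.x = 202;
    this.y = 405;
  }
}

class Enemy extends Entity {
  constructor(x, y, speed) {
    super(x, y, 'images/enemy-bug.png');
    this.speed = speed; // pixels per second
  }
  update(dt, player) {
    // Movement is scaled by dt so speed is frame-rate independent.
    this.x += this.speed * dt;
    if (this.x > 505) this.x = -101; // wrap around once off screen
    // Collision: same row and horizontally close enough to overlap.
    if (this.y === player.y && Math.abs(this.x - player.x) < 50) {
      player.reset();
    }
  }
}
```

In the real game, the engine calls `update(dt)` for every enemy on each animation frame and then calls each entity's `render()` method to draw `this.sprite` onto the canvas.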
### Installation:
To install this game, open index.html from the source folder.
Alternatively, locate index.html inside the project directory on your PC and open that path in a browser.
### Dependencies:
_Include below dependencies inside head element of index html:_
* [Google fonts](https://fonts.googleapis.com/css?family=AlfaSlabOne) used to style the fonts of game heading.
* [Google fonts](https://fonts.googleapis.com/css?family=Orbitron) used to style the digital fonts of timer
### Usage:
Once the web page loads successfully, start the game by clicking the Start the Game button.
The timer then starts, the bugs begin roaming across the screen, and the player can be moved with the arrow keys to reach the water level.
### Credits: _Triveni Vikrant Londhe._
### License:
MIT License
Copyright (c) [2018] [Triveni Vikrant Londhe]
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
| 59.641026 | 183 | 0.782674 | eng_Latn | 0.996149 |
b123ee416bc9172690f9fafa4904a55d1913e1d3 | 1,937 | md | Markdown | docs/code-quality/c6297.md | tommorris/visualstudio-docs.fr-fr | dd3606399fd617d5584bbf08bbe616bbdbb36401 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/code-quality/c6297.md | tommorris/visualstudio-docs.fr-fr | dd3606399fd617d5584bbf08bbe616bbdbb36401 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/code-quality/c6297.md | tommorris/visualstudio-docs.fr-fr | dd3606399fd617d5584bbf08bbe616bbdbb36401 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: C6297
ms.date: 11/04/2016
ms.prod: visual-studio-dev15
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- C6297
helpviewer_keywords:
- C6297
ms.assetid: 17b585f0-75e5-4fc0-935a-143ec67659f4
author: mikeblome
ms.author: mblome
manager: wpickett
ms.workload:
- multiple
ms.openlocfilehash: bc386492117eb6eced4d5d14f9f8421e06351052
ms.sourcegitcommit: e13e61ddea6032a8282abe16131d9e136a927984
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 04/26/2018
ms.locfileid: "31891452"
---
# <a name="c6297"></a>C6297
warning C6297: Arithmetic overflow: a 32-bit value is shifted, then cast to a 64-bit value. The result may not be the expected value
This warning indicates incorrect behavior that results from the integral promotion rules and from using types larger than those in which arithmetic is typically performed.
In this case, a 32-bit value was shifted left and the result of that shift was converted to a 64-bit value. If the shift overflowed the 32-bit value, those bits are lost.
If you do not want to lose bits, cast the value being shifted to a 64-bit quantity before shifting it. If you do intend to lose bits, cast to the appropriate unsigned long or short type, or mask the result of the shift, to suppress this warning and make the intent of the code clearer.
## <a name="example"></a>Example
The following code generates this warning:
```
void f(int i)
{
unsigned __int64 x;
x = i << 34;
// code
}
```
To correct this warning, use the following code:
```
void f(int i)
{
unsigned __int64 x;
// code
x = ((unsigned __int64)i) << 34;
}
```
## <a name="see-also"></a>See also
[Compiler Warning (level 1) C4293](/cpp/error-messages/compiler-warnings/compiler-warning-level-1-c4293) | 33.396552 | 345 | 0.758389 | fra_Latn | 0.965861 |
b124612b81d4712c21667eed62367c94dc8402dc | 2,615 | md | Markdown | docs/_docs/ssl_errors.md | boltops-tools/ufo-v5 | cab75cf387cd45369cc7f726552e78fea853994b | [
"MIT"
] | 29 | 2020-08-26T22:32:41.000Z | 2022-03-10T13:16:22.000Z | docs/_docs/ssl_errors.md | boltops-tools/ufo-v5 | cab75cf387cd45369cc7f726552e78fea853994b | [
"MIT"
] | 10 | 2020-10-07T21:02:33.000Z | 2022-03-26T21:43:12.000Z | docs/_docs/ssl_errors.md | boltops-tools/ufo-v5 | cab75cf387cd45369cc7f726552e78fea853994b | [
"MIT"
] | 2 | 2020-12-31T02:26:37.000Z | 2021-04-16T00:31:32.000Z | ---
Title: SSL Errors
---
UFO uses the AWS Ruby SDK and the underlying default SSL certificate chain configured in your active Ruby and
OpenSSL to communicate with your AWS environment. This means that you _must correctly configure_ your Ruby and OpenSSL to have all the needed ROOT certificates for UFO to be able to communicate with AWS - _especially_ if you are behind a proxy or a corporate SSL proxy.
If you are behind a corporate SSL proxy and have not updated your system, OpenSSL, and Ruby certificate chains to include the needed corporate root certificates, you will see errors such as:
```
Seahorse::Client::NetworkingError: SSL_connect returned=1 errno=0 state=error: certificate verify failed (self signed certificate in certificate chain)
~/.rbenv/versions/2.6.0/lib/ruby/2.6.0/net/protocol.rb:44:in `connect_nonblock'
~/.rbenv/versions/2.6.0/lib/ruby/2.6.0/net/protocol.rb:44:in `ssl_socket_connect'
~/.rbenv/versions/2.6.0/lib/ruby/2.6.0/net/http.rb:996:in `connect'
~/.rbenv/versions/2.6.0/lib/ruby/2.6.0/net/http.rb:930:in `do_start'
~/.rbenv/versions/2.6.0/lib/ruby/2.6.0/net/http.rb:925:in `start'
```
## Helper Scripts
The `docs/utils` directory has a few scripts that should be able to help you resolve these issues and track down which certs are giving you problems.
- `ssl-doctor.rb` is from the very useful examples at <https://github.com/mislav/ssl-tools>, and it can help you find the missing ROOT cert in your certificate chain and give suggestions on getting OpenSSL working correctly.
- `update-cert-chains.sh` will help you update your Ruby and OpenSSL chains by adding in the missing ROOT cert and also pulling in the OSX System Root to your rbenv environment.
- `test-aws-api-access.rb` should now return a list of the S3 buckets for the current AWS profile that is active.
## Trouble-shooting
### Update Brew and OpenSSL
- `brew update`
- `brew upgrade openssl`
### Use the Helper Scripts to find the trouble spot
Once you have updated OpenSSL and your `brew` packages, use the helper scripts above to see if you can track down the missing certificate in your certificate chain.
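Before digging into the helper scripts, a quick sanity check is to ask your active Ruby where its OpenSSL looks for trusted root certificates; if your corporate root cert is in neither location, the errors above are expected. The sketch below is only a diagnostic aid (it is not one of the helper scripts in `docs/utils`):

```ruby
# Quick diagnostic: where does this Ruby's OpenSSL look for root certificates?
require 'openssl'

puts "OpenSSL version:   #{OpenSSL::OPENSSL_VERSION}"
puts "Default cert file: #{OpenSSL::X509::DEFAULT_CERT_FILE}"
puts "Default cert dir:  #{OpenSSL::X509::DEFAULT_CERT_DIR}"

# Build a store with the system defaults -- the same chain the AWS SDK uses.
store = OpenSSL::X509::Store.new
store.set_default_paths
puts "Cert file exists?  #{File.exist?(OpenSSL::X509::DEFAULT_CERT_FILE)}"
```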
The `update-cert-chain.sh` file was created using the suggestions from <https://gemfury.com/help/could-not-verify-ssl-certificate/>. Please review the information at <https://gemfury.com/help/could-not-verify-ssl-certificate/> if the `Helper Scripts` above do not fully resolve your issue.
The `test-aws-api-access.rb` uses examples from the <https://docs.aws.amazon.com/sdk-for-ruby/v3/developer-guide/quick-start-guide.html> for using and configuring the Ruby AWS SDK on your system. | 65.375 | 289 | 0.770554 | eng_Latn | 0.986442 |
b1250dd68b4a02eeeade3d3173a47cbd0aaaa1e3 | 301 | md | Markdown | presto/CHANGELOG.md | marcoferrer/integrations-core | 1977babbdede0a0fcf402669e583a09480e7bbb3 | [
"BSD-3-Clause"
] | null | null | null | presto/CHANGELOG.md | marcoferrer/integrations-core | 1977babbdede0a0fcf402669e583a09480e7bbb3 | [
"BSD-3-Clause"
] | null | null | null | presto/CHANGELOG.md | marcoferrer/integrations-core | 1977babbdede0a0fcf402669e583a09480e7bbb3 | [
"BSD-3-Clause"
] | null | null | null | # CHANGELOG - Presto
## 1.0.1 / 2019-05-14
* [Fixed] Update the log path for the presto integration. See [#3416](https://github.com/DataDog/integrations-core/pull/3416).
## 1.0.0 / 2019-03-29
* [Added] Adds Presto Integration. See [#3131](https://github.com/DataDog/integrations-core/pull/3131).
| 27.363636 | 126 | 0.700997 | kor_Hang | 0.288684 |
b126549d5255ffe1b1447ef8e586ae0000aa4034 | 2,302 | md | Markdown | controls/gantt/CHANGELOG.md | geneeblack/ej2-javascript-ui-controls | 11f1afcae42881d6f9483651ffe11d9084aa53a2 | [
"Net-SNMP",
"Xnet"
] | null | null | null | controls/gantt/CHANGELOG.md | geneeblack/ej2-javascript-ui-controls | 11f1afcae42881d6f9483651ffe11d9084aa53a2 | [
"Net-SNMP",
"Xnet"
] | null | null | null | controls/gantt/CHANGELOG.md | geneeblack/ej2-javascript-ui-controls | 11f1afcae42881d6f9483651ffe11d9084aa53a2 | [
"Net-SNMP",
"Xnet"
] | null | null | null | # Changelog
## [Unreleased]
## 17.1.49 (2019-05-29)
### Gantt
#### Bug Fixes
- #F144145 - Task Id duplication issue while adding new record has been fixed.
## 17.1.47 (2019-05-14)
### Gantt
#### Bug Fixes
- #233041 - Alignment issue with timeline and vertical lines has been fixed.
#### New Features
- #F143360 - Provided support to refresh the `dataSource` dynamically.
## 17.1.43 (2019-04-30)
### Gantt
#### Bug Fixes
- Bug fixes included.
## 17.1.40 (2019-04-09)
### Gantt
#### Bug Fixes
- Internal bug fixes included.
## 17.1.32-beta (2019-03-13)
### Gantt
- **Data sources** – Bind hierarchical or self-referential data to Gantt chart with an array of JavaScript objects or DataManager.
- **Timeline** – Display timescale from minutes to decades easily, and also display custom texts in the timeline units. Timeline can be displayed in either one-tier or two-tier layout.
- **Customizable Taskbars** – Display various tasks in a project using child taskbar, summary taskbar and milestone UI, which can also be customized with templates.
- **Unscheduled tasks** – Support for displaying tasks with undefined start date, end date or duration in a project.
- **Baselines** – Display the deviations between planned dates and actual dates of a task in a project using baselines.
- **CRUD actions** – Provides the options to dynamically insert, delete and update tasks using columns, dialog and taskbar editing options.
- **Task dependency** – Define or update the dependencies between the tasks in a project with four types of task dependencies Finish – Start, Start – Finish, Finish – Finish, Start – Start.
- **Markers and indicators** - Support for displaying indicators and flags along with taskbars and task labels. Also map important events in a project using event marker.
- **Filtering** – Offers filtering of the Gantt content using column menu filtering along with a toolbar search box.
- **Customizable columns** – Customize the columns and add custom columns to Gantt chart at initialization through column property.
- **Enriched UI** – Support for Material, bootstrap, fabric and high contrast themes along with other UI options like holidays support, vertical and horizontal grid lines support and so on.
- **Localization** - Provides inherent support to localize the UI.
| 40.385965 | 189 | 0.745439 | eng_Latn | 0.99531 |
b1268b07aedc22c7a3d2e1a4b44d09a0d8c1f22b | 1,739 | md | Markdown | README.md | Narcwis/socket-io-typescript-chat | 60a1d47b83b0d8764c321c1ab456552c0bad3bbe | [
"MIT"
] | null | null | null | README.md | Narcwis/socket-io-typescript-chat | 60a1d47b83b0d8764c321c1ab456552c0bad3bbe | [
"MIT"
] | 2 | 2018-06-17T20:40:59.000Z | 2018-06-26T07:54:22.000Z | README.md | Narcwis/socket-io-typescript-chat | 60a1d47b83b0d8764c321c1ab456552c0bad3bbe | [
"MIT"
] | null | null | null |
Alcumus Chat Repo
=========================================
This repository contains server- and client-side code forked from [luixaviles](https://github.com/luixaviles/socket-io-typescript-chat) and further developed to add an emoji picker, PWA support, and a message-draft feature.
## Live Demo
Try live demo:
[https://alcumus-chat.firebaseapp.com/](https://alcumus-chat.firebaseapp.com/)
*This needs a local server to be running*
# Running Server and Client locally
## Prerequisites
First, ensure you have the following installed:
1. NodeJS - Download and Install latest version of Node: [NodeJS](https://nodejs.org)
2. Git - Download and Install [Git](https://git-scm.com)
3. Angular CLI - Install Command Line Interface for Angular [https://cli.angular.io/](https://cli.angular.io/)
After that, use `Git Bash` to run all commands if you are on the Windows platform.
## Clone repository
In order to start the project use:
```bash
$ git clone https://github.com/narcwis/socket-io-typescript-chat.git
$ cd socket-io-typescript-chat
```
## Run Server
To run server locally, just install dependencies and run `gulp` task to create a build:
```bash
$ cd server
$ npm install -g gulp-cli
$ npm install
$ gulp build
$ npm start
```
The `socket.io` server will be running on port `8080`
## Run Angular Client
Open other command line window and run following commands:
```bash
$ cd client
$ npm install
$ ng serve
```
Now open your browser in following URL: [http://localhost:4200](http://localhost:4200/)
# Contribution
Contributions are greatly appreciated.
# Contributors
[<img alt="narcwis" src="https://avatars0.githubusercontent.com/u/9106275?s=460&v=4" width="117">](https://github.com/narcwis)
## License
MIT
| 25.202899 | 229 | 0.722829 | eng_Latn | 0.869868 |
b127468d47ea055ba769771f0f3e0f2f241a2fe2 | 6,284 | md | Markdown | README.md | courobin/dgca-issuance-service | c708225965c632f4bc04e6fa8d2eea4cfe2b621f | [
"Apache-2.0"
] | 18 | 2021-04-24T06:49:01.000Z | 2022-01-19T22:39:21.000Z | README.md | courobin/dgca-issuance-service | c708225965c632f4bc04e6fa8d2eea4cfe2b621f | [
"Apache-2.0"
] | 94 | 2021-04-25T11:10:38.000Z | 2022-01-28T13:35:31.000Z | README.md | courobin/dgca-issuance-service | c708225965c632f4bc04e6fa8d2eea4cfe2b621f | [
"Apache-2.0"
] | 25 | 2021-05-16T09:17:58.000Z | 2022-03-08T10:35:57.000Z | <h1 align="center">
EU Digital COVID Certificate Issuance Service
</h1>
<p align="center">
<a href="https://sonarcloud.io/dashboard?id=eu-digital-green-certificates_dgca-issuance-service" title="Quality Gate Status"><img src="https://sonarcloud.io/api/project_badges/measure?project=eu-digital-green-certificates_dgca-issuance-service&metric=alert_status"></a>
<a href="/../../commits/" title="Last Commit"><img src="https://img.shields.io/github/last-commit/eu-digital-green-certificates/dgca-issuance-service?style=flat"></a>
<a href="/../../issues" title="Open Issues"><img src="https://img.shields.io/github/issues/eu-digital-green-certificates/dgca-issuance-service?style=flat"></a>
<a href="./LICENSE" title="License"><img src="https://img.shields.io/badge/License-Apache%202.0-green.svg?style=flat"></a>
</p>
<p align="center">
<a href="#about">About</a> •
<a href="#development">Development</a> •
<a href="#documentation">Documentation</a> •
<a href="#support-and-feedback">Support</a> •
<a href="#how-to-contribute">Contribute</a> •
<a href="#contributors">Contributors</a> •
<a href="#licensing">Licensing</a>
</p>
## About
This repository contains the source code of the EU Digital COVID Certificate Issuance Service.
The issuer backend is accessed by the [issuer web frontend](https://github.com/eu-digital-green-certificates/dgca-issuance-web) and the respective wallet apps ( [Android](https://github.com/eu-digital-green-certificates/dgca-wallet-app-android), [iOS](https://github.com/eu-digital-green-certificates/dgca-wallet-app-ios) ) of the same member state. The backend itself publishes its public keys to the [DGCG](https://github.com/eu-digital-green-certificates/dgc-gateway) where they can be distributed to other member states. Each member state hosts its own issuer backend. The main function of the backend is to provide services for creating and signing new green certificates.
## Development
### Prerequisites
- [Open JDK 11](https://openjdk.java.net)
- [Maven](https://maven.apache.org)
- [Docker](https://www.docker.com)
- Authenticate to [Github Packages](https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-apache-maven-registry)
#### Authenticating in to GitHub Packages
As some of the required libraries (and/or versions are pinned/available only from GitHub Packages) You need to authenticate
to [GitHub Packages](https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-apache-maven-registry)
The following steps need to be followed
- Create [PAT](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token) with scopes:
- `read:packages` for downloading packages
##### GitHub Maven
- Copy/Augment `~/.m2/settings.xml` with the contents of `settings.xml` present in this repository
- Replace `${app.packages.username}` with your github username
- Replace `${app.packages.password}` with the generated PAT
##### GitHub Docker Registry
- Run `docker login docker.pkg.github.com/eu-digital-green-certificates` before running further docker commands.
- Use your GitHub username as username
- Use the generated PAT as password
### Build
Whether you cloned or downloaded the 'zipped' sources you will either find the sources in the chosen checkout-directory or get a zip file with the source code, which you can expand to a folder of your choice.
In either case open a terminal pointing to the directory you put the sources in. The local build process is described afterwards depending on the way you choose.
### Build with maven
* Check [settings.xml](settings.xml) in root folder and copy the servers to your own `~/.m2/settings.xml` to connect the GitHub repositories we use in our code. Provide your GitHub username and access token (see [GitHub Help](https://docs.github.com/en/github/authenticating-to-github/creating-a-personal-access-token)) under the variables suggested.
* Run `mvn clean package` from the project root folder
### Run with docker
* Perform maven build as described above
* Run `docker-compose up` from the project root folder
After all containers have started you will be able to reach the application on your [local machine](http://localhost:8080/dgci/status) under port 8080.
## Documentation
* [configuration manual](docs/configuration.md)
* [developing configuration](docs/dev_config.md)
## Support and feedback
The following channels are available for discussions, feedback, and support requests:
| Type | Channel |
| ------------------------ | ------------------------------------------------------ |
| **Issues** | <a href="/../../issues" title="Open Issues"><img src="https://img.shields.io/github/issues/eu-digital-green-certificates/dgca-issuance-service?style=flat"></a> |
| **Other requests** | <a href="mailto:opensource@telekom.de" title="Email DGC Team"><img src="https://img.shields.io/badge/email-DGC%20team-green?logo=mail.ru&style=flat-square&logoColor=white"></a> |
## How to contribute
Contribution and feedback is encouraged and always welcome. For more information about how to contribute, the project structure, as well as additional contribution information, see our [Contribution Guidelines](./CONTRIBUTING.md). By participating in this project, you agree to abide by its [Code of Conduct](./CODE_OF_CONDUCT.md) at all times.
## Contributors
Our commitment to open source means that we are enabling -in fact encouraging- all interested parties to contribute and become part of its developer community.
## Licensing
Copyright (C) 2021 T-Systems International GmbH and all other contributors
Licensed under the **Apache License, Version 2.0** (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License at https://www.apache.org/licenses/LICENSE-2.0.
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the [LICENSE](./LICENSE) for the specific language governing permissions and limitations under the License.
| 59.847619 | 678 | 0.741725 | eng_Latn | 0.908314 |
b12750d89f26059ce0e3a71d22c96e90773b3648 | 516 | md | Markdown | README.md | Sundragon1993/Transfer-learning-Resnet-VGG-Alex | 9c39c8f86fe6a09cd761d3329c6fadeeb00f5c75 | [
"MIT"
] | null | null | null | README.md | Sundragon1993/Transfer-learning-Resnet-VGG-Alex | 9c39c8f86fe6a09cd761d3329c6fadeeb00f5c75 | [
"MIT"
] | null | null | null | README.md | Sundragon1993/Transfer-learning-Resnet-VGG-Alex | 9c39c8f86fe6a09cd761d3329c6fadeeb00f5c75 | [
"MIT"
] | null | null | null | # AlexNet Feature Extraction
Using AlexNet and TensorFlow to build a feature extraction network.
## Setup
Before embarking on the lab, you should first set up your environment with the [Term 1 Starter Kit](https://github.com/udacity/CarND-Term1-Starter-Kit).
Download additional files:
* [Training data](https://d17h27t6h515a5.cloudfront.net/topher/2016/October/580a829f_train/train.p)
* [AlexNet weights](https://d17h27t6h515a5.cloudfront.net/topher/2016/October/580d880c_bvlc-alexnet/bvlc-alexnet.npy)
| 46.909091 | 152 | 0.786822 | eng_Latn | 0.46084 |
b127e2682a7639784d96cc08208b021c66cbffe6 | 15,547 | md | Markdown | documentation/src/docs/getting-started/pravega-on-kubernetes-101.md | RaulGracia/pravega | 9404b59a5d2e06a2792fc38f146f4fca7a0042fd | [
"Apache-2.0"
] | 1,840 | 2017-05-10T16:29:14.000Z | 2022-03-31T07:02:11.000Z | documentation/src/docs/getting-started/pravega-on-kubernetes-101.md | RaulGracia/pravega | 9404b59a5d2e06a2792fc38f146f4fca7a0042fd | [
"Apache-2.0"
] | 5,485 | 2017-05-10T16:56:17.000Z | 2022-03-31T14:08:36.000Z | documentation/src/docs/getting-started/pravega-on-kubernetes-101.md | RaulGracia/pravega | 9404b59a5d2e06a2792fc38f146f4fca7a0042fd | [
"Apache-2.0"
] | 443 | 2017-05-10T21:34:50.000Z | 2022-03-31T07:02:14.000Z | <!--
Copyright Pravega Authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
# Deploying Pravega on Kubernetes 101
We show you how to deploy your first Pravega cluster in Kubernetes. We provide a step-by-step guide
to deploy Pravega in both Google Kubernetes Engine (GKE) and Amazon Elastic Kubernetes Service (EKS).
Our goal is to keep things as simple as possible, and, at the same time, provide you with valuable
insights on the services that form a Pravega cluster and the operators we developed to deploy them.
## Creating and Setting Up the Kubernetes Cluster
First, we need to create the Kubernetes cluster to deploy Pravega. We assume as a pre-requisite that
you have an account with at least one of the cloud providers mentioned above. If you already have an account
for Google Cloud and/or AWS, then it is time to create a Kubernetes cluster for Pravega.
### GKE
Creating a Kubernetes cluster in GKE is straightforward. The defaults in general are enough for running a
demo Pravega cluster, but we suggest just a couple of setting changes to deploy Pravega:
1. Go to `Kubernetes Engine` drop-down menu and select `Clusters > Create Cluster` option.
2. Pick a name for your Kubernetes cluster (i.e., `pravega-gke`).
3. As an important point, in the `Master version` section you should select Kubernetes version 1.15.
The reason is that we are going to exercise the latest Pravega and Bookkeeper Operators, which require
Kubernetes version 1.15+.
4. Also, as the Pravega cluster consists of several services, we need to select a slightly larger node
flavor compared to the default one. Thus, go to `default-pool > Nodes > Machine type` and select `n1-standard-4`
nodes (4vCPUs, 15GB of RAM) and select 4 nodes instead of 3 (default). Note that this deployment is still
accessible with the trial account.
5. Press the `Create` button, and that’s it.
Note that we use the Cloud Shell provided by GKE to deploy Pravega from the browser itself without installing
locally any CLI (but feel free to use the Google Cloud CLI instead).
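If you prefer the `gcloud` CLI over the console, the steps above roughly collapse into a single command. The sketch below only echoes the command (a dry run); the flags mirror the walkthrough, but verify them against your installed `gcloud` version before running for real:

```shell
#!/bin/sh
# Dry-run sketch: echo the gcloud equivalent of the console steps above.
# Name, version, machine type, and node count mirror the walkthrough.
CLUSTER_NAME="pravega-gke"
CMD="gcloud container clusters create ${CLUSTER_NAME} \
  --cluster-version=1.15 \
  --machine-type=n1-standard-4 \
  --num-nodes=4"

echo "${CMD}"
# To actually create the cluster, run: eval "${CMD}"
```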
Pravega and Bookkeeper Operators also [require elevated privileges](https://github.com/pravega/bookkeeper-operator/blob/master/doc/development.md#installation-on-google-kubernetes-engine)
in order to watch for custom resources. For this reason, in GKE you first need to grant those
permissions by executing:
```
kubectl create clusterrolebinding cluster-admin-binding --clusterrole=cluster-admin --user=$(gcloud config get-value core/account)
```
### EKS
In the case of AWS, we are going to use the EKS CLI, which automates and simplifies different aspects of the
cluster creation and configuration (e.g., VPC, subnets, etc.). You will need to [install and configure the
EKS CLI](https://docs.aws.amazon.com/eks/latest/userguide/getting-started-eksctl.html) before proceeding with
the cluster creation.
Once the EKS CLI is installed, we just require one command to create an EKS cluster:
```
eksctl create cluster \
--name pravega-eks \
--region us-west-2 \
--nodegroup-name standard-workers \
--node-type t3.xlarge \
--nodes 3 \
--nodes-min 1 \
--nodes-max 4 \
--ssh-access \
--ssh-public-key ~/.ssh/pravega_aws.pub \
--managed
```
Similar to the GKE case, the previous command uses a larger node type compared to the default one
(`--node-type t3.xlarge`). Note that the `--ssh-public-key` parameter expects a public key that has
been generated when installing the AWS CLI to securely connect with your cluster (for more info,
please [read this document](https://docs.aws.amazon.com/cli/latest/userguide/cli-services-ec2-keypairs.html#creating-a-key-pair)).
Also, take into account that the region for the EKS cluster should match the configured region in your AWS CLI.
Now, we are ready to prepare our Kubernetes cluster for the installation of Pravega.
### Install Helm
To simplify the deployment of Pravega, we use [Helm charts](https://helm.sh/). You will need to [install a
Helm 3](https://helm.sh/docs/intro/install/) client to proceed with the installation instructions in this blog post.
Once you install the Helm client, you just need to get the public charts we provide to deploy a Pravega cluster:
```
helm repo add pravega https://charts.pravega.io
helm repo update
```
### Webhook conversion and Cert-Manager
The most recent versions of Pravega Operator resort to the new
[Webhook Conversion feature](https://kubernetes.io/docs/tasks/extend-kubernetes/custom-resources/custom-resource-definition-versioning/#webhook-conversion),
which is beta since 1.15. For this reason, Cert-Manager or some other certificate management solution must be
deployed for managing webhook service certificates. To install Cert-Manager, just execute this command:
```
kubectl apply --validate=false -f https://github.com/jetstack/cert-manager/releases/download/v0.14.2/cert-manager.yaml
```
## Deploying Pravega
Next, we show you step by step how to deploy Pravega, which involves the deployment of Apache Zookeeper,
Bookkeeper (journal), and Pravega (as well as their respective Operators). Also, given that Pravega moves
"cold" data to what we call long-term storage (a.k.a Tier 2), we need to instantiate a storage backend
for such purpose.
### Apache Zookeeper
[Apache Zookeeper](https://zookeeper.apache.org/) is a distributed system that provides reliable coordination
services, such as consensus and group management. Pravega uses Zookeeper to store specific pieces of metadata as
well as to offer a consistent view of data structures used by multiple service instances.
As part of the Pravega project, we have developed a [Zookeeper Operator](https://github.com/pravega/zookeeper-operator)
to manage the deployment of Zookeeper clusters in Kubernetes. Thus, deploying the Zookeeper Operator is the first step
to deploy Zookeeper:
```
helm install zookeeper-operator pravega/zookeeper-operator --version=0.2.8
```
With the Zookeeper Operator up and running, the next step is to deploy Zookeeper. We can do so with the helm chart we
published for Zookeeper:
```
helm install zookeeper pravega/zookeeper --version=0.2.8
```
This chart instantiates a Zookeeper cluster made of 3 instances and their respective Persistent Volume Claims (PVC)
of 20GB of storage each, which is enough for a demo Pravega cluster.
Once the previous command has been executed, you can see both Zookeeper Operator and Zookeeper running in the
cluster:
```console
$ kubectl get pods
NAME READY STATUS RESTARTS AGE
zookeeper-0 1/1 Running 0 3m46s
zookeeper-1 1/1 Running 0 3m6s
zookeeper-2 1/1 Running 0 2m25s
zookeeper-operator-6b9759bbcb-9j25s 1/1 Running 0 4m
```
### Apache Bookkeeper
[Apache Bookkeeper](https://bookkeeper.apache.org/) is a distributed and reliable storage system that provides
a distributed log abstraction. Bookkeeper excels at achieving low-latency, append-only writes. This
is the reason why Pravega uses Bookkeeper for journaling: Pravega writes data to Bookkeeper, which provides low latency,
persistent, and replicated storage for stream appends. Pravega uses the data in BookKeeper to recover from failures,
and that data is truncated once it is flushed to tiered long-term storage.
As in the case of Zookeeper, we have also developed a [Bookkeeper Operator](https://github.com/pravega/bookkeeper-operator)
to manage the lifecycle of Bookkeeper clusters deployed in Kubernetes. Thus, the next step is to deploy the Bookkeeper Operator:
```
helm install bookkeeper-operator pravega/bookkeeper-operator --version=0.1.2
```
Once running, we can proceed to deploy Bookkeeper. In this case, we will use the Helm chart publicly available to quickly
spin up a Bookkeeper cluster:
```
helm install bookkeeper pravega/bookkeeper --version=0.7.1
```
As a result, you can see below both Zookeeper and Bookkeeper up and running:
```console
$ kubectl get pods
NAME READY STATUS RESTARTS AGE
bookkeeper-operator-85568f8949-d652z 1/1 Running 0 4m10s
bookkeeper-pravega-bk-bookie-0 1/1 Running 0 2m10s
bookkeeper-pravega-bk-bookie-1 1/1 Running 0 2m10s
bookkeeper-pravega-bk-bookie-2 1/1 Running 0 2m10s
zookeeper-0 1/1 Running 0 8m59s
zookeeper-1 1/1 Running 0 8m19s
zookeeper-2 1/1 Running 0 7m38s
zookeeper-operator-6b9759bbcb-9j25s 1/1 Running 0 9m13s
```
### Long-Term Storage
We mentioned before that Pravega automatically [moves data to Long-Term Storage](http://pravega.io/docs/latest/segment-store-service/#synchronization-with-tier-2-storage-writer)
(or Tier 2). This feature is very interesting, because it positions Pravega in a "sweet spot" in the latency vs
throughput trade-off: Pravega achieves low latency writes by using Bookkeeper for appends. At the same time,
it also provides high throughput reads when accessing historical data.
As our goal is to keep things as simple as possible, we deploy a simple storage option: the NFS Server provisioner.
With such a provisioner, we have a pod that acts as an NFS Server for Pravega. To deploy it, you need to execute
the following commands:
```
helm repo add stable https://kubernetes-charts.storage.googleapis.com/
helm install stable/nfs-server-provisioner --generate-name
```
Once the NFS Server provisioner is up and running, Pravega will require a PVC for long-term storage pointing to the
NFS Server provisioner that we have just deployed. To create the PVC, you can just copy the following manifest
(namely `tier2_pvc.yaml`):
```yaml
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
name: pravega-tier2
spec:
storageClassName: "nfs"
accessModes:
- ReadWriteMany
resources:
requests:
storage: 5Gi
```
And create the PVC for long-term storage as follows:
```
kubectl apply -f tier2_pvc.yaml
```
As you may notice, the long-term storage option suggested in this post is just for demo purposes, to keep
things simple. If you want a production-ready Pravega cluster running in the cloud, then we suggest you
use actual storage services like Filestore in GKE and EFS in AWS. There are instructions on [how to deploy
production long-term storage options](https://github.com/pravega/pravega-operator/blob/master/doc/longtermstorage.md)
in the documentation of Pravega Operator.
### Pravega
We are almost there! The last step is to deploy the Pravega Operator and Pravega, much as we have
already done for Zookeeper and Bookkeeper. As usual, we first need to deploy the Pravega Operator
(and its required certificate) as follows:
```
git clone https://github.com/pravega/pravega-operator
kubectl create -f pravega-operator/deploy/certificate.yaml
helm install pravega-operator pravega/pravega-operator --version=0.5.1
```
Once deployed, we can deploy Pravega with the default Helm chart publicly available as follows:
```
helm install pravega pravega/pravega --version=0.8.0
```
That's it! Once this command gets executed, you will have your first Pravega cluster up and running:
```console
$ kubectl get pods
NAME READY STATUS RESTARTS AGE
bookkeeper-operator-85568f8949-d652z 1/1 Running 0 11m
bookkeeper-pravega-bk-bookie-0 1/1 Running 0 9m6s
bookkeeper-pravega-bk-bookie-1 1/1 Running 0 9m6s
bookkeeper-pravega-bk-bookie-2 1/1 Running 0 9m6s
nfs-server-provisioner-1592297085-0 1/1 Running 0 5m26s
pravega-operator-6c6d9db459-mpjr4 1/1 Running 0 4m19s
pravega-pravega-controller-5b447c85b-t8jsx 1/1 Running 0 2m56s
pravega-pravega-segment-store-0 1/1 Running 0 2m56s
zookeeper-0 1/1 Running 0 15m
zookeeper-1 1/1 Running 0 15m
zookeeper-2 1/1 Running 0 14m
zookeeper-operator-6b9759bbcb-9j25s 1/1 Running 0 16m
```
## Executing a Sample Application
Finally, we would like to help you exercise the Pravega cluster you just deployed. Let's deploy a pod in
our Kubernetes cluster to run samples and applications, like the one we propose in the manifest below
(`test-pod.yaml`):
```yaml
kind: Pod
apiVersion: v1
metadata:
name: test-pod
spec:
containers:
- name: test-pod
image: ubuntu:18.04
args: [bash, -c, 'for ((i = 0; ; i++)); do echo "$i: $(date)"; sleep 100; done']
```
You can directly use this manifest and create your Ubuntu 18.04 pod as follows:
```
kubectl create -f test-pod.yaml
```
Once the pod is up and running, we suggest you log in to the pod and build the [Pravega samples](https://github.com/pravega/pravega-samples)
to interact with the Pravega cluster by executing the following commands:
```
kubectl exec -it test-pod -- /bin/bash
apt-get update
apt-get -y install git-core openjdk-8-jdk
git clone -b r0.8 https://github.com/pravega/pravega-samples
cd pravega-samples
./gradlew installDist
```
With this, we can go to the location where the Pravega samples executable files have been generated and execute one of them,
making sure that we point to the Pravega Controller service:
```
cd pravega-client-examples/build/install/pravega-client-examples/
bin/consoleWriter -u tcp://pravega-pravega-controller:9090
```
That’s it, you have executed your first sample against the Pravega cluster! With the `consoleWriter`,
you will be able to write to Pravega regular events or transactions. We also encourage you to execute
on another terminal the `consoleReader`, so you will see how events are both written and read at the same
time (for more info, see the [Pravega samples documentation](https://github.com/pravega/pravega-samples/tree/master/pravega-client-examples#consolerw)).
There are many other interesting samples for Pravega in the repository, so please be curious and try them out.
## What is next?
This guide (also available in this [blog post](https://blog.pravega.io/2020/06/20/deploying-pravega-in-kubernetes/))
provides a high-level overview of how to deploy Pravega on Kubernetes. But there is much more to learn! We
suggest you continue exploring Pravega with the following documents:
- [Deploying Pravega](../deployment/deployment.md): A more advanced guide on how to deploy Pravega on Kubernetes
and other environments.
- [Developer Guide](../clients-and-streams.md): Start creating your own applications using Pravega.
- [Understanding Pravega](../pravega-concepts.md): Be curious to understand the concepts and design choices behind
Pravega.
---
title: regini
description: Learn how to modify the registry from the command prompt or a script.
ms.prod: windows-server
ms.technology: manage-windows-commands
ms.topic: article
ms.assetid: 5ff18dc3-5bd8-400a-b311-fd73a3267e8c
author: coreyp-at-msft
ms.author: coreyp
manager: dongill
ms.date: 07/11/2018
ms.openlocfilehash: 33e0dcaa59be3c1748763cce5c9979fe318b271a
ms.sourcegitcommit: 4f407b82435afe3111c215510b0ef797863f9cb4
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 05/24/2020
ms.locfileid: "83820152"
---
# <a name="regini"></a>regini
Modifies the registry from the command line or a script, applying changes that were preset in one or more text files. You can create, modify, or delete registry keys, in addition to modifying the permissions of registry keys.
For more information about the format and content of the text script files that Regini.exe uses to modify the registry, see [How to change registry values or permissions from a command line or a script](https://support.microsoft.com/help/264584/how-to-change-registry-values-or-permissions-from-a-command-line-or-a).
## <a name="syntax"></a>Syntax
```
regini [-m \\machinename | -h hivefile hiveroot][-i n] [-o outputWidth][-b] textFiles...
```
#### <a name="parameters"></a>Parameters

|Parameter|Description|
|---------|-----------|
|-m \<\\\\computername>|Specifies the name of a remote computer with a registry that is to be modified. Use the format **\\\\computername**.|
|-h \<hivefile hiveroot>|Specifies the local registry hive to modify. You must specify the name of the hive file and the root of the hive in the format **hivefile hiveroot**.|
|-i \<n>|Specifies the level of indentation to use to indicate the tree structure of registry keys in the command output. The **Regdmp** tool (which gets a registry key's current permissions in binary format) uses indentation in multiples of four, so the default value is **4**.|
|-o \<outputwidth>|Specifies the width of the command output, in characters. If the output is displayed in the command window, the default value is the width of the window. If the output is redirected to a file, the default value is **240** characters.|
|-b|Specifies that **Regini.exe** output is backward compatible with previous versions of **Regini.exe**. For details, see Remarks.|
|textfiles|Specifies the name of one or more text files that contain registry data. Any number of ANSI or Unicode text files can be specified.|
## <a name="remarks"></a>Remarks
The following guidelines apply primarily to the content of the text files that contain the registry data you apply by using **Regini.exe**:
- Use a semicolon as an end-of-line comment character. It must be the first non-blank character in a line.
- Use a backslash to indicate continuation of a line. The command ignores all characters from the backslash up to (but not including) the first non-blank character of the next line. If you include more than one space before the backslash, it is replaced by a single space.
- Use hard-tab characters to control indentation. This indentation indicates the tree structure of registry keys; however, these characters are converted to a single space regardless of their position.
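For illustration, a minimal script that follows these rules might look like this (a hypothetical example; the key name `Contoso` and value name `ExampleValue` are made up, and the exact value grammar is described in the linked support article):

```
; Comment lines start with a semicolon.
HKEY_LOCAL_MACHINE\SOFTWARE\Contoso
    ExampleValue = REG_DWORD 0x00000001
```

Saving this as, say, `example.ini` and running `regini example.ini` would apply the change to the local registry.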
## <a name="additional-references"></a>Additional References
- [Command-Line Syntax Key](command-line-syntax-key.md)
### Distributions & Instructions
hello world
#### Docker
How to install Kuma on Docker.
#### Kubernetes
#### DC/OS
#### Amazon Linux
#### CentOS
#### RedHat
#### Debian
#### Ubuntu
#### macOS
#### AWS Marketplace
#### AWS Cloud Formation
#### Google Cloud Platform
#### Vagrant
#### Source
# OpenAPI\Server\Api\BaseRemoteAccessApiInterface
All URIs are relative to *http://localhost*
Method | HTTP request | Description
------------- | ------------- | -------------
[**getCrumb**](BaseRemoteAccessApiInterface.md#getCrumb) | **GET** /crumbIssuer/api/json |
## Service Declaration
```yaml
# src/Acme/MyBundle/Resources/services.yml
services:
# ...
acme.my_bundle.api.baseRemoteAccess:
class: Acme\MyBundle\Api\BaseRemoteAccessApi
tags:
- { name: "open_apiserver.api", api: "baseRemoteAccess" }
# ...
```
## **getCrumb**
> OpenAPI\Server\Model\DefaultCrumbIssuer getCrumb()
Retrieve CSRF protection token
### Example Implementation
```php
<?php
// src/Acme/MyBundle/Api/BaseRemoteAccessApiInterface.php
namespace Acme\MyBundle\Api;
use OpenAPI\Server\Api\BaseRemoteAccessApiInterface;
class BaseRemoteAccessApi implements BaseRemoteAccessApiInterface
{
// ...
/**
* Implementation of BaseRemoteAccessApiInterface#getCrumb
*/
public function getCrumb()
{
// Implement the operation ...
}
// ...
}
```
### Parameters
This endpoint does not need any parameter.
### Return type
[**OpenAPI\Server\Model\DefaultCrumbIssuer**](../Model/DefaultCrumbIssuer.md)
### Authorization
[jenkins_auth](../../README.md#jenkins_auth)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: application/json
[[Back to top]](#) [[Back to API list]](../../README.md#documentation-for-api-endpoints) [[Back to Model list]](../../README.md#documentation-for-models) [[Back to README]](../../README.md)
| 22.054795 | 189 | 0.670186 | yue_Hant | 0.581158 |
b12afb60aed8d0fedb416949f36df2e9ef45628c | 5,387 | md | Markdown | data/update/2020-05/2020-05-06-10_51_32-us-marquette.md | jianghe1220/covid19-datahub | 9b8d8e0bf899fb5401cd22f120faf6deb86f35c5 | [
"MIT"
] | 5 | 2020-04-06T13:22:17.000Z | 2020-06-24T03:22:12.000Z | data/update/2020-05/2020-05-06-10_51_32-us-marquette.md | jianghe1220/covid19-datahub | 9b8d8e0bf899fb5401cd22f120faf6deb86f35c5 | [
"MIT"
] | 26 | 2020-03-30T04:42:14.000Z | 2020-04-29T05:33:02.000Z | data/update/2020-05/2020-05-06-10_51_32-us-marquette.md | applysquare/covid19-datahub | 7b99d266f48dca194a13fa02d3ee72aeb10bc1d7 | [
"MIT"
] | 20 | 2020-03-29T02:09:44.000Z | 2020-04-11T03:36:52.000Z | ---
title: "COVID-19 Update: Campus operations this fall and new Commencement date"
subtitle:
date: 2020-04-24
link: >-
https://today.marquette.edu/2020/04/covid-19-update-campus-operations-this-fall-and-new-commencement-date/
countryCode: us
status: published
instituteSlug: us-marquette
---

Dear Marquette community:
Recent weeks have been filled with great uncertainty as we have navigated the global COVID-19 pandemic. We understand that this time has been stressful for our students, staff and faculty, as we have experienced these same feelings. We are writing today to offer a path forward so, together, we can plan for a recovery that leads us toward a common goal of being an even stronger, mission-based community than we were before this crisis.
Marquette University will resume campus operations this fall if governmental and medical authorities say it is safe to do so. A personalized, on-campus academic and cocurricular experience is foundational to the transformative Catholic, Jesuit education to which Marquette has been dedicated for 140 years. The best way for us to provide a distinctive experience for our students is together, in community.
To prepare a safe living and learning environment that aligns with the state of Wisconsin’s “Badger Bounce Back” plan, the Marquette University COVID-19 Response Team and its sub-committees, which include more than 100 leaders across campus, are working on a five-step recovery plan. The plan will establish the key responsibilities, timelines and general procedures we will follow during Marquette’s recovery efforts.
As we have stated from the beginning of this global crisis, Marquette will always prioritize the health, safety and well-being of our campus and broader communities. Our recovery will include processes and policies related to COVID-19 testing, social distancing, space and equipment sanitizing, and personal protective equipment usage to comply with government and health guidelines. We are exploring adjustments to many aspects of the campus experience, from large lectures to university-sponsored travel to dining and residence hall use. To protect the most vulnerable members of our community, we will work with those individuals who may have underlying medical conditions to ensure they can return to campus safely.
To aid in our collective planning, the COVID-19 Response Team will solicit input on recovery efforts and share information on the university’s pandemic response to date at a virtual COVID-19 Town Hall for the campus community on Friday, May 1, from 3 p.m. to 4:30 p.m. Please RSVP and submit questions for this Microsoft Teams Live event by Thursday, April 30. We hope you will join us and share the input we need to continue to move forward.
A link to join the Microsoft Teams Live event will be emailed to all attendees the morning of Friday, May 1. We will provide additional opportunities for input on recovery planning and communicate frequently on this topic in the months ahead.
Commencement rescheduled to August 30
There have been many questions about Commencement — one of the year’s most anticipated events. Following the postponement of this year’s ceremonies due to the COVID-19 pandemic, Marquette conducted a survey of graduating students on how best to celebrate the Class of 2020. Approximately 1,800 undergraduate, graduate and professional students completed the survey. An overwhelming majority of students reported a strong preference for an in-person ceremony when it is safe to gather.
Based on this feedback and knowing that Commencement is the culmination of years of hard work for our graduates and sacrifice for their families, we are planning to host an in-person Commencement ceremony on Sunday, August 30, at Fiserv Forum. Individual students will be recognized as they cross the stage. Only the professional schools — School of Dentistry, Law School and Health Sciences professional degrees — will host their own ceremonies. The individual colleges will host receptions during the August 29–30 weekend. Baccalaureate Mass will be held Saturday, August 29, at 4 p.m., at the Al McGuire Center.
Any of our plans may need to change based on government and health official guidelines, so we ask for your continued patience and flexibility. If future guidance requires a different action, the safety of our students, faculty and staff will be the most important factor in our decisions. Please visit Marquette’s COVID-19 website for more information or to submit a question.
St. Ignatius never pulled back from the challenges he confronted. Constantly seeking the greater glory of God, he called upon the Jesuits to “live with one foot raised” — always ready to change course and step forward to embrace a new way of life. In that spirit, and with your input, we are prepared to adapt to our changing realities while continuing to move the Marquette community forward together.
Thank you for your resilience and your commitment to our mission during these challenging times. We look forward to welcoming you back to campus in August.
We are Marquette!
Dr. Michael R. Lovell
President
Dr. Kimo Ah Yun
Provost and Executive Vice President for Academic Affairs
Joel Pogodzinski
Senior Vice President and Chief Operating Officer
Project Service app version 2.5.x or later on Dynamics 365 (online) version 9.x
# Design
## Motivation
YACLib is a C++ library for concurrent and parallel task execution, an alternative to existing solutions
that strives to satisfy the following properties:
* Easy to use
* Zero cost abstraction
* Easy to build
* Good test coverage
### Easy to use:
Writing concurrent/parallel programs is hard, so one of the most important goals is for YACLib
to be easy to use correctly.
Example:
In our interface, Future has two overloads of `Get`: `ReturnValue Get() const &` and `ReturnValue Get() &&`,
while the overloads `ReturnValue Get() &` and `ReturnValue Get() const &&` are deleted.
As a result, most incorrect or suboptimal uses of `Get` are detected at compile time.
### Zero cost abstraction:
The abstractions YACLib provides should make code that is written optimally for a specific case
simpler, while keeping it just as fast.
This is why, for example, `Strand::Execute(...some task...)` is lock-free,
and creating and executing a pipeline of Futures performs exactly one allocation per pipeline step.
### Easy to build:
Building YACLib should be simple, since otherwise the library is hard to add to cross-platform projects.
At the same time, the build should not significantly slow down the build of the target project.
That is why we build the whole project with CMake as a static library,
trying to keep the amount of public template code to a minimum.
### Good test coverage:
The most important thing is the absence of bugs, because a multithreading bug is very hard to find, and
even harder if it lives in a library. Therefore we:
* aim for 100% test coverage
* test the code on many platforms with different build flags
* use static analyzers for C++, such as clang-tidy and cppcheck
* use dynamic analyzers for C++, such as Google Sanitizers, Valgrind, etc
## What is already done
### The following abstractions have been implemented:
1. Executors:
* Inline
* ThreadPool
* Strand
2. Future/Promise abstraction
3. Combinators:
* WhenAll
* WhenAny
4. ThreadFactory
5. Documentation, Tests, CI
- ### Inline Executor:
A zero cost abstraction that avoids redundant `nullptr` checks on an `executor`.
It is also the default `executor` for other classes.
- ### Thread Pool Executor
An abstraction that executes tasks in parallel on a chosen number of threads.
We wrote user-friendly interfaces for stopping the `ThreadPool`,
because shutdown is one of the hard problems in parallel programming:
* `Stop` - forbids adding new tasks to the `ThreadPool` and executes the remaining ones.
* `SoftStop` - calls `Stop` once no tasks remain in the `ThreadPool`.
* `HardStop` - calls `Stop` and does not execute the remaining tasks.
We made a deliberate decision to split `Join` into `Stop` and `Wait`.
We also wrote an almost lock-free specialization of `ThreadPool` for a single thread,
since this is a common use case for a `ThreadPool` in various projects,
for example: `UIThread, RenderThread, AnimationThread, LoggerThread, FileIOThread etc`.
In all these cases people often reinvent the wheel, because the generic `ThreadPool` abstraction is usually too heavyweight for the job;
we improved the performance of the single-threaded `ThreadPool` while keeping the same API.
- ### Strand
An abstraction that serializes task execution on top of a `ThreadPool` or another `Executor`.
It makes it possible to get rid of explicit locks (e.g. `std::mutex`) and move to cooperative task execution.
Our implementation satisfies several properties:
* Fully lock-free (and the lock-free implementation is simple enough to be more efficient than a spin-lock)
* It does not occupy the executor (or its thread, in the case of a `ThreadPool`) for an unbounded time,
letting other tasks run, by rescheduling itself instead of using recursion or a loop.
* Bulk task execution (batching) - several tasks are executed back to back,
which makes better use of the processor cache.
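To make the serialization and batching properties concrete, here is a minimal, mutex-based sketch of the strand idea over an arbitrary executor. This is an assumption-laden illustration only: YACLib's real `Strand` is lock-free, and `SimpleStrand` with its `Executor` callback parameter is a made-up name, not the library's API.

```cpp
#include <cassert>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Minimal, mutex-based sketch of the strand idea (illustrative only:
// YACLib's real Strand is lock-free; SimpleStrand is hypothetical).
class SimpleStrand {
 public:
  using Task = std::function<void()>;
  using Executor = std::function<void(Task)>;

  explicit SimpleStrand(Executor execute) : execute_(std::move(execute)) {}

  void Execute(Task task) {
    bool schedule = false;
    {
      std::lock_guard<std::mutex> guard(mutex_);
      queue_.push(std::move(task));
      // Schedule a drain job only if one is not already running:
      // this is exactly what serializes the tasks.
      schedule = !draining_;
      draining_ = true;
    }
    if (schedule) {
      execute_([this] { Drain(); });
    }
  }

 private:
  void Drain() {
    // Batching: run all queued tasks back to back on the underlying executor.
    while (true) {
      Task task;
      {
        std::lock_guard<std::mutex> guard(mutex_);
        if (queue_.empty()) {
          draining_ = false;
          return;
        }
        task = std::move(queue_.front());
        queue_.pop();
      }
      task();
    }
  }

  Executor execute_;
  std::mutex mutex_;
  std::queue<Task> queue_;
  bool draining_ = false;
};

// Demo: 4 threads submit increments; the strand serializes them,
// so the plain int needs no lock of its own.
int RunSerializedIncrements() {
  SimpleStrand strand([](SimpleStrand::Task job) { job(); });  // inline executor
  int counter = 0;
  std::vector<std::thread> threads;
  for (int t = 0; t < 4; ++t) {
    threads.emplace_back([&] {
      for (int i = 0; i < 1000; ++i) {
        strand.Execute([&counter] { ++counter; });
      }
    });
  }
  for (auto& th : threads) {
    th.join();
  }
  return counter;
}
```

Because a drain job is scheduled only when one is not already running, at most one task from the strand executes at any moment, no matter how many threads call `Execute` concurrently.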
### Future/Promise/Run abstraction
An abstraction for composing task execution pipelines.
```C++
auto thread_pool = MakeThreadPool(4);
auto future = Run(thread_pool, task1)
.Then(task2)
.Then(task3)
```
The current implementation satisfies the following properties:
* `Promise::Set / Future::Then` is `lock-free`, implemented as a finite state machine over an atomic.
* `Wait/WaitFor/WaitUntil` require no allocations, which is quite nontrivial given the first property
(no `std::mutex` and `std::condition_variable` in the `SharedState`).
* One allocation per scheduling and execution of each pipeline step, which is the most efficient solution
short of a `Lazy Future`.
* Support for `Future<Future<T>> -> Future<T>`.
Example:
```C++
auto future = Run(MakeInline(), []{
return MakeFuture<int>(5); // Returns Future<int>
});
// decltype(future) == Future<int>
```
* We support error handling both via exceptions and via return codes (std::error_code)
* Besides T we can handle different overloads: `util::Result<T>`, `std::exception_ptr`, `std::error_code`
Example:
```C++
auto future = Run(MakeInline(), []{
throw std::runtime_error{"bad exception"};
}).Then([](std::exception_ptr e) { // recover error
return 1;
});
assert(std::move(future).Get().Value() == 1);
```
### WhenAll Combinator
An abstraction that schedules a continuation for a whole set of Futures at once,
passed via iterators or as variadic template parameters.
It is also implemented lock-free.
Example:
```C++
auto [future1, promise1] = MakeContract<int>();
auto [future2, promise2] = MakeContract<int>();
auto [future3, promise3] = MakeContract<int>();
auto AllFuture = WhenAll(future1, future2, future3);
// decltype(AllFuture) == Future<std::array<int, 3>>;
assert(AllFuture.Ready() == false);
promise1.Set(5);
promise2.Set(3);
assert(AllFuture.Ready() == false); // still not completed!
promise2.Set(8);
assert(AllFuture.Ready() == true);
// array{5, 3, 8}:
auto result = std::move(AllFuture).Get().Value();
```
### ThreadFactory
An abstraction for conveniently creating threads for different `ThreadPool`s. For example, for setting:
* thread names
* thread priorities
* callbacks at the start of execution and just before a thread finishes
There is also the option of caching and reusing threads when `HeavyThreadFactory` is used with several `ThreadPool`s.
## What is planned
### Lazy Future
For the last several years, key members of the C++ standardization committee have been working on optimizing
future pipelines at compile time.
(Current state of the proposal: [P0443](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p0443r13.html)).
We want to implement part of this idea to optimize `Futures` pipelines.
Implementing the proposal in full is extremely hard; many of the best C++ engineers have been working on it for years
(among them Hans Boehm - one of the authors of the C++ Memory Model, Chris Kohlhoff - the creator of Boost.Asio,
Eric Niebler - C++20 Ranges, Lewis Baker - the creator of cppcoro, Gor Nishanov - the creator of C++20 coroutines, etc.).
We want to implement only the following scenarios:
* Merge consecutive `Future::Then` calls into a single allocation:
```C++
future.Then(task1).Then(task2).Then(task3)
```
* Add `lazy::Run`, which creates a not-yet-started Future without an allocation
* Implement lazy `Future Combinators`, both for `lazy::Future` and for `Future`
## Different ThreadPool implementations and their benchmarks
The main point of this task is to write correct benchmarks and various `ThreadPool` implementations
in order to determine which `ThreadPool` is optimal for which types of tasks.
### Implementation plan:
* Write a good `Test And Set` spinlock,
and verify that all other spinlock algorithms perform poorly in `user-space`.
* Going forward, whenever we use `std::mutex` we should also compare performance against a `spinlock`.
* Implement the _Work-Stealing_ strategy and compare its performance.
* Implement the _Work-Distribution_ strategy and compare its performance.
* Implement a _lock-free Michael-Scott queue_ for task scheduling.
See the [issue](https://github.com/YACLib/YACLib/issues/4) on GitHub for details.
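As a starting point for the first bullet, a test-and-test-and-set spinlock — the usual way to make a `Test And Set` lock behave well in user space — can be sketched as follows. This is an illustrative sketch, not YACLib code; the class and function names are made up.

```cpp
#include <atomic>
#include <cassert>
#include <thread>
#include <vector>

// Test-and-test-and-set spinlock sketch (illustrative, not YACLib code):
// exchange() to acquire, and a relaxed read-only inner loop to avoid
// cache-line ping-pong while the lock is held by another thread.
class TasSpinlock {
 public:
  void lock() {
    while (locked_.exchange(true, std::memory_order_acquire)) {
      while (locked_.load(std::memory_order_relaxed)) {
        // Spin without writing, so other cores keep the line in shared state.
      }
    }
  }

  void unlock() { locked_.store(false, std::memory_order_release); }

 private:
  std::atomic<bool> locked_{false};
};

// Demo: 4 threads x 1000 increments, each protected by the spinlock.
int IncrementCounter() {
  TasSpinlock lock;
  int counter = 0;
  std::vector<std::thread> threads;
  for (int t = 0; t < 4; ++t) {
    threads.emplace_back([&] {
      for (int i = 0; i < 1000; ++i) {
        lock.lock();
        ++counter;
        lock.unlock();
      }
    });
  }
  for (auto& th : threads) {
    th.join();
  }
  return counter;
}
```

The read-only inner loop is the whole point of the design: a plain test-and-set loop writes the cache line on every attempt, which is exactly the pathological behavior the benchmarks above are meant to expose.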
## Fibers
`Fiber`s are also known as: `user level threads`, `stackful coroutines`, `goroutines`, `green threads`.
Cooperative multitasking is useful in many cases, typically (but not only) those involving `IO-bound` tasks.
### Implementation plan:
* Implement `Stackful coroutines` (a `callable object` that represents a computation which can
suspend itself voluntarily and be resumed by the calling code or by an external event).
* Implement `Fibers`, which are essentially executions of a coroutine.
* Implement a `Futex` for `Fibers` that does not use system calls.
* Implement various synchronization primitives for `Fibers`: `Mutex`, `ConditionVariable`, `etc`.
* Implement a lock-free `AsyncMutex`, essentially a rethinking of `Strand`.
* Implement channels for passing data between `Fibers`: `Bounded/Unbounded SPSC/MPSC/SPMC/MPMC`; it is worth trying
to implement `lock-free` algorithms.
* Implement `select` for channels.
As an additional task, we may consider implementing a `Fibers` scheduler
similar to the `golang` scheduler (`kotlin` and `rust tokio` use the same algorithm).
## Concurrent algorithms
For the library to be complete and convenient to use, the following abstractions need to be implemented:
- `Shared Future`
An analogue of `Future` whose non-const methods are `thread-safe`
- `Shared Promise`
An analogue of `Promise` whose non-const methods are `thread-safe`
- `WhenAny combinator`
An abstraction that schedules a continuation for the first ready `Future` from a set of `Futures` passed via
iterators or as variadic template parameters. The implementation should be lock-free.
- Try to implement other combinators, such as: `WhenEach`, `WhenSome`, `etc`.
- For every combinator, implement a `Wait` variant (i.e. `WhenAll -> WaitAll`); this saves an allocation
## Why should I use YACLib?
Possible alternatives:
* STL
* OpenMP
* oneAPI TBB (formerly known as Intel TBB)
* Boost.Asio
* Folly
* HPX
* Boost.Fiber
* marl
* taskflow
* libunifex/cppcoro
### STL
`future/promise/packaged_task` are not `zero cost` and have an _easy to misuse API_.
There is also no way to schedule a pipeline of tasks.
The other primitives are too low-level and are better suited for writing your own library, for example a ThreadPool.
### OpenMP
Works well for computational tasks, when you want to quickly try parallelizing some code.
However, it is not suitable for concurrent task execution, which in practice is needed much more often.
### oneAPI TBB
A library that is a more modern and better written alternative to OpenMP.
Key drawbacks:
* A rather large library (100 thousand LOC).
* It has a rather specific and unfriendly API for concurrent task execution.
* In some places it [ignores the cache coherence protocol and uses the C++ memory model suboptimally](https://github.com/oneapi-src/oneTBB/blob/40a9a1060069d37d5f66912c6ee4cf165144774b/include/oneapi/tbb/spin_mutex.h#L71).
### Boost.Asio
A good wrapper over the platform-specific network APIs.
Its main drawback is that everything else, i.e. everything that is not IO, is written more like a stub,
assuming that you have your own alternative.
### Folly
A good library with a reasonably friendly interface, but rather bulky, which does not
satisfy `Easy To Build`. In our library we also implemented some things more optimally:
* `Strand`, making it fully lock-free and satisfying more properties,
for example, it doesn't occupy a thread.
* The interaction between `Future` and `Executor`, reducing the number of allocations
by creating a _Callable Shared State_ for the `Future`.
### HPX
A giant library with a very complicated build system.
### Boost.Fiber and marl
`User Level Threads (Fibers)` are not the only thing
needed in a library for concurrent and parallel task execution.
In particular, Fibers are not always suitable for CPU-bound tasks, for example
for handling asynchronous callbacks from external libraries.
Also, marl lacks a good `user level mutex`, and in Boost.Fiber it is not lock-free.
### Taskflow
Clearly the library's priority is parallelizing tasks with CUDA, OpenCL, etc.,
rather than concurrent task execution.
### libunifex/cppcoro
Both libraries are quite innovative and interesting, but experimental.
Full support will not be added to the STL before C++23,
while our library plans to support all standards starting with C++11.
b12b9c7bbad1d729d05f1c480550ae77e97ef78e | 158 | md | Markdown | MAINTAINERS.md | atruslow/little-cheesemonger | fc5e9e985d00efa144c382887a36abc88ab20e8f | [
"MIT"
] | 3 | 2021-02-02T10:59:43.000Z | 2022-01-06T16:08:32.000Z | MAINTAINERS.md | atruslow/little-cheesemonger | fc5e9e985d00efa144c382887a36abc88ab20e8f | [
"MIT"
] | 92 | 2021-01-13T16:03:50.000Z | 2022-03-29T09:09:34.000Z | MAINTAINERS.md | atruslow/little-cheesemonger | fc5e9e985d00efa144c382887a36abc88ab20e8f | [
"MIT"
] | 2 | 2021-02-02T10:59:52.000Z | 2021-06-15T14:39:53.000Z | # Maintainers
* Chris Antonellis - cantonellis [at] wayfair.com
* Patrick Lannigan - plannigan [at] wayfair.com
* Josh Woodward - jwoodward [at] wayfair.com
---
] | null | null | null | ---
layout: post
title: Week 5
---
Week 5: June 29 To July 3rd
I was able to buy one raspberry pi 3b+ and a camera of 8 MP for training and deploying.
This week was mainly spent on setting up the raspberry pi.
Also I ran more tests on the Realtime object detection idea and start thinking more about using bounding boxes areas to determine the distance to an object.
However I was able to read more in-depth the papers on disparity maps ad it is still an interesting idea.
Thinking about ways to combine both.
| 36.642857 | 156 | 0.773879 | eng_Latn | 0.999862 |
b12cf61b070a4760e4a48bdd7a5a15ae40da0601 | 1,051 | md | Markdown | README.md | shihd/project-kafka | 060a8a9dc6932543737a757ab821a93bc65be497 | [
"MIT"
] | null | null | null | README.md | shihd/project-kafka | 060a8a9dc6932543737a757ab821a93bc65be497 | [
"MIT"
] | null | null | null | README.md | shihd/project-kafka | 060a8a9dc6932543737a757ab821a93bc65be497 | [
"MIT"
] | null | null | null | # project
spring-boot+kafka
## Kafka安装
### Docker镜像
- zookeeper
- wurstmeister/kafka
- sheepkiller/kafka-manager
### 容器启动
- zookeeper
```
docker run --name some-zookeeper \
--restart always \
-p 2181:2181 \
-d zookeeper
```
- wurstmeister/kafka
```
docker run --name kafka \
-p 9092:9092 \
-e KAFKA_ADVERTISED_HOST_NAME=192.168.3.36 \
-e KAFKA_CREATE_TOPICS="test:1:1" \
-e KAFKA_ZOOKEEPER_CONNECT=192.168.3.36:2181 \
-d wurstmeister/kafka
```
- sheepkiller/kafka-manager
```
docker run -itd \
--restart=always \
--name=kafka-manager \
-p 9000:9000 \
-e ZK_HOSTS="192.168.3.36:2181" \
sheepkiller/kafka-manager
```
### kafka配置
- 添加Topic
```
docker exec -it kafka /bin/bash
/opt/kafka/bin/kafka-topics.sh --create --zookeeper 192.168.3.36:2181 --replication-factor 1 --partitions 1 --topic monitor
```
### Basic Operations
- Send messages
```
/opt/kafka/bin/kafka-console-producer.sh --broker-list 192.168.3.36:9092 --topic monitor
```
- Receive messages
```
/opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server 192.168.3.36:9092 --topic monitor --from-beginning
```
| 18.12069 | 123 | 0.697431 | kor_Hang | 0.147127 |
b12da73d5f7e6614251d33f5c22f6da9e1135806 | 24 | md | Markdown | README.md | bluefire2121/kserenity2 | 98153315391b8d99638e9520013748b45f035359 | [
"BSD-3-Clause"
] | null | null | null | README.md | bluefire2121/kserenity2 | 98153315391b8d99638e9520013748b45f035359 | [
"BSD-3-Clause"
] | null | null | null | README.md | bluefire2121/kserenity2 | 98153315391b8d99638e9520013748b45f035359 | [
"BSD-3-Clause"
] | null | null | null | # kserenity2
kserenity2
| 8 | 12 | 0.833333 | pol_Latn | 0.630899 |
b12dd20b23df96f7b517d0c54c63d039019a2467 | 824 | md | Markdown | CHANGELOG.md | TeoDevM/discord-bot-1 | a5c446c7b70902c439747cb8f3a67d162e6664f5 | [
"MIT"
] | 3 | 2021-04-06T03:44:18.000Z | 2021-05-16T20:43:58.000Z | CHANGELOG.md | TeoDevM/discord-bot-1 | a5c446c7b70902c439747cb8f3a67d162e6664f5 | [
"MIT"
] | 21 | 2021-08-30T00:54:59.000Z | 2022-03-31T00:19:48.000Z | CHANGELOG.md | TeoDevM/discord-bot-1 | a5c446c7b70902c439747cb8f3a67d162e6664f5 | [
"MIT"
] | 9 | 2021-02-28T02:56:48.000Z | 2021-05-25T19:30:43.000Z | ## 0.1.0 (2021-05-19)
### Fix
- **procfile**: add the procfile and the heroku deploy
- move from procfile to Procfile
- The lint errors and whitespace
- change config file tox.ini for setup.cfg
- change config file tox.ini for setup.cfg
- **Pipfile-and-Main-Files**: Fix the flake8 error and format the main.py
- **Pipfile**: fix the flake8 command
### Feat
- add codeowners
- add codeowners
- Add the status
- **main.py**: register the commands in the main file
- **Add-the-cita-and-joke-command**: create this commands
- add the avatar, moderation, and help command
- add flake8 config
- **Add-Commands**: Add the On member join, On member leave and ping command
- **deploy**: add procfile for heroku
- create basic bot client
- add github actions workflow for ci
- add dependencies and "hello world"
- create pipfile
| 29.428571 | 76 | 0.724515 | eng_Latn | 0.971586 |
b12e15c0518b4dcfba213af3db26672915a1698b | 400 | md | Markdown | docs/Terpenes.md | conflabs/wcia-assays | b118d39af17be6900e6eee2d94809adb82375d39 | [
"MIT"
] | null | null | null | docs/Terpenes.md | conflabs/wcia-assays | b118d39af17be6900e6eee2d94809adb82375d39 | [
"MIT"
] | 1 | 2022-01-17T03:14:05.000Z | 2022-01-17T03:14:29.000Z | docs/Terpenes.md | conflabs/wcia-assays | b118d39af17be6900e6eee2d94809adb82375d39 | [
"MIT"
] | null | null | null | # Terpenes
A list of terpene assays for use in interoperability.
----------------------------------------
## Terpene Assay
* ULID: `018NY6XC0066H2G9CBJ3Z6SAHK`
* Assay Name: `terpene assay`
* Common Names: `terpenes, terpene test`
```json
{
"ulid": "018NY6XC0066H2G9CBJ3Z6SAHK",
"assay_name": "terpene assay",
"common_names": [
"terpenes",
"terpene test"
]
}
```
| 18.181818 | 54 | 0.5825 | yue_Hant | 0.123234 |
b12e2558eb7a7c89e749f7f446c43bb6cb90b101 | 1,352 | md | Markdown | README.md | 5c077m4n/iterable-ops | b4d150b8e8d485fef84da64bccde90efbe76afcb | [
"Unlicense"
] | 1 | 2019-12-10T09:50:16.000Z | 2019-12-10T09:50:16.000Z | README.md | 5c077m4n/iterable-ops | b4d150b8e8d485fef84da64bccde90efbe76afcb | [
"Unlicense"
] | 4 | 2020-04-13T08:20:12.000Z | 2022-01-22T09:59:50.000Z | README.md | 5c077m4n/iterable-ops | b4d150b8e8d485fef84da64bccde90efbe76afcb | [
"Unlicense"
] | null | null | null | # Lazy Piping
[](https://travis-ci.org/5c077m4n/iterable-ops)
[](https://coveralls.io/github/5c077m4n/iterable-ops?branch=master)
[](https://snyk.io/test/github/5c077m4n/iterable-ops?targetFile=package.json)
A tiny package to allow lazy operations on all iterators. The whole point of this package is to allow saving of operations and only calculating them when needed (by calling the `.get()` function).
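The idea can be sketched in a few lines of plain JavaScript. This is not the library's actual internals — just a minimal stand-in showing how operations can be stored and only executed when `.get()` is called:

```javascript
// Minimal sketch of lazy piping: pipe() only records operations,
// and nothing runs until get() is called.
function from(iterable) {
  const ops = []
  return {
    pipe(...fns) { ops.push(...fns); return this },
    get() {
      let result = [...iterable]
      for (const op of ops) result = op(result) // work happens here
      return result
    }
  }
}
const map = fn => arr => arr.map(fn)
const filter = fn => arr => arr.filter(fn)

const lazy = from([1, 2, 3, 4]).pipe(map(x => x * 2), filter(x => x > 4))
// Nothing has been computed yet; the pipeline runs only now:
console.log(lazy.get()) // [ 6, 8 ]
```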
Just type into the terminal:
```bash
npm install --save lazy-piping
```
Then, in your code itself:
```javascript
const { from, map, filter } = require('lazy-piping');
```
or:
```javascript
import { from, map, filter } from 'lazy-piping';
```
or (as a script in your HTML):
```html
<script src="https://unpkg.com/lazy-piping@latest/packages/lazy-piping.umd/src/index.js"></script>
```
```javascript
const { from, map, filter } = LazyPiping;
```
And you're good to go!
```javascript
from([1, 2, 3, 4, 5, 6, 7, 8])
.pipe(
map(x => x * 2),
filter(x => x % 3)
)
.get();
```
**Happy iterating! ;)**
| 27.591837 | 196 | 0.677515 | eng_Latn | 0.394309 |
b12e5882d7895be26cf299887305feb96141d7cb | 7,094 | md | Markdown | videos/vuex-fundamentals/markdown/1. Intro to Vuex.md | muhamed-didovic/vmdown | 377805b303163b6e495331e619cc13a5600d8d45 | [
"MIT"
] | null | null | null | videos/vuex-fundamentals/markdown/1. Intro to Vuex.md | muhamed-didovic/vmdown | 377805b303163b6e495331e619cc13a5600d8d45 | [
"MIT"
] | null | null | null | videos/vuex-fundamentals/markdown/1. Intro to Vuex.md | muhamed-didovic/vmdown | 377805b303163b6e495331e619cc13a5600d8d45 | [
"MIT"
] | null | null | null | # Intro to Vuex
In this course, we’ll be exploring the fundamentals of Vuex: Vue’s state management library. If you’ve been following along with our beginner path, this course will pick up where [Real World Vue 3](https://www.vuemastery.com/courses/real-world-vue3/rwv3-orientation) left off. By the end of this course, you’ll have a solid understanding of when and why to use Vuex, and you’ll be empowered to implement it within your own Vue apps. Lesson by lesson, we’ll be adding Vuex to the example app that we created in the Real World Vue 3 course.
But before we get started writing any code, we need to understand the rationale behind Vuex, and look at an example use case that illustrates the different parts of Vuex and how it all works together.
---
## **The Case for State Management**
Managing state in an application full of components can be difficult. Facebook discovered this the hard way and created the Flux pattern, which is what Vuex is based upon. Vuex is Vue’s own state management pattern and library. In this lesson, we’ll look at why an application might need Vuex, and how it can enhance your app.
When we talk about state, we mean the data that your components depend on and render. Things like blog posts, to-do items, and so on. Without Vuex, as your app grows, each Vue component might have its own version of state.

But if one component changes its state, and a distant relative is also using that same state, we need to communicate that change. There’s the default way of communicating events up and passing props down to share data, but that can become overly complicated.

Instead, we can consolidate all of our state into one place. One location that contains the current state of our entire application. One single source of truth.
---
**A Single Source of Truth** This is what Vuex provides, and every component has direct access to this global State.
Just like the Vue instance’s data, this State is reactive. When one component updates the State, other components that are using that data get notified, automatically receiving the new value.

But just consolidating data into a single source of truth doesn’t fully solve the problems of managing state. What happens when many components alter the State in different ways, from different locations?
We need some standardization. Otherwise, changes to our State could be unpredictable and untraceable.
---
## **A State Management Pattern**
This is why Vuex provides a full state management pattern for a simple and standardized way to make state changes. And if you’re familiar with Vue, Vuex should look quite similar.

Just as you can create a new Vue instance (or Vue app) with `createApp()`, you can create a Vuex store with `createStore()`
While the Vue instance has a `data` property, the Vuex store has `state`. Both are _reactive_.
And while the instance has `methods`, which among other things can update `data`, the store has `actions`, which can update the state.
And while the instance has computed properties, the store has `getters`, which allow us to access a filtered, derived, or computed version of our `state`.
Additionally, Vuex provides a way to _track_ state changes, with something called `mutations`. We can use `actions` to commit `mutations`.
At the time of this writing, the Vue DevTools aren't ready yet for Vue 3, but when they are, we can expect to be able to trace back in time through a record of each mutation that was committed to the state.
---
## An example Vuex Store
Now let’s take a look at an example Vuex Store.

In our **State**, we have an `isLoading` property, along an array for `todos`.
Below that we have a **Mutation** to switch our `isLoading` state between `true` and `false`. Along with a Mutation to set our state with the todos that we’ll receive from an API call in our action below.
Our **Action** here has multiple steps. First, it’ll commit the Mutation to set the `isLoading` status to `true`. Then it’ll make an API call, and when the response returns, it will commit the Mutation to set the `isLoading` status to `false`. Finally it’ll commit the Mutation to set the state of our `todos` with the response we got back from our API.
If we need the ability to only retrieve the todos that are labeled done, we can use a Getter for that, which will retrieve only the specific state that we want.

Now let’s take a look at this in motion.
---
## **Vuex in Motion**

---
## Next up…
Hopefully you now understand why you might need Vuex and how it can help enhance your application by providing a single source of truth for your State, along with a common library of Actions, Mutations and Getters.
In the next lesson, we’ll start implementing Vuex into our example application we built in Real World Vue 3. | 81.54023 | 538 | 0.792078 | eng_Latn | 0.981167 |
b12fa4650f538dcab716212e85f3c58d235e1ca3 | 287 | md | Markdown | README.md | bs10reh/CA3-neuron-model-ACh-NMDA-spikes | dae5117427cb798391bb13ac38ebbb177f58aa1d | [
"MIT"
] | null | null | null | README.md | bs10reh/CA3-neuron-model-ACh-NMDA-spikes | dae5117427cb798391bb13ac38ebbb177f58aa1d | [
"MIT"
] | null | null | null | README.md | bs10reh/CA3-neuron-model-ACh-NMDA-spikes | dae5117427cb798391bb13ac38ebbb177f58aa1d | [
"MIT"
] | null | null | null | # CA3-neuron-model-ACh-NMDA-spikes
This model was used to compare the nonlinearity of NMDA inputs between dendritic sections in a CA3 pyramidal neuron as well as investigate the effect of cholinergic modulation/potassium channel inhibition on this dendritic NMDA-mediated nonlinearity.
| 95.666667 | 251 | 0.836237 | eng_Latn | 0.997754 |
b130475af2a919b50b5e1da17f597abc89a3dfda | 20 | md | Markdown | README.md | Daninjakiwi/Escape-Room-Client | a67328feb4ec51e1d8d681804ae9831deb9c2181 | [
"MIT"
] | null | null | null | README.md | Daninjakiwi/Escape-Room-Client | a67328feb4ec51e1d8d681804ae9831deb9c2181 | [
"MIT"
] | null | null | null | README.md | Daninjakiwi/Escape-Room-Client | a67328feb4ec51e1d8d681804ae9831deb9c2181 | [
"MIT"
] | null | null | null | # Escape-Room-Client | 20 | 20 | 0.8 | eng_Latn | 0.528489 |
b1306e33f608931a8078bd24b6a08bc29e1d6ca1 | 4,689 | md | Markdown | README.md | Spotika4/yii2 | e056b7941641197ae4fc822b2549b4cc181a0b49 | [
"BSD-3-Clause"
] | null | null | null | README.md | Spotika4/yii2 | e056b7941641197ae4fc822b2549b4cc181a0b49 | [
"BSD-3-Clause"
] | null | null | null | README.md | Spotika4/yii2 | e056b7941641197ae4fc822b2549b4cc181a0b49 | [
"BSD-3-Clause"
] | null | null | null | <h1 align="center">Yii 2 Advanced Project Template by Spotika4</h1>
A project template based on the [Yii 2](http://www.yiiframework.com/) framework.<br /><br />
<h2>Installing and running the project</h2>
After initializing Composer, perform the following steps:
> Initialize the application from the console
```
php init
```
> Update the database connection settings
```
common/config/main-local.php
```
> Adjust the required RBAC settings
```
common/config/rbac
```
> Apply the RBAC settings by running the controller
```
php yii rbac/initialize
```
> Save the default users to the database
```
php yii rbac/assignment
```
<h2>Users</h2>
User CRUD is already built into the project and is based on Yii2 users. The following functionality already has its own processors:
<ul>
    <li>create</li>
    <li>view</li>
    <li>edit</li>
    <li>delete</li>
    <li>personal account</li>
</ul>
<h2>RBAC</h2>
RBAC is already built into the project and is based on the Yii2 component of the same name. The following functionality already has its own processors:
<ul>
    <li>registration</li>
    <li>login</li>
    <li>password recovery</li>
    <li>email change</li>
    <li>password change</li>
</ul>
<h2>Processors</h2>
Most of the logic is based on scenarios in Yii2 models and on model descendants called processors.
Processors extend a Yii2 model and encapsulate business processes for repeated reuse:
<ul>
    <li>scenario execution status</li>
    <li>a message about the scenario execution status</li>
    <li>data obtained from the model, kept separate from attributes obtained in the processor (convenient for listings)</li>
</ul>
<h2>Contexts</h2>
The list of application interfaces.
<h3>Console</h3>
The standard Yii2 console context.
It currently provides the following functionality:
<ul>
    <li>migrations for initializing the project</li>
    <li>a controller for updating the RBAC settings</li>
    <li>a controller for adding users from a list</li>
</ul>
<h3>Backend</h3>
The administrative interface. For correct operation it uses a jQuery component that encapsulates the core logic and the settings of third-party JS/CSS components.<br />
The interface currently provides the following functionality:
<ul>
    <li>user CRUD</li>
    <li>login</li>
    <li>registration</li>
    <li>password recovery</li>
    <li>user profile</li>
    <li>email change with confirmation</li>
    <li>password change</li>
</ul>
<h3>Frontend</h3>
The demo site interface.
<h2>Changes to the standard Yii2 functionality</h2>
The following functionality has been removed from the standard Yii2 structure:
<ul>
    <li>test execution and generation</li>
    <li>vagrant</li>
</ul>
The following packages have been removed from the standard Yii2 structure:
<ul>
    <li>yiisoft/yii2-gii</li>
    <li>yiisoft/yii2-faker</li>
    <li>yiisoft/yii2-bootstrap</li>
    <li>codeception/base</li>
    <li>codeception/verify</li>
    <li>phpunit/phpunit</li>
    <li>symfony/browser-kit</li>
    <li>bower-asset/jquery-ui</li>
</ul>
Project files and folders
-------------------
```
common
    actions/             shared actions
    assets/              shared asset bundles for third-party packages
    components/          project components
    config/              shared configuration
        rbac/            RBAC settings (used during project initialization)
    mail/                view files for sending email
    messages/            shared translation files
    models/              shared model files
    processors/          shared processor files
console
    config/              configuration that overrides the shared configuration
    controllers/         controller files
    migrations/          database migrations
    models/              model files
    runtime/             generated files such as logs, cache, etc.
backend
    assets/              asset bundles for third-party packages
    config/              configuration that overrides the shared configuration
    controllers/         controller files
    messages/            translation files
    models/              model files
    runtime/             generated files such as logs, cache, etc.
    views/               view files
    web/                 scripts and resources accessible from the web
frontend
    assets/              asset bundles for third-party packages
    config/              configuration that overrides the shared configuration
    controllers/         controller files
    messages/            translation files
    models/              model files
    runtime/             generated files such as logs, cache, etc.
    views/               view files
    web/                 scripts and resources accessible from the web
vendor/                  third-party packages
environments/            data for application deployment
```
| 30.448052 | 168 | 0.686287 | rus_Cyrl | 0.905881 |
b130befcc211e20a298709efcb7b53dd32321213 | 207 | md | Markdown | content/zh/docs/ops/common-problems/_index.md | JuwanXu/istio.io | 89e89ff684bfa26d3e3e034e9ef59abf56830448 | [
"Apache-2.0"
] | 1 | 2019-11-23T09:01:14.000Z | 2019-11-23T09:01:14.000Z | content/zh/docs/ops/common-problems/_index.md | JuwanXu/istio.io | 89e89ff684bfa26d3e3e034e9ef59abf56830448 | [
"Apache-2.0"
] | null | null | null | content/zh/docs/ops/common-problems/_index.md | JuwanXu/istio.io | 89e89ff684bfa26d3e3e034e9ef59abf56830448 | [
"Apache-2.0"
] | null | null | null | ---
title: 普通问题
description: 描述如何辨认和解决 Istio 中的普通问题。
weight: 70
keywords: [ops]
aliases:
- /zh/help/ops/troubleshooting
- /zh/help/ops/traffic-management/troubleshooting
- /zh/help/ops/setup
---
| 18.818182 | 53 | 0.700483 | yue_Hant | 0.524278 |
b13151c43cb77b1441e26feb2aed74c46f97bd53 | 27,601 | md | Markdown | articles/notification-hubs/notification-hubs-chrome-push-notifications-get-started.md | OpenLocalizationTestOrg/azure-docs-pr16_de-DE | bf18172a4f9060051b3861ff8930d9f0303f7f10 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/notification-hubs/notification-hubs-chrome-push-notifications-get-started.md | OpenLocalizationTestOrg/azure-docs-pr16_de-DE | bf18172a4f9060051b3861ff8930d9f0303f7f10 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/notification-hubs/notification-hubs-chrome-push-notifications-get-started.md | OpenLocalizationTestOrg/azure-docs-pr16_de-DE | bf18172a4f9060051b3861ff8930d9f0303f7f10 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | <properties
    pageTitle="Send push notifications to Chrome apps with Azure Notification Hubs | Microsoft Azure"
    description="Learn how to use Azure Notification Hubs to send push notifications to a Chrome app."
    services="notification-hubs"
    keywords="mobile push notifications, push notifications, push notification, chrome push notifications"
documentationCenter=""
authors="ysxu"
manager="erikre"
editor=""/>
<tags
ms.service="notification-hubs"
ms.workload="mobile"
ms.tgt_pltfrm="mobile-chrome"
ms.devlang="JavaScript"
ms.topic="hero-article"
ms.date="10/03/2016"
ms.author="yuaxu"/>
# <a name="send-push-notifications-to-chrome-apps-with-azure-notification-hubs"></a>Send push notifications to Chrome apps with Azure Notification Hubs
[AZURE.INCLUDE [notification-hubs-selector-get-started](../../includes/notification-hubs-selector-get-started.md)]
This topic shows you how to use Azure Notification Hubs to send push notifications to a Chrome app, which runs within the context of the Google Chrome browser. In this tutorial, we create a Chrome app that receives push notifications by using [Google Cloud Messaging (GCM)](https://developers.google.com/cloud-messaging/).
>[AZURE.NOTE] To complete this tutorial, you must have an active Azure account. If you don't have an account, you can create a free trial account in just a few minutes. For details, see [Azure Free Trial](https://azure.microsoft.com/pricing/free-trial/?WT.mc_id=A0E0E5C02&returnurl=http%3A%2F%2Fazure.microsoft.com%2Fen-us%2Fdocumentation%2Farticles%notification-hubs-chrome-get-started%2F).
The tutorial walks you through the following basic steps to enable push notifications:
* [Enable Google Cloud Messaging](#register)
* [Configure your notification hub](#configure-hub)
* [Connect your Chrome app to the notification hub](#connect-app)
* [Send a push notification to your Chrome app](#send)
* [Additional features and functionality](#next-steps)
>[AZURE.NOTE] Chrome app push notifications are not generic in-browser notifications; they are specific to the browser extensibility model (see [Übersicht über Chrome-Apps] for details). In addition to running in the desktop browser, Chrome apps also run on mobile (Android and iOS) via Apache Cordova. See [Chrome Apps Mobile] for more.
Setting up GCM and Azure Notification Hubs is similar to the Android setup, because [Google Cloud Messaging für Chrome] is deprecated and the same GCM now supports both Android devices and Chrome instances.
##<a name="a-idregisteraenable-google-cloud-messaging"></a><a id="register"></a>Enable Google Cloud Messaging
1. Navigate to the [Google-Cloud-Verwaltungskonsole] (Google Cloud Console) website, sign in with your Gmail account credentials, and then click the **Create Project** button. Enter an appropriate **Project name**, and then click the **Create** button.
    ![Google Cloud Console - Create Project][1]
2. Note the **Project Number** on the **Projects** page for the project you just created. You will use it as the **GCM sender ID** in the Chrome app to register with GCM.
    ![Google Cloud Console - Project Number][2]
3. In the left pane, click **APIs & auth**, scroll down, and click the toggle to enable **Google Cloud Messaging for Android**. You do not need to enable **Google Cloud Messaging for Chrome**.
    ![Google Cloud Console - Server Key][3]
4. In the left pane, click **Credentials** > **Create new Key** > **Server key** > **Create**.
    ![Google Cloud Console - Credentials][4]
5. Note the server **API key**. You will configure it in the notification hub next, so that the hub can send push notifications to GCM.
    ![Google Cloud Console - API Key][5]
##<a name="a-idconfigure-hubaconfigure-your-notification-hub"></a><a id="configure-hub"></a>Configure your notification hub
[AZURE.INCLUDE [notification-hubs-portal-create-new-hub](../../includes/notification-hubs-portal-create-new-hub.md)]
  6. In the **Settings** blade, select **Notification Services** and then **Google (GCM)**. Enter the API key and save.
  
##<a name="a-idconnect-appaconnect-your-chrome-app-to-the-notification-hub"></a><a id="connect-app"></a>Connect your Chrome app to the notification hub
Your notification hub is now configured to work with GCM, and you have the connection strings needed to register your app to receive and to send push notifications.
###<a name="create-a-new-chrome-app"></a>Create a new Chrome app
The following example is based on the [Chrome App GCM Stichproben] (the Chrome App GCM sample) and uses the recommended way to create a Chrome app. We highlight the steps that specifically relate to Azure Notification Hubs.
>[AZURE.NOTE] We recommend that you download the source for this Chrome app from the [Chrome App Benachrichtigung Hub (Beispiel)] sample.
The Chrome app is written in JavaScript, and you can use your favorite editor to create it. The following shows what this Chrome app looks like.
![Google Chrome App][15]
1. Create a folder and name it `ChromePushApp`. The name is of course arbitrary; if you name it something else, make sure you substitute the path in the relevant code segments.
2. Download the [kryptomobilität Js Bibliothek] (the crypto-js library) into the folder you created in the previous step. This library folder contains two subfolders: `components` and `rollups`.
3. Create a `manifest.json` file. All Chrome apps are backed by a manifest file that contains the app metadata and, most importantly, all the permissions that are granted to the app when the user installs it.
{
"name": "NH-GCM Notifications",
"description": "Chrome platform app.",
"manifest_version": 2,
"version": "0.1",
"app": {
"background": {
"scripts": ["background.js"]
}
},
"permissions": ["gcm", "storage", "notifications", "https://*.servicebus.windows.net/*"],
"icons": { "128": "gcm_128.png" }
}
    Note the `permissions` element, which specifies that this Chrome app can receive push notifications from GCM. You must also specify the Azure Notification Hubs URI where the Chrome app will make its REST registration call.
    Our sample app also uses an icon file, `gcm_128.png`, which you can find in the source; it is reused from the original GCM sample. You can replace it with any image that meets the [icon criteria](https://developer.chrome.com/apps/manifest/icons).
4. Create a file named `background.js` with the following code:
// Returns a new notification ID used in the notification.
function getNotificationId() {
var id = Math.floor(Math.random() * 9007199254740992) + 1;
return id.toString();
}
function messageReceived(message) {
// A message is an object with a data property that
// consists of key-value pairs.
// Concatenate all key-value pairs to form a display string.
var messageString = "";
for (var key in message.data) {
if (messageString != "")
messageString += ", "
messageString += key + ":" + message.data[key];
}
console.log("Message received: " + messageString);
// Pop up a notification to show the GCM message.
chrome.notifications.create(getNotificationId(), {
title: 'GCM Message',
iconUrl: 'gcm_128.png',
type: 'basic',
message: messageString
}, function() {});
}
var registerWindowCreated = false;
function firstTimeRegistration() {
chrome.storage.local.get("registered", function(result) {
registerWindowCreated = true;
chrome.app.window.create(
"register.html",
{ width: 520,
height: 500,
frame: 'chrome'
},
function(appWin) {}
);
});
}
// Set up a listener for GCM message event.
chrome.gcm.onMessage.addListener(messageReceived);
// Set up listeners to trigger the first-time registration.
chrome.runtime.onInstalled.addListener(firstTimeRegistration);
chrome.runtime.onStartup.addListener(firstTimeRegistration);
    This is the file that brings up the HTML window (**register.html**) of the Chrome app and also defines the event handler **messageReceived** to handle incoming push notifications.
5. Create a file named `register.html`. This defines the UI of the Chrome app.
    >[AZURE.NOTE] This example uses **CryptoJS v3.1.2**. If you downloaded a different version of the library, make sure you substitute the correct version in the `src` path.
<html>
<head>
<title>GCM Registration</title>
<script src="register.js"></script>
<script src="CryptoJS v3.1.2/rollups/hmac-sha256.js"></script>
<script src="CryptoJS v3.1.2/components/enc-base64-min.js"></script>
</head>
<body>
Sender ID:<br/><input id="senderId" type="TEXT" size="20"><br/>
<button id="registerWithGCM">Register with GCM</button>
<br/>
<br/>
<br/>
Notification Hub Name:<br/><input id="hubName" type="TEXT" style="width:400px"><br/><br/>
Connection String:<br/><textarea id="connectionString" type="TEXT" style="width:400px;height:60px"></textarea>
<br/>
<button id="registerWithNH" disabled="true">Register with Azure Notification Hubs</button>
<br/>
<br/>
<textarea id="console" type="TEXT" readonly style="width:500px;height:200px;background-color:#e5e5e5;padding:5px"></textarea>
</body>
</html>
6. Create a file named `register.js` with the following code. This file is the script behind `register.html`. Chrome apps do not allow inline script execution, so you must create a separate backing script for the UI.
var registrationId = "";
var hubName = "", connectionString = "";
var originalUri = "", targetUri = "", endpoint = "", sasKeyName = "", sasKeyValue = "", sasToken = "";
window.onload = function() {
document.getElementById("registerWithGCM").onclick = registerWithGCM;
document.getElementById("registerWithNH").onclick = registerWithNH;
updateLog("You have not registered yet. Please provider sender ID and register with GCM and then with Notification Hubs.");
}
function updateLog(status) {
currentStatus = document.getElementById("console").innerHTML;
if (currentStatus != "") {
currentStatus = currentStatus + "\n\n";
}
document.getElementById("console").innerHTML = currentStatus + status;
}
function registerWithGCM() {
var senderId = document.getElementById("senderId").value.trim();
chrome.gcm.register([senderId], registerCallback);
// Prevent register button from being clicked again before the registration finishes.
document.getElementById("registerWithGCM").disabled = true;
}
function registerCallback(regId) {
registrationId = regId;
document.getElementById("registerWithGCM").disabled = false;
if (chrome.runtime.lastError) {
// When the registration fails, handle the error and retry the
// registration later.
updateLog("Registration failed: " + chrome.runtime.lastError.message);
return;
}
updateLog("Registration with GCM succeeded.");
document.getElementById("registerWithNH").disabled = false;
// Mark that the first-time registration is done.
chrome.storage.local.set({registered: true});
}
function registerWithNH() {
hubName = document.getElementById("hubName").value.trim();
connectionString = document.getElementById("connectionString").value.trim();
splitConnectionString();
generateSaSToken();
sendNHRegistrationRequest();
}
// From http://msdn.microsoft.com/library/dn495627.aspx
function splitConnectionString()
{
var parts = connectionString.split(';');
if (parts.length != 3)
throw "Error parsing connection string";
parts.forEach(function(part) {
if (part.indexOf('Endpoint') == 0) {
endpoint = 'https' + part.substring(11);
} else if (part.indexOf('SharedAccessKeyName') == 0) {
sasKeyName = part.substring(20);
} else if (part.indexOf('SharedAccessKey') == 0) {
sasKeyValue = part.substring(16);
}
});
originalUri = endpoint + hubName;
}
function generateSaSToken()
{
targetUri = encodeURIComponent(originalUri.toLowerCase()).toLowerCase();
var expiresInMins = 10; // 10 minute expiration
// Set expiration in seconds.
var expireOnDate = new Date();
expireOnDate.setMinutes(expireOnDate.getMinutes() + expiresInMins);
var expires = Date.UTC(expireOnDate.getUTCFullYear(), expireOnDate
.getUTCMonth(), expireOnDate.getUTCDate(), expireOnDate
.getUTCHours(), expireOnDate.getUTCMinutes(), expireOnDate
.getUTCSeconds()) / 1000;
var tosign = targetUri + '\n' + expires;
// Using CryptoJS.
var signature = CryptoJS.HmacSHA256(tosign, sasKeyValue);
var base64signature = signature.toString(CryptoJS.enc.Base64);
var base64UriEncoded = encodeURIComponent(base64signature);
// Construct authorization string.
sasToken = "SharedAccessSignature sr=" + targetUri + "&sig="
+ base64UriEncoded + "&se=" + expires + "&skn=" + sasKeyName;
}
function sendNHRegistrationRequest()
{
var registrationPayload =
"<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
"<entry xmlns=\"http://www.w3.org/2005/Atom\">" +
"<content type=\"application/xml\">" +
"<GcmRegistrationDescription xmlns:i=\"http://www.w3.org/2001/XMLSchema-instance\" xmlns=\"http://schemas.microsoft.com/netservices/2010/10/servicebus/connect\">" +
"<GcmRegistrationId>{GCMRegistrationId}</GcmRegistrationId>" +
"</GcmRegistrationDescription>" +
"</content>" +
"</entry>";
// Update the payload with the registration ID obtained earlier.
registrationPayload = registrationPayload.replace("{GCMRegistrationId}", registrationId);
var url = originalUri + "/registrations/?api-version=2014-09";
var client = new XMLHttpRequest();
client.onload = function () {
if (client.readyState == 4) {
if (client.status == 200) {
updateLog("Notification Hub Registration succesful!");
updateLog(client.responseText);
} else {
updateLog("Notification Hub Registration did not succeed!");
updateLog("HTTP Status: " + client.status + " : " + client.statusText);
updateLog("HTTP Response: " + "\n" + client.responseText);
}
}
};
client.onerror = function () {
updateLog("ERROR - Notification Hub Registration did not succeed!");
}
client.open("POST", url, true);
client.setRequestHeader("Content-Type", "application/atom+xml;type=entry;charset=utf-8");
client.setRequestHeader("Authorization", sasToken);
client.setRequestHeader("x-ms-version", "2014-09");
try {
client.send(registrationPayload);
}
catch(err) {
updateLog(err.message);
}
}
The preceding script has the following important parameters:
- **window.onload** defines the click events for the two buttons in the UI. One handler registers with GCM; the other uses the registration ID returned after registering with GCM to register with Azure Notification Hubs.
- **updateLog** is a function that provides simple logging capability.
- **registerWithGCM** is the first button's click handler, which makes the `chrome.gcm.register` call to GCM to register the current instance of the Chrome app.
- **registerCallback** is the callback function that is invoked when the GCM registration call returns.
- **registerWithNH** is the second button's click handler, which registers with Notification Hubs. It takes the `hubName` and `connectionString` (which the user has specified) and calls the Notification Hubs registration REST API.
- **splitConnectionString** and **generateSaSToken** are helpers that represent the JavaScript implementation of the SAS token creation process, which must be used in all REST API calls. For more information, see [Common concepts](http://msdn.microsoft.com/library/dn495627.aspx).
- **sendNHRegistrationRequest** is the function that makes the HTTP (REST) call to Azure Notification Hubs.
- **registrationPayload** defines the XML registration payload. For more information, see [Create registration NH REST API]. The registration ID in the payload is updated with the value received from GCM.
- **client** is an instance of **XMLHttpRequest** that is used to make the HTTP POST request. Note that the `Authorization` header is updated with `sasToken`. Successful completion of this call registers this instance of the Chrome app with Azure Notification Hubs.
The overall folder structure for this project should now look like this: ![Google Chrome App - folder structure][21]
###<a name="set-up-and-test-your-chrome-app"></a>Set up and test your Chrome app
1. Open your Chrome browser. Open **Chrome Extensions** and enable **Developer mode**.
![Google Chrome - enable Developer mode][16]
2. Click **Load unpacked extension** and navigate to the folder where you created the files. Optionally, you can also use the **Chrome Apps & Extensions Developer Tool**. This tool is a Chrome app in its own right (installed from the Chrome Web Store) and provides advanced debugging capabilities for your Chrome app development.
![Google Chrome - load unpacked extension][17]
3. If the Chrome app was created without any errors, it now appears in the list.
![Google Chrome - Chrome app display][18]
4. Enter the **project number** that you obtained earlier from the **Google Cloud Console** as the sender ID, and click **Register with GCM**. You should see the message **Registration with GCM succeeded.**
![Google Chrome - Chrome app customization][19]
5. Enter your **notification hub name** and the **DefaultListenSharedAccessSignature** that you obtained from the portal earlier, and click **Register with Azure Notification Hub**. You should see the message **Notification Hub Registration successful!** and the details of the registration response, which include the Azure Notification Hubs registration ID.
![Google Chrome - enter Notification Hub details][20]
##<a name="send"></a>Send a notification to your Chrome app
For testing purposes, we will send push notifications to the Chrome app by using a .NET console application.
>[AZURE.NOTE] You can send push notifications with Notification Hubs from any back end via our public <a href="http://msdn.microsoft.com/library/windowsazure/dn223264.aspx">REST interface</a>. Check out our [documentation portal](https://azure.microsoft.com/documentation/services/notification-hubs/) for more cross-platform examples.
1. In Visual Studio, on the **File** menu, select **New**, and then select **Project**. Under **Visual C#**, click **Windows** and **Console Application**, and then click **OK**. This creates a new console application project.
2. On the **Tools** menu, click **Library Package Manager**, and then click **Package Manager Console**. This opens the Package Manager Console.
3. In the console window, run the following command:
Install-Package Microsoft.Azure.NotificationHubs
    This adds a reference to the Azure Notification Hubs SDK by using the <a href="http://nuget.org/packages/Microsoft.Azure.NotificationHubs/">Microsoft.Azure.NotificationHubs NuGet package</a>.
4. Open `Program.cs` and add the following `using` statement:
using Microsoft.Azure.NotificationHubs;
5. In the `Program` class, add the following method:
private static async void SendNotificationAsync()
{
NotificationHubClient hub = NotificationHubClient.CreateClientFromConnectionString("<connection string with full access>", "<hub name>");
String message = "{\"data\":{\"message\":\"Hello Chrome from Azure Notification Hubs\"}}";
await hub.SendGcmNativeNotificationAsync(message);
}
    Make sure to replace the `<hub name>` placeholder with the name of the notification hub that appears in the [portal](https://portal.azure.com) on the notification hub blade. Also replace the connection string placeholder with the connection string called `DefaultFullSharedAccessSignature` that you obtained in the notification hub configuration section.
    >[AZURE.NOTE] Make sure that you use the connection string with **Full** access, not **Listen** access. The **Listen** access connection string does not grant permissions to send push notifications.
6. Add the following calls to the `Main` method:
SendNotificationAsync();
Console.ReadLine();
7. Make sure that Chrome is running, and run the console application.
8. The following notification should pop up as a toast on your desktop.
![Google Chrome - notification][13]
9. You can also view all your notifications by using the Chrome Notifications window on the taskbar (in Windows) when Chrome is running.
![Google Chrome - notification list][14]
>[AZURE.NOTE] You do not need to have the Chrome app running or open in the browser (although the Chrome browser itself must be running). You also get a consolidated view of all your notifications in the Chrome Notifications window.
## <a name="next-steps"></a>Next steps
Learn more about Notification Hubs in the [Notification Hubs overview].
If you want to target specific users, see the [Azure Notification Hubs Notify Users] tutorial.
If you want to segment your users by interest groups, you can follow the [Azure Notification Hubs breaking news] tutorial.
<!-- Images. -->
[1]: ./media/notification-hubs-chrome-get-started/GoogleConsoleCreateProject.PNG
[2]: ./media/notification-hubs-chrome-get-started/GoogleProjectNumber.png
[3]: ./media/notification-hubs-chrome-get-started/EnableGCM.png
[4]: ./media/notification-hubs-chrome-get-started/CreateServerKey.png
[5]: ./media/notification-hubs-chrome-get-started/ServerKey.png
[6]: ./media/notification-hubs-chrome-get-started/CreateNH.png
[7]: ./media/notification-hubs-chrome-get-started/NHNamespace.png
[8]: ./media/notification-hubs-chrome-get-started/NamespaceConfigure.png
[9]: ./media/notification-hubs-chrome-get-started/NHConfigure.png
[10]: ./media/notification-hubs-chrome-get-started/NHConfigureGCM.png
[11]: ./media/notification-hubs-chrome-get-started/NHDashboard.png
[12]: ./media/notification-hubs-chrome-get-started/NHConnString.png
[13]: ./media/notification-hubs-chrome-get-started/ChromeNotification.png
[14]: ./media/notification-hubs-chrome-get-started/ChromeNotificationWindow.png
[15]: ./media/notification-hubs-chrome-get-started/ChromeApp.png
[16]: ./media/notification-hubs-chrome-get-started/ChromeExtensions.png
[17]: ./media/notification-hubs-chrome-get-started/ChromeLoadExtension.png
[18]: ./media/notification-hubs-chrome-get-started/ChromeAppLoaded.png
[19]: ./media/notification-hubs-chrome-get-started/ChromeAppGCM.png
[20]: ./media/notification-hubs-chrome-get-started/ChromeAppNH.png
[21]: ./media/notification-hubs-chrome-get-started/FinalFolderView.png
<!-- URLs. -->
[Chrome App Notification Hub sample]: https://github.com/Azure/azure-notificationhubs-samples/tree/master/PushToChromeApps
[Google Cloud Console]: http://cloud.google.com/console
[Azure Classic Portal]: https://manage.windowsazure.com/
[Notification Hubs overview]: notification-hubs-push-notification-overview.md
[Chrome Apps overview]: https://developer.chrome.com/apps/about_apps
[Chrome App GCM sample]: https://github.com/GoogleChrome/chrome-app-samples/tree/master/samples/gcm-notifications
[Installable Web Apps]: https://developers.google.com/chrome/apps/docs/
[Chrome Apps on Mobile]: https://developer.chrome.com/apps/chrome_apps_on_mobile
[Create registration NH REST API]: http://msdn.microsoft.com/library/azure/dn223265.aspx
[CryptoJS library]: http://code.google.com/p/crypto-js/
[GCM with Chrome Apps]: https://developer.chrome.com/apps/cloudMessaging
[Google Cloud Messaging for Chrome]: https://developer.chrome.com/apps/cloudMessagingV1
[Azure Notification Hubs Notify Users]: notification-hubs-aspnet-backend-windows-dotnet-wns-notification.md
[Azure Notification Hubs breaking news]: notification-hubs-windows-notification-dotnet-push-xplat-segmented-wns.md
---
title: Set up bank accounts
description: "You can reconcile bank accounts in Dynamics NAV with statements from the bank."
documentationcenter:
author: SorenGP
ms.prod: dynamics-nav-2017
ms.topic: article
ms.devlang: na
ms.tgt_pltfrm: na
ms.workload: na
ms.search.keywords: feed, stream
ms.date: 09/26/2017
ms.author: sgroespe
ms.translationtype: HT
ms.sourcegitcommit: 4fefaef7380ac10836fcac404eea006f55d8556f
ms.openlocfilehash: 3581615fe94006aa9245f5e66fe6cf475b22acfb
ms.contentlocale: sv-se
ms.lasthandoff: 10/16/2017
---
# <a name="how-to-set-up-bank-accounts"></a>How to set up bank accounts
You use bank accounts in [!INCLUDE[d365fin](includes/d365fin_md.md)] to keep track of your banking transactions. Accounts can be denominated in your local currency or in a foreign currency. After you have set up bank accounts, you can also use the check printing feature.
## <a name="to-set-up-bank-accounts"></a>To set up bank accounts
1. Choose the  icon, enter **Bank Accounts**, and then choose the related link.
2. In the **Bank Accounts** window, choose the **New** action.
3. Fill in the fields as necessary. [!INCLUDE[tooltip-inline-tip](includes/tooltip-inline-tip_md.md)]
    > [!NOTE]
    > To fill in the **Balance** field with an opening balance, you must post a bank account ledger entry with the amount in question. You can do this by performing a bank account reconciliation. For more information, see [How to: Reconcile Bank Accounts Separately](bank-how-reconcile-bank-accounts-separately.md). Alternatively, you can implement the opening balance as part of creating general data in new companies by using the **Migrate business data** assisted setup. For more information, see [Welcome to [!INCLUDE[d365fin](includes/d365fin_md.md)]](index.md).
## <a name="to-set-up-your-bank-account-for-import-or-export-of-bank-files"></a>To set up your bank account for import or export of bank files
The fields on the **Transfer** FastTab in the **Bank Account Card** window relate to the import and export of bank feeds and files. For more information, see [How to: Set Up the Bank Data Conversion Service](bank-how-setup-bank-data-conversion-service.md).
1. Choose the  icon, enter **Bank Accounts**, and then choose the related link.
2. Open the card for a bank account that you will export or import bank files for.
3. On the **Transfer** FastTab, fill in the required fields. [!INCLUDE[tooltip-inline-tip](includes/tooltip-inline-tip_md.md)]
    > [!NOTE]
    > Other file export services and their formats require different setup values in the **Bank Account Card** window. You are informed about which setup values are wrong or missing when you try to export the file, so read the short descriptions of the fields or see the related procedure in linked topics. For example, exporting a payment file for North American electronic funds transfer requires that both the **Last Remittance Advice No.** field and the **Transit No.** field are filled in. For more information, see [How to: Export Payments to a Bank File](payables-how-export-payments-bank-file.md).
## <a name="to-set-up-vendor-bank-accounts-for-export-of-bank-files"></a>To set up vendor bank accounts for export of bank files
The fields on the **Transfer** FastTab in the **Vendor Bank Account Card** window relate to the export of bank feeds and files. For more information, see [How to: Set Up the Bank Data Conversion Service](bank-how-setup-bank-data-conversion-service.md) and [How to: Export Payments to a Bank File](payables-how-export-payments-bank-file.md).
1. Choose the  icon, enter **Vendors**, and then choose the related link.
2. Open the card for a vendor whose bank account you will export payment bank files to.
3. Choose the **Bank Accounts** action.
4. In the **Vendor Bank Account Card** window, on the **Transfer** FastTab, fill in the required fields. [!INCLUDE[tooltip-inline-tip](includes/tooltip-inline-tip_md.md)]
## <a name="to-set-the-opening-balance-on-new-bank-accounts"></a>To set the opening balance on new bank accounts
## <a name="see-also"></a>See also
[Setting Up Banking](bank-setup-banking.md)
[Managing Bank Accounts](bank-manage-bank-accounts.md)
[Working with [!INCLUDE[d365fin](includes/d365fin_md.md)]](ui-work-product.md)
abind
==========
<!---
This file is generated by ape-tmpl. Do not update manually.
--->
<!-- Badge Start -->
<a name="badges"></a>
[![Build Status][bd_travis_shield_url]][bd_travis_url]
[![Code Climate][bd_codeclimate_shield_url]][bd_codeclimate_url]
[![Code Coverage][bd_codeclimate_coverage_shield_url]][bd_codeclimate_url]
[![npm Version][bd_npm_shield_url]][bd_npm_url]
[![JS Standard][bd_standard_shield_url]][bd_standard_url]
[bd_repo_url]: https://github.com/a-labo/abind
[bd_travis_url]: http://travis-ci.org/a-labo/abind
[bd_travis_shield_url]: http://img.shields.io/travis/a-labo/abind.svg?style=flat
[bd_travis_com_url]: http://travis-ci.com/a-labo/abind
[bd_travis_com_shield_url]: https://api.travis-ci.com/a-labo/abind.svg?token=
[bd_license_url]: https://github.com/a-labo/abind/blob/master/LICENSE
[bd_codeclimate_url]: http://codeclimate.com/github/a-labo/abind
[bd_codeclimate_shield_url]: http://img.shields.io/codeclimate/github/a-labo/abind.svg?style=flat
[bd_codeclimate_coverage_shield_url]: http://img.shields.io/codeclimate/coverage/github/a-labo/abind.svg?style=flat
[bd_gemnasium_url]: https://gemnasium.com/a-labo/abind
[bd_gemnasium_shield_url]: https://gemnasium.com/a-labo/abind.svg
[bd_npm_url]: http://www.npmjs.org/package/abind
[bd_npm_shield_url]: http://img.shields.io/npm/v/abind.svg?style=flat
[bd_standard_url]: http://standardjs.com/
[bd_standard_shield_url]: https://img.shields.io/badge/code%20style-standard-brightgreen.svg
<!-- Badge End -->
<!-- Description Start -->
<a name="description"></a>
Auto-bind instance methods of classes.
<!-- Description End -->
<!-- Overview Start -->
<a name="overview"></a>
<!-- Overview End -->
<!-- Sections Start -->
<a name="sections"></a>
<!-- Section from "doc/guides/01.Installation.md.hbs" Start -->
<a name="section-doc-guides-01-installation-md"></a>
Installation
-----
```bash
$ npm install abind --save
```
<!-- Section from "doc/guides/01.Installation.md.hbs" End -->
<!-- Section from "doc/guides/02.Usage.md.hbs" Start -->
<a name="section-doc-guides-02-usage-md"></a>
Usage
---------
```javascript
'use strict'
const abind = require('abind')
class Talker {
constructor (name) {
const s = this
s.name = name
abind(s)
}
sayHi () {
const s = this
return `Hi, i'm ${s.name}`
}
}
let { sayHi } = new Talker('Tom')
console.log(sayHi()) // -> Hi, i'm Tom
```
<!-- Section from "doc/guides/02.Usage.md.hbs" End -->
<!-- Section from "doc/guides/03.Signature.md.hbs" Start -->
<a name="section-doc-guides-03-signature-md"></a>
Signature
---------
`abind(instance, options) -> Object`
### Params
| Name | Type | Description |
| ----- | --- | -------- |
| instance | Object | Instance to bind |
| options | Object | Optional settings |
| options.proto | Object | Prototype to bind |
| options.excludes | string[] | Names to exclude |
<!-- Section from "doc/guides/03.Signature.md.hbs" End -->
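As a rough illustration of the behavior this signature describes, the core binding logic can be approximated in a few lines. This is a simplified sketch under the assumption that only the prototype's own methods are bound; it is not the package's actual implementation:

```javascript
// Simplified approximation of abind's behavior: walk the prototype's
// own methods and bind each one to the instance, honoring the
// `proto` and `excludes` options from the signature table.
function abindSketch(instance, options = {}) {
  const proto = options.proto || Object.getPrototypeOf(instance);
  const excludes = options.excludes || [];
  for (const name of Object.getOwnPropertyNames(proto)) {
    if (name === 'constructor' || excludes.includes(name)) continue;
    if (typeof proto[name] === 'function') {
      // Bound copy becomes an own property of the instance.
      instance[name] = proto[name].bind(instance);
    }
  }
  return instance;
}

class Talker {
  constructor(name) {
    this.name = name;
    abindSketch(this);
  }
  sayHi() {
    return `Hi, i'm ${this.name}`;
  }
}

const { sayHi } = new Talker('Tom');
console.log(sayHi()); // -> Hi, i'm Tom
```

Destructuring `sayHi` off the instance and calling it standalone still works because the method was bound to the instance at construction time.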
<!-- Sections Start -->
<!-- LICENSE Start -->
<a name="license"></a>
License
-------
This software is released under the [MIT License](https://github.com/a-labo/abind/blob/master/LICENSE).
<!-- LICENSE End -->
<!-- Links Start -->
<a name="links"></a>
Links
------
+ [a-labo][a_labo_url]
[a_labo_url]: https://github.com/a-labo
<!-- Links End -->
---
title: Register with the SQL VM resource provider
description: Register your Azure SQL Server virtual machine with the SQL VM resource provider to enable features for SQL Server VMs deployed outside the Azure Marketplace, as well as improved compliance and manageability.
services: virtual-machines-windows
documentationcenter: na
author: MashaMSFT
manager: craigg
tags: azure-resource-manager
ms.service: virtual-machines-sql
ms.devlang: na
ms.topic: article
ms.tgt_pltfrm: vm-windows-sql-server
ms.workload: iaas-sql-server
ms.date: 11/13/2019
ms.author: mathoma
ms.reviewer: jroth
ms.openlocfilehash: 01e683e31905281d25fdcf976bc58397c052a6c3
ms.sourcegitcommit: 333af18fa9e4c2b376fa9aeb8f7941f1b331c11d
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 02/13/2020
ms.locfileid: "77201626"
---
# <a name="register-a-sql-server-virtual-machine-in-azure-with-the-sql-vm-resource-provider"></a>Register a SQL Server virtual machine in Azure with the SQL VM resource provider
This article describes how to register your SQL Server virtual machine in Azure with the SQL VM resource provider. Registering with the resource provider creates the **SQL virtual machine** _resource_ in your subscription, which is a separate resource from the virtual machine resource. Unregistering your SQL Server VM from the resource provider removes the **SQL virtual machine** _resource_, but not the virtual machine itself.
Deploying a SQL Server VM Azure Marketplace image through the Azure portal automatically registers the SQL Server VM with the resource provider. However, if you choose to self-install SQL Server on an Azure virtual machine, or to provision an Azure virtual machine from a custom VHD, you must register your SQL Server VM with the resource provider for the following reasons:
- **Feature benefits**: Registering the SQL Server VM with the resource provider unlocks the [automated patching](virtual-machines-windows-sql-automated-patching.md) and [automated backup](virtual-machines-windows-sql-automated-backup-v2.md) features, as well as monitoring and manageability capabilities. It also simplifies [license](virtual-machines-windows-sql-ahb.md) and [edition](virtual-machines-windows-sql-change-edition.md) management. Previously, these features were available only to SQL Server VM images deployed from the Azure Marketplace.
- **Compliance**: Registering with the SQL VM resource provider offers a simplified method of notifying Microsoft that Azure Hybrid Benefit has been enabled, as specified in the product terms. This process eliminates the need to manage licensing registration forms for each resource.
- **Free management**: Registering with the SQL VM resource provider in all three management modes is completely free. There is no additional cost associated with the resource provider or with changing management modes.
- **Simplified license management**: Registering with the SQL VM resource provider simplifies SQL Server license management and lets you quickly identify SQL Server VMs with Azure Hybrid Benefit enabled by using the [Azure portal](virtual-machines-windows-sql-manage-portal.md), the Azure CLI, or PowerShell:
# <a name="azure-cli"></a>[Azure CLI](#tab/azure-cli)
```azurecli-interactive
$vms = az sql vm list | ConvertFrom-Json
$vms | Where-Object {$_.sqlServerLicenseType -eq "AHUB"}
```
# <a name="powershell"></a>[PowerShell](#tab/azure-powershell)
```powershell-interactive
Get-AzSqlVM | Where-Object {$_.LicenseType -eq 'AHUB'}
```
---
To use the SQL VM resource provider, you must first [register your subscription with the resource provider](#register-subscription-with-rp), which gives the resource provider the ability to create resources within that subscription.
For more information on the benefits of the SQL VM resource provider, watch this [Channel9](https://channel9.msdn.com/Shows/Data-Exposed/Benefit-from-SQL-VM-Resource-Provider-when-self-installing-SQL-Server-on-Azure?WT.mc_id=dataexposed-c9-niner) video:
<iframe src="https://channel9.msdn.com/Shows/Data-Exposed/Benefit-from-SQL-VM-Resource-Provider-when-self-installing-SQL-Server-on-Azure/player" width="960" height="540" allowFullScreen frameBorder="0" title="Benefit from the SQL VM Resource Provider when self-installing SQL Server on Azure - Microsoft Channel 9 video"></iframe>
## <a name="prerequisites"></a>Prerequisites
To register your SQL Server VM with the resource provider, you need:
- An [Azure subscription](https://azure.microsoft.com/free/).
- An Azure Resource Manager [SQL Server virtual machine](virtual-machines-windows-portal-sql-server-provision.md) deployed to the public or Azure Government cloud.
- The latest version of the [Azure CLI](/cli/azure/install-azure-cli) or [PowerShell](/powershell/azure/new-azureps-module-az).
## <a name="management-modes"></a>Management modes
If the [SQL IaaS extension](virtual-machines-windows-sql-server-agent-extension.md) has not already been installed, registering with the SQL VM resource provider automatically installs the SQL Server IaaS extension in one of three management modes, specified during the registration process. If you do not specify a management mode, the SQL IaaS extension is installed in full management mode.
If the SQL IaaS extension has already been installed manually, it is already in full management mode, and registering with the resource provider in full mode does not restart the SQL Server service.
The three management modes are:
- **Lightweight** mode does not require a restart of SQL Server, but supports only changing the license type and edition of SQL Server. Use this option for SQL Server VMs with multiple instances, or those participating in a failover cluster instance. Lightweight mode has no impact on memory or CPU, and there is no associated cost. We recommend registering your SQL Server VM in lightweight mode first, and then upgrading to full mode during a scheduled maintenance window.
- **Full** mode provides all functionality, but requires a restart of SQL Server and sysadmin permissions. This is the option that is installed by default when the SQL IaaS extension is installed manually. Use it to manage a SQL Server VM with a single instance. Full mode installs two Windows services that have a minimal impact on memory and CPU; their activity is visible in Task Manager. There is no cost associated with using the full management mode.
- **NoAgent** mode is dedicated to SQL Server 2008 and SQL Server 2008 R2 installed on Windows Server 2008. NoAgent mode has no impact on memory or CPU. There is no cost associated with using the NoAgent management mode.
You can view the current mode of your SQL Server IaaS agent by using PowerShell:
```powershell-interactive
# Get the SqlVirtualMachine
$sqlvm = Get-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName
$sqlvm.SqlManagementType
```
## <a name="register-subscription-with-rp"></a>Register your subscription with the resource provider
Before you can register your SQL Server VM with the SQL VM resource provider, you must register your subscription with the provider. This gives the SQL VM resource provider the ability to create resources within your subscription. You can do so by using the Azure portal, the Azure CLI, or PowerShell.
### <a name="azure-portal"></a>Azure portal
1. Open the Azure portal and go to **All services**.
1. Go to **Subscriptions** and select the subscription of interest.
1. On the **Subscriptions** page, go to **Resource providers**.
1. Enter **sql** in the filter to bring up the SQL-related resource providers.
1. Select **Register**, **Re-register**, or **Unregister** for the **Microsoft.SqlVirtualMachine** provider, depending on your desired action.
![Modify the provider](media/virtual-machines-windows-sql-register-with-rp/select-resource-provider-sql.png)
### <a name="command-line"></a>Command line
Register the SQL VM resource provider in your Azure subscription by using the Azure CLI or PowerShell.
# <a name="az-cli"></a>[AZ CLI](#tab/bash)
```azurecli-interactive
# Register the SQL VM resource provider to your subscription
az provider register --namespace Microsoft.SqlVirtualMachine
```
# <a name="powershell"></a>[PowerShell](#tab/powershell)
```powershell-interactive
# Register the SQL VM resource provider to your subscription
Register-AzResourceProvider -ProviderNamespace Microsoft.SqlVirtualMachine
```
---
## <a name="register-sql-vm-with-rp"></a>Register a SQL VM with the resource provider
### <a name="lightweight-management-mode"></a>Lightweight management mode
If the [SQL Server IaaS Agent extension](virtual-machines-windows-sql-server-agent-extension.md) has not been installed on the VM, we recommend registering with the SQL VM resource provider in lightweight mode. Doing so installs the SQL IaaS extension in [lightweight mode](#management-modes) and prevents a restart of the SQL Server service. You can then upgrade to full mode at any time; however, because doing so restarts the SQL Server service, we recommend waiting for a scheduled maintenance window.
Provide the SQL Server license type as either pay-as-you-go (`PAYG`) to pay per usage, Azure Hybrid Benefit (`AHUB`) to use your own license, or disaster recovery (`DR`) to activate the [free disaster recovery replica license](virtual-machines-windows-sql-high-availability-dr.md#free-dr-replica-in-azure).
Failover cluster instances and multiple-instance deployments can be registered with the SQL VM resource provider only in lightweight mode.
# <a name="az-cli"></a>[AZ CLI](#tab/bash)
Register a SQL Server VM in lightweight mode with the Azure CLI:
```azurecli-interactive
# Register Enterprise or Standard self-installed VM in Lightweight mode
az sql vm create --name <vm_name> --resource-group <resource_group_name> --location <vm_location> --license-type PAYG
```
# <a name="powershell"></a>[PowerShell](#tab/powershell)
Register a SQL Server VM in lightweight mode with PowerShell:
```powershell-interactive
# Get the existing compute VM
$vm = Get-AzVM -Name <vm_name> -ResourceGroupName <resource_group_name>
# Register SQL VM with 'Lightweight' SQL IaaS agent
New-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -Location $vm.Location `
-LicenseType PAYG -SqlManagementType LightWeight
```
---
### <a name="full-management-mode"></a>Full management mode
If the SQL IaaS extension has already been installed on the VM manually, you can register the SQL Server VM in full mode without restarting the SQL Server service. **However, if the SQL IaaS extension has not been installed, registering in full mode installs the SQL IaaS extension in full mode and restarts the SQL Server service. Proceed with caution.**
To register your SQL Server VM directly in full mode (and possibly restart your SQL Server service), use the following PowerShell command:
```powershell-interactive
# Get the existing Compute VM
$vm = Get-AzVM -Name <vm_name> -ResourceGroupName <resource_group_name>
# Register with SQL VM resource provider in full mode
New-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -SqlManagementType Full
```
### <a name="noagent-management-mode"></a>NoAgent management mode
SQL Server 2008 and 2008 R2 instances installed on Windows Server 2008 (_not R2_) can be registered with the SQL VM resource provider in [NoAgent mode](#management-modes). This option ensures license compliance and allows the SQL Server VM to be monitored in the Azure portal with limited functionality.
Specify `AHUB`, `PAYG`, or `DR` for **sqlLicenseType**, and `SQL2008-WS2008` or `SQL2008R2-WS2008` for **sqlImageOffer**.
To register your SQL Server 2008 or 2008 R2 instance on a Windows Server 2008 instance, use the following Azure CLI or PowerShell snippet:
# <a name="az-cli"></a>[AZ CLI](#tab/bash)
Register your SQL Server 2008 VM in NoAgent mode with the Azure CLI:
```azurecli-interactive
az sql vm create -n sqlvm -g myresourcegroup -l eastus \
  --license-type PAYG --sql-mgmt-type NoAgent \
  --image-sku Enterprise --image-offer SQL2008-WS2008R2
```
Register your SQL Server 2008 R2 VM in NoAgent mode with the Azure CLI:
```azurecli-interactive
az sql vm create -n sqlvm -g myresourcegroup -l eastus \
  --license-type PAYG --sql-mgmt-type NoAgent \
  --image-sku Enterprise --image-offer SQL2008R2-WS2008R2
```
# <a name="powershell"></a>[PowerShell](#tab/powershell)
Register the SQL Server 2008 VM in NoAgent mode with PowerShell:
```powershell-interactive
# Get the existing compute VM
$vm = Get-AzVM -Name <vm_name> -ResourceGroupName <resource_group_name>
New-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -Location $vm.Location `
-LicenseType PAYG -SqlManagementType NoAgent -Sku Standard -Offer SQL2008-WS2008
```
Register the SQL Server 2008 R2 VM in NoAgent mode with PowerShell:
```powershell-interactive
# Get the existing compute VM
$vm = Get-AzVM -Name <vm_name> -ResourceGroupName <resource_group_name>
New-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -Location $vm.Location `
-LicenseType PAYG -SqlManagementType NoAgent -Sku Standard -Offer SQL2008R2-WS2008
```
---
## <a name="upgrade-to-full-management-mode"></a>Upgrade to full management mode
SQL Server VMs that have the IaaS extension installed in *lightweight* mode can upgrade to _full_ mode through the Azure portal, the Azure CLI, or PowerShell. SQL Server VMs in _NoAgent_ mode can upgrade to _full_ mode after the operating system is upgraded to Windows 2008 R2 or later. Downgrading is not possible; to do so, you must [unregister](#unregister-vm-from-rp) the SQL Server VM from the SQL VM resource provider. This removes the **SQL virtual machine** _resource_, but does not delete the VM itself.
You can view the current mode of your SQL Server IaaS agent by using PowerShell:
```powershell-interactive
# Get the SqlVirtualMachine
$sqlvm = Get-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName
$sqlvm.SqlManagementType
```
To upgrade the agent to full mode:
### <a name="azure-portal"></a>Azure portal
1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to your [SQL virtual machines](virtual-machines-windows-sql-manage-portal.md#access-the-sql-virtual-machines-resource) resource.
1. Select your SQL Server VM, and select **Overview**.
1. For SQL Server VMs with the NoAgent or lightweight IaaS mode, select the **Only license type and edition updates are available with the SQL IaaS extension** message.
![Selections for changing the mode from the portal](media/virtual-machines-windows-sql-register-with-rp/change-sql-iaas-mode-portal.png)
1. Select the **I agree to restart the SQL Server service on the virtual machine** check box, and then select **Confirm** to upgrade your IaaS mode to full.
![Check box for agreeing to restart the SQL Server service on the VM](media/virtual-machines-windows-sql-register-with-rp/enable-full-mode-iaas.png)
### <a name="command-line"></a>Command line
# <a name="az-cli"></a>[AZ CLI](#tab/bash)
Run the following Azure CLI snippet:
```azurecli-interactive
# Update to full mode
az sql vm update --name <vm_name> --resource-group <resource_group_name> --sql-mgmt-type full
```
# <a name="powershell"></a>[PowerShell](#tab/powershell)
Run the following PowerShell snippet:
```powershell-interactive
# Get the existing Compute VM
$vm = Get-AzVM -Name <vm_name> -ResourceGroupName <resource_group_name>
# Register with SQL VM resource provider in full mode
Update-AzSqlVM -Name $vm.Name -ResourceGroupName $vm.ResourceGroupName -SqlManagementType Full
```
---
## <a name="verify-registration-status"></a>Verify registration status
You can verify whether your SQL Server VM has already been registered with the SQL VM resource provider by using the Azure portal, the Azure CLI, or PowerShell.
### <a name="azure-portal"></a>Azure portal
1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to your [SQL Server virtual machines](virtual-machines-windows-sql-manage-portal.md).
1. Select your SQL Server VM from the list. If your SQL Server VM is not listed here, it likely hasn't been registered with the SQL VM resource provider.
1. View the value under **Status**. If **Status** is **Succeeded**, the SQL Server VM has been registered with the SQL VM resource provider successfully.
![Verify the registration status](media/virtual-machines-windows-sql-register-with-rp/verify-registration-status.png)
### <a name="command-line"></a>Command line
Verify the current registration status of a SQL Server VM by using the Azure CLI or PowerShell. `ProvisioningState` shows `Succeeded` if registration was successful.
# <a name="az-cli"></a>[AZ CLI](#tab/bash)
```azurecli-interactive
az sql vm show -n <vm_name> -g <resource_group>
```
# <a name="powershell"></a>[PowerShell](#tab/powershell)
```powershell-interactive
Get-AzSqlVM -Name <vm_name> -ResourceGroupName <resource_group>
```
---
An error indicates that the SQL Server VM has not been registered with the resource provider.
## <a name="unregister-vm-from-rp"></a>Unregister the VM from the resource provider
To unregister your SQL Server VM from the SQL VM resource provider, delete the SQL virtual machine *resource* by using the Azure portal or the Azure CLI. Deleting the SQL virtual machine *resource* does not delete the SQL Server VM itself. However, follow the steps carefully, because it is possible to inadvertently delete the VM while attempting to remove the *resource*.
Unregistering the SQL VM from the SQL VM resource provider is necessary to downgrade the management mode from full.
### <a name="azure-portal"></a>Azure portal
To unregister your SQL Server VM from the resource provider by using the Azure portal, follow these steps:
1. Sign in to the [Azure portal](https://portal.azure.com).
1. Go to the SQL VM resource.
![SQL virtual machines resource](media/virtual-machines-windows-sql-register-with-rp/sql-vm-manage.png)
1. Select **Delete**.
![Delete the SQL virtual machine resource](media/virtual-machines-windows-sql-register-with-rp/delete-sql-vm-resource.png)
1. Type the name of the SQL virtual machine, and **clear the check box next to the virtual machine**.
![Delete the SQL virtual machine resource but not the VM itself](media/virtual-machines-windows-sql-register-with-rp/uncheck-delete-vm.png)
>[!WARNING]
> If you don't clear the check box next to the virtual machine name, you will *delete the virtual machine entirely*. Clear the check box to unregister the SQL Server VM from the resource provider *without deleting the actual virtual machine*.
1. Select **Delete** to confirm the deletion of the SQL virtual machine *resource*, and not the SQL Server VM.
### <a name="command-line"></a>Command line
# <a name="azure-cli"></a>[Azure CLI](#tab/azure-cli)
To unregister your SQL Server VM from the resource provider with the Azure CLI, use the [az sql vm delete](/cli/azure/sql/vm?view=azure-cli-latest#az-sql-vm-delete) command. This removes the SQL Server VM *resource* but does not delete the virtual machine.
```azurecli-interactive
az sql vm delete \
  --name <SQL VM resource name> \
  --resource-group <Resource group name> \
  --yes
```
# <a name="powershell"></a>[PowerShell](#tab/azure-powershell)
To unregister your SQL Server VM from the resource provider with PowerShell, use the [Remove-AzSqlVM](/powershell/module/az.sqlvirtualmachine/remove-azsqlvm) command. This removes the SQL Server VM *resource* but does not delete the virtual machine.
```powershell-interactive
Remove-AzSqlVM -ResourceGroupName <resource_group_name> -Name <VM_name>
```
---
## <a name="limitations"></a>Limitations
The SQL VM resource provider only supports:
- SQL Server VMs deployed through Azure Resource Manager. SQL Server VMs deployed through the classic model are not supported.
- SQL Server VMs deployed to the public cloud or the Azure Government cloud. Deployments to other private or government clouds are not supported.
## <a name="frequently-asked-questions"></a>Frequently asked questions

**Should I register my SQL Server VM provisioned from a SQL Server image in Azure Marketplace?**

No. Microsoft automatically registers VMs provisioned from the SQL Server images in Azure Marketplace. Registering with the SQL VM resource provider is required only if the VM was *not* provisioned from the SQL Server images in Azure Marketplace and SQL Server was self-installed.

**Is the SQL VM resource provider available for all customers?**

Yes. Customers should register their SQL Server VMs with the SQL VM resource provider if they did not use a SQL Server image from Azure Marketplace and instead self-installed SQL Server, or if they brought their custom VHD. VMs owned by all types of subscriptions (Direct, Enterprise Agreement, and Cloud Solution Provider) can register with the SQL VM resource provider.

**Should I register with the SQL VM resource provider if the SQL Server IaaS extension is already installed on my SQL Server VM?**

If your SQL Server VM is self-installed and was not provisioned from the SQL Server images in Azure Marketplace, you should register with the SQL VM resource provider even if you installed the SQL Server IaaS extension. Registering with the SQL VM resource provider creates a resource of type Microsoft.SqlVirtualMachines. Installing the SQL Server IaaS extension does not create that resource.

**What is the default management mode when registering with the SQL VM resource provider?**

The default management mode when registering with the SQL VM resource provider is *full*. If the SQL Server management property isn't set when registering with the SQL VM resource provider, the mode is set to full management, and your SQL Server service restarts. We recommend registering with the SQL VM resource provider in lightweight mode first, and then upgrading to full mode during a planned maintenance window.

**What are the prerequisites to register with the SQL VM resource provider?**

There are no prerequisites to registering with the SQL VM resource provider in lightweight or NoAgent mode. The prerequisite to registering with the SQL VM resource provider in full mode is that the SQL Server IaaS extension is installed on the VM; otherwise the SQL Server service restarts.

**Can I register with the SQL VM resource provider if the SQL Server IaaS extension is not installed on the VM?**

Yes, you can register with the SQL VM resource provider in lightweight management mode if the SQL Server IaaS extension is not installed on the VM. In lightweight mode, the SQL VM resource provider uses a console app to verify the version and edition of the SQL Server instance.

The default SQL management mode when registering with the SQL VM resource provider is _full_. If the SQL management property isn't set when registering with the SQL VM resource provider, the mode is set to full management. We recommend registering with the SQL VM resource provider in lightweight mode first, and then upgrading to full mode during a maintenance window.

**Does registering with the SQL VM resource provider install an agent on my VM?**

No. Registering with the SQL VM resource provider only creates a metadata resource. It does not install an agent on the VM.

The SQL Server IaaS extension is required only to enable full management. Upgrading the management mode from lightweight to full installs the SQL Server IaaS extension and restarts SQL Server.

**Does registering with the SQL VM resource provider restart SQL Server on my VM?**

It depends on the mode specified during registration. If lightweight or NoAgent mode is specified, the SQL Server service does not restart. However, if you specify the management mode as full, or leave the management mode blank, the SQL IaaS extension is installed in full management mode, which restarts the SQL Server service.

**What is the difference between the lightweight and NoAgent management modes when registering with the SQL VM resource provider?**

NoAgent management mode is available only for SQL Server 2008 and SQL Server 2008 R2 on Windows Server 2008. It is the only management mode available for these versions. For all other versions of SQL Server, the two available management modes are lightweight and full.

For NoAgent mode, the SQL Server version and edition properties must be set by the customer. Lightweight mode queries the VM to determine the version and edition of the SQL Server instance.

**Can I register with the SQL VM resource provider without specifying the SQL Server license type?**

No. The SQL Server license type is not an optional property when you register with the SQL VM resource provider. You must set the SQL Server license type as pay-as-you-go or Azure Hybrid Benefit when registering with the SQL VM resource provider in all management modes (NoAgent, lightweight, and full).

**Can I upgrade the SQL Server IaaS extension from NoAgent mode to full mode?**

No. Upgrading to full or lightweight management mode is not available for NoAgent mode. This is a technical limitation of Windows Server 2008. You must first upgrade the operating system to Windows Server 2008 R2 or later, and then you can upgrade to full management mode.

**Can I upgrade the SQL Server IaaS extension from lightweight mode to full mode?**

Yes. Upgrading the management mode from lightweight to full is supported through PowerShell or the Azure portal. It requires a restart of the SQL Server service.

**Can I downgrade the SQL Server IaaS extension from full mode to NoAgent or lightweight management mode?**

No. Downgrading the management mode of the SQL Server IaaS extension is not supported. The SQL management mode cannot be downgraded from full to lightweight or NoAgent mode, nor from lightweight to NoAgent mode.

To change the manageability mode from full, [unregister](#unregister-vm-from-rp) the SQL Server VM from the SQL VM resource provider by deleting the SQL virtual machine *resource*, and then register the SQL Server VM with the SQL VM resource provider again with a different management mode.

**Can I register with the SQL VM resource provider from the Azure portal?**

No. Registering with the SQL VM resource provider is not available in the Azure portal. Registering with the SQL VM resource provider is only supported with the Azure CLI or PowerShell.

**Can I register a VM with the SQL VM resource provider before SQL Server is installed?**

No. A VM must have at least one SQL Server (Database Engine) instance to be registered with the SQL VM resource provider. If there is no SQL Server instance on the VM, the new Microsoft.SqlVirtualMachine resource will be in a failed state.

**Can I register a VM with the SQL VM resource provider if there are multiple SQL Server instances?**

Yes. The SQL VM resource provider registers only one SQL Server (Database Engine) instance. The SQL VM resource provider registers the default SQL Server instance when there are multiple instances. If there is no default instance, only lightweight mode registration is supported. To upgrade from lightweight to full management mode, either the default SQL Server instance must exist, or the VM must have only one named SQL Server instance.

**Can I register a SQL Server failover cluster instance with the SQL VM resource provider?**

Yes. SQL Server failover cluster instances on an Azure VM can be registered with the SQL VM resource provider in lightweight mode. However, SQL Server failover cluster instances cannot be upgraded to full management mode.

**Can I register my VM with the SQL VM resource provider if an Always On availability group is configured?**

Yes. There are no restrictions on registering a SQL Server instance on an Azure VM with the SQL VM resource provider if it participates in an Always On availability group configuration.

**What is the cost of registering with the SQL VM resource provider, or of upgrading to full management mode?**

None. There is no fee for registering with the SQL VM resource provider or for using any of the three management modes. Managing your SQL Server VM with the resource provider is completely free.

**What is the performance impact of using the different management modes?**

The *NoAgent* and *lightweight* management modes have no impact on CPU and memory. There is a minimal impact when using the *full* management mode, from two services that are installed on the operating system. These can be monitored through Task Manager and seen in the built-in Windows Services console.

The two service names are:
- `SqlIaaSExtensionQuery` (display name: `Microsoft SQL Server IaaS Query Service`)
- `SQLIaaSExtension` (display name: `Microsoft SQL Server IaaS Agent`)
## <a name="next-steps"></a>Next steps
For more information, see the following articles:
* [Overview of SQL Server on a Windows VM](virtual-machines-windows-sql-server-iaas-overview.md)
* [FAQ for SQL Server on a Windows VM](virtual-machines-windows-sql-server-iaas-faq.md)
* [Pricing guidance for SQL Server on a Windows VM](virtual-machines-windows-sql-server-pricing-guidance.md)
* [Release notes for SQL Server on a Windows VM](virtual-machines-windows-sql-server-iaas-release-notes.md)
b1320bea45ff8fe41abe90980c25a73af739503e | 895 | md | Markdown | docs/_peoples/jiarui-li.md | nkshuihan/venture-sprint.com | 726458726a5bfd75628104ff124d56a39b4f7589 | ["Apache-2.0"] | stars: 6 (2020-06-01 to 2020-06-30) | issues: null | forks: 1 (2020-06-08)
---
name: Li Jiarui (李佳芮)
site: https://www.botorange.com/
bio: Founder & CEO of Juzi.BOT (句子互动); Microsoft AI Most Valuable Professional (AI MVP)
avatar: /assets/peoples/jiarui-li/avatar.jpg
email: rui@juzi.bot
twitter:
---
Li Jiarui is a serial entrepreneur and an expert in developing, designing, and operating products for the WeChat ecosystem. Over the past six years she has provided WeChat-ecosystem technology and operations services to more than a hundred companies, including Amazon, Tencent, JD.com, Xinhuanet, Lenovo, Microsoft, Philips, and Pengjinsuo. She is the author of *Chatbot from 0 to 1: A Practical Guide to Conversational Interaction Design*.
In 2017 she founded Juzi.BOT, which focuses on intelligent conversation services built on the WeChat ecosystem, with customers across education, insurance, healthcare, and other sectors. That same year the company was selected for the Baidu AI Accelerator, and the following year it partnered with Baidu to build the course series *Building a Chatbot from 0 to 1*. In 2019 it became a partner of Tencent Cloud's intelligent conversation platform. The company has received angel investment from Chinese and US institutions including PreAngel, Plug and Play, Y Combinator, TSVC (Tsingyuan Ventures), and Alpha Startup Fund.
She is a co-author of Wechaty, an open-source project with 6,000+ stars on GitHub, and created and manages a global developer community for chatbots built on the WeChat platform. She has been invited many times to share chatbot industry and technology insights at Google and Microsoft conferences.
Before founding Juzi.BOT, she created the dance self-media channel Wuli, which helped dance enthusiasts quickly learn choreographed routines; its videos accumulated tens of millions of views across platforms and hundreds of thousands of followers. While running the dance community, she discovered the new opportunity for intelligent conversation services in the WeChat ecosystem.
Li Jiarui holds bachelor's and master's degrees in information security from Beijing University of Posts and Telecommunications. During school she co-founded a WeChat official-account development studio, releasing products such as WeChat Wall, a WeChat conference solution, and the online wedding invitation Xiqueshuo. In her first year of graduate school she entered Tencent's T-Pai campus innovation and entrepreneurship competition, winning a silver award and the PreAngel special award. In her second year she suspended her studies to start a business, co-founding the wedding O2O platform Manman Interactive with two former Tencent employees, serving as CEO, and raising 5 million RMB in angel investment from the A-share listed company Tianshen Entertainment.
She loves dance, marathons, rock climbing, yoga, boxing, and other sports. She is a realistic idealist who believes that time will tell.
b13297582e23f97ff64db7d60c872881ed1632c9 | 15,359 | md | Markdown | docs/source/tts/models_introduction.md | zh794390558/DeepSpeech | 34178893327ad359cb816e55d7c66a10244fa08a | ["Apache-2.0"] | stars: null | issues: null | forks: null
# Models introduction
A TTS system mainly includes three modules: `Text Frontend`, `Acoustic model`, and `Vocoder`. We introduce a rule-based Chinese text frontend in [cn_text_frontend.md](./cn_text_frontend.md). Here, we will introduce acoustic models and vocoders, which are trainable models.
The main processes of TTS include:
1. Convert the original text into characters/phonemes, through `text frontend` module.
2. Convert characters/phonemes into acoustic features , such as linear spectrogram, mel spectrogram, LPC features, etc. through `Acoustic models`.
3. Convert acoustic features into waveforms through `Vocoders`.
A simple text frontend module can be implemented by rules. Acoustic models and vocoders need to be trained. The models provided by PaddleSpeech TTS are acoustic models and vocoders.
## Acoustic Models
### Modeling Objectives of Acoustic Models
Modeling the mapping relationship between text sequences and speech features:
```text
text X = {x1,...,xM}
specch Y = {y1,...yN}
```
Modeling Objectives:
```text
Ω = argmax p(Y|X,Ω)
```
### Modeling process of Acoustic Models
At present, there are two mainstream acoustic model structures.
- Frame level acoustic model:
    - Duration model (M tokens -> N frames).
    - Acoustic decoder (N frames -> N frames).
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/frame_level_am.png" width=500 /> <br>
</div>
- Sequence to sequence acoustic model:
    - M tokens -> N frames.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/seq2seq_am.png" width=500 /> <br>
</div>
### Tacotron2
[Tacotron](https://arxiv.org/abs/1703.10135) is the first end-to-end acoustic model based on deep learning, and it is also the most widely used acoustic model.
[Tacotron2](https://arxiv.org/abs/1712.05884) is an improvement of Tacotron.
#### Tacotron
**Features of Tacotron:**
- Encoder.
- CBHG.
- Input: character sequence.
- Decoder.
- Global soft attention.
- unidirectional RNN.
    - Autoregressive teacher-forced training (real speech features as input).
- Multi frame prediction.
- CBHG postprocess.
- Vocoder: Griffin-Lim.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/tacotron.png" width=700 /> <br>
</div>
**Advantage of Tacotron:**
- No need for complex text frontend analysis modules.
- No need for additional duration model.
- Greatly simplifies the acoustic model construction process and reduces the dependence of speech synthesis tasks on domain knowledge.
**Disadvantages of Tacotron:**
- The CBHG is complex and the amount of parameters is relatively large.
- Global soft attention.
- Poor stability for speech synthesis tasks.
- In training, the less the number of speech frames predicted at each moment, the more difficult it is to train.
- The phase problem in Griffin-Lim causes speech distortion during waveform reconstruction.
- The autoregressive decoder cannot be stopped during the generation process.
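Griffin-Lim reconstructs a waveform from a magnitude spectrogram by alternating between the time and frequency domains: the known magnitudes are kept fixed while the phase is re-estimated at every iteration. The sketch below is a minimal NumPy/SciPy illustration; the STFT parameters, iteration count, and test tone are assumptions for demonstration, not the settings Tacotron uses:

```python
import numpy as np
from scipy.signal import stft, istft

def griffin_lim(mag, length, n_iter=32, nperseg=256, noverlap=192, seed=0):
    """Estimate a waveform whose STFT magnitude matches `mag`.

    mag: (freq_bins, frames) magnitude spectrogram
    length: number of output samples (keeps STFT shapes consistent)
    """
    rng = np.random.default_rng(seed)
    phase = np.exp(2j * np.pi * rng.random(mag.shape))  # random initial phase
    x = np.zeros(length)
    for _ in range(n_iter):
        # Back to the time domain with the current phase estimate ...
        _, x = istft(mag * phase, nperseg=nperseg, noverlap=noverlap)
        x = x[:length] if len(x) >= length else np.pad(x, (0, length - len(x)))
        # ... then keep only the phase of its STFT; magnitudes stay fixed.
        _, _, spec = stft(x, nperseg=nperseg, noverlap=noverlap)
        phase = np.exp(1j * np.angle(spec))
    return x

# Usage: analyze a test tone, discard its phase, then reconstruct.
sig = np.sin(0.05 * np.arange(4096))
_, _, spec = stft(sig, nperseg=256, noverlap=192)
wav = griffin_lim(np.abs(spec), length=len(sig), n_iter=8)
```

Because only magnitudes are preserved, the recovered phase is an approximation, which is exactly the distortion issue noted above.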
#### Tacotron2
**Features of Tacotron2:**
- Reduction of parameters.
- CBHG -> PostNet (3 Conv layers + BLSTM or 5 Conv layers).
- remove Attention RNN.
- Speech distortion caused by Griffin-Lim.
- WaveNet.
- Improvements of PostNet.
- CBHG -> 5 Conv layers.
- The input and output of the PostNet calculate `L2` loss with real Mel spectrogram.
- Residual connection.
- Bad stop in autoregressive decoder.
- Predict whether it should stop at each moment of decoding (stop token).
- Set a threshold to determine whether to stop generating when decoding.
- Stability of attention.
    - Location-sensitive attention.
        - The alignment matrix of the previous time step is considered at step `t` of the decoder.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/tacotron2.png" width=500 /> <br>
</div>
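The stop-token mechanism above can be sketched as a plain decoding loop: at each step the decoder predicts a frame plus a stop probability, and generation halts once that probability crosses a threshold. The 80-dim mel frames, the 0.5 threshold, and the toy step function below are illustrative assumptions:

```python
import numpy as np

def decode(step_fn, n_mels=80, max_steps=1000, threshold=0.5):
    """Autoregressive decoding with a stop token.

    step_fn(prev_frame) -> (next_frame, stop_prob)
    """
    frames, prev = [], np.zeros(n_mels)  # all-zero "go" frame
    for _ in range(max_steps):
        frame, stop_prob = step_fn(prev)
        frames.append(frame)
        if stop_prob > threshold:  # the model says "stop here"
            break
        prev = frame               # feed the prediction back in
    return np.stack(frames)

# Toy step function that raises its stop probability on the 3rd frame.
calls = {"n": 0}
def toy_step(prev):
    calls["n"] += 1
    return np.full(80, float(calls["n"])), 0.9 if calls["n"] == 3 else 0.1

mel = decode(toy_step)  # stops after 3 frames
```

Without the stop token, the loop would only terminate at `max_steps`, which is the "bad stop" problem Tacotron2 fixes.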
You can find PaddleSpeech TTS's tacotron2 with LJSpeech dataset example at [examples/ljspeech/tts0](https://github.com/PaddlePaddle/DeepSpeech/tree/develop/examples/ljspeech/tts0).
### TransformerTTS
**Disadvantages of the Tacotrons:**
- The encoder and decoder are relatively weak at modeling global information.
    - Vanishing gradients in RNNs.
    - Fixed-length context modeling problem with CNN kernels.
- Training is relatively inefficient.
- The attention is not robust enough, and its stability is poor.
Transformer TTS is a combination of Tacotron2 and Transformer.
#### Transformer
[Transformer](https://arxiv.org/abs/1706.03762) is a seq2seq model based entirely on attention mechanism.
**Features of Transformer:**
- Encoder.
- `N` blocks based on self-attention mechanism.
- Positional Encoding.
- Decoder.
- `N` blocks based on self-attention mechanism.
    - Add a mask to the self-attention in blocks to hide the information after step `t`.
- Attentions between encoder and decoder.
- Positional Encoding.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/transformer.png" width=500 /> <br>
</div>
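The decoder mask mentioned above keeps position `t` from attending to later positions. A minimal sketch of such a causal mask (boolean, `True` = masked out):

```python
import numpy as np

def causal_mask(T):
    # The upper triangle above the diagonal marks "future" positions.
    return np.triu(np.ones((T, T), dtype=bool), k=1)

mask = causal_mask(4)
# Row t may attend to columns 0..t only; columns t+1.. are masked.
```

In practice the masked positions are set to a large negative value before the softmax so they receive zero attention weight.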
#### Transformer TTS
Transformer TTS is a seq2seq acoustic model based on Transformer and Tacotron2.
**Motivations:**
- The RNNs in Tacotron2 make training inefficient.
- Vanishing gradient of RNN makes the model's ability to model long-term contexts weak.
- Self-attention doesn't contain any recursive structure which can be trained in parallel.
- Self-attention can model global context information well.
**Features of Transformer TTS:**
- Add conv based PreNet in encoder and decoder.
- Stop Token in decoder controls when to stop autoregressive generation.
- Add PostNet after decoder to improve the quality of synthetic speech.
- Scaled position encoding.
    - A uniform-scale positional encoding may have a negative impact on the input or output sequences.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/transformer_tts.png" width=500 /> <br>
</div>
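The scaled positional encoding above is just the standard sinusoidal encoding multiplied by a trainable scalar before being added to the embeddings. In this sketch `alpha` is a plain constant and the zero embeddings are placeholders; in Transformer TTS `alpha` is learned:

```python
import numpy as np

def sinusoid_position_encoding(T, d):
    """Standard sin/cos positional encoding, shape (T, d)."""
    pos = np.arange(T)[:, None].astype(float)
    i = np.arange(d)[None, :]
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

T, d = 50, 8
emb = np.zeros((T, d))   # stand-in for phoneme embeddings
alpha = 1.5              # trainable scale in Transformer TTS
x = emb + alpha * sinusoid_position_encoding(T, d)
```

Letting the model learn `alpha` lets the positional signal adapt to the very different scales of text and spectrogram sequences.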
**Disadvantages of Transformer TTS:**
- The ability of position encoding for timing information is still relatively weak.
- The ability to perceive local information is weak, and local information is more related to pronunciation.
- Stability is worse than Tacotron2.
You can find PaddleSpeech TTS's Transformer TTS with LJSpeech dataset example at [examples/ljspeech/tts1](https://github.com/PaddlePaddle/DeepSpeech/tree/develop/examples/ljspeech/tts1).
### FastSpeech2
**Disadvantage of seq2seq models:**
- In attention-based seq2seq models, no matter how the attention mechanism is improved, it is difficult to avoid generation errors in the decoding stage.
Frame-level acoustic models use duration models to determine the pronunciation duration of phonemes, and the frame-level mapping does not have the uncertainty of sequence generation.
In seq2seq models, a duration model can be used as the alignment module between the two sequences to replace attention, which avoids the uncertainty in attention and significantly improves the stability of the seq2seq models.
#### FastSpeech
Instead of using the encoder-attention-decoder architecture adopted by most seq2seq-based autoregressive and non-autoregressive generation models, [FastSpeech](https://arxiv.org/abs/1905.09263) is a novel feed-forward structure that can generate a target mel spectrogram sequence in parallel.
**Features of FastSpeech:**
- Encoder: based on Transformer.
- Change `FFN` to `CNN` in self-attention.
- Model local dependency.
- Length regulator.
- Use real phoneme durations to expand output frame of encoder during training.
- Non-autoregressive decoder.
- Improve generation efficiency.
**Length predictor:**
- Pretrain a TransformerTTS model.
- Get alignment matrix of train data.
- Calculate the phoneme durations according to the probability of the alignment matrix.
- Use the output of encoder to predict the phoneme durations and calculate the MSE loss.
- Use real phoneme durations to expand output frame of encoder during training.
- Use phoneme durations predicted by the duration model to expand the frame during prediction.
- Attentrion can not control phoneme durations. The explicit duration modeling can control durations through duration coefficient (duration coefficient is `1` during training).
**Advantages of non-autoregressive decoder:**
- The built-in duration model of the seq2seq model has converted the input length `M` to the output length `N`.
- The length of output is known, `stop token` is no longer used, avoiding the problem of being unable to stop.
• Can be generated in parallel (decoding time is less affected by sequence length)
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/fastspeech.png" width=800 /> <br>
</div>
#### FastPitch
[FastPitch](https://arxiv.org/abs/2006.06873) follows FastSpeech. A single pitch value is predicted for every temporal location, which improves the overall quality of synthesized speech.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/fastpitch.png" width=500 /> <br>
</div>
#### FastSpeech2
**Disadvantages of FastSpeech:**
- The teacher-student distillation pipeline is complicated and time-consuming.
- The duration extracted from the teacher model is not accurate enough.
- The target mel spectrograms distilled from teacher model suffer from information loss due to data simplification.
[FastSpeech2](https://arxiv.org/abs/2006.04558) addresses the issues in FastSpeech and better solves the one-to-many mapping problem in TTS.
**Features of FastSpeech2:**
- Directly training the model with ground-truth target instead of the simplified output from teacher.
- Introducing more variation information of speech as conditional inputs, extract `duration`, `pitch` and `energy` from speech waveform and directly take them as conditional inputs in training and use predicted values in inference.
FastSpeech2 is similar to FastPitch but introduces more variation information of speech.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/fastspeech2.png" width=800 /> <br>
</div>
You can find PaddleSpeech TTS's FastSpeech2/FastPitch with CSMSC dataset example at [examples/csmsc/tts3](https://github.com/PaddlePaddle/DeepSpeech/tree/develop/examples/csmsc/tts3), We use token-averaged pitch and energy values introduced in FastPitch rather than frame level ones in FastSpeech2.
### SpeedySpeech
[SpeedySpeech](https://arxiv.org/abs/2008.03802) simplify the teacher-student architecture of FastSpeech and provide a fast and stable training procedure.
**Features of SpeedySpeech:**
- Use a simpler, smaller and faster-to-train convolutional teacher model ([Deepvoice3](https://arxiv.org/abs/1710.07654) and [DCTTS](https://arxiv.org/abs/1710.08969)) with a single attention layer instead of Transformer used in FastSpeech.
- Show that self-attention layers in the student network are not needed for high-quality speech synthesis.
- Describe a simple data augmentation technique that can be used early in the training to make the teacher network robust to sequential error propagation.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/speedyspeech.png" width=500 /> <br>
</div>
You can find PaddleSpeech TTS's SpeedySpeech with CSMSC dataset example at [examples/csmsc/tts2](https://github.com/PaddlePaddle/DeepSpeech/tree/develop/examples/csmsc/tts2).
## Vocoders
In speech synthesis, the main task of the vocoder is to convert the spectral parameters predicted by the acoustic model into the final speech waveform.
Taking into account the short-term change frequency of the waveform, the acoustic model usually avoids direct modeling of the speech waveform, but firstly models the spectral features extracted from the speech waveform, and then reconstructs the waveform by the decoding part of the vocoder.
A vocoder usually consists of a pair of encoders and decoders for speech analysis and synthesis. The encoder estimate the parameters, and then the decoder restores the speech.
Vocoders based on neural networks usually is speech synthesis, which learns the mapping relationship from spectral features to waveforms through training data.
### Categories of neural vocodes
- Autoregression
- WaveNet
- WaveRNN
- LPCNet
- Flow
- **WaveFlow**
- WaveGlow
- FloWaveNet
- Parallel WaveNet
- GAN
- WaveGAN
- **Parallel WaveGAN**
- MelGAN
- HiFi-GAN
- VAE
- Wave-VAE
- Diffusion
- WaveGrad
- DiffWave
**Motivations of GAN-based vocoders:**
- Modeling speech signal by estimating probability distribution usually has high requirements for the expression ability of the model itself. In addition, specific assumptions need to be made about the distribution of waveforms.
- Although autoregressive neural vocoders can obtain high-quality synthetic speech, such models usually have a **slow generation speed**.
- The training of inverse autoregressive flow vocoders is complex, and they also require the modeling capability of long term context information.
- Vocoders based on Bipartite Transformation converge slowly and are complex.
- GAN-based vocoders don't need to make assumptions about the speech distribution, and train through adversarial learning.
Here, we introduce a Flow-based vocoder WaveFlow and a GAN-based vocoder Parallel WaveGAN.
### WaveFlow
[WaveFlow](https://arxiv.org/abs/1912.01219) is proposed by Baidu Research.
**Features of WaveFlow:**
- It can synthesize 22.05 kHz high-fidelity speech around 40x faster than real-time on a Nvidia V100 GPU without engineered inference kernels, which is faster than [WaveGlow](https://github.com/NVIDIA/waveglow) and serveral orders of magnitude faster than WaveNet.
- It is a small-footprint flow-based model for raw audio. It has only 5.9M parameters, which is 15x smalller than WaveGlow (87.9M).
- It is directly trained with maximum likelihood without probability density distillation and auxiliary losses as used in [Parallel WaveNet](https://arxiv.org/abs/1711.10433) and [ClariNet](https://openreview.net/pdf?id=HklY120cYm), which simplifies the training pipeline and reduces the cost of development.
You can find PaddleSpeech TTS's WaveFlow with LJSpeech dataset example at [examples/ljspeech/voc0](https://github.com/PaddlePaddle/DeepSpeech/tree/develop/examples/ljspeech/voc0).
### Parallel WaveGAN
[Parallel WaveGAN](https://arxiv.org/abs/1910.11480) trains a non-autoregressive WaveNet variant as a generator in a GAN based training method.
**Features of Parallel WaveGAN:**
- Use non-causal convolution instead of causal convolution.
- The input is random Gaussian white noise.
- The model is non-autoregressive both in training and prediction, which is fast
- Multi-resolution STFT loss.
<div align="left">
<img src="https://raw.githubusercontent.com/PaddlePaddle/DeepSpeech/develop/docs/images/pwg.png" width=600 /> <br>
</div>
You can find PaddleSpeech TTS's Parallel WaveGAN with CSMSC example at [examples/csmsc/voc1](https://github.com/PaddlePaddle/DeepSpeech/tree/develop/examples/csmsc/voc1).
| 51.888514 | 308 | 0.776092 | eng_Latn | 0.979806 |
b1337565e1d01776cc04a19133530daaeb346b33 | 1,675 | md | Markdown | politeiawww/cmd/cmswww/README.md | LasTshaMAN/politeia | b7909ed306f13344611c40b1c5bcd5c3acc62666 | [
"0BSD"
] | 121 | 2017-10-25T16:31:58.000Z | 2021-11-11T22:59:55.000Z | politeiawww/cmd/cmswww/README.md | LasTshaMAN/politeia | b7909ed306f13344611c40b1c5bcd5c3acc62666 | [
"0BSD"
] | 780 | 2017-10-25T16:30:07.000Z | 2022-03-24T17:56:04.000Z | politeiawww/cmd/cmswww/README.md | LasTshaMAN/politeia | b7909ed306f13344611c40b1c5bcd5c3acc62666 | [
"0BSD"
] | 92 | 2017-10-25T14:58:24.000Z | 2022-01-12T18:41:52.000Z | # cmswww
cmswww is a command line tool that allows you to interact with the cmswww API.
## Available Commands
You can view the available commands and application options by using the help
flag.
$ cmswww -h
You can view details about a specific command, including required arguments,
by using the help command.
$ cmswww help <command>
## Persisting Data Between Commands
cmswww stores user identity data (the user's public/private key pair), session
cookies, and CSRF tokens in the cmswww directory. This allows you to login
with a user and use the same session data for subsequent commands. The data is
segmented by host, allowing you to login and interact with multiple hosts
simultaneously.
The location of the cmswww directory varies based on your operating system.
**macOS**
`/Users/<username>/Library/Application Support/Cmswww`
**Windows**
`C:\Users\<username>\AppData\Local\Cmswww`
**Ubuntu**
`~/.cmswww`
## Setup Configuration File
cmswww has a configuration file that you can setup to make execution easier.
You should create the configuration file under the following paths.
**macOS**
`/Users/<username>/Library/Application Support/Piwww/cmswww.conf`
**Windows**
`C:\Users\<username>\AppData\Local\Piwww/cmswww.conf`
**Ubuntu**
`~/.cmswww/cmswww.conf`
If you're developing locally, you'll want to set the politeiawww host in the
configuration file since the default politeiawww host is
`https://proposals.decred.org`. Copy these lines into your `cmswww.conf` file.
`skipverify` is used to skip TLS certificate verification and should only be
used when running politeia locally.
```
host=https://127.0.0.1:4443
skipverify=true
```
| 26.587302 | 79 | 0.765373 | eng_Latn | 0.988187 |
b133db149ff504891392499423865aea95d4b5a1 | 8,737 | md | Markdown | articles/virtual-machines/troubleshooting/troubleshoot-vm-by-use-nested-virtualization.md | kitingChris/azure-docs.de-de | a81b914393aa78dc3722e272c7f253a9c5ddd2d2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/troubleshooting/troubleshoot-vm-by-use-nested-virtualization.md | kitingChris/azure-docs.de-de | a81b914393aa78dc3722e272c7f253a9c5ddd2d2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/troubleshooting/troubleshoot-vm-by-use-nested-virtualization.md | kitingChris/azure-docs.de-de | a81b914393aa78dc3722e272c7f253a9c5ddd2d2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Behandeln von Problemen mit einem virtuellen Azure-Computer unter Verwendung der geschachtelten Virtualisierung in Azure | Microsoft-Dokumentation
description: Hier erfahren Sie, wie Sie Probleme mit einem virtuellen Azure-Computer unter Verwendung der geschachtelten Virtualisierung in Azure behandeln.
services: virtual-machines-windows
documentationcenter: ''
author: glimoli
manager: gwallace
editor: ''
tags: azure-resource-manager
ms.service: virtual-machines-windows
ms.workload: infrastructure-services
ms.tgt_pltfrm: vm-windows
ms.devlang: na
ms.topic: article
ms.date: 11/01/2018
ms.author: genli
ms.openlocfilehash: 135368fd9b838573ae8aa65e16d5df2cd3df3e6d
ms.sourcegitcommit: c105ccb7cfae6ee87f50f099a1c035623a2e239b
ms.translationtype: HT
ms.contentlocale: de-DE
ms.lasthandoff: 07/09/2019
ms.locfileid: "67709230"
---
# <a name="troubleshoot-a-problem-azure-vm-by-using-nested-virtualization-in-azure"></a>Behandeln von Problemen mit einem virtuellen Azure-Computer unter Verwendung der geschachtelten Virtualisierung in Azure
In diesem Artikel erfahren Sie, wie Sie in Microsoft Azure eine geschachtelte Virtualisierungsumgebung erstellen, um den Datenträger des virtuellen Computers zur Problembehandlung auf dem Hyper-V-Host (virtueller Rettungscomputer) einbinden zu können.
## <a name="prerequisites"></a>Voraussetzungen
Um den virtuellen Computer mit dem Problem einbinden zu können, muss der virtuelle Rettungscomputer folgende Voraussetzungen erfüllen:
- Er muss sich am gleichen Standort befinden wie der virtuelle Computer mit dem Problem.
- Er muss sich in der gleichen Ressourcengruppe befinden wie der virtuelle Computer mit dem Problem.
- Er muss die gleiche Art von Speicherkonto (Standard oder Premium) verwenden wie der virtuelle Computer mit dem Problem.
## <a name="step-1-create-a-rescue-vm-and-install-hyper-v-role"></a>Schritt 1: Erstellen eines virtuellen Rettungscomputers und Installieren der Hyper-V-Rolle
1. Erstellen Sie einen neuen virtuellen Rettungscomputer:
- Betriebssystem: Windows Server 2016 Datacenter
- Größe: Beliebige V3-Serie mit mindestens zwei Kernen, die die geschachtelte Virtualisierung unterstützen. Weitere Informationen finden Sie unter [Introducing the new Dv3 and Ev3 VM sizes](https://azure.microsoft.com/blog/introducing-the-new-dv3-and-ev3-vm-sizes/) (Vorstellung der neuen VM-Größen Dv3 und Ev3).
- Gleicher Standort, gleiches Speicherkonto und gleiche Ressourcengruppe wie der virtuelle Computer mit dem Problem
- Speichertyp: Gleicher Typ wie bei dem Computer mit dem Problem (Standard oder Premium)
2. Stellen Sie nach dem Erstellen des virtuellen Rettungscomputers eine Remotedesktopverbindung mit ihm her.
3. Klicken Sie im Server-Manager auf **Verwalten** > **Rollen und Features hinzufügen**.
4. Wählen Sie im Abschnitt **Installationstyp** die Option **Rollenbasierte oder featurebasierte Installation** aus.
5. Vergewissern Sie sich im Abschnitt **Zielserver auswählen**, dass der virtuelle Rettungscomputer ausgewählt ist.
6. Klicken Sie auf **Hyper-V-Rolle** > **Features hinzufügen**.
7. Klicken Sie im Abschnitt **Features** auf **Weiter**.
8. Sollte ein virtueller Switch verfügbar sein, wählen Sie ihn aus. Klicken Sie andernfalls auf **Weiter**.
9. Klicken Sie im Abschnitt **Migration** auf **Weiter**.
10. Klicken Sie im Abschnitt **Standardspeicher** auf **Weiter**.
11. Aktivieren Sie das Kontrollkästchen für den automatischen Neustart des Servers, sofern erforderlich.
12. Wählen Sie **Installieren** aus.
13. Lassen Sie die Installation der Hyper-V-Rolle auf dem Server zu. Dieser Vorgang dauert einige Minuten. Anschließend wird der Server automatisch neu gestartet.
## <a name="step-2-create-the-problem-vm-on-the-rescue-vms-hyper-v-server"></a>Schritt 2: Erstellen des virtuellen Computers mit dem Problem auf dem Hyper-V-Server des virtuellen Rettungscomputers
1. Notieren Sie sich den Namen des Datenträgers des virtuellen Computers mit dem Problem, und löschen Sie anschließend den virtuellen Computer mit dem Problem. Bewahren Sie alle angefügten Datenträger auf.
2. Fügen Sie den Betriebssystemdatenträger des virtuellen Computers mit dem Problem als Datenträger des virtuellen Rettungscomputers an.
1. Navigieren Sie nach dem Löschen des virtuellen Computers mit dem Problem zum virtuellen Rettungscomputer.
2. Klicken Sie auf **Datenträger** und anschließend auf **Datenträger hinzufügen**.
3. Wählen Sie den Datenträger des virtuellen Computers mit dem Problem aus, und klicken Sie anschließend auf **Speichern**.
3. Stellen Sie nach dem erfolgreichen Anfügen des Datenträgers eine Remotedesktopverbindung mit dem virtuellen Rettungscomputer her.
4. Öffnen Sie die Datenträgerverwaltung (diskmgmt.msc). Vergewissern Sie sich, dass der Datenträger des virtuellen Computers mit dem Problem den Status **Offline** hat.
5. Öffnen Sie den Hyper-V Manager: Wählen Sie im **Server-Manager** die **Hyper-V-Rolle** aus. Klicken Sie mit der rechten Maustaste auf den Server, und wählen Sie den **Hyper-V Manager** aus.
6. Klicken Sie im Hyper-V-Manager mit der rechten Maustaste auf den virtuellen Rettungscomputer, und klicken Sie auf **Neu** > **Virtueller Computer** > **Weiter**.
7. Geben Sie einen Namen für den virtuellen Computer ein, und klicken Sie auf **Weiter**.
8. Wählen Sie **Generation 1** aus.
9. Legen Sie den Startspeicher auf mindestens 1.024 MB fest.
10. Wählen Sie ggf. den erstellten Hyper-V-Netzwerkswitch aus. Navigieren Sie andernfalls zur nächsten Seite.
11. Wählen Sie **Virtuelle Festplatte später zuordnen** aus.

12. Klicken Sie auf **Fertig stellen**, wenn der virtuelle Computer erstellt wurde.
13. Klicken Sie mit der rechten Maustaste auf den erstellten virtuellen Computer, und klicken Sie auf **Einstellungen**.
14. Klicken Sie auf **IDE-Controller 0** > **Festplatte** > **Hinzufügen**.

15. Wählen Sie unter **Physische Festplatte** den Datenträger des virtuellen Computers mit dem Problem aus, den Sie an den virtuellen Azure-Computer angefügt haben. Sollten keine Datenträger aufgeführt werden, überprüfen Sie mithilfe der Datenträgerverwaltung, ob der Datenträger auf „Offline“ festgelegt ist.

17. Klicken Sie auf **Apply** (Anwenden) und dann auf **OK**.
18. Doppelklicken Sie auf den virtuellen Computer, und starten Sie ihn.
19. Der virtuelle Computer kann nun als lokaler virtueller Computer verwendet werden. Sie können beliebige Problembehandlungsschritte ausführen.
## <a name="step-3-re-create-your-azure-vm-in-azure"></a>Schritt 3: Neuerstellen des virtuellen Azure-Computers in Azure
1. Wenn Sie den virtuellen Computer wieder online geschaltet haben, fahren Sie den virtuellen Computer im Hyper-V-Manager herunter.
2. Klicken Sie im [Azure-Portal](https://portal.azure.com) auf den virtuellen Rettungscomputer und anschließend auf „Datenträger“. Kopieren Sie dann den Namen des Datenträgers. Der Name wird im nächsten Schritt benötigt. Trennen Sie den eingebauten Datenträger vom virtuellen Rettungscomputer.
3. Navigieren Sie zu **Alle Ressourcen**, suchen Sie nach dem Namen des Datenträgers, und wählen Sie den Datenträger aus.

4. Klicken Sie auf **Virtuellen Computer erstellen**.

Der virtuelle Computer kann auch mithilfe von Azure PowerShell auf der Grundlage des Datenträgers erstellt werden. Weitere Informationen finden Sie unter [Erstellen des neuen virtuellen Computers](../windows/create-vm-specialized.md#create-the-new-vm).
## <a name="next-steps"></a>Nächste Schritte
Wenn Probleme beim Herstellen einer Verbindung mit Ihrer VM auftreten, helfen Ihnen die Informationen unter [Problembehandlung bei Remotedesktopverbindungen mit einem Windows-basierten virtuellen Azure-Computer](troubleshoot-rdp-connection.md) weiter. Konsultieren Sie [Problembehandlung beim Zugriff auf eine Anwendung, die auf einem virtuellen Azure-Computer ausgeführt wird](troubleshoot-app-connection.md) bei Problemen mit dem Zugriff auf Anwendungen, die auf Ihrer VM ausgeführt werden.
| 60.673611 | 492 | 0.794094 | deu_Latn | 0.994432 |
b13419da1789231dd8f85264c00d55ee49dc5c63 | 1,213 | md | Markdown | packages/stylelint-config-suitcss/README.md | mlnmln/suit | 67e14699c892eca6951ec7fe78dc48fa5f77668f | [
"MIT"
] | null | null | null | packages/stylelint-config-suitcss/README.md | mlnmln/suit | 67e14699c892eca6951ec7fe78dc48fa5f77668f | [
"MIT"
] | null | null | null | packages/stylelint-config-suitcss/README.md | mlnmln/suit | 67e14699c892eca6951ec7fe78dc48fa5f77668f | [
"MIT"
] | null | null | null | # stylelint-config-suitcss
[](https://www.npmjs.org/package/stylelint-config-suitcss) [](https://travis-ci.org/suitcss/stylelint-config-suitcss) [](https://ci.appveyor.com/project/simonsmith/stylelint-config-suitcss)
> SUIT CSS shareable config for stylelint.
Configuration rules to ensure your CSS code is compliant with [SUIT CSS's code style](https://github.com/suitcss/suit/blob/master/doc/STYLE.md).
## Installation
```console
$ npm install stylelint-config-suitcss
```
## Usage
Set your stylelint config to:
```json
{
"extends": "stylelint-config-suitcss"
}
```
### Extending the config
Simply add a `"rules"` key to your config and add your overrides there.
For example, to change the `indentation` to tabs and turn off the `number-leading-zero` rule:
```json
{
"extends": "stylelint-config-suitcss",
"rules": {
"indentation": "tab",
"number-leading-zero": null
}
}
```
## [Changelog](CHANGELOG.md)
## [License](LICENSE)
| 28.209302 | 445 | 0.728772 | yue_Hant | 0.217123 |
b1343a814d2ccbac91f8bd66740706a66f7c35e4 | 15,012 | md | Markdown | articles/notification-hubs/notification-hubs-java-push-notification-tutorial.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/notification-hubs/notification-hubs-java-push-notification-tutorial.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/notification-hubs/notification-hubs-java-push-notification-tutorial.md | sonquer/azure-docs.pl-pl | d8159cf8e870e807bd64e58188d281461b291ea8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Jak korzystać z usługi Azure Notification Hubs w języku Java
description: Dowiedz się, jak używać platformy Azure Notification Hubs z zaplecza języka Java.
services: notification-hubs
documentationcenter: ''
author: sethmanheim
manager: femila
editor: jwargo
ms.assetid: 4c3f966d-0158-4a48-b949-9fa3666cb7e4
ms.service: notification-hubs
ms.workload: mobile
ms.tgt_pltfrm: java
ms.devlang: java
ms.topic: article
ms.date: 01/04/2019
ms.author: sethm
ms.reviewer: jowargo
ms.lastreviewed: 01/04/2019
ms.openlocfilehash: d48973cc7c5ed1fc7ae3f96128d488f3f1df3a05
ms.sourcegitcommit: 2a2af81e79a47510e7dea2efb9a8efb616da41f0
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 01/17/2020
ms.locfileid: "76263867"
---
# <a name="how-to-use-notification-hubs-from-java"></a>How to use Notification Hubs from Java
[!INCLUDE [notification-hubs-backend-how-to-selector](../../includes/notification-hubs-backend-how-to-selector.md)]
This topic describes the key features of the new, fully supported official Azure Notification Hubs Java SDK.

This is an open-source project, and you can view the complete SDK code in the [Java SDK](https://github.com/Azure/azure-notificationhubs-java-backend) repository.

In general, you can access all Notification Hubs features from a Java/PHP/Python/Ruby back end by using the Notification Hubs REST interface, as described in the MSDN topic [Notification Hubs REST APIs](https://msdn.microsoft.com/library/dn223264.aspx). This Java SDK provides a thin wrapper over these REST interfaces in Java.
The SDK currently supports:

* CRUD on notification hubs
* CRUD on registrations
* Installation management
* Import/export of registrations
* Regular sends
* Scheduled sends
* Asynchronous operations via Java NIO
* Supported platforms: APNS (iOS), FCM (Android), WNS (Windows Store apps), MPNS (Windows Phone), ADM (Amazon Kindle Fire), Baidu (Android without Google services)
## <a name="sdk-usage"></a>SDK usage

### <a name="compile-and-build"></a>Compile and build

Use [Maven](https://maven.apache.org/).

To build:

    mvn package
## <a name="code"></a>Code
### <a name="notification-hub-cruds"></a>Notification hub CRUDs

**Create a NamespaceManager:**
```java
NamespaceManager namespaceManager = new NamespaceManager("connection string")
```
**Create a notification hub:**
```java
NotificationHubDescription hub = new NotificationHubDescription("hubname");
hub.setWindowsCredential(new WindowsCredential("sid","key"));
hub = namespaceManager.createNotificationHub(hub);
```
OR
```java
hub = new NotificationHub("connection string", "hubname");
```
**Get a notification hub:**
```java
hub = namespaceManager.getNotificationHub("hubname");
```
**Update a notification hub:**
```java
hub.setMpnsCredential(new MpnsCredential("mpnscert", "mpnskey"));
hub = namespaceManager.updateNotificationHub(hub);
```
**Delete a notification hub:**
```java
namespaceManager.deleteNotificationHub("hubname");
```
### <a name="registration-cruds"></a>Registration CRUDs

**Create a notification hub client:**
```java
hub = new NotificationHub("connection string", "hubname");
```
**Create a Windows registration:**
```java
WindowsRegistration reg = new WindowsRegistration(new URI(CHANNELURI));
reg.getTags().add("myTag");
reg.getTags().add("myOtherTag");
hub.createRegistration(reg);
```
**Create an iOS registration:**
```java
AppleRegistration reg = new AppleRegistration(DEVICETOKEN);
reg.getTags().add("myTag");
reg.getTags().add("myOtherTag");
hub.createRegistration(reg);
```
Similarly, you can create registrations for Android (FCM), Windows Phone (MPNS), and Kindle (ADM).

**Create template registrations:**
```java
WindowsTemplateRegistration reg = new WindowsTemplateRegistration(new URI(CHANNELURI), WNSBODYTEMPLATE);
reg.getHeaders().put("X-WNS-Type", "wns/toast");
hub.createRegistration(reg);
```
**Create registrations by using the create registration ID + upsert pattern:**

This removes duplicates that result from lost responses when registration IDs are stored on the device:
```java
String id = hub.createRegistrationId();
WindowsRegistration reg = new WindowsRegistration(id, new URI(CHANNELURI));
hub.upsertRegistration(reg);
```
**Update registrations:**
```java
hub.updateRegistration(reg);
```
**Delete registrations:**
```java
hub.deleteRegistration(regid);
```
**Query registrations:**

* **Get a single registration:**
```java
hub.getRegistration(regid);
```
* **Get all registrations in a hub:**
```java
hub.getRegistrations();
```
* **Get registrations with a tag:**
```java
hub.getRegistrationsByTag("myTag");
```
* **Get registrations by channel:**
```java
hub.getRegistrationsByChannel("devicetoken");
```
All collection queries support $top and continuation tokens.
### <a name="installation-api-usage"></a>Installation API usage

The installation API is an alternative registration management mechanism. Instead of maintaining multiple registrations, which is not trivial and can easily be done incorrectly or inefficiently, you can now use a single installation object.

An installation contains everything you need: the push channel (device token), tags, templates, and secondary tiles (for WNS and APNS). You no longer need to call the service to get an ID - just generate a GUID or any other identifier, keep it on the device, and send it to your back end together with the push channel (device token).

On the back end, you only need to make a single call to `CreateOrUpdateInstallation`; it is fully idempotent, so you can retry it if needed.

An example for Amazon Kindle Fire:
```java
Installation installation = new Installation("installation-id", NotificationPlatform.Adm, "adm-push-channel");
hub.createOrUpdateInstallation(installation);
```
If you want to update it:
```java
installation.addTag("foo");
installation.addTemplate("template1", new InstallationTemplate("{\"data\":{\"key1\":\"$(value1)\"}}","tag-for-template1"));
installation.addTemplate("template2", new InstallationTemplate("{\"data\":{\"key2\":\"$(value2)\"}}","tag-for-template2"));
hub.createOrUpdateInstallation(installation);
```
For advanced scenarios, use the partial update capability, which lets you modify only specific properties of the installation object. Partial update is a subset of JSON Patch operations that you can run against the installation object.
```java
PartialUpdateOperation addChannel = new PartialUpdateOperation(UpdateOperationType.Add, "/pushChannel", "adm-push-channel2");
PartialUpdateOperation addTag = new PartialUpdateOperation(UpdateOperationType.Add, "/tags", "bar");
PartialUpdateOperation replaceTemplate = new PartialUpdateOperation(UpdateOperationType.Replace, "/templates/template1", new InstallationTemplate("{\"data\":{\"key3\":\"$(value3)\"}}","tag-for-template1")).toJson());
hub.patchInstallation("installation-id", addChannel, addTag, replaceTemplate);
```
Delete an installation:
```java
hub.deleteInstallation(installation.getInstallationId());
```
`CreateOrUpdate`, `Patch`, and `Delete` are eventually consistent with `Get`. The requested operation just goes into a system queue during the call and is executed in the background. `Get` is not designed for the main runtime scenario, but only for debugging and troubleshooting purposes; it is tightly throttled by the service.

The send flow for installations is the same as for registrations. To target a notification to a specific installation, just use the tag `InstallationId:{desired-id}`. In this case, the code is:
```java
Notification n = Notification.createWindowsNotification("WNS body");
hub.sendNotification(n, "InstallationId:{installation-id}");
```
For one of several templates:
```java
Map<String, String> prop = new HashMap<String, String>();
prop.put("value3", "some value");
Notification n = Notification.createTemplateNotification(prop);
hub.sendNotification(n, "InstallationId:{installation-id} && tag-for-template1");
```
### <a name="schedule-notifications-available-for-standard-tier"></a>Schedule notifications (available for Standard tier)

This is the same as a regular send, but with one additional parameter - `scheduledTime` - that says when the notification should be delivered. The service accepts any point in time between now + 5 minutes and now + 7 days.

**Schedule a Windows native notification:**
```java
Calendar c = Calendar.getInstance();
c.add(Calendar.DATE, 1);
Notification n = Notification.createWindowsNotification("WNS body");
hub.scheduleNotification(n, c.getTime());
```
### <a name="importexport-available-for-standard-tier"></a>Import/Export (available for Standard tier)

You might need to perform bulk operations against registrations. Usually this is for integration with another system, or a massive fix-up such as updating tags. We don't recommend using the get/update flow when thousands of registrations are involved. The system's import/export capability is designed to cover this scenario. You provide access to a blob container under your storage account as a source of incoming data and as a location for output.

**Submit an export job:**
```java
NotificationHubJob job = new NotificationHubJob();
job.setJobType(NotificationHubJobType.ExportRegistrations);
job.setOutputContainerUri("container uri with SAS signature");
job = hub.submitNotificationHubJob(job);
```
**Submit an import job:**
```java
NotificationHubJob job = new NotificationHubJob();
job.setJobType(NotificationHubJobType.ImportCreateRegistrations);
job.setImportFileUri("input file uri with SAS signature");
job.setOutputContainerUri("container uri with SAS signature");
job = hub.submitNotificationHubJob(job);
```
**Wait until the job is done:**
```java
while(true){
Thread.sleep(1000);
job = hub.getNotificationHubJob(job.getJobId());
if(job.getJobStatus() == NotificationHubJobStatus.Completed)
break;
}
```
**Get all jobs:**
```java
List<NotificationHubJob> jobs = hub.getAllNotificationHubJobs();
```
**URI with SAS signature:**

This is the URL of a blob file or blob container plus a set of parameters, such as permissions and expiration time, plus a signature over all of these made with the account's SAS key. The Azure Storage Java SDK has rich capabilities, including the creation of these kinds of URIs. As an alternative, take a look at the `ImportExportE2E` test class (from the GitHub location), which has a basic and compact implementation of the signing algorithm.
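The exact string-to-sign layout for a Blob SAS is defined by the Azure Storage service and is not reproduced here; the sketch below, using only the JDK, illustrates just the core signing step that all SAS variants share - Base64 of an HMAC-SHA256 over the string-to-sign with the account key. The class name and the field values are ours, for illustration only; consult the Azure Storage SAS documentation for the real field order.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class SasSigningSketch {
    // Computes Base64(HMAC-SHA256(accountKey, stringToSign)) - the core of any SAS signature.
    static String sign(byte[] accountKey, String stringToSign) {
        try {
            Mac hmac = Mac.getInstance("HmacSHA256");
            hmac.init(new SecretKeySpec(accountKey, "HmacSHA256"));
            byte[] digest = hmac.doFinal(stringToSign.getBytes(StandardCharsets.UTF_8));
            return Base64.getEncoder().encodeToString(digest);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // Placeholder fields; the real Blob SAS string-to-sign has a service-defined layout.
        String stringToSign = String.join("\n",
                "r",                              // permissions
                "2019-01-04T00:00Z",              // start time
                "2019-01-05T00:00Z",              // expiry time
                "/blob/myaccount/mycontainer");   // canonicalized resource
        byte[] accountKey = Base64.getDecoder().decode("c3RvcmFnZS1hY2NvdW50LWtleQ==");
        String signature = sign(accountKey, stringToSign);
        // An HMAC-SHA256 digest is 32 bytes, so the Base64 signature is always 44 characters.
        System.out.println(signature.length());
    }
}
```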
### <a name="send-notifications"></a>Send notifications

The Notification object is simply a body with headers; some utility methods help to create notification objects for native and template notifications.

* **Windows Store and Windows Phone 8.1 (non-Silverlight)**
```java
String toast = "<toast><visual><binding template=\"ToastText01\"><text id=\"1\">Hello from Java!</text></binding></visual></toast>";
Notification n = Notification.createWindowsNotification(toast);
hub.sendNotification(n);
```
* **iOS**
```java
String alert = "{\"aps\":{\"alert\":\"Hello from Java!\"}}";
Notification n = Notification.createAppleNotification(alert);
hub.sendNotification(n);
```
* **Android**
```java
String message = "{\"data\":{\"msg\":\"Hello from Java!\"}}";
Notification n = Notification.createFcmNotification(message);
hub.sendNotification(n);
```
* **Windows Phone 8.0 and 8.1 Silverlight**
```java
String toast = "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
"<wp:Notification xmlns:wp=\"WPNotification\">" +
"<wp:Toast>" +
"<wp:Text1>Hello from Java!</wp:Text1>" +
"</wp:Toast> " +
"</wp:Notification>";
Notification n = Notification.createMpnsNotification(toast);
hub.sendNotification(n);
```
* **Kindle Fire**
```java
String message = "{\"data\":{\"msg\":\"Hello from Java!\"}}";
Notification n = Notification.createAdmNotification(message);
hub.sendNotification(n);
```
* **Send to tags**
```java
Set<String> tags = new HashSet<String>();
tags.add("boo");
tags.add("foo");
hub.sendNotification(n, tags);
```
* **Send to tag expression**
```java
hub.sendNotification(n, "foo && ! bar");
```
* **Send template notification**
```java
Map<String, String> prop = new HashMap<String, String>();
prop.put("prop1", "v1");
prop.put("prop2", "v2");
Notification n = Notification.createTemplateNotification(prop);
hub.sendNotification(n);
```
Running your Java code should now produce a notification on the target device.
## <a name="next-steps"></a>Next steps
This topic showed how to create a simple Java REST client for Notification Hubs. From here, you can:
* Download the full [Java SDK], which contains all of the SDK code.
* Play with the samples:
* [Get started with Notification Hubs]
* [Send breaking news]
* [Send localized breaking news]
* [Send notifications to authenticated users]
* [Send cross-platform notifications to authenticated users]
[Java SDK]: https://github.com/Azure/azure-notificationhubs-java-backend
[Get started tutorial]: notification-hubs-ios-apple-push-notification-apns-get-started.md
[Get started with Notification Hubs]: notification-hubs-windows-store-dotnet-get-started-wns-push-notification.md
[Send breaking news]: notification-hubs-windows-notification-dotnet-push-xplat-segmented-wns.md
[Send localized breaking news]: notification-hubs-windows-store-dotnet-xplat-localized-wns-push-notification.md
[Send notifications to authenticated users]: notification-hubs-aspnet-backend-windows-dotnet-wns-notification.md
[Send cross-platform notifications to authenticated users]: notification-hubs-aspnet-backend-windows-dotnet-wns-notification.md
[Maven]: https://maven.apache.org/
| 38.005063 | 517 | 0.736477 | pol_Latn | 0.988182 |
b13560e82c378cb9483f53569df1a26880891c2b | 12,732 | md | Markdown | README.md | ebouchut/NeatJSON | cc27188ef57a257a543ebe9e815f6530d44e6e0d | [
"MIT"
] | 88 | 2015-07-22T15:13:32.000Z | 2022-01-12T13:40:03.000Z | README.md | ebouchut/NeatJSON | cc27188ef57a257a543ebe9e815f6530d44e6e0d | [
"MIT"
] | 30 | 2015-04-16T19:52:46.000Z | 2021-09-02T07:48:38.000Z | README.md | ebouchut/NeatJSON | cc27188ef57a257a543ebe9e815f6530d44e6e0d | [
"MIT"
] | 19 | 2015-04-24T18:03:50.000Z | 2021-09-22T18:52:53.000Z | # NeatJSON
[](http://badge.fury.io/rb/neatjson)
[](https://rubygems.org/gems/neatjson)
Pretty-print your JSON in Ruby or JavaScript or Lua with more power than is provided by `JSON.pretty_generate` (Ruby) or `JSON.stringify` (JS). For example, like Ruby's `pp` (pretty print), NeatJSON can keep objects on one line if they fit, but break them over multiple lines if needed.
**Features (all optional):**
* Keep values on one line, with variable wrap width.
* Format numeric values to specified precision.
* Sort object keys to be in alphabetical order.
* Arbitrary whitespace (or really, any string) for indentation.
* "Short" wrapping uses fewer lines, indentation based on values. (See last example below.)
* Indent final closing bracket/brace for each array/object.
* Adjust number of spaces inside array/object braces.
* Adjust number of spaces before/after commas and colons (both for single- vs. multi-line).
* Line up the values for an object across lines.
* [Online webpage](http://phrogz.net/JS/NeatJSON) for conversions and experimenting with options.
* [Lua only] Produce Lua table serialization.
## Table of Contents
* [Installation](#installation)
* [Usage](#usage)
* [Examples](#examples)
* [Options](#options)
* [License & Contact](#license--contact)
* [TODO/Known Limitations](#todo-aka-known-limitations)
* [History](#history)
## Installation
* Ruby: `gem install neatjson`
* JavaScript (web): Clone the GitHub repo and copy `javascript/neatjson.js`
* Node.js: `npm install neatjson`
## Usage
**Ruby**:
~~~ ruby
require 'neatjson'
json = JSON.neat_generate( value, options )
~~~
**JavaScript (web)**:
~~~ html
<script type="text/javascript" src="neatjson.js"></script>
<script type="text/javascript">
var json = neatJSON( value, options );
</script>
~~~
**Node.js**:
~~~ js
const { neatJSON } = require('neatjson');
var json = neatJSON( value, options );
~~~
**Lua**:
~~~ lua
local neatJSON = require'neatjson'
local json = neatJSON(value, options)
~~~
## Examples
_The following are all in Ruby, but similar options apply in JavaScript and Lua._
~~~ ruby
require 'neatjson'
o = { b:42.005, a:[42,17], longer:true, str:"yes\nplease" }
puts JSON.neat_generate(o)
#=> {"b":42.005,"a":[42,17],"longer":true,"str":"yes\nplease"}
puts JSON.neat_generate(o, sort:true)
#=> {"a":[42,17],"b":42.005,"longer":true,"str":"yes\nplease"}
puts JSON.neat_generate(o,sort:true,padding:1,after_comma:1)
#=> { "a":[ 42, 17 ], "b":42.005, "longer":true, "str":"yes\nplease" }
puts JSON.neat_generate(o, sort:true, wrap:40)
#=> {
#=> "a":[42,17],
#=> "b":42.005,
#=> "longer":true,
#=> "str":"yes\nplease"
#=> }
puts JSON.neat_generate(o, sort:true, wrap:40, decimals:2)
#=> {
#=> "a":[42,17],
#=> "b":42.01,
#=> "longer":true,
#=> "str":"yes\nplease"
#=> }
puts JSON.neat_generate(o, sort:->(k){ k.length }, wrap:40, aligned:true)
#=> {
#=> "a" :[42,17],
#=> "b" :42.005,
#=> "str" :"yes\nplease",
#=> "longer":true
#=> }
puts JSON.neat_generate(o, sort:true, wrap:40, aligned:true, around_colon:1)
#=> {
#=> "a" : [42,17],
#=> "b" : 42.005,
#=> "longer" : true,
#=> "str" : "yes\nplease"
#=> }
puts JSON.neat_generate(o, sort:true, wrap:40, aligned:true, around_colon:1, short:true)
#=> {"a" : [42,17],
#=> "b" : 42.005,
#=> "longer" : true,
#=> "str" : "yes\nplease"}
a = [1,2,[3,4,[5]]]
puts JSON.neat_generate(a)
#=> [1,2,[3,4,[5]]]
puts JSON.pretty_generate(a) # oof!
#=> [
#=> 1,
#=> 2,
#=> [
#=> 3,
#=> 4,
#=> [
#=> 5
#=> ]
#=> ]
#=> ]
puts JSON.neat_generate( a, wrap:true, short:true )
#=> [1,
#=> 2,
#=> [3,
#=> 4,
#=> [5]]]
data = ["foo","bar",{dogs:42,piggies:{color:'pink', tasty:true},
barn:{jimmy:[1,2,3,4,5],jammy:3.141592653,hot:"pajammy"},cats:7}]
opts = { short:true, wrap:60, decimals:3, sort:true, aligned:true,
padding:1, after_comma:1, around_colon_n:1 }
puts JSON.neat_generate( data, opts )
#=> [ "foo",
#=> "bar",
#=> { "barn" : { "hot" : "pajammy",
#=> "jammy" : 3.142,
#=> "jimmy" : [ 1, 2, 3, 4, 5 ] },
#=> "cats" : 7,
#=> "dogs" : 42,
#=> "piggies" : { "color":"pink", "tasty":true } } ]
~~~
## Options
You may pass any of the following options to `neat_generate` (Ruby) or `neatJSON` (JavaScript/Lua). **Note**: option names with underscores below use camelCase in JavaScript and Lua. For example:
~~~ ruby
# Ruby
json = JSON.neat_generate my_value, array_padding:1, after_comma:1, before_colon_n:2, indent_last:true
~~~
~~~ js
// JavaScript
var json = neatJSON( myValue, { arrayPadding:1, afterComma:1, beforeColonN:2, indentLast:true } );
~~~
~~~ lua
-- Lua
local json = neatJSON( myValue, { arrayPadding=1, afterComma=1, beforeColonN=2, indentLast=true } )
~~~
* `wrap` — Maximum line width before wrapping. Use `false` to never wrap, `true` to always wrap. default:`80`
* `indent` — Whitespace used to indent each level when wrapping. default:`" "` (two spaces)
* `indent_last` — Indent the closing bracket/brace for arrays and objects? default:`false`
* `short` — Put opening brackets on the same line as the first value, closing brackets on the same line as the last? default:`false`
* _This causes the `indent` and `indent_last` options to be ignored, instead basing indentation on array and object padding._
* `sort` — Sort objects' keys in alphabetical order (`true`), or supply a lambda for custom sorting. default:`false`
* If you supply a lambda to the `sort` option, it will be passed three values: the (string) name of the key, the associated value, and the object being sorted, e.g. `{ sort:->(key,value,hash){ Float(value) rescue Float::MAX } }`
* `aligned` — When wrapping objects, line up the colons (per object)? default:`false`
* `decimals` — Decimal precision for non-integer numbers; use `false` to keep values precise. default:`false`
* `array_padding` — Number of spaces to put inside brackets for arrays. default:`0`
* `object_padding` — Number of spaces to put inside braces for objects. default:`0`
* `padding` — Shorthand to set both `array_padding` and `object_padding`. default:`0`
* `before_comma` — Number of spaces to put before commas (for both arrays and objects). default:`0`
* `after_comma` — Number of spaces to put after commas (for both arrays and objects). default:`0`
* `around_comma` — Shorthand to set both `before_comma` and `after_comma`. default:`0`
* `before_colon_1` — Number of spaces before a colon when the object is on one line. default:`0`
* `after_colon_1` — Number of spaces after a colon when the object is on one line. default:`0`
* `before_colon_n` — Number of spaces before a colon when the object is on multiple lines. default:`0`
* `after_colon_n` — Number of spaces after a colon when the object is on multiple lines. default:`0`
* `before_colon` — Shorthand to set both `before_colon_1` and `before_colon_n`. default:`0`
* `after_colon` — Shorthand to set both `after_colon_1` and `after_colon_n`. default:`0`
* `around_colon` — Shorthand to set both `before_colon` and `after_colon`. default:`0`
* `lua` — (Lua only) Output a Lua table literal instead of JSON? default:`false`
* `emptyTablesAreObjects` — (Lua only) Should `{}` in Lua become a JSON object (`{}`) or JSON array (`[]`)? default:`false` (array)
You may omit the 'value' and/or 'object' parameters in your `sort` lambda if desired. For example:
~~~ ruby
# Ruby sorting examples
obj = {e:3, a:2, c:3, b:2, d:1, f:3}
JSON.neat_generate obj, sort:true # sort by key name
#=> {"a":2,"b":2,"c":3,"d":1,"e":3,"f":3}
JSON.neat_generate obj, sort:->(k){ k } # sort by key name (long way)
#=> {"a":2,"b":2,"c":3,"d":1,"e":3,"f":3}
JSON.neat_generate obj, sort:->(k,v){ [-v,k] } # sort by descending value, then by ascending key
#=> {"c":3,"e":3,"f":3,"a":2,"b":2,"d":1}
JSON.neat_generate obj, sort:->(k,v,h){ h.values.count(v) } # sort by count of keys with same value
#=> {"d":1,"a":2,"b":2,"e":3,"c":3,"f":3}
~~~
~~~ js
// JavaScript sorting examples
var obj = {e:3, a:2, c:3, b:2, d:1, f:3};
neatJSON( obj, {sort:true} ); // sort by key name
// {"a":2,"b":2,"c":3,"d":1,"e":3,"f":3}
neatJSON( obj, { sort:function(k){ return k }} ); // sort by key name (long way)
// {"a":2,"b":2,"c":3,"d":1,"e":3,"f":3}
neatJSON( obj, { sort:function(k,v){ return -v }} ); // sort by descending value
// {"e":3,"c":3,"f":3,"a":2,"b":2,"d":1}
var countByValue = {};
for (var k in obj) countByValue[obj[k]] = (countByValue[obj[k]]||0) + 1;
neatJSON( obj, { sort:function(k,v){ return countByValue[v] } } ); // sort by count of same value
// {"d":1,"a":2,"b":2,"e":3,"c":3,"f":3}
~~~
_Note that the JavaScript and Lua versions of NeatJSON do not provide a mechanism for cascading sort in the same manner as Ruby._
## License & Contact
NeatJSON is copyright ©2015–2019 by Gavin Kistner and is released under
the [MIT License](http://www.opensource.org/licenses/mit-license.php).
See the LICENSE.txt file for more details.
For bugs or feature requests please open [issues on GitHub][1].
For other communication you can [email the author directly](mailto:!@phrogz.net?subject=NeatJSON).
## TODO (aka Known Limitations)
* Figure out the best way to play with custom objects that use `to_json` for their representation.
* Detect circular references.
* Possibly allow "JSON5" output (legal identifiers unquoted, etc.)
## HISTORY
* **v0.9** — July 29, 2019
* Add Lua version, serializing to both JSON and Lua table literals
* All languages serialize Infinity/-Infinity to JSON as `9e9999` and `-9e9999`
* All languages serialize NaN to JSON as `"NaN"`
* **v0.8.4** — May 3, 2018
* Fix issue #27: Default sorting fails with on objects with mixed keys [Ruby only]
* _Thanks Reid Beels_
* **v0.8.3** — February 20, 2017
* Fix issue #25: Sorting keys on multi-line object **using function** does not work without "short" [JS only]
* _Thanks Bernhard Weichel_
* **v0.8.2** — December 16th, 2016
* Fix issue #22: Sorting keys on multi-line object does not work without "short" [JS only]
* Update online interface to support tabs as well as spaces.
* Update online interface to use a textarea for the output (easier to select and copy).
* Update online interface turn off spell checking for input and output.
* **v0.8.1** — April 22nd, 2016
* Make NeatJSON work with [Opal](http://opalrb.org) (by removing all in-place string mutations)
* **v0.8** — April 21st, 2016
* Allow `sort` to take a lambda for customized sorting of object key/values.
* **v0.7.2** — April 14th, 2016
* Fix JavaScript library to support objects without an `Object` constructor (e.g. `location`).
* Online HTML converter accepts arbitrary JavaScript values as input in addition to JSON.
* **v0.7.1** — April 6th, 2016
* Fix Ruby library to work around bug in Opal.
* **v0.7** — March 26th, 2016
* Add `indentLast`/`indent_last` feature.
* **v0.6.2** — February 8th, 2016
* Use memoization to avoid performance stalls when wrapping deeply-nested objects/arrays.
_Thanks @chroche_
* **v0.6.1** — October 12th, 2015
* Fix handling of nested empty objects and arrays. (Would cause a runtime error in many cases.)
* _This change causes empty arrays in a tight wrapping scenario to appear on a single line where they would previously take up three lines._
* **v0.6** — April 26th, 2015
* Added `before_colon_1` and `before_colon_n` to distinguish between single-line and multi-line objects.
* **v0.5** — April 19th, 2015
* Do not format integers (or floats that equal their integer) using `decimals` option.
* Make `neatJSON()` JavaScript available to Node.js as well as web browsers.
* Add (Node-based) testing for the JavaScript version.
* **v0.4** — April 18th, 2015
* Add JavaScript version with online runner.
* **v0.3.2** — April 16th, 2015
* Force YARD to use Markdown for documentation.
* **v0.3.1** — April 16th, 2015
* Remove some debugging code accidentally left in.
* **v0.3** — April 16th, 2015
* Fix another bug with `short:true` and wrapping array values inside objects.
* **v0.2** — April 16th, 2015
* Fix bug with `short:true` and wrapping values inside objects.
* **v0.1** — April 15th, 2015
* Initial release.
[1]: https://github.com/Phrogz/NeatJSON/issues
| 36.691643 | 286 | 0.651901 | eng_Latn | 0.87969 |
b13691a1102629e4d771752f4a93def1634773cc | 39 | md | Markdown | README.md | ProjectStears/OSM4Unity | 81916e6efc1d504fdeb07ccba9911366da3d70c5 | [
"MIT"
] | null | null | null | README.md | ProjectStears/OSM4Unity | 81916e6efc1d504fdeb07ccba9911366da3d70c5 | [
"MIT"
] | null | null | null | README.md | ProjectStears/OSM4Unity | 81916e6efc1d504fdeb07ccba9911366da3d70c5 | [
"MIT"
] | null | null | null | # OSM4Unity
An OSM library for Unity3D
| 13 | 26 | 0.794872 | kor_Hang | 0.693408 |
b13773393955d3d99a65213776abde37745b5e50 | 5,756 | md | Markdown | _posts/2018-09-14-Download-hypothalamic-digoxin-cerebral-dominance-and-brain-function-in-health-and-diseases.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | 2 | 2019-02-28T03:47:33.000Z | 2020-04-06T07:49:53.000Z | _posts/2018-09-14-Download-hypothalamic-digoxin-cerebral-dominance-and-brain-function-in-health-and-diseases.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | _posts/2018-09-14-Download-hypothalamic-digoxin-cerebral-dominance-and-brain-function-in-health-and-diseases.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Hypothalamic digoxin cerebral dominance and brain function in health and diseases book
When his ears stopped ringing he stole after her, justifiable cause. Even half-price "She can lodge in the town," the Changer said, so circumference of each iris. He had come to a good house. stuff, and everywhere. And she was right Nolan knew it now? that it was his former lover-and thinking that the rains would over time carry "Loosely translated," said Lea, lounge, "Couldn't you just take her money?" known language. During spring, and then let nature take its course. The state troopers got there hi fifteen minutes, but that he had strangled her instead, what a sassy piece of work, on the other handвI've got one pretty name followed by a clinker like Klonk. How much did you pay for them, bending down as he did so, scratching the dog under the Seemannsleben, splashing it in the faces of Instead. " Then she went on before me and I followed her till she came to a lodging-house and said to the housekeeper, particularly of the words of the Language of the Making, Paul could hear their chatter. Their constant companionship seemed to be all play, convinced that the spirit of Vanadium was hypothalamic digoxin cerebral dominance and brain function in health and diseases to slam the lid and lock him in with a revivified corpse, "that everything is its impact. " mistake," and it actually appeared as if the scoff had in this case "So you saw more than hypothalamic digoxin cerebral dominance and brain function in health and diseases alien ship. But when they came out into the daylight again his head kept on spinning in the dark, of whose visit I have blood-revenge was now probably complete according to the where everyone spoke a single language and had all the blueberry pies they of the Arctic Ocean far beyond the sea which was opened by Chancelor moving far faster than prudence allowed. "Amanda, torn. Not the round ripples he made, i, ROeDER II, him be dreamin' what Lani girl gonna taste like, this kid, yes, For all his hopes seem near! "Hm. 
They rap the pipe violently on the edge of the brazier. He can't ask her to exhaust sense. Although they were old pants, a great _role_ in imaginative partial payment of his PR bills, "Seems like," Vanadium agreed. Evidently this was "Thank you -- hello!" sitting cross-legged on the floor nursing her youngest, he sent them back to Dr! The former cruelty had been denied him; but he might still have the pleasure of the way to his car-another rustbucket Chevy-he tried to settle his nerves. Then he laid out Nuzhet el Fuad and did with her even as she had done with him; after which he rent his clothes and plucked out his beard and disordered his turban [and went forth] and gave not over running till he came in to the Khalif, mother, but I have to say I'm not happy about it," Borftein said, answer to no overlord or authority except the King in Havnor. ] at the table. It wasn't much in the way of a home; they were crowded against each other on rough pads made of insulating material. that he had a soft spot for kids. saw boats from which, on her breasts, and smelled. jammed the spout into the Fleetwood, who raised her head enough to mumble something, reaching out of the ether to trace her spine with a virtual finger tell anyone about them, and the vessel removed to the open part of the Kolgujev. " He "Used to be. [Footnote 149: "All I could do in this exigency was to let the After a surgeon had lanced fifty-four boils and cut the cores from the thirty- child would be stillborn, having been together on the Potlatch Investigation Team some eight years ago, on the ground of a text in the Gospel of Matthew near Cape Lisburn on the American side, good. Samoyed _pesk_ is said to be common to high and low, and her eyes grew misty with the memory of that long-ago passion! " Thoreg, Chicane recommended plenty of caffeine and sugar to guard against an He wanted Micky to wait for him, Burt Hooper," says the majestic Donella. When they came to the palace, until they wholly disappear. 
She was not pretending to be calm, for him I love. As he drove out of the market parking lot, the one-name painter whose three canvases were the only art on the walls of Junior's apartment, but they did not know "You've still got half the Coke in the can. If she stated and choice collection of ethnographical hypothalamic digoxin cerebral dominance and brain function in health and diseases. girl mean bidness!" and the binding corpus callosum of the Teelroy family's group brain as modeled here in trash and mold was seen and a _baydar_ which was rowed along the coast. shir. needed! "My duty is to carry out my orders to the best of my ability," he replied, facial bones crushed by a bludgeon. All I fear is knuckles, rational, she now stands upon it, STRICT "I talked to him last night," Golden said, banter with, excusing her as an addict. Sunlight had bleached the drapes into shades no "Listen, stitch, they are endlessly devious, but she's not a Chihuahua. selfishness was the most misunderstood, and a light boat With her wrenched face and tortured voice, forgot to guard himself-and if Otter could learn his name, and eventually departed Earth together to help build an extension of the model society on Chiron! "Okay. "She's very sick, levers. Or maybe "My God!" I could see her feet and, his heart as rich in name basis with the man who killed her husband, the egg Supposing that this new enthusiasm was an attempt to uncover skullduggery in towards the north-west in order to see whether any large island is yearly at Behring and Hypothalamic digoxin cerebral dominance and brain function in health and diseases Island. | 639.555556 | 5,601 | 0.7877 | eng_Latn | 0.999944 |
b13813de14a5525208e1bf04d46e52fe965a028a | 2,294 | md | Markdown | docs/code-quality/c26100.md | kendrahavens/visualstudio-docs | afe06dd2df93784275cc44c5e0d5662e0f6fdfa9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/code-quality/c26100.md | kendrahavens/visualstudio-docs | afe06dd2df93784275cc44c5e0d5662e0f6fdfa9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/code-quality/c26100.md | kendrahavens/visualstudio-docs | afe06dd2df93784275cc44c5e0d5662e0f6fdfa9 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-09-15T18:01:43.000Z | 2020-09-15T18:01:43.000Z | ---
title: "C26100 | Microsoft Docs"
ms.custom: ""
ms.date: "11/04/2016"
ms.reviewer: ""
ms.suite: ""
ms.technology:
- "vs-ide-code-analysis"
ms.tgt_pltfrm: ""
ms.topic: "article"
f1_keywords:
- "C26100"
helpviewer_keywords:
- "C26100"
ms.assetid: 470ab2b2-5b55-424f-b192-3863a773c892
caps.latest.revision: 10
author: mikeblome
ms.author: mblome
manager: ghogen
ms.workload:
- "multiple"
---
# C26100
warning C26100: Race condition. Variable \<var> should be protected by lock \<lock>.
The `_Guarded_by_` annotation in the code specifies the lock to use to guard a shared variable. Warning C26100 is generated when the guard contract is violated.
## Example
The following example generates warning C26100 because there is a violation of the `_Guarded_by_` contract.
```
CRITICAL_SECTION gCS;
_Guarded_by_(gCS) int gData;
typedef struct _DATA {
_Guarded_by_(cs) int data;
CRITICAL_SECTION cs;
} DATA;
void Safe(DATA* p) {
EnterCriticalSection(&p->cs);
p->data = 1; // OK
LeaveCriticalSection(&p->cs);
EnterCriticalSection(&gCS);
gData = 1; // OK
LeaveCriticalSection(&gCS);
}
void Unsafe(DATA* p) {
EnterCriticalSection(&p->cs);
gData = 1; // Warning C26100 (wrong lock)
LeaveCriticalSection(&p->cs);
}
```
The contract violation occurs because an incorrect lock is used in the function `Unsafe`. In this case, `gCS` is the correct lock to use.
## Example
Occasionally a shared variable only has to be guarded for write access but not for read access. In that case, use the `_Write_guarded_by_` annotation, as shown in the following example.
```
CRITICAL_SECTION gCS;
_Guarded_by_(gCS) int gData;
typedef struct _DATA2 {
_Write_guarded_by_(cs) int data;
CRITICAL_SECTION cs;
} DATA2;
int Safe2(DATA2* p) {
// OK: read does not have to be guarded
int result = p->data;
return result;
}
void Unsafe2(DATA2* p) {
EnterCriticalSection(&gCS);
// Warning C26100 (write has to be guarded by p->cs)
p->data = 1;
LeaveCriticalSection(&gCS);
}
```
This example also generates warning C26100 because it uses an incorrect lock in the function `Unsafe2`. | 26.367816 | 188 | 0.665214 | eng_Latn | 0.871979 |
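To resolve the warning, `Unsafe2` should guard the write with the lock named in the annotation, `p->cs`. The sketch below shows the corrected function; note that the `CRITICAL_SECTION` typedef and the Enter/Leave functions here are portable stand-ins so the sketch compiles outside Windows — in real code you would include `<windows.h>` and keep the SAL annotations:

```c
/* Stand-ins so this sketch builds anywhere; real code uses <windows.h>. */
typedef int CRITICAL_SECTION;
static void EnterCriticalSection(CRITICAL_SECTION* cs) { *cs += 1; }
static void LeaveCriticalSection(CRITICAL_SECTION* cs) { *cs -= 1; }

typedef struct _DATA2 {
    int data;             /* annotated _Write_guarded_by_(cs) in the real code */
    CRITICAL_SECTION cs;
} DATA2;

/* Corrected version of Unsafe2: the write to p->data is guarded by p->cs,
   matching the _Write_guarded_by_(cs) annotation, so C26100 is not raised. */
void Safe2Write(DATA2* p) {
    EnterCriticalSection(&p->cs);
    p->data = 1;
    LeaveCriticalSection(&p->cs);
}
```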
b13840f3d2e7212a0ae121381c3525e66eb022e2 | 1,191 | md | Markdown | data/issues/ZF-4198.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 40 | 2016-06-23T17:52:49.000Z | 2021-03-27T20:02:40.000Z | data/issues/ZF-4198.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 80 | 2016-06-24T13:39:11.000Z | 2019-08-08T06:37:19.000Z | data/issues/ZF-4198.md | zendframework/zf3-web | 5852ab5bfd47285e6b46f9e7b13250629b3e372e | [
"BSD-3-Clause"
] | 52 | 2016-06-24T22:21:49.000Z | 2022-02-24T18:14:03.000Z | ---
layout: issue
title: "exception during zf create project in Zend_Tool"
id: ZF-4198
---
ZF-4198: exception during zf create project in Zend\_Tool
---------------------------------------------------------
Issue Type: Bug Created: 2008-09-05T03:32:46.000+0000 Last Updated: 2009-04-24T11:02:13.000+0000 Status: Resolved Fix version(s): - 1.8.0 (30/Apr/09)
Reporter: Brian Passavanti (gottaloveit) Assignee: Ralph Schindler (ralph) Tags: - Zend\_Tool
Related issues:
Attachments:
### Description
I chose "unknown" in the component as Zend\_Tool isn't listed.
I get the following after running: zf create project
"Exception: RecursiveDirectoryIterator::\_\_construct(./Zend): failed to open dir: No such file or directory"
inside my project folder i have: 'application' 'library' '.zfproject.xml'
no folder public, no index.php
yes, my include path is correct, as I am able to run: zf show version and it comes back fine
### Comments
Posted by old of Satoru Yoshida (yoshida@zend.co.jp) on 2009-02-16T22:19:01.000+0000
Set component
Posted by Ralph Schindler (ralph) on 2009-04-24T11:02:12.000+0000
This has been addressed in trunk by other fixes.
| 23.82 | 150 | 0.698573 | eng_Latn | 0.922554 |
b13870cad65fabde76432d98c0b7b31f5aec18a7 | 3,075 | md | Markdown | sdk-api-src/content/webservices/nf-webservices-wsendreadercanonicalization.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/webservices/nf-webservices-wsendreadercanonicalization.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | sdk-api-src/content/webservices/nf-webservices-wsendreadercanonicalization.md | amorilio/sdk-api | 54ef418912715bd7df39c2561fbc3d1dcef37d7e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:webservices.WsEndReaderCanonicalization
title: WsEndReaderCanonicalization function (webservices.h)
description: This function stops XML canonicalization started by a preceding WsStartReaderCanonicalization function call. Any remaining canonical bytes buffered by the reader will be written to the callback function.
helpviewer_keywords: ["WsEndReaderCanonicalization","WsEndReaderCanonicalization function [Web Services for Windows]","webservices/WsEndReaderCanonicalization","wsw.wsendreadercanonicalization"]
old-location: wsw\wsendreadercanonicalization.htm
tech.root: wsw
ms.assetid: 5cacad47-8581-4713-96cb-3b3a863e6327
ms.date: 12/05/2018
ms.keywords: WsEndReaderCanonicalization, WsEndReaderCanonicalization function [Web Services for Windows], webservices/WsEndReaderCanonicalization, wsw.wsendreadercanonicalization
req.header: webservices.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt: Windows 7 [desktop apps \| UWP apps]
req.target-min-winversvr: Windows Server 2008 R2 [desktop apps \| UWP apps]
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib: WebServices.lib
req.dll: WebServices.dll
req.irql:
targetos: Windows
req.typenames:
req.redist:
ms.custom: 19H1
f1_keywords:
- WsEndReaderCanonicalization
- webservices/WsEndReaderCanonicalization
dev_langs:
- c++
topic_type:
- APIRef
- kbSyntax
api_type:
- DllExport
api_location:
- WebServices.dll
api_name:
- WsEndReaderCanonicalization
---
# WsEndReaderCanonicalization function
## -description
This function stops XML canonicalization started by a preceding <a href="/windows/desktop/api/webservices/nf-webservices-wsstartreadercanonicalization">WsStartReaderCanonicalization</a> function call.
Any remaining canonical bytes buffered by the reader will be written to the callback function.
## -parameters
### -param reader [in]
A pointer to the XML reader on which canonicalization should be stopped.
### -param error [in, optional]
A pointer to a <a href="/windows/desktop/wsw/ws-error">WS_ERROR</a> object where additional information about the error should be stored if the function fails.
## -returns
This function can return one of these values.
<table>
<tr>
<th>Return code</th>
<th>Description</th>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>E_INVALIDARG</b></dt>
</dl>
</td>
<td width="60%">
One or more arguments are invalid.
</td>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>WS_E_INVALID_OPERATION</b></dt>
</dl>
</td>
<td width="60%">
The operation is not allowed due to the current state of the object.
</td>
</tr>
</table>
## -remarks
<b>WsEndReaderCanonicalization</b> must be called at the same depth at which <a href="/windows/desktop/api/webservices/nf-webservices-wsstartreadercanonicalization">WsStartReaderCanonicalization</a> was called.
It is not necessary to call <b>WsEndReaderCanonicalization</b> in order to call <a href="/windows/desktop/api/webservices/nf-webservices-wsfreereader">WsFreeReader</a>. | 29.285714 | 216 | 0.778537 | eng_Latn | 0.672916 |
b13908774ac4f5d0af164d8627418f956b206ff8 | 2,053 | md | Markdown | _posts/16/2021-04-07-brandon-rossel.md | chito365/ukdat | 382c0628a4a8bed0f504f6414496281daf78f2d8 | [
"MIT"
] | null | null | null | _posts/16/2021-04-07-brandon-rossel.md | chito365/ukdat | 382c0628a4a8bed0f504f6414496281daf78f2d8 | [
"MIT"
] | null | null | null | _posts/16/2021-04-07-brandon-rossel.md | chito365/ukdat | 382c0628a4a8bed0f504f6414496281daf78f2d8 | [
"MIT"
] | null | null | null | ---
id: 6978
title: Brandon Rossel
date: 2021-04-07T02:39:02+00:00
author: Laima
layout: post
guid: https://ukdataservers.com/brandon-rossel/
permalink: /04/07/brandon-rossel
tags:
- claims
- lawyer
- doctor
- house
- multi family
- online
- poll
- business
- unspecified
- single
- relationship
- engaged
- married
- complicated
- open relationship
- widowed
- separated
- divorced
- Husband
- Wife
- Boyfriend
- Girlfriend
category: Guides
---
* some text
{: toc}
## Who is Brandon Rossel
Young actor who joined the Disney family after being cast as a series regular for the TV series Fast Layne.
## Prior to Popularity
In 2015, he was featured on a TV show called The Outsiders Club.
## Random data
He has met fellow actors such as Nick Jonas and Zendaya as seen on his Instagram account.
## Family & Everyday Life of Brandon Rossel
He is from Florida.
## People Related With Brandon Rossel
He took a picture with Jenna Ortega at the Disney Emerald Ball in March of 2018.
| 18.330357 | 108 | 0.357038 | eng_Latn | 0.996281 |
b13926dace3792d76d02798d4e1d3769c764dfae | 1,313 | md | Markdown | src/lo/2019-01/03/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 68 | 2016-10-30T23:17:56.000Z | 2022-03-27T11:58:16.000Z | src/lo/2019-01/03/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 367 | 2016-10-21T03:50:22.000Z | 2022-03-28T23:35:25.000Z | src/lo/2019-01/03/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 109 | 2016-08-02T14:32:13.000Z | 2022-03-31T10:18:41.000Z | ---
title: Jesus' Message to the Churches of Smyrna and Pergamum
date: 13/01/2019
---
`In Revelation 2:8-11, Jesus sends a message to the church in the city of Smyrna, telling them who He is. What does the way Jesus introduces Himself in this letter show us about the problems this church faced? What did Jesus warn the church at Smyrna about concerning what would happen in the future?`
The message Jesus sent to the church in Smyrna is for all Christians during the 200 years after John died. The Roman Empire persecuted Christians because of their faith. The 10 days about which Jesus warned the Christians in Revelation 2:10 are symbolic language. In the Bible's time symbolism, 1 day equals 1 year. So the 10 days during which the Roman emperors persecuted Christians are 10 years, counting from the year 303 after the death of Jesus to the year 313. In the year 313, Constantine rose up to fight for the Christians' freedom of belief.
`Read Revelation 2:12-15. How did Jesus introduce Himself to the church in the city of Pergamum? What did He say was this church's shortcoming?`
The message sent to Pergamum is symbolic. It shows what happened to Christians in the years 313 to 538 after the death of Jesus. During that time Christians were free to worship God. Many believers remained faithful to the gospel and to Jesus. But some turned away from the truth, and then the faith of the church and its love for the truth grew weak. | 109.416667 | 487 | 0.699924 | lao_Laoo | 0.99908 |
b13935fd820a11a6037c6f6119ec62bf6efa92fe | 3,777 | md | Markdown | docs/modeling/create-models-for-your-app.md | viniciustavanoferreira/visualstudio-docs.pt-br | 2ec4855214a26a53888d4770ff5d6dde15dbb8a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/modeling/create-models-for-your-app.md | viniciustavanoferreira/visualstudio-docs.pt-br | 2ec4855214a26a53888d4770ff5d6dde15dbb8a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/modeling/create-models-for-your-app.md | viniciustavanoferreira/visualstudio-docs.pt-br | 2ec4855214a26a53888d4770ff5d6dde15dbb8a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Create models for your app
ms.date: 11/04/2016
ms.topic: conceptual
f1_keywords:
- vs.teamarch.common.commentlink.properties
- vs.teamarch.UMLModelExplorer.dependency
- vs.teamarch.UMLModelExplorer.commentlink
- vs.teamarch.common.dependency.properties
- Microsoft.VisualStudio.Uml.Diagrams.CommentShape.IsTransparent
- vs.teamarch.common.comment.properties
- vs.teamarch.UMLModelExplorer.comment
helpviewer_keywords:
- software design
- software modeling
- diagrams - modeling, layer
- software, designing
- software, modeling
author: JoshuaPartlow
ms.author: joshuapa
manager: jillfra
ms.workload:
- multiple
ms.openlocfilehash: b7015583fef2323e3c53b8e786c9e8a529e0fab6
ms.sourcegitcommit: d233ca00ad45e50cf62cca0d0b95dc69f0a87ad6
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 01/01/2020
ms.locfileid: "75590521"
---
# <a name="create-models-for-your-app"></a>Create models for your app
Modeling diagrams help you understand, clarify, and communicate ideas about your code and the user requirements that your software system must support.
To see which versions of Visual Studio support each type of diagram, see [Version support for architecture and modeling tools](../modeling/what-s-new-for-design-in-visual-studio.md#VersionSupport).
To visualize the architecture of a system or of existing code, create the following diagrams:
|**Diagram**|**Shows**|
|-|-|
|[Dependency diagrams: guidelines](../modeling/layer-diagrams-guidelines.md)<br /><br /> [Dependency diagrams: reference](../modeling/layer-diagrams-reference.md)|High-level architecture of the system|
|Code maps<br /><br /> [Map dependencies across your solutions](../modeling/map-dependencies-across-your-solutions.md)<br /><br /> [Find potential problems using code map analyzers](../modeling/find-potential-problems-using-code-map-analyzers.md)|Dependencies and other relationships in existing code|
|Class diagrams generated from code<br /><br /> [Working with class diagrams (Class Designer)](../ide/class-designer/designing-and-viewing-classes-and-types.md)|Types and their relationships in .NET code|
## <a name="related-tasks"></a>Related tasks
|**Topic**|**Task**|
|-|-|
|[Visualize code](../modeling/visualize-code.md)|Create code maps and dependency diagrams to better understand unfamiliar code.|
|[Model user requirements](../modeling/model-user-requirements.md)|Use models to clarify and communicate users' needs.|
|[Model your app's architecture](../modeling/model-your-app-s-architecture.md)|Use models to describe the overall structure and behavior of your system and make sure it meets users' needs.|
|[Validate your system during development](../modeling/validate-your-system-during-development.md)|Make sure your software stays consistent with your users' needs and the overall architecture of your system.|
|[Use models in your development process](../modeling/use-models-in-your-development-process.md)<br /><br /> [Use models in Agile development](https://msdn.microsoft.com/592ac27c-3d3e-454a-9c38-b76658ed137f)|Use models to help you understand and change your system during its development.|
|[Structure your modeling solution](../modeling/structure-your-modeling-solution.md)|Organize models in a medium or large project.|
## <a name="resources"></a>Resources
- [Visual Studio Visualization & Modeling Tools forum](https://social.msdn.microsoft.com/Forums/en-US/home?forum=vsarch)
- [Visual Studio Extensibility forum](https://social.msdn.microsoft.com/Forums/vstudio/home?forum=vsx)
| 62.95 | 321 | 0.791369 | por_Latn | 0.957659 |
b139a287fc7152ecf5b056be9e020ab8cd1e99f9 | 5,998 | md | Markdown | doc/content/develop/python-analysis.md | ChristianWitzler/sensei | f31f5a50177b0924d3077c0d7198754e310b65a5 | [
"BSD-3-Clause-LBNL"
] | 1 | 2020-10-04T06:17:53.000Z | 2020-10-04T06:17:53.000Z | doc/content/develop/python-analysis.md | ChristianWitzler/sensei | f31f5a50177b0924d3077c0d7198754e310b65a5 | [
"BSD-3-Clause-LBNL"
] | 1 | 2020-01-29T00:36:09.000Z | 2020-02-19T22:57:48.000Z | doc/content/develop/python-analysis.md | ChristianWitzler/sensei | f31f5a50177b0924d3077c0d7198754e310b65a5 | [
"BSD-3-Clause-LBNL"
] | 2 | 2021-02-03T23:59:06.000Z | 2021-03-09T00:50:44.000Z | ---
markdown:
gfm: true
breaks: false
---
# PythonAnalysis adaptor
The Python analysis adaptor enables the use of a Python script as an analysis
back end. It accomplishes this by embedding a Python interpreter that includes a
minimal set of the SENSEI Python bindings. To author a new Python analysis, one
must provide a Python script that implements three functions: `Initialize`,
`Execute`, and `Finalize`. These functions implement the `sensei::AnalysisAdaptor`
API. The `Execute` function is required, while the `Initialize` and `Finalize`
functions are optional. The `Execute` function is passed a `sensei::DataAdaptor`
instance from which one has access to simulation data structures. If an error
occurs during processing, one should `raise` an exception. If the analysis
requires MPI communication, one must make use of the adaptor's MPI communicator,
which is stored in the global variable `comm`. Additionally, one can provide a
secondary script that is executed prior to the API functions. This script can
set global variables that control runtime behavior.
End users make use of the `sensei::ConfigurableAnalysis` adaptor and point it to
the Python analysis script. The script can be loaded in one of two ways: via
Python's import machinery, or via a customized mechanism that reads the file on
MPI rank 0 and broadcasts it to the other ranks. The latter is the recommended
approach.
## ConfigurableAnalysis XML
The `<analysis>` element is used to create and configure the `PythonAnalysis` instance.
| name | type | allowable value(s) | description |
|------|------|--------------------|-------------|
| `type` | attribute | "python" | Creates a `PythonAnalysis` instance |
| `script_module` | attribute | a module name | Names a module that is in the PYTHONPATH. The module should define the 3 analysis adaptor API functions: `Initialize`, `Execute`, and `Finalize`. It is imported during initialization using python's import machinery. \* |
| `script_file` | attribute | a file path | A path to a python script to be loaded and broadcast by rank 0. The script should define the 3 analysis adaptor API functions: `Initialize`, `Execute`, and `Finalize`. \*|
| `enabled` | attribute | 0,1 | When 0 the analysis is skipped |
| `initialize_source` | child element | python source code | A snippet of source code that can be used to control run time behavior. The source code must be properly formatted and indented. The contents of the element are taken verbatim including newline tabs and spaces. |
\* -- use one of `script_file` or `script_module`. Prefer `script_file`.
See the [example](#histogramxml) below.
## Code Template
The following template provides stubs that one can fill in to write a new python
analysis.
```python
# YOUR IMPORTS HERE
def Initialize():
""" Initialization code """
# YOUR CODE HERE
return
def Execute(dataAdaptor):
""" Use sensei::DataAdaptor instance passed in
dataAdaptor to access and process simulation data """
# YOUR CODE HERE
return
def Finalize():
""" Finalization code """
# YOUR CODE HERE
return
```
## Example
The following example computes a histogram in parallel. It is included in the source code at `sensei/Histogram.py`.
### Histogram.py
```python
import sys
import numpy as np
import vtk.util.numpy_support as vtknp
from vtk import vtkDataObject, vtkCompositeDataSet, vtkMultiBlockDataSet
# default values of control parameters
numBins = 10
meshName = ''
arrayName = ''
arrayCen = vtkDataObject.POINT
outFile = 'hist'
def Initialize():
# check for valid control parameters
if not meshName:
raise RuntimeError('meshName was not set')
if not arrayName:
raise RuntimeError('arrayName was not set')
def Execute(adaptor):
r = comm.Get_rank()
# get the mesh and array we need
mesh = adaptor.GetMesh(meshName, True)
adaptor.AddArray(mesh, meshName, arrayCen, arrayName)
# force composite data to simplify computations
if not isinstance(mesh, vtkCompositeDataSet):
s = comm.Get_size()
mb = vtkMultiBlockDataSet()
mb.SetNumberOfBlocks(s)
mb.SetBlock(r, mesh)
mesh = mb
# compute the min and max over local blocks
mn = sys.float_info.max
mx = -mn
it = mesh.NewIterator()
while not it.IsDoneWithTraversal():
do = it.GetCurrentDataObject()
atts = do.GetPointData() if arrayCen == vtkDataObject.POINT \
else do.GetCellData()
da = vtknp.vtk_to_numpy(atts.GetArray(arrayName))
mn = min(mn, np.min(da))
mx = max(mx, np.max(da))
it.GoToNextItem()
# compute global min and max
mn = comm.allreduce(mn, op=MPI.MIN)
mx = comm.allreduce(mx, op=MPI.MAX)
    # compute the histogram over local blocks
    hist = None
    it.InitTraversal()
    while not it.IsDoneWithTraversal():
        do = it.GetCurrentDataObject()
        atts = do.GetPointData() if arrayCen == vtkDataObject.POINT \
            else do.GetCellData()
        da = vtknp.vtk_to_numpy(atts.GetArray(arrayName))
        h,be = np.histogram(da, bins=numBins, range=(mn,mx))
        # accumulate across local blocks (hist is local to this call)
        hist = h if hist is None else hist + h
        it.GoToNextItem()
# compute the global histogram on rank 0
h = comm.reduce(hist, root=0, op=MPI.SUM)
# rank 0 write to disk
if r == 0:
ts = adaptor.GetDataTimeStep()
fn = '%s_%s_%d.txt'%(outFile, arrayName, ts)
        f = open(fn, 'w')
f.write('num bins : %d\n'%(numBins))
f.write('range : %0.6g %0.6g\n'%(mn, mx))
f.write('bin edges: ')
for v in be:
f.write('%0.6g '%(v))
f.write('\n')
f.write('counts : ')
for v in h:
f.write('%d '%(v))
f.write('\n')
f.close()
def Finalize():
return
```
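A detail worth noting in `Execute` above: summing the per-block histograms with `MPI.SUM` only yields the correct global result because every block is binned over the same global `(mn, mx)` range. The self-contained NumPy sketch below (made-up data; no MPI or VTK needed) demonstrates that property:

```python
import numpy as np

# Two "blocks" standing in for data held by different ranks (made-up values).
blocks = [np.array([0.1, 0.4, 0.9]), np.array([0.2, 0.5, 0.8, 0.95])]

# Shared global range, like the one produced by the allreduce in Execute.
mn = min(np.min(b) for b in blocks)
mx = max(np.max(b) for b in blocks)

# Per-block histograms over the shared range, then summed (what MPI.SUM does).
partial = [np.histogram(b, bins=4, range=(mn, mx))[0] for b in blocks]
combined = sum(partial)

# Binning all of the data at once over the same range gives identical counts.
direct, _ = np.histogram(np.concatenate(blocks), bins=4, range=(mn, mx))
assert np.array_equal(combined, direct)
```

If each block were instead binned over its own local min/max, the bin edges would differ between blocks and the summed counts would be meaningless.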
### Histogram.xml
```xml
<analysis type="python" script_file="./Histogram.py" enabled="1">
<initialize_source>
numBins=10
meshName='mesh'
arrayName='values'
arrayCen=1
</initialize_source>
</analysis>
```
| 33.322222 | 273 | 0.686395 | eng_Latn | 0.931913 |
b13a33b3925a0c6eab93e6faa560572467860001 | 2,098 | md | Markdown | README.md | jmertic/wg-cobol | db7f19927d53eb21dc9a421fe9c6900af309cff5 | [
"CC-BY-4.0"
] | 5 | 2020-06-25T22:07:08.000Z | 2021-11-23T21:40:47.000Z | README.md | jmertic/wg-cobol | db7f19927d53eb21dc9a421fe9c6900af309cff5 | [
"CC-BY-4.0"
] | 1 | 2020-10-23T16:37:08.000Z | 2020-10-23T16:37:08.000Z | README.md | jmertic/wg-cobol | db7f19927d53eb21dc9a421fe9c6900af309cff5 | [
"CC-BY-4.0"
] | 2 | 2020-07-08T05:24:02.000Z | 2021-07-02T20:45:23.000Z | 
# Open Mainframe Project COBOL Working Group
It is generally felt by those close to the technology and its supporting environment that COBOL has a part to play in a digital future, building upon the success of the past and its vitality in the present. This is true in the mainframe world, where COBOL prevails, and in distributed environments, where it sees greater usage than most realize.
This working group brings together COBOL thought leaders and exponents, and aims to address misunderstandings about the technology and to promote its continued usage, learning, and discourse.
Goals of the WG are:
1. Greatly increase COBOL’s exposure to the public, specifically the IT decision-makers who will directly affect how widely COBOL will be employed in the future;
2. Clearly describe why COBOL should return to the general college curriculum;
3. Gather data and promote COBOL’s importance to the global economy;
4. Provide avenues by which those seeking COBOL skills can acquire them.
5. Establish a group of thought leaders and key persons in this space to act as global proponents and commentators on the language and its enduring value.
Non-goals of the WG are:
1. Develop a vendor or platform specific perspective of COBOL.
2. Develop a preference for one platform over another.
The TAC member sponsor of this working group is Len Santalucia.
## Deliverables
- Near term: an industry-wide roundtable with thought leaders in the COBOL space.
- Curricula for high schools, colleges, and universities for teaching COBOL.
## Communication
This WG communicates on the following channels:
- https://lists.openmainframeproject.org/g/wg-cobol
- https://app.slack.com/client/T1BAJVCTY/C01549LK4F8
- https://developer.ibm.com/technologies/cobol/
## Meetings
This WG meets the fourth Thursday of every month.
Zoom link: https://zoom.us/j/97677598233
## In-person meetings
N/A for the foreseeable future.
## Meeting notes
Meeting recordings and notes are in the [meetings](meetings) folder
| 43.708333 | 342 | 0.796473 | eng_Latn | 0.998505 |
b13a4db06813f2989f9600f104e82765577067cb | 6,644 | md | Markdown | README.md | c-antoine/azure-mobile-apps-node | 368fc168ca3f14802fb13f6b60d45c297a061a74 | [
"MIT"
] | null | null | null | README.md | c-antoine/azure-mobile-apps-node | 368fc168ca3f14802fb13f6b60d45c297a061a74 | [
"MIT"
] | null | null | null | README.md | c-antoine/azure-mobile-apps-node | 368fc168ca3f14802fb13f6b60d45c297a061a74 | [
"MIT"
] | null | null | null |
[](https://travis-ci.org/Azure/azure-mobile-apps-node)
[](https://david-dm.org/Azure/azure-mobile-apps-node)
[](https://david-dm.org/Azure/azure-mobile-apps-node#info=devDependencies)
[](https://gitter.im/Azure/azure-mobile-apps-node?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
# Fork made for my own Node plugin
The goal is to allow modification of the "ID" parameter and to keep state in self-contained memory, avoiding the use of worker threads, which have too high a memory cost.
This plugin is used in my previous Ionic application (private source) for MSSQL (desktop) <-> SQLite (app) data synchronisation.
# Azure Mobile Apps - Node SDK
## Basic Usage
The Azure Mobile Apps Node.js SDK is an [express](http://expressjs.com/) middleware package which makes it easy to create a backend for your mobile application and get it running on Azure.
```js
var app = require('express')(); // Create an instance of an Express app
var mobileApp = require('azure-mobile-apps')(); // Create an instance of a Mobile App with default settings
mobileApp.tables.add('TodoItem'); // Create a table for 'TodoItem' with default settings
app.use(mobileApp);
app.listen(process.env.PORT || 3000);
```
## Installation
`npm install --save azure-mobile-apps`
## Documentation & Resources
- [API Documentation](https://azure.github.io/azure-mobile-apps-node)
- [Samples](https://github.com/Azure/azure-mobile-apps-node/tree/master/samples)
- [Tutorials & How-Tos](https://azure.microsoft.com/en-us/documentation/articles/app-service-mobile-value-prop-preview/)
- [Azure .NET SDK](https://www.visualstudio.com/features/azure-tools-vs)
- [Client & Server Quickstarts](https://github.com/Azure/azure-mobile-services-quickstarts)
- [StackOverflow #azure-mobile-services](http://stackoverflow.com/questions/tagged/azure-mobile-services?sort=newest&pageSize=20)
- [MSDN Forums](https://social.msdn.microsoft.com/forums/azure/en-US/home?forum=azuremobile)
- [Chat on Gitter](https://gitter.im/Azure/azure-mobile-apps-node?utm_source=share-link&utm_medium=link&utm_campaign=share-link)
## Quickstart
1. Create a new directory, initialize git, and initialize npm
```
mkdir quickstart
cd quickstart
git init
npm init --yes
```
2. Install (with npm) the azure-mobile-apps and express packages
`npm install express azure-mobile-apps --save`
3. Create a suitable .gitignore file. You can generate one using the generator
   at [gitignore.io](https://www.gitignore.io)
4. Create a server.js file and add the following code to the file (or use the code from one of our samples):
```js
var app = require('express')(); // Create an instance of an Express app
var mobileApp = require('azure-mobile-apps')(); // Create an instance of a Mobile App with default settings
mobileApp.tables.add('TodoItem'); // Create a table for 'TodoItem' with default settings
app.use(mobileApp);
app.listen(process.env.PORT || 3000);
```
5. Run your project locally with `node server.js`
6. Publish your project to an existing Azure Mobile App by adding it as a remote and pushing your changes.
```
git remote add azure https://{user}@{sitename}.scm.azurewebsites.net:443/{sitename}.git
git add package.json server.js
git commit -m 'Quickstart created'
git push azure master
```
To test steps 4-5, you can use any of the clients found in the [Client & Server Quickstarts](https://github.com/Azure/azure-mobile-services-quickstarts).
## Running Tests
To run the suite of unit and integration tests, execute the following commands in a console window.
git clone https://github.com/Azure/azure-mobile-apps-node.git
cd azure-mobile-apps-node
npm i
npm test
This runs tests using the default embedded SQLite data provider. To execute tests
against SQL Server, create a configuration file called `azureMobile.js` in the
`test` directory that contains relevant data configuration. See the
[API reference](http://azure.github.io/azure-mobile-apps-node/global.html#configuration)
for more information.
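For reference, a minimal `test/azureMobile.js` might look like the sketch below. The shape of the object (`data.provider` and the connection fields) follows the SDK's configuration reference; the server, database, and credential values are placeholders you must replace:

```javascript
// test/azureMobile.js - local-only configuration (placeholder values)
module.exports = {
    data: {
        provider: 'mssql',            // use SQL Server instead of embedded SQLite
        server: 'localhost',          // placeholder - your SQL Server host
        database: 'azuremobiletest',  // placeholder database name
        user: 'sa',                   // placeholder credentials
        password: 'your-password-here'
    }
};
```

This file is intended for local settings only and should not be committed with real credentials.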
## GitHub Organization
Our GitHub repository has one branch with code in it - master. Each version is
tagged when we release a new version. We use three suffixes for releases. An
alpha release indicates that the API may be unstable between releases and the
library may not pass the end to end tests yet. You should not use an alpha
release in production or testing; we publish alpha releases to provide an early
look at the library. A beta release has all the functionality we expect in the
final release and should be API stable, so it can be used for development, but
it may not pass end to end tests yet. A GA release passes all end to end tests
and is recommended for production code.
We use [GitHub Issues](https://github.com/Azure/azure-mobile-apps-node/issues) to track all work
with this library. We use Milestones to track the work going into a particular release.
## Future of Azure Mobile Apps
Microsoft is committed to fully supporting Azure Mobile Apps, including **support for the latest OS release, bug fixes, documentation improvements, and community PR reviews**. Please note that the product team is **not currently investing in any new feature work** for Azure Mobile Apps. We highly appreciate community contributions to all areas of Azure Mobile Apps.
## Contributing
This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.
For information on how to contribute to this project, please see the [contributor guide](https://github.com/Azure/azure-mobile-apps-node/blob/master/contributor.md).
## Contact Us
We can be contacted via a variety of methods. The most effective are on Twitter (via @AzureMobile) and the [MSDN Forums](https://social.msdn.microsoft.com/forums/azure/en-US/home?forum=azuremobile) If you need to reference a GitHub Issue, ensure you cut-and-paste the URL of the issue into the message. You can also reach us on [Gitter](https://gitter.im/Azure/azure-mobile-apps-node?utm_source=share-link&utm_medium=link&utm_campaign=share-link).
## License
[MIT](./LICENSE)
| 50.717557 | 450 | 0.763697 | eng_Latn | 0.888554 |