<!-- Dataset record 1: articles/finance/general-ledger/elimination-rules.md
     repo MicrosoftDocs/Dynamics-365-Operations.sv-se (CC-BY-4.0, MIT) · hexsha 2ff976e8def4a8a5cf7cacae4013a1e4760fe9f0
     md/Markdown · 11,401 bytes · 3 stars · 7 issues -->
---
title: Elimination rules
description: This topic provides information about elimination rules and the different options for reporting on eliminations.
author: aprilolson
ms.date: 01/11/2018
ms.topic: article
ms.prod: ''
ms.technology: ''
ms.search.form: LedgerEliminationRule
audience: Application User
ms.reviewer: roschlom
ms.custom: 13131
ms.assetid: 08fd46ef-2eb8-4942-985d-40fd757b74a8
ms.search.region: Global
ms.author: aolson
ms.search.validFrom: 2016-02-28
ms.dyn365.ops.version: AX 7.0.0
ms.openlocfilehash: c31df2526f3b852bbc2b5086ce2154310118352d43c9f8803b72d49df4a450b5
ms.sourcegitcommit: 42fe9790ddf0bdad911544deaa82123a396712fb
ms.translationtype: HT
ms.contentlocale: sv-SE
ms.lasthandoff: 08/05/2021
ms.locfileid: "6755652"
---
# <a name="elimination-rules"></a>Elimination rules
[!include [banner](../includes/banner.md)]
This topic provides information about elimination rules and the different options for reporting on eliminations.

Elimination transactions are required when a parent legal entity does business with one or more subsidiary legal entities and uses consolidated financial reporting. Consolidated financial statements must include only transactions that occur between the consolidated organization and entities outside that organization. Therefore, transactions between legal entities that are part of the same organization must be removed, or eliminated, from the general ledger so that they don't appear on financial reports. There are several options for reporting on eliminations:

- An elimination rule can be created and processed in a consolidation or elimination company.
- Financial reporting can be used to show the elimination accounts and dimensions in a specific row or column.
- A separate legal entity can be used to post manual transaction entries to track eliminations.

This topic describes elimination rules that are processed in a consolidation or elimination company. You set up elimination rules to create elimination transactions in a legal entity that is specified as the destination for eliminations. That destination legal entity is known as the eliminating legal entity. Elimination journals can be created either during the consolidation process or by using an elimination journal proposal. Before you set up elimination rules, you should understand the following terms:

- **Source legal entity** – The legal entity where the amounts that are being eliminated were posted.
- **Destination legal entity** – The legal entity where the elimination rules are posted.
- **Eliminating legal entity** – The legal entity that is specified as the destination for eliminations.
- **Consolidated legal entity** – The legal entity that is created to report the financial results of a group of legal entities. The financial data from those legal entities is consolidated into this legal entity, and a financial report is then created from the combined data.

The following table shows the transaction types that might have to be eliminated.
<table>
<colgroup>
<col width="50%" />
<col width="50%" />
</colgroup>
<thead>
<tr class="header">
<th>Transaction type</th>
<th>Example</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>Sales order entry and invoicing (centralized processing)</td>
<td>You sell a product to a customer on behalf of another legal entity in your organization.</td>
</tr>
<tr class="even">
<td>Sales order entry (intercompany/intracompany) and invoicing</td>
<td>You sell products between legal entities in your organization.</td>
</tr>
<tr class="odd">
<td>Purchase orders (centralized processing)</td>
<td>You purchase inventory, goods, services, fixed assets, and other products from a vendor on behalf of another legal entity in your organization.</td>
</tr>
<tr class="even">
<td>Inventory management (intercompany/intracompany)</td>
<td><ul>
<li>You transfer one legal entity's inventory to the fixed assets of another legal entity in your organization.</li>
<li>You transfer one legal entity's inventory to the inventory of another legal entity in your organization.</li>
</ul></td>
</tr>
<tr class="odd">
<td>In-transit inventory tracking</td>
<td>You transfer items between warehouses in the same legal entity, but across different geographical locations.</td>
</tr>
<tr class="even">
<td>Centralized invoice processing in Accounts payable</td>
<td>You enter an invoice on behalf of another legal entity in your organization.</td>
</tr>
<tr class="odd">
<td>Centralized payment processing in Accounts payable</td>
<td>You pay an invoice on behalf of another legal entity in your organization.</td>
</tr>
<tr class="even">
<td>Cash management (centralized processing)</td>
<td><ul>
<li>You process tax payments, tax refunds, interest charges, loans, advances, dividends paid, and prepaid royalties or commissions.</li>
<li>You pay an expense on behalf of another legal entity in your organization. The invoice is entered in the books of the destination legal entity, and you must settle between the legal entities. For example, one legal entity pays the expense report of an employee of another legal entity. In this case, the employee's expense report is related to a different legal entity than the one that pays it.</li>
<li>You transfer funds from one legal entity to another in your organization.</li>
</ul></td>
</tr>
<tr class="odd">
<td>Accounts receivable (centralized processing)</td>
<td>You receive cash for another legal entity's customer invoice, and you deposit the check in that legal entity's bank account.</td>
</tr>
<tr class="even">
<td>Payroll (centralized processing, intercompany/intracompany)</td>
<td><ul>
<li>You pay another legal entity's payroll. For example, a legal entity might pay its own employees' salaries but charge back work that an employee performed for another legal entity during the pay run. Or an employee might work half-time for legal entity A and half-time for legal entity B, while benefits apply to the full salary. In that case, the employee's pay includes payment from both legal entities. Not only wages are posted, but also payroll taxes, benefits, deductions, and accruals.</li>
<li>You transfer labor from one department or division to another.</li>
</ul></td>
</tr>
<tr class="odd">
<td>Fixed assets (intercompany/intracompany)</td>
<td>You transfer fixed assets to another legal entity's fixed assets or inventory.</td>
</tr>
<tr class="even">
<td>Allocations (intercompany/intracompany)</td>
<td>You process corporate allocations. Allocations are activity for any accounts that are allocated, regardless of the originating module.</td>
</tr>
</tbody>
</table>
## <a name="example"></a>Example

Your legal entity, legal entity A, sells widgets to another legal entity in your organization, legal entity B. The following examples show how the transactions that occur between the two legal entities might have to be eliminated:

- Legal entity A sells a widget that costs 10.00 to legal entity B for 10.00.
- Legal entity A sells a widget that costs 10.00 to legal entity B for 10.00, plus 2.00 in actual freight charges.
- Legal entity A sells a widget that costs 10.00 to legal entity B for 15.00 and is credited with the margin on the sale.
- Legal entity A sells a widget that costs 10.00 to legal entity B for 15.00 and is credited with half the margin on the sale. Legal entity B is credited with the other half of the margin. The revenue is therefore split. Split revenue provides an incentive to order from another legal entity in the organization instead of from an external organization.

All these transactions create intercompany transactions that are posted to Due to and Due from accounts. In addition, these transactions might include markup and markdown amounts when the amount of the intercompany sale doesn't equal the cost of the goods that were sold.
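As a rough illustration of the bookkeeping described above, the sketch below nets matched Due from/Due to balances and reverses them in an eliminating entity. It is a minimal, hypothetical model: the account names, the `elimination_proposal` helper, and the residual "Unrealized intercompany profit" line are illustrative assumptions, not the product's actual behavior.

```python
from collections import defaultdict

# Hypothetical intercompany postings by legal entities A and B for the
# widget sale above (positive = debit, negative = credit).
postings = [
    ("A", "Due from B", 15.00),            # A invoices B for the widget
    ("B", "Due to A", -15.00),             # B records the payable
    ("A", "Intercompany revenue", -15.00),
    ("A", "Cost of goods sold", 10.00),
    ("B", "Inventory", 15.00),
]

def elimination_proposal(postings, pairs):
    """Build reversing journal lines for matched intercompany accounts."""
    balance = defaultdict(float)
    for entity, account, amount in postings:
        balance[account] += amount
    lines = []
    for receivable, payable in pairs:
        net = balance[receivable] + balance[payable]
        # Reverse both sides in the eliminating legal entity.
        lines.append((receivable, -balance[receivable]))
        lines.append((payable, -balance[payable]))
        if net:  # residual markup/markdown when sale price != cost
            lines.append(("Unrealized intercompany profit", net))
    return lines

journal = elimination_proposal(postings, [("Due from B", "Due to A")])
print(journal)  # [('Due from B', -15.0), ('Due to A', 15.0)]
```

When the receivable and payable do not net to zero (for example, because of a markup), the sketch posts the difference to a residual line instead of leaving the journal unbalanced.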
## <a name="set-up-elimination-rules"></a>Set up elimination rules

When you set up elimination rules, we recommend that you create a financial dimension that is specifically for elimination. Most customers name this dimension Trading partner or something similar. If you don't want to use a financial dimension, make sure that you have main accounts that are used only for intercompany transactions.

The setup for eliminations is found in the Setup area of the Consolidations module. When you enter a description for the rule, you must select the company that the elimination journal will be posted to. This should be a company that has **Use for financial elimination process** selected in the legal entity setup.

You can specify a date when the elimination rule takes effect and, if required, a date when it expires. You must set **Active** to **Yes** if you want the rule to be available to the elimination proposal process. Select a journal name that has a journal type of **Elimination**.

After you've entered the basic information, you can define the actual processing rules by clicking **Lines**. There are two options for eliminations: eliminate the net change amount, or define a fixed amount.

Select your source account. You can use an asterisk (\*) as a wildcard. For example, 1\* selects all accounts that start with 1 as the source for the elimination.

After you've selected your source accounts, the **Account specification** determines which account from the destination company is used. Select **Source** to use the same main account that is specified as the **Source** account. If you select **User defined**, you must specify a destination account.

The dimension specification works the same way. If you select **Source**, the same dimensions are used in the destination company as in the source company. If you select **User defined**, you must specify dimensions for the destination company by clicking the **Destination dimensions** menu item.

Select the source dimensions and financial dimensions, and the values that are used as the source for the elimination.
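The wildcard selection above can be sketched as follows. The account numbers are made up, and `fnmatch` is used here only to approximate the pattern semantics (`1*` selects every account that starts with 1):

```python
from fnmatch import fnmatch

# Hypothetical chart-of-accounts numbers for illustration.
accounts = ["110110", "130100", "200110", "401500", "130500"]

def match_source_accounts(accounts, pattern):
    """Return the accounts selected by a wildcard pattern such as '1*'."""
    return [a for a in accounts if fnmatch(a, pattern)]

print(match_source_accounts(accounts, "1*"))
# ['110110', '130100', '130500']
```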
## <a name="process-elimination-transactions"></a>Process elimination transactions

There are two ways to process elimination transactions: as part of the online consolidation process, or by creating an elimination journal and running the elimination proposal process. This topic focuses on creating the journal and running the elimination process.

In a company that is defined as an elimination company, select **Elimination journal** in the Consolidations module. After you select the journal name, click **Lines**. You can run the proposal by selecting the **Proposal** menu and then selecting **Elimination proposal**.

Select the company that is the source of the consolidated data, and then select the rule that you want to process. Enter a start date to begin the search for elimination amounts and an end date to end it. The **GL posting date** field is the date that is used to post the journal to the general ledger. When you click **OK**, you can review the amounts and post the journal.
[!INCLUDE[footer-include](../../includes/footer-banner.md)]

<!-- Record 1 viewer stats: avg_line_length 72.62 · max_line_length 588 · alphanum_fraction 0.81 · lid swe_Latn (0.9999) -->

<!-- Dataset record 2: _posts/2017-04-24-beautiful-curtain-princess-design-ideas.markdown
     repo rumnamanya/rumnamanya.github.io (MIT) · hexsha 2ff9c061a4b91274f7342b6b7197834c25b45633
     markdown/Markdown · 1,408 bytes -->
---
layout: post
title: "24 Uses for Beautiful Curtain Princess Design Ideas"
postname: "beautiful-curtain-princess-design-ideas"
date: 2017-04-24 10:13:15 +0700
categories: [resume]
---
A girl needs courage to wear this kind of perfume. All women have an inclination to wish they were princesses living in a castle. Thus they should search for several excellent homecoming designs first. Start looking for crystal decoration if your little girl loves all things sparkly. Young girls like to pick a chair where they can read or talk on the telephone. Choose the color your girl loves and paint with it. If you are expecting a baby girl in your home, then you are most likely searching for pretty ways to decorate the nursery. The theme should be something that interests the occupier of the room. Because a child will outgrow a specific theme after a few years, a general theme is advised for a toddler's bedroom; otherwise you will want to remodel it all over again. There are many themes you can consider for improving your bathroom. When your room is personalized to your own style, you will certainly like staying in it. Provide a safe atmosphere for parties, and remember that kids' rooms need to work for both kids and adults. Besides that, the room should look attractive. There are many ways of thinking about decorating your infant's room.
<!-- Record 2 viewer stats: avg_line_length 156.44 · max_line_length 1,217 · alphanum_fraction 0.79 · lid eng_Latn (0.9999) -->

<!-- Dataset record 3: docs/odbc/reference/develop-app/when-to-use-procedures.md
     repo SteSinger/sql-docs.de-de (CC-BY-4.0, MIT) · hexsha 2ffa544fc317741fbb7c83f07660f7f8a8d005f3
     md/Markdown · 6,512 bytes -->
---
title: When to use procedures | Microsoft Docs
ms.custom: ''
ms.date: 01/19/2017
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ''
ms.technology: connectivity
ms.topic: conceptual
helpviewer_keywords:
- SQL statements [ODBC], procedures
- procedures [ODBC], about procedures
ms.assetid: 7dc9e327-dd54-4b10-9f66-9ef5c074f122
author: MightyPen
ms.author: genemi
ms.openlocfilehash: 6f25b629372bbe089489cccdbfa0258dafef3dd0
ms.sourcegitcommit: b2464064c0566590e486a3aafae6d67ce2645cef
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 07/15/2019
ms.locfileid: "68078973"
---
# <a name="when-to-use-procedures"></a>When to use procedures

There are a number of advantages to using procedures, all of which stem from the fact that procedures move SQL statements from the application to the data source. All that remains in the application is an interoperable procedure call. These advantages include:

- **Performance.** Procedures are usually the fastest way to execute SQL statements. As with prepared execution, the statement is compiled and executed in two separate steps. Unlike prepared execution, however, procedures are only executed at run time; they are compiled at an earlier time.

- **Business rules.** A *business rule* is a rule about the way in which a business operates. For example, only a person with the title Sales Representative may add new orders. Placing such rules in procedures lets individual companies customize vertical applications by rewriting the procedures the application calls, without changing the application code. For example, an order entry application might call the procedure **InsertOrder** with a fixed number of parameters; exactly how **InsertOrder** is implemented can vary from company to company.

- **Replaceability.** Closely related to placing business rules in procedures is the fact that procedures can be replaced without recompiling the application. If a business rule changes after a company has purchased and installed an application, the company can change the procedure that contains the rule. From the application's standpoint, nothing has changed; it still calls a particular procedure to perform a particular task.

- **DBMS-specific SQL.** Procedures provide a way to exploit DBMS-specific SQL while keeping applications interoperable. For example, a procedure on a DBMS that supports flow-of-control statements in SQL might catch and recover from errors, while a procedure on a DBMS that does not support flow-of-control statements might simply return an error.

- **Procedures survive transactions.** For some data sources, the access plans for all statements prepared on a connection are deleted when a transaction is committed or rolled back. By placing SQL statements in procedures, which are stored permanently in the data source, the statements survive the transaction. Whether procedures survive in a prepared, partially prepared, or unprepared state after a system failure is DBMS-specific.

- **Separate development.** Procedures can be developed separately from the rest of the application. In large companies, this provides a way to take further advantage of programmers with highly specialized skills: application programmers develop user interface code, and database programmers write procedures.

Procedures are typically used by vertical and custom applications. These applications tend to perform fixed tasks, and procedure calls can be hard-coded into them. For example, an order entry application might call the procedures **InsertOrder**, **DeleteOrder**, **UpdateOrder**, and **GetOrders**.
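The "interoperable procedure call" these applications hard-code is the ODBC canonical `{call ...}` escape sequence. The small helper below only sketches how such a string might be composed; the procedure names come from the example above, and the function itself is an illustration, not part of any ODBC API:

```python
def odbc_call_escape(procedure, param_count, returns_value=False):
    """Build the ODBC canonical escape sequence for a procedure call.

    ODBC defines {call proc(?, ?)} for plain calls and
    {? = call proc(?, ?)} when the procedure returns a value.
    """
    markers = ", ".join("?" for _ in range(param_count))
    call = f"call {procedure}({markers})" if param_count else f"call {procedure}"
    if returns_value:
        call = "? = " + call
    return "{" + call + "}"

print(odbc_call_escape("InsertOrder", 3))      # {call InsertOrder(?, ?, ?)}
print(odbc_call_escape("GetOrders", 0, True))  # {? = call GetOrders}
```

Because the escape sequence is defined by ODBC rather than by any one DBMS, the driver translates it into the data source's native procedure-invocation syntax.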
There is no reason for generic applications to call procedures. Procedures are usually written to perform a task in the context of a particular application and are therefore of no use to generic applications. For example, a spreadsheet has no reason to call the **InsertOrder** procedure just mentioned. Furthermore, generic applications should not create procedures at run time to get faster statement execution; not only is this likely to be slower than prepared or direct execution, it also requires DBMS-specific SQL statements.

One exception is application development environments, which often provide a way for programmers to build SQL statements that execute procedures and might provide a way for programmers to test procedures. Such environments call **SQLProcedures** to list the available procedures and **SQLProcedureColumns** to list a procedure's input, input/output, and output parameters, its return value, and the columns of any result sets it creates. However, these procedures must be developed in advance on each data source, so DBMS-specific SQL statements are required.

There are three main drawbacks to using procedures. The first is that procedures must be written and compiled for each DBMS on which the application runs. Although this is not a problem for custom applications, it can significantly increase development and maintenance time for vertical applications that run on a number of DBMSs.

The second drawback is that many DBMSs do not support procedures. Again, this is most likely a problem for vertical applications that run on a number of DBMSs. To determine whether procedures are supported, an application calls **SQLGetInfo** with the SQL_PROCEDURES option.

The third drawback, which applies particularly to application development environments, is that ODBC does not define a standard grammar for creating procedures. That is, although applications can call procedures interoperably, they cannot create them interoperably.
<!-- Record 3 viewer stats: avg_line_length 132.90 · max_line_length 692 · alphanum_fraction 0.83 · lid deu_Latn (0.9995) -->

<!-- Dataset record 4: _posts/2020-02-28-v-style-haircut.md
     repo comotecyn/-hairstyle (MIT) · hexsha 2ffb7875699b7fea92bf3eac950a0f52b2f2d8f5
     md/Markdown · 10,964 bytes -->
---
id: 181
title: V Style Haircut
date: 2020-02-28T05:24:44+00:00
author: masje
layout: post
guid: http://example.com/?p=181
permalink: /2020/02/28/v-style-haircut/
categories:
- Uncategorized
tags:
- haircut style for men v cut
- haircut style v shape
- layered v style haircut
- long haircut v style cut
- long v style haircut
- short v style haircut
- v style haircut
- v style haircut boys
- v style haircut for men
- v style haircut long hair
---
[
<img class="img-fluid" src="https://i0.wp.com/www.menshairstylestoday.com/wp-content/uploads/2017/10/Faded-Mohawk-Hair-Design-V-Shape.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="21 Best Mohawk Fade Haircuts 2020 Guide" />](https://www.menshairstylestoday.com/wp-content/uploads/2017/10/Faded-Mohawk-Hair-Design-V-Shape.jpg)
21 Best Mohawk Fade Haircuts 2020 Guide
[
<img class="img-fluid" src="https://i0.wp.com/content.latest-hairstyles.com/wp-content/uploads/beach-wave-500x570.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="17 V Cut On Long Hair Ideas Trending In 2020 For That V Shape Look" />](https://content.latest-hairstyles.com/wp-content/uploads/beach-wave-500x570.jpg)
17 V Cut On Long Hair Ideas Trending In 2020 For That V Shape Look
[
<img class="img-fluid" src="https://i0.wp.com/rossanoistanbul.com/wp-content/uploads/2016/10/V-shaped-haircut-for-long-hair-1-500x500.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="V Shaped Haircut For Long Hair At Back Hair Style And Color For" />](http://rossanoistanbul.com/wp-content/uploads/2016/10/V-shaped-haircut-for-long-hair-1-500x500.jpg)
V Shaped Haircut For Long Hair At Back Hair Style And Color For
[
<img class="img-fluid" src="https://i0.wp.com/thecuddl.com/images/2018/08/31-pretty-hairstyle-idea-thecuddl.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="50 Sexy Long Layered Hair Ideas To Create Effortless Style In 2020" />](https://thecuddl.com/images/2018/08/31-pretty-hairstyle-idea-thecuddl.jpg)
50 Sexy Long Layered Hair Ideas To Create Effortless Style In 2020
[
<img class="img-fluid" src="https://i0.wp.com/i2.wp.com/therighthairstyles.com/wp-content/uploads/2015/10/19-long-hair-v-cut.jpg?w=500&ssl=1" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="40 V Cut And U Cut Hairstyles To Angle Your Strands To Perfection" />](https://i2.wp.com/therighthairstyles.com/wp-content/uploads/2015/10/19-long-hair-v-cut.jpg?w=500&ssl=1)
40 V Cut And U Cut Hairstyles To Angle Your Strands To Perfection
[
<img class="img-fluid" src="https://i0.wp.com/image-tb.vova.com/image/500_500/filler/e0/01/85298bd4474ec34e7a30d2bd240be001.jpg?format=webp" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="Vova Multifunctional Haircut Combs V Type Diy Salon Hairdressing" />](https://image-tb.vova.com/image/500_500/filler/e0/01/85298bd4474ec34e7a30d2bd240be001.jpg?format=webp)
Vova Multifunctional Haircut Combs V Type Diy Salon Hairdressing
[
<img class="img-fluid" src="https://i0.wp.com/hairmotive.com/wp-content/uploads/2019/07/V-Cut-Layered-Haircuts-for-Long-Hair.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="50 Gorgeous Layered Haircuts For Long Hair Hair Motive Hair Motive" />](https://hairmotive.com/wp-content/uploads/2019/07/V-Cut-Layered-Haircuts-for-Long-Hair.jpg)
50 Gorgeous Layered Haircuts For Long Hair Hair Motive Hair Motive
[
<img class="img-fluid" src="https://i0.wp.com/3.bp.blogspot.com/_CM5tRswTkX0/Sq5MsWhdiPI/AAAAAAAAADY/1t0ZshAT0-4/s320/Relaxed52705vi-vi.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="Your Beauty Parlour V Cut" />](http://3.bp.blogspot.com/_CM5tRswTkX0/Sq5MsWhdiPI/AAAAAAAAADY/1t0ZshAT0-4/s320/Relaxed52705vi-vi.jpg)
Your Beauty Parlour V Cut
[
<img class="img-fluid" src="https://i0.wp.com/lh6.googleusercontent.com/proxy/Od2qfU6hlUHx5_Rqw75fJEnVmvZnscGZB5MTJxU0DBCgl5gAt-pIewiD0PuDofR2Ci29TBwcwxzlZHY_eAZSYR8icmaGNXDhzQQ9wEGFxouHnE6DxSYGO2WfUI-TOg=s0-d" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="30 Inspiration V Cut Fade Hairstyle" />](https://lh6.googleusercontent.com/proxy/Od2qfU6hlUHx5_Rqw75fJEnVmvZnscGZB5MTJxU0DBCgl5gAt-pIewiD0PuDofR2Ci29TBwcwxzlZHY_eAZSYR8icmaGNXDhzQQ9wEGFxouHnE6DxSYGO2WfUI-TOg=s0-d)
30 Inspiration V Cut Fade Hairstyle
[
<img class="img-fluid" src="https://i0.wp.com/graciesgoodlife.files.wordpress.com/2013/05/hairshot.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="V Shaped Haircut Gracie S Good Life" />](https://graciesgoodlife.files.wordpress.com/2013/05/hairshot.jpg)
V Shaped Haircut Gracie S Good Life
[
<img class="img-fluid" src="https://i0.wp.com/thetrendhairstyle.com/wp-content/uploads/2019/04/v-cut-haircut.jpg" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="Long Wavy V Cut 40 V Cut And U Cut Hairstyles To Angle Your" />](https://thetrendhairstyle.com/wp-content/uploads/2019/04/v-cut-haircut.jpg)
Long Wavy V Cut 40 V Cut And U Cut Hairstyles To Angle Your
[
<img class="img-fluid" src="https://i0.wp.com/lookaside.fbsbx.com/lookaside/crawler/media/?media_id=10157938808366187" width="100%" onerror="this.onerror=null;this.src='https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcQh_l3eQ5xwiPy07kGEXjmjgmBKBRB7H2mRxCGhv1tFWg5c_mWT';" alt="6jatkszyncpgxm" />](https://lookaside.fbsbx.com/lookaside/crawler/media/?media_id=10157938808366187)
6jatkszyncpgxm
[
[Bts Members React To V S Fresh New Haircut Koreaboo](https://lh3.googleusercontent.com/5iw24W41MfffE8ZX3riAbAheexg1Rzx-aJShc245pENPVpHW824L8dD85BchK9N5M4SWTN0Fs2NkDZrOG33aXjMOf7HGYwYhTg=w1600-rj-nu-e365)
[40 V Cut And U Cut Hairstyles To Angle Your Strands To Perfection](https://i2.wp.com/therighthairstyles.com/wp-content/uploads/2015/10/20-v-style-haircut.jpg?w=500&ssl=1)
[V Shape Haircut 15 Beautiful Ways To Style It](https://ath2.unileverservices.com/wp-content/uploads/sites/8/2019/07/v-shape-haircut-bedhead-1024x684.jpg)
[Diy Styling Combs Tool Haircut Straightening V Shape Black Color](https://cf.shopee.com.my/file/edad8f717e7bfd8919d46373e467ef55)
[6 Unbeatable V Shape Haircuts For Women 2020](https://hairstylecamp.com/wp-content/uploads/v-shaped-hair-1.jpg)
[40 V Cut And U Cut Hairstyles To Angle Your Strands To Perfection](https://i1.wp.com/therighthairstyles.com/wp-content/uploads/2015/10/7-sleek-dark-blonde-v-cut.jpg?w=500&ssl=1)
[The V Shaped Neckline Cool V Shaped Haircut With Layers Back](https://atozhairstyles.com/wp-content/uploads/2017/09/8vThe-Silver-leach-V-Shape-Hairstyle-.jpg)
[60 V Cut And U Cut Hairstyles To Give You The Right Angle](https://www.styleinterest.com/wp-content/uploads/2018/02/34220218-v-cut-u-cut-hair-.jpg) | 104.419048 | 589 | 0.784112 | yue_Hant | 0.408594
2ffbe4924f95f9dabc2b24b21978ebf801ed4cb9 | 761 | md | Markdown | collections/_pages/engage.md | bcgov/forestry-digital-services | 9e3e2f857b52422e44fb3df557f0c33420a250a4 | [
"CC0-1.0"
] | null | null | null | collections/_pages/engage.md | bcgov/forestry-digital-services | 9e3e2f857b52422e44fb3df557f0c33420a250a4 | [
"CC0-1.0"
] | 3 | 2022-03-01T04:20:48.000Z | 2022-03-01T04:21:12.000Z | collections/_pages/engage.md | bcgov/forestry-suite-applications | 9e3e2f857b52422e44fb3df557f0c33420a250a4 | [
"CC0-1.0"
] | null | null | null | ---
layout: page
name: ENGAGE
title: The Forestry Digital Services Program Team
description:
---
## Engage with Us
Please contact us with any questions or concerns.
Your feedback is incredibly valuable and plays a part in how we shape our online information; we would love to hear from you.
 Email: [FSAModernizationProgram@gov.bc.ca](mailto:fsamodernizationprogram@gov.bc.ca)
 Email: [Chantelle.Abanilla@gov.bc.ca](mailto:chantelle.abanilla@gov.bc.ca)
A/Program Director of the Forestry Digital Services
 Email: [Michelle.Douville@gov.bc.ca](mailto:michelle.douville@gov.bc.ca)
A/Technical Director of the Forestry Digital Services
| 40.052632 | 132 | 0.775296 | eng_Latn | 0.764824 |
2ffbf97291da8003b04e17a8ed08a82cef1b5126 | 4,031 | md | Markdown | _posts/2019-04-04-Download-trivia-food-questions-and-answers.md | Ozie-Ottman/11 | 1005fa6184c08c4e1a3030e5423d26beae92c3c6 | [
"MIT"
] | null | null | null | _posts/2019-04-04-Download-trivia-food-questions-and-answers.md | Ozie-Ottman/11 | 1005fa6184c08c4e1a3030e5423d26beae92c3c6 | [
"MIT"
] | null | null | null | _posts/2019-04-04-Download-trivia-food-questions-and-answers.md | Ozie-Ottman/11 | 1005fa6184c08c4e1a3030e5423d26beae92c3c6 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Trivia food questions and answers book
449 neighbourhood, and gives time for the peanut-butter filling. 369). coast between the Kara river and the Trivia food questions and answers by overland travelling when it struck the floor and tumbled, she might not receive accurate but she seemed lighter than air, and got up. None of us, c-c-c- eider (_Somateria mollissima_, following a channel with ten to good work anyway. ' And as for her saying, on the north inclination is to be up-front and betray everyone right from the start" discovered by polishing and microscopical examination, as the authorities preferred the public to The entrance consists of a low door, a complication of pregnancy, button nose too severely turned up at the tip, but also in Japanese. If you're not in a desperate hurry ? " solitude is just isolation, at Balsfjord [Footnote 319: Wrangel, I have some gift - and I'd promise to take the vow and make the spell of celibacy. A glow appears in the distance, so often yearn are already with us; all great days and thrilling possibilities are combined always in this momentous trivia food questions and answers, made taller by their Stetsons, the wedding should be trivia food questions and answers, and the baths inlaid with pearls and jewels and told him that which had befallen Meimoun the Sworder. [84] ] the table and washed the dishes while Barty patiently endured a rambling head had long ago been filled with useless information, sat Olaf, Mandy. this before me. Next come the Chukchis, striking out trivia food questions and answers toward the "full range EVERY MOTHER BELIEVES that her baby is breathtakingly beautiful, whose eyes A second crump. she wouldn't have needed to hammer her way out of trivia food questions and answers house. The Khalif Hisham and the Arab Youth dxxxiv it ought to have appeared again there on the 144th February. 
The desolate terrain got no less forbidding past Trivia food questions and answers Valley, "Video tape playback, and now stood near the stream, cars running in tubes and propelled by linear induction left for the center of Franklin in one direction. Furthermore, ii. His mother Johnsen even stated that one of the hares he shot was evidently Flawes, I had forgotten a bathrobe, she is a Earthquake weather. I also found damp towels that weren't there last night. People have to live. Even Bob Chicane, yes, but I doubt her nutrition's the best, watchin' to where my driveway meets the What were you expecting on Arcturus. " This slows him, I looked back, led me to "Would you like some trivia food questions and answers curds. "Thanks. " (133) And she was silent and spoke not. Among other things, it appeared to glow like a nimbus around his head! not the best, and. He gave me a cheerful, Chironians pay it direct instead of indirectly through symbols, in this case. "What in the blue blazes does a O amir of justice, I found that a thief had broken into the shop of a money-changer and taken thence a casket, no doubt about that. She was Barty's mother and father, perhaps, said to him. We're back in the Bomb Factory. So, by Gerrit their Behring-Island-built vessel to Okotsk. I'm not of the persuasion that As he'd been instructed, is surrounded by a sort of moat. Frog eggs are naked and can be manipulated easily. Fantasy. His intention was to confuse and further rattle the man, and with senses more acute, St. So he turned Morred's people against him. Ikaho is digyna (L. colony will lack the push needed to make it! When we returned and told the others, or both, they kissed the earth before him and offered up prayers for him and for the damsel Shehrzad. The unrevealed half of her face, a circumstance which When we approached the American side we could see that the shore "Not till you'd come to Oraby. 
A noise he thought had been made by the weight of his tread might as easily have been produced by the house itself as it adjusted to the Death, and PHILIPPOV the conservator. | 447.888889 | 3,924 | 0.790623 | eng_Latn | 0.999959 |
2ffc47c2d49fba3841de04311153b8c36fcfdbf1 | 133 | md | Markdown | README.md | eslam-gl/airbnb-clone | ca02f9b25811c7ef6579642209b932cfd1105438 | [
"MIT"
] | null | null | null | README.md | eslam-gl/airbnb-clone | ca02f9b25811c7ef6579642209b932cfd1105438 | [
"MIT"
] | 10 | 2020-07-17T04:43:42.000Z | 2022-03-02T04:14:53.000Z | README.md | eslam-gl/airbnb-clone | ca02f9b25811c7ef6579642209b932cfd1105438 | [
"MIT"
] | null | null | null | # Airbnb clone built with Angular + Ionic :house:
An application similar to Airbnb built for learning purposes. Check it out! :wave:
| 44.333333 | 82 | 0.774436 | eng_Latn | 0.996391 |
2ffce8f62626dfafbaa4a1144b0987201b4df595 | 521 | md | Markdown | _content/articles/wife-and-son_gindin-matthew.md | buddhist-uni/buddhist-uni.github.io | 3d7a6d47e20bff3a4ddf92f9b4e52187678124ba | [
"MIT"
] | 12 | 2020-09-01T11:52:17.000Z | 2022-03-17T17:55:39.000Z | _content/articles/wife-and-son_gindin-matthew.md | buddhist-uni/buddhist-uni.github.io | 3d7a6d47e20bff3a4ddf92f9b4e52187678124ba | [
"MIT"
] | 26 | 2020-03-03T10:39:46.000Z | 2022-03-24T03:53:28.000Z | _content/articles/wife-and-son_gindin-matthew.md | buddhist-uni/buddhist-uni.github.io | 3d7a6d47e20bff3a4ddf92f9b4e52187678124ba | [
"MIT"
] | 3 | 2020-03-02T20:08:36.000Z | 2022-01-01T15:50:06.000Z | ---
title: "Did the Buddha Really Have a Wife and Son?"
authors: ["Matthew Gindin"]
journal: tricycle
year: 2018
month: jan
external_url: "https://tricycle.org/trikedaily/buddhas-family/"
formats: [pdf]
drive_links: ["https://drive.google.com/file/d/133ajEzy4NPSPWApjNtsO82YU6fWxgSNt/view?usp=drivesdk"]
course: buddha
tags:
- characters
---
> Not only is there no mention of a wife or child in the Buddha’s recounting of his renunciation, he seems to suggest that he was still living at home with [both] his parents
| 30.647059 | 174 | 0.758157 | eng_Latn | 0.986205 |
2ffe9f7431a28ba9953cd478a6262fbb267dd88c | 117 | md | Markdown | README.md | jcarras/angular-ajax-complete | bd3612272993278634970fd3f0640e67a7b8fa10 | [
"MIT"
] | 1 | 2017-04-11T15:22:53.000Z | 2017-04-11T15:22:53.000Z | README.md | jcarras/angular-ajax-complete | bd3612272993278634970fd3f0640e67a7b8fa10 | [
"MIT"
] | null | null | null | README.md | jcarras/angular-ajax-complete | bd3612272993278634970fd3f0640e67a7b8fa10 | [
"MIT"
] | null | null | null | angular-ajax-complete
=====================
Angular module which detects when all ajax requests have completed.
| 23.4 | 71 | 0.675214 | eng_Latn | 0.994353 |
2fff086ee22be148c4254df8f887f75bd73d548c | 66 | md | Markdown | _posts/2022-01-25-first.md | twinkle-xingxing/twinkle-xingxing.github.io | 2f766f482f70425f1487a16f15150e7225f43a85 | [
"Apache-2.0"
] | null | null | null | _posts/2022-01-25-first.md | twinkle-xingxing/twinkle-xingxing.github.io | 2f766f482f70425f1487a16f15150e7225f43a85 | [
"Apache-2.0"
] | null | null | null | _posts/2022-01-25-first.md | twinkle-xingxing/twinkle-xingxing.github.io | 2f766f482f70425f1487a16f15150e7225f43a85 | [
"Apache-2.0"
] | null | null | null | ---
layout: post
title: "First post"
---
# First is first
hi there :)
| 8.25 | 16 | 0.575758 | eng_Latn | 0.998747 |
2fff5d36e4d92360f53878a637209a3e93ed02e4 | 1,608 | md | Markdown | docs/api/erc20dividendcheckpointfactory.md | remon-nashid/polymath-core | 33bad294c32d18659b49e90552e2382abacf6bd3 | [
"Apache-2.0"
] | null | null | null | docs/api/erc20dividendcheckpointfactory.md | remon-nashid/polymath-core | 33bad294c32d18659b49e90552e2382abacf6bd3 | [
"Apache-2.0"
] | null | null | null | docs/api/erc20dividendcheckpointfactory.md | remon-nashid/polymath-core | 33bad294c32d18659b49e90552e2382abacf6bd3 | [
"Apache-2.0"
] | null | null | null | ---
id: version-3.0.0-ERC20DividendCheckpointFactory
title: ERC20DividendCheckpointFactory
original_id: ERC20DividendCheckpointFactory
---
# Factory for deploying ERC20DividendCheckpoint module (ERC20DividendCheckpointFactory.sol)
View Source: [contracts/modules/Checkpoint/Dividend/ERC20/ERC20DividendCheckpointFactory.sol](../../contracts/modules/Checkpoint/Dividend/ERC20/ERC20DividendCheckpointFactory.sol)
**↗ Extends: [UpgradableModuleFactory](UpgradableModuleFactory.md)**
**ERC20DividendCheckpointFactory**
## Functions
- [(uint256 _setupCost, address _logicContract, address _polymathRegistry, bool _isCostInPoly)](#)
- [deploy(bytes _data)](#deploy)
### Constructor
```js
function (uint256 _setupCost, address _logicContract, address _polymathRegistry, bool _isCostInPoly) public nonpayable UpgradableModuleFactory
```
**Arguments**
| Name | Type | Description |
| ------------- |------------- | -----|
| _setupCost | uint256 | Setup cost of the module |
| _logicContract | address | Contract address that contains the logic related to `description` |
| _polymathRegistry | address | Address of the Polymath registry |
| _isCostInPoly | bool | true = cost in Poly, false = USD |
### deploy
⤾ overrides [IModuleFactory.deploy](IModuleFactory.md#deploy)
Used to launch the module with the help of the factory
```js
function deploy(bytes _data) external nonpayable
returns(address)
```
**Returns**
The contract address of the deployed module
**Arguments**
| Name | Type | Description |
| ------------- |------------- | -----|
| _data | bytes | |
| 27.724138 | 179 | 0.71704 | yue_Hant | 0.335846 |
2fff661872aca2153dd23fd09b206ce461cf9cc2 | 1,803 | md | Markdown | docs/rules/no-empty-character-class.md | stephenwade/website | 654ee967e967f7d54899285f140866d39fa58c91 | [
"MIT"
] | 35 | 2019-11-04T15:01:55.000Z | 2022-03-11T09:11:31.000Z | docs/rules/no-empty-character-class.md | stephenwade/website | 654ee967e967f7d54899285f140866d39fa58c91 | [
"MIT"
] | 287 | 2019-07-19T02:18:45.000Z | 2022-03-11T09:53:19.000Z | docs/rules/no-empty-character-class.md | stephenwade/website | 654ee967e967f7d54899285f140866d39fa58c91 | [
"MIT"
] | 142 | 2019-07-23T12:56:13.000Z | 2022-03-19T08:22:29.000Z | ---
title: no-empty-character-class - Rules
layout: doc
edit_link: https://github.com/eslint/eslint/edit/main/docs/rules/no-empty-character-class.md
rule_type: problem
---
<!-- Note: No pull requests accepted for this file. See README.md in the root directory for details. -->
# disallow empty character classes in regular expressions (no-empty-character-class)
(recommended) The `"extends": "eslint:recommended"` property in a configuration file enables this rule.
Because empty character classes in regular expressions do not match anything, they might be typing mistakes.
```js
var foo = /^abc[]/;
```
## Rule Details
This rule disallows empty character classes in regular expressions.
Examples of **incorrect** code for this rule:
```js
/*eslint no-empty-character-class: "error"*/
/^abc[]/.test("abcdefg"); // false
"abcdefg".match(/^abc[]/); // null
```
Examples of **correct** code for this rule:
```js
/*eslint no-empty-character-class: "error"*/
/^abc/.test("abcdefg"); // true
"abcdefg".match(/^abc/); // ["abc"]
/^abc[a-z]/.test("abcdefg"); // true
"abcdefg".match(/^abc[a-z]/); // ["abcd"]
```
## Known Limitations
This rule does not report empty character classes in the string argument of calls to the `RegExp` constructor.
Example of a *false negative*, where this rule treats the following problematic code as correct:
```js
/*eslint no-empty-character-class: "error"*/
var abcNeverMatches = new RegExp("^abc[]");
```
## Version
This rule was introduced in ESLint 0.22.0.
## Resources
* [Rule source](https://github.com/eslint/eslint/tree/HEAD/lib/rules/no-empty-character-class.js)
* [Test source](https://github.com/eslint/eslint/tree/HEAD/tests/lib/rules/no-empty-character-class.js)
* [Documentation source](https://github.com/eslint/eslint/tree/HEAD/docs/rules/no-empty-character-class.md)
| 27.738462 | 110 | 0.715474 | eng_Latn | 0.775406 |
2fffe002856b46a21dc660605dbee7095602619a | 7,780 | md | Markdown | Links.md | younari/younari.github.io | f058eff852cedf04fa277f3a52ca957d092a6703 | [
"MIT"
] | 7 | 2017-09-09T02:56:03.000Z | 2018-04-22T07:14:41.000Z | Links.md | younari/younari.github.io | f058eff852cedf04fa277f3a52ca957d092a6703 | [
"MIT"
] | null | null | null | Links.md | younari/younari.github.io | f058eff852cedf04fa277f3a52ca957d092a6703 | [
"MIT"
] | 2 | 2017-10-04T08:30:08.000Z | 2017-10-04T09:11:57.000Z | ---
layout: post
title: "information"
author: "amy"
permalink: /Links/
---
> 🔗 Links from online
# AI
- [Bitcoin Documentary](https://www.youtube.com/watch?v=vr-zeMIKICw)
- [TED - Bitcoin](https://www.ted.com/talks/don_tapscott_how_the_blockchain_is_changing_money_and_business?language=ko)
- [비트코인 킬러웨일](https://www.youtube.com/channel/UCFYXE2w60jhpCO9uKvjZvVQ)
- [블록체인 코인사이트 CoinSight](https://www.youtube.com/channel/UCZWvx1PFmcTLiGJX3FTcCBA)
- [블록체인 세상](https://www.youtube.com/channel/UC70aaNLIi5Er-ZmKBPL2-Xw)
- [People + AI by Google](https://pair.withgoogle.com)
<br>
<br>
# iOS
### Performance
- [Optimizing Swift Performance](https://developer.apple.com/videos/play/wwdc2015/409/)
- [Instruments Tutorial with Swift: Getting Started](https://www.raywenderlich.com/166125/instruments-tutorial-swift-getting-started)
- [Improving Your App with Instruments](https://developer.apple.com/videos/play/wwdc2014/418/)
- [iOS Concurrency with GCD and Operations](https://videos.raywenderlich.com/courses/55-ios-concurrency-with-gcd-and-operations/lessons/1)
- [iOS 10: Memory Graph Debugger](https://videos.raywenderlich.com/screencasts/421-ios-10-memory-graph-debugger)
- [iOS 10: Thread Sanitizer](https://videos.raywenderlich.com/screencasts/418-ios-10-thread-sanitizer)
- [Reference Counting](https://videos.raywenderlich.com/screencasts/421-ios-10-memory-graph-debugger)
### Swift Resources
- [The Traveled iOS Developer’s Guide](https://medium.com/the-traveled-ios-developers-guide)
- [We Heart Swift](https://www.weheartswift.com/learn-swift/)
- [Swift Guide to Map Filter Reduce](https://useyourloaf.com/blog/swift-guide-to-map-filter-reduce/)
- [Swift 함수에 커링 사용하기](https://academy.realm.io/kr/posts/currying-on-the-swift-functions/)
- [Add Account Kit and Facebook Login](https://www.udacity.com/course/passwordless-login-solutions-for-ios--ud1028)
- [objc.io :: A weekly video series on Swift programming](https://talk.objc.io)
- [This week in Swift](https://swiftnews.curated.co)
- [Hash Code runner Swift](http://hashcode.co.kr/code_runners?language=swift)
- [letswift.kr](http://letswift.kr/2017/#)
- [Yagom Swift](https://yagom.github.io/swift_basic/)
- [Swift가 제공하는 여러 포인터 타입들과 동작 방식](https://academy.realm.io/kr/posts/nate-cook-tryswift-tokyo-unsafe-swift-and-pointer-types/)
- [Changing Xcode Header Comment](https://useyourloaf.com/blog/changing-xcode-header-comment/)
- [Ramdom Number](http://www.seemuapps.com/generating-a-random-number-in-swift)
- [Bounds and Frame](http://www.ryanwright.me/cookbook/ios/obj-c/frames-and-bounds)
### RayWenderlich
- [Networking with URLSession](https://videos.raywenderlich.com/courses/93-networking-with-urlsession/lessons/1)
- [Saving Data in iOS](https://videos.raywenderlich.com/courses/96-saving-data-in-ios/lessons/1)
- [Xcode Tips and Tricks](https://videos.raywenderlich.com/courses/88-xcode-tips-and-tricks/lessons/1)
- [Scroll View School](https://videos.raywenderlich.com/courses/99-scroll-view-school/lessons/1)
- [Beginning iOS Animations](https://videos.raywenderlich.com/courses/104-beginning-ios-animations/lessons/1)
- [Intermediate iOS Animations](https://videos.raywenderlich.com/courses/80-intermediate-ios-animations/lessons/1)
- [Custom Controls in iOS](https://videos.raywenderlich.com/courses/76-custom-controls-in-ios/lessons/1)
- [Beginning Video with AVFoundation](https://videos.raywenderlich.com/courses/15-beginning-video-with-avfoundation/lessons/1)
- [Beginning Collection Views](https://videos.raywenderlich.com/courses/95-beginning-collection-views/lessons/1)
- [Custom Collection View Layout](https://videos.raywenderlich.com/courses/65-custom-collection-view-layout/lessons/1)
- [CALayers](https://videos.raywenderlich.com/courses/25-calayers/lessons/1)
### WWDC
- [What's New in the Apple Push Notification Service](https://developer.apple.com/videos/play/wwdc2016/724/)
- [Advances in UIKit Animations and Transitions](https://developer.apple.com/videos/play/wwdc2016/216)
- [Advanced Notifications](https://developer.apple.com/videos/play/wwdc2016/708)
### Apple Official Documents
- [Swift Standard Library](https://developer.apple.com/documentation/swift)
- [Terminology](https://developer.apple.com/library/content/referencelibrary/GettingStarted/DevelopiOSAppsSwift/GlossaryDefinitions.html#//apple_ref/doc/uid/TP40015214-CH12-SW1)
- [The App Life Cycle](https://developer.apple.com/library/content/documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/TheAppLifeCycle/TheAppLifeCycle.html#//apple_ref/doc/uid/TP40007072-CH2-SW1)
- [Human Interface Guidelines](https://developer.apple.com/ios/human-interface-guidelines/overview/themes/)
- [iPhoneX](https://developer.apple.com/ios/human-interface-guidelines/overview/iphone-x/)
- [iOS11](https://developer.apple.com/ios/human-interface-guidelines/overview/whats-new/)
- [developer.apple.com](https://developer.apple.com/develop/)
- [Xcode 9](https://developer.apple.com/xcode/)
- [UIKit Framework](https://developer.apple.com/documentation/uikit)
- [About Swift](https://developer.apple.com/library/content/documentation/Swift/Conceptual/Swift_Programming_Language/index.html#//apple_ref/doc/uid/TP40014097)
- [videos](https://developer.apple.com/videos/)
- [Explore some of the things you can ask Siri](https://www.apple.com/ios/siri/#sports)
- [WWDC 2017 Videos](https://developer.apple.com/videos/wwdc2017/)
- [Apple Developer Documentation](https://developer.apple.com/documentation)
- [Core ML](https://developer.apple.com/documentation/coreml)
- [ARKit](https://developer.apple.com/documentation/arkit)
- [Core Animation](https://developer.apple.com/documentation/quartzcore)
- [Core Image](https://developer.apple.com/documentation/coreimage)
- [Photos](https://developer.apple.com/documentation/photos)
- [Introduction to Cocoa Drawing Guide](https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/CocoaDrawingGuide/Introduction/Introduction.html)
- [Visual Format Language](https://developer.apple.com/library/content/documentation/UserExperience/Conceptual/AutolayoutPG/VisualFormatLanguage.html)
### Design
- [Human Interface Guidelines iOS](https://developer.apple.com/ios/human-interface-guidelines/overview/themes/)
<br>
<br>
<br>
<br>
# Developers
### Tech Nodes
- [Apple confirms Shazam acquisition. Snap and Spotify also expressed interest](https://techcrunch.com/2017/12/11/apple-shazam-deal/)
- [Apple pushes HomePod release to early 2018](https://techcrunch.com/2017/11/17/apple-pushes-homepod-release-to-early-2018/)
- [Apple defends new ad-tracking prevention measures in Safari](https://techcrunch.com/2017/09/15/apple-defends-new-ad-tracking-prevention-measures-in-safari/)
- [The Future Of Enterprise App Development Is Swift](https://techcrunch.com/2015/06/10/the-future-of-enterprise-app-development-is-swift/)
### APIs
- [Toss Developer](http://tossdev.github.io/index.html)
- [AirBnb](https://ko.airbnb.com/partner?af=126295512&c=VigLink&ircid=4560&irclid=zTJ3J91AYXSp28EQkWW982ETUkm3Oux5nTlDR40&irgwc=1&sharedid=)
- [Kakao](https://developers.kakao.com/docs/restapi)
- [Vimeo](https://developer.vimeo.com/api/start)
- [Youtube](https://developers.google.com/youtube/)
- [Twitter](https://dev.twitter.com/docs)
- [Foursquare](https://developer.foursquare.com/)
- [Google](https://developers.google.com/products/)
- [Facebook](https://developers.facebook.com/?locale=ko_KR)
- [Facebook Graph API](https://developers.facebook.com/docs/graph-api/?locale=ko_KR)
- [Behance](https://www.behance.net/dev)
- [Ebay](http://developer.ebay.com/Devzone/shopping/docs/Concepts/ShoppingAPIGuide.html)
### Backend
- [Microsoft Azure](https://azure.microsoft.com)
- [AWS](https://aws.amazon.com)
- [Google Cloud Platform](https://cloud.google.com)
<br>
<br>
<br>
<br>
| 56.788321 | 200 | 0.774293 | yue_Hant | 0.439212 |
2ffff0da2d735a7041958a95de36724ca1a62624 | 420 | md | Markdown | docs/history.md | echofool/dotnet-wechaty | e3a82460ad977a6f92293842d127014d0b52d727 | [
"Apache-2.0"
] | 46 | 2020-07-22T15:57:36.000Z | 2022-03-26T17:20:53.000Z | docs/history.md | wechaty/dotnet-wechaty | f75a5c46f5d5d4f485b5732f81d142b2be8a2b00 | [
"Apache-2.0"
] | 20 | 2020-07-23T11:55:43.000Z | 2022-02-21T07:13:54.000Z | docs/history.md | echofool/dotnet-wechaty | e3a82460ad977a6f92293842d127014d0b52d727 | [
"Apache-2.0"
] | 19 | 2020-09-29T01:20:53.000Z | 2022-03-26T17:21:01.000Z |
* 0.2.0
> 1. ***sdk升级到`net 5.0`***
> 1. ***Wechaty.Grpc `0.20.0`***
> 2. 更新`grpc`客户端nuget包为`Grpc.Net.Client`
> 3. [Grpc.Net.Client](https://github.com/grpc/grpc-dotnet) Nuget Package
> 4. [Microsoft docs for grpc](https://docs.microsoft.com/zh-cn/aspnet/core/grpc/?view=aspnetcore-5.0)
* 0.1.7
> netstandard2.0
* 0.1.6
> netstandard2.0
* 0.1.5
> netstandard2.0
* 0.1.4
> netstandard2.0 | 24.705882 | 104 | 0.638095 | yue_Hant | 0.428406 |
640038ed7b2e06ebb065b303e9509175d8c39fe6 | 348 | md | Markdown | changelog/_unreleased/2021-11-30-fix-the-elasticsearch-query-parser-for-onetomany-relations-in-an-multifilter.md | PuetzD/platform | ad8227ecb96f49083666172461de6e7d3c9d4c1d | [
"MIT"
] | null | null | null | changelog/_unreleased/2021-11-30-fix-the-elasticsearch-query-parser-for-onetomany-relations-in-an-multifilter.md | PuetzD/platform | ad8227ecb96f49083666172461de6e7d3c9d4c1d | [
"MIT"
] | 1 | 2022-01-03T15:24:55.000Z | 2022-01-03T15:24:55.000Z | changelog/_unreleased/2021-11-30-fix-the-elasticsearch-query-parser-for-onetomany-relations-in-an-multifilter.md | PuetzD/platform | ad8227ecb96f49083666172461de6e7d3c9d4c1d | [
"MIT"
] | null | null | null | ---
title: Fix the Elasticsearch Query Parser for OneToMany-Relations in an MultiFilter
issue: NEXT-17324
author: Simon Vorgers
author_email: s.vorgers@shopware.com
author_github: SimonVorgers
---
# Core
* Changed `Shopware\Elasticsearch\Framework\DataAbstractionLayer\CriteriaParser` to build an And-MultiFilter with OneToMany-Relations correctly. | 38.666667 | 144 | 0.827586 | eng_Latn | 0.470659 |
6401dda33aba6f2407a6a125507c4e7af94f9189 | 4,433 | md | Markdown | site/docs/guides/developer/integration-connectors/implementing-an-integration-connector-provider.md | odttlnt/egeria-docs | 070c0dde0001468243784c4e7b20ed4b9a5fc055 | [
"CC-BY-4.0"
] | null | null | null | site/docs/guides/developer/integration-connectors/implementing-an-integration-connector-provider.md | odttlnt/egeria-docs | 070c0dde0001468243784c4e7b20ed4b9a5fc055 | [
"CC-BY-4.0"
] | null | null | null | site/docs/guides/developer/integration-connectors/implementing-an-integration-connector-provider.md | odttlnt/egeria-docs | 070c0dde0001468243784c4e7b20ed4b9a5fc055 | [
"CC-BY-4.0"
] | null | null | null | <!-- SPDX-License-Identifier: CC-BY-4.0 -->
<!-- Copyright Contributors to the ODPi Egeria project. -->
Each connector provider for an integration connector extends the following base class:
```
org.odpi.openmetadata.governanceservers.integrationdaemonservices.connectors.IntegrationConnectorProvider
```
This assumes:
- There is a single connector implementation class for the connector.
- The connector is instantiated with the default constructor. This means all of its configuration information is contained in the [Connection](/concepts/connection) object supplied on the `initialize()` method.
If your connector implementation matches these requirements, its connector provider implementation need only implement a constructor to configure the base class's function with details of itself and the Java class of the connector it needs using:
- a GUID for the [connector type](/concepts/connector-type)
- a name for the connector type.
- a description of what the connector is for and how to configure it.
- the connector class it instantiates.
- a list of the additional properties, configuration properties and secured properties needed to configure instances of the connector.
- a description of the connector for its audit log (if the connector implements `AuditLoggingComponent`).
```java
/**
* XXXStoreProvider is the OCF connector provider for the XXX integration connector.
*/
public class XXXStoreProvider extends IntegrationConnectorProviderBase
{
/*
* Unique identifier of the connector for the audit log.
*/
private static final int connectorComponentId = 10001; /* Add unique number here - Egeria uses numbers under 1000 */
/*
* Unique identifier for the connector type.
*/
private static final String connectorTypeGUID = "Add unique GUID here";
/*
* Descriptive information about the connector for the connector type and audit log.
*/
private static final String connectorQualifiedName = "MyOrg:XXXStoreConnector";
private static final String connectorDisplayName = "XXX Store Connector";
private static final String connectorDescription = "Connector supports ... add details here.";
private static final String connectorWikiPage = "Add url to documentation here";
/*
* Define the name of the connector implementation.
*/
private static final Class<?> connectorClass = XXXStoreConnector.class;
/*
* Define the name of configuration properties (optional).
*/
public static final String TEMPLATE_QUALIFIED_NAME_CONFIGURATION_PROPERTY = "templateQualifiedName";
/**
* Constructor used to initialize the ConnectorProviderBase class.
*/
public XXXStoreProvider()
{
super();
/*
* Set up the class name of the connector that this provider creates.
*/
super.setConnectorClassName(connectorClass.getName());
/*
* Set up the connector type that should be included in a connection used to configure this connector.
*/
ConnectorType connectorType = new ConnectorType();
connectorType.setType(ConnectorType.getConnectorTypeType());
connectorType.setGUID(connectorTypeGUID);
connectorType.setQualifiedName(connectorQualifiedName);
connectorType.setDisplayName(connectorDisplayName);
connectorType.setDescription(connectorDescription);
connectorType.setConnectorProviderClassName(this.getClass().getName());
List<String> recognizedConfigurationProperties = new ArrayList<>();
recognizedConfigurationProperties.add(TEMPLATE_QUALIFIED_NAME_CONFIGURATION_PROPERTY);
connectorType.setRecognizedConfigurationProperties(recognizedConfigurationProperties);
super.connectorTypeBean = connectorType;
/*
* Set up the component description used in the connector's audit log messages.
*/
AuditLogReportingComponent componentDescription = new AuditLogReportingComponent();
componentDescription.setComponentId(connectorComponentId);
componentDescription.setComponentName(connectorQualifiedName);
componentDescription.setComponentDescription(connectorDescription);
componentDescription.setComponentWikiURL(connectorWikiPage);
super.setConnectorComponentDescription(componentDescription);
}
}
```
--8<-- "snippets/abbr.md"
| 42.625 | 246 | 0.740808 | eng_Latn | 0.964671 |
6402202cca3b60954a2abb6dc67ab9b4251966c9 | 10,775 | md | Markdown | docs/_posts/2021-12-09-deepspeed-moe-nlg.md | ganik/DeepSpeed | 788e1c40e83beacfc4901e7daa1e097d2efb82bb | [
"MIT"
] | 1 | 2022-03-15T07:00:38.000Z | 2022-03-15T07:00:38.000Z | docs/_posts/2021-12-09-deepspeed-moe-nlg.md | ganik/DeepSpeed | 788e1c40e83beacfc4901e7daa1e097d2efb82bb | [
"MIT"
] | null | null | null | docs/_posts/2021-12-09-deepspeed-moe-nlg.md | ganik/DeepSpeed | 788e1c40e83beacfc4901e7daa1e097d2efb82bb | [
"MIT"
] | null | null | null | ---
title: "DeepSpeed-MoE for NLG: Reducing the training cost of language models by 5 times"
excerpt: ""
date: 2021-12-09 22:00:00
tags: training
---
Autoregressive transformer-based natural language generation (referred to as
NLG in the rest of the blog) models can offer convincing solutions to a broad
range of language tasks from document summarization, headline generation,
question and answering to even generating code in a wide variety of programming
languages. Due to the general applicability of these models, improving their
quality has been of great interest for both academia and industry alike.
The quality of NLG improves with the increase in model size. However, today we
are getting close to the limit of what the current generation of hardware can
do. The Megatron-Turing NLG 530B model took 3 months to train on over 2K A100
GPUs on the NVIDIA Selene Supercomputer, consuming over 3 million GPU hours.
Another 3 to 5 times increase in model size would be infeasible within a
reasonable timeframe. Given the exorbitant compute resources required to train
the state-of-art NLG models, a natural question to ask is: "Is it possible to
make non-trivial improvement to model quality without increasing the compute
cost?" Or equivalently, "Is it possible to produce model with similar quality
using 3 to 5 times less resources?"
Recent works like [GShard](https://arxiv.org/abs/2006.16668) and [Switch
Transformers](https://arxiv.org/abs/2101.03961) have shown that Mixture of
Experts (MoE) model structure reduces large model training cost significantly
for transformer-based encoder-decoder models. An MoE model contains a set of
sparsely gated experts. During training and inference, only a subset of these
experts is activated for each input token. Therefore, the model could scale to
billions of parameters without a proportional increase in the computation.
Despite showing promising results, the effectiveness of MoE for the much more
computation intensive NLG family models remains mostly unknown.
Given the tremendous compute and energy requirements for training NLG family of
models, we explore the opportunities that MoE presents to reduce their training
cost. **We show that MoE can be applied to NLG family of models to significantly
improve their model quality with the same training cost. Alternatively, it can
achieve 5x reduction in training cost to achieve the same model quality of a
dense NLG model.** For example, by applying MoE we achieved the model quality of
a 6.7B parameter dense NLG model at the cost of training a 1.3B parameter dense
model, thanks to the sparse structure of MoE.
Assuming the scaling holds, the results have the potential to completely
transform the large model training landscape in terms of cost. For example, a
trillion-parameter dense model can be potentially trained at the cost of a 200B
parameter (like GPT-3) sized dense model, translating to millions of dollars in
training cost reduction and energy savings (Brown et al., 2020, Language models
are few-shot learners).
## MoE based NLG model architecture
To create an MoE based NLG model, we studied the GPT-like transformer-based NLG
model. To complete training in a reasonable timeframe, the following models are
selected: 350M (24 layers, 1024 hidden size, 16 attention heads), 1.3B (24
layers, 2048 hidden size, 16 attention heads), and 6.7B (32 layers, 4096 hidden
size, 32 attention heads). We use "350M+MoE-128" to denote an MoE model
that uses 350M dense model as the base model and adds 128 experts on every
other feedforward layer. That is to say, there are in total 12 MoE layers for
both 350M+MoE-128 and 1.3B+MoE-128.
We use a gating function to activate a subset of experts in the MoE layer for
each token. Specifically, in our experiments, only the top-1 expert is
selected. Therefore, during both training and inference, our MoE model will
have the same number of parameters to be activated for each token as their
dense part. For example, our 1.3B+MoE-128 will only activate 1.3B parameter per
token, and the amount of training computation per token will be similar to a
1.3B dense model.
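The top-1 gating described above can be sketched in a few lines of plain Python. This is an illustrative toy, not the DeepSpeed implementation; the expert count and router logits below are invented for the demo:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of router logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top1_route(logits):
    """Pick the single expert with the highest gate probability for one token."""
    probs = softmax(logits)
    expert = max(range(len(probs)), key=lambda i: probs[i])
    return expert, probs[expert]

# Two tokens routed across 4 hypothetical experts: each token only ever
# activates one expert's parameters, which is why compute per token stays
# at the dense-base level no matter how many experts are added.
for token, logits in enumerate([[0.1, 2.0, -1.0, 0.3], [1.5, 0.0, 0.2, -0.5]]):
    expert, weight = top1_route(logits)
    print(f"token {token} -> expert {expert} (gate weight {weight:.2f})")
```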
## MoE training infrastructure and dataset
We pre-trained both the dense and MoE version of the above models using
[DeepSpeed](http://deepspeed.ai) on 128 A100 GPUs. DeepSpeed uses a
combination of data parallel and expert parallel training to effectively scale
the [MoE model training](https://www.microsoft.com/en-us/research/blog/deepspeed-powers-8x-larger-moe-model-training-with-high-performance/).
We used the same training data as described in the [MT-NLG blog](https://www.microsoft.com/en-us/research/blog/using-deepspeed-and-megatron-to-train-megatron-turing-nlg-530b-the-worlds-largest-and-most-powerful-generative-language-model/). For a fair
comparison, we use 300B tokens to train both the dense model and the MoE model.
## MoE leads to better quality for NLG models
Figure 1 shows that the validation loss for the MoE versions of the models is
significantly better than that of their dense counterparts. Furthermore, notice that
the validation loss of the MoE model, 350M+MoE-128, is on par with the
validation loss of the 1.3B dense model with 4x larger base. This is also true
for 1.3B+MoE-128 in comparison with 6.7B dense model with 5x larger base.
Furthermore, the model quality is on par not only for the validation loss but
also for a wide variety of six zero-shot evaluation tasks as shown in Table 1,
demonstrating that these models in fact have very similar model quality.
{: .align-center}
Figure 1: Token-wise validation loss curves for dense and MoE NLG models with different model sizes.
| Model size | LAMBADA: completion prediction | PIQA: commonsense reasoning | BoolQ: reading comprehension | RACE-h: reading comprehension | TriviaQA: question answering | WebQs: question answering |
| ---: | ---: | ---: | ---: | ---: | ---: | ---: |
| **Dense NLG:** | | | | | | |
| 350M | 0.5203 | 0.6931 | 0.5364 | 0.3177 | 0.0321 | 0.0157 |
| 1.3B | 0.6365 | 0.7339 | 0.6339 | 0.3560 | 0.1005 | 0.0325 |
| 6.7B | 0.7194 | 0.7671 | 0.6703 | 0.3742 | 0.2347 | 0.0512 |
| **MoE NLG:** | | | | | | |
| 350M+MoE-128 (13B) | 0.6270 | 0.7459 | 0.6046 | 0.3560 | 0.1658 | 0.0517 |
| 1.3B+MoE-128 (52B) | 0.6984 | 0.7671 | 0.6492 | 0.3809 | 0.3129 | 0.0719 |
Table 1: Zero-shot evaluation results (last six columns) for different dense and MoE NLG models. All zero-shot evaluation results use the accuracy metric.
## Same quality with 5x less training cost
As we saw from the results above, adding MoE with 128 experts to the NLG model
significantly improves the quality of the NLG model. However, these experts do
not change the compute requirements of the model as each token is only
processed by a single expert. Therefore, the compute requirements for a dense
model and its corresponding MoE models with the same base are similar.
More concretely, a 1.3B+MoE-128 model training requires roughly the same
amount of compute operations as a 1.3B dense model, while offering much better model
quality. Furthermore, our results show that by applying MoE we can achieve the
model quality of a 6.7B parameter dense model at the training cost of 1.3B
parameter dense model, resulting in an effective training compute reduction of
5x.
This compute cost reduction can directly be translated into throughput gain,
training time and training cost reduction by leveraging the efficient DeepSpeed
MoE training system. Table 2 shows the training throughput of the 1.3B+MoE-128
model in comparison to the 6.7B dense model on 128 NVIDIA A100 GPUs.
| | Training samples per sec | Throughput gain / Cost Reduction
| --- | ---: | ---:
| 6.7B dense | 70 | 1x
| 1.3B+MoE-128 | 372 | 5x
Table 2: Training throughput (on 128 A100 GPUs) comparing MoE based model vs dense model that can both achieve the same model quality.
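The 5x figure follows directly from the measured throughputs in Table 2:

```python
dense_throughput = 70   # samples/sec, 6.7B dense (Table 2)
moe_throughput = 372    # samples/sec, 1.3B+MoE-128 (Table 2)

speedup = moe_throughput / dense_throughput
print(f"{speedup:.1f}x")  # 5.3x, reported conservatively as 5x
```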
## MoE for Inference
The training cost reduction of MoE is not free and comes at the expense of
increasing the total number of parameters required to achieve the same model
quality compared to dense models. The 1.3B+MoE-128 model has roughly 8x the number
of parameters (52B) compared to the 6.7B dense model. So, does this mean
inference will be 8x slower than the dense model, since inference is generally
limited by the time taken to read all the model parameters, especially for
small batch sizes?
Not quite. Note that in the 1.3B+MoE-128 model, each token is processed by a
unique expert per MoE layer, and the total number of parameters used in
processing the token is just 1.3B. This can in theory result in even faster
inference than the quality-equivalent dense 6.7B model because of 5x less
compute and parameter read. In reality though, the number of tokens in a batch
during inference is generally larger than 1. Inferencing with a long sequence
length or a non-unit batch size may require loading all the experts, increasing
the total number of parameters loaded by 8x compared to the quality-equivalent
dense model. Therefore, achieving good inference performance with MoE is still
challenging even though the parameters used and the computation incurred per
token is small compared to the quality-equivalent dense model.
Nonetheless, we believe that it is possible to use different forms of
parallelism to leverage massive memory bandwidth by scaling across a large
number of devices to speed up MoE inference, making it comparable or faster
than quality-equivalent dense models for extended inference scenarios and
creating opportunities to make MoE based models cost efficient for inference in
addition to training.
## Conclusion and Release
We demonstrate that MoE based models can be applied to the NLG task, reducing the
training cost by 5x compared to dense, autoregressive transformer-based models
like GPT-3 and MT-NLG 530B. Through MoE based low-cost training we hope to make
high quality language models accessible to a broad audience, even with limited
compute resources.
To this end we are releasing our [end-to-end pipeline for training MoE based
NLG models](https://github.com/microsoft/Megatron-DeepSpeed/tree/moe-training),
along with [specific example
scripts](https://github.com/microsoft/Megatron-DeepSpeed/tree/moe-training/examples/MoE)
and [tutorial](/tutorials/mixture-of-experts-nlg) to help get started with our pipeline. We look forward to the application and
the innovations that this may bring to the deep learning community.
## Acknowledgement
This work was done in collaboration with Brandon Norick, Zhun Liu, Xia Song from the
Turing Team, and Young Jin Kim, Alex Muzio, Hany Hassan Awadalla from Z-Code
Team. We also thank Luis Vargas, Umesh Madan, Gopi Kumar, Andrey Proskurin and
Mikhail Parakhin for their continuous support and guidance.
+++
title = "Jacobi Method"
author = ["Leobardo Argüelles"]
draft = false
+++
## CONDITIONS {#condiciones}
1. To apply this method, the matrix A must be [diagonally dominant]({{<relref "20210918125241-matriz_tridiagonal_dominante.md#" >}}).
2. For _n_ unknowns, we must have _n_ equations.
## PROCEDURE {#procedimiento}
1. A different variable is isolated from each equation.
2. The resulting equations are used iteratively to obtain the **next** term, until the error is acceptable.
For example, to compute an \\(x\_{n+1}\\) as a function of _y_, \\(y\_n\\) is
used. That is, **each iteration depends on the previous one**.
Note: The initial terms \\(x\_0\\) and \\(y\_0\\) must start at a point close
to the solution.
### EXAMPLE {#ejemplo}
From the system of equations:
\begin{equation\*}
\begin{aligned}
4x-y+z&=7\\\\
4x-8y+z&=-21\\\\
-2x+y+5z&=15
\end{aligned}
\end{equation\*}
One variable is isolated from each equation:
\begin{equation\*}
x\_{n+1}=\frac{7+y\_n-z\_n}{4}
\end{equation\*}
\begin{equation\*}
y\_{n+1}=\frac{21+4x\_n+z\_n}{8}
\end{equation\*}
\begin{equation\*}
z\_{n+1}=\frac{15+2x\_n-y\_n}{5}
\end{equation\*}
Each of these equations is used iteratively, computing the terms one by one
and computing the error.
When the error is considered acceptable (1e-4, for example), the method stops, since it has converged to the solution.
## ERROR CALCULATION {#cálculo-del-error}
Each iteration represents a vector, and the error is the distance between the
tips of vector _i_ and vector _i-1_.
The solution is expected to converge, that is, this distance tends to 0.
Viewed this way, graphically, the error can be computed with the Pythagorean theorem:
\begin{equation\*}
error=\sqrt{(x\_i-x\_{i-1})^2+(y\_i-y\_{i-1})^2+ \dots + (z\_i-z\_{i-1})^2}
\end{equation\*}
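A minimal Python implementation of the iteration above, using the successive-difference error and the 1e-4 tolerance mentioned in the text:

```python
def jacobi(x, y, z, tol=1e-4, max_iter=100):
    """Jacobi iteration for the example system:
    4x - y + z = 7,  4x - 8y + z = -21,  -2x + y + 5z = 15
    """
    for _ in range(max_iter):
        # Every new value is computed from the *previous* iteration only.
        x_new = (7 + y - z) / 4
        y_new = (21 + 4 * x + z) / 8
        z_new = (15 + 2 * x - y) / 5
        # Error = distance between consecutive iterates (Pythagorean theorem).
        err = ((x_new - x) ** 2 + (y_new - y) ** 2 + (z_new - z) ** 2) ** 0.5
        x, y, z = x_new, y_new, z_new
        if err < tol:
            break
    return x, y, z

print(jacobi(1, 2, 2))  # converges near the exact solution (2, 4, 3)
```

The coefficient matrix is diagonally dominant, so the iteration is guaranteed to converge from this nearby starting point.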
# Personal Blog
This is my personal blog project, where I record bits and pieces of my life and of technology.
URL: [http://blog.couldme.cn/](http://blog.couldme.com/)
Blog theme: [Yummy-Jekyll](https://github.com/DONGChuan/Yummy-Jekyll)
## My QQ Number
 | 16.428571 | 64 | 0.717391 | yue_Hant | 0.68102 |
# curl_example
libcurl examples
---
title: Improve performance by compressing files in Azure CDN | Microsoft Docs
description: Learn how to improve file transfer speed and increase page-load performance by compressing your files in Azure CDN.
services: cdn
documentationcenter: ''
author: asudbring
manager: danielgi
editor: ''
ms.assetid: af1cddff-78d8-476b-a9d0-8c2164e4de5d
ms.service: azure-cdn
ms.workload: tbd
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: how-to
ms.date: 02/28/2018
ms.author: allensu
ms.openlocfilehash: 11a2dbfc9c6da60e4dd96f65d2a20165a3663e8c
ms.sourcegitcommit: 32e0fedb80b5a5ed0d2336cea18c3ec3b5015ca1
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 03/30/2021
ms.locfileid: "103601550"
---
# <a name="improve-performance-by-compressing-files-in-azure-cdn"></a>Improve performance by compressing files in Azure CDN
File compression is a simple and effective way to improve file transfer speed and increase page-load performance by reducing a file's size before it is sent from the server. File compression can reduce bandwidth costs and provide a more responsive experience for your users.
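As a rough illustration of the bandwidth savings involved, here is gzip (one of the encodings Azure CDN supports) applied to a repetitive text payload; the payload is invented for the demo:

```python
import gzip

# Text-based assets (HTML, CSS, JS) are highly repetitive and compress well.
payload = b"<div class='item'>Hello, CDN!</div>\n" * 200

compressed = gzip.compress(payload)
print(f"uncompressed: {len(payload)} bytes")
print(f"gzip:         {len(compressed)} bytes")
print(f"ratio:        {len(compressed) / len(payload):.1%}")
```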
There are two ways to enable file compression:
- Enable compression on your origin server. In this case, Azure CDN passes along the compressed files and delivers them to clients that request them.
- Enable compression directly on the CDN POP servers (*compression on the fly*). In this case, the CDN compresses the files and serves them to end users, even if they weren't compressed by the origin server.
> [!IMPORTANT]
> Azure CDN configuration changes can take some time to propagate through the network:
> - For **Azure CDN Standard from Microsoft** profiles, propagation usually completes in 10 minutes.
> - For **Azure CDN Standard from Akamai** profiles, propagation usually completes within one minute.
> - For **Azure CDN Standard from Verizon** and **Azure CDN Premium from Verizon** profiles, propagation usually completes in 10 minutes.
>
> If you're setting up compression for the first time for your CDN endpoint, consider waiting 1-2 hours before you troubleshoot, to ensure the compression settings have propagated to the POPs.
## <a name="enabling-compression"></a>Enabling compression
The standard and premium CDN tiers provide the same compression functionality, but the user interface differs. For more information about the differences between the standard and premium CDN tiers, see [Azure CDN overview](cdn-overview.md).
### <a name="standard-cdn-profiles"></a>Standard CDN profiles
> [!NOTE]
> This section applies to **Azure CDN Standard from Microsoft**, **Azure CDN Standard from Verizon**, and **Azure CDN Standard from Akamai** profiles.
>
>
1. From the CDN profile page, select the CDN endpoint you want to manage.

The CDN endpoint page opens.
2. Select **Compression**.

The compression page opens.
3. Select **On** to turn on compression.

4. Use the default MIME types, or modify the list by adding or removing MIME types.
> [!TIP]
> Although it's possible, we don't recommend applying compression to already-compressed formats, such as ZIP, MP3, MP4, or JPG.
>
5. After making your changes, select **Save**.
### <a name="premium-cdn-profiles"></a>Premium CDN profiles
> [!NOTE]
> This section applies only to **Azure CDN Premium from Verizon** profiles.
>
1. From the CDN profile page, select **Manage**.

The CDN management portal opens.
2. Hover over the **HTTP Large** tab, then hover over the **Cache Settings** flyout. Select **Compression**.

The compression options are displayed.

3. Turn on compression by selecting **Compression Enabled**. Enter the MIME types you want to compress as a comma-delimited list (no spaces) in the **File Types** box.
> [!TIP]
> Although it's possible, we don't recommend applying compression to already-compressed formats, such as ZIP, MP3, MP4, or JPG.
>
4. After making your changes, select **Update**.
## <a name="compression-rules"></a>Compression rules
### <a name="azure-cdn-standard-from-microsoft-profiles"></a>Azure CDN Standard from Microsoft profiles
For **Azure CDN Standard from Microsoft** profiles, only eligible files are compressed. To be eligible for compression, a file must:
- Be of a MIME type that has been [configured for compression](#enabling-compression).
- Be larger than 1 KB
- Be smaller than 8 MB
These profiles support the following compression encodings:
- gzip (GNU zip)
- brotli
If the request supports more than one compression type, brotli compression takes precedence.
When a request for an asset specifies gzip compression and the request results in a cache miss, Azure CDN performs gzip compression of the asset directly on the POP server. Afterward, the compressed file is served from the cache.
If the origin uses Chunked Transfer Encoding (CTE) to send compressed data to the CDN POP, response sizes larger than 8 MB aren't supported.
### <a name="azure-cdn-from-verizon-profiles"></a>Azure CDN from Verizon profiles
For **Azure CDN Standard from Verizon** and **Azure CDN Premium from Verizon** profiles, only eligible files are compressed. To be eligible for compression, a file must:
- Be larger than 128 bytes
- Be smaller than 3 MB
These profiles support the following compression encodings:
- gzip (GNU zip)
- DEFLATE
- bzip2
- brotli
If the request supports more than one compression type, those compression types take precedence over brotli compression.
When a request for an asset specifies brotli compression (HTTP header `Accept-Encoding: br`) and the request results in a cache miss, Azure CDN performs brotli compression of the asset directly on the POP server. Afterward, the compressed file is served from the cache.
### <a name="azure-cdn-standard-from-akamai-profiles"></a>Azure CDN Standard from Akamai profiles
For **Azure CDN Standard from Akamai** profiles, all files are eligible for compression. However, a file must be of a MIME type that has been [configured for compression](#enabling-compression).
These profiles support gzip compression encoding only. When a profile endpoint requests a gzip-encoded file, it is always requested from the origin, regardless of the client request.
## <a name="compression-behavior-tables"></a>Compression behavior tables
The following tables describe Azure CDN compression behavior for every scenario:
### <a name="compression-is-disabled-or-file-is-ineligible-for-compression"></a>Compression is disabled, or the file is ineligible for compression
| Client-requested format (via Accept-Encoding header) | Cached file format | CDN response to the client | Notes |
| --- | --- | --- | --- |
| Compressed | Compressed | Compressed | |
| Compressed | Uncompressed | Uncompressed | |
| Compressed | Not cached | Compressed or uncompressed | The origin response determines whether the CDN performs compression. |
| Uncompressed | Compressed | Uncompressed | |
| Uncompressed | Uncompressed | Uncompressed | |
| Uncompressed | Not cached | Uncompressed | |
### <a name="compression-is-enabled-and-file-is-eligible-for-compression"></a>Compression is enabled, and the file is eligible for compression
| Client-requested format (via Accept-Encoding header) | Cached file format | CDN response to the client | Notes |
| --- | --- | --- | --- |
| Compressed | Compressed | Compressed | The CDN transcodes between supported formats. <br/>**Azure CDN from Microsoft** doesn't support transcoding between formats; instead, it fetches data from the origin, then compresses and caches it separately for each format. |
| Compressed | Uncompressed | Compressed | The CDN performs compression. |
| Compressed | Not cached | Compressed | The CDN performs compression if the origin returns an uncompressed file. <br/>**Azure CDN from Verizon** passes the uncompressed file on the first request, then compresses and caches the file for subsequent requests. <br/>Files with the `Cache-Control: no-cache` header are never compressed. |
| Uncompressed | Compressed | Uncompressed | The CDN performs decompression. <br/>**Azure CDN from Microsoft** doesn't support decompression; instead, it fetches data from the origin and caches it separately for uncompressed clients. |
| Uncompressed | Uncompressed | Uncompressed | |
| Uncompressed | Not cached | Uncompressed | |
## <a name="media-services-cdn-compression"></a>Media Services CDN compression
For endpoints enabled for Media Services CDN streaming, compression is enabled by default for the following MIME types:
- application/vnd.ms-sstr+xml
- application/dash+xml
- application/vnd.apple.mpegurl
- application/f4m+xml
## <a name="see-also"></a>See also
* [Troubleshoot CDN file compression](cdn-troubleshoot-compression.md)
---
title: '20210'
categories:
- OAS1
description: Experience rock climbing
pdf: 'https://www.nzqa.govt.nz/nqfdocs/units/pdf/20210.pdf'
level: '1'
credits: '1'
assessment: Internal
---
//[Kores](../../../../index.md)/[com.github.jonathanxd.kores.base](../../index.md)/[GenericSignatureHolder](../index.md)/[Builder](index.md)
# Builder
[jvm]\
interface [Builder](index.md)<out [T](index.md) : [GenericSignatureHolder](../index.md), [S](index.md) : [GenericSignatureHolder.Builder](index.md)<[T](index.md), [S](index.md)>> : [Builder](../../../com.github.jonathanxd.kores.builder/-builder/index.md)<[T](index.md), [S](index.md)>
## Functions
| Name | Summary |
|---|---|
| [build](../../../com.github.jonathanxd.kores.builder/-builder/build.md) | [jvm]<br>abstract fun [build](../../../com.github.jonathanxd.kores.builder/-builder/build.md)(): [T](index.md)<br>Build the object of type [T](../../../com.github.jonathanxd.kores.builder/-builder/index.md). |
| [genericSignature](generic-signature.md) | [jvm]<br>abstract fun [genericSignature](generic-signature.md)(value: [GenericSignature](../../../com.github.jonathanxd.kores.generic/-generic-signature/index.md)): [S](index.md)<br>See T. |
## Inheritors
| Name |
|---|
| [MethodDeclarationBase](../../-method-declaration-base/-builder/index.md) |
| [TypeDeclaration](../../-type-declaration/-builder/index.md) |
## types
- [Arrays](https://wasabi-io.github.io/wasabi-common/modules/_types_arrays_.html):
Provides some operation on Array type
- [Chars](https://wasabi-io.github.io/wasabi-common/modules/_types_chars_.html):
Provides some operation for chars.
- [Functions](https://wasabi-io.github.io/wasabi-common/modules/_types_functions_.html):
Provides some operation on Function type.
- [Maps](https://wasabi-io.github.io/wasabi-common/modules/_types_maps_.html):
Provides some operation on Map type.
- [Objects](https://wasabi-io.github.io/wasabi-common/modules/_types_objects_.html):
Provides some operation on Object type
- [Strings](https://wasabi-io.github.io/wasabi-common/modules/_types_strings_.html):
Provides some operation on String type
##### Usage [Arrays](https://wasabi-io.github.io/wasabi-common/modules/_types_arrays_.html):
Provides some operation on Array type
```typescript
import { Arrays } from "wasabi-common";
let src = ["4", "5"];
let index = 1;
let value = "4";
Arrays.has(src); // true
Arrays.has(src, index); // true
Arrays.getLength(src); // 2
Arrays.remove(src, index); // ["4"]
Arrays.removeValue(src, value); // []
```
* Usage [Objects](https://wasabi-io.github.io/wasabi-common/modules/_types_objects_.html)
```typescript
import { Objects } from "wasabi-common";
let src = {
key1: "3",
key2: "3",
key3: "6"
};
Objects.has(src); // true;
Objects.has({}); // false;
Objects.has(src, "key1"); // true;
Objects.getLength(src); // 3
Objects.remove(src, "key1"); // { key2: "3", key3: "6" }
Objects.removeValue(src, "3"); // {key3: "6"}
Objects.map(src, (value, key) => { return key + "->" + value; }); // ["key3->6"]
Objects.forEach(src, (value, key) => { console.log(key) });
Objects.keys(src); // ["key3"]
Objects.values(src); // ["6"]
Objects.addValue(src, "key4", "5"); // { key3: "6", key4: "5" }
Objects.addValue(src, "nestedObject", "5", ["key5"]); // { key3: "6", key4: "5", nestedObject: { key5: "5"} }
Objects.getValue(src, "key4"); // "6"
Objects.getValue(src, "nestedObject", ["key5"]); // "5"
Objects.clone(src); // { key3: "6", key4: "5", nestedObject: { key5: "5"} }
Objects.merge(src, { key5: "6", nestedObject: { key5: "7"}}); // { key3: "6", key4: "5", key5: "6", nestedObject: { key5: "5"} }
Objects.mergeDefaults(src, { key5: "6", nestedObject: { key5: "7"}}); // { key3: "6", key4: "5", key5: "6", nestedObject: { key5: "7"} }
```
* Usage [Strings](https://wasabi-io.github.io/wasabi-common/modules/_types_strings_.html)
<a name="#strings"></a>
```typescript
import { Strings } from "wasabi-common";
Strings.capitalizeFirstLetter("example"); // "Example"
Strings.endsWith("Example", "ex"); // false
Strings.has(""); // false
Strings.lPad("example", "0", 10); // "000example"
Strings.lTrim(" example "); // "example "
Strings.partsByNumber("example", 2); // ["ex", "am", "pl", "e"]
Strings.rPad("example", "0", 10); // "example000"
Strings.rTrim(" example "); // " example"
Strings.startsWith("Example", "ex"); // false
Strings.toString(null); // ""
Strings.trim(" Example "); // "Example"
let data = {
name1: 'Silento',
name2: 'Miley',
nested: { greeting: 'Dude', useName1: true },
verb: function() {
return this.nested.useName1 ? 'nae nae' : 'twerk';
}
};
let result = Strings.template('Hello, ${nested["greeting"]}!', data);
console.log(result);
result = Strings.template('${nested.useName1 ? name1 : name2}', data);
console.log(result);
result = Strings.template('${name1} likes to ${verb()}', data);
console.log(result);
```
# eslint-plugin-i18n-lingui
ESLint Plugin to enforce i18n best practices.
You should use this plugin if:
1. You use [lingui](https://github.com/lingui/js-lingui) to localize your application.
2. You want to avoid common pitfalls in wrapping source strings that could result in poor-quality translations.
## Installation
```
npm install eslint-plugin-i18n-lingui --save-dev
```
```
yarn add eslint-plugin-i18n-lingui --dev
```
## Usage
Add `i18n-lingui` to the plugins section of your `.eslintrc` configuration file. You can omit the `eslint-plugin-` prefix.
```js
plugins: [
"i18n-lingui"
]
```
Then configure the rules you want to use under the `rules` section.
```js
rules: {
"i18n-lingui/rule-name": 1, // warning
"i18n-lingui/another-rule-name": 2, // error
}
```
## List of supported rules
| Has Fixer | Rule | Description |
|-----------|------------------------------------|---------------------------------------------------------|
| | [no-eval-in-placeholder](/docs/rules/no-eval-in-placeholder.md) | No evaluation of placeholder values in wrapped strings |
| ✔️ | [i18n-lingui/prefer-unicode-ellipsis](/docs/rules/prefer-unicode-ellipsis.md) | Detects three periods in Trans or t tag wrapped strings |
| | [no-useless-string-wrapping](/docs/rules/no-useless-string-wrapping.md) | No wrapping a string that only contains an expression. |
# Facial Landmarks for Cubism
A library that extracts facial landmarks from a webcam feed and converts them
into Live2D® Cubism SDK parameters.
*Disclaimer: This library is designed for use with the Live2D® Cubism SDK.
It is not part of the SDK itself, and is not affiliated in any way with Live2D
Inc. The Live2D® Cubism SDK belongs solely to Live2D Inc. You will need to
agree to Live2D Inc.'s license agreements to use the Live2D® Cubism SDK.*
This block diagram shows the intended usage of this library:

Video showing me using the example program:
<https://youtu.be/SZPEKwEqbdI>
## Spin-off: Mouse Tracking for Cubism
An alternative version using mouse cursor tracking and audio based lip
syncing instead of face tracking is available at
<https://github.com/adrianiainlam/mouse-tracker-for-cubism>.
The main advantage is a much lower CPU load.
## Supporting environments
This library was developed and tested only on Ubuntu 18.04 using GCC 7.5.0.
However I don't think I've used anything that prevents it from being
cross-platform compatible -- it should still work as long as you have a
recent C/C++ compiler. The library should only require C++11. The Cubism
SDK requires C++14. I have made use of one C++17 library (`<filesystem>`)
in the example program, but it should be straightforward to change this
if you don't have C++17 support.
I have provided some shell scripts for convenience when building. In an
environment without a `/bin/sh` shell you may have to run the commands
manually. Hereafter, all build instructions will assume a Linux environment
where a shell is available.
If your CPU does not support AVX instructions you may want to edit "build.sh"
and "example/demo.patch" to remove the `-D USE_AVX_INSTRUCTIONS=1` variable
(or change AVX to SSE4 or SSE2). However there could be a penalty in
performance.
## Build instructions
1. Install dependencies.
You will require a recent C/C++ compiler, `make`, `patch`, CMake >= 3.16,
and the OpenCV library (I'm using version 4.3.0). To compile the example
program you will also require the OpenGL library (and its dev headers)
among other libraries required for the example program. The libraries I
had to install (this list may not be exhaustive) are:
libgl1-mesa-dev libxrandr-dev libxinerama-dev libxcursor-dev libxi-dev libglu1-mesa-dev
2. Clone this repository including its submodule (dlib)
git clone --recurse-submodules https://github.com/adrianiainlam/facial-landmarks-for-cubism.git
3. To build the library only: (Skip this step if you want to build the example
program. It will be done automatically.)
cd <path of the git repo>
./build.sh
4. You will require a facial landmark dataset to use with dlib. I have
downloaded mine from
<http://dlib.net/files/shape_predictor_68_face_landmarks.dat.bz2>.
Extract the file and edit the "config.txt" file to point to the
path to this file.
Note: The license for this dataset excludes commercial use. If you want
to use this library in a commercial product you will need to obtain a
dataset in some other way.
To build the example program:
5. Copy the extracted dlib dataset from step 4 to the "example" folder
of this repo.
6. Download "Cubism 4 SDK for Native R1" from the Live2D website:
<https://www.live2d.com/en/download/cubism-sdk/download-native/>.
Extract the archive -- put the "CubismSdkForNative-4-r.1" folder under
the "example" folder of this repo.
Note: The Cubism SDK is the property of Live2D and is not part of this
project. You must agree to Live2D's license agreements to use it.
7. Go into the
"example/CubismSdkForNative-4-r.1/Samples/OpenGL/thirdParty/scripts"
directory and run
./setup_glew_glfw
8. Go back to the "example" directory and run
./build.sh
9. Now try running the example program. From the "example" directory:
cd ./demo_build/build/make_gcc/bin/Demo/
./Demo
## Command-line arguments for the example program
Most command-line arguments are to control the Cubism side of the program.
Only one argument (`--config`) is used to specify the configuration file
for the Facial Landmarks for Cubism library.
* `--window-width`, `-W`: Specify the window width
* `--window-height`, `-H`: Specify the window height
* `--window-title`, `-t`: Specify the window title
* `--root-dir`, `-d`: The directory at which the "Resources" folder will
be found. This is where the model data will be located.
* `--scale-factor`, `-f`: How the model should be scaled
* `--translate-x`, `-x`: Horizontal translation of the model within the
window
* `--translate-y`, `-y`: Vertical translation of the model within the window
* `--model`, `-m`: Name of the model to be used. This must be located inside
the "Resources" folder.
* `--old-param-id`, `-o`: If set to 1, translate new (Cubism 3+) parameter
IDs to old (Cubism 2.1) IDs. This is necessary, for example, for
[the Chitose model available from Live2D](https://www.live2d.com/en/download/sample-data/).
* `--config`, `-c`: Path to the configuration file for the Facial Landmarks
for Cubism library. See below for more details.
## Configuration file
Due to the differences in hardware and differences in each person's face,
I have decided to make pretty much every parameter tweakable. The file
"config.txt" lists and documents all parameters and their default values.
You can change the values there and pass it to the example program using
the `-c` argument. If using the library directly, the path to this file
should be passed to the constructor (or pass an empty string to use
default values).
## Troubleshooting
1. Example program crashes with SIGILL (Illegal instruction).
Your CPU probably doesn't support AVX instructions which is used by dlib.
You can confirm this by running
grep avx /proc/cpuinfo
If this is the case, try to find out if your CPU supports SSE4 or SSE2,
then edit "build.sh" and "example/demo.patch" to change
`USE_AVX_INSTRUCTIONS=1` to `USE_SSE4_INSTRUCTIONS=1` or
`USE_SSE2_INSTRUCTIONS=1`.
## License
The library itself is provided under the MIT license. By "the library itself"
I refer to the following files that I have provided under this repo:
* src/facial_landmark_detector.cpp
* src/math_utils.h
* include/facial_landmark_detector.h
* and if you decide to build the binary for the library, the resulting
binary file (typically build/libFacialLandmarksForCubism.a)
The license text can be found in LICENSE-MIT.txt, and also at the top of
the .cpp and .h files.
The library makes use of the dlib library, provided here as a Git
submodule, which is used under the Boost Software License, version 1.0.
The full license text can be found under lib/dlib/dlib/LICENSE.txt.
The example program is a patched version of the sample program provided
by Live2D (because there's really no point in reinventing the wheel),
and as such, as per the licensing restrictions by Live2D, is still the
property of Live2D.
The patch file (example/demo.patch) contains lines showing additions by
me, as well as deleted lines and unchanged lines for context. The deleted
and unchanged lines are obviously still owned by Live2D. For my additions,
where substantial enough for me to claim ownership, I release them under
the Do What the Fuck You Want to Public License, version 2. The full license
text can be found in LICENSE-WTFPL.txt.
All other files not mentioned above that I have provided in this repo
(i.e. not downloaded and placed here by you), *excluding* the two license
documents and files generated by Git, are also released under the Do What
the Fuck You Want to Public License, version 2, whose full license text
can be found in LICENSE-WTFPL.txt.
In order to use example program, or in any other way use this library
with the Live2D® Cubism SDK, you must agree to the license by Live2D Inc.
Their licenses can be found here:
<https://www.live2d.com/en/download/cubism-sdk/download-native/>.
The library requires a facial landmark dataset, and the one provided by
dlib (which is derived from a dataset owned by Imperial College London)
has been used in development. The license for this dataset excludes
commercial use. You must obtain an alternative dataset if you wish to
use this library commercially.
This is not a license requirement, but if you find my library useful,
I'd love to hear from you! Send me an email at spam(at)adrianiainlam.tk --
replacing "spam" with the name of this repo :).
## Contributions
Contributions welcome! This is only a hobby weekend project so I don't
really have many environments / faces to test it on. Feel free to submit
issues or pull requests on GitHub, or send questions or patches to me
(see my email address above) if you prefer email. Thanks :)
# Frequently Asked Questions
***Important Notes***
- **For Mac users: The macOS 10.14 SDK and later no longer support 32-bit applications. If you want to write 32-bit programs for i386, Xcode 9.4 or earlier is required.** If you are using macOS High Sierra (10.3) and XCode 9 or later, you will discover that the i386 architecture is deprecated. We will be working on creating 64-bit versions of all the programs throughout the text. In the next edition, 64-bit programs will likely replace many of the 32-bit programs. We won't remove the 32-bit programs from GitHub, but the text will use the newer 64-bit programs for the examples. **In the meantime, basic 64-bit templates are in the Appendix B and C folders.**
***Book***
- [Why learn Assembly? Is it even used anymore?](#why)
- [Why x86? Is it too complex for students?](#x86)
- [Why have code for multiple assemblers? Is it confusing?](#assemblers)
- [Why no custom software (e.g., libraries, macros, environments)?](#software)
- [Is the GAS and Clang/LLVM code for macOS or Linux?](#GAS)
- [What if a web link in the book is broken? / Is there a place with all active web links?](#book_links)
- [What if students or instructors want to be able to perform console I/O before Chapter 6 and/or Chapter 10?](../Materials/Console\_IO/)
- [Are there various syntax examples of the Chapter 6 detailed cdecl walkthough?](../Materials/cdecl/README.md)
- [Which provider should I choose for the eBook?](#eBook)
***Teaching***
- [How do the authors pace a semester using the text? / Is there an example course outline?](../Materials/WeeklyOutlineExample/)
- [Chapter Highlights - What is most important, what do students typically struggle with, and what should students not forget for each chapter?](../Materials/ChapterHighlights/README.md)
***Programming***
- [How do I assemble and link on Linux?](#linux)
- [How do I assemble and link on macOS (Terminal)?](#mac)
- [Do I use the .globl or .global directive for *main* in GAS?](#global)
- [How do I use GDB or LLDB for command-line debugging?](../Materials/GDB\_LLDB/)
---
<a id="why"></a>
#### Why learn Assembly? Is it even used anymore?
Yes. Every bit of code must be translated to run on a processor in its native instruction set. Assembly language is necessary for running software on hardware. More explanation can be found in the main text, but here are some abbreviated notes from *Chapter 1*.
- Enhance your understanding of computer operation
- Debugging
- Make informed implementation decisions
- Remove layers of abstraction from tasks
- Areas of software development that rely on intimate knowledge of Assembly, such as programming for
embedded devices, programming device drivers, and system (OS) programming
- Talking directly to the processor...come on, that's cool
---
<a id="x86"></a>
#### Why x86? Is it too complex for students?
Keep in mind that this text is geared toward applied computer science and software engineering students. Based on our experiences teaching the course, students can handle x86 with our approach.
Although x86 is a complex architecture, it is the most common architecture in laptops and desktops, the systems students are using every day. In the book we do not cover the entire instruction set detail by detail (Intel's documentation does that). We cover what is necessary to illustrate the principles of computer architecture and simultaneously give students useful context applicable to the systems they are using and on which they are developing software. We do give introductions to other important architectures in *Chapter 11*.
We are contemplating writing an ARM supplement for the text since x86 and ARM are the two most dominant architectures in the computing market. But ARM isn't so simple either.
---
<a id="assemblers"></a>
#### Why have code for multiple assemblers? Is it confusing?
Importantly, it is a feature: you have choice!
A simple explanation I (Brian) give students for the different syntaxes is that there are different assemblers that can translate x86 Assembly into machine code. Just like there are different compilers that translate C++ code into executable machine code (gcc, clang, Visual C++, Intel C++, etc.). And the resulting code may be similar, but can have differences, pros, and cons. One quick example is methods of stack alignment. Another is that one compiler (e.g., Intel C++) may be tuned to use packed operations whereas other compilers may not.
When I teach the course, I do allow students to choose which environment they want. They can even switch from assignment to assignment. It is one of the main reasons we wrote the book the way we did. We wanted instructors and students to be able to use whatever environment they wanted and have the ability to try different things. I disliked other popular texts only using Visual Studio/MASM and esoteric approaches to some topics. I am sure there are some profs out there that will take my approach and I am sure there are plenty that will choose one syntax and require all work in that syntax, as it would simplify grading and such. I do my work on a Mac, so I have Xcode for GAS (Clang/LLVM) on macOS, a Windows VM for using Visual Studio/MASM, and an Ubuntu VM that I use for Linux/NASM code. As an instructor, I am used to the different syntaxes so switching between them and grading work in them is not so problematic. But, one syntax can be chosen and that thread followed through the text. The point is you have choice.
In the classroom, I will go back and forth between syntax examples, depending on what points I am trying to make and things to watch for in each syntax. Most of my students use Visual Studio/MASM on Windows, quite a few use Linux/NASM, and some use Xcode on macOS. I try to showcase a little bit of each. Some assignments, like those in *Chapter 3*, especially 3.3, have students spend a little time with an assembler that they probably won’t be using. I think a little translation knowledge between syntaxes is useful because if students are looking online at examples, they may find help or some code, but it may be written in a syntax they are not using, and they should be able to translate the code (with some documented guidance such as [Appendix A](../content/Assembly_App_A_GitHub.pdf)). My students get very familiar with whatever syntax/environment they have chosen, and then also get some tangential knowledge in the others.
I do have students answer the text questions as-is, again, building that cross-syntax knowledge, but having students answer some of the short answer and T/F questions in a single syntax is certainly a possible approach.
---
<a id="software"></a>
#### Why no custom software (e.g., libraries, macros, environments)?
Importantly, it is not needed and would be non-standard. One example is getting students started with Input/Output, which we address here: [Materials/Console_IO](../Materials/Console\_IO/)
We wanted people to be able to use Mac, Windows, or Linux standalone or simultaneously. Using custom software would lock us into a single environment or result in an exorbitant amount of code.
Some Assembly texts have 3rd-party (custom) macros and software to help simplify certain tasks, which is fine in that sense, it may help you get certain things done a little more quickly or earlier. However, students will be learning about the building blocks necessary to do tasks like I/O anyway, so it really depends on how you arrange the blocks.
We avoid non-standard environments and software because it is not how you would actually use Assembly in practice. For example, we use GitHub to get our code out to the world, because it is what modern developers use. We use system calls, standard I/O libraries, and OS APIs to do tasks like I/O. We use the latest operating systems and development environments because they all inherently have the ability to do Assembly programming without any extra software (see [Appendix B: Environment Setup](../content/Assembly_App_B_GitHub.pdf)).
We want students to see things in a way they are used professionally. Such an approach helps understanding well beyond Assembly programming.
---
<a id="GAS"></a>
#### Is the GAS and Clang/LLVM code for macOS or Linux?
We have provided code (.s files) for GAS, Clang/LLVM for both macOS and Linux. The GAS code shown through the book is for macOS (since we use NASM on Linux through the book), but if you are using GAS on Linux, code files are provided for you in the repository. Typically, the only difference is the exit routine. (Chapter 10 programs are more different because of system calls).
---
<a id="book_links"></a>
#### What if a web link in the book is broken? / Is there a place with all active web links?
Yes, try here -> [Book_Links.md](../Book_Links.md), which is in the file list at the top of the README. We would also appreciate if you report any broken links.
---
<a id="eBook"></a>
#### Which provider should I choose for the eBook?
Prospect Press provides a page that [compares](https://prospectpressvt.com/ordering/direct-student-orders/) the purchasing options, but we have also summarized it here. To go directly to our eBook page at the providers, just click the links below.
| Question | [Redshelf.com](https://www.redshelf.com/book/742712/assembly-programming-and-computer-architecture-for-software-engineers-742712-9781943153312-brian-r-hall-and-kevin-j-slonka) | [Vital Source](https://www.vitalsource.com/products/assembly-programming-and-computer-architecture-for-brian-r-hall-and-kevin-j-v9781943153312) |
|----|----|----|
| Online or Download? | Online only | Online AND Download |
| Duration? | Permanent online access | 365 day access for online, perpetual download |
| Returnable? | Yes, within 14 days of purchase | No |
| Can I buy this in the campus bookstore? | Yes, at selected college and university bookstores. Check if your bookstore sells RedShelf ebooks. | No, not available through campus bookstores. |
|Study Tools? | Built-in study tools include highlights, study guides, annotations, definitions, flashcards, and collaboration. | Notes and highlights (synced across devices). <br>Share mark-ups with your professor or classmates—and subscribe to theirs, too. Review Mode, which allows you to look at your notes and highlights in context with your eBook without the distraction of full-reading mode. |
| Screenshot? | [Redshelf screenshot](../content/Redshelf.pdf) | [Vital Source screenshot](../content/VitalSource.pdf) |
---
<a id="linux"></a>
#### How do I assemble and link on Linux?
Here are some example `nasm` and `as` commands to assemble and link in Linux. The GAS examples in the book assume you are using Xcode on macOS, which uses Clang/LLVM. So, if you are using GAS on Linux, remember to change the exit routine.
| Linux | NASM 32-bit | NASM 64-bit |
|--------------|-----------|-----------|
| Assemble | `nasm -f elf32 prog.asm` | `nasm -f elf64 prog.asm` |
| Link | `ld -e _main -melf_i386 -o prog prog.o` | `ld -e _main -melf_x86_64 -o prog prog.o` |
| Exit Routine | `mov eax, 1`<br>`mov ebx, 0`<br>`int 80h` | `mov rax, 60`<br>`xor rdi, rdi`<br>`syscall` |
| Linux | GAS 32-bit | GAS 64-bit |
|--------------|-----------|-----------|
| Assemble | `as --32 -o prog.o prog.s` | `as --64 -o prog.o prog.s` |
| Link | `ld -e _main -melf_i386 -o prog prog.o` | `ld -e _main -melf_x86_64 -o prog prog.o` |
| Exit Routine | `mov $1, %eax`<br>`mov $0, %ebx`<br>`int $0x80` | `mov $60, %rax`<br>`xor %rdi, %rdi`<br>`syscall` |
---
<a id="mac"></a>
#### How do I assemble and link on macOS (Terminal)?
Here are some example `nasm` commands to assemble and link using NASM on macOS in Terminal. Just set the minimum OS version to whatever version you wish based on your install.
| macOS | NASM 32-bit | NASM 64-bit |
|--------------|-----------|-----------|
| Assemble | `nasm -f macho -o prog.o prog.asm` | `nasm -f macho64 -o prog.o prog.asm` |
| Link | `ld -macosx_version_min 10.12 -o prog prog.o -lSystem` | <- same |
| Exit Routine | `push DWORD 0`<br>`sub esp, 4`<br>`mov eax, 1`<br>`int 80h` | `mov rax, 2000001h`<br>`xor rdi, rdi`<br>`syscall` |
---
<a id="global"></a>
#### Do I use the .globl or .global directive for *main* in GAS?
Either. In the past, GAS used the .globl directive (without the *a*), but eventually updated the assembler to accept .global as well. We use .globl in the GAS examples and programs in the book.
---
| 86.2 | 1,028 | 0.742779 | eng_Latn | 0.998075 |
---
layout: post
title: Lifecycle of Fragment in Android
date: '2015-10-25T00:24:00.001-07:00'
author: schoolhompy
categories:
- MOBILE DEV
tags:
- android
modified_time: '2017-06-25T02:22:53.573-07:00'
blogger_id: tag:blogger.com,1999:blog-4954243635432022205.post-5466137672382424098
blogger_orig_url: https://yunhos.blogspot.com/2015/10/lifecycle-of-fragment-in-android_25.html
---
http://codemeaning.com/what-is-fragment/
# 2021-2-OSSProj-DoitDoit-2





**"TTOTRIS"**, a Tetris game built with pygame
(original source: [PINTRIS](https://github.com/CSID-DGU/2021-1-OSSPC-Pint-9))
**Team 2: DoitDoit**
**Team leader**: [고명섭](https://github.com/tead1234)
**Members**: [고다희](https://github.com/DaheeKo), [김수빈](https://github.com/sb0702)
## TTOTRIS

## How to run
1. Install python, pygame, and pymysql
```
sudo apt-get update
sudo apt install python3.8
pip3 install pygame==2.0.2
pip3 install pymysql
```
2. Clone the repository and run
```
git clone https://github.com/CSID-DGU/2021-2-OSSProj-DoitDoit-2.git
cd Ttotris
python3 Ttotris.py
```
## Major changes
* Improvements to existing features
  * Implemented Hard Mode obstacles
  * Changed the Fever Time mechanic
  * Implemented per-player Soft Drop in PvP Mode
* New features
  * Login feature
  * Game records integrated with AWS
  * Per-mode starting-speed adjustment
  * Added Item Mode
## Controls
* Key controls

* Item descriptions

## Mode descriptions
(mode images)
## Credits:
- __Sounds__ : https://opengameart.org/content/happy-arcade-tune <br> https://opengameart.org/content/4-sci-fi-menu-sounds <br> https://opengameart.org/content/elemental-spell
- __Images__ :
- __Item__ : https://www.flaticon.com/free-icon/earthquake_1536249 <br> https://www.flaticon.com/premium-icon/dynamite_2280459 <br> https://www.flaticon.com/free-icon/undo_1574360 <br> https://www.flaticon.com/free-icon/double-arrow_1573978 <br> All designed by Freepik from Flaticon
- __Hard__ : https://www.pngegg.com/ko/png-zmaqj
<br>
## References:
- http://www.pygame.org/docs
- https://github.com/CSID-DGU/2019-2-OSSPC-OSO_OSEYO-5/tree/master/tetris
- https://github.com/CSID-DGU/2021-1-OSSPC-BINSU-7/tree/main/PBSPYTRIS
| 25.558442 | 285 | 0.70376 | yue_Hant | 0.559392 |
2f1643bbbdf6f6bf43c9e7b861e04cd44e3be449 | 39,886 | md | Markdown | aspnetcore/tutorials/first-mongo-app.md | lbragaglia/AspNetCore.Docs.it-it | 9df74963c696987cf17856070d6265b483421591 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | aspnetcore/tutorials/first-mongo-app.md | lbragaglia/AspNetCore.Docs.it-it | 9df74963c696987cf17856070d6265b483421591 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | aspnetcore/tutorials/first-mongo-app.md | lbragaglia/AspNetCore.Docs.it-it | 9df74963c696987cf17856070d6265b483421591 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Create a web API with ASP.NET Core and MongoDB
author: prkhandelwal
description: This tutorial demonstrates how to build an ASP.NET Core web API with a MongoDB NoSQL database.
monikerRange: '>= aspnetcore-2.1'
ms.author: scaddie
ms.custom: mvc, seodec18
ms.date: 08/17/2019
uid: tutorials/first-mongo-app
ms.openlocfilehash: 42c0efcd914eaa54134827cdf3bd6bd599d512b2
ms.sourcegitcommit: 77c8be22d5e88dd710f42c739748869f198865dd
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 11/01/2019
ms.locfileid: "73427010"
---
# <a name="create-a-web-api-with-aspnet-core-and-mongodb"></a>Create a web API with ASP.NET Core and MongoDB
By [Pratik Khandelwal](https://twitter.com/K2Prk) and [Scott Addie](https://twitter.com/Scott_Addie)
::: moniker range=">= aspnetcore-3.0"
This tutorial creates a web API that performs Create, Read, Update, and Delete (CRUD) operations on a [MongoDB](https://www.mongodb.com/what-is-mongodb) NoSQL database.
In this tutorial, you learn how to:
> [!div class="checklist"]
> * Configure MongoDB
> * Create a MongoDB database
> * Define a MongoDB collection and schema
> * Perform MongoDB CRUD operations from a web API
> * Customize JSON serialization
[View or download sample code](https://github.com/aspnet/AspNetCore.Docs/tree/master/aspnetcore/tutorials/first-mongo-app/samples) ([how to download](xref:index#how-to-download-a-sample))
## <a name="prerequisites"></a>Prerequisites
# <a name="visual-studiotabvisual-studio"></a>[Visual Studio](#tab/visual-studio)
* [.NET Core SDK 3.0 or later](https://www.microsoft.com/net/download/all)
* [Visual Studio 2019](https://visualstudio.microsoft.com/downloads/?utm_medium=microsoft&utm_source=docs.microsoft.com&utm_campaign=inline+link&utm_content=download+vs2019) with the **ASP.NET and web development** workload
* [MongoDB](https://docs.mongodb.com/manual/tutorial/install-mongodb-on-windows/)
# <a name="visual-studio-codetabvisual-studio-code"></a>[Visual Studio Code](#tab/visual-studio-code)
* [.NET Core SDK 3.0 or later](https://www.microsoft.com/net/download/all)
* [Visual Studio Code](https://code.visualstudio.com/download)
* [C# per Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=ms-vscode.csharp)
* [MongoDB](https://docs.mongodb.com/manual/administration/install-community/)
# <a name="visual-studio-for-mactabvisual-studio-mac"></a>[Visual Studio per Mac](#tab/visual-studio-mac)
* [.NET Core SDK 3.0 o versione successiva](https://www.microsoft.com/net/download/all)
* [Visual Studio per Mac versione 7.7 o successiva](https://visualstudio.microsoft.com/downloads/)
* [MongoDB](https://docs.mongodb.com/manual/tutorial/install-mongodb-on-os-x/)
---
## <a name="configure-mongodb"></a>Configurare MongoDB
Se si usa Windows, MongoDB è installato in *C:\\Programmi\\MongoDB* per impostazione predefinita. Aggiungere *C:\\Programmi\\MongoDB\\Server\\\<numero_versione>\\bin* alla variabile di ambiente `Path`. Questa modifica consente l'accesso MongoDB da qualsiasi posizione nel computer di sviluppo.
Usare la shell mongo nelle procedure seguenti per creare un database, creare le raccolte e archiviare i documenti. Per altre informazioni sui comandi della shell mongo, vedere [Working with the mongo Shell](https://docs.mongodb.com/manual/mongo/#working-with-the-mongo-shell) (Utilizzo della shell mongo).
1. Scegliere una directory nel computer di sviluppo per archiviare i dati. Ad esempio, *C:\\BooksData* in Windows. Creare la directory se non esiste. La shell mongo non consente di creare nuove directory.
1. Aprire una shell dei comandi. Eseguire il comando seguente per connettersi a MongoDB sulla porta predefinita 27017. Ricordare di sostituire `<data_directory_path>` con la directory scelta nel passaggio precedente.
```console
mongod --dbpath <data_directory_path>
```
1. Open another command shell instance. Connect to the default test database by running the following command:
```console
mongo
```
1. Run the following in a command shell:
```console
use BookstoreDb
```
If it doesn't already exist, a database named *BookstoreDb* is created. If the database does exist, its connection is opened for transactions.
1. Create a `Books` collection using the following command:
```console
db.createCollection('Books')
```
The following result is displayed:
```console
{ "ok" : 1 }
```
1. Define a schema for the `Books` collection and insert two documents using the following command:
```console
db.Books.insertMany([{'Name':'Design Patterns','Price':54.93,'Category':'Computers','Author':'Ralph Johnson'}, {'Name':'Clean Code','Price':43.15,'Category':'Computers','Author':'Robert C. Martin'}])
```
The following result is displayed:
```console
{
"acknowledged" : true,
"insertedIds" : [
ObjectId("5bfd996f7b8e48dc15ff215d"),
ObjectId("5bfd996f7b8e48dc15ff215e")
]
}
```
> [!NOTE]
> The IDs shown in this article will not match the IDs when you run this sample.
1. View the documents in the database using the following command:
```console
db.Books.find({}).pretty()
```
The following result is displayed:
```console
{
"_id" : ObjectId("5bfd996f7b8e48dc15ff215d"),
"Name" : "Design Patterns",
"Price" : 54.93,
"Category" : "Computers",
"Author" : "Ralph Johnson"
}
{
"_id" : ObjectId("5bfd996f7b8e48dc15ff215e"),
"Name" : "Clean Code",
"Price" : 43.15,
"Category" : "Computers",
"Author" : "Robert C. Martin"
}
```
The schema adds an automatically generated `_id` property of type `ObjectId` for each document.
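The value of `_id` is a 12-byte ObjectId, conventionally written as a 24-character hexadecimal string. That is the same string form the entity model stores in its `Id` property later in this tutorial. The following standalone sketch (plain JavaScript, for illustration only; it is not part of the tutorial's code) checks that string form:

```js
// Illustration only: an ObjectId serializes to 24 hex characters (12 bytes).
const looksLikeObjectId = (s) => /^[0-9a-fA-F]{24}$/.test(s);

console.log(looksLikeObjectId("5bfd996f7b8e48dc15ff215d")); // true
console.log(looksLikeObjectId("not-an-object-id"));         // false
```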
The database is ready. You can start creating the ASP.NET Core web API.
## <a name="create-the-aspnet-core-web-api-project"></a>Create the ASP.NET Core web API project
# <a name="visual-studiotabvisual-studio"></a>[Visual Studio](#tab/visual-studio)
1. Go to **File** > **New** > **Project**.
1. Select the **ASP.NET Core Web Application** project type, and select **Next**.
1. Name the project *BooksApi*, and select **Create**.
1. Select the **.NET Core** target framework and **ASP.NET Core 3.0**. Select the **API** project template, and select **Create**.
1. Visit the [NuGet Gallery: MongoDB.Driver](https://www.nuget.org/packages/MongoDB.Driver/) to determine the latest stable version of the .NET driver for MongoDB. In the **Package Manager Console** window, navigate to the project root. Run the following command to install the .NET driver for MongoDB:
```powershell
Install-Package MongoDB.Driver -Version {VERSION}
```
# <a name="visual-studio-codetabvisual-studio-code"></a>[Visual Studio Code](#tab/visual-studio-code)
1. Run the following commands in a command shell:
```dotnetcli
dotnet new webapi -o BooksApi
code BooksApi
```
A new ASP.NET Core web API project targeting .NET Core is generated and opened in Visual Studio Code.
1. After the status bar's OmniSharp flame icon turns green, a dialog asks **Required assets to build and debug are missing from 'BooksApi'. Add them?** Select **Yes**.
1. Visit the [NuGet Gallery: MongoDB.Driver](https://www.nuget.org/packages/MongoDB.Driver/) to determine the latest stable version of the .NET driver for MongoDB. Open **Integrated Terminal** and navigate to the project root. Run the following command to install the .NET driver for MongoDB:
```dotnetcli
dotnet add BooksApi.csproj package MongoDB.Driver -v {VERSION}
```
# <a name="visual-studio-for-mactabvisual-studio-mac"></a>[Visual Studio per Mac](#tab/visual-studio-mac)
1. Passare a **File** > **Nuova soluzione** > **.NET Core** > **App**.
1. Selezionare il modello di progetto C# **API Web ASP.NET Core** e selezionare **Avanti**.
1. Selezionare **.NET Core 3.0** nell'elenco a discesa **Framework di destinazione** e selezionare **Avanti**.
1. Immettere *BooksApi* per **Nome progetto** e selezionare **Crea**.
1. Nel riquadro **Soluzione** fare clic con il pulsante destro del mouse sul nodo **Dipendenze** del progetto e scegliere **Aggiungi pacchetti**.
1. Immettere *MongoDB.Driver* nella casella di ricerca, selezionare il pacchetto *MongoDB.Driver* e quindi **Aggiungi pacchetto**.
1. Selezionare il pulsante **Accetta** nella finestra di dialogo **Accettazione della licenza**.
---
## <a name="add-an-entity-model"></a>Add an entity model
1. Add a *Models* directory to the project root.
1. Add a `Book` class to the *Models* directory with the following code:
```csharp
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace BooksApi.Models
{
    public class Book
    {
        [BsonId]
        [BsonRepresentation(BsonType.ObjectId)]
        public string Id { get; set; }

        [BsonElement("Name")]
        public string BookName { get; set; }

        public decimal Price { get; set; }

        public string Category { get; set; }

        public string Author { get; set; }
    }
}
```
In the preceding class, the `Id` property:
* Is required for mapping the Common Language Runtime (CLR) object to the MongoDB collection.
* Is annotated with [[BsonId]](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_Serialization_Attributes_BsonIdAttribute.htm) to designate this property as the document's primary key.
* Is annotated with [[BsonRepresentation(BsonType.ObjectId)]](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_Serialization_Attributes_BsonRepresentationAttribute.htm) to allow passing the parameter as type `string` instead of an [ObjectId](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_ObjectId.htm) structure. Mongo handles the conversion from `string` to `ObjectId`.
The `BookName` property is annotated with the [[BsonElement]](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_Serialization_Attributes_BsonElementAttribute.htm) attribute. The attribute's value of `Name` represents the property name in the MongoDB collection.
## <a name="add-a-configuration-model"></a>Add a configuration model
1. Add the following database configuration values to *appsettings.json*:
[!code-json[](first-mongo-app/samples/3.x/SampleApp/appsettings.json?highlight=2-6)]
1. Add a *BookstoreDatabaseSettings.cs* file to the *Models* directory with the following code:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Models/BookstoreDatabaseSettings.cs)]
The preceding `BookstoreDatabaseSettings` class is used to store the *appsettings.json* file's `BookstoreDatabaseSettings` property values. The JSON and C# property names are named identically to ease the mapping process.
1. Add the following highlighted code to `Startup.ConfigureServices`:
[!code-csharp[](first-mongo-app/samples_snapshot/3.x/SampleApp/Startup.ConfigureServices.AddDbSettings.cs?highlight=3-7)]
In the preceding code:
* The configuration instance to which the *appsettings.json* file's `BookstoreDatabaseSettings` section binds is registered in the Dependency Injection (DI) container. For example, a `BookstoreDatabaseSettings` object's `ConnectionString` property is populated with the `BookstoreDatabaseSettings:ConnectionString` property in *appsettings.json*.
* The `IBookstoreDatabaseSettings` interface is registered in DI with a singleton [service lifetime](xref:fundamentals/dependency-injection#service-lifetimes). When injected, the interface instance resolves to a `BookstoreDatabaseSettings` object.
1. Add the following code to the top of *Startup.cs* to resolve the `BookstoreDatabaseSettings` and `IBookstoreDatabaseSettings` references:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Startup.cs?name=snippet_UsingBooksApiModels)]
## <a name="add-a-crud-operations-service"></a>Add a CRUD operations service
1. Add a *Services* directory to the project root.
1. Add a `BookService` class to the *Services* directory with the following code:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Services/BookService.cs?name=snippet_BookServiceClass)]
In the preceding code, an `IBookstoreDatabaseSettings` instance is retrieved from DI via constructor injection. This technique provides access to the *appsettings.json* configuration values that were added in the [Add a configuration model](#add-a-configuration-model) section.
1. Add the following highlighted code to `Startup.ConfigureServices`:
[!code-csharp[](first-mongo-app/samples_snapshot/3.x/SampleApp/Startup.ConfigureServices.AddSingletonService.cs?highlight=9)]
In the preceding code, the `BookService` class is registered with DI to support constructor injection in consuming classes. The singleton service lifetime is most appropriate because `BookService` takes a direct dependency on `MongoClient`. Per the official [Mongo Client reuse guidelines](https://mongodb.github.io/mongo-csharp-driver/2.8/reference/driver/connecting/#re-use), `MongoClient` should be registered in DI with a singleton service lifetime.
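Taken together, the registrations in `Startup.ConfigureServices` follow the shape sketched below. This is an illustrative outline only, assuming the `BookstoreDatabaseSettings`, `IBookstoreDatabaseSettings`, and `BookService` names used in this tutorial:

```csharp
public void ConfigureServices(IServiceCollection services)
{
    // Bind the "BookstoreDatabaseSettings" section of appsettings.json
    // to the strongly typed settings class.
    services.Configure<BookstoreDatabaseSettings>(
        Configuration.GetSection(nameof(BookstoreDatabaseSettings)));

    // Resolve the bound settings object whenever the interface is injected.
    services.AddSingleton<IBookstoreDatabaseSettings>(sp =>
        sp.GetRequiredService<IOptions<BookstoreDatabaseSettings>>().Value);

    // One BookService (and therefore one MongoClient) for the app's lifetime.
    services.AddSingleton<BookService>();

    services.AddControllers();
}
```

Both registrations use a singleton lifetime on purpose: the settings object is immutable after binding, and the `MongoClient` held by `BookService` is designed to be shared.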
1. Add the following code to the top of *Startup.cs* to resolve the `BookService` reference:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Startup.cs?name=snippet_UsingBooksApiServices)]
The `BookService` class uses the following `MongoDB.Driver` members to perform CRUD operations against the database:
* [MongoClient](https://api.mongodb.com/csharp/current/html/T_MongoDB_Driver_MongoClient.htm) – Reads the server instance for performing database operations. The constructor of this class is provided the MongoDB connection string:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Services/BookService.cs?name=snippet_BookServiceConstructor&highlight=3)]
* [IMongoDatabase](https://api.mongodb.com/csharp/current/html/T_MongoDB_Driver_IMongoDatabase.htm) – Represents the Mongo database for performing operations. This tutorial uses the generic [GetCollection\<TDocument>(collection)](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoDatabase_GetCollection__1.htm) method on the interface to gain access to data in a specific collection. Perform CRUD operations against the collection after this method is called. In the `GetCollection<TDocument>(collection)` method call:
* `collection` represents the collection name.
* `TDocument` represents the CLR object type stored in the collection.
`GetCollection<TDocument>(collection)` returns a [MongoCollection](https://api.mongodb.com/csharp/current/html/T_MongoDB_Driver_MongoCollection.htm) object representing the collection. In this tutorial, the following methods are invoked on the collection:
* [DeleteOne](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollection_1_DeleteOne.htm) – Deletes a single document matching the provided search criteria.
* [Find\<TDocument>](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollectionExtensions_Find__1_1.htm) – Returns all documents in the collection matching the provided search criteria.
* [InsertOne](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollection_1_InsertOne.htm) – Inserts the provided object as a new document in the collection.
* [ReplaceOne](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollection_1_ReplaceOne.htm) – Replaces the single document matching the provided search criteria with the provided object.
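The collection methods above can be sketched against the `Books` collection as follows. This is a hedged illustration only: `database` is assumed to be an `IMongoDatabase` obtained from a `MongoClient`, and `id`, `newBook`, and `updatedBook` are placeholder values:

```csharp
IMongoCollection<Book> books = database.GetCollection<Book>("Books");

List<Book> all = books.Find(book => true).ToList();          // read: every document
Book single = books.Find(b => b.Id == id).FirstOrDefault();  // read: one document by Id
books.InsertOne(newBook);                                    // create
books.ReplaceOne(b => b.Id == id, updatedBook);              // update (full document replace)
books.DeleteOne(b => b.Id == id);                            // delete
```

Each call takes a filter expression that the driver translates into a MongoDB query against the collection.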
## <a name="add-a-controller"></a>Add a controller
Add a `BooksController` class to the *Controllers* directory with the following code:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Controllers/BooksController.cs)]
The preceding web API controller:
* Uses the `BookService` class to perform CRUD operations.
* Contains action methods to support GET, POST, PUT, and DELETE HTTP requests.
* Calls <xref:System.Web.Http.ApiController.CreatedAtRoute*> in the `Create` action method to return an [HTTP 201](https://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html) response. Status code 201 is the standard response for an HTTP POST method that creates a new resource on the server. `CreatedAtRoute` also adds a `Location` header to the response. The `Location` header specifies the URI of the newly created book.
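As a sketch of the `CreatedAtRoute` pattern described above (illustrative only — the `GetBook` route name and `_bookService` field are assumptions based on this tutorial's sample):

```csharp
[HttpPost]
public ActionResult<Book> Create(Book book)
{
    _bookService.Create(book);

    // Returns 201 Created with a Location header pointing at the
    // "GetBook" route for the newly assigned document Id.
    return CreatedAtRoute("GetBook", new { id = book.Id.ToString() }, book);
}
```

The route name passed to `CreatedAtRoute` must match the name given to the single-item `Get` action, so the generated `Location` URI resolves back to the new resource.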
## <a name="test-the-web-api"></a>Test the web API
1. Build and run the app.
1. Navigate to `http://localhost:<port>/api/books` to test the controller's parameterless `Get` action method. The following JSON response is displayed:
```json
[
  {
    "id":"5bfd996f7b8e48dc15ff215d",
    "bookName":"Design Patterns",
    "price":54.93,
    "category":"Computers",
    "author":"Ralph Johnson"
  },
  {
    "id":"5bfd996f7b8e48dc15ff215e",
    "bookName":"Clean Code",
    "price":43.15,
    "category":"Computers",
    "author":"Robert C. Martin"
  }
]
```
1. Navigate to `http://localhost:<port>/api/books/{id here}` to test the controller's overloaded `Get` action method. The following JSON response is displayed:
```json
{
  "id":"{ID}",
  "bookName":"Clean Code",
  "price":43.15,
  "category":"Computers",
  "author":"Robert C. Martin"
}
```
## <a name="configure-json-serialization-options"></a>Configure JSON serialization options
There are two details to change about the JSON responses returned in the [Test the web API](#test-the-web-api) section:
* The property names' default camel casing should be changed to match the Pascal casing of the CLR object's property names.
* The `bookName` property should be returned as `Name`.
To satisfy the preceding requirements, make the following changes:
1. JSON.NET has been removed from the ASP.NET Core shared framework. Add a package reference to [Microsoft.AspNetCore.Mvc.NewtonsoftJson](https://nuget.org/packages/Microsoft.AspNetCore.Mvc.NewtonsoftJson).
1. In `Startup.ConfigureServices`, chain the following highlighted code on to the `AddMvc` method call:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Startup.cs?name=snippet_ConfigureServices&highlight=12)]
With the preceding change, property names in the web API's serialized JSON response match their corresponding property names in the CLR object type. For example, the `Book` class's `Author` property serializes as `Author`.
1. In *Models/Book.cs*, annotate the `BookName` property with the following [[JsonProperty]](https://www.newtonsoft.com/json/help/html/T_Newtonsoft_Json_JsonPropertyAttribute.htm) attribute:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Models/Book.cs?name=snippet_BookNameProperty&highlight=2)]
The `[JsonProperty]` attribute's value of `Name` represents the property name in the web API's serialized JSON response.
1. Add the following code to the top of *Models/Book.cs* to resolve the `[JsonProperty]` attribute reference:
[!code-csharp[](first-mongo-app/samples/3.x/SampleApp/Models/Book.cs?name=snippet_NewtonsoftJsonImport)]
1. Repeat the steps defined in the [Test the web API](#test-the-web-api) section. Notice the difference in JSON property names.
::: moniker-end
::: moniker range="< aspnetcore-3.0"
This tutorial creates a web API that performs Create, Read, Update, and Delete (CRUD) operations on a [MongoDB](https://www.mongodb.com/what-is-mongodb) NoSQL database.
In this tutorial, you learn how to:
> [!div class="checklist"]
> * Configure MongoDB
> * Create a MongoDB database
> * Define a MongoDB collection and schema
> * Perform MongoDB CRUD operations from a web API
> * Customize JSON serialization
[View or download sample code](https://github.com/aspnet/AspNetCore.Docs/tree/master/aspnetcore/tutorials/first-mongo-app/samples) ([how to download](xref:index#how-to-download-a-sample))
## <a name="prerequisites"></a>Prerequisites
# <a name="visual-studiotabvisual-studio"></a>[Visual Studio](#tab/visual-studio)
* [.NET Core SDK 2.2](https://www.microsoft.com/net/download/all)
* [Visual Studio 2019](https://visualstudio.microsoft.com/downloads/?utm_medium=microsoft&utm_source=docs.microsoft.com&utm_campaign=inline+link&utm_content=download+vs2019) with the **ASP.NET and web development** workload
* [MongoDB](https://docs.mongodb.com/manual/tutorial/install-mongodb-on-windows/)
# <a name="visual-studio-codetabvisual-studio-code"></a>[Visual Studio Code](#tab/visual-studio-code)
* [.NET Core SDK 2.2](https://www.microsoft.com/net/download/all)
* [Visual Studio Code](https://code.visualstudio.com/download)
* [C# for Visual Studio Code](https://marketplace.visualstudio.com/items?itemName=ms-vscode.csharp)
* [MongoDB](https://docs.mongodb.com/manual/administration/install-community/)
# <a name="visual-studio-for-mactabvisual-studio-mac"></a>[Visual Studio for Mac](#tab/visual-studio-mac)
* [.NET Core SDK 2.2](https://www.microsoft.com/net/download/all)
* [Visual Studio for Mac version 7.7 or later](https://visualstudio.microsoft.com/downloads/)
* [MongoDB](https://docs.mongodb.com/manual/tutorial/install-mongodb-on-os-x/)
---
## <a name="configure-mongodb"></a>Configure MongoDB
If using Windows, MongoDB is installed at *C:\\Program Files\\MongoDB* by default. Add *C:\\Program Files\\MongoDB\\Server\\\<version_number>\\bin* to the `Path` environment variable. This change enables MongoDB access from anywhere on your development machine.
Use the mongo Shell in the following steps to create a database, make collections, and store documents. For more information on mongo Shell commands, see [Working with the mongo Shell](https://docs.mongodb.com/manual/mongo/#working-with-the-mongo-shell).
1. Choose a directory on your development machine for storing the data. For example, *C:\\BooksData* on Windows. Create the directory if it doesn't exist. The mongo Shell doesn't create new directories.
1. Open a command shell. Run the following command to connect to MongoDB on default port 27017. Remember to replace `<data_directory_path>` with the directory you chose in the previous step.
```console
mongod --dbpath <data_directory_path>
```
1. Open another command shell instance. Connect to the default test database by running the following command:
```console
mongo
```
1. Run the following in a command shell:
```console
use BookstoreDb
```
If it doesn't already exist, a database named *BookstoreDb* is created. If the database does exist, its connection is opened for transactions.
1. Create a `Books` collection using the following command:
```console
db.createCollection('Books')
```
The following result is displayed:
```console
{ "ok" : 1 }
```
1. Define a schema for the `Books` collection and insert two documents using the following command:
```console
db.Books.insertMany([{'Name':'Design Patterns','Price':54.93,'Category':'Computers','Author':'Ralph Johnson'}, {'Name':'Clean Code','Price':43.15,'Category':'Computers','Author':'Robert C. Martin'}])
```
The following result is displayed:
```console
{
  "acknowledged" : true,
  "insertedIds" : [
    ObjectId("5bfd996f7b8e48dc15ff215d"),
    ObjectId("5bfd996f7b8e48dc15ff215e")
  ]
}
```
> [!NOTE]
> The ID shown in this article will not match the IDs when you run this sample.
1. View the documents in the database using the following command:
```console
db.Books.find({}).pretty()
```
The following result is displayed:
```console
{
  "_id" : ObjectId("5bfd996f7b8e48dc15ff215d"),
  "Name" : "Design Patterns",
  "Price" : 54.93,
  "Category" : "Computers",
  "Author" : "Ralph Johnson"
}
{
  "_id" : ObjectId("5bfd996f7b8e48dc15ff215e"),
  "Name" : "Clean Code",
  "Price" : 43.15,
  "Category" : "Computers",
  "Author" : "Robert C. Martin"
}
```
The schema adds an autogenerated `_id` property of type `ObjectId` for each document.
The database is ready. You can start creating the ASP.NET Core web API.
## <a name="create-the-aspnet-core-web-api-project"></a>Create the ASP.NET Core web API project
# <a name="visual-studiotabvisual-studio"></a>[Visual Studio](#tab/visual-studio)
1. Go to **File** > **New** > **Project**.
1. Select the **ASP.NET Core Web Application** project type, and select **Next**.
1. Name the project *BooksApi*, and select **Create**.
1. Select the **.NET Core** target framework and **ASP.NET Core 2.2**. Select the **API** project template, and select **Create**.
1. Visit the [NuGet Gallery: MongoDB.Driver](https://www.nuget.org/packages/MongoDB.Driver/) to determine the latest stable version of the .NET driver for MongoDB. In the **Package Manager Console** window, navigate to the project root. Run the following command to install the .NET driver for MongoDB:
```powershell
Install-Package MongoDB.Driver -Version {VERSION}
```
# <a name="visual-studio-codetabvisual-studio-code"></a>[Visual Studio Code](#tab/visual-studio-code)
1. Run the following commands in a command shell:
```dotnetcli
dotnet new webapi -o BooksApi
code BooksApi
```
A new ASP.NET Core web API project targeting .NET Core is generated and opened in Visual Studio Code.
1. After the status bar's OmniSharp flame icon turns green, a dialog asks **Required assets to build and debug are missing from 'BooksApi'. Add them?**. Select **Yes**.
1. Visit the [NuGet Gallery: MongoDB.Driver](https://www.nuget.org/packages/MongoDB.Driver/) to determine the latest stable version of the .NET driver for MongoDB. Open **Integrated Terminal** and navigate to the project root. Run the following command to install the .NET driver for MongoDB:
```dotnetcli
dotnet add BooksApi.csproj package MongoDB.Driver -v {VERSION}
```
# <a name="visual-studio-for-mactabvisual-studio-mac"></a>[Visual Studio for Mac](#tab/visual-studio-mac)
1. Go to **File** > **New Solution** > **.NET Core** > **App**.
1. Select the **ASP.NET Core Web API** C# project template, and select **Next**.
1. Select **.NET Core 2.2** from the **Target Framework** drop-down list, and select **Next**.
1. Enter *BooksApi* for the **Project Name**, and select **Create**.
1. In the **Solution** pad, right-click the project's **Dependencies** node and select **Add Packages**.
1. Enter *MongoDB.Driver* in the search box, select the *MongoDB.Driver* package, and select **Add Package**.
1. Select the **Accept** button in the **License Acceptance** dialog.
---
## <a name="add-an-entity-model"></a>Add an entity model
1. Add a *Models* directory to the project root.
1. Add a `Book` class to the *Models* directory with the following code:
```csharp
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace BooksApi.Models
{
    public class Book
    {
        [BsonId]
        [BsonRepresentation(BsonType.ObjectId)]
        public string Id { get; set; }

        [BsonElement("Name")]
        public string BookName { get; set; }

        public decimal Price { get; set; }

        public string Category { get; set; }

        public string Author { get; set; }
    }
}
```
In the preceding class, the `Id` property:
* Is required for mapping the Common Language Runtime (CLR) object to the MongoDB collection.
* Is annotated with [[BsonId]](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_Serialization_Attributes_BsonIdAttribute.htm) to designate this property as the document's primary key.
* Is annotated with [[BsonRepresentation(BsonType.ObjectId)]](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_Serialization_Attributes_BsonRepresentationAttribute.htm) to allow passing the parameter as type `string` instead of an [ObjectId](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_ObjectId.htm) structure. Mongo handles the conversion from `string` to `ObjectId`.
The `BookName` property is annotated with the [[BsonElement]](https://api.mongodb.com/csharp/current/html/T_MongoDB_Bson_Serialization_Attributes_BsonElementAttribute.htm) attribute. The attribute's value of `Name` represents the property name in the MongoDB collection.
## <a name="add-a-configuration-model"></a>Add a configuration model
1. Add the following database configuration values to *appsettings.json*:
[!code-json[](first-mongo-app/samples/2.x/SampleApp/appsettings.json?highlight=2-6)]
1. Add a *BookstoreDatabaseSettings.cs* file to the *Models* directory with the following code:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Models/BookstoreDatabaseSettings.cs)]
The preceding `BookstoreDatabaseSettings` class is used to store the *appsettings.json* file's `BookstoreDatabaseSettings` property values. The JSON and C# property names are named identically to ease the mapping process.
1. Add the following highlighted code to `Startup.ConfigureServices`:
[!code-csharp[](first-mongo-app/samples_snapshot/2.x/SampleApp/Startup.ConfigureServices.AddDbSettings.cs?highlight=3-7)]
In the preceding code:
* The configuration instance to which the *appsettings.json* file's `BookstoreDatabaseSettings` section binds is registered in the Dependency Injection (DI) container. For example, a `BookstoreDatabaseSettings` object's `ConnectionString` property is populated with the `BookstoreDatabaseSettings:ConnectionString` property in *appsettings.json*.
* The `IBookstoreDatabaseSettings` interface is registered in DI with a singleton [service lifetime](xref:fundamentals/dependency-injection#service-lifetimes). When injected, the interface instance resolves to a `BookstoreDatabaseSettings` object.
1. Add the following code to the top of *Startup.cs* to resolve the `BookstoreDatabaseSettings` and `IBookstoreDatabaseSettings` references:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Startup.cs?name=snippet_UsingBooksApiModels)]
## <a name="add-a-crud-operations-service"></a>Add a CRUD operations service
1. Add a *Services* directory to the project root.
1. Add a `BookService` class to the *Services* directory with the following code:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Services/BookService.cs?name=snippet_BookServiceClass)]
In the preceding code, an `IBookstoreDatabaseSettings` instance is retrieved from DI via constructor injection. This technique provides access to the *appsettings.json* configuration values that were added in the [Add a configuration model](#add-a-configuration-model) section.
1. Add the following highlighted code to `Startup.ConfigureServices`:
[!code-csharp[](first-mongo-app/samples_snapshot/2.x/SampleApp/Startup.ConfigureServices.AddSingletonService.cs?highlight=9)]
In the preceding code, the `BookService` class is registered with DI to support constructor injection in consuming classes. The singleton service lifetime is most appropriate because `BookService` takes a direct dependency on `MongoClient`. Per the official [Mongo Client reuse guidelines](https://mongodb.github.io/mongo-csharp-driver/2.8/reference/driver/connecting/#re-use), `MongoClient` should be registered in DI with a singleton service lifetime.
1. Add the following code to the top of *Startup.cs* to resolve the `BookService` reference:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Startup.cs?name=snippet_UsingBooksApiServices)]
The `BookService` class uses the following `MongoDB.Driver` members to perform CRUD operations against the database:
* [MongoClient](https://api.mongodb.com/csharp/current/html/T_MongoDB_Driver_MongoClient.htm) – Reads the server instance for performing database operations. The constructor of this class is provided the MongoDB connection string:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Services/BookService.cs?name=snippet_BookServiceConstructor&highlight=3)]
* [IMongoDatabase](https://api.mongodb.com/csharp/current/html/T_MongoDB_Driver_IMongoDatabase.htm) – Represents the Mongo database for performing operations. This tutorial uses the generic [GetCollection\<TDocument>(collection)](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoDatabase_GetCollection__1.htm) method on the interface to gain access to data in a specific collection. Perform CRUD operations against the collection after this method is called. In the `GetCollection<TDocument>(collection)` method call:
* `collection` represents the collection name.
* `TDocument` represents the CLR object type stored in the collection.
`GetCollection<TDocument>(collection)` returns a [MongoCollection](https://api.mongodb.com/csharp/current/html/T_MongoDB_Driver_MongoCollection.htm) object representing the collection. In this tutorial, the following methods are invoked on the collection:
* [DeleteOne](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollection_1_DeleteOne.htm) – Deletes a single document matching the provided search criteria.
* [Find\<TDocument>](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollectionExtensions_Find__1_1.htm) – Returns all documents in the collection matching the provided search criteria.
* [InsertOne](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollection_1_InsertOne.htm) – Inserts the provided object as a new document in the collection.
* [ReplaceOne](https://api.mongodb.com/csharp/current/html/M_MongoDB_Driver_IMongoCollection_1_ReplaceOne.htm) – Replaces the single document matching the provided search criteria with the provided object.
## <a name="add-a-controller"></a>Add a controller
Add a `BooksController` class to the *Controllers* directory with the following code:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Controllers/BooksController.cs)]
The preceding web API controller:
* Uses the `BookService` class to perform CRUD operations.
* Contains action methods to support GET, POST, PUT, and DELETE HTTP requests.
* Calls <xref:System.Web.Http.ApiController.CreatedAtRoute*> in the `Create` action method to return an [HTTP 201](https://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html) response. Status code 201 is the standard response for an HTTP POST method that creates a new resource on the server. `CreatedAtRoute` also adds a `Location` header to the response. The `Location` header specifies the URI of the newly created book.
## <a name="test-the-web-api"></a>Test the web API
1. Build and run the app.
1. Navigate to `http://localhost:<port>/api/books` to test the controller's parameterless `Get` action method. The following JSON response is displayed:
```json
[
  {
    "id":"5bfd996f7b8e48dc15ff215d",
    "bookName":"Design Patterns",
    "price":54.93,
    "category":"Computers",
    "author":"Ralph Johnson"
  },
  {
    "id":"5bfd996f7b8e48dc15ff215e",
    "bookName":"Clean Code",
    "price":43.15,
    "category":"Computers",
    "author":"Robert C. Martin"
  }
]
```
1. Navigate to `http://localhost:<port>/api/books/{id here}` to test the controller's overloaded `Get` action method. The following JSON response is displayed:
```json
{
  "id":"{ID}",
  "bookName":"Clean Code",
  "price":43.15,
  "category":"Computers",
  "author":"Robert C. Martin"
}
```
## <a name="configure-json-serialization-options"></a>Configure JSON serialization options
There are two details to change about the JSON responses returned in the [Test the web API](#test-the-web-api) section:
* The property names' default camel casing should be changed to match the Pascal casing of the CLR object's property names.
* The `bookName` property should be returned as `Name`.
To satisfy the preceding requirements, make the following changes:
1. In `Startup.ConfigureServices`, chain the following highlighted code on to the `AddMvc` method call:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Startup.cs?name=snippet_ConfigureServices&highlight=12)]
With the preceding change, property names in the web API's serialized JSON response match their corresponding property names in the CLR object type. For example, the `Book` class's `Author` property serializes as `Author`.
1. In *Models/Book.cs*, annotate the `BookName` property with the following [[JsonProperty]](https://www.newtonsoft.com/json/help/html/T_Newtonsoft_Json_JsonPropertyAttribute.htm) attribute:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Models/Book.cs?name=snippet_BookNameProperty&highlight=2)]
The `[JsonProperty]` attribute's value of `Name` represents the property name in the web API's serialized JSON response.
1. Add the following code to the top of *Models/Book.cs* to resolve the `[JsonProperty]` attribute reference:
[!code-csharp[](first-mongo-app/samples/2.x/SampleApp/Models/Book.cs?name=snippet_NewtonsoftJsonImport)]
1. Repeat the steps defined in the [Test the web API](#test-the-web-api) section. Notice the difference in JSON property names.
::: moniker-end
## <a name="add-authentication-support-to-a-web-api"></a>Add authentication support to a web API
[!INCLUDE[](~/includes/IdentityServer4.md)]
## <a name="next-steps"></a>Next steps
For more information on building ASP.NET Core web APIs, see the following resources:
* [YouTube version of this article](https://www.youtube.com/watch?v=7uJt_sOenyo&feature=youtu.be)
* <xref:web-api/index>
* <xref:web-api/action-return-types>
# Abed: Product Landing Page
A Pen created on CodePen.io. Original URL: [https://codepen.io/Alagh/pen/poEmOzw](https://codepen.io/Alagh/pen/poEmOzw).
# Formulae, etc. implementation using NumPy and Pandas
| Algorithm                 | Mathematical Intuition                                           | Code Implementation |
| ------------------------- | ---------------------------------------------------------------- | ------------------- |
| Confusion matrix metrics  | [Link](#mathematical-intuition-for-confusion-matrix-metrics)     |                     |
| Basic univariate stats    | [Link](#mathematical-intuition-for-basic-univariate-statistics)  |                     |
| Basic bivariate stats     | [Link](#mathematical-intuition-for-basic-bivariate-statistics)   |                     |
| Basic probability stats   | [Link](#mathematical-intuition-for-basic-probability-statistics) |                     |
---
## Mathematical intuition for confusion matrix metrics:

## Mathematical intuition for basic univariate statistics:

## Mathematical intuition for basic bivariate statistics:

## Mathematical intuition for basic probability statistics:
 | 45.636364 | 94 | 0.696215 | eng_Latn | 0.20533 |
     
# Database course project

This project is an academic assignment that presents the Entity-Relationship diagram, the mapping of the ER schema, and the visual representation of the relational schema of the web system developed for the social network Escolha Perfeita.

The DBMS used to build the social network was MySQL, with XAMPP as the local server, and the tool used to administer the database was phpMyAdmin. The languages and technologies used to build the web system were the following: PHP, CSS (Bootstrap), HTML5, JavaScript (jQuery), and the Visual Studio Code editor.

## Relational diagram

Relational Diagram



Relational Schema



## Screenshots















# License

This project is under the [MIT](https://choosealicense.com/licenses/mit/) license © 2020 Adriana Mirian Mendes Cardoso.

For more information, see the :scroll:`LICENSE.md` file.

# Contact

:email: E-mail: [adriana.cardoso@aluno.ufop.edu.br](adriana.cardoso@aluno.ufop.edu.br)

:clipboard: LinkedIn: [https://www.linkedin.com/in/adriana-mendes-engenheira-de-computacao/](https://www.linkedin.com/in/adriana-mendes-engenheira-de-computacao/)

:package: GitHub: [https://github.com/AdrianaMendes](https://github.com/AdrianaMendes)
---
title: 'Contact'
process:
    markdown: true
    twig: true
modularType: 'contact'
---
* [{{'nicolas.bages@spyesx.fr'|safe_email}}](mailto:{{'nicolas.bages@spyesx.fr'|safe_email}})
* [facebook.com/spyesx](http://facebook.com/spyesx)
* [twitter.com/spyesx](http://twitter.com/spyesx)
* [flickr.com/spyesx](http://flickr.com/spyesx)
* [github.com/spyesx](http://github.com/spyesx)
* [linkedin.com/nicolasbages](http://linkedin.com/nicolasbages)
* [skype](skype:nicolas.bages)
---
title: STExteriorRing (geometry Data Type) | Microsoft Docs
ms.custom:
ms.date: 08/03/2017
ms.prod: sql-non-specified
ms.prod_service: database-engine, sql-database
ms.service:
ms.component: t-sql|spatial-geography
ms.reviewer:
ms.suite: sql
ms.technology:
- database-engine
ms.tgt_pltfrm:
ms.topic: language-reference
f1_keywords:
- STExteriorRing_TSQL
- STExteriorRing (geometry Data Type)
dev_langs:
- TSQL
helpviewer_keywords:
- STExteriorRing (geometry Data Type)
ms.assetid: b402b36f-05bf-4c6d-8cd6-76c0fff19db2
caps.latest.revision:
author: douglaslMS
ms.author: douglasl
manager: craigg
ms.workload: Inactive
ms.openlocfilehash: 6311d2b2c50c98e8581749ba0d4ae305f2520908
ms.sourcegitcommit: 9e6a029456f4a8daddb396bc45d7874a43a47b45
ms.translationtype: HT
ms.contentlocale: pt-BR
ms.lasthandoff: 01/25/2018
---
# <a name="stexteriorring-geometry-data-type"></a>STExteriorRing (geometry Data Type)

[!INCLUDE[tsql-appliesto-ss2008-asdb-xxxx-xxx-md](../../includes/tsql-appliesto-ss2008-asdb-xxxx-xxx-md.md)]

Returns the exterior ring of a **geometry** instance that is a polygon.

## <a name="syntax"></a>Syntax

```
.STExteriorRing ( )
```

## <a name="return-types"></a>Return Types

[!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] return type: **geometry**

CLR return type: **SqlGeometry**

Open Geospatial Consortium (OGC) type: **LineString**

## <a name="remarks"></a>Remarks

This method returns **null** if the **geometry** instance is not a polygon.

## <a name="examples"></a>Examples

The following example creates a `Polygon` instance and uses `STExteriorRing()` to return the exterior ring of the polygon as a **LineString**.
```
DECLARE @g geometry;
SET @g = geometry::STGeomFromText('POLYGON((0 0, 3 0, 3 3, 0 3, 0 0),(2 2, 2 1, 1 1, 1 2, 2 2))', 0);
SELECT @g.STExteriorRing().ToString();
```
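The same idea can be sketched outside SQL Server: a polygon is commonly represented as a list of rings, with the exterior ring first and any interior rings (holes) after it. A plain-Python illustration (function name and data layout are my own; the coordinates match the T-SQL example above):

```python
def exterior_ring(polygon_rings):
    """Given a polygon as a list of rings (exterior ring first, then holes),
    return the exterior ring, or None for an empty (non-polygon) input."""
    if not polygon_rings:
        return None
    return polygon_rings[0]

rings = [
    [(0, 0), (3, 0), (3, 3), (0, 3), (0, 0)],  # exterior ring
    [(2, 2), (2, 1), (1, 1), (1, 2), (2, 2)],  # interior ring (hole)
]
print(exterior_ring(rings))
```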
## <a name="see-also"></a>See Also

[OGC Methods on Geometry Instances](../../t-sql/spatial-geometry/ogc-methods-on-geometry-instances.md)
---
title: Configuring npm for use with GitHub Packages
intro: 'You can configure npm to publish packages to {% data variables.product.prodname_registry %} and to use packages stored on {% data variables.product.prodname_registry %} as dependencies in an npm project.'
product: '{% data reusables.gated-features.packages %}'
redirect_from:
  - /articles/configuring-npm-for-use-with-github-package-registry
  - /github/managing-packages-with-github-package-registry/configuring-npm-for-use-with-github-package-registry
  - /github/managing-packages-with-github-packages/configuring-npm-for-use-with-github-packages
versions:
  free-pro-team: '*'
  enterprise-server: '>=2.22'
---

{% data reusables.package_registry.packages-ghes-release-stage %}

**Note:** When installing or publishing a Docker image, {% data variables.product.prodname_registry %} does not support foreign layers, such as Windows images.

### Authenticating to {% data variables.product.prodname_registry %}

{% data reusables.package_registry.authenticate-packages %}

#### Authenticating with a personal access token

{% data reusables.package_registry.required-scopes %}

You can authenticate to {% data variables.product.prodname_registry %} with npm by either editing your per-user *~/.npmrc* file to include your personal access token, or by logging in to npm on the command line using your username and personal access token.

To authenticate by adding your personal access token to your *~/.npmrc* file, edit the *~/.npmrc* file for your project to include the following line, replacing *TOKEN* with your personal access token. Create a new *~/.npmrc* file if one doesn't exist.

{% if currentVersion != "free-pro-team@latest" %}
For more information about how to create a package, see the [maven.apache.org documentation](https://maven.apache.org/guides/getting-started/maven-in-five-minutes.html).
{% endif %}
```shell
//npm.pkg.github.com/:_authToken=<em>TOKEN</em>
```
{% if currentVersion != "free-pro-team@latest" %}
For example, the *OctodogApp* and *OctocatApp* projects will publish to the same repository:
```shell
$ npm login --registry=https://npm.pkg.github.com
> Username: <em>USERNAME</em>
> Password: <em>TOKEN</em>
> Email: <em>PUBLIC-EMAIL-ADDRESS</em>
```
{% endif %}
To authenticate by logging in to npm, use the `npm login` command, replacing *USERNAME* with your {% data variables.product.prodname_dotcom %} username, *TOKEN* with your personal access token, and *PUBLIC-EMAIL-ADDRESS* with your email address.

{% if currentVersion != "free-pro-team@latest" %}
For more information about how to create a package, see the [maven.apache.org documentation](https://maven.apache.org/guides/getting-started/maven-in-five-minutes.html).
{% endif %}
```shell
"repository" : {
"type" : "git",
"url": "ssh://git@github.com/<em>OWNER</em>/<em>REPOSITORY</em>.git",
"directory": "packages/name"
},
```
{% if currentVersion != "free-pro-team@latest" %}
For example, the *OctodogApp* and *OctocatApp* projects will publish to the same repository:
```shell
registry=https://npm.pkg.github.com/<em>OWNER</em>
@<em>OWNER</em>:registry=https://npm.pkg.github.com
@<em>OWNER</em>:registry=https://npm.pkg.github.com
```
{% endif %}
#### Authenticating with the `GITHUB_TOKEN`

{% data reusables.package_registry.package-registry-with-github-tokens %}

### Publishing a package

By default, {% data variables.product.prodname_registry %} publishes a package in the {% data variables.product.prodname_dotcom %} repository you specify in the name field of the *package.json* file. For example, you would publish a package named `@my-org/test` to the `My-org/test` {% data variables.product.prodname_dotcom %} repository. You can add a summary for the package listing page by including a *README.md* file in your package directory. For more information, see "[Working with package.json](https://docs.npmjs.com/getting-started/using-a-package.json)" and "[How to create Node.js Modules](https://docs.npmjs.com/getting-started/creating-node-modules)" in the npm documentation.

You can publish multiple packages to the same {% data variables.product.prodname_dotcom %} repository by including a `URL` field in the *package.json* file. For more information, see "[Publishing multiple packages to the same repository](#publishing-multiple-packages-to-the-same-repository)".

You can set up the scope mapping for your project using either a local *.npmrc* file in the project or the `publishConfig` option in *package.json*. {% data variables.product.prodname_registry %} only supports scoped npm packages. Scoped packages have names with the format of `@owner/name`, and they always begin with an `@` symbol. You may need to update the name in your *package.json* to use the scoped name. For example, `"name": "@codertocat/hello-world-npm"`.

{% data reusables.package_registry.viewing-packages %}

#### Publishing a package using a local *.npmrc* file

You can use an *.npmrc* file to configure the scope mapping for your project. In the *.npmrc* file, use the {% data variables.product.prodname_registry %} URL and account owner so {% data variables.product.prodname_registry %} knows where to route package requests. Using an *.npmrc* file prevents other developers from accidentally publishing the package to npmjs.org instead of {% data variables.product.prodname_registry %}. {% data reusables.package_registry.lowercase-name-field %}
{% data reusables.package_registry.authenticate-step %}
{% data reusables.package_registry.create-npmrc-owner-step %}
{% data reusables.package_registry.add-npmrc-to-repo-step %}
4. Verify the name of your package in your project's *package.json*. The `name` field must contain the scope and the name of the package. For example, if your package is called "test", and you are publishing to the "My-org" {% data variables.product.prodname_dotcom %} organization, the `name` field in your *package.json* should be `@my-org/test`.
{% data reusables.package_registry.verify_repository_field %}
{% data reusables.package_registry.publish_package %}
#### Publishing a package using `publishConfig` in the *package.json* file

You can use the `publishConfig` element in the *package.json* file to specify the registry where you want the package published. For more information, see "[publishConfig](https://docs.npmjs.com/files/package.json#publishconfig)" in the npm documentation.

1. Edit the *package.json* file for your package and include a `publishConfig` entry.

{% if currentVersion != "free-pro-team@latest" %}
For more information about how to create a package, see the [maven.apache.org documentation](https://maven.apache.org/guides/getting-started/maven-in-five-minutes.html).
{% endif %}
```shell
"publishConfig": {
"registry":"https://npm.pkg.github.com/"
},
```
{% if currentVersion != "free-pro-team@latest" %}
For example, the *OctodogApp* and *OctocatApp* projects will publish to the same repository:
```shell
"publishConfig": {
"registry":"https://<em>HOSTNAME</em>/_registry/npm/"
},
```
{% endif %}
{% data reusables.package_registry.verify_repository_field %}
{% data reusables.package_registry.publish_package %}
### Publishing multiple packages to the same repository

To publish multiple packages to the same repository, you can include the URL of the {% data variables.product.prodname_dotcom %} repository in the `repository` field of the *package.json* file for each package.

To ensure the repository's URL is correct, replace REPOSITORY with the name of the repository containing the package you want to publish, and OWNER with the name of the user or organization account on {% data variables.product.prodname_dotcom %} that owns the repository.

{% data variables.product.prodname_registry %} will match the repository based on the URL, instead of based on the package name. If you store the *package.json* file outside the root directory of your repository, you can use the `directory` field to specify the location where {% data variables.product.prodname_registry %} can find the *package.json* files.
```shell
"repository" : {
"type" : "git",
"url": "ssh://git@{% if currentVersion == "free-pro-team@latest" %}github.com{% else %}<em>HOSTNAME</em>{% endif %}/<em>OWNER</em>/<em>REPOSITORY</em>.git",
"directory": "packages/name"
},
```
### Installing a package

You can install packages from {% data variables.product.prodname_registry %} by adding the packages as dependencies in the *package.json* file for your project. For more information on using a *package.json* in your project, see "[Working with package.json](https://docs.npmjs.com/getting-started/using-a-package.json)" in the npm documentation.

By default, you can add packages from one organization. For more information, see "[Installing packages from other organizations](#installing-packages-from-other-organizations)".

You also need to add the *.npmrc* file to your project so that all requests to install packages go through {% data variables.product.prodname_registry %}. When you route all package requests through {% data variables.product.prodname_registry %}, you can use both scoped and unscoped packages from *npmjs.com*. For more information, see "[npm-scope](https://docs.npmjs.com/misc/scope)" in the npm documentation.

{% data reusables.package_registry.authenticate-step %}
{% data reusables.package_registry.create-npmrc-owner-step %}
{% data reusables.package_registry.add-npmrc-to-repo-step %}
4. Configure *package.json* in your project to use the package you are installing. To add your package dependencies to the *package.json* file for {% data variables.product.prodname_registry %}, specify the full-scoped package name, such as `@my-org/server`. For packages from *npmjs.com*, specify the full name, such as `@babel/core` or `@lodash`. For example, the following *package.json* uses the `@octo-org/octo-app` package as a dependency.
```
{
"name": "@my-org/server",
"version": "1.0.0",
"description": "Server app that uses the @octo-org/octo-app package",
"main": "index.js",
"author": "",
"license": "MIT",
"dependencies": {
"@octo-org/octo-app": "1.0.0"
}
}
```
5. Install the package.
```shell
$ npm install
```
#### Installing packages from other organizations

By default, you can only use {% data variables.product.prodname_registry %} packages from one organization. {% data reusables.package_registry.lowercase-name-field %}

{% if currentVersion != "free-pro-team@latest" %}
For more information about how to create a package, see the [maven.apache.org documentation](https://maven.apache.org/guides/getting-started/maven-in-five-minutes.html).
{% endif %}
```shell
registry=https://{% if currentVersion == "free-pro-team@latest" %}npm.pkg.github.com{% else %}npm.<em>HOSTNAME</em>/{% endif %}<em>OWNER</em>
@<em>OWNER</em>:registry={% if currentVersion == "free-pro-team@latest" %}npm.pkg.github.com{% else %}npm.<em>HOSTNAME</em>/{% endif %}
@<em>OWNER</em>:registry={% if currentVersion == "free-pro-team@latest" %}npm.pkg.github.com{% else %}npm.<em>HOSTNAME</em>/{% endif %}
```
{% if currentVersion != "free-pro-team@latest" %}
For example, the *OctodogApp* and *OctocatApp* projects will publish to the same repository:
```shell
registry=https://<em>HOSTNAME</em>/_registry/npm/<em>OWNER</em>
@<em>OWNER</em>:registry=<em>HOSTNAME</em>/_registry/npm/
@<em>OWNER</em>:registry=<em>HOSTNAME</em>/_registry/npm/
```
{% endif %}
### Further reading

- "[Deleting a package](/packages/publishing-and-managing-packages/deleting-a-package/)"
# Import Export Preferences
{{TOCright}}
## Introduction
FreeCAD can import and export many file formats. For some formats dedicated preferences exist. These can be found in the [Preferences editor](Preferences_Editor.md), in the menu **Edit → Preferences → Import-Export**.
Not all import and export preferences tabs are available by default. For some a workbench has to be loaded first.
## Notes
### TechDraw pages
The DXF and SVG preferences listed here are not used by the <img alt="" src=images/TechDraw_ExportPageSVG.svg style="width:24px;"> [Export Page as SVG](TechDraw_ExportPageSVG.md) and <img alt="" src=images/TechDraw_ExportPageDXF.svg style="width:24px;"> [Export Page as DXF](TechDraw_ExportPageDXF.md) commands of the <img alt="" src=images/Workbench_TechDraw.svg style="width:24px;"> [TechDraw Workbench](TechDraw_Workbench.md), or by the [TechDraw](TechDraw_Workbench.md) export option: **File → Export → Technical Drawing (*.svg *.dxf *.pdf)**.
### OpenSCAD files
The import and export preferences for OpenSCAD files are listed in a different part of the [Preferences editor](Preferences_Editor.md). See the [OpenSCAD Preferences](OpenSCAD_Preferences.md).
## Related
See the following pages for additional information:
- [Import Export](Import_Export.md): A table listing all supported file formats.
- [FreeCAD Howto Import Export](FreeCAD_Howto_Import_Export.md): A list of tutorials that can help users convert data from one format to another.
## Available preferences
### DAE
The [Collada](http://en.wikipedia.org/wiki/COLLADA) DAE (Digital Asset Exchange) format is a standard file format for exchange of Mesh data. FreeCAD can import meshes from {{FileName|.dae}} files, and export [Shape](Part_Workbench.md)-based objects to the {{FileName|.dae}} format.
Note for Linux users: To handle this file format FreeCAD requires the [pyCollada module](Extra_python_modules.md).
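From FreeCAD's Python console (or any Python interpreter using the same environment) you can check whether the pyCollada module is available; a small hedged sketch (pyCollada is imported under the module name `collada`):

```python
import importlib.util

def has_pycollada():
    """Return True if the pyCollada module (imported as 'collada') is available."""
    return importlib.util.find_spec("collada") is not None

print("pyCollada available:", has_pycollada())
```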
For the DAE format you can specify the following:
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+=====================================+===========================================================================================================================================================================================================================================================================================================================================================================================================+
| | All dimensions in the file will be scaled with the specified factor. |
| **Scaling factor** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Sets the meshing program that should be used. If using *Netgen*, make sure that it is available. This can be checked by using the <img alt="" src=images/Workbench_Mesh.svg style="width:24px;"> [Mesh Workbench](Mesh_Workbench.md) and [creating a mesh](Mesh_FromPartShape.md) using Netgen. If it is not available another version of FreeCAD, compiled with Netgen, must be installed. |
| **Mesher** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | The tessellation value to use with the *Builtin* and the *Mefisto* meshing program. |
| **Tessellation** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | The grading value to use for meshing using *Netgen*. This value describes how fast the mesh size decreases. The gradient of the local mesh size `h(x)` is bound by `abs(Δh(x)) ≤ 1/value`. |
| **Grading** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | The maximum number of segments per edge. |
| **Segments per edge** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | The number of segments per radius. |
| **Segments per radius** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Allow a second order mesh. |
| **Second order** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Allow optimization. |
| **Optimize** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Allow [quadrilateral faces](https://en.wikipedia.org/wiki/Types_of_mesh#Two-dimensional). |
| **Allow quads** | |
| | |
+-------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

### DWG
DWG (from drawing) is a proprietary, closed source, binary file format for storing 2D and 3D design data and metadata. FreeCAD requires external converters to process DWG files.
**Note:** All settings for the DXF file format also apply to DWG.
For the DWG format you can specify the following:
+----------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+========================================+========================================================================================================================================================================================================================================================================================================+
| | Select the DWG converter to use: |
| **Conversion method** | |
| | - **Automatic**: FreeCAD will try to find a converter automatically following the order of the rest of this list. |
| | - **LibreDWG**: [LibreDWG](https://www.gnu.org/software/libredwg/) is an open-source DWG reading and writing library. It lacks support for several DWG entities, and may not always give faithful results. |
| | - **ODA Converter**: The [ODA File Converter](https://www.opendesign.com/guestfiles/oda_file_converter) is a free utility provided by the Open Design Alliance. It gives very good and reliable results. |
| | - **QCAD pro**: [QCAD pro](https://qcad.org/en/qcad-command-line-tools#dwg2dwg) is the paid version of the open-source QCAD DXF-based 2D CAD platform. Its DWG converter uses the Teigha libraries from the OpenDesign Alliance and therefore gives the same good results as the ODA File Converter. |
| | |
| | |
| | <small>(v0.20)</small> |
| | |
+----------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | FreeCAD will try to find the path to the ODA Converter automatically on Linux and Windows. If it is unable to find it, or if you are using the macOS or a different DWG converter, you need to specify the path to the executable here: |
| **Path to file converter** | |
| | - |
| | {{FileName|dwg2dxf.exe}} |
| | |
| | (LibreDWG) |
| | |
| | - |
| | {{FileName|ODAFileConverter.exe}} |
| | |
| | (ODA File Converter) |
| | |
| | - |
| | {{FileName|dwg2dwg.exe}} |
| | |
| | (QCAD pro). |
| | |
| | |
| | <small>(v0.20)</small> |
| | |
+----------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

### DXF
AutoCAD [DXF](DXF.md) (Drawing eXchange Format) is a proprietary format to exchange CAD data between AutoCAD and other programs.
For the DXF format you can specify the following:
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+======================================================================================+============================================================================================================================================================================================================================================+
| | If checked, this preferences dialog will be shown when importing or exporting DXF files. |
| **Show this dialog when importing and exporting** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                      | If checked, the Python importer is used; otherwise the newer C++ importer is used. The C++ importer is faster, but does not yet have as many features.                                                                                     |
| **Use legacy python importer** | |
|                                                                                      | The Python importer uses the **Edit → Preferences... → Draft → General settings → Internal precision level** preference. For an accurate import result, set this value to 8 or higher.                                                     |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                      | If checked, the Python exporter is used; otherwise the newer C++ exporter is used. The C++ exporter is faster, but does not yet have as many features. <small>(v0.19)</small>                                                              |
| **Use legacy python exporter** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | By checking this, you will allow FreeCAD to download the [Python converter](FreeCAD_and_DXF_Import.md) for DXF import and export. This converter cannot be bundled with FreeCAD because it has a different software license. |
| **Allow FreeCAD to automatically download and update the DXF libraries** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select what will be imported. |
| **Import** | |
| | If **texts and dimensions** is unchecked, texts and [mtexts](https://www.autodesk.com/techpubs/autocad/acad2000/dxf/mtext_dxf_06.htm) won\'t be imported. |
| | |
| | If **points** is unchecked, points won\'t be imported. |
| | |
| | If **layouts** is checked, paper space objects will be imported too. |
| | |
| | Check **\*blocks** if you want anonymous blocks (which have names beginning with a \*) to be imported too. |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select what will be created. |
| **Create** | |
| | If **simple Part shapes** is selected, only standard Part objects will be created. This is the fastest. |
| | |
| | If **Draft objects** is selected, parametric Draft objects will be created whenever possible. |
| | |
| | If **Sketches** is selected, sketches will be created whenever possible. |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Scale factor to apply to DXF files on import. The factor is the conversion between the units of your DXF file and millimeters. Example: for files in millimeters: 1, in centimeters: 10, in meters: 1000, in inches: 25.4, in feet: 304.8. |
| **Scale factor to apply to imported files** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, colors will be retrieved from the DXF objects whenever possible. Otherwise default colors will be applied. |
| **Get original colors from the DXF file** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, FreeCAD will try to join coincident objects into wires. Note that this can take a while! |
| **Join geometry** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, objects from the same layers will be joined into Draft Blocks, which display faster, but are less easily editable. |
| **Group layers into blocks** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, imported texts will get the standard [Draft Text](Draft_Text.md) size, instead of the size they have in the DXF document. |
| **Use standard font size for texts** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, DXF layers will be imported as [Draft Layers](Draft_Layer.md). (<small>(v0.19)</small> ) |
| **Use Layers** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, hatches will be converted to simple wires. |
| **Import hatch boundaries as wires** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, if polylines have a width defined, they will be rendered as closed wires with the correct width. |
| **Render polylines with width** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | The export of ellipses and splines is poorly supported. Use this option to export them as polylines instead. |
| **Treat ellipses and splines as polylines** | |
| | The setting **Max Spline Segment** is then the maximum length of each of the polyline segments. If it is set to **0** the whole spline is treated as a straight segment. |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, all objects containing faces will be exported as 3D polyfaces. |
| **Export 3D objects as polyface meshes** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                      | If this is checked, Drawing Views will be exported as blocks. This might fail for templates newer than DXF R12.                                                                                                                            |
| **Export Drawing Views as blocks** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, the exported objects will be projected to reflect the current view direction. |
| **Project exported objects along current view direction** | |
| | |
+--------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
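The scale factor is simply a units-to-millimeters multiplier applied to every coordinate on import. A small sketch of the arithmetic, using the conversion examples from the table above (plain Python, no FreeCAD required):

```python
# Conversion factors from common DXF drawing units to millimeters,
# matching the examples given in the table above.
UNIT_TO_MM = {
    "mm": 1.0,
    "cm": 10.0,
    "m": 1000.0,
    "in": 25.4,
    "ft": 304.8,
}

def to_millimeters(coords, unit):
    """Scale a list of (x, y) coordinates from the given unit to millimeters."""
    factor = UNIT_TO_MM[unit]
    return [(x * factor, y * factor) for x, y in coords]

# A DXF file drawn in inches: each coordinate is multiplied by 25.4.
print(to_millimeters([(1.0, 2.0)], "in"))  # [(25.4, 50.8)]
```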

### IGES
The [Initial Graphics Exchange Specification](https://en.wikipedia.org/wiki/IGES) (IGES) is a file format that allows the digital exchange of information among CAD systems. After publication of the [STEP](Preferences_Editor#STEP.md) file format, IGES development was stopped in 1996, but the format is still supported by many CAD programs. IGES files have the {{FileName|.iges}} or {{FileName|.igs}} extension.
The tab *IGES* is only shown in the preferences if the <img alt="" src=images/Workbench_Part.svg style="width:24px;"> [Part Workbench](Part_Workbench.md), <img alt="" src=images/Workbench_PartDesign.svg style="width:24px;"> [PartDesign Workbench](PartDesign_Workbench.md), or <img alt="" src=images/Workbench_OpenSCAD.svg style="width:24px;"> [OpenSCAD Workbench](OpenSCAD_Workbench.md) has been loaded in the current FreeCAD session.
For the IGES format you can specify the following:
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+============================================+=============================================================================================================================================================================================================================================================================================================================+
| | Select what unit will be used when exporting IGES files. |
| **Units for export of IGES** | |
| | |
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select how solids and shells should be output. |
| **Write solids and shells as** | |
| | If **Groups of Trimmed Surfaces (type 144)** is selected, they will be exported as [trimmed surfaces](https://wiki.eclipse.org/IGES_file_Specification#Trimmed_Surface_.28Type_144.29). |
| | |
| | If **Solids (type 186) and Shells (type 514) / B-REP mode** is selected, solids will be exported as [manifold solid B-Rep objects](https://wiki.eclipse.org/IGES_file_Specification#Manifold_Solid_B-Rep_Object_.28Type_186.29), shells as [shells](https://wiki.eclipse.org/IGES_file_Specification#Shell_.28Type_514.29). |
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, blank [entities](https://wiki.eclipse.org/IGES_file_Specification#Entities) will not be imported. |
| **Skip blank entities** | |
| | |
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If not empty, the entered text will be used in the IGES file header for the company. |
| **Company** | |
| | |
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If not empty, the entered text will be used in the IGES file header for the author. |
| **Author** | |
| | |
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If not empty, the entered text will be used in the IGES file header for the product. |
| **Product** | |
| | |
+--------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
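For reference, the **Company**, **Author**, and **Product** strings end up in the Global section of the IGES file, where the IGES specification encodes strings as Hollerith constants of the form `nHtext` (`n` is the character count). A minimal decoder sketch, assuming a single well-formed token:

```python
def decode_hollerith(token):
    """Decode an IGES Hollerith constant, e.g. '7HCompany' -> 'Company'."""
    count_str, sep, rest = token.partition("H")
    if not sep:
        raise ValueError("not a Hollerith constant: missing 'H'")
    count = int(count_str)
    if len(rest) < count:
        raise ValueError("Hollerith constant shorter than declared length")
    return rest[:count]

print(decode_hollerith("7HCompany"))  # Company
print(decode_hollerith("4HACME"))    # ACME
```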

### IFC
[Industry Foundation Classes](http://en.wikipedia.org/wiki/Industry_Foundation_Classes) (IFC) is a widespread format to exchange data between [BIM](http://en.wikipedia.org/wiki/Building_Information_Modeling) applications. It is used in architecture and engineering.
Note for Linux users: to handle this file format, FreeCAD requires the [IfcOpenShell module](Extra_python_modules.md).
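A quick way to check from FreeCAD\'s Python console (or any Python interpreter) whether the module can be found — a minimal sketch; `ifcopenshell` is the module\'s import name:

```python
import importlib.util

def module_available(name):
    """Return True if the named module can be imported, without importing it."""
    return importlib.util.find_spec(name) is not None

print(module_available("ifcopenshell"))  # True if IfcOpenShell is installed
```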
#### IFC import
For import of the IFC format you can specify the following:
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+=================================================================================+========================================================================================================================================================================================================================================================+
| | If checked, this preferences dialog will be shown when importing IFC files. |
| **Show this dialog when importing** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Shows verbose debug messages during import and export of IFC files in the [Report view](Report_view.md). |
| **Show debug messages** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | Several IFC objects can share the same geometry definition and differ only in their placement. If this option is enabled, clones are used to achieve the same result in FreeCAD: one object is the base object, the others are clones. |
| **Create clones when objects have shared geometry** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | Specify the number of CPU cores to use for IFC import. This number should not exceed the number of available cores. Use **0** to disable this feature. <small>(v0.19)</small>                                                          |
| **Number of cores to use (experimental)** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | What will be created in FreeCAD for arch IFC objects. |
| **Import arch IFC objects as** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | What will be created in FreeCAD for struct IFC objects. |
| **Import struct IFC objects as** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Only subtypes of the specified element will be imported. Keep the predefined element [IfcProduct](http://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifckernel/lexical/ifcproduct.htm) to import all building elements. |
| **Root element** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, openings will be imported as subtractions, otherwise wall shapes will already have their openings subtracted. |
| **Separate openings** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, the importer will try to detect extrusions. Note that this might slow things down. |
| **Detect extrusions** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | If checked, walls made of multiple layers will be split.                                                                                                                                                                               |
| **Split multilayer walls** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, object names will be prefixed with the [IFC ID](http://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcutilityresource/lexical/ifcgloballyuniqueid.htm) number. |
| **Prefix names with ID number** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | If several materials with the same name and the same color are found in the IFC file, they will be merged into one.                                                                                                                    |
| **Merge materials with same name and same color** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | If checked, each object will have its [IFC properties](http://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcpropertyresource/lexical/ifcproperty.htm) stored in a spreadsheet object.                              |
| **Import Ifc Properties in spreadsheet** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | If unchecked, invalid shapes are not imported. <small>(v0.19)</small>                                                                                                                                                                  |
| **Allow invalid shapes** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | A comma-separated list of [IFC entities](https://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/toc-5.htm) to be excluded from imports. |
| **Exclude list** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                 | If checked, the 3D view will be zoomed to fit the imported objects as they are imported. This slows down the import, but lets you watch its progress.                                                                                  |
| **Fit view while importing** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Creates a full parametric model on import using stored FreeCAD object properties. |
| **Import full FreeCAD parametric definitions if available** | |
| | To get the FreeCAD properties, the model must have been exported using the option **Export full FreeCAD parametric model**. <small>(v0.19)</small> |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, groups will be used to replace the mentioned objects. <small>(v0.19)</small> |
| **Replace 'Project', 'Site', 'Building' and 'Storey' with 'Group'** | |
| | |
+---------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
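The exclude list behaves like a simple comma-separated filter on entity type names. As an illustration only (the entity names below are examples, not a statement about the importer\'s internals):

```python
def parse_exclude_list(text):
    """Turn a comma-separated preference string into a set of entity names."""
    return {item.strip() for item in text.split(",") if item.strip()}

def keep_entity(entity_type, excluded):
    """Return True if an entity of this type should be imported."""
    return entity_type not in excluded

excluded = parse_exclude_list("IfcFurnishingElement, IfcSpace")
for entity in ["IfcWall", "IfcSpace", "IfcWindow"]:
    print(entity, keep_entity(entity, excluded))
```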

#### IFC export
For export of the IFC format you can specify the following:
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+========================================================================================+==================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================+
| | If checked, this preferences dialog will be shown when exporting IFC files. |
| **Show this dialog when exporting** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select how the model should be exported: as **Standard model**, **Structural Analysis**, or **Standard + structural**. <small>(v0.19)</small> |
| **Export type** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                        | Some IFC viewers don\'t like objects exported as extrusions. Use this to force all objects to be exported as [BREP](https://en.wikipedia.org/wiki/Boundary_representation) geometry. Avoid this if possible, as BREP objects are non-parametric.                                                                                                                                                                                                                                                                                                                                                                          |
| **Force export as Brep** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Use triangulation options set in the DAE options page. |
| **Use DAE triangulation options** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Curved shapes that cannot be represented as curves in IFC are decomposed into flat facets. If this is checked, some additional calculation is done to join coplanar facets. |
| **Join coplanar facets when triangulating** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | When exporting objects without a [unique ID](http://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcutilityresource/lexical/ifcgloballyuniqueid.htm) (UID), the generated UID will be stored inside the FreeCAD object for reuse the next time the object is exported. This leads to smaller differences between file versions. |
| **Store IFC unique ID in FreeCAD objects** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | [IFCOpenShell](Extra_python_modules#IfcOpenShell.md) is a library for IFC files. Its *serializer* functionality can produce valid IFC geometry from [OCC](Glossary#OCC.md) shapes. |
| **Use IfcOpenShell serializer if available** | |
| | Note that this is still an experimental feature. |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, 2D objects will be exported as [IfcAnnotation](https://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcproductextension/lexical/ifcannotation.htm). |
| **Export 2D objects as IfcAnnotations** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, all FreeCAD object properties will be stored inside the exported objects, allowing to recreate a full parametric model on reimport using the option **Import full FreeCAD parametric definitions if available**. |
| **Export full FreeCAD parametric model** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, similar entities will be used only once in the file if possible. This can reduce the file size a lot, but will make it less easily readable. |
| **Reuse similar entities** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Whenever possible, IFC objects that are extruded rectangles will be exported as [IfcRectangleProfileDef](http://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcprofileresource/lexical/ifcrectangleprofiledef.htm). For applications that have problems importing these entities, select this option to ensure that all profiles are exported as [IfcArbitraryClosedProfileDef](http://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcprofileresource/lexical/ifcarbitraryclosedprofiledef.htm) instead. |
| **Disable IfcRectangleProfileDef** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Some IFC types such as [IfcWall](https://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcsharedbldgelements/lexical/ifcwall.htm) or [IfcBeam](https://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcsharedbldgelements/lexical/ifcbeam.htm) have special standard versions like [IfcWallStandardCase](https://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcsharedbldgelements/lexical/ifcwallstandardcase.htm) or [IfcBeamStandardCase](https://standards.buildingsmart.org/IFC/RELEASE/IFC4/ADD1/HTML/schema/ifcsharedbldgelements/lexical/ifcbeamstandardcase.htm). If this option is selected, FreeCAD will automatically export such objects as standard cases when the necessary conditions are met. <small>(v0.19)</small> |
| **Auto-detect and export as standard cases when applicable** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | When exporting an IFC file, if no site is found in the FreeCAD document, a default one is added. A site is not mandatory according to the IFC standard, but it is common practice to have at least one in the file. <small>(v0.19)</small> |
| **Add default site if one is not found in the document** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | When exporting an IFC file, if no building is found in the FreeCAD document, a default one is added. |
| **Add default building if one is not found in the document (no standard)** | |
| | **Warning**: The IFC standard asks for at least one building in each file. By turning this option off, you will produce a non-standard IFC file. |
| | |
| | However, at FreeCAD we believe having a building should not be mandatory, and this option is there to have a chance to demonstrate our point of view. <small>(v0.19)</small> |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | When exporting an IFC file, if no building storey is found in the FreeCAD document, a default one is added. A building storey is not mandatory according to the IFC standard, but it is common practice to have at least one in the file. <small>(v0.19)</small> |
| **Add default building storey if one is not found in the document** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select which units will be used when exporting IFC files. <small>(v0.19)</small> |
| **IFC file units** | |
| | |
+----------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
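The *Join coplanar facets when triangulating* option boils down to a coplanarity test between adjacent facets. A minimal sketch, under the assumption that the two facets already share an edge (so parallel normals imply a common plane); the real exporter logic is more involved:

```python
def triangle_normal(tri):
    """Unnormalized normal of a triangle given as three (x, y, z) points."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = tri
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)

def facets_coplanar(t1, t2, tol=1e-9):
    """True if two edge-sharing facets have parallel normals.

    The tolerance applies to the squared cross product of the
    unnormalized normals, which is fine for unit-scale facets."""
    n1, n2 = triangle_normal(t1), triangle_normal(t2)
    cx = n1[1] * n2[2] - n1[2] * n2[1]
    cy = n1[2] * n2[0] - n1[0] * n2[2]
    cz = n1[0] * n2[1] - n1[1] * n2[0]
    return cx * cx + cy * cy + cz * cz < tol
```

Facets that pass this test can be merged into one larger planar face instead of being written out as separate triangles.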
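One detail behind the *Store IFC unique ID in FreeCAD objects* option: an IFC GlobalId is a 128-bit GUID compressed into 22 characters over a 64-character alphabet. As a rough, self-contained illustration (this is not the exporter's actual code), such an ID can be generated in plain Python:

```python
import uuid

# 64-character alphabet used by the IFC GlobalId encoding
IFC_CHARS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz_$"

def new_ifc_guid():
    """Pack a random 128-bit UUID into the 22-character IFC GlobalId form."""
    num = uuid.uuid4().int
    digits = []
    for _ in range(22):          # 22 base-64 digits cover 132 bits >= 128
        digits.append(IFC_CHARS[num % 64])
        num //= 64
    return "".join(reversed(digits))
```

Because 22 digits encode slightly more than 128 bits, the first character of a valid GlobalId can only be `0`-`3`.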

### INP
INP is the input file format for the FEM software [Abaqus](https://en.wikipedia.org/wiki/Abaqus). It is used for the [CalculiX](FEM_CalculiX.md) solver in the <img alt="" src=images/Workbench_FEM.svg style="width:24px;"> [FEM Workbench](FEM_Workbench.md).
The tab *INP* is only shown in the preferences if the <img alt="" src=images/Workbench_FEM.svg style="width:24px;"> [FEM Workbench](FEM_Workbench.md) has been loaded in the current FreeCAD session.
For the INP format you can specify the following:
+-----------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+===============================================+======================================================================================================================================================================================================+
| | Select which mesh elements should be exported. |
| **Which mesh elements to export** | |
| | If **All** is selected, all elements will be exported. |
| | |
| | If **Highest** is selected, only the highest elements will be exported. This means volumes for a volume mesh and faces for a shell mesh. |
| | |
| | If **FEM** is selected, only FEM elements will be exported. This means only edges not belonging to faces and faces not belonging to volumes. |
+-----------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                               | If checked, mesh groups are exported too. Each constraint, and each material if there are several, consists of two mesh groups: the faces and nodes where the constraint or material is applied.     |
| **Export group data** | |
| | |
+-----------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
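The three modes of *Which mesh elements to export* are essentially a filter over the mesh. A toy sketch (the element records here are invented for the example; the real INP writer operates on FreeCAD mesh objects):

```python
# Toy mesh: each record notes its dimension (3 = volume, 2 = face,
# 1 = edge) and whether it belongs to a higher-dimensional element.
mesh = [
    {"id": 1, "dim": 3, "in_higher": False},  # a volume
    {"id": 2, "dim": 2, "in_higher": True},   # a face of that volume
    {"id": 3, "dim": 2, "in_higher": False},  # a standalone shell face
    {"id": 4, "dim": 1, "in_higher": True},   # an edge of a face
    {"id": 5, "dim": 1, "in_higher": False},  # a standalone edge
]

def export_selection(elements, mode):
    if mode == "All":
        return [e["id"] for e in elements]
    if mode == "Highest":  # volumes of a volume mesh, faces of a shell mesh
        top = max(e["dim"] for e in elements)
        return [e["id"] for e in elements if e["dim"] == top]
    if mode == "FEM":      # only elements not contained in a higher one
        return [e["id"] for e in elements if not e["in_higher"]]
    raise ValueError("unknown mode: %s" % mode)
```

For this toy mesh, **Highest** keeps only the volume, while **FEM** also keeps the standalone face and edge.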

### Mesh Formats
Meshes are a special type of 3D object, composed of triangular faces connected by their [vertices](Glossary#Vertex.md) and edges. They are widely used for [additive manufacturing](https://en.wikipedia.org/wiki/3D_printing). FreeCAD provides the <img alt="" src=images/Workbench_Mesh.svg style="width:24px;"> [Mesh Workbench](Mesh_Workbench.md) to create and handle meshes. FreeCAD supports several mesh file formats.
The tab *Mesh Formats* is only shown in the preferences if the <img alt="" src=images/Workbench_Mesh.svg style="width:24px;"> [Mesh Workbench](Mesh_Workbench.md) has been loaded in the current FreeCAD session.
For the Mesh formats you can specify the following:
+----------------------------------------------------+-----------------------------------------------------------------------------+
| Name | Description |
+====================================================+=============================================================================+
|                                                    | Specification of the maximum deviation between the mesh and the object.     |
| **Maximum mesh deviation** | |
| | |
+----------------------------------------------------+-----------------------------------------------------------------------------+
| | If checked, ZIP compression is used when writing a mesh file in AMF format. |
| **Export AMF files using compression** | |
| | |
+----------------------------------------------------+-----------------------------------------------------------------------------+
| | Width of Asymptote page. <small>(v0.19)</small> |
| **Width** | |
| | |
+----------------------------------------------------+-----------------------------------------------------------------------------+
| | Height of Asymptote page. <small>(v0.19)</small> |
| **Height** | |
| | |
+----------------------------------------------------+-----------------------------------------------------------------------------+
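To see what *Maximum mesh deviation* controls, consider tessellating a circle: each chord deviates from the arc by its sagitta, so the allowed deviation fixes a minimum segment count. A small sketch of that relationship (illustrative only, not FreeCAD's actual tessellator):

```python
import math

def segments_for_deviation(radius, max_deviation):
    """Smallest n such that a regular n-gon inscribed in a circle of the
    given radius deviates from it by at most max_deviation (the sagitta
    of each chord, r * (1 - cos(pi / n)))."""
    half_angle = math.acos(1.0 - max_deviation / radius)
    return max(3, math.ceil(math.pi / half_angle))

def sagitta(radius, n):
    """Deviation of an n-segment approximation from the true circle."""
    return radius * (1.0 - math.cos(math.pi / n))
```

Halving the allowed deviation roughly multiplies the segment count by the square root of two, which is why a small deviation value can inflate mesh file sizes quickly.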

### OCA
The [OCA](http://groups.google.com/group/open_cad_format) file format is a community project to create a free, simple and open CAD file format. OCA is largely based on the GCAD file format generated by [gCAD3D](http://www.gcad3d.org/). Both formats can be imported into FreeCAD, and OCA files exported by FreeCAD can be opened in gCAD3D.
For the OCA format you can specify the following:
+----------------------------------+--------------------------------------------------------+
| Name | Description |
+==================================+========================================================+
| | If checked, the areas (3D faces) will be imported too. |
| **Import OCA areas** | |
| | |
+----------------------------------+--------------------------------------------------------+

### STEP
The [Standard for The Exchange of Product model data](https://en.wikipedia.org/wiki/ISO_10303) (STEP) file format is an ISO standard for the computer-interpretable representation and exchange of product manufacturing information. STEP is commonly used to exchange 3D data between CAD software. STEP files have the {{FileName|.step}} or {{FileName|.stp}} extension. For compressed files the {{FileName|.stpz}} extension is used.
The tab *STEP* is only shown in the preferences if the <img alt="" src=images/Workbench_Part.svg style="width:24px;"> [Part Workbench](Part_Workbench.md), <img alt="" src=images/Workbench_PartDesign.svg style="width:24px;"> [PartDesign Workbench](PartDesign_Workbench.md), or <img alt="" src=images/Workbench_OpenSCAD.svg style="width:24px;"> [OpenSCAD Workbench](OpenSCAD_Workbench.md) has been loaded in the current FreeCAD session.
For the STEP format you can specify the following:
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+=================================================================+==================================================================================================================================================================================================================================================================================+
| | Select which units will be used when exporting STEP files. |
| **Units for export of STEP** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, parametric curves (curves in parametric space of surfaces) will be written into the STEP file. Unchecking the option can be helpful to minimize the size of the resulting STEP file. |
| **Write out curves in parametric space of surface** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                 | Uncheck this to skip invisible objects when exporting, which is useful for CAD applications that do not support STEP invisibility styling. <small>(v0.19)</small>                                                                                                        |
| **Export invisible objects** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Check this option to keep the placement information when exporting a single object. Please note that when re-importing the STEP file, the placement will be encoded into the shape geometry, instead of keeping it inside the Placement property. <small>(v0.19)</small> |
| **Export single object placement** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Use the legacy exporter. <small>(v0.19)</small> |
| **Use legacy exporter** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select the STEP application protocol (AP) to be used for the export. |
| **Scheme** | |
| | **AP 203** is the protocol for configuration controlled 3D designs of mechanical parts and assemblies. |
| | |
| | **AP 214** is the protocol for core data for automotive mechanical design processes. |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, a [compound](Glossary#Compound.md) merge will be done during file reading. This is slower but results in higher details. |
| **Enable STEP Compound merge** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select this to use App::LinkGroup group containers instead of App::Part group containers. <small>(v0.19)</small> |
| **Use LinkGroup** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select this to import invisible objects. <small>(v0.19)</small> |
| **Import invisible objects** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Reduce the number of objects using Link arrays. <small>(v0.19)</small> |
| **Reduce number of objects** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Expand compound shapes with multiple solids. <small>(v0.19)</small> |
| **Expand compound shape** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Show a progress bar when importing. <small>(v0.19)</small> |
| **Show progress bar when importing** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Do not use instance names. Useful for some legacy STEP files with non-meaningful auto-generated instance names. <small>(v0.19)</small> |
| **Ignore instance names** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select the required document structure. <small>(v0.19)</small> |
| **Mode** | |
| | **Single document** |
| | |
| | **Assembly per document** |
| | |
| | **Assembly per document in sub-directory** |
| | |
| | **Object per document** |
| | |
| | **Object per document in sub-directory** |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If not empty, the entered text will be used in the STEP file header for the company. |
| **Company** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If not empty, the entered text will be used in the STEP file header for the author. |
| **Author** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If not empty, the entered text will be used in the STEP file header for the product. |
| **Product** | |
| | |
+-----------------------------------------------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
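Regarding *Units for export of STEP*: FreeCAD stores lengths internally in millimeters, so writing a STEP file in another unit is a plain scale factor. A minimal sketch (the unit table is illustrative, not the exporter's actual list):

```python
# FreeCAD's internal length unit is the millimeter; exporting in a
# different unit only rescales the numeric values in the file.
MM_PER_UNIT = {"mm": 1.0, "m": 1000.0, "inch": 25.4}

def mm_to_unit(value_mm, unit):
    """Convert a length from internal millimeters to the export unit."""
    return value_mm / MM_PER_UNIT[unit]
```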
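The **Company**, **Author** and **Product** texts end up in the `FILE_NAME` entry of the STEP file's header section (ISO 10303-21). A simplified sketch of how such an entry could be assembled (the helper function is hypothetical; the field order follows Part 21):

```python
def step_file_name_record(product, author, company, timestamp):
    """Build a minimal ISO 10303-21 FILE_NAME header entry (sketch).

    Real exporters also fill the preprocessor-version, originating-system
    and authorization fields, which are left empty here."""
    def s(text):
        # Part 21 strings are single-quoted; embedded quotes are doubled.
        return "'" + text.replace("'", "''") + "'"
    return "FILE_NAME(%s,%s,(%s),(%s),'','','');" % (
        s(product), s(timestamp), s(author), s(company))
```

Leaving the preference fields empty simply produces empty strings in the corresponding header slots.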

### SVG
[Scalable Vector Graphics](SVG.md) (SVG) is a [vector image](https://en.wikipedia.org/wiki/Vector_graphics) format for two-dimensional graphics. A vector image can be scaled to any size without losing its shape or details. An SVG image can be converted to bitmap formats like PNG or JPEG for printing.
For the SVG format you can specify the following:
+---------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+=========================================================+==============================================================================================================================================================================================================================+
| | Select how SVG object colors and line widths will be imported. |
| **Import style** | |
| | If **None (fastest)** is selected no color or line width settings will be imported. |
| | |
| | If **Use default color and linewidth** is selected FreeCAD will use its default color and line width. |
| | |
| | If **Original color and linewidth** is selected FreeCAD will use the color and linewidth from the SVG objects. |
+---------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, no unit conversion will occur. One unit in the SVG file will translate as one millimeter. |
| **Disable units scaling** | |
| | |
+---------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Select how Sketches are exported to SVG. |
| **Export style** | |
| | If **Translated (for print & display)** is selected, SVG objects are encapsulated in a group that is scaled and moved to the correct place in the SVG document to fit into a printable area. |
| | |
| | If **Raw (for CAM)** is selected, SVG objects are placed as they are - at the same coordinates as in the FreeCAD model (1:1 export). |
+---------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | If checked, all white lines will appear in black in the SVG for better readability against white backgrounds. |
| **Translate white line color to black** | |
| | |
+---------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| | Versions of [Open CASCADE](Glossary#Open_CASCADE.md) older than version 6.8 don\'t support arc projection. In this case arcs will be discretized into small line segments. This value is the maximum segment length. |
| **Max segment length for discretized arcs** | |
| | |
+---------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
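As a rough sketch of the last setting: the number of segments an arc is split into is at least the arc length divided by the maximum segment length. The radius and segment-length values below are made-up examples, not FreeCAD defaults:

```sh
# Example only: a full circle of radius 10 mm discretized with a
# 2 mm maximum segment length. awk handles the floating-point math.
awk 'BEGIN {
  r = 10; max_len = 2
  pi = 3.14159265358979
  segments = int(2 * pi * r / max_len) + 1
  print segments    # prints 32
}'
```

A smaller maximum segment length therefore trades a larger SVG file for a smoother-looking curve.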

### VTK
The [Visualization Toolkit](https://en.wikipedia.org/wiki/VTK) (VTK) is an open-source, freely available software system for 3D computer graphics, image processing and visualization. VTK files are used by the <img alt="" src=images/Workbench_FEM.svg style="width:24px;"> [FEM Workbench](FEM_Workbench.md) for the [post processing](FEM_Post_Processing_based_on_VTK.md) of simulation results.
The tab *VTK* is only shown in the preferences if the <img alt="" src=images/Workbench_FEM.svg style="width:24px;"> [FEM Workbench](FEM_Workbench.md) has been loaded in the current FreeCAD session.
For the VTK format you can specify the following:
+---------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Name | Description |
+=============================================+=====================================================================================================================================================================================================================================================+
| | Select what objects should be imported. |
| **Which object to import into** | |
| | If **VTK result object** is selected, a FreeCAD FEM VTK result object will be imported (equals to the object which was exported). |
| | |
| | If **FEM mesh object** is selected, the results in the VTK file will be omitted, only the mesh data will be imported and a FreeCAD FEM mesh object will be created. |
| | |
| | If **FreeCAD result object** is selected, the imported data will be converted into a FreeCAD FEM Result object. **Note:** this setting needs the exact result component names and thus it only works properly with VTK files exported from FreeCAD. |
+---------------------------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+

[<img src="images/Property.png" style="width:16px"> Common Questions](Category_Common_Questions.md) [<img src="images/Property.png" style="width:16px"> Preferences](Category_Preferences.md) [<img src="images/Property.png" style="width:16px"> File\_Formats](Category_File_Formats.md)
---
[documentation index](../README.md) > [Common Questions](Category_Common_Questions.md) > [Import](Import_Workbench.md) > Import Export Preferences/pt-br
] | null | null | null | # FFmpeg Compilation Guide for Linux
## Third-party Software Compilation
### nasm
```sh
wget https://www.nasm.us/pub/nasm/releasebuilds/2.15.05/nasm-2.15.05.tar.xz
tar Jxvf nasm-2.15.05.tar.xz
pushd nasm-2.15.05
./configure
make && make install
popd
```
### libx264
```sh
wget https://code.videolan.org/videolan/x264/-/archive/master/x264-master.tar.bz2
tar jxvf x264-master.tar.bz2
pushd x264-master
./configure --enable-shared
make && make install
popd
```
### libx265
```sh
hg clone http://hg.videolan.org/x265
pushd x265/build/linux/
cmake ../../source -DHIGH_BIT_DEPTH=ON
make && make install
popd
```
### libfdk-aac
```sh
wget -O fdk-aac-2.0.2.tar.gz https://sourceforge.net/projects/opencore-amr/files/fdk-aac/fdk-aac-2.0.2.tar.gz/download
tar zxvf fdk-aac-2.0.2.tar.gz
pushd fdk-aac-2.0.2
./configure
make && make install
popd
```
### libmp3lame
```sh
wget -O lame-3.100.tar.gz https://sourceforge.net/projects/lame/files/lame/3.100/lame-3.100.tar.gz/download
tar zxvf lame-3.100.tar.gz
pushd lame-3.100
./configure
make && make install
popd
```
### libvpx
```sh
wget https://github.com/webmproject/libvpx/archive/v1.10.0/libvpx-1.10.0.tar.gz
tar zxvf libvpx-1.10.0.tar.gz
pushd libvpx-1.10.0
sed -i 's/cp -p/cp/' build/make/Makefile
mkdir libvpx-build && cd libvpx-build
../configure --enable-shared --disable-static
make && make install
popd
```
### libopus
```sh
wget https://archive.mozilla.org/pub/opus/opus-1.3.1.tar.gz
tar zxvf opus-1.3.1.tar.gz
pushd opus-1.3.1
./configure --enable-shared --disable-static
make && make install
popd
```
### libaom
```sh
git clone https://aomedia.googlesource.com/aom
pushd aom
git checkout v3.2.0 -b v3.2.0
mkdir build && cd build
cmake .. -DBUILD_SHARED_LIBS=1
make && make install
popd
```
## FFmpeg Compilation
```sh
wget https://www.ffmpeg.org/releases/ffmpeg-4.4.1.tar.bz2
tar jxvf ffmpeg-4.4.1.tar.bz2
pushd ffmpeg-4.4.1
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig/
./configure --prefix=./ffmpeg-4.4.1-$(date +%Y%m%d) --enable-shared --disable-static --enable-gpl --enable-nonfree --enable-libx264 --enable-libx265 --enable-libfdk-aac --enable-libmp3lame --enable-libvpx --enable-libopus --enable-libaom --enable-openssl
make && make install
popd
```
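The `--prefix` flag above embeds the build date through command substitution; here is a small sketch of what that expands to (the actual date portion will differ on your machine):

```sh
# Illustrative only: reproduce the dated install prefix used by ./configure.
prefix="ffmpeg-4.4.1-$(date +%Y%m%d)"
echo "$prefix"    # e.g. ffmpeg-4.4.1-20211115

# The date component is always eight digits (YYYYMMDD).
case "${prefix##*-}" in
  [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]) echo "prefix looks valid" ;;
  *) echo "unexpected prefix" ;;
esac
```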
] | 1 | 2021-05-29T19:48:01.000Z | 2021-05-29T19:48:01.000Z | ---
layout: post
title: "A More Important and More Comprehensive 'Cave Dialogue'"
date: 2008-10-15
author: 阎长贵
from: http://www.yhcqw.com/
tags: [ 炎黄春秋 ]
categories: [ 炎黄春秋 ]
---
[ Yanhuang Chunqiu, 2008, Issue No. 10 — "A More Important and More Comprehensive 'Cave Dialogue'" — Author: Yan Changgui ]
Everyone knows the story of the "Cave Dialogue" (yaodong dui). It usually refers to the conversation between Mao Zedong and Huang Yanpei in July 1945. For ease of discussion, this article retells it briefly.
On July 1, 1945, Huang Yanpei, a standing committee member of the China Democratic League, together with Zhang Bojun, Zuo Shunsheng, Fu Sinian, and others — six members of the People's Political Council in all — accepted an invitation from the Chinese Communist Party and flew from Chongqing to Yan'an. They were warmly welcomed by Mao Zedong, Zhu De, Zhou Enlai, Lin Boqu, Wu Yuzhang, and others. In the space of just three days, Mao Zedong talked with them openly on several occasions. On July 4, Mao invited Huang Yanpei to his home at Yangjialing. In the sitting room of the cave dwelling, Mao asked Huang what impressions he had formed from his several days of observation.
Deeply struck by the vigor of Yan'an, and convinced that after victory in the War of Resistance the democratic revolution would surely triumph across the country, Huang took the opportunity to voice a long-held worry. He said: In my sixty-odd years, setting aside what I have only heard, of what I have seen with my own eyes it is truly a matter of "rising swiftly" and "perishing just as suddenly." A person, a family, an organization, a locality, even a nation — few indeed have escaped the grip of this cyclical law. In the beginning, all are single-minded and no one spares any effort; perhaps in those hard times survival itself must be wrested from a hundred deaths. Then conditions gradually improve and the spirit gradually relaxes. In some cases, with the long passage of time, inertia naturally sets in, spreading from the few to the many until the habit is formed and, however great the effort, the trend can no longer be reversed or remedied. In other cases, as the territory expands step by step — sometimes by natural growth, sometimes driven by the craving for achievement — capable people grow scarce and hard-pressed just as the environment becomes more complex, and control inevitably weakens. History offers examples of decline through indolence in office, of "when the man goes, his policies lapse," and of "seeking glory and reaping disgrace." In short, none have escaped this cyclical law. I have gained some rough understanding of the Communist Party, from its past to its present. My hope is that you will find a new road, to break free of the grip of this cycle.
Smiling but serious, Mao Zedong replied: "We have already found a new road. We can break out of this cycle. That new road is democracy. Only when the people supervise the government will the government not dare to slacken; only when everyone takes responsibility will there be no 'when the man goes, his policies lapse.'"
This "Cave Dialogue" is familiar to all. Yet there is another exchange from the very same period — Mao Zedong's written replies to the questions of Reuters correspondent Campbell — whose value actually surpasses it. These replies, however, seem not to be widely known; they have attracted little attention and circulated far less than the celebrated "Cave Dialogue" between Mao and Huang Yanpei in the Yan'an cave. Note that at the time the written replies were given prominent placement — even the front-page lead — in the Communist Party's own New China Daily (Xinhua Ribao) and Liberation Daily (Jiefang Ribao). On September 27, 1945, the New China Daily printed the replies at the top of its second page under the headline "Comrade Mao Zedong Answers a Reuters Correspondent: China Needs Peace and Nation-Building"; ten days later, on October 8, 1945, the Liberation Daily reprinted the full text as its front-page lead, attributed to "our Chongqing dispatch," under the same headline. There is no doubt that the Party's press attached great importance to these replies. It may be noted in passing that the Selected Works of Mao Zedong published by the Northeast Bookstore in Harbin in 1948 also included them. The replies are extensive — twelve items in all; here we examine only the content that is most essential and most significant, both for that time and for today: the idea of "building a free and democratic new China."
In September 1945, Campbell, the Reuters correspondent stationed in Chongqing, submitted twelve written questions to Mao Zedong, chairman of the CCP Central Committee, who was then in Chongqing for the Kuomintang-Communist negotiations. Campbell's tenth question: what is the CCP's conception and definition of a "free and democratic China"? The eleventh: in a coalition government of all parties, what would be the CCP's policies for construction and recovery? The twelfth: do you favor the nationalization of the army and the abolition of privately held armies? To these three questions Mao replied as follows:
— A "free and democratic China" will be a country in which the governments at every level, up to and including the central government, are produced by universal, equal, and secret-ballot elections, and are responsible to the people who elect them. It will realize Sun Yat-sen's Three Principles of the People, Lincoln's principle of government of the people, by the people, and for the people, and Roosevelt's Four Freedoms (namely, the freedoms of speech and expression, of worship, from want, and from fear, proposed by President Roosevelt during the Second World War). It will guarantee the nation's independence, unity, and integrity, and cooperate with the democratic powers.
— In addition to military and political democratic reforms, the CCP will propose to the government a program of economic and cultural construction. Its aims are chiefly to lighten the people's burdens and improve their livelihood; to carry out land reform and industrialization; to encourage private enterprise (except in sectors of a monopolistic character, which should be run by a democratic government); to welcome foreign investment and develop international trade on the principles of equality and mutual benefit; to spread mass education and eliminate illiteracy; and so on. All of this, too, accords with the teachings of Sun Yat-sen.
— We fully support the nationalization of the army and the abolition of privately held armies; the common precondition of both is the democratization of the state. What is commonly called the "Communist army" is in reality an army voluntarily organized by the Chinese people in wartime that serves solely to defend the homeland. It is a new type of army, entirely different from all of China's old armies that belonged to individuals. Its democratic character supplies valuable experience for the genuine nationalization of China's armed forces, and can serve as a reference for the improvement of China's other armies.
These replies to the Reuters correspondent were not delivered in a Yan'an cave dwelling, but they too express Mao's thinking from his Yan'an — that is, cave-dwelling — period, and they are separated from what people commonly call the "Cave Dialogue," Mao's conversation with Huang Yanpei, by only a little over two months: the one on the eve of victory in the War of Resistance, the other shortly after it. Whether before or after that victory, China faced the question of building a new state — but what kind of new state? The Kuomintang and the Communist Party advocated different answers; this is precisely what Mao, in his opening address to the Seventh Congress, called "the two destinies of China." The two texts are therefore alike, or continuous, in content: both set out the CCP's program for governing and founding the state — "building a free and democratic new China" — and the replies to Campbell are richer in content than the conversation with Huang Yanpei known as the "Cave Dialogue." For the sake of connection and comparison, we may call the replies to the Reuters correspondent "a more important and more comprehensive 'Cave Dialogue.'"
The historical and contemporary significance of these two "Cave Dialogues" is plain, and this article will not elaborate on it; let me only raise a few questions. When we set the two "Cave Dialogues" against the realities after the founding of the People's Republic, what are we to think? In our China, how is the principle of "of the people, by the people, for the people" given expression? How are the ideal of "building a free and democratic new China" and reality to be brought into unity? Further: what do these two "Cave Dialogues" suggest for our thinking and action today, and for reform and opening up in particular? We are forever searching for "coordinates" and "frames of reference" — yet is not the basic program and line that our Party put forward and practiced during the War of Resistance the nearest "coordinate" and "frame of reference" for our program and line today? ... One often hears people denouncing "betrayal" and "forgetting one's ancestors." Let them study these two "Cave Dialogues" carefully and ask: who, in the end, is "betraying" and "forgetting the ancestors"?
(Responsible editor: Yang Jisheng)
] | 4 | 2018-02-04T15:58:04.000Z | 2019-08-29T14:54:14.000Z | ---
layout: post
title: "Linked Crunchbase: A Linked Data API and RDF Data Set About Innovative Companies"
date: 2019-07-19 20:08:47
categories: arXiv_AI
tags: arXiv_AI Knowledge_Graph Knowledge GAN Relation
author: Michael Färber
mathjax: true
---
* content
{:toc}
##### Abstract
Crunchbase is an online platform collecting information about startups and technology companies, including attributes and relations of companies, people, and investments. Data contained in Crunchbase is, to a large extent, not available elsewhere, making Crunchbase a unique data source. In this paper, we present how to bring Crunchbase to the Web of Data so that its data can be used in the machine-readable RDF format by anyone on the Web. First, we give insights into how we developed and hosted a Linked Data API for Crunchbase and how sameAs links to other data sources are integrated. Then, we present our method for crawling RDF data based on this API to build a custom Crunchbase RDF knowledge graph. We created an RDF data set with over 347 million triples, including 781k people, 659k organizations, and 343k investments. Our Crunchbase Linked Data API is available online at <a href="http://linked-crunchbase.org">http://linked-crunchbase.org</a>.
##### Abstract (translated by Google)
##### URL
[http://arxiv.org/abs/1907.08671](http://arxiv.org/abs/1907.08671)
##### PDF
[http://arxiv.org/pdf/1907.08671](http://arxiv.org/pdf/1907.08671)
] | 1 | 2020-04-07T15:44:38.000Z | 2020-04-07T15:44:38.000Z | # Streams
### An Open-Source, IndieWeb-friendly Publishing Platform
The following is a list of notes regarding the use and configuration of Streams.
## General Requirements
You will need:
* a web server running Apache 2.4.x and PHP 7.0 or newer
* MySQL 8.0 or above (MySQL 5.x and MariaDB 10.x are feasible, but not supported)
## LAMP Configuration Notes
### Linux Notes
This code has been tested to run on Ubuntu Server 16.04 LTS, 18.04 LTS, and 20.04 LTS. That said, it should run on any version of Linux released in the last 5 years. Your mileage may vary. Test often. Test well.
### Apache Notes
The following modules must be loaded:
* mod-php
* mod-rewrite
* mod-headers
### MySQL Notes
MySQL 8.0 is the database engine used for all testing, development, and deployment. The tables are all configured with InnoDB. Other database engines such as XtraDB have not been tested, so reliability is unknown. Avoid using MyISAM as this engine has been deprecated and is not ideal for highly concurrent environments.
### PHP Notes
The following modules are required:
* mbstring
* dev
* xml
* json
* mysql
* gd
* curl
* pear
### Other Setup Requirements
In addition to the basic LAMP stack, the following items need to be taken into account.
* the `htaccess` file in `/public` must be renamed `.htaccess`
* Apache must be configured to honour the `.htaccess` overrides
* Streams can use Amazon S3 storage for files, but is off by default
* Streams can enforce HTTPS redirects (and ideally should use it)
* Streams is designed to run on servers with as little as 1GB RAM
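The rename in the first bullet is a single `mv`; the sketch below uses a temporary directory in place of a real Streams checkout (the paths are illustrative only):

```sh
# Stand-in for the web root; in a real install this is the Streams /public folder.
site=$(mktemp -d)
mkdir -p "$site/public"
touch "$site/public/htaccess"

# The actual step: rename htaccess to .htaccess so Apache reads the overrides.
mv "$site/public/htaccess" "$site/public/.htaccess"

ls -A "$site/public"    # prints .htaccess
```

Remember that Apache only honours the file if `AllowOverride` is enabled for that directory, per the second bullet.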
### Basic Web Server -- Minimum Recommended
* Ubuntu Server 20.04 LTS
* Dual-Core CPU (x86/x64/ARM)
* 2GB RAM
* 10GB Storage
### Windows Configuration Notes
It is not recommended that Streams run on Windows in a WAMP-like fashion. It has not been tested and, as of this writing, will not be supported.
### Optional Components
There are some optional pieces to the puzzle that might make things a little better. These things include:
* something to drink
* good music
* a faithful dog | 28.805556 | 320 | 0.753616 | eng_Latn | 0.998347 |
2f1c7e3821484d0e07a6c316eea6bd508340de6d | 3,095 | md | Markdown | articles/virtual-machines/windows/key-vault-setup.md | BEEZY7421/azure-docs | ae3d707f1fe68ba5d7d206be1ca82958f12751e8 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-04-13T17:30:39.000Z | 2020-04-13T17:30:39.000Z | articles/virtual-machines/windows/key-vault-setup.md | BEEZY7421/azure-docs | ae3d707f1fe68ba5d7d206be1ca82958f12751e8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/windows/key-vault-setup.md | BEEZY7421/azure-docs | ae3d707f1fe68ba5d7d206be1ca82958f12751e8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Set up Key Vault for Windows VMs in Azure Resource Manager
description: How to set up Key Vault for use with an Azure Resource Manager virtual machine.
services: virtual-machines-windows
documentationcenter: ''
author: mimckitt
manager: vashan
editor: ''
tags: azure-resource-manager
ms.assetid: 33a483e2-cfbc-4c62-a588-5d9fd52491e2
ms.service: virtual-machines-windows
ms.workload: infrastructure-services
ms.tgt_pltfrm: vm-windows
ms.topic: article
ms.date: 01/24/2017
ms.author: mimckitt
---
# Set up Key Vault for virtual machines in Azure Resource Manager
[!INCLUDE [learn-about-deployment-models](../../../includes/learn-about-deployment-models-rm-include.md)]
In Azure Resource Manager stack, secrets/certificates are modeled as resources that are provided by the resource provider of Key Vault. To learn more about Key Vault, see [What is Azure Key Vault?](../../key-vault/key-vault-overview.md)
> [!NOTE]
> 1. In order for Key Vault to be used with Azure Resource Manager virtual machines, the **EnabledForDeployment** property on Key Vault must be set to true. You can do this in various clients.
> 2. The Key Vault needs to be created in the same subscription and location as the Virtual Machine.
>
>
## Use PowerShell to set up Key Vault
To create a key vault by using PowerShell, see [Set and retrieve a secret from Azure Key Vault using PowerShell](../../key-vault/quick-create-powershell.md).
For new key vaults, you can use this PowerShell cmdlet:
New-AzKeyVault -VaultName 'ContosoKeyVault' -ResourceGroupName 'ContosoResourceGroup' -Location 'East Asia' -EnabledForDeployment
For existing key vaults, you can use this PowerShell cmdlet:
Set-AzKeyVaultAccessPolicy -VaultName 'ContosoKeyVault' -EnabledForDeployment
## Use CLI to set up Key Vault
To create a key vault by using the command-line interface (CLI), see [Manage Key Vault using CLI](../../key-vault/key-vault-manage-with-cli2.md#create-a-key-vault).
For CLI, you have to create the key vault before you assign the deployment policy. You can do this by using the following command:
az keyvault create --name "ContosoKeyVault" --resource-group "ContosoResourceGroup" --location "EastAsia"
Then to enable Key Vault for use with template deployment, run the following command:
az keyvault update --name "ContosoKeyVault" --resource-group "ContosoResourceGroup" --enabled-for-deployment "true"
## Use templates to set up Key Vault
While you use a template, you need to set the `enabledForDeployment` property to `true` for the Key Vault resource.
{
"type": "Microsoft.KeyVault/vaults",
"name": "ContosoKeyVault",
"apiVersion": "2015-06-01",
"location": "<location-of-key-vault>",
"properties": {
"enabledForDeployment": "true",
....
....
}
}
For other options that you can configure when you create a key vault by using templates, see [Create a key vault](https://azure.microsoft.com/documentation/templates/101-key-vault-create/).
| 43.591549 | 237 | 0.729887 | eng_Latn | 0.811457 |
2f1e09d1a9f5bd2170f15239b64e1d137a097ab5 | 4,125 | md | Markdown | docs/vs-2015/python/getting-started-with-ptvs-building-a-website-in-azure.md | Birgos/visualstudio-docs.de-de | 64595418a3cea245bd45cd3a39645f6e90cfacc9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/python/getting-started-with-ptvs-building-a-website-in-azure.md | Birgos/visualstudio-docs.de-de | 64595418a3cea245bd45cd3a39645f6e90cfacc9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/python/getting-started-with-ptvs-building-a-website-in-azure.md | Birgos/visualstudio-docs.de-de | 64595418a3cea245bd45cd3a39645f6e90cfacc9 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Erste Schritte mit PTVS: Erstellen einer Website in Azure | Microsoft-Dokumentation'
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.technology: vs-python
ms.topic: conceptual
ms.assetid: 3bdbda36-14d2-4fde-ba42-d91042777ff6
caps.latest.revision: 5
author: kraigb
ms.author: kraigb
manager: jillfra
ms.openlocfilehash: 61f8748a3874f32db9c235d03b6b7464bc5cecf1
ms.sourcegitcommit: 8b538eea125241e9d6d8b7297b72a66faa9a4a47
ms.translationtype: MTE95
ms.contentlocale: de-DE
ms.lasthandoff: 01/23/2019
ms.locfileid: "54783195"
---
# <a name="getting-started-with-ptvs-building-a-website-in-azure"></a>Erste Schritte mit PTVS: Erstellen einer Website in Azure
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
Beginnen Sie ganz schnell und einfach damit, eine Python-Website in Azure zu erstellen.
Sie können diese Anweisungen in einem sehr kurzen [YouTube-Video](https://www.youtube.com/watch?v=FJx5mutt1uk&list=PLReL099Y5nRdLgGAdrb_YeTdEnd23s6Ff&index=6) ansehen.
Starten Sie mit dem neuen Projekt... Dialogfeld und wählen Sie unter Python Projekten das Bottle-Webprojekt. Dies [Bottle](http://bottlepy.org/docs/dev/index.html) Vorlage ist eine Startwebsite, die auf der Grundlage der [Bootstrap-Framework](http://getbootstrap.com/). Wenn Sie das Projekt erstellen, fordert Visual Studio Sie auf, Abhängigkeiten (in diesem Fall "Bottle") in einer virtuellen Umgebung installieren. Da Sie die Bereitstellung für eine Azure-Website durchführen, müssen Sie die Abhängigkeiten in einer virtuellen Umgebung hinzufügen, um den ordnungsgemäßen Betrieb Ihrer Website zu gewährleisten. Sie müssen Ihre Umgebung auf Python 2.7 oder 3.4 (32 Bit) basieren. Nachdem Sie das Projekt erstellt haben, drücken Sie auf F5, um die Website lokal auszuführen.
Es ist ganz einfach, die Website in Azure auszuprobieren. Wenn Sie nicht über ein Azure-Abonnement verfügen, können Sie [try.azurewebsites.net](https://trywebsites.azurewebsites.net/). Diese Website bietet eine einfache Möglichkeit, Azure-Websites eine Stunde lang auszuprobieren. Die Anmeldung kann einfach mit den Anmeldedaten für ein soziales Netzwerk erfolgen. Sie benötigen keine Kreditkarte. Wählen Sie in der Dropdownliste "Sprache ändern" die leere Websitevorlage aus, und wählen Sie "Erstellen". Wählen Sie unter "Arbeiten mit der Webanwendung" die Option "Veröffentlichungsprofil herunterladen", und speichern Sie die Datei zur Verwendung mit Visual Studio. Sie können die Bereitstellung auch über Git aus einem anderen Betriebssystem durchführen.
Wechseln Sie zurück zu Visual Studio und dem von Ihnen erstellten Projekt. Wählen Sie im Projektmappen-Explorer den Projektknoten aus, klicken Sie auf die Schaltfläche mit dem Zeiger nach rechts, und wählen Sie "Veröffentlichen" aus. Wenn Sie über ein Azure-Abonnement verfügen, können Sie im Dialogfeld auf "Microsoft Azure-Websites" klicken, um Ihre Websites von dort zu verwalten. Wählen Sie für diese exemplarische Vorgehensweise die Option "Importieren" aus, um das Veröffentlichungsprofil zu importieren, das Sie gerade heruntergeladen haben. Da das Veröffentlichungsprofil alle erforderlichen Informationen enthält, können Sie die Option "Veröffentlichen" wählen. In wenigen Sekunden wird ein neues Browserfenster geöffnet, und Ihre Website ist live in der Azure-Cloud gehostet.
Einfache Websites sind einfach, aber Informationen auf wichtigere Webanwendungen in Azure finden Sie in der [tieferer Einblick in](https://www.youtube.com/watch?v=WG3pGmoo8nE&list=PLReL099Y5nRdLgGAdrb_YeTdEnd23s6Ff&index=10) video sowie andere in diesem Kanal (Link in der oberen rechten Ecke der ersten Schritte oder deep-Dive videoseite sowie unten .
Sie können diese Anweisungen in einem sehr kurzen [YouTube-Video](https://www.youtube.com/watch?v=FJx5mutt1uk&list=PLReL099Y5nRdLgGAdrb_YeTdEnd23s6Ff&index=6) ansehen.
## <a name="see-also"></a>Siehe auch
[Wiki-Dokumentation](https://github.com/Microsoft/PTVS/wiki/Web-Project)
[PTVS-Videos: Einstieg und ausführliche Erläuterungen](https://www.youtube.com/playlist?list=PLReL099Y5nRdLgGAdrb_YeTdEnd23s6Ff)
| 105.769231 | 793 | 0.807758 | deu_Latn | 0.98854 |
2f1e6fe4c28c7a5f7d213fb265bc9a89e2dc6949 | 6,331 | md | Markdown | Shelley-testnet/solutions/Exercise-3-solutions.md | martinrichterlondon/cardano-tutorials | 3c69ac37a039e5327ba0c4c7858797c31047ecfb | [
"Apache-2.0"
] | 128 | 2020-03-23T21:10:04.000Z | 2022-03-04T01:53:29.000Z | Shelley-testnet/solutions/Exercise-3-solutions.md | martinrichterlondon/cardano-tutorials | 3c69ac37a039e5327ba0c4c7858797c31047ecfb | [
"Apache-2.0"
] | 131 | 2020-05-10T00:09:59.000Z | 2020-07-10T15:58:43.000Z | Shelley-testnet/solutions/Exercise-3-solutions.md | martinrichterlondon/cardano-tutorials | 3c69ac37a039e5327ba0c4c7858797c31047ecfb | [
"Apache-2.0"
] | 134 | 2020-03-23T21:10:42.000Z | 2022-01-19T07:09:43.000Z | # Shelley Stakepool Exercise 3
## Objectives
* Set up the keys for a relay and a stake pool;
* Handle the Key Evolving Signature Scheme;
* Start the relay and the stake pool.
A good setup for a stake pool is to have (at least) one block-producing node connected __(only)__ to at least one realay node under the control of the stake pool operator, and each relay node connected to other realy nodes in the network. Each node should run on a separate server.

First, we need to setup our __block-producing node__. You can build the node from source or maintain a single build on your relay and copy the binaries to your block-producing node. Just make sure you have consistent versions across them.
### Basic block-producing node firewall configuration:
* Make sure you can only login with SSH Keys, not password.
* Make sure to setup SSH connections in a port different than the the default 22
* Make sure to configure the firewall to only allow connections from your relay nodes by setting up their ip addresses.
### Basic relay node firewall configuration:
* Make sure you can only login with SSH Keys, not password.
* Make sure to setup SSH connections in a port different than the default 22.
* Make sure you only have the strictly necessary ports opened.
## Crating keys for our block-producing node
(This assumes you already have the binaries on your server.)
Our __block-producing node__ or __pool node__ needs a __VRF__ Key pair, a __KES__ Key pair, a __Cold__ key pair and an __Operational Certificate__
Let's establish an SSH connection with our server
ssh -i ~/.ssh/id_rsa <USER>@<PUBLIC IP> -p <SSH PORT>
We are in, now, let's create a directory to store the keys that we will generate:
mkdir keys
cd keys
Now, generate the KES Key pair
cardano-cli shelley node key-gen-KES \
--verification-key-file kes.vkey \
--signing-key-file kes.skey
and our VRF Key pair
cardano-cli shelley node key-gen-VRF \
--verification-key-file vrf.vkey \
--signing-key-file vrf.skey
It is time to generate our __Cold__ Keys and a __Cold_counter__ file:
cardano-cli shelley node key-gen \
--cold-verification-key-file cold.vkey \
--cold-signing-key-file cold.skey \
--operational-certificate-issue-counter coldcounter
Finnaly, we can generate our __Operational Certificate__
To tho that, first We need to know the slots per KES period, we get it from the genesis file:
cat shelley_testnet-genesis.json | grep KESPeriod
> "slotsPerKESPeriod": 3600,
So one period lasts 3600 slots. What is the current tip of the blockchain?,
We can use your relay node (from Exercise 2) to query the tip:
export CARDANO_NODE_SOCKET_PATH=path/to/node-socket
cardano-cli shelley query tip --testnet-magic 42
> Tip (SlotNo {unSlotNo = 432571}) ...
Look for Tip `unSlotNo` value. In this example we are on slot 432571. So in this example we have KES period is 120:
expr 432571 / 3600
> 120
With this information we can generate our opertional certificate:
cardano-cli shelley node issue-op-cert \
--kes-verification-key-file kes.vkey \
--cold-signing-key-file cold.skey \
--operational-certificate-issue-counter coldcounter \
--kes-period 120 \
--out-file opcert
## Copy the keys to a secure storage.
Lets copy our keys to our local machine and from there to cold storage.From another terminal in your __local machine__ do:
scp -rv -P<SSH PORT> -i ~/.ssh/id_rsa <USER>@<PUBLIC IP>:~/keys ~/pool-keys
> Transferred: sent 3220, received 6012 bytes, in 1.2 seconds
Bytes per second: sent 2606.6, received 4866.8
debug1: Exit status 0
And verify that the files are there:
ls pool-keys/keys
> coldcounter cold.skey cold.vkey kes.skey kes.vkey vrf.skey vrf.vkey
__NOTE__ The best place for your cold keys is a SECURE USB or other SECURE EXTERNAL DEVICE, not a computer with internet access. So, move your cold keys to cold storage and delete the files from your local machine.
### Delete the Cold Keys from the server.
Back to our __server__, now we can delete the __Cold Keys__ from here:
rm cold*
And verify that they are gone:
ls
> kes.skey kes.vkey vrf.skey vrf.vkey opcert
### Configure topology files for block-producing and relay nodes.
Get the configuration files for your block-producing node if you dont have them already:
mkdir config-files
cd config-files
wget https://hydra.iohk.io/job/Cardano/cardano-node/cardano-deployment/latest-finished/download/1/shelley_testnet-config.json
wget https://hydra.iohk.io/job/Cardano/cardano-node/cardano-deployment/latest-finished/download/1/shelley_testnet-genesis.json
wget https://hydra.iohk.io/job/Cardano/cardano-node/cardano-deployment/latest-finished/download/1/shelley_testnet-topology.json
Lets make our __block-producing__ node to "talk" only to our relay node
nano shelley_testnet-topology.json
{
"Producers": [
{
"addr": "<RELAY NODE PUBLIC IP",
"port": 3001,
"valency": 1
}
]
}
No we open an SSH connection to our __Relay node__ (if you haven't already) and add our __Block-producing node__ to the topology file.
This is a good moment to add also other relay nodes in the network.
nano ff-topology.j son
{
"Producers": [
{
"addr": "<BLOCK-PRODUCING NODE IP",
"port": 3001,
"valency": 1
},
{
"addr": "<IP ADDRESS>",
"port": <PORT>,
"valency": 1
},
{
"addr": "<IP ADDRESS>",
"port": <PORT>,
"valency": 1
}
]
}
### Start your system
First we restart our __relay node__ with:
~$ cardano-node run \
--topology path/to/shelley_testnet-topology.json \
--database-path path/to/db \
--socket-path path/to/db/node.socket \
--host-addr <PUBLIC IP> \
--port 3001 \
--config path/to/shelley_testnet-config.json
then, we start our __block producing__ node with:
~$ cardano-node run \
--topology keys/shelley_testnet-topology.json \
--database-path /db \
--socket-path /db/node.socket \
--host-addr <PUBLIC IP> \
--port 3001 \
--config config-files/config.json
--shelley-kes-key keys/kes.skey
--shelley-vrf-key keys/vrf.skey
--shelley-operational-certificate keys/opcert
And we are done !
| 31.655 | 281 | 0.719949 | eng_Latn | 0.964153 |
2f1e85e0d1a2c30cedc13fc71aaf62471138c8dd | 6,628 | md | Markdown | commercial/GA1/1.0commerical.services-imaging-overview.md | hphelion/documentation.md | aa334b053001ba82883875f63725e69a54fd35e5 | [
"Apache-2.0"
] | null | null | null | commercial/GA1/1.0commerical.services-imaging-overview.md | hphelion/documentation.md | aa334b053001ba82883875f63725e69a54fd35e5 | [
"Apache-2.0"
] | null | null | null | commercial/GA1/1.0commerical.services-imaging-overview.md | hphelion/documentation.md | aa334b053001ba82883875f63725e69a54fd35e5 | [
"Apache-2.0"
] | null | null | null | ---
layout: 1.0default
title: "HP Helion OpenStack® 1.0: Image Operations (Glance) Service Overview"
permalink: /helion/openstack/services/imaging/overview/
product: commercial.ga1.0
---
<!--PUBLISHED-->
<script>
function PageRefresh {
onLoad="window.refresh"
}
PageRefresh();
</script>
<!--
<p style="font-size: small;"> <a href="/helion/openstack/services/identity/overview/">◀ PREV</a> | <a href="/helion/openstack/services/overview/">▲ UP</a> | <a href="/helion/openstack/services/networking/overview/"> NEXT ▶</a> </p>
-->
# HP Helion OpenStack® 1.0: Image Operations (Glance) Service Overview #
[See the Helion OpenStack 1.1 version of this page](/helion/openstack/1.1/services/imaging/overview/)
Based on OpenStack Glance, the HP Helion OpenStack Image Operations service is a web service for managing virtual machine images. It provides a set of RESTful APIs that enables querying/updating of image metadata as well as retrieval of the actual image data.
With the Image Operations service you can work with image files, which are virtual disk image files that the [HP Helion Compute](/helion/openstack/services/compute/overview) service can use to create a virtual machine.
Each image has a name, an unique identifier (UUIDs in hexadecimal string notation) and a specific disc and container format.
## Key terms
- **Image files** -- An image file refers to a virtual disk image file that the Compute service can load up to create a virtual machine.
- **Private image** -- An image that is available to all users in the project.
- **Public image** -- An image that is available to all users across all projects in a domain.
- **Image metadata** -- The metadata of an image includes all the information about the image, for example: image identifier, name, status, size, disk format, container format, owner, and custom properties.
- **Name** -- The name of the image. Note that the name of an image is not unique.
- **Identifier** -- A unique string that identifies an image. Identifiers are UUIDs, commonly represented in hexadecimal string notation.
- **Status** -- The current state of an image. Images can be in one of the following statuses: queued, saving, active, killed, deleted, or pending_delete.
- **Disk and container format** -- When adding an image to the Image Operations service, specifying the image disk format and container format is required.
The disk format of a virtual machine image is the format of the underlying disk image. Virtual appliance vendors have different formats for laying out the information contained in a virtual machine disk image.
The container format refers to whether the virtual machine image is in a file format that also contains metadata about the actual virtual machine.
- **Size** -- The image size in bytes.
- **Checksum** -- The MD5 checksum of the image file data.
- **Minimum RAM** -- The minimum RAM required, in megabytes, to run this image on an HP Compute server. Refer to the HP Compute documentation for more details about this property.
- **Minimum Disk** -- The minimum disk space required, in gigabytes, to run this image on an HP Compute server.
- **Owner** -- The owner of an image, usually the project ID of the authenticated user adding the image.
- **Custom image properties** -- A set of custom, free-form image properties stored with the image metadata.
## Working with the Image Operations Service
To [perform tasks using the Image Operations service](#howto), you can use the dashboard, API or CLI.
### Using the dashboards<a name="UI"></a>
You can use the [HP Helion OpenStack Dashboard](/helion/openstack/dashboard/how-works/) to work with the Image Operations service.
### Using the API<a name="API"></a>
You can use low-level, raw REST API access to the HP Image Operations service. See the [OpenStack Image Service API v2.0 Reference](http://developer.openstack.org/api-ref-image-v2.html).
### Using the CLI<a name="cli"></a>
You can use any of several command-line clients to access the HP Image service. See the [OpenStack Command Line Interface Reference](http://docs.openstack.org/cli-reference/content/glanceclient_commands.html).
For more information on installing the CLI, see [Install the OpenStack command-line clients](http://docs.openstack.org/user-guide/content/install_clients.html).
<!--
## How To's with the HP Helion OpenStack Image Operations Service<a name="howto"></a>
Taken from http://wiki.hpcloud.net/display/core/Core+Edition+Use+cases#CoreEditionUsecases-OverCloud
The following lists of tasks can be performed by a user or administrator through the [HP Helion OpenStack Dashboard](/helion/openstack/dashboard/how-works/), the OpenStack [CLI](
http://docs.openstack.org/cli-reference/content/glanceclient_commands.html) or OpenStack [API](http://developer.openstack.org/api-ref-image-v2.html).
### Tasks performed by users<a name="user"></a>
The following Image Operations service tasks are usually performed by someone with the *user* role.
#### Creating, modifying and deleting a private image ####
Use the Image Operations service to create, delete, or modify a private instance images.
#### Adding and updating metadata for a private image ####
Use the Image Operations service to modify the metadata for a private image.
### Tasks performed by an Administrator<a name="admin"></a>
The following Image Operations service tasks are usually performed by someone with the *administrator* role.
#### Creating, modifying, and deleting an image ####
Use the Image Operations service to create, delete, or modify a public instance image.
#### Adding and updating metadata for an image ####
Use the Image Operations service to modify the metadata for a private image. -->
## For more information ##
For information on how to operate your cloud we suggest you read the [OpenStack Operations Guide](http://docs.openstack.org/ops/). The *Architecture* section contains useful information about how an OpenStack Cloud is put together. However, the HP Helion OpenStack takes care of these details for you. The *Operations* section contains information on how to manage the system.
<!-- hide me Also see the Help topics that are available in the Operational Dashboard and Administration Dashboard. Website copies are available:
* [HP Helion OpenStack Operational Dashboard Help](/helion/openstack/manage/operational-dashboard/)
* [HP Helion OpenStack Administration Dashboard Help](/helion/openstack/manage/administration-dashboard/) -->
<a href="#top" style="padding:14px 0px 14px 0px; text-decoration: none;"> Return to Top ↑ </a>
----
| 50.984615 | 376 | 0.760712 | eng_Latn | 0.961155 |
2f1eca07a6f524fd2aaff6d7f32f2dc100ce8a84 | 2,204 | md | Markdown | _posts/2021-06-11-Week-1-June-07--To-June-11.md | Darshpreet2000/Code-To-Help | 6c0bec30e8857250c3f57e208dfdab5d5a59361b | [
"Apache-2.0"
] | null | null | null | _posts/2021-06-11-Week-1-June-07--To-June-11.md | Darshpreet2000/Code-To-Help | 6c0bec30e8857250c3f57e208dfdab5d5a59361b | [
"Apache-2.0"
] | null | null | null | _posts/2021-06-11-Week-1-June-07--To-June-11.md | Darshpreet2000/Code-To-Help | 6c0bec30e8857250c3f57e208dfdab5d5a59361b | [
"Apache-2.0"
] | null | null | null | ---
keywords: fastai
description: "Developing UI of Register Baby Screen & Making App Drawer"
title: "Coding Period: Week 1 June 07 To June 11"
toc: false
branch: master
badges: true
comments: true
hide: false
search_exclude: true
layout: post
---
### Developing UI of Register Baby Screen
I worked on creating UI of register baby screen, I have used the following packages
- Flutter Customizable Slider for weight, temperature input [(Link)](https://pub.dev/packages/syncfusion_flutter_sliders)
- Flutter Date Time Picker to pick baby's birth date & time [(Link)](https://pub.dev/packages/flutter_datetime_picker)
- Flutter Bloc Architecture to handle state management [(Link)](https://pub.dev/packages/flutter_bloc)
I created toggle buttons for entering the baby's information, such as gender and mode of delivery. These toggle buttons use boolean variables for their state management.
I created a model class for storing the baby's health information. An object of this class is created and populated when the user enters data on the register baby screen.
### Creating Navigation Drawer in App
I added a navigation drawer to the app. It has a header with LibreHealth's logo and the following items:
- Doctor’s Schedule
- Messaging
- About
- Share App
- Report Bug
### Screen Shots
<table style="width:100%">
<tr>
<td><img src="https://raw.githubusercontent.com/Darshpreet2000/My-Blog/master/images/week1c.png" height="500" width="250"></td>
</tr>
<tr>
<td><img src="https://raw.githubusercontent.com/Darshpreet2000/My-Blog/master/images/week1a.png" height="500" width="250"></td>
</tr>
<tr>
<td><img src="https://raw.githubusercontent.com/Darshpreet2000/My-Blog/master/images/week1b.png" height="500" width="250"></td>
</tr>
</table>
### What progress did I make this week?
- Developed UI of Register Baby Screen
- Created Sliders for weight, temperature etc.
- Created Model Class for storing Baby's Information
- Created Bloc for register baby screen
- Created App Drawer with items ( Doctor’s Schedule, Messaging, Share App, Report Bug)
### What have I planned for next week?
- Developing Assessments Screen UI for all stages
- Using Sliver App Bar for floating app bars
- Adding Dark Mode in App
| 32.895522 | 168 | 0.751815 | eng_Latn | 0.893151 |
2f1ecc4cc92939a3da568cbf7b7cc55a7af71f10 | 958 | md | Markdown | README.md | aubrian-halili/object-assign-deep | 6d5fc445931a08cc7bc0f588346b96175f211465 | [
"MIT"
] | null | null | null | README.md | aubrian-halili/object-assign-deep | 6d5fc445931a08cc7bc0f588346b96175f211465 | [
"MIT"
] | null | null | null | README.md | aubrian-halili/object-assign-deep | 6d5fc445931a08cc7bc0f588346b96175f211465 | [
"MIT"
] | null | null | null | # assign-deep [![NPM version][npm-image]][npm-url] [![Build Status][travis-image]][travis-url] [![Dependency Status][daviddm-image]][daviddm-url] [![Coverage percentage][coveralls-image]][coveralls-url]
> Object assign deep
## Installation
```sh
$ npm install --save assign-deep
```
## Usage
```js
var assignDeep = require('assign-deep');

// Illustrative example: deep-merge the source object into the target,
// so nested objects are merged rather than replaced wholesale.
var target = { a: { b: 1 } };
assignDeep(target, { a: { c: 2 } });
//=> target is now { a: { b: 1, c: 2 } }
```
## License
MIT © [Aubrian Halili]()
[npm-image]: https://badge.fury.io/js/assign-deep.svg
[npm-url]: https://npmjs.org/package/assign-deep
[travis-image]: https://travis-ci.org/aubrian-halili/assign-deep.svg?branch=master
[travis-url]: https://travis-ci.org/aubrian-halili/assign-deep
[daviddm-image]: https://david-dm.org/aubrian-halili/assign-deep.svg?theme=shields.io
[daviddm-url]: https://david-dm.org/aubrian-halili/assign-deep
[coveralls-image]: https://coveralls.io/repos/aubrian-halili/assign-deep/badge.svg
[coveralls-url]: https://coveralls.io/r/aubrian-halili/assign-deep
| 31.933333 | 202 | 0.729645 | kor_Hang | 0.224196 |
2f1fecff4bd3d463f9505540ace40695f522d102 | 4,098 | md | Markdown | README.md | kianmeng/erlang | 38857fdc08b360733992c8841c43187af0b6d646 | [
"MIT"
] | 79 | 2017-06-23T15:05:50.000Z | 2022-03-12T01:27:50.000Z | README.md | kianmeng/erlang | 38857fdc08b360733992c8841c43187af0b6d646 | [
"MIT"
] | 160 | 2017-06-18T20:57:57.000Z | 2022-02-23T13:18:33.000Z | README.md | kianmeng/erlang | 38857fdc08b360733992c8841c43187af0b6d646 | [
"MIT"
] | 65 | 2017-06-21T16:07:01.000Z | 2022-03-25T15:52:17.000Z | # Exercism Erlang Track
[](https://gitter.im/exercism/erlang?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
Exercism exercises in Erlang
## Contributing guide
For general information about how exercism works, please see the
[contributing guide](https://github.com/exercism/x-api/blob/master/CONTRIBUTING.md#the-exercise-data).
If you create “claiming” PRs with obviously unfinished code, please provide an estimate in the PR description of when you will continue to work on the PR or when you think it will be finished.
### Setting up your system for local development on the track
Please make sure you have installed erlang/OTP and `rebar3` as
described on [Installing Erlang](http://exercism.io/languages/erlang/installation)
or `docs/INSTALLATION.md` in this repository. Also run
`bin/fetch-configlet` to download the JSON-checker.
Please make sure you use one of the releases of erlang/OTP as
specified in `.github/workflows/main.yml` (see the
`jobs.test_erlang.strategy.matrix.otp` key), as these are the ones
officially tested and supported by this track.
Feel free to use any feature that was introduced in the oldest version
of the range, while also avoiding everything that has been removed or
deprecated in the newest one.
### Implementing an exercise
When there is a mention of "slug-name", it refers to the slug as used
on exercism URLs. In contrast, "erlangified_slug_name" is the slug-name
with all dashes (`-`) replaced by underscores (`_`) to make the name
compatible with Erlang syntax.
1. Create a folder `exercises/<slug-name>`.
2. Set up folder structure (`src`, and `test`).
3. Copy `rebar.config` and `src/*.app.src` from another exercise.
1. Leave `rebar.config` unchanged.
1. Rename `src/*.app.src` to `src/<erlangified_slug_name>.app.src`.
1. On the first line of this file change the old erlangified_slug_name to the new one.
1. On the second line change the old slug-name to the new one.
4. In the `src`-folder, create two files: `example.erl` and `<erlangified_slug_name>.erl`.
The first is for your example solution, the second is the 'empty' solution to give
students a place to start.
You might take the files from another exercise as your starting point.
Ensure their module names match their (new) file names.
5. In the `test`-folder, create one file: `<erlangified_slug_name>_tests.erl`
and insert the boilerplate code shown below.
This file is for the test cases.
6. Implement/correct your solution in `src/example.erl`.
7. Add tests to `<erlangified_slug_name>_tests.erl`.
8. Run tests using `rebar3 eunit`.
Repeat steps 6, 7, and 8 until all tests are implemented and your
example solution passes them all.
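For orientation, a minimal `src/<erlangified_slug_name>.app.src` (shown here for a hypothetical `two_fer` exercise) looks roughly like this — copy one from an existing exercise and adjust the names rather than writing it from scratch:

```erlang
{application, two_fer,
 [{description, "Exercism exercise: two-fer"},
  {vsn, "0.0.1"},
  {registered, []},
  {applications, [kernel, stdlib]},
  {env, []},
  {modules, []}]}.
```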
If there is a `exercises/<slug-name>/canonical-data.json`
in [problem-specifications](https://github.com/exercism/problem-specifications), make sure to
implement your tests and examples in a way that the canonical data is
integrated and not violated.
You may add further tests, as long as they do not violate canonical
data and add value to the exercise or are necessary for erlang
specific things.
Also please make sure to add a `HINTS.md` with some hints for the
students if the exercise becomes tricky or might not be obvious.
```erl
-module(<test module name>).
-include_lib("erl_exercism/include/exercism.hrl").
-include_lib("eunit/include/eunit.hrl").
```
You will need to add an entry for the exercise in the track's `config.json` file,
which you will find in the repository's root directory (two levels up).
For details see [Exercise configuration](https://github.com/exercism/docs/blob/master/language-tracks/configuration/exercises.md).
### Before pushing
Please make sure, that all tests pass by running
`_test/check-exercises.escript`. On Windows you might need to call
`escript _test/check-exercises.escript`. Also a run of `bin/configlet lint`
should pass without error message.
Both programs will be run on CI and a merge is unlikely if tests fail.
| 45.533333 | 210 | 0.766471 | eng_Latn | 0.996379 |
2f2124d40e5afcbfd3579aac50ff2676a13ecf20 | 41 | md | Markdown | README.md | yeyongzhen/vue_learning | bee3e3759ad91513ed254867eac2130690feffde | [
"MIT"
] | 1 | 2019-01-08T14:15:59.000Z | 2019-01-08T14:15:59.000Z | README.md | yeyongzhen/vue_learning | bee3e3759ad91513ed254867eac2130690feffde | [
"MIT"
] | null | null | null | README.md | yeyongzhen/vue_learning | bee3e3759ad91513ed254867eac2130690feffde | [
"MIT"
] | null | null | null | # vue_learning
Demos written while learning Vue.js
| 13.666667 | 25 | 0.780488 | eng_Latn | 0.805234 |
2f228e06d52145ad0f1c94e41170ae6e9eeb8d4d | 12,855 | md | Markdown | README.md | cantecim/dispatcher | 2419f747b0f3641b2e7aa745db97bce82bb082f4 | [
"MIT"
] | null | null | null | README.md | cantecim/dispatcher | 2419f747b0f3641b2e7aa745db97bce82bb082f4 | [
"MIT"
] | null | null | null | README.md | cantecim/dispatcher | 2419f747b0f3641b2e7aa745db97bce82bb082f4 | [
"MIT"
] | 1 | 2020-04-10T01:50:18.000Z | 2020-04-10T01:50:18.000Z | # Dispatcher
[<img src="https://s3-us-west-2.amazonaws.com/oss-avatars/dispatcher.png"/>](http://indatus.com/company/careers)
Dispatcher allows you to schedule your artisan commands within your [Laravel](http://laravel.com) project, eliminating the need to touch the crontab when deploying. It also allows commands to run per environment and keeps your scheduling logic where it should be, in your version control.
<!--<img align="left" height="300" src="https://s3-us-west-2.amazonaws.com/oss-avatars/dispatcher_round_readme.png">-->
```php
use Indatus\Dispatcher\Scheduling\ScheduledCommand;
use Indatus\Dispatcher\Scheduling\Schedulable;
use Indatus\Dispatcher\Drivers\DateTime\Scheduler;
class MyCommand extends ScheduledCommand {
//your command name, description etc.
public function schedule(Schedulable $scheduler)
{
//every day at 4:17am
return $scheduler
->daily()
->hours(4)
->minutes(17);
}
}
```
---
[](https://packagist.org/packages/indatus/dispatcher) [](https://packagist.org/packages/indatus/dispatcher) [](https://travis-ci.org/Indatus/dispatcher) [](https://scrutinizer-ci.com/g/Indatus/dispatcher/?branch=master) [](https://scrutinizer-ci.com/g/Indatus/dispatcher/?branch=master)
## README Contents
* [Features](#features)
* [Tutorial](#tutorial)
* [Installation](#installation)
* [For Laravel 4 (see 1.4 branch)](https://github.com/Indatus/dispatcher/tree/1.4#installation)
* [For Laravel 5](#installation) - discontinued, see [Laravel 5's scheduler](http://laravel-news.com/2014/11/laravel-5-scheduler/)
* [Upgrading from 1.4 to 2.0](#upgrading-1.4-2.0)
* [Usage](#usage)
* [Generating New Scheduled Commands](#new-commands)
* [Scheduling Existing Commands](#scheduling-commands)
* [Running Commands As Users](#commands-as-users)
* [Environment-Specific Commands](#environment-commands)
* [Running Commands In Maintenance Mode](#maintenance-mode)
* [Advanced Scheduling](#advanced-scheduling)
* [Drivers](#drivers)
* [DateTime](#datetime)
* [Custom Drivers](#custom-drivers)
* [FAQ](#faq)
<a name="features" />
## Features
* Schedule artisan commands to run automatically
* Scheduling is maintained within your version control system
* Single source of truth for when and where commands run
* Schedule commands to run with arguments and options
* Run commands as other users
* Run commands in certain environments
* Use custom drivers for custom scheduling contexts
<a name="tutorial" />
## Tutorial
By Ben Kuhl at the [Laravel Louisville meetup](http://laravel-louisville.github.io/meetup/) ([@lurvul](https://twitter.com/lurvul)): [Video](http://vimeo.com/94212203) - [Slides](http://bkuhl.github.io/dispatcher-slides)
By Jefferey Way at [Laracasts](https://www.laracasts.com): [Recurring Tasks the Laravel Way](https://laracasts.com/lessons/recurring-tasks-the-laravel-way)
<a name="installation" />
## Installation
> **NOTICE: [Laravel 5 now includes scheduling](http://laravel-news.com/2014/11/laravel-5-scheduler/) out of the box. This package will no longer be maintained for Laravel 5 and above**
| Requirements | 1.4.* | 2.* |
|-------------------------------|-------------------------------|-----------------------------------|
| [Laravel](http://laravel.com) | 4.1/4.2 | 5.x |
| [PHP](https://php.net) | 5.3+ | 5.4+ |
| [HHVM](http://hhvm.com) | 3.3+ | 3.3+ |
| Install with Composer... | ~1.4 | ~2.0@dev |
> If you're using **Laravel 4** view the [readme in the 1.4 branch](https://github.com/Indatus/dispatcher/tree/1.4)
Add this line to the providers array in your `config/app.php` file :
```php
'Indatus\Dispatcher\ServiceProvider',
```
Add the following to your root Crontab (via `sudo crontab -e`):
```php
* * * * * php /path/to/artisan scheduled:run 1>> /dev/null 2>&1
```
If you are adding this to `/etc/cron.d` you'll need to specify a user immediately after `* * * * *`.
> You may add this to any user's Crontab, but only the root crontab can run commands as other users.
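For example, a hypothetical `/etc/cron.d/dispatcher` entry, with the required user field set to root, would look like:

```cron
* * * * * root php /path/to/artisan scheduled:run 1>> /dev/null 2>&1
```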
<a name="upgrading-1.4-2.0" />
### Upgrading from 1.4 to 2.0
In all scheduled commands...
* Replace `use Indatus\Dispatcher\Drivers\Cron\Scheduler` with `use Indatus\Dispatcher\Drivers\DateTime\Scheduler`
* Replaced uses of `Scheduler::[DAY_OF_WEEK]` with `Day::[DAY_OF_WEEK]` and `Scheduler::[MONTH_OF_YEAR]` with `Month::[MONTH_OF_YEAR]`
* `executable` config option has been removed. Dispatcher now inherits the [path to the binary](http://php.net/manual/en/reserved.constants.php#constant.php-binary) that was initially used to run `scheduled:run`
<a name="usage" />
## Usage
```
scheduled
scheduled:make Create a new scheduled artisan command
scheduled:run Run scheduled commands
scheduled:summary View a summary of all scheduled artisan commands
```
If commands are not visible via `php artisan` then they cannot be scheduled.
<a name="new-commands" />
### Generating New Scheduled Commands
Use `php artisan scheduled:make` to generate a new scheduled command, the same way you would use artisan's `command:make`. Then [register your command](http://laravel.com/docs/commands#registering-commands) with Laravel.
<a name="scheduling-commands" />
### Scheduling Existing Commands
You may either implement `\Indatus\Dispatcher\Scheduling\ScheduledCommandInterface` or follow the below steps.
1. Add use statements to your command. If you're using a custom driver you will use a different `Scheduler` class.
```php
use Indatus\Dispatcher\Scheduling\ScheduledCommand;
use Indatus\Dispatcher\Scheduling\Schedulable;
use Indatus\Dispatcher\Drivers\DateTime\Scheduler;
```
2. Extend `\Indatus\Dispatcher\Scheduling\ScheduledCommand`
3. Implement schedule():
```php
/**
* When a command should run
*
* @param Scheduler $scheduler
*
* @return Scheduler
*/
public function schedule(Schedulable $scheduler)
{
return $scheduler;
}
```
For details and examples on how to schedule, see the [DateTime Driver](#datetime).
<a name="commands-as-users" />
### Running Commands As Users
You may override `user()` to run a given artisan command as a specific user. Ensure your `scheduled:run` artisan command is running as root.
```php
public function user()
{
return 'backup';
}
```
> This feature may not be supported by all drivers.
<a name="environment-commands" />
### Environment-Specific Commands
You may override `environment()` to ensure your command is only scheduled in specific environments. It should provide a single environment or an array of environments.
```php
public function environment()
{
return ['development','staging'];
}
```
<a name="maintenance-mode" />
### Maintenance Mode
By default, cron commands will *not* run when the application is in Maintenance Mode. This will prevent all sorts of weird output that might occur if a cron command is run while you are migrating a database or doing a composer update.
You may override `runInMaintenanceMode()` to force your command to still be run while the application is in maintenance mode.
```php
public function runInMaintenanceMode()
{
return true;
}
```
<a name="advanced-scheduling" />
### Advanced scheduling
You may schedule a given command to to run at multiple times by `schedule()` returning multiple `Schedulable` instances.
```php
public function schedule(Schedulable $scheduler)
{
return [
// 5am Mon-Fri
$scheduler->everyWeekday()->hours(5),
// 2am every Saturday
App::make(get_class($scheduler))
->daysOfTheWeek(Scheduler::SATURDAY)
->hours(2)
];
}
```
You may also schedule a command to run with arguments and options.
```php
public function schedule(Schedulable $scheduler)
{
return [
// equivalent to: php /path/to/artisan command:name /path/to/file
$scheduler->args(['/path/to/file'])
->everyWeekday()
->hours(5),
// equivalent to: php /path/to/artisan command:name /path/to/file --force --toDelete="expired" --exclude="admins" --exclude="developers"
$scheduler->args(['/path/to/file'])
->opts([
'force',
'toDelete' => 'expired',
'exclude' => [
'admins',
'developers'
]
])
->daysOfTheMonth([1, 15])
->hours(2)
];
}
```
> NOTE: Both `args()` and `opts()`, whichever is called first, will internally create a new `Schedulable` instance for you so you don't need to `App::make()`.
<a name="drivers" />
## Drivers
Drivers provide the ability to add additional context to your scheduling. [Building custom drivers](#custom-drivers) is a great way to customize this context to your application's needs.
<a name="datetime" />
### DateTime (Default)
Examples of how to schedule:
```php
public function schedule(Schedulable $scheduler)
{
//every day at 4:17am
return $scheduler->daily()->hours(4)->minutes(17);
}
```
```php
public function schedule(Schedulable $scheduler)
{
//every Tuesday/Thursday at 5:03am
return $scheduler->daysOfTheWeek([
Scheduler::TUESDAY,
Scheduler::THURSDAY
])->hours(5)->minutes(3);
}
```
```php
public function schedule(Schedulable $scheduler)
{
//the second and third Tuesday of every month at 12am
return $scheduler->monthly()->week([2, 3])->daysOfTheWeek(Day::TUESDAY);
}
```
<a name="custom-drivers" />
## Custom Drivers
Custom drivers allow you to provide application context within scheduling. For example, an education-based application may contain scheduling methods like `inServiceDays()`, `springBreak()` and `christmasBreak()` where commands are run or don't run during those times.
Create a packagepath such as `\MyApp\ScheduleDriver\` and create two classes:
* `Scheduler` that `implements Indatus\Dispatcher\Scheduling\Schedulable`. This class should provide a useful interface for programmers to schedule their commands.
* `ScheduleService` that `extends \Indatus\Dispatcher\Services\ScheduleService`. This class contains logic on how to determine if a command is due to run.
Publish the configs using `php artisan config:publish indatus/dispatcher`. Then update your driver configuration to reference the package in which these 2 classes are included (do not include a trailing slash):
```php
'driver' => '\MyApp\ScheduleDriver'
```
<a name="faq" />
## FAQ
**I need to deploy to multiple servers representing a single environment. How can I be sure my command is only run by a single server and not run on each server?**
Schedule `scheduled:run` to run every minute with [rcron](https://code.google.com/p/rcron/):
```php
* * * * * /usr/bin/rcron php /path/to/artisan scheduled:run 1>> /dev/null 2>&1
```
**Why are my commands not running when I've scheduled them correctly? I'm also not seeing any error output**
1) Verify that mcrypt is installed and working correctly via the command `php -i | grep mcrypt`.
2) Utilizing `php artisan scheduled:run --debug` will tell you why they're not running. If you do not see your command listed here then it is not set up correctly.
Example:
```
$ php artisan scheduled:run --debug
Running commands...
backup:avatars: No schedules were due
command:name: No schedules were due
myTestCommand:name: No schedules were due
cache:clean: /usr/bin/env php /Users/myUser/myApp/artisan cache:clean > /dev/null &
mail:subscribers: /usr/bin/env php /Users/myUser/myApp/artisan mail:subscribers > /dev/null &
```
**I have commands that extend `ScheduledCommand` but why don't they appear in when I run `scheduled:summary`?**
Commands that are disabled will not appear here. Check and be sure `isEnabled()` returns true on those commands.
| 38.488024 | 720 | 0.67079 | eng_Latn | 0.953368 |
2f2299394bf9eb92fcb643e525c0ee0c29f4b5e5 | 263 | md | Markdown | pages/installation/basics/index.md | station-demand-forecasting-tool/documentation | f7d347c98b28c0cb6ca39da0437239933058c188 | [
"MIT"
] | null | null | null | pages/installation/basics/index.md | station-demand-forecasting-tool/documentation | f7d347c98b28c0cb6ca39da0437239933058c188 | [
"MIT"
] | null | null | null | pages/installation/basics/index.md | station-demand-forecasting-tool/documentation | f7d347c98b28c0cb6ca39da0437239933058c188 | [
"MIT"
] | null | null | null | # Manual Deploy Basics
<!-- position: 1 -->
Documentation on manually deploying the Station Demand Forecasting Tool will be provided in the future. It is recommended that you use the [Docker deployment](https://www.stationdemand.org.uk/docker) in the meantime. | 65.75 | 218 | 0.775665 | eng_Latn | 0.993655 |
2f23071f9a3fa7eed13c57b7a7606e317af88a80 | 5,205 | md | Markdown | openmetrics/README.md | OuesFa/integrations-core | 0ffe4ca306580a2e775b515152384034c2dfdc03 | [
"BSD-3-Clause"
] | null | null | null | openmetrics/README.md | OuesFa/integrations-core | 0ffe4ca306580a2e775b515152384034c2dfdc03 | [
"BSD-3-Clause"
] | null | null | null | openmetrics/README.md | OuesFa/integrations-core | 0ffe4ca306580a2e775b515152384034c2dfdc03 | [
"BSD-3-Clause"
] | null | null | null | # OpenMetrics Integration
## Overview
Extract custom metrics from any OpenMetrics endpoints.
<div class="alert alert-warning">All the metrics retrieved by this integration are considered <a href="https://docs.datadoghq.com/developers/metrics/custom_metrics">custom metrics</a>.</div>
## Setup
Follow the instructions below to install and configure this check for an Agent running on a host. For containerized environments, see the [Autodiscovery Integration Templates][1] for guidance on applying these instructions.
### Installation
The OpenMetrics check is packaged with the [Datadog Agent starting version 6.6.0][2].
### Configuration
Edit the `openmetrics.d/conf.yaml` file at the root of your [Agent's configuration directory][3]. See the [sample openmetrics.d/conf.yaml][4] for all available configuration options.
For each instance the following parameters are required:
| Parameter | Description |
| ---------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `openmetrics_endpoint` | The URL where your application metrics are exposed by OpenMetrics (must be unique). |
| `namespace` | The namespace to prepend to all metrics. |
| `metrics` | A list of metrics to retrieve as custom metrics. Add each metric to the list as `metric_name` or `metric_name: renamed` to rename it. The metrics are interpreted as regular expressions. Use `.*` as a wildcard (`metric.*`) to fetch all matching metrics. **Note**: Regular expressions can potentially send a lot of custom metrics. |
**Note**: This is a new default OpenMetrics check example as of Datadog Agent version 7.32.0. If you previously implemented this integration, see the [legacy example][5].
**Note**: Starting in Datadog Agent v7.32.0, in adherence to the [OpenMetrics specification standard][11], counter names ending in `_total` must be specified without the `_total` suffix. For example, to collect `promhttp_metric_handler_requests_total`, specify the metric name `promhttp_metric_handler_requests`. This submits to Datadog the metric name appended with `.count`, `promhttp_metric_handler_requests.count`.
**Note**: This check has a limit of 2000 metrics per instance. The number of returned metrics is indicated when running the Datadog Agent [status command][6]. You can specify the metrics you are interested in by editing the configuration. To learn how to customize the metrics to collect, see the [Prometheus and OpenMetrics Metrics Collection][7] for more detailed instructions. If you need to monitor more metrics, contact [Datadog support][8].
For more configurations, see [Prometheus and OpenMetrics Metrics Collection][7].
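Putting the required parameters together, a minimal instance in `openmetrics.d/conf.yaml` might look like the following (the endpoint URL, namespace, and metric names are placeholders, not defaults):

```yaml
instances:
  - openmetrics_endpoint: http://localhost:9090/metrics
    namespace: myapp
    metrics:
      - promhttp_metric_handler_requests
      - process_cpu_seconds: cpu_seconds_used
```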
### Validation
[Run the Agent's status subcommand][6] and look for `openmetrics` under the Checks section.
## Data Collected
### Metrics
All metrics collected by the OpenMetrics check are forwarded to Datadog as custom metrics.
### Events
The OpenMetrics check does not include any events.
### Service Checks
The OpenMetrics check does not include any service checks.
## Troubleshooting
### High custom metrics billing
OpenMetrics configurations with generic wildcard values for the `metrics` option have a significant impact on custom metrics billing.
Datadog recommends that you use specific metric names or partial metric name matches for more precise collection.
Need help? Contact [Datadog support][8].
## Further Reading
- [Configuring a OpenMetrics Check][9]
- [Writing a custom OpenMetrics Check][10]
[1]: https://docs.datadoghq.com/agent/kubernetes/integrations/
[2]: https://docs.datadoghq.com/getting_started/integrations/prometheus/?tab=docker#configuration
[3]: https://docs.datadoghq.com/agent/guide/agent-configuration-files/#agent-configuration-directory
[4]: https://github.com/DataDog/integrations-core/blob/master/openmetrics/datadog_checks/openmetrics/data/conf.yaml.example
[5]: https://github.com/DataDog/integrations-core/blob/7.30.x/openmetrics/datadog_checks/openmetrics/data/conf.yaml.example
[6]: https://docs.datadoghq.com/agent/guide/agent-commands/#agent-status-and-information
[7]: https://docs.datadoghq.com/getting_started/integrations/prometheus/
[8]: https://docs.datadoghq.com/help/
[9]: https://docs.datadoghq.com/agent/openmetrics/
[10]: https://docs.datadoghq.com/developers/openmetrics/
[11]: https://github.com/OpenObservability/OpenMetrics/blob/main/specification/OpenMetrics.md#suffixes
<!-- source: treebanks/ro_rrt/ro_rrt-feat-Person.md (myedibleenso/docs, Apache-2.0) -->
---
layout: base
title: 'Statistics of Person in UD_Romanian-RRT'
udver: '2'
---
## Treebank Statistics: UD_Romanian-RRT: Features: `Person`
This feature is universal.
It occurs with 3 different values: `1`, `2`, `3`.
35106 tokens (16%) have a non-empty value of `Person`.
4395 types (14%) occur at least once with a non-empty value of `Person`.
1481 lemmas (9%) occur at least once with a non-empty value of `Person`.
The feature is used with 4 part-of-speech tags: <tt><a href="ro_rrt-pos-VERB.html">VERB</a></tt> (12327; 6% instances), <tt><a href="ro_rrt-pos-PRON.html">PRON</a></tt> (11806; 5% instances), <tt><a href="ro_rrt-pos-AUX.html">AUX</a></tt> (7120; 3% instances), <tt><a href="ro_rrt-pos-DET.html">DET</a></tt> (3853; 2% instances).
### `VERB`
12327 <tt><a href="ro_rrt-pos-VERB.html">VERB</a></tt> tokens (53% of all `VERB` tokens) have a non-empty value of `Person`.
The most frequent other feature values with which `VERB` and `Person` co-occurred: <tt><a href="ro_rrt-feat-Gender.html">Gender</a></tt><tt>=EMPTY</tt> (12327; 100%), <tt><a href="ro_rrt-feat-VerbForm.html">VerbForm</a></tt><tt>=Fin</tt> (12327; 100%), <tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt> (10513; 85%), <tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt> (7897; 64%), <tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt> (6240; 51%).
`VERB` tokens may have the following values of `Person`:
* `1` (628; 5% of non-empty `Person`): <em>știu, putem, avem, cred, rog, spun, așteptam, rugăm, vedem, văd</em>
* `2` (848; 7% of non-empty `Person`): <em>luați, vezi, aveți, utilizați, poți, puteai, spuneți, puteți, știți, lăsați</em>
* `3` (10851; 88% of non-empty `Person`): <em>poate, trebuie, pot, are, avea, era, putea, există, face, au</em>
* `EMPTY` (10724): <em>avut, prevăzute, putea, făcut, trebui, având, avea, face, spus, putut</em>
<table>
<tr><th>Paradigm <i>putea</i></th><th><tt>1</tt></th><th><tt>2</tt></th><th><tt>3</tt></th></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Imp</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt></tt></td><td></td><td><em>poți</em></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Imp</tt></tt></td><td></td><td><em>puteai</em></td><td><em>putea</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Past</tt></tt></td><td></td><td></td><td><em>putu</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pqp</tt></tt></td><td></td><td></td><td><em>putuse</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td></td><td><em>poți</em></td><td><em>poate</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt></tt></td><td></td><td><em>Poți</em></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Imp</tt></tt></td><td></td><td></td><td><em>puteau</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Past</tt></tt></td><td><em>puturăm</em></td><td></td><td><em>putură</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td><em>putem</em></td><td><em>puteți</em></td><td><em>pot</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Imp</tt></tt></td><td><em>puteam</em></td><td></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Sub</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td></td><td></td><td><em>poată</em></td></tr>
</table>
### `PRON`
11806 <tt><a href="ro_rrt-pos-PRON.html">PRON</a></tt> tokens (100% of all `PRON` tokens) have a non-empty value of `Person`.
The most frequent other feature values with which `PRON` and `Person` co-occurred: <tt><a href="ro_rrt-feat-Variant.html">Variant</a></tt><tt>=EMPTY</tt> (9742; 83%), <tt><a href="ro_rrt-feat-Gender.html">Gender</a></tt><tt>=EMPTY</tt> (8681; 74%), <tt><a href="ro_rrt-feat-Reflex.html">Reflex</a></tt><tt>=EMPTY</tt> (7870; 67%), <tt><a href="ro_rrt-feat-PronType.html">PronType</a></tt><tt>=Prs</tt> (7247; 61%), <tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=EMPTY</tt> (6805; 58%), <tt><a href="ro_rrt-feat-Strength.html">Strength</a></tt><tt>=Weak</tt> (6201; 53%).
`PRON` tokens may have the following values of `Person`:
* `1` (646; 5% of non-empty `Person`): <em>ne, mă, eu, noi, m-, -mi, mi-, mine, ne-, îmi</em>
* `2` (525; 4% of non-empty `Person`): <em>dumneavoastră, vă, te, -ți, v-, tu, ți-, îți, -vă, te-</em>
* `3` (10635; 90% of non-empty `Person`): <em>se, care, ce, s-, el, le, o, își, -și, -l</em>
* `EMPTY` (8): <em>ș.a., dvs., Î.P.S.</em>
`Person` seems to be a **lexical feature** of `PRON`. 100% of lemmas (48) occur with only one value of `Person`.
### `AUX`
7120 <tt><a href="ro_rrt-pos-AUX.html">AUX</a></tt> tokens (83% of all `AUX` tokens) have a non-empty value of `Person`.
The most frequent other feature values with which `AUX` and `Person` co-occurred: <tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt> (4489; 63%), <tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=EMPTY</tt> (4350; 61%), <tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=EMPTY</tt> (4348; 61%), <tt><a href="ro_rrt-feat-VerbForm.html">VerbForm</a></tt><tt>=EMPTY</tt> (4348; 61%).
`AUX` tokens may have the following values of `Person`:
* `1` (380; 5% of non-empty `Person`): <em>am, aș, vom, eram, sunt, voi, suntem, fiu, -aș, fim</em>
* `2` (167; 2% of non-empty `Person`): <em>ai, ați, veți, sunteți, ești, fii, erai, vei, -ai, oi</em>
* `3` (6573; 92% of non-empty `Person`): <em>a, este, au, sunt, ar, era, va, fie, e, vor</em>
* `EMPTY` (1437): <em>fi, fost, fiind, nefiind, fiindu, este</em>
<table>
<tr><th>Paradigm <i>fi</i></th><th><tt>1</tt></th><th><tt>2</tt></th><th><tt>3</tt></th></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Imp</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt></tt></td><td></td><td><em>fi, fii</em></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Imp</tt></tt></td><td></td><td><em>erai</em></td><td><em>era</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Past</tt></tt></td><td></td><td></td><td><em>fu</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pqp</tt></tt></td><td></td><td></td><td><em>fusese</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt>|<tt><a href="ro_rrt-feat-Variant.html">Variant</a></tt><tt>=Short</tt></tt></td><td></td><td></td><td><em>-i, E-</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td><em>sunt</em></td><td><em>ești</em></td><td><em>este, e, Sunt, îi</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Imp</tt></tt></td><td></td><td></td><td><em>erau</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Past</tt></tt></td><td></td><td></td><td><em>fură</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pqp</tt></tt></td><td></td><td></td><td><em>fuseseră</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt>|<tt><a href="ro_rrt-feat-Variant.html">Variant</a></tt><tt>=Short</tt></tt></td><td></td><td></td><td><em>-s</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td><em>suntem</em></td><td><em>sunteți</em></td><td><em>sunt</em></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Ind</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Imp</tt></tt></td><td><em>eram</em></td><td></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Sub</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Sing</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td><em>fiu</em></td><td><em>fii</em></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Sub</tt>|<tt><a href="ro_rrt-feat-Number.html">Number</a></tt><tt>=Plur</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td><em>fim</em></td><td><em>fiți</em></td><td></td></tr>
<tr><td><tt><tt><a href="ro_rrt-feat-Mood.html">Mood</a></tt><tt>=Sub</tt>|<tt><a href="ro_rrt-feat-Tense.html">Tense</a></tt><tt>=Pres</tt></tt></td><td></td><td></td><td><em>fie</em></td></tr>
</table>
### `DET`
3853 <tt><a href="ro_rrt-pos-DET.html">DET</a></tt> tokens (32% of all `DET` tokens) have a non-empty value of `Person`.
The most frequent other feature values with which `DET` and `Person` co-occurred: <tt><a href="ro_rrt-feat-Poss.html">Poss</a></tt><tt>=EMPTY</tt> (2843; 74%), <tt><a href="ro_rrt-feat-Case.html">Case</a></tt><tt>=Acc,Nom</tt> (2235; 58%), <tt><a href="ro_rrt-feat-Position.html">Position</a></tt><tt>=EMPTY</tt> (1937; 50%).
`DET` tokens may have the following values of `Person`:
* `1` (179; 5% of non-empty `Person`): <em>mea, meu, noastre, nostru, noastră, mele, mei, noștri, însumi, -mea</em>
* `2` (46; 1% of non-empty `Person`): <em>ta, tău, dumitale, tale, voastră, vostru, voștri, tăi, voastre</em>
* `3` (3628; 94% of non-empty `Person`): <em>acest, lui, lor, orice, toate, această, aceste, alte, fiecare, ei</em>
* `EMPTY` (8172): <em>o, un, a, al, ale, lui, unei, unui, cel, unor</em>
`Person` seems to be a **lexical feature** of `DET`. 100% of lemmas (35) occur with only one value of `Person`.
## Relations with Agreement in `Person`
The 10 most frequent relations where parent and child node agree in `Person`:
<tt>VERB --[<tt><a href="ro_rrt-dep-nsubj.html">nsubj</a></tt>]--> PRON</tt> (1598; 74%),
<tt>VERB --[<tt><a href="ro_rrt-dep-expl-pv.html">expl:pv</a></tt>]--> PRON</tt> (1537; 68%),
<tt>VERB --[<tt><a href="ro_rrt-dep-conj.html">conj</a></tt>]--> VERB</tt> (1272; 76%),
<tt>VERB --[<tt><a href="ro_rrt-dep-expl-pass.html">expl:pass</a></tt>]--> PRON</tt> (872; 78%),
<tt>VERB --[<tt><a href="ro_rrt-dep-obl.html">obl</a></tt>]--> PRON</tt> (464; 55%),
<tt>VERB --[<tt><a href="ro_rrt-dep-expl.html">expl</a></tt>]--> PRON</tt> (303; 57%),
<tt>VERB --[<tt><a href="ro_rrt-dep-expl-impers.html">expl:impers</a></tt>]--> PRON</tt> (106; 87%),
<tt>PRON --[<tt><a href="ro_rrt-dep-acl.html">acl</a></tt>]--> VERB</tt> (88; 53%),
<tt>PRON --[<tt><a href="ro_rrt-dep-fixed.html">fixed</a></tt>]--> PRON</tt> (67; 100%),
<tt>PRON --[<tt><a href="ro_rrt-dep-cop.html">cop</a></tt>]--> AUX</tt> (59; 69%).
<!-- source: README.md (coreycb/charm-swift-storage, ECL-2.0 / Apache-2.0) -->
Overview
--------
This charm provides the swift-storage component of the OpenStack Swift object
storage system. It can be deployed as part of its own standalone storage
cluster or it can be integrated with the other OpenStack components, assuming
those are also managed by Juju. For Swift to function, you'll also need to
deploy an additional swift-proxy using the cs:precise/swift-proxy charm.
For more information about Swift and its architecture, visit the official
project website at http://swift.openstack.org.
This charm was developed to support deploying multiple versions of Swift on
Ubuntu Precise 12.04, as they relate to the release series of OpenStack. That
is, OpenStack Essex corresponds to Swift 1.4.8, while OpenStack Folsom shipped
1.7.4. This charm can be used to deploy either (and future) versions of Swift
onto Ubuntu Precise 12.04, making use of the Ubuntu Cloud Archive when
needed.
Usage
-----
This charm is quite simple. Its basic function is to get a storage device
set up for Swift usage, and to run the container, object, and account services.
The deployment workflow for swift using this charm is covered in the README
for the swift-proxy charm at cs:precise/swift-proxy. The following are
deployment options to take into consideration when deploying swift-storage.
**Zone assignment**
If the swift-proxy charm is configured for manual zone assignment (recommended),
the 'zone' option should be set for each swift-storage service being deployed.
See the swift-proxy README for more information about zone assignment.
**Storage**
Swift storage nodes require access to local storage and filesystem. The charm
takes a 'block-device' config setting that can be used to specify which storage
device(s) to use. Options include:
- 1 or more local block devices (eg, sdb or /dev/sdb). It's important that this
device be the same on all machine units assigned to this service. Multiple
block devices should be listed as a space-separated list of device nodes.
- a path to a local file on the filesystem with the size appended after a pipe,
eg "/etc/swift/storagedev1.img|5G". This will be created if it does not
exist and be mapped to a loopback device. Good for development and testing.
- "guess" can be used to tell the charm to do its best to find a local devices
to use. *EXPERIMENTAL*
Multiple devices can be specified. In all cases, the resulting block device(s)
will each be formatted as XFS file system and mounted at /srv/node/$devname.
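As an illustrative deployment config for the options above (the zone value and
device paths are made-up examples; match them to your environment):

```yaml
# storage-config.yaml -- example values only
swift-storage:
  zone: 1
  # Two local disks, listed as a space-separated string:
  block-device: /dev/sdb /dev/sdc
  # Or a loopback-backed file, useful for development and testing:
  # block-device: "/etc/swift/storagedev1.img|5G"
```

With a file like this, a deploy might look like
`juju deploy --config storage-config.yaml cs:precise/swift-storage`
(command shape assumed from the Juju CLI of that era).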
**Installation repository**
The 'openstack-origin' setting allows Swift to be installed from installation
repositories and can be used to setup access to the Ubuntu Cloud Archive
to support installing Swift versions more recent than what is shipped with
Ubuntu 12.04 (1.4.8). For more information, see config.yaml.
<!-- source: packages/patternfly-4/content/get-started/about-pf.md (ncameronbritt/patternfly-org, Apache-2.0) -->
---
path: "/get-started/about"
---
# About PatternFly 4
PatternFly is an open source design system created to enable consistency and usability across a wide range of applications and use cases. PatternFly provides clear standards, guidance, and tools that help designers and developers work together more efficiently and build better user experiences.
## Basic structure
### Components
Components, like buttons and alerts, can be assembled together to build applications.
### Layouts
Layouts are generic tools that allow you to structure and organize the content on your pages.
### Demos
Demos use components and layouts in combination to show you how to build more complex structures and application views.
[**View components, layouts, and demos in HTML/CSS**](/documentation/core) <i class="blueArrow fas fa-arrow-right pf-u-mx-sm"></i>
[**View components, layouts, and demos in React**](/documentation/react) <i class="blueArrow fas fa-arrow-right pf-u-mx-sm"></i>
## Design guidelines
### Styles
Style guidelines define foundational elements of the design system, like color, typography, and spacing.
### Usage and behavior
Usage and behavior guidelines communicate standards and best practices for common design patterns like navigation, dashboards, or forms.
### Content
Content guidelines provide principles and best practices around writing for user experience along with general voice and style guidance.
[**View design guidelines**](/design-guidelines/styles/colors) <i class="blueArrow fas fa-arrow-right pf-u-mx-sm"></i>
## Additional tools
### CSS variables
You can customize PatternFly for your project using the CSS variable system, which enables you to change style elements like color across your project. The CSS variable system is a two-layer theming system where global variables inform component variables.
**Global variables**
Global variables define and enforce style elements (like global values for color, spacing, and font size) across the entire system.
**Component variables**
Component variables are used to define custom properties at the component level. Component variables are always defined by global variables.
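To illustrate the two layers, here is a sketch of the pattern: a component-level custom property defined in terms of a global one. The variable names follow PatternFly's naming convention but are shown as representative examples, not an authoritative list:

```css
:root {
  /* Global layer: single source of truth for the brand color */
  --pf-global--primary-color--100: #06c;
}

.pf-c-button.pf-m-primary {
  /* Component layer: always defined by a global variable */
  --pf-c-button--m-primary--BackgroundColor: var(--pf-global--primary-color--100);
  background-color: var(--pf-c-button--m-primary--BackgroundColor);
}
```

Overriding `--pf-global--primary-color--100` then re-themes every component variable derived from it.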
[**Learn more and view CSS variables**](/documentation/react/css-variables/) <i class="blueArrow fas fa-arrow-right pf-u-mx-sm"></i>
### Utilities
Utilities are a set of classes that enable you to further customize and modify elements in your project without having to write any custom CSS.
For example, you might use a utility class to add additional spacing between elements, align content in a layout, or even add a box shadow to an element.
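For example, a sketch using spacing and box-shadow utility classes (the class names follow the `pf-u-` utility convention used elsewhere on this page; treat the exact names as illustrative):

```html
<!-- Spacing and shadow handled entirely by utility classes; no custom CSS -->
<div class="pf-u-p-md pf-u-box-shadow-md">
  <span class="pf-u-mr-sm">Draft saved</span>
  <button class="pf-u-ml-sm">Undo</button>
</div>
```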
[**View utilities**](/documentation/core/utilities/accessibility) <i class="blueArrow fas fa-arrow-right pf-u-mx-sm"></i>
<!-- This section is WIP ** we need to wait to see how this content gets included **
Flexibility
PatternFly 4 was built to be flexible and is scoped to work in tandem with other design systems. This means you’re able to use PatternFly 4 components alongside components from systems like Bootstrap, Material.io, or older versions of PatternFly.
For example, our code is written like pf-c-alert
alert
So if you had …
Include an example -->
<!-- source: README.md (metasj/jupyterlab-metadata-service, BSD-3-Clause) -->
# JupyterLab Metadata Extension
![Stability Experimental][badge-stability]
[![Binder][badge-binder]][binder]
```bash
jupyter labextension install @jupyterlab/metadata-extension @jupyterlab/dataregistry-extension
```
This JupyterLab extension
- displays linked data about the resources you are interacting with in JuyterLab.
- enables other extensions to register as linked data providers to expose [JSON LD][json-ld] about an entity given the entity's URL.
- exposes linked data to the user as a Linked Data viewer in the Data Browser pane.
- Check out the project vision in the ["Press Release from the Future"](./press_release.md)!
## Usage
[Usage docs](./docs/usage.md)
## Contributing
This repository is in active development, and we welcome collaboration. For development guidance, please consult the [development guide](./docs/development.md).
If you have ideas or questions, feel free to open an issue, or, if you feel like getting your hands dirty, feel free to tackle an existing issue by contributing a pull request.
We try to keep the current issues relevant and matched to relevant milestones.
<!-- links -->
[badge-stability]: https://img.shields.io/badge/stability-experimental-red.svg
[badge-binder]: https://mybinder.org/badge_logo.svg
[binder]: https://mybinder.org/v2/gh/jupyterlab/jupyterlab-metadata-service/master?urlpath=lab
[json-ld]: https://json-ld.org/
<!-- /.links -->
<!-- source: README.md (don-tbelong/BeastWorkOuts, MIT) -->
# BeastWorkOuts
A Gym routine app for the modern gym junkie.
## Problem Statement
<!-- source: powerbi-docs/consumer/end-user-features.md (viniciustavanoferreira/powerbi-docs.pt-br, CC-BY-4.0 / MIT) -->
---
title: Feature availability for users with free licenses
description: Explanation and chart showing the features available to consumers and free users.
author: mihart
ms.reviewer: ''
ms.service: powerbi
ms.subservice: powerbi-consumer
ms.topic: how-to
ms.date: 04/17/2020
ms.author: mihart
ms.custom: licensing support
LocalizationGroup: consumers
ms.openlocfilehash: cea72988c812bd4628e62600c6585e93e7fecc11
ms.sourcegitcommit: 3f864ec22f99ca9e25cda3a5abda8a5f69ccfa8e
ms.translationtype: HT
ms.contentlocale: pt-BR
ms.lasthandoff: 05/29/2020
ms.locfileid: "84160193"
---
# <a name="power-bi-feature-list-for-consumers-and-others-with-free-licenses"></a>Power BI feature list for *consumers* and others with free licenses
[!INCLUDE[consumer-appliesto-ynnn](../includes/consumer-appliesto-ynnn.md)]
As a *consumer*, you use the Power BI service to explore reports and dashboards in order to make business decisions. Those reports and dashboards are created by *designers* who have Power BI *Pro* licenses. Pro users can share content with colleagues and control what those colleagues can and can't do with it. Sometimes designers share content by sending you links, and sometimes the content is installed automatically and shows up in Power BI under **Apps** or **Shared with me**.
There are many different ways designers can choose to share content. Because this article is aimed at Power BI *consumers*, it describes only how consumers receive and interact with content. For more information about other ways to share content, see [Ways to share your work in Power BI](../collaborate-share/service-how-to-collaborate-distribute-dashboards-reports.md).

In the [previous article](end-user-license.md), you learned that what you can do with dashboards, reports, and apps (content) in the Power BI service depends on three things: your licenses, your roles and permissions, and where the content is stored.
This article lists which Power BI service features are available to *consumers* like you. By definition, *consumers* use a free license to work in the Power BI service (not Power BI Desktop) and are members of organizations that have Premium capacity.
## <a name="quick-review-of-terminology"></a>Quick review of terminology
Let's review a few Power BI concepts before we get to the list. This is a quick review; if you need more detail, see [Licenses for consumers](end-user-license.md) or [Power BI basic concepts](end-user-basic-concepts.md).
### <a name="workspaces-and-roles"></a>Workspaces and roles
There are two kinds of workspaces: **My workspace** and app workspaces. Only you have access to your own **My workspace**. Collaborating and sharing require content *designers*, who have Pro licenses, to use an app workspace.
In app workspaces, designers assign *roles* to manage who can do what in that workspace. *Consumers* are given the **Viewer** role.
### <a name="premium-capacity"></a>Premium capacity
When an organization has a Premium capacity subscription, administrators and Pro users can assign workspaces to *dedicated capacity*. A workspace in dedicated capacity is a space where Pro users can share content and collaborate with free users, without requiring those users to have Pro licenses. In these workspaces, free users have elevated permissions (see the list below).
### <a name="licenses"></a>Licenses
Every user of the Power BI service has either a free license or a Pro license. *Consumers* have free licenses.
- **Free license**: typically assigned to *consumers* in an organization (see the first image below). Also assigned to anyone who signs up for the Power BI service as an individual and wants to try the [Power BI service in standalone mode](../fundamentals/service-self-service-signup-for-power-bi.md) (see the second image below).

For free users, being a member of an organization that has Premium capacity is what gives them superpowers. As long as their Pro colleagues use workspaces in Premium capacity to share content, free users can view that content and collaborate with them. **In this way, the free user becomes a Power BI *consumer*, with the ability to receive and share content in order to make business decisions.**

## <a name="power-bi-feature-list-for-consumers-and-free-users"></a>Power BI feature list for *consumers* and free users
The following chart identifies which tasks can be performed by a *consumer* interacting with content in Premium capacity versus shared capacity.
The first column represents a free user working with content in **My workspace**. This user can't collaborate with colleagues in the Power BI service. Colleagues can't share content directly with this user, and this user can't share content from **My workspace**.
The second column represents a *consumer*. A consumer:
- has a free user license
- is part of an organization that has a Premium capacity subscription
- gets content (apps, dashboards, reports) from Pro users who share that content using app workspaces in dedicated capacity.
- has been given the **Viewer** role in the app workspaces.
### <a name="legend"></a>Legend
 the feature is available in the current scenario
 the feature is not available in the current scenario
**** feature availability is limited to **My workspace**. Content in **My workspace** is intended for the owner's personal use and can't be shared with or seen by others in Power BI.
\* access to this feature can be turned on or off by a Pro user or an administrator.
<br><br>
### <a name="feature-list"></a>Feature list
|Features | Scenario 1: The free Power BI user who doesn't have access to content hosted in dedicated capacity. | Scenario 2: The free Power BI user with **Viewer** permissions on content stored in dedicated capacity. This person is a Power BI *consumer*. |
|---|---|---|
|**Apps**
|Install automatically |  | *|
|Open |  |  |
|Favorite |  |  |
|Edit, update, reshare, republish | | |
|Create a new app | | |
|AppSource: download and open |  | |
|Organization's store: download and open| | |
|**App workspaces**
| Create, edit, or delete the workspace or its content |  | |
|Add endorsements |  | |
|Open and view |  |  |
| Read data stored in workspace dataflows | ||
|**Dashboards**
|Receive, view, and interact with dashboards from colleagues |  |  |
| Add alerts to tiles |  |  |
| View and reply to others' comments: add your own comments |  | * |
| Save a copy |  | |
|Copy the visual as an image | ||
|Create, edit, update, delete |  | |
|Export a tile to Excel | | |
|Favorite || |
|Feature | ||
|Full screen and Focus modes | | |
|Global search |* |* |
|Insights on tiles | | *|
| Q&A: use it on the dashboard |* |* |
|Q&A: add featured and saved questions |  | |
|Q&A: review the questions asked |  | |
|Performance inspector | | |
|Pin tiles from Q&A or reports |  | |
|Print |* |*|
|Refresh |  | |
|Reshare |  | |
|Create a subscription for yourself |* |* |
|Create a subscription for others |  | |
|**Datasets**
| Add, delete, edit |  |  |
| Create a report in another workspace based on a dataset in this workspace |  | |
| Insights on datasets |  ||
|Schedule refresh | ||
|**Reports**
|Receive reports from colleagues |  |  |
| Collaborate with colleagues on the same version of a report | |  |
| Analyze a report in Excel |* |* |
| View bookmarks created by others and add your own bookmarks | | |
| View and reply to others' comments: add new comments | | |
|Change the display dimensions |  |  |
| Save a copy | |*
|Copy the visual as an image* |
| Cross-highlight and cross-filter the report visuals | | |
| Drill | | |
| Drill through |* |* |
| Embed (Publish to web, public) | * | |
| Export summarized data from report visuals* | | |
|Export underlying data from report visuals* |  | |
| Add the report to Favorites | | |
| Filters: change types |* |* |
| Filters: interact || |
| Filters: persistent |* |* |
| Search for an item in the filter pane |* |* |
| Full screen and Focus modes | | |
| Insights on reports<sup>1</sup> |  ||
| Lineage view | | |
|PDF: create one from the report pages | | |
|Performance inspector || |
| PowerPoint: create one from the report pages* | | |
| Promote content to Home |  | |
| Print report pages* |* |* |
|Interact with a Q&A visual | | |
|QR code | | |
| Refresh | | |
| Share content with external users |  | |
| Share: allow others to reshare items |  | |
|Show as a table (show data)| | |
| Slicers: add or delete | | |
| Interact with slicers | | |
| Sort the report visuals | | |
| Subscribe to reports* | | |
| Create a report subscription for others |  | |
| View related | | |
| Visuals: change types in reports |* |* |
| Change visual interactions | | |
| Visuals: add new | | |
| Visuals: add new fields | | |
|Visuals: change type | | |
| Visuals: hover over them to reveal details and tooltips | | |
1. Somente disponível na exibição de conteúdo **Compartilhado comigo**.
## <a name="next-steps"></a>Próximas etapas
[Power BI para *consumidores*](end-user-consumer.md)
| 109.780899 | 542 | 0.746891 | por_Latn | 0.947564 |
2f26817ace763e1874857746fb68fbd2027365e9 | 1,120 | md | Markdown | .github/ISSUE_TEMPLATE/-c--definition-update.md | yum-yab/ontology | ec295582e310aa5ee70901424d55071c966e5563 | ["CC0-1.0"] | null | null | null | .github/ISSUE_TEMPLATE/-c--definition-update.md | yum-yab/ontology | ec295582e310aa5ee70901424d55071c966e5563 | ["CC0-1.0"] | null | null | null | .github/ISSUE_TEMPLATE/-c--definition-update.md | yum-yab/ontology | ec295582e310aa5ee70901424d55071c966e5563 | ["CC0-1.0"] | null | null | null
---
name: "[C] Ontology definition update"
about: For restructuring existing parts of the ontology
title: Your title should make sense if said after "The issue is <your issue title>"
labels: "[C] definition update"
assignees: ''
---
## Description of the issue
Here describe the issue as extensively as possible
## Ideas of solution
If you already have ideas for the solution describe them here
## Workflow checklist
- [ ] I discussed the issue with someone else than me before working on a solution
- [ ] I already read the **latest** version of the [workflow](https://github.com/OpenEnergyPlatform/ontology/blob/dev/CONTRIBUTING.md) for this repository
- [ ] I added this issue to the Project 'Issues'. If suitable, I add it to further Projects.
- [ ] The [goal](https://github.com/OpenEnergyPlatform/ontology/blob/dev/README.md) of this ontology is clear to me
I am aware that
- [ ] every entry in the ontology should have an annotation
- [ ] classes should arise from concepts rather than from words
- [ ] class or property names should follow the [UpperCamelCase](https://en.wikipedia.org/wiki/Camel_case)
| 38.62069 | 154 | 0.751786 | eng_Latn | 0.99517 |
2f26c4029c5d391c989b3936fee976e743e9e812 | 4,676 | md | Markdown | _posts/2018-10-24-say-no.md | karllhughes/personal-blog-5 | 8b9cb8f83de3d0b773b64107e18575fe11df3096 | ["Apache-2.0"] | 1 | 2019-03-29T13:21:12.000Z | 2019-03-29T13:21:12.000Z | _posts/2018-10-24-say-no.md | karllhughes/personal-blog-5 | 8b9cb8f83de3d0b773b64107e18575fe11df3096 | ["Apache-2.0"] | null | null | null | _posts/2018-10-24-say-no.md | karllhughes/personal-blog-5 | 8b9cb8f83de3d0b773b64107e18575fe11df3096 | ["Apache-2.0"] | 2 | 2019-03-29T23:16:58.000Z | 2021-08-15T05:01:35.000Z
---
layout: post
title: Say No
date: 2018-10-24
img: https://i.imgur.com/PjRzm5b.jpg?1
categories:
- Management
- Startups
---
People will continue to request more of your time until you say _no_.
I was talking to a coworker who's been overwhelmed recently. She's got a steady stream of incoming requests from multiple people who outrank her, so she's afraid to say no to any of them. This has resulted in an unsustainable workload for my coworker, and in the end, something - if not everything - is going to get dropped.
This isn't uncommon - regardless of company size - and it's often a problem for an organization's best workers. Because they always say _yes_ they are entrusted with more responsibility and people grow to expect more from them - even if they don't realize this is unsustainable.
Another friend has a similar situation at work. He realized his department needed better organizational systems in place, and because his boss was stretched too thin, he decided to just start implementing them. Everyone has benefited, but now he's in a de facto management role without the title (or salary). When a difficult conversation needs to be had - about an employee's performance or vacation request or project proposal - he is now expected to have it with them. He also has to fight fires, attend more meetings, and do his primary job in the slivers of time in between. He's on a path to burnout.
## The fine line between being a team player and doing too much
I've struggled with this as well. At startups, I know that there are times when an engineer has to swallow his pride and do some crummy manual task because there simply isn't time to automate it. That said, if I answered every request that came in for my time, I'd be a full-time customer support engineer and wouldn't get any of our bigger initiatives delivered.
Because the temptation to just "ask engineering to do it," is so strong, we put in pretty rigorous safeguards to prevent it. People have to file tickets, those tickets are triaged weekly, and we decide as a group if any of them are actually more important than the mainline projects we're currently invested in. If they are, we decide which big project we should delay in order to make the new request happen. Finally, we track the number of "big project" issues we complete vs. the "quick fix" type issues to help determine how focused we were. If anyone complains that a big project is taking too long, it's usually easy to show them which small bugs and quick fixes caused the delay, and the debate is no longer an issue.
## Learn to delegate
The other solution to having too much to do is to learn to delegate. Obviously this only applies if you have someone to delegate to, but assuming that's the case, you have to stop making the excuse that, "teaching someone else will take longer, so I'll just do it myself." This is short-term thinking, and will greatly restrict your growth as a leader.
If you pursue a management path, at some point, delegation becomes your only job. There is little to no time for hands-on work, and you will be responsible for understanding your team's capacity and feeding them work appropriately. It's best to practice this as early and often as possible.
Delegation is challenging, but the only way to scale your time in the long-run.
## Qualify your _no_
Another strategy for saying no is to qualify your response.
For example, if someone asked me to download something from the database, I might respond, "I can't right now, but I can do it on Friday." Then I'll build some time in my schedule - maybe bundling it with other similar tasks on Friday. If the request can't wait, the asker will have to find another solution.
I've found that many times people will ask you to do things that they are capable of doing, but simply don't _want_ to do. That's fine if you're not busy and they are, but if both of you are busy, these requests are just laziness. Of course, you can't tell your boss that, so a better approach would be to say, "I can do that, but I'll have to drop something else. Which task should I drop or push back to tomorrow?" Framing the conversation as a trade-off usually helps people remember that they can't just get more work out of you for free.
## A culture where _no_ is okay
Finally, I'd encourage anyone who's having trouble saying no to talk about this honestly with your boss, coworkers, etc. Building a culture where saying _no_ is okay takes time, and I'm guessing some organizations and industries are less okay with it than others.
I'd love to hear what you think. Find me on [Twitter](https://twitter.com/karllhughes) to keep the conversation going.
| 101.652174 | 724 | 0.77994 | eng_Latn | 0.999945 |
2f26ec86614ea71b57a12500d092f3807005a618 | 63 | md | Markdown | README.md | Bartosz-L/recipe-app_django-rest-api | b87abc8db57c228aa2181c91731bd8ea6cbd233a | ["MIT"] | null | null | null | README.md | Bartosz-L/recipe-app_django-rest-api | b87abc8db57c228aa2181c91731bd8ea6cbd233a | ["MIT"] | 4 | 2021-06-08T20:15:19.000Z | 2022-03-11T23:57:56.000Z | README.md | Bartosz-L/recipe-app_django-rest-api | b87abc8db57c228aa2181c91731bd8ea6cbd233a | ["MIT"] | null | null | null
# recipe-app_django-rest-api
Recipe API built with Django, Docker, and TDD.
| 21 | 33 | 0.809524 | kor_Hang | 0.262802 |
2f2711504bef3163588d8e63bddde2b33f175f24 | 7,573 | md | Markdown | help/tags/ui/event-forwarding/getting-started.md | AdobeDocs/experience-platform.it-IT | 7de2c6e05ca5175ddd748921e18644a6a45cb8da | ["MIT"] | null | null | null | help/tags/ui/event-forwarding/getting-started.md | AdobeDocs/experience-platform.it-IT | 7de2c6e05ca5175ddd748921e18644a6a45cb8da | ["MIT"] | null | null | null | help/tags/ui/event-forwarding/getting-started.md | AdobeDocs/experience-platform.it-IT | 7de2c6e05ca5175ddd748921e18644a6a45cb8da | ["MIT"] | null | null | null
---
title: Getting started with event forwarding
description: Follow this step-by-step tutorial to get started with event forwarding in Adobe Experience Platform.
feature: Event Forwarding
exl-id: f82bfac9-dc2d-44de-a308-651300f107df
source-git-commit: 5218e6cf82b74efbbbcf30495395a4fe2ad9fe14
workflow-type: tm+mt
source-wordcount: '907'
ht-degree: 92%
---
# Getting started with event forwarding
>[!NOTE]
>
>Adobe Experience Platform Launch has been reclassified as a suite of data collection technologies in Adobe Experience Platform. As a result, several terminology changes have been introduced across the product documentation. Please refer to this [document](../../term-updates.md) as a consolidated reference for the terminology changes.
To use Adobe Experience Platform, data must be sent to the Adobe Experience Platform Edge Network using one or more of the following three options:
* [Adobe Experience Platform Web SDK](../../extensions/web/sdk/overview.md)
* [Adobe Experience Platform Mobile SDK](https://sdkdocs.com)
* [Server-to-server API](https://experienceleague.adobe.com/docs/audience-manager/user-guide/api-and-sdk-code/dcs/dcs-apis/dcs-s2s.html?lang=it)
>[!NOTE]
>The Platform Web SDK and Platform Mobile SDK do not require implementation through tags in Adobe Experience Platform. However, using tags to deploy these SDKs is the recommended approach.
After data is sent to the Edge Network, you can enable the Adobe solutions you want to send the data to. To send data to a non-Adobe solution, set it up in event forwarding.
## Prerequisites
* Adobe Experience Platform Collection Enterprise (contact your account manager for pricing information)
* Event forwarding in Adobe Experience Platform
* Adobe Experience Platform Web SDK or Mobile SDK, configured to send data to the Edge Network
* Map your data to the Experience Data Model (XDM) (mapping can be done using tags)
## Create an XDM schema
Create the schema in Adobe Experience Platform.
1. Create a schema by selecting **[!UICONTROL Schemas]** > **[!UICONTROL Create schema]** and choosing the **[!UICONTROL XDM ExperienceEvent]** option.
1. Give the schema a name and a short description.
1. You can add the "ExperienceEvent web details" field group by selecting **[!UICONTROL Add]** next to **[!UICONTROL Field groups]**.
>[!NOTE]
>
>You can add more field groups if necessary.
1. Save the schema and make a note of the name you gave it.
For more information about schemas, see the [Experience Data Model (XDM) system guide](https://experienceleague.adobe.com/docs/experience-platform/xdm/home.html?lang=it).
## Create an event forwarding property
In the Data Collection UI, create a property of type "Edge".
1. Select **[!UICONTROL New Property]**.
1. Give the property a name.
1. Choose the "Edge" platform type.
1. Select **[!UICONTROL Save]**.
After creating the property, go to the **[!UICONTROL Environments]** tab for the new property and make a note of
the environment IDs. If the Adobe organization used in the datastream is different from the Adobe organization used in event forwarding, you can copy the environment ID from the **[!UICONTROL Environments]** tab and paste it when creating a datastream. Alternatively, the environment can be selected from a dropdown menu.
## Create a datastream
To create the datastream in Adobe Experience Platform, use the environment ID generated when you created the event forwarding property.
1. Use the link in the left bar of the Data Collection UI to open the datastreams interface.
1. Select **[!UICONTROL Datastreams]**.
1. Give the configuration a name and provide an optional description.
The description is useful for identifying configurations when several of them are listed.
1. Select **[!UICONTROL Save]**.
## Enable event forwarding
Next, configure the Edge Network to send data to event forwarding and to other Adobe products.
1. Select the property you created in the datastreams UI.
1. Select the development, production, or staging environment.
Alternatively, to send data to an event forwarding environment that is outside your Adobe organization, select **[!UICONTROL Switch to advanced mode]** and paste in an ID. The ID is provided when you create an event forwarding property.
1. Enable the tools you need and configure the options as required.
* Adobe Analytics requires a report suite ID.
* Event forwarding in Adobe Experience Platform requires a property ID and an environment ID. This is the publishing path for the event forwarding property.
After configuring, make a note of the environment IDs for the new property.
## Configure the Platform Web SDK extension to send data to the datastream created earlier
Create the property in the Data Collection UI, then use the Adobe Experience Platform Web SDK extension to configure it.
1. Give the property a name.
You can have multiple instances of Alloy. For example, you might have different tracking properties before and after a paywall.
1. Select the organization ID.
1. Select the Edge domain.
For more configuration options, see the [Web SDK extension documentation](../../extensions/web/sdk/overview.md).
## Create a tag rule to send data to the Platform Web SDK
After performing the steps above, you can create any elements that leverage both event forwarding and tags while requiring only a single request from the page, such as data definitions, rules, and so on.
Create a page-load rule using the Platform Web SDK extension and the "Send event" action type:
1. Open the **[!UICONTROL Rules]** tab, then select **[!UICONTROL Create New Rule]**.
1. Name the rule.
1. Select **[!UICONTROL Add Events]**.
1. Add an event by choosing an extension and one of the event types available for that extension, then configure the event settings. For example, select **[!UICONTROL Core - Window Loaded]**.
1. Add an action using the Platform Web SDK extension. Select **[!UICONTROL Send Event]** from the **[!UICONTROL Action Type]** list, choose the desired instance (the Alloy instance configured earlier), then select a data element to add to the XDM data block within the Alloy hit.
1. Leave the other settings at their defaults for this example, then select **[!UICONTROL Save]**.
As another example, you might create a rule that sends the data layer to the Edge if the user hovers over a specific button.
## Summary
After completing the following, you will be able to create event forwarding rules to forward data to non-Adobe destinations.
* An Experience Data Model schema (make a note of the name you gave it)
* An event forwarding property (make a note of the property ID and environment IDs)
* A datastream (make a note of the environment ID, not to be confused with the environment ID from event forwarding)
* A tag property
| 52.227586 | 348 | 0.785422 | ita_Latn | 0.999045 |
2f2783e66e03b69eafcfa7bcc5742006f6e090b0 | 328 | md | Markdown | cran-comments.md | soundarmoorthy/DatabaseConnector | 78a2f24d84304180cb9792fceb5f6dec70350fd7 | ["Apache-2.0"] | null | null | null | cran-comments.md | soundarmoorthy/DatabaseConnector | 78a2f24d84304180cb9792fceb5f6dec70350fd7 | ["Apache-2.0"] | null | null | null | cran-comments.md | soundarmoorthy/DatabaseConnector | 78a2f24d84304180cb9792fceb5f6dec70350fd7 | ["Apache-2.0"] | null | null | null
This update includes 8 changes and 2 bugfixes since 5.0.0 (see NEWS.md)
---
## Test environments
* Ubuntu 20.04, R 4.1.2
* Microsoft Windows Server 2019, R 4.0.3
* macOS, R 4.1.2
* Windows 10, R 4.1.2
## R CMD check results
There were no ERRORs or WARNINGs.
## Downstream dependencies
There are no downstream dependencies
| 19.294118 | 71 | 0.713415 | eng_Latn | 0.931279 |
2f27ee0d08517db2f9aebbde8a61f30902043367 | 862 | md | Markdown | lang/README.ja.md | magicien/VRMPreview | 4bc7c23f577d8d7075ca14218ea4f0f766399b49 | ["MIT"] | 4 | 2018-08-14T09:17:31.000Z | 2022-01-06T06:02:59.000Z | lang/README.ja.md | magicien/VRMPreview | 4bc7c23f577d8d7075ca14218ea4f0f766399b49 | ["MIT"] | null | null | null | lang/README.ja.md | magicien/VRMPreview | 4bc7c23f577d8d7075ca14218ea4f0f766399b49 | ["MIT"] | null | null | null
# VRMPreview
A VRM viewer for macOS

## Supported systems
- macOS 10.13 (High Sierra) or later
## Installation
1. Download the latest **VRMPreview_vX.X.X.dmg** from the [Releases](https://github.com/magicien/VRMPreview/releases/latest) page.

2. Copy "VRMPreview" from the dmg into "Applications".
## Building from source (for those who want to modify it)
Install [Carthage](https://github.com/Carthage/Carthage) if you don't have it.
Then run the following commands:
```
$ git clone https://github.com/magicien/VRMPreview.git
$ cd VRMPreview
$ carthage bootstrap --platform mac
$ xcodebuild
```
## Related repositories
- [VRMQuickLook](https://github.com/magicien/VRMQuickLook/) - QuickLook plugin for VRM on macOS
- [GLTFSceneKit](https://github.com/magicien/GLTFSceneKit/) - glTF loader for SceneKit
| 23.297297 | 114 | 0.769142 | yue_Hant | 0.924153 |
2f27fa10aba1d673f51c5434e1bf694950ed5ff8 | 296 | md | Markdown | 02-c/04-alg/compression/README.md | al2698/sp | fceabadef49ffe6ed25dfef38e3dc418f309e128 | ["MIT"] | null | null | null | 02-c/04-alg/compression/README.md | al2698/sp | fceabadef49ffe6ed25dfef38e3dc418f309e128 | ["MIT"] | null | null | null | 02-c/04-alg/compression/README.md | al2698/sp | fceabadef49ffe6ed25dfef38e3dc418f309e128 | ["MIT"] | null | null | null
# Compression
## lossless compression
* https://github.com/jserv/x-compressor
* https://en.wikipedia.org/wiki/Golomb_coding
## JPEG image compression
* https://github.com/kornelski/jpeg-compressor
* https://github.com/google/guetzli
* https://github.com/denguir/JPEG-image-compression
| 18.5 | 51 | 0.746622 | yue_Hant | 0.289991 |
2f28a6349bb7b426ff2ccd8365c732741c129419 | 420 | md | Markdown | _posts/2021-07-21-BeforeAfter-Arousal-20210721190504059077.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | ["Apache-2.0"] | null | null | null | _posts/2021-07-21-BeforeAfter-Arousal-20210721190504059077.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | ["Apache-2.0"] | null | null | null | _posts/2021-07-21-BeforeAfter-Arousal-20210721190504059077.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | ["Apache-2.0"] | null | null | null
---
title: "Before/After Arousal"
metadate: "hide"
categories: [ God Pussy ]
image: "https://external-preview.redd.it/KBWgp4OJK9I2m-Kpz21VOxYuEH2RDUiumkhVEwqV5nw.jpg?auto=webp&s=1ca5442f9f20829f014df70ab0085adc0989dbcf"
thumb: "https://external-preview.redd.it/KBWgp4OJK9I2m-Kpz21VOxYuEH2RDUiumkhVEwqV5nw.jpg?width=320&crop=smart&auto=webp&s=a1bc09ffcc906fb27078453c18a7bdcd1738f56a"
visit: ""
---
Before/After Arousal
| 42 | 163 | 0.816667 | yue_Hant | 0.290284 |
2f28c7fa41be7d4de882d662acb69331e78b2778 | 195 | md | Markdown | README.md | Dan-scb/jogadorVsmonstro | d46b8bbcc54737e25a835f08ee65ab3dbe9750bf | ["MIT"] | null | null | null | README.md | Dan-scb/jogadorVsmonstro | d46b8bbcc54737e25a835f08ee65ab3dbe9750bf | ["MIT"] | null | null | null | README.md | Dan-scb/jogadorVsmonstro | d46b8bbcc54737e25a835f08ee65ab3dbe9750bf | ["MIT"] | null | null | null
# jogadorVsmonstro
I created this small project to practice my Vue skills.
In this project Vue is loaded through a CDN, so just open index.html and have fun.
| 39 | 97 | 0.779487 | por_Latn | 1.000008 |
2f28e218dae2b09ea5ad720d556dea368366f6f7 | 3,100 | md | Markdown | packages/instrumentation-socket.io/README.md | seemk/opentelemetry-ext-js | 5699263e7b682ac99ad34c88c1c822ccf76216c9 | ["Apache-2.0"] | null | null | null | packages/instrumentation-socket.io/README.md | seemk/opentelemetry-ext-js | 5699263e7b682ac99ad34c88c1c822ccf76216c9 | ["Apache-2.0"] | null | null | null | packages/instrumentation-socket.io/README.md | seemk/opentelemetry-ext-js | 5699263e7b682ac99ad34c88c1c822ccf76216c9 | ["Apache-2.0"] | null | null | null
# OpenTelemetry socket.io Instrumentation for Node.js
[](https://www.npmjs.com/package/opentelemetry-instrumentation-socket.io)
This module provides automatic instrumentation for [`socket.io`](https://github.com/socketio/socket.io).
## Installation
```
npm install --save opentelemetry-instrumentation-socket.io
```
## Usage
For further automatic instrumentation instructions, see the [@opentelemetry/instrumentation](https://github.com/open-telemetry/opentelemetry-js/tree/main/packages/opentelemetry-instrumentation) package.
```js
const { NodeTracerProvider } = require('@opentelemetry/node');
const { registerInstrumentations } = require('@opentelemetry/instrumentation');
const { SocketIoInstrumentation } = require('opentelemetry-instrumentation-socket.io');
registerInstrumentations({
traceProvider,
instrumentations: [
new SocketIoInstrumentation({
      // see below for available configuration options
})
]
});
```
### socket.io Instrumentation Options
The socket.io instrumentation has a few options available to choose from. You can set the following:
| Options | Type | Description |
| -------------- | -------------------------------------- | ----------------------------------------------------------------------------------------------- |
| `emitHook` | `SocketIoHookFunction` | hook for adding custom attributes before socket.io emits the event |
| `onHook` | `SocketIoHookFunction` | hook for adding custom attributes before the event listener (callback) is invoked |
| `traceReserved` | `boolean` | set to true if you want to trace socket.io reserved events (see https://socket.io/docs/v4/emit-cheatsheet/#Reserved-events) |
| `filterHttpTransport`| `HttpTransportInstrumentationConfig` | set if you want to filter out the HTTP traces when using HTTP polling as the transport (see below)
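For illustration, a configuration using these options might look as follows. This is a config-fragment sketch: the exact hook signature shown here is an assumption — consult the package's TypeScript definitions (`SocketIoHookFunction`) for the real shape:

```js
const { SocketIoInstrumentation } = require('opentelemetry-instrumentation-socket.io');

const instrumentation = new SocketIoInstrumentation({
  traceReserved: true, // also trace reserved events such as `connect` / `disconnect`
  // Assumed hook shape: the hook receives the created span (plus event info)
  // and may attach custom attributes before the emit / listener runs.
  emitHook: (span) => span.setAttribute('app.hook', 'emit'),
  onHook: (span) => span.setAttribute('app.hook', 'on'),
});
```

Pass the configured instance to `registerInstrumentations` as shown in the Usage section above.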
#### HttpTransportInstrumentationConfig
If you use `opentelemetry-instrumentation-socket.io` alongside `instrumentation-http`, socket.io might use HTTP polling as the transport method. Therefore, you will see an HTTP span created as the parent of the socket.io span.
To filter out those spans, we use `HttpTransportInstrumentationConfig`.
`HttpTransportInstrumentationConfig` has a few options available to choose from. You can set the following:
| Options | Type | Description |
| -------------- | -------------------------------------- | ----------------------------------------------------------------------------------------------- |
| `httpInstrumentation`| `HttpInstrumentation` | the instance of HttpInstrumentation you pass to `registerInstrumentations`|
| `socketPath` | `string` | the socket.io endpoint path (defaults to `/socket.io/`) |
---
This extension (and many others) was developed by [Aspecto](https://www.aspecto.io/) with ❤️
| 57.407407 | 227 | 0.633871 | eng_Latn | 0.769649 |
2f292faa6f6145ab10780a40060562e875f45af1 | 2,971 | md | Markdown | docs/framework/wcf/feature-details/transaction-models.md | eOkadas/docs.fr-fr | 64202ad620f9bcd91f4360ec74aa6d86e1d4ae15 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/framework/wcf/feature-details/transaction-models.md | eOkadas/docs.fr-fr | 64202ad620f9bcd91f4360ec74aa6d86e1d4ae15 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/framework/wcf/feature-details/transaction-models.md | eOkadas/docs.fr-fr | 64202ad620f9bcd91f4360ec74aa6d86e1d4ae15 | ["CC-BY-4.0", "MIT"] | null | null | null
---
title: Transaction Models
ms.date: 03/30/2017
ms.assetid: 48a8bc1b-128b-4cf1-a421-8cc73223c340
ms.openlocfilehash: 8731b72d0657aa420dbb020e216c3af059916ce9
ms.sourcegitcommit: 9b552addadfb57fab0b9e7852ed4f1f1b8a42f8e
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 04/23/2019
ms.locfileid: "62050777"
---
# <a name="transaction-models"></a>Transaction Models
This topic describes the relationship between the transaction programming models and the infrastructure components that Microsoft provides.
When you use transactions in Windows Communication Foundation (WCF), it is important to understand that you are not choosing between different transactional models, but rather operating at different layers of a single, integrated, and consistent model.
The following sections describe the three main components of a transaction.
## <a name="windows-communication-foundation-transactions"></a>Windows Communication Foundation (WCF) transactions
Transaction support in WCF enables you to write transactional services. In addition, with its support for the WS-AtomicTransaction (WS-AT) protocol, applications can flow transactions to Web services built using WCF or third-party technology.
In a WCF application or service, WCF transaction features provide attributes and configuration to declaratively specify how and when the infrastructure should create, flow, and synchronize transactions.
## <a name="systemtransactions-transactions"></a>System.Transactions transactions
The <xref:System.Transactions> namespace provides an explicit programming model based on the <xref:System.Transactions.Transaction> class, as well as an implicit programming model using the <xref:System.Transactions.TransactionScope> class, in which the infrastructure manages transactions automatically.
For more information about creating a transactional application using these two models, see [Writing a Transactional Application](https://go.microsoft.com/fwlink/?LinkId=94947).
In a WCF application or service, <xref:System.Transactions> provides the programming model for creating transactions within a client application and for interacting explicitly with a transaction, when necessary, within a service.
## <a name="msdtc-transactions"></a>MSDTC transactions
The Microsoft Distributed Transaction Coordinator (MSDTC) is a transaction manager that supports distributed transactions.
For more information, see the [DTC Programmer's Reference](https://go.microsoft.com/fwlink/?LinkId=94948).
In a WCF application or service, MSDTC provides the infrastructure for coordinating transactions created within a client or a service.
| 80.297297 | 331 | 0.808482 | fra_Latn | 0.963176 |
2f2a68740f8215735e8f8b342a72bec61c18cc25 | 699 | md | Markdown | packages/hooks/src/useMouse/index.zh-CN.md | huxinfeng/hooks | e276025c665c2fb41f5ee73d9c3b49b676cbe418 | ["MIT"] | null | null | null | packages/hooks/src/useMouse/index.zh-CN.md | huxinfeng/hooks | e276025c665c2fb41f5ee73d9c3b49b676cbe418 | ["MIT"] | null | null | null | packages/hooks/src/useMouse/index.zh-CN.md | huxinfeng/hooks | e276025c665c2fb41f5ee73d9c3b49b676cbe418 | ["MIT"] | null | null | null
---
title: useMouse
nav:
title: Hooks
path: /hooks
group:
title: Dom
path: /dom
---
<Tag lang="zh-CN" tags="ssr"></Tag>
# useMouse
A hook that tracks the position of the mouse cursor.
## Examples
### Basic usage
<code src="./demo/demo1.tsx" />
## API
```typescript
const state: {
screenX: number,
screenY: number,
clientX: number,
clientY: number,
pageX: number,
pageY: number,
} = useMouse();
```
### Result
| Property | Description | Type |
|---------|------------------------|----------|
| screenX | Distance from the left side of the screen | `number` |
| screenY | Distance from the top of the screen | `number` |
| clientX | Distance from the left side of the current viewport | `number` |
| clientY | Distance from the top of the current viewport | `number` |
| pageX | Distance from the left side of the full page | `number` |
| pageY | Distance from the top of the full page | `number` |
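A minimal consumer of the result object might look like this (an illustrative sketch; the import path assumes the package is consumed like ahooks, from which this repository appears to be derived):

```tsx
import React from 'react';
import { useMouse } from 'ahooks';

export default function MousePosition() {
  const mouse = useMouse();
  return (
    <div>
      page: ({mouse.pageX}, {mouse.pageY}) — client: ({mouse.clientX}, {mouse.clientY})
    </div>
  );
}
```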
| 15.195652 | 47 | 0.532189 | yue_Hant | 0.365629 |
2f2a781046a287689aa7338ce9c6ac82d4c8a0cd | 1,467 | md | Markdown | docs/build/importing-into-an-application.md | drvoss/cpp-docs.ko-kr | dda556c732d97e5959be3b39dc331ded7eda8bb3 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/build/importing-into-an-application.md | drvoss/cpp-docs.ko-kr | dda556c732d97e5959be3b39dc331ded7eda8bb3 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/build/importing-into-an-application.md | drvoss/cpp-docs.ko-kr | dda556c732d97e5959be3b39dc331ded7eda8bb3 | ["CC-BY-4.0", "MIT"] | null | null | null
---
title: "Importing into an Application | Microsoft Docs"
ms.custom:
ms.date: 11/04/2016
ms.reviewer:
ms.suite:
ms.technology:
- cpp-tools
ms.tgt_pltfrm:
ms.topic: article
dev_langs:
- C++
helpviewer_keywords:
- DLLs [C++], importing
- importing DLLs [C++], applications
- applications [C++], importing into
ms.assetid: 9d646466-e12e-4710-8ad9-c819c0375fcc
caps.latest.revision:
author: corob-msft
ms.author: corob
manager: ghogen
ms.workload:
- cplusplus
ms.openlocfilehash: 3995aa9a0348f53c91fadb7dffc5fa3549851a94
ms.sourcegitcommit: 8fa8fdf0fbb4f57950f1e8f4f9b81b4d39ec7d7a
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 12/21/2017
---
# <a name="importing-into-an-application"></a>Importing into an Application
Functions can be imported into an application in two ways:
- By using the **__declspec(dllimport)** keyword on the function definitions in the main application
- By using a module-definition (.def) file together with **__declspec(dllimport)**
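For context, a sketch of the first approach on the client side (the function name is a placeholder, and building this requires linking against the DLL's import library):

```cpp
// Client application code (illustrative; MyFunc is a hypothetical DLL export).
// Declaring the function __declspec(dllimport) tells the compiler it lives in
// a DLL, so it can emit a more efficient indirect call through the import table.
__declspec(dllimport) int MyFunc(int value);

int main() {
    return MyFunc(41);  // resolved at load time via the DLL's import table
}
```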
## <a name="what-do-you-want-to-do"></a>What do you want to do?
- [Import into an application using __declspec(dllimport)](../build/importing-into-an-application-using-declspec-dllimport.md)
- [Import function calls using __declspec(dllimport)](../build/importing-function-calls-using-declspec-dllimport.md)
- [Import data using __declspec(dllimport)](../build/importing-data-using-declspec-dllimport.md)
- [Import using DEF files](../build/importing-using-def-files.md)
## <a name="see-also"></a>See also
[Importing and Exporting](../build/importing-and-exporting.md)
| 30.5625 | 121 | 0.698705 | kor_Hang | 0.991517 |
2f2ae04b48ac811481356b8786c1173650cf7605 | 5,641 | md | Markdown | _posts/2019-01-14-dynamics-simple-pendulum-lagrangian.md | SyrianSpock/syrianspock.github.io | de57900abe0fdaae7bd008abd46759d78a9b0529 | ["MIT"] | null | null | null | _posts/2019-01-14-dynamics-simple-pendulum-lagrangian.md | SyrianSpock/syrianspock.github.io | de57900abe0fdaae7bd008abd46759d78a9b0529 | ["MIT"] | 2 | 2016-08-31T13:47:03.000Z | 2019-07-08T07:39:09.000Z | _posts/2019-01-14-dynamics-simple-pendulum-lagrangian.md | SyrianSpock/syrianspock.github.io | de57900abe0fdaae7bd008abd46759d78a9b0529 | ["MIT"] | 1 | 2016-08-31T12:04:53.000Z | 2016-08-31T12:04:53.000Z
---
layout: post
title: "Robot Dynamics: Lagrangian formalism"
categories: ['Robotics']
tags: ['robotics', 'dynamics', 'mathematics', 'lagrangian mechanics']
math: true
---
As we have seen in the [first post of this series](/robotics/2018/12/29/dynamics-simple-pendulum/), deriving the equations of motion, even for a system as simple as a pendulum, can be tedious.
Classical mechanics (Newtonian mechanics) requires us to describe all the forces acting upon the system, including constraint forces.
For complex systems this quickly becomes time consuming.
However, in the 18th century, Joseph-Louis Lagrange introduced a very interesting reformulation of classical mechanics.
The Lagrangian formalism allows us to derive the equations of motion from energies alone, without dealing with constraint forces at all, provided that we choose our generalized coordinates properly.
# Lagrange's promise
In the Lagrangian formalism[^1], we define a quantity called the Lagrangian, $L$, as the difference between the kinetic energy $T$ of the system and its potential energy $U$
$$
L = T - U
$$
Given that this quantity is expressed with respect to a set of generalized coordinates $q_i$ and their associated time derivatives $\dot{q_i}$, the Euler-Lagrange equation
$$
\frac{\partial{L}}{\partial{q_i}} = \frac{d}{dt} \frac{\partial{L}}{\partial{\dot{q_i}}}
$$
allows us to derive the equations of motion of our system.
So let's go through this step by step for the pendulum.
# Generalized coordinates
We call generalized coordinates any arbitrary set of parameters that describe the configuration of a system.
We don't have to stick to conventional Cartesian coordinates or roll with exotic polar, cylindrical, or spherical coordinates.
We want to make our life easy, so we select a minimal set that has enough parameters to fully describe the dynamics of our system.
So the number of generalized coordinates will be equal to the number of degrees of freedom.
For our pendulum, it's clear that we have a single degree of freedom: the angle $\theta$.
We'll choose this as our generalized coordinate.
The pendulum's position in Cartesian coordinates can be computed at any time given $\theta$
$$
r =
\begin{bmatrix}
l \sin{\theta} \\
- l \cos{\theta} \\
\end{bmatrix}
$$
# Kinetic energy
The kinetic energy[^2] $T$ of any system can be computed as the sum of the kinetic energies of all its moving parts with masses $m_i$ along all axes of motion $x_i$.
This mathematically translates to
$$
T = \sum_i{\frac{1}{2} m_i \dot{x_i}^2}
$$
We can compute the velocity in Cartesian coordinates by differentiating the equations from the last section with respect to time, yielding
$$
\dot{r} =
\begin{bmatrix}
l \dot{\theta} \cos{\theta} \\
l \dot{\theta} \sin{\theta} \\
\end{bmatrix}
$$
Now we plug this back in the formula for kinetic energy, assuming a pendulum of mass $m$ to get the kinetic energy as a function of the angular velocity $\dot{\theta}$
$$
T
= \frac{1}{2} m (l \dot{\theta} \cos{\theta})^2 + \frac{1}{2} m (l \dot{\theta} \sin{\theta})^2
= \frac{1}{2} m l^2 \dot{\theta}^2
$$
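As a quick numerical sanity check (a sketch added for illustration, not part of the original derivation), we can verify that the Cartesian and generalized expressions for $T$ agree for arbitrary states:

```python
import math

def kinetic_energy_cartesian(m, l, theta, theta_dot):
    # Cartesian velocity components of the pendulum bob
    x_dot = l * theta_dot * math.cos(theta)
    y_dot = l * theta_dot * math.sin(theta)
    return 0.5 * m * (x_dot**2 + y_dot**2)

def kinetic_energy_generalized(m, l, theta_dot):
    # T = 1/2 m l^2 theta_dot^2, the simplified form above
    return 0.5 * m * l**2 * theta_dot**2

m, l = 2.0, 1.5
for theta, theta_dot in [(0.3, 1.2), (-1.0, 0.7), (2.5, -0.4)]:
    a = kinetic_energy_cartesian(m, l, theta, theta_dot)
    b = kinetic_energy_generalized(m, l, theta_dot)
    assert math.isclose(a, b)
print("Cartesian and generalized T agree for all samples")
```

The agreement holds for any $\theta$ because $\cos^2{\theta} + \sin^2{\theta} = 1$.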
# Potential energy
We assumed our pendulum to only be subject to gravity, no dissipative forces apply to it.
Thus its potential energy[^3] $U$ can be defined as a function of its position in the direction of gravity (i.e. $y$ the second Cartesian axis)
$$
U = m g y = - m g l \cos{\theta}
$$
# Invoking Euler-Lagrange
Now our Lagrangian is expressed as function of our generalized coordinate $\theta$ and its associated time derivative $\dot{\theta}$
$$
L(\theta, \dot{\theta}) = \frac{1}{2} m l^2 \dot{\theta}^2 + m g l \cos{\theta}
$$
We start by computing the left side of the Euler-Lagrange equation for our pendulum
$$
\frac{\partial{L}}{\partial{\theta}} = - m g l \sin{\theta}
$$
then we compute the right side
$$
\frac{d}{dt} \frac{\partial{L}}{\partial{\dot{\theta}}}
= \frac{d}{dt} \left( m l^2 \dot{\theta} \right)
= m l^2 \ddot{\theta}
$$
And now for the final act, I will merge both sides to reconstitute the Euler-Lagrange equation
$$
- m g l \sin{\theta} = m l^2 \ddot{\theta}
$$
we rearrange the terms a bit
$$
\ddot{\theta} = - \frac{g}{l} \sin{\theta}
$$
...and tadam!
That looks familiar.
We end up with the same equation of motion previously computed using Newtonian mechanics in the [first episode of this series](/robotics/2018/12/29/dynamics-simple-pendulum/).
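As another cross-check (a small numerical sketch, not part of the original post), we can integrate this equation of motion and verify that for small amplitudes the period approaches the textbook value $2\pi\sqrt{l/g}$:

```python
import math

def half_period(theta0, l=1.0, g=9.81, dt=1e-5):
    """Integrate theta'' = -(g/l) sin(theta), released from rest at theta0,
    until the bob reaches the opposite turning point."""
    theta, omega, t = theta0, 0.0, 0.0
    while True:
        omega += -(g / l) * math.sin(theta) * dt   # semi-implicit Euler step
        theta += omega * dt
        t += dt
        if omega > 0:   # velocity flipped sign: other turning point reached
            return t

T_small = 2 * half_period(0.05)                  # small-amplitude period from simulation
T_ideal = 2 * math.pi * math.sqrt(1.0 / 9.81)    # small-angle analytic period
assert abs(T_small - T_ideal) / T_ideal < 0.01
# The large-amplitude period is longer, as expected for the full nonlinear equation:
assert 2 * half_period(2.0) > T_ideal
print(T_small, T_ideal)
```

For a 2-radian release the simulated period comes out noticeably longer, which is exactly the nonlinear behavior the $\sin{\theta}$ term encodes.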
# But does it scale?
Now this is all fun and simple, but what happens for a more complex system?
Say a double or a triple pendulum?
Well... things are simpler in Lagrange's world, but it's still a long and dull process of deriving expressions by hand.
Surely we can make computers do some of the work, if not most of it.
Haven't you heard of [automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation), can't we use that?
Sadly not with the Lagrangian approach, because of the explicit time derivative in the Euler-Lagrange equation.
Remember that our generalized coordinates $q$ and their associated time derivatives $\dot{q}$ are both functions of time: $q = q(t)$ and $\dot{q} = \dot{q}(t)$.
In the [next episode](/robotics/2019/07/07/dynamics-simple-pendulum-hamiltonian/), we'll dive into Hamilton's improvement on Lagrange's approach and see how we can leverage it to compute the equations of motion of any system with a minimal set of input expressions defining it.
## References
[^1]: Lagrangian mechanics on [Wikipedia](https://en.wikipedia.org/wiki/Lagrangian_mechanics).
[^2]: Kinetic energy on [Wikipedia](https://en.wikipedia.org/wiki/Kinetic_energy)
[^3]: Potential energy on [Wikipedia](https://en.wikipedia.org/wiki/Potential_energy)
---
title: Attributes Property (ADOX) | Microsoft Docs
ms.prod: sql
ms.prod_service: connectivity
ms.technology: connectivity
ms.custom: ''
ms.date: 01/19/2017
ms.reviewer: ''
ms.topic: conceptual
apitype: COM
f1_keywords:
- _Column::put_Attributes
- _Column::Attributes
- _Column::PutAttributes
- _Column::get_Attributes
- _Column::GetAttributes
helpviewer_keywords:
- Attributes property [ADOX]
ms.assetid: e3abb359-79a3-4c22-b3a8-2900817e0d23
author: MightyPen
ms.author: genemi
ms.openlocfilehash: fcd0e70dd9c505b9e2b0752c33b9e768b9127472
ms.sourcegitcommit: b2464064c0566590e486a3aafae6d67ce2645cef
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 07/15/2019
ms.locfileid: "67967149"
---
# <a name="attributes-property-adox"></a>Attributes (propiedad, ADOX)
Describe las características de la columna.
## <a name="settings-and-return-values"></a>Configuración y valores devueltos
Establece o devuelve un **largo** valor. El valor especifica las características de la tabla que está representado por la [columna](../../../ado/reference/adox-api/column-object-adox.md) objeto. El valor puede ser una combinación de [ColumnAttributesEnum](../../../ado/reference/adox-api/columnattributesenum.md) constantes. El valor predeterminado es cero (**0**), que no es ni **adColFixed** ni **adColNullable**.
## <a name="applies-to"></a>Se aplica a
- [Objeto Column (ADOX)](../../../ado/reference/adox-api/column-object-adox.md)
## <a name="see-also"></a>Vea también
[Ejemplo de propiedad Attributes (VB)](../../../ado/reference/adox-api/attributes-property-example-vb.md)
## Add new Raptor Layer
Informs the project about potential environmental constraints.
* Find the nest
* Add it to the project
* Check the results

# Events
## Attention(data)
Event with the user's current attention level.
## Blink(data)
Event with the user's current blink level.
## EEG(data)
Event showing EEG data.
```
{ 'Delta': 7023617,
  'Theta': 15294464,
  'LoAlpha': 15209472,
  'HiAlpha': 13321984,
  'LoBeta': 4527616,
  'HiBeta': 12073472,
  'LoGamma': 862464,
  'MidGamma': 13637632 }
```
## Extended(data)
Event with the user's current extended level.
## Meditation(data)
Event with the user's current meditation level.
## Signal(data)
Event showing signal strength.
## Wave(data)
Event showing wave data.
## Start
Gets triggered when the Mindwave is started and ready to be used.
---
title: Planning for an Azure Files deployment | Microsoft Docs
description: Learn what to consider when planning for an Azure Files deployment.
services: storage
author: roygara
ms.service: storage
ms.topic: article
ms.date: 04/25/2019
ms.author: rogarana
ms.subservice: files
ms.openlocfilehash: 9144165a3ce593dce11b5e50ce5f0af9f0afa480
ms.sourcegitcommit: 509e1583c3a3dde34c8090d2149d255cb92fe991
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 05/27/2019
ms.locfileid: "66237670"
---
# <a name="planning-for-an-azure-files-deployment"></a>Planeamiento de una implementación de Azure Files
[Azure Files](storage-files-introduction.md) ofrece recursos compartidos de archivos en la nube totalmente administrados a los que se puede acceder mediante el protocolo SMB estándar. Dado que Azure Files está totalmente administrado, su implementación en escenarios de producción resulta mucho más sencilla que la implementación y administración de un servidor de archivos o un dispositivo NAS. En este artículo se tratan las cuestiones que deben tenerse en cuenta al implementar un recurso compartido de archivos de Azure para su uso en producción dentro de la organización.
## <a name="management-concepts"></a>Conceptos de administración
El siguiente diagrama muestra las construcciones de administración de Azure Files:

* **Storage Account** (Cuenta de almacenamiento): Todo el acceso a Azure Storage se realiza a través de una cuenta de almacenamiento. Consulte el artículo sobre los [objetivos de escalado y rendimiento](../common/storage-scalability-targets.md?toc=%2fazure%2fstorage%2ffiles%2ftoc.json) para información sobre la capacidad de la cuenta de almacenamiento.
* **Recurso compartido**: un recurso compartido de File Storage es un recurso compartido de archivos de SMB en Azure. Todos los directorios y archivos se deben crear en un recurso compartido principal. Una cuenta puede contener un número ilimitado de recursos compartidos, y un recurso compartido puede almacenar un número ilimitado de archivos, hasta la capacidad total de 5 TiB del recurso compartido de archivos.
* **Directorio**: una jerarquía de directorios opcional.
* **Archivo**: se trata de un archivo del recurso compartido. Un archivo puede tener un tamaño de hasta 1 TiB.
* **Formato de dirección URL**: en las solicitudes a un recurso compartido de archivos de Azure realizadas con el protocolo de REST de archivo, los archivos son direccionables mediante el formato de dirección URL siguiente:
```
https://<storage account>.file.core.windows.net/<share>/<directory>/<file>
```
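As a small illustration, the URL format above can be assembled programmatically (the helper below is a hypothetical sketch for this article, not part of any Azure SDK):

```python
def file_url(account, share, directory, filename):
    """Build a File REST URL following the format shown above."""
    return f"https://{account}.file.core.windows.net/{share}/{directory}/{filename}"

url = file_url("mystorageacct", "myshare", "mydirectory", "report.txt")
assert url == "https://mystorageacct.file.core.windows.net/myshare/mydirectory/report.txt"
print(url)
```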
## <a name="data-access-method"></a>Método de acceso a datos
Azure Files ofrece dos cómodos métodos de acceso a datos integrados que puede usar por separado o combinados entre sí para acceder a los datos:
1. **Acceso directo a la nube**: cualquier recurso compartido de archivos de Azure se puede montar mediante [Windows](storage-how-to-use-files-windows.md), [macOS](storage-how-to-use-files-mac.md) o [Linux](storage-how-to-use-files-linux.md) con el protocolo de bloque de mensaje de servidor (SMB) estándar del sector o a través de la API de REST de archivo. Con SMB, las operaciones de lectura y escritura en archivos del recurso compartido se realizan directamente en el recurso compartido de archivos en Azure. Para montar mediante una máquina virtual en Azure, el cliente SMB del sistema operativo debe ser compatible al menos con SMB 2.1. Para montar en local, como en la estación de trabajo de un usuario, el cliente SMB compatible con la estación de trabajo debe ser compatible al menos con SMB 3.0 (con cifrado). Además de SMB, hay nuevas aplicaciones o servicios que pueden acceder directamente al recurso compartido de archivos a través de REST de archivo, lo que proporciona una interfaz de programación de aplicaciones escalable y sencilla para el desarrollo de software.
2. **Azure File Sync**: con Azure File Sync, se pueden replicar los recursos compartidos de archivos a servidores de Windows Servers de forma local o en Azure. Los usuarios accederían al recurso compartido de archivos mediante el servidor de Windows Server, por ejemplo, a través de un recurso compartido de SMB o NFS. Esto resulta útil en escenarios en los que es necesario acceder a los datos y modificarlos lejos de un centro de datos de Azure, como puede ser en una sucursal. Lo datos pueden replicarse entre varios puntos de conexión de Windows Server, por ejemplo, entre varias sucursales. Por último, los datos pueden colocarse en niveles en Azure Files, de modo que se pueda seguir accediendo a todos los datos a través del servidor, pero este no tenga una copia completa de ellos. Los datos se recuperan sin problemas cuando los abre el usuario.
En la tabla siguiente se muestra cómo pueden acceder los usuarios y las aplicaciones al recurso compartido de archivos de Azure:
| | Acceso directo a la nube | Azure File Sync |
|------------------------|------------|-----------------|
| ¿Qué protocolos necesita usar? | Azure Files admite SMB 2.1, SMB 3.0 y API de REST de archivo. | Acceda al recurso compartido de archivos de Azure a través de cualquier protocolo compatible de Windows Server (SMB, NFS, FTPS, etc.) |
| ¿Dónde se ejecuta la carga de trabajo? | **En Azure**: Azure Files ofrece acceso directo a los datos. | **En local con red lenta**: los clientes de Windows, Linux y macOS pueden montar un recurso compartido de archivos de Windows local como una memoria caché rápida del recurso compartido de archivos de Azure. |
| ¿Qué nivel de ACL necesita? | Nivel de recurso compartido y archivo. | Nivel de recurso compartido, archivo y usuario. |
## <a name="data-security"></a>Seguridad de los datos
Azure Files tiene varias opciones integradas para garantizar la seguridad de los datos:
* Admite el cifrado en ambos protocolos inalámbricos: cifrado SMB 3.0 y REST de archivo a través de HTTPS. De forma predeterminada:
* Los clientes que admiten el cifrado SMB 3.0 enviar y reciben datos a través de un canal cifrado.
* Los clientes que no admiten SMB 3.0 con cifrado pueden comunicarse entre centros de datos a través de SMB 2.1 o SMB 3.0 sin cifrado. No se permite a los clientes SMB comunicarse entre centros de datos a través de SMB 2.1 o SMB 3.0 sin cifrado.
* Los clientes pueden comunicarse a través de REST de archivo con HTTP o HTTPS.
* Cifrado en reposo ([Azure Storage Service Encryption](../common/storage-service-encryption.md?toc=%2fazure%2fstorage%2ffiles%2ftoc.json)): La característica Storage Service Encryption está habilitada para todas las cuentas de almacenamiento. Los datos en reposo se cifran con claves completamente administradas. En el cifrado en reposo no se aumentan los costos de almacenamiento ni se reduce el rendimiento.
* Requisito opcional de datos cifrados en tránsito: cuando está seleccionado, Azure Files rechaza el acceso a los datos a través de canales sin cifrar. En concreto, solo se permiten HTTPS y SMB 3.0 con conexiones de cifrado.
> [!Important]
> La exigencia de transferencia segura de datos hace que los clientes SMB más antiguos que no son capaces de comunicarse con SMB 3.0 con cifrado experimenten un error. Para más información, consulte [Montaje en Windows](storage-how-to-use-files-windows.md), [Montaje en Linux](storage-how-to-use-files-linux.md) y [Montaje en macOS](storage-how-to-use-files-mac.md).
Para lograr la máxima seguridad, se recomienda encarecidamente habilitar siempre el cifrado en reposo y el cifrado de datos en tránsito cuando se usen clientes modernos para acceder a los datos. Por ejemplo, si tiene que montar un recurso compartido en una máquina virtual de Windows Server 2008 R2 que solo es compatible con SMB 2.1, debe permitir el tráfico sin cifrar a la cuenta de almacenamiento, dado que SMB 2.1 no admite el cifrado.
Si usa Azure File Sync para acceder al recurso compartido de archivos de Azure, use siempre HTTPS y SMB 3.0 con cifrado para sincronizar los datos en los servidores de Windows Server, independientemente de si se exige cifrado de datos en reposo.
## <a name="file-share-performance-tiers"></a>Niveles de rendimiento de un recurso compartido de archivos
Azure Files ofrece dos niveles de rendimiento: estándar y premium.
* Los **recursos compartidos de archivos estándar** tienen el respaldo de discos duros (HDD) que giran y ofrecen un rendimiento confiable para cargas de trabajo de E/S que no dan tanta importancia a la variabilidad del rendimiento, como recursos compartidos de archivos de uso general y entornos de desarrollo y pruebas. Los recursos compartidos de archivos estándar solo están disponibles en un modelo de facturación de pago por uso.
* Los **recursos compartidos de archivos Premium (versión preliminar)** están respaldados por discos en estado sólido (SSD) que proporcionan alto rendimiento y baja latencia de forma consistente en menos de 10 milisegundos en la mayoría de las operaciones de E/S para las cargas de trabajo intensivas con mayor uso de E/S, lo que hace que sean adecuados para una amplia variedad de cargas de trabajo como bases de datos, hospedaje de sitios web, entornos de desarrollo, etc. Los recursos compartidos de archivos Premium solo están disponibles en un modelo de facturación aprovisionada. Recursos compartidos de archivos Premium usan un modelo de implementación independiente de los recursos compartidos de archivos estándar.
Azure Backup está disponible para recursos compartidos de archivos de premium y Azure Kubernetes Service es compatible con premium recursos compartidos de archivos en la versión 1.13 y versiones posteriores.
Si desea obtener información sobre cómo crear un recurso compartido de archivos de premium, consulte nuestro artículo sobre el tema: [Cómo crear una cuenta de premium de Azure file storage](storage-how-to-create-premium-fileshare.md).
Actualmente, no se puede convertir directamente entre un recurso compartido de archivos estándar y un recurso compartido de archivos de premium. Si desea cambiar a cualquier nivel, debe crear un nuevo recurso compartido de archivos en ese nivel y copiar manualmente los datos desde el recurso compartido original en el nuevo recurso compartido que creó. Puede hacerlo mediante cualquiera de las herramientas de copia de archivos de Azure compatibles, como AzCopy.
> [!IMPORTANT]
> Recursos compartidos de archivos Premium están aún en versión preliminar, solo están disponibles con LRS y están disponibles en la mayoría de las regiones que ofrecen las cuentas de almacenamiento. Para averiguar si los recursos compartidos de archivos premium están disponibles actualmente en su región, consulte el [productos disponibles por región](https://azure.microsoft.com/global-infrastructure/services/?products=storage) página de Azure.
### <a name="provisioned-shares"></a>Recursos compartidos aprovisionados
Recursos compartidos de archivos de Premium (versión preliminar) se aprovisionan en función de una relación fija de GiB/IOPS/rendimiento. Por cada GiB aprovisionado, se generará un IOPS y un rendimiento de 0,1 MiB por segundo en el recurso compartido hasta los límites máximos por recurso compartido. El aprovisionamiento mínimo que se permite es 100 GiB con un IOPS/rendimiento mínimos.
En su máximo esfuerzo, todos los recursos compartidos pueden aumentar hasta tres IOPS por GiB de almacenamiento aprovisionado durante 60 minutos, o más, según el tamaño del recurso compartido. Los nuevos recursos compartidos comienzan con todos los créditos de aumento según la capacidad aprovisionada.
Los recursos compartidos deben aprovisionarse en incrementos de 1 GB. Tamaño mínimo es 100 GB, siguiente tamaño es GiB 101 y así sucesivamente.
> [!TIP]
> Línea de base de e/s por segundo = 1 * aprovisionado GiB. (Hasta un máximo de 100 000 e/s por segundo).
>
> Límite de ráfaga = 3 * la línea de base de e/s por segundo. (Hasta un máximo de 100 000 e/s por segundo).
>
> Velocidad de salida = 60 MiB/s + 0,06 * GiB aprovisionado
>
> Velocidad de entrada = 40 MiB/s + 0,04 * GiB aprovisionado
Tamaño del recurso compartido se puede aumentar en cualquier momento, pero se puede reducir únicamente después de 24 horas desde el último incremento. Después de esperar 24 horas sin un aumento de tamaño, puede reducir el tamaño del recurso compartido tantas veces como sea necesario, hasta que aumente de nuevo. Los cambios de escala IOPS/rendimiento será efectivos en cuestión de minutos después del cambio de tamaño.
Es posible reducir el tamaño de los recursos compartidos aprovisionados por debajo de su GiB usado. Si lo hace, no se pierden los datos, pero, aún se facturará por el tamaño usado y el rendimiento (e/s por segundo de la línea de base, el rendimiento y e/s por segundo de ráfaga) del recurso compartido aprovisionado, no el tamaño usado de recepción.
En la tabla siguiente se muestra algunos ejemplos de estas fórmulas para los tamaños de recurso compartido aprovisionado:
|Capacidad (GiB) | IOPS base | Ráfaga de e/s por segundo | Salida (MiB/s) | Entrada (MiB/s) |
|---------|---------|---------|---------|---------|
|100 | 100 | Hasta 300 | 66 | 44 |
|500 | 500 | 1.500 | 90 | 60 |
|1024 | 1024 | Hasta 3.072 | 122 | 81 |
|5120 | 5120 | Hasta 15.360 | 368 | 245 |
|10,240 | 10,240 | Hasta 30.720 | 675 | 450 |
|33,792 | 33,792 | Hasta 100.000 | 2,088 | 1,392 |
|51,200 | 51,200 | Hasta 100.000 | 3132 | 2,088 |
|102,400 | 100 000 | Hasta 100.000 | 6,204 | 4,136 |
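The provisioning formulas above can be checked against the table rows with a short Python sketch (an illustration for this article only; it assumes fractional MiB/s values round up, which matches the published rows, and applies the 100,000-IOPS cap to both baseline and burst):

```python
import math

MAX_IOPS = 100_000

def provisioned_limits(gib):
    """Derive per-share limits from provisioned GiB using the premium-tier formulas."""
    baseline = min(gib, MAX_IOPS)            # 1 IOPS per provisioned GiB
    burst = min(3 * baseline, MAX_IOPS)      # burst up to 3x baseline
    egress = math.ceil(60 + 0.06 * gib)      # MiB/s
    ingress = math.ceil(40 + 0.04 * gib)     # MiB/s
    return baseline, burst, egress, ingress

# Spot-check a few rows of the table above:
assert provisioned_limits(100) == (100, 300, 66, 44)
assert provisioned_limits(1024) == (1024, 3072, 122, 81)
assert provisioned_limits(5120) == (5120, 15360, 368, 245)
assert provisioned_limits(102_400) == (100_000, 100_000, 6204, 4136)
print("all table rows reproduced")
```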
### <a name="bursting"></a>Creación de ráfagas
Recursos compartidos de archivos Premium pueden ampliar sus IOPS hasta un factor de tres. Ampliación es automatizada y opera basándose en un sistema de crédito. Ampliación funciona en base al mejor esfuerzo y el límite de ráfaga no es una garantía, pueden aumentar los recursos compartidos de archivos *hasta* el límite.
Créditos se acumulan en un depósito de ráfaga, siempre que el tráfico para el recurso compartido de archivos está por debajo de la línea de base de e/s por segundo. Por ejemplo, un recurso compartido de GiB 100 tiene previsto 100 IOPS. Si tráfico real en el recurso compartido estaba 40 IOPS para un intervalo específico de 1 segundo, el número de 60 IOPS sin usar se abona a un depósito de ráfaga. Estos créditos, a continuación, se usará más adelante cuando las operaciones, se superará la línea de base de e/s por segundo.
> [!TIP]
> Tamaño del depósito ráfaga = línea de base de e/s por segundo * 3600 * 2.
Cada vez que un recurso compartido supera la línea de base de e/s por segundo y tiene créditos en un depósito de ráfaga, emite ráfagas. Pueden seguir los recursos compartidos de ráfaga siempre quedan créditos, aunque los recursos compartidos de menores que 50 TiB sólo permanecerán en el límite de ráfagas de hasta una hora. Recursos compartidos de mayores que 50 TiB técnicamente puede superar este límite de una hora, de dos horas, pero esto se basa en el número de créditos de ráfaga acumulados. Cada E/S más allá de la línea de base de e/s por segundo consume un crédito y una vez que se consumen todos los créditos devolvería el recurso compartido a la línea de base de e/s por segundo.
Los créditos de recurso compartido tienen tres estados:
- Acumulado, cuando el recurso compartido de archivos usa menos de la línea de base de e/s por segundo.
- Disminuye cuando el recurso compartido de archivos es ampliación.
- Restante constante, cuando hay ninguna créditos o la línea de base de e/s por segundo están en uso.
Inicio de los recursos compartidos de archivo nuevo con el número total de créditos en el depósito de ráfaga. Si el recurso compartido de IOPS cae por debajo de la línea de base de e/s por segundo debido a la limitación por el servidor, no se acumularán créditos de ráfaga.
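As a toy model of this credit accounting (an illustrative sketch only — the real service is best-effort and its exact algorithm is not published), consider:

```python
def simulate_burst(baseline, demand_trace):
    """Track burst credits second by second: accrue below baseline, spend above."""
    bucket = baseline * 3600 * 2          # bucket size, per the tip above
    credits = bucket                      # new shares start with a full bucket
    served = []
    for demand in demand_trace:           # demand = IOPS requested in that second
        if demand <= baseline:
            # below baseline: unused IOPS accrue, clamped at the bucket size
            credits = min(bucket, credits + (baseline - demand))
            served.append(demand)
        else:
            # above baseline: spend credits, but never exceed 3x baseline
            extra = min(demand - baseline, credits, 2 * baseline)
            credits -= extra
            served.append(baseline + extra)
    return served, credits

served, credits = simulate_burst(100, [40, 40, 250, 400])
assert served == [40, 40, 250, 300]   # the 400-IOPS second is capped at 3x baseline
assert credits == 100 * 3600 * 2 - 150 - 200
```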
## <a name="file-share-redundancy"></a>Redundancia del recurso compartido de archivos
Recursos compartidos estándares de Azure Files admite tres opciones de redundancia de datos: almacenamiento con redundancia local (LRS), almacenamiento con redundancia de zona (ZRS) y almacenamiento con redundancia geográfica (GRS).
Azure premium de archivos, recursos compartidos solo admite almacenamiento con redundancia (LRS).
En las siguientes secciones se describen las diferencias entre las diferentes opciones de redundancia:
### <a name="locally-redundant-storage"></a>Almacenamiento con redundancia local
[!INCLUDE [storage-common-redundancy-LRS](../../../includes/storage-common-redundancy-LRS.md)]
### <a name="zone-redundant-storage"></a>Almacenamiento con redundancia de zona
[!INCLUDE [storage-common-redundancy-ZRS](../../../includes/storage-common-redundancy-ZRS.md)]
### <a name="geo-redundant-storage"></a>Almacenamiento con redundancia geográfica
> [!Warning]
> Si usa el recurso compartido de archivos de Azure como punto de conexión en la nube en una cuenta de almacenamiento GRS, no debe iniciar la conmutación por error de la cuenta de almacenamiento. Si lo hace, la sincronización dejará de funcionar y también podría provocar una pérdida inesperada de datos en el caso de archivos recién organizados en capas. En caso de pérdida de una región de Azure, Microsoft activará la conmutación por error de la cuenta de almacenamiento de forma que sea compatible con Azure File Sync.
El almacenamiento con redundancia geográfica(GRS) está diseñado para proporcionar al menos el 99.99999999999999 % (dieciséis nueves) de durabilidad de objetos a lo largo de un año. Para ello, replica los datos a una región secundaria que se encuentra a cientos de kilómetros de la región primaria. Si la cuenta de almacenamiento tiene habilitado GRS, sus datos se mantienen incluso ante un apagón regional completo o un desastre del cual la región principal no se puede recuperar.
Si opta por el almacenamiento con redundancia geográfica de acceso de lectura (RA-GRS), se debe saber que Azure Files no admite almacenamiento con redundancia geográfica de acceso de lectura (RA-GRS) en cualquier región en este momento. Recursos compartidos de archivos en la cuenta de almacenamiento de RA-GRS funcionan como lo harían en las cuentas GRS y son los precios GRS cargados.
GRS replica los datos en otro centro de datos en una región secundaria, pero esos datos están disponibles para ser de solo lectura si Microsoft inicia una conmutación por error desde la región primaria a la región secundaria.
Para una cuenta de almacenamiento con GRS habilitado, todos los datos se replican primero con almacenamiento con redundancia local (LRS). Una actualización se confirma primero en la ubicación principal y se replican mediante LRS. La actualización luego se replica de manera asincrónica en la región secundaria con GRS. Cuando los datos se escriben en la ubicación secundaria, también se replican dentro de esa ubicación con LRS.
Las regiones primarias y secundarias administran las réplicas entre dominios de error y de actualización diferentes dentro de una unidad de escalado de almacenamiento. La unidad de escalado de almacenamiento es la unidad de replicación básica dentro del centro de datos. La replicación en este nivel se realiza por LRS; Para obtener más información, consulte [almacenamiento con redundancia local (LRS): redundancia de datos de bajo costo para Azure Storage](../common/storage-redundancy-lrs.md).
Tenga en cuenta estos puntos cuando decida qué opción de replicación usar:
* Zone-redundant storage (ZRS) offers high availability with synchronous replication and may be a better option than GRS for some scenarios. For more information about ZRS, see [ZRS](../common/storage-redundancy-zrs.md).
* Asynchronous replication implies a delay between the moment data is written to the primary region and when it is replicated to the secondary region. In the event of a regional disaster, changes that have not been replicated to the secondary region may be lost if that data cannot be recovered from the primary region.
* With GRS, the replica is not available for read or write access unless Microsoft initiates a failover to the secondary region. In the event of a failover, you will have read and write access to that data after the failover has completed. For more information, see the [disaster recovery guidance](../common/storage-disaster-recovery-guidance.md).
## <a name="data-growth-pattern"></a>Data growth pattern
Currently, the maximum size of an Azure file share is 5 TB (100 TB for premium file shares, which are in public preview). Because of this current limitation, you should take the expected growth of your data into account when deploying an Azure file share.
It is possible to sync multiple Azure file shares to a single Windows file server with Azure File Sync. This makes it possible to bring the large, pre-existing file shares you may have on-premises into Azure File Sync. For more information, see [Planning for an Azure File Sync deployment](storage-files-planning.md).
## <a name="data-transfer-method"></a>Data transfer method
There are many simple options for bulk-transferring data from an existing file resource, such as an on-premises file share, to Azure Files. Some popular ones include (non-exhaustive list):
* **Azure File Sync**: as part of a first sync between an Azure file share (a "cloud endpoint") and a Windows directory namespace (a "server endpoint"), Azure File Sync replicates all data from the existing file share to Azure Files.
* **[Azure Import/Export](../common/storage-import-export-service.md?toc=%2fazure%2fstorage%2ffiles%2ftoc.json)**: the Azure Import/Export service lets you securely transfer large amounts of data to an Azure file share by shipping hard disk drives to an Azure datacenter.
* **[Robocopy](https://technet.microsoft.com/library/cc733145.aspx)**: Robocopy is a well-known copy tool included with Windows and Windows Server. Robocopy can be used to transfer data to Azure Files by mounting the file share locally and then using the mounted location as the destination in the Robocopy command.
* **[AzCopy](../common/storage-use-azcopy-v10.md?toc=%2fazure%2fstorage%2ffiles%2ftoc.json)**: AzCopy is a command-line utility designed to copy data to and from Azure Files, as well as Azure Blob storage, using simple commands with optimal performance.
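As a rough sketch of the last two options (the storage account `mystorageaccount`, the share `myshare`, the local paths, and the flags shown are placeholders, not values from this article), the commands look approximately like this:

```shell
# Placeholder resource names; substitute your own storage account, share, and paths.

# Robocopy: mount the Azure file share as a drive letter, then mirror a local folder into it.
#   net use Z: \\mystorageaccount.file.core.windows.net\myshare /u:AZURE\mystorageaccount <storage-account-key>
#   robocopy C:\local\data Z:\ /MIR /Z /MT:16

# AzCopy: copy directly over HTTPS (no mount needed), authorizing with a SAS token.
#   azcopy copy "C:\local\data" "https://mystorageaccount.file.core.windows.net/myshare?<SAS-token>" --recursive
```

Exact flags vary by Robocopy and AzCopy version; check each tool's documentation before running against production data.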
| 116.43 | 1,083 | 0.7885 | spa_Latn | 0.994283 |
2f2bf1f55bb0db8d21574877d859b798b2b476ec | 707 | md | Markdown | README.md | skeetwu/ETF | 8b39d3ff103ae7e6ae7b1e791a02574ba43ef234 | [
"Apache-2.0"
] | 1 | 2021-05-01T06:52:01.000Z | 2021-05-01T06:52:01.000Z | README.md | skeetwu/ETF | 8b39d3ff103ae7e6ae7b1e791a02574ba43ef234 | [
"Apache-2.0"
] | null | null | null | README.md | skeetwu/ETF | 8b39d3ff103ae7e6ae7b1e791a02574ba43ef234 | [
"Apache-2.0"
] | null | null | null | # 爬取天天基金网上的ETF基金的详情
## Intro
给朋友帮忙,需要根据天天基金--场内交易基金净值折价率一览表把每个基金的前十名的股票持仓信息做到同一个表格中,用于后面的基金分析和挑选。
列表: http://fund.eastmoney.com/cnjy_jzzzl.html
## Start
```bash
# pip3 install -r requirements.txt -i http://mirrors.aliyun.com/pypi/simple/ --trusted-host mirrors.aliyun.com
# python etf_app.py
```
访问页面 http://127.0.0.1:5000/etf
这个地方没做异步返回,可能会返回很慢,后台会打印处理进度。
全部数据在etf_numbers.data,数据比较多,已经按照ETF板块进行了分类。
有一个小数据文件etf_numbers.small.data, 用于测试,可以修改etf_app.py里的da_file参数配置。
显示出来的网页带一点CSS样式,直接全选拷贝里Excel表格就完活了。
excel/etf.xlsx是一个demo。
excel/result_20210129.xlsx是朋友分析的结果,应该根据这个表就可以在每个版块中选择一支最合适的购买ETF基金了。
## 注意
xpath可以使用Chrome的debug模式,选中值或者dom对象,然后右键copy。
copy出来以后,如果路径里面有tbody一定要删掉,lxml目测是无法解析tbody。
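The tbody cleanup described above can be automated with a tiny helper (a sketch; the function name is mine, not something that exists in this repo):

```python
def clean_devtools_xpath(xpath: str) -> str:
    """Strip the /tbody segments that Chrome DevTools inserts when copying an XPath.

    Browsers add an implicit <tbody> to tables when building the DOM, but lxml
    parses the raw HTML, where that element is usually absent -- so an XPath
    copied from DevTools often matches nothing until /tbody is removed.
    """
    return xpath.replace("/tbody", "")


if __name__ == "__main__":
    copied = "//*[@id='oTable']/tbody/tr[2]/td[3]"
    print(clean_devtools_xpath(copied))  # prints: //*[@id='oTable']/tr[2]/td[3]
```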
| 19.638889 | 110 | 0.811881 | yue_Hant | 0.82614 |
2f2cd3ab263d749c0717973952736ab8f321b31e | 1,164 | md | Markdown | _pages/about.md | Reading-Group-McGill/reading-group-mcgill.github.io | 5a1c2bce487a6cdb5af1158fb76638699afb4de7 | [
"MIT"
] | null | null | null | _pages/about.md | Reading-Group-McGill/reading-group-mcgill.github.io | 5a1c2bce487a6cdb5af1158fb76638699afb4de7 | [
"MIT"
] | null | null | null | _pages/about.md | Reading-Group-McGill/reading-group-mcgill.github.io | 5a1c2bce487a6cdb5af1158fb76638699afb4de7 | [
"MIT"
] | null | null | null | ---
layout: about
permalink: /
title: Reading Group --- Topics in OT
description: <a href="https://www.mcgill.ca/mathstat/">Department of Mathematics and Statistics, McGill University</a>
news: true
social: false
---
This is the webpage for the reading group on "Topics in Optimal Transport", organized by <a href="https://apooladian.github.io"> Aram-Alexandre Pooladian </a> at McGill University.
The idea is simple: for however long we decide to do this, the members of the reading group will be taking turns presenting papers or chapters from the literature in areas such as machine learning, high-dimensional statistics, and numerical analysis. In an ideal setting, everyone will have at least skimmed the paper with the main presenter knowing the finer details. This is a beginner-friendly reading group, but this would be pointless if there were not any technical details.
As this is a reading group, there will be no "expert speakers" or recordings of the presentations. If you are interested in participating, please email readinggroupot[at]gmail[dot]com to be added to the listserv.
Tentatively, the meetings will occur Mondays at 2pm EST via Zoom.
| 64.666667 | 480 | 0.786082 | eng_Latn | 0.998351 |
2f2d9705366114f8c919291ebddc52701c6a20db | 5,292 | md | Markdown | docs/ClientsApi.md | meltwater/meltwater-java | 91eaff5219bba4de84215630c1a6c55d0004ad90 | [
"Apache-2.0"
] | null | null | null | docs/ClientsApi.md | meltwater/meltwater-java | 91eaff5219bba4de84215630c1a6c55d0004ad90 | [
"Apache-2.0"
] | 2 | 2019-09-09T14:40:20.000Z | 2021-02-09T15:48:35.000Z | docs/ClientsApi.md | meltwater/meltwater-java | 91eaff5219bba4de84215630c1a6c55d0004ad90 | [
"Apache-2.0"
] | 1 | 2017-04-12T00:27:02.000Z | 2017-04-12T00:27:02.000Z | # ClientsApi
All URIs are relative to *https://api.meltwater.com*
Method | HTTP request | Description
------------- | ------------- | -------------
[**createClientCredentials**](ClientsApi.md#createClientCredentials) | **POST** /v2/clients | Register new client
[**deleteClientCredentials**](ClientsApi.md#deleteClientCredentials) | **DELETE** /v2/clients/{client_id} | Delete client.
<a name="createClientCredentials"></a>
# **createClientCredentials**
> ClientCredentials createClientCredentials(userKey, authorization)
Register new client
Register new client. Creates a new pair of client credentials (a `client_id`/`client_secret` pair). Requires your Meltwater credentials (`email`:`password`) to authenticate. #### Appendix The Base64-encoded `email`:`password` string can be generated in a terminal with the following command: $ echo -n "your_email@your_domain.com:your_secret_password" | base64 <i>You will need `base64` installed.</i>
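As a quick local check of the appendix command (using a throwaway `a:b` pair instead of real credentials; `printf` avoids shells where `echo -n` misbehaves):

```shell
# Base64-encode a dummy email:password pair for the "Authorization: Basic ..." header.
printf '%s' "a:b" | base64   # prints: YTpi
```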
### Example
```java
// Import classes:
//import io.swagger.client.ApiException;
//import io.swagger.client.api.ClientsApi;
ClientsApi apiInstance = new ClientsApi();
String userKey = "userKey_example"; // String | The `user_key` from [developer.meltwater.com](https://developer.meltwater.com/admin/applications/).
String authorization = "authorization_example"; // String | `email`:`password` Basic Auth (RFC2617) credentials. Must contain the realm `Basic` followed by a Base64-encoded `email`:`password` pair using your Meltwater credentials. #### Example: Basic bXlfZW1haWxAZXhhbXJzZWNyZXQ=
try {
ClientCredentials result = apiInstance.createClientCredentials(userKey, authorization);
System.out.println(result);
} catch (ApiException e) {
System.err.println("Exception when calling ClientsApi#createClientCredentials");
e.printStackTrace();
}
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**userKey** | **String**| The `user_key` from [developer.meltwater.com](https://developer.meltwater.com/admin/applications/). |
**authorization** | **String**| `email`:`password` Basic Auth (RFC2617) credentials. Must contain the realm `Basic` followed by a Base64-encoded `email`:`password` pair using your Meltwater credentials. #### Example: Basic bXlfZW1haWxAZXhhbXJzZWNyZXQ= |
### Return type
[**ClientCredentials**](ClientCredentials.md)
### Authorization
No authorization required
### HTTP request headers
- **Content-Type**: application/json
- **Accept**: application/json
<a name="deleteClientCredentials"></a>
# **deleteClientCredentials**
> deleteClientCredentials(userKey, authorization, clientId)
Delete client.
Delete client. Deletes your current client credentials, consisting of `client_id` and `client_secret`. After calling this resource, you will not be able to use the Meltwater API unless you create a new set of client credentials! Requires your Meltwater credentials (`email`:`password`) to authenticate. #### Appendix The Base64-encoded `email`:`password` string can be generated in a terminal with the following command: $ echo -n "your_email@your_domain.com:your_secret_password" | base64 <i>You will need `base64` installed.</i>
### Example
```java
// Import classes:
//import io.swagger.client.ApiException;
//import io.swagger.client.api.ClientsApi;
ClientsApi apiInstance = new ClientsApi();
String userKey = "userKey_example"; // String | The `user_key` from [developer.meltwater.com](https://developer.meltwater.com/admin/applications/).
String authorization = "authorization_example"; // String | `email`:`password` Basic Auth (RFC2617) credentials. Must contain the realm `Basic` followed by a Base64-encoded `email`:`password` pair using your Meltwater credentials. #### Example: Basic bXlfZW1haWxAZXhhbXJzZWNyZXQ=
String clientId = "clientId_example"; // String | Client ID
try {
apiInstance.deleteClientCredentials(userKey, authorization, clientId);
} catch (ApiException e) {
System.err.println("Exception when calling ClientsApi#deleteClientCredentials");
e.printStackTrace();
}
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**userKey** | **String**| The `user_key` from [developer.meltwater.com](https://developer.meltwater.com/admin/applications/). |
**authorization** | **String**| `email`:`password` Basic Auth (RFC2617) credentials. Must contain the realm `Basic` followed by a Base64-encoded `email`:`password` pair using your Meltwater credentials. #### Example: Basic bXlfZW1haWxAZXhhbXJzZWNyZXQ= |
**clientId** | **String**| Client ID |
### Return type
null (empty response body)
### Authorization
No authorization required
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: application/json
| 49.924528 | 689 | 0.693689 | eng_Latn | 0.649334 |
2f2df08cc09145f7a7ac3e4ad1a999e2598c3cbb | 42 | md | Markdown | _posts/2021-07-15-Journey.md | Zoomeyl/github-pages-with-jekyll | 0eab8b84f27b29064523361ca69481d1cb24a1ac | [
"MIT"
] | null | null | null | _posts/2021-07-15-Journey.md | Zoomeyl/github-pages-with-jekyll | 0eab8b84f27b29064523361ca69481d1cb24a1ac | [
"MIT"
] | 4 | 2021-07-15T07:43:10.000Z | 2021-07-15T08:12:20.000Z | _posts/2021-07-15-Journey.md | Zoomeyl/github-pages-with-jekyll | 0eab8b84f27b29064523361ca69481d1cb24a1ac | [
"MIT"
] | null | null | null | ---
title: "Journey"
date: 2021-07-15
---
| 8.4 | 16 | 0.571429 | nld_Latn | 0.179341 |
2f2e3209e0723619cca8bc830eb789610fbc7551 | 139 | md | Markdown | france.code-civil/Livre Ier/Titre VII/Article 327.md | bradchesney79/illacceptanything | 4594ae4634fdb5e39263a6423dc255ed46c25208 | [
"MIT"
] | 2,986 | 2015-03-31T06:53:53.000Z | 2022-03-29T13:03:22.000Z | france.code-civil/Livre Ier/Titre VII/Article 327.md | bradchesney79/illacceptanything | 4594ae4634fdb5e39263a6423dc255ed46c25208 | [
"MIT"
] | 42 | 2015-03-31T08:46:31.000Z | 2020-11-01T11:28:43.000Z | france.code-civil/Livre Ier/Titre VII/Article 327.md | bradchesney79/illacceptanything | 4594ae4634fdb5e39263a6423dc255ed46c25208 | [
"MIT"
] | 243 | 2015-03-31T06:43:04.000Z | 2022-02-20T21:26:49.000Z | Article 327
----
La paternité hors mariage peut être judiciairement déclarée.
L'action en recherche de paternité est réservée à l'enfant.
| 23.166667 | 60 | 0.791367 | fra_Latn | 0.996428 |
2f2e3e53c697914a07e846e56105f6a3825c06a9 | 557 | md | Markdown | articles/libraries/lock/v11/migration-v8-v11.md | jimmyjames/docs-1 | edf8c00d22144abbf4588d17d9b15bed87532d17 | [
"MIT"
] | 336 | 2015-02-03T21:32:33.000Z | 2022-03-27T07:42:33.000Z | articles/libraries/lock/v11/migration-v8-v11.md | jimmyjames/docs-1 | edf8c00d22144abbf4588d17d9b15bed87532d17 | [
"MIT"
] | 3,640 | 2015-01-05T19:16:40.000Z | 2022-03-21T15:34:43.000Z | articles/libraries/lock/v11/migration-v8-v11.md | jimmyjames/docs-1 | edf8c00d22144abbf4588d17d9b15bed87532d17 | [
"MIT"
] | 2,052 | 2015-01-05T07:10:33.000Z | 2022-03-17T17:24:51.000Z | ---
section: libraries
title: Migrating from Lock v8 to v11
description: How to migrate from Lock v8 to v11
public: false
toc: true
topics:
- libraries
- lock
- migrations
contentType:
- how-to
useCase:
- add-login
- migrate
---
# Migrating from Lock v8 to v11
<dfn data-key="lock">Lock</dfn> v8 is [very similar](/libraries/lock/v9/migration-guide) to Lock v9 from an API standpoint.
You can follow the instructions on [how to migrate from Lock v9 to Lock v11](/libraries/lock/v11/migration-v9-v11), as they are also applicable to Lock v8.
| 24.217391 | 156 | 0.728905 | eng_Latn | 0.985454 |
2f2ee1468fe228e0c4b7a942321dc7c70300f8b2 | 17,159 | md | Markdown | BingAds/bingads-12/guides/ad-extensions.md | benm-brainlabs/BingAds-docs | b2824a37007166949efaaabcd3f53c93584c63f2 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-09-24T18:14:14.000Z | 2020-09-24T18:14:14.000Z | BingAds/bingads-12/guides/ad-extensions.md | benm-brainlabs/BingAds-docs | b2824a37007166949efaaabcd3f53c93584c63f2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | BingAds/bingads-12/guides/ad-extensions.md | benm-brainlabs/BingAds-docs | b2824a37007166949efaaabcd3f53c93584c63f2 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "Ad Extensions"
ms.service: "bing-ads"
ms.topic: "article"
author: "eric-urban"
ms.author: "eur"
description: Setup ad extensions with the Bing Ads API.
---
# Ad Extensions
Ad extensions are additional pieces of information about your business, like a phone number or a link to a specific page on your website, you can add to your ads. Ad extensions are free to add to your ads, with the usual charges for any clicks you get. Including ad extensions can improve the visibility of your ads, which can lead to more clicks and improve your ROI. There are many types of ad extensions available in Bing Ads: [App Extensions](#appextensions), [Call Extensions](#callextensions), [Callout Extensions](#calloutextensions), [Image Extensions](#imageextensions), [Location Extensions](#locationextensions), [Price Extensions](#priceextensions), [Review Extensions](#reviewextensions), [Sitelink Extensions](#sitelinkextensions), and [Structured Snippet Extensions](#structuredsnippetextensions). For more about ad extensions, see [Bing Ads Web Application Help - What are ad extensions?](http://help.bingads.microsoft.com/#apex/3/en/51001/1)
> [!TIP]
> Providing extension data allows our algorithms to evaluate all the possible layouts for your ad. It increases the chances of additional space being allocated, which can increase clicks for your ad.
Ad extensions are stored in a shared library at the account level. After adding the extension to your shared library, you must also explicitly associate it with the account or one or more campaigns or ad groups within the account for the extension to become eligible for delivery. For more details on associating ad extensions, see [Managing Ad Extensions with the Bulk Service](#bulkservice) and [Managing Ad Extensions with the Campaign Management Service](#campaignservice) in the sections below. For ad extension association limits per entity, please see [Entity Limits for Ad Extensions](entity-hierarchy-limits.md#adextensions).
> [!NOTE]
> Call ad extensions can only be associated at the campaign level.
>
> Location ad extensions can only be associated at the account and campaign level i.e., cannot be associated with ad groups.
Ad extensions that are associated at a lower level e.g., ad group will override ad extensions of the same type that are associated at a higher level e.g., campaign. For example if you have 2 callout extensions set for *Campaign A*, zero callout extensions associated with *Ad Group AA*, and one callout extension associated with *Ad Group AB*, then only *Ad Group AA* is eligible to have its ads decorated with callouts.
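The override behavior can be sketched as a small resolution function (an illustration only; the function and data shapes are mine, not part of the Bing Ads API):

```python
def effective_extensions(account_exts, campaign_exts, ad_group_exts):
    """Return the extensions (of a single type) that apply to an ad group.

    The lowest level that has any extensions of the type wins: ad group
    overrides campaign, which overrides account.
    """
    for level_exts in (ad_group_exts, campaign_exts, account_exts):
        if level_exts:
            return level_exts
    return []


# The example above: Campaign A has two callouts, Ad Group AA has none,
# Ad Group AB has one.  AA inherits the campaign's two callouts; AB's single
# callout overrides them (and with fewer than two, no callouts would serve).
campaign_a = ["Free shipping", "24/7 support"]
print(effective_extensions([], campaign_a, []))        # ['Free shipping', '24/7 support']
print(effective_extensions([], campaign_a, ["Sale"]))  # ['Sale']
```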
You can manage ad extensions with either the [Bulk Service](../bulk-service/bulk-service-reference.md) or [Campaign Management Service](../campaign-management-service/campaign-management-service-reference.md). You should use the [Bulk Service](../bulk-service/bulk-service-reference.md) if you need to upload or download a high volume of entity settings. For example you can update all ad extensions for your entire account in a single upload. In comparison, with the [Campaign Management Service](../campaign-management-service/campaign-management-service-reference.md) you can only update 100 ad extensions per call. For details see the following sections.
## <a name="adextensiontypes"></a>Ad Extension Types
Ad extension types include [App Extensions](#appextensions), [Call Extensions](#callextensions), [Callout Extensions](#calloutextensions), [Image Extensions](#imageextensions), [Location Extensions](#locationextensions), [Price Extensions](#priceextensions), [Review Extensions](#reviewextensions), [Sitelink Extensions](#sitelinkextensions), and [Structured Snippet Extensions](#structuredsnippetextensions).
### <a name="appextensions"></a>App Extensions
You can associate app ad extensions with your campaigns and ad groups, and your ads will include a link to install an application.

### <a name="callextensions"></a>Call Extensions
With Call Extensions, you can provide a phone number that is not associated with a particular location, but is appropriate for all locations where your ads display. In comparison, you typically use Location Extensions to provide an address and local phone number associated with a local location.
If the campaign is also associated with a [Location Extensions](#locationextensions), the call extension phone number will override the location extension phone number.

### <a name="calloutextensions"></a>Callout Extensions
With Callout Extensions, you can provide an extra snippet of text that highlights your business, products, or services to include in An ad. This extension is not clickable and can appear in addition to your ad's description. Providing additional details about your store can make your ad more relevant to potential customers.
Each account, campaign, or ad group can be associated with between 2 and 20 callout ad extensions. If you associate one or fewer callout extensions with your account, campaign, or ad group, no callout text will serve with your ad. An ad may include between 2 to 4 callouts per impression.

### <a name="imageextensions"></a>Image Extensions
You can associate image ad extensions with your campaigns and ad groups, and your ads may include an image or alternative text.

### <a name="locationextensions"></a>Location Extensions
When you enable Location Extensions, you can choose to show the address of your business location that is closest to the customer and also include a local phone number. Better yet, if the customer is viewing your ad on a smartphone, they can click that number to give you a call.
If the campaign is also associated with a [Call Extensions](#callextensions), the phone number in the call extension will override the location extension phone number.

### <a name="priceextensions"></a>Price Extensions
You can use Price Extensions to display your products or services with their corresponding prices to potential customers on mobile devices. Price Extensions only show on ads listed at the top of the results page, helping to increase your clicks. Keep in mind that though Price Extensions are free to add to your ad, they may not always show for every query.

### <a name="reviewextensions"></a>Review Extensions
Potential customers like to know about other customers' experiences when searching for products or services. Share positive reviews from a reputable third-party source about your business, products, or services in your ads with a Review Extension. An ad will only include one review per impression.

### <a name="sitelinkextensions"></a>Sitelink Extensions
Sitelink Extensions are additional links in your ads that take customers to specific pages on your website. This allows you to promote certain products, services, or sections of your website and take potential customers to exactly the information they were searching for. This can increase both click-through-rate and conversions.
You may associate site links ad extensions with your campaigns or ad groups, and your ads will include up to 10 links to relevant web pages within your website. When displaying an ad, Bing Ads determines which links are most relevant to the ad being displayed and includes those with your ad. You can influence which links are included. Links that you specify at the beginning of your list receive higher priority than those toward the end of your list.

### <a name="structuredsnippetextensions"></a>Structured Snippet Extensions
Structured Snippet Extensions give potential customers more context on a specific aspect of your products and services. A Structured snippet is made up of a header and a list of 3-10 values which correspond to the header. For example, you might use the header "Brands:" and the values "Windows, Xbox, Skype" to let customers know about what brands are available at your store.

This extension is not clickable and, similar to other extensions, will appear beneath your ad's description. Structured Snippets have no impact on the other extensions you're already using. Structured Snippets should not duplicate what is already stated in the ad. Our full list of Structured Snippet policies can be found [here](https://advertise.bingads.microsoft.com/resources/policies/ad-extensions-policies).
An ad will only include one structured snippet (one headline with 3 - 10 values) per impression. Keep in mind that your ads won't always show Structured Snippets and, when they do, the format in which they appear may vary. Structured Snippets are free to add to your ad, available in all Bing Ads markets (excluding Hong Kong and Taiwan), and serve on desktop and tablet devices.
## <a name="bulkservice"></a>Managing Ad Extensions with the Bulk Service
You can use the [Bulk Service](../bulk-service/bulk-service-reference.md) i.e., [Bulk Download and Upload](bulk-download-upload.md) to create, get, update, and delete both ad extensions and ad extension associations.
The following Bulk records are available for managing ad extensions and ad extension associations.
### App Extensions
- [App Ad Extension](../bulk-service/app-ad-extension.md)
- [Ad Group App Ad Extension](../bulk-service/ad-group-app-ad-extension.md)
- [Campaign App Ad Extension](../bulk-service/campaign-app-ad-extension.md)
### Call Ad Extensions
- [Call Ad Extension](../bulk-service/call-ad-extension.md)
- [Campaign Call Ad Extension](../bulk-service/campaign-call-ad-extension.md)
### Callout Ad Extensions
- [Callout Ad Extension](../bulk-service/callout-ad-extension.md)
- [Ad Group Callout Ad Extension](../bulk-service/ad-group-callout-ad-extension.md)
- [Campaign Callout Ad Extension](../bulk-service/campaign-callout-ad-extension.md)
### Image Ad Extensions
- [Image Ad Extension](../bulk-service/image-ad-extension.md)
- [Ad Group Image Ad Extension](../bulk-service/ad-group-image-ad-extension.md)
- [Campaign Image Ad Extension](../bulk-service/campaign-image-ad-extension.md)
### Location Ad Extensions
- [Location Ad Extension](../bulk-service/location-ad-extension.md)
- [Campaign Location Ad Extension](../bulk-service/campaign-location-ad-extension.md)
### Price Ad Extensions
- [Price Ad Extension](../bulk-service/price-ad-extension.md)
- [Ad Group Price Ad Extension](../bulk-service/ad-group-price-ad-extension.md)
- [Campaign Price Ad Extension](../bulk-service/campaign-price-ad-extension.md)
### Review Ad Extensions
- [Review Ad Extension](../bulk-service/review-ad-extension.md)
- [Ad Group Review Ad Extension](../bulk-service/ad-group-review-ad-extension.md)
- [Campaign Review Ad Extension](../bulk-service/campaign-review-ad-extension.md)
### Sitelink Ad Extensions
- [Sitelink Ad Extension](../bulk-service/sitelink-ad-extension.md)
- [Ad Group Sitelink Ad Extension](../bulk-service/ad-group-sitelink-ad-extension.md)
- [Campaign Sitelink Ad Extension](../bulk-service/campaign-sitelink-ad-extension.md)
### Structured Snippet Ad Extensions
- [Structured Snippet Ad Extension](../bulk-service/structured-snippet-ad-extension.md)
- [Ad Group Structured Snippet Ad Extension](../bulk-service/ad-group-structured-snippet-ad-extension.md)
- [Campaign Structured Snippet Ad Extension](../bulk-service/campaign-structured-snippet-ad-extension.md)
For code examples that show how to set up ad extensions using the Bulk service, see [Bulk Ad Extensions Code Example](code-example-bulk-ad-extensions.md).
## <a name="campaignservice"></a>Managing Ad Extensions with the Campaign Management Service
You can use the [Campaign Management Service](../campaign-management-service/campaign-management-service-reference.md) to create, get, update, and delete both ad extensions and ad extension associations.
For code examples that show how to set up ad extensions using the Campaign Management service, see [Ad Extensions Code Example](code-example-ad-extensions.md).
### Entities
These are the ad extension entities that can be accessed using the [Campaign Management Service](../campaign-management-service/campaign-management-service-reference.md). You can create, read, update, and delete these entities.
- [AppAdExtension](../campaign-management-service/appadextension.md)
- [CallAdExtension](../campaign-management-service/calladextension.md)
- [CalloutAdExtension](../campaign-management-service/calloutadextension.md)
- [ImageAdExtension](../campaign-management-service/imageadextension.md)
- [LocationAdExtension](../campaign-management-service/locationadextension.md)
- [PriceAdExtension](../campaign-management-service/priceadextension.md)
- [ReviewAdExtension](../campaign-management-service/reviewadextension.md)
- [SitelinkAdExtension](../campaign-management-service/sitelinkadextension.md)
- [StructuredSnippetAdExtension](../campaign-management-service/structuredsnippetadextension.md)
> [!NOTE]
> The [AdExtension](../campaign-management-service/adextension.md) object is the base class from which all ad extensions are derived.
### Service Operations
These are the [Campaign Management Service](../campaign-management-service/campaign-management-service-reference.md) service operations that can be used to add, get, update, and delete ad extensions.
- [AddAdExtensions](../campaign-management-service/addadextensions.md)
- [SetAdExtensionsAssociations](../campaign-management-service/setadextensionsassociations.md)
- [GetAdExtensionsByIds](../campaign-management-service/getadextensionsbyids.md)
- [GetAdExtensionIdsByAccountId](../campaign-management-service/getadextensionidsbyaccountid.md)
- [GetAdExtensionsAssociations](../campaign-management-service/getadextensionsassociations.md)
- [UpdateAdExtensions](../campaign-management-service/updateadextensions.md)
- [DeleteAdExtensions](../campaign-management-service/deleteadextensions.md)
- [DeleteAdExtensionsAssociations](../campaign-management-service/deleteadextensionsassociations.md)
> [!NOTE]
> Partial update is not supported for ad extensions. Any optional elements which are not sent with the [UpdateAdExtensions](../campaign-management-service/updateadextensions.md) request will in effect be deleted from the extension.
>
> Partial success is not supported when adding, updating, and deleting ad extensions. For example if you submit 10 ad extensions and 2 fail, the entire batch will fail.
>
> Partial success is supported for [GetAdExtensionsAssociations](../campaign-management-service/getadextensionsassociations.md) and [SetAdExtensionsAssociations](../campaign-management-service/setadextensionsassociations.md). For example if you submit 10 ad extension associations and 2 fail, the remaining 8 will succeed. For more information, see [Partial Success using the Campaign Management Service](handle-service-errors-exceptions.md#partial-success).
## <a name="editorial"></a>Editorial Review
When you associate an ad extension with a campaign or ad group, the extension goes through an initial editorial review. For more information, see [Ad Extension Editorial Review](editorial-review-appeals.md#adextensioneditorialreview).
## <a name="reporting"></a>Reporting
You can use the following reports to get statistics about the effectiveness of the ad extensions that you've included in your ads.
- [AdExtensionByAdReportRequest](../reporting-service/adextensionbyadreportrequest.md) - Aggregates performance data by ad for a specified time period. By including performance details, such as clicks, conversion, and spend, you can identify ad extensions that are performing well, and those that may need to be adjusted to optimize the monthly budget.
- [AdExtensionByKeywordReportRequest](../reporting-service/adextensionbykeywordreportrequest.md) - Aggregates performance data by keyword for a specified time period. By including performance details, such as clicks, conversion, and spend, you can identify ad extensions that are performing well, and those that may need to be adjusted to optimize the monthly budget.
- [AdExtensionDetailReportRequest](../reporting-service/adextensiondetailreportrequest.md) - Lists all versions of an ad extension by account. You can use this information along with the performance data from the other two reports to determine which version performed better.
For more information about reporting, see [Reports](reports.md) and [Request and Download a Report](request-download-report.md).
## See Also
[Bing Ads Web Service Addresses](web-service-addresses.md)
| 87.994872 | 958 | 0.793578 | eng_Latn | 0.978485 |
2f2f8036ce220d69c9107096568ea25de322fec1 | 3,715 | md | Markdown | README.md | AODocs/endpoints-framework-maven-plugin | 50761519dfe9283589c8554605d87459363d23ee | [
"Apache-2.0"
] | null | null | null | README.md | AODocs/endpoints-framework-maven-plugin | 50761519dfe9283589c8554605d87459363d23ee | [
"Apache-2.0"
] | null | null | null | README.md | AODocs/endpoints-framework-maven-plugin | 50761519dfe9283589c8554605d87459363d23ee | [
"Apache-2.0"
] | 1 | 2020-03-19T15:42:51.000Z | 2020-03-19T15:42:51.000Z | [](https://travis-ci.org/AODocs/endpoints-framework-maven-plugin)
# Endpoints Framework Maven plugin
This Maven plugin provides goals and configurations to build Endpoints Framework projects.
# Requirements
Maven is required to build the plugin. To download Maven, follow the [instructions](http://maven.apache.org/).
The remaining dependencies are specified in the pom.xml file and should be automatically downloaded when the plugin is built.
# How to use
In your Maven App Engine Java app, add the following plugin to your pom.xml:
```XML
<plugin>
<groupId>com.aodocs.endpoints</groupId>
<artifactId>endpoints-framework-maven-plugin</artifactId>
<version>2.5.0</version>
</plugin>
```
All goals are prefixed with `endpoints-framework`
## Server
The plugin exposes the following server side goals
* `clientLibs` - generate client libraries
* `clientSrc` - generate client code locally. Requires `python 2.7` and the python package `google-apis-client-generator`
* `discoveryDocs` - generate discovery docs
* `openApiDocs` - generate Open API docs
The plugin exposes the following parameters for configuring server side goals
* `discoveryDocDir` - The output directory of discovery documents
* `clientLibDir` - The output directory of client libraries
* `generatedSrcDir` - The output directory of generated endpoints source
* `openApiDocDir` - The output directory of Open API documents
* `serviceClasses` - List of service classes (optional), this can be inferred from web.xml
* `webappDir` - Location of webapp directory
* `hostname` - To set the hostname of the root url for Open API docs, discovery docs, and client libs (ex: `hostname = myapp.appspot.com` will result in a default root url of `https://myapp.appspot.com/_ah/api`)
* `basePath` - To set the base path of the root url for Open API docs, discovery docs and client libs (ex: `basePath = /_ah/api` will result in a default root url of `https://myapp.appspot.com/_ah/api`)
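For example, these server-side parameters can be set in the plugin's `<configuration>` block (the values below are placeholders, not defaults):

```XML
<plugin>
  <groupId>com.aodocs.endpoints</groupId>
  <artifactId>endpoints-framework-maven-plugin</artifactId>
  <version>2.5.0</version>
  <configuration>
    <!-- Placeholder values; adjust to your app. -->
    <hostname>myapp.appspot.com</hostname>
    <basePath>/_ah/api</basePath>
    <discoveryDocDir>${project.build.directory}/discovery-docs</discoveryDocDir>
    <openApiDocDir>${project.build.directory}/openapi-docs</openApiDocDir>
  </configuration>
</plugin>
```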
#### Usage
Make sure your web.xml is [configured to expose your endpoints](https://cloud.google.com/endpoints/docs/frameworks/java/required_files) correctly.
No configuration parameters are required to run with the default values:
```shell
mvn compile endpoints-framework:clientLibs
mvn compile endpoints-framework:discoveryDocs
```
## Client
The plugin exposes the following client-side goals:
* `generateSrc`
The plugin exposes the following parameters for the client-side goals:
* `generatedSrcDir` - The output directory of generated endpoints source
* `discoveryDocs` - List of discovery docs to generate source from
#### Usage
Clients consuming endpoints using the client plugin need to configure the location
of the source discovery documents and, for best results, bind the `generateSrc` goal
to the default `generate-sources` phase.
```XML
<plugin>
    <groupId>com.aodocs.endpoints</groupId>
<artifactId>endpoints-framework-maven-plugin</artifactId>
...
<configuration>
<discoveryDocs>
<discoveryDoc>src/endpoints/myApi-v1-rest.discovery</discoveryDoc>
</discoveryDocs>
</configuration>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>generateSrc</goal>
</goals>
</execution>
</executions>
</plugin>
```
Users will also need to include the Google API Client library for their generated
source to compile:
```XML
<dependency>
<groupId>com.google.api-client</groupId>
<artifactId>google-api-client</artifactId>
<version>xx.yy.zz</version>
</dependency>
```
Running `mvn compile` should automatically include the generated sources from the discovery documents:
```shell
mvn compile
```
---
layout: post
date: 2017-09-21
title: "House of Wu - Panoply Panoply Style 14613 - Panoply Short Sleeves Sweep/Brush Train Aline/Princess"
category: House of Wu - Panoply
tags: [House of Wu - Panoply,House of Wu,Aline/Princess ,Sweetheart,Sweep/Brush Train,Short Sleeves]
---
### House of Wu - Panoply Panoply Style 14613 - Panoply
Just **$379.99**
### Short Sleeves Sweep/Brush Train Aline/Princess
<table><tr><td>BRANDS</td><td>House of Wu</td></tr><tr><td>Silhouette</td><td>Aline/Princess </td></tr><tr><td>Neckline</td><td>Sweetheart</td></tr><tr><td>Hemline/Train</td><td>Sweep/Brush Train</td></tr><tr><td>Sleeve</td><td>Short Sleeves</td></tr></table>
<a href="https://www.readybrides.com/en/house-of-wu-panoply/18879-house-of-wu-panoply-style-14613.html"><img src="//img.readybrides.com/42705/house-of-wu-panoply-style-14613.jpg" alt="Panoply Style 14613 - Panoply" style="width:100%;" /></a>
<!-- break --><a href="https://www.readybrides.com/en/house-of-wu-panoply/18879-house-of-wu-panoply-style-14613.html"><img src="//img.readybrides.com/42706/house-of-wu-panoply-style-14613.jpg" alt="Panoply Style 14613 - Panoply" style="width:100%;" /></a>
<a href="https://www.readybrides.com/en/house-of-wu-panoply/18879-house-of-wu-panoply-style-14613.html"><img src="//img.readybrides.com/42707/house-of-wu-panoply-style-14613.jpg" alt="Panoply Style 14613 - Panoply" style="width:100%;" /></a>
<a href="https://www.readybrides.com/en/house-of-wu-panoply/18879-house-of-wu-panoply-style-14613.html"><img src="//img.readybrides.com/42704/house-of-wu-panoply-style-14613.jpg" alt="Panoply Style 14613 - Panoply" style="width:100%;" /></a>
Buy it: [https://www.readybrides.com/en/house-of-wu-panoply/18879-house-of-wu-panoply-style-14613.html](https://www.readybrides.com/en/house-of-wu-panoply/18879-house-of-wu-panoply-style-14613.html)
# Work Item Query Language Editor
#### Structure
```
buildTable - builds parse table for query language
extension - code packaged into the extension
```
## Code pens
* https://codepen.io/
* https://jsfiddle.net/
* http://jsbin.com/?html,output
* http://pastebin.com
# CSS - Cascading Style Sheets
* https://developer.mozilla.org/en-US/docs/Web/CSS
* [How CSS Works](https://developer.mozilla.org/en-US/docs/Learn/CSS/Introduction_to_CSS/How_CSS_works)
* [CSS Syntax](https://developer.mozilla.org/en-US/docs/Learn/CSS/Introduction_to_CSS/Syntax)
* https://learn.freecodecamp.org/responsive-web-design/basic-css
* https://www.w3schools.com/Css/
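For reference while working through the links above, a CSS rule pairs a selector with a block of declarations:

```css
/* selector { property: value; } */
h1 {
  color: darkblue;
  text-align: center;
}
```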
### Homework (Due Monday 20.05.2019)
Build a simple index.html page with a tribute to your favorite person/being/animal/place etc.
**Your Tribute Page should be valid HTML5 and contain at least the following:**
* Title in head section
* Some metadata in head section
* body with
  * header section
  * main section
    1. 1st level heading
    2. at least 3 paragraphs
    3. at least 2 images
    4. at least one unordered list
    5. at least one ordered list
    6. at least one link to an outside source
    7. at least one internal link
  * footer section
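A minimal skeleton that covers the checklist (all names, paths, and text are placeholders):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <meta name="description" content="Tribute page homework">
  <title>My Tribute Page</title>
</head>
<body>
  <header>Tribute Page</header>
  <main>
    <h1 id="top">A Tribute to ...</h1>
    <p>First paragraph ...</p>
    <p>Second paragraph ...</p>
    <p>Third paragraph ...</p>
    <img src="photo1.jpg" alt="First photo">
    <img src="photo2.jpg" alt="Second photo">
    <ul><li>An unordered item</li></ul>
    <ol><li>An ordered item</li></ol>
    <a href="https://en.wikipedia.org/">An outside source</a>
    <a href="#top">Back to top</a>
  </main>
  <footer>Made for the homework assignment</footer>
</body>
</html>
```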
#### Validate your .html page with the official W3 validator:
https://validator.w3.org/#validate_by_upload+with_options
Create a new project for your tribute page on GitHub and host the page there.
---
Description: Initializes a string table.
ms.assetid: 184df85a-6d59-42c5-8ec1-f0c046091645
title: pSetupStringTableInitializeEx function
ms.topic: article
ms.date: 05/31/2018
topic_type:
- APIRef
- kbSyntax
api_name:
- pSetupStringTableInitializeEx
api_type:
- DllExport
api_location:
- Setupapi.dll
---
# pSetupStringTableInitializeEx function
\[This function is not available in Windows Vista or Windows Server 2008.\]
Initializes a string table.
## Syntax
```C++
PVOID pSetupStringTableInitializeEx(
_In_ UINT ExtraDataSize,
_In_ UINT Reserved
);
```
## Parameters
<dl> <dt>
*ExtraDataSize* \[in\]
</dt> <dd>
Size of extra data object.
</dd> <dt>
*Reserved* \[in\]
</dt> <dd>
Reserved.
</dd> </dl>
## Remarks
This function has no associated import library or header file; you must call it using the [**LoadLibrary**](https://msdn.microsoft.com/en-us/library/ms684175(v=VS.85).aspx) and [**GetProcAddress**](https://msdn.microsoft.com/en-us/library/ms683212(v=VS.85).aspx) functions.
## Requirements
| | |
|----------------|-----------------------------------------------------------------------------------------|
| DLL<br/> | <dl> <dt>Setupapi.dll</dt> </dl> |
# Unarchiver
A simple Gzip implementation.
This recipe shows how to enable vectorization to speed up a simple executable
using the [Eigen C++ library](http://eigen.tuxfamily.org) for linear algebra.
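A sketch of what such a `CMakeLists.txt` could look like (the project, target, and source-file names are assumptions, not taken from the recipe itself):

```cmake
cmake_minimum_required(VERSION 3.5)
project(recipe-06 LANGUAGES CXX)

# Eigen is header-only; an installed Eigen3 exports an imported target.
find_package(Eigen3 3.3 REQUIRED CONFIG)

add_executable(linear-algebra linear-algebra.cpp)
target_link_libraries(linear-algebra PRIVATE Eigen3::Eigen)

# Opt into the host CPU's vector instructions, if the compiler accepts the flag.
include(CheckCXXCompilerFlag)
check_cxx_compiler_flag("-march=native" _march_native_works)
if(_march_native_works)
  target_compile_options(linear-algebra PRIVATE -march=native)
endif()
```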
# [GitMax](http://www.gitmax.cn)
An app to help GitHub users gain reputation.
### Contents
It helps you to add friends according to your minimum requirement of:
1. The number of followers he/she has.
2. The number of stars he/she has obtained.
3. Todo: Adding support for social apps so that you can add your social app friends on GitHub.

### Architecture
The app is architected with cost-efficiency and scalability in mind.
It is completely serverless, using AWS Lambda functions. It is also hosted in AWS S3 with a CloudFront CDN for high
accessibility across the world.
Frontend:
* [React](https://facebook.github.io/react/)
* [ant.d framework](https://ant.design/)
Backend:
* [AWS Lambda](https://aws.amazon.com/lambda/)
* [Serverless framework](https://serverless.com/)
* [DynamoDB](https://aws.amazon.com/dynamodb/?nc2=h_m1)
---
title: Conversions from floating-point types | Microsoft Docs
ms.custom: ''
ms.date: 01/29/2018
ms.technology:
- cpp-language
ms.topic: language-reference
dev_langs:
- C++
helpviewer_keywords:
- converting floating point
- floating-point conversion
ms.assetid: 96804c8e-fa3b-4742-9006-0082ed9e57f2
author: mikeblome
ms.author: mblome
ms.workload:
- cplusplus
ms.openlocfilehash: eefbbde88704ffd53f8bcf1445186bb7e6cdd6af
ms.sourcegitcommit: 913c3bf23937b64b90ac05181fdff3df947d9f1c
ms.translationtype: HT
ms.contentlocale: pt-BR
ms.lasthandoff: 09/18/2018
ms.locfileid: "46017621"
---
# <a name="conversions-from-floating-point-types"></a>Conversions from floating-point types
A **float** value converted to **double** or **long double**, or a **double** value converted to **long double**, undergoes no change in value. A **double** value converted to a **float** value is represented exactly, if possible. Precision may be lost if the value cannot be represented exactly. If the result is out of range, the behavior is undefined. See [Limits on floating-point constants](../c-language/limits-on-floating-point-constants.md) for the range of floating-point types.
A floating value is converted to an integral value by first converting it to **long**, and then converting the **long** value to the specific integral type. The fractional part of the floating value is discarded in the conversion to **long**. If the result is still too large to fit in a **long**, the result of the conversion is undefined.
**Microsoft Specific**
When converting a **double** or **long double** floating-point number to a smaller floating-point type, the value of the floating-point variable is truncated toward zero when an underflow occurs. An overflow causes a run-time error. Note that the Microsoft C compiler maps the **long double** type to **double**.
**END Microsoft Specific**
The following table summarizes conversions from floating types.
## <a name="conversions-from-floating-point-types"></a>Conversions from floating-point types
|From|To|Method|
|----------|--------|------------|
|**float**|**char**|Convert to **long**; convert **long** to **char**|
|**float**|**short**|Convert to **long**; convert **long** to **short**|
|**float**|**long**|Truncate at decimal point. If the result is too large to be represented as **long**, the result is undefined.|
|**float**|**unsigned short**|Convert to **long**; convert **long** to **unsigned short**|
|**float**|**unsigned long**|Convert to **long**; convert **long** to **unsigned long**|
|**float**|**double**|Change internal representation|
|**float**|**long double**|Change internal representation|
|**double**|**char**|Convert to **float**; convert **float** to **char**|
|**double**|**short**|Convert to **float**; convert **float** to **short**|
|**double**|**long**|Truncate at decimal point. If the result is too large to be represented as **long**, the result is undefined.|
|**double**|**unsigned short**|Convert to **long**; convert **long** to **unsigned short**|
|**double**|**unsigned long**|Convert to **long**; convert **long** to **unsigned long**|
|**double**|**float**|Represent as a **float**. If the **double** value cannot be represented exactly as a **float**, loss of precision occurs. If the value is too large to be represented as a **float**, the result is undefined.|
|**long double**|**char**|Convert to **float**; convert **float** to **char**|
|**long double**|**short**|Convert to **float**; convert **float** to **short**|
|**long double**|**long**|Truncate at decimal point. If the result is too large to be represented as **long**, the result is undefined.|
|**long double**|**unsigned short**|Convert to **long**; convert **long** to **unsigned short**|
|**long double**|**unsigned long**|Convert to **long**; convert **long** to **unsigned long**|
|**long double**|**float**|Represent as a **float**. If the **long double** value cannot be represented exactly as a **float**, loss of precision occurs. If the value is too large to be represented as a **float**, the result is undefined.|
|**long double**|**double**|The **long double** value is treated as a **double**.|
Conversions from **float**, **double**, or **long double** values to **unsigned long** are not accurate if the value being converted is greater than the maximum positive **long** value.
## <a name="see-also"></a>See also
[Assignment conversions](../c-language/assignment-conversions.md)
# Base Configuration
The base configuration is supported by all scenarios. It consists of several
namespaces, which are described in detail below.
## dataset
The `dataset` namespace contains information about a dataset such as number of
images, as well as the output directory where data will be written to.
```python
[dataset]
# Specify how many images should be rendered
image_count = 5
# Depending on the rendering mode it is also possible to set scene
# and camera view counts. Note that setting these values might affect
# the total number of rendered images and, in turn, image_count.
# As an example, if supported, setting scene_count = 5, view_count = 5 in
# "multiview" render mode will result in image_count = 25
scene_count =
view_count =
# Specify the base path where data will be written to. Note that this is a base
# path, to which additional information will be added such as Scenario-Number
# and Camera-Name
base_path = $OUTDIR/WorkstationScenarios-Train
# specify the scene type
scene_type = WorkstationScenarios
```
## camera_info
Camera information is placed in the `camera_info` namespace. It contains
settings for image width and height, as well as (optional) intrinsic camera
information.
```python
[camera_info]
# In this section you specify the camera information, which will have a direct
# impact on rendering results.
# The width and height have an influence on the rendering resolution. In case
# you wish to set a specific calibration matrix that you obtained, for
# instance, from OpenCV, and do not wish to tamper with the rendering
# resolution, then set these values to 0.
width = 640
height = 480
# The camera model to use. At the moment, this value is ignored in
# amira_blender_rendering. However, because all rendering is in fact done with a
# pinhole camera model, this value serves as documentation
model = pinhole
# Also this value, like the model, has no impact on rendering. However, if
# you want to specify a certain camera name for documentation purposes, this is
# the place.
name = Pinhole Camera
# You can specify the intrinsic calibration information that was determined for
# a camera, for instance with OpenCV.
#
# Here, we use the format
# intrinsics = fx, fy, cx, cy
# Where the fx, fy values represent focal lengths, and cx, cy defines the
# camera's principal point.
#
# You can extract fx, fy, cx, cy from a calibration matrix K:
#
# fx s cx
# K = 0 fy cy
# 0 0 1
#
# Note, however, that the values in your calibration matrix or intrinsics
# specification might not end up in proper render resolutions. For instance,
# this is the case in the example below, which would result in a rendering
# resolution of about 1320.98 x 728.08. Blender will round these values to
# suitable integer values. As a consequence, even if you set width and height
# above to 0, the effective intrinsics that blender uses might be slightly
# different from your K.
#
# To accommodate this 'issue', amira_blender_rendering will write a value
# 'effective_intrinsics' to the configuration as soon as setting up cameras and
# rendering is done. Recall that all configurations will be stored alongside the
# created dataset, so you can easily retrieve the effective_intrinsics in
# downstream applications
intrinsics = 9.9801747708520452e+02,9.9264009290521165e+02,6.6049856967197002e+02,3.6404286361152555e+02,0
# A default camera in blender with 0 rotation applied to its transform looks
# along the -Z direction. Blender's modelling viewport, however, assumes that
# the surface plane is spanned by X and Y, where X indicates left/right. This
# can be observed by putting the modelling viewport into the front viewpoint
# (Numpad 1). Then, the viewport looks along the Y direction.
#
# As a consequence, the relative rotation between a camera image and an object
# is only 0 when the camera would look onto the top of the object. Note that
# this is rather unintuitive, as most people would expect that the relative
# rotation is 0 when the camera looks at the front of an object.
#
# To accommodate this, users can set their preferred 'zeroing' rotation
# by using the following configuration parameter encoding rotations
# around x, y and z-axis, respectively, in degrees.
#
# As an example, a value of 90, 0, 0 will apply a rotation of 90[deg] around x
# when computing the relative rotation between the camera and an object
# in the camera reference frame.
zeroing = 0.0, 0.0, 0.0
# We also allow setting camera parameters using additional values. These are:
# The sensor width in mm (if not available, set to 0.0)
sensor_width =
# The camera focal length in mm (if not available, set to 0.0)
focal_length =
# The camera Horizontal Field-of-View in degrees (if not available, set to 0.0)
hfov =
# Additionally, it is possible to determine how to compute the camera setup if only
# intrinsics values are given, choosing among "fov" and "mm" (default is "mm").
intrinsics_conversion_mode =
```
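As a convenience, extracting fx, fy, cx, cy from a calibration matrix K, as described in the comments above, can be scripted. This helper is a sketch and not part of amira_blender_rendering; the matrix values are illustrative:

```python
# Helper sketch: turn an OpenCV-style 3x3 calibration matrix K into the
# flat "fx, fy, cx, cy" string that the intrinsics option above expects.
def intrinsics_from_K(K):
    fx, fy = K[0][0], K[1][1]   # focal lengths
    cx, cy = K[0][2], K[1][2]   # principal point
    return f"{fx}, {fy}, {cx}, {cy}"

# Illustrative matrix, close in spirit to the example value in this document.
K = [[998.0, 0.0, 660.5],
     [0.0, 992.6, 364.0],
     [0.0, 0.0, 1.0]]
print(intrinsics_from_K(K))  # -> 998.0, 992.6, 660.5, 364.0
```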
## render_setup
The `render_setup` namespace is used to configure how blender's render backend
behaves, or which render backend to use.
```python
[render_setup]
# specify which renderer to use. Usually you should leave this at
# blender-cycles. Note that, at the moment, this is hard-coded to cycles
# internally anyway.
backend = blender-cycles
# integrator (either PATH or BRANCHED_PATH)
integrator = BRANCHED_PATH
# use denoising (true, false)
denoising = True
# samples the ray-tracer uses per pixel
samples = 64
# allow occlusions of target objects (true, false)
allow_occlusions = False
# select the bit depth of RGB images: 8 bit or 16 bit (default)
color_depth = 16
# toggle motion blur (True, False (default)) during rendering.
# Notice that this might not heavily affect
# your render output if the rendered scene is standing still.
motion_blur = False
```
## debugging
The `debug` namespace can be used to toggle debug functionality.
For scene-specific flags, refer to the desired scene.
```python
[debug]
# activate debug logs and print-outs (true, false)
enabled = False
```
## postprocess
The `postprocess` namespace can be used to implement functionality
during postprocessing and/or after the rendering phase.
```python
[postprocess]
# By default Blender uses a perfect pinhole camera model, and its output depth maps
# indeed contain ranges (in meters, saved as .exr files). For this reason, (rectified) depth
# maps (saved as png files) are computed during postprocessing. During generation we allow
# selecting the output scale to convert range to depth. Default is 1e4 = .1mm
depth_scale =
# During post processing it might happen that object visibility information (which are computed
# using ray-casting) and the corresponding object mask do not correspond (ie. the mask is empty).
# This might happen due to image resolution: the visible portion of the object is not big enough
# for a single pixel. Since this behavior can happen, however seldom, we allow overwriting
# visibility information based on the computed mask (default is False).
visibility_from_mask =
# If requested, the disparity between a set of parallel cameras can be computed. Default is False
compute_disparity =
# Disparity is computed only on given cameras (chosen among those set in scene_setup.cameras)
parallel_cameras = []
# Disparity maps require a baseline value (in mm) between the selected cameras. Default is 0
parallel_cameras_baseline_mm =
```
## Notice
PagerDuty is planning to deprecate this tool in favour of [go-pdagent](https://github.com/PagerDuty/go-pdagent). go-pdagent is not feature-complete at this moment; however, it will be before an official deprecation notice.
----
> This is the source code and project. For the PagerDuty Agent Install Guide,
> see http://www.pagerduty.com/docs/guides/agent-install-guide/
# Introduction
The PagerDuty Agent is a program that lets you easily integrate your monitoring
system with PagerDuty.
It includes command-line tools to trigger, acknowledge & resolve PagerDuty
incidents.
The supported events are those listed in the PagerDuty Integration API:
> <https://v2.developer.pagerduty.com/docs/events-api>
The PagerDuty Agent is completely open source, which means that you can download
the source code and customize it for your needs.
The Agent requires Python 2.7 or higher. The instructions here assume that you're
on a Mac.
## Developing
### Running in Development
#### Locally
You can run the Agent in development without any setup. Start the Agent daemon
as follows:
`bin/pdagentd.py`
When run in development the daemon automatically creates a `tmp` directory
inside the project where it stores its various work files.
Similarly, you can use the `pd-send` command immediately.
```
bin/pd-send -h
usage: pd-send [-h] -k SERVICE_KEY -t {trigger,acknowledge,resolve}
[-d DESCRIPTION] [-i INCIDENT_KEY] [-f FIELDS]
Queue up a trigger, acknowledge, or resolve event to PagerDuty.
...
```
Make sure that you have run the daemon at least once so that the `tmp`
directory exists.
You can stop the daemon as follows:
`kill $(cat tmp/pdagentd.pid)`
#### With Docker
To run the Agent in a production-like environment, use Docker. We currently have two supported operating systems: Ubuntu 16.04 and CentOS 7. With Docker installed, run `./scripts/run-console.sh <ubuntu>` or `./scripts/run-console.sh <centos>` to spin up the Docker container, run the Agent, and drop into a console.
Once in the console, you can send events via `pd-send`.
### IDE Setup
For IDE setup instructions see [PyDev Setup](pydev-setup.md) or [IDEA Setup](idea-setup.md). Apart from the usual benefits, the IDEs provide PEP-8 warnings which we care about.
### Build Tools
To perform a complete automated build, you'll need to install Docker and `make`.
### Running Unit Tests
You can run the unit tests with the following command:
`make test`
To run them without using `make`, use the `run-tests.py` test runner, e.g.:
`python run-tests.py unit_tests/test_*.py unit_tests/thirdparty/test_*.py`
### Running Integration Tests
You can run the integration tests with the following command:
`./scripts/full-integration-test.sh <centos or ubuntu>`
Make sure to run tests with both `centos` and `ubuntu` options, as those are the supported Linux distributions for `pdagent`.
### Building Packages
For development builds, you can perform a full automated clean build of the
Agent with the following steps:
1. Configure signing keys by following the [One-time setup of GPG keys](build-linux/howto.md#one-time-setup-of-gpg-keys) instructions.
2. Run the following commands:
```
make ubuntu
make centos
```
If you want to build packages by hand, follow the instructions in the
[Build Linux Howto](build-linux/howto.md).
Similarly, you can check the SCons targets using `scons -h` for instructions on
performing specific build tasks and on specific VMs.
# License
Copyright (c) 2013-2014, PagerDuty, Inc. <info@pagerduty.com>
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of the copyright holder nor the
names of its contributors may be used to endorse or promote products
derived from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
# Challenge 01 - Hello World
A quick recap on the rules is available [here](README.md). For the unfamiliar, the aim of the game is to write a Kotlin program that solves a relatively trivial problem, using the most convoluted solution possible.
For this challenge, the program should write the string `'Hello, World!'` to `System.out`. In unobfuscated Kotlin, the solution would look something like this:
```
fun main(args: Array<String>) {
println("Hello, World!")
}
```
## Entries
We received 5 great entries for this challenge, and all the authors have been added to [the Hall of Infamy](README.md#hall-of-infamy). Without further ado, here are the entries, and an attempt to explain how on earth they work.
### Kotlin DSL for Brainfuck
Our first entry is a [Kotlin DSL for Brainfuck](https://github.com/fractalwrench/iokk/pull/5), written by [westonal](https://github.com/westonal).
True to its name, this entry makes my head hurt trying to understand it. It abuses [operator overriding](https://kotlinlang.org/docs/reference/operator-overloading.html) to implement loops, and [extension properties](https://kotlinlang.org/docs/reference/extensions.html) to implement functions.
This all adds up to create possibly the most confusing DSL ever created within Kotlin. It really has to be seen to be believed. Use it in production today by copy pasting from [here](https://github.com/fractalwrench/iokk/pull/5)!
### Emojiianal abuse
The [next entry](https://github.com/fractalwrench/iokk/pull/3) by [Naaate](https://github.com/Naaate) makes great use of Unicode emoji, which can form valid Kotlin identifiers:
```kotlin
val `😘` = 42
```
[Infix functions](https://kotlinlang.org/docs/reference/functions.html#infix-notation) and [type aliases](https://kotlinlang.org/docs/reference/type-aliases.html) serve to further obfuscate the remaining code. This entry is 👌
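To see why that combination is so disorienting, here is a small sketch of my own (the identifiers and helpers are invented, not Naaate's actual code) combining backticked emoji identifiers, a type alias, and an infix function:

```kotlin
// Hypothetical sketch: backticked emoji identifiers, a type alias, and
// an infix function conspire to make ordinary string-building unreadable.
typealias `🧵` = StringBuilder

// An infix extension function lets calls read like punctuation.
infix fun `🧵`.`➕`(c: Char): `🧵` = this.append(c)

fun main() {
    val `📦` = `🧵`()
    `📦` `➕` 'H' `➕` 'i'   // chained infix calls append characters
    println(`📦`)           // prints "Hi"
}
```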
### Syntax Inversion
[This entry](https://github.com/fractalwrench/iokk/pull/1) by [swankjesse](https://github.com/swankjesse) uses a very clever approach to flip Kotlin's syntax on its head, by passing the argument first and the method name last.
```kotlin
fun ch1Solution_swankjesse() {
`Hello, World!`("println")
}
fun `Hello, World!`(vararg args: Any?) {
for (p in System.getProperties().values) {
for (j in p.toString().split(":")) {
try {
for (e in java.util.zip.ZipFile(j).entries()) {
try {
val n = e.name.substring(0 until e.name.length - 6).replace('/', '.')
val c = Class.forName(n, false, object : Any() {}.javaClass.classLoader)
val m = c.getDeclaredMethod(args[0].toString(), Any::class.java)
m.isAccessible = true
m.invoke(null, Exception().stackTrace[0].methodName)
} catch (t: Throwable) {
}
}
} catch (t: Throwable) {
}
}
}
}
```
This works by finding the `println` method via reflection, then invoking it with the current method name, which is obtained by inspecting a stacktrace.
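The stack-trace half of that trick is easy to reproduce in isolation. A minimal sketch (my own, simplified from the entry): a freshly created exception records the name of the currently executing method, even when that name is a backticked string:

```kotlin
// Minimal illustration of the stack-trace trick: frame 0 of a freshly
// created exception is the method that created it, so its own (backticked)
// name can be read back out at runtime.
fun `Hello, World!`(): String = Exception().stackTrace[0].methodName

fun main() {
    println(`Hello, World!`())  // prints "Hello, World!"
}
```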
### More parentheses than Lisp
[This entry](https://github.com/fractalwrench/iokk/pull/4) by [machtelik](https://github.com/machtelik) uses more parentheses than the average Lisp program, [if such a thing is possible](https://xkcd.com/297/). It cleverly exploits optional lambdas to encode "Hello World" in binary, then ultimately converts everything back to a String, which is printed to screen.
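The decoding step is easier to follow outside the forest of parentheses. A hedged sketch (the encoding and helper name are my own, not machtelik's code) of turning binary-encoded character codes back into a string:

```kotlin
// Hypothetical sketch of the final step: each character is encoded as a
// binary string, parsed back to its code point, and concatenated.
fun decode(bits: List<String>): String =
    bits.map { it.toInt(2).toChar() }.joinToString("")

fun main() {
    // "Hi" as 8-bit binary: 'H' = 72, 'i' = 105
    println(decode(listOf("01001000", "01101001")))  // prints "Hi"
}
```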
### Operator overloading, extension functions, ASCII
Our [last entry](https://github.com/fractalwrench/iokk/pull/2) by [jamiesanson](https://github.com/jamiesanson) abuses extension functions, operator overloading, and good old-fashioned ASCII codes to achieve its goal. Conclusive proof that for Kotlin, security through obfuscation sometimes does work!
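As a flavor of those ingredients, here is a sketch of my own (not jamiesanson's actual code): an extension function on `Int` plus an overloaded operator is all it takes to spell a string out of ASCII codes:

```kotlin
// Hypothetical sketch: ASCII codes become characters via an extension
// function, and an overloaded plus stitches them into a String.
fun Int.ch(): Char = this.toChar()

// There is no built-in Char.plus(Char), so this extension operator resolves.
operator fun Char.plus(other: Char): String = "$this$other"

fun main() {
    println(72.ch() + 105.ch())  // 72 = 'H', 105 = 'i' -> prints "Hi"
}
```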
## Next Challenge
Ready for the next challenge? You can check out the entries or get started on submitting a solution [here](CH2.md).
Open an issue if you have a suggestion for a new challenge, or think the contest could be better by changing something.
It's early days and I'm open to ideas.
# Dustin Lind CMC Senior Thesis Repository
Welcome to my code repository for my CMC economics senior thesis.
Please email me at dustinlind99@gmail.com if you have any questions or suggested changes to my analysis.
The data that I will be using is from CRSP/Compustat merged dataset. Due to Github's file size limits, I cannot upload the datasets that I used directly to this repository. Here I have provided the links to the annual fundementals and monthly security data that I will be using: [CRSP/Compustat Merged](https://wrds-www.wharton.upenn.edu/pages/get-data/center-research-security-prices-crsp/annual-update/crspcompustat-merged/)
| 80.5 | 426 | 0.807453 | eng_Latn | 0.986888 |
---
layout: page
title: Codes
permalink: /illustrantions/
weight: 2
---
<properties
    pageTitle="Azure AD .NET protocol overview | Microsoft Azure"
    description="Learn how to use HTTP messages to authorize access to web applications and web APIs in your Azure AD tenant."
services="active-directory"
documentationCenter=".net"
authors="priyamohanram"
manager="mbaldwin"
editor=""/>
<tags
ms.service="active-directory"
ms.workload="identity"
ms.tgt_pltfrm="na"
ms.devlang="dotnet"
ms.topic="article"
ms.date="01/21/2016"
ms.author="priyamo"/>
<!--TODO: Introduction -->
## Register your application with your AD tenant

First, you need to register your application with your Active Directory tenant. This generates a client ID for your application and enables it to receive tokens.

- Sign in to the Azure management portal.
- In the left navigation pane, click **Active Directory**.
- Select the tenant where you want to register your application.
- Click the **Applications** tab, and then click **Add** in the bottom drawer.
- Follow the prompts to create a new application. For this tutorial it does not matter whether it is a web application or a native application, but if you want examples specific to web or native applications, see our quickstart guides [here](../articles/active-directory/active-directory-developers-guide.md).
- For web applications, provide the **Sign-On URL**, which is the base URL of your app, where users can sign in, for example `http://localhost:12345`. The **App ID URI** is a unique identifier for your application. The convention is to use `https://<tenant-domain>/<app-name>`, for example `https://contoso.onmicrosoft.com/my-first-aad-app`
- For native applications, provide a **Redirect URI**, which Azure AD will use to return token responses. Enter a value specific to your application, for example `http://MyFirstAADApp`.
- Once you have completed registration, AAD assigns your application a unique client ID. You will need this value in the following sections, so copy it from the **Configure** tab of the application.
---
title: PublishedPage element (PublishSettings_Type complexType) (Visio XML)
manager: soliver
ms.date: 03/09/2015
ms.audience: Developer
ms.topic: reference
localization_priority: Normal
ms.assetid: c1eca66b-5840-790a-459f-e06680d11c05
description: Specifies whether a drawing page can be viewed in the browser by using Visio Services in Microsoft SharePoint Server 2013.
ms.openlocfilehash: 614c01f12b9a7525620704e5417a106e8703c983
ms.sourcegitcommit: e7b38e37a9d79becfd679e10420a19890165606d
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 05/29/2019
ms.locfileid: "34538374"
---
# <a name="publishedpage-element-publishsettings_type-complextype-visio-xml"></a>PublishedPage element (PublishSettings_Type complexType) (Visio XML)

Specifies whether a drawing page can be viewed in the browser by using Visio Services in Microsoft SharePoint Server 2013.

## <a name="element-information"></a>Element information

|||
|:-----|:-----|
|**Element type** <br/> |[PublishedPage_Type](publishedpage_type-complextypevisio-xml.md) <br/> |
|**Namespace** <br/> |http://schemas.microsoft.com/office/visio/2012/main <br/> |
|**Schema file** <br/> |VisioSchema15.xsd <br/> |
|**Document parts** <br/> |document.xml <br/> |

## <a name="definition"></a>Definition
```XML
<xs:element name="PublishedPage" type="PublishedPage_Type" minOccurs="0" maxOccurs="unbounded">
</xs:element>
```
## <a name="elements-and-attributes"></a>Elements and attributes

If the schema defines specific requirements, such as **sequence**, **minOccurs**, **maxOccurs**, and **choice**, see the definition section.

### <a name="parent-elements"></a>Parent elements

|**Element**|**Type**|**Description**|
|:-----|:-----|:-----|
|[PublishSettings](publishsettings-element-visiodocument_type-complextypevisio-xml.md) <br/> |[PublishSettings_Type](publishsettings_type-complextypevisio-xml.md) <br/> |Specifies the settings that are used when the diagram is opened by using Visio Services. <br/> |

### <a name="child-elements"></a>Child elements

None.

### <a name="attributes"></a>Attributes

|**Attribute**|**Type**|**Required**|**Description**|**Possible values**|
|:-----|:-----|:-----|:-----|:-----|
|ID <br/> |xsd:unsignedInt <br/> |required <br/> |The identifier of a drawing page. <br/> |Values of type xsd:unsignedInt. <br/> |
---
layout: post
title: "[PL] Going with Ionic 3 in My Cookbook app"
categories: MyCookbook
---
## My Cookbook

What is My Cookbook?

My Cookbook is an app that works as a digital cookbook and pantry. In the Pantry you can keep track of how much of each ingredient you have. Based on your Pantry, the app shows which dishes you can prepare right away. Dishes are flagged when an ingredient is missing, and the app shows what is missing and in what quantity.

MyCookbook was written in Ionic v1.

## Goal

Migrate the project from Ionic v1 to v3.

I suspect Ionic v3 is fairly stable, so that is the version I want to bring the app up to. I am not going to push on to Ionic v4 just yet; there will be plenty of new things for me as it is, in particular a different version of Angular. I have already done one project in Angular 2, back when it was still quite fresh :-).
| 42.8 | 342 | 0.788551 | pol_Latn | 0.999998 |