Dataset column schema:

| column | dtype | values |
| --- | --- | --- |
| hexsha | stringlengths | 40–40 |
| size | int64 | 5–1.04M |
| ext | stringclasses | 6 values |
| lang | stringclasses | 1 value |
| max_stars_repo_path | stringlengths | 3–344 |
| max_stars_repo_name | stringlengths | 5–125 |
| max_stars_repo_head_hexsha | stringlengths | 40–78 |
| max_stars_repo_licenses | listlengths | 1–11 |
| max_stars_count | int64 | 1–368k |
| max_stars_repo_stars_event_min_datetime | stringlengths | 24–24 |
| max_stars_repo_stars_event_max_datetime | stringlengths | 24–24 |
| max_issues_repo_path | stringlengths | 3–344 |
| max_issues_repo_name | stringlengths | 5–125 |
| max_issues_repo_head_hexsha | stringlengths | 40–78 |
| max_issues_repo_licenses | listlengths | 1–11 |
| max_issues_count | int64 | 1–116k |
| max_issues_repo_issues_event_min_datetime | stringlengths | 24–24 |
| max_issues_repo_issues_event_max_datetime | stringlengths | 24–24 |
| max_forks_repo_path | stringlengths | 3–344 |
| max_forks_repo_name | stringlengths | 5–125 |
| max_forks_repo_head_hexsha | stringlengths | 40–78 |
| max_forks_repo_licenses | listlengths | 1–11 |
| max_forks_count | int64 | 1–105k |
| max_forks_repo_forks_event_min_datetime | stringlengths | 24–24 |
| max_forks_repo_forks_event_max_datetime | stringlengths | 24–24 |
| content | stringlengths | 5–1.04M |
| avg_line_length | float64 | 1.14–851k |
| max_line_length | int64 | 1–1.03M |
| alphanum_fraction | float64 | 0–1 |
| lid | stringclasses | 191 values |
| lid_prob | float64 | 0.01–1 |
932938d7e009b526ca879939a912e115cf5ff362
858
md
Markdown
src/pages/escuela/clases/2021-barra-coreografiada.md
denisprado/bambudanza
c5d881cc27a846a00227c53fc1633515b2cdf83f
[ "MIT" ]
null
null
null
src/pages/escuela/clases/2021-barra-coreografiada.md
denisprado/bambudanza
c5d881cc27a846a00227c53fc1633515b2cdf83f
[ "MIT" ]
null
null
null
src/pages/escuela/clases/2021-barra-coreografiada.md
denisprado/bambudanza
c5d881cc27a846a00227c53fc1633515b2cdf83f
[ "MIT" ]
null
null
null
---
templateKey: programa-post
title: BARRA COREOGRAFIADA
order: 11
date: 2021-01-22T18:23:09.114Z
description: Técnicas fusionadas
featuredpost: true
featuredimage: /img/arnaldo-iv.jpg
tags:
  - Barra Coreografiada
profesora: Arnaldo Iarsoli
tarifa:
  - 1 X A LA SEMANA (1H) - 45 €
  - 2 X A LA SEMANA ONLINE - 60 €
horarios:
  - 19h00 a 20h00
dias:
  - Martes y Jueves
estilo:
  - Jazz y Ballet Clásico
nivel:
  - Iniciación
  - Intermedio
  - Avanzado
---

This class aims to improve posture and coordination and to strengthen the upper and lower body through counterweight work, seeking the proper placement of each limb and fostering each student's creativity and exploration. We blend the foundations of classical ballet with jazz, working on movement in an enjoyable way.
26.8125
352
0.75641
spa_Latn
0.938678
93294c61d9920cc2b68a2620bf642d8492de85d3
71
md
Markdown
lib/README.md
cassidoxa/z3r-sramr
9cd19460737a3f84a19d9aa7ff4db699a342aa71
[ "MIT" ]
null
null
null
lib/README.md
cassidoxa/z3r-sramr
9cd19460737a3f84a19d9aa7ff4db699a342aa71
[ "MIT" ]
3
2021-03-29T19:01:39.000Z
2021-03-29T19:12:51.000Z
lib/README.md
cassidoxa/z3r-sramr
9cd19460737a3f84a19d9aa7ff4db699a342aa71
[ "MIT" ]
null
null
null
# z3r-sramr

## Installation

Add `z3r-sramr = "0.2"` to your Cargo.toml.
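For reference, the dependency line above goes under the `[dependencies]` table of the manifest, and Cargo expects the version requirement as a quoted string:

```toml
[dependencies]
z3r-sramr = "0.2"
```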
11.833333
40
0.676056
kor_Hang
0.405698
93295286621450c4df26695ea6909b2eaa32f40d
1,980
md
Markdown
node_modules/@seregpie/claw/README.md
MaherSalama2020/GeniusExams
8091cfaba95ec1747ebb14e6f3a32ea0295fb225
[ "MIT" ]
null
null
null
node_modules/@seregpie/claw/README.md
MaherSalama2020/GeniusExams
8091cfaba95ec1747ebb14e6f3a32ea0295fb225
[ "MIT" ]
null
null
null
node_modules/@seregpie/claw/README.md
MaherSalama2020/GeniusExams
8091cfaba95ec1747ebb14e6f3a32ea0295fb225
[ "MIT" ]
null
null
null
# Claw

A very small gesture recognizer.

## demo

[Try it out!](https://seregpie.github.io/VueClaw/)

## setup

### npm

```shell
npm install @seregpie/claw
```

### ES module

```javascript
import Claw from '@seregpie/claw';
```

### browser

```html
<script src="https://unpkg.com/@seregpie/claw"></script>
```

## members

```
.constructor(target, {
  delay = 500,
  distance = 1,
})
```

| argument | description |
| ---: | :--- |
| `target` | The target element. |
| `delay` | The delay threshold to separate the event types. |
| `distance` | The distance threshold to separate the event types. |

```javascript
let element = document.getElementById('claw');
(new Claw(element))
  .on('panStart', event => {
    // handle
  })
  .on('pan', event => {
    // handle
  })
  .on('panEnd', event => {
    // handle
  });
```

---

`.on(type, listener)` *chainable*

Binds a listener to an event type.

```javascript
claw.on('tap', event => {
  // handle tap event
});
```

---

`.off(type, listener)` *chainable*

Unbinds a listener from an event type. Omit the argument `listener` to unbind all listeners from an event type. Omit the argument `type` to unbind all listeners from all event types.

```javascript
let tapListener = function(event) {
  // handle tap event
  claw.off('tap', tapListener);
};
claw.on('tap', tapListener);
```

---

`.isIdle`

Returns `true`, if there are no bound listeners.

## events

`holdStart`

```js
{
  pointerType,
  timeStamp,
  x,
  y,
}
```

---

`holdEnd`

```js
{
  initialTimeStamp,
  pointerType,
  timeStamp,
  x,
  y,
}
```

---

`panStart`

```js
{
  pointerType,
  timeStamp,
  x,
  y,
}
```

---

`pan`

```js
{
  initialTimeStamp,
  initialX,
  initialY,
  pointerType,
  previousTimeStamp,
  previousX,
  previousY,
  timeStamp,
  x,
  y,
}
```

---

`panEnd`

```js
{
  initialTimeStamp,
  initialX,
  initialY,
  pointerType,
  timeStamp,
  x,
  y,
}
```

---

`tap`

```js
{
  pointerType,
  timeStamp,
  x,
  y,
}
```
10.819672
72
0.6
eng_Latn
0.542967
9329ea19b1ae97cffb1c4a2f6112173dbd1ca2f4
1,035
md
Markdown
junior_class/chapter-4-Object_Detection/README_en.md
wwhio/awesome-DeepLearning
2cc92edcf0c22bdfc670c537cc819c8fadf33fac
[ "Apache-2.0" ]
1,150
2021-06-01T03:44:21.000Z
2022-03-31T13:43:42.000Z
junior_class/chapter-4-Object_Detection/README_en.md
wwhio/awesome-DeepLearning
2cc92edcf0c22bdfc670c537cc819c8fadf33fac
[ "Apache-2.0" ]
358
2021-06-01T03:58:47.000Z
2022-03-28T02:55:00.000Z
junior_class/chapter-4-Object_Detection/README_en.md
wwhio/awesome-DeepLearning
2cc92edcf0c22bdfc670c537cc819c8fadf33fac
[ "Apache-2.0" ]
502
2021-05-31T12:52:14.000Z
2022-03-31T02:51:41.000Z
# Computer Vision: Object Detection [[简体中文](./README.md)]

## Introduction

In this chapter, you will learn the basic concepts of object detection, including bounding boxes, anchor boxes, IOU, non-maximum suppression, and mAP. The chapter also introduces the basic principles of the YOLOv3 object detection model, and completes model training and testing experiments on a forest pest and disease dataset. Hopefully, through this chapter, you will gain a basic understanding of the object detection task and of YOLOv3.

## Structure

This chapter is presented in the form of a notebook and code:

- The notebook provides the learning tutorial for this chapter, with a complete text description. For a better reading experience, you can also visit the [notebook document](https://aistudio.baidu.com/aistudio/education/group/info/9045/content) on the AIStudio platform.
- The code section provides complete learning code. For specific usage instructions, please refer to the [README](./code/README.md) in the code section.
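As a quick illustration of the IOU concept introduced above, here is a minimal Python sketch (not part of the chapter's code) that computes the intersection over union of two axis-aligned bounding boxes:

```python
def iou(box_a, box_b):
    """IOU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Two 10x10 boxes offset by 5: intersection 25, union 175 -> 1/7
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```

The clamping with `max(0, ...)` is what makes disjoint boxes score exactly zero rather than a negative "area".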
94.090909
451
0.795169
eng_Latn
0.997152
932b54cb71418679ea7982e75a27b10b271d8df8
5,053
md
Markdown
README.md
hiroq/RxWebSocketClient
17c874f77a50343fb32b9e28a52bffeddf97f452
[ "MIT" ]
1
2016-10-27T07:45:39.000Z
2016-10-27T07:45:39.000Z
README.md
hiroq/RxWebSocketClient
17c874f77a50343fb32b9e28a52bffeddf97f452
[ "MIT" ]
null
null
null
README.md
hiroq/RxWebSocketClient
17c874f77a50343fb32b9e28a52bffeddf97f452
[ "MIT" ]
null
null
null
# RxWebSocketClient

Simple RxJava WebSocketClient

# DEMO Video

[![DEMO VIDEO](http://img.youtube.com/vi/831TLryJnR8/0.jpg)](http://www.youtube.com/watch?v=831TLryJnR8)

# Install

```groovy
dependencies {
    compile 'net.hiroq:rxwsc:0.1.7'
}
```

# Usage

If you use lambdas, the code will be simpler!

```java
onResume() {
    mSocketClient = new RxWebSocketClient();
    mSubscription = mSocketClient.connect(Uri.parse("ws://hogehoge"))
        .subscribeOn(Schedulers.newThread())
        .observeOn(AndroidSchedulers.mainThread())
        .subscribe(new Action1<RxWebSocketClient.Event>() {
            @Override
            public void call(RxWebSocketClient.Event event) {
                Log.d(TAG, "== onNext ==");
                switch (event.getType()) {
                    case CONNECT:
                        Log.d(TAG, " CONNECT");
                        mSocketClient.send("test");
                        break;
                    case DISCONNECT:
                        Log.d(TAG, " DISCONNECT");
                        break;
                    case MESSAGE_BINARY:
                        Log.d(TAG, " MESSAGE_BINARY : bytes = " + event.getBytes().length);
                        break;
                    case MESSAGE_STRING:
                        Log.d(TAG, " MESSAGE_STRING = " + event.getString());
                        break;
                }
            }
        }, new Action1<Throwable>() {
            @Override
            public void call(Throwable throwable) {
                Log.d(TAG, "== onError ==");
                throwable.printStackTrace();
            }
        }, new Action0() {
            @Override
            public void call() {
                Log.d(TAG, "== onComplete ==");
            }
        });
}

onPause() {
    // After this call, the connection will be disconnected.
    mSubscription.unsubscribe();
}

onClick() {
    mSocketClient.send("This is test message");
}
```

If you want to reconnect automatically, use `retryWhen` like below:

```java
mSocketClient = new RxWebSocketClient();
mSubscription = mSocketClient.connect(Uri.parse("ws://hogehoge"))
    .retryWhen(new Func1<Observable<? extends Throwable>, Observable<?>>() {
        @Override
        public Observable<?> call(final Observable<? extends Throwable> observable) {
            return observable.flatMap(new Func1<Throwable, Observable<?>>() {
                @Override
                public Observable<?> call(Throwable throwable) {
                    // If the exception is ConnectException,
                    // retry with a 5 second delay.
                    if (throwable instanceof ConnectException) {
                        Log.d(TAG, "retry with delay");
                        return Observable.timer(5, TimeUnit.SECONDS);
                    } else {
                        // Other exceptions will be handled in onError
                        return Observable.error(throwable);
                    }
                }
            });
        }
    })
    .subscribeOn(Schedulers.newThread())
    .observeOn(AndroidSchedulers.mainThread())
```

# Sample

You can run the sample project with a sample WebSocket server, which is implemented in JavaScript and runs on Node.js.

```
cd sample-server
npm install
node app.js
```

Then run the sample Android app on an emulator.

> If you want to run the sample on a real device, you have to change the connection URL.

# TODO

* make test

# Licence

```
MIT License

Copyright (c) 2016 Hiroki Oizumi

This library is ported from following libraries

1. Copyright (c) 2010-2016 James Coglan<br>
   faye's faye-websocket-node<br>
   https://github.com/faye/faye-websocket-node
2. Copyright (c) 2012 Eric Butler<br>
   codebutler's android-websockets.<br>
   https://github.com/codebutler/android-websockets

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
34.37415
110
0.584603
eng_Latn
0.350824
932b92ad38add1a6da2f3d8fe025c2ebf9020ea3
111
md
Markdown
_news/new_2.md
npitsillos/npitsillos.github.io
6aaba62ce7c9ea38da862d68624ae74892dbd26e
[ "MIT" ]
null
null
null
_news/new_2.md
npitsillos/npitsillos.github.io
6aaba62ce7c9ea38da862d68624ae74892dbd26e
[ "MIT" ]
null
null
null
_news/new_2.md
npitsillos/npitsillos.github.io
6aaba62ce7c9ea38da862d68624ae74892dbd26e
[ "MIT" ]
null
null
null
---
layout: post
date: 2021-05-24
inline: true
---

[Paper](/publications/) accepted at ICDL-EpiRob 2021 :tada:
15.857143
59
0.693694
eng_Latn
0.41249
932cbb4d691f2e090197ea74035afebaa33e364e
2,300
md
Markdown
docs/workflow-designer/messaging-activity-designers.md
klmnden/visualstudio-docs.tr-tr
82aa1370dab4ae413f5f924dad3e392ecbad0d02
[ "CC-BY-4.0", "MIT" ]
1
2020-09-01T20:45:52.000Z
2020-09-01T20:45:52.000Z
docs/workflow-designer/messaging-activity-designers.md
klmnden/visualstudio-docs.tr-tr
82aa1370dab4ae413f5f924dad3e392ecbad0d02
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/workflow-designer/messaging-activity-designers.md
klmnden/visualstudio-docs.tr-tr
82aa1370dab4ae413f5f924dad3e392ecbad0d02
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Workflow Designer - Messaging activity designers
ms.date: 11/04/2016
ms.topic: reference
ms.assetid: 897e63cf-a42f-4edd-876f-c4ccfffaf6d6
author: gewarren
ms.author: gewarren
manager: jillfra
ms.workload:
  - multiple
ms.openlocfilehash: a9868b5eb52edde8e12d6a3b4f5edab1a4a9e499
ms.sourcegitcommit: 12f2851c8c9bd36a6ab00bf90a020c620b364076
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 06/06/2019
ms.locfileid: "66747098"
---

# <a name="messaging-activity-designers"></a>Messaging activity designers

Messaging activity designers are used to create and configure messaging activities that send and receive Windows Communication Foundation (WCF) messages from within a Windows Workflow Foundation (WF) application. Five messaging activities were introduced in .NET Framework 4. Workflow Designer provides two template designers that let you manage messaging within a workflow. The topics contained in this section, listed in the following table, provide guidance on using the template and activity designers in Workflow Designer.

- <xref:System.Activities.Activity>
- <xref:System.ServiceModel.Activities.CorrelationScope>
- <xref:System.ServiceModel.Activities.Receive>
- <xref:System.ServiceModel.Activities.Send>
- <xref:System.ServiceModel.Activities.ReceiveReply>
- <xref:System.ServiceModel.Activities.SendReply>
- <xref:System.ServiceModel.Activities.TransactedReceiveScope>

## <a name="related-sections"></a>Related sections

For the other types of activity designers, see the following topics:

- [Control Flow](../workflow-designer/control-flow-activity-designers.md)
- [Using the Activity Designers](../workflow-designer/using-the-activity-designers.md)
- [Flowchart](../workflow-designer/flowchart-activity-designers.md)
- [Runtime](../workflow-designer/runtime-activity-designers.md)
- [Primitives](../workflow-designer/primitives-activity-designers.md)
- [Transaction](../workflow-designer/transaction-activity-designers.md)
- [Collection](../workflow-designer/collection-activity-designers.md)
- [Error Handling](../workflow-designer/error-handling-activity-designers.md)

## <a name="external-resources"></a>External resources

[Using the Activity Designers](../workflow-designer/using-the-activity-designers.md)
38.333333
382
0.808696
tur_Latn
0.974589
932cd48fe14c8e8b5dfd28c8c6449c1151f2229e
655
md
Markdown
CHANGELOG.md
kudohamu/akashic-label
6f61a7d0d0c95d75412641147ff929a55f0758d9
[ "MIT" ]
null
null
null
CHANGELOG.md
kudohamu/akashic-label
6f61a7d0d0c95d75412641147ff929a55f0758d9
[ "MIT" ]
null
null
null
CHANGELOG.md
kudohamu/akashic-label
6f61a7d0d0c95d75412641147ff929a55f0758d9
[ "MIT" ]
null
null
null
# CHANGELOG

## 2.0.5

* Handle the case where `glyph.surface` no longer exists at draw time.
* Before `drawImage`, check `glyph.isSurfaceValid` and, if the surface has been destroyed, recreate the glyph.

## 2.0.4

* Fixed an issue where some surrogate-pair characters were not rendered correctly.

## 2.0.3

* Added a `lineBreakRule` option for specifying line-breaking (kinsoku) rules.

## 2.0.2

* Fixed an issue where surrogate-pair characters were not rendered correctly.
* Aligned the sample directory structure with the akashic-cli-init TypeScript template.

## 2.0.1

* Fixed garbled rendering of surrogate-pair characters.

## 2.0.0

* Followed akashic-engine@2.0.0; bumped the version to 2.0.0 accordingly.
* Removed the deprecated `LabelParameterObject#bitmapFont` and `RubyOptions#rubyBitmapFont`.

## 0.3.5

* Removed unnecessary files from the published package.

## 0.3.4

* Changed build tools.
* Updated TypeScript.

## 0.3.3

* Added the `trimMarginTop` and `widthAutoAdjust` options.

## 0.3.2

* Initial release.
14.23913
81
0.71145
yue_Hant
0.62428
932ce931103036aff702a6cff832b617be803ca6
2,112
md
Markdown
CONTRIBUTING.md
hfaivre/synthetics-ci-github-action-1
ca8612ffcfced6982be83e31fe061970146c0121
[ "MIT" ]
25
2022-01-18T15:17:10.000Z
2022-03-17T21:52:25.000Z
CONTRIBUTING.md
hfaivre/synthetics-ci-github-action-1
ca8612ffcfced6982be83e31fe061970146c0121
[ "MIT" ]
3
2022-01-07T14:06:48.000Z
2022-03-04T09:29:42.000Z
CONTRIBUTING.md
hfaivre/synthetics-ci-github-action-1
ca8612ffcfced6982be83e31fe061970146c0121
[ "MIT" ]
2
2022-01-05T22:27:57.000Z
2022-02-17T09:25:17.000Z
# Contributing

First of all, thanks for contributing!

This document provides some basic guidelines for contributing to this repository. To propose improvements, feel free to submit a pull request.

## Submitting issues

GitHub issues are welcome, feel free to submit error reports and feature requests!

- Ensure the bug was not already reported by searching on GitHub under [Issues](https://github.com/DataDog/synthetics-ci-github-action/issues).
- If you're unable to find an open issue addressing the problem, [open a new one](https://github.com/DataDog/synthetics-ci-github-action/issues/new/choose).
- Make sure to add enough details to explain your use case.

If you require further assistance, you can also contact [our support](https://docs.datadoghq.com/help/).

## Submitting pull requests

Have you fixed a bug or written a new feature and want to share it? Many thanks!

In order to ease/speed up our review, here are some items you can check/improve when submitting your pull request:

- **Write meaningful commit messages.** Messages should be concise but explanatory. The commit message should describe the reason for the change, so that the work can be understood quickly later.
- **Keep it small and focused.** Pull requests should contain only one fix or one feature improvement. Bundling several fixes or features in the same PR will make it harder to review, and eventually take more time to release.
- **Write tests for the code you wrote.** Each module should be tested. The tests for a module are located in the [`__tests__` folder](https://github.com/DataDog/synthetics-ci-github-action/tree/main/__tests__), under a file with the same name as the module.

Our CI is not (yet) public, so it may be difficult to understand why your pull request status is failing. Make sure that all tests pass locally, and we'll try to sort it out in our CI.

## Style guide

The code under this repository follows a format enforced by Prettier and a style guide enforced by ESLint.

## Asking a question

Need help? Contact [Datadog support](https://docs.datadoghq.com/help/).
49.116279
186
0.777936
eng_Latn
0.998442
932e49a7eb4e4d32159b5cf5ee9e1143e50bf627
3,490
md
Markdown
content/publication/mmhri2020_workshop/index.md
zedavid/website
11b271a4e91c80b4a14ce49f0e88449e1b97bad3
[ "MIT" ]
null
null
null
content/publication/mmhri2020_workshop/index.md
zedavid/website
11b271a4e91c80b4a14ce49f0e88449e1b97bad3
[ "MIT" ]
null
null
null
content/publication/mmhri2020_workshop/index.md
zedavid/website
11b271a4e91c80b4a14ce49f0e88449e1b97bad3
[ "MIT" ]
null
null
null
---
title: "Natural Language Interaction to Facilitate Mental Models of Remote Robots"

# Authors
# If you created a profile for a user (e.g. the default `admin` user), write the username (folder name) here
# and it will be replaced with their full name and linked to their profile.
authors:
- Francisco Javier Chiyah Garcia
- admin
- Helen Hastie

# Author notes (optional)
#author_notes:
#- "Equal contribution"
#- "Equal contribution"

date: "2020-05-01T00:00:00Z"
doi: ""

# Schedule page publish date (NOT publication's date).
publishDate: "2020-12-01T00:00:00Z"

# Publication type.
# Legend: 0 = Uncategorized; 1 = Conference paper; 2 = Journal article;
# 3 = Preprint / Working Paper; 4 = Report; 5 = Book; 6 = Book section;
# 7 = Thesis; 8 = Patent
publication_types: ["1"]

# Publication name and optional abbreviated publication name.
publication: In *Mental Model of Robots Workshop at HRI*
publication_short: In *MM-HRI*

abstract: "Increasingly complex and autonomous robots are being deployed in real-world environments with far-reaching consequences. High-stakes scenarios, such as emergency response or offshore energy platform and nuclear inspections, require robot operators to have clear mental models of what the robots can and can't do. However, operators are often not the original designers of the robots and thus, they do not necessarily have such clear mental models, especially if they are novice users. This lack of mental model clarity can slow adoption and can negatively impact human-machine teaming. We propose that interaction with a conversational assistant, who acts as a mediator, can help the user with understanding the functionality of remote robots and increase transparency through natural language explanations, as well as facilitate the evaluation of operators' mental models."

# Summary. An optional shortened abstract.
summary: A position paper on mental models for HRI.

tags: []

# Display this page in the Featured widget?
featured: true

# Custom links (uncomment lines below)
# links:
# - name: Custom Link
#   url: http://example.org

url_pdf: 'https://arxiv.org/pdf/2003.05870.pdf'
url_code: ''
url_dataset: ''
url_poster: ''
url_project: ''
url_slides: ''
url_source: ''
url_video: ''

# Featured image
# To use, add an image named `featured.jpg/png` to your page's folder.
#image:
#  caption: "Image credit: [Mr Freeze's lab](https://www.flickr.com/photos/9842867@N04/8560981360)"
#  focal_point: ""
#  preview_only: false

# Associated Projects (optional).
# Associate this publication with one or more of your projects.
# Simply enter your project's folder or file name without extension.
# E.g. `internal-project` references `content/project/internal-project/index.md`.
# Otherwise, set `projects: []`.
#projects:
#- example

# Slides (optional).
# Associate this publication with Markdown slides.
# Simply enter your slide deck's filename without extension.
# E.g. `slides: "example"` references `content/slides/example/index.md`.
# Otherwise, set `slides: ""`.
#slides: example
---

{{% callout note %}}
Click the *Cite* button above to demo the feature to enable visitors to import publication metadata into their reference management software.
{{% /callout %}}

{{% callout note %}}
Create your slides in Markdown - click the *Slides* button to check out the example.
{{% /callout %}}

Supplementary notes can be added here, including [code, math, and images](https://wowchemy.com/docs/writing-markdown-latex/).
39.213483
885
0.751289
eng_Latn
0.980021
932e670e962932b9358111bbd25d41849359c8c6
24
md
Markdown
_includes/04-lists.md
redlotus15/markdown-portfolio
d70b85e55e5b67d3365741dfa4e9d011a795e644
[ "MIT" ]
null
null
null
_includes/04-lists.md
redlotus15/markdown-portfolio
d70b85e55e5b67d3365741dfa4e9d011a795e644
[ "MIT" ]
5
2019-09-09T21:44:07.000Z
2019-09-10T19:12:46.000Z
_includes/04-lists.md
redlotus15/markdown-portfolio
d70b85e55e5b67d3365741dfa4e9d011a795e644
[ "MIT" ]
null
null
null
- Pizza
- Burger
- Cake
6
8
0.625
kor_Hang
0.727405
932ef7fe492e13c111259a867117d10a7ab335e0
799
md
Markdown
example/README.md
pavannntarget/pspdfkit-flutter
3ad842d880a9e7e25131b785e43b3e35a0ce8d42
[ "Apache-2.0" ]
null
null
null
example/README.md
pavannntarget/pspdfkit-flutter
3ad842d880a9e7e25131b785e43b3e35a0ce8d42
[ "Apache-2.0" ]
null
null
null
example/README.md
pavannntarget/pspdfkit-flutter
3ad842d880a9e7e25131b785e43b3e35a0ce8d42
[ "Apache-2.0" ]
1
2021-01-15T17:46:38.000Z
2021-01-15T17:46:38.000Z
# PSPDFKit Flutter Example

This is a brief example of how to use PSPDFKit with Flutter.

# Running the Example Project

1. Clone the repository: `git clone https://github.com/PSPDFKit/pspdfkit-flutter.git`
2. Create a local property file at `pspdfkit-flutter/example/android/local.properties` and specify the following properties:

```local.properties
sdk.dir=/path/to/your/Android/sdk
flutter.sdk=/path/to/your/flutter/sdk
flutter.buildMode=debug
```

3. `cd pspdfkit-flutter/example`
4. Run `flutter emulators --launch <EMULATOR_ID>` to launch the desired emulator. Optionally, you can repeat this step to launch multiple emulators.
5. The app is ready to start! Run `flutter run -d all` and the PSPDFKit Flutter example will be deployed on all your connected devices, both iOS and Android.
39.95
157
0.780976
eng_Latn
0.906668
edbfb87bbbfe83758cdf07cecdf7654893231eec
3,277
md
Markdown
_problems/hard/UPDTREE.md
captn3m0/codechef
9b9a127365d1209893e94f8430b909433af6b5f9
[ "WTFPL" ]
14
2015-11-27T15:49:32.000Z
2022-02-04T17:31:27.000Z
_problems/hard/UPDTREE.md
ashrafulislambd/codechef
b192550188e13d7edb211746103fddf049272027
[ "WTFPL" ]
40
2015-12-16T12:58:07.000Z
2022-02-02T11:46:05.000Z
_problems/hard/UPDTREE.md
ashrafulislambd/codechef
b192550188e13d7edb211746103fddf049272027
[ "WTFPL" ]
18
2015-03-30T09:35:35.000Z
2020-12-03T14:11:12.000Z
---
category_name: hard
problem_code: UPDTREE
problem_name: 'Updating Edges on Trees'
languages_supported:
  - ADA
  - ASM
  - BASH
  - BF
  - C
  - 'C99 strict'
  - CAML
  - CLOJ
  - CLPS
  - 'CPP 4.3.2'
  - 'CPP 4.9.2'
  - CPP14
  - CS2
  - D
  - ERL
  - FORT
  - FS
  - GO
  - HASK
  - ICK
  - ICON
  - JAVA
  - JS
  - 'LISP clisp'
  - 'LISP sbcl'
  - LUA
  - NEM
  - NICE
  - NODEJS
  - 'PAS fpc'
  - 'PAS gpc'
  - PERL
  - PERL6
  - PHP
  - PIKE
  - PRLG
  - PYTH
  - 'PYTH 3.4'
  - RUBY
  - SCALA
  - 'SCM guile'
  - 'SCM qobi'
  - ST
  - TCL
  - TEXT
  - WSPC
max_timelimit: '3.5'
source_sizelimit: '50000'
problem_author: darkshadows
problem_tester: null
date_added: 15-09-2014
tags:
  - cook50
  - darkshadows
  - dfs
  - dynamic
  - lca
  - medium
editorial_url: 'http://discuss.codechef.com/problems/UPDTREE'
time:
  view_start_date: 1411324200
  submit_start_date: 1411324200
  visible_start_date: 1411324200
  end_date: 1735669800
  current: 1493556895
layout: problem
---

All submissions for this problem are available.

### Read problem statements in [English](http://www.codechef.com/download/translated/COOK50/english/UPDTREE.pdf), [Mandarin Chinese](http://www.codechef.com/download/translated/COOK50/mandarin/UPDTREE.pdf) and [Russian](http://www.codechef.com/download/translated/COOK50/russian/UPDTREE.pdf) as well.

You have a [tree](http://en.wikipedia.org/wiki/Tree_(graph_theory)) consisting of **N** vertices numbered **1** to **N**. Initially each edge has a value equal to zero. You have to first perform **M1** operations and then answer **M2** queries. Note that you must first perform all the operations and then answer all queries after all operations have been done.

Operations are defined by:

**A B C D**: On the path between nodes **A** and **B**, increase the value of each edge by **1**, except for those edges which occur on the path between **C** and **D**. Note that there is a unique path between every pair of nodes, i.e. we don't consider the values on edges when finding the path. All four values given in the input will be distinct.

Queries are of the following type:

**E F**: Print the sum of values of all the edges on the path between two distinct nodes **E** and **F**. Again, the path is unique.

### Input

The first line contains **N**, **M1** and **M2**. Each of the next **N-1** lines contains two integers **u v** denoting an undirected edge between nodes **u** and **v**. Each of the next **M1** lines contains four integers **Ai Bi Ci Di**, denoting the operations. Each of the next **M2** lines contains two integers **Ei Fi** denoting the queries.

### Output

For each query, print the required answer in one line.

### Constraints

- **1 ≤ N ≤ 10^5**
- **1 ≤ M1, M2 ≤ 5\*10^5**
- **1 ≤ Ai, Bi, Ci, Di, Ei, Fi ≤ N**

### Example

<pre><b>Input:</b>
5 2 2
1 2
2 4
2 5
1 3
1 4 2 3
3 4 2 5
4 5
4 3
<b>Output:</b>
2
4
</pre>

### Explanation

On the first operation, the value of edge (2-4) is increased by one. On the second operation, the values of edges (1-3), (1-2), (2-4) are increased by one.

**Warning:** Use fast input/output. Large input files.
25.80315
349
0.642356
eng_Latn
0.909907
edc07d19917a74affdcb875d6a05dcc93cff9dc4
1,184
md
Markdown
_posts/4.CS_/CA/2021-12-21-Computer Architecture-lecture-1-6.md
candymask0712/candymask0712.github.io
1e47e68639718a4f0ea423bcb4e5ab88b3c349a3
[ "MIT" ]
null
null
null
_posts/4.CS_/CA/2021-12-21-Computer Architecture-lecture-1-6.md
candymask0712/candymask0712.github.io
1e47e68639718a4f0ea423bcb4e5ab88b3c349a3
[ "MIT" ]
1
2021-11-14T09:17:23.000Z
2021-11-14T09:17:23.000Z
_posts/4.CS_/CA/2021-12-21-Computer Architecture-lecture-1-6.md
candymask0712/candymask0712.github.io
1e47e68639718a4f0ea423bcb4e5ab88b3c349a3
[ "MIT" ]
null
null
null
---
title: "[Computer Architecture] Computer Architecture - 6 - Boolean Algebra and Simplifying Logic Expressions"
excerpt:
toc: true
toc_sticky: true
toc_label: "Page contents"
categories:
  - Computer Architecture
tags:
  - Boolean Algebra
  - Karnaugh map
last_modified_at:
---

### **1. Boolean Algebra**

- Expresses true/false logical propositions as algebraic expressions (G. Boole)
- Basic laws of Boolean algebra
  - Commutative law

```JavaScript
A*B = B*A
A+B = B+A
```

  - Associative law

```JavaScript
A*(B*C) = (A*B)*C
(A+B)+C = A+(B+C)
```

  - Distributive law

```JavaScript
A*(B+C) = A*B + A*C
```

  - De Morgan's laws

```JavaScript
(A + B)' = A' * B'
(A * B)' = A' + B'
```

[Reference - De Morgan's laws](https://ko.wikipedia.org/wiki/%EB%93%9C_%EB%AA%A8%EB%A5%B4%EA%B0%84%EC%9D%98_%EB%B2%95%EC%B9%99#:~:text=%EB%93%9C%20%EB%AA%A8%EB%A5%B4%EA%B0%84%EC%9D%98%20%EB%B2%95%EC%B9%99(%EC%98%81%EC%96%B4,%ED%95%9C%20%EA%B2%83%EC%9C%BC%EB%A1%9C%2C%20%EC%88%98%ED%95%99%EC%9E%90%20%EC%98%A4%EA%B1%B0%EC%8A%A4%ED%84%B0%EC%8A%A4)

### **2. Karnaugh Maps**

- A map that presents a logic expression in a form convenient for simplification
- How to draw a Karnaugh map
  - With n variables, the Karnaugh map has 2^n minterms
  - Adjacent minterms must differ in exactly one variable
  - Minterms corresponding to product terms whose output is 1 are marked 1; the rest are marked 0

[Reference - Computer science core lecture (FastCampus - no longer available)]
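Since each law above is a claim about all truth assignments, it can be checked exhaustively; a small Python sketch (not part of the original lecture notes) verifying De Morgan's laws over the full truth table:

```python
from itertools import product

# Check De Morgan's laws for every combination of truth values:
#   (A + B)' = A' * B'   and   (A * B)' = A' + B'
for a, b in product([False, True], repeat=2):
    assert (not (a or b)) == ((not a) and (not b))
    assert (not (a and b)) == ((not a) or (not b))

print("De Morgan's laws hold for all truth assignments")
```

With only 2^n rows for n variables, this brute-force check is exactly the reasoning a truth table (or a Karnaugh map) encodes.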
19.733333
332
0.559966
kor_Hang
0.998815
edc0f8159715ea615d1184535296a2d55deedb22
1,523
md
Markdown
docs/ProfessionalInputDTO.md
em-ad/kotlinsdk
be4c2e4a6222d467a51d9148555eed25264c7eb0
[ "Condor-1.1" ]
null
null
null
docs/ProfessionalInputDTO.md
em-ad/kotlinsdk
be4c2e4a6222d467a51d9148555eed25264c7eb0
[ "Condor-1.1" ]
null
null
null
docs/ProfessionalInputDTO.md
em-ad/kotlinsdk
be4c2e4a6222d467a51d9148555eed25264c7eb0
[ "Condor-1.1" ]
null
null
null
# ProfessionalInputDTO

## Properties

Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**aboutMe** | [**kotlin.String**](.md) | | [optional]
**address** | [**kotlin.String**](.md) | | [optional]
**capabilities** | [**kotlin.Array&lt;Capability&gt;**](Capability.md) | | [optional]
**city** | [**kotlin.String**](.md) | | [optional]
**dateOfIssue** | [**kotlin.String**](.md) | | [optional]
**degree** | [**inline**](#DegreeEnum) | | [optional]
**educations** | [**kotlin.Array&lt;Education&gt;**](Education.md) | | [optional]
**gender** | [**inline**](#GenderEnum) | | [optional]
**languages** | [**kotlin.Array&lt;LanguageModel&gt;**](LanguageModel.md) | | [optional]
**licenseCode** | [**kotlin.String**](.md) | | [optional]
**licenseFileId** | [**kotlin.String**](.md) | | [optional]
**personalInformation** | [**PersonalInformationReq**](PersonalInformationReq.md) | | [optional]
**profileImageId** | [**kotlin.String**](.md) | | [optional]
**province** | [**kotlin.String**](.md) | | [optional]
**researches** | [**kotlin.Array&lt;Research&gt;**](Research.md) | | [optional]
**resumes** | [**kotlin.Array&lt;Resume&gt;**](Resume.md) | | [optional]
**userId** | [**kotlin.String**](.md) | | [optional]

<a name="DegreeEnum"></a>
## Enum: degree

Name | Value
---- | -----
degree | ASSOCIATE, BACHELOR, DIPLOMA, DOCTORATE, MASTER, POSTDOCTORAL

<a name="GenderEnum"></a>
## Enum: gender

Name | Value
---- | -----
gender | FEMALE, MALE, NONE
43.514286
98
0.576494
yue_Hant
0.39899
edc1aa4a26429bcfac8dfb7eacf9e284ed7558c9
742
md
Markdown
content/tentang.md
perkodi-org/perkodi
41339bac76d1219c72be33543867c83002bf2b78
[ "MIT" ]
8
2017-10-24T04:19:44.000Z
2020-03-07T00:35:29.000Z
content/tentang.md
perkodi-org/perkodi
41339bac76d1219c72be33543867c83002bf2b78
[ "MIT" ]
10
2017-10-24T03:06:19.000Z
2021-03-29T17:36:53.000Z
content/tentang.md
perkodi-org/perkodi
41339bac76d1219c72be33543867c83002bf2b78
[ "MIT" ]
6
2017-10-24T04:19:00.000Z
2018-11-19T07:11:52.000Z
--- title: Tentang PERKODI --- # Tentang PERKODI PERKODI lahir sebagai perwujudan dari keinginan sejumlah komunitas pemrograman di Indonesia untuk mengadakan konferensi bahasa pengembangan sumber terbuka (*open source*) bertingkat internasional di tahun 2017. PERKODI merupakan badan hukum perkumpulan yang sifatnya non-profit. Keanggotaan dan kepengurusan PERKODI adalah sukarela dan sumber penghasilan untuk mengadakan kegiatan datang dari sponsor, donatur serta iuran anggota. Di tahun pertama sejak berdiri, PERKODI telah mendukung 2 kegiatan berskala internasional dan akan terus berusaha konsisten untuk mengadakan kegiatan yang lebih baik kedepannya dalam mendukung peningkatan budaya pemrograman di Indonesia sesuai visi PERKODI.
61.833333
257
0.842318
ind_Latn
0.975986
edc1ac407ab36241c1c0d24220cc7ffeb37f268a
10,112
md
Markdown
PostnayaTriod1915/chapters/22sub4.md
slavonic/cu-md-sandbox
34e07f78ee101d387e850123cf9469664ea9ed16
[ "MIT" ]
5
2018-06-14T15:04:18.000Z
2021-04-13T11:38:32.000Z
PostnayaTriod1915/chapters/22sub4.md
slavonic/cu-md-sandbox
34e07f78ee101d387e850123cf9469664ea9ed16
[ "MIT" ]
1
2018-06-11T16:37:54.000Z
2018-07-13T01:18:29.000Z
PostnayaTriod1915/chapters/22sub4.md
slavonic/cu-md-sandbox
34e07f78ee101d387e850123cf9469664ea9ed16
[ "MIT" ]
1
2021-04-13T11:38:36.000Z
2021-04-13T11:38:36.000Z
=Въ сꙋббѡ́тꙋ д҃-ѧ седми́цы на ᲂу҆́трени,= {{text_align=center}} =А҆ллилꙋ́їа и҆ тропарѝ, гла́съ в҃:= ~А҆пⷭ҇ли, мч҃нцы: =Сла́ва:= ~Помѧнѝ, гдⷭ҇и: =И҆ ны́нѣ:= ~Мт҃и ст҃а́ѧ: =И҆ стїхосло́вїе.= =Та́же мч҃нчны сѣда́льны гла́са. По непоро́чныхъ же пое́мъ тропарѝ:= ~Ст҃ы́хъ ли́къ: =И҆ по є҆ктенїѝ сѣда́ленъ ме́ртвенъ, гла́съ є҃:= ~Поко́й, сп҃се на́шъ, съ првⷣными: =Сла́ва, коне́цъ: И҆ ны́нѣ, бг҃оро́диченъ:= ~Ѿ дв҃ы возсїѧ́вый: =Та́же канѡ́нъ мине́и, и҆ свѧта́гѡ ѡ҆би́тели, и҆ настоѧ́щыѧ четверопѣ́снцы по чи́нꙋ и҆́хъ. Четверопѣ́снецъ, є҆гѡ́же краестро́чїе: Пѣ́снь на мч҃нки. Творе́нїе господи́на і҆ѡ́сифа.= =И҆ стїхосло́вѧтсѧ пѣ̑сни. Гла́съ д҃.= {{text_align=center}} =Пѣ́снь ѕ҃.= {{text_align=center}} =І҆рмо́съ:= ~Пꙋчи́ною жите́йскою: ~Преидо́сте предѣ́лы пло́ти мно́гимъ терпѣ́нїемъ, страда́льцы, поне́сше мꙋ̑ки и҆ бѡлѣ̑зни: тѣ́мже всѧ́кꙋю болѣ́знь и҆ ско́рбь ва́съ воспѣва́ющихъ ѡ҆блегча́ете. ~Тьма́мъ а҆́гг҃лъ совокꙋпи́сѧ во́инство ст҃ы́хъ стрⷭ҇тоте́рпєцъ, и҆ мо́литъ всест҃а́го бг҃а, ѿ согрѣше́нїй тьмочи́сленныхъ на́съ и҆зба́вити, ꙗ҆́кѡ хрⷭ҇тꙋ̀ ᲂу҆годи́вшїи. =Оу҆=мерщвле́нъ бы́въ и҆ во гро́бѣ ᲂу҆снꙋ́въ, хрⷭ҇тѐ, возста́вилъ є҆сѝ мє́ртвыѧ, и҆ въ вѣ́рѣ ᲂу҆ме́ршымъ дае́ши бл҃гости бога́тство, хрⷭ҇тѐ, ᲂу҆покое́нїе со всѣ́ми ст҃ы́ми. =Бг҃оро́диченъ:= ~Бж҃їе бг҃ъ сло́во, и҆скі́й ѡ҆божи́ти человѣ́ка чтⷭ҇аѧ, и҆з̾ тебє̀ воплоща́етсѧ и҆ человѣ́къ ꙗ҆влѧ́етсѧ: є҆го́же непреста́ннѡ молѝ, ѡ҆брѣстѝ на́мъ млⷭ҇ть во вре́мѧ ѿвѣ́та. =И҆́ный, господи́на ѳео́дѡра, гла́съ то́йже.= {{text_align=center}} =І҆рмо́съ:= ~Бꙋ́рею грѣхо́вною: ~Пло́ти и҆ кро́ве не пощадѣ́вше, ст҃і́и, ко всѧ́комꙋ мꙋче́нїю ста́сте небоѧ́знени, хрⷭ҇та̀ не ѿмета́ющесѧ: тѣ́мже съ нб҃съ низпосла̀ хрⷭ҇то́съ ва́мъ вѣнцы̀. ~Мꙋ́чєникъ торжество̀ срѣта́имъ, просвѣти́вшесѧ дѣѧ́ньми, и҆ возопїи́мъ бг҃одꙋхнове́нными пѣ́сньми: вы̀ є҆стѐ вои́стиннꙋ денни̑цы на землѝ, хрⷭ҇тѡ́вы мч҃нцы. 
=Трⷪ҇ченъ:= ~Трⷪ҇це ст҃а́ѧ, сла́влю тѧ̀, є҆стество̀ безнача́льное, є҆ди́наго бг҃а, є҆ди́наго гдⷭ҇а, трѝ лица̑, ѻ҆ц҃а̀, сн҃а и҆ дх҃а, нерожде́нна, рожде́нна, и҆сходѧ́ща, тꙋ́южде и҆ присносꙋ́щнꙋ. =Бг҃оро́диченъ:= ~Ѽ бл҃же́ннаѧ бг҃оневѣ́сто, ка́кѡ родила̀ є҆сѝ без̾ мꙋ́жа, и҆ пребыла̀ є҆сѝ,ꙗ҆́коже пре́жде; бг҃а бо родила̀ є҆сѝ, чꙋ́до стра́шное. но молѝ сп҃са́тисѧ пою́щымъ тѧ̀. =Сті́хъ:= ~Ди́венъ бг҃ъ во ст҃ы́хъ свои́хъ, бг҃ъ і҆и҃левъ. =Мч҃нченъ: Оу҆=сѣчє́нїѧ ᲂу҆дѡ́въ ва́шихъ зрѧ́ще; наслажда́стесѧ кровьмѝ ра́дꙋющесѧ: но моли́те ѡ҆ на́съ прилѣ́жнѡ гдⷭ҇а, мч҃нцы присносла́внїи. =Сті́хъ:= ~Дꙋ́шы и҆́хъ во бл҃ги́хъ водворѧ́тсѧ. ~Ѿ землѝ созда́вый мѧ̀, и҆ ѡ҆живи́вый мѧ̀, и҆ въ зе́млю возврати́тсѧ мнѣ̀ па́ки рекі́й, ꙗ҆̀же прїѧ́лъ є҆сѝ рабы̑ твоѧ̑ ᲂу҆поко́й, и҆ ѿ тлѣ́нїѧ сме́ртнагѡ возведѝ, гдⷭ҇и. =І҆рмо́съ:= ~Бꙋ́рею грѣхо́вною погрꙋжа́емь, ꙗ҆́кѡ во чре́вѣ ки́товѣ содержи́мь, съ прⷪ҇ро́комъ зовꙋ́ ти: возведѝ ѿ тлѣ́нїѧ жи́знь мою, гдⷭ҇и, и҆ сп҃си́ мѧ. =Конда́къ:= ~Со ст҃ы́ми ᲂу҆поко́й: =І҆́косъ:= ~Са́мъ є҆ди́нъ є҆сѝ безсме́ртенъ: =Пѣ́снь з҃.= {{text_align=center}} =І҆рмо́съ:= ~Ѡ҆́бразꙋ злато́мꙋ: ~Ѡ҆блече́ни въ нетлѣ́нїе бж҃е́ственное ѡ҆бнаже́нїемъ тлѣ́нныѧ пло́ти, ны́нѣ свѣ́тли предстоитѐ на́съ ра́ди пло́ть ѿ нетлѣ́нныѧ жены̀ прїе́мшемꙋ, страда́льцы: сегѡ̀ ра́ди во ѡ҆де́ждꙋ сщ҃е́ннꙋю мѧ̀ ѡ҆блецы́те, ѕлѣ̀ ѡ҆бнаже́ннаго. ~Въ воздержа́нїи пожи́вый, стрⷭ҇тоте́рпєцъ собо́ръ ᲂу҆крѣплѧ́етъ на́съ воздержа́нїѧ по́прище тещѝ невозбра́ннѡ, ꙗ҆́кѡ проповѣ́давый хрⷭ҇та̀ на по́прищи мꙋ́жественнѡ, и҆ предстоѧ́й прⷭ҇то́лꙋ вѣнцено́сенъ наслажда́етсѧ мы́сленнѡ со а҆́гг҃лы. ~Мл҃твами твои́хъ ст҃ы́хъ мч҃нкъ, бж҃е, ᲂу҆со́пшыѧ вѣ̑рныѧ рабы̑ твоѧ̑ раѧ̀ жи́тєли покажѝ, и҆ свѣ́та мы́сленнагѡ сподо́би ѧ҆̀, непреста́ннѡ вопїю́щыѧ тебѣ̀: бл҃гослове́нъ бг҃ъ ѻ҆тє́цъ на́шихъ. 
=Бг҃оро́диченъ:= ~Тѧ̀ є҆ди́нꙋ бл҃гꙋ́ю мо́лимъ, дв҃о, ѡ҆ѕло́блєнныѧ ны̀ бл҃ги ꙗ҆вѝ, и҆ хрⷭ҇та̀ є҆стество́мъ сꙋ́ща бл҃га́го прилѣ́жнѡ молѝ, воздержа́нїѧ вре́мѧ, бл҃га̑ѧ дѣ́ющымъ, и҆спо́лнити на́мъ пѣсносло́вѧщымъ є҆го̀: бл҃гослове́нъ бг҃ъ ѻ҆тє́цъ на́шихъ. =И҆́ный.= {{text_align=center}} =І҆рмо́съ:= ~Глаго́лавый на горѣ̀: ~Возвели́чивый ст҃ы̑ѧ твоѧ̑ всѧ̑ и҆ зна́меньми ᲂу҆диви́вый ѧ҆̀ въ мі́рѣ, бл҃гослове́нъ є҆сѝ во вѣ́ки, гдⷭ҇и бж҃е ѻ҆тє́цъ на́шихъ. ~Всѧ́кїй ви́дъ ра́нъ проше́дше и҆ колѣ̑на ваа́лꙋ не поко́ршесѧ преклони́ти, вѣнцы̀ сла́вы взѧ́сте ѿ бг҃а, хрⷭ҇тѡ́вы мч҃нцы. =Трⷪ҇ченъ:= ~Во є҆ди́ницѣ трⷪ҇це, покланѧ́емое сꙋщество̀, ѻ҆́ч҃е, сн҃е и҆ дш҃е, воспѣва́ющыѧ тѧ̀ сохранѧ́й, бж҃е ѻ҆тє́цъ на́шихъ. =Бг҃оро́диченъ:= ~Дв҃о мт҃и ѻ҆трокови́це всесвѣ́тлаѧ, є҆ди́на къ бг҃ꙋ хода́таице, не преста́й, влⷣчце, моли́ти сп҃сти́сѧ на́мъ. =Сті́хъ:= ~Ст҃ы̑мъ, и҆̀же сꙋ́ть на землѝ є҆гѡ̀: ~Безсме́ртномꙋ цр҃ю̀ во́инствовавше и҆ вѣ́рꙋ къ немꙋ̀ показа́вше соверше́ннꙋю, мч҃нцы, кро́вь свою̀ за него̀ и҆злїѧ́сте. =Сті́хъ:= ~Бл҃же́ни, ꙗ҆̀же и҆збра́лъ и҆ прїѧ́лъ є҆сѝ, гдⷭ҇и. ~И҆дѣ́же жи́зни свѣ́тъ тво́й и҆стека́етъ, вселѝ вѣ̑рныѧ твоѧ̑ рабы̑, ꙗ҆̀же преста́вилъ є҆сѝ ѿ вре́менныхъ, гдⷭ҇и бж҃е ѻ҆тє́цъ на́шихъ. =І҆рмо́съ:= ~Глаго́лавый на горѣ̀ съ мѡѷсе́емъ, и҆ ѡ҆́бразъ дв҃ы показа́вый кꙋпинꙋ̀, бл҃гослове́нъ є҆сѝ, гдⷭ҇и бж҃е ѻ҆тє́цъ на́шихъ. =Пѣ́снь и҃.= {{text_align=center}} =І҆рмо́съ:= ~Во ѻ҆гнѝ пла́меннѣмъ: ~Великоимени́тїи хрⷭ҇тѡ́вы страда́льцы и҆ чтⷭ҇ні́и въ бз҃ѣ, всѧ̑ воспѣва́ющыѧ па̑мѧти ва́шѧ ѿ вели́кихъ согрѣше́нїй и҆ та́мошнихъ мꙋ́къ и҆зба́вите ва́шими вели́кими къ немꙋ̀ мл҃твами. ~Ст҃оизбра́нное и҆ тве́рдое вои́стиннꙋ во́инство хрⷭ҇та̀ бг҃а, мч҃нкъ собо́ри, ѡ҆ст҃и́те на́шъ ᲂу҆́мъ и҆ се́рдце въ сїѧ̑ ст҃ы̑ѧ дни̑ пѡ́стныѧ ва́шими ст҃ы́ми мл҃твами. ~И҆зба́ви, гдⷭ҇и ѿ че́рвїѧ мꙋ́чащагѡ и҆ скре́жета зꙋ́бнагѡ, и҆ кромѣ́шнїѧ тьмы̀ несвѣти́мыѧ, всѧ̑ ꙗ҆̀же вѣ́рою прїѧ́лъ є҆сѝ: и҆ ᲂу҆чинѝ сїѧ̑, и҆дѣ́же свѣ́тъ лица̀ твоегѡ̀, хрⷭ҇тѐ, сїѧ́етъ во вѣ́ки. 
=Бг҃оро́диченъ:= ~Крⷭ҇тъ хрⷭ҇то́въ чтⷭ҇ны́й ви́дѣвшымъ и҆ ѿ се́рдца поклони́вшымсѧ сподо́би, бцⷣе чтⷭ҇аѧ, твои́ми ко влⷣцѣ мл҃твами, ви́дѣти на́мъ и҆ стрⷭ҇ти честны̑ѧ, ѿ страсте́й ѡ҆чищє́ннымъ. =И҆́ный.= {{text_align=center}} =І҆рмо́съ:= ~Землѧ̀ и҆ всѧ̑ ꙗ҆̀же на не́й: ~Ѽ до́брыѧ мѣ́ны! є҆́юже ѡ҆брѣто́сте сме́ртїю жи́знъ, сщ҃енномч҃нцы хрⷭ҇тѡ́вы, ѻ҆гнѧ̀ и҆ меча̀, мра́за и҆ ѕвѣре́й ника́коже ᲂу҆боѧ́вшесѧ и҆ вопїю́ще: по́йте гдⷭ҇а и҆ превозноси́те во всѧ̑ вѣ́ки. ~Горѣ̀ ли́къ а҆́гг҃льскїй, до́лꙋ же мы̀ земні́и хва́лимъ, мч҃нцы хрⷭ҇тѡ́вы, ди̑внаѧ страда̑нїѧ и҆ по́двиги ва́шегѡ мꙋ́жества, бл҃гословѧ́ще, пою́ще гдⷭ҇а и҆ превозносѧ́ще во всѧ̑ вѣ́ки. =Бл҃гослови́мъ ѻ҆ц҃а̀, и҆ сн҃а, и҆ ст҃а́го дх҃а, гдⷭ҇а.= ~Свѣ́тъ, жи́знъ и҆ жи̑зни почита́ю тѧ̀, ѻ҆ц҃а̀ и҆ сн҃а, и҆ дх҃а и҆схо́дна, є҆стество̀ є҆ди́но, трѝ ѵ҆поста̑си, бг҃а є҆ди́наго поѧ̀: бл҃гословлѧю́ тѧ̀, пою́ тѧ гдⷭ҇а, и҆ превозношꙋ́ тѧ во всѧ̑ вѣ́ки. =Бг҃оро́диченъ:= ~Кто̀ ѿ земноро́дныхъ не воспое́тъ тѧ̀, нескве́рнаѧ чтⷭ҇аѧ голꙋби́це; ты́ бо родила̀ є҆сѝ на́мъ свѣ́тъ вели́кїй, жи́зни бога́тство, і҆и҃са сп҃са: є҆го́же пое́мъ хва́лѧще ꙗ҆́кѡ гдⷭ҇а, и҆ превозно́симъ во всѧ̑ вѣ́ки. =Сті́хъ:= ~Ди́венъ бг҃ъ во ст҃ы́хъ свои́хъ, бг҃ъ і҆и҃левъ. ~По́двиги чꙋ̑дныѧ ва́шѧ сла́вѧще, мч҃нцы, бл҃годѣ́телѧ и҆ бг҃а бл҃гослови́мъ и҆ покланѧ́емсѧ, на по́прище подвигѡ́въ ва́съ ᲂу҆крѣпи́вшаго: є҆го́же превозно́симъ во всѧ̑ вѣ́ки. =Сті́хъ:= ~Дꙋ́шы и҆́хъ во бл҃ги́хъ водворѧ́тсѧ. ~Ты̀, сме́рти и҆ жи́зни гдⷭ҇ь сы́й и҆ бг҃ъ, преста́вльшыѧсѧ бл҃гоче́стнѡ воздвиза́ѧй, положѝ та́мѡ въ селе́нїихъ првⷣныхъ, бл҃гословѧ́щыѧ, пою́щыѧ тѧ̀, гдⷭ҇и, и҆ превозносѧ́щыѧ во всѧ̑ вѣ́ки. =Хва́лимъ, бл҃гослови́мъ, покланѧ́емсѧ гдⷭ҇еви:= =І҆рмо́съ:= ~Землѧ̀ и҆ всѧ̑ ꙗ҆̀же на не́й, морѧ̀ и҆ всѝ и҆сто́чницы, небеса̀ небе́съ, свѣ́тъ и҆ тьма̀, мра́зъ и҆ зно́й, сы́нове человѣ́честїи, сщ҃е́нницы, бл҃гослови́те гдⷭ҇а и҆ превозноси́те во всѧ̑ вѣ́ки. 
=Пѣ́снь ѳ҃.= {{text_align=center}} =І҆рмо́съ:= ~Ꙗ҆́кѡ сотворѝ мнѣ̀: ~Свѣти́льницы сꙋ́ще неле́стнїи, хрⷭ҇тѡ́вы страда́льцы, просвѣти́те на́шѧ по́мыслы ᲂу҆тверди́те твори́ти свѣтоза̑рнаѧ и҆ чтⷭ҇аѧ бж҃їѧ хотѣ̑нїѧ. ~Ѡ҆рꙋ̑жїѧ закала̑ющаѧ врагѝ зрите́сѧ, до́блїи хрⷭ҇тѡ́вы страда́льцы: но и҆зба́вите на́съ ѿ стрѣ́лъ лꙋка́вагѡ предста́тельствы ва́шими. ~Поко́й, ще́дре, въ нѣ́дрѣхъ а҆враа́мовыхъ рабы̑ твоѧ̑, вѣ́рою ѿ на́съ ѿше́дшыѧ къ тебѣ̀, всѧ́ческихъ зижди́телю и҆ пребл҃го́мꙋ. =Бг҃оро́диченъ:= ~Пло́ти моеѧ̀ ᲂу҆мертвѝ движє́нїѧ, бг҃а пло́тїю ро́ждшаѧ па́че ᲂу҆ма̀, и҆ пода́ждь просвѣще́нїе мы́сли мое́й, чтⷭ҇аѧ, сꙋ́щи ѡ҆́блакъ свѣ́та. =И҆́ный.= {{text_align=center}} =І҆рмо́съ:= ~Велича́емъ всѝ: ~Воспѣва́емъ всѝ па̑мѧти ва́шѧ, мч҃нцы прехва́льнїи, ви́дѧще по́двиги сщ҃е́нныхъ ва́шихъ страда́нїй, и҆ ᲂу҆диви́вшесѧ хрⷭ҇та̀ велича́емъ. ~Стра́ждꙋще дрꙋ́гъ ко дрꙋ́гꙋ глаго́лютъ стрⷭ҇тоте́рпцы: пло́ти не пощади́мъ: прїиди́те, ᲂу҆́мремъ за хрⷭ҇та, да во всѧ̑ вѣ́ки поживе́мъ ликꙋ́юще безконе́чнѡ. =Трⷪ҇ченъ:= ~Ѽ трⷪ҇це во є҆ди́номъ є҆стествѣ̀, ѻ҆́ч҃е нерожде́нный, и҆ рожде́нный сн҃е, и҆ дш҃е и҆сходѧ́й, тѧ̀ воспѣва̑ющыѧ твое́ю млⷭ҇тїю невреди̑мы соблюдѝ. =Бг҃оро́диченъ:= ~Ра́дꙋйсѧ, всечⷭ҇тна́ѧ, чтⷭ҇аѧ: дѣ́вства похвало̀, ма́терей ᲂу҆твержде́нїе,человѣ́кѡмъ по́мощь и҆ мі́ра ра́дость, мр҃і́е, мт҃и и҆ рабо̀ бг҃а на́шегѡ. =Сті́хъ:= ~Ди́венъ бг҃ъ во ст҃ы́хъ свои́хъ, бг҃ъ і҆и҃левъ. ~Ли́че ст҃ы́хъ, прїимѝ мольбꙋ̀ мою̀: и҆ ꙗ҆́коже сподо́бихсѧ крⷭ҇тъ ѡ҆блобыза́ти, ᲂу҆ хрⷭ҇та̀ и҆спроси́те и҆ сп҃си́тельнѣй стрⷭ҇ти мнѣ̀ поклони́тисѧ. =Сті́хъ:= ~Бл҃же́ни, ꙗ҆̀же и҆збра́лъ и҆ прїѧ́лъ є҆сѝ, гдⷭ҇и. ~Ѡ҆сла́би, ѡ҆ста́ви, ще́дре, къ тебѣ̀ чл҃вѣколю́бцꙋ преста́вльшымсѧ, и҆ сїѧ̑ ᲂу҆поко́й въ селе́нїихъ и҆збра́нныхъ: ты́ бо є҆сѝ жи́знь и҆ воскрⷭ҇нїе. =І҆рмо́съ:= ~Велича́емъ всѝ чл҃вѣколю́бїе твоѐ, хрⷭ҇тѐ сп҃се на́шъ, сла́во ра̑бъ твои́хъ и҆ вѣ́нче вѣ́рныхъ, возвели́чивый па́мѧть ро́ждшїѧ тѧ̀. =Є҆ѯапостїла́рїи, пи́саны въ сꙋббѡ́тꙋ в҃-ю поста̀. На хвали́техъ мч҃нчны гла́са. 
На стїхо́внѣ же подо́бны господи́на ѳеофа́на.= =На +лїтꙋргі́и+: И҆з̾ѡбрази́тєльнаѧ и҆ бл҃жє́нны гла́са. Прокі́менъ а҆пⷭ҇ла ме́ртвенъ. А҆пⷭ҇лъ, зача́ло тг҃і. И҆ за ᲂу҆поко́й, къ корі́нѳѧнѡмъ, зача́ло рѯ҃г. А҆ллилꙋ́їа, ме́ртвенъ. Є҆ѵⷢ҇лїе ѿ ма́рка, зача́ло л҃а. И҆ за ᲂу҆поко́й, ѿ і҆ѡа́нна, зача́ло ѕ҃і. Прича́стенъ:= ~Ра́дꙋйтесѧ, првⷣнїи: =Дрꙋгі́й, за ᲂу҆поко́й:= ~Бл҃же́ни, ꙗ҆̀же и҆збра́лъ: ![Рисунок 61](Pictures/10000000000001DF000000CF2AF5163F26AE1AD8.png) ![Рисунок 60](Pictures/100000000000036E00000091CA63DA7895B9AB6D.png)
60.550898
548
0.662282
ukr_Cyrl
0.180464
edc1eb5251fb0d2d9d21d1f926da351c1bfc3348
1,577
md
Markdown
README.md
catchr/tiktok-marketing-api
c5477baf2c976c130bc4f131ed59a2fe14df7296
[ "Unlicense" ]
null
null
null
README.md
catchr/tiktok-marketing-api
c5477baf2c976c130bc4f131ed59a2fe14df7296
[ "Unlicense" ]
null
null
null
README.md
catchr/tiktok-marketing-api
c5477baf2c976c130bc4f131ed59a2fe14df7296
[ "Unlicense" ]
null
null
null
# TikTok Marketing API PHP client library Convenient full-featured wrapper for [TikTok Marketing API](https://ads.tiktok.com/marketing_api/docs). 🚫 Work In Progress. Not ready for use. [![Build Status](https://travis-ci.org/promopult/tiktok-marketing-api.svg?branch=master)](https://travis-ci.org/promopult/tiktok-marketing-api) [![Scrutinizer Code Quality](https://scrutinizer-ci.com/g/promopult/tiktok-marketing-api/badges/quality-score.png?b=master)](https://scrutinizer-ci.com/g/promopult/tiktok-marketing-api/?branch=master) [![Code Coverage](https://scrutinizer-ci.com/g/promopult/tiktok-marketing-api/badges/coverage.png?b=master)](https://scrutinizer-ci.com/g/promopult/tiktok-marketing-api/?branch=master) ### Installation ```bash $ composer require promopult/tiktok-marketing-api ``` or ``` "require": { // ... "promopult/tiktok-marketing-api": "*" // ... } ``` ### Usage See [examples](/examples) folder. ```php $credentials = \Promopult\TikTokMarketingApi\Credentials::build( getenv('__ACCESS_TOKEN__'), getenv('__APP_ID__'), getenv('__SECRET__') ); // Any PSR-18 HTTP-client $httpClient = new \GuzzleHttp\Client(); $client = new \Promopult\TikTokMarketingApi\Client( $credentials, $httpClient ); $response = $client->user->info(); print_r($response); //Array //( // [message] => OK // [code] => 0 // [data] => Array // ( // [create_time] => 1614175583 // [display_name] => my_user // [id] => xxx // [email] => xxx // ) // // [request_id] => xxx //) ```
25.031746
200
0.65948
kor_Hang
0.153066
edc209593e57a949095847f53a3c47b277381c9c
506
md
Markdown
README.md
bguyl/docker-stacks
6800e07a495ff3c89178b8427102dd3bc7dc5826
[ "MIT" ]
1
2018-05-17T16:31:10.000Z
2018-05-17T16:31:10.000Z
README.md
bguyl/docker-stacks
6800e07a495ff3c89178b8427102dd3bc7dc5826
[ "MIT" ]
3
2018-06-28T11:05:09.000Z
2018-06-28T11:18:24.000Z
README.md
bguyl/docker-stacks
6800e07a495ff3c89178b8427102dd3bc7dc5826
[ "MIT" ]
null
null
null
# docker-stacks > Some useful stack files for Docker ### How to use it ? `latest` directories contain stacks with images without specific tags. Don't use these in production. `stateless` are stacks without volumes. Use them to run and try stacks quickly. `stable` contains stacks with the last version tested. ### Stacks * [DAMP](damp/DESCRIPTION.md): Apache-MySQL-PHP stack * [Wordpress](wordpress/DESCRIPTION.md) * [Java-MySQL](java-mysql/DESCRIPTION.md) * [Peertube](peertube/DESCRIPTION.md)
31.625
106
0.752964
eng_Latn
0.766348
edc280d5d930aa946b757f5e9474f1fb2f05ff7d
2,171
md
Markdown
README.md
wagtaillabs/GRANT
442fa8353e198a1a1fbac83b645be1915382fc85
[ "Apache-1.1" ]
19
2020-11-17T13:26:56.000Z
2022-02-01T08:47:57.000Z
README.md
wagtaillabs/GRANT
442fa8353e198a1a1fbac83b645be1915382fc85
[ "Apache-1.1" ]
null
null
null
README.md
wagtaillabs/GRANT
442fa8353e198a1a1fbac83b645be1915382fc85
[ "Apache-1.1" ]
null
null
null
# GRANT Graft, Reassemble, Answer delta, Neighbour sensitivity, Training delta (GRANT) GRANT has been created by Wagtail Labs to remove the guesswork in using tree based models, assisting the creation of faster, more accurate models with satisfying explanations. It provides a deep understanding of the model's internal behaviour, shows prediction sensitivities, and helps data scientists improve inaccurate predictions. For a detailed introduction to the theory of GRANT visit [wagtaillabs.com](https://wagtaillabs.com). For a detailed introduction to the implementation of GRANT see the GRANT_walkthrough.ipynb notebook in this repo. ## Installation Clone the WagtailLabs/GRANT/grant.py file. ## Usage For a detailed walkthrough of the usage see the GRANT.ipynb notebook in this repo. ```python from WagtailLabs.grant import grant grant_rf = grant(rf, train_features, train_labels) grant_rf.graft() grafted_df = grant_rf.get_graft() #Returns the Graft (dataset containing all decision boundaries) of the Tree Ensemble grafted_dt = grant_rf.reassemble() #Returns a decision tree that produces the exact same results of the Tree Ensemble in less time grant_rf.amalgamate(threshold) amalgamated_df = grant_rf.get_amalgamated() #Returns a pruned (simplified) copy of the Graft where contigious decision boundaries with a prediction difference less than the supplied threshold are merged sensitivities = grant_rf.neighbour_sensitivity(val_feature) #Returns the change required in explanatory data required to reach each neighbouring decision boundary trainer_delta = grant_rf.training_delta_trainer(trainer_feature, trainer_label) #Returns the incremental change in result from a given training record to all predictions grant_rf.training_delta_trainee(trainee_feature, trainee_label) #Returns the incremental change in result from each training record that contributed any given prediction ``` ## Contributing Pull requests are welcome. 
For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate. ## License [Apache License 2.0](https://choosealicense.com/licenses/apache-2.0/)
48.244444
333
0.817135
eng_Latn
0.992934
edc2ac0328356b073daeb049639098c7caf78fba
4,066
md
Markdown
_posts/2021-07-15-3-alternatives-to-saying-i-dont-know-in-interviews.md
recursivefaults/recursivefaults.github.io
87f92eeacf2d134f130019a432dc4130660e776f
[ "MIT" ]
null
null
null
_posts/2021-07-15-3-alternatives-to-saying-i-dont-know-in-interviews.md
recursivefaults/recursivefaults.github.io
87f92eeacf2d134f130019a432dc4130660e776f
[ "MIT" ]
2
2020-06-02T16:44:46.000Z
2021-09-03T17:11:16.000Z
_posts/2021-07-15-3-alternatives-to-saying-i-dont-know-in-interviews.md
recursivefaults/recursivefaults.github.io
87f92eeacf2d134f130019a432dc4130660e776f
[ "MIT" ]
null
null
null
--- title: 3 Alternatives to Saying, “I Don’t Know” In Interviews tags: [interview, question, technique] categories: [career] excerpt: Eventually you won’t know how to answer a question in an interview. In this article I walk through three alternatives to saying “I don’t know” that give much better results. cover_image: https://source.unsplash.com/fg8tdcxrkrA/900x500 --- You’ll inevitably hear some question that you don’t know the answer to during an interview. How you handle that moment can either leave the interviewer nodding in approval or unsure if you’re ready. Before I jump into how to handle this, I want to point out why this is a certainty: The tech field is just too big to have all the answers in our heads. Most interviews favor obscure, ridiculous, and trick questions that are all geared to mess you up. There is a specific interviewing technique that I call “The probing question,” which specifically tries to get someone to the boundary of their knowledge. In other words, you’ll get asked a question, and you will have no idea what the answer is. So here’s what you can do about it: # Technique 1: Focus on What You Can Do When you hear that question and your brain immediately turns to jelly because it draws a total blank, you want to shift gears quickly. Instead of focusing on the answer and how you don’t have it, focus instead on the process you’d follow to arrive at the solution. In other words, tell them what you can do to get the answer. A pretty easy way to launch into that is to say, “Oh, that’s interesting, I haven’t come across that before, but here’s what I’d do...” By doing this, you’re keeping a conversation going and demonstrating your ability to solve problems, keep composure, and how you work. # Technique 2: Tell A Story Somewhat related to the first technique, you can instead tell a story about a time you solved a puzzle you didn’t have the answer to either. 
Now, this one takes a little bit of extra work because you’ll have to relate your story back to the question. Thankfully this isn’t too hard to do. If you are getting asked about some obscure fact, tell a story where you had to learn an obscure fact. If it was a tricky coding problem, tell a story about solving a problem that felt obscure. By telling a story, you’re using your past to show you’ve been through similar tricky situations in a deeply relatable way. Storytelling is a potent technique that builds relationships and rounds out rough spots in an interview. # Technique 3: Reversal This last technique is probably the toughest to pull off but can pay off pretty big. The idea is to reverse the question back on the interviewer. It might go something like this: “Oh, that’s interesting. Have you bumped into something like that here?” If you can get the interviewer to share their story, they will build a relationship with you. You can ask more questions and talk like two peers. There is a risk with this technique since many questions people ask are genuinely unrelated to anything, so you can’t reverse it for them to tell a story. What this will do in this case is highlight that they just asked you an unrelated question. At this point, your best bet is likely to leverage technique #2 and tell a story about something you can relate to. I’ll give an example from my past to show how this might play out. Them: How would you solve this... (Some problem related to relational databases) Me: Wow, do you all bump into this problem a lot? Them: Yeah, almost every project. It’s a pain to deal with, and we need to know our developers can do it. Me: Well, I’d use a NoSQL solution for this case, I think. Them: Uh, we’ve never used NoSQL before. How would that work? Me: No problem, let me walk you through how I’d do this. I reversed the question, then moved back to technique #1 and focused on what I could do. 
I had no idea how to answer the question they asked, so I created a scenario that I could attempt. There you have it, three ways to handle the moment when you don’t know the answer in an interview.
67.766667
362
0.775947
eng_Latn
0.999896
edc3b80d1a06d84fa50bae6d5ef620f203c76de4
4,276
md
Markdown
dynamicsax2012-technet/add-or-update-identification-information.md
MicrosoftDocs/DynamicsAX2012-technet
4e3ffe40810e1b46742cdb19d1e90cf2c94a3662
[ "CC-BY-4.0", "MIT" ]
9
2019-01-16T13:55:51.000Z
2021-11-04T20:39:31.000Z
dynamicsax2012-technet/add-or-update-identification-information.md
MicrosoftDocs/DynamicsAX2012-technet
4e3ffe40810e1b46742cdb19d1e90cf2c94a3662
[ "CC-BY-4.0", "MIT" ]
265
2018-08-07T18:36:16.000Z
2021-11-10T07:15:20.000Z
dynamicsax2012-technet/add-or-update-identification-information.md
MicrosoftDocs/DynamicsAX2012-technet
4e3ffe40810e1b46742cdb19d1e90cf2c94a3662
[ "CC-BY-4.0", "MIT" ]
32
2018-08-09T22:29:36.000Z
2021-08-05T06:58:53.000Z
--- title: Add or update identification information TOCTitle: Add or update identification information ms:assetid: 71bd6f16-e2b1-4011-88c8-a4f931124a12 ms:mtpsurl: https://technet.microsoft.com/library/Hh271561(v=AX.60) ms:contentKeyID: 36384192 author: Khairunj ms.author: daxcpft ms.date: 04/18/2014 mtps_version: v=AX.60 f1_keywords: - HcmEPPersonIdentificationNumberCreate - HcmEPPersonIdentificationNumberEdit - HcmEPPersonIdentificationNumberList audience: Application User ms.search.region: Global --- # Add or update identification information [!INCLUDE[archive-banner](includes/archive-banner.md)] _**Applies To:** Microsoft Dynamics AX 2012 R3, Microsoft Dynamics AX 2012 R2, Microsoft Dynamics AX 2012 Feature Pack, Microsoft Dynamics AX 2012_ Use the **Identification** list page to maintain information about your identity or the identity of your personal contacts. You can use information from forms of identity such as a driver’s license, visa, passport, or another government issued identity card. > [!NOTE] > <P>Depending on your role or the privileges that are assigned to you, you might need to go to your Employee services site before you complete the procedures in this topic.</P> ## View your identification information Click **Personal information** on the top link bar, and then click **Identification** on the Quick Launch to display the **Identification** list page, where you can view the forms of identification that you have on record with your company or organization. ## Add identification information 1. Click **Personal information** on the top link bar. 2. To add identification information for yourself, click **Identification** on the Quick Launch. To add identification information for a personal contact, on the **Personal contacts** FastTab, select the personal contact to add identification information for, and then click **Edit**. 3. Click **Identification** to display the **New identification** page. 4. Select the type of identification. 
For example, if you are adding identification information from your driver’s license, select **Driver’s license**. 5. Enter the number that is associated with the identification type that you selected in step 4. For example, if you selected **Driver’s license** in step 4, enter your driver’s license number. 6. Enter additional information about the identification information in either the **Description** field or the **Entry type** field. 7. Select the **Primary** check box to indicate that the identification information is your primary form of identification. 8. Select the agency that issued the form of identification that you selected in step 4. For example, if you selected **Driver’s license** and your driver’s license was issued to you by a government entity, select the name of the government entity. 9. Enter the date that the form of identification that you selected in step 4 was issued. 10. Enter the expiration date of the form of identification that you selected in step 4. 11. Click **Save and close**. ## Modify identification information 1. Click **Personal information** on the top link bar. 2. To modify identification information for yourself, click **Identification** on the Quick Launch. Go to step 4. To modify identification information for a personal contact, on the **Personal contacts** FastTab, select the personal contact to modify identification information for and then click **Edit**. 3. Click **Identification**. 4. Select the identification information to modify and then click **Edit**. 5. Modify the necessary information and then click **Save and close**. ## Delete identification information 1. Click **Personal information** on the top link bar. 2. To delete identification information for yourself, click **Identification** on the Quick Launch. Go to step 4. To delete identification information for a personal contact, on the **Personal contacts** FastTab, select the personal contact to delete identification information for and then click **Edit**. 
3. Click **Identification**. 4. Select the identification information to delete and then click **Remove**. ## See also [Add and maintain your personal contacts](add-and-maintain-your-personal-contacts.md)
43.632653
258
0.77479
eng_Latn
0.984681
edc44a024e7571ac3575a1bd2b2b501e831a6114
110
md
Markdown
README.md
jMac029/Bootstrap-Portfolio
1a635b64845b79706bf3f5f10bdcea219ea2d474
[ "MIT" ]
2
2017-10-31T07:33:10.000Z
2018-02-12T19:17:00.000Z
README.md
jMac029/Bootstrap-Portfolio
1a635b64845b79706bf3f5f10bdcea219ea2d474
[ "MIT" ]
null
null
null
README.md
jMac029/Bootstrap-Portfolio
1a635b64845b79706bf3f5f10bdcea219ea2d474
[ "MIT" ]
null
null
null
# Bootstrap-Portfolio 2nd Homework Assignment for UCI Coding Bootcamp - Make a Portfolio site using BootStrap
36.666667
87
0.827273
eng_Latn
0.54022
edc4ff692f5130d2b90d9457c9d807c62dd462b5
765
md
Markdown
F1/windowactivate-event-project-vbapj-chm131140.md
CeptiveYT/VBA-Docs
1d9c58a40ee6f2d85f96de0a825de201f950fc2a
[ "CC-BY-4.0", "MIT" ]
283
2018-07-06T07:44:11.000Z
2022-03-31T14:09:36.000Z
F1/windowactivate-event-project-vbapj-chm131140.md
CeptiveYT/VBA-Docs
1d9c58a40ee6f2d85f96de0a825de201f950fc2a
[ "CC-BY-4.0", "MIT" ]
1,457
2018-05-11T17:48:58.000Z
2022-03-25T22:03:38.000Z
F1/windowactivate-event-project-vbapj-chm131140.md
CeptiveYT/VBA-Docs
1d9c58a40ee6f2d85f96de0a825de201f950fc2a
[ "CC-BY-4.0", "MIT" ]
469
2018-06-14T12:50:12.000Z
2022-03-27T08:17:02.000Z
--- title: WindowActivate Event, Project [vbapj.chm131140] keywords: vbapj.chm131140 f1_keywords: - vbapj.chm131140 ms.prod: office ms.assetid: a9fb0eaf-faec-4680-ab20-702a0f6856a7 ms.date: 06/08/2017 ms.localizationpriority: medium --- # WindowActivate Event, Project [vbapj.chm131140] Hi there! You have landed on one of our F1 Help redirector pages. Please select the topic you were looking for below. [Application.WindowActivate Event (Project)](https://msdn.microsoft.com/library/b54d0956-7eab-db5f-394a-5120bc111afd%28Office.15%29.aspx) [Application.WindowSidepaneTaskChange Event (Project)](https://msdn.microsoft.com/library/674a8134-1e34-2658-6c67-5eb92c628ed8%28Office.15%29.aspx) [!include[Support and feedback](~/includes/feedback-boilerplate.md)]
36.428571
147
0.797386
eng_Latn
0.317915
edc6cf775328051c19cc34624e01ac20bd7bc7a9
483
md
Markdown
README.md
jpbonson/APIAuthenticationCourse
b9cf6239a5ece90044f6b274f74b76836334d073
[ "BSD-2-Clause" ]
3
2018-01-23T22:21:38.000Z
2018-01-25T15:19:47.000Z
README.md
jpbonson/Course_APIAuthentication
b9cf6239a5ece90044f6b274f74b76836334d073
[ "BSD-2-Clause" ]
null
null
null
README.md
jpbonson/Course_APIAuthentication
b9cf6239a5ece90044f6b274f74b76836334d073
[ "BSD-2-Clause" ]
null
null
null
# API Authentication Course API with examples of multiple authentication strategies (Python). - Basic - Simple Token - OAuth 2.0 with JWT Python. Flask. Slides at https://docs.google.com/presentation/d/1zWgwnl7QCW8svIQOgq28rlYA13sK3P7N82SH-Wceel0/edit?usp=sharing (Portuguese only) ### How to install? ### ``` pipenv install ``` ### How to run? ### ``` gunicorn main:app ``` ### Application ### https://course-api-auth.herokuapp.com/ The routes are described in the slides.
17.25
128
0.724638
eng_Latn
0.748658
edc77a6416ffe83bcee47c2d50a335ff18311dcf
4,175
md
Markdown
articles/databox/data-box-disk-system-requirements.md
changeworld/azure-docs.nl-nl
bdaa9c94e3a164b14a5d4b985a519e8ae95248d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/databox/data-box-disk-system-requirements.md
changeworld/azure-docs.nl-nl
bdaa9c94e3a164b14a5d4b985a519e8ae95248d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/databox/data-box-disk-system-requirements.md
changeworld/azure-docs.nl-nl
bdaa9c94e3a164b14a5d4b985a519e8ae95248d5
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Systeemvereisten voor Microsoft Azure Data Box Disk | Microsoft Docs description: Lees meer over de software- en netwerkvereisten voor uw Azure Data Box Disk services: databox author: alkohli ms.service: databox ms.subservice: disk ms.topic: article ms.date: 09/04/2019 ms.author: alkohli ms.localizationpriority: high ms.openlocfilehash: fb2fd89664517e44cf5128a5c82e583f03087061 ms.sourcegitcommit: 49c4b9c797c09c92632d7cedfec0ac1cf783631b ms.translationtype: HT ms.contentlocale: nl-NL ms.lasthandoff: 09/05/2019 ms.locfileid: "70307687" --- ::: zone target="docs" # <a name="azure-data-box-disk-system-requirements"></a>Systeemvereisten voor Azure Data Box Disk In dit artikel worden de belangrijkste systeemvereisten beschreven voor uw oplossing met Microsoft Azure Data Box Disk en voor de clients die verbinding maken met de Data Box Disk. We adviseren om de informatie zorgvuldig door te nemen voordat u uw Data Box Disk implementeert. Als u tijdens de implementatie en daaropvolgende bewerkingen nog vragen hebt, kunt u de informatie er altijd nog even bij pakken. De systeemvereisten omvatten de ondersteunde platforms voor clients die verbinding maken met schijven, ondersteunde opslagaccounts en opslagtypen. ::: zone-end ::: zone target="chromeless" ## <a name="review-prerequisites"></a>Vereisten controleren 1. U moet uw Data Box Disk hebben besteld met behulp van [Zelfstudie: Azure Data Box Disk bestellen](data-box-disk-deploy-ordered.md). U hebt de schijven en één aansluitkabel per schijf ontvangen. 2. U hebt een clientcomputer beschikbaar van waaruit u de gegevens kunt kopiëren. De clientcomputer moet voldoen aan deze vereisten: - Er is een ondersteund besturingssysteem geïnstalleerd. - Andere vereiste software is geïnstalleerd. 
::: zone-end ## <a name="supported-operating-systems-for-clients"></a>Ondersteunde besturingssystemen voor clients Hier volgt een lijst met de ondersteunde besturingssystemen voor het ontgrendelen van de schijf en het kopiëren van gegevens via de clients die zijn verbonden met de Data Box Disk. | **Besturingssysteem** | **Geteste versies** | | --- | --- | | Windows Server |2008 R2 SP1 <br> 2012 <br> 2012 R2 <br> 2016 | | Windows (64-bits) |7, 8, 10 | |Linux <br> <li> Ubuntu </li><li> Debian </li><li> Red Hat Enterprise Linux (RHEL) </li><li> CentOS| <br>14.04, 16.04, 18.04 <br> 8.11, 9 <br> 7.0 <br> 6.5, 6.9, 7.0, 7.5 | ## <a name="other-required-software-for-windows-clients"></a>Andere vereiste software voor Windows-clients Voor Windows-clients moet de volgende software ook zijn geïnstalleerd. | **Software**| **Versie** | | --- | --- | | Windows Powershell |5.0 | | .NET Framework |4.5.1 | | Windows Management Framework |5.0| | BitLocker| - | ## <a name="other-required-software-for-linux-clients"></a>Andere vereiste software voor Linux-clients Voor Linux-clients wordt de volgende vereiste software geïnstalleerd door de Data Box Disk-werkset: - dislocker - OpenSSL ## <a name="supported-connection"></a>Ondersteunde verbinding De clientcomputer met de gegevens moet een USB 3.0-poort of hoger hebben. De schijven maken verbinding met deze client met behulp van de meegeleverde kabel. ## <a name="supported-storage-accounts"></a>Ondersteunde opslagaccounts Hier volgt een lijst met de ondersteunde opslagtypen voor de Data Box Disk. | **Opslagaccount** | **Opmerkingen** | | --- | --- | | Klassiek | Standard | | Algemeen gebruik |Standard; zowel V1 als V2 wordt ondersteund. Zowel dynamische als statische servicelagen worden ondersteund. | | Blob-opslagaccount | | >[!NOTE] > Gen 2-accounts van Azure Data Lake Storage worden niet ondersteund. 
## <a name="supported-storage-types-for-upload"></a>Supported storage types for upload Here is a list of the storage types supported for upload to Azure using Data Box Disk. | **File format** | **Notes** | | --- | --- | | Azure block blob | | | Azure page blob | | | Azure Files | | | Managed disks | | ::: zone target="docs" ## <a name="next-step"></a>Next step * [Deploy your Azure Data Box Disk](data-box-disk-deploy-ordered.md) ::: zone-end
39.386792
407
0.752096
nld_Latn
0.993273
edc947104f62ba5367b866e26ad8608e70675b05
76
md
Markdown
README.md
auttawutsriprasan/myTODOs
15e861dd5195d70234631b10eb601df0d019a4b3
[ "MIT" ]
null
null
null
README.md
auttawutsriprasan/myTODOs
15e861dd5195d70234631b10eb601df0d019a4b3
[ "MIT" ]
null
null
null
README.md
auttawutsriprasan/myTODOs
15e861dd5195d70234631b10eb601df0d019a4b3
[ "MIT" ]
null
null
null
# myTODOs This repo is a collection of all the things I want to achieve.
12.666667
58
0.75
eng_Latn
0.999872
edc95dfae80aa56470804cfb9d9f137896d5856e
326
md
Markdown
src/comments/mastering-paper/introduction-tool-guide/comment-1385076228000.md
Oxyenyos/web-proj
37e321fcc45f13a87831d20a83ba797b29864ea0
[ "MIT" ]
null
null
null
src/comments/mastering-paper/introduction-tool-guide/comment-1385076228000.md
Oxyenyos/web-proj
37e321fcc45f13a87831d20a83ba797b29864ea0
[ "MIT" ]
null
null
null
src/comments/mastering-paper/introduction-tool-guide/comment-1385076228000.md
Oxyenyos/web-proj
37e321fcc45f13a87831d20a83ba797b29864ea0
[ "MIT" ]
null
null
null
--- replying_to: '14' id: comment-1133682554 date: 2013-11-21T23:23:48Z updated: 2013-11-21T23:23:48Z _parent: /mastering-paper/introduction-tool-guide/ name: Michael Rose url: https://alokprateek.in/ email: 1ce71bc10b86565464b612093d89707e --- My pleasure. I learn something new every time I use Paper. It's such a fun app!
25.076923
79
0.766871
yue_Hant
0.373742
edc9751e8a38a6eee9f8372fb65b9265609134cd
441
md
Markdown
content/conditions/heat-rash/main-content-4.md
nhsalpha/betahealth
9cc0bbf71000e5322a5dcedfe4e6b4473ef9313e
[ "MIT" ]
4
2016-09-15T13:47:05.000Z
2017-04-24T08:01:30.000Z
content/conditions/heat-rash/main-content-4.md
nhsuk/betahealth
9cc0bbf71000e5322a5dcedfe4e6b4473ef9313e
[ "MIT" ]
73
2016-07-28T10:52:06.000Z
2017-06-07T14:58:49.000Z
content/conditions/heat-rash/main-content-4.md
nhsuk/betahealth
9cc0bbf71000e5322a5dcedfe4e6b4473ef9313e
[ "MIT" ]
7
2016-10-07T12:37:41.000Z
2021-04-11T07:40:24.000Z
## Causes of heat rash Heat rash is usually caused by excessive sweating. Sweat glands get blocked and the trapped sweat causes a rash to develop a few days later. Babies often get it because they can’t control their temperature as well as adults and children can. Sweating is usually caused by hot or humid weather, but other things, such as being overweight or spending long periods in bed (perhaps because of an illness), can also cause it.
31.5
77
0.795918
eng_Latn
0.999989
edc9bc90b63f7627caa645498335d22acfb9d392
16,442
md
Markdown
09-04-2020/13-56.md
preetham/greenhub
7aac43f72d919533b7515bf016021fc6b9d023f6
[ "MIT" ]
null
null
null
09-04-2020/13-56.md
preetham/greenhub
7aac43f72d919533b7515bf016021fc6b9d023f6
[ "MIT" ]
null
null
null
09-04-2020/13-56.md
preetham/greenhub
7aac43f72d919533b7515bf016021fc6b9d023f6
[ "MIT" ]
null
null
null
<h2>News Now</h2><table><tr><th>Title</th><th>Content</th><th>URL</th><th>Author</th></tr> <tr><td><h3>TwentyFour Select Monthly Income Fund - Tender Submission Deadline Results – Company Announcement</h3></td><td><p>4 September 2020 TWENTYFOUR SELECT MONTHLY INCOME FUND LIMITED (a non-cellular company limited by shares incorporated in the Island of Guernsey under the Companies (Guernsey) Law 2008, as amended, with registered number 57985 and registered as a Registered Closed-ended Collective Investment Scheme w...</p></td><td><a href=https://markets.ft.com/data/announce/detail?dockey&#61;600-202009040951PR_NEWS_PRUKDSCL_0086-1>https://markets.ft.com/data/announce/detail?dockey=600-202009040951PR_NEWS_PRUKDSCL_0086-1</a></td><td><p>PR Newswire</p></td></tr><tr><td><h3>Driver&#39;s lucky escape after car flips onto roof on busy A66</h3></td><td><p>The crash caused a backlog of traffic for those heading into the town centre...</p></td><td><a href=https://www.gazettelive.co.uk/news/teesside-news/drivers-lucky-escape-after-car-18878218>https://www.gazettelive.co.uk/news/teesside-news/drivers-lucky-escape-after-car-18878218</a></td><td><p>Toni Guillot</p></td></tr><tr><td><h3>&#39;Highly encouraging&#39; attendance for school trust</h3></td><td><p>23 primary schools within Nicholas Postgate Catholic Academy Trust said that 96.1% of children returned for their first day of term...</p></td><td><a href=https://www.gazettelive.co.uk/news/teesside-news/highly-encouraging-attendance-children-return-18875606>https://www.gazettelive.co.uk/news/teesside-news/highly-encouraging-attendance-children-return-18875606</a></td><td><p>Kristy Dawson</p></td></tr><tr><td><h3>Chart-topping DJ Joel Corry to play at Newcastle social-distance gig</h3></td><td><p>The venue operates a system permitting people from the same household to watch the shows from their very own pod...</p></td><td><a 
href=https://www.gazettelive.co.uk/whats-on/chart-topping-dj-joel-corry-18878942>https://www.gazettelive.co.uk/whats-on/chart-topping-dj-joel-corry-18878942</a></td><td><p>Toni Guillot</p></td></tr><tr><td><h3>Man to face trial linked to Luke Jobson&#39;s death fails to show</h3></td><td><p>Edwin Taha was listed to appear at Teesside Crown Court in Middlesbrough on Thursday...</p></td><td><a href=https://www.gazettelive.co.uk/news/teesside-news/man-due-go-trial-charged-18877167>https://www.gazettelive.co.uk/news/teesside-news/man-due-go-trial-charged-18877167</a></td><td><p>Kristy Dawson</p></td></tr><tr><td><h3>Crash claims life of second victim as teen loses fight for life</h3></td><td><p>A three-year-old girl died following the crash on the A1086 Coast Road but now a 17-year-old has also lost her life as a result of her injuries...</p></td><td><a href=https://www.gazettelive.co.uk/news/teesside-news/crash-claims-life-second-victim-18879407>https://www.gazettelive.co.uk/news/teesside-news/crash-claims-life-second-victim-18879407</a></td><td><p>Toni Guillot</p></td></tr><tr><td><h3>Parole, furlough not absolute right; not to be given to terrorists, hardened criminals: MHA to States</h3></td><td><p>The Ministry of Home Affairs (MHA) said on Friday the release of prisoners on parole and furlough is not an absolute right and should be on the basis of well-defined norms of eligibility, and directed states that those involved in terrorism and other heinous crimes should not be allowed to go out of...</p></td><td><a href=https://timesofindia.indiatimes.com/india/parole-furlough-not-absolute-right-not-to-be-given-to-terrorists-hardened-criminals-mha-to-states/articleshow/77933249.cms>https://timesofindia.indiatimes.com/india/parole-furlough-not-absolute-right-not-to-be-given-to-terrorists-hardened-criminals-mha-to-states/articleshow/77933249.cms</a></td><td><p>indiatimes</p></td></tr><tr><td><h3>The Pandemic Is Contributing To Financial Scams, And Generation Z Is 
Especially Vulnerable</h3></td><td><p>Generation Z, the generational name given to people born after the mid-1990s, is coming of age. The oldest members of the cohort are graduating college and entering the workforce, and, just like their millennial counterparts, are doing so in the midst of an economic crisis. And on top of that, the e...</p></td><td><a href=http://feeds.benzinga.com/~r/benzinga/tech/~3/SHr4QTEMNhM/the-pandemic-is-contributing-to-financial-scams-and-generation-z-is-especially-vulnerable>http://feeds.benzinga.com/~r/benzinga/tech/~3/SHr4QTEMNhM/the-pandemic-is-contributing-to-financial-scams-and-generation-z-is-especially-vulnerable</a></td><td><p>Spencer Israel</p></td></tr><tr><td><h3>Metromile, Ford Motor Change The Way Consumers Look At Car Ownership</h3></td><td><p>Metromile, an insurance-focused fintech powered by data science and machine learning, on Thursday announced it partnered with Ford Motor Company (NYSE: F), to provide vehicle owners with built-in, intelligent car insurance. As part of the development, Benzinga chatted with Metromile CTO Paw Andersen...</p></td><td><a href=http://feeds.benzinga.com/~r/benzinga/tech/~3/YTRiQqOxKrg/metromile-ford-motor-change-the-way-consumers-look-at-car-ownership>http://feeds.benzinga.com/~r/benzinga/tech/~3/YTRiQqOxKrg/metromile-ford-motor-change-the-way-consumers-look-at-car-ownership</a></td><td><p>Renato Capelj</p></td></tr><tr><td><h3>Here&#39;s How Much Investing $1,000 In Microsoft At Dot-Com Bubble Peak Would Be Worth Today</h3></td><td><p>Despite an ongoing pandemic and the U.S. economy barely limping along, the Nasdaq is trading at all-time highs and is now more than 68% above its March lows. The surge in tech stocks in 2020 has understandably led investors to draw comparisons to the dot-com bubble in 2000. 
The Nasdaq ultimately pea...</p></td><td><a href=http://feeds.benzinga.com/~r/benzinga/tech/~3/78N-CmwIhzM/heres-how-much-investing-1-000-in-microsoft-at-dot-com-bubble-peak-would-be-worth-today>http://feeds.benzinga.com/~r/benzinga/tech/~3/78N-CmwIhzM/heres-how-much-investing-1-000-in-microsoft-at-dot-com-bubble-peak-would-be-worth-today</a></td><td><p>Wayne Duggan</p></td></tr><tr><td><h3>Looking Into Verizon Communications&#39;s Return On Capital Employed</h3></td><td><p>Verizon Communications (NYSE: VZ) posted Q2 earnings of $7.36 billion, an increase from Q1 of 11.89%. Sales dropped to $30.40 billion, a 3.83% decrease between quarters. In Q1, Verizon Communications brought in $31.61 billion in sales but only earned $6.58 billion. What Is Return On Capital Employed...</p></td><td><a href=http://feeds.benzinga.com/~r/benzinga/tech/~3/dhwVny1UjhE/looking-into-verizon-communicationss-return-on-capital-employed>http://feeds.benzinga.com/~r/benzinga/tech/~3/dhwVny1UjhE/looking-into-verizon-communicationss-return-on-capital-employed</a></td><td><p>Benzinga Insights</p></td></tr><tr><td><h3>The genetics of blood: A global perspective</h3></td><td><p>To better understand the properties of blood cells, an international team has been examining variations in the DNA of 746,667 people worldwide....</p></td><td><a href=https://www.sciencedaily.com/releases/2020/09/200904090308.htm>https://www.sciencedaily.com/releases/2020/09/200904090308.htm</a></td><td><p>sciencedaily</p></td></tr><tr><td><h3>Ocean carbon uptake widely underestimated</h3></td><td><p>The world&#39;s oceans soak up more carbon than most scientific models suggest, according to new research....</p></td><td><a href=https://www.sciencedaily.com/releases/2020/09/200904090312.htm>https://www.sciencedaily.com/releases/2020/09/200904090312.htm</a></td><td><p>sciencedaily</p></td></tr><tr><td><h3>Researchers identify nanobody that may prevent COVID-19 infection</h3></td><td><p>Researchers have identified a small 
neutralizing antibody, a so-called nanobody, that has the capacity to block SARS-CoV-2 from entering human cells. The researchers believe this nanobody has the potential to be developed as an antiviral treatment against COVID-19....</p></td><td><a href=https://www.sciencedaily.com/releases/2020/09/200904090314.htm>https://www.sciencedaily.com/releases/2020/09/200904090314.htm</a></td><td><p>sciencedaily</p></td></tr><tr><td><h3>Red hot meat: The wrong recipe for heart disease</h3></td><td><p>From MasterChef to MKR, the world&#39;s best chefs have taught us how to barbeque, grill and panfry a steak to perfection. But while the experts may be seeking that extra flavor, new research suggests high-heat caramelization could be bad for our health....</p></td><td><a href=https://www.sciencedaily.com/releases/2020/09/200904090316.htm>https://www.sciencedaily.com/releases/2020/09/200904090316.htm</a></td><td><p>sciencedaily</p></td></tr><tr><td><h3>How Designers Are Making Their Spring 2021 Collections With Leftover Fabric, Old Patterns, and Renewed Clarity</h3></td><td><p>Sustainability was the biggest conversation topic in fashion—then the pandemic happened. For some designers, working sustainably was the only option, from using leftover materials to exploring handwork. 
See a first look at Gabriela Hearst and Marina Moscone&#39;s spring 2021 collections here....</p></td><td><a href=https://www.vogue.com/article/how-designers-are-making-spring-2021-collections-leftover-fabrics-sustainability>https://www.vogue.com/article/how-designers-are-making-spring-2021-collections-leftover-fabrics-sustainability</a></td><td><p>Emily Farra</p></td></tr><tr><td><h3>Kickstart the Fall Season With These 16 Sneakers</h3></td><td><p>With athleisure&#39;s comeback this spring, sneakers still reign supreme....</p></td><td><a href=https://www.vogue.com/slideshow/kickstart-the-fall-season-with-these-16-sneakers>https://www.vogue.com/slideshow/kickstart-the-fall-season-with-these-16-sneakers</a></td><td><p>Rachel Besser</p></td></tr><tr><td><h3>Alexander Wang Fall 2020 Ready-to-Wear Fashion Show</h3></td><td><p>The complete Alexander Wang Fall 2020 Ready-to-Wear fashion show now on Vogue Runway....</p></td><td><a href=https://www.vogue.com/fashion-shows/fall-2020-ready-to-wear/alexander-wang>https://www.vogue.com/fashion-shows/fall-2020-ready-to-wear/alexander-wang</a></td><td><p>@voguemagazine</p></td></tr><tr><td><h3>Gabriele Colangelo Resort 2021 Fashion Show</h3></td><td><p>The complete Gabriele Colangelo Resort 2021 fashion show now on Vogue Runway....</p></td><td><a href=https://www.vogue.com/fashion-shows/resort-2021/gabriele-colangelo>https://www.vogue.com/fashion-shows/resort-2021/gabriele-colangelo</a></td><td><p>@voguemagazine</p></td></tr><tr><td><h3>17 On-Sale Picks for the Last Splash of Summer</h3></td><td><p>End the season in style....</p></td><td><a href=https://www.vogue.com/slideshow/labor-day-clothing-sales>https://www.vogue.com/slideshow/labor-day-clothing-sales</a></td><td><p>Madeline Fass</p></td></tr><tr><td><h3>What to Watch this Long Weekend</h3></td><td><p>From a horror movie filmed on zoom to a Charles Dickens&#39; adaptation, there is plenty to keep you busy this long weekend....</p></td><td><a 
href=https://www.vogue.com/article/what-to-stream-this-weekend>https://www.vogue.com/article/what-to-stream-this-weekend</a></td><td><p>vogue</p></td></tr><tr><td><h3>SZA’s “Hit Different” Video Is a Kaleidoscopic Ode to End-of-Summer Style</h3></td><td><p>The singer&#39;s new video features tie-dye, crop-tops, and more....</p></td><td><a href=https://www.vogue.com/article/sza-hit-different-video-summer-style>https://www.vogue.com/article/sza-hit-different-video-summer-style</a></td><td><p>Liam Hess</p></td></tr><tr><td><h3>Los Gatos: With houses on verge of going up in North 40 site, residents mount a last stand against controversial project</h3></td><td><p>SummerHill Homes president calls lower parking level &#39;wasteful&#39; but isn&#39;t confident town officials will support him...</p></td><td><a href=https://www.mercurynews.com/2020/09/04/los-gatos-with-houses-on-verge-of-going-up-in-north-40-site-residents-mount-a-last-stand-against-controversial-project/>https://www.mercurynews.com/2020/09/04/los-gatos-with-houses-on-verge-of-going-up-in-north-40-site-residents-mount-a-last-stand-against-controversial-project/</a></td><td><p>Darren Sabedra</p></td></tr><tr><td><h3>India can survive Covid-19 not Modinomics</h3></td><td><p>People elect the government they want. The nightmare for the Indian economy started in2014 when the voters were scammed by the seller of the Achhe Din Aanewale Hain dream. 
The assault had started earlier but......</p></td><td><a href=https://timesofindia.indiatimes.com/blogs/gantantra-gantavya/india-can-survive-covid-19-not-modinomics/>https://timesofindia.indiatimes.com/blogs/gantantra-gantavya/india-can-survive-covid-19-not-modinomics/</a></td><td><p>Ashutosh Misra</p></td></tr><tr><td><h3>Stop harassing international members of Tablighi Jamaat when cases against them have collapsed in courts</h3></td><td><p>In March, when India&#39;s largest Covid cluster emerged out of the Tablighi Jamaat congregation in Delhi, it belonged to a string of religious gatherings in the world acting in a dangerously irresponsible manner in the......</p></td><td><a href=https://timesofindia.indiatimes.com/blogs/toi-editorials/stop-harassing-international-members-of-tablighi-jamaat-when-cases-against-them-have-collapsed-in-courts/>https://timesofindia.indiatimes.com/blogs/toi-editorials/stop-harassing-international-members-of-tablighi-jamaat-when-cases-against-them-have-collapsed-in-courts/</a></td><td><p>Quick Edit</p></td></tr><tr><td><h3>Poor mask discipline is denting unlocking efforts; government must step up interventions on this score</h3></td><td><p>AIIMS director Randeep Guleria has advised hefty fines for not wearing masks and aggressive testing if we are to prevent the spread of the virus and revive economic activities at a fast pace. 
He also......</p></td><td><a href=https://timesofindia.indiatimes.com/blogs/toi-editorials/poor-mask-discipline-is-denting-unlocking-efforts-government-must-step-up-interventions-on-this-score/>https://timesofindia.indiatimes.com/blogs/toi-editorials/poor-mask-discipline-is-denting-unlocking-efforts-government-must-step-up-interventions-on-this-score/</a></td><td><p>Quick Edit</p></td></tr><tr><td><h3>Hindu rashtra – Myth or reality (Part 2)</h3></td><td><p>Today, when a course correction is being attempted to restore part of India&#39;s enviable ancient past, we have all kinds of protests and charges against the democratically elected government of the day. The resentment perhaps......</p></td><td><a href=https://timesofindia.indiatimes.com/blogs/blunt-frank/hindu-rashtra-myth-or-reality-part-2/>https://timesofindia.indiatimes.com/blogs/blunt-frank/hindu-rashtra-myth-or-reality-part-2/</a></td><td><p>Saroj Chadha</p></td></tr><tr><td><h3>Feedback from ground up</h3></td><td><p>Soon after the lockdown due to corona began on March 25, the central government announced that Rs 500-1500 is being immediately transferred to Jan Dhan accounts of millions of poor women in the country. Eager......</p></td><td><a href=https://timesofindia.indiatimes.com/blogs/voices/feedback-from-ground-up/>https://timesofindia.indiatimes.com/blogs/voices/feedback-from-ground-up/</a></td><td><p>Dr Rajesh Tandon</p></td></tr><tr><td><h3>What Biden Understands That the Left Does Not</h3></td><td><p>If Biden wins, it&#39;ll be because he&#39;s too old and offline to abide by the strictures that now hamper much of the Democratic Party.All through the primaries, Joe Biden was portrayed as an anachronism, a man who had missed his moment by a decade or three. 
Even as mainstream media outlets published fawn...</p></td><td><a href=https://www.theatlantic.com/ideas/archive/2020/09/what-biden-understands-left-does-not/616000/>https://www.theatlantic.com/ideas/archive/2020/09/what-biden-understands-left-does-not/616000/</a></td><td><p>Yascha Mounk</p></td></tr><tr><td><h3>Mulan Is a War Movie With a Dash of Magic</h3></td><td><p>The ever-expanding canon of Disney&#39;s &#34;live-action remakes&#34; of its own animated classics can be handily split into two categories. There are the intensely faithful remakes, such as The Lion King and Beauty and the Beast, which take hit films of yesteryear and present them as largely the same narrativ...</p></td><td><a href=https://www.theatlantic.com/culture/archive/2020/09/disney-mulan-niki-caro-review/616006/>https://www.theatlantic.com/culture/archive/2020/09/disney-mulan-niki-caro-review/616006/</a></td><td><p>David Sims</p></td></tr></table>
2,348.857143
9,995
0.760187
eng_Latn
0.65881
edc9d9e8f6ba03fa6ab3eca8faf96d7dc503f74f
4,117
md
Markdown
README.md
lmc-eu/awesome-developer
aa4b0f22935689e2a6900e9ecc2dd5437369c9d3
[ "MIT" ]
14
2018-04-05T12:53:20.000Z
2021-08-13T10:27:45.000Z
README.md
lmc-eu/awesome-developer
aa4b0f22935689e2a6900e9ecc2dd5437369c9d3
[ "MIT" ]
4
2018-04-11T12:25:18.000Z
2018-11-08T11:57:37.000Z
README.md
lmc-eu/awesome-developer
aa4b0f22935689e2a6900e9ecc2dd5437369c9d3
[ "MIT" ]
1
2018-06-05T12:40:33.000Z
2018-06-05T12:40:33.000Z
# Awesome resources for LMC developers [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome) A curated list of amazingly awesome resources for (not only) LMC developers. The aim is to provide a list of "must-read" resources about not widely known or often neglected topics. If you get through them, they will surely make you a better and even more awesome developer - we promise! - 🇨🇿 Resource is in Czech (otherwise it is in English) - 🎥 Resource is a video (otherwise it is an article) ## Table of Contents - [Development](#development) - [Code review](#code-review) - [Q&A](#qa) - [Git](#git) - [Books](#books) - [Newsletters](#newsletters) ### Development - [Fluent interfaces are evil](https://ocramius.github.io/blog/fluent-interfaces-are-evil/) - there are actually very few cases when fluent interfaces should be used. - Article series by Matthias Noback: - [Reducing complexity](https://www.ibuildings.nl/blog/2016/01/programming-guidelines-part-1-reducing-complexity) - [Getting rid of null](https://www.ibuildings.nl/blog/2016/01/programming-guidelines-part-2-getting-rid-null) - [Why write tests](https://www.ibuildings.nl/blog/2016/08/why-write-tests) - [Principy objektově orientovaného návrhu](https://www.zdrojak.cz/serialy/principy-objektove-orientovaneho-navrhu/) (🇨🇿) - article series covering SOLID and GRASP patterns. - [Verzování Databáze](https://www.youtube.com/watch?v=KTmlw5AKM8E) (🇨🇿 🎥) - why and how to migrate your database with backward-compatibility. - [Technický dluh](https://blog.think-forth.com/2016/01/21/technicky-dluh/) (🇨🇿) #### PHP-specific - [Extremely Defensive PHP](https://ocramius.github.io/extremely-defensive-php/#/) - Avoid mistakes. Be cautious about your code. 
- [Named constructors](http://verraes.net/2014/06/named-constructors-in-php/) - especially value objects may be better instantiated via static named constructors, instead of `__construct()`. - [Symfony komponenty](https://www.zdrojak.cz/serialy/symfony-po-kruckach/) (🇨🇿) - article series about standalone Symfony components. - [Dealing with exceptional conditions](https://www.youtube.com/watch?v=1YAGxJVuuws) (🎥) - best practices for dealing with exceptions in PHP. - Main ideas are also summed up [in an article](https://blog.nikolaposa.in.rs/2016/08/17/exceptional-behavior-best-practices/). ### Code review - [Code reviews v praxi](https://www.zdrojak.cz/clanky/code-reviews-praxi/) (🇨🇿) - [Code review checklist](https://www.zdrojak.cz/clanky/code-review-checklist/) (🇨🇿) - [Co nás naučilo code review](https://www.zdrojak.cz/clanky/co-nas-naucilo-code-review/) (🇨🇿) ### Q&A - [Funkční systémové testování: když unit-testy nestačí](http://prvnivcesku.cz/systemove-testovani/) (🇨🇿) - what is functional system testing and when to use it. ### Git - [Jak a proč fixupovat](https://filip-prochazka.com/blog/git-fixup) (🇨🇿) - how and why to use fixup commits (especially during code reviews). ### Books - [Clean Code - Robert C. Martin](https://www.amazon.com/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882) - [Návrhové vzory - Rudolf Pecinovský](http://knihy.pecinovsky.cz/vzory/index.html) (🇨🇿) ### Newsletters - [PHP Annotated monthly by Jetbrains](https://blog.jetbrains.com/phpstorm/category/php-annotated-monthly/) - [PHP Weekly](http://www.phpweekly.com/) ## Contributing rules If you feel some resource matching the aim of this document (see the beginning) is missing, you are welcome to contribute! But please follow these rules: - Add one new resource per pull request. - Use icons (🇨🇿 🎥 ...) if applicable. - Add a short description of why the resource is awesome. The description sentence should end with a dot. 
- If the resource is in Czech, make the title (link) also in Czech, but the description should always be in English. - Also please note this list is *curated*, as we want it to contain only the most important resources. This means suggestions are not automatically accepted and may be rejected if we feel they do not match the mission of this document.
63.338462
194
0.751032
eng_Latn
0.816778
edca71d3b3b734db1b7d5dc2fdbb2d0a6c8156d0
1,431
md
Markdown
README.md
ltsaiete/be-the-hero-semanaomnistack11-
4a290e998bdd8c5276c5bc7d793d3eb517933b44
[ "MIT" ]
1
2020-12-15T14:50:14.000Z
2020-12-15T14:50:14.000Z
README.md
ltsaiete/be-the-hero-semanaomnistack11-
4a290e998bdd8c5276c5bc7d793d3eb517933b44
[ "MIT" ]
5
2021-01-09T22:29:32.000Z
2022-02-27T02:06:37.000Z
README.md
ltsaiete/be-the-hero-semanaomnistack11-
4a290e998bdd8c5276c5bc7d793d3eb517933b44
[ "MIT" ]
1
2021-04-30T18:38:12.000Z
2021-04-30T18:38:12.000Z
<h1 align="center"> <img src="./frontend/src/assets/logo.svg" width="250"> </h1> <p align="center"><b>Omnistack Week 11.0:rocket:</b></p> <p align="center"> <a href="#project">Project</a> | <a href="#rocket-omnistack-week">Omnistack Week</a> | <a href="#rocket-technologies">Technologies</a> | <a href="#how-to-contribute">How to contribute</a> | <a href="#memo-license">License</a> </p> ## Project BeTheHero is a project that aims to connect people who want to make monetary contributions with NGOs (non-governmental organizations) that need help. ## :rocket: Omnistack Week The **Omnistack Week** is an event run by [Rocketseat](https://rocketseat.com.br) :rocket: in which, over the course of a week, we build an app entirely in JavaScript, using Node.js on the backend, React on the frontend, and React Native for mobile, touching on many web and mobile development concepts with this highly powerful stack. ## :rocket: Technologies This project was developed with the following technologies: * Node.js * React * React Native * Expo ## How to contribute * Fork this repository; * Create a branch with your feature; * Commit your changes; * Push to your branch. After your pull request has been merged, you can delete your branch. ## :memo: License This project is under the MIT license. See the [LICENSE](LICENSE) file for details.
38.675676
347
0.740741
por_Latn
0.998815
edcaad23807765179ff69931f34edc8b3861d515
2,584
md
Markdown
README.md
Zingle/authz
b909d6467c469af74e1982b5171ccb21c7fc2c31
[ "MIT" ]
null
null
null
README.md
Zingle/authz
b909d6467c469af74e1982b5171ccb21c7fc2c31
[ "MIT" ]
null
null
null
README.md
Zingle/authz
b909d6467c469af74e1982b5171ccb21c7fc2c31
[ "MIT" ]
null
null
null
Zingle **authz** authentication library. This core library provides Express.js middleware and helpers for building an authentication server relying on standard Zingle protocols. authz Library API ================= class AuthZ ----------- The **AuthZ** class maintains the application-level settings for authentication, and can be used to generate middleware primitives that can be used to build an authentication service. ```js import {AuthZ} from "@zingle/authz"; ``` ### new AuthZ({secret, passport=new Passport()}) Create an **AuthZ** instance with required application secret and an optional Passport.js instance. This instance can be used to generate middleware used to build an authentication service. ### AuthZ#authenticate(strategy) Create request authentication middleware using the Passport.js strategy which was registered with the provided strategy name. The middleware will generate a 403 Forbidden response if the strategy does not result in a logged in user. Otherwise, it will continue to the next middleware. ### AuthZ#oauth(strategy, scope, data=()=>{}) Create OAuth permission request middleware using the Passport.js strategy which was registered with the provided strategy name. Scopes must be an array of requested scopes. Additional data can be passed through the OAuth provider to be returned by the OAuth provider after successful authentication. This data can be static string data or a function which generates the data from the Express.js Request object. The middleware will send the client to the OAuth provider's site to complete the authentication process. The OAuth provider and the strategy determine where the client is redirected upon success. ### AuthZ#requestState() Create function to extract the AuthZ state from a client request. This function can be passed as the third argument to **AuthZ#oauth()** to pass along the AuthZ state with the OAuth permission request. ### AuthZ#sign(iss) Create JWT signing middleware for the provided issuer. 
The middleware expects some other middleware earlier in the chain to have logged in the user. The middleware supports HTML, plain text, and JSON responses, selected by the Accept header. For HTML responses, the "token" template will be rendered with the JWT passed in the "jwt" variable. The application must provide this template. ### AuthZ#userInfo() Create user info middleware to send logged in user's info to the client. The middleware expects some other middleware earlier in the chain to have logged in the user. The middleware sends the user info to the client as JSON.
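Putting the pieces documented above together, a minimal authentication service might be wired up as follows. This is only a sketch: the Express app setup, the `google` strategy name, the requested scopes, and the issuer URL are illustrative assumptions, not part of this library's documented surface; only the `AuthZ` constructor and methods shown above come from the API.

```js
import express from "express";
import {Passport} from "passport";
import {AuthZ} from "@zingle/authz";

const passport = new Passport();
// a Passport.js strategy named "google" is assumed to be registered on
// this passport instance before the routes below are used

const authz = new AuthZ({secret: process.env.APP_SECRET, passport});
const app = express();

// send the client to the OAuth provider, passing along the AuthZ state
app.get("/login", authz.oauth("google", ["email"], authz.requestState()));

// complete authentication, then issue a JWT for the logged in user;
// the Accept header selects HTML, plain text, or JSON for the response
app.get("/callback", authz.authenticate("google"), authz.sign("https://auth.example.com"));

// return the logged in user's info as JSON
app.get("/user", authz.authenticate("google"), authz.userInfo());

app.listen(3000);
```

Note that per the API description, `sign()` and `userInfo()` both rely on earlier middleware in the chain (here `authenticate()`) having logged in the user.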
41.677419
80
0.786378
eng_Latn
0.998625
edcae0ed7d3e91c5a9ff89a602cdfafe64aebe3b
2,899
md
Markdown
gsoc/2020/posts/akash_kumar_singh/post05.md
pawanw17/web
593db35a2c692f0c26b0936fb40f9f89506cbcbf
[ "CC0-1.0" ]
2
2019-06-01T09:39:10.000Z
2020-10-29T00:49:09.000Z
gsoc/2020/posts/akash_kumar_singh/post05.md
vaibhawkhemka/web
c8f26cd5f1507a24346aa043fa6292382093ed35
[ "CC0-1.0" ]
62
2017-05-26T05:14:26.000Z
2021-08-18T03:08:05.000Z
gsoc/2020/posts/akash_kumar_singh/post05.md
vaibhawkhemka/web
c8f26cd5f1507a24346aa043fa6292382093ed35
[ "CC0-1.0" ]
73
2017-05-25T18:43:26.000Z
2021-08-16T08:16:20.000Z
# Design Details (robocompdsl) This post contains the new design that is adopted in the code generated by `robocompdsl` for the ROS1/ROS2 middleware. With the new design we dedicate a whole class to the ROS middleware. This class is declared and defined in the `genericworker.h` header file and is then instantiated in the `GenericWorker` class. ``` [ -- the class name is chosen from the interface name class Publisher/Subscriber/Server/Client{InterfaceName} { ] public: [ -- publishers are chosen on the basis of the different data types present in the modules containing the interface; the dictionary data type is avoided as it is not available in ROS -- besides this, topic interfaces according to ICE syntax are also declared as publishers // rclcpp::Publisher<{ComponentName}::msg::{MessageName}>::SharedPtr pub_{PublisherName}; ] [ -- the same heuristic as for publishers is employed for subscribers too // rclcpp::Subscription<{ComponentName}::msg::{MessageName}>::SharedPtr sub_{SubscriberName}; ] [ -- servers are declared by checking that the interface is not a topic interface and then taking all the methods as services (if the method doesn't contain the dictionary data type); the names are derived from the method names // rclcpp::Service<{ComponentName}::srv::{ServiceName}>::SharedPtr server_{ServerName}; ] [ -- the same heuristic as for servers is adopted // rclcpp::Client<{ComponentName}::srv::{ServiceName}>::SharedPtr client_{ClientName}; ] [ -- TO BE ADDED LATER -- -- to store the data from the Subscriber callback // {ComponentName}::msg::{MessageName} {DataType}_msg; ] rclcpp::Node::SharedPtr node; Publisher/Subscriber/Server/Client{InterfaceName} () { node = rclcpp::Node::make_shared ("Node Name"); // pub_{PublisherName} = node->create_publisher<{ComponentName}::msg::{MessageName}>("{TopicName}", 10); // sub_{SubscriberName} = node->create_subscription<{ComponentName}::msg::{MessageName}>("{TopicName}", 10, std::bind(&Subscriber{InterfaceName}::cb_{SubscriberName}, this, _1)); // 
server_{ServerName} = node->create_service<{ComponentName}::srv::{ServiceName}>("{TopicName}", std::bind(&Server{InterfaceName}::{ServerName}, this, _1, _2)); // client_{ClientName} = node->create_client<{ComponentName}::srv::{ServiceName}>("{TopicName}"); } ~Publisher/Subscriber/Server/Client{InterfaceName} () {} // void {ServerName} (const std::shared_ptr<{ComponentName}::srv::{ServiceName}::Request> req, std::shared_ptr<{ComponentName}::srv::{ServiceName}::Response> res) {} // void cb_{SubscriberName} (const {ComponentName}::msg::{MessageName}::SharedPtr msg) {} }; ```
45.296875
100
0.667127
eng_Latn
0.877859
edcc41fe13a5213e72470483bcb3a566061d5859
2,545
md
Markdown
source/_posts/greedy_customer_tries_to_blag_eight_half_price_meals_on_eat_out_to_help_out.md
soumyadipdas37/finescoop.github.io
0346d6175a2c36d4054083c144b7f8364db73f2f
[ "MIT" ]
null
null
null
source/_posts/greedy_customer_tries_to_blag_eight_half_price_meals_on_eat_out_to_help_out.md
soumyadipdas37/finescoop.github.io
0346d6175a2c36d4054083c144b7f8364db73f2f
[ "MIT" ]
null
null
null
source/_posts/greedy_customer_tries_to_blag_eight_half_price_meals_on_eat_out_to_help_out.md
soumyadipdas37/finescoop.github.io
0346d6175a2c36d4054083c144b7f8364db73f2f
[ "MIT" ]
2
2021-09-18T12:06:26.000Z
2021-11-14T15:17:34.000Z
--- extends: _layouts.post section: content image: https://i.dailymail.co.uk/1s/2020/08/31/12/32599718-0-image-a-17_1598873664432.jpg title: Greedy customer tries to blag EIGHT half price meals on Eat Out To Help Out description: Waiters at Eglinton Diner and Fish Fry in Saltcoats, North Ayrshire, reacted with disbelief when the man asked for one meal to eat in and seven to take away on the scheme. date: 2020-08-31-20-00-23 categories: [latest, latest] featured: true --- A 'greedy' customer has been shamed by a fish and chip shop after he tried to order eight half price meals on the Eat Out To Help Out scheme. Waiters at Eglinton Diner and Fish Fry in Saltcoats, North Ayrshire, were left in 'disbelief' when a lone male walked in and asked for one meal to eat in and seven to take away. When shocked staff asked him why he had placed the large order, he said he hoped to get enough discounted meals to last 'for a few days'. After they explained this wasn't how the taxpayer-funded scheme worked the man took to social media to vent his rage - prompting a furious rebuke from the business. Eglinton Diner and Fish Fry in North Ayrshire posted this response to the man on social media after he ordered eight meals on the taxpayer-funded scheme.  The diner said its waiters reacted in 'disbelief' when they received the lone male's order Saying he did not recommend the restaurant, the man wrote: 'Attitude totally pants and insulting, lost out on eight different meals, KFC loved your money, long time customer never be back.' The enraged restaurant responded: 'I can only assume you were the person in the diner yesterday who was dining alone but asked to order eight suppers. 'One to eat and the other seven to take away to keep you going for the next few days. 'Please read the rules on the Eat Out to Help Out scheme as this is certainly not the way it is intended to work. 
'It is people with your greed that puts schemes like this in jeopardy and we value our business too much to be bending rules for greedy customers. 'I hope you enjoyed your meal.' They then shared his review on their social media page with the caption: 'I think the attitude of the staff was utter disbelief. Trying to order eight meals for one person so he can get them half price. Unbelievable!' Customers have voiced their support for the business on social media. The Treasury has revealed more than 64 million meals have been claimed on the scheme, which discounts meals eaten in restaurants from Monday to Wednesday by up to £10.
59.186047
217
0.782318
eng_Latn
0.999891
edccc1fffe9456f8517199d25c224a67da263b92
2,221
md
Markdown
README.md
smf-distribution/heroku-buildpack-version
fe6175d0970249b8945a00c553476c0c830af747
[ "MIT" ]
null
null
null
README.md
smf-distribution/heroku-buildpack-version
fe6175d0970249b8945a00c553476c0c830af747
[ "MIT" ]
null
null
null
README.md
smf-distribution/heroku-buildpack-version
fe6175d0970249b8945a00c553476c0c830af747
[ "MIT" ]
null
null
null
Source Version Buildpack
==================

As a developer, I want to display the source version of my app so that the team can refer to it for QA.

## Background

When you push your source to Heroku, the repo is used for slug compilation and then purged before the app is deployed. Unfortunately this prevents you from using something like the following in your app code:

    version = `git rev-parse --short HEAD`

So, let's capture the information during slug compilation. Heroku provides the [SOURCE_VERSION](https://devcenter.heroku.com/changelog-items/630) environment variable as a nice SCM-agnostic way to do this. We can use a buildpack to create a [profile.d](https://devcenter.heroku.com/articles/profiled) script that will export SOURCE_VERSION into the app runtime environment.

## Usage

First thing, add the buildpack to your app:

    $ heroku buildpacks:add https://github.com/smf-distribution/heroku-buildpack-version

During your next deployment, the buildpack will generate the environment configuration script:

    -----> Fetching custom git buildpack... done
    -----> Source app detected
    -----> Creating profile.d script
           SOURCE_VERSION: d9f63da90bc76ee878a0e9a54d9e85db7da4a52b
           Script installed to .profile.d/source_version.sh

The script itself is simple, just a single export statement with default expansion:

    export SOURCE_VERSION=${SOURCE_VERSION:-f5efc0615dbd0f64f718e142bad858b8e1cf59bb}

Note that the expansion syntax ':-' ensures that any existing value is not overridden. In dev environments, scripting `git rev-parse` is still probably the most convenient way to get your current source version.

For easy portability, you can cascade the initialization:

    version = ENV['SOURCE_VERSION'] || `git rev-parse HEAD`.presence || '1.0'

## Links

- [Heroku Buildpacks](http://devcenter.heroku.com/articles/buildpacks)
- [Buildpack API](https://devcenter.heroku.com/articles/buildpack-api)
- [SOURCE_VERSION](https://devcenter.heroku.com/changelog-items/630)
- [Default Config Values](https://devcenter.heroku.com/articles/buildpack-api#default-config-values)

## License

The pack is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
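The compile step described above can be sketched roughly like this. This is an illustrative reconstruction, not the buildpack's actual `bin/compile`; the default path and log message are assumptions:

```shell
# Sketch of a buildpack compile step that writes the profile.d script.
# Heroku passes the build directory as $1; default is only for local testing.
BUILD_DIR="${1:-/tmp/build}"
mkdir -p "$BUILD_DIR/.profile.d"

# Embed the current SOURCE_VERSION as the default; the ':-' expansion keeps
# any value already present in the runtime environment.
cat > "$BUILD_DIR/.profile.d/source_version.sh" <<EOF
export SOURCE_VERSION=\${SOURCE_VERSION:-${SOURCE_VERSION:-unknown}}
EOF

echo "       Script installed to .profile.d/source_version.sh"
```

Because the heredoc escapes the outer `$`, the generated file contains a literal `${SOURCE_VERSION:-<sha>}` expansion, exactly like the export statement shown earlier.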
44.42
373
0.770824
eng_Latn
0.911106
edce1095666f5d1c073e64026503dd922906d600
281
md
Markdown
_project/wes-anderson-inspired-wallpaper-for-your-viewing-pleasure.md
rumnamanya/rumnamanya.github.io
2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9
[ "MIT" ]
null
null
null
_project/wes-anderson-inspired-wallpaper-for-your-viewing-pleasure.md
rumnamanya/rumnamanya.github.io
2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9
[ "MIT" ]
null
null
null
_project/wes-anderson-inspired-wallpaper-for-your-viewing-pleasure.md
rumnamanya/rumnamanya.github.io
2deadeff04c8a48cf683b885b7fa6ab9acc1d9d9
[ "MIT" ]
null
null
null
--- layout: project_single title: "wes anderson inspired wallpaper for your viewing pleasure" slug: "wes-anderson-inspired-wallpaper-for-your-viewing-pleasure" parent: "beautiful-wes-anderson-decor-ideas" --- wes anderson inspired wallpaper for your viewing pleasure on domino.com
40.142857
71
0.80427
eng_Latn
0.924053
edcef4b0f9d9c5975348bcd9483c49022164b1ac
11,190
md
Markdown
model_zoo/official/cv/psenet/README_CN.md
Vincent34/mindspore
a39a60878a46e7e9cb02db788c0bca478f2fa6e5
[ "Apache-2.0" ]
null
null
null
model_zoo/official/cv/psenet/README_CN.md
Vincent34/mindspore
a39a60878a46e7e9cb02db788c0bca478f2fa6e5
[ "Apache-2.0" ]
null
null
null
model_zoo/official/cv/psenet/README_CN.md
Vincent34/mindspore
a39a60878a46e7e9cb02db788c0bca478f2fa6e5
[ "Apache-2.0" ]
null
null
null
# Contents

- [Contents](#contents)
- [PSENet Overview](#psenet-overview)
- [PSENet Example](#psenet-example)
    - [Overview](#overview)
- [Dataset](#dataset)
- [Environment Requirements](#environment-requirements)
- [Quick Start](#quick-start)
    - [Script Description](#script-description)
    - [Scripts and Sample Code](#scripts-and-sample-code)
    - [Script Parameters](#script-parameters)
    - [Training Process](#training-process)
        - [Distributed Training](#distributed-training)
    - [Evaluation Process](#evaluation-process)
        - [Running the Test Code](#running-the-test-code)
        - [ICDAR2015 Evaluation Script](#icdar2015-evaluation-script)
            - [Usage](#usage)
            - [Results](#results)
    - [Inference Process](#inference-process)
        - [Exporting MindIR](#exporting-mindir)
        - [Running Inference on Ascend 310](#running-inference-on-ascend-310)
        - [Results](#results-1)
- [Model Description](#model-description)
    - [Performance](#performance)
        - [Evaluation Performance](#evaluation-performance)
        - [Inference Performance](#inference-performance)
    - [Usage](#usage-1)
        - [Inference](#inference)

<!-- /TOC -->

# PSENet Overview

With the development of convolutional neural networks, scene text detection technology has advanced rapidly, but two major problems in its algorithms hinder its application: first, most existing algorithms require quadrilateral bounding boxes to precisely locate arbitrarily shaped text; second, two adjacent text instances may be covered by one erroneous detection. Traditionally, semantic segmentation can solve the first problem, but not the second. PSENet can precisely detect text instances of arbitrary shape while solving both problems. Specifically, PSENet generates different expansion kernels for each text instance and progressively expands the minimal kernel into the text instance with its complete shape. Since the geometric differences between minimal kernels are large, PSENet can effectively segment closely adjacent text instances and detect arbitrarily shaped text instances more easily. PSENet's effectiveness has been verified through many experiments on CTW1500, Total-Text, ICDAR 2015 and ICDAR 2017 MLT.

[Paper](https://openaccess.thecvf.com/content_CVPR_2019/html/Wang_Shape_Robust_Text_Detection_With_Progressive_Scale_Expansion_Network_CVPR_2019_paper.html): Wenhai Wang, Enze Xie, Xiang Li, Wenbo Hou, Tong Lu, Gang Yu, Shuai Shao; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2019, pp. 9336-9345

# PSENet Example

## Overview

The Progressive Scale Expansion Network (PSENet) is a text detector that can accurately detect arbitrarily shaped text in natural scenes.

# Dataset

Dataset used: [ICDAR2015](https://rrc.cvc.uab.es/?ch=4&com=tasks#TextLocalization)

Training set: 1000 images containing about 4500 readable words.
Test set: about 2000 readable words.

# Environment Requirements

- Hardware: Ascend processor
    - Set up the hardware environment with an Ascend processor.
- Framework
    - [MindSpore](https://www.mindspore.cn/install)
- For more information, see the following resources:
    - [MindSpore tutorials](https://www.mindspore.cn/tutorial/training/zh-CN/master/index.html)
    - [MindSpore Python API](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html)
- Install MindSpore
- Install [pybind11](https://github.com/pybind/pybind11)
- Install [OpenCV 3.4](https://docs.opencv.org/3.4.9/)

# Quick Start

After installing MindSpore via the official website, you can follow the steps below for training and evaluation:

```shell
# distributed training example
sh scripts/run_distribute_train.sh [RANK_TABLE_FILE] [PRED_TRAINED PATH] [TRAIN_ROOT_DIR]

# download the required libraries
download pybind11, opencv3.4

# install pybind11 and opencv3.4
setup pybind11 (install the library with the pip command)
setup opencv3.4 (compile the source code and install the library)

# click https://rrc.cvc.uab.es/?ch=4&com=tasks#TextLocalization to download the evaluation method
# click the "My Methods" button to download the evaluation script

# enter the path and run the Makefile to build the product files
cd ./src/ETSNET/pse/; make clean && make

# run test.py
python test.py --ckpt pretrained_model.ckpt --TEST_ROOT_DIR [test root path]

download script.py
# run the evaluation example
sh scripts/run_eval_ascend.sh
```

## Script Description

## Scripts and Sample Code

```path
└── PSENet
    ├── export.py                      // MindIR conversion script
    ├── mindspore_hub_conf.py          // network model
    ├── postprogress.py                # Ascend 310 inference post-processing script
    ├── README.md                      // PSENet description (English)
    ├── README_CN.md                   // PSENet description (Chinese)
    ├── scripts
        ├── run_distribute_train.sh    // shell script for distributed training
        └── run_eval_ascend.sh         // shell script for evaluation
        ├── run_infer_310.sh           # shell script for Ascend 310 inference
    ├── src
        ├── model_utils
            ├── config.py              # parameter configuration
            ├── device_adapter.py      # device-related information
            ├── local_adapter.py       # device-related information
            ├── moxing_adapter.py      # decorators (mainly used for ModelArts data copying)
        ├── dataset.py                 // create dataset
        ├── ETSNET
            ├── base.py                // convolution and BN operators
            ├── dice_loss.py           // compute the PSENet loss value
            ├── etsnet.py              // subnet in PSENet
            ├── fpn.py                 // subnet in PSENet
            ├── __init__.py
            ├── pse                    // subnet in PSENet
                ├── adaptor.cpp
                ├── adaptor.h
                ├── __init__.py
                ├── Makefile
            ├── resnet50               // subnet in PSENet
                ├── __init__.py
        ├── lr_schedule.py             // learning rate
        ├── network_define.py          // PSENet architecture
    ├── test.py                        // test script
    ├── train.py                       // training script
    ├── default_config.yaml            # parameter file
    ├── ma-pre-start.sh                # ModelArts system environment variable setup
```

## Script Parameters

```default_config.yaml
The main parameters in the configuration file are as follows:

-- pre_trained: whether to train from scratch or from a pretrained model. Possible values: True, False.
-- device_id: device ID used to train or evaluate the dataset, or to export. This parameter is ignored when train.sh is used for distributed training.
```

## Training Process

### Distributed Training

Distributed training requires an HCCL configuration file in JSON format to be created in advance.

Please follow the instructions at this [link](https://gitee.com/mindspore/mindspore/tree/master/model_zoo/utils/hccl_tools).

```shell
sh scripts/run_distribute_train.sh [RANK_TABLE_FILE] [PRED_TRAINED PATH] [TRAIN_ROOT_DIR]
```

The above shell script runs distributed training in the background. Results can be viewed in the `device[X]/test_*.log` files.

Loss values are reached in the following way:

```log
# grep "epoch:" device_*/loss.log
device_0/log:epoch: 1, step: 20,loss is 0.80383
device_0/log:epcoh: 2, step: 40,loss is 0.77951
...
device_1/log:epoch: 1, step: 20,loss is 0.78026
device_1/log:epcoh: 2, step: 40,loss is 0.76629
```

## Evaluation Process

### Running the Test Code

```test
python test.py --ckpt [CKPT PATH] --TEST_ROOT_DIR [TEST DATA DIR]
```

- To train the model on ModelArts, refer to the [official ModelArts documentation](https://support.huaweicloud.com/modelarts/) to start training and inference; the concrete steps are as follows:

```ModelArts
# Example of distributed training on ModelArts:
# Dataset layout
# ├── ICDAR2015                                           # dir
#     ├── train                                           # train dir
#         ├── ic15                                        # train_dataset dir
#             ├── ch4_training_images
#             ├── ch4_training_localization_transcription_gt
#         ├── train_predtrained                           # pretrained dir
#     ├── eval                                            # eval dir
#         ├── ic15                                        # eval dataset dir
#             ├── ch4_test_images
#             ├── challenge4_Test_Task1_GT
#         ├── checkpoint                                  # ckpt files dir

# (1) Choose either a (modify the yaml file parameters) or b (modify the parameters when creating the training job on ModelArts).
#       a. Set "enable_modelarts=True"
#          Set "run_distribute=True"
#          Set "TRAIN_MODEL_SAVE_PATH=/cache/train/outputs/"
#          Set "TRAIN_ROOT_DIR=/cache/data/ic15/"
#          Set "pre_trained=/cache/data/train_predtrained/pred file name"; if there are no pretrained weights, pre_trained=""
#       b. Add "enable_modelarts=True" on the ModelArts interface.
#          Set the parameters required by method a on the ModelArts interface.
#          Note: path parameters do not need quotation marks.
# (2) Set the path of the network configuration file "_config_path=/The path of config in default_config.yaml/"
# (3) Set the code path "/path/psenet" on the ModelArts interface.
# (4) Set the model's startup file "train.py" on the ModelArts interface.
# (5) Set the model's data path ".../ICDAR2015/train" (select the ICDAR2015/train folder path),
#     the model's "Output file path", and the "Job log path" on the ModelArts interface.
# (6) Start training the model.

# Example of model inference on ModelArts
# (1) Place the trained model in the corresponding location in the bucket.
# (2) Choose either a or b.
#       a. Set "enable_modelarts=True"
#          Set "TEST_ROOT_DIR=/cache/data/ic15"
#          Set "ckpt=/cache/data/checkpoint/ckpt file"
#       b. Add "enable_modelarts=True" on the ModelArts interface.
#          Set the parameters required by method a on the ModelArts interface.
#          Note: path parameters do not need quotation marks.
# (3) Set the path of the network configuration file "_config_path=/The path of config in default_config.yaml/"
# (4) Set the code path "/path/psenet" on the ModelArts interface.
# (5) Set the model's startup file "eval.py" on the ModelArts interface.
# (6) Set the model's data path "../ICDAR2015/eval" (select the ICDAR2015/eval folder path),
#     the model's "Output file path", and the "Job log path" on the ModelArts interface.
# (7) Start model inference.
```

### ICDAR2015 Evaluation Script

#### Usage

Step 1: Click [here](https://rrc.cvc.uab.es/?ch=4&com=tasks#TextLocalization) to download the evaluation method.

Step 2: Click the "My Methods" button to download the evaluation script.

Step 3: It is recommended to symlink the evaluation method root to $MINDSPORE/model_zoo/psenet/eval_ic15/. If your folder structure is different, you may need to change the corresponding paths in the evaluation script files.

```shell
sh scripts/run_eval_ascend.sh
```

#### Results

Calculated!{"precision": 0.8147966668299853,"recall":0.8006740491092923,"hmean":0.8076736279747451,"AP":0}

## Inference Process

### [Exporting MindIR](#contents)

```shell
python export.py --ckpt [CKPT_PATH] --file_name [FILE_NAME] --file_format [FILE_FORMAT]
```

The ckpt parameter is required, and `EXPORT_FORMAT` must be chosen from ["AIR", "MINDIR"].

- Exporting MindIR on ModelArts

```Modelarts
Example of exporting MindIR on ModelArts
The dataset layout is the same as for ModelArts training
# (1) Choose either a (modify the yaml file parameters) or b (modify the parameters when creating the training job on ModelArts).
#       a. Set "enable_modelarts=True"
#          Set "file_name=psenet"
#          Set "file_format=MINDIR"
#          Set "ckpt_file=/cache/data/checkpoint file name"
#       b. Add "enable_modelarts=True" on the ModelArts interface.
#          Set the parameters required by method a on the ModelArts interface.
#          Note: path parameters do not need quotation marks.
# (2) Set the path of the network configuration file "_config_path=/The path of config in default_config.yaml/"
# (3) Set the code path "/path/psenet" on the ModelArts interface.
# (4) Set the model's startup file "export.py" on the ModelArts interface.
# (5) Set the model's data path ".../ICDAR2015/eval/checkpoint" (select the ICDAR2015/eval/checkpoint folder path),
#     the MindIR "Output file path", and the "Job log path" on the ModelArts interface.
```

### Running Inference on Ascend 310

Before running inference, the MindIR file must be exported with the `export.py` script. The following shows an example of running inference with a MindIR model.
Currently only inference with batch_size 1 is supported. Before running inference, configure the environment as described in [Quick Start](#quick-start).

```shell
# Ascend 310 inference
bash run_infer_310.sh [MINDIR_PATH] [DATA_PATH] [DEVICE_ID]
```

- `DEVICE_ID` is optional and defaults to 0.

### Results

A `res` folder is generated in the parent directory of the run directory. For the final accuracy computation, refer to the [ICDAR2015 Evaluation Script](#icdar2015-evaluation-script).

# Model Description

## Performance

### Evaluation Performance

| Parameter | Ascend |
| -------------------------- | ----------------------------------------------------------- |
| Model version | PSENet |
| Resources | Ascend 910; CPU 2.60GHz, 192 cores; memory 755G; OS Euler2.8 |
| Upload date | 2020-09-15 |
| MindSpore version | 1.0.0 |
| Dataset | ICDAR2015 |
| Training parameters | start_lr=0.1; lr_scale=0.1 |
| Optimizer | SGD |
| Loss function | LossCallBack |
| Output | probability |
| Loss | 0.35 |
| Speed | 1 card: 444 ms/step; 8 cards: 446 ms/step |
| Total time | 1 card: 75.48 hours; 8 cards: 7.11 hours |
| Parameters (M) | 27.36 |
| Fine-tuned checkpoint | 109.44M (.ckpt file) |
| Script | <https://gitee.com/mindspore/mindspore/tree/master/model_zoo/official/cv/psenet> |

### Inference Performance

| Parameter | Ascend |
| ------------------- | --------------------------- |
| Model version | PSENet |
| Resources | Ascend 910; OS Euler2.8 |
| Upload date | 2020/09/15 |
| MindSpore version | 1.0.0 |
| Dataset | ICDAR2015 |
| Output | probability |
| Accuracy | 1 card: 81%; 8 cards: 81% |

## Usage

### Inference

If you need to run inference with the trained model on several hardware platforms such as GPU, Ascend 910 or Ascend 310, refer to [here](https://www.mindspore.cn/tutorial/training/zh-CN/master/advanced_use/migrate_3rd_scripts.html). An example of the procedure follows:

```python
# load an unseen dataset for inference
dataset = dataset.create_dataset(cfg.data_path, 1, False)

# define the model
config.INFERENCE = False
net = ETSNet(config)
net = net.set_train()
param_dict = load_checkpoint(args.pre_trained)
load_param_into_net(net, param_dict)
print('Load Pretrained parameters done!')

criterion = DiceLoss(batch_size=config.TRAIN_BATCH_SIZE)

lrs = lr_generator(start_lr=1e-3, lr_scale=0.1, total_iters=config.TRAIN_TOTAL_ITER)
opt = nn.SGD(params=net.trainable_params(), learning_rate=lrs, momentum=0.99, weight_decay=5e-4)

# wrap the model
net = WithLossCell(net, criterion)
net = TrainOneStepCell(net, opt)

time_cb = TimeMonitor(data_size=step_size)
loss_cb = LossCallBack(per_print_times=20)

# set and apply checkpoint parameters
ckpoint_cf = CheckpointConfig(save_checkpoint_steps=1875, keep_checkpoint_max=2)
ckpoint_cb = ModelCheckpoint(prefix="ETSNet", config=ckpoint_cf, directory=config.TRAIN_MODEL_SAVE_PATH)

model = Model(net)
model.train(config.TRAIN_REPEAT_NUM, ds, dataset_sink_mode=False, callbacks=[time_cb, loss_cb, ckpoint_cb])

# load the pretrained model
param_dict = load_checkpoint(cfg.checkpoint_path)
load_param_into_net(net, param_dict)
net.set_train(False)

# make predictions on the unseen dataset
acc = model.eval(dataset)
print("accuracy: ", acc)
```
29.140625
338
0.629133
yue_Hant
0.529901
edcefd49e7b3cee14745ee214ca68bb0ad76433b
590
md
Markdown
content/drafts/safety.md
jfw10973/sorg
70c1fdd30ae2e514ddefd710ad74d6e978bfe9c5
[ "MIT" ]
null
null
null
content/drafts/safety.md
jfw10973/sorg
70c1fdd30ae2e514ddefd710ad74d6e978bfe9c5
[ "MIT" ]
null
null
null
content/drafts/safety.md
jfw10973/sorg
70c1fdd30ae2e514ddefd710ad74d6e978bfe9c5
[ "MIT" ]
1
2020-09-02T09:23:08.000Z
2020-09-02T09:23:08.000Z
--- title: Active Versus Passive Safety in Production Systems published_at: 2017-10-13T15:08:44Z location: San Francisco hook: TODO --- Safety. ## Atomicity (#atomicity) This is where the importance of using a database with atomic guarantees comes in. If you're running MongoDB or a database anything like it, your system is actively safe. Data integrity problems are introduced as operations fail before they're supposed to, and you might have scripts designed to go through and fix them. More likely though, the active safety system at play is a human. ## Safety by human (#human)
23.6
58
0.776271
eng_Latn
0.998988
edd005fd33764b72da8f0314af4c4948b30c0d65
1,641
md
Markdown
README.md
chris060986/jenkins-dashboard
b8e80a26e424a23dbe1308ad95b44d1e4e0d54fd
[ "Apache-2.0" ]
null
null
null
README.md
chris060986/jenkins-dashboard
b8e80a26e424a23dbe1308ad95b44d1e4e0d54fd
[ "Apache-2.0" ]
null
null
null
README.md
chris060986/jenkins-dashboard
b8e80a26e424a23dbe1308ad95b44d1e4e0d54fd
[ "Apache-2.0" ]
null
null
null
# Jenkins-Dashboard

[![Build Status](https://travis-ci.org/chris060986/jenkins-dashboard.svg?branch=master)](https://travis-ci.org/chris060986/jenkins-dashboard)

A build dashboard for Jenkins, for everyone who is not allowed to install Jenkins plugins.

## Motivation

Target users are **NOT** admin users who set up their own Jenkins for a single project and are free to do whatever they want with their Jenkins installation. They should have a look at the [Jenkins Build Monitor](https://github.com/jan-molak/jenkins-build-monitor-plugin/) instead. This application is for users with no admin access to Jenkins and no possibility to install or update plugins, for example in companies where the build server is provided as a service for all development teams. The Jenkins Dashboard doesn't claim to be a full monitor of all builds and their history (that is provided by Jenkins itself), but it should give a quick and easy overview of whether all builds have passed. You should be able to get this information while just walking by the dashboard.

## How to use

TODO

### Start demo with public jenkins

TODO

### Configuration

TODO

## Docker

Every build publishes a Docker image on [Dockerhub](https://hub.docker.com/r/chris060986/jenkins-dashboard/).

### Image usage

TODO

## Restrictions

The app may be slow if many builds have to be loaded. There is no caching or anything else at the moment.

## Features - not implemented yet

- Fixing layout
- Configuring URL in yaml file
- Configure which projects to see
- Making multi-project layout
- Using SSL
- Caching or no reloading of unchanged builds
- Use environment variables to make containerization possible
38.162791
278
0.780622
eng_Latn
0.994875
edd1fa7970fd748c6aa640617bbe109954b8aacc
4,627
md
Markdown
docs/develop/itp-and-third-party-cookies.md
OfficeDev/office-js-docs-pr.ru-ru
b63c841a71fa33f0cee2a248f03f0b27bc9d4c07
[ "CC-BY-4.0", "MIT" ]
1
2021-01-25T15:03:29.000Z
2021-01-25T15:03:29.000Z
docs/develop/itp-and-third-party-cookies.md
OfficeDev/office-js-docs-pr.ru-ru
b63c841a71fa33f0cee2a248f03f0b27bc9d4c07
[ "CC-BY-4.0", "MIT" ]
2
2022-02-15T02:34:57.000Z
2022-02-15T02:35:42.000Z
docs/develop/itp-and-third-party-cookies.md
OfficeDev/office-js-docs-pr.ru-ru
b63c841a71fa33f0cee2a248f03f0b27bc9d4c07
[ "CC-BY-4.0", "MIT" ]
2
2018-07-30T19:13:36.000Z
2020-11-15T16:33:46.000Z
--- title: Develop your Office Add-in to work with ITP when using third-party cookies description: Working with ITP and Office Add-ins when using third-party cookies ms.date: 07/8/2021 ms.localizationpriority: medium ms.openlocfilehash: d8216e2945acf1b87306bb00b7fb868728a986bc ms.sourcegitcommit: 1306faba8694dea203373972b6ff2e852429a119 ms.translationtype: MT ms.contentlocale: ru-RU ms.lasthandoff: 09/12/2021 ms.locfileid: "59151010" ---

# <a name="develop-your-office-add-in-to-work-with-itp-when-using-third-party-cookies"></a>Develop your Office Add-in to work with ITP when using third-party cookies

If your Office Add-in requires third-party cookies, those cookies are blocked if Intelligent Tracking Prevention (ITP) is used by the browser runtime that loads the add-in. You may be using third-party cookies to authenticate users, or for other scenarios, such as storing settings.

If your Office Add-in and website must rely on third-party cookies, use the following steps to work with ITP.

1. Set up [OAuth 2.0 authorization](https://tools.ietf.org/html/rfc6749) so that the authenticating domain (in your case, the third party expecting cookies) forwards an authorization token to your website. Use the token to create a login session with a server-side [Secure and HttpOnly cookie](https://developer.mozilla.org/docs/Web/HTTP/Cookies#Secure_and_HttpOnly_cookies).
2. Use the [Storage Access API](https://webkit.org/blog/8124/introducing-storage-access-api/) so the third-party content can request permission to access its first-party cookies. Current versions of Office on Mac and Office on the web support this API.

> [!NOTE]
> If you use cookies for purposes other than authentication, consider using `localStorage` for your scenario.

The following code example shows how to use the Storage Access API.

```javascript
function displayLoginButton() {
  var button = createLoginButton();
  button.addEventListener("click", function(ev) {
    document.requestStorageAccess().then(function() {
      authenticateWithCookies();
    }).catch(function() {
      // User must have previously interacted with this domain loaded in a top frame
      // Also you should have previously written a cookie when domain was loaded in the top frame
      console.error("User cancelled or requirements were not met.");
    });
  });
}

if (document.hasStorageAccess) {
  document.hasStorageAccess().then(function(hasStorageAccess) {
    if (!hasStorageAccess) {
      displayLoginButton();
    } else {
      authenticateWithCookies();
    }
  });
} else {
  authenticateWithCookies();
}
```

## <a name="about-itp-and-third-party-cookies"></a>About ITP and third-party cookies

Third-party cookies are cookies that are loaded in an iframe whose domain is different from the top-level frame. ITP can affect complex authentication scenarios, where a popup dialog is used to enter credentials and the add-in iframe then needs cookie access to complete the authentication flow. ITP can also affect silent authentication scenarios, where a popup dialog was previously used for authentication, but subsequent uses of the add-in try to authenticate through a hidden iframe.

When developing Office Add-ins on Mac, access to third-party cookies is blocked by the macOS Big Sur SDK. This happens because ITP for WKWebView is enabled by default in the Safari browser, and WKWebView blocks all third-party cookies. Office on Mac version 16.44 or later is integrated with the macOS Big Sur SDK.

In the Safari browser, end users can toggle the **Prevent cross-site tracking** checkbox under **Preferences** > **Privacy** to turn off ITP. However, ITP cannot be turned off for the embedded WKWebView control.

## <a name="see-also"></a>See also

- [Handle ITP in Safari and other browsers where third-party cookies are blocked](/azure/active-directory/develop/reference-third-party-cookies-spas)
- [Tracking Prevention in WebKit](https://webkit.org/tracking-prevention/)
- [Chromium's "Privacy Sandbox"](https://blog.chromium.org/2020/01/building-more-private-web-path-towards.html)
- [Introducing the Storage Access API](https://blogs.windows.com/msedgedev/2020/07/08/introducing-storage-access-api/)
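As a sketch of the `localStorage` alternative suggested for non-authentication state such as settings, the helpers below are illustrative (the in-memory fallback exists only so the snippet also runs outside a browser):

```javascript
// Sketch: persist non-authentication settings in localStorage instead of cookies.
// In a browser, `storage` is localStorage; elsewhere, a tiny in-memory stand-in.
const storage = typeof localStorage !== "undefined" ? localStorage : (function () {
  const m = new Map();
  return {
    getItem: function (k) { return m.has(k) ? m.get(k) : null; },
    setItem: function (k, v) { m.set(k, String(v)); }
  };
})();

function saveSetting(key, value) {
  // JSON-encode so non-string values round-trip.
  storage.setItem(key, JSON.stringify(value));
}

function loadSetting(key, fallback) {
  const raw = storage.getItem(key);
  return raw === null ? fallback : JSON.parse(raw);
}

saveSetting("theme", "dark");
```

Unlike third-party cookies, this storage belongs to the add-in's own origin, so ITP does not block it.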
70.106061
598
0.7949
rus_Cyrl
0.865544
edd25f008b04800cbe4b37cc3ac7ba3e20006770
8,361
md
Markdown
server-2013/get-csarchivingpolicy.md
v-rajagt/OfficeDocs-SkypeforBusiness-Test-pr.zh-cn
eab3686e8cbc09adec3e81749bfcafc598f66fb9
[ "CC-BY-4.0", "MIT" ]
3
2020-05-19T19:27:53.000Z
2022-02-19T00:00:24.000Z
server-2013/get-csarchivingpolicy.md
v-rajagt/OfficeDocs-SkypeforBusiness-Test-pr.zh-cn
eab3686e8cbc09adec3e81749bfcafc598f66fb9
[ "CC-BY-4.0", "MIT" ]
30
2018-05-30T19:12:05.000Z
2018-08-24T10:54:53.000Z
server-2013/get-csarchivingpolicy.md
v-rajagt/OfficeDocs-SkypeforBusiness-Test-pr.zh-cn
eab3686e8cbc09adec3e81749bfcafc598f66fb9
[ "CC-BY-4.0", "MIT" ]
18
2018-05-02T08:27:54.000Z
2021-11-15T11:24:05.000Z
--- title: Get-CsArchivingPolicy TOCTitle: Get-CsArchivingPolicy ms:assetid: 25d7de86-871d-4f07-8825-028137365435 ms:mtpsurl: https://technet.microsoft.com/zh-cn/library/Gg425731(v=OCS.15) ms:contentKeyID: 49312277 ms.date: 05/19/2016 mtps_version: v=OCS.15 ms.translationtype: HT --- # Get-CsArchivingPolicy   _**Topic Last Modified:** 2015-03-09_ Returns information about your instant messaging (IM) session archiving policies. Archiving policies enable you to archive all IM and web conferencing sessions that take place between internal users and/or between internal users and external users. This cmdlet was introduced in Lync Server 2010. ## Syntax Get-CsArchivingPolicy [-Identity <XdsIdentity>] <COMMON PARAMETERS> Get-CsArchivingPolicy [-Filter <String>] <COMMON PARAMETERS> COMMON PARAMETERS: [-LocalStore <SwitchParameter>] ## Examples ## EXAMPLE 1 Example 1 calls the **Get-CsArchivingPolicy** cmdlet without any parameters. This returns a collection of all the archiving policies currently in use in your organization. Get-CsArchivingPolicy ## EXAMPLE 2 In Example 2, the **Get-CsArchivingPolicy** cmdlet is used to return the archiving policy with the Identity site:Redmond. Because identities must be unique, this command will always return, at most, a single policy. Get-CsArchivingPolicy -Identity site:Redmond ## EXAMPLE 3 Example 3 returns a collection of all the archiving policies that have been configured at the per-user scope. This is done by including the Filter parameter and the filter value "tag:\*". That filter value instructs the **Get-CsArchivingPolicy** cmdlet to return only those policies that have an identity beginning with the string value "tag:". Get-CsArchivingPolicy -Filter tag:* ## EXAMPLE 4 Example 4 returns a collection of all the archiving policies where the archiving of internal IM sessions has been disabled. To do this, the **Get-CsArchivingPolicy** cmdlet is first used to return a collection of all the archiving policies currently in use.
That collection is then piped to the **Where-Object** cmdlet. In turn, the **Where-Object** cmdlet applies a filter that restricts the returned data to those policies where the ArchiveInternal property is equal to False. Get-CsArchivingPolicy | Where-Object {$_.ArchiveInternal -eq $False} ## EXAMPLE 5 Example 5 is similar to Example 4; in this case, however, the command returns all of the archiving policies where both internal and external archiving have been disabled. To accomplish this, the **Get-CsArchivingPolicy** cmdlet is first used to return a collection of all the archiving policies currently in use. That collection is then piped to the **Where-Object** cmdlet, which picks out only those policies where both the ArchiveInternal and the ArchiveExternal properties are equal to False. The -and operator instructs the **Where-Object** cmdlet to only select policies that meet all the specified criteria. To select policies that meet just one (or both) of the specified criteria use the –or operator: Get-CsArchivingPolicy | Where-Object {$_.ArchiveInternal -eq $False -and $_.ArchiveExternal -eq $False} ## Detailed Description Many organizations find it useful to keep an archive of all the IM sessions that their users take part in; other organizations are legally required to keep such an archive. In order to archive IM sessions with Lync Server, you must perform two steps. First, you need to enable archiving at the global and/or the site scope by using the **Set-CsArchivingConfiguration** cmdlet. This gives you the ability to archive IM sessions; however, it does not automatically begin archiving those sessions. Instead, to actually save transcripts of your IM sessions you must complete step 2: create one or more IM session archiving policies. These policies determine which users will have their IM sessions recorded and which type of IM sessions (internal and/or external) will be archived. 
Internal IM sessions are sessions where all of the participants are authenticated users who have Active Directory accounts within your organization; external IM sessions are sessions where at least one participant is an unauthenticated user who does not have an Active Directory account within your organization. You can choose to archive only internal sessions, only external sessions, or both internal and external sessions. Archiving policies (which are created with the **New-CsArchivingPolicy** cmdlet) can be assigned to the global site or to the site scope. In addition, these policies can be assigned to the per-user scope; this means that a policy can be created and then applied to a specific user or a specific set of users. For example, you might have a global policy that archives internal IM sessions for all your users. In addition, you might create a second policy, one that archives both internal and external sessions, and apply that policy to your sales staff. Because per-user policies take precedence over global and site policies, members of the sales staff will have all their IM sessions archived. Other users (users who are not part of the sales department and are not affected by the sales policy) will have only their internal IM sessions archived. The **Get-CsArchivingPolicy** cmdlet provides a way for you to return information about the archiving policies that have been configured for use in your organization. Keep in mind that these policies are enforced only if IM session archiving has been enabled at the global or site scope. To determine whether or not IM session archiving has been enabled, use the **Get-CsArchivingConfiguration** cmdlet. Who can run this cmdlet: By default, members of the following groups are authorized to run the **Get-CsArchivingPolicy** cmdlet locally: RTCUniversalUserAdmins, RTCUniversalServerAdmins. 
To return a list of all the role-based access control (RBAC) roles this cmdlet has been assigned to (including any custom RBAC roles you have created yourself), run the following command from the Windows PowerShell prompt: Get-CsAdminRole | Where-Object {$\_.Cmdlets –match "Get-CsArchivingPolicy"} ## Parameters <table> <colgroup> <col style="width: 25%" /> <col style="width: 25%" /> <col style="width: 25%" /> <col style="width: 25%" /> </colgroup> <thead> <tr class="header"> <th>Parameter</th> <th>Required</th> <th>Type</th> <th>Description</th> </tr> </thead> <tbody> <tr class="odd"> <td><p><em>Filter</em></p></td> <td><p>Optional</p></td> <td><p>System.String</p></td> <td><p>Enables you to use wildcard characters when indicating the policy (or policies) to be returned. For example, to return all of the policies configured at the site scope, use this syntax: -Filter &quot;site:*&quot;. This returns any policy where the Identity (the only property you can filter on) begins with the string value &quot;site:&quot;. To return a collection of all the per-user policies that have an Identity that begins with &quot;Sales&quot;, use this syntax: -Filter &quot;Sales*&quot;.</p></td> </tr> <tr class="even"> <td><p><em>Identity</em></p></td> <td><p>Optional</p></td> <td><p>Microsoft.Rtc.Management.Xds.XdsIdentity</p></td> <td><p>Unique identifier of the archiving policy to be returned. To refer to the global policy, use this syntax: -Identity global. To refer to a site policy, use syntax similar to this: -Identity site:Redmond. To refer to a per-user policy, use syntax similar to this: -Identity RedmondArchivingPolicy. 
If this parameter is omitted, then all of the archiving policies configured for use in your organization will be returned.</p></td>
</tr>
<tr class="odd">
<td><p><em>LocalStore</em></p></td>
<td><p>Optional</p></td>
<td><p>System.Management.Automation.SwitchParameter</p></td>
<td><p>Retrieves the archiving policy data from the local replica of the Central Management store rather than from the Central Management store itself.</p></td>
</tr>
</tbody>
</table>

## Input Types

None. The **Get-CsArchivingPolicy** cmdlet does not accept pipelined input.

## Return Types

The **Get-CsArchivingPolicy** cmdlet returns instances of the Microsoft.Rtc.Management.WritableConfig.Policy.Im.IMArchivingPolicy object.

## See Also

#### Additional Resources

[Grant-CsArchivingPolicy](grant-csarchivingpolicy.md)
[New-CsArchivingPolicy](new-csarchivingpolicy.md)
[Remove-CsArchivingPolicy](remove-csarchivingpolicy.md)
[Set-CsArchivingPolicy](set-csarchivingpolicy.md)
63.340909
848
0.77706
eng_Latn
0.991013
edd384f8159b977c94c715496bd8f44b0eb61357
13,252
md
Markdown
articles/virtual-machines/linux/image-builder-vnet.md
niklasloow/azure-docs.sv-se
31144fcc30505db1b2b9059896e7553bf500e4dc
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-machines/linux/image-builder-vnet.md
niklasloow/azure-docs.sv-se
31144fcc30505db1b2b9059896e7553bf500e4dc
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-machines/linux/image-builder-vnet.md
niklasloow/azure-docs.sv-se
31144fcc30505db1b2b9059896e7553bf500e4dc
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Use Azure Image Builder for Linux VMs allowing access to an existing Azure VNET (preview)
description: Create Linux VM images with Azure Image Builder allowing access to an existing Azure VNET
author: danielsollondon
ms.author: danis
ms.date: 08/10/2020
ms.topic: how-to
ms.service: virtual-machines-linux
ms.subservice: imaging
ms.reviewer: danis
ms.openlocfilehash: f216b6fa3a0e43c1c0313baa4f8414546a74d8f0
ms.sourcegitcommit: d8b8768d62672e9c287a04f2578383d0eb857950
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 08/11/2020
ms.locfileid: "88068337"
---
# <a name="use-azure-image-builder-for-linux-vms-allowing-access-to-an-existing-azure-vnet"></a>Use Azure Image Builder for Linux VMs allowing access to an existing Azure VNET

This article shows you how to use Azure Image Builder to create a basic customized Linux image that has access to existing resources in a VNET. The build VM you create is deployed to a new or existing VNET that you specify in your subscription. When you use an existing Azure VNET, the Azure Image Builder service does not require public network connectivity.

> [!IMPORTANT]
> Azure Image Builder is currently in public preview.
> This preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities. For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).

[!INCLUDE [cloud-shell-try-it.md](../../../includes/cloud-shell-try-it.md)]

## <a name="register-the-features"></a>Register the features

First, you must register for the Azure Image Builder service. Registration grants permission to create, manage, and delete a staging resource group. The service also has rights to add resources to the group that are required for the image build.

```azurecli-interactive
az feature register --namespace Microsoft.VirtualMachineImages --name VirtualMachineTemplatePreview
```

## <a name="set-variables-and-permissions"></a>Set variables and permissions

You will be using some pieces of information repeatedly. Create some variables to store that information.

```azurecli-interactive
# set your environment variables here!!!!

# destination image resource group
imageResourceGroup=aibImageRG01

# location (see possible locations in main docs)
location=WestUS2

# your subscription
# get the current subID : 'az account show | grep id'
subscriptionID=$(az account show | grep id | tr -d '",' | cut -c7-)

# name of the image to be created
imageName=aibCustomLinuxImg01

# image distribution metadata reference name
runOutputName=aibCustLinManImg01ro

# VNET properties (update to match your existing VNET, or leave as-is for demo)

# VNET name
vnetName=myexistingvnet01

# subnet name
subnetName=subnet01

# VNET resource group name
# NOTE! The VNET must always be in the same region as the AIB service region.
vnetRgName=existingVnetRG

# Existing Subnet NSG Name or the demo will create it
nsgName=aibdemoNsg
```

Create the resource group.

```azurecli-interactive
az group create -n $imageResourceGroup -l $location
```

## <a name="configure-networking"></a>Configure networking

If you don't have an existing VNET\Subnet\NSG, you can use the following script to create one.

```bash
# Create a resource group
az group create -n $vnetRgName -l $location

# Create VNET
az network vnet create \
  --resource-group $vnetRgName \
  --name $vnetName --address-prefix 10.0.0.0/16 \
  --subnet-name $subnetName --subnet-prefix 10.0.0.0/24

# Create base NSG to simulate an existing NSG
az network nsg create -g $vnetRgName -n $nsgName

az network vnet subnet update \
  --resource-group $vnetRgName \
  --vnet-name $vnetName \
  --name $subnetName \
  --network-security-group $nsgName

# NOTE! The VNET must always be in the same region as the Azure Image Builder service region.
```

### <a name="add-network-security-group-rule"></a>Add network security group rule

This rule allows connectivity from the Azure Image Builder load balancer to the proxy VM. Port 60001 is for Linux OSs and port 60000 is for Windows OSs. The proxy VM connects to the build VM using port 22 for Linux OSs, or port 5986 for Windows OSs.

```azurecli-interactive
az network nsg rule create \
  --resource-group $vnetRgName \
  --nsg-name $nsgName \
  -n AzureImageBuilderNsgRule \
  --priority 400 \
  --source-address-prefixes AzureLoadBalancer \
  --destination-address-prefixes VirtualNetwork \
  --destination-port-ranges 60000-60001 --direction inbound \
  --access Allow --protocol Tcp \
  --description "Allow Image Builder Private Link Access to Proxy VM"
```

### <a name="disable-private-service-policy-on-subnet"></a>Disable private service policy on subnet

```azurecli-interactive
az network vnet subnet update \
  --name $subnetName \
  --resource-group $vnetRgName \
  --vnet-name $vnetName \
  --disable-private-link-service-network-policies true
```

For more information on Image Builder networking, see [Azure Image Builder service networking options](image-builder-networking.md).

## <a name="modify-the-example-template-and-create-role"></a>Modify the example template and create the role

```bash
# download the example and configure it with your vars

curl https://raw.githubusercontent.com/danielsollondon/azvmimagebuilder/master/quickquickstarts/1a_Creating_a_Custom_Linux_Image_on_Existing_VNET/existingVNETLinux.json -o existingVNETLinux.json
curl https://raw.githubusercontent.com/danielsollondon/azvmimagebuilder/master/solutions/12_Creating_AIB_Security_Roles/aibRoleNetworking.json -o aibRoleNetworking.json
curl https://raw.githubusercontent.com/danielsollondon/azvmimagebuilder/master/solutions/12_Creating_AIB_Security_Roles/aibRoleImageCreation.json -o aibRoleImageCreation.json

sed -i -e "s/<subscriptionID>/$subscriptionID/g" existingVNETLinux.json
sed -i -e "s/<rgName>/$imageResourceGroup/g" existingVNETLinux.json
sed -i -e "s/<region>/$location/g" existingVNETLinux.json
sed -i -e "s/<imageName>/$imageName/g" existingVNETLinux.json
sed -i -e "s/<runOutputName>/$runOutputName/g" existingVNETLinux.json
sed -i -e "s/<vnetName>/$vnetName/g" existingVNETLinux.json
sed -i -e "s/<subnetName>/$subnetName/g" existingVNETLinux.json
sed -i -e "s/<vnetRgName>/$vnetRgName/g" existingVNETLinux.json

sed -i -e "s/<subscriptionID>/$subscriptionID/g" aibRoleImageCreation.json
sed -i -e "s/<rgName>/$imageResourceGroup/g" aibRoleImageCreation.json
sed -i -e "s/<subscriptionID>/$subscriptionID/g" aibRoleNetworking.json
sed -i -e "s/<vnetRgName>/$vnetRgName/g" aibRoleNetworking.json
```

## <a name="set-permissions-on-the-resource-group"></a>Set permissions on the resource group

Image Builder uses the [user-identity](https://docs.microsoft.com/azure/active-directory/managed-identities-azure-resources/qs-configure-cli-windows-vm#user-assigned-managed-identity) provided to inject the image into the Shared Image Gallery (SIG). In this example, you will create an Azure role definition that has the granular actions required to distribute the image to the SIG. The role definition will then be assigned to the user-identity.

```bash
# create user assigned identity for image builder

idenityName=aibBuiUserId$(date +'%s')
az identity create -g $imageResourceGroup -n $idenityName

# get identity id
imgBuilderCliId=$(az identity show -g $imageResourceGroup -n $idenityName | grep "clientId" | cut -c16- | tr -d '",')

# get the user identity URI, needed for the template
imgBuilderId=/subscriptions/$subscriptionID/resourcegroups/$imageResourceGroup/providers/Microsoft.ManagedIdentity/userAssignedIdentities/$idenityName

# update the template
sed -i -e "s%<imgBuilderId>%$imgBuilderId%g" existingVNETLinux.json

# make role name unique, to avoid clashes in the same Azure Active Directory domain
imageRoleDefName="Azure Image Builder Image Def"$(date +'%s')
netRoleDefName="Azure Image Builder Network Def"$(date +'%s')

# update the definitions
sed -i -e "s/Azure Image Builder Service Image Creation Role/$imageRoleDefName/g" aibRoleImageCreation.json
sed -i -e "s/Azure Image Builder Service Networking Role/$netRoleDefName/g" aibRoleNetworking.json
```

To give Image Builder more granular, lower-privilege permissions, you create two roles. One gives the builder permission to create an image; the other allows it to connect the build VM and load balancer to your VNET.

```bash
# create role definitions
az role definition create --role-definition ./aibRoleImageCreation.json
az role definition create --role-definition ./aibRoleNetworking.json

# grant role definition to the user assigned identity
az role assignment create \
  --assignee $imgBuilderCliId \
  --role $imageRoleDefName \
  --scope /subscriptions/$subscriptionID/resourceGroups/$imageResourceGroup

az role assignment create \
  --assignee $imgBuilderCliId \
  --role $netRoleDefName \
  --scope /subscriptions/$subscriptionID/resourceGroups/$vnetRgName
```

For more information on permissions, see [Configure Azure Image Builder Service permissions using Azure CLI](image-builder-permissions-cli.md) or [Configure Azure Image Builder Service permissions using PowerShell](image-builder-permissions-powershell.md).

## <a name="create-the-image"></a>Create the image

Submit the image configuration to the Azure Image Builder service.

```azurecli-interactive
az resource create \
  --resource-group $imageResourceGroup \
  --properties @existingVNETLinux.json \
  --is-full-object \
  --resource-type Microsoft.VirtualMachineImages/imageTemplates \
  -n existingVNETLinuxTemplate01

# Wait approximately 1-3 mins (validation, permissions etc.)
```

Start the image build.

```azurecli-interactive
az resource invoke-action \
  --resource-group $imageResourceGroup \
  --resource-type Microsoft.VirtualMachineImages/imageTemplates \
  -n existingVNETLinuxTemplate01 \
  --action Run

# Wait approximately 15 mins
```

Creating the image and replicating it to both regions can take a while. Wait until this part is finished before moving on to creating a VM.

## <a name="create-the-vm"></a>Create the VM

Create a VM from the image version that was created by Azure Image Builder.

```azurecli-interactive
az vm create \
  --resource-group $imageResourceGroup \
  --name aibImgVm0001 \
  --admin-username aibuser \
  --image $imageName \
  --location $location \
  --generate-ssh-keys
```

SSH into the VM.

```bash
ssh aibuser@<publicIpAddress>
```

You should see the image was customized with a Message of the Day as soon as your SSH connection is established!

```console
*******************************************************
**          This VM was built from the:              **
**     !! AZURE VM IMAGE BUILDER Custom Image !!     **
**         You have just been Customized :-)         **
*******************************************************
```

## <a name="clean-up-resources"></a>Clean up resources

If you now want to try re-customizing the image version to create a new version of the same image, skip the next steps and go on to [Use Azure Image Builder to create another image version](image-builder-gallery-update-image-version.md).

The following deletes the image that was created, along with all of the other resource files. Make sure you are finished with this deployment before deleting the resources.

When deleting image gallery resources, you need to delete all of the image versions before you can delete the image definition used to create them. To delete a gallery, you first need to delete all of the image definitions in the gallery.

Delete the Image Builder template.

```azurecli
az resource delete \
  --resource-group $imageResourceGroup \
  --resource-type Microsoft.VirtualMachineImages/imageTemplates \
  -n existingVNETLinuxTemplate01
```

Delete the permission assignments, roles, and identity.

```azurecli-interactive
az role assignment delete \
  --assignee $imgBuilderCliId \
  --role $imageRoleDefName \
  --scope /subscriptions/$subscriptionID/resourceGroups/$imageResourceGroup

az role assignment delete \
  --assignee $imgBuilderCliId \
  --role $netRoleDefName \
  --scope /subscriptions/$subscriptionID/resourceGroups/$vnetRgName

az role definition delete --name "$imageRoleDefName"
az role definition delete --name "$netRoleDefName"

az identity delete --ids $imgBuilderId
```

Delete the resource group.

```azurecli-interactive
az group delete -n $imageResourceGroup
```

If you created a VNET for this quickstart, you can delete the VNET if it is no longer in use.

## <a name="next-steps"></a>Next steps

Learn more about [Azure Shared Image Galleries](shared-image-galleries.md).
41.4125
489
0.770223
swe_Latn
0.634613
edd3964f12434f04fbd6e4e97ac957f86f1c3c01
1,035
md
Markdown
README.md
alinebastos/fullstack-challenges
7ee59f5716157997c5a65ef6bed0062c9bd2e7d6
[ "MIT" ]
345
2017-10-17T00:07:50.000Z
2022-03-29T12:23:04.000Z
README.md
Edlaine-Pontes/fullstack-challenges
7ee59f5716157997c5a65ef6bed0062c9bd2e7d6
[ "MIT" ]
1
2020-10-12T09:34:24.000Z
2020-10-12T09:34:24.000Z
README.md
Edlaine-Pontes/fullstack-challenges
7ee59f5716157997c5a65ef6bed0062c9bd2e7d6
[ "MIT" ]
45
2017-10-17T23:52:55.000Z
2022-03-05T19:58:56.000Z
# Fullstack Challenges

Open source challenges from fullstack jobs to test your skills. Only fullstack challenges. Only open source.

### EN :us:

- [Finimize](https://github.com/finimize/fullstack-dev-challenge)
- [Itexico](https://github.com/itexico/interview-codingchallenge-fsjs)
- [RivetAI](https://github.com/endcue/fullstack-challenge)
- [Techstars](https://github.com/techstars/full-stack-challenge)
- [Unbabel](https://github.com/Unbabel/fullstack-coding-challenge)
- [AstroCoders](https://github.com/Astrocoders/fullstack-challenge)

### PT :brazil:

- [Cubo](https://github.com/cubonetwork/fullstack-challenge)
- [Entria](https://github.com/entria/jobs/blob/master/fullstack/challenge.md)
- [Estudar com Você](https://github.com/estudarcomvoce/fullstack-challenge)
- [Hospede](https://github.com/hospede/fullstack-challenge)
- [iFood](https://github.com/ifood/ifood-fullstack-test)
- [RoutEasy](https://github.com/RoutEasy/challenge-fullstack)
- [Softplan](https://github.com/g-cpa-squad-produto/softplan-desafio-fullstack)
45
79
0.766184
kor_Hang
0.134257
edd4319ade7e0832df60c27fda0bca305b8b2259
7,498
md
Markdown
articles/attestation/overview.md
keowu/azure-docs.pt-br
b233c381f62695efe522874b868e0320a3384391
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/attestation/overview.md
keowu/azure-docs.pt-br
b233c381f62695efe522874b868e0320a3384391
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/attestation/overview.md
keowu/azure-docs.pt-br
b233c381f62695efe522874b868e0320a3384391
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Azure Attestation overview
description: An overview of Microsoft Azure Attestation, a solution used to attest Trusted Execution Environments (TEEs)
services: attestation
author: msmbaldwin
ms.service: attestation
ms.topic: overview
ms.date: 08/31/2020
ms.author: mbaldwin
ms.custom: references_regions
ms.openlocfilehash: 6a587ecbe7ff67908b22d4f2429cfdd0c511e07d
ms.sourcegitcommit: 003ac3b45abcdb05dc4406661aca067ece84389f
ms.translationtype: HT
ms.contentlocale: pt-BR
ms.lasthandoff: 12/07/2020
ms.locfileid: "96748766"
---
# <a name="microsoft-azure-attestation-preview"></a>Microsoft Azure Attestation (preview)

Microsoft Azure Attestation (preview) is a unified solution for remotely verifying the trustworthiness of a platform and the integrity of the binaries running inside it. The service supports attestation of platforms backed by Trusted Platform Modules (TPMs), alongside the ability to attest Trusted Execution Environments (TEEs) such as [Intel® Software Guard Extensions](https://www.intel.com/content/www/us/en/architecture-and-technology/software-guard-extensions.html) (SGX) enclaves and [Virtualization-based Security](/windows-hardware/design/device-experiences/oem-vbs) (VBS) enclaves.

Attestation is a process used to demonstrate that software binaries were properly instantiated on a trusted platform. Remote relying parties can then gain confidence that only such intended software is running on trusted hardware. Azure Attestation is a unified customer-facing service and framework for attestation. Azure Attestation enables cutting-edge security paradigms such as [Azure Confidential Computing](../confidential-computing/overview.md) and Intelligent Edge protection.

Customers have been requesting the ability to independently verify the location of a machine, the posture of a virtual machine (VM) on that machine, and the environment within which enclaves are running on that VM. Azure Attestation will empower these and many additional customer requests.

Azure Attestation receives evidence from compute entities, turns it into a set of claims, validates the claims against configurable policies, and produces cryptographic proofs for claims-based applications (for example, relying parties and auditing authorities).

## <a name="use-cases"></a>Use cases

Azure Attestation provides comprehensive attestation services for multiple environments and distinctive use cases.

### <a name="sgx-attestation"></a>SGX attestation

SGX refers to hardware-grade isolation, which is supported on certain Intel CPU models. SGX enables code to run in sanitized compartments known as SGX enclaves. Access and memory permissions are then managed by hardware to ensure a minimal attack surface with proper isolation.

Client applications can be designed to take advantage of SGX enclaves by delegating security-sensitive tasks to take place inside those enclaves. Such applications can then make use of Azure Attestation to routinely establish trust in the enclave and its ability to access sensitive data.

### <a name="open-enclave"></a>Open Enclave

[Open Enclave](https://openenclave.io/sdk/) (OE) is a collection of libraries targeted at creating a single unified enclaving abstraction for developers to build TEE-based applications. It offers a universal secure app model that minimizes platform specificities. Microsoft views it as an essential stepping-stone toward democratizing hardware-based enclave technologies such as SGX and increasing their adoption on Azure.

OE standardizes the specific requirements for verifying enclave evidence. This qualifies OE as a highly fitting attestation consumer of Azure Attestation.

## <a name="azure-attestation-can-run-in-a-tee"></a>Azure Attestation can run in a TEE

Azure Attestation is critical to Confidential Computing scenarios, as it performs the following actions:

- Verifies whether the enclave evidence is valid.
- Evaluates the enclave evidence against a customer-defined policy.
- Manages and stores tenant-specific policies.
- Generates and signs a token that is used by relying parties to interact with the enclave.

Azure Attestation is built to run in two types of environments:

- Azure Attestation running in an SGX-enabled TEE.
- Azure Attestation running in a non-TEE.

Azure Attestation customers have expressed a requirement for Microsoft to be operationally out of the trusted computing base (TCB). This is to prevent Microsoft entities such as VM admins, host admins, and Microsoft developers from modifying attestation requests, policies, and tokens issued by Azure Attestation. Azure Attestation is also built to run in a TEE, where Azure Attestation features such as quote validation, token generation, and token signing are moved into an SGX enclave.
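Azure Attestation's core loop (verify evidence, evaluate the resulting claims against a policy, then issue a signed token) can be modeled in miniature. The Python sketch below is a hypothetical illustration only: the claim names, the policy format, and the HMAC-based "signature" are invented for this example and are not the Azure Attestation API, which uses hardware-rooted asymmetric keys and a dedicated policy language.

```python
import hashlib
import hmac
import json

# Hypothetical signing key for the sketch; a real attestation service signs
# tokens with asymmetric keys, not a shared secret.
SIGNING_KEY = b"demo-only-secret"


def evaluate(claims, policy):
    """Return True when every claim required by the policy is present and matches."""
    return all(claims.get(name) == value for name, value in policy.items())


def issue_token(claims, policy):
    """Return a signed token string if the claims satisfy the policy, else None."""
    if not evaluate(claims, policy):
        return None
    payload = json.dumps(claims, sort_keys=True)
    signature = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + signature


# Invented claims such as an enclave might present, and a tenant-defined policy.
enclave_claims = {"enclave_type": "sgx", "debuggable": False}
tenant_policy = {"enclave_type": "sgx", "debuggable": False}

assert issue_token(enclave_claims, tenant_policy) is not None  # compliant: token issued
assert issue_token({"enclave_type": "sgx", "debuggable": True},
                   tenant_policy) is None  # debug-enabled enclave: rejected
```

A relying party would then verify the signature on the token before trusting the claims; in the real service that verification uses the attestation provider's published signing certificates.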
## <a name="why-use-azure-attestation"></a>Why use Azure Attestation

Azure Attestation is the preferred choice for attesting TEEs, as it offers the following benefits:

- A unified framework for attesting multiple environments such as TPMs, SGX enclaves, and VBS enclaves
- A multi-tenant service that allows configuration of custom attestation providers and policies to restrict token generation
- Default providers that can attest with no configuration from users
- Protection of data while in use, through implementation in an SGX enclave
- A highly available service

## <a name="business-continuity-and-disaster-recovery-bcdr-support"></a>Business continuity and disaster recovery (BCDR) support

[Business continuity and disaster recovery](../best-practices-availability-paired-regions.md) (BCDR) for Azure Attestation enables you to mitigate service disruptions resulting from significant availability issues or disaster events in a region.

Clusters deployed in two regions operate independently under normal circumstances. In the case of a fault or outage in one region, the following takes place:

- Azure Attestation BCDR provides seamless failover in which customers do not need to take any extra steps to recover
- The [Azure Traffic Manager](../traffic-manager/index.yml) for the region detects that the health probe is degraded and switches the endpoint to the paired region
- Existing connections will not work and will receive internal server error or timeout issues
- All control plane operations will be blocked. Customers will not be able to create attestation providers and update policies in the primary region
- All data plane operations, including attestation calls, will continue to work in the primary region

## <a name="next-steps"></a>Next steps

- Learn about the [basic concepts of Azure Attestation](basic-concepts.md)
- [How to author and sign an attestation policy](author-sign-policy.md)
- [Set up Azure Attestation using PowerShell](quickstart-powershell.md)
89.261905
624
0.811016
por_Latn
0.999874
edd49ab7baad1ca720be734bc833e0ed402b96d2
2,756
md
Markdown
README.md
leobalter/jquery-ui
06fe70b10af3e75837545c019d23029fe331f433
[ "CC0-1.0" ]
26
2016-01-11T05:38:53.000Z
2020-03-16T09:25:16.000Z
README.md
leobalter/jquery-ui
06fe70b10af3e75837545c019d23029fe331f433
[ "CC0-1.0" ]
4
2016-01-11T10:17:28.000Z
2016-01-11T11:06:37.000Z
README.md
leobalter/jquery-ui
06fe70b10af3e75837545c019d23029fe331f433
[ "CC0-1.0" ]
2
2018-03-27T09:33:00.000Z
2019-01-11T11:58:20.000Z
# [jQuery UI](http://jqueryui.com/) - Interactions and Widgets for the web

jQuery UI is a curated set of user interface interactions, effects, widgets, and themes built on top of jQuery. Whether you're building highly interactive web applications, or you just need to add a date picker to a form control, jQuery UI is the perfect choice.

If you want to use jQuery UI, go to [jqueryui.com](http://jqueryui.com) to get started, [jqueryui.com/demos/](http://jqueryui.com/demos/) for demos, [api.jqueryui.com](http://api.jqueryui.com/) for API documentation, or the [Using jQuery UI Forum](http://forum.jquery.com/using-jquery-ui) for discussions and questions.

If you want to report a bug/issue, please visit [bugs.jqueryui.com](http://bugs.jqueryui.com).

If you are interested in helping develop jQuery UI, you are in the right place. To discuss development with team members and the community, visit the [Developing jQuery UI Forum](http://forum.jquery.com/developing-jquery-ui) or [#jqueryui-dev on irc.freenode.net](http://irc.jquery.org/).

## For contributors

If you want to help and provide a patch for a bugfix or new feature, please take a few minutes and look at [our Getting Involved guide](http://wiki.jqueryui.com/w/page/35263114/Getting-Involved). In particular check out the [Coding standards](http://wiki.jqueryui.com/w/page/12137737/Coding-standards) and [Commit Message Style Guide](http://contribute.jquery.org/commits-and-pull-requests/#commit-guidelines).

In general, fork the project, create a branch for a specific change and send a pull request for that branch. Don't mix unrelated changes. You can use the commit message as the description for the pull request.

## Running the Unit Tests

Run the unit tests with a local server that supports PHP. No database is required. Pre-configured PHP local servers are available for Windows and Mac. Here are some options:

- Windows: [WAMP download](http://www.wampserver.com/en/)
- Mac: [MAMP download](http://www.mamp.info/en/index.html)
- Linux: [Setting up LAMP](https://www.linux.com/learn/tutorials/288158-easy-lamp-server-installation)
- [Mongoose (most platforms)](http://code.google.com/p/mongoose/)

## Building jQuery UI

jQuery UI uses the [Grunt](http://github.com/gruntjs/grunt) build system. To build jQuery UI, you must have [node.js](http://nodejs.org/) installed and then run the following commands:

```sh
# Install the Grunt CLI
npm install -g grunt-cli

# Clone the jQuery UI git repo
git clone git://github.com/jquery/jquery-ui.git
cd jquery-ui

# Install the node module dependencies
npm install

# Run the concat task to concatenate files
grunt concat

# There are many other tasks that can be run through Grunt.
# For a list of all tasks:
grunt --help
```
45.933333
319
0.761611
eng_Latn
0.927155
edd56c2d8d42462315a08664e5b7652d656bdf03
3,275
md
Markdown
_pages/projects.md
arav-agarwal2/adv-mmml-course
ac327af6461dfe5b83a90669f9660900f7f074dd
[ "MIT" ]
6
2022-01-12T17:06:51.000Z
2022-03-29T01:51:46.000Z
_pages/projects.md
arav-agarwal2/adv-mmml-course
ac327af6461dfe5b83a90669f9660900f7f074dd
[ "MIT" ]
null
null
null
_pages/projects.md
arav-agarwal2/adv-mmml-course
ac327af6461dfe5b83a90669f9660900f7f074dd
[ "MIT" ]
2
2022-01-07T05:20:45.000Z
2022-01-25T20:42:54.000Z
---
layout: page
permalink: /spring2022/projects/
title: Examples of Previous Project Reports
description: Project reports from student teams who participated in previous editions of the MMML course
---

We list here only project reports that were publicly released by students. It should be noted that some of these links are for the follow-up submissions to conferences, after some revisions of the original project reports.

Fall 2019:

1. Seong Hyeon Park, Gyubok Lee, Manoj Bhat, Jimin Seo, Minseok Kang, Jonathan Francis, Ashwin R. Jadhav, Paul Pu Liang, Louis-Philippe Morency. [Diverse and Admissible Trajectory Prediction through Multimodal Context Understanding](https://arxiv.org/abs/2003.03212). ECCV 2020

Fall 2018:

1. Ankit Shah, Vaibhav Vaibhav, Vasu Sharma, Mahmoud Alismail, Louis-Philippe Morency. [Multimodal Behavior Markers Exploring Suicidal Intent in Social Media Videos](https://dl.acm.org/doi/10.1145/3340555.3353718). ICMI 2019
2. Vasu Sharma, Ankita Kalra, Vaibhav, Simral Chaudhary, Labhesh Patel, Louis-Philippe Morency. [Attend and Attack: Attention Guided Adversarial Attacks on Visual Question Answering Models](https://nips2018vigil.github.io/static/papers/accepted/33.pdf). NeurIPS ViGIL workshop 2018
3. Yash Patel, Lluis Gomez, Marçal Rusiñol, Dimosthenis Karatzas, C.V. Jawahar. [Self-Supervised Visual Representations for Cross-Modal Retrieval](https://arxiv.org/abs/1902.00378)

Fall 2017:

1. Hai Pham, Paul Pu Liang, Thomas Manzini, Louis-Philippe Morency, Barnabas Poczos. [Found in Translation: Learning Robust Joint Representations by Cyclic Translations Between Modalities](https://arxiv.org/abs/1812.07809). AAAI 2019
2. Zhun Liu, Ying Shen, Varun Bharadhwaj Lakshminarasimhan, Paul Pu Liang, Amir Zadeh, Louis-Philippe Morency. [Efficient Low-rank Multimodal Fusion with Modality-Specific Factors](https://arxiv.org/abs/1806.00064). ACL 2018

Spring 2017:

1. Devendra Singh Chaplot, Kanthashree Mysore Sathyendra, Rama Kumar Pasumarthi, Dheeraj Rajagopal, Ruslan Salakhutdinov. [Gated-Attention Architectures for Task-Oriented Language Grounding](https://arxiv.org/abs/1706.07230). AAAI 2018
2. Minghai Chen, Sen Wang, Paul Pu Liang, Tadas Baltrušaitis, Amir Zadeh, Louis-Philippe Morency. [Multimodal Sentiment Analysis with Word-Level Fusion and Reinforcement Learning](https://arxiv.org/abs/1802.00924). ICMI 2017
3. Abhilasha Ravichander, Shruti Rijhwani, Rajat Kulshreshtha, Chirag Nagpal, Tadas Baltrušaitis, Louis-Philippe Morency. [Preserving Intermediate Objectives: One Simple Trick to Improve Learning for Hierarchical Models](https://arxiv.org/abs/1706.07867).
4. Junjie Hu, Desai Fan, Shuxin Yao, Jean Oh. [Answer-Aware Attention on Grounded Question Answering in Images](https://www.ttic.edu/nchrc/papers/27.pdf). AAAI 2017 Fall Symposium on Natural Communication for Human-Robot Collaboration

Spring 2016:

1. Haohan Wang, Aaksha Meghawat, Louis-Philippe Morency, Eric P. Xing. [Select-Additive Learning: Improving Generalization in Multimodal Sentiment Analysis](https://arxiv.org/abs/1609.05244). ICME 2017

If you previously took the MMML course and ended up releasing a public version of your project report, we would love to hear about it! Please contact the course instructor to be added to this list.

***
93.571429
281
0.800305
eng_Latn
0.69859
edd58a4720c46f870ad4c2bfb8610a5b1025c041
1,220
md
Markdown
README.md
kingkushal11/Team-Profile-Generator
6c9e7fd3f8263895473284faa4854f8428d722e0
[ "MIT" ]
null
null
null
README.md
kingkushal11/Team-Profile-Generator
6c9e7fd3f8263895473284faa4854f8428d722e0
[ "MIT" ]
null
null
null
README.md
kingkushal11/Team-Profile-Generator
6c9e7fd3f8263895473284faa4854f8428d722e0
[ "MIT" ]
null
null
null
# Team Profile Generator

## License

[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

## Table of Contents

- [Description](#description)
- [Installation](#installation)
- [Usage](#usage)
- [Contributing](#contributing)
- [Tests](#tests)
- [Questions](#questions)

## Description

This application lets you generate a team profile. When run in Node, it asks the user to answer a series of questions. The application was created with HTML and CSS.

## Installation

Install the project's dependencies with `npm i`; the application runs on Node.

## Usage

Once you are done with your code and have run `npm i`, run `node index.js` in your terminal and answer all the questions; the application will then create an HTML file. Watch the demo here: https://watch.screencastify.com/v/309psQmyRMwHiivxvjGl.

## Contributing

I did this project with the help of my tutor Andrew Hardemon.

## Tests

Run `node index.js`.

## Questions

- GitHub Profile Page: (https://github.com/Kingkushal11).
- Any other questions, contact: Kushalcool06@gmail.com

## Deployed links

- https://github.com/kingkushal11/Team-Profile-Generator
- https://kingkushal11.github.io/Team-Profile-Generator/
45.185185
225
0.752459
eng_Latn
0.873261
edd5c2ba499a507697630a9a80224de3b155f744
30
md
Markdown
README.md
EchTR/python-projects-for-beginners
466b0c9864cbb924e7a4ad03cfade49d7c2f180b
[ "MIT" ]
1
2020-02-15T08:33:18.000Z
2020-02-15T08:33:18.000Z
README.md
EchTR/my-python-projects
36efd7483c8f94666718f22a514dd4d65dca44be
[ "MIT" ]
null
null
null
README.md
EchTR/my-python-projects
36efd7483c8f94666718f22a514dd4d65dca44be
[ "MIT" ]
null
null
null
python projects for beginners
15
29
0.866667
eng_Latn
0.999567
edd62ff829f81f83aab8739ef46fd5e4d7ff8ad7
74
md
Markdown
build/content/people/s/samira-ben-dbabis.md
briemadu/semdial-proceedings
59f13adcc73dc9433c3c07ee929fadd8d271b022
[ "Apache-2.0" ]
null
null
null
build/content/people/s/samira-ben-dbabis.md
briemadu/semdial-proceedings
59f13adcc73dc9433c3c07ee929fadd8d271b022
[ "Apache-2.0" ]
null
null
null
build/content/people/s/samira-ben-dbabis.md
briemadu/semdial-proceedings
59f13adcc73dc9433c3c07ee929fadd8d271b022
[ "Apache-2.0" ]
2
2021-09-16T07:16:15.000Z
2021-10-30T06:41:55.000Z
--- lastname: Dbabis name: samira-ben-dbabis title: Samira Ben Dbabis ---
12.333333
24
0.716216
deu_Latn
0.309708
edd675c6e03b9306533ad56ad37c74631f217f36
92
md
Markdown
config/default/common/config/metadata/layers/trmm/TRMM_Brightness_Temp_Dsc.md
evlnyng/worldview
48eca7aac872b88211ec997945acb62dc80212d5
[ "NASA-1.3" ]
null
null
null
config/default/common/config/metadata/layers/trmm/TRMM_Brightness_Temp_Dsc.md
evlnyng/worldview
48eca7aac872b88211ec997945acb62dc80212d5
[ "NASA-1.3" ]
null
null
null
config/default/common/config/metadata/layers/trmm/TRMM_Brightness_Temp_Dsc.md
evlnyng/worldview
48eca7aac872b88211ec997945acb62dc80212d5
[ "NASA-1.3" ]
null
null
null
References: [doi:10.5067/GPM/TMI/TRMM/1C/05](https://dx.doi.org/10.5067/GPM/TMI/TRMM/1C/05)
46
91
0.717391
kor_Hang
0.722646
edd6868bd3c456624fc135cbc355bae1dd6fa7c9
2,791
md
Markdown
doc/book/index.md
albertopgarcia/js.core
4ed3d83990601a254320cab83d18ac1df1f39c53
[ "MIT" ]
4
2021-05-07T20:14:36.000Z
2021-11-14T17:35:25.000Z
doc/book/index.md
albertopgarcia/js.core
4ed3d83990601a254320cab83d18ac1df1f39c53
[ "MIT" ]
12
2018-09-05T12:56:50.000Z
2019-10-09T09:40:54.000Z
doc/book/index.md
albertopgarcia/js.core
4ed3d83990601a254320cab83d18ac1df1f39c53
[ "MIT" ]
6
2018-01-29T16:26:29.000Z
2020-05-27T08:46:13.000Z
# Index - Core components - - Bootstrap - - CLI - Command Line Interface - - Configuration - - Console - - Deepclone - - Deepfind - - Deepfreeze - - Deepmerge - - Eventbus - - Http - - - Request - - - Server - - Locator - - - When to use - - - When not to use - - Object - - Path - - Process - - Schema - - String - Code structure - - Domain - - - Microservice - - - Core domain - - - Sub domains - - - Branch or leaf domain - - Layers - - - Api - - - - Endpoint - - - - Observer - - - Domain - - - - Aggregate - - - - Composite - - - - Schema - - - - - DTO - - - - - - Command - - - - - - Query - - - - - - Event - - - - - Entity - - - - - Value object - - - - - Collection - - - Infrastructure - - - - Gateway - - - - Repository - - - - Anti corruption layer - - - View - - - - Template - - - - Helper - - Networks (???) (Grouping the different features by networks) - Code automation - Helper - - Generate project - - Generate API classes and corresponding tests from configuration - - Generate API documentation - - Generate Domain classes from configuration - Tutorial ... Todo application - Appendix | Architecture - - The importance of balance ... 
Describing the importance of balance, the importance of realizing that no principle, guideline or recommendation should be implemented absolutely (don't be a slave to another man's word - think for yourself, you know your reality better than someone who has never experienced the environment in which you apply the principles) - - Communication - Integration pattern between microservices - - - MVC like pattern - - - - M = Model = Backend microservices - - - - V = View = UI microservices - - - - C = Controller = API microservice - - - - - Security - - - - - Point of failure that will take it all down, replicated state to help solve the problem - - Designing software - - - Specifications to code - - - - Understanding the stakeholders - - - - DDD - Domain driven design - - - - - A good technique for understanding and building applications - - - - - Good definitions of domain and infrastructure components - - - - - Eventstorming - - Eventsource - - - The importance of an event log - - - Create a state from the event log and use it as your stateful db - - Isolation - - - Scopes - - - Globals - - Stateless development ... Conditionless ... don't return bool if possible, it will force conditions elsewhere ... Polymorphism - - Don't over-engineer ... Over-engineering, what is the concept of over-engineering and how to improve productivity by not trying to solve everything / every edge-case ... balance coding between flexibility and constraints to achieve an end result that reflects simplicity ... KISS - Keep It Simple Stupid ... YAGNI - You Ain't Gonna Need It ... MVP - Minimum viable product
25.842593
327
0.669294
eng_Latn
0.972462
edd69a2db6fa7f8ea3052cddca268f992b44c680
928
md
Markdown
CHANGELOG.md
d-kimuson/ts-type-expand
adc9fede6ecfbc7c09dd2904ed8a99c3d8d97e8d
[ "MIT" ]
23
2021-05-26T05:27:24.000Z
2022-02-06T15:07:26.000Z
CHANGELOG.md
d-kimuson/ts-type-expand
adc9fede6ecfbc7c09dd2904ed8a99c3d8d97e8d
[ "MIT" ]
33
2021-05-25T10:39:49.000Z
2022-03-23T21:59:19.000Z
CHANGELOG.md
d-kimuson/ts-type-expand
adc9fede6ecfbc7c09dd2904ed8a99c3d8d97e8d
[ "MIT" ]
1
2021-12-02T14:33:33.000Z
2021-12-02T14:33:33.000Z
# Change Log All notable changes to the "ts-type-expand" extension will be documented in this file. Check [Keep a Changelog](http://keepachangelog.com/) for recommendations on how to structure this file. ## [Unreleased] ## [0.0.12] - 2021-09-12 ### Added - Support for multiple workspaces ### Changed - fixed a bug in type expansion of function arguments #21 ## [0.0.11] - 2021-09-10 ### Changed - Fixed problem with dependency resolution failing on case-sensitive machines. - Do not display an error when the file extension is not ts or tsx. ## [0.0.9] - 2021-09-04 ### Added - added error messages after an error occurred ### Changed - `ts-type-expand.refresh` renamed to `ts-type-expand.restart` - restart compilerAPI after command executed - update configuration after command executed - Upgrade typescript version to 4.4.2 - Upgrade dependency versions ## [0.0.8] - 2021-05-31 ### Added - Support enum expansion
20.622222
103
0.720905
eng_Latn
0.986399
edd72fd97ff2a847da4c363899a1ed3292271f25
5,911
md
Markdown
articles/cognitive-services/QnAMaker/Tutorials/migrate-knowledge-base.md
BaherAbdullah/azure-docs
65d82440dd3209697fdb983ef456b0a2293e270a
[ "CC-BY-4.0", "MIT" ]
7,073
2017-06-27T08:58:22.000Z
2022-03-30T23:19:23.000Z
articles/cognitive-services/QnAMaker/Tutorials/migrate-knowledge-base.md
BaherAbdullah/azure-docs
65d82440dd3209697fdb983ef456b0a2293e270a
[ "CC-BY-4.0", "MIT" ]
87,608
2017-06-26T22:11:41.000Z
2022-03-31T23:57:29.000Z
articles/cognitive-services/QnAMaker/Tutorials/migrate-knowledge-base.md
BaherAbdullah/azure-docs
65d82440dd3209697fdb983ef456b0a2293e270a
[ "CC-BY-4.0", "MIT" ]
17,093
2017-06-27T03:28:18.000Z
2022-03-31T20:46:38.000Z
--- title: Migrate knowledge bases - QnA Maker description: Migrating a knowledge base requires exporting from one knowledge base, then importing into another. ms.service: cognitive-services ms.subservice: qna-maker ms.topic: how-to ms.date: 11/09/2020 --- # Migrate a knowledge base using export-import You may want to create a copy of your knowledge base for several reasons: * Copy a knowledge base from QnA Maker GA to Custom question answering * To implement a backup and restore process * Integrate with your CI/CD pipeline * When you wish to move your data to different regions ## Prerequisites # [QnA Maker GA (stable release)](#tab/v1) > * If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/cognitive-services/) before you begin. > * A [QnA Maker resource](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesQnAMaker) created in the Azure portal. Remember your Azure Active Directory ID, Subscription, QnA resource name you selected when you created the resource. > * Set up a new [QnA Maker service](../How-To/set-up-qnamaker-service-azure.md) # [Custom question answering (preview release)](#tab/v2) > * If you don't have an Azure subscription, [create a free account](https://azure.microsoft.com/free/cognitive-services/) before you begin. > * A [Text Analytics resource](https://ms.portal.azure.com/#create/Microsoft.CognitiveServicesTextAnalytics) with the custom question answering feature enabled in the Azure portal. Remember your Azure Active Directory ID, Subscription, and Text Analytics resource name you selected when you created the resource. > * Set up [Custom question answering](../How-To/set-up-qnamaker-service-azure.md) --- ## Export a knowledge base 1. Sign in to [QnA Maker portal](https://qnamaker.ai). 1. Select the knowledge base you want to migrate. 1. On the **Settings** page, you have the options to export **QnAs**, **Synonyms**, or **Knowledge Base Replica**. 
You can choose to download the data in .tsv/.xlsx. 1. **QnAs**: When exporting QnAs, all QnA pairs (with questions, answers, metadata, follow-up prompts, and the data source names) are downloaded. The QnA IDs that are exported with the questions and answers may be used to update a specific QnA pair using the [update API](/rest/api/cognitiveservices/qnamaker/knowledgebase/update). The QnA ID for a specific QnA pair remains unchanged across multiple export operations. 2. **Synonyms**: You can export Synonyms that have been added to the knowledge base. 3. **Knowledge Base Replica**: If you want to download the entire knowledge base with synonyms and other settings, you can choose this option. ## Import a knowledge base 1. Click **Create a knowledge base** from the top menu of the qnamaker.ai portal and then create an _empty_ knowledge base by not adding any URLs or files. Set the name of your choice for the new knowledge base and then click **Create your KB**. 1. In this new knowledge base, open the **Settings** tab and, under _Import knowledge base_, select one of the following options: **QnAs**, **Synonyms**, or **Knowledge Base Replica**. 1. **QnAs**: This option imports all QnA pairs. **The QnA pairs created in the new knowledge base shall have the same QnA ID as present in the exported file**. You can refer to [SampleQnAs.xlsx](https://aka.ms/qnamaker-sampleqnas), [SampleQnAs.tsv](https://aka.ms/qnamaker-sampleqnastsv) to import QnAs. 2. **Synonyms**: This option can be used to import synonyms to the knowledge base. You can refer to [SampleSynonyms.xlsx](https://aka.ms/qnamaker-samplesynonyms), [SampleSynonyms.tsv](https://aka.ms/qnamaker-samplesynonymstsv) to import synonyms. 3. **Knowledge Base Replica**: This option can be used to import a KB replica with QnAs, Synonyms and Settings. You can refer to [KBReplicaSampleExcel](https://aka.ms/qnamaker-samplereplica), [KBReplicaSampleTSV](https://aka.ms/qnamaker-samplereplicatsv) for more details.
If you also want to add unstructured content to the replica, refer [CustomQnAKBReplicaSample](https://aka.ms/qnamaker-samplev2replica). Either QnAs or Unstructured content is required when importing replica. Unstructured documents are only valid for Custom question answering. Synonyms file is not mandatory when importing replica. Settings file is mandatory when importing replica. |Settings|Update permitted when importing to QnA Maker KB?|Update permitted when importing to Custom question answering KB?| |:--|--|--| |DefaultAnswerForKB|No|Yes| |EnableActiveLearning (True/False)|Yes|No| |EnableMultiTurnExtraction (True/False)|Yes|Yes| |DefaultAnswerforMultiturn|Yes|Yes| |Language|No|No| 1. **Test** the new knowledge base using the Test panel. Learn how to [test your knowledge base](../How-To/test-knowledge-base.md). 1. **Publish** the knowledge base and create a chat bot. Learn how to [publish your knowledge base](../Quickstarts/create-publish-knowledge-base.md#publish-the-knowledge-base). > [!div class="mx-imgBorder"] > ![Migrate knowledge base](../media/qnamaker-how-to-migrate-kb/import-export-kb.png) ## Programmatically migrate a knowledge base from QnA Maker The migration process is programmatically available using the following REST APIs: **Export** * [Download knowledge base API](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase/download) **Import** * [Replace API (reload with same knowledge base ID)](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase/replace) * [Create API (load with new knowledge base ID)](/rest/api/cognitiveservices/qnamaker4.0/knowledgebase/create) ## Chat logs There is no way to migrate chat logs, since the new knowledge base uses Application Insights for storing chat logs. ## Next steps > [!div class="nextstepaction"] > [Edit a knowledge base](../How-To/edit-knowledge-base.md)
63.55914
422
0.757232
eng_Latn
0.97313
edd784adbb94b029e1b86e951f4173a7fffe7fe6
2,121
md
Markdown
_research_groups/unige-casaPaganini/description.md
davidespano/sigchitialy
5554a67fd5ccd064251e6cd07ba108d5bd40d44d
[ "MIT" ]
null
null
null
_research_groups/unige-casaPaganini/description.md
davidespano/sigchitialy
5554a67fd5ccd064251e6cd07ba108d5bd40d44d
[ "MIT" ]
2
2022-02-04T10:51:22.000Z
2022-02-21T18:15:13.000Z
_research_groups/unige-casaPaganini/description.md
davidespano/sigchitialy
5554a67fd5ccd064251e6cd07ba108d5bd40d44d
[ "MIT" ]
8
2021-11-22T16:50:42.000Z
2022-03-05T11:50:28.000Z
--- layout: research title: Casa Paganini group-name: Casa Paganini institution: Università degli Studi di Genova department: DIBRIS address: street-name: piazza S. Maria in Passione street-number: 34 city: Genova state: zip: 16123 pin: 44.40530589281376, 8.929921441016095 website: http://www.casapaganini.org field: CS members: - member: name: Gualtiero surname: Volpe picture: null url: null email: gualtiero.volpe@unige.it - member: name: Antonio surname: Camurri picture: null url: null email: antonio.camurri@unige.it keywords: - keyword: multimodal systems - keyword: affective computing - keyword: social signal processing - keyword: sound and music computing - keyword: performance science - keyword: full-body movement analysis - keyword: expressive gesture - keyword: interactive sonification - keyword: active experience of multimedia content - keyword: tools for fast prototyping of multimodal interactive systems papers: - doi: 10.1109/TAFFC.2021.3095425 - doi: 10.1109/TAFFC.2020.2978069 - doi: 10.1109/THMS.2020.3016085 - doi: 10.1007/s12559-020-09768-8 - doi: 10.1109/ACCESS.2020.3005719 - doi: 10.1109/TAFFC.2019.2916023 - doi: 10.1038/s41598-019-42395-4 - doi: 10.1007/s12193-019-00302-1 - doi: 10.1145/3132369 - doi: 10.1177/1046496417722841 - doi: 10.3389/fdigh.2017.00009 - doi: 10.1098/rstb.2015.0377 - doi: 10.1109/MMUL.2016.13 - doi: 10.1109/TMM.2010.2052592 - doi: 10.1016/S1071-5819(03)00050-8 --- ### Main Research Interests Casa Paganini - InfoMus, a research center of DIBRIS - Università degli Studi di Genova, carries on scientific research and design, development, and experimentation of multimodal interfaces and multimedia systems. Research addresses computational methods for real-time analysis of nonverbal multimodal expressive and social interaction, with a particular focus on human movement and gesture (e.g., full-body movement, dance) and sound (e.g., music, interactive sonification). 
Applications span education, rehabilitation, cultural heritage, cultural wellness, and performing arts.
34.770492
579
0.751061
eng_Latn
0.445635
edd7e28bf57614b944255d618a11ac308e6ce012
836
md
Markdown
2007/CVE-2007-4727.md
justinforbes/cve
375c65312f55c34fc1a4858381315fe9431b0f16
[ "MIT" ]
2,340
2022-02-10T21:04:40.000Z
2022-03-31T14:42:58.000Z
2007/CVE-2007-4727.md
justinforbes/cve
375c65312f55c34fc1a4858381315fe9431b0f16
[ "MIT" ]
19
2022-02-11T16:06:53.000Z
2022-03-11T10:44:27.000Z
2007/CVE-2007-4727.md
justinforbes/cve
375c65312f55c34fc1a4858381315fe9431b0f16
[ "MIT" ]
280
2022-02-10T19:58:58.000Z
2022-03-26T11:13:05.000Z
### [CVE-2007-4727](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2007-4727) ![](https://img.shields.io/static/v1?label=Product&message=n%2Fa&color=blue) ![](https://img.shields.io/static/v1?label=Version&message=n%2Fa&color=blue) ![](https://img.shields.io/static/v1?label=Vulnerability&message=n%2Fa&color=brighgreen) ### Description Buffer overflow in the fcgi_env_add function in mod_proxy_backend_fastcgi.c in the mod_fastcgi extension in lighttpd before 1.4.18 allows remote attackers to overwrite arbitrary CGI variables and execute arbitrary code via an HTTP request with a long content length, as demonstrated by overwriting the SCRIPT_FILENAME variable, aka a "header overflow." ### POC #### Reference - http://www.novell.com/linux/security/advisories/2007_20_sr.html #### Github No PoCs found on GitHub currently.
46.444444
352
0.77512
eng_Latn
0.595273
edd82e94653be73cf625ae246c81b45e74edd918
28,168
md
Markdown
CHANGELOG.md
hessius/OctoDash
f88a0350a36faa94111f0dd4b51f9240c2030a07
[ "Apache-2.0" ]
null
null
null
CHANGELOG.md
hessius/OctoDash
f88a0350a36faa94111f0dd4b51f9240c2030a07
[ "Apache-2.0" ]
null
null
null
CHANGELOG.md
hessius/OctoDash
f88a0350a36faa94111f0dd4b51f9240c2030a07
[ "Apache-2.0" ]
null
null
null
# Change Log ## [v1.3.1](https://github.com/UnchartedBull/OctoDash/tree/v1.3.1) (2019-10-08) [Full Changelog](https://github.com/UnchartedBull/OctoDash/compare/v1.3.0...v1.3.1) **Implemented enhancements:** - Redesign Loaded File Screen [\#185](https://github.com/UnchartedBull/OctoDash/issues/185) - Use OctoPrint-Enclosure Plugin [\#182](https://github.com/UnchartedBull/OctoDash/issues/182) - Notification if new version is available [\#166](https://github.com/UnchartedBull/OctoDash/issues/166) - Integrate Preheat into OctoDash [\#151](https://github.com/UnchartedBull/OctoDash/issues/151) **Fixed bugs:** - Clear Loaded File after starting print [\#184](https://github.com/UnchartedBull/OctoDash/issues/184) **Closed issues:** - unable to open octodash via VNC [\#180](https://github.com/UnchartedBull/OctoDash/issues/180) **Merged pull requests:** - Fix/remove node dht sensor [\#188](https://github.com/UnchartedBull/OctoDash/pull/188) ([UnchartedBull](https://github.com/UnchartedBull)) - New Design for Loaded files [\#187](https://github.com/UnchartedBull/OctoDash/pull/187) ([UnchartedBull](https://github.com/UnchartedBull)) - Clear loaded File after printing started [\#186](https://github.com/UnchartedBull/OctoDash/pull/186) ([UnchartedBull](https://github.com/UnchartedBull)) - Bump @types/lodash from 4.14.141 to 4.14.142 [\#181](https://github.com/UnchartedBull/OctoDash/pull/181) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.7 to 8.3.8 [\#179](https://github.com/UnchartedBull/OctoDash/pull/179) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.7 to 0.803.8 [\#178](https://github.com/UnchartedBull/OctoDash/pull/178) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/node from 12.7.10 to 12.7.11 [\#177](https://github.com/UnchartedBull/OctoDash/pull/177) 
([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/node from 12.7.9 to 12.7.10 [\#176](https://github.com/UnchartedBull/OctoDash/pull/176) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/language-service from 8.2.8 to 8.2.9 [\#175](https://github.com/UnchartedBull/OctoDash/pull/175) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.6 to 0.803.7 [\#174](https://github.com/UnchartedBull/OctoDash/pull/174) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/router from 8.2.8 to 8.2.9 [\#173](https://github.com/UnchartedBull/OctoDash/pull/173) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.6 to 8.3.7 [\#172](https://github.com/UnchartedBull/OctoDash/pull/172) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - update notification [\#171](https://github.com/UnchartedBull/OctoDash/pull/171) ([UnchartedBull](https://github.com/UnchartedBull)) - Bump @types/node from 12.7.8 to 12.7.9 [\#169](https://github.com/UnchartedBull/OctoDash/pull/169) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.10 to 6.0.11 [\#168](https://github.com/UnchartedBull/OctoDash/pull/168) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) ## [v1.3.0](https://github.com/UnchartedBull/OctoDash/tree/v1.3.0) (2019-09-30) [Full Changelog](https://github.com/UnchartedBull/OctoDash/compare/v1.2.0...v1.3.0) **Implemented enhancements:** - Octoprint Plugin [\#95](https://github.com/UnchartedBull/OctoDash/issues/95) - Integrate DisplayLayerProgess API [\#93](https://github.com/UnchartedBull/OctoDash/issues/93) - File Actions [\#18](https://github.com/UnchartedBull/OctoDash/issues/18) - Detailed File View [\#17](https://github.com/UnchartedBull/OctoDash/issues/17) - File API 
[\#16](https://github.com/UnchartedBull/OctoDash/issues/16) - File browser [\#15](https://github.com/UnchartedBull/OctoDash/issues/15) **Fixed bugs:** - Long Filenames don't get cut correctly [\#150](https://github.com/UnchartedBull/OctoDash/issues/150) - Temperature Sensor not working reliably [\#87](https://github.com/UnchartedBull/OctoDash/issues/87) **Closed issues:** - Integrate iFrame to be triggered by Custom Actions [\#162](https://github.com/UnchartedBull/OctoDash/issues/162) - Screen resolution [\#145](https://github.com/UnchartedBull/OctoDash/issues/145) **Merged pull requests:** - Files View [\#165](https://github.com/UnchartedBull/OctoDash/pull/165) ([UnchartedBull](https://github.com/UnchartedBull)) - Open iFrame from Custom Actions [\#163](https://github.com/UnchartedBull/OctoDash/pull/163) ([UnchartedBull](https://github.com/UnchartedBull)) - Bump @types/lodash from 4.14.140 to 4.14.141 [\#161](https://github.com/UnchartedBull/OctoDash/pull/161) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/lodash from 4.14.139 to 4.14.140 [\#160](https://github.com/UnchartedBull/OctoDash/pull/160) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump codelyzer from 5.1.1 to 5.1.2 [\#159](https://github.com/UnchartedBull/OctoDash/pull/159) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Merge layer progress into files [\#158](https://github.com/UnchartedBull/OctoDash/pull/158) ([UnchartedBull](https://github.com/UnchartedBull)) - Bump @types/node from 12.7.7 to 12.7.8 [\#157](https://github.com/UnchartedBull/OctoDash/pull/157) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.5 to 8.3.6 [\#156](https://github.com/UnchartedBull/OctoDash/pull/156) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.5 to 0.803.6 
[\#155](https://github.com/UnchartedBull/OctoDash/pull/155) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/language-service from 8.2.7 to 8.2.8 [\#154](https://github.com/UnchartedBull/OctoDash/pull/154) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/router from 8.2.7 to 8.2.8 [\#153](https://github.com/UnchartedBull/OctoDash/pull/153) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/node from 12.7.5 to 12.7.7 [\#152](https://github.com/UnchartedBull/OctoDash/pull/152) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/lodash from 4.14.138 to 4.14.139 [\#147](https://github.com/UnchartedBull/OctoDash/pull/147) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump node-dht-sensor from 0.3.0 to 0.4.0 [\#146](https://github.com/UnchartedBull/OctoDash/pull/146) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump codelyzer from 5.1.0 to 5.1.1 [\#144](https://github.com/UnchartedBull/OctoDash/pull/144) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.9 to 6.0.10 [\#143](https://github.com/UnchartedBull/OctoDash/pull/143) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @fortawesome/free-solid-svg-icons from 5.11.0 to 5.11.1 [\#142](https://github.com/UnchartedBull/OctoDash/pull/142) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @fortawesome/fontawesome-svg-core from 1.2.23 to 1.2.24 [\#141](https://github.com/UnchartedBull/OctoDash/pull/141) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.4 to 0.803.5 [\#140](https://github.com/UnchartedBull/OctoDash/pull/140) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.4 to 8.3.5 
[\#139](https://github.com/UnchartedBull/OctoDash/pull/139) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/language-service from 8.2.6 to 8.2.7 [\#138](https://github.com/UnchartedBull/OctoDash/pull/138) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/router from 8.2.6 to 8.2.7 [\#137](https://github.com/UnchartedBull/OctoDash/pull/137) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @fortawesome/fontawesome-svg-core from 1.2.22 to 1.2.23 [\#136](https://github.com/UnchartedBull/OctoDash/pull/136) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @fortawesome/free-solid-svg-icons from 5.10.2 to 5.11.0 [\#135](https://github.com/UnchartedBull/OctoDash/pull/135) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump ts-node from 8.4.0 to 8.4.1 [\#134](https://github.com/UnchartedBull/OctoDash/pull/134) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump ts-node from 8.3.0 to 8.4.0 [\#133](https://github.com/UnchartedBull/OctoDash/pull/133) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.8 to 6.0.9 [\#132](https://github.com/UnchartedBull/OctoDash/pull/132) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/language-service from 8.2.5 to 8.2.6 [\#131](https://github.com/UnchartedBull/OctoDash/pull/131) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/router from 8.2.5 to 8.2.6 [\#130](https://github.com/UnchartedBull/OctoDash/pull/130) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.3 to 0.803.4 [\#129](https://github.com/UnchartedBull/OctoDash/pull/129) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.3 to 8.3.4 
[\#128](https://github.com/UnchartedBull/OctoDash/pull/128) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/node from 12.7.4 to 12.7.5 [\#127](https://github.com/UnchartedBull/OctoDash/pull/127) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump tslint from 5.19.0 to 5.20.0 [\#126](https://github.com/UnchartedBull/OctoDash/pull/126) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.7 to 6.0.8 [\#125](https://github.com/UnchartedBull/OctoDash/pull/125) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron-store from 4.0.0 to 5.0.0 [\#124](https://github.com/UnchartedBull/OctoDash/pull/124) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/language-service from 8.2.4 to 8.2.5 [\#123](https://github.com/UnchartedBull/OctoDash/pull/123) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/router from 8.2.4 to 8.2.5 [\#122](https://github.com/UnchartedBull/OctoDash/pull/122) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.2 to 0.803.3 [\#121](https://github.com/UnchartedBull/OctoDash/pull/121) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.2 to 8.3.3 [\#120](https://github.com/UnchartedBull/OctoDash/pull/120) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump rxjs from 6.5.2 to 6.5.3 [\#119](https://github.com/UnchartedBull/OctoDash/pull/119) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/node from 12.7.3 to 12.7.4 [\#118](https://github.com/UnchartedBull/OctoDash/pull/118) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.6 to 6.0.7 [\#116](https://github.com/UnchartedBull/OctoDash/pull/116) 
([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.5 to 6.0.6 [\#115](https://github.com/UnchartedBull/OctoDash/pull/115) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/node from 12.7.2 to 12.7.3 [\#114](https://github.com/UnchartedBull/OctoDash/pull/114) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.1 to 0.803.2 [\#113](https://github.com/UnchartedBull/OctoDash/pull/113) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.1 to 8.3.2 [\#112](https://github.com/UnchartedBull/OctoDash/pull/112) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/language-service from 8.2.3 to 8.2.4 [\#111](https://github.com/UnchartedBull/OctoDash/pull/111) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/router from 8.2.3 to 8.2.4 [\#110](https://github.com/UnchartedBull/OctoDash/pull/110) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular/cli from 8.3.0 to 8.3.1 [\#109](https://github.com/UnchartedBull/OctoDash/pull/109) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @angular-devkit/build-angular from 0.803.0 to 0.803.1 [\#108](https://github.com/UnchartedBull/OctoDash/pull/108) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.4 to 6.0.5 [\#107](https://github.com/UnchartedBull/OctoDash/pull/107) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump @types/lodash from 4.14.137 to 4.14.138 [\#106](https://github.com/UnchartedBull/OctoDash/pull/106) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview)) - Bump electron from 6.0.3 to 6.0.4 [\#105](https://github.com/UnchartedBull/OctoDash/pull/105) 
([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @fortawesome/fontawesome-svg-core from 1.2.21 to 1.2.22 [\#104](https://github.com/UnchartedBull/OctoDash/pull/104) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @fortawesome/free-solid-svg-icons from 5.10.1 to 5.10.2 [\#103](https://github.com/UnchartedBull/OctoDash/pull/103) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular-devkit/build-angular from 0.802.2 to 0.803.0 [\#102](https://github.com/UnchartedBull/OctoDash/pull/102) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.2.2 to 8.3.0 [\#101](https://github.com/UnchartedBull/OctoDash/pull/101) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/language-service from 8.2.2 to 8.2.3 [\#100](https://github.com/UnchartedBull/OctoDash/pull/100) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/router from 8.2.2 to 8.2.3 [\#99](https://github.com/UnchartedBull/OctoDash/pull/99) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump tslint from 5.18.0 to 5.19.0 [\#98](https://github.com/UnchartedBull/OctoDash/pull/98) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Display layer progress [\#97](https://github.com/UnchartedBull/OctoDash/pull/97) ([UnchartedBull](https://github.com/UnchartedBull))
- Bump electron from 6.0.2 to 6.0.3 [\#96](https://github.com/UnchartedBull/OctoDash/pull/96) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/lodash from 4.14.136 to 4.14.137 [\#94](https://github.com/UnchartedBull/OctoDash/pull/94) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.2.1 to 8.2.2 [\#92](https://github.com/UnchartedBull/OctoDash/pull/92) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/node from 12.7.1 to 12.7.2 [\#91](https://github.com/UnchartedBull/OctoDash/pull/91) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular-devkit/build-angular from 0.802.1 to 0.802.2 [\#90](https://github.com/UnchartedBull/OctoDash/pull/90) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump typescript from 3.4.5 to 3.5.3 [\#89](https://github.com/UnchartedBull/OctoDash/pull/89) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))

## [v1.2.0](https://github.com/UnchartedBull/OctoDash/tree/v1.2.0) (2019-08-13)

[Full Changelog](https://github.com/UnchartedBull/OctoDash/compare/v1.1.0...v1.2.0)

**Implemented enhancements:**

- Make jog speed adjustable [\#79](https://github.com/UnchartedBull/OctoDash/issues/79)
- Implement Main Screen for Touchscreen [\#13](https://github.com/UnchartedBull/OctoDash/issues/13)

**Fixed bugs:**

- adjust screen does not open [\#58](https://github.com/UnchartedBull/OctoDash/issues/58)

**Closed issues:**

- Rename access token to api key [\#81](https://github.com/UnchartedBull/OctoDash/issues/81)

**Merged pull requests:**

- Improve setup [\#88](https://github.com/UnchartedBull/OctoDash/pull/88) ([UnchartedBull](https://github.com/UnchartedBull))
- Bump @fortawesome/angular-fontawesome from 0.4.0 to 0.5.0 [\#86](https://github.com/UnchartedBull/OctoDash/pull/86) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/compiler-cli from 8.1.3 to 8.2.2 [\#84](https://github.com/UnchartedBull/OctoDash/pull/84) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron from 6.0.1 to 6.0.2 [\#83](https://github.com/UnchartedBull/OctoDash/pull/83) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/language-service from 8.2.1 to 8.2.2 [\#82](https://github.com/UnchartedBull/OctoDash/pull/82) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Printer Control [\#80](https://github.com/UnchartedBull/OctoDash/pull/80) ([UnchartedBull](https://github.com/UnchartedBull))
- Bump @angular/language-service from 8.2.0 to 8.2.1 [\#77](https://github.com/UnchartedBull/OctoDash/pull/77) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.2.0 to 8.2.1 [\#76](https://github.com/UnchartedBull/OctoDash/pull/76) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular-devkit/build-angular from 0.802.0 to 0.802.1 [\#75](https://github.com/UnchartedBull/OctoDash/pull/75) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/node from 12.7.0 to 12.7.1 [\#74](https://github.com/UnchartedBull/OctoDash/pull/74) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron from 6.0.0 to 6.0.1 [\#73](https://github.com/UnchartedBull/OctoDash/pull/73) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/node from 12.6.9 to 12.7.0 [\#72](https://github.com/UnchartedBull/OctoDash/pull/72) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron-reload from 1.4.1 to 1.5.0 [\#71](https://github.com/UnchartedBull/OctoDash/pull/71) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/node from 12.6.8 to 12.6.9 [\#70](https://github.com/UnchartedBull/OctoDash/pull/70) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular-devkit/build-angular from 0.801.3 to 0.802.0 [\#69](https://github.com/UnchartedBull/OctoDash/pull/69) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.1.3 to 8.2.0 [\#68](https://github.com/UnchartedBull/OctoDash/pull/68) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/language-service from 8.1.3 to 8.2.0 [\#67](https://github.com/UnchartedBull/OctoDash/pull/67) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron-builder from 21.1.5 to 21.2.0 [\#65](https://github.com/UnchartedBull/OctoDash/pull/65) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron from 5.0.8 to 6.0.0 [\#64](https://github.com/UnchartedBull/OctoDash/pull/64) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular-devkit/build-angular from 0.801.2 to 0.801.3 [\#63](https://github.com/UnchartedBull/OctoDash/pull/63) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.1.2 to 8.1.3 [\#62](https://github.com/UnchartedBull/OctoDash/pull/62) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/language-service from 8.1.2 to 8.1.3 [\#61](https://github.com/UnchartedBull/OctoDash/pull/61) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/compiler-cli from 8.1.2 to 8.1.3 [\#60](https://github.com/UnchartedBull/OctoDash/pull/60) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron-builder from 21.1.1 to 21.1.5 [\#59](https://github.com/UnchartedBull/OctoDash/pull/59) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron from 5.0.7 to 5.0.8 [\#57](https://github.com/UnchartedBull/OctoDash/pull/57) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))

## [v1.1.0](https://github.com/UnchartedBull/OctoDash/tree/v1.1.0) (2019-07-24)

[Full Changelog](https://github.com/UnchartedBull/OctoDash/compare/v1.0.1...v1.1.0)

**Implemented enhancements:**

- Implement Pause Job [\#12](https://github.com/UnchartedBull/OctoDash/issues/12)
- Implement Cancel Job [\#11](https://github.com/UnchartedBull/OctoDash/issues/11)
- Implement Print Controls [\#10](https://github.com/UnchartedBull/OctoDash/issues/10)
- Switch between Touchscreen and Non-Touchscreen Version [\#9](https://github.com/UnchartedBull/OctoDash/issues/9)

**Closed issues:**

- Structure README.md [\#28](https://github.com/UnchartedBull/OctoDash/issues/28)
- Include Dependabot [\#27](https://github.com/UnchartedBull/OctoDash/issues/27)
- Help [\#25](https://github.com/UnchartedBull/OctoDash/issues/25)

**Merged pull requests:**

- Print controls [\#56](https://github.com/UnchartedBull/OctoDash/pull/56) ([UnchartedBull](https://github.com/UnchartedBull))
- Bump lodash from 4.17.14 to 4.17.15 [\#54](https://github.com/UnchartedBull/OctoDash/pull/54) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular-devkit/build-angular from 0.801.1 to 0.801.2 [\#53](https://github.com/UnchartedBull/OctoDash/pull/53) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/node from 12.6.6 to 12.6.8 [\#52](https://github.com/UnchartedBull/OctoDash/pull/52) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.1.1 to 8.1.2 [\#51](https://github.com/UnchartedBull/OctoDash/pull/51) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/compiler-cli from 8.1.1 to 8.1.2 [\#50](https://github.com/UnchartedBull/OctoDash/pull/50) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/language-service from 8.1.1 to 8.1.2 [\#49](https://github.com/UnchartedBull/OctoDash/pull/49) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron-builder from 21.0.15 to 21.1.1 [\#48](https://github.com/UnchartedBull/OctoDash/pull/48) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))

## [v1.0.1](https://github.com/UnchartedBull/OctoDash/tree/v1.0.1) (2019-07-18)

[Full Changelog](https://github.com/UnchartedBull/OctoDash/compare/v1.0.0...v1.0.1)

**Implemented enhancements:**

- Interface for Octoprint API [\#31](https://github.com/UnchartedBull/OctoDash/issues/31)
- Initial Setup with Electron [\#23](https://github.com/UnchartedBull/OctoDash/issues/23)
- Electron Build [\#7](https://github.com/UnchartedBull/OctoDash/issues/7)
- Add DHT-22 sensor [\#2](https://github.com/UnchartedBull/OctoDash/issues/2)

**Fixed bugs:**

- If temperature is over 200°C the interface breaks [\#34](https://github.com/UnchartedBull/OctoDash/issues/34)
- Fan Speed show 2 % characters [\#26](https://github.com/UnchartedBull/OctoDash/issues/26)

**Merged pull requests:**

- Temperature sensor [\#47](https://github.com/UnchartedBull/OctoDash/pull/47) ([UnchartedBull](https://github.com/UnchartedBull))
- Fix/nozzle over200 [\#46](https://github.com/UnchartedBull/OctoDash/pull/46) ([UnchartedBull](https://github.com/UnchartedBull))
- Bump rxjs from 6.4.0 to 6.5.2 [\#45](https://github.com/UnchartedBull/OctoDash/pull/45) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump ajv from 6.10.1 to 6.10.2 [\#44](https://github.com/UnchartedBull/OctoDash/pull/44) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron-reload from 1.4.0 to 1.4.1 [\#43](https://github.com/UnchartedBull/OctoDash/pull/43) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump ts-node from 7.0.1 to 8.3.0 [\#42](https://github.com/UnchartedBull/OctoDash/pull/42) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/language-service from 8.0.3 to 8.1.1 [\#41](https://github.com/UnchartedBull/OctoDash/pull/41) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @types/node from 8.9.5 to 12.6.6 [\#40](https://github.com/UnchartedBull/OctoDash/pull/40) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump electron from 5.0.6 to 5.0.7 [\#39](https://github.com/UnchartedBull/OctoDash/pull/39) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump @angular/cli from 8.0.6 to 8.1.1 [\#38](https://github.com/UnchartedBull/OctoDash/pull/38) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump tslint from 5.15.0 to 5.18.0 [\#37](https://github.com/UnchartedBull/OctoDash/pull/37) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Bump wait-on from 3.2.0 to 3.3.0 [\#36](https://github.com/UnchartedBull/OctoDash/pull/36) ([dependabot-preview[bot]](https://github.com/apps/dependabot-preview))
- Dependabot [\#35](https://github.com/UnchartedBull/OctoDash/pull/35) ([UnchartedBull](https://github.com/UnchartedBull))
- Integrate Travis [\#33](https://github.com/UnchartedBull/OctoDash/pull/33) ([UnchartedBull](https://github.com/UnchartedBull))
- Introduce interfaces for Octorprint API [\#32](https://github.com/UnchartedBull/OctoDash/pull/32) ([UnchartedBull](https://github.com/UnchartedBull))
- Fix double percentage character [\#30](https://github.com/UnchartedBull/OctoDash/pull/30) ([UnchartedBull](https://github.com/UnchartedBull))

## [v1.0.0](https://github.com/UnchartedBull/OctoDash/tree/v1.0.0) (2019-07-15)

**Implemented enhancements:**

- Implement DisplayLayerProgress API [\#20](https://github.com/UnchartedBull/OctoDash/issues/20)
- Refactor Code [\#5](https://github.com/UnchartedBull/OctoDash/issues/5)
- Fix the filament amount [\#4](https://github.com/UnchartedBull/OctoDash/issues/4)

**Closed issues:**

- API for DisplayLayerProgress [\#3](https://github.com/UnchartedBull/OctoDash/issues/3)

**Merged pull requests:**

- Electron [\#24](https://github.com/UnchartedBull/OctoDash/pull/24) ([UnchartedBull](https://github.com/UnchartedBull))
- Display layer progress api [\#22](https://github.com/UnchartedBull/OctoDash/pull/22) ([UnchartedBull](https://github.com/UnchartedBull))
- Fix/filament amount [\#21](https://github.com/UnchartedBull/OctoDash/pull/21) ([UnchartedBull](https://github.com/UnchartedBull))
- Bump lodash from 4.17.11 to 4.17.14 [\#19](https://github.com/UnchartedBull/OctoDash/pull/19) ([dependabot[bot]](https://github.com/apps/dependabot))
- Refactor [\#6](https://github.com/UnchartedBull/OctoDash/pull/6) ([UnchartedBull](https://github.com/UnchartedBull))
- Create LICENSE.md [\#1](https://github.com/UnchartedBull/OctoDash/pull/1) ([UnchartedBull](https://github.com/UnchartedBull))

\* *This Change Log was automatically generated by [github_changelog_generator](https://github.com/skywinder/Github-Changelog-Generator)*
pages/common/cowsay.md
Ninja-Tw1sT/tldr
# cowsay

> Generate an ASCII character like a cow or sheep saying or thinking something.
> Available characters are stored in `/usr/share/cowsay` on Linux and `/usr/local/share/cows/` on macOS.

- Print an ASCII cow saying "Hello world!":

`cowsay "Hello world!"`

- Print an ASCII dragon saying "Hello!":

`echo "Hello!" | cowsay -f dragon`

- Print a stoned thinking ASCII cow:

`cowthink -s "I'm just a cow, not a great thinker ..."`

- Print out a list of all characters with cowsay:

`ls -1 {{cowsay_character_directory}} | rev | cut -c 5- | rev | xargs -I _ cowsay -f _`
content/publication/2021-01-01_Obstacles_to_Birth_S.md
nataliariquelme/publicaciones
+++
title = "Obstacles to Birth Surname Retention Upon Marriage: How Do Hostile Sexism and System Justification Predict Support for Marital Surname Change Among Women?"
date = "2021-01-01"
authors = ["Maria Chayinska", "Ozden Melis Ulug", "Nevin Solak", "Betul Kanik", "Burcu Cuvas"]
publication_types = ["2"]
publication = "Frontiers in Psychology, 12 702553. https://doi.org/10.3389/fpsyg.2021.702553"
publication_short = "Frontiers in Psychology, 12 702553. https://doi.org/10.3389/fpsyg.2021.702553"
abstract = "Despite the ongoing shift in societal norms and gender-discriminatory practices toward more equality, many heterosexual women worldwide, including in many Western societies, choose to replace their birth surname with the family name of their spouse upon marriage. Previous research has demonstrated that the adherence to sexist ideologies (i.e., a system of discriminatory gender-based beliefs) among women is associated with their greater endorsement of practices and policies that maintain gender inequality. By integrating the ideas from the system justification theory and the ambivalent sexism theory, we proposed that the more women adhere to hostile and benevolent sexist beliefs, the more likely they would be to justify existing gender relations in society, which in turn, would positively predict their support for traditional, husband-centered marital surname change. We further argued that hostile (as compared to benevolent) sexism could act as a particularly strong direct predictor of the support for marital surname change among women. We tested these possibilities across three cross-sectional studies conducted among women in Turkey (Study 1, N = 118, self-identified feminist women; Study 2, N = 131, female students) and the United States (Study 3, N = 140, female students). Results of Studies 1 and 3 revealed that higher adherence to hostile (but not benevolent) sexism was associated with higher support for marital surname change indirectly through higher gender-based system justification. In Study 2, the hypothesized full mediation was not observed. Consistent with our predictions, in all three studies, hostile (but not benevolent) sexism was found to be a direct positive predictor of the support for marital surname change among women. We discuss the role of dominant ideologies surrounding marriage and inegalitarian naming conventions in different cultures as obstacles to women’s birth surname retention upon marriage."
abstract_short = ""
url_source = "https://www.frontiersin.org/articles/10.3389/fpsyg.2021.702553/full"
tags = ["Gender", "Intergroup contact", "Sociability"]
url_code = ""
image_preview = ""
selected = false
projects = []
url_pdf = ""
url_preprint = ""
url_dataset = ""
url_project = ""
url_slides = ""
url_video = ""
url_poster = ""
math = true
highlight = true

[header]
image = ""
caption = ""
+++
README.md
dannyfoltz/Cantando
# Cantando

Warm up those vocal cords! This command-line app scrapes foreign song lyrics from Genius.com and translates them to English.

Code for the Genius.com scraper was obtained from a blog post by Abhisek Roy on PromptCloud.com ('Scraping Song Lyrics using Python from Genius', https://www.promptcloud.com/blog/scraping-song-lyrics-using-python-from-genius/).

The main.py file uses two modules: the scraper module and the translate module. There are four actions that can be performed using the app:

* Scrape Data - input a URL from Genius.com of a song you would like translated. It will return a JSON file with the lyrics and any comments from the website (as well as the scraped HTML in a separate file).
* Translate Song - this command takes a JSON file of a song you have scraped, translates it to English, and returns a CSV file (the idea being to then load it into a pandas DataFrame).
* Print Song - this command takes a JSON file and prints the original lyrics and translation to the command line.
* Exit - this command exits the application.
about.md
Wellwick/wellwick.github.io
---
layout: page
title: About
permalink: /about/
---

My real name is Michael Boyle, but I've been going by the moniker of Wellwick for over a decade now. The dog over there isn't actually me (believe it or not), but I thought I could spare you from my face and show you my dog at the same time. Her name is Ruby and she is lovely.

I'm a Software Developer currently working at Autodesk in the UK. I graduated from Warwick back in 2018 with an MEng Computer Science degree. I'm a moderator on the [Parahumans Discord](https://discord.gg/TYsRHpG), where I help develop the primary bot and run a campaign of Weaverdice (a Dungeons and Dragons style game, but with superheroes).

I also do my best to dabble in writing, though I haven't published much at all. One day I'd love to do some game development or something in that vein, but there's a lot of work between here and there. Sometimes I work on a graphics tablet I own in an attempt to get better at drawing, but it's by far my weakest creative skill. Still, all things take practice!

### Contact me

You can reach me at my email [mbbexley@hotmail.co.uk](mailto:mbbexley@hotmail.co.uk); however, you're more likely to get hold of me urgently by joining the Discord mentioned above and PMing Wellwick. I'll usually be over on the right sidebar, twiddling my thumbs.
_posts/2016-12-13-rosa-clara-2013-243-bergamo.md
mightts/mightts.github.io
---
layout: post
date: '2016-12-13'
title: "Rosa Clara - 2013 - 243 Bergamo"
category: Rosa Clara
tags: ["dresses","formal","rosa","pretty","wedding"]
image: http://img.metalkind.com/93625-thickbox_default/243-bergamo.jpg
---

Rosa Clara - 2013 - 243 Bergamo

On Sale: **$308.99**

<a href="https://www.metalkind.com/en/rosa-clara/8981-243-bergamo.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/93625-thickbox_default/243-bergamo.jpg" alt="Rosa Clara - 2013 - 243 Bergamo 0" /></a>

<a href="https://www.metalkind.com/en/rosa-clara/8981-243-bergamo.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/93626-thickbox_default/243-bergamo.jpg" alt="Rosa Clara - 2013 - 243 Bergamo 1" /></a>

<a href="https://www.metalkind.com/en/rosa-clara/8981-243-bergamo.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/93627-thickbox_default/243-bergamo.jpg" alt="Rosa Clara - 2013 - 243 Bergamo 2" /></a>

<a href="https://www.metalkind.com/en/rosa-clara/8981-243-bergamo.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/93628-thickbox_default/243-bergamo.jpg" alt="Rosa Clara - 2013 - 243 Bergamo 3" /></a>

<a href="https://www.metalkind.com/en/rosa-clara/8981-243-bergamo.html"><amp-img layout="responsive" width="600" height="600" src="//img.metalkind.com/93629-thickbox_default/243-bergamo.jpg" alt="Rosa Clara - 2013 - 243 Bergamo 4" /></a>

Buy it: [Rosa Clara - 2013 - 243 Bergamo](https://www.metalkind.com/en/rosa-clara/8981-243-bergamo.html "Rosa Clara - 2013 - 243 Bergamo")

View more: [Rosa Clara](https://www.metalkind.com/en/173-rosa-clara "Rosa Clara")
content/authors/Vicky_Morgan/_index.md
ackueck/eco-evo-interactions
---
# Display name
title: Victoria Watson-Zink

# Is this the primary user of the site?
superuser: false

# Role/position
role: Postdoc

# Organizations/Affiliations
organizations:
- name: Stanford University
  url: ""

# Short bio (displayed in user profile at end of posts)
bio: To be filled.

# Social/Academic Networking
# For available icons, see: https://sourcethemes.com/academic/docs/page-builder/#icons
# For an email link, use "fas" icon pack, "envelope" icon, and a link in the
# form "mailto:your-email@example.com" or "#contact" for contact widget.
social:
- icon: twitter
  icon_pack: fab
  link: https://twitter.com/origamicrab
- icon: readme
  icon_pack: fab
  link: https://origamicrab.wordpress.com/

# Link to a PDF of your resume/CV from the About widget.
# To enable, copy your resume/CV to `static/files/cv.pdf` and uncomment the lines below.
# - icon: cv
#   icon_pack: ai
#   link: files/cv.pdf

# Enter email to display Gravatar (if Gravatar enabled in Config)
email: ""

# Highlight the author in author lists? (true/false)
highlight_name: false

# Organizational groups that you belong to (for People widget)
# Set this to `[]` or comment out if you are not using People widget.
user_groups:
- Collaborators
---

Vicky is a finishing graduate student at the University of California, Davis in Prof. [Richard Grosberg's lab](https://biology.ucdavis.edu/people/richard-grosberg). She has just accepted two fellowships: a National Science Foundation (NSF) Postdoctoral Research Fellowship in Biology (PRFB) for three years, followed by a Stanford Science Fellowship. At Stanford, she will be working in Prof. [Lauren O'Connell's lab](https://oconnell.stanford.edu/). Vicky studies the genomic and transcriptomic basis of adaptation to land (terrestriality) in crabs. We are working together to study the evolution of crab-associated microbiomes.
docs/server/pubsub.md
crybapp/mesa
# Pub/Sub

Mesa Pub/Sub allows you to run multiple instances of your Mesa server while still allowing messages to reach clients. Pub/Sub relies on Redis to forward messages, so you'll need a Redis instance to enable this feature. We also support Redis Sentinels for high-availability production environments.

To enable pub/sub, simply supply your Redis URI in your Mesa config:

```js
const mesa = new Mesa({
  port: 4000,
  redis: 'redis://localhost:6379'
})
```

Messages sent globally via `mesa.send` or to authenticated clients via `client.send` or `dispatcher.dispatch` will now be correctly sent to clients regardless of which Mesa server they're connected to.

If you want to run different Mesa servers on the same Redis server without pub/sub, check out [namespaces](./namespaces.md).

We also support more advanced Redis connection options via an object. Internally Mesa uses [ioredis](https://github.com/luin/ioredis) to handle Redis connections, so view their [connection options](https://github.com/luin/ioredis/blob/11e5d810f7076a144ab22cb4848b64d9d3da2254/lib/redis/RedisOptions.ts#L8) to learn more.
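As a hedged sketch, an object-based config might look like the following. The field names (`host`, `port`, `retryStrategy`, `sentinels`) are real ioredis options; the assumption that Mesa forwards the object to ioredis unchanged is based on the paragraph above, so verify against the linked ioredis docs:

```javascript
// ioredis-style connection options. Whether Mesa passes every one of
// these fields through to ioredis is an assumption, not verified here.
const redisOptions = {
  host: 'localhost',
  port: 6379,
  // Back off 50 ms per attempt, capped at 2 seconds between retries
  retryStrategy: (times) => Math.min(times * 50, 2000)
  // For a Sentinel setup you would instead supply something like:
  // sentinels: [{ host: 'sentinel-1', port: 26379 }],
  // name: 'mymaster'
}

// const mesa = new Mesa({ port: 4000, redis: redisOptions })
```

A custom `retryStrategy` like the one above is useful in production so a brief Redis outage doesn't hammer the server with reconnect attempts.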
_posts/2019-2-7-Snow-Apocalypse.md
nickpape/nickpape.com
---
layout: post
title: "Snow Apocalypse!"
date: 2019-02-07
backgrounds:
  - https://get.pxhere.com/photo/landscape-tree-forest-snow-winter-trail-frost-walk-weather-snowy-season-trees-relaxation-footwear-blizzard-wintry-winter-magic-freezing-atmospheric-recovery-forest-path-winter-mood-snowed-in-snow-magic-snow-forest-snow-pictures-winter-images-winter-picture-natural-environment-geological-phenomenon-winter-storm-496806.jpg
thumb: https://cdn2.picturesofwinter.net/picturesofwinter-cdn/snow-angel-9-s.jpg
category: pre-trip
tags: packing
reading_time: "5 minutes"
---

So right now we are in the middle of a "Snow Apocalypse" here in Seattle. Bad news. It dumped 8 inches on Sunday night and Monday morning, basically shutting down Microsoft for Monday and Tuesday. Now, according to celebrity meteorologist Cliff Mass, we are set to have [another massive snowstorm](https://cliffmass.blogspot.com/2019/02/a-major-snowstorm-will-hit-region.html) on Friday and a potentially bigger one on Monday. They are talking about another foot of snow, and potentially more in other areas. It has been pretty bad.

![snow](https://static.seattletimes.com/wp-content/uploads/2019/02/02042019_Bus-snow_084825-1020x680.jpg)

![more snow](https://static.seattletimes.com/wp-content/uploads/2019/02/02042019_snow_162732-1020x774.jpg)

We are stocking up on some food over the weekend! Meanwhile, we still have some issues with our trip. First, California has been having [mudslides along the 101](https://www.nytimes.com/2018/01/14/us/mudslides-california-highway-101.html). There recently was one in Montecito, CA (near Santa Barbara), right along our route. There have also been continuous ones in Ventura County, south of Montecito.
<iframe src="https://www.google.com/maps/embed?pb=!1m18!1m12!1m3!1d36431.27303037232!2d-119.65416317187265!3d34.43663569625527!2m3!1f0!2f0!3f0!3m2!1i1024!2i768!4f13.1!3m3!1m2!1s0x80e9124ae0a7db0d%3A0x92cfad27fcb46eb6!2sMontecito%2C+CA+93108!5e0!3m2!1sen!2sus!4v1549574962059" width="600" height="450" frameborder="0" style="border:0" allowfullscreen></iframe>

There has also been a glut of sinkholes, including [this one](https://www.constructionequipment.com/video-california-sinkhole-swallows-loader-operator) that swallowed an excavator, and [this one](https://patch.com/california/lagunaniguel-danapoint/dana-point-waterfall-opens-large-sinkhole) in Dana Point.

![excavator in sinkhole](https://media.nbclosangeles.com/images/1206*675/Excavator.JPG)

![sinkholes](https://patch.com/img/cdn20/users/23196800/20190206/083109/styles/T800x600/public/processed_images/sinkhole-1549502761-1459.jpg)

Yikes. Maybe we will just avoid Ventura County. Or all of California...

*On to Vegas!*

Our other concern is the government shutdown. We will be starting the trip on March 1st, a full 14 days after the government could shut down again, which would shutter our hopes of seeing a bunch of national parks and monuments... If it lasts as long as the last one, we probably won't get to see any parks, on either trip... That could suck.

Anyway, we will post a list of some of the places we are hoping to see next!
README.md
DominicMDev/STPopup
# STPopup

![CI Status](https://img.shields.io/travis/kevin0571/STPopup.svg?style=flat)
![Version](http://img.shields.io/cocoapods/v/STPopup.svg?style=flag)
![License](https://img.shields.io/cocoapods/l/STPopup.svg?style=flag)

STPopup provides STPopupController, which works just like UINavigationController in popup style, for both iPhone and iPad.

**Features:**

- Extend your view controller from UIViewController and build it in your familiar way.
- Push/pop view controllers into/out of the popup view stack, and set navigation items by using self.navigationItem.leftBarButtonItem and rightBarButtonItem, just like you are using UINavigationController.
- Support for both "Form Sheet" and "Bottom Sheet" styles.
- Works well with storyboards (including segues).
- Customize UI by using UIAppearance.
- Fully customizable popup transition style.
- Auto-reposition of the popup view when the keyboard is showing up, making sure your UITextField/UITextView won't be covered by the keyboard.
- Drag the navigation bar to dismiss the popup view.
- Support for both portrait and landscape orientations, on both iPhone and iPad.
## Overview

**Used in Sth4Me app**

![Sth4Me](https://cloud.githubusercontent.com/assets/1491282/9857827/8fa0125e-5b4f-11e5-9c0d-ff955c007360.gif)

## Get Started

**CocoaPods**

```ruby
platform :ios, '7.0'
pod 'STPopup'
```

**Carthage**

```ruby
github "kevin0571/STPopup"
```

**Import header file**

```objc
#import <STPopup/STPopup.h>
```

**Initialize STPopupController**

```objc
STPopupController *popupController = [[STPopupController alloc] initWithRootViewController:[ViewController new]];
[popupController presentInViewController:self];
```

**Set content size in view controller**

```objc
@implementation ViewController

- (instancetype)init
{
    if (self = [super init]) {
        self.title = @"View Controller";
        self.navigationItem.rightBarButtonItem = [[UIBarButtonItem alloc] initWithTitle:@"Next" style:UIBarButtonItemStylePlain target:self action:@selector(nextBtnDidTap)];
        self.contentSizeInPopup = CGSizeMake(300, 400);
        self.landscapeContentSizeInPopup = CGSizeMake(400, 200);
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Add views here
    // self.view.frame.size == self.contentSizeInPopup in portrait
    // self.view.frame.size == self.landscapeContentSizeInPopup in landscape
}

@end
```

**Push, pop and dismiss view controllers**

```objc
[self.popupController pushViewController:[ViewController new] animated:YES];

[self.popupController popViewControllerAnimated:YES];

// Popup will be dismissed if there is only one view controller in the popup view controller stack
[self.popupController dismiss];
```

![Push & Pop](https://cloud.githubusercontent.com/assets/1491282/9857915/0d4ab3ee-5b50-11e5-81bc-8fbae3ad8c06.gif)

**Bottom sheet style**

```objc
STPopupController *popupController = [[STPopupController alloc] initWithRootViewController:[ViewController new]];
popupController.style = STPopupStyleBottomSheet;
[popupController presentInViewController:self];
```

![Bottom Sheet](https://cloud.githubusercontent.com/assets/1491282/10417963/7649f356-7080-11e5-8f3c-0cb817b8353e.gif)
**Customize popup transition style**

```objc
#pragma mark - STPopupControllerTransitioning

- (NSTimeInterval)popupControllerTransitionDuration:(STPopupControllerTransitioningContext *)context
{
    return context.action == STPopupControllerTransitioningActionPresent ? 0.5 : 0.35;
}

- (void)popupControllerAnimateTransition:(STPopupControllerTransitioningContext *)context completion:(void (^)())completion
{
    UIView *containerView = context.containerView;
    if (context.action == STPopupControllerTransitioningActionPresent) {
        containerView.transform = CGAffineTransformMakeTranslation(containerView.superview.bounds.size.width - containerView.frame.origin.x, 0);

        [UIView animateWithDuration:[self popupControllerTransitionDuration:context] delay:0 usingSpringWithDamping:1 initialSpringVelocity:1 options:UIViewAnimationOptionCurveEaseInOut animations:^{
            context.containerView.transform = CGAffineTransformIdentity;
        } completion:^(BOOL finished) {
            completion();
        }];
    }
    else {
        [UIView animateWithDuration:[self popupControllerTransitionDuration:context] delay:0 options:UIViewAnimationOptionCurveEaseOut animations:^{
            containerView.transform = CGAffineTransformMakeTranslation(- 2 * (containerView.superview.bounds.size.width - containerView.frame.origin.x), 0);
        } completion:^(BOOL finished) {
            containerView.transform = CGAffineTransformIdentity;
            completion();
        }];
    }
}
```

**Blur background**

```objc
STPopupController *popupController = [[STPopupController alloc] initWithRootViewController:[PopupViewController1 new]];
if (NSClassFromString(@"UIBlurEffect")) {
    UIBlurEffect *blurEffect = [UIBlurEffect effectWithStyle:UIBlurEffectStyleDark];
    popupController.backgroundView = [[UIVisualEffectView alloc] initWithEffect:blurEffect];
}
```

**Dismiss by tapping background**

```objc
popupController = [[STPopupController alloc] initWithRootViewController:self];
[popupController.backgroundView addGestureRecognizer:[[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(backgroundViewDidTap)]];
```

**Customize UI**

```objc
[STPopupNavigationBar appearance].barTintColor = [UIColor colorWithRed:0.20 green:0.60 blue:0.86 alpha:1.0];
[STPopupNavigationBar appearance].tintColor = [UIColor whiteColor];
[STPopupNavigationBar appearance].barStyle = UIBarStyleDefault;
[STPopupNavigationBar appearance].titleTextAttributes = @{ NSFontAttributeName: [UIFont fontWithName:@"Cochin" size:18], NSForegroundColorAttributeName: [UIColor whiteColor] };

[[UIBarButtonItem appearanceWhenContainedIn:[STPopupNavigationBar class], nil] setTitleTextAttributes:@{ NSFontAttributeName: [UIFont fontWithName:@"Cochin" size:17] } forState:UIControlStateNormal];
```

![Customize UI](https://cloud.githubusercontent.com/assets/1491282/9911306/0f6db056-5cd4-11e5-9329-33b0cf02e1b0.png)

**Auto-reposition when keyboard is showing up**

No code needed for this feature.

![Auto-reposition](https://cloud.githubusercontent.com/assets/1491282/9858277/5b29b130-5b52-11e5-9569-7560a0853493.gif)

**Drag to dismiss**

No code needed for this feature.

![Drag to dismiss](https://cloud.githubusercontent.com/assets/1491282/9858334/b103fc96-5b52-11e5-9c3f-517367ed9386.gif)

**Handle orientation change**

No code needed for this feature.

![Orientation change](https://cloud.githubusercontent.com/assets/1491282/9858372/e6538880-5b52-11e5-8882-8705588606ba.gif)

For more details, please download the example project.
43.960784
225
0.772227
yue_Hant
0.355417
edde9fcbf5bdc05927960c87713f16f1dd28b541
64
md
Markdown
20210924213810/README.md
rei-nert/zettelkasten
17fa3929d776ee21da105e69242df48697f2c7b1
[ "MIT" ]
null
null
null
20210924213810/README.md
rei-nert/zettelkasten
17fa3929d776ee21da105e69242df48697f2c7b1
[ "MIT" ]
null
null
null
20210924213810/README.md
rei-nert/zettelkasten
17fa3929d776ee21da105e69242df48697f2c7b1
[ "MIT" ]
null
null
null
# Tilde

*Definition:* g(x) ~ h(x) as x -> a means lim_{x->a} g(x)/h(x) = 1.
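For example, with this definition and a = infinity, a polynomial is asymptotically equivalent to its leading term:

```latex
x^2 + x \sim x^2 \quad (x \to \infty),
\qquad \text{since} \quad
\lim_{x \to \infty} \frac{x^2 + x}{x^2}
  = \lim_{x \to \infty} \left(1 + \frac{1}{x}\right) = 1.
```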
16
54
0.53125
kor_Hang
0.360701
eddf560e22d5daaf0bdc731a7f82b0c3105eea5e
11,728
md
Markdown
articles/active-directory/saas-apps/mixpanel-provisioning-tutorial.md
pmsousa/azure-docs.pt-pt
bc487beff48df00493484663c200e44d4b24cb18
[ "CC-BY-4.0", "MIT" ]
15
2017-08-28T07:46:17.000Z
2022-02-03T12:49:15.000Z
articles/active-directory/saas-apps/mixpanel-provisioning-tutorial.md
pmsousa/azure-docs.pt-pt
bc487beff48df00493484663c200e44d4b24cb18
[ "CC-BY-4.0", "MIT" ]
407
2018-06-14T16:12:48.000Z
2021-06-02T16:08:13.000Z
articles/active-directory/saas-apps/mixpanel-provisioning-tutorial.md
pmsousa/azure-docs.pt-pt
bc487beff48df00493484663c200e44d4b24cb18
[ "CC-BY-4.0", "MIT" ]
17
2017-10-04T22:53:31.000Z
2022-03-10T16:41:59.000Z
---
title: 'Tutorial: Configure Mixpanel for automatic user provisioning with Azure Active Directory | Microsoft Docs'
description: Learn how to automatically provision and deprovision user accounts from Azure AD to Mixpanel.
services: active-directory
author: Zhchia
writer: Zhchia
manager: CelesteDG
ms.service: active-directory
ms.subservice: saas-app-tutorial
ms.workload: identity
ms.topic: tutorial
ms.date: 01/24/2020
ms.author: Zhchia
ms.openlocfilehash: ab261d4ca04fed04c8a3e1046c0a4c563767ad4c
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: pt-PT
ms.lasthandoff: 03/29/2021
ms.locfileid: "96182007"
---
# <a name="tutorial-configure-mixpanel-for-automatic-user-provisioning"></a>Tutorial: Configure Mixpanel for automatic user provisioning

This tutorial describes the steps you need to perform in both Mixpanel and Azure Active Directory (Azure AD) to configure automatic user provisioning. When configured, Azure AD automatically provisions and deprovisions users and groups to [Mixpanel](https://mixpanel.com/pricing/) using the Azure AD provisioning service. For important details on what this service does, how it works, and frequently asked questions, see [Automate user provisioning and deprovisioning to SaaS applications with Azure Active Directory](../app-provisioning/user-provisioning.md).
## <a name="capabilities-supported"></a>Capabilities supported

> [!div class="checklist"]
> * Create users in Mixpanel
> * Remove users in Mixpanel when they no longer require access
> * Keep user attributes synchronized between Azure AD and Mixpanel
> * Provision groups and group memberships in Mixpanel
> * [Single sign-on](./mixpanel-tutorial.md) to Mixpanel (recommended)

## <a name="prerequisites"></a>Prerequisites

The scenario outlined in this tutorial assumes that you already have the following prerequisites:

* [An Azure AD tenant](../develop/quickstart-create-new-tenant.md)
* A user account in Azure AD with [permission](../roles/permissions-reference.md) to configure provisioning (for example, Application Administrator, Cloud Application Administrator, Application Owner, or Global Administrator).
* An enterprise-tier Mixpanel organization
* A Mixpanel account with admin privileges in that organization
* SSO enabled in Mixpanel with a claimed domain

## <a name="step-1-plan-your-provisioning-deployment"></a>Step 1. Plan your provisioning deployment

1. Learn about [how the provisioning service works](../app-provisioning/user-provisioning.md).
2. Determine who will be in [scope for provisioning](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).
3. Determine what data to [map between Azure AD and Mixpanel](../app-provisioning/customize-application-attributes.md).

## <a name="step-2-configure-mixpanel-to-support-provisioning-with-azure-ad"></a>Step 2. Configure Mixpanel to support provisioning with Azure AD

1. To set up SSO and claim a domain, see [this article](https://help.mixpanel.com/hc/articles/360036428871-Single-Sign-On).
2. After that, you will need to generate a SCIM token in the SCIM tab of the Access Security section of your organization settings.
![Mixpanel token](./media/mixpanel-provisioning-tutorial/mixpanelscim.png)

## <a name="step-3-add-mixpanel-from-the-azure-ad-application-gallery"></a>Step 3. Add Mixpanel from the Azure AD application gallery

Add Mixpanel from the Azure AD application gallery to start managing provisioning to Mixpanel. If you have previously set up Mixpanel for SSO, you can use the same application. However, it is recommended that you create a separate app when testing out the integration initially. Learn more about adding an application from the gallery [here](../manage-apps/add-application-portal.md).

## <a name="step-4-define-who-will-be-in-scope-for-provisioning"></a>Step 4. Define who will be in scope for provisioning

The Azure AD provisioning service allows you to scope who will be provisioned based on assignment to the application and/or based on attributes of the user or group. If you choose to scope who will be provisioned to your app based on assignment, you can use the following [steps](../manage-apps/assign-user-or-group-access-portal.md) to assign users and groups to the application. If you choose to scope who will be provisioned based solely on attributes of the user or group, you can use a scoping filter as described [here](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).

* When assigning users and groups to Mixpanel, you must select a role other than **Default Access**. Users with the Default Access role are excluded from provisioning and will be marked as not effectively entitled in the provisioning logs. If the only role available on the application is the Default Access role, you can [update the application manifest](../develop/howto-add-app-roles-in-azure-ad-apps.md) to add additional roles.

* Start small. Test with a small set of users and groups before rolling out to everyone.
When scope for provisioning is set to assigned users and groups, you can control this by assigning one or two users or groups to the app. When scope is set to all users and groups, you can specify an [attribute-based scoping filter](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).

## <a name="step-5-configure-automatic-user-provisioning-to-mixpanel"></a>Step 5. Configure automatic user provisioning to Mixpanel

This section guides you through the steps to configure the Azure AD provisioning service to create, update, and disable users and/or groups in TestApp based on user and/or group assignments in Azure AD.

### <a name="to-configure-automatic-user-provisioning-for-mixpanel-in-azure-ad"></a>To configure automatic user provisioning for Mixpanel in Azure AD:

1. Sign in to the [Azure portal](https://portal.azure.com). Select **Enterprise Applications**, then select **All applications**.

	![Enterprise applications blade](common/enterprise-applications.png)

2. In the applications list, select **Mixpanel**.

	![The Mixpanel link in the Applications list](common/all-applications.png)

3. Select the **Provisioning** tab.

	![Screenshot of the Manage options with the Provisioning option called out.](common/provisioning.png)

4. Set the **Provisioning Mode** to **Automatic**.

	![Screenshot of the Provisioning Mode dropdown list with the Automatic option called out.](common/provisioning-automatic.png)

5. In the **Admin Credentials** section, enter your Mixpanel **Tenant URL** and **Secret Token**. Click **Test Connection** to ensure that Azure AD can connect to Mixpanel. If the connection fails, make sure your Mixpanel account has admin permissions and try again.
	![Screenshot shows the Admin Credentials dialog box, where you can enter your Tenant URL and Secret Token.](./media/mixpanel-provisioning-tutorial/provisioning.png)

6. In the **Notification Email** field, enter the email address of a person or group who should receive the provisioning error notifications, and select the **Send an email notification when a failure occurs** check box.

	![Notification Email](common/provisioning-notification-email.png)

7. Select **Save**.

8. In the **Mappings** section, select **Synchronize Azure Active Directory Users to Mixpanel**.

9. Review the user attributes that are synchronized from Azure AD to Mixpanel in the **Attribute Mapping** section. The attributes selected as **Matching** properties are used to match the user accounts in Mixpanel for update operations. If you choose to change the [matching target attribute](../app-provisioning/customize-application-attributes.md), you will need to ensure that the Mixpanel API supports filtering users based on that attribute. Select the **Save** button to commit any changes.

   |Attribute|Type|
   |---|---|
   |userName|String|
   |displayName|String|

10. In the **Mappings** section, select **Synchronize Azure Active Directory Groups to Mixpanel**.

11. Review the group attributes that are synchronized from Azure AD to Mixpanel in the **Attribute Mapping** section. The attributes selected as **Matching** properties are used to match the groups in Mixpanel for update operations. Select the **Save** button to commit any changes.

   |Attribute|Type|
   |---|---|
   |displayName|String|
   |members|Reference|

12. To configure scoping filters, refer to the instructions provided in the [Scoping filter tutorial](../app-provisioning/define-conditional-rules-for-provisioning-user-accounts.md).

13.
To enable the Azure AD provisioning service for Mixpanel, change the **Provisioning Status** to **On** in the **Settings** section.

	![Provisioning Status Toggled On](common/provisioning-toggle-on.png)

14. Define the users and/or groups that you would like to provision to Mixpanel by choosing the desired values in **Scope** in the **Settings** section.

	![Provisioning Scope](common/provisioning-scope.png)

15. When you are ready to provision, click **Save**.

	![Saving Provisioning Configuration](common/provisioning-configuration-save.png)

This operation starts the initial synchronization cycle of all users and groups defined in **Scope** in the **Settings** section. The initial cycle takes longer to perform than subsequent cycles, which occur approximately every 40 minutes as long as the Azure AD provisioning service is running.

## <a name="step-6-monitor-your-deployment"></a>Step 6. Monitor your deployment

Once you've configured provisioning, use the following resources to monitor your deployment:

1. Use the [provisioning logs](../reports-monitoring/concept-provisioning-logs.md) to determine which users have been provisioned successfully or unsuccessfully
2. Check the [progress bar](../app-provisioning/application-provisioning-when-will-provisioning-finish-specific-user.md) to see the status of the provisioning cycle and how close it is to completion
3. If the provisioning configuration seems to be in an unhealthy state, the application will go into quarantine. Learn more about quarantine states [here](../app-provisioning/application-provisioning-quarantine-status.md).
## <a name="additional-resources"></a>Additional resources

* [Managing user account provisioning for Enterprise Apps](../app-provisioning/configure-automatic-user-provisioning-portal.md)
* [What is application access and single sign-on with Azure Active Directory?](../manage-apps/what-is-single-sign-on.md)

## <a name="next-steps"></a>Next steps

* [Learn how to review logs and get reports on provisioning activity](../app-provisioning/check-status-user-account-provisioning.md)
79.782313
649
0.791439
por_Latn
0.99629
ede0154c98bea3fe2010267cabc772babc3d1e74
455
md
Markdown
docs/vue/vue2/install.md
dongjianOne/dongjianOne.github.io
eff2e4f966f82135da7e176ed0a5b8a1bcaf5ec8
[ "MIT" ]
null
null
null
docs/vue/vue2/install.md
dongjianOne/dongjianOne.github.io
eff2e4f966f82135da7e176ed0a5b8a1bcaf5ec8
[ "MIT" ]
null
null
null
docs/vue/vue2/install.md
dongjianOne/dongjianOne.github.io
eff2e4f966f82135da7e176ed0a5b8a1bcaf5ec8
[ "MIT" ]
null
null
null
# Component installation

### Configure the Taobao npm mirror

```
npm install -g cnpm --registry=http://registry.npm.taobao.org
```

Check that the mirror was configured successfully:

```
npm get registry
```

### Install vue-cli

```
cnpm install -g vue-cli
```

### Create a Vue project

First install @vue/cli-init:

```
cnpm i -g @vue/cli-init
```

Scaffold the project:

```
vue init webpack Project-name
```

Install sass-loader (sass-loader depends on node-sass, so node-sass must be installed as well):

```
# Install node-sass
npm install --save-dev node-sass
# Install sass-loader
npm install --save-dev sass-loader
```
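Once node-sass and sass-loader are installed, webpack needs a rule that routes `.scss` files through them. A minimal, illustrative sketch of such a rule is below — the loader names follow the common vue-loader convention and your template's generated config may differ:

```javascript
// Illustrative webpack rule for .scss files (loader names follow the
// common vue-loader setup; the rule generated by your template may differ).
const scssRule = {
  test: /\.scss$/,
  use: ['vue-style-loader', 'css-loader', 'sass-loader'],
};

// Webpack reads rules from module.rules in the exported config object.
module.exports = { module: { rules: [scssRule] } };
```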
11.097561
61
0.641758
yue_Hant
0.148366
ede0644f2b5a55d392d8077bab0a681338b765de
70
md
Markdown
src/HorBotFrameWork/README.md
DrSnake88/HorBotFrameWork
76da380b0f6503f1f4ddfb92a02d7511537042e7
[ "MIT" ]
null
null
null
src/HorBotFrameWork/README.md
DrSnake88/HorBotFrameWork
76da380b0f6503f1f4ddfb92a02d7511537042e7
[ "MIT" ]
null
null
null
src/HorBotFrameWork/README.md
DrSnake88/HorBotFrameWork
76da380b0f6503f1f4ddfb92a02d7511537042e7
[ "MIT" ]
null
null
null
# HorBot Framework - Home of Source

Home directory of the source code.
35
36
0.785714
eng_Latn
0.940547
ede06660a8081b55aff19384f5f5e03f29b87654
961
md
Markdown
AlchemyInsights/microsoft-teamsrestoring-a-deleted-team-site.md
isabella232/OfficeDocs-AlchemyInsights-pr.tr-TR
829935b72282a64e4ec4294adebf31fd30f93f68
[ "CC-BY-4.0", "MIT" ]
3
2020-05-19T19:08:16.000Z
2021-03-14T11:48:36.000Z
AlchemyInsights/microsoft-teamsrestoring-a-deleted-team-site.md
isabella232/OfficeDocs-AlchemyInsights-pr.tr-TR
829935b72282a64e4ec4294adebf31fd30f93f68
[ "CC-BY-4.0", "MIT" ]
2
2022-02-09T06:53:52.000Z
2022-02-09T06:54:01.000Z
AlchemyInsights/microsoft-teamsrestoring-a-deleted-team-site.md
isabella232/OfficeDocs-AlchemyInsights-pr.tr-TR
829935b72282a64e4ec4294adebf31fd30f93f68
[ "CC-BY-4.0", "MIT" ]
2
2019-10-09T20:30:22.000Z
2021-10-09T10:37:41.000Z
---
title: Microsoft Teams - Restoring a Deleted Team Site
ms.author: heidip
author: microsoftheidi
ms.audience: ITPro
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.custom:
- "404"
- "6500002"
ms.assetid: b26be13f-7b8f-4393-9083-2b4d97b6cd80
ms.openlocfilehash: 5e51b9b34223b3122b59fd602b09103e4ca2e94444679f767e2a7005a9928694
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 08/05/2021
ms.locfileid: "53975481"
---
# <a name="restoring-a-deleted-team-in-microsoft-teams"></a>Restoring a deleted team in Microsoft Teams

Did you accidentally delete a team? If it has been less than 30 days since you deleted it, you're in luck; you can restore it. To do so, follow the steps here: [Restore a deleted team](https://docs.microsoft.com/microsoftteams/archive-or-delete-a-team#restore-a-deleted-team).
40.041667
291
0.816857
tur_Latn
0.811489
ede0e61eaf4b5f0c11001587179a77e83e4b0c94
948
md
Markdown
README.md
translucentleaf/tomdong-3.0
61f1a6248241b014a53574cf980426c9d5308935
[ "MIT" ]
null
null
null
README.md
translucentleaf/tomdong-3.0
61f1a6248241b014a53574cf980426c9d5308935
[ "MIT" ]
2
2021-09-21T06:58:16.000Z
2022-02-26T23:13:58.000Z
README.md
translucentleaf/tomdong-3.0
61f1a6248241b014a53574cf980426c9d5308935
[ "MIT" ]
null
null
null
<!-- PROJECT LOGO AND TITLE -->
<br />
<p align="center">
  <h1 align="center">Tom Dong 3.0</h1>
  <p align="center">Third iteration of my website. Designed and built with TypeScript, React, and Gatsby.</p>
  <a><p align="center">See it live here.</p></a>
</p>

## Check out the design

You know how we have open-source code? Well, designers have been taking notice of us developers, so they've also started an open-source movement with open-source design. I'll share my Figma project that you can learn off of, just as I'm learning off of everyone else's Figma projects.

_Figma link will be added shortly._

## Play around with my code

1. Clone the repository and move to its folder
   ```sh
   git clone https://github.com/tomdng/tomdong-3.0.git && cd tomdong-3.0
   ```
2. Install the required packages for the project
   ```sh
   yarn install
   ```
3. Run the development version of the site
   ```sh
   yarn start
   ```
4. Now play around, code, and enjoy!
24.947368
167
0.712025
eng_Latn
0.988306
ede183c39416da3b08f8ef6ce27c10b060bc0ca0
6,810
md
Markdown
docs/visual-basic/programming-guide/com-interop/introduction-to-com-interop.md
ilyakharlamov/docs.fr-fr
54c09f71d03787b462bdd134b3407d5ed708a191
[ "CC-BY-4.0", "MIT" ]
1
2019-04-11T17:00:02.000Z
2019-04-11T17:00:02.000Z
docs/visual-basic/programming-guide/com-interop/introduction-to-com-interop.md
ilyakharlamov/docs.fr-fr
54c09f71d03787b462bdd134b3407d5ed708a191
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/visual-basic/programming-guide/com-interop/introduction-to-com-interop.md
ilyakharlamov/docs.fr-fr
54c09f71d03787b462bdd134b3407d5ed708a191
[ "CC-BY-4.0", "MIT" ]
1
2022-02-23T14:59:20.000Z
2022-02-23T14:59:20.000Z
---
title: Introduction to COM Interop (Visual Basic)
ms.date: 07/20/2015
helpviewer_keywords:
- interop assemblies
- COM interop [Visual Basic], about COM interop
ms.assetid: 8bd62e68-383d-407f-998b-29aa0ce0fd67
ms.openlocfilehash: e4421cbc40cdccc1dbbaeb459cb12fda0ee407cf
ms.sourcegitcommit: 8f95d3a37e591963ebbb9af6e90686fd5f3b8707
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 02/23/2019
ms.locfileid: "56745597"
---
# <a name="introduction-to-com-interop-visual-basic"></a>Introduction to COM Interop (Visual Basic)

The Component Object Model (COM) lets an object expose its functionality to other components and to host applications. Although COM objects have been fundamental to Windows programming for many years, applications designed for the common language runtime (CLR) offer many advantages. [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] applications will eventually replace those developed with COM. Until then, you may need to use or create COM objects by using Visual Studio. Interoperability with COM, or *COM interop*, enables you to use existing COM objects while transitioning to the [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] at your own pace.

By using the [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] to create COM components, you can use registration-free COM interop. This lets you control which DLL version is enabled when more than one version is installed on a computer, and lets end users use XCOPY or FTP to copy your application to an appropriate directory on their computer, where it can be run. For more information, see [Registration-Free COM Interop](../../../framework/interop/registration-free-com-interop.md).
## <a name="managed-code-and-data"></a>Managed code and data

Code developed for the [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] is called *managed code*, and contains metadata that is used by the CLR. Data used by [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] applications is called *managed data* because the runtime manages data-related tasks such as allocating and reclaiming memory and performing type checking. By default, Visual Basic .NET uses managed code and data, but you can access the unmanaged code and data of COM objects by using interop assemblies (described later on this page).

## <a name="assemblies"></a>Assemblies

An assembly is the primary building block of a [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] application. It is a collection of functionality that is built, versioned, and deployed as a single implementation unit containing one or more files. Each assembly contains an assembly manifest.

## <a name="type-libraries-and-assembly-manifests"></a>Type libraries and assembly manifests

Type libraries describe the characteristics of COM objects, such as member names and data types. Assembly manifests perform the same function for [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] applications. They include information about the following:

- Assembly identity, version, culture, and digital signature.
- Files that make up the assembly implementation.
- Types and resources that make up the assembly, including those that are exported from it.
- Compile-time dependencies on other assemblies.
- Permissions required for the assembly to run properly.
For more information about assemblies and assembly manifests, see [Assemblies in .NET](../../../standard/assembly/index.md).

### <a name="importing-and-exporting-type-libraries"></a>Importing and exporting type libraries

Visual Studio contains a utility, Tlbimp, that lets you import information from a type library into a [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] application. You can generate type libraries from assemblies by using the Tlbexp utility.

For more information about Tlbimp and Tlbexp, see [Tlbimp.exe (Type Library Importer)](../../../framework/tools/tlbimp-exe-type-library-importer.md) and [Tlbexp.exe (Type Library Exporter)](../../../framework/tools/tlbexp-exe-type-library-exporter.md).

## <a name="interop-assemblies"></a>Interop assemblies

Interop assemblies are [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] assemblies that act as a bridge between managed and unmanaged code, mapping COM object members to equivalent [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] managed members. Interop assemblies created by Visual Basic .NET handle many of the details of working with COM objects, such as interoperability marshaling.

## <a name="interoperability-marshaling"></a>Interoperability marshaling

All [!INCLUDE[dnprdnshort](~/includes/dnprdnshort-md.md)] applications share a set of common types that allow the interoperability of objects, regardless of the programming language that is used. The parameters and return values of COM objects sometimes use data types that differ from those used in managed code. *Interoperability marshaling* is the process of packaging parameters and return values into equivalent data types as they move to and from COM objects.
For more information, see [Interop Marshaling](../../../framework/interop/interop-marshaling.md).

## <a name="see-also"></a>See also

- [COM Interop](../../../visual-basic/programming-guide/com-interop/index.md)
- [Walkthrough: Implementing Inheritance with COM Objects](../../../visual-basic/programming-guide/com-interop/walkthrough-implementing-inheritance-with-com-objects.md)
- [Interoperating with Unmanaged Code](../../../framework/interop/index.md)
- [Troubleshooting Interoperability](../../../visual-basic/programming-guide/com-interop/troubleshooting-interoperability.md)
- [Assemblies in .NET](../../../standard/assembly/index.md)
- [Tlbimp.exe (Type Library Importer)](../../../framework/tools/tlbimp-exe-type-library-importer.md)
- [Tlbexp.exe (Type Library Exporter)](../../../framework/tools/tlbexp-exe-type-library-exporter.md)
- [Interop Marshaling](../../../framework/interop/interop-marshaling.md)
- [Registration-Free COM Interop](../../../framework/interop/registration-free-com-interop.md)
104.769231
681
0.777533
fra_Latn
0.92873
ede1ccd7393bcd67735a09049176603cf49cf088
824
md
Markdown
_posts/2019-12-01-apos-bolsonaro-vetar-folha-no-governo-crivella-corta-relacoes-com-o-globo.md
tatudoquei/tatudoquei.github.io
a3a3c362424fda626d7d0ce2d9f4bead6580631c
[ "MIT" ]
null
null
null
_posts/2019-12-01-apos-bolsonaro-vetar-folha-no-governo-crivella-corta-relacoes-com-o-globo.md
tatudoquei/tatudoquei.github.io
a3a3c362424fda626d7d0ce2d9f4bead6580631c
[ "MIT" ]
null
null
null
_posts/2019-12-01-apos-bolsonaro-vetar-folha-no-governo-crivella-corta-relacoes-com-o-globo.md
tatudoquei/tatudoquei.github.io
a3a3c362424fda626d7d0ce2d9f4bead6580631c
[ "MIT" ]
1
2022-01-13T07:57:24.000Z
2022-01-13T07:57:24.000Z
---
layout: post
item_id: 2810736060
title: >-
  After Bolsonaro vetoes Folha in the government, Crivella cuts ties with O Globo
author: Tatu D'Oquei
date: 2019-12-01 21:56:00
pub_date: 2019-12-01 21:56:00
time_added: 2019-12-02 20:49:47
category:
tags: []
image: https://conteudo.imguol.com.br/c/noticias/5a/2019/04/09/19jun2017---o-prefeito-do-rio-de-janeiro-marcelo-crivella-1554812958622_v2_615x300.jpg
---

After President Jair Bolsonaro (no party) announced that he would cancel the government's subscriptions to the newspaper Folha de S.

**Link:** [https://noticias.uol.com.br/politica/ultimas-noticias/2019/12/01/apos-bolsonaro-vetar-folha-no-governo-crivella-corta-relacoes-com-o-globo.htm](https://noticias.uol.com.br/politica/ultimas-noticias/2019/12/01/apos-bolsonaro-vetar-folha-no-governo-crivella-corta-relacoes-com-o-globo.htm)
43.368421
298
0.774272
por_Latn
0.633871
ede20f3bbbd4b523c73529ea00ac982da71d8d59
203
md
Markdown
README.md
kyleballard/mock-data-generator
04d201138e9cc25960230ee7053e05bba68f0f97
[ "MIT" ]
4
2018-05-04T00:54:35.000Z
2021-01-26T19:11:19.000Z
README.md
kyleballard/mock-data-generator
04d201138e9cc25960230ee7053e05bba68f0f97
[ "MIT" ]
null
null
null
README.md
kyleballard/mock-data-generator
04d201138e9cc25960230ee7053e05bba68f0f97
[ "MIT" ]
1
2020-01-08T19:26:47.000Z
2020-01-08T19:26:47.000Z
# Mock Data Generator

Proof of concept for generating sample ecommerce data using Faker.js.

### Instructions

```bash
npm install
npm start person
npm start inline
npm start hookio
npm start orders
```
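The `person` and `orders` targets emit randomized records. As a dependency-free sketch of the *shape* of data such a generator produces (the field names below are illustrative, not this project's actual schema — the real project uses Faker.js for realistic names, emails, and product titles):

```javascript
// Minimal stand-in for a Faker.js-style order generator.
// Field names are illustrative only.
function randomItem(list) {
  return list[Math.floor(Math.random() * list.length)];
}

function mockOrder(id) {
  return {
    orderId: id,
    customer: randomItem(['Ada', 'Grace', 'Alan']),
    product: randomItem(['Keyboard', 'Monitor', 'Mouse']),
    quantity: 1 + Math.floor(Math.random() * 5), // 1..5
  };
}

// Generate a small batch and print it as JSON.
const orders = Array.from({ length: 3 }, (_, i) => mockOrder(i + 1));
console.log(JSON.stringify(orders, null, 2));
```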
15.615385
68
0.773399
eng_Latn
0.728341
ede28a8aaeb6cdf34dd2045b285bc96c41d120ed
4,403
md
Markdown
README.md
kikitux/nginx-on-ubuntu-aws-image
9d2ee4d914e1bedec89846c56dd7037127d135f6
[ "MIT" ]
null
null
null
README.md
kikitux/nginx-on-ubuntu-aws-image
9d2ee4d914e1bedec89846c56dd7037127d135f6
[ "MIT" ]
null
null
null
README.md
kikitux/nginx-on-ubuntu-aws-image
9d2ee4d914e1bedec89846c56dd7037127d135f6
[ "MIT" ]
null
null
null
# AWS AMI: nginx on ubuntu using packer

# Description

Create an nginx web server on an Ubuntu image to deploy static pages.\
The image is built on AWS, by default in the "eu-central-1" region.

This image contains a default installation of nginx. If you need to change the nginx configuration, please refer to the [nginx docs](http://nginx.org/en/docs/).

# How to use

If you don't have AWS credentials, you can create them on [this page](https://console.aws.amazon.com/iam/home?#security_credential).

Add your AWS credentials as two environment variables:

```
export AWS_ACCESS_KEY_ID=<YOUR_ACCESS_KEY>
export AWS_SECRET_ACCESS_KEY=<YOUR_SECRET_KEY>
```

Clone the repository on your computer:

```bash
git clone git@github.com:ionhashicorp/nginx-on-ubuntu-aws-image.git
```

Change into the directory named nginx-on-ubuntu-aws-image:

```bash
cd nginx-on-ubuntu-aws-image
```

Build the image with the following command:

```
packer build nginx-on-ubuntu.pkr.hcl
```

The AMI image is located at: [AWS AMI management page](https://us-west-2.console.aws.amazon.com/ec2/v2/home?region=eu-central-1#Images:visibility=owned-by-me;search=learn-packer-linux-aws;sort=tag:Name)

# Change the region of the AMI

You can specify a different [region](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html) where you want to deploy the image by passing its region code:

```
packer build --var region=<REGION_CODE> nginx-on-ubuntu.pkr.hcl
```

# Sample output

```
± packer build nginx-on-ubuntu.pkr.hcl
nginx-on-ubuntu.amazon-ebs.ubuntu: output will be in this color.
==> nginx-on-ubuntu.amazon-ebs.ubuntu: Prevalidating any provided VPC information ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Prevalidating AMI Name: nginx-on-ubuntu-aws-20210916083844 nginx-on-ubuntu.amazon-ebs.ubuntu: Found Image ID: ami-01299931803ce83f6 ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Creating temporary keypair: packer_61430294-3765-0f07-d2eb-22783df0ed5f ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Creating temporary security group for this instance: packer_61430296-732c-85a5-b2f8-86a2659742b5 ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Authorizing access to port 22 from [0.0.0.0/0] in the temporary security groups... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Launching a source AWS instance... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Adding tags to source instance nginx-on-ubuntu.amazon-ebs.ubuntu: Adding tag: "Name": "Packer Builder" nginx-on-ubuntu.amazon-ebs.ubuntu: Instance ID: i-01477c57f4ebe3e49 ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Waiting for instance (i-01477c57f4ebe3e49) to become ready... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Using SSH communicator to connect: 18.185.116.244 ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Waiting for SSH to become available... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Connected to SSH! ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Provisioning with shell script: /var/folders/k2/jxw8lbzx3k91ml2q8w02zfv00000gq/T/packer-shell756209253 ==> nginx-on-ubuntu.amazon-ebs.ubuntu: ==> nginx-on-ubuntu.amazon-ebs.ubuntu: WARNING: apt does not have a stable CLI interface. Use with caution in scripts. ==> nginx-on-ubuntu.amazon-ebs.ubuntu: <<< output omitted for brevity >>> ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Stopping the source instance... nginx-on-ubuntu.amazon-ebs.ubuntu: Stopping instance ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Waiting for the instance to stop... 
==> nginx-on-ubuntu.amazon-ebs.ubuntu: Creating AMI nginx-on-ubuntu-aws-20210916083844 from instance i-01477c57f4ebe3e49 nginx-on-ubuntu.amazon-ebs.ubuntu: AMI: ami-0e51d829398e07bea ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Waiting for AMI to become ready... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Skipping Enable AMI deprecation... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Terminating the source AWS instance... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Cleaning up any extra volumes... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: No volumes to clean up, skipping ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Deleting temporary security group... ==> nginx-on-ubuntu.amazon-ebs.ubuntu: Deleting temporary keypair... Build 'nginx-on-ubuntu.amazon-ebs.ubuntu' finished after 5 minutes 16 seconds. ==> Wait completed after 5 minutes 16 seconds ==> Builds finished. The artifacts of successful builds are: --> nginx-on-ubuntu.amazon-ebs.ubuntu: AMIs were created: eu-central-1: ami-0e51d829398e07bea ```
48.922222
193
0.76357
eng_Latn
0.403561
ede2c6eb1cec76afa9e52ca175abfb7b62f55d56
356
md
Markdown
_pages/documents.md
panyan7/panyan7.github.io
b666cac19ce6faa51ce822faa396cd896957298c
[ "MIT" ]
null
null
null
_pages/documents.md
panyan7/panyan7.github.io
b666cac19ce6faa51ce822faa396cd896957298c
[ "MIT" ]
null
null
null
_pages/documents.md
panyan7/panyan7.github.io
b666cac19ce6faa51ce822faa396cd896957298c
[ "MIT" ]
null
null
null
---
layout: page
permalink: /docs/
title: Documents
description: Links to documents on my website.
nav: true
---

[CV](https://panyan7.github.io/cv/)/[Resume](https://panyan7.github.io/assets/pdf/yanpan_resume_jul21.pdf)

[Coursework](https://panyan7.github.io/blog/2021/courses/)

## Documents

[SURF Proposal](https://panyan7.github.io/blog/2021/surf/)
20.941176
106
0.733146
yue_Hant
0.398602
ede34e3e17d32b5321bcd7772da65c3fc6eff56b
455
md
Markdown
blog/2020-10-14-rosary.vi.md
dtgauburn/dtgauburn.github.io
428b46261fe8e3a9c8e9644bc5580e5c6ad31fe6
[ "MIT" ]
null
null
null
blog/2020-10-14-rosary.vi.md
dtgauburn/dtgauburn.github.io
428b46261fe8e3a9c8e9644bc5580e5c6ad31fe6
[ "MIT" ]
null
null
null
blog/2020-10-14-rosary.vi.md
dtgauburn/dtgauburn.github.io
428b46261fe8e3a9c8e9644bc5580e5c6ad31fe6
[ "MIT" ]
null
null
null
---
title: 'Bó Hoa Thiêng October!'
date: '2020-10-14 14:02:02'
description: 'Bring a rose (a rose represents one Hail Mary) to our blessed Mother this month!'
category: HS
image: '/assets/img/rosary-page.jpg'
---

Hi everyone!

Bring a rose (a rose represents one Hail Mary) to our blessed Mother this month! And submit your record to the link provided to receive one rosary that's been blessed by the Pope!

![BHT image](/assets/images/rosary-page.jpg)
37.916667
193
0.740659
eng_Latn
0.971023
ede3c9f5b6d83d567a9e23dd7bb4e4799d0da39e
1,505
md
Markdown
design-patterns/FactoryMethod-Java/README.md
nshaikh1/DesignAndMaintenance
6c03c6074482d0b29ed89c5b64a31e452dc80420
[ "MIT" ]
2
2020-01-28T00:36:43.000Z
2020-01-28T00:49:40.000Z
design-patterns/FactoryMethod-Java/README.md
nshaikh1/DesignAndMaintenance
6c03c6074482d0b29ed89c5b64a31e452dc80420
[ "MIT" ]
36
2020-01-28T06:51:40.000Z
2020-05-16T00:23:10.000Z
design-patterns/FactoryMethod-Java/README.md
nshaikh1/DesignAndMaintenance
6c03c6074482d0b29ed89c5b64a31e452dc80420
[ "MIT" ]
23
2020-01-28T00:51:29.000Z
2020-05-16T03:06:03.000Z
# Factory Pattern

In the Factory pattern, we create objects without exposing the creation logic to the client and refer to the newly created object through a common interface. A common real-world use of the Factory pattern is when a method returns one of several possible classes that share a common super class.

![UML of Globe class using Factory pattern](factory-java.png "UML class diagram of Factory Pattern")

## Java example

In this case, I have created the `Globe` interface [IGlobe.java](IGlobe.java), which contains an operation `displayRate`, a method used to display the rate of deaths in different continents. The same method is used by the different continents created: [Asia.java](Asia.java), [Africa.java](Africa.java), [Europe.java](Europe.java).

To hide the object creation from the user, I have created a factory class [GlobeFactory.java](GlobeFactory.java), whose `getInstance` method checks which object needs to be created, which is the motive behind hiding creational features from the user.

[FactoryMain.java](FactoryMain.java) contains the main method, which creates an object of the factory class [GlobeFactory.java](GlobeFactory.java) and requests which object needs to be created.

### Running the example

The provided [GlobeFactory.java](GlobeFactory.java) file can create multiple classes, but only the ones that are created will be displayed.

```{bash}
$output:
Africa has 0.0625 population affetced by COVID-19
Europe has 0.0026990555 population affetced by COVID-19
```
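The same structure can be sketched outside Java as well. The following is a hypothetical JavaScript rendering of the design described above; the `displayRate` operation and the continent/factory names come from the file list in the README, while the rates are placeholder values:

```javascript
// Minimal factory sketch mirroring the IGlobe/GlobeFactory design.
// The numbers below are placeholders, not the project's real data.
class Asia {
  displayRate() { return "Asia has 0.01 population affected by COVID-19"; }
}
class Africa {
  displayRate() { return "Africa has 0.0625 population affected by COVID-19"; }
}

// The factory hides which concrete class gets instantiated;
// callers only see the common displayRate() interface.
function getInstance(name) {
  const registry = { asia: Asia, africa: Africa };
  const Cls = registry[name.toLowerCase()];
  if (!Cls) throw new Error(`Unknown continent: ${name}`);
  return new Cls();
}

console.log(getInstance("africa").displayRate());
```

The client never touches `new Asia()` or `new Africa()` directly, which is the point of hiding the creation logic.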
71.666667
587
0.795349
eng_Latn
0.999111
ede4bdc441d797b67118f6edee55f3eef2bdf83d
321
markdown
Markdown
_posts/2018-12-13-project-2.markdown
danielabishop/danielabishop.github.io
921ff3e4848c6b2c347a96ac4d2c2907acc747f9
[ "MIT" ]
null
null
null
_posts/2018-12-13-project-2.markdown
danielabishop/danielabishop.github.io
921ff3e4848c6b2c347a96ac4d2c2907acc747f9
[ "MIT" ]
null
null
null
_posts/2018-12-13-project-2.markdown
danielabishop/danielabishop.github.io
921ff3e4848c6b2c347a96ac4d2c2907acc747f9
[ "MIT" ]
null
null
null
--- layout: default modal-id: 2 title: 2018 AGU Presentation date: 2018-12-11 img: Talk3.png alt: image-alt project-date: December 2018 description: Poster Presentation at the 2018 American Geophysical Union Fall Meeting in Washington, DC, on December 11, 2018. download: https://figshare.com/s/7493334d6d57566b8691 ---
24.692308
125
0.778816
yue_Hant
0.436094
ede558f31ac6d19a278dde0d3954eb0a55e20b5b
1,951
md
Markdown
input/en-us/features/powershell.md
rachfop/docs
f6a3a5b9ac4641ec6cc1ab1057622857c13518be
[ "Apache-2.0" ]
94
2020-10-30T11:02:37.000Z
2022-03-29T10:50:24.000Z
input/en-us/features/powershell.md
rachfop/docs
f6a3a5b9ac4641ec6cc1ab1057622857c13518be
[ "Apache-2.0" ]
339
2020-10-30T08:09:53.000Z
2022-03-31T23:43:38.000Z
input/en-us/features/powershell.md
rachfop/docs
f6a3a5b9ac4641ec6cc1ab1057622857c13518be
[ "Apache-2.0" ]
108
2020-10-30T08:09:25.000Z
2022-03-30T06:52:46.000Z
---
Order: 20
xref: packaging-for-the-masses
Title: Packaging for the masses - PowerShell
Description: At its heart, Chocolatey uses the Windows Automation Language, PowerShell, to perform its different operations
---

Released in November 2006, Windows PowerShell quickly became the de facto way to automate tasks on Windows. Whether it was automating the creation of users within Active Directory, or configuring/maintaining your Exchange Server, PowerShell was the way to do it.

When Chocolatey started, it was clear that PowerShell was going to be at the heart of how it works as well, and that continues to this day. Within a Chocolatey package, there are three main PowerShell files:

- chocolateyInstall.ps1
- chocolateyUninstall.ps1
- chocolateyBeforeModify.ps1

> :memo: **NOTE** Not all of these files are required within every Chocolatey package, as Chocolatey performs a number of tasks automatically, and overriding the default functionality is only required when you want to do something different.

When a Chocolatey package is installed/uninstalled, these files are executed at [different times](xref:create-packages#during-which-scenarios-will-my-custom-scripts-be-triggered), giving a [package creator](xref:create-your-own-packages) full control over exactly what actions are performed within a package. This flexibility allows for the creation of simple packages (that simply execute a native installer) all the way up to highly complicated installations that perform several operations.

On top of this, Chocolatey ships with a number of [built-in functions](xref:powershell-reference) to help with the specific tasks of installing applications, binaries, zips, and scripts. When you need to create your own custom PowerShell functions, Chocolatey provides the ability to create [extension packages](xref:extensions), to allow the sharing of code across multiple packages, rather than duplicating it.
75.038462
309
0.800615
eng_Latn
0.998513
ede564c54bc45140bd8c0426c863328ea827111d
1,072
md
Markdown
terraform/import-yaml/README.md
minyu19/workflows-demos
eb037b9776db03a840611d6806845d4b114873f2
[ "Apache-2.0" ]
29
2021-04-20T07:55:19.000Z
2022-03-22T14:08:21.000Z
terraform/import-yaml/README.md
minyu19/workflows-demos
eb037b9776db03a840611d6806845d4b114873f2
[ "Apache-2.0" ]
17
2021-05-04T07:47:37.000Z
2022-01-28T09:05:38.000Z
terraform/import-yaml/README.md
minyu19/workflows-demos
eb037b9776db03a840611d6806845d4b114873f2
[ "Apache-2.0" ]
16
2021-04-28T12:54:00.000Z
2022-02-27T10:56:25.000Z
# Terraform with imported YAML

This sample shows how to use Terraform's [google_workflows_workflow](https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/workflows_workflow) resource to deploy a Workflow from a main workflow YAML file.

## Terraform

See [main.tf](main.tf) for the Terraform definition and [workflows.yaml](workflows.yaml) for the Workflows definition.

1. Initialize terraform:

    ```sh
    terraform init
    ```

1. See the planned changes:

    ```sh
    terraform plan -var="project_id=YOUR-PROJECT-ID" -var="region=YOUR-GCP-REGION"
    ```

1. Create the workflow:

    ```sh
    terraform apply -var="project_id=YOUR-PROJECT-ID" -var="region=YOUR-GCP-REGION"
    ```

1. Once the workflow is created, you can see it in the list:

    ```sh
    gcloud workflows list --location YOUR-GCP-REGION
    ```

1. Cleanup:

    ```sh
    terraform destroy -var="project_id=YOUR-PROJECT-ID" -var="region=YOUR-GCP-REGION"
    ```

## Execute

You can optionally execute the workflow using gcloud:

```sh
gcloud workflows execute sample-workflow
```
22.333333
167
0.70709
eng_Latn
0.911111
ede61fd34b7bfb0ff3df4d7f37dca0503d4874f2
5,022
md
Markdown
LEARNING.md
Chris56974/microservices-learning-app
ce5c94916eaaf68408dc83d35b9a205008ce0e41
[ "MIT" ]
null
null
null
LEARNING.md
Chris56974/microservices-learning-app
ce5c94916eaaf68408dc83d35b9a205008ce0e41
[ "MIT" ]
null
null
null
LEARNING.md
Chris56974/microservices-learning-app
ce5c94916eaaf68408dc83d35b9a205008ce0e41
[ "MIT" ]
null
null
null
# Learning

## Working with Kafka as a StatefulSet is tricky sometimes

I ran into [this error](https://www.orchome.com/10529) and it was hard to figure out because the proposed solution didn't seem to work. After reading this [post on SO](https://stackoverflow.com/questions/65687515/delete-kubernetes-persistent-volume-from-statefulset-after-scale-down), I thought maybe I could delete the persistent volume claim and have it delete the persistent volume so I could reset the data in Kafka, but that didn't seem to work for me either. I think it was because I was running into [this problem some people are talking about on GitHub](https://github.com/kubernetes/kubernetes/issues/69697). I ended up fixing things by going into the troubleshooting page for Docker Desktop and resetting everything there, and it seems to work okay after that. I had to reinstall the nginx-ingress controller.

## Honestly, working with Kafka in general is tricky sometimes

I ran into a myriad of problems [like the one this guy had](https://github.com/confluentinc/examples/issues/398). I also had trouble setting up tests with Kafka. I don't want to mock Kafka because I don't think my tests would be that useful. But running Kafka in a container for all my tests is not that great either, because it takes a while to start, it's quite noisy, and my containers don't shut down properly when I run Jest.

## Choice of hash function

I was originally going to use bcrypt, but I decided on scrypt instead (using the Node crypto module) because I like to trim down on dependencies and prefer using core modules. But I also read that argon2 is popular and that it won a [password hashing competition](https://en.wikipedia.org/wiki/Password_Hashing_Competition). I'm not sure how the competition works, but my guess is it's a lot harder to brute force a password that has been hashed with argon2, so maybe I'll try that later.
## SEO

It somehow slipped my mind, but I just realized that the Google search crawler can't crawl anything that requires user authentication (it can't create an account and sign in). My original idea would've been completely disastrous for SEO lol, and it changes the site in a lot of ways on the frontend.

## Deployment stuff

In Docker, I could pass in environment variables from the host into the container like this:

```yaml
environment:
  FOO: $FOO
```

and then have a CI/CD pass in a different value for FOO for production. But in k8s, it doesn't look like I can do that anymore. It seems like I have to use secrets and configMaps instead.

```yaml
env:
  - name: FOO
    value: $FOO # doesn't work
```

I tried using kaniko, which is supported by Skaffold and accepts env variables for each pod (using Skaffold's templated fields), but kaniko couldn't build some of my images for some reason and I wasn't sure how to debug it. I also thought about using an environment.config file and then changing it later using a CI/CD so that it had different values for production. In the end, I went with Bitnami sealed secrets.

I also wasn't sure how I wanted to split up my development environment and my production environment. Skaffold has the same issue as k8s where I can't interpolate environment variables from the host unless it was for a templated field. So my skaffold file used to be massive with duplicate config until I used Skaffold patches. I haven't looked into kustomize, but maybe that was an even better solution.

I originally chose Skaffold because I wanted the same familiar DX I had with Docker Compose. But I'm not sure if it makes sense for what I'm doing. My computer is starting to slow down with 12 services, and Kafka is making it harder to understand things conceptually, especially when faced with the occasional error message.

## Jest stuff

I think I like using Jest over Postman when testing out an API.
I'm also realizing that the way I structure my application has a big effect on my tests. Usually, the more I optimize my code for tests, the less coupling my code ends up having, which is really nice.

I want to be able to create integration tests between all my microservices, but it seems tricky with Kafka. I don't have the expertise for those kinds of tests right now, so I might have to come back to that later, because tests would be really helpful in this regard.

## Distroless Overhyped?

I see a lot of discussion over whether distroless images even matter, like this [Red Hat article](https://www.redhat.com/en/blog/why-distroless-containers-arent-security-solution-you-think-they-are). It might make it harder to scan my images, but I'll give them a shot for now.

## I should've been using a linter a long time ago

It's done great things for my code in this project (prevented cyclic dependencies, made things consistent). And after reading this excellent [blog post by Pau Ramon Revilla](https://labs.factorialhr.com/posts/hooks-considered-harmful), it even seems like some of the pitfalls of different JS frameworks can be caught by a linter as well (to an extent).
96.576923
816
0.785544
eng_Latn
0.999665
ede63cf92fb44f449b6a4ab73c69a30b3e204e1f
4,524
md
Markdown
README.md
73k05/android-image-preview
b66f7c18f22784e7d1fa9ee62df4726845f38320
[ "MIT" ]
null
null
null
README.md
73k05/android-image-preview
b66f7c18f22784e7d1fa9ee62df4726845f38320
[ "MIT" ]
null
null
null
README.md
73k05/android-image-preview
b66f7c18f22784e7d1fa9ee62df4726845f38320
[ "MIT" ]
null
null
null
![](https://thedroid.io/assets/img/tb-image-preview.png)

![API](https://img.shields.io/badge/API-16%2B-34bf49.svg)

### Demo:

![Screenshot_1629961047](https://user-images.githubusercontent.com/9129812/130916268-004b5d8e-17ec-42aa-824c-b718d333b5e4.png)

[<img src="https://play.google.com/intl/en_us/badges/images/generic/en-play-badge.png" alt="Get it on Google Play" height="80">](https://play.google.com/store/apps/details?id=com.greentoad.turtlebody.imagepreview.sample)

## Image Preview Library for Android (AndroidX)

An Image Preview library for Android for selecting single/multiple files of any type.

## Setup

Step 1: Add the dependency

```gradle
dependencies {
    ...
    /*image preview*/
    implementation 'com.greentoad.turtlebody.imagepreview:image-preview:$latestVersion'
}
```

## Usage

Step 1: Declare and initialize ImagePreview.

#### Java

```java
ArrayList<Uri> arrayList = new ArrayList<>();
// add uris to arrayList

ImagePreviewConfig config = new ImagePreviewConfig().setAllowAddButton(true).setUris(arrayList);

ImagePreview.ImagePreviewImpl imagePreview = ImagePreview.with(this);
imagePreview
    .setConfig(config)
    .setListener(new ImagePreview.ImagePreviewImpl.OnImagePreviewListener() {
        @Override
        public void onDone(@NotNull ArrayList<Uri> data) {
            // after done, all uris are sent back
        }

        @Override
        public void onAddBtnClicked() {
            // triggered when the add button is clicked
        }
    })
    .start();
```

#### Kotlin

```kotlin
val arrayList = arrayListOf<Uri>()
// add uris to arrayList

val config = ImagePreviewConfig().setAllowAddButton(true).setUris(arrayList)

ImagePreview.with(this)
    .setConfig(config)
    .setListener(object : ImagePreview.ImagePreviewImpl.OnImagePreviewListener {
        override fun onDone(data: ArrayList<Uri>) {
            println("uris: $data")
        }

        override fun onAddBtnClicked() {
            println("addBtn clicked")
        }
    })
    .start()
```

## Explanation:

#### 1. ImagePreviewConfig:

It is used to set the configuration.

1. **.setAllowAddButton(booleanValue)**: tells whether to show the add button in the preview activity.
2. **.setUris(arrayList of Uri)**: sets the array of image Uris to be sent for preview, e.g.

```java
// allow button and set uri list
ImagePreviewConfig config = new ImagePreviewConfig().setAllowAddButton(true).setUris(arrayList);
```

#### 2. ExtraListener:

Callback listener for when the user clicks the add button or the done button, e.g.

```java
.setListener(new ImagePreview.ImagePreviewImpl.OnImagePreviewListener() {
    @Override
    public void onDone(@NotNull ArrayList<Uri> data) {
        // after done, all uris are sent back
    }

    @Override
    public void onAddBtnClicked() {
        // triggered when the add button is clicked
    }
})
```

#### 3. Manually dismissing ImagePreview:

```java
ImagePreview.ImagePreviewImpl imagePreview = ImagePreview.with(this);
// note: always use the same instance from which you started imagePreview
imagePreview.dismissImagePreview();
```

### URI:

We return the list of Uris after the done button is clicked, so it is helpful to know about Uris first. A Uniform Resource Identifier (URI) is a compact sequence of characters that identifies an abstract or physical resource. In Android, content providers manage access to a structured set of data. They encapsulate the data and provide mechanisms for defining data security. Content providers are the standard interface that connects data in one process with code running in another process. You can get almost all information from a Uri.

#### URI usages:

1. Get a file from a Uri:

```java
File file = new File(uri.getPath());
```

2. Get the MIME type from a Uri:

```java
String mimeType = getContentResolver().getType(uri);
```

3. Use in Glide:

```java
Glide.with(context)
    .load(uri)
    .into(imageView);
```

---

### Quick Links

* [ChangeLog](/CHANGELOG.md)
* [Documentation](https://github.com/Turtlebody/android-image-preview/wiki)

### Demos

* [Example](/Example.md)
* [Sample APK file](https://play.google.com/store/apps/details?id=com.greentoad.turtlebody.imagepreview.sample)

### Developers

* [API Documentation](https://github.com/Turtlebody/android-image-preview/wiki/API-Documentation)
* [Developer Setup & Usage](https://github.com/Turtlebody/android-image-preview/wiki/Developer-Setup)

---

## To pick media files (audio, image, video) you can use the [Turtlebody Media Picker](https://github.com/Turtlebody/android-media-picker) library.
26.302326
267
0.714191
kor_Hang
0.335573
ede6c7fcfca97024e47958a20be2ca8173927934
731
md
Markdown
README.md
JenningsMK/my-pattern-library
7f75ba31e1989621976b7eb5cdb41a2ba039e0c5
[ "MIT" ]
null
null
null
README.md
JenningsMK/my-pattern-library
7f75ba31e1989621976b7eb5cdb41a2ba039e0c5
[ "MIT" ]
null
null
null
README.md
JenningsMK/my-pattern-library
7f75ba31e1989621976b7eb5cdb41a2ba039e0c5
[ "MIT" ]
null
null
null
# The Reason

I have created a local server using [Express][1] to run our development environment at work. The aim of this repo is to improve what I have done & try different approaches.

## Current problems

* One of the problems I have with our current implementation is defining the data structure for a page. What I would like to happen is that the data is inherited from the component it is calling, with the option to overwrite it.
* My current implementation doesn't show you the output data & I believe this would be a nice-to-have.
* Have the local server running on HTTP2 so I can do some performance testing.

## License

Licensed under the MIT. See the license header in the respective file to be sure.

[1]: http://expressjs.com/
45.6875
121
0.770178
eng_Latn
0.999769
ede78d4d68b1aa2341584f254ece436c5a606929
273
md
Markdown
contact.md
ShunChengWu/cards
49662d7b442b13c80d5588cfd542aa8e875fc017
[ "MIT" ]
null
null
null
contact.md
ShunChengWu/cards
49662d7b442b13c80d5588cfd542aa8e875fc017
[ "MIT" ]
null
null
null
contact.md
ShunChengWu/cards
49662d7b442b13c80d5588cfd542aa8e875fc017
[ "MIT" ]
1
2021-09-29T12:27:46.000Z
2021-09-29T12:27:46.000Z
---
layout: page
title: Contact
permalink: /contact/
---

<iframe src="https://docs.google.com/forms/d/e/1FAIpQLSeMIBgrtP-iShw4_zUyg-rw9s-mY6eVGkYwGyBiUuX9BvZ0_Q/viewform?embedded=true" width="640" height="796" frameborder="0" marginheight="0" marginwidth="0">Loading…</iframe>
39
215
0.761905
eng_Latn
0.094287
ede791de552f71b52842b292472e96203a79eb30
1,889
md
Markdown
README.md
edh72/financial-demo
a08ddfc47369fc118ae821212166d0238d439930
[ "Apache-2.0" ]
1
2020-04-24T19:32:52.000Z
2020-04-24T19:32:52.000Z
README.md
edh72/financial-demo
a08ddfc47369fc118ae821212166d0238d439930
[ "Apache-2.0" ]
7
2020-04-02T11:57:39.000Z
2020-04-22T08:44:32.000Z
README.md
edh72/financial-demo
a08ddfc47369fc118ae821212166d0238d439930
[ "Apache-2.0" ]
null
null
null
# Financial Services Demo Bot

This is a demo bot for financial services.

## To install the dependencies:

Run:

```bash
pip install -r requirements.txt
```

## To run the bot:

Use `rasa train` to train a model.

Then, to run, first set up your action server in one terminal window:

```bash
rasa run actions
```

In another window, run the duckling server (for entity extraction):

```bash
docker run -p 8000:8000 rasa/duckling
```

Then to talk to the bot, run:

```
rasa shell --debug
```

Note that `--debug` mode will produce a lot of output meant to help you understand how the bot is working under the hood. To simply talk to the bot, you can remove this flag.

## Overview of the files

`data/core.md` - contains stories

`data/nlu.md` - contains NLU training data

`actions.py` - contains custom action/api code

`domain.yml` - the domain file, including bot response templates

`config.yml` - training configurations for the NLU pipeline and policy ensemble

`tests/e2e.md` - end-to-end test stories

## Things you can ask the bot

The bot currently has five skills. You can ask it to:

1. Transfer money to another person
2. Check your earning or spending history (with a specific vendor or overall)
3. Answer a question about transfer charges
4. Pay a credit card bill
5. Tell you your account balance

It also has a limited ability to switch skills mid-transaction and then return to the transaction at hand.

For the purposes of illustration, the bot recognises the following fictional credit card accounts:

- `gringots`
- `justice bank`
- `credit all`
- `iron bank`

It recognises the following payment amounts (besides actual currency amounts):

- `minimum balance`
- `current balance`

It recognises the following vendors (for spending history):

- `Starbucks`
- `Amazon`
- `Target`

You can change any of these by modifying `actions.py` and the corresponding NLU data.
23.6125
106
0.745368
eng_Latn
0.99777
ede83927235351c9efeef1ef0a602b5ac772815b
936
md
Markdown
2021/CVE-2021-33879.md
sei-vsarvepalli/cve
fbd9def72dd8f1b479c71594bfd55ddb1c3be051
[ "MIT" ]
4
2022-03-01T12:31:42.000Z
2022-03-29T02:35:57.000Z
2021/CVE-2021-33879.md
sei-vsarvepalli/cve
fbd9def72dd8f1b479c71594bfd55ddb1c3be051
[ "MIT" ]
null
null
null
2021/CVE-2021-33879.md
sei-vsarvepalli/cve
fbd9def72dd8f1b479c71594bfd55ddb1c3be051
[ "MIT" ]
1
2022-03-29T02:35:58.000Z
2022-03-29T02:35:58.000Z
### [CVE-2021-33879](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-33879)

![](https://img.shields.io/static/v1?label=Product&message=n%2Fa&color=blue)
![](https://img.shields.io/static/v1?label=Version&message=n%2Fa&color=blue)
![](https://img.shields.io/static/v1?label=Vulnerability&message=n%2Fa&color=brighgreen)

### Description

Tencent GameLoop before 4.1.21.90 downloaded updates over an insecure HTTP connection. A malicious attacker in an MITM position could spoof the contents of an XML document describing an update package, replacing a download URL with one pointing to an arbitrary Windows executable. Because the only integrity check would be a comparison of the downloaded file's MD5 checksum to the one contained within the XML document, the downloaded executable would then be executed on the victim's machine.

### POC

#### Reference
No POC found.

#### Github
- https://github.com/nomi-sec/PoC-in-GitHub
52
493
0.771368
eng_Latn
0.847871
ede854d684db1a62113b980ebbf243ccdd30036b
1,073
md
Markdown
docs/content/config.md
egig/drafterbit
2272315d5e6b90fc67cef9c511db10cb639ef56a
[ "MIT" ]
5
2015-01-05T07:24:51.000Z
2020-10-01T20:47:11.000Z
docs/content/config.md
egig/drafterbit
2272315d5e6b90fc67cef9c511db10cb639ef56a
[ "MIT" ]
17
2015-09-22T09:20:46.000Z
2021-08-11T02:57:46.000Z
docs/content/config.md
egig/drafterbit
2272315d5e6b90fc67cef9c511db10cb639ef56a
[ "MIT" ]
3
2015-01-27T07:31:16.000Z
2015-09-04T04:02:01.000Z
---
title: Config
---

There are two types of configuration:

1. Environment config, which you can set in an OS environment variable or in a `.env` file in the project root directory.
2. Project config, which you can set in `drafterbit.config.js`.

### Environment

The following are the environment variables used by drafterbit:

|Name|Description|Default|
|----|------|------|
| DRAFTERBIT_PORT | Port which the application should run on | 3000 |
| DRAFTERBIT_APP_NAME | The name of the application/website | |
| DRAFTERBIT_DEBUG | Whether to output debug information while running the server | false |
| DRAFTERBIT_ENV | Environment used by the application | production |

### Project

This config is defined in a file named `drafterbit.config.js` in the project directory, and you might want to include it in vcs as this config defines the features of the application.

|Name|Description|Default|
|----|------|------|
| appName | The name of your application; the environment variable DRAFTERBIT_APP_NAME takes higher priority | |
| theme | The theme name used | quill |
| plugins | The list of plugins | [] |
33.53125
112
0.732526
eng_Latn
0.993935
ede86a76b10e63a70ab7771aaaf139e3013a48d1
44,822
md
Markdown
articles/virtual-machines/workloads/sap/hana-backup-restore.md
changeworld/azure-docs.pt-pt
8a75db5eb6af88cd49f1c39099ef64ad27e8180d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-machines/workloads/sap/hana-backup-restore.md
changeworld/azure-docs.pt-pt
8a75db5eb6af88cd49f1c39099ef64ad27e8180d
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-machines/workloads/sap/hana-backup-restore.md
changeworld/azure-docs.pt-pt
8a75db5eb6af88cd49f1c39099ef64ad27e8180d
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Hana backup e restaurar em SAP HANA em Azure (Grandes Instâncias) / Microsoft Docs description: Como executar backup HANA e restaurar em SAP HANA em Azure (Grandes Instâncias) services: virtual-machines-linux documentationcenter: '' author: saghorpa manager: gwallace editor: '' ms.service: virtual-machines-linux ms.topic: article ms.tgt_pltfrm: vm-linux ms.workload: infrastructure ms.date: 10/16/2019 ms.author: saghorpa ms.custom: H1Hack27Feb2017 ms.openlocfilehash: 4384d29811d29f06422802abba5d3eb1ea5737e9 ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897 ms.translationtype: MT ms.contentlocale: pt-PT ms.lasthandoff: 03/27/2020 ms.locfileid: "72430087" --- # <a name="backup-and-restore"></a>Cópia de segurança e restauro >[!IMPORTANT] >Este artigo não substitui a documentação da administração SAP HANA ou as Notas SAP. Esperamos que tenha uma compreensão sólida e experiência na administração e operações da SAP HANA, especialmente para backup, restauro, alta disponibilidade e recuperação de desastres. Neste artigo, são mostradas imagens do Estúdio SAP HANA. Conteúdo, estrutura e natureza dos ecrãs das ferramentas de administração SAP e das próprias ferramentas podem mudar a partir da versão SAP HANA para lançar. É importante que exerça os passos e processos tomados no seu ambiente e com as suas versões e lançamentos HANA. Alguns processos descritos neste artigo são simplificados para uma melhor compreensão geral. Não devem ser usados como passos detalhados para uma eventual operação de manuais. Se pretender criar manualde operação para as suas configurações, testar e exercitar os seus processos e documentar os processos relacionados com as suas configurações específicas. Um dos aspetos mais importantes das bases de dados operacionais é protegê-las de eventos catastróficos. A causa destes eventos pode ser qualquer coisa, desde desastres naturais a erros simples do utilizador. 
O backup de uma base de dados, com a capacidade de restaurá-la a qualquer momento, como antes de alguém apagar dados críticos, permite a restauração a um estado o mais próximo possível da forma como estava antes da interrupção. Devem ser realizados dois tipos de cópias de segurança para alcançar a capacidade de restauro: - Backups de base de dados: Backups completos, incrementais ou diferenciais - Backups de registo de transações Além das cópias de segurança completas realizadas a nível de aplicação, pode efetuar cópias de segurança com instantâneos de armazenamento. Os instantâneos de armazenamento não substituem as cópias de segurança do registo de transações. As cópias de segurança do registo de transações continuam a ser importantes para restaurar a base de dados num determinado momento ou para esvaziar os registos de transações já comprometidas. Os instantâneos de armazenamento podem acelerar a recuperação fornecendo rapidamente uma imagem de avanço da base de dados. O SAP HANA on Azure (Grandes Instâncias) oferece duas opções de backup e restauro: - **Faça-o por si mesmo (DIY).** Depois de se certificar de que existe espaço suficiente para o disco, execute a base de dados completa e faça cópias de segurança através de um dos seguintes métodos de backup do disco. Pode fazer o seu back back diretamente aos volumes ligados às unidades HANA Large Instance ou às ações da NFS que são configuradas numa máquina virtual Azure (VM). Neste último caso, os clientes criaram um Linux VM em Azure, anexam o Armazenamento Azure ao VM e partilham o armazenamento através de um servidor NFS configurado nesse VM. Se executar a cópia de segurança contra volumes que se ligam diretamente às unidades HANA Large Instance, copie as cópias de cópias de segurança para uma conta de armazenamento Azure. Faça isto depois de criar um VM Azure que exporta ações da NFS baseadas no Armazenamento Azure. Também pode utilizar um cofre azure backup ou armazenamento a frio Azure. 
Another option is to use a third-party data protection tool to store the backups after they're copied to an Azure storage account. The DIY backup option also might be necessary for data that you need to store for longer periods of time for compliance and auditing purposes. In all cases, the backups are copied into NFS shares represented through a VM and Azure Storage.

- **Infrastructure backup and restore functionality.** You also can use the backup and restore functionality that the underlying infrastructure of SAP HANA on Azure (Large Instances) provides. This option fulfills the need for backups and fast restores.

The rest of this section addresses the backup and restore functionality that's offered with HANA Large Instances. This section also covers the relationship of backup and restore to the disaster recovery functionality offered by HANA Large Instances.

> [!NOTE]
> The snapshot technology that's used by the underlying infrastructure of HANA Large Instances has a dependency on SAP HANA snapshots. At this point, SAP HANA snapshots don't work in conjunction with multiple tenants of SAP HANA multitenant database containers. If only one tenant is deployed, SAP HANA snapshots do work, and you can use this method.

## <a name="use-storage-snapshots-of-sap-hana-on-azure-large-instances"></a>Use storage snapshots of SAP HANA on Azure (Large Instances)

The storage infrastructure underlying SAP HANA on Azure (Large Instances) supports storage snapshots of volumes. Both backup and restore of volumes are supported, with the following considerations:

- Instead of full database backups, storage volume snapshots are taken frequently.
- When a snapshot is triggered over the /hana/data and /hana/shared volumes (which include /usr/sap), the snapshot technology initiates an SAP HANA snapshot before it runs the storage snapshot. This SAP HANA snapshot is the setup point for eventual log restores after recovery of the storage snapshot. For a HANA snapshot to be successful, you need an active HANA instance. In an HSR scenario, a storage snapshot isn't supported on a current secondary node, where a HANA snapshot can't be performed.
- After the storage snapshot runs successfully, the SAP HANA snapshot is deleted.
- Transaction log backups are taken frequently and stored in the /hana/logbackups volume or in Azure. You can trigger the /hana/logbackups volume that contains the transaction log backups to take a snapshot separately. In that case, you don't need to take a HANA snapshot.
- If you must restore a database to a certain point in time, for a production outage, request that Microsoft Azure Support or SAP HANA on Azure restore to a certain storage snapshot. An example is a planned restore of a sandbox system to its original state.
- The SAP HANA snapshot that's included in the storage snapshot is an offset point for applying transaction log backups that ran and were stored after the storage snapshot was taken.
- These transaction log backups are taken to restore the database back to a certain point in time.

You can perform storage snapshots that target three classes of volumes:

- A combined snapshot over /hana/data and /hana/shared, which includes /usr/sap. This snapshot requires the creation of an SAP HANA snapshot as preparation for the storage snapshot.
The SAP HANA snapshot ensures that the database is in a consistent state from a storage point of view. For the restore process, it's the point to set up on.

- A separate snapshot over /hana/logbackups.
- An operating system partition.

To get the latest snapshot scripts and documentation, see [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md). When you download the snapshot script package from [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/release.md), you get three files. One of the files is documented in a PDF for the functionality provided. After you download the tool set, follow the instructions in "Get the snapshot tools."

## <a name="storage-snapshot-considerations"></a>Storage snapshot considerations

>[!NOTE]
>Storage snapshots consume storage space that's allocated to the HANA Large Instance units. Consider the following aspects of scheduling storage snapshots and how many storage snapshots to keep.

The specific mechanics of storage snapshots for SAP HANA on Azure (Large Instances) include:

- A specific storage snapshot, at the point in time when it's taken, consumes little storage.
- As data content changes and the content in the SAP HANA data files changes on the storage volume, the snapshot needs to store the original block content as well as the data changes.
- As a result, the storage snapshot increases in size. The longer the snapshot exists, the larger the storage snapshot becomes.
- The more changes that are made to the SAP HANA database volume over the lifetime of a storage snapshot, the larger the space consumption of the storage snapshot.
SAP HANA on Azure (Large Instances) comes with fixed volume sizes for the SAP HANA data and log volumes. Performing snapshots of those volumes eats into your volume space. You need to:

- Determine when to schedule storage snapshots.
- Monitor the space consumption of the storage volumes.
- Manage the number of snapshots that you store.

You can disable the storage snapshots when you either import masses of data or perform other significant changes to the HANA database.

The following sections provide information for performing these snapshots and include general recommendations:

- Although the hardware can sustain 255 snapshots per volume, you want to stay well below this number. The recommendation is 250 or less.
- Before you perform storage snapshots, monitor and keep track of free space.
- Lower the number of storage snapshots based on free space. You can lower the number of snapshots that you keep, or you can extend the volumes. You can order additional storage in 1-terabyte units.
- During activities such as moving data into SAP HANA with SAP platform migration tools (R3load) or restoring SAP HANA databases from backups, disable storage snapshots on the /hana/data volume.
- During larger reorganizations of SAP HANA tables, avoid storage snapshots if possible.
- Storage snapshots are a prerequisite to taking advantage of the disaster recovery capabilities of SAP HANA on Azure (Large Instances).

## <a name="prerequisites-for-using-self-service-storage-snapshots"></a>Prerequisites for using self-service storage snapshots

To make sure that the snapshot script runs successfully, make sure that Perl is installed on the Linux operating system on the HANA Large Instances server. Perl comes preinstalled on your HANA Large Instance unit.
To check the Perl version, use the following command:

`perl -v`

![The public key is copied by running this command](./media/hana-overview-high-availability-disaster-recovery/perl_screen.png)

## <a name="set-up-storage-snapshots"></a>Set up storage snapshots

To set up storage snapshots with HANA Large Instances, follow these steps:

1. Make sure that Perl is installed on the Linux operating system on the HANA Large Instances server.
1. Modify /etc/ssh/ssh\_config to add the line _MACs hmac-sha1_.
1. Create an SAP HANA backup user account on the master node for each SAP HANA instance you run, if applicable.
1. Install the SAP HANA HDB client on all the SAP HANA Large Instances servers.
1. On the first SAP HANA Large Instances server of each region, create a public key to access the underlying storage infrastructure that controls snapshot creation.
1. Copy the scripts and configuration file from [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/release.md) to the location of **hdbsql** in the SAP HANA installation.
1. Modify the *HANABackupCustomerDetails.txt* file as necessary for the appropriate customer specifications.

Get the latest snapshot scripts and documentation from [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/release.md). For the steps listed previously, see [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).
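As a sketch of step 2 in the preceding list, the SSH change amounts to appending a single line to the SSH client configuration file (back up the file before you edit it):

```
# /etc/ssh/ssh_config — line to append (step 2)
MACs hmac-sha1
```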
### <a name="consideration-for-mcod-scenarios"></a>Consideration for MCOD scenarios

If you run an [MCOD scenario](https://launchpad.support.sap.com/#/notes/1681092) with multiple SAP HANA instances on one HANA Large Instance unit, you have separate storage volumes provisioned for each of the SAP HANA instances. For more information on MDC and other considerations, see "Important things to remember" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

### <a name="step-1-install-the-sap-hana-hdb-client"></a>Step 1: Install the SAP HANA HDB client

The Linux operating system installed on SAP HANA on Azure (Large Instances) includes the folders and scripts necessary to run SAP HANA storage snapshots for backup and disaster recovery purposes. Check for more recent releases in [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/release.md). It's your responsibility to install the SAP HANA HDB client on the HANA Large Instance units while you install SAP HANA.

### <a name="step-2-change-the-etcsshssh_config"></a>Step 2: Change the /etc/ssh/ssh\_config

This step is described in "Enable communication with storage" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

### <a name="step-3-create-a-public-key"></a>Step 3: Create a public key

To enable access to the storage snapshot interfaces of your HANA Large Instance tenant, establish a sign-in procedure through a public key.
On the first SAP HANA on Azure (Large Instances) server in your tenant, create a public key to access the storage infrastructure. With a public key, a password isn't required to sign in to the storage snapshot interfaces. You also don't need to maintain password credentials with a public key. To generate a public key, see "Enable communication with storage" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

### <a name="step-4-create-an-sap-hana-user-account"></a>Step 4: Create an SAP HANA user account

To start the creation of SAP HANA snapshots, create a user account in SAP HANA that the storage snapshot scripts can use. Create an SAP HANA user account within SAP HANA Studio for this purpose. For MDC, the user must be created under SYSTEMDB and *not* under the SID database. In the single-container environment, the user is created in the tenant database. This account must have **Backup Admin** and **Catalog Read** privileges. To set up and use a user account, see "Enable communication with SAP HANA" on [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

### <a name="step-5-authorize-the-sap-hana-user-account"></a>Step 5: Authorize the SAP HANA user account

In this step, you authorize the SAP HANA user account that you created, so that the scripts don't need to submit passwords at run time. The SAP HANA command `hdbuserstore` enables the creation of an SAP HANA user key. The key is stored on one or more SAP HANA nodes.
The user key lets the user access SAP HANA without having to manage passwords from within the scripting process. The scripting process is discussed later in this article.

>[!IMPORTANT]
>Run these configuration commands with the same user context that the snapshot commands are run in. Otherwise, the snapshot commands won't work properly.

### <a name="step-6-get-the-snapshot-scripts-configure-the-snapshots-and-test-the-configuration-and-connectivity"></a>Step 6: Get the snapshot scripts, configure the snapshots, and test the configuration and connectivity

Download the most recent version of the scripts from [GitHub](https://github.com/Azure/hana-large-instances-self-service-scripts/tree/master/snapshot_tools_v4.1). The way the scripts are installed changed with release 4.1 of the scripts. For more information, see "Enable communication with SAP HANA" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md). For the exact sequence of commands, see "Easy installation of snapshot tools (default)" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md). We recommend that you use the default installation. To upgrade from version 3.x to 4.1, see "Upgrade an existing install" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).
To uninstall the 4.1 tool set, see "Uninstallation of the snapshot tools" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

Don't forget to run the steps described in "Complete setup of snapshot tools" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

The purpose of the different scripts and files as they're installed is described in "What are these snapshot tools?" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

Before you configure the snapshot tools, make sure that you also configured the HANA backup locations and settings correctly. For more information, see "SAP HANA configuration" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

The configuration of the snapshot tool set is described in "Config file - HANABackupCustomerDetails.txt" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).
#### <a name="test-connectivity-with-sap-hana"></a>Test connectivity with SAP HANA

After you put all the configuration data into the *HANABackupCustomerDetails.txt* file, check whether the configurations are correct for the HANA instance data. Use the script `testHANAConnection`, which is independent of an SAP HANA scale-up or scale-out configuration. For more information, see "Check connectivity with SAP HANA - testHANAConnection" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

#### <a name="test-storage-connectivity"></a>Test storage connectivity

The next test step is to check the connectivity with the storage, based on the data you put into the *HANABackupCustomerDetails.txt* configuration file. Then run a test snapshot. Before you run the `azure_hana_backup` command, you must run this test. For the sequence of commands for this test, see "Check connectivity with storage - testStorageSnapshotConnection" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

After a successful sign-in to the storage virtual machine interfaces, the script continues with phase 2 and creates a test snapshot. The output is shown here for a three-node scale-out configuration of SAP HANA.

If the test snapshot runs successfully with the script, you can schedule the actual storage snapshots. If it isn't successful, investigate the problems before you move forward. The test snapshot should stay around until the first real snapshots are done.
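As a sketch, assuming the tools were installed alongside **hdbsql** as described earlier (the exact script file names and required arguments can vary with the tool-set version you downloaded, so check the guide first), the two tests run from the installation directory in this order:

```
# 1. Check connectivity with SAP HANA, based on HANABackupCustomerDetails.txt
./testHANAConnection

# 2. Check connectivity with the storage interfaces, and create a test snapshot
./testStorageSnapshotConnection
```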
### <a name="step-7-perform-snapshots"></a>Step 7: Perform snapshots

When the preparation steps are finished, you can start to configure and schedule the actual storage snapshots. The script to be scheduled works with SAP HANA scale-up and scale-out configurations. For periodic and regular execution of the backup script, schedule the script by using the cron utility.

For the exact command syntax and functionality, see "Perform snapshot backup - azure_hana_backup" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

When the `azure_hana_backup` script runs, it creates the storage snapshot in the following three phases:

1. It runs an SAP HANA snapshot.
1. It runs a storage snapshot.
1. It removes the SAP HANA snapshot that was created before the storage snapshot ran.

To run the script, call it from the HDB executable folder to which it was copied.

The retention period is administered with the number of snapshots that's submitted as a parameter when you run the script. The amount of time that's covered by the storage snapshots is a function of the period of execution and the number of snapshots submitted as a parameter when the script runs.

If the number of snapshots that are kept exceeds the number that's named as a parameter in the call of the script, the oldest storage snapshot of the same label is deleted before a new snapshot runs. The number you give as the last parameter of the call is the number you can use to control how many snapshots are kept. With this number, you also can control, indirectly, the disk space that's used for snapshots.
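For illustration, a one-off run of the backup script for a combined /hana/data and /hana/shared snapshot could look like the following sketch. The parameter values match the crontab examples used elsewhere in this article; adjust the label and retention count to your own needs:

```
# Take a combined hana-type snapshot labeled "dailyhana", and keep only the
# 28 most recent snapshots carrying that label
./azure_hana_backup --type=hana --prefix=dailyhana --frequency=15min --retention=28
```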
## <a name="snapshot-strategies"></a>Snapshot strategies

The frequency of snapshots for the different types depends on whether you use the HANA Large Instance disaster recovery functionality. This functionality relies on storage snapshots, which might require special recommendations for the frequency and execution periods of the storage snapshots.

In the considerations and recommendations that follow, the assumption is that you do *not* use the disaster recovery functionality that HANA Large Instances offers. Instead, you use the storage snapshots to have backups and to be able to provide point-in-time recovery for the last 30 days. Given the limitations of the number of snapshots and space, consider the following requirements:

- The recovery time for point-in-time recovery.
- The space used.
- The recovery point and recovery time objectives for a potential recovery from a disaster.
- The eventual execution of full HANA database backups against disks. Whenever a full database backup against disks or the **backint** interface is performed, the execution of storage snapshots fails. If you plan to run full database backups on top of storage snapshots, make sure that the execution of storage snapshots is disabled during this time.
- The number of snapshots per volume, which is limited to 250.

<!-- backint is the term for an SAP HANA interface and not a spelling error -->
If you don't use the disaster recovery functionality of HANA Large Instances, the snapshot period can be less frequent. In such cases, perform the combined snapshots on /hana/data and /hana/shared, which includes /usr/sap, in 12-hour or 24-hour periods. Keep the snapshots for a month. The same is true for the snapshots of the log backup volume.
The execution of SAP HANA transaction log backups against the log backup volume occurs in 5-minute to 15-minute periods.

Scheduled storage snapshots are best performed by using cron. Use the same script for all backup and disaster recovery needs. Modify the script inputs to match the different requested backup times. These snapshots are all scheduled differently in cron, depending on their execution time: hourly, every 12 hours, daily, or weekly.

The following example shows a cron schedule in /etc/crontab:

```
00 1-23 * * * ./azure_hana_backup --type=hana --prefix=hourlyhana --frequency=15min --retention=46
10 00 * * * ./azure_hana_backup --type=hana --prefix=dailyhana --frequency=15min --retention=28
00,05,10,15,20,25,30,35,40,45,50,55 * * * * ./azure_hana_backup --type=logs --prefix=regularlogback --frequency=3min --retention=28
22 12 * * * ./azure_hana_backup --type=logs --prefix=dailylogback --frequency=3min --retention=28
30 00 * * * ./azure_hana_backup --type=boot --boottype=TypeI --prefix=dailyboot --frequency=15min --retention=28
```

In the previous example, an hourly combined snapshot covers the volumes that contain the /hana/data and /hana/shared/SID locations, which include /usr/sap. Use this type of snapshot for a faster point-in-time recovery within the past two days. There's also a daily snapshot on those volumes. So, you have two days of coverage by hourly snapshots, plus four weeks of coverage by daily snapshots. The transaction log backup volume also is backed up daily. Those backups are kept for four weeks.

As you can see in the third line of the crontab, the HANA transaction log backup is scheduled to run every 5 minutes.
The start times of the different cron jobs that run storage snapshots are staggered. That way, the snapshots don't all run at once at a certain point in time.

In the following example, you perform a combined snapshot that covers the volumes that contain the /hana/data and /hana/shared/SID locations, which include /usr/sap, on an hourly basis. You keep these snapshots for two days. The snapshots of the transaction log backup volumes run on a 5-minute basis and are kept for four hours. As before, the HANA transaction log file backup is scheduled to run every 5 minutes.

The snapshot of the transaction log backup volume is performed with a 2-minute delay after the start of the transaction log backup. Under normal circumstances, the SAP HANA transaction log backup finishes within those 2 minutes. As before, the volume that contains the boot LUN is backed up once per day by a storage snapshot, and it's kept for four weeks.

```
10 0-23 * * * ./azure_hana_backup --type=hana --prefix=hourlyhana --frequency=15min --retention=48
0,5,10,15,20,25,30,35,40,45,50,55 * * * * ./azure_hana_backup --type=logs --prefix=regularlogback --frequency=3min --retention=28
2,7,12,17,22,27,32,37,42,47,52,57 * * * * ./azure_hana_backup --type=logs --prefix=logback --frequency=3min --retention=48
30 00 * * * ./azure_hana_backup --type=boot --boottype=TypeII --prefix=dailyboot --frequency=15min --retention=28
```

The following graphic illustrates the sequences of the previous example. The boot LUN is excluded.

![Relationship between backups and snapshots](./media/hana-overview-high-availability-disaster-recovery/backup_snapshot_updated0921.PNG)

SAP HANA performs regular writes against the /hana/log volume to document committed changes to the database. On a regular basis, SAP HANA writes a savepoint to the /hana/data volume.
As specified in the crontab, an SAP HANA transaction log backup runs every 5 minutes. You also see that an SAP HANA snapshot runs every hour as a result of triggering a combined storage snapshot over the /hana/data and /hana/shared/SID volumes. After the HANA snapshot succeeds, the combined storage snapshot runs. As instructed in the crontab, the storage snapshot on the /hana/logbackup volume runs every 5 minutes, around 2 minutes after the HANA transaction log backup.

>
>[!IMPORTANT]
> The use of storage snapshots for SAP HANA backups is valuable only when the snapshots are performed in conjunction with SAP HANA transaction log backups. These transaction log backups need to cover the time periods between the storage snapshots.

If you've set a commitment to users of point-in-time recovery of 30 days, you need to:

- Access a combined storage snapshot over /hana/data and /hana/shared/SID that's 30 days old, in extreme cases.
- Have contiguous transaction log backups that cover the time between any of the combined storage snapshots. So, the oldest snapshot of the transaction log backup volume needs to be 30 days old. This isn't the case if you copy the transaction log backups to another NFS share that's located on Azure Storage. In that case, you can pull old transaction log backups from that NFS share.

To benefit from storage snapshots and the eventual storage replication of transaction log backups, change the location to which SAP HANA writes the transaction log backups. You can make this change in HANA Studio.
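If you prefer to script the change rather than use HANA Studio, the log backup location is the `basepath_logbackup` parameter in the persistence section of global.ini. A hedged sketch with `hdbsql` follows; the instance number, user, and target path are hypothetical placeholders for your environment:

```
# Point SAP HANA transaction log backups at the /hana/logbackups volume
# (instance 00, SID H80, and the SYSTEM user are placeholders)
hdbsql -i 00 -d SYSTEMDB -u SYSTEM \
  "ALTER SYSTEM ALTER CONFIGURATION ('global.ini','SYSTEM') SET ('persistence','basepath_logbackup') = '/hana/logbackups/H80' WITH RECONFIGURE"
```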
Although SAP HANA backs up full log segments automatically, specify a log backup interval to be deterministic. This is especially true when you use the disaster recovery option, because you usually want to run log backups with a deterministic period. In the following case, 15 minutes is set as the log backup interval.

![Schedule SAP HANA backup logs in SAP HANA Studio](./media/hana-overview-high-availability-disaster-recovery/image5-schedule-backup.png)

You also can choose backups that are more frequent than every 15 minutes. A more frequent setting is often used in conjunction with the disaster recovery functionality of HANA Large Instances. Some customers perform transaction log backups every 5 minutes.

If the database has never been backed up, the final step is to perform a file-based database backup to create a single backup entry that must exist within the backup catalog. Otherwise, SAP HANA can't initiate your specified log backups.

![Make a file-based backup to create a single backup entry](./media/hana-overview-high-availability-disaster-recovery/image6-make-backup.png)

After your first storage snapshots run successfully, delete the test snapshot that ran in step 6. For more information, see "Remove test snapshots - removeTestStorageSnapshot" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).
### <a name="monitor-the-number-and-size-of-snapshots-on-the-disk-volume"></a>Monitor the number and size of snapshots on the disk volume

On a specific storage volume, you can monitor the number of snapshots and the storage consumption of those snapshots. The `ls` command doesn't show the snapshot directory or files. The Linux OS command `du` shows details about those storage snapshots, because they're stored on the same volumes. Use the command with the following options:

- `du -sh .snapshot`: This option provides a total of all the snapshots within the snapshot directory.
- `du -h --max-depth=1 .snapshot`: This option lists all the snapshots that are saved in the **.snapshot** folder and the size of each snapshot.
- `du -hc .snapshot`: This option provides the total size used by all the snapshots.

Use these commands to make sure that the snapshots that are taken and stored don't consume all the storage on the volumes.

>[!NOTE]
>The snapshots of the boot LUN aren't visible with the previous commands.

### <a name="get-details-of-snapshots"></a>Get details of snapshots

To get more details on snapshots, use the script `azure_hana_snapshot_details`. You can run this script in either location if there's an active server in the disaster recovery location. The script provides the following output, broken down by each volume that contains snapshots:

* The size of total snapshots in a volume
* The following details in each snapshot in that volume:
  - Snapshot name
  - Create time
  - Size of the snapshot
  - Frequency of the snapshot
  - HANA backup ID associated with that snapshot, if relevant

For the command syntax and outputs, see "List snapshots - azure_hana_snapshot_details" in [Microsoft Snapshot Tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).
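The `du` options can be illustrated with a small mock directory tree. On a HANA Large Instance unit, you'd run the same `du` commands from within the real volume (for example, from /hana/data) against the actual **.snapshot** directory; the directory and file names below are stand-ins created just for the demonstration:

```shell
# Build a mock volume with a .snapshot directory holding two "snapshots"
mkdir -p demo_volume/.snapshot/hourlyhana.0 demo_volume/.snapshot/hourlyhana.1
head -c 4096 /dev/zero > demo_volume/.snapshot/hourlyhana.0/datafile
head -c 8192 /dev/zero > demo_volume/.snapshot/hourlyhana.1/datafile

# Total of everything under .snapshot
du -sh demo_volume/.snapshot

# Size of each individual snapshot directory
du -h --max-depth=1 demo_volume/.snapshot

# Sizes plus a grand-total line
du -hc demo_volume/.snapshot
```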
### <a name="reduce-the-number-of-snapshots-on-a-server"></a>Reduce the number of snapshots on a server

As previously explained, you can reduce the number of certain labels of snapshots that you store. The last two parameters of the command to initiate a snapshot are the label and the number of snapshots you want to retain.

```
./azure_hana_backup --type=hana --prefix=dailyhana --frequency=15min --retention=28
```

In the preceding example, the snapshot label is **dailyhana**. The number of snapshots with this label to keep is **28**. As you respond to disk space consumption, you can reduce the number of stored snapshots. An easy way to reduce the number of snapshots to 15, for example, is to run the script with the last parameter set to **15**:

```
./azure_hana_backup --type=hana --prefix=dailyhana --frequency=15min --retention=15
```

If you run the script with this setting, the number of snapshots, which includes the new storage snapshot, is 15. The 15 most recent snapshots are kept, and the 15 older snapshots are deleted.

>[!NOTE]
> This script reduces the number of snapshots only if there are snapshots more than one hour old. The script doesn't delete snapshots that are less than one hour old. These restrictions are related to the optional disaster recovery functionality offered.

If you no longer want to maintain a set of snapshots with the backup prefix **dailyhana** in the syntax examples, run the script with **0** as the retention number. All snapshots that match that label are then removed. Removing all snapshots can affect the capabilities of HANA Large Instances disaster recovery functionality.

A second option to delete specific snapshots is to use the script `azure_hana_snapshot_delete`.
This script is designed to delete a snapshot or set of snapshots either by using the HANA backup ID, as found in HANA Studio, or through the snapshot name itself. Currently, the backup ID is only tied to the snapshots created for the **hana** snapshot type. Snapshot backups of the type **logs** and **boot** don't perform an SAP HANA snapshot, so there's no backup ID for those snapshots. If the snapshot name is given, the script looks for all snapshots on the different volumes that match the entered snapshot name.
<!-- hana, logs and boot are no spelling errors as Acrolinx indicates, but terms of parameter values -->

For more information on the script, see "Delete a snapshot - azure_hana_snapshot_delete" in [Microsoft snapshot tools for SAP HANA on Azure](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

Run the script as user **root**.

>[!IMPORTANT]
>If there's data that exists only on the snapshot you plan to delete, after the snapshot is deleted, that data is lost forever.

## <a name="file-level-restore-from-a-storage-snapshot"></a>File-level restore from a storage snapshot

<!-- hana, logs and boot are no spelling errors as Acrolinx indicates, but terms of parameter values -->
For the snapshot types **hana** and **logs**, you can access the snapshots directly on the volumes in the **.snapshot** directory. There's a subdirectory for each of the snapshots. Copy each file in the state it was in at the point of the snapshot from that subdirectory into the actual directory structure.

In the current version of the script, there is *no* restore script provided for the snapshot restore as self-service.
Snapshot restore can be performed as part of the self-service disaster recovery scripts in the disaster recovery location during failover. To restore a desired snapshot from the available snapshots, you must contact the Microsoft operations team by opening a service request.

>[!NOTE]
>Single-file restore doesn't work for snapshots of the boot LUN, independent of the type of HANA Large Instance units. The **.snapshot** directory isn't exposed in the boot LUN.

## <a name="recover-to-the-most-recent-hana-snapshot"></a>Recover to the most recent HANA snapshot

In a production-down scenario, the process of recovering from a storage snapshot can be initiated as an incident with Microsoft Azure Support. It's a matter of high urgency if data was deleted in a production system and the only way to retrieve it is to restore the production database.

In a different situation, a point-in-time recovery might be low urgency and planned days in advance. You can plan this recovery with SAP HANA on Azure instead of raising a high-priority flag. For example, you might plan to upgrade the SAP software by applying a new enhancement package. You then need to revert to a snapshot that represents the state before the enhancement package upgrade.

Before you send the request, you need to prepare. The SAP HANA on Azure team can then handle the request and provide the restored volumes. Afterward, you restore the HANA database based on the snapshots.

For how to get a snapshot restored with the new tool set, see "How to restore a snapshot" in the [manual recovery guide for SAP HANA on Azure from a storage snapshot](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

To prepare for the request, follow these steps.

1.
Decide which snapshot to restore. Only the hana/data volume is restored unless you instruct otherwise.

1. Shut down the HANA instance.

   ![Shut down the HANA instance](./media/hana-overview-high-availability-disaster-recovery/image7-shutdown-hana.png)

1. Unmount the data volumes on each HANA database node. If the data volumes are still mounted to the operating system, the restoration of the snapshot fails.

   ![Unmount the data volumes on each HANA database node](./media/hana-overview-high-availability-disaster-recovery/image8-unmount-data-volumes.png)

1. Open an Azure support request, and include instructions about the restoration of a specific snapshot:

   - During the restoration: SAP HANA on Azure Service might ask you to attend a conference call to coordinate, verify, and confirm that the correct storage snapshot is restored.
   - After the restoration: SAP HANA on Azure Service notifies you when the storage snapshot is restored.

1. After the restoration process is complete, remount all the data volumes.

   ![Remount all the data volumes](./media/hana-overview-high-availability-disaster-recovery/image9-remount-data-volumes.png)

Another possibility for getting, for example, SAP HANA data files recovered from a storage snapshot, is documented in step 7 in the [manual recovery guide for SAP HANA on Azure from a storage snapshot](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

To restore from a snapshot backup, see the [manual recovery guide for SAP HANA on Azure from a storage snapshot](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).
>[!Note]
>If your snapshot was restored by Microsoft operations, you don't need to perform step 7.

### <a name="recover-to-another-point-in-time"></a>Recover to another point in time

To restore to a certain point in time, see "Recover the database to the following point in time" in the [manual recovery guide for SAP HANA on Azure from a storage snapshot](https://github.com/Azure/hana-large-instances-self-service-scripts/blob/master/latest/Microsoft%20Snapshot%20Tools%20for%20SAP%20HANA%20on%20Azure%20Guide.md).

## <a name="next-steps"></a>Next steps

- See [Disaster recovery principles and preparation](hana-concept-preparation.md).
---
layout: post
title: "Term Frequency - Inverse Document Frequency"
author: Pahul Preet Singh Kohli
categories: [Term Frequency - Inverse Document Frequency, TF-IDF, NLP, Natural Language Processing, TfidfVectorizer, Python]
image: assets/images/tf.png
description: "TF-IDF is an information retrieval technique that weighs a term’s frequency (TF) and its inverse document frequency (IDF). Each word or term has its respective TF and IDF score. The product of the TF and IDF scores of a term is called the TF*IDF weight of that term."
comments: false
---

**TF-IDF**: an information retrieval technique that weighs a term’s frequency (TF) and its inverse document frequency (IDF). Each word or term has its respective TF and IDF score. The product of the TF and IDF scores of a term is called the TF\*IDF weight of that term.

**Term Frequency (TF)**: the ratio of the number of times a word appears in a document to the total number of words in that document.

**Inverse Document Frequency (IDF)**: assigns a higher weight to the rare words in the text corpus. It is defined as the log of the ratio of the number of documents to the number of documents that contain a particular word.

![image]({{ site.baseurl }}/assets/images/idf.png)

**IDF Smoothing**: done to make sure that terms with zero IDF don't get suppressed entirely.

![image]({{ site.baseurl }}/assets/images/idf_smooth.png)

**TF-IDF score**: the product of the Term Frequency and the Inverse Document Frequency.
![image]({{ site.baseurl }}/assets/images/tfidf.png)

***TF-IDF Implementation Code***:

```python
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

s1 = "The cat sat on my face"
s2 = "The dog sat on my bed"

# without smooth IDF
print("Without Smoothing:")
# define tf-idf
tf_idf_vec = TfidfVectorizer(use_idf=True, norm=None, smooth_idf=False,
                             sublinear_tf=False, binary=False,
                             max_features=None, strip_accents='unicode',
                             ngram_range=(1, 1), preprocessor=None,
                             stop_words=None, tokenizer=None, vocabulary=None)

# transform
tf_idf_data = tf_idf_vec.fit_transform([s1, s2])

# create dataframe
tf_idf_dataframe = pd.DataFrame(tf_idf_data.toarray(),
                                columns=tf_idf_vec.get_feature_names())
print(tf_idf_dataframe)
print("\n")

# with smooth IDF
tf_idf_vec_smooth = TfidfVectorizer(use_idf=True, norm=None, smooth_idf=True,
                                    sublinear_tf=False, binary=False,
                                    max_features=None, strip_accents='unicode',
                                    ngram_range=(1, 1), preprocessor=None,
                                    stop_words=None, tokenizer=None, vocabulary=None)

tf_idf_data_smooth = tf_idf_vec_smooth.fit_transform([s1, s2])
print("With Smoothing:")
tf_idf_dataframe_smooth = pd.DataFrame(tf_idf_data_smooth.toarray(),
                                       columns=tf_idf_vec_smooth.get_feature_names())
print(tf_idf_dataframe_smooth)
```

Output:

```
Without Smoothing:
      bed      cat      dog     face   my   on   sat   the
--  -------  -------  -------  -------  ---  ---  ----  ----
 0  0        1.69315  0        1.69315    1    1     1     1
 1  1.69315  0        1.69315  0          1    1     1     1

With Smoothing:
      bed      cat      dog     face   my   on   sat   the
--  -------  -------  -------  -------  ---  ---  ----  ----
 0  0        1.40547  0        1.40547    1    1     1     1
 1  1.40547  0        1.40547  0          1    1     1     1
```
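The numbers in the output tables can be checked by hand. As a sketch using only the standard library (no scikit-learn), here are the two IDF values for a term such as "cat" that appears in one of the two example documents — sklearn's `TfidfVectorizer` uses `ln(n/df) + 1` without smoothing and `ln((1+n)/(1+df)) + 1` with smoothing:

```python
import math

n_docs = 2  # two documents: s1 and s2
df = 1      # "cat" appears in only one of them

# smooth_idf=False: idf = ln(n / df) + 1
idf_no_smooth = math.log(n_docs / df) + 1

# smooth_idf=True: idf = ln((1 + n) / (1 + df)) + 1
idf_smooth = math.log((1 + n_docs) / (1 + df)) + 1

print(round(idf_no_smooth, 5))  # 1.69315
print(round(idf_smooth, 5))     # 1.40547
```

Since each word appears exactly once in its six-word sentence, the raw term count is 1 and the TF-IDF weight (with `norm=None`) equals the IDF itself, matching the table values.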
---
title: Debugging GPU Code | Microsoft Docs
description: Learn about debugging C++ code that runs on the graphics processing unit (GPU) in Visual Studio.
ms.custom: SEO-VS-2020
ms.date: 11/04/2016
ms.topic: conceptual
dev_langs:
- CSharp
- VB
- FSharp
- C++
ms.assetid: c7e77a5a-cb57-4b11-9187-ecc89acc8775
author: mikejo5000
ms.author: mikejo
manager: jmartens
ms.technology: vs-ide-debug
ms.workload:
- multiple
ms.openlocfilehash: 86e7593ff4df88efb24592fcd758156575eb9fd4
ms.sourcegitcommit: b12a38744db371d2894769ecf305585f9577792f
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 09/13/2021
ms.locfileid: "126725428"
---
# <a name="debugging-gpu-code"></a>Debugging GPU Code

You can debug C++ code that runs on the graphics processing unit (GPU). GPU debugging support in Visual Studio includes race detection, launching processes and attaching to them, and integration into the debugging windows.

## <a name="supported-platforms"></a>Supported Platforms

Debugging is supported on [!INCLUDE[win7](../debugger/includes/win7_md.md)], [!INCLUDE[win8](../debugger/includes/win8_md.md)], Windows 10, [!INCLUDE[winsvr08_r2](../debugger/includes/winsvr08_r2_md.md)], [!INCLUDE[winserver8](../debugger/includes/winserver8_md.md)], and Windows Server 2016. For debugging on the software emulator, [!INCLUDE[win8](../debugger/includes/win8_md.md)], Windows 10, [!INCLUDE[winserver8](../debugger/includes/winserver8_md.md)], or Windows Server 2016 is required. For debugging on the hardware, you must install the drivers for your graphics card. Not all hardware vendors implement all debugger features. See the vendor documentation for limitations.

> [!NOTE]
> Independent hardware vendors who want to support GPU debugging in Visual Studio must create a DLL that implements the VSD3DDebug interface and targets their own drivers.
## <a name="configuring-gpu-debugging"></a>Configuring GPU Debugging

The debugger can't break in both CPU code and GPU code in the same application execution. By default, the debugger breaks in CPU code. To debug GPU code, use one of these two steps:

- In the **Debug Type** list on the **Standard** toolbar, select **GPU Only**.
- In **Solution Explorer**, on the shortcut menu for the project, select **Properties**. In the **Property Pages** dialog box, select **Debugging**, and then select **GPU Only** in the **Debugger Type** list.

## <a name="launching-and-attaching-to-applications"></a>Launching and Attaching to Applications

You can use the Visual Studio debugging commands to start GPU debugging. For more information, see [Navigating through Code with the Debugger](../debugger/navigating-through-code-with-the-debugger.md). You can also attach the GPU debugger to a running process, but only if that process executes GPU code. For more information, see [Attach to Running Processes](../debugger/attach-to-running-processes-with-the-visual-studio-debugger.md).

## <a name="run-current-tile-to-cursor-and-run-to-cursor"></a>Run Current Tile to Cursor and Run to Cursor

When you're debugging on the GPU, you have two options for running to the cursor location. The commands for both options are available on the shortcut menu of the code editor.

1. The **Run to Cursor** command runs your app until it reaches the cursor location, and then breaks. This doesn't imply that the current thread runs to the cursor; rather, the first thread that reaches the cursor point triggers the break. See [Navigating through Code with the Debugger](../debugger/navigating-through-code-with-the-debugger.md).

2. The **Run Current Tile to Cursor** command runs your app until all the threads in the current tile reach the cursor, and then breaks.
## <a name="debugging-windows"></a>Debugging Windows

By using certain debugging windows, you can examine, flag, and freeze GPU threads. For more information, see:

- [Using the Parallel Stacks Window](../debugger/using-the-parallel-stacks-window.md)
- [Using the Tasks Window](../debugger/using-the-tasks-window.md)
- [How to: Use the Parallel Watch Window](../debugger/how-to-use-the-parallel-watch-window.md)
- [Debug Threads and Processes](../debugger/debug-threads-and-processes.md) (Debug Location toolbar)
- [How to: Use the GPU Threads Window](../debugger/how-to-use-the-gpu-threads-window.md)

## <a name="data-synchronization-exceptions"></a>Data Synchronization Exceptions

The debugger can identify several data synchronization conditions during execution. When a condition is detected, the debugger enters a break state. You have two options: **Break** or **Continue**. By using the **Exceptions** dialog box, you can configure whether the debugger detects these conditions and which conditions to break for. For more information, see [Managing Exceptions with the Debugger](../debugger/managing-exceptions-with-the-debugger.md). You can also use the **Options** dialog box to specify that the debugger must ignore exceptions if the written data doesn't change the value of the data. For more information, see [General, Debugging, Options Dialog Box](../debugger/general-debugging-options-dialog-box.md).
Kodda bir hızlandırıcı belirtmezseniz, ref hızlandırıcısı proje özelliklerinde Hata Ayıklama **Hızlandırıcısı Türü olarak otomatik** olarak seçilir. Kodunuz bir hızlandırıcıyı açıkça seçerse, HATA AYıKLAMA sırasında REF hızlandırıcısı kullanılmaz ve GPU donanımınız hata ayıklama desteğine sahip olmadığı sürece kesme noktalarına isabet olmaz. Hata ayıklama sırasında REF hızlandırıcısını kullanması için kodunuzu yazarak bu sorunu giderebilirsiniz. Daha fazla bilgi için bkz. Proje özellikleri ve [C++](../debugger/project-settings-for-a-cpp-debug-configuration.md) [Hata Ayıklama Yapılandırması için accelerator_view](/cpp/parallel/amp/using-accelerator-and-accelerator-view-objects) ve nesneleri Project Ayarlar kullanma. ### <a name="conditional-breakpoints"></a>Koşullu Kesme Noktaları GPU kodundaki koşullu kesme noktaları desteklene, ancak cihazda her ifade değerlendirilene değildir. Bir ifade cihazda değerlendirilenene kadar hata ayıklayıcıda değerlendirilir. Hata ayıklayıcısı cihazdan daha yavaş çalışıyor olabilir. ### <a name="error-there-is-a-configuration-issue-with-the-selected-debugging-accelerator-type"></a>Hata: Seçilen Hata Ayıklama Hızlandırıcısı Türü ile ilgili bir yapılandırma sorunu var. Bu hata, proje ayarları ile hata ayıkladı bilgisayarınızda yapılandırma arasında bir tutarsızlık olduğunda gerçekleşir. Daha fazla bilgi için [bkz. Project Ayarlar C++ Hata Ayıklama Yapılandırması](../debugger/project-settings-for-a-cpp-debug-configuration.md)için bkz. . ### <a name="error-the-debug-driver-for-the-selected-debugging-accelerator-type-is-not-installed-on-the-target-machine"></a>Hata: Seçilen Hata Ayıklama Hızlandırıcısı Türü için hata ayıklama sürücüsü hedef makineye yüklenmedi. Bu hata, uzak bir bilgisayarda hata ayıklarken gerçekleşir. Hata ayıklayıcı, sürücülerin uzak bilgisayarda yüklü olup olmadığını çalışma süresine kadar belirleyamaz. Sürücüler, grafik kartının üreticisinden kullanılabilir. 
### <a name="error-timeout-detection-and-recovery-tdr-must-be-disabled-at-the-remote-site"></a>Error: Timeout Detection and Recovery (TDR) must be disabled at the remote site.

It's possible for C++ AMP computations to exceed the time interval set by the Windows timeout detection and recovery (TDR) process. When that happens, the computation is canceled and the data is lost. For more information, see [Handling TDRs in C++ AMP](/archive/blogs/nativeconcurrency/handling-tdrs-in-c-amp).

## <a name="see-also"></a>See also

- [Walkthrough: Debugging a C++ AMP Application](/cpp/parallel/amp/walkthrough-debugging-a-cpp-amp-application)
- [Project Settings for a C++ Debug Configuration](../debugger/project-settings-for-a-cpp-debug-configuration.md)
- [Start GPU Debugging in Visual Studio](/archive/blogs/nativeconcurrency/start-gpu-debugging-in-visual-studio-2012)
# @Transactional

[![Build Status: Linux](https://travis-ci.org/muyu66/transactional.svg?branch=master)](https://travis-ci.org/muyu66/transactional)
[![Coverage Status](https://coveralls.io/repos/github/muyu66/transactional/badge.svg?branch=master)](https://coveralls.io/github/muyu66/transactional?branch=master)
[![TypeScript](https://badges.frapsoft.com/typescript/code/typescript.svg?v=101)](https://github.com/ellerbrock/typescript-badges/)
[![Chat](https://badges.gitter.im/stockmarketjsserver.svg)](https://gitter.im/zhouyu_66/TRANSACTIONAL)
[![MIT Licence](https://badges.frapsoft.com/os/mit/mit.svg?v=103)](https://opensource.org/licenses/mit-license.php)

**@Transactional** is a package built to solve the problem that, in Node.js code, **transaction propagation** can only rely on **passing parameters between functions**. Now, we use a decorator (annotation) to propagate transactions instead.

### Get started:

``` typescript
import { Transactional } from 'transactionaljs'

class UserService {
	@Transactional()
	public async createUser(){}
}
```

### Run the tests:

``` bash
npm install
npm run test
```

### Features:

- **Non-invasive**: place @Transactional() on any function to immediately and automatically get powerful transaction support;
- **Multiple propagation modes**: REQUIRED, SUPPORTS, REQUIRES_NEW, NEVER, NOT_SUPPORTED (planned);
- **Multiple data sources**: any database that supports creating, committing, and rolling back transactions can be used, such as MySQL, MongoDB, and so on;
- **ORM friendly**: just inject a few transaction methods into TransactionalPlatform to hand over transaction management; the Sequelize v4 injection takes only 9 lines of code;
- **Well tested**: current unit test coverage is 66.67%, and our goal is to keep it at 100%;
- **TypeScript**: written entirely in TypeScript;
- **One-command testing**: `npm run test` — download the source code, no external dependencies needed, just npm and Node.js;
- **Free and open source**: PRs are welcome; MIT licensed forever;

### Roadmap:

- **Distributed transactions**: not finalized yet; probably based on 2PC;
- **More data sources**: support MongoDB, MySQL, and Redis transaction chains;
- **More tests**: more unit tests;
- **More mode support**: NOT_SUPPORTED and other modes, plus iteration on the existing modes;

### Propagation types

| Type | Description |
| :-------- | --------: |
| REQUIRED | Supports the current transaction if one exists; starts a new one if there is none |
| SUPPORTS | Supports the current transaction if one exists; executes non-transactionally if there is none |
| REQUIRES_NEW | Always starts a new transaction; if a transaction already exists, it is suspended |
| NOT_SUPPORTED | Always executes non-transactionally, suspending any existing transaction |
| NEVER | Always executes non-transactionally; throws an exception if an active transaction exists |
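The propagation table above can be sketched as a small resolution function. This is an illustrative sketch only, not the library's actual API — the `resolve` helper and its return values are hypothetical names chosen for the example:

```typescript
type Propagation =
  | "REQUIRED"
  | "SUPPORTS"
  | "REQUIRES_NEW"
  | "NOT_SUPPORTED"
  | "NEVER";

// Hypothetical helper: decide what to do with the current transaction
// context for a given propagation mode.
function resolve(mode: Propagation, hasActiveTx: boolean): "join" | "new" | "none" {
  switch (mode) {
    case "REQUIRED":
      return hasActiveTx ? "join" : "new";
    case "SUPPORTS":
      return hasActiveTx ? "join" : "none";
    case "REQUIRES_NEW":
      return "new"; // an existing transaction would be suspended
    case "NOT_SUPPORTED":
      return "none"; // an existing transaction would be suspended
    case "NEVER":
      if (hasActiveTx) throw new Error("active transaction not allowed");
      return "none";
  }
}

console.log(resolve("REQUIRED", false)); // "new"
console.log(resolve("SUPPORTS", true));  // "join"
```

The decorator's job is then to carry the resolved transaction through the call chain (for example via async context) instead of threading it through function parameters.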
---
UID: NE:portabledevice.tagWPD_META_GENRES
title: WPD_META_GENRES (portabledevice.h)
description: The WPD_META_GENRES enumeration type describes a broad genre type of a media file.
old-location: wpddk\wpd_meta_genres.htm
tech.root: wpd_dk
ms.assetid: e37156ae-bc6a-4b68-ad79-34649ab3b38d
ms.date: 02/15/2018
ms.keywords: WPD_META_GENRES, WPD_META_GENRES enumeration, WPD_META_GENRE_AUDIO_PODCAST, WPD_META_GENRE_FEATURE_FILM_VIDEO_FILE, WPD_META_GENRE_GENERIC_MUSIC_AUDIO_FILE, WPD_META_GENRE_GENERIC_NON_AUDIO_NON_VIDEO, WPD_META_GENRE_GENERIC_NON_MUSIC_AUDIO_FILE, WPD_META_GENRE_GENERIC_VIDEO_FILE, WPD_META_GENRE_HOME_VIDEO_FILE, WPD_META_GENRE_MIXED_PODCAST, WPD_META_GENRE_MUSIC_VIDEO_FILE, WPD_META_GENRE_NEWS_VIDEO_FILE, WPD_META_GENRE_PHOTO_MONTAGE_VIDEO_FILE, WPD_META_GENRE_SPOKEN_WORD_AUDIO_BOOK_FILES, WPD_META_GENRE_SPOKEN_WORD_FILES_NON_AUDIO_BOOK, WPD_META_GENRE_SPOKEN_WORD_NEWS, WPD_META_GENRE_SPOKEN_WORD_TALK_SHOWS, WPD_META_GENRE_TELEVISION_VIDEO_FILE, WPD_META_GENRE_TRAINING_EDUCATIONAL_VIDEO_FILE, WPD_META_GENRE_UNUSED, WPD_META_GENRE_VIDEO_PODCAST, enumeration, portabledevice/WPD_META_GENRES, portabledevice/WPD_META_GENRE_AUDIO_PODCAST, portabledevice/WPD_META_GENRE_FEATURE_FILM_VIDEO_FILE, portabledevice/WPD_META_GENRE_GENERIC_MUSIC_AUDIO_FILE, portabledevice/WPD_META_GENRE_GENERIC_NON_AUDIO_NON_VIDEO, portabledevice/WPD_META_GENRE_GENERIC_NON_MUSIC_AUDIO_FILE, portabledevice/WPD_META_GENRE_GENERIC_VIDEO_FILE, portabledevice/WPD_META_GENRE_HOME_VIDEO_FILE, portabledevice/WPD_META_GENRE_MIXED_PODCAST, portabledevice/WPD_META_GENRE_MUSIC_VIDEO_FILE, portabledevice/WPD_META_GENRE_NEWS_VIDEO_FILE, portabledevice/WPD_META_GENRE_PHOTO_MONTAGE_VIDEO_FILE, portabledevice/WPD_META_GENRE_SPOKEN_WORD_AUDIO_BOOK_FILES, portabledevice/WPD_META_GENRE_SPOKEN_WORD_FILES_NON_AUDIO_BOOK, portabledevice/WPD_META_GENRE_SPOKEN_WORD_NEWS, portabledevice/WPD_META_GENRE_SPOKEN_WORD_TALK_SHOWS, portabledevice/WPD_META_GENRE_TELEVISION_VIDEO_FILE, portabledevice/WPD_META_GENRE_TRAINING_EDUCATIONAL_VIDEO_FILE, portabledevice/WPD_META_GENRE_UNUSED, portabledevice/WPD_META_GENRE_VIDEO_PODCAST, tagWPD_META_GENRES, wpddk.wpd_meta_genres
ms.topic: enum
req.header: portabledevice.h
req.include-header:
req.target-type: Windows
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver:
req.umdf-ver:
req.ddi-compliance:
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib:
req.dll:
req.irql:
topic_type:
- APIRef
- kbSyntax
api_type:
- HeaderDef
api_location:
- PortableDevice.h
api_name:
- WPD_META_GENRES
product:
- Windows
targetos: Windows
req.typenames: WPD_META_GENRES
ms.custom: RS5
---

# tagWPD_META_GENRES enumeration

## -description

The <b>WPD_META_GENRES</b> enumeration type describes a broad genre type of a media file.

## -enum-fields

### -field WPD_META_GENRE_UNUSED

The genre has not been set, or is not applicable.

### -field WPD_META_GENRE_GENERIC_MUSIC_AUDIO_FILE

This is a generic music file (audio only).

### -field WPD_META_GENRE_GENERIC_NON_MUSIC_AUDIO_FILE

This is a generic non-music audio file, for example, a speech or audio book.

### -field WPD_META_GENRE_SPOKEN_WORD_AUDIO_BOOK_FILES

This is an audio book file.

### -field WPD_META_GENRE_SPOKEN_WORD_FILES_NON_AUDIO_BOOK

This is a spoken word audio file that is not an audio book, for example, an interview or speech.

### -field WPD_META_GENRE_SPOKEN_WORD_NEWS

This is a news audio or video file.

### -field WPD_META_GENRE_SPOKEN_WORD_TALK_SHOWS

This is an audio recording of a talk show.

### -field WPD_META_GENRE_GENERIC_VIDEO_FILE

This is a generic video file.

### -field WPD_META_GENRE_NEWS_VIDEO_FILE

This is a news video file.

### -field WPD_META_GENRE_MUSIC_VIDEO_FILE

This is a music video file.

### -field WPD_META_GENRE_HOME_VIDEO_FILE

This is a home video file.

### -field WPD_META_GENRE_FEATURE_FILM_VIDEO_FILE

This is a feature film video file.

### -field WPD_META_GENRE_TELEVISION_VIDEO_FILE

This is a television program video file.

### -field WPD_META_GENRE_TRAINING_EDUCATIONAL_VIDEO_FILE

This is an educational video file.

### -field WPD_META_GENRE_PHOTO_MONTAGE_VIDEO_FILE

This is a video file featuring a photo montage.

### -field WPD_META_GENRE_GENERIC_NON_AUDIO_NON_VIDEO

This is a file without audio or video.

### -field WPD_META_GENRE_AUDIO_PODCAST

This is an audio podcast.

### -field WPD_META_GENRE_VIDEO_PODCAST

This is a video podcast.

### -field WPD_META_GENRE_MIXED_PODCAST

This is a podcast containing both audio and video.

## -remarks

This enumeration is used by the <a href="https://docs.microsoft.com/windows/desktop/wpd_sdk/media-properties">WPD_MEDIA_META_GENRE</a> property.

## -see-also

<a href="https://msdn.microsoft.com/library/windows/hardware/ff597672">Structures and Enumeration Types</a>
---
title: ferror
ms.date: 11/04/2016
apiname:
- ferror
apilocation:
- msvcrt.dll
- msvcr80.dll
- msvcr90.dll
- msvcr100.dll
- msvcr100_clr0400.dll
- msvcr110.dll
- msvcr110_clr0400.dll
- msvcr120.dll
- msvcr120_clr0400.dll
- ucrtbase.dll
- api-ms-win-crt-stdio-l1-1-0.dll
apitype: DLLExport
f1_keywords:
- ferror
helpviewer_keywords:
- ferror function
- streams, testing for errors
- errors [C++], testing for stream
ms.assetid: 528a34bc-f2aa-4c3f-b89a-5b148e6864f7
ms.openlocfilehash: 2be90ffe8a135b4108abd9504099bd2f6c28f249
ms.sourcegitcommit: 6052185696adca270bc9bdbec45a626dd89cdcdd
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 10/31/2018
ms.locfileid: "50587888"
---
# <a name="ferror"></a>ferror

Tests for an error on a stream.

## <a name="syntax"></a>Syntax

```C
int ferror(
   FILE *stream
);
```

### <a name="parameters"></a>Parameters

*stream*<br/>
Pointer to the **FILE** structure.

## <a name="return-value"></a>Return Value

If no error has occurred on *stream*, **ferror** returns 0. Otherwise, it returns a nonzero value. If the stream is **NULL**, **ferror** invokes the invalid parameter handler, as described in [Parameter Validation](../../c-runtime-library/parameter-validation.md). If execution is allowed to continue, this function sets **errno** to **EINVAL** and returns 0.

For more information about these and other error codes, see [_doserrno, errno, _sys_errlist, and _sys_nerr](../../c-runtime-library/errno-doserrno-sys-errlist-and-sys-nerr.md).

## <a name="remarks"></a>Remarks

The **ferror** routine (implemented both as a function and as a macro) tests for a reading or writing error on the file associated with *stream*. If an error has occurred, the error indicator for the stream remains set until the stream is closed or rewound, or until **clearerr** is called against it.
## <a name="requirements"></a>Requirements

|Function|Required header|
|--------------|---------------------|
|**ferror**|\<stdio.h>|

For additional compatibility information, see [Compatibility](../../c-runtime-library/compatibility.md).

## <a name="example"></a>Example

See the example for [feof](feof.md).

## <a name="see-also"></a>See also

[Error Handling](../../c-runtime-library/error-handling-crt.md)<br/>
[Stream I/O](../../c-runtime-library/stream-i-o.md)<br/>
[clearerr](clearerr.md)<br/>
[_eof](eof.md)<br/>
[feof](feof.md)<br/>
[fopen, _wfopen](fopen-wfopen.md)<br/>
[perror, _wperror](perror-wperror.md)<br/>
32.901235
450
0.734334
deu_Latn
0.854075
edeb6562437e9d3515cbcf1cd056d22cee0f1f64
3,343
md
Markdown
content/page/legal-mentions.md
MaryameBennaniPro/maryame-bennani-site-static
45f08d5f3dc5b297e4251423203025bcbfc50c5b
[ "MIT" ]
null
null
null
content/page/legal-mentions.md
MaryameBennaniPro/maryame-bennani-site-static
45f08d5f3dc5b297e4251423203025bcbfc50c5b
[ "MIT" ]
null
null
null
content/page/legal-mentions.md
MaryameBennaniPro/maryame-bennani-site-static
45f08d5f3dc5b297e4251423203025bcbfc50c5b
[ "MIT" ]
null
null
null
---
title: "Legal notice"
date: 2021-05-01T17:35:43+02:00
draft: false
description: "Legal notice"
---

# Legal notice

## Acceptance of terms

By using MARYAMEBENNANI.COM, the user acknowledges and accepts the conditions listed below.

## Owner / Publisher

MARYAMEBENNANI.COM is a website published by Maryame Bennani under the editorial direction of Maryame Bennani. The entire MARYAMEBENNANI.COM site, pages and content, is the property of Maryame Bennani.

Administrative contact for MARYAMEBENNANI.COM: **contact@maryamebennani.com**

## Hosting

MARYAMEBENNANI.COM declines all responsibility for any interruptions of the MARYAMEBENNANI.COM site and its services. The site is hosted by GITHUB PAGES, 88 Colin P Kelly Junior Street, San Francisco, CA 94107, United States.

## Abuse

Pursuant to French law no. 2004-575 of June 21, 2004 on confidence in the digital economy, to report disputed content, or if you are the victim of fraudulent use of the MARYAMEBENNANI.COM site, please contact the site administrator at: **contact@maryamebennani.com**

## Collection and processing of personal data

The information collected by MARYAMEBENNANI.COM is never shared with third parties without the explicit consent of the persons concerned. Except in specific cases stated on the collection form, this information comes from personal data voluntarily submitted by the user, with their explicit consent to its use and processing.

Pursuant to French law no. 78-17 of January 6, 1978 on information technology, files and civil liberties, users have the right to access, rectify and delete the personal information about them stored by MARYAMEBENNANI.COM. This right may be exercised at any time with the data controller, Maryame Bennani, at the following address: **laure@MARYAMEBENNANI.COM**.

## Cookies

The MARYAMEBENNANI.COM site itself does not use any cookies; it is cookie-free :)

However, MARYAMEBENNANI.COM uses CLOUDFLARE to distribute page content between the host and the user. For technical purposes, this service inserts a cookie that allows it to route the user's connection correctly through its infrastructure during the visit.

## Design / Development

The entire MARYAMEBENNANI.COM site (design, visual identity and applications) was created, developed and is administered by Maryame Bennani.

## Content and reproduction rights

Pursuant to articles L. 111-1 and L. 123-1 of the French Intellectual Property Code, all content on this site (text, images, videos and media in general), unless explicitly stated otherwise, is protected by copyright. Reproduction, even partial, of the content of this site's pages without the prior consent of Maryame Bennani is strictly prohibited (short quotations are permitted under French law for commentary and criticism, provided they strictly accompany such commentary and that the original author and a link to the source page are indicated).

<!-- MARYAMEBENNANI est immatriculée au RCS du Havre Siret : xxxx xxx xxx.-->

Copyright @ 2021 | Maryame Bennani
36.336957
84
0.797786
fra_Latn
0.973217
edebf11177bb1186fbdef7548aa2a1309212e662
304
md
Markdown
src/posts/second-blogpost.md
Donatron/gatsby-wp
9367a047cf39ab9f1c965510ac483027273e80cb
[ "MIT" ]
null
null
null
src/posts/second-blogpost.md
Donatron/gatsby-wp
9367a047cf39ab9f1c965510ac483027273e80cb
[ "MIT" ]
null
null
null
src/posts/second-blogpost.md
Donatron/gatsby-wp
9367a047cf39ab9f1c965510ac483027273e80cb
[ "MIT" ]
null
null
null
---
title: "This is our second post"
date: "2020-03-29"
keywords: "firstKeyword, secondKeyword"
image: "https://i1.wp.com/blog.alexdevero.com/wp-content/uploads/2018/11/how-to-build-simple-website-with-gatsbyjs-postcss-pt1.jpg?resize=768%2C476&ssl=1"
---

Change the text to something else, perhaps this.
33.777778
154
0.756579
eng_Latn
0.417417
edec3cb6821c0433131aa06700cb63c9ab24b439
666
md
Markdown
docs/assembler/masm/operator-dup.md
taketakeyyy/cpp-docs.ja-jp
4cc268ee390a7f6a2a8afd848f65876acc4450a9
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/assembler/masm/operator-dup.md
taketakeyyy/cpp-docs.ja-jp
4cc268ee390a7f6a2a8afd848f65876acc4450a9
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/assembler/masm/operator-dup.md
taketakeyyy/cpp-docs.ja-jp
4cc268ee390a7f6a2a8afd848f65876acc4450a9
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: operator DUP
ms.date: 12/17/2019
f1_keywords:
- operator DUP
helpviewer_keywords:
- operator DUP
- DUP operator
ms.assetid: ed1e91ea-91ed-43c0-9315-7e532df65a28
ms.openlocfilehash: 0cdc0f16d5318d4e9688a579d2b535f25eced3cc
ms.sourcegitcommit: 0781c69b22797c41630601a176b9ea541be4f2a3
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 12/20/2019
ms.locfileid: "75316669"
---
# <a name="operator-dup"></a>operator DUP

Specifies *count* declarations of *initialvalue*.

## <a name="syntax"></a>Syntax

> *count* **DUP** __(__ *initialvalue* ⟦ __,__ *initialvalue* ... ⟧ __)__

## <a name="see-also"></a>See also

[Operators reference](operators-reference.md)\
[MASM BNF Grammar](masm-bnf-grammar.md)
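In practice, `DUP` most often appears in `.data` declarations to repeat an initializer, and the *initialvalue* may itself contain a nested `DUP`. A brief illustrative sketch (the label names are hypothetical, not part of the reference page):

```MASM
.data
buffer  BYTE  100 DUP(0)          ; 100 bytes, each initialized to 0
grid    DWORD 4 DUP(2 DUP(?))     ; nested DUP: 4 rows of 2 uninitialized DWORDs
pattern BYTE  3 DUP("ab")         ; repeats the string, giving "ababab"
```

The `?` initializer marks the storage as uninitialized, as elsewhere in MASM data declarations.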
22.965517
72
0.744745
kor_Hang
0.150545
eded76f36bd42e457f7629feed65a3a85b117486
6,415
md
Markdown
articles/stream-analytics/vs-code-intellisense.md
matmahnke/azure-docs.pt-br
6c96d25caf8663547775f333164198e3ed03972f
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/stream-analytics/vs-code-intellisense.md
matmahnke/azure-docs.pt-br
6c96d25caf8663547775f333164198e3ed03972f
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/stream-analytics/vs-code-intellisense.md
matmahnke/azure-docs.pt-br
6c96d25caf8663547775f333164198e3ed03972f
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: IntelliSense in Azure Stream Analytics Tools for Visual Studio Code
description: This article describes how to use IntelliSense features in Azure Stream Analytics Tools for Visual Studio Code.
ms.service: stream-analytics
author: su-jie
ms.author: sujie
ms.date: 4/11/2020
ms.topic: how-to
ms.openlocfilehash: 756604b71efd1715ae3b4ca3d5eebf0fdfa41e34
ms.sourcegitcommit: 857859267e0820d0c555f5438dc415fc861d9a6b
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 10/30/2020
ms.locfileid: "93129789"
---
# <a name="intellisense-in-azure-stream-analytics-tools-for-visual-studio-code"></a>IntelliSense in Azure Stream Analytics Tools for Visual Studio Code

IntelliSense is available for the [Stream Analytics query language](/stream-analytics-query/stream-analytics-query-language-reference?bc=https%253a%252f%252fdocs.microsoft.com%252fazure%252fbread%252ftoc.json&toc=https%253a%252f%252fdocs.microsoft.com%252fazure%252fstream-analytics%252ftoc.json) in [Azure Stream Analytics Tools for VS Code](https://marketplace.visualstudio.com/items?itemName=ms-bigdatatools.vscode-asa&ssr=false#overview). IntelliSense is a code-completion aid that includes a number of features: List Members, Parameter Info, Quick Info, and Complete Word. IntelliSense features are sometimes called by other names, such as "code completion", "content assist", and "code hinting".

![IntelliSense demo](./media/vs-code-intellisense/intellisense.gif)

## <a name="intellisense-features"></a>IntelliSense features

IntelliSense features in Stream Analytics Tools for VS Code are powered by a language service. A language service analyzes your source code and provides intelligent code completions based on language semantics. If a language service knows possible completions, IntelliSense suggestions pop up as you type.

If you continue typing, a list of members (such as variables and methods) is filtered to include only members that contain your typed characters. When you press the `Tab` or `Enter` key, IntelliSense inserts the member you selected. You can trigger IntelliSense in any editor window by typing a trigger character, such as the dot character `.`.

![IntelliSense auto-completion](./media/vs-code-intellisense/auto-completion.gif)

> [!TIP]
> The suggestions widget supports CamelCase filtering. You can type the letters that are upper cased in a method name to limit the suggestions. For example, "CRA" quickly brings up "createapplication".

### <a name="types-of-completions"></a>Types of completions

Stream Analytics Tools for VS Code IntelliSense offers different types of completions, including language-server suggestions, snippets, and simple word-based textual completions.

|Completion | Type |
| ----- | ------- |
| Keywords | `keyword` |
| Functions | `build-in function`, `user defined function` |
| Dataset name | `input`, `output`, `intermediate result set` |
| Dataset column name | `input`, `intermediate result set` |

#### <a name="name-completion"></a>Name completion

In addition to automatic keyword completion, Stream Analytics Tools for VS Code reads the list of job input and output names, as well as the column names in your data sources, once they are configured. The extension remembers this information to provide name-completion features that are useful for entering statements with few keystrokes: while coding, you don't need to leave the editor to look up job input names, output names, and column names.

You can keep your context, find the information you need, insert elements directly into your code, and let IntelliSense complete your typing for you. Note that you need to configure a local input or a live input and save the configuration file before you can use name completion.

![name completion](./media/vs-code-intellisense/name-completion.gif)

### <a name="parameter-info"></a>Parameter info

The IntelliSense **Parameter Info** option opens a parameter list that gives you information about the number, names, and types of the parameters required by a function. The parameter in bold indicates the next parameter that is required as you type the function.

The parameter list is also displayed for nested functions. If you type a function as a parameter to another function, the parameter list displays the inner function's parameters. Then, when the inner function's parameter list is complete, the list reverts to displaying the outer function's parameters.

![parameter info](./media/vs-code-intellisense/parameter-info.gif)

### <a name="quick-info"></a>Quick info

As provided by the language service, you can see **Quick Info** for every identifier in your code. Examples of identifiers are an input, an output, an intermediate result set, or a function. When you move the mouse pointer over an identifier, its declaration is displayed in a pop-up window. The data properties and schemas for inputs, if configured, and for intermediate result sets are shown.

![quick info](./media/vs-code-intellisense/quick-info.gif)

## <a name="troubleshoot-intellisense"></a>Troubleshoot IntelliSense

This issue is caused by a missing input configuration that provides the data. You can check whether a [local input](visual-studio-code-local-run.md#define-a-local-input) or a [live input](visual-studio-code-local-run-live-input.md#define-a-live-stream-input) has been configured correctly.

## <a name="next-steps"></a>Next steps

* [Quickstart: Create an Azure Stream Analytics job in Visual Studio Code](quick-create-visual-studio-code.md)
* [Test Stream Analytics queries locally with sample data by using Visual Studio Code](visual-studio-code-local-run.md)
* [Test Stream Analytics queries locally against live stream input by using Visual Studio Code](visual-studio-code-local-run-live-input.md)
84.407895
773
0.799376
por_Latn
0.999185
edeede776a3901abf2ac9e0d496b99b81d29ead9
1,204
md
Markdown
README.md
Lambda-School-Labs/Labs25-SaverLife-TeamC-fe
1d4637fd38500e4879f70048c6a65933ac96eece
[ "MIT" ]
1
2020-08-04T20:15:59.000Z
2020-08-04T20:15:59.000Z
README.md
Lambda-School-Labs/Labs25-SaverLife-TeamC-fe
1d4637fd38500e4879f70048c6a65933ac96eece
[ "MIT" ]
20
2020-08-11T20:20:30.000Z
2020-09-23T00:03:00.000Z
README.md
Lambda-School-Labs/Labs25-SaverLife-TeamC-fe
1d4637fd38500e4879f70048c6a65933ac96eece
[ "MIT" ]
1
2020-08-05T22:48:44.000Z
2020-08-05T22:48:44.000Z
# SaverLife

## Description 👇

> **Disclaimer:** This application is currently in Alpha (as of Aug 03, 2020) and is not ready for production. Please use at your own risk as things will change almost daily.

- An application that takes a user's past financial data and creates a predicted budget.
- SaverLife is a nonprofit on a mission: to inspire, inform, and reward the millions of Americans who need help saving money.
- This app will give working people the methods and motivation to take control of their financial future.

Here's a quick video presentation to get you started: https://youtu.be/VTn3yK6lOGk

## Resources 👇

## Deployed App

https://main.d241jcpwmggivh.amplifyapp.com/

### Environment variables

- `REACT_APP_CLIENT_ID` Okta client id
- `REACT_APP_OKTA_ISSUER_URI` Okta API authorization server issuer URI (eg. `https://name-438r8hr.okta.com/oauth2/default`)
- `REACT_APP_API_URI` The URL (localhost or live) for the backend that you're building

## Styling Our App

- `Ant Design` - https://ant.design/
- `Styled Components` - https://styled-components.com/

## Data Visualization

`Plotly` - https://plotly.com/javascript/

## Testing Our App

`Jest` - https://jestjs.io/
28.666667
174
0.748339
eng_Latn
0.908599