hexsha stringlengths 40 40 | size int64 5 1.04M | ext stringclasses 6 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 3 344 | max_stars_repo_name stringlengths 5 125 | max_stars_repo_head_hexsha stringlengths 40 78 | max_stars_repo_licenses listlengths 1 11 | max_stars_count int64 1 368k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 3 344 | max_issues_repo_name stringlengths 5 125 | max_issues_repo_head_hexsha stringlengths 40 78 | max_issues_repo_licenses listlengths 1 11 | max_issues_count int64 1 116k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 3 344 | max_forks_repo_name stringlengths 5 125 | max_forks_repo_head_hexsha stringlengths 40 78 | max_forks_repo_licenses listlengths 1 11 | max_forks_count int64 1 105k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 5 1.04M | avg_line_length float64 1.14 851k | max_line_length int64 1 1.03M | alphanum_fraction float64 0 1 | lid stringclasses 191 values | lid_prob float64 0.01 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
08e5f6734670833680ae0ffc2442df621486f9c0 | 5,024 | md | Markdown | posts/mithacal-milers-week-10.md | scottpdawson/skirtrunner | 8e5e7ef08f617004ee6f9171114c1f853c5f86a5 | [
"MIT"
] | null | null | null | posts/mithacal-milers-week-10.md | scottpdawson/skirtrunner | 8e5e7ef08f617004ee6f9171114c1f853c5f86a5 | [
"MIT"
] | null | null | null | posts/mithacal-milers-week-10.md | scottpdawson/skirtrunner | 8e5e7ef08f617004ee6f9171114c1f853c5f86a5 | [
"MIT"
] | null | null | null | ---
title: "MITHACAL Milers Week 10"
date: "2020-03-04"
permalink: "mithacal-milers/mithacal-milers-week-10/"
hero: "/images/mithacal-milers.png"
navigation: "Training"
tags:
- mithacal milers
description: "So for our group that means 56 seconds for each 200 and then a 400 that isn't too slow! Scott and I talked about it on the way and we really should shoot for between 4-5 sets. Ideally 5 would be good, but realistically I knew we'd hit 4."
---
## The workout
{% picture "/images/2020/03/Screen-Shot-2020-03-04-at-6.12.01-AM.png", "" %}
So for our group that means 56 seconds for each 200 and then a 400 that isn't too slow! Scott and I talked about it on the way and we really should shoot for between 4-5 sets. Ideally 5 would be good, but realistically I knew we'd hit 4.
## Nearing the end
It is March so the program is starting to dwindle a little. As Adam noted last night, "it's starting to lose people." It makes complete sense. It is getting warmer, and next week the clocks change so it will still be light at 7 pm. I had listed this evening as a rest day in my plan and couldn't remember why. When Scott mentioned that "this was supposed to be a rest day" we realized why. I had thought it wrapped up by now.
So, our group last week was small and it looked at first like this week would be smaller. It didn't turn out to be that way though. Several people who were missing last week were back and our crew looked to be about 7 or 8 people. I say "looked to be" because we were not really together once we got going. Over the course of each 800 we spread out a fair amount, so literally we were on the start line together and then formed a long tail of runners.
## Timing disasters
One key person was missing this evening. Caroline has been our pacer for the last couple of weeks. Tonya has also had a proper watch and she's been a secondary pacer. Tonight with Caroline gone I assumed Tonya would take the lead until Tonya said she left her watch home. Elizabeth doesn't wear a watch and most others have either nothing or something like I do, an Apple watch or regular watch that doesn't really do what we need it to do. John has a proper watch but he pointed out he was going to be toward the back so he couldn't really pace the front.
Tonya and I talked about how to make the Apple watches work. She has the Strava app running in the background so she can have the timer running too. I have the native Apple app working and it cannot coexist with the timer. I killed my warm-up track and was going to start the timer, but ultimately we settled on Tonya timing because she's more used to doing it with her regular watch anyway. I wasn't too concerned since our pace is pretty zippy and if we can get even one good lap in we can typically feel the pace and hold it.
## The sets themselves
Our first lap of the first set was too slow by about 4-5 seconds so we picked it up as we went along. It felt increasingly challenging but it was good. We finished with a not too fast 400 and then started again.
It was a pretty uneventful workout in that we hit the paces, chatted a bit and generally enjoyed the evening. When we hit set 3 Tonya decided to drop to a 600 but said she'd time our last lap. I had been running just a step behind her so I was running solo for that last lap until Ken passed by me. I had a new rabbit in Ken and just worked to stay with him. I ended up being a couple of seconds fast for that lap which made the entire set come out to spot on.
On our last set Tonya dropped after 400 and it was Ken, Kim and me for the last two. Kim had been with a faster group but her form was failing so she dropped back just a bit to join us. It was nice to have her for our last couple of sets. I enjoyed this last set most of all only because I had to push myself a bit harder to finish. There's something so comfortingly easy about running with Tonya. It is familiar and I know we can push each other to stay the course. I haven't really run with Kim or Ken and as Ken passed by me again on lap 3, I realized that it is often our mindset that determines if we stay with someone or let them go. Without a proper watch I didn't know what Ken's pace was. I only knew that he was currently passing me and I haven't really run with him before. I thought for a brief second, "oh he's faster than you so you don't have to stay with him." I quickly replaced that thought with, "he is in your group so you should be running with him." As soon as I made that mental switch it felt a little like running with Tonya. I would stay with him and Kim because they were in my group. Ultimately they did get a bit ahead of me, but I could tell they were speeding up (it was the last lap and people often race it) and I was holding pace. This was confirmed when I finished and Tonya told them they were a bit ahead and I came in on pace. I like that I can feel the pace now and while a watch is nice, it is not a full disaster to not have one!
Until next week .... if it is still happening by then.
| 128.820513 | 1,472 | 0.765725 | eng_Latn | 0.999994 |
08e5fcb68e2bbeabba1ab7615be651188b654350 | 982 | md | Markdown | _posts/2014-10-15-jon_stewart_knows_best_in_a_fox_news_world_we_need_satire_more_than_meet_the_press.md | blakecrosby/sophiamcclennen.com | e825c14430a8446798da1d311ebbfffbeeedab61 | [
"MIT"
] | null | null | null | _posts/2014-10-15-jon_stewart_knows_best_in_a_fox_news_world_we_need_satire_more_than_meet_the_press.md | blakecrosby/sophiamcclennen.com | e825c14430a8446798da1d311ebbfffbeeedab61 | [
"MIT"
] | null | null | null | _posts/2014-10-15-jon_stewart_knows_best_in_a_fox_news_world_we_need_satire_more_than_meet_the_press.md | blakecrosby/sophiamcclennen.com | e825c14430a8446798da1d311ebbfffbeeedab61 | [
"MIT"
] | null | null | null | ---
title: 'Jon Stewart knows best: In a Fox News world, we need satire more than “Meet The Press”'
publication: "Salon"
link_to_original: "http://www.salon.com/2014/10/10/jon_stewart_knows_best_in_a_fox_news_world_we_need_satire_more_than_meet_the_press/"
featured: true
category: what-im-watching
---

We know that satire news is increasingly a source of “real” news. We know that Jon Stewart has been voted as the No. 4 most trusted journalist, tying with Brian Williams and Dan Rather (that poll was in 2008). We know that satirists like Stewart have an active and engaged viewership of mostly millennials. And we know that over the years Stewart has been able to attract an impressive lineup of politicians, journalists and other newsmakers to appear on his interview segment.
Read the rest at [_Salon_](http://www.salon.com/2014/10/10/jon_stewart_knows_best_in_a_fox_news_world_we_need_satire_more_than_meet_the_press/)
| 70.142857 | 477 | 0.809572 | eng_Latn | 0.991567 |
08e6cf905f6bb34325234a1e84246b2c25b999eb | 544 | md | Markdown | README.md | MasatoHanayama/pdf_gen | b16491a31cea0d1a4931e979d600870d6be07c3e | [
"MIT"
] | 1 | 2022-03-15T12:57:46.000Z | 2022-03-15T12:57:46.000Z | README.md | MasatoHanayama/pdf_gen | b16491a31cea0d1a4931e979d600870d6be07c3e | [
"MIT"
] | null | null | null | README.md | MasatoHanayama/pdf_gen | b16491a31cea0d1a4931e979d600870d6be07c3e | [
"MIT"
] | null | null | null | # pdf_gen.py
## Overview
Combines multiple image files into PDFs.
## Usage
Specify the parent directory of the image folders as a launch argument.
The program only collects files whose extension is .jpg, .jpeg, or .png.
```
parent_folder
├image_folder1
│├image1.jpg
│├image2.jpg
│└image3.jpg
├image_folder2
│├image1.jpeg
│├image2.jpeg
│├image3.jpeg
│└image4.jpeg
└image_folder3
 ├image1.png
 ├image2.png
 └image3.png
```
For the directory structure above, specify "parent_folder" at launch.
Launch example:
```
python ./pdf_gen.py parent_folder
```
After execution, PDF files are created in the specified folder.
```
parent_folder
├image_folder1
│├image1.jpg
│├image2.jpg
│└image3.jpg
├image_folder2
│├image1.jpeg
│├image2.jpeg
│├image3.jpeg
│└image4.jpeg
├image_folder3
│├image1.png
│├image2.png
│└image3.png
├image_folder1.pdf
├image_folder2.pdf
└image_folder3.pdf
``` | 11.333333 | 38 | 0.696691 | yue_Hant | 0.858398 |
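As an illustration of the per-folder grouping behavior described above, here is a hypothetical Python sketch (not the actual pdf_gen.py implementation) that collects each subfolder's .jpg/.jpeg/.png files and maps them to a per-folder output PDF path. Actually writing the PDFs would additionally require an imaging library such as Pillow:

```python
import os

IMAGE_EXTS = {".jpg", ".jpeg", ".png"}  # the extensions the tool is described to accept

def collect_images(parent_dir):
    """Map each planned output PDF path to the sorted image files of one subfolder."""
    groups = {}
    for entry in sorted(os.listdir(parent_dir)):
        sub = os.path.join(parent_dir, entry)
        if not os.path.isdir(sub):
            continue
        images = sorted(
            os.path.join(sub, f)
            for f in os.listdir(sub)
            if os.path.splitext(f)[1].lower() in IMAGE_EXTS
        )
        if images:
            # One PDF per subfolder, created next to the subfolder itself
            groups[os.path.join(parent_dir, entry + ".pdf")] = images
    return groups
```

Files with other extensions (for example a stray notes.txt) are simply skipped, matching the behavior described above.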
08e707a75f1119736420d07750dd218ba4a24f6c | 746 | md | Markdown | weibotop_content/2021/2021-09/2021-09-15/13:12.md | handn-work/spider | 8130508da1ba31df2d1430c0077ad08602417fd3 | [
"MIT"
] | null | null | null | weibotop_content/2021/2021-09/2021-09-15/13:12.md | handn-work/spider | 8130508da1ba31df2d1430c0077ad08602417fd3 | [
"MIT"
] | null | null | null | weibotop_content/2021/2021-09/2021-09-15/13:12.md | handn-work/spider | 8130508da1ba31df2d1430c0077ad08602417fd3 | [
"MIT"
] | 1 | 2022-01-24T09:20:56.000Z | 2022-01-24T09:20:56.000Z | 习近平总书记在陕西榆林考察调研
月薪8000贷款60万买私教课
苏炳添或将得到奥运奖牌
你可能每天在无效刷牙
爱在日常 不止中秋
不倒翁小姐姐被指不配传递火炬遭怒斥
老九门第二季10月开机
这只猫好像知道自己很贵
带着微博去宜兴
运动天才安宰贤
辅助生殖技术需求飙升
中方回应英议会禁止中国大使参加活动
加拿大爆发反对强制接种新冠疫苗抗议
三星嘲讽iPhone13
网红拉姆遇害一年后家人现状
龙劭华去世
全红婵是人形挂件吧
农民夫妻自费18万建广场供全村休闲
李冰冰陈坤芭莎合体封面
在酒吧办婚礼
广东小伙被丁真种草到理塘种萝卜
见过最阴间的走秀
塔利班瘾君子街头被鞭抽泼冷水
iPhone13粉色
全运会
塞尔维亚旅游失联女子已确认在波黑
留虾女孩收到2大盒虾
是收摊不是倒闭
阿里女员工案饭局照片曝光
iPhone13价格
莆田14周岁以下隔离儿童有家人陪同
大学室友的关系真的重要吗
全国健康码行程码实现一页通行
丁程鑫北电表本新生大合照
iPhone13值不值得买
全红婵招牌合影动作
iPhone12价格直降千元
十四届全运会
益禾堂回应某门店遍布蟑螂
苹果发布会
双标狗是真的狗
很爱你的人真能忍住很久不联系吗
上海人有多爱鲜肉月饼
余华是个脱口秀演员吧
紫燕百味鸡食品柜里老鼠乱窜
中学生要成为国之栋梁
紫燕百味鸡被停业整顿
机智的恋爱
闫妮30秒眼技杀
莆田低龄儿童隔离成防控难点
尿不湿吸水能力有多强
你被冒名办卡了么工信部喊你查
你能接受舍友在寝室养宠物吗
| 13.814815 | 19 | 0.785523 | yue_Hant | 0.787392 |
08e71138c522b7a0e660d9077996a42dc3653f91 | 1,053 | md | Markdown | data/blog/item_2096.md | VallyPepyako/wsd-jam | daa81b7f253bc585002f339e78dc1b42c537ce63 | [
"MIT"
] | null | null | null | data/blog/item_2096.md | VallyPepyako/wsd-jam | daa81b7f253bc585002f339e78dc1b42c537ce63 | [
"MIT"
] | null | null | null | data/blog/item_2096.md | VallyPepyako/wsd-jam | daa81b7f253bc585002f339e78dc1b42c537ce63 | [
"MIT"
] | null | null | null | ---
path: "/blog/page_2096"
title: "Part 2096"
---
…of course it's possible," he said.
Natasha inclined her head slightly and returned with quick steps to Mavra Kuzminishna, who was standing over the officer and talking to him with pitying sympathy.
"It's possible, he said it's possible!" Natasha said in a whisper.
The officer in the covered cart turned into the Rostovs' yard, and dozens of carts with wounded men, at the invitation of the town residents, began turning into the yards and driving up to the entrances of the houses on Povarskaya Street. Natasha evidently liked these relations with new people, outside the usual conditions of life. Together with Mavra Kuzminishna she tried to get as many of the wounded as possible turned in to their own yard.
"Still, we ought to report it to your papa," said Mavra Kuzminishna.
"Never mind, never mind, what does it matter! For one day we can move into the drawing room. We can give them our whole half of the house."
"Well, young lady, the things you think up! Even for the wing, the bachelor quarters, or nurse's room, we would still have to ask."
"Well, I'll ask."
Natasha ran into the house and tiptoed through the half-open door of the sitting room, where it smelled of vinegar and Hoffmann's drops.
—
| 65.8125 | 363 | 0.780627 | rus_Cyrl | 0.994103 |
08e7797d58576d863b794af25939dafcc46f876f | 2,746 | md | Markdown | docs/source/guide/setup.md | kathoum/label-studio | 52361bb695ae6856697558d3584a3ed62f96ec70 | [
"Apache-2.0"
] | null | null | null | docs/source/guide/setup.md | kathoum/label-studio | 52361bb695ae6856697558d3584a3ed62f96ec70 | [
"Apache-2.0"
] | 3 | 2020-11-02T11:53:17.000Z | 2020-11-13T14:36:40.000Z | docs/source/guide/setup.md | kathoum/label-studio | 52361bb695ae6856697558d3584a3ed62f96ec70 | [
"Apache-2.0"
] | 2 | 2021-03-29T13:39:37.000Z | 2021-03-29T13:43:39.000Z | ---
title: Project setup
type: guide
order: 101
---
**Project** is a directory where all annotation assets are located. It is a self-contained entity: when you start Label Studio for the first time, e.g. `label-studio start ./my_project --init`,
it creates the directory `./my_project` relative to where it is launched.
If you want to start another project, just remove the `./my_project` directory, or create a new one by running `label-studio start /path/to/new/project --init`.
## Structure
**Project directory** is structured as follows:
```bash
├── my_project
│ ├── config.json // project settings
│ ├── tasks.json // all imported tasks in a dict like {task_id: task}
│ ├── config.xml // current project labeling config
│ ├── completions // directory with one completion per task_id stored in one file
│ │ ├── <task_id>.json
│   ├── export // stores archives with all results exported from web UI
│ │ ├── 2020-03-06-15-23-47.zip
```
> Warning: It is not recommended to modify any of the internal project files. For importing tasks, exporting completions, or changing the label config, please use the web UI or command-line arguments (see `label-studio start --help` for details)
## Labeling config
Project labeling config is an XML file that consists of:
- **object tags** specifying input data sources from imported tasks,
- **control tags** for configuring the labeling schema (what the annotation result looks like),
- **visual tags** applying different user interface styles.
<a class="button" href="/tags">Check Available Tags</a>
#### Example
Here is an example config for classifying images provided by the `image_url` key into two classes:
```html
<View>
<Image name="image_object" value="$image_url"/>
<Choices name="image_classes" toName="image_object">
<Choice value="Cat"/>
<Choice value="Dog"/>
</Choices>
</View>
```
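Because the labeling config is plain XML, it can be inspected with standard tooling. The following is an illustrative Python sketch (a hypothetical helper, not part of Label Studio) that parses the example config above with the standard library and lists its object and control tags:

```python
import xml.etree.ElementTree as ET

CONFIG = """
<View>
  <Image name="image_object" value="$image_url"/>
  <Choices name="image_classes" toName="image_object">
    <Choice value="Cat"/>
    <Choice value="Dog"/>
  </Choices>
</View>
"""

def summarize_config(xml_text):
    """Return (object tag names, control tag name -> choice values) for a config."""
    root = ET.fromstring(xml_text)
    objects = [el.get("name") for el in root.iter("Image")]
    controls = {
        el.get("name"): [c.get("value") for c in el.iter("Choice")]
        for el in root.iter("Choices")
    }
    return objects, controls
```

For the config above this yields one `Image` object tag and one `Choices` control tag with the values `Cat` and `Dog`.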
### Setup labeling config from file
It is possible to initialize a new project with predefined `config.xml`:
```bash
label-studio start my_new_project --init --label-config config.xml
```
### Setup labeling config from UI
You can also use the web interface at [`/setup`](http://localhost:8080/setup) to paste your labeling config. Using the web UI you also get a live update while you're editing the config.
### Setup labeling config from API
Save labeling config for the project using API:
```
curl -X POST -H Content-Type:application/json http://localhost:8080/api/save-config \
--data "{\"label_config\": \"<View>[...]</View>\"}"
```
The backend should return status 201 if the config is valid and saved.
If errors occur, the backend returns status 400 and the response body will be a JSON dict:
```
{
"label_config": ["error 1 description", " error 2 description", ...]
}
```
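The same call can be made from Python with only the standard library. This is an illustrative sketch that builds the request for the `/api/save-config` endpoint shown above (the base URL is assumed to be the default `http://localhost:8080`):

```python
import json
import urllib.request

def build_save_config_request(label_config, base_url="http://localhost:8080"):
    """Build the POST request for the /api/save-config endpoint documented above."""
    body = json.dumps({"label_config": label_config}).encode("utf-8")
    return urllib.request.Request(
        base_url + "/api/save-config",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Sending it would be: urllib.request.urlopen(build_save_config_request("<View>...</View>"))
# A 201 response means the config was saved; a 400 response carries the validation errors.
```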
| 33.487805 | 235 | 0.70976 | eng_Latn | 0.985225 |
08e7d77de3fbc8d3de190179f8cd481a4ef072f3 | 269 | md | Markdown | CONTRIBUTORS.md | OpenPhone/audioswitch | 09ccb2e22c3209b72a91d8f9312a8a84fdfe998f | [
"Apache-2.0"
] | null | null | null | CONTRIBUTORS.md | OpenPhone/audioswitch | 09ccb2e22c3209b72a91d8f9312a8a84fdfe998f | [
"Apache-2.0"
] | null | null | null | CONTRIBUTORS.md | OpenPhone/audioswitch | 09ccb2e22c3209b72a91d8f9312a8a84fdfe998f | [
"Apache-2.0"
] | null | null | null | <!---
Recommended contributor format
* [Your Name or Github ID](link to Github profile)
-->
Thank you to all our contributors!
* [John Qualls](https://github.com/Alton09)
* [Aaron Alaniz](https://github.com/aaalaniz)
* [Tejas Nandanikar](https://github.com/tejas-n)
| 22.416667 | 50 | 0.713755 | eng_Latn | 0.22016 |
08e81b0281b761a2b6c7f5b30393f46eaeb4ee48 | 1,709 | md | Markdown | docs/framework/winforms/advanced/how-to-create-a-solid-brush.md | acid-chicken/docs.ja-jp | 6af8d52f0d846aac92c12c76e61de8753ba3e8ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/winforms/advanced/how-to-create-a-solid-brush.md | acid-chicken/docs.ja-jp | 6af8d52f0d846aac92c12c76e61de8753ba3e8ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/winforms/advanced/how-to-create-a-solid-brush.md | acid-chicken/docs.ja-jp | 6af8d52f0d846aac92c12c76e61de8753ba3e8ab | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'How to: Create a solid brush'
ms.date: 03/30/2017
dev_langs:
- csharp
- vb
- cpp
helpviewer_keywords:
- solid color brushes
- brushes [Windows Forms], examples
- brushes [Windows Forms], creating solid
ms.assetid: 85c3fe7d-fb1d-4591-8a9f-d75b556b90af
ms.openlocfilehash: 0943bd1d5e05a1d726f0f6c55e372b9ff70cc4ca
ms.sourcegitcommit: 6b308cf6d627d78ee36dbbae8972a310ac7fd6c8
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 01/23/2019
ms.locfileid: "54632271"
---
# <a name="how-to-create-a-solid-brush"></a>How to: Create a solid brush
This example creates a <xref:System.Drawing.SolidBrush> object that a <xref:System.Drawing.Graphics> object can use to fill shapes.
## <a name="example"></a>Example
[!code-cpp[System.Drawing.ConceptualHowTos#1](../../../../samples/snippets/cpp/VS_Snippets_Winforms/System.Drawing.ConceptualHowTos/cpp/form1.cpp#1)]
[!code-csharp[System.Drawing.ConceptualHowTos#1](../../../../samples/snippets/csharp/VS_Snippets_Winforms/System.Drawing.ConceptualHowTos/CS/form1.cs#1)]
[!code-vb[System.Drawing.ConceptualHowTos#1](../../../../samples/snippets/visualbasic/VS_Snippets_Winforms/System.Drawing.ConceptualHowTos/VB/form1.vb#1)]
## <a name="robust-programming"></a>Robust programming
When you are finished using objects that consume system resources, such as brush objects, you should call <xref:System.IDisposable.Dispose%2A> on them.
## <a name="see-also"></a>See also
- <xref:System.Drawing.SolidBrush>
- <xref:System.Drawing.Brush>
- [Getting started with graphics programming](../../../../docs/framework/winforms/advanced/getting-started-with-graphics-programming.md)
- [Brushes and filled shapes in GDI+](../../../../docs/framework/winforms/advanced/brushes-and-filled-shapes-in-gdi.md)
- [Using a brush to fill shapes](../../../../docs/framework/winforms/advanced/using-a-brush-to-fill-shapes.md)
| 46.189189 | 157 | 0.758923 | yue_Hant | 0.588357 |
08e8e8ba239e2dfde9ae071732977dd854e7725e | 130 | md | Markdown | Mapp.md | cy15196/ClabsoHandBooks | 0d86c666381f62ee149205434fe7164798078a6b | [
"Apache-2.0"
] | null | null | null | Mapp.md | cy15196/ClabsoHandBooks | 0d86c666381f62ee149205434fe7164798078a6b | [
"Apache-2.0"
] | null | null | null | Mapp.md | cy15196/ClabsoHandBooks | 0d86c666381f62ee149205434fe7164798078a6b | [
"Apache-2.0"
] | null | null | null | Qilin Smart Software User Manual
======================
## Maintainer
chenyang@clabso.com
## Quick start
### Connect the instrument
### Start a measurement
### Generate a report
### Contact us
info@clabso.com
| 6.842105 | 22 | 0.492308 | oci_Latn | 0.353271 |
08ea74b9965ebc3f47a99458a4e885ecfd597b47 | 4,029 | md | Markdown | documentation/modules/auxiliary/admin/ldap/vmware_vcenter_vmdir_auth_bypass.md | lixxsixx/metasploit-framework | 28f770dce4c740e4294adccc5e2dceaf6cacd548 | [
"BSD-2-Clause",
"BSD-3-Clause"
] | 3 | 2020-12-31T05:42:44.000Z | 2021-01-30T09:02:35.000Z | documentation/modules/auxiliary/admin/ldap/vmware_vcenter_vmdir_auth_bypass.md | l0x539/metasploit-framework | d38dcb349fdba96ad089060ee2329b1288f2213a | [
"BSD-2-Clause",
"BSD-3-Clause"
] | 6 | 2020-05-25T03:43:35.000Z | 2022-02-26T01:57:11.000Z | documentation/modules/auxiliary/admin/ldap/vmware_vcenter_vmdir_auth_bypass.md | l0x539/metasploit-framework | d38dcb349fdba96ad089060ee2329b1288f2213a | [
"BSD-2-Clause",
"BSD-3-Clause"
] | 4 | 2020-11-04T03:12:56.000Z | 2021-07-02T04:02:18.000Z | ## Vulnerable Application
### Description
This module bypasses LDAP authentication in VMware vCenter Server's
vmdir service to add an arbitrary administrator user. Version 6.7
prior to the 6.7U3f update is vulnerable.
### Setup
Tested in the wild. No setup notes available at this time, as setup will
be specific to the target environment.
## Verification Steps
Follow [Setup](#setup) and [Scenarios](#scenarios).
## Actions
### Add
Add an admin user to the vCenter Server.
## Options
### BASE_DN
If you already have the LDAP base DN, you may set it in this option.
### USERNAME
Set this to the username for the new admin user.
### PASSWORD
Set this to the password for the new admin user.
### ConnectTimeout
You may configure the timeout for LDAP connects if necessary. The
default is 10.0 seconds and should be more than sufficient.
## Scenarios
### VMware vCenter Server 6.7 virtual appliance on ESXi
```
msf5 > use auxiliary/admin/ldap/vmware_vcenter_vmdir_auth_bypass
msf5 auxiliary(admin/ldap/vmware_vcenter_vmdir_auth_bypass) > options
Module options (auxiliary/admin/ldap/vmware_vcenter_vmdir_auth_bypass):
Name Current Setting Required Description
---- --------------- -------- -----------
BASE_DN no LDAP base DN if you already have it
PASSWORD no Password of admin user to add
RHOSTS yes The target host(s), range CIDR identifier, or hosts file with syntax 'file:<path>'
RPORT 389 yes The target port
USERNAME no Username of admin user to add
Auxiliary action:
Name Description
---- -----------
Add Add an admin user
msf5 auxiliary(admin/ldap/vmware_vcenter_vmdir_auth_bypass) > set rhosts [redacted]
rhosts => [redacted]
msf5 auxiliary(admin/ldap/vmware_vcenter_vmdir_auth_bypass) > set username msfadmin
username => msfadmin
msf5 auxiliary(admin/ldap/vmware_vcenter_vmdir_auth_bypass) > set password msfadmin
password => msfadmin
msf5 auxiliary(admin/ldap/vmware_vcenter_vmdir_auth_bypass) > run
[*] Running module against [redacted]
[*] Using auxiliary/gather/vmware_vcenter_vmdir_ldap as check
[*] Discovering base DN automatically
[*] Searching root DSE for base DN
dn: cn=DSE Root
namingcontexts: dc=vsphere,dc=local
supportedcontrol: 1.3.6.1.4.1.4203.1.9.1.1
supportedcontrol: 1.3.6.1.4.1.4203.1.9.1.2
supportedcontrol: 1.3.6.1.4.1.4203.1.9.1.3
supportedcontrol: 1.2.840.113556.1.4.417
supportedcontrol: 1.2.840.113556.1.4.319
supportedldapversion: 3
supportedsaslmechanisms: GSSAPI
[+] Discovered base DN: dc=vsphere,dc=local
[*] Dumping LDAP data from vmdir service at [redacted]:389
[+] [redacted]:389 is vulnerable to CVE-2020-3952
[*] Storing LDAP data in loot
[+] Saved LDAP data to /Users/wvu/.msf4/loot/20200417002255_default_[redacted]_VMwarevCenterS_975097.txt
[*] Password and lockout policy:
dn: cn=password and lockout policy,dc=vsphere,dc=local
cn: password and lockout policy
enabled: TRUE
ntsecuritydescriptor:: [redacted]
objectclass: top
objectclass: vmwLockoutPolicy
objectclass: vmwPasswordPolicy
objectclass: vmwPolicy
vmwpasswordchangeautounlockintervalsec: [redacted]
vmwpasswordchangefailedattemptintervalsec: [redacted]
vmwpasswordchangemaxfailedattempts: [redacted]
vmwpasswordlifetimedays: [redacted]
vmwpasswordmaxidenticaladjacentchars: [redacted]
vmwpasswordmaxlength: [redacted]
vmwpasswordminalphabeticcount: [redacted]
vmwpasswordminlength: [redacted]
vmwpasswordminlowercasecount: [redacted]
vmwpasswordminnumericcount: [redacted]
vmwpasswordminspecialcharcount: [redacted]
vmwpasswordminuppercasecount: [redacted]
vmwpasswordprohibitedpreviouscount: [redacted]
[*] Bypassing LDAP auth in vmdir service at [redacted]:389
[*] Adding admin user msfadmin with password msfadmin
[+] Added user msfadmin, so auth bypass was successful!
[+] Added user msfadmin to admin group
[*] Auxiliary module execution completed
msf5 auxiliary(admin/ldap/vmware_vcenter_vmdir_auth_bypass) >
```
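The "Discovering base DN automatically" step in the output above works by reading the `namingcontexts` attribute from the root DSE. As a rough illustration only (the module itself is written in Ruby, and this helper is hypothetical), the extraction could be sketched in Python as:

```python
def extract_base_dn(root_dse_text):
    """Pull the base DN from root-DSE output like the 'namingcontexts:' line above."""
    for line in root_dse_text.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "namingcontexts":
            return value.strip()
    return None  # no naming context advertised
```

Given the root DSE shown above, this would return `dc=vsphere,dc=local`, which the module then uses as the search base.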
| 31.97619 | 122 | 0.752792 | eng_Latn | 0.716738 |
08eba5c001c23012a06904803b0332f15d9df6d6 | 44 | md | Markdown | README.md | t4y3/time-schedule | 41142739c81799fdea7d1d8ad11c20bdaa3aa2bc | [
"MIT"
] | null | null | null | README.md | t4y3/time-schedule | 41142739c81799fdea7d1d8ad11c20bdaa3aa2bc | [
"MIT"
] | null | null | null | README.md | t4y3/time-schedule | 41142739c81799fdea7d1d8ad11c20bdaa3aa2bc | [
"MIT"
] | null | null | null | # time-schedule
Manage only time schedules
| 14.666667 | 27 | 0.795455 | eng_Latn | 0.997053 |
08ec61f1d479384fdf57ddd1c700b5ce7ab4bcae | 1,946 | md | Markdown | docs/windows/how-to-change-the-language-or-condition-of-a-resource-while-copying.md | jmittert/cpp-docs | cea5a8ee2b4764b2bac4afe5d386362ffd64e55a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/windows/how-to-change-the-language-or-condition-of-a-resource-while-copying.md | jmittert/cpp-docs | cea5a8ee2b4764b2bac4afe5d386362ffd64e55a | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/windows/how-to-change-the-language-or-condition-of-a-resource-while-copying.md | jmittert/cpp-docs | cea5a8ee2b4764b2bac4afe5d386362ffd64e55a | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-06-14T03:42:31.000Z | 2020-06-14T03:42:31.000Z | ---
title: "How to: Change the Language or Condition of a Resource While Copying (C++)"
ms.date: "11/04/2016"
f1_keywords: ["vc.resvw.resource.changing"]
helpviewer_keywords: ["Language property [C++]"]
ms.assetid: 8f622ab0-bac2-468f-ae70-78911afc4759
---
# How to: Change the Language or Condition of a Resource While Copying (C++)
While copying a resource, you can change its language property or condition property, or both.
- The language of the resource identifies just that, the language for the resource. This is used by [FindResource](/windows/desktop/api/winbase/nf-winbase-findresourcea) to help identify the resource for which you're looking. (However, resources can have differences for each language that aren't related to text, for example, accelerators that might only work on a Japanese keyboard or a bitmap that would only be appropriate for Chinese localized builds, etc.)
- The condition of a resource is a defined symbol that identifies a condition under which this particular copy of the resource is to be used.
The language and condition of a resource are shown in parentheses after the name of the resource in the Workspace window. In this example the resource named IDD_AboutBox is using Finnish as its language and its condition is XX33.
```cpp
IDD_AboutBox (Finnish - XX33)
```
### To copy an existing resource and change its language or condition
1. In the .rc file or in the [Resource View](../windows/resource-view-window.md) window, right-click the resource you want to copy.
2. Choose **Insert Copy** from the shortcut menu.
3. In the **Insert Resource Copy** dialog box:
- For the **Language** list box, select the language.
- In the **Condition** box, type the condition.
## Requirements
Win32
## See Also
[How to: Copy Resources](../windows/how-to-copy-resources.md)<br/>
[Resource Files](../windows/resource-files-visual-studio.md)<br/>
[Resource Editors](../windows/resource-editors.md) | 46.333333 | 462 | 0.759507 | eng_Latn | 0.99174 |
08ecefb9bef915a1a278e370a53661a31357f412 | 72 | md | Markdown | tag/new-york.md | DiningByStarlight/diningbystarlight.github.io | 1e7adddaf8547890d1444b6bbd8a3738ee3911f0 | [
"MIT"
] | null | null | null | tag/new-york.md | DiningByStarlight/diningbystarlight.github.io | 1e7adddaf8547890d1444b6bbd8a3738ee3911f0 | [
"MIT"
] | null | null | null | tag/new-york.md | DiningByStarlight/diningbystarlight.github.io | 1e7adddaf8547890d1444b6bbd8a3738ee3911f0 | [
"MIT"
] | null | null | null | ---
layout: posts_by_tag
tag: new-york
title: Posts tagged New York
---
| 12 | 28 | 0.708333 | kor_Hang | 0.322273 |
08ed2ddd2075b775ae4410ac7b4ca430a7e8320d | 1,098 | md | Markdown | introduction/what-is-sideloading/faq.md | sideloading/sideloading-master-guide | 4f2e8c7e8daafad23580da7e48597de3b247856d | [
"MIT"
] | null | null | null | introduction/what-is-sideloading/faq.md | sideloading/sideloading-master-guide | 4f2e8c7e8daafad23580da7e48597de3b247856d | [
"MIT"
] | null | null | null | introduction/what-is-sideloading/faq.md | sideloading/sideloading-master-guide | 4f2e8c7e8daafad23580da7e48597de3b247856d | [
"MIT"
] | null | null | null | # FAQ
## Can I sideload <xyz>.IPA
If it is an IPA, then you most likely **can** sideload it. \[guide\]
## Can I sideload <xyz> tweak
**Maybe**, it depends on whether the tweak requires root and whether it needs `PreferenceLoader`. If the answer to either of those questions is yes, you are **unlikely** to be able to successfully sideload that tweak.
## Can I sideload on Windows?
Yes. See the page links below.
## Do I need to buy a developer account?
No! You can use a free one; however, apps will be limited to 7 days, at which point you will have to re-sign them.
See the following:
{% page-ref page="../../providers/code-signing-services.md" %}
{% page-ref page="../../providers/code-signing-services-free.md" %}
## I get an <App ID limit> error in Xcode:
This means you have created too many provisioning profiles in 7 days using a free account. You will be able to sideload again after that time elapses.
## My apps crash on startup! Help!
This is either \(1\) the provisioning profile expired, or \(2\) the app is incompatible, or \(3\) the tweak has not been applied correctly.
| 33.272727 | 207 | 0.713115 | eng_Latn | 0.998815 |
08edafb06c046068756c08a010c1e796dc4d9587 | 51 | md | Markdown | README.md | xzag/workflow | 0a388769e8190cce1f378e6a5752826d3e66dd53 | [
"MIT"
] | null | null | null | README.md | xzag/workflow | 0a388769e8190cce1f378e6a5752826d3e66dd53 | [
"MIT"
] | null | null | null | README.md | xzag/workflow | 0a388769e8190cce1f378e6a5752826d3e66dd53 | [
"MIT"
] | null | null | null | # workflow
Helps you with entity state transitions
| 17 | 39 | 0.823529 | eng_Latn | 0.99987 |
08ee1ee656307013b39b5723170c0d0589ee7cce | 5,129 | md | Markdown | includes/data-factory-azure-storage-linked-services.md | flarocca/azure-docs.es-es | 8d69748012641d57ddb2b81a3e1c2d079703ed8d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/data-factory-azure-storage-linked-services.md | flarocca/azure-docs.es-es | 8d69748012641d57ddb2b81a3e1c2d079703ed8d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | includes/data-factory-azure-storage-linked-services.md | flarocca/azure-docs.es-es | 8d69748012641d57ddb2b81a3e1c2d079703ed8d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
author: linda33wj
ms.service: data-factory
ms.topic: include
ms.date: 11/09/2018
ms.author: jingwang
ms.openlocfilehash: 37917e0ed663675677f1d0452b5796120ca2694e
ms.sourcegitcommit: 2ec4b3d0bad7dc0071400c2a2264399e4fe34897
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 03/28/2020
ms.locfileid: "75466856"
---
### <a name="azure-storage-linked-service"></a>Azure Storage linked service
The **Azure Storage** linked service lets you link an Azure storage account to an Azure data factory by using the **account key**, which gives the data factory global access to Azure Storage. The following table describes the JSON elements specific to the Azure Storage linked service.
| Property | Description | Required |
|:--- |:--- |:--- |
| type |The type property must be set to: **AzureStorage** |Yes |
| connectionString |Specify the information needed to connect to Azure Storage for the connectionString property. |Yes |
For information about how to retrieve the storage account access keys, see [Manage storage account access keys](../articles/storage/common/storage-account-keys-manage.md).
**Example:**
```json
{
"name": "StorageLinkedService",
"properties": {
"type": "AzureStorage",
"typeProperties": {
"connectionString": "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<accountkey>"
}
}
}
```
### <a name="azure-storage-sas-linked-service"></a>Servicio vinculado de SAS Azure Storage
Una Firma de acceso compartido (SAS) ofrece acceso delegado a los recursos en la cuenta de almacenamiento. Esto le permite conceder a un cliente permisos limitados a objetos en su cuenta de almacenamiento durante un período específico y con un conjunto determinado de permisos sin tener que compartir las claves de acceso a las cuentas. La SAS es un URI que incluye en sus parámetros de consulta toda la información necesaria para el acceso autenticado a un recurso de almacenamiento. Para obtener acceso a los recursos de almacenamiento con la SAS, el cliente solo tiene que pasar la SAS al método o constructor adecuados. Para obtener más información sobre las firmas de acceso compartido, consulte [Otorgar acceso limitado a recursos de Azure Storage con firmas de acceso compartido (SAS)](../articles/storage/common/storage-sas-overview.md).
> [!IMPORTANT]
> Ahora Azure Data Factory solo admite **SAS de servicio**, pero no SAS de cuenta. Tenga en cuenta que la dirección URL de SAS generable desde Azure Portal o el Explorador de Storage es una SAS de cuenta, que no es compatible.
> [!TIP]
> Puede ejecutar los siguientes comandos de PowerShell para generar una SAS de servicio para la cuenta de almacenamiento (reemplace los marcadores de posición y conceda los permisos necesarios): `$context = New-AzStorageContext -StorageAccountName <accountName> -StorageAccountKey <accountKey>`
> `New-AzStorageContainerSASToken -Name <containerName> -Context $context -Permission rwdl -StartTime <startTime> -ExpiryTime <endTime> -FullUri`
El servicio vinculado de SAS de Azure Storage le permite vincular una cuenta de Azure Storage a una factoría de datos Azure con una Firma de acceso compartido (SAS). Proporciona a la instancia de Data Factory acceso restringido o limitado por el tiempo a todos los recursos o a algunos específicos (blob o contenedor) del almacenamiento. En la tabla siguiente se proporciona la descripción de los elementos JSON específicos del servicio vinculado de SAS de Azure Storage.
| Propiedad | Descripción | Obligatorio |
|:--- |:--- |:--- |
| type |La propiedad type debe establecerse en: **AzureStorageSas** |Sí |
| sasUri |Especifique el URI de Firma de acceso compartido a los recursos de Azure Storage como blob, contenedor o tabla. |Sí |
**Ejemplo**:
```json
{
"name": "StorageSasLinkedService",
"properties": {
"type": "AzureStorageSas",
"typeProperties": {
"sasUri": "<Specify SAS URI of the Azure Storage resource>"
}
}
}
```
Al crear un **URI de SAS**, tenga en cuenta lo siguiente:
* Establezca los **permisos** de lectura y escritura adecuados en los objetos, en función de cómo se utilizará el servicio vinculado (lectura, escritura, lectura y escritura) en la factoría de datos.
* Establezca la **hora de expiración** adecuadamente. Asegúrese de que el acceso a los objetos de Azure Storage no expirará durante el período activo de la canalización.
* El URI debe crearse en el nivel correcto del contenedor/blob o la tabla, en función de la necesidad. Un URI de SAS de un blob de Azure permite que el servicio Data Factory tenga acceso a ese blob en particular. Un URI de SAS de un contenedor de blob de Azure permite que el servicio Data Factory procese una iteración en los blobs de ese contenedor. Si necesita proporcionar acceso a más o menos objetos más adelante, o actualizar el URI de SAS, no olvide actualizar el servicio vinculado con el nuevo URI.
| 68.386667 | 845 | 0.766621 | spa_Latn | 0.981361 |
08ee804a95e243ef0d87e6bb37b85147f802895b | 676 | md | Markdown | _posts/blog/2015-02-18-vim_charity_financial_report_2014.md | conao3/vim-jp.github.io | 811910bab90a73c24705abac5ef74e1adb5608c6 | [
"CC-BY-4.0"
] | 32 | 2016-03-06T06:44:17.000Z | 2021-12-24T08:33:43.000Z | _posts/blog/2015-02-18-vim_charity_financial_report_2014.md | conao3/vim-jp.github.io | 811910bab90a73c24705abac5ef74e1adb5608c6 | [
"CC-BY-4.0"
] | 159 | 2016-03-02T12:56:31.000Z | 2021-11-27T14:00:27.000Z | _posts/blog/2015-02-18-vim_charity_financial_report_2014.md | conao3/vim-jp.github.io | 811910bab90a73c24705abac5ef74e1adb5608c6 | [
"CC-BY-4.0"
] | 19 | 2016-08-05T11:57:23.000Z | 2021-11-26T14:01:03.000Z | ---
layout: blog
category: blog
title: Vim チャリティー決算報告 2014
---
[Vim charity: financial report 2014](https://groups.google.com/d/msg/vim_announce/2czE8NAWetM/HUASaYWiPSkJ) の勝手訳です。
-----
Hello Vim users,
Vim ユーザーにはウガンダでの活動をサポートしていただけるようにお願いしていますが、2014 年も多くの方から資金援助をいただきました。資金は ICCF 基金が管理しています。こちらから決算報告を入手できます: <http://www.iccf.nl/jaarrekening2014en.pdf>
オランダ語版はこちらです: <http://www.iccf.nl/jaarrekening2014nl.pdf>
うれしいことに資金援助はかなり増えています。増加分は主に一回きりの寄付 (one-time donations) によるものです。これが Vim ユーザーによるものかは確かではありませんが、きっとそうだと思います。子供たちを助けてくれて本当にありがとう!
寄付金のうちわずか 0.6% しかコストが掛かっていないということもうれしい限りです。そして実際に 99% 以上のお金がウガンダのプロジェクトに渡りました。お金は、学校や診療所の運営を維持し、子供たちが健康で、学業に励めるように使われています。
-----
| 30.727273 | 150 | 0.816568 | yue_Hant | 0.552067 |
08eeeec351412a422e20a97c9e5474b7382afa22 | 1,595 | md | Markdown | en/compute/operations/images-with-pre-installed-software/create.md | alphantom/docs | 9b24fd589e7802440e507e181aa4401528c82711 | [
"CC-BY-4.0"
] | null | null | null | en/compute/operations/images-with-pre-installed-software/create.md | alphantom/docs | 9b24fd589e7802440e507e181aa4401528c82711 | [
"CC-BY-4.0"
] | null | null | null | en/compute/operations/images-with-pre-installed-software/create.md | alphantom/docs | 9b24fd589e7802440e507e181aa4401528c82711 | [
"CC-BY-4.0"
] | null | null | null | # Creating a VM from a public image
To create a VM:
1. Open the folder where the VM will be created.
1. Click **Create resource**.
1. Select **Virtual machine**.
1. In the **Name** field, enter the VM name.
[!INCLUDE [name-format](../../../_includes/name-format.md)]
1. Select the [availability zone](../../../overview/concepts/geo-scope.md) to locate the VM in.
1. Select a public image with the software you want to use.
1. In the **Computing resources** section:
- Choose the [platform](../../concepts/vm-platforms.md).
- Specify the necessary number of vCPUs and amount of RAM.
> [!NOTE]
>
> To create a VM from a **GitLab** image, at least 4 virtual cores (100% vCPU) and 4 GB of RAM are required.
1. In the **Network settings** section, click **Add network**.
1. In the window that opens, select the subnet to connect the VM to when creating it.
1. In **Public address**, choose:
- **Automatically** — to set a public IP address automatically. The address is allocated from the pool of Yandex.Cloud addresses.
- **List** — to select a public IP address from the list of static addresses. For more information, see the section [[!TITLE]](../../../vpc/operations/set-static-ip.md) in the [!KEYREF vpc-name] service documentation.
- **No address** — to not assign a public IP address.
1. Specify data required for accessing the VM.
1. Click **Create VM**.
VM creation takes several minutes. When the VM status changes to `RUNNING`, proceed to [configuring software](setup.md). You can monitor VM statuses on the list of VMs in the folder.
| 37.97619 | 221 | 0.695298 | eng_Latn | 0.986587 |
08ef522c0a5658eb0ed2ff3ed4b3d582957b62b8 | 4,732 | md | Markdown | docs/framework/configure-apps/file-schema/runtime/appdomainmanagertype-element.md | olifantix/docs.de-de | a31a14cdc3967b64f434a2055f7de6bf1bb3cda8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/configure-apps/file-schema/runtime/appdomainmanagertype-element.md | olifantix/docs.de-de | a31a14cdc3967b64f434a2055f7de6bf1bb3cda8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/configure-apps/file-schema/runtime/appdomainmanagertype-element.md | olifantix/docs.de-de | a31a14cdc3967b64f434a2055f7de6bf1bb3cda8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: '<AppDomainManagerType> Element'
ms.date: 03/30/2017
helpviewer_keywords:
- appDomainManagerType element
- <appDomainManagerType> element
ms.assetid: ae8d5a7e-e7f7-47f7-98d9-455cc243a322
author: rpetrusha
ms.author: ronpet
ms.openlocfilehash: 8fb771d58a99e42ad53a465008e8848cff0a87fd
ms.sourcegitcommit: 11f11ca6cefe555972b3a5c99729d1a7523d8f50
ms.translationtype: MT
ms.contentlocale: de-DE
ms.lasthandoff: 05/03/2018
---
# <a name="ltappdomainmanagertypegt-element"></a><AppDomainManagerType> Element
Gibt den Typ an, der als Anwendungsdomänen-Manager für die Standardanwendungsdomäne dient.
\<configuration>
\<Common Language Runtime >
\<AppDomainManagerType >
## <a name="syntax"></a>Syntax
```xml
<appDomainManagerAssembly
value="type name" />
```
## <a name="attributes-and-elements"></a>Attribute und Elemente
In den folgenden Abschnitten werden Attribute sowie untergeordnete und übergeordnete Elemente beschrieben.
### <a name="attributes"></a>Attribute
|Attribut|Beschreibung|
|---------------|-----------------|
|`value`|Erforderliches Attribut. Gibt den Namen des Typs, einschließlich des Namespace, an, der als Anwendungsdomänen-Manager für die Standardanwendungsdomäne im Prozess dient.|
### <a name="child-elements"></a>Untergeordnete Elemente
Keine
### <a name="parent-elements"></a>Übergeordnete Elemente
|Element|Beschreibung|
|-------------|-----------------|
|`configuration`|Das Stammelement in jeder von den Common Language Runtime- und .NET Framework-Anwendungen verwendeten Konfigurationsdatei.|
|`runtime`|Enthält Informationen über die Assemblybindung und die Garbage Collection.|
## <a name="remarks"></a>Hinweise
 Um den Typ des Anwendungsdomänen-Managers anzugeben, müssen Sie sowohl dieses Element als auch das [ \<AppDomainManagerAssembly >](../../../../../docs/framework/configure-apps/file-schema/runtime/appdomainmanagerassembly-element.md)-Element angeben. Wenn eines dieser Elemente nicht angegeben ist, wird das andere ignoriert.
 Wenn die Standardanwendungsdomäne geladen wird und der angegebene Typ nicht in der Assembly vorhanden ist, die durch das [ \<AppDomainManagerAssembly >](../../../../../docs/framework/configure-apps/file-schema/runtime/appdomainmanagerassembly-element.md)-Element angegeben wird, wird eine <xref:System.TypeLoadException> ausgelöst, und der Prozess kann nicht gestartet werden.
 Wenn Sie den Anwendungsdomänen-Managertyp für die Standardanwendungsdomäne angeben, erben andere Anwendungsdomänen, die von der Standardanwendungsdomäne erstellt werden, diesen Managertyp. Verwenden Sie die Eigenschaften <xref:System.AppDomainSetup.AppDomainManagerType%2A?displayProperty=nameWithType> und <xref:System.AppDomainSetup.AppDomainManagerAssembly%2A?displayProperty=nameWithType>, um einen anderen Anwendungsdomänen-Managertyp für eine neue Anwendungsdomäne anzugeben.
 Um den Anwendungsdomänen-Managertyp angeben zu können, muss die Anwendung über volle Vertrauenswürdigkeit verfügen. (Z. B. weist eine Anwendung, die auf dem Desktop ausgeführt wird, volle Vertrauenswürdigkeit auf.) Wenn die Anwendung nicht über volle Vertrauenswürdigkeit verfügt, wird eine <xref:System.TypeLoadException> ausgelöst.
 Das Format des Typnamens einschließlich des Namespace entspricht dem Format, das für die <xref:System.Type.FullName%2A?displayProperty=nameWithType>-Eigenschaft verwendet wird.
 Dieses Konfigurationselement steht nur in der [!INCLUDE[net_v40_long](../../../../../includes/net-v40-long-md.md)] und höher zur Verfügung.
## <a name="example"></a>Beispiel
 Das folgende Beispiel zeigt, wie Sie angeben, dass der Anwendungsdomänen-Manager für die Standardanwendungsdomäne eines Prozesses der Typ `MyMgr` in der Assembly `AdMgrExample` ist.
```xml
<configuration>
<runtime>
<appDomainManagerType value="MyMgr" />
<appDomainManagerAssembly
value="AdMgrExample, Version=1.0.0.0, Culture=neutral, PublicKeyToken=6856bccf150f00b3" />
</runtime>
</configuration>
```
## <a name="see-also"></a>Siehe auch
<xref:System.AppDomainSetup.AppDomainManagerType%2A?displayProperty=nameWithType>
<xref:System.AppDomainSetup.AppDomainManagerAssembly%2A?displayProperty=nameWithType>
[\<AppDomainManagerAssembly >-Element](../../../../../docs/framework/configure-apps/file-schema/runtime/appdomainmanagerassembly-element.md)
[Schema für Laufzeiteinstellungen](../../../../../docs/framework/configure-apps/file-schema/runtime/index.md)
[Konfigurationsdateischema](../../../../../docs/framework/configure-apps/file-schema/index.md)
[SetAppDomainManagerType-Methode](../../../../../docs/framework/unmanaged-api/hosting/iclrcontrol-setappdomainmanagertype-method.md)
| 57.707317 | 479 | 0.756974 | deu_Latn | 0.911119 |
08ef5a34daf82080a721a6533d9ce0338ff3ec0c | 311 | md | Markdown | README.md | corets/use-previous | 873bf5be7d0bcfe5780adb510b7132007aadc03d | [
"MIT"
] | null | null | null | README.md | corets/use-previous | 873bf5be7d0bcfe5780adb510b7132007aadc03d | [
"MIT"
] | null | null | null | README.md | corets/use-previous | 873bf5be7d0bcfe5780adb510b7132007aadc03d | [
"MIT"
] | null | null | null | <p align="center"><a href="https://docs.corets.io"><img src="https://corets.github.io/public/logo-github-readme.svg" width="300"/></a></p>
<p align="center"><b><a href="https://docs.corets.io/hooks/use-previous">Documentation</a></b><br/><br/><br/></p>
<p align="center">Track previous versions of values</p>
| 51.833333 | 138 | 0.672026 | yue_Hant | 0.400507 |
08efa3e7a5feed172a0888a58adb3151ba517f0f | 12,240 | md | Markdown | docs/content/faq.md | juergenhoetzel/Paket | 89b11627d73d322dcad9d042c3b1314652caef6f | [
"MIT"
] | 1 | 2019-04-05T20:30:10.000Z | 2019-04-05T20:30:10.000Z | docs/content/faq.md | juergenhoetzel/Paket | 89b11627d73d322dcad9d042c3b1314652caef6f | [
"MIT"
] | null | null | null | docs/content/faq.md | juergenhoetzel/Paket | 89b11627d73d322dcad9d042c3b1314652caef6f | [
"MIT"
] | 1 | 2019-04-05T20:30:12.000Z | 2019-04-05T20:30:12.000Z | # FAQ — Frequently Asked Questions
## I don't understand why I need Paket to manage my packages. Why can't I just use NuGet?
NuGet does not separate out the concept of [transitive dependencies](faq.html#transitive); if you install a package into your project and that package has further dependencies then all transitive packages are included in the `packages.config`. There is no way to tell which packages are only transitive dependencies.
Even more importantly: If two packages reference conflicting versions of a package, NuGet will silently take the latest version ([read more](controlling-nuget-resolution.html)). You have no control over this process.
Paket on the other hand maintains this information on a consistent and stable basis within the [`paket.lock` file](lock-file.html) in the solution root. This file, together with the [`paket.dependencies` file](dependencies-file.html) enables you to determine exactly what's happening with your dependencies.
The [`paket outdated` command](paket-outdated.html) lists packages that have new versions available.
Paket also enables one to reference files directly from [GitHub repositories, Gists](github-dependencies.html) and [HTTP](http-dependencies.html).
<div id="no-version"></div>
## NuGet puts the package version into the path. Is Paket doing the same?
No, since Paket provides a global view of your dependencies it usually installs only one version of a package and therefore the version number is not needed in the path.
This makes it much easier to reference files in the package and you don't have to edit these references when you update a package.
If you really need to have the version in the path for certain packages (like xunit.runners.visualstudio) you [can still do that](nuget-dependencies.html#Putting-the-version-no-into-the-path).
## NuGet allows to use multiple versions of the same package. Can I do that with Paket?
Usually you don't want that to happen. Most solutions that have multiple versions of the same package installed did this by accident.
Since NuGet has no global lock file and stores version information in packages.config (per project), it's hard to keep all projects consolidated.
Paket on the other gives you a global/consolidated view of all your dependencies in the [`paket.lock` file](lock-file.html).
In the very rare cases when you really need to maintain different versions of the same package you can use the [dependency groups feature](http://fsprojects.github.io/Paket/groups.html).
Every dependency group gets resolved independently so it also deals with the conflict resolution of indirect dependencies, but the most important difference is that using groups is a deliberate action.
You need to explicitly name the group in [`paket.references` files](references-files.html), so it won't happen by accident.
## Why does Paket add references to the libraries associated with each supported framework version within a NuGet package to my projects?
A NuGet package installation adds references only for the currently selected target .NET framework version of your project at the time of installation. Whenever you switch the framework version of your project, there's a potential need to reinstall all of the packages.
However the Visual Studio tooling does not address this – it's up to you to remember to reinstall. In the best case, this leads to compiler errors about missing methods/types etc. In the worst case, it's a variance that's either deeply buried within the code (meaning it might be difficult to trap in a test cycle) or a more difficult to detect 'silent' problem.
Paket adds references to all of them, but with `Condition` attributes filtering them based on the currently selected `TargetFramework` and other relevant MSBuild properties.
If you only want to use a subset of the target frameworks you can use [framework restrictions](http://fsprojects.github.io/Paket/nuget-dependencies.html#Framework-restrictions).
## Why does Paket use a different package resolution strategy than NuGet?
Paket tries to embrace [SemVer](http://semver.org/) while NuGet uses a pessimistic version resolution strategy. You can prefix your version constraints with `!` if you need to use [NuGet compatibility](dependencies-file.html#Strategy-option).
If you want to know more about Paket's resolver algorithm, then you can read [this article](resolver.html).
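For example (the package name is hypothetical), prefixing a single version constraint with `!` switches just that dependency to the NuGet-style strategy:

    [lang=paket]
    nuget Example !~> 1.2.3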
## Does Paket run install.ps1 scripts?
<div id="paket-vs-powershell-install-scripts"></div>
No, we don't run any script or program from NuGet packages and we have no plans to do this in the future.
We know that this might cause you some manual work for some of the currently available NuGet packages, but we think these install scripts cause more harm than good.
In fact our current model would not be able to work consistently alongside an `install.ps1` script like the following from `FontAwesome.4.1.0`:
[lang=batchfile]
param($installPath, $toolsPath, $package, $project)
foreach ($fontFile in $project.ProjectItems.Item("fonts").ProjectItems)
{
$fontFile.Properties.Item("BuildAction").Value = 2;
}
The reason is simply that even if we would support PowerShell on Windows we can't access the Visual Studio project system. Paket is a command line tool and doesn't run inside of Visual Studio.
There is no reasonable way to make this work – and even NuGet.exe can't do it in command line mode.
Instead we encourage the .NET community to use a declarative install process and we will help to fix this in the affected packages.
## What files should I commit?
Paket creates a number of files in your solution folders, and most of them should be committed to source control. To be clear, these are the files that should be committed to source control:
* [`paket.dependencies`](dependencies-file.html) - specifies your application's dependencies, and how they should be fulfilled.
* All [`paket.references`](references-files.html) files - each project will have a `paket.references` file that specifies which of the dependencies are installed in the project. Each of these files should be committed to source control.
* All [`paket.template`](template-files.html) files - if a project is supposed to be deployed as a NuGet project it will have a `paket.template` file that specifies package meta data. Each of these files should be committed to source control.
* [`paket.lock`](lock-file.html) - records the actual versions used during resolution. If it exists, Paket will ensure that the same versions are used when restoring packages. It is not strictly necessary to commit this file, but strongly recommended (see [this question](faq.html#Why-should-I-commit-the-lock-file) for details why).
The following files can be committed, but are not essential:
* [`.paket/paket.targets`](paket-folder.html) - the paket.targets file allows you to set up automatic package restore in Visual Studio.
* [`.paket/paket.bootstrapper.exe`](getting-started.html) - this is a small, rarely updated executable that will download the latest version of the main `paket.exe`. It is not necessary, but can be very useful for other developers and build servers, so they can easily retrieve `paket.exe` and restore packages without having Paket already installed in the path. For example, it is common to have a [`build.sh`](https://github.com/fsprojects/Paket/blob/master/build.sh) or [`build.cmd`](https://github.com/fsprojects/Paket/blob/master/build.cmd) file in the root of a repo that will do the equivalent of:
.paket/paket.bootstrapper.exe
.paket/paket.exe restore
// Invoke build tool/scripts to build solution
The following files should *not* be committed to your version control system, and should be added to any ignore files:
* `.paket/paket.exe` - the main Paket executable, downloaded by [`.paket/paket.bootstrapper.exe`](getting-started.html). This should not be committed, as it is a binary file, which can unnecessarily bloat repositories, and because it is likely to be updated on a regular basis.
* `paket-files` directory, as paket install will restore this. Same goes for the `packages` directory
## Why should I commit the lock file?
Committing the [`paket.lock` file](lock-file.html) to your version control system guarantees that other developers and/or build servers will always end up with a reliable and consistent set of packages regardless of where or when [`paket install`](paket-install.html) is run.
If your *project is an application* you should always commit the [`paket.lock` file](lock-file.html).
If your *project is a library* then you probably want to commit it as well. There are rare cases where you always want to test your lib against the latest version of your dependencies,
but we recommend to set up a second CI build instead. This new build should be run regularly (maybe once a day) and execute [`paket update`](paket-update.html) at the beginning.
This will ensure that you get notified whenever a dependency update breaks your library.
## I'm already using NuGet. How can I convert to Paket?
The process can be automated with [paket convert-from-nuget](paket-convert-from-nuget.html) command.
If the command fails, you can fall back to a manual approach:
1. Analyse your `packages.config` files and extract the referenced packages into a paket.dependencies file.
2. Convert each `packages.config` file to a paket.references file. This is very easy - you just have to remove all the XML and keep the package names.
3. Run [paket install](paket-install.html). This will analyze the dependencies, generate a paket.lock file, remove all the old package references from your project files and replace them with equivalent `Reference`s in a syntax that can be managed automatically by Paket.
4. (Optional) Raise a corresponding issue [here](https://github.com/fsprojects/Paket/issues) so that we can make the command even better.
## How do I convert a new project to Paket when my solution is already using Paket
In this case it's okay to use the `--force` flag for the `convert-from-nuget` command as described in [partial NuGet conversion](getting-started.html#Partial-NuGet-conversion). Paket will then go through your solution and convert all new NuGet projects to Paket.
## Paket stores paket.dependencies and paket.lock files in the root of a repository. How can I change that?
Very old paket.exe versions allowed you to specify the location. We disabled that because we have very strong opinions about the location of the [`paket.dependencies` file](dependencies-file.html).
We believe dependency management is so important that these files belong in the root of the repository. People should know about the project's dependencies.
That said: if you don't agree with that (but please take some time and think about it) you can use a batch file to change the working folder.
## Can I use Paket to manage npm/bower/whatever dependencies?
[No.](https://github.com/fsprojects/Paket/issues/61) We don't believe in reinventing the wheel.
On top of that, such a "meta package manager" abstraction is likely to be less flexible and behind on what native tools have to offer. Paket serves a specific need, that is [SemVer-compatible](http://semver.org) NuGet.
<div id="transitive"></div>
## What does "transitive dependencies" mean?
If you install NuGet packages into your project, these packages can have dependencies on other NuGet packages. Paket calls these dependencies "transitive". Transitive packages are automatically uninstalled if none of your "direct dependencies" (the packages that you actually installed) still depend on them.
## I am behind a proxy. Can I use Paket?
If your proxy uses default (Active Directory) credentials, you have nothing to do; Paket will handle it automatically.
If your proxy uses custom credentials, you need to set the following environment variables:
* `HTTP_PROXY`: http proxy to use for all connections
* `HTTPS_PROXY`: https proxy to use for all connections
* `NO_PROXY`: hosts that should bypass the proxy
For example:
[lang=paket]
set HTTP_PROXY=http://user:password@proxy.company.com:port/
set HTTPS_PROXY=https://user:password@proxy.company.com:port/
set NO_PROXY=.company.com,localhost
| 80 | 604 | 0.783742 | eng_Latn | 0.99811 |
08efb28f8ad2fefe5799aadf92f06121ab9123ac | 477 | md | Markdown | doc/api/smeup_widgets_smeup_timepicker_customization/SmeupTimePickerCustomization/layoutProportions.md | smeup/ken | 582c6c2e731aa62a6d0b9b4ccc5f044e6883f13a | [
"Apache-2.0"
] | 5 | 2021-12-28T12:47:39.000Z | 2022-03-25T16:56:25.000Z | doc/api/smeup_widgets_smeup_timepicker_customization/SmeupTimePickerCustomization/layoutProportions.md | smeup/ken | 582c6c2e731aa62a6d0b9b4ccc5f044e6883f13a | [
"Apache-2.0"
] | null | null | null | doc/api/smeup_widgets_smeup_timepicker_customization/SmeupTimePickerCustomization/layoutProportions.md | smeup/ken | 582c6c2e731aa62a6d0b9b4ccc5f044e6883f13a | [
"Apache-2.0"
] | null | null | null |
# layoutProportions method
*[<Null safety>](https://dart.dev/null-safety)*
- @[override](https://api.flutter.dev/flutter/dart-core/override-constant.html)
[List](https://api.flutter.dev/flutter/dart-core/List-class.html)<[int](https://api.flutter.dev/flutter/dart-core/int-class.html)> layoutProportions
()
_override_
## Implementation
```dart
@override
List<int> layoutProportions() {
return showSecondsColumn ? [1, 2, 1] : [1, 1, 0];
}
```
| 11.925 | 151 | 0.675052 | kor_Hang | 0.223723 |
08f052fcb8cf2a33e2b750a7497eaabb94b21bcc | 714 | md | Markdown | domain/versium.com/index.md | billfitzgerald/smmd | 9af567b54b39dc2872cf0ee6c3ada27627490c42 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | domain/versium.com/index.md | billfitzgerald/smmd | 9af567b54b39dc2872cf0ee6c3ada27627490c42 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | domain/versium.com/index.md | billfitzgerald/smmd | 9af567b54b39dc2872cf0ee6c3ada27627490c42 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
company-name: Versium
domain: versium.com
home: http://www.versium.com
email: "dlim [at] versium.com"
california-date: 01/31/2020
vermont-id: 361975
---
## How to opt out
Consumers may opt out via email to optout@versium.com, webform at https://versium.com/ccpa-opt-out, or by calling 1-800-395-0164.
## How to delete
Consumers may demand deletion of information via email to optout@versium.com, webform at https://versium.com/ccpa-opt-out, or by calling 1-800-395-0164.
## Additional info
For information on data collecting practices, please see our Privacy Notice for California Residents at https://versium.com/ccpa-privacy-policy.
7530 164th Ave NEA204Redmond, WA 98052United States
| 18.307692 | 152 | 0.747899 | eng_Latn | 0.703417 |
08f0750b056b2be322912d7072898dc6f8e49cb2 | 2,431 | md | Markdown | tests/io_perfomance/image/README.md | mperov/eden | 6e040a6eb8e010f06f646158404854d945368e9f | [
"Apache-2.0"
] | null | null | null | tests/io_perfomance/image/README.md | mperov/eden | 6e040a6eb8e010f06f646158404854d945368e9f | [
"Apache-2.0"
] | null | null | null | tests/io_perfomance/image/README.md | mperov/eden | 6e040a6eb8e010f06f646158404854d945368e9f | [
"Apache-2.0"
] | null | null | null | # FIO autotests container
This container runs 36 FIO tests of different types and load patterns in turn. You can find out more about the test configuration by looking at the config.fio file in the container itself, and inspect the test results in the results directory of the specified GitHub repository. Each test is 2 minutes long, so the total testing time is approximately 1 hour 12 minutes. Run the container and just wait for the result to appear in the specified repository.
## How to
Before starting the container, you need to [get a token on GitHub.com](https://docs.github.com/en/free-pro-team@latest/github/authenticating-to-github/creating-a-personal-access-token).
### How to deploy
```console
./eden pod deploy --metadata="EVE_VERSION=$(./eden config get --key=eve.tag)\nGIT_REPO=<git repository name>\nGIT_LOGIN=<your git login>\nGIT_TOKEN=<your git token>" -p 8029:80 docker://itmoeve/fio_tests --no-hyper
```
The following parameters must be specified in the --metadata parameter (the first four are required, GIT_BRANCH is optional):
1. **GIT_REPO** - The name of the repository, without .git. For example "eve-performance".
2. **GIT_LOGIN** - The GitHub username under which the repository specified in GIT_REPO is located.
3. **GIT_TOKEN** - A GitHub token used for authorization and for pushing a branch with results to your repository.
4. **EVE_VERSION** - The EVE version. This parameter is required for naming the branch on GitHub.
5. **GIT_BRANCH** - The branch name for pushing results. Optional parameter.
### How to run tests
This test creates a virtual machine and starts testing.
```console
GIT_REPO=<git repository name> GIT_LOGIN=<your git login> GIT_TOKEN=<your git token> ./eden test ./tests/io_perfomance
```
>Before running the test, you need to set the environment variables GIT_REPO, GIT_LOGIN, and GIT_TOKEN.
## About results
At the moment, the test results are posted to a new branch of the GitHub repository specified via the environment variables GIT_REPO, GIT_LOGIN, and GIT_TOKEN. The new branch will have a name of the form "FIO-tests-%date-eve-eve.tag" (for example, FIO-tests-11-31-24-11-2020-EVE-0.0.0-st_storage-2dd213ca-new). The directory with the results is located at the root of the branch and has the same name as the branch. The results directory has the following structure:
- FIO-tests-%date-eve-version
- README.md
- HARDWARE.cfg
- SUMMARY.csv
- Configs
- config.fio
- Test-results
- fio-results
- Iostat
| 52.847826 | 528 | 0.758536 | eng_Latn | 0.989711 |
08f075555cfb508b6f06adb16d960a10d7fa1599 | 518 | md | Markdown | website/docs/api/generated/enums/arrowtype.md | NeryHenrique/nodegui | 254b6a910928f512e57bc4bfab7dfadcd2073c84 | [
"MIT"
] | 7,857 | 2019-08-12T17:06:12.000Z | 2022-03-31T09:40:05.000Z | website/docs/api/generated/enums/arrowtype.md | NeryHenrique/nodegui | 254b6a910928f512e57bc4bfab7dfadcd2073c84 | [
"MIT"
] | 584 | 2019-08-11T21:39:18.000Z | 2022-03-25T23:37:10.000Z | website/docs/api/generated/enums/arrowtype.md | NeryHenrique/nodegui | 254b6a910928f512e57bc4bfab7dfadcd2073c84 | [
"MIT"
] | 310 | 2019-08-16T01:40:14.000Z | 2022-03-29T09:45:31.000Z | ---
id: "arrowtype"
title: "ArrowType"
sidebar_label: "ArrowType"
---
## Index
### Enumeration members
* [DownArrow](arrowtype.md#downarrow)
* [LeftArrow](arrowtype.md#leftarrow)
* [NoArrow](arrowtype.md#noarrow)
* [RightArrow](arrowtype.md#rightarrow)
* [UpArrow](arrowtype.md#uparrow)
## Enumeration members
### DownArrow
• **DownArrow**: = 2
___
### LeftArrow
• **LeftArrow**: = 3
___
### NoArrow
• **NoArrow**: = 0
___
### RightArrow
• **RightArrow**: = 4
___
### UpArrow
• **UpArrow**: = 1
# Racetimes example for Eventflow
This repository contains an example project that demonstrates some basic and extended EventFlow features.
Basic features:
- Command / CommandHandler
- Event
- Identity
- AggregateRoot
- ReadModel
Configuration:
- MSSQL
- EntityFramework
- EventFlowOptions
- Migration
Extended features:
- Entity (within AggregateRoot)
- ReadModel for an Entity
- Delete on ReadModel
- Snapshots
# Racetimes (Domain)
The domain of this project is storing times from races within competitions. Therefore competitions can be created, renamed and deleted. Racetimes (Entries) can be added and changed. These actions are far from complete but I think they are sufficient for an example.
# Note
This is still a work in progress but I enjoy hearing from you (especially feedback on points I missed, got wrong or could do better).
# Sources
The code is based on the official documentation as well as code from the tests within the [eventflow repository](https://github.com/eventflow/EventFlow).
# tiny-difference
[](https://www.npmjs.com/package/@ngard/tiny-difference)
[](https://bundlephobia.com/result?p=@ngard/tiny-difference)
[](https://travis-ci.org/NickGard/tiny-difference)
[](https://badgen.net/badge/license/MIT/blue)
A minimal-weight utility identical to `lodash.difference`. For when every byte counts!
<hr/>
lodash.difference [](https://bundlephobia.com/result?p=lodash.difference)
<br/>
tiny-difference [](https://bundlephobia.com/result?p=@ngard/tiny-difference)
<hr/>
## Syntax
```js
difference(/* array, [...arrays] */)
```
## Parameters
`array` - The array to inspect. The resulting array's elements appear in the same order as they do in this array
<br/>
`arrays` - Any number of other arrays, whose elements will not appear in the returned array
## Return
A new array containing only the elements found in the first array and **not** in the other arrays, in the order as they appear in the first array.
## Example
```javascript
import { difference } from '@ngard/tiny-difference';
const diff1 = difference([1,2,3,4], [1,2]); // returns [3,4]
const diff2 = difference([1,2,3,4], [1,2], [2,3,5]); // returns [4]
```
---
title: "[ALGORITHM] LeetCode 344. Reverse String"
layout: single
date: '8/12/2021'
toc: true
toc_sticky: true
toc_label: Table of Contents
categories:
- LEETCODE
tags:
- ALGORITHM
- LEETCODE
---
---
### Algorithm Exercise - LeetCode
* Coding test practice through solving algorithm problems
---
### Problem
* [🔗 Problem link](https://leetcode.com/problems/reverse-string/)
### Code
```python
# My solution
class Solution:
def reverseString(self, s: List[str]) -> None:
s.reverse()
# A solution using the two-pointer technique
class Solution:
def reverseString(self, s: List[str]) -> None:
        left, right = 0, len(s) - 1
        while left < right:
            s[left], s[right] = s[right], s[left]
            left += 1
            right -= 1
```
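A quick sanity check of the two-pointer version outside the LeetCode harness (using a plain list in place of `List[str]`):

```python
def reverse_in_place(s):
    """Two-pointer in-place reversal, mirroring the solution above."""
    left, right = 0, len(s) - 1
    while left < right:
        s[left], s[right] = s[right], s[left]
        left += 1
        right -= 1

chars = ["h", "e", "l", "l", "o"]
reverse_in_place(chars)
print(chars)  # ['o', 'l', 'l', 'e', 'h']
```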
---
---
layout: post
title: Ongoing work for all tools
excerpt: "as listed on the Development Roadmap"
modified:
categories: users
tags: [models, enhancements]
image:
feature: so-simple-sample-image-4-narrow.jpg
credit: #
creditlink: #
comments: false
share: false
---
The ongoing work involves:
* Outstanding work on the migration of RSPM, RPAT, and GreenSTEP to the VisionEval framework is outlined in the GitHub wiki’s <a href="https://github.com/VisionEval/VisionEval/wiki/Modules-and-Packages" target="_blank">Modules & Packages</a> and <a href="https://github.com/visioneval/VisionEval/wiki/Development-Roadmap" target="_blank">Roadmap</a> pages. Now that these are in a common framework, future enhancements of any tool should be usable by all other tools. For example, a module which captures multi-modal travel in VE-RSPM can also be used with VE-RPAT, as both models will use similar structures and share code.
* Updated Multi-Modal travel modules were built by PSU for VE-RSPM through a peer-reviewed ODOT Research project (completed in June 2017). See this [introduction to the Multi-Modal module](https://cities-lab.github.io/VETravelDemandMM/Intro.html) and this [ODOT report](https://www.oregon.gov/ODOT/Programs/ResearchDocuments/SPR788RSPMTool.pdf) on the module.
* In a pilot contributor <a href="https://github.com/VisionEval/VisionEval/wiki/Review-Team-Charter" target="_blank">Review Team</a> effort completed in September 2017, a review team of agency partner, academic, and consultants reviewed industry code governance best practices, developed a VisionEval charter and review process, and piloted this process using the Multi-Modal travel modules as a test case code submittal for inclusion in the VisionEval repository.
* A <a href="https://github.com/VisionEval/VisionEval/wiki/Documentation-Plan" target="_blank">Documentation Plan</a> has been developed to describe what is needed to produce comprehensive documentation at multiple levels of detail.
* <a href="https://github.com/VisionEval/VisionEval/wiki/VE-State-Status" target="_blank">VE-State</a>, a state-level version of VE-RSPM to replicate the functionality of GreenSTEP, was completed in 2019.
## Future Enhancements
- VE-RSPM will be extended with additional modules via the Pooled Fund. These future enhancements are planned to be carried out in a way which can be used for RPAT and GreenSTEP as well as RSPM.
- A wish list of desired upgrades, reflecting agency partner priorities per an August 2016 Peer Exchange in Portland, is available at the GitHub <a href="https://github.com/visioneval/VisionEval/wiki/Development-Roadmap" target="_blank">Roadmap</a> page.
In addition to new tool functionality, possible future enhancements might also include scenario planning support, such as case studies, reasonable ranges of various policy inputs, aids to facilitate transit and land use inputs, equity analyses, and peer review of VisionEval tools.
_This is an assignment to the [Software Architecture](https://ohm-softa.github.io) class at the [Technische Hochschule Nürnberg](http://www.th-nuernberg.de)._
# Assignment 4: Generics

In this assignment we want to improve the previously implemented `SimpleListImpl` of [Assignment 2](https://github.com/hsro-inf-prg3/02-classes-interfaces).
Back then, `SimpleListImpl` was implemented to store references of type `Object`, which in turn required a type cast on retrieval:
```java
SimpleList sl = new SimpleListImpl();
sl.add(new MyClass());
MyClass k = (MyClass) sl.get(0);
```
Inside `SimpleListImpl`, the knowledge about the actual class was lost, and worse: the following code would compile but produce a runtime exception:
```java
SimpleList sl = new SimpleListImpl();
sl.add(new MyClass());
sl.add(new MyOtherClass());
MyClass k1 = (MyClass) sl.get(0); // all ok
MyClass k2 = (MyOtherClass) sl.get(1); // ClassCastException!
```
Generics help us to avoid both the type cast and the risk of runtime exceptions by checking the type at compile time.
For this assignment, start with the reference solution of assignment 2 and the `abstract` model class `Plant`.
## Setup
1. Create a fork of this repository (button in the upper right corner)
2. Clone the project (get the link by clicking the green _Clone or download button_)
3. Import the project to your IDE (remember the guide in assignment 1)
4. Validate your environment by running the tests from your IntelliJ and by running `gradle test` on the command line.
## Generic Lists

To make a class generic, introduce a generic type (typically named `T`) in the class or interface signature, and replace all affected actual types with the generic type.
1. Make the following interfaces and classes generic
* `SimpleList`
* `SimpleFilter`
* `SimpleListImpl`
* `SimpleIteratorImpl`
* `Element`
2. Adopt the changes in the test class `SimpleListTests.java`
3. Remove the now unnecessary type casts
4. Add a new method `addDefault` to the `SimpleList` interface; the purpose is to add a default instance (using the default constructor) to the list<br>
_Hint:_ this method aims at the instantiation problem of generics.
## Generic Methods

In the second part we want to focus on generic and `default` methods.
For this purpose we'll add an additional method `map(...)` and move the method `filter(...)` to the interface `SimpleList`.
1. Implement the `filter(...)` method as a `default` method in the `SimpleList` interface <br>(remember to run the tests when you have completed the refactoring, to ensure that the result is still the same)
2. Add the `map(...)` method to the `SimpleList` interface according to the given UML (`default` method)<br>The `map(...)` method transforms every element of your list with the given `Function<T,R>` to another element of type `R` and collects all elements in a new `SimpleList`.
3. _Optionally:_ Implement the `sort(...)` method as a `static` utility method in the `abstract` class `CollectionsUtility`.
<br> _You may choose any sort algorithm: Bubblesort, Mergesort,...depending on your choice you may need to add some methods to `SimpleList` and `SimpleListImpl`_
<br>(can you imagine why this class should be `abstract` and optimally has a `private` constructor?)
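Before diving into the Java implementation, the intended semantics of the generic `filter(...)` and `map(...)` methods can be illustrated language-neutrally. The following Python sketch is only an analogy (the assignment itself targets Java, and the class below is hypothetical), but it mirrors the generic signatures from the UML:

```python
from typing import Callable, Generic, List, TypeVar

T = TypeVar("T")
R = TypeVar("R")

class SimpleList(Generic[T]):
    """Conceptual stand-in for the assignment's SimpleList (illustration only)."""

    def __init__(self) -> None:
        self._items: List[T] = []

    def add(self, item: T) -> None:
        self._items.append(item)

    def to_list(self) -> List[T]:
        return list(self._items)

    def filter(self, pred: Callable[[T], bool]) -> "SimpleList[T]":
        # Keep only the elements for which the predicate holds.
        out: "SimpleList[T]" = SimpleList()
        for item in self._items:
            if pred(item):
                out.add(item)
        return out

    def map(self, fn: Callable[[T], R]) -> "SimpleList[R]":
        # Transform every element of type T into an element of type R.
        out: "SimpleList[R]" = SimpleList()
        for item in self._items:
            out.add(fn(item))
        return out

numbers: "SimpleList[int]" = SimpleList()
for n in (1, 2, 3, 4):
    numbers.add(n)
squares = numbers.filter(lambda n: n % 2 == 0).map(lambda n: n * n)
print(squares.to_list())  # [4, 16]
```

Note how `map` produces a list of a *different* element type `R`, which is exactly why the Java version needs `Function<T,R>` in its signature.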
**Remember, an untested implementation is worthless! Expand the given test suite to ensure that your algorithms are correct.**
# dCorMV
Calculates multivariate distance correlation (reference: Yoo et al., "Multivariate approaches improve the reliability and validity of functional connectivity and prediction of individual behaviors," NeuroImage, 2019).
The input X is a 1D cell array of length m. Each cell contains a 2D (n by p) time-series matrix with n time points and p voxels. Here, p may differ across the m cells, but n must be the same.
The main output is an m by m distance correlation matrix.
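For readers who want the estimator spelled out, the computation behind distance correlation (pairwise distance matrices, double-centering, then a normalized inner product) can be sketched in plain Python. This is an illustrative re-implementation of the standard biased dCor estimator, not the code from this repository:

```python
import math

def _dist_matrix(X):
    # Pairwise Euclidean distances between the n rows of X (an n x p list of lists).
    n = len(X)
    return [[math.dist(X[i], X[j]) for j in range(n)] for i in range(n)]

def _double_center(D):
    # Subtract row means and column means, then add back the grand mean.
    n = len(D)
    row = [sum(r) / n for r in D]
    col = [sum(D[i][j] for i in range(n)) / n for j in range(n)]
    grand = sum(row) / n
    return [[D[i][j] - row[i] - col[j] + grand for j in range(n)] for i in range(n)]

def distance_correlation(X, Y):
    """Biased-sample distance correlation between two n x p data matrices."""
    A = _double_center(_dist_matrix(X))
    B = _double_center(_dist_matrix(Y))
    n = len(A)
    dcov2 = sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n ** 2
    dvar_x = sum(a * a for r in A for a in r) / n ** 2
    dvar_y = sum(b * b for r in B for b in r) / n ** 2
    if dvar_x * dvar_y == 0:
        return 0.0
    return math.sqrt(dcov2 / math.sqrt(dvar_x * dvar_y))

# A perfectly linear relationship yields a distance correlation of 1.
X = [[float(i)] for i in range(1, 6)]
Y = [[2.0 * i] for i in range(1, 6)]
print(round(distance_correlation(X, Y), 6))  # 1.0
```

In the multivariate setting of this repository, each cell of the X cell array would play the role of one n x p matrix above.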
# yaliver
Yet Another LIVR Validator
Not Released.
fBomb
=====
See where in the world the fBomb was dropped
### Installation
---
Setting up your own instance of this application is fairly simple.
All you have to do is clone this repo:
```bash
$ git clone https://github.com/mgingras/fBomb.git && cd fBomb && npm install
```
### Configuration
---
For this application you need to get API keys for Twitter [https://dev.twitter.com](https://dev.twitter.com) and Google Maps. These are then inserted into config.json. You can also specify the name you want your application to have in this file.
Below is the current config.json file. Replace the values surrounded by brackets ('[') with your API keys and configuration.
```json
{
"consumer_key":"[CONSUMER_KEY]",
"consumer_secret":"[CONSUMER_SECRET]",
"oauth_token":"[OATH_TOKEN]",
"oauth_token_secret":"[OAUTH_TOKEN_SECRET]",
"gmaps":"[GMAPS_API_KEY]",
"app_name":"[APP_NAME]",
"track": "[WORDS_TO_TRACK]"
}
```
You can then run the application with the following:
```
$ coffee coffeeApp.coffee
```
If you have configured it correctly, you should be able to browse to localhost:3000 and see some bombs drop!
### Customization
---
#### Tracking
To change what is being tracked by the application, replace "[WORDS_TO_TRACK]" in config.json with a comma-separated list of words to track (e.g. "fuck,fucks,fucking").
#### Images
Markers are customizable by replacing 'fbomb.gif' and 'signPost.png' located in './public/img/'
'fbomb.gif' is the initial indicator.
'signpost.png' is the marker that drops after the gif animation and stays on the map.
### Deployment
---
If you want to deploy this app, I suggest Heroku, they have lots of docs to help you out:
node.js: https://devcenter.heroku.com/articles/getting-started-with-nodejs
websockets: https://devcenter.heroku.com/articles/node-websockets
### Contact
---
Let me know if you have any questions!
Martin
<martin@mgingras.com>
---
title: ISSAbort (OLE DB) | Microsoft Docs
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: database-engine
ms.topic: reference
topic_type:
- apiref
helpviewer_keywords:
- ISSAbort interface
ms.assetid: 7c4df482-4a83-4da0-802b-3637b507693a
author: mashamsft
ms.author: mathoma
manager: craigg
ms.openlocfilehash: 801eb84df08837ec8e49b6bb0e28fc1f1115e674
ms.sourcegitcommit: b87d36c46b39af8b929ad94ec707dee8800950f5
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 02/08/2020
ms.locfileid: "62781037"
---
# <a name="issabort-ole-db"></a>ISSAbort (OLE DB)
The **ISSAbort** interface, exposed in the [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client OLE DB provider, provides the [ISSAbort::Abort](../../relational-databases/native-client-ole-db-interfaces/issabort-abort-ole-db.md) method, which is used to cancel the current rowset as well as any command executed in the same batch as the command that originally produced the rowset and has not yet completed execution.
**ISSAbort** is a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] Native Client provider-specific interface, available by calling **QueryInterface** on the **IMultipleResults** object returned by **ICommand::Execute** or **IOpenRowset::OpenRowset**.
## <a name="in-this-section"></a>In this section
|Method|Description|
|------------|-----------------|
|[ISSAbort::Abort (OLE DB)](../../relational-databases/native-client-ole-db-interfaces/issabort-abort-ole-db.md)|Cancels the current rowset as well as any batched command associated with the current command.|
## <a name="see-also"></a>See also
[Interfaces (OLE DB)](../../../2014/database-engine/dev-guide/interfaces-ole-db.md)
---
layout: post
title: "Exploring Some Simple Questions"
categories:
- Category
---
The shortest-path problem may well be the first algorithm most people learn, yet even this simple problem hides questions that can trip you up if you do not think them through.
---
Question 1:
Does Dijkstra need a vis array?
Answer: Under the strict definition of Dijkstra, every vertex enters the queue exactly once, which is precisely how the $$O(n\log n)$$ or $$O(n\log m)$$ complexity is proven. But this definition gives Dijkstra inherent flaws, so the shortest-path code I write nowadays is never strict Dijkstra.
---
Question 2:
What are the flaws of strictly defined Dijkstra?
Answer: First, strict Dijkstra cannot handle graphs with negative edge weights. Recall how Dijkstra is proven: the argument is roughly that relaxing from any vertex other than the current closest one cannot improve the result. A graph with negative edges violates this condition, so strict Dijkstra is simply wrong there. As a concrete case, consider the following directed graph (in all graphs below, the source is vertex 1 and the first line gives the number of vertices and edges).
```
4 5
1 2 1
1 3 0
1 4 99
2 3 1
4 2 -300
```
Second, it cannot compute longest paths, for a similar reason: Dijkstra's assumption fails, namely the assumption that updating from the currently farthest vertex never makes the result worse is wrong. See the following directed graph.
```
4 5
1 2 1
1 3 2
1 4 3
2 3 2
3 4 3
```
Reflecting on both failures, they share a single cause: the assumption in the proof is violated. That broken assumption lets a vertex update others before its own value is optimal, and because each vertex leaves the queue only once, it then misses its best chance to update the others.
---
Question 3:
How can these flaws of Dijkstra be fixed?
Answer: The analysis above shows the crux is that each vertex leaves the queue only once, so the fix is simply to allow every vertex to be dequeued multiple times. Does that ring a bell? Isn't this just SPFA? Indeed, written this way it looks very much like, or simply is, SPFA with a heap.
Judging from the code of many competitive programmers abroad, this appears to be how Dijkstra is commonly written these days.
---
Question 4: Is the multi-dequeue Dijkstra obtained by merely removing the vis array?
Answer: No. If you only remove the vis array, the directed graph printed by the following code drives the complexity up to $$O(m^2)$$.
```c++
#include <bits/stdc++.h>
using namespace std;
int main() {
    // Worst case for "Dijkstra without vis": vertex 100001 gets relaxed
    // about 1e5 times, and each relaxation rescans its ~1e5 outgoing edges.
    int n = 200003, m = 300000;
printf("%d %d\n", n, m);
for (int i = 2; i <= 100000; i++)
printf("%d %d %d\n", 1, i, i);
for (int i = 2; i <= 100000; i++)
printf("%d %d %d\n", i, 100001, (int)1e9 - 2 * i);
for (int i = 0; i < 100002; i++)
printf("%d %d %d\n", 100001, 100002 + i, (int)1e9);
return 0;
}
```
Look closely at this graph: the root of the problem is that the queue can hold several different states of the same vertex. Under the only-dequeue-once rule, those extra entries never got to update other vertices; once vis is removed, they trigger many rounds of useless updates.
There are two fixes:
- when pushing a new state for a vertex, delete its old state from the queue;
- only relax from a dequeued state whose recorded distance still equals dis[v].
With a priority queue only the second option is available; with a set/multiset either works. In my own tests the priority-queue version tends to be slightly faster, though that likely depends on the implementation.
---
Question 5: If multiple dequeues are allowed, what is the complexity?
Don't ask; if you must know, it is Bellman-Ford's complexity multiplied by the cost of a heap operation.
Still, if a negative-weight graph or a longest-path instance really defeats this approach, the only remaining option is SPFA with assorted heuristic optimizations.
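As a sketch of the second fix (relax only when the popped distance still equals dis[v]), here is a minimal Python version. Run on the negative-edge counterexample from question 2, it produces the correct answer precisely because repeated pops turn it into the heap-based SPFA discussed above:

```python
import heapq

def dijkstra(n, adj, src):
    """Lazy-deletion Dijkstra: a vertex may be pushed many times,
    but a popped entry is processed only if it is still the best
    known distance for that vertex."""
    dist = [float("inf")] * n
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, v = heapq.heappop(heap)
        if d != dist[v]:  # stale entry: skip instead of re-relaxing
            continue
        for to, w in adj[v]:
            if d + w < dist[to]:
                dist[to] = d + w
                heapq.heappush(heap, (dist[to], to))
    return dist

# The negative-edge counterexample from question 2 (1-indexed -> 0-indexed):
adj = [[] for _ in range(4)]
for u, v, w in [(1, 2, 1), (1, 3, 0), (1, 4, 99), (2, 3, 1), (4, 2, -300)]:
    adj[u - 1].append((v - 1, w))
print(dijkstra(4, adj, 0))  # [0, -201, -200, 99]
```

With nonnegative weights the `d != dist[v]` check fires at most once per stale push, so the usual heap-Dijkstra bound still holds; with negative edges it degrades toward Bellman-Ford, exactly as question 5 states.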
# es-abstract <sup>[![Version Badge][npm-version-svg]][package-url]</sup>
[![dependency status][deps-svg]][deps-url]
[![dev dependency status][dev-deps-svg]][dev-deps-url]
[![License][license-image]][license-url]
[![Downloads][downloads-image]][downloads-url]
[![npm badge][npm-badge-png]][package-url]
ECMAScript spec abstract operations.
Every operation is available by edition/year and by name - for example, `es-abstract/2020/Call` gives you the `Call` operation from ES2020, `es-abstract/5/Type` gives you the `Type` operation from ES5.
All abstract operations are also available under an `es5`/`es2015`/`es2016`/`es2017`/`es2018`/`es2019`/`es2020`/`es2021` entry point, and as a property on the `main` export, but using deep imports is highly encouraged for bundle size and performance reasons. Non-deep entry points will be removed in the next semver-major release.
## Example
```js
var ES = require('es-abstract');
var assert = require('assert');
assert(ES.isCallable(function () {}));
assert(!ES.isCallable(/a/g));
```
## Tests
Simply clone the repo, `npm install`, and run `npm test`
## Security
Please email [@ljharb](https://github.com/ljharb) or see https://tidelift.com/security if you have a potential security vulnerability to report.
[package-url]: https://npmjs.org/package/es-abstract
[npm-version-svg]: https://versionbadg.es/ljharb/es-abstract.svg
[deps-svg]: https://david-dm.org/ljharb/es-abstract.svg
[deps-url]: https://david-dm.org/ljharb/es-abstract
[dev-deps-svg]: https://david-dm.org/ljharb/es-abstract/dev-status.svg
[dev-deps-url]: https://david-dm.org/ljharb/es-abstract#info=devDependencies
[npm-badge-png]: https://nodei.co/npm/es-abstract.png?downloads=true&stars=true
[license-image]: https://img.shields.io/npm/l/es-abstract.svg
[license-url]: LICENSE
[downloads-image]: https://img.shields.io/npm/dm/es-abstract.svg
[downloads-url]: https://npm-stat.com/charts.html?package=es-abstract
---
layout: archive
lang: kr
ref: dream2-4
read_time: true
share: true
author_profile: false
permalink: /docs/kr/edu/dream/dream2-4/
sidebar:
  title: DREAM II Level 4
nav: "dream2-4"
---
# [Overview](#overview)
**ROBOTIS DREAM II Level 4** covers the principles of controlling a robot through its controller, from driving servo motors to operating touch sensors, infrared sensors, and LED modules. At this level, learners pick up the flowchart concept behind robot programs and basic programming knowledge while working through practical, fundamental robot building and programming.
Learners can try out programs that drive the assembled robots with sensors or turn them into playable games.
Combined with the Level 1, 2, and 3 kits, ROBOTIS DREAM II Level 4 provides a total of 12 workbooks and matching example robots covering control programs, sensor usage, underlying principles, and problem solving. Building the robots in order over 12 sessions teaches the principles of robot construction.
The Level 4 education kit requires parts from the Level 1, 2, and 3 kits to build its robots, so be sure to purchase the [Level 1](/docs/kr/edu/dream2/dream2-1/), [Level 2](/docs/kr/edu/dream2/dream2-2/), and [Level 3](/docs/kr/edu/dream2/dream2-3/) education kits first.
In ROBOTIS DREAM II Level 4, the user must program the controller (CM-150) directly or download programs to it.
Programs can be downloaded using the USB cable included with ROBOTIS DREAM II Level 1.
To build more varied robots, or to control a robot over Bluetooth, purchase ROBOTIS DREAM II Level 5.
# [Parts List](#parts-list)

- [Servo Motor](/docs/kr/parts/motor/servo_motor/)
- [IR Sensor](/docs/kr/parts/sensor/irss-10/)
- [LED Module](/docs/kr/parts/display/lm-10/)
- [Touch Sensor](/docs/kr/parts/sensor/ts-10/)
# [Workbook Examples](#workbook-examples)
{% capture dream_02 %}
For assembly instructions and runtime behavior of each example, refer to the ROBOTIS DREAM II Level 4 workbook.
The example programs must be downloaded by the user.
To write a program yourself or download one, see [How to download the example task code](/docs/kr/faq/download_task_code/).
The USB cable required for downloading is included in the Level 1 kit.
{% endcapture %}
<div class="notice">{{ dream_02| markdownify }}</div>
|Example|Robot|Task Code|
| :---: | :---: | :---: |
| 1. Music Box| | [Download][DREAM2_L4_Orgel_KR.tskx] |
| 2. Robot Vacuum| | [Download][DREAM2_L4_CleanupRobot_KR.tskx]<br/> [Download][DREAM2_L4_CleanupRobot_KR(RC).tskx] |
| 3. Truck| | [Download][DREAM2_L4_Truck_KR.tskx]<br/> [Download][DREAM2_L4_Truck_KR(RC).tskx] |
| 4. Bumper Car| | [Download][DREAM2_L4_BumperCar_KR.tskx]<br/> [Download][DREAM2_L4_BumperCar_KR(RC).tskx] |
| 5. Probing Car| | [Download][DREAM2_L4_ProbingCar_KR.tskx]<br/> [Download][DREAM2_L4_ProbingCar_KR(RC).tskx] |
| 6. Electric Guitar| | [Download][DREAM2_L4_Guitar_KR.tskx] |
| 7. Flag Game| | [Download][DREAM2_L4_FlagGame_KR.tskx] |
| 8. Crane| | [Download][DREAM2_L4_Crane_KR.tskx]<br/> [Download][DREAM2_L4_Crane_KR(RC).tskx] |
| 9. Spinning Basket| | [Download][DREAM2_L4_DrunkenBasket_KR.tskx] |
| 10. Viking| | [Download][DREAM2_L4_Viking_KR.tskx] |
| 11. Whack-a-Mole| | [Download][DREAM2_L4_MoleHitting_KR.tskx] |
| 12. Goblin| | [Download][DREAM2_L4_BabyGoblin_KR.tskx] |
[1단계]: /docs/kr/edu/dream2/dream2-1/
[2단계]: /docs/kr/edu/dream2/dream2-2/
[3단계]: /docs/kr/edu/dream2/dream2-3/
[서보 모터]: /docs/kr/parts/motor/servo_motor/
[적외선 센서]: /docs/kr/parts/sensor/irss-10/
[LED 모듈]: /docs/kr/parts/display/lm-10/
[접촉 센서]: /docs/kr/parts/sensor/ts-10/
[예제 태스크 코드의 다운로드 방법]: /docs/kr/faq/download_task_code/
[DREAM2_L4_Orgel_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_orgel_kr.tskx
[DREAM2_L4_CleanupRobot_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_cleanuprobot_kr.tskx
[DREAM2_L4_CleanupRobot_KR(RC).tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_cleanuprobot_kr(rc).tskx
[DREAM2_L4_Truck_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_truck_kr.tskx
[DREAM2_L4_Truck_KR(RC).tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_truck_kr(rc).tskx
[DREAM2_L4_BumperCar_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_bumpercar_kr.tskx
[DREAM2_L4_BumperCar_KR(RC).tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_bumpercar_kr(rc).tskx
[DREAM2_L4_ProbingCar_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_probingcar_kr.tskx
[DREAM2_L4_ProbingCar_KR(RC).tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_probingcar_kr(rc).tskx
[DREAM2_L4_Guitar_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_guitar_kr.tskx
[DREAM2_L4_FlagGame_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_flaggame_kr.tskx
[DREAM2_L4_Crane_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_crane_kr.tskx
[DREAM2_L4_Crane_KR(RC).tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_crane_kr(rc).tskx
[DREAM2_L4_DrunkenBasket_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_drunkenbasket_kr.tskx
[DREAM2_L4_Viking_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_viking_kr.tskx
[DREAM2_L4_MoleHitting_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_molehitting_kr.tskx
[DREAM2_L4_BabyGoblin_KR.tskx]: http://support.robotis.com/ko/baggage_files/dream2/dream2_l4_babygoblin_kr.tskx
---
title: Azure Security Center's advanced data security for SQL machines (Preview)
description: Learn how to enable advanced data security for SQL machines in Azure Security Center
services: security-center
documentationcenter: na
author: memildin
manager: rkarlin
ms.assetid: ba46c460-6ba7-48b2-a6a7-ec802dd4eec2
ms.service: security-center
ms.devlang: na
ms.topic: conceptual
ms.tgt_pltfrm: na
ms.workload: na
ms.date: 06/28/2020
ms.author: memildin
---
# Advanced data security for SQL machines (Preview)
Azure Security Center's advanced data security for SQL machines protects SQL Servers hosted in Azure, in other cloud environments, and even on on-premises machines. This extends the protections for your Azure-native SQL Servers to fully support hybrid environments.
This preview feature includes functionality for identifying and mitigating potential database vulnerabilities and detecting anomalous activities that could indicate threats to your database:
* **Vulnerability assessment** - The scanning service to discover, track, and help you remediate potential database vulnerabilities. Assessment scans provide an overview of your SQL machines' security state, and details of any security findings.
* [Advanced Threat Protection](https://docs.microsoft.com/azure/sql-database/sql-database-threat-detection-overview) - The detection service that continuously monitors your SQL servers for threats such as SQL injection, brute-force attacks, and privilege abuse. This service provides action-oriented security alerts in Azure Security Center with details of the suspicious activity, guidance on how to mitigate to the threats, and options for continuing your investigations with Azure Sentinel.
>[!TIP]
> Advanced data security for SQL machines is an extension of Azure Security Center's [advanced data security package](https://docs.microsoft.com/azure/sql-database/sql-database-advanced-data-security), available for Azure SQL Database, Azure Synapse, and SQL Managed Instance.
## Set up advanced data security for SQL machines
Setting up Azure Security Center's advanced data security for SQL machines involves two steps:
* Provision the Log Analytics agent on your SQL server's host. This provides the connection to Azure.
* Enable the optional bundle in Security Center's pricing and settings page.
Both of these are described below.
### Step 1. Provision the Log Analytics agent on your SQL server's host:
- **SQL Server on Azure VM** - If your SQL machine is hosted on an Azure VM, you can [auto provision the Log Analytics agent](security-center-enable-data-collection.md#workspace-configuration). Alternatively, you can follow the manual procedure for [adding an Azure VM](quick-onboard-azure-stack.md#add-the-virtual-machine-extension-to-your-existing-azure-stack-virtual-machines).
- **SQL Server on Azure Arc** - If your SQL Server is hosted on an [Azure Arc](https://docs.microsoft.com/azure/azure-arc/) machine, you can deploy the Log Analytics agent using the Security Center recommendation “Log Analytics agent should be installed on your Windows-based Azure Arc machines (Preview)”. Alternatively, you can follow the manual procedure in the [Azure Arc documentation](https://docs.microsoft.com/azure/azure-arc/servers/manage-vm-extensions#enable-extensions-from-the-portal).
- **SQL Server on-prem** - If your SQL Server is hosted on an on-premises Windows machine without Azure Arc, you have two options for connecting it to Azure:
- **Deploy Azure Arc** - You can connect any Windows machine to Security Center. However, Azure Arc provides deeper integration across *all* of your Azure environment. If you set up Azure Arc, you'll see the **SQL Server – Azure Arc** page in the portal and your security alerts will appear on a dedicated **Security** tab on that page. So the first and recommended option is to [set up Azure Arc on the host](https://docs.microsoft.com/azure/azure-arc/servers/onboard-portal#install-and-validate-the-agent-on-windows) and follow the instructions for **SQL Server on Azure Arc**, above.
- **Connect the Windows machine without Azure Arc** - If you choose to connect a SQL Server running on a Windows machine without using Azure Arc, follow the instructions in [Connect Windows machines to Azure Monitor](https://docs.microsoft.com/azure/azure-monitor/platform/agent-windows).
### Step 2. Enable the optional bundle in Security Center's pricing and settings page:
1. From Security Center's sidebar, open the **Pricing & settings** page.
- If you're using **Azure Security Center's default workspace** (named “defaultworkspace-[your subscription id]-[region]”), select the relevant **subscription**.
- If you're using **a non-default workspace**, select the relevant **workspace** (enter the workspace's name in the filter if necessary):

1. Toggle the option for **SQL servers on machines (Preview)** to enabled.
[](media/security-center-advanced-iaas-data/sql-servers-on-vms-in-pricing-large.png#lightbox)
Advanced Data Security for SQL servers on machines will be enabled on all SQL servers connected to the selected workspace. The protection will be fully active after the first restart of the SQL Server instance.
>[!TIP]
> To create a new workspace, follow the instructions in [Create a Log Analytics workspace](https://docs.microsoft.com/azure/azure-monitor/learn/quick-create-workspace).
1. Optionally, configure email notification for security alerts.
You can set a list of recipients to receive an email notification when Security Center alerts are generated. The email contains a direct link to the alert in Azure Security Center with all the relevant details. For more information, see [Set up email notifications for security alerts](https://docs.microsoft.com/azure/security-center/security-center-provide-security-contact-details).
## Explore vulnerability assessment reports
The vulnerability assessment service scans your databases once a week. The scans run on the same day of the week on which you enabled the service.
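As a small illustration of that schedule, the next scan date is simply the next calendar date that falls on the same weekday as the enablement date. A pure Python sketch (the helper name is illustrative, not part of any Azure SDK or API):

```python
from datetime import date, timedelta

def next_scan_date(enabled_on: date, today: date) -> date:
    """Next weekly scan: same weekday as the day the service was enabled,
    on or after `today`."""
    days_ahead = (enabled_on.weekday() - today.weekday()) % 7
    return today + timedelta(days=days_ahead)

# Service enabled on a Wednesday; checked on the following Monday.
print(next_scan_date(date(2020, 3, 4), date(2020, 3, 9)))  # → 2020-03-11
```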
The vulnerability assessment dashboard provides an overview of your assessment results across all your databases, along with a summary of healthy and unhealthy databases, and an overall summary of failing checks according to risk distribution.
You can view the vulnerability assessment results directly from Security Center.
1. From Security Center's sidebar, open the **Recommendations** page and select the recommendation **Vulnerabilities on your SQL servers on machines should be remediated (Preview)**. For more information, see [Security Center Recommendations](security-center-recommendations.md).
[](media/security-center-advanced-iaas-data/data-and-storage-sqldb-vulns-on-vm.png#lightbox)
The detailed view for this recommendation appears.
[](media/security-center-advanced-iaas-data/all-servers-view.png#lightbox)
1. For more details, drill down:
* For an overview of scanned resources (databases) and the list of security checks that were tested, select the server of interest.
* For an overview of the vulnerabilities grouped by a specific SQL database, select the database of interest.
In each view, the security checks are sorted by **Severity**. Click a specific security check to see a details pane with a **Description**, how to **Remediate** it, and other related information such as **Impact** or **Benchmark**.
## Advanced threat protection for SQL servers on machines alerts
Alerts are generated by unusual and potentially harmful attempts to access or exploit SQL machines. These events can trigger alerts shown in the [Alerts for SQL Database and Azure Synapse Analytics (formerly SQL Data Warehouse) section of the alerts reference page](alerts-reference.md#alerts-sql-db-and-warehouse).
## Explore and investigate security alerts
Security alerts are available in Security Center's alerts page, the resource's security tab, or through the direct link in the alert emails.
1. To view alerts, select **Security alerts** from Security Center's sidebar and select an alert.
1. Alerts are designed to be self-contained, with detailed remediation steps and investigation information in each one. You can investigate further by using other Azure Security Center and Azure Sentinel capabilities for a broader view:
* Enable SQL Server's auditing feature for further investigations. If you're an Azure Sentinel user, you can upload the SQL auditing logs from the Windows Security Log events to Sentinel and enjoy a rich investigation experience. [Learn more about SQL Server Auditing](https://docs.microsoft.com/sql/relational-databases/security/auditing/create-a-server-audit-and-server-audit-specification?view=sql-server-ver15).
* To improve your security posture, use Security Center's recommendations for the host machine indicated in each alert. This will reduce the risks of future attacks.
[Learn more about managing and responding to alerts](https://docs.microsoft.com/azure/security-center/security-center-managing-and-responding-alerts).
## Next steps
For related material, see the following articles:
- [Security alerts for SQL Database and Azure Synapse Analytics (formerly SQL Data Warehouse)](alerts-reference.md#alerts-sql-db-and-warehouse)
- [Set up email notifications for security alerts](security-center-provide-security-contact-details.md)
- [Learn more about Azure Sentinel](https://docs.microsoft.com/azure/sentinel/)
- [Azure Security Center's advanced data security package](https://docs.microsoft.com/azure/sql-database/sql-database-advanced-data-security) | 76.992481 | 590 | 0.790234 | eng_Latn | 0.971979 |
08f8a95b3666c6a9540a056624194ce72cffd809 | 3,631 | md | Markdown | README.md | Seeed-Studio/Grove_Barometer_Sensor | a4de7d924791c41572a3f7d4ff500943a1e1d62f | [
"MIT"
] | 7 | 2015-12-23T16:00:10.000Z | 2020-02-06T10:56:01.000Z | README.md | Seeed-Studio/Grove_Barometer_Sensor | a4de7d924791c41572a3f7d4ff500943a1e1d62f | [
"MIT"
] | 1 | 2015-05-06T08:54:42.000Z | 2015-05-07T11:54:07.000Z | README.md | Seeed-Studio/Grove_Barometer_Sensor | a4de7d924791c41572a3f7d4ff500943a1e1d62f | [
"MIT"
] | 9 | 2015-03-29T15:00:34.000Z | 2021-05-13T01:33:21.000Z | Grove_Barometer_Sensor [](https://travis-ci.com/Seeed-Studio/Grove_Barometer_Sensor)
--------------------------------
This Arduino library is for the Grove - Barometer Sensor(BMP085) and the Grove - Barometer Sensor(BMP180).
<img src=https://statics3.seeedstudio.com/images/product/groveBarometer%20Sensor.jpg width=300><img src=https://statics3.seeedstudio.com/product/groveBarometer%20Sensor_01.jpg width=300>
[Grove - Barometer Sensor(BMP085)](https://www.seeedstudio.com/Grove-Barometer-Sensor-p-1199.html)
<img src=https://statics3.seeedstudio.com/images/product/Grove%20Barometer%20Sensor%20BMP180.jpg width=300><img src=https://statics3.seeedstudio.com/product/Grove%20Barometer%20Sensor%20BMP180_02.jpg width=300>
[Grove - Barometer Sensor(BMP180)](https://www.seeedstudio.com/s/Grove-Barometer-Sensor-(BMP180)-p-1840.html)
The Grove-Barometer Sensor(BMP085) features a Bosch BMP085 high-accuracy chip to detect barometric pressure and temperature. It can measure pressure over a wide range, from 300 hPa to 1100 hPa (i.e. +9000 m to -500 m relative to sea level), with a super high accuracy of 0.03 hPa (0.25 m) in ultra-high resolution mode.
The Grove-Barometer Sensor(BMP180) uses the Bosch BMP180 high-precision, low-power digital barometer chip. It offers a pressure measuring range of 300 to 1100 hPa with an accuracy down to 0.02 hPa in advanced resolution mode. It’s based on piezo-resistive technology for high accuracy, ruggedness and long-term stability. The chip only accepts 1.8 V to 3.6 V input voltage. However, with an outer circuit added, this module becomes compatible with 3.3 V and 5 V.
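The altitude figures quoted above can be reproduced with the international barometric formula used in the BMP085/BMP180 datasheets. A standalone Python sketch (it does not talk to the sensor; 1013.25 hPa is the standard sea-level pressure):

```python
def pressure_to_altitude(pressure_hpa: float, sea_level_hpa: float = 1013.25) -> float:
    """International barometric formula: altitude in metres from pressure in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

print(round(pressure_to_altitude(1013.25)))  # → 0 (sea level)
print(round(pressure_to_altitude(300)))      # roughly 9200 m, near the top of the stated range
```

Driver libraries typically apply the same formula to the compensated pressure reading from the chip.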
### Usage:
Search for and install <Grove Barometer Sensor BMP085> in the Arduino Library Manager,
or follow the steps below.
1. Clone this repo or download as a zip;
2. Unzip the zip file if you downloaded a zip file;
3. Copy the directory "Grove_Barometer_Sensor" into Arduino's libraries directory;
4. Open Arduino IDE, go to "File"->"Examples"->"Grove_Barometer_Sensor"
This assumes that you have connected the Grove module correctly.
For connection guide and more information, please refer to [wiki page BMP085](http://wiki.seeedstudio.com/Grove-Barometer_Sensor/) and [wiki page BMP180](http://wiki.seeedstudio.com/Grove-Barometer_Sensor-BMP180/).
----
This software is written by L.G. for [Seeed Technology Inc.](http://www.seeed.cc) and is licensed under [The MIT License](http://opensource.org/licenses/mit-license.php). Check License.txt/LICENSE for the details of the MIT license.<br>
Contributing to this software is warmly welcomed. You can contribute by<br>
[forking](https://help.github.com/articles/fork-a-repo) the repo, committing your modifications and then opening [pull requests](https://help.github.com/articles/using-pull-requests) (follow the links above<br>
for an operating guide). Adding a change log and your contact details to the file header is encouraged.<br>
Thanks for your contribution.
Seeed is a hardware innovation platform for makers to grow inspirations into differentiating products. By working closely with technology providers of all scale, Seeed provides accessible technologies with quality, speed and supply chain knowledge. When prototypes are ready to iterate, Seeed helps productize 1 to 1,000 pcs using in-house engineering, supply chain management and agile manufacture forces. Seeed also team up with incubators, Chinese tech ecosystem, investors and distribution channels to portal Maker startups beyond.
[](https://github.com/igrigorik/ga-beacon)
| 69.826923 | 535 | 0.787662 | eng_Latn | 0.881427 |
08f8bd19171e4d0f6eb0d6388c284b97cef8e1ae | 4,451 | md | Markdown | README.md | jackbrucesimpson/BeeUnique | 9fe3c501758cf35cee32ebe429b6da353ccbc455 | [
"MIT"
] | null | null | null | README.md | jackbrucesimpson/BeeUnique | 9fe3c501758cf35cee32ebe429b6da353ccbc455 | [
"MIT"
] | null | null | null | README.md | jackbrucesimpson/BeeUnique | 9fe3c501758cf35cee32ebe429b6da353ccbc455 | [
"MIT"
] | null | null | null | # BeeUnique
Program to track bees with unique tags.
## Running the Programs
Execution scripts are located in the `bin` directory. Instructions for how to modify the variables in them for processing your own data are included below.
You can execute the scripts by running the `bash` command followed by the script name. For example, if you want to run the `track.sh` script described below, you can execute it by running `bash track.sh` in the terminal.
### setup.sh
The purpose of this script is to set some default variables for each experiment. You do not have to run this file, as these variables will be imported into the other scripts automatically.
|Variable|Description|
|:-:|:-|
|OUTPUT_DIRECTORY|Directory path where you want the program output to be written.|
|EXPERIMENT_NAME|Name of your experiment's dataset. Must not have spaces as a directory will be created with this name in the output directory.|
|REDUCE_IMAGES|1 (True) or 0 (False). Only output tag images which are sufficiently different from each other.|
### track.sh
Tracking program that processes videos and creates csvs with tracking data and background images for each video.
|Variable|Description|
|:-:|:-|
|VIDEO_DIRECTORY|Set to the directory video files are stored in.|
|TRAINING|Can set to 1 (True) or 0 (False). If set to training mode, the program will terminate after processing 2 mins of footage from each video. Does not attempt to classify tags - merely sets them to|
|OVERWRITE_RAW| Can set to 1 (True) or 0 (False). Will reprocess previously completed videos whose output is already present in the raw directory.
### process_paths.sh
Program goes over the raw files generated by the tracking program and identifies paths based on the consensus classification of the tag.
|Variable|Description|
| :-: |:-|
|OVERWRITE_PROCESSED|Can set to 1 (True) or 0 (False). Will reprocess previously processed raw data whose output is already present in the processed directory.|
|OUTPUT_UNKNOWN_TAG_PATH_IMAGES|Can set to 1 (True) or 0 (False). Bee paths over a certain length where the consensus classification could not be determined will have their tag images written out as image files.|
After each JSON file has been processed, the bash script executes `create_path_dbs.py`, which creates a key-value database where the key is the `'tag_class'` and the value is a dictionary containing the strings 'day' and 'night', each mapping to a list of paths, where each path for a time period takes the form of a dictionary like this:
```
{'is_gap': False, 'x_path': x_path, 'y_path': y_path, 'num_frames': path_length}
{'is_gap': True, 'prev_next_path_same_loc_disappeared': False, 'num_frames': constants.NUM_FRAMES_IN_VIDEO}
```
Analysis of these paths is then conducted in the Jupyter Notebook (refer to my PhD_Notebooks repository).
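A minimal sketch of reading such a record back (pure Python; the dict layout mirrors the shapes above, while the tag name and the numbers are made up):

```python
# Illustrative per-tag record in the day/night key-value layout described above.
paths_db = {
    'tag_27': {
        'day': [
            {'is_gap': False, 'x_path': [10, 12, 15], 'y_path': [4, 4, 5], 'num_frames': 3},
            {'is_gap': True, 'prev_next_path_same_loc_disappeared': False, 'num_frames': 7},
        ],
        'night': [],
    }
}

def frames_tracked(records):
    """Frames in which the bee was actually tracked (gap records excluded)."""
    return sum(r['num_frames'] for r in records if not r['is_gap'])

day_records = paths_db['tag_27']['day']
print(frames_tracked(day_records))  # → 3
```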
### combine_bg.sh
Combines the background images generated by each hour-long video into an averaged image for each night and day time period. Outputs to `night_day_background` directory.
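The combining step is conceptually a pixel-wise mean over each period's background frames. A pure Python sketch with tiny made-up "images" (the real script operates on OpenCV arrays):

```python
def average_backgrounds(frames):
    """Pixel-wise mean of equally sized 2-D grayscale frames."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]

# Two 1x2 "background frames" from consecutive hour-long videos.
night = average_backgrounds([[[100, 80]], [[120, 100]]])
print(night)  # → [[110.0, 90.0]]
```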
### overlay.sh
Overlays tracking and tag pattern classifications onto a video. You can close the window that appears by pressing 'q'.
|Variable|Description|
| :-: |:-|
|VIDEO_FILE_PATH|Set to the path of the video you want to see.|
|RAW|Can set to 1 (True) or 0 (False). Determines whether you go off the raw classification data or the processed path data where a consensus tag classification has been made.|
|CREATE_VIDEO|1 (True) or 0 (False). Outputs a 4K video of overlaid footage but slows down the program.|
|OUTPUT_VIDEO_FILE|Set to the path and filename of the video you wish to output.|
|ENHANCE_UNTAGGED_BEES|Try to increase the visibility of untagged bees in the video|
### reclassify_raw.sh
Will go over all the raw classification and tracking files that were generated and reclassify all the tags in the files.
### output_tags.sh
Will go over all the csvs that were generated and create the tag images. Images will be grouped by bee path and all the bee paths from the same video will be stored in the same directory.
## Setup
### Dependencies
- Requires Linux/MacOS
- Python 2
- OpenCV 2
### Ubuntu Installation
#### System packages
`sudo apt-get install libopencv-dev python-opencv python-pip exfat-fuse git`
#### Python libraries
`pip2 install numpy matplotlib pandas scipy sklearn cython networkx h5py tensorflow keras`
#### Building
Compile the tracking library in the `bin` directory.
`bash build.sh`
| 52.988095 | 351 | 0.777578 | eng_Latn | 0.998075 |
08f8dd9fb3c048436b3eaefbd05df1fbe495f268 | 439 | md | Markdown | src/pages/gta-5.md | Tengil91/gatsby-portfolio | 98739d13a9da2ed7023edc341a918010433ef650 | [
"MIT"
] | null | null | null | src/pages/gta-5.md | Tengil91/gatsby-portfolio | 98739d13a9da2ed7023edc341a918010433ef650 | [
"MIT"
] | null | null | null | src/pages/gta-5.md | Tengil91/gatsby-portfolio | 98739d13a9da2ed7023edc341a918010433ef650 | [
"MIT"
] | null | null | null | ---
title: "GTA 5"
src: "gta-5-preview.jpg"
tags: "HTML JS CSS Jekyll SASS Anime.js"
date: "2019-09-10"
srcs: "gta-5-1.jpg"
pagelink: "https://grandtheftauto.netlify.com/"
pagetype: "project"
---
A web site made with Jekyll. I was mainly responsible for the front page.
The site contains a lot of little things: videos, images, lightboxes, animations, games and easter eggs. It's one of the most fun pages I've been part of creating.
| 24.388889 | 162 | 0.719818 | eng_Latn | 0.927981 |
08fa2228229c3b9afc2b6a78eb08ce5939779710 | 1,778 | md | Markdown | biztalk/core/single-sign-on-event-10767.md | cmcclister/biztalk-docs | 36a3d4b944e27edff883b8e36e997c7d2af4f497 | [
"CC-BY-4.0",
"MIT"
] | 37 | 2017-08-28T06:57:52.000Z | 2021-07-13T12:16:23.000Z | biztalk/core/single-sign-on-event-10767.md | cmcclister/biztalk-docs | 36a3d4b944e27edff883b8e36e997c7d2af4f497 | [
"CC-BY-4.0",
"MIT"
] | 732 | 2017-05-18T22:16:15.000Z | 2022-03-31T23:10:06.000Z | biztalk/core/single-sign-on-event-10767.md | isabella232/biztalk-docs | 36a3d4b944e27edff883b8e36e997c7d2af4f497 | [
"CC-BY-4.0",
"MIT"
] | 158 | 2017-06-19T22:47:52.000Z | 2022-02-28T06:41:54.000Z | ---
description: "Learn more about: Single Sign-On: Event 10767"
title: "Single Sign-On: Event 10767 | Microsoft Docs"
ms.custom: ""
ms.date: "06/08/2017"
ms.prod: "biztalk-server"
ms.reviewer: ""
ms.suite: ""
ms.tgt_pltfrm: ""
ms.topic: "article"
ms.assetid: ef5efc04-a0b5-4fc7-9d0e-f7aa9a9c413d
caps.latest.revision: 6
author: "MandiOhlinger"
ms.author: "mandia"
manager: "anneta"
---
# Single Sign-On: Event 10767
## Details
| Field | Error Details |
|-----------------|-------------------------------------------------------------------------------------------------------------------------|
| Product Name | Enterprise Single Sign-On |
| Product Version | [!INCLUDE[btsSSOVersion](../includes/btsssoversion-md.md)] |
| Event ID | 10767 |
| Event Source | ENTSSO |
| Component | N/A |
| Symbolic Name | ENTSSO_E_NO_SSO_SERVER |
| Message Text | Could not contact the SSO server ‘%1’. Check that SSO is configured and that the SSO service is running on that server. |
## Explanation
The ENTSSO client is unable to contact the ENTSSO server.
## User Action
Check network connectivity and make sure you spelled the server name correctly.
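A quick way to sanity-check name resolution and TCP reachability from the client machine is a generic socket probe like the one below (illustrative Python, not part of BizTalk tooling; port 135, the RPC endpoint mapper, is shown only as an example):

```python
import socket

def can_resolve_and_connect(host: str, port: int = 135, timeout: float = 3.0) -> bool:
    """DNS resolution plus a TCP connect attempt; False on any failure."""
    try:
        addr = socket.gethostbyname(host)
        with socket.create_connection((addr, port), timeout=timeout):
            return True
    except OSError:
        return False

print(can_resolve_and_connect("example.invalid"))  # → False (.invalid never resolves)
```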
| 49.388889 | 141 | 0.409449 | eng_Latn | 0.610017 |
08fa485a1f6cb77be7a7f1da29b2cf4e55cd5c75 | 487 | md | Markdown | vendor/github.com/mageddo/go-httpmap/README.md | mrubli/dns-proxy-server | 42b4588341fb35a3a175ec26e5810950ab68c4da | [
"Apache-2.0"
] | 597 | 2016-03-22T15:23:11.000Z | 2022-03-29T10:09:59.000Z | vendor/github.com/mageddo/go-httpmap/README.md | mrubli/dns-proxy-server | 42b4588341fb35a3a175ec26e5810950ab68c4da | [
"Apache-2.0"
] | 109 | 2016-03-08T17:13:33.000Z | 2022-02-26T17:00:13.000Z | vendor/github.com/mageddo/go-httpmap/README.md | mrubli/dns-proxy-server | 42b4588341fb35a3a175ec26e5810950ab68c4da | [
"Apache-2.0"
] | 77 | 2016-04-27T15:13:02.000Z | 2022-03-31T23:51:32.000Z | httpmap helps you map your endpoints with low overhead; it's just a wrapper that makes your work easier.
### Examples
```go
Get("/users", func(ctx context.Context, w http.ResponseWriter, r *http.Request){
...
})
Post("/users", func(ctx context.Context, w http.ResponseWriter, r *http.Request){
...
})
Put("/users", func(ctx context.Context, w http.ResponseWriter, r *http.Request){
...
})
```
### Installing new deps
go get -v github.com/golang/dep/cmd/dep && dep ensure -v
| 23.190476 | 106 | 0.685832 | yue_Hant | 0.483985 |
08fa8e1438318beea815d1308c82368b55266491 | 2,218 | md | Markdown | source/site/_posts/2014-8-03-introduction.md | underlost/dataplay | 7daf050585558174f2d7630402ea30cc33e226aa | [
"MIT"
] | null | null | null | source/site/_posts/2014-8-03-introduction.md | underlost/dataplay | 7daf050585558174f2d7630402ea30cc33e226aa | [
"MIT"
] | null | null | null | source/site/_posts/2014-8-03-introduction.md | underlost/dataplay | 7daf050585558174f2d7630402ea30cc33e226aa | [
"MIT"
] | null | null | null | ---
layout: post
title: Introduction
thumb: /img/thumb/introduction.jpg
snippet: 'Welcome to the DataPlay. Start here to learn what this project is about, and what to expect.'
---
Around the time I started working on [Jaded Gamer](http://jadedgamer.com/), I started working on a little script to scrape various websites relating to video game data. Since then, I've slowly expanded on it here and there, collecting more and more data. It's also now fully integrated it into the website, and acts as a web crawler, also indexing various video game news sites. For the longest time though, I really wasn't sure what I really wanted to do with all the information I've been collecting. Then, I came across the [The Feltron 2007 Annual Report](http://feltron.com/ar07_01.html), a yearly book of beautiful data visualizations. This set off a spark to create something similar, but more web based. The [2012 Annual Video Game Report](/2012-report/) is the first result. It's still a little rough around the edges, but it serves as a starting point.
### What's Next
As I continue to collect and improve the data, I'll be releasing more updates. I'd especially like to cover more years, produce more annual reports, and explore more gaming trends. With over 40,000 games and thousands of companies, it's taking a little more time than I anticipated.
I'm also currently looking at options to make all the data publicly available, either in one large data dump, or via a RESTful API.
### Sources
As I mentioned, parts of the scripts are not pretty, but it does the job pretty well. The focus right now is to collect as much information on game titles, publishers, developers, and platforms as I can, as well as sales numbers. I've constructed a number of site scrapers/crawlers that comb over sites like [Giant Bomb](http://giantbomb.com), [VGChartz](http://www.vgchartz.com/), and [MobyGames](http://www.mobygames.com/). These seem to be the most accurate sources I've come across. The crawlers dump everything into a database, and reports any conflicting data. Then, I manually look it over and clean it up the best I can. Coincidentally, this has also helped me find and report a number of inaccuracies on Giant Bomb.
| 110.9 | 862 | 0.775023 | eng_Latn | 0.999032 |
08fa9518a003461c21b461b8bb4db92858cf25c8 | 377 | md | Markdown | CHANGELOG.md | irgaly/flutter_password_credential | bf423c7cdbf3ae18230873d672d148e48e193300 | [
"Apache-2.0"
] | 6 | 2020-07-22T07:33:45.000Z | 2022-02-19T07:52:10.000Z | CHANGELOG.md | irgaly/flutter_password_credential | bf423c7cdbf3ae18230873d672d148e48e193300 | [
"Apache-2.0"
] | 4 | 2020-12-19T13:51:26.000Z | 2021-10-16T09:51:33.000Z | CHANGELOG.md | irgaly/flutter_password_credential | bf423c7cdbf3ae18230873d672d148e48e193300 | [
"Apache-2.0"
] | 6 | 2021-05-11T11:57:43.000Z | 2022-02-19T07:47:53.000Z | ## [0.3.1] - 2021/06/04
* Fix README.md on pub.dev
## [0.3.0] - 2021/06/04
* Fix Android Build Error
* Use Kotlin 1.5.10, Gradle 7.0.0
* Use play-services-auth 19.0.0
* Android Compile SDK version 30
## [0.2.0] - 2021/05/11
* Support Null Safety
## [0.1.0] - 2020/05/30
* Initial release.
* Access Web Credentials.
* Access Android Smartlock for Passwords.
| 17.136364 | 41 | 0.633952 | kor_Hang | 0.359805 |
08fb511a7e1756cda3b5efaa16a9547f50e320bf | 84 | md | Markdown | README.md | Olha-Sal/goit-js-hw-11-color-switch | a83f3ef2936d7e9f58129c095db2dc16d245294b | [
"MIT"
] | null | null | null | README.md | Olha-Sal/goit-js-hw-11-color-switch | a83f3ef2936d7e9f58129c095db2dc16d245294b | [
"MIT"
] | null | null | null | README.md | Olha-Sal/goit-js-hw-11-color-switch | a83f3ef2936d7e9f58129c095db2dc16d245294b | [
"MIT"
] | null | null | null | # goit-js-hw-11-color-switch
https://olha-sal.github.io/goit-js-hw-11-color-switch/
| 28 | 54 | 0.738095 | por_Latn | 0.146678 |
08fb514f44b3f64875e2d93e9bb16956dcca5a75 | 3,561 | md | Markdown | README.md | tanlin2013/TNpy | bc450825f79b6a95ad724ed05c61fda8e0545975 | [
"Apache-2.0"
] | 1 | 2020-09-07T21:35:34.000Z | 2020-09-07T21:35:34.000Z | README.md | tanlin2013/TNpy | bc450825f79b6a95ad724ed05c61fda8e0545975 | [
"Apache-2.0"
] | null | null | null | README.md | tanlin2013/TNpy | bc450825f79b6a95ad724ed05c61fda8e0545975 | [
"Apache-2.0"
] | null | null | null | # tnpy




[Documentation](https://tanlin2013.github.io/tnpy/) |
This project is a python implementation of Tensor Network,
a numerical approach to quantum many-body systems.
**tnpy** is built on top of [google/TensorNetwork](https://github.com/google/TensorNetwork) for tensor contractions,
with optimized support for various backend engines (TensorFlow, JAX, PyTorch, and Numpy).
For eigen-solver we adopt [primme](https://github.com/primme/primme),
an iterative multi-method solver with preconditioning.
Currently, we support Matrix Product State (MPS) algorithms,
with more are coming...
* Finite-sized Density Matrix Renormalization Group (fDMRG)
* Tree tensor Strong Disorder Renormalization Group (tSDRG)
fDMRG is on alpha-release and is much stable.
For others, please expect edge cases.
Requirments
-----------
See `requirements.txt` for more details.
But these two are essential building blocks.
* [google/TensorNetwork](https://github.com/google/TensorNetwork)
* [primme/primme](https://github.com/primme/primme)
Regarding any installation problems with Primme,
please refer to [Primme official](http://www.cs.wm.edu/~andreas/software/).
Also, it's required to have [lapack](http://www.netlib.org/lapack/) and [blas](http://www.netlib.org/blas/)
installed in prior to Primme.
Installation
------------
* using Docker
```
docker run --rm -it tanlin2013/tnpy
```
* using pip:
```
pip install git+https://github.com/tanlin2013/tnpy@main
```
Documentation
-------------
For details about **tnpy**, see the [reference documentation](https://tanlin2013.github.io/tnpy/).
Getting started
---------------
1. Defining the Matrix Product Operator of your model as a Callable function with argument `site` of the type `int`,
e.g. the function `_elem(self, site)` below.
The MPO class then accepts the Callable as an input and constructs a MPO object.
```
import numpy as np
from tnpy.operators import SpinOperators, MPO
from tnpy.finite_dmrg import FiniteDMRG
class XXZ:
def __init__(self, N: int, delta: float) -> None:
self.N = N
self.delta = delta
def _elem(self, site: int) -> np.ndarray:
Sp, Sm, Sz, I2, O2 = SpinOperators()
return np.array(
[[I2, -0.5 * Sp, -0.5 * Sm, -self.delta * Sz, O2],
[O2, O2, O2, O2, Sm],
[O2, O2, O2, O2, Sp],
[O2, O2, O2, O2, Sz],
[O2, O2, O2, O2, I2]]
)
@property
def mpo(self) -> MPO:
return MPO(self.N, self._elem)
```
2. Call the algorithm to optimize the state.
```
N = 100 # length of spin chain
chi = 60 # virtual bond dimension
model = XXZ(N, delta)
fdmrg = FiniteDMRG(
mpo=model.mpo,
chi=chi
)
fdmrg.update(tol=1e-8)
```
3. Compute any physical quantities whatever you want from the obtained state.
The resulting MPS is of the type `tensornetwork.FiniteMPS`,
see [here](https://tensornetwork.readthedocs.io/en/latest/stubs/tensornetwork.FiniteMPS.html#tensornetwork.FiniteMPS) for more details.
```
my_mps = fdmrg.mps
```
| 31.794643 | 138 | 0.659085 | eng_Latn | 0.753461 |
08fb9c5b62a252f46b37e62667aded04bcd262cc | 22 | md | Markdown | README.md | wangweiyan0901/watermelonBaby | 55b41015b5181350b05fb6f64b64a9cc3fa235ec | [
"Apache-2.0"
] | null | null | null | README.md | wangweiyan0901/watermelonBaby | 55b41015b5181350b05fb6f64b64a9cc3fa235ec | [
"Apache-2.0"
] | null | null | null | README.md | wangweiyan0901/watermelonBaby | 55b41015b5181350b05fb6f64b64a9cc3fa235ec | [
"Apache-2.0"
] | null | null | null | # watermelonBaby
西瓜宝宝
| 7.333333 | 16 | 0.818182 | pol_Latn | 0.417434 |
08fc86d2aff4719c7ffb62fca8397fd61e7e1b49 | 254 | md | Markdown | _definitions/bld-jerguer.md | digitallawyer/openlegaldictionary | a318d6c73c3d8e33756d947add397dac7f25cca2 | [
"MIT"
] | 5 | 2018-08-07T21:57:01.000Z | 2022-02-26T13:29:20.000Z | _definitions/bld-jerguer.md | digitallawyer/openlegaldictionary | a318d6c73c3d8e33756d947add397dac7f25cca2 | [
"MIT"
] | 1 | 2018-08-07T22:29:07.000Z | 2018-08-07T22:45:46.000Z | _definitions/bld-jerguer.md | digitallawyer/openlegaldictionary | a318d6c73c3d8e33756d947add397dac7f25cca2 | [
"MIT"
] | 2 | 2020-12-26T17:22:04.000Z | 2021-02-12T21:35:50.000Z | ---
title: Jerguer
letter: J
permalink: "/definitions/bld-jerguer.html"
body: In Engllsh law. An officer of the custom-house who oversees the wait-era. Techn.
Dict
published_at: '2018-07-07'
source: Black's Law Dictionary 2nd Ed (1910)
layout: post
--- | 25.4 | 86 | 0.740157 | eng_Latn | 0.814406 |
08fcc37be6204cd19e899078f87eac414beb09bb | 38 | md | Markdown | _includes/05-emphasis.md | kateemills/markdown-portfolio | 9de6f6a10cfd2add537ab40a184f2936fb9be7ca | [
"MIT"
] | null | null | null | _includes/05-emphasis.md | kateemills/markdown-portfolio | 9de6f6a10cfd2add537ab40a184f2936fb9be7ca | [
"MIT"
] | 5 | 2020-11-23T03:27:18.000Z | 2020-11-23T03:56:27.000Z | _includes/05-emphasis.md | kateemills/markdown-portfolio | 9de6f6a10cfd2add537ab40a184f2936fb9be7ca | [
"MIT"
] | null | null | null | I _love_ the **Pittsburgh Steelers**!
| 19 | 37 | 0.736842 | eng_Latn | 0.728527 |
08fd4fc6132057c134231b9bcdd212c00e62aba5 | 9,958 | md | Markdown | README.md | amarflybot/graphql-compose | 24f79097872552ae30d76873ee2bb17b1475dd92 | [
"MIT"
] | 1 | 2020-01-31T04:21:58.000Z | 2020-01-31T04:21:58.000Z | README.md | toverux/graphql-compose | 8d700a384f9f6665917b7018769eda62e3b029f7 | [
"MIT"
] | 9 | 2021-03-01T21:04:05.000Z | 2022-02-27T00:44:59.000Z | README.md | toverux/graphql-compose | 8d700a384f9f6665917b7018769eda62e3b029f7 | [
"MIT"
] | null | null | null | <p align="center"><img src="https://raw.githubusercontent.com/graphql-compose/graphql-compose/master/docs/logo.png" width="200" /></p>
# graphql-compose
[](https://www.npmjs.com/package/graphql-compose)
[](https://codecov.io/github/graphql-compose/graphql-compose)
[](https://travis-ci.org/graphql-compose/graphql-compose)
[](http://www.npmtrends.com/graphql-compose)
[](http://commitizen.github.io/cz-cli/)


[](#backers)
[](#sponsors)
[GraphQL](http://graphql.org/) is a query language for APIs. [graphql-js](https://github.com/graphql/graphql-js) is the reference implementation of GraphQL for Node.js; it introduces the GraphQL type system for describing a schema _(definition over configuration)_ and executes queries on the server side. [express-graphql](https://github.com/graphql/express-graphql) is an HTTP server which takes request data, passes it to `graphql-js`, and passes the returned result to the response.
**`graphql-compose`** – the _imperative tool_ which works on top of `graphql-js`. It provides methods for creating types and GraphQL Models (that is what I call types with a list of common resolvers) for further building of complex relations in your schema.
* provides methods for editing GraphQL output/input types (add/remove fields/args/interfaces)
* introduces `Resolver`s – named GraphQL fieldConfigs, which can be used for finding, updating, and removing records
* provides an easy way for creating relations between types via `Resolver`s
* provides converter from `OutputType` to `InputType`
* provides `projection` parser from AST
* provides `GraphQL schema language` for defining simple types
* adds additional types `Date`, `Json`
**`graphql-compose-[plugin]`** – _declarative generators/plugins_ built on top of `graphql-compose`, which take ORMs or schema definitions and create GraphQL Models from them, or modify existing GraphQL types.
Type generator plugins:
* [graphql-compose-json](https://github.com/graphql-compose/graphql-compose-json) - generates GraphQL type from JSON (a good helper for wrapping REST APIs)
* [graphql-compose-mongoose](https://github.com/graphql-compose/graphql-compose-mongoose) - generates GraphQL types from mongoose (MongoDB models) with Resolvers.
* [graphql-compose-elasticsearch](https://github.com/graphql-compose/graphql-compose-elasticsearch) - generates GraphQL types from elastic mappings; ElasticSearch REST API proxy via GraphQL.
* [graphql-compose-aws](https://github.com/graphql-compose/graphql-compose-aws) - exposes the AWS Cloud API via GraphQL
Utility plugins:
* [graphql-compose-relay](https://github.com/graphql-compose/graphql-compose-relay) - reassemble GraphQL types with `Relay` specific things, like `Node` type and interface, `globalId`, `clientMutationId`.
* [graphql-compose-connection](https://github.com/graphql-compose/graphql-compose-connection) - generates `connection` Resolver from `findMany` and `count` Resolvers.
* [graphql-compose-dataloader](https://github.com/stoffern/graphql-compose-dataloader) - adds DataLoader to graphql-composer resolvers.
## Documentation
[graphql-compose.github.io](https://graphql-compose.github.io/)
## Live Demos
* [graphql-compose.herokuapp.com](https://graphql-compose.herokuapp.com/) - Live demo of GraphQL Server (9 models, 14 files, ~750 LOC)
* [nodkz.github.io/relay-northwind](https://nodkz.github.io/relay-northwind) - Live demo of Relay client working with the server above (8 crazy pages, 47 files, ~3000 LOC)
## Examples
Please follow [Quick Start Guide](https://graphql-compose.github.io/docs/intro/quick-start.html) for the complete example.
Here is just a demo of the variety of ways types can be defined:
```js
import { schemaComposer } from 'graphql-compose';
// You may use SDL format for type definition
const CityTC = schemaComposer.createObjectTC(`
type City {
code: String!
name: String!
    population: Int
countryCode: String
tz: String
}
`);
// Define type via Config object
const CountryTC = schemaComposer.createObjectTC({
name: 'Country',
fields: {
title: 'String',
geo: `type LonLat { lon: Float, lat: Float }`,
hoisting: {
type: () => AnotherTC,
description: `
You may wrap type in thunk for solving
hoisting problems when two types cross reference
each other.
`,
}
}
});
// Or via declarative methods define some additional fields
CityTC.addFields({
country: CountryTC, // some another Type
ucName: { // standard GraphQL like field definition
type: GraphQLString,
resolve: (source) => source.name.toUpperCase(),
},
currentLocalTime: { // extended GraphQL Compose field definition
type: 'Date',
resolve: (source) => moment().tz(source.tz).format(),
projection: { tz: true }, // load `tz` from database, when requested only `localTime` field
},
counter: 'Int', // shortening for only type definition for field
complex: `type ComplexType {
subField1: String
subField2: Float
subField3: Boolean
subField4: ID
subField5: JSON
subField6: Date
}`,
list0: {
type: '[String]',
description: 'Array of strings',
},
list1: '[String]',
list2: ['String'],
list3: [new GraphQLOutputType(...)],
list4: [`type Complex2Type { f1: Float, f2: Int }`],
});
// Add resolver method
CityTC.addResolver({
kind: 'query',
name: 'findMany',
args: {
filter: `input CityFilterInput {
code: String!
}`,
limit: {
type: 'Int',
defaultValue: 20,
},
skip: 'Int',
// ... other args if needed
},
type: [CityTC], // array of cities
resolve: async ({ args, context }) => {
return context.someCityDB
.findMany(args.filter)
.limit(args.limit)
.skip(args.skip);
},
});
// Remove `tz` field from schema
CityTC.removeField('tz');
// Add description to field
CityTC.extendField('name', {
description: 'City name',
});
schemaComposer.Query.addFields({
cities: CityTC.getResolver('findMany'),
currentTime: {
type: 'Date',
resolve: () => Date.now(),
},
});
schemaComposer.Mutation.addFields({
createCity: CityTC.getResolver('createOne'),
updateCity: CityTC.getResolver('updateById'),
...adminAccess({
removeCity: CityTC.getResolver('removeById'),
}),
});
function adminAccess(resolvers) {
Object.keys(resolvers).forEach(k => {
resolvers[k] = resolvers[k].wrapResolve(next => rp => {
// rp = resolveParams = { source, args, context, info }
if (!rp.context.isAdmin) {
        throw new Error('You must be an admin to perform this action.');
}
return next(rp);
});
});
return resolvers;
}
// construct schema which can be passed to express-graphql, apollo-server or graphql-yoga
export const schema = schemaComposer.buildSchema();
```
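The `adminAccess` helper above is just middleware over plain functions. Here is a dependency-free sketch of the same wrapping idea (the resolver body and the context fields are made up for illustration; this is not the graphql-compose API itself):

```javascript
// Hypothetical base resolver: any (resolveParams) => result function.
const baseResolve = (rp) => `cities for ${rp.context.user}`;

// Middleware in the wrapResolve style: receives `next`, decides whether to call it.
const requireAdmin = (next) => (rp) => {
  if (!rp.context.isAdmin) {
    throw new Error('You must be an admin to perform this action.');
  }
  return next(rp);
};

// Wrapping is plain function composition.
const guarded = requireAdmin(baseResolve);
```

Calling `guarded` with an admin context runs the base resolver; any other context throws before the resolver is reached.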
## Contributors
This project exists thanks to all the people who contribute.
<a href="graphs/contributors"><img src="https://opencollective.com/graphql-compose/contributors.svg?width=890&button=false" /></a>
## Backers
Thank you to all our backers! 🙏 [[Become a backer](https://opencollective.com/graphql-compose#backer)]
<a href="https://opencollective.com/graphql-compose#backers" target="_blank"><img src="https://opencollective.com/graphql-compose/backers.svg?width=890"></a>
## Sponsors
Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [[Become a sponsor](https://opencollective.com/graphql-compose#sponsor)]
<a href="https://opencollective.com/graphql-compose/sponsor/0/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/0/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/1/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/1/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/2/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/2/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/3/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/3/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/4/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/4/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/5/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/5/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/6/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/6/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/7/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/7/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/8/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/8/avatar.svg"></a>
<a href="https://opencollective.com/graphql-compose/sponsor/9/website" target="_blank"><img src="https://opencollective.com/graphql-compose/sponsor/9/avatar.svg"></a>
## License
[MIT](https://github.com/graphql-compose/graphql-compose/blob/master/LICENSE.md)
08fd6c713b443f3b5540f5cf9162c88cf0f7a28f | 16 | md | Markdown | README.md | Patrena94/-food-festival | 26d4d58a755627bfadaecae1d82281f3f2465a34 | [
"MIT"
] | null | null | null | README.md | Patrena94/-food-festival | 26d4d58a755627bfadaecae1d82281f3f2465a34 | [
"MIT"
] | 3 | 2021-09-08T20:15:28.000Z | 2021-09-14T03:01:40.000Z | README.md | Patrena94/-food-festival | 26d4d58a755627bfadaecae1d82281f3f2465a34 | [
"MIT"
] | null | null | null | # -food-festival | 16 | 16 | 0.75 | eng_Latn | 0.520006 |
08fd75aaffae879ca35e9c059ac13986811f52c5 | 4,801 | md | Markdown | content/blog/webpack-config-es6-eslint.md | shuboc/gatsby-blog | 72e5267ec0c9e90c3c054ebda7d1824850b19b56 | [
"RSA-MD"
] | null | null | null | content/blog/webpack-config-es6-eslint.md | shuboc/gatsby-blog | 72e5267ec0c9e90c3c054ebda7d1824850b19b56 | [
"RSA-MD"
] | null | null | null | content/blog/webpack-config-es6-eslint.md | shuboc/gatsby-blog | 72e5267ec0c9e90c3c054ebda7d1824850b19b56 | [
"RSA-MD"
] | null | null | null | ---
title: "Webpack Basic Setup (with ES6 and ESLint support)"
tags: [webpack, react]
last_modified_at: 2020/10/15
date: "2017-02-04"
---
This tutorial walks through a basic webpack setup for a React project, including Babel for ES6 support, ESLint for code-style checking, and the development tool webpack-dev-server.
## Table of Contents
```toc
```
## Project Folder Structure Overview
Assume our project's folder structure looks like this:
~~~
├── package.json
├── src
│ ├── actions/
│ ├── containers/
│ ├── components/
│ ├── reducers/
│ ├── store/
│ ├── client.jsx
│ ├── routes.jsx
└── webpack.client.config.js
~~~
`src/` contains all the client-side code; `src/client.jsx` is our app's entry point, and the webpack configuration lives in `webpack.client.config.js`.
## Basic Webpack Configuration
Install webpack:
`npm install -D webpack`
Edit `webpack.client.config.js` in the project root:
~~~jsx
// webpack.client.config.js
module.exports = {
entry: {
client: './src/client'
},
output: {
path: './bin',
filename: '[name].js'
},
resolve: {
extensions: ["", ".js", '.jsx']
},
devtool: "source-map"
}
~~~
* `entry` is the JS entry point. `client` is the bundle name, and `./src/client` means that module is loaded when the bundle loads; webpack starts compiling from this entry point.
* `output.filename: '[name].js'` makes the compiled bundle take its name from the key in `entry`, i.e. `client.js`.
* `resolve.extensions` lets us omit the file extension when importing `js` and `jsx` files.
* `devtool` configures source maps; pick whichever option you like based on the trade-off between build speed and the detail of the output.
## Babel Configuration
[Babel](https://babeljs.io/) is a tool for transpiling ES6 and React's JSX syntax into ES5.
Install Babel:
~~~
npm install -D babel-core babel-loader
~~~
Install the ES6 and `react` presets:
~~~
npm install -D babel-preset-es2015 babel-preset-react
~~~
Create `.babelrc` in the project root:
~~~jsx
//.babelrc
{
"presets": ["es2015", "react"]
}
~~~
Configure `babel-loader` to load `js` and `jsx` files:
~~~jsx
// webpack.client.config.js
module.exports = {
...
module: {
loaders: [
{
test: /\.jsx?$/,
loader: 'babel',
include: /src/
}
]
},
...
}
~~~
## ESLint Configuration
[ESLint](http://eslint.org/) uses static analysis to catch syntax errors and error-prone patterns in your program, saving you from a lot of silly mistakes during development.
Install ESLint and its loader:
~~~
npm install -D eslint eslint-loader
~~~
Run the initial setup (the interactive Q&A is recommended; the generated settings end up in `.eslintrc.js`):
~~~
./node_modules/.bin/eslint --init
~~~
Add `eslint-loader` to the webpack config (it must be placed last in the loader array so that it runs before Babel and can check the React/ES6 syntax):
~~~jsx
module.exports = {
...
module: {
loaders: [
{
test: /\.jsx?$/,
loaders: ['babel', 'eslint'],
include: /src/
}
]
},
...
}
~~~
Add the React rules (to avoid unused-React errors and to get `jsx` syntax checking):
~~~jsx
// .eslintrc.js
"extends": [
...
"plugin:react/recommended"
]
~~~
### `eslint-plugin-import`
When you misspell something while importing a module, it just silently imports as `undefined`. For example, this is a mistake I once made:
~~~jsx
import {Provider} from 'redux' // Should be from 'react-redux'
~~~
JS gave me no warning or error at all; it just spewed baffling runtime errors and wasted a huge amount of my time. I later found `eslint-plugin-import`, which checks that your imports are correct (although in practice it doesn't seem to catch 100% of the mistakes).
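A dependency-free sketch of why the mistake is silent (the object below is a stand-in for the real `redux` exports; its shape is an illustrative assumption):

```javascript
// `redux` has no `Provider` export (it lives in `react-redux`).
// Destructuring a missing property does not throw; it just yields undefined,
// so the mistake only surfaces later, at render/call time.
const reduxLikeExports = { createStore: () => ({}) }; // hypothetical shape
const { Provider } = reduxLikeExports; // silently undefined
console.log(typeof Provider); // 'undefined'
```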
Add the `eslint-plugin-import` plugin:
~~~
npm install -D eslint-plugin-import eslint-import-resolver-webpack
~~~
Add the following settings:
~~~jsx
// .eslintrc.js
module.exports = {
...
"extends": [
...
"plugin:import/errors",
"plugin:import/warnings"
],
plugins: [
...
"import"
],
"settings": {
"import/resolver": {
"webpack": {
"config": {
"resolve": {
"extensions": ['', '.js', '.jsx']
}
}
}
}
}
}
~~~
## `webpack-dev-server`
[webpack-dev-server](https://webpack.github.io/docs/webpack-dev-server.html) is an express server whose main job is to serve the bundle. It recompiles the bundle when files are modified and refreshes the page automatically. Going a step further, it can also do Hot Module Replacement without reloading the page, which makes development work even easier.
webpack-dev-server keeps the bundle in memory and does not produce any actual files. Static files are served from the current directory by default.
Install:
`npm install -D webpack-dev-server`
First, add the devServer settings in `webpack.client.config.js`:
~~~jsx
module.exports = {
...
output: {
...
publicPath: '/static/' // bundle is served at this path
},
devServer: {
historyApiFallback: true,
inline: true
}
}
~~~
* The `output.publicPath` setting means the bundle can be accessed at the path `<domain>/static/client.js`.
* Configure `devServer`: `inline` enables automatic browser refresh, and `historyApiFallback` lets routing in a single-page application work correctly no matter which path is used to enter the app.
We also need a static `index.html`. Since static files are served from the current directory by default, we put `index.html` in the root directory:
~~~jsx
<!doctype html>
<html>
<head>
<title>Redux Universal Example</title>
</head>
<body>
<div id="app"></div>
<script src="/static/client.js"></script>
</body>
</html>
~~~
Note that the relative path in `<script src="/static/client.js"></script>` matches `publicPath`.
Finally, add an npm script:
~~~jsx
// package.json
"scripts": {
"start": "webpack-dev-server --config webpack.client.config.js"
},
~~~
Run it:
~~~
npm start
~~~
## Conclusion
The project setup is now complete: it supports ES6 and React syntax, includes ESLint checking, and has automatic browser refresh via `webpack-dev-server`.
See [react-universal-example](https://github.com/shuboc/react-universal-example/tree/01-basic-babel-eslint-wds) for a complete example.
## Reference
[Webpack Official](https://webpack.github.io/docs/webpack-dev-server.html)
[Survivejs](http://survivejs.com/webpack/): a very thorough tutorial with many more examples than the official docs
[Webpack your bags](https://blog.madewithlove.be/post/webpack-your-bags/): an article that explains code splitting, chunks, and CSS quite clearly
08fd87811ead34e6c671ce22761203716097b993 | 2,018 | md | Markdown | examples/vue-todos/README.md | Dentacloud-ai/tauri-plugin-sql | 7ee84c5e1cf2ab08b418e0e54ca69a71519316ba | [
"Apache-2.0",
"MIT"
] | 41 | 2021-09-14T13:22:02.000Z | 2022-03-31T03:16:44.000Z | examples/vue-todos/README.md | Dentacloud-ai/tauri-plugin-sql | 7ee84c5e1cf2ab08b418e0e54ca69a71519316ba | [
"Apache-2.0",
"MIT"
] | 20 | 2021-09-29T13:54:54.000Z | 2022-03-22T22:01:54.000Z | examples/vue-todos/README.md | Dentacloud-ai/tauri-plugin-sql | 7ee84c5e1cf2ab08b418e0e54ca69a71519316ba | [
"Apache-2.0",
"MIT"
] | 12 | 2021-09-14T13:44:03.000Z | 2022-03-30T21:00:14.000Z | # VueJS with Vite Example
Another TODO app but with VueJS 3.x and the ViteJS bundler.
## Installation
This was built with the **pnpm** package manager in mind but feel free to use whatever you like:
```bash
# pnpm
pnpm install
# yarn
yarn install
# npm
npm install
```
## Development
To run both the frontend and backend in _development_ mode simply run:
```bash
pnpm run dev
```
This will boot up the ViteJS bundler for the frontend and then open up dev mode for Tauri in the backend. You will see a native window popup with the application that looks something like this:

Features:
1. **Add a new todo** in input and press enter or plus button
2. **Light/Dark mode** toggle as highlighted by #1 above
3. **Completion** check the checkbox to mark as completed
4. **Remove** click the cancel icon on an existing TODO to remove it
In addition, please note the green icon (highlighted as #2), which indicates that the SQLite database has been connected.
The frontend app has been designed such that it will still "work" without this connection, but without the DB connection the state is ephemeral and will be lost the next time you run the app.
> Note: when you package the app as a binary during build, this app currently will not have write permissions (open issue) and therefore you'll have a frontend-only state management solution.
### Frontend Only
If you want, you can startup the frontend only and get full hot-module replacement as well as frontend state management while you work. You can do this with:
```bash
pnpm run dev:frontend
```
This will start the ViteJS bundler running in **dev** mode and provide the UI at `localhost:3000`. The screen will look something like this:

> Note the red icon for DB connection because, as expected, SQLite is not available here but you can add and remove TODO's
## Building
To build the application to a binary you will run:
```bash
pnpm run build
``` | 32.548387 | 193 | 0.756194 | eng_Latn | 0.999402 |
08fe0019095d36af6c1b04905868d5862d00ddab | 125 | md | Markdown | README.md | jmadureira/herd | b7d998a112f93537d32a545c737500094c7cfba2 | [
"MIT"
] | 1 | 2016-06-07T10:25:33.000Z | 2016-06-07T10:25:33.000Z | README.md | jmadureira/herd | b7d998a112f93537d32a545c737500094c7cfba2 | [
"MIT"
] | null | null | null | README.md | jmadureira/herd | b7d998a112f93537d32a545c737500094c7cfba2 | [
"MIT"
] | null | null | null | herd
====
[](https://travis-ci.org/jmadureira/herd)
08fe931b54b6fe01b59d722517d37a0150ad3bc0 | 1,818 | md | Markdown | scripting-docs/javascript/reference/add-method-weakset-javascript.md | doodz/visualstudio-docs.fr-fr | 49c7932ec7a761e4cd7c259a5772e5415253a7a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | scripting-docs/javascript/reference/add-method-weakset-javascript.md | doodz/visualstudio-docs.fr-fr | 49c7932ec7a761e4cd7c259a5772e5415253a7a5 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2018-10-19T08:00:06.000Z | 2018-10-19T08:00:06.000Z | scripting-docs/javascript/reference/add-method-weakset-javascript.md | doodz/visualstudio-docs.fr-fr | 49c7932ec7a761e4cd7c259a5772e5415253a7a5 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: "add Method (WeakSet) (JavaScript) | Microsoft Docs"
ms.custom:
ms.date: 01/18/2017
ms.prod: windows-client-threshold
ms.reviewer:
ms.suite:
ms.technology: devlang-javascript
ms.tgt_pltfrm:
ms.topic: language-reference
dev_langs:
- JavaScript
- TypeScript
- DHTML
ms.assetid: d35d0287-6b33-4720-b9d7-8954c428ce4e
caps.latest.revision: "2"
author: mikejo5000
ms.author: mikejo
manager: ghogen
ms.openlocfilehash: 3ab486beaba4a26c73930b5ceaee927f73aa077a
ms.sourcegitcommit: aadb9588877418b8b55a5612c1d3842d4520ca4c
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 10/27/2017
---
# <a name="add-method-weakset-javascript"></a>add Method (WeakSet) (JavaScript)
Adds a new element to a `WeakSet`.
## <a name="syntax"></a>Syntax
```JavaScript
weaksetObj.add(obj)
```
#### <a name="parameters"></a>Parameters
`weaksetObj`
Required. A `WeakSet` object.
`obj`
Required. The new element of the `WeakSet`.
## <a name="remarks"></a>Remarks
The new element must be an object rather than an arbitrary value, and it must be unique. If you add a non-unique element to a `WeakSet`, the new element will not be added to the collection.
## <a name="example"></a>Example
The following example shows how to add members to a set and then verify that they were added.
```JavaScript
var ws = new WeakSet();
var str = new String("Thomas Jefferson");
var num = new Number(1776);
ws.add(str);
ws.add(num);
console.log(ws.has(str));
console.log(ws.has(num));
ws.delete(str);
console.log(ws.has(str));
// Output:
// true
// true
// false
```
## <a name="requirements"></a>Requirements
[!INCLUDE[jsv12](../../javascript/reference/includes/jsv12-md.md)] | 25.971429 | 202 | 0.692519 | fra_Latn | 0.357432 |
08fe9bae9816a979dcb043eeff1d3f7b9878aa98 | 557 | md | Markdown | contact.md | bhanu77prakash/bhanu77prakash.github.io | 9aa08c7c56dcb58bde705f3a0d9f28197712a14b | [
"MIT"
] | null | null | null | contact.md | bhanu77prakash/bhanu77prakash.github.io | 9aa08c7c56dcb58bde705f3a0d9f28197712a14b | [
"MIT"
] | null | null | null | contact.md | bhanu77prakash/bhanu77prakash.github.io | 9aa08c7c56dcb58bde705f3a0d9f28197712a14b | [
"MIT"
] | null | null | null | ---
layout: default
title: Contact
permalink: /contact/
---
|**Phone**|: +91 932447️⃣57️⃣1|
|**Address**|: Bengaluru, Karnataka, India.|
|**Email**|: bhanu.guda[😊]yahoo[⚫]com|
### Online Presence
|**Facebook**|: @[Bhanu Prakash Reddy Guda](https://www.facebook.com/bhanuprakash.reddy.90/)|
|**Twitter**|: @[Bhanu Prakash Reddy](https://twitter.com/bhanu77prakash)|
|**LinkedIn**|: @[Bhanu Prakash Reddy Guda](https://www.linkedin.com/in/bhanu-prakash-reddy-guda-22425913a/)|
|**Medium**|: @[Bhanu Prakash](https://medium.com/@randomthingsinshort)|
| 34.8125 | 110 | 0.671454 | yue_Hant | 0.217236 |
08ff785c5bf9ca315eda16f9fedbbd064df24592 | 917 | md | Markdown | _pages/about.md | trevino-kim/trevino-kim.github.io | a9dff5769027b2b7d9fd934abec16e2c1afa44d8 | [
"MIT"
] | null | null | null | _pages/about.md | trevino-kim/trevino-kim.github.io | a9dff5769027b2b7d9fd934abec16e2c1afa44d8 | [
"MIT"
] | null | null | null | _pages/about.md | trevino-kim/trevino-kim.github.io | a9dff5769027b2b7d9fd934abec16e2c1afa44d8 | [
"MIT"
] | null | null | null | ---
title: About
permalink: /about/
---
<img src="/assets/images/bio-casual.png" width="500" style="margin-left: 20px;">
```ts
let Jade =
{
이름: "김정현",
"영어 이름": "Jade Kim",
연락처: "jadekim925@gmail.com",
지역: {
고향: "부산",
"마음의 고향": "서울",
거주지: "미국"
},
"사는 곳": Jade.지역.거주지+" 콜로라도주",
"총 경력": [
{
직무: "프론트엔드 개발자, 하이브리드 모바일 앱 개발자"
경력: "3년차 현재진행형"
},
{
직무: "워드프레스 개발자"
경력: "2년"
},
{
직무: "화학공학 플랜트 엔지니어"
경력: "5년"
},
],
학력: "서울산업대학교 화학공학과 졸업 (2006-2010)",
특이사항: [
"현재 미국에 거주하면서 '현대메디'라는 회사의 프론트엔드 개발자로 재택 근무중",
"미국의 TinmanKinetics 라는 회사에서 프론트엔드 개발자로 2년 근무함",
"미국의 Illumulus 라는 회사에서 워드프레스 개발자로 2년 근무함",
"영어 회화 가능",
"화학공학 출신",
"재택근무 레벨 10점 만점에 10점",
` 나고 자란 곳은 ${Jade.지역.고향},
공부하고 사회 생활 시작한 곳은 ${Jade.지역["마음의 고향"]},
현재 거주하는 곳은 ${Jade.지역.거주지},
일하는 회사는 ${Jade.지역["마음의 고향"]}
`
],
}
```
| 18.34 | 80 | 0.498364 | kor_Hang | 1.000009 |
1c0064aecc0dba0e8d483e34ea4cc8cd419306b8 | 3,795 | md | Markdown | README.md | static-dev/source-loader | 219b2278a21ece208b40fc7cf26bd809e170bc07 | [
"MIT"
] | 4 | 2016-05-13T23:41:00.000Z | 2018-11-06T04:40:29.000Z | README.md | static-dev/source-loader | 219b2278a21ece208b40fc7cf26bd809e170bc07 | [
"MIT"
] | 14 | 2016-05-19T14:23:22.000Z | 2019-02-07T09:37:24.000Z | README.md | static-dev/source-loader | 219b2278a21ece208b40fc7cf26bd809e170bc07 | [
"MIT"
] | null | null | null | # Webpack Source Loader
[](https://badge.fury.io/js/source-loader) [](https://travis-ci.org/static-dev/source-loader) [](https://david-dm.org/static-dev/source-loader) [](https://coveralls.io/github/static-dev/source-loader)
Webpack loader that exports the source directly
> **Note:** This project is in early development, and versioning is a little different. [Read this](http://markup.im/#q4_cRZ1Q) for more details.
### Why should you care?
If you need to load up a file's source directly and have it available within your javascript, this loader is what you're after. For example, you could load up a svg file, parse out a path you want, and manipulate that with your javascript somehow. Or you could load up a json or text file. Or some other type of file that you came up with yourself. Whatever it is, as long as it's contents would be ok after being run through `JSON.stringify` (read: text files, not images or binary files), it will work just great with the source loader. And if it is a binary file, it will work great as well, but you won't be able to require it in your client-side js, just manipulate it through a plugin.
### Installation
`npm install source-loader -S`
### Usage
Just load it up in your webpack config like this:
```js
module.exports = {
module: {
loaders: [
{ test: /\.foo$/, loader: 'source-loader' }
]
}
}
```
Then you can require it up in your main entry:
```js
const fooFile = require('testing.foo')
console.log(fooFile) // wow it's the contents!
```
As an added bonus, this loader makes the buffered raw source available on the `loaderContext` object so that plugins can manipulate it in any way necessary.
Let's break down how this could be done. Inside any plugin hook, you have a `compilation` object. You can get the `loaderContext` for any of the modules that webpack is processing through `compilation.modules` -- just find the one(s) you want by name. Now you have a large object which is an instance of the `DependenciesBlock` class, with a bunch of great information on it. You can find the raw buffered source under the `_src` property if the file was loaded with the source-loader.
Additionally, if you have a specific source that is valid javascript and you'd like this loader not to output it as a string that would need to be eval'd in order to use the javascript, there's a hack for that. Within a plugin or another loader, if you add `_jsSource` as a truthy property to the module, it will skip the extra stringification and output the source raw. Note that you will get an error if it's not valid javascript, so make sure you are only setting the `_jsSource` property if you are positive that what is coming out of your loader chain is going to be js. To set from a loader, `this._module._jsSource = true` will do it, and from a plugin, you can do the same as is described above with the `_src` property.
Wondering what sets this loader apart from [raw-loader](https://github.com/webpack/raw-loader)? This is it. Both loaders expose the file's contents to be required by webpack, but this loader also exposes the raw source for plugin processing, and allows raw js to be passed through conditionally. It also does not try to stringify binary files (which can cause bugs), has tests, and is actively maintained, as a bonus.
### License & Contributing
- Details on the license [can be found here](LICENSE.md)
- Details on running tests and contributing [can be found here](contributing.md)
| 75.9 | 728 | 0.755995 | eng_Latn | 0.996156 |
1c011dbd84710aef177bce24038c45212c2bfbf3 | 3,220 | md | Markdown | 4._Collection_Set_Stack_Queue_Map/Notes.md | RasmusKnothNielsen/Code_examples_Construction_3sem | f4d6c69f8fbba84245371cae584fb9b694b03acf | [
"MIT"
] | null | null | null | 4._Collection_Set_Stack_Queue_Map/Notes.md | RasmusKnothNielsen/Code_examples_Construction_3sem | f4d6c69f8fbba84245371cae584fb9b694b03acf | [
"MIT"
] | null | null | null | 4._Collection_Set_Stack_Queue_Map/Notes.md | RasmusKnothNielsen/Code_examples_Construction_3sem | f4d6c69f8fbba84245371cae584fb9b694b03acf | [
"MIT"
] | null | null | null | **Collection**:\
Collection er et interface der repræsenterer en gruppe af Objekter, disse objekter er det pågældende Collection's
elementer. Collection er root interface i Collection hierarkiet. Alle interfaces der implementerer Collection
eller implementerer sub-interfaces til Collection nedarver alle metoder fra Collection interfacet. Collection
nedarver fra superinterfacet Iterable som er det "højeste" interface i interface hierarkiet.
Alle Interfaces/klasser nedenfor nedarver alt funktionalitet fra Collection ved at enten extende eller implementerer
Collection eller sub-interfaces til Collection.
**Set**:\
Set er en collection af elementer, hvoraf ingen duplicates er tilladte. Set extender Collection og er et interface.
Dvs. at Set ikke kan instantieres uden at blive implementeret af en klasse.\
Hvad bruges det til?
- Brugernavne er et godt eksempel på et Set. Du vil aldrig have to klienter med det samme brugernavn.
**Stack**:\
Stack er en klasse der følger LIFO princippet, Last In First Out. Ved en Stack kan du kun kigge på det øverste element,
og ikke andre elementer i Stacken. Deque har samme funktionalitet, samt meget mere og kan stadig vedholde LIFO
princippet - Deque er at foretrække af de to.\
`TODO:`Hvad bruges det til?
**Queue**:\
Queue er et interface der følger FIFO princippet, First In First Out. Queue extender Collection.\
Hvad bruges det til?
- Det kan fx bruges hvis du vil være helt sikker på at du følger FIFO princippet og kun vil indsætte data i den ene
ende af din queue og trække det ud i den anden ende af din queue.
**Deque** - Ikke nødvendigt at nævne Deque\
Deque = Double ended Queue. Deque er et interface der extender Queue. Deque kan både følge princippet FIFO og LIFO. Med
deque kan der tages elementer ud af begge ender og sættes ind i begge ender.
Hvad bruges det til?
- Hvis man har flere processorer som kører simultant......
**Map**:\
Map er en del af Java Collections, men ikke en del af Collection interfacet.
Map er et interface hvor du kan mappe en key til en værdi. Et Map objekt kan ikke instantieres uden at
blive instantieret som fx TreeMap eller HashMap (da det er et interface). Map kan ikke indeholde duplicate keys,
men kan godt indeholde duplicate værdier. Det kan ikke indeholde null keys, men kan indeholde flere null værdier -
dette kan være brugbart hvis man ved hvad man skal bruge den specifikke key, men endnu ikke ved hvad den tilhørende
value skal være.
Alt efter implementeringen kan rækkefølgen og dermed sorteringen ikke garanteres at være i den naturlige rækkefølge.
Eksempelvis TreeMap sorterer keys'ne i den naturlige rækkefølge, hvorimod HashMap sorterer dem alt efter hvad Hash
værdien er. Hvis man vil garantere rækkefølgen kan man implementere en Comparator som sorterer Map'et. HashMap
funktioner er *oftest* konstante O(1), men det kommer an på hvordan Hashingen er implementeret.\
Hvad bruges det til?
- Hvis vi vil gemme elementer som vi kan referere til via specifikke keys kan Map være idéelt. Et eksempel kunne være
et fornavn samt efternavn som værdi og et medarbejderID eller brugernavn som key. På denne måde kan alle medarbejdere
have et unikt ID, men der kan godt være flere med det samme navn. | 67.083333 | 120 | 0.795963 | dan_Latn | 0.999464 |
1c0139c6b999a6f326adb9bf177bf8d13effcf28 | 16 | md | Markdown | Patterns/2010/12/13/commit.md | vishwatejharer/Patterns-with-python | fc221f78e151efca3235d687023112fb5b760643 | [
"MIT"
] | null | null | null | Patterns/2010/12/13/commit.md | vishwatejharer/Patterns-with-python | fc221f78e151efca3235d687023112fb5b760643 | [
"MIT"
] | null | null | null | Patterns/2010/12/13/commit.md | vishwatejharer/Patterns-with-python | fc221f78e151efca3235d687023112fb5b760643 | [
"MIT"
] | null | null | null | 2 on 12/13/2010
| 8 | 15 | 0.6875 | fin_Latn | 0.611235 |
1c01dd9d78265d5a42b5b2d492bc7d3827b67265 | 5,412 | md | Markdown | Uninstallation.md | trncb/choco-wiki | 4184f8eb676ab54081c6de48a50725fa7cf1976d | [
"Apache-2.0"
] | null | null | null | Uninstallation.md | trncb/choco-wiki | 4184f8eb676ab54081c6de48a50725fa7cf1976d | [
"Apache-2.0"
] | null | null | null | Uninstallation.md | trncb/choco-wiki | 4184f8eb676ab54081c6de48a50725fa7cf1976d | [
"Apache-2.0"
] | null | null | null | # Uninstalling Chocolatey
Should you decide you don't like Chocolatey, you can uninstall it simply by removing the folder (and the environment variable(s) that it creates). Since it is not actually installed in Programs and Features, you don't have to worry that it has cluttered up your registry (however, that's a different story for the applications that you installed with Chocolatey or manually).
## Folder
Most of Chocolatey is contained in `C:\ProgramData\chocolatey` or whatever `$env:ChocolateyInstall` evaluates to. You can simply delete that folder.
**NOTE** You might first back up the sub-folders `lib` and `bin`, just in case you find undesirable results after removing Chocolatey. Bear in mind that not every Chocolatey package is an installer package; there may be some non-installed applications contained in these subfolders that could potentially go missing. Having a backup will allow you to test that aspect out.
## Environment Variables
There are some environment variables that need to be adjusted or removed.
* ChocolateyInstall
* ChocolateyToolsLocation
* ChocolateyLastPathUpdate
* PATH (will need updated to remove)
## Script
There are no warranties on this script whatsoever, but here is something you can try:
**WARNING!!** This will remove Chocolatey and all packages, software, and configurations in the Chocolatey Installation folder from your machine. Everything will be GONE. This is very destructive. DO NOT RUN this script unless you completely understand what the intention of this script is and are good with it. If you mess something up, we cannot help you fix it.
***WARNING:*** Seriously, this script may destroy your machine and require a rebuild. It may have varied results on different machines in the same environment. Think twice before running this.
<!--remove
<p class="text-danger"><strong>Click the red button below to reveal the uninstall scripts.</strong></p>
<button type="button" class="btn btn-danger btn-hide">Yes, I understand the dangers of running these scripts</button>
<div id="uninstall-scripts" class="d-none">
-->
If you also intend to delete the Chocolatey directory, remove the `-WhatIf`:
~~~powershell
if (!$env:ChocolateyInstall) {
Write-Warning "The ChocolateyInstall environment variable was not found. `n Chocolatey is not detected as installed. Nothing to do"
return
}
if (!(Test-Path "$env:ChocolateyInstall")) {
Write-Warning "Chocolatey installation not detected at '$env:ChocolateyInstall'. `n Nothing to do."
return
}
$userPath = [Microsoft.Win32.Registry]::CurrentUser.OpenSubKey('Environment').GetValue('PATH', '', [Microsoft.Win32.RegistryValueOptions]::DoNotExpandEnvironmentNames).ToString()
$machinePath = [Microsoft.Win32.Registry]::LocalMachine.OpenSubKey('SYSTEM\CurrentControlSet\Control\Session Manager\Environment\').GetValue('PATH', '', [Microsoft.Win32.RegistryValueOptions]::DoNotExpandEnvironmentNames).ToString()
@"
User PATH:
$userPath
Machine PATH:
$machinePath
"@ | Out-File "C:\PATH_backups_ChocolateyUninstall.txt" -Encoding UTF8 -Force
if ($userPath -like "*$env:ChocolateyInstall*") {
Write-Output "Chocolatey Install location found in User Path. Removing..."
# WARNING: This could cause issues after reboot where nothing is
# found if something goes wrong. In that case, look at the backed up
# files for PATH.
[System.Text.RegularExpressions.Regex]::Replace($userPath, [System.Text.RegularExpressions.Regex]::Escape("$env:ChocolateyInstall\bin") + '(?>;)?', '', [System.Text.RegularExpressions.RegexOptions]::IgnoreCase) | %{[System.Environment]::SetEnvironmentVariable('PATH', $_.Replace(";;",";"), 'User')}
}
if ($machinePath -like "*$env:ChocolateyInstall*") {
Write-Output "Chocolatey Install location found in Machine Path. Removing..."
# WARNING: This could cause issues after reboot where nothing is
# found if something goes wrong. In that case, look at the backed up
# files for PATH.
[System.Text.RegularExpressions.Regex]::Replace($machinePath, [System.Text.RegularExpressions.Regex]::Escape("$env:ChocolateyInstall\bin") + '(?>;)?', '', [System.Text.RegularExpressions.RegexOptions]::IgnoreCase) | %{[System.Environment]::SetEnvironmentVariable('PATH', $_.Replace(";;",";"), 'Machine')}
}
# Adapt for any services running in subfolders of ChocolateyInstall
$agentService = Get-Service -Name chocolatey-agent -ErrorAction SilentlyContinue
if ($agentService -and $agentService.Status -eq 'Running') { $agentService.Stop() }
# TODO: add other services here
# delete the contents (remove -WhatIf to actually remove)
Remove-Item -Recurse -Force "$env:ChocolateyInstall" -WhatIf
[System.Environment]::SetEnvironmentVariable("ChocolateyInstall", $null, 'User')
[System.Environment]::SetEnvironmentVariable("ChocolateyInstall", $null, 'Machine')
[System.Environment]::SetEnvironmentVariable("ChocolateyLastPathUpdate", $null, 'User')
[System.Environment]::SetEnvironmentVariable("ChocolateyLastPathUpdate", $null, 'Machine')
~~~
If you also intend to delete the tools directory that was managed by Chocolatey, remove the `-WhatIf` switch:
~~~powershell
if ($env:ChocolateyToolsLocation) { Remove-Item -Recurse -Force "$env:ChocolateyToolsLocation" -WhatIf }
[System.Environment]::SetEnvironmentVariable("ChocolateyToolsLocation", $null, 'User')
[System.Environment]::SetEnvironmentVariable("ChocolateyToolsLocation", $null, 'Machine')
~~~
---
title: tiup clean
---
# tiup clean
The `tiup clean` command cleans up the data generated while components were running.

## Syntax
```sh
tiup clean [name] [flags]
```
`[name]` is the value of the `Name` field in the output of the [status command](/tiup/tiup-command-status.md). If `[name]` is omitted, the `--all` option must be used.

## Options

### --all (boolean, default false)

Clean up all operation records.

## Output
```
Clean instance of `%s`, directory: %s
```
Diff Sniffer for Git
====================
[](https://packagist.org/packages/diff-sniffer/diff-sniffer)
[](https://packagist.org/packages/diff-sniffer/diff-sniffer)

[](https://ci.appveyor.com/project/diff-sniffer/diff-sniffer)
[](https://codecov.io/gh/diff-sniffer/diff-sniffer)
This tool allows you to use [PHP_CodeSniffer](https://github.com/squizlabs/PHP_CodeSniffer) as a pre-commit hook. The main difference from [existing solutions](https://github.com/s0enke/git-hooks/blob/master/phpcs-pre-commit/pre-commit) is that this one validates only the changed lines of code, not the whole source tree.
Installation
------------
Download a PHAR package of the latest release and put it somewhere within your `$PATH`:
```
$ wget https://github.com/diff-sniffer/diff-sniffer/releases/latest/download/diff-sniffer.phar
$ chmod +x diff-sniffer.phar
$ sudo cp diff-sniffer.phar /usr/local/bin/diff-sniffer
```
Create a pre-commit hook in a specific Git repository:
```
$ cd /path/to/repo
$ cat > .git/hooks/pre-commit << 'EOF'
#!/usr/bin/env bash
diff-sniffer --staged "$@"
EOF
```
Alternatively, you can create a global pre-commit hook for your user (see [`man githooks`](https://git-scm.com/docs/githooks)):
```
$ cat > ~/.config/git/hooks/pre-commit << 'EOF'
#!/usr/bin/env bash
diff-sniffer --staged "$@"
EOF
```
You can also install Diff Sniffer manually:
```
$ git clone git@github.com:diff-sniffer/diff-sniffer.git
$ cd diff-sniffer
$ composer install
$ bin/diff-sniffer --version
```
Continuous integration mode
---------------------------
Diff Sniffer can also run on a CI server and validate pull requests. For example, on Travis CI:
```
$ wget https://github.com/diff-sniffer/diff-sniffer/releases/latest/download/diff-sniffer.phar
$ php diff-sniffer.phar origin/$TRAVIS_BRANCH...$TRAVIS_PULL_REQUEST_SHA
```
---
uid: System.ServiceModel.Dispatcher.IDispatchMessageFormatter
ms.technology:
- "dotnet-standard"
author: "Erikre"
ms.author: "erikre"
manager: "erikre"
---
---
uid: System.ServiceModel.Dispatcher.IDispatchMessageFormatter.DeserializeRequest(System.ServiceModel.Channels.Message,System.Object[])
ms.technology:
- "dotnet-standard"
author: "Erikre"
ms.author: "erikre"
manager: "erikre"
---
---
uid: System.ServiceModel.Dispatcher.IDispatchMessageFormatter.SerializeReply(System.ServiceModel.Channels.MessageVersion,System.Object[],System.Object)
ms.technology:
- "dotnet-standard"
author: "Erikre"
ms.author: "erikre"
manager: "erikre"
---
---
layout: post
comments: true
categories: Other
---
## Download Advice and consent the politics of judicial appointments book
He creaks and scrapes to the door, in a stretch limousine? Kaitlin had the piercing voice and talent for vituperation that marked her as With the determination of any pulp-magazine adventurer, then the reservoir of anger was deeper still and pent up behind thoroughly salting the seat of his pants, the temperature and pressure were nearly Earth-normal, toward the cockpit, and led to the discovery of the long volcanic for her and perhaps because the sweet man doubted his desirability. I hit out at the name. "To Roke?" prove it, she was afraid to have that commitment tested just an illuminated wall clock, he would drive east into Montana first thing in the morning. " thus the whole of our little Polar Sea squadron was collected at the shirt with epaulets, to which he'd Aug, I've never been much of a talker, she changed jobs. He detected a note of melancholy in his voice, stiff turn. She sat next to the window. series of comic books portraying him in colorful cape and tights. Grey and a tray of tea cakes. I think -" 137 least hindrance from ice? It belonged to a famous wizard. He-or Anieb within him-could follow the advice and consent the politics of judicial appointments of Gelluk's spells back into Gelluk's own mind. The two people almost tumbled "No. Ath left his book with a fellow mage on Pody when he went into advice and consent the politics of judicial appointments west, whereat she rejoiced, "Could be risky," Bernard agreed after a second's reflection, while He was grateful to see Kurremkarmerruk coming slowly down the bank of the Thwilburn from the swarm? "You knew when my license would expire, bowing low, i 293! Just forget the busload of nuns smashed on the tracks, carried back to Norway! But those rockets or the case on the mainland. She was Anieb. Theel returned borders. " Richard Velnod stood in his open doorway, Agnes intended to Junior was tempted to experiment with the controls. John Simpson describes in his well-known "! 
He winced and almost cast them aside in disgust. the _Ymer_ to Korepovskoj, he was snoring, he was in a room with brick walls and bricked-up windows. 246 insignificance! There are no good harbours in the Hemlock was glad to see a bit of fire in the boy. They tried to make her stay and eat supper with them, the boy now knows that was a good thing. This, I shall myself be slain, either, lying on his cot in incorrect, takes my business, hopin' she'd see who you might be. In its "More vanilla Coke, Aunt Gen. Even from the top of a Over many proud generations and at least to the extent of second cousins, that I would do anything for you. Venerate moved up to Admiral. have here, "if you really want advice and consent the politics of judicial appointments know. night by crowds of angry villagers with torches and pitchforks, too. " Pacific coast of North Asia to pay "jassak" to Deschnev, if it is known that no one can -- you and she'd found relief in revelation, how advice and consent the politics of judicial appointments was that, but 41 Buddhist canonical books Or are you ready to leave now?" tents differed somewhat in construction from the common Chukch does that mean?" "Detail. This world is as vivid as any Curtis has ever "For Earth, only to collide with Ralston as he came out. " hillside, covered with a shroud. " the other hand, but he is assigned to him so exalted a position, that we again got news from Menka by maintain surveillance on it at least for fifteen or twenty minutes. They were clothed in close and forced her to disrobe. "The second thing I have to announce tonight is that such a commitment has now been made. 
Regardless -of the severity of a setback, and Roke, its contents having been explored in haste, the girl clawed at the steeper angle -- we were falling, I wouldn't put it past them to have taps and call-monitor programs anywhere, but she closed her eyes and said: "I'll be okay, he tugged a mass of tissues from the box with his left Chukches' mode of life. " Merrick was speaking casually in a way that seemed to assume the subject to be common advice and consent the politics of judicial appointments although Bernard still hadn't been told anything else about it officially; but at the same lime he was eyeing Bernard curiously, don't advice and consent the politics of judicial appointments to Roke Knoll until you know the ground you stand on. He had defied Losen's power, and then - nodded to the two guards, we saw a sail comming about different poses. " "Just let him be," she advised. "A man who slumped in one of the two chairs at the small dinette. Softly rustling leaves. ] Satan than him," said Geneva? The regulars have pretty well secured the whole module already. They want the Rule of Roke to separate men from women, if Zorphwar. " When Westland left, juvenile. " But they answered, staring, at least, macaroni and advice and consent the politics of judicial appointments. "Why does Mrs. file:D|Documents20and20Settingsharry. A trial. It's just a card.
---
description: Propiedades, métodos y eventos del objeto Command
title: Propiedades, métodos y eventos del objeto Command | Microsoft Docs
ms.prod: sql
ms.prod_service: connectivity
ms.technology: ado
ms.custom: ''
ms.date: 01/19/2017
ms.reviewer: ''
ms.topic: conceptual
helpviewer_keywords:
- Command object [ADO], members
ms.assetid: 0389f21c-06da-4090-9da1-28d912f888d7
author: rothja
ms.author: jroth
ms.openlocfilehash: 393ed4047bb1f2313a3719ce39b942de20b5b6b7
ms.sourcegitcommit: 18a98ea6a30d448aa6195e10ea2413be7e837e94
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 08/27/2020
ms.locfileid: "88975226"
---
# <a name="command-object-properties-methods-and-events"></a>Command Object Properties, Methods, and Events

## <a name="propertiescollections"></a>Properties/Collections

[ActiveConnection Property](./activeconnection-property-ado.md)

[CommandStream Property](./commandstream-property-ado.md)

[CommandText Property](./commandtext-property-ado.md)

[CommandTimeout Property](./commandtimeout-property-ado.md)

[CommandType Property](./commandtype-property-ado.md)

[Dialect Property](./dialect-property.md)

[Name Property](./name-property-ado.md)

[NamedParameters Property](./namedparameters-property-ado.md)

[Parameters Collection](./parameters-collection-ado.md)

[Prepared Property](./prepared-property-ado.md)

[Properties Collection](./properties-collection-ado.md)

[State Property](./state-property-ado.md)

## <a name="methods"></a>Methods

[Cancel Method](./cancel-method-ado.md)

[CreateParameter Method](./createparameter-method-ado.md)

[Execute Method (ADO Command)](./execute-method-ado-command.md)

## <a name="events"></a>Events

None.

## <a name="see-also"></a>See Also

[Command Object (ADO)](./command-object-ado.md)
---
uid: System.Windows.Automation.TablePattern
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
---
uid: System.Windows.Automation.TablePattern.RowHeadersProperty
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
---
uid: System.Windows.Automation.TablePattern.Cached
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
---
uid: System.Windows.Automation.TablePattern.Current
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
---
uid: System.Windows.Automation.TablePattern.RowOrColumnMajorProperty
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
---
uid: System.Windows.Automation.TablePattern.Pattern
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
---
uid: System.Windows.Automation.TablePattern.ColumnHeadersProperty
ms.technology:
- "dotnet-wpf"
author: "Xansky"
ms.author: "mhopkins"
manager: "wpickett"
---
Erlang Enhancement Process
--------------------------
The subdirectory eeps/ of this repository contains the EEPs (Erlang
Extension Proposals), in [Markdown][MD] (`*.md`) format, as produced in
the [Erlang Enhancement Process][EEP].

The [EEP Index][EEP 0] in [EEP 0][] gathers all EEPs; it is built from
the EEPs themselves with the tools described in the following paragraphs.
This repository also contains a version of a [Markdown.pl][] script, in
subdirectory `md/`, that can be used together with the included Perl
[build script][build.pl] and some helper scripts to produce HTML versions
of the `*.md` EEPs.
Type `perl build.pl` or `./build.pl` depending on your environment to
rebuild all HTML that needs to be rebuilt. A reasonable `perl` (5.8)
is all that is needed.
Patch suggestions to this repository should be sent to <eeps@erlang.org>
(remember to subscribe to the list first) as stated in the
[Erlang Enhancement Process][EEP].
[MD]: http://daringfireball.net/projects/markdown/
"The Markdown Project"
[Markdown.pl]: md/Markdown.pl
"Markdown.pl"
[EEP]: http://www.erlang.org/eep.html
"Erlang Enhancement Process"
[build.pl]: build.pl
"Perl build script to overcome Makefile inportability"
[EEP 0]: http://erlang.org/eep/eeps/eep-0000.html
"EEP 0: Index of EEPS"
Copyright
---------
This document is placed in the public domain or under the CC0-1.0-Universal
license, whichever is more permissive.
### Author
Erlang/OTP, Raimo Niskanen, 2010, 2018
[EmacsVar]: <> "Local Variables:"
[EmacsVar]: <> "mode: indented-text"
[EmacsVar]: <> "indent-tabs-mode: nil"
[EmacsVar]: <> "sentence-end-double-space: t"
[EmacsVar]: <> "fill-column: 70"
[EmacsVar]: <> "coding: utf-8"
[EmacsVar]: <> "End:"
| 28.95 | 75 | 0.711572 | eng_Latn | 0.931103 |
### [CVE-2020-8250](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8250)


&color=brighgreen)
### Description
A vulnerability in the Pulse Secure Desktop Client (Linux) < 9.1R9 could allow local attackers to escalate privilege.
### POC
#### Reference
- https://kb.pulsesecure.net/articles/Pulse_Security_Advisories/SA44601
#### Github
No PoCs found on GitHub currently.
| 37 | 119 | 0.761261 | kor_Hang | 0.208115 |
---
lab:
title: '10 - 实现数据保护'
module: '模块 10 - 数据保护'
---
# 实验室 10 - 备份虚拟机
# 学生实验室手册
## 实验室场景
你的任务是对使用 Azure 恢复服务来备份和还原 Azure 虚拟机和本地计算机上托管的文件这一过程进行评估。此外,你还想确定保护恢复服务保管库中所存储数据的方法,以防意外或恶意行为造成的数据丢失。
## 目标
在本实验室中,你将:
+ 任务 1:预配实验室环境
+ 任务 2:创建恢复服务保管库
+ 任务 3:实现 Azure 虚拟机级备份
+ 任务 4:实现文件和文件夹备份
+ 任务 5:通过使用 Azure 恢复服务代理执行文件恢复
+ 任务 6:通过使用 Azure 虚拟机快照执行文件恢复(可选)
+ 任务 7:查看 Azure 恢复服务软删除功能(可选)
## 预计用时:50 分钟
## 说明
### 练习 1
#### 任务 1:预配实验室环境
在此任务中,你将部署两台虚拟机用于测试不同的备份方案。
1. 登录到 [Azure 门户](https://portal.azure.com)。
1. 在 Azure 门户中,单击 Azure 门户右上方的图标,打开 **Azure Cloud Shell**。
1. 提示选择 **“Bash”** 或 **“PowerShell”** 时,选择 **“PowerShell”**。
>**备注**: 如果这是第一次启动 **Cloud Shell**,并看到 **“未装载任何存储”** 消息,请选择在本实验室中使用的订阅,然后选择 **“创建存储”**。
1. 在“Cloud Shell”窗格的工具栏中,单击 **“上传/下载文件”** 图标,在下拉菜单中,单击 **“上传”** 并将文件 **\\Allfiles\\Labs\\10\\az104-10-vms-edge-template.json** 和 **\\Allfiles\\Labs\\10\\az104-10-vms-edge-parameters.json** 上传到 Cloud Shell 主目录中。
1. 在 Cloud Shell 窗格中,运行以下命令以创建托管虚拟机的资源组(将 `[Azure_region]` 占位符替换为你打算将 Azure 虚拟机部署到的 Azure 区域的名称)。分别键入每个命令行并分别执行:
```powershell
$location = '[Azure_region]'
```
```powershell
$rgName = 'az104-10-rg0'
```
```powershell
New-AzResourceGroup -Name $rgName -Location $location
```
1. From the Cloud Shell pane, run the following to create the first virtual network and deploy the virtual machines into it by using the uploaded template and parameter files:
```powershell
New-AzResourceGroupDeployment `
-ResourceGroupName $rgName `
-TemplateFile $HOME/az104-10-vms-edge-template.json `
-TemplateParameterFile $HOME/az104-10-vms-edge-parameters.json `
-AsJob
```
1. Minimize the Cloud Shell pane (but do not close it).

>**Note**: Do not wait for the deployment to complete, but instead proceed to the next task. The deployment should take about 5 minutes.

#### Task 2: Create a Recovery Services vault

In this task, you will create a Recovery Services vault.

1. In the Azure portal, search for and select **Recovery Services vaults**, and, on the **Recovery Services vaults** blade, click **+ Create**.

1. On the **Create Recovery Services vault** blade, specify the following settings:

| Setting | Value |
| --- | --- |
| Subscription | the name of the Azure subscription you are using in this lab |
| Resource group | the name of a new resource group **az104-10-rg1** |
| Vault name | **az104-10-rsv1** |
| Region | the name of the region into which you deployed the two virtual machines in the previous task |

>**Note**: Make sure that you specify the same region into which you deployed the virtual machines in the previous task.

1. Click **Review + create**, ensure that the validation passed, and click **Create**.

>**Note**: Wait for the deployment to complete. The deployment should take less than 1 minute.

1. When the deployment completes, click **Go to resource**.

1. On the **az104-10-rsv1** Recovery Services vault blade, in the **Settings** section, click **Properties**.

1. On the **az104-10-rsv1 - Properties** blade, click the **Update** link under the **Backup Configuration** label.

1. On the **Backup Configuration** blade, note that you can set the **Storage replication type** to either **Locally-redundant** or **Geo-redundant**. Leave the default setting of **Geo-redundant** and close the blade.

>**Note**: This setting can be configured only if there are no existing backup items.

1. Back on the **az104-10-rsv1 - Properties** blade, click the **Update** link under the **Security Settings** label.

1. On the **Security Settings** blade, note that **Soft Delete (For Azure Virtual Machines)** is **Enabled**.

1. Close the **Security Settings** blade and, back on the **az104-10-rsv1** Recovery Services vault blade, click **Overview**.

#### Task 3: Implement Azure virtual machine-level backup

In this task, you will implement Azure virtual machine-level backup.

>**Note**: Before you start this task, ensure that the deployment you initiated in the first task of this lab has completed successfully.

1. On the **az104-10-rsv1** Recovery Services vault blade, click **Overview** and then click **+ Backup**.

1. On the **Backup Goal** blade, specify the following settings:

| Setting | Value |
| --- | --- |
| Where is your workload running? | **Azure** |
| What do you want to backup? | **Virtual machine** |

1. On the **Backup Goal** blade, click **Backup**.

1. On **Backup policy**, review the **DefaultPolicy** settings and select **Create a new policy**.

1. Define the new backup policy with the following settings (leave all others with their default values):

| Setting | Value |
| ---- | ---- |
| Policy name | **az104-10-backup-policy** |
| Frequency | **Daily** |
| Time | **12:00 AM** |
| Timezone | the name of your local time zone |
| Retain instant recovery snapshot(s) for | **2** Day(s) |

1. Click **OK** to create the policy and, in the **Virtual machines** section, select **Add**.

1. On the **Select virtual machines** blade, select **az104-10-vm0**, click **OK**, and, back on the **Backup** blade, click **Enable backup**.

>**Note**: Wait for the backup to be enabled. This should take about 2 minutes.

1. Navigate back to the **az104-10-rsv1** Recovery Services vault blade, and, in the **Protected items** section, click **Backup items**, followed by the **Azure Virtual Machine** entry.

1. On the **Backup Items (Azure Virtual Machine)** blade, review the values of the **Backup Pre-Check** and **Last Backup Status** entries for **az104-10-vm0**.

1. On the **az104-10-vm0** backup item blade, click **Backup now**, accept the default value in the **Retain Backup Till** drop-down list, and click **OK**.

>**Note**: Do not wait for the backup to complete, but instead proceed to the next task.

#### Task 4: Implement file and folder backup

In this task, you will implement file and folder backup by using Azure Recovery Services.

1. In the Azure portal, search for and select **Virtual machines**, and, on the **Virtual machines** blade, click **az104-10-vm1**.

1. On the **az104-10-vm1** blade, click **Connect**, in the drop-down menu click **RDP**, on the **Connect with RDP** blade click **Download RDP File**, and follow the prompts to start the Remote Desktop session.

>**Note**: This step refers to connecting via Remote Desktop from a Windows computer. On a Mac, you can use the Remote Desktop client from the Mac App Store, and on Linux computers you can use open-source RDP client software.

>**Note**: You can ignore any warning prompts when connecting to the target virtual machine.

1. When prompted, sign in with the username **Student** and the password **Pa55w.rd1234**.

>**Note:** Since the Azure portal no longer supports IE11, you need to use the Microsoft Edge browser to complete this task.

1. Within the Remote Desktop session to the **az104-10-vm1** Azure virtual machine, start the Edge browser, browse to the [Azure portal](https://portal.azure.com), and sign in with your credentials.

1. In the Azure portal, search for and select **Recovery Services vaults**, and, on the **Recovery Services vaults** blade, click **az104-10-rsv1**.

1. On the **az104-10-rsv1** Recovery Services vault blade, click **+ Backup**.

1. On the **Backup Goal** blade, specify the following settings:

| Setting | Value |
| --- | --- |
| Where is your workload running? | **On-premises** |
| What do you want to backup? | **Files and folders** |

>**Note**: Even though the virtual machine used in this task runs in Azure, you can leverage it to evaluate the backup functionality applicable to any on-premises computer running the Windows Server operating system.

1. On the **Backup Goal** blade, click **Prepare Infrastructure**.

1. On the **Prepare infrastructure** blade, click the **Download Agent for Windows Server or Windows Client** link.

1. When prompted, click **Run** to start the installation of **MARSAgentInstaller.exe** with the default settings.

>**Note**: On the **Microsoft Update Opt-In** page of the **Microsoft Azure Recovery Services Agent Setup Wizard**, select the **I do not want to use Microsoft Update** installation option.

1. On the **Installation** page of the **Microsoft Azure Recovery Services Agent Setup Wizard**, click **Proceed to Registration**. This will start the **Register Server Wizard**.

1. Switch to the web browser window displaying the Azure portal and, on the **Prepare infrastructure** blade, select the **Already downloaded or using the latest Recovery Services Agent** checkbox, and then click **Download**.

1. When prompted whether to open or save the vault credentials file, click **Save**. This will save the vault credentials file to the local Downloads folder.

1. Switch back to the **Register Server Wizard** window and, on the **Vault Identification** page, click **Browse**.

1. In the **Select Vault Credentials** dialog box, browse to the **Downloads** folder, click the downloaded vault credentials file, and click **Open**.

1. Back on the **Vault Identification** page, click **Next**.

1. On the **Encryption Setting** page of the **Register Server Wizard**, click **Generate Passphrase**.

1. On the **Encryption Setting** page of the **Register Server Wizard**, click the **Browse** button next to the **Enter a location to save the passphrase** text box.

1. In the **Browse For Folder** dialog box, select the **Documents** folder and click **OK**.

1. Click **Finish**, review the **Microsoft Azure Backup** warning, click **Yes**, and wait for the registration to complete.

>**Note**: In production environments, you should store the passphrase file in a secure location other than the backup server.

1. On the **Server Registration** page of the **Register Server Wizard**, review the warning about the passphrase file location, ensure that the **Launch Microsoft Azure Recovery Services Agent** checkbox is selected, and click **Close**. This automatically opens the **Microsoft Azure Backup** console.

1. In the **Actions** pane of the **Microsoft Azure Backup** console, click **Schedule Backup**.

1. On the **Getting started** page of the **Schedule Backup Wizard**, click **Next**.

1. On the **Select Items to Backup** page, click **Add Items**.

1. In the **Select Items** dialog box, expand **C:\\Windows\\System32\\drivers\\etc\\**, select **hosts**, and click **OK**.

1. On the **Select Items to Backup** page, click **Next**.

1. On the **Specify Backup Schedule** page, ensure that the **Day** option is selected, in the first drop-down list below the **At the following times (Maximum allowed is three times a day)** box select **4:30 AM**, and click **Next**.

1. On the **Select Retention Policy** page, accept the default settings and click **Next**.

1. On the **Choose Initial Backup Type** page, accept the default settings and click **Next**.

1. On the **Confirmation** page, click **Finish**. Once the backup schedule is created, click **Close**.

1. In the Actions pane of the **Microsoft Azure Backup** console, click **Back Up Now**.

>**Note**: Once a scheduled backup has been created, you have the option of running a backup on demand.

1. On the **Select Backup Items** page of the Back Up Now Wizard, ensure that the **Files and Folders** option is selected and click **Next**.

1. On the **Retain Backup Till** page, accept the default setting and click **Next**.

1. On the **Confirmation** page, click **Back Up**.

1. Once the backup completes, click **Close**, and close Microsoft Azure Backup.

1. Switch to the web browser window displaying the Azure portal, navigate back to the **Recovery Services vault** blade, and, in the **Protected items** section, click **Backup items**.

1. On the **az104-10-rsv1 - Backup items** blade, click **Azure Backup Agent**.

1. On the **Backup Items (Azure Backup Agent)** blade, verify that there is an entry referencing the **C:\\** drive of **az104-10-vm1**.

#### Task 5: Perform file recovery by using the Azure Recovery Services agent (optional)

In this task, you will perform a file restore by using the Azure Recovery Services agent.

1. Within the Remote Desktop session to **az104-10-vm1**, open File Explorer, navigate to the **C:\\Windows\\System32\\drivers\\etc\\** folder, and delete the **hosts** file.

1. Open Microsoft Azure Backup and, in the **Actions** pane, click **Recover Data**. This will start the **Recover Data Wizard**.

1. On the **Getting Started** page of the Recover Data Wizard, ensure that the **This server (az104-10-vm1.)** option is selected and click **Next**.

1. On the **Select Recovery Mode** page, ensure that the **Individual files and folders** option is selected and click **Next**.

1. On the **Select Volume and Date** page, in the **Select the volume** drop-down list, select **C:\\**, accept the default selection of the available backup, and click **Mount**.

>**Note**: Wait for the mount operation to complete. This should take about 2 minutes.

1. On the **Browse and Recover Files** page, note the drive letter of the recovery volume and review the tip regarding the use of robocopy.

1. Click **Start**, expand the **Windows System** folder, and click **Command Prompt**.

1. From the Command Prompt, run the following to copy the restored **hosts** file to its original location (replace `[recovery_volume]` with the drive letter of the recovery volume you identified earlier):
```sh
robocopy [recovery_volume]:\Windows\System32\drivers\etc C:\Windows\system32\drivers\etc hosts /r:1 /w:1
```
1. Switch back to the **Recover Data Wizard** and, on the **Browse and Recover Files** page, click **Unmount**, and, when prompted for confirmation, click **Yes**.

1. Terminate the Remote Desktop session.

#### Task 6: Perform file recovery by using Azure virtual machine snapshots (optional)

In this task, you will restore a file from an Azure virtual machine-level, snapshot-based backup.

1. Switch to the browser window running on your lab computer and displaying the Azure portal.

1. In the Azure portal, search for and select **Virtual machines**, and, on the **Virtual machines** blade, click **az104-10-vm0**.

1. On the **az104-10-vm0** blade, click **Connect**, in the drop-down menu click **RDP**, on the **Connect with RDP** blade click **Download RDP File**, and follow the prompts to start the Remote Desktop session.

>**Note**: This step refers to connecting via Remote Desktop from a Windows computer. On a Mac, you can use the Remote Desktop client from the Mac App Store, and on Linux computers you can use open-source RDP client software.

>**Note**: You can ignore any warning prompts when connecting to the target virtual machine.

1. When prompted, sign in with the username **Student** and the password **Pa55w.rd1234**.

>**Note:** Since the Azure portal no longer supports IE11, you need to use the Microsoft Edge browser to complete this task.

1. Within the Remote Desktop session to **az104-10-vm0**, click **Start**, expand the **Windows System** folder, and click **Command Prompt**.

1. From the Command Prompt, run the following to delete the **hosts** file:
```sh
del C:\Windows\system32\drivers\etc\hosts
```
>**Note**: Later in this task, you will restore this file from the Azure virtual machine snapshot-based backup.

1. Within the Remote Desktop session to the **az104-10-vm0** Azure virtual machine, start the Edge browser, browse to the [Azure portal](https://portal.azure.com), and sign in with your credentials.

1. In the Azure portal, search for and select **Recovery Services vaults**, and, on the **Recovery Services vaults** blade, click **az104-10-rsv1**.

1. On the **az104-10-rsv1** Recovery Services vault blade, in the **Protected items** section, click **Backup items**.

1. On the **az104-10-rsv1 - Backup items** blade, click **Azure Virtual Machine**.

1. On the **Backup Items (Azure Virtual Machine)** blade, click **az104-10-vm0**.

1. On the **az104-10-vm0** backup item blade, click **File Recovery**.

>**Note**: You have the option of running the recovery from an application-consistent snapshot shortly after the backup starts.

1. On the **File Recovery** blade, accept the default recovery point and click **Download Executable**.

>**Note**: The script mounts the disks from the selected recovery point as local drives on the operating system where the script runs.

1. Click **Download**, and, when prompted whether to run or save **IaaSVMILRExeForWindows.exe**, click **Save**.

1. Switch to the File Explorer window and double-click the newly downloaded file.

1. When prompted to provide the password from the portal, copy the password from the **Password to run the script** text box on the **File Recovery** blade, paste it at the command prompt, and press **Enter**.

>**Note**: This opens a Windows PowerShell window displaying the progress of the mount operation.

>**Note**: If you receive an error message at this point, refresh the web browser window and repeat the last three steps.

1. Wait for the mount process to complete, review the informational messages in the Windows PowerShell window, note the drive letter assigned to the volume hosting the **Windows** folder, and start File Explorer.

1. In File Explorer, navigate to the drive letter you identified in the previous step, which hosts the snapshot of the operating system volume, and review its content.

1. Switch to the **Command Prompt** window.

1. From the Command Prompt, run the following to copy the restored **hosts** file to its original location (replace `[os_volume]` with the drive letter of the operating system volume you identified earlier):
```sh
robocopy [os_volume]:\Windows\System32\drivers\etc C:\Windows\system32\drivers\etc hosts /r:1 /w:1
```
1. Switch back to the **File Recovery** blade in the Azure portal and click **Unmount Disks**.

1. Terminate the Remote Desktop session.

#### Task 7: Review the Azure Recovery Services soft delete functionality

1. On your lab computer, in the Azure portal, search for and select **Recovery Services vaults**, and, on the **Recovery Services vaults** blade, click **az104-10-rsv1**.

1. On the **az104-10-rsv1** Recovery Services vault blade, in the **Protected items** section, click **Backup items**.

1. On the **az104-10-rsv1 - Backup items** blade, click **Azure Backup Agent**.

1. On the **Backup Items (Azure Backup Agent)** blade, click the entry representing the backup of **az104-10-vm1**.

1. On the **C:\\ on az104-10-vm1.** blade, click the **az104-10-vm1.** link.

1. On the **az104-10-vm1.** protected server blade, click **Delete**.

1. On the **Delete** blade, specify the following settings:

| Setting | Value |
| --- | --- |
| Type the server name | **az104-10-vm1.** |
| Reason | **Recycling dev/test servers** |
| Comments | **az104 10 lab** |

>**Note**: Make sure to include the trailing period when typing the server name.

1. Select the checkbox next to the label **There is backup data associated with 1 backup items for this server. I understand that clicking Confirm will permanently delete all the cloud backup data. This action cannot be undone. Admins of this subscription may receive an alert of the data deletion.** and click **Delete**.

1. Navigate back to the **az104-10-rsv1 - Backup items** blade and click **Azure Virtual Machine**.

1. On the **Backup Items (Azure Virtual Machine)** blade, click **az104-10-vm0**.

1. On the **az104-10-vm0** backup item blade, click **Stop backup**.

1. On the **Stop Backup** blade, select **Delete Backup Data**, specify the following settings, and click **Stop backup**:

| Setting | Value |
| --- | --- |
| Type the name of the Backup item | **az104-10-vm0** |
| Reason | **Others** |
| Comments | **az104 10 lab** |

1. Navigate back to the **az104-10-rsv1 - Backup items** blade and click **Refresh**.

>**Note**: The **Azure Virtual Machine** entry still lists **1** backup item.

1. Click the **Azure Virtual Machine** entry and, on the **Backup Items (Azure Virtual Machine)** blade, click the **az104-10-vm0** entry.

1. On the **az104-10-vm0** backup item blade, note that you have the option to **Undelete** the deleted backup.

>**Note**: This capability is provided by the soft delete functionality, which is enabled by default for Azure virtual machine backups.

1. Navigate back to the **az104-10-rsv1** Recovery Services vault blade and, in the **Settings** section, click **Properties**.

1. On the **az104-10-rsv1 - Properties** blade, click the **Update** link under the **Security Settings** label.

1. On the **Security Settings** blade, disable **Soft Delete (For Azure Virtual Machines)** and click **Save**.

>**Note**: This does not affect items that are already in the soft-deleted state.

1. Close the **Security Settings** blade and, back on the **az104-10-rsv1** Recovery Services vault blade, click **Overview**.

1. Navigate back to the **az104-10-vm0** backup item blade and click **Undelete**.

1. On the **Undelete az104-10-vm0** blade, click **Undelete**.

1. Wait for the undelete operation to complete, refresh the web browser page if needed, navigate back to the **az104-10-vm0** backup item blade, and click **Delete backup data**.

1. On the **Delete Backup Data** blade, specify the following settings and click **Delete**:

| Setting | Value |
| --- | --- |
| Type the name of the Backup item | **az104-10-vm0** |
| Reason | **Others** |
| Comments | **az104 10 lab** |

#### Clean up resources

>**Note**: Remember to remove any newly created Azure resources that you no longer use. Removing unused resources ensures you will not see unexpected charges.

1. In the Azure portal, open the **PowerShell** session within the **Cloud Shell** pane.

1. List all resource groups created throughout the labs of this module by running the following command:
```powershell
Get-AzResourceGroup -Name 'az104-10*'
```
1. Delete all resource groups you created throughout the labs of this module by running the following command:
```powershell
Get-AzResourceGroup -Name 'az104-10*' | Remove-AzResourceGroup -Force -AsJob
```
>**Note**: Optionally, consider deleting the automatically generated resource group with the **AzureBackupRG_** prefix (its existence does not incur additional charges).

>**Note**: The command executes asynchronously (as determined by the -AsJob parameter), so while you will be able to run another PowerShell command immediately afterwards within the same PowerShell session, it will take a few minutes before the resource groups are actually removed.

#### Review

In this lab, you have:

+ Provisioned the lab environment
+ Created a Recovery Services vault
+ Implemented Azure virtual machine-level backup
+ Implemented file and folder backup
+ Performed file recovery by using the Azure Recovery Services agent
+ Performed file recovery by using Azure virtual machine snapshots
+ Reviewed the Azure Recovery Services soft delete functionality
*File: writeup.md (repo: mamrabet/CarND-LaneLines-P1, license: MIT)*

# **Finding Lane Lines on the Road**
### Objective
---
**Finding Lane Lines on the Road**
In this project, the tools learned in the lane finding lessons are applied to identify lane lines on the road. The pipeline is developed and tested on a series of individual images, and later applied to a video stream.
[image1]: ./test_images_output/solidWhiteCurve.jpg "Lane finding"
---
### Reflection
### 1. Pipeline description
My pipeline consists of six steps:

1. The image is converted to grayscale.
2. A Gaussian blur is then applied to reduce noise. In this project the kernel size used is 5.
3. The Canny edge detection algorithm is then used, with a lower threshold of 100 and a higher threshold of 200.
4. A region of interest is then extracted from the image using polygons, to eliminate the other lines/edges that are outside of the vehicle's lane.
5. The Hough transform is then applied to find the lines. The parameters used are: threshold = 15, minimum line length = 10 pixels, maximum line gap = 170 pixels.
6. The lines are then added to the image.
In order to draw a single line on the left and right lanes, all the lines that are part of these lanes are averaged using least squares polynomial fit (degree = 1), then extrapolated to cover the lanes until the bottom of the image (in order to cover the missing lane parts for example to ensure continuity of lane finding between frames).
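The averaging-and-extrapolation step just described can be sketched with NumPy's degree-1 least squares fit (`fit_lane_line` is a hypothetical helper written for this writeup, not the project's actual code):

```python
import numpy as np

def fit_lane_line(segments, y_bottom, y_horizon):
    """Fit one lane line to Hough segments [(x1, y1, x2, y2), ...] and
    extrapolate it from the image bottom (y_bottom) to the horizon (y_horizon)."""
    xs, ys = [], []
    for x1, y1, x2, y2 in segments:
        xs.extend([x1, x2])
        ys.extend([y1, y2])
    # Degree-1 least squares fit of x as a function of y; fitting x(y)
    # stays numerically well-behaved for near-vertical lane lines.
    m, b = np.polyfit(ys, xs, 1)
    return ((int(round(m * y_bottom + b)), y_bottom),
            (int(round(m * y_horizon + b)), y_horizon))

# Two segments lying on the line x = 0.5*y + 10:
lane = fit_lane_line([(110, 200, 160, 300), (135, 250, 185, 350)], 540, 330)
print(lane)  # ((280, 540), (175, 330))
```

Fitting x as a function of y (rather than y(x)) avoids the near-infinite slopes of nearly vertical lane lines.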
### 2. Potential shortcomings with current pipeline
The tuning of the lane finding horizon in the current pipeline is done manually. In fact, in this project lanes are drawn for 40% of the image height, and this value could lead to incorrect lane finding in the farthest lane points.
Another shortcoming could be the estimation of complex lane configurations like in curves.
### 3. Possible improvements to current pipeline
A possible improvement would be to further tune the parameters using more input images that cover different, tricky lane configurations.
*File: controls/datapager/client-side-programming/events/ondatapagercreated.md (repo: mohdjuwaiser/ajax-docs, license: MIT)*

---
title: OnDataPagerCreated
page_title: OnDataPagerCreated | RadDataPager for ASP.NET AJAX Documentation
description: OnDataPagerCreated
slug: datapager/client-side-programming/events/ondatapagercreated
tags: ondatapagercreated
published: True
position: 1
---
# OnDataPagerCreated
##
The **OnDataPagerCreated** client-side event is fired after **RadDataPager** is created.
The event handler receives one argument:
1. the [RadDataPager object]({%slug datapager/client-side-programming/datapager-object%}) that fired the event.
The following example uses the **OnDataPagerCreated** event to display a message:
````ASPNET
<telerik:RadDataPager ID="RadDataPager1" runat="server">
<ClientEvents OnDataPagerCreated="DataPagerCreated" />
</telerik:RadDataPager>
````
````JavaScript
<telerik:RadCodeBlock ID="RadCodeBlock1" runat="server">
<script type="text/javascript">
function DataPagerCreated(sender, eventArgs) {
alert("Created datapager with UniqueID: " + sender.get_uniqueID());
}
</script>
</telerik:RadCodeBlock>
````
# See Also
* [OnDataPagerCreating]({%slug datapager/client-side-programming/events/ondatapagercreating%})
* [OnDataPagerDestroying]({%slug datapager/client-side-programming/events/ondatapagerdestroying%})
*File: microsoft-365/knowledge/topic-experiences-overview.md (repo: AdrianaMusic/microsoft-365-docs-pr.ko-KR, licenses: CC-BY-4.0, MIT)*

---
title: Topic experiences overview (Preview)
ms.author: efrene
author: efrene
manager: pamgreen
audience: admin
ms.topic: article
ms.service: ''
ms.prod: microsoft-365-enterprise
search.appverid: ''
ms.collection: enabler-strategic
localization_priority: None
ROBOTS: NOINDEX, NOFOLLOW
description: Overview of topic experiences.
ms.openlocfilehash: 8b06dd016295c8b0712a7c18c2296a318cd826ba
ms.sourcegitcommit: 18f95c4b7f74881b4a6ce71ad2ffa78a6ead5584
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 12/24/2020
ms.locfileid: "49731330"
---
# <a name="topic-experiences-overview-preview"></a>Topic experiences overview (Preview)

> [!Note]
> The content in this article is for the Project Cortex private preview. Learn more about [Project Cortex](https://aka.ms/projectcortex).

Topic experiences use Microsoft AI technology, Microsoft 365, Delve, Microsoft Graph, Search, and other components and services to build a knowledge network in your Microsoft 365 environment.
</br>
> [!VIDEO https://www.microsoft.com/videoplayer/embed/RE4LhZP]
</br>
The goal is to transform information into knowledge and deliver it to users in the apps they use every day, such as SharePoint modern pages and Microsoft Search.

Topic experiences help solve an important business problem for many companies: getting information to people when they need it. For example, new employees have to learn a lot of new information quickly, and they run into terms they know nothing about when reading company material. To learn more, they have to step away from what they are doing and spend considerable time searching for details such as an explanation of the term, the subject matter experts in the organization, and the sites and documents related to that term.

Topic experiences use AI to automatically discover and identify topics in your organization. They compile information such as a short description, subject matter experts, and related sites, files, and pages. A knowledge manager or a contributor can update the topic information as needed. The topics are then made available to your users, which means the text is highlighted for every instance of the topic that appears in news and pages on modern SharePoint sites. Users can choose to learn more about a topic through its topic details. Topics can also be found in SharePoint Search.

## <a name="how-topics-are-displayed-to-users"></a>How topics are displayed to users

When a topic is mentioned in content on SharePoint news and pages, you will see it highlighted. You can open the topic summary from the highlight, and the topic details from the summary title. A mentioned topic may have been identified automatically, or it may have been added to the page by the page author through a direct reference to the topic.

 </br>

## <a name="knowledge-indexing"></a>Knowledge indexing

Topic experiences use Microsoft AI technology to identify topics in your Microsoft 365 environment.

A topic is a phrase or term that is organizationally significant or important. It has a specific meaning to the organization, and it has resources related to it that can help people understand it and find more information about it.

When a topic has been identified and AI has determined that there is enough information for it to be a suggested topic, topic indexing creates a topic page containing the collected information:

- Alternate names and/or acronyms.
- A short description of the topic.
- People who might be knowledgeable about the topic.
- Files, pages, and sites related to the topic.

A knowledge manager can have all SharePoint sites in the tenant crawled for topics, or can select specific SharePoint sites.

## <a name="roles"></a>Roles

Users have the following roles when working with topic experiences in a Microsoft 365 environment:

- Topic viewers: Users who can see topic highlights on SharePoint modern sites to which they have at least read access, as well as in Microsoft Search. They can also select a topic highlight to see the topic details on the topic page. Topic viewers can give feedback on how useful a topic is.
- Contributors: Users who have permission to edit existing topics or create new ones. Knowledge managers assign contributor permissions to users through the topic experiences settings in the Microsoft 365 admin center. You can also give all topic viewers permission to edit and create topics so that they can contribute to the topics they see.
- Knowledge managers: Users who guide topics through the topic lifecycle. Knowledge managers use the Manage Topics page in the topic center to confirm or remove AI-suggested topics and to edit or create topics, and they are the only users who have access to that page. Knowledge admins assign knowledge manager permissions to users through the topic experiences admin settings in the Microsoft 365 admin center.
- Knowledge admins: Knowledge admins set up and manage topic experiences through the admin controls in the Microsoft 365 admin center. Currently, a Microsoft 365 global or SharePoint administrator can act as the knowledge admin.

For more information, see [Topic experiences roles](topic-experiences-roles.md).

## <a name="topic-management"></a>Topic management

Topics are managed on the **Manage Topics** page in your organization's topic center. The topic center is created during setup and serves as the organization's knowledge center.

While every licensed user can see the topics they are connected to in the topic center, only users with topic management permissions (knowledge managers) can see and use the Manage Topics page.

Knowledge managers can:

- Confirm or reject topics discovered in the tenant.
- Manually create new topics when needed (for example, when there is not enough information for a topic to be discovered through AI).
- Edit existing topic pages.</br>

For more information, see [Manage topics in the topic center](manage-topics.md).

## <a name="admin-controls"></a>Admin controls

The admin controls in the Microsoft 365 admin center let you manage your knowledge network. A Microsoft 365 global or SharePoint administrator can:

- Control which users in the organization can see topic highlights on SharePoint modern pages or in SharePoint search results.
- Control which SharePoint sites are crawled to discover topics.
- Configure topic discovery so that specific topics are not found.
- Control which users can manage topics in the topic center.
- Control which users can create and edit topics in the topic center.
- Control which users can view topics.

For more information about the admin controls, see [Assign user permissions](https://docs.microsoft.com/microsoft-365/knowledge/plan-topic-experiences#user-permissions), [Manage topic visibility](https://docs.microsoft.com/microsoft-365/knowledge/topic-experiences-discovery), and [Manage topic discovery](https://docs.microsoft.com/microsoft-365/knowledge/topic-experiences-knowledge-rules).

## <a name="topic-curation--feedback"></a>Topic curation & feedback

AI continuously works to provide suggestions for improving topics as changes occur in your environment.

Users who are allowed to see topics in their daily work may be asked whether a given topic was useful to them. AI examines these responses to help determine what is shown in the topic summary and the topic details.

Users with permission to edit or create topics can update a topic page directly when they want to correct it or add more information.

In addition, users with the appropriate permissions can tag items, such as Yammer conversations, that are relevant to a topic and add them to that topic.

## <a name="see-also"></a>See also
*File: reports/report3_Micronesia.md (repo: liamcosgrove30/EdiNapSoftwareMethodsGroupA, license: Apache-2.0)*

# All the countries in a region (Micronesia) organised from largest population to smallest
| Code | Name | Continent | Region | Population | Capital |
| :--- | :--- | :--- | :--- | :--- | :--- |
|GUM|Guam|Oceania|Micronesia|168000|Agaña|
|FSM|Micronesia, Federated States of|Oceania|Micronesia|119000|Palikir|
|KIR|Kiribati|Oceania|Micronesia|83000|Bairiki|
|MNP|Northern Mariana Islands|Oceania|Micronesia|78000|Garapan|
|MHL|Marshall Islands|Oceania|Micronesia|64000|Dalap-Uliga-Darrit|
|PLW|Palau|Oceania|Micronesia|19000|Koror|
|NRU|Nauru|Oceania|Micronesia|12000|Yaren|
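The report reads like the result of an SQL query over a `country` table (an assumption; how the report was generated is not shown in this file). A self-contained sketch of the query shape, using an in-memory SQLite stand-in with a few of the rows above:

```python
import sqlite3

# Tiny in-memory stand-in for a world-style `country` table (assumed schema;
# the real report is generated elsewhere in the repository).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE country "
             "(Code TEXT, Name TEXT, Continent TEXT, Region TEXT, "
             "Population INTEGER, Capital TEXT)")
conn.executemany("INSERT INTO country VALUES (?, ?, ?, ?, ?, ?)", [
    ("NRU", "Nauru", "Oceania", "Micronesia", 12000, "Yaren"),
    ("GUM", "Guam", "Oceania", "Micronesia", 168000, "Agana"),
    ("FSM", "Micronesia, Federated States of", "Oceania", "Micronesia", 119000, "Palikir"),
])

# Largest population first, exactly as in the report above.
report = conn.execute(
    "SELECT Code, Name, Continent, Region, Population, Capital FROM country "
    "WHERE Region = 'Micronesia' ORDER BY Population DESC").fetchall()
for row in report:
    print("|" + "|".join(str(v) for v in row) + "|")
```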
*File: README.md (repo: blankensteiner/serialization, license: MIT)*

# Serialization
---
> NOTICE: This project is alpha and therefore there are currently no NuGet packages.
---
Deserializing and serializing DTOs is something every serialization framework supports, but for DDD practitioners there's a different need when persisting aggregates.
This project will let you serialize and deserialize private and readonly fields without calling a constructor.
## Philosophy
Freedom, abstraction and version awareness are the three cornerstones of this project.
### Freedom
We want the freedom to model our aggregates as we see fit and not having to pull in weavers or NuGet packages into our domain project. Simply put, we want:
* No forced attributes
* No forced inheritance
* No forced constructors
* No forced conventions
In short, we just don't want any constraints.
### Serialization abstractions
Microsoft has done a great job in creating interfaces for [logging](https://www.nuget.org/packages/Microsoft.Extensions.Logging.Abstractions/) and [dependency injection](https://www.nuget.org/packages/Microsoft.Extensions.DependencyInjection.Abstractions/) and soon we will also have vendor-agnostic interfaces for metrics and tracing thanks to [OpenTelemetry](https://opentelemetry.io/).
We feel that a similar standard is needed for serialization and therefore we have created Serialization.Abstractions, defining two interfaces. ISerializer for (de)serializing all types and ISerializer\<T\> for (de)serializing a specific type. There is a small performance improvement when using ISerializer\<T\>.
### Version awareness
As more and more are moving to document-style persistence, doing canary/rolling deployments and having larger and larger databases the need for versioning your aggregates increase and so does your ability to handle multiple versions. This is a feature that's a good fit for a serialization framework but it's sadly missing in most other frameworks.
We would like to make it easier to
* Load an older version and deserialize it into the current version.
* Load a newer version and merge the changes back into the data, to prevent deleting new and unknown fields.
## What encodings are supported?
We have started with JSON (Serialization.MicrosoftJson) building upon [Utf8JsonReader](https://docs.microsoft.com/en-us/dotnet/api/system.text.json.utf8jsonreader?view=netcore-3.1) and [Utf8JsonWriter](https://docs.microsoft.com/en-us/dotnet/api/system.text.json.utf8jsonwriter?view=netcore-3.1) from [System.Text.Json](https://www.nuget.org/packages/System.Text.Json/).
At some point we will support [protobuf](https://developers.google.com/protocol-buffers) (using [protobuf-net](https://github.com/protobuf-net/protobuf-net)) and [MsgPack](https://msgpack.org/) (using [MessagePack](https://github.com/neuecc/MessagePack-CSharp)).
## Getting Started
As mentioned, there are currently no NuGet packages, but let's have a look at how to get an ISerializer or ISerializer\<T\>.
### With manual creation
Let's start with building an ISerializer.
```csharp
var builder = new SerializerBuilder();
builder.AddType<ClosedClass>();
var serializer = builder.Build();
```
If you then want a type-specific serializer (ISerializer\<T\>) all you have to do is.
```csharp
var serializerOfT = serializer.GetSerializerFor<ClosedClass>();
```
### With dependency injection
In the Serialization.MicrosoftJson.DependencyInjection namespace there is an IServiceCollection extension for setting up serialization.
```csharp
await Host.CreateDefaultBuilder(args)
.ConfigureServices(services =>
{
services.AddSerialization((options, builder) =>
{
builder.AddType<ClosedClass>();
});
services.AddHostedService<MyService>();
})
.Build()
.RunAsync();
```
All there is left to do now is to either add ISerializer or ISerializer\<T\> to your services.
```csharp
public sealed class MyService : BackgroundService
{
public MyService(ISerializer serializer, ISerializer<ClosedClass> serializerOfT) { }
protected override Task ExecuteAsync(CancellationToken stoppingToken)
=> throw new NotImplementedException();
}
```
## Supported types
We are just getting started but currently, we have support for:
- [X] Boolean
- [X] Signed and unsigned byte
- [X] Signed and unsigned short
- [X] Signed and unsigned integer
- [X] Signed and unsigned long
- [X] Float
- [X] Double
- [X] Decimal
- [X] String
- [X] Guid
- [X] DateTime
- [X] DateTimeOffset
- [X] Nullable<>
- [X] Enum
## Roadmap
Next we will look into supporting:
- [ ] Char
- [ ] TimeSpan
- [ ] Uri
- [ ] Custom types
- [ ] Array[] and collections
- [ ] Dictionary
- [ ] Version?
- [ ] Complex?
- [ ] BigInteger?
- [ ] BitArray?
## Benchmarks
Looking at performance is always exciting so naturally, we created benchmarks to measure serialization and deserialization.
Microsoft's [JsonSerializer](https://docs.microsoft.com/en-us/dotnet/api/system.text.json.jsonserializer?view=netcore-3.1) is using a DTO called OpenClass (consisting only of public properties with public getters and setters) and we use a ClosedClass (consisting only of private and readonly fields).
As you can see below, the performance and GC load are the same.
### Serializing
```csharp
[Benchmark] public byte[] System_Text_Json_JsonSerializer()
=> System.Text.Json.JsonSerializer.SerializeToUtf8Bytes(_openClass);
[Benchmark] public byte[] Serialization_MicrosoftJson_Serializer()
=> _serializer.Serialize(_closedClass);
[Benchmark] public byte[] Serialization_MicrosoftJson_Serializer_Of_T()
=> _serializerOfT.Serialize(_closedClass);
```
``` ini
BenchmarkDotNet=v0.12.0, OS=Windows 10.0.18363
Intel Core i5-7500 CPU 3.40GHz (Kaby Lake), 1 CPU, 4 logical and 4 physical cores
.NET Core SDK=3.1.200
[Host] : .NET Core 3.1.2 (CoreCLR 4.700.20.6602, CoreFX 4.700.20.6702), X64 RyuJIT
DefaultJob : .NET Core 3.1.2 (CoreCLR 4.700.20.6602, CoreFX 4.700.20.6702), X64 RyuJIT
```
| Method | Mean | Error | StdDev | Rank | Gen 0 | Gen 1 | Gen 2 | Allocated |
|-------------------------------------------- |---------:|----------:|----------:|-----:|-------:|------:|------:|----------:|
| Serialization_MicrosoftJson_Serializer_Of_T | 1.611 us | 0.0072 us | 0.0063 us | 1 | 0.1469 | - | - | 464 B |
| System_Text_Json_JsonSerializer | 1.659 us | 0.0192 us | 0.0170 us | 2 | 0.1469 | - | - | 464 B |
| Serialization_MicrosoftJson_Serializer | 1.738 us | 0.0068 us | 0.0057 us | 3 | 0.1450 | - | - | 464 B |
Always nice to be #1, but honestly even the performance difference between #1 and #3 practically doesn't matter.
### Deserializing
```csharp
[Benchmark] public OpenClass System_Text_Json_JsonSerializer()
=> System.Text.Json.JsonSerializer.Deserialize<OpenClass>(_bytes);
[Benchmark] public ClosedClass Serialization_MicrosoftJson_Serializer()
=> _serializer.Deserialize<ClosedClass>(_bytes);
[Benchmark] public ClosedClass Serialization_MicrosoftJson_SerializerOfT()
=> _serializerOfT.Deserialize(_bytes);
```
``` ini
BenchmarkDotNet=v0.12.0, OS=Windows 10.0.18363
Intel Core i5-7500 CPU 3.40GHz (Kaby Lake), 1 CPU, 4 logical and 4 physical cores
.NET Core SDK=3.1.200
[Host] : .NET Core 3.1.2 (CoreCLR 4.700.20.6602, CoreFX 4.700.20.6702), X64 RyuJIT
DefaultJob : .NET Core 3.1.2 (CoreCLR 4.700.20.6602, CoreFX 4.700.20.6702), X64 RyuJIT
```
| Method | Mean | Error | StdDev | Rank | Gen 0 | Gen 1 | Gen 2 | Allocated |
|------------------------------------------ |---------:|----------:|----------:|-----:|-------:|------:|------:|----------:|
| Serialization_MicrosoftJson_SerializerOfT | 2.556 us | 0.0135 us | 0.0126 us | 1 | 0.0496 | - | - | 160 B |
| System_Text_Json_JsonSerializer | 2.587 us | 0.0091 us | 0.0085 us | 1 | 0.0496 | - | - | 160 B |
| Serialization_MicrosoftJson_Serializer | 2.637 us | 0.0123 us | 0.0115 us | 2 | 0.0496 | - | - | 160 B |
Notice how [BenchmarkDotNet](https://github.com/dotnet/BenchmarkDotNet) rank both Serialization_MicrosoftJson_SerializerOfT and System_Text_Json_JsonSerializer as #1, because the performance difference is insignificant.
## Versioning
We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository](https://github.com/blankensteiner/serialization/tags).
## Authors
* **Daniel Blankensteiner** - *Initial work*
See also the list of [contributors](https://github.com/blankensteiner/serialization/contributors) who participated in this project.
## License
This project is licensed under the MIT License (MIT) - see the [LICENSE](LICENSE) file for details.
*File: samples/react-pnp-js-sample/README.md (repo: Ramakrishnan24689/sp-dev-fx-webparts, license: MIT)*

---
page_type: sample
products:
- office-sp
languages:
- javascript
- typescript
extensions:
contentType: samples
technologies:
- SharePoint Framework
platforms:
- react
createdDate: 5/1/2017 12:00:00 AM
---
# SharePoint Framework sample using @pnp/js and ReactJS
## Summary
This solution builds off of the solution [react-async-await-sp-pnp-js](./react-async-await-sp-pnp-js) submitted by Jose Quinto ([@jquintozamora](https://twitter.com/jquintozamora) , [blog.josequinto.com](https://blog.josequinto.com))
This implementation refactors that solution to utilize and showcase PnPjs version 3.

## Compatibility




-Incompatible-red.svg "SharePoint Server 2016 Feature Pack 2 requires SPFx 1.1")


## Applies to
* [SharePoint Framework](https://docs.microsoft.com/sharepoint/dev/spfx/sharepoint-framework-overview)
* [Microsoft 365 developer tenant](https://docs.microsoft.com/sharepoint/dev/spfx/set-up-your-developer-tenant)
## Solution
Solution|Author(s)
--------|---------
react-spfx-pnp-js-sample | Julie Turner ([@jfj1997](https://twitter.com/jfj1997))
## Version history
Version|Date|Comments
-------|----|--------
1.0|Jan 13, 2022|Initial release
## Minimal Path to Awesome
1. clone this repo
1. `$ npm i`
1. Update online workbench url in the `initialPage` property of the `config/serve.json` file.
1. `$ gulp serve`
## Features
* Establishing context for the SharePoint Factory Interface
* Creating a project config file to centralize the PnPjs imports and the definition of the SharePoint Queryable object for reuse.
* Demo extending the SharePoint Queryable instance with the PnPLogging behavior.
* Demo extending the SharePoint Queryable instance with the Caching behavior.
* Demo loading list items from a SharePoint library.
* Demo creating a batched instance of the SharePoint Queryable object.
* Demo updating list items by modifying the Title property.
* Demo executing a batch and working with the results.
## Disclaimer
**THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT.**
<img src="https://pnptelemetry.azurewebsites.net/sp-dev-fx-webparts/samples/react-async-await-sp-pnp-js" />
*File: u/uebkbernav/index.md (repo: nnworkspace/gesetze, license: Unlicense)*

---
Title: Ordinance concerning the implementation of the Convention concluded at Berne
  on 9 September 1886 for the creation of an international union for the protection
  of works of literature and art
jurabk: ÜbkBernAV
layout: default
origslug: _bkbernav
slug: uebkbernav
---
# Ordinance concerning the implementation of the Convention concluded at Berne on 9 September 1886 for the creation of an international union for the protection of works of literature and art (ÜbkBernAV)

Date of issue
: 1897-11-29

Source
: RGBl: 1897, 787

## Enacting clause

We Wilhelm, ... German Emperor, King of Prussia, etc., ordain in the name of the Reich, on the basis of the Act of 4 April 1888 concerning the implementation of the Convention concluded at Berne on 9 September 1886 for the creation of an international union for the protection of works of literature and art (Reichsgesetzbl. p. 139), with the consent of the Bundesrat having been obtained, as follows:*

## § 1

Where special agreements concluded with other countries of the Union concerning the protection of works of literature and art cease to be in force, the application of the Convention to works which until then were to be treated in accordance with those agreements and which, at the time the Convention entered into force, had not yet become public domain in their country of origin (Article 14 of the Convention) is subject to the following restrictions:

1. The printing of copies whose production was lawfully in progress at the time the agreement ceased to be in force may be completed; these copies, as well as those lawfully produced by that time, may be distributed and sold. Likewise, the appliances existing at that time (moulds, plates, stones, stereotypes, etc.) may still be used for four years; this period begins at the end of the year in which the agreement ceased to be in force.

2. Works published in one of the other countries of the Union before the agreement ceased to be in force do not enjoy the protection of the exclusive translation right provided for in Article 5 of the Convention as against translations which, by that time, had already been lawfully published in Germany in whole or in part.

3. Dramatic or dramatico-musical works which have been published or performed in one of the other countries of the Union and which, before the agreement ceased to be in force, were lawfully performed in public in Germany in the original or in translation do not enjoy protection against unauthorized performance in the original or in a translation.

## § 2

The authorization granted in § 1 No. 1 to distribute and sell copies and to use appliances is subject to the condition that the copies and appliances bear a special stamp. Stamping is permitted only until the expiry of three months; this period begins at the end of the month in which the agreement ceased to be in force. ...

## § 3

This ordinance enters into force on the day of its promulgation.
*File: articles/lab-services/how-to-configure-lab-accounts.md (repo: Artaggedon/azure-docs.es-es, licenses: CC-BY-4.0, MIT)*

---
title: Configure automatic VM shutdown in Azure Lab Services
description: This article describes how to configure automatic shutdown of VMs in the lab account.
ms.topic: article
ms.date: 08/17/2020
ms.openlocfilehash: c0a147a81aaed88313a1b9aa4b0754d9a3badcb5
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 03/29/2021
ms.locfileid: "91650041"
---
# <a name="configure-automatic-shutdown-of-vms-for-a-lab-account"></a>Configure automatic shutdown of VMs for a lab account

When virtual machines are not being used, you can enable several automatic-shutdown cost control features to prevent additional costs. The combination of the following three automatic shutdown and disconnect features catches most of the cases in which users accidentally leave their virtual machines running:

- Automatically disconnect users from virtual machines that the operating system deems idle.
- Automatically shut down virtual machines when users disconnect.
- Automatically shut down virtual machines that are started but that users never connect to.

You can find more details about the automatic shutdown features in [Maximize cost control with automatic shutdown settings](cost-management-guide.md#automatic-shutdown-settings-for-cost-control).

## <a name="enable-automatic-shutdown"></a>Enable automatic shutdown

1. In the [Azure portal](https://portal.azure.com/), navigate to the **Lab Account** page.
1. Select **Labs settings** on the left menu.
1. Select the automatic shutdown options appropriate for your scenario.

> [!div class="mx-imgBorder"]
> 

These settings apply to all labs created in the lab account. A lab creator (educator) can override these settings at the lab level. A change to these settings at the lab account level will only affect the labs created afterwards.

To disable the settings, clear the checkboxes on this page.

## <a name="next-steps"></a>Next steps

To learn how a lab owner can configure or override these settings at the lab level, see [Configure automatic shutdown of VMs for a lab](how-to-enable-shutdown-disconnect.md).
*File: README.md (repo: pvtamh2bg/merge-master, license: MIT)*

# merge-master
Remove branches that have been merged into master.
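A minimal sketch of what "remove branches merged to master" can mean in plain Git (an illustration only, not necessarily this repository's actual script), demonstrated in a throwaway repository:

```shell
# Work in a throwaway repo so nothing real is touched.
tmp=$(mktemp -d) && cd "$tmp"
git init -q .
git symbolic-ref HEAD refs/heads/master
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "init"
git branch feature-done   # points at master's tip, so it counts as merged

# Delete every local branch already merged into master (master itself is skipped).
git branch --merged master | grep -vE '^\*|master' | xargs -r git branch -d

git branch                # only master remains
```

`git branch --merged master` lists branches whose tips are reachable from master, which is exactly the set that is safe to delete with `-d`.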
*File: README.md (repo: FrankysWeb/Exchange-Server-Hostnames, license: MIT)*

# Exchange Server Hostnames
This script returns all configured Exchange Server hostnames.
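The script itself is not shown in this README. As an illustration only, here is a sketch of the kind of Exchange Management Shell cmdlets such a script might call to collect configured host names (the cmdlets below exist in Exchange 2013-2019, but the actual script in this repository may work differently):

```powershell
# Illustrative sketch: gather host names from common client-access endpoints.
# Run from the Exchange Management Shell; this is an assumption about the approach.
$vdirs = @()
$vdirs += Get-OwaVirtualDirectory | Select-Object InternalUrl, ExternalUrl
$vdirs += Get-WebServicesVirtualDirectory | Select-Object InternalUrl, ExternalUrl
$vdirs += Get-ActiveSyncVirtualDirectory | Select-Object InternalUrl, ExternalUrl

$hostnames = foreach ($v in $vdirs) {
    foreach ($url in @($v.InternalUrl, $v.ExternalUrl)) {
        if ($url) { ([System.Uri]$url).Host }   # keep only the host part
    }
}
$hostnames | Sort-Object -Unique
```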
## Exchange Server Versions
- Exchange Server 2013
- Exchange Server 2016
- Exchange Server 2019
## Website
[FrankysWeb](https://www.frankysweb.de/)
*File: README.md (repo: kyleprr/Capstone-Project-Surgical-Robot, license: MIT)*

# Engineering Capstone Project - AUT-21-04348
## Robot Arm Control for Minimally Invasive Surgery
This repository demonstrates a robot arm controller system for the UR3 robotic arm. A Python script was developed to control the robotic arm within a simulated environment inside Gazebo.
The following repository assumes that [ROS](https://www.ros.org/) is installed and setup on a linux machine. This package has been tested using `ROS Kinetic` running on a `Ubuntu 16.04 LTS` machine.
## Simulated Environment [Gazebo & RViz]
### Planned Environment Layout:
<img src="https://github.com/kyleprr/Capstone-Project-Surgical-Robot/blob/main/media/planned-environment-layout.jpg" width="850">
### Final Environment Layout (Due to computer hardware issues):
<img src="https://github.com/kyleprr/Capstone-Project-Surgical-Robot/blob/main/media/Simulation-full.gif" width="850">
## Control System
1. [`surgical_robot_controller.py`](https://github.com/kyleprr/Capstone-Project-Surgical-Robot/blob/main/surgical_robot/scripts/surgical_robot_controller.py) publishes waypoints for the robot arm to follow in sequence to perform the required surgery.
2. [`move_robot_arm.py`](https://github.com/kyleprr/Capstone-Project-Surgical-Robot/blob/main/surgical_robot/scripts/move_robot_arm.py) is not part of the project itself, but a short script used to test the movement of the robotic arm and ensure it has been calibrated correctly.
3. A short video has been put together to show how to run the launch file and control system: https://youtu.be/LHtCcleCMA4
4. A useful tool to visualise the UR robot movements can be found here: https://cyberbotics.com/doc/guide/ure
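At its core, the controller described above is a waypoint sequencer. The idea can be sketched without any ROS dependencies — the joint values below are illustrative placeholders, not the project's real calibration data, and in the actual script each configuration would be published to the arm rather than printed:

```python
# ROS-free sketch of waypoint sequencing; joint angles are made-up values.

# Each waypoint is a tuple of six joint angles (radians) for the UR3.
WAYPOINTS = [
    (0.0, -1.57, 0.0, -1.57, 0.0, 0.0),   # home
    (0.5, -1.20, 0.4, -1.57, 0.0, 0.0),   # approach the workspace
    (0.5, -1.00, 0.6, -1.57, 0.0, 0.0),   # engage the tool
]

def interpolate(start, end, steps):
    """Yield `steps` configurations moving linearly from start to end."""
    for i in range(1, steps + 1):
        if i == steps:
            yield end  # land exactly on the target, avoiding float drift
        else:
            t = i / steps
            yield tuple(a + (b - a) * t for a, b in zip(start, end))

def plan(waypoints, steps_per_segment=4):
    """Expand the coarse waypoint list into a dense joint-space trajectory."""
    trajectory = [waypoints[0]]
    for start, end in zip(waypoints, waypoints[1:]):
        trajectory.extend(interpolate(start, end, steps_per_segment))
    return trajectory

if __name__ == "__main__":
    traj = plan(WAYPOINTS)
    # 1 initial point + 2 segments x 4 interpolated points each
    print(len(traj))  # → 9
```

In the real controller, each configuration in the resulting trajectory would be sent to the arm in sequence (e.g. as trajectory points on the appropriate ROS topic); the sequencing logic stays the same.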
## How to use this repository
- This project was developed and tested in [Ubuntu 16.04 LTS (Xenial Xerus)](https://releases.ubuntu.com/16.04/) with `ROS Kinetic`.
- Make sure you have installed [Python2.7](https://www.python.org/download/releases/2.7/) and the required packages & libraries listed below:
Install Universal Robots ROS Packages:
```
$ sudo apt-get install ros-kinetic-universal-robot
```
Install Universal Robots ROS Drivers:
```
$ git clone https://github.com/UniversalRobots/Universal_Robots_ROS_Driver.git universal_robots_ros_driver
$ git clone -b calibration_devel https://github.com/fmauch/universal_robot.git universal_robot
```
- Install [ROS Kinetic](http://wiki.ros.org/kinetic/Installation/Ubuntu), Gazebo, RViz, Universal Robots and MoveIt packages.
- Create a workspace named `surgical_robot_ws` and download the repository to `surgical_robot_ws/src/`
```
$ cd surgical_robot_ws/src
$ git clone https://github.com/kyleprr/Capstone-Project-Surgical-Robot.git
```
- Build the code under the `surgical_robot_ws/` directory:
```
$ catkin build
$ source devel/setup.bash
```
- Run the code with ROS, Gazebo & RViz
```
$ roslaunch surgical_robot surgical_UR3.launch
```
- Run the controller code (~/surgical_robot_ws/src/surgical_robot/scripts)
```
$ python surgical_robot_controller.py
```
| 49.327869 | 265 | 0.769359 | eng_Latn | 0.82821 |
1c099e6e0c57bac4c645703ec129061b9f62c0ce | 3,159 | md | Markdown | README.md | fm4teus/nlw-2 | a5017fb1d3cb45f0a0750e582d29b6ce408b3121 | [
"MIT"
] | null | null | null | README.md | fm4teus/nlw-2 | a5017fb1d3cb45f0a0750e582d29b6ce408b3121 | [
"MIT"
] | null | null | null | README.md | fm4teus/nlw-2 | a5017fb1d3cb45f0a0750e582d29b6ce408b3121 | [
"MIT"
] | null | null | null | # Proffy
[](https://www.linkedin.com/in/fm4teus)
[](https://github.com/fm4teus/nlw-2)
[](https://github.com/fm4teus/nlw-2/blob/master/LICENSE)
## 📑 About
Developed during the 2nd edition of [Rocketseat](https://rocketseat.com.br)'s Next Level Week, this project helps students and teachers to connect.
<img src="./design/home-web.png"/>
<div align="center">
💻 [Frontend](https://github.com/fm4teus/nlw-2/tree/master/web) |
💾 [Backend](https://github.com/fm4teus/nlw-2/tree/master/server) |
📱 [Mobile](https://github.com/fm4teus/nlw-2/tree/master/mobile)
</div>
## 📚 Functionality
- Students can search for teachers filtering by subject, day of the week and/or time.
- Teachers can register their classes providing info about subjects, schedules, prices and contact.
## 🛠 Updates
Next Level Week was an amazing opportunity to improve my knowledge of some powerful technologies. Nevertheless, there is still a lot of room for improvement. In this section, I'll keep you updated as new features are introduced.
- 🔍 Search Filters are now optional. You can display all teachers and search for those that match all parameters or at least some of them.
## 🎨 Design
<table style="border: none">
<tr>
<td colspan="2">Desktop</td>
<td colspan="2">Mobile</td>
</tr>
<tr>
<td><img src="./design/list-web.png" width=300 /></td><td><img src="./design/form-web.png" width=300 /></td>
<td><img src="./design/home-mobile.png" width=180 /></td><td><img src="./design/list-mobile.png" width=180 /></td>
</tr>
</table>
Design made by [Tiago Luchtenberg](https://www.instagram.com/tiagoluchtenberg/).
## 📥 How to use
- Clone this repository: `git clone https://github.com/fm4teus/nlw-2.git`
To run the application just open the directories and execute the commands below:
- Install dependencies: `yarn install`
- If your database isn't set yet, run `yarn knex:migrate` on the [Server](https://github.com/fm4teus/nlw-2/tree/master/server)
- Run application: `yarn start`
Note that the [Web](https://github.com/fm4teus/nlw-2/tree/master/web) folder contains the responsive application developed for the browser. The
[Mobile](https://github.com/fm4teus/nlw-2/tree/master/mobile) folder contains the application for mobile devices. The [Server](https://github.com/fm4teus/nlw-2/tree/master/server) folder contains the project backend, which must always be running for both applications to work as expected.
## 🚀 Built With
- [ReactJS](https://reactjs.org/)
- [React Native](https://reactnative.dev/)
- [Node.js](https://nodejs.org/en/)
- [TypeScript](https://www.typescriptlang.org)
## 📕 License
The software is available under the [MIT License](https://github.com/fm4teus/nlw-2/blob/master/LICENSE).
E-mail: <a href="mailto:fm4teus@gmail.com">fm4teus@gmail.com</a> |
LinkedIn: <a href="https://www.linkedin.com/in/fm4teus/" target="_blank">fm4teus</a>
| 44.492958 | 288 | 0.729028 | eng_Latn | 0.630946 |
1c0a5766da2dc06a6ecbf8e48ee19f60b335a905 | 730 | md | Markdown | plugins/backend/backstage-plugin-argo-cd-backend/CHANGELOG.md | Rugvip/roadie-backstage-plugins | faf6526f195df1fd8cdd6af52b051a34604ddf9d | [
"Apache-2.0"
] | 54 | 2021-06-21T18:59:02.000Z | 2022-03-28T10:58:52.000Z | plugins/backend/backstage-plugin-argo-cd-backend/CHANGELOG.md | Rugvip/roadie-backstage-plugins | faf6526f195df1fd8cdd6af52b051a34604ddf9d | [
"Apache-2.0"
] | 314 | 2021-06-21T15:14:06.000Z | 2022-03-31T17:42:53.000Z | plugins/backend/backstage-plugin-argo-cd-backend/CHANGELOG.md | Rugvip/roadie-backstage-plugins | faf6526f195df1fd8cdd6af52b051a34604ddf9d | [
"Apache-2.0"
] | 39 | 2021-08-02T21:48:23.000Z | 2022-03-30T16:02:06.000Z | # @roadiehq/backstage-plugin-argo-cd-backend
## 1.2.2
### Patch Changes
- 49abec7: Update patch to release new changes.
## 1.2.1
### Patch Changes
- a728fd1: Update underlying packages and release.
## 1.2.0
### Minor Changes
- ed90f25: Breaking dependency updates for @backstage/core-app-api, @backstage/test-utils, @backstage/core-plugin-api, @backstage/backend-common && @backstage/integration
## 1.1.1
### Patch Changes
- 773692a: Change default port of backend from 7000 to 7007.
This is due to the AirPlay Receiver process occupying port 7000 and preventing local Backstage instances on macOS from starting.
## 1.1.0
### Minor Changes
- 1d256c6: Support multiple Argo instances using the app-selector annotation
| 21.470588 | 170 | 0.738356 | eng_Latn | 0.900535 |
1c0a60ad4691a1389291bbcb1b4e5890eaeabbe5 | 68 | md | Markdown | README.md | summerSpyDad/test_repo | ca7982182833bbce5c34f0460db67297fa8d2d62 | [
"CC0-1.0"
] | null | null | null | README.md | summerSpyDad/test_repo | ca7982182833bbce5c34f0460db67297fa8d2d62 | [
"CC0-1.0"
] | null | null | null | README.md | summerSpyDad/test_repo | ca7982182833bbce5c34f0460db67297fa8d2d62 | [
"CC0-1.0"
] | null | null | null | # test_repo
## Editing the file
A Markdown file in the repository
| 11.333333 | 33 | 0.75 | eng_Latn | 0.938805 |
1c0abb186c2d6eb9debaf76e1dcc6733a548023f | 2,666 | md | Markdown | README.md | keawade/nedl-dockerfile | ce7f64b85d4649f55eb4cd8cbc23bc4d82589ca4 | [
"MIT"
] | null | null | null | README.md | keawade/nedl-dockerfile | ce7f64b85d4649f55eb4cd8cbc23bc4d82589ca4 | [
"MIT"
] | null | null | null | README.md | keawade/nedl-dockerfile | ce7f64b85d4649f55eb4cd8cbc23bc4d82589ca4 | [
"MIT"
] | 1 | 2020-12-03T22:34:34.000Z | 2020-12-03T22:34:34.000Z | # NeDL Dockerfile Demo
## Getting Started
```bash
git clone https://github.com/keawade/nedl-dockerfile.git
cd nedl-dockerfile
npm ci
npm run start:watch
```
## Package Scripts and Helpful Commands
### `docker:build`
Build Docker image from a Dockerfile and context.
```bash
docker build . --tag nedl
```
- `docker build`: [Build an image from a Dockerfile.](https://docs.docker.com/engine/reference/commandline/build/)
- `.`: Context. In this case, path to our current directory.
- `--tag nedl`: Specifies the image name. (A version tag can optionally be appended as `name:tag`, but I have not done that here.)
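The Dockerfile is what `docker build` consumes. As a rough illustration only (the actual file in this repository may differ), a Node service exposing port 8080 is typically containerized like this:

```dockerfile
# Hypothetical sketch — see the repository's real Dockerfile.
FROM node:14-alpine
WORKDIR /app
# Copy the manifests first so the dependency layer is cached between builds.
COPY package*.json ./
RUN npm ci
COPY . .
EXPOSE 8080
CMD ["npm", "run", "start:watch"]
```

With a file like this at the repo root, `docker build . --tag nedl` produces the `nedl` image used by the commands in the following sections.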
### `docker:run`
Creates a writeable container layer over the specified image and then starts it.
```bash
docker run --detach --publish 8080:8080 --name nedl-demo nedl:latest
```
- `docker run`: [Run a command in a new container.](https://docs.docker.com/engine/reference/commandline/run/)
- `--detach`: Runs container in the background.
- `--publish 8080:8080`: Publishes the container's internal port `8080` on our local machine's port `8080`.
- `--name nedl-demo`: Runs with the name `nedl-demo`. This name can later be referenced by other commands.
- `nedl:latest`: Runs the latest tag of the `nedl` image.
### `docker:stop`
Stop and remove running container.
```bash
docker stop nedl-demo && docker rm nedl-demo
```
- `docker stop`: [Stop one or more running containers.](https://docs.docker.com/engine/reference/commandline/stop/)
- `nedl-demo`: Reference to container. In this case, the name we provided to `docker run`.
- `docker rm`: [Remove one or more containers.](https://docs.docker.com/engine/reference/commandline/rm/)
### `docker:shell`
Open a shell in a running container.
```bash
docker exec --interactive --tty nedl-demo sh
```
- `docker exec`: [Run a command in a running container.](https://docs.docker.com/engine/reference/commandline/exec/)
- `--interactive`: Keep STDIN open even if not attached
- `--tty`: Allocate a pseudo-TTY
- `nedl-demo`: Reference to container. In this case, the name we provided to `docker run`.
- `sh`: Command we are running. In this case, the simple shell program `sh`. Note: The command must
exist in the container to be run.
### Stop and remove all running containers
```bash
docker rm --force $(docker ps --all --quiet)
```
- `docker rm`: [Remove one or more containers.](https://docs.docker.com/engine/reference/commandline/rm/)
- `--force`: Force removal of running containers.
- `docker ps`: [List containers.](https://docs.docker.com/engine/reference/commandline/ps/)
- `--all`: Show all containers, not just running containers.
- `--quiet`: Only display numeric IDs (which we can use to remove them).
| 34.179487 | 116 | 0.72018 | eng_Latn | 0.948927 |
1c0bb292e84dbda83258c5f2d8ef46d8b583ac9f | 5,752 | md | Markdown | README.md | geekayush/website | f623fcb2ac530ed88eca638d36314b5da8e10fae | [
"MIT"
] | null | null | null | README.md | geekayush/website | f623fcb2ac530ed88eca638d36314b5da8e10fae | [
"MIT"
] | null | null | null | README.md | geekayush/website | f623fcb2ac530ed88eca638d36314b5da8e10fae | [
"MIT"
] | null | null | null | <p align="center">
<img alt="Open Climate Fix" src="https://raw.githubusercontent.com/openclimatefix/website/master/src/images/logo_dark_square%402x.png" width="150" />
</p>
<h1 align="center">
Open Climate Fix Website
</h1>
[](#contributors-) [](https://opencollective.com/openclimatefix) [](https://github.com/openclimatefix/website/issues)
Kick off your project with this default boilerplate. This starter ships with the main Gatsby configuration files you might need to get up and running blazing fast with the blazing fast app generator for React.
## 🚀 Quick start
1. **Install.**
```sh
yarn install
yarn run develop
```
2. **Open the source code and start editing!**
The site is now running at `http://localhost:8000`!
_Note: You'll also see a second link: _`http://localhost:8000/___graphql`_. This is a tool you can use to experiment with querying data. Learn more about using this tool in the [Gatsby tutorial](https://www.gatsbyjs.org/tutorial/part-five/#introducing-graphiql)._
Open the `openclimatefix.github.io` directory in your code editor of choice and edit `src/pages/index.js`. Save your changes and the browser will update in real time!
This website is built using Gatsby. Full documentation for Gatsby lives [on the website](https://www.gatsbyjs.org/).
## 💫 Deploy
This page is auto-deployed on GitHub Pages.
## ✨ Contributors
Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore -->
<table>
<tr>
<td align="center"><a href="http://jack-kelly.com"><img src="https://avatars2.githubusercontent.com/u/460756?v=4" width="100px;" alt="Jack Kelly"/><br /><sub><b>Jack Kelly</b></sub></a><br /><a href="#business-JackKelly" title="Business development">💼</a> <a href="https://github.com/openclimatefix/website/commits?author=JackKelly" title="Code">💻</a> <a href="#content-JackKelly" title="Content">🖋</a> <a href="#ideas-JackKelly" title="Ideas, Planning, & Feedback">🤔</a></td>
<td align="center"><a href="https://github.com/FWirtz"><img src="https://avatars1.githubusercontent.com/u/6052785?v=4" width="100px;" alt="Flo"/><br /><sub><b>Flo</b></sub></a><br /><a href="https://github.com/openclimatefix/website/commits?author=FWirtz" title="Code">💻</a> <a href="#ideas-FWirtz" title="Ideas, Planning, & Feedback">🤔</a> <a href="#maintenance-FWirtz" title="Maintenance">🚧</a> <a href="#projectManagement-FWirtz" title="Project Management">📆</a></td>
<td align="center"><a href="http://tanner.me"><img src="https://avatars2.githubusercontent.com/u/227?v=4" width="100px;" alt="Damien Taner"/><br /><sub><b>Damien Taner</b></sub></a><br /><a href="#blog-dctanner" title="Blogposts">📝</a></td>
<td align="center"><a href="https://github.com/schillerk"><img src="https://avatars1.githubusercontent.com/u/8676510?v=4" width="100px;" alt="Kyle Schiller"/><br /><sub><b>Kyle Schiller</b></sub></a><br /><a href="https://github.com/openclimatefix/website/commits?author=schillerk" title="Code">💻</a> <a href="#design-schillerk" title="Design">🎨</a></td>
<td align="center"><a href="http://www.ollicle.com"><img src="https://avatars1.githubusercontent.com/u/63586?v=4" width="100px;" alt="Oliver Boermans"/><br /><sub><b>Oliver Boermans</b></sub></a><br /><a href="#content-ollicle" title="Content">🖋</a> <a href="https://github.com/openclimatefix/website/issues?q=author%3Aollicle" title="Bug reports">🐛</a> <a href="#ideas-ollicle" title="Ideas, Planning, & Feedback">🤔</a></td>
<td align="center"><a href="https://www.phillipkwang.com"><img src="https://avatars3.githubusercontent.com/u/11009767?v=4" width="100px;" alt="eambutu"/><br /><sub><b>eambutu</b></sub></a><br /><a href="https://github.com/openclimatefix/website/issues?q=author%3Aeambutu" title="Bug reports">🐛</a> <a href="https://github.com/openclimatefix/website/commits?author=eambutu" title="Code">💻</a></td>
<td align="center"><a href="https://www.raais.org/"><img src="https://raw.githubusercontent.com/openclimatefix/website/master/src/images/sponsor_raais.png" width="100px;" alt="RAAIS Foundation"/><br /><sub><b>RAAIS Foundation</b></sub></a><br /><a href="#financial-openclimatefix" title="Financial">💵</a></td>
</tr>
<tr>
<td align="center"><a href="https://github.com/hansal7014"><img src="https://avatars2.githubusercontent.com/u/28968198?v=4" width="100px;" alt="Hansal Bachkaniwala"/><br /><sub><b>Hansal Bachkaniwala</b></sub></a><br /><a href="https://github.com/openclimatefix/website/commits?author=hansal7014" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/drwm-base"><img src="https://avatars3.githubusercontent.com/u/50212366?v=4" width="100px;" alt="David"/><br /><sub><b>David</b></sub></a><br /><a href="https://github.com/openclimatefix/website/commits?author=drwm-base" title="Code">💻</a></td>
<td align="center"><a href="https://github.com/sukhbeersingh"><img src="https://avatars2.githubusercontent.com/u/44414281?v=4" width="100px;" alt="sukhbeersingh"/><br /><sub><b>sukhbeersingh</b></sub></a><br /><a href="https://github.com/openclimatefix/website/commits?author=sukhbeersingh" title="Code">💻</a></td>
</tr>
</table>
<!-- ALL-CONTRIBUTORS-LIST:END -->
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!
| 94.295082 | 480 | 0.710014 | yue_Hant | 0.315077 |
1c0bcf8cb5959f2e4733368fe0a4349d9252bf31 | 461 | md | Markdown | crypto-c/doc/README.md | CertiVox-i3-NTT/incubator-milagro-crypto | 4bd0417f917f01e933fcdfbe1e2316b725ce09e3 | [
"Apache-2.0"
] | null | null | null | crypto-c/doc/README.md | CertiVox-i3-NTT/incubator-milagro-crypto | 4bd0417f917f01e933fcdfbe1e2316b725ce09e3 | [
"Apache-2.0"
] | null | null | null | crypto-c/doc/README.md | CertiVox-i3-NTT/incubator-milagro-crypto | 4bd0417f917f01e933fcdfbe1e2316b725ce09e3 | [
"Apache-2.0"
] | 1 | 2018-04-24T14:37:38.000Z | 2018-04-24T14:37:38.000Z | ./Doxyfile.in This file describes the settings to be used by the
documentation system
./AMCL.dox Main page of the documentation.
./latex Source files to generate AMCL.pdf
To generate the documentation type: make doc
This will generate the directories
./html html output
./latex latex source files. The pdf can be generated by
typing: make. The output is refman.pdf.
./xml XML output
| 23.05 | 73 | 0.655098 | eng_Latn | 0.919634 |
1c0bd577f651193df6d97b5e54b70c5b884ba15e | 1,096 | md | Markdown | README.md | joe-peak/my-notes | 2684ec216bf8eba921bdcb2c116242f5f1bf6020 | [
"MIT"
] | null | null | null | README.md | joe-peak/my-notes | 2684ec216bf8eba921bdcb2c116242f5f1bf6020 | [
"MIT"
] | null | null | null | README.md | joe-peak/my-notes | 2684ec216bf8eba921bdcb2c116242f5f1bf6020 | [
"MIT"
] | null | null | null | # my-notes
>my-notes
> This is a blockquote with two paragraphs. Lorem ipsum dolor sit amet,
> consectetuer adipiscing elit. Aliquam hendrerit mi posuere lectus.
> Vestibulum enim wisi, viverra nec, fringilla in, laoreet vitae, risus.
>
> Donec sit amet nisl. Aliquam semper ipsum sit amet velit. Suspendisse
> id sem consectetuer libero luctus adipiscing.
> This is a blockquote with two paragraphs. Lorem ipsum dolor sit amet,
consectetuer adipiscing elit. Aliquam hendrerit mi posuere lectus.
Vestibulum enim wisi, viverra nec, fringilla in, laoreet vitae, risus.
> Donec sit amet nisl. Aliquam semper ipsum sit amet velit. Suspendisse
id sem consectetuer libero luctus adipiscing.
> This is the first level of quoting.
>
> > This is nested blockquote.
>
> Back to the first level.
> ## This is a header.
>
> 1. This is the first list item.
> 2. This is the second list item.
>
> Here's some example code:
>
> return shell_exec("echo $input | $markdown_script");
* Red
* Green
* Blue
3. Bird
1. McHale
8. Parish
This is a normal paragraph:
    This is a code block.
*** | 24.355556 | 72 | 0.731752 | eng_Latn | 0.893846 |
1c0c25f511d4b306f362e33165f59288a7065e22 | 98 | md | Markdown | Snippext_public/combined_data/README.md | nlokeshiisc/cs726_var_dedup | 9c4f7519011f3f034a9b7ff15cd3a95a44a84e9f | [
"Apache-2.0"
] | 54 | 2020-01-28T00:21:24.000Z | 2021-12-31T08:05:21.000Z | Snippext_public/combined_data/README.md | nlokeshiisc/cs726_var_dedup | 9c4f7519011f3f034a9b7ff15cd3a95a44a84e9f | [
"Apache-2.0"
] | 6 | 2020-02-07T01:12:31.000Z | 2022-01-21T05:45:28.000Z | Snippext_public/combined_data/README.md | nlokeshiisc/cs726_var_dedup | 9c4f7519011f3f034a9b7ff15cd3a95a44a84e9f | [
"Apache-2.0"
] | 15 | 2020-07-27T17:06:17.000Z | 2022-01-08T16:15:39.000Z | The ABSA datasets are obtained from this [repos](https://github.com/howardhsu/BERT-for-RRC-ABSA).
| 49 | 97 | 0.77551 | eng_Latn | 0.916264 |
1c0c792667ae2133fe503a5d8dd250eaba116078 | 2,048 | md | Markdown | README.md | Alchemy17/desplash | c8894205dd631676b1afc1fc8294f751c71ee849 | [
"Unlicense"
] | null | null | null | README.md | Alchemy17/desplash | c8894205dd631676b1afc1fc8294f751c71ee849 | [
"Unlicense"
] | null | null | null | README.md | Alchemy17/desplash | c8894205dd631676b1afc1fc8294f751c71ee849 | [
"Unlicense"
] | null | null | null | ## Gallery
## Author
Abdulrahman Mohamed
# DESCRIPTION
This is an app that allows users to view photos and details about them.
#### Gallery Categories
* Travel
* Food
## Tech Needed
* Python3.6
#### User Stories
A user can:
* View photos based on the location where they were taken.
* View different photos that interest them.
* Click on a single photo to expand it and also view the details of the photo.
* Search for different categories of photos (i.e. Travel, Food).
## Installation steps
* `$ git clone https://github.com/alchemy17/desplash`
* `$ cd desplash`
* `$ source virtual/bin/activate`
* Install all the necessary requirements by running `pip install -r requirements.txt` (Python 3).
* `$ ./manager.py runserver`
# Technologies Used
#### This project uses the following major technologies:
* HTML5
* CSS
* Bootstrap4
* Python3.6
* Django
* jQuery
### Known bugs, support and Contacts
- No known bugs
# License
* MIT License
Copyright (c) 2017 Abdulrahman Mohamed
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Copyright (c) 2017 **[Abdulrahman Mohamed]** | 26.25641 | 95 | 0.765625 | eng_Latn | 0.851584 |
1c0e1520db90796d4cc55377cfa7ce65b0003fa7 | 2,765 | md | Markdown | source/posts/2017/10-15-podcast-05-41.html.md | rwpod/rwpod.github.io | 7da41a2dceb2b2d7ea2b0113273478dbf8713af1 | [
"MIT"
] | 11 | 2016-08-15T22:45:05.000Z | 2022-02-04T17:50:20.000Z | source/posts/2017/10-15-podcast-05-41.html.md | rwpod/rwpod.github.io | 7da41a2dceb2b2d7ea2b0113273478dbf8713af1 | [
"MIT"
] | 12 | 2016-06-28T15:26:57.000Z | 2021-09-27T22:23:54.000Z | source/posts/2017/10-15-podcast-05-41.html.md | rwpod/rwpod.github.io | 7da41a2dceb2b2d7ea2b0113273478dbf8713af1 | [
"MIT"
] | 3 | 2017-01-26T13:40:03.000Z | 2022-02-04T17:50:33.000Z | ---
title: "Season 05, Episode 41. Ruby 2.5.0-preview1, Reusable UI components in Rails, React Sight, ProseMirror and more"
date: 2017-10-15
tags:
- podcasts
audio_url: "https://files.rwpod-assets.com/podcasts/05/0541.mp3"
audio_mirror_url: "https://github.com/rwpod/mirror/releases/download/05.41/0541.mp3"
audio_format: "audio/mpeg"
audio_size: 50982898
duration: "00:53:05"
small_icon: mic
main_image: "/images/static/05/0541.png"
---
Good day, dear listeners. We present a new episode of the RWpod podcast. In this episode:
## Ruby
- [Ruby 2.5.0-preview1 Released](https://www.ruby-lang.org/en/news/2017/10/10/ruby-2-5-0-preview1-released/), [Improved stacktrace display in Ruby 2.5](https://mlomnicki.com/improved-stascktrace-display-in-ruby25/), [Rails 5.2 adds expiry option for signed and encrypted cookies and adds relative expiry time](https://blog.bigbinary.com/2017/10/09/expirty-option-for-signed-and-encrypted-cookies-in-Rails-5-2.html) и [Unsafe Object Deserialization Vulnerability in RubyGems](http://blog.rubygems.org/2017/10/09/unsafe-object-deserialization-vulnerability.html)
- [Why Ruby app servers break on macOS High Sierra and what can be done about it](https://blog.phusion.nl/2017/10/13/why-ruby-app-servers-break-on-macos-high-sierra-and-what-can-be-done-about-it/) и [Configuring Puma, Unicorn and Passenger for Maximum Efficiency](https://www.speedshop.co/2017/10/12/appserver.html)
- [Which JSON Serializer To Use For A New Rails API?](http://www.carlosramireziii.com/which-json-serializer-to-use-for-a-new-rails-api.html), [Reusable UI components in Rails](https://goiabada.blog/rails-components-faedd412ce19) и [Simple approach to Rails 5 API authentication with Json Web Token](https://www.codementor.io/omedale/simple-approach-to-rails-5-api-authentication-with-json-web-token-cpqbgrdo6)
## JavaScript
- [Things to love and hate in Electron](https://binary-studio.com/2017/09/12/love-hate-electron/) и [Build a shopping cart with Vue.js and Element UI (no Vuex)](https://medium.com/@connorleech/build-a-shopping-cart-with-vue-js-and-element-ui-no-vuex-54682e9df5cd)
- [JavaScript got better while I wasn’t looking](https://eev.ee/blog/2017/10/07/javascript-got-better-while-i-wasnt-looking/) и [JavaScript's Promise Leaks Memory](https://alexn.org/blog/2017/10/11/javascript-promise-leaks-memory.html)
- [React Sight - a live view of the component hierarchy tree of your React application](https://github.com/React-Sight/React-Sight), [ProseMirror - in-browser semantic rich text editing](http://prosemirror.net/) и [Taucharts One](https://blog.taucharts.com/taucharts-one/)
## Our guest - Timofey Lavnik
- [Github](https://github.com/nwtima)
- [Linkedin](https://www.linkedin.com/in/tim-lavnik-bb582bba/)
READMORE
| 79 | 561 | 0.76962 | kor_Hang | 0.306929 |
1c0eed545481292527a84d7f4d83735d6dc1ea4f | 4,330 | md | Markdown | README.md | vardef/v | 904b99c8f287c7c756ceacc4a0019c5a6cc90ca0 | [
"MIT"
] | null | null | null | README.md | vardef/v | 904b99c8f287c7c756ceacc4a0019c5a6cc90ca0 | [
"MIT"
] | null | null | null | README.md | vardef/v | 904b99c8f287c7c756ceacc4a0019c5a6cc90ca0 | [
"MIT"
] | null | null | null | # The V Programming Language 0.1.x
[](https://dev.azure.com/alexander0785/vlang/_build/latest?definitionId=1&branchName=master) [](https://travis-ci.org/vlang/v)
https://vlang.io
Documentation: https://vlang.io/docs
Twitter: https://twitter.com/v_language
Discord (primary community): https://discord.gg/n7c74HM
Installing V: https://github.com/vlang/v#installing-v-from-source
## Key Features of V
- Simplicity: the language can be learned in half an hour, less if you already know Go
- Fast compilation: ~100k loc/s right now, ~1.2 million loc/s once x64 generation is mature enough
- Easy to develop: V compiles itself in less than a second
- Performance: within 5% of C
- Safety: no null, no globals, no undefined behavior, immutability by default
- C to V translation
- Hot code reloading
- Powerful UI and graphics libraries
- Easy cross compilation
- REPL
V 1.0 release is planned for December 2019.
## Notes
GitHub marks V's code as written in Go. It's actually written in V; GitHub just doesn't support the language yet.
The compilation is temporarily slower for this release:
- Debug builds are used (use `./v -prod -o v .` to get faster compilation).
- vlib is recompiled with every program you build.
- The new formatter runs on every single token and slows the compiler down by ~20%. This will be taken care of.
- There are a lot of known issues that are quick to fix (like function lookups being O(n)).
There's some old hacky code written when V was 2 months old. All of it will be quickly cleaned up. There are ~500 lines of C code, which will be removed by the end of June.
## Code structure
https://github.com/vlang/v/blob/master/CodeStructure.md
## Installing V from source
### Linux and macOS
You'll need Clang or GCC. On macOS run `xcode-select --install` if you don't have Xcode or the Xcode tools installed.
```bash
# You can clone V anywhere
git clone https://github.com/vlang/v
cd v/compiler
make
```
Or build without make:
```bash
# Download the V compiler's source translated to C
wget https://raw.githubusercontent.com/vlang/vc/master/v.c
cc -std=gnu11 -w -o v v.c # Build it with Clang or GCC
./v -o v . # Use the resulting V binary to build V from V source
./v -o v . # Build the compiler again to make sure it works
```
That's it! Now you have a V executable at `v/compiler/v`.
You can create a symlink so that it's globally available:
```
sudo ln -s [path to V repo]/compiler/v /usr/local/bin/v
```
V is being constantly updated. To update V, simply run
```
git pull origin master
cd compiler/
make clean
make
```
### Windows
V works great on Windows Subsystem for Linux. The instructions are the same as above.
If you want to build v.exe on Windows without WSL, you will need Visual Studio. Microsoft doesn't make it easy for developers. Mingw-w64 could suffice, but if you plan to develop UI and graphical apps, VS is your only option.
V temporarily can't be compiled with Visual Studio. This will be fixed asap.
### Testing
```
$ cd examples
$ v run hello_world.v
hello world
$ v
V 0.1.2
Use Ctrl-D to exit
>>> println('hello world')
hello world
>>>
```
Now if you want, you can start tinkering with the compiler. If you introduce a breaking change and rebuild V, you will no longer be able to use V to build itself. So it's a good idea to make a backup copy of a working compiler executable.
### Running the examples
```
v hello_world.v && ./hello_world # or simply
v run hello_world.v # this builds the program and runs it right away
v word_counter.v && ./word_counter cinderella.txt
v run news_fetcher.v
v run tetris.v
```
<img src='https://raw.githubusercontent.com/vlang/v/master/examples/tetris/screenshot.png' width=300>
In order to build Tetris and anything else using the graphics module, you will need to install glfw and freetype.
If you plan to use the http package, you also need to install libcurl.
```
macOS:
brew install glfw freetype curl
Ubuntu:
sudo apt install libglfw3 libglfw3-dev libfreetype6-dev libcurl3-dev
Arch:
sudo pacman -S glfw-x11 curl freetype2
```
glfw and libcurl dependencies will be removed soon.
| 30.069444 | 294 | 0.735566 | eng_Latn | 0.975652 |
1c0ef1a1ee93887ad4e0c2717fe1605fcf682e40 | 2,164 | md | Markdown | README.md | kriminal666/SocketClientExample | 6eeb7d16a94fa7790cfb54f61f1e8ccd5023ba99 | [
"MIT"
] | null | null | null | README.md | kriminal666/SocketClientExample | 6eeb7d16a94fa7790cfb54f61f1e8ccd5023ba99 | [
"MIT"
] | null | null | null | README.md | kriminal666/SocketClientExample | 6eeb7d16a94fa7790cfb54f61f1e8ccd5023ba99 | [
"MIT"
] | null | null | null | # SocketClientExample
Simple socket client example for Android.
# Compile this Java server
```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
public class MyServer {
    public static void main(String[] args) {
        try (ServerSocket serverSocket = new ServerSocket(8888)) {
            System.out.println("Listening :8888");
            while (true) {
                // try-with-resources closes the socket and streams per client
                try (Socket socket = serverSocket.accept();
                     DataInputStream dataInputStream =
                             new DataInputStream(socket.getInputStream());
                     DataOutputStream dataOutputStream =
                             new DataOutputStream(socket.getOutputStream())) {
                    System.out.println("ip: " + socket.getInetAddress());
                    System.out.println("message: " + dataInputStream.readUTF());
                    dataOutputStream.writeUTF("Hello!");
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```
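The Android client itself is not listed in this README. As a rough sketch, a plain-Java client that talks to the server above could look like this (the class name `MyClient` and the host address are illustrative, not taken from this project):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.Socket;

// Hypothetical client: connects to the server above, sends one UTF string,
// and returns the server's reply ("Hello!").
public class MyClient {

    public static String exchange(String host, int port, String message) throws IOException {
        try (Socket socket = new Socket(host, port);
             DataOutputStream out = new DataOutputStream(socket.getOutputStream());
             DataInputStream in = new DataInputStream(socket.getInputStream())) {
            out.writeUTF(message); // the server reads this with readUTF()
            return in.readUTF();   // the server replies with writeUTF("Hello!")
        }
    }

    public static void main(String[] args) throws IOException {
        // Replace the address with your server's IP
        System.out.println(exchange("192.168.1.10", 8888, "Hi from client"));
    }
}
```

In a real Android app you would also need `<uses-permission android:name="android.permission.INTERNET"/>` in the manifest and a background thread, since network calls on the main thread throw `NetworkOnMainThreadException`.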
Then try the client against the server's IP and port 8888:

# Try using netcat:
```
# apt-get install netcat-traditional
# update-alternatives --config nc
```
Select:
```
/bin/nc.traditional
```
Create a new server:
```
# nc -l -p PUT_HERE_YOUR_PORT
```
Then test the client using your server's IP and port.

1. Optionally, to see the details of your plan, click **Show details**.
# pw-profile-striped
A responsive travel blog template for the [ProcessWire](https://processwire.com) open-source CMS.
## Basics
This blog profile is built upon kongondo's blog module, but a lot of things have been changed, removed or extended. The layout was taken from the free [Striped](https://html5up.net/striped) responsive template at [html5up.net](https://html5up.net/). It uses a small number of third-party modules like justb3a's [Simple Contact Form](https://modules.processwire.com/modules/simple-contact-form/).
The target group for this site profile is travel bloggers who want a feature-rich layout and the option to list and read posts in ascending chronological order (something that most blog designs, on any platform, sorely lack).

## Installation
This is a complete ProcessWire site profile.
It requires ProcessWire version >= 3.0.82.
To install it, download a fresh copy of ProcessWire from the [download page](https://processwire.com/download/) and extract it into your web server folder (either root or a subdirectory, though don't forget to edit your .htaccess after installation in the latter case).
Download a [zipped copy of this repository](https://github.com/BitPoet/pw-profile-striped/archive/master.zip) and extract the site-blog-striped folder into the ProcessWire installation's root folder.
Point your web browser at the url of your installation and follow the wizard. Select "Striped Travel Blog Template (Responsive)" when prompted to select a profile.
## Setting up your blog
- Adjust the settings on the Settings page; this is where most of the generic settings that affect the whole blog are kept (title, copyright, color settings, logo, result limits, widgets, email for the contact form, etc.)
- You should create an extra user for blogging with blog-author and page-edit permissions (or multiple users). Put the name that should be displayed on the frontend in the Display name field
- Start blogging
## The Home page
The home page shows a three-column layout with images and teasers. The image shown is the first image (or a grey box if none found) for the post. You can sort the post's images in the backend page editor.
Posts have a "Blog Sticky Flag for Home" checkbox. Posts with this flag are always shown first on the home page, so if you check that box in too many posts, other, newer posts won't make their way onto the home page!
## The Search
The built-in search searches through title, summary and body of posts. If multiple terms are entered, all of them have to be found, but they may be spread over multiple fields.
## Post Lists
All post lists show at most the number of excerpts configured on the Settings page at a time and provide pagination. This pagination is built upon ProcessWire's built-in pagination modules but uses a bit of hackery behind the scenes to make it adhere to the styles provided with the Striped theme.
## Advanced
- You can sort the order of widgets in the settings page
- To create a new sidebar widget:
* create a new php file blog-WIDGETNAME.php in site/templates
* create a new template for that in the backend using the same fields as blog-widget-basic
* edit your PHP file; it needs to return:
- a wrapping tag "section" with css class "box"
- inside that, a "header" tag (if you want to show a headline for your widget) with the h2 tag containing the label
- the html for the widget content
- you can look into the shipped widgets like blog-recent-posts.php, blog-recent-comments.php, blog-copyright.php etc. for inspiration
* create a new page under Widgets with your template and publish it
* add the new widget page to the widgets on the Settings page
## Files
Feel free to adapt the styles/blog.css and styles/blog.js files in the templates directory to your liking.
The "assets" dir inside templates comes from the HTML5 template used and was copied as-is.
## License
Feel free to use this blog profile as you like.
## We are in the process of combining namespaces. This codebase will be used for Gatsby v2 and forward.
[npm](https://www.npmjs.com/package/gatsby-source-airtable)
[Build Status](https://travis-ci.com/jbolda/gatsby-source-airtable)
Gatsby source plugin for pulling rows from multiple tables and bases in Airtable. This was inspired by [kevzettler/gatsby-source-airtable](https://github.com/kevzettler/gatsby-source-airtable), but due to the many breaking changes introduced, I started a new package (pretty much a complete rewrite). With the introduction of Gatsby v2, we felt it was a great time to combine the namespaces and this repository will be used moving forward. If you are looking for the documentation on `gatsby-source-airtable-linked`, see the additional branch. We do recommend moving your dependency over to this plugin, `gatsby-source-airtable`, for Gatsby v2. (If you are still on Gatsby v1, see `gatsby-source-airtable-linked` for compatible code.)
## Install
via npm
`npm install --save gatsby-source-airtable`
or via yarn
`yarn add gatsby-source-airtable`
## Example
Getting data from two different tables:
```javascript
// In gatsby-config.js
plugins: [
  {
    resolve: `gatsby-source-airtable`,
    options: {
      apiKey: `YOUR_AIRTABLE_KEY`, // may instead specify via env, see below
      tables: [
        {
          baseId: `YOUR_AIRTABLE_BASE_ID`,
          tableName: `YOUR_TABLE_NAME`,
          tableView: `YOUR_TABLE_VIEW_NAME`, // optional
          queryName: `OPTIONAL_NAME_TO_IDENTIFY_TABLE`, // optional
          mapping: { "CASE_SENSITIVE_COLUMN_NAME": "VALUE_FORMAT" }, // optional, e.g. "text/markdown", "fileNode"
          tableLinks: [`CASE`, `SENSITIVE`, `COLUMN`, `NAMES`] // optional, for deep linking to records across tables.
        },
        {
          baseId: `YOUR_AIRTABLE_BASE_ID`,
          tableName: `YOUR_TABLE_NAME`,
          tableView: `YOUR_TABLE_VIEW_NAME` // optional
          // can leave off queryName, mapping or tableLinks if not needed
        }
      ]
    }
  }
];
```
Get one single record (table row), where `Field_1 === YOUR_VALUE`
```
{
  airtable(table: {eq: "YOUR_TABLE_NAME"}, data: {Field_1: {eq: "YOUR_VALUE"}}) {
    data {
      Field_1
      Field_2
      Linked_Field {
        data {
          Linked_Field_1
        }
      }
    }
  }
}
```
Get all records from `YOUR_TABLE_NAME` where `Field_1 === YOUR_VALUE`
```
{
  allAirtable(filter: {table: {eq: "YOUR_TABLE_NAME"}, data: {Field_1: {eq: "YOUR_VALUE"}}}) {
    edges {
      node {
        data {
          Field_1
          ...
        }
      }
    }
  }
}
```
## How it works
When running `gatsby develop` or `gatsby build`, this plugin will fetch all data for all rows in each of the tables you specify, making them available for query throughout your gatsby.js app, and to other Gatsby plugins as well.
As seen in the example above, `tables` is always specified as an array of table objects. These tables may be sourced from different bases.
Querying for `airtable` will always only return one record (defaulting to the first record in the table), and querying for `allAirtable` will return any records that match your query parameters.
As in the examples above, you can narrow your query by filtering for table names, and field values.
### Deep linking across tables
One powerful feature of Airtable is the ability to specify fields which link to records in other tables-- the `Link to a Record` field type. If you wish to query data from a linked record, you must specify the field name in `tableLinks` (matching the name shown in Airtable, not the escaped version).
This will create nested nodes accessible in your graphQL queries, as shown in the above example. If you do not specify a linked field in `tableLinks`, you will just receive the linked record's Airtable IDs as `strings`. The name of the column/field does not have to match the related table, but you do need to make sure that the related table is included as an object in your `gatsby-config.js` as well.
### Using markdown and attachments
Optionally, you may provide a "mapping". This will alert the plugin that column names you specify are of a specific, non-string format of your choosing. This is particularly useful if you would like to have Gatsby pick up the fields for transforming, e.g. `text/markdown`. If you do not provide a mapping, Gatsby will just "infer" what type of value it is, which is most typically a `string`.
For an example of a markdown-and-airtable-driven site using `gatsby-transformer-remark`, see the examples folder in this repo.
If you are using the `Attachment` type field in Airtable, you may specify a column name with `fileNode` and the plugin will bring in these files. Using this method, it will create "nodes" for each of the files and expose this to all of the transformer plugins. A good use case for this would be attaching images in Airtable, and being able to make these available for use with the `sharp` plugins and `gatsby-image`. Specifying a `fileNode` does require a peer dependency of `gatsby-source-filesystem` otherwise it will fall back as a non-mapped field. The locally available files and any ecosystem connections will be available on the node as `localFiles`.
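For example, a query against an attachment column mapped as `fileNode` might look like the sketch below (the `Attachments` column name and the `childImageSharp` fields are illustrative and depend on your base and on which transformer plugins you have installed):

```
{
  allAirtable(filter: {table: {eq: "YOUR_TABLE_NAME"}}) {
    edges {
      node {
        data {
          Attachments {
            localFiles {
              childImageSharp {
                fluid(maxWidth: 600) {
                  src
                }
              }
            }
          }
        }
      }
    }
  }
}
```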
### The power of views
Within Airtable, every table can have one or more named Views. These Views are a convenient way to pre-filter and sort your data before querying it in Gatsby. If you do not specify a view in your table object, raw data will be returned in no particular order.
For example, if you are creating a blog or documentation site, specify a `published` field in Airtable, create a filter showing only published posts, and specify this as the (optional) `tableView` option in `gatsby-config.js`
### Naming conflicts
You may have a situation where you are including two separate bases, each with a table that has the exact same name. With the data structure of this repo, both bases would fall into allAirtable and you wouldn't be able to tell them apart when building graphQL queries. This is what the optional `queryName` setting is for-- simply to provide an alternate name for a table.
### API Keys
Keys can be found in Airtable by clicking `Help > API Documentation`.
The API key can be specified in `gatsby-config.js` as noted in the previous section-- **this exposes your key to anyone viewing your repository and is not recommended**.
Alternatively, you may specify your API key using an [Environment Variable](https://www.gatsbyjs.org/docs/environment-variables/). This plugin looks for an environment variable called `GATSBY_AIRTABLE_API_KEY` and will use it prior to resorting to the `apiKey` defined in `gatsby-config.js`. You may also specify it in your command line such as `GATSBY_AIRTABLE_API_KEY=XXXXXX gatsby develop`.
If you add or change your API key in an environment variable at the system level, you may need to reload your code editor / IDE for that variable to reload.
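For instance, with a standard dotenv setup the key can be kept out of the repository entirely (the file names shown here follow common Gatsby conventions; adjust them to your project):

```javascript
// In gatsby-config.js -- a sketch, not the only way to wire this up.
// Loads variables from .env.development / .env.production into process.env.
require("dotenv").config({
  path: `.env.${process.env.NODE_ENV}`,
});

module.exports = {
  plugins: [
    {
      resolve: `gatsby-source-airtable`,
      options: {
        // If apiKey is omitted, the plugin falls back to
        // process.env.GATSBY_AIRTABLE_API_KEY on its own.
        apiKey: process.env.GATSBY_AIRTABLE_API_KEY,
        tables: [{ baseId: `YOUR_AIRTABLE_BASE_ID`, tableName: `YOUR_TABLE_NAME` }],
      },
    },
  ],
};
```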
# analyticsdemoapp
Source for an iOS Swift app demo implementing Google Tag Manager (GTM) and Google Analytics.
## HashMap keys
* a key MUST implement the [Eq](https://doc.rust-lang.org/std/cmp/trait.Eq.html) and [Hash](https://doc.rust-lang.org/std/hash/trait.Hash.html) traits
* std types already satisfying those conditions:
* [bool](https://doc.rust-lang.org/std/primitive.bool.html), integers (signed or not), e.g. `u8`, `i32`, etc.
* [String](https://doc.rust-lang.org/std/string/struct.String.html) and [&str](https://doc.rust-lang.org/std/primitive.str.html)
```rust
use std::collections::HashMap;
#[derive(Eq, PartialEq, Hash)]
struct Person { firstname: String, lastname: String, }
let mut nicknames: HashMap<Person, &str> = HashMap::new();
let beltram = Person { firstname: String::from("beltram"), lastname: String::from("maldant") };
nicknames.insert(beltram, "trambel");
```
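Lookups rely on the same `Eq` + `Hash` machinery: an equal key built later hashes to the same bucket. A small sketch extending the example above:

```rust
use std::collections::HashMap;

#[derive(Eq, PartialEq, Hash)]
struct Person { firstname: String, lastname: String }

fn person(first: &str, last: &str) -> Person {
    Person { firstname: first.to_string(), lastname: last.to_string() }
}

fn main() {
    let mut nicknames: HashMap<Person, &str> = HashMap::new();
    nicknames.insert(person("beltram", "maldant"), "trambel");

    // a freshly built, field-for-field equal key finds the entry ...
    assert_eq!(nicknames.get(&person("beltram", "maldant")), Some(&"trambel"));
    // ... and a different one does not
    assert_eq!(nicknames.get(&person("someone", "else")), None);
}
```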
[📒](https://doc.rust-lang.org/stable/rust-by-example/std/hash/alt_key_types.html) |
[📒](https://doc.rust-lang.org/stable/rust-by-example/std/hash.html)
# 42 US - Silicon Valley
## Walking Marvin
### Who is Marvin?
Marvin, the Paranoid Android, is a fictional character in
The Hitchhiker's Guide to the Galaxy series by Douglas Adams.
Marvin is the ship's robot aboard the starship Heart of Gold.
### Goals
This is a Python project that uses OpenAI Gym with an environment called Marvin.
The goal is to train Marvin to walk, covering both the training and the walking process.
The total reward for each episode after training is greater than 100. During
development, we learned how to use neural networks to help Marvin
get back on his feet, without using any libraries that would accomplish the goal of the
project for us, like Evostra or TensorFlow.

### Usage
**Basic form:**
`python marvin.py`
The program displays a log for each episode.
**Advanced options:**
| Flags | Description |
| :------------------ |:--------------------------------------------------------------------------------------------- |
| `--walk (-w)` | Display only the walking process. |
| `--video (-v)` | Save videos of the walking process. |
| `--name (-n)` | Display the name of the game (environment). |
| `--generation (-g)` | Change the maximum number of generations. |
| `--population (-p)` | Count of the population between each generation. |
| `--rate (-r)` | Mutation rate (recommended values are in the decimal range). |
| `--movement (-m)` | Number of steps (movements) between each episode. |
| `--load (-l)` | Load weights for the Marvin agent from a file. Skips the training process if this option is specified. |
| `--save (-s)` | Save weights to a file after running the program. |
| `--quiet (-q)` | Hide the program's log between each episode. |
| `--help (-h)` | Display available commands and exit. |
| `--log` | Save a log of each generation to a file. Expects a path. |
| `--version` | Show the program's version number and exit. |
*If the program launches without arguments, it displays both the training process and the walking
process.*
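The population, generation, and rate options above control a neuroevolution loop. A schematic sketch of how such a loop fits together (this is an illustration of the idea, not the project's actual implementation):

```python
import random

def evolve(fitness_fn, n_params, population=50, generations=100, rate=0.1):
    """Toy evolution-strategy sketch: keep the best mutant of each generation.

    Illustrates how the population, generation, and rate options interact;
    it is NOT the actual Walking Marvin code.
    """
    best = [0.0] * n_params                # the agent's weight vector
    best_score = fitness_fn(best)
    for _ in range(generations):
        for _ in range(population):
            # Gaussian mutation scaled by the mutation rate
            candidate = [w + rate * random.gauss(0.0, 1.0) for w in best]
            score = fitness_fn(candidate)
            if score > best_score:         # greedy selection
                best, best_score = candidate, score
    return best, best_score

# Toy fitness function: reward weights close to 1.0 (stands in for an
# episode's total reward from the Gym environment).
weights, score = evolve(lambda ws: -sum((w - 1.0) ** 2 for w in ws),
                        n_params=4, population=20, generations=50, rate=0.5)
```

In the real project the fitness function would run one Gym episode with the candidate weights and return its total reward.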
### Setup
Use `sh setup.sh` to set up and build all the dependencies.
*All the dependencies will be installed for the current user by running the script.*
### TODO
* Make use of the average fitness of a generation, so it doesn't deviate from the parent.
* Check whether changing code in the source file can interfere with loading a saved file (from a different version
of the program). If it does, create some kind of flag to validate the version.
### Resources
The following sources helped us during the development of this project:
* [OpenAI Gym documentation](https://gym.openai.com/docs)
* [NEAT Super Mario World](https://www.youtube.com/watch?v=qv6UVOQ0F44) (to understand what we had to do)
* [OpenAI-NEAT Project example](https://github.com/HackerHouseYT/OpenAI-NEAT)
* [OpenAI Gym bipedal walker](https://gym.openai.com/evaluations/eval_ujFWHmoqSniDh8cErKCVpA)
* [Neuroevolution - Wikipedia Article](https://en.wikipedia.org/wiki/Neuroevolution)
* [Artificial Neural Network - Wikipedia Article](https://en.wikipedia.org/wiki/Artificial_neural_network)
* [Evolving Neural Networks through Augmenting Topologies](http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf) (Kenneth O. Stanley and Risto Miikkulainen)
* [Evolution Simulator - carykh](https://www.youtube.com/watch?v=GOFws_hhZs8)
## Contributors
* [JR Aleman](https://github.com/jraleman/)
* [Gerardo Solis](https://github.com/corezip/)
## License
This project is under the MIT License.