hexsha
stringlengths
40
40
size
int64
5
1.04M
ext
stringclasses
6 values
lang
stringclasses
1 value
max_stars_repo_path
stringlengths
3
344
max_stars_repo_name
stringlengths
5
125
max_stars_repo_head_hexsha
stringlengths
40
78
max_stars_repo_licenses
listlengths
1
11
max_stars_count
int64
1
368k
max_stars_repo_stars_event_min_datetime
stringlengths
24
24
max_stars_repo_stars_event_max_datetime
stringlengths
24
24
max_issues_repo_path
stringlengths
3
344
max_issues_repo_name
stringlengths
5
125
max_issues_repo_head_hexsha
stringlengths
40
78
max_issues_repo_licenses
listlengths
1
11
max_issues_count
int64
1
116k
max_issues_repo_issues_event_min_datetime
stringlengths
24
24
max_issues_repo_issues_event_max_datetime
stringlengths
24
24
max_forks_repo_path
stringlengths
3
344
max_forks_repo_name
stringlengths
5
125
max_forks_repo_head_hexsha
stringlengths
40
78
max_forks_repo_licenses
listlengths
1
11
max_forks_count
int64
1
105k
max_forks_repo_forks_event_min_datetime
stringlengths
24
24
max_forks_repo_forks_event_max_datetime
stringlengths
24
24
content
stringlengths
5
1.04M
avg_line_length
float64
1.14
851k
max_line_length
int64
1
1.03M
alphanum_fraction
float64
0
1
lid
stringclasses
191 values
lid_prob
float64
0.01
1
9842798ca0436010d7046a277164d35b482ed586
3,695
md
Markdown
_posts/2021-0502-tsinghua-application.md
tnecniv22/tnecniv22.github.io
c7bd97dbd3dedd56881b0eebfc7d3d6941df78af
[ "MIT" ]
null
null
null
_posts/2021-0502-tsinghua-application.md
tnecniv22/tnecniv22.github.io
c7bd97dbd3dedd56881b0eebfc7d3d6941df78af
[ "MIT" ]
null
null
null
_posts/2021-0502-tsinghua-application.md
tnecniv22/tnecniv22.github.io
c7bd97dbd3dedd56881b0eebfc7d3d6941df78af
[ "MIT" ]
null
null
null
---
layout: post
title: Tsinghua Applicants, Read This!
description: What I learned from a failed application to Tsinghua.
summary: What I learned from a failed application to Tsinghua.
tags: [education, china]
---

If you are planning to apply to Tsinghua for graduate school as an international student, please make sure to submit your application well before the posted deadline. This is because there is an additional step in the application process after submission: Tsinghua must verify your application prior to the deadline. I learned this the hard way.

For the past few weeks, I had been working night and day to prepare my graduate school application for a data science master's program at Tsinghua. The third and last round of applications was due at 5pm, April 30th, Beijing time. Believing that I had up until the deadline to submit, I worked on perfecting my personal statement until the second-to-last day. Once I was completely satisfied with my writing, I submitted my application. It was 5am, April 30th, Beijing time: 12 hours before applications were due. I thought I had plenty of time to spare. However, upon submission, I was met with a sudden message.

![Image](/assets/img/tsinghua1.png)

Without verification, the application fee could not be paid. Without paying the application fee, my application would not be considered. I was surprised. In the official application guide for international students, there was no mention of a required verification before the fee payment. I had no choice but to wait. For 12 hours, I waited patiently, checking the application website periodically for an answer. However, none came. Partway through, I emailed and called the admissions committee but received no answer. Eventually, the application deadline passed, and my application had not been verified. I was no longer under consideration for my program of choice. Needless to say, I was quite devastated.

This was my last shot at getting into graduate school for the September 2021 intake, having not been successful with my prior applications. I also really wanted to study and experience life in China: to immerse myself in my cultural roots and see with my own eyes a society that in many ways has surpassed the western world in technology, infrastructure, and efficiency. To prepare for my application, I had entirely re-written my personal statement, paid for official documents from my university, and asked additional professors to write recommendation letters. I really believed I had put together a very strong package. In the end, it feels like all my effort was for naught. To not even have my application reviewed for admission even though I submitted 12 hours in advance of the deadline sucks. It would have been easier to accept if they had reviewed my documents and officially rejected me.

It remains to be seen whether there is a silver lining to this story. Perhaps it has taught me to stay strong and pick myself up after intense disappointment. Perhaps parts of my personal statement can be reused as a cover letter to land a future job. Perhaps there even remains a chance that Tsinghua will consider my application once I discuss the situation with them. Regardless, I can't help but feel that this situation should not have happened--that it isn't right.

To future Tsinghua applicants: Please submit your applications EARLY and get them verified, so that you won't have to experience the same disappointment as I did.

To the Tsinghua Admissions Committee: Please make it clear that submitted applications require time for verification, or change the system so that applications submitted before the deadline are considered.
131.964286
675
0.807307
eng_Latn
0.999869
98431ddefe6e47e27fe2cbd286c0c6f9c128b522
1,810
md
Markdown
treebanks/la_proiel/la_proiel-dep-flat-name.md
mjabrams/docs
eef96df1ce8f6752e9f80660c8255482b2a07c45
[ "Apache-2.0" ]
null
null
null
treebanks/la_proiel/la_proiel-dep-flat-name.md
mjabrams/docs
eef96df1ce8f6752e9f80660c8255482b2a07c45
[ "Apache-2.0" ]
null
null
null
treebanks/la_proiel/la_proiel-dep-flat-name.md
mjabrams/docs
eef96df1ce8f6752e9f80660c8255482b2a07c45
[ "Apache-2.0" ]
null
null
null
---
layout: base
title: 'Statistics of flat:name in UD_Latin-PROIEL'
udver: '2'
---

## Treebank Statistics: UD_Latin-PROIEL: Relations: `flat:name`

This relation is a language-specific subtype of <tt><a href="la_proiel-dep-flat.html">flat</a></tt>. There is also 1 other language-specific subtype of `flat`: <tt><a href="la_proiel-dep-flat-foreign.html">flat:foreign</a></tt>.

510 nodes (0%) are attached to their parents as `flat:name`.

493 instances of `flat:name` (97%) are left-to-right (parent precedes child). Average distance between parent and child is 1.41960784313725.

The following 1 pair of parts of speech is connected with `flat:name`: <tt><a href="la_proiel-pos-PROPN.html">PROPN</a></tt>-<tt><a href="la_proiel-pos-PROPN.html">PROPN</a></tt> (510; 100% instances).

~~~ conllu
# visual-style 5	bgColor:blue
# visual-style 5	fgColor:white
# visual-style 4	bgColor:blue
# visual-style 4	fgColor:white
# visual-style 4 5	flat:name	color:blue
1	quod	qui	PRON	Pr	Case=Acc|Gender=Neut|Number=Sing|PronType=Rel	3	obj:dir	_	ref=LUKE_5.8
2	cum	cum	SCONJ	G-	_	3	mark	_	ref=LUKE_5.8
3	videret	video	VERB	V-	Aspect=Imp|Mood=Sub|Number=Sing|Person=3|Tense=Past|VerbForm=Fin|Voice=Act	6	advcl	_	ref=LUKE_5.8
4	Simon	Simon	PROPN	Ne	Case=Nom|Gender=Masc|Number=Sing	3	nsubj	_	ref=LUKE_5.8
5	Petrus	Petrus	PROPN	Ne	Case=Nom|Gender=Masc|Number=Sing	4	flat:name	_	ref=LUKE_5.8
6	procidit	procido	VERB	V-	Mood=Ind|Number=Sing|Person=3|Tense=Pres|VerbForm=Fin|Voice=Act	0	root	_	ref=LUKE_5.8
7	ad	ad	ADP	R-	_	8	case	_	ref=LUKE_5.8
8	genua	genu	NOUN	Nb	Case=Acc|Gender=Neut|Number=Plur	6	obl	_	ref=LUKE_5.8
9	Iesu	Iesus	PROPN	Ne	Case=Gen|Gender=Masc|Number=Sing	8	nmod	_	ref=LUKE_5.8
10	dicens	dico	VERB	V-	Case=Nom|Gender=Masc|Number=Sing|Tense=Pres|VerbForm=Part|Voice=Act	6	advcl	_	ref=LUKE_5.8
~~~
45.25
203
0.746961
yue_Hant
0.439606
98438f7b54503483202e2f19e54907643ea57d2c
3,719
md
Markdown
_posts/2019-12-31-faiths-prospect-for-a-new-year.md
SaintPeterCalvaryBaptistChurch/SaintPeterCalvaryBaptistChurch.github.io
56312ef65aa9de8e0f653593996e83f31098f753
[ "Apache-2.0" ]
1
2018-08-03T19:11:19.000Z
2018-08-03T19:11:19.000Z
_posts/2019-12-31-faiths-prospect-for-a-new-year.md
SaintPeterCalvaryBaptistChurch/SaintPeterCalvaryBaptistChurch.github.io
56312ef65aa9de8e0f653593996e83f31098f753
[ "Apache-2.0" ]
null
null
null
_posts/2019-12-31-faiths-prospect-for-a-new-year.md
SaintPeterCalvaryBaptistChurch/SaintPeterCalvaryBaptistChurch.github.io
56312ef65aa9de8e0f653593996e83f31098f753
[ "Apache-2.0" ]
1
2018-08-03T19:11:23.000Z
2018-08-03T19:11:23.000Z
---
layout: post
title: Faith's Prospect For a New Year
date: 2019-12-31 00:30:00
author: "Pastor Dave Johnson"
---

Here's a word to brighten your new year: PROSPECT from the Lord, all through a faith relationship with Him. We can look forward with anticipation to what glorious goodness the Lord has in store for us in the coming year, and it's all a matter of walking day by day with Him. Let's look to Jacob as he looked to the Lord in anticipation of what He had for him in PROSPECT. While walking by faith, the patriarch rested his unknown, future years on his all-knowing, all-supplying, providential LORD. Here is the Word on which Jacob relied: "God Almighty bless thee ... and give thee the blessing of Abraham" (Genesis 28:3). In that context of Scripture, the promises of watch care, guidance, safety, and provision are all inclusive in the blessing that we all need. The promise of blessing that the Lord gave to Abraham was also to Jacob, and actually to all who possess the faith of Abraham. That's us; the blessing is for the New Testament believer as well. Here's the promise in prospect: "In thee shall all the families of the earth be blessed" (Genesis 12:3). Question: What exactly was the "in thee" provision (in Abraham) that denoted the blessing of God? The answer: the one most important ingredient in obtaining God's blessing is Biblical faith (believing and acting on His Word). With active faith, the prospect of your new year is this: we can have the best that God desires to give, and we can look forward to receiving His best during this coming year. When it comes down to the blessing of God, His giving at His best did not stop at that Biblical moment during Jacob's timeline. The Lord's blessing was "to thee, and to thy seed with thee" (v. 4). Once again, that's us; as we dwell in faith we also dwell in faith's expectancy from our Lord.

Active dwelling faith is always faith being blessed actively, looking for God's intervention of His goodness being passed along to our tomorrows. Jacob wanted that, he needed that, he anticipated that, and his future prospect was only an anxious, fretful reality without that. He needed God's providential hand on his life, and so do we. In context, God's best came to Jacob, as he had a faith-dwelling in Bethel (Genesis 28:19 - the house of God, His dwelling place). Concerning being blessed, it's all a matter of active, dwelling faith: "So then they which be of faith are blessed with faithful Abraham" (Galatians 3:9). Don't miss out on the blessing in prospect this new year, coming to you from the Lord who prizes blessing His own. So as you "keep the faith," He keeps you being blessed, so keep a lookout for the blessing coming to you this year as we dwell well in Christ by faith. In prospect, "If ye be Christ's, then are ye Abraham's seed, and heirs according to the promise" (Galatians 3:29). Be well at home with the One of promise; be finding yourself at one with the One indwelling you. Allow the One to find you living by faith in the One who blesses as you find your place with Him, abiding in your Bethel. Your Bethel? Oh yes: "Ye are the temple of God, and the Spirit of God dwelleth in you" (I Corinthians 3:16). "If ye abide (dwell) in me, and my words abide in you, ye shall ask what ye will, and it shall be done unto you" (John 15:7). That, my friend, is faith's prospect as you DWELL WELL with Him in this coming year. Then you can face all of your unknown tomorrows without an anxious heart this coming year, practicing restful faith in the One who knows all about your coming days with providential care. Yes, He knows all, and blesses all who have FAITH'S PROSPECT FOR A NEW YEAR.
464.875
3,602
0.764722
eng_Latn
0.999912
9843e617ebdb7e2f6e77737642f1faa3fe4363cb
606
md
Markdown
curriculum/challenges/ukrainian/07-scientific-computing-with-python/python-for-everybody/web-services-service-oriented-approach.md
fcastillo-serempre/freeCodeCamp
43496432d659bac8323ab2580ba09fa7bf9b73f2
[ "BSD-3-Clause" ]
172,317
2017-01-11T05:26:18.000Z
2022-03-31T23:30:16.000Z
curriculum/challenges/ukrainian/07-scientific-computing-with-python/python-for-everybody/web-services-service-oriented-approach.md
fcastillo-serempre/freeCodeCamp
43496432d659bac8323ab2580ba09fa7bf9b73f2
[ "BSD-3-Clause" ]
26,252
2017-01-11T06:19:09.000Z
2022-03-31T23:18:31.000Z
curriculum/challenges/ukrainian/07-scientific-computing-with-python/python-for-everybody/web-services-service-oriented-approach.md
fcastillo-serempre/freeCodeCamp
43496432d659bac8323ab2580ba09fa7bf9b73f2
[ "BSD-3-Clause" ]
27,418
2017-01-11T06:31:22.000Z
2022-03-31T20:44:38.000Z
---
id: 5e7b9f140b6c005b0e76f07e
title: 'Web Services: A Service-Oriented Approach'
challengeType: 11
videoId: muerlsCHExI
bilibiliIds:
  aid: 846899335
  bvid: BV1E54y1J7oz
  cid: 377333277
dashedName: web-services-service-oriented-approach
---

# --question--

## --text--

According to the service-oriented approach to web application development, where is the data located?

## --answers--

Distributed across many computer systems, which are connected to each other via the internet or an internal network.

---

Within different services on the main web server.

---

On a separate database server.

## --video-solution--

1
17.314286
111
0.754125
ukr_Cyrl
0.995014
9844c88632b6ad922a44bc66d08882ddb920168c
1,159
md
Markdown
content/get-in-touch.md
jlandowner/documentation
60e059909084779d05055a69c085023f0dd3a2c8
[ "Apache-2.0" ]
1
2021-06-10T07:58:19.000Z
2021-06-10T07:58:19.000Z
content/get-in-touch.md
jlandowner/documentation
60e059909084779d05055a69c085023f0dd3a2c8
[ "Apache-2.0" ]
null
null
null
content/get-in-touch.md
jlandowner/documentation
60e059909084779d05055a69c085023f0dd3a2c8
[ "Apache-2.0" ]
null
null
null
---
title: Getting in touch
---

## Via chat or email

Join our open online chat room [gitter.im/jaegertracing](https://gitter.im/jaegertracing/Lobby) or the Google mail group [jaeger-tracing@googlegroups.com](https://groups.google.com/forum/#!forum/jaeger-tracing).

## Bi-weekly project meetings

The Jaeger maintainers and contributors meet on a video call every other week, and everyone is welcome to join and participate, discuss their issues, and present case studies. Agenda and meeting details [are in this Google doc][bi-weekly-call].

## Report issues on GitHub

Our GitHub org has many different repos; please make sure to pick the one appropriate for your issue.

* For general backend features or issues, use the main repo [github.com/jaegertracing/jaeger](https://github.com/jaegertracing/jaeger).
* For frontend features or issues, use [github.com/jaegertracing/jaeger-ui](https://github.com/jaegertracing/jaeger-ui).
* For features or issues with client libraries, pick the [corresponding repository](/docs/latest/client-libraries/#supported-libraries).

[bi-weekly-call]: https://docs.google.com/document/d/1ZuBAwTJvQN7xkWVvEFXj5WU9_JmS5TPiNbxCJSvPqX0/
55.190476
240
0.788611
eng_Latn
0.879224
9845e7400df13e1e760330547b1667ada421a3aa
1,773
md
Markdown
his/core/planning-for-transaction-integrator2.md
SicongLiuSimon/biztalk-docs
85394b436d277504d9e759c655608888123785bd
[ "CC-BY-4.0", "MIT" ]
1
2020-06-16T22:06:46.000Z
2020-06-16T22:06:46.000Z
his/core/planning-for-transaction-integrator2.md
AzureMentor/biztalk-docs
16b211f29ad233c26d5511475c7e621760908af3
[ "CC-BY-4.0", "MIT" ]
7
2020-01-09T22:34:58.000Z
2020-02-18T19:42:16.000Z
his/core/planning-for-transaction-integrator2.md
AzureMentor/biztalk-docs
16b211f29ad233c26d5511475c7e621760908af3
[ "CC-BY-4.0", "MIT" ]
2
2017-06-23T18:30:28.000Z
2017-11-28T01:11:25.000Z
---
title: "Planning for Transaction Integrator | Microsoft Docs"
ms.custom: ""
ms.date: "11/30/2017"
ms.prod: "host-integration-server"
ms.reviewer: ""
ms.suite: ""
ms.tgt_pltfrm: ""
ms.topic: "article"
ms.assetid: 3baca727-234d-494b-93a2-0be91efa6d1a
caps.latest.revision: 3
author: "gplarsen"
ms.author: "hisdocs"
manager: "anneta"
---

# Planning for Transaction Integrator

## Overview

Before installing and using Transaction Integrator (TI), determine whether your mainframe-based transaction programs (TPs) can be used with TI and whether any of them need modification. Answer the following three questions to find out whether TI can invoke your TP:

- Is the TP irretrievably terminal-oriented, or can you expose a request-response interface?
- What programming model does the TP need?
- Does the TP use data types that TI supports?

To use TI to invoke a mainframe-based transaction program (TP), you must separate the business logic from the presentation logic in the TP. TI uses the request-response model; it does not support conversational or pseudo-conversational transactions. The TP must support the so-called ping-pong request-reply mode. Although TI does not support screen scraping, it does support eight other communication models. Some CICS and most IMS transactions that expose a terminal interface can also be invoked using one of the eight supported models. For example, a CICS transaction might be terminal-oriented but still have the business logic partitioned into a separate link-model transaction for load balancing or maintainability.

## Next steps

[Communication Models](../core/communication-models2.md)
[CICS and VTAM Sample Definitions for LU 6.2](../core/cics-and-vtam-sample-definitions-for-lu-6-21.md)
53.727273
408
0.772702
eng_Latn
0.987519
984618f74ce38bb2d58f036486764d4d9c7453b5
2,025
md
Markdown
README.md
Adrixop95/Snek_Game_Inzynieria_oprogramowania_w_grach
ea8d06fc3b03f91b942ae2fadf23c92ab89eba85
[ "MIT" ]
null
null
null
README.md
Adrixop95/Snek_Game_Inzynieria_oprogramowania_w_grach
ea8d06fc3b03f91b942ae2fadf23c92ab89eba85
[ "MIT" ]
null
null
null
README.md
Adrixop95/Snek_Game_Inzynieria_oprogramowania_w_grach
ea8d06fc3b03f91b942ae2fadf23c92ab89eba85
[ "MIT" ]
null
null
null
# Snake

### Software Engineering in Games

## Team:

- Laura Dymarczyk
- Agata Dziurka
- Kamil Karpiński
- Adrian Rupala

## Game concept:

Our project aims to design and build a Snake game - our interpretation of the classic game that has existed since 1976. The game consists of steering the titular snake around a rectangular board on which points (apples) appear; these are our creature's food. Eating lets the snake grow, and the goal of the game is to fill the entire board with the snake's body. The game is lost by running into the outer edges of the board, the so-called "walls".

## Design patterns used in the project

### Creational patterns:

- Prototype - used when creating instances of apples and successive snake segments.
- Singleton - used to create the board instance (a single object suffices for the whole game), the object holding the snake's current movement state, and the object that tallies points during play.

### Structural patterns:

- Composite - used to represent the snake as a linear recursive structure. This helps move the snake's whole body in a continuous way.

### Behavioral patterns:

- State - the snake can be in one of four movement states: up (y increases), down (y decreases), left (x decreases), or right (x increases). There is also a game state, which can be won, lost, or in progress.
- Command - changes the snake's movement state in response to keyboard events.
- Observer - there are two observer instances. One watches the snake relative to the apple and signals them when they meet on the same cell. The other watches the snake and the wall, and signals the snake to inform it of a collision.

### Sequencing patterns:

- Update method - the counting used when calculating points.
- Renderer - visualizes the game state, wires up assets, and handles graphical effects.
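As a minimal sketch of the State pattern described above - four movement states, keyboard commands switching the state, and an update method advancing the head in the current direction. The class and method names are illustrative, not the project's actual code:

```java
// Illustrative sketch of the snake's movement states (hypothetical names).
public class SnakeStateSketch {
    // The four movement states from the README: up is +y, down is -y,
    // left is -x, right is +x.
    enum Direction {
        UP(0, 1), DOWN(0, -1), LEFT(-1, 0), RIGHT(1, 0);
        final int dx, dy;
        Direction(int dx, int dy) { this.dx = dx; this.dy = dy; }
    }

    static class SnakeHead {
        int x, y;
        Direction state = Direction.RIGHT;  // current movement state

        // A keyboard command would call this to switch the state.
        void setState(Direction d) { state = d; }

        // The update method moves the head one cell in the current state.
        void update() { x += state.dx; y += state.dy; }
    }

    public static void main(String[] args) {
        SnakeHead head = new SnakeHead();
        head.update();                  // moving right: (1, 0)
        head.setState(Direction.UP);
        head.update();                  // moving up: (1, 1)
        System.out.println(head.x + "," + head.y);  // prints 1,1
    }
}
```

The game-state (won/lost/in-progress) enum would follow the same shape, with the observers flipping it on apple or wall events.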
63.28125
262
0.800494
pol_Latn
1.000009
98472a6ea9086fa216f40fe4a98f9742299c4d3f
218
md
Markdown
_watches/M20200417_020403_TLP_9.md
Meteoros-Floripa/meteoros.floripa.br
7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad
[ "MIT" ]
5
2020-01-22T17:44:06.000Z
2020-01-26T17:57:58.000Z
_watches/M20200417_020403_TLP_9.md
Meteoros-Floripa/site
764cf471d85a6b498873610e4f3b30efd1fd9fae
[ "MIT" ]
null
null
null
_watches/M20200417_020403_TLP_9.md
Meteoros-Floripa/site
764cf471d85a6b498873610e4f3b30efd1fd9fae
[ "MIT" ]
2
2020-05-19T17:06:27.000Z
2020-09-04T00:00:43.000Z
---
layout: watch
title: TLP9 - 17/04/2020 - M20200417_020403_TLP_9T.jpg
date: 2020-04-17 02:04:03
permalink: /2020/04/17/watch/M20200417_020403_TLP_9
capture: TLP9/2020/202004/20200416/M20200417_020403_TLP_9T.jpg
---
27.25
62
0.784404
eng_Latn
0.04649
984748b27668f31b649f8121cab4173d87da7617
1,923
md
Markdown
_posts/2021-01-06-ssafy-apply-review.md
Heeseok-Jeong/Heeseok-Jeong.github.io
ec463c4ffd6d9afe12a6ee1a949979da4b6ee2bf
[ "MIT" ]
null
null
null
_posts/2021-01-06-ssafy-apply-review.md
Heeseok-Jeong/Heeseok-Jeong.github.io
ec463c4ffd6d9afe12a6ee1a949979da4b6ee2bf
[ "MIT" ]
null
null
null
_posts/2021-01-06-ssafy-apply-review.md
Heeseok-Jeong/Heeseok-Jeong.github.io
ec463c4ffd6d9afe12a6ee1a949979da4b6ee2bf
[ "MIT" ]
null
null
null
---
layout: post
title: SSAFY 5th Cohort Acceptance Review
subtitle: SSAFY, Samsung Software Academy For Youth
tags: [SSAFY]
author: Heeseok Jeong
comments: True
sitemap:
  changefreq: daily
  priority: 1.0
---

## Applying

The results of the SSAFY 5th cohort admissions process, which began last October, came out in mid-December.

While job hunting for the first time, I passed the document screening at Hyundai AutoEver but failed the coding test. I also applied to several companies' open recruitments and internships, but kept failing the coding tests and felt my skills were lacking.

I knew SSAFY well because a close university senior of mine was in the 3rd cohort. So I applied, hoping to study algorithms there and grow into a web backend developer working with Java and Spring.

## Admissions process

Below is what I recorded at the time about how I prepared for each stage.

### Document screening

What sparked my interest in software, and what kind of developer I want to become:

- Experienced Python and HTML in a first-year course → grew interested by seeing things work exactly as I wrote them and by understanding how the software I use every day works
- Described teaching myself C, and how digging into IT topics on my own led me to consider becoming a developer
- I want to become a developer who is strong on fundamentals and has real expertise
- I work on the basics but feel my skills fall short and that studying alone is hard, so I want good instruction and peers to learn with
- Emphasized that I put in effort but feel my shortcomings, and that I want to collaborate with peers
- Emphasized that I want to build expertise through practical projects
- Gave my internship as an example of IT-related job-hunting effort
- Data labeling: got better results by working with colleagues → emphasized teamwork
- Solved a problem in a language I was using for the first time → emphasized problem-solving and goal achievement

### Online test

- Aptitude test
  - 15 questions in 30 minutes; not that hard; solved 11
  - Buying a Samsung GSAT workbook and doing the reasoning/math sections helped
- CT
  - 5 questions (each with 5 sub-questions, 25 in total) in 30 minutes; felt less like algorithms and more like quickly grasping and applying rules; solved 2.5
  - From sub-questions 4 and 5 the inputs get long and it becomes quite hard; I'm not sure whether it's better to solve the first three of everything or grind through one question thoroughly

### Interview

- CT
  - 2 questions in 12 minutes, very easy; solved 1
  - I solved the first one fine, but couldn't understand the second at all and got nowhere with it
  - When I asked a friend and understood it, it was so easy I shed a tear
- PT
  - Choose 1 of 2 prompts in 1 minute, then 20 minutes to prepare
  - I was in group A, so I went straight in
  - There seem to be very many prompt sets, and the topics vary so much you can never guess what will come up
  - I chose the ~~ topic (details are confidential!)
  - My idea: extract data from a financial-sector cloud DB → train a personalized recommendation model → a service recommending financial products to individuals
  - Presented competitors, a concrete process, constraints, and so on
  - But after digesting the topic, I steered it toward an idea I had prepared in advance
- Interview (sorry, I can't write the details ㅜ)
  - Self-introduction and questions about my application; I introduced myself first, froze for about 5 seconds from nerves and trembled through it, so they told me to relax and asked about my application first
  - Questions about my projects
  - Questions about collaboration, leadership, etc.
  - Questions about my coding-test studies (I had mentioned them in my application)
  - PT: presented what I prepared; was asked what would matter for a concrete implementation, for concrete examples of possible services, and to explain the topic itself

## Results

![Pass Image]({{ site.baseurl }}/assets/img/ssafy_pass.png)
21.852273
92
0.613105
kor_Hang
1.00001
9848587f655c2a77b1b553a4488dbca84b922cff
2,215
md
Markdown
node_modules/postcss-loader/README.md
ZhouHengYi/https---github.com-ZhouHengYi-RenRen_react-starter-kit
3278bcbd3df340e557d9d3ec9ac2375ae1f9fece
[ "MIT" ]
null
null
null
node_modules/postcss-loader/README.md
ZhouHengYi/https---github.com-ZhouHengYi-RenRen_react-starter-kit
3278bcbd3df340e557d9d3ec9ac2375ae1f9fece
[ "MIT" ]
null
null
null
node_modules/postcss-loader/README.md
ZhouHengYi/https---github.com-ZhouHengYi-RenRen_react-starter-kit
3278bcbd3df340e557d9d3ec9ac2375ae1f9fece
[ "MIT" ]
null
null
null
# PostCSS for Webpack

[![Build Status][ci-img]][ci]

<img align="right" width="95" height="95" title="Philosopher’s stone, logo of PostCSS" src="http://postcss.github.io/postcss/logo.svg">

[PostCSS] loader for [webpack] to postprocess your CSS with [PostCSS plugins].

<a href="https://evilmartians.com/?utm_source=postcss-loader">
<img src="https://evilmartians.com/badges/sponsored-by-evil-martians.svg" alt="Sponsored by Evil Martians" width="236" height="54">
</a>

[PostCSS plugins]: https://github.com/postcss/postcss#built-with-postcss
[PostCSS]: https://github.com/postcss/postcss
[webpack]: http://webpack.github.io/
[ci-img]: https://travis-ci.org/postcss/postcss-loader.svg
[ci]: https://travis-ci.org/postcss/postcss-loader

## Usage

Set the `postcss` section in your webpack config:

```js
var autoprefixer = require('autoprefixer-core');
var csswring = require('csswring');

module.exports = {
  module: {
    loaders: [
      {
        test: /\.css$/,
        loader: "style-loader!css-loader!postcss-loader"
      }
    ]
  },
  postcss: [autoprefixer, csswring]
}
```

Now your required CSS files will be processed by the selected PostCSS plugins:

```js
var css = require('./file.css');
// => CSS after Autoprefixer and CSSWring
```

## Plugin Packs

If you want to process different styles with different PostCSS plugins, you can define plugin packs in the `postcss` section and use them via the `?pack=name` parameter.

```js
module.exports = {
  module: {
    loaders: [
      {
        test: /\.docs\.css$/,
        loader: "style-loader!css-loader!postcss-loader?pack=cleaner"
      },
      {
        test: /\.css$/,
        loader: "style-loader!css-loader!postcss-loader"
      }
    ]
  },
  postcss: {
    defaults: [autoprefixer, csswring],
    cleaner: [autoprefixer({ browsers: [] })]
  }
}
```

## Safe Mode

If you add `?safe=1` to the require, PostCSS will try to correct any syntax errors it finds in the CSS. For example, it will parse `a {` as `a {}`.

```js
var css = require('postcss?safe=1!./broken')
```
27.012195
131
0.609932
kor_Hang
0.363766
984890dd27ccae33bd6eaae74d94465a3f89a2ff
815
md
Markdown
README.md
shadowoom/shadowoom.github.io
00386054af032fa3c5608f6a114a3db3749ddee4
[ "MIT" ]
null
null
null
README.md
shadowoom/shadowoom.github.io
00386054af032fa3c5608f6a114a3db3749ddee4
[ "MIT" ]
null
null
null
README.md
shadowoom/shadowoom.github.io
00386054af032fa3c5608f6a114a3db3749ddee4
[ "MIT" ]
null
null
null
# Zhang Chen's Personal Tech Blog

This blog uses the Edition template, the product documentation template for Jekyll.

## Blog Categories

* Algorithm
* Artificial Intelligence
* Database
* Operating System
* Programming Language
* Web Development

### Documentation pages

* Add, update, or remove a documentation page in the *Documentation* collection.
* Change the category of a documentation page to move it to another section in the navigation.
* Documentation pages are organised in the navigation by category, with URLs based on the path inside the `_docs` folder.

### Search

* Add `excluded_in_search: true` to any documentation page's front matter to exclude that page from the search results.

### Navigation

* Change `site.show_full_navigation` to control whether all navigation groups are open or only the current one.
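For example, a documentation page's front matter with search exclusion might look like this (a hypothetical page; `excluded_in_search` is the switch described above, and the other fields are ordinary Jekyll front matter):

```yaml
---
title: Legacy API notes          # hypothetical page title
category: Programming Language   # controls which navigation section the page lands in
excluded_in_search: true         # omit this page from search results
---
```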
30.185185
121
0.782822
eng_Latn
0.987766
9848dd1767a7782f09927ca4c1cd2d59220bb69f
1,006
md
Markdown
changelog/sprint-2.md
dickyhermawan12/Final-Project-OOP
e7205a0228db76a10e0327d38e161c98b6c475a3
[ "MIT" ]
null
null
null
changelog/sprint-2.md
dickyhermawan12/Final-Project-OOP
e7205a0228db76a10e0327d38e161c98b6c475a3
[ "MIT" ]
12
2020-11-19T17:18:17.000Z
2021-11-16T05:10:22.000Z
changelog/sprint-2.md
dickyhermawan12/Final-Project-OOP
e7205a0228db76a10e0327d38e161c98b6c475a3
[ "MIT" ]
1
2020-11-19T18:02:22.000Z
2020-11-19T18:02:22.000Z
# Scrum Report (Sprint 2) | From 25/11/2020 to 01/11/2020

## Team Winduisme

| NPM | Name |
| ------------- |-------------|
| 140810190001 | Dicky Rahma Hermawan |
| 140810190015 | Salsabila Karin |
| 140810190041 | Windu Nursetyadi |

## Sprint Overview

| Planned (n) | Completed (n) |
| ------------- |-------------- |
| 3 | 3 |

## Sprint 2 Backlog

| ID | Title/Desc | Assignee | Status |
| --- | ---------- | ------- | ------ |
| 2.1 | Build the menu screen and its options | Karin | Done |
| 2.2 | Build the settings | Dicky | Done |
| 2.3 | Tidy up the UI | Windu | Done |

## Retrospective

In the last sprint meeting, the team also planned the task assignments for sprint 3. Beyond that, there were no obstacles.

## Next Sprint Backlog (Sprint 3)

| ID | Title/Desc | Assignee |
| --- | ---------- | ------- |
| 3.1 | Refactor the main menu drawables | Karin |
| 3.2 | Bug fixing | Dicky |
| 3.3 | Add a *single player* mode | Windu |
29.588235
121
0.54672
ind_Latn
0.639575
9848e25d6c4abd2ee822f595936e8a90a96e711c
679
md
Markdown
catalog/hot-limit/en-US_hot-limit.md
htron-dev/baka-db
cb6e907a5c53113275da271631698cd3b35c9589
[ "MIT" ]
3
2021-08-12T20:02:29.000Z
2021-09-05T05:03:32.000Z
catalog/hot-limit/en-US_hot-limit.md
zzhenryquezz/baka-db
da8f54a87191a53a7fca54b0775b3c00f99d2531
[ "MIT" ]
8
2021-07-20T00:44:48.000Z
2021-09-22T18:44:04.000Z
catalog/hot-limit/en-US_hot-limit.md
zzhenryquezz/baka-db
da8f54a87191a53a7fca54b0775b3c00f99d2531
[ "MIT" ]
2
2021-07-19T01:38:25.000Z
2021-07-29T08:10:29.000Z
# Hot Limit

![hot-limit](https://cdn.myanimelist.net/images/manga/3/8984.jpg)

- **type**: manga
- **volumes**: 1
- **chapters**: 5

## Tags

- yaoi

## Authors

- Kanbe, Akira (Art)
- Shima, Minori (Story)

## Synopsis

By day he is a beautiful honor student, but when night falls, the sexy and lewd "Maya" appears. Even though university student Kazuma wants to deny the existence of Maya's double life, he gets drawn in by him. Before long, Kazuma finds out that a young congressman, Tanabe, is manipulating Maya like a puppet, and is using him for the government... [from B-U]

## Links

- [My Anime List](https://myanimelist.net/manga/6718/Hot_Limit)
23.413793
348
0.686303
eng_Latn
0.967197
9848e31ba376373bf70f5e974d29de3797a720e6
1,187
md
Markdown
README.md
ThoughtHaven/Azure
a4b3a3f2dd3dee80fdf4f76d60828c26203857e6
[ "MIT" ]
2
2018-03-15T22:08:28.000Z
2020-09-08T17:24:16.000Z
README.md
ThoughtHaven/Azure
a4b3a3f2dd3dee80fdf4f76d60828c26203857e6
[ "MIT" ]
null
null
null
README.md
ThoughtHaven/Azure
a4b3a3f2dd3dee80fdf4f76d60828c26203857e6
[ "MIT" ]
null
null
null
Thought Haven Azure
===

Data abstractions and helpers around the official Microsoft Azure library so that cloud storage development feels more like working with POCOs, not TableEntities, etc.

Bridge the gap between Azure Table storage and the repository pattern. Pass in a POCO, configure the entity's keys, and you have an ICrudStore. Also, guarantee that your app only checks for a table or blob container's existence once per instance.

## Other projects:

* [AspNetCore](https://github.com/ThoughtHaven/AspNetCore): Wrappers and helpers built on top of Microsoft's wonderful AspNetCore and MVC libraries. Get started faster and easier while following enforced best practices.
* [Identity](https://github.com/ThoughtHaven/Identity): An alternative Identity framework for AspNetCore built on flexibility, extensibility, and code separation.
* [Core](https://github.com/ThoughtHaven/Core): Low-level libraries and helpers for use in any application.
* [Security](https://github.com/ThoughtHaven/Security): Making the secure thing easier and faster.
* [Emailers](https://github.com/ThoughtHaven/Emailers): Abstractions for email messages and services, as well as a SendGrid implementation.
84.785714
242
0.802022
eng_Latn
0.961029
9849776603ab70cae7c260c3e57dd42033e7dcfc
87
md
Markdown
README.md
taroninak/swipe-pad-server
6f76c85d9c494e6b8b50d400e3d95416862464f7
[ "MIT" ]
null
null
null
README.md
taroninak/swipe-pad-server
6f76c85d9c494e6b8b50d400e3d95416862464f7
[ "MIT" ]
null
null
null
README.md
taroninak/swipe-pad-server
6f76c85d9c494e6b8b50d400e3d95416862464f7
[ "MIT" ]
null
null
null
# SwipePad Server

SwipePad is an application for using an Android device as a touchpad.
29
68
0.816092
eng_Latn
0.995128
984bffd8d4d75310155613d3bd77b951e2f664f3
104
md
Markdown
README.md
Shanthi-Rajendran/ReactNativeMomentOfTheDay
9d3638a579a1413b3ef1fcba0ad325afbf23278f
[ "MIT" ]
null
null
null
README.md
Shanthi-Rajendran/ReactNativeMomentOfTheDay
9d3638a579a1413b3ef1fcba0ad325afbf23278f
[ "MIT" ]
null
null
null
README.md
Shanthi-Rajendran/ReactNativeMomentOfTheDay
9d3638a579a1413b3ef1fcba0ad325afbf23278f
[ "MIT" ]
null
null
null
# ReactNativeMomentOfTheDay

A React Native-based npm package to get the moment of the day in different locales.
34.666667
75
0.836538
eng_Latn
0.939809
984c9e819f0b27eed68da8616066823a0c1fca16
2,643
md
Markdown
docs/fetchers.md
ianmorgan/graph-store
7b0f6e4ac15c3a39a28b9d5df3b13ca83f1dcf98
[ "MIT" ]
3
2019-01-17T17:16:10.000Z
2020-05-15T22:45:17.000Z
docs/fetchers.md
ianmorgan/doc-store
7b0f6e4ac15c3a39a28b9d5df3b13ca83f1dcf98
[ "MIT" ]
null
null
null
docs/fetchers.md
ianmorgan/doc-store
7b0f6e4ac15c3a39a28b9d5df3b13ca83f1dcf98
[ "MIT" ]
null
null
null
# (GraphQL) Fetchers

## Overview

"Wiring up" GraphQL queries using the [GraphQL Java](https://github.com/graphql-java/graphql-java) library is essentially a two-step process.

Step 1. For each type, interface, and union defined in the schema a TypeResolver is required, which tells the library which fields are expected. This process is quite simple, though it feels a little unnecessary as the information is readily available in the schema that has just been parsed. Presumably this is a trade-off in the internal API design, which is flexible enough to drive directly through Java without an actual schema file (_or I have just misunderstood the API_).

Step 2. To return actual data, a "Fetcher" is required (so essentially a fetcher must be wired up for each query). Fetchers are somewhat more complicated, and different patterns are required for types, unions, and interfaces, though they all ultimately delegate to the [DAO](daos) layer.

The GraphQL Java library supports a variety of object mappers in fetchers, but these all work using a basic Java Map (the key in the Map must match the name of the field, and the value must be stored in a Java class that can be coerced by the [Type Mapping](typeMappings) rules). The underlying data in the Map is simply that returned by the [DAO](daos), with some extra pseudo fields (all prefixed with #), which are used to pass additional data for, amongst other things, pagination and type resolution.

The entry point in the code is GraphQLFactory, which is passed the actual GraphQL schema and a collection of the DAOs. It examines the queries and builds fetchers according to the following rules.

### Document by ID

A query like the one below is the simplest: simply look up a document by its ID.

```yaml
droid(id: ID!): Droid
```

This is implemented by a DocDataFetcher.

### Interface by ID

A query like the one below simply tries a lookup for each document type in the interface until a result is returned.
```yaml
character(id: ID!): Character
```

This is implemented by an InterfaceDataFetcher.

### A collection of documents

A query like the one below is slightly more complicated. It needs to find all documents of type 'Human' where the name field matches.

```yaml
humans(name : String!) : [Human]
```

This is implemented by a DocListDataFetcher. A production-quality implementation will need some type of index, but the current implementation simply runs through the entire collection in memory.

### Pseudo fields

As noted, these are purely for internal processing.

#### #docType

The docType, which is actually the 'type' name in the GraphQL schema, e.g. 'Human'.
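The dispatch rules above (document by ID, interface by ID, and an in-memory scan for a collection) plus the `#docType` pseudo field can be sketched in a few lines. The project itself is Java; this is an illustrative Python sketch with a made-up `daos` dict standing in for the DAO layer, not the repository's actual code:

```python
# Sketch of the three fetcher rules described above, using plain dicts in
# place of the Java Maps that GraphQL Java coerces into response objects.
# The `daos` layout and function names here are illustrative, not the real API.

daos = {
    "Droid": {"2001": {"id": "2001", "name": "R2-D2"}},
    "Human": {"1000": {"id": "1000", "name": "Luke"},
              "1001": {"id": "1001", "name": "Leia"}},
}

def doc_fetcher(doc_type, doc_id):
    """Document by ID: look the document up in a single DAO."""
    doc = daos[doc_type].get(doc_id)
    if doc is not None:
        # pseudo field, prefixed '#', used later for type resolution
        return {**doc, "#docType": doc_type}
    return None

def interface_fetcher(doc_types, doc_id):
    """Interface by ID: try each implementing type until one matches."""
    for doc_type in doc_types:
        doc = doc_fetcher(doc_type, doc_id)
        if doc is not None:
            return doc
    return None

def doc_list_fetcher(doc_type, **filters):
    """Collection query: scan the whole collection in memory (no index)."""
    return [{**doc, "#docType": doc_type}
            for doc in daos[doc_type].values()
            if all(doc.get(k) == v for k, v in filters.items())]
```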
40.661538
125
0.768445
eng_Latn
0.99969
984cde6614d8cc3d4fe61385edb3ddde5d477364
26
md
Markdown
README.md
Lucius671/AtI
9843e1771fa4f7da8a7d72e21afc2652e27dd693
[ "Unlicense" ]
null
null
null
README.md
Lucius671/AtI
9843e1771fa4f7da8a7d72e21afc2652e27dd693
[ "Unlicense" ]
null
null
null
README.md
Lucius671/AtI
9843e1771fa4f7da8a7d72e21afc2652e27dd693
[ "Unlicense" ]
null
null
null
# AtI

Around The Infinity
8.666667
19
0.769231
eng_Latn
0.967978
984d9ad8e8a51658e3638bbe34062d07049fb638
3,363
md
Markdown
pages/content/amp-dev/documentation/guides-and-tutorials/start/visual_story/start_story@ko.md
machal/docs
6a63453ae3b296ad485870a3ab15a8bf07a7becd
[ "Apache-2.0" ]
10
2018-07-27T12:13:59.000Z
2019-02-09T16:54:01.000Z
pages/content/amp-dev/documentation/guides-and-tutorials/start/visual_story/start_story@ko.md
machal/docs
6a63453ae3b296ad485870a3ab15a8bf07a7becd
[ "Apache-2.0" ]
140
2018-07-25T17:20:43.000Z
2019-03-06T17:33:00.000Z
pages/content/amp-dev/documentation/guides-and-tutorials/start/visual_story/start_story@ko.md
machal/docs
6a63453ae3b296ad485870a3ab15a8bf07a7becd
[ "Apache-2.0" ]
3
2018-07-25T13:52:29.000Z
2019-07-05T09:21:22.000Z
---
$title: Kicking off your story
---

The story you are about to write goes inside the [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) component, which acts as the container that holds every page making up the story. The [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) component also serves as the UI shell that handles user gestures and navigation.

Because [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) is a custom AMP component, the script it needs must be added to the AMP document first. In our example, **open** the `pets.html` file in your editor and **add** the following script to the `<head>` section:

```html hl_lines="2 3"
<head>
  <script async custom-element="amp-story"
      src="https://cdn.ampproject.org/v0/amp-story-1.0.js"></script>
</head>
```

Now **add** the `<amp-story>` element inside the `<body>`. Oh, and don't forget to add the `standalone` attribute.

```html hl_lines="2 3"
<body>
  <amp-story standalone>
  </amp-story>
</body>
```

One important thing here: the `<body>` element must contain exactly one [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) component, and all other elements must live inside that [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) component.

## Setting up the metadata

The AMP story ecosystem stays healthy only if stories are easy to find, and for that you need to supply a bit of information about your story as metadata. For example:

* The title of the story: the `title` attribute. For instance, the text "Joy of Pets".
* The name of the publisher: the `publisher` attribute. For instance, the text "AMP tutorials".
* The publisher's logo: the `publisher-logo-src` attribute. This takes the URL of the logo image, which must have a 1x1 aspect ratio (that is, be square).
* The story's poster image: the `poster-portrait-src` attribute. This takes the URL of the poster image, which must be a portrait image with a 3x4 aspect ratio.

Now, let's fill in the required attributes of the [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) tag with suitable values:

```html hl_lines="2 3 4 5"
<amp-story standalone
    title="Joy of Pets"
    publisher="AMP tutorials"
    publisher-logo-src="assets/AMP-Brand-White-Icon.svg"
    poster-portrait-src="assets/cover.jpg">
```

There are other attributes besides the required ones above. To learn about them, see the [attributes]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}#attributes) section of the [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) reference documentation.

[tip type="note"]
The metadata attributes described here do not replace the page's Structured Data (for example, JSON-LD). Even if the title, publisher, and similar values in [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) duplicate Structured Data already on the page, that is no reason to remove the Structured Data. For reference, [Structured Data]({{g.doc('/content/amp-dev/documentation/guides-and-tutorials/optimize-measure/discovery.md', locale=doc.locale).url.path}}#integrate-with-third-party-platforms-through-additional-metadata) is a metadata format that helps AMP documents (including AMP stories) display well on many platforms and services.
[/tip]

At this point you have built only the shell, and it is not yet a valid document: the [`amp-story`]({{g.doc('/content/amp-dev/documentation/components/reference/amp-story.md', locale=doc.locale).url.path}}) component must contain at least one page. So now let's create a page.
45.445946
274
0.719298
kor_Hang
0.999571
984dadbf5b7171f4c9177763a97f945d067767b4
5,335
md
Markdown
source/blog/2015-12-10-the-other-side-of-documentation.html.md
mlichvar/community-website
adc97e58dea3dcc557a4dadc16693511319c28a2
[ "MIT" ]
null
null
null
source/blog/2015-12-10-the-other-side-of-documentation.html.md
mlichvar/community-website
adc97e58dea3dcc557a4dadc16693511319c28a2
[ "MIT" ]
null
null
null
source/blog/2015-12-10-the-other-side-of-documentation.html.md
mlichvar/community-website
adc97e58dea3dcc557a4dadc16693511319c28a2
[ "MIT" ]
null
null
null
---
title: The Other Side of Documentation
date: 2015-12-10 19:19 UTC
author: bkp
tags: community, open source, documentation
comments: true
published: true
---

![bookshelves](blog/bookshelves.jpg)

For those of us who write a lot, there is a certain bias for the notion that "if you don't write it down, it didn't happen." It's not just writers; as digital as our world has become, there is still a value of permanence to the written word--it's just more likely in bytes rather than paper.

In free and open source software communities, there's always a lot of stock put into the need to have written documentation of most any sort. From user guides, to feature specs, to marketing materials, a community's collective shared knowledge should not rely on the memory and experience of a few people in the community, but rather information that's freely available to all.

This may be preaching to the choir, of course, since the need for documentation is well established. Of course, getting that documentation created can be a challenge in and of itself. Many are the tales of project after project that could really take off in terms of adoption and contribution, except people are held back because they don't know enough about it or the learning curve is too steep.

Arguing the pros and cons of documentation is not, however, the point of this discussion. Let's assume that this journey is underway and you are producing documents for your community. Now the question becomes: what do you do with them?

READMORE

## Stuck On You

The problem that sometimes crops up within communities is that documents tend to be sticky. Specifically, they stick with the author/creator of the document. This is usually not a result of people being intentionally selfish. I have been guilty of this myself at times. I finish up a nice howto on using a new software feature and then I send a notice of its location on Google Drive or Dropbox out to a team or community by email.
The document is received with some gratitude, and life goes on. Until this happens: the email drops in my inbox that reads "Hey, Brian, where's that howto you wrote on the feature *X* a while back?"

At that exact moment, you have a document problem. The problem is not that you don't *have* a document available... the issue is that it's not stored in a way that everyone who needs to find it can find it easily. Sure, I was cool and I published the document on [*Insert Cloud Platform Here*], but I likely posted to *my* account's space within that larger cloud platform. The document is still public, but it's still associated with me. It is, in effect, sticky.

This happens all of the time: slides, documents, and spreadsheets are scattered hither and yon throughout your community. They are publicly shared, but their *accessibility* is fragmented across multiple locations based on authors.

This may seem like a non-issue to some of you, because come on, how hard is it for people to just search their email for "sender: Brian feature X"? Indeed, I suspect many of us have gotten used to using our inboxes as *ad hoc* card catalogs where we try to call up enough keywords to find the information or document we need.

But such searches, while occasionally valid, still rely on very specific institutional knowledge. What about, for instance, the community member who wasn't active in the community when you first sent out your document? If they don't ask around, they might assume that such a document was never made, and will either try to move on without it or, worse, duplicate effort and make their own version of the document.

## Consolidation Is Key

The solution here is to try to gather all of the known documentation into one single repository that is prominently located within your community so *anyone* in your project should be able to find it. Additionally, that repository should be easily searchable so specific contents can be quickly located.
The type of repository does not matter: it can be GitHub, a community wiki, or even Google Drive. The important thing is that it's one repository for all documents... not many repositories for all documents.

This may seem dead obvious, but the truth is that even communities that have made a commitment to create documentation do not consistently archive their materials in this manner. Paradoxically, it's almost one of those things that seems so obvious that you *don't* have to do it. But the benefits of a central repository are clear:

* Project knowledge is more broadly shared
* Institutional knowledge is reduced
* Document consolidation can help identify gaps in coverage and versions
* Document collections can motivate the creation of more documents

It is important to note that you should not just dump your documents into a centralized repo and call it a day. Effective organization, such as through tagging, is another vital part of making such archives work. Sometimes archive users don't know the right search question to ask, so the more they can narrow their search down to the right content, the better.

Getting documentation right is a big enough challenge... don't make it hard for people to find what you do have.

*(Image courtesy [Alexandre Duret-Lutz](https://www.flickr.com/photos/gadl/110845690), [(CC BY-SA 2.0)](https://creativecommons.org/licenses/by-sa/2.0/))*
104.607843
739
0.788941
eng_Latn
0.999929
984e4e1838f6a09262f8728d3489a97560ccc885
2,209
md
Markdown
AlchemyInsights/owa-delete-contact.md
pebaum/OfficeDocs-AlchemyInsights-pr.lv-LV
e84c8c1b48e94a94d39adfe48bbcea4cc1fd919b
[ "CC-BY-4.0", "MIT" ]
null
null
null
AlchemyInsights/owa-delete-contact.md
pebaum/OfficeDocs-AlchemyInsights-pr.lv-LV
e84c8c1b48e94a94d39adfe48bbcea4cc1fd919b
[ "CC-BY-4.0", "MIT" ]
null
null
null
AlchemyInsights/owa-delete-contact.md
pebaum/OfficeDocs-AlchemyInsights-pr.lv-LV
e84c8c1b48e94a94d39adfe48bbcea4cc1fd919b
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Delete a contact in Outlook on the web
ms.author: daeite
author: daeite
manager: joallard
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.custom:
- "8000012"
- "1997"
ms.openlocfilehash: 695952c1c3179a41be1aa40b5be12c254687cb02
ms.sourcegitcommit: 55eff703a17e500681d8fa6a87eb067019ade3cc
ms.translationtype: MT
ms.contentlocale: lv-LV
ms.lasthandoff: 04/22/2020
ms.locfileid: "43721206"
---
# <a name="delete-a-contact"></a>Delete a contact

1. In the lower-left corner of the page in Outlook on the web, select <img src='data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABgAAAAYCAYAAADgdz34AAAACXBIWXMAAA7EAAAOxAGVKw4bAAAAB3RJTUUH4gEKEisVwYq3YQAAAAd0RVh0QXV0aG9yAKmuzEgAAAAMdEVYdERlc2NyaXB0aW9uABMJISMAAAAKdEVYdENvcHlyaWdodACsD8w6AAAADnRFWHRDcmVhdGlvbiB0aW1lADX3DwkAAAAJdEVYdFNvZnR3YXJlAF1w/zoAAAALdEVYdERpc2NsYWltZXIAt8C0jwAAAAh0RVh0V2FybmluZwDAG+aHAAAAB3RFWHRTb3VyY2UA9f+D6wAAAAh0RVh0Q29tbWVudAD2zJa/AAAABnRFWHRUaXRsZQCo7tInAAAClklEQVRIieWVX0iTYRTGz2aazW3f0nItTD7n/kCUf8q0QVnCChNGISpWoN1EF6WWNeoiwYtQClZUSEgphlBkCV4oKCsjEZcaQo5glW2oyyZafOosc86ny6Z+U7/Eq87l857n/Hjf88ArAgBaxxKv5/A1AcZHBsnpdNGUb4VGCC3OCcuFbKiVckgkEUhIN6G+/XPQdmGAuQlUnknDFrURDe19cDj6cacoC5HRCXhh/752wKitBvpt8XjumA1Q/SjPSca+girM8XgE7eDbgIvC/YmUpg8NUMVkSEqjmXcfycvjEQSI3bWTfKE91NrNBahT1Nr5mpiMZJLxmYQ8EeDDk/J8xMalwlLXBKu1BddOp0OTaILN/YvXsSzgh8eNoaFh/PT/1Wy1ZoQRIVzKQKFQQEyEmIwiuLz8M/gBnBO3S3KhVsogkUix15iPlu4BdD2+iEgpi9J7jRga98Lr9eJDZxNyDVrsOHgWrsnVAOYmUFGQiii1EU+tvbDbe1F5LgvKSAVkTBSu1vXwXLUfxjg5cm81rwzwdFZDp9Kg8ZMvQJ3F9WM6MNo8jPC/BF5aTkHCnsTI/EJ9w+Klux1faJNoDxm0gUehtD8lndrGthITJGEpJ0qpSjxIISAi0TIpGuuuhV7Fov79dIA6iZIjOhwursf8YsMKxbPkGTwyHwerP4QHDW3o6LCirDADmiQTur7yR1EgALDVXEYIEcIi5GAYOYgIbKYZ7hnB85cC3lSfx2YZi0t3G+Aa5cBxHPran8GUwiL+aDHc03xjVgn4PWjFbhWDkodvl3Z6enEgRoqC+9Z/B7yqyINKk4nhIM3NN7Ih1RViTABgQUyjk/PoZpmStgeJoiHnClk2uokWR3GZEgH/66e/2voDxKbBpBHmq6QAAAAASUVORK5CYII=' /> **People**.

2. Select the contact that you want to delete, and select **Delete**. If you can't select **Delete**, the contact may be from Skype or a connected social network account. To delete the contact, go to Skype or to the account where the contact is stored.
81.814815
1,396
0.90086
lvs_Latn
0.133099
984e95b2fcbe7851f750d0797be44744c2738fc8
86
md
Markdown
tls/class_tls_server.md
xxuuzhe/node-api-cn
5edc6072250d63bb2807ab6b9cf0b23f6fece8d6
[ "CC-BY-4.0" ]
1
2021-11-22T07:54:42.000Z
2021-11-22T07:54:42.000Z
tls/class_tls_server.md
lexmin0412/node-api-cn
5edc6072250d63bb2807ab6b9cf0b23f6fece8d6
[ "CC-BY-4.0" ]
null
null
null
tls/class_tls_server.md
lexmin0412/node-api-cn
5edc6072250d63bb2807ab6b9cf0b23f6fece8d6
[ "CC-BY-4.0" ]
null
null
null
<!-- YAML
added: v0.3.2
-->

The `tls.Server` class is a subclass of `net.Server` that accepts encrypted connections using TLS or SSL.
12.285714
55
0.616279
yue_Hant
0.954695
984f13e843ea0933ef30a13a1e0ac6a428b9caad
296
md
Markdown
content/documentation/_index.md
stevehouel/daswag-website
22a6f7c571a00b06d6463753af6d055588f64383
[ "Apache-2.0" ]
1
2022-01-29T13:33:11.000Z
2022-01-29T13:33:11.000Z
content/documentation/_index.md
stevehouel/daswag-website
22a6f7c571a00b06d6463753af6d055588f64383
[ "Apache-2.0" ]
null
null
null
content/documentation/_index.md
stevehouel/daswag-website
22a6f7c571a00b06d6463753af6d055588f64383
[ "Apache-2.0" ]
null
null
null
---
date: 2017-07-19T09:00:00+00:00
title: Documentation overview
---

Coming soon!

{{< note title="What's next?" >}}
* [Learn how daSWAG works](/documentation/how-daswag-works/architecture-and-components/)
* Get started with [installation](/documentation/installation/overview)
{{< /note >}}
22.769231
88
0.712838
eng_Latn
0.628995
985042a96ef4f27e74a4a00a57579b0ac19117c4
49,779
markdown
Markdown
_posts/2007-01-22-distribution-of-network-communications-based-on-server-power-consumption.markdown
api-evangelist/patents-2007
da723589b6977a05c0119d5476325327da6c5a5c
[ "Apache-2.0" ]
1
2017-11-15T11:20:53.000Z
2017-11-15T11:20:53.000Z
_posts/2007-01-22-distribution-of-network-communications-based-on-server-power-consumption.markdown
api-evangelist/patents-2007
da723589b6977a05c0119d5476325327da6c5a5c
[ "Apache-2.0" ]
null
null
null
_posts/2007-01-22-distribution-of-network-communications-based-on-server-power-consumption.markdown
api-evangelist/patents-2007
da723589b6977a05c0119d5476325327da6c5a5c
[ "Apache-2.0" ]
2
2019-10-31T13:03:32.000Z
2020-08-13T12:57:02.000Z
---
title: Distribution of network communications based on server power consumption
abstract: A network device is described that load-balances network traffic among a set of network servers based on electrical power consumption of the network servers. The network device may measure electrical power consumption in a variety of ways, and may generate and maintain a power consumption profile for each of the network servers. The power consumption profile may describe the respective server power consumption in increasing granularity. For instance, each power consumption profile may specify electrical power consumption according to watts consumed by a server per average transaction, watts consumed per transaction for a specific type of software application, watts consumed per transaction for a software application for individual network resources, and so on. Furthermore, the profiles may be maintained for individual servers or aggregated for groups or sequences of servers.
url: http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO2&Sect2=HITOFF&p=1&u=%2Fnetahtml%2FPTO%2Fsearch-adv.htm&r=1&f=G&l=50&d=PALL&S1=07844839&OS=07844839&RS=07844839
owner: Juniper Networks, Inc.
number: 07844839
owner_city: Sunnyvale
owner_country: US
publication_date: 20070122
---

This application claims the benefit of U.S. Provisional Application No. 60/868,970, filed Dec. 7, 2006, the entire content of which is incorporated herein by reference.

A data center is a specialized facility that houses network servers and provides data services. In its most simple form, a data center may consist of a single facility that hosts all of the infrastructure equipment. A more sophisticated data center can be an organization spread throughout the world with subscriber support equipment located in various physical hosting facilities.
Data centers allow enterprises to provide a number of different types of services, including e-commerce services to customers, extranets and secure VPNs to employees and customers, firewall protection and Network Address Translation (NAT) services, web caching, as well as many others. These services can all be provided at an off-site facility in the data center without requiring the enterprise to maintain the facility itself.

Each of the network servers within a data center requires electrical power to operate. Moreover, a data center that houses network servers must include a heating, ventilating, and air conditioning (HVAC) system to regulate the temperature of the network servers. The HVAC system also requires electrical power to operate. Typically, a regional electricity grid supplies this electrical power. However, under certain circumstances the data center may be unable to receive sufficient electrical power from this electricity grid. For example, the data center's connection to the electricity grid may be severed, or brownout conditions may occur. To avoid a shutdown of the data center under such circumstances, the data center may include one or more local electrical generators that supply electrical power to the data center when the data center is unable to receive sufficient electrical power from the electricity grid.

Local electrical generators may be expensive to purchase, install, and operate. Minimization or optimization of the power consumption of the network servers may reduce the costs associated with the purchase, installation, and operation of such electrical generators. Furthermore, minimization or optimization of the power consumption of the network servers may reduce the costs associated with the purchase of electrical power from the electricity grid.
In general, the invention is directed to techniques of load-balancing network traffic among a set of network servers based on electrical power consumption of the network servers. Example embodiments of the invention may define rates of electrical power consumption in a plurality of ways. For instance, the embodiments may define rates of electrical power consumption according to watts consumed by a server per average transaction, watts consumed by a server per transaction for a specific type of software application, watts consumed by a server per request to a specific network resource (e.g., by way of a particular universal resource locator (URL)), and so on. Furthermore, embodiments of the invention may define rates of electrical power consumption for groups or sequences of servers. The techniques make use of one or more intermediate load balancers, which are network devices that distribute network traffic among the plurality of network servers.

By application of the techniques described herein, a system that distributes network traffic among a set of network servers based on rates of electrical power consumption may consume less overall power as compared to a similar system that distributes network traffic among network servers without regard to electrical power consumption. Because such a system may consume less electrical power overall, costs related to the purchase and/or generation of electrical power may be reduced. Furthermore, the techniques described herein may allow an administrator to more easily anticipate and manage rates of power consumption.

In one embodiment, a method comprises receiving, with an intermediate network device, a network communication from a computer network. The intermediate network device is located between a client device and a plurality of servers. The method also comprises determining a network application associated with the network communication.
In addition, the method comprises identifying a set of the plurality of servers able to process the network communication. The method further comprises selecting a server in the set of servers as a function of a power consumption rate for each of the servers, wherein the power consumption rate specifies an amount of power consumed by the respective server when processing network communications associated with the network application. Furthermore, the method comprises forwarding the network communication to the selected server.

In another embodiment, a network device is located on a computer network between a client device and a plurality of servers. The network device comprises an interface to receive a network communication from the computer network. The network device also comprises a server selection module to determine a network application associated with the network communication, to identify a set of the plurality of servers able to process the network communication, to select a server in the set of servers as a function of a power consumption rate for each of the servers, wherein the power consumption rate specifies an amount of power consumed by the respective server when processing network communications associated with the network application, and to forward the network communication to the selected server.

In another embodiment, a system comprises a client device, a plurality of network servers, a network load balancing device, and a network to facilitate communication between the servers and the network load balancing device. The network device comprises an interface to receive a network communication from the client device.
The network device also comprises a server selection module to determine a network application associated with the network communication, to identify a set of the plurality of servers able to process the network communication, to select a server in the set of servers as a function of a power consumption rate for each of the servers, wherein the power consumption rate specifies an amount of power consumed by the respective server when processing network communications associated with the network application, and to forward the network communication to the selected server.

In another embodiment, a computer-readable medium comprises instructions. The instructions cause a programmable processor to receive, with an intermediate network device, a network communication from a computer network. The intermediate network device is located between a client device and a plurality of servers. The medium also includes instructions that cause the processor to determine a network application associated with the network communication and to identify a set of the plurality of servers able to process the network communication. In addition, the instructions cause the processor to select a server in the set of network servers as a function of a power consumption rate for each of the servers. The power consumption rate specifies an amount of power consumed by the respective server when processing network communications associated with the network application. The instructions also cause the processor to forward the network communication to the selected server.

The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.
Client devices A through N (collectively, client devices) may be personal computers, intermediate network devices, network telephones, network gaming platforms, cellular telephones, network televisions, television set-top boxes, and so on. Moreover, each of client devices may be a different kind of network device. For instance, client device A may be a personal computer while client device B may be an intermediate network device.

Servers A through N (collectively, servers) may be any of several types of server that provide one or more resources or network services. For example, servers may be web servers that provide web pages, media servers that provide streaming audio and/or video content, file servers that provide one or more files, application servers that provide an application, and so on. Furthermore, servers may be logically grouped so as to constitute a server farm or group of servers.

To access resources provided by servers, client devices may send network communications through a public network to load balancer, which typically may be located behind a firewall and within a local network. Public network may be a wide area network such as the Internet. In other exemplary embodiments, a private network, such as a local area network, a wireless network, or another type of network, may be substituted for public network. Furthermore, client devices may send the network communications through a virtual private network over public network to load balancer. When load balancer receives a communication from one of client devices, load balancer forwards the network communication to one of servers via a private network, based at least in part on the relative power consumption rates of servers. Local network may be a local area network or otherwise. For example, local network may be an Ethernet network that facilitates communication among network devices in a data center.
Alternatively, load balancer may receive a Domain Name System (DNS) request to resolve a domain name of a resource provided by servers. In response to the DNS request, load balancer may send a DNS response that lists network addresses of servers sorted in an order based on the relative power consumption rates of servers. While the remainder of this specification describes techniques of forwarding and processing network communications, the techniques described herein are generally applicable to this DNS-based approach.

Prior to distributing network traffic from client devices among servers, load balancer may obtain corresponding power consumption rates for servers or for logical groups of servers. Power consumption rates may be defined and obtained in a variety of ways. Accordingly, load balancer may apply different techniques to obtain such power consumption rates for servers. As one example, load balancer may define power consumption rates for servers as the average amounts of power each of the servers consumes over a defined period (e.g., per hour) when operating normally. In this case, load balancer may compute the power consumption rates by first sending requests to servers for listings of hardware components in respective ones of servers. For example, these requests may be Simple Network Management Protocol (SNMP) requests to Management Information Bases (MIBs) on servers. After receiving a listing of hardware components for one of servers, load balancer may automatically access one or more data sources, such as vendors' web sites or databases, to obtain power consumption rates for the listed components. Load balancer may, for example, accomplish this by way of an automated invocation of an application programming interface (API) presented by a vendor's data source. Load balancer may then compute an overall power consumption rate for each of the servers by summing the power consumption rates of the constituent components of each of the servers.
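The component-summing approach just described amounts to a simple aggregation. As a minimal sketch (the inventory layout and wattage figures are invented, and the SNMP/vendor-API retrieval steps are stubbed out with plain dicts):

```python
# Sketch of the component-summing approach described above. The inventory
# and vendor ratings are stubbed; a real system would query each server's
# MIB over SNMP and a vendor data source for per-component wattage.

VENDOR_WATTS = {        # assumed vendor-published ratings, watts per hour
    "cpu-x100": 65.0,
    "disk-d20": 8.0,
    "nic-n1": 4.5,
}

def server_power_rate(component_list):
    """Overall rate = sum of the rates of the server's components."""
    return sum(VENDOR_WATTS[part] for part in component_list)

inventory = {"server-a": ["cpu-x100", "disk-d20", "nic-n1"],
             "server-b": ["cpu-x100", "cpu-x100", "disk-d20"]}
rates = {name: server_power_rate(parts) for name, parts in inventory.items()}
```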
In another example, power consumption rates for servers may be defined in terms of the power consumed when processing differing types of network communications likely to be received from client devices. That is, the power consumption rates may be defined with respect to the different types of communication protocols. For instance, servers may provide one or more web sites (e.g., by hosting one or more web servers, and optionally application servers and database servers that support the web sites). Accordingly, network communications from client devices are likely to include HyperText Transfer Protocol (HTTP) communications, such as HTTP GET requests for web pages provided by servers. In this example, load balancer may perform techniques to quantify specific power consumption rates for each of servers when each of servers processes particular types of HTTP communications, such as HTTP GET requests.

To quantify power consumption rates for each of servers with this increased level of granularity, load balancer may send probe HTTP GET requests to each of servers and obtain measurements of the rates of power consumption while the servers process the requests. As one example, load balancer may obtain measurements of the rates of power consumption by receiving power measurements from electrical sensors on servers and correlating the power measurements with time periods during which the servers were processing the requests. As another example, load balancer may obtain power consumption rates for each of servers with respect to particular individual network resources, e.g., individual resources identified by universal resource locators (URLs). Similarly, load balancer may obtain power consumption rates for each of servers with respect to individual network applications executing on servers. In this way, load balancer may store multiple power consumption rates with a variety of granularities for each of servers, e.g.,
server-level granularity, application-level granularity, and resource-level granularity. After obtaining power consumption rates for servers, load balancer may categorize servers into classes based on the power consumption rates of servers. Each of the classes may be associated with a different rate of power consumption and/or a different level of power consumption granularity. In this way, load balancer may construct and maintain a power consumption profile for each of servers or group of servers. For example, a first class may be associated with a power consumption rate of 40 watts per hour and a second class may be associated with a power consumption rate of 60 watts per hour. Additional classes may be used to categorize servers based on rates of power consumption when the servers process requests for specific types of network applications (e.g., HTTP, FTP, NAT, DNS, encryption/decryption, etc.). Furthermore, classes may be used to categorize servers based on rates of power consumption when servers process requests for specific network resources or services provided by specific types of network applications (e.g., HTTP in combination with a particular URL, or a DNS request for a particular network prefix). Subsequently, load balancer may use these classes when distributing network traffic from client devices among servers. To distribute network traffic from client devices among servers, load balancer may first, for each network communication, identify the type of the network communication received from the client device. Load balancer may then identify and select one of servers based on the power consumption profile of the server. For example, when load balancer receives a network communication, load balancer may identify a class of servers associated with the lowest rate of power consumption when processing that particular type of network communication.
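The categorization into power-consumption classes can be sketched as below. The class boundary of 50 watts is an assumption chosen for illustration (the specification's later example uses the same figure); real boundaries would be configured by an administrator.

```python
def classify_servers(power_rates, boundaries=(50.0,)):
    """Group server IDs into classes by their power consumption rates.

    power_rates: server_id -> watts; boundaries: ascending class cut-offs.
    Returns class index -> list of server IDs (class 0 is lowest power)."""
    classes = {i: [] for i in range(len(boundaries) + 1)}
    for server_id, rate in power_rates.items():
        cls = sum(rate >= b for b in boundaries)  # count of boundaries reached
        classes[cls].append(server_id)
    return classes

classes = classify_servers({"A": 40.0, "B": 60.0, "C": 45.0})
# class 0 (< 50 W): ["A", "C"]; class 1 (>= 50 W): ["B"]
```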
Load balancer may further narrow the selection to a reduced class of servers by identifying individual resources or services being requested. Load balancer may then forward the network communication to a server in this class of servers unless load balancer determines that a current utilization rate of the server is above a threshold for the class of the server. A utilization rate may, for example, represent the percentage of resources (e.g., processor capacity, random access memory usage, bandwidth usage, etc.) currently being utilized by a server. For example, when a processor of the server is operating at or above a certain threshold of capacity (e.g., 90%), load balancer may consider a current utilization rate of the server to be above the utilization threshold set for the class of the server. When load balancer determines that current utilization rates of servers in this class are above a certain class utilization threshold, load balancer may identify a class of servers associated with a rate of power consumption that is higher than that of the previous class of servers but lower than rates of power consumption associated with other classes of servers. If load balancer determines that utilization rates of servers in this class are above a utilization threshold for this class, load balancer may identify a class of servers associated with a next lowest rate of power consumption after this class, and so on. Eventually, load balancer may identify a class of servers whose servers have current utilization rates that are not above a utilization threshold for the class and whose servers consume less electrical power to process network communications of the identified type of the network communication than other ones of the servers whose current utilization rates are not above their utilization thresholds for their respective classes. Load balancer may then select one of the servers in the identified class.
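The class-escalation selection described above can be sketched as follows: start with the lowest-power class and move to the next-higher-power class while the current class is over its utilization threshold. The 0.9 threshold and per-server eligibility test are assumptions for illustration.

```python
def select_class(classes_by_power, utilization, threshold=0.9):
    """Return eligible servers from the lowest-power class that is not saturated.

    classes_by_power: lists of server IDs, ordered lowest to highest power.
    utilization: server_id -> current utilization in [0.0, 1.0]."""
    for cls in classes_by_power:
        eligible = [s for s in cls if utilization[s] < threshold]
        if eligible:
            return eligible
    return classes_by_power[-1]  # all classes saturated; fall back to last

servers = select_class([["A", "C"], ["B"]], {"A": 0.95, "C": 0.4, "B": 0.2})
# → ["C"]: class 0 still has a server below the threshold, so no escalation
```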
In addition to a power consumption classification, load balancer may also use a current performance characteristic of servers as a factor in identifying a class of servers. A performance characteristic is a measure of how quickly an individual server generates a response after receiving a request. For example, a performance characteristic may be computed based on response times of the servers (e.g., response times to requests, dropped packets, counts of packets already forwarded to the servers, etc.). Servers in a class associated with low power consumption rates may respond to requests more slowly than servers in a class associated with higher power consumption rates. To optimize or otherwise improve both power consumption and the performance characteristic, load balancer may apply a static or dynamic weighting scheme. As one example, in identifying a class of servers, load balancer may use a weighting scheme in which responsiveness is always weighted twice as heavily as power consumption. In an alternative example, load balancer may use a weighting scheme in which a weight associated with responsiveness rises or falls as overall power consumption of servers decreases or increases. In this way, load balancer may identify a class of servers in which overall power consumption rates may be managed while still utilizing ones of servers that are most responsive. Load balancer may automatically issue communications to instruct servers in particular classes to enter a low power consumption mode (i.e., a sleep mode or a hibernation mode). When one of servers is in hibernation mode, the server consumes a minimal amount of power. For example, if server N is in a class of servers associated with relatively high power consumption rates and the overall number of requests to servers is low, servers in classes associated with lower power consumption rates may be able to process all of the requests. In this situation, load balancer may instruct server N to enter hibernation mode.
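The static weighting scheme mentioned above can be sketched as a weighted score. The 2:1 ratio follows the example in the text; normalizing both inputs to a 0–1 range is an assumption made so the weights are comparable.

```python
def combined_score(response_time_norm, power_rate_norm, w_resp=2.0, w_power=1.0):
    """Lower is better: weighted sum of normalized response time and power."""
    return w_resp * response_time_norm + w_power * power_rate_norm

# Server X: fast but power-hungry; server Y: slow but frugal.
score_x = combined_score(0.2, 0.9)   # 2*0.2 + 0.9 = 1.3
score_y = combined_score(0.6, 0.3)   # 2*0.6 + 0.3 = 1.5
# X is preferred despite higher power, because responsiveness is weighted 2x.
```

A dynamic scheme would simply recompute `w_resp` from the fleet's current overall power consumption before scoring.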
Later, when the overall number of requests to servers is higher or the utilization of the servers in lower power consumption classes reaches defined thresholds, load balancer may automatically instruct server N to exit hibernation mode in order to be ready to process requests. Each of servers may operate a plurality of virtual machines that provide operating environments for one or more software applications. For example, each of servers may operate production, test, and development virtual machines that provide operating environments for production, test, and development applications. Load balancer may distribute requests among the virtual machines as though the virtual machines were separate sets of physical machines, according to the techniques described herein. Furthermore, load balancer may redistribute the virtual machines among servers to reduce the overall power consumption rate of the servers. For example, load balancer may track the utilization rates and power consumption rates of each of servers. Each of the virtual machines may be associated with a priority. For example, the production virtual machine may have a higher priority than the development virtual machine. Lower priority virtual machines may be virtual machines that handle fewer requests than higher priority virtual machines. When a utilization rate of one of servers approaches a given threshold, load balancer may send instructions to the server to terminate a lower priority virtual machine or otherwise limit the resources available to the lower priority virtual machine. This may make more resources available to the higher priority virtual machine. In addition, load balancer may send instructions to a second one of servers associated with a higher power consumption rate to increase the resources available to a virtual machine that is equivalent to the lower priority virtual machine on the second server. In this way, load balancer may reduce the overall power consumption of servers.
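The priority-based virtual machine shedding described above can be sketched as below. The 0.85 utilization threshold and the numeric priority encoding are assumptions for illustration.

```python
def shed_low_priority_vm(vms, utilization, threshold=0.85):
    """If a host nears its utilization threshold, pick its lowest-priority VM
    to terminate or migrate. vms: list of (name, priority); higher number
    means higher priority. Returns the VM name to shed, or None."""
    if utilization < threshold:
        return None
    return min(vms, key=lambda vm: vm[1])[0]

victim = shed_low_priority_vm([("production", 2), ("development", 1)], 0.9)
# → "development": the lower-priority VM is shed first, freeing resources
#   for the production VM on this host
```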
Load balancer includes a server power estimation module (SPEM) to estimate power consumption rates of servers based on power consumption rates of hardware components included in servers. To estimate how much power each of servers consumes, SPEM may initially send an SNMP request to each of servers for catalogs of hardware components associated with respective ones of servers. In response to such an SNMP request, one of servers may send an SNMP response to SPEM to provide a listing of the hardware components of the server. For example, server A may send a response that provides a listing that specifies that server A includes an Intel Pentium processor, a hard disk drive from manufacturer X, and a hard disk drive from manufacturer Y. Continuing this example, server N may send a response that provides a listing that specifies that server N includes an Advanced Micro Devices Athlon processor and a hard disk drive from Seagate. Records of the hardware components of a server may be stored in a Management Information Base in the server, a configuration management database, or a central asset tracking system. After receiving the listings of hardware components associated with servers, SPEM may determine a power consumption rate associated with each of the listed hardware components. To determine a power consumption rate for a hardware component, SPEM may access one or more data sources that store a power consumption rate for the hardware component. Such data sources may be available from the vendor (e.g., the manufacturer, retailer, etc.) of the hardware component. For example, a power consumption rate may be posted on a manufacturer's web site. Alternatively, load balancer may include such databases. After determining a power consumption rate associated with each of the hardware components in the listing for one of servers, SPEM may sum the power consumption rates associated with the listed hardware components to derive an overall power consumption rate for the server.
Once SPEM derives an overall power consumption rate for the server, SPEM may store the overall power consumption rate for the server in server power database. In an alternative example, SPEM invokes scripts on servers to determine the power consumption rates associated with each of servers. For example, SPEM may use a technique such as ActiveX or Java to invoke scripts or other programs that execute locally on servers. A script executing on one of servers may generate a listing of hardware components of the server and send this listing to SPEM. SPEM may then use a data source to determine a power consumption rate associated with each of the listed components. Alternatively, the script may access the data source directly and send to SPEM an overall power consumption rate associated with the server. SPEM may also determine an aggregate power consumption rate associated with groups of servers. For example, servers A and B may collaborate to provide a resource. In this case, SPEM may receive listings of hardware components from servers A and B and obtain power consumption rates associated with the components of servers A and B. SPEM may then add together all of the power consumption rates of components of servers A and B to obtain an overall power consumption rate for the combination of servers A and B. In this way, SPEM may obtain power consumption rates at varying degrees of granularity. Administrator may define which ones of servers SPEM regards as groups. Furthermore, SPEM may identify classes of servers based on power consumption rates of servers. For example, SPEM may categorize all servers having power consumption rates below 50 watts in a first class and all servers having power consumption rates of 50 watts or higher in a second class. SPEM may record these classes in server power database.
Load balancer may also include a request power estimation module (RPEM) to estimate power consumption rates of servers when processing one or more types of network communications (referred to generally as "requests"). In order to estimate power consumption rates of servers when processing a particular type of request, RPEM may first access a request power database. Request power database contains records, each of which associates a type of request with one of servers and a power consumption rate. For example, request power database may contain a record that indicates that server A consumes 50 watts to process a request for streaming media. If RPEM determines that request power database does not contain a record that associates the type of request with one of servers, RPEM may generate a probe communication that includes a request of this type. RPEM may then forward this communication to the server. At the same time, RPEM may monitor the average power consumption of the server while the server is processing the communication. RPEM may use one or more SNMP requests to the server to monitor the average power consumption of the server while the server is processing the communication. Alternatively, RPEM may receive power consumption information from a baseboard management controller (BMC) that monitors sensors in the server. The sensors may include temperature sensors, analog-to-digital converters to measure current and voltage on various power supplies in the server, and so on. In either case, RPEM may determine a power consumption rate by correlating the received power consumption information with the specific period between when the probe communication was sent and when a response was received. After determining a power consumption rate for the server for the type of request, RPEM may update request power database to include a record that associates the request type, server, and measured power consumption rate.
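The probe-then-record loop described above can be sketched as a memoizing lookup. The `measure_fn` callable is an assumption standing in for the SNMP- or BMC-based measurement; the 50-watt figure follows the streaming-media example in the text.

```python
def ensure_request_power(db, server, request_type, measure_fn):
    """Return the recorded rate for (request_type, server), probing only
    when no record exists yet. db: dict keyed by (request_type, server)."""
    key = (request_type, server)
    if key not in db:
        db[key] = measure_fn(server, request_type)  # send probe, monitor power
    return db[key]

db = {}
rate = ensure_request_power(db, "A", "streaming-media", lambda s, t: 50.0)
# db now records that server A consumes 50 W to process streaming-media
# requests; subsequent lookups reuse the record instead of re-probing.
```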
RPEM may repeat this process until request power database includes a record for each server for each request type. Furthermore, RPEM may identify classes of servers for various types of requests. For example, RPEM may identify records in request power database having a common request type. RPEM may then identify classes of servers by determining which of the records have approximately equivalent power consumption rates. For example, RPEM may order all of the identified records by power consumption rating. RPEM may then identify the first third of the servers specified by the ordered identified records as being in a first class, the second third of servers specified by the ordered identified records as being in a second class, and so on. RPEM may repeat this process for each type of request specified in the records of request power database. In this way, RPEM may identify classes of servers for each request type. When interface receives a network communication from public network, interface forwards the communication to a server selection module (SSM). Upon receiving the network communication from interface, SSM may access power consumption profiles for servers and determine a rate of electrical power consumption for the servers using the power consumption profiles of the servers. SSM may then identify one of servers based at least in part on power consumption rates of servers. After identifying one of servers, SSM provides the network communication to interface. Interface may then forward the communication to the one of servers identified by SSM. As illustrated in the example, SSM includes a server classification module. Server classification module uses estimates of power consumption rates of servers, based on power consumption rates of hardware components included in servers, to identify one of servers that would consume the least power to process the communication given the utilization of the servers.
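The classing-by-thirds step described above can be sketched as below; the server IDs and wattages are illustrative.

```python
def classify_by_thirds(records):
    """Order (server_id, watts) records for one request type by power
    rating, then split the ordered list into three classes."""
    ordered = sorted(records, key=lambda r: r[1])
    third = len(ordered) // 3 or 1
    return ordered[:third], ordered[third:2 * third], ordered[2 * third:]

low, mid, high = classify_by_thirds(
    [("A", 40), ("B", 70), ("C", 45), ("D", 55), ("E", 90), ("F", 60)])
# low = [("A", 40), ("C", 45)] — the lowest-power third for this request type
```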
To identify one of servers that would most efficiently process the communication, server classification module may access server power database to obtain classifications of servers. Server classification module may then access a server utilization module. Server utilization module provides current utilization rates of servers. Based on the records from server power database and the current utilization rates from server utilization module, server classification module may identify a class of servers that is not over-utilized and that consumes the least power. Server classification module may determine a class of servers is over-utilized when the average utilization rate of the servers in the class exceeds a given utilization threshold. For example, a class of servers may be over-utilized when the average server in the class is operating at 85% of capacity. Server classification module identifies one of servers that is not over-utilized and that is capable of processing the communication with a combined level of power consumption and response time to the network communication that is less than the combined levels of power consumption to process the communication and response time to the network communication of other ones of the servers that are not over-utilized. For example, server classification module may calculate a combined level of power consumption and response time for a server by multiplying a first weight by a performance characteristic for the server, plus a second weight multiplied by a power consumption rate for the server. Server classification module may then compare combined levels of power consumption and response time for each of the servers that is not over-utilized to identify one of servers that has the lowest overall combined level of power consumption and response time. SSM also includes a request classification module.
Request classification module identifies a type of request associated with a received communication and identifies one of servers that would most efficiently process a member of the identified type of request given current utilization rates of servers. After request classification module has identified a type of request for the network communication, request classification module may obtain records from request power database for the identified type of request. Request classification module may then access server utilization module to obtain current server utilization data. Then, using the current server utilization data, request classification module may identify a class of servers that are not over-utilized and that are associated with the lowest power consumption ratings to process requests of the identified type. Request classification module may then forward the communication to one of servers in the identified class of servers. Load balancer also includes a hibernation module. Hibernation module sends instructions to one of servers to enter a low power consumption mode (e.g., a hibernation mode or a standby mode) when the server consumes more power than other ones of servers and when the server is unlikely to be utilized for a given period of time. For example, server N may be in a class of servers that is associated with high power consumption rates. During times when utilization of servers is relatively low (e.g., on nights or weekends) and ones of servers in classes associated with lower power consumption rates are sufficient to process all requests to servers, hibernation module may send instructions to server N to enter a hibernation mode. To determine which ones of servers are in which classes, and to determine the utilization of servers, hibernation module may access server power database and server utilization module. When server N receives the instructions to enter a hibernation mode, server N may store the content of its volatile memory, e.g.,
random access memory (RAM), to a persistent storage medium and shut down disk drives, cooling systems, and other systems of server N that consume power. However, server N may continue to operate a process that waits for instructions to reenter a normal power consumption mode. Subsequently, when utilization of servers is higher and ones of servers in classes associated with lower power consumption are not sufficient to process all requests to servers, hibernation module may send instructions to server N to reenter a normal power consumption mode. For example, hibernation module may send instructions to server N to reenter a normal power consumption mode when ones of servers in a class associated with lower power consumption ratings are operating at 75% of capacity. In another example, hibernation module may send instructions to server N to reenter a normal power consumption mode when hibernation module determines that it is a time of day associated with high utilization. Upon receiving the instructions to reenter a normal power consumption mode, server N may restore its volatile memory from the persistent storage medium, power up disk drives and cooling systems, and so on. When server N reenters the normal power consumption mode, server N may begin processing requests. Load balancer may also include a virtual machine configuration module (VMCM) to redistribute the virtual machines among servers to reduce overall power consumption of servers. For example, VMCM may maintain records of which virtual machines are operating on which ones of servers. For instance, VMCM may maintain records indicating that server A is operating a development virtual machine and a production virtual machine. The development virtual machine may provide an operating environment for applications that are currently under development within an enterprise. The production virtual machine may provide an operating environment for applications that are currently used by clients of the enterprise.
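The hibernation module's wake/sleep policy can be sketched as a simple decision function. The 0.75 wake threshold follows the example in the text; using the same threshold for the sleep decision is an assumption made to keep the sketch small (a real policy might add hysteresis or time-of-day rules).

```python
def hibernation_action(low_class_utilization, high_power_server_asleep,
                       wake_threshold=0.75):
    """Return 'wake', 'sleep', or None for a high-power server, based on the
    utilization of the lower-power classes."""
    if low_class_utilization >= wake_threshold and high_power_server_asleep:
        return "wake"    # low-power classes saturated: bring capacity back
    if low_class_utilization < wake_threshold and not high_power_server_asleep:
        return "sleep"   # low-power classes suffice: hibernate server N
    return None

hibernation_action(0.8, True)    # → "wake"
hibernation_action(0.3, False)   # → "sleep"
```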
For example, the production virtual machine may provide an operating environment for a web server that provides an e-commerce web page to clients. Because it may be more important to provide clients with a minimum response time than to provide developers with access to development applications, the production virtual machine may have higher priority than the development virtual machine. VMCM may access server utilization module to obtain utilization rates for each of servers and access server power database to obtain power consumption rates for each of servers. VMCM may then use the utilization rates and power consumption rates to determine whether to redistribute virtual machines among servers such that response times to the clients are the shortest and the overall power consumption of servers is the lowest. For example, administrator may define thresholds for network traffic, server load, and server response times. If any of these thresholds are surpassed on one of servers, VMCM may instruct a lower priority virtual machine operating on the server to cease operating or to transfer operations to another one of servers. For example, VMCM may send a request to a virtual server management platform, such as IBM WebSphere, to terminate the lower priority virtual machine. In another example, VMCM may send the request directly to a virtualization software layer executing on the server. By terminating the lower priority virtual machine on the server, the server may have more resources to devote to requests for the higher priority virtual machine. This may result in a decrease in overall power consumption because servers associated with higher power consumption ratings are less likely to process requests associated with the higher priority virtual machine.
If server classification module determines that the utilization rate of the servers in the current class does not exceed the threshold ("NO"), server classification module forwards the network communication to one of the servers in the current class of servers. On the other hand, if server classification module determines that the utilization rate of the servers in the current class exceeds the utilization threshold ("YES"), server classification module determines whether there are any remaining classes of servers associated with power consumption rates immediately higher than the power consumption rates associated with servers in the current class and capable of servicing the received request. If server classification module determines that there is not a remaining class of servers associated with next higher power consumption rates that is capable of servicing the received request ("NO"), server classification module may forward the network communication to one of the servers in the current class of servers. However, if server classification module determines that there is a remaining class of servers associated with a next higher rate of power consumption ("YES"), server classification module may set the label "current class" to indicate this class of servers. Server classification module may then determine whether the usage of the servers in the new current class exceeds a utilization threshold, and so on. Request classification module may perform a similar operation as that described in this example. However, request classification module may first identify a type of request included in the network communication and then access request power database to retrieve classes of servers for the identified type of request. Each entry in a second column of table represents a general rate of power consumption for a server. For instance, a server in table with server ID 1 has a general rate of power consumption of 0.1 watts per time period.
Entries in the second column of table may represent a first level of granularity that specifies an average power consumption rate for a server. Each entry in the third, fourth, and fifth columns of table represents a rate of power consumption of a server when the server processes a network communication associated with a particular network application. For instance, each entry in the third column represents a rate of power consumption for a server when the server processes a network communication associated with an HTTP application. Similarly, each entry in the fourth and fifth columns represents a rate of power consumption of a server when the server processes network communications associated with the FTP and DNS applications, respectively. For example, a server in table with server ID 1 has a rate of power consumption of 0.11 watts per time period when processing network communications associated with the HTTP application. Because entries in the third, fourth, and fifth columns of table represent rates of power consumption for a server when the server is processing a network communication associated with a network application, entries in these columns describe rates of power consumption at a lower level of granularity than entries in the second column of table. Thus, the third, fourth, and fifth columns may represent a second level of granularity that specifies power consumption rates for the server with respect to a plurality of different types of network applications. Each entry in the sixth, seventh, and eighth columns of table represents a rate of power consumption of a server when the server is processing a network communication for a request for a resource associated with a particular network application. For instance, each entry in the sixth column represents a rate of power consumption of a server when the server processes a network communication for a resource named File 1, which is associated with the HTTP service.
As illustrated in the example, the server having server ID 1 has a rate of power consumption of 0.13 when the server is processing a network communication for the resource File 1, which is associated with the HTTP network application. Although not shown, table may include columns for resources that include pre-generated files, dynamically generated files, streaming media, responses to network protocol requests (e.g., a DNS response), and so on. Because entries in the sixth, seventh, and eighth columns of table represent rates of power consumption for a server when the server is processing a network communication for a specific resource of a particular network application, entries in these columns describe rates at a lower level of granularity than entries in the third, fourth, and fifth columns. Thus, the sixth, seventh, and eighth columns may represent a third level of granularity that specifies power consumption rates for a server with respect to a plurality of individual network resources for different types of network applications. In system, client devices A through N (collectively, client devices) send requests to load balancer. Load balancer distributes the requests among servers A through N (collectively, servers). To process the requests, servers may send requests to access a resource provided by a second set of servers A through N. A second-level load balancer distributes the requests from the first set of servers among the second set. Furthermore, to process those requests, the second set of servers may send requests to access a resource provided by a third set of servers A through N. A third-level load balancer distributes the requests from the second set of servers among the third set. Because one server from each of the three sets may process a request, the power consumed to process the request may be the sum of the power consumed to process the request by each of those servers.
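The three-level granularity table described above can be reconstructed as a lookup structure. The 0.10/0.11/0.13 values follow the examples in the text; the missing FTP/DNS and resource entries are assumed placeholders, and the fallback behavior from finer to coarser granularity is an assumption about how such a table would be consulted.

```python
# server_id -> three levels of granularity: general, per-application,
# per-(application, resource). None marks entries not given in the text.
POWER_TABLE = {
    1: {
        "general": 0.10,   # watts per time period
        "app": {"HTTP": 0.11, "FTP": None, "DNS": None},
        "resource": {("HTTP", "File 1"): 0.13},
    },
}

def lookup_rate(table, server_id, app=None, resource=None):
    """Return the most specific rate available, falling back to coarser levels."""
    entry = table[server_id]
    if app and resource and (app, resource) in entry["resource"]:
        return entry["resource"][(app, resource)]
    if app and entry["app"].get(app) is not None:
        return entry["app"][app]
    return entry["general"]

lookup_rate(POWER_TABLE, 1, "HTTP", "File 1")   # → 0.13 (finest granularity)
lookup_rate(POWER_TABLE, 1, "HTTP")             # → 0.11
lookup_rate(POWER_TABLE, 1)                     # → 0.10
```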
However, because second-level load balancer distributes requests among the second set of servers, load balancer may not be able to accurately predict which server in the second set processes a request. Similarly, because third-level load balancer distributes requests among the third set of servers, load balancer may not be able to accurately predict which server in the third set processes the request. Consequently, load balancer may not be able to determine, in the manner described above, how much power is consumed to process a request. To estimate an amount of power consumed to process a request, a heuristic analysis module in load balancer may generate a probe communication that includes the request. Heuristic analysis module determines whether a sufficient number of communications of the request type have been sent to each server in the first set to determine how much power the servers in the first, second, and third sets likely consume to process the request. If heuristic analysis module determines that a sufficient number of communications of the request type have been sent to each server in the first set, heuristic analysis module may have completed an estimation of the amount of power the system is likely to consume when load balancer forwards network communications containing the request type to each of those servers. Subsequently, load balancer may use these estimates to distribute network traffic among servers. On the other hand, if heuristic analysis module determines that an insufficient number of communications of the request type have been sent to one server in the first set to determine how much power the servers are likely to consume if the request were sent to that server, heuristic analysis module forwards the request to that server. Heuristic analysis module then monitors the power consumption of that server and of each server in the second and third sets.
Heuristic analysis module may then calculate an overall power consumption rate for the request by adding the power consumption of the server in the first set, the server in the second set that processed the request, and the server in the third set that processed the request. Heuristic analysis module may then average this overall power consumption rate with other overall power rates obtained when heuristic analysis module forwarded the request to the same server in the first set. By repeating this process a sufficient number of times, heuristic analysis module may obtain a relatively accurate overall power consumption rate of the system when a request of the request type is forwarded to that server. Furthermore, by repeating this process a sufficient number of times on each server in the first set, heuristic analysis module may obtain relatively accurate overall power consumption rates of the system when requests of the request type are forwarded to each of those servers. Alternatively, heuristic analysis module may use a Bayesian algorithm to determine overall power consumption ratings of the system when requests are forwarded to each server in the first set. For example, heuristic analysis module may include an internal model in which the servers of each set are nodes in a Bayesian network. In this model, the servers of one set are the parent nodes of the servers of the next set. Heuristic analysis module may then send a probe communication to one server in the first set in order to discover the probability of a given server in the second set processing a request associated with the probe communication. Furthermore, heuristic analysis module may forward communications to one server in the second set in order to discover the probability of a given server in the third set processing a request associated with the request. By discovering such probabilities for each server, heuristic analysis module may estimate which servers in the second and third sets are likely to be active in responding to the request.
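The multi-tier estimate update described above can be sketched as a running average: sum the per-tier power observed for one probe, then fold that total into the existing estimate for the (request type, first-set server) pair. The per-tier wattages below are illustrative.

```python
def update_estimate(estimates, counts, key, tier_observations):
    """Fold one probe's observed per-tier power into a running average.

    estimates: key -> average total watts; counts: key -> samples so far;
    tier_observations: watts observed at each tier for this probe."""
    total = sum(tier_observations)
    n = counts.get(key, 0)
    estimates[key] = (estimates.get(key, 0.0) * n + total) / (n + 1)
    counts[key] = n + 1
    return estimates[key]

est, cnt = {}, {}
update_estimate(est, cnt, ("HTTP", "A"), [12.0, 8.0, 5.0])   # → 25.0
update_estimate(est, cnt, ("HTTP", "A"), [14.0, 9.0, 4.0])   # → 26.0 (avg of 25 and 27)
```

Repeating this for each first-set server yields the per-server system-wide estimates the load balancer then uses to forward traffic.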
Then by discovering average rates of power consumption of each of servers and heuristic analysis module may determine how much power is likely to be consumed by servers of system when heuristic analysis module forwards a network communication of a certain request type to each of servers . Load balancer may then use this information to forward network communications. If heuristic analysis module does not have sufficient confidence in one of the power consumption estimates of system when a network communication of the identified request type is forwarded to one of servers NO of heuristic analysis module selects the server . Heuristic analysis module may then generate a probe communication that includes a request of the request type and forwards the probe communication to the selected server . After forwarding the probe communication to the selected server heuristic analysis module monitors the power consumption of the selected server and each of servers and . Heuristic analysis module then sums the observed power consumption of the selected servers together with the observed power consumption of servers and to when processing the probe communication . Next heuristic analysis module may average this sum with an existing estimate of the power consumption of system when a network communication of the identified request type is forwarded to the selected server . This average is now the new estimate of the power consumption of system when a network communication of the identified request type is forwarded to the selected server. After averaging this sum with an existing estimate of the power consumption heuristic analysis module may increment a confidence indicator of the power consumption estimate . Next heuristic analysis module may again determine whether heuristic analysis module has sufficient confidence in power consumption estimates of system when heuristic analysis module sends probe communications of the identified request type to each of servers . 
If heuristic analysis module has sufficient confidence YES of testing may be complete . Various embodiments of the invention have been described. These and other embodiments are within the scope of the following claims.
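The probe-and-average loop described above — sum the observed per-tier power draw for a probe, fold it into a running estimate for that (request type, server) pair, and bump a confidence counter until the estimate is trusted — can be sketched as follows. This is an illustrative sketch only; the class, method, and field names are hypothetical, not from the patent text.

```python
class PowerEstimator:
    """Running per-server estimate of system power per request type."""

    def __init__(self, confidence_threshold=5):
        self.estimates = {}   # (request_type, server) -> average watts per request
        self.counts = {}      # (request_type, server) -> confidence indicator
        self.confidence_threshold = confidence_threshold

    def record_probe(self, request_type, server, tier_power_readings):
        # Sum the observed power of the selected server and every
        # downstream server that helped process the probe.
        total = sum(tier_power_readings)
        key = (request_type, server)
        n = self.counts.get(key, 0)
        old = self.estimates.get(key, 0.0)
        # Fold the new observation into the running average and
        # increment the confidence indicator for this estimate.
        self.estimates[key] = (old * n + total) / (n + 1)
        self.counts[key] = n + 1

    def is_confident(self, request_type, server):
        return self.counts.get((request_type, server), 0) >= self.confidence_threshold


est = PowerEstimator(confidence_threshold=2)
est.record_probe("login", "server-a", [40.0, 12.0, 8.0])   # 60 W observed in total
est.record_probe("login", "server-a", [44.0, 10.0, 10.0])  # 64 W observed in total
print(est.estimates[("login", "server-a")])   # running average: 62.0
print(est.is_confident("login", "server-a"))  # True
```

Once `is_confident` returns true for every server at a given level, the load balancer could stop probing and use the stored averages to steer traffic toward the lowest-power path for each request type.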
401.443548
1,704
0.831756
eng_Latn
0.999823
98508cd931fb3b65d5f736f4a4e40ff27c6878ee
57
md
Markdown
README.md
celelstine/assestManager
e259b3f8cb2ba6ee005578059201de0ba1f7cfbf
[ "MIT" ]
null
null
null
README.md
celelstine/assestManager
e259b3f8cb2ba6ee005578059201de0ba1f7cfbf
[ "MIT" ]
1
2018-05-15T13:59:56.000Z
2018-05-15T13:59:56.000Z
README.md
celelstine/assetManager
e259b3f8cb2ba6ee005578059201de0ba1f7cfbf
[ "MIT" ]
null
null
null
# assestManager

A platform to manage and monitor assets
19
40
0.824561
eng_Latn
0.996279
9850a32bfd3ec23857d445381e6f98e43165c992
18
md
Markdown
README.md
Jeandra/variaveis-php
e8572a97598da42102beef11d7093b7b9b907687
[ "MIT" ]
null
null
null
README.md
Jeandra/variaveis-php
e8572a97598da42102beef11d7093b7b9b907687
[ "MIT" ]
null
null
null
README.md
Jeandra/variaveis-php
e8572a97598da42102beef11d7093b7b9b907687
[ "MIT" ]
null
null
null
# variaveis-php
6
15
0.666667
vie_Latn
0.539232
985157462d34c213b8d3191331935327087bb92e
2,499
md
Markdown
docs/rules/no-restricted-resolver-tests.md
jaydgruber/eslint-plugin-ember
2029651e09134f4662267c0c75f8e5f7f1651c5f
[ "MIT" ]
null
null
null
docs/rules/no-restricted-resolver-tests.md
jaydgruber/eslint-plugin-ember
2029651e09134f4662267c0c75f8e5f7f1651c5f
[ "MIT" ]
null
null
null
docs/rules/no-restricted-resolver-tests.md
jaydgruber/eslint-plugin-ember
2029651e09134f4662267c0c75f8e5f7f1651c5f
[ "MIT" ]
null
null
null
# no-restricted-resolver-tests

Don't use constructs or configuration that use the restricted resolver in tests.

[RFC-0229](https://github.com/emberjs/rfcs/blob/master/text/0229-deprecate-testing-restricted-resolver.md) proposed to remove the concept of artificially restricting the resolver used under testing. This rule helps identify anti-patterns in tests that we want to migrate off.

## Examples

Examples of **incorrect** code for this rule: any use of the APIs listed below in which `integration: true` is not included in the specified options. This specifically includes specifying `unit: true`, `needs: []`, or specifying none of the "test type options" (the `unit`, `needs`, or `integration` options) to the following ember-qunit and ember-mocha APIs:

```js
// ember-qunit
moduleFor('service:session');
moduleFor('service:session', { unit: true });
moduleFor('service:session', { needs: ['type:thing'] });
moduleFor('service:session', 'arg2', ['etc'], {});

moduleForComponent('display-page');
moduleForComponent('display-page', { unit: true });
moduleForComponent('display-page', { needs: ['type:thing'] });
moduleForComponent('display-page', 'arg2', ['etc'], {});

moduleForModel('post');
moduleForModel('post', { unit: true });
moduleForModel('post', { needs: ['type:thing'] });
moduleForModel('post', 'arg2', ['etc'], {});
```

```js
// ember-mocha
setupTest('service:session');
setupTest('service:session', { unit: true });
setupTest('service:session', { needs: ['type:thing'] });
setupTest('service:session', 'arg2', ['etc'], {});

setupComponentTest('display-page');
setupComponentTest('display-page', { unit: true });
setupComponentTest('display-page', { needs: ['type:thing'] });
setupComponentTest('display-page', 'arg2', ['etc'], {});

setupModelTest('post');
setupModelTest('post', { unit: true });
setupModelTest('post', { needs: ['type:thing'] });
setupModelTest('post', 'arg2', ['etc'], {});
```

Examples of **correct** code for this rule:

```js
// ember-qunit
moduleFor('service:session', {
  integration: true
});

moduleForComponent('display-page', {
  integration: true
});

moduleForModel('post', {
  integration: true
});
```

```js
// ember-mocha
setupTest('service:session', {
  integration: true
});

setupComponentTest('display-page', {
  integration: true
});

setupModelTest('post', {
  integration: true
});
```

## Further Reading

If there are other links that describe the issue this rule addresses, please include them here in a bulleted list.
21.730435
296
0.692677
eng_Latn
0.665154
985175a4c02a1da783604edcaf95b40a6dd2746d
2,680
md
Markdown
tests/unit/test_data/en_tw-wa/en_tw/bible/kt/fulfill.md
linearcombination/DOC
4478e55ec81426c15a2c402cb838e76d79741c03
[ "MIT" ]
1
2022-01-10T21:03:26.000Z
2022-01-10T21:03:26.000Z
tests/unit/test_data/en_tw-wa/en_tw/bible/kt/fulfill.md
linearcombination/DOC
4478e55ec81426c15a2c402cb838e76d79741c03
[ "MIT" ]
1
2022-03-28T17:44:24.000Z
2022-03-28T17:44:24.000Z
tests/unit/test_data/en_tw-wa/en_tw/bible/kt/fulfill.md
linearcombination/DOC
4478e55ec81426c15a2c402cb838e76d79741c03
[ "MIT" ]
3
2022-01-14T02:55:44.000Z
2022-02-23T00:17:51.000Z
# fulfill ## Related Ideas: carry out, fill to the limit, finish, fulfillment, in full, make something full ## Definition: The term "fulfill" means to complete or accomplish something that was expected. * When a prophecy is fulfilled, it means that God causes to happen what was predicted in the prophecy. * If a person fulfills a promise or a vow, it means that he does what he has promised to do. * To fulfill a responsibility means to do the task that was assigned or required. ## Translation Suggestions: * Depending on the context, "fulfill" could be translated as "accomplish" or "complete" or "cause to happen" or "obey" or "perform." * The phrase "has been fulfilled" could also be translated as "has come true" or "has happened" or "has taken place." * Ways to translate "fulfill," as in "fulfill your ministry," could include "complete" or "perform" or "practice" or "serve other people as God has called you to do." (See also: [prophet](../kt/prophet.md), [Christ](../kt/christ.md), [minister](../kt/minister.md), [call](../kt/call.md)) ## Bible References: * [1 Kings 02:27](rc://en/tn/help/1ki/02/27) * [Acts 03:17-18](rc://en/tn/help/act/03/17) * [Leviticus 22:17-19](rc://en/tn/help/lev/22/17) * [Luke 04:21](rc://en/tn/help/luk/04/21) * [Matthew 01:22-23](rc://en/tn/help/mat/01/22) * [Matthew 05:17](rc://en/tn/help/mat/05/17) * [Psalms 116:12-15](rc://en/tn/help/psa/116/012) ## Examples from the Bible stories: * __[24:04](rc://en/tn/help/obs/24/04)__ John __fulfilled__ what the prophets said, "See I send my messenger ahead of you, who will prepare your way." * __[40:03](rc://en/tn/help/obs/40/03)__ The soldiers gambled for Jesus' clothing. When they did this, they __fulfilled__ a prophecy that said, "They divided my garments among them, and gambled for my clothing." * __[42:07](rc://en/tn/help/obs/42/07)__ Jesus said, "I told you that everything written about me in God's word must be __fulfilled__." 
* __[43:05](rc://en/tn/help/obs/43/05)__ "This __fulfills__ the prophecy made by the prophet Joel in which God said, 'In the last days, I will pour out my Spirit.'" * __[43:07](rc://en/tn/help/obs/43/07)__ "This __fulfills__ the prophecy which says, 'You will not let your Holy One rot in the grave.'" * __[44:05](rc://en/tn/help/obs/44/05)__ "Although you did not understand what you were doing, God used your actions to __fulfill__ the prophecies that the Messiah would suffer and die." ## Word Data: * Strong's: H1214, H4390, H5487, H7999, G378, G4135, G4137, G4138, G5048, G5055 ## Forms Found in the English ULB: carried out, fill up ... to the limit, finishing, fulfill, fulfilled, fulfillment, fulfills, in full, make ... full
53.6
211
0.716418
eng_Latn
0.997301
9852251c34de0a77e55c94263f70ac3db8202b36
1,464
md
Markdown
doc/botbuilder/enums/botbuilder.textformattypes.md
gwilymhumphreys/botbuilder-js
9e953a248a3753d92b6d838679d1fc1c670267dc
[ "MIT" ]
1
2021-04-14T03:11:53.000Z
2021-04-14T03:11:53.000Z
doc/botbuilder/enums/botbuilder.textformattypes.md
gwilymhumphreys/botbuilder-js
9e953a248a3753d92b6d838679d1fc1c670267dc
[ "MIT" ]
null
null
null
doc/botbuilder/enums/botbuilder.textformattypes.md
gwilymhumphreys/botbuilder-js
9e953a248a3753d92b6d838679d1fc1c670267dc
[ "MIT" ]
null
null
null
[Bot Builder SDK](../README.md) > [TextFormatTypes](../enums/botbuilder.textformattypes.md)

# Enumeration: TextFormatTypes

Defines values for TextFormatTypes. Possible values include: 'markdown', 'plain', 'xml'

There could be more values for this enum apart from the ones defined here. If you want to set a value that is not from the known values then you can do the following:

    let param: TextFormatTypes = <TextFormatTypes>"someUnknownValueThatWillStillBeValid";

*__readonly__*: *__enum__*: {string}

## Index

### Enumeration members

* [Markdown](botbuilder.textformattypes.md#markdown)
* [Plain](botbuilder.textformattypes.md#plain)
* [Xml](botbuilder.textformattypes.md#xml)

---

## Enumeration members

<a id="markdown"></a>

### Markdown

**Markdown**: = "markdown"

*Defined in [libraries/botframework-schema/lib/index.d.ts:1779](https://github.com/Microsoft/botbuilder-js/blob/c748a95/libraries/botframework-schema/lib/index.d.ts#L1779)*

___

<a id="plain"></a>

### Plain

**Plain**: = "plain"

*Defined in [libraries/botframework-schema/lib/index.d.ts:1780](https://github.com/Microsoft/botbuilder-js/blob/c748a95/libraries/botframework-schema/lib/index.d.ts#L1780)*

___

<a id="xml"></a>

### Xml

**Xml**: = "xml"

*Defined in [libraries/botframework-schema/lib/index.d.ts:1781](https://github.com/Microsoft/botbuilder-js/blob/c748a95/libraries/botframework-schema/lib/index.d.ts#L1781)*

___
20.619718
282
0.728825
eng_Latn
0.274349
985240941711e05b7aa0c1471be759b9c71eed04
694
md
Markdown
vault/lexicon/G04740.md
mandolyte/uw-obsidian
39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d
[ "MIT" ]
null
null
null
vault/lexicon/G04740.md
mandolyte/uw-obsidian
39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d
[ "MIT" ]
null
null
null
vault/lexicon/G04740.md
mandolyte/uw-obsidian
39e987c4cdc49d2a68e3af6b4e3fc84d1cda916d
[ "MIT" ]
null
null
null
# ἀντι-βάλλω <!-- Status: S2=NeedsEdits --> <!-- Lexica used for edits: --> ## Word data * Strongs: G04740 * Alternate spellings: , * Principle Parts: * Part of speech: * Instances in Scripture: 1 * All Scriptures cited: Yes ## Etymology: * LXX/Hebrew glosses: in LXX: [II Mac 11:13](2Macc.11.13)*; * Time Period/Ancient Authors: * Related words: * Antonyms for all senses * Synonyms for all senses: ## Senses ### Sense 1.0: #### Definition: #### Glosses: to throw in turn; #### Explanation: exchange; #### Citations: to throw in turn, exchange: metaph., [λόγους]() (cf. Lat. [conferre sermones](); v. Field, Notes, 81), [Lk 24:17](Luk 24:17).†
11.762712
126
0.615274
eng_Latn
0.51826
98528250038df1d6e44ae7a80d0dac3153eaa85d
1,186
md
Markdown
wiki/translations/ru/Std_ViewIvStereoOff.md
arslogavitabrevis/FreeCAD-documentation
8166ce0fd906f0e15672cd1d52b3eb6cf9e17830
[ "CC0-1.0" ]
null
null
null
wiki/translations/ru/Std_ViewIvStereoOff.md
arslogavitabrevis/FreeCAD-documentation
8166ce0fd906f0e15672cd1d52b3eb6cf9e17830
[ "CC0-1.0" ]
null
null
null
wiki/translations/ru/Std_ViewIvStereoOff.md
arslogavitabrevis/FreeCAD-documentation
8166ce0fd906f0e15672cd1d52b3eb6cf9e17830
[ "CC0-1.0" ]
null
null
null
---
- GuiCommand:/ru
   Name:Std ViewIvStereoOff
   Name/ru:Std ViewIvStereoOff
   MenuLocation:View → Stereo → Stereo Off
   SeeAlso:[Std ViewIvStereoRedGreen](Std_ViewIvStereoRedGreen/ru.md), [Std ViewIvStereoQuadBuff](Std_ViewIvStereoQuadBuff/ru.md), [Std ViewIvStereoInterleavedRows](Std_ViewIvStereoInterleavedRows/ru.md), [Std ViewIvStereoInterleavedColumns](Std_ViewIvStereoInterleavedColumns/ru.md)
---

# Std ViewIvStereoOff/ru

## Description

The **Std ViewIvStereoOff** command switches off stereo mode in the active [3D view](3D_view/ru.md).

## Usage

1. Select the **View → Stereo → <img src="images/Std_ViewIvStereoOff.svg" width=16px> Stereo Off** option from the menu.

## Scripting

**See also:** [FreeCAD Scripting Basics](FreeCAD_Scripting_Basics/ru.md).

To switch the view out of stereo mode, use the `setStereoType` method of the ActiveView object. This method is not available when FreeCAD is in console mode.

```python
import FreeCADGui
FreeCADGui.ActiveDocument.ActiveView.setStereoType('None')
FreeCADGui.ActiveDocument.ActiveView.getStereoType()
```

{{Std Base navi}}

---

[documentation index](../README.md) > Std ViewIvStereoOff/ru
25.782609
283
0.775717
yue_Hant
0.565021
9852ff16cbc3d37761f396ec2cb4dfd1ba6d1ba2
9,174
md
Markdown
doc_source/list_amazoncognitoidentity.md
tyron/iam-user-guide
caa7ada1c8570bbd0e9bf17edd35f1fb3daa23c3
[ "MIT-0" ]
null
null
null
doc_source/list_amazoncognitoidentity.md
tyron/iam-user-guide
caa7ada1c8570bbd0e9bf17edd35f1fb3daa23c3
[ "MIT-0" ]
null
null
null
doc_source/list_amazoncognitoidentity.md
tyron/iam-user-guide
caa7ada1c8570bbd0e9bf17edd35f1fb3daa23c3
[ "MIT-0" ]
null
null
null
# Actions, Resources, and Condition Keys for Amazon Cognito Identity<a name="list_amazoncognitoidentity"></a> Amazon Cognito Identity \(service prefix: `cognito-identity`\) provides the following service\-specific resources, actions, and condition context keys for use in IAM permission policies\. References: + Learn how to [configure this service](http://docs.aws.amazon.com/cognito/latest/developerguide/)\. + View a [list of the API operations available for this service](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/)\. + Learn how to protect this service and its resources by [using IAM](http://docs.aws.amazon.com/cognito/latest/developerguide/cognito-identity.html) permission policies\. **Topics** + [Actions Defined by Amazon Cognito Identity](#amazoncognitoidentity-actions-as-permissions) + [Resources Defined by Cognito Identity](#amazoncognitoidentity-resources-for-iam-policies) + [Condition Keys for Amazon Cognito Identity](#amazoncognitoidentity-policy-keys) ## Actions Defined by Amazon Cognito Identity<a name="amazoncognitoidentity-actions-as-permissions"></a> You can specify the following actions in the `Action` element of an IAM policy statement\. By using policies, you define the permissions for anyone performing an operation in AWS\. When you use an action in a policy, you usually allow or deny access to the API operation or CLI command with the same name\. However, in some cases, a single action controls access to more than one operation\. Alternatively, some operations require several different actions\. For details about the columns in the following table, see [The Actions Table](reference_policies_actions-resources-contextkeys.md#actions_table)\. 
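As a concrete illustration, a policy statement using two of the actions from the table below might look like the following. The region, account ID, and identity pool ID are placeholders, not values from this page:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "cognito-identity:DescribeIdentityPool",
        "cognito-identity:ListIdentities"
      ],
      "Resource": "arn:aws:cognito-identity:us-east-1:123456789012:identitypool/us-east-1:EXAMPLE-POOL-ID"
    }
  ]
}
```

Both actions accept the `identitypool` resource type, so the `Resource` element can name a specific pool ARN rather than `*`.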
****

| Actions | Description | Access Level | Resource Types \(\*required\) | Condition Keys | Dependent Actions |
| --- | --- | --- | --- | --- | --- |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_CreateIdentityPool.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_CreateIdentityPool.html) | Creates a new identity pool\. | Write | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DeleteIdentities.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DeleteIdentities.html) | Deletes identities from an identity pool\. You can specify a list of 1\-60 identities that you want to delete\. | Write | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DeleteIdentityPool.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DeleteIdentityPool.html) | Deletes a user pool\. Once a pool is deleted, users will not be able to authenticate with the pool\. | Write | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DescribeIdentity.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DescribeIdentity.html) | Returns metadata related to the given identity, including when the identity was created and any associated linked logins\. | Read | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DescribeIdentityPool.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_DescribeIdentityPool.html) | Gets details about a particular identity pool, including the pool name, ID description, creation date, and current number of users\. | Read | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetCredentialsForIdentity.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetCredentialsForIdentity.html) | Returns credentials for the provided identity ID\. | Read | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetId.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetId.html) | Generates \(or retrieves\) a Cognito ID\. Supplying multiple logins will create an implicit linked account\. | Write | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetIdentityPoolRoles.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetIdentityPoolRoles.html) | Gets the roles for an identity pool\. | Read | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetOpenIdToken.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetOpenIdToken.html) | Gets an OpenID token, using a known Cognito ID\. | Read | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetOpenIdTokenForDeveloperIdentity.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_GetOpenIdTokenForDeveloperIdentity.html) | Registers \(or retrieves\) a Cognito IdentityId and an OpenID Connect token for a user authenticated by your backend authentication process\. | Read | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_ListIdentities.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_ListIdentities.html) | Lists the identities in a pool\. | List | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_ListIdentityPools.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_ListIdentityPools.html) | Lists all of the Cognito identity pools registered for your account\. | List | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_LookupDeveloperIdentity.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_LookupDeveloperIdentity.html) | Retrieves the IdentityID associated with a DeveloperUserIdentifier or the list of DeveloperUserIdentifiers associated with an IdentityId for an existing identity\. | Read | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_MergeDeveloperIdentities.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_MergeDeveloperIdentities.html) | Merges two users having different IdentityIds, existing in the same identity pool, and identified by the same developer provider\. | Write | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_SetIdentityPoolRoles.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_SetIdentityPoolRoles.html) | Sets the roles for an identity pool\. These roles are used when making calls to the GetCredentialsForIdentity action\. | Write | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_UnlinkDeveloperIdentity.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_UnlinkDeveloperIdentity.html) | Unlinks a DeveloperUserIdentifier from an existing identity\. | Write | [identitypool\*](#amazoncognitoidentity-identitypool) | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_UnlinkIdentity.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_UnlinkIdentity.html) | Unlinks a federated identity from an existing account\. | Write | | | |
| [http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_UpdateIdentityPool.html](http://docs.aws.amazon.com/cognitoidentity/latest/APIReference/API_UpdateIdentityPool.html) | Updates a user pool\. | Write | [identitypool\*](#amazoncognitoidentity-identitypool) | | |

## Resources Defined by Cognito Identity<a name="amazoncognitoidentity-resources-for-iam-policies"></a>

The following resource types are defined by this service and can be used in the `Resource` element of IAM permission policy statements\. Each action in the [Actions table](#amazoncognitoidentity-actions-as-permissions) identifies the resource types that can be specified with that action\. A resource type can also define which condition keys you can include in a policy\. These keys are displayed in the last column of the table\. For details about the columns in the following table, see [The Resource Types Table](reference_policies_actions-resources-contextkeys.md#resources_table)\.

****

| Resource Types | ARN | Condition Keys |
| --- | --- | --- |
| [http://docs.aws.amazon.com/cognito/latest/developerguide/identity-pools.html](http://docs.aws.amazon.com/cognito/latest/developerguide/identity-pools.html) | arn:$\{Partition\}:cognito\-identity:$\{Region\}:$\{Account\}:identitypool/$\{IdentityPoolId\} | |

## Condition Keys for Amazon Cognito Identity<a name="amazoncognitoidentity-policy-keys"></a>

Cognito Identity has no service\-specific context keys that can be used in the `Condition` element of policy statements\. For the list of the global context keys that are available to all services, see [Available Keys for Conditions](http://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_condition-keys.html#AvailableKeys) in the *IAM Policy Reference*\.
163.821429
605
0.786353
yue_Hant
0.397587
9854e23b93328b57bdbbcee1aee285b17b03993f
2,033
md
Markdown
_lift/vid0068.md
elotroalex/lift
b65b073139f5d5fb7f5ae07b77eabf349a42a3cb
[ "MIT" ]
null
null
null
_lift/vid0068.md
elotroalex/lift
b65b073139f5d5fb7f5ae07b77eabf349a42a3cb
[ "MIT" ]
null
null
null
_lift/vid0068.md
elotroalex/lift
b65b073139f5d5fb7f5ae07b77eabf349a42a3cb
[ "MIT" ]
null
null
null
---
pid: vid0068
label: Performance by Mormon Tabernacle Choir
performer: Mormon Tabernacle Choir
date_uploaded: May 25, 2018
embed_url: www.youtube.com/embed/iUdkKNKx4DA
length: '5:41'
uploaded_by: The Tabernacle Choir at Temple Square
video_title: " Mormon Tabernacle Choir"
video_notes: |-
  On May 20, 2018, the National Association for the Advancement of Colored People (NAACP) attended the Mormon Tabernacle Choir’s weekly Music and the Spoken Word broadcast. Members of the National Board of Directors of the NAACP and the NAACP Foundation were in Salt Lake City for their board meetings, which were held there for the first time. They also met with the First Presidency of The Church of Jesus Christ of Latter-day Saints and made a joint statement to the media calling for “greater civility and racial harmony.”

  “Lift Every Voice and Sing” was written in 1900, when a school principal and poet, James Weldon Johnson, was invited to speak to a crowd in Jacksonville, Florida, for the anniversary of Abraham Lincoln’s birthday. To introduce the honored guest, Booker T. Washington, Johnson decided to write a poem. On February 12, 1900, 500 schoolchildren at the segregated Stanton School in Jacksonville, where Johnson was principal, recited “Lift Every Voice and Sing.” Johnson's brother, John Rosamond Johnson, wrote the music to accompany the poem in 1905.

  Read more: bit.ly/2HrS97X

  “Lift Every Voice and Sing”: © 1978 by MarVel. All rights reserved.
narrative: 'The Mormon Tabernacle Choir performs in what is noted as the NAACP''s first visit to Salt Lake City, Utah. A small number of the members of the choir are black. They are accompanied by an orchestra. They perform all three verses. The final verse is sung slowly and softly, as if a prayer.
'
rights: Mormon Tabernacle Choir and Orchestra
location: Salt Lake City, UT
related_docs:
personal_notes:
order: '67'
layout: lift_item
collection: lift
thumbnail: img/derivatives/simple/vid0068/thumbnail.jpg
full: img/derivatives/simple/vid0068/full.jpg
---
63.53125
538
0.789966
eng_Latn
0.996068
9855562300ce2311c05ddd59f8e874179048cd50
1,389
md
Markdown
README.md
mujeebishaque/todo-slap
834c93be201f1b8511800f01fe1eda74486a97b1
[ "MIT" ]
null
null
null
README.md
mujeebishaque/todo-slap
834c93be201f1b8511800f01fe1eda74486a97b1
[ "MIT" ]
null
null
null
README.md
mujeebishaque/todo-slap
834c93be201f1b8511800f01fe1eda74486a97b1
[ "MIT" ]
null
null
null
# todo-slap

My personal project to slap me with my todos when my PC wakes up.

### Where can I download the executable of the software?

- You can download the executable from this repo. It's placed inside the `output` folder.

### What does the software look like?

- Following are 3 screenshots of the software:
  - ![main window](screenshots/todo-slap-1.jpg)
  - ![error window](screenshots/todo-slap-2.jpg)
  - ![second tab](screenshots/todo-slap-3.jpg)

### What are the software requirements?

- Windows 10+

### How do I make it load on startup so I'm slapped with my todos?

- Place the exe inside your `startup` folder.
- For example, in my case the startup folder is here: `C:\Users\Dell\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup`
- There's another, better/recommended way to do it:
  - Have a batch script in the startup folder that starts the .exe file.
  - It can look like this: `start "" "C:\Users\Dell\OneDrive\Desktop\todo-slap\output\frontend.exe"`

### What libs did I use?

- PyQt5
- TinyDB
- Tkinter message boxes

### Database of choice?

- I used TinyDB, which uses a .json file as a database. It's efficient and very easy to use.

### Can I copy your code to make it mine or sell it to others?

- You're allowed to do whatever you want with the source code or software, but `I'll request to please include my name @mujeebishaque`. Thank you :)
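The README notes that TinyDB persists records in a .json file. As a rough illustration of that idea only — this is not TinyDB's actual API, and the file path and helper functions here are made up — a JSON file can serve as a tiny document store:

```python
import json
import tempfile
from pathlib import Path

# Hypothetical database file (TinyDB's equivalent would be something like db.json).
DB_FILE = Path(tempfile.mkdtemp()) / "todos.json"

def load_todos():
    """Read every record back out of the JSON file."""
    if DB_FILE.exists():
        return json.loads(DB_FILE.read_text())
    return []

def add_todo(text):
    """Append a record and rewrite the whole file, document-store style."""
    todos = load_todos()
    todos.append({"task": text, "done": False})
    DB_FILE.write_text(json.dumps(todos, indent=2))

add_todo("write README")
add_todo("ship v1")
print(len(load_todos()))  # 2
```

TinyDB wraps this same read-modify-rewrite pattern behind a query API, which is why it suits a small single-user tool like this one.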
42.090909
154
0.735781
eng_Latn
0.994335
9856ad617f5ddbb765c87d27e72a2417a3e38d5a
14,456
md
Markdown
01-swift-intro/01-swift-intro.md
agrippa1994/iOS8-day-by-day
1bee7d51c127485344e7a6e00b145028b290cd02
[ "Apache-2.0" ]
1
2016-04-18T11:44:34.000Z
2016-04-18T11:44:34.000Z
01-swift-intro/01-swift-intro.md
Onetaway/iOS8-day-by-day
023324eb3f018f25a7778d8534c037c60d0ee8b0
[ "Apache-2.0" ]
null
null
null
01-swift-intro/01-swift-intro.md
Onetaway/iOS8-day-by-day
023324eb3f018f25a7778d8534c037c60d0ee8b0
[ "Apache-2.0" ]
null
null
null
# iOS8 Day-by-Day :: Day 1 :: Swift for Blaggers This post is part of a daily series of posts introducing the most exciting new parts of iOS8 for developers - [#iOS8DayByDay](https://twitter.com/search?q=%23iOS8DayByDay). To see the posts you've missed check out the [introduction page](http://www.shinobicontrols.com/ios8daybyday), but have a read through the rest of this post first! --- ## Introduction ![Swift Logo](assets/swift_logo.png) It won't have gone unnoticed that at WWDC this year, in addition to announcing iOS8, they also introduced a new programming language in the form of Swift. This is quite a different language from objective-C in that it is strongly-typed and includes some features common to more modern languages. In the interests of embracing anything and everything that's new and shiny, this blog series will exclusively use Swift. There is a wealth of information out there about how to learn Swift, and how to interact with the Cocoa libraries - in fact you can't go wrong with starting out by reading through the official books: - [The Swift Programming Language](https://itunes.apple.com/us/book/the-swift-programming-language/id881256329?mt=11&ls=1) - [Using Swift with Cocoa and Objective-C](https://itunes.apple.com/us/book/using-swift-cocoa-objective/id888894773?mt=11&ls=1) You should also check out the official [swift blog](https://developer.apple.com/swift/blog/), and some of the other [resources](https://developer.apple.com/swift/resources/) made available by Apple. Since there is so much good info out there about how to use Swift, this post is not going to attempt to cover any of that. Instead, it's going to run through some of the important gotchas and potential pain points when using Swift for the first time - especially when relating to the system frameworks. There is an Xcode 6 playground which accompanies this post - including short samples for each of the sections. 
You can get hold of it on the ShinobiControls Github page - at [github.com/ShinobiControls/iOS8-day-by-day](https://github.com/ShinobiControls/iOS8-day-by-day). If you have any questions or suggestions of other things to add to this post then do let me know - I'll try to keep it up to date throughout the blog series. Drop a comment below, or gimme a shout on twitter - [@iwantmyrealname](https://twitter.com/iwantmyrealname). ## Initialisation Swift formalises the concepts surround initialisation of objects somewhat - including designated -vs- convenience initialisers, and sets a very specific order of the operations to be called within the initialisation phases of an object. In the coming weeks, there will be an article as part of this series which will go into detail about how initialisation works in Swift, and how this affects any objective-C that you write - so look out for this. There is one other fairly major difference in initialisation between Swift and objective-C, and that is return values and initialisation failure. In objective-C an initialiser looks a lot like this: - (instancetype)init { self = [super init]; if (self) { // Do some stuff } return self; } Whereas in Swift: init { variableA = 10 ... super.init() } Notice that in objective-C the initialiser is responsible for 'creating' and then returning `self`, but there is no `return` statement in the Swift equivalent. This means that there is actually no way in which you can return a `nil` object, which is a pattern commonly used to indicate an initialisation failure in objC. This is apparently likely to change in an upcoming release of the language, but for now the only workaround is to use class methods which return optional types: class MyClass { class func myFactoryMethod() -> MyClass? { ... } } Interestingly, factory methods on objective-C APIs are converted into initialisers in Swift, so this approach is not preferred. 
However, until language support arrives, it's the only option for initialisers which have the potential to fail.

## Mutability

The concept of (im)mutability is not new to Cocoa developers - we've been used to using `NSArray` and its mutable counterpart `NSMutableArray` where appropriate, and even understand that we should always prefer the immutable version wherever possible. Swift takes this concept to the next level, and bakes immutability into the language as a fundamental concept.

The `let` keyword defines an immutable variable, which means that you can't change what it represents. For example:

```swift
let a = MyClass()
a = MySecondClass() // Not allowed
```

This means that you can't redefine something specified with the `let` keyword. Depending on the type of the object referred to, it might itself be immutable too. If it is a value type (such as a struct) then it will also be immutable. If it is a reference type, such as a class, then it will be mutable. To see this in action, consider the following `struct`:

```swift
struct MyStruct {
    let t = 12
    var u: String
}
```

If you define a variable `struct1` with the `var` keyword then you get the following behaviour:

```swift
var struct1 = MyStruct(t: 15, u: "Hello")
struct1.t = 13 // Error: t is an immutable property
struct1.u = "GoodBye"
struct1 = MyStruct(t: 10, u: "You")
```

You can mutate the `u` property, since this is defined with `var`, and you can redefine the `struct1` variable itself, again because this is defined with `var`. You can't mutate the `t` property, since this is defined with `let`.

Now take a look what happens when you define an instance of a `struct` using `let`:

```swift
let struct2 = MyStruct(t: 12, u: "World")
struct2.u = "Planet" // Error: struct2 is immutable
struct2 = MyStruct(t: 10, u: "Defeat") // Error: struct2 is an immutable ref
```

Here, not only are you unable to mutate the `struct2` reference itself, but you are also unable to mutate the struct itself (i.e. the `u` property). This is because a struct is a __value type__.
The behaviour is subtly different with a class:

```swift
class MyClass {
    let t = 12
    var u: String

    init(t: Int, u: String) {
        self.t = t
        self.u = u
    }
}
```

Defining a variable using `var` gives behaviour you might be used to from objective-C:

```swift
var class1 = MyClass(t: 15, u: "Hello")
class1.t = 13 // Error: t is an immutable property
class1.u = "GoodBye"
class1 = MyClass(t: 10, u: "You")
```

You can mutate both the reference itself, and any properties defined using `var`, but you are unable to mutate any properties defined with `let`. Compare this to the behaviour when the instance is defined with `let`:

```swift
let class2 = MyClass(t: 12, u: "World")
class2.u = "Planet" // No error
class2 = MyClass(t: 11, u: "Geoid") // Error: class2 is an immutable reference
```

Here you are unable to mutate the reference itself, but you __can__ still mutate any properties defined with `var` within the class. This is because a class is a __reference type__.

This behaviour is fairly easy to understand, and is well-explained in the language reference books. There is potential for confusion when looking at Swift collection types though.

An `NSArray` is a reference type. That is to say that when you create an instance of `NSArray`, you create an object and your variable is a pointer to the location of the array itself in memory - hence the asterisk in the objective-C definition. If you take a look back over what you've learnt about the semantics of reference and value types with respect to `let` and `var` then you can probably work out how they would behave. In fact, if you want a mutable version of an `NSArray` you have to use a different class - in the shape of `NSMutableArray`.

Swift arrays aren't like this - they are value types instead of reference types. This means that they behave like a struct, not a class. Therefore, the `let` or `var` keyword not only specifies whether or not the variable can be redefined, but also whether or not the created array is mutable.
An array defined with `var` can both be reassigned, and mutated:

```swift
var array1 = [1,2,3,4]
array1.append(5) // [1,2,3,4,5]
array1[0] = 27 // [27,2,3,4,5]
array1 = [3,2] // [3,2]
```

But an array defined with `let` can be neither:

```swift
let array2 = [4,3,2,1]
array2.append(0) // Error: array2 is immutable
array2[2] = 36 // Error: array2 is immutable
array2 = [5,6] // Error: cannot reassign an immutable reference
```

This is an area with a huge potential for confusion. Not only does it completely change the way we think about mutability for collections, but it also mixes up two previously distinct concepts. There is potential that this might be changed in a future release of the language - so keep an eye on the language definition.

A corollary of this is that since arrays are value types, they are passed by copy. `NSArray` instances are always passed by reference - so a method which takes an `NSArray` pointer will point to exactly the same chunk of memory. If you pass a Swift array into a method, it will receive a copy of that array. Depending on the type of the objects stored in that array this could either be a deep, or a shallow copy. Be aware of this whilst writing your code!

## Strong Typing and `AnyObject`

Strong typing is seen as a great feature of Swift - it can allow for safer code, since what in objective-C would have been runtime exceptions can now be caught at compile time. This is great, but as you're working with the objective-C system frameworks you'll notice a lot of this `AnyObject` type. This is the Swift equivalent of objective-C's `id`.

In many respects, `AnyObject` feels rather un-Swift-like. It allows you to call __any__ methods it can find on it, but if the object doesn't actually respond, these calls will result in a run-time exception. In fact, it behaves _almost_ exactly the same as `id` in objective-C.
The difference is that properties and methods which take no arguments will return `nil` if that method/property doesn't exist on the `AnyObject`:

```swift
let myString: AnyObject = "hello"
myString.cornerRadius // Returns nil
```

In order to work in a more Swift-like way with the Cocoa APIs, you'll see the following pattern a lot:

```swift
func someFunc(parameter: AnyObject!) -> AnyObject! {
    if let castedParameter = parameter as? NSString {
        // Now I know I have a string :)
        ...
    }
}
```

If you know that you've definitely been passed a string, you don't necessarily need to guard around the cast:

```swift
let castedParameter = parameter as NSString
```

A top-tip is to realise that casting arrays is really easy too. All arrays that you'll receive from a Cocoa framework will be of the type `[AnyObject]`, since `NSArray` doesn't support generics. However, in nearly every case not only are all the elements of the same type, but they are of a known type. You can cast an entire array in both the conditional and unconditional ways expressed above, with the following syntax:

```swift
func someArrayFunc(parameter: [AnyObject]!) {
    let newArray = parameter as [String]
    // Do something with your strings :)
}
```

## Protocol Conformance

Protocols are well-understood in Swift - defined as follows:

```swift
protocol MyProtocol {
    func myProtocolMethod() -> Bool
}
```

One of the things you often want to do is test whether an object conforms to a specified protocol, which you could do as follows:

```swift
if let class1AsMyProtocol = class1 as? MyProtocol {
    // We're in
}
```

However, this will have an error, because in order to check conformance of a protocol, that protocol must be an objective-C protocol - and annotated with `@objc`:

```swift
@objc protocol MyNewProtocol {
    func myProtocolMethod() -> Bool
}

if let class1AsMyNewProtocol = class1 as? MyNewProtocol {
    // We're in
}
```

This can actually be more effort than you'd expect, since in order that a protocol be labelled as `@objc`, all of its properties and method return types must also be understood in the objective-C world. This means that you might end up annotating loads of classes you thought you only cared about in Swift with `@objc`.

## Enums

Enums in Swift have become super-charged. Not only can an enum now have associated values (which needn't be of the same type), but it can also contain functions too.

```swift
enum MyEnum {
    case FirstType
    case IntType (Int)
    case StringType (String)
    case TupleType (Int, String)

    func prettyFormat() -> String {
        switch self {
        case .FirstType:
            return "No params"
        case .IntType(let value):
            return "One param: \(value)"
        case .StringType(let value):
            return "One param: \(value)"
        case .TupleType(let v1, let v2):
            return "Some params: \(v1), \(v2)"
        default:
            return "Nothing to see here"
        }
    }
}
```

This is really powerful - use it as follows:

```swift
var enum1 = MyEnum.FirstType
enum1.prettyFormat() // "No params"

enum1 = .TupleType(12, "Hello")
enum1.prettyFormat() // "Some params: 12, Hello"
```

It'll take a little practice to see where you can get some benefit out of the power of these, but just as an indication of what you can achieve - the optionals system within Swift is built out of enumerations.

## Conclusion

Swift is really powerful - and it's going to take us a while to establish best practice and work out what idioms and patterns we can now use that weren't possible within the constraints of objective-C. This post has outlined some of the common areas of confusion when moving from objective-C to Swift, but don't let this put you off. All of the projects associated with this blog series are written using Swift, and on the most-part are really simple to understand.

There is a playground which contains some of the samples mentioned in this post - it's part of the Github repo which accompanies this series.
You can get it at [github.com/ShinobiControls/iOS8-day-by-day](https://github.com/ShinobiControls/iOS8-day-by-day). If you have any questions or suggestions for additions / updates on this page then please do get in contact. You can leave a comment below, or tweet me - I'm [@iwantmyrealname](https://twitter.com/iwantmyrealname).

The series starts properly on Monday - with a look at one of the significant new APIs within iOS8. Join us then - or use the __subscribe__ button below to get an email reminder that a new post has been published.

sam
# go-metrics

[![GoDoc](https://godoc.org/github.com/takashabe/go-metrics?status.svg)](https://godoc.org/github.com/takashabe/go-metrics)
[![CircleCI](https://circleci.com/gh/takashabe/go-metrics.svg?style=shield)](https://circleci.com/gh/takashabe/go-metrics)
[![Go Report Card](https://goreportcard.com/badge/github.com/takashabe/go-metrics)](https://goreportcard.com/report/github.com/takashabe/go-metrics)

Collecting metrics. The metrics values can be categorized into several types.

## Installation

```
go get -u github.com/takashabe/go-metrics
```

## Usage

* saveMetrics
  * collect metrics
* forwardConsole, forwardUDP
  * forward to the specified io.Writer

For details, see [example/main.go](example/main.go)

```go
package main

import (
	"context"
	"os"
	"time"

	"github.com/takashabe/go-metrics/collect"
	"github.com/takashabe/go-metrics/forward"
)

func main() {
	// collect metrics
	collector := collect.NewSimpleCollector()
	for i := 0; i < 10; i++ {
		saveMetrics(collector)
	}

	// metrics send to console
	cctx, ccancel := context.WithCancel(context.Background())
	forwardConsole(cctx, collector)

	// metrics send to udp server
	// must be a running server
	uctx, ucancel := context.WithCancel(context.Background())
	forwardUDP(uctx, collector)

	time.Sleep(2 * time.Second)
	ccancel()
	ucancel()

	// output on console and udp socket (prepare reformat by jq):
	_ = `{
	  "cnt": 10,
	  "histogram.95percentile": 1499763733746145300,
	  "histogram.avg": 1499763733746128600,
	  "histogram.count": 10,
	  "histogram.max": 1499763733746146600,
	  "histogram.median": 1499763733746140200,
	  "history": [
	    "2017-07-11 18:02:13.746027874 +0900 JST",
	    "2017-07-11 18:02:13.746132309 +0900 JST",
	    "2017-07-11 18:02:13.74613555 +0900 JST",
	    "2017-07-11 18:02:13.746137325 +0900 JST",
	    "2017-07-11 18:02:13.746138707 +0900 JST",
	    "2017-07-11 18:02:13.746140146 +0900 JST",
	    "2017-07-11 18:02:13.746141455 +0900 JST",
	    "2017-07-11 18:02:13.746142766 +0900 JST",
	    "2017-07-11 18:02:13.746144055 +0900 JST",
	    "2017-07-11 18:02:13.746146669 +0900 JST"
	  ],
	  "recent": 1499763733746146600
	}`
}

func saveMetrics(c collect.Collector) {
	now := time.Now()
	c.Add("cnt", 1)
	c.Gauge("recent", float64(now.UnixNano()))
	c.Histogram("histogram", float64(now.UnixNano()))
	c.Set("history", now.String())
}

func forwardConsole(ctx context.Context, c collect.Collector) {
	writer, err := forward.NewSimpleWriter(c, os.Stdout)
	if err != nil {
		panic(err)
	}
	writer.AddMetrics(c.GetMetricsKeys()...)
	writer.RunStream(ctx) // metrics will be sent every second
}

func forwardUDP(ctx context.Context, c collect.Collector) {
	writer, err := forward.NewNetWriter(c, ":1234")
	if err != nil {
		panic(err)
	}
	writer.AddMetrics(c.GetMetricsKeys()...)
	writer.RunStream(ctx) // metrics will be sent every second
}
```

## Metrics type

| Type | Detail |
| --- | --- |
| Counter | Used to count things |
| Gauge | A particular value at a particular time |
| Histogram | Represents a statistical distribution of a series of values.<br> Each histogram provides `count`, `average`, `minimum`, `maximum`, `median` and `95th percentile` |
| Set | Used to count the number of unique values in a group |
| Snapshot | A particular value set at a particular time |
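For a sense of what the `Histogram` type's summary statistics involve, here is a rough, illustrative sketch of computing them over a slice of samples. This is not go-metrics' actual implementation, and the function name `histogramStats` is invented for the example:

```go
package main

import (
	"fmt"
	"sort"
)

// histogramStats computes the kind of summary statistics a Histogram
// metric exposes (count, average, min, max, median, 95th percentile).
// Illustrative sketch only; assumes a non-empty slice.
func histogramStats(values []float64) map[string]float64 {
	// Copy before sorting so the caller's slice is untouched.
	sorted := append([]float64(nil), values...)
	sort.Float64s(sorted)

	n := len(sorted)
	sum := 0.0
	for _, v := range sorted {
		sum += v
	}

	// Simple nearest-rank style index for the 95th percentile.
	p95 := int(float64(n) * 0.95)
	if p95 >= n {
		p95 = n - 1
	}

	return map[string]float64{
		"count":        float64(n),
		"avg":          sum / float64(n),
		"min":          sorted[0],
		"max":          sorted[n-1],
		"median":       sorted[n/2],
		"95percentile": sorted[p95],
	}
}

func main() {
	fmt.Println(histogramStats([]float64{10, 20, 30, 40}))
}
```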
<h1>Reverse Geocode Online</h1>

<p>Demonstrates how to reverse geocode a location and find its nearest address.</p>

<p><img src="ReverseGeocodeOnline.png"/></p>

<h2>How to use the sample</h2>

<p>You can click on the ArcGISMap to perform online reverse geocoding and show the matching results in the ArcGISMap.</p>

<h2>How it works</h2>

<p>To perform online reverse geocoding:</p>

<ol>
  <li>Create the <code>ArcGISMap</code> with a <code>Basemap</code>.
    <ul><li>The basemap is created using a <code>TileCache</code> to represent an offline resource.</li></ul></li>
  <li>Create a <code>LocatorTask</code> using a URL.</li>
  <li>Set the <code>GeocodeParameters</code> for the LocatorTask and specify the geocodes' attributes.</li>
  <li>Get the matching results from the <code>GeocodeResult</code> using <code>LocatorTask.reverseGeocodeAsync()</code>.</li>
  <li>Lastly, show the results using a <code>PictureMarkerSymbol</code> with attributes and add the symbol to a <code>Graphic</code> in the <code>GraphicsOverlay</code>.</li>
</ol>

<h2>Features</h2>

<ul>
  <li>ArcGISMap</li>
  <li>GeocodeParameters</li>
  <li>GraphicsOverlay</li>
  <li>LocatorTask</li>
  <li>MapView</li>
  <li>PictureMarkerSymbol</li>
  <li>ReverseGeocodeParameters</li>
  <li>TileCache</li>
</ul>
<!-- Do not edit this file. It is automatically generated by API Documenter. -->

[Home](./index.md) &gt; [@siteimprove/alfa-graph](./alfa-graph.md) &gt; [Graph](./alfa-graph.graph_class.md) &gt; [toJSON](./alfa-graph.graph_class.tojson_1_method.md)

## Graph.toJSON() method

<b>Signature:</b>

```typescript
toJSON(): Graph.JSON<T>;
```

<b>Returns:</b>

[Graph.JSON](./alfa-graph.graph_namespace.json_typealias.md)<!-- -->&lt;T&gt;
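As a rough illustration of the kind of serialisation a `toJSON()` method performs, here is a self-contained sketch of a toy graph type with its own `toJSON()`. The `SimpleGraph` class below is invented for the example and is not the real `Graph` from `@siteimprove/alfa-graph`:

```typescript
// Illustrative only: a toy adjacency-list graph whose toJSON() returns
// a plain serialisable structure, which JSON.stringify picks up.
type GraphJSON<T> = Array<[T, Array<T>]>;

class SimpleGraph<T> {
  private readonly edges = new Map<T, Set<T>>();

  connect(from: T, to: T): void {
    if (!this.edges.has(from)) {
      this.edges.set(from, new Set());
    }
    this.edges.get(from)!.add(to);
  }

  toJSON(): GraphJSON<T> {
    return Array.from(
      this.edges.entries(),
      ([node, neighbours]) => [node, Array.from(neighbours)] as [T, Array<T>]
    );
  }
}

const g = new SimpleGraph<string>();
g.connect("a", "b");
g.connect("a", "c");
console.log(JSON.stringify(g)); // prints [["a",["b","c"]]]
```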
---
layout: post
tag: inflation, FRED, data analysis
category: "other risks"
title: "US Government Interest Rates"
description: Analyze interest rates from FRED
image: images/posts/photos/IMG-0869.JPG
---

![](/images/posts/photos/IMG-0869.jpg)

Under this mighty topic of interest rates, in this post I explore interest rates from the 1970s onward and try to understand them in more detail, tied to exact dates and actions in history. One motivation is to get a bit of a crystal ball on inflation and recession. If I had a longer data history, I would like to study wars as well.

We have been living in a world of low inflation in the US, coming out of a decade of less than 2.5% YoY inflation. China's inflation rate is less than 4%, India's is 6%, and Japan's is only 0.5%. It is hard to imagine how inflation could possibly get any worse than the current +7% in the US and 5.8% (average) in Europe. But it very well may.

<!-- print(tabulate(freq_tbl.iloc[:,:1], tablefmt="pipe", headers='keys')) -->

| MEV | frequency | meaning | date |
|:-----------|:------------|:--------------------|:--------------------|
| PALLFNFINDEXQ | quarterly | global commodities | first of quarter |
| CPIAUCSL | monthly | cpi | first of month |
| PPIACO | monthly | producers index | first of month |
| USSTHPI | quarterly | hpi | first of quarter |
| FEDFUNDS | monthly | fed funds rate | first of month |
| DGS10 | daily | 10-Year Treasury rate | nan |
| DGS2 | daily | 2-Year Treasury rate | nan |
| TB3MS | monthly | 3-month Treasury bill | first of month |
| UNRATE | monthly | unemployment rate | first of month |
| GDP | quarterly | Nominal GDP | first of quarter |
| GDPC1 | quarterly | Inflation adjusted GDP | first of quarter |

<!-- what's the cause of the disease, how do we cure the disease? what are the effects of the cure? What are the side effects of it? What if we don't cure it? -->

<div class="code-head"><span>code</span>fred.py</div>

```python
import pandas_datareader.data as web  # pandas 0.19.x and later
from datetime import datetime
import pandas as pd
import matplotlib.pyplot as plt

lt = ["FEDFUNDS", "DGS10", "DGS2", "TB3MS", "UNRATE", "GDP", "GDPC1"]
ss_lt = []
start = pd.Timestamp('1960-1-1')
end = datetime.today()
for i in lt:
    ss = web.DataReader(i, "fred", start, end)
    ss_lt.append(ss)
df = pd.concat(ss_lt, axis=1)
```
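As a quick example of turning one of these series into the YoY inflation rate quoted above, the monthly CPI level series (`CPIAUCSL`) can be converted with a 12-month percentage change. A minimal sketch, using synthetic data in place of the FRED download:

```python
import numpy as np
import pandas as pd

def yoy_inflation(cpi: pd.Series) -> pd.Series:
    """Year-over-year % change of a monthly CPI level series."""
    return cpi.pct_change(12) * 100

# Synthetic CPI growing at exactly 7% per year (stand-in for CPIAUCSL)
idx = pd.date_range("2020-01-01", periods=25, freq="MS")
cpi = pd.Series(100 * 1.07 ** (np.arange(25) / 12), index=idx)

print(round(yoy_inflation(cpi).iloc[-1], 2))  # 7.0
```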
Article 2359
----

The pledge (*nantissement*) extends to the accessories of the claim, unless the parties agree otherwise.
# Team-ToDoList

TODO list made in Python with Tkinter as Team CW

<h3 align="left">Languages and Tools:</h3>
<p align="left">
  <a href="https://www.python.org" target="_blank">
    <img src="https://devicons.github.io/devicon/devicon.git/icons/python/python-original.svg" alt="python" width="40" height="40"/>
  </a>
</p>
# task-machine

A TUI client for a format inspired by [todo.txt](https://github.com/todotxt/todo.txt), written in Haskell.

Still in development...

![example todo.txt displayed using i3 and urxvt](example_screenshot.png)
# Hypothesis

Hypothesis provides annotations for Hoover and DokuWiki. It's loaded automatically in the web pages of those apps.
# Helm charts for deploying terrascan

This guide deploys terrascan as a server within your kubernetes cluster. Additionally, you can deploy a Validating Webhook that'll use the terrascan server as its backend.

In server mode, terrascan will act both as an API server for performing remote scans of IAC, as well as a validating admission webhook for a Kubernetes cluster. Further details can be found in the [main documentation](https://docs.accurics.com/projects/accurics-terrascan/en/latest/).

## Usage

### Set up TLS certificates

A requirement to run an admission controller is that communication happens over TLS. This helm chart expects to find the certificate at `data/server.crt` and the key at `data/server.key`.

There's a `data/domain.cnf` file available for you to edit and generate the key & certificate. You can use the following command:

```bash
openssl req -x509 -sha256 -nodes -newkey rsa:2048 -keyout data/server.key -out data/server.crt -config data/domain.cnf
```

In the `data/domain.cnf` file, we have configured DNS names as `terrascan.terrascan.svc`, assuming the defaults that the service will be named `terrascan` and hosted in the `terrascan` namespace. You'll have to manually change that as per your requirements.

### Terrascan configuration file

This chart will look for a [terrascan configuration file](https://docs.accurics.com/projects/accurics-terrascan/en/latest/usage/#config-file) at `data/config.toml`. If that file exists before running `helm install`, its contents will be loaded into a configMap and provided to the terrascan server.

### Set up SSH config for private remote repo scan

If you're opting to utilise the remote repo scan feature for ***private*** repositories, terrascan will require SSH capabilities to do that. This helm chart expects to find your SSH private key at `.ssh/private_key`, and your SSH known_hosts file at `.ssh/known_hosts`. Your SSH public key must be set up at the code repository hosting service, such as GitHub, Bitbucket, etc.
You can use the below content to create your `.ssh/known_hosts` file.

```bash
# known_hosts
github.com,192.30.255.113 ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAq2A7hRGmdnm9tUDbO9IDSwBK6TbQa+PXYPCPy6rbTrTtw7PHkccKrpp0yVhp5HdEIcKr6pLlVDBfOLX9QUsyCOV0wzfjIJNlGEYsdlLJizHhbn2mUjvSAHQqZETYP81eFzLQNnPHt4EVVUh7VfDESU84KezmD5QlWpXLmvU31/yMf+Se8xhHTvKSCZIFImWwoG6mbUoWf9nzpIoaSjB+weqqUUmpaaasXVal72J+UX2B+2RPW3RcT0eOzQgqlJL3RKrTJvdsjE3JEAvGq3lGHSZXy28G3skua2SmVi/w4yCE6gbODqnTWlg7+wC604ydGXA8VJiS5ap43JXiUFFAaQ==
bitbucket.org,104.192.141.1 ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEAubiN81eDcafrgMeLzaFPsw2kNvEcqTKl/VqLat/MaB33pZy0y3rJZtnqwR2qOOvbwKZYKiEO1O6VqNEBxKvJJelCq0dTXWT5pbO2gDXC6h6QDXCaHo6pOHGPUy+YBaGQRGuSusMEASYiWunYN0vCAI8QaXnWMXNMdFP3jHAJH0eDsoiGnLPBlBp4TNm6rYI74nMzgz3B9IikW4WVK+dc8KZJZWYjAuORU3jc1c/NPskD2ASinf8v3xnfXeukU0sJ5N6m5E8VLjObPEO+mN2t/FZTMZLiFqPWc/ALSqnMnnhwrNi2rbfg/rd/IpL8Le3pSBne8+seeFVBoGqzHM9yXw==
gitlab.com,172.65.251.78 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFSMqzJeV9rUzU4kWitGjeR4PWSa29SPqJ1fVkhtj3Hw9xjLVXVYrU9QlYWrOLXBpQ6KWjbjTDTdDkoohFzgbEY=
```

**Note:** This is an optional feature and not a requirement.

### Persistent storage

By default, this chart will deploy terrascan with an `emptyDir` volume - basically a temporary volume. If you intend to use the admission controller functionality, then you may want to store the admission controller database on a persistent volume.

This chart supports specifying a [persistent volume claim](https://kubernetes.io/docs/concepts/storage/persistent-volumes/) for the database - as storage, PVs, and PVCs are a wide topic within the Kubernetes ecosystem, the details of the PV/PVC creation are left to the individual.
To specify the use of a PVC, set `persistence.enabled` to `true`, and then specify the name of an existing PVC:

```
persistence:
  enabled: true
  existingclaim: pvcClaimName
```

### Deploy

Once your TLS certificate is generated and the values in the `values.yaml` configuration file have been reviewed, you can install the chart with the following commands.

#### Deploying Terrascan Server

For just installing a terrascan server deployment and service,

```
helm install <release-name> . -n <namespace>
```

Where `<release-name>` is the name you want to assign to this installed chart. This value will be used in various resources to make them both distinct and identifiable.

#### Verification

You can query for the pod using the following command.

```
kubectl get pod -n <namespace> -w
```

Watch the pod until it attains the `Running` state. Verify the logs of the terrascan pod using the following command.

```
kubectl -n <namespace> logs <pod-name>
```

If you see a log that goes like `server listening on port : <port-name>`, the deployment went smoothly.

#### Deploying Validating Webhook

For installing the terrascan deployment and service along with the validating webhook,

```
helm install <release-name> . -n <namespace> --set webhook.mode=true
```

This will use your current namespace unless `-n <namespace>` is specified.

#### Verification

Try creating a resource that's scanned by the webhook:

```bash
kubectl run test-pod --image=nginx
```

#### Clean Up

```bash
helm uninstall <release-name> -n <namespace>
```

## TODO:

This chart is a WIP - we intend to add the following functionality in the near future:

- [x] Storage support - volume for db
- [x] Add section for setting the validating-webhook up.
- [x] Add secrets to add ssh capabilities in the container, to enable remote repo scan feature.
- [ ] Support more load balancer types
- [ ] Support for ingress
- [ ] Flag for UI enable/disable
- [ ] Publish to Artifact hub
- [ ] Support TLS certificate/key in existing secrets
## Changelog

```
releaseHeader('2017-10-26', '2.0.2', '2.0.1')
```

* `--transpile` now also applies to `require`d or `import`ed CoffeeScript files.
* `--transpile` can be used with the REPL: `coffee --interactive --transpile`.
* Improvements to comments output that should now cover all of the [Flow comment-based syntax](https://flow.org/en/docs/types/comments/). Inline `###` comments near [variable](https://flow.org/en/docs/types/variables/) initial assignments are now output in the variable declaration statement, and `###` comments near a [class and method names](https://flow.org/en/docs/types/generics/) are now output where Flow expects them.
* Importing CoffeeScript keywords is now allowed, so long as they're aliased: `import { and as andFn } from 'lib'`. (You could also do `import lib from 'lib'` and then reference `lib.and`.)
* Calls to functions named `get` and `set` no longer throw an error when given a bracketless object literal as an argument: `obj.set propertyName: propertyValue`.
* In the constructor of a derived class (a class that `extends` another class), you cannot call `super` with an argument that references `this`: `class Child extends Parent then constructor: (@arg) -> super(@arg)`. This isn't allowed in JavaScript, and now the CoffeeScript compiler will throw an error. Instead, assign to `this` after calling `super`: `(arg) -> super(arg); @arg = arg`.
* Bugfix for incorrect output when backticked statements and hoisted expressions were both in the same class body. This allows a backticked line like `` `field = 3` ``, for people using the experimental [class fields](https://github.com/tc39/proposal-class-fields) syntax, in the same class along with traditional class body expressions like `prop: 3` that CoffeeScript outputs as part of the class prototype.
* Bugfix for comments not output before a complex `?` operation, e.g. `@a ? b`.
* All tests now pass in Windows.
```
releaseHeader('2017-09-26', '2.0.1', '2.0.0')
```

* `babel-core` is no longer listed in `package.json`, even as an `optionalDependency`, to avoid it being automatically installed for most users. If you wish to use `--transpile`, simply install `babel-core` manually. See [Transpilation](#transpilation).
* `--transpile` now relies on Babel to find its options, i.e. the `.babelrc` file in the path of the file(s) being compiled. (Previously the CoffeeScript compiler was duplicating this logic, so nothing has changed from a user's perspective.) This provides automatic support for additional ways to pass options to Babel in future versions, such as the `.babelrc.js` file coming in Babel 7.
* Backticked expressions in a class body, outside any class methods, are now output in the JavaScript class body itself. This allows for passing through experimental JavaScript syntax like the [class fields proposal](https://github.com/tc39/proposal-class-fields), assuming your [transpiler supports it](https://babeljs.io/docs/plugins/transform-class-properties/).

```
releaseHeader('2017-09-18', '2.0.0', '2.0.0-beta5')
```

* Added `--transpile` flag or `transpile` Node API option to tell the CoffeeScript compiler to pipe its output through Babel before saving or returning it; see [Transpilation](#transpilation). Also changed the `-t` short flag to refer to `--transpile` instead of `--tokens`.
* Always populate source maps' `sourcesContent` property.
* Bugfixes for destructuring and for comments in JSX.
* _Note that these are only the changes between 2.0.0-beta5 and 2.0.0. See below for all changes since 1.x._

```
releaseHeader('2017-09-02', '2.0.0-beta5', '2.0.0-beta4')
```

* Node 6 is now supported, and we will try to maintain that as the minimum required version for CoffeeScript 2 via the `coffee` command or Node API. Older versions of Node, or non-evergreen browsers, can compile via the [browser compiler](./browser-compiler/coffeescript.js).
* The command line `--output` flag now allows you to specify an output filename, not just an output folder. * The command line `--require` flag now properly handles filenames or module names that are invalid identifiers (like an NPM module with a hyphen in the name). * `Object.assign`, output when object destructuring is used, is polyfilled using the same polyfill that Babel outputs. This means that polyfills shouldn’t be required unless support for Internet Explorer 8 or below is desired (or your own code uses a feature that requires a polyfill). See [ES2015+ Output](#es2015plus-output). * A string or JSX interpolation that contains only a comment (`"a#{### comment ###}b"` or `<div>{### comment ###}</div>`) is now output (`` `a${/* comment */}b` ``) * Interpolated strings (ES2015 template literals) that contain quotation marks no longer have the quotation marks escaped: `` `say "${message}"` `` * It is now possible to chain after a function literal (for example, to define a function and then call `.call` on it). * The results of the async tests are included in the output when you run `cake test`. * Bugfixes for object destructuring; expansions in function parameters; generated reference variables in function parameters; chained functions after `do`; splats after existential operator soaks in arrays (`[a?.b...]`); trailing `if` with splat in arrays or function parameters (`[a if b...]`); attempting to `throw` an `if`, `for`, `switch`, `while` or other invalid construct. * Bugfixes for syntactical edge cases: semicolons after `=` and other “mid-expression” tokens; spaces after `::`; and scripts that begin with `:` or `*`. * Bugfixes for source maps generated via the Node API; and stack trace line numbers when compiling CoffeeScript via the Node API from within a `.coffee` file. ``` releaseHeader('2017-08-03', '2.0.0-beta4', '2.0.0-beta3') ``` * This release includes [all the changes from 1.12.6 to 1.12.7](#1.12.7). 
* [Line comments](#comments) (starting with `#`) are now output in the generated JavaScript. * [Block comments](#comments) (delimited by `###`) are now allowed anywhere, including inline where they previously weren’t possible. This provides support for [static type annotations](#type-annotations) using Flow’s comments-based syntax. * Spread syntax (`...` for objects) is now supported in JSX tags: `<div {props...} />`. * Argument parsing for scripts run via `coffee` is improved. See [breaking changes](#breaking-changes-argument-parsing-and-shebang-lines). * CLI: Propagate `SIGINT` and `SIGTERM` signals when node is forked. * `await` in the REPL is now allowed without requiring a wrapper function. * `do super` is now allowed, and other accesses of `super` like `super.x.y` or `super['x'].y` now work. * Splat/spread syntax triple dots are now allowed on either the left or the right (so `props...` or `...props` are both valid). * Tagged template literals are recognized as callable functions. * Bugfixes for object spread syntax in nested properties. * Bugfixes for destructured function parameter default values. ``` releaseHeader('2017-07-16', '1.12.7', '1.12.6') ``` * Fix regressions in 1.12.6 related to chained function calls and indented `return` and `throw` arguments. * The REPL no longer warns about assigning to `_`. ``` releaseHeader('2017-06-30', '2.0.0-beta3', '2.0.0-beta2') ``` * [JSX](#jsx) is now supported. * [Object rest/spread properties](#object-spread) are now supported. * Bound (fat arrow) methods are once again supported in classes; though an error will be thrown if you attempt to call the method before it is bound. See [breaking changes for classes](#breaking-changes-classes). * The REPL no longer warns about assigning to `_`. * Bugfixes for destructured nested default values and issues related to chaining or continuing expressions across multiple lines. 
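The object rest/spread support mentioned in the beta3 notes above can be sketched directly in the JavaScript that such CoffeeScript corresponds to; the object and property names here are invented for illustration:

```javascript
// Sketch of ES2018 object rest/spread semantics, which CoffeeScript's
// `{host, rest...} = config` style destructuring corresponds to.
const config = { host: 'localhost', port: 8080, debug: true };

// Rest: pull one property out and gather the remainder into a new object.
const { host, ...rest } = config; // host = 'localhost', rest = { port: 8080, debug: true }

// Spread: shallow-merge objects; later properties win.
const merged = { ...rest, port: 9090 }; // { port: 9090, debug: true }

console.log(host, merged);
```

Note that both rest and spread are shallow: nested objects are shared by reference, not copied.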
``` releaseHeader('2017-05-16', '2.0.0-beta2', '2.0.0-beta1') ``` * This release includes [all the changes from 1.12.5 to 1.12.6](#1.12.6). * Bound (fat arrow) methods in classes must be declared in the class constructor, after `super()` if the class is extending a parent class. See [breaking changes for classes](#breaking-changes-classes). * All unnecessary utility helper functions have been removed, including the polyfills for `indexOf` and `bind`. * The `extends` keyword now only works in the context of classes; it cannot be used to extend a function prototype. See [breaking changes for `extends`](#breaking-changes-super-extends). * Literate CoffeeScript is now parsed entirely based on indentation, similar to the 1.x implementation; there is no longer a dependency for parsing Markdown. See [breaking changes for Literate CoffeeScript parsing](#breaking-changes-literate-coffeescript). * JavaScript reserved words used as properties are no longer wrapped in quotes. * `require('coffeescript')` should now work in non-Node environments such as the builds created by Webpack or Browserify. This provides a more convenient way to include the browser compiler in builds intending to run in a browser environment. * Unreachable `break` statements are no longer added after `switch` cases that `throw` exceptions. * The browser compiler is now compiled using Babili and transpiled down to Babel’s `env` preset (should be safe for use in all browsers in current use, not just evergreen versions). * Calling functions `@get` or `@set` no longer throws an error about required parentheses. (Bare `get` or `set`, not attached to an object or `@`, [still intentionally throws a compiler error](#unsupported-get-set).) * If `$XDG_CACHE_HOME` is set, the REPL `.coffee_history` file is saved there. ``` releaseHeader('2017-05-15', '1.12.6', '1.12.5') ``` * The `return` and `export` keywords can now accept implicit objects (defined by indentation, without needing braces). 
* Support Unicode code point escapes (e.g. `\u{1F4A9}`).
* The `coffee` command now first looks to see if CoffeeScript is installed under `node_modules` in the current folder, and executes the `coffee` binary there if so; or otherwise it runs the globally installed one. This allows you to have one version of CoffeeScript installed globally and a different one installed locally for a particular project. (Likewise for the `cake` command.)
* Bugfixes for chained function calls not closing implicit objects or ternaries.
* Bugfixes for incorrect code generated by the `?` operator within a ternary `if` statement.
* Fixed some tests, and failing tests now result in a nonzero exit code.

```
releaseHeader('2017-04-14', '2.0.0-beta1', '2.0.0-alpha1')
```

* Initial beta release of CoffeeScript 2. No further breaking changes are anticipated.
* Destructured objects and arrays now output using ES2015+ syntax whenever possible.
* Literate CoffeeScript now has much better support for parsing Markdown, thanks to using [Markdown-It](https://github.com/markdown-it/markdown-it) to detect Markdown sections rather than just looking at indentation.
* Calling a function named `get` or `set` now requires parentheses, to disambiguate from the `get` or `set` keywords (which are [disallowed](#unsupported-get-set)).
* The compiler now requires Node 7.6+, the first version of Node to support asynchronous functions without requiring a flag.

```
releaseHeader('2017-04-10', '1.12.5', '1.12.4')
```

* Better handling of `default`, `from`, `as` and `*` within `import` and `export` statements. You can now import or export a member named `default` and the compiler won’t interpret it as the `default` keyword.
* Fixed a bug where invalid octal escape sequences weren’t throwing errors in the compiler.

```
releaseHeader('2017-02-21', '2.0.0-alpha1', '1.12.4')
```

* Initial alpha release of CoffeeScript 2. The CoffeeScript compiler now outputs ES2015+ syntax whenever possible.
See [breaking changes](#breaking-changes). * Classes are output using ES2015 `class` and `extends` keywords. * Added support for `async`/`await`. * Bound (arrow) functions now output as `=>` functions. * Function parameters with default values now use ES2015 default values syntax. * Splat function parameters now use ES2015 spread syntax. * Computed properties now use ES2015 syntax. * Interpolated strings (template literals) now use ES2015 backtick syntax. * Improved support for recognizing Markdown in Literate CoffeeScript files. * Mixing tabs and spaces in indentation is now disallowed. * Browser compiler is now minified using the Google Closure Compiler (JavaScript version). * Node 7+ required for CoffeeScript 2. ``` releaseHeader('2017-02-18', '1.12.4', '1.12.3') ``` * The `cake` commands have been updated, with new `watch` options for most tasks. Clone the [CoffeeScript repo](https://github.com/jashkenas/coffeescript) and run `cake` at the root of the repo to see the options. * Fixed a bug where `export`ing a referenced variable was preventing the variable from being declared. * Fixed a bug where the `coffee` command wasn’t working for a `.litcoffee` file. * Bugfixes related to tokens and location data, for better source maps and improved compatibility with downstream tools. ``` releaseHeader('2017-01-24', '1.12.3', '1.12.2') ``` * `@` values can now be used as indices in `for` expressions. This loosens the compilation of `for` expressions to allow the index variable to be an `@` value, e.g. `do @visit for @node, @index in nodes`. Within `@visit`, the index of the current node (`@node`) would be available as `@index`. * CoffeeScript’s patched `Error.prepareStackTrace` has been restored, with some revisions that should prevent the erroneous exceptions that were making life difficult for some downstream projects. This fixes the incorrect line numbers in stack traces since 1.12.2. 
* The `//=` operator’s output now wraps parentheses around the right operand, like the other assignment operators.

```
releaseHeader('2016-12-16', '1.12.2', '1.12.1')
```

* The browser compiler can once again be built unminified via `MINIFY=false cake build:browser`.
* The error-prone patched version of `Error.prepareStackTrace` has been removed.
* Command completion in the REPL (pressing tab to get suggestions) has been fixed for Node 6.9.1+.
* The [browser-based tests](/v<%= majorVersion %>/test.html) now include all the same tests as the Node-based version.

```
releaseHeader('2016-12-07', '1.12.1', '1.12.0')
```

* You can now import a module member named `default`, e.g. `import { default } from 'lib'`. Though like in ES2015, you cannot import an entire module and name it `default` (so `import default from 'lib'` is not allowed).
* Fix regression where `from` as a variable name was breaking `for` loop declarations. For the record, `from` is not a reserved word in CoffeeScript; you may use it for variable names. `from` behaves like a keyword within the context of `import` and `export` statements, and in the declaration of a `for` loop; though you should also be able to use variables named `from` in those contexts, and the compiler should be able to tell the difference.

```
releaseHeader('2016-12-04', '1.12.0', '1.11.1')
```

* CoffeeScript now supports ES2015 [tagged template literals](#tagged-template-literals). Note that using tagged template literals in your code makes you responsible for ensuring that either your runtime supports tagged template literals or that you transpile the output JavaScript further to a version your target runtime(s) support.
* CoffeeScript now provides a [`for…from`](#generator-iteration) syntax for outputting ES2015 [`for…of`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for...of). (Sorry they couldn’t match, but we came up with `for…of` first for something else.)
This allows iterating over generators or any other iterable object. Note that using `for…from` in your code makes you responsible for ensuring that either your runtime supports `for…of` or that you transpile the output JavaScript further to a version your target runtime(s) support.
* Triple backticks (`` ``` ``) allow the creation of embedded JavaScript blocks where escaping single backticks is not required, which should improve interoperability with ES2015 template literals and with Markdown.
* Within single-backtick embedded JavaScript, backticks can now be escaped via `` \` ``.
* The browser tests now run in the browser again, and are accessible [here](/v<%= majorVersion %>/test.html) if you would like to test your browser.
* CoffeeScript-only keywords in ES2015 `import`s and `export`s are now ignored.
* The compiler now throws an error on trying to export an anonymous class.
* Bugfixes related to tokens and location data, for better source maps and improved compatibility with downstream tools.

```
releaseHeader('2016-10-02', '1.11.1', '1.11.0')
```

* Bugfix for shorthand object syntax after interpolated keys.
* Bugfix for indentation-stripping in `"""` strings.
* Bugfix for not being able to use the name “arguments” for a prototype property of a class.
* Correctly compile large hexadecimal number literals to `2e308` (just like all other large number literals do).

```
releaseHeader('2016-09-24', '1.11.0', '1.10.0')
```

* CoffeeScript now supports ES2015 [`import` and `export` syntax](#modules).
* Added the `-M, --inline-map` flag to the compiler, allowing you to embed the source map directly into the output JavaScript, rather than as a separate file.
* A bunch of fixes for `yield`:
  * `yield return` can no longer mistakenly be used as an expression.
  * `yield` now mirrors `return` in that it can be used stand-alone as well as with expressions. Where you previously wrote `yield undefined`, you may now write simply `yield`.
However, this means also inheriting the same syntax limitations that `return` has, so these examples no longer compile:

```
doubles = ->
  yield for i in [1..3]
    i * 2

six = -> yield 2 * 3
```

  * The JavaScript output is a bit nicer, with unnecessary parentheses and spaces, double indentation and double semicolons around `yield` no longer present.
* `&&=`, `||=`, `and=` and `or=` no longer accidentally allow a space before the equals sign.
* Improved several error messages.
* Just like `undefined` compiles to `void 0`, `NaN` now compiles into `0/0` and `Infinity` into `2e308`.
* Bugfix for renamed destructured parameters with defaults. `({a: b = 1}) ->` no longer crashes the compiler.
* Improved the internal representation of a CoffeeScript program. This is only noticeable to tools that use `CoffeeScript.tokens` or `CoffeeScript.nodes`. Such tools need to update to take account of changed or added tokens and nodes.
* Several minor bug fixes, including:
  * The caught error in `catch` blocks is no longer declared unnecessarily, and no longer mistakenly named `undefined` for `catch`-less `try` blocks.
  * Unassignable parameter destructuring no longer crashes the compiler.
  * Source maps are now used correctly for errors thrown from .coffee.md files.
  * `coffee -e 'throw null'` no longer crashes.
  * The REPL no longer crashes when using `.exit` to exit it.
  * Invalid JavaScript is no longer output when lots of `for` loops are used in the same scope.
  * A Unicode issue when using stdin with the CLI.

```
releaseHeader('2015-09-03', '1.10.0', '1.9.3')
```

* CoffeeScript now supports ES2015-style destructuring defaults.
* `(offsetHeight: height) ->` no longer compiles. That syntax was accidental and partly broken. Use `({offsetHeight: height}) ->` instead. Object destructuring always requires braces.
* Several minor bug fixes, including:
  * A bug where the REPL would sometimes report valid code as invalid, based on what you had typed earlier.
* A problem with multiple JS contexts in the jest test framework. * An error in io.js where strict mode is set on internal modules. * A variable name clash for the caught error in `catch` blocks. ``` releaseHeader('2015-05-27', '1.9.3', '1.9.2') ``` * Bugfix for interpolation in the first key of an object literal in an implicit call. * Fixed broken error messages in the REPL, as well as a few minor bugs with the REPL. * Fixed source mappings for tokens at the beginning of lines when compiling with the `--bare` option. This has the nice side effect of generating smaller source maps. * Slight formatting improvement of compiled block comments. * Better error messages for `on`, `off`, `yes` and `no`. ``` releaseHeader('2015-04-15', '1.9.2', '1.9.1') ``` * Fixed a **watch** mode error introduced in 1.9.1 when compiling multiple files with the same filename. * Bugfix for `yield` around expressions containing `this`. * Added a Ruby-style `-r` option to the REPL, which allows requiring a module before execution with `--eval` or `--interactive`. * In `<script type="text/coffeescript">` tags, to avoid possible duplicate browser requests for .coffee files, you can now use the `data-src` attribute instead of `src`. * Minor bug fixes for IE8, strict ES5 regular expressions and Browserify. ``` releaseHeader('2015-02-18', '1.9.1', '1.9.0') ``` * Interpolation now works in object literal keys (again). You can use this to dynamically name properties. * Internal compiler variable names no longer start with underscores. This makes the generated JavaScript a bit prettier, and also fixes an issue with the completely broken and ungodly way that AngularJS “parses” function arguments. * Fixed a few `yield`-related edge cases with `yield return` and `yield throw`. * Minor bug fixes and various improvements to compiler error messages. ``` releaseHeader('2015-01-29', '1.9.0', '1.8.0') ``` * CoffeeScript now supports ES2015 generators. A generator is simply a function that `yield`s. 
* More robust parsing and improved error messages for strings and regexes — especially with respect to interpolation.
* Changed strategy for the generation of internal compiler variable names. Note that this means that `@example` function parameters are no longer available as naked `example` variables within the function body.
* Fixed REPL compatibility with latest versions of Node and io.js.
* Various minor bug fixes.

```
releaseHeader('2014-08-26', '1.8.0', '1.7.1')
```

* The `--join` option of the CLI is now deprecated.
* Source maps now use `.js.map` as file extension, instead of just `.map`.
* The CLI now exits with the exit code 1 when it fails to write a file to disk.
* The compiler no longer crashes on unterminated, single-quoted strings.
* Fixed location data for string interpolations, which made source maps out of sync.
* The error marker in error messages is now correctly positioned if the code is indented with tabs.
* Fixed a slight formatting error in CoffeeScript’s source map-patched stack traces.
* The `%%` operator now coerces its right operand only once.
* It is now possible to require CoffeeScript files from Cakefiles without having to register the compiler first.
* The CoffeeScript REPL is now exported and can be required using `require 'coffeescript/repl'`.
* Fixes for the REPL in Node 0.11.

```
releaseHeader('2014-01-29', '1.7.1', '1.7.0')
```

* Fixed a typo that broke node module lookup when running a script directly with the `coffee` binary.

```
releaseHeader('2014-01-28', '1.7.0', '1.6.3')
```

* When requiring CoffeeScript files in Node you must now explicitly register the compiler. This can be done with `require 'coffeescript/register'` or `CoffeeScript.register()`. Also for configuration such as Mocha’s, use **coffeescript/register**.
* Improved error messages, source maps and stack traces. Source maps now use the updated `//#` syntax.
* Leading `.` now closes all open calls, allowing for simpler chaining syntax.
* Added `**`, `//` and `%%` operators and `...` expansion in parameter lists and destructuring expressions.
* Multiline strings are now joined by a single space and ignore all indentation. A backslash at the end of a line can denote the amount of whitespace between lines, in both strings and heredocs. Backslashes correctly escape whitespace in block regexes.
* Closing brackets can now be indented and therefore no longer cause unexpected errors.
* Several breaking compilation fixes. Non-callable literals (strings, numbers etc.) don’t compile in a call now and multiple postfix conditionals compile properly. Postfix conditionals and loops always bind object literals. Conditional assignment compiles properly in subexpressions. `super` is disallowed outside of methods and works correctly inside `for` loops.
* Formatting of compiled block comments has been improved.
* No more `-p` folders on Windows.
* The `options` object passed to CoffeeScript is no longer mutated.

```
releaseHeader('2013-06-02', '1.6.3', '1.6.2')
```

* The CoffeeScript REPL now remembers your history between sessions. Just like a proper REPL should.
* You can now use `require` in Node to load `.coffee.md` Literate CoffeeScript files. In the browser, `text/literate-coffeescript` script tags.
* The old `coffee --lint` command has been removed. It was useful while originally working on the compiler, but has been surpassed by JSHint. You may now use `-l` to pass literate files in over **stdio**.
* Bugfixes for Windows path separators, `catch` without naming the error, and executable-class-bodies-with-prototypal-property-attachment.

```
releaseHeader('2013-03-18', '1.6.2', '1.6.1')
```

* Source maps have been used to provide automatic line-mapping when running CoffeeScript directly via the `coffee` command, and for automatic line-mapping when running CoffeeScript directly in the browser.
Also, to provide better error messages for semantic errors thrown by the compiler — [with colors, even](http://cl.ly/NdOA). * Improved support for mixed literate/vanilla-style CoffeeScript projects, and generating source maps for both at the same time. * Fixes for **1.6.x** regressions with overriding inherited bound functions, and for Windows file path management. * The `coffee` command can now correctly `fork()` both `.coffee` and `.js` files. (Requires Node.js 0.9+) ``` releaseHeader('2013-03-05', '1.6.1', '1.5.0') ``` * First release of [source maps](#source-maps). Pass the `--map` flag to the compiler, and off you go. Direct all your thanks over to [Jason Walton](https://github.com/jwalton). * Fixed a 1.5.0 regression with multiple implicit calls against an indented implicit object. Combinations of implicit function calls and implicit objects should generally be parsed better now — but it still isn’t good _style_ to nest them too heavily. * `.coffee.md` is now also supported as a Literate CoffeeScript file extension, for existing tooling. `.litcoffee` remains the canonical one. * Several minor fixes surrounding member properties, bound methods and `super` in class declarations. ``` releaseHeader('2013-02-25', '1.5.0', '1.4.0') ``` * First release of [Literate CoffeeScript](#literate). * The CoffeeScript REPL is now based on the Node.js REPL, and should work better and more familiarly. * Returning explicit values from constructors is now forbidden. If you want to return an arbitrary value, use a function, not a constructor. * You can now loop over an array backwards, without having to manually deal with the indexes: `for item in list by -1` * Source locations are now preserved in the CoffeeScript AST, although source maps are not yet being emitted. ``` releaseHeader('2012-10-23', '1.4.0', '1.3.3') ``` * The CoffeeScript compiler now strips Microsoft’s UTF-8 BOM if it exists, allowing you to compile BOM-borked source files. 
* Fix Node/compiler deprecation warnings by removing `registerExtension`, and moving from `path.exists` to `fs.exists`. * Small tweaks to splat compilation, backticks, slicing, and the error for duplicate keys in object literals. ``` releaseHeader('2012-05-15', '1.3.3', '1.3.1') ``` * Due to the new semantics of JavaScript’s strict mode, CoffeeScript no longer guarantees that constructor functions have names in all runtimes. See [#2052](https://github.com/jashkenas/coffeescript/issues/2052) for discussion. * Inside of a nested function inside of an instance method, it’s now possible to call `super` more reliably (walks recursively up). * Named loop variables no longer have different scoping heuristics than other local variables. (Reverts #643) * Fix for splats nested within the LHS of destructuring assignment. * Corrections to our compile time strict mode forbidding of octal literals. ``` releaseHeader('2012-04-10', '1.3.1', '1.2.0') ``` * CoffeeScript now enforces all of JavaScript’s **Strict Mode** early syntax errors at compile time. This includes old-style octal literals, duplicate property names in object literals, duplicate parameters in a function definition, deleting naked variables, setting the value of `eval` or `arguments`, and more. See a full discussion at [#1547](https://github.com/jashkenas/coffeescript/issues/1547). * The REPL now has a handy new multi-line mode for entering large blocks of code. It’s useful when copy-and-pasting examples into the REPL. Enter multi-line mode with `Ctrl-V`. You may also now pipe input directly into the REPL. * CoffeeScript now prints a `Generated by CoffeeScript VERSION` header at the top of each compiled file. * Conditional assignment of previously undefined variables `a or= b` is now considered a syntax error. * A tweak to the semantics of `do`, which can now be used to more easily simulate a namespace: `do (x = 1, y = 2) -> …` * Loop indices are now mutable within a loop iteration, and immutable between them. 
* Both endpoints of a slice are now allowed to be omitted for consistency, effectively creating a shallow copy of the list. * Additional tweaks and improvements to `coffee --watch` under Node’s “new” file watching API. Watch will now beep by default if you introduce a syntax error into a watched script. We also now ignore hidden directories by default when watching recursively. ``` releaseHeader('2011-12-18', '1.2.0', '1.1.3') ``` * Multiple improvements to `coffee --watch` and `--join`. You may now use both together, as well as add and remove files and directories within a `--watch`’d folder. * The `throw` statement can now be used as part of an expression. * Block comments at the top of the file will now appear outside of the safety closure wrapper. * Fixed a number of minor 1.1.3 regressions having to do with trailing operators and unfinished lines, and a more major 1.1.3 regression that caused bound functions _within_ bound class functions to have the incorrect `this`. ``` releaseHeader('2011-11-08', '1.1.3', '1.1.2') ``` * Ahh, whitespace. CoffeeScript’s compiled JS now tries to space things out and keep it readable, as you can see in the examples on this page. * You can now call `super` in class level methods in class bodies, and bound class methods now preserve their correct context. * JavaScript has always supported octal numbers `010 is 8`, and hexadecimal numbers `0xf is 15`, but CoffeeScript now also supports binary numbers: `0b10 is 2`. * The CoffeeScript module has been nested under a subdirectory to make it easier to `require` individual components separately, without having to use **npm**. For example, after adding the CoffeeScript folder to your path: `require('coffeescript/lexer')` * There’s a new “link” feature in Try CoffeeScript on this webpage. Use it to get a shareable permalink for your example script. * The `coffee --watch` feature now only works on Node.js 0.6.0 and higher, but now also works properly on Windows. 
* Lots of small bug fixes from **[@michaelficarra](https://github.com/michaelficarra)**, **[@geraldalewis](https://github.com/geraldalewis)**, **[@satyr](https://github.com/satyr)**, and **[@trevorburnham](https://github.com/trevorburnham)**.

```
releaseHeader('2011-08-04', '1.1.2', '1.1.1')
```

Fixes for block comment formatting, `?=` compilation, implicit calls against control structures, implicit invocation of a try/catch block, variadic arguments leaking from local scope, line numbers in syntax errors following heregexes, property access on parenthesized number literals, bound class methods and super with reserved names, a REPL overhaul, consecutive compiled semicolons, block comments in implicitly called objects, and a Chrome bug.

```
releaseHeader('2011-05-10', '1.1.1', '1.1.0')
```

Bugfix release for classes with external constructor functions, see issue #1182.

```
releaseHeader('2011-05-01', '1.1.0', '1.0.1')
```

When running via the `coffee` executable, `process.argv` and friends now report `coffee` instead of `node`. Better compatibility with **Node.js 0.4.x** module lookup changes. The output in the REPL is now colorized, like Node’s is. Giving your concatenated CoffeeScripts a name when using `--join` is now mandatory. Fix for lexing compound division `/=` as a regex accidentally. All `text/coffeescript` tags should now execute in the order they’re included. Fixed an issue with extended subclasses using external constructor functions. Fixed an edge-case infinite loop in `addImplicitParentheses`. Fixed exponential slowdown with long chains of function calls. Globals no longer leak into the CoffeeScript REPL. Splatted parameters are declared local to the function.

```
releaseHeader('2011-01-31', '1.0.1', '1.0.0')
```

Fixed a lexer bug with Unicode identifiers. Updated REPL for compatibility with Node.js 0.3.7. Fixed requiring relative paths in the REPL. Trailing `return` and `return undefined` are now optimized away.
Stopped requiring the core Node.js `util` module for back-compatibility with Node.js 0.2.5. Fixed a case where a conditional `return` would cause fallthrough in a `switch` statement. Optimized empty objects in destructuring assignment.

```
releaseHeader('2010-12-24', '1.0.0', '0.9.6')
```

CoffeeScript loops no longer try to preserve block scope when functions are being generated within the loop body. Instead, you can use the `do` keyword to create a convenient closure wrapper. Added a `--nodejs` flag for passing through options directly to the `node` executable. Better behavior around the use of pure statements within expressions. Fixed inclusive slicing through `-1`, for all browsers, and splicing with arbitrary expressions as endpoints.

```
releaseHeader('2010-12-06', '0.9.6', '0.9.5')
```

The REPL now properly formats stacktraces, and stays alive through asynchronous exceptions. Using `--watch` now prints timestamps as files are compiled. Fixed some accidentally-leaking variables within plucked closure-loops. Constructors now maintain their declaration location within a class body. Dynamic object keys were removed. Nested classes are now supported. Fixes execution context for naked splatted functions. Bugfix for inversion of chained comparisons. Chained class instantiation now works properly with splats.

```
releaseHeader('2010-11-21', '0.9.5', '0.9.4')
```

0.9.5 should be considered the first release candidate for CoffeeScript 1.0. There have been a large number of internal changes since the previous release, many contributed from **satyr**’s [Coco](https://github.com/satyr/coco) dialect of CoffeeScript. Heregexes (extended regexes) were added. Functions can now have default arguments. Class bodies are now executable code. Improved syntax errors for invalid CoffeeScript. `undefined` now works like `null`, and cannot be assigned a new value.
There was a precedence change with respect to single-line comprehensions: `result = i for i in list` used to parse as `result = (i for i in list)` by default … it now parses as `(result = i) for i in list`. ``` releaseHeader('2010-09-21', '0.9.4', '0.9.3') ``` CoffeeScript now uses appropriately-named temporary variables, and recycles their references after use. Added `require.extensions` support for **Node.js 0.3**. Loading CoffeeScript in the browser now adds just a single `CoffeeScript` object to global scope. Fixes for implicit object and block comment edge cases. ``` releaseHeader('2010-09-16', '0.9.3', '0.9.2') ``` CoffeeScript `switch` statements now compile into JS `switch` statements — they previously compiled into `if/else` chains for JavaScript 1.3 compatibility. Soaking a function invocation is now supported. Users of the RubyMine editor should now be able to use `--watch` mode. ``` releaseHeader('2010-08-23', '0.9.2', '0.9.1') ``` Specifying the start and end of a range literal is now optional, eg. `array[3..]`. You can now say `a not instanceof b`. Fixed important bugs with nested significant and non-significant indentation (Issue #637). Added a `--require` flag that allows you to hook into the `coffee` command. Added a custom `jsl.conf` file for our preferred JavaScriptLint setup. Sped up Jison grammar compilation time by flattening rules for operations. Block comments can now be used with JavaScript-minifier-friendly syntax. Added JavaScript’s compound assignment bitwise operators. Bugfixes to implicit object literals with leading number and string keys, as the subject of implicit calls, and as part of compound assignment. ``` releaseHeader('2010-08-11', '0.9.1', '0.9.0') ``` Bugfix release for **0.9.1**. Greatly improves the handling of mixed implicit objects, implicit function calls, and implicit indentation. String and regex interpolation is now strictly `#{ … }` (Ruby style). 
The compiler now takes a `--require` flag, which specifies scripts to run before compilation. ``` releaseHeader('2010-08-04', '0.9.0', '0.7.2') ``` The CoffeeScript **0.9** series is considered to be a release candidate for **1.0**; let’s give her a shakedown cruise. **0.9.0** introduces a massive backwards-incompatible change: Assignment now uses `=`, and object literals use `:`, as in JavaScript. This allows us to have implicit object literals, and YAML-style object definitions. Half assignments are removed, in favor of `+=`, `or=`, and friends. Interpolation now uses a hash mark `#` instead of the dollar sign `$` — because dollar signs may be part of a valid JS identifier. Downwards range comprehensions are now safe again, and are optimized to straight for loops when created with integer endpoints. A fast, unguarded form of object comprehension was added: `for all key, value of object`. Mentioning the `super` keyword with no arguments now forwards all arguments passed to the function, as in Ruby. If you extend class `B` from parent class `A`, if `A` has an `extended` method defined, it will be called, passing in `B` — this enables static inheritance, among other things. Cleaner output for functions bound with the fat arrow. `@variables` can now be used in parameter lists, with the parameter being automatically set as a property on the object — useful in constructors and setter functions. Constructor functions can now take splats. ``` releaseHeader('2010-07-12', '0.7.2', '0.7.1') ``` Quick bugfix (right after 0.7.1) for a problem that prevented `coffee` command-line options from being parsed in some circumstances. ``` releaseHeader('2010-07-11', '0.7.1', '0.7.0') ``` Block-style comments are now passed through and printed as JavaScript block comments – making them useful for licenses and copyright headers. Better support for running coffee scripts standalone via hashbangs. Improved syntax errors for tokens that are not in the grammar. 
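The `@variables`-in-parameter-lists feature from 0.9.0 can be sketched in plain JavaScript (an illustration of the idea, not compiler output): `constructor: (@name) ->` simply assigns the parameter onto the instance.

```javascript
// CoffeeScript: class Person then constructor: (@name) ->
// The compiler inserts the `this.name = name;` assignment for you.
function Person(name) {
  this.name = name;
}
var p = new Person('Ada');
console.log(p.name); // Ada
```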
``` releaseHeader('2010-06-28', '0.7.0', '0.6.2') ``` Official CoffeeScript variable style is now camelCase, as in JavaScript. Reserved words are now allowed as object keys, and will be quoted for you. Range comprehensions now generate cleaner code, but you have to specify `by -1` if you’d like to iterate downward. Reporting of syntax errors is greatly improved from the previous release. Running `coffee` with no arguments now launches the REPL, with Readline support. The `<-` bind operator has been removed from CoffeeScript. The `loop` keyword was added, which is equivalent to a `while true` loop. Comprehensions that contain closures will now close over their variables, like the semantics of a `forEach`. You can now use bound function in class definitions (bound to the instance). For consistency, `a in b` is now an array presence check, and `a of b` is an object-key check. Comments are no longer passed through to the generated JavaScript. ``` releaseHeader('2010-05-15', '0.6.2', '0.6.1') ``` The `coffee` command will now preserve directory structure when compiling a directory full of scripts. Fixed two omissions that were preventing the CoffeeScript compiler from running live within Internet Explorer. There’s now a syntax for block comments, similar in spirit to CoffeeScript’s heredocs. ECMA Harmony DRY-style pattern matching is now supported, where the name of the property is the same as the name of the value: `{name, length}: func`. Pattern matching is now allowed within comprehension variables. `unless` is now allowed in block form. `until` loops were added, as the inverse of `while` loops. `switch` statements are now allowed without switch object clauses. Compatible with Node.js **v0.1.95**. ``` releaseHeader('2010-04-12', '0.6.1', '0.6.0') ``` Upgraded CoffeeScript for compatibility with the new Node.js **v0.1.90** series. ``` releaseHeader('2010-04-03', '0.6.0', '0.5.6') ``` Trailing commas are now allowed, a-la Python. 
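The `in`/`of` split described for 0.7.0 can be sketched in JavaScript terms (an illustration; actual compiler output differs in details):

```javascript
// CoffeeScript `a in b` is an array presence check (roughly an indexOf test);
// CoffeeScript `a of b` is an object-key check (JavaScript's own `in` operator).
var list = [2, 3, 5];
var obj = { a: 1 };
var inArray = list.indexOf(3) >= 0;  // CoffeeScript: 3 in list
var hasKey = 'a' in obj;             // CoffeeScript: 'a' of obj
console.log(inArray, hasKey); // true true
```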
Static properties may be assigned directly within class definitions, using `@property` notation. ``` releaseHeader('2010-03-23', '0.5.6', '0.5.5') ``` Interpolation can now be used within regular expressions and heredocs, as well as strings. Added the `<-` bind operator. Allowing assignment to half-expressions instead of special `||=`-style operators. The arguments object is no longer automatically converted into an array. After requiring `coffeescript`, Node.js can now directly load `.coffee` files, thanks to **registerExtension**. Multiple splats can now be used in function calls, arrays, and pattern matching. ``` releaseHeader('2010-03-08', '0.5.5', '0.5.4') ``` String interpolation, contributed by [Stan Angeloff](https://github.com/StanAngeloff). Since `--run` has been the default since **0.5.3**, updating `--stdio` and `--eval` to run by default, pass `--compile` as well if you’d like to print the result. ``` releaseHeader('2010-03-03', '0.5.4', '0.5.3') ``` Bugfix that corrects the Node.js global constants `__filename` and `__dirname`. Tweaks for more flexible parsing of nested function literals and improperly-indented comments. Updates for the latest Node.js API. ``` releaseHeader('2010-02-27', '0.5.3', '0.5.2') ``` CoffeeScript now has a syntax for defining classes. Many of the core components (Nodes, Lexer, Rewriter, Scope, Optparse) are using them. Cakefiles can use `optparse.coffee` to define options for tasks. `--run` is now the default flag for the `coffee` command, use `--compile` to save JavaScripts. Bugfix for an ambiguity between RegExp literals and chained divisions. ``` releaseHeader('2010-02-25', '0.5.2', '0.5.1') ``` Added a compressed version of the compiler for inclusion in web pages as `/v<%= majorVersion %>/browser-compiler/coffeescript.js`. It’ll automatically run any script tags with type `text/coffeescript` for you. Added a `--stdio` option to the `coffee` command, for piped-in compiles. 
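The `#{ ... }`-style interpolation added in 0.5.5 behaves much like the template literals that JavaScript itself later gained; a sketch for comparison (my illustration):

```javascript
// CoffeeScript: "found #{count} kittens"
// Comparable modern JavaScript:
var count = 3;
var msg = `found ${count} kittens`;
console.log(msg); // found 3 kittens
```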
``` releaseHeader('2010-02-24', '0.5.1', '0.5.0') ``` Improvements to null soaking with the existential operator, including soaks on indexed properties. Added conditions to `while` loops, so you can use them as filters with `when`, in the same manner as comprehensions. ``` releaseHeader('2010-02-21', '0.5.0', '0.3.2') ``` CoffeeScript 0.5.0 is a major release. While there are no language changes, the Ruby compiler has been removed in favor of a self-hosting compiler written in pure CoffeeScript. ``` releaseHeader('2010-02-08', '0.3.2', '0.3.0') ``` `@property` is now a shorthand for `this.property`. Switched the default JavaScript engine from Narwhal to Node.js. Pass the `--narwhal` flag if you’d like to continue using it. ``` releaseHeader('2010-01-26', '0.3.0', '0.2.6') ``` CoffeeScript 0.3 includes major syntax changes: The function symbol was changed to `->`, and the bound function symbol is now `=>`. Parameter lists in function definitions must now be wrapped in parentheses. Added property soaking, with the `?.` operator. Made parentheses optional, when invoking functions with arguments. Removed the obsolete block literal syntax. ``` releaseHeader('2010-01-17', '0.2.6', '0.2.5') ``` Added Python-style chained comparisons, the conditional existence operator `?=`, and some examples from _Beautiful Code_. Bugfixes relating to statement-to-expression conversion, arguments-to-array conversion, and the TextMate syntax highlighter. ``` releaseHeader('2010-01-13', '0.2.5', '0.2.4') ``` The conditions in switch statements can now take multiple values at once — if any of them are true, the case will run. Added the long arrow `==>`, which defines and immediately binds a function to `this`. While loops can now be used as expressions, in the same way that comprehensions can. Splats can be used within pattern matches to soak up the rest of an array. 
``` releaseHeader('2010-01-12', '0.2.4', '0.2.3') ``` Added ECMAScript Harmony style destructuring assignment, for dealing with extracting values from nested arrays and objects. Added indentation-sensitive heredocs for nicely formatted strings or chunks of code. ``` releaseHeader('2010-01-11', '0.2.3', '0.2.2') ``` Axed the unsatisfactory `ino` keyword, replacing it with `of` for object comprehensions. They now look like: `for prop, value of object`. ``` releaseHeader('2010-01-10', '0.2.2', '0.2.1') ``` When performing a comprehension over an object, use `ino`, instead of `in`, which helps us generate smaller, more efficient code at compile time. Added `::` as a shorthand for saying `.prototype.` The “splat” symbol has been changed from a prefix asterisk `*`, to a postfix ellipsis `...` Added JavaScript’s `in` operator, empty `return` statements, and empty `while` loops. Constructor functions that start with capital letters now include a safety check to make sure that the new instance of the object is returned. The `extends` keyword now functions identically to `goog.inherits` in Google’s Closure Library. ``` releaseHeader('2010-01-05', '0.2.1', '0.2.0') ``` Arguments objects are now converted into real arrays when referenced. ``` releaseHeader('2010-01-05', '0.2.0', '0.1.6') ``` Major release. Significant whitespace. Better statement-to-expression conversion. Splats. Splice literals. Object comprehensions. Blocks. The existential operator. Many thanks to all the folks who posted issues, with special thanks to [Liam O’Connor-Davis](https://github.com/liamoc) for whitespace and expression help. ``` releaseHeader('2009-12-27', '0.1.6', '0.1.5') ``` Bugfix for running `coffee --interactive` and `--run` from outside of the CoffeeScript directory. Bugfix for nested function/if-statements. ``` releaseHeader('2009-12-26', '0.1.5', '0.1.4') ``` Array slice literals and array comprehensions can now both take Ruby-style ranges to specify the start and end. 
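The Harmony-style destructuring added in 0.2.4 later landed in JavaScript itself; a sketch of the nested-extraction pattern the changelog describes (my illustration):

```javascript
// Extracting values from nested arrays and objects in one assignment.
var [first, [second, third]] = [1, [2, 3]];
var { name, tags: [firstTag] } = { name: 'coffee', tags: ['lang', 'script'] };
console.log(first, second, third); // 1 2 3
console.log(name, firstTag);       // coffee lang
```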
JavaScript variable declaration is now pushed up to the top of the scope, making all assignment statements into expressions. You can use `\` to escape newlines. The `coffeescript` command is now called `coffee`. ``` releaseHeader('2009-12-25', '0.1.4', '0.1.3') ``` The official CoffeeScript extension is now `.coffee` instead of `.cs`, which properly belongs to [C#](https://en.wikipedia.org/wiki/C_Sharp_(programming_language)). Due to popular demand, you can now also use `=` to assign. Unlike JavaScript, `=` can also be used within object literals, interchangeably with `:`. Made a grammatical fix for chained function calls like `func(1)(2)(3)(4)`. Inheritance and super no longer use `__proto__`, so they should be IE-compatible now. ``` releaseHeader('2009-12-25', '0.1.3', '0.1.2') ``` The `coffee` command now includes `--interactive`, which launches an interactive CoffeeScript session, and `--run`, which directly compiles and executes a script. Both options depend on a working installation of Narwhal. The `aint` keyword has been replaced by `isnt`, which goes together a little smoother with `is`. Quoted strings are now allowed as identifiers within object literals: eg. `{"5+5": 10}`. All assignment operators now use a colon: `+:`, `-:`, `*:`, etc. ``` releaseHeader('2009-12-24', '0.1.2', '0.1.1') ``` Fixed a bug with calling `super()` through more than one level of inheritance, with the re-addition of the `extends` keyword. Added experimental [Narwhal](http://narwhaljs.org/) support (as a Tusk package), contributed by [Tom Robinson](http://blog.tlrobinson.net/), including **bin/cs** as a CoffeeScript REPL and interpreter. New `--no-wrap` option to suppress the safety function wrapper. ``` releaseHeader('2009-12-24', '0.1.1', '0.1.0') ``` Added `instanceof` and `typeof` as operators. ``` releaseHeader('2009-12-24', '0.1.0', '8e9d637985d2dc9b44922076ad54ffef7fa8e9c2') ``` Initial CoffeeScript release.
73.804185
1,308
0.744304
eng_Latn
0.997743
9859363e54562ced5000bcbf2bd602370c7c108f
1,184
md
Markdown
articles/April19/service/dynamics365-customer-service/unified-service-desk/use-channel-integration-framework-unified-service-desk.md
hyoshioka0128/BusinessApplication-ReleaseNotes.ja-jp
5264f978d7f46c10876783ab0c1d673fdb4a6d25
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/April19/service/dynamics365-customer-service/unified-service-desk/use-channel-integration-framework-unified-service-desk.md
hyoshioka0128/BusinessApplication-ReleaseNotes.ja-jp
5264f978d7f46c10876783ab0c1d673fdb4a6d25
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/April19/service/dynamics365-customer-service/unified-service-desk/use-channel-integration-framework-unified-service-desk.md
hyoshioka0128/BusinessApplication-ReleaseNotes.ja-jp
5264f978d7f46c10876783ab0c1d673fdb4a6d25
[ "CC-BY-4.0", "MIT" ]
null
null
null
--- title: Use Channel Integration Framework with Unified Service Desk description: Learn how to use Channel Integration Framework with Unified Service Desk. keywords: '' ms.date: 01/21/2019 ms.service: - business-applications ms.topic: article ms.assetid: AF34A5B0-7F0D-4DBD-9B83-258CC1FB3B23 author: kabala123 ms.author: kabala ms.openlocfilehash: 118d844912b90bdc4cf9f83dc5b7e49a979045f5 ms.sourcegitcommit: 921dde7a25596a81c049162eee650d7a2009f17d ms.translationtype: HT ms.contentlocale: ja-JP ms.lasthandoff: 04/29/2019 ms.locfileid: "1225404" --- # <a name="use-channel-integration-framework-with-unified-service-desk"></a>Use Channel Integration Framework with Unified Service Desk [!include[unified-service-desk banner](../../../includes/unified-service-desk.md)] In this release, you can quickly onboard communication widgets into Unified Service Desk by using the [Channel Integration Framework (CIF)](https://docs.microsoft.com/dynamics365/customer-engagement/developer/channel-integration-framework/channel-integration-framework). These communication widgets can be used in Unified Service Desk with only minimal configuration work. ![Channel Integration Framework](media/USD-CIF-4.1.png "Channel Integration Framework in Unified Service Desk")
42.285714
298
0.821791
eng_Latn
0.21334
985937484c1eb2580c4ba196bdd1b558c998b959
15
md
Markdown
README.md
yaponek/geez-
55f0115f5ade0c3c90fabedb9c6c84e58af054ae
[ "MIT" ]
null
null
null
README.md
yaponek/geez-
55f0115f5ade0c3c90fabedb9c6c84e58af054ae
[ "MIT" ]
null
null
null
README.md
yaponek/geez-
55f0115f5ade0c3c90fabedb9c6c84e58af054ae
[ "MIT" ]
null
null
null
# geez- geez +
5
7
0.533333
nld_Latn
0.995681
98593bae9cab6c2f30b5e6b6a41277377e978d6b
11,698
md
Markdown
articles/cognitive-services/Bing-Video-Search/quickstarts/java.md
tsunami416604/azure-docs.hu-hu
aeba852f59e773e1c58a4392d035334681ab7058
[ "CC-BY-4.0", "MIT" ]
7
2017-08-28T07:44:33.000Z
2021-04-20T21:12:50.000Z
articles/cognitive-services/Bing-Video-Search/quickstarts/java.md
tsunami416604/azure-docs.hu-hu
aeba852f59e773e1c58a4392d035334681ab7058
[ "CC-BY-4.0", "MIT" ]
412
2018-07-25T09:31:03.000Z
2021-03-17T13:17:45.000Z
articles/cognitive-services/Bing-Video-Search/quickstarts/java.md
tsunami416604/azure-docs.hu-hu
aeba852f59e773e1c58a4392d035334681ab7058
[ "CC-BY-4.0", "MIT" ]
13
2017-09-05T09:10:35.000Z
2021-11-05T11:42:31.000Z
--- title: 'Quickstart: Search for videos using the Bing Video Search REST API and Java' titleSuffix: Azure Cognitive Services description: Use this quickstart to send video search requests to the Bing Video Search REST API using Java. services: cognitive-services author: aahill manager: nitinme ms.service: cognitive-services ms.subservice: bing-video-search ms.topic: quickstart ms.date: 05/22/2020 ms.custom: devx-track-java ms.author: aahi ms.openlocfilehash: 6c34ec5582b8251d5b64bf24654b50d7c391cee4 ms.sourcegitcommit: 9eda79ea41c60d58a4ceab63d424d6866b38b82d ms.translationtype: MT ms.contentlocale: hu-HU ms.lasthandoff: 11/30/2020 ms.locfileid: "96341393" --- # <a name="quickstart-search-for-videos-using-the-bing-video-search-rest-api-and-java"></a>Quickstart: Search for videos using the Bing Video Search REST API and Java > [!WARNING] > Bing Search APIs are moving from Cognitive Services to Bing Search Services. **Starting October 30, 2020**, new instances of Bing Search need to be provisioned following the process documented [here](/bing/search-apis/bing-web-search/create-bing-search-service-resource). > Bing Search APIs provisioned using Cognitive Services will be supported for the next three years or until the end of your Enterprise Agreement, whichever happens first. > For migration instructions, see [Bing Search Services](/bing/search-apis/bing-web-search/create-bing-search-service-resource). Use this quickstart to make your first call to the Bing Video Search API. This simple Java application sends an HTTP video search query to the API and displays the JSON response. Although this application is written in Java, the API is a RESTful web service compatible with most programming languages. The source code for this sample is available [on GitHub](https://github.com/Azure-Samples/cognitive-services-REST-api-samples/blob/master/java/Search/BingVideoSearchv7.java) with additional error handling, features, and code comments. 
## <a name="prerequisites"></a>Prerequisites * The [Java Development Kit (JDK)](https://www.oracle.com/technetwork/java/javase/downloads/jdk11-downloads-5066655.html) * The [Gson library](https://github.com/google/gson) [!INCLUDE [cognitive-services-bing-video-search-signup-requirements](../../../../includes/cognitive-services-bing-video-search-signup-requirements.md)] ## <a name="create-and-initialize-a-project"></a>Create and initialize a project 1. Create a new Java project in your favorite IDE or editor, and import the following libraries: ```java import java.net.*; import java.util.*; import java.io.*; import javax.net.ssl.HttpsURLConnection; import com.google.gson.Gson; import com.google.gson.GsonBuilder; import com.google.gson.JsonObject; import com.google.gson.JsonParser; ``` 2. Create a new class named `SearchResults` to store the headers and the JSON response from the API. ```java // Container class for search results encapsulates relevant headers and JSON data class SearchResults{ HashMap<String, String> relevantHeaders; String jsonResponse; SearchResults(HashMap<String, String> headers, String json) { relevantHeaders = headers; jsonResponse = json; } } ``` 3. Create a new method named `SearchVideos()`, along with variables for the host and path of the API endpoint, your subscription key, and the search term. This method returns a `SearchResults` object. For the `host` value, you can use the global endpoint in the following code, or use the [custom subdomain](../../../cognitive-services/cognitive-services-custom-subdomains.md) endpoint displayed in the Azure portal for your resource. 
```java static String subscriptionKey = "enter your key here"; static String host = "https://api.cognitive.microsoft.com"; static String path = "/bing/v7.0/videos/search"; static String searchTerm = "kittens"; public static SearchResults SearchVideos (String searchQuery) throws Exception { } ``` ## <a name="construct-and-send-the-search-request"></a>Construct and send the search request In the `SearchVideos()` method, perform the following steps: 1. Construct the URL of your request by combining the API host, path, and your encoded search query. Create a connection using `openConnection()`, then add your subscription key to the `Ocp-Apim-Subscription-Key` header. ```java URL url = new URL(host + path + "?q=" + URLEncoder.encode(searchQuery, "UTF-8")); HttpsURLConnection connection = (HttpsURLConnection)url.openConnection(); connection.setRequestProperty("Ocp-Apim-Subscription-Key", subscriptionKey); ``` 2. Get the response from the API, and store the JSON string. ```java InputStream stream = connection.getInputStream(); String response = new Scanner(stream).useDelimiter("\\A").next(); ``` 3. Use `getHeaderFields()` to extract the HTTP headers from the response, and store the Bing-related ones in the `results` object. Then close the stream and return the results. ```java // extract Bing-related HTTP headers Map<String, List<String>> headers = connection.getHeaderFields(); for (String header : headers.keySet()) { if (header == null) continue; // may have null key if (header.startsWith("BingAPIs-") || header.startsWith("X-MSEdge-")) { results.relevantHeaders.put(header, headers.get(header).get(0)); } } stream.close(); return results; ``` ## <a name="format-the-response"></a>Format the response Create a method named `prettify()` to format the response returned by the Bing Video API. 
Use the Gson library's `JsonParser` to convert the JSON string into an object. Then use `GsonBuilder()` and `toJson()` to create the formatted string. ```java // pretty-printer for JSON; uses GSON parser to parse and re-serialize public static String prettify(String json_text) { JsonParser parser = new JsonParser(); JsonObject json = parser.parse(json_text).getAsJsonObject(); Gson gson = new GsonBuilder().setPrettyPrinting().create(); return gson.toJson(json); } ``` ## <a name="send-the-request-and-print-the-response"></a>Send the request and print the response In the main method of your application, call `SearchVideos()` with your search term. Then print the HTTP headers stored in the response, and the JSON string returned by the API. ```java public static void main (String[] args) throws Exception { SearchResults result = SearchVideos(searchTerm); //print the Relevant HTTP Headers for (String header : result.relevantHeaders.keySet()) System.out.println(header + ": " + result.relevantHeaders.get(header)); System.out.println(prettify(result.jsonResponse)); } ``` ## <a name="json-response"></a>JSON response A successful response is returned in JSON, as shown in the following example: ```json { "_type": "Videos", "instrumentation": {}, "readLink": "https://api.cognitive.microsoft.com/api/v7/videos/search?q=kittens", "webSearchUrl": "https://www.bing.com/videos/search?q=kittens", "totalEstimatedMatches": 1000, "value": [ { "webSearchUrl": "https://www.bing.com/videos/search?q=kittens&view=...", "name": "Top 10 cute kitten videos compilation", "description": "HELP HOMELESS ANIMALS AND WIN A PRIZE BY CHOOSING...", "thumbnailUrl": "https://tse4.mm.bing.net/th?id=OVP.n1aE_Oikl4MtzBb...", "datePublished": "2014-11-12T22:47:36.0000000", "publisher": [ { "name": "Fabrikam" } ], "creator": { "name": "Marcus Appel" }, "isAccessibleForFree": true, "contentUrl": "https://www.fabrikam.com/watch?v=8HVWitAW-Qg", "hostPageUrl": 
"https://www.fabrikam.com/watch?v=8HVWitAW-Qg", "encodingFormat": "h264", "hostPageDisplayUrl": "https://www.fabrikam.com/watch?v=8HVWitAW-Qg", "width": 480, "height": 360, "duration": "PT3M52S", "motionThumbnailUrl": "https://tse4.mm.bing.net/th?id=OM.j4QyJAENJphdZQ_1501386166&pid=Api", "embedHtml": "<iframe width=\"1280\" height=\"720\" src=\"https://www.fabrikam.com/embed/8HVWitAW-Qg?autoplay=1\" frameborder=\"0\" allowfullscreen></iframe>", "allowHttpsEmbed": true, "viewCount": 7513633, "thumbnail": { "width": 300, "height": 168 }, "videoId": "655D98260D012432848F6558260D012432848F", "allowMobileEmbed": true, "isSuperfresh": false }, . . . ], "nextOffset": 36, "queryExpansions": [ { "text": "Kittens Meowing", "displayText": "Meowing", "webSearchUrl": "https://www.bing.com/videos/search?q=Kittens+Meowing...", "searchLink": "https://api.cognitive.microsoft.com/api/v7/videos/search...", "thumbnail": { "thumbnailUrl": "https://tse3.mm.bing.net/th?q=Kittens+Meowing&pid..." } }, { "text": "Funny Kittens", "displayText": "Funny", "webSearchUrl": "https://www.bing.com/videos/search?q=Funny+Kittens...", "searchLink": "https://api.cognitive.microsoft.com/api/v7/videos/search...", "thumbnail": { "thumbnailUrl": "https://tse3.mm.bing.net/th?q=Funny+Kittens&..." } }, . . . ], "pivotSuggestions": [ { "pivot": "kittens", "suggestions": [ { "text": "Cat", "displayText": "Cat", "webSearchUrl": "https://www.bing.com/videos/search?q=Cat...", "searchLink": "https://api.cognitive.microsoft.com/api/v7/videos/search?...", "thumbnail": { "thumbnailUrl": "https://tse3.mm.bing.net/th?q=Cat&pid=Api..." } }, { "text": "Feral Cat", "displayText": "Feral Cat", "webSearchUrl": "https://www.bing.com/videos/search?q=Feral+Cat...", "searchLink": "https://api.cognitive.microsoft.com/api/v7/videos/search...", "thumbnail": { "thumbnailUrl": "https://tse3.mm.bing.net/th?q=Feral+Cat&pid=Api&..." 
} } ] } ], "relatedSearches": [ { "text": "Kittens Being Born", "displayText": "Kittens Being Born", "webSearchUrl": "https://www.bing.com/videos/search?q=Kittens+Being+Born...", "searchLink": "https://api.cognitive.microsoft.com/api/v7/videos/search?...", "thumbnail": { "thumbnailUrl": "https://tse1.mm.bing.net/th?q=Kittens+Being+Born&pid=..." } }, . . . ] } ``` ## <a name="next-steps"></a>Next steps > [!div class="nextstepaction"] > [Build a single-page web app](../tutorial-bing-video-search-single-page-app.md) ## <a name="see-also"></a>See also [What is the Bing Video Search API?](../overview.md)
44.819923
469
0.650026
hun_Latn
0.969317
985a066c5b14f36efb67fdef92a0103a16da4fc1
1,161
md
Markdown
doc/csv-plot.1.md
mslusarz/csv-nix-tools
30682757c605210f7b63d208af58ae56b16da72e
[ "BSD-3-Clause" ]
32
2019-11-05T22:46:46.000Z
2022-03-27T12:32:38.000Z
doc/csv-plot.1.md
mslusarz/csv-nix-tools
30682757c605210f7b63d208af58ae56b16da72e
[ "BSD-3-Clause" ]
4
2020-09-02T23:08:09.000Z
2021-07-20T19:54:44.000Z
doc/csv-plot.1.md
mslusarz/csv-nix-tools
30682757c605210f7b63d208af58ae56b16da72e
[ "BSD-3-Clause" ]
2
2020-01-22T08:25:46.000Z
2021-04-16T10:03:06.000Z
<!-- SPDX-License-Identifier: BSD-3-Clause Copyright 2020, Marcin Ślusarz <marcin.slusarz@gmail.com> --> --- title: csv-plot section: 1 ... # NAME # csv-plot - output 2D or 3D graph from CSV data read from standard input # SYNOPSIS # **csv-plot** [OPTION]... # DESCRIPTION # Read CSV stream from standard input and output 2D or 3D gnuplot script to standard output. -g, \--gnuplot : pipe to gnuplot -G, \--grid : draw a grid -t, \--terminal *TERMINAL* : use *TERMINAL* as gnuplot's output (e.g. png, gif, dumb) -x *COLNAME* : use *COLNAME* as x axis -y *COLNAME* : use *COLNAME* as y axis -z *COLNAME* : use *COLNAME* as z axis -T, \--table=*NAME* : apply to rows only with _table column equal *NAME* \--help : display this help and exit \--version : output version information and exit # EXAMPLES # `csv-plot -x xcol -y ycol < file1.csv` : generate 2D graph gnuplot script from data in file1.csv `csv-plot -x xcol -y ycol -z zcol -g < file1.csv` : generate 3D graph gnuplot script from data in file1.csv and pipe it to gnuplot # SEE ALSO # **[gnuplot](https://linux.die.net/man/1/gnuplot)**(1), **csv-nix-tools**(7)
18.140625
82
0.664944
yue_Hant
0.462463
985a43b0a451582e48ff16b42a0c24160fc0e7ea
1,927
md
Markdown
papers/210319 Thoughts on recent papers.md
rosinality/ml-papers
3c70bb20436ca385dbec6223370d3cda7d86f6d8
[ "MIT" ]
97
2021-05-10T14:36:22.000Z
2022-03-30T08:20:52.000Z
papers/210319 Thoughts on recent papers.md
rosinality/ml-papers
3c70bb20436ca385dbec6223370d3cda7d86f6d8
[ "MIT" ]
null
null
null
papers/210319 Thoughts on recent papers.md
rosinality/ml-papers
3c70bb20436ca385dbec6223370d3cda7d86f6d8
[ "MIT" ]
8
2021-07-19T02:26:17.000Z
2022-03-24T16:45:40.000Z
I haven't been keeping my paper notes up to date lately... but this week had enough interesting results that I wanted to share. Involution: Inverting the Inherence of Convolution for Visual Recognition (https://arxiv.org/abs/2103.06255) "We additionally demystify the recent popular self-attention operator and subsume it into our involution family as an over-complicated instantiation." A very aggressive abstract. Revisiting ResNets: Improved Training and Scaling Strategies (https://arxiv.org/abs/2103.07579) The conclusion: tweak ResNet a little and train it well with supervised pretraining, and it turns out to be quite good. They didn't use mixup/cutmix, and I'm curious why. You Only Look One-level Feature (https://arxiv.org/abs/2103.09460) Drops the FPN from detection and uses only the stride-32 feature. The inspiration seems to come from DETR. Training GANs with Stronger Augmentations via Contrastive Discriminator (https://arxiv.org/abs/2103.09742) Trains the discriminator's feature representation purely with contrastive learning. GAN training was originally itself a method of unsupervised training, but now unsupervised training has ended up replacing GAN training. Learning to Resize Images for Computer Vision Tasks (https://arxiv.org/abs/2103.09950) An attempt to improve performance by attaching a network that performs the resizing... and the gains look fairly substantial. GPT Understands, Too (https://arxiv.org/abs/2103.10385) Improving GPT fine-tuning by improving the prompt input. Interesting that, depending on the benchmark, they report results at BERT level. FastNeRF: High-Fidelity Neural Rendering at 200FPS (https://arxiv.org/abs/2103.10380) NeRF now hits 200 fps. All NLP Tasks Are Generation Tasks: A General Pretraining Framework (https://arxiv.org/abs/2103.10360) Pushing up NLU model performance with span-based generative pretraining. A subtle harmony with the GPT fine-tuning work above. TrivialAugment: Tuning-free Yet State-of-the-Art Data Augmentation (https://arxiv.org/abs/2103.10158) A simple augmentation pipeline in the line of RandAugment. The numbers look somewhat marginal, but considering that AutoAugment/RandAugment are an important recipe in training pipelines, it is probably meaningful. Using latent space regression to analyze and leverage compositionality in GANs (https://arxiv.org/abs/2103.10426) Combines a GAN with a masked-reconstruction model to build an image prior. Collaging photo fragments and then generating a consistent image with the generator is interesting. #review
83.782609
171
0.802802
kor_Hang
0.999494
985af86f26cac2657b9bc6748dc735b18cce799e
2,899
md
Markdown
README.md
evexoio/transfer.sh_client
c3e2e566d47a16eb3216b08ab6e632a59e882509
[ "MIT" ]
null
null
null
README.md
evexoio/transfer.sh_client
c3e2e566d47a16eb3216b08ab6e632a59e882509
[ "MIT" ]
null
null
null
README.md
evexoio/transfer.sh_client
c3e2e566d47a16eb3216b08ab6e632a59e882509
[ "MIT" ]
null
null
null
# transfer.sh-client Python client for uploading files to transfer.sh (https://transfer.sh/) This command-line tool sends a file (or several files, in the case of a directory upload) to transfer.sh and provides a link to the uploaded files, so they can be easily shared # Latest release: https://pypi.python.org/pypi/transfersh-client/1.1.2 # Getting Started - Install python and pip (package manager): ~~~~ sudo apt-get update sudo apt-get install python3 python3-pip OR sudo apt-get install python python-pip ~~~~ - Install the package from pip: ~~~ sudo pip3 install transfersh_client OR sudo pip install transfersh_client ~~~ # Usage - After installation, you can run this package directly from the command line. Launching it without arguments starts it in interactive mode: ~~~ transfer_files ~~~ ### Sample output: ~~~~ Github|⇒ transfer_files Enter path to file or directory: ./sysinfo Creating zipfile from files in... /home/path/to/directory/sysinfo Added file: cython_tut.cpython-34m.so Added file: cython_tut.pyx Added file: setup.py Added file: build Added file: fib.cpython-34m.so Added file: primes.c Added file: .idea Added file: fib.c Added file: parse_proc_files.py Added file: fib.pyx Added file: primes.pyx Added file: cython_tut.c Added file: primes.cpython-34m.so Sending zipfile: files_archive_09-02_18:34.zip (size of the file: 0.407897 MB) Link to download zipfile(will be saved till 2017-09-16): Could not save metadata Link copied to clipboard Remove archive? (y/n, yes/no):yes Removing file... /home/path/to/directory/sysinfo/files_archive_09-02_18:34.zip Removed. 
~~~~ - Besides that, you can start it with arguments: -i --interactive - flags that start the app with prompts (same as running it without arguments) -d --directory - enter a path to a directory (relative or absolute) whose files will be sent as an archive -f --file - same as --directory, but enter a path to a file --ra --rm-archive - delete the created archive after it is sent --rf --rm-file - delete the file after it is sent -h --help - display help message ### Sample output ~~~ transfer.sh_client|dev⚡ ⇒ transfer_files -f test.txt --rf Sending file: /home/path/to/directory/transfer.sh_client/test.txt (size of the file: 0.000113 MB) Link to download file(will be saved till 2017-09-16): https://transfer.sh/CtaJs/test.txt Link copied to clipboard Removing file... /home/path/to/directory/transfer.sh_client/test.txt Removed. ~~~ ## Example of usage inside scripts ~~~python #!/usr/bin/env python3 from transfersh_client.app import send_to_transfersh, create_zip, remove_file def send_files_from_dir(): directory = './' zip_file = create_zip(directory) # creates zip archive and returns its absolute path send_to_transfersh(zip_file) # sends archive to transfer.sh remove_file(zip_file) # removes it if __name__ == '__main__': send_files_from_dir() ~~~
25.883929
134
0.74336
eng_Latn
0.886507
985b638a4d238cc228ee5f1d3a0420af6dd70e12
4,823
md
Markdown
docs/code-quality/ca1701-resource-string-compound-words-should-be-cased-correctly.md
tommorris/visualstudio-docs.es-es
651470ca234bb6db8391ae9f50ff23485896393c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/code-quality/ca1701-resource-string-compound-words-should-be-cased-correctly.md
tommorris/visualstudio-docs.es-es
651470ca234bb6db8391ae9f50ff23485896393c
[ "CC-BY-4.0", "MIT" ]
null
null
null
docs/code-quality/ca1701-resource-string-compound-words-should-be-cased-correctly.md
tommorris/visualstudio-docs.es-es
651470ca234bb6db8391ae9f50ff23485896393c
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: 'CA1701: Resource string compound words should be cased correctly'
ms.date: 03/28/2018
ms.prod: visual-studio-dev15
ms.technology: vs-ide-code-analysis
ms.topic: reference
f1_keywords:
- ResourceStringCompoundWordsShouldBeCasedCorrectly
- CA1701
helpviewer_keywords:
- CA1701
- ResourceStringCompoundWordsShouldBeCasedCorrectly
ms.assetid: 4ddbe09f-24b8-4c47-9373-a06f4487ca0d
author: gewarren
ms.author: gewarren
manager: douge
ms.workload:
- multiple
ms.openlocfilehash: ecc558cb9c069c19b545434afe0a851130fdb300
ms.sourcegitcommit: e13e61ddea6032a8282abe16131d9e136a927984
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 04/26/2018
ms.locfileid: "31915662"
---
# <a name="ca1701-resource-string-compound-words-should-be-cased-correctly"></a>CA1701: Resource string compound words should be cased correctly

|||
|-|-|
|TypeName|ResourceStringCompoundWordsShouldBeCasedCorrectly|
|CheckId|CA1701|
|Category|Microsoft.Naming|
|Breaking change|Non-breaking|

## <a name="cause"></a>Cause

A resource string contains a compound word that is not cased correctly.

## <a name="rule-description"></a>Rule description

Each word in the resource string is split into tokens based on its casing. The Microsoft spelling checker library checks each combination of two contiguous tokens. If the combination is a recognized word, it generates a rule violation. Examples of compound words that cause a violation are "CheckSum" and "MultiPart", which should be cased as "Checksum" and "Multipart", respectively. Because of earlier common usage, the rule makes several exceptions and flags some single words, such as "Toolbar" and "Filename", which should instead be written as two distinct words; in these cases, "ToolBar" and "FileName" would be flagged.

Naming conventions provide a common look for libraries that target the common language runtime. This reduces the learning curve required for new software libraries and increases customer confidence that the library was developed by someone with expertise in managed code development.

## <a name="how-to-fix-violations"></a>How to fix violations

Change the word so that it is cased correctly.

## <a name="change-the-dictionary-language"></a>Change the dictionary language

By default, the English (en) version of the spelling checker is used. If you want to change the language of the spelling checker, you can do so by adding one of the following attributes to the *AssemblyInfo.cs* or *AssemblyInfo.vb* file:

- Use <xref:System.Reflection.AssemblyCultureAttribute> to specify the culture if the resources are located in a satellite assembly.

- Use <xref:System.Resources.NeutralResourcesLanguageAttribute> to specify the *neutral culture* of the assembly if the resources are in the same assembly as the code.

> [!IMPORTANT]
> If you set the culture to anything other than an English-based culture, this code analysis rule is silently disabled.

## <a name="when-to-suppress-warnings"></a>When to suppress warnings

It is safe to suppress a warning from this rule if both parts of the compound word are recognized by the spelling dictionary and the intent is to use two or more words. You can also add compound words to a custom dictionary for the spelling checker. Words in the custom dictionary do not cause violations. For more information, see [How to: Customize the Code Analysis Dictionary](../code-quality/how-to-customize-the-code-analysis-dictionary.md).

## <a name="related-rules"></a>Related rules

- [CA1702: Compound words should be cased correctly](../code-quality/ca1702-compound-words-should-be-cased-correctly.md)

- [CA1709: Identifiers should be cased correctly](../code-quality/ca1709-identifiers-should-be-cased-correctly.md)

- [CA1708: Identifiers should differ by more than case](../code-quality/ca1708-identifiers-should-differ-by-more-than-case.md)

## <a name="see-also"></a>See also

- [Capitalization Conventions](/dotnet/standard/design-guidelines/capitalization-conventions)

- [Naming Guidelines](/dotnet/standard/design-guidelines/naming-guidelines)
65.175676
679
0.807589
spa_Latn
0.964944
985b8e60da8ff37b9aa41a99b8e805e09200dbcb
4,448
md
Markdown
_posts/2021-11-3-slight-nginx-hiccup.md
Xavier-J-Ortiz/xavier-j-ortiz.github.io
f0b1f13c87458b36590eabb1211612ea40e7bf5f
[ "MIT" ]
null
null
null
_posts/2021-11-3-slight-nginx-hiccup.md
Xavier-J-Ortiz/xavier-j-ortiz.github.io
f0b1f13c87458b36590eabb1211612ea40e7bf5f
[ "MIT" ]
null
null
null
_posts/2021-11-3-slight-nginx-hiccup.md
Xavier-J-Ortiz/xavier-j-ortiz.github.io
f0b1f13c87458b36590eabb1211612ea40e7bf5f
[ "MIT" ]
null
null
null
---
layout: single
title: NGINX snafu linked to letsencrypt?
---

Clearly I have things running, given that the previous, and this current, post are both up and running. However, I did run into some NGINX issues. Or rather, NGINX config issues that stemmed from a letsencrypt update.

Letsencrypt recently had a certificate issue. The gist of it is, one of their root CA certs expired, which basically made all the certs signed by it invalid and in need of reissue. This issue did not affect this website, as it seems that the version of `certbot` that I had used was new enough that it had correctly addressed this issue. However... it had been quite some time since the package had been installed on the debian server. In fact, I had installed it via `backports`. Package maintainers sometimes add packages that have made it into unstable to the `backports` repo, so you may leverage some more up-to-date packages on the `stable` branch of debian. This is important, and I've used the following packages off of backports when I was using debian `stable` as my main distro: things such as `linux-kernel-headers`, `linux-kernel-image`, and `nvidia-drivers`. These move ahead quite quickly, and even the kernel may have some new modules that might benefit a desktop user.

Getting back to `certbot`, it *used* to be in the `stable` branch's backports, but as of late, [that has changed](https://certbot.eff.org/lets-encrypt/debiantesting-nginx). Seems the way to install it is via a `snap` package. Not gonna lie, I don't like to add snap packages if I can avoid them. Snap packages are supposed to be version- and distro-agnostic. This of course *sounds* really cool, and does have its benefits. But.... unfortunately, from what I've read, you also install a lot of extra packages into the snap environment that runs the program you're installing, and well... that creates a bit of bloat.
Though I would hope that the bloat won't go into the main system, given that I would assume that the snap environment is isolated from the main distro's environment, but I don't know enough of this just yet to say it is one way or another. Anyway, let's assume the best, and that the snap environment is just going to use some more storage space up on my server. Fine. Price paid.

The bigger issue that I haven't figured out yet is how to upgrade the package so that it's up to date without my intervention. So... with debian, I could always do unattended upgrades. Which, typically I only enabled unattended upgrades on security patches/upgrades. But, not sure how to go about doing something unattended on a snap that is housing the `certbot` package. Though, now that I think about it, the `unattended-upgrades` package would not have upgraded the certbot package from what I recall, given that the `certbot` package was part of `backports`. I don't believe that backports is in `unattended-upgrades` unless you add it. Which I didn't, mainly because you never know when a package upstream might bork something downstream. Anyway.... I had to bite this bullet for the short term, and as I get familiar with the way this goes, maybe long term. We'll see.

Either way, NGINX was working, then after updating certbot, it wasn't. It seems that the `certbot` snap package would modify the nginx config files for the sites incorrectly. grrr. Anyway, after brushing up on my nginx-ese, I realized that certbot was adding a couple of lines where not needed, and creating a server block where not needed. This would generate some bad URL routing and basically take my site offline. grrr. Was able to fix this and get it working, and I'm better for it because I updated my nginx config for this website! Yay!

For the time being, will keep the certbot snap package, but am not very convinced, nor really wanting, to move to having a snap on a debian system.
Kinda defeats the point to add bloat, when the reason I installed debian was to avoid any (slight) Ubuntu bloat.

**UPDATE: as I write this, I realize that the [debian documentation](https://wiki.debian.org/LetsEncrypt) now points to using the certbot that comes with `stable`. So, will probably migrate to that in order to not muck around with snaps.**

**UPDATE-2: I did end up using the packages from debian. So much easier to manage, and I also know that the security part is going to be just fine to manage via `unattended-upgrades`.**
98.844444
552
0.774955
eng_Latn
0.999837
985bf7a949524fc9b342e01a811e9c5d4c6a7888
1,175
md
Markdown
README.md
Saul-Shen/tw-tree
091397d0f57962f76c8cf3f056b6e7b2bda599a2
[ "MIT" ]
null
null
null
README.md
Saul-Shen/tw-tree
091397d0f57962f76c8cf3f056b6e7b2bda599a2
[ "MIT" ]
null
null
null
README.md
Saul-Shen/tw-tree
091397d0f57962f76c8cf3f056b6e7b2bda599a2
[ "MIT" ]
null
null
null
# tw-tree

Two-way tree chart; you can customize label content and link style. Single-way tree charts are also supported.

# Example

![example.png](./.github/example.png)

# Usage

Only the root node has the 'isRoot' attribute and no 'node' attribute.

```html
<tw-tree class="my-tree" :left-tree="leftTree" :right-tree="rightTree">
  <template #label="props">
    <!-- custom render according to node info -->
    <div v-if="props.isRoot" class="label">
      {{ props.label }}
    </div>
    <div v-else class="label">
      {{ props.node.label }}
    </div>
  </template>
</tw-tree>
```

# Props

## leftTree

Left tree data; may be null.

## rightTree

Right tree data; may be null.

# Slots

## label

Label slot

# Interface

```typescript
interface TreeNode {
  // not necessary, but useful; the example code shows the details
  id?: string,
  twAttrs?: {
    // whether to collapse children
    collapse: boolean,
    // custom label-to-parent link style
    parentLink: {
      width: string,
      color: string,
    },
    // custom label-to-children link style
    childrenLink: {
      width: string,
      color: string,
    },
  },
  children: Array<TreeNode>
};
```
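The `TreeNode` interface above can be exercised in plain TypeScript. The sketch below builds a small `leftTree` and a helper that counts nodes; the `id` and `label` values are made up for illustration, and the `label` field itself is an assumption inferred from the `props.node.label` usage in the README (it is not declared in the interface).

```typescript
// Sketch of tree data conforming to the README's TreeNode interface.
// The `label` field is assumed from the #label slot example.
interface TreeNode {
  id?: string;
  label?: string; // assumption: read by the #label slot as props.node.label
  twAttrs?: {
    collapse: boolean;
    parentLink: { width: string; color: string };
    childrenLink: { width: string; color: string };
  };
  children: TreeNode[];
}

// Count every node in a (sub)tree, including the subtree's root.
function countNodes(node: TreeNode): number {
  return 1 + node.children.reduce((sum, child) => sum + countNodes(child), 0);
}

const leftTree: TreeNode = {
  id: 'root-left',
  label: 'Left root',
  children: [
    { id: 'l1', label: 'Child 1', children: [] },
    {
      id: 'l2',
      label: 'Child 2',
      twAttrs: {
        collapse: false,
        parentLink: { width: '2px', color: '#42b983' },
        childrenLink: { width: '1px', color: '#ccc' },
      },
      children: [{ id: 'l2a', label: 'Grandchild', children: [] }],
    },
  ],
};

console.log(countNodes(leftTree)); // 4 nodes in this sample tree
```

An object shaped like `leftTree` is what you would bind to the component's `:left-tree` prop.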
16.549296
100
0.628936
eng_Latn
0.907478
985c82a5a69fbef0c5f3dd70d1fb9618fd31590d
2,427
md
Markdown
windows-driver-docs-pr/stream/codecapi-allsettings.md
ahidaka/windows-driver-docs
6eac87818eba4c606a292991994b90f3279c2ab8
[ "CC-BY-4.0", "MIT" ]
1
2021-04-18T03:12:31.000Z
2021-04-18T03:12:31.000Z
windows-driver-docs-pr/stream/codecapi-allsettings.md
ahidaka/windows-driver-docs
6eac87818eba4c606a292991994b90f3279c2ab8
[ "CC-BY-4.0", "MIT" ]
null
null
null
windows-driver-docs-pr/stream/codecapi-allsettings.md
ahidaka/windows-driver-docs
6eac87818eba4c606a292991994b90f3279c2ab8
[ "CC-BY-4.0", "MIT" ]
1
2021-02-23T22:45:54.000Z
2021-02-23T22:45:54.000Z
--- title: CODECAPI\_ALLSETTINGS description: CODECAPI\_ALLSETTINGS ms.date: 11/28/2017 ms.localizationpriority: medium --- # CODECAPI\_ALLSETTINGS ## <span id="ddk_codecapi_allsettings_ks"></span><span id="DDK_CODECAPI_ALLSETTINGS_KS"></span> The CODECAPI\_ALLSETTINGS property is used to pass back and forth a minidriver-generated block of data. <table> <colgroup> <col width="20%" /> <col width="20%" /> <col width="20%" /> <col width="20%" /> <col width="20%" /> </colgroup> <thead> <tr class="header"> <th>Get</th> <th>Set</th> <th>Target</th> <th>Property descriptor type</th> <th>Property value type</th> </tr> </thead> <tbody> <tr class="odd"> <td><p>Yes</p></td> <td><p>Yes</p></td> <td><p>Filter</p></td> <td><p>KSPROPERTY</p></td> <td><p>PVOID</p></td> </tr> </tbody> </table> The property value (operation data) is of type PVOID, which is a pointer to a user-mode buffer for the minidriver-generated block of data. ### Comments On a property get call: If an application makes a property get call with a zero length buffer, the minidriver must return STATUS\_BUFFER\_OVERFLOW and specify the required buffer size in the **Irp-&gt;IoStatus.Information** field. If the length buffer is nonzero, the minidriver must return STATUS\_BUFFER\_TOO\_SMALL if the supplied buffer is too small for the data block, otherwise the minidriver packs its settings into a data block that can be restored later. It is the minidriver's responsibility to add data integrity checks to the data, such as a unique GUID to indicate the minidriver generated the data, a cyclic redundancy check (CRC), and a header length. The data returned should be lightweight and contain only information required to reconstruct the current settings. Applications will use this property for multilevel undos, stored with their projects, etc. 
On a property set call: The minidriver must verify the data's integrity and check that the data block size is under the maximum data size (for example, reject anything over a certain size). It must also verify the CRC and the header length. The minidriver must also cache any changes to be propagated for [CODECAPI\_CURRENTCHANGELIST](codecapi-currentchangelist.md). ### Requirements **Headers:** Declared in *ksmedia.h*. Include *ksmedia.h*. ### See Also [**KSPROPERTY**](/windows-hardware/drivers/ddi/ks/ns-ks-ksidentifier), [CODECAPI\_CURRENTCHANGELIST](codecapi-currentchangelist.md)
32.797297
439
0.749485
eng_Latn
0.979888
985d102b5d4eabb28622e15b8d2a89913ab90acf
193
md
Markdown
_posts/2021-06-26-example-content.md
KangraePark/kangraepark.github.io
51c271c1aa67ed143ad3ced11bf3dcef23fe76b0
[ "MIT" ]
null
null
null
_posts/2021-06-26-example-content.md
KangraePark/kangraepark.github.io
51c271c1aa67ed143ad3ced11bf3dcef23fe76b0
[ "MIT" ]
null
null
null
_posts/2021-06-26-example-content.md
KangraePark/kangraepark.github.io
51c271c1aa67ed143ad3ced11bf3dcef23fe76b0
[ "MIT" ]
null
null
null
--- layout: post title: Example tags: - math --- <p class="message"> Hi, I am preparing some contents. </p> ----- $a^2+b^2=c^2$ is abcd $\int_a^b 2x dx=b^2-a^2$.\\ 한글영어
12.866667
50
0.518135
eng_Latn
0.593595
985d95ba14e280654a2c5313dfcc85c5c690d558
26
md
Markdown
README.md
ohrak22/quest-board
6cb2c09a97732ea53d694f7dd4caca2fdd512f61
[ "MIT" ]
null
null
null
README.md
ohrak22/quest-board
6cb2c09a97732ea53d694f7dd4caca2fdd512f61
[ "MIT" ]
null
null
null
README.md
ohrak22/quest-board
6cb2c09a97732ea53d694f7dd4caca2fdd512f61
[ "MIT" ]
null
null
null
# quest-board Quest Board
8.666667
13
0.769231
ita_Latn
0.909891
985dc309691c35f1f9506bf1061ac1b52973d221
3,491
md
Markdown
README.md
phR0ze/wmctl
fac89c5d6fce4ab472563aeefdd04906b64e85e2
[ "Apache-2.0", "MIT" ]
null
null
null
README.md
phR0ze/wmctl
fac89c5d6fce4ab472563aeefdd04906b64e85e2
[ "Apache-2.0", "MIT" ]
null
null
null
README.md
phR0ze/wmctl
fac89c5d6fce4ab472563aeefdd04906b64e85e2
[ "Apache-2.0", "MIT" ]
null
null
null
# wmctl
[![license-badge](https://img.shields.io/crates/l/fungus.svg)](https://opensource.org/licenses/MIT)
[![crates.io](https://img.shields.io/crates/v/wmctl.svg)](https://crates.io/crates/wmctl)
[![Minimum rustc](https://img.shields.io/badge/rustc-1.30+-lightgray.svg)](https://github.com/phR0ze/gory#rustc-requirements)

***Rust X11 automation***

`wmctl` implements the [Extended Window Manager Hints (EWMH) specification](https://specifications.freedesktop.org/wm-spec/latest/) as a way to work alongside EWMH-compatible window managers as a companion. `wmctl` provides the ability to precisely define how windows should be shaped and placed and can fill in gaps for window managers lacking some shaping or placement features. Mapping wmctl commands to user-defined hot key sequences will allow for easy window manipulation beyond what your favorite EWMH window manager provides.

### Disclaimer
***wmctl*** comes with absolutely no guarantees or support of any kind. It is to be used at your own risk. Any damages, issues, losses or problems caused by the use of ***wmctl*** are strictly the responsibility of the user and not the developer/creator of ***wmctl***.

### Quick links
* [Usage](#usage)
* [Shape window](#shape-window)
* [Move window](#move-window)
* [Place window](#place-window)
* [Contribute](#contribute)
* [Git-Hook](#git-hook)
* [License](#license)
* [Contribution](#contribution)
* [Backlog](#backlog)
* [Changelog](#changelog)

## Usage <a name="usage"/></a>
***rustc >= 1.30*** is required due to the [tracing\_subscriber](https://docs.rs/tracing-subscriber/0.2.15/tracing_subscriber) requirements

`$ wmctl -h` for cli help
![help image](docs/images/help.png)

### Shape window <a name="shape-window"/></a>
Shape the active window using the pre-defined `small` shape, which is a quarter of the screen.
```bash
$ wmctl shape small
```

### Move window <a name="move-window"/></a>
Move the active window to the bottom left corner of the screen.
```bash
$ wmctl move bottom-left
```

### Place window <a name="place-window"/></a>
Combine the shape and move into a single command by placing the window. First the window is shaped using the pre-defined `small` shape, then it is moved to the bottom left of the screen in a single operation.
```bash
$ wmctl place small bottom-left
```

## Contribute <a name="Contribute"/></a>
Pull requests are always welcome. However, understand that they will be evaluated purely on whether or not the change fits with my goals/ideals for the project.

### Git-Hook <a name="git-hook"/></a>
Enable the git hooks to have automatic version increments
```bash
cd ~/Projects/wmctl
git config core.hooksPath .githooks
```

## License <a name="license"/></a>
This project is licensed under either of:
* MIT license [LICENSE-MIT](LICENSE-MIT) or http://opensource.org/licenses/MIT
* Apache License, Version 2.0 [LICENSE-APACHE](LICENSE-APACHE) or http://www.apache.org/licenses/LICENSE-2.0

### Contribution <a name="contribution"/></a>
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in this project by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.

---

## Backlog <a name="backlog"/></a>

## Changelog <a name="changelog"/></a>
* 12/18/2021
  * Add Arch Linux packaging
  * Added public documentation
  * Fix to precisely place windows with Xfwm4
  * Completed move, shape and place implementation
38.788889
131
0.734746
eng_Latn
0.983853
985f7b31608261b302f78b36b32f15d1e64b4844
1,860
md
Markdown
README.md
scadl/UnderGround-Basilic_UE4
49d339b9f4a24b51e4a4c757ee896a0726e678d3
[ "MIT" ]
null
null
null
README.md
scadl/UnderGround-Basilic_UE4
49d339b9f4a24b51e4a4c757ee896a0726e678d3
[ "MIT" ]
null
null
null
README.md
scadl/UnderGround-Basilic_UE4
49d339b9f4a24b51e4a4c757ee896a0726e678d3
[ "MIT" ]
null
null
null
This project is my first reconstruction made for interactive visualization. Much better surface texturing and interior detailing have been done. The temple itself has a very complex structure due to its partial immersion in the rock mass. From a geometric point of view, the model has been verified in great detail, and all useless and invisible polygons have been removed. The textures, although not of a large resolution, still have sufficient detail, but they do not correspond to the real materials from which the buildings in Chersonesos were built. This is one of two reconstructions that strike a balance between system requirements and picture quality. In the first version of the program, good performance was achieved within the capabilities of earlier versions of Unity3D. Several classic point lights were used to illuminate the scenes, with their shadows baked into surface textures, to increase compatibility with cross-platform compilation and with the low computing power of the target systems. There is built-in simplified control help, a hidden logo, basic graphics and image quality settings, as well as three cross-platform options. In the new UE4 version of the reconstruction, all of the materials were rethought, as was the structure of the model itself. All geometry was disassembled into independent modules in the open-source editor Blender. The geometry of each module has been refined, and the modules received clean, conflict-free UV unwrapping, which made it possible to create significantly more complex scene lighting. The camera controller has also been improved, making it smoother and more stable. Doors now separate the outer space from the inner one, and here for the first time we used visible light sources, such as braziers and a choros.
310
704
0.82043
eng_Latn
0.999928
985f88c840364ec78d484037f61429fc876d38d5
629
md
Markdown
_i18n/en/blog/just-arrived-in-the-huffington-post.md
justarrived/website
30dbf9c14a653f6033419ed87306635a03de688f
[ "MIT" ]
2
2017-06-01T14:26:31.000Z
2017-08-10T09:29:18.000Z
_i18n/en/blog/just-arrived-in-the-huffington-post.md
justarrived/website
30dbf9c14a653f6033419ed87306635a03de688f
[ "MIT" ]
157
2016-09-09T13:45:16.000Z
2020-02-27T04:17:34.000Z
_i18n/en/blog/just-arrived-in-the-huffington-post.md
justarrived/website
30dbf9c14a653f6033419ed87306635a03de688f
[ "MIT" ]
1
2017-02-20T00:07:37.000Z
2017-02-20T00:07:37.000Z
Not only are we thrilled to [see the Huffington Post](http://www.huffingtonpost.com/entry/58c0210fe4b070e55af9e9ef) make Just Arrived a great example of start-ups working for integration, we are also heartened to see Zaher thrive as one of the first candidates to receive a job through Just Arrived.

> ”His current employer, Väsby hem, is eager to extend his employment. Proof enough for him that the platform take on immigration is paying off.”
>
> __Huffington Post__

[Read the full article](http://www.huffingtonpost.com/entry/58c0210fe4b070e55af9e9ef).

![Stockholm Overview](/assets/images/blog/stockholm-huff-overview.jpg)
62.9
297
0.799682
eng_Latn
0.989141
986083e8eed578a4f4025ebf45317a7ce15703d6
2,816
md
Markdown
website/versioned_docs/version-4.2.0/config-maven.md
java-tools/awe
30618df0a6f940284851d2a988317eaf3c546186
[ "Apache-2.0" ]
1
2021-03-06T15:04:08.000Z
2021-03-06T15:04:08.000Z
website/versioned_docs/version-4.3.0/config-maven.md
java-tools/awe
30618df0a6f940284851d2a988317eaf3c546186
[ "Apache-2.0" ]
null
null
null
website/versioned_docs/version-4.3.0/config-maven.md
java-tools/awe
30618df0a6f940284851d2a988317eaf3c546186
[ "Apache-2.0" ]
null
null
null
--- id: maven title: Maven --- The maven dependencies needed to run an application with AWE engine are the next ones: ```xml <dependency> <groupId>com.almis.awe</groupId> <artifactId>awe-spring-boot-starter</artifactId> <version>${awe.version}</version> </dependency> <dependency> <groupId>com.almis.awe</groupId> <artifactId>awe-client-angular</artifactId> <version>${awe.version}</version> </dependency> ``` Add the maven dependency plugin to retrieve the generic screens and the client engine sources: ```xml <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-dependency-plugin</artifactId> <executions> <execution> <phase>prepare-package</phase> <id>unpack awe-generic-screens</id> <goals> <goal>unpack-dependencies</goal> </goals> <configuration> <includeGroupIds>com.almis.awe</includeGroupIds> <includeArtifactIds>awe-generic-screens</includeArtifactIds> <includes>schemas/**</includes> <outputDirectory>${project.build.directory}/classes/static/</outputDirectory> </configuration> </execution> <execution> <phase>prepare-package</phase> <id>unpack awe-client-angular</id> <goals> <goal>unpack-dependencies</goal> </goals> <configuration> <includeGroupIds>com.almis.awe</includeGroupIds> <includeArtifactIds>awe-client-angular</includeArtifactIds> <includes>images/**,fonts/**,js/**,css/**,less/**</includes> <outputDirectory>${project.build.directory}/classes/static/</outputDirectory> </configuration> </execution> </executions> </plugin> ``` We use `webpack` to compile all javascript and less files ```xml <plugin> <groupId>com.github.eirslett</groupId> <artifactId>frontend-maven-plugin</artifactId> <executions> <execution> <id>install node and yarn</id> <goals> <goal>install-node-and-yarn</goal> </goals> <configuration> <nodeVersion>v8.12.0</nodeVersion> <yarnVersion>v1.10.1</yarnVersion> </configuration> </execution> <execution> <id>yarn install</id> <goals> <goal>yarn</goal> </goals> <configuration> <arguments>install</arguments> </configuration> 
</execution> <execution> <id>webpack</id> <goals> <goal>webpack</goal> </goals> <configuration> <arguments>--output-path "${project.build.frontend}"</arguments> </configuration> </execution> </executions> </plugin> ```
28.444444
131
0.656605
eng_Latn
0.785149
9860b8ea060681664885515694e609d008d8d7e6
112
md
Markdown
infrastructure/kube/master-scripts/yml/dashboard-fix/README.md
zachradtka/Data-Profiler
034b8eddabf9b66b2862aa452c737e20f65effc5
[ "ECL-2.0", "Apache-2.0" ]
5
2021-12-30T13:05:51.000Z
2022-03-25T18:13:54.000Z
infrastructure/kube/master-scripts/yml/dashboard-fix/README.md
zachradtka/Data-Profiler
034b8eddabf9b66b2862aa452c737e20f65effc5
[ "ECL-2.0", "Apache-2.0" ]
1
2022-02-25T22:13:41.000Z
2022-03-10T19:38:52.000Z
infrastructure/kube/master-scripts/yml/dashboard-fix/README.md
zachradtka/Data-Profiler
034b8eddabf9b66b2862aa452c737e20f65effc5
[ "ECL-2.0", "Apache-2.0" ]
2
2021-12-13T16:32:19.000Z
2022-01-24T17:21:49.000Z
Give the dashboard user cluster-admin. It's not ideal to grant that many permissions, but it gets the dashboard working.
56
111
0.821429
eng_Latn
0.997392
eed7fabf25bf40f0a981365f8000b3d891b5f8a6
500
markdown
Markdown
_posts/2015/2015-10-17-atualizacao-sobre-a-manutencao-dos-servicos-online-de-spore.markdown
esporo/esporo.net
72de6d7900a0c20d33b152c8e1c6d59710f12d36
[ "MIT" ]
null
null
null
_posts/2015/2015-10-17-atualizacao-sobre-a-manutencao-dos-servicos-online-de-spore.markdown
esporo/esporo.net
72de6d7900a0c20d33b152c8e1c6d59710f12d36
[ "MIT" ]
2
2021-08-15T14:24:59.000Z
2022-03-26T20:07:31.000Z
_posts/2015/2015-10-17-atualizacao-sobre-a-manutencao-dos-servicos-online-de-spore.markdown
esporo/esporo.net
72de6d7900a0c20d33b152c8e1c6d59710f12d36
[ "MIT" ]
null
null
null
---
layout: post
title: "Update on the maintenance of Spore's online services"
date: "2015-10-17 22:55:37 -0200"
tags:
- Spore
- service status
- online services
---

MaxisBazajaytee announced yesterday that testing on the new servers is going well and that no major issues have been reported so far, which means that in the next few days we will likely receive an update on the release date of Spore's online features. We will post any news here. :)
38.461538
274
0.778
por_Latn
0.999901
eed80bd76b9c2bcc91b4ea7a4459f01a0eb81715
688
md
Markdown
old_md/ravinetto_global_2013.md
thomasdorlo/thomasdorlo
7a748541d80b4910e3a5b4348f6d5c5a666bbb11
[ "MIT" ]
null
null
null
old_md/ravinetto_global_2013.md
thomasdorlo/thomasdorlo
7a748541d80b4910e3a5b4348f6d5c5a666bbb11
[ "MIT" ]
null
null
null
old_md/ravinetto_global_2013.md
thomasdorlo/thomasdorlo
7a748541d80b4910e3a5b4348f6d5c5a666bbb11
[ "MIT" ]
null
null
null
+++ title = "The global impact of Indian generics on access to health" date = "2013-01-01" publication_types = ["2"] authors = ["Raffaella M. Ravinetto", "**Thomas P. C. Dorlo**", "Jean-Michel Caudron", "N. S. Prashanth"] publication = "_Indian Journal of Medical Ethics_" abstract = "" doi = "10.20529/IJME.2013.035" links = [{name = "PubMed", url = "https://www.ncbi.nlm.nih.gov/pubmed/23697493"}] abstract_short = "" image_preview = "" selected = false projects = [] tags = [] url_pdf = "" url_preprint = "" url_code = "" url_dataset = "" url_project = "" url_slides = "" url_video = "" url_poster = "" url_source = "" math = true highlight = true [header] image = "" caption = "" +++
22.933333
104
0.655523
eng_Latn
0.309623
eed83eeb74a8abbc051db2366587eb99367f8ebd
131
md
Markdown
README.md
Wiles/clork
0f057db718660312b23436a6700a09979ea27cf9
[ "Apache-2.0" ]
null
null
null
README.md
Wiles/clork
0f057db718660312b23436a6700a09979ea27cf9
[ "Apache-2.0" ]
null
null
null
README.md
Wiles/clork
0f057db718660312b23436a6700a09979ea27cf9
[ "Apache-2.0" ]
null
null
null
# clork

![clork](clork.jpg)

Clocks with one-year, one-week, and one-day durations, shown displaying the beginning of January, a Thursday, at 20:50.
18.714286
99
0.748092
eng_Latn
0.998936
eed8a2785defd7f72cef083dca65a09b53c6100b
289
md
Markdown
_posts/2019-10-25-wiki20191025.md
wiki650/wiki650.github.io
e23265b7d5d679ced69595d14c615bda76aa4c68
[ "MIT" ]
null
null
null
_posts/2019-10-25-wiki20191025.md
wiki650/wiki650.github.io
e23265b7d5d679ced69595d14c615bda76aa4c68
[ "MIT" ]
null
null
null
_posts/2019-10-25-wiki20191025.md
wiki650/wiki650.github.io
e23265b7d5d679ced69595d14c615bda76aa4c68
[ "MIT" ]
null
null
null
---
layout: post
title: 2019-10-25 When feeling lost
---

These past two days I suddenly feel as though I have become able to let everything go. Is this a resigned, fatalistic state of mind? Is it because there is no hope left at all, so giving up is the only option? I don't feel like myself at all anymore. Does this count as letting go? Even I find it hard to believe. What on earth happened? Have I really accepted this ending?

Only one last week remains. Cherish every day spent by your side. Perhaps it is just self-consoling imagination, but I can actually sense that you still care about me quite a bit and watch everything I do; maybe you still hold a trace of fondness for me. But between us there is truly far too great a distance, perhaps a distance I could never close in this lifetime. Perhaps thinking it through and letting go is what is best for us.
28.9
126
0.861592
zho_Hans
0.716303
eed978348eea5be9ff3349758367095a8fee7487
1,864
md
Markdown
docs/mechanisms/build.md
borsothy/moonshot
a2030ceef132d9b9438f0ea7e7f6ec2a940a18d6
[ "Apache-2.0" ]
null
null
null
docs/mechanisms/build.md
borsothy/moonshot
a2030ceef132d9b9438f0ea7e7f6ec2a940a18d6
[ "Apache-2.0" ]
null
null
null
docs/mechanisms/build.md
borsothy/moonshot
a2030ceef132d9b9438f0ea7e7f6ec2a940a18d6
[ "Apache-2.0" ]
null
null
null
# BuildMechanism

## Script

The Script BuildMechanism will execute a local shell script, with certain expectations. The script will run with some environment variables:

- `VERSION`: The named version string passed to `build-version`.
- `OUTPUT_FILE`: The file that the script is expected to produce. If the file is not created by the build script, deployment will fail. Otherwise, the output file will be uploaded using the ArtifactRepository.

Sample Usage
```ruby
#!/usr/bin/env ruby
require 'moonshot'

# Set up Moonshot tooling for our environment.
class MoonshotSampleApp < Moonshot::CLI
  self.build_mechanism = Script.new('bin/build.sh')
  ...
```

## GithubRelease

A build mechanism that creates a tag and GitHub release. It can be used to delegate other build steps until after the GitHub release is created.

Sample Usage
```ruby
#!/usr/bin/env ruby
require 'moonshot'

# Set up Moonshot tooling for our environment.
class MoonshotSampleApp < Moonshot::CLI
  wait_for_travis_mechanism = TravisDeploy.new("acquia/moonshot", true)
  self.build_mechanism = GithubRelease.new(wait_for_travis_mechanism)
  ...
```

## TravisDeploy

The Travis build mechanism waits for Travis-CI to finish building a job matching the VERSION (see above); the output of the Travis job has to be 'BUILD=1'. It can be used to make sure that the Travis job for that version of the repository has actually finished before the deployment step is executed.

Sample Usage
```ruby
#!/usr/bin/env ruby
require 'moonshot'

# Set up Moonshot tooling for our environment.
class MoonshotSampleApp < Moonshot::CLI
  # First argument is the repository as known by travis.
  # Second argument is whether or not you are using travis pro.
  self.build_mechanism = TravisDeploy.new("acquia/moonshot", pro: true)
  ...
```

## Version Proxy

@Todo Document and clarify the use-case of the Version Proxy.
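The Script mechanism above hands the build script two environment variables, `VERSION` and `OUTPUT_FILE`, and fails the deployment if `OUTPUT_FILE` does not exist afterwards. A minimal `bin/build.sh` honoring that contract might look like the following sketch; the tarball payload is purely illustrative, and the standalone fallback values would be overridden by Moonshot's environment.

```shell
#!/bin/sh
# Sketch of a bin/build.sh for Moonshot's Script build mechanism.
# Moonshot exports VERSION and OUTPUT_FILE before running the script,
# and fails the deployment if OUTPUT_FILE does not exist afterwards.
set -eu

# Fall back to sample values so the sketch is runnable standalone;
# under Moonshot these come from the environment.
VERSION="${VERSION:-0.0.0-dev}"
OUTPUT_FILE="${OUTPUT_FILE:-./build-artifact.tar.gz}"

workdir=$(mktemp -d)
trap 'rm -rf "$workdir"' EXIT

# Illustrative payload: record the version inside the artifact.
printf '%s\n' "$VERSION" > "$workdir/VERSION"
tar -czf "$OUTPUT_FILE" -C "$workdir" VERSION

echo "Built $OUTPUT_FILE for version $VERSION"
```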
28.676923
301
0.76824
eng_Latn
0.99282
eeda166ef28f5fea1698acac76feadfec8c0ac9e
3,422
md
Markdown
_posts/2016-11-01-SRT.md
yunnant/yunnant.github.io
056bf6c5ed0e75d4cb5b691661f6abd46e7660d3
[ "MIT" ]
null
null
null
_posts/2016-11-01-SRT.md
yunnant/yunnant.github.io
056bf6c5ed0e75d4cb5b691661f6abd46e7660d3
[ "MIT" ]
null
null
null
_posts/2016-11-01-SRT.md
yunnant/yunnant.github.io
056bf6c5ed0e75d4cb5b691661f6abd46e7660d3
[ "MIT" ]
null
null
null
---
layout: post
title: "Research on Emotion Index of Students"
categories: Projects
excerpt: We collected relevant emotional indicators of college students and public investors, used the principal component analysis method to construct a sentiment index, and found that the sentiment index does have predictive power for stock prices.
---

We used the principal component analysis method to quantify the investment behavior and psychological status of college students based on the data from the "Essence Securities Cup" college students' stock-trading simulation contest, and compared the resulting sentiment index with the benchmark (the A-share index) to analyze the relationship between the students' sentiment and the A-share trend. We then compared the college students' sentiment index with the overall sentiment level of the market, determined the investment characteristics of the student group, and found that the sentiment index does have predictive ability. The following is an excerpt from the final paper.

Because of the large number of variables and the correlations among them, we adopted the principal component analysis method. Principal component analysis combines a large number of original indicators into a few comprehensive indicators, which simplifies the analysis. This method takes linear combinations of the original indicators to obtain comprehensive indicators that capture the greatest variation in the original indicators. Each sentiment indicator reflects investor sentiment to a varying degree. On the one hand, different indicators overlap in the investor sentiment they reflect; on the other hand, each indicator captures only one angle of sentiment. Therefore, it is necessary to use principal component analysis to reduce the dimensionality of the initial sentiment indicators and extract the principal components as proxy variables for investor sentiment.

For market sentiment, we obtained data on daily account openings, daily trading volume, and market turnover. For college-student investors, we used a web crawler to obtain each entrant's rate of return, turnover rate, position, maximum drawdown, individual stock trading volume, and transaction prices, and computed cash balance, single-day trading amount, total stock market value, position, return, and maximum drawdown as sentiment index factors. Principal component analysis was performed on these data and the results were compared with the market index, as shown below.

According to the results, it can be concluded that college students are generally passive toward the market, that their emotions are not affected as much as imagined, and that the sentiment index does have predictive ability.

<center> <img src="https://i.ibb.co/Nj4Rzr1/image.png" width="50%"/> </center>

<center> <img src="https://i.ibb.co/6XXPg3z/image.png" width="50%"/> </center>
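As a minimal sketch of the dimensionality-reduction step described above: the index is the projection of the standardized indicators onto the first principal component, i.e. the linear combination capturing the greatest variation. The synthetic data and variable names below are illustrative stand-ins, not the study's actual dataset.

```python
import numpy as np

def sentiment_index(indicators: np.ndarray) -> np.ndarray:
    """Collapse sentiment indicators (rows = days, cols = indicators)
    into a single index via the first principal component."""
    # Standardize each indicator to zero mean, unit variance.
    z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
    # The principal components are the right singular vectors of the data.
    _, s, vt = np.linalg.svd(z, full_matrices=False)
    # Project onto the first component: the linear combination that
    # captures the greatest variation in the original indicators.
    return z @ vt[0]

rng = np.random.default_rng(0)
# Illustrative stand-ins for daily turnover, position, return, drawdown.
data = rng.normal(size=(250, 4))
index = sentiment_index(data)
print(index.shape)  # one index value per trading day
```

The resulting one-dimensional series can then be plotted against the market index, as the paper does with the A-share benchmark.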
118
832
0.834892
eng_Latn
0.981415
eedad686ee79ecb2e38675a4e0da583fc3def03f
85
md
Markdown
README.md
jthelin/HelloRayActors
ae3ccab8c52d919b6fa5f3823a750b22c63c3341
[ "MIT" ]
1
2019-08-31T01:45:40.000Z
2019-08-31T01:45:40.000Z
README.md
jthelin/HelloRayActors
ae3ccab8c52d919b6fa5f3823a750b22c63c3341
[ "MIT" ]
null
null
null
README.md
jthelin/HelloRayActors
ae3ccab8c52d919b6fa5f3823a750b22c63c3341
[ "MIT" ]
null
null
null
# HelloRayActors

A simple example of distributed actors using ray - http://rllib.io/
28.333333
67
0.776471
eng_Latn
0.951986
eedcaf3815b5116820ca6fc2277c4e7be6d4c6c1
2,056
md
Markdown
webapp/nodejs/node_modules/quick-format-unescaped/readme.md
tatsumack/isu8q
459eacf1c6aef192aef3731b158c95473c923d22
[ "MIT" ]
6
2018-06-29T17:43:53.000Z
2021-12-29T13:00:41.000Z
node_modules/quick-format-unescaped/readme.md
Will-create/acg-flow
51943c930fdc59252d9359b624de96ee1f5abf06
[ "MIT" ]
89
2020-03-13T10:08:30.000Z
2020-12-10T19:56:00.000Z
node_modules/quick-format-unescaped/readme.md
Will-create/acg-flow
51943c930fdc59252d9359b624de96ee1f5abf06
[ "MIT" ]
3
2020-04-28T13:50:12.000Z
2020-12-31T13:12:25.000Z
# quick format unescaped

Solves a problem with util.format

## unescaped ?

Sometimes you want to embed the results of quick-format into another string, and then escape the whole string.

## usage

```js
var format = require('quick-format')
var options = {lowres: false} // <-- default
format(['hello %s %j %d', 'world', {obj: true}, 4, {another: 'obj'}], options)
```

## options

### lowres

Passing an options object with `lowres: true` will cause quick-format to serialize any object containing a circular reference as the string '"[Circular]"'. The default behaviour is to label circular references within an object, instead of abandoning the entire object. Naturally, `lowres` is a faster mode, and assumes you have made the decision to ensure the objects you're passing have no circular references.

## caveats

We use `JSON.stringify` instead of `util.inspect`, which means object methods (functions) *will not be serialized*.

## util.format

In `util.format` for Node 5.9, performance is significantly affected when we pass in more arguments than interpolation characters, e.g.

```js
util.format('hello %s %j %d', 'world', {obj: true}, 4, {another: 'obj'})
```

This is mostly due to the use of `util.inspect`. We use `JSON.stringify` (safely) instead, which is significantly faster. It also takes an array instead of arguments, which helps us avoid the use of `apply` in some cases. Also, for speed purposes, we ignore symbols.

## Benchmarks

Whilst exact matching of objects to interpolation characters is slower, the case of additional objects is 3x faster. Further, using `lowres` mode brings us closer to `util.inspect` speeds.
```
util*100000: 205.978ms
quickLowres*100000: 236.337ms
quick*100000: 292.018ms
utilWithTailObj*100000: 1054.592ms
quickWithTailObjLowres*100000: 267.992ms
quickWithTailObj*100000: 343.048ms
util*100000: 212.011ms
quickLowres*100000: 226.441ms
quick*100000: 296.600ms
utilWithTailObj*100000: 1020.195ms
quickWithTailObjLowres*100000: 267.331ms
quickWithTailObj*100000: 343.867ms
```

## Acknowledgements

Sponsored by nearForm
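The circular-reference labeling that the default (non-`lowres`) mode performs can be sketched with a replacer-based `JSON.stringify` wrapper. This is an illustration of the technique, not the library's actual implementation, and the simplified "seen once" check shown here would also over-label shared but non-circular references.

```javascript
// Sketch: stringify that labels circular references instead of throwing.
function safeStringify(obj) {
  const seen = new WeakSet();
  return JSON.stringify(obj, (key, value) => {
    if (typeof value === 'object' && value !== null) {
      // Already visited: replace with a label rather than recursing forever.
      if (seen.has(value)) return '[Circular]';
      seen.add(value);
    }
    return value;
  });
}

const a = { name: 'root' };
a.self = a; // introduce a cycle
console.log(safeStringify(a)); // {"name":"root","self":"[Circular]"}
```

A plain `JSON.stringify(a)` would throw `TypeError: Converting circular structure to JSON` on the same input, which is the "abandoning the entire object" behaviour the readme contrasts against.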
28.164384
173
0.753405
eng_Latn
0.989244
eedcddc26750959154aecaa0b522909254edf470
749
md
Markdown
README.md
josephscott/burgontan
49b64c7a0498b4b501b1d9fb5fc70a1557c42587
[ "MIT" ]
null
null
null
README.md
josephscott/burgontan
49b64c7a0498b4b501b1d9fb5fc70a1557c42587
[ "MIT" ]
3
2021-05-14T01:31:57.000Z
2021-06-06T05:08:47.000Z
README.md
josephscott/burgontan
49b64c7a0498b4b501b1d9fb5fc70a1557c42587
[ "MIT" ]
null
null
null
# burgontan

A blog-focused theme for Hugo.

## Expected Config Items

```
[markup.goldmark.renderer]
unsafe = true
```

## Optional Config Items

```
copyright = "© Copyright {year} Your Name Here"
```

### Menus

```
[[menu.main]]
name = "About"
url = "/about/"

[[menu.main]]
name = "Tags"
url = "/tags/"

[[menu.main]]
name = "Archive"
url = "/archive/"
```

## Optional Partial Templates

```
site-header.html
site-footer.html
home.html
```

## Optional Pages

**/about**

Create a page in `content/about.md` and use `layout: staticpage`.

**/archive**

Create a page in `content/archive.md` and use this front matter:

```
---
title: "Posts Archive"
layout: archive
hidden: true
---
```

## Front Matter

```
hidden: true
```
12.694915
65
0.619493
eng_Latn
0.343995
eedd127ebc86383e19198a8503e0e1608d42796f
1,263
md
Markdown
os/README.md
tpuschel/duality
7761417077ad4c31e7ec7916b0ae54cadc169835
[ "MIT" ]
1
2020-04-28T14:35:51.000Z
2020-04-28T14:35:51.000Z
os/README.md
tpuschel/duality
7761417077ad4c31e7ec7916b0ae54cadc169835
[ "MIT" ]
1
2020-07-19T20:16:58.000Z
2020-07-19T20:16:58.000Z
os/README.md
tpuschel/duality
7761417077ad4c31e7ec7916b0ae54cadc169835
[ "MIT" ]
null
null
null
# Duality OS

Duality OS is a small, experimental OS with the eventual goal of bootstrapping just enough to launch a Duality REPL.

Requires support for UEFI, and currently only runs on x64.

## Building

Duality OS is designed to be built with clang and lld-link.

To create the PE32+ executable as required by UEFI, run:

`clang -target x86_64-unknown-windows -ffreestanding -fuse-ld=lld-link -nostdlib -O2 -mno-red-zone -Wl,/subsystem:efi_application,/entry:boot boot.c -o BOOTx64.EFI`

## Creating an image

To be able to boot Duality OS, we need to create an image with a FAT32 filesystem and /EFI/BOOT/BOOTx64.EFI as the folder structure.

On macOS, creating an image can be done like this:

`mkdir -p image/EFI/BOOT`

`cp BOOTx64.EFI image/EFI/BOOT/.`

`hdiutil create -srcfolder image -fs FAT32 -volname "Duality OS" -format RdWr duality-os`

If `duality-os.img` already exists, append `-ov` to the above hdiutil command.

## Booting

The image can run virtualized with something like [QEMU](https://www.qemu.org), or be copied to a USB drive to run on real hardware. For QEMU, a UEFI firmware like [OVMF](https://github.com/tianocore/tianocore.github.io/wiki/OVMF) is needed.

To start QEMU, run:

`qemu-system-x86_64 --bios <OVMF file> duality-os.img`
32.384615
164
0.753761
eng_Latn
0.95711
eeddfdea390974411e0dbb56f67b3ccdd2d9b1aa
46,572
md
Markdown
data/post/estate-autunno-eventi.md
moebiusmania/salvatorelaisa.me
2c70e417eb68659b0033e93456f245596d62a22f
[ "MIT" ]
4
2022-02-23T21:48:30.000Z
2022-02-25T22:23:38.000Z
data/post/estate-autunno-eventi.md
moebiusmania/salvatorelaisa.me
2c70e417eb68659b0033e93456f245596d62a22f
[ "MIT" ]
142
2019-09-16T04:43:44.000Z
2022-02-19T12:09:46.000Z
data/post/estate-autunno-eventi.md
moebiusmania/salvatorelaisa.me
2c70e417eb68659b0033e93456f245596d62a22f
[ "MIT" ]
null
null
null
---
title: "Between summer and autumn there's always plenty going on!"
date: '2016-09-04'
tags: ['eventi']
draft: false
summary: "Now that September has begun and tomorrow brings the first real working week after August, I'm starting to look at the calendar and note down a few things; in my sphere of interests this is always a period full of interesting events!"
images:
  [
    'https://images.unsplash.com/photo-1506899686410-4670690fccef?ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&ixlib=rb-1.2.1&auto=format&fit=crop&w=1200&q=80',
  ]
---

<script async src="//www.instagram.com/embed.js"></script>

[![Photo by Aaron Burden on Unsplash](https://images.unsplash.com/photo-1506899686410-4670690fccef?ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&ixlib=rb-1.2.1&auto=format&fit=crop&w=1200&q=80)](https://unsplash.com/photos/Zl8zGdnNcP8)

<small>_Photo by Aaron Burden on Unsplash_</small>

Now that September has begun and tomorrow brings the first real working week after August, I'm starting to look at the calendar and note down a few things; in my sphere of interests this is always a period full of interesting events!

## Festa del Ticino <small>(September 10-11, Pavia)</small>

For me, **summer truly ends** with Pavia's Festa del Ticino and its fireworks show, which this year will be on September 10; I've [already covered it in a separate post](https://salvatorelaisa.blog/post/fuochi-fine-estate/), so I won't repeat myself.
<blockquote className="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/7TiCw2yVgC/?utm_source=ig_embed&utm_campaign=loading" data-instgrm-version={13}><a href="https://www.instagram.com/p/7TiCw2yVgC/?utm_source=ig_embed&utm_campaign=loading" target="_blank">A post shared by Gianluca Barbetta (@lukeskybarbet)</a></blockquote>

## Autunno Pavese <small>(September 23-26, Pavia)</small>

It's not an [event](http://www.autunnopavesedoc.it/) I attend every year; it tends to always be the same... but since it was moved to the Castle of Pavia I'm more inclined to drop by, given the right evening and the right company: **risotto and wine** galore!
<blockquote className="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/8Gw_CFswPn/?utm_source=ig_embed&utm_campaign=loading" data-instgrm-version={13}><a href="https://www.instagram.com/p/8Gw_CFswPn/?utm_source=ig_embed&utm_campaign=loading" target="_blank">A post shared by Avangart (@avangart_arte)</a></blockquote>

## Radiant <small>(October 1-2, Novegro)</small>

I've [more or less already covered this event in another post](https://salvatorelaisa.blog/post/hdd-vs-cloud/). I don't know whether I'll drop by or not, but knowing it's there is always an extra guarantee, in case I find myself with nothing to do and the urge to throw some money at cheap **tech** gadgets.

## BirrArt <small>(October 13-16, Casteggio)</small>

For [8 years](http://www.birrart.org/) now, Casteggio has been using this event to pair **beer** with its fame as a city of wines. I won't hide that, personally, the fair is much less than what it's made out to be in terms of quality and selection, but I like to think of it as the last beer festival before winter.
<blockquote className="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/BDtpqshmuAV/?utm_source=ig_embed&utm_campaign=loading" data-instgrm-version={13} style={{background: '#FFF', border: 0, borderRadius: '3px', boxShadow: '0 0 1px 0 rgba(0,0,0,0.5),0 1px 10px 0 rgba(0,0,0,0.15)', margin: '1px', maxWidth: '540px', minWidth: '326px', padding: 0, width: 'calc(100% - 2px)'}}><div style={{padding: '16px'}}> <a href="https://www.instagram.com/p/BDtpqshmuAV/?utm_source=ig_embed&utm_campaign=loading" style={{background: '#FFFFFF', lineHeight: 0, padding: '0 0', textAlign: 'center', textDecoration: 'none', width: '100%'}} target="_blank"> <div style={{display: 'flex', flexDirection: 'row', alignItems: 'center'}}> <div style={{backgroundColor: '#F4F4F4', borderRadius: '50%', flexGrow: 0, height: '40px', marginRight: '14px', width: '40px'}} /> <div style={{display: 'flex', flexDirection: 'column', flexGrow: 1, justifyContent: 'center'}}> <div style={{backgroundColor: '#F4F4F4', borderRadius: '4px', flexGrow: 0, height: '14px', marginBottom: '6px', width: '100px'}} /> <div style={{backgroundColor: '#F4F4F4', borderRadius: '4px', flexGrow: 0, height: '14px', width: '60px'}} /></div></div><div style={{padding: '19% 0'}} /> <div style={{display: 'block', height: '50px', margin: '0 auto 12px', width: '50px'}}><svg width="50px" height="50px" viewBox="0 0 60 60" version="1.1" xmlns="https://www.w3.org/2000/svg" xmlnsXlink="https://www.w3.org/1999/xlink"><g stroke="none" strokeWidth={1} fill="none" fillRule="evenodd"><g transform="translate(-511.000000, -20.000000)" fill="#000000"><g><path d="M556.869,30.41 C554.814,30.41 553.148,32.076 553.148,34.131 C553.148,36.186 554.814,37.852 556.869,37.852 C558.924,37.852 560.59,36.186 560.59,34.131 C560.59,32.076 558.924,30.41 556.869,30.41 M541,60.657 C535.114,60.657 530.342,55.887 530.342,50 C530.342,44.114 535.114,39.342 541,39.342 C546.887,39.342 551.658,44.114 551.658,50 C551.658,55.887 
546.887,60.657 541,60.657 M541,33.886 C532.1,33.886 524.886,41.1 524.886,50 C524.886,58.899 532.1,66.113 541,66.113 C549.9,66.113 557.115,58.899 557.115,50 C557.115,41.1 549.9,33.886 541,33.886 M565.378,62.101 C565.244,65.022 564.756,66.606 564.346,67.663 C563.803,69.06 563.154,70.057 562.106,71.106 C561.058,72.155 560.06,72.803 558.662,73.347 C557.607,73.757 556.021,74.244 553.102,74.378 C549.944,74.521 548.997,74.552 541,74.552 C533.003,74.552 532.056,74.521 528.898,74.378 C525.979,74.244 524.393,73.757 523.338,73.347 C521.94,72.803 520.942,72.155 519.894,71.106 C518.846,70.057 518.197,69.06 517.654,67.663 C517.244,66.606 516.755,65.022 516.623,62.101 C516.479,58.943 516.448,57.996 516.448,50 C516.448,42.003 516.479,41.056 516.623,37.899 C516.755,34.978 517.244,33.391 517.654,32.338 C518.197,30.938 518.846,29.942 519.894,28.894 C520.942,27.846 521.94,27.196 523.338,26.654 C524.393,26.244 525.979,25.756 528.898,25.623 C532.057,25.479 533.004,25.448 541,25.448 C548.997,25.448 549.943,25.479 553.102,25.623 C556.021,25.756 557.607,26.244 558.662,26.654 C560.06,27.196 561.058,27.846 562.106,28.894 C563.154,29.942 563.803,30.938 564.346,32.338 C564.756,33.391 565.244,34.978 565.378,37.899 C565.522,41.056 565.552,42.003 565.552,50 C565.552,57.996 565.522,58.943 565.378,62.101 M570.82,37.631 C570.674,34.438 570.167,32.258 569.425,30.349 C568.659,28.377 567.633,26.702 565.965,25.035 C564.297,23.368 562.623,22.342 560.652,21.575 C558.743,20.834 556.562,20.326 553.369,20.18 C550.169,20.033 549.148,20 541,20 C532.853,20 531.831,20.033 528.631,20.18 C525.438,20.326 523.257,20.834 521.349,21.575 C519.376,22.342 517.703,23.368 516.035,25.035 C514.368,26.702 513.342,28.377 512.574,30.349 C511.834,32.258 511.326,34.438 511.181,37.631 C511.035,40.831 511,41.851 511,50 C511,58.147 511.035,59.17 511.181,62.369 C511.326,65.562 511.834,67.743 512.574,69.651 C513.342,71.625 514.368,73.296 516.035,74.965 C517.703,76.634 519.376,77.658 521.349,78.425 C523.257,79.167 525.438,79.673 
528.631,79.82 C531.831,79.965 532.853,80.001 541,80.001 C549.148,80.001 550.169,79.965 553.369,79.82 C556.562,79.673 558.743,79.167 560.652,78.425 C562.623,77.658 564.297,76.634 565.965,74.965 C567.633,73.296 568.659,71.625 569.425,69.651 C570.167,67.743 570.674,65.562 570.82,62.369 C570.966,59.17 571,58.147 571,50 C571,41.851 570.966,40.831 570.82,37.631" /></g></g></g></svg></div><div style={{paddingTop: '8px'}}> <div style={{color: '#3897f0', fontFamily: 'Arial,sans-serif', fontSize: '14px', fontStyle: 'normal', fontWeight: 550, lineHeight: '18px'}}> View this post on Instagram</div></div><div style={{padding: '12.5% 0'}} /> <div style={{display: 'flex', flexDirection: 'row', marginBottom: '14px', alignItems: 'center'}}><div> <div style={{backgroundColor: '#F4F4F4', borderRadius: '50%', height: '12.5px', width: '12.5px', transform: 'translateX(0px) translateY(7px)'}} /> <div style={{backgroundColor: '#F4F4F4', height: '12.5px', transform: 'rotate(-45deg) translateX(3px) translateY(1px)', width: '12.5px', flexGrow: 0, marginRight: '14px', marginLeft: '2px'}} /> <div style={{backgroundColor: '#F4F4F4', borderRadius: '50%', height: '12.5px', width: '12.5px', transform: 'translateX(9px) translateY(-18px)'}} /></div><div style={{marginLeft: '8px'}}> <div style={{backgroundColor: '#F4F4F4', borderRadius: '50%', flexGrow: 0, height: '20px', width: '20px'}} /> <div style={{width: 0, height: 0, borderTop: '2px solid transparent', borderLeft: '6px solid #f4f4f4', borderBottom: '2px solid transparent', transform: 'translateX(16px) translateY(-4px) rotate(30deg)'}} /></div><div style={{marginLeft: 'auto'}}> <div style={{width: '0px', borderTop: '8px solid #F4F4F4', borderRight: '8px solid transparent', transform: 'translateY(16px)'}} /> <div style={{backgroundColor: '#F4F4F4', flexGrow: 0, height: '12px', width: '16px', transform: 'translateY(-4px)'}} /> <div style={{width: 0, height: 0, borderTop: '8px solid #F4F4F4', borderLeft: '8px solid transparent', transform: 
'translateY(-4px) translateX(8px)'}} /></div></div> <div style={{display: 'flex', flexDirection: 'column', flexGrow: 1, justifyContent: 'center', marginBottom: '24px'}}> <div style={{backgroundColor: '#F4F4F4', borderRadius: '4px', flexGrow: 0, height: '14px', marginBottom: '6px', width: '224px'}} /> <div style={{backgroundColor: '#F4F4F4', borderRadius: '4px', flexGrow: 0, height: '14px', width: '144px'}} /></div></a><p style={{color: '#c9c8cd', fontFamily: 'Arial,sans-serif', fontSize: '14px', lineHeight: '17px', marginBottom: 0, marginTop: '8px', overflow: 'hidden', padding: '8px 0 7px', textAlign: 'center', textOverflow: 'ellipsis', whiteSpace: 'nowrap'}}><a href="https://www.instagram.com/p/BDtpqshmuAV/?utm_source=ig_embed&utm_campaign=loading" style={{color: '#c9c8cd', fontFamily: 'Arial,sans-serif', fontSize: '14px', fontStyle: 'normal', fontWeight: 'normal', lineHeight: '17px', textDecoration: 'none'}} target="_blank">A post shared by Marianna ™ (@here.comes.mary)</a></p></div></blockquote>

## Games Week <small>(14-16 October, Milan)</small>

After spending a few years between the late '90s and the early 2000s attending SMAU mainly to see the **latest video games**, there is finally [a fair dedicated solely to them](http://www.milangamesweek.it/)! Occupying a single hall of the Milan fairgrounds, it allows a complete, unhurried tour of the stands in a generous hour. I never queue up to try video games in preview: those are hours-long lines for titles coming out shortly. For me the real strong points are the corners dedicated to **retrogaming** and the independent developers.
<blockquote className="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/BAft93twNIO/?utm_source=ig_embed&utm_campaign=loading" data-instgrm-version={13}><a href="https://www.instagram.com/p/BAft93twNIO/?utm_source=ig_embed&utm_campaign=loading" target="_blank">A post shared by Andrea Arwain Caleo (@caleo_andrea)</a></blockquote>

## Next Vintage <small>(14-17 October, Belgioioso)</small>

The castle of Belgioioso is always an excellent setting for events, and Next Vintage is a chance to rummage through knick-knacks from the 60s/70s; although the show is mainly centred on clothing, every now and then a little gem turns up.
<blockquote className="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/88DiuyPmdl/?utm_source=ig_embed&utm_campaign=loading" data-instgrm-version={13}><a href="https://www.instagram.com/p/88DiuyPmdl/?utm_source=ig_embed&utm_campaign=loading" target="_blank">A post shared by Viviana Demicco (@vivianademicco)</a></blockquote>

## TEDx <small>(18 October, Milan)</small>

The [independent **TED** events](www.tedxmilano.it) return to Milan! This time I did not let myself be caught unprepared and I have already bought my ticket. And to get ready I am watching the recording of the previous edition:

<iframe width="100%" height="515" src="https://www.youtube.com/embed/ZbPR7lkZyTQ?list=PLsRNoUx8w3rPflea6YPxnaw-iUpuqEwvf" frameBorder="0" allowFullScreen></iframe>

## Comics & Games <small>(28 October - 1 November, Lucca)</small>

My grand finale of this marathon of autumn events is the by-now legendary **[Lucca Comics & Games](http://www.luccacomicsandgames.com/it/2016/home/)**, which needs no introduction. Unfortunately I had to skip it last year, so this year I have one more good reason not to miss it.
In recent years the days of Lucca Comics have also been very warm and sunny for early November, but regardless of the weather nothing is more extraordinary than a town like Lucca invaded by the most imaginative and curious cosplayers and comics fans from all over Europe!

<blockquote className="instagram-media" data-instgrm-captioned data-instgrm-permalink="https://www.instagram.com/p/BKVuoPJA4kP/?utm_source=ig_embed&utm_campaign=loading" data-instgrm-version={13}><a href="https://www.instagram.com/p/BKVuoPJA4kP/?utm_source=ig_embed&utm_campaign=loading" target="_blank">A post shared by Lucca Comics &amp; Games (@luccacomicsandgames)</a></blockquote>
695.104478
7,009
0.71077
kor_Hang
0.058908
eede59f8ef77e105bb1fd3b949bd396c463622f1
1,767
md
Markdown
articles/cosmos-db/sql-query-aggregate-count.md
MicrosoftDocs/azure-docs.hu-hu
5fb082c5dae057fd040c7e09881e6c407e535fe2
[ "CC-BY-4.0", "MIT" ]
7
2017-08-28T07:44:33.000Z
2021-04-20T21:12:50.000Z
articles/cosmos-db/sql-query-aggregate-count.md
MicrosoftDocs/azure-docs.hu-hu
5fb082c5dae057fd040c7e09881e6c407e535fe2
[ "CC-BY-4.0", "MIT" ]
412
2018-07-25T09:31:03.000Z
2021-03-17T13:17:45.000Z
articles/cosmos-db/sql-query-aggregate-count.md
MicrosoftDocs/azure-docs.hu-hu
5fb082c5dae057fd040c7e09881e6c407e535fe2
[ "CC-BY-4.0", "MIT" ]
13
2017-09-05T09:10:35.000Z
2021-11-05T11:42:31.000Z
---
title: COUNT in the Azure Cosmos DB query language
description: Learn about the COUNT SQL system function in Azure Cosmos DB.
author: timsander1
ms.service: cosmos-db
ms.subservice: cosmosdb-sql
ms.topic: conceptual
ms.date: 12/02/2020
ms.author: tisande
ms.custom: query-reference
ms.openlocfilehash: 5228558f4bcb146ec08ee5fff45fb1bdf4d56f01
ms.sourcegitcommit: f28ebb95ae9aaaff3f87d8388a09b41e0b3445b5
ms.translationtype: MT
ms.contentlocale: hu-HU
ms.lasthandoff: 03/29/2021
ms.locfileid: "96552378"
---
# <a name="count-azure-cosmos-db"></a>COUNT (Azure Cosmos DB)
[!INCLUDE[appliesto-sql-api](includes/appliesto-sql-api.md)]

This system function returns the count of the values in the expression.

## <a name="syntax"></a>Syntax

```sql
COUNT(<scalar_expr>)
```

## <a name="arguments"></a>Arguments

*scalar_expr* Any scalar expression

## <a name="return-types"></a>Return types

Returns a numeric expression.

## <a name="examples"></a>Examples

The following example returns the total number of items in a container:

```sql
SELECT COUNT(1) FROM c
```

COUNT can take any scalar expression as input. The query below will produce an equivalent result:

```sql
SELECT COUNT(2) FROM c
```

## <a name="remarks"></a>Remarks

This system function benefits from a [range index](index-policy.md#includeexclude-strategy) on any properties in the query's filter.

## <a name="next-steps"></a>Next steps

- [Mathematical functions in Azure Cosmos DB](sql-query-mathematical-functions.md)
- [System functions in Azure Cosmos DB](sql-query-system-functions.md)
- [Aggregate functions in Azure Cosmos DB](sql-query-aggregate-functions.md)
28.967213
140
0.766271
hun_Latn
0.997786
eedeb12976b2ae6a58d9232d38224a3f888abf5f
1,263
md
Markdown
zpersonal/gary/2022/2022.md
garysdevil/Finance
b5812023c01f0e54b14815f4efd2719fd10e1af5
[ "MIT" ]
null
null
null
zpersonal/gary/2022/2022.md
garysdevil/Finance
b5812023c01f0e54b14815f4efd2719fd10e1af5
[ "MIT" ]
null
null
null
zpersonal/gary/2022/2022.md
garysdevil/Finance
b5812023c01f0e54b14815f4efd2719fd10e1af5
[ "MIT" ]
null
null
null
## Execution

### My guesses

1. FIL in 2022: valuation + sentiment; at some moment the price exceeds $150

### Big cycle / yearly

1. Gateway-type project BICO: buy 500 USDT worth when it is below $1
2. Gateway-type project with an inflationary token, POKT: below $0.5
3. Block-data indexing: GRT
4. Block-data indexing: CQT
5. ATOM

### Me

- Hot sectors for 2022
- DAO
- Privacy

## Information

### Chia

- 2021/05/26: According to market reports, blockchain startup Chia, valued at $500 million, plans an IPO
- Wall Street investment banks have valued it at $20 billion
- Total funding raised: $61 million
- Official reserve: 21 million coins; mining produces 39 million coins over 10 years
- Valuation: 50000/2100 = $23.8 per coin
- Valuation: 2000000/2100 = $952.3 per coin

### GRT

- Valuation: https://zhuanlan.zhihu.com/p/338428634
- GRT valued at $1.

### FIL

- On 2020-12-10, daily token consumption had climbed to 180,000 FIL/day, a sign of a booming economy.
- https://htzkw.com/article/12745.html
- 128T miner (permanent product, mines for at least 5 years): miner price 176,000 RMB; hosting/annual maintenance fee 7,680 RMB/year + 20% of the mined FIL; total cost about 183,700 RMB.
- 128T miner priced at $27,936. $29,158
- When the coin price is about $155, daily revenue is $1,442; when it is about $20, daily revenue is $186; payback in 156 days.

### BTC

- https://zhuanlan.zhihu.com/p/105136004
- Electricity: 0.2~0.4 RMB/kWh
- Antminer S17: 70 TH/s, 3 kW power draw; miner price: ¥12,595.00
- Taking the Antminer S17 as an example, as of 2019-12-16 the network hashrate is 90.19 EH with a +1.54% difficulty adjustment: 53T ÷ (90.19 × 1,000,000 T) × 1809.25 (total BTC per day) = 0.001063 BTC
- https://htzkw.com/article/14054.html
- Antminer S19j Pro: 100 TH/s, 3050 W, 29.5 J/TH, ~93,000 RMB
- $14,761
- Daily block rewards: 230.4 coins

```python
# Total BTC awarded network-wide per day (blocks per day x 6.25 BTC)
award = 60 * 24 / 10 * 6.25
# Daily BTC earned by 100 TH/s (network hashrate taken as 198.37)
daily_btc = 0.00001 / 198.37 * award
# Daily profit in USD
price = 37000
daily_profit = daily_btc * price
```

- https://www.youtube.com/watch?v=mLdh0cO5zUQ
18.304348
72
0.686461
yue_Hant
0.713161
eedecbe6cf7d45fb2930b5d80e50584d3d77650b
514
md
Markdown
contests/info/586/README.md
kgjamieson/NEXT_data
7cbe8080b441fc91e2e8198ec47c750e6517f83f
[ "CC-BY-4.0", "BSD-3-Clause" ]
62
2016-04-12T14:00:47.000Z
2022-03-18T19:09:47.000Z
contests/info/586/README.md
kgjamieson/NEXT_data
7cbe8080b441fc91e2e8198ec47c750e6517f83f
[ "CC-BY-4.0", "BSD-3-Clause" ]
28
2016-02-26T17:39:14.000Z
2022-02-10T00:24:51.000Z
contests/info/586/README.md
kgjamieson/NEXT_data
7cbe8080b441fc91e2e8198ec47c750e6517f83f
[ "CC-BY-4.0", "BSD-3-Clause" ]
12
2017-10-20T08:46:04.000Z
2021-12-25T01:35:12.000Z
![](info.png) Cardinal bandits (aka "how funny is this caption?") Histogram of when people responded: ![](histogram.png) Example query: ![](example_query.png) This caption contest was also live on the New Yorker caption contest page (reloading, ads). This week, for all captions that had at least one duplicate caption, I included a single, exact duplicate of the original caption for comparison purposes. These duplicate captions are listed in `{contest}_repeat_captions.csv` (which is newline delimited).
25.7
79
0.772374
eng_Latn
0.998788
eeded86994280df778ca27cf57e90b29c96d06a7
967
md
Markdown
_datasets/north+ayrshire+council-consultant+expenditure.md
OpenDataScotland/jkan
0b700981fc69e42495b0e6ab4de006dac3181164
[ "MIT" ]
2
2021-11-28T14:27:28.000Z
2021-11-28T20:27:20.000Z
_datasets/north+ayrshire+council-consultant+expenditure.md
OpenDataScotland/jkan
0b700981fc69e42495b0e6ab4de006dac3181164
[ "MIT" ]
null
null
null
_datasets/north+ayrshire+council-consultant+expenditure.md
OpenDataScotland/jkan
0b700981fc69e42495b0e6ab4de006dac3181164
[ "MIT" ]
null
null
null
--- category: - Uncategorised date_created: '' date_updated: '2019-09-12' license: No licence maintainer: North Ayrshire Council notes: <p>North Ayrshire Council Consultant Expenditure</p> organization: North Ayrshire Council original_dataset_link: https://maps-north-ayrshire.opendata.arcgis.com/maps/north-ayrshire::consultant-expenditure records: null resources: - format: ARCGIS GEOSERVICE name: ARCGIS GEOSERVICE url: https://www.maps.north-ayrshire.gov.uk/arcgis/rest/services/AGOL/Open_Data_Portal3/MapServer/29 - format: GEOJSON name: GEOJSON url: https://maps-north-ayrshire.opendata.arcgis.com/datasets/north-ayrshire::consultant-expenditure.geojson?outSR=%7B%22latestWkid%22%3A27700%2C%22wkid%22%3A27700%7D - format: CSV name: CSV url: https://maps-north-ayrshire.opendata.arcgis.com/datasets/north-ayrshire::consultant-expenditure.csv?outSR=%7B%22latestWkid%22%3A27700%2C%22wkid%22%3A27700%7D schema: default title: Consultant Expenditure ---
38.68
168
0.79938
yue_Hant
0.141747
eedfa97ea16210ac4b3cbc0e29151ba24a5e30d8
33,352
md
Markdown
ref_code/cjdns/admin/README.md
krattai/noo-ebs
00f67fd8f25b7c1eadf7348245670cb0ac911184
[ "BSD-2-Clause" ]
2
2015-04-07T14:37:24.000Z
2015-11-06T00:31:01.000Z
ref_code/cjdns/admin/README.md
krattai/noo-ebs
00f67fd8f25b7c1eadf7348245670cb0ac911184
[ "BSD-2-Clause" ]
null
null
null
ref_code/cjdns/admin/README.md
krattai/noo-ebs
00f67fd8f25b7c1eadf7348245670cb0ac911184
[ "BSD-2-Clause" ]
null
null
null
# Cjdns Admin API

Cjdns is inspected and configured through a UDP socket.
When cjdroute starts up, it reads the configuration file and spawns cjdns core. The core knows nothing but the port which it should bind to and the private key which it should use. All other information, such as peers, interfaces and passwords, is given to the core through the admin UDP interface. When cjdroute is finished setting up the core, it exits, leaving the core running in the background. You can call all of the functions which are called by cjdroute to collect information and alter the core's configuration.

## How a function works

To call a function you send a UDP packet containing a bencoded request to the core and it sends back a bencoded response.

    echo -n 'd1:q4:pinge' | nc6 -u -t 1 -n -w3 127.0.0.1 11234

If you are more comfortable writing JSON than benc, you can use benc2json in reverse mode to preprocess your message.

    echo '{ "q": "ping" }' | ./build/benc2json -r

Stream the request from JSON into benc and then make the request to the core:

    echo '{ "q": "ping" }' \
        | ./build/benc2json -r \
        | tr -d '\n' \
        | nc6 -u -t 1 -n -w3 127.0.0.1 11234

Get the result back into JSON:

    echo '{ "q": "ping" }' \
        | ./build/benc2json -r \
        | tr -d '\n' \
        | nc6 -u -t 1 -n -w3 127.0.0.1 11234 \
        | ./build/benc2json

## Transaction ID

Because you can send multiple messages at once, you may add a transaction ID to a message and it will be reflected back to you in the response.

    echo '{ "q": "ping", "txid": "my request" }' \
        | ./build/benc2json -r \
        | tr -d '\n' \
        | nc6 -u -t 1 -n -w3 127.0.0.1 11234 \
        | ./build/benc2json

Result:

    { "txid" : "my request", "q" : "pong" }

## Arguments

Some functions require arguments and others allow arguments but assume defaults if they are not provided. Arguments are sent to a function through a benc *dictionary* called `args`.
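For scripting, the same ping round-trip can be done directly over UDP without benc2json. This is an illustrative sketch, not part of cjdns itself; the `admin_ping` helper name is made up here, and the default address is the `127.0.0.1:11234` used in the examples.

```python
import socket

def admin_ping(host="127.0.0.1", port=11234, timeout=3.0):
    """Send the bencoded ping {"q": "ping"} and return the raw benc response."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        # d1:q4:pinge is the benc encoding of {"q": "ping"}
        sock.sendto(b"d1:q4:pinge", (host, port))
        data, _addr = sock.recvfrom(4096)
        return data  # a healthy core answers with {"q": "pong"}
    finally:
        sock.close()
```

A timeout is set because UDP gives no delivery guarantee; if the core is not running, `recvfrom` raises `socket.timeout` instead of blocking forever.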
The `Admin_availableFunctions()` function has an optional argument called `page`; this is because there are too many functions to be described in a single UDP packet. The following command will get the first page of functions from `Admin_availableFunctions` which will describe other functions and their required and allowed arguments. echo -n ' { "q": "Admin_availableFunctions", "args": { "page": 0 } }' | ./build/benc2json -r \ | tr -d '\n' \ | nc6 -u -t 1 -n -w3 127.0.0.1 11234 \ | ./build/benc2json ## Authentication Any function which changes the state of the cjdns core requires authentication to be carried out. Authentication is done on a per-request basis. Functions which don't require authentication can still be called with authentication and will still fail if the authentication is incorrect. * Step 1: Request a cookie from the server. * Step 2: Calculate the SHA-256 of the cookie and your admin password, place this hash and cookie in the request. * Step 3: Calculate the SHA-256 of the entire request with the hash and cookie added, replace the hash in the request with this result. Steps 1 and 2 securely bind the cookie to the password so that the password hash cannot be taken and used again in another request later on; step 3 binds the cookie and password to the request so that a man-in-the-middle cannot change the content of the request in flight. ### Anatomy of an authenticated request A plain request such as `{"q": "ping"}` becomes `{"q":"auth", "aq":"ping", "hash":<calculated hash>}`. The `q` field is moved to `aq` (authenticated query) and the `q` field says `auth`. **NOTE:** A cookie is only valid for 10 seconds so requesting and using a cookie must be done in the same script. **NOTE2:** Cookies are reusable *for now*; this is not part of the API and is considered a bug. You should always request a new cookie for each authenticated request, otherwise your code may be broken by changes in the future. 
### By example **Step 1:** Get the cookie RESP=`echo -n 'd1:q6:cookiee' | nc6 -u -t 1 -n -w3 127.0.0.1 11234` \ echo response=${RESP}; \ COOKIE=`echo ${RESP} | sed 's/d6:cookie10:\([0-9]*\)e/\1/'` \ echo cookie=${COOKIE}; **Step 2:** Calculate the hash of the cookie and password: For this step, you will need the admin password from your cjdroute.conf file; it's found inside the block which says `"admin": {`. ADMIN_PASS=you_will_find_this_in_your_cjdroute_dot_conf \ REQUEST='{"q": "auth", "aq": "ping", "hash": "__HASH__", "cookie": "__COOKIE__"}' \ COOKIE_RESP=`echo -n 'd1:q6:cookiee' | nc6 -u -t 1 -n -w3 127.0.0.1 11234` \ COOKIE=`echo ${COOKIE_RESP} | sed 's/d6:cookie10:\([0-9]*\)e/\1/'` \ HASH_ONE=`echo -n "${ADMIN_PASS}${COOKIE}" | sha256sum -b | cut -d\ -f1` ; \ REQ_ONE=`echo $REQUEST | sed -e "s/__HASH__/${HASH_ONE}/" -e "s/__COOKIE__/${COOKIE}/" \ | ./build/benc2json -r | tr -d '\n'` ; \ echo "hash of password and cookie is ${HASH_ONE}" ; \ echo "Request with cookie and hash added:" ; \ echo "${REQ_ONE}" ; \ echo "JSON version of request:" ; \ echo "${REQ_ONE}" | ./build/benc2json **Step 3:** Calculate the SHA-256 of the entire request and replace the one in the request: This will calculate the final request and send it to cjdns. 
ADMIN_PASS=you_will_find_this_in_your_cjdroute_dot_conf \ REQUEST='{"q": "auth", "aq": "ping", "hash": "__HASH__", "cookie": "__COOKIE__"}' \ COOKIE_RESP=`echo -n 'd1:q6:cookiee' | nc6 -u -t 1 -n -w3 127.0.0.1 11234` \ COOKIE=`echo ${COOKIE_RESP} | sed 's/d6:cookie10:\([0-9]*\)e/\1/'` \ HASH_ONE=`echo -n "${ADMIN_PASS}${COOKIE}" | sha256sum -b | cut -d\ -f1` \ REQ_ONE=`echo $REQUEST | sed -e "s/__HASH__/${HASH_ONE}/" -e "s/__COOKIE__/${COOKIE}/" \ | ./build/benc2json -r | tr -d '\n'` \ FINAL_HASH=`echo -n "$REQ_ONE" | sha256sum -b | cut -d\ -f1` \ FINAL_REQ=`echo $REQ_ONE | sed -e "s/${HASH_ONE}/${FINAL_HASH}/"` ; \ echo -n "$FINAL_REQ" \ | nc6 -u -t 1 -n -w3 127.0.0.1 11234 \ | ./build/benc2json If you see this: { "q" : "pong" } then it has succeeded; if the password is incorrect, you will see this: { "error" : "Auth failed." } ### Tools: Obviously using bash to craft cjdns admin RPC calls is probably the most awkward way possible; there are tools in cjdns/contrib which will help you craft requests. Specifically, there are libraries written in python and perl which will allow users to call cjdns internal functions as python/perl native functions. 
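For comparison, the same three authentication steps fit in a few lines of Python. This is a sketch only — `make_authed_ping` is a hypothetical helper, not part of any cjdns library, and step 1 (fetching the cookie from the admin socket) is assumed to have been done already:

```python
import hashlib

def make_authed_ping(password: str, cookie: str) -> bytes:
    """Build an authenticated ping request from a step-1 cookie (sketch)."""
    # Step 2: SHA-256 of password + cookie binds the password to this cookie.
    hash_one = hashlib.sha256((password + cookie).encode()).hexdigest()
    # Bencoded {"aq": "ping", "cookie": <cookie>, "hash": <hash>, "q": "auth"}
    # (benc dictionaries keep their keys in sorted order).
    req_one = (
        b"d2:aq4:ping"
        + b"6:cookie" + str(len(cookie)).encode() + b":" + cookie.encode()
        + b"4:hash64:" + hash_one.encode()
        + b"1:q4:authe"
    )
    # Step 3: SHA-256 of the whole request binds the hash to its content.
    # Both digests are 64 hex characters, so the length prefix stays valid.
    final_hash = hashlib.sha256(req_one).hexdigest()
    return req_one.replace(hash_one.encode(), final_hash.encode())
```

The resulting bytes are sent over the same UDP socket as the bash version (127.0.0.1:11234 in the examples above); the expected reply is the bencoded form of `{"q": "pong"}`.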
A tool called `cexec` is provided with the python library which allows you to call cjdns functions from shell scripts or the command line as follows: ./contrib/python/cexec 'ping()' ## Cjdns Functions: user@ubnta8:~/wrk/cjdns$ ./contrib/python/cexec 'functions()' | sort Admin_asyncEnabled() Admin_availableFunctions(page='') AdminLog_subscribe(line='', file=0, level=0) AdminLog_unsubscribe(streamId) AuthorizedPasswords_add(password, user, authType='') AuthorizedPasswords_list() AuthorizedPasswords_remove(user) Core_exit() Core_initTunnel(desiredTunName=0) ETHInterface_beacon(interfaceNumber='', state='') ETHInterface_beginConnection(publicKey, macAddress, interfaceNumber='', password=0) ETHInterface_new(bindDevice) InterfaceController_disconnectPeer(pubkey) InterfaceController_peerStats(page='') IpTunnel_allowConnection(publicKeyOfAuthorizedNode, ip6Address=0, ip4Address=0) IpTunnel_connectTo(publicKeyOfNodeToConnectTo) IpTunnel_listConnections() IpTunnel_removeConnection(connection) IpTunnel_showConnection(connection) memory() NodeStore_dumpTable(page) ping() RouterModule_lookup(address) RouterModule_pingNode(path, timeout='') Security_noFiles() Security_setUser(user) SwitchPinger_ping(path, data=0, timeout='') UDPInterface_beginConnection(publicKey, address, interfaceNumber='', password=0) UDPInterface_new(bindAddress=0) ### RouterModule_pingNode() **Auth Required** Send a node a cjdns ping request. Parameters: * required String **path** may be a route such as "0000.0000.0000.1d53" or an ip address such as "fc5d:baa5:61fc:6ffd:9554:67f0:e290:7536", or an ip with explicit path eg: "fc5d:baa5:61fc:6ffd:9554:67f0:e290:7536@0000.0000.0000.1d53" * Int **timeout** (optional) the number of milliseconds after which to timeout the ping if there is no response. Defaults to router's adaptive ping timeout if unspecified. 
Responses: * **error**: `could not find node to ping` if there was no node by the given address found in the routing table * **result**: `timeout` gives timeout and number of milliseconds since the ping. * **result**: `pong` gives `version` representing the git hash of the source code which built the pinged node, and `ms` which is the number of milliseconds since the original ping. Examples: >>> cjdns.RouterModule_pingNode('fc38:4c2c:1a8f:3981:f2e7:c2b9:6870:6e84') {'version': '5c5e84ccdba3f31f7c88077729700b4368320bc2', 'result': 'pong', 'ms': 79} >>> cjdns.RouterModule_pingNode('fc5d:baa5:61fc:6ffd:9554:67f0:e290:7536') {'error': 'could not find node to ping'} >>> cjdns.RouterModule_pingNode('0000.0000.0000.0013') {'version': '2b62b9ae911f1044e45f3f28fdd63d0d5a7fc512', 'result': 'pong', 'ms': 0} >>> cjdns.RouterModule_pingNode('a') {'error': "Unexpected length, must be either 39 char ipv6 address (with leading zeros) eg: 'fc4f:000d:e499:8f5b:c49f:6e6b:01ae:3120' or 19 char path eg: '0123.4567.89ab.cdef'"} >>> cjdns.RouterModule_pingNode('aaaaaaaaaaaaaaaaaaa') {'error': 'parse path failed'} >>> cjdns.RouterModule_pingNode('aaaaaaaaaaaaaaaaaaazzzzzzzzzzzzzzzzzzzz') {'error': 'parsing address failed'} >>> cjdns.RouterModule_pingNode('fc38:4c2c:1a8f:3981:f2e7:c2b9:6870:6e84', 10) {'result': 'timeout', 'ms': 10} ### ETHInterface Functions: ETHInterface is a connector which allows cjdns nodes on the same LAN to automatically connect without the need for IP addresses on the LAN or the sharing of connection credentials. It works on wireless LANs as well as wired ethernet LANs. #### ETHInterface_new() Create a new ETHInterface and bind it to a device. **NOTE**: this call will always fail with `'error': 'call to socket() failed. [permission denied]'` unless it is running as root and will fail with `process cannot open more files` if `Security_setUser()` has already been called. 
**Auth Required** Parameters: * required String **bindDevice** the name of the ethernet device to bind to, eg: `eth0` or `wlan0`. Returns: * Int **interfaceNumber** a number which can be used to carry out other operations on the interface later. #### ETHInterface_beginConnection() Connect an ETHInterface to another computer which has an ETHInterface running. **Auth Required** Parameters: * required String **publicKey** The public key of the other node, similar to `UDPInterface_beginConnection()` * required String **macAddress** The mac address of the other node. * Int **interfaceNumber** The interface number to use, assumed 0 (first ETHInterface created) if not supplied. * String **password** A password for connecting to the other node if required. Returns: * String **error**: `none` if everything went well. Other errors are self-explanatory. #### ETHInterface_beacon() Enable or disable sending or receiving of ETHInterface beacon messages. ETHInterface uses periodic beacon messages to automatically peer nodes which are on the same LAN. Be mindful that if your LAN is open wifi, enabling beaconing will allow anyone to peer with you. **Auth Required** Beacon States: 0. Disabled, no beacons are sent and incoming beacon messages are discarded. 1. Accepting, no beacons are sent but if an incoming beacon is received, it is acted upon. 2. Sending and Accepting, beacons are sent and accepted. Parameters: * Int **interfaceNumber** The number of the ETHInterface to change the state of, assumed 0 if not provided. * Int **state** What state to switch to, if not provided, the current state will be queried only. Returns: * String **error**: `none` if all went well. * Int **state**: the state number after the call is complete. * String **stateName**: a description of the state. 
Example: $ ./contrib/python/cexec 'ETHInterface_beacon(2)' {'txid': 'FYRKHAPIM3', 'error': 'invalid interfaceNumber'} $ ./contrib/python/cexec 'ETHInterface_beacon(0)' {'txid': 'Z7KHE7SZ5R', 'state': 2, 'stateName': 'sending and accepting', 'error': 'none'} $ ./contrib/python/cexec 'ETHInterface_beacon(0, 0)' {'txid': 'TP1R8PYCNS', 'state': 0, 'stateName': 'disabled', 'error': 'none'} $ ./contrib/python/cexec 'ETHInterface_beacon(0, 1)' {'txid': 'UGKKGX4ZC9', 'state': 1, 'stateName': 'accepting', 'error': 'none'} $ ./contrib/python/cexec 'ETHInterface_beacon(0, 2)' {'txid': '1B7RXJEH3N', 'state': 2, 'stateName': 'sending and accepting', 'error': 'none'} ### IpTunnel Functions IpTunnel is designed to allow tunneling of IPv4 and IPv6 packets through a cjdns network to the external internet or to a virtual LAN. It provides familiar VPN type functionality. There are 2 nodes, a client and a server: the server uses `IpTunnel_allowConnection()` and the client uses `IpTunnel_connectTo()`. The server assigns IPv4 and/or IPv6 addresses to the client and the client is required to use only these addresses; subnet assignment is not supported. When the client uses `IpTunnel_connectTo()`, it sends a request to the server for addresses and continues polling the server periodically until the addresses are provided. #### IpTunnel_listConnections() List the connection numbers of all IpTunnel connections. **Auth Required** Returns: * List **connections**: A list of integers representing the connection numbers for each connection. * String **error**: `none` Example: $ ./contrib/python/cexec 'IpTunnel_listConnections()' {'connections': [0], 'txid': '5ZFPFJ60AT', 'error': 'none'} #### IpTunnel_showConnection() Show information about a particular IpTunnel connection. **Auth Required** Parameters: * required Int **connection**: the connection number for the connection to show information about. Returns: * Int **outgoing**: 1 if the connection is outgoing, 0 if it's incoming. 
* String **key**: the cjdns public key of the foreign node. * String **ip6Address**: the IPv6 address which is assigned to this IpTunnel if applicable. * String **ip4Address**: the IPv4 address which is assigned to this IpTunnel if applicable. * String **error**: `none` unless the connection number is invalid. Examples: # Prior to getting its addresses from the server, they are not listed. $ ./contrib/python/cexec 'IpTunnel_showConnection(0)' {'outgoing': 1, 'txid': 'REIV40SXD9', 'key': 'd5d0wu0usrkufd8s98t19gt7m2ggvbz1xbnuxu82x63uqlnk2kb0.k', 'error': 'none'} # After a short wait, the addresses are provided and they are now listed. $ ./contrib/python/cexec 'IpTunnel_showConnection(0)' {'outgoing': 1, 'txid': 'CAQCTWECRD', 'ip4Address': '192.168.10.2', 'key': 'd5d0wu0usrkufd8s98t19gt7m2ggvbz1xbnuxu82x63uqlnk2kb0.k', 'error': 'none', 'ip6Address': '2a02:2498:e000:20::144:3'} #### IpTunnel_removeConnection() Remove an IpTunnel connection from the list; the other end will no longer be able to send traffic over this connection. **Auth Required** **NOT IMPLEMENTED** #### IpTunnel_connectTo() Initiate an *outgoing* connection to another node and request IP addresses from them. **Auth Required** Parameters: * required String **publicKeyOfNodeToConnectTo** the pubkey of the node to connect to. Returns: * String **error**: `none` if all went well * Int **connection**: the connection number of the new connection Examples: $ ./contrib/python/cexec 'IpTunnel_connectTo("d5d0wu0usrkufd8s98t19gt7m2ggvbz1xbnuxu82x63uqlnk2kb0.k")' {'connection': 1, 'txid': '9QXRQO1FG8', 'error': 'none'} #### IpTunnel_allowConnection() Allow an *incoming* connection from another node; they must also use `IpTunnel_connectTo()` in order to complete the connection. **Auth Required** Parameters: * required String **publicKeyOfAuthorizedNode** The key of the node which is authorized to connect. * String **ip6Address** The IPv6 address to give them if applicable. 
* String **ip4Address** The IPv4 address to give them if applicable. Returns: * String **error** `none` if all went well. * Int **connection** the connection number for the new connection. ### UDPInterface Functions UDPInterface is the basic cjdns interface which is used to link distant nodes over the internet. It will work on a LAN as long as the nodes have IP addresses but for linking on a LAN, ETHInterface is easier. #### UDPInterface_new() Create a new UDPInterface which is either bound to an address/port or not. **NOTE**: This call will fail with `'error': 'call to socket() failed [process cannot open more files]'` if `Security_noFiles()` has already been called. Parameters: * String **bindAddress**: the address/port to bind to, if unspecified, it is assumed to be `0.0.0.0`. Returns: * String **error** `none` if all went well * Int **interfaceNumber** the number of the interface, usable with `UDPInterface_beginConnection()` #### UDPInterface_beginConnection() Start a direct connection to another node. **Auth Required** Parameters: * required String **publicKey** the base32 public key for the node to connect to, ending in .k. * required String **address** the ip address and port for the node; at this time DNS resolution and IPv6 are not supported. * Int **interfaceNumber** the number for the UDPInterface to use for connecting, provided by *UDPInterface_new()*; if not sent, 0 is assumed. * String **password** a password to use when connecting. Note: just because it returns `'error': 'none'` does not mean that the connection was successful. The neighbor may still reject our connection attempts. 
Example: >>> cjdns.UDPInterface_beginConnection("v0zyvrjuc4xbzh4n9c4k3qpx7kg8xgndv2k45j9nfgb373m8sss0.k", "192.168.0.2:10000", "null") {'error': 'none'} >>> cjdns.UDPInterface_beginConnection("v0zyvrjuc4xbzh4n9c4k3qpx7kg8xgndv2k45j9nfgb373m8sss0.k", "x", "null") {'error': 'unable to parse ip address and port.'} >>> cjdns.UDPInterface_beginConnection("k", "x", "null") {'error': 'publicKey is too short, must be 52 characters long.'} >>> cjdns.UDPInterface_beginConnection("------------------------------------------------------", "x", "null") {'error': 'failed to parse publicKey.'} >>> cjdns.UDPInterface_beginConnection("zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz0.k", "192.168.0.2:10000", "null") {'error': 'invalid cjdns public key.'} >>> cjdns.UDPInterface_beginConnection("v0zyvrjuc4xbzh4n9c4k3qpx7kg8xgndv2k45j9nfgb373m8sss0.k", "[1234::5]:10000", "null") {'error': 'different address type than this socket is bound to.'} ### AdminLog Functions: Since cjdns contains so many logging locations, logging to a file would not only be inefficient but it would fill up your disk rather quickly. Because of this, cjdns logging is only enabled on request; with these functions you can ask for logs to be enabled on a log level, per-file or even per-line basis. Log levels may be excluded at compile time in which case they will not be available. Each log level implies inclusion of every higher level; if you subscribe to **INFO** logging, you will also automatically get **WARN**, **ERROR**, and **CRITICAL**. Cjdns log levels: * **KEYS** Not compiled in by default, contains private keys and other secret information. * **DEBUG** Default level, contains lots of information which is probably not useful unless you are diagnosing an ongoing problem. * **INFO** Shows starting and stopping of various components and general purpose information. * **WARN** Generally this means some system has undergone a minor failure, this includes failures due to network disturbance. 
* **ERROR** This means there was a (possibly temporary) failure of a system within cjdns. * **CRITICAL** This means something is broken such that the cjdns core will likely have to exit immediately. To see an implementation of a cjdns log consumer, look at `contrib/python/cjdnslog`. #### AdminLog_subscribe() Subscribe to logging of a level/file/line. **Auth Required** **NOTE**: Because this function responds asynchronously, using `netcat` or `cexec` to call it will not work; additionally it will stop sending asynchronous messages unless an incoming message comes in every 10 seconds, so you must send periodic messages on the same UDP port. See: `Admin_asyncEnabled()` for more information. Parameters: * Int **line**: If specified, the logging will be constrained to the log message which appears on the given line number in the source file. * String **file**: If specified, the logging will be constrained to the named file, names are not fully qualified, use "CryptoAuth.c", not "/path/to/CryptoAuth.c". * String **level**: If specified, the logging will be constrained to log lines which are of the given level or higher. Returns: * String **error**: `none` if all goes well. * String **streamId**: an opaque string which will be contained in each log message. Log message structure: * String **file** the name of the file where the log message came from, eg: "CryptoAuth.c". * String **level** the log level, one of `["KEYS", "DEBUG", "INFO", "WARN", "ERROR", "CRITICAL"]` * Int **line** the line number of the line where the log function was called. * String **message** the log message * String **streamId** the streamId for the logging subscription. * Int **time** the time in seconds since the unix epoch when the log message was created. * String **txid** the same transaction which was used in the call to `AdminLog_subscribe()`. #### AdminLog_unsubscribe() Unsubscribe from logging. 
**Auth Required** Parameters: * required String **streamId**: The id returned in the call to `AdminLog_subscribe()`. Returns: * String **error**: `none` if the subscription existed and was removed. **Note**: If the subscription has already timed out, removing it will yield `'error': 'No such subscription.'`. Example: $ ./contrib/python/cexec 'AdminLog_subscribe()' {'txid': '0EKWEP7VXI', 'streamId': 'f1a0e225183397f4', 'error': 'none'} $ ./contrib/python/cexec 'AdminLog_unsubscribe("f1a0e225183397f4")' {'txid': 'CB4V7KLYCC', 'error': 'none'} ### Admin Functions These functions are for dealing with the Admin interface, the infrastructure which allows all of the other functions throughout cjdns to be accessed from the admin socket. #### Admin_availableFunctions() Get a list of functions which are available to the admin socket as well as their required and optional parameters; unfortunately their return values are not provided and can only be determined by experimentation or by reading the source. **Note**: The list of functions is paged to make sure each message fits inside of a UDP packet; in order to get the whole list of functions, you must increment the `page` parameter until the result no longer contains the `more` field. Parameters: * Int **page**: the page of functions to request, if unspecified it will be assumed to be 0. Returns: * Dict **availableFunctions**: a map of function descriptions by function name. * Int **more**: only present if there are more pages. ##### Function Description: Each function description is a Dict of function parameters with the parameter name as the key and the specifications as the value. The specification `required` is an Int which is either 0 meaning the parameter is optional or 1 meaning it is required. `type` is a String which is one of `["Int", "String", "Dict", "List"]` and defines the type which the parameter must be. 
'AdminLog_subscribe': { 'line': { 'required': 0, 'type': 'Int' }, 'file': { 'required': 0, 'type': 'String' }, 'level': { 'required': 0, 'type': 'String' } } #### Admin_asyncEnabled() This function is for determining whether asynchronous communication is allowed. Asynchronous communication, EG: AdminLog responses, is only allowed with clients which satisfy certain requirements. 1. They must send an authenticated request, in the case of AdminLog this is no worry because `AdminLog_subscribe()` requires authentication. 2. They must have sent something in the past 10 seconds, because of the statelessness of UDP, there is no way to tell a client which is listening quietly from one which has wandered off, so in order to remain enabled, it must periodically ping (or periodically call `Admin_asyncEnabled()`). These calls do not need to be authenticated, there just needs to have been one in history. Returns: * Int **asyncEnabled**: 1 if asynchronous communication is allowed for this session, 0 otherwise. Example: This example illustrates how using `cexec` to call it returns true because `cexec` uses authenticated calls whereas manually calling it without authentication returns false. $ ./contrib/python/cexec 'Admin_asyncEnabled()' {'asyncEnabled': 1, 'txid': '74GF0SS2N0'} echo '{ "q": "Admin_asyncEnabled" }' \ | ./build/benc2json -r \ | tr -d '\n' \ | nc -u 127.0.0.1 11234 \ | ./build/benc2json { "asyncEnabled" : 0 } ### Security Functions These functions are available for putting the cjdns core into a sandbox where a security breach within the core would be less likely to cause a total system compromise. #### Security_setUser() Set the user ID which cjdns is running under to a different user. This function allows cjdns to shed privileges after starting up. **NOTE**: This function will always fail with an error about `process cannot open more files` if `Security_noFiles()` has already been called. Parameters: * required String **user**: the name of the user to change to. 
Return: * String **error**: `none` if all went well, otherwise a description of the failure. #### Security_noFiles() Set the hard open file limit to zero; while this does not close file descriptors which are already open, it makes any function requiring the opening of a file fail, providing a powerful sandbox. By calling this function after cjdns is started, one can ensure that the cjdns core cannot touch the filesystem or open network sockets which it does not already have open. This will however prevent a number of other admin API functions from working. Returns: * String **error**: `none` Examples: $ ./contrib/python/cexec 'UDPInterface_new("[::]:2048")' {'interfaceNumber': 3, 'txid': 'NQGOZXJZIC', 'error': 'none'} $ ./contrib/python/cexec 'Security_noFiles()' {'txid': 'CQYQWA5SZY', 'error': 'none'} $ ./contrib/python/cexec 'UDPInterface_new("[::]:5000")' {'txid': 'UZH9LIUOG0', 'cause': 'process cannot open more files', 'error': 'call to socket() failed [process cannot open more files]'} ### Core_initTunnel() This function is used during cjdns startup to initialize the TUN device, set its IP address and set the MTU; it is hastily designed and may be removed in the future. Parameters: * String **desiredTunName**: the name of the TUN device to use, if unspecified it will ask the kernel for a new device. Returns: * String **error**: `none` if all went well, otherwise the error which occurred. **Note**: an error will be returned if anything goes wrong initializing the tunnel, setting its IP address or setting its MTU; even if there is an error, the tunnel may work just fine, and even if the tunnel doesn't work, cjdns will function as a router only without the TUN device. ### Core_exit() A function to stop cjdns. Returns: * String **error**: `none` before exiting. ### ping() Returns: {'q':'pong'} For checking if the admin connection is functioning. 
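Replies from the admin socket are themselves bencoded, so a client needs a small decoder to read them. The sketch below is a hypothetical minimal bdecoder — just enough for responses like the `{'q': 'pong'}` above; it returns strings as bytes and does no error handling:

```python
def bdecode(data: bytes):
    """Minimal bencode decoder, enough for admin responses (sketch)."""
    def parse(i):
        if data[i:i+1] == b"i":                      # integer: i<digits>e
            e = data.index(b"e", i)
            return int(data[i+1:e]), e + 1
        if data[i:i+1] == b"l":                      # list: l<items>e
            i, out = i + 1, []
            while data[i:i+1] != b"e":
                item, i = parse(i)
                out.append(item)
            return out, i + 1
        if data[i:i+1] == b"d":                      # dict: d<key><value>...e
            i, out = i + 1, {}
            while data[i:i+1] != b"e":
                k, i = parse(i)
                v, i = parse(i)
                out[k.decode()] = v
            return out, i + 1
        colon = data.index(b":", i)                  # string: <length>:<bytes>
        n = int(data[i:colon])
        return data[colon+1:colon+1+n], colon + 1 + n
    value, _ = parse(0)
    return value
```

For example, `bdecode(b"d1:q4:ponge")` gives `{'q': b'pong'}`, the reply to the bare ping request at the top of this document.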
### RouterModule_lookup() **Auth Required** Parameters: * String **address** a 39 character (zero padded) ipv6 address. Returns: * A route if one is found in the routing table. * An address and route of the node which should be handed the packet, if a route is not found in the local table. * An error if the address is not parsable. Examples: >>> print cjdns.RouterModule_lookup('fc5d:baa5:61fc:6ffd:9554:67f0:e290:7535') {'result': '0000.0000.0000.1953', 'error': 'none'} >>> print cjdns.RouterModule_lookup('fc5d:baa5:61fc:6ffd:9554:67f0:e290:7536') {'result': 'fcf1:a7a8:8ec0:589b:c64c:cc95:1ced:3679@0000.0000.0000.0013', 'error': 'none'} >>> print cjdns.RouterModule_lookup('f') {'result': '', 'error': 'address wrong length'} >>> print cjdns.RouterModule_lookup('zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz') {'result': '', 'error': 'failed to parse address'} ### AuthorizedPasswords_add() **Auth Required** Parameters: * String **password** a password which will allow neighbors to make direct connections. * String **user** a friendly string to identify this password. * Int **authType** (optional) the method for authenticating, defaults to `1` (only currently supported method). Returns: * **error**:`none` if everything went well. * **error**:`Specified auth type is not supported.` if the auth type is specified and not `1`. * **error**:`Password already added.` if you try to add the same user or password twice. * **error**:`Out of memory to store password.` if the buffer for storing authorized passwords is full. 
Examples: $ ./contrib/python/cexec 'AuthorizedPasswords_add(user="test",password="yh14wl2ffgcqq6bvut12xrz7g3")' {'error': 'none'} $ ./contrib/python/cexec 'AuthorizedPasswords_add(user="test2",password="2yh14wl2ffgcqq6bvut12xrz7g3",authType=300)' {'error': 'Specified auth type is not supported.'} $ ./contrib/python/cexec 'AuthorizedPasswords_add(user="test",password="yh14wl2ffgcqq6bvut12xrz7g3")' {'error': 'Password already added.'} ### AuthorizedPasswords_list() **Auth Required** Get a list of all the authorized users. Example: $ ./contrib/python/cexec 'AuthorizedPasswords_list()' {'total': 2, 'users': ['Test User1', 'Local Peers'], 'txid': 'W0DUG0D50K'} ### memory() Get the number of bytes of memory allocated by all memory allocators in the router. Example: >>> cjdns.memory() {'bytes': 779259} ### NodeStore_dumpTable() Parameters: * Int **page** the page of the routing table to dump, allowing you to get the whole table in a series of reasonably small requests. Response: * `routingTable` a key which contains a list of dictionaries, each containing `ip`, `link` and `path`. `ip` is the IPv6 address of the node, `link` is a unitless number between 0 inclusive and 2^32 exclusive, representing the router's opinion of the quality of that path, higher is better. `path` is the route to the node. * `more` to signal that there is another page of results; the engine will add a `more` key with the integer 1. If there isn't another page of results, the `more` key will not be added. What the data looks like: { 'routingTable': [ { 'ip': 'fce5:de17:cbde:c87b:5289:0556:8b83:c9c8', 'link': 4294967295, 'path': '0000.0000.0000.0001' }, { 'ip': 'fcfc:2ebe:346c:7fe7:95af:a58b:2631:dead', 'link': 235149061, 'path': '0000.0000.631a.3b53' }, { 'ip': 'fc70:772a:f803:7c4e:38bd:981b:f791:60a1', 'link': 271119350, 'path': '0000.0000.017b.b333' }, .............................. 
], 'more': 1 } Example: >>> cjdns.NodeStore_dumpTable(0) {'routingTable': [{'ip': 'fce5:de17:cbde:c87b:5289:0556:8b83:c9c8', 'link': 4294967295,.... >>> cjdns.NodeStore_dumpTable(4) {'routingTable': []} ### SwitchPinger_ping() **Auth Required** Send a switch level ping. There is no routing table lookup and the router is not involved. Pinging IP addresses this way is not possible. Parameters: SwitchPinger_ping(required String path, String data, Int timeout) * String **path** the route to the node to ping eg: `0000.0000.04f5.2555` * String **data** (optional) for diagnosing data-dependent errors. * Int **timeout** (optional) milliseconds to wait for a response. If unspecified, will default to `DEFAULT_TIMEOUT` as defined in `SwitchPinger_admin.c` (2 seconds). Examples: >>> cjdns.SwitchPinger_ping('0000.0000.04f5.2555') {'path': '0000.0000.04f5.2555', 'data': '', 'result': 'pong', 'ms': 281} >>> cjdns.SwitchPinger_ping('fca5:9fe0:3fa2:d576:71e6:8373:7aeb:ea11') {'error': 'path was not parsable.'} >>> cjdns.SwitchPinger_ping('0000.0000.04f5.2555', '12345abcdefg') {'path': '0000.0000.04f5.2555', 'data': '12345abcdefg', 'result': 'pong', 'ms': 326} >>> cjdns.SwitchPinger_ping('0000.0000.0405.2555') {'path': '0000.0000.0405.2555', 'data': '', 'result': 'ping message caused switch error', 'ms': 278} >>> cjdns.SwitchPinger_ping('0000.0000.04f5.2555', '', 30) {'result': 'timeout', 'ms': 77}
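Paged functions such as `Admin_availableFunctions()` and `NodeStore_dumpTable()` share the same convention: request page 0, 1, 2, … until the response no longer carries the `more` key. A sketch of that loop — `fetch_page` here is a stand-in for whatever performs the actual call, for instance `cjdns.NodeStore_dumpTable` from the python library:

```python
def dump_full_table(fetch_page):
    """Collect every routing-table entry by paging until 'more' disappears."""
    entries, page = [], 0
    while True:
        resp = fetch_page(page)
        entries.extend(resp.get("routingTable", []))
        if "more" not in resp:      # an absent 'more' key signals the last page
            return entries
        page += 1
```

With the python library connected, `dump_full_table(cjdns.NodeStore_dumpTable)` would return the whole routing table as one list.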
35.862366
195
0.708233
eng_Latn
0.982653
eee02c501b07548f3467cac720e1a96e59f632bd
544
md
Markdown
README.md
DeveshPankaj/Admin
563f150db389d8105173c9d86ab95bf8122159bf
[ "MIT" ]
null
null
null
README.md
DeveshPankaj/Admin
563f150db389d8105173c9d86ab95bf8122159bf
[ "MIT" ]
null
null
null
README.md
DeveshPankaj/Admin
563f150db389d8105173c9d86ab95bf8122159bf
[ "MIT" ]
null
null
null
# Admin Open source Database Administration tool for Node.js # Todo - [ ] Create CLI & GUI based interface for - - Creating and deleting data tables - Modifying existing database structure - [ ] Dynamic Code snippet generator for - Node.js, PHP - [ ] API builder - [ ] Loading Database - [ ] Connecting With remote database # Features - Allows creating and modifying databases - APIs, Directly integrable with HTML, Angular, Node.js, Vue.js # Tools ```Electron``` ```Commander.js``` ```Jquery``` ```Vue js``` ```RxJS```
19.428571
63
0.670956
eng_Latn
0.541246
eee0886987adb8bd41d74a0e6a9358e502c00b3f
4,561
md
Markdown
src/SUMMARY.md
moio/docs
4df00360c019c66a99f667c30178cb48983f15ce
[ "CC-BY-4.0" ]
null
null
null
src/SUMMARY.md
moio/docs
4df00360c019c66a99f667c30178cb48983f15ce
[ "CC-BY-4.0" ]
null
null
null
src/SUMMARY.md
moio/docs
4df00360c019c66a99f667c30178cb48983f15ce
[ "CC-BY-4.0" ]
null
null
null
## User documentation

- [Introduction](introduction.md)
- [Quick Start](tutorials/quickstart.md)
- [Installation](installation/installation.md)
  - [Install Epinio](installation/installation.md)
  - [Install Epinio cli](installation/install_epinio_cli.md)
  - [Install Epinio with custom DNS](installation/install_epinio_customDNS.md)
  - [Install Epinio with "magic" DNS](installation/install_epinio_magicDNS.md)
  - [Uninstall Epinio](installation/uninstall_epinio.md)
- [Explanations](explanations/explanations.md)
  - [Advanced topics](explanations/advanced.md)
  - [Detailed Push Process](explanations/detailed-push-process.md)
  - [Principles](explanations/principles.md)
  - [Security](explanations/security.md)
- [HowTos](howtos/howtos.md)
  - [Certificate Issuers](howtos/certificate_issuers.md)
  - [Provision external IP address for local Kubernetes](howtos/provision_external_ip_for_local_kubernetes.md)
  - [Push with gitjob](howtos/gitjob_push.md)
  - [Setup external S3 storage](howtos/setup_external_s3.md)
  - [Setup external container registry](howtos/setup_external_registry.md)
  - [Port Forwarding](howtos/port_forwarding.md)
  - [Custom Routes](howtos/custom_routes.md)
  - [Install Epinio on RKE2 (Rancher)](howtos/install_epinio_on_rke.md)
  - [Install Epinio on K3s (local)](howtos/install_epinio_on_k3s.md)
  - [Install Epinio on Rancher Desktop (local)](howtos/install_epinio_on_rancher_desktop.md)
  - [Install Epinio on K3d (local)](howtos/install_epinio_on_k3d.md)
  - [Install Epinio on Minikube (local)](howtos/install_epinio_on_minikube.md)
  - [Install Epinio on Public Clouds](howtos/install_epinio_on_public_cloud.md)
  - [Install Wordpress on Epinio](howtos/install_wordpress_application.md)
- [Reference documentation](references/references.md)
  - [System Requirements](references/system_requirements.md)
    - [Windows](references/windows.md)
  - [Command requirements](references/README.md)
  - [Command reference](references/commands.md)
    - [epinio](references/cli/epinio.md)
    - [epinio_app](references/cli/epinio_app.md)
    - [epinio_app_create](references/cli/epinio_app_create.md)
    - [epinio_app_delete](references/cli/epinio_app_delete.md)
    - [epinio_app_env](references/cli/epinio_app_env.md)
    - [epinio_app_env_list](references/cli/epinio_app_env_list.md)
    - [epinio_app_env_set](references/cli/epinio_app_env_set.md)
    - [epinio_app_env_show](references/cli/epinio_app_env_show.md)
    - [epinio_app_env_unset](references/cli/epinio_app_env_unset.md)
    - [epinio_app_list](references/cli/epinio_app_list.md)
    - [epinio_app_logs](references/cli/epinio_app_logs.md)
    - [epinio_app_manifest](references/cli/epinio_app_manifest.md)
    - [epinio_app_show](references/cli/epinio_app_show.md)
    - [epinio_app_update](references/cli/epinio_app_update.md)
    - [epinio_config](references/cli/epinio_config.md)
    - [epinio_config_colors](references/cli/epinio_config_colors.md)
    - [epinio_config_show](references/cli/epinio_config_show.md)
    - [epinio_config_update](references/cli/epinio_config_update.md)
    - [epinio_info](references/cli/epinio_info.md)
    - [epinio_namespace](references/cli/epinio_namespace.md)
    - [epinio_namespace_create](references/cli/epinio_namespace_create.md)
    - [epinio_namespace_delete](references/cli/epinio_namespace_delete.md)
    - [epinio_namespace_list](references/cli/epinio_namespace_list.md)
    - [epinio_namespace_show](references/cli/epinio_namespace_show.md)
    - [epinio_push](references/cli/epinio_push.md)
    - [epinio_server](references/cli/epinio_server.md)
    - [epinio_service](references/cli/epinio_service.md)
    - [epinio_service_bind](references/cli/epinio_service_bind.md)
    - [epinio_service_create](references/cli/epinio_service_create.md)
    - [epinio_service_delete](references/cli/epinio_service_delete.md)
    - [epinio_service_list](references/cli/epinio_service_list.md)
    - [epinio_service_show](references/cli/epinio_service_show.md)
    - [epinio_service_unbind](references/cli/epinio_service_unbind.md)
    - [epinio_target](references/cli/epinio_target.md)
    - [epinio_version](references/cli/epinio_version.md)
  - [CLI Configuration](references/configuration.md)
  - [Supported Applications](references/supported-apps.md)
  - [Application Manifests](references/manifests.md)
  - [Services](references/services.md)
  - [API](references/api.md)
47.510417
110
0.75137
yue_Hant
0.188768
eee089373931b18f617eeb2c0df6ecab87ef8be2
43
md
Markdown
README.md
CoffeyKang/laradock
f45ccea0bb5357ed1c46d89acf9d94a3555ae69a
[ "MIT" ]
null
null
null
README.md
CoffeyKang/laradock
f45ccea0bb5357ed1c46d89acf9d94a3555ae69a
[ "MIT" ]
null
null
null
README.md
CoffeyKang/laradock
f45ccea0bb5357ed1c46d89acf9d94a3555ae69a
[ "MIT" ]
null
null
null
# laradock

Laravel Development Environment
14.333333
31
0.860465
eng_Latn
0.805514
eee0f781e5fb9af8a493ec164320e844775c7d0b
4,967
md
Markdown
articles/virtual-machines/windows/compute-benchmark-scores.md
Almulo/azure-docs.es-es
f1916cdaa2952cbe247723758a13b3ec3d608863
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-machines/windows/compute-benchmark-scores.md
Almulo/azure-docs.es-es
f1916cdaa2952cbe247723758a13b3ec3d608863
[ "CC-BY-4.0", "MIT" ]
null
null
null
articles/virtual-machines/windows/compute-benchmark-scores.md
Almulo/azure-docs.es-es
f1916cdaa2952cbe247723758a13b3ec3d608863
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Compute benchmark scores for Azure Windows VMs | Microsoft Docs
description: Compare SPECint compute benchmark scores for Azure VMs running Windows Server.
services: virtual-machines-windows
documentationcenter: ''
author: cynthn
manager: jeconnoc
editor: ''
tags: azure-resource-manager,azure-service-management
ms.assetid: 69ae72ec-e8be-4e46-a8f0-e744aebb5cc2
ms.service: virtual-machines-windows
ms.devlang: na
ms.topic: article
ms.tgt_pltfrm: vm-windows
ms.workload: infrastructure-services
ms.date: 04/09/2018
ms.author: cynthn;davberg
ms.openlocfilehash: a8d071544462361e9750d3fa622467cd0000a040
ms.sourcegitcommit: 7208bfe8878f83d5ec92e54e2f1222ffd41bf931
ms.translationtype: HT
ms.contentlocale: es-ES
ms.lasthandoff: 07/14/2018
ms.locfileid: "39056790"
---
# <a name="compute-benchmark-scores-for-windows-vms"></a>Compute benchmark scores for Windows VMs

The following SPECint benchmark scores show the compute performance of Azure's high-performance VM lineup running Windows Server. Compute benchmark scores are also available for [Linux VMs](../linux/compute-benchmark-scores.md?toc=%2fazure%2fvirtual-machines%2flinux%2ftoc.json).

> [!NOTE]
> The Linux numbers were recently updated and cover a more complete set of VMs.

## <a name="a-series---compute-intensive"></a>A-series - compute-intensive

| Size | vCPU | NUMA nodes | CPU | Runs | Avg base rate | StdDev |
| --- | --- | --- | --- | --- | --- | --- |
| Standard_A8 |8 |1 |Intel Xeon CPU E5-2670 0 \@ 2.6 GHz |10 |236.1 |1.1 |
| Standard_A9 |16 |2 |Intel Xeon CPU E5-2670 0 \@ 2.6 GHz |10 |450.3 |7.0 |
| Standard_A10 |8 |1 |Intel Xeon CPU E5-2670 0 \@ 2.6 GHz |5 |235.6 |0.9 |
| Standard_A11 |16 |2 |Intel Xeon CPU E5-2670 0 \@ 2.6 GHz |7 |454.7 |4.8 |

## <a name="dv2-series"></a>Dv2-series

| Size | vCPU | NUMA nodes | CPU | Runs | Avg base rate | StdDev |
| --- | --- | --- | --- | --- | --- | --- |
| Standard_D1_v2 |1 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |83 |36.6 |2.6 |
| Standard_D2_v2 |2 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |27 |70.0 |3.7 |
| Standard_D3_v2 |4 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |19 |130.5 |4.4 |
| Standard_D4_v2 |8 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |19 |238.1 |5.2 |
| Standard_D5_v2 |16 |2 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |14 |460.9 |15.4 |
| Standard_D11_v2 |2 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |19 |70.1 |3.7 |
| Standard_D12_v2 |4 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |2 |132.0 |1.4 |
| Standard_D13_v2 |8 |1 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |17 |235.8 |3.8 |
| Standard_D14_v2 |16 |2 |Intel Xeon E5-2673 v3 \@ 2.4 GHz |15 |460.8 |6.5 |

## <a name="g-series-gs-series"></a>G-series, GS-series

| Size | vCPU | NUMA nodes | CPU | Runs | Avg base rate | StdDev |
| --- | --- | --- | --- | --- | --- | --- |
| Standard_G1, Standard_GS1 |2 |1 |Intel Xeon E5-2698B v3 \@ 2 GHz |31 |71.8 |6.5 |
| Standard_G2, Standard_GS2 |4 |1 |Intel Xeon E5-2698B v3 \@ 2 GHz |5 |133.4 |13.0 |
| Standard_G3, Standard_GS3 |8 |1 |Intel Xeon E5-2698B v3 \@ 2 GHz |6 |242.3 |6.0 |
| Standard_G4, Standard_GS4 |16 |1 |Intel Xeon E5-2698B v3 \@ 2 GHz |15 |398.9 |6.0 |
| Standard_G5, Standard_GS5 |32 |2 |Intel Xeon E5-2698B v3 \@ 2 GHz |22 |762.8 |3.7 |

## <a name="h-series"></a>H-series

| Size | vCPU | NUMA nodes | CPU | Runs | Avg base rate | StdDev |
| --- | --- | --- | --- | --- | --- | --- |
| Standard_H8 |8 |1 |Intel Xeon E5-2667 v3 \@ 3.2 GHz |5 |297.4 |0.9 |
| Standard_H16 |16 |2 |Intel Xeon E5-2667 v3 \@ 3.2 GHz |5 |575.8 |6.8 |
| Standard_H8m |8 |1 |Intel Xeon E5-2667 v3 \@ 3.2 GHz |5 |297.0 |1.2 |
| Standard_H16m |16 |2 |Intel Xeon E5-2667 v3 \@ 3.2 GHz |5 |572.2 |3.9 |
| Standard_H16r |16 |2 |Intel Xeon E5-2667 v3 \@ 3.2 GHz |5 |573.2 |2.9 |
| Standard_H16mr |16 |2 |Intel Xeon E5-2667 v3 \@ 3.2 GHz |7 |569.6 |2.8 |

## <a name="about-specint"></a>About SPECint

The Windows numbers were computed by running [SPECint 2006](https://www.spec.org/cpu2006/results/rint2006.html) on Windows Server. SPECint was run using the base rate option (SPECint_rate2006), with one copy per vCPU. SPECint consists of 12 separate tests, each run three times, taking the median value from each test and weighting them to form a composite score. The tests were then run on multiple VMs to provide the average scores shown.

## <a name="next-steps"></a>Next steps

* For more information about storage capacities, disk details, and additional considerations for choosing among VM sizes, see [Sizes for virtual machines](sizes.md?toc=%2fazure%2fvirtual-machines%2fwindows%2ftoc.json).
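The scoring procedure described above — keep the median of three runs for each test, then combine the per-test medians into one composite — can be sketched in a few lines. This is an illustrative sketch only: the geometric-mean combination step is an assumption (it is how SPEC rate results are conventionally aggregated), since the text only says the medians are "weighted" into a composite, and the input numbers below are hypothetical.

```python
from statistics import median

def specint_composite(per_test_runs):
    """Combine per-test benchmark runs into one composite score.

    per_test_runs: one list of run scores per test (each test is run
    three times). The median run is kept for each test, then the
    medians are combined with a geometric mean (assumed aggregation).
    """
    medians = [median(runs) for runs in per_test_runs]
    product = 1.0
    for m in medians:
        product *= m
    # Geometric mean of the per-test medians.
    return product ** (1.0 / len(medians))

# Two hypothetical tests, three runs each:
# medians are 11.0 and 44.0, so the composite is sqrt(11 * 44) = 22.0.
print(specint_composite([[10.0, 12.0, 11.0], [44.0, 40.0, 45.0]]))
```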
64.506494
521
0.697806
spa_Latn
0.576306
eee0ff793e527f9a2e83e09298a9da6af7c7b548
1,520
md
Markdown
AlchemyInsights/restore-a-deleted-onedrive.md
isabella232/OfficeDocs-AlchemyInsights-pr.fr-FR
b23fe97cbba1674ad1f59978ca5080bb00d217cb
[ "CC-BY-4.0", "MIT" ]
1
2020-05-19T19:06:33.000Z
2020-05-19T19:06:33.000Z
AlchemyInsights/restore-a-deleted-onedrive.md
isabella232/OfficeDocs-AlchemyInsights-pr.fr-FR
b23fe97cbba1674ad1f59978ca5080bb00d217cb
[ "CC-BY-4.0", "MIT" ]
2
2022-02-09T06:56:37.000Z
2022-02-09T06:56:51.000Z
AlchemyInsights/restore-a-deleted-onedrive.md
isabella232/OfficeDocs-AlchemyInsights-pr.fr-FR
b23fe97cbba1674ad1f59978ca5080bb00d217cb
[ "CC-BY-4.0", "MIT" ]
2
2019-10-11T18:36:10.000Z
2021-10-09T11:34:48.000Z
---
title: Restore a deleted OneDrive
ms.author: pebaum
author: pebaum
manager: scotv
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom: ''
ms.assetid: 5298f192-326b-4820-b007-7e1a1c3c2b13
ms.openlocfilehash: 6310e3e225392a911bd1f5ae18dc3d49c6b50f0a32f603ceb60816657d5b3fc6
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: fr-FR
ms.lasthandoff: 08/05/2021
ms.locfileid: "53958907"
---
# <a name="restore-a-deleted-onedrive"></a>Restore a deleted OneDrive

After you delete a user, you can access their OneDrive through the Microsoft 365 admin center for 30 days. Other users can continue to access shared content in that OneDrive for the length of time you set in the OneDrive admin center. (To learn how to set this, see [Set the default file retention for deleted OneDrive users](https://go.microsoft.com/fwlink/?linkid=874267).) After that, the OneDrive is moved to the Recycle Bin for 93 days and then deleted.

After the first 30 days, when the deleted user no longer appears in the Microsoft 365 admin center, you can access the user's OneDrive account through PowerShell. For more information, see [Restore a deleted OneDrive](https://go.microsoft.com/fwlink/?linkid=874269).
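The two retention windows described above (30 days of admin-center access, then 93 days in the Recycle Bin) compose into a simple timeline. A minimal sketch of that arithmetic, assuming the default windows and a hypothetical deletion date:

```python
from datetime import date, timedelta

def onedrive_retention_windows(deletion_date):
    """Illustrative timeline for a deleted user's OneDrive,
    assuming the default windows described above."""
    # Admin-center access lasts 30 days after the user is deleted.
    admin_center_until = deletion_date + timedelta(days=30)
    # The OneDrive then sits in the Recycle Bin for 93 more days.
    recycle_bin_until = admin_center_until + timedelta(days=93)
    return admin_center_until, recycle_bin_until

admin_until, gone_after = onedrive_retention_windows(date(2021, 1, 1))
print(admin_until)   # 2021-01-31
print(gone_after)    # 2021-05-04
```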
52.413793
582
0.813816
fra_Latn
0.924457
eee1126b34aa1d6e8880c1848882aa60f7eb6b41
919
md
Markdown
api/qsharp/microsoft.quantum.simulation.timedependentgeneratorsystem.md
geduardo/quantum-docs-pr
fa026045ba7c0d54cfb21d28dc587ad2630912f0
[ "CC-BY-4.0", "MIT" ]
1
2020-05-08T23:31:04.000Z
2020-05-08T23:31:04.000Z
api/qsharp/microsoft.quantum.simulation.timedependentgeneratorsystem.md
geduardo/quantum-docs-pr
fa026045ba7c0d54cfb21d28dc587ad2630912f0
[ "CC-BY-4.0", "MIT" ]
null
null
null
api/qsharp/microsoft.quantum.simulation.timedependentgeneratorsystem.md
geduardo/quantum-docs-pr
fa026045ba7c0d54cfb21d28dc587ad2630912f0
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
uid: Microsoft.Quantum.Simulation.TimeDependentGeneratorSystem
title: TimeDependentGeneratorSystem user defined type
ms.date: 11/25/2020 12:00:00 AM
ms.topic: article
qsharp.kind: udt
qsharp.namespace: Microsoft.Quantum.Simulation
qsharp.name: TimeDependentGeneratorSystem
qsharp.summary: >-
  Represents a time-dependent dynamical generator as a function
  from time to the value of the dynamical generator at that time.
---

# TimeDependentGeneratorSystem user defined type

Namespace: [Microsoft.Quantum.Simulation](xref:Microsoft.Quantum.Simulation)

Package: [Microsoft.Quantum.Standard](https://nuget.org/packages/Microsoft.Quantum.Standard)

Represents a time-dependent dynamical generator as a function from time to the value of the dynamical generator at that time.

```qsharp
newtype TimeDependentGeneratorSystem = ((Double -> Microsoft.Quantum.Simulation.GeneratorSystem));
```
32.821429
126
0.789989
eng_Latn
0.396316
eee18a60ecee636d685bf98345d77ae6c34be9df
3,617
md
Markdown
README.md
demyxsh/traefik
7d1960afda9e593c4eb3122f51d6f5bee9343493
[ "MIT" ]
null
null
null
README.md
demyxsh/traefik
7d1960afda9e593c4eb3122f51d6f5bee9343493
[ "MIT" ]
null
null
null
README.md
demyxsh/traefik
7d1960afda9e593c4eb3122f51d6f5bee9343493
[ "MIT" ]
null
null
null
# traefik

[![demyxsh/traefik](https://github.com/demyxsh/traefik/actions/workflows/main.yml/badge.svg)](https://github.com/demyxsh/traefik/actions/workflows/main.yml)
[![Code Size](https://img.shields.io/github/languages/code-size/demyxsh/traefik?style=flat&color=blue)](https://github.com/demyxsh/traefik)
[![Repository Size](https://img.shields.io/github/repo-size/demyxsh/traefik?style=flat&color=blue)](https://github.com/demyxsh/traefik)
[![Watches](https://img.shields.io/github/watchers/demyxsh/traefik?style=flat&color=blue)](https://github.com/demyxsh/traefik)
[![Stars](https://img.shields.io/github/stars/demyxsh/traefik?style=flat&color=blue)](https://github.com/demyxsh/traefik)
[![Forks](https://img.shields.io/github/forks/demyxsh/traefik?style=flat&color=blue)](https://github.com/demyxsh/traefik)
[![Docker Pulls](https://img.shields.io/docker/pulls/demyx/traefik?style=flat&color=blue)](https://hub.docker.com/r/demyx/traefik)
[![Architecture](https://img.shields.io/badge/linux-amd64-important?style=flat&color=blue)](https://hub.docker.com/r/demyx/traefik)
[![Alpine](https://img.shields.io/badge/dynamic/json?url=https://github.com/demyxsh/traefik/raw/master/version.json&label=alpine&query=$.alpine&color=blue)](https://hub.docker.com/r/demyx/traefik)
[![Traefik](https://img.shields.io/badge/dynamic/json?url=https://github.com/demyxsh/traefik/raw/master/version.json&label=traefik&query=$.traefik&color=blue)](https://hub.docker.com/r/demyx/traefik)
[![Buy Me A Coffee](https://img.shields.io/badge/buy_me_coffee-$5-informational?style=flat&color=blue)](https://www.buymeacoffee.com/VXqkQK5tb)
[![Become a Patron!](https://img.shields.io/badge/become%20a%20patron-$5-informational?style=flat&color=blue)](https://www.patreon.com/bePatron?u=23406156)

Non-root Docker image running Alpine Linux and Traefik.

Traefik is a modern HTTP reverse proxy and load balancer that makes deploying microservices easy. Traefik integrates with your existing infrastructure components (Docker, Swarm mode, Kubernetes, Marathon, Consul, Etcd, Rancher, Amazon ECS, ...) and configures itself automatically and dynamically. Pointing Traefik at your orchestrator should be the only configuration step you need.

DEMYX | TRAEFIK
--- | ---
USER | demyx
ENTRYPOINT | ["demyx-entrypoint"]
PORT | 8080 8081 8082

## NOTICE

This repository has been moved to the organization [demyxsh](https://github.com/demyxsh); please update the remote URL.

```
git remote set-url origin git@github.com:demyxsh/traefik.git
```

## Usage

- Since a non-root user can't access docker.sock, this image depends on my locked-down docker.sock proxy [container](https://github.com/demyxsh/docker-socket-proxy).
- DEMYX_ACME_EMAIL must be set or the container will exit.

```
# Start the docker.sock proxy container first
docker run -d \
  --privileged \
  --name=demyx_socket \
  --network=demyx_socket \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e CONTAINERS=1 \
  demyx/docker-socket-proxy

# Start the Traefik container
docker run -d \
  --name=traefik \
  --network=demyx_socket \
  -e DEMYX=/demyx \
  -e DEMYX_CONFIG=/etc/demyx \
  -e DEMYX_LOG=/var/log/demyx \
  -e DEMYX_ENDPOINT=tcp://demyx_socket:2375 \
  -e DEMYX_ACME_EMAIL=info@domain.tld \
  -p 80:8081 \
  -p 443:8082 \
  -v traefik:/demyx \
  demyx/traefik
```

DEMYX_ACME_EMAIL is required. Point your acme.json storage to the `traefik` volume directory (ex: /demyx/acme.json).

For more configurations, see Traefik's official documentation: https://docs.traefik.io.

## Updates & Support

* Auto built weekly on Saturdays (America/Los_Angeles)
* Rolling release updates
* For support: [#demyx](https://web.libera.chat/?channel=#demyx)
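The two `docker run` commands in the Usage section can also be expressed as a Compose file. The following is an untested sketch assembled from the flags shown above; the service names, the `info@domain.tld` placeholder email, and the Compose file version are assumptions, not part of the upstream image's documentation.

```yaml
# Hypothetical docker-compose.yml mirroring the docker run commands above.
version: "3.8"

networks:
  demyx_socket:

volumes:
  traefik:

services:
  demyx_socket:
    image: demyx/docker-socket-proxy
    privileged: true
    networks: [demyx_socket]
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      - CONTAINERS=1

  traefik:
    image: demyx/traefik
    depends_on: [demyx_socket]
    networks: [demyx_socket]
    ports:
      - "80:8081"
      - "443:8082"
    volumes:
      - traefik:/demyx
    environment:
      - DEMYX=/demyx
      - DEMYX_CONFIG=/etc/demyx
      - DEMYX_LOG=/var/log/demyx
      - DEMYX_ENDPOINT=tcp://demyx_socket:2375
      - DEMYX_ACME_EMAIL=info@domain.tld  # required, replace with your email
```

With this file in place, `docker compose up -d` would start both containers on a shared network, with the service name `demyx_socket` resolving as the proxy endpoint hostname.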
56.515625
439
0.760852
yue_Hant
0.37941
eee1a0676ac541af6764e6204f2e8a3878ee07a4
3,223
md
Markdown
README.md
hero-ku/Hula
0b44b03b534723dc3412da97487c80a58ec5badb
[ "MIT" ]
null
null
null
README.md
hero-ku/Hula
0b44b03b534723dc3412da97487c80a58ec5badb
[ "MIT" ]
null
null
null
README.md
hero-ku/Hula
0b44b03b534723dc3412da97487c80a58ec5badb
[ "MIT" ]
null
null
null
<h1 align="center">
  <a href="https://github.com/MathematicalDessert/Hula">Hula Editor</a>
</h1>

<h4 align="center">A Luau IDE written in Luau.</h4>

Hula is a ***WIP*** IDE written in Luau with power and simplicity in mind. It will make use of the powerful foundations laid by Roblox with [Luau](https://github.com/Roblox/luau), including but not limited to: **code debugging**, **disassembly**, and **intelligent type-checking**.

Hula aims to be:

* 🔧***Extensible***. Adding features should be easy for both experienced and non-experienced Luau users.
* 💨***Performant***. As a showcase of the performance of Luau, Hula should also be highly performant in all areas.
* ✔️***Reliable***. Using Hula should be a smooth experience with no unexpected hangs or crashes.
* 💻***Native***. Hula should run on all platforms with similar performance.
* 🪶***Lightweight***. Shouldn't have many dependencies, and any dependencies (or plugins) should be decoupleable.
* 🪟***Simple, but beautiful***. A clean user interface with minimal clutter and powerful tools.

Hula does **NOT** aim to be:

* ❌***A replacement*** to your normal Luau environment, be that Roblox Studio or Visual Studio Code.
* ❌***A general purpose*** editor like Visual Studio Code and Co.

If you're interested in working on the project or just want to talk with developers, please join the Discord: https://discord.gg/Mc72duvKMP.

## Components

### Bedrock

Bedrock is the C++ backbone of the project. It is here that Luau is incorporated into the project. It is also where any high-performance methods are included.

### Core

The core is the true editor code. It is written in Luau and is in the folder [core](core). This is where the majority of the logic necessary for the editor will be, excluding what necessarily needs to be implemented in C++. The core is structured into multiple subdirectories with descriptive names. Every file is a module.

## Design Philosophy

Powerful simplicity is the name of the game. Whenever possible, we would prefer code be written in Luau and use the exported API. If a feature requires high performance with zero abstraction penalty, then implementation in C++ and export is acceptable.

## Planned Features

💚 - Definitely (not in development yet).
🔜 - In development.
🤷‍♀️ - Not certain.

Features:

- [ ] 💚 Full Luau Debugger
- [ ] 💚 Visual Disassembler (Similar to https://www.luac.nl/)
- [ ] 💚 Luau Plugin Support
- [ ] 🤷‍♀️ Intelligent Auto-Completion
- [ ] 🤷‍♀️ Integration with Roblox tools(?)

## Contributing

The project is still in its infancy, and as such making contributions may be difficult. Despite this, all contributions are welcome! It may be better to discuss potential features and implementations in the [Discord](https://discord.gg/Mc72duvKMP), [Discussions](https://github.com/MathematicalDessert/Hula/discussions), or an issue.

More information on contributing will come soon.

## License

Hula is distributed under the terms of the [MIT License](LICENSE). It takes inspiration from [rxi/lite](https://github.com/rxi/lite) and implements Luau, which are both also distributed under MIT. You are permitted to redistribute Hula and/or modify it under the aforementioned terms of the license.
50.359375
298
0.749922
eng_Latn
0.998028
eee1c6474a6239ff7bcb13cff234df0db3b6d761
4,964
md
Markdown
vendor/ruby/2.7.0/gems/cucumber-core-10.1.1/README.md
Jofranlima/Capybara-automation
54df1e8154e3ee14fa08a20dc960e3fe1699b881
[ "MIT" ]
21
2015-05-12T13:15:40.000Z
2021-10-04T17:55:54.000Z
vendor/ruby/2.7.0/gems/cucumber-core-10.1.1/README.md
Jofranlima/Capybara-automation
54df1e8154e3ee14fa08a20dc960e3fe1699b881
[ "MIT" ]
155
2015-01-06T15:58:03.000Z
2021-10-15T07:50:44.000Z
vendor/ruby/2.7.0/gems/cucumber-core-10.1.1/README.md
Jofranlima/Capybara-automation
54df1e8154e3ee14fa08a20dc960e3fe1699b881
[ "MIT" ]
43
2015-01-03T19:14:41.000Z
2021-07-07T10:06:03.000Z
<p align="center">
  <img src="./.github/img/cucumber-open-logo.png" alt="Cucumber Open - Supported by Smartbear" width="428" />
</p>

# Cucumber

[![OpenCollective](https://opencollective.com/cucumber/backers/badge.svg)](https://opencollective.com/cucumber)
[![OpenCollective](https://opencollective.com/cucumber/sponsors/badge.svg)](https://opencollective.com/cucumber)
[![pull requests](https://oselvar.com/api/badge?label=pull%20requests&csvUrl=https%3A%2F%2Fraw.githubusercontent.com%2Fcucumber%2Foselvar-github-metrics%2Fmain%2Fdata%2Fcucumber%2Fcucumber-ruby-core%2FpullRequests.csv)](https://oselvar.com/github/cucumber/oselvar-github-metrics/main/cucumber/cucumber-ruby-core)
[![issues](https://oselvar.com/api/badge?label=issues&csvUrl=https%3A%2F%2Fraw.githubusercontent.com%2Fcucumber%2Foselvar-github-metrics%2Fmain%2Fdata%2Fcucumber%2Fcucumber-ruby-core%2Fissues.csv)](https://oselvar.com/github/cucumber/oselvar-github-metrics/main/cucumber/cucumber-ruby-core)
[![Test cucumber-core](https://github.com/cucumber/cucumber-ruby-core/actions/workflows/cucumber-ruby-core.yml/badge.svg)](https://github.com/cucumber/cucumber-ruby-core/actions/workflows/cucumber-ruby-core.yml)
[![Code Climate](https://codeclimate.com/github/cucumber/cucumber-ruby-core.svg)](https://codeclimate.com/github/cucumber/cucumber-ruby-core)
[![Coverage Status](https://coveralls.io/repos/cucumber/cucumber-ruby-core/badge.svg?branch=main)](https://coveralls.io/r/cucumber/cucumber-ruby-core?branch=main)

Cucumber is a tool for running automated tests written in plain language. Because they're written in plain language, they can be read by anyone on your team. Because they can be read by anyone, you can use them to help improve communication, collaboration and trust on your team.

<p align="center">
  <img src="./.github/img/gherkin-example.png" alt="Cucumber Gherkin Example" width="728" />
</p>

Cucumber Core is the [inner hexagon](https://en.wikipedia.org/wiki/Hexagonal_architecture_(software)) for the [Ruby flavour of Cucumber](https://github.com/cucumber/cucumber-ruby).

It contains the core domain logic to execute Cucumber features. It has no user interface, just a Ruby API. If you're interested in how Cucumber works, or in building other tools that work with Gherkin documents, you've come to the right place.

See [CONTRIBUTING.md](CONTRIBUTING.md) for info on contributing to Cucumber (issues, PRs, etc.).

Everyone interacting in this codebase and issue tracker is expected to follow the Cucumber [code of conduct](https://cucumber.io/conduct).

## Installation

`cucumber-core` is a Ruby gem. Install it as you would install any gem: add `cucumber-core` to your Gemfile:

    gem 'cucumber-core'

then install it:

    $ bundle

or install the gem directly:

    $ gem install cucumber-core

### Supported platforms

- Ruby 3.0
- Ruby 2.7
- Ruby 2.6
- Ruby 2.5
- Ruby 2.4
- Ruby 2.3
- JRuby 9.2 (with [some limitations](https://github.com/cucumber/cucumber-ruby/blob/main/docs/jruby-limitations.md))

## Usage

The following example aims to illustrate how to use the `cucumber-core` gem and to make sure it is working well within your environment. For a more detailed explanation of what it actually does and how to work with it, see [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md).

```ruby
# cucumber_core_example.rb
require 'cucumber/core'
require 'cucumber/core/filter'

class ActivateSteps < Cucumber::Core::Filter.new
  def test_case(test_case)
    test_steps = test_case.test_steps.map do |step|
      step.with_action { print "processing: " }
    end

    test_case.with_steps(test_steps).describe_to(receiver)
  end
end

feature = Cucumber::Core::Gherkin::Document.new(__FILE__, <<-GHERKIN)
Feature:
  Scenario:
    Given some requirements
    When we do something
    Then it should pass
GHERKIN

class MyRunner
  include Cucumber::Core
end

MyRunner.new.execute([feature], [ActivateSteps.new]) do |events|
  events.on(:test_step_finished) do |event|
    test_step, result = event.test_step, event.result
    print "#{test_step.text} #{result}\n"
  end
end
```

If you run this Ruby script:

```shell
ruby cucumber_core_example.rb
```

You should see the following output:

```
processing: some requirements ✓
processing: we do something ✓
processing: it should pass ✓
```

## Documentation and support

- Getting started with Cucumber, writing features, step definitions, and more: https://cucumber.io/docs
- Slack: [register for an account](https://cucumberbdd-slack-invite.herokuapp.com/) then head over to [#intro](https://cucumberbdd.slack.com/messages/C5WD8SA21/)
- `cucumber-core` overview: [docs/ARCHITECTURE.md](docs/ARCHITECTURE.md)
- How to work with local repositories for `cucumber-gherkin`, `cucumber-messages` or `cucumber-ruby`: [CONTRIBUTING.md#working-with-local-cucumber-dependencies](./CONTRIBUTING.md#working-with-local-cucumber-dependencies)

## Copyright

Copyright (c) Cucumber Ltd. and Contributors. See LICENSE for details.
38.48062
312
0.765512
eng_Latn
0.749078
eee1d8df20c536bc4bbdec8de0f217222e9a2228
673
md
Markdown
tests/integration/apiserver-proxy/README.md
FWinkler79/kyma
4ef0055807666cd54c5cbbeecd3aa17918d5d982
[ "Apache-2.0" ]
1,351
2018-07-04T06:14:20.000Z
2022-03-31T16:28:47.000Z
tests/integration/apiserver-proxy/README.md
FWinkler79/kyma
4ef0055807666cd54c5cbbeecd3aa17918d5d982
[ "Apache-2.0" ]
11,211
2018-07-24T22:47:33.000Z
2022-03-31T19:29:15.000Z
tests/integration/apiserver-proxy/README.md
FWinkler79/kyma
4ef0055807666cd54c5cbbeecd3aa17918d5d982
[ "Apache-2.0" ]
481
2018-07-24T14:13:41.000Z
2022-03-31T15:55:46.000Z
# Apiserver-Proxy Integration Tests

## Overview

This folder contains the integration tests for the `apiserver-proxy` component.

## Details

- Contains the Dockerfile for the image used in Kyma API Server Proxy tests.
- Contains the `fetch-token` application used for fetching authentication tokens from Dex.
- Contains the `test.sh` script that runs tests for the chart.

## Usage

To test your changes and build the image, run the `make build build-image` command.

## Configure Kyma

After building and pushing the Docker image, set the proper directory and tag in the `resources/apiserver-proxy/values.yaml` file, in the `apiserver_proxy_integration_tests` property.
33.65
183
0.784547
eng_Latn
0.991133
eee203dca612d3efb137a5270ad1bc27c08257fb
3,115
md
Markdown
sccm-ps/ConfigurationManager/Get-CMDeploymentTypeDependency.md
ragalad/sccm-docs-powershell-ref
5c4cc62706af0cf936c79e07d6e504df38ad3a0d
[ "CC-BY-4.0", "MIT" ]
null
null
null
sccm-ps/ConfigurationManager/Get-CMDeploymentTypeDependency.md
ragalad/sccm-docs-powershell-ref
5c4cc62706af0cf936c79e07d6e504df38ad3a0d
[ "CC-BY-4.0", "MIT" ]
null
null
null
sccm-ps/ConfigurationManager/Get-CMDeploymentTypeDependency.md
ragalad/sccm-docs-powershell-ref
5c4cc62706af0cf936c79e07d6e504df38ad3a0d
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
title: Get-CMDeploymentTypeDependency
titleSuffix: Configuration Manager
description: Gets a deployment type dependency.
ms.date: 01/02/2019
ms.prod: configuration-manager
ms.technology: configmgr-other
ms.topic: reference
author: mumian
ms.author: jgao
manager: dougeby
external help file: AdminUI.PS.AppMan.dll-Help.xml
---

# Get-CMDeploymentTypeDependency

## SYNOPSIS

Gets a deployment type from a dependency group.

## SYNTAX

```powershell
Get-CMDeploymentTypeDependency -InputObject <DeploymentTypeDependencyGroup> [-DisableWildcardHandling]
 [-ForceWildcardHandling] [<CommonParameters>]
```

## DESCRIPTION

The **Get-CMDeploymentTypeDependency** cmdlet gets existing dependent deployment types from a dependency group. The required input is a dependency group object from [Get-CMDeploymentTypeDependencyGroup](./Get-CMDeploymentTypeDependencyGroup.md).

## EXAMPLES

> [!NOTE]
> Configuration Manager cmdlets must be run from the Configuration Manager site drive. For more information, see the [getting started documentation](https://docs.microsoft.com/powershell/sccm/overview).

### Example 1

```powershell
PS XYZ:\> Get-CMDeploymentType -ApplicationName MyApp | Get-CMDeploymentTypeDependencyGroup -GroupName MyGroup | Get-CMDeploymentTypeDependency
```

This command gets a deployment type from a dependency group.

## PARAMETERS

### -DisableWildcardHandling

DisableWildcardHandling treats wildcard characters as literal character values. Cannot be combined with **ForceWildcardHandling**.

```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:

Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```

### -ForceWildcardHandling

ForceWildcardHandling processes wildcard characters and may lead to unexpected behavior (not recommended). Cannot be combined with **DisableWildcardHandling**.

```yaml
Type: SwitchParameter
Parameter Sets: (All)
Aliases:

Required: False
Position: Named
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```

### -InputObject

Specifies a deployment type dependency group object.

```yaml
Type: DeploymentTypeDependencyGroup
Parameter Sets: (All)
Aliases: Group

Required: True
Position: Named
Default value: None
Accept pipeline input: True (ByValue)
Accept wildcard characters: False
```

### CommonParameters

This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see [about_CommonParameters](http://go.microsoft.com/fwlink/?LinkID=113216).

## INPUTS

### Microsoft.ConfigurationManagement.Cmdlets.AppMan.Commands.DeploymentTypeDependencyGroup

## OUTPUTS

### IResultObject[]#SMS_DeploymentType

### IResultObject#SMS_DeploymentType

## RELATED LINKS

[Add-CMDeploymentTypeDependency](./Add-CMDeploymentTypeDependency.md)

[Set-CMDeploymentTypeDependency](./Set-CMDeploymentTypeDependency.md)

[Remove-CMDeploymentTypeDependency](./Remove-CMDeploymentTypeDependency.md)
26.398305
315
0.80321
eng_Latn
0.621116
eee363428089855f0d76ffaf8f5a00f0cfdae8a1
9,552
md
Markdown
docs/resources/ownership.md
garettarrowood/oss-guide
28f10ecb0332ad63dbaa4be4aa6e856dfded7e8d
[ "CC-BY-4.0" ]
1
2021-04-05T21:52:54.000Z
2021-04-05T21:52:54.000Z
docs/resources/ownership.md
garettarrowood/oss-guide
28f10ecb0332ad63dbaa4be4aa6e856dfded7e8d
[ "CC-BY-4.0" ]
null
null
null
docs/resources/ownership.md
garettarrowood/oss-guide
28f10ecb0332ad63dbaa4be4aa6e856dfded7e8d
[ "CC-BY-4.0" ]
2
2020-12-10T00:43:37.000Z
2021-01-27T23:19:25.000Z
--- layout: default title: Ownership and Control in Open Source parent: Resources to Address Common Questions nav_order: 9 --- ## Ownership and Control in Open Source Open source challenges basic notions of ownership and control. With many copyright works, an author seeks to control their work so that others have to pay to access it, and no one can edit it without the author’s approval. Open source reverses the control model so that anyone can access the work for free and everyone is welcome to modify their copy. That said, projects are managed; otherwise they would be chaotic or subject to internet trolls. This page seeks to clarify some of the issues that can arise regarding these notions of ownership and control. We’ll explore the following questions: 1. Who “owns” a project? Does it matter? 1. Who controls the project? 1. What happens during conflicts? ### Ownership The copyright owner of a project has the right to publish an open source project. Meaning, you don't have the right to take someone else’s code and publish it for them -- much like I don't have the right to take pictures from your phone and share them. Only the rights holder has those rights. Normally the author of a work is the copyright owner. However in many jurisdictions, an author can assign the copyrights via contract. In fact this is how most companies use code authored by their employees. They enter into a contractual agreement with the employee where the employee assigns the code copyrights and the company pays the employee a salary. Thus when an employee writes code for work, the company gets the copyrights allowing it to use the code and publish the code as open source. The details of the contract stipulate what code is included in the category “work for hire” and some jurisdictions add limitations on what can be assigned. So who owns source code? Let’s define `ownership` in this context as being the copyright holder. (Later we’ll explore where this notion differs from control.) 
Initially the author of code is the copyright holder. For work-related code, the author assigns the code to the company. The direct implication: the company can use the code without fear the employee will sue the company. The indirect implication: the company can publish the code (even if the employee leaves the company or does not want the code published), and the company can block the code being published (even if the employee wants to publish it, or just wants to take the code with them to their next job when they leave the company).

The company, as copyright holder, has the upper hand in deciding the fate of the code. This can pose an emotional conflict with employees who feel attached to their creative work. They feel that _they_ own the code (they don’t feel as moved to refund the paychecks that were given in consideration of the copyright assignment agreement). The resolution of rights ownership is often addressed by an impassive legal examination of the legal arrangements the employee made with the employer. Companies with an OSPO can leverage the OSPO to help clarify and resolve conflicts and questions, should they arise.

Typical conflict scenarios include:

1. **The employee wishes to take unpublished code with her when she leaves for another company.** This is usually an explicit violation of employment terms and the company’s information security policies. Moreover, the pre-publication SDRT review helps ensure code does not contain confidential or sensitive information (e.g. passwords). Employees should not copy company code to be used outside the context of work without explicit approval (e.g. a published project with an open source license). Many requests to publish code are issued by employees who are planning to leave the company and want to take the code they were working on to their next job. By asking for an open source license, they protect themselves (and their next company) from complications.
1. **The employee wishes to publish code she worked on as an open source project on her _personal_ GitHub account.** Before publishing code, we'll ask: who is the copyright holder? If this is _personal_ code (e.g. written by the employee while self-employed, or code not assigned to the company), the company cannot publish the code and generally has nothing to say about what employees do with their own property (assuming it does not violate any laws, corporate ethics considerations, etc.). If this is _company_ code, and if the SDRT approves the publication, the employee can host a copy of the code on her personal GitHub account (in fact, once published by the SDRT, anyone can host a copy of the code on their personal account too).
1. **The employee wishes to publish code she worked on as an open source project on the _company’s_ GitHub organization.** The SDRT will ask: who is the copyright holder? If it’s company code, and if the SDRT approves the publication, we would likely publish the code on the company's GitHub account. But if this is personally-owned code, it is uncommon to host an employee’s personal project in the company account. Company accounts are not ideal for forks (they get stale quickly) and it’s unclear what should happen to the code when the employee leaves.
1. **The company wishes to publish code, but one of the employee authors does not want the code published.** In this case, the SDRT should explore _why_ one of the authors does not want the code published. Before publishing code, the SDRT will encourage the team publishing the code to ensure that all the people involved in the project agree the code should be published. Sometimes a developer feels their code is not ready to be published; sometimes they want to be more involved in the process. Ultimately, one employee who is not the copyright holder cannot unilaterally block a publication. But we advise understanding the situation before acting by fiat. It's always better to understand issues _before_ code is published.
1. **An employee wishes to take an existing open source project in a different direction than the project is going.** Once a project is published under an open source license, anyone can fork the project. Provided an employee complies with the project license, they can do as they wish with the code (considering legal, ethical, and related issues, etc.). Ideally employees would work together toward a shared outcome for a project. But if two groups of employees disagree on a project, it's better they each write the code they want to see and not argue about it, certainly not in public. Arguments don’t make code better, but working code wins arguments.

### Control

Control is closely related to ownership. We often confuse the two. Consider if someone gave you the right to use his car whenever you wished. You could park it in your driveway, drive it anytime, and use it as you please. It would _feel_ like your car since you have the controls that ownership usually entails. However, if you tried to sell the car, you’d have to deal with the fact that someone else owns the title to the car. This gets more complicated with intellectual property, since source code is not inherently excludable, nor is your use of code rivalrous to someone else’s use. Control often feels like a close proxy to ownership.

So who controls the open source project? How does control even work?

For company-published open source projects, an employee controls the project, often based on objectives that are in harmony with the company. We expect the lead maintainer to invite external participation and eventually get non-employees to become project maintainers too. This often results in one of three outcomes:

* The project runs its course as a company-managed project for as long as there is activity and interest in the project.
* The community takes a more active role in the project and we consider transferring the project to a software foundation so that the project is community-controlled.
* All employee maintainers leave the company and the company offers the project to the community or archives it.

### Conflicts

This discussion about ownership and control may cause us to forget that open source challenges the models and allows us to think differently about projects. No code runs because of only one person. All source code projects are based on language fundamentals, libraries, and other code projects. Source code can be improved by contributions from others who are not the code authors. Rather than asserting ownership rights and exclusive control of code, successful projects convey that the future of a project rests in the hands of an emergent community who, together, shapes the growth of the code.

When developers view code as a fixed asset, the code rarely becomes a communal project. When a developer starts an argument with "but this code is mine..." she shuts down the notion that others have contributed to it, and shuts out those others' will to contribute further. It is a reality that open source projects have copyright holders and maintainers who control which pull requests are merged. But open source also invites the thinking that the project is not an owned fixed asset, but a growing potential. Our opportunity is not found in how we control it, but in how we encourage it to grow.

Yet when there is conflict between participants in a project, we need a resolution that does not destroy the project or the relationships developed within the community. So we encourage people to fork the project and create the reality they wish to see.

_Arguments don’t make code better, but working code wins arguments._
207.652174
947
0.802659
eng_Latn
0.999954
eee3afe4e79c0d94bc7bb6254d8fd30bc90299a2
3,834
md
Markdown
LearningPlanResources/Business Applications/Power Platform/Modern Analytics.md
petertuton/PartnerResources
d5209788c74dda1022f30aa4d344ae988f2a3b55
[ "MIT" ]
null
null
null
LearningPlanResources/Business Applications/Power Platform/Modern Analytics.md
petertuton/PartnerResources
d5209788c74dda1022f30aa4d344ae988f2a3b55
[ "MIT" ]
null
null
null
LearningPlanResources/Business Applications/Power Platform/Modern Analytics.md
petertuton/PartnerResources
d5209788c74dda1022f30aa4d344ae988f2a3b55
[ "MIT" ]
null
null
null
# Learning Plan Resources for Modern Analytics/Power BI

## Fundamentals

* [Power BI Guided Learning](https://docs.microsoft.com/en-us/power-bi/guided-learning/) (Self-Paced)
* [Consume data with Power BI](https://docs.microsoft.com/en-us/learn/paths/consume-data-with-power-bi/) (Self-Paced) (2 Hours)
* [Course PL-900T00-A: Microsoft Power Platform Fundamentals](https://docs.microsoft.com/en-us/learn/certifications/courses/pl-900t00) (In-person Instructor Led) (2 Days)
* [How to Govern: Part 3 | Power BI Adoption Framework](https://www.youtube.com/watch?v=Zf0lCaGCSuU&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=7&t=0s) (Self-Paced) (31 Minutes)
* [Introduction to key roles: Part 3 | Power BI Adoption Framework](https://www.youtube.com/watch?v=CNq__EBhUCM&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=4&t=0s) (Self-Paced) (6 Minutes)
* [Introduction to the framework: Part 2 | Power BI Adoption Framework](https://www.youtube.com/watch?v=N6m0XxA_m5c&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=3&t=0s) (Self-Paced) (12 Minutes)
* [Introduction to the series: Part 1 | Power BI Adoption Framework](https://www.youtube.com/watch?v=e7Nb-XmrOfY&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=2&t=0s) (Self-Paced) (16 Minutes)
* [Microsoft Power Platform Fundamentals](https://docs.microsoft.com/en-us/learn/paths/power-plat-fundamentals/) (Self-Paced) (3 Hours)
* [Power BI Service Management: how to manage | Power BI Adoption Framework](https://www.youtube.com/watch?v=w-bWBE1nA_0&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=10&t=0s) (Self-Paced) (35 Minutes)
* [Power BI Service Management: licensing | Power BI Adoption Framework](https://www.youtube.com/watch?v=2CpdDLVUG8c&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=8&t=0s) (Self-Paced) (7 Minutes)
* [Power BI Service Management: what to manage | Power BI Adoption Framework](https://www.youtube.com/watch?v=pElZcks5nsw&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=9&t=0s) (Self-Paced) (7 Minutes)
* [What to Govern: Part 2 | Power BI Adoption Framework](https://www.youtube.com/watch?v=5n1JhQ8NLRw&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=6&t=0s) (Self-Paced) (5 Minutes)
* [Why to Govern: Part 1 | Power BI Adoption Framework](https://www.youtube.com/watch?v=QIsbkWH15-A&list=PL1N57mwBHtN0UZbEgLHtA1yxqPlae3B90&index=5&t=0s) (Self-Paced) (5 Minutes)

## Associate

* [Getting Started with Power BI](https://partner.microsoft.com/en-us/asset/collection/getting-started-with-power-bi#/) (Self-Paced) (17 Hours)
* [Create and use analytics reports with Power BI](https://docs.microsoft.com/en-us/learn/paths/create-use-analytics-reports-power-bi/) (Self-Paced) (6 Hours)
* [Dashboard in a Day](https://powerbi.microsoft.com/en-us/diad/) (Self-Paced) (1 Day)

## Expert

* [Power BI Developer Documentation](https://docs.microsoft.com/en-us/power-bi/developer/) (Self-Paced)
* [Technical Deep Dive on Power BI Common Scenarios](https://support.microsoft.com/en-us/help/4013850/power-bi-common-scenarios) (Self-Paced) (3 Hours)

## Specialist

* [Technical Deep Dive on Enterprise BI and Hybrid Scenarios with Power BI](https://support.microsoft.com/en-us/help/4456382/technical-deep-dive-on-enterprise-bi-and-hybrid-scenarios-with-power-b) (Self-Paced) (2 Hours)
* [Adopting Power BI for Embedded Applications](https://support.microsoft.com/en-us/help/4456372/adopting-power-bi-for-embedded-applications) (Self-Paced) (2 Hours)

## Community Resources

* [Power BI UG](https://www.pbiusergroup.com/home) (Self-Paced)
* [Microsoft Power BI Community](https://community.powerbi.com/) (Self-Paced)

## Events

* [Microsoft Power BI Events](https://community.powerbi.com/t5/Events/ct-p/Events) (Self-Paced)

## Certifications

* [Exam PL-900: Microsoft Power Platform Fundamentals](https://docs.microsoft.com/en-us/learn/certifications/exams/pl-900) (Self-Paced)
81.574468
219
0.76891
yue_Hant
0.722707
eee4d745041e65316f86936e7c27c2c866cce6d9
1,303
md
Markdown
aspnet/web-forms/videos/aspnet-dynamic-data/begin-modifying-dynamic-data-applications-with-url-routing.md
itistnet/Docs.ko-kr
f5955b82b1767e4a1950c6b0f19002b7069429d6
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnet/web-forms/videos/aspnet-dynamic-data/begin-modifying-dynamic-data-applications-with-url-routing.md
itistnet/Docs.ko-kr
f5955b82b1767e4a1950c6b0f19002b7069429d6
[ "CC-BY-4.0", "MIT" ]
null
null
null
aspnet/web-forms/videos/aspnet-dynamic-data/begin-modifying-dynamic-data-applications-with-url-routing.md
itistnet/Docs.ko-kr
f5955b82b1767e4a1950c6b0f19002b7069429d6
[ "CC-BY-4.0", "MIT" ]
null
null
null
---
uid: web-forms/videos/aspnet-dynamic-data/begin-modifying-dynamic-data-applications-with-url-routing
title: Begin Modifying Dynamic Data Applications with URL Routing | Microsoft Docs
author: JoeStagner
description: This video introduces ASP.NET Dynamic Data URL routing and demonstrates how to configure your application's URLs using routing options.
ms.author: riande
ms.date: 10/23/2008
ms.assetid: 9170d70c-928b-48a8-8f0a-4def9dc99256
msc.legacyurl: /web-forms/videos/aspnet-dynamic-data/begin-modifying-dynamic-data-applications-with-url-routing
msc.type: video
ms.openlocfilehash: 630e6dcb6572ede91f426396e5598aa25f08f840
ms.sourcegitcommit: 45ac74e400f9f2b7dbded66297730f6f14a4eb25
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 08/16/2018
ms.locfileid: "41827726"
---
<a name="begin-modifying-dynamic-data-applications-with-url-routing"></a>Begin Modifying Dynamic Data Applications with URL Routing
====================

[Joe Stagner](https://github.com/JoeStagner)

This video introduces ASP.NET Dynamic Data URL routing and demonstrates how to configure your application's URLs using routing options.

[&#9654; Watch video (5 minutes)](https://channel9.msdn.com/Blogs/ASP-NET-Site-Videos/begin-modifying-dynamic-data-applications-with-url-routing)

> [!div class="step-by-step"]
> [Previous](begin-editing-the-templates-in-aspnet-dynamic-data-applications.md)
> [Next](enable-in-line-editing-in-aspnet-dynamic-data-applications.md)
44.931034
130
0.768995
kor_Hang
0.988785
eee61622a6c3c1e555bf9e0e947f1d67273029d9
1,885
md
Markdown
README.md
satty9753/SACharts
a076e6d8c12948bf03bfaccff4bd542b1f784a0d
[ "MIT" ]
null
null
null
README.md
satty9753/SACharts
a076e6d8c12948bf03bfaccff4bd542b1f784a0d
[ "MIT" ]
null
null
null
README.md
satty9753/SACharts
a076e6d8c12948bf03bfaccff4bd542b1f784a0d
[ "MIT" ]
null
null
null
# SACharts

This is a framework all about gradient charts.

## Installation

### cocoapods

```
pod 'SACharts'
```

### swift package manager

```
.package(url:"https://github.com/satty9753/SACharts")
```

## Demo

<img src="https://github.com/satty9753/SACharts/blob/master/demo_images/singleArc.PNG?raw=true" alt="singleArc" width="300">
<img src="https://github.com/satty9753/SACharts/blob/master/demo_images/pieChart.PNG?raw=true" alt="pieChart" width="300">
<img src="https://github.com/satty9753/SACharts/blob/master/demo_images/arcsChart.PNG?raw=true" alt="ArcsChart" width="300">
<img src="https://github.com/satty9753/SACharts/blob/master/demo_images/concentricChart.PNG?raw=true" alt="ConcentricChart" width="300">

## Usage

#### SingleCircleChart

```swift
import UIKit
import SACharts

class ViewController: UIViewController {

    @IBOutlet weak var chartView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // create a gradient chart
        let circle = SingleCircleChart(frame: CGRect(x: 0, y: 0, width: view.frame.width/2, height: view.frame.width/2))
        // draw chart
        circle.drawArcsChart(value: [0.7, 0.5, 0.7], gradientColors: [UIColor.gradientDefaultBlue, UIColor.gradientDefaultYellow, UIColor.gradientDefaultGreen])
        // add chart to view
        self.chartView.addSubview(circle)
    }
}
```

#### ConcentricChart

```swift
let circles = ConcentricChart(frame: CGRect(x: 0, y: 0, width: view.frame.width/2, height: view.frame.width/2))
// set appearance for label & color
circles.setItems(contents: ["aaa", "bbb", "ccc"], colors: [UIColor.gradientDefaultBlue, UIColor.gradientDefaultYellow, UIColor.gradientDefaultPurple])
// set value
circles.draw(numbers: [300, 100, 64])
```

### !IMPORTANT!

✅ Put your gradient chart inside a view, and adjust the view's frame through autoLayout

❌ Change the gradient chart's origin directly
33.660714
160
0.724138
kor_Hang
0.286893
eee61f82c8a2fe0d7b47e7d14698957bad2fe5ad
23
md
Markdown
JavaMultithreadsDemo/README.md
manet/blog-java-demos
a2f1e0972ec0d8231d2f979ed98e8f64495875a0
[ "MIT" ]
null
null
null
JavaMultithreadsDemo/README.md
manet/blog-java-demos
a2f1e0972ec0d8231d2f979ed98e8f64495875a0
[ "MIT" ]
null
null
null
JavaMultithreadsDemo/README.md
manet/blog-java-demos
a2f1e0972ec0d8231d2f979ed98e8f64495875a0
[ "MIT" ]
null
null
null
Java Multi Thread Demo
11.5
22
0.826087
por_Latn
0.287871
eee62160009b10ce57a4cb29057a97819c68bcd3
749
md
Markdown
README.md
Team-KKU/KkuPart
6d53ee20c8e826dc1ecbcbbb308f482f02983b35
[ "MIT" ]
null
null
null
README.md
Team-KKU/KkuPart
6d53ee20c8e826dc1ecbcbbb308f482f02983b35
[ "MIT" ]
null
null
null
README.md
Team-KKU/KkuPart
6d53ee20c8e826dc1ecbcbbb308f482f02983b35
[ "MIT" ]
null
null
null
# KkuPart

Explore Good News about Seoul, Korea Apartment Prices

- Title: The impact of real-estate policy on actual transaction prices [analysis of housing-price catalysts]
- Target building type: apartments
- Target area: Wirye New Town
- Source data
    - Actual transaction prices: public real-estate transaction-price data (collected via API, files, etc.)
    - Policy: real-estate news data (scraping)
- Analysis content
    - Matching articles to each type of catalyst
    - Analyzing the relationship between article trends and actual-transaction-price trends
- Future work
    - Survey related data
        - Real-estate article data
        - Sale-price data
        - (Planned) real-estate policy announcement data
    - How to obtain each dataset
        - Scraping, APIs
    - Data management plan
        - For now, upload to a personal drive and share
        - A management plan still needs to be established
    - Data exploration (in progress)
        - Article data
            - Extract real-estate keywords by year
            - Libraries used: konlpy, soynlp, kr-wordrank, khaiii
        - Sale-price data
            - Check time-series trends

--------------

Notes

1. Using Jupyter Lab
    - pip install jupyterlab
    - "> jupyter lab"
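The keyword-extraction step above lists konlpy, soynlp, kr-wordrank, and khaiii. As a library-free placeholder, the same idea can be sketched with a simple token-frequency count; the headlines and stopword set below are made up for illustration and are not taken from the real dataset:

```python
from collections import Counter

def top_keywords(headlines, stopwords, k=3):
    """Count whitespace tokens across headlines and return the k most common,
    skipping stopwords. A crude stand-in for konlpy/kr-wordrank extraction."""
    counts = Counter(
        tok
        for line in headlines
        for tok in line.split()
        if tok not in stopwords
    )
    return [word for word, _ in counts.most_common(k)]

if __name__ == "__main__":
    # Hypothetical sample headlines (not from the real dataset).
    headlines = [
        "위례 아파트 실거래가 상승",
        "위례 신도시 교통 호재",
        "아파트 정책 발표 이후 실거래가 변동",
        "위례 아파트 호재 분석",
    ]
    stopwords = {"이후"}
    print(top_keywords(headlines, stopwords, k=3))
```

A real pass would replace the whitespace tokenizer with morphological analysis from konlpy (Korean words inflect, so raw tokens over-split) or the graph-based ranking of kr-wordrank.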
21.4
59
0.5247
kor_Hang
1.00001
eee762f63728522993569ee4835a728ce0f31f36
6,191
md
Markdown
README.md
sthagen/pytorch_active_learning
4d97ad42d95f73f564d05357389b97b45d4972a2
[ "MIT" ]
48
2019-03-27T10:06:24.000Z
2021-12-13T12:38:37.000Z
README.md
sthagen/pytorch_active_learning
4d97ad42d95f73f564d05357389b97b45d4972a2
[ "MIT" ]
4
2019-03-28T12:38:18.000Z
2020-01-09T17:29:58.000Z
README.md
sthagen/pytorch_active_learning
4d97ad42d95f73f564d05357389b97b45d4972a2
[ "MIT" ]
4
2019-03-28T07:36:07.000Z
2020-04-28T02:45:35.000Z
# PyTorch Active Learning

Library for common Active Learning methods to accompany:

Human-in-the-Loop Machine Learning
Robert Munro
Manning Publications
https://www.manning.com/books/human-in-the-loop-machine-learning

The code is stand-alone and can be used with the book.

# Active Learning methods in the library

The code currently contains methods for:

*Least Confidence sampling*

*Margin of Confidence sampling*

*Ratio of Confidence sampling*

*Entropy (classification entropy)*

*Model-based outlier sampling*

*Cluster-based sampling*

*Representative sampling*

*Adaptive Representative sampling*

*Active Transfer Learning for Uncertainty Sampling*

*Active Transfer Learning for Representative Sampling*

*Active Transfer Learning for Adaptive Sampling (ATLAS)*

The book covers how to apply them independently, in combination, and for different use cases in Computer Vision and Natural Language Processing. It also covers strategies for sampling for real-world diversity to avoid bias.

## Installation:

If you clone this repo and already have PyTorch installed, you should be able to get going immediately:

`git clone https://github.com/rmunro/pytorch_active_learning`

`cd pytorch_active_learning`

### Running Chapter 2, Getting Started with Human-in-the-Loop Machine Learning

`python active_learning_basics.py`

When you run the software, you will be prompted to classify news headlines as being disaster-related or not. The prompt will also give you the option to see precise definitions for what constitutes "disaster-related". You can also read those definitions in the code in the `detailed_instructions` variable:

https://github.com/rmunro/pytorch_active_learning/blob/master/active_learning_basics.py

After you have classified (annotated) enough data for evaluation and to begin training, you will see that machine learning models now train after each iteration of annotation, reporting the accuracy on your held-out evaluation data as F-Scores and AUC.
After the initial iteration of training, which will just be on randomly-chosen data, you will start to see Active Learning kick in to find unlabeled items that the model is confused about or that are outliers with novel features. The Active Learning will be evident in the annotations, too, as the disaster-related headlines will be very rare initially, but should become around 40% of the data that you are annotating after a few iterations.

### Running Chapter 4, Diversity Sampling

`python diversity_sampling.py`

This builds on the earlier dataset. See the chapter for the details of the feature flags that allow you to sample using different types of Diversity Sampling, like Model-based Outliers, Clustering, and Representative Sampling.

## Requirements:

The code assumes that you are using python3.6 or later. If you really need to get this working on python2.\*, please let me know: the PyTorch and Active Learning algorithms _should_ all be 2.\* compliant and it is only python's methods for getting command-line inputs that will need to be changed (python2.\* expects integer inputs only). If enough people request it, then I'll try to update the code to be compatible with earlier versions of python!

## Installing PyTorch:

### AWS

I recommend using the Deep Learning AMI on AWS, because PyTorch is already installed and can be activated with:

`source activate pytorch_p36`

That should be all you need to run the program immediately.

For more details on using PyTorch on AWS, see:
https://docs.aws.amazon.com/dlami/latest/devguide/tutorial-pytorch.html

### Google Cloud

I recommend using a PyTorch image for a Deep Learning virtual machine on Google Cloud, because PyTorch is already installed.
Both the CPU and GPU images should work:

`pytorch-latest-cpu`

`pytorch-latest-gpu`

For more details on using PyTorch on Google Cloud, see:
https://cloud.google.com/deep-learning-vm/docs/images

### Microsoft Azure

I recommend using a Data Science pre-configured virtual machine on Microsoft Azure:
https://azure.microsoft.com/en-us/develop/pytorch/

The Azure Notebook option might also be a good option, but I haven't tested it out: please let me know if you do!

### Linux / Mac / Windows

If you're installing locally or on a cloud server without PyTorch pre-installed, you can use these options:

Mac: `conda install pytorch torchvision -c pytorch`

Linux/Windows: `conda install pytorch torchvision cudatoolkit=9.0 -c pytorch`

These local instructions are current as of June 2019. PyTorch are great about maintaining quickstart instructions, so I recommend going there if these commands don't work for you for some reason. See "QUICK START LOCALLY" at:
https://pytorch.org/

Mac users should also make sure they are using python3.6 or later, as Macs still ship with python2.7 by default. See above re support for 2.7 if you really require it.

For pip users, it is possible that you can install pytorch with the following command:

`pip3 install torch`

However, this sometimes works and sometimes doesn't depending on the versions of various libraries and your exact operating system. That's why `conda` is recommended over `pip` on the pytorch website.

## Data Sources

Currently, the data used is from the "Million News Headlines" dataset posted on Kaggle:
https://www.kaggle.com/therohk/million-headlines

The data is taken from headlines from Australia's "ABC" news organization. They are in Australian English, which will be closer to UK English than US English, but a complete lexical subset of UK & US English, differing only in that some words in Australian English have meanings that do not occur in UK or US English. However, I intend to replace it soonish.
The headlines are all lower-case and stripped of all characters other than a-z and 0-9: no punctuation, accented characters, etc. Many of the headlines seem to be truncated for some reason, too. So, I will update it with a dataset that is closer to true headlines.

This dataset is perfectly fine for everything that you need to learn in this code - it is just that the resulting annotations/models will be less useful in real-world situations.
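The uncertainty-sampling scores listed in the methods section above (Least Confidence, Margin of Confidence, Entropy) can be sketched without any dependency on PyTorch. This is a hedged illustration: the normalizations shown are one common convention and are not necessarily the exact formulas used in this repository.

```python
import math

# Hedged sketch: these normalizations are one common choice and may not match
# this repository's exact implementations. All functions assume `probs` is a
# normalized probability distribution over two or more classes.

def least_confidence(probs):
    """1 minus the top probability, scaled to [0, 1] by the number of classes."""
    n = len(probs)
    return (1.0 - max(probs)) * n / (n - 1)

def margin_confidence(probs):
    """1 minus the gap between the two most likely classes."""
    top, second = sorted(probs, reverse=True)[:2]
    return 1.0 - (top - second)

def entropy_score(probs):
    """Classification entropy, normalized by the maximum entropy log2(n)."""
    raw = -sum(p * math.log2(p) for p in probs if p > 0)
    return raw / math.log2(len(probs))

if __name__ == "__main__":
    probs = [0.1, 0.7, 0.2]  # hypothetical softmax output for one unlabeled item
    print(least_confidence(probs), margin_confidence(probs), entropy_score(probs))
```

Higher scores indicate more uncertainty; ranking unlabeled items by one of these scores and annotating from the top of the ranking is the core uncertainty-sampling loop described in the book.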
48.748031
437
0.789856
eng_Latn
0.99659
eee898ae6be0c46c01f888c887a9028a3eb79436
2,433
md
Markdown
index.md
collinskandie/helix-pages
c7d653008c118ab893939b90a100ccb6260f1b25
[ "Apache-2.0" ]
2
2021-03-14T21:55:59.000Z
2021-04-14T07:28:17.000Z
index.md
collinskandie/helix-pages
c7d653008c118ab893939b90a100ccb6260f1b25
[ "Apache-2.0" ]
null
null
null
index.md
collinskandie/helix-pages
c7d653008c118ab893939b90a100ccb6260f1b25
[ "Apache-2.0" ]
null
null
null
# Helix Pages

Welcome to Helix Pages!

To use it, change the current URL to `https://<repo>--<owner>.project-helix.page`. `<owner>` and `<repo>` must refer to a valid Git repository.

Example: <https://helix-home--adobe.project-helix.page/README.html>

---

### Try it...

Simply paste a GitHub URL to a publicly visible Markdown file (`.md`) here...

<script>
function splitURL() {
  const giturl = document.getElementById('giturl').value;
  const resegs = /(?<!\?.+)(?<=\/)[\w-\.]+(?=[/\r\n?]|$)/g;
  const segments = [...giturl.matchAll(resegs)];
  const path = giturl.substr(segments[4].index + segments[4][0].length);
  return ({
    "user": segments[1][0],
    "repo": segments[2][0],
    "branch": segments[4][0],
    "path": path});
}

function change(evt) {
  if (evt.key === 'Enter') return takeMeThere();
  const alertElem = document.getElementById('alert');
  const alert = checkURL();
  if (alert) {
    alertElem.innerHTML = alert;
    alertElem.style = '';
  } else {
    alertElem.style = 'display: none';
  }
}

function checkURL() {
  let c;
  try {
    c = splitURL();
  } catch (e) {
    return ('URL needs to be a valid GitHub URL');
  }
  if (!c.path.endsWith(`.md`)) return ('URL needs to end in \'.md\'');
  if (c.repo.indexOf('.') >= 0) return ('Repository name cannot contain a \'.\'');
  if (c.user.indexOf('.') >= 0) return ('User name cannot contain a \'.\'');
  if (c.branch.indexOf('.') >= 0) return ('Branch cannot contain a \'.\'');
}

function takeMeThere() {
  if (checkURL()) {
    return;
  }
  const c = splitURL();
  const separator = '--';
  const pathstub = c.path.substr(0, c.path.length - 3);
  const branchprefix = (c.branch === 'master' ? '' : c.branch + separator);
  const url = `https://${branchprefix}${c.repo}${separator}${c.user}.hlx.page${pathstub}.html`;
  window.location = url;
}
</script>

<input onkeyup="change(event)" type="text" id="giturl" aria-label="Github URL" placeholder="GitHub URL"></input>
<span id="alert" class="alert" style="display:none"></span>
<button id="takemethere" onclick="takeMeThere()">Take Me There</button>

## For developers

- [Add an Atom feed to your site](docs/feed.md)
- [Add a Sitemap to your site](docs/sitemap.md)
- [Add an Index to your site](https://github.com/adobe/helix-home/blob/main/docs/setup-indexing.md)

## For authors

- [Add the Sidekick to your bookmark bar](tools/sidekick/)
31.192308
112
0.6194
eng_Latn
0.349898
eeea2e939756c086383a2e5c99f246e7f5f6a24c
2,108
md
Markdown
README.md
wtschueller/tdk
cdbbbe26781061d13c937f3885c6c7183fa52ced
[ "BSD-3-Clause" ]
45
2018-03-29T22:09:55.000Z
2022-02-07T11:31:36.000Z
README.md
wtschueller/tdk
cdbbbe26781061d13c937f3885c6c7183fa52ced
[ "BSD-3-Clause" ]
5
2018-06-30T01:19:48.000Z
2020-08-05T21:01:14.000Z
README.md
wtschueller/tdk
cdbbbe26781061d13c937f3885c6c7183fa52ced
[ "BSD-3-Clause" ]
24
2018-03-29T19:02:32.000Z
2021-08-05T07:32:38.000Z
# TDK

Tcl Dev Kit (TDK)

# Overview

Tcl Dev Kit (TDK) includes everything you need for fast development of self-contained, easily-deployable applications. Turn your Tcl programs into ready-to-run executables, starkits or starpacks, for Windows, Mac OS X, Linux, Solaris, AIX and HP-UX. Simplify development with tools for finding and fixing bugs, managing complex code bases and optimizing your programs. Easily reveal unused or overused code with the coverage and hotspot analyzer. Take control and work the way you want with a choice of GUIs or command line interfaces for most tools.

# General structure / directory organization

* app - application sources, including main entry points. Some debug helper code which can be sourced by apps.
* data - Images used by the apps.
* docs - Internal dev notes and the official documentation.
* lib - All the supporting packages. Some overlap with the `teapot` project.
* misc - A hack-week project, incomplete, which never made it into the product.
* pkg-src - Sources for the `win32` package. Written in C for access to some Windows system information (mainly paths).

# Images

* `data/images/about.gif` is a placeholder for the background image of an about dialog. The supplied image is all-grey.

  Note: It is actually not clear if this image is still referenced, and if yes, where.

* `artwork/splash.png` is a similar placeholder, for the splash screen.

  Note: The places referencing this image (`app/*/main.tcl`) are written on the assumption of unwrapped execution in the directory structure of the checkout. The knowledge of the expected structure is used to locate the image relative to the code file referencing it. For wrapped execution the original build system put the file `main_std.tcl` into the app as `ms.tcl` and added the splash setup code referencing the wrapped image.

# License

Copyright (c) 2018 ActiveState Software

Tcl Dev Kit (TDK) is released under the BSD-3 license. See LICENSE file for details.
37.642857
300
0.731499
eng_Latn
0.998676
eeea96a9d0f8d6393a52cd7458c8054ba908c411
54
md
Markdown
README.md
aknuds1/hapi-cycle
ff9796fd204e53ebf8cbb4fbb7b72653bd4a2160
[ "MIT" ]
null
null
null
README.md
aknuds1/hapi-cycle
ff9796fd204e53ebf8cbb4fbb7b72653bd4a2160
[ "MIT" ]
null
null
null
README.md
aknuds1/hapi-cycle
ff9796fd204e53ebf8cbb4fbb7b72653bd4a2160
[ "MIT" ]
null
null
null
# Hapi Cycle

Example isomorphic Hapi.js/Cycle.js app.
18
40
0.777778
eng_Latn
0.870945
eeeb5cf803ba741603169175efc0c4e48f833d35
168
md
Markdown
content/noesporno/2008/11/cortito-xxiv.md
mazza-org/mazza.com.ar
3ac0eb70e9db35f77c69f6861f8bfdb7c6935403
[ "MIT" ]
null
null
null
content/noesporno/2008/11/cortito-xxiv.md
mazza-org/mazza.com.ar
3ac0eb70e9db35f77c69f6861f8bfdb7c6935403
[ "MIT" ]
null
null
null
content/noesporno/2008/11/cortito-xxiv.md
mazza-org/mazza.com.ar
3ac0eb70e9db35f77c69f6861f8bfdb7c6935403
[ "MIT" ]
null
null
null
---
title: "Cortito (XXIV)"
date: "2008-11-13"
---

> Originally published on [noesporno](/noesporno).

_Ice cream is not drunk, it is **eaten**. Get that through your heads once and for all._
18.666667
60
0.672619
spa_Latn
0.907234