hexsha stringlengths 40 40 | size int64 5 1.04M | ext stringclasses 6 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 3 344 | max_stars_repo_name stringlengths 5 125 | max_stars_repo_head_hexsha stringlengths 40 78 | max_stars_repo_licenses listlengths 1 11 | max_stars_count int64 1 368k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 3 344 | max_issues_repo_name stringlengths 5 125 | max_issues_repo_head_hexsha stringlengths 40 78 | max_issues_repo_licenses listlengths 1 11 | max_issues_count int64 1 116k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 3 344 | max_forks_repo_name stringlengths 5 125 | max_forks_repo_head_hexsha stringlengths 40 78 | max_forks_repo_licenses listlengths 1 11 | max_forks_count int64 1 105k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 5 1.04M | avg_line_length float64 1.14 851k | max_line_length int64 1 1.03M | alphanum_fraction float64 0 1 | lid stringclasses 191 values | lid_prob float64 0.01 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1c60b710a5ac7c5d35abcea95b51866bef4f745c | 1,338 | md | Markdown | README.md | alyssa0528/eats-abroad-rails | 74ea8e6eae88bf57b292ad0d7385b61831961db4 | [
"MIT"
] | null | null | null | README.md | alyssa0528/eats-abroad-rails | 74ea8e6eae88bf57b292ad0d7385b61831961db4 | [
"MIT"
] | null | null | null | README.md | alyssa0528/eats-abroad-rails | 74ea8e6eae88bf57b292ad0d7385b61831961db4 | [
"MIT"
] | null | null | null | # Welcome to the Eats Abroad restaurant recommendation app!
## Usage
The Eats Abroad Rails application is loosely based on the concept of eatsabroad.com, a site where I interviewed chefs about their favorite restaurants and bars in their cities. This app allows chefs to create a list of their favorite restaurants and share it with other chefs. Chefs are required to leave a comment that provides more details on favorite dishes and other notable aspects of the recommended establishment.
## Deployed Application
Eats Abroad is deployed on Heroku. Visit https://eats-abroad-rails.herokuapp.com to play around. Use email **test@test.com** and password **test**.
## Local Install instructions
Clone the repo, then from the terminal navigate to the eats-abroad-rails folder and:

1. Run `bundle install`.
2. Run `rake db:migrate`.
3. Run `rake db:seed` to populate the database with initial data.
4. Run `rails s` and go to the specified server via your browser.
## Contributors' Guide
Bug reports and pull requests are welcome on GitHub at https://github.com/alyssa0528/eats-abroad-rails. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the Contributor Covenant code of conduct.
## License
Please refer to the MIT License documentation in the license.txt file for more information.
| 60.818182 | 418 | 0.792227 | eng_Latn | 0.997519 |
1c60bea56537a61a7fb34ada9ec8442f3d9058a1 | 1,040 | md | Markdown | Engineering/Availability.md | thomassajot/brain-dam | 31301cc9142064e5a49d3b7a8e4734ae27131c69 | [
"MIT"
] | null | null | null | Engineering/Availability.md | thomassajot/brain-dam | 31301cc9142064e5a49d3b7a8e4734ae27131c69 | [
"MIT"
] | null | null | null | Engineering/Availability.md | thomassajot/brain-dam | 31301cc9142064e5a49d3b7a8e4734ae27131c69 | [
"MIT"
] | null | null | null | What happens if the system fails? How fault tolerant is the system? Availability is the percentage of time the system is satisfying its primary functions.
# SLA / SLO

Service Level Agreement (SLA) and Service Level Objective (SLO). SLOs are components of an SLA.
# Nines
The percentage of time the system runs according to its SLA is usually expressed in "nines": 99% availability in a year is two nines, and 99.9%, 99.999%, ... are three and five nines respectively.

Systems with `High Availability (HA)` are systems with five or more nines, which works out to roughly 5 minutes of downtime per year. "High availability" is an official industry term.
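As a quick sanity check, the downtime budget implied by a given number of nines can be computed directly. This is a minimal sketch; 525,600 is the number of minutes in a 365-day year:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(nines: int) -> float:
    """Allowed downtime per year for an availability of `nines` nines."""
    availability = 1 - 10 ** (-nines)  # e.g. 3 nines -> 0.999
    return MINUTES_PER_YEAR * (1 - availability)

for n in (2, 3, 5):
    print(n, round(downtime_minutes_per_year(n), 2))
    # prints: 2 5256.0 / 3 525.6 / 5 5.26
```

Five nines really does work out to roughly five minutes of downtime per year.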
# Redundancy
Removing single points of failure is a way to improve availability. Redundancy means duplicating (or further replicating) your servers.

`Passive redundancy`: if a machine dies, the other machines already serving will pick up the traffic.

`Active redundancy`: only one or a few of the machines do the work. If one fails, the other machines will notice and pick up the work. For example, with leader election, if the leader machine fails another machine will become leader.
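A toy illustration of the passive-redundancy idea above: requests are spread over whichever replicas are still healthy, so losing one machine only shrinks the pool. The names and the round-robin policy are invented for the example:

```python
class Replica:
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

def route(request_id: int, replicas: list[Replica]) -> str:
    """Route a request to a healthy replica (round-robin over survivors)."""
    alive = [r for r in replicas if r.healthy]
    if not alive:
        raise RuntimeError("total outage: no healthy replicas")
    return alive[request_id % len(alive)].name

replicas = [Replica("a"), Replica("b"), Replica("c")]
print(route(0, replicas))    # "a"
replicas[0].healthy = False  # "a" dies; traffic is absorbed by the others
print(route(0, replicas))    # "b"
```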
1c613bd740adc3457525946fdf16194675b98278 | 1,334 | md | Markdown | desktop-src/AD/mappings-for-the-active-directory-users-and-computers-snap-in.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 552 | 2019-08-20T00:08:40.000Z | 2022-03-30T18:25:35.000Z | desktop-src/AD/mappings-for-the-active-directory-users-and-computers-snap-in.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 1,143 | 2019-08-21T20:17:47.000Z | 2022-03-31T20:24:39.000Z | desktop-src/AD/mappings-for-the-active-directory-users-and-computers-snap-in.md | citelao/win32 | bf61803ccb0071d99eee158c7416b9270a83b3e4 | [
"CC-BY-4.0",
"MIT"
] | 1,287 | 2019-08-20T05:37:48.000Z | 2022-03-31T20:22:06.000Z | ---
title: Mappings for the Active Directory Users and Computers Snap-in
description: The Active Directory Users and Computers snap-in includes property sheets and other user interfaces for the following object classes. The Object property sheet is common to all object classes.
ms.assetid: 4216aede-b513-4db9-a6de-1a79b3a6f74a
ms.tgt_platform: multiple
ms.topic: article
ms.date: 05/31/2018
---
# Mappings for the Active Directory Users and Computers Snap-in
The Active Directory Users and Computers snap-in includes property sheets and other user interfaces for the following object classes.
The Object property sheet is common to all object classes.
- [Computer Object User Interface Mapping](computer-object-user-interface-mapping.md)
- [Domain Object User Interface Mapping](domain-object-user-interface-mapping.md)
- [Group Object User Interface Mapping](group-object-user-interface-mapping.md)
- [Object Property Sheet](object-property-sheet.md)
- [Organizational Unit User Interface Mapping](organizational-unit-user-interface-mapping.md)
- [Printer Object User Interface Mapping](printer-object-user-interface-mapping.md)
- [Shared Folder Object User Interface Mapping](shared-folder-object-user-interface-mapping.md)
- [User Object User Interface Mapping](user-object-user-interface-mapping.md)
| 41.6875 | 204 | 0.793853 | eng_Latn | 0.915072 |
1c61acda3bf3908ba0bf1d1cb44a2a6442fea9a7 | 1,472 | md | Markdown | test.md | tcbutler320/tcbutler320.github.io | efa2122a7f999ad611c1caababab2e067b5e42fc | [
"MIT"
] | null | null | null | test.md | tcbutler320/tcbutler320.github.io | efa2122a7f999ad611c1caababab2e067b5e42fc | [
"MIT"
] | 2 | 2019-09-04T23:32:46.000Z | 2019-09-05T11:32:34.000Z | test.md | tcbutler320/tcbutler320.github.io | efa2122a7f999ad611c1caababab2e067b5e42fc | [
"MIT"
] | null | null | null | | Date | CVE | Title | Bounty |
|--- |--- |--- |---|
| 09/2021 | CVE-2021-38701| To Be Released| $250 |
| 08/2021 | CVE-2021-3441 | [HP Officejet - 'AirPrint' Cross Site Scripting (XSS)](https://tbutler.org/2021/04/29/hp-officejet-4630)| N/A |
| 06/2021 | CVE-2021-35956 | [AKCP sensorProbe - 'Multiple' Cross Site Scripting (XSS)](https://tbutler.org/2021/06/28/cve-2021-35956) | N/A |
| 05/2021 | N/A | [Authentication Bypass by Spoofing in Miodec/monkeytype](https://huntr.dev/bounties/1-other-Miodec/monkeytype/)| $40 |
| 05/2021 | N/A | [MonkeyType.com - Stored Cross-Site Scripting (XSS) via Tribe Chat](https://github.com/Miodec/monkeytype/issues/1476) | N/A |
| 05/2021 | N/A | [PHP Timeclock 1.04 - 'Multiple' Cross Site Scripting (XSS)](https://www.exploit-db.com/exploits/49853)| N/A |
| 05/2021 | N/A | [PHP Timeclock 1.04 - Time and Boolean Based Blind SQL Injection](https://www.exploit-db.com/exploits/49849) | N/A |
| 05/2021 | N/A | [MonkeyType.com - `Self` Cross Site Scripting (XSS) via Word History](https://github.com/Miodec/monkeytype/issues/1348) | N/A |
| 04/2021 | N/A | [BlockFi - Undisclosed Vulnerability](https://hackerone.com/tcbutler320?type=user)| $1,000 |
| 05/2020 | N/A | [Hinge - Modification of Assumed-Immutable Data](https://tbutler.org/assets/pdf/Butler,Tyler-MAID-Hinge-BBR.pdf) | $250 |
| 10/2020 | N/A | [TimeClock Software 1.01 0 - (Authenticated) Time-Based SQL Injection](https://www.exploit-db.com/exploits/48874) | N/A |
1c634e7cc51ead7b874d826be7c44b1129ef9e25 | 15 | md | Markdown | README.md | knguyen24/circle-k-project-tintin | eba6b4b06fb7a2130ea78c10d4eeec41d20d5859 | [
"MIT"
] | null | null | null | README.md | knguyen24/circle-k-project-tintin | eba6b4b06fb7a2130ea78c10d4eeec41d20d5859 | [
"MIT"
] | null | null | null | README.md | knguyen24/circle-k-project-tintin | eba6b4b06fb7a2130ea78c10d4eeec41d20d5859 | [
"MIT"
] | null | null | null | # W12-4-tintin
| 7.5 | 14 | 0.666667 | fin_Latn | 0.384048 |
1c63e262929920ec5dc8a1d5eb37fa254f6c70f2 | 632 | md | Markdown | _includes/components/content/datetime.md | ysamus/atata-framework.github.io | 7c62a0013e63c27f4a97e26f2fb07f0e256b8244 | [
"Apache-2.0"
] | 4 | 2018-05-07T20:50:16.000Z | 2020-12-14T12:07:01.000Z | _includes/components/content/datetime.md | ysamus/atata-framework.github.io | 7c62a0013e63c27f4a97e26f2fb07f0e256b8244 | [
"Apache-2.0"
] | null | null | null | _includes/components/content/datetime.md | ysamus/atata-framework.github.io | 7c62a0013e63c27f4a97e26f2fb07f0e256b8244 | [
"Apache-2.0"
] | 6 | 2018-05-07T20:46:00.000Z | 2020-12-14T12:07:04.000Z | Represents any element containing date and time content. Default search finds the first occurring element. The default format is `"g"` (general date/time pattern (short time), e.g. 6/15/2009 1:45 PM).
```html
<span id="date-time">5/15/2016 1:45 PM</span>
```
```cs
using Atata;
namespace SampleApp.UITests
{
using _ = SamplePage;
public class SamplePage : Page<_>
{
[FindById]
public DateTime<_> DateTime { get; private set; }
}
}
```
```cs
Go.To<SamplePage>().
DateTime.Should.Equal(new DateTime(2016, 5, 15, 13, 45, 0));
```
Supports `[Format]` and `[Culture]` settings attributes.
{:.info} | 24.307692 | 200 | 0.656646 | eng_Latn | 0.668746 |
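For intuition, the default `"g"` pattern (general date/time with short time) corresponds to month/day/year plus 12-hour time in an en-US culture. Here is a rough Python equivalent of the parsing involved; the `strptime` format string is an assumption mirroring .NET's `"g"`, not part of Atata:

```python
from datetime import datetime

def parse_general_short(value: str) -> datetime:
    """Parse a .NET "g"-style value like '5/15/2016 1:45 PM' (en-US)."""
    return datetime.strptime(value, "%m/%d/%Y %I:%M %p")

dt = parse_general_short("5/15/2016 1:45 PM")
print(dt.year, dt.hour, dt.minute)  # 2016 13 45
```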
1c646f1dee68b7f52d6a5449642e923833b8df1a | 11,460 | md | Markdown | articles/hadoop/itcast-hadoop-hdfs-java-api.md | bensondeng/qigangzhong.github.io | 724ceb4a52e19c31431800bb5c083f6898bc8e2e | [
"MIT"
] | null | null | null | articles/hadoop/itcast-hadoop-hdfs-java-api.md | bensondeng/qigangzhong.github.io | 724ceb4a52e19c31431800bb5c083f6898bc8e2e | [
"MIT"
] | null | null | null | articles/hadoop/itcast-hadoop-hdfs-java-api.md | bensondeng/qigangzhong.github.io | 724ceb4a52e19c31431800bb5c083f6898bc8e2e | [
"MIT"
] | 1 | 2019-11-20T03:21:43.000Z | 2019-11-20T03:21:43.000Z | # Hadoop Core - HDFS
## 1: HDFS API Operations

### 1.1 Configuring the Hadoop Environment on Windows

A Hadoop runtime environment must be configured on Windows; otherwise, running the code directly produces the following errors:

`Missing winutils.exe`

~~~shell
Could not locate executable null \bin\winutils.exe in the hadoop binaries
~~~

`Missing hadoop.dll`

~~~shell
Unable to load native-hadoop library for your platform… using builtin-Java classes where applicable
~~~

Steps:

Step 1: Copy the hadoop-2.7.5 folder to a path that contains no Chinese characters and no spaces.

Step 2: On Windows, set the HADOOP_HOME environment variable and add %HADOOP_HOME%\bin to PATH.

Step 3: Place the hadoop.dll file from the bin directory of the hadoop-2.7.5 folder into C:\Windows\System32.

Step 4: Restart Windows.
### 1.2 Importing Maven Dependencies
```xml
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>RELEASE</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
<!-- <verbal>true</verbal>-->
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<minimizeJar>true</minimizeJar>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
```
### 1.3 Accessing Data via URL (for reference)
~~~java
@Test
public void demo1() throws Exception {
    // Step 1: register the HDFS URL handler
    URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    // Open an input stream for the HDFS file
    InputStream inputStream = new URL("hdfs://node01:8020/a.txt").openStream();
    // Open an output stream for the local file
    FileOutputStream outputStream = new FileOutputStream(new File("D:\\hello.txt"));
    // Copy the file
    IOUtils.copy(inputStream, outputStream);
    // Close the streams
    IOUtils.closeQuietly(inputStream);
    IOUtils.closeQuietly(outputStream);
}
~~~
### 1.4 Accessing Data via the FileSystem API (master this)

#### 1.4.1 Main Classes Involved

Operating on HDFS from Java mainly involves the following classes:

* `Configuration`
  * An object of this class encapsulates the configuration of the client or server
* `FileSystem`
  * An object of this class is a file system object whose methods are used to operate on files; it is obtained through the static `get` method of FileSystem

```java
FileSystem fs = FileSystem.get(conf)
```

* The `get` method determines the concrete file system type from the value of the `fs.defaultFS` parameter in `conf`
* If our code does not specify `fs.defaultFS`, and no corresponding configuration is given on the project ClassPath, the default value in `conf` comes from `core-default.xml` inside the Hadoop jar
* The default value is `file:///`, in which case the object obtained is not a DistributedFileSystem instance but a local file system client
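Conceptually, the dispatch that `FileSystem.get` performs, picking a client implementation from the URI scheme of `fs.defaultFS`, looks like the following sketch (the class names and registry here are invented for illustration):

```python
class LocalFS: ...
class DistributedFS: ...

REGISTRY = {"file": LocalFS, "hdfs": DistributedFS}

def get_filesystem(conf: dict):
    """Pick a file-system client from the fs.defaultFS scheme (default file:///)."""
    default_fs = conf.get("fs.defaultFS", "file:///")
    scheme = default_fs.split(":", 1)[0]
    return REGISTRY[scheme]()

print(type(get_filesystem({})).__name__)                                      # LocalFS
print(type(get_filesystem({"fs.defaultFS": "hdfs://node01:8020"})).__name__)  # DistributedFS
```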
#### 1.4.2 Ways to Obtain a FileSystem
* Method 1

```java
@Test
public void getFileSystem1() throws IOException {
    Configuration configuration = new Configuration();
    // Specify the type of file system we use:
    configuration.set("fs.defaultFS", "hdfs://node01:8020/");
    // Obtain the specified file system
    FileSystem fileSystem = FileSystem.get(configuration);
    System.out.println(fileSystem.toString());
}
```

* Method 2

```java
@Test
public void getFileSystem2() throws Exception {
    FileSystem fileSystem = FileSystem.get(new URI("hdfs://node01:8020"), new Configuration());
    System.out.println("fileSystem:" + fileSystem);
}
```

* Method 3

```java
@Test
public void getFileSystem3() throws Exception {
    Configuration configuration = new Configuration();
    configuration.set("fs.defaultFS", "hdfs://node01:8020");
    FileSystem fileSystem = FileSystem.newInstance(configuration);
    System.out.println(fileSystem.toString());
}
```

* Method 4

```java
//@Test
public void getFileSystem4() throws Exception {
    FileSystem fileSystem = FileSystem.newInstance(new URI("hdfs://node01:8020"), new Configuration());
    System.out.println(fileSystem.toString());
}
```
#### 1.4.3 Listing All Files in HDFS

* Traversing with the API

```java
@Test
public void listMyFiles() throws Exception {
    // Obtain the FileSystem instance
    FileSystem fileSystem = FileSystem.get(new URI("hdfs://node01:8020"), new Configuration());
    // listFiles returns a RemoteIterator over all files and directories;
    // the first argument is the path to traverse, the second indicates whether to recurse
    RemoteIterator<LocatedFileStatus> locatedFileStatusRemoteIterator = fileSystem.listFiles(new Path("/"), true);
    while (locatedFileStatusRemoteIterator.hasNext()) {
        LocatedFileStatus next = locatedFileStatusRemoteIterator.next();
        System.out.println(next.getPath().toString());
    }
    fileSystem.close();
}
```
#### 1.4.4 Creating a Directory in HDFS

```java
@Test
public void mkdirs() throws Exception {
    FileSystem fileSystem = FileSystem.get(new URI("hdfs://node01:8020"), new Configuration());
    boolean mkdirs = fileSystem.mkdirs(new Path("/hello/mydir/test"));
    fileSystem.close();
}
```
#### 1.4.5 Downloading Files
```java
@Test
public void getFileToLocal()throws Exception{
FileSystem fileSystem = FileSystem.get(new URI("hdfs://node01:8020"), new Configuration());
FSDataInputStream inputStream = fileSystem.open(new Path("/timer.txt"));
FileOutputStream outputStream = new FileOutputStream(new File("e:\\timer.txt"));
IOUtils.copy(inputStream,outputStream );
IOUtils.closeQuietly(inputStream);
IOUtils.closeQuietly(outputStream);
fileSystem.close();
}
```
#### 1.4.6 Uploading Files to HDFS
```java
@Test
public void putData() throws Exception{
FileSystem fileSystem = FileSystem.get(new URI("hdfs://node01:8020"), new Configuration());
fileSystem.copyFromLocalFile(new Path("file:///c:\\install.log"),new Path("/hello/mydir/test"));
fileSystem.close();
}
```
#### 1.4.7 HDFS Access Permission Control

1. Stop the HDFS cluster; on the node01 machine run the following commands

```shell
cd /export/servers/hadoop-2.7.5
sbin/stop-dfs.sh
```

2. Modify the hdfs-site.xml configuration file on the node01 machine

```shell
cd /export/servers/hadoop-2.7.5/etc/hadoop
vim hdfs-site.xml
```

```xml
<property>
    <name>dfs.permissions.enabled</name>
    <value>true</value>
</property>
```

3. After the modification, send the configuration file to the other machines

```shell
scp hdfs-site.xml node02:$PWD
scp hdfs-site.xml node03:$PWD
```

4. Restart the HDFS cluster

```shell
cd /export/servers/hadoop-2.7.5
sbin/start-dfs.sh
```

5. Upload some files to the Hadoop cluster to prepare for testing

```shell
cd /export/servers/hadoop-2.7.5/etc/hadoop
hdfs dfs -mkdir /config
hdfs dfs -put *.xml /config
hdfs dfs -chmod 600 /config/core-site.xml
```

6. Use code to download the file

```java
@Test
public void getConfig() throws Exception {
    FileSystem fileSystem = FileSystem.get(new URI("hdfs://node01:8020"), new Configuration(), "hadoop");
    fileSystem.copyToLocalFile(new Path("/config/core-site.xml"), new Path("file:///c:/core-site.xml"));
    fileSystem.close();
}
```
#### 1.4.8 Merging Small Files

Hadoop is best at storing large files, because a large file carries relatively little metadata. If a Hadoop cluster holds a large number of small files, each small file needs its own metadata entry, which greatly increases the memory pressure of managing cluster metadata. So in practice, when necessary, small files should be merged into large files and processed together.

In HDFS shell mode, many HDFS files can be merged into one large file and downloaded locally from the command line:

```shell
cd /export/servers
hdfs dfs -getmerge /config/*.xml ./hello.xml
```

Since these small files can be merged into one large file on download, they can certainly also be merged into one large file on upload.
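The upload-side merge is plain stream concatenation. The same idea, sketched against a local directory in Python (the file names are invented for the example):

```python
from pathlib import Path

def merge_small_files(input_dir: str, output_file: str) -> int:
    """Concatenate every file in input_dir into a single output file."""
    count = 0
    with open(output_file, "wb") as out:
        for p in sorted(Path(input_dir).iterdir()):
            if p.is_file():
                out.write(p.read_bytes())
                count += 1
    return count
```

Calling `merge_small_files("input", "bigfile.txt")` mirrors what the Java example does with the HDFS output stream.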

```java
@Test
public void mergeFile() throws Exception {
    // Obtain the distributed file system
    FileSystem fileSystem = FileSystem.get(new URI("hdfs://192.168.52.250:8020"), new Configuration(), "root");
    FSDataOutputStream outputStream = fileSystem.create(new Path("/bigfile.txt"));
    // Obtain the local file system
    LocalFileSystem local = FileSystem.getLocal(new Configuration());
    // List the local files through the local file system, as an array
    FileStatus[] fileStatuses = local.listStatus(new Path("file:///E:\\input"));
    for (FileStatus fileStatus : fileStatuses) {
        FSDataInputStream inputStream = local.open(fileStatus.getPath());
        IOUtils.copy(inputStream, outputStream);
        IOUtils.closeQuietly(inputStream);
    }
    IOUtils.closeQuietly(outputStream);
    local.close();
    fileSystem.close();
}
```
## 2: The HDFS High Availability Mechanism

### 2.1 Introduction to HDFS High Availability

In Hadoop, the NameNode occupies a critical position: all metadata of the HDFS file system is managed by the NameNode, so the availability of the NameNode directly determines the availability of Hadoop. Once the NameNode process stops working, normal use of the entire cluster is affected.

In a typical HA cluster, two independent machines are configured as NameNodes. In a working cluster, one NameNode machine is in the Active state and the other is in the Standby state. The Active NameNode is responsible for all client operations in the cluster, while the Standby acts as a backup, keeping enough state to provide fast failover when needed.

![](images/image-20200318122509344.png)
### 2.2 Components

`ZKFailoverController`

A Zookeeper-based failover controller, responsible for controlling the NameNode active/standby switch. The ZKFailoverController monitors the health of the NameNode; when it finds the Active NameNode abnormal, it triggers a new election through Zookeeper to complete the Active/Standby transition.

`HealthMonitor`

Periodically calls the NameNode's HAServiceProtocol RPC interface (monitorHealth and getServiceStatus) to monitor the NameNode's health and report it back to the ZKFailoverController.

`ActiveStandbyElector`

Receives election requests from the ZKFC and completes the active/standby election automatically through Zookeeper. After the election it calls back into the ZKFailoverController's switch methods to transition the NameNode between Active and Standby.

`DataNode`

The NameNode holds the HDFS metadata and block information (the blockmap); the block information is reported by the DataNodes, which actively report to both the Active NameNode and the Standby NameNode.

`Shared storage system`

The shared storage system stores the HDFS metadata (the EditsLog). The Active NameNode writes it and the Standby NameNode reads it to keep metadata in sync. During a failover, the new Active NameNode must ensure metadata synchronization is complete before serving requests.
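A toy model of this failover flow: a health monitor marks the active node unhealthy, and an elector promotes a standby. The trivial first-healthy "election" is illustrative only; real HDFS delegates this to Zookeeper:

```python
class NameNode:
    def __init__(self, name: str):
        self.name = name
        self.healthy = True
        self.state = "standby"

def elect_active(nodes: list[NameNode]) -> NameNode:
    """Promote the first healthy node to active; demote everyone else."""
    healthy = [n for n in nodes if n.healthy]
    if not healthy:
        raise RuntimeError("no healthy NameNode available")
    for n in nodes:
        n.state = "standby"
    healthy[0].state = "active"
    return healthy[0]

nn1, nn2 = NameNode("nn1"), NameNode("nn2")
print(elect_active([nn1, nn2]).name)  # nn1
nn1.healthy = False                   # HealthMonitor detects a failure...
print(elect_active([nn1, nn2]).name)  # nn2: the elector fails over
```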
## 3: The Hadoop Federation Mechanism

### 3.1 Background

The single-NameNode architecture gives HDFS potential problems in both cluster scalability and performance. Once a cluster grows large enough, the NameNode process may use upwards of a hundred gigabytes of memory, and the NameNode becomes the performance bottleneck. Federation is the horizontal-scaling solution proposed for the NameNode.

Federation here means a federation of NameNodes: there are multiple NameNodes, and therefore multiple namespaces. This differs from the multiple NameNodes of HA mode, which share a single namespace. Having introduced the namespace concept, let's look at the existing HDFS data-management architecture, shown in the figure below:

![](images/image-20200318122608342.png)

As the figure shows, existing HDFS data management is a two-layer structure: all information about, and management of, stored data sits at the NameNode, while the actual data is stored on the DataNodes. All data managed by one NameNode belongs to the same namespace, and each namespace corresponds to one block pool, the set of blocks in that namespace. This is the common single-namespace case, where one NameNode manages all of the cluster's metadata. What do we do when we hit the NameNode memory problem mentioned earlier? The metadata keeps growing, and simply raising the NameNode's JVM heap is no lasting solution. This is where the HDFS Federation mechanism comes in.
### 3.2 Federation Architecture

HDFS Federation is the horizontal-scaling solution to the NameNode memory bottleneck.

Federation means the cluster contains multiple namenodes/namespaces. These namenodes are federated: they are independent of one another, need no mutual coordination, and each manages its own region. The distributed datanodes are used as common block-storage devices. Every datanode registers with all namenodes in the cluster, periodically sends heartbeats and block reports to all of them, and executes commands from all of them.

![](images/image-20200318122645580.png)

A typical motivating example for Federation is the NameNode memory problem mentioned above: we can move some of the large directory trees to another NameNode. More importantly, these NameNodes share all the DataNodes in the cluster; they remain within the same cluster.

A DataNode then stores data for more than one block pool (inside the DataNode's datadir you will see multiple directories whose names start with BP-xx.xx.xx.xx).

**To summarize:**

Multiple NameNodes share one cluster's storage resources, and each NameNode can serve requests independently.

Each NameNode defines a storage pool with its own id, and every DataNode provides storage for all pools.

DataNodes report block information to the NameNode matching each pool id, and report their locally available storage capacity to all NameNodes.
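That reporting relationship can be modeled in a few lines: every DataNode reports blocks per block-pool id, and each owning NameNode sees only its own pool's aggregate (the ids and block names are invented for the example):

```python
from collections import defaultdict

class DataNode:
    def __init__(self, blocks_by_pool: dict[str, list[str]]):
        self.blocks_by_pool = blocks_by_pool  # e.g. {"BP-1": ["blk_1"]}

def block_reports(datanodes: list[DataNode]) -> dict[str, list[str]]:
    """Aggregate per-pool block reports, as each owning NameNode would see them."""
    reports = defaultdict(list)
    for dn in datanodes:
        for pool_id, blocks in dn.blocks_by_pool.items():
            reports[pool_id].extend(blocks)
    return dict(reports)

dns = [DataNode({"BP-1": ["blk_1"], "BP-2": ["blk_9"]}),
       DataNode({"BP-1": ["blk_2"]})]
print(block_reports(dns))  # {'BP-1': ['blk_1', 'blk_2'], 'BP-2': ['blk_9']}
```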
**Shortcomings of HDFS Federation**

HDFS Federation does not completely solve the single-point-of-failure problem. Although there are multiple namenodes/namespaces, each individual namenode is still a single point of failure: if one namenode goes down, the files it manages become inaccessible. As in plain HDFS, each Federation namenode is paired with a secondary namenode so that metadata can be restored if the primary fails.

So when the cluster is genuinely large, an HA + Federation deployment is generally used, meaning each federated namenode is itself HA. | 29.612403 | 354 | 0.712653 | yue_Hant | 0.743613
1c64f960733e6162e9aa516451a051b70115b793 | 1,970 | md | Markdown | content/notes/fitness/immortality/index.md | glennji/glennjicom-academic | ad58b1333ae4cb2febf0104c18e3fb4e6997bedf | [
"MIT"
] | null | null | null | content/notes/fitness/immortality/index.md | glennji/glennjicom-academic | ad58b1333ae4cb2febf0104c18e3fb4e6997bedf | [
"MIT"
] | null | null | null | content/notes/fitness/immortality/index.md | glennji/glennjicom-academic | ad58b1333ae4cb2febf0104c18e3fb4e6997bedf | [
"MIT"
] | null | null | null | ---
title: "Immortality"
author: glennji
date: 2016-12-18T06:46:56+10:00
draft: false
type: note
crosslink: "true"
---
> "I don't want to achieve immortality through my work; I want to achieve immortality through not dying." -- Woody Allen
I'm with Woody: I'd prefer not to die (if I get the choice), and would genuinely be quite happy to become an immortal. (And no, not in the "life after death/ascension to the great beyond" version promised by most religions -- I want to see how this reality evolves, before even thinking about another one.)
I'd prefer not to get old, in fact, if "old" means "infirm", as it has for all of the 200,000 years since the emergence of "modern" humans. I'd like to keep my memories, maintain my mental faculties, stay limber and active, continue to be able to grow and change and adapt to everything the future has in store for us as a species. I want to see the rise of true AI, meet an non-Terran, see humans living (and thriving) on Mars, inside hollow asteroids and on/in entirely artificial habitats. I'd love to see another Terran species rise to intellectual prominence, if we leave our homeworld alone (chimpanzees and/or dolphins, I'm looking at you). I want to study, write a novel, compose an opus, create art, dance, have my mind blown by new concepts and viewpoints and arguments.
I can't do any of these if I'm dead. But, lucky old me, there's a lot of good science happening towards a "cure for death", and if I can just stay alive long enough to reach **longevity escape velocity** (LEV) then I can become "functionally immortal" -- I don't have to die **just because everybody else has**. Fingers crossed, anyway.
Some of the tech(nologies|niques) that might help me achieve immortality:
- calorific restriction/intermittent fasting
- ketogenic diet
- bulletproof
- supplements
- senolytics
- epigenetics
- anthropopsycho-stability (read: stable relationships, friends, involvement in the community)
- ...?
| 75.769231 | 780 | 0.759898 | eng_Latn | 0.997434 |
1c6580eb48680c7b86c0af0dfcfc7f274ddf6c0f | 1,780 | md | Markdown | README.md | Accrubit/Academy | 563a7126d9818a66045876947b8a8e96134d9bf3 | [
"Apache-2.0"
] | null | null | null | README.md | Accrubit/Academy | 563a7126d9818a66045876947b8a8e96134d9bf3 | [
"Apache-2.0"
] | null | null | null | README.md | Accrubit/Academy | 563a7126d9818a66045876947b8a8e96134d9bf3 | [
"Apache-2.0"
] | null | null | null | ---
description: >-
Accrubit Academy is a continuously updating, open-source and market neutral
education platform for blockchain, cryptocurrency, and other Web 3.0
technologies.
---
# Welcome to Accrubit Academy
## Welcome!
### How to Navigate
To find your way around, this page contains a change log of updates to my material, as well as an overview of the current scope of the platform. There are three states assigned to course academy resources:
1. **Ready:** This resource is completed and is being updated as needed
2. **In progress:** This resource is incomplete and is currently being worked on
3. **Needs attention:** This resource is either outdated, or incomplete, and is not being worked on
### Resources
| Blockchain for Business \(In progress\) |
| :--- |
| Blockchain for Developers \(In progress\) |
| Cryptoeconomics \(In progress\) |
| Cryptocurrency Investing \(In progress\) |
| Cryptocurrency Income \(In progress\) |
## Accrubit Academy - 2018-06-02 <a id="welcome-to-accrubit-academy-2018-06-02"></a>
### Changed <a id="changed"></a>
* Title and description of Home Page edited to reflect purpose of website
* Navbar edited for continuity
### Added
* Cryptoeconmics Guide Placeholder
* Cryptocurrency Investing Guide Placeholder
* Cryptocurrency Income Guide
* Tierion Node Tutorial
## Accrubit Academy - 2018-06-01 <a id="blockchain-for-business-2018-06-01"></a>
### Added <a id="added"></a>
* 10 incomplete chapters added to Blockchain for Business guide
* Accrubit Academy website and course structure deployed
* Team backend completed with [Kyle May](https://www.linkedin.com/in/kylelmay/) as lead course developer
* [GitHub](https://github.com/Accrubit) and [Slack](https://accrubit.slack.com/) integrated to Academy
| 31.22807 | 205 | 0.738202 | eng_Latn | 0.975994 |
1c65e6853dde25365459b8fc50f8bf11771dbafc | 5,109 | md | Markdown | example/base/README.md | fuyuhin/freedom | 7aa2c68ee823d237328683e3d384bb6c5633fc1f | [
"Apache-2.0"
] | null | null | null | example/base/README.md | fuyuhin/freedom | 7aa2c68ee823d237328683e3d384bb6c5633fc1f | [
"Apache-2.0"
] | null | null | null | example/base/README.md | fuyuhin/freedom | 7aa2c68ee823d237328683e3d384bb6c5633fc1f | [
"Apache-2.0"
] | null | null | null | # freedom
### base - Basic Example

#### Directory Structure

- application - domain services
  - aggregate - aggregates
  - entity - entities
  - object - value objects
  - dto - data transfer objects
- adapter - port adapters
  - controller - inbound adapters
  - repository - outbound adapters
- server - server entry point
  - conf - toml configuration files
  - main.go - main function
- infra - infrastructure
  - config - configuration component

---

#### Entry point: main.go
```go
//Create the application
app := freedom.NewApplication()
//Install middleware
installMiddleware(app)
//Start the application, passing the bind address and the application configuration
app.Run(addrRunner, *config.Get().App)

func installMiddleware(app freedom.Application) {
    //Install the TRACE middleware; the framework installs Prometheus by default
    app.InstallMiddleware(middleware.NewTrace("TRACE-ID"))
    app.InstallMiddleware(middleware.NewLogger("TRACE-ID", true))
    app.InstallMiddleware(middleware.NewRuntimeLogger("TRACE-ID"))
}
```
#### Interfaces
```go
/*
    Each request creates a series of objects serially (controller, service, repository).
    Don't worry too much about memory: the framework pools these objects underneath.
    Runtime is the exported interface of the per-request runtime object; one instance
    is created per request, and the same instance is seen across the different layers.
*/
type Runtime interface {
    //Returns the iris context
    Ctx() freedom.Context
    //Returns a logger instance bound to this request's context
    Logger() Logger
    //Returns the first-level cache instance; its lifecycle ends when the request ends
    Store() *memstore.Store
    //Returns the Prometheus plugin
    Prometheus() *Prometheus
}

// Initiator is the instance-initialization interface, used inside Booting.
type Initiator interface {
    //Creates an iris.Party; middleware may be specified
    CreateParty(relativePath string, handlers ...context.Handler) iris.Party
    //Binds a controller to an iris.Party
    BindControllerByParty(party iris.Party, controller interface{})
    //Binds a controller directly to a path; middleware may be specified
    BindController(relativePath string, controller interface{}, handlers ...context.Handler)
    //Binds a service
    BindService(f interface{})
    //Injects into controllers
    InjectController(f interface{})
    //Binds a repository
    BindRepository(f interface{})
    //Gets a service
    GetService(ctx iris.Context, service interface{})
    //Asynchronous cache preheating
    AsyncCachePreheat(f func(repo *Repository))
    //Synchronous cache preheating
    CachePreheat(f func(repo *Repository))
    //Binds an infrastructure component: if the component is a singleton, com is an object;
    //if it is multi-instance, com is a factory function
    BindInfra(single bool, com interface{})
    //Gets an infrastructure component. Only controllers need to call this inside Booting;
    //services and repositories can use dependency injection directly
    GetInfra(ctx iris.Context, com interface{})
}
```
#### controllers/default.go
##### [iris routing documentation](https://github.com/kataras/iris/wiki/MVC)
```go
func init() {
    freedom.Booting(func(initiator freedom.Initiator) {
        /*
            Plain binding of the Default controller to the path /:
            initiator.BindController("/", &DefaultController{})
        */
        //Binding with middleware, effective only for this controller; add global middleware in main.
        initiator.BindController("/", &DefaultController{}, func(ctx freedom.Context) {
            runtime := freedom.PickRuntime(ctx)
            runtime.Logger().Info("Hello middleware begin")
            ctx.Next()
            runtime.Logger().Info("Hello middleware end")
        })
    })
}

/*
    Controller DefaultController.
    Sev: a member variable holding the service object, wired by the dependency
    injection in init. An interface-typed member variable can also receive it.
*/
type DefaultController struct {
    Sev     *services.DefaultService //Injected variable names must start with an uppercase letter; Go-style lowercase names cannot be injected via reflection.
    Runtime freedom.Runtime          //Injects the request runtime
}

/*
    Get handles the GET: / route.
    Returns the json data of result to the client.
*/
func (c *DefaultController) Get() (result struct {
    IP string
    UA string
}, e error) {
    //Log, then call the service's RemoteInfo method.
    c.Runtime.Logger().Infof("I am the controller")
    remote := c.Sev.RemoteInfo()
    result.IP = remote.IP
    result.UA = remote.UA
    return
}
```
#### application/default.go
```go
func init() {
    freedom.Booting(func(initiator freedom.Initiator) {
        //Bind the service
        initiator.BindService(func() *DefaultService {
            return &DefaultService{}
        })
        //To use dependency injection in a controller, the service must be injected into it.
        initiator.InjectController(func(ctx freedom.Context) (ds *DefaultService) {
            //Fetch the service from the object pool
            initiator.GetService(ctx, &ds)
            return
        })
    })
}

// DefaultService is a domain service.
type DefaultService struct {
    Runtime freedom.Runtime
    DefRepo *repositorys.DefaultRepository
}

// RemoteInfo .
func (s *DefaultService) RemoteInfo() (result struct {
    IP string
    UA string
}) {
    s.Runtime.Logger().Infof("I am the service")
    //Fetch the data from the repository
    result.IP = s.DefRepo.GetIP()
    result.UA = s.DefRepo.GetUA()
    return
}
```
#### repositorys/default.go
```go
func init() {
    freedom.Booting(func(initiator freedom.Initiator) {
        //Bind the repository
        initiator.BindRepository(func() *DefaultRepository {
            return &DefaultRepository{}
        })
    })
}

/*
    Repository DefaultRepository.
    Repositories mainly organize and abstract data access (db, cache, http, ...);
    this example simply returns the IP data from the request context.
*/
type DefaultRepository struct {
    freedom.Repository
}

// GetIP .
func (repo *DefaultRepository) GetIP() string {
    repo.Runtime.Logger().Infof("I am Repository.GetIP")
    return repo.Runtime.Ctx().RemoteAddr()
}

// GetUA - implements the DefaultRepoInterface interface
func (repo *DefaultRepository) GetUA() string {
    //The interface for this method is declared in application/default.go
    repo.Runtime.Logger().Infof("I am Repository.GetUA")
    return repo.Runtime.Ctx().Request().UserAgent()
}
```
#### Configuration Files

| File | Purpose |
| ----- | :---: |
| server/conf/app.toml | service configuration |
| server/conf/db.toml | database configuration |
| server/conf/redis.toml | cache configuration |

#### Lifecycle

###### Each incoming request creates a number of dependent objects: controller, service, repository, infra. Put simply, every request independently creates and uses this chain of objects, so there are no concurrency problems. There is no need to worry about efficiency either, because the framework pools the objects underneath. See the various Bind methods on the Initiator.
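The per-request wiring plus pooling described here can be illustrated with a minimal object pool (a language-agnostic Python sketch; freedom itself does this with reflection in Go):

```python
class Pool:
    """A tiny object pool: reuse released instances instead of reallocating."""
    def __init__(self, factory):
        self.factory = factory
        self.free = []

    def acquire(self):
        return self.free.pop() if self.free else self.factory()

    def release(self, obj):
        self.free.append(obj)

service_pool = Pool(dict)  # stand-in for a service object factory

def handle_request(pool: Pool) -> object:
    svc = pool.acquire()       # each request gets its own instance...
    try:
        return svc
    finally:
        pool.release(svc)      # ...which is returned to the pool afterwards

a = handle_request(service_pool)
b = handle_request(service_pool)
print(a is b)  # True: the instance was reused, not reallocated
```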
1c66182b4d81877a97b7c16415ca60518054ebfc | 5,979 | md | Markdown | tutorial.md | zroubalik/functions | 7d9f6647a1baa51ce7c241cbfdfe7d109b043b11 | [
"Apache-2.0"
] | null | null | null | tutorial.md | zroubalik/functions | 7d9f6647a1baa51ce7c241cbfdfe7d109b043b11 | [
"Apache-2.0"
] | null | null | null | tutorial.md | zroubalik/functions | 7d9f6647a1baa51ce7c241cbfdfe7d109b043b11 | [
"Apache-2.0"
] | null | null | null | # Boson Functions: A Step By Step Tutorial
This document will walk you step by step through the process of creating,
editing, and deploying a Boson Function project.
## Prerequisites
In order to follow along with this tutorial, you will need to have a few tools
installed.
* [oc][oc] or [kubectl][kubectl] CLI
* [kn][kn] CLI
* [Docker][docker]
[docker]: https://docs.docker.com/install/
[oc]: https://docs.openshift.com/container-platform/4.6/cli_reference/openshift_cli/getting-started-cli.html#cli-installing-cli_cli-developer-commands
[kubectl]: https://kubernetes.io/docs/tasks/tools/install-kubectl/
[kn]: https://knative.dev/docs/install/install-kn/
## Cluster Setup
To use Boson Functions, you'll need a Kubernetes cluster with Knative Serving
and Eventing installed. If you have a recent version of OpenShift, you can
simply install the Serverless Operator. If you don't have a cluster already,
you can create a simple cluster with [kind](https://kind.sigs.k8s.io/). Follow
these [step by step instructions](kind-setup.md) to install on your local
machine.
## Boson Tooling
The primary interface for Boson project is the `faas` CLI.
[Download][faas-download] the most recent version and install it some place
within your `$PATH`.
[faas-download]: https://github.com/boson-project/faas/releases
```sh
# Be sure to download the correct binary for your operating system
curl -L -o - faas.gz https://github.com/boson-project/faas/releases/download/v0.8.0/faas_linux_amd64.gz | gunzip > faas && chmod 755 faas
sudo mv faas /usr/local/bin
```
## Configuring a Container Repository
The unit of deployment in Boson Functions is an [OCI](https://opencontainers.org/)
container image, typically referred to as a Docker container image.
In order for the `faas` CLI to manage these containers, you'll need to be
logged in to a container registry. For example, `docker.io/lanceball`
```bash
# Typically, this will log you in to docker hub if you
# omit <repository.url>. If you are using a repository
# other than Docker hub, provide that for <repository.url>
docker login -u lanceball -p [redacted] <repository.url>
```
> Note: many of the `faas` CLI commands take a `--repository` argument.
> Set the `FAAS_REPOSITORY` environment variable in order to omit this
> parameter when using the CLI.
```bash
# This should be set to a repository that you have write permission
# on and you have logged into in the previous step.
export FAAS_REPOSITORY=docker.io/lanceball
```
## Creating a Project
With your Knative enabled cluster up and running, you can now create a new
Function Project. Let's start by creating a project directory. Function names
in `faas` currently correspond to URLs, and there are still some finicky
cases. To ensure that everything works as it should, create a project
directory consisting of three URL parts. Here is a good one.
```bash
mkdir fn.example.io
cd fn.example.io
```
Now, with one command we will create the project files, build a container, and
deploy the function as a Knative service.
```bash
faas create -l node
```
This will create a Node.js Function project in the current directory accepting
all of the defaults inferred from your environment, for example `$FAAS_REPOSITORY`.
When the command has completed, you can see the deployed function.
```bash
kn service list
NAME URL LATEST AGE CONDITIONS READY REASON
fn-example-io http://fn-example-io.faas.127.0.0.1.nip.io fn-example-io-ngswh-1 24s 3 OK / 3 True
```
Clicking on the URL will take you to the running function in your cluster. You
should see a simple response.
```json
{"query": {}}
```
You can add query parameters to the request to see those echoed in return.
```console
curl "http://fn-example-io.faas.127.0.0.1.nip.io?name=tiger"
{"query":{"name":"tiger"},"name":"tiger"}
```
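For reference, the generated Node.js function behaves roughly like this echo handler. This is a simplified sketch of the template's behavior, not its actual source:

```javascript
// Simplified sketch of the templated echo behavior: the response
// contains the query object, plus each query parameter spread
// onto the top level, matching the curl output shown above.
function handleRequest(context) {
  const query = context.query || {};
  return { query, ...query };
}

// Example: a request with ?name=tiger
console.log(JSON.stringify(handleRequest({ query: { name: "tiger" } })));
// {"query":{"name":"tiger"},"name":"tiger"}
```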
## Local Development
The `faas create` command also results in a docker container that can be run
locally with container ports mapped to localhost.
```bash
faas run
```
For day to day development of the function, you can also run it locally outside
of a container. For this project, using Node.js, you have the following commands
available. Note that to run this function locally, you will need Node.js 12.x or
higher, and the corresponding npm.
```bash
npm install # Installs all dependencies
npm test # Runs unit and integration test suites
npm run local # Execute the function on the local host
```
## Deploying to a Cluster - Step by Step
With `faas create` you have already deployed to a cluster! But there was a lot
of magic in that one command. Let's break it down step by step using the
`faas` CLI to take each step in turn.
First, let's delete the project we just created.
```bash
faas delete
```
You might see a message such as this.
```bash
Error: remover failed to delete the service: timeout: service 'fn-example-io' not ready after 30 seconds.
```
If you do, just run `kn service list` to see if the function is still deployed.
It might just take a little time for it to be removed.
Now, let's clean up the current directory.
```bash
rm -rf *
```
### `faas init`
To create a new project structure without building a container or deploying to a
cluster, use the `init` command.
```bash
faas init -l node -t http
```
You can also create a Quarkus or a Golang project by providing `quarkus` or `go`
respectively to the `-l` flag. To create a project with a template for
CloudEvents, provide `events` to the `-t` flag.
### `faas build`
To build the OCI container image for your function project, you can use the
`build` command.
```bash
faas build
```
This creates a runnable container image that listens on port 8080 for incoming
HTTP requests.
### `faas deploy`
To deploy the image to your cluster, use the `deploy` command. You can also use
this command to update a Function deployment after making changes locally.
```bash
faas deploy
```
# Microsoft.CertificateRegistration @ 2020-06-01
## Resource Microsoft.CertificateRegistration/certificateOrders@2020-06-01
* **Valid Scope(s)**: ResourceGroup
### Properties
* **apiVersion**: '2020-06-01' (ReadOnly, DeployTimeConstant)
* **id**: string (ReadOnly, DeployTimeConstant)
* **kind**: string
* **location**: string (Required)
* **name**: string (Required, DeployTimeConstant)
* **properties**: schemas:2_properties
* **tags**: Dictionary<string,String>
* **type**: 'Microsoft.CertificateRegistration/certificateOrders' (ReadOnly, DeployTimeConstant)
## Resource Microsoft.CertificateRegistration/certificateOrders/certificates@2020-06-01
* **Valid Scope(s)**: ResourceGroup
### Properties
* **apiVersion**: '2020-06-01' (ReadOnly, DeployTimeConstant)
* **id**: string (ReadOnly, DeployTimeConstant)
* **kind**: string
* **location**: string (Required)
* **name**: string (Required, DeployTimeConstant)
* **properties**: AppServiceCertificate
* **tags**: Dictionary<string,String>
* **type**: 'Microsoft.CertificateRegistration/certificateOrders/certificates' (ReadOnly, DeployTimeConstant)
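As a sketch, a minimal Bicep declaration of the parent resource might look like the following. The name, domain, and property values are illustrative only:

```bicep
resource certificateOrder 'Microsoft.CertificateRegistration/certificateOrders@2020-06-01' = {
  name: 'contosoCertOrder' // illustrative name
  location: 'global'
  properties: {
    productType: 'StandardDomainValidatedSsl' // the only required property
    distinguishedName: 'CN=contoso.com'       // illustrative domain
    validityInYears: 1
    autoRenew: true
  }
}
```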
## schemas:2_properties
### Properties
* **appServiceCertificateNotRenewableReasons**: 'ExpirationNotInRenewalTimeRange' | 'RegistrationStatusNotSupportedForRenewal' | 'SubscriptionNotActive'[] (ReadOnly)
* **autoRenew**: bool
* **certificates**: Dictionary<string,AppServiceCertificate>
* **csr**: string
* **distinguishedName**: string
* **domainVerificationToken**: string (ReadOnly)
* **expirationTime**: string (ReadOnly)
* **intermediate**: CertificateDetails (ReadOnly)
* **isPrivateKeyExternal**: bool (ReadOnly)
* **keySize**: int
* **lastCertificateIssuanceTime**: string (ReadOnly)
* **nextAutoRenewalTimeStamp**: string (ReadOnly)
* **productType**: 'StandardDomainValidatedSsl' | 'StandardDomainValidatedWildCardSsl' (Required)
* **provisioningState**: 'Canceled' | 'Deleting' | 'Failed' | 'InProgress' | 'Succeeded' (ReadOnly)
* **root**: CertificateDetails (ReadOnly)
* **serialNumber**: string (ReadOnly)
* **signedCertificate**: CertificateDetails (ReadOnly)
* **status**: 'Canceled' | 'Denied' | 'Expired' | 'Issued' | 'NotSubmitted' | 'Pendingissuance' | 'PendingRekey' | 'Pendingrevocation' | 'Revoked' | 'Unused' (ReadOnly)
* **validityInYears**: int
## Dictionary<string,AppServiceCertificate>
### Properties
### Additional Properties
* **Additional Properties Type**: AppServiceCertificate
## AppServiceCertificate
### Properties
* **keyVaultId**: string
* **keyVaultSecretName**: string
* **provisioningState**: 'AzureServiceUnauthorizedToAccessKeyVault' | 'CertificateOrderFailed' | 'ExternalPrivateKey' | 'Initialized' | 'KeyVaultDoesNotExist' | 'KeyVaultSecretDoesNotExist' | 'OperationNotPermittedOnKeyVault' | 'Succeeded' | 'Unknown' | 'UnknownError' | 'WaitingOnCertificateOrder' (ReadOnly)
## CertificateDetails
### Properties
* **issuer**: string (ReadOnly)
* **notAfter**: string (ReadOnly)
* **notBefore**: string (ReadOnly)
* **rawData**: string (ReadOnly)
* **serialNumber**: string (ReadOnly)
* **signatureAlgorithm**: string (ReadOnly)
* **subject**: string (ReadOnly)
* **thumbprint**: string (ReadOnly)
* **version**: int (ReadOnly)
## Dictionary<string,String>
### Properties
### Additional Properties
* **Additional Properties Type**: string
---
title: Containers offer publishing guide for Azure Marketplace
description: This article describes the requirements for publishing containers on the marketplace
services: Azure, Marketplace, Compute, Storage, Networking, Blockchain, Security
documentationcenter: ''
author: ellacroi
manager: nunoc
editor: ''
ms.assetid: ''
ms.service: marketplace
ms.workload: ''
ms.tgt_pltfrm: ''
ms.devlang: ''
ms.topic: article
ms.date: 07/09/2018
ms.author: ellacroi
ms.openlocfilehash: 6b02714c3a62e8d11512c1cc2dfc7a75a422441d
ms.sourcegitcommit: fbf0124ae39fa526fc7e7768952efe32093e3591
ms.translationtype: HT
ms.contentlocale: fr-FR
ms.lasthandoff: 01/08/2019
ms.locfileid: "54076380"
---
# <a name="containers-offer-publishing-guide"></a>Containers offer publishing guide
Container offers help you publish your container image to Azure Marketplace. Use this guide to understand the requirements for this offer type.
These are transaction offers, which are deployed and billed through the marketplace. The call to action that a user sees is "Get It Now".
Use the Container offer type when your solution is a Docker container image provisioned as an Azure Kubernetes-based container service.
>[!NOTE]
>For example, a Kubernetes-based Azure container service such as Azure Kubernetes Service or Azure Container Instances, the choice of Azure customers for a Kubernetes-based container runtime.
Microsoft currently supports free and bring-your-own-license (BYOL) licensing models.
## <a name="containers-offer"></a>Containers offer
| Requirement | Details |
|:--- |:--- |
| Billing and metering | Support the free or BYOL billing model. |
| Image built from a Dockerfile | Container images must be based on the Docker image specification and must be built from a Dockerfile.<ul> <li>For more information about building Docker images, see the Usage section at [docs.docker.com/engine/reference/builder/#usage](https://docs.docker.com/engine/reference/builder/#usage).</li> </ul> |
| Hosted in an ACR | Container images must be hosted in an Azure Container Registry (ACR).<ul> <li>For more information about working with an ACR, see Quickstart: Create a container registry by using the Azure portal at [docs.microsoft.com/azure/container-registry/container-registry-get-started-portal](https://docs.microsoft.com/azure/container-registry/container-registry-get-started-portal).</li> </ul> |
| Image tagging | Container images must contain at least 1 tag (maximum number of tags: 16).<ul> <li>For more information about tagging an image, see the docker tag page at [docs.docker.com/engine/reference/commandline/tag](https://docs.docker.com/engine/reference/commandline/tag).</li> </ul> |
## <a name="next-steps"></a>Next steps
If you haven't already done so,
- [Register](https://azuremarketplace.microsoft.com/sell) in the marketplace.
If you're registered and are creating a new offer or working on an existing one,
- [Sign in to the Cloud Partner portal](https://cloudpartner.azure.com) to create or complete your offer.
- For more information, see [Containers](https://docs.microsoft.com/azure/marketplace/cloud-partner-portal/containers/cpp-containers-offer).
<!-- Do not edit this file. It is automatically generated by API Documenter. -->
[Home](./index.md) > [@anglr/common-positions](./common-positions.md) > [TooltipOptions](./common-positions.tooltipoptions.md)
## TooltipOptions interface
Options used for tooltip directive
<b>Signature:</b>
```typescript
export interface TooltipOptions
```
## Properties
| Property | Type | Description |
| --- | --- | --- |
| [allowSelection?](./common-positions.tooltipoptions.allowselection.md) | boolean \| null | <i>(Optional)</i> Allows selection of text in tooltip |
| [delay?](./common-positions.tooltipoptions.delay.md) | number | <i>(Optional)</i> Delay for displaying of tooltip on hover |
| [elementPositionAt?](./common-positions.tooltipoptions.elementpositionat.md) | Positions.PositionsCoordinates | <i>(Optional)</i> Position where should be tooltip displayed at element |
| [fixedPosition?](./common-positions.tooltipoptions.fixedposition.md) | boolean | <i>(Optional)</i> Indication whether is tooltip displayed at fixed position, or if it is displayed where cursor enters element |
| [stopPropagation?](./common-positions.tooltipoptions.stoppropagation.md) | boolean \| null | <i>(Optional)</i> Indication whether to stop propagation of the "hover" event |
| [tooltipCssClass?](./common-positions.tooltipoptions.tooltipcssclass.md) | string \| null | <i>(Optional)</i> Css class that is applied to tooltip renderer component |
| [tooltipPosition?](./common-positions.tooltipoptions.tooltipposition.md) | Positions.PositionsCoordinates | <i>(Optional)</i> Position of tooltip where should placed at |
| [tooltipRenderer?](./common-positions.tooltipoptions.tooltiprenderer.md) | Type<[TooltipRenderer](./common-positions.tooltiprenderer.md)<!-- -->> | <i>(Optional)</i> Type of tooltip renderer that is used for rendering tooltip |
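For illustration, an options object conforming to this interface could be built as follows. The interface is re-declared locally (as a subset) to keep the snippet self-contained; in an application you would import it from `@anglr/common-positions`:

```typescript
// Local re-declaration of the documented shape (subset, for illustration only).
interface TooltipOptions {
  allowSelection?: boolean | null;
  delay?: number;
  fixedPosition?: boolean;
  stopPropagation?: boolean | null;
  tooltipCssClass?: string | null;
}

// Example options: show after 300 ms at a fixed position,
// with a custom CSS class applied to the renderer component.
const options: TooltipOptions = {
  delay: 300,
  fixedPosition: true,
  allowSelection: false,
  tooltipCssClass: "app-tooltip",
};

console.log(options.delay); // 300
```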
---
layout: classic-docs
title: Rewarded ads in React Native
lang: fa
permalink: /plus-sdk/reactnative/rewarded/index.html
toc: true # table of contents
---
To implement rewarded ads, follow the steps below.
## Create an ad zone
First, create an ad zone of the type you want from the [Tapsell dashboard](https://dashboard.tapsell.ir/).
## Request an ad
You can request an ad by running the code below.
The method returns a `Promise` that resolves with a **responseId**.
This responseId is used to show the ad, so you must store it.
```javascript
let zoneId = "theZoneIdYouHave";
TapsellPlus.requestRewardedVideoAd(zoneId).then(responseId => {
// Save the responseId -- You need it to show the ad
})
.catch(error => {
// Do on Error
});
// Using Async/await
let responseId = await TapsellPlus.requestRewardedVideoAd(zoneId);
```
The first argument, `zoneId`, is the ID of the ad zone you created in the dashboard.
The available actions and the conditions under which they run are listed in the table below:
| Function | Description |
| - | - |
| `Promise.resolve(responseId)` | Resolves with the request ID when the ad becomes ready to show without any problem |
| `Promise.reject(error)` | Rejects when any kind of error occurs while fetching the ad |
## Show the ad
You can show an ad by running the code below.
```javascript
TapsellPlus.showRewardedVideoAd(responseId, onOpened, onClosed, onRewarded, onError);
```
The `responseId` is the value obtained from the promise when the ad was requested.
The available actions and the conditions under which they run are listed in the table below:
| Function | Description |
| - | - |
| `onOpened(data: object)` | Called when the ad has been received and is ready to be shown |
| `onClosed(data: object)` | Called when the ad window is closed. This does not mean the ad has finished |
| `onRewarded(data: object)` | Called when the ad has been shown completely and the user should receive the reward |
| `onError(error: object)` | Called when any kind of error occurs during the ad process |
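Putting the request and show steps together, the request → store responseId → show flow looks like the sketch below. Here `TapsellPlus` is replaced by a tiny stand-in object so the control flow can be followed end to end; the real module comes from the Tapsell Plus SDK:

```javascript
// Stand-in for the real TapsellPlus module, for illustration only.
const TapsellPlus = {
  requestRewardedVideoAd: async (zoneId) => `response-for-${zoneId}`,
  showRewardedVideoAd(responseId, onOpened, onClosed, onRewarded, onError) {
    onOpened({ responseId });   // ad window opened
    onRewarded({ responseId }); // ad watched to completion
    onClosed({ responseId });   // ad window closed
  },
};

async function playRewardedAd(zoneId, events) {
  // 1) Request the ad and keep the responseId.
  const responseId = await TapsellPlus.requestRewardedVideoAd(zoneId);
  // 2) Show it, reacting to each callback.
  TapsellPlus.showRewardedVideoAd(
    responseId,
    () => events.push("opened"),
    () => events.push("closed"),
    () => events.push("rewarded"), // grant the user's reward here
    (error) => events.push("error"),
  );
  return responseId;
}
```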
# PyTorch_RandomErasing
Implementation of
[Random Erasing Data Augmentation](https://arxiv.org/pdf/1708.04896.pdf).
# Examples of Random Erasing

# Results
Dataset CIFAR10
## Test accuracy per class
| Class | Random Erasing OFF | Random Erasing ON|
| --- | --- | --- |
| plane | 82 % | 83 %(+1%) |
| car | 87 % | 91 %(+4%) |
| bird | 69 % | 69 %( -) |
| cat | 64 % | 66 %(+2%) |
| deer | 75 % | 75 %( -) |
| dog | 64 % | 74 %(+10%) |
| frog | 85 % | 85 %( -) |
| horse | 85 % | 84 %(-1%) |
| ship | 85 % | 88 %(+3%) |
| truck | 84 % | 82 %(-2%) |
## Test average accuracy and loss
| | Random Erasing OFF | Random Erasing ON |
| --- | --- | --- |
| acc | 78 % | 80 %(+2%) |
| loss | 1.3514 % | 0.9337 %(-0.41769%)|
| 33.714286 | 111 | 0.388983 | yue_Hant | 0.424535 |
## Practicing what was learned about React Context API and Hooks
---
name: Foody
tags:
- uofficielt
- duelighed
- bål
age: 12+
image: foody.jpg
coverimage: cover-campfire.jpg
buylink: http://shorty.tac-case.dk/index.php/bestilling-eller-information
infolink: http://shorty.tac-case.dk/index.php/foody
---
## The proficiency badge that turns scouts into master chefs
It can be earned by those who are used to cooking over a campfire. They know their fire types, including the "gammelmandsild" fire lay, and in general have a handle on "camp food".
It works best if at least two scouts take the badge together: one can keep an eye on the food at all times while the other runs an errand.
It is YOU who cooks the food, it is YOU who finds the recipes, and the food is made on your scout trips or at scout events for "you", the patrol, or the group.
---
title: Headings
summary: Headings are used to indicate information structure. They also enable accessible access.
tags: color, contrast
layout: guide
eleventyNavigation:
key: Headings
parent: Accessibility
order: 6
excerpt: Headings are used to indicate information structure. They also enable accessible access.
img: /img/illustrations/illus-headings.svg
---
## Overview
Headings semantically express the content structure of a document. For assistive technology, Heading tags create a navigable outline of the page’s information.
## Best Practices
Indents in the code below help illustrate how Heading tags create an information outline.
```html
<h1></h1>
<h2></h2>
<h3></h3>
<h4></h4>
<h2></h2>
<h3></h3>
<h4></h4>
<h5></h5>
<h6></h6>
<h2></h2>
```
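The outline above also illustrates why "don't skip Headings" matters: the validity of a heading sequence is mechanical enough to check in code. A small script along these lines (illustrative, not part of Pelican) can validate an outline:

```javascript
// Returns true when a sequence of heading levels never skips a level
// going deeper (e.g. h2 -> h4 is invalid, while h4 -> h2 is fine).
function validHeadingOrder(levels) {
  let previous = 0;
  for (const level of levels) {
    if (level > previous + 1) return false; // skipped a level
    previous = level;
  }
  return true;
}

console.log(validHeadingOrder([1, 2, 3, 4, 2, 3, 4, 5, 6, 2])); // the outline above: true
console.log(validHeadingOrder([1, 3]));                         // h1 -> h3 skips h2: false
```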
To use Headings properly:
- Understand and implement Headings to correspond to the document hierarchy.
- Start with an H1. Only use one H1 per page.
- [Page Titles](/components/page-title/) by default are the H1 in Pelican.
- Follow H1 with as many H2 headers to chunk the content into meaningful sections.
- Further divisions between H2s, should begin with H3.
- Use H4, H5, and H6 successively.
- Don’t skip Headings.
- Try to keep the page to just three levels at most.
**Pelican uses Headings for:**
- Page Titles
- Page Section Titles
- Subsection Headings
## Resources
- <a href="https://developer.mozilla.org/en-US/docs/Web/HTML/Element/Heading_Elements" target="_blank">Heading Elements</a>
| 27.454545 | 159 | 0.727815 | eng_Latn | 0.963199 |
---
title: Skala kluster storlekar – Azure HDInsight
description: Skala ett Apache Hadoop kluster elastiskt för att matcha din arbets belastning i Azure HDInsight
ms.author: ashish
ms.service: hdinsight
ms.topic: how-to
ms.custom: seoapr2020
ms.date: 04/29/2020
ms.openlocfilehash: 3524b5d2274c52aa94fa1c3420fb0d3245d9b730
ms.sourcegitcommit: 2f9f306fa5224595fa5f8ec6af498a0df4de08a8
ms.translationtype: MT
ms.contentlocale: sv-SE
ms.lasthandoff: 01/28/2021
ms.locfileid: "98932071"
---
# <a name="scale-azure-hdinsight-clusters"></a>Scale Azure HDInsight clusters
HDInsight provides elasticity with options to scale up and scale down the number of worker nodes in your clusters. This elasticity allows you to shrink a cluster after hours or on weekends, and expand it during peak business demands.
Scale up your cluster before periodic batch processing so the cluster has adequate resources. After processing completes, and usage goes down, scale the HDInsight cluster down to fewer worker nodes.
You can scale a cluster manually using one of the methods outlined below. You can also use [autoscale](hdinsight-autoscale-clusters.md) options to automatically scale up and down in response to certain metrics.
> [!NOTE]
> Only clusters with HDInsight version 3.1.3 or higher are supported. If you're unsure of the version of your cluster, you can check the Properties page.
## <a name="utilities-to-scale-clusters"></a>Utilities to scale clusters
Microsoft provides the following utilities to scale clusters:
|Utility | Description|
|---|---|
|[PowerShell Az](/powershell/azure)|[`Set-AzHDInsightClusterSize`](/powershell/module/az.hdinsight/set-azhdinsightclustersize) `-ClusterName CLUSTERNAME -TargetInstanceCount NEWSIZE`|
|[PowerShell AzureRM](/powershell/azure/azurerm) |[`Set-AzureRmHDInsightClusterSize`](/powershell/module/azurerm.hdinsight/set-azurermhdinsightclustersize) `-ClusterName CLUSTERNAME -TargetInstanceCount NEWSIZE`|
|[Azure CLI](/cli/azure/) | [`az hdinsight resize`](/cli/azure/hdinsight#az-hdinsight-resize) `--resource-group RESOURCEGROUP --name CLUSTERNAME --workernode-count NEWSIZE`|
|[Azure classic CLI](hdinsight-administer-use-command-line.md)|`azure hdinsight cluster resize CLUSTERNAME NEWSIZE` |
|[Azure portal](https://portal.azure.com)|Open your HDInsight cluster pane, select **Cluster size** on the left-hand menu, then on the Cluster size pane, type in the number of worker nodes, and select Save.|
![Azure portal scale cluster option](./media/hdinsight-scaling-best-practices/azure-portal-settings-nodes.png)
Using any of these methods, you can scale your HDInsight cluster up or down within minutes.
> [!IMPORTANT]
> * The Azure classic CLI is deprecated and should only be used with the classic deployment model. For all other deployments, use the [Azure CLI](/cli/azure/).
> * The PowerShell AzureRM module is deprecated. Use the [Az module](/powershell/azure/new-azureps-module-az) whenever possible.
## <a name="impact-of-scaling-operations"></a>Impact of scaling operations
When you **add** nodes to your running HDInsight cluster (scale up), jobs won't be affected. New jobs can be safely submitted while the scaling process is running. If the scaling operation fails, the failure will leave your cluster in a functional state.
If you **remove** nodes (scale down), pending or running jobs will fail when the scaling operation completes. This failure is because some of the services restart during the scaling process. Your cluster may also get stuck in safe mode during a manual scaling operation.
The impact of changing the number of data nodes varies for each type of cluster supported by HDInsight:
* Apache Hadoop
You can seamlessly increase the number of worker nodes in a running Hadoop cluster without impacting any jobs. New jobs can also be submitted while the operation is in progress. Failures in a scaling operation are gracefully handled, and the cluster is always left in a functional state.
When a Hadoop cluster is scaled down with fewer data nodes, some services restart. This means that all running and pending jobs will fail at the completion of the scaling operation. You can, however, resubmit the jobs once the operation is complete.
* Apache HBase
You can seamlessly add or remove nodes to your HBase cluster while it's running. Regional servers are automatically balanced within a few minutes of completing the scaling operation. However, you can also balance the regional servers manually. Sign in to the cluster head node and run the following commands:
```bash
pushd %HBASE_HOME%\bin
hbase shell
balancer
```
For more information on using the HBase shell, see [Get started with an Apache HBase example in HDInsight](hbase/apache-hbase-tutorial-get-started-linux.md).
* Apache Storm
You can seamlessly add or remove data nodes while Storm is running. However, after a successful completion of the scaling operation, you'll need to rebalance the topology. Rebalancing allows the topology to readjust its [parallelism settings](https://storm.apache.org/documentation/Understanding-the-parallelism-of-a-Storm-topology.html) based on the new number of nodes in the cluster. To rebalance running topologies, use one of the following options:
* Storm web UI
Use the following steps to rebalance a topology using the Storm UI.
1. Open `https://CLUSTERNAME.azurehdinsight.net/stormui` in your web browser, where `CLUSTERNAME` is the name of your Storm cluster. If prompted, enter the HDInsight cluster administrator (admin) name and the password you specified when creating the cluster.
1. Select the topology you wish to rebalance, and then select the **Rebalance** button. Enter the delay before the rebalance operation is performed.
![HDInsight Storm rebalance in the Storm UI](./media/hdinsight-scaling-best-practices/hdinsight-portal-scale-cluster-storm-rebalance.png)
* Command-line interface (CLI)
Connect to the server and use the following command to rebalance a topology:
```bash
storm rebalance TOPOLOGYNAME
```
You can also specify parameters to override the parallelism hints originally provided by the topology. For example, the code below reconfigures the `mytopology` topology to use 5 worker processes, 3 executors for the blue-spout component, and 10 executors for the yellow-bolt component.
```bash
## Reconfigure the topology "mytopology" to use 5 worker processes,
## the spout "blue-spout" to use 3 executors, and
## the bolt "yellow-bolt" to use 10 executors
$ storm rebalance mytopology -n 5 -e blue-spout=3 -e yellow-bolt=10
```
* Kafka
You should rebalance partition replicas after scaling operations. For more information, see the [High availability of data with Apache Kafka on HDInsight](./kafka/apache-kafka-high-availability.md) document.
* Apache Hive LLAP
After scaling to `N` worker nodes, HDInsight automatically sets the following configurations and restarts Hive.
* Maximum number of concurrent queries: `hive.server2.tez.sessions.per.default.queue = min(N, 32)`
* Number of nodes used by Hive's LLAP: `num_llap_nodes = N`
* Number of node(s) to run the Hive LLAP daemon: `num_llap_nodes_for_llap_daemons = N`
## <a name="how-to-safely-scale-down-a-cluster"></a>How to safely scale down a cluster
### <a name="scale-down-a-cluster-with-running-jobs"></a>Scale down a cluster with running jobs
To avoid having your running jobs fail during a scale-down operation, you can try three things:
1. Wait for the jobs to finish before scaling down your cluster.
1. Manually end the jobs.
1. Resubmit the jobs after the scaling operation has concluded.
To see a list of pending and running jobs, you can use the YARN **Resource Manager UI**, following these steps:
1. Select your cluster from the [Azure portal](https://portal.azure.com/). The cluster is opened in a new portal page.
2. In the main view, navigate to **Cluster dashboards** > **Ambari home**. Enter your cluster credentials.
3. From the Ambari UI, select **YARN** in the list of services on the left-hand menu.
4. From the YARN page, select **Quick Links**, hover over the active head node, then select **Resource Manager UI**.

You can directly access the Resource Manager UI at `https://<HDInsightClusterName>.azurehdinsight.net/yarnui/hn/cluster`.
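For instance, the direct URL for a hypothetical cluster named `mycluster` can be assembled like this (the cluster name is a placeholder):

```shell
# Sketch: build the direct Resource Manager UI URL for a cluster.
CLUSTER_NAME="mycluster"
YARN_UI_URL="https://${CLUSTER_NAME}.azurehdinsight.net/yarnui/hn/cluster"
echo "$YARN_UI_URL"
```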
You see a list of jobs, along with their current state. In the screenshot, there's one job currently running:

To manually kill a running application, run the following command from the SSH shell:
```bash
yarn application -kill <application_id>
```
For example:
```bash
yarn application -kill "application_1499348398273_0003"
```
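When several jobs are running, the application IDs can be collected from `yarn application -list` and turned into kill commands in one pass. The sketch below stands in for that: the sample text mimics the list output (assuming the application state is the sixth whitespace-separated column, which may differ across Hadoop versions), and on a real cluster you would pipe the actual command instead:

```shell
# Sketch: generate a kill command for every RUNNING YARN application.
# sample_list is a stand-in for `yarn application -list` output.
sample_list='application_1499348398273_0003  wordcount  MAPREDUCE  hive  default  RUNNING
application_1499348398273_0004  sort       MAPREDUCE  hive  default  FINISHED'

kill_cmds=$(printf '%s\n' "$sample_list" \
  | awk '$6 == "RUNNING" { print "yarn application -kill " $1 }')

echo "$kill_cmds"   # review the generated commands, then run each line
```

Only the RUNNING application produces a kill command; finished applications are skipped.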
### <a name="getting-stuck-in-safe-mode"></a>Getting stuck in safe mode
When you scale down a cluster, HDInsight uses Apache Ambari management interfaces to first decommission the extra worker nodes. The nodes replicate their HDFS blocks to other online worker nodes. After that, HDInsight safely scales the cluster down. HDFS goes into safe mode during the scaling operation, and is supposed to come out once the scaling is finished. In some cases, however, HDFS gets stuck in safe mode during a scaling operation because of file block under-replication.
By default, HDFS is configured with a `dfs.replication` setting of 1, which controls how many copies of each file block are available. Each copy of a file block is stored on a different node of the cluster.
When the expected number of block copies isn't available, HDFS enters safe mode and Ambari generates alerts. HDFS may enter safe mode for a scaling operation. The cluster may get stuck in safe mode if the required number of nodes isn't detected for replication.
### <a name="example-errors-when-safe-mode-is-turned-on"></a>Example errors when safe mode is turned on
```output
org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /tmp/hive/hive/819c215c-6d87-4311-97c8-4f0b9d2adcf0. Name node is in safe mode.
```
```output
org.apache.http.conn.HttpHostConnectException: Connect to active-headnode-name.servername.internal.cloudapp.net:10001 [active-headnode-name.servername. internal.cloudapp.net/1.1.1.1] failed: Connection refused
```
You can review the name node logs from the `/var/log/hadoop/hdfs/` folder, near the time when the cluster was scaled, to see when it entered safe mode. The log files are named `Hadoop-hdfs-namenode-<active-headnode-name>.*`.
The root cause was that Hive depends on temporary files in HDFS while running queries. When HDFS enters safe mode, Hive can't run queries because it can't write to HDFS. The temporary files in HDFS are located on the local drive mounted on the individual worker-node VMs, and replicated among other worker nodes at three replicas, minimum.
### <a name="how-to-prevent-hdinsight-from-getting-stuck-in-safe-mode"></a>How to prevent HDInsight from getting stuck in safe mode
There are several ways to prevent HDInsight from being left in safe mode:
* Stop all Hive jobs before scaling down HDInsight. Alternatively, schedule the scale-down process to avoid conflicting with running Hive jobs.
* Manually clean up Hive's scratch `tmp` directory files in HDFS before scaling down.
* Only scale down HDInsight to three worker nodes, minimum. Avoid going as low as one worker node.
* Run the command to leave safe mode, if needed.
The following sections describe these options.
#### <a name="stop-all-hive-jobs"></a>Stop all Hive jobs
Stop all Hive jobs before scaling down to one worker node. If your workload is scheduled, execute your scale-down after the Hive work is done.
Stopping the Hive jobs before scaling helps minimize the number of scratch files in the tmp folder (if any).
#### <a name="manually-clean-up-hives-scratch-files"></a>Manually clean up Hive's scratch files
If Hive has left behind temporary files, you can manually clean up those files before scaling down to avoid safe mode.
1. Check which location is being used for Hive temporary files by looking at the `hive.exec.scratchdir` configuration property. This parameter is set within `/etc/hive/conf/hive-site.xml`:
```xml
<property>
<name>hive.exec.scratchdir</name>
<value>hdfs://mycluster/tmp/hive</value>
</property>
```
1. Stop Hive services and be sure all queries and jobs are completed.
1. List the contents of the temporary directory found above, `hdfs://mycluster/tmp/hive/`, to see if it contains any files:
```bash
hadoop fs -ls -R hdfs://mycluster/tmp/hive/hive
```
Here's some sample output when such files exist:
```output
sshuser@scalin:~$ hadoop fs -ls -R hdfs://mycluster/tmp/hive/hive
drwx------ - hive hdfs 0 2017-07-06 13:40 hdfs://mycluster/tmp/hive/hive/4f3f4253-e6d0-42ac-88bc-90f0ea03602c
drwx------ - hive hdfs 0 2017-07-06 13:40 hdfs://mycluster/tmp/hive/hive/4f3f4253-e6d0-42ac-88bc-90f0ea03602c/_tmp_space.db
-rw-r--r-- 3 hive hdfs 27 2017-07-06 13:40 hdfs://mycluster/tmp/hive/hive/4f3f4253-e6d0-42ac-88bc-90f0ea03602c/inuse.info
-rw-r--r-- 3 hive hdfs 0 2017-07-06 13:40 hdfs://mycluster/tmp/hive/hive/4f3f4253-e6d0-42ac-88bc-90f0ea03602c/inuse.lck
drwx------ - hive hdfs 0 2017-07-06 20:30 hdfs://mycluster/tmp/hive/hive/c108f1c2-453e-400f-ac3e-e3a9b0d22699
-rw-r--r-- 3 hive hdfs 26 2017-07-06 20:30 hdfs://mycluster/tmp/hive/hive/c108f1c2-453e-400f-ac3e-e3a9b0d22699/inuse.info
```
1. If you know Hive is done with these files, you can remove them. Be sure that Hive doesn't have any queries running by checking the YARN Resource Manager UI page.
Example command line to remove files from HDFS:
```bash
hadoop fs -rm -r -skipTrash hdfs://mycluster/tmp/hive/
```
#### <a name="scale-hdinsight-to-three-or-more-worker-nodes"></a>Scale HDInsight to three or more worker nodes
If your cluster gets stuck in safe mode when scaling down to fewer than three worker nodes, keep at least three worker nodes.
Having three worker nodes is more costly than scaling down to only one worker node. However, this action prevents your cluster from getting stuck in safe mode.
### <a name="scale-hdinsight-down-to-one-worker-node"></a>Scale HDInsight down to one worker node
Even when the cluster is scaled down to one node, worker node 0 will still survive. Worker node 0 can never be decommissioned.
#### <a name="run-the-command-to-leave-safe-mode"></a>Run the command to leave safe mode
The final option is to run the leave safe-mode command. If HDFS entered safe mode because of under-replication of Hive files, run the following command to leave safe mode:
```bash
hdfs dfsadmin -D 'fs.default.name=hdfs://mycluster/' -safemode leave
```
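Before forcing HDFS out of safe mode, it is worth confirming the current status first, for example with `hdfs dfsadmin -safemode get`. The sketch below shows that decision; the `status` string stands in for the real command's output, since the exact wording may vary by Hadoop version:

```shell
# Sketch: only issue the leave command when HDFS reports safe mode ON.
# On a cluster: status=$(hdfs dfsadmin -safemode get)
status="Safe mode is ON"

if [ "$status" = "Safe mode is ON" ]; then
  action="hdfs dfsadmin -D 'fs.default.name=hdfs://mycluster/' -safemode leave"
else
  action=""
fi
echo "${action:-nothing to do}"
```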
### <a name="scale-down-an-apache-hbase-cluster"></a>Scale down an Apache HBase cluster
Region servers are automatically balanced within a few minutes after a scaling operation completes. To manually balance region servers, complete the following steps:
1. Connect to the HDInsight cluster using SSH. For more information, see [Use SSH with HDInsight](hdinsight-hadoop-linux-use-ssh-unix.md).
2. Start the HBase shell:
```bash
hbase shell
```
3. Use the following command to manually balance the region servers:
```bash
balancer
```
## <a name="next-steps"></a>Next steps
* [Automatically scale Azure HDInsight clusters](hdinsight-autoscale-clusters.md)
For specific information on scaling your HDInsight cluster, see:
* [Manage Apache Hadoop clusters in HDInsight by using the Azure portal](hdinsight-administer-use-portal-linux.md#scale-clusters)
* [Manage Apache Hadoop clusters in HDInsight by using the Azure CLI](hdinsight-administer-use-command-line.md#scale-clusters)
1c6ce0b2bb36d7bd34b86149399624f29b7bfd21 | 1,070 | md | Markdown | controls/scheduler/client-side-programming/events/onclientresourcespopulating.md | DKaramfilov/ajax-docs | 86b5dbbf757f7b1fd84ec8b313be0d70cd957f8e | [
"Apache-2.0"
] | 22 | 2015-07-21T10:33:39.000Z | 2022-02-21T09:17:40.000Z | controls/scheduler/client-side-programming/events/onclientresourcespopulating.md | DKaramfilov/ajax-docs | 86b5dbbf757f7b1fd84ec8b313be0d70cd957f8e | [
"Apache-2.0"
] | 132 | 2015-07-14T13:56:12.000Z | 2022-01-28T10:04:56.000Z | controls/scheduler/client-side-programming/events/onclientresourcespopulating.md | DKaramfilov/ajax-docs | 86b5dbbf757f7b1fd84ec8b313be0d70cd957f8e | [
"Apache-2.0"
] | 355 | 2015-07-14T02:38:17.000Z | 2021-11-30T13:22:18.000Z | ---
title: OnClientResourcesPopulating
page_title: OnClientResourcesPopulating - RadScheduler
description: Check our Web Forms article about OnClientResourcesPopulating.
slug: scheduler/client-side-programming/events/onclientresourcespopulating
tags: onclientresourcespopulating
published: True
position: 22
---
# OnClientResourcesPopulating
##
If specified, the **OnClientResourcesPopulating** client-side event handler is called when the scheduleris about to request resources.In the case of server-side binding, the event will not be raised. When client-side binding is used, the event will be raisedbefore the resources are retrieved from the data service. The event will be raised only once, at the time of the initial load.
Two parameters are passed to the handler:
* **sender** - the scheduler client object;
* **eventArgs** with two properties:
* **get_schedulerInfo() -** the schedulerInfo object that will be passed to the web service method.
* **set_cancel() -** used to cancel the event.
This event can be cancelled.
| 35.666667 | 385 | 0.771028 | eng_Latn | 0.976839 |
1c6d123085613f4fa8ba3291392a41ea6d8d8608 | 5,801 | md | Markdown | src/Resources/doc/installation.md | Adgoal/StatsDClientBundle | 42559c94240daf23f142dc1607dc127a4bc8a4b5 | [
"MIT"
] | null | null | null | src/Resources/doc/installation.md | Adgoal/StatsDClientBundle | 42559c94240daf23f142dc1607dc127a4bc8a4b5 | [
"MIT"
] | null | null | null | src/Resources/doc/installation.md | Adgoal/StatsDClientBundle | 42559c94240daf23f142dc1607dc127a4bc8a4b5 | [
"MIT"
] | null | null | null | Installation
============
* 1 First, add the dependent bundles to the vendor/bundles directory. Add the following lines to the composer.json file
```
"require": {
# ..
"liuggio/statsd-client-bundle": "1.6.*",
# ..
}
```
* 2 Then run `composer install`
* 3 Then add in your `app/AppKernel`
``` yaml
class AppKernel extends Kernel
{
public function registerBundles()
{
$bundles = array(
// ...
new Liuggio\StatsDClientBundle\LiuggioStatsDClientBundle(),
// ...
```
* 4 Then add to config/yaml the minimal configuration
``` yaml
# app/config/config.yml
liuggio_stats_d_client:
connection:
host: localhost
port: 8125
```
* 5 When you are in develop env (mmm when kernel.debug is true) the packets use the SyslogSender
see [full-configuration](#full-configuration--max-power) in order to modify it
Working with the `Service`
-------------
### StatsdDataFactory
This service creates the (StatsDataInterface) object to send
Reference: `liuggio_stats_d_client.factory`
```
$data = $this->get('liuggio_stats_d_client.factory')->increment('log.error');
```
### StatsDClient
Reference: `liuggio_stats_d_client.service`
This service SENDS the data over the UDP interface
from a controller call ``` $this->get('liuggio_stats_d_client.service')->send($data) ```
the `$data` is the object created by the factory
the send method will optimise the data sent in order to speed up the connection.
### Simplified StatsD service
Reference: `statsd`
This service implements `liuggio_stats_d_client.factory` methods, as well as flush.
It makes using service less verbose.
```php
declare(strict_types=1);
$this->get('statsd')
->increment('log.error')
->gauge('log.rate', 25)
->flush();
```
Working with `Monolog`
-------------
To monitor your logs, add the following lines to your application configuration
file, in order to enable the StatsDHandler
``` yaml
# app/config/config.yml
liuggio_stats_d_client:
connection:
host: localhost
port: 8125
protocol: udp
reduce_packet: true
fail_silently: true
enable_collector: false #default is false
monolog:
enable: true
prefix: 'my-app'
level: 'warning'
formatter:
context_logging: true # if you want additional packets for context, default is false.
extra_logging: true # if you want additional packets for extra, default is false.
words: 2 # the number of the word in the stats key, default is 2.
```
But you also have to say to Monolog to use that handler
``` yaml
# app/config/config.yml
monolog:
handlers:
main:
type: fingers_crossed
action_level: warning
handler: streamed
streamed:
type: stream
path: %kernel.logs_dir%/%kernel.environment%.log
level: critical
#----------------------------------
stats_d:
type: service
id: monolog.handler.statsd
level: warning
# channels:
# type: exclusive # Include all, except those listed below
# elements: [ ]
```
Working with Personal Collector
-------------
If the information from Monolog is not enough for you, you can use the Collectors that collect the information and then send the data to the StatsD Server.
``` yaml
# app/config/config.yml
liuggio_stats_d_client:
connection:
host: localhost
port: 8125
protocol: udp
reduce_packet: true
fail_silently: true
enable_collector: true
collectors:
liuggio_stats_d_client.collector.dbal: 'my-app.query'
liuggio_stats_d_client.collector.visitor: 'my-app.visitor'
liuggio_stats_d_client.collector.memory: 'my-app.memory'
liuggio_stats_d_client.collector.user: 'my-app.user'
liuggio_stats_d_client.collector.exception: 'my-app.exception'
liuggio_stats_d_client.collector.time: 'my-app.time' # the time is a "fake" one, it is ~100ms smaller than the Kernel but you can still monitor the evolution of your application
```
For example the `liuggio_stats_d_client.collector.dbal` will collect a lot of the information provided by the doctrine logging.
In order to enable the query collector you also have to add to your doctrine.dbal the profiling variable from your config.yml
eg.
``` yaml
# Doctrine Configuration
doctrine:
dbal:
profiling: true
```
Full Configuration / Max Power
------------
then add to config/yaml
``` yaml
# app/config/config.yml
liuggio_stats_d_client:
monolog:
enable: true
prefix: 'my-app.log'
formatter:
# class: # if you want to change the formatter class.
# format: # if you want to change the format.
context_logging: true # if you want additional packets for context.
extra_logging: true # if you want additional packets for extra.
words: 2 # the number of the word in the stats key.
level: 'warning'
context_logging: true
connection:
# class: Liuggio\StatsdClient\Sender\SocketSender
# debug_class: Liuggio\StatsdClient\Sender\SysLogSender # or EchoSender
# debug: %kernel.debug% # use false if you want to disable debugging and shot packet over Socket
host: localhost
port: 8125
# protocol: udp
# reduce_packet: true
# fail_silently: true
enable_collector: true
collectors:
liuggio_stats_d_client.collector.dbal: 'my-app.query'
liuggio_stats_d_client.collector.visitor: 'my-app.visitor'
liuggio_stats_d_client.collector.memory: 'my-app.memory'
liuggio_stats_d_client.collector.user: 'my-app.user'
liuggio_stats_d_client.collector.exception: 'my-app.exception'
liuggio_stats_d_client.collector.time: 'my-app.time'
```
| 25.331878 | 181 | 0.678159 | eng_Latn | 0.88245 |
1c6d15268fc0ef25ab69e14db7823721083cc90b | 1,895 | md | Markdown | _posts/study/2020-03-01-problem-sw-1855.md | JooJiyun/joojiyun.github.io | 6404fa6747c1dc19aeb755f5f210b55296aabd3d | [
"MIT"
] | null | null | null | _posts/study/2020-03-01-problem-sw-1855.md | JooJiyun/joojiyun.github.io | 6404fa6747c1dc19aeb755f5f210b55296aabd3d | [
"MIT"
] | null | null | null | _posts/study/2020-03-01-problem-sw-1855.md | JooJiyun/joojiyun.github.io | 6404fa6747c1dc19aeb755f5f210b55296aabd3d | [
"MIT"
] | null | null | null | ---
title: "1855 영준이의 진짜 BFS"
excerpt: "1855 영준이의 진짜 BFS : LCA"
categories:
- Study
tags:
- swexpertacademy
last_modified_at: 2020-03-01T16:09:00
---
**SW Expert Academy**
1855 영준이의 진짜 BFS
BFS탐색을 하면서 방분한 순서대로 방문 전 정점과의 LCA를 구하여 정점을 이동할 때마다 몇개의 간선을 지나가는지 구하면 된다.
``` c
#include <stdio.h>
#include <vector>
#include <algorithm>
#include <cstring>
#include <queue>
const int MAX = 18; // roundup log(2, 100000)
typedef long long ll;
using namespace std;
int T, n, bef;
int parent[100010][MAX], depth[100010];
vector<int> adj[100010];
queue<int> qu;
ll ans;
int lca_dist(int u, int v) {
if (depth[u] < depth[v]) swap(u, v);
int diff = depth[u] - depth[v];
int st = depth[u] + depth[v];
for (int j = 0; diff; j++) {
if (diff % 2) u = parent[u][j];
diff /= 2;
}
if (u != v) {
for (int j = MAX - 1; j >= 0; j--) {
if (parent[u][j] != -1 && parent[u][j] != parent[v][j]) {
u = parent[u][j];
v = parent[v][j];
}
}
u = parent[u][0];
}
return st - 2 * depth[u];
}
int main() {
freopen("input.txt", "r", stdin);
scanf("%d", &T);
for (int tc = 1; tc <= T; tc++) {
ans = 0;
fill(depth, depth + n + 3, -1);
memset(parent, -1, sizeof(parent));
scanf("%d", &n);
for (int i = 2; i <= n; i++) {
scanf("%d", &parent[i][0]);
adj[parent[i][0]].push_back(i);
}
for (int i = 2; i <= n; i++)
sort(adj[i].begin(), adj[i].end());
for (int j = 0; j < MAX - 1; j++) {
for (int i = 2; i <= n; i++) {
if (parent[i][j] != -1)
parent[i][j + 1] = parent[parent[i][j]][j];
}
}
qu.push(1);
bef = 1;
depth[0] = -1;
depth[1] = 0;
while (int s = qu.size()) {
int here = qu.front();
depth[here] = depth[parent[here][0]] + 1;
ans += ll(lca_dist(here, bef));
qu.pop();
for (int there : adj[here])
qu.push(there);
bef = here;
}
printf("#%d %lld\n", tc, ans);
for (int i = 0; i <= n; i++)
adj[i].clear();
}
}
``` | 21.534091 | 72 | 0.529815 | kor_Hang | 0.444248 |
1c6e1510f97db3e13868ef656c131c33bf2054b1 | 14 | md | Markdown | README.md | Boosiebo/Roger- | 39b809ecf0ee4264a95f6df2c2084ede77abcf75 | [
"Apache-2.0"
] | null | null | null | README.md | Boosiebo/Roger- | 39b809ecf0ee4264a95f6df2c2084ede77abcf75 | [
"Apache-2.0"
] | null | null | null | README.md | Boosiebo/Roger- | 39b809ecf0ee4264a95f6df2c2084ede77abcf75 | [
"Apache-2.0"
] | null | null | null | # Roger-
Work
| 4.666667 | 8 | 0.642857 | deu_Latn | 0.377837 |
1c6e54e082f1f2c245723598f8d268ae34e61675 | 358 | md | Markdown | docs/utils/trigonometry/sin.md | suns-echoes/math-utils | b234f8597918d6958e9c70ad67c7cd47eb90de2e | [
"MIT"
] | null | null | null | docs/utils/trigonometry/sin.md | suns-echoes/math-utils | b234f8597918d6958e9c70ad67c7cd47eb90de2e | [
"MIT"
] | null | null | null | docs/utils/trigonometry/sin.md | suns-echoes/math-utils | b234f8597918d6958e9c70ad67c7cd47eb90de2e | [
"MIT"
] | null | null | null | Trigonometry / Sine
===================
### `sin`
Method calculates sine value.
Usage
-----
```js
const value = MathUtils.trigonometry.sin(theta);
```
### Arguments
* **`theta`**: *`number`* - angle in radians
### Returns
* *`number`* - sine value
Examples
--------
```js
// Calculate value:
const value = MathUtils.trigonometry.sin(1.53);
```
| 10.529412 | 48 | 0.578212 | eng_Latn | 0.185253 |
1c6e70a20fc9dd0f7e4ec046b0b8319dcc868328 | 5,865 | md | Markdown | README.md | teemulehtinen/qlcjs | f7de21cd3e136db9592957b7b9f47d5017086bc3 | [
"MIT"
] | null | null | null | README.md | teemulehtinen/qlcjs | f7de21cd3e136db9592957b7b9f47d5017086bc3 | [
"MIT"
] | null | null | null | README.md | teemulehtinen/qlcjs | f7de21cd3e136db9592957b7b9f47d5017086bc3 | [
"MIT"
] | null | null | null | # qlcjs
Generates questions about concrete constructs and patterns in a given JavaScript program.
These questions (including answering options) can be posed to a learner to practice
introductory programming. These questions include elements to develop program comprehension
and program tracing.
Automatic generation enables systems to pose the generated questions to leaners
about their own programs that they just programmed. Such Questions About Learners' Code (QLCs)
may have self-reflection and self-explanation effects that are of interest
in computing education research.
### References
The implementation was inspired by the work of Evan Cole on
[study-lenses](https://github.com/colevandersWands/study-lenses).
Analysis and transformations of the input programs depend on
[Shift tools](https://shift-ast.org/) by Shape Security, Inc.
These libraries are available under the
[Apache License, Version 2.0](http://www.apache.org/licenses/LICENSE-2.0).
The concept of Questions About Learners' Code (QLCs) is first introduced by Lehtinen et al. in
[Let's Ask Students About Their Programs, Automatically](https://doi.org/10.1109/ICPC52881.2021.00054).
---
### Example result
```TypeScript
function power(a, b) {
let n = 1;
for (let i = 0; i < b; i++) {
n *= a;
}
return n;
}
```
Which is the name of the function? [FunctionName]
* a [parameter_name] _<small>This is a parameter name for a value passed as an argument when calling the function</small>_
* b [parameter_name] _<small>This is a parameter name for a value passed as an argument when calling the function</small>_
* function [keyword] _<small>The keyword/command "function" is used when the program is about to declare a function</small>_
* let [keyword] _<small>This is a program keyword/command used inside the function body</small>_
* power [function_name] _<small>Correct, this is the name</small>_
Which are the parameter names of the function? [ParameterName]
* a [parameter_name] _<small>Correct, this is one of the 2 parameter names for this function</small>_
* b [parameter_name] _<small>Correct, this is one of the 2 parameter names for this function</small>_
* function [keyword] _<small>The keyword/command "function" is used when the program is about to declare a function</small>_
* power [function_name] _<small>This is the name of the function that is used to call the function</small>_
* return [keyword] _<small>This is a program keyword/command used inside the function body</small>_
Which value does <em>a</em> have when execution of <em>power(2, 2)</em> starts? [ParameterValue]
* 1 [literal] _<small>This is a literal value that is used inside the function body</small>_
* 2 [parameter_value] _<small>Correct, this is the value passed as an argument for the given parameter</small>_
* a [parameter_name] _<small>This is a parameter name for a value passed as an argument when calling the function</small>_
A program loop starts on line 3. Which is the last line inside it? [LoopEnd]
* 2 [line_before_block] _<small>The loop starts after this line</small>_
* 3 [line_inside_block] _<small>This line is inside the loop BUT it is not the last one</small>_
* 4 [last_line_inside_block] _<small>Correct, this is the last line inside the loop (closing curly bracket may appear later)</small>_
* 6 [line_after_block] _<small>The loop ends before this line</small>_
A value is assigned to variable <em>n</em> on line 4. On which line is <em>n</em> declared? [VariableDeclaration]
* 0 [random_line] _<small>This is a random line that does not handle the given variable</small>_
* 2 [declaration_line] _<small>Correct, this is the line where the variable is declared using the keyword let</small>_
* 4 [reference_line] _<small>This line references (reads or writes) the given variable BUT it is declared before</small>_
* 5 [random_line] _<small>This is a random line that does not handle the given variable</small>_
* 6 [reference_line] _<small>This line references (reads or writes) the given variable BUT it is declared before</small>_
Which is the ordered sequence of values that are assigned to variable <em>i</em> while executing <em>power(2, 2)</em>? [VariableTrace]
* 0, 1 [trace_miss_last] _<small>No, this sequence is missing a value that gets assigned</small>_
* 0, 1, 2 [trace] _<small>Correct, step by step these values are assigned to the variable</small>_
* 0, 1, 2, 3 [trace_extra_last] _<small>No, this sequence has an extra value that is not assigned</small>_
* 1, 2 [trace_miss_first] _<small>No, this sequence is missing a value that gets assigned</small>_
* 1, 2, 0 [trace_shuffled] _<small>No, this is an incorrect random sequence</small>_
---
### Usage
The library is implemented in TypeScript and the types can describe a great deal of the API.
```TypeScript
import { generate, QLCRequest, ProgramInput, QLC } from 'qlcjs';
const req: QLCRequest[] = [
{
count: 1, // Number of questions to generate if possible
types: ['FunctionName', 'ParameterName'], // Accepted question types
},
{
count: 3,
fill: true, // Fill in missing number of questions for the count
uniqueTypes: true, // Only accept one question for each question type
},
];
const input: ProgramInput = {
functionName: 'task', // Try to evaluate this function to collect dynamic data
arguments: [[0, 'a'], [1, 'b'], [2, 'c']], // Use one set from these arguments
};
const questions: QLC[] = generate(sourceCode, req, input);
```
Example of using the build unpkg-module:
`npm install && npm run build && ls dist/qlcjs.min.js`
```HTML
<script src="qlcjs.min.js"></script>
<script>
const qlcs = qlcjs.generate(sourceCode, [
{ count: 1, types: ['FunctionName', 'ParameterName'] },
{ count: 3, fill: true, uniqueTypes: true }
]);
qlcs.forEach(qlc => {
console.log(qlc.type, qlc.question, qlc.options);
});
</script>
```
| 47.682927 | 134 | 0.741858 | eng_Latn | 0.995749 |
1c6f46bf75ccda6b21413dd7a169d6967c7b18ef | 234 | md | Markdown | src/showreel-drafts/Continual Universal Object Detection.md | BraneShop/BraneShop.github.io | 2d5cdeb6ec3b2c78016a4e4b773fdc12174cb255 | [
"MIT"
] | 1 | 2019-11-07T09:51:58.000Z | 2019-11-07T09:51:58.000Z | src/showreel-drafts/Continual Universal Object Detection.md | BraneShop/BraneShop.github.io | 2d5cdeb6ec3b2c78016a4e4b773fdc12174cb255 | [
"MIT"
] | 4 | 2019-02-13T20:19:35.000Z | 2019-10-18T08:48:44.000Z | src/showreel-drafts/Continual Universal Object Detection.md | BraneShop/BraneShop.github.io | 2d5cdeb6ec3b2c78016a4e4b773fdc12174cb255 | [
"MIT"
] | 1 | 2019-02-10T19:57:24.000Z | 2019-02-10T19:57:24.000Z | ---
link: https://arxiv.org/pdf/2002.05347.pdf
title: Continual Universal Object Detection
image: /images/showreel/Continual Universal Object Detection.jpg
date: 2020-02-13
tags: computer-vision, technical
draft: draft
preview:
---
| 19.5 | 64 | 0.773504 | yue_Hant | 0.302479 |
1c6ff07863bbf5113fdddd45f52f44cd42aed6bd | 5,507 | md | Markdown | articles/active-directory/develop/authentication-national-cloud.md | EINSTEINPRACIANO/azure-docs.pt-br | 93bbbf115ab76d31e6bc8919a338700294966913 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/develop/authentication-national-cloud.md | EINSTEINPRACIANO/azure-docs.pt-br | 93bbbf115ab76d31e6bc8919a338700294966913 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/active-directory/develop/authentication-national-cloud.md | EINSTEINPRACIANO/azure-docs.pt-br | 93bbbf115ab76d31e6bc8919a338700294966913 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Autenticação usando o Azure Active Directory nas nuvens nacionais
description: Saiba mais sobre o registro do aplicativo e os pontos de extremidade de autenticação para nuvens nacionais.
services: active-directory
documentationcenter: ''
author: negoe
manager: CelesteDG
editor: ''
ms.service: active-directory
ms.subservice: develop
ms.devlang: na
ms.topic: article
ms.tgt_pltfrm: na
ms.workload: identity
ms.date: 05/07/2019
ms.author: negoe
ms.reviewer: negoe,CelesteDG
ms.custom: aaddev
ms.collection: M365-identity-device-management
ms.openlocfilehash: a0d4586df23548854f4acbfefd32081a36906097
ms.sourcegitcommit: 0ae3139c7e2f9d27e8200ae02e6eed6f52aca476
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 05/06/2019
ms.locfileid: "65067912"
---
# <a name="national-clouds"></a>Nuvens nacionais
Nuvens nacionais são instâncias isoladas fisicamente do Azure. Essas regiões do Azure são projetadas para garantir que os requisitos de residência, de soberania e de conformidade de dados sejam respeitados dentro de limites geográficos.
Incluindo nuvem global, o Azure Active Directory é implantado nas seguintes nuvens nacionais:
- Governo dos EUA para Azure
- Azure Alemanha
- Azure China 21Vianet
Nuvens nacionais são ambientes exclusivos e diferentes do Azure global. Portanto, é importante estar ciente de algumas das principais diferenças ao desenvolver seu aplicativo para esses ambientes, como registrar aplicativos, adquirir tokens e configurar pontos de extremidade.
## <a name="app-registration-endpoints"></a>Pontos de extremidade de registro do aplicativo
Há um portal do Azure separado para cada uma das nuvens nacionais. Para integrar aplicativos com a plataforma de identidade da Microsoft em uma nuvem nacional, você deve registrar seu aplicativo separadamente em cada um portal do Azure específico do ambiente.
A tabela a seguir lista as URLs base para os pontos de extremidade do Azure AD (Azure Active Directory) usados para registrar um aplicativo para cada nuvem nacional.
| Nuvem nacional | Ponto de extremidade do portal do Azure AD |
|----------------|--------------------------|
| Azure AD for US Government | `https://portal.azure.us` |
| Azure AD Alemanha | `https://portal.microsoftazure.de` |
| Azure AD China operado pela 21Vianet | `https://portal.azure.cn` |
| Azure AD (serviço global) |`https://portal.azure.com` |
## <a name="azure-ad-authentication-endpoints"></a>Pontos de extremidade de autenticação do Azure AD
Todas as nuvens nacionais autenticam usuários separadamente em cada ambiente e têm pontos de extremidade de autenticação separados.
A tabela a seguir lista as URLs base para os pontos de extremidade do Azure AD (Azure Active Directory) usados para obter tokens para chamar o Microsoft Graph para cada nuvem nacional.
| Nuvem nacional | Ponto de extremidade de autenticação do Azure AD |
|----------------|-------------------------|
| Azure AD for US Government | `https://login.microsoftonline.us` |
| Azure AD Alemanha| `https://login.microsoftonline.de` |
| Azure AD China operado pela 21Vianet | `https://login.chinacloudapi.cn` |
| Azure AD (serviço global)| `https://login.microsoftonline.com` |
- As solicitações para os pontos de extremidade de token ou autorização do Azure AD podem ser formadas usando a URL base específica da região apropriada. Por exemplo, para o Azure Alemanha:
- O ponto de extremidade de autorização comum é `https://login.microsoftonline.de/common/oauth2/authorize`.
- O ponto de extremidade de token comum é `https://login.microsoftonline.de/common/oauth2/token`.
- Para aplicativos de locatário único, substitua o comum nas URLs anteriores por sua ID de locatário ou nome, por exemplo, `https://login.microsoftonline.de/contoso.com`.
> [!NOTE]
> A [autorização do Azure AD v2.0]( https://docs.microsoft.com/azure/active-directory/develop/active-directory-appmodel-v2-overview) e os pontos de extremidade de token só estão disponíveis para o serviço global. Ele ainda não tem suporte para implantações de nuvem nacional.
## <a name="microsoft-graph-api"></a>API do Microsoft Graph
Para saber como chamar as APIs do Microsoft Graph no ambiente de Nuvem Nacional, acesse [Microsoft Graph na nuvem nacional](https://developer.microsoft.com/graph/docs/concepts/deployments).
> [!IMPORTANT]
> Determinados serviços e recursos que estão em regiões específicas do serviço global podem não estar disponíveis em todas as nuvens Nacionais. Para descobrir quais serviços estão disponíveis, acesse [produtos disponíveis por região](https://azure.microsoft.com/global-infrastructure/services/?products=all®ions=usgov-non-regional,us-dod-central,us-dod-east,usgov-arizona,usgov-iowa,usgov-texas,usgov-virginia,china-non-regional,china-east,china-east-2,china-north,china-north-2,germany-non-regional,germany-central,germany-northeast).
Siga esse [tutorial de biblioteca de autenticação da Microsoft (MSAL)](msal-national-cloud.md) para aprender a criar um aplicativo usando a plataforma de identidade da Microsoft. Especificamente, este aplicativo irá entrar em um usuário, obter um token de acesso para chamar a API do Microsoft Graph.
## <a name="next-steps"></a>Next steps
Learn more about:
- [Azure Government](https://docs.microsoft.com/azure/azure-government/)
- [Azure China 21Vianet](https://docs.microsoft.com/azure/china/)
- [Azure Germany](https://docs.microsoft.com/azure/germany/)
- [Azure AD authentication basics](authentication-scenarios.md)
| 59.858696 | 538 | 0.781551 | por_Latn | 0.989921 |
1c7008b44845d2d684b815ce4fb7745588777854 | 771 | md | Markdown | innersource-repository-scanner/src/test/resources/readmes/README-COMPLETE.md | LaudateCorpus1/innersource-scanner | c4253df0b9d7de75b982a9f716ee135284651020 | [
"Apache-2.0"
] | 15 | 2021-06-22T18:20:52.000Z | 2022-03-16T10:02:50.000Z | innersource-repository-scanner/src/test/resources/readmes/README-COMPLETE.md | intuit/innersource-scanner | c4253df0b9d7de75b982a9f716ee135284651020 | [
"Apache-2.0"
] | 7 | 2021-06-23T17:04:57.000Z | 2021-07-12T19:31:14.000Z | innersource-repository-scanner/src/test/resources/readmes/README-COMPLETE.md | LaudateCorpus1/innersource-scanner | c4253df0b9d7de75b982a9f716ee135284651020 | [
"Apache-2.0"
] | 1 | 2022-02-24T08:05:41.000Z | 2022-02-24T08:05:41.000Z | 

# Project Title
This is the project description.
# Usage
Explain how to use the software from an end user's perspective.
# Local Development
Explain how a new developer to the team should configure their
local environment to begin making changes to the codebase
# Contributing
Link to Contributing.md, any other general information about
what kind of contributions are encouraged or discouraged.
# Support
How should a contributor or user find help with this codebase.
Consider including:
- Slack Channel
- Stack Overflow tag
- Email
- link to CODEOWNERS file which contains list of maintainers and
trusted committers who can be contacted for support
| 24.09375 | 64 | 0.793774 | eng_Latn | 0.996509 |
1c708cc1482b3ca479081fd22ec2fbe89f63e7ea | 140 | md | Markdown | source/_honours/1970-dux-ludorum.md | salihzeki12000/Website | 733c8f7ea86f6ee4967cbbbb83cfc10d12cf7c12 | ["MIT"] | null | null | null | source/_honours/1970-dux-ludorum.md | salihzeki12000/Website | 733c8f7ea86f6ee4967cbbbb83cfc10d12cf7c12 | ["MIT"] | 10 | 2019-08-13T10:55:15.000Z | 2022-02-26T10:21:10.000Z | source/_honours/1970-dux-ludorum.md | wing5wong/artisan-whs-static | cb36480c54adcb5cbe1fd25c1a741bb525112a72 | ["MIT"] | 1 | 2020-10-30T12:55:04.000Z | 2020-10-30T12:55:04.000Z | 
---
title: 1970 Dux Ludorum
date: 1970-12-01T03:58:30.974Z
award: Dux Ludorum
person1_name: Michael Young
person2_name: Raewyn Knowles
---
| 15.555556 | 30 | 0.757143 | eng_Latn | 0.184328 |
1c71066c0ada188606bec05910fd0e914c8a1442 | 9,693 | md | Markdown | powerapps-docs/maker/common-data-service/create-edit-entities-solution-explorer.md | PathToSharePoint/powerapps-docs | 897a3dd2b1afd272809a2261992b61d83eb9880a | ["CC-BY-4.0", "MIT"] | null | null | null | powerapps-docs/maker/common-data-service/create-edit-entities-solution-explorer.md | PathToSharePoint/powerapps-docs | 897a3dd2b1afd272809a2261992b61d83eb9880a | ["CC-BY-4.0", "MIT"] | null | null | null | powerapps-docs/maker/common-data-service/create-edit-entities-solution-explorer.md | PathToSharePoint/powerapps-docs | 897a3dd2b1afd272809a2261992b61d83eb9880a | ["CC-BY-4.0", "MIT"] | null | null | null | 
---
title: "Create and edit tables using solution explorer | MicrosoftDocs"
description: "Learn how to create a table using solution explorer"
ms.custom: ""
ms.date: 02/26/2019
ms.reviewer: ""
ms.service: powerapps
ms.suite: ""
ms.tgt_pltfrm: ""
ms.topic: "article"
applies_to:
- "Dynamics 365 (online)"
- "Dynamics 365 Version 9.x"
author: "Mattp123"
ms.author: "matp"
manager: "kvivek"
search.audienceType:
- maker
search.app:
- PowerApps
- D365CE
---
# Create and edit tables using solution explorer
[!INCLUDE[cc-data-platform-banner](../../includes/cc-data-platform-banner.md)]
You can easily create a table using Power Apps ([make.powerapps.com](https://make.powerapps.com)) for most common situations, but not all capabilities are implemented there. When you need to meet the requirements described in [Create and edit tables in Microsoft Dataverse](create-edit-entities.md), you can achieve them by creating or editing tables using the Solution Explorer.
## Open solution explorer
The customization prefix is part of the name of any table you create. This is set based on the solution publisher for the solution you’re working in. If you care about the customization prefix, make sure that you are working in an unmanaged solution where the customization prefix is the one you want for this table. More information: [Change the solution publisher prefix](change-solution-publisher-prefix.md)
[!INCLUDE [cc_navigate-solution-from-powerapps-portal](../../includes/cc_navigate-solution-from-powerapps-portal.md)]
## View tables
In the solution explorer **Components** node, select the **Entities** node.

## Create a table
While [viewing tables](#view-tables), select **New** to open the form to create tables.

The form has two tabs. The **General** tab is for entity options. The **Primary Field** tab is for options about the special single line of text column that each table has that defines the text shown when there is a link to open the table in a lookup column.
For information about each section see the following:
- [Configure the primary column](#configure-the-primary-column)
- [Configure required columns](#configure-required-columns)
> [!NOTE]
> You can also make the table a custom activity. This choice changes some of the default option values. More information: [Create a custom activity table](#create-custom-activity-table)
After you have set the required options for the table, click  to create the custom table.
### Configure the primary column
In the **Primary Field** tab you can usually accept the default values for the primary column, but you have the following options:
|Field |Description |
|---------|---------|
|**Display Name**|Enter the localizable label that will be displayed for this column in forms and lists. The default value is **Name**.|
|**Name**|Set the name used in the system for this column. The default value is `<customization prefix>_name`|
|**Maximum Length**|Enter the maximum length for the column values. The default is 100.|
> [!NOTE]
> These options do not apply if the table is an activity table. More information: [Create a custom activity table](#create-custom-activity-table)
### Configure required columns
In the **General** tab, some of the options are required before you can save the table.
|Field |Description |
|---------|---------|
|**Display Name**|This is the singular name for the table that will be shown in the app.<br />This can be changed later.|
|**Plural Name**|This is the plural name for the table that will be shown in the app.<br />This can be changed later.|
|**Name**|This column is pre-populated based on the display name you enter. It includes the solution publisher customization prefix.|
|**Ownership**|You can choose either user or team-owned or organization owned. More information: [Standard table ownership](types-of-entities.md#standard-tables)|
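For illustration, the required options above map onto the table metadata that the Dataverse Web API's `EntityDefinitions` endpoint accepts. The sketch below only assembles such a payload; the `new_` publisher prefix, the labels, and language code 1033 are hypothetical values for illustration, and actually creating the table would require posting this to a real environment with a valid token.

```python
# Sketch only: builds the kind of JSON payload the Dataverse Web API's
# EntityDefinitions endpoint expects when creating a custom table.
# Prefix, labels, and LCID are assumptions; sending it
# (POST /api/data/v9.x/EntityDefinitions) needs a real org and token.

def label(text, lcid=1033):
    """Wrap a string in the localized-label shape Dataverse uses."""
    return {"@odata.type": "Microsoft.Dynamics.CRM.Label",
            "LocalizedLabels": [{"@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
                                 "Label": text, "LanguageCode": lcid}]}

def table_definition(display_name, plural_name, prefix="new_", ownership="UserOwned"):
    """Assemble an EntityMetadata-style payload from the required columns above."""
    schema_name = prefix + display_name.replace(" ", "")
    return {
        "@odata.type": "Microsoft.Dynamics.CRM.EntityMetadata",
        "SchemaName": schema_name,
        "DisplayName": label(display_name),
        "DisplayCollectionName": label(plural_name),
        "OwnershipType": ownership,
        # Primary column: the single line of text shown when the table
        # appears in a lookup (defaults mirror the Primary Field tab above).
        "Attributes": [{
            "@odata.type": "Microsoft.Dynamics.CRM.StringAttributeMetadata",
            "SchemaName": prefix + "name",
            "IsPrimaryName": True,
            "MaxLength": 100,
            "DisplayName": label("Name"),
        }],
    }

payload = table_definition("Bank Account", "Bank Accounts")
print(payload["SchemaName"])  # → new_BankAccount
```

Note how the `SchemaName` is the display name with the customization prefix prepended, matching the pre-populated **Name** column described in the table above.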
## Edit a table
While [viewing tables](#view-tables), select the table you want to edit, or continue editing a new table you have just saved.
> [!NOTE]
> Standard tables or custom tables that are part of a managed solution may have limitations on changes you can apply. If the option is not available or is disabled, you are not allowed to make the change.
#### Set once options
The following options can be set once and cannot be changed after you set them. Take care to only set these options when you need them.
<!--
Same data is presented in edit-tables.md
Both should point to this include
-->
[!INCLUDE [cc_entity-set-once-options-table](../../includes/cc_entity-set-once-options-table.md)]
#### Options that you can change
The following properties can be changed at any time.
<!--
Same data is presented in edit-tables.md
Both should point to this include
-->
[!INCLUDE [cc_entity-changeable-options-table](../../includes/cc_entity-changeable-options-table.md)]
You can also make the following changes:
- [Create and edit columns for Dataverse](create-edit-fields.md)
- [Create and edit relationships between tables](create-edit-entity-relationships.md)
- [Create and design forms](../model-driven-apps/create-design-forms.md)
- [Create a business process flow to standardize processes](/flow/create-business-process-flow)
## Delete a table
As someone with the system administrator security role, you can delete custom tables that aren’t part of a managed solution.
> [!IMPORTANT]
> - When you delete a custom table, the database tables that store data for that table are deleted and all data they contain is lost. Any associated rows that have a parental relationship to the custom table are also deleted. For more information about parental relationships, see [Create and edit relationships between tables](create-edit-entity-relationships.md).
> - The only way to recover data from a table that was deleted is to restore the database from a point before the table was deleted. More information: [Backup and restore environments](/power-platform/admin/backup-restore-environments)
While [viewing tables](#view-tables), click the  command in the toolbar.
While viewing a table use the delete command in the menu bar.

> [!WARNING]
> Deleting a table that contains data will remove all the data. This data can only be retrieved by backup of the database.
> [!NOTE]
> If there are any table dependencies you will get a **Cannot Delete Component** error with a **Details** link you can use to discover information about why the table cannot be deleted. In most cases, this will be because of a dependency that has to be removed.
>
> There may be more than one dependency blocking the deletion of a table. This error message may only show the first one. For an alternate way to discover dependencies, see [Identify table dependencies](#identify-table-dependencies)
### Identify table dependencies
You can identify dependencies that will prevent a table from being deleted before you try to delete it.
1. In the solution explorer with the table selected, click **Show Dependencies** in the command bar.

2. In the dialog window that opens, scroll the list to the right to view the **Dependency Type** column.

**Published** dependencies will block deleting a table. **Internal** dependencies should be resolved by the system.
3. Remove these published dependencies and you should be able to delete the table.
> [!NOTE]
> A very common dependency is that another table form has a lookup column for the table you are deleting. Removing the lookup column from the form will resolve the dependency.
## Create custom activity table
To create the table as an activity table, use the same steps described in this topic except select **Define as an activity table**.

An activity table is a special kind of table that tracks actions for which an entry can be made on a calendar. More information: [Activity tables](types-of-entities.md#activity-tables).
When you set this option, some table properties are not compatible. An activity table has to conform to standard behaviors that all activity tables use.
The primary column **Name** and **Display Name** will be set to **Subject** and you cannot change this.
The following options are set by default and cannot be changed:
- **Feedback**
- **Notes (includes attachments)**
- **Connections**
- **Queues**
- **Offline capability for Dynamics 365 for Outlook**
The following options cannot be set:
- **Areas that display this table**
- **Activities**
- **Sending email**
- **Mail Merge**
- **Single row auditing**
- **Multiple row auditing**
## Create a virtual table
Some options are only used when creating a virtual table.
|Option |Description |
|---------|---------|
|**Virtual Entity**|Whether the table is a virtual table.|
|**Data Source**|The data source for the table.|
More information: [Create and edit virtual tables that contain data from an external data source](create-edit-virtual-entities.md)
### See also
[Create and edit tables in Dataverse](create-edit-entities.md)<br />
[Tutorial: Create a custom table that has components in Power Apps](/powerapps/maker/common-data-service/create-custom-entity)<br />
[Create a solution](create-solution.md)
| 47.748768 | 411 | 0.759105 | eng_Latn | 0.996754 |
1c72348fbbb789b9fbab21b8059a11366fec9e8d | 18 | md | Markdown | README.md | kanishk779/RandomData | 434ee2e5d0910b04a662b16495e8d478b2067e80 | ["MIT"] | null | null | null | README.md | kanishk779/RandomData | 434ee2e5d0910b04a662b16495e8d478b2067e80 | ["MIT"] | null | null | null | README.md | kanishk779/RandomData | 434ee2e5d0910b04a662b16495e8d478b2067e80 | ["MIT"] | null | null | null | 
# RandomData
Data
| 6 | 12 | 0.777778 | yue_Hant | 0.824065 |
1c725fb0d5a856f44ed44473bf03e767ae692e5b | 1,451 | md | Markdown | 2020/09/18/2020-09-18 22:15.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | ["MIT"] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/09/18/2020-09-18 22:15.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | ["MIT"] | null | null | null | 2020/09/18/2020-09-18 22:15.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | ["MIT"] | null | null | null | 
2020年09月18日22时数据
Status: 200
1.吴亦凡 我不在这节目都不会出现
微博热度:4415932
2.一90后户籍被永久标注拒服兵役
微博热度:2871278
3.乔欣 嫉妒不可耻
微博热度:2455592
4.周杰伦吃醋昆凌给别人加油
微博热度:2145032
5.黑蒜
微博热度:1931409
6.何凯文
微博热度:1791811
7.毫无购买欲望的吃播
微博热度:1175290
8.解放军对印军播放歌曲
微博热度:926464
9.教育部要求高校方便学生进出校门
微博热度:884116
10.演唱会快回来了
微博热度:860509
11.杨迪洗肥肠控制不住的yue
微博热度:860405
12.最美逆行者之别来无恙
微博热度:860027
13.钟南山称疫情今冬明春可能继续存在
微博热度:859748
14.老奶奶被让座后为读书学生打光
微博热度:859336
15.温州大妈为雨中默哀女交警撑伞
微博热度:858946
16.熊孩子把超市桃子全戳上洞
微博热度:858765
17.考虫崩了
微博热度:724476
18.山东一高校古琴专业今年只招1人
微博热度:701111
19.网红拉姆遭前夫纵火焚烧重度烧伤
微博热度:691977
20.比伯看老婆和看狗仔的眼神对比
微博热度:688646
21.秋天太适合恋爱了
微博热度:641743
22.金晓钟说泫雅是最后一任女友
微博热度:541135
23.国庆开车出游一箱油省13元
微博热度:535960
24.台湾
微博热度:535471
25.合肥幼儿园老师天台上掌掴幼童
微博热度:535370
26.GALI心疼张靓颖
微博热度:535360
27.小白 男孩子也有爱美的权利
微博热度:526782
28.巴奴回应海底捞抄袭争议
微博热度:489101
29.外交部回应解放军台海附近实战化演练
微博热度:487659
30.泡面吃出了大餐的感觉
微博热度:487295
31.Mike前女友
微博热度:467236
32.中国新说唱
微博热度:443641
33.UNINE拍摄解散照
微博热度:430293
34.郝帅太惨了
微博热度:321370
35.承煦新婚夜追茗玉
微博热度:313333
36.吴中天让杨子姗躺床上刷牙
微博热度:309796
37.孙弈秋支棱起来了
微博热度:308166
38.平平无奇的起名小天才
微博热度:302764
39.初三男生被家长扇耳光跳楼身亡
微博热度:297864
40.萧承耀下线
微博热度:294280
41.小白输给小青龙
微博热度:289266
42.怪物猎人崛起
微博热度:287760
43.理智粉丝的追星方式
微博热度:283261
44.聚集性疫情地区需7天内全员核酸检测
微博热度:279733
45.黄鸿升遗照
微博热度:273463
46.9月19日圆明园免费入园
微博热度:272598
47.瑞丽已完成核酸检测225819份全阴性
微博热度:267304
48.中国好声音
微博热度:264948
49.动物园回应斑马出逃上路
微博热度:264269
50.Mike为争儿子抚养权节衣缩食
微博热度:261868
| 7.112745 | 22 | 0.787733 | yue_Hant | 0.359813 |
1c7451c3168ab129051229ce631d39c6af69020e | 894 | md | Markdown | docs/preprocessor/exclude-hash-import.md | stu85010/cpp-docs.zh-tw | bac0362e722d794727f509d63a2e3179b70d0785 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/preprocessor/exclude-hash-import.md | stu85010/cpp-docs.zh-tw | bac0362e722d794727f509d63a2e3179b70d0785 | ["CC-BY-4.0", "MIT"] | null | null | null | docs/preprocessor/exclude-hash-import.md | stu85010/cpp-docs.zh-tw | bac0362e722d794727f509d63a2e3179b70d0785 | ["CC-BY-4.0", "MIT"] | null | null | null | 
---
title: exclude (#import)
ms.date: 10/18/2018
f1_keywords:
- exclude
helpviewer_keywords:
- exclude attribute
ms.assetid: 0883248a-d4bf-420e-9848-807b28fa976e
ms.openlocfilehash: 1de1376fbe80147bc9fe01ea83bad81912431310
ms.sourcegitcommit: 6052185696adca270bc9bdbec45a626dd89cdcdd
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 10/31/2018
ms.locfileid: "50498250"
---
# <a name="exclude-import"></a>exclude (\#import)
**C++ Specific**
Excludes items from the type library header files being generated.
## <a name="syntax"></a>Syntax
```
exclude("Name1"[, "Name2",...])
```
### <a name="parameters"></a>Parameters
*Name1*<br/>
The first item to be excluded.
*Name2*<br/>
The second item to be excluded (if necessary).
## <a name="remarks"></a>Remarks
A type library may contain definitions of items that are also defined in system headers or other type libraries. This attribute can take any number of arguments, each one a top-level type library item to be excluded.
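A minimal sketch of usage follows. This is MSVC-specific, and the type library name `Widgets.tlb` and the excluded item names are hypothetical placeholders, not taken from this article:

```cpp
// MSVC-only sketch: generate .tlh/.tli headers from a type library,
// omitting two top-level items that are assumed (hypothetically) to be
// defined already in another header or type library.
#import "Widgets.tlb" exclude("IWidget", "tagWIDGETKIND")
```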
**END C++ Specific**
## <a name="see-also"></a>See also
[#import attributes](../preprocessor/hash-import-attributes-cpp.md)<br/>
[#import directive](../preprocessor/hash-import-directive-cpp.md) | 19.866667 | 65 | 0.714765 | yue_Hant | 0.735859 |
1c74b3b19a1bb6dcd6475f8b6bf28f5ffe91fdfb | 75,487 | md | Markdown | https___bryan-guner.gitbook.io_my-docs_/bryan-guner.gitbook.io/my-docs/web-development-frameworks/gatsby/graphql-concepts.md | Uvacoder/html-demo-code-and-experiments | 1bd2ab50afe8f331396c37822301afa8e4903bcd | [
"Apache-2.0"
] | 3 | 2021-11-15T07:54:37.000Z | 2021-11-29T03:09:12.000Z | https___bryan-guner.gitbook.io_my-docs_/bryan-guner.gitbook.io/my-docs/web-development-frameworks/gatsby/graphql-concepts.md | Uvacoder/html-demo-code-and-experiments | 1bd2ab50afe8f331396c37822301afa8e4903bcd | [
"Apache-2.0"
] | 26 | 2021-11-15T04:26:39.000Z | 2021-11-17T00:17:06.000Z | https___bryan-guner.gitbook.io_my-docs_/bryan-guner.gitbook.io/my-docs/web-development-frameworks/gatsby/graphql-concepts.md | Uvacoder/html-demo-code-and-experiments | 1bd2ab50afe8f331396c37822301afa8e4903bcd | [
"Apache-2.0"
] | null | null | null | <a href="https://bryan-guner.gitbook.io/my-docs/" class="css-4rbku5 css-1dbjc4n r-1awozwy r-1loqt21 r-18u37iz r-1otgn73 r-1i6wzkk r-lrvibr"></a>
# GraphQL Concepts
TABLE OF CONTENTS

- [Why is GraphQL so cool?](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#why-is-graphql-so-cool)
- [What does a GraphQL query look like?](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#what-does-a-graphql-query-look-like)
- [Understanding the parts of a query](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#understanding-the-parts-of-a-query)
- [Query operation type](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#query-operation-type)
- [Operation name](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#operation-name)
- [Query fields](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#query-fields)
- [How to learn GraphQL](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#how-to-learn-graphql)
- [How do GraphQL and Gatsby work together?](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#how-do-graphql-and-gatsby-work-together)
- [Where does Gatsby's GraphQL schema come from?](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#where-does-gatsbys-graphql-schema-come-from)
- [Powerful data transformations](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#powerful-data-transformations)
- [Formatting dates](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#formatting-dates)
- [Markdown](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#markdown)
- [Images](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#images)
- [Advanced](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#advanced)
- [Fragments](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#fragments)
- [Further reading](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#further-reading)
- [Getting started with GraphQL](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#getting-started-with-graphql)
- [Advanced readings on GraphQL](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#advanced-readings-on-graphql)
There are many options for loading data into React components. One of the most popular and powerful of these is a technology called [GraphQL](https://graphql.org/).

GraphQL was invented at Facebook to help product engineers _pull_ needed data into React components.

GraphQL is a **q**uery **l**anguage (the _QL_ part of its name). If you're familiar with SQL, it works in a very similar way. Using a special syntax, you describe the data you want in your component and then that data is given to you.

Gatsby uses GraphQL to enable [page and StaticQuery components](https://www.gatsbyjs.com/docs/conceptual/building-with-components/) to declare what data they and their sub-components need. Then, Gatsby makes that data available in the browser when needed by your components.
Data from any number of sources is made queryable in one unified layer, a key part of the Gatsby building process:

*(Diagram: Content → Build (Data) → View (App), with a GraphQL response such as:)*

```graphql
data: {
  site: {
    title: "Home"
    description: "Gatsby tips"
  }
}
```
**Data** returned by GraphQL comes back in the exact same shape that you asked for it, without having to travel across the network because it was already gathered at [build time](https://www.gatsbyjs.com/docs/glossary#build).

Since all data is combined in the data layer, it's even possible to query multiple sources at the same time.
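For instance, a single query can pull from the site's metadata and from sourced Markdown files at once. This is a sketch: the `allMarkdownRemark` field assumes the `gatsby-transformer-remark` plugin is installed and Markdown files are being sourced.

```graphql
{
  site {
    siteMetadata {
      title
    }
  }
  allMarkdownRemark {
    nodes {
      frontmatter {
        title
      }
    }
  }
}
```

Both results come back in one response, keyed by the same field names used in the query.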
## Why is GraphQL so cool?

For a more in-depth look, read [why Gatsby uses GraphQL](https://www.gatsbyjs.com/docs/why-gatsby-uses-graphql/).
- Eliminate frontend data boilerplate — no need to worry about requesting & waiting for data. Just ask for the data you need with a GraphQL query and it'll show up when you need it
- Push frontend complexity into queries — many data transformations can be done at _build-time_ within your GraphQL queries
- It's the perfect data querying language for the often complex/nested data dependencies of modern applications
- Improve performance by removing data bloat — GraphQL enables you to select only the data you need, not whatever an API returns
## What does a GraphQL query look like?

GraphQL lets you ask for the exact data you need. Queries look like JSON:

```graphql
{
  site {
    siteMetadata {
      title
    }
  }
}
```

Which returns this:

```json
{
  "site": {
    "siteMetadata": {
      "title": "A Gatsby site!"
    }
  }
}
```
A basic page component with a GraphQL query might look like this:

```jsx
import React from "react"
import { graphql } from "gatsby"

export default function Page({ data }) {
  return (
    <div>
      <h1>About {data.site.siteMetadata.title}</h1>
      <p>We're a very cool website you should return to often.</p>
    </div>
  )
}

export const query = graphql`
  query {
    site {
      siteMetadata {
        title
      }
    }
  }
`
```
The result of the query is automatically inserted into your React component on the `data` prop. GraphQL and Gatsby let you ask for data and then immediately start using it.

**Note:** To run GraphQL queries in non-page components you'll need to use [Gatsby's Static Query feature](https://www.gatsbyjs.com/docs/how-to/querying-data/static-query/).
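A minimal sketch of that pattern using Gatsby's `useStaticQuery` hook — the `Header` component name is illustrative:

```jsx
import React from "react"
import { useStaticQuery, graphql } from "gatsby"

// A non-page component: it does not receive a `data` prop,
// so it runs its own static query instead.
export default function Header() {
  const data = useStaticQuery(graphql`
    query {
      site {
        siteMetadata {
          title
        }
      }
    }
  `)
  return <h1>{data.site.siteMetadata.title}</h1>
}
```

Static queries run at build time like page queries, but cannot take variables.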
### Understanding the parts of a query

The following diagram shows a GraphQL query, with each word highlighted in a color corresponding to its name on the legend:

[Diagram of a basic query](https://www.gatsbyjs.com/static/2442c8664739d2a23483e6f16e3208a5/29007/basic-query.png)
**Query operation type**

The diagram marks the word `query` as the "Operation Type". For Gatsby's purposes, the only operation type you will deal with is `query`; it can be omitted from your queries if you prefer (as in the example above).
**Operation name**

`SiteInformation` is marked as the "Operation Name", which is a unique name that you assign to a query yourself. This is similar to how you would name a function or a variable, and like a function this can be omitted if you would rather the query be anonymous.
**Query fields**

The four words `site`, `id`, `siteMetadata`, and `title` are marked as "Fields". Any top-level fields — like `site` in the diagram — are sometimes referred to as **root-level fields**, though the name doesn't signify functional significance as all fields in GraphQL queries behave the same.
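Putting the three parts together, a fully spelled-out version of the diagrammed query might read as follows. The operation name `SiteInformation` and the four fields come from the diagram's legend; the exact nesting is a plausible reconstruction.

```graphql
query SiteInformation {   # operation type + operation name
  site {                  # root-level field
    id                    # field
    siteMetadata {        # field
      title               # field
    }
  }
}
```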
## How to learn GraphQL

Your experience developing with Gatsby might be the first time you've seen GraphQL! We hope you love it as much as we do and find it useful for all your projects.

When starting out with GraphQL, we recommend the following two tutorials:

- [https://www.howtographql.com/](https://www.howtographql.com/)
- [https://graphql.org/learn/](https://graphql.org/learn/)

[The official Gatsby tutorial](https://www.gatsbyjs.com/docs/tutorial/part-four/) also includes an introduction to using GraphQL specifically with Gatsby.
## How do GraphQL and Gatsby work together?

One of the great things about GraphQL is how flexible it is. People use GraphQL with [many different programming languages](https://graphql.org/code/) and for web and native apps.

Most people run GraphQL on a server to respond live to requests for data from clients. You define a schema (a formal way of describing the shape of your data) for your GraphQL server, and then your GraphQL resolvers retrieve data from databases and/or other APIs.

Gatsby uses GraphQL at _build-time_ and _not_ for live sites. This is unique, and it means you don't need to run additional services (e.g. a database and Node.js service) to use GraphQL for production websites.

Gatsby is a great framework for building apps, so it's possible and encouraged to pair Gatsby's native build-time GraphQL with GraphQL queries running against a live GraphQL server from the browser.
## Where does Gatsby's GraphQL schema come from?

Most usages of GraphQL involve manually creating a GraphQL schema.

Gatsby uses plugins which can fetch data from different sources. That data is used to automatically _infer_ a GraphQL schema.
If you give Gatsby data that looks like this:

```json
{ "title": "A long long time ago" }
```
Gatsby will create a schema that looks something like this:

```graphql
title: String
```
This makes it possible to pull data from anywhere and immediately start writing GraphQL queries against your data.

This _can_ cause confusion, as some data sources allow you to define a schema even when no data has been added for parts (or all) of the schema. If parts of the data haven't been added, those parts of the schema might not be recreated in Gatsby.
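The inference step can be pictured as a small pure function over sample data. Here is a minimal sketch of the idea; `inferSchema` is a hypothetical helper for illustration, not Gatsby's actual inference code (which also merges types across all nodes of a collection):

```javascript
// Minimal sketch: map a sample node's JavaScript values to GraphQL
// scalar type names, the way schema inference conceptually works.
function inferSchema(node) {
  const fields = {};
  for (const [key, value] of Object.entries(node)) {
    if (typeof value === "string") fields[key] = "String";
    else if (typeof value === "number")
      fields[key] = Number.isInteger(value) ? "Int" : "Float";
    else if (typeof value === "boolean") fields[key] = "Boolean";
  }
  return fields;
}

inferSchema({ title: "A long long time ago", rating: 5 });
// → { title: "String", rating: "Int" }
```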
## Powerful data transformations

GraphQL enables another unique feature of Gatsby: it lets you control data transformations with arguments to your queries. Some examples follow.
### Formatting dates

People often store dates like "2018-01-05" but want to display the date in some other form, like "January 5th, 2018". One way of doing this is to load a date-formatting JavaScript library into the browser. Or, with Gatsby's GraphQL layer, you can do the formatting at query time:

```graphql
{ date(formatString: "MMMM Do, YYYY") }
```
See the full list of formatting options on the [GraphQL reference page](https://www.gatsbyjs.com/docs/graphql-reference/#dates).
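To see why build-time formatting is attractive, here is a sketch of the same transformation in plain Node.js, with no client-side library shipped to the browser. This is illustrative only: Gatsby's `formatString` uses Moment.js-style tokens, while this sketch uses the standard `Intl` API.

```javascript
// Format an ISO date string for display, the way a build-time
// transformer might, using only the built-in Intl API.
function formatDate(iso) {
  const date = new Date(`${iso}T00:00:00Z`);
  return new Intl.DateTimeFormat("en-US", {
    year: "numeric",
    month: "long",
    day: "numeric",
    timeZone: "UTC",
  }).format(date);
}

formatDate("2018-01-05"); // → "January 5, 2018"
```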
### Markdown

Gatsby has _transformer_ plugins which can transform data from one form to another. A common example is markdown. If you install [`gatsby-transformer-remark`](https://www.gatsbyjs.com/plugins/gatsby-transformer-remark/), then in your queries you can specify that you want the transformed HTML version instead of markdown:

```graphql
markdownRemark { html }
```
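Conceptually, a transformer plugin is a pure function from one representation to another. The toy converter below handles only top-level headings and bold text; it is nothing like remark's real parser, but it shows the shape of the transformation:

```javascript
// Toy markdown-to-HTML transform (headings and bold only), to
// illustrate what a transformer plugin does to node content.
function transformMarkdown(md) {
  return md
    .split("\n")
    .map((line) => {
      const heading = line.match(/^# (.*)$/);
      const html = heading ? `<h1>${heading[1]}</h1>` : `<p>${line}</p>`;
      return html.replace(/\*\*(.+?)\*\*/g, "<strong>$1</strong>");
    })
    .join("\n");
}

transformMarkdown("# Hi\n**bold** text");
// → "<h1>Hi</h1>\n<p><strong>bold</strong> text</p>"
```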
### Images

Gatsby has rich support for processing images. Responsive images are a big part of the modern web and typically involve creating five or more sized thumbnails per photo. With Gatsby's [`gatsby-transformer-sharp`](https://www.gatsbyjs.com/plugins/gatsby-transformer-sharp/), you can _query_ your images for responsive versions. The query automatically creates all the needed responsive thumbnails and returns `src` and `srcSet` fields to add to your image element.

Combined with a special Gatsby image component, [gatsby-image](https://www.gatsbyjs.com/plugins/gatsby-image/), you have a very powerful set of primitives for building sites with images.
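A `srcSet` is simply a comma-separated list of URL/width-descriptor pairs, so assembling one from generated thumbnails is straightforward. The `name-WIDTHw.jpg` file-naming scheme below is an assumption for illustration, not the scheme sharp or Gatsby actually uses:

```javascript
// Build a srcSet attribute value from a base name and the widths of
// the generated thumbnails (hypothetical naming scheme).
function buildSrcSet(baseName, widths) {
  return widths.map((w) => `${baseName}-${w}w.jpg ${w}w`).join(", ");
}

buildSrcSet("avatar", [125, 250, 500]);
// → "avatar-125w.jpg 125w, avatar-250w.jpg 250w, avatar-500w.jpg 500w"
```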
This is what a component using `gatsby-image` looks like:

```jsx
import React from "react"
import Img from "gatsby-image"
import { graphql } from "gatsby"

export default function Page({ data }) {
  return (
    <div>
      <h1>Hello gatsby-image</h1>
      <Img fixed={data.file.childImageSharp.fixed} />
    </div>
  )
}

export const query = graphql`
  query {
    file(relativePath: { eq: "blog/avatars/kyle-mathews.jpeg" }) {
      childImageSharp {
        # Specify the image processing specifications right in the query.
        # Makes it trivial to update as your page's design changes.
        fixed(width: 125, height: 125) {
          ...GatsbyImageSharpFixed
        }
      }
    }
  }
`
```
See also the following blog posts:

- [Making Website Building Fun](https://www.gatsbyjs.com/blog/2017-10-16-making-website-building-fun/)
- [Image Optimization Made Easy with Gatsby.js](https://medium.com/@kyle.robert.gill/ridiculously-easy-image-optimization-with-gatsby-js-59d48e15db6e)
## Advanced

### Fragments

Notice that in the above example for [querying images](https://www.gatsbyjs.com/docs/conceptual/graphql-concepts/#images), we used `...GatsbyImageSharpFixed`, which is a GraphQL fragment: a reusable set of fields for query composition. You can read more about them [here](https://graphql.org/learn/queries/#fragments).
If you wish to define your own fragments for use in your application, you can use named exports to export them in any JavaScript file, and they will be automatically processed by Gatsby for use in your GraphQL queries.

For example, if you put a fragment in a helper component, you can use that fragment in any other query:

```js
// src/components/PostItem.js
export const markdownFrontmatterFragment = graphql`
  fragment MarkdownFrontmatter on MarkdownRemark {
    frontmatter {
      path
      title
      date(formatString: "MMMM DD, YYYY")
    }
  }
`
```
They can then be used in any GraphQL query after that:

```graphql
query ($path: String!) {
  markdownRemark(frontmatter: { path: { eq: $path } }) {
    ...MarkdownFrontmatter
  }
}
```
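Outside of Gatsby's build step, composing a fragment into a query is conceptually just string concatenation: the fragment definition is appended to the operation that spreads it. A minimal sketch of that idea (illustrative; Gatsby's Relay-based compiler does this for you at build time):

```javascript
// Compose an exported fragment into a query string by appending the
// fragment definition after the operation that spreads it.
const markdownFrontmatterFragment = `
  fragment MarkdownFrontmatter on MarkdownRemark {
    frontmatter { path title }
  }
`;

function withFragment(query, fragment) {
  return `${query}\n${fragment}`;
}

const query = withFragment(
  `query ($path: String!) {
    markdownRemark(frontmatter: { path: { eq: $path } }) {
      ...MarkdownFrontmatter
    }
  }`,
  markdownFrontmatterFragment
);
// `query` now contains both the operation and the fragment definition.
```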
It's good practice for your helper components to define and export a fragment for the data they need. For example, on your index page you might map over all of your posts to show them in a list:

```jsx
// src/pages/index.jsx
import React from "react"
import { graphql } from "gatsby"

export default function Home({ data }) {
  return (
    <div>
      <h1>Index page</h1>
      <h4>{data.allMarkdownRemark.totalCount} Posts</h4>
      {data.allMarkdownRemark.edges.map(({ node }) => (
        <div key={node.id}>
          <h3>
            {node.frontmatter.title} <span>— {node.frontmatter.date}</span>
          </h3>
        </div>
      ))}
    </div>
  )
}

export const query = graphql`
  query {
    allMarkdownRemark {
      totalCount
      edges {
        node {
          id
          frontmatter {
            title
            date(formatString: "DD MMMM, YYYY")
          }
        }
      }
    }
  }
`
```
If the index component becomes too large, you might want to refactor it into smaller components:

```jsx
// src/components/IndexPost.jsx
import React from "react"
import { graphql } from "gatsby"

export default function IndexPost({ frontmatter: { title, date } }) {
  return (
    <div>
      <h3>
        {title} <span>— {date}</span>
      </h3>
    </div>
  )
}

export const query = graphql`
  fragment IndexPostFragment on MarkdownRemark {
    frontmatter {
      title
      date(formatString: "MMMM DD, YYYY")
    }
  }
`
```
Now you can use the component together with the exported fragment in your index page:

```jsx
// src/pages/index.jsx
import React from "react"
import IndexPost from "../components/IndexPost"
import { graphql } from "gatsby"

export default function Home({ data }) {
  return (
    <div>
      <h1>Index page</h1>
      <h4>{data.allMarkdownRemark.totalCount} Posts</h4>
      {data.allMarkdownRemark.edges.map(({ node }) => (
        <div key={node.id}>
          <IndexPost frontmatter={node.frontmatter} />
        </div>
      ))}
    </div>
  )
}

export const query = graphql`
  query {
    allMarkdownRemark {
      totalCount
      edges {
        node {
          id
          ...IndexPostFragment
        }
      }
    }
  }
`
```
## Further reading

- [Why Gatsby Uses GraphQL](https://www.gatsbyjs.com/docs/why-gatsby-uses-graphql/)
- [The Anatomy of a GraphQL Query](https://www.apollographql.com/blog/the-anatomy-of-a-graphql-query-6dffa9e9e747)
### Getting started with GraphQL

- <https://graphql.org/learn/>
- <https://www.howtographql.com/>
- <https://reactjs.org/blog/2015/05/01/graphql-introduction.html>
- <https://services.github.com/on-demand/graphql/>
### Advanced readings on GraphQL

- [GraphQL specification](https://spec.graphql.org/October2016/)
- [Interfaces and Unions](https://medium.com/the-graphqlhub/graphql-tour-interfaces-and-unions-7dd5be35de0d)
- [Relay Compiler (which Gatsby uses to process queries)](https://facebook.github.io/relay/docs/en/compiler-architecture.html)
# chronver
> The [chronological](https://chronver.org "Official ChronVer website") versioner.
## Install
```sh
$ npm i chronver
```
## Requirements
This is an evergreen module 🌲 and requires an [Active LTS](https://github.com/nodejs/Release) Node version (v12.0.0+).
## Usage
### Node.js
```js
import chronver from "chronver";
new chronver({ increment: "change", version: "2030.04.03" }).version;
// ^ Returns 2030.04.03.1
new chronver({ increment: "year", version: "2030.04.03" }).version;
// ^ Returns 2031.04.03
new chronver({ increment: "month", version: "2030.04.03" }).version;
// ^ Returns 2030.05.03
new chronver({ increment: "day", version: "2030.04.03" }).version;
// ^ Returns 2030.04.04
new chronver({ coerce: "2030.4.3" }).version;
// ^ Returns 2030.04.03
new chronver().version;
// ^ Returns the current date in ChronVer format
```
```js
// Here is how a full response looks
ChronVer {
change: 0,
day: 3,
month: 4,
raw: "2030.04.03",
version: "2030.04.03",
year: 2030
}
```
### package.json:
```json
{
"scripts": {
"increment": "chronver --increment package"
}
}
```
This allows you to run `npm run increment` and have your `package.json` version incremented according to the ChronVer spec. However, if you want this to happen automatically when committing to a repo, employ [husky](https://github.com/typicode/husky) like so:
```json
{
"husky": {
"hooks": {
"pre-commit": "npm run increment && git add -A :/"
}
}
}
```
## API
### new chronver({ coerce?, increment?, parse?, version? })
ChronVer must be instantiated with the `new` keyword.
#### coerce
Type: `string` (optional)
- Given a string that represents a date, `coerce` will attempt to format it into a ChronVer object.
- If supplied value is blank (""), a ChronVer object representing today's date will be returned.
#### increment
Type: `string` (optional)
- Intended for use with the `version` parameter.
- Available options:
  - `change`: increments the supplied `version` change component by one.
  - `day`: increments the supplied `version` day by one.
  - `month`: increments the supplied `version` month by one.
  - `year`: increments the supplied `version` year by one.
- If the supplied value is blank ("") or no `version` parameter is supplied along with the `increment` option, a ChronVer object representing today's date will be returned.
- If the supplied `version` is in the past, the result defaults to the present; versions with a future date remain in the future.
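The increment semantics can be sketched in a few lines. This is an illustrative re-implementation, not the module's actual code, and it ignores the bring-past-versions-to-the-present rule for brevity:

```javascript
// Sketch of ChronVer increment: versions look like YYYY.0M.0D with an
// optional trailing change counter (e.g. "2030.04.03.1").
function incrementChronver(version, level) {
  let [year, month, day, change = 0] = version.split(".").map(Number);
  if (level === "year") year += 1;
  else if (level === "month") month += 1;
  else if (level === "day") day += 1;
  else change += 1; // default level is "change"
  // Re-normalize month/day overflow via the Date object.
  const d = new Date(Date.UTC(year, month - 1, day));
  const pad = (n) => String(n).padStart(2, "0");
  const base = `${d.getUTCFullYear()}.${pad(d.getUTCMonth() + 1)}.${pad(d.getUTCDate())}`;
  return change > 0 ? `${base}.${change}` : base;
}

incrementChronver("2030.04.03", "month"); // → "2030.05.03"
incrementChronver("2030.04.03", "change"); // → "2030.04.03.1"
```

Note how incrementing the day past the end of a month rolls over naturally, e.g. `incrementChronver("2030.12.31", "day")` yields `"2031.01.01"`.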
#### parse
Type: `string` | `CVType` (optional)
- Given a string that represents a date (or a ChronVer object), `parse` will test the validity of it and return a formatted ChronVer object.
- If supplied value is blank (""), a ChronVer object representing today's date will be returned.
#### version
- When used alone, behaves like `parse`.
### CLI
```shell
__
/ /
____/ / _______ _____ __________
/ __/ _ \/ __/ _ \/ _ | |/ / -_/ __/
\__/_//_/_/ \___/_//_|___/\__/_/
A JavaScript implementation of the https://chronver.org specification
Copyright © netop://ウエハ (Paul Anthony Webb)
Usage: chronver [options] <version>
Prints valid ChronVer versions
Options:
-c --coerce
Coerce a string into ChronVer if possible, silently fail otherwise.
-? -h --help
Show this help message.
-i --inc --increment [<level>]
Increment a version by the specified level. Level can be one of: year,
month, day, or change. Default level is "change".
Only one version may be specified.
The version returned will always default to the present. However,
supplied versions with a future date will remain in the future.
ex. Passing "1970.04.03 -i month" to ChronVer will return the present
date but passing "3027.04.03 -i month" will return "3027.05.03".
--init --initialize
Creates a ChronVer string, defaulting to the present.
ChronVer exits upon failure.
Examples:
$ chronver --initialize
$ chronver --increment month 2030.03.03
$ chronver --increment package
```
## Tests
You will need to first download this repo, `cd` into it, and `npm i` before proceeding further.
```sh
# Run all tests, sequentially
$ npm test
# Test dependencies for latest versions
$ npm run test:dependencies
# Lint "bin" and "lib" directories
$ npm run test:typescript
# Run this module through its paces
# PLEASE run this so I can feel my time writing and troubleshooting these tests was worth it
$ npm run test:assert
```
## License
MIT © [netop://ウエハ](https://webb.page "Homepage of netop://ウエハ")
| 23.238342 | 251 | 0.678707 | eng_Latn | 0.931427 |
1c766df1e652c25c3070405e979aa389b8382ed4 | 5,984 | md | Markdown | docs/infrastructure/symfony-console.md | PanzerLlama/msgphp | 51493573cdbb45059e20e0bd0b3e600a24bd90b2 | [
"MIT"
] | null | null | null | docs/infrastructure/symfony-console.md | PanzerLlama/msgphp | 51493573cdbb45059e20e0bd0b3e600a24bd90b2 | [
"MIT"
] | null | null | null | docs/infrastructure/symfony-console.md | PanzerLlama/msgphp | 51493573cdbb45059e20e0bd0b3e600a24bd90b2 | [
"MIT"
] | null | null | null | # Symfony Console
An overview of available infrastructural code when using [Symfony Console][console-project].
- Requires [symfony/console]
## Commands
Various standard [console commands] are available and can be used depending on the implemented domain infrastructure. They
are defined in the `MsgPhp\Domain\Infrastructure\Console\Command\` namespace.
### `InitializeProjectionTypesCommand`
Initializes the [projection type registry](../projection/type-registry.md).
```bash
bin/console projection:initialize-types [--force]
```
### `SynchronizeProjectionsCommand`
Synchronizes domain objects and their [projections](../projection/models.md) using the [projection synchronization](../projection/synchronization.md)
utility service.
```bash
bin/console projection:synchronize
```
## Context Factory
A context factory is bound to `MsgPhp\Domain\Infrastructure\Console\Context\ContextFactory`. Its purpose is to leverage
a CLI command to interactively build an arbitrary array value (the context).
### API
#### `configure(InputDefinition $definition): void`
Configures a command input definition. See also [`InputDefinition`][api-inputdefinition]. Should be called before using
`getContext()`.
---
#### `getContext(InputInterface $input, StyleInterface $io, array $values = []): array`
Resolves the actual context from the console IO. See also [`InputInterface`][api-inputinterface] and [`StyleInterface`][api-styleinterface].
Any element value provided by `$values` takes precedence and should be used as-is.
### Implementations
#### `MsgPhp\Domain\Infrastructure\Console\Context\ClassContextFactory`
Factorizes a context based on any class method signature. It configures the CLI signature by mapping required class
method arguments to command arguments, whereas optional ones are mapped to command options.
```bash
bin/console command --optional-argument [--] required-argument
```
In both cases a value is optional; if the actual class method argument is required and no value is given, it will be
asked for interactively. If interaction is not possible, an exception will be thrown instead.
- `__construct(string $class, string $method, array $classMapping = [], int $flags = 0, ClassContextElementFactory $elementFactory = null)`
- `$class / $method`: The class method to resolve
- `$classMapping`: Global class mapping. Usually used to map abstracts to concretes.
- `$flags`: A bit mask value to toggle various flags
- `ClassContextBuilder::ALWAYS_OPTIONAL`: Always map class method arguments to command options
- `ClassContextBuilder::NO_DEFAULTS`: Leave out default values when calling `getContext()`
- `ClassContextBuilder::REUSE_DEFINITION`: Reuse the original input definition for matching class method
arguments
- `$elementFactory`: A custom element factory to use. See also [Customizing context elements](#customizing-context-elements).
##### Customizing Context Elements
Per-element configuration can be provided by implementing a `MsgPhp\Domain\Infrastructure\Console\Context\ClassContextElementFactory`.
- `getElement(string $class, string $method, string $argument): ContextElement`
- Get a custom [`ContextElement`][api-contextelement] to apply to a specific class/method/argument pair
A default implementation is provided by `MsgPhp\Domain\Infrastructure\Console\Context\ClassContextElementFactory` which simply
transforms argument names to human readable values so that `$argumentName` becomes `Argument Name`.
##### Basic Example
```php
<?php
use MsgPhp\Domain\Infrastructure\Console\Context\ClassContextFactory;
use Symfony\Component\Console\Command\Command;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Symfony\Component\Console\Style\SymfonyStyle;
// --- SETUP ---
class MyObject
{
public function __construct(string $argument, $option = null)
{
}
}
class MyCommand extends Command
{
private $contextFactory;
public function __construct()
{
$this->contextFactory = new ClassContextFactory(MyObject::class, '__construct');
parent::__construct();
}
protected function configure(): void
{
$this->setName('my-command');
$this->contextFactory->configure($this->getDefinition());
}
    protected function execute(InputInterface $input, OutputInterface $output): int
{
$io = new SymfonyStyle($input, $output);
$context = $this->contextFactory->getContext($input, $io);
$object = new MyObject(...array_values($context));
// do something
return 0;
}
}
// --- USAGE ---
// $ bin/console my-command [--option=OPTION] [--] [<argument>]
```
#### `MsgPhp\Domain\Infrastructure\Console\Context\DoctrineEntityContextFactory`
A [Doctrine](doctrine-orm.md) entity-aware context factory. It decorates any context factory. Its purpose is to
provide a discriminator value in the resulting context when working with [ORM inheritance].
- `__construct(ContextFactory $factory, EntityManagerInterface $em, string $class)`
- `$factory`: The decorated context factory
- `$em`: The entity manager to use
- `$class`: The entity class to use
[console-project]: https://symfony.com/doc/current/components/console.html
[symfony/console]: https://packagist.org/packages/symfony/console
[console commands]: https://symfony.com/doc/current/console.html
[api-inputdefinition]: https://api.symfony.com/master/Symfony/Component/Console/Input/InputDefinition.html
[api-inputinterface]: https://api.symfony.com/master/Symfony/Component/Console/Input/InputInterface.html
[api-styleinterface]: https://api.symfony.com/master/Symfony/Component/Console/Style/StyleInterface.html
[api-contextelement]: https://msgphp.github.io/api/MsgPhp/Domain/Infra/Console/Context/ContextElement.html
[ORM inheritance]: http://docs.doctrine-project.org/projects/doctrine-orm/en/latest/reference/inheritance-mapping.html
| 38.857143 | 149 | 0.756852 | eng_Latn | 0.586834 |
1c774d0897b9f7e83a37dc7af5da6b0a62b84675 | 20 | md | Markdown | CHANGELOG.md | kevelopment/liquidctl-gui | 90ccc0ba8d159c957f2691e33329ee73a461d5cb | [
"MIT"
] | null | null | null | CHANGELOG.md | kevelopment/liquidctl-gui | 90ccc0ba8d159c957f2691e33329ee73a461d5cb | [
"MIT"
] | 2 | 2022-03-02T10:43:29.000Z | 2022-03-25T19:17:08.000Z | CHANGELOG.md | vudor/liquidctl-gui | 8cf7a3199dec3056252e44796a759eb7ae7009ba | [
"MIT"
] | null | null | null | ## 0.1.0 25.07.2020
| 10 | 19 | 0.55 | eng_Latn | 0.145547 |
1c77516bc7435f96b7b7a0d5762b6f4fd020d80e | 980 | md | Markdown | docs/build.md | dwango/UniVRM | e91ab9fc519aa387dc9b39044aa2189ff0382f15 | [
"MIT"
] | 318 | 2018-04-16T08:41:10.000Z | 2019-04-23T02:38:54.000Z | docs/build.md | dwango/UniVRM | e91ab9fc519aa387dc9b39044aa2189ff0382f15 | [
"MIT"
] | 171 | 2018-04-17T14:07:30.000Z | 2019-04-20T02:29:08.000Z | docs/build.md | dwango/UniVRM | e91ab9fc519aa387dc9b39044aa2189ff0382f15 | [
"MIT"
] | 65 | 2018-04-16T15:45:55.000Z | 2019-04-12T01:11:17.000Z | # アプリケーションのビルド
UniVRMを使うアプリケーションのビルドに関する注意事項
## ビルドに含めるシェーダー
`Project Settings = Graphics - Always Included Shaders` などに設定して、以下のシェーダーがビルドに含まれるようにしてください。
### Standard
* `Standard`
GLTF の PBR マテリアルが使用します。
```{admonition} Always Included Shaders
:class: warning
明示的に指定する必要があります
```
### Unlit
* `Assets\VRMShaders\GLTF\UniUnlit\Resources\UniGLTF\UniUnlit.shader`
```{admonition} Always Included Shaders
`Resources` に配置しているので明示的に指定しなくてもビルドに含まれます。
```
### MToon
* `Assets\VRMShaders\VRM\MToon\MToon\Resources\Shaders\MToon.shader`
```{admonition} Always Included Shaders
`Resources` に配置しているので明示的に指定しなくてもビルドに含まれます。
```
### ランタイム import/export 時のテクスチャー変換用のシェーダー
* `Assets\VRMShaders\GLTF\IO\Resources\UniGLTF\NormalMapExporter.shader`
* `Assets\VRMShaders\GLTF\IO\Resources\UniGLTF\StandardMapExporter.shader`
* `Assets\VRMShaders\GLTF\IO\Resources\UniGLTF\StandardMapImporter.shader`
```{admonition} Always Included Shaders
`Resources` に配置しているので明示的に指定しなくてもビルドに含まれます。
```
| 21.777778 | 91 | 0.790816 | yue_Hant | 0.912593 |
1c7777ed09e52304d3ad9a903302445d77f2c612 | 245 | md | Markdown | packages/docs/src/pages/input/md/input6.md | jeven2016/react-windy-ui | 93623d11fec4b282ecaeb3e09c60bf81ecf56c3f | [
"MIT"
] | 1 | 2022-01-22T14:31:30.000Z | 2022-01-22T14:31:30.000Z | packages/docs/src/pages/input/md/input6.md | jeven2016/react-windy-ui | 93623d11fec4b282ecaeb3e09c60bf81ecf56c3f | [
"MIT"
] | 2 | 2021-06-12T05:52:22.000Z | 2022-02-26T07:03:01.000Z | packages/docs/src/pages/input/md/input6.md | jeven2016/react-windy-ui | 93623d11fec4b282ecaeb3e09c60bf81ecf56c3f | [
"MIT"
] | null | null | null | ---
order: 6
type: sample
zh_CN: 禁用输入框
en_US: Disabled Input
editUrl: $BASE/pages/input/md/input6.md
---
+++ zh_CN
在Input上设置disabled属性为true后,可以将控件禁用。但对于Input Group组件,您需要自行设置各个子控件的disabled属性。
+++ en_US
Input6
+++ SampleCode
fileName: Input6
| 14.411765 | 75 | 0.755102 | yue_Hant | 0.225073 |
1c777ae4c851c0fc5df2f5c176d195027592bb50 | 142 | md | Markdown | test-content/leitner/functions/arguments.md | colevandersWands/study-lenses | 47398d6719dd2cd732fff59ba2f2ff13fcce420c | [
"MIT"
] | 2 | 2021-07-19T09:50:41.000Z | 2022-02-25T15:54:01.000Z | test-content/leitner/functions/arguments.md | colevandersWands/study-lenses | 47398d6719dd2cd732fff59ba2f2ff13fcce420c | [
"MIT"
] | 2 | 2020-10-19T14:51:35.000Z | 2021-04-22T11:09:37.000Z | test-content/leitner/functions/arguments.md | colevandersWands/study-lenses | 47398d6719dd2cd732fff59ba2f2ff13fcce420c | [
"MIT"
] | 2 | 2020-10-20T14:00:19.000Z | 2020-10-24T09:58:09.000Z | # Arguments
```js
const thingy = (param) => {
console.log(param);
};
thingy(4); // <-- argument is 4
thingy(6); // <-- argument is 6
```
| 11.833333 | 31 | 0.549296 | eng_Latn | 0.40519 |
1c778c29210f19db23d2538d97c2b1d06ca3fb6f | 8,574 | md | Markdown | WindowsServerDocs/identity/ad-fs/deployment/upgrading-to-ad-fs-in-windows-server-sql.md | jeoviedo/windowsserverdocs | d84dc3d037911ad698f5e3e84348b867c5f46ed8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/identity/ad-fs/deployment/upgrading-to-ad-fs-in-windows-server-sql.md | jeoviedo/windowsserverdocs | d84dc3d037911ad698f5e3e84348b867c5f46ed8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | WindowsServerDocs/identity/ad-fs/deployment/upgrading-to-ad-fs-in-windows-server-sql.md | jeoviedo/windowsserverdocs | d84dc3d037911ad698f5e3e84348b867c5f46ed8 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Upgrading to AD FS in Windows Server 2016 with SQL Server
description:
author: billmath
manager: mtillman
ms.date: 04/11/2018
ms.topic: article
ms.prod: windows-server-threshold
ms.assetid: 70f279bf-aea1-4f4f-9ab3-e9157233e267
ms.technology: identity-adfs
ms.author: billmath
---
# Upgrading to AD FS in Windows Server 2016 with SQL Server
## Moving from a Windows Server 2012 R2 AD FS farm to a Windows Server 2016 AD FS farm
The following document will describe how to upgrade your AD FS Windows Server 2012 R2 farm to AD FS in Windows Server 2016 when you are using a SQL Server for the AD FS database.
### Upgrading AD FS to Windows Server 2016 FBL
New in AD FS for Windows Server 2016 is the farm behavior level (FBL) feature. This feature is farm-wide and determines the features that the AD FS farm can use. By default, the FBL in a Windows Server 2012 R2 AD FS farm is at the Windows Server 2012 R2 level.
A Windows Server 2016 AD FS server can be added to a Windows Server 2012 R2 farm, and it will operate at the same FBL as a Windows Server 2012 R2 server. When you have a Windows Server 2016 AD FS server operating in this fashion, your farm is said to be "mixed". However, you will not be able to take advantage of the new Windows Server 2016 features until the FBL is raised to Windows Server 2016. With a mixed farm:
- Administrators can add new Windows Server 2016 federation servers to an existing Windows Server 2012 R2 farm. As a result, the farm is in "mixed mode" and operates at the Windows Server 2012 R2 farm behavior level. To ensure consistent behavior across the farm, new Windows Server 2016 features cannot be configured or used in this mode.
- Once all Windows Server 2012 R2 federation servers have been removed from the mixed-mode farm, and, in the case of a WID farm, one of the new Windows Server 2016 federation servers has been promoted to the role of primary node, the administrator can then raise the FBL from Windows Server 2012 R2 to Windows Server 2016. As a result, any new AD FS Windows Server 2016 features can then be configured and used.
- As a result of the mixed farm feature, AD FS Windows Server 2012 R2 organizations looking to upgrade to Windows Server 2016 will not have to deploy an entirely new farm, export and import configuration data. Instead, they can add Windows Server 2016 nodes to an existing farm while it is online and only incur the relatively brief downtime involved in the FBL raise.
Be aware that while in mixed farm mode, the AD FS farm is not capable of any new features or functionality introduced in AD FS in Windows Server 2016. This means organizations that want to try out new features cannot do so until the FBL is raised. So if your organization is looking to test the new features prior to raising the FBL, you will need to deploy a separate farm to do this.
The remainder of this document provides the steps for adding a Windows Server 2016 federation server to a Windows Server 2012 R2 environment and then raising the FBL to Windows Server 2016. These steps were performed in a test environment outlined in the architectural diagram below.
> [!NOTE]
> Before you can move to AD FS in Windows Server 2016 FBL, you must remove all of the Windows 2012 R2 nodes. You cannot just upgrade a Windows Server 2012 R2 OS to Windows Server 2016 and have it become a 2016 node. You will need to remove it and replace it with a new 2016 node.
The following architectural diagram shows the setup that was used to validate and record the steps below.

#### Join the Windows 2016 AD FS Server to the AD FS farm
1. Using Server Manager, install the Active Directory Federation Services role on the Windows Server 2016 server.
2. Using the AD FS Configuration wizard, join the new Windows Server 2016 server to the existing AD FS farm. On the **Welcome** screen click **Next**.

3. On the **Connect to Active Directory Domain Services** screen, **specify an administrator account** with permissions to perform the federation services configuration and click **Next**.
4. On the **Specify Farm** screen, enter the name of the SQL server and instance and then click **Next**.

5. On the **Specify SSL Certificate** screen, specify the certificate and click **Next**.

6. On the **Specify Service Account** screen, specify the service account and click **Next**.
7. On the **Review Options** screen, review the options and click **Next**.
8. On the **Pre-requisites Checks** screen, ensure that all of the pre-requisite checks have passed and click **Configure**.
9. On the **Results** screen, ensure that server was successfully configured and click **Close**.
#### Remove the Windows Server 2012 R2 AD FS server
>[!NOTE]
>You do not need to set the primary AD FS server using Set-AdfsSyncProperties -Role when using SQL as the database. This is because all of the nodes are considered primary in this configuration.
1. On the Windows Server 2012 R2 AD FS server in Server Manager use **Remove Roles and Features** under **Manage**.

2. On the **Before you Begin** screen, click **Next**.
3. On the **Server Selection** Screen, click **Next**.
4. On the **Server Roles** screen, remove the check next to **Active Directory Federation Services** and click **Next**.

5. On the **Features** Screen, click **Next**.
6. On the **Confirmation** Screen, click **Remove**.
7. Once this completes, restart the server.
#### Raise the Farm Behavior Level (FBL)
Prior to this step, you need to ensure that forestprep and domainprep have been run in your Active Directory environment and that Active Directory has the Windows Server 2016 schema. This document started with a Windows Server 2016 domain controller and did not require running these, because they were run when AD was installed.
>[!NOTE]
>Before starting the process below, ensure Windows Server 2016 is current by running Windows Update from Settings. Continue this process until no further updates are needed.
1. On the Windows Server 2016 server, open PowerShell and run the following: **$cred = Get-Credential**, then press Enter.
2. Enter credentials that have admin privileges on the SQL Server.
3. Now in PowerShell, enter the following: **Invoke-AdfsFarmBehaviorLevelRaise -Credential $cred**
4. When prompted, type **Y**. This will begin raising the level. Once this completes you have successfully raised the FBL.

5. Now, if you go to AD FS Management, you will see the new nodes that have been added for AD FS in Windows Server 2016.
6. Likewise, you can use the PowerShell cmdlet `Get-AdfsFarmInformation` to show the current FBL.

#### Upgrade the Configuration Version of existing WAP servers
1. On each Web Application Proxy, re-configure the WAP by executing the following PowerShell command in an elevated window:
```powershell
$trustcred = Get-Credential -Message "Enter Domain Administrator credentials"
Install-WebApplicationProxy -CertificateThumbprint {SSLCert} -fsname fsname -FederationServiceTrustCredential $trustcred
```
2. Remove old servers from the cluster and keep only the WAP servers running the latest server version, which were reconfigured above, by running the following PowerShell cmdlet.
```powershell
Set-WebApplicationProxyConfiguration -ConnectedServersName WAPServerName1, WAPServerName2
```
3. Check the WAP configuration by running the `Get-WebApplicationProxyConfiguration` cmdlet. The ConnectedServersName value will reflect the servers from the prior command.
```powershell
Get-WebApplicationProxyConfiguration
```
4. To upgrade the ConfigurationVersion of the WAP servers, run the following PowerShell command.
```powershell
Set-WebApplicationProxyConfiguration -UpgradeConfigurationVersion
```
5. Verify the ConfigurationVersion has been upgraded with the Get-WebApplicationProxyConfiguration Powershell command.
| 77.243243 | 414 | 0.772102 | eng_Latn | 0.987675 |
1c77b1c7bbd7a915929be7faef79625d17153677 | 479 | md | Markdown | applications/magic-mirror/README.md | tillhoff/iac | d2bf0814f6d160415b446866baaee6e7f006c4f4 | [
"MIT"
] | null | null | null | applications/magic-mirror/README.md | tillhoff/iac | d2bf0814f6d160415b446866baaee6e7f006c4f4 | [
"MIT"
] | null | null | null | applications/magic-mirror/README.md | tillhoff/iac | d2bf0814f6d160415b446866baaee6e7f006c4f4 | [
"MIT"
] | null | null | null | # magic-mirror
This application is running the magic-mirror software in server-only mode. This means it starts a docker-container, which hosts the magic-mirror software on a web-server.
## Access
To access it, a browser is required.
For automatic start it is required to:
- log a user in automatically at boot
- autostart the browser in full-screen mode (f.e. kiosk mode)
- point the browser to the container-URL
- hide the mouse cursor (f.e. with the `unclutter` software)
| 31.933333 | 170 | 0.764092 | eng_Latn | 0.998447 |
1c77ec05e5a8c87c866579652b49750cb5ef71fb | 5,402 | md | Markdown | docs/2014/reporting-services/report-design/chart-effects-add-3d-effects-report-builder.md | ZubriQ/sql-docs.ru-ru | 50559946dabe5fce9eef251a637dc2e3fd305908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/2014/reporting-services/report-design/chart-effects-add-3d-effects-report-builder.md | ZubriQ/sql-docs.ru-ru | 50559946dabe5fce9eef251a637dc2e3fd305908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/2014/reporting-services/report-design/chart-effects-add-3d-effects-report-builder.md | ZubriQ/sql-docs.ru-ru | 50559946dabe5fce9eef251a637dc2e3fd305908 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Добавление в диаграмму объемных эффектов (построитель отчетов и службы SSRS) | Документы Майкрософт
ms.custom: ''
ms.date: 06/13/2017
ms.prod: sql-server-2014
ms.reviewer: ''
ms.technology: reporting-services-native
ms.topic: conceptual
ms.assetid: ab9625d8-6557-4a4d-8123-eefa7c066ff5
author: maggiesMSFT
ms.author: maggies
manager: kfile
ms.openlocfilehash: 07934e74138f17317eb0258e2b3ad0fa6d26254a
ms.sourcegitcommit: 6fd8c1914de4c7ac24900fe388ecc7883c740077
ms.translationtype: MT
ms.contentlocale: ru-RU
ms.lasthandoff: 04/26/2020
ms.locfileid: "66106315"
---
# <a name="add-3d-effects-to-a-chart-report-builder-and-ssrs"></a>Add 3D Effects to a Chart (Report Builder and SSRS)
  You can use three-dimensional (3D) effects to add depth to a chart and make it more visually appealing. For example, to emphasize an individual slice of an exploded pie chart, you can rotate the chart and change the perspective so that users notice that segment first. After 3D effects are applied to a chart, all gradient colors and hatching styles are disabled.

> [!NOTE]
>  [!INCLUDE[ssRBRDDup](../../includes/ssrbrddup-md.md)]

### <a name="to-apply-3d-effects-to-a-range-area-line-scatter-or-polar-chart"></a>To apply 3D effects to a range, area, line, scatter, or polar chart

1.  Right-click the chart and select **3D Effects**. The **Chart Area Properties** dialog box opens.

2.  On the **3D Options** tab, select the **Enable 3D** option.

3.  (Optional) On the **3D Options** tab, you can set various properties related to 3D angles and 3D scene options. For more information about these properties, see [3D, Bevel, and Other Effects in a Chart (Report Builder and SSRS)](chart-effects-3d-bevel-and-other-report-builder.md).

4.  Click **OK**.

### <a name="to-apply-3d-effects-to-a-funnel-chart"></a>To apply 3D effects to a funnel chart

1.  Right-click the chart and select **3D Effects**. The **Chart Area Properties** dialog box opens.

2.  On the **3D Options** tab, select the **Enable 3D** option. Click **OK**.

3.  (Optional) To adjust the appearance of the funnel chart, you can open the Properties pane and change the funnel chart properties:

    1.  Open the Properties pane.

    2.  In the design surface, click the funnel chart. The Properties pane displays the funnel chart series properties.

    3.  In the **General** section, expand the **CustomAttributes** node.

    4.  For the **Funnel3DDrawingStyle** property, choose whether the funnel is displayed as a circle or a square.

    5.  For the **Funnel3DRotationAngle** property, set a value between -10 and 10. It defines the angle of tilt, in degrees, displayed on the 3D funnel chart.

### <a name="to-apply-3d-effects-to-a-pie-chart"></a>To apply 3D effects to a pie chart

1.  Right-click the chart and select **3D Effects**. The **Chart Area Properties** dialog box opens.

2.  On the **3D Options** tab, select the **Enable 3D** option. Click **OK**.

3.  (Optional) In the **Rotation** box, type an integer that represents the horizontal rotation of the pie chart.

4.  (Optional) In the **Inclination** box, type an integer that represents the vertical tilt of the pie chart.

5.  Click **OK**.

### <a name="to-apply-3d-effects-to-a-bar-or-column-chart"></a>To apply 3D effects to a bar or column chart

1.  Right-click the chart and select **3D Effects**. The **Chart Area Properties** dialog box opens.

2.  Select the **Enable 3D** option. Click **OK**.

3.  (Optional) Select the **Enable series clustering** option. If the chart contains multiple bar or column chart series, this option displays the series clustered. By default, multiple bar or column chart series are displayed side by side.

4.  Click **OK**.

5.  (Optional) To give the bar or column chart a cylinder drawing effect, follow these steps:

    1.  Open the Properties pane.

    2.  In the design surface, click any column in the series. The Properties pane displays the series properties.

    3.  In the **General** section, expand the **CustomAttributes** node.

    4.  Set the **DrawingStyle** property to **Cylinder**.

## <a name="see-also"></a>See Also
 [3D, Bevel, and Other Effects in a Chart (Report Builder and SSRS)](chart-effects-3d-bevel-and-other-report-builder.md)
 [Formatting a Chart (Report Builder and SSRS)](formatting-a-chart-report-builder-and-ssrs.md)
 [Charts (Report Builder and SSRS)](charts-report-builder-and-ssrs.md)
1c780506419bbddd28e2ff1f409b3f3520e1c338 | 6,514 | md | Markdown | _posts/2019-05-17-Download-service-manual-genie.md | Jobby-Kjhy/27 | ea48bae2a083b6de2c3f665443f18b1c8f241440 | [
"MIT"
] | null | null | null | _posts/2019-05-17-Download-service-manual-genie.md | Jobby-Kjhy/27 | ea48bae2a083b6de2c3f665443f18b1c8f241440 | [
"MIT"
] | null | null | null | _posts/2019-05-17-Download-service-manual-genie.md | Jobby-Kjhy/27 | ea48bae2a083b6de2c3f665443f18b1c8f241440 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Service manual genie book
At that time tobacco was smoked in long pipes, in it, hunter, wholly unprofessional. The passage was tedious in consequence of Ninety. ] GOODS AND SERVICES on the Mayflower II were not provided free, on Sunday night. Unthinkable. These she escorted him and Wally into the Lampion dining room, service manual genie southerly lands to vegetable palaeontology. And it service manual genie be strongly protected! Most Archipelagans learn a few service manual genie to several thousand of these characters as a major part of their few years of schooling. The girl -- I saw her 30th19th June. Service manual genie religion was a few months at the beginning, tick, at This was a challenge and an act of intimidation, he had been confident that when at last he killed her children and yourself and that you hit the books because you wanted to be something more than a pilot and the gob of mucus in his throat, but you know the kind of work it is. " to these men, and what everybody knows is true turns out to be what some people used to think, when he was harmed by hundreds of small tyrants, the fox, and reconsidered them readily if confronted by new information, e'en as I do. it was true, Barty levered himself onto the seat beside her? Then Mesrour carried her to the other end of the sitting-chamber and bound her eyes and making her sit, any of which could become a major blowout, only, passages of sacred history in the Kargad "Why," said Jack, you startled me!" she said. Zorphwar. Far back in the park flew columns of fire, and the new source of "None, which he had never exception of some small snow-fields concealed in the valleys. high, saying: He'd been invited to a Christmas Eve celebration service manual genie a satanic theme. He'd been so service manual genie by the erotic perfection of Seraphim's missing brother. 
service manual genie the receiver in one hand and pulled at her hair with the other, ii, and under the patriot's chin were stamped the words In God We Trust, a cancer on humanity, all successfully cured, accompanied by lashes from a long whip. 205; They lighted down without the place and when they arose in the morning, at ebb by the dry beach, the ceiling transitions from chamber to service manual genie were difficult to detect, i. " growing confidence, anyhow. If I serviced her, the latter at 40 copecks each, and various other fluids? "I don't For a long time nobody would touch him. "Don't be such a goof. She reached out the poker to gather together her namesakes in the hearth, surely he would be rubinum de mundo". Luzula arctica BL. ' When the king's son heard this, but service manual genie as a boy. North of the highway, Junior parked two blocks past the target Peace wasn't easy to come by in the Maddoc household, "can you come back this far, ii, but I service manual genie appealingly creased as that of the best of grandfathers? molluscs and whales. is generally eaten raw, high-backed chair facing him, came and went without these foggy streets, with great fortitude and determination, ho," she said, eeling across other local official that I had erected a landmark on Most people of the Archipelago have brown or red-brown skin. She leaned to one side to let her mother see the hand she was holding. average time required to crack any simple code devised by anyone lacking through a golden haze that came from the sun in her heart. ' When Bihzad heard this saying, noisy stream he had heard singing through his Leilani knew that Preston had moved the service manual genie close to the bed when she heard him sit on it, glowing on the screen. She glanced at him, just for the hell of it?" 
Printed in the U, Sparky Vox, and besides had e, just sat staring at her hands clenched in her lap, he recited the following verses: Vanadium raised his eyebrows, is unsuitable for photo-lithographing, and a Hunnicolt. They had been married fourteen months, i. Snow-shoes, so He held her tightly. "You were over there?" not have fallen asleep so easily. " The old man was burying the core of his apple and modest in their statements about high northern latitudes reached. romantic memories to draw upon in my old age. But computer ticket-totes don't lie. service manual genie only want you to He reminded himself that pigs were used to hunt for truffles. As the afternoon waned toward a portentous dusk and toward the gallery Gorging on fudge cake and coffee to guard against a spontaneous lapse into meditative catatonia, and beyond remorse, "O my boon-companion. Because Maddoc was rain-soaked, introducing Marie to the mysteries of protein service manual genie courtesy of Jeeves-and grinned to himself; she was becoming even more impatient than he was. The ice was nothing to go on but the stories other people tell us. You could also, and in the offing two large Tintinyaranga, i, and the locations of observation and fire command posts from source analysis triangulations of stray reflections from control lasers. There are from five to ten such harpoon Maybe because grief is weighing on his mind, Tom had found no other like himself and now a "There are some things which we must accept" the preacher thundered. "He's pretty good, and watch him, under a you. Irian looked from one service manual genie the service manual genie Scorning the belief in the sanctity of all human life that has guided was not the vessel for a miracle birth, even his bedsheet, of previous exploratory sell Jesus door-to-door. shouting, passed neighbourhood of the equator itself. Asexual reproduction can take place among them as well. 
narrative of the wintering of the Fin, the courtyard of the fountain, certain wells for curing rheumatism. It is bounded on other with fig-trees only. It was approaching 0200, and watching though by less effective means. meat. service manual genie was attractive in my day, never had a chance to struggle, the flight, I was like to go mad for vexation and fell to beseeching my Lord and humbling myself in supplication to Him that He would deliver me from her. The process of intimidation by which young people are made to feel humanly worthless if they freedom, and Identical twins are very like each other and often display mirror-image characteristics. "Bregg here. There were none. Alarmed, and I believe it. "She had to track Alec Baldwin to New Orleans and blow him away Koba-Yoschi, opening his throat and one or both of his carotid arteries. | 723.777778 | 6,420 | 0.792908 | eng_Latn | 0.999957 |
1c79ac7ae7071f86e14eddb05bc5a3734ee8ced4 | 1,649 | md | Markdown | README.md | alexmartinez1745/Pandas-Challenge | 6da043e006c463a77f2f598c2690d2b467e90367 | [
"MIT"
] | null | null | null | README.md | alexmartinez1745/Pandas-Challenge | 6da043e006c463a77f2f598c2690d2b467e90367 | [
"MIT"
] | null | null | null | README.md | alexmartinez1745/Pandas-Challenge | 6da043e006c463a77f2f598c2690d2b467e90367 | [
"MIT"
] | null | null | null | ## Background
Tools used: Python (Pandas), Jupyter Notebook
Tasked with analyzing the data for an independent gaming company and their most recent fantasy game Heroes of Pymoli. The game is free-to-play, but players are encouraged to purchase optional items that enhance their playing experience.
The task is to generate a report that breaks down the game's purchasing data into meaningful insights.
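As a sketch of what that report boils down to in pandas: the column names below (`SN`, `Item ID`, `Price`) and the toy rows are assumptions for illustration, not the actual Heroes of Pymoli dataset.

```python
import pandas as pd

# Toy purchase log shaped like the Heroes of Pymoli data (schema assumed).
df = pd.DataFrame({
    "SN":      ["Lisim78", "Lisim78", "Ithergue48", "Chamassasya86"],
    "Age":     [20, 20, 24, 24],
    "Gender":  ["Male", "Male", "Female", "Male"],
    "Item ID": [108, 143, 92, 100],
    "Price":   [3.53, 1.56, 4.88, 3.27],
})

# Player count and purchasing analysis (total).
total_players = df["SN"].nunique()
unique_items  = df["Item ID"].nunique()
avg_price     = df["Price"].mean()
total_revenue = df["Price"].sum()

# Top spenders: aggregate per screen name, rank by total purchase value.
spenders = (df.groupby("SN")["Price"]
              .agg(purchase_count="count", total_purchase_value="sum")
              .sort_values("total_purchase_value", ascending=False))

print(total_players, unique_items, round(total_revenue, 2))  # 3 4 13.24
```

The age-demographics breakdown follows the same pattern, with `pd.cut` bucketing `Age` into 4-year bins before grouping.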
## Final Report
The final report includes each of the following:
### Player Count
* Total Number of Players
### Purchasing Analysis (Total)
* Number of Unique Items
* Average Purchase Price
* Total Number of Purchases
* Total Revenue
### Gender Demographics
* Percentage and Count of Male Players
* Percentage and Count of Female Players
* Percentage and Count of Other / Non-Disclosed
### Purchasing Analysis (Gender)
* Purchase Count
* Average Purchase Price
* Total Purchase Value
* Average Purchase Total per Person by Gender
### Age Demographics
* Each of the below, broken into bins of 4 years:
* Purchase Count
* Average Purchase Price
* Total Purchase Value
* Average Purchase Total per Person by Age Group
### Top Spenders
* Identified the top 5 spenders in the game by total purchase value:
* SN
* Purchase Count
* Average Purchase Price
* Total Purchase Value
### Most Popular Items
* Identified the 5 most popular items by purchase count:
* Item ID
* Item Name
* Purchase Count
* Item Price
* Total Purchase Value
### Most Profitable Items
* Identified the 5 most profitable items by total purchase value:
* Item ID
* Item Name
* Purchase Count
* Item Price
* Total Purchase Value
| 23.898551 | 237 | 0.748939 | eng_Latn | 0.988259 |
1c7a0c3bb730b2fc36d8f74e199d026f21ce9479 | 1,979 | md | Markdown | content/week02.md | digitalideation/digcre_h2101 | 57a4565ee78550d0448be4719c1f24dc077088bc | [
"MIT"
] | null | null | null | content/week02.md | digitalideation/digcre_h2101 | 57a4565ee78550d0448be4719c1f24dc077088bc | [
"MIT"
] | null | null | null | content/week02.md | digitalideation/digcre_h2101 | 57a4565ee78550d0448be4719c1f24dc077088bc | [
"MIT"
] | null | null | null | ---
layout: inner
title: "Intro"
---
## Intro
This week we look back at what we learned so far and get a first taste of Keras / Pytorch.
## Schedule
|Time |Desc |
|--- |--- |
|20 mins | Looking back at NN |
|40 mins | Keras - Intro + first steps |
|10 mins | Break |
|40 mins | Pytorch - Intro + first steps |
|5 mins | [Exit ticket](#exit-ticket) |
## Content
* [:tv: Looking back at the inner working of a NN](https://digitalideation.github.io/digcre_h2101/slides/intro_part02.html)
* Intro to tools: [Keras](https://keras.io/)
* [First steps with Keras](https://github.com/digitalideation/digcre_h2101/tree/master/samples/week02)
* First Look
* Classifying movie reviews
* Shape classifier
* Intro to tools: [Pytorch](https://pytorch.org/)
* [First steps with Pytorch](https://github.com/digitalideation/digcre_h2101/tree/master/samples/week02)
* Using pretrained models
* All as tensors
* Turn real-world data into tensors
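A hedged preview of the "turn real-world data into tensors" step, using NumPy as a neutral stand-in (both Keras and PyTorch accept NumPy arrays, e.g. via `model.fit(x, y)` or `torch.from_numpy(x)`); the records and field names are invented for illustration:

```python
import numpy as np

# Toy "real-world" records of the kind the course turns into tensors.
samples = [
    {"height_cm": 170.0, "weight_kg": 65.0, "label": "cat"},
    {"height_cm": 160.0, "weight_kg": 55.0, "label": "dog"},
    {"height_cm": 180.0, "weight_kg": 80.0, "label": "cat"},
]

# Stack the numeric fields into a (samples, features) float32 tensor.
x = np.array([[s["height_cm"], s["weight_kg"]] for s in samples],
             dtype="float32")

# Map string labels to integer indices, then one-hot encode them.
classes = sorted({s["label"] for s in samples})         # ['cat', 'dog']
indices = [classes.index(s["label"]) for s in samples]  # [0, 1, 0]
y = np.eye(len(classes), dtype="float32")[indices]      # (samples, classes)

print(x.shape, y.shape)  # (3, 2) (3, 2)
```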
## Preparation work
No preparation this week
The goal this week is to:
1) Start using Pytorch / Keras
2) Start learning how to deploy models
## Going further
* [DL with Pytorch](https://www.manning.com/books/deep-learning-with-pytorch)
* [DL with Python](https://www.manning.com/books/deep-learning-with-python)
* Username + pwd sent by email
## Exit ticket
Use [this link](https://docs.google.com/forms/d/e/1FAIpQLSd4HSpRoMsCCryiGjLxgD86joajca79vfhjH2bShMDjMe-0aQ/viewform?usp=sf_link) if the form does not show up below :arrow_down:
{% raw %}
<iframe src="https://docs.google.com/forms/d/e/1FAIpQLSd4HSpRoMsCCryiGjLxgD86joajca79vfhjH2bShMDjMe-0aQ/viewform?embedded=true" width="100%" height="900" frameborder="0" marginheight="0" marginwidth="0" frameborder="no">Loading…</iframe>
{% endraw %}
| 33.542373 | 237 | 0.645781 | eng_Latn | 0.492626 |
1c7b635b25cbbef4d9f8bff647ebdae99883ea65 | 6,003 | md | Markdown | source/_posts/frustrated_boss_slams_jobseekers_for_not_reading_job_advert_properly.md | soumyadipdas37/finescoop.github.io | 0346d6175a2c36d4054083c144b7f8364db73f2f | [
"MIT"
] | null | null | null | source/_posts/frustrated_boss_slams_jobseekers_for_not_reading_job_advert_properly.md | soumyadipdas37/finescoop.github.io | 0346d6175a2c36d4054083c144b7f8364db73f2f | [
"MIT"
] | null | null | null | source/_posts/frustrated_boss_slams_jobseekers_for_not_reading_job_advert_properly.md | soumyadipdas37/finescoop.github.io | 0346d6175a2c36d4054083c144b7f8364db73f2f | [
"MIT"
] | 2 | 2021-09-18T12:06:26.000Z | 2021-11-14T15:17:34.000Z | ---
extends: _layouts.post
section: content
image: https://i.dailymail.co.uk/1s/2020/09/15/09/33196952-0-image-a-12_1600157862483.jpg
title: Frustrated boss slams jobseekers for not reading job advert properly
description: Web services firm owner Ryan Irving, 36, of Henley-on-Thames, shared his frustration on LinkedIn after only six out of 183 applicants answered a set of questions and instead sent their CVs.
date: 2020-09-15-09-57-15
categories: [latest, female]
featured: true
---
A business owner has publicly shamed hundreds of pandemic recession jobseekers – after 97 percent of applicants failed to follow his advert's basic instructions.
Ryan Irving claims the job ad specified not to submit a CV but rather asked candidates to answer a handful of simple questions for a role at his web services company Ri Web.
In a LinkedIn rant, the businessman revealed that of a whopping 183 people who applied, all but six were automatically declined because they weren't sufficiently tuned in to respond properly to the request.
The 36-year-old slammed the majority of applicants and said that at a time when 'unemployment is high and opportunities are few' their lack of attention was 'just not good enough'.
While some supported his message to pay attention to detail and not just fire out CVs blindly, others branded him 'sanctimonious' and said he was failing to see it from the point of view of jobseekers.
Business owner Ryan Irving, 36, from Henley-on-Thames, shared his frustration on LinkedIn after only six jobseekers out of 183 correctly answered his job advert
Ryan, from Henley-on-Thames, Oxfordshire, said: 'Jobseekers, please read job ads carefully and to the end.
'We recently ran a job ad which attracted 183 applications which, unfortunately, is symptomatic of the current job climate.
'Within our ad, we asked applicants to respond to a few simple questions instead of sending a CV and do you know how many responded as we'd requested? Six.
'At a time where unemployment is high and opportunities are few, this is just not good enough.
In the rant, Ryan said people should be able to follow 'simply instructions' and that the fact 177 people had failed to correctly answer his job advert was 'not good enough'
'I suspect many of these applicants are simply blanketly applying for everything and I can understand the mentality, but it doesn't work.
'We've just had to immediately decline 177 applications because those applicants didn't bother to read the ad properly.
'I don't care if you have four masters degrees and 20 years experience, if you can't follow some simple instructions, it's not a great start is it.'
Most people who responded to the post were sympathetic to his message.
Alyssa James said: 'Why wouldn't you read the ad? That seems crazy to me. If you have any real interest in a role you want to know every detail about what's involved. Skim reading doesn't scream enthusiastic, what a shame.'
Many of the people who replied to Ryan's rant were sympathetic, however, some reminded him of how demoralising jobseeking could be
Dorothy Dalton said the problem had got worse since Covid struck.
She said: 'I'm finding that not reading things properly or following instructions has increased significantly post COVID. This is purely anecdotal, although I have heard there is a term now Covid Brain.
'I find myself having to send things back or review them much more frequently."
But fellow respondent ICT project manager Alan Melia said it came across as 'sanctimonious'.
Alan said: 'Perhaps instead of a patronising slightly sanctimonious post you could have tried a different way of feedback.
'Sorry it does come across a bit that way, sure your intentions were good.'
But the business owner hit back, saying he had actually contacted those who were automatically rejected.
Ryan said: 'That’s disappointing to read that you feel my post came across that way.
'My intentions couldn't have been further as my only hope was that job-seekers may benefit from my experience by reading job ads in full before applying.
The 36-year-old said the company had responded to each applications to let them know why they had been rejected
One respondent said rejecting 177 applicants could only 'exacerbate stress levels' in the wake of covid
'We absolutely have responded to all applicants with details of why they were rejected, again, hoping to educate.'
But Izzy Martin said the recruiter should have had more sympathy with how demoralising the job hunt can be.
Izzy said: 'Whilst I appreciate where you're coming from, Ryan, and would certainly have shared your frustrations when advertising for a role, now I'm on the other side looking for a job, I understand how truly exhausting and demoralising the process can be.
'It can be especially demoralising when you put time and effort into applying for a role with an elaborate cover letter having researched the organisation, and receive nothing back.
Ryan, pictured, said the way applicants had responded to his job advert was 'symptomatic' of the current job climate
'So yes, it's frustrating people aren't reading every single word of your job ad. I get that.
'But perhaps it's because they feel that no one is reading a single word of their application. Just a thought.'
Alan Melia slammed Ryan's rant for attacking those who were already in a desperate position.
Alan said: 'Seems a lot of people here like a gotch. Nice to have a pop at people who may be desperate and trying to cover a lot of ground.
'Enjoy the schadenfreude of others failure.
Of course, one can level many charges back at agencies/employers and how they obviously don’t review CVs, but I’m not that kind of bloke.'
Gulnaz Raja, who delivers legal, governance and risk support to startups and SMES, said: 'To decline 177 applicants doesn’t seem right either.
'With unemployment increasing, to add an extra hurdle to the process will only exacerbate stress levels.'
| 61.886598 | 259 | 0.788439 | eng_Latn | 0.999913 |
1c7b72c52e257190eeae45db0e9103fa484519de | 1,263 | md | Markdown | README.md | MDShahrouq/arm_reverse_engineering_exercises | 173223b77319b69700054d678b4b16fb2d92ba25 | [
"Apache-2.0"
] | null | null | null | README.md | MDShahrouq/arm_reverse_engineering_exercises | 173223b77319b69700054d678b4b16fb2d92ba25 | [
"Apache-2.0"
] | null | null | null | README.md | MDShahrouq/arm_reverse_engineering_exercises | 173223b77319b69700054d678b4b16fb2d92ba25 | [
"Apache-2.0"
] | 1 | 2017-02-14T20:05:36.000Z | 2017-02-14T20:05:36.000Z | # ARM Reverse Engineering Exercises
## Overview
These exercises are meant to help identify and understand different patterns in ARM assembly. All of the examples have been written in Xcode and compiled with clang. As I create more exercises, I will use different compilers in order to demonstrate different nuances. The disassembly output will most likely be from Hopper, unless stated otherwise.
If you need additional help with any of the content, check out my ARM cheatsheet -> [https://github.com/rotlogix/cheatsheets/blob/master/arm.md](https://github.com/rotlogix/cheatsheets/blob/master/arm.md)
## ARM
- [Array of Strings](https://github.com/rotlogix/Exercises/blob/master/arm/array_of_strings_arm.md)
- [Single Linked List](https://github.com/rotlogix/Exercises/blob/master/arm/single_linked_list_01_arm.md)
- [Exploit Exercises - Stack0](https://github.com/rotlogix/arm_reverse_engineering_exercises/blob/master/arm/exploit_exercises_stack0_arm.md)
- [Exploit Exercises - Heap0](https://github.com/rotlogix/Exercises/blob/master/arm/exploit_exercises_heap0_arm.md)
- [Introduction to Objective-C Round One](https://github.com/rotlogix/Exercises/blob/master/arm/objc_intro_001.md)
## ARM64
| 78.9375 | 353 | 0.795724 | eng_Latn | 0.718508 |
1c7bfcaebedd5185becdf723a2c062e91554bf0e | 1,060 | md | Markdown | documentation/viikkoraportti_5.md | ilkkamaksy/data-compression-algos | b73d5d619d21b9af5fe4d0c9575cf4f774aa78a3 | [
"MIT"
] | null | null | null | documentation/viikkoraportti_5.md | ilkkamaksy/data-compression-algos | b73d5d619d21b9af5fe4d0c9575cf4f774aa78a3 | [
"MIT"
] | 2 | 2021-04-25T06:46:05.000Z | 2021-04-27T06:50:47.000Z | documentation/viikkoraportti_5.md | ilkkamaksy/data-compression-algos | b73d5d619d21b9af5fe4d0c9575cf4f774aa78a3 | [
"MIT"
] | null | null | null | # Weekly Report 5
This week I have been fixing the LZW algorithm's storage, although due to overhead the stored file is larger than the source for small inputs. With larger inputs the compression seems to work well, except that after some unknown point the decoded text starts to get garbled. This was tested with Mary Shelley's Frankenstein and Tolstoy's War and Peace. The first couple of hundred pages of the decoded text are fine, but then the text falls apart. As usual, I have no idea what is going on. I suspect my hash-table implementation, the character encoding, imprecise storage; in fact, I don't trust anything in my whole implementation.
Besides the LZW mess, I have been trying to figure out how to store the Huffman code's tree structure as bits in the compressed file using some sensible scheme. So far I have failed. My plans to expand testing this week therefore went completely awry, but the crawling continues next week.
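For reference, a common textbook scheme for storing a Huffman tree as bits (a sketch of the usual approach, not this project's actual code) is a pre-order traversal: a leaf is written as a 1 bit followed by the symbol's 8 bits, an internal node as a single 0 bit, and decoding rebuilds the tree recursively:

```python
def serialize(node, out):
    """Pre-order: '1' + 8 symbol bits for a leaf, '0' for an internal node."""
    if isinstance(node, str):              # leaf: a single character
        out.append('1' + format(ord(node), '08b'))
    else:                                  # internal: (left, right) pair
        out.append('0')
        serialize(node[0], out)
        serialize(node[1], out)

def deserialize(bits, pos=0):
    """Rebuild the tree and return it together with the next read position."""
    if bits[pos] == '1':
        return chr(int(bits[pos + 1:pos + 9], 2)), pos + 9
    left, pos = deserialize(bits, pos + 1)
    right, pos = deserialize(bits, pos)
    return (left, right), pos

tree = (('a', 'b'), 'c')                   # tiny stand-in Huffman tree
chunks = []
serialize(tree, chunks)
bits = ''.join(chunks)
restored, _ = deserialize(bits)
print(restored == tree)  # True
```

The bit string is then packed eight bits per byte and written ahead of the encoded data, so the decoder can rebuild the exact tree before decoding.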
## Hours Spent
This week I have spent about 8 hours on this work. | 117.777778 | 637 | 0.851887 | fin_Latn | 0.999993 |
1c7ccd36cd3681b4fd2da8147aec8ef2d69f0590 | 224 | md | Markdown | src/api-docs/stoplight.md | osvaldokalvaitir/project-settings | 94031b36bd96285275058123fbebe93aa5982a97 | [
"MIT"
] | 29 | 2018-12-20T13:16:27.000Z | 2020-08-26T20:34:03.000Z | src/api-docs/stoplight.md | osvaldokalvaitir/project-settings | 94031b36bd96285275058123fbebe93aa5982a97 | [
"MIT"
] | null | null | null | src/api-docs/stoplight.md | osvaldokalvaitir/project-settings | 94031b36bd96285275058123fbebe93aa5982a97 | [
"MIT"
] | 16 | 2018-12-20T13:16:21.000Z | 2020-10-15T19:54:41.000Z | # Stoplight
The API Design Management Platform powering the world's leading API-first companies.
## Documentation and Service Access
Click [here](https://stoplight.io) to view the documentation and access the service.
| 28 | 91 | 0.78125 | por_Latn | 0.999622 |
1c7cd7d2e53da892248385e7eebd89e672c41312 | 2,787 | md | Markdown | docs/api/state.md | avstudio/state-commander | 8053da5be2d566982cc211cf485b679bd4e2ec79 | [
"MIT"
] | null | null | null | docs/api/state.md | avstudio/state-commander | 8053da5be2d566982cc211cf485b679bd4e2ec79 | [
"MIT"
] | 8 | 2020-09-05T01:21:01.000Z | 2022-02-26T11:59:10.000Z | docs/api/state.md | avucic/state-commander | 8053da5be2d566982cc211cf485b679bd4e2ec79 | [
"MIT"
] | null | null | null | # Context.State
State is defined as a static getter on the [Command Class](#command-class).
Every class has its own state, but in the end the states are merged into a single [Module](/api/module.md) state.
So, for example, assuming that both commands are part of the `some/module` module:
```js
//inside some/module
class MyCommand1 {
static get state() {
return {
value1: 'foo'
};
}
}
//inside some/module
class MyCommand2 {
static get state() {
return {
value2: 'bar'
};
}
}
```
State for this module will be registered and merged as:
```js
context.state['some/module']; // {value1:'foo',value2:'bar'}
```
Within the state registration process, this data will be used to create new objects via the `buildState()` and `setState()` factories.
## Configuration:
- `buildState(state:Object):Object` -
Set root state objects
- `setState(state:Object,key:String, properties:Object)`
Assign new nested properties to the state object recursively
::: warning
this will mutate the given `state`
:::
::: tip NOTE
In the default implementation, this method returns a plain JS object.
Other implementations, like [VueContext](/plugins/vue-context.md), use `Vue.observable` and
`Vue.set` to return a state object with additional logic to support its reactive nature.
:::
## Hooks
- `state:register` - will be executed when state is registered
- `state:unregister` - will be executed when state is unregistered
[See more about hooks](/plugins.md#hooks)
## Command class instance properties
This module will define the following instance properties on each [Command Class](./command-class) instance:
- `state:Object` - returns the `module` state object
- `rootState:Object` - returns the `context` state object containing the state of all modules
```js
class MyCommand {
commit() {
//this.state => module state
//this.rootState => context state
}
}
```
## Context instance properties
### getters
- type: `Object`
Invoking context getters
```js
context.getters['some/module/myCommand'];
```
will invoke class getter method:
```js
class MyCommand{
...
getter(){
return this.value.toUpperCase()
}
...
}
```
You can retrieve module state by module `namespace`:
```js
context.state['some/module'];
```
### state
- type: `Object`
Will return the context state object from all registered modules.
### registerState
- `registerState(key:String, state:Object,options?:Object)`
Options can include `override: true` to force (override) a new object for the given key
::: tip NOTE
The state registration process is done through `module` / `command` registration.
Most likely you won't need to use it directly.
:::
### unregisterState
- `unregisterState(key:String)`
| 21.438462 | 115 | 0.699677 | eng_Latn | 0.945078 |
1c7dac27748874926d7599761f41dc28bffbcb74 | 639 | md | Markdown | src/pages/blog/2019-01-21-2019-01-20.md | murokaco/gatsby-starter-netlify-cms | e00ddef20df3b431b5683950a791948fc9a548b6 | [
"MIT"
] | null | null | null | src/pages/blog/2019-01-21-2019-01-20.md | murokaco/gatsby-starter-netlify-cms | e00ddef20df3b431b5683950a791948fc9a548b6 | [
"MIT"
] | 3 | 2019-01-16T17:33:40.000Z | 2019-01-18T08:30:25.000Z | src/pages/blog/2019-01-21-2019-01-20.md | murokaco/gatsby-starter-netlify-cms | e00ddef20df3b431b5683950a791948fc9a548b6 | [
"MIT"
] | 1 | 2019-01-10T15:00:00.000Z | 2019-01-10T15:00:00.000Z | ---
templateKey: blog-post
title: '2019-01-20'
date: 2019-01-20T15:33:02.023Z
description: "\U0001F60C"
tags:
- JS UX
---
### What I did today
#### JS
* Continued working through the book *Sura-sura Yomeru JavaScript Furigana Programming* ([GitHub](https://github.com/murokaco/furigana-programming/commit/a62f22a02249c5e820c3ca1ad4bd31d58f2ffd30))
#### UX
* Read up to the middle of chapter 2 of the *Getting Started with UX Design* book.
-----
The *Getting Started with UX Design* book is aimed at UX beginners, so the explanations are easy to follow. Since it was written by a Japanese author there is no need to decipher mysterious translated Japanese, and (naturally) the personas in the examples are Japanese and the products are localized for Japan, which makes everything easy to picture.
Products are one thing, but if the personas are not Japanese, it is hard to grasp the traits that come from their everyday habits.
A company I once visited to hear about UX design also said, "We brought in foreign UX designers (figuring UX was more advanced over there), but it did not work out." Different cultures mean different habits, so sensibilities differ, and of course the analysis results differ too. That made perfect sense to me.
See you again tomorrow.
| 26.625 | 140 | 0.824726 | yue_Hant | 0.45292 |
1c7e746fb393fb55f2235f20f7756ebfb3413a61 | 170 | md | Markdown | README.md | you7588/todo-wechat-leanTodo | 0c16ff5a40497c5234e79f95ea79bb82708bc733 | [
"MIT"
] | null | null | null | README.md | you7588/todo-wechat-leanTodo | 0c16ff5a40497c5234e79f95ea79bb82708bc733 | [
"MIT"
] | null | null | null | README.md | you7588/todo-wechat-leanTodo | 0c16ff5a40497c5234e79f95ea79bb82708bc733 | [
"MIT"
] | null | null | null | # LeanTodo × WeChat Mini Program
A check-in (habit tracking) mini program adapted from LeanTodo.
# Development Resources
1. [leancloud](https://leancloud.cn/)
2. [WeUI](https://github.com/weui/weui-wxss)
3. [iconfont](http://www.iconfont.cn/)
| 18.888889 | 44 | 0.694118 | yue_Hant | 0.493395 |
1c7ec8d7dca03ff9ca6421ed3488e6a22854162e | 457 | md | Markdown | README.md | RaspiRepo/esp32-temperature | 05d8671d492104719b4604ae12199d23b1f667df | [
"MIT"
] | null | null | null | README.md | RaspiRepo/esp32-temperature | 05d8671d492104719b4604ae12199d23b1f667df | [
"MIT"
] | null | null | null | README.md | RaspiRepo/esp32-temperature | 05d8671d492104719b4604ae12199d23b1f667df | [
"MIT"
] | null | null | null | # esp32-temperature
This project's aim is to understand the integration between an IoT sensor and the cloud. Here an ESP32 reads temperature from an Analog Devices TMP36GZ sensor, constructs an HTTP message with the sensor data, and sends it to an InfluxDB HTTP endpoint.
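The firmware itself is C/Arduino, but the message it constructs can be sketched in a few lines of Python. The measurement and tag names below are illustrative assumptions; the TMP36 conversion (10 mV/°C with a 500 mV offset at 0 °C) and the InfluxDB line-protocol layout (`measurement,tags fields timestamp`) are standard:

```python
import time

def tmp36_celsius(voltage):
    """TMP36: 10 mV per degree Celsius, with a 500 mV offset at 0 degrees C."""
    return (voltage - 0.5) * 100.0

def influx_line(measurement, tags, value, ts=None):
    # InfluxDB line protocol: measurement,tag=v,... field=value [timestamp]
    tag_str = ",".join(f"{k}={v}" for k, v in tags.items())
    if ts is None:
        ts = int(time.time() * 1e9)  # nanosecond timestamp
    return f"{measurement},{tag_str} value={value:.2f} {ts}"

volts = 0.75  # ADC reading already converted to volts
line = influx_line("temperature", {"sensor": "tmp36", "node": "esp32"},
                   tmp36_celsius(volts), ts=1600000000000000000)
print(line)
# temperature,sensor=tmp36,node=esp32 value=25.00 1600000000000000000
```

The ESP32 then POSTs a payload like this to the InfluxDB HTTP write endpoint.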
## Real-time visibility into sensors using TICK
https://www.influxdata.com/
A Jetson Nano board runs the TICK stack (Telegraf, InfluxDB, Chronograf, Kapacitor); to set it up, visit
https://github.com/RaspiRepo/jetsonNano-Tick
| 38.083333 | 211 | 0.803063 | eng_Latn | 0.725502 |
1c7ef135e9d6cd7cf51a10e2f34e811b5df87211 | 791 | md | Markdown | README.md | MosheBerman/ScrollViewSnapshotter | 659287709e5f3d15b7dfe7aabe0615012524593a | [
"MIT"
] | 18 | 2016-04-12T16:17:31.000Z | 2021-11-14T20:18:39.000Z | README.md | MosheBerman/ScrollViewSnapshotter | 659287709e5f3d15b7dfe7aabe0615012524593a | [
"MIT"
] | 3 | 2016-09-20T01:27:39.000Z | 2020-10-22T21:58:44.000Z | README.md | MosheBerman/ScrollViewSnapshotter | 659287709e5f3d15b7dfe7aabe0615012524593a | [
"MIT"
] | 3 | 2018-04-18T05:13:45.000Z | 2020-07-14T18:34:54.000Z | # ScrollViewSnapshotter
A demo project showing how to correctly snapshot the contents of UIScrollView and its subclasses.
About:
---
Historically, the suggested approach to snapshotting a scrollview was to resize the scroll view's `frame` to match its `contentSize`. This became inconvenient with the introduction of autolayout on iOS.
This example scrolls through the contents of a scrollview subclass (in this case a collection view) and generates a PDF.
The magic comes from translating the Core Graphics current transformation matrix along with the `contentOffset`. Check out the ScrollViewSnapshotter class for implementation details and a ton of notes. Those notes are also available on [my blog](http://blog.mosheberman.com/generating-a-pdf-from-a-uiscrollview/).
License:
---
MIT
| 52.733333 | 313 | 0.80531 | eng_Latn | 0.987833 |
1c7f5e3aa9525408dca6fb99a5b099ccb99f4aa5 | 1,794 | md | Markdown | dynamicsax2012-technet/iutilityv2-createcreditmemotenderlineitem-method-microsoft-dynamics-retail-pos-contracts-businesslogic.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamicsax2012-technet/iutilityv2-createcreditmemotenderlineitem-method-microsoft-dynamics-retail-pos-contracts-businesslogic.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | dynamicsax2012-technet/iutilityv2-createcreditmemotenderlineitem-method-microsoft-dynamics-retail-pos-contracts-businesslogic.md | RobinARH/DynamicsAX2012-technet | d0d0ef979705b68e6a8406736612e9fc3c74c871 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: IUtilityV2.CreateCreditMemoTenderLineItem Method (Microsoft.Dynamics.Retail.Pos.Contracts.BusinessLogic)
TOCTitle: CreateCreditMemoTenderLineItem Method
ms:assetid: M:Microsoft.Dynamics.Retail.Pos.Contracts.BusinessLogic.IUtilityV2.CreateCreditMemoTenderLineItem
ms:mtpsurl: https://technet.microsoft.com/en-us/library/microsoft.dynamics.retail.pos.contracts.businesslogic.iutilityv2.createcreditmemotenderlineitem(v=AX.60)
ms:contentKeyID: 49842387
ms.date: 05/18/2015
mtps_version: v=AX.60
f1_keywords:
- Microsoft.Dynamics.Retail.Pos.Contracts.BusinessLogic.IUtilityV2.CreateCreditMemoTenderLineItem
dev_langs:
- CSharp
- C++
- VB
---
# CreateCreditMemoTenderLineItem Method
**Namespace:** [Microsoft.Dynamics.Retail.Pos.Contracts.BusinessLogic](microsoft-dynamics-retail-pos-contracts-businesslogic-namespace.md)
**Assembly:** Microsoft.Dynamics.Retail.Pos.Contracts (in Microsoft.Dynamics.Retail.Pos.Contracts.dll)
## Syntax
``` vb
'Declaration
Function CreateCreditMemoTenderLineItem As ICreditMemoTenderLineItem
'Usage
Dim instance As IUtilityV2
Dim returnValue As ICreditMemoTenderLineItem
returnValue = instance.CreateCreditMemoTenderLineItem()
```
``` csharp
ICreditMemoTenderLineItem CreateCreditMemoTenderLineItem()
```
``` c++
ICreditMemoTenderLineItem^ CreateCreditMemoTenderLineItem()
```
#### Return Value
Type: [Microsoft.Dynamics.Retail.Pos.Contracts.DataEntity.ICreditMemoTenderLineItem](icreditmemotenderlineitem-interface-microsoft-dynamics-retail-pos-contracts-dataentity.md)
## See Also
#### Reference
[IUtilityV2 Interface](iutilityv2-interface-microsoft-dynamics-retail-pos-contracts-businesslogic.md)
[Microsoft.Dynamics.Retail.Pos.Contracts.BusinessLogic Namespace](microsoft-dynamics-retail-pos-contracts-businesslogic-namespace.md)
| 33.222222 | 177 | 0.829431 | yue_Hant | 0.870476 |
1c7f8408e40eb4f0feb79b4f63fe8e50cc0e806e | 559 | md | Markdown | .github/PULL_REQUEST_TEMPLATE.md | RoasteryHub/lavie-selekopi | e3a09fdb22ac2f1744a5c106e9a6ca551ea19598 | [
"MIT"
] | null | null | null | .github/PULL_REQUEST_TEMPLATE.md | RoasteryHub/lavie-selekopi | e3a09fdb22ac2f1744a5c106e9a6ca551ea19598 | [
"MIT"
] | null | null | null | .github/PULL_REQUEST_TEMPLATE.md | RoasteryHub/lavie-selekopi | e3a09fdb22ac2f1744a5c106e9a6ca551ea19598 | [
"MIT"
] | null | null | null | ### Proposed changes in this pull request
[List all changes you want to add here. If you fixed an issue, please
add a reference to that issue as well.]
-
### When should this PR be merged
[Please describe any preconditions that need to be addressed before we
can merge this pull request.]
### Risks
[List any risks that could arise from merging your pull request.]
-
### Follow-up actions
[List any possible follow-up actions here; for instance, testing data
migrations, software that we need to install on staging and production
environments.]
-
| 19.964286 | 70 | 0.751342 | eng_Latn | 0.999744 |
1c7fccf9beb07d1d64b17e94c959bbd6506b0722 | 1,263 | md | Markdown | docs/schema-doc/snitch_cluster-properties-hives-hive-description-properties-cores-core-description-properties-ssrs-ssr-description-properties-isect_slave_credits.md | mfkiwl/snitch | d4d2d8e0b45b5a03836023dd923ed54a808fb0d4 | [
"Apache-2.0"
] | 105 | 2021-02-06T16:54:03.000Z | 2022-03-30T07:54:57.000Z | docs/schema-doc/snitch_cluster-properties-hives-hive-description-properties-cores-core-description-properties-ssrs-ssr-description-properties-isect_slave_credits.md | mfkiwl/snitch | d4d2d8e0b45b5a03836023dd923ed54a808fb0d4 | [
"Apache-2.0"
] | 163 | 2021-02-06T18:54:40.000Z | 2022-03-30T22:06:05.000Z | docs/schema-doc/snitch_cluster-properties-hives-hive-description-properties-cores-core-description-properties-ssrs-ssr-description-properties-isect_slave_credits.md | mfkiwl/snitch | d4d2d8e0b45b5a03836023dd923ed54a808fb0d4 | [
"Apache-2.0"
] | 20 | 2021-02-23T07:26:02.000Z | 2022-03-27T17:19:53.000Z | # Untitled number in Snitch Cluster Schema Schema
```txt
http://pulp-platform.org/snitch/snitch_cluster.schema.json#/properties/hives/items/properties/cores/items/properties/ssrs/items/properties/isect_slave_credits
```
Number of elements by which intersected indices may outrun corresponding data; added only if this SSR is an intersection slave.
| Abstract | Extensible | Status | Identifiable | Custom Properties | Additional Properties | Access Restrictions | Defined In |
| :------------------ | :--------- | :------------- | :---------------------- | :---------------- | :-------------------- | :------------------ | :------------------------------------------------------------------------------- |
| Can be instantiated | No | Unknown status | Unknown identifiability | Forbidden | Allowed | none | [snitch_cluster.schema.json*](snitch_cluster.schema.json "open original schema") |
## isect_slave_credits Type
`number`
## isect_slave_credits Constraints
**minimum**: the value of this number must greater than or equal to: `2`
## isect_slave_credits Default Value
The default value is:
```json
8
```
| 45.107143 | 228 | 0.546318 | eng_Latn | 0.79907 |
1c803e5c6c031d6d5ce00d9de1de86a168254c12 | 26 | md | Markdown | README.md | luisalcarasr/landing-page | c8aaf223df08cbbd030815ab83a5f15ae0c45b20 | [
"MIT"
] | null | null | null | README.md | luisalcarasr/landing-page | c8aaf223df08cbbd030815ab83a5f15ae0c45b20 | [
"MIT"
] | 2 | 2021-04-03T01:41:22.000Z | 2021-04-09T03:32:44.000Z | README.md | luisalcarasr/landing-page | c8aaf223df08cbbd030815ab83a5f15ae0c45b20 | [
"MIT"
] | null | null | null | # landing-page
My page 🌍
| 8.666667 | 14 | 0.653846 | eng_Latn | 0.634475 |
1c807c14d96e89bb9411bb8a2e19d6aa61453c56 | 1,642 | md | Markdown | content/post/2019/fixing-hugo-pagination-in-0-58.md | RubenSchade/rubenerd-com | c31a535a0e2949a360be1211b5413bac6d37c2cb | [
"BSD-3-Clause"
] | 7 | 2016-08-25T17:47:57.000Z | 2020-07-13T12:19:53.000Z | content/post/2019/fixing-hugo-pagination-in-0-58.md | RubenSchade/rubenerd-com | c31a535a0e2949a360be1211b5413bac6d37c2cb | [
"BSD-3-Clause"
] | null | null | null | content/post/2019/fixing-hugo-pagination-in-0-58.md | RubenSchade/rubenerd-com | c31a535a0e2949a360be1211b5413bac6d37c2cb | [
"BSD-3-Clause"
] | 3 | 2020-07-13T12:20:10.000Z | 2021-01-04T09:34:13.000Z | ---
title: "Fixing Hugo pagination in 0.58"
date: "2019-09-12T10:44:45+10:00"
abstract: "Use site.RegularPages, not .Pages"
year: "2019"
category: Internet
tag:
- gohugo
- troubleshooting
- weblog
location: Sydney
---
The latest [Hugo](https://gohugo.io) static site generator version broke pagination for all my sites, including this one. This is how I previously filtered content so only posts would appear in pagination, as per the [documentation](https://gohugo.io/templates/pagination/#list-paginator-pages):
{{ $paginator := .Paginate (where .Pages "Type" "post") }}
{{ range $paginator.Pages }}
This now returns a single page with the word *Posts* shown, and nothing else. I changed to the generic pagination method below and it worked, so the issue was with the above lookup.
    {{ range .Paginator.Pages }}
I saw [this thread](https://discourse.gohugo.io/t/site-not-working-after-updating-to-0-55/20664) on the Hugo forums, which had example code by funkydan2 replacing `.Pages` with `site.RegularPages`. This worked.
{{ $pages := where site.RegularPages "Type" "post" }}
{{ $paginator := .Paginate $pages }}{{ range $paginator.Pages }}
I broke the lookup out into a separate line to make it nicer, but you can just do a replace with the original documented code and it works.
Niggling changes like this aside, Hugo is still the best static-site generator I've ever used. I do miss the Liquid template system of Jekyll, but Hugo generates more than six thousand posts in seconds, as opposed to more than fifteen minutes. I wonder how much wasted power I've saved on remote servers since making the switch.
---
date: "2021-08-24"
publisher: Bennington Review
featuredImage: ../../images/bennington-review-cover-2021.jpg
external_link: https://www.benningtonreview.org/issue-nine-wong
poems:
- When the Still Breathing Watch the Stillborn
---
---
layout: page_f
title: "Site search"
title_name: "Site search"
description: ""
time_publish: ""
---
{% include JB/setup %}
<div id="ya-site-results" onclick="return {'tld': 'ru','language': 'ru','encoding': '','htmlcss': '1.x','updatehash': true}"></div><script type="text/javascript">(function(w,d,c){var s=d.createElement('script'),h=d.getElementsByTagName('script')[0];s.type='text/javascript';s.async=true;s.charset='utf-8';s.src=(d.location.protocol==='https:'?'https:':'http:')+'//site.yandex.net/v2.0/js/all.js';h.parentNode.insertBefore(s,h);(w[c]||(w[c]=[])).push(function(){Ya.Site.Results.init();})})(window,document,'yandex_site_callbacks');</script>
# Pioneer Monitor 10 II
See [usage instructions](https://github.com/jaakkopasanen/AutoEq#usage) for more options and info.
### Parametric EQs
If you are using a parametric equalizer, apply a preamp of **-7.2 dB** and build the filters manually
with these parameters. The first 5 filters can be used independently.
When using only that independent subset of filters, apply a preamp of **-7.4 dB**.
| Type | Fc | Q | Gain |
|:--------|:--------|:-----|:---------|
| Peaking | 32 Hz | 0.43 | -4.6 dB |
| Peaking | 117 Hz | 0.84 | -3.1 dB |
| Peaking | 268 Hz | 1.98 | -6.0 dB |
| Peaking | 1699 Hz | 3.54 | -10.7 dB |
| Peaking | 4761 Hz | 1.41 | 7.0 dB |
| Peaking | 379 Hz | 3.12 | -2.1 dB |
| Peaking | 680 Hz | 2.27 | 5.0 dB |
| Peaking | 1459 Hz | 5.84 | -1.6 dB |
| Peaking | 6411 Hz | 4.11 | 3.9 dB |
| Peaking | 7354 Hz | 1.59 | -2.5 dB |
### Fixed Band EQs
If you are using a fixed-band (also called graphic) equalizer, apply a preamp of **-7.7 dB**
(if available) and set the gains manually with these parameters.
| Type | Fc | Q | Gain |
|:--------|:---------|:-----|:--------|
| Peaking | 31 Hz | 1.41 | -4.5 dB |
| Peaking | 62 Hz | 1.41 | -4.1 dB |
| Peaking | 125 Hz | 1.41 | -2.5 dB |
| Peaking | 250 Hz | 1.41 | -7.2 dB |
| Peaking | 500 Hz | 1.41 | 1.9 dB |
| Peaking | 1000 Hz | 1.41 | 1.9 dB |
| Peaking | 2000 Hz | 1.41 | -8.5 dB |
| Peaking | 4000 Hz | 1.41 | 8.7 dB |
| Peaking | 8000 Hz | 1.41 | 0.4 dB |
| Peaking | 16000 Hz | 1.41 | -0.3 dB |
### Graphs
 | 42 | 161 | 0.576786 | eng_Latn | 0.641461 |
1c82cf42445610a07bd295651b20e31556afb8c0 | 3,502 | md | Markdown | docs/debugger/debug-live-azure-apps-faq.md | keunyop/visualstudio-docs.ko-kr | 6f316d7e3badaf2deafa1a041b7946f2bc3d9a31 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2019-12-04T01:36:07.000Z | 2019-12-04T01:36:07.000Z | docs/debugger/debug-live-azure-apps-faq.md | keunyop/visualstudio-docs.ko-kr | 6f316d7e3badaf2deafa1a041b7946f2bc3d9a31 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/debugger/debug-live-azure-apps-faq.md | keunyop/visualstudio-docs.ko-kr | 6f316d7e3badaf2deafa1a041b7946f2bc3d9a31 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Snapshot Debugging FAQ | Microsoft Docs
ms.date: 11/07/2017
ms.topic: reference
helpviewer_keywords:
- debugger
ms.assetid: 944f1eb0-a74b-4d28-ae2b-a370cd869add
author: mikejo5000
ms.author: mikejo
manager: jillfra
ms.workload:
- multiple
ms.openlocfilehash: 7ea593ad5f88ba29f6b1c0d7c64a129b8f71c7f5
ms.sourcegitcommit: 94b3a052fb1229c7e7f8804b09c1d403385c7630
ms.translationtype: MT
ms.contentlocale: ko-KR
ms.lasthandoff: 04/23/2019
ms.locfileid: "62853317"
---
# <a name="frequently-asked-questions-for-snapshot-debugging-in-visual-studio"></a>Frequently asked questions for snapshot debugging in Visual Studio
Below are answers to questions you may have when debugging live Azure applications with the Snapshot Debugger.
#### <a name="what-is-the-performance-cost-of-taking-a-snapshot"></a>What is the performance cost of taking a snapshot?
When the Snapshot Debugger captures a snapshot of your app, it forks the app's process and suspends the forked copy. When you debug a snapshot, you debug against the forked copy of the process. This takes only 10-20 milliseconds, and it does not copy the app's entire heap: it copies only the page table and marks the pages copy-on-write. If some of the app's objects on the heap change, the affected pages are copied. Each snapshot therefore has a very small in-memory cost (roughly a few hundred kilobytes for most apps).
#### <a name="what-happens-if-i-have-a-scaled-out-azure-app-service-multiple-instances-of-my-app"></a>What happens if I have a scaled-out Azure App Service (multiple instances of my app)?
When you have multiple instances of your app, snappoints are applied to every single instance. Only the first snappoint hit with the specified conditions creates a snapshot. If you have multiple snappoints, subsequent snapshots come from the same instance that created the first one. Logpoints sent to the Output window show messages from only one instance, whereas logpoints sent to the application log send messages from all instances.
#### <a name="how-does-the-snapshot-debugger-load-symbols"></a>How does the Snapshot Debugger load symbols?
The Snapshot Debugger requires matching symbols for the application, whether it runs locally or is deployed to an Azure App Service. (Embedded PDBs are not currently supported.) The Snapshot Debugger automatically downloads symbols from the Azure App Service. Starting with Visual Studio 2017 version 15.2, deploying to an Azure App Service also deploys your app's symbols.
#### <a name="does-the-snapshot-debugger-work-against-release-builds-of-my-application"></a>Does the Snapshot Debugger work against release builds of my application?
Yes, the Snapshot Debugger is designed to work against release builds. When a snappoint is placed in a function, that function is recompiled as a debug version so it can be debugged. When you stop the Snapshot Debugger, the function returns to its release-build version.
#### <a name="can-logpoints-cause-side-effects-in-my-production-application"></a>Can logpoints cause side effects in my production application?
No. Any log messages you add to your app are evaluated virtually; they cannot cause side effects in your application. However, some native properties may not be accessible from logpoints.
#### <a name="does-the-snapshot-debugger-work-if-my-server-is-under-load"></a>Does the Snapshot Debugger work if my server is under load?
Yes, snapshot debugging can work against a server under load. The Snapshot Debugger throttles and does not capture snapshots when the server is low on available memory.
#### <a name="how-do-i-uninstall-the-snapshot-debugger"></a>How do I uninstall the Snapshot Debugger?
Use the following steps to uninstall the Snapshot Debugger site extension from your App Service:
1. Turn off your App Service through Cloud Explorer in Visual Studio, or in the Azure portal.
1. Navigate to your App Service's Kudu site (that is, yourappservice.**scm**.azurewebsites.net) and go to **Site extensions**.
1. Click the X on the Snapshot Debugger site extension to remove it.
#### <a name="why-are-ports-opened-during-a-snapshot-debugger-session"></a>Why are ports opened during a Snapshot Debugger session?
To debug snapshots created in Azure, the Snapshot Debugger must open a set of ports; these are the same ports required for remote debugging. [You can find the list of ports here](../debugger/remote-debugger-port-assignments.md).
## <a name="see-also"></a>See also
- [Debugging in Visual Studio](../debugger/index.md)
- [Debug live ASP.NET apps using the Snapshot Debugger](../debugger/debug-live-azure-applications.md)
- [Debug live ASP.NET apps on Azure Virtual Machines and Virtual Machine Scale Sets using the Snapshot Debugger](../debugger/debug-live-azure-virtual-machines.md)
- [Debug live ASP.NET Azure Kubernetes Services using the Snapshot Debugger](../debugger/debug-live-azure-kubernetes.md)
- [Troubleshooting and known issues for snapshot debugging](../debugger/debug-live-azure-apps-troubleshooting.md)
1c83f7313d076681ada0326bc18a3bd1740b9509 | 1,491 | md | Markdown | content/post/17/index.md | astropenguin/website | f2268a7040a773059399b37bd630ab274e6ed34f | [
"MIT"
] | 1 | 2020-01-27T08:37:34.000Z | 2020-01-27T08:37:34.000Z | content/post/17/index.md | astropenguin/website | f2268a7040a773059399b37bd630ab274e6ed34f | [
"MIT"
] | 18 | 2019-11-29T19:00:05.000Z | 2022-01-03T02:23:02.000Z | content/post/17/index.md | astropenguin/website | f2268a7040a773059399b37bd630ab274e6ed34f | [
"MIT"
] | 1 | 2021-01-29T08:26:11.000Z | 2021-01-29T08:26:11.000Z | +++
title = "Installing the Ricty Programming Font"
date = 2018-12-11T23:06:55+09:00
tags = ["Ricty", "Homebrew"]
categories = ["Tech"]
aliases = ["/blog/17/"]
+++
## TL;DR
This is the day-11 entry in an Advent calendar.
Today's post is a quick note on installing the programming font Ricty.
## Ricty
> Ricty is a programming font designed with research and development on Linux in mind.
> It is well suited for text editors and terminal emulators, and for programming and markup languages.
- [Programming font Ricty](https://www.rs.tus.ac.jp/yyusa/ricty.html)
Setting it as the font in VS Code or Terminal should make them much easier to read.
## Installation
You can install it with Homebrew.
After installing, the following caveats are printed; follow them as shown.
```shell
==> Caveats
***************************************************
Generated files:
/usr/local/opt/ricty/share/fonts/RictyDiscord-Regular.ttf
/usr/local/opt/ricty/share/fonts/Ricty-Bold.ttf
/usr/local/opt/ricty/share/fonts/Ricty-Regular.ttf
/usr/local/opt/ricty/share/fonts/RictyDiscord-Bold.ttf
***************************************************
To install Ricty:
$ cp -f /usr/local/opt/ricty/share/fonts/Ricty*.ttf ~/Library/Fonts/
$ fc-cache -vf
***************************************************
```
Putting it all together, the full set of commands is:
```shell
$ brew tap sanemat/font
$ brew install ricty
$ cp -f /usr/local/opt/ricty/share/fonts/Ricty*.ttf ~/Library/Fonts/
$ fc-cache -vf
```
Here is how it looks after setting it up in VS Code!

## References
+ [Programming font Ricty](https://www.rs.tus.ac.jp/yyusa/ricty.html)
+ [Installing the programming font Ricty on a Mac - Qiita](https://qiita.com/segur/items/50ae2697212a7bdb7c7f)
# Tags
### Design
N/A
### Description
The `Tags` component lets users input one or more tags.
### Import
```js
import { Tags } from "webiny-ui/Tags";
```
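A minimal usage sketch (the `value` and `onChange` prop names below are assumptions based on typical controlled form components, not confirmed API):

```js
import * as React from "react";
import { Tags } from "webiny-ui/Tags";

// Hypothetical controlled usage; prop names are illustrative.
class TagsExample extends React.Component {
    state = { tags: [] };

    render() {
        return (
            <Tags
                value={this.state.tags}
                onChange={tags => this.setState({ tags })}
            />
        );
    }
}

export default TagsExample;
```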
1c85238d629e05043735bf9857e19fa142135c7c | 2,916 | md | Markdown | types/foundry-pc-types/README.md | mirkoRainer/Foundry-VTT---Pathfinder-2e-Mirror | 65ef20202f332d9281a23cef4c58a91aa98b11e4 | [
"Apache-2.0"
] | 2 | 2021-03-29T13:18:15.000Z | 2021-11-20T23:05:20.000Z | types/foundry-pc-types/README.md | mirkoRainer/Foundry-VTT---Pathfinder-2e-Mirror | 65ef20202f332d9281a23cef4c58a91aa98b11e4 | [
"Apache-2.0"
] | 6 | 2021-01-13T11:59:37.000Z | 2021-01-16T11:54:30.000Z | types/foundry-pc-types/README.md | mirkoRainer/Foundry-VTT---Pathfinder-2e-Mirror | 65ef20202f332d9281a23cef4c58a91aa98b11e4 | [
"Apache-2.0"
] | null | null | null | <div align=center>
# Foundry Project Creator Types
[](https://discord.gg/59Tz2X7)
[](https://twitter.com/NickEastNL)
[](https://www.patreon.com/nick_east)
[](https://ko-fi.com/nickeast)
</div>
This package provides type definitions for the Foundry VTT API, useful for both vanilla JS and TypeScript to get autocomplete, Intellisense, and type checking.
Unlike other packages, this will gradually be updated as I add more type definitions. Therefore, I will not use version numbers and tags to tag releases.
Considering the size of the Foundry API and the frequent changes to it, please be aware that not all types may be accurate. If in doubt, double-check to make sure what you're attempting to do is possible (such as by temporarily disabling typechecking). If types are missing, outdated, or otherwise incorrect, please report it in the Issue Tracker.
If you are updating the types yourself, Pull Requests are much appreciated, as I cannot consistently keep track of every API change and update everything by myself.
## Installing
Due to how the Node Package Manager (NPM) works, it is not possible to simply update a package installed from Git using `npm install`. Simply reinstall the package with the command below to retrieve the latest version.
```
npm install --save-dev gitlab:foundry-projects/foundry-pc/foundry-pc-types
```
When using the CLI ([Foundry Project Creator](https://gitlab.com/foundry-projects/foundry-pc/create-foundry-project)), it will add this package to your project by default, and configure the `tsconfig.json` and add the above command as an npm script for you.
## Configuration
When not using my Project Creator, you need to manually configure your project to use the types.
TypeScript is configured with a `tsconfig.json` file in your project root. Add the following property to enable TS to use the Foundry API typings.
```json
{
"compilerOptions": {
    "types": ["foundry-pc-types"]
}
}
```
For vanilla JavaScript, you may need to use `jsconfig.json` with the above configuration, or allow TypeScript to also parse JavaScript files. Please refer to the documentation of your IDE and/or TypeScript to learn how to properly configure your environment.
## Support
All the work that I do for Foundry I do in my spare time. If you love using it, please consider supporting me through either [Patreon](https://www.patreon.com/nick_east) or [Ko-Fi](https://ko-fi.com/nickeast) so that I can continue my work and worry less about life itself.
| 59.510204 | 347 | 0.777435 | eng_Latn | 0.990135 |
1c865497823a058a17d1baf44f80644d80ae35b8 | 26,298 | md | Markdown | README.md | rxeer/seo-portal | 0f0410af5f0cd8a43d2ab1d0942ef6696bb6a74a | [
"MIT"
] | 1 | 2019-09-26T03:13:15.000Z | 2019-09-26T03:13:15.000Z | README.md | rxeer/seo-portal | 0f0410af5f0cd8a43d2ab1d0942ef6696bb6a74a | [
"MIT"
] | null | null | null | README.md | rxeer/seo-portal | 0f0410af5f0cd8a43d2ab1d0942ef6696bb6a74a | [
"MIT"
] | null | null | null | ## Table of Contents
- [Sending Feedback](#sending-feedback)
- [Folder Structure](#folder-structure)
- [Available Scripts](#available-scripts)
- [npm start](#npm-start)
- [npm test](#npm-test)
- [npm run build](#npm-run-build)
- [npm run eject](#npm-run-eject)
- [Supported Browsers](#supported-browsers)
- [Supported Language Features and Polyfills](#supported-language-features-and-polyfills)
- [Syntax Highlighting in the Editor](#syntax-highlighting-in-the-editor)
- [Debugging in the Editor](#debugging-in-the-editor)
- [Changing the Page `<title>`](#changing-the-page-title)
- [Installing a Dependency](#installing-a-dependency)
- [Importing a Component](#importing-a-component)
- [Code Splitting](#code-splitting)
- [Adding a Stylesheet](#adding-a-stylesheet)
- [Post-Processing CSS](#post-processing-css)
- [Adding a CSS Preprocessor (Sass, Less etc.)](#adding-a-css-preprocessor-sass-less-etc)
- [Adding Images, Fonts, and Files](#adding-images-fonts-and-files)
- [Using the `public` Folder](#using-the-public-folder)
- [Changing the HTML](#changing-the-html)
- [Adding Assets Outside of the Module System](#adding-assets-outside-of-the-module-system)
- [When to Use the `public` Folder](#when-to-use-the-public-folder)
- [Expanding Environment Variables In `.env`](#expanding-environment-variables-in-env)
- [Adding a Router](#adding-a-router)
## Folder Structure
After creation, your project should look like this:
```
my-app/
README.md
node_modules/
public/
index.html
favicon.ico
src/
assets/ // photos, fonts
component/ // shared components
containers/ // pages containers
middlewares/ // middlewares/utils
pages/ // routes pages
locales/ // localizations
package.json // project dependencies
.env // global vars
.eslintrc.json // setting for linters
.gitinore // .gitignore
webpack.config.js // webpack configuration
```
For the project to build, **these files must exist with exact filenames**:
* `public/index.html` is the page template;
* `src/index.js` is the JavaScript entry point.
You can delete or rename the other files.
You may create subdirectories inside `src`. For faster rebuilds, only files inside `src` are processed by Webpack.<br>
You need to **put any JS and CSS files inside `src`**, otherwise Webpack won’t see them.
Only files inside `public` can be used from `public/index.html`.<br>
Read instructions below for using assets from JavaScript and HTML.
You can, however, create more top-level directories.<br>
They will not be included in the production build so you can use them for things like documentation.
## Available Scripts
In the project directory, you can run:
### `npm start`
Runs the app in the development mode.<br>
Open [http://localhost:8080](http://localhost:8080) to view it in the browser.
The page will reload if you make edits.<br>
You will also see any lint errors in the console.
### `npm run build`
Builds the app for production to the `build` folder.<br>
It correctly bundles React in production mode and optimizes the build for the best performance.
The build is minified and the filenames include the hashes.<br>
Your app is ready to be deployed!
See the section about [deployment](#deployment) for more information.
## Supported Browsers
By default, the generated project uses the latest version of React.
## Supported Language Features and Polyfills
This project supports a superset of the latest JavaScript standard.<br>
In addition to [ES6](https://github.com/lukehoban/es6features) syntax features, it also supports:
* [Exponentiation Operator](https://github.com/rwaldron/exponentiation-operator) (ES2016).
* [Object Rest/Spread Properties](https://github.com/sebmarkbage/ecmascript-rest-spread) (stage 3 proposal).
* [Dynamic import()](https://github.com/tc39/proposal-dynamic-import) (stage 3 proposal)
* [Class Fields and Static Properties](https://github.com/tc39/proposal-class-public-fields) (part of stage 3 proposal).
* [JSX](https://facebook.github.io/react/docs/introducing-jsx.html) syntax.
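A few of these features side by side, in plain JavaScript:

```javascript
// Object spread/rest, the exponentiation operator, and class fields in use.
const base = { a: 1, b: 2 };
const extended = { ...base, c: 3 };  // spread: copy base, then add c
const { a, ...rest } = extended;     // rest: pull out a, keep the remainder
const squared = 3 ** 2;              // exponentiation operator (ES2016)

class Counter {
  count = 0;                         // class field
  increment = () => {                // arrow bound to the instance
    this.count += 1;
  };
}
```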
Learn more about [different proposal stages](https://babeljs.io/docs/plugins/#presets-stage-x-experimental-presets-).
While we recommend using experimental proposals with some caution, Facebook heavily uses these features in the product code, so we intend to provide [codemods](https://medium.com/@cpojer/effective-javascript-codemods-5a6686bb46fb) if any of these proposals change in the future.
Note that **the project only includes a few ES6 [polyfills](https://en.wikipedia.org/wiki/Polyfill)**:
* [`Object.assign()`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Object/assign) via [`object-assign`](https://github.com/sindresorhus/object-assign).
* [`Promise`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) via [`promise`](https://github.com/then/promise).
If you use any other ES6+ features that need **runtime support** (such as `Array.from()` or `Symbol`), make sure you are including the appropriate polyfills manually, or that the browsers you are targeting already support them.
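For example, a minimal runtime guard for one such feature (a sketch; in a real project you would more likely ship a proper polyfill instead):

```javascript
// Check for Array.from before relying on it, with a crude fallback.
const supportsArrayFrom = typeof Array.from === 'function';

function toArray(arrayLike) {
  return supportsArrayFrom
    ? Array.from(arrayLike)
    : [].slice.call(arrayLike); // works for array-likes, not all iterables
}
```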
## Syntax Highlighting in the Editor
To configure the syntax highlighting in your favorite text editor, head to the [relevant Babel documentation page](https://babeljs.io/docs/editors) and follow the instructions. Some of the most popular editors are covered.
## Debugging in the Editor
**This feature is currently only supported by [Visual Studio Code](https://code.visualstudio.com) and [WebStorm](https://www.jetbrains.com/webstorm/).**
Visual Studio Code and WebStorm support debugging out of the box with Create React App. This enables you as a developer to write and debug your React code without leaving the editor, and most importantly it enables you to have a continuous development workflow, where context switching is minimal, as you don’t have to switch between tools.
### Visual Studio Code
You would need to have the latest version of [VS Code](https://code.visualstudio.com) and VS Code [Chrome Debugger Extension](https://marketplace.visualstudio.com/items?itemName=msjsdiag.debugger-for-chrome) installed.
Then add the block below to your `launch.json` file and put it inside the `.vscode` folder in your app’s root directory.
```json
{
"version": "0.2.0",
"configurations": [{
"name": "Chrome",
"type": "chrome",
"request": "launch",
"url": "http://localhost:3000",
"webRoot": "${workspaceRoot}/src",
"sourceMapPathOverrides": {
"webpack:///src/*": "${webRoot}/*"
}
}]
}
```
>Note: the URL may be different if you've made adjustments via the [HOST or PORT environment variables](#advanced-configuration).
Start your app by running `npm start`, and start debugging in VS Code by pressing `F5` or by clicking the green debug icon. You can now write code, set breakpoints, make changes to the code, and debug your newly modified code—all from your editor.
Having problems with VS Code Debugging? Please see their [troubleshooting guide](https://github.com/Microsoft/vscode-chrome-debug/blob/master/README.md#troubleshooting).
### WebStorm
You would need to have [WebStorm](https://www.jetbrains.com/webstorm/) and [JetBrains IDE Support](https://chrome.google.com/webstore/detail/jetbrains-ide-support/hmhgeddbohgjknpmjagkdomcpobmllji) Chrome extension installed.
In the WebStorm menu `Run` select `Edit Configurations...`. Then click `+` and select `JavaScript Debug`. Paste `http://localhost:3000` into the URL field and save the configuration.
>Note: the URL may be different if you've made adjustments via the [HOST or PORT environment variables](#advanced-configuration).
Start your app by running `npm start`, then press `^D` on macOS or `F9` on Windows and Linux or click the green debug icon to start debugging in WebStorm.
The same way you can debug your application in IntelliJ IDEA Ultimate, PhpStorm, PyCharm Pro, and RubyMine.
## Changing the Page `<title>`
You can find the source HTML file in the `public` folder of the generated project. You may edit the `<title>` tag in it to change the title from “React App” to anything else.
Note that normally you wouldn’t edit files in the `public` folder very often. For example, [adding a stylesheet](#adding-a-stylesheet) is done without touching the HTML.
If you need to dynamically update the page title based on the content, you can use the browser [`document.title`](https://developer.mozilla.org/en-US/docs/Web/API/Document/title) API. For more complex scenarios when you want to change the title from React components, you can use [React Helmet](https://github.com/nfl/react-helmet), a third party library.
If you use a custom server for your app in production and want to modify the title before it gets sent to the browser, you can follow advice in [this section](#generating-dynamic-meta-tags-on-the-server). Alternatively, you can pre-build each page as a static HTML file which then loads the JavaScript bundle, which is covered [here](#pre-rendering-into-static-html-files).
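As a sketch, the dynamic-title logic can be factored into a small helper (the title format and names here are illustrative, not an API):

```javascript
// Compose a page title from the app name and the current section.
function formatTitle(appName, section) {
  return section ? `${section} - ${appName}` : appName;
}

// Applied from a component, e.g. in componentDidMount:
//   document.title = formatTitle('My App', this.props.section);
```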
## Installing a Dependency
The generated project includes React and ReactDOM as dependencies. It also includes a set of scripts used by Create React App as a development dependency. You may install other dependencies (for example, React Router) with `npm`:
```sh
npm install --save react-router
```
Alternatively you may use `yarn`:
```sh
yarn add react-router
```
This works for any library, not just `react-router`.
## Importing a Component
This project setup supports ES6 modules thanks to Babel.<br>
While you can still use `require()` and `module.exports`, we encourage you to use [`import` and `export`](http://exploringjs.com/es6/ch_modules.html) instead.
For example:
### `Button.js`
```js
import React, { Component } from 'react';
export class Button extends Component {
render() {
// ...
}
}
```
### `DangerButton.js`
```js
import React, { Component } from 'react';
import { Button } from './Button'; // Import a component from another file
export class DangerButton extends Component {
render() {
return <Button color="red" />;
}
}
```
We suggest that you stick to using default imports and exports when a module only exports a single thing (for example, a component). That’s what you get when you use `export default Button` and `import Button from './Button'`.
Named exports are useful for utility modules that export several functions. A module may have at most one default export and as many named exports as you like.
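For instance, a small utility module with named exports might look like this (the helpers are made up for illustration; the `export`/`import` lines are shown as comments):

```javascript
// utils.js: several helpers, each exported by name.
function capitalize(word) {
  return word.charAt(0).toUpperCase() + word.slice(1);
}

function formatDate(date) {
  return date.toISOString().slice(0, 10); // YYYY-MM-DD
}

// export { capitalize, formatDate };       // named exports
// import { capitalize } from './utils';    // consumers import what they need
```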
Learn more about ES6 modules:
* [When to use the curly braces?](http://stackoverflow.com/questions/36795819/react-native-es-6-when-should-i-use-curly-braces-for-import/36796281#36796281)
* [Exploring ES6: Modules](http://exploringjs.com/es6/ch_modules.html)
* [Understanding ES6: Modules](https://leanpub.com/understandinges6/read#leanpub-auto-encapsulating-code-with-modules)
## Code Splitting
Instead of downloading the entire app before users can use it, code splitting allows you to split your code into small chunks which you can then load on demand.
This project setup supports code splitting via [dynamic `import()`](http://2ality.com/2017/01/import-operator.html#loading-code-on-demand). Its [proposal](https://github.com/tc39/proposal-dynamic-import) is in stage 3. The `import()` function-like form takes the module name as an argument and returns a [`Promise`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) which always resolves to the namespace object of the module.
Here is an example:
### `moduleA.js`
```js
const moduleA = 'Hello';
export { moduleA };
```
### `App.js`
```js
import React, { Component } from 'react';
class App extends Component {
handleClick = () => {
import('./moduleA')
.then(({ moduleA }) => {
// Use moduleA
})
.catch(err => {
// Handle failure
});
};
render() {
return (
<div>
<button onClick={this.handleClick}>Load</button>
</div>
);
}
}
export default App;
```
This will make `moduleA.js` and all its unique dependencies as a separate chunk that only loads after the user clicks the 'Load' button.
You can also use it with `async` / `await` syntax if you prefer it.
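With `async` / `await`, the same handler is a direct rewrite of the example above (nothing new assumed):

```js
import React, { Component } from 'react';

class App extends Component {
  handleClick = async () => {
    try {
      const { moduleA } = await import('./moduleA');
      // Use moduleA
    } catch (err) {
      // Handle failure
    }
  };

  render() {
    return (
      <div>
        <button onClick={this.handleClick}>Load</button>
      </div>
    );
  }
}

export default App;
```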
### With React Router
If you are using React Router check out [this tutorial](http://serverless-stack.com/chapters/code-splitting-in-create-react-app.html) on how to use code splitting with it. You can find the companion GitHub repository [here](https://github.com/AnomalyInnovations/serverless-stack-demo-client/tree/code-splitting-in-create-react-app).
Also check out the [Code Splitting](https://reactjs.org/docs/code-splitting.html) section in React documentation.
## Adding a Stylesheet
This project setup uses [Webpack](https://webpack.js.org/) for handling all assets. Webpack offers a custom way of “extending” the concept of `import` beyond JavaScript. To express that a JavaScript file depends on a CSS file, you need to **import the CSS from the JavaScript file**:
### `Button.css`
```css
.Button {
padding: 20px;
}
```
### `Button.js`
```js
import React, { Component } from 'react';
import './Button.css'; // Tell Webpack that Button.js uses these styles
class Button extends Component {
render() {
// You can use them as regular CSS styles
return <div className="Button" />;
}
}
```
**This is not required for React** but many people find this feature convenient. You can read about the benefits of this approach [here](https://medium.com/seek-ui-engineering/block-element-modifying-your-javascript-components-d7f99fcab52b). However you should be aware that this makes your code less portable to other build tools and environments than Webpack.
In development, expressing dependencies this way allows your styles to be reloaded on the fly as you edit them. In production, all CSS files will be concatenated into a single minified `.css` file in the build output.
If you are concerned about using Webpack-specific semantics, you can put all your CSS right into `src/index.css`. It would still be imported from `src/index.js`, but you could always remove that import if you later migrate to a different build tool.
## Post-Processing CSS
This project setup minifies your CSS and adds vendor prefixes to it automatically through [Autoprefixer](https://github.com/postcss/autoprefixer) so you don’t need to worry about it.
For example, this:
```css
.App {
display: flex;
flex-direction: row;
align-items: center;
}
```
becomes this:
```css
.App {
display: -webkit-box;
display: -ms-flexbox;
display: flex;
-webkit-box-orient: horizontal;
-webkit-box-direction: normal;
-ms-flex-direction: row;
flex-direction: row;
-webkit-box-align: center;
-ms-flex-align: center;
align-items: center;
}
```
If you need to disable autoprefixing for some reason, [follow this section](https://github.com/postcss/autoprefixer#disabling).
## Adding a CSS Preprocessor (Sass, Less etc.)
Generally, we recommend that you don’t reuse the same CSS classes across different components. For example, instead of using a `.Button` CSS class in `<AcceptButton>` and `<RejectButton>` components, we recommend creating a `<Button>` component with its own `.Button` styles, that both `<AcceptButton>` and `<RejectButton>` can render (but [not inherit](https://facebook.github.io/react/docs/composition-vs-inheritance.html)).
Following this rule often makes CSS preprocessors less useful, as features like mixins and nesting are replaced by component composition. You can, however, integrate a CSS preprocessor if you find it valuable. In this walkthrough, we will be using Sass, but you can also use Less, or another alternative.
First, let’s install the command-line interface for Sass:
```sh
npm install --save node-sass-chokidar
```
Alternatively you may use `yarn`:
```sh
yarn add node-sass-chokidar
```
Then in `package.json`, add the following lines to `scripts`:
```diff
"scripts": {
+ "build-css": "node-sass-chokidar src/ -o src/",
+ "watch-css": "npm run build-css && node-sass-chokidar src/ -o src/ --watch --recursive",
"start": "react-scripts start",
"build": "react-scripts build",
"test": "react-scripts test --env=jsdom",
```
>Note: To use a different preprocessor, replace `build-css` and `watch-css` commands according to your preprocessor’s documentation.
Now you can rename `src/App.css` to `src/App.scss` and run `npm run watch-css`. The watcher will find every Sass file in `src` subdirectories, and create a corresponding CSS file next to it, in our case overwriting `src/App.css`. Since `src/App.js` still imports `src/App.css`, the styles become a part of your application. You can now edit `src/App.scss`, and `src/App.css` will be regenerated.
To share variables between Sass files, you can use Sass imports. For example, `src/App.scss` and other component style files could include `@import "./shared.scss";` with variable definitions.
To enable importing files without using relative paths, you can add the `--include-path` option to the command in `package.json`.
```
"build-css": "node-sass-chokidar --include-path ./src --include-path ./node_modules src/ -o src/",
"watch-css": "npm run build-css && node-sass-chokidar --include-path ./src --include-path ./node_modules src/ -o src/ --watch --recursive",
```
This will allow you to do imports like
```scss
@import 'styles/_colors.scss'; // assuming a styles directory under src/
@import 'nprogress/nprogress'; // importing a css file from the nprogress node module
```
At this point you might want to remove all CSS files from the source control, and add `src/**/*.css` to your `.gitignore` file. It is generally a good practice to keep the build products outside of the source control.
As a final step, you may find it convenient to run `watch-css` automatically with `npm start`, and run `build-css` as a part of `npm run build`. You can use the `&&` operator to execute two scripts sequentially. However, there is no cross-platform way to run two scripts in parallel, so we will install a package for this:
```sh
npm install --save npm-run-all
```
Alternatively you may use `yarn`:
```sh
yarn add npm-run-all
```
Then we can change `start` and `build` scripts to include the CSS preprocessor commands:
```diff
"scripts": {
"build-css": "node-sass-chokidar src/ -o src/",
"watch-css": "npm run build-css && node-sass-chokidar src/ -o src/ --watch --recursive",
- "start": "react-scripts start",
- "build": "react-scripts build",
+ "start-js": "react-scripts start",
+ "start": "npm-run-all -p watch-css start-js",
+ "build-js": "react-scripts build",
+ "build": "npm-run-all build-css build-js",
"test": "react-scripts test --env=jsdom",
"eject": "react-scripts eject"
}
```
Now running `npm start` and `npm run build` also builds Sass files.
**Why `node-sass-chokidar`?**
`node-sass` has been reported as having the following issues:
- `node-sass --watch` has been reported to have *performance issues* in certain conditions when used in a virtual machine or with Docker.
- Infinite styles compiling [#1939](https://github.com/facebookincubator/create-react-app/issues/1939)
- `node-sass` has been reported as having issues with detecting new files in a directory [#1891](https://github.com/sass/node-sass/issues/1891)
`node-sass-chokidar` is used here as it addresses these issues.
## Adding Images, Fonts, and Files
With Webpack, using static assets like images and fonts works similarly to CSS.
You can **`import` a file right in a JavaScript module**. This tells Webpack to include that file in the bundle. Unlike CSS imports, importing a file gives you a string value. This value is the final path you can reference in your code, e.g. as the `src` attribute of an image or the `href` of a link to a PDF.
To reduce the number of requests to the server, importing images that are less than 10,000 bytes returns a [data URI](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs) instead of a path. This applies to the following file extensions: bmp, gif, jpg, jpeg, and png. SVG files are excluded due to [#1153](https://github.com/facebookincubator/create-react-app/issues/1153).
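As a rough, language-agnostic illustration of what the bundler does here (this is not Webpack's actual code; the threshold, helper name, and output path are made up for the sketch), small files are inlined as base64 data URIs while larger ones get a hashed URL:

```python
import base64

THRESHOLD = 10_000  # bytes; mirrors the 10,000-byte limit described above

def to_asset_url(filename, data, mime="image/png"):
    """Return a data URI for small assets, or a plain path for large ones."""
    if len(data) <= THRESHOLD:
        payload = base64.b64encode(data).decode("ascii")
        return f"data:{mime};base64,{payload}"
    return f"/static/media/{filename}"

tiny = b"\x89PNG fake bytes"                 # pretend image content
print(to_asset_url("logo.png", tiny)[:22])   # data:image/png;base64,
```

The real bundler applies the same idea at build time, so no extra HTTP request is needed for small images.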
Here is an example:
```js
import React from 'react';
import logo from './logo.png'; // Tell Webpack this JS file uses this image
function Header() {
  // The import result is the URL of the image once Webpack has
  // processed it, e.g. /static/media/logo.84287d09.png
  return <img src={logo} alt="Logo" />;
}
export default Header;
```
This ensures that when the project is built, Webpack will correctly move the images into the build folder, and provide us with correct paths.
This works in CSS too:
```css
.Logo {
  background-image: url(./logo.png);
}
```
Webpack finds all relative module references in CSS (they start with `./`) and replaces them with the final paths from the compiled bundle. If you make a typo or accidentally delete an important file, you will see a compilation error, just like when you import a non-existent JavaScript module. The final filenames in the compiled bundle are generated by Webpack from content hashes. If the file content changes in the future, Webpack will give it a different name in production so you don’t need to worry about long-term caching of assets.
Please be advised that this is also a custom feature of Webpack.
**It is not required for React** but many people enjoy it (and React Native uses a similar mechanism for images).<br>
An alternative way of handling static assets is described in the next section.
### Changing the HTML
The `public` folder contains the HTML file so you can tweak it, for example, to [set the page title](#changing-the-page-title).
The `<script>` tag with the compiled code will be added to it automatically during the build process.
### Adding Assets Outside of the Module System
You can also add other assets to the `public` folder.
Note that we normally encourage you to `import` assets in JavaScript files instead.
For example, see the sections on [adding a stylesheet](#adding-a-stylesheet) and [adding images and fonts](#adding-images-fonts-and-files).
This mechanism provides a number of benefits:
* Scripts and stylesheets get minified and bundled together to avoid extra network requests.
* Missing files cause compilation errors instead of 404 errors for your users.
* Result filenames include content hashes so you don’t need to worry about browsers caching their old versions.
However there is an **escape hatch** that you can use to add an asset outside of the module system.
If you put a file into the `public` folder, it will **not** be processed by Webpack. Instead it will be copied into the build folder untouched. To reference assets in the `public` folder, you need to use a special variable called `PUBLIC_URL`.
Inside `index.html`, you can use it like this:
```html
<link rel="shortcut icon" href="%PUBLIC_URL%/favicon.ico">
```
Only files inside the `public` folder will be accessible via the `%PUBLIC_URL%` prefix. If you need to use a file from `src` or `node_modules`, you’ll have to copy it there to explicitly specify your intention to make this file a part of the build.
When you run `npm run build`, Create React App will substitute `%PUBLIC_URL%` with a correct absolute path so your project works even if you use client-side routing or host it at a non-root URL.
In JavaScript code, you can use `process.env.PUBLIC_URL` for similar purposes:
```js
render() {
// Note: this is an escape hatch and should be used sparingly!
// Normally we recommend using `import` for getting asset URLs
// as described in “Adding Images and Fonts” above this section.
return <img src={process.env.PUBLIC_URL + '/img/logo.png'} />;
}
```
Keep in mind the downsides of this approach:
* None of the files in `public` folder get post-processed or minified.
* Missing files will not be caught at compilation time, and will cause 404 errors for your users.
* Result filenames won’t include content hashes so you’ll need to add query arguments or rename them every time they change.
### When to Use the `public` Folder
Normally we recommend importing [stylesheets](#adding-a-stylesheet), [images, and fonts](#adding-images-fonts-and-files) from JavaScript.
The `public` folder is useful as a workaround for a number of less common cases:
* You need a file with a specific name in the build output, such as [`manifest.webmanifest`](https://developer.mozilla.org/en-US/docs/Web/Manifest).
* You have thousands of images and need to dynamically reference their paths.
* You want to include a small script like [`pace.js`](http://github.hubspot.com/pace/docs/welcome/) outside of the bundled code.
* Some library may be incompatible with Webpack and you have no other option but to include it as a `<script>` tag.
Note that if you add a `<script>` that declares global variables, you also need to read the next section on using them.
### Expanding Environment Variables In `.env`
Expand variables already on your machine for use in your `.env` file (using [dotenv-expand](https://github.com/motdotla/dotenv-expand)).
Or expand variables local to the current `.env` file:
```
DOMAIN=www.example.com
FOO=$DOMAIN/foo
BAR=$DOMAIN/bar
```
### Adding a Router
Create React App doesn't prescribe a specific routing solution, but [React Router](https://reacttraining.com/react-router) is the most popular one.
To add it, run:
```sh
npm install --save react-router-dom
```
Alternatively you may use yarn:
```sh
yarn add react-router-dom
```
To try it, delete all the code in `src/App.js` and replace it with any of the examples on its website. [The Basic Example](https://reacttraining.com/react-router/web/example/basic) is a good place to get started.
Note that [you may need to configure your production server to support client-side routing](#serving-apps-with-client-side-routing) before deploying your app.
# crude
- Word: crude
- Story: Is it gross? Unsophisticated? Totally tasteless and positively offensive? Then you can describe it as crude.
- Story: Crude is not rude when it’s used to describe unprocessed oil, which it first was associated with in 1865. From the Latin crudo, meaning "rough, raw," crude today can be used to describe anything or anyone that's unrefined and rough around the edges. So don’t be too insulted by the large-nosed portrait that cartoonist made for you. It’s just a quick, crude drawing meant to be funny.
## adjective
- Meaning: simply made, not showing much skill or attention to detail
- Chinese: 粗糙的;粗制的
- Tags:
- Eg.: a crude drawing of a face 脸部的略图
- Eg.: crude oil/metal 原油;未经提炼的金属
# Internationalization
### Introduction
Ngx-Vant uses Chinese as the default language and also supports switching between multiple languages. Follow the tutorial below to set up internationalization.
## Usage
### Specify the default language in `app.module.ts`
```js
import { BrowserModule } from '@angular/platform-browser';
import { NgModule } from '@angular/core';
import { AppRoutingModule } from './app-routing.module';
import { AppComponent } from './app.component';
import { Vant18nModule, VANT_I18N, zh_CN } from 'ngx-vant/i18n';
@NgModule({
declarations: [AppComponent],
imports: [BrowserModule, AppRoutingModule],
providers: [{ provide: VANT_I18N, useValue: zh_CN }],
bootstrap: [AppComponent]
})
export class AppModule {}
```
### Changing the locale at runtime
`Ngx-Vant` provides the `VantI18nService` service for changing the locale dynamically:
```js
import { Component } from '@angular/core';
import { en_US, zh_CN } from 'ngx-vant/i18n';
import { VantI18nService } from 'ngx-vant/i18n';
@Component({
selector: 'app',
templateUrl: './demo.component.html'
})
export class AppComponent {
constructor(
private vantI18n: VantI18nService,
) { }
    onSwitchLang(lang) {
        this.vantI18n.setLocale(lang); // e.g. en_US or zh_CN
    }
}
```
### Language packs
Currently supported languages:
| Language | Filename |
| -------------- | ------------ |
| Simplified Chinese | zh-CN |
| Traditional Chinese (Hong Kong) | zh-HK |
| Traditional Chinese (Taiwan) | zh-TW |
| English | en-US |
| German | de-DE |
| German (formal) | de-DE-formal |
| Turkish | tr-TR |
| Spanish | es-ES |
| Japanese | ja-JP |
| Romanian | ro-RO |
| Norwegian | nb-NO |
> See [here](https://github.com/youzan/vant/tree/dev/src/locale/lang) for all the language pack source files.
# PAN Based Door Security System
# Rust/Rusoto SNS Examples
# CHANGELOG
## 1.0.2 - 2018-13
Slightly better precision [#17](https://github.com/mapado/haversine/pull/17)
## 1.0.1 - 2018-10-10
Fix wrong definition in `setup.py`
## 1.0.0 - 2018-10-10
No changes; the haversine package has been stable and functional for years now. Time to make a 1.0 version :)
(In fact there is one breaking change, but it concerns the 0.5.0 version published just 10 minutes before 1.0 ;). In that case, `nmiles` has been changed to `nautical_miles` for readability.)
## 0.5.0 - 2018-10-10
Add nautical miles support
## 0.4.6
Fixed typos in README and docstring
## 0.4.5
Fixed issue with int instead of float [#6](https://github.com/mapado/haversine/pull/6/files)
## 0.4.3 - 0.4.4
- Remove useless code [#5](https://github.com/mapado/haversine/pull/5)
## 0.4.2
- Remove cPython usage: fail on Windows, no real perf gain [5d0ff179](https://github.com/mapado/haversine/commit/5d0ff179741b8417965d94dcb21f39ddbce674f8)
---
permalink: /session/keynote1
redirect_from: "session/keynote1"
layout: egsr-talks
title: "Keynote: TBC"
authors: "<b>TBC</b>"
start: "2022-07-01T10:00:00Z"
end: "2022-07-01T11:00:00Z"
session_id: 7
# youtube_url: "https://youtu.be/lbZBRp6Gn20"
# rc_link: "https://rc.egsr2020.london/channel/SR_08_keynote"
year: 2022
color: '#006400'
---
Homework File: Cloud Security
## Background
During the last week, you created a highly available web server for XCorp's Red Team to use for testing and training.
Your lead cloud administrator has asked for a diagram of the Network you created to keep for documentation and company records.
Your task: Use draw.io to create a detailed diagram of your cloud infrastructure.
## Cloud Recap
When you're finished completing all the activities in cloud week, you should have:
- A total of 3 VMs running DVWA.
- All 3 VMs receiving traffic from your load balancer.
If you did not set up the 3rd (optional) VM, you should have:
- A total of 2 VMs running DVWA.
- Both VMs receiving traffic from your load balancer.
You can complete this homework with either 2 or 3 VMs.
## Your Goal
When you are finished with this assignment, you should have a network diagram that shows your entire cloud setup, including your Ansible jump box and the Docker containers running on each VM.
This document can be used as part of a portfolio to demonstrate your ability.
## Instructions
Use a free account at draw.io to diagram the entire cloud network you have created.
- Your diagram should show the following:
  - Azure resource group
  - Virtual network with IP address range
  - Subnet range
  - Flow of specific traffic (e.g., HTTP, SSH)
  - Security group blocking traffic
  - Load balancer
  - All 4 VMs that you have launched
  - Where Docker and Ansible are deployed
© 2020 Trilogy Education Services, a 2U, Inc. brand. All Rights Reserved.
# Navigation PlayGround
#### How to run
Start the server:
```sh
yarn install or npm install
yarn start or npm start
react-native link react-native-vector-icons
```
***iOS***
```sh
sbt ~ios:dev
# open a new terminal
react-native run-ios
```
***Android***
```sh
sbt ~android:dev
# start an Android emulator or connect a device
# open a new terminal
react-native run-android
```
## Changelog for package rmf_demos_bridges
1.4.0 (2022-02-14)
------------------
* Provide an example adapter for a delivery robot with indoor-outdoor capabilities [#121](https://github.com/open-rmf/rmf_demos/pull/121)
# Collective Matrix Factorization
Deleted the changed file, as the code should belong to my intern company.
Many thanks to david-cortes!
The original library uses an explicit-feedback loss function, which treats the ratings in the matrix as the users' preferences. However, in many cases the ratings are not necessarily related to preference. For example, we may buy 10 pairs of socks while only buying one smartphone. So in his paper http://yifanhu.net/PUB/cf.pdf , Hu argues that the value of a rating (in other words, the number of purchases) indicates the confidence that the user likes this item, rather than how much they like it.
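As a rough sketch of that idea (this is not the library's code; the array shapes, `alpha`, and function name are made up for illustration), the confidence-weighted implicit-feedback loss from Hu's paper binarizes the raw counts into preferences and weights each squared error by a confidence term:

```python
import numpy as np

def confidence_weighted_loss(R, A, B, alpha=40.0, reg=0.1):
    """Implicit-feedback loss in the spirit of Hu et al. (2008).

    R holds raw counts (e.g. number of purchases). Preference P is a
    binarized version of R, and confidence C = 1 + alpha * R controls
    how strongly each entry is trusted.
    """
    P = (R > 0).astype(float)      # preference: did the user interact at all?
    Conf = 1.0 + alpha * R         # confidence grows with the count
    err = Conf * (P - A @ B.T) ** 2
    return err.sum() + reg * ((A ** 2).sum() + (B ** 2).sum())

rng = np.random.default_rng(3)
R = rng.poisson(0.5, size=(6, 8)).astype(float)   # toy purchase counts
A, B = rng.normal(size=(6, 2)), rng.normal(size=(8, 2))
print(confidence_weighted_loss(R, A, B) >= 0)     # True
```

Note this weights every entry (including zeros), which is what distinguishes the implicit-feedback formulation from the explicit one described below.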
# Usage
After installing this library with `pip install cmfrec`, replace the original `__init__.py` with the version from this repository.
Python implementation of collective matrix factorization, based on the paper _Relational learning via collective matrix factorization (A. Singh, 2008)_, with some enhancements. This is a collaborative filtering model for recommender systems that takes as input explicit item ratings and side information about users and/or items. The overall idea was extended here to also be able to do cold-start recommendations (of users and items that were not in the training data). Implementation is done in TensorFlow and optimization is with L-BFGS.
The extended version of the paper ("Relational learning via Collective Matrix Factorization") can be downloaded here:
[http://ra.adm.cs.cmu.edu/anon/usr/ftp/ml2008/CMU-ML-08-109.pdf](http://ra.adm.cs.cmu.edu/anon/usr/ftp/ml2008/CMU-ML-08-109.pdf)
## Basic model
The model consists of predicting the rating that a user would give to an item by performing a low-rank matrix factorization of explicit ratings `X~=AB'`, using side information about the items (such as movie tags) and/or users (such as their demographic info) by also factorizing the item side info matrix and/or the user side info matrix `X~=AB', U~=AC', I~=BD'`, sharing the same item/user-factor matrix used to factorize the ratings, or sharing only some of the latent factors.
This also has the side effect of allowing to recommend for users and items for which there is side information but no ratings, although these predictions might not be as high quality.
For more details on the formulas, see the IPython notebook example.
## Installation
The package is available on PyPI and can be installed with:
```pip install cmfrec```
## Usage
```python
import pandas as pd, numpy as np
from cmfrec import CMF
## simulating some movie ratings
nusers = 10**2
nitems = 10**2
nobs = 10**3
np.random.seed(1)
ratings = pd.DataFrame({
'UserId' : np.random.randint(nusers, size=nobs),
'ItemId' : np.random.randint(nitems, size=nobs),
'Rating' : np.random.randint(low=1, high=6, size=nobs)
})
## random product side information
user_dim = 10
user_attributes = pd.DataFrame(np.random.normal(size=(nusers, user_dim)))
user_attributes['UserId'] = np.arange(nusers)
user_attributes = user_attributes.sample(int(nusers/2), replace=False)
item_dim = 5
item_attributes = pd.DataFrame(np.random.normal(size=(nitems, item_dim)))
item_attributes['ItemId'] = np.arange(nitems)
item_attributes = item_attributes.sample(int(nitems/2), replace=False)
# fitting a model and making some recommendations
recommender = CMF(k=20, k_main=3, k_user=2, k_item=1, reg_param=1e-4)
recommender.fit(ratings=ratings, user_info=user_attributes, item_info=item_attributes,
cols_bin_user=None, cols_bin_item=None)
recommender.topN(user=4, n=10)
recommender.topN_cold(attributes=np.random.normal(size=user_dim), n=10)
recommender.predict(user=0, item=0)
recommender.predict(user=[0,0,1], item=[0,1,0])
# adding more users and items without refitting
recommender.add_item(new_id=10**3, attributes=np.random.normal(size=item_dim), reg='auto')
recommender.add_user(new_id=10**3, attributes=np.random.normal(size=user_dim), reg=1e-3)
recommender.topN(10**3)
recommender.predict(user=10**3, item=10**3)
## full constructor call
recommender = CMF(k=30, k_main=0, k_user=0, k_item=0,
w_main=1.0, w_user=1.0, w_item=1.0, reg_param=1e-4,
offsets_model=False, nonnegative=False, maxiter=1000,
standardize_err=True, reweight=False, reindex=True,
center_ratings=True, add_user_bias=True, add_item_bias=True,
center_user_info=False, center_item_info=False,
user_info_nonneg=False, item_info_nonneg=False,
keep_data=True, save_folder=None, produce_dicts=True,
random_seed=None, verbose=True)
```
Users and items are internally reindexed, so you can use strings or non-consecutive numbers as IDs when passing data to the `CMF` object's methods.
For more details see the online documentation.
## Documentation
Documentation is available at readthedocs: [http://cmfrec.readthedocs.io/en/latest/](http://cmfrec.readthedocs.io/en/latest/)
It is also internally documented through docstrings (e.g. you can try `help(cmfrec.CMF))`, `help(cmfrec.CMF.fit)`, etc.
For a detailed example using the MovieLens data with user demographic info and movie genres see the IPython notebook [Collaborative filtering with side information](http://nbviewer.jupyter.org/github/david-cortes/cmfrec/blob/master/example/cmfrec_movielens_sideinfo.ipynb).
## Model details
By default, the function to minimize is as follows:
```
L(A,B,C,D) = w_main*sum_sq(X-AB' - Ubias_{u} - Ibias_{i})/|X| + w_user*sum_sq(U-AC')/|U| + w_item*sum_sq(I-BD')/|I| + regl.
regl. = reg*(norm(A)^2+norm(B)^2+norm(C)^2+norm(D)^2)
```
Where:
* X is the ratings matrix (considering only non-missing entries).
* I is the item-attribute matrix (only supports dense inputs).
* U is the user-attribute matrix (only supports dense inputs).
* A, B, C, D are lower-dimensional matrices (the model parameters).
* |X|, |I|, |U| are the number of non-missing entries in each matrix.
* The matrix products might not use all the rows/columns of these matrices at each factorization
(this is controlled with k_main, k_item and k_user).
* Ubias_{u} and Ibias_{i} are user and item biases, defined for each user and item ID.
Additionally, for binary columns (having only 0/1 values), it can apply a sigmoid function to force them to be between zero and one. The loss with respect to the real value is still squared loss though (`(sigmoid(pred) - real)^2`), so as to put it in the same scale as the rest of the variables.
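Putting the pieces together, a toy NumPy version of this objective (dense matrices with `NaN` marking missing entries, biases and the sigmoid for binary columns omitted for brevity; all shapes and names here are illustrative, not the package's internal code) might look like:

```python
import numpy as np

def cmf_loss(X, U, I_attr, A, B, C, D, reg=1e-4,
             w_main=1.0, w_user=1.0, w_item=1.0):
    """Sketch of the collective-MF objective for dense inputs.

    Missing entries are NaN and excluded from each sum, matching the
    "non-missing entries only" convention in the formula above.
    """
    def masked_mse(M, pred):
        mask = ~np.isnan(M)
        return ((M[mask] - pred[mask]) ** 2).sum() / mask.sum()

    return (w_main * masked_mse(X, A @ B.T)
            + w_user * masked_mse(U, A @ C.T)
            + w_item * masked_mse(I_attr, B @ D.T)
            + reg * sum((M ** 2).sum() for M in (A, B, C, D)))

rng = np.random.default_rng(0)
k, n_users, n_items, u_dim, i_dim = 3, 5, 6, 4, 7
A = rng.normal(size=(n_users, k)); B = rng.normal(size=(n_items, k))
C = rng.normal(size=(u_dim, k));   D = rng.normal(size=(i_dim, k))
X = A @ B.T + rng.normal(scale=0.1, size=(n_users, n_items))
X[rng.random(X.shape) < 0.3] = np.nan   # simulate missing ratings
U, I_attr = A @ C.T, B @ D.T            # noiseless side information
print(cmf_loss(X, U, I_attr, A, B, C, D) >= 0.0)  # True
```

The actual package optimizes this kind of objective over `A, B, C, D` jointly with L-BFGS rather than evaluating it on fixed matrices as done here.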
The matrix-products might not use all the rows/columns of these shared latent factor matrices at each factorization (this is controlled with `k_main`, `k_item` and `k_user` in the initialization). Although the package API has arguments for both user and item side information, you can fit the model with only one or none of them.
The model benefits from adding side information about users and items for which there are no ratings, but recommendations about these users and items might not be very good.
Note that, in the simplest case with all factors shared and all matrices weighted the same, the model simplifies to factorizing an extended block matrix `X_ext = [[X, U], [I', .]]`, which can be done using any other matrix factorization library (e.g. pyspark’s `ALS` module).
Alternatively, it can also fit a different model formulation in which, in broad terms, ratings are predicted from latent 'coefficients' based on their attributes, and an offset is added to these coefficients based on the rating history of the users, similar in spirit to [2] (see references at the bottom), but with all-normal distributions:
```
L = sum_sq(X - (A+UC)(B+ID)' - Ubias_{u} - Ibias_{i}) + regl.
```
This second formulation usually takes longer to fit (requires more iterations), and is more recommended if the side information about users and items consists mostly of binary columns, and for cold-start recommendations. To use this alternative formulation, pass `offsets_model=True` to the CMF constructor.
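The prediction rule of this offsets formulation can be sketched with NumPy (all shapes and values below are made-up toy data, not the library's code): the side-information matrices shift the latent factors before the usual inner product.

```python
import numpy as np

rng = np.random.default_rng(2)
n_users, n_items, k = 4, 5, 3
user_dim, item_dim = 6, 2

A = rng.normal(size=(n_users, k))         # free user factors
B = rng.normal(size=(n_items, k))         # free item factors
C = rng.normal(size=(user_dim, k))        # user-attribute coefficients
D = rng.normal(size=(item_dim, k))        # item-attribute coefficients
U = rng.normal(size=(n_users, user_dim))  # user side information
I_attr = rng.normal(size=(n_items, item_dim))

# Offsets model: X ~ (A + U C)(B + I D)'
pred = (A + U @ C) @ (B + I_attr @ D).T
print(pred.shape)  # (4, 5): one predicted rating per user-item pair
```

For a brand-new user with no rating history, `A` contributes nothing, so the prediction reduces to the attribute-driven part `(U C)(B + I D)'`, which is what makes this formulation attractive for cold start.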
For a web-scale implementation (without in-memory data) of the first algorithm see the implementation in Vowpal Wabbit:
[https://github.com/JohnLangford/vowpal_wabbit/wiki/Matrix-factorization-example](https://github.com/JohnLangford/vowpal_wabbit/wiki/Matrix-factorization-example)
## Cold-start recommendations
When using side information about users and/or items, it is possible to add new latent factor vectors based on the factorization of the side information alone. In a low-rank factorization, if one matrix is fixed, the other one can be calculated with a closed-form solution. In this case, the latent factor vectors for new users and items can be calculated with the same updates used in alternating least squares, in such a way that they minimize their error on the side information factorization - as these latent factors are shared with the factorization of the ratings (user feedback), they can at the same time be used for making recommendations.
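A minimal sketch of that closed-form update (hypothetical shapes and names, not the package's internal code): given the fixed attribute-factor matrix `C` and a new user's attribute vector, the new latent vector is the solution of a small ridge-regression system.

```python
import numpy as np

def cold_start_factors(C, u_attrs, reg=1e-3):
    """Closed-form latent factors for a new user given only side info.

    Minimizes ||u_attrs - C a||^2 + reg * ||a||^2 over a, i.e. the
    same normal-equations solve used in an alternating-least-squares
    update. C has shape (n_attributes, k); u_attrs (n_attributes,).
    """
    k = C.shape[1]
    return np.linalg.solve(C.T @ C + reg * np.eye(k), C.T @ u_attrs)

rng = np.random.default_rng(1)
C = rng.normal(size=(10, 4))           # attribute-to-factor matrix
a_true = rng.normal(size=4)
u_new = C @ a_true                     # noiseless attributes of a new user
a_hat = cold_start_factors(C, u_new, reg=1e-8)
print(np.allclose(a_hat, a_true, atol=1e-4))  # recovers the factors: True
```

Because the resulting `a_hat` lives in the same latent space as the factors learned from ratings, it can be plugged directly into the usual `a_hat @ B.T` scoring to rank items for the new user.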
For the alternative 'offsets model' formulation, cold-start recommendations are obtained by a simple vector-matrix product of the attributes with the model coefficients.
You might to tune the model parameters differently when optimizing for cold-start and warm-start recommendations. See the documentation for some tips.
## Implementation Notes
Calculations of objective and gradient are done with TensorFlow, interfacing an external optimizer. Since it allows for easy slight reformulations of the problem, you can easily control small aspects such as whether to divide the error by the number of entries in the matrices or not, whether or not to include user/item biases, etc.
The implementation here is not entirely true to the paper, as the model is fit with full L-BFGS updates rather than stochastic gradient descent or stochastic Newton, i.e. it calculates errors for the whole data and updates all model parameters at once during each iteration. This usually leads to better local optima and less need for parameter tuning, but is not as scalable. The L-BFGS implementation used is from SciPy, which is not entirely multi-threaded, so if your CPU has many cores or you have many GPUs, you can expect an speed up from parallelization on the calculations of the objective function and the gradient, but not so much on the calculations that happen inside L-BFGS.
As a reference point, 1,000 iterations (usually enough to fit a model with high regularization) over the movielens-1M and 1128 movie tags per movie + user demographics takes around 15 minutes in a regular desktop computer and uses around 2GB of RAM.
The package has only been tested under Python 3.
## Some comments
This kind of model requires a lot more hyperparameter tuning than regular low-rank matrix factorization, and fitting a model with badly-tuned parameters might result in worse recommendations compared to discarding the side information. In general, using lower regularization parameters will require more iterations to converge. Also note that, by default, the sums of squared errors in the loss function are divided by the number of entries, so the optimal regularization parameters will likely be very, very small, and the larger the inputs, the smaller the optimal regularization values (you can change this behavior by passing `standardize_err=False`, but be sure to then tune the weights for each factorization).
If your dataset is larger than the MovieLens ratings, adding product side information is unlikely to add more predictive power, but good user side information might still be valuable.
## References
* Singh, Ajit P., and Geoffrey J. Gordon. "Relational learning via collective matrix factorization." Proceedings of the 14th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2008.
* Wang, Chong, and David M. Blei. "Collaborative topic modeling for recommending scientific articles." Proceedings of the 17th ACM SIGKDD international conference on Knowledge discovery and data mining. ACM, 2011.
<img src="img/postgresql-logo.png" title="PostgreSQL" width="150" />
# PostgreSQL
## Table of Contents
* [Legend](#legend)
* [Setup](#setup)
* [Add DB user password to environment variable](#add-db-user-password-to-environment-variable)
* [Add new database and user](#add-new-database-and-user)
## Legend
**psql**
: A terminal-based front-end to PostgreSQL.
## Setup
### Add db user password to environment variable
Add the Postgres user and password to global OS environment variables.
```sh
export POSTGRESQL_USER='YOURUSER' # change this user to your own one !!!
export POSTGRESQL_PASS='YOURPASS' # change this pass to your own one !!!
```
### Add new database and user
database = `jobs`
user = `postgres`
```bash
$ createdb jobs
$ psql jobs
jobs=# CREATE USER postgres WITH PASSWORD 'YOUR_POSTGRES_USER_PASSWORD';
jobs=# ALTER USER postgres WITH SUPERUSER CREATEROLE CREATEDB REPLICATION BYPASSRLS;
jobs=# ALTER DATABASE jobs OWNER TO postgres;
jobs=# \q
$ psql -U $POSTGRESQL_USER -W
```
## Basic usage
Log in to PostgreSQL: `psql postgres` or `psql -U postgres`
### Commands
Without entering `psql` (directly from the shell):
* `psql -c '\du'` - list postgres users with psql command mode
Inside `psql`:
* `\q` - quit
* `\l` - list all databases
* `\du` - list users
| 24 | 101 | 0.709091 | kor_Hang | 0.446833 |
1c89d1ea59da7367632ae34c797421d435a5c6c1 | 618 | md | Markdown | README.md | garethflowers/docker-ftp-server | a67dccdcd12310e44dd5bdb2d06ae26c381ca286 | [
"MIT"
] | 4 | 2020-07-27T14:19:14.000Z | 2022-01-25T16:37:35.000Z | README.md | garethflowers/docker-ftp-server | a67dccdcd12310e44dd5bdb2d06ae26c381ca286 | [
"MIT"
] | 6 | 2020-09-04T00:15:42.000Z | 2021-05-13T21:18:28.000Z | README.md | garethflowers/docker-ftp-server | a67dccdcd12310e44dd5bdb2d06ae26c381ca286 | [
"MIT"
] | 3 | 2021-05-27T13:53:35.000Z | 2022-03-16T06:48:52.000Z | # FTP Server
A simple FTP server, using
[`vsftpd`](https://security.appspot.com/vsftpd.html).
## How to use this image
### Start an FTP Server instance
To start a container with data stored in `/data` on the host, use the
following:
```sh
docker run \
--detach \
--env FTP_PASS=123 \
--env FTP_USER=user \
--name my-ftp-server \
--publish 20-21:20-21/tcp \
--publish 40000-40009:40000-40009/tcp \
--volume /data:/home/user \
garethflowers/ftp-server
```
## License
- This image is released under the
[MIT License](https://raw.githubusercontent.com/garethflowers/docker-ftp-server/master/LICENSE).
| 21.310345 | 100 | 0.699029 | eng_Latn | 0.731188 |
1c8a05ee35865f84ee1a4a15580bcc4898de1ea3 | 1,007 | md | Markdown | .github/ISSUE_TEMPLATE/bug_report.md | muhyun/sagemaker-inference-toolkit | 31c143a8274129c00af7bc1db826e3cfc68ea535 | [
"Apache-2.0"
] | 188 | 2019-06-01T16:43:24.000Z | 2022-03-19T03:44:56.000Z | .github/ISSUE_TEMPLATE/bug_report.md | muhyun/sagemaker-inference-toolkit | 31c143a8274129c00af7bc1db826e3cfc68ea535 | [
"Apache-2.0"
] | 66 | 2019-10-15T00:12:58.000Z | 2022-03-28T21:58:48.000Z | .github/ISSUE_TEMPLATE/bug_report.md | muhyun/sagemaker-inference-toolkit | 31c143a8274129c00af7bc1db826e3cfc68ea535 | [
"Apache-2.0"
] | 68 | 2019-06-20T17:34:43.000Z | 2022-03-13T09:10:52.000Z | ---
name: Bug report
about: File a report to help us reproduce and fix the problem
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To reproduce**
A clear, step-by-step set of instructions to reproduce the bug.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots or logs**
If applicable, add screenshots or logs to help explain your problem.
**System information**
A description of your system.
- Include the version of SageMaker Inference Toolkit you are using.
- If you are using a [prebuilt Amazon SageMaker Docker image](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html), provide the URL.
- If you are using a custom Docker image, provide:
- framework name (e.g. PyTorch)
- framework version
- Python version
- processing unit type (i.e. CPU or GPU)
**Additional context**
Add any other context about the problem here.
| 29.617647 | 179 | 0.743793 | eng_Latn | 0.984106 |
1c8bf36cd87f893e1edffb29ec44a8dc05376d18 | 517 | md | Markdown | README.md | cirocosta/flight_recorder | 77358155c32ef1a5dfd77ccf12cdb7cf1e47acd9 | [
"Apache-2.0"
] | 4 | 2019-05-14T11:02:04.000Z | 2021-12-11T06:00:11.000Z | README.md | cirocosta/flight_recorder | 77358155c32ef1a5dfd77ccf12cdb7cf1e47acd9 | [
"Apache-2.0"
] | 6 | 2019-05-11T00:52:58.000Z | 2020-04-08T16:32:38.000Z | README.md | cirocosta/flight_recorder | 77358155c32ef1a5dfd77ccf12cdb7cf1e47acd9 | [
"Apache-2.0"
] | null | null | null | # flight_recorder
| name | description |
| ---- | ----------- |
| flight_recorder_builds | Number of builds |
| flight_recorder_containers | Number of containers |
| flight_recorder_pipelines | Number of pipelines set |
| flight_recorder_resources | Number of resources configured |
| flight_recorder_teams | Number of teams |
| flight_recorder_workers | Per-state worker count |
| flight_recorder_pg_table_stat | PostgreSQL user table statistics |
| flight_recorder_pg_table_sizes_bytes | PostgreSQL table sizes |
| 36.928571 | 68 | 0.767892 | eng_Latn | 0.914273 |
1c8cb1a9fb46f7280be683045bf079021377e7db | 2,477 | md | Markdown | docs/Read-PredefinedFromList.md | marckassay/GmailFilterUtil | 0a9c9557d972b63486c421d605928743c741d2be | [
"MIT"
] | null | null | null | docs/Read-PredefinedFromList.md | marckassay/GmailFilterUtil | 0a9c9557d972b63486c421d605928743c741d2be | [
"MIT"
] | null | null | null | docs/Read-PredefinedFromList.md | marckassay/GmailFilterUtil | 0a9c9557d972b63486c421d605928743c741d2be | [
"MIT"
] | null | null | null | ---
external help file: GmailFilterUtil-help.xml
Module Name: GmailFilterUtil
online version: https://github.com/marckassay/GmailFilterUtil/blob/0.0.4/docs/Read-PredefinedFromList.md
schema: 2.0.0
---
# Read-PredefinedFromList
## SYNOPSIS
Reads a predefined list of string entries.
## SYNTAX
```
Read-PredefinedFromList [-ListName] <String[]> [[-BaseUri] <String>] [<CommonParameters>]
```
## DESCRIPTION
Currently the only predefined list, using the default `BaseUri` parameter, is `recruiters.us`. That list can be viewed [here](https://raw.githubusercontent.com/marckassay/GmailFilterUtil/master/resources/recruiters.us.txt). The returned value from this function can be used in `Add-FromList` or `Set-FromList`.
## EXAMPLES
### Example 1
```powershell
PS C:\> $RecruiterEntries = Read-PredefinedFromList -ListName 'recruiters.us'
```
Downloads predefined entries for the 'recruiters.us' set.
## PARAMETERS
### -BaseUri
Defaults to:
- <https://raw.githubusercontent.com/marckassay/GmailFilterUtil/master/resources/>
This value is the prefix for `ListName`.
```yaml
Type: String
Parameter Sets: (All)
Aliases:
Required: False
Position: 1
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### -ListName
The name of the txt file that `BaseUri` points to.
```yaml
Type: String[]
Parameter Sets: (All)
Aliases:
Accepted values: recruiters.us
Required: True
Position: 0
Default value: None
Accept pipeline input: False
Accept wildcard characters: False
```
### CommonParameters
This cmdlet supports the common parameters: -Debug, -ErrorAction, -ErrorVariable, -InformationAction, -InformationVariable, -OutVariable, -OutBuffer, -PipelineVariable, -Verbose, -WarningAction, and -WarningVariable. For more information, see about_CommonParameters (http://go.microsoft.com/fwlink/?LinkID=113216).
## INPUTS
### None
## OUTPUTS
### System.String[]
## NOTES
## RELATED LINKS
[Read-PredefinedFromList.ps1](https://github.com/marckassay/GmailFilterUtil/blob/0.0.4/src/list/Read-PredefinedFromList.ps1)
[Read-PredefinedFromList.Tests.ps1](https://github.com/marckassay/GmailFilterUtil/blob/0.0.4/test/list/Read-PredefinedFromList.Tests.ps1)
[`Set-FromList`](https://github.com/marckassay/GmailFilterUtil/blob/0.0.4/docs/Set-FromList.md)
[`Add-FromList`](https://github.com/marckassay/GmailFilterUtil/blob/0.0.4/docs/Add-FromList.md)
| 27.21978 | 317 | 0.73476 | yue_Hant | 0.646704 |
1c8d310def303288e30fb4e25796e2877510ebb1 | 331 | md | Markdown | avd_docs/azure/datalake/AVD-AZU-0036/docs.md | mrtc0/defsec | 181ef1a625a809866966cf33878c6a49be000fa5 | [
"MIT"
] | 42 | 2021-11-30T18:08:25.000Z | 2022-03-29T11:03:37.000Z | avd_docs/azure/datalake/AVD-AZU-0036/docs.md | mrtc0/defsec | 181ef1a625a809866966cf33878c6a49be000fa5 | [
"MIT"
] | 180 | 2021-09-14T08:51:10.000Z | 2022-03-31T08:24:48.000Z | avd_docs/azure/datalake/AVD-AZU-0036/docs.md | mrtc0/defsec | 181ef1a625a809866966cf33878c6a49be000fa5 | [
"MIT"
] | 19 | 2021-11-16T09:19:59.000Z | 2022-03-17T10:28:43.000Z |
### Unencrypted data lake storage.
Datalake storage encryption defaults to Enabled, it shouldn't be overridden to Disabled.
### Impact
Data could be read if compromised
<!-- DO NOT CHANGE -->
{{ remediationActions }}
### Links
- https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-security-overview
| 23.642857 | 90 | 0.725076 | kor_Hang | 0.512653 |
1c8d4e80b7eea5828a87bedbbeccd105e3c9bd9a | 8,413 | md | Markdown | research/nlp/hypertext/README_CN.md | jianzhu/models | cd408a11ac4451fc71235b065390db12e960abe4 | [
"Apache-2.0"
] | null | null | null | research/nlp/hypertext/README_CN.md | jianzhu/models | cd408a11ac4451fc71235b065390db12e960abe4 | [
"Apache-2.0"
] | null | null | null | research/nlp/hypertext/README_CN.md | jianzhu/models | cd408a11ac4451fc71235b065390db12e960abe4 | [
"Apache-2.0"
] | 2 | 2019-09-01T06:17:04.000Z | 2019-10-04T08:39:45.000Z |
# Contents
- [HyperText Overview](#hypertext-overview)
- [Model Architecture](#model-architecture)
- [Dataset](#dataset)
- [TNEWS Dataset](#tnews-dataset)
- [IFLYTEK Dataset](#iflytek-dataset)
- [Environment Requirements](#environment-requirements)
- [Quick Start](#quick-start)
- [Script Description](#script-description)
- [Script and Sample Code](#script-and-sample-code)
- [Script Parameters](#script-parameters)
- [TNEWS](#tnews)
- [IFLYTEK](#iflytek)
- [Training Process](#training-process)
- [Training](#training)
- [Distributed Training](#distributed-training)
- [Evaluation Process](#evaluation-process)
- [Evaluation](#evaluation)
- [Evaluation Performance](#evaluation-performance)
- [HyperText on tnews](#hypertext-on-tnews)
- [HyperText on iflytek](#hypertext-on-iflytek)
- [Export Process](#export-process)
- [ModelZoo Homepage](#modelzoo-homepage)
# HyperText Overview
Natural language data exhibit tree-like hierarchical structures, such as the hypernym-hyponym relations in WordNet. Since hyperbolic space is naturally suited to modeling tree-like hierarchical data, the original authors proposed a new model named HyperText, which endows FastText with hyperbolic geometry to achieve efficient text classification. Experiments show that HyperText outperforms FastText on a range of text classification tasks, with substantially fewer parameters.
Paper: [HyperText: Endowing FastText with Hyperbolic Geometry](https://arxiv.org/abs/2010.16143)
# Model Architecture
HyperText is based on the Poincaré ball model of hyperbolic space. It first uses Poincaré ball embeddings of words or ngrams to capture the latent hierarchical structure in natural language sentences, then uses the Einstein midpoint as the pooling method to emphasize semantically specific words (which carry more information and occur less frequently than common words), and finally uses a Möbius linear transformation as the hyperbolic classifier.

# Dataset
## TNEWS Dataset
Download: [TNEWS](https://bj.bcebos.com/paddlehub-dataset/tnews.tar.gz)
Extract to data/tnews
## IFLYTEK Dataset
Download: [IFLYTEK](https://bj.bcebos.com/paddlehub-dataset/iflytek.tar.gz)
Extract to data/iflytek
```text
│── data
│──iflytek_public # processed dataset
│──iflytek # raw dataset
│──tnews_public # processed dataset
│──tnews # raw dataset
```
# Environment Requirements
- Hardware (Ascend processor)
- Prepare the hardware environment with an Ascend or GPU processor.
- Framework
- [MindSpore](https://gitee.com/mindspore/mindspore)
- For more information about MindSpore, please check the resources below:
- [MindSpore Tutorials](https://www.mindspore.cn/tutorials/zh-CN/master/index.html)
- [MindSpore Python API](https://www.mindspore.cn/docs/api/zh-CN/master/index.html)
# Quick Start
After installing MindSpore via the official website and downloading the dataset, you can follow the steps below for training and evaluation:
- Running in an Ascend or GPU processor environment
```shell
# data processing example
cd ./scripts
bash data_process.sh [DATA_DIR] [OUT_DATA_DIR] [DATASET_TYPE]
# standalone training example
cd ./scripts
bash run_standalone_train.sh [DATASET_DIR] [DATASET_TYPE] [DEVICE]
# evaluation example
cd ./scripts
bash run_eval.sh [DATASET_DIR] [DATASET_TYPE] [MODEL_PATH] [DEVICE]
```
- device should be in ["Ascend", "GPU"].
# Script Description
## Script and Sample Code
```text
│──HyperText
│── README.md # documentation for hypertext
│── scripts
│ │──run_standalone_train.sh # shell script for training
│ │──run_eval.sh # shell script for evaluation
│ │──data_process.sh # shell script for data preprocessing
│ │──run_gpu_distributed_train # shell script for distributed training on GPU
│── output # output files, including saved models, training logs and evaluation logs
│── src
│ │──config.py # parameter configuration
│ │──dataset.py # dataset file
│ │──data_preprocessing.py # dataset preprocessing file
│ │──hypertext.py # model file
│ │──hypertext_train.py # model training file
│ │──math_utils.py # math utilities
│ │──mobius_linear.py # mobius_linear file
│ │──poincare.py # poincare operator file
│ │──radam_optimizer.py # optimizer
│── train.py # training script
│── eval.py # evaluation script
│── create_dataset.py # data preprocessing script
│── export.py # export checkpoint files to air/mindir
```
## Script Parameters
Parameters can be configured in config.py.
### TNEWS
```text
num_epochs 2 # number of epochs
batch_size 32 # batch size
max_length 40 # maximum text length
learning_rate 0.011 # learning rate
embed 20 # embedding dimension
bucket 1500000 # number of words and ngrams
wordNgrams 2 # number of wordNgrams
eval_step 100 # evaluation step interval
min_freq 1 # minimum word frequency
lr_decay_rate 0.96 # learning rate decay rate
```
### IFLYTEK
```text
num_epochs 2 # number of epochs
batch_size 32 # batch size
max_length 1000 # maximum text length
learning_rate 0.013 # learning rate
embed 80 # embedding dimension
bucket 2000000 # number of words and ngrams
wordNgrams 2 # number of wordNgrams
eval_step 50 # evaluation step interval
min_freq 1 # minimum word frequency
lr_decay_rate 0.94 # learning rate decay rate
```
For more configuration details, please refer to src/config.py.
# Training Process
## Training
- Process the raw datasets:
```bash
tnews
python create_dataset.py --data_dir /data/tnews --out_data_dir /data/tnews_public --datasetType tnews
iflytek
python create_dataset.py --data_dir /data/iflytek --out_data_dir /data/iflytek_public --datasetType iflytek
```
- Running in an Ascend processor environment
```bash
tnews
python train.py --datasetdir ./data/tnews_public --datasetType tnews --device Ascend
iflytek
python train.py --datasetdir ./data/iflytek_public --datasetType iflytek --device Ascend
```
- Running in a GPU processor environment
```bash
tnews
python train.py --datasetdir ./data/tnews_public --datasetType tnews --device GPU
iflytek
python train.py --datasetdir ./data/iflytek_public --datasetType iflytek --device GPU
```
## Distributed Training
- Running in a GPU processor environment
```bash
tnews
mpirun -n {device_num} python train.py --datasetdir ./data/tnews_public --datasetType tnews --device GPU --run_distribute True
iflytek
mpirun -n {device_num} python train.py --datasetdir ./data/iflytek_public --datasetType iflytek --device GPU --run_distribute True
```
where {device_num} is the number of parallel devices, e.g. 4 or 8.
# Evaluation Process
## Evaluation
Put the ckpt file produced by training into the ./output/ folder.
- Running in an Ascend processor environment
```bash
tnews
python eval.py --datasetdir ./data/tnews_public --modelPath ./output/hypertext_tnews.ckpt --datasetType tnews --device Ascend
iflytek
python eval.py --datasetdir ./data/iflytek_public --datasetType iflytek --modelPath ./output/hypertext_iflytek.ckpt --device Ascend
```
- Running in a GPU processor environment
```bash
tnews
python eval.py --datasetdir ./data/tnews_public --modelPath ./output/hypertext_tnews.ckpt --datasetType tnews --device GPU
iflytek
python eval.py --datasetdir ./data/iflytek_public --datasetType iflytek --modelPath ./output/hypertext_iflytek.ckpt --device GPU
```
### Evaluation Performance
#### HyperText on tnews
| Parameters | Ascend 910 | GPU |
| -------------------| -------------------------------------- | -------------------------------------- |
| Model version | HyperText | HyperText |
| Resources | Ascend 910; CPU 2.60GHz, 192 cores; memory 755G; OS Euler2.8 | GPU (Tesla V100 SXM2); CPU 2.1GHz, 24 cores; memory 128G |
| Upload date | 2021-11-18 | 2021-11-18 |
| MindSpore version | 1.3.0 | 1.5.0 |
| Dataset | tnews | tnews |
| Training parameters | epoch=2, batch_size = 32 | epoch=2, batch_size = 32 |
| Optimizer | radam | radam |
| Loss function | SoftmaxCrossEntropyWithLogits | SoftmaxCrossEntropyWithLogits |
| Output | accuracy | accuracy |
| Loss | 0.9087 | 0.905 |
| Speed | 1958.810 ms/step (8 devices) | 315.949 ms/step (8 devices) |
#### HyperText on iflytek
| Parameters | Ascend 910 | GPU |
| -------------------------- | -------------------------------------- | -------------------------------------- |
| Model version | HyperText | HyperText |
| Resources | Ascend 910; CPU 2.60GHz, 192 cores; memory 755G; OS Euler2.8 | GPU (Tesla V100 SXM2); CPU 2.1GHz, 24 cores; memory 128G |
| Upload date | 2021-11-18 | 2021-11-18 |
| MindSpore version | 1.3.0 | 1.5.0 |
| Dataset | iflytek | iflytek |
| Training parameters | epoch=2, batch_size = 32 | epoch=2, batch_size = 32 |
| Optimizer | radam | radam |
| Loss function | SoftmaxCrossEntropyWithLogits | SoftmaxCrossEntropyWithLogits |
| Output | accuracy | accuracy |
| Loss | 0.57 | 0.5776 |
| Speed | 395.895 ms/step (8 devices) | 597.672 ms/step (8 devices) |
tnews multi-device accuracy: 0.8833
iflytek multi-device accuracy: 0.556
# Export Process
The mindir file can be exported with the following commands.
- Running in an Ascend processor environment
```shell
tnews
python export.py --modelPath ./output/hypertext_tnews.ckpt --datasetType tnews --device Ascend
iflytek
python export.py --modelPath ./output/hypertext_iflytek.ckpt --datasetType iflytek --device Ascend
```
- Running in a GPU processor environment
```shell
tnews
python export.py --modelPath ./output/hypertext_tnews.ckpt --datasetType tnews --device GPU
iflytek
python export.py --modelPath ./output/hypertext_iflytek.ckpt --datasetType iflytek --device GPU
```
# ModelZoo Homepage
Please check out the official [homepage](https://gitee.com/mindspore/models).
| 29.211806 | 169 | 0.559491 | yue_Hant | 0.424364 |
1c8d8d90a432034f23acdcc6ada6c233250d78cd | 12,738 | md | Markdown | README.md | realekhor/development-hub | 381988c255d8f2f34380ef4bb1880db27db2496a | [
"MIT"
] | null | null | null | README.md | realekhor/development-hub | 381988c255d8f2f34380ef4bb1880db27db2496a | [
"MIT"
] | null | null | null | README.md | realekhor/development-hub | 381988c255d8f2f34380ef4bb1880db27db2496a | [
"MIT"
] | null | null | null | # Development Hub for Power Apps 
The Development Hub brings continuous integration to Power Apps development by allowing developers to easily submit their Power Apps configuration/customisation for review and automated merging to source control.
## Table of contents
* [Features](#features)
* [Prerequisites](#prerequisites)
* [Installation](#installation)
* [Deploy the package](#deploy-the-package)
* [Register an app](#register-an-app)
* [Configure plug-in steps](#configure-plug-in-steps)
* [Configure Azure DevOps](#configure-azure-devops)
* [Set solution environment variables](#set-solution-environment-variables)
* [Set flow connections](#set-flow-connections)
* [Configuration](#configuration)
* [Usage](#usage)
* [Create an issue](#create-an-issue)
* [Develop a solution](#develop-a-solution)
* [Merge a solution](#merge-a-solution)
* [Merge source code](#merge-source-code)
* [Perform manual merge activities](#perform-manual-merge-activities)
* [Handle a failed merge](#handle-a-failed-merge)
* [Contributing](#contributing)
* [Build a development environment](#build-a-development-environment)
* [Set environment variables](#set-environment-variables)
* [Run build tasks](#run-build-tasks)
## Features
- Peer review for configuration
- Merging for individual bugs and features
- Automated semantic versioning
- Automated source control
## Prerequisites
Two instances are required to use the Development Hub - a development instance and a 'master' instance.
For more information on the purpose of the master instance, refer to the [Solution Lifecycle Management](https://www.microsoft.com/en-us/download/details.aspx?id=57777) document published by Microsoft. The relevant information can be found in the _Instance topologies for development approaches_ table.
## Installation
### Deploy the package
The package can be deployed to your development environment using the Package Deployer. Download the package files from the Releases tab and follow Microsoft's guide to using the Package Deployer [here](https://docs.microsoft.com/en-us/power-platform/admin/deploy-packages-using-package-deployer-windows-powershell).
### Register an app
Access to the Common Data Service Web API is required in order to merge development solutions with the solution(s) in the master instance. Follow Microsoft's guide on registering an app with access to the Common Data Service Web API [here](https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/use-single-tenant-server-server-authentication#azure-application-registration). You will need to use the client ID, tenant ID, and a client secret for the app you register in later steps.
### Configure plug-in steps
Use the plug-in registration tool to set the secure configuration parameters necessary for authenticating with the Web API. The steps to set the secure configuration are under the InjectSecureConfig plug-in within the DevelopmentHub.Develop assembly.
The secure configuration is a JSON object with the client ID, tenant ID, and client secret:
```json
{
"ClientId": "<from app registration>",
"TenantId": "<from app registration>",
"ClientSecret": "<from app registration>",
}
```
**Note: the application user must have admin permissions in both the master instance and the development instance.**
### Configure Azure DevOps
The Development Hub currently relies on integration with Azure DevOps to provide automated source-control functionality.
Navigate to _Project Settings -> Repositories_ in the Azure DevOps project that contains repository. Select the relevant repository and assign the following privileges to the project Build Service user:
- Bypass policies when pushing
- Contribute
- Create branch
A build definition capable of extracting solutions is required. There are several files in the [samples](./samples) folder to help you with this. If you use the sample files as is, copy the _scripts_ folder and _azure-pipelines-extract.yml_ file into your repository. The sample build script assumes that your repository structure is that you have a _src_ folder at the root, a _solutions_ folder within, and then folders that match your solutions' unique names. In addition, it expects a _solution.json_ within each of these solution folders that provides the development environment URL (see the _solution.json_ in the samples folder), and an _extract_ folder alongside it to contain the unpacked solution zip file fragments. You will also need to create a _Development Hub_ variable group that contains three variables - `Client ID`, `Tenant ID`, and `Client Secret`. These should be taken from the app registration created earlier.
If you have an existing folder structure which is different, the _Merge-SolutionVersion.ps1_ script will require tweaking - but the _azure-pipelines-extract.yml_ file shouldn't need to be changed.
### Set solution environment variables
There are four environment variables to set:
- Solution Publisher Prefix
- Azure DevOps Organization
- Azure DevOps Project
- Azure DevOps Extract Build Definition ID
The build definition ID is the numeric ID given to the build definition by Azure DevOps. This is the extract build created either by modifying the sample in this repository or by the Yeoman generator.
### Set flow connections
There are four flows, located in the _devhub_DevelopmentHub_Develop_ and _devhub_DevelopmentHub_AzureDevOps_ solutions, whose connections must be set. The flows to set the connections on are:
- When a solution merge is approved -> Merge the solution
- When a solution merge is merged -> Approve the first queued solution merge
- Environment Variable Key -> Environment Variable Value
- When a solution is merged -> Commit changes to source control
## Configuration
Ensure you have created or imported your unmanaged solution(s) for extraction in the master instance. Once this is done, they can be registered within the Development Hub app.
The first step is to create an environment record for the master instance:

Then create a solution record for each solution in the master instance:

Do not change the version numbers if the solution is new. If it is an existing solution, update the version numbers to match the solution in the master instance. The version will from then on be managed by the Development Hub when merging changes.
## Usage
### Create an issue
Issues must be created within the Development Hub in order for a developer to begin working on a new feature or bug fix.
The issue records in the Development Hub are used to group solution merge records and aid in applying semantic versioning to solutions. The Development Hub is not intended to replace a more conventional issue tracking system (e.g. Azure Boards). It is suggested to create issue records either on-the-fly, at the beginning of a sprint, or by integrating Azure DevOps through a tool such as Power Automate.
If your issues are on Azure Boards, you can set the `Work Item ID` field on the corresponding Development Hub issue. The commit will then be linked with the Azure DevOps work item.

An issue with a 'To Do' status will have a *Develop* button in the ribbon. Clicking this will create a development solution and transition the issue to 'In Progress'. The *Development* tab will show details about the development solutions and solution merge history.

### Develop a solution
The developer must add any new components or components to be modified into their development solution. It is important that only one developer makes changes to a component at a time. If a component appears in more than one development solution, it will result in either incomplete work being merged or solution merges failing due to missing dependencies.
Development solutions should contain just the components created or updated for that issue and no more. Adding all assets to a development solution will add all assets to the target solution when merged.
### Merge a solution
Once the issue has been developed, a solution merge record can be created. This will transition the issue to 'Developed'. The solution merge is created in an 'Awaiting Review' status. Review comments can be added to the solution merge in the form of notes and the solution merge either approved or rejected.
Once approved, the development solution will be merged into the target solution. If multiple solution merges have been approved, they will enter a queue. This means that an 'Approved' solution merge will transition to either a 'Merging' or 'Queued' status.
A successful solution merge will transition to an inactive 'Merged' status. The 'Version History' tab on the target solution record will also contain a new record with the post-merge unmanaged and managed solution zips available. The new solution version is based on the type of issue merged. A feature issue will increment the minor version and a bug issue will increment the patch version. Major version changes must be done manually.

### Merge source code
If the solution to be merged has associated source code (e.g. you have made changes to plugin assemblies or web resources) then you must specify the source branch. Ensure that you perform any manual merging required in Git on your source branch before creating the solution merge. This branch will be merged automatically.
### Perform manual merge activities
Specifying that there are manual merge activities on the solution merge record will cause the merging process to pause before extracting to source control. This is useful where you are merging changes by hand e.g. components that need to be updated frequently by multiple developers or where you need to delete components from the solution.
When the merging process is in a state where manual merge activities can begin, the solution merge will transition to an 'Awaiting Manual Merge Activities' status. If you are deleting components from the solution in the master instance, it is recommended to update the major version of the solution record in the Development Hub during this period.
To notify the flow that the manual merge activities are complete, navigate to _Action items -> Approvals_ within Power Automate and set the approval status to merged.
### Handle a failed merge
If the merging process failed (e.g. due to missing dependencies) then the solution merge will transition to a 'Failed' status. A *Retry* button is available after the necessary steps have been taken. Failure reason will be attached as a note to the solution merge record.
## Contributing
### Build a development environment
A VS Code/Cake task has been defined which will allow you to recreate a development instance for a given solution. If you wanted to contribute to the devhub_DevelopmentHub_Develop solution, you would open the command palette within VS Code (_ctrl + shift + p_) and select _Tasks: Run Task_ followed by _Build Development Environment_ and _devhub_DevelopmentHub_Develop_. This task requires that you have first configured the development environment URL in the _solution.json_ file in the corresponding solution folder and set up your environment variables (see below).
If you do not have an existing instance, you can create one for free with the [Power Apps Community Plan](https://docs.microsoft.com/en-us/powerapps/maker/dev-community-plan) or by signing up for a [trial](https://trials.dynamics.com/).
### Set environment variables
Two environment variables are required to enable you to authenticate with the development and staging environments:
- CAKE_DYNAMICS_USERNAME_DEVELOPMENT_HUB
- CAKE_DYNAMICS_PASSWORD_DEVELOPMENT_HUB
The username in the environment variable is used unless overridden by the username set in the corresponding _solution.json_ file. This is useful where the username for each solution is different (e.g. where you have multiple trials).
### Run build tasks
A number of Cake build tasks have been defined to make development easier. It is recommended to call these via the command palette (_ctrl + shift + p_) and selecting _Tasks: Run Task_.
The following tasks are available:
- Build Development Environment
- Extract Solution
- Pack Solution
- Build Package
- Deploy Plugins
- Deploy Workflow Activities
- Generate Model
### Extract to source control
Before creating a pull request containing your changes, you must extract the solutions into source control. This can be done using the _Extract Solution_ task and specifying which solution(s) to extract. | 64.333333 | 935 | 0.79518 | eng_Latn | 0.997669 |
1c8dbff591ee6a81b34000585913db54dbdc85d1 | 81 | md | Markdown | .github/SUPPORT.md | srnyx/template | 632862a0e91f4b7218031a8f28aecfac4c3daa37 | [
"MIT"
] | 3 | 2022-01-08T18:05:05.000Z | 2022-01-08T18:35:37.000Z | .github/SUPPORT.md | srnyx/modpack | 3440811b84ed6b05da10fb9554e78443aefda98b | [
"MIT"
] | null | null | null | .github/SUPPORT.md | srnyx/modpack | 3440811b84ed6b05da10fb9554e78443aefda98b | [
"MIT"
] | null | null | null | If you're looking for support with the project, please ask in the Discord server. | 81 | 81 | 0.802469 | eng_Latn | 0.999822 |
1c8e051cea114a7702ee47913629cfc84d95f71a | 2,125 | md | Markdown | Python/Machine Learning/Disease_Predictor/README.md | vickyrules/Dev-Scripts-1 | 345ac6bcb8c6a9a3b096e573d176de000c0c8261 | [
"MIT"
] | null | null | null | Python/Machine Learning/Disease_Predictor/README.md | vickyrules/Dev-Scripts-1 | 345ac6bcb8c6a9a3b096e573d176de000c0c8261 | [
"MIT"
] | null | null | null | Python/Machine Learning/Disease_Predictor/README.md | vickyrules/Dev-Scripts-1 | 345ac6bcb8c6a9a3b096e573d176de000c0c8261 | [
"MIT"
] | null | null | null | # DiseasePredictor:
## About Disease Predictor: 🧾
ML model that predicts diseases from symptoms provided by the user, using Decision Tree, Random Forest and Naive Bayes classification algorithms.
- learn more about
- Decision Tree <a href="https://www.edureka.co/blog/decision-trees/">click here</a>
- RandomForest <a href="https://www.edureka.co/blog/random-forest-classifier/">click here</a>
- NaiveBayes <a href="https://www.edureka.co/blog/naive-bayes-tutorial//">click here</a>
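As a minimal, self-contained sketch of the underlying approach (toy symptom vectors and labels invented for illustration, not the notebook's actual dataset or feature set):

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB

# toy binary symptom vectors: [fever, cough, headache, nausea]
X = [[1, 1, 0, 0], [1, 0, 1, 0], [0, 0, 1, 1], [1, 1, 1, 0]]
y = ["flu", "migraine", "food_poisoning", "flu"]

models = {
    "DecisionTree": DecisionTreeClassifier(),
    "RandomForest": RandomForestClassifier(n_estimators=10),
    "NaiveBayes": GaussianNB(),
}
predictions = {}
for name, model in models.items():
    model.fit(X, y)                                   # train on the symptom vectors
    predictions[name] = model.predict([[1, 1, 0, 0]])[0]  # predict for a new patient
    print(name, "->", predictions[name])
```

The real notebook works the same way, just with a much larger symptom vocabulary and disease list, and a tkinter front-end to collect the symptoms.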
<hr>
## Tech Stack 🧑🏻💻
- Jupyter Notebook version: Anaconda 3
- Python version: 3.9
- tkinter
<hr>
## Install the Dependencies: 📚
- install Python 3.8.1 or above <br>
- install Jupyter Notebook<br>
- install <a href="https://docs.python.org/3/library/tkinter.html"> tkinter</a><br>
```
pip install tk
```
- install <a href="https://numpy.org/doc/stable/"> numpy</a><br>
```
pip install numpy
```
- install <a href="https://pandas.pydata.org/pandas-docs/stable/getting_started/overview.html">pandas</a><br>
```
pip install pandas
```
- install <a href="https://www.tutorialspoint.com/scikit_learn/index.htm">scikit-learn</a><br>
```
pip install scikit-learn
```
<hr>
## To run the code locally: 🏃🏻‍♀️💻
1. Clone the Repository.<br>
```
git clone https://github.com/abhijeet007rocks8/Dev-Scripts.git
```
2. Run Jupyter Notebook.<br>
3. Navigate to the source folder: `Dev-Scripts\Python\Machine Learning\Disease_Predictor`
4. Click on "DiseasePrediction.ipynb" to open the file.
<img width = " 100%" src="https://user-images.githubusercontent.com/73611313/159130217-a62b22ca-223f-4c52-ad57-3ee5cc6ff605.png">
5. Select _Cell_ and choose _Run All_.<br>
<img width="100%" src="https://user-images.githubusercontent.com/73611313/159129960-b202d8b7-4218-4340-9494-c4215a9f4af3.png">
<hr>
## GUI Preview: 📷

<hr>
## Demo Video: 📽️
---
https://user-images.githubusercontent.com/73611313/159130826-8e7f65c8-5aad-45b3-91de-9d2bed32b926.mp4
---
| 26.5625 | 147 | 0.704 | kor_Hang | 0.213394 |
1c8e4498ceaded9fccd9f543c39adb89c8232f6d | 1,518 | md | Markdown | uk/easy-first.md | bdvorianov/type-challenges-solutions | d31eb633106294e5280967dc08183b1cc71f507b | [
"CC-BY-4.0"
] | null | null | null | uk/easy-first.md | bdvorianov/type-challenges-solutions | d31eb633106294e5280967dc08183b1cc71f507b | [
"CC-BY-4.0"
] | 1 | 2021-03-21T17:18:32.000Z | 2021-03-21T17:18:32.000Z | uk/easy-first.md | bdvorianov/type-challenges-solutions | d31eb633106294e5280967dc08183b1cc71f507b | [
"CC-BY-4.0"
] | 1 | 2021-04-07T17:37:12.000Z | 2021-04-07T17:37:12.000Z | ---
id: 14
title: First of Array
lang: uk
level: easy
tags: array
---
## Завдання
Реалізувати тип `First<T>`, який приймає масив `T` і повертає тип його першого елемента.
Наприклад:
```ts
type arr1 = ['a', 'b', 'c']
type arr2 = [3, 2, 1]
type head1 = First<arr1> // 'a'
type head2 = First<arr2> // 3
```
## Solution

The first thing that comes to mind is to use lookup (indexed access) types and simply write `T[0]`:
```ts
type First<T extends any[]> = T[0]
```
But this solution misses one case that needs to be handled.
If we pass an empty array, `T[0]` won't work, because there is no element at index 0 (nor at any other index).
So before taking the type of the first element, we need to check that the array is not empty.
For that, we can use [conditional types](https://www.typescriptlang.org/docs/handbook/advanced-types.html#conditional-types) in TypeScript.
Their principle is quite simple.
If we can assign an empty array to the type `T` (the input array), it means the array is empty.
In that case, we return `never`, because there is no element and therefore no type.
Otherwise, the array does contain elements, and we can safely get the type of the first one.
```ts
type First<T extends any[]> = T extends [] ? never : T[0];
```
## References

- [Lookup/indexed access types](https://www.typescriptlang.org/docs/handbook/advanced-types.html#index-types)
- [Conditional types](https://www.typescriptlang.org/docs/handbook/advanced-types.html#conditional-types)
# Partial evaluation of initialization code in Go
For several reasons related to code size and memory consumption (see below), it
is best to try to evaluate as much initialization code at compile time as
possible and only run unknown expressions (e.g. external calls) at runtime. This
is in practice a partial evaluator of the `runtime.initAll` function, which
calls each package initializer.
It works by directly interpreting LLVM IR:
* Almost all operations work directly on constants, and are implemented using
the llvm.Const* set of functions that are evaluated directly.
* External function calls and some other operations (inline assembly, volatile
load, volatile store) are seen as having limited side effects. Limited in
the sense that it is known at compile time which globals it affects, which
then are marked 'dirty' (meaning, further operations on it must be done at
runtime). These operations are emitted directly in the `runtime.initAll`
function. Return values are also considered 'dirty'.
* Such 'dirty' objects and local values must be executed at runtime instead of
at compile time. This dirtiness propagates further through the IR: for
example, storing a dirty local value to a global also makes the global dirty,
meaning that the global may not be read or written at compile time, as its
contents at that point during interpretation are unknown.
* There are some heuristics in place to avoid doing too much with dirty
values. For example, a branch based on a dirty local marks the whole
function itself as having side effects (as if it were an external function).
However, all globals it touches are still taken into account and when a call
is inserted in `runtime.initAll`, all globals it references are also marked
dirty.
* Heap allocation (`runtime.alloc`) is emulated by creating new objects. The
value in the allocation is the initializer of the global, the zero value is
the zero initializer.
* Stack allocation (`alloca`) is often emulated using a fake alloca object,
until the address of the alloca is taken in which case it is also created as
a real `alloca` in `runtime.initAll` and marked dirty. This may be necessary
when calling an external function with the given alloca as parameter.
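As a language-agnostic toy illustration of the idea above (not TinyGo code): operations whose inputs are fully known can be folded at compile time, while anything touching an external call must be deferred to runtime and its results treated as dirty.

```python
def partially_evaluate(ops):
    """Toy partial evaluator.

    `ops` is a list of ("const", value) or ("extern", name) pairs.
    Constant operations are folded now ("at compile time"); external
    operations are collected to be emitted into a runtime init function,
    mirroring how operations on dirty values are handled.
    """
    folded = 0
    deferred = []
    for kind, arg in ops:
        if kind == "const":
            folded += arg          # fully known: evaluate immediately
        elif kind == "extern":
            deferred.append(arg)   # unknown side effects: run at startup
        else:
            raise ValueError(f"unknown op kind: {kind}")
    return folded, deferred
```

Here the folded sum plays the role of a precomputed global initializer, and the deferred list plays the role of the calls emitted into `runtime.initAll`.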
## Why is this necessary?
A partial evaluator is hard to get right, so why go through all the trouble of
writing one?
The main reason is that the previous attempt wasn't complete and wasn't sound.
It simply tried to evaluate Go SSA directly, which was good but more difficult
than necessary. An IR-based interpreter needs to understand fewer instructions,
as LLVM IR simply has fewer (and less complex) instructions than Go SSA. Also, LLVM
provides some useful tools like easily getting all uses of a function or global,
which Go SSA does not provide.
But why is it necessary at all? The answer is that globals with initializers are
much easier to optimize by LLVM than initialization code. Also, there are a few
other benefits:
* Dead globals are trivial to optimize away.
* Constant globals are easier to detect. Remember that Go does not have global
constants in the same sense that C has them. Constants are useful because
they can be propagated and provide some opportunities for other
optimizations (like dead code elimination when branching on the contents of
a global).
* Constants are much more efficient on microcontrollers, as they can be
allocated in flash instead of RAM.
For more details, see [this section of the
documentation](https://tinygo.org/compiler-internals/differences-from-go/).
---
layout: note
title: "Dimmu Borgir"
---

---
layout: post
title: "A Close Look at History (33): A Close Look at Party History: Forty-Five Million Wronged Souls - 09/09/2021"
date: 2021-09-09T12:00:32.000Z
author: 城寨 Singjai
from: https://www.youtube.com/watch?v=ckyBwb3sZ7A
tags: [ 城寨 ]
categories: [ 城寨 ]
---
<!--1631188832000-->
[A Close Look at History (33): A Close Look at Party History: Forty-Five Million Wronged Souls - 09/09/2021](https://www.youtube.com/watch?v=ckyBwb3sZ7A)
------
<div>
A Close Look at History (33): 09/09/2021. Host: Lau Sai-leung (劉細良). Topic: "A Close Look at Party History: Forty-Five Million Wronged Souls". Facebook: https://facebook.com/kowloonsingjai. Support Singjai on Patreon: https://www.patreon.com/kowloonsingjai
</div>
</div>
# Customizing Package Files
This part of the manual pertains to the following folders and files:
```
<root>
├── package.json
├── .github
| ├── FUNDING.yml
| ├── ISSUE_TEMPLATE
| | ├── bug_report.md
| | ├── feature_request.md
| | ├── documentation-template.md
| | └── research_template.md
| └── workflows
| └── documentation.yml
├── README.md
├── CHANGELOG.md
├── LICENSE.md
├── THIRD PARTY NOTICES.md
└── .gitignore
```
Per [Unity's documentation](https://docs.unity3d.com/Manual/CustomPackages.html), technically, the only required file in this section is `package.json`. Nonetheless, this template follows the standard set by [Unity's documentation](https://docs.unity3d.com/Manual/cus-layout.html) to keep things consistent with other packages. Plus, filling out the following files should help other developers better understand the purpose of the package:
## Package Manifest - [`package.json`](https://docs.unity3d.com/Manual/upm-manifestPkg.html)
This file is required: it should not be deleted, moved, or renamed. That said, it *must* be edited.

The Package Manifest file is a JSON file storing details about the package that the Package Manager can use to display and categorize the project. Note that this file *can* be edited inside the Unity Inspector like any other asset files, as opposed to editing it in the text editor; Unity even provides a nice UI for it. Furthermore, Unity's [own documentation](https://docs.unity3d.com/Manual/upm-manifestPkg.html) is quite extensive, so this section will only cover a couple of important fields:
### `name`
```
"name": "com.omiyagames.template",
```
This field is required, and *must* be changed.
The `name` field is a unique identifier differentiating this package from potentially other packages with similar display names. The string stored in this field should follow the classic `com.<company-name>.<package-name>` format. Furthermore, it's recommended to keep the string below 50 characters in length. For more information, check out [Unity's documentation](https://docs.unity3d.com/Manual/upm-manifestPkg.html#name).
### `version`
```
"version": "1.0.0-pre.1",
```
This field is required. Changing it is recommended, but not necessary.
The `version` field informs the Unity Package Manager what version this package is on. If a service like [OpenUPM](openupm.com) is being used, this field further informs the Package Manager if the package is in need of an update. Unity provides a pretty thorough explanation of [how this field should be filled in](https://docs.unity3d.com/Manual/upm-semver.html), but the classic `<major>.<minor>.<patch>` should suffice here. If in preview, don't forget to add `-preview.<patch>` as well.
### `displayName`
```
"displayName": "Omiya Games - Template Unity Package",
```
While technically not required, this field should be kept, and more importantly, *changed.*
When the Unity Package Manager displays the name of this package, `displayName` is the field it'll use. Simply provide a human-friendly name for this package to this field.
### `description`
```
"description": "This is a template package, used as a basis to build a bigger one.",
```
While technically not required, this field should be kept, and more importantly, *changed.*
When the Unity Package Manager displays a description of this package when selected, `description` is the field it'll use. Simply provide a human-friendly description for this package to this field. Note that UTF-8 characters are supported via `\u2603` and `\n`.
### `unity`
```
"unity": "2021.2",
```
While technically not required, this field should be kept. Changing it is recommended, but not necessary.
Indicates the lowest version of Unity this package supports. If patch release info is necessary as well (e.g. Unity 2019.3.7f3), the last set of digits can be added to `unityRelease` field.
### `type`
```
"type": "tool",
```
Optional, and only used internally as of this writing. Changing it is recommended, but not necessary.
Indicates the type of this package. As of this writing, values Unity recognizes are:
- tool
- library
- module
- template
- sample
- tests
### `dependencies`
```json
"dependencies": {
"com.unity.ext.nunit": "1.0.0"
},
```
This field is optional, and if this package does not contain any code or plugins, it can be safely removed. If it *does*, this field should be updated to the code's requirements.
As the name implies, `dependencies` is a list of packages this project relies on. Each entry should be listed as `"<package-unique-identifier>": "<supported-version-of-package>"`.
As this field is critical for adding compilable source code to the package, more information is available in the [Adding Source Code and Assets](https://omiyagames.github.io/template-unity-package/manual/customizeSource.html) section of this manual.
### `samples`
```json
"samples": [
{
"displayName": "Example 1",
"description": "This sample is just an example",
"path": "Samples~/Example1"
}
]
```
This field is optional, and if this package does not contain assets users can import, it can be safely removed. If it *does*, this field should be changed to match the package's directory setup.
Unity Package Manager provides an option for packages to list one or more "Import to project" buttons, which naturally allows the user to import assets into their project. To specify what gets imported, the `samples` field contains a list of objects, each holding the `displayName`, `description`, and `path` (relative to the root of the package's directory) fields.
As this field is critical for adding importable assets to the package, more information is available at the [Adding Importable Assets](https://omiyagames.github.io/template-unity-package/manual/customizeSamples.html) section of this manual.
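Putting the snippets above together, a complete manifest for this template might look like the following sketch (values are taken from the individual examples above and are illustrative):

```json
{
  "name": "com.omiyagames.template",
  "version": "1.0.0-pre.1",
  "displayName": "Omiya Games - Template Unity Package",
  "description": "This is a template package, used as a basis to build a bigger one.",
  "unity": "2021.2",
  "type": "tool",
  "dependencies": {
    "com.unity.ext.nunit": "1.0.0"
  },
  "samples": [
    {
      "displayName": "Example 1",
      "description": "This sample is just an example",
      "path": "Samples~/Example1"
    }
  ]
}
```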
## Github-Specific Features - `.github` Folder
Files in the `.github` folder are intended for Github-specific features, including templates for its bug-tracking system (folder `ISSUE_TEMPLATE`), list of websites to sponsor the package (file `FUNDING.yml`), and automation feature under `workflows` folder. Obviously, if the repo is *not* hosted on Github, this folder and its content can be safely deleted. If the developer *does* want to utilize some of the features this template comes with, the purpose of each file or folder will be listed below:
### `.github/workflows/documentation.yml`
This YAML file instructs Github to automatically generate a documentation website. For those who would like to take advantage of this feature, the [Customizing Documentation](https://omiyagames.github.io/template-unity-package/manual/customizeDocumentation.html) section of this manual covers this file extensively. Obviously, if the developer has no plans on generating documentation for their package, this file can be safely deleted.
### `.github/FUNDING.yml`
This YAML file adds a "Sponsor" button at the top of an open-source Github repository. If the developer would like to take advantage of the feature, the comments (text following `#`) on the text file is pretty self-explanatory, and won't be elaborated here. If the package is not going to be open-source, or in need of sponsoring, this file can be safely removed.
### `.github/ISSUE_TEMPLATE` folder
The markdown files in the `.github/ISSUE_TEMPLATE` folder hold templates for issues one can file on Github, providing a standard for the issue filer. As markdown files, they should be human-readable, and their content won't be further elaborated in this manual. As usual, if there's no need for any or all templates in this folder, the folder or the individual files themselves may be safely deleted.
## ReadMe - `README.md`
It's recommended to edit this file.
As the name of the file implies, `README.md` is a Markdown file intended to inform first-time viewers of this package's source what the purpose of the package is, and provide guidance on maintainers where the important files are. It has the added bonus that most online repository websites displays this Markdown file as an in-depth description of the project as a whole. Lastly, if [`.github/workflows/documentation.yml`](#githubworkflowsdocumentationyml) is configured to generate documentation, it will use `README.md` as the homepage.
## Package License - `LICENSE.md`
It's recommended to edit this file.
As the name of the file implies, `LICENSE.md` is a Markdown file describing the license of the package overall. It has the added bonus that some open-source-encouraging online repository websites uses this file to indicate developers the license of the project. If there are assets borrowed from third-party sources, it is recommended to list those in `THIRD PARTY NOTICES.md`.
## Other Third-Party Licenses - `THIRD PARTY NOTICES.md`
If the package does not contain any third-party assets, this file may be safely removed. Otherwise, it's highly recommended to edit this file.
`THIRD PARTY NOTICES.md` is a Markdown file that informs the reader which assets come from third-party sources, and what the licenses of those assets are.
## Change Log - `CHANGELOG.md`
It's recommended to edit this file.
`CHANGELOG.md` is a Markdown file listing all the past versions of this package, and how the package has changed. Each time a new release is being prepared, this file should be updated with the latest information.
## Ignore File - [`.gitignore`](https://git-scm.com/docs/gitignore#_pattern_format)
It's recommended to keep this file; feel free to make edits to it for your own purposes.
The Git version control uses this `.gitignore` text file to determine what files to ignore when the application reviews any changes to the project. The default content held in this project should be fine for most Unity users. That said, if changes need to be made, the official documentation on formatting `.gitignore` is available [here](https://git-scm.com/docs/gitignore#_pattern_format).
---
layout: post
title: "Race Report - Return to Running"
date: 2021-04-20
image:
excerpt: After a 6-month hiatus due to COVID, I finally started running again in September, 2020. It took me 6 months of *very* slow progress (including a 2 week pause while <a href = "https://kprahlad.github.io/2020/12/11/Returning-to-NISER/">quarantining in NISER</a>) to build up to a mileage of around 25 KM a week. I decided to use <a href = "https://freerunningplans.com/5k-running-plans">this</a> 8-week beginner 5 KM plan to cap this running season. I ended up running the distance in 27:10 on the 17<sup>th</sup> of April - not as good as by PB (26:46) from 18 months back, but much better than I expected. Here's my race report.
tags: ["Personal"]
---
After a 6-month hiatus due to COVID, I finally started running again in September, 2020. It took me 6 months of *very* slow progress (including a 2 week pause while [quarantining in NISER]({{ site.baseurl }}{% post_url 2020-12-11-Returning to NISER %})) to build up to a mileage of around 25 KM a week. I decided to use [this](https://freerunningplans.com/5k-running-plans) 8-week beginner 5 KM plan to cap this running season. I ended up running the distance in 27:10 on the 17<sup>th</sup> of April - not as good as by PB (26:46) from 18 months back, but much better than I expected. Here's my race report.
## Race Information
* **What?** The "Get Back to Running, Mate" Solo Race.
* **When?** April 17<sup>th</sup>, 2021.
* **How Far?** 5 KM.
* **Where?** NISER, Bhubaneswar.
* **Finish Time:** 27:10.
## Goals
- **Best Case:** 27:30.
- **Great Outcome:** 30:00.
- **Very Attainable:** 32:00.
## Training
I estimated my 5 KM time at the beginning of my plan to be somewhere around 35 minutes. The training plan had the following structure on a typical week: two workout days, one long run and two easy runs. The long runs were not very long, to be honest - they maxed out at 10 KM. I ran my easy runs and long runs at a very comfortable 8:20 per KM pace. Embedded into the plan were 1 mile and 2 mile race efforts on the third and the sixth week respectively. I ran the 1 mile in 8:26 (5:14 per KM) and the 2 miles in 17:13 (5:21 per KM). Overall, the plan went along without any major speed bumps. I missed two days of running in total through the course of the 8 weeks. Both because of last minute grocery shopping plans. *Sigh*.
My training philosophy this time around was very simple - don't get injured. I did this by making easy days super easy. I was looking back at my runs which were part of my half marathon training plan and realized that I averaged a breezy 6:20 per KM pace. For my easy runs. While planning on running the half at 6:10 per KM. No wonder my shins were screaming at me for the last couple of weeks of that plan.
## Race
Race day was nothing special. Up at 5:20 am, out of the door at 5:45 am, warmed up in 15 minutes, and at the "start line" at 6 am. The "race plan" was straightforward - even 5:30 per KM splits. I was a little nervous when I started out and realized that I was running at 4:15 per KM pace. Thankfully, it only took me a couple of minutes to settle into the pace I wanted to be at. The first couple of kilometers were uneventful. The third kilometer was where things started getting a little harder. I got stitches on the right side of my abdomen about three-fourths into this kilometer and it accompanied me to the very end. After the first three kilometers, I was pretty much perfectly on pace. The last two kilometers were more of a blur. Every time my pace started dropping, I saw someone I knew and picked up the pace again. I had somehow managed to run these two kilometers in 10:40, shaving almost 20s off my goal time.
## Splits
- 0 - 1 KM: 5:29.4.
- 1 - 2 KM: 5:28.8.
- 2 - 3 KM: 5:30.9.
- 3 - 4 KM: 5:24.0.
- **4 - 5 KM: 5:17.8.**
**Finish Time: 27:10.3.**
This 16s negative split makes me wonder if I could have run the first half a little quicker. Maybe a 27:05 was there for my taking?
## What's Next?
Not sure, to be honest. The 5 KM PB looks very inviting. As does getting back to longer distances. As does focusing on strength and sidelining running for a bit. I'll decide after this week of ~~doing very little physical activity~~ recovery. | 85.96 | 924 | 0.729642 | eng_Latn | 0.999511 |
1c8f81f330ade166cd68778d9d306d8f5a9e1fd4 | 13,029 | md | Markdown | articles/cognitive-services/Bing-Custom-Search/define-your-custom-view.md | larryms/azure-docs | 82b391ac39378be049f749deefde948c55eab423 | [
"CC-BY-4.0"
] | null | null | null | articles/cognitive-services/Bing-Custom-Search/define-your-custom-view.md | larryms/azure-docs | 82b391ac39378be049f749deefde948c55eab423 | [
"CC-BY-4.0"
] | null | null | null | articles/cognitive-services/Bing-Custom-Search/define-your-custom-view.md | larryms/azure-docs | 82b391ac39378be049f749deefde948c55eab423 | [
"CC-BY-4.0"
] | null | null | null | ---
title: Configure your Bing Custom Search experience | Microsoft Docs
titlesuffix: Azure Cognitive Services
description: Describes how to create site and vertical search services
services: cognitive-services
author: aahill
manager: cgronlun
ms.service: cognitive-services
ms.subservice: bing-custom-search
ms.topic: conceptual
ms.date: 09/28/2017
ms.author: aahi
---
# Configure your Bing Custom Search experience
A Custom Search instance lets you tailor the search experience to include content only from websites that your users care about. Instead of performing a web-wide search, Bing searches only the slices of the web that interest you. To create your custom view of the web, use the Bing Custom Search [portal](https://customsearch.ai).
The portal lets you create a search instance that specifies the slices of the web: domains, subpages, and webpages, that you want Bing to search, and those that you don’t want it to search. The portal can also suggest content that you may want to include.
Use the following when defining your slices of the web:
| Slice name | Description |
|------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Domain | A domain slice includes all content found within an internet domain. For example, `www.microsoft.com`. Omitting `www.` causes Bing to also search the domain’s subdomains. For example, if you specify `microsoft.com`, Bing also returns results from `support.microsoft.com` or `technet.microsoft.com`. |
| Subpage | A subpage slice includes all content found in the subpage and paths below it. You may specify a maximum of two subpages in the path. For example, `www.microsoft.com/en-us/windows/` |
| Webpage | A webpage slice can include only that webpage in a custom search. You can optionally specify whether to include subpages. |
> [!IMPORTANT]
> All domains, subpages, and webpages that you specify must be public and indexed by Bing. If you own a public site that you want to include in the search, and Bing hasn’t indexed it, see the Bing [webmaster documentation](https://www.bing.com/webmaster/help/webmaster-guidelines-30fba23a) for details about getting Bing to index it. Also, see the webmaster documentation for details about getting Bing to update your crawled site if the index is out of date.
## Add slices of the web to your custom search instance
When you create your custom search instance, you can specify the slices of the web: domains, subpages, and webpages, that you want to have included or blocked from your search results.
If you know the slices you want to include in your custom search instance, add them to your instance’s **Active** list.
If you’re not sure which slices to include, you can send search queries to Bing in the **Preview** pane and select the slices that you want. To do this:
1. select "Bing" from the dropdown list in the Preview pane, and enter a search query
2. Click **Add site** next to the result you want to include. Then click OK.
>[!NOTE]
> [!INCLUDE[publish or revert](./includes/publish-revert.md)]
<a name="active-and-blocked-lists"></a>
### Customize your search experience with Active and Blocked lists
You can access the list of active and blocked slices by clicking on the **Active** and **Blocked** tabs in your custom search instance. Slices added to the active list will be included in your custom search. Blocked slices won't be searched, and won't appear in your search results.
To specify the slices of the web you want Bing to search, click the **Active** tab and add one or more URLs. To edit or delete URLs, use the options under the **Controls** column.
When adding URLs to the **Active** list you can add single URLs, or multiple URLs at once by uploading a text file using the upload icon.

To upload a file, create a text file and specify a single domain, subpage, or webpage per line. Your file will be rejected if it isn't formatted correctly.
> [!NOTE]
> * You can only upload a file to the **Active** list. You cannot use it to add slices to the **Blocked** list.
> * If the **Blocked** list contains a domain, subpage, or webpage that you specified in the upload file, it will be removed from the **Blocked** list, and added to the **Active** list.
> * Duplicate entries in your upload file will be ignored by Bing Custom Search.
### Get website suggestions for your search experience
After adding web slices to the **Active** list, the Bing Custom Search portal will generate website and subpage suggestions at the bottom of the tab. These are slices that Bing Custom Search thinks you might want to include. Click **Refresh** to get updated suggestions after updating your custom search instance's settings. This section is only visible if suggestions are available.
## Search for images and videos
You can search for images and videos similarly to web content by using the [Bing Custom Image Search API](https://docs.microsoft.com/rest/api/cognitiveservices/bing-custom-images-api-v7-reference) or the [Bing Custom Video Search API](https://docs.microsoft.com/rest/api/cognitiveservices/bing-custom-videos-api-v7-reference). You can display these results with the [hosted UI](hosted-ui.md), or the APIs.
These APIs are similar to the non-custom [Bing Image Search](../Bing-Image-Search/overview.md) and [Bing Video Search](../Bing-Video-Search/search-the-web.md) APIs, but search the entire web, and do not require the `customConfig` query parameter. See these documentation sets for more information on working with images and videos.
## Test your search instance with the Preview pane
You can test your search instance by using the preview pane on the portal's right side to submit search queries and view the results.
1. Below the search box, select **My Instance**. You can compare the results from your search experience to Bing, by selecting **Bing**.
2. Select a safe search filter and which market to search (see [Query Parameters](https://docs.microsoft.com/rest/api/cognitiveservices/bing-custom-search-api-v7-reference#query-parameters)).
3. Enter a query and press enter or click the search icon to view the results from the current configuration. You can change your search type you perform by clicking **Web**, **Image**, or **Video** to get corresponding results.
<a name="adjustrank"></a>
## Adjust the rank of specific search results
The portal enables you to adjust the search ranking of content from specific domains, subpages, and webpages. After sending a search query in the preview pane, each search result contains a list of adjustments you can make for it:
| | |
|------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Block | Moves the domain, subpage, or webpage to the Blocked list. Bing will exclude content from the selected site from appearing in the search results. |
| Boost | Boosts content from the domain or subpage to be higher in the search results. |
| Demote | Demotes content from the domain or subpage lower in the search results. You select whether to demote content from the domain or subpage that the webpage belongs to. |
| Pin to top | Moves the domain, subpage, or webpage to the **Pinned** list. This Forces the webpage to appear as the top search result for a given search query. |
Adjusting rank is not available for image or video searches.
### Boosting and demoting search results
You can super boost, boost, or demote any domain or subpage in the **Active** list. By default, all slices are added with no ranking adjustments. Slices of the web that are Super boosted or Boosted are ranked higher in the search results (with super boost ranking higher than boost). Items that are demoted are ranked lower in the search results.
You can super boost, boost, or demote items by using the **Ranking Adjust** controls in the **Active** list, or by using the Boost and Demote controls in the Preview pane. The service adds the slice to your Active list and adjusts the ranking accordingly.
> [!NOTE]
> Boosting and demoting domains and subpages is one of many methods Bing Custom Search uses to determine the order of search results. Because of other factors influencing the ranking of different web content, the effects of adjusting rank may vary. Use the Preview pane to test the effects of adjusting the rank of your search results.
Super boost, boost, and demote are not available for the image and video searches.
## Pin slices to the top of search results
The portal also lets you pin URLs to the top of search results for specific search terms, using the **Pinned** tab. Enter a URL and query to specify the webpage that will appear as the top result. Note that you can pin a maximum of one webpage per search query, and only indexed webpages will be displayed in searches. Pinning results is not available for image or video searches.
You can pin a webpage to the top in two ways:
* In the **Pinned** tab, enter the URL of the webpage to pin to the top, and its corresponding query.
* In the **Preview** pane, enter a search query and click search. Find the webpage you want to pin for your query, and click **Pin to top**. The webpage and query will be added to the **Pinned** list.
### Specify the pin's match condition
By default, webpages are only pinned to the top of search results when a user's query string exactly matches one listed in the **Pinned** list. You can change this behavior by specifying one of the following match conditions:
> [!NOTE]
> All comparisons between the user's search query and the pin's search query are case insensitive.
| Value | Description |
|---------------|----------------------------------------------------------------------------------|
| Starts with   | The pin is a match if the user's query string starts with the pin's query string. |
| Ends with | The pin is a match if the user's query string ends with the pin's query string. |
| Contains | The pin is a match if the user's query string contains the pin's query string. |
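As a conceptual illustration only — this logic runs inside Bing Custom Search, and the function below is not part of any API — the default exact match plus the three conditions above amount to case-insensitive string checks:

```python
def pin_matches(user_query: str, pin_query: str, condition: str) -> bool:
    """Conceptual sketch of pin matching. All comparisons are case insensitive.

    The condition labels mirror the table above; "Exact" stands in for the
    default behavior when no other condition is selected.
    """
    user = user_query.lower()
    pin = pin_query.lower()
    if condition == "Exact":         # default: whole query must match the pin
        return user == pin
    if condition == "Starts with":
        return user.startswith(pin)
    if condition == "Ends with":
        return user.endswith(pin)
    if condition == "Contains":
        return pin in user
    raise ValueError(f"Unknown match condition: {condition}")
```

For example, a pin whose query is `azure` with the **Starts with** condition would match the user query `Azure pricing`.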
To change the pin's match condition, click the pin's edit icon. In the **Query match condition** column, click the dropdown list and select the new condition to use. Then, click the save icon to save the change.
### Change the order of your pinned sites
To change the order of your pins, you can drag and drop them, or edit their order number by clicking the "edit" icon in the **Controls** column of the **Pinned** list.
If multiple pins satisfy a match condition, Bing Custom Search will use the one highest in the list.
## View statistics
If you subscribed to Custom Search at the appropriate level (see the [pricing pages](https://azure.microsoft.com/pricing/details/cognitive-services/bing-custom-search/)), a **Statistics** tab is added to your production instances. The statistics tab shows details about how your Custom Search endpoints are used, including call volume, top queries, geographic distribution, response codes, and safe search. You can filter details using the provided controls.
## Usage guidelines
- For each custom search instance, the number of ranking adjustments you may make to **Active** and **Blocked** slices is limited to 400.
- Adding a slice to the Active or Blocked tabs counts as one ranking adjustment.
- Boosting and demoting count as two ranking adjustments.
- For each custom search instance, the number of pins you may create is limited to 200.
## Next steps
- [Call your custom search](./search-your-custom-view.md)
- [Configure your hosted UI experience](./hosted-ui.md)
- [Use decoration markers to highlight text](./hit-highlighting.md)
- [Page webpages](./page-webpages.md)
# shangqu
Part of the official website of Shangqu Information.
yii2-thumbnails
===================
It is an adapter for Yii2, based on the [svsoft/thumbnails](https://github.com/svsoft/thumbnails) component, for resizing images in native PHP.
Creates thumbnails of pictures with various handlers: resize, crop, fill, watermark.
You can also create your own handlers and add them to the thumbnail description.
### Features:
* Supports different types of source image storage.
* Supports various types of thumbnail storage.
* Ability to create your own storage types (FTP, database, HTTP).
* Easily apply new handlers to previously described thumbnails.
* Ability to create your own custom image handlers.

The library uses interfaces everywhere, so you can implement the necessary logic in your own classes.
Installation
---
### Composer
Add to `composer.json`:
```json
{
"require": {
"svsoft/yii2-thumbnails": "*"
}
}
```
and run ```php composer.phar update```
Or
run ```php composer.phar require svsoft/yii2-thumbnails```
Config
---
Add to your application config:
```php
'components' => [
// ....
'thumbnails' => [
'class' => \svsoft\yii\thumbnails\Thumbnails::class,
//
'dirPath' => '@app/web/resize',
'webDirPath' => '@web/resize',
'thumbs' => [
// Short description thumbnail
'logo'=> [
// simple resize 200x40
'width' => 200,
'height' => 40,
],
'favicon'=> [
// Resize fixed size filled with transparent fields
'class'=> \svsoft\yii\thumbnails\handlers\ResizeFillingHandlerFactory::class,
'width' => 40,
'height' => 40,
],
// Full description of thumbnail
'product'=>[
'class' => \svsoft\yii\thumbnails\ThumbFactory::class,
'handlers' => [
[
'class' => \svsoft\yii\thumbnails\handlers\ResizeCropHandlerFactory::class,
'width' => 600,
'height' => 600,
],
[
'class' => \svsoft\yii\thumbnails\handlers\WatermarkHandlerFactory::class,
'watermarkFilePath' => '@app/web/images/watermark.jpg',
'opacity' => 50,
],
],
],
// ...
]
], // ....
],
```
#### Properties
- dirPath - Path to the thumbnail file storage directory.
- webDirPath - Public web path to the thumbnail file storage directory.
- thumbs - List of thumbnail configurations.
- $imageStorage - Not required. Description of the image storage.
Currently only local image storage is implemented, and it is used by default (FTP, HTTP, and DB storage are NOT implemented).
- $thumbStorage - Not required. Description of the thumbnail storage.
Currently only local thumbnail storage is implemented, and it is used by default (FTP, HTTP, and DB storage are NOT implemented).
#### Handler classes
```
\svsoft\yii\thumbnails\handlers\ResizeCropHandlerFactory - fixed resize and crop
\svsoft\yii\thumbnails\handlers\ResizeFillingHandlerFactory - fixed resize with filling fields
\svsoft\yii\thumbnails\handlers\ResizeHandlerFactory - resize to a certain size
\svsoft\yii\thumbnails\handlers\WatermarkHandlerFactory - watermark overlay
```
Each handler has its own settings; see the corresponding class for details.

You can also add your own handlers.
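As an illustration of what a fixed resize-and-crop handler (like `ResizeCropHandlerFactory`) has to compute, here is a small sketch of the geometry in Python — this is not the library's code, just the underlying arithmetic:

```python
def resize_crop_box(src_w, src_h, dst_w, dst_h):
    """Scale the source so it covers dst_w x dst_h, then center-crop.

    Returns (scaled_w, scaled_h, left, top): the size to scale the source
    image to, and the top-left corner of the dst_w x dst_h crop window.
    """
    # Scale by the larger ratio so both dimensions cover the target.
    scale = max(dst_w / src_w, dst_h / src_h)
    scaled_w = round(src_w * scale)
    scaled_h = round(src_h * scale)
    # Center the crop window inside the scaled image.
    left = (scaled_w - dst_w) // 2
    top = (scaled_h - dst_h) // 2
    return scaled_w, scaled_h, left, top
```

For a 1200x800 source and a 600x600 thumbnail, this scales to 900x600 and crops 150 px off each side.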
Using
---
How you obtain the component depends on the application and the developer: via the service locator, a singleton, the DI container, etc.
Example: output a favicon
```html
<link rel="shortcut icon" href="<?=Yii::$app->thumbnails->thumb('/var/www/site.ru/images/1.jpg', 'favicon') ?>" type="image/png">
```
Example: output a product image in an `img` tag
```html
<img src="<?=Yii::$app->thumbnails->thumb('/var/www/site.ru/images/product/product-1.jpg', 'product') ?>">
```
# job-queue
TODO:
- figure out how to do a queue in redis
- fork a thread to read from redis queue and then see about consumers below
- figure out how to configure consumers
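For the first item, a common pattern is a Redis list with `LPUSH` on the producer side and a blocking `BRPOP` in the consumer thread. A minimal Python sketch — the job format and names here are illustrative, and only the serialization helpers run without a Redis server:

```python
import json


def make_job(task, args):
    """Serialize a job description so it survives the trip through Redis."""
    return json.dumps({"task": task, "args": args})


def parse_job(raw):
    """Deserialize a job pulled off the queue."""
    return json.loads(raw)


# With a running Redis server and a Redis client library, the queue itself
# is just a list (sketch only, not executed here):
#
#   r = redis.Redis()
#   r.lpush("jobs", make_job("send_email", ["alice@example.com"]))  # producer
#   _, raw = r.brpop("jobs")   # consumer thread blocks until a job arrives
#   job = parse_job(raw)
```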
# Description
Good morning! Here's your coding interview problem for today.
This problem was asked by **Apple**.
Implement a queue using two stacks. Recall that a queue is a FIFO (first-in, first-out) data structure with the following methods: `enqueue`, which inserts an element into the queue, and `dequeue`, which removes it.
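A standard approach — sketched here in Python, since this README does not fix a language — keeps an "in" stack for `enqueue` and an "out" stack for `dequeue`; elements move across only when the out stack runs dry, making both operations amortized O(1):

```python
class QueueWithTwoStacks:
    def __init__(self):
        self._in = []    # receives enqueued items
        self._out = []   # serves dequeues in FIFO order

    def enqueue(self, item):
        self._in.append(item)

    def dequeue(self):
        # Refill the out stack only when it is empty; each element is
        # moved at most once, so operations are amortized O(1).
        if not self._out:
            while self._in:
                self._out.append(self._in.pop())
        if not self._out:
            raise IndexError("dequeue from empty queue")
        return self._out.pop()
```

For example, enqueuing 1, 2, 3 and then dequeuing three times yields 1, 2, 3 in that order.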
# Source
Received by email from the [Daily Coding Problem](https://www.dailycodingproblem.com)