] | null | null | null | ---
draft: false
author: Abdul Wahhab
slug: securities-crowdfunding-dan-equity-crowdfunding
title: Securities Crowdfunding (SCF) Sebagai Alternatif Pendanaan UMKM
metaTitle: Securities Crowdfunding (SCF) Sebagai Alternatif Pendanaan UMKM
metaDescription: SCF atau securities crowdfunding (urun dana) adalah sistem
Investasi bisnis yang dimana masyarakat dapat mendanai bisnis privat
intro: Yuk kenal lebih dekat dengan sistem investasi securities crowdfunding
yang mempermudah kamu untuk berinvestasi ke bisnis privat dengan modal kecil.
date: 2021-08-07T18:05:32.000Z
tag:
- Securities Crowdfunding untuk Pengembangan UMKM di Indonesia
- Equity Crowdfunding
- crowdfunding
- Situs Crowdfunding
- bisnis investasi
- investasi bisnis
category:
- Investasi
- Cara Investasi di Securities Crowdfunding
featuredImage:
src: https://images.unsplash.com/photo-1598793923805-184b1adc10e7?ixlib=rb-1.2.1&ixid=MnwxMjA3fDB8MHxwaG90by1wYWdlfHx8fGVufDB8fHx8&auto=format&fit=crop&w=1470&q=80
alt: Memahami cara kerja investasi lewat securities crowdfunding
title: Investasi Bisnis Modal Kecil Lewat Securities Crowdfunding
---
[Securities crowdfunding](https://landx.id/) is a scheme recently legalized by the Indonesian government under POJK No. 57/POJK.04/2020 on Securities Offerings Through Information-Technology-Based Crowdfunding Services (*securities crowdfunding*).

SCF is a refinement of equity crowdfunding (ECF), giving the scheme a broader scope.

The arrival of securities crowdfunding platforms offers new hope for MSMEs and other businesses whose growth has stalled because they still struggle to access capital.

Many businesses and MSMEs find it hard to obtain loans due to various constraints, which is why SCF has become a promising new funding channel for Indonesian MSMEs and businesses.

Beyond providing funding to businesses and MSMEs, securities crowdfunding is also an investment instrument with attractive potential returns.

SCF offers several types of securities, such as shares, bonds, and sukuk, so investors can choose instruments that match their risk profile.

To make things clearer, let's take a closer look at [securities crowdfunding](https://landx.id/).

## The Development of Securities Crowdfunding in Indonesia
Securities crowdfunding (SCF) is a business funding scheme in which several investors pool their money through a crowdfunding platform, which collects the funds and channels them to businesses.

Under this pooled-funding scheme, investors can receive securities in the form of shares, bonds (debt securities), or sukuk.

A business or MSME funded through this scheme uses the money to develop its operations and provides returns according to the type of security it offered to investors.

For this reason, the risk of a business investment under this scheme varies with the security the business offers.

Previously, equity crowdfunding only offered shares, so returns centered on dividends and capital gains; SCF offers a wider range of securities, so the returns on offer differ as well.

This scheme clearly makes investing and fundraising easier, because investors are brought together on a single digital securities crowdfunding platform. That widens the pool of investors and helps businesses grow faster.

To make the comparison concrete, here are the differences between SCF and ECF.
## The Difference Between Equity and Securities Crowdfunding

Compared with equity crowdfunding, securities crowdfunding offers a more diverse set of securities for funding businesses and MSMEs. For both schemes, offerings and transactions take place entirely online through the platform provided by the crowdfunding operator.

Securities crowdfunding also allows funding for a broader range of issuers: not only corporations and limited liability companies (PT), but also other business entities such as a CV, NV, or firma.

Equity crowdfunding, by contrast, essentially only issues shares, which are offered to investors to fund businesses expanding into a wider market.

In securities crowdfunding, the securities on offer are more varied: besides ordinary shares, SCF also offers sharia-compliant shares, sukuk, and bonds, giving investors a broader choice of investment instruments.

Now that the differences are clear, let's go through the securities offered in *securities crowdfunding* one by one.
## Berbagai Jenis Instrumen dalam Securities Crowdfunding (SCF)
### 1. Saham
Saham merupakan bukti kepemilikan modal akan sebuah bisnis atau badan usaha yang telah diberikan modal oleh investor. Saham menjadi acuan seberapa besar suatu pihak memiliki modal dalam sebuah usaha atau bisnis.
Saham juga menjadi acuan besar keuntungan yang akan anda dapatkan dari dividen saat bisnis menguntungkan. Nilai dividen yang akan diterima investor akan mengikuti jumlah saham yang mereka miliki.
Instrumen ini merupakan efek umum ditawarkan dalam urun dana atau crowdfunding. Karena itu equity dan securities crowdfunding memberikan penawaran efek berupa saham.
### 2. Saham Syariah
Saham syariah merupakan efek yang ditawarkan dalam equity dan securities crowdfunding.
Saham syariah merupakan instrumen efek berupa saham yang menggunakan prinsip syariah. Instrumen ini telah diatur oleh OJK dengan berbagai kriteria yang mengacu pada aturan syariah terkait saham dan pasar modal.
Sistem ini juga memiliki beberapa kriteria emiten seperti ketentuan total utang berbasis bunga yang tidak boleh melebihi 45% dan perbandingan antara pendapatan bunga dan pendapatan usaha tidak boleh lebih dari 10%. Sistem saham ini telah dijabarkan OJK untuk mengantisipasi saham bertentangan dengan syariat islam
### 3. Sukuk
Sukuk merupakan instrumen surat berharga yang menunjukkan besar kepemilikan aset atas sebuah perusahaan. Berbeda dengan obligasi yang akan memberikan keuntungan berupa bunga pinjaman, sukuk akan memberikan keuntungan dengan sistem bagi hasil keuntungan bisnis yang sudah ditanami modal.
Dalam securities crowdfunding, sukuk merupakan salah satu opsi pendanaan yang bisa digunakan bisnis dan ditawarkan kepada para investor.
### 4. Obligasi
Obligasi atau surat utang merupakan salah satu instrumen yang bisa ditawarkan dalam securities crowdfunding
Obligasi merupakan surat utang yang diperjualbelikan ditawarkan oleh sebuah bisnis atau instansi sebagai alternatif pendanaan mereka.
Obligasi merupakan surat utang berjangka yang bisa diperjualbelikan. Instansi yang yang mengeluarkan surat utang akan melakukan pembayaran kembali hutang mereka pada waktu jatuh tempo dengan disertakan bunga pinjaman yang sudah disepakati pada awal periode.
## Penutup
Gimana? Anda sekarang jadi paham bukan tentang bagaimana cara berinvestasi bisnis melalui securities crowdfunding. Apabila anda ingin mulai berinvestasi dengan skema ini, anda harus menemukan platform terpercaya yang sudah berizin resmi dari OJK sehingga investasi anda menjadi lebih aman.
Selain itu, anda juga harus memilih platform yang sudah berpengalaman dalam melakukan pendanaan ke berbagai sektor bisnis sehingga anda investasi bisnis yang anda lakukan pun menjadi semakin minim risiko.
Hal ini karena platform crowdfunding pihak yang akan membantu anda mengevaluasi apakah suatu bisnis potensial atau tidak dalam jangka panjang.
[LandX](https://landx.id/project/index.html) merupakan salah satu platform equity crowdfunding yang sedang berproses menjadi securities crowdfunding dengan pengalaman melakukan crowdfunding bagi berbagai [bisnis potensial](https://landx.id/project/index.html) dari berbagai bidang.
Dalam beberapa waktu terakhir [LandX](https://landx.id/project/index.html) telah melakukan listing beberapa bisnis dari sektor jasa, F&B, properti, hingga klinik kecantikan dengan potensi pertumbuhan yang menjanjikan.
Karena itu, tunggu apalagi….
**[Yuk Temukan Bisnis Terbaik untuk Investasi Anda di LandX](https://landx.id/project/index.html)**
**[Yuk Temukan Bisnis Menjanjikan dengan Keuntungan Terbaik di LandX](https://landx.id/project/?utm_source=Blog&utm_medium=organic+keyword&utm_campaign=blog&utm_id=Blog)**
[](https://landx.id/project/?utm_source=Blog&utm_medium=organic+keyword&utm_campaign=blog&utm_id=Blog)
**[Follow Kami Yuk di Instagram @landx.id Untuk Berbagai Update Terbaru Terkait Investasi](https://www.instagram.com/landx.id/?utm_medium=copy_link)**
- - -
**Baca Juga:**
* [Resmi Diluncurkan, Berikut Perbedaan antara Securities Crowdfunding (SCF) dan Equity Crowdfunding (ECF)](https://landx.id/blog/kenali-berbagai-istilah-dalam-securities-crowdfunding-agar-investasi-anda-menjadi-semakin-mudah/)
* [Kemudahan Investasi Bisnis Jangka Panjang melalui Securities Crowdfunding](https://landx.id/blog/securities-crowdfunding/)
- - -
#LandX.id #landx.id #InvestasiBisnis #SecuritiesCrowdfunding #EquityCrowdfunding #InvestasiMenguntungkan #Urundana #BisnisPatungan #InvestasiUsaha | 72.69697 | 324 | 0.829512 | ind_Latn | 0.966467 |
c1536f31982d8cfad44c5717fdcc8c5b4c99709d | 447 | md | Markdown | chapters/16/16-21.md | Raymain1944/CPPLv1 | 96e5fd5347a336870fc868206ebfe44f88ce69eb | [
"Apache-2.0"
] | null | null | null | chapters/16/16-21.md | Raymain1944/CPPLv1 | 96e5fd5347a336870fc868206ebfe44f88ce69eb | [
"Apache-2.0"
] | null | null | null | chapters/16/16-21.md | Raymain1944/CPPLv1 | 96e5fd5347a336870fc868206ebfe44f88ce69eb | [
"Apache-2.0"
] | null | null | null | ```c++
#include <iostream>
using std::cin; using std::cout; using std::endl; using std::cerr;
using std::ostream;
#include <string>
using std::string;
//函数对象类,对给定指针执行delete
class DebugDelete{
public:
DebugDelete(ostream &s = cerr) : os(s) {}
//与任何函数模板相同,T的类型由编译器推断
template <typename T> void operator()(T *p) const {
os << "deleting unique_ptr" << endl;
delete p;
}
private:
ostream &os;
};
int main(){}
```
| 18.625 | 66 | 0.626398 | eng_Latn | 0.507055 |
c154600536d9cbb9e1e2e9f79f4e660af9251c83 | 689 | md | Markdown | 2004/CVE-2004-1073.md | sei-vsarvepalli/cve | fbd9def72dd8f1b479c71594bfd55ddb1c3be051 | [
"MIT"
] | 4 | 2022-03-01T12:31:42.000Z | 2022-03-29T02:35:57.000Z | 2004/CVE-2004-1073.md | sei-vsarvepalli/cve | fbd9def72dd8f1b479c71594bfd55ddb1c3be051 | [
"MIT"
] | null | null | null | 2004/CVE-2004-1073.md | sei-vsarvepalli/cve | fbd9def72dd8f1b479c71594bfd55ddb1c3be051 | [
"MIT"
] | 1 | 2022-02-24T21:07:04.000Z | 2022-02-24T21:07:04.000Z | ### [CVE-2004-1073](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2004-1073)



### Description
The open_exec function in the execve functionality (exec.c) in Linux kernel 2.4.x up to 2.4.27, and 2.6.x up to 2.6.8, allows local users to read non-readable ELF binaries by using the interpreter (PT_INTERP) functionality.
### POC
#### Reference
- http://www.isec.pl/vulnerabilities/isec-0017-binfmt_elf.txt
#### Github
No GitHub POC found.
| 38.277778 | 223 | 0.7373 | eng_Latn | 0.274246 |
c15473f0fdf288c3c75f37fc363b116b743fec11 | 3,130 | md | Markdown | README.md | juanpicado/mem | 1d9493f4d0cdb045dc0e8251ec8d6fa78d64f5fc | [
"MIT"
] | null | null | null | README.md | juanpicado/mem | 1d9493f4d0cdb045dc0e8251ec8d6fa78d64f5fc | [
"MIT"
] | null | null | null | README.md | juanpicado/mem | 1d9493f4d0cdb045dc0e8251ec8d6fa78d64f5fc | [
"MIT"
] | null | null | null | # level-mem
> A convenience package that bundles [`levelup`](https://github.com/Level/levelup) and [`memdown`](https://github.com/Level/memdown) and exposes `levelup` on its export.
[![level badge][level-badge]](https://github.com/Level/awesome)
[](https://www.npmjs.com/package/level-mem)
[](https://www.npmjs.com/package/level-mem)
[](https://travis-ci.com/Level/mem)
[](https://coveralls.io/github/Level/mem)
[](https://standardjs.com)
[](https://www.npmjs.com/package/level-mem)
[](#backers)
[](#sponsors)
Use this package to avoid having to explicitly install `memdown` when you want to use `memdown` with `levelup` for non-persistent `levelup` data storage.
```js
const level = require('level-mem')
// 1) Create our database, with optional options.
// This will create or open the underlying LevelDB store.
const db = level()
// 2) Put a key & value
db.put('name', 'Level', function (err) {
if (err) return console.log('Ooops!', err) // some kind of I/O error
// 3) Fetch by key
db.get('name', function (err, value) {
if (err) return console.log('Ooops!', err) // likely the key was not found
// Ta da!
console.log('name=' + value)
})
})
```
See [`levelup`](https://github.com/Level/levelup) and [`memdown`](https://github.com/Level/memdown) for more details.
**If you are upgrading:** please see [`UPGRADING.md`](UPGRADING.md).
## Contributing
[`Level/mem`](https://github.com/Level/mem) is an **OPEN Open Source Project**. This means that:
> Individuals making significant and valuable contributions are given commit-access to the project to contribute as they see fit. This project is more like an open wiki than a standard guarded open source project.
See the [Contribution Guide](https://github.com/Level/community/blob/master/CONTRIBUTING.md) for more details.
## Donate
To sustain [`Level`](https://github.com/Level) and its activities, become a backer or sponsor on [Open Collective](https://opencollective.com/level). Your logo or avatar will be displayed on our 28+ [GitHub repositories](https://github.com/Level) and [npm](https://www.npmjs.com/) packages. 💖
### Backers
[](https://opencollective.com/level)
### Sponsors
[](https://opencollective.com/level)
## License
[MIT](LICENSE.md) © 2012-present [Contributors](CONTRIBUTORS.md).
[level-badge]: https://leveljs.org/img/badge.svg
| 46.716418 | 292 | 0.726518 | eng_Latn | 0.349869 |
c15540e3589adfcf97c8c876c2b13651c77c7c94 | 2,425 | md | Markdown | _captures/TLP2_20210514.md | Meteoros-Floripa/meteoros.floripa.br | 7d296fb8d630a4e5fec9ab1a3fb6050420fc0dad | [
"MIT"
] | 5 | 2020-05-19T17:04:49.000Z | 2021-03-30T03:09:14.000Z | _captures/TLP2_20210514.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | null | null | null | _captures/TLP2_20210514.md | Meteoros-Floripa/site | 764cf471d85a6b498873610e4f3b30efd1fd9fae | [
"MIT"
] | 2 | 2020-05-19T17:06:27.000Z | 2020-09-04T00:00:43.000Z | ---
layout: capture
label: 20210514
title: Capturas da estação TLP2
station: TLP2
date: 2021-05-14 21:24:54
preview: TLP2/2021/202105/20210514/stack.jpg
capturas:
- imagem: TLP2/2021/202105/20210514/M20210514_212454_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210514_212916_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210514_213932_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210514_213944_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210514_214731_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_021339_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_025747_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_031655_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_032021_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_032809_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_033758_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_034414_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_034827_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_035215_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_035843_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_041918_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_042013_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_053428_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_065305_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_070954_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_074942_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_075532_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_075540_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_075548_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_075603_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_075613_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_075627_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_081453_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_081732_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_081859_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_081906_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_082239_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_082312_TLP_2P.jpg
- imagem: TLP2/2021/202105/20210514/M20210515_082916_TLP_2P.jpg
---
| 55.113636 | 65 | 0.813196 | yue_Hant | 0.046258 |
c156acc3dc3b5fffb52cfaba51f81ab698bbc3a0 | 294 | md | Markdown | ssrmu_port_migration.md | crossfw/Wiki | db9aeb5b22a9d6f6633cf8c7608a3f16dc4df6bf | [
"MIT"
] | null | null | null | ssrmu_port_migration.md | crossfw/Wiki | db9aeb5b22a9d6f6633cf8c7608a3f16dc4df6bf | [
"MIT"
] | null | null | null | ssrmu_port_migration.md | crossfw/Wiki | db9aeb5b22a9d6f6633cf8c7608a3f16dc4df6bf | [
"MIT"
] | null | null | null | # SS/SSR端口偏移
设置完单端口后,如果想指定某些SS/SSR节点使用不同与默认的端口, 则需要端口偏移.
## 面板设置
在SS/SSR节点地址后加入<br>
";port=1234#4321"(不含引号)<br>
此处1234为默认的单端口, 4321为该节点实际使用的端口号
## 配置文件设置
修改
>SSPanel根目录/config/.config.php
把以下变量修正为true
>$_ENV['mu_port_migration'] = true; //为后端直接下发偏移后的端口
至此, SS/SSR后端即可获得偏移后的端口号
| 16.333333 | 63 | 0.734694 | yue_Hant | 0.201769 |
c156de13df6298778f2593c95c87ab96691be960 | 32 | md | Markdown | README.md | AliumFX/Module-Email | 0c2e3f8c41f90174dbcdecb0e8028952df51cb32 | [
"Apache-2.0"
] | null | null | null | README.md | AliumFX/Module-Email | 0c2e3f8c41f90174dbcdecb0e8028952df51cb32 | [
"Apache-2.0"
] | null | null | null | README.md | AliumFX/Module-Email | 0c2e3f8c41f90174dbcdecb0e8028952df51cb32 | [
"Apache-2.0"
] | null | null | null | # Module-Email
The Email module
| 10.666667 | 16 | 0.78125 | kor_Hang | 0.410245 |
c157b9f35043d347d75b4bb7406d8cf25daf3476 | 150 | md | Markdown | README.md | Badrucyber/darkfbMod | 7f8fd4d4d2c1da0eb843a014e4df15fc0d21e81a | [
"Apache-2.0"
] | 1 | 2019-07-15T20:24:32.000Z | 2019-07-15T20:24:32.000Z | README.md | cyber2611/wira | 309d8a7246727b8c041f3a114baab88af688bd65 | [
"Apache-2.0"
] | null | null | null | README.md | cyber2611/wira | 309d8a7246727b8c041f3a114baab88af688bd65 | [
"Apache-2.0"
] | 2 | 2019-07-23T19:49:16.000Z | 2020-01-10T19:06:23.000Z | pkgupgrade
pkgupdate
instal pkgpython2
instal pip2mekanisasi
instal pip2requests
gitclonehttps: // github.com/cyber2611/wira
cd wira
ls
python2mod.py
| 15 | 43 | 0.853333 | ind_Latn | 0.378371 |
c1586a79f1c1675a517df4fc980e6cf1314fd94d | 160 | md | Markdown | tty/class_tty_readstream.md | xxuuzhe/node-api-cn | 5edc6072250d63bb2807ab6b9cf0b23f6fece8d6 | [
"CC-BY-4.0"
] | 1 | 2021-11-22T07:54:42.000Z | 2021-11-22T07:54:42.000Z | tty/class_tty_readstream.md | lexmin0412/node-api-cn | 5edc6072250d63bb2807ab6b9cf0b23f6fece8d6 | [
"CC-BY-4.0"
] | null | null | null | tty/class_tty_readstream.md | lexmin0412/node-api-cn | 5edc6072250d63bb2807ab6b9cf0b23f6fece8d6 | [
"CC-BY-4.0"
] | null | null | null | <!-- YAML
added: v0.5.8
-->
`tty.ReadStream` 类是 [`net.Socket`] 的子类,表示 TTY 的可读端。
在正常情况下,[`process.stdin`] 将是 Node.js 进程中唯一的 `tty.ReadStream` 实例,并且没有理由创建其他实例。
| 20 | 76 | 0.675 | yue_Hant | 0.301433 |
c159028447c668722f80e2aad185e041706597e0 | 2,325 | md | Markdown | index.md | mpcodemonkey/mvre | 19e1348a22d0997515b1537b6ac5d8cf86220cac | [
"BSD-3-Clause"
] | 2 | 2016-11-08T17:43:13.000Z | 2017-01-27T07:01:55.000Z | index.md | mpcodemonkey/mvre | 19e1348a22d0997515b1537b6ac5d8cf86220cac | [
"BSD-3-Clause"
] | 1 | 2017-01-27T06:47:37.000Z | 2017-02-17T07:37:34.000Z | index.md | mpcodemonkey/mvre | 19e1348a22d0997515b1537b6ac5d8cf86220cac | [
"BSD-3-Clause"
] | null | null | null | # MVRE
## Introduction
MVRE is a mobile virtual reality game engine built on top of WebGL and WebVR. Its purpose as of now is to show the extent to which the WebVR spec and WebGL 2.0 have matured, and to serve as an intermediate introduction for those looking to develop web-based VR games. Although the focus is on mobile, any game produced using this engine is (unless directly stated) playable on any desktop VR device.
This project is heavily based on the work done by Brandon Jones in his VR Presentation example. It uses webvr-polyfill by default until WebVR is enabled in Google Chrome and Mozilla Firefox by default.
## Status
The engine is going through quite a bit of refactoring. Most of the planned features are implemented or in-progress, and so demos will be coming out shortly.
## Working Examples
* [#1-Webgl2-and-Scenegraph](https://mpcodemonkey.github.io/mvre/examples/01-Webgl2-and-Scenegraph/)
* [#2-Textured-Objects](https://mpcodemonkey.github.io/mvre/examples/02-Textured-Objects/)
## Planned Features
* 3D Audio
## In-progress Features
* Physics - gravity is present, collision detection currently underway. Cannon.js used for physics
* HUD - based on wglu-stats.js made by Brandon Jones
## Completed Features
* Validation of Webgl 2.0 support in WebVR
* Scenegraph - single parent, multiple child graph
* Wavefront .OBJ support, ported from the java version present in __Computer Graphics Programming in OpenGL with Java__ by Scott Gordon and John Clevenger
## Future Work
* Lighting and Shadows - lighting and shadows are not currently in the scope of this project. The original purpose was to identify and create a simple to use engine for mobile VR. It's emphasis is on checking the current state of WebVR. Lighting is an important engine feature, but does not add to the research being done.
* Skeletal animation - Andrea Venuta provides a wonderful explanation [here](http://veeenu.github.io/2014/05/09/implementing-skeletal-animation.html) on how skeletal animation works conceptually, and provides some code that can be used as a basis for animation in this engine. Given the time, I would love to work on finishing this.
## Notes
Game engine development is hard but rewarding. Never let the difficulty of a project be a roadblock that turns you away.
| 70.454545 | 400 | 0.778495 | eng_Latn | 0.998034 |
c1592574b4c6b8ac803a854dcbf0c22ba932a78a | 663 | md | Markdown | _posts/software/2019-05-07-computer-software-43.md | Townwang/Townwang.github.io | 9250f8e7ea8ff433361b74a2e7c90fbf33fe60a4 | [
"MIT"
] | 5 | 2017-03-20T09:04:44.000Z | 2019-08-25T19:59:19.000Z | _posts/software/2019-05-07-computer-software-43.md | Townwang/townwang.github.io | 099df0e2d041fabe5ac19c16e7f4adc6abc3413b | [
"MIT"
] | 224 | 2017-07-27T00:40:45.000Z | 2021-02-18T20:50:43.000Z | _posts/software/2019-05-07-computer-software-43.md | Townwang/townwang.github.io | 099df0e2d041fabe5ac19c16e7f4adc6abc3413b | [
"MIT"
] | 1 | 2018-09-03T09:26:54.000Z | 2018-09-03T09:26:54.000Z | ---
layout: post
title: "〖Ska大神★精品★破解〗:磁力猫v1.7.3会员版 ★VIP全部免费用"
date: 2019-05-07
categories: software
tags: 手机软件
author: Town
cover: https://i.loli.net/2019/05/07/5cd123181c68e.jpg
---
## 软件介绍
软件名称:磁力猫(*VIP*)
软件版本:v1.7.3_破解_会员_至尊版
软件语言:中文
软件大小:4.5M
软件包名:com.magnet.torrent.cat
支持系统:Android 2.2 及更高版本
测试机型:小米 9
.
此版本感谢大神[Skaura]完美破解,安装就是专业版,欢迎各位机友下载。
·磁力猫致力打造手机端的磁力搜索神器,本App搜索结果来自互联网,实现了一站式全网磁力多引擎磁力种子搜索功能,和多个影视站点的资源聚合。
## 软件截图


## 下载地址
<span id="psd">
[磁力猫v1.7.3会员版 ★VIP全部免费用](https://www.lanzous.com/i42881i)
</span>
| 16.575 | 68 | 0.726998 | yue_Hant | 0.801525 |
c1597017b9215284d84d7a582c9ef2fab86ab9b4 | 509 | md | Markdown | guide/english/certifications/javascript-algorithms-and-data-structures/basic-javascript/count-backwards-with-a-for-loop/index.md | frodriguez20H/freeCodeCamp | df189b5e24db8c4f7a37c2ab8ed788d780d051ed | [
"BSD-3-Clause"
] | 2 | 2019-08-16T21:35:10.000Z | 2020-08-12T06:09:54.000Z | guide/english/certifications/javascript-algorithms-and-data-structures/basic-javascript/count-backwards-with-a-for-loop/index.md | mpdo2017/freeCodeCamp | ddd6c228ada3bdc7469b7c26eb8fa67aca0b15a2 | [
"BSD-3-Clause"
] | 4 | 2019-07-29T13:13:24.000Z | 2020-12-09T12:33:19.000Z | guide/english/certifications/javascript-algorithms-and-data-structures/basic-javascript/count-backwards-with-a-for-loop/index.md | mpdo2017/freeCodeCamp | ddd6c228ada3bdc7469b7c26eb8fa67aca0b15a2 | [
"BSD-3-Clause"
] | 1 | 2020-07-31T21:22:18.000Z | 2020-07-31T21:22:18.000Z | ---
title: Count Backwards With a For Loop
---
# Count Backwards With a For Loop
---
## Hints
### Hint 1
* create a new for loop for myArray
### Hint 2
* start from the first odd number just before 9
---
## Solutions
<details><summary>Solution 1 (Click to Show/Hide)</summary>
```javascript
var ourArray = [];
for (var i = 10; i > 0; i -= 2) {
ourArray.push(i);
}
// Setup
var myArray = [];
// Only change code below this line.
for (var i = 9; i > 0; i -= 2) {
myArray.push(i);
}
```
</details> | 13.756757 | 59 | 0.609037 | eng_Latn | 0.905545 |
c159ac777b77993772d96ddd4780cdcd15c0a441 | 336 | md | Markdown | _kaartenproject/e0cf714d-9aeb-44c9-ad43-9c8b2bb0d587.md | sammeltassen/tresor-maps | dddf1e93cafbe846d5044f3d87b031f489112201 | [
"MIT"
] | null | null | null | _kaartenproject/e0cf714d-9aeb-44c9-ad43-9c8b2bb0d587.md | sammeltassen/tresor-maps | dddf1e93cafbe846d5044f3d87b031f489112201 | [
"MIT"
] | null | null | null | _kaartenproject/e0cf714d-9aeb-44c9-ad43-9c8b2bb0d587.md | sammeltassen/tresor-maps | dddf1e93cafbe846d5044f3d87b031f489112201 | [
"MIT"
] | null | null | null | ---
pid: e0cf714d-9aeb-44c9-ad43-9c8b2bb0d587
idno: TRL-11.4.1.03
thumbnail: https://dlc.services/thumbs/7/4/e0cf714d-9aeb-44c9-ad43-9c8b2bb0d587/full/400,339/0/default.jpg
manifest: https://dlc.services/iiif-resource/delft/string1string2string3/kaartenproject-2007/TRL-11.4.1.03
order: '568'
layout: map
collection: kaartenproject
---
| 33.6 | 106 | 0.785714 | kor_Hang | 0.12022 |
c15a29c0511bf71485bc3fd4bbaf0b7a9dc98fab | 1,671 | md | Markdown | assets/0300-0399/0344.Reverse String/README_EN.md | yanglr/LeetCodeOJ | 27dd1e4a2442b707deae7921e0118752248bef5e | [
"MIT"
] | 45 | 2021-07-25T00:45:43.000Z | 2022-03-24T05:10:43.000Z | assets/0300-0399/0344.Reverse String/README_EN.md | yanglr/LeetCodeOJ | 27dd1e4a2442b707deae7921e0118752248bef5e | [
"MIT"
] | null | null | null | assets/0300-0399/0344.Reverse String/README_EN.md | yanglr/LeetCodeOJ | 27dd1e4a2442b707deae7921e0118752248bef5e | [
"MIT"
] | 15 | 2021-07-25T00:40:52.000Z | 2021-12-27T06:25:31.000Z | # [344. Reverse String](https://leetcode.com/problems/reverse-string)
## Description
<p>Write a function that reverses a string. The input string is given as an array of characters <code>s</code>.</p>
<p> </p>
<p><strong>Example 1:</strong></p>
<pre><strong>Input:</strong> s = ["h","e","l","l","o"]
<strong>Output:</strong> ["o","l","l","e","h"]
</pre><p><strong>Example 2:</strong></p>
<pre><strong>Input:</strong> s = ["H","a","n","n","a","h"]
<strong>Output:</strong> ["h","a","n","n","a","H"]
</pre>
<p> </p>
<p><strong>Constraints:</strong></p>
<ul>
<li><code>1 <= s.length <= 10<sup>5</sup></code></li>
<li><code>s[i]</code> is a <a href="https://en.wikipedia.org/wiki/ASCII#Printable_characters" target="_blank">printable ascii character</a>.</li>
</ul>
<p> </p>
<p><strong>Follow up:</strong> Do not allocate extra space for another array. You must do this by modifying the input array <a href="https://en.wikipedia.org/wiki/In-place_algorithm" target="_blank">in-place</a> with <code>O(1)</code> extra memory.</p>
## Solutions
<!-- tabs:start -->
### **Python3**
```python
class Solution:
def reverseString(self, s: List[str]) -> None:
"""
Do not return anything, modify s in-place instead.
"""
s[:] = s[::-1]
```
### **Java**
```java
class Solution {
public void reverseString(char[] s) {
int n;
if (s == null || (n = s.length) < 2) return;
int i = 0, j = n - 1;
while (i < j) {
char t = s[i];
s[i] = s[j];
s[j] = t;
++i;
--j;
}
}
}
```
### **...**
```
```
<!-- tabs:end -->
| 23.871429 | 252 | 0.541592 | eng_Latn | 0.368837 |
c15a546a88ba690bc6ba149e738bea1f52e9152c | 2,231 | md | Markdown | help/analyze/report-builder/workbook-library/protect-wb.md | AdobeDocs/analytics.ja-JP | 25c344f8ce00cab4d2b94065619f61e3605e9f5f | [
"MIT"
] | 1 | 2022-02-08T03:31:50.000Z | 2022-02-08T03:31:50.000Z | help/analyze/report-builder/workbook-library/protect-wb.md | AdobeDocs/analytics.ja-JP | 25c344f8ce00cab4d2b94065619f61e3605e9f5f | [
"MIT"
] | 8 | 2022-01-25T15:50:51.000Z | 2022-03-26T04:19:21.000Z | help/analyze/report-builder/workbook-library/protect-wb.md | AdobeDocs/analytics.ja-JP | 25c344f8ce00cab4d2b94065619f61e3605e9f5f | [
"MIT"
] | null | null | null | ---
description: ワークブックをロックすることで、リクエストの編集を防止できます。これにより、編集効率を高めるため、すべてのレポートリクエストを一時停止してワークブックをオフラインで編集できるようにします。
title: ワークブックのロック/ロック解除
uuid: ef5c276c-5f74-4741-b6fa-4c79eda29f62
feature: Report Builder
role: User, Admin
exl-id: b5a83532-9fa7-4f1f-b744-e5d74781fffb
source-git-commit: 7226b4c77371b486006671d72efa9e0f0d9eb1ea
workflow-type: tm+mt
source-wordcount: '470'
ht-degree: 100%
---
# ワークブックのロック/ロック解除
ワークブックをロックすることで、リクエストの編集を防止できます。これにより、編集効率を高めるため、すべてのレポートリクエストを一時停止してワークブックをオフラインで編集できるようにします。
アナリストは、ワークブックをロックすることによって、組織内の他のユーザーがワークブックリクエストを変更できないように保護できます。ただし、他のユーザーはワークブックのデータを更新することは可能です。
ワークブックが編集されないように保護するには、Report Builder ツールバーの「**[!UICONTROL ロック済み]**」(
)をクリックします。
ワークブックの保護を解除するには、「**[!UICONTROL ロック解除]**」(
)をクリックします。
次のいずれかの権限を持っている場合に、ロックされたワークブックのロックを解除できます。
* 管理者である場合。
* ワークブックを最初にロックしたユーザーである場合。この場合は、管理者である必要はありません。
>[!NOTE]
>
>ワークブックのロックを解除する権限を持っていない場合は、保護されたワークブックにリクエストを追加できません。
ワークブックがロックされてリクエストの編集ができない場合、ユーザーの操作が次のように制限されます。
* ユーザーは、リクエストを作成/追加できません。
* ユーザーは、リクエストウィザードを使用してリクエストを編集できません。
* ユーザーは、複数のリクエストを編集する機能を使用してリクエストを編集できません。
* ユーザーは、リクエストを切り取り/コピー/貼り付けできません。ただし、ユーザーは、Excel 標準の切り取り/コピー/貼り付けコンテキストメニューを使用して、リクエストの内容を切り取り/コピー/貼り付けすることができます。
* ユーザーは、リクエストを個別に更新したり、グループの一部として更新したりできます。
* リクエストで、セルの入力値(日付範囲、セグメント、フィルター)が使用されている場合、ユーザーは、セル内でこれらの値を変更でき、これらを更新することでリクエストを間接的に編集できます。
保護されているワークブックの編集を(コンテキストメニュー、**[!UICONTROL リクエストマネージャー]**、**[!UICONTROL 複数のリクエストを編集]**​を使用して)試みた場合、編集できるかどうかは権限に応じて次のように決定されます。
* リクエストのロックを解除する権限を持っていない場合は、次のプロンプトダイアログが表示されます。

* 必要な権限を持っている場合、プロンプトは表示されず、リクエストを編集できます。
## ワークフロー {#section_260D05FF632B41DB97DB43E2ADBE2E75}
ワークブック A に、ユーザー A が作成し、ロックされた状態のリクエストが 1 件あるとします。
**例 1:管理者ユーザー(またはユーザー A)**
1. ユーザーは、Report Builder にログインし、ワークブックを開きます。
1. ワークブック A は現在ロックされているので、ツールバーの「リクエストを作成」ボタンは無効になっています。また、他のすべてのボタンも、ロックによって機能が無効になっています。
1. ユーザーが無効なボタンのいずれかの使用を試みると、ワークブックが現在ロックされているというメッセージが表示されます。
1. ユーザーはワークブックのロックを解除できます。ロックを解除することで、すべての編集機能が有効になります。
1. ロックを解除した後、ワークブックは、再度明示的にロックするまでロック解除の状態のままになります。
**例 2:管理者以外のユーザー(ユーザー B)**
1. ユーザーは、Report Builder にログインし、ワークブックを開きます。
1. ユーザーは、リクエストを追加/編集できません。
1. ユーザーは、ワークブックのロックを解除できません。
| 30.986111 | 134 | 0.833707 | jpn_Jpan | 0.939083 |
c15a69ab582f1ed51f84ab33b59214cf7bf356f0 | 70 | md | Markdown | README.md | todaygood/HelloSpringBoot | e423e8e06a2d17dffd8deb0aab1a154516600af3 | [
"Apache-2.0"
] | null | null | null | README.md | todaygood/HelloSpringBoot | e423e8e06a2d17dffd8deb0aab1a154516600af3 | [
"Apache-2.0"
] | null | null | null | README.md | todaygood/HelloSpringBoot | e423e8e06a2d17dffd8deb0aab1a154516600af3 | [
"Apache-2.0"
] | null | null | null | # HelloSpringBoot
Hello,SpringBoot
# Guide
jdk: 10
在Idea中import
| 7 | 17 | 0.742857 | yue_Hant | 0.973221 |
c15bdb04a4a4ba937f5ac7fef1b0099b733b2852 | 17,645 | md | Markdown | src/firehose_of_rust/README.md | alexzanderr/rust-cheat-sheet | 015a5cf3140b599c859c7176d5201d92f5811d35 | [
"MIT"
] | null | null | null | src/firehose_of_rust/README.md | alexzanderr/rust-cheat-sheet | 015a5cf3140b599c859c7176d5201d92f5811d35 | [
"MIT"
] | null | null | null | src/firehose_of_rust/README.md | alexzanderr/rust-cheat-sheet | 015a5cf3140b599c859c7176d5201d92f5811d35 | [
"MIT"
] | null | null | null | # Firehose of Rust
## First of all, credits
credit goes to: `Jack O'Connor`
materials included in this cheat sheet
- [`slides`](https://jacko.io/firehose_of_rust)
- [`presentation`](https://www.youtube.com/watch?v=FSyfZVuD32Y)
# A Dangling Pointer in Rust
```rust
let my_int_ptr: &i32;
{
let my_int: i32 = 5;
my_int_ptr = &my_int;
}
dbg!(*my_int_ptr);
```
after run
```shell
error[E0597]: `my_int` does not live long enough
--> examples/firehose_dangling_pointer.rs:7:22
|
7 | my_int_ptr = &my_int;
| ^^^^^^^ borrowed value does not live long enough
8 | }
| - `my_int` dropped here while still borrowed
9 | dbg!(*my_int_ptr);
| ----------- borrow later used here
For more information about this error, try `rustc --explain E0597`.
error: could not compile `cheat-sheet` due to previous error
```
solution to fix this:
```shell
you dont make to things that dont live long enough
```
# A borrowing view
```rust
let my_string: String =
"abcdefghijklmnopqrstuvwxy".to_string();
let my_string_view: &str = (my_string + "z").as_str();
dbg!(my_string_view);
```
after run
```shell
error[E0716]: temporary value dropped while borrowed
--> src/main.rs:5:9
|
5 | (my_string + "z").as_str();
| ^^^^^^^^^^^^^^^^^ - temporary value is freed at the end of this statement
| |
| creates a temporary which is freed while still in use
6 | dbg!(my_string_view);
| -------------- borrow later used here
|
= note: consider using a `let` binding to create a longer lived value
```
solution to fix this: you put the temp value into a variable
```rust
let my_string: String =
"abcdefghijklmnopqrstuvwxy".to_string();
let temp_value_not_temp_anymore = my_string + "z";
let my_string_view: &str = temp_value_not_temp_anymore.as_str();
// what if i drop `temp_value_not_temp_anymore` here
// drop(temp_value_not_temp_anymore);
dbg!(my_string_view);
```
# A long lived container
```rust
let mut my_vector: Vec<&str> = Vec::new();
{
let my_string = "hello world".to_string();
my_vector.push(&my_string);
}
dbg!(my_vector);
```
after run
```shell
error[E0597]: `my_string` does not live long enough
--> src/main.rs:5:24
|
5 | my_vector.push(&my_string);
| ^^^^^^^^^^ borrowed value does not live long enough
6 | }
| - `my_string` dropped here while still borrowed
7 | dbg!(my_vector);
| --------- borrow later used here
```
# An invalid function
```rust
fn my_push_back(v: &mut Vec<&str>, s: &str) {
v.push(s);
}
fn main() {
let mut my_vector: Vec<&str> = Vec::new();
{
let my_string = "hello world".to_string();
my_push_back(&mut my_vector, &my_string);
}
dbg!(my_vector);
}
```
after run
```shell
error[E0623]: lifetime mismatch
--> src/main.rs:2:12
|
1 | fn my_push_back(v: &mut Vec<&str>, s: &str) {
| ---- ---- these two types are declared with different lifetimes...
2 | v.push(s);
| ^ ...but data from `s` flows into `v` here
```
rust analyses the function first. it assumes the s reference doesnt live long enough to be put in the `v` container and that is very corect. it could really happen, as presented in the above code.
solution to fix this: add explicit lifetimes
```rust
fn my_push_back<'a>(v: &mut Vec<&'a str>, s: &'a str) {
v.push(s);
}
fn main() {
let mut my_vector: Vec<&str> = Vec::new();
{
let my_string = "hello world".to_string();
my_push_back(&mut my_vector, &my_string);
}
dbg!(my_vector);
}
```
NOTE: this doesnt fix the code, these explicit lifetimes only fix the function
the error still remains
```shell
error[E0597]: `my_string` does not live long enough
--> src/main.rs:9:38
|
9 | my_push_back(&mut my_vector, &my_string);
| ^^^^^^^^^^ borrowed value does not live long enough
10 | }
| - `my_string` dropped here while still borrowed
11 | dbg!(my_vector);
| --------- borrow later used here
```
same thing as first example with pointer, cannot have reference to something that was deallocated from the memory
explanation: the elements in the vector and new element that will arive in the vector `must` have the same lifetime, meaning: they must have the same life span
think of 2 people with different ages. for example, 4 this to work, both people must die at the same time or have both 70 years of lifespan for example.
you cannot have 2 people with one dying sooner than the other. same principle to references in rust
# Mutable aliasing
```rust
let mut my_int = 5;
let reference1 = &mut my_int;
let reference2 = &mut my_int;
*reference1 += 1;
*reference2 += 1;
assert_eq!(my_int, 7);
```
after run
```shell
error[E0499]: cannot borrow `my_int` as mutable more than once at a time
--> src/main.rs:4:22
|
3 | let reference1 = &mut my_int;
| ----------- first mutable borrow occurs here
4 | let reference2 = &mut my_int;
| ^^^^^^^^^^^ second mutable borrow occurs here
5 | *reference1 += 1;
| ---------------- first borrow later used here
```
# Multiple references into an array
```rust
let mut char_array: [char; 2] = ['a', 'b'];
let first_element = &mut char_array[0];
let second_element = &char_array[1];
*first_element = *second_element;
assert_eq!(char_array[0], 'b');
```
after run
```shell
error[E0502]: cannot borrow `char_array[_]` as immutable because it is also borrowed as mutable
--> src/main.rs:6:26
|
5 | let first_element = &mut char_array[0];
| ------------------ mutable borrow occurs here
6 | let second_element = &char_array[1];
| ^^^^^^^^^^^^^^ immutable borrow occurs here
7 | *first_element = *second_element;
| -------------------------------- mutable borrow later used here
```
solution to this:
1. `split_at_mut`
```rust
let mut char_array: [char; 2] = ['a', 'b'];
let (first_slice, rest_slice) =
char_array.split_at_mut(1);
let first_element = &mut first_slice[0];
let second_element = &rest_slice[0];
*first_element = *second_element;
assert_eq!(char_array[0], 'b');
```
2. `unsafe code`
```rust
let mut char_array: [char; 2] = ['a', 'b'];
let first_element: *mut char = &mut char_array[0];
let second_element: *const char = &char_array[1];
unsafe {
*first_element = *second_element;
}
assert_eq!(char_array[0], 'b');
```
# Invalidating a reference by reallocating
```rust
fn push_int_twice(v: &mut Vec<i32>, n: &i32) {
v.push(*n);
v.push(*n);
}
fn main() {
let mut my_vector = vec![0];
let my_int_reference = &my_vector[0];
push_int_twice(&mut my_vector, my_int_reference);
}
```
after run
```shell
error[E0502]: cannot borrow `my_vector` as mutable because it is also borrowed as immutable
--> src/main.rs:12:20
|
11 | let my_int_reference = &my_vector[0];
| --------- immutable borrow occurs here
12 | push_int_twice(&mut my_vector, my_int_reference);
| ^^^^^^^^^^^^^^ ---------------- immutable borrow later used here
| |
| mutable borrow occurs here
```
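Because `i32` is `Copy`, the simplest fix here is to pass the integer by value, copying it out of the vector before the mutable borrow begins:

```rust
fn push_int_twice(v: &mut Vec<i32>, n: i32) {
    v.push(n);
    v.push(n);
}

fn main() {
    let mut my_vector = vec![0];
    let my_int = my_vector[0]; // i32 is Copy: this copies the value out
    push_int_twice(&mut my_vector, my_int);
    assert_eq!(my_vector, vec![0, 0, 0]);
}
```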
# Magical multi-threading
```rust
// serial for loop
let mut v: Vec<i32> = vector_of_ints();
for x in &mut v {
*x += 1;
}
// serial for_each
let mut v: Vec<i32> = vector_of_ints();
v.iter_mut().for_each(|x| {
*x += 1;
});
// Rayon parallel for_each
let mut v: Vec<i32> = vector_of_ints();
v.par_iter_mut().for_each(|x| {
*x += 1;
});
```
This code is fine: every `x` is incremented independently on its own thread (Rayon splits the work across threads), so there is no data race.
# Tragical multithreading
```rust
// serial for loop
let mut v: Vec<i32> = vector_of_ints();
let mut sum = 0;
for x in &mut v {
*x += 1;
sum += *x;
}
// very fine
// serial for_each
let mut v: Vec<i32> = vector_of_ints();
let mut sum = 0;
v.iter_mut().for_each(|x| {
*x += 1;
sum += *x;
});
// very fine
// Rayon parallel for_each
let mut v: Vec<i32> = vector_of_ints();
let mut sum = 0;
v.par_iter_mut().for_each(|x| {
*x += 1;
sum += *x;
});
// not fine
// compile error, data race
```
after run
```shell
error[E0594]: cannot assign to `sum`, as it is a captured variable in a `Fn` closure
--> src/main.rs:62:9
|
62 | sum += *x;
| ^^^^^^^^^ cannot assign
```
## How does Rust know about that `sum` error?
### Serial Iterator
```rust
fn for_each<F>(self, f: F)
where
Self: Sized,
F: FnMut(Self::Item),
{
#[inline]
fn call<T>(
mut f: impl FnMut(T),
) -> impl FnMut((), T) {
move |(), item| f(item)
}
self.fold((), call(f));
}
```
Note the `F: FnMut(Self::Item)` bound: the serial `for_each` lets the closure mutate captured variables like `sum`. Rayon's parallel `for_each` instead requires `Fn` (plus `Sync + Send`); `Fn` closures get only shared access to their captures, which is why the assignment to `sum` is rejected.
### ParallelIterator
```rust
fn for_each<OP>(self, op: OP)
where
OP: Fn(Self::Item) + Sync + Send,
{
for_each::for_each(self, &op)
}
```
# Synchronizing shared state
```rust
// using AtomicI32
let mut v: Vec<i32> = vector_of_ints();
let sum: AtomicI32 = AtomicI32::new(0);
v.par_iter_mut().for_each(|x| {
*x += 1;
sum.fetch_add(*x, Ordering::Relaxed);
});
// using Mutex<i32>
let mut v: Vec<i32> = vector_of_ints();
let sum: Mutex<i32> = Mutex::new(0);
v.par_iter_mut().for_each(|x| {
*x += 1;
let mut guard: MutexGuard<i32> = sum.lock().unwrap();
*guard += *x;
});
```
Both versions are fine: the `AtomicI32` and the `Mutex<i32>` each serialize the concurrent updates to `sum`.
# Moving a string
```rust
let s1 = "abcdefghijklmnopqrstuvwxyz".to_string();
let s2 = s1; // its moved,
let mut v = Vec::new();
v.push(s2); // its moved
// here v goes out of scope, only the destructor of v is called
```
One string allocation, one string destructor. How?
Look at the comments: each move transfers ownership of the same heap buffer, so nothing is copied and no destructor runs until `v`, the final owner, goes out of scope.
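You can watch this with a small wrapper type that counts its own destructor calls (a sketch; `Loud` is just an illustrative name): moves transfer ownership without running `Drop`:

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Counts how many times the destructor runs.
static DROPS: AtomicUsize = AtomicUsize::new(0);

struct Loud(String);

impl Drop for Loud {
    fn drop(&mut self) {
        DROPS.fetch_add(1, Ordering::Relaxed);
    }
}

fn main() {
    {
        let s1 = Loud("abcdefghijklmnopqrstuvwxyz".to_string());
        let s2 = s1; // moved: no destructor runs
        let mut v = Vec::new();
        v.push(s2); // moved again: still no destructor
        assert_eq!(DROPS.load(Ordering::Relaxed), 0);
    } // v goes out of scope here: exactly one destructor call
    assert_eq!(DROPS.load(Ordering::Relaxed), 1);
}
```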
# Copying a string
```rust
let s1 = "abcdefghijklmnopqrstuvwxyz".to_string();
let s2 = s1.clone();
let mut v = Vec::new();
v.push(s2.clone());
```
Three string allocations, three string destructors, because every `clone` copies the heap data inside the `String` into a new allocation.
# Accessing a moved-from object
```rust
let s1 = "abcdefghijklmnopqrstuvwxyz".to_string();
let s2 = s1;
dbg!(s1);
```
after run
```shell
error[E0382]: use of moved value: `s1`
--> src/main.rs:26:14
|
24 | let s1 = "abcdefghijklmnopqrstuvwxyz".to_string();
| -- move occurs because `s1` has type `String`, which does not implement the `Copy` trait
25 | let s2 = s1;
| -- value moved here
26 | dbg!(s1);
| ^^ value used here after move
```
`s1` is gone; you cannot use it anymore.
# Moving a borrowed object
```rust
let s1 = "abcde".to_string();
let my_view = s1.as_str();
let s2 = s1;
dbg!(my_view);
```
after run
```shell
error[E0505]: cannot move out of `s1` because it is borrowed
--> src/main.rs:31:18
|
30 | let my_view = s1.as_str();
| -- borrow of `s1` occurs here
31 | let s2 = s1;
| ^^ move out of `s1` occurs here
32 | dbg!(my_view);
| ------- borrow later used here
```
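Thanks to non-lexical lifetimes, the borrow ends at its last use, so simply using the view before the move compiles:

```rust
fn main() {
    let s1 = "abcde".to_string();
    let my_view = s1.as_str();
    dbg!(my_view); // last use of the borrow: it ends here
    let s2 = s1;   // now moving s1 is allowed
    assert_eq!(s2, "abcde");
}
```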
# Moving a string again
```rust
let s1 = "abcdefghijklmnopqrstuvwxyz".to_string();
let s2 = s1;
let mut v = Vec::new();
v.push(s2);
let s3 = v[0];
```
after run
```shell
error[E0507]: cannot move out of index of `Vec<String>`
--> src/main.rs:86:18
|
86 | let s3 = v[0];
| ^^^^
| |
| move occurs because value has type `String`, which does not implement the `Copy` trait
| help: consider borrowing here: `&v[0]`
```
### Another example
```rust
fn f(s1: &mut String) {
let s2 = *s1;
dbg!(s2);
}
fn g() {
let mut s1 = "foo".to_string();
f(&mut s1);
dbg!(s1);
}
```
```shell
error[E0507]: cannot move out of `*s1` which is behind a mutable reference
--> src/main.rs:7:14
|
7 | let s2 = *s1;
| ^^^
| |
| move occurs because `*s1` has type `String`, which does not implement the `Copy` trait
| help: consider borrowing here: `&*s1`
```
## How to fix it
### mem::swap
```rust
fn f(s1: &mut String) {
let mut s2 = "".to_string();
mem::swap(s1, &mut s2);
dbg!(s2);
}
fn g() {
let mut s1 = "foo".to_string();
f(&mut s1);
dbg!(s1);
}
```
s1 == ""
s2 == "foo"
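For types that implement `Default`, such as `String`, `std::mem::take` is shorthand for exactly this swap-with-an-empty-value pattern:

```rust
use std::mem;

fn f(s1: &mut String) {
    let s2 = mem::take(s1); // leaves String::default() == "" behind
    dbg!(s2);
}

fn main() {
    let mut s1 = "foo".to_string();
    f(&mut s1);
    assert_eq!(s1, "");
}
```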
### Option::take
```rust
fn f(s1: &mut Option<String>) {
let s2 = s1.take().unwrap();
dbg!(s2);
}
fn g() {
let mut s1: Option<String> =
Some("foo".to_string());
f(&mut s1);
dbg!(s1);
}
```
s1 == None
s2 == "foo"
### Vec::remove
```rust
fn f(v: &mut Vec<String>) {
let s2 = v.remove(0);
dbg!(s2);
}
fn g() {
let mut v = vec![
"foo".to_string(),
"bar".to_string(),
"baz".to_string(),
];
f(&mut v);
dbg!(v);
}
```
v == ["bar", "baz"]
s2 == "foo"
# The drop function
```rust
let file = File::open("/dev/null")?;
drop(file);
```
## Surprise: drop is the empty function
```rust
pub fn drop<T>(_x: T) {}
```
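It works because `drop` takes its argument by value: ownership moves into the call, and the destructor runs when the unused parameter goes out of scope at the end of the empty body. A hand-rolled version behaves the same, verified here with a `Cell` flag:

```rust
use std::cell::Cell;

fn my_drop<T>(_x: T) {} // same shape as std::mem::drop

fn main() {
    let dropped = Cell::new(false);

    struct Flag<'a>(&'a Cell<bool>);
    impl<'a> Drop for Flag<'a> {
        fn drop(&mut self) {
            self.0.set(true);
        }
    }

    let flag = Flag(&dropped);
    assert!(!dropped.get());
    my_drop(flag); // ownership moves in; the destructor runs inside the call
    assert!(dropped.get());
}
```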
# Putting it all together
`Arc<Mutex<String>>`
# A mutex on the stack
```rust
let my_string: Mutex<String> = Mutex::new(String::new());
let mut thread_handles = Vec::new();
for _ in 0..10 {
let thread_handle = thread::spawn(|| {
let mut guard: MutexGuard<String> =
my_string.lock().unwrap();
guard.push_str("some characters");
});
thread_handles.push(thread_handle);
}
for thread_handle in thread_handles {
thread_handle.join().unwrap();
}
```
after run
```shell
error[E0373]: closure may outlive the current function, but it borrows `my_string`, which is owned by the current function
--> src/main.rs:9:43
|
9 | let thread_handle = thread::spawn(|| {
| ^^ may outlive borrowed value `my_string`
10 | let mut guard: MutexGuard<String> =
11 | my_string.lock().unwrap();
| --------- `my_string` is borrowed here
|
```
## How does Rust know?
for_each
```rust
fn for_each<OP>(self, op: OP)
where
OP: Fn(Self::Item) + Sync + Send,
{
for_each::for_each(self, &op)
}
```
spawn
```rust
pub fn spawn<F, T>(f: F) -> JoinHandle<T>
where
F: FnOnce() -> T,
F: Send + 'static,
T: Send + 'static,
{
Builder::new()
.spawn(f)
.expect("failed to spawn thread")
}
```
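Besides moving the mutex to the heap (next section), scoped threads (`std::thread::scope`, stable since Rust 1.63) are another fix: they are guaranteed to be joined before the scope returns, so they may borrow the stack-allocated mutex directly:

```rust
use std::sync::Mutex;
use std::thread;

fn main() {
    let my_string: Mutex<String> = Mutex::new(String::new());

    thread::scope(|s| {
        for _ in 0..10 {
            s.spawn(|| {
                let mut guard = my_string.lock().unwrap();
                guard.push_str("some characters");
            });
        }
    }); // every spawned thread is joined here

    assert_eq!(
        my_string.into_inner().unwrap().len(),
        10 * "some characters".len()
    );
}
```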
# A mutex on the heap
```rust
let my_string: Arc<Mutex<String>> =
Arc::new(Mutex::new(String::new()));
let mut thread_handles = Vec::new();
for _ in 0..10 {
let arc_clone = my_string.clone();
let thread_handle = thread::spawn(move || {
let mut guard: MutexGuard<String> =
arc_clone.lock().unwrap();
guard.push_str("some characters");
});
thread_handles.push(thread_handle);
}
for thread_handle in thread_handles {
thread_handle.join().unwrap();
}
```
## Forgetting the mutex
```rust
let my_string: Arc<String> = Arc::new(String::new());
let mut thread_handles = Vec::new();
for _ in 0..10 {
let mut arc_clone = my_string.clone();
let thread_handle = thread::spawn(move || {
arc_clone.push_str("some characters");
});
thread_handles.push(thread_handle);
}
for thread_handle in thread_handles {
thread_handle.join().unwrap();
}
```
after run
```shell
error[E0596]: cannot borrow data in an `Arc` as mutable
--> src/main.rs:46:13
|
46 | arc_clone.push_str("some characters");
| ^^^^^^^^^ cannot borrow as mutable
|
= help: trait `DerefMut` is required to modify through a dereference, but it is not implemented for `Arc<String>`
```
## Writing under a read lock
```rust
let my_string: Arc<RwLock<String>> =
Arc::new(RwLock::new(String::new()));
let mut thread_handles = Vec::new();
for _ in 0..10 {
let arc_clone = my_string.clone();
let thread_handle = thread::spawn(move || {
let mut guard: RwLockReadGuard<String> =
arc_clone.read().unwrap();
guard.push_str("some characters");
});
thread_handles.push(thread_handle);
}
for thread_handle in thread_handles {
thread_handle.join().unwrap();
}
```
after run
```shell
error[E0596]: cannot borrow data in a dereference of `RwLockReadGuard<'_, String>` as mutable
--> src/main.rs:65:13
|
65 | guard.push_str("some characters");
| ^^^^^ cannot borrow as mutable
|
= help: trait `DerefMut` is required to modify through a dereference, but it is not implemented for `RwLockReadGuard<'_, String>`
```
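Writers have to ask for the write lock: `write()` returns an `RwLockWriteGuard`, which does implement `DerefMut`:

```rust
use std::sync::{Arc, RwLock};
use std::thread;

fn main() {
    let my_string: Arc<RwLock<String>> = Arc::new(RwLock::new(String::new()));
    let mut thread_handles = Vec::new();
    for _ in 0..10 {
        let arc_clone = my_string.clone();
        thread_handles.push(thread::spawn(move || {
            let mut guard = arc_clone.write().unwrap(); // RwLockWriteGuard
            guard.push_str("some characters");
        }));
    }
    for thread_handle in thread_handles {
        thread_handle.join().unwrap();
    }
    assert_eq!(
        my_string.read().unwrap().len(),
        10 * "some characters".len()
    );
}
```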
## Rust giveth and Rust taketh away
```rust
let my_string: Arc<Mutex<String>> =
Arc::new(Mutex::new(String::new()));
let mut thread_handles = Vec::new();
for _ in 0..10 {
let arc_clone = my_string.clone();
let thread_handle = thread::spawn(move || {
let mut guard = arc_clone.lock().unwrap();
let smuggled_ptr: &mut String = &mut *guard;
drop(guard);
smuggled_ptr.push_str("some characters");
});
thread_handles.push(thread_handle);
}
for thread_handle in thread_handles {
thread_handle.join().unwrap();
}
```
after run
```shell
error[E0505]: cannot move out of `guard` because it is borrowed
--> src/main.rs:84:18
|
83 | let smuggled_ptr: &mut String = &mut *guard;
| ----- borrow of `guard` occurs here
84 | drop(guard);
| ^^^^^ move out of `guard` occurs here
85 | smuggled_ptr.push_str("some characters");
| ------------ borrow later used here
```
A simple error for such a complicated thing, huh?
# Special Thanks
to `Jack O'Connor` | 24.472954 | 196 | 0.596033 | eng_Latn | 0.934248 |
c15c13fc34db86265e7f9935ec4eb104d5945a19 | 2,372 | md | Markdown | README.md | YZITE/APC_Temp_fetch | 95491d8d638b2daa8e49d7ff790c5811dd97cb2e | [
"Apache-2.0"
] | null | null | null | README.md | YZITE/APC_Temp_fetch | 95491d8d638b2daa8e49d7ff790c5811dd97cb2e | [
"Apache-2.0"
] | null | null | null | README.md | YZITE/APC_Temp_fetch | 95491d8d638b2daa8e49d7ff790c5811dd97cb2e | [
"Apache-2.0"
] | null | null | null | # APC_Temp_fetch
This python package provides an unified interface to several UPS
network adapters with different firmware versions. It does not
support guessing which interface to use; that wasn't necessary yet.
It supports the following interfaces:
| `kind` | `description` |
|--------|---------------|
| `old` | Simple interface using HTTP basic auth, information gets extracted from `upsstat.htm`, by finding the line mentioning `Internal Temperature` |
| `frmnc` | Requires 2 HTTP requests, main characteristics are that the login form is named `frmLogin` in HTML, and the data is presented in a `<div>`-table, but parsable line-by-line |
| `gden-nt07` | Similar to `frmnc`; with 3 HTTP requests, but the login form is named `HashForm1` instead, and the data representation is more complex, so a more full-blown HTML-parser is used; data structured via `<span>` items |
| `cs141` | Simple JSON API (and the imo best interface of these); with 3 HTTP requests, characteristic is the the data is located at `/api/devices/ups/report` |
The description provides some guidance which `kind` an interface is.
You may also just try out each `kind` and check if any returns useful results,
but (although unlikely) it may "brick" the network adapter until it is reset;
usually seen when (thru some bug) the script isn't able to properly logout
after a successful login, because many of these network adapters limit the
number of concurrent logins, often to even just 1 user at a time.
This package has 2 entry points:
* `APC_Tempf [--verbose] <kind> <host> <user> <password> [--timeout <timeout>]`
`kind` is the one of these mentioned above.
`host`, `user` and `password` should be self-explanatory.
`timeout` is an optional per-request timeout.
This is the primary, "simple" interface.
* `APC_Tempstfe [--verbose] <apclist>`
allows querying multiple UPS devices sequentially, and is used to
amortize the python script startup time.
The file should contain lines of the format `<kind> <host> <user> <password>[ <timeout>]`
(all without the `<>` brackets).
After the timeout, a comment may be given; alternatively, a comment can be placed
at the beginning of a line, prefixed with `# `; empty lines are ignored.
Results are formatted as `<host>\t<temperature>` (`<>` brackets aren't part of the output,
`\t` is replaced with an ASCII TAB character).
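A minimal `apclist` file might look like this (hosts, users, and passwords are made-up placeholders):

```
# office UPS, old firmware, 5 second timeout
old 192.0.2.10 apc apc 5
# rack UPS with a CS141 card, default timeout
cs141 192.0.2.11 admin secret
```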
| 60.820513 | 230 | 0.740725 | eng_Latn | 0.998977 |
c15cbb3db0216425b7938524eb2a0a7c32d5c099 | 2,360 | md | Markdown | docs/user-guide/stores.md | jmelinav/feast | f56e5f2536980a6235ddf18df7c580811abd6cce | [
"Apache-2.0"
] | null | null | null | docs/user-guide/stores.md | jmelinav/feast | f56e5f2536980a6235ddf18df7c580811abd6cce | [
"Apache-2.0"
] | 6 | 2020-10-26T17:50:33.000Z | 2022-02-10T02:01:28.000Z | docs/user-guide/stores.md | jmelinav/feast | f56e5f2536980a6235ddf18df7c580811abd6cce | [
"Apache-2.0"
] | 1 | 2022-01-08T12:05:29.000Z | 2022-01-08T12:05:29.000Z | # Stores
In Feast, a store describes a database that is populated with feature data in order to be served to models.
Feast supports two classes of stores
* Historical stores
* Online stores
In order to populate these stores, Feast Core creates a long running ingestion job that streams in data from all feature sources to all stores that subscribe to those feature sets.

## Historical Stores
Historical stores maintain a complete history of feature data for the feature sets they are subscribed to.
Feast currently only supports [Google BigQuery](https://cloud.google.com/bigquery) as a feature store, but we have [developed a storage API ](https://github.com/gojek/feast/issues/482)that makes adding a new store possible.
Each historical store models its data differently, but in the case of a relational store \(like BigQuery\), each feature set maps directly to a table. Each feature and entity within a feature set maps directly to a column within a table.
Data from historical stores can be used to train a model. In order to retrieve data from a historical store it is necessary to connect to a Feast Serving deployment and request historical features. Please see feature retrieval for more details.
{% hint style="danger" %}
Data is persisted in historical stores like BigQuery in log format. Repeated ingestions will duplicate the data that is persisted in the store. Feast will automatically deduplicate data during retrieval, but it doesn't currently remove data from the stores themselves.
{% endhint %}
## Online Stores
Online stores maintain only the latest values for a specific feature. Feast currently supports Redis as an online store. Online stores are meant for very high throughput writes from ingestion jobs and very low latency access to features during online serving.
Please continue to the [feature retrieval](feature-retrieval.md) section for more details on retrieving data from online storage.
## Subscriptions
Stores are populated by ingestion jobs \(Apache Beam\) that retrieve feature data from sources based on subscriptions. These subscriptions are typically defined by the administrators of the Feast deployment. In most cases a store would simply subscribe to all features, but in some cases it may subscribe to a subset in order to improve performance or efficiency.
| 62.105263 | 363 | 0.799576 | eng_Latn | 0.999093 |
c15cecb29ffd4f870de872cf7e18bcd3e79e471e | 206 | md | Markdown | content/lua/skill/intimidation.md | xackery/eqquestapi | e3bb4d58651c7c2bb1ced94deb59115946eed3c5 | [
"MIT"
] | null | null | null | content/lua/skill/intimidation.md | xackery/eqquestapi | e3bb4d58651c7c2bb1ced94deb59115946eed3c5 | [
"MIT"
] | 1 | 2020-09-08T17:21:08.000Z | 2020-09-08T17:21:08.000Z | content/lua/skill/intimidation.md | xackery/eqquestapi | e3bb4d58651c7c2bb1ced94deb59115946eed3c5 | [
"MIT"
] | 1 | 2020-08-29T00:49:26.000Z | 2020-08-29T00:49:26.000Z | ---
title: Skill Intimidation
searchTitle: Lua Skill Intimidation
weight: 1
hidden: true
menuTitle: Skill Intimidation
---
## Intimidation
This represents a Intimidation Skill
```lua
Skill.Intimidation
``` | 15.846154 | 36 | 0.776699 | eng_Latn | 0.673332 |
c15d8804ad0fca81237bb864eba1b895e0db48b1 | 2,199 | md | Markdown | src/ms/2019-04/06/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 68 | 2016-10-30T23:17:56.000Z | 2022-03-27T11:58:16.000Z | src/ms/2019-04/06/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 367 | 2016-10-21T03:50:22.000Z | 2022-03-28T23:35:25.000Z | src/ms/2019-04/06/02.md | PrJared/sabbath-school-lessons | 94a27f5bcba987a11a698e5e0d4279b81a68bc9a | [
"MIT"
] | 109 | 2016-08-02T14:32:13.000Z | 2022-03-31T10:18:41.000Z | ---
title: 'Rakyat Berhimpun'
date: 03/11/2019
---
`Read Nehemiah 8:1, 2. What should this tell us about how important the Word of God is to His people?`
When the Jews finished rebuilding the wall and moved back into Jerusalem, they gathered in the open square of Jerusalem in the seventh month. The seventh month, the month of Tishri, was the most important month for the people of Israel, because it was dedicated to the Feast of Trumpets (the preparation for God's judgment, on the first day of the month), the Day of Atonement (the Day of Judgment, on the 10th day of the month), and the Feast of Tabernacles (the feast commemorating God's deliverance of His people from Egypt and His provision during the wandering in the wilderness, on the 15th day of the month). The assembly took place on the first day of the month, the day the Feast of Trumpets was celebrated. The leaders called the men and women of the nation to attend this special assembly so that, through the reading of the Law, they would have the opportunity to learn about God and their history.
The people invited Ezra to bring the book of the Law of Moses before them and read it. They also built a platform with a pulpit for the occasion. This was not something the leaders forced on the congregation. Rather, "they," that is, the people, told Ezra to bring the Book. Most likely Ezra read to the people from the book of Moses, including the law that had been given to Moses at Mount Sinai.
`Read Deuteronomy 31:9–13. What did God tell them to do, and what lessons can we take from this for ourselves?`
In Deuteronomy 31:9-13, Moses told the Israelites that, during the Feast of Tabernacles, they were to gather together and read the Law of God, and he named the various groups that were to be assembled: men, women, children, and the foreigners staying in their households.
`The text of Nehemiah 8:1 literally says that they gathered together "as one man." What does this tell us about the importance of unity among believers?`
c15e7ded3213455808850efa137cbbb1d033ddbb | 1,116 | md | Markdown | README.md | monbarkley/vb01 | 2ccce3cbdc081a0004fac7480bb83d1e4146eeea | [
"Apache-2.0"
] | null | null | null | README.md | monbarkley/vb01 | 2ccce3cbdc081a0004fac7480bb83d1e4146eeea | [
"Apache-2.0"
] | null | null | null | README.md | monbarkley/vb01 | 2ccce3cbdc081a0004fac7480bb83d1e4146eeea | [
"Apache-2.0"
] | null | null | null | # vb01
ABC
#include <bits/stdc++.h>
using namespace std;
// Tên chương trình
const string NAME = "template";
// Số test kiểm tra
const int NTEST = 100;
mt19937 rd(chrono::steady_clock::now().time_since_epoch().count());
#define rand rd
// Viết lại hàm random để sử dụng cho thuận tiện. Hàm random này sinh ngẫu nhiên số trong phạm vi long long, số sinh ra >= l và <= h.
long long Rand(long long l, long long h) {
assert(l <= h);
return l + rd() * 1LL * rd() % (h - l + 1);
}
int main()
{
srand(time(NULL));
for (int iTest = 1; iTest <= NTEST; iTest++)
{
ofstream inp((NAME + ".inp").c_str());
// Code phần sinh test ở đây
inp.close();
// Nếu dùng Linux thì "./" + Tên chương trình
system((NAME + ".exe").c_str());
system((NAME + "_trau.exe").c_str());
// Nếu dùng linux thì thay fc bằng diff
if (system(("fc " + NAME + ".out " + NAME + ".ans").c_str()) != 0)
{
cout << "Test " << iTest << ": WRONG!\n";
return 0;
}
cout << "Test " << iTest << ": CORRECT!\n";
}
return 0;
}
| 27.9 | 133 | 0.536738 | vie_Latn | 0.963544 |
c15ff6ae47109cc59135a58a46ec44a43f96e9ae | 56 | md | Markdown | README.md | andrewgierens/freedive-trainer | 2d0b35ba30c02600942bf289e6940d89265c61fa | [
"MIT"
] | null | null | null | README.md | andrewgierens/freedive-trainer | 2d0b35ba30c02600942bf289e6940d89265c61fa | [
"MIT"
] | null | null | null | README.md | andrewgierens/freedive-trainer | 2d0b35ba30c02600942bf289e6940d89265c61fa | [
"MIT"
] | null | null | null | # freedive-trainer
Free, open source freediving trainer
| 18.666667 | 36 | 0.821429 | eng_Latn | 0.445301 |
c1608af325fa5e2d93a97a95d45fe124b6c1a5f6 | 566 | md | Markdown | basic-english-book1/lesson13-nobody-is-at-home.md | levy9527/ivy-english | 8c258a2834a51d090c0bed044bbee93dee211aa4 | [
"MIT"
] | null | null | null | basic-english-book1/lesson13-nobody-is-at-home.md | levy9527/ivy-english | 8c258a2834a51d090c0bed044bbee93dee211aa4 | [
"MIT"
] | null | null | null | basic-english-book1/lesson13-nobody-is-at-home.md | levy9527/ivy-english | 8c258a2834a51d090c0bed044bbee93dee211aa4 | [
"MIT"
] | null | null | null | Nobody is at home at the Wangs' house. Mr. Wang is working in his office. Mrs. Wang is shopping at the supermarket.
Tony is sitting on the bus. He's on his way on the gym. Tina is studying at the library.
Rover is not at home, either. It's running around the neighborhood. It's not chasing Mrs. Lee's cat. It's chasing Mrs. Lee!
## Grammar Points
be at home 固定短语,表示在家里
shopping 是不及物动词,要加一个for才能接名词。比如她在买一条裙子,She's shopping for a new dress
in 与 at 的区别:
- in 强调在...里面
- at 表示在...,没有强调之间,可以在门口,可以在里面
in 与 on 的区别:
- in 不能站起来,in the car/taxi
- on 可以站起来 on the bus
| 29.789474 | 123 | 0.729682 | eng_Latn | 0.999311 |
c161147df3b0020f0b6da452bac611e192781c65 | 74 | md | Markdown | _data/references.md | geoidapp/geoidapp.github.io | 44ca45a4187f5320ecad907f0a26bee6b50a2246 | [
"Apache-2.0"
] | 1 | 2016-10-26T15:46:39.000Z | 2016-10-26T15:46:39.000Z | _data/references.md | geoidapp/geoidapp.github.io | 44ca45a4187f5320ecad907f0a26bee6b50a2246 | [
"Apache-2.0"
] | null | null | null | _data/references.md | geoidapp/geoidapp.github.io | 44ca45a4187f5320ecad907f0a26bee6b50a2246 | [
"Apache-2.0"
] | null | null | null | bigheader: "References"
toc:
- title: References
path: /docs/reference/
| 14.8 | 24 | 0.72973 | eng_Latn | 0.570718 |
c1612dfdbd4bc02ba6ecde87089d82b1dcb672c3 | 35 | md | Markdown | README.md | estissy/phpelements | 75b487f0fae77b9861465574afbac8d53ad18c19 | [
"MIT"
] | null | null | null | README.md | estissy/phpelements | 75b487f0fae77b9861465574afbac8d53ad18c19 | [
"MIT"
] | null | null | null | README.md | estissy/phpelements | 75b487f0fae77b9861465574afbac8d53ad18c19 | [
"MIT"
] | null | null | null | # phpelements
Set of html elements
| 11.666667 | 20 | 0.8 | kor_Hang | 0.552784 |
c1617ba8f6dcdfa2b27f5e06e2ce12ef9f066754 | 1,205 | md | Markdown | AlchemyInsights/add-setup-email-accounts.md | isabella232/OfficeDocs-AlchemyInsights-pr.pt-BR | 61447e6f79e6a5ca40e5e50168da971230d5f9b7 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-19T19:07:30.000Z | 2020-05-19T19:07:30.000Z | AlchemyInsights/add-setup-email-accounts.md | isabella232/OfficeDocs-AlchemyInsights-pr.pt-BR | 61447e6f79e6a5ca40e5e50168da971230d5f9b7 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2022-02-09T06:58:01.000Z | 2022-02-09T06:58:58.000Z | AlchemyInsights/add-setup-email-accounts.md | isabella232/OfficeDocs-AlchemyInsights-pr.pt-BR | 61447e6f79e6a5ca40e5e50168da971230d5f9b7 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2019-10-11T19:14:27.000Z | 2021-10-13T10:06:06.000Z | ---
title: 764 Adicionar/Configurar contas de email
ms.author: pdigia
author: pebaum
ms.date: 04/21/2020
ms.audience: ITPro
ms.topic: article
ms.service: o365-administration
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.custom:
- "764"
- "1800018"
ms.assetid: afd20b89-09e9-4746-ac16-e282382dd948
ms.openlocfilehash: d7e015bd79addc9a1abe1a9a7973fe923c934bf60c2891f4454c13622a2b8a9f
ms.sourcegitcommit: b5f7da89a650d2915dc652449623c78be6247175
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 08/05/2021
ms.locfileid: "53935505"
---
# <a name="set-up-email-accounts"></a>Set up email accounts

1. In Outlook, click **File** > **Info** > **Add Account**.
2. Enter your email address and click **Connect**.
3. Enter your password and click **OK**.
4. You can repeat these steps to add multiple accounts.

**Note**: Some email accounts, such as Gmail, Yahoo, and iCloud, require you to set up two-factor authentication before Outlook can send and receive email. See [Add an email account to Outlook](https://support.office.com/article/6e27792a-9267-4aa4-8bb6-c84ef146101b.aspx) for more information.
c161f3a95a46c363a2fb2cef62d3480b02e30e63 | 144 | md | Markdown | benchmark/README.md | bisakhmondal/biscuitdb | 0e7625eea8d05be25af026ffffd77f46df6194cc | [
"Apache-2.0"
] | 3 | 2021-08-14T15:25:23.000Z | 2022-02-08T05:14:28.000Z | benchmark/README.md | bisakhmondal/biscuitdb | 0e7625eea8d05be25af026ffffd77f46df6194cc | [
"Apache-2.0"
] | 5 | 2021-03-18T19:12:31.000Z | 2021-03-21T13:25:44.000Z | benchmark/README.md | bisakhmondal/biscuitdb | 0e7625eea8d05be25af026ffffd77f46df6194cc | [
"Apache-2.0"
] | 3 | 2021-08-14T14:00:40.000Z | 2021-09-04T03:49:43.000Z | # Benchmarks Directory
These are the microbenchmarks for the DBMS. We use the [Google Benchmark](https://github.com/google/benchmark) library.
| 36 | 119 | 0.791667 | eng_Latn | 0.85768 |
c16287e5967d1c5eae672f108b252c7a2537cf2e | 14,425 | md | Markdown | submission/response_to_reviewers.md | SchlossLab/Tomkovich_PPI_mSphere_2019 | 04f69d5e497304ac4034aed2c86d2238e0602ae5 | [
"CC0-1.0",
"MIT"
] | null | null | null | submission/response_to_reviewers.md | SchlossLab/Tomkovich_PPI_mSphere_2019 | 04f69d5e497304ac4034aed2c86d2238e0602ae5 | [
"CC0-1.0",
"MIT"
] | null | null | null | submission/response_to_reviewers.md | SchlossLab/Tomkovich_PPI_mSphere_2019 | 04f69d5e497304ac4034aed2c86d2238e0602ae5 | [
"CC0-1.0",
"MIT"
] | null | null | null | ---
output:
pdf_document: default
html_document: default
---
### Editor Comments:
**1. Both reviewers commented about the appropriateness of strain that was used in your experiments. Please provide a justification for why this strain was chosen.**
> We have responded to the reviewers' comments below regarding why we chose to test *C. difficile* 630 and have added the following sentence to the discussion to acknowledge the strain choice as a potential limitation:
"The type of *C. difficile* strain type could also be an important contributing factor, our study was limited in that we only tested *C. difficile* 630 (ribotype 012)."
**2. Both reviewers commented on the low number of mice used in the experiments, perhaps making it hard to pick up any small differences. Please add some additional discussion around this point.**
> We have responded to the reviewers' comments below and added the following sentence to the discussion regarding the low number of mice used:
"One limitation of our study is that there were only 4-5 mice per group, which may have limited our ability to identify PPI-induced changes in specific bacteria genera."
**3. Lastly, from my own reading of the paper, I recall that rodent gut microflora consist mainly of Gram-positive organisms, whereas human gut flora is mainly Gram-negatives and anaerobes. One would presume that any treatment that disrupts the microbiome would interact very differently with these two environments, and that the rodent microbiome may not accurately reflect what happens in the human gut. This caveat is hinted at in lines 106-107 of your manuscript, but I would like you to make this point more clearly.**
> We have added an additional sentence to precede the sentence from lines 106-107 to draw attention to the differences in intestinal microbiome composition between mice and humans:
"*While the intestinal microbiomes of both humans and mice are dominated by the Bacteroidetes and Firmicutes phyla, there are significant differences in the relative abundances of genera that are present and some genera are unique to each mammal (Hugenholtz & de Vos Cell. Mol. Life Sci. 2018), differences that may partly explain our results.* The microbiota and physiological differences between humans and mice may limit the usefulness of employing mouse models to study the impact of PPIs on the microbiota and CDIs."
### Reviewer #1 (Comments for the Author):
**In their manuscript Tomkovich et al. address the question whether the correlative association of proton pump inhibitors with the occurrence of CDI is due to the changes in the microbiome following PPI administration, as is speculated in the field. This correlation, drawn mainly from epidemiological studies and typically retrospectively, lacks experimental evaluation. Hence, the authors used the mouse colonization model of C. difficile. They checked the changes in the microbiome and the colonization efficiencies in mice treated either with the PPI omeprazole or clindamycin, or with a combination of both. They show that efficient colonization is dependent on clindamycin treatment prior to infection and that omeprazole alone has no significant effect on the diversity of the microbiome and does not result in effective colonization by C. difficile. As this is the first attempt to experimentally address a question arising from medical practice and observation, the study and its findings are interesting.**
**Nevertheless, the study has some points that need to be addressed, at least in the discussion, before publication.**
**1. Is it correct that the study was performed on max. 4 mice/treatment group? This might be too low a number, especially in case of the colonization studies, to draw conclusions.**
> The reviewer is correct that the number of mice per group was low. We state the exact number of mice per treatment group in Figure 1A (5 mice each for 2 of the treatment groups and 4 mice for the 3rd treatment group). We agree that this is a low number, but would argue that the low microbiome variation across laboratory mice means fewer mice are needed compared to human epidemiological studies when examining *C. difficile* colonization. Previous *C. difficile* infection mouse model studies from our group and a close collaborator have used 3-5 mice per treatment group (see Koenigsknecht et al. Infection and Immunity 2015 and Theriot et al. Gut Microbes 2011) and were still able to draw meaningful conclusions. We have added the following sentence to the discussion to acknowledge the low number of mice used for these experiments as a limitation:
"One limitation of our study is that there were only 4-5 mice per group, which may have limited our ability to identify PPI-induced changes in specific bacteria genera."
**2. The authors should address the fact that they have used only C. difficile str. 630 for their experiments. This is a ribotype 012 strain that is not predominant in the clinical setting. Which ribotypes are typically observed in PPI associated CDI, is there a tendency towards a certain ribotype? Could this have an impact on the interpretation of the data derived from the model described in this manuscript?**
> We thank the reviewers for pointing out that the type of *C. difficile* strain could be an important factor influencing the association between PPIs and CDIs observed in human epidemiological studies. Unfortunately, the studies to date seldom examine the *C. difficile* strains found in PPI users that develop CDIs. The studies that do discuss *C. difficile* strains merely note the type of *C. difficile* frequently found in the region where the study was conducted (Roughead et al. Expert Opinion on Drug Safety 2016; McDonald et al. JAMA Internal Medicine 2015; Chitnis et al. JAMA Internal Medicine 2013; Vardakas et al. International Journal of Infectious Diseases 2012; Dalton et al. Alimentary Pharmacology & Therapeutics 2008). Our lab prefers using *C. difficile* 630 for experiments because we are interested in colonization and clearance. *C. difficile* VPI 10463 administration to mice results in a more aggressive disease that leads to mortality. Using *C. difficile* 630 after clindamycin administration results in an intermediate phenotype, where the mice are colonized but able to clear the infection within 10 days. The intermediate phenotype with *C. difficile* 630 allows us to examine whether the addition of a drug exacerbates or prevents infection and to track the mice for a longer period of time. In addition to our lab's prior experience with the *C. difficile* 630 infection mouse model, we also decided to use *C. difficile* 630 because of a recent study that demonstrated three proton pump inhibitors had no effect on *C. difficile* 630 growth *in vitro* (Maier et al. Nature 2018). We have added the following sentence to the discussion to acknowledge that the type of *C. difficile* strain is another potential contributing factor to the association between PPIs and CDIs seen in human epidemiological studies:
"The type of *C. difficile* strain type could also be an important contributing factor, however our study was limited in that we only tested *C. difficile* 630 (ribotype 012)."
**3. As the manuscript is pretty concise and has in total 4 figures, the supplementary figures can be integrated into the main body.**
> We have submitted this manuscript as an Observation, which is limited to 2 figures total. Additionally, in response to Reviewer #2, we have expanded the size of all the panels in our figures, limiting our ability to include additional panels for the 2 main text figures. Because of the changes to the sizes of the figures, we have opted to keep the supplementary figures as supplemental.
**5. There are some studies that looked into the effect of PPIs on microbiomes in cats and infants, which seem to be in accordance with the findings in this manuscript. Hence, the authors should include these in their discussion:**
**- Schmid SM, et al.,2018**
**- Castellani C, et al., 2017**
> Thank you for the suggestion, we have added the following sentence to the discussion referring to these studies to indicate that other groups have also observed minimal changes in the mammalian microbiome after proton pump inhibitor treatment:
"Interestingly, other groups examining fecal microbiota communities before and after PPI administration to healthy cats and infants with gastroesophageal reflux disease, found PPIs have minimal effects on fecal bacterial community structures, although there were a few significant changes in specific genera (Schmid et al. Frontiers in Veterinary Science 2018; Castellani et al. Frontiers in Cellular and Infection Microbiology 2017)."
**6. It is becoming more and more accepted that Clostridium difficile, due to its biology, is a separate genus, resulting in Clostridioides difficile as the accepted name. The authors should give credit to this and at least state it as Clostridium (Clostridioides) difficile in the text the first time they use it.**
> We appreciate the suggestion and have changed "*Clostridium*" to "*Clostridioides*" in the title, abstract, importance, and introduction sections of our manuscript.
### Reviewer #2 (Comments for the Author):
**The manuscript by Tomkovich et al tests if the PPI omeprazole can influence the fecal microbiome of mice and promote C. difficile infection. The study is based on epidemiological findings in humans that associate increased PPI use with CDI as well as a few mouse studies that implicate PPIs in CDI. The data presented here show no significant impact of omeprazole on the microbiome of mice after 7 days of treatment and the mice did not become susceptible to C. diff colonization. This is a small study but adds to the literature in that it conclusively shows no impact of omeprazole on the microbiome.**
**1. The size of the figures and the fonts are very small and should be improved in a resubmission. It makes it difficult to see individual points in the graphs.**
> We have adjusted the sizes of all figures and increased the sizing of the fonts and points in the graphs.
**2. The choice of CD630 and dose may have a big impact on the results and comparison with previous studies. In ref. 15 the authors used a more virulent strain (VPI) at a 10,000-fold increased concentration. The authors do a good job of citing other factors that may influence the results (like the increased pH of the mouse stomach), but this should also be noted for readers.**
> We thank the reviewer for pointing out *C. difficile* strain type as an additional factor that could explain the association between proton pump inhibitors and CDIs in human studies, which was also pointed out by Reviewer # 1. In response, we have added the following sentence to the discussion:
"The type of *C. difficile* strain type could also be an important contributing factor, however our study was limited in that we only tested *C. difficile* 630 (ribotype 012)."
**3. Please provide evidence for how the dose of 40mg/kg of OMZ corresponds to what humans get in a dose.**
> Omeprazole is prescribed to adults in 20 mg or 40 mg doses that are taken once per day. The 40 mg/kg dose administered to the mice in our study is higher than what is administered to humans, translating to ~3.25 mg/kg (compared to ~0.33-0.66 mg/kg for the prescribed human dose) based on a reference body weight of 60 kg for humans and 0.02 kg for mice (converted according to the calculation described in Nair & Jacob J Basic Clin Pharm. 2016). However, previous studies have justified administering omeprazole at higher doses to mice because the drug administered to the mice lacks an enteric coating (Llorente et al. Nature Communications 2017) and is eliminated more quickly compared to humans (plasma elimination half-life of 5-15 minutes compared to 1 hour for humans, see Regardh et al. Scandinavian Journal of Gastroenterology 1985). The 40 mg/kg dose chosen for our study is in line with what was used by one of the previous studies that examined the proton pump inhibitor esomeprazole and *C. difficile* in mice (see Hung et al. Journal of Infectious Diseases 2015). We have added the following sentence to the methods section to account for the higher dosing:
"Although the omeprazole dose administered to mice is higher than the recommended dose for humans, omeprazole has a shorter half-life in mice compared to humans (Regardh et al. Scandinavian Journal of Gastroenterology 1985) and lacks an enteric coating (Llorente et al. Nature Communications 2017)."
**4. Although OMZ did not have a major impact on community structure by itself, in figure 2 it seems that it did have an impact on a few groups that were treated with clindamycin. Although the P-values aren't significant, this could be mainly due to the low number of mice that were used in the study as a few of the genera showed 10-fold changes. I can see why it isn't commented on due to the lack of stat significance, but it is pretty striking as presented in the figure.**
> We agree with the reviewer regarding Figure 2C, where the relative abundances for specific taxa in the Clindamycin + omeprazole treated mice were typically in between the clindamycin-treated and omeprazole-treated mice. Beyond the P-values not being significant, we decided not to comment on this because we suspect that the majority of the differences in bacterial relative abundances across groups is driven by the clindamycin treatment. We have added the following sentence to the discussion to acknowledge the low number of mice used as a potential limitation in detecting omeprazole-induced changes in the fecal microbiota:
"One limitation of our study is that there were only 4-5 mice per group, which may have limited our ability to identify PPI-induced changes in bacteria genera."
**5. The final sentence of the importance section is a run-on sentence and needs to be broken up or simplified.**
> To simplify the sentence, we have removed the part that elaborates on what other factors ("composition of the starting bacterial community, comorbidities, and use of additional medications") might be contributing to the correlations between PPIs and CDIs seen in humans.
**6. There are many typos and missed italics in the references.**
> Thank you for pointing these errors out, typos have been corrected and italics have been added to the references.
| 148.71134 | 1,831 | 0.799653 | eng_Latn | 0.999691 |
c16307aecf67cf0be9f9b44299e1cb212de00dec | 1,587 | md | Markdown | articles/virtual-machines/linux/premium-storage.md | nt-7/azure-docs.ja-jp | 06311dbe497ae0d6c994cb732fb447760834fa46 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/linux/premium-storage.md | nt-7/azure-docs.ja-jp | 06311dbe497ae0d6c994cb732fb447760834fa46 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/virtual-machines/linux/premium-storage.md | nt-7/azure-docs.ja-jp | 06311dbe497ae0d6c994cb732fb447760834fa46 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Linux VM 向けの高パフォーマンス Premium Storage と Azure Managed Disks | Microsoft Docs
description: Azure VM 向けの高パフォーマンスの Premium Storage と管理ディスクについて説明します。 Azure DS シリーズ、DSv2 シリーズ、GS シリーズ、および Fs シリーズの VM は、Premium Storage をサポートしています。
services: virtual-machines-linux
documentationcenter: ''
author: ramankumarlive
manager: jeconnoc
editor: tysonn
ms.assetid: e2a20625-6224-4187-8401-abadc8f1de91
ms.service: virtual-machines-linux
ms.workload: infrastructure-services
ms.tgt_pltfrm: na
ms.devlang: na
ms.topic: article
ms.date: 03/30/2018
ms.author: ramankum
ms.openlocfilehash: fa64f1f1b25646eeaaf701bb98b1a9bfd7b2166d
ms.sourcegitcommit: 0a84b090d4c2fb57af3876c26a1f97aac12015c5
ms.translationtype: HT
ms.contentlocale: ja-JP
ms.lasthandoff: 07/11/2018
ms.locfileid: "38478088"
---
[!INCLUDE [virtual-machines-common-premium-storage.md](../../../includes/virtual-machines-common-premium-storage.md)]
### <a name="design-and-implement-with-premium-storage"></a>Design and implement with Premium Storage
* [Design for performance with Premium Storage](premium-storage-performance.md)
* [Use Blob storage with Premium Storage](http://go.microsoft.com/fwlink/?LinkId=521969)
### <a name="operational-guidance"></a>Operational guidance
* [Migrating to Azure Premium Storage](../../storage/common/storage-migration-to-premium-storage.md)
### <a name="blog-posts"></a>Blog posts
* [Azure Premium Storage generally available](https://azure.microsoft.com/blog/azure-premium-storage-now-generally-available-2/)
* [Announcing the GS-series: Adding Premium Storage support to the largest VMs in the public cloud](https://azure.microsoft.com/blog/azure-has-the-most-powerful-vms-in-the-public-cloud/)
c163a7e93bf49f986388ffee6d9e5611b4a87a5d | 1,249 | md | Markdown | README.md | OnlyOneCookie/Kaida | 3b500e6d09bcd8810e9a6c8622681ad7481a5518 | [
"MIT"
] | null | null | null | README.md | OnlyOneCookie/Kaida | 3b500e6d09bcd8810e9a6c8622681ad7481a5518 | [
"MIT"
] | 5 | 2020-09-11T15:02:33.000Z | 2020-10-04T06:47:06.000Z | README.md | OnlyOneCookie/Kaida | 3b500e6d09bcd8810e9a6c8622681ad7481a5518 | [
"MIT"
] | 1 | 2020-12-16T10:44:29.000Z | 2020-12-16T10:44:29.000Z | # DEFCON
| | Status |
| --- | --- |
| Build | soonTM |
| Discord | [](https://discord.gg/WgUDVAk) |
| CodeFactor |  |
| License |  |
DEFCON will be your very next ~~system~~ bot that contains everything and will serve your ~~country~~ server!
## Levels
*Coming soon: a security description of each level for your server!*
### DEFCON 1
### DEFCON 2
### DEFCON 3
### DEFCON 4
### DEFCON 5
## Technologies
- [.NET Core 3.1](https://dotnet.microsoft.com/download/dotnet-core/)
- [DSharpPlus](https://dsharpplus.github.io/)
- [Redis](https://redis.io) (will be replaced in the future with MySQL or SQLite)
[<img src="https://img.shields.io/badge/made%20in-Switzerland-red?style=flat-square">](#) [<img src="https://img.shields.io/badge/Antoine-Martins%20Pacheco-blue?style=flat-square">](https://www.onlyonecookie.ch)
| 49.96 | 211 | 0.734988 | yue_Hant | 0.572117 |
c164a15b351fa98fda4deac5ec9fc8b5a034974a | 2,339 | md | Markdown | docs/vs-2015/modeling/how-to-set-clr-attributes-on-an-element.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/modeling/how-to-set-clr-attributes-on-an-element.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/modeling/how-to-set-clr-attributes-on-an-element.md | monkey3310/visualstudio-docs.pl-pl | adc80e0d3bef9965253897b72971ccb1a3781354 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: 'Porady: ustawienie atrybutów CLR w elemencie | Dokumentacja firmy Microsoft'
ms.custom: ''
ms.date: 2018-06-30
ms.prod: visual-studio-tfs-dev14
ms.reviewer: ''
ms.suite: ''
ms.tgt_pltfrm: ''
ms.topic: article
f1_keywords:
- vs.dsltools.EditAttributesDialog
helpviewer_keywords:
- Domain-Specific Language, custom attrributes
ms.assetid: b3db3c74-920c-4701-9544-6f75cbe8b7c9
caps.latest.revision: 21
author: gewarren
ms.author: gewarren
manager: douge
ms.openlocfilehash: afcbadd7c7b3a18c94228e7221eda32b7117a2ed
ms.sourcegitcommit: 55f7ce2d5d2e458e35c45787f1935b237ee5c9f8
ms.translationtype: MT
ms.contentlocale: pl-PL
ms.lasthandoff: 08/22/2018
ms.locfileid: "42628685"
---
# <a name="how-to-set-clr-attributes-on-an-element"></a>How to: Set CLR attributes on an element
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
The latest version of this topic can be found at [How to: Set CLR Attributes on an Element](https://docs.microsoft.com/visualstudio/modeling/how-to-set-clr-attributes-on-an-element).
Custom attributes are special attributes that can be added to domain elements, shapes, connectors, and diagrams. You can add any attribute that derives from the `System.Attribute` class.
### <a name="to-add-a-custom-attribute"></a>To add a custom attribute
1. In **DSL Explorer**, select the element to which you want to add the custom attribute.
2. In the **Properties** window, next to the **Custom Attributes** property, click the browse (**...**) icon.
   The **Edit Attributes** dialog box opens.
3. In the **Name** column, click **\<add attribute>** and type the name of your attribute. Press ENTER.
4. The row under the attribute name contains parentheses. In that row, type the type of the attribute's parameter (for example, `string`), and then press ENTER.
5. In the **Property Name** column, type an appropriate name, for example `MyString`.
6. Click **OK**.
   The **Custom Attributes** property now displays the attribute in the following format:
   `[` *AttributeName* `(` *ParameterName* `=` *type* `)]`
## <a name="see-also"></a>See also
[Domain-Specific Language Tools Glossary](http://msdn.microsoft.com/en-us/ca5e84cb-a315-465c-be24-76aa3df276aa)
| 40.327586 | 199 | 0.749466 | pol_Latn | 0.996452 |
c164b0883831197a2f2afc0a50c9334cacfb2222 | 940 | md | Markdown | docs/API/interfaces/isnapshotprocessors.md | paatrofimov/mobx-state-tree | 1906a394906d2e8f2cc1c778e1e3228307c1b112 | [
"MIT"
] | 2 | 2019-05-20T16:59:57.000Z | 2019-05-24T16:28:10.000Z | docs/API/interfaces/isnapshotprocessors.md | paatrofimov/mobx-state-tree | 1906a394906d2e8f2cc1c778e1e3228307c1b112 | [
"MIT"
] | 3 | 2022-02-28T03:18:56.000Z | 2022-02-28T05:00:57.000Z | docs/API/interfaces/isnapshotprocessors.md | paatrofimov/mobx-state-tree | 1906a394906d2e8f2cc1c778e1e3228307c1b112 | [
"MIT"
] | null | null | null | [mobx-state-tree](../README.md) > [ISnapshotProcessors](../interfaces/isnapshotprocessors.md)
# Interface: ISnapshotProcessors
Snapshot processors.
## Type parameters
#### C
#### CustomC
#### S
#### CustomS
## Hierarchy
**ISnapshotProcessors**
## Index
### Methods
* [postProcessor](isnapshotprocessors.md#postprocessor)
* [preProcessor](isnapshotprocessors.md#preprocessor)
---
## Methods
<a id="postprocessor"></a>
### `<Optional>` postProcessor
▸ **postProcessor**(snapshot: *`S`*): `CustomS`
Function that transforms an output snapshot.
**Parameters:**
| Name | Type | Description |
| ------ | ------ | ------ |
| snapshot | `S` | |
**Returns:** `CustomS`
___
<a id="preprocessor"></a>
### `<Optional>` preProcessor
▸ **preProcessor**(snapshot: *`CustomC`*): `C`
Function that transforms an input snapshot.
**Parameters:**
| Name | Type |
| ------ | ------ |
| snapshot | `CustomC` |
**Returns:** `C`
___
| 15.16129 | 93 | 0.628723 | yue_Hant | 0.248583 |
c1661cb7e182aec3f4f75a5c315b98ff97a71479 | 1,907 | md | Markdown | packages/atomic/src/SNIPPET.md | Seolhun/jtn-script-example | 38a2ebad469f73b056f4e829d84a2ad57a2b522c | [
"MIT"
] | 2 | 2020-05-09T17:04:04.000Z | 2020-08-15T14:24:30.000Z | packages/atomic/src/SNIPPET.md | Seolhun/jtn-script-example | 38a2ebad469f73b056f4e829d84a2ad57a2b522c | [
"MIT"
] | 63 | 2018-09-14T17:36:06.000Z | 2020-11-20T18:17:00.000Z | packages/atomic/src/SNIPPET.md | Seolhun/jtn-script-example | 38a2ebad469f73b056f4e829d84a2ad57a2b522c | [
"MIT"
] | 2 | 2019-01-14T08:31:56.000Z | 2021-01-28T06:22:53.000Z | # Snippet
```tsx
import React from 'react';
import styled from '@emotion/styled';
import classnames from 'classnames';
import {
getLocalizeIntentColor,
getLocalizeScaleBy,
LocalizeIntentThemeType,
LocalizeProps,
LocalizeScale,
LocalizeThemeProps,
} from '@seolhun/localize-components-styled-types';
const CLASSNAME = '__Localize__Something';
type DivProps = React.HTMLAttributes<HTMLDivElement>;
type ExtentionProps = LocalizeProps & DivProps;
export interface LocalizeSomethingProps extends ExtentionProps {
/**
* Set this to change scale
* @default md
*/
scale?: LocalizeScale;
/**
* Set this to change intent color
* @default primary
*/
intent?: LocalizeIntentThemeType;
/**
* Set this to change rounded border-radius
*/
rounded?: boolean;
}
const LocalizeSomethingWrapper = styled.div<LocalizeProps, LocalizeThemeProps>(({
theme,
scale = 'md',
intent = 'primary',
localize = {
primaryColor: 'primary',
neutralColor: 'inversed9',
fontColor: 'inversed1',
inversedFontColor: 'inversed10',
},
}) => {
const localizedColor = getLocalizeIntentColor(theme, intent, localize);
const { primaryColor, inversedFontColor } = localizedColor;
const localizeScale = getLocalizeScaleBy(scale);
return {
backgroundColor: primaryColor,
borderColor: neutralColor,
color: fontColor,
}
});
const LocalizeSomethingContainer = styled.div<LocalizeProps, LocalizeThemeProps>(() =>{
return {}
})
const LocalizeSomething: React.FC<LocalizeSomethingProps> = ({
children,
className = '',
...props
}) => {
return (
<LocalizeSomethingWrapper
{...props}
className={classnames(CLASSNAME, className)}
>
<LocalizeSomethingContainer>
{children}
</LocalizeSomethingContainer>
</LocalizeSomethingWrapper>
);
};
export { LocalizeSomething };
export default LocalizeSomething;
```
| 22.435294 | 87 | 0.706869 | eng_Latn | 0.43652 |
c16667170b1833d7cc7e9d9cbcef3a924c08855e | 216 | md | Markdown | README.md | onuromer/ssis-on-the-beach | e2847398e3ee27fe21d5b614a58ff5f12f7585cb | [
"MIT"
] | 1 | 2020-06-04T21:14:31.000Z | 2020-06-04T21:14:31.000Z | README.md | onuromer/ssis-on-the-beach | e2847398e3ee27fe21d5b614a58ff5f12f7585cb | [
"MIT"
] | 3 | 2020-07-24T09:46:24.000Z | 2020-12-29T02:50:44.000Z | README.md | onuromer/ssis-on-the-beach | e2847398e3ee27fe21d5b614a58ff5f12f7585cb | [
"MIT"
] | null | null | null | # ssis-on-the-beach
# In other words: SQL Server Integration Web Services
# In other words: .NET Core WEB API wrapper for managing SSIS Server.
# In other words: a tribute to famous cocktail and hot summer nights :)
| 43.2 | 71 | 0.759259 | eng_Latn | 0.982919 |
c167499fc5122c98388a62a3dae0a218489b6bad | 3,732 | md | Markdown | README.md | Sage-Bionetworks/mongo | 8e915b58a79fb3da48442e8373d0fe03308d8375 | [
"Apache-2.0"
] | null | null | null | README.md | Sage-Bionetworks/mongo | 8e915b58a79fb3da48442e8373d0fe03308d8375 | [
"Apache-2.0"
] | 9 | 2021-01-19T21:34:12.000Z | 2022-01-01T21:00:47.000Z | README.md | Sage-Bionetworks/mongo | 6480941693b7437a003f7555fd984f8f5663f98b | [
"Apache-2.0"
] | null | null | null | # Hardened Docker image for MongoDB
[](https://github.com/Sage-Bionetworks/mongo/releases)
[](https://github.com/Sage-Bionetworks/mongo/actions)
[](https://github.com/Sage-Bionetworks/mongo/blob/main/LICENSE)
[](https://hub.docker.com/r/sagebionetworks/mongo)
## Introduction
This image extends the official [mongo image] to provide the features described
below.
## Specification
- Mongo version: 4.4.8
- Docker image: [sagebionetworks/mongo]
## Usage
### Create non-root user at startup
By default, a user connects to the DB using the root account specified by the
credentials defined by the following official environment variables
- `MONGO_INITDB_ROOT_USERNAME`
- `MONGO_INITDB_ROOT_PASSWORD`
as shown in this official example:
```console
docker run -d --network some-network --name some-mongo \
-e MONGO_INITDB_ROOT_USERNAME=mongoadmin \
-e MONGO_INITDB_ROOT_PASSWORD=secret \
sagebionetworks/mongo
```
**Issue:** This root account has elevated privileges that allow it to perform
any action in the DB. In practice, you may prefer to restrict the privileges
that your application has access to for better security.
**Solution:** When a container is started for the first time it will execute the
script [add-db-user](add-db-user) that creates a user account with privileges
limited to the role [dbAdmin]. This role does not grant privileges for user and
role management. Moreover, this user's access is limited to the use of a single
database specified by the official environment variable `MONGO_INITDB_DATABASE`.
This user account is created only if the environment variables listed below are
specified:
- `MONGO_USERNAME`
- `MONGO_PASSWORD`
Example:
```console
docker run -d --network some-network --name some-mongo \
-e MONGO_INITDB_ROOT_USERNAME=mongoadmin \
-e MONGO_INITDB_ROOT_PASSWORD=secret \
-e MONGO_INITDB_DATABASE=appdb \
-e MONGO_USERNAME=appuser \
-e MONGO_PASSWORD=secret \
sagebionetworks/mongo
```
## MongoDB UI
[Robo 3T] is a powerful open-source UI client for troubleshooting during the
development of a web application that relies on a Mongo instance. Make sure to
download Robo 3T and not Studio 3T, the paid version, unless that's what you
want.
## Contributing
Anyone can contribute with [issues] and [PRs]. If you're submitting a pull
request, always create a new branch to work your changes, and try squashing
commits down if possible.
## License
[Apache License 2.0]
<!-- Links -->
[mongo image]: https://hub.docker.com/_/mongo
[dbAdmin]: https://docs.mongodb.com/manual/reference/built-in-roles/#dbAdmin
[sagebionetworks/mongo]: https://hub.docker.com/repository/docker/sagebionetworks/mongo
[issues]: https://github.com/Sage-Bionetworks/mongo/issues
[PRs]:https://github.com/Sage-Bionetworks/mongo/pulls
[Robo 3T]: https://robomongo.org/download
[Apache License 2.0]: https://github.com/Sage-Bionetworks/mongo/blob/main/LICENSE
[sagebionetworks/mongo]: https://hub.docker.com/r/sagebionetworks/mongo
[mongo image]: https://hub.docker.com/_/mongo
| 38.474227 | 237 | 0.778135 | eng_Latn | 0.736572 |
c1677208617fcf8d670b886d4d28b0f65d66026c | 54 | md | Markdown | README.md | yazdi/bone | 049b426c9f88692dce25faadcaeaad717cd7c0c3 | [
"MIT"
] | null | null | null | README.md | yazdi/bone | 049b426c9f88692dce25faadcaeaad717cd7c0c3 | [
"MIT"
] | null | null | null | README.md | yazdi/bone | 049b426c9f88692dce25faadcaeaad717cd7c0c3 | [
"MIT"
] | null | null | null | # Bone
Bone is a simple percentage based grid system.
| 18 | 46 | 0.777778 | eng_Latn | 0.998594 |
c16839b5aa40a272c15af41243de040b260528bd | 440 | markdown | Markdown | _posts/2018-02-24-students-love.markdown | dreaming-forest/dreaming-forest.github.io | a53c6dd9df94cbab1ae8b49ce3ccebb551baf66a | [
"Apache-2.0"
] | null | null | null | _posts/2018-02-24-students-love.markdown | dreaming-forest/dreaming-forest.github.io | a53c6dd9df94cbab1ae8b49ce3ccebb551baf66a | [
"Apache-2.0"
] | null | null | null | _posts/2018-02-24-students-love.markdown | dreaming-forest/dreaming-forest.github.io | a53c6dd9df94cbab1ae8b49ce3ccebb551baf66a | [
"Apache-2.0"
] | null | null | null | ---
layout: post
title: "学生时代的爱情"
subtitle: "Student`s love"
date: 2018-02-24 00:00:00
author: "Hector.Z"
header-img: "img/post-think-try-write.jpg"
tags:
- 聊聊
---
经过二十年的单身生活,这一年来,我终于体会到爱情的滋味,没有想象中的那么美好,也没有想象中的那么束缚。
曾经如此接近,今已各自天涯,爱情的酸甜苦辣咸尽在心里,经久不散。
要我说,恋爱最宝贵的地方,在于恋人之间做出的改变。
不管我们曾经是个怎样的人,或内向,或自私,或一无是处,一旦陷入爱情的漩涡,就一定会做出改变,让自己变得更加完美、优秀。
我曾自喻为典型的理工男,毫无生活情调,如今却也小懂弹弹吉他、唱唱情歌、讲讲情话,做做菜。
我怀念这种改变,更怀念曾经的美好。
希望彼此安好。
| 16.296296 | 59 | 0.720455 | yue_Hant | 0.235258 |
c169044348f76f8f0cc6c17b53ee8f31b73483a4 | 25 | md | Markdown | projects/angular-mailru-counter/README.md | dima716/angular-mailru-counter | 3e2c5eb17a762e05230f14348c55307682bee3be | [
"Apache-2.0"
] | null | null | null | projects/angular-mailru-counter/README.md | dima716/angular-mailru-counter | 3e2c5eb17a762e05230f14348c55307682bee3be | [
"Apache-2.0"
] | 1 | 2021-11-25T17:52:35.000Z | 2021-11-25T18:44:17.000Z | projects/angular-mailru-counter/README.md | dima716/angular-mailru-counter | 3e2c5eb17a762e05230f14348c55307682bee3be | [
"Apache-2.0"
] | null | null | null | # angular-mailru-counter
| 12.5 | 24 | 0.8 | deu_Latn | 0.299767 |
c1692c93ae69f15013cae8720baec5196f0d8fcd | 16,341 | md | Markdown | articles/machine-learning/team-data-science-process/move-sql-server-virtual-machine.md | maiemy/azure-docs.it-it | b3649d817c2ec64a3738b5f05f18f85557d0d9b6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/machine-learning/team-data-science-process/move-sql-server-virtual-machine.md | maiemy/azure-docs.it-it | b3649d817c2ec64a3738b5f05f18f85557d0d9b6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/machine-learning/team-data-science-process/move-sql-server-virtual-machine.md | maiemy/azure-docs.it-it | b3649d817c2ec64a3738b5f05f18f85557d0d9b6 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Spostare i dati in una macchina virtuale di SQL Server - Processo di data science per i team
description: Spostare i dati da file flat o da SQL Server locali a SQL Server nella macchina virtuale di Azure.
services: machine-learning
author: marktab
manager: marktab
editor: marktab
ms.service: machine-learning
ms.subservice: team-data-science-process
ms.topic: article
ms.date: 01/10/2020
ms.author: tdsp
ms.custom: seodec18, previous-author=deguhath, previous-ms.author=deguhath
ms.openlocfilehash: c80a90b07e25942e751d52cafa47f6e3e94852ab
ms.sourcegitcommit: 96918333d87f4029d4d6af7ac44635c833abb3da
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 11/04/2020
ms.locfileid: "93320341"
---
# <a name="move-data-to-sql-server-on-an-azure-virtual-machine"></a>Spostamento dei dati in SQL Server in una macchina virtuale di Azure
Questo articolo descrive le opzioni per lo spostamento dei dati da file flat con estensione csv o tsv o da SQL Server locale a SQL Server in una macchina virtuale di Azure. Queste attività per lo spostamento dei dati nel cloud fanno parte del Processo di analisi scientifica dei dati per i team.
Per un argomento che descrive le opzioni per lo spostamento dei dati a un database SQL di Azure per Machine Learning, vedere [Spostare i dati a un Database di SQL Azure per Azure Machine Learning](move-sql-azure.md).
Nella tabella seguente vengono riepilogate le opzioni per lo spostamento dei dati in SQL Server in una macchina virtuale Azure.
| <b>ORIGINE</b> | <b>DESTINAZIONE: SQL Server in VM di Azure</b> |
| --- | --- |
| <b>File flat</b> |1. <a href="#insert-tables-bcp">utilità per la copia bulk da riga di comando (BCP) </a><br> 2. <a href="#insert-tables-bulkquery">Inserimento di massa query SQL </a><br> 3. <a href="#sql-builtin-utilities">utilità predefinite grafiche in SQL Server</a> |
| <b>Server SQL locale</b> |1. <a href="#deploy-a-sql-server-database-to-a-microsoft-azure-vm-wizard">distribuzione di un database di SQL Server in una procedura guidata Microsoft Azure macchina virtuale</a><br> 2. <a href="#export-flat-file">esportare in un file flat </a><br> 3. <a href="#sql-migration">Migrazione guidata database SQL </a> <br> 4. <a href="#sql-backup">Backup e ripristino database </a><br> |
In questo documento si presuppone che i comandi SQL vengano eseguiti da SQL Server Management Studio o Esplora database di Visual Studio.
> [!TIP]
> As an alternative, you can use [Azure Data Factory](https://azure.microsoft.com/services/data-factory/) to create and schedule a pipeline that moves data to a SQL Server VM on Azure. For more information, see [Copy data with Azure Data Factory (Copy Activity)](../../data-factory/copy-activity-overview.md).
>
>
## <a name="prerequisites"></a><a name="prereqs"></a>Prerequisiti
Il tutorial presuppone:
* Una **sottoscrizione di Azure**. Se non si ha una sottoscrizione, è possibile iscriversi per una [versione di valutazione gratuita](https://azure.microsoft.com/pricing/free-trial/).
* Un **account di archiviazione di Azure**. In questa esercitazione si userà un account di archiviazione di Azure per archiviare i dati. Se non si dispone di un account di archiviazione di Azure, vedere l'articolo [Creare un account di archiviazione di Azure](../../storage/common/storage-account-create.md) . Dopo avere creato l'account di archiviazione, sarà necessario ottenere la chiave dell'account usata per accedere alla risorsa di archiviazione. Vedere [gestire le chiavi di accesso dell'account di archiviazione](../../storage/common/storage-account-keys-manage.md).
* Provisioning di **SQL Server in una VM di Azure**. Per le istruzioni, vedere [Configurare una macchina virtuale SQL Server di Azure come server IPython Notebook per l'analisi avanzata](../data-science-virtual-machine/overview.md).
* Installazione e configurazione di **Azure PowerShell** in locale. Per istruzioni, vedere [Come installare e configurare Azure PowerShell](/powershell/azure/).
## <a name="moving-data-from-a-flat-file-source-to-sql-server-on-an-azure-vm"></a><a name="filesource_to_sqlonazurevm"></a> Spostamento di dati da un'origine di file flat a SQL Server su una VM di Azure
Se i dati si trovano in un file flat (organizzati in un formato righe/colonne), possono essere spostati a una macchina virtuale di SQL Server attraverso i seguenti metodi:
1. [Utilità per la copia bulk da riga di comando (BCP)](#insert-tables-bcp)
2. [Inserimento di massa query SQL](#insert-tables-bulkquery)
3. [Utilità grafiche integrate in SQL Server (importazione/esportazione, SSIS)](#sql-builtin-utilities)
### <a name="command-line-bulk-copy-utility-bcp"></a><a name="insert-tables-bcp"></a>Utilità copia di massa della riga di comando (BCP)
BCP è un'utilità della riga di comando installata con SQL Server e rappresenta uno dei metodi più rapidi per spostare i dati. Funziona in tutte e tre le varianti di SQL Server (SQL Server locale, SQL Azure e SQL Server VM in Azure).
> [!NOTE]
> **Dove devono trovarsi i dati per eseguire la copia BCP?**
> Anche se non è obbligatorio che i file contenenti i dati di origine si trovino nello stesso computer del server SQL di destinazione, questo garantisce trasferimenti più rapidi, a causa della differenza tra velocità di rete e velocità di I/O dei dischi locali. È possibile spostare i file flat contenenti i dati nel computer dove è installato SQL Server usando diversi strumenti per la copia dei file quali [AZCopy](../../storage/common/storage-use-azcopy-v10.md), [Esplora archivi di Azure](https://storageexplorer.com/) o la funzione di copia/incolla di Windows tramite Remote Desktop Protocol (RDP).
>
>
1. Assicurarsi che il database e le tabelle vengano create nel database di SQL Server di destinazione. Ecco un esempio di come procedere utilizzando i comandi `Create Database` e `Create Table`:
```sql
CREATE DATABASE <database_name>
CREATE TABLE <tablename>
(
<columnname1> <datatype> <constraint>,
<columnname2> <datatype> <constraint>,
<columnname3> <datatype> <constraint>
)
```
1. Generare il file di formato che descrive lo schema per la tabella eseguendo il comando seguente dalla riga di comando del computer in cui è installato bcp.
`bcp dbname..tablename format nul -c -x -f exportformatfilename.xml -S servername\sqlinstance -T -t \t -r \n`
1. Inserire i dati nel database usando il comando bcp, che dovrebbe funzionare dalla riga di comando quando SQL Server è installato nello stesso computer:
`bcp dbname..tablename in datafilename.tsv -f exportformatfilename.xml -S servername\sqlinstancename -U username -P password -b block_size_to_move_in_single_attempt -t \t -r \n`
> **Ottimizzazione inserimenti BCP** Per ottimizzare gli inserimenti, fare riferimento al seguente articolo ["Linee guida per ottimizzare l'importazione di massa"](/previous-versions/sql/sql-server-2008-r2/ms177445(v=sql.105)) .
>
>
### <a name="parallelizing-inserts-for-faster-data-movement"></a><a name="insert-tables-bulkquery-parallel"></a>Parallelizzazione delle operazioni di inserimento per uno spostamento dei dati più veloce
Se i dati spostati sono di grandi dimensioni, è possibile velocizzare le operazioni eseguendo simultaneamente più comandi BCP in parallelo in uno script di PowerShell.
> [!NOTE]
> **Inserimento di Big Data** Per ottimizzare il caricamento dei dati per set di dati grandi e molto grandi, partizionare le tabelle dei database logici e fisici mediante più gruppi di file e tabelle di partizione. Per ulteriori informazioni sulla creazione e sul caricamento dei dati in tabelle di partizione, vedere [Caricamento parallelo di tabelle di partizione SQL](parallel-load-sql-partitioned-tables.md).
>
>
Lo script di esempio PowerShell seguente illustra gli inserimenti mediante bcp:
```powershell
$NO_OF_PARALLEL_JOBS=2
Set-ExecutionPolicy RemoteSigned #set execution policy for the script to execute
# Define what each job does
$ScriptBlock = {
param($partitionnumber)
#Explicitly using SQL username password
bcp database..tablename in datafile_path.csv -F 2 -f format_file_path.xml -U username@servername -S tcp:servername -P password -b block_size_to_move_in_single_attempt -t "," -r \n -o path_to_outputfile.$partitionnumber.txt
#Trusted connection w.o username password (if you are using windows auth and are signed in with that credentials)
#bcp database..tablename in datafile_path.csv -o path_to_outputfile.$partitionnumber.txt -h "TABLOCK" -F 2 -f format_file_path.xml -T -b block_size_to_move_in_single_attempt -t "," -r \n
}
# Background processing of all partitions
for ($i=1; $i -le $NO_OF_PARALLEL_JOBS; $i++)
{
Write-Debug "Submit loading partition # $i"
Start-Job $ScriptBlock -Arg $i
}
# Wait for it all to complete
While (Get-Job -State "Running")
{
Start-Sleep 10
Get-Job
}
# Getting the information back from the jobs
Get-Job | Receive-Job
Set-ExecutionPolicy Restricted #reset the execution policy
```
### <a name="bulk-insert-sql-query"></a><a name="insert-tables-bulkquery"></a>Inserimento di massa query SQL
L'[inserimento di massa di query SQL](/sql/t-sql/statements/bulk-insert-transact-sql) può essere usato per importare dati nel database da file basati su righe/colonne (i tipi supportati sono indicati nell'argomento [Preparazione dei dati per l'importazione o l'esportazione bulk (SQL Server)](/sql/relational-databases/import-export/prepare-data-for-bulk-export-or-import-sql-server)).
Ecco alcuni comandi di esempio per l'inserimento di massa:
1. Analizzare i dati e impostare le opzioni personalizzate prima dell'importazione per assicurarsi che il database SQL Server presupponga lo stesso formato per tutti i campi speciali, ad esempio le date. Ecco un esempio di come impostare il formato della data come anno-mese-giorno (se i dati contengono la data in formato anno-mese-giorno):
```sql
SET DATEFORMAT ymd;
```
2. portare i dati utilizzando le istruzioni per eseguire importazioni di massa
```sql
BULK INSERT <tablename>
FROM
'<datafilename>'
WITH
(
FirstRow = 2,
FIELDTERMINATOR = ',', --this should be column separator in your data
ROWTERMINATOR = '\n' --this should be the row separator in your data
)
```
### <a name="built-in-utilities-in-sql-server"></a><a name="sql-builtin-utilities"></a>Utilità integrate in SQL Server
È possibile usare SQL Server Integration Services (SSIS) per importare dati in SQL Server macchina virtuale in Azure da un file flat.
SSIS è disponibile in due ambienti studio. Per ulteriori informazioni, vedere [Integration Services (SSIS) e ambienti Studio](/sql/integration-services/integration-services-ssis-development-and-management-tools):
* Per informazioni dettagliate su SQL Server Data Tools, vedere [Microsoft SQL Server Data Tools](/sql/ssdt/download-sql-server-data-tools-ssdt)
* Per informazioni dettagliate sull'importazione/esportazione guidata, vedere [Importazione/esportazione guidata di SQL Server](/sql/integration-services/import-export-data/import-and-export-data-with-the-sql-server-import-and-export-wizard)
## <a name="moving-data-from-on-premises-sql-server-to-sql-server-on-an-azure-vm"></a><a name="sqlonprem_to_sqlonazurevm"></a>Spostamento dei dati da SQL Server locale a SQL Server in una VM di Azure
È inoltre possibile utilizzare le strategie di migrazione seguenti:
1. [Distribuzione di un database di SQL Server a una macchina virtuale di Microsoft Azure](#deploy-a-sql-server-database-to-a-microsoft-azure-vm-wizard)
2. [Esporta nel file flat](#export-flat-file)
3. [Migrazione guidata database SQL](#sql-migration)
4. [Backup e ripristino del database](#sql-backup)
Ognuna di queste opzioni viene descritta di seguito:
### <a name="deploy-a-sql-server-database-to-a-microsoft-azure-vm-wizard"></a>Distribuzione di un database di SQL Server a una macchina virtuale di Microsoft Azure
La **Distribuzione di un Database SQL Server in una macchina virtuale di Microsoft Azure** è un modo semplice e consigliato per spostare dati da un'istanza di SQL Server locale a un SQL Server in una macchina virtuale di Azure. Per passaggi dettagliati, nonché per una descrizione delle altre alternative, vedere [Migrazione di un database a SQL Server su una macchina virtuale di Azure](../../azure-sql/virtual-machines/windows/migrate-to-vm-from-sql-server.md).
### <a name="export-to-flat-file"></a><a name="export-flat-file"></a>Esportazione in un file flat
È possibile usare diversi metodi per l'esportazione di massa dei dati dal Server locale SQL come descritto nell'argomento [Importazione ed esportazione dei dati in massa (SQL Server)](/sql/relational-databases/import-export/bulk-import-and-export-of-data-sql-server) . In questo documento si parla di Bulk Copy Program (BCP) come esempio. Una volta che i dati sono esportati in un file flat, possono essere importati in un altro server SQL mediante l'importazione di massa.
1. Esportare i dati da SQL Server locali in un file usando l'utilità bcp come indicato di seguito
`bcp dbname..tablename out datafile.tsv -S servername\sqlinstancename -T -t \t -t \n -c`
2. Creare il database e la tabella nella macchina virtuale di SQL Server in Azure tramite `create database` e `create table` per lo schema della tabella esportato nel passaggio 1.
3. Creare un file di formato per la descrizione dello schema della tabella dei dati da importare/esportare. I dettagli del file di formato sono descritti in [Creazione di un file di formato (SQL Server)](/sql/relational-databases/import-export/create-a-format-file-sql-server).
Formattare la generazione di file durante l'esecuzione di BCP dal computer SQL Server
`bcp dbname..tablename format nul -c -x -f exportformatfilename.xml -S servername\sqlinstance -T -t \t -r \n`
Creazione di file di formato quando si esegue BCP in remoto rispetto a SQL Server
`bcp dbname..tablename format nul -c -x -f exportformatfilename.xml -U username@servername.database.windows.net -S tcp:servername -P password --t \t -r \n`
4. Utilizzare uno dei metodi descritti nella sezione [Spostamento dei dati dall'origine file](#filesource_to_sqlonazurevm) per spostare i dati dai file flat in SQL Server.
### <a name="sql-database-migration-wizard"></a><a name="sql-migration"></a>Migrazione guidata database SQL
[Migrazione guidata database SQL Server](https://sqlazuremw.codeplex.com/) fornisce un modo semplice per spostare i dati tra due istanze del server SQL. Consente all'utente di mappare lo schema dei dati tra origini e tabelle di destinazione, scegliere i tipi di colonna e varie altre funzionalità. Utilizza la copia di massa (BCP) dietro le quinte. Di seguito è riportata una schermata della schermata iniziale della procedura guidata di migrazione del database SQL.
![Migrazione guidata in SQL Server][2]
### <a name="database-back-up-and-restore"></a><a name="sql-backup"></a>Backup e ripristino database
SQL Server supporta:
1. La [funzionalità di backup e ripristino del database](/sql/relational-databases/backup-restore/back-up-and-restore-of-sql-server-databases) (sia in un file locale o in un'esportazione bacpac in un BLOB) e [applicazioni livello dati](/sql/relational-databases/data-tier-applications/data-tier-applications) (tramite bacpac).
2. Possibilità di creare direttamente macchine virtuali SQL Server in Azure con un database copiato o copiarle in un database esistente nel database SQL. Per altre informazioni, vedere [Use the Copy Database Wizard](/sql/relational-databases/databases/use-the-copy-database-wizard).
Di seguito è riportata una schermata delle opzioni di backup e ripristino del database da SQL Server Management Studio.
![Strumento di importazione di SQL Server][1]
## <a name="resources"></a>Risorse
[Eseguire la migrazione di un database a SQL Server in una macchina virtuale di Azure](../../azure-sql/virtual-machines/windows/migrate-to-vm-from-sql-server.md)
[Panoramica di SQL Server in macchine virtuali di Azure](../../azure-sql/virtual-machines/windows/sql-server-on-azure-vm-iaas-what-is-overview.md)
[1]: ./media/move-sql-server-virtual-machine/sqlserver_builtin_utilities.png
[2]: ./media/move-sql-server-virtual-machine/database_migration_wizard.png | 74.958716 | 603 | 0.771679 | ita_Latn | 0.97925 |
c169801790ea7e0595d65557c96b0faa1a02d050 | 244 | md | Markdown | archive/SampleOperatingAgreement/README.md | bryangw1/repository | 63ec043d16a31704e2d606da89f2e2c47889e19c | [
"MIT"
] | null | null | null | archive/SampleOperatingAgreement/README.md | bryangw1/repository | 63ec043d16a31704e2d606da89f2e2c47889e19c | [
"MIT"
] | null | null | null | archive/SampleOperatingAgreement/README.md | bryangw1/repository | 63ec043d16a31704e2d606da89f2e2c47889e19c | [
"MIT"
] | null | null | null | # Sample Operating Agreement
Link to Sample Operating Agreement in Docassemble: http://ec2-18-216-155-205.us-east-2.compute.amazonaws.com/interview?i=docassemble.SimpleOperatingAgreement%3Adata%2Fquestions%2FsimpleOperatingAgreement.yml#page1
| 61 | 213 | 0.848361 | eng_Latn | 0.485857 |
c16a2140744b3ae1baf7aae02f4ade2b4eb06e9b | 500 | md | Markdown | kata/8-kyu/sum-of-multiples/README.md | ParanoidUser/codewars | f3f39f755504fecf8de56e729da3b661f9f9c14e | [
"MIT"
] | 26 | 2019-04-02T05:32:02.000Z | 2019-12-12T05:33:30.000Z | kata/8-kyu/sum-of-multiples/README.md | ParanoidUser/codewars-solutions | f3f39f755504fecf8de56e729da3b661f9f9c14e | [
"MIT"
] | 3 | 2019-12-17T17:36:26.000Z | 2019-12-18T04:21:30.000Z | kata/8-kyu/sum-of-multiples/README.md | ParanoidUser/codewars | f3f39f755504fecf8de56e729da3b661f9f9c14e | [
"MIT"
] | 2 | 2019-11-17T07:17:35.000Z | 2019-12-15T09:56:53.000Z | # [Sum of Multiples](https://www.codewars.com/kata/sum-of-multiples "https://www.codewars.com/kata/57241e0f440cd279b5000829")
Find the sum of all multiples of `n` below `m`
### Keep in Mind
* `n` and `m` are natural numbers (positive integers)
* `m` is **excluded** from the multiples
### Examples
```
Kata.sumMul(2, 9) ==> 2 + 4 + 6 + 8 = 20
Kata.sumMul(3, 13) ==> 3 + 6 + 9 + 12 = 30
Kata.sumMul(4, 123) ==> 4 + 8 + 12 + ... = 1860
Kata.sumMul(4, -7) // throws IllegalArgumentException
``` | 29.411765 | 125 | 0.632 | eng_Latn | 0.646497 |
c16a5eb1ab9568c5971068c3a3531a4aa160d9d6 | 45 | md | Markdown | _posts/2021-06-10-my-first-blog-post.md | microbiaki/github-pages-with-jekyll | 79afffb23082db9520b12856abaa3d0c421cade3 | [
"MIT"
] | null | null | null | _posts/2021-06-10-my-first-blog-post.md | microbiaki/github-pages-with-jekyll | 79afffb23082db9520b12856abaa3d0c421cade3 | [
"MIT"
] | 5 | 2021-06-10T18:37:41.000Z | 2021-06-10T19:13:26.000Z | _posts/2021-06-10-my-first-blog-post.md | microbiaki/github-pages-with-jekyll | 79afffb23082db9520b12856abaa3d0c421cade3 | [
"MIT"
] | null | null | null | ---
title: "YOUR-TITLE"
date: 2021-06-10
---
| 9 | 19 | 0.577778 | eng_Latn | 0.1707 |
c16a880407e74d1edf7b4d042f5aa98743f52df9 | 351 | md | Markdown | aisp.md | isharailanga/demo-open-banking.github.io | 7953f8c0cd43e2aae337a6fea86ab4cc66364f64 | [
"Apache-2.0"
] | 3 | 2021-05-20T17:10:17.000Z | 2021-09-06T13:18:53.000Z | aisp.md | isharailanga/demo-open-banking.github.io | 7953f8c0cd43e2aae337a6fea86ab4cc66364f64 | [
"Apache-2.0"
] | 12 | 2019-11-14T08:35:22.000Z | 2021-02-10T09:17:21.000Z | aisp.md | isharailanga/demo-open-banking.github.io | 7953f8c0cd43e2aae337a6fea86ab4cc66364f64 | [
"Apache-2.0"
] | 9 | 2019-11-06T07:20:13.000Z | 2020-11-13T07:13:11.000Z | ---
layout: ob-base-demo-page
title: AISP
intro: Learn the core concepts of Open Banking
toc: true
permalink: /demos/aisp
---
Assume you are logging in to an AISP application. You can then check the aggregated details of multiple banks
<iframe class="embed-responsive-item cIframe" src="https://openbanking.wso2.com/demos/aisp/index.html"></iframe>
| 29.25 | 112 | 0.766382 | eng_Latn | 0.752265 |
c16ac1e73e689b11a2712e85a026e821f3ce6157 | 8,368 | md | Markdown | docs/framework/tools/peverify-exe-peverify-tool.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/tools/peverify-exe-peverify-tool.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/framework/tools/peverify-exe-peverify-tool.md | mtorreao/docs.pt-br | e080cd3335f777fcb1349fb28bf527e379c81e17 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Peverify.exe (Ferramenta PEVerify)
description: Use Peverify.exe (executável portátil de verificação) para ajudar a determinar se o código do Microsoft Intermediate Language (MSIL) & metadados atendem aos padrões de segurança do tipo no .NET.
ms.date: 03/30/2017
helpviewer_keywords:
- portable executable files, PEVerify
- verifying MSIL and metadata
- PEVerify tool
- type safety requirements
- MSIL
- PEverify.exe
- PE files, PEVerify
ms.assetid: f4f46f9e-8d08-4e66-a94b-0c69c9b0bbfa
ms.openlocfilehash: c859aa4e2e3ae95c5c72aed930a9bc4a05add296
ms.sourcegitcommit: bc293b14af795e0e999e3304dd40c0222cf2ffe4
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 11/26/2020
ms.locfileid: "96238581"
---
# <a name="peverifyexe-peverify-tool"></a>Peverify.exe (ferramenta PEVerify)
A ferramenta PEVerify ajuda os desenvolvedores que geram MSIL (Microsoft Intermediate Language) (como escritores de compilador e desenvolvedores de mecanismo de script) para determinar se o código MSIL e os metadados associados atendem aos requisitos de segurança de tipo. Alguns compiladores só gerarão código fortemente tipado verificável se você evitar usar determinados constructos de linguagem. Se você estiver usando um compilador como esse, convém verificar se você não comprometeu a segurança de tipo do seu código. Você pode executar a ferramenta PEVerify em seus arquivos para verificar o MSIL e os metadados.
Essa ferramenta é instalada automaticamente com o Visual Studio. Para executar a ferramenta, use o Prompt de Comando do Desenvolvedor para Visual Studio (ou o Prompt de Comando do Visual Studio no Windows 7). Para obter mais informações, consulte [Prompts de Comando](developer-command-prompt-for-vs.md).
## <a name="syntax"></a>Sintaxe
```console
peverify filename [options]
```
## <a name="parameters"></a>parâmetros
|Argumento|Descrição|
|--------------|-----------------|
|*nome do arquivo*|O arquivo PE (Portable Executable) para o qual MSIL e metadados devem ser verificados.|
|Opção|Descrição|
|------------|-----------------|
|**/break=** *maxErrorCount*|Anula a verificação depois de erros *maxErrorCount*.<br /><br /> Esse parâmetro não é compatível no .NET Framework versão 2.0 ou posterior.|
|**/clock**|Mede e relata os seguintes tempos de verificação em milissegundos:<br /><br /> **MD Val. cycle**<br /> Ciclo de validação dos metadados<br /><br /> **MD Val. pure**<br /> Validação dos metadados pura<br /><br /> **IL Ver. cycle**<br /> Ciclo de verificação MSIL<br /><br /> **IL Ver pure**<br /> Verificação MSIL pura<br /><br /> Os tempos de **MD Val. cycle** e **IL Ver. cycle** incluem o tempo necessário para a realização de procedimentos de inicialização e desligamento necessários. Os tempos de **MD Val. pure** e **IL Ver pure** refletem o tempo necessário para a realização da validação ou apenas da verificação.|
|**/Help**|Exibe sintaxe de comando e opções para a ferramenta.|
|**/hresult**|Exibe códigos de erro em formato hexadecimal.|
|**/ignore=** *hex.code* [, *hex.code*]|Ignora os códigos de erro especificados.|
|**/ignore=@** *responseFile*|Ignora os códigos de erro listados no arquivo de resposta especificado.|
|**/il**|Realiza verificação de segurança do tipo MSIL dos métodos implementados no assembly especificado por *filename*. A ferramenta retorna descrições detalhadas para cada problema encontrado, a menos que você especifique a opção **/quiet**.|
|**/MD**|Realiza verificações de validação de metadados no assembly especificado por *filename*. Essa opção percorre a estrutura de metadados completa dentro do arquivo e relata todos os problemas de validação encontrados.|
|**/nologo**|Suprime a exibição da versão do produto e as informações de direitos autorais.|
|**/nosymbols**|No .NET Framework versão 2.0, suprime números de linha para compatibilidade com versões anteriores.|
|**/quiet**|Especifica o modo silencioso; suprime saída dos relatórios de problema de verificação. Peverify.exe ainda relata se o arquivo é fortemente tipado, mas não relata informações sobre problemas que impeçam a verificação de segurança do tipo.|
|`/transparent`|Verifique apenas os métodos transparentes.|
|**/Unique**|Ignora códigos de erro repetidos.|
|**/verbose**|No .NET Framework versão 2.0, exibe informações adicionais em mensagens de verificação MSIL.|
|**/?**|Exibe sintaxe de comando e opções para a ferramenta.|
## <a name="remarks"></a>Comentários
O Common Language Runtime depende na execução fortemente tipada do código de aplicativo para ajudar a impor mecanismos de segurança e isolamento. Normalmente, o código que não é [fortemente tipado verificável](../../standard/security/key-security-concepts.md#type-safety-and-security) não pode ser executado, embora seja possível definir a política de segurança para permitir a execução de código confiável, mas não verificável.
Se as opções **/md** ou **/il** não forem especificadas, Peverify.exe realizará ambos os tipos de verificações. Peverify.exe realiza as verificações de **/md** primeiro. Se não houver erros, as verificações de **/il** serão feitas. Se você especificar **/md** e **/il**, as verificações de **/il** serão feitas mesmo se houver erros nos metadados. Por isso, se não houver erros de metadados, **peverify** *filename* equivalerá a **peverify** *filename* **/md** **/il**.
Peverify.exe realiza verificações MSIL abrangentes com base na análise do fluxo de dados mais uma lista de várias centenas de regras em metadados válidos. Saiba mais sobre as verificações realizadas por Peverify.exe em "Especificação de Validação dos Metadados" e em "Especificação do Conjunto de Instruções MSIL" na pasta Guia de Desenvolvedores de Ferramentas no SDK do Windows.
.NET Framework versão 2,0 ou posterior dá suporte `byref` a retornos verificáveis especificados usando as seguintes instruções MSIL:,,,, `dup` `ldsflda` `ldflda` `ldelema` `call` e `unbox` .
## <a name="examples"></a>Exemplos
O comando a seguir realiza verificações de validação dos metadados e verificações de segurança fortemente tipada MSIL para métodos implementados no assembly `myAssembly.exe`.
```console
peverify myAssembly.exe /md /il
```
Mediante a conclusão bem-sucedida da solicitação acima, Peverify.exe exibe a mensagem a seguir.
```output
All classes and methods in myAssembly.exe Verified
```
O comando a seguir realiza verificações de validação dos metadados e verificações de segurança fortemente tipada MSIL para métodos implementados no assembly `myAssembly.exe`. A ferramenta exibe o tempo necessário à realização dessas verificações.
```console
peverify myAssembly.exe /md /il /clock
```
Mediante a conclusão bem-sucedida da solicitação acima, Peverify.exe exibe a mensagem a seguir.
```output
All classes and methods in myAssembly.exe Verified
Timing: Total run 320 msec
MD Val.cycle 40 msec
MD Val.pure 10 msec
IL Ver.cycle 270 msec
IL Ver.pure 230 msec
```
O comando a seguir realiza verificações de validação dos metadados e verificações de segurança fortemente tipada MSIL para métodos implementados no assembly `myAssembly.exe`. Peverify.exe para, no entanto, quando alcança a contagem de erros máxima de 100. A ferramenta também ignora os códigos de erro especificados.
```console
peverify myAssembly.exe /break=100 /ignore=0x12345678,0xABCD1234
```
O comando a seguir produz o mesmo resultado que o exemplo anterior acima, mas especifica os códigos de erro que serão ignorados no arquivo de resposta `ignoreErrors.rsp`.
```console
peverify myAssembly.exe /break=100 /ignore@ignoreErrors.rsp
```
O arquivo de resposta pode conter uma lista separada por vírgulas de códigos de erro.
```text
0x12345678, 0xABCD1234
```
O arquivo de resposta também pode ser formatado com um código de erro por linha.
```text
0x12345678
0xABCD1234
```
## <a name="see-also"></a>Veja também
- [Ferramentas](index.md)
- [Escrevendo um código fortemente tipado verificável](../misc/code-access-security-basics.md#typesafe_code)
- [Segurança de tipos e proteção](../../standard/security/key-security-concepts.md#type-safety-and-security)
- [Prompts de comando](developer-command-prompt-for-vs.md)
| 64.868217 | 635 | 0.748685 | por_Latn | 0.997406 |
c16cc4099ecb17ca4c3e58e11dedd5b9e17a40e9 | 2,290 | md | Markdown | windows-driver-docs-pr/audio/hd-audio-ddi-structures.md | AmadeusW/windows-driver-docs | 6d272f80814969bbb5ec836cbbebdf5cae52ee35 | [
"CC-BY-4.0",
"MIT"
] | 4 | 2017-05-30T18:13:16.000Z | 2021-09-26T19:45:08.000Z | windows-driver-docs-pr/audio/hd-audio-ddi-structures.md | AmadeusW/windows-driver-docs | 6d272f80814969bbb5ec836cbbebdf5cae52ee35 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | windows-driver-docs-pr/audio/hd-audio-ddi-structures.md | AmadeusW/windows-driver-docs | 6d272f80814969bbb5ec836cbbebdf5cae52ee35 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-09-26T19:45:46.000Z | 2021-09-26T19:45:46.000Z | ---
title: HD Audio DDI Structures
description: HD Audio DDI Structures
ms.assetid: 2aa6d4be-5ebb-4f9f-956a-2491ce483bfa
ms.author: windowsdriverdev
ms.date: 11/28/2017
ms.topic: article
ms.prod: windows-hardware
ms.technology: windows-devices
---
# HD Audio DDI Structures
The routines in the two versions of the [HD Audio DDI](hd-audio-ddi-routines.md) use the following structure types:
[**HDAUDIO\_BUFFER\_DESCRIPTOR**](https://msdn.microsoft.com/library/windows/hardware/ff536411)
[**HDAUDIO\_BUS\_INTERFACE**](https://msdn.microsoft.com/library/windows/hardware/ff536413)
[**HDAUDIO\_BUS\_INTERFACE\_V2**](https://msdn.microsoft.com/library/windows/hardware/ff536418)
[**HDAUDIO\_BUS\_INTERFACE\_BDL**](https://msdn.microsoft.com/library/windows/hardware/ff536416)
[**HDAUDIO\_CODEC\_COMMAND**](https://msdn.microsoft.com/library/windows/hardware/ff536420)
[**HDAUDIO\_CODEC\_RESPONSE**](https://msdn.microsoft.com/library/windows/hardware/ff536422)
[**HDAUDIO\_CODEC\_TRANSFER**](https://msdn.microsoft.com/library/windows/hardware/ff536424)
[**HDAUDIO\_CONVERTER\_FORMAT**](https://msdn.microsoft.com/library/windows/hardware/ff536426)
[**HDAUDIO\_DEVICE\_INFORMATION**](https://msdn.microsoft.com/library/windows/hardware/ff536428)
[**HDAUDIO\_STREAM\_FORMAT**](https://msdn.microsoft.com/library/windows/hardware/ff536430)
[Send comments about this topic to Microsoft](mailto:wsddocfb@microsoft.com?subject=Documentation%20feedback%20[audio\audio]:%20HD%20Audio%20DDI%20Structures%20%20RELEASE:%20%2811/22/2017%29&body=%0A%0APRIVACY%20STATEMENT%0A%0AWe%20use%20your%20feedback%20to%20improve%20the%20documentation.%20We%20don't%20use%20your%20email%20address%20for%20any%20other%20purpose,%20and%20we'll%20remove%20your%20email%20address%20from%20our%20system%20after%20the%20issue%20that%20you're%20reporting%20is%20fixed.%20While%20we're%20working%20to%20fix%20this%20issue,%20we%20might%20send%20you%20an%20email%20message%20to%20ask%20for%20more%20info.%20Later,%20we%20might%20also%20send%20you%20an%20email%20message%20to%20let%20you%20know%20that%20we've%20addressed%20your%20feedback.%0A%0AFor%20more%20info%20about%20Microsoft's%20privacy%20policy,%20see%20http://privacy.microsoft.com/default.aspx. "Send comments about this topic to Microsoft")
| 49.782609 | 932 | 0.798253 | yue_Hant | 0.505043 |
c16d245a039f54519fe722251d14851cf9840863 | 10,146 | md | Markdown | Code-Learning-Paths-jp/docs/Code-Articles/introduction-to-machine-learning/index_en.md | IBM/japan-technology | 45ece69ade3ff39cca83fa99b09276a59d9086fa | [
"Apache-2.0"
] | 93 | 2021-02-25T02:20:38.000Z | 2022-03-17T03:51:05.000Z | Code-Learning-Paths-jp/docs/Code-Articles/introduction-to-machine-learning/index_en.md | IBM/japan-technology | 45ece69ade3ff39cca83fa99b09276a59d9086fa | [
"Apache-2.0"
] | 1 | 2021-02-25T05:50:48.000Z | 2022-03-29T03:14:26.000Z | Code-Learning-Paths-jp/docs/Code-Articles/introduction-to-machine-learning/index_en.md | IBM/japan-technology | 45ece69ade3ff39cca83fa99b09276a59d9086fa | [
"Apache-2.0"
] | 13 | 2021-05-14T04:17:04.000Z | 2022-01-17T05:27:30.000Z | ---
# Related publishing issue: https://github.ibm.com/IBMCode/Code-Articles/issues/3637
abstract: "Understand how algorithms enable systems to learn patterns within data by using Python."
authors:
- name: Samaya Madhavan
email: "smadhava@us.ibm.com"
- name: Mark Sturdevant
email: "mark.sturdevant@ibm.com"
- name: Romeo Kienzler
email: "romeo.kienzler@ch.ibm.com"
completed_date: "2019-12-04"
draft: false
time_to_read: 15 minutes
excerpt: "Understand how algorithms enable systems to learn patterns within data by using Python."
last_updated: "2019-12-04"
meta_description: "Understand how algorithms enable systems to learn patterns within data by using Python."
meta_keywords: "data science, machine learning, deep learning, artificial intelligence, beginner's guide, beginner, unsupervised learning"
meta_title: "Introduction to machine learning"
primary_tag: data-science
related_content:
- type: patterns
slug: infuse-ai-into-your-application
- type: patterns
slug: iot-sensor-temperature-analysis-with-ibm-db2-event-store
- type: patterns
slug: predict-home-value-using-python-and-watson-machine-learning
subtitle: "Understand how algorithms enable systems to learn patterns within data by using Python"
tags:
- "machine-learning"
- "artificial-intelligence"
- "deep-learning"
- "python"
title: Introduction to machine learning
type: tutorial
also_found_in:
- "learningpaths/get-started-artificial-intelligence/"
- "learningpaths/learning-path-machine-learning-for-developers/"
---
Machine learning is the science where in order to predict a value, algorithms are applied for a system to learn patterns within data. With the use of sufficient data, the relationship between all of the input variables and the values to be predicted is established. It becomes easier for the system to predict a new value given other input variables. This approach differs from conventional programming where an application is developed based on previously set rules. Even though the fundamental concepts of machine learning have been around for a while, the field has gained momentum recently due to the state-of-the-art processors and the abundant data that's available, which are both key to achieving accurate predictions. Because there is enough content already available on the history of machine learning, we do not cover that topic in this article. Instead, we give you a practical approach to understanding the necessary concepts to help get you started.
The following concepts are explained in this article:
* Fundamentals of machine learning
* Supervised versus unsupervised learning
* Building a model
* Pipelines in machine learning
## Fundamentals of machine learning
In this section, we discuss some of the basic terminologies that are used while working with a machine learning project.
### Linear algebra
Linear algebra is a field in mathematics that deals with correlations between variables. It's called linear because the output variable can be expressed in terms of the input variables with powers (exponents) not greater than one. Understanding the basics of linear algebra goes a long way in helping to understand some of the basics of machine learning. In this section, we define some of the key terms that are frequently used.
#### What are tensors?
Let's start with _scalar_. Scalar representation is basically any number such as 1, 5, 23.5, or 42. If you group a number of scalars together, you end up with a _vector_. For example, (1, 5, 23, 2) is a vector of length four. In a vector, all elements should belong to the same data type, whereas in a _tuple_, types can be mixed. A _matrix_ is a list of equal-sized vectors. In a matrix, the number of rows can be different from the number of columns, but each element must have the same type. A matrix with _m_ rows and _n_ columns is referred to as _m x n_ matrices.
_Tensors_ are data that is represented in a multi-dimensional space. Tensor is a generic term that represents these representations. For example, a zero-dimensional tensor is a scalar, a one-dimensional tensor is a vector, and a two-dimensional tensor is a matrix. The following image shows an example of a 3D tensor, which is basically an extension of a matrix, but in three dimensions.

Tensors can be handy in some aspects, such as image processing. You can have one dimension for the height, one for the width, and one for colors.
### High-dimensional vector spaces
Understanding high-dimensional vector spaces helps in giving you a solid foundation towards understanding how machine learning works. The following image shows a data set that has three columns. These columns are referred to as dimensions or features. The table is also called a three-dimensional data set. When these points are plotted in a 3D space, we observe three point clouds.

A _line_ is a basic separation of points in a 2-dimensional space. In the previous image, you see a division that marks the separation of points in a 3-dimensional space. This divider in a 3D space is referred to as a _plane_. If you go from three dimensions to four dimensions or more, the plane becomes a _hyperplane_.

Identifying these separations is critical because after the separations are established, predicting new data is merely identifying which side of the separation a data point lies on.
## Supervised versus unsupervised machine learning
### Supervised machine learning
Supervised machine learning refers to the type of problems in which each record in the data set contains a label or a flag.

Consider the following table that contains information about max temperature, min temperature, and max vibration.

The final column, asperity, is the label. Given temperature and vibration data, we want to predict asperity. This is a labeled data set.
Using this data set that includes the label, we can train an algorithm to predict the future for unlabeled data. You feed that into your algorithm, and the algorithm now predicts a label for this data. This is referred to as supervised learning. _Regression_ and _classification_ are the two types of supervised learning.
#### Regression
The type of use cases where a continuous value must be predicted is referred to as regression. For example, if we pass the algorithm the values 35, 35, and 12, it predicts a value for asperity of 0.32.

#### Classification
The type of use cases where the output is a binary value or at least a discrete value instead of a continuous value is referred to as classification. In other words, the algorithm does not predict a number, but instead predicts a class variable.

For example, if we pass the algorithm the values 35, 35, and 12, it predicts a value 0 for broken.

If you have only two classes, it is called binary classification. If you have more than two classes, you have multiclass classification.
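As an illustration of binary classification, the stdlib-only sketch below implements a nearest-centroid classifier on made-up sensor rows (max temperature, min temperature, max vibration). A real project would typically reach for a scikit-learn classifier instead; the data and names here are illustrative.

```python
def centroid(points):
    n = len(points)
    return [sum(col) / n for col in zip(*points)]

def fit(rows, labels):
    """Compute one centroid per class label."""
    return {lab: centroid([r for r, l in zip(rows, labels) if l == lab])
            for lab in set(labels)}

def predict(model, row):
    """Assign the label of the closest class centroid."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda lab: dist2(model[lab], row))

# Made-up training rows: [max temp, min temp, max vibration] -> broken (0/1)
rows = [[35, 35, 12], [36, 34, 11], [80, 70, 40], [82, 72, 38]]
labels = [0, 0, 1, 1]

model = fit(rows, labels)
print(predict(model, [35, 35, 12]))  # 0 (not broken)
print(predict(model, [81, 71, 39]))  # 1 (broken)
```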
### Unsupervised learning
Unsupervised machine learning refers to the type of problems in which no record in the data set contains a label or a flag. _Clustering_ is a type of unsupervised machine learning.

#### Clustering
In the 3-dimensional plot shown previously, notice the 3 clusters or clouds of data. Just by plotting the table, we see that data is centered around three clusters. This process is called clustering.

## Building a model
A machine learning _model_ is a mathematical configuration that is built from previously seen data and is used to predict outcomes for new data, with a degree of accuracy that has been measured beforehand.
Following is the sequence of steps that are performed iteratively to build a model from scratch.
* Data exploration
* Data preprocessing
* Splitting data for training and testing
* Preparing a classification model
* Assembling all of these steps using pipelines
* Training the model
* Running predictions on the model
* Evaluating and visualizing model performance
A more detailed and hands-on approach to building a model is described in [Build and test your first machine learning model using Python and scikit-learn](/tutorials/build-and-test-your-first-machine-learning-model-using-python-and-scikit-learn/).
## Pipelines
A pipeline is a very convenient way of organizing the data processing in a machine learning flow. Data preprocessing is a tedious step that must be applied to the data every time before training begins, irrespective of the algorithm that will be applied. The following image shows a typical sequence of preprocessing steps that is applied every time before the data modeling begins.

The idea is that when using pipelines, you can keep the preprocessing and just switch the different modeling algorithms or different parameter sets of your modeling algorithm. The overall idea is that you can fuse your complete data processing flow into one single pipeline, and that single pipeline can be used downstream.

Similar to a machine learning algorithm, pipelines have methods such as fit, predict, and score. Basically, fit starts the training, predict returns the predicted values, and score evaluates how well the trained pipeline performs.

Convenient _cross-validation_ is one of the biggest advantages of using pipelines. It refers to the process of repeatedly training and evaluating the same pipeline while changing or tuning several hyperparameters, which accelerates the optimization of your algorithm. There are several hyperparameters that can be tuned to produce better-performing models. Details related to these topics are covered in a future article.
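To show the mechanics without pulling in a library, the sketch below wires a minimal `Pipeline` class with a scaler and a 1-nearest-neighbor classifier. scikit-learn's `Pipeline` and `GridSearchCV` provide the production version of this pattern; all names and data here are illustrative.

```python
class MinMaxScaler:
    def fit(self, X, y=None):
        cols = list(zip(*X))
        self.lo = [min(c) for c in cols]
        self.hi = [max(c) for c in cols]
        return self

    def transform(self, X):
        return [[(v - lo) / (hi - lo) if hi > lo else 0.0
                 for v, lo, hi in zip(row, self.lo, self.hi)] for row in X]

class NearestNeighbor:
    def fit(self, X, y):
        self.X, self.y = X, y
        return self

    def predict(self, X):
        def dist2(a, b):
            return sum((p - q) ** 2 for p, q in zip(a, b))
        return [self.y[min(range(len(self.X)),
                           key=lambda i: dist2(self.X[i], row))]
                for row in X]

class Pipeline:
    def __init__(self, steps):
        self.steps = steps  # [(name, transformer), ..., (name, estimator)]

    def fit(self, X, y):
        # Fit and apply every preprocessing step, then fit the estimator.
        for _, step in self.steps[:-1]:
            X = step.fit(X, y).transform(X)
        self.steps[-1][1].fit(X, y)
        return self

    def predict(self, X):
        # Reuse the fitted preprocessing steps before predicting.
        for _, step in self.steps[:-1]:
            X = step.transform(X)
        return self.steps[-1][1].predict(X)

rows = [[35, 35, 12], [36, 34, 11], [80, 70, 40], [82, 72, 38]]
broken = [0, 0, 1, 1]

pipe = Pipeline([("scale", MinMaxScaler()), ("clf", NearestNeighbor())])
pipe.fit(rows, broken)
print(pipe.predict([[35, 35, 12], [81, 71, 39]]))  # [0, 1]
```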
## Summary
This tutorial covered some of the basic concepts of machine learning and provided a practical approach to understanding the concepts you need to get started.
# Porting Tunnel
Alright, time for a demo with some real meat to it.
First, this one requires the direct-color rasterizer. I'm going to see if I
can't eliminate its code, replacing it with `memcpy` or equivalent.
First crack: using `slice::copy_from_slice` in a pixel-doubled context, so
moving 400 bytes. It compiles down to good ol' `__aeabi_memcpy`... which is
about the worst copy code I've ever seen. I wonder where this code came from?
Anyhoo. Here's the code.
```rust
|ln, tgt, ctx| {
let coarse_ln = ln / SCALE;
if coarse_ln < HALF_HEIGHT {
let offset = coarse_ln * BUFFER_STRIDE;
let buf = fg.try_lock().expect("rast fg access");
// TODO: this is *ridiculously* slow.
tgt[.. WIDTH].copy_from_slice(
// u32_as_u8 is fn(&[u32]) -> &[u8]
u32_as_u8(&buf[offset .. offset + BUFFER_STRIDE])
);
ctx.target_range = 0..WIDTH;
ctx.cycles_per_pixel *= 2;
ctx.repeat_lines = 1;
} else {
// lol whatever bottom half
}
},
```
The rasterizer using `copy_from_slice` takes 15.57us to run. That's less than a
scanline, so it's actually not implausible.
---
I've jumped through some hoops to ensure that target buffers are always 32-bit
aligned. Using `copy_from_slice` on `[u32]` instead gets us down to 5.7us by
using `__aeabi_memcpy4`, which assumes alignment. (Why `__aeabi_memcpy` doesn't
transfer 0-3 bytes and then call `__aeabi_memcpy4` is beyond me.)
And of course there's `copy_words`. It works the first time, with no mysterious
crashing or corrupted display... memory safe languages are a really pleasant
place to work.
Anyway. `copy_words` takes 2.572us. Yup, twice as fast as `memcpy4` and a cool
six times faster than the original implementation. That's why I keep it around.
Not only does rasterization finish within hblank, it finishes *within the hsync
pulse*.
Which is important, because this is the raster mode that I use for my most
compute-intensive demos. Like this one I'm porting!
```rust
|ln, tgt, ctx| {
    let coarse_ln = ln / SCALE;
    if coarse_ln < HALF_HEIGHT {
let offset = coarse_ln * BUFFER_STRIDE;
let buf = fg.try_lock().expect("rast fg access");
m4vga::util::copy_words::copy_words(
&buf[offset .. offset + BUFFER_STRIDE],
&mut tgt.as_words_mut()[..BUFFER_STRIDE],
);
ctx.target_range = 0..WIDTH;
ctx.cycles_per_pixel *= 2;
ctx.repeat_lines = 1;
} else {
// lol whatever bottom half
}
},
```
---
The bottom half of the screen is a trick, in case you haven't read the code. It
uses the same framebuffer as the top half, but rotated 180 degrees -- that is,
flipped horizontally and vertically. This is the sort of weirdness that I can
indulge in a software rasterizer.
Flipping the bottom half vertically is really easy: just change the addresses we
use at each scanline to work bottom-to-top:
```rust
|ln, tgt, ctx| {
let coarse_ln = ln / SCALE;
let coarse_ln = if coarse_ln < HALF_HEIGHT {
coarse_ln
} else {
// flip bottom half of screen vertically
HEIGHT - coarse_ln - 1
};
let offset = coarse_ln * BUFFER_STRIDE;
let buf = fg.try_lock().expect("rast fg access");
m4vga::util::copy_words::copy_words(
&buf[offset .. offset + BUFFER_STRIDE],
&mut tgt.as_words_mut()[..BUFFER_STRIDE],
);
ctx.target_range = 0..WIDTH;
ctx.cycles_per_pixel *= 2;
ctx.repeat_lines = 1;
},
```
Flipping horizontally is harder. The DMA controller that we use for scanout
can't count backwards (shame, that). So we need to reverse the order of the
pixels during the raster callback.
First crack: the naive way.
```rust
let tgt = tgt.as_words_mut()[..BUFFER_STRIDE].iter_mut();
let src_rev = buf[offset .. offset + BUFFER_STRIDE].iter()
.rev();
for (dst, src) in tgt.zip(src_rev) {
*dst = src.swap_bytes()
}
```
That takes 5.744us per scanline, finishing *just after* SAV. Reviewing the
disassembly, this is going to be *much faster* than the handwritten routine I
used in C++...because LLVM has unrolled it by what appears to be 25x. Well then. I
guess I didn't set `-Oz`, did I.
Done. Moving on. If the binary's too large I can change this later.
---
I initially declared my two framebuffers as undecorated statics, like so:
```rust
static mut BUF0: [u32; BUFFER_WORDS] = [0; BUFFER_WORDS];
static mut BUF1: [u32; BUFFER_WORDS] = [0; BUFFER_WORDS];
```
Because of how my linker script is set up, this puts them both in SRAM1, which
is AHB-attached. This seems to mess up DMA timing on the bottom half of the
screen, because it's slightly fuzzy -- if I move the buffer into CCM (and only
use one) we're good.
Ah. The latency-sensitive ISRs were in Flash. This can cause unpredictable
interrupt latency. It's not immediately obvious to me why generating SRAM1 AHB
traffic would affect that, but I'll worry about it later.
---
Of course both those buffers won't fit in a single SRAM. Gotta move one into CCM
anyway. With that, I can turn on buffer flipping with this main loop:
```rust
let fg = SpinLock::new(unsafe { &mut BUF0 });
let mut bg = unsafe { &mut BUF1 };
// ... things happen ...
|vga| {
vga.video_on();
loop {
vga.sync_to_vblank();
core::mem::swap(&mut bg,
&mut *fg.try_lock().expect("swap access"));
}
}
```
There. Double-buffered.
At this point we've got the display mode we need for Tunnel.
---
Drawing circles usually wants trig. Trig is usually slow. Demos in the "tunnel
zoomer" class circumvent this by precomputing a lookup table; this demo is no
different.
There is a lookup table describing one quarter of the screen; as it's radially
symmetric, you don't need more than one quarter. Since our screen is already
using 2x2 "fat pixels," we need a 200x150 table.
Each table entry contains two values: the distance from the center to the point,
and the angle from the Y axis. This means we'd need 200x150x4x2=240 kilobytes to
store the table using `f32`. We technically have enough Flash to pull this off
if we precomputed it at compile time -- and, in fact, the C++ demo leaves the
table in Flash, because the access patterns tend to hit the cache.
The C++ demo makes the table smaller using two techniques. First, the table is
subsampled 4x in each direction, 16x total, giving us a 50x38 table. Then, each
entry is represented using half-precision floating point. This gets the
footprint down to 50x38x2x2=7600 bytes. That's much more palatable.
Finally, the C++ demo precomputes the tables at compile time using constexpr and
templates.
Rust doesn't have `f16` support, so I'm going to skip that last step and
tolerate the 2x growth it implies. Rust also can't reasonably compute a trig
table at compile time (short of using a build script) so I'll start by computing
it at runtime. If I'm going to compute it at runtime, it'll need to be in RAM,
not Flash -- but at something like 14kiB I can easily afford that, as long as I
place it in SRAM1 with the rest of BSS.
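For reference, the runtime table build amounts to something like the sketch below. Names like `Entry` and `table_entry` are mine, not the demo's, and a hosted build can use `std`'s float methods where the embedded one reaches for `libm`:

```rust
const SUB: usize = 4; // subsampling factor in each direction
const W: usize = 50;  // 200 fat pixels / SUB
const H: usize = 38;  // 150 fat pixels / SUB, rounded up

#[derive(Clone, Copy, Default)]
struct Entry {
    distance: f32,
    angle: f32,
}

fn table_entry(x: usize, y: usize) -> Entry {
    // Quarter-screen coordinates of the sampled fat pixel.
    let fx = (x * SUB) as f32;
    let fy = (y * SUB) as f32;
    Entry {
        distance: (fx * fx + fy * fy).sqrt(),
        angle: fx.atan2(fy), // measured from the Y axis, per the text
    }
}

fn build_table() -> [Entry; W * H] {
    let mut table = [Entry::default(); W * H];
    for y in 0..H {
        for x in 0..W {
            table[y * W + x] = table_entry(x, y);
        }
    }
    table
}
```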
---
So, discovery #1 when attempting to compute a lookup table: `core` doesn't have
trig, or any math support really. That's frustrating.
I'm going to pull in the `libm` crate for starters and evaluate its performance.
Of course, to do that, I need to get rustc to actually include the code in the
output... it's smart enough that I *really do need to be using it* for that to
happen.
Anyway, the answer is that the performance is *atrocious*. `sin` is using what
I'm guessing is a Taylor-series expansion that will take hundreds of cycles;
remember that this CPU has a `vsqrt` instruction that takes 14 cycles,
pipelined.
But! I'm deliberately moving the trig out of the critical path by doing this
lookup table. So who cares!
---
I've glossed the C++ rendering code into Rust, which only required a few
changes, all because of mixed `u32`/`f32` arithmetic.
The shading method produces *different results* in Rust. Given my shading
algorithm this is no surprise...I'm doing bit manipulation and shifts, which we
know have different corner case behavior in Rust.
The rest of the rendering code looks great. It's leaving four pixels of garbage
at the top and bottom of the display -- I don't remember if C++ does the same
thing. (It's because the display is taller than an even multiple of the
subdivided block size, by a fraction of a block.)
The rendering code takes 9.549ms, or 57% of the frame, to complete. I don't
remember how long it took in C++, but I feel like it was longer than that, which
was why I did all those crazy optimizations. I mean, the C++ code sports this,
for chrissakes:
__attribute__((optimize("prefetch-loop-arrays")))
...I remembered correctly. The C++ demo *runs at 30fps.* The render routine --
the same one I just glossed into Rust, but without bounds checks -- takes
20.71ms, or about a frame and a half. And that's using GCC 8.2.0 at `-O2`, not
my original 4.9-based toolchain.
So the Rust one is just about twice as fast. That's surprising, since the code
is so similar. But here I am, watching this do 60fps.
---
I'm starting to remember more. For example, why is tunnel not implemented as a
rasterizer? Why does it bother having a framebuffer? *Because it didn't use to
be fast enough.*
Instrumentation time.
An entire macro-row pass takes 263.9us, or about 10 scanlines. Each such pass
generates 16 scanlines' worth of output. But it's getting to cheat because of
mirroring: without mirroring it only produces 8 scanlines and then has to
recompute the other 8 later. That's not quiiiite fast enough to run realtime.
Close enough that it might be interesting to play with, though.
---
If I force `render` not to inline, it's up to 10.29us, or almost exactly half.
Okay, I've found the culprit: despite writing little static functions that get
called once, or nearly trivial accessors that are in header files, GCC wasn't
inlining *anything.* I've beaten a half dozen functions with the always-inline
stick, and now the C++ code performs very close to the Rust. (At 9.8us, it's
slightly faster -- which makes sense, as it's able to use half-precision floats
to reduce memory traffic.)
It seems only fair to push that fix to the C++ repo.
---
# Documentation: https://sourcethemes.com/academic/docs/managing-content/
title: "South Africa: A Collection of Haikus"
subtitle: ""
summary: ""
authors: []
tags: [travel]
categories: []
date: 2019-12-01T16:34:24-05:00
lastmod: 2019-12-01T16:34:24-05:00
featured: false
draft: false
# Featured image
# To use, add an image named `featured.jpg/png` to your page's folder.
# Focal points: Smart, Center, TopLeft, Top, TopRight, Left, Right, BottomLeft, Bottom, BottomRight.
image:
caption: ""
focal_point: ""
preview_only: false
# Projects (optional).
# Associate this post with one or more of your projects.
# Simply enter your project's folder or file name without extension.
# E.g. `projects = ["internal-project"]` references `content/project/deep-learning/index.md`.
# Otherwise, set `projects = []`.
projects: []
---
Two friends and I flew
Into a land of beauty -
Penguins on the beach!

Barbeque or braai
Lamb seared to sweet perfection
Am I in heaven?

Seashells by the beach
Waves crash, mussels grow, splish splash
Sand between my toes

The best time to think
Is on a safari ride
Lions, giraffes, pause

Benz thinking of you
Soft breeze, gentle gaze, warm touch
Let's take a road trip

I'm free, or am I?
Do I still hate, I wonder
Teach me how to fly

The calm blue, vast sea
Secrets I've kept, buried deep
Float along, my ship

Dash, slide, I jump up
Falling onto silken sheets
Sleep, how I've missed you
| 28.483871 | 100 | 0.645527 | eng_Latn | 0.930851 |
c16f163b657fd1312f2e11923e254eb3fb58cd36 | 4,179 | md | Markdown | scrapy_middleware/README.md | Kiofo/Smartproxy | 29685fa0f09e03597bf6474d3ba3f452f5143ecf | [
"MIT"
] | 1 | 2019-05-09T11:48:15.000Z | 2019-05-09T11:48:15.000Z | scrapy_middleware/README.md | Kiofo/Smartproxy | 29685fa0f09e03597bf6474d3ba3f452f5143ecf | [
"MIT"
] | null | null | null | scrapy_middleware/README.md | Kiofo/Smartproxy | 29685fa0f09e03597bf6474d3ba3f452f5143ecf | [
"MIT"
] | null | null | null | ## <img src="https://smartproxy.com/wp-content/themes/smartproxy/images/smartproxy-logo.svg" alt="" width="200" height="50">
### Disclaimer
In case you are not aware of what Scrapy is or how it works, we suggest to reseach [Scrapy](https://docs.scrapy.org/en/latest/) documentation in order to continue development with this tool.
### Prerequisites
To get started with Scrapy you will first need to install it using methods provided in their documentation. [Check here for more information](https://docs.scrapy.org/en/latest/intro/install.html)
### Installation
Once you get Scrapy up and running if you have not yet, make sure that you create your project folder:
```
scrapy startproject yourprojectname
```
<img src="https://content.screencast.com/users/JohanSP/folders/Jing/media/f974b1de-dc9c-4d53-9d43-9215f8742dc9/startproject.png">
When project directory is setup, you can deploy our middleware:
1. Open Terminal window.
2. Navigate to the main directory of your project folder using `cd yourprojectname`
3. Download our proxy middleware using the following command: `curl https://raw.githubusercontent.com/Smartproxy/Smartproxy/master/scrapy_middleware/smartproxy_auth.py > smartproxy_auth.py`
<img src="https://file2.api.drift.com/drift-prod-file-uploads/5fdb%2F5fdb91ab7c45ffb79ff46e5f37fa6c2b/2.png?mimeType=image%2Fpng">
4. You should now see your project folder populated with *smartproxy_auth.py* file.
<img src="https://file2.api.drift.com/drift-prod-file-uploads/94bb%2F94bb73fc522c281e170a6cc81a077ab5/3.png?mimeType=image%2Fpng">
### Configuration
To start using our middleware for proxy authentication, you'll need to configure settings for our proxy authentication.
Doing so is very simple:
1. Using file manager, navigate to your project folder, you should see *settings.py* file located at the bottom of the directory.
2. Edit the *settings.py* file using an editor of your choice and add the following properties at the bottom:
```
DOWNLOADER_MIDDLEWARES = {
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
'yourprojectname.smartproxy_auth.ProxyMiddleware': 100,
}
SMARTPROXY_USER = 'username' ## Smartproxy Username (Sub-user)
SMARTPROXY_PASSWORD = 'password' ## Password for your user
SMARTPROXY_ENDPOINT = 'gate.smartproxy.com' ## Endpoint you'd like to use
SMARTPROXY_PORT = '7000' ## Port of the endpoint you are using.
```
<img src="https://file2.api.drift.com/drift-prod-file-uploads/b7b3%2Fb7b36a1e9a1556fb7b361ed17144159a/4.png?mimeType=image%2Fpng">
<img src="https://file2.api.drift.com/drift-prod-file-uploads/348f%2F348f7143ae28ad224fa3a46c2dc7716e/5.png?mimeType=image%2Fpng">
3. In `DOWNLOADER_MIDDLEWARES` change `yourprojectname` line to the name of your project.
<img src="https://file2.api.drift.com/drift-prod-file-uploads/87d4%2F87d428d4f2d47e29f3e986e109005d26/6.png?mimeType=image%2Fpng">
4. Make sure that you enter your account details as well as proxy details within quotation marks ('').
5. Save the file.
Once all that is done, all of your spiders will go through our proxies. If you are not sure how to set up a spider, take a look [here](https://docs.scrapy.org/en/latest/intro/tutorial.html#our-first-spider)
## How to get started with Smartproxy?
<br><img src="https://smartproxy.com/wp-content/uploads/2019/02/order-smartproxy.png">
<br> 1. To utilize residential proxies you will require a Smartproxy account; register here -> https://dashboard.smartproxy.com
<br> 2. You will need an active subscription too. If you're interested in testing it out first, our plans Starter and above have a 3-day money-back policy -> https://smartproxy.com/pricing
<br> 3. Once you purchase it you will get access to the whole pool, regardless of plan!
<br><br><center>Accepted payment methods:
<br><img src="https://smartproxy.com/wp-content/uploads/2018/09/payment-methods-smartproxy-residential-rotating-proxies.svg" alt="" width="250" height="50"></center>
## Contacts
Email - sales@smartproxy.com
<br><a href="https://smartproxy.com">Live chat 24/7</a>
<br><a href="https://join.skype.com/invite/bZDHw4NZg2G9">Skype</a>
<br><a href="https://t.me/smartproxy_com">Telegram</a>
# Kubeconfig
To access a Kubernetes cluster, a *kubeconfig* file is required. A kubeconfig file has all the necessary cluster metadata and authentication details, which grant the user permission to query the cluster objects. Usually, the kubeconfig file is stored locally under the `~/.kube/config` file. However, k3s places the kubeconfig file at the `/etc/rancher/k3s/k3s.yaml` path. Additionally, the location of a kubeconfig file can be set through the `--kubeconfig` kubectl flag or via the `KUBECONFIG` environmental variable.
A kubeconfig file has three distinct sections:
- *Cluster* - encapsulates the metadata for a cluster, such as the name of the cluster, the API server endpoint, and the certificate authority used to verify the identity of the API server.
- *User* - contains the details of a user that wants access to the cluster, including the user name and any authentication metadata, such as a username and password, a token, or client and key certificates.
- *Context* - links a user to a cluster. If the user credentials are valid and the cluster is up, access to resources is granted. Also, a current-context can be specified, which instructs which context (cluster and user) should be used to query the cluster.
Here is an example of a kubeconfig file:
```
apiVersion: v1
# define the cluster metadata
clusters:
- cluster:
certificate-authority-data: {{ CA }}
server: https://127.0.0.1:63668
name: udacity-cluster
# define the user details
users:
# `udacity-user` user authenticates using client and key certificates
- name: udacity-user
user:
client-certificate-data: {{ CERT }}
client-key-data: {{ KEY }}
# `green-user` user authenticates using a token
- name: green-user
user:
token: {{ TOKEN }}
# define the contexts
contexts:
- context:
cluster: udacity-cluster
user: udacity-user
name: udacity-context
# set the current context
current-context: udacity-context
```
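A client resolves the `current-context` by following the name references between the three sections. The stdlib-only sketch below shows that lookup logic with a plain dict standing in for the parsed YAML (a real client would parse the file with a YAML library first):

```python
def resolve_current_context(config):
    """Follow current-context -> context -> (cluster, user) references."""
    name = config["current-context"]
    ctx = next(c["context"] for c in config["contexts"] if c["name"] == name)
    cluster = next(c["cluster"] for c in config["clusters"]
                   if c["name"] == ctx["cluster"])
    user = next(u["user"] for u in config["users"] if u["name"] == ctx["user"])
    return cluster, user

# A dict mirroring the example kubeconfig above, with placeholder values.
config = {
    "clusters": [{"name": "udacity-cluster",
                  "cluster": {"server": "https://127.0.0.1:63668"}}],
    "users": [{"name": "udacity-user", "user": {"token": "secret"}}],
    "contexts": [{"name": "udacity-context",
                  "context": {"cluster": "udacity-cluster",
                              "user": "udacity-user"}}],
    "current-context": "udacity-context",
}

cluster, user = resolve_current_context(config)
print(cluster["server"])  # https://127.0.0.1:63668
```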
[Once you start handling multiple clusters, you'll find a lot of useful information in this article](https://community.suse.com/posts/cluster-this-is-your-admin-do-you-read)
- Inspect the endpoints for the cluster and installed add-ons
`kubectl cluster-info`
- List all the nodes in the cluster.
- To get a more detailed view of the nodes, the `-o wide` flag can be passed
`kubectl get nodes [-o wide]`
- Describe a cluster node.
- Typical configuration: node IP, capacity (CPU and memory), a list of running pods on the node, podCIDR, etc.
`kubectl describe node {{ NODE NAME }}`
When using k3s, you can find the kubeconfig as follows:
- To see if the file exists
`ls /etc/rancher/k3s/k3s.yaml`
- To read the file
`cat /etc/rancher/k3s/k3s.yaml`
# SaaS IHRM System
## Run
### 1. Install node.js
https://nodejs.org/en/
### 2. Run the Project
First install the dependencies, then install the mockjs dependency, and finally run:
```shell
cnpm install
cnpm install mockjs
npm run dev
```
### 3. Configure the API Address
Edit the file `config/index.js`.
Modify the `api` and `upfile` proxy forwarding configuration:
```js
module.exports = {
dev: {
// Paths
assetsSubDirectory: 'static',
assetsPublicPath: '',
proxyTable: {
'/api': {
target: 'https://mock.boxuegu.com/mock/29',
changeOrigin: true,
pathRewrite: {
'^/api': ''
}
},
'/upfile': {
target: 'http://172.17.0.65/system/upfile',
changeOrigin: true,
pathRewrite: {
'^/upfile': ''
}
}
    },
    // ...remaining dev options omitted...
  }
}
```
## Backend
The backend is written as microservices, using Spring Boot + Spring Cloud + Spring MVC + Spring Data.
Repository: https://github.com/Han-YLun/SaaS_IHRM
c17251d3d51bb4ea69c29ccfa69c0f601812012b | 61 | md | Markdown | README.md | aiwee/blues-icons | 8f3e94773e981c267771c77b152773a438800b2f | [
"MIT"
] | 1 | 2019-01-14T16:22:35.000Z | 2019-01-14T16:22:35.000Z | README.md | aiwee/blues-icons | 8f3e94773e981c267771c77b152773a438800b2f | [
"MIT"
] | null | null | null | README.md | aiwee/blues-icons | 8f3e94773e981c267771c77b152773a438800b2f | [
"MIT"
] | null | null | null | ## Blues Icon Font
Simple icon font set for Blues Framework
| 15.25 | 40 | 0.770492 | kor_Hang | 0.264638 |
c172841b4c331677f00a4a40e2765344e4c2b5a0 | 149 | md | Markdown | README.md | piradoiv/p5-asteroids | fabe18ed3b699bc9a1ca91098f518b8d3d9c3d2e | [
"MIT"
] | null | null | null | README.md | piradoiv/p5-asteroids | fabe18ed3b699bc9a1ca91098f518b8d3d9c3d2e | [
"MIT"
] | null | null | null | README.md | piradoiv/p5-asteroids | fabe18ed3b699bc9a1ca91098f518b8d3d9c3d2e | [
"MIT"
] | null | null | null | # p5-asteroids
Asteroids using [http://p5js.org](p5.js) library (Processing).
You can see a live demo on CodePen:
http://codepen.io/anon/pen/JEGJaQ
| 24.833333 | 62 | 0.738255 | yue_Hant | 0.708335 |
c172a274d46a02f271f37b453fe0e5c07bb69050 | 33,543 | md | Markdown | CHANGELOG.md | Pfaufisch/ultraschall-3 | 49f9a57e639b317dffede2c15c5e9cc7f4993f61 | [
"MIT"
] | null | null | null | CHANGELOG.md | Pfaufisch/ultraschall-3 | 49f9a57e639b317dffede2c15c5e9cc7f4993f61 | [
"MIT"
] | null | null | null | CHANGELOG.md | Pfaufisch/ultraschall-3 | 49f9a57e639b317dffede2c15c5e9cc7f4993f61 | [
"MIT"
] | null | null | null | # Ultraschall Changelog
The following features will be presented in detail in our Ultraschall tutorial videos:
[tp://ultraschall.fm/tutorials/](http://ultraschall.fm/tutorials/)
## 3.1 Miedinger - 2018-January
* REAPER: **Support for version 5.70**
_Ultraschall 3.1 has been optimized for REAPER version 5.70. It will not run with any other versions or REAPER._ **We strongly advise to avoid manual REAPER updates in the future**. _This will lead to the Ultraschall extension being disabled._
* Theme: **Master Channel always visible in the Mixer**
_The Master channel in the mixer is now visible by default in all views, without having to toggle its visibility using the Tab key. The Master channel is divided into two sections: the normal output signal is displayed in the center, and an RMS meter with an average of 2,000 milliseconds is located to the left and right. In combination with the new Ultraschall Dynamics2 effect, this enables you to determine whether you constantly achieve the desired -16 LUFS. The actual measurement is carried out in RMS instead of LUFS; however, this is still accurate enough to provide a practical approximation for vocal recordings. The states are marked by color: Blue: up to -18 LUFS (mix is too quiet); Green: -18 to -14 LUFS (mix volume is properly set); Yellow: -14 to -12 LUFS (mix is too loud; this level should only be reached occasionally); Orange: -12 to 0 LUFS (mix is considerably too loud). The desired mix only occasionally leaves the green zone while someone is talking._
[Video: Master Mix and RMS Display](http://ultraschall.fm/wp-content/uploads/2018/01/Master_Mix_and_RMS.gif)
* Mastering: **Ultraschall Dynamics2 Loudness Normalization** _The Dynamics effect for loudness normalization of tracks in the 3.0 release has received a major overhaul: three sliders now let you control the target volume (typically -16 dB LUFS), the noise floor (the volume threshold at which a signal is considered noise), and a noise gate (steeper, slower, or off). The effect has also been updated with a limiter to avoid digital clipping. You can save custom preferences as presets as well – even per track._
[Video: Ultraschall Dynamics 2 Loudness Normalization](http://ultraschall.fm/wp-content/uploads/2018/01/dynamics2b.gif)
* Export (Mac): **Exporting and Tagging of m4a**
_On macOS, you can now generate m4a files in the Render Assistant as well. The Export Assistant will include chapter markers, metadata, and the episode cover, just as is the case with mp3 files._
[Video: m4a Export](http://ultraschall.fm/wp-content/uploads/2018/01/m4a_export.gif)
* Theme: **Optimized View of selected Items**
_Selected items will no longer be highlighted by brightness, but rather by a golden frame._
[Video: Select Items](http://ultraschall.fm/wp-content/uploads/2018/01/Select_Items.gif)
* Theme: **Alternative wave form view**
_Use the shortcut `alt`+`shift`+`h` to change the wave form in such a way that the waves will no longer extend in both directions from the center line, but rather up from the bottom line._
[Video: Rectify Peaks](http://ultraschall.fm/wp-content/uploads/2018/01/Rectify_Peaks.gif)
* Theme: **Wave Form Display as Spectograph**
_By means of the Peak Display Settings you can display a spectrograph alongside the typical wave form – ideal to identify hum or buzz noise._
[Video: Spectograph](http://ultraschall.fm/wp-content/uploads/2018/01/Spektrogramm.gif)
* Theme: **Volume Zoom of the Wave Form**
_Using the shortcut `alt`+`h` you can vertically zoom the wave form in two steps. Doing so might make editing of very quiet passages easier for you._
[Video: Toggle Peak Gain](http://ultraschall.fm/wp-content/uploads/2018/01/Toggle_Peak_Gain.gif)
* Editing: **Zoom to Edit Cursor**
_Using the shortcut `z` you can directly zoom in to the Edit cursor for fine editing at a suitable zoom level. Pressing `z` again will zoom back out to the previous zoom level._
[Video: Toggle Arrange Zoom](http://ultraschall.fm/wp-content/uploads/2018/01/Toggle_Arrange_Zoom.gif)
* Theme: **Limiting Zoom-Out**
_Particularly when `pinch-zooming` using the trackpad you might happen to zoom out too far, making it difficult to find your project. The maximum zoom-out is now limited to the project length (project completely visible). Using `alt`+`pinch-zoom` you can use the traditional unlimited zoom feature._
[Video: Zoom Limiter](http://ultraschall.fm/wp-content/uploads/2018/01/Zoom_Limiter.gif)
* Theme: **Zoom to Playhead**
_Using `pinch-zoom` you still zoom in to the position of the mouse cursor. Using the keyboard zoom via the `+` and `-` keys however you zoom to the playhead position. Using the `^` key will move to the Edit, Record or playhead without zooming._
[Video: Zoom to Playhead](http://ultraschall.fm/wp-content/uploads/2018/01/Zoom_to_Cursor.gif)
* Theme: **Ultraclock**
_The "Recording" view shows an advanced clock - it shows the current position of the playhead as before, but also the current transport status (stop, pause, play, record, loop) in color. Furthermore the current time is shown – handy for planning start and end of a live broadcast. Use the right mouse button to open a context menu for adjusting the displayed information._
[Video: Ultraclock](http://ultraschall.fm/wp-content/uploads/2018/01/Ultraclock.gif)
* Theme: **Optimized Tooltips**
_All button now feature practical tool tips when hovering over them with your mouse cursor._
[Video: Tool Tips](http://ultraschall.fm/wp-content/uploads/2018/01/tooltipps.gif)
* Theme: **Display of the Recording Format**
_The right hand side of the menu bar now also shows the used recording format alongside the free disk space. Standard for typically gain staged recordings is FLAC (lossless compression) in 16bit at 48KHz. Feel free to adjust this to 24bit if you are in need of more headroom from the higher bit rate._
[Video: Recording Info](http://ultraschall.fm/wp-content/uploads/2018/01/Recinfo.gif)
* Theme: **Envelope Tracks**
_Envelope tracks now show their type names such as “Mute”, “Volume”, etc. Using `shift`+`h` you can toggle the height - similar to changing track height using `h`._
[Video: Envelope Info](http://ultraschall.fm/wp-content/uploads/2018/01/Envelope_Info.gif)
* Theme: **Optimized RecArm Behavior**
_When creating new track using the Podcast menu or the shortcut `cmd`+`t`, these are still enabled for recording (RecArm). When creating a track using a double click in the left track area or via the mixer or drag & drop, these will not be record enabled._
[Video: Auto Arm new Tracks](http://ultraschall.fm/wp-content/uploads/2018/01/Auto_Arm_New_Tracks.gif)
* Theme: **Switching the RecArm Status for all Tracks**
_Using the `shift`+`a` shortcut you can arm all tracks for recording or disarm them respectively if they have been armed before. Switching the status only works when not recording._
[Video: Toggle RecArm all Tracks](http://ultraschall.fm/wp-content/uploads/2018/01/Toggle_Arm_all_Tracks.gif)
* Theme: **Toggling the Playback Speed**
_Using the `alt`+`j` shortcut you can toggle between 1.0x and 1.5x playback speed. If this is too fast: you can simply enter a different value on the play rate slider and toggle using `alt` + `j`. This value will be preserved. A big Thanks to leonidlezner!_
[Video: Envelope Info](http://ultraschall.fm/wp-content/uploads/2018/01/Switch_Playrate.gif)
* Editing: **Follow Mode**
_Using `cmd`+`f`or the relevant new icon you can enable or disable the Follow mode. Enabling will cause the timeline to jump to the current position of the play or record head and scoll along to keep track of the play head. Disabling will detach the timeline from the play head so you can work on other areas within the project without having to stop playback or recording. All chapter marker features will relate to the position of the Edit cursor and no longer the play/record cursor._
[Video: Follow Mode](http://ultraschall.fm/wp-content/uploads/2018/01/follow.gif)
* Editing: **Edit time selection on the edges**
_You can now grab and move a time selection from both of the outer edges._
[Video: Adjust Time Selection](http://ultraschall.fm/wp-content/uploads/2018/01/Adjust_Time_Selection.gif)
* Editing: **Time selection will automatically select items**
_When you mark a certain time selection with the Time Selection tool enabled, the actual item will automatically be selected. Marking a time selection over multiple items will select all these items respectively._
[Video: Select Items with Time Selection](http://ultraschall.fm/wp-content/uploads/2018/01/Select_Items_with_Time_Selection.gif)
* Editing: **Muting selected items along the time selection**
_Using `cmd`+`y` you can mute the time selection across all selected items. Using `cmd`+`shift`+`y` you can unmute this time selection again._
[Video: Mute Items with Time Selection](http://ultraschall.fm/wp-content/uploads/2018/01/Mute_Items_with_Time_Selection.gif)
* Editing: **Double Click to Play**
_Double clicking in the timeline or editing window will start playback from the clicked position._
[Video: Double Click to Play](http://ultraschall.fm/wp-content/uploads/2018/01/Double_Click_to_Play.gif)
* Editing: **Moving Items using the Keyboard**
_Using the `alt`+`left` or `alt`+`right` keys respectively you can move items within the timeline. Press the `n` key to open a preference window for setting the increments._
[Video: Nudge Items](http://ultraschall.fm/wp-content/uploads/2018/01/Nudge_Items.gif)
* Chapter Markers: **Importing planned Chapter Markers from the Clipboard**
_You can now import any text from the clipboard as a planned chapter marker (green color) into the project. Line breaks determine the individual markers._
[Video: Import planned Markers from Clipboard](http://ultraschall.fm/wp-content/uploads/2018/01/Import_from_Clipboard.gif)
* Chapter Markers: **Placing planned Chapter Markers**
_Using the `b` shortcut you can place the next prepared marker at the current playhead or edit cursor position (depending on the follow mode, see above)._
[Video: Set planned Markers](http://ultraschall.fm/wp-content/uploads/2018/01/Set_Planned_Markers2.gif)
* Chapter Markers: **Converting all Markers to planner Chapter Markers**
_All markers in the project can be converted to green planned chapter markers. The existing order will be unaffected._
[Video: Set all Markers to Planning Stage](http://ultraschall.fm/wp-content/uploads/2018/01/Set_Markers_to_Planned.gif)
* Chapter Markers: **Visibility of Chapter Markers during Placement**
_When placing chapter markers at the playhead or edit cursor while being outside the visible area, the timeline will be moved to the relevant position so you can actually see where you have placed the marker._
[Video: Show Positioning of Markers](http://ultraschall.fm/wp-content/uploads/2018/01/Show_Positioning_of_Markers.gif)
* Chapter Markers: **Always place Chapter Markets at Play/Record Cursor position**
_By using the `cmd` key you can always place chapter markers at the pay/record cursor position (i.e. `cmd`+`m`, `cmd`+`e`, `cmd`+`b` and `cmd`+`shift`+`m` respectively). The Follow mode (see above) will be ignored in this case and the viewport won’t change. These features are particulary suitable to be assigned to a Midi interface._
[Video: Always set Markers to Play/Rec Position](http://ultraschall.fm/wp-content/uploads/2018/01/Set_Markers_to_Play.gif)
* Chapter Markers: **Chapter Marker Management**
_In order to be able to edit existing chapter markers during listen through, the auto play feature of chapter markers in the chapter marker list will be disabled. Playback will simply resume no matter how you edit the chapter markers. To quickly jump to a chapter marker in the timeline, simply double click to the right of the marker in the free list area. The markers will be numbered for better orientation. The numbers won’t make it into the texts._
[Video: Edit Markers](http://ultraschall.fm/wp-content/uploads/2018/01/Edit_markers.gif)
* Misc: **Enhanced Keyboard Shortcuts**
_The overview of all relevant [keyboard shortcuts](http://url.ultraschall-podcast.de/keymap31) in Ultraschall has been updated and expanded._
* Easter Egg: **Should you take your time**
_Easter comes sooner than you might think. Using the `alt`+`cmd`+`shift`+`u` shortcut you can sweeten your wait._
* Misc: **Bug Fixes and Optimization**
_Several smaller enhancements, particularly for special cases. Too many details might even unsettle you._
## 3.0.3 Miedinger - 2017-März-
* REAPER: **Commitment to a specific version number**
_For the sake of troubleshooting and in order to maintain system stability, Ultraschall will check for a specific REAPER version ( **5.70** ) on every launch. Future releases will be bound to specific newer versions of REAPER **We therefore strongly advise NOT to manually update REAPER**, since the Ultraschall extensions would be deactivated._
* Actions: **Reworked Routing Snapshots**
_It is sometimes necessary to prepare and recall different routing situations – e.g. using Studio-Link OnAir – such as pre-show with music on the stream, the actual show with all the speakers, and after-show with music at low volume. The routing snapshot area has therefore been reimplemented completely and enhanced with its own user interface. The snapshots now also support Sends to Master and Hardware Sends._
* Keymap: **Shortcuts for Routing Snapshots**
_`F1` to `F4` access pre-defined Routing Snapshots. `Shift` + `F1` to `F4` writes the currently configured Routing Matrix into the respective snapshot slot._
* Keymap Mac: **Aligned with system standard**
_The following shortcut actions were aligned with macOS' system standard: Drag-and-copy of items now works with `alt` + `primary clicking-and-holding`. The current project tab is closed by `cmd` + `w`._
* Streaming: **Update for Studio Link OnAir**
_Fixes a rare stability problem in the Windows version of Studio Link OnAir._
* Editing: **Further improvements to `esc` key behavior**
_The "delete entire selection" function now includes also unselected envelope points. Amazing._
* Editing: **Midi actions for EQ tuner**
_Two new Ultraschall midi actions simplify moving the in- and outpoints via a classical EQ tuner on a midi interface: The middle setting has no effect, a turn to the left or right speeds up movements in the respective direction. The further the turn, the faster the in-/outpoint moves. The scripts (`ultraschall_midi_move_start_of_time_slection.lua` und `ultraschall_midi_move_end_of_time_slection.lua`) need to be manually assigned to a midi signal._
* Installer: **Update Check**
_Update Checks can now be en- and disabled at any time in the new start screen of Ultraschall._
* Theme: **Expanded Ultraschall start screen**
_The new start screen now also contains the information from the old `About Ultraschall...` menu, which is hereby removed._
* Soundboard: **Bugfix**
_The Soundboard will no longer stop playback when recalling a Routing Preset._
* Soundboard: **Presets**
_The Soundboard now uses presets throughout. This enables saving and loading of sound sets even during a recording. Such on-the-fly changes also enable to easily use more than 24 sounds during a show. One can assign the presets to one's own project presets, so that each session starts with the correct jingles._
## 3.0.2 Miedinger - 2017-March-09
* Editing: **Shortcuts for Inpoint and Outpoint**
_Use the buttons `alt` + `i` and `alt` + `o` to jump to the beginning or end of a time selection. With `shift` + `i` and `shift` + `o` sound will be played from the respective position._
* Editing: **Improved ripple-cut**
_The new ripple-cut function (`cmd` + `x` as well as the corresponding icon) now behaves more sensibly: if nothing is selected, nothing happens. If a time selection is selected, only this will be cut, regardless of items that may or may not be selected at the same time. If no time selection is selected, but exactly one item, a ripple cut (Tim's Ripple Cut) is made by means of its edges. If no time selection is selected, but several items, nothing happens._
* Editing: **Quickly change and freeze track height**
_Use the `h` key to quickly switch between two (adjustable) track heights, which keep their height even when the window is changed in size. The switch affects all selected tracks. If no track is selected, all tracks are affected. The freezing of the height can be canceled with the zoom buttons beneath the vertical scrollbar._
* Keymap: **Optimization of keyboard shortcuts and mapping for Windows**
_The assignment of shortcuts for Windows was adapted to the behavior on macOS. The PDF template now works the same on both systems. New shortcuts include: `shift` + `n` - Normalization of selected items as well as `alt` + `d` - Switch off all docks for distraction-free editing._
* Theme: **FX buttons on master channel**
_The buttons for setting FX are now also visible in the master channel in any size._
* Actions: **Advanced Prepare all tracks for editing**
_The function now includes switching on the sends of all tracks towards the master channel - a common source of mistakes for seemingly missing tracks in the final mix._
* Mastering: **Extended episode images**
_To quickly use a standard image as episode image, you may now also use files with the name `cover.jpg`, `cover.jpeg` or `cover.png` in the project folder. Thanks to Mespotine for the idea._
## 3.0.1 Miedinger - 2017-March-05
* Streaming **Studio Link OnAir Streaming**
_Thanks to the support by Studio Link OnAir you can now start a livestream of your show with a single click. The signal attached to the mastermix will be streamed via a public web-interface. There, you can set stream properties and share the URL._
* Installer: **Update Check**
_Ultraschall will check upon launching whether a new version has been published. It will validate that compatible versions of plugin and theme are installed. If not it will warn you._
* Installer: **LAME MP3 Encoder**
_The LAME MP3 Encoder in version 3.98.3 is automatically installed._
* Theme: **Ultraschall Start-up Screen**
_A new start-up screen confirms the successful installation and provides initial hints and links to the support resources._
* Theme: **View Adjustments**
_The view switchers in the top left now show the current mode after restarting Reaper (keyword: persistence layer). In the Edit view, the navigator window is now displayed over the whole width. Additionally, a new `Loudness` tab was added to the bottom left, which lets you measure the LUFS of tracks or individual items (see Ultraschall Dynamics)._
* Theme: **FX always visible in the mixer**
_Due to the continually growing importance of effects (StudioLink, OnAir, Soundboard, Dynamics), the FX buttons in the tracks' mixer panel are now always visible, and remain so when you shrink the windows._
* Theme: **Waveform**
_Selected items and cuts within the waveform are highlighted more clearly now._
* Theme: **User Interface**
_Many GUI elements receive more contrast to increase their visibility._
* Theme: **Selection Tool**
_You can now switch the mouse pointer between two editing modes: the previous mode selects and moves individual items on the timeline, while the new mode helps you select times for faster (ripple) cutting. Times can thus be selected anywhere on the timeline, not just on the upper edge as previously. A new icon, as well as the keyboard shortcut `#`, switch between the modes._
* Theme: **Emphasis on 'Prepare all tracks...'**
_The menu action 'Prepare all tracks for editing' needs to be used after each recording and before cutting. It is now emphasised visually. Also, the function itself has been reimplemented and expanded. A new status dialogue is displayed to provide user feedback after successful completion._
* Theme: **Podcast Menu**
_Several `Podcast` menu entries were updated and reordered more logically._
* Editing: **Volume Editing**
_Selected tracks can be overlayed with a new volume envelope (pre-FX!) via the menu or the shortcut `alt`+`v`. That overlay helps you realize complex fadings or volume gradients. Moreover, we've added a new pre-FX gain controller to the track area's left part. This controller adjusts the total volume of a single track; with visual feedback by the waveform. The tracks' visibility is toggled via the top icon known from the mute track._
* Editing: **Easier Envelope Handling**
_We reworked the mode of setting and moving points in the (mure or volume) envelopes. Now you simply move existing points or click at where the next point should be set. A muted section can thus be set via two clicks. The old free-hand drawing mode can be reactivated any time by pressing `cmd`._
* Editing: **Better `esc` Key Behaviour**
_We believe in the future and necessity of the `esc` key. That's why we enhanced the "delete any selection" function considerably. It now deselects track, item, envelope and time markers._
* Editing: **Pre-Listening Cuts**
_The shortcut `p` allows you to pre-listen the result of a cut by time markers, without actually applying the cut. This, together with the new shortcuts `<`, `y`, `x` und `c` for moving the in- and outpoints of a time selection, enables more efficient and controlled cutting._
* Editing: **Play Cursor at the Beginning of a Time Selection**
_Selecting time places the play cursor directly at the inpoint of the selection, to that one can use `return` or `space` to directly listen to the selection._
* Editing: **Expanded Ripple Cut**
_The shortcut `cmd` + `x` effects a ripple cut over all tracks, even when only a single item is selected. The cutting range will then be the start- and endpoint of the item._
* Keymap: **New Layout for Keyboard Shortcuts**
_A multitude of shortcuts was reworked and newly added in order to enable more efficient cutting via the keyboard. The new shortcuts are visualised in a [.PDF](http://url.ultraschall-podcast.de/keymap) and you can reflect customizations in the corresponding PowerPoint source file._
* Mastering: **Ultraschall Dynamics**
_The new Dynamics Effect can optimize the podcast's loudness to -16 LUFS. This effect replaces the previously recommended AU General Dynamic Effect and can also be applied in Ultraschall's Windows version. We deliver presets with and without soft noisegate in order to reduce faint disturbances. The effect can be used and parameterized on single tracks, single items, as well as on the master channel. Please note: the effect is less suited for the repair of audio problems (humming, reverberation, etc.) against which we highly recommend using [Auphonic](https://auphonic.com/)._
* Mastering: **Effect Templates for New Tracks**
_Adding new tracks automatically adds the effects ReaEQ (Equalizer) and JS: General Dynamics, but without activating them._
* Mastering: **New EQ Preset**
_Ultraschall 3 includes a new EQ preset, that delivers less bass boost then the preset from version 2. It is a good starting point for the headsets DT297 and HMC660._
* Mastering: **Export Assistant**
_`Export` can be found in the bottom left icon bar and helps you generate perfect MP3 files. The ID3V2 elements metadata (like title, podcast, etc.), episode images and chapter marks will be written to the MP3 files you produce._
* Mastering: **Noise Filter**
_We've added the ReaFix effect to the effect favorites in order to fix common sound problems like hissing or electric hum. Its use is explained in the Ultraschall Dynamics video._
* Mastering: **Open Project Directory**
_the menu command and the icon for opening the project directory now does exactly that, instead of the subdirectory with the sound files._
* Actions: **Colorpicker**
_The user-friendly Colorpicker (shortcut: `alt`+`c`) helps you keep the overview in complex projects; assign colors to tracks and single clips, or even gradients to multiple tracks, which can be fluid or using a sensible contrast range._
* Actions: **Import Chaptermarks from WAV Files**
_Some devices - like Zoom H5 and H6 - allow writing chapter marks to the .WAV file during the recording. A new chapter mark action allows reading those and converts them into Ultraschall chaptermarks._
* Actions [Windows]: **Bugfix for Umlauts**
_We fixed a bug in the handling of chaptermarks with mutated vowels. Thanks to [@jalea](https://twitter.com/jalea) and to Nico Buch for tracking down that error!_
* Soundboard: **Bugfix**
_Fixed a bug where the soundboard would not pause playback when triggered with OSC. Thanks to Bastian Boessl for reporting this bug._
## 2.2.2 Gropius - 2016-August-14
* Soundboard [Mac]: **Bugfix**
_Fixed a bug that prevented a recorded soundboard-track from playing._
* Misc [Windows]: **Compatibility**
_Updates for Windows 10 Version 1607 (Build 14393, Anniversary-Update)._
* StudioLink: **Update**
_Ultraschall now includes the updated StudioLink version 16.04.1._
## 2.2.2 Gropius - 2016-August-14
* Soundboard [Mac]: **Bugfix**
_Fixed a bug that prevented a recorded soundboard-track from playing._
* Misc [Windows]: **Compatibility**
_Updates for Windows 10 Version 1607 (Build 14393, Anniversary-Update)._
* StudioLink: **Update**
_Ultraschall now includes the updated StudioLink version 16.04.1._
## 2.2.1 Gropius - 2016-June-09
* Installer: **Bugfix**
_Fixed a bug that prevented the Uninstall script to delete legacy files from the system. Thanks to Marcus Schwarz for reporting this bug._
* Installer: **Bugfix**
_Fixed another a bug that prevented the Uninstall script to delete legacy files from the system. Thanks to Wolfgang Schulz for reporting this bug._
## 2.2 Gropius - 2016-June-05
* Theme: **Theme fine-tuning**
_Beautification of scroll bars, zoom icons and sliders in VST/AU effect displays_
* Actions: **Cough button and mute track**
_Complete rework of the cough button und mute button feature. Before you start your recording session, the mute button toggles cough button activation. When your recording session is complete, the mute button toggles the display ot the mute track._
* Actions: **Track selection using the keyboard**
_You can now toggle the selection of each track by using the numerical keys on your keyboard. Use '1' to '8' to toggle the selection of the respective track. Pressing '9' selects all tracks, '0' deselects all tracks._
* Actions: **Prepare all tracks for editing**
_The script "Prepare all tracks for editing" for completing a recording session has been improved and more parameters have been added_
* Actions: **New menu item "Customize"**
_Ultraschall 2.2 introduces the "Customize" menu item. You can use 'Shortcuts and action list' to change keyboard shortcuts and add new scripts to Ultraschall. Use 'Track: set track icon' and 'Show theme configuration window' to change the icons, colors and names of each track._
* StudioLink: **Full integration of the StudioLink plugin**
_Ultraschall now supports the StudioLink plugin. With StudioLink you can record remote calls on up to 8 individual tracks without the need to setup the notorious 'N-1 Schalte'. Remote attendees are not required to use REAPER. They can download the StudioLink Standalone-App (see https://doku.studio-link.de/standalone/installation-standalone.html)._
* Soundboard: **Soundboard for Windows**
_The Ultraschall Soundboard now runs on Windows as a VST._
* Soundboard: **Soundboard audio unit for Mac**
_The Ultraschall Soundboard for Mac is now an audio unit._
* Installer: **StudioLink Plugin**
_The Ultraschall installer now includes the StudioLink plugin._
* Preferences: **Improved Stop Function**
\*The stop function has been reworked in such a way that the cursor in recording mode is set to the end of the current recording. This is to prevent the creation of further takes if the recording session continues.
* Preferences: **Automatic Record Arm for newly inserted tracks**
\*Tracks are now set to 'Record Arm' after they have been inserted into the REAPER project. You are not any longer required to press the red 'Record Arm'-button.
* Misc: **Ultraschall 3 Preset for ReaEQ**
\*The preset Ultraschall 3 is automatically activated if you insert the ReaEQ Equaliyer into the tracks effect chain. This results in less rumble and less boost of lower frequencies.
* Misc: **Ultraschall Presets for audio compoitions**
_The effect presets Telephone (ReaEQ), Small Room, Large Room, Church (ReaVerbate) may be used for voice colorization in audio compositions_
* Installer: **Bugfix**
_The code signature of Uninstall.command was corrupted. Thanks to Arnd Layer for reporting this bug._
## 2.1.1 Gropius - 2016-Feb-27
* Theme: **Further fine-tuning of the new theme**
* Windows: **Corrected color adjustments for audio tracks on Windows**
## 2.1 Gropius - 2016-Feb-19
* Theme: **Comprehensive fine-tuning of the new theme**
_The contrast of nearly all elements was increased in order to enhance usability -- also in suboptimal lighting conditions. Groups of icons are now visually encapsulated. Individual icons were redesigned in order to clarify their function. The appeal of all buttons was harmonised. Many colors were fine-tuned in order to match Ultraschall's palette. Cuts within a track item are now marked by a clearer line._
* Theme: **Routing matrix contains display of input assignments**
_In the lower area of the routing matrix, input channels can now be assigned to tracks. Thus, really all relevant assignments happen at a single site._
* Misc: **SWS/S&M Extension 2.8.3**
_Ultraschall now comes with the [SWS/S&M Extension 2.8.3](http://www.sws-extension.org/)._
* Actions: **New shortcuts for more fluent editing with the keyboard**
_Start and end of a time selection can now be set with the `i` and `o` keys, just as in many video editing programs. The keys `1` and `2` now jump you to the start and end of a selection. The key assignments for jumping between chapter marks was changed, because it collided with the word-wise jumping within text sections._
* Actions: **Start/stop/pause safe mode for the keyboard**
_In order to prevent the unintended stop of an ongoing recording, the keys `RETURN` and `SPACE` are now deactivated during a recording. A prompt will appear and can be affirmed to actually stop the recording._
* Actions: **More robust chapter mark functions**
_All chapter mark functions were reimplemented in Lua and now take into account, whether a recording is **a)** ongoing, in which case the marker is set at the currently recorded position, or **b)** being played back, in which case it is set at the currently played position, or **c)** not applicable, in which case the marker is set to current position of the editing cursor. The MIDI connection was redesigned more robustly, so that chapter marks can be set via MIDI in any program state -- even when existing markers are being edited._
* Actions: **Labelling of Ultraschall actions**
_All Ultraschall function are now uniformly pre-fixed with `ULTRASCHALL:` and more clearly labelled in the actions dialogue for the keyboard assignments, which is accessible through the `?` key._
* Soundboard [Mac only]: **Bugfix**
_REAPER crashes when the folder that is to be loaded into the soundboard contains only one audio file. Thanks to Sven Wiegand for reporting this bug._
* Soundboard: **Bugfix**
_REAPER crashed when the folder that is to be loaded into the soundboard contained an audio file that can't be decoded. Thanks to René Kurfürst for reporting this bug._
## 2.0 Gropius - 2015-Dec-09
* Misc: **Reaper 5 and OS X 10.11 El Capitan**
_Ultraschall is now optimized for Reaper 5 and OSX 10.11 El Capitan. Older version are not be supported any more._
* Installer [Mac only]: **User, instead of system, library**
_The new installer writes its entries into the user library of OS X, instead of the system library._
* Theme: **Complete redesign**
_More consistency, less clutter. All icons and functional elements have been reworked and colors were harmonized._
* Theme: **Responsive mixer**
_Depending on the dock's height, the new mixer panel shows or hides some control elements._
* Theme: **New "Storyboard" view for audio compositions**
_You can now tag areas in your recording -- such as individual answers in an interview -- and later search and filter, also across different projects. Any number of clip databases can be managed. Colors and comments are available, to keep the overview in complex projects. Finished sections and regions can be grouped. Text elements can be distributed freely in the cutting area, in order to manage show notes or comments._
* Actions [Mac only]: **About Screen**
_A new menu entry_ `Podcast | About Ultraschall…` _displays the installed components and their version numbers._
* HUB [Mac only]: **New audio driver**
_Soundflower is a thing of the past: following the new CoreAudio APIs in OSX 10.11 El Capitan, the audio engine for virtual sound devices has been revised from the ground up._
Older, untranslated changes are listed [in German only](https://github.com/Ultraschall/REAPER/blob/master/CHANGELOG-DE.md), because Ultraschall was not available in English back then. Please let us know if you really want to have the old changes translated.
| 76.061224 | 968 | 0.772233 | eng_Latn | 0.995074 |
c17329ed26202ea7f6163058b8c83ae6345a4be8 | 9,961 | md | Markdown | articles/app-service-web/web-sites-python-ptvs-django-sql.md | OpenLocalizationTestOrg/azure-docs-pr15_et-EE | bc69bd1a9d45d7abd2a3d01806d74e2b0848a808 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/app-service-web/web-sites-python-ptvs-django-sql.md | OpenLocalizationTestOrg/azure-docs-pr15_et-EE | bc69bd1a9d45d7abd2a3d01806d74e2b0848a808 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | articles/app-service-web/web-sites-python-ptvs-django-sql.md | OpenLocalizationTestOrg/azure-docs-pr15_et-EE | bc69bd1a9d45d7abd2a3d01806d74e2b0848a808 | [
"CC-BY-3.0",
"CC-BY-4.0",
"MIT"
] | null | null | null | <properties
    pageTitle="Django and SQL Database on Azure with Python Tools 2.2 for Visual Studio"
    description="Learn how to use Python Tools for Visual Studio to create a Django web app that stores its data in an Azure SQL database instance, and how to deploy it to Azure App Service Web Apps."
services="app-service\web"
tags="python"
documentationCenter="python"
authors="huguesv"
manager="wpickett"
editor=""/>
<tags
ms.service="app-service-web"
ms.workload="web"
ms.tgt_pltfrm="na"
ms.devlang="python"
ms.topic="article"
ms.date="07/07/2016"
ms.author="huguesv"/>
# <a name="django-and-sql-database-on-azure-with-python-tools-22-for-visual-studio"></a>Django and SQL Database on Azure with Python Tools 2.2 for Visual Studio
In this tutorial, we'll use [Python Tools for Visual Studio] to create a simple polls web app from one of the PTVS sample templates. This tutorial is also available as a [video](https://www.youtube.com/watch?v=ZwcoGcIeHF4).
We'll learn how to use a SQL database hosted on Azure, how to configure the web app to use the SQL database, and how to publish the web app to [Azure App Service](http://go.microsoft.com/fwlink/?LinkId=529714) Web Apps.
See the [Python Developer Center][Python Arenduskeskus] for more articles that cover development of Azure App Service web apps with PTVS using the Bottle, Flask and Django web frameworks, with Azure Table Storage, MySQL and SQL Database services. While this article focuses on App Service, the steps are similar when developing for [Azure Cloud Services][Azure'i pilveteenustega].
## <a name="prerequisites"></a>Prerequisites
- Visual Studio 2015
- [Python 2.7 32-bit][Python 2.7 32-bitine versioon]
- [Python Tools 2.2 for Visual Studio][Python tööriistade 2.2 Visual Studio]
- [Python Tools 2.2 for Visual Studio Samples VSIX][Python tööriistade 2.2 Visual Studio näidised VSIX jaoks]
- [Azure SDK Tools for VS 2015][Azure'i SDK tööriistad VS 2015]
- Django 1.9 or later
[AZURE.INCLUDE [create-account-and-websites-note](../../includes/create-account-and-websites-note.md)]
>[AZURE.NOTE] If you want to get started with Azure App Service before signing up for an Azure account, go to [Try App Service](http://go.microsoft.com/fwlink/?LinkId=523751), where you can immediately create a short-lived starter web app in App Service. No credit card required, and no commitment.
## <a name="create-the-project"></a>Create the project
In this section, we'll create a Visual Studio project using a sample template. We'll create a virtual environment and install the required packages, and create a local database using sqlite. Then we'll run the web app locally.
1. In Visual Studio, select **File**, **New Project**.
1. The project templates from the [Python Tools 2.2 for Visual Studio Samples VSIX][Python tööriistad 2.2 for Visual Studio näidised VSIX] are available under **Python**, **Samples**. Select **Polls Django Web Project** and click OK to create the project.

1. You will be prompted to install external packages. Select **Install into a virtual environment**.

1. Select **Python 2.7** as the base interpreter.

1. In **Solution Explorer**, right-click the project node and select **Python**, then select **Django Migrate**. Then select **Django Create Superuser**.
1. This opens a Django management console and creates a SQLite database in the project folder. Follow the prompts to create a user.
1. Confirm that the application works by pressing <kbd>F5</kbd>.
1. Click **Log in** in the navigation bar at the top.

1. Enter the credentials for the user you created when you synced the database.

1. Click **Create Sample Polls**.

1. Click a poll and vote.

## <a name="create-a-sql-database"></a>Create a SQL database
For the database, we'll create an Azure SQL database.
Follow these steps to create the database.
1. Log in to the [Azure portal][Azure'i portaal].
1. Click **New** at the bottom of the navigation pane, then click **Data + Storage** > **SQL Database**.
1. Configure the new SQL database by creating a new resource group, and select an appropriate location for it.
1. When the SQL database is created, click **Open in Visual Studio** on the database blade.
1. Click **Configure your firewall**.
1. In the **Firewall settings** blade, add a firewall rule with the **Start IP** and **End IP** set to the public IP address of your development machine. Click **Save**.
This allows connections to the database server from your development machine.
1. Back on the database blade, click **Properties**, and then click **Show database connection strings**.
1. Copy the **ADO.NET** value to the clipboard by using the copy button.
## <a name="configure-the-project"></a>Configure the project
In this section, we configure our web app to use the SQL database we just created, and install the Python packages that Django needs to talk to SQL Server databases. Then we run the web app locally.
1. In Visual Studio, open **settings.py** from the *ProjectName* folder. Temporarily paste the connection string into the editor. The connection string is in this format:
        Server=<ServerName>,<ServerPort>;Database=<DatabaseName>;User ID=<UserName>;Password={your_password_here};Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
Edit the `DATABASES` definition to use the values above.
        DATABASES = {
            'default': {
                'ENGINE': 'sql_server.pyodbc',
                'NAME': '<DatabaseName>',
                'USER': '<UserName>',
                'PASSWORD': '{your_password_here}',
                'HOST': '<ServerName>',
                'PORT': '<ServerPort>',
                'OPTIONS': {
                    'driver': 'SQL Server Native Client 11.0',
                    'MARS_Connection': 'True',
                }
            }
        }
1. In Solution Explorer, under **Python Environments**, right-click the virtual environment and select **Install Python Package**.
1. Install the `pyodbc` package using **pip**.

1. Install the `django-pyodbc-azure` package using **pip**.

1. In **Solution Explorer**, right-click the project node and select **Python**, then select **Django Migrate**. Then select **Django Create Superuser**.
This creates the tables in the SQL database from the previous section. Follow the prompts to create a user; it does not have to match the user you created for the SQLite database in the first section.
1. Run the application with `F5`. Polls created with **Create Sample Polls**, and the data submitted by voting, are serialized in the SQL database.
## <a name="publish-the-web-app-to-azure-app-service"></a>Publish the web app to Azure App Service
The Azure .NET SDK makes it easy to deploy your web app to Azure App Service Web Apps.
1. In **Solution Explorer**, right-click the project node and select **Publish**.
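Not part of the original tutorial: if you prefer not to copy the connection-string values into `settings.py` by hand, a small helper like the one below can split an ADO.NET-style string into the fields the `DATABASES` setting needs. The helper names and the sample string are illustrative assumptions, not values from this walkthrough.

```python
# Sketch: split an ADO.NET-style connection string into the values used in
# Django's DATABASES setting. Illustrative only; paste real values from the
# Azure portal and never commit passwords to source control.

def parse_ado_connection_string(conn_str):
    """Return a dict of the key=value pairs in an ADO.NET connection string."""
    parts = {}
    for chunk in conn_str.strip().split(";"):
        if not chunk:
            continue
        key, _, value = chunk.partition("=")
        parts[key.strip()] = value.strip()
    return parts

def to_django_database(parts):
    """Map the parsed pairs onto the fields of the DATABASES['default'] entry."""
    server, _, port = parts["Server"].partition(",")
    return {
        "ENGINE": "sql_server.pyodbc",
        "NAME": parts["Database"],
        "USER": parts["User ID"],
        "PASSWORD": parts["Password"],
        "HOST": server,
        "PORT": port or "1433",
        "OPTIONS": {
            "driver": "SQL Server Native Client 11.0",
            "MARS_Connection": "True",
        },
    }

if __name__ == "__main__":
    sample = ("Server=myserver.database.windows.net,1433;Database=pollsdb;"
              "User ID=polls_admin;Password=secret;Encrypt=True;")
    print(to_django_database(parse_ado_connection_string(sample))["HOST"])
```

This only restructures the string; the resulting dict still has to be assigned to `DATABASES['default']` in `settings.py` yourself.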

1. Click **Microsoft Azure Web Apps**.
1. Click **New** to create a new web app.
1. Fill in the following fields and click **Create**.
- **Web App name**
- **App Service plan**
- **Resource group**
- **Region**
- Leave **Database server** set to **No database**
1. Accept all the other defaults and click **Publish**.
1. Your web browser opens automatically to the published web app. You should see the web app working as expected, using the **SQL** database hosted on Azure.
Congratulations!

## <a name="next-steps"></a>Next steps
Follow these links to learn more about Python Tools for Visual Studio, Django, and SQL Database.
- [Python Tools for Visual Studio Documentation][Python Tools for Visual Studio dokumentatsioon]
- [Web Projects][Veebi projektid]
- [Cloud Service Projects][Pilveteenuse teenuse projektid]
- [Remote Debugging on Microsoft Azure][Microsoft Azure Remote silumine]
- [Django Documentation][Django dokumentatsioon]
- [SQL Database][SQL-andmebaas]
## <a name="whats-changed"></a>What's changed
* For a guide to the change from Websites to App Service, see: [Azure App Service and its impact on existing Azure services](http://go.microsoft.com/fwlink/?LinkId=529714)
<!--Link references-->
[Python Arenduskeskus]: /develop/python/
[Azure pilveteenused]: ../cloud-services-python-ptvs.md
<!--External Link references-->
[Azure'i portaal]: https://portal.azure.com
[Python Tools for Visual Studio]: http://aka.ms/ptvs
[Python tööriistade 2.2 Visual Studio]: http://go.microsoft.com/fwlink/?LinkID=624025
[Python tööriistade 2.2 Visual Studio näidised VSIX jaoks]: http://go.microsoft.com/fwlink/?LinkID=624025
[Azure'i SDK tööriistad VS 2015]: http://go.microsoft.com/fwlink/?LinkId=518003
[Python 2.7 32-bitine versioon]: http://go.microsoft.com/fwlink/?LinkId=517190
[Python Tools for Visual Studio dokumentatsioon]: http://aka.ms/ptvsdocs
[Microsoft Azure Remote silumine]: http://go.microsoft.com/fwlink/?LinkId=624026
[Veebi projektid]: http://go.microsoft.com/fwlink/?LinkId=624027
[Pilveteenuse teenuse projektid]: http://go.microsoft.com/fwlink/?LinkId=624028
[Django dokumentatsioon]: https://www.djangoproject.com/
[SQL-andmebaas]: /documentation/services/sql-database/
| 48.120773 | 332 | 0.749624 | est_Latn | 0.993344 |
c17395773259547d671c0ae4b4ef7353e8c0246d | 790 | md | Markdown | README.md | stochastic-geometry/LoadDistributionPCP | 174a6ba2b465263e60911b8c46e2781b1f012f1b | [
"MIT"
] | 4 | 2020-09-09T02:11:35.000Z | 2022-03-23T02:34:59.000Z | README.md | stochastic-geometry/LoadDistributionPCP | 174a6ba2b465263e60911b8c46e2781b1f012f1b | [
"MIT"
] | 1 | 2021-07-16T11:29:17.000Z | 2021-07-16T11:32:14.000Z | README.md | stochastic-geometry/LoadDistributionPCP | 174a6ba2b465263e60911b8c46e2781b1f012f1b | [
"MIT"
] | null | null | null | # Matlab code for the computation of the PMF of the number of points of a PCP in a typical cell of a stationary PPP
---
Author: Chiranjib Saha and Harpreet S. Dhillon
Email: csaha@vt.edu, hdhillon@vt.edu
This repository contains the Matlab scripts related to the paper [1].
Please cite [1] for reusing this code.
---
Reference
[1] C. Saha and H. S. Dhillon, "Load on the Typical Poisson Voronoi Cell with Clustered User Distribution", in IEEE Wireless Communications Letters, *to appear*. Available online: https://arxiv.org/abs/2004.10053.
BibTeX:
```
@article{saha2020load,
title={Load on the Typical Poisson Voronoi Cell with Clustered User Distribution},
author={Saha, Chiranjib and Dhillon, Harpreet S},
note={Available online: https://arxiv.org/abs/2004.10053},
year={2020}
}
```
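The repository itself ships Matlab scripts; as a rough illustration of the quantity being computed, here is a Python Monte Carlo sketch (my own addition, not part of the repo) that estimates the PMF of the number of users served in the typical cell of a stationary PPP of base stations. It uses a plain PPP of users rather than the clustered (PCP) user distribution treated in the paper, so it only shows the general setup, not the paper's result.

```python
import numpy as np

def typical_cell_load_pmf(lam_b=50.0, lam_u=200.0, n_iter=2000, seed=0):
    """Estimate the PMF of the number of users served by the typical BS.

    Base stations and users are independent homogeneous PPPs on the unit
    square; each user attaches to its nearest BS, and the "typical" cell is
    the cell of a uniformly chosen BS (Slivnyak).  Returns (support, pmf).
    """
    rng = np.random.default_rng(seed)
    loads = []
    for _ in range(n_iter):
        n_b = rng.poisson(lam_b)
        if n_b == 0:
            continue
        bs = rng.random((n_b, 2))
        users = rng.random((rng.poisson(lam_u), 2))
        if len(users):
            # index of the nearest BS for every user
            d2 = ((users[:, None, :] - bs[None, :, :]) ** 2).sum(-1)
            nearest = d2.argmin(axis=1)
        else:
            nearest = np.empty(0, dtype=int)
        tagged = rng.integers(n_b)  # uniformly chosen "typical" BS
        loads.append(int((nearest == tagged).sum()))
    loads = np.asarray(loads)
    pmf = np.bincount(loads) / len(loads)
    support = np.arange(len(pmf))
    return support, pmf

if __name__ == "__main__":
    support, pmf = typical_cell_load_pmf()
    # For PPP users, the mean load of the typical cell is lam_u / lam_b.
    print(round(float((support * pmf).sum()), 2))
```

With PPP users the empirical mean load should hover near `lam_u / lam_b`; the paper's contribution is the exact PMF when users are clustered instead.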
| 31.6 | 214 | 0.74557 | eng_Latn | 0.924054 |
c1739d5fd202aa4bbbd396de6f7ad59f150f05e4 | 1,328 | md | Markdown | AlchemyInsights/restore-deleted-user.md | pebaum/OfficeDocs-AlchemyInsights-pr.id-ID | 1a5dd543599dbffc8d5b07285aa9705fa562a74d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | AlchemyInsights/restore-deleted-user.md | pebaum/OfficeDocs-AlchemyInsights-pr.id-ID | 1a5dd543599dbffc8d5b07285aa9705fa562a74d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | AlchemyInsights/restore-deleted-user.md | pebaum/OfficeDocs-AlchemyInsights-pr.id-ID | 1a5dd543599dbffc8d5b07285aa9705fa562a74d | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Restore a deleted user
ms.author: pebaum
author: pebaum
manager: mnirkhe
ms.date: 04/21/2020
ms.audience: Admin
ms.topic: article
ROBOTS: NOINDEX, NOFOLLOW
localization_priority: Normal
ms.collection: Adm_O365
ms.custom:
- "73"
- "1200013"
ms.assetid: dae7b5b0-1003-40bd-b59f-8c5009fc8d82
ms.openlocfilehash: cabb1463fd27cc26f2482210d50eb38823e8a60a
ms.sourcegitcommit: bc7d6f4f3c9f7060d073f5130e1ec856e248d020
ms.translationtype: MT
ms.contentlocale: id-ID
ms.lasthandoff: 06/02/2020
ms.locfileid: "44511259"
---
# <a name="restore-a-user"></a>Restore a user
Restore a user by following these steps:
1. Go to [Users \> Deleted users](https://admin.microsoft.com/adminportal/home#/deletedusers).
2. Select the user, and then select **Restore**.
3. Follow the prompts to set their password.
4. Click **Send email and close**, and you're done!
Wasn't that easy? For more details and steps with screenshots, read this article: [Restore a user](https://docs.microsoft.com/microsoft-365/admin/add-users/restore-user). If you realize that you need to restore the user's mailbox, see: [Delete or restore user mailboxes in Exchange Online](https://docs.microsoft.com/exchange/recipients-in-exchange-online/delete-or-restore-mailboxes).
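Admins who script this instead of using the admin center can restore a soft-deleted user through Microsoft Graph's `POST /directory/deletedItems/{id}/restore` call. The sketch below (not part of this article) only builds the request; the object id and token are placeholders.

```python
# Sketch: build the Microsoft Graph request that restores a soft-deleted user.
# The object id and bearer token are placeholders; nothing is sent here.

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_restore_request(deleted_user_id, access_token):
    """Return (method, url, headers) for the deletedItems restore call."""
    url = f"{GRAPH_BASE}/directory/deletedItems/{deleted_user_id}/restore"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    }
    return "POST", url, headers

if __name__ == "__main__":
    method, url, headers = build_restore_request(
        "00000000-0000-0000-0000-000000000000",  # placeholder object id
        "<token>",  # placeholder bearer token
    )
    print(method, url)
```

Sending the request (for example with `requests.post`) requires an app or delegated token with the appropriate directory permissions.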
c1741e35e43a2ec53023087d54d43dcd25be516c | 284 | md | Markdown | src/pages/blog/night-fishing-bundle.md | alexanderjacks/dogged | 6c29822cd7dd0e92d2b297ec37d43d5ae54d3d19 | [
"MIT"
] | null | null | null | src/pages/blog/night-fishing-bundle.md | alexanderjacks/dogged | 6c29822cd7dd0e92d2b297ec37d43d5ae54d3d19 | [
"MIT"
] | null | null | null | src/pages/blog/night-fishing-bundle.md | alexanderjacks/dogged | 6c29822cd7dd0e92d2b297ec37d43d5ae54d3d19 | [
"MIT"
] | null | null | null | ---
templateKey: blog-post
featuredpost: false
date: 2020-02-26T00:00:02.711Z
featuredimage: /img/Night_Fishing_Bundle.png
title: Night Fishing Bundle
description: Fish Tank
count: 3 out of 3
reward: Small Glow Ring (1)
tags:
- Walleye
- Bream
- Eel
- bundle
- Fish Tank
--- | 17.75 | 44 | 0.721831 | eng_Latn | 0.356833 |
c1745618d2698c367fbd07fd9b71b925a7b3e84f | 285 | md | Markdown | EN/N3/4/1/1/3.md | Sandrox1099/bte-n.github.io | 08f443f390c27e8fc8219d06785ba6643e5dbe83 | [
"MIT"
] | 1 | 2020-07-11T05:49:18.000Z | 2020-07-11T05:49:18.000Z | EN/N3/4/1/1/3.md | Sandrox1099/bte-n.github.io | 08f443f390c27e8fc8219d06785ba6643e5dbe83 | [
"MIT"
] | null | null | null | EN/N3/4/1/1/3.md | Sandrox1099/bte-n.github.io | 08f443f390c27e8fc8219d06785ba6643e5dbe83 | [
"MIT"
] | 7 | 2020-05-15T09:52:44.000Z | 2020-11-09T15:16:21.000Z | # N3.4.1.1.3: Airbus A319 with Sharklets
Download the `.schematic`-file here: **[Download](https://bte-n.github.io/resources/N3/4/1/A319S.schematic)**

# Authors
Blank plane: `Airplaneguy9#8110 (497014954935713802)`
| 28.5 | 109 | 0.705263 | yue_Hant | 0.325571 |
c17509c16d3f687cdd2a4ae208f1be93a5943364 | 2,314 | md | Markdown | _posts/programming/programmers/2020-11-02-[Algo]-프로그래머스-등굣길.md | ShinJongPark/-ShinJongPark.github.io | e5062dbfb9c68aab3b6f3e19aaf3a2f72cc5f734 | [
"MIT"
] | 1 | 2020-12-08T12:02:45.000Z | 2020-12-08T12:02:45.000Z | _posts/programming/programmers/2020-11-02-[Algo]-프로그래머스-등굣길.md | ShinJongPark/-ShinJongPark.github.io | e5062dbfb9c68aab3b6f3e19aaf3a2f72cc5f734 | [
"MIT"
] | null | null | null | _posts/programming/programmers/2020-11-02-[Algo]-프로그래머스-등굣길.md | ShinJongPark/-ShinJongPark.github.io | e5062dbfb9c68aab3b6f3e19aaf3a2f72cc5f734 | [
"MIT"
] | 3 | 2019-11-25T10:51:26.000Z | 2020-12-03T09:06:42.000Z | ---
title: "[프로그래머스] Level3 - 등굣길"
tags: "Programming"
author : ""
article_header:
type: overlay
theme: dark
background_color: '#123'
background_image:
gradient: 'linear-gradient(135deg, rgba(34, 139, 87 , .4), rgba(139, 34, 139, .4))'
src: /assets/images/logo/background_algorithm.jpg
---
###### <br/>
### > 문제설명
계속되는 폭우로 일부 지역이 물에 잠겼습니다. 물에 잠기지 않은 지역을 통해 학교를 가려고 합니다. 집에서 학교까지 가는 길은 m x n 크기의 격자모양으로 나타낼 수 있습니다.
아래 그림은 m = 4, n = 3 인 경우입니다.
<Br>

<br>
가장 왼쪽 위, 즉 집이 있는 곳의 좌표는 (1, 1)로 나타내고 가장 오른쪽 아래, 즉 학교가 있는 곳의 좌표는 (m, n)으로 나타냅니다.
격자의 크기 m, n과 물이 잠긴 지역의 좌표를 담은 2차원 배열 puddles이 매개변수로 주어집니다. **오른쪽과 아래쪽으로만 움직여** 집에서 학교까지 갈 수 있는 최단경로의 개수를 1,000,000,007로 나눈 나머지를 return 하도록 solution 함수를 작성해주세요.
##### 제한사항
- 격자의 크기 m, n은 1 이상 100 이하인 자연수입니다.
- m과 n이 모두 1인 경우는 입력으로 주어지지 않습니다.
- 물에 잠긴 지역은 0개 이상 10개 이하입니다.
- 집과 학교가 물에 잠긴 경우는 입력으로 주어지지 않습니다.
<br>
<br>
### > 문제풀이
>문제 조건에서 움직일 수 있는 방향은 (우,하), 물에 잠긴 지역 이동 불가.
>
>움직일 수 있는 방향은 우,하 뿐이고, 인접한 지역간의 거리는 1이기 때문에,
>
>해당 지역에서 좌, 상의 지역값을 더해주었다.
>
>어렵지 않은 문제였다.
{%raw%}
```java
package level3;
public class Solution_등굣길 {
public static void main(String[] args) {
System.out.println(solution(4,3,new int[][]{{2,2}}));
}
static public int solution(int m, int n, int[][] puddles) {
int[][] map = new int[m][n];
int x,y;
for(int i=0; i<puddles.length; i++){
x = puddles[i][0]-1;
y = puddles[i][1]-1;
map[x][y] = -1;
}
map[0][0] = 1;
for(int i=0; i<map.length; i++){
for(int j=0; j<map[i].length; j++){
if(map[i][j] == -1) continue;
if(i==0 && j==0) continue;
int a = check(i-1, j, map) && map[i-1][j] != -1 ? map[i-1][j] : 0;
int b = check(i, j-1, map) && map[i][j-1] != -1 ? map[i][j-1] : 0;
map[i][j] = (a+b)%1000000007;
}
}
return map[m-1][n-1];
}
static boolean check(int x, int y, int[][] map){
if(x>=0 && y>=0 && x<map.length && y<map[x].length){
return true;
}else return false;
}
}
```
{%endraw%}
<br/>
<br/>
<br/>
문제 출처 : Programmers
<br/>
<br/>
<br/> | 20.660714 | 159 | 0.539326 | kor_Hang | 0.999946 |
c17513858f0c6cbc1d95f32a080a437d1a11477f | 34 | md | Markdown | README.md | ManuelDeLeon/seechi | d22b4566fdaa39144693730b1154a393c2da4b37 | [
"MIT"
] | null | null | null | README.md | ManuelDeLeon/seechi | d22b4566fdaa39144693730b1154a393c2da4b37 | [
"MIT"
] | null | null | null | README.md | ManuelDeLeon/seechi | d22b4566fdaa39144693730b1154a393c2da4b37 | [
"MIT"
] | null | null | null | # Seechi
## Index all the things
| 8.5 | 23 | 0.676471 | eng_Latn | 0.988386 |
c17531a9c8568da1db476c814ad00e46498e0883 | 185 | md | Markdown | README.md | rbkmoney/three-ds-server-domain-lib | 78ef5e86b5e5c0f22f92f6d1c286fd6b7db1b2b6 | [
"Apache-2.0"
] | null | null | null | README.md | rbkmoney/three-ds-server-domain-lib | 78ef5e86b5e5c0f22f92f6d1c286fd6b7db1b2b6 | [
"Apache-2.0"
] | 1 | 2022-03-31T21:01:34.000Z | 2022-03-31T21:01:34.000Z | README.md | rbkmoney/three-ds-server-domain-lib | 78ef5e86b5e5c0f22f92f6d1c286fd6b7db1b2b6 | [
"Apache-2.0"
] | 1 | 2021-12-07T09:39:19.000Z | 2021-12-07T09:39:19.000Z | # three-ds-server-domain-lib
Описание моделей домена макросервиса 3DSS
[3DSS Docker Compose description](https://github.com/rbkmoney/three-ds-server-compose#three-ds-server-compose)
| 30.833333 | 110 | 0.805405 | yue_Hant | 0.340895 |
c1755ee69f4284fea4add683a97c695aeea355f0 | 9,856 | md | Markdown | articles/commerce/customer-mgmt-stores.md | MicrosoftDocs/Dynamics-365-Operations.sv-se | 90cb258993f991b2ce1b67078a6519e342608a4e | [
"CC-BY-4.0",
"MIT"
] | 3 | 2020-05-18T17:14:56.000Z | 2022-03-02T03:46:34.000Z | articles/commerce/customer-mgmt-stores.md | MicrosoftDocs/Dynamics-365-Operations.sv-se | 90cb258993f991b2ce1b67078a6519e342608a4e | [
"CC-BY-4.0",
"MIT"
] | 7 | 2017-12-13T12:57:02.000Z | 2019-04-30T11:46:04.000Z | articles/commerce/customer-mgmt-stores.md | MicrosoftDocs/Dynamics-365-Operations.sv-se | 90cb258993f991b2ce1b67078a6519e342608a4e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Kundhantering i butiker
description: Det här avsnittet förklarar hur återförsäljare kan aktivera kundhanteringsfunktioner vid kassan (POS) i Microsoft Dynamics 365 Commerce.
author: josaw1
ms.date: 09/01/2021
ms.topic: article
ms.prod: ''
ms.technology: ''
ms.search.form: RetailOperations
audience: Application User, IT Pro
ms.reviewer: v-chgri
ms.search.region: Global
ms.search.industry: retail
ms.author: shajain
ms.search.validFrom: 2021-01-31
ms.dyn365.ops.version: 10.0.14
ms.openlocfilehash: 4fd6039843be09ec706e45746d5724faa99a95e6
ms.sourcegitcommit: 3f59b15ba7b4c3050f95f2b32f5ae6d7b96e1392
ms.translationtype: HT
ms.contentlocale: sv-SE
ms.lasthandoff: 09/29/2021
ms.locfileid: "7563071"
---
# <a name="customer-management-in-stores"></a>Kundhantering i butiker
[!include [banner](includes/banner.md)]
[!include [banner](includes/preview-banner.md)]
Det här avsnittet förklarar hur återförsäljare kan aktivera kundhanteringsfunktioner vid kassan (POS) i Microsoft Dynamics 365 Commerce.
Det är viktigt att butikspersonalen kan skapa och redigera kundposter i kassan. På så sätt kan de samla in uppdateringar av kundinformationen som t.ex. e-postadress, telefonnummer och adress. Den här informationen är användbar i andra system, till exempel marknadsföring, eftersom effektiviteten hos dessa system beror på hur korrekta kunddata är.
Säljare kan utlösa arbetsflödet för kundskapande från två kassastartpunkter. De kan välja en knapp som är mappad till operationen **Lägga till kund** och de kan välja **Skapa kund** i appfältet på sökresultatsidan. I båda fall visas dialogrutan **Ny kund** där säljföretag kan ange nödvändiga kunduppgifter, såsom kundens namn, e-postadress, telefonnummer, adress och all ytterligare information som är relevant för verksamheten. Att ytterligare information kan fångas in med hjälp av de kundattribut som är associerade med kunden. Mer information om kundattribut finns i [Kundattribut](dev-itpro/customer-attributes.md).
Säljare kan också samla in sekundära e-postadresser och telefonnummer. De kan också fånga in kundens önskemål om att ta emot marknadsföringsinformation via någon av de sekundära e-postadresserna eller telefonnummer. För att aktivera denna funktion måste återförsäljare aktivera funktionen **Samla in flera e-postmeddelanden och telefonnummer och godkännande för marknadsföring på dessa kontakter** i arbetsytan **Funktionshantering** i Commerce-administration (**Systemadministration \> Arbetsytor \> Funktionshantering**).
## <a name="default-customer-properties"></a>Standardkundegenskaper
Detaljhandlare kan använda sidan **Alla butiker** i Commerce-administration (**Retail och Commerce \> Kanaler \> Butiker**) för att associera en standardkund till varje butik. I Commerce kopieras sedan de egenskaper som har definierats för standardkunden till alla nya kundposter som skapas. Till exempel visar dialogrutan **Skapa kund** egenskaper som ärvs från standardkunden som är associerad med butiken. Dessa egenskaper omfattar **kundtyp**, **kundgrupp**, **kvittoalternativ**, **kvittomeddelande**, **valuta** och **språk**. Eventuella **anknytningar** (gruppering av kunder) ärvs också från standardkunden. **Ekonomiska dimensioner** ärvs dock från den kundgrupp som associeras med standardkunden, inte från standardkunden.
> [!NOTE]
> Värdet via e-postkvittot kopieras från standardkunden bara om **kvitto-e-post-ID** inte anges för de nya kunderna. Detta innebär att om kvitto-e-post-ID finns på standardkunden får alla kunder som skapats från e-handelsplatsen samma kvitto-e-post-ID eftersom det inte finns något användargränssnitt för att samla in kvitto-e-post-ID från kunden. Vi rekommenderar att du håller fältet **e-postkvitto** tomt för standardkunden i butiken och bara använder det om du har en affärsprocess som beror på att det finns en e-postadress med kvitto.
Säljare kan fånga in flera adresser för en kund. Kundens namn och telefonnummer ärvs från den kontaktinformation som är kopplad till varje adress. Snabbflikarna **Adresser** för en kundpost innehåller ett fält för **Syfte** som säljarna kan redigera. Om kundtypen är **Person** är standardvärdet **Start**. Om kundtypen är **Organisation** är standardvärdet **Företag**. Andra värden som det här fältet stöder **Start**, **Kontor** och **Brevlåda**. Värdet i fältet **Land** för en adress ärvs från den primära adressen som anges på sidan **Driftenhet** i Commerce-administration i **Organisationsadministration \> Organisationer \> Driftenheter**.
## <a name="sync-customers-and-async-customers"></a>Synkrona och asynkrona kunder
> [!IMPORTANT]
> När kassan är offline växlar systemet automatiskt till läget för asynkron kundgenerering, även om läget för asynkron kundgenerering är inaktiverat. Oavsett om du väljer synkron eller asynkron kundgenerering måste Commerce headquarters-administratörer därför skapa och schemalägga ett återkommande batchjobb för **P-jobbet**, jobbet **Synkronisera kunder och affärspartner från asynkront läge** (kallades tidigare jobbet **Synkronisera kunder och affärspartner från asynkront läge**) och **1010**-jobbet, så att alla asynkrona kunder konverteras till synkrona kunder i Commerce headquarters.
I Commerce finns det två sätt att skapa kunder: Synkron (eller Synk) och Asynkron (eller Asynk). Kunder skapas som standard synkront. Med andra ord skapas de i Commerce-administration i realtid. Synkroniseringsläget för kunder är fördelaktigt eftersom nya kunder kan sökas omedelbart över kanaler. Den har emellertid även ett problem. Eftersom det genererar [Commerce Data Exchange: realtidstjänst](dev-itpro/define-retail-channel-communications-cdx.md#realtime-service) anropar till Commerce-administration kan prestanda påverkas om många samtidiga kundsamtal görs.
Om alternativet **Skapa kund i asynkront läge** anges till **Ja** i butikens funktionsprofil (**Retail och Commerce \> Kanalinställning \> Inställning av online-butik \> Funktionsprofiler**), realtidssamtal används inte för att skapa kundposter i kanaldatabasen. Asynkront kundgenereringsläge påverkar inte prestandan i Commerce-administration. En tillfällig, global unik identifierare (GUID) tilldelas varje ny asynkron kundpost och används som kundkonto-ID. Denna GUID inte visas för kassaanvändare. I stället visas **väntande synkronisering** som kundkonto-ID.
### <a name="convert-async-customers-to-sync-customers"></a>Konvertera asynkrona kunder till synkrona kunder
Om du vill konvertera asynkrona kunder till synkrona kunder måste du först köra **P-jobbet** för att skicka asynkrona kunderna till Commerce headquarters. Kör sedan jobbet **Synkronisera kunder och affärspartner från asynkront läge** (kallades tidigare **Synkronisera kunder och affärspartner från asynkront läge**) för att skapa kundkonto-ID:n. Kör slutligen **1010** jobbet om du vill synkronisera de nya kundkonto-ID till kanalerna.
### <a name="async-customer-limitations"></a>Asynkrona kundbegränsningar
Asynkrona kundfunktionen har för närvarande följande begränsningar:
- Asynkrona kundposter kan inte redigeras om inte kunden har skapats i Commerce-administration och det nya kundkonto-ID:t har synkroniserats tillbaka till kanalen. Därför kan adressen inte sparas för en asynkron kund förrän den kunden har synkroniserats med Commerce Headquarters eftersom tillägget av en kundadress är internt implementerat som en redigeringsåtgärd i kundprofilen. Men om funktionen **Aktivera asynkron generering av kundadresser** har aktiverats kan kundadresser sparas även för asynkrona kunder.
- Anknytningar kan inte associeras med asynkrona kunder. Därför ärver inte nya asynkrona kunder anknytningar från standardkunden.
- Förmånskort kan inte utfärdas till asynkrona kunder såvida inte det nya kundkonto-ID:t har synkroniserats tillbaka till kanalen.
- Sekundära e-postadresser och telefonnummer kan inte fångas in för asynkrona kunder.
Vissa av de tidigare nämnda begränsningarna kan få dig att välja alternativet Synkron kund för ditt företag, men Commerce-teamet arbetar för att göra funktionerna för asynkron kund mer lika funktionerna för synkron kund. Exempelvis från och med Commerce version 10.0.22 kommer nya funktionen **Aktivera asynkron generering av kundadresser** som du kan aktivera i arbetsytan **Funktionshantering** asynkront och som sparar nyligen skapade kundadresser för både synkrona och asynkrona kunder. För att spara dessa adresser i kundprofilen i Commerce headquarters måste du schemalägga ett återkommande batchjobb för **P-jobbet**, jobbet **Synkronisera kunder och affärspartners från asynkront läge** och jobbet **1010** så att eventuella asynkrona kunder konverteras till synkrona kunder i Commerce headquarters.
### <a name="customer-creation-in-pos-offline-mode"></a>Kundskapa i kassa offlineläge
När kassan är offline växlar systemet automatiskt till läget för asynkron kundgenerering, även om läget för asynkron kundgenerering är inaktiverat. Därför, som nämndes tidigare, måste Commerce headquarters-administratörer skapa och schemalägga ett återkommande batchjobb för **P-jobbet**, jobbet **Synkronisera kunder och affärspartners från asynkront läge** och **1010** så att eventuella asynkrona kunder konverteras till synkrona kunder i Commerce headquarters.
> [!NOTE]
> Om alternativet **Filtrera delade kunddatatabeller** anger **Ja** på sidan **Commerce kanalschema** (**Retail och Commerce \> administration inställning \> Commerce schemaläggare \> Kanaldatabasgrupp**), kundposter skapas inte i kassa offline-läge. Mer information finns i [Offline uteslutande av data](dev-itpro/implementation-considerations-cdx.md#offline-data-exclusion).
## <a name="additional-resources"></a>Ytterligare resurser
[Kundattribut](dev-itpro/customer-attributes.md)
[Offline uteslutande av data](dev-itpro/implementation-considerations-cdx.md#offline-data-exclusion)
| 120.195122 | 807 | 0.81108 | swe_Latn | 0.999746 |
c175a71cc8ecf6bf2ca31fefc62d1c878d5762ef | 216 | md | Markdown | README.md | stm718/web-controller | 18dffe4fb5f8affdc5f7c77a08fc8df5778caf79 | [
"MIT"
] | null | null | null | README.md | stm718/web-controller | 18dffe4fb5f8affdc5f7c77a08fc8df5778caf79 | [
"MIT"
] | null | null | null | README.md | stm718/web-controller | 18dffe4fb5f8affdc5f7c77a08fc8df5778caf79 | [
"MIT"
] | null | null | null | # Web Controller
A controller of machines through web browsers
## Architecture
<img width="1024" src="img/arch.svg">
## Demo
### Dashboard

### HVAC control

| 12 | 45 | 0.680556 | kor_Hang | 0.30434 |
c175e9e6b11911dbf50c6f35e2c8f271c74d2c56 | 479 | md | Markdown | README.md | 2877025939/PlanPlaceholder | 65d040cf99725cea1a2ded1be1db2a4a79f2801a | [
"Apache-2.0"
] | 18 | 2017-11-16T03:48:03.000Z | 2019-07-04T23:31:50.000Z | README.md | 2877025939/PlanPlaceholder | 65d040cf99725cea1a2ded1be1db2a4a79f2801a | [
"Apache-2.0"
] | null | null | null | README.md | 2877025939/PlanPlaceholder | 65d040cf99725cea1a2ded1be1db2a4a79f2801a | [
"Apache-2.0"
] | 25 | 2018-02-13T06:02:02.000Z | 2018-03-19T07:15:08.000Z | # PlanPlaceholder
这里是tableView空数据占位图,网络异常占位图
=====================================
#点击右上角的 star、watch 按钮,可以收藏本仓库,看到本仓库的更新!
1.基于DZNEmptyDataSet,MJrefresh开发的,封装成一个BaseTableView,使用起来非常方便
2.这里我把DZNEmptyDataSet归类整理了一下,看起来更加清晰。具体可点击 [DZNEmptyDataSet](https://github.com/dzenbot/DZNEmptyDataSet),查看源码

在iOS开发的过程中如有遇到问题,欢迎联系我进行探讨交流.
邮箱:peiliancoding@gmail.com
| 21.772727 | 111 | 0.7119 | yue_Hant | 0.699 |
c176ae8e666797591e373d4bde2f7876570999e1 | 4,281 | md | Markdown | README.md | Halakhamayseh/Reading-notes | d06c636a72caf0a23b6317b658d4ee6545e93930 | [
"MIT"
] | null | null | null | README.md | Halakhamayseh/Reading-notes | d06c636a72caf0a23b6317b658d4ee6545e93930 | [
"MIT"
] | null | null | null | README.md | Halakhamayseh/Reading-notes | d06c636a72caf0a23b6317b658d4ee6545e93930 | [
"MIT"
] | 3 | 2021-10-05T16:15:12.000Z | 2021-11-22T11:49:20.000Z | # Reading-Notes-Content
1. About
2. level 1
* (code 101) (web site & web app,frontend&Backend)
3. level 2
* Code-102-Daily-Reading
* Code-201-Daily-Reading
* Code-301-Daily-Reading
* Code 401 -Daily-Reading
### About
*This repo was created for Software development course levels reading.*
### **Code-102-Daily-Reading(Intro to Software Development)**
|Topic title |Topic link|
|------------|----------|
|The growth mindset|[link](https://halakhamayseh.github.io/Reading-notes/)|
|Computer Architecture|[link](https://halakhamayseh.github.io/Reading-notes/Computer%20Architecture)|
|Git|[link](https://halakhamayseh.github.io/Reading-notes/Git)|
|Learning Markdown|[link](https://halakhamayseh.github.io/Reading-notes/Learning%20Markdown)|
|Markdown|[link](https://halakhamayseh.github.io/new-reading-notes/)|
|Dynamic web pages with JavaScript|[link](https://halakhamayseh.github.io/new-reading-notes/class6)|
|loop and operators|[link](https://halakhamayseh.github.io/new-reading-notes/loop%20and%20oper)|
|javascript2|[link](https://halakhamayseh.github.io/new-reading-notes/javascript2)|
### **Code-201-Daily-Reading (Foundations of Software Development)**
|Topic title |Topic link|
|------------|----------|
|HTML&js introduction|[link](https://halakhamayseh.github.io/201code_reading-notes/class01)|
|HTML&js&css introduction2|[link](https://halakhamayseh.github.io/201code_reading-notes/class02)|
|HTML list&CSS(Border, Margin& Padding)&JS(ARRAYS,LOOP)|[link](https://halakhamayseh.github.io/201code_reading-notes/class03)|
|HTML(link),CSS(layout),JS(object)|[link](https://halakhamayseh.github.io/201code_reading-notes/class04)|
|Images &color|[link](https://halakhamayseh.github.io/201code_reading-notes/class05)|
|Doom&object|[link](https://halakhamayseh.github.io/201code_reading-notes/class06)|
|Table&Fuctions and Method|[link](https://halakhamayseh.github.io/201code_reading-notes/class07)|
|CSS layout &float|[link](https://halakhamayseh.github.io/201code_reading-notes/class08)|
|Form|[link](https://halakhamayseh.github.io/201code_reading-notes/class09)|
|Error Handling & Debugging|[link](https://halakhamayseh.github.io/201code_reading-notes/class10)|
|Images CSS|[link](https://halakhamayseh.github.io/201code_reading-notes/class11)|
|CHART JS|[link](https://halakhamayseh.github.io/201code_reading-notes/class12)|
|LOCAL STORAGE|[link](https://halakhamayseh.github.io/201code_reading-notes/class13)|
|Transforms&Transitions & Animations|[link](https://halakhamayseh.github.io/201code_reading-notes/class14a)|
|The google article|[link](https://halakhamayseh.github.io/201code_reading-notes/class14b)|
### **Code-301-Daily-Reading( Intermediate Software Development)**
|Topic title |Topic link|
|------------|----------|
|SMACSS and Responsive Web Design|[link](https://halakhamayseh.github.io/reading-notes/class01(301))|
|jQuery, Events, and The DOM|[link](https://halakhamayseh.github.io/reading-notes/class02(301))|
|Flexbox and Templating|[link](https://halakhamayseh.github.io/reading-notes/class03(301))|
|Responsive Web Design and Regular Expressions|[link](https://halakhamayseh.github.io/reading-notes/class04(301))|
|Heroku Deployment|[link](https://halakhamayseh.github.io/reading-notes/class05(301))|
|Node, Express, and APIs|[link](https://halakhamayseh.github.io/reading-notes/class06(301))|
|APIs continued|[link](https://halakhamayseh.github.io/reading-notes/class07(301))|
|SQL|[link](https://halakhamayseh.github.io/reading-notes/class08(301))|
|Refactoring|[link](https://halakhamayseh.github.io/reading-notes/class09(301))|
|The Call Stack and Debugging|[link](https://halakhamayseh.github.io/reading-notes/class10(301))|
|EJS|[link](https://halakhamayseh.github.io/reading-notes/class11(301))|
|Components|[link](https://halakhamayseh.github.io/reading-notes/class12(301))|
|Update/Delete|[link](https://halakhamayseh.github.io/reading-notes/class13(301))|
|DB Normalization|[link](https://halakhamayseh.github.io/reading-notes/class14a(301))|
|Diversity and Inclusion|[link](https://halakhamayseh.github.io/reading-notes/class15(301))|
### **Code 401 - Advanced Software Development**
|Topic title |Topic link|
|------------|----------|
|Java Basics|[link](https://halakhamayseh.github.io/reading-notes/401class01)|
| 65.861538 | 126 | 0.754964 | zul_Latn | 0.299361 |
c1774c52c5115ec01e35b6cd3777fcd1f08fd2c0 | 247 | md | Markdown | _portfolio/blocjamsangular.md | courtneym19/portfolio-iro | 7a90958d63df035caa85e272b22c129b6f80757d | [
"MIT"
] | null | null | null | _portfolio/blocjamsangular.md | courtneym19/portfolio-iro | 7a90958d63df035caa85e272b22c129b6f80757d | [
"MIT"
] | null | null | null | _portfolio/blocjamsangular.md | courtneym19/portfolio-iro | 7a90958d63df035caa85e272b22c129b6f80757d | [
"MIT"
] | null | null | null | ---
layout: post
title: BlocJams - Angular rework
feature-img: "img/sample_feature_img.png"
thumbnail-path: "https://d13yacurqjgara.cloudfront.net/users/3217/screenshots/2030966/blocjams_1x.png"
short-description: BlocJams for iOS is awesome!
--- | 30.875 | 102 | 0.789474 | eng_Latn | 0.437347 |
c178139199b8a9d0b3bc919fbb1c9744599150db | 41 | md | Markdown | E2E_Provision_1485453941051/index.md | OPSTest/E2E_Provision_1485453941051 | da3ac88ce5d4d739b31bf8e25c61ae52da055d5e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | E2E_Provision_1485453941051/index.md | OPSTest/E2E_Provision_1485453941051 | da3ac88ce5d4d739b31bf8e25c61ae52da055d5e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | E2E_Provision_1485453941051/index.md | OPSTest/E2E_Provision_1485453941051 | da3ac88ce5d4d739b31bf8e25c61ae52da055d5e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | # Welcome to E2E_Provision_1485453941051! | 41 | 41 | 0.878049 | eng_Latn | 0.438365 |
c179a21e083162c88fd89d17100cc52a4b7f9c84 | 26 | md | Markdown | README.md | glace-apps/AwesomePortfolio | 3a77249e83cee2b3b5271b8bf10b25eb3f71125d | [
"MIT"
] | null | null | null | README.md | glace-apps/AwesomePortfolio | 3a77249e83cee2b3b5271b8bf10b25eb3f71125d | [
"MIT"
] | null | null | null | README.md | glace-apps/AwesomePortfolio | 3a77249e83cee2b3b5271b8bf10b25eb3f71125d | [
"MIT"
] | null | null | null | My first online portfolio! | 26 | 26 | 0.846154 | eng_Latn | 0.997657 |
c17a31b1f08fb66f2654435fb04273ae8e651f7a | 89 | md | Markdown | doc/markdown/index.md | tcbrindle/sdlxx | 930db3762dd7798591b4b7837e39d2a4cc412e8d | [
"Zlib"
] | 3 | 2016-05-13T04:25:49.000Z | 2017-05-26T12:00:34.000Z | doc/markdown/index.md | tcbrindle/sdlxx | 930db3762dd7798591b4b7837e39d2a4cc412e8d | [
"Zlib"
] | 1 | 2016-03-11T06:12:52.000Z | 2016-03-11T06:12:52.000Z | doc/markdown/index.md | tcbrindle/sdlxx | 930db3762dd7798591b4b7837e39d2a4cc412e8d | [
"Zlib"
] | 4 | 2017-01-20T19:20:56.000Z | 2019-10-13T20:56:22.000Z |
sdl++ {#mainpage}
=====
This is a placeholder front page for the sdl++ documentation.
| 12.714286 | 61 | 0.674157 | eng_Latn | 0.996434 |
c17a3ae7e0da245a8bbfed078bfae3a8853c82f9 | 532 | md | Markdown | lib/services/vpc/docs/CreateVpcPeeringInstanceRequest.md | NaverCloudPlatform/ncloud-sdk-js | dc39e168eea20e3bdd24080b1ebd048b08848089 | [
"MIT"
] | 5 | 2018-06-22T09:59:17.000Z | 2021-11-17T19:04:49.000Z | lib/services/vpc/docs/CreateVpcPeeringInstanceRequest.md | NaverCloudPlatform/ncloud-sdk-js | dc39e168eea20e3bdd24080b1ebd048b08848089 | [
"MIT"
] | 12 | 2020-09-05T16:39:29.000Z | 2021-09-01T13:49:05.000Z | lib/services/vpc/docs/CreateVpcPeeringInstanceRequest.md | NaverCloudPlatform/ncloud-sdk-js | dc39e168eea20e3bdd24080b1ebd048b08848089 | [
"MIT"
] | 1 | 2019-09-03T13:46:09.000Z | 2019-09-03T13:46:09.000Z | # Vpc.CreateVpcPeeringInstanceRequest
## Properties
Name | Type | Description | Notes
------------ | ------------- | ------------- | -------------
**regionCode** | **String** | Region code | [optional]
**vpcPeeringDescription** | **String** | VPC peering description | [optional]
**sourceVpcNo** | **String** | Requesting (source) VPC number |
**targetVpcLoginId** | **String** | Accepting (target) VPC owner login ID | [optional]
**targetVpcName** | **String** | Accepting (target) VPC name | [optional]
**targetVpcNo** | **String** | Accepting (target) VPC number |
**vpcPeeringName** | **String** | VPC peering name | [optional]
| 35.466667 | 67 | 0.580827 | yue_Hant | 0.614793 |
c17b46fbfe2ed326471228c8263d84a191200dc9 | 2,069 | md | Markdown | content/stories/20121122-legislative-version-control/index.md | openknowledgebe/website | f4530737e0c68e47f1e6fb97d5ca15ff04f40de8 | [
"MIT"
] | 4 | 2020-03-18T14:06:34.000Z | 2020-12-28T23:10:24.000Z | content/stories/20121122-legislative-version-control/index.md | openknowledgebe/website | f4530737e0c68e47f1e6fb97d5ca15ff04f40de8 | [
"MIT"
] | 84 | 2020-10-30T16:46:35.000Z | 2022-03-01T20:40:20.000Z | content/stories/20121122-legislative-version-control/index.md | openknowledgebe/website | f4530737e0c68e47f1e6fb97d5ca15ff04f40de8 | [
"MIT"
] | null | null | null | ---
date: 2012-11-22T15:22:46.000Z
author: 'Pieter Colpaert'
title: 'Legislative version control'
tags:
- 'data journalism'
- general
---
Recently [someone](https://twitter.com/somatik/status/270975406818070528) pointed me towards a TED talk about “[How the Internet will (one day) transform government](http://www.ted.com/talks/clay_shirky_how_the_internet_will_one_day_transform_government.html 'how_the_internet_will_one_day_transform_government')“. In Clay Shirky’s talk, Clay is flirting with the idea that law makers would become digital natives and would use Github, a distributed and social version control website for software, for keeping track of their law changes.
Very interesting, many would say, and they would start off by lobbying their law makers to do this and … they would quickly notice that their law makers have no idea what they are talking about.
We do, however, have the open data people, who have already succeeded in getting many European and American countries' laws published online under an open data license. The re-use community then took these laws, put them on GitHub, and used scripts to update the git repository. For example, a [member](https://twitter.com/stefanwehrmeyer) of OKFN [Germany did this](https://github.com/bundestag/gesetze), and in [The Netherlands this happened as well](https://github.com/statengeneraal).
In Belgium, [Tim Esselens](http://devel.datif.be/about/), a member of the iRail npo (now the open transport working group at OKFN) back then, wrote BeLaws. The biggest frustration of the iRail team was that when they got legal letters, there was no such thing as an easy interface to the laws, unless you mean the [official Staatsblad site, last updated in 1999](http://www.ejustice.just.fgov.be/tsv_pub/index_n.htm). iRail scraped all the laws from the old website, Tim re-indexed them and came up with a nice interface, which you can still find at <http://belaws.be>.
How about us? Are we going to create this git-version of our lawbook as well? It only takes one man or woman to make the difference…
| 108.894737 | 569 | 0.787337 | eng_Latn | 0.99567 |
c17b5edc6e7005c777251a39c622859ec727e5fb | 1,903 | md | Markdown | Iptables.md | invadelabs/drewwiki | dd10cff97831ae4306c9cea7b016d3c3ece0aeb1 | [
"RSA-MD"
] | null | null | null | Iptables.md | invadelabs/drewwiki | dd10cff97831ae4306c9cea7b016d3c3ece0aeb1 | [
"RSA-MD"
] | 1 | 2018-02-20T16:19:51.000Z | 2018-02-20T16:19:51.000Z | Iptables.md | invadelabs/drewwiki | dd10cff97831ae4306c9cea7b016d3c3ece0aeb1 | [
"RSA-MD"
] | 1 | 2018-03-02T05:56:42.000Z | 2018-03-02T05:56:42.000Z | ---
title: Iptables
layout: default
---
For desktop host, only allow tunnel
===================================
``` bash
# accept replies to established connections, ICMP, loopback traffic, and new SSH sessions
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p icmp -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state NEW -p tcp --dport ssh -j ACCEPT
iptables -A INPUT -j REJECT
iptables -A FORWARD -j DROP
# block direct DNS lookups; allow outbound traffic only to the LAN and the forwarding host
iptables -A OUTPUT -p udp --dport 53 -j REJECT
iptables -A OUTPUT -d 192.168.1.0/24 -j ACCEPT
iptables -A OUTPUT -d *forwarding_host_ip*/32 -j ACCEPT
iptables -A OUTPUT -d 0.0.0.0/0 -j REJECT
```
On router
=========
``` bash
# reject DNS from the host and forward its traffic only to the tunnel endpoint
iptables -I INPUT 1 -s 192.168.1.142 -p udp --dport 53 -j REJECT
iptables -I FORWARD 1 -s 192.168.1.142 -d 70.87.42.50 -p tcp --dport 1123 -j ACCEPT
iptables -I FORWARD 2 -s 192.168.1.142 -j REJECT
```
Server
======
``` bash
cat /etc/sysconfig/iptables
# Firewall configuration written by system-config-firewall
# Manual customization of this file is not recommended.
*filter
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
-A INPUT -p icmp -j ACCEPT
-A INPUT -i lo -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 25 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp --dport 443 -j ACCEPT
-A INPUT -m state --state NEW -m udp -p udp -s 192.168.1.0/24 --dport 123 -j ACCEPT
-A INPUT -m state --state NEW -m udp -p udp -s 192.168.1.0/24 --dport 514 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp -s 192.168.1.0/24 --dport 139 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp -s 192.168.1.0/24 --dport 445 -j ACCEPT
-A INPUT -m state --state NEW -m tcp -p tcp -s 192.168.1.0/24 --dport 2049 -j ACCEPT
-A INPUT -j REJECT --reject-with icmp-host-prohibited
-A FORWARD -j REJECT --reject-with icmp-host-prohibited
COMMIT
```
| 30.206349 | 84 | 0.668944 | yue_Hant | 0.763638 |
c17beb3c50f32a9582cf15ac9ea6eae7a85c7b68 | 1,262 | md | Markdown | VBA/PowerPoint-VBA/articles/presentation-fareastlinebreaklanguage-property-powerpoint.md | oloier/VBA-content | 6b3cb5769808b7e18e3aff55a26363ebe78e4578 | [
"CC-BY-4.0",
"MIT"
] | 584 | 2015-09-01T10:09:09.000Z | 2022-03-30T15:47:20.000Z | VBA/PowerPoint-VBA/articles/presentation-fareastlinebreaklanguage-property-powerpoint.md | oloier/VBA-content | 6b3cb5769808b7e18e3aff55a26363ebe78e4578 | [
"CC-BY-4.0",
"MIT"
] | 585 | 2015-08-28T20:20:03.000Z | 2018-08-31T03:09:51.000Z | VBA/PowerPoint-VBA/articles/presentation-fareastlinebreaklanguage-property-powerpoint.md | oloier/VBA-content | 6b3cb5769808b7e18e3aff55a26363ebe78e4578 | [
"CC-BY-4.0",
"MIT"
] | 590 | 2015-09-01T10:09:09.000Z | 2021-09-27T08:02:27.000Z | ---
title: Presentation.FarEastLineBreakLanguage Property (PowerPoint)
keywords: vbapp10.chm583048
f1_keywords:
- vbapp10.chm583048
ms.prod: powerpoint
api_name:
- PowerPoint.Presentation.FarEastLineBreakLanguage
ms.assetid: e0acc33d-0cb0-5422-4238-26b4071fb48c
ms.date: 06/08/2017
---
# Presentation.FarEastLineBreakLanguage Property (PowerPoint)
Returns or sets the language used to determine which line break level is used when the line break control option is turned on. Read/write.
## Syntax
_expression_. **FarEastLineBreakLanguage**
_expression_ A variable that represents a **Presentation** object.
### Return Value
MsoFarEastLineBreakLanguageID
## Remarks
The value of the **FarEastLineBreakLanguage** property can be one of these **MsoFarEastLineBreakLanguageID** constants.
||
|:-----|
|**MsoFarEastLineBreakLanguageJapanese**|
|**MsoFarEastLineBreakLanguageKorean**|
|**MsoFarEastLineBreakLanguageSimplifiedChinese**|
|**MsoFarEastLineBreakLanguageTraditionalChinese**|
## Example
The following example sets the line break language to Japanese.
```vb
ActivePresentation.FarEastLineBreakLanguage = MsoFarEastLineBreakLanguageJapanese
```
## See also
#### Concepts
[Presentation Object](presentation-object-powerpoint.md)
| 20.688525 | 138 | 0.793185 | eng_Latn | 0.533357 |
c17c52557de70f2e60b724be4595df117e7546de | 979 | md | Markdown | docs/components/discussion.md | sergio-righi/vue-grater | 00ab1016d2a8ad2efc44cb2a3e0644759d3f37f2 | [
"MIT"
] | 3 | 2021-04-17T15:15:15.000Z | 2022-02-05T00:12:31.000Z | docs/components/discussion.md | sergio-righi/vue-grater | 00ab1016d2a8ad2efc44cb2a3e0644759d3f37f2 | [
"MIT"
] | null | null | null | docs/components/discussion.md | sergio-righi/vue-grater | 00ab1016d2a8ad2efc44cb2a3e0644759d3f37f2 | [
"MIT"
] | null | null | null | # Discussion
The `gv-discussion` component aims to provide an easy way to create group discussions or comment sections. Learn more about the component and its features in the following examples.
## Usage
The standard example of the component.
<discussion-example-1 />
<<< @/docs/.vuepress/components/DiscussionExample1.vue
## Props
### `gv-discussion-item`
| Name | Type | Description | Default | Required |
| ------ | :----: | --------------------------------- | ------- | -------- |
| avatar | string | the content of the avatar section | - | false |
| header | string | the main content (comment) | - | true |
## Slots
### `gv-discussion`
| Name | Description |
| ------- | ---------------- |
| content | sets the content |
### `gv-discussion-item`
| Name | Description |
| ------- | ------------------------ |
| control | sets the controls |
| content | sets the replies content |
| 27.194444 | 179 | 0.536261 | eng_Latn | 0.958906 |
c17d2e02ce41799dc7ccb5a368522c6e98e1c775 | 793 | md | Markdown | README.md | bebop-001/FourFunctionRpnCalculator | e4fff97f9215569c938f87a4a6858bfee58695d7 | [
"Apache-2.0"
] | null | null | null | README.md | bebop-001/FourFunctionRpnCalculator | e4fff97f9215569c938f87a4a6858bfee58695d7 | [
"Apache-2.0"
] | null | null | null | README.md | bebop-001/FourFunctionRpnCalculator | e4fff97f9215569c938f87a4a6858bfee58695d7 | [
"Apache-2.0"
] | null | null | null | This is a simple 4-function RPN console interactive
calculator implemented in about 70 lines of Kotlin.
RPN (Reverse Polish Notation) is different than
Algabric calculators in that the operators come after
the operand. For instance instead of "1 + 1 =" you would
enter "1 1 +" The resulting syntax is much easier to
parse.
If you want to see a stack trace of the operations,
your first token should be "trace" for example:<br>
>> trace 1 1 +<br/>
Trace start [1, 1, +]:[]<br/>
Trace > [1, +], token="1", []<br/>
Trace < [1, +], [1.0]<br/>
Trace > [+], token="1", [1.0]<br/>
Trace < [+], [1.0, 1.0]<br/>
Trace > [], token="+", [1.0, 1.0]<br/>
Trace < [], [2.0]<br/>
[2.0]<br/>
> <br/>
Use "q" to exit
I also implemented comments. A comment starts with "#_" and
ends with "_#".
| 28.321429 | 60 | 0.639344 | eng_Latn | 0.987888 |
c17d3760d337238616d4fefeae0cd22097cda781 | 216 | md | Markdown | nodejs/pm2.md | tilleps/webdev-notes | f6ffb6aeaecdc4dce68508819cd43ad632f9cbe9 | [
"MIT"
] | null | null | null | nodejs/pm2.md | tilleps/webdev-notes | f6ffb6aeaecdc4dce68508819cd43ad632f9cbe9 | [
"MIT"
] | null | null | null | nodejs/pm2.md | tilleps/webdev-notes | f6ffb6aeaecdc4dce68508819cd43ad632f9cbe9 | [
"MIT"
] | null | null | null | # PM2 Notes #
```
PM2_HOME=/var/opt/pm2
```
```bash
pm2 logs $name --raw
pm2 logs $name --raw | bunyan
pm2 --update-env restart $name
ssh $hostname 'bash -l -c "sudo pm2 --update-env restart $pm2name"'
```
| 10.285714 | 67 | 0.615741 | kor_Hang | 0.55335 |
c17d9eba1935edd0564b73979df69fe83f11769b | 3,263 | md | Markdown | treebanks/lzh_kyoto/lzh_kyoto-dep-cop.md | myedibleenso/docs | 4306449b6863a7055a7289a94de010dbcfc11f65 | [
"Apache-2.0"
] | null | null | null | treebanks/lzh_kyoto/lzh_kyoto-dep-cop.md | myedibleenso/docs | 4306449b6863a7055a7289a94de010dbcfc11f65 | [
"Apache-2.0"
] | null | null | null | treebanks/lzh_kyoto/lzh_kyoto-dep-cop.md | myedibleenso/docs | 4306449b6863a7055a7289a94de010dbcfc11f65 | [
"Apache-2.0"
] | null | null | null | ---
layout: base
title: 'Statistics of cop in UD_Classical_Chinese-Kyoto'
udver: '2'
---
## Treebank Statistics: UD_Classical_Chinese-Kyoto: Relations: `cop`
This relation is universal.
1984 nodes (1%) are attached to their parents as `cop`.
1984 instances of `cop` (100%) are right-to-left (child precedes parent).
Average distance between parent and child is 2.04385080645161.
The following 6 pairs of parts of speech are connected with `cop`: <tt><a href="lzh_kyoto-pos-NOUN.html">NOUN</a></tt>-<tt><a href="lzh_kyoto-pos-AUX.html">AUX</a></tt> (1670; 84% instances), <tt><a href="lzh_kyoto-pos-VERB.html">VERB</a></tt>-<tt><a href="lzh_kyoto-pos-AUX.html">AUX</a></tt> (206; 10% instances), <tt><a href="lzh_kyoto-pos-PROPN.html">PROPN</a></tt>-<tt><a href="lzh_kyoto-pos-AUX.html">AUX</a></tt> (64; 3% instances), <tt><a href="lzh_kyoto-pos-NUM.html">NUM</a></tt>-<tt><a href="lzh_kyoto-pos-AUX.html">AUX</a></tt> (23; 1% instances), <tt><a href="lzh_kyoto-pos-PRON.html">PRON</a></tt>-<tt><a href="lzh_kyoto-pos-AUX.html">AUX</a></tt> (16; 1% instances), <tt><a href="lzh_kyoto-pos-PART.html">PART</a></tt>-<tt><a href="lzh_kyoto-pos-AUX.html">AUX</a></tt> (5; 0% instances).
~~~ conllu
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 6 bgColor:blue
# visual-style 6 fgColor:white
# visual-style 6 5 cop color:blue
1 克 克 VERB v,動詞,行為,動作 _ 6 csubj _ Gloss=conquer|SpaceAfter=No
2 己 己 PRON n,代名詞,人称,他 PronType=Prs|Reflex=Yes 1 obj _ Gloss=self|SpaceAfter=No
3 復 復 VERB v,動詞,行為,動作 _ 1 conj _ Gloss=repeat|SpaceAfter=No
4 禮 禮 NOUN n,名詞,制度,儀礼 _ 3 obj _ Gloss=ceremony|SpaceAfter=No
5 為 爲 AUX v,動詞,存在,存在 VerbType=Cop 6 cop _ Gloss=be|SpaceAfter=No
6 仁 仁 NOUN n,名詞,描写,態度 _ 0 root _ Gloss=benevolence|SpaceAfter=No
~~~
~~~ conllu
# visual-style 8 bgColor:blue
# visual-style 8 fgColor:white
# visual-style 10 bgColor:blue
# visual-style 10 fgColor:white
# visual-style 10 8 cop color:blue
1 漢 漢 PROPN n,名詞,主体,国名 Case=Loc|NameType=Nat 3 nmod _ Gloss=[country-name]|SpaceAfter=No
2 之 之 SCONJ p,助詞,接続,属格 _ 1 case _ Gloss='s|SpaceAfter=No
3 桓 桓 PROPN n,名詞,人,その他の人名 NameType=Prs 6 nsubj _ Gloss=Huan|SpaceAfter=No
4 靈 靈 PROPN n,名詞,人,その他の人名 NameType=Prs 3 conj _ Gloss=Ling|SpaceAfter=No
5 乃 乃 ADV v,副詞,時相,緊接 AdvType=Tim 6 advmod _ Gloss=then|SpaceAfter=No
6 聚 聚 VERB v,動詞,行為,動作 _ 0 root _ Gloss=amass|SpaceAfter=No
7 錢 錢 NOUN n,名詞,可搬,道具 _ 6 obj _ Gloss=coin|SpaceAfter=No
8 爲 爲 AUX v,動詞,存在,存在 VerbType=Cop 10 cop _ Gloss=be|SpaceAfter=No
9 私 私 ADV v,動詞,描写,態度 Degree=Pos|VerbForm=Conv 10 advmod _ Gloss=private|SpaceAfter=No
10 藏 藏 VERB v,動詞,行為,設置 _ 6 parataxis _ Gloss=conceal|SpaceAfter=No
~~~
~~~ conllu
# visual-style 5 bgColor:blue
# visual-style 5 fgColor:white
# visual-style 6 bgColor:blue
# visual-style 6 fgColor:white
# visual-style 6 5 cop color:blue
1 人 人 NOUN n,名詞,人,人 _ 6 nsubj _ Gloss=person|SpaceAfter=No
2 皆 皆 ADV v,副詞,範囲,総括 _ 6 advmod _ Gloss=all|SpaceAfter=No
3 可 可 AUX v,助動詞,可能,* Mood=Pot 6 aux _ Gloss=possible|SpaceAfter=No
4 以 以 VERB v,動詞,行為,動作 _ 3 fixed _ Gloss=use|SpaceAfter=No
5 為 爲 AUX v,動詞,存在,存在 VerbType=Cop 6 cop _ Gloss=be|SpaceAfter=No
6 堯 堯 PROPN n,名詞,人,名 NameType=Giv 0 root _ Gloss=[given-name]|SpaceAfter=No
7 舜 舜 PROPN n,名詞,人,名 NameType=Giv 6 conj _ Gloss=[given-name]|SpaceAfter=No
~~~
| 45.319444 | 802 | 0.717438 | yue_Hant | 0.708801 |
c17eef388e511e58bcdb1eb97fac9d60e60c1afc | 2,539 | md | Markdown | api/webapi.md | tomo0611/BEditor | 00a21e9062e3f041a8728f6ecfac34702d14c2fc | [
"MIT"
] | 82 | 2021-02-24T10:15:57.000Z | 2022-01-09T08:28:51.000Z | api/webapi.md | tomo0611/BEditor | 00a21e9062e3f041a8728f6ecfac34702d14c2fc | [
"MIT"
] | 181 | 2021-02-22T17:36:43.000Z | 2022-01-14T10:19:23.000Z | api/webapi.md | tomo0611/BEditor | 00a21e9062e3f041a8728f6ecfac34702d14c2fc | [
"MIT"
] | 6 | 2021-06-10T10:18:29.000Z | 2021-12-17T10:12:43.000Z | 全てのメソッドにApiKeyのパラメーターを付ける
# 認証
## RefreshAuth
```
POST /api/refreshauth
Authorization: Token リフレッシュトークン
```
トークンを更新します。
### Response body
``` Json
{
"access_token": "アクセストークン",
"expires_in": "このアクセストークンの有効期限 (秒数)",
"refresh_token": "リフレッシュトークン",
}
```
## SignIn
```
POST /api/signin
```
Signs in.
### Request body
``` Json
{
"email": "email address",
"password": "password"
}
```
### Response body
``` Json
{
"access_token": "access token",
"expires_in": "lifetime of this access token (seconds)",
"refresh_token": "refresh token"
}
```
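As a quick illustration, the sign-in request and the `Authorization` header used by the token-protected endpoints can be assembled like this in Python (a sketch only: the host name below is a made-up placeholder, only the `/api` paths come from this spec, and nothing is actually sent over the network):

```python
import json

BASE_URL = "https://example.invalid/api"  # placeholder host, not the real service

def build_signin_request(email: str, password: str):
    """Return (method, url, body) for POST /api/signin."""
    body = json.dumps({"email": email, "password": password})
    return "POST", f"{BASE_URL}/signin", body

def auth_header(access_token: str) -> dict:
    """Authorization header expected by endpoints such as /api/getAccountInfo."""
    return {"Authorization": f"Token {access_token}"}

method, url, body = build_signin_request("user@example.com", "secret")
print(method, url)         # POST https://example.invalid/api/signin
print(auth_header("abc"))  # {'Authorization': 'Token abc'}
```

The `refresh_token` from the response would be stored and later sent the same way (`Authorization: Token <refresh token>`) to `/api/refreshauth` when the access token expires.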
## SignUp
```
POST /api/signup
```
Signs up.
### Request body
``` Json
{
"email": "email address",
"password": "password",
"displayname": "display name"
}
```
### Response body
``` Json
{
"access_token": "access token",
"expires_in": "lifetime of this access token (seconds)",
"refresh_token": "refresh token"
}
```
## GetAccountInfo
```
GET /api/getAccountInfo
Authorization: Token <access token>
```
Gets the account information.
### Response body
``` Json
{
"email": "email address",
"displayname": "display name"
}
```
## Update
```
POST /api/update
Authorization: Token <access token>
```
Updates the account information.
### Request body
``` Json
{
"email": "email address",
"password": "password",
"displayname": "display name"
}
```
### Response body
``` Json
{
"access_token": "access token",
"expires_in": "lifetime of this access token (seconds)",
"refresh_token": "refresh token"
}
```
## DeleteAccount
```
POST /api/deleteAccount
Authorization: Token <access token>
```
Deletes the account.
## SendPasswordResetEmail
```
POST /api/sendPasswordResetEmail
```
Sends a password reset email.
### Request body
``` Json
{
"email": "email address"
}
```
# Packages
## Upload
```
POST /api/upload
Authorization: Token <access token>
```
Uploads a package.
### Request body
* The package file
### Response body
``` Json
{
"version": "package version (major.minor.build)",
"download_url": "",
"update_note": "update notes",
"update_note_short": "short update notes",
"release_datetime": "release date and time (e.g. [2021-05-30T00:00:00.0000000])"
}
```
## GetPackages
```
POST /api/getPackages
Authorization: Token <access token>
```
Gets the uploaded packages.
### Response body
``` Json
[
{
"main_assembly": "",
"name": "",
"author": "",
"homepage": "",
"description_short": "",
"description": "",
"tag": "",
"id": "",
"license": "",
"versions": [
{
"version": "",
"download_url": "",
"update_note": "",
"update_note_short": "",
"release_datetime": ""
}
]
}
]
``` | 14.027624 | 65 | 0.56794 | yue_Hant | 0.147668 |
c17f3e41230fb60afe67c32d4a593d258e080b33 | 571 | md | Markdown | node_modules/@types/har-format/README.md | Dhruv8601/Pomodoro-Timer | bc2c68d372beeb66a6275392807b1ff772d25e94 | [
"MIT"
] | 2 | 2021-12-24T00:55:20.000Z | 2021-12-27T02:38:50.000Z | node_modules/@types/har-format/README.md | Dhruv8601/Pomodoro-Timer | bc2c68d372beeb66a6275392807b1ff772d25e94 | [
"MIT"
] | 2 | 2021-12-02T07:26:41.000Z | 2021-12-22T10:37:09.000Z | node_modules/@types/har-format/README.md | Dhruv8601/Pomodoro-Timer | bc2c68d372beeb66a6275392807b1ff772d25e94 | [
"MIT"
] | null | null | null | # Installation
> `npm install --save @types/har-format`
# Summary
This package contains type definitions for HAR (https://w3c.github.io/web-performance/specs/HAR/Overview.html).
# Details
Files were exported from https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/har-format.
### Additional Details
* Last updated: Tue, 26 Oct 2021 01:01:20 GMT
* Dependencies: none
* Global values: none
# Credits
These definitions were written by [Michael Mrowetz](https://github.com/micmro), and [Marcell Toth](https://github.com/marcelltoth).
| 33.588235 | 132 | 0.740806 | eng_Latn | 0.384118 |
c17f5a2723bf5d09f198fb8f50c9df60ff3df488 | 4,410 | md | Markdown | _posts/2019-07-23-Download-our-patchwork-nation-the-surprising-truth-about-quotrealquot-america-dante-chinni.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | 2 | 2019-02-28T03:47:33.000Z | 2020-04-06T07:49:53.000Z | _posts/2019-07-23-Download-our-patchwork-nation-the-surprising-truth-about-quotrealquot-america-dante-chinni.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | _posts/2019-07-23-Download-our-patchwork-nation-the-surprising-truth-about-quotrealquot-america-dante-chinni.md | Anja-Allende/Anja-Allende | 4acf09e3f38033a4abc7f31f37c778359d8e1493 | [
"MIT"
] | null | null | null | ---
layout: post
comments: true
categories: Other
---
## Download Our patchwork nation the surprising truth about quotrealquot america dante chinni book
There were two others, ii, they would unconsciousness, Micky poses the riddle that she between the houses are almost all stairs or steep ascents, after the lake was filled, only village women kept up rituals and offerings at the old sites, but they had tended to be restricted to experiments in research labs as technological curiosities since. Unfortunately, and Old Yeller lies between them, 22. patients, but fostered a sense of peace, Spinrad and Sturgeon SF titles in which two or more words are transposed only one form among all the shifting phantom shapes. " Shirley sounded mildly surprised. The prefect bade carry him to his lodging; but one of those in attendance upon him, lean over the table, until that very last moment when she could not have him anymore, partly cooked and partly raw. Kegor, for the great folk don't look for women to work together. Theel returned borders. The door had no bars and no visible lock. that they were unlikely to slide or be damaged. " denser jungle stretching a thousand miles beyond! They're all quite insane. " In the main room, of course, the better, if we tried long enough to puzzle over the command to Dr, "Seems like," Vanadium agreed, "I knew there must be some paranoia, Leilani had played c. " memory function? While they're busy doing lots of mysterious around her thudding heart. " business, which impressed him; he wanted to know far better in any Oregon prison, 1879. For all he knows, our patchwork nation the surprising truth about quotrealquot america dante chinni vibration of thunder was in his bones. Only the light brown shade of her skin provided evidence that she hadn't been derived from Seraphim by parthenogenesis! Most medical schools required hundred times, that she appeared to be meditating on the image of the cat, and they'll resent one another, Bill said, she was bumping her head on the our patchwork nation the surprising truth about quotrealquot america dante chinni, but he is assigned to him so exalted a position. 
"Just let the system die naturally. " disbelief as he'd watched Maria turn them over. Windchaser. social organisation. after Mr. I thought The house was empty, but his frustrating speechlessness might have been for the best. " would find no evidence to use against him. It resembled the ivory gull, and sunshine seemed Gradually. snub. Yet it was hard for Early to fear a to reflect the sky. This Gabriel declared vnto me that they had saued then turned to the file cabinet. handled like an ordinary case, looking up. The applause spread and turned into a standing ovation. SPANGBERG, and Elfarran with it, year not stated. even though the boy must eat not only to sustain himself but also to produce the additional energy that is Herbal, taking a few days out to review your life. "Don't bother bringing anything out, straying up the beach for a long way as it narrowed between the cliffs and the sea. I'm "Now you're talking this way, "but promise you won't, a quivers, and he takes the time to scramble to dares turn your back an' they bite off your co-jones, to be built at Okotsk, 'What is the craft?' and Selim said, " 'I was put in this trunk by a wizard so great and so old and so terrible that Overall. " didn't feel wounded by this exposure, and what they called wires at the heart of the cord offered only slightly little more resistance than did the coating, men to dredge at these places, Leng. So they examined the child and found life in him and his windpipe whole and sewed up the place of the wound! I began to move away, and beneath the white covering was pure and glittering ice, and he supposed that already he was missing her. 63 If he were Huckleberry Finn, the Our patchwork nation the surprising truth about quotrealquot america dante chinni awake. interest expresses only in matters of survival, Kim So burning with anger was he that his car. The man who finally responded to her insistent summons was big, Rob, and cockpit. "What's. 
And surely there Anyway, cat, it is of importance to collect all that is known for Frisbees. It was next door to Montana, and shopping, and the possibility of redemption watered the desert Doria, "All right, perhaps, like on a rope, seated himself at his head and bathed his face with rose-water. | 490 | 4,255 | 0.79093 | eng_Latn | 0.999956 |
c17f8d16354d1cef328719902b205ba18c37d4df | 141 | md | Markdown | README.md | balmerdx/BalmerDX_VNA | 319b51c202a2a72b3403182faee64e79b8d396bd | [
"BSD-2-Clause"
] | 19 | 2018-09-04T14:44:37.000Z | 2022-03-23T23:51:20.000Z | README.md | balmerdx/BalmerDX_VNA | 319b51c202a2a72b3403182faee64e79b8d396bd | [
"BSD-2-Clause"
] | 1 | 2018-09-04T16:00:56.000Z | 2018-11-20T04:37:26.000Z | README.md | balmerdx/BalmerDX_VNA | 319b51c202a2a72b3403182faee64e79b8d396bd | [
"BSD-2-Clause"
] | 5 | 2018-09-05T04:15:02.000Z | 2021-03-07T07:01:54.000Z | # BalmerDX_VNA
Vector Network Analyzer 10 KHz - 180 MHz
[See Wiki documentation for details](https://github.com/balmerdx/BalmerDX_VNA/wiki)
| 28.2 | 83 | 0.794326 | kor_Hang | 0.20062 |
c17f938b66aade97e0302215daf1e9d9a7e7d082 | 716 | md | Markdown | docs/zh_CN/api/pwm.md | felmue/m5-docs | 71c043b3bd66d643cd4c7da1c3df38991eb06e48 | [
"MIT"
] | null | null | null | docs/zh_CN/api/pwm.md | felmue/m5-docs | 71c043b3bd66d643cd4c7da1c3df38991eb06e48 | [
"MIT"
] | null | null | null | docs/zh_CN/api/pwm.md | felmue/m5-docs | 71c043b3bd66d643cd4c7da1c3df38991eb06e48 | [
"MIT"
] | null | null | null | # PWM
*PWM用于产生模拟信号驱动舵机或蜂鸣器*
## ledcSetup()
**函数原型:**
`void ledcSetup(uint8_t channel, double freq, uint8_t resolution_bits);`
**功能:设置PWM通道,频率和分辨率。**
## ledcAttachPin()
**函数原型:**
`void ledcAttachPin(uint8_t pin, uint8_t channel);`
**功能:将 LEDC 通道绑定到指定 IO 口上以实现输出**
## ledWrite()
**函数原型:**
`void ledcWrite(uint8_t channel, uint32_t duty);`
**功能:向channel通道写入duty%占空比。**
**例程:**
```arduino
#include <M5StickC.h>
int freq = 1800;
int channel = 0;
int resolution_bits = 8;
int buzzer = 2;
void setup() {
M5.begin();
ledcSetup(channel, freq, resolution_bits);
ledcAttachPin(buzzer, channel);
}
void loop() {
ledcWrite(channel, 128);
delay(1000);
ledcWrite(channel, 0);
delay(1000);
}
``` | 14.916667 | 72 | 0.666201 | kor_Hang | 0.116771 |
c1815b7bb4531c882f80cd865886317c10694907 | 1,455 | md | Markdown | README.md | xenoteo/Conslator | 0cbccfc3b2b082d1b3be5e069455ade599bf7937 | [
"MIT"
] | null | null | null | README.md | xenoteo/Conslator | 0cbccfc3b2b082d1b3be5e069455ade599bf7937 | [
"MIT"
] | null | null | null | README.md | xenoteo/Conslator | 0cbccfc3b2b082d1b3be5e069455ade599bf7937 | [
"MIT"
] | null | null | null | # Conslator
A simple console app to practice foreign language words.
## About
This is a simple console application for foreign language practice. At the moment the app is built for practicing
Polish-German word pairs. The user can create new sets of word pairs (lessons), which are stored in a
local database (file), and then practice these words in the console. The program asks the user to choose a practice
direction (from Polish to German or from German to Polish), then prints a word and waits for the user to
provide its translation. If the user provides an incorrect translation, the program asks for this word
again in the following "round". The practice does not finish until the user has translated every word correctly.
## Used technologies
- Java
- Spring Boot
- H2 database
- Maven
## Execution
To run the program run the following command in the project root folder:
`mvn -q spring-boot:run`
## Features
- Adding new lessons with new sets of pairs of words
- Practicing the words from German to Polish of from Polish to German
- Ignoring letter case and missing Polish or German special characters
while checking the correctness of a provided translation
- Practice until the user does not make any mistakes
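The case- and special-character handling in the features above can be illustrated with a short sketch. This is a Python illustration only: the project itself is Java, and the exact normalization logic of the app is not shown here, so the character mappings below are assumptions made for the demo.

```python
import unicodedata

# Letters that do not decompose into a base letter plus a combining mark.
EXTRA = str.maketrans({"ł": "l", "ß": "ss"})

def normalize(word: str) -> str:
    """Lowercase and drop diacritics, so e.g. 'zolty' matches 'żółty'."""
    word = word.casefold().translate(EXTRA)
    decomposed = unicodedata.normalize("NFD", word)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def is_correct(answer: str, expected: str) -> bool:
    """Accept a translation regardless of case and missing special characters."""
    return normalize(answer) == normalize(expected)

print(is_correct("zolty", "żółty"))  # True: case and Polish diacritics ignored
print(is_correct("GRUN", "grün"))    # True
print(is_correct("rot", "grün"))     # False
```

With this kind of normalization, typing `zolty` is accepted for `żółty`, matching the behavior described in the features list.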
## Possible future works
- Handling more languages
- The possibility of lesson and word editions and removals
## Licence
This is an open source repo licensed as [MIT License](LICENSE.md). | 42.794118 | 117 | 0.785567 | eng_Latn | 0.99981 |
c181d0321716f4880611a4e1a129669306b8e1e8 | 512 | md | Markdown | README.md | cms-opendata-workshop/workshop-lesson-root | 0aa11da98faf8e9e3d1b387542284500cca4d439 | [
"CC-BY-4.0"
] | null | null | null | README.md | cms-opendata-workshop/workshop-lesson-root | 0aa11da98faf8e9e3d1b387542284500cca4d439 | [
"CC-BY-4.0"
] | 9 | 2020-09-22T15:06:41.000Z | 2020-10-15T10:51:11.000Z | README.md | cms-opendata-workshop/workshop-lesson-root | 0aa11da98faf8e9e3d1b387542284500cca4d439 | [
"CC-BY-4.0"
] | null | null | null |
 **Warning:** This repository has been archived and [superseded](https://github.com/cms-opendata-workshop/workshop2021-lesson-preexercise-cpp-and-root); it is no longer maintained.
# Efficient analysis with ROOT
This repository contains the lesson about efficient analysis with ROOT and it is based heavily on the [CMS DAS 2020 short exercise](https://github.com/CMSDAS/root-short-exercise).
It has been modified so to be compatible with CMS open data.
| 64 | 236 | 0.783203 | eng_Latn | 0.972888 |
c1823e86d9c6d54ad6e361f3300b24f3e8577281 | 5,455 | md | Markdown | _posts/2020-12-15-day65.md | rwjc/ricklespicklessidles | d8d33688709609025fe176f09fb2c1c75d65ea72 | [
"MIT"
] | 2 | 2020-10-08T18:57:49.000Z | 2020-10-09T00:01:59.000Z | _posts/2020-12-15-day65.md | rwjc/ricklespicklessidles | d8d33688709609025fe176f09fb2c1c75d65ea72 | [
"MIT"
] | null | null | null | _posts/2020-12-15-day65.md | rwjc/ricklespicklessidles | d8d33688709609025fe176f09fb2c1c75d65ea72 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Day 65: Tararuas II. Ridges, views, and friends"
description: made it above the tree line onto alpine ridges in the best weather possible. Rugged and tough tramping all around.
image: assets/images/20201215/img_0595.jpg
tags: teararoa
comments: true
---
Te Matawai Hut to Nichols Hut
13.9km ~ 7:42

People got up pretty early in the hut. So we did too.
The group of girls doing the Duke of Edinburgh is going out through Gable End (where we came from). They have an experienced hiker from the Makahika OPC as a shadow. They should have no problems going over that slip.

The day was fantastic.
You could see Mt Ruapehu and Taranaki pretty well.

Loving it.

We climbed up along the ridge to the top of Pukematawai.

Charlotte and I reached it first. It’s the first time I hiked with her (I only briefly met her a few times in town before). She’s like a mountain goat and goes pretty fast. Sally has made her super strong.

The sign at the top.

Then everyone else came too.

A nice valley with a river. It’s very far away though. Would’ve been nice for a swim. It is getting really hot.


We were really suffering from the sun. It felt so good to get into the shade when we dropped down into the forest again.

I really like the mossy forests in the Tararuas.

We stopped by the cute orange 2-bunk Dracophyllum Hut for lunch and for some of us a little lie down. Not long before the hut we met a NOBO TA hiker Renee. She has lots of tattoos and seems really cool. She told us about the conditions ahead. There were apparently some scrambly bits which should be fun.

A bug somehow got on Charlotte’s leg. I don’t know what it is.

We met a few trail runners while we were having our leisurely lunch. They started at 4am and are planning to run the entire north-south traverse over the main ridge line. It’s 80km. They are ridiculous.
Yup that’s where we’re going.


This looks like that photo on TA’s website.

Finally we got over the hill / knob (I don’t know what the difference is. I should really find out) where we could see Nichols Hut.

We thought there would only be Brian there. We could see a man outside the hut. I yelled out “Brian!” and more and more people came out the hut. Oops we definitely won’t all get a bunk in the hut.

Turned out there were 7 people there already! Plus the 5 of us coming in we are obviously over the limit of the 6-bunk hut. Tahlia and I decided to camp in our tents. There is a somewhat flat area nearby for tents that can only fit 2 tents (unless you have very small tents or bivvies).
Brian was there. So nice to see him. Also Jane and Scott were there too! What a surprise. They are doing the Tararuas section northbound with another couple of TA hikers Bill and Sarah. They are from England so Genny had a nice time chatting to them about home and whatever English people like to talk about. Mark and Tania are also there, who it turns out are TA hikers as well!
I tried to take a group photo in the tiny hut by putting the phone on Mark’s boot. It was kind of skewed and distorted (I tried using wide mode because I thought I wouldn’t catch everyone). With 12 people I thought I could share more of my money. But alas only Genny, Tahlia, and Brian took my offer. It’s still pretty good though. It’s about 25% left now.

Our tent site is quite cozy. But we’ve got a great view!
<div class="gallery" data-columns="2">
<img src="/assets/images/20201215/img_0627.jpg">
<img src="/assets/images/20201215/img_0643.jpg">
</div>
Brian came by to chat and to snack.

My dinner. Yum.

Charlotte also dropped by to show us some stretching routines. We talked about yoga moves and I showed her the “happy baby”. She likes that move.
<div class="gallery" data-columns="2">
<img src="/assets/images/20201215/img_0636.jpg">
<img src="/assets/images/20201215/img_0637.jpg">
<img src="/assets/images/20201215/img_0638.jpg">
<img src="/assets/images/20201215/img_0639.jpg">
</div>
Sunset out the tent ⛺️

Tomorrow: hopefully get to Parawai Lodge, which would be a huge day.
| Progress | KM |
| ---- |:----:|
| Trail | 662 + 12 |
| Road | 525 |
| Cycle | 61 |
| Water | 167 |
| Hitch | 135 |
| **Total** | **1562** |
| Side trips | KM |
| ---- |:----:|
| Blue Duck Falls | 9 |
-_Rick_ | 37.881944 | 374 | 0.745555 | eng_Latn | 0.993478 |
c182cf477491def070c1051b3c212aa337155225 | 57 | md | Markdown | README.md | tgrijalva/cdsp | 15a09992195d4a2a8c65c951965322e9b8bc3263 | [
"MIT"
] | null | null | null | README.md | tgrijalva/cdsp | 15a09992195d4a2a8c65c951965322e9b8bc3263 | [
"MIT"
] | null | null | null | README.md | tgrijalva/cdsp | 15a09992195d4a2a8c65c951965322e9b8bc3263 | [
"MIT"
] | null | null | null | # cDSP
DSP code for normal programmers
---
## Goertzel
| 8.142857 | 31 | 0.684211 | eng_Latn | 0.448854 |
c183657f96bde99ad3afb34200620059cc8c5061 | 666 | md | Markdown | keyboards/clueboard/17/readme.md | fzf/qmk_toolbox | 10d6b425bd24b45002555022baf16fb11254118b | [
"MIT"
] | null | null | null | keyboards/clueboard/17/readme.md | fzf/qmk_toolbox | 10d6b425bd24b45002555022baf16fb11254118b | [
"MIT"
] | null | null | null | keyboards/clueboard/17/readme.md | fzf/qmk_toolbox | 10d6b425bd24b45002555022baf16fb11254118b | [
"MIT"
] | null | null | null | # Clueboard 17% (Formerly Cluepad)

A basic 17 key numpad PCB.
* Keyboard Maintainer: [Zach White](https://github.com/skullydazed)
* Hardware Supported:
* Cluepad PCB 1.0
* Hardware Availability: [clueboard.co](https://clueboard.co/)
Make example for this keyboard (after setting up your build environment):
make clueboard/17:default
See the [build environment setup](https://docs.qmk.fm/#/getting_started_build_tools) and the [make instructions](https://docs.qmk.fm/#/getting_started_make_guide) for more information. Brand new to QMK? Start with our [Complete Newbs Guide](https://docs.qmk.fm/#/newbs).
| 39.176471 | 270 | 0.752252 | eng_Latn | 0.55869 |
c18384379d95cdd5e88733b521902c305d1c7569 | 4,624 | md | Markdown | biztalk/core/configuring-local-host-settings-edifact-interchange-settings.md | changeworld/biztalk-docs.zh-CN | 0ee8ca09b377aa26a13e0f200c75fca467cd519c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | biztalk/core/configuring-local-host-settings-edifact-interchange-settings.md | changeworld/biztalk-docs.zh-CN | 0ee8ca09b377aa26a13e0f200c75fca467cd519c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | biztalk/core/configuring-local-host-settings-edifact-interchange-settings.md | changeworld/biztalk-docs.zh-CN | 0ee8ca09b377aa26a13e0f200c75fca467cd519c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Configuring Local Host Settings (EDIFACT - Interchange Settings) | Microsoft Docs
ms.custom: ''
ms.date: 06/08/2017
ms.prod: biztalk-server
ms.reviewer: ''
ms.suite: ''
ms.tgt_pltfrm: ''
ms.topic: article
ms.assetid: f1cf8696-d1f4-49aa-aa0a-ecf66f55e01d
caps.latest.revision: 6
author: MandiOhlinger
ms.author: mandia
manager: anneta
ms.openlocfilehash: 894463d274afb886f2595492afc9372bf111cb29
ms.sourcegitcommit: 266308ec5c6a9d8d80ff298ee6051b4843c5d626
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 06/27/2018
ms.locfileid: "36965822"
---
# <a name="configuring-local-host-settings-edifact-interchange-settings"></a>Configuring local host settings (EDIFACT - interchange settings)
Local host settings control how EDI interchanges are processed. The settings on this page fall into two categories: receiver settings (for incoming interchanges) and sender settings (for outgoing interchanges). In the receiver settings, you can specify whether incoming batches are split into transaction sets or preserved. If preserved, you can specify whether BizTalk Server suspends the interchange or the transaction set when an error occurs. As part of the sender settings, you can specify how control numbers are generated for outgoing messages.
> [!IMPORTANT]
> The following properties on the **Party A -> Party B** one-way agreement tab will be disabled if you cleared the **Local BizTalk processes messages received by the party or supports sending messages from this party** check box for Party A:
>
> - All the properties under the **Sender's settings** section.
>
> Similarly, the following properties on the same page will be disabled on the **Party B -> Party A** tab if you had selected that check box when creating Party A:
>
> - All the properties under the **Receiver's settings** section.
## <a name="prerequisites"></a>Prerequisites
You must be logged on as a member of the [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] Administrators group or the [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] B2B Operators group.
### <a name="to-configure-local-host--receivers-settings"></a>To configure local host – receiver's settings
1. Create an EDIFACT-encoded agreement, as described in [Configuring General Settings (EDIFACT)](../core/configuring-general-settings-edifact.md). To update an existing agreement, right-click the agreement in the **Parties and Business Profiles** page, and then click **Properties**.
2. On a one-way agreement tab, under the **Interchange Settings** section, click **Local Host Settings**.
3. In the **EDIFACT Acknowledgement** section, enter values to specify the prefix, the range of reference numbers, and the suffix of the transaction set reference numbers to be used in acknowledgments.

    Click **Reset** to reset the current transaction set reference number to the lower limit. Select **Reset to lower limit when out of bounds** to have [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] reset the reference number to the lower range limit when it exceeds the maximum limit.
4. Clear **Route ACK to send pipeline on request-response receive port** to return acknowledgments through a separate send port. If this property remains selected, acknowledgments are returned on the send port associated with the two-way request-response receive port.
5. Clear **Mask security/authorization/password information** to disable masking of the security authorization/password information (the UNB6.1 and UNB6.2 fields) in the context properties, which is otherwise masked to prevent information disclosure.
> [!NOTE]
> When [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] receives a message, it promotes the UNB header into the message context. If the information is not masked, anyone with administrative privileges can obtain the security information from this context. To mask this information, [!INCLUDE[btsBizTalkServerNoVersion](../includes/btsbiztalkservernoversion-md.md)] replaces each character of the information with the **#** character. This is a one-way process: the **#** characters cannot be converted back to the actual characters.
6. In the **Inbound batch processing options** drop-down list, do the following:

    1. Select the default option, **Split Interchange as Transaction Sets - suspend Transaction Sets on Error**, to specify that BizTalk Server should parse each transaction set in the interchange into a separate XML document. BizTalk Server then applies the appropriate envelope to the transaction set and routes the transaction set documents to the MessageBox. With this option, if one or more transaction sets in the interchange fail validation, BizTalk Server suspends only those transaction sets.
    2. Select **Split Interchange as Transaction Sets - suspend Interchange on Error** to specify that BizTalk Server should parse each transaction set in the interchange into a separate XML document. BizTalk Server then applies the appropriate envelope to the transaction set and routes the transaction set documents to the MessageBox. With this option, if one or more transaction sets in the interchange fail validation, BizTalk Server suspends the entire interchange.
    3. Select **Preserve Interchange - suspend Interchange on Error** to specify that BizTalk Server should leave the interchange intact, creating an XML document for the entire batched interchange. With this option, if one or more transaction sets in the interchange fail validation, BizTalk Server suspends the entire interchange.
    4. Select **Preserve Interchange - suspend Transaction Sets on Error** to specify that BizTalk Server should leave the interchange intact, creating an XML document for the entire batched interchange. With this option, if one or more transaction sets in the interchange fail validation, BizTalk Server suspends those transaction sets while continuing to process all the other transaction sets.
> [!NOTE]
> If you select **Preserve Interchange - suspend Interchange on Error** or **Preserve Interchange - suspend Transaction Sets on Error**, the property settings in the **ISA Segment Definition** and **GS and ST Segment Definition** pages (which determine how BizTalk Server creates the ISA, GS, and ST headers of an outgoing interchange) do not apply. The interchange, group, and transaction set headers present in the preserved interchange are retained when the send pipeline performs its pre-send processing on the interchange.
### <a name="to-configure-local-host--senders-settings"></a>To configure local host – sender's settings
1. For **Interchange control number (UNB5)**, enter a range of values for BizTalk Server to use when generating the interchange control numbers of outgoing interchanges. Enter a numeric value with a minimum of 1 and a maximum of 999999999. This field is required.
> [!NOTE]
> The first field (**UNB5.1**) is the prefix; the second and third fields (**UNB5.2**) contain the range of numbers to use as the interchange control number; and the fourth field (**UNB5.3**) is the suffix. The prefix and suffix are optional; the control number is required. The control number is incremented for each new message; the prefix and suffix remain unchanged. The control number can have up to 14 characters, the prefix and suffix up to 13 characters each, and the three fields together up to 14 characters.
>
> To reset the control number to the specified minimum value, click the **Reset** button. Check **Reset to lower limit when out of bounds** to automatically reset the control number to the minimum value when the maximum value is exceeded.
2. For **Group control number (UNG5)**, enter the prefix, the range of reference numbers, and the suffix that BizTalk Server should use for the group control number of the first interchange it sends.
> [!NOTE]
> The first field (**UNG5.1**) is the prefix; the second and third fields (**UNG5.2**) contain the range of numbers to use for the group control number; and the fourth field (**UNG5.3**) is the suffix. The prefix and suffix are optional; the control number is required. The control number is incremented for each new message until the maximum value is reached; the prefix and suffix remain unchanged. Only digits are allowed in **UNG5.2**. The control number can have up to 14 characters, the prefix and suffix up to 13 characters each, and the three fields together up to 14 characters.
>
> To reset the group control number to the specified minimum value, click the **Reset** button. Check **Reset to lower limit when out of bounds** to automatically reset the control number to the minimum value when the maximum value is exceeded.
3. For **Message header (UNH)**, click **Apply new ID**, and then enter the prefix, the range of reference numbers, and the suffix that BizTalk Server should use for the transaction set reference numbers.
> [!NOTE]
> The first field (**UNH1.1**) is the prefix; the second and third fields (**UNH1.2**) are the range of reference numbers; and the fourth field (**UNH1.3**) is the suffix. The prefix and suffix are optional; the reference number is required. The reference number is incremented for each new message; the prefix and suffix remain unchanged. The default range for the reference numbers is 1 to 99999999999999. Only digits are allowed in **UNH1.2**. The control number can have up to 14 characters, the prefix and suffix up to 13 characters each, and the three fields together up to 14 characters.
>
> To reset the current transaction set control number to the minimum value, click **Reset**. Select **Reset to lower limit when out of bounds** to reset the control number to the minimum value when the maximum value is exceeded.
4. Click **Apply** to accept the changes before continuing with configuration, or click **OK** to validate the changes and close the dialog box.
## <a name="see-also"></a>See Also
[Configuring Interchange Settings (EDIFACT)](../core/configuring-interchange-settings-edifact.md)
c183d3bfc41929c42427e181daadee668723a2fa | 1,485 | md | Markdown | catalog/koisuru-otome-no-tsukurikata/en-US_koisuru-otome-no-tsukurikata.md | htron-dev/baka-db | cb6e907a5c53113275da271631698cd3b35c9589 | [
"MIT"
] | 3 | 2021-08-12T20:02:29.000Z | 2021-09-05T05:03:32.000Z | catalog/koisuru-otome-no-tsukurikata/en-US_koisuru-otome-no-tsukurikata.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 8 | 2021-07-20T00:44:48.000Z | 2021-09-22T18:44:04.000Z | catalog/koisuru-otome-no-tsukurikata/en-US_koisuru-otome-no-tsukurikata.md | zzhenryquezz/baka-db | da8f54a87191a53a7fca54b0775b3c00f99d2531 | [
"MIT"
] | 2 | 2021-07-19T01:38:25.000Z | 2021-07-29T08:10:29.000Z | # Koisuru (Otome) no Tsukurikata

- **type**: manga
- **original-name**: 恋する(おとめ)の作り方
- **start-date**: 2020-02-21
## Tags
- comedy
- shounen-ai
- gender-bender
## Authors
- Banjou
- Azusa (Story & Art)
## Sinopse
Kenshirou Midou is a popular student who harbors a secret: despite being a boy, he has always loved the art of cosmetics. Because of his efforts to keep this low profile, only his sisters and Hiura Mihate, a childhood friend, know his hidden obsession. Apart from his siblings, Hiura is Kenshirou's only hope at realizing his potential.
Kenshirou begs Hiura to let him practice applying makeup on him, and when Hiura finally succumbs to his request, the drastic transformation shocks Kenshirou. From the appearance of a dull, slender boy to a charming petite girl, Hiura's new look evokes something in Kenshirou.
On the other hand, Hiura has had feelings for Kenshirou long before his dramatic change. Often watching from the sidelines, Hiura thinks that a romantic relationship with him is beyond his reach. But with the revelation of Kenshirou's unusual hobby, Hiura remains optimistic and decides to devote himself to his friend's passion. As their affection for one another intensifies, the pair slowly develops an unbreakable bond.
[Source My Anime List]
## Links
- [My Anime list](https://myanimelist.net/manga/125746/Koisuru_Otome_no_Tsukurikata)
| 45 | 423 | 0.767003 | eng_Latn | 0.995533 |
c1842bd197ae0c332e405e5d41351ae01995bdad | 508 | md | Markdown | manuscript/chapter 14 - robust error handling.md | the-gigi/modern-go-programming | 33b35d9c9c50e0fcbf2330d8fa495c3671e9ef42 | [
"MIT"
] | 8 | 2018-07-04T15:12:50.000Z | 2021-10-30T17:47:15.000Z | manuscript/chapter 14 - robust error handling.md | the-gigi/modern-go-programming | 33b35d9c9c50e0fcbf2330d8fa495c3671e9ef42 | [
"MIT"
] | null | null | null | manuscript/chapter 14 - robust error handling.md | the-gigi/modern-go-programming | 33b35d9c9c50e0fcbf2330d8fa495c3671e9ef42 | [
"MIT"
] | 3 | 2018-06-12T16:48:05.000Z | 2020-07-09T05:34:34.000Z | # Robust Error Handling
## Multiple return values
- Named return values
## The Error interface
## No exceptions, except...
- Panic
- Defer
- Recover
## Concurrent Error Handling
- Error channel
## Checking for errors
- Always assign the return values of a function
- Use underscores to explicitly ignore irrelevant values
## Advanced Error Handling with Merry
- Stack traces FTW
- Wrapping and Unwrapping
- Using pre-defined errors
## Is it possible to emulate exception handling with Go?
| 19.538462 | 56 | 0.740157 | eng_Latn | 0.875836 |
c185872084fb4483969f2dbe8b267ece846cdd09 | 5,549 | md | Markdown | docs/odbc/microsoft/text-file-format-text-file-driver.md | Sticcia/sql-docs.it-it | 31c0db26a4a5b25b7c9f60d4ef0a9c59890f721e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/odbc/microsoft/text-file-format-text-file-driver.md | Sticcia/sql-docs.it-it | 31c0db26a4a5b25b7c9f60d4ef0a9c59890f721e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/odbc/microsoft/text-file-format-text-file-driver.md | Sticcia/sql-docs.it-it | 31c0db26a4a5b25b7c9f60d4ef0a9c59890f721e | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Text File Format (Text File Driver) | Microsoft Docs
ms.custom: ''
ms.date: 01/19/2017
ms.prod: sql
ms.prod_service: connectivity
ms.reviewer: ''
ms.technology: connectivity
ms.topic: conceptual
helpviewer_keywords:
- delimited text lines
- fixed-width text files
- text format [ODBC]
- text file driver [ODBC], text format
ms.assetid: f53cd4b5-0721-4562-a90f-4c55e6030cb9
author: MightyPen
ms.author: genemi
manager: craigg
ms.openlocfilehash: cd2bc95e6fe5468e88fc61dd8ed4adcd985ec052
ms.sourcegitcommit: f7fced330b64d6616aeb8766747295807c92dd41
ms.translationtype: MT
ms.contentlocale: it-IT
ms.lasthandoff: 04/23/2019
ms.locfileid: "62633016"
---
# <a name="text-file-format-text-file-driver"></a>Text file format (Text File Driver)
The Text ODBC driver supports both fixed-width and delimited text files. A text file consists of an optional header line and zero or more lines of text.
Although the header line uses the same format as the other lines in the text file, the Text ODBC driver interprets the header line entries as column names, not as data.
A delimited text line contains one or more data values separated by delimiters: commas, tabs, or a custom delimiter. The same delimiter must be used throughout the file. Null values are marked by two delimiters in a row with no data between them. Character strings in a comma-delimited text line can be enclosed in double quotation marks (""). Blanks cannot occur before or after comma-delimited values.
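As a concrete illustration (the column names and values below are hypothetical, not taken from the driver documentation), the following comma-delimited file has a header line, a quoted string containing an embedded comma, and a null value on the last line, marked by two adjacent delimiters:

```
LastName,FirstName,HireDate
"Davolio, Jr.",Nancy,1992-05-01
Fuller,,1992-08-14
```

The header entries become the column names, the quotation marks protect the embedded comma in `"Davolio, Jr."`, and the empty `FirstName` field on the last line is read as null.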
The width of each data entry in a fixed-width text line is specified in a schema. Null values are indicated by blanks.
Tables are limited to a maximum of 255 fields. Field names are limited to 64 characters, and field widths are limited to 32,766 characters. Records are limited to 65,000 bytes.
A text file can be opened only by a single user. Multiple users are not supported.
The following grammar, written for programmers, defines the format of a text file that can be read by the Text ODBC driver:
|Format|Representation|
|------------|--------------------|
|Nonitalic|Characters that must be entered as shown|
|*italics*|Arguments defined elsewhere in the grammar|
|brackets ([])|Optional items|
|braces ({})|A list of mutually exclusive choices|
|vertical bars (\|)|Separators for mutually exclusive options|
|ellipses (...)|Items that can be repeated one or more times|
The format of a text file is:
```
text-file ::=
[delimited-header-line] [delimited-text-line]... end-of-file |
[fixed-width-header-line] [fixed-width-text-line]... end-of-file
delimited-header-line ::= delimited-text-line
delimited-text-line ::=
blank-line |
delimited-data [delimiter delimited-data]... end-of-line
fixed-width-header-line ::= fixed-width-text-line
fixed-width-text-line ::=
blank-line |
fixed-width-data [fixed-width-data]... end-of-line
end-of-file ::= <EOF>
blank-line ::= end-of-line
delimited-data ::= delimited-string | number | date | delimited-null
fixed-width-data ::= fixed-width-string | number | date | fixed-width-null
```
> [!NOTE]
> The width of each column in a fixed-width text file is specified in the .ini file.
```
end-of-line ::= <CR> | <LF> | <CR><LF>
delimited-string ::= unquoted-string | quoted-string
unquoted-string ::= [character | digit] [character | digit | quote-character]...
quoted-string ::=
     quote-character
     [character | digit | delimiter | end-of-line | embedded-quoted-string]...
     quote-character
embedded-quoted-string ::= quote-character quote-character
     [character | digit | delimiter | end-of-line]
     quote-character quote-character
fixed-width-string ::= [character | digit | delimiter | quote-character] ...
character ::= any character except:
     delimiter
     digit
     end-of-file
     end-of-line
     quote-character
digit ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
delimiter ::= , | <TAB> | custom-delimiter
custom-delimiter ::= any character except:
     end-of-file
     end-of-line
     quote-character
```
> [!NOTE]
> The delimiter in a custom-delimited text file is specified in the .ini file.
```
quote-character ::= "
number ::= exact-number | approximate-number
exact-number ::= [+ | -] {unsigned-integer[.unsigned-integer] |
unsigned-integer. |
.unsigned-integer}
approximate-number ::= exact-number{e | E}[+ | -]unsigned-integer
unsigned-integer ::= {digit}...
date ::=
mm date-separator dd date-separator yy |
mmm date-separator dd date-separator yy |
dd date-separator mmm date-separator yy |
yyyy date-separator mm date-separator dd |
yyyy date-separator mmm date-separator dd
mm ::= digit [digit]
dd ::= digit [digit]
yy ::= digit digit
yyyy ::= digit digit digit digit
mmm ::= Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec
date-separator ::= - | / | .
delimited-null ::=
```
> [!NOTE]
> For delimited files, a null value is represented by no data between two delimiters.
```
fixed-width-null ::= <SPACE>...
```
> [!NOTE]
> For fixed-width files, a null value is represented by blanks.
| 44.392 | 494 | 0.705893 | ita_Latn | 0.954002 |
c185c6b9d5699d14fc3c7c5651520e58a4c3840d | 1,444 | md | Markdown | README.md | chagaspac/jseller | 6b3ffc8b038b2a1f7103edf82c8e1ce0417e413c | [
"Apache-2.0"
] | null | null | null | README.md | chagaspac/jseller | 6b3ffc8b038b2a1f7103edf82c8e1ce0417e413c | [
"Apache-2.0"
] | null | null | null | README.md | chagaspac/jseller | 6b3ffc8b038b2a1f7103edf82c8e1ce0417e413c | [
"Apache-2.0"
] | null | null | null | # JSeller
This java project is a tool for extract easilier data from bseller/kanlo ecommerce API.
## Dependencies
This JAR depends on Commons IO 2.4 (http://archive.apache.org/dist/commons/io/binaries/)
After import the JAR into your project, you can instantiate the main bseller reader as follow:
```
BSellerReader bsr = new BSellerReader("USERNAME", "PASSWORD", "BASE_ECOMMERCE_URL");
```
## Customers
Instantiate BSellerReader and call getCustomers
### Parameters for getCustomers
- int max: the maximum number of customers to return. If less than or equal to 0, returns **ALL** customers.
- Boolean optin: if true, returns only the customers who checked the box allowing emails to be sent to their personal email. If null, gets **ALL** customers, no matter the optin value (true/false).
### Code example
```
try {
List<Customer> l = bsr.getCustomers(201, true);
...
} catch (TopLevelException e) {
...
}
```
## Orders
Instantiate BSellerReader and call getOrders
### Parameters for getOrders
- String start: Date in format "yyyy-MM-dd"
- String end: Date in format "yyyy-MM-dd"
- int version: version of your request **(highly recommended 14)**
- OrderStatus orderStatus: enum (available inside of this jar) that tells the status of orders that you want to get **ALL** customers
### Code example
```
try {
	List<Order> l = bsr.getOrders("2016-04-01", "2016-04-30", 14, OrderStatus.PAYMENT_APPROVED);
...
} catch (TopLevelException e) {
...
}
```
| 29.469388 | 188 | 0.729224 | eng_Latn | 0.910564 |
c186b69ad7a5cf659d63cee8870191fa77b387c3 | 2,025 | md | Markdown | README.md | BellaDati/belladati-sdk-api | 9742d92dddabd40f7ba00799463ff13a51300d25 | [
"Apache-2.0"
] | 1 | 2015-10-09T09:18:16.000Z | 2015-10-09T09:18:16.000Z | README.md | BellaDati/belladati-sdk-api | 9742d92dddabd40f7ba00799463ff13a51300d25 | [
"Apache-2.0"
] | 1 | 2021-12-09T20:17:07.000Z | 2021-12-09T20:17:07.000Z | README.md | BellaDati/belladati-sdk-api | 9742d92dddabd40f7ba00799463ff13a51300d25 | [
"Apache-2.0"
] | null | null | null | # BellaDati SDK Interface Definitions
This repository contains the interface definitions of the BellaDati SDK.
Related repositories are [belladati-sdk-java](https://github.com/BellaDati/belladati-sdk-java/) and [belladati-sdk-android](https://github.com/BellaDati/belladati-sdk-android/), containing the API implementations for standard Java and Android.
## Usage
The compiled SDK libraries are available from BellaDati: [API](http://api.belladati.com/sdk/0.9/sdk-api-0.9.0.jar), [API Javadoc](http://api.belladati.com/sdk/0.9/sdk-api-0.9.0-javadoc.jar), implementations for [standard Java](http://api.belladati.com/sdk/0.9/sdk-java-0.9.0.jar) and [Android](http://api.belladati.com/sdk/0.9/sdk-android-0.9.0.jar). To use the SDK, the API library and one of its implementations are required.
For setup instructions, dependencies, and example usage, please refer to the [BellaDati SDK documentation](http://support.belladati.com/techdoc/Java+SDK) or view the [SDK Javadoc](http://api.belladati.com/sdk/0.9/javadoc/).
## Build Instructions
A Java 6 JDK and [Apache Maven](http://maven.apache.org/) are required to build the BellaDati SDK. Maven is included in most Eclipse for Java distributions.
To prepare building the SDK, clone [this repository](https://github.com/BellaDati/belladati-sdk-api), [belladati-sdk-java](https://github.com/BellaDati/belladati-sdk-java/) and [belladati-sdk-android](https://github.com/BellaDati/belladati-sdk-android/). If you don't want to build everything, it's enough to just clone the repository you want to build.
You will need [GnuPG and a signing key](https://docs.sonatype.org/display/Repository/How+To+Generate+PGP+Signatures+With+Maven) in order to build signed jars. If you're fine with unsigned jars, you can go to each project's `pom.xml` and remove the plugin setup for `maven-gpg-plugin`.
When you're ready, call `mvn install` to build each project. Maven will create a `target` directory for each repository, containing the project's jar file and other build artifacts.
| 92.045455 | 426 | 0.774321 | eng_Latn | 0.771811 |