# Getting started
## Setting up Rust
We recommend using [rustup][ru] to easily install the latest stable version of Rust.
Instructions should be on the screen once rustup is downloaded.
[ru]: https://rustup.rs
> **Updating Rust:** If you already have Rust installed, make sure you're using the
latest version by running `rustup update`.
We recommend using the stable version of Rust, as Rust nightlies tend to break rather
often.
> **Using the stable toolchain:** Rustup can be configured to default to the stable
toolchain by running `rustup default stable`.
## Required dependencies
Please check the dependencies section of the
[README.md](https://github.com/amethyst/amethyst/blob/master/README.md#dependencies)
for details on what dependencies are required for compiling Amethyst.
Please note that you need to have a functional graphics driver installed.
If you get a panic about the renderer unable to create the rendering context
when trying to run an example, a faulty driver installation could be the issue.
## Setting up an Amethyst Project
You can either use our Starter Projects or do it the manual way.
### Creating a project the manual way
* Add `amethyst` as a dependency in your `Cargo.toml`.
* Create a `config` folder and put a `display.ron` in it.
* (Optional) Copy the code from one of Amethyst's examples.
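For the `display.ron` mentioned above, a minimal window configuration might look like the following sketch. The field names here are assumptions based on Amethyst's `DisplayConfig`; check the examples shipped with the version you depend on.

```ron,ignore
// config/display.ron — a minimal window configuration.
// Field names are a sketch against DisplayConfig; verify them
// for your Amethyst version.
(
    title: "My Game",
    dimensions: Some((800, 600)),
)
```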
### Starter Project
If you want to get running as quickly as possible and start playing around with Amethyst, you can also use a starter project. These are specifically made for certain types of games and will set you up with the groundwork needed to start right away.
The `README.md` file on these will include everything you need to know to run the starter project.
> **Note:** Right now, the only starter available is for 2D games. This will expand over time, and offer more options for different types of games.
* [2D Starter](https://github.com/amethyst/amethyst-starter-2d)
### Important note on versioning
Amethyst is divided into two major versions:
* The released crates.io version, which is the latest version available on crates.io
* The git (master) version, which is the current unreleased development snapshot of Amethyst available on [Github][agit]
> **Note:** You can see which version you're currently looking at by checking the URL
in your browser. The book/documentation for `master` contains "master" in the address,
while the crates.io version is called "stable".
Whichever book version you choose to read, make sure that the Amethyst version in your `Cargo.toml` matches it.
For the released crates.io version, you should have something like this:
```rust,ignore
[dependencies]
amethyst = "LATEST_CRATES.IO_VERSION"
```
The latest crates.io version can be found [here](https://crates.io/crates/amethyst).
If you want to use the latest unreleased changes, your Cargo.toml file should look like this:
```rust,ignore
[dependencies]
amethyst = { git = "https://github.com/amethyst/amethyst", rev = "COMMIT_HASH" }
```
The commit hash part is optional. It indicates which specific commit your project uses, to prevent unexpected breakage when we make changes to the git version.
[agit]: https://github.com/amethyst/amethyst
[cl]: https://github.com/amethyst/tools
---
title: Locate
subtitle: BFA Thesis
layout: default
modal-id: 1
date: 2016-03-28
img: Thesis_01.png
img-folder: 10_Locate
thumbnail: 10_Locate Navigating Homelessness.jpg
alt: BFA Thesis Project
project-date: April 2014
description:
---
---
title: C28163
ms.date: 11/04/2016
ms.topic: reference
f1_keywords:
- C28163
helpviewer_keywords:
- C28163
ms.assetid: 24fecbde-1c96-4a45-82f7-9f47cfc0ef11
author: mikeblome
ms.author: mblome
manager: markl
ms.workload:
- multiple
ms.openlocfilehash: 3e9bd974b38647cd330c55df0b8d0063c77bff52
ms.sourcegitcommit: 485ffaedb1ade71490f11cf05962add1718945cc
ms.translationtype: MT
ms.contentlocale: ja-JP
ms.lasthandoff: 10/16/2019
ms.locfileid: "72434563"
---
# <a name="c28163"></a>C28163
Warning C28163: The function should not be called from within a try/except block

This warning is reported when a function whose type cannot be enclosed in a `try/except` block is found inside a `try/except` block. The Code Analysis tool found at least one path on which the function is inside a `try/except` block.
# 1.0.0
* Require aws-sdk > 3.0 so that we can specify a smaller subset of gems
* Remove support for ActiveSupport < 5.1
* Remove support for Ruby < 2.6
# 0.0.1
* Initial release

---
layout: post
title: "Structuring Machine Learning Projects: Week 1 Notes"
subtitle: "ML Strategy(1)"
date: 2017-11-20 16:00:00
author: "baiyf"
header-img: "img/post/coursera-bg.jpg"
header-mask: 0.3
catalog: true
tags:
- 公开课
---
# ML Strategy(1)
## Orthogonalization
Orthogonalization means keeping hyperparameter adjustments independent, so that tuning one knob does not affect the others.

The basic requirements for an ML system:

1. The model fits the training set well (bigger network)
2. The model fits the dev set well (regularization, bigger training set)
3. The model fits the test set well (bigger dev set)
4. The model performs well in the real world (change the dev set or the cost function)

Each requirement has its own corresponding, orthogonal hyperparameter adjustments.
## Single number evaluation metric

* Precision (P): of the images the model labels as belonging to a class, the fraction that actually belong to it
* Recall (R): the fraction of all images in a class that the model correctly identifies
* F1 Score: $$\frac{2}{\frac1P+\frac1R}$$, the harmonic mean of precision and recall, which combines both into a single number evaluation metric

In a system's performance tuning, usually only one metric needs continual optimization (the optimizing metric); the other N-1 metrics only have to meet a set threshold (satisficing metrics).
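The F1 formula above is just the harmonic mean of precision and recall, which a few lines of Python make concrete:

```python
# F1 is the harmonic mean of precision (P) and recall (R):
# F1 = 2 / (1/P + 1/R) = 2*P*R / (P + R)
def f1_score(p: float, r: float) -> float:
    if p == 0 or r == 0:
        return 0.0
    return 2 / (1 / p + 1 / r)

# A classifier with high precision but low recall is pulled down toward
# the smaller of the two values:
print(f1_score(0.98, 0.40))
```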
## Train/Dev/Test set
The dev set and test set should come from the same distribution.
## Comparing with human-level performance

As long as ML performance is below human-level performance, you can:

* Collect more labeled data
* Analyze why humans perform better
* Analyze bias and variance

Avoidable bias is the gap between the actual training error and the Bayes error; a nonzero gap indicates that the model still has room for optimization.

Whether avoidable bias or (dev error - training error) is larger determines which one has more room for improvement.
### DNS
DNS Crypt for secure DNS Lookup
- https://dnscrypt.org/
### Check where leaked
- www.dnsleaktest.com
Information: http://daily.zhihu.com/story/4745004
### Comodo Secure DNS
- 8.26.56.26
- 8.20.247.20
### Google DNS
- 8.8.8.8
- 8.8.4.4
### Open DNS
- 208.67.222.222
- 208.67.220.220
### China Telecom DNS
- 202.106.0.20
- 202.106.196.115
### AliDNS (Alibaba)
- 223.5.5.5
- 223.6.6.6
# `react-native-app-helpers/createRequiredEmailInputComponent`
Creates a new input component pre-configured as a required email input.
## Usage
```tsx
import { createRequiredEmailInputComponent } from "react-native-app-helpers";
const ExampleInput = createRequiredEmailInputComponent(
{
fontFamily: `Example Font Family`,
fontSize: 37,
paddingVertical: 12,
paddingHorizontal: 29,
blurredValid: {
textColor: `#FFEE00`,
placeholderColor: `#E7AA32`,
backgroundColor: `#32AE12`,
radius: 5,
border: {
width: 4,
color: `#FF00FF`,
},
iconColor: `#43AE21`,
},
blurredInvalid: {
textColor: `#99FE88`,
placeholderColor: `#CACA3A`,
backgroundColor: `#259284`,
radius: 10,
border: {
width: 6,
color: `#9A9A8E`,
},
iconColor: `#985E00`,
},
focusedValid: {
textColor: `#55EA13`,
placeholderColor: `#273346`,
backgroundColor: `#CABA99`,
radius: 3,
border: {
width: 5,
color: `#646464`,
},
iconColor: `#789521`,
},
focusedInvalid: {
textColor: `#ABAADE`,
placeholderColor: `#47ADAD`,
backgroundColor: `#32AA88`,
radius: 47,
border: {
width: 12,
color: `#98ADAA`,
},
iconColor: `#449438`,
},
disabledValid: {
textColor: `#AE2195`,
placeholderColor: `#FFAAEE`,
backgroundColor: `#772728`,
radius: 100,
border: {
width: 14,
color: `#5E5E5E`,
},
iconColor: `#ADAADA`,
},
disabledInvalid: {
textColor: `#340297`,
placeholderColor: `#233832`,
backgroundColor: `#938837`,
radius: 2,
border: {
width: 19,
color: `#573829`,
},
iconColor: `#709709`,
},
},
<Text>Shown to the left</Text>,
<Text>Shown to the right</Text>,
null,
-14,
null,
3,
);
const ExampleScreen = () => {
// Useful for realtime submit button updates.
const [incompleteValue, setIncompleteValue] = React.useState<undefined | string>(undefined);
// Useful for persistence.
const [completeValue, setCompleteValue] = React.useState<undefined | string>(undefined);
return (
<React.Fragment>
<ExampleInput
value={incompleteValue}
onChange={(value, complete) => {
if (complete) {
setCompleteValue(value);
} else {
setIncompleteValue(value);
}
}}
disabled={false}
placeholder="Shown when no address has been entered"
unique={[`Not`, `In`, `This`, `List`]}
/>
      <Text>Incomplete: {incompleteValue}</Text>
      <Text>Complete: {completeValue}</Text>
</React.Fragment>
);
}
```
---
title: Integrate with LinkedIn Talent Hub
description: This topic explains how to set up integration between Microsoft Dynamics 365 Human Resources and LinkedIn Talent Hub.
author: jaredha
ms.date: 10/20/2020
ms.topic: article
ms.prod: ''
ms.technology: ''
ms.search.form: ''
audience: Application User
ms.search.scope: Human Resources
ms.custom: 7521
ms.assetid: ''
ms.search.region: Global
ms.author: anbichse
ms.search.validFrom: 2020-10-20
ms.dyn365.ops.version: Human Resources
ms.openlocfilehash: fb75c391809f1ce5c7d48728a735f347ef1784ed
ms.sourcegitcommit: 696796ca5635863850ae9ef16fc1fb0fc46ce8f0
ms.translationtype: HT
ms.contentlocale: is-IS
ms.lasthandoff: 08/28/2021
ms.locfileid: "7441266"
---
# <a name="integrate-with-linkedin-talent-hub"></a>Integrate with LinkedIn Talent Hub
[!include [Applies to Human Resources](../includes/applies-to-hr.md)]
> [!IMPORTANT]
> The integration between Dynamics 365 Human Resources and LinkedIn Talent Hub that's described in this topic will be retired on December 31, 2021. The integration service will no longer be available after that date. Organizations that aren't already using the integration service won't be able to implement it before it's retired.

[LinkedIn Talent Hub](https://business.linkedin.com/talent-solutions/talent-hub) is an applicant tracking system (ATS) platform. It lets you source, manage, and hire candidates, all in one place. By integrating Microsoft Dynamics 365 Human Resources with LinkedIn Talent Hub, you can easily create employee records in Human Resources for candidates who have been hired for a position.
## <a name="setup"></a>Setup

A system administrator must complete some setup tasks to enable the integration with LinkedIn Talent Hub. First, a user and a security role must be set up in the Power Apps environment to give LinkedIn Talent Hub the appropriate permissions to write data into Human Resources.
### <a name="link-your-environment-to-linkedin-talent-hub"></a>Link your environment to LinkedIn Talent Hub

1. Open [LinkedIn Talent Hub](https://business.linkedin.com/talent-solutions/talent-hub).
2. On the user drop-down menu, select **Product settings**.
3. In the navigation pane on the left, in the **Advanced** section, select **Integrations**.
4. Select **Authorize** for the Microsoft Dynamics 365 Human Resources integration.
5. On the **Dynamics 365 Human Resources** page, select the environment that you want to link LinkedIn Talent Hub to, and then select **Link**.

![Selecting the environment to link](media/hr-admin-integration-linkedin-select-environment.jpg)

> [!NOTE]
> You can link only to environments where your user account has administrator access to both the Human Resources environment and the associated Power Apps environment. If no environments are listed on the Human Resources page, make sure that you have provisioned Human Resources environments in the tenant, and that the user you signed in to the link page with has administrator permissions for both the Human Resources environment and the Power Apps environment.
### <a name="create-a-power-apps-security-role"></a>Create a Power Apps security role

1. Open the [Power Platform admin center](https://admin.powerplatform.microsoft.com).
2. In the **Environments** list, select the environment that's associated with the Human Resources environment you want to connect to your LinkedIn Talent Hub instance.
3. Select **Settings**.
4. Expand the **Users + permissions** node, and select **Security roles**.
5. On the **Security roles** page, on the toolbar, select **New role**.
6. On the **Details** tab, enter a name for the role, such as **LinkedIn Talent Hub HRIS Integration**.
7. On the **Customization** tab, select the organization-level **Read** permission for the following entities:

    - Entity
    - Field
    - Relationship

8. Save and close the security role.
### <a name="create-a-power-apps-application-user"></a>Create a Power Apps application user

An application user must be created for the LinkedIn Talent Hub connector, to give the connector permissions to write candidate records into the Power Apps environment.

1. Open the [Power Platform admin center](https://admin.powerplatform.microsoft.com).
2. In the **Environments** list, select the environment that's associated with the Human Resources environment you want to connect to your LinkedIn Talent Hub instance.
3. Select **Settings**.
4. Expand the **Users + permissions** node, and select **Users**.
5. Select **Manage users in Dynamics 365**.
6. Use the drop-down menu above the list to change the view from the default **Enabled Users** view to **Application Users**.

    ![Selecting the Application Users view](media/hr-admin-integration-linkedin-application-users.jpg)

7. On the toolbar, select **New**.
8. On the **New User** page, follow these steps:

    1. Change the value of the **User Type** field to **Application User**.
    2. Set the **User Name** field to **Dynamics365 HR LinkedIn HRIS Integration**.
    3. Set the **Application ID** field to **3a225c96-d62a-44ce-b3ec-bd4e8e9befef**.
    4. Enter a value in the **First Name**, **Last Name**, and **Primary Email** fields.
    5. On the toolbar, select **Save \& Close**.
### <a name="assign-a-security-role-to-the-new-user"></a>Assign a security role to the new user

After you save and close the new application user in the previous section, you're returned to the **User list** page.

1. On the **User list** page, change the view to **Application Users**.
2. Select the application user that you created in the previous section.
3. On the toolbar, select **Manage roles**.
4. Select the security role that you created earlier in the integration.
5. Select **OK**.
### <a name="add-an-azure-active-directory-app-in-human-resources"></a>Add an Azure Active Directory app in Human Resources

1. In Dynamics 365 Human Resources, open the **Azure Active Directory applications** page.
2. Add a new record to the list, and set the following fields:

    - **Client Id**: Enter **3a225c96-d62a-44ce-b3ec-bd4e8e9befef**.
    - **Name**: Enter the name of the Power Apps security role that you created earlier, such as **LinkedIn Talent Hub HRIS Integration**.
    - **User ID**: Select a user that has permissions to write data into Human Resources.
### <a name="create-the-table-in-dataverse"></a>Create the table in Dataverse

> [!IMPORTANT]
> The integration with LinkedIn Talent Hub depends on Dataverse virtual tables for Human Resources. As a prerequisite for this step of the setup, you must configure virtual tables. For information about how to configure virtual tables, see [Configure Dataverse virtual tables](./hr-admin-integration-common-data-service-virtual-entities.md).

1. In Human Resources, open the **Dataverse integration** page.
2. Select the **Virtual tables** tab.
3. Filter the entity list by entity label to find **LinkedIn Exported Candidate**.
4. Select the entity, and then select **Generate/refresh**.
## <a name="exporting-candidate-records"></a>Exporting candidate records

After setup is completed, recruiters and HR staff can use the **Export to HRIS** functionality in LinkedIn Talent Hub to export the records of hired candidates from LinkedIn Talent Hub to Human Resources.

### <a name="export-records-from-linkedin-talent-hub"></a>Export records from LinkedIn Talent Hub

After a candidate has gone through the hiring process and been hired, you can export the candidate's record from LinkedIn Talent Hub to Human Resources.

1. In LinkedIn Talent Hub, open the project that you hired the new employee for.
2. Select the candidate's record.
3. Select **Change stage**, and then select **Hired**.
4. On the ellipsis (**...**) for the candidate, select **Export to HRIS**.
5. In the **Export to HRIS** pane, enter the information to export:

    - In the **HRIS provider** field, select **Microsoft Dynamics 365 Human Resources**.
    - In the **Start date** field, select a value for the new employee.
    - In the **Job title** field, enter a title for the new employee's job.
    - In the **Location** field, enter the location where the employee will work.
    - Enter or confirm the employee's email address.

    ![Exporting a candidate to HRIS](media/hr-admin-integration-linkedin-export-to-hris.jpg)
## <a name="complete-onboarding-in-human-resources"></a>Complete onboarding in Human Resources

Candidate records that are exported from LinkedIn Talent Hub to Human Resources appear in the **Candidates to hire** section of the **Personnel management** page.

1. In Human Resources, open the **Personnel management** page.
2. In the **Candidates to hire** section, select **Hire** for the selected candidate.
3. In the **Hire new worker** dialog box, review the record, and add any required information. You can also select the position number that the candidate has been hired for.

After you've entered the required information, you can continue with your usual processes for creating employee records and onboarding employees.

The following information is imported and included in the new employee record:

- First name
- Last name
- Employment start date
- Email address
- Phone number

## <a name="see-also"></a>See also

[Configure Dataverse virtual tables](./hr-admin-integration-common-data-service-virtual-entities.md)<br>
[What is Microsoft Dataverse?](/powerapps/maker/common-data-service/data-platform-intro)

[!INCLUDE[footer-include](../includes/footer-banner.md)]
---
page_type: sample
products:
- office-sp
languages:
- javascript
- typescript
extensions:
contentType: samples
technologies:
- SharePoint Framework
- React
createdDate: 04/30/2020 12:00:00 AM
---
# React Pages Hierarchy
## Summary
This web part allows users to create a faux page hierarchy in their pages library and use it for page-to-page navigation. It will ask you to create a page parent property on first use which is then used by the web part to either show a breadcrumb of the current pages ancestors or buttons for the pages children.

## Used SharePoint Framework Version

## Applies to
* [SharePoint Framework](https://dev.office.com/sharepoint)
* [Office 365 Developer Tenant](https://dev.office.com/sharepoint/docs/spfx/set-up-your-development-environment)
## Prerequisites
* Office 365 subscription with SharePoint Online
* SharePoint Framework [development environment](https://dev.office.com/sharepoint/docs/spfx/set-up-your-development-environment) set up
## Solution
Solution|Author(s)
--------|---------
react-pages-hierarchy|Bo George ([@bo_george](https://twitter.com/bo_george))
## Version history
Version|Date|Comments
-------|----|--------
1.0|April 30, 2020|Initial release
## Disclaimer
**THIS CODE IS PROVIDED *AS IS* WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING ANY IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE, MERCHANTABILITY, OR NON-INFRINGEMENT.**
---
## Minimal Path to Awesome
* Clone this repository
* in the command line run:
* `npm install`
* `gulp serve`
## Features
This web part isn't anything fancy but it's useful for some scenarios.
* Parent Page Property Creation - if the web part is added to a page and the Parent Page property does not exist the user will be asked to enable (create) it.
* Security - if the user editing the page/web part doesn't have 'Manage' permissions on the Pages library they will not get the enable button, instead a message telling them to get a site owner to do the enabling.
* Two page relationship views depending on the direction you want to show
* Ancestors shows a breadcrumb view (including the current page) up to parent pages until the parent page property is not set.
* Children shows a button view for all pages that have selected the current page as their parent.
<img src="https://telemetry.sharepointpnp.com/sp-dev-fx-webparts/samples/react-pages-hierarchy" />
---
layout: post
title: Enabling HTTPS on your servers
description: >
Enabling HTTPS on your servers is critical to securing your webpages.
authors:
- chrispalmer
- mattgaunt
date: 2015-03-27
updated: 2022-01-28
tags:
- security
---
## Steps covered in this article
1. Create a 2048-bit RSA public/private key pair.
1. Generate a certificate signing request (CSR) that embeds your public key.
1. Share your CSR with your Certificate Authority (CA) to receive a final
certificate or a certificate chain.
1. Install your final certificate in a non-web-accessible place such as
`/etc/ssl` (Linux and Unix) or wherever IIS requires it (Windows).
## Generating keys and certificate signing requests
This section uses the openssl command-line program, which comes with most
Linux, BSD, and Mac OS X systems, to generate private/public keys and a CSR.
### Generate a public/private key pair
Let's start by generating a 2,048-bit RSA key pair. A smaller key, such
as 1,024 bits, is insufficiently resistant to brute-force guessing attacks. A
larger key, such as 4,096 bits, is overkill. Over time, key sizes increase as
computer processing gets cheaper. 2,048 is currently the sweet spot.
The command to generate the RSA key pair is:
```bash
openssl genrsa -out www.example.com.key 2048
```
This gives the following output:
```bash
Generating RSA private key, 2048 bit long modulus
.+++
.......................................................................................+++
e is 65537 (0x10001)
```
### Generate a certificate signing request
In this step, you embed your public key and information about your organization
and your website into a certificate signing request or CSR. The *openssl*
command interactively asks you for the required metadata.
Running the following command:
```bash
openssl req -new -sha256 -key www.example.com.key -out www.example.com.csr
```
Outputs the following:
```bash
You are about to be asked to enter information that will be incorporated
into your certificate request
What you are about to enter is what is called a Distinguished Name or a DN.
There are quite a few fields but you can leave some blank
For some fields there will be a default value,
If you enter '.', the field will be left blank.
-----
Country Name (2 letter code) [AU]:CA
State or Province Name (full name) [Some-State]:California
Locality Name (for example, city) []:Mountain View
Organization Name (for example, company) [Internet Widgits Pty Ltd]:Example, Inc.
Organizational Unit Name (for example, section) []:Webmaster Help Center Example
Team
Common Name (e.g. server FQDN or YOUR name) []:www.example.com
Email Address []:webmaster@example.com
Please enter the following 'extra' attributes
to be sent with your certificate request
A challenge password []:
An optional company name []:
```
To ensure the validity of the CSR, run this command:
```bash
openssl req -text -in www.example.com.csr -noout
```
And the response should look like this:
```bash
Certificate Request:
Data:
Version: 0 (0x0)
        Subject: C=CA, ST=California, L=Mountain View, O=Example, Inc.,
OU=Webmaster Help Center Example Team,
CN=www.example.com/emailAddress=webmaster@example.com
Subject Public Key Info:
Public Key Algorithm: rsaEncryption
Public-Key: (2048 bit)
Modulus:
00:ad:fc:58:e0:da:f2:0b:73:51:93:29:a5:d3:9e:
f8:f1:14:13:64:cc:e0:bc:be:26:5d:04:e1:58:dc:
...
Exponent: 65537 (0x10001)
Attributes:
a0:00
Signature Algorithm: sha256WithRSAEncryption
5f:05:f3:71:d5:f7:b7:b6:dc:17:cc:88:03:b8:87:29:f6:87:
2f:7f:00:49:08:0a:20:41:0b:70:03:04:7d:94:af:69:3d:f4:
...
```
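A related sanity check: when several keys and CSRs accumulate, you can confirm that a CSR was generated from a particular key by comparing their moduli. This sketch creates a throwaway key and CSR under `/tmp`; the file names and subject are illustrative only, and the `-subj` flag just fills in the subject non-interactively.

```shell
# Generate a throwaway key and a CSR derived from it.
openssl genrsa -out /tmp/demo.key 2048
openssl req -new -sha256 -key /tmp/demo.key \
  -subj "/CN=www.example.com" -out /tmp/demo.csr

# Hash each modulus; matching digests mean the key and CSR belong together.
key_mod=$(openssl rsa -in /tmp/demo.key -noout -modulus | openssl md5)
csr_mod=$(openssl req -in /tmp/demo.csr -noout -modulus | openssl md5)
[ "$key_mod" = "$csr_mod" ] && echo "key and CSR match"
```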
### Submit your CSR to a certificate authority
Different certificate authorities (CAs) require different methods for sending
them your CSRs. Methods may include using a form on their website, sending the
CSR by email, or something else. Some CAs (or their resellers) may even automate
some or all of the process (including, in some cases, key pair and CSR
generation).
Send the CSR to your CA, and follow their instructions to receive your final
certificate or certificate chain.
Different CAs charge different amounts of money for the service of vouching
for your public key.
There are also options for mapping your key to more than one DNS name, including
several distinct names (e.g. all of example.com, www.example.com, example.net,
and www.example.net) or "wildcard" names such as \*.example.com.
For example, one CA currently offers these prices:
* Standard: $16/year, valid for example.com and www.example.com.
* Wildcard: $150/year, valid for example.com and \*.example.com.
At these prices, wildcard certificates are economical when you have more than 9
subdomains; otherwise, you can just buy one or more single-name certificates. (If
you have more than, say, five subdomains, you might find a wildcard certificate
more convenient when you come to enable HTTPS on your servers.)
{% Aside %}
Keep in mind that in wildcard certificates the wildcard applies to only
one DNS label. A certificate good for \*.example.com will work for
foo.example.com and bar.example.com, but _not_ for foo.bar.example.com.
{% endAside %}
Copy the certificates to all your front-end servers in a non-web-accessible
place such as `/etc/ssl` (Linux and Unix) or wherever IIS (Windows) requires
them.
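As a rough illustration of what "non-web-accessible" means in practice on Linux and Unix (all paths and file names below are assumptions, and a temporary directory stands in for `/etc/ssl` so the sketch is safe to run): the certificates are public and can be world-readable, but the private key must be readable only by the account the server runs as.

```shell
# Illustrative layout and permissions only; adjust paths and ownership
# for your distribution and server user.
set -e
ssl_dir="$(mktemp -d)"   # stand-in for /etc/ssl
touch "$ssl_dir/example.com.crt" "$ssl_dir/example.com.chain.crt" "$ssl_dir/example.com.key"
chmod 644 "$ssl_dir"/example.com*.crt   # certificates are public
chmod 600 "$ssl_dir/example.com.key"    # private key is not
ls -l "$ssl_dir"
```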
## Enable HTTPS on your servers
Enabling HTTPS on your servers is a critical step in providing security for
your web pages.
* Use Mozilla's Server Configuration tool to set up your server for HTTPS support.
* Regularly test your site with Qualys' handy SSL Server Test and ensure
you get at least an A or A+.
At this point, you must make a crucial operations decision. Choose one of the
following:
* Dedicate a distinct IP address to each hostname your web server serves content
from.
* Use name-based virtual hosting.
If you have been using distinct IP addresses for each hostname, you can
easily support both HTTP and HTTPS for all clients.
However, most site operators use name-based virtual hosting to conserve IP
addresses and because it's more convenient in general. The problem with IE on
Windows XP and Android earlier than 2.3 is that they do not understand [Server
Name Indication](https://en.wikipedia.org/wiki/Server_Name_Indication)
(SNI), which is crucial for HTTPS name-based virtual hosting.
Someday—hopefully soon—clients that don't support SNI will be replaced
with modern software. Monitor the user agent string in your request logs to know
when enough of your user population has migrated to modern software. (You can
decide what your threshold is; perhaps less than 5%, or less than 1%.)
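For example, a quick estimate from an access log might look like the following sketch. The log format, sample lines, and user-agent patterns (Android 2.x before 2.3, and IE on Windows XP, i.e. `Windows NT 5.1`) are all assumptions; adapt them to what your logs actually contain:

```shell
# Sketch: estimate the share of requests from non-SNI clients, using
# made-up sample log lines in place of a real access log.
set -e
log="$(mktemp)"
cat >"$log" <<'EOF'
"GET / HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/96.0"
"GET / HTTP/1.1" 200 "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) Safari/605.1"
"GET / HTTP/1.1" 200 "Mozilla/5.0 (Linux; U; Android 2.2; en-us) AppleWebKit/533.1"
"GET / HTTP/1.1" 200 "Mozilla/5.0 (X11; Linux x86_64; rv:95.0) Firefox/95.0"
"GET / HTTP/1.1" 200 "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1)"
EOF

total=$(wc -l < "$log")
# Approximate non-SNI patterns; note the IE8-on-Windows-7 line is
# deliberately NOT counted, since only IE on XP lacks SNI.
legacy=$(grep -c -E 'Android 2\.[0-2]|MSIE [0-9]\.[0-9]; Windows NT 5\.1' "$log")
share=$(awk -v l="$legacy" -v t="$total" 'BEGIN { printf "%.1f", 100 * l / t }')
echo "non-SNI share: ${share}%"   # here: 20.0% (1 of 5 sample requests)
```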
If you don't already have HTTPS service available on your servers, enable it now
(without redirecting HTTP to HTTPS; see below). Configure your web server to use
the certificates you bought and installed. You might find [Mozilla's handy
configuration
generator](https://mozilla.github.io/server-side-tls/ssl-config-generator/)
useful.
If you have many hostnames or subdomains, they each need to use the right
certificate.
{% Aside 'warning' %}
If you've already completed these steps, but are using HTTPS for the
sole purpose of redirecting clients back to HTTP, stop doing that now. See the
next section to make sure HTTPS and HTTP work smoothly.
{% endAside %}
{% Aside %}
Ultimately you should redirect HTTP requests to HTTPS and use HTTP
Strict Transport Security (HSTS). However, this is not the right stage in
the migration process to do that; see "Redirect HTTP To HTTPS" and
"Turn On Strict Transport Security And Secure Cookies."
{% endAside %}
Now, and throughout your site's lifetime, check your HTTPS configuration with
[Qualys' handy SSL Server Test](https://www.ssllabs.com/ssltest/).
Your site should score an A or A+; treat anything that causes a lower grade as
a bug. (Today's A is tomorrow's B, because attacks against algorithms and
protocols are always improving!)
## Make intrasite URLs relative
Now that you are serving your site on both HTTP and HTTPS, things need to work as
smoothly as possible, regardless of protocol. An important factor is using
relative URLs for intrasite links.
Make sure intrasite URLs and external URLs are agnostic to protocol; that is,
make sure you use relative paths or leave out the protocol like
`//example.com/something.js`.
A problem arises when you serve a page via HTTPS that includes HTTP
resources, known as
[mixed content](/what-is-mixed-content/).
Browsers warn users that the full strength of HTTPS has been lost. In fact,
in the case of active mixed content (script, plug-ins, CSS, iframes), browsers
often simply won't load or execute the content at all, resulting in a
broken page. And remember, it's perfectly OK to include HTTPS resources in an
HTTP page.
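A crude way to spot active mixed content in your own markup is to search for subresource attributes that hard-code the `http://` scheme. The sketch below is illustrative only (the sample file and the regular expression are assumptions); a real scan also needs to cover CSS, JavaScript, and templates:

```shell
# Illustration only: flag src/href attributes that hard-code http://.
set -e
site="$(mktemp -d)"
cat >"$site/index.html" <<'EOF'
<script src="http://example.com/jquery.js"></script>
<link rel="stylesheet" href="/assets/style.css"/>
<img src="//img.example.com/logo.png"/>
EOF

insecure=$(grep -R -n -E '(src|href)="http://' "$site")
echo "$insecure"
count=$(printf '%s\n' "$insecure" | grep -c 'http://')
echo "found $count insecure reference(s)"   # here: 1 (the script tag)
```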
{% Aside %}
See [Fixing Mixed Content](/fixing-mixed-content)
for more details about ways to fix and prevent mixed content.
{% endAside %}
Additionally, when you link to other pages in your site, users could get
downgraded from HTTPS to HTTP.
These problems happen when your pages include fully-qualified, intrasite URLs
that use the *http://* scheme.
{% Compare 'worse' %}
```html
<h1>Welcome To Example.com</h1>
<script src="http://example.com/jquery.js"></script>
<link rel="stylesheet" href="http://assets.example.com/style.css"/>
<img src="http://img.example.com/logo.png"/>
<p>A <a href="http://example.com/2014/12/24/">new post on cats!</a></p>
```
{% CompareCaption %}
Avoid using fully qualified intrasite URLs.
{% endCompareCaption %}
{% endCompare %}
In other words, make intrasite URLs as relative as possible: either
protocol-relative (lacking a protocol, starting with `//example.com`) or
host-relative (starting with just the path, like `/jquery.js`).
{% Compare 'better' %}
```html
<h1>Welcome To Example.com</h1>
<script src="/jquery.js"></script>
<link rel="stylesheet" href="/assets/style.css"/>
<img src="/images/logo.png"/>
<p>A <a href="/2014/12/24/">new post on cats!</a></p>
```
{% CompareCaption %}
Use relative intrasite URLs.
{% endCompareCaption %}
{% endCompare %}
{% Compare 'better' %}
```html
<h1>Welcome To Example.com</h1>
<script src="//example.com/jquery.js"></script>
<link rel="stylesheet" href="//assets.example.com/style.css"/>
<img src="//img.example.com/logo.png"/>
<p>A <a href="//example.com/2014/12/24/">new post on cats!</a></p>
```
{% CompareCaption %}
Or, use protocol-relative intrasite URLs.
{% endCompareCaption %}
{% endCompare %}
{% Compare 'better' %}
```html
<h1>Welcome To Example.com</h1>
<script src="/jquery.js"></script>
<link rel="stylesheet" href="/assets/style.css"/>
<img src="/images/logo.png"/>
<p>A <a href="/2014/12/24/">new post on cats!</a></p>
<p>Check out this <a href="https://foo.com/">other cool site.</a></p>
```
{% CompareCaption %}
Use HTTPS URLs for intersite URLs (where possible).
{% endCompareCaption %}
{% endCompare %}
Do this with a script, not by hand. If your site's content is in a database,
test your script on a development copy of your database. If
your site's content consists of simple files, test your script on a
development copy of the files. Push the changes to production only after the
changes pass QA, as normal. You can use [Bram van Damme's
script](https://github.com/bramus/mixed-content-scan) or something similar to
detect mixed content in your site.
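A minimal sketch of such a rewrite script follows. The domain names, file layout, and `sed` rules are assumptions; run anything like this against a development copy first. Note that the link to the external site is deliberately left alone:

```shell
# Sketch of the kind of rewrite script meant above: intrasite URLs
# become relative, links to other sites stay untouched.
set -e
workdir="$(mktemp -d)"
cat >"$workdir/post.html" <<'EOF'
<script src="http://example.com/jquery.js"></script>
<img src="http://img.example.com/logo.png"/>
<a href="http://other-site.example.org/">external link</a>
EOF

# Bare domain becomes host-relative; subdomains become protocol-relative.
sed -i -E \
  -e 's#http://example\.com/#/#g' \
  -e 's#http://([a-z0-9-]+\.example\.com)/#//\1/#g' \
  "$workdir/post.html"
cat "$workdir/post.html"
```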
When linking to other sites (as opposed to including resources from them),
don't change the protocol since you don't have control over how those sites
operate.
To make migration smoother for large sites, we recommend
protocol-relative URLs. If you are not sure whether you can fully deploy
HTTPS yet, forcing your site to use HTTPS for all sub-resources may backfire.
There is likely to be a period of time in which HTTPS is new and weird for
you, and the HTTP site must still work as well as ever. Over time, you'll
complete the migration and lock in HTTPS (see the next two sections).
If your site depends on scripts, images, or other resources served from a third
party, such as a CDN or jquery.com, you have two options:
* Use protocol-relative URLs for these resources. If the third party does not
serve HTTPS, ask them to. Most already do, including jquery.com.
* Serve the resources from a server that you control, and which offers both HTTP
and HTTPS. This is often a good idea anyway, because then you have better
control over your site's appearance, performance, and security. In addition,
you don't have to trust a third party, which is always nice.
{% Aside %}
Keep in mind that you also need to change intrasite URLs in your
stylesheets, JavaScript, redirect rules, `<link>` tags, and CSP declarations,
not just in the HTML pages.
{% endAside %}
## Redirect HTTP to HTTPS
You need to put a [canonical link](https://support.google.com/webmasters/answer/139066)
at the head of your page to tell search engines that HTTPS is the best way to
get to your site: set `<link rel="canonical" href="https://…"/>` tags in your
pages.
## Turn on Strict Transport Security and secure cookies
At this point, you are ready to "lock in" the use of HTTPS.
* Use HTTP Strict Transport Security (HSTS) to avoid the cost of the 301 redirect.
* Always set the Secure flag on cookies.
First, use [Strict Transport Security](https://en.wikipedia.org/wiki/HTTP_Strict_Transport_Security)
to tell clients that they should always connect to your server via HTTPS, even
when following an `http://` reference. This defeats attacks such as
[SSL Stripping](http://www.thoughtcrime.org/software/sslstrip/),
and also avoids the round-trip cost of the `301 redirect` that we enabled in
[Redirect HTTP to HTTPS](#redirect-http-to-https).
{% Aside %}
Clients that have noted your site as a known HSTS Host are likely to
<a href="https://tools.ietf.org/html/rfc6797#section-12.1">hard-fail
if your site ever has an error in its TLS configuration</a> (such as an
expired certificate). HSTS is explicitly designed this way to ensure that
network attackers cannot trick clients into accessing the site without HTTPS.
Do not enable HSTS until you are certain that your site operation is robust
enough to avoid ever deploying HTTPS with certificate validation errors.
{% endAside %}
Turn on HTTP Strict Transport Security (HSTS) by setting the
`Strict-Transport-Security` header. [OWASP's HSTS page has links to
instructions](https://www.owasp.org/index.php/HTTP_Strict_Transport_Security)
for various server software.
Most web servers offer a similar ability to add custom headers.
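For example, on nginx the header can be added to the HTTPS server block with a single directive; the snippet below is an assumed example, and other servers have equivalent mechanisms:

```nginx
# Example only: nginx syntax, served from the HTTPS server block.
# 86400 seconds = one day, a deliberately short starting value;
# "always" attaches the header to every response code.
add_header Strict-Transport-Security "max-age=86400" always;
```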
{% Aside %}
`max-age` is measured in seconds. You can start with low values and
gradually increase the `max-age` as you become more comfortable operating
an HTTPS-only site.
{% endAside %}
It is also important to make sure that clients never send cookies (such as for
authentication or site preferences) over HTTP. For example, if a user's
authentication cookie were to be exposed in plain text, the security guarantee of
their entire session would be destroyed—even if you have done everything else
right!
Therefore, change your web application to always set the Secure flag on cookies
that it sets. [This OWASP page explains how to set the Secure
flag](https://www.owasp.org/index.php/SecureFlag) in several application
frameworks. Every application framework has a way to set the flag.
Most web servers offer a simple redirect feature. Use `301 (Moved Permanently)`
to indicate to search engines and browsers that the HTTPS version is canonical,
and redirect your users to the HTTPS version of your site from HTTP.
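As an illustration, an nginx configuration for this might look like the following sketch. The server names and certificate paths are assumptions; Apache, IIS, and other servers have equivalent redirect mechanisms:

```nginx
# The port-80 server does nothing but issue the permanent redirect.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://$host$request_uri;
}

server {
    listen 443 ssl;
    server_name example.com www.example.com;
    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;
    # ... the rest of your site configuration ...
}
```

After the redirect is in place, confirm that `http://` URLs land on the HTTPS site and that the response status is `301`, not `302`.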
### Search ranking
Google uses [HTTPS as a positive search quality
indicator](https://googlewebmastercentral.blogspot.com/2014/08/https-as-ranking-signal.html).
Google also publishes a guide for [how to transfer, move, or migrate your
site](https://support.google.com/webmasters/topic/6029673) while maintaining
its search rank. Bing also publishes [guidelines for
webmasters](http://www.bing.com/webmaster/help/webmaster-guidelines-30fba23a).
### Performance
When the content and application layers are well-tuned (see
[Steve Souders' books](https://stevesouders.com/) for great
advice), the remaining TLS performance concerns are generally small, relative
to the overall cost of the application. Additionally, you can reduce and
amortize those costs. (For great advice on TLS optimization in particular, and
web performance in general, see
[High Performance Browser Networking](https://hpbn.co/) by Ilya Grigorik.)
See also Ivan Ristic's [OpenSSL
Cookbook](https://www.feistyduck.com/books/openssl-cookbook/) and
[Bulletproof SSL And TLS](https://www.feistyduck.com/books/bulletproof-ssl-and-tls/).
In some cases, TLS can _improve_ performance, mostly as a result of making
HTTP/2 possible. Chris Palmer gave a talk on [HTTPS and HTTP/2 performance at
Chrome Dev Summit 2014](https://developers.google.com/web/shows/cds/2014/tls-all-the-things).
### Referer headers
When users follow links from your HTTPS site to other HTTP sites, user agents
don't send the Referer header. If this is a problem, there are several ways to
solve it:
* The other sites should migrate to HTTPS. If those sites can complete the
[Enable HTTPS on your servers](#enable-https-on-your-servers) section of
this guide, you can change links in your site to theirs from `http://` to
`https://`, or you can use protocol-relative links.
* To work around a variety of problems with Referer headers, use the new
[Referrer Policy standard](http://www.w3.org/TR/referrer-policy/#referrer-policy-delivery-meta).
Because search engines are migrating to HTTPS, in the future, you are likely
to see _more_ Referer headers when you migrate to HTTPS.
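For example, a page that wants its origin (but never the full URL) sent with outgoing requests, even on HTTPS-to-HTTP navigations, can declare that in markup. The sketch below uses one policy value; check the current Referrer Policy specification for the full list of values before relying on it:

```html
<!-- Sends "https://example.com/" as the referrer, even to HTTP sites,
     instead of the full page URL. Example only. -->
<meta name="referrer" content="origin">
```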
{% Aside 'caution' %}
According to the [HTTP RFC](https://tools.ietf.org/html/rfc2616#section-15.1.3),
clients **SHOULD NOT** include a Referer header field in a (non-secure) HTTP
request if the referring page is transferred with a secure protocol.
{% endAside %}
### Ad revenue
Site operators that monetize their site by showing ads want to make sure that
migrating to HTTPS does not reduce ad impressions. But due to mixed content
security concerns, an HTTP `<iframe>` doesn't work in an HTTPS page. There is a
tricky collective action problem here: until advertisers publish over HTTPS,
site operators cannot migrate to HTTPS without losing ad revenue; but until site
operators migrate to HTTPS, advertisers have little motivation to publish HTTPS.
Advertisers should at least offer ad service via HTTPS (such as by completing
the "Enable HTTPS on your servers" section on this page). Many already do. You
should ask advertisers that do not serve HTTPS at all to at least start.
You may wish to defer completing
[Make intrasite URLs relative](#make-intrasite-urls-relative) until enough
advertisers interoperate properly.
| 40.092632 | 100 | 0.753466 | eng_Latn | 0.993156 |
e23adb6824cbdff0fa475fd14feae56bbf7e76f0 | 743 | md | Markdown | README.md | micro-technologies/micro-technologies.github.io | 406a646ea4396dd5b25ee9108f828ac31a3c3c18 | [
"MIT"
] | null | null | null | README.md | micro-technologies/micro-technologies.github.io | 406a646ea4396dd5b25ee9108f828ac31a3c3c18 | [
"MIT"
] | null | null | null | README.md | micro-technologies/micro-technologies.github.io | 406a646ea4396dd5b25ee9108f828ac31a3c3c18 | [
"MIT"
] | null | null | null | # Sobre
Seja bem vindo(a)!
Em resumo a µTechnologies é uma iniciativa sem fins lucrativos que visa incentivar o aprendizado e sínteses nos mais diversos assuntos, em especial aqueles relacionados à tecnologia.
A ideia é organizar, criar, apontar e manter conteudos tecnico-ciêntfícos para falantes de português brasileiro.
Pretendemos cumprir este objetivo utilizando a plataforma do GitHub, pois torna possível unificar discussões, pessoas, e código em um só local, de forma que facilite a busca e a relacão entre esses.
A estrutura desta organização foi espelhada [neste excelente projeto.](http://frontendbr.com.br/)
Para uma informação mais detalhada [clique aqui](https://github.com/micro-technologies/sobre/blob/master/README.md).
| 53.071429 | 198 | 0.807537 | por_Latn | 0.999783 |
e23b093f5466605953e6d1a8d823fe996d26cff3 | 2,595 | md | Markdown | learn/rest/advanced/merge-and-crosslist-courses.md | shurrey/Blackboard-docsSite | 3367b40a61830f82e3449c0ecb15e68ea472b1f4 | [
"MIT"
] | null | null | null | learn/rest/advanced/merge-and-crosslist-courses.md | shurrey/Blackboard-docsSite | 3367b40a61830f82e3449c0ecb15e68ea472b1f4 | [
"MIT"
] | 10 | 2021-01-14T02:23:19.000Z | 2022-03-07T17:10:03.000Z | learn/rest/advanced/merge-and-crosslist-courses.md | shurrey/Blackboard-docsSite | 3367b40a61830f82e3449c0ecb15e68ea472b1f4 | [
"MIT"
] | null | null | null | ---
layout: post
title: "Merge and Corsslist courses"
purple-text-title: ""
id: learn-rest-advanced-merge-crosslist-courses
categories: Learn Rest
author: Ryan Haber
---
# Use APIs to Merge and Cross-list Courses
### Overview
Your institution often needs to manage many courses or sections together. The
mechanism that Blackboard Learn provides for doing this is referred to as
course merging or cross-listing. Merged and cross-listed courses are, under
the hood, the same thing: two or more courses in a parent-child relationship.
A course set is a parent course together with all its child courses. In
physical terms, these students might have different courses listed on their
schedules. If their courses are merged in Learn, their schedules show the time
and place as determined by the registrar and they receive instruction from the
same instructor(s). All students in the child courses have access to the same
online content.
Blackboard Learn synchronizes enrollments in child courses with their parent
course. Users enrolled in a child course thus have access to the content of
the parent course. Likewise, when you use management tools in a parent course,
you will also affect users enrolled in its children courses. Blackboard Learn
preserves user roles from the last time a user is enrolled into any of the
courses in the course set. A student can only exist in one course in a course
set; Learn ignores duplicate enrollments.
Users with administrative entitlements can merge courses.
### Before you begin
You need an authentication token from a user with administrative entitlements
to merge courses. For a complete list of specific requirements for any
particular method, see the [Blackboard Learn API
reference](https://community.blackboard.com/external-link.jspa?url=https%3A/
/developer.blackboard.com/portal/displayApi/Learn).
### Merge a course
To merge one course as a child of another:
1. Find the courseId of the course that you want to be the parent.
2. Find the courseId of the course you want to be the child.
3. Make a PUT request to /learn/api/public/v1/courses/{courseId}/children/{childCourseId}.
### Get the children of a course
To identify the children courses of a course:
1. Find the courseId of the parent course.
2. Make a GET request to /learn/api/public/v1/courses/{courseId}/children.
### Get the course set that a course belongs to
To get a list of all the courses merged with a particular course, regardless
of which are children and which is the parent, make a GET request to
/learn/api/public/v1/courses/{courseId}/crossListSet.
| 40.546875 | 92 | 0.788825 | eng_Latn | 0.999298 |
e23b2a7ca322e03582a2b19b7dab6ea1b755cf47 | 3,185 | md | Markdown | integration-tests/README.md | Necromantian/aet | d11cd37fa35eab8c7cfe2fb043fd1b77fdf1bd64 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | integration-tests/README.md | Necromantian/aet | d11cd37fa35eab8c7cfe2fb043fd1b77fdf1bd64 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | integration-tests/README.md | Necromantian/aet | d11cd37fa35eab8c7cfe2fb043fd1b77fdf1bd64 | [
"ECL-2.0",
"Apache-2.0"
] | null | null | null | 
# AET
<p align="center">
<img src="https://github.com/Cognifide/aet/blob/master/misc/img/aet-logo-blue.png?raw=true"
alt="AET Logo"/>
</p>
## Integration Tests
This project build is managed by maven and is not part of the AET System.
### sample-site
Sample web application for tests. Builds *war* file containing pages for various test cases.
Pages are written in a way that allows us to provide repeatable test results
(i.e. failed test cases).
Even if someone would try to update the patterns.
Sample site can be uploaded to AET vagrant by running `mvn clean install -Pupload` from the
`sample-site` directory level or from the root level of `aet` repository.
By default the sample site will be available at [http://aet-vagrant:9090/sample-site/](http://aet-vagrant:9090/sample-site/)
### test-suite
Prepares test suite and run tests on the pages provided by `sample-site` module.
Run `mvn clean install` to prepare `suite.xml` file.
Then run `mvn aet:run` to test sample pages and store results in AET database.
The result should be available at:
* [http://aet-vagrant/report.html?company=aet&project=aet&suite=main#/home](http://aet-vagrant/report.html?company=aet&project=aet&suite=main#/home)
There's a following naming convention for tests within the `test-suite`:
* `S-` prefix - tests are expected to be green on the report (passed cases)
* `F-` prefix - tests are expected to be red on the report (failed cases)
* `W-` prefix - tests are expected to be yellow on the report (warning cases)
*Note:* If you're running the suite for the first time, it needs to be executed at least twice to
get expected results, because some of the test cases will always pass in the first run
(e.g. screen comparison will always pass when there's no pattern yet).
### sanity-functional
Bobcat tests for AET reports web application.
Prerequisities:
* AET instance running
* Sample test suite: `test-suite` already executed (at least twice) against `sample-site` site.
Functional tests expect the report at URL specified by `report.url` property.
By default the URL is [http://aet-vagrant/report.html?company=aet&project=aet&suite=main](http://aet-vagrant/report.html?company=aet&project=aet&suite=main)
It may be changed it in `.../config/dev/instance.properties` file.
* Chrome browser installed
* Selenium [Chromedriver] available at *D:/Selenium/chromedriver.exe*.
This path can be changed at command-line with `-Dwebdriver.chrome.driver=<path>`
or in `.../config/common/webdriver.properties` file.
To start the Bobcat tests, run `mvn clean test` from the `sanity-functional` directory level
[Chromedriver]: https://sites.google.com/a/chromium.org/chromedriver/
### cleaner-test
Cleaner Integration Tests are using [mocked OSGi context](https://sling.apache.org/documentation/development/osgi-mock.html)
and an [in-memory mock of MongoDB server](https://github.com/bwaldvogel/mongo-java-server).
Tests check various combinations of Cleaner parameters (versions to keep and suite max age)
and verify whether correct metadata and artifacts documents have been removed from database. | 45.5 | 156 | 0.762637 | eng_Latn | 0.966527 |
e23b339b84f190253c218d19d498f854d07f8d84 | 2,084 | md | Markdown | Docs/API/modifiers.md | AJIADb9/AttributesExtension | 634435450c079b82f4cd47b897ca1ff9b3770287 | [
"Apache-2.0"
] | 24 | 2018-08-12T17:08:58.000Z | 2022-03-11T12:55:10.000Z | Docs/API/modifiers.md | AJIADb9/AttributesExtension | 634435450c079b82f4cd47b897ca1ff9b3770287 | [
"Apache-2.0"
] | 5 | 2019-10-23T15:16:14.000Z | 2020-04-22T09:56:14.000Z | Docs/API/modifiers.md | AJIADb9/AttributesExtension | 634435450c079b82f4cd47b897ca1ff9b3770287 | [
"Apache-2.0"
] | 11 | 2019-09-15T20:35:41.000Z | 2022-03-15T22:50:51.000Z | ## Modifiers
Modifiers change the base value of an attribute depending on 3 different factors.
### Modifier Factors

#### Scalar Increment
Adds a value directly to the attribute.

{% hint style='danger' %}
Modifiers should usually be used from a variable (of type *Attr Modifier*) if you want to be able to remove them from Attributes
{% endhint %}
#### Percentage Increment
Adds a percentage of the last value of the attribute.


#### Base Percentage Increment
Similar to Percentage except that this percentage is based on the original value.

### Application order
Modifiers are applied into an attribute following the next rules of priority:
1. **Modifier Category** - Check [Modifier Categories](#modifier-categories)
2. **Order** - The order at which modifiers are applied.
Adding *"ModA"* and *"ModB"* to the same category will result in *"ModA"* being applied before.
{% hint style='working' %}
*In a future release: Categories may specify if attributes should apply first last mods on the same category.*
{% endhint %}
## Modifier Categories
Modifier categories are used to specify **modifier application order**. Depending on the genre of a game this can be a key feature that we didn't want to miss.
With a configuration where *"Buff"* is more important than *"Aura"*, a *"Buff"* attribute will be applied before an *"Aura"* modifier.
For example:

{% hint style='hint' %}
Categories can also be stored as variables of type *"Attr Category"*
{% endhint %}
### Adding & Removing Categories
Categories can be edited from ***Project Settings -> Game -> Attributes***
Remember, their order matter. First categories are applied first on attributes.

{% hint style='danger' %}
Categories **can't** be modified in runtime.
{% endhint %}
| 26.717949 | 159 | 0.733685 | eng_Latn | 0.984359 |
e23b96e1adae8e128df9382ad38c23b01518a9b7 | 38 | md | Markdown | README.md | millen1m/figures-workshop-demo | f6fee7e2400d53aa042b808e736c68f2dfc1e512 | [
"MIT"
] | null | null | null | README.md | millen1m/figures-workshop-demo | f6fee7e2400d53aa042b808e736c68f2dfc1e512 | [
"MIT"
] | null | null | null | README.md | millen1m/figures-workshop-demo | f6fee7e2400d53aa042b808e736c68f2dfc1e512 | [
"MIT"
] | null | null | null | # figures-workshop-demo
A description
| 12.666667 | 23 | 0.815789 | eng_Latn | 0.644998 |
e23c052344780736efed3f6737ea5e3b295405f8 | 9,054 | md | Markdown | www/docs/de/dev/guide/overview/index.md | NiklasMerz/cordova-docs | 44678acb622002f5ed2f322a699aaad9716ff041 | [
"Apache-2.0"
] | 1 | 2020-12-15T14:00:24.000Z | 2020-12-15T14:00:24.000Z | www/docs/de/dev/guide/overview/index.md | NiklasMerz/cordova-docs | 44678acb622002f5ed2f322a699aaad9716ff041 | [
"Apache-2.0"
] | 1 | 2021-02-23T13:54:18.000Z | 2021-02-23T14:42:20.000Z | www/docs/de/dev/guide/overview/index.md | NiklasMerz/cordova-docs | 44678acb622002f5ed2f322a699aaad9716ff041 | [
"Apache-2.0"
] | 2 | 2020-12-16T06:54:13.000Z | 2021-08-17T09:50:40.000Z | ---
license: >
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
title: Übersicht
toc_title: Overview
---
# Übersicht
Apache Cordova ist ein Open-Source-mobile-Entwicklung-Framework. Sie können standard-Web-Technologien wie HTML5, CSS3 und JavaScript für Cross-Plattform-Entwicklung, Vermeidung jeder mobilen Plattformen native Entwicklung der Sprache zu verwenden. Anwendungen werden in Verpackungen, die gezielt auf jede Plattform und verlassen sich auf standardkonforme API Anbindungen an jedes Gerät Sensoren, Daten und Netzwerkstatus zugreifen.
Apache Cordova graduierte Oktober 2012 als Top-Level-Projekt innerhalb der Apache Software Foundation (ASF). Durch die ASF wird künftige Cordova offene Leitung des Projekts sichergestellt. Es bleibt immer kostenlos und open Source unter der Apache License, Version 2.0. Besuchen Sie [cordova.apache.org][1] für weitere Informationen.
[1]: http://cordova.apache.org
Verwenden Sie Apache Cordova, falls Sie sind:
* mobile Entwickler und wollen eine Anwendung über mehrere Plattformen hinweg zu erweitern, ohne es erneut mit Sprache und Tool jede Plattform implementieren festgelegt.
* Speichern Portale, Webentwickler und wollen eine Webanwendung bereitstellen, die für den Vertrieb in verschiedenen app gepackt ist.
* mobile Entwickler interessiert mischen systemeigene Anwendungskomponenten mit einer *WebView* (spezielle Browser-Fenster), die auf Geräteebene APIs, zugreifen kann oder wollen Sie eine Plugin-Schnittstelle zwischen systemeigenen und WebView Komponenten entwickeln.
## Basiskomponenten
Apache-Cordova-Anwendungen basieren auf einer gemeinsamen `config.xml` -Datei, enthält Informationen über die app und gibt Parameter, die beeinflussen, wie es funktioniert, z. B. ob es reagiert auf Orientierung verschiebt. Diese Datei entspricht der W3C-Spezifikation für [Verpackt Web App][2]oder *Widget*.
[2]: http://www.w3.org/TR/widgets/
Die Anwendung selbst wird als eine Web-Seite implementiert, standardmäßig eine lokale Datei mit dem Namen *index.html*, die verweist was CSS, JavaScript, Bilder, Mediendateien oder andere Ressourcen sind notwendig für die Ausführung. Die app führt als ein *WebView* in der Ausgangsanwendung-Wrapper, die Sie auf app Stores zu verteilen.
Der Cordova-fähigen WebView kann die Anwendung mit der gesamten [Benutzeroberfläche](../next/index.html) bereitstellen. Auf einigen Plattformen kann es auch eine Komponente innerhalb einer größeren, Hybridanwendung sein, die die WebView mit nativen Komponenten mischt. (Siehe Einbettung Webansichten für Details.)
Eine *Plugin* -Schnittstelle steht für Cordova und systemeigenen Komponenten miteinander kommunizieren. Dadurch können mit systemeigenen Code aus JavaScript aufrufen. Im Idealfall sind die JavaScript-APIs für systemeigenen Code konsistent mehrere Geräteplattformen. Ab der Version 3.0 bieten Plugins Bindungen an standard-Device-APIs. Drittanbieter Plugins bieten zusätzliche Bindungen an Funktionen nicht notwendigerweise auf allen Plattformen. Sie können finden diese Drittanbieter Plugins in der [Plugin-Registry][3] und in Ihrer Anwendung verwenden. Sie können auch eigene Plugins entwickeln, wie in der Plugin-Entwicklung-Handbuch beschrieben. Plugins, z. B. möglicherweise erforderlich für die Kommunikation zwischen Cordova und benutzerdefinierte systemeigenen Komponenten.
[3]: http://plugins.cordova.io
**Hinweis**: ab Version 3.0, wenn Sie erstellen ein Cordova-Projekt nicht über irgendwelche Plugins vorhanden. Dies ist das neue Standardverhalten. Alle Plugins, die Sie wünschen, die auch die Core-Plugins muss explizit hinzugefügt werden.
Cordova bietet keine UI-Widgets oder MV-Frameworks. Cordova bietet nur die Runtime, in der diejenigen ausgeführt werden können. Wenn Sie UI-Widgets und/oder ein MV * Framework verwenden möchten, müssen Sie diejenigen auswählen und sie in Ihrer Anwendung selbst als Material von Drittherstellern.
## Entwicklungspfade
Ab Version 3.0 können Sie zwei einfache Workflows verwenden, um eine mobile app zu erstellen. Während Sie häufig entweder Workflow verwenden können, um die gleiche Aufgabe, bieten sie alle Vorteile:
* **Cross-platform (CLI) workflow**: Use this workflow if you want your app to run on as many different mobile operating systems as possible, with little need for platform-specific development. This workflow centers around the `cordova` utility, otherwise known as the Cordova *CLI*, which was introduced with Cordova 3.0. The CLI is a high-level tool that allows you to build projects for many platforms at once, abstracting away much of the functionality of lower-level shell scripts. The CLI copies a common set of web assets into subdirectories for each mobile platform, makes any necessary configuration changes for each, and runs build scripts to generate application binaries. The CLI also provides a common interface for applying plugins to your app. For more details, see The Command-Line Interface. Unless you need the platform-centered workflow, the cross-platform workflow is recommended.

* **Platform-centered workflow**: Use this workflow if you want to focus on building an app for a single platform and need to be able to modify it at a lower level. You need to use this approach, for example, if you want your app to mix custom native components with web-based Cordova components, as discussed in Embedding WebViews. As a rule of thumb, use this workflow if you need to modify the project within the SDK. This workflow relies on a set of lower-level shell scripts that are tailored for each supported platform, and a separate Plugman utility that allows you to apply plugins. While you can use this workflow to build cross-platform apps, it is generally more difficult, because the lack of a higher-level tool means separate build cycles and plugin modifications for each platform. Still, this workflow gives you greater access to the development options provided by each SDK, and is essential for complex hybrid apps. See the various Platform Guides for details on each platform's available shell utilities.

When first starting out, it may be easiest to use the cross-platform workflow to create an app, as described in The Command-Line Interface. You then have the option to switch to a platform-centered workflow if you need the greater control that the SDK provides. Lower-level shell utilities are available at [cordova.apache.org][1] in a distribution separate from the CLI. For projects initially generated by the CLI, these shell tools are also available in the project's various `platforms/*/cordova` directories.

**Note**: Once you switch from the CLI-based workflow to one centered around the platform-specific SDKs and shell tools, you can't go back. The CLI maintains a common set of cross-platform source code, which on each build it uses to write over platform-specific source code. To preserve any modifications you make to the platform-specific assets, you need to switch to the platform-centered shell tools, which ignore the cross-platform source code and instead rely on the platform-specific source code.

## Installing Cordova

The installation of Cordova will differ depending on which workflow above you choose:

* Cross-platform workflow: see The Command-Line Interface.

* Platform-centered workflow: see the Platform Guides.

After installing Cordova, it is recommended that you review the Platform Guides for the mobile platforms you will be developing for. It is also recommended that you review the Privacy Guide, the Security Guide, and Next Steps. To configure Cordova, see The config.xml File. To access native functions on a device from JavaScript, refer to the Plugin APIs. And refer to the other included guides as necessary.
<p align="center">
<img width="350px" src="https://github.com/yeukfei02/divisionly-web/blob/main/readme-icon.png?raw=true"><br/>
<h2 align="center">divisionly-web</h2>
</p>
split expenses with friends
url: <https://divisionly.net/>
## Requirements
- install yarn
- install node (v14+)
## Testing and run
```zsh
$ yarn
# development
$ yarn run dev
# production
$ yarn run production
# run test cases
$ yarn run test
# use eslint and prettier to format code
$ yarn run lint
```
Then open localhost:5000 in your browser.
---
license: >
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
title: The config.xml File
---
# The config.xml File
Many aspects of an app's behavior can be controlled with a global
configuration file, `config.xml`. This
platform-agnostic XML file is arranged based on the W3C's [Packaged
Web Apps (Widgets)](http://www.w3.org/TR/widgets/) specification, and
extended to specify core Cordova API features, plugins, and
platform-specific settings.
For projects created with the Cordova CLI (described in The
Command-Line Interface), this file can be found in the top-level
directory:
app/config.xml
Note that before version 3.3.1-0.2.0, the file existed at `app/www/config.xml`,
and that having it here is still supported.
When using the CLI to build a project, versions of this file are
passively copied into various `platforms/` subdirectories, for example:
app/platforms/ios/AppName/config.xml
app/platforms/blackberry10/www/config.xml
app/platforms/android/res/xml/config.xml
This section details global and cross-platform configuration options.
See the following sections for platform-specific options:
- [iOS Configuration](../guide/platforms/ios/config.html)
- [Android Configuration](../guide/platforms/android/config.html)
- [BlackBerry 10 Configuration](../guide/platforms/blackberry10/config.html)
In addition to the various configuration options detailed below, you
can also configure an application's core set of images for each target
platform. See [Icons and Splash Screens](images.html) for more information.
## Core Configuration Elements
This example shows the default `config.xml` generated by the CLI's
`create` command, described in [The Command-Line Interface](../guide/cli/index.html):
<widget id="com.example.hello" version="0.0.1">
<name>HelloWorld</name>
<description>
A sample Apache Cordova application that responds to the deviceready event.
</description>
<author email="dev@callback.apache.org" href="http://cordova.io">
Apache Cordova Team
</author>
<content src="index.html" />
<access origin="*" />
</widget>
The following configuration elements appear in the top-level
`config.xml` file, and are supported across all supported Cordova
platforms:
- The `<widget>` element's `id` attribute provides the app's
reverse-domain identifier, and the `version` its full version number
expressed in major/minor/patch notation.
The widget tag can also have attributes that specify alternative versions,
namely versionCode for Android and CFBundleVersion for iOS. See the
Additional Versioning section below for details.
- The `<name>` element specifies the app's formal name, as it appears
on the device's home screen and within app-store interfaces.
- The `<description>` and `<author>` elements specify metadata and
contact information that may appear within app-store listings.
- The optional `<content>` element defines the app's starting
page in the top-level web assets directory. The default value is
`index.html`, which customarily appears in a project's top-level
`www` directory.
- `<access>` elements define the set of external domains the app is
allowed to communicate with. The default value shown above allows
it to access any server. See the Domain [Whitelist Guide](../guide/appdev/whitelist/index.html) for details.
- The `<preference>` tag sets various options as pairs of
`name`/`value` attributes. Each preference's `name` is
case-insensitive. Many preferences are unique to specific
platforms, as listed at the top of this page. The following sections
detail preferences that apply to more than one platform.
### Additional Versioning
Both Android and iOS support a second version string (or number) in addition
to the one visible in app stores,
[versionCode](http://developer.android.com/tools/publishing/versioning.html)
for Android and
[CFBundleVersion](http://stackoverflow.com/questions/4933093/cfbundleversion-in-the-info-plist-upload-error)
for iOS.
Below is an example that explicitly sets versionCode and CFBundleVersion:
<widget id="io.cordova.hellocordova"
version="0.0.1"
android-versionCode="7"
ios-CFBundleVersion="3.3.3">
If an alternative version is not specified, the following
defaults will be used:
// assuming version = MAJOR.MINOR.PATCH-whatever
versionCode = PATCH + MINOR * 100 + MAJOR * 10000
CFBundleVersion = "MAJOR.MINOR.PATCH"
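As an illustration only (this helper is not part of the Cordova tooling), the default computation above can be sketched in Python:

```python
def default_versions(version):
    """Compute the default Android versionCode and iOS CFBundleVersion
    from a "MAJOR.MINOR.PATCH-whatever" version string.

    Illustrative sketch only; not part of the Cordova CLI.
    """
    core = version.split("-")[0]  # drop any "-whatever" suffix
    major, minor, patch = (int(part) for part in core.split("."))
    version_code = patch + minor * 100 + major * 10000
    cf_bundle_version = "%d.%d.%d" % (major, minor, patch)
    return version_code, cf_bundle_version

print(default_versions("3.3.3-rc1"))  # -> (30303, '3.3.3')
```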
## Global Preferences
The following global preferences apply to all platforms:
- `Fullscreen` allows you to hide the status bar at the top of the
screen. The default value is `false`. Example:
<preference name="Fullscreen" value="true" />
- `Orientation` allows you to lock orientation and prevent the
interface from rotating in response to changes in orientation.
Possible values are `default`, `landscape`, or `portrait`. Example:
<preference name="Orientation" value="landscape" />
__NOTE__: The `default` value means _both_ landscape and portrait
orientations are enabled. If you want to use each platform's
default settings (usually portrait-only), leave this tag out of the
`config.xml` file.
## Multi-Platform Preferences
The following preferences apply to more than one platform, but not to
all of them:
- `DisallowOverscroll` (boolean, defaults to `false`): set to `true`
if you don't want the interface to display any feedback when users
scroll past the beginning or end of content.
<preference name="DisallowOverscroll" value="true"/>
Applies to Android and iOS. On iOS, overscroll gestures cause
content to bounce back to its original position. On Android, they
produce a more subtle glowing effect along the top or bottom edge of
the content.
- `BackgroundColor`: Set the app's background color. Supports a
four-byte hex value, with the first byte representing the alpha
channel, and standard RGB values for the following three bytes. This
example specifies blue:
<preference name="BackgroundColor" value="0xff0000ff"/>
Applies to Android and BlackBerry. Overrides CSS otherwise available
across _all_ platforms, for example: `body{background-color:blue}`.
- `HideKeyboardFormAccessoryBar` (boolean, defaults to `false`): set
to `true` to hide the additional toolbar that appears above the
keyboard, helping users navigate from one form input to another.
<preference name="HideKeyboardFormAccessoryBar" value="true"/>
Applies to iOS and BlackBerry.
## The _feature_ Element
If you use the CLI to build applications, you use the `plugin` command
to enable device APIs. This does not modify the top-level `config.xml`
file, so the `<feature>` element does not apply to your workflow. If
you work directly in an SDK and use the platform-specific
`config.xml` file as source, you use the `<feature>` tag to enable
device-level APIs and external plugins. They often appear with custom values in
platform-specific `config.xml` files. For example, here is how to specify the
Device API for Android projects:
<feature name="Device">
<param name="android-package" value="org.apache.cordova.device.Device" />
</feature>
Here is how the element appears for iOS projects:
<feature name="Device">
<param name="ios-package" value="CDVDevice" />
</feature>
See the API Reference for details on how to specify each feature. See
the [Plugin Development Guide](../guide/hybrid/plugins/index.html) for more information on plugins.
## The _platform_ Element
When using the CLI to build applications, it is sometimes necessary to specify
preferences or other elements specific to a particular platform. Use the `<platform>`
element to specify configuration that should only appear in a single platform-specific
`config.xml` file. For example, here is how to specify that only android should use the
Fullscreen preference:
<platform name="android">
<preference name="Fullscreen" value="true" />
</platform>
---
templateKey: index-page
title: Welcome to the Manchester YMCA Harriers
description: The Manchester YMCA Harriers are a friendly running club based at the Y Club in Castlefield, Manchester city centre.
heroImage: /media/2019-sale-sizzler-2.jpg
intro: |
We are a friendly running club based at the Y Club in Castlefield, right in the
heart of Manchester city centre. If you're looking for a central Manchester
running club, then we might just be the one for you!
nextEventDefault: |
  We currently have two regular training sessions each week:
- **Speedwork:** Tuesday, 7:00pm at [Longford Park](/venues/longford-park-stadium)
- **Group run:** Thursday, 6:30pm from [the Y Club](/venues/the-y-club)
Come along and join us for a run!
firstPanelImage: /media/2019-london-finishers.jpeg
firstPanelTitle: About us
firstPanelBody: |
We are affiliated to England Athletics and we frequently take
part in races and other running events. We have club
championships in cross-country, road, fell and track
disciplines. We particularly enjoy taking part in events where
we can run as a team!
We're not just about running; we arrange regular socials in
Manchester city centre and we are often found travelling further
afield for weekend breaks or longer holidays.
firstPanelLink: /about
firstPanelCTA: Read on
secondPanelImage: /media/2019-the-wharf-social.jpg
secondPanelTitle: Join us
secondPanelBody: |
We run and we do lots of fun stuff besides running. What's not
to like?!
  Whether you're practically a pro, just taking your first
running steps, mainly interested in the *après run*, or
somewhere in between all three, we'd love for you to join us!
If you like what you see, try us out for size and come and be a
part of it!
secondPanelLink: /join
secondPanelCTA: Find out more
---
---
id: 105
title: Keys To A Successful Small Business
date: 2010-04-28T00:00:00+00:00
author: admin
layout: post
guid: http://www.trajedybeatz.com/2010/06/05/keys-to-a-successful-small-business/
permalink: /2010/04/28/keys-to-a-successful-small-business/
categories:
- Uncategorized
---
We often hear managers complaining that their employees aren’t productive, don’t listen and just can’t consistently get the job done. As a youth sports coach, I hear coaches with similar complaints—the kids don’t listen, don’t know where to go and don’t try very hard. I can’t relate. The boys on my team are usually focused, do what I ask of them, and work hard. As a business owner, my employees are focused, do what I ask of them and work hard. What am I doing that is different from the rest? And what can this teach you about running a successful small business?
As a coach, I make my boys’ jobs very simple. I ask only two things of them. I ask them to master one shot and I ask them to be aware of what is going on around them. Of course we work on defensive and offensive strategy, but both of those revolve around the two keys that I gave them for success—awareness and mastery.
I teach awareness by constantly asking them to be aware of where the ball is and at the same time to be aware of where their teammates are and where their opponents are. I teach them how to see the ball and their opponent when he doesn’t have the ball. Sounds simple, but for ten-year-olds this is work.
I teach mastery by assigning homework to each boy. The second week of practice, they have to show me a spot on the court from which they can make a shot every time. I don’t care if it is from just two feet under the basket. I want them to know they can make it every single time. As the season progresses, they may gradually move their spot further and further out, but I still ask that they be able to make their shot every time unguarded in practice.
These two simple concepts have a tremendous effect on the boys during their games. They have incredible confidence in their ability to make shots because they “know” that they will always make it. I don’t need to yell at them like other coaches about where they should be on the court because they have developed awareness of what they are doing and seeing. Now let’s see how you can use this in your successful small business.
# ocr-img-text
# Travel agency website ✈💙
The project is a static website for a fictional travel agency. Its purpose was to practice HTML and CSS concepts.

To visit the project, <a href="https://nataliakrein.github.io/travel-agency-website/">click here</a>.

Main concepts put into practice:

<ul>
  <li>Flexbox;</li>
  <li>Media queries;</li>
  <li>Animations;</li>
  <li>How to integrate Google Maps;</li>
</ul>

Since the aim of the project was only to review some basic concepts, I chose not to use the BEM methodology.

### 💻 Desktop version

### 📱 Mobile version

## 🛠 Technologies

I used the following technologies to develop this project:

<ul>
  <li>Sublime Text 3;</li>
  <li>HTML5;</li>
  <li>CSS3;</li>
</ul>

## 💾 Clone

To run the Travel Agency Website locally in development mode you should:

```
git clone https://github.com/nataliakrein/travel-agency-website.git project_name
# open the index.html file in your browser
```

## ⚖ License

This project is under the MIT license. See the <a href="https://github.com/nataliakrein/travel-agency-website/blob/main/LICENSE">LICENSE</a> file for more details.
##
##### Coded with ❤ by <a href="https://github.com/nataliakrein/">Natália Krein</a>
| 36.7 | 163 | 0.745913 | por_Latn | 0.987506 |
d38871b83ab26ed20d9bd4443748afc75cbef7b6 | 1,279 | md | Markdown | starters/shared/README.md | baileyherbert/horizon | e0a2f4887581084189898374bab3225aeca3a2e6 | [
"MIT"
] | 6 | 2018-03-31T19:37:07.000Z | 2020-03-09T09:56:06.000Z | starters/shared/README.md | baileyherbert/horizon | e0a2f4887581084189898374bab3225aeca3a2e6 | [
"MIT"
] | 9 | 2018-10-09T23:56:17.000Z | 2022-02-06T23:44:40.000Z | starters/shared/README.md | baileyherbert/horizon | e0a2f4887581084189898374bab3225aeca3a2e6 | [
"MIT"
] | null | null | null | # Horizon Framework
This is a starter template for the Horizon Framework that is designed for building applications which will be
distributed to end users.
The `vendor` directory is hidden in a subfolder which can be renamed. Additionally, multiple `.htaccess` files have
been placed throughout the project to control access and routing. When routing is unavailable, legacy routing will
automatically kick in.
## Installation
Run the following commands to clone this template and install dependencies into the current working directory.
```bash
npx degit baileyherbert/horizon/starters/shared
composer install -d horizon
```
If you haven't done so already, consider installing the `ace` command line tool globally as well.
```bash
composer global require baileyherbert/ace
```
## Deployment
Package the entire application into an archive and distribute it. Instruct end users to extract the contents of the
archive into the directory on their server where they would like the application to appear. Then, have them create the
`.env.php` file using the included template and customize it.
## More links
Check out the following links for more information.
- Documentation: https://bailey.sh/packages/horizon/docs/
- Repository: https://github.com/baileyherbert/horizon
| 34.567568 | 118 | 0.79828 | eng_Latn | 0.998326 |
d388829eab4e5ae6164e29529c6acd09299c5d1c | 73 | md | Markdown | README.md | uroobasehar/ahsan | 4d032c2244e954d7154bbb1c691bba66c46cb90a | [
"MIT"
] | null | null | null | README.md | uroobasehar/ahsan | 4d032c2244e954d7154bbb1c691bba66c46cb90a | [
"MIT"
] | null | null | null | README.md | uroobasehar/ahsan | 4d032c2244e954d7154bbb1c691bba66c46cb90a | [
"MIT"
] | null | null | null | # [International Review System](https://github.com/uroobasehar/ahsan/)
| 18.25 | 70 | 0.753425 | yue_Hant | 0.787001 |
d389161dbc788a9fbcef8dacd67d50554f2e5d4e | 519 | markdown | Markdown | _posts/2016-02-14-flickr-client.markdown | scottmcallister/scottmcallister.github.io | 0b12a01ba158dec5b37d143638226a6f61096dc9 | [
"Apache-2.0"
] | null | null | null | _posts/2016-02-14-flickr-client.markdown | scottmcallister/scottmcallister.github.io | 0b12a01ba158dec5b37d143638226a6f61096dc9 | [
"Apache-2.0"
] | null | null | null | _posts/2016-02-14-flickr-client.markdown | scottmcallister/scottmcallister.github.io | 0b12a01ba158dec5b37d143638226a6f61096dc9 | [
"Apache-2.0"
] | null | null | null | ---
title: Flickr Client
subtitle: React Native
layout: project
modal-id: 6
date: 2016-02-14
img: flickr.png
thumbnail: flickr-thumbnail.png
alt: image-alt
project-date: February 2016
client: Personal Project
category: Mobile Development
link: https://github.com/scottmcallister/flickr-client
description: This iOS app uses the Flickr API to allow users to search for photos. There are options to sort search results by relevance, date posted, date taken, or interestingness. The app was built using React Native.
---
| 30.529412 | 220 | 0.789981 | eng_Latn | 0.902224 |
d3895295049c7bdbd4afc3ef8627e206980a098f | 362 | md | Markdown | _posts/2021-07-08/2021-06-22-My-pussy-gives-the-best-hugs-20210622173425931366.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-06-22-My-pussy-gives-the-best-hugs-20210622173425931366.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | _posts/2021-07-08/2021-06-22-My-pussy-gives-the-best-hugs-20210622173425931366.md | ipussy/ipussy.github.io | 95d19a74e38bb54303cf18057a99a57c783e76bf | [
"Apache-2.0"
] | null | null | null | ---
title: "My pussy gives the best hugs 😉"
metadate: "hide"
categories: [ God Pussy ]
image: "https://preview.redd.it/masub9meks671.jpg?auto=webp&s=cb5a6da2c3bbe611344542604a177143ec1e55a6"
thumb: "https://preview.redd.it/masub9meks671.jpg?width=640&crop=smart&auto=webp&s=0d483ea2734f840cd97aff94e6fa835d1209876a"
visit: ""
---
My pussy gives the best hugs 😉
| 36.2 | 124 | 0.770718 | kor_Hang | 0.081417 |
d389962f6c1cc90a5386874a60b634b460cd7a33 | 2,925 | md | Markdown | content/desktop/contributing-and-collaborating-using-github-desktop/managing-commits/squashing-commits.md | Micheleerb/docs | f3b3fe69e5f5c9446bc0a7f6270aa6bb8139be58 | [
"CC-BY-4.0",
"MIT"
] | 6 | 2022-03-09T07:09:42.000Z | 2022-03-09T07:14:08.000Z | content/desktop/contributing-and-collaborating-using-github-desktop/managing-commits/squashing-commits.md | Husky57/docs | 1d590a4feb780b0acc9a41381e721b61146175db | [
"CC-BY-4.0",
"MIT"
] | 133 | 2021-11-01T18:16:33.000Z | 2022-03-29T18:18:46.000Z | content/desktop/contributing-and-collaborating-using-github-desktop/managing-commits/squashing-commits.md | Waleedalaedy/docs | 26d4b73dcbb9a000c32faa37234288649f8d211a | [
"CC-BY-4.0",
"MIT"
] | 2 | 2021-10-05T09:44:04.000Z | 2021-10-05T09:44:52.000Z | ---
title: Squashing commits
intro: 'You can use {% data variables.product.prodname_desktop %} to squash commits in your branch''s history.'
versions:
fpt: '*'
---
## About squashing a commit
Squashing allows you to combine multiple commits in your branch's history into a single commit. This can help keep your repository's history more readable and understandable.
## Squashing a commit
{% mac %}
{% data reusables.desktop.current-branch-menu %}
2. In the list of branches, select the branch that has the commits that you want to squash.
{% data reusables.desktop.history-tab %}
4. Select the commits to squash and drop them on the commit you want to combine them with. You can select one commit or select multiple commits using <kbd>Command</kbd> or <kbd>Shift</kbd>.

5. Modify the commit message of your new commit. The commit messages of the selected commits you want to squash are pre-filled into the **Summary** and **Description** fields.
6. Click **Squash Commits**.
{% endmac %}
{% windows %}
{% data reusables.desktop.current-branch-menu %}
2. In the list of branches, select the branch that has the commits that you want to squash.
{% data reusables.desktop.history-tab %}
4. Select the commits to squash and drop them on the commit you want to combine them with. You can select one commit or select multiple commits using <kbd>Ctrl</kbd> or <kbd>Shift</kbd>.

5. Modify the commit message of your new commit. The commit messages of the selected commits you want to squash are pre-filled into the **Summary** and **Description** fields.
6. Click **Squash Commits**.
{% endwindows %}
## Error messages when squashing commits
When you squash commits, you may see one of the following notifications or error messages.
* A notification states that the requested change to the branch will require a force push to update the remote branch. Force pushing alters the commit history of the branch and will affect other collaborators who are working in that branch. Select **Begin Squash** to start the squash, and then click **Force push origin** to push your changes.

* An error states that the squash failed because there is a merge commit among the squashed commits.

* A notification is shown indicating that there are uncommitted changes present on your current branch. Select **Stash Changes and Continue** to store the changes and proceed, or select **Close** to dismiss the message and commit the changes. When there are no longer any uncommitted changes you can squash your commits.

| 55.188679 | 345 | 0.763761 | eng_Latn | 0.997228 |
---
layout: post
comments: true
categories: Other
---
## Download Proton engine manual book
Darlene's voice trailed off into an incoherent babbling, not to nap. Not for a minute did I doubt he would be living at the apartment court on Las Palmas, I complain? Then after a second he nodded. But there had been no point in making a fuss over it, handing a menu to Paul. sailing through the Straits of Malacca strong ball-lightning was Paul recalled the letter he had written to Reverend Harrison White a couple after his landing on Behring Island for the first time saw some Cupboard to cupboard, drawn by Marine-engineer J, Ulfmpkgrumfl She considered the accusation, Hal. I--" Stone journeys to, and maybe the gov'ment never done killed your MOOG INDIGO scar tissue. " Hoping to prolong the experience, your-head not clean, I don't know, sitting by the fire shelling walnuts. Naomi wasn't slumped across him. Why do you think I don't have a staff. knuckles. The King's wizard says it's still here somewhere about to bond the two kingdoms was broken. If a pretense of control "Where's a lightr cried Jack. There was only a little space to sit among the green shoots and the long, brooding Jonas Salk accepted the picture. "Flying saucers?" Airborne through billowing smoke. Maybe the lords there had heard there was a proton engine manual fleet coming raiding, were Siberia, and Tarry took offense. An Proton engine manual in the Gun-room proton engine manual the _Vega_ during the Wintering, my endeavour is vain; My bosom is straitened. considerable depth proton engine manual the open sea is perhaps uncertain, Mary said. themselves into false gods, it were injustice. them off and pulled them about. Don't bring mice in with it. Dog and therefore boy together recognize that they proton engine manual no longer merely the objects of a feverish proton engine manual, including criminal trials of your proton engine manual. 
good point of observation from which to study the spectacular panoply of stars that brightened the desert leads was an extensive opening, and we treat the remaining eye with radiation. 104). They were all back. She reached out the poker to gather together her namesakes in the hearth, is to get over into Chironian territory! "A nice one," she had added in response to Proton engine manual astonished look. Singapore, ii.
# bayes-implicit-solvent
experiments with Bayesian calibration of implicit solvent models
## Highlights
### Colab notebook illustrating continuous parameter sampling
Our likelihood function depends on comparing ~600 calculated and experimental hydration free energies, which is computationally expensive and must be done at each sampling iteration.
Gradients of this likelihood are computed efficiently using Jax, and used to compare Langevin Monte Carlo with gradient descent. [Open in Colab](https://colab.research.google.com/github//openforcefield/bayes-implicit-solvent/blob/master/notebooks/fast_likelihood_gradients_in_jax_(batches_of_component_gradients).ipynb)

### Demonstration of automatic parameterization
A few Markov Chain Monte Carlo algorithms (implemented in [`samplers.py`](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/bayes_implicit_solvent/samplers.py)) are applied to the task of sampling the continuous parameters of implicit solvent models.

### Comparisons of Gaussian and Student-t likelihood behavior
One observation from this study has been that the tail behavior of the likelihood function comparing experimental and predicted free energies has a pronounced effect on the behavior of samplers.

### RJMC experiments
Atom-typing schemes are represented using trees of SMIRKS patterns, implemented in the file [`typers.py`](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/bayes_implicit_solvent/typers.py), along with uniform cross-model proposals that elaborate upon or delete the types within these schemes.
Scripts for numerical experiments with RJMC and various choices of prior, likelihood, within-model sampler, and constraints on the discrete model space are in `bayes_implicit_solvent/rjmc_experiments/`.
Using Langevin Monte Carlo for within-model sampling, and enforcing that elemental types are retained (see the script [`tree_rjmc_w_elements.py`](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/bayes_implicit_solvent/rjmc_experiments/tree_rjmc_w_elements.py)), we obtain the following result.

In ongoing work, we are attempting to define better-informed cross-model proposals and use more reasonable prior restraints to improve the chance of converging cross-model sampling. Priors and cross-model proposals that are informed by the number of atoms that fall into each type are being prototyped here [`informed_tree_proposals.py`](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/bayes_implicit_solvent/rjmc_experiments/informed_tree_proposals.py).
### Differentiable atom-typing experiments
As an alternative to assigning parameters using trees of SMIRKS, we also briefly considered assigning parameters using differentiable functions of SMIRKS features.
This would allow uncertainty in the parameter-assignment scheme to be represented using a posterior distribution over continuous variables only, rather than a challenging mixed continuous/discrete space.
Linear functions of SMIRKS fingerprints to radii and scales ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/linear-typing-using-atom-features-only-student-t-loss.ipynb))

Multilayer perceptron from SMIRKS fingerprints to radii, scales, and parameters controlling charge-hydration asymmetry ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/feedforward-typing-using-smarts-and-neighbor-features-student-t%2Bcha-df%3D7-and-per-particle-psis-big-batch.ipynb)) 
(Convolutional typing schemes appeared more difficult to optimize numerically, but may be an interesting direction for future work ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/convolutional-typing.ipynb)))
## Detailed contents
### `bayes_implicit_solvent`
* `gb_models/` -- Clones the OpenMM GBSA OBC force using autodiff frameworks such as `jax` and HIPS `autograd`, to allow differentiating w.r.t. per-particle parameters.
* `molecule.py` -- Defines a class `Molecule` that predicts solvation free energy as function of GB parameters and compares to an experimental value, for use in posterior sampling.
* `prior_checking.py` -- methods for checking whether a typing scheme is legal
* `samplers.py` -- defines parameter samplers: random-walk Metropolis-Hastings, Langevin (unadjusted and Metropolis-Adjusted), RJMC
* `smarts.py` -- definitions of SMARTS primitives and decorators
* `solvation_free_energy.py` -- functions for computing solvation free energy using GB models
* `typers.py` -- defines the following classes: `DiscreteProposal`, `BondProposal`, `AtomSpecificationProposal`, `BondSpecificationProposal`, `SMIRKSElaborationProposal`, `SMARTSTyper`, `FlatGBTyper`, `GBTypingTree`, which hopefully encapsulate the bookkeeping needed to sample typing schemes using RJMC
* `utils.py` -- un-filed utilities for: interacting with OpenEye, getting or applying GB parameters in OpenMM systems, caching substructure matches
* `constants.py` -- temperature, unit conventions, etc.
(Currently contains some code that needs to be removed or refactored. `proposals.py` defines the following classes: `Proposal`, `RadiusInheritanceProposal`, `AddOrDeletePrimitiveAtEndOfList`, `AddOrDeletePrimitiveAtRandomPositionInList`, `SwapTwoPatterns`, `MultiProposal`, which were used in initial experiments that did not use a tree representation of the typing scheme. `prepare_freesolv.py` uses OpenEye to construct OEMol objects, assign partial charges, etc. starting from a list of SMILES strings.)
#### `bayes_implicit_solvent/continuous-parameter-experiments/`
* `elemental_types_mala.py` -- Use Metropolis-adjusted Langevin to sample the radii and scales in the elemental-types-only model
* `hydrogen_or_not.py` -- Toy model containing just two "types" -- "hydrogen" vs "not hydrogen" so we can plot the parameter space in 2D for inspection. Attempt to fit GB radii using this restricted typing scheme on subsets of FreeSolv. Also check how the results depend on the number of configuration-samples used in the hydration free energy estimates.
* `smirnoff_types.py` -- Use random-walk Metropolis-Hastings to sample GB radii for models restricted to use the same types defined in the nonbonded force section of smirnoff99frosst.
and many more to be documented further
#### `bayes_implicit_solvent/rjmc_experiments/`
* `informed_tree_proposals.py` -- Experiments with constructing guided discrete-model proposals, as well as with defining more effective priors for the discrete models
* `tree_rjmc_start_from_wildcard.py` -- Experiments running RJMC on GB typing trees starting from wildcard type and building up from there.
* `tree_rjmc_w_elements.py` -- Experiments running RJMC on GB typing trees, keeping elemental types as un-delete-able nodes.
#### `bayes_implicit_solvent/data`
See its readme: contains freesolv and some numerical results in pickle or numpy archives.
#### `bayes_implicit_solvent/hierarchical_typing`
Outdated -- initial experiments where types were introduced by truncating the smirnoff nonbonded definitions
#### `bayes_implicit_solvent/tests`
* `test_bayes_implicit_solvent.py`
* `test_rjmc.py` -- unit tests and integration tests that RJMC on typing trees is correct
#### `bayes_implicit_solvent/vacuum_samples`
Scripts to generate configuration samples of FreeSolv set in vacuum, for use in reweighting.
### `devtools`
Copied from MolSSI's `cookiecutter-compchem`. Requirements listed in `devtools/conda-recipe/meta.yaml`.
### `docs`
To-do
### `notebooks`
Exploratory or visualization-focused notebooks.
#### `elaborate_typing_animation/`
* animated GIFs of initial slow typing-tree sampling code (also affected by a bug, later corrected, where the charges for some molecules were incorrectly prepared). The number of types sampled increased much more than expected, and the sampler became slower the more types were present. ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/plot%20elaborate%20rjmc%20typing%20trees--%20now%20with%20less%20smirks-overlap!.ipynb))

#### `bugfixed_typing_animation/`
* animated GIF of early tree-RJMC run
#### `extended-sim-projections/`
* projections of slow-relaxing torsions in some molecules from FreeSolv

#### `carboxyl-torsion-plots/`
* diagnostic plots for some slow torsional degrees of freedom involving carboxylic acids, encountered when preparing vacuum samples for use in reweighting-based likelihood estimator
#### `projections/`
* diagnostic tICA projections of gas-phase simulations
#### `nelder_mead_plots/`
* baseline of using Nelder-Mead simplex minimization rather than gradient-informed optimization or sampling ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/inspect%20nelder%20mead%20results.ipynb))
<img src="https://user-images.githubusercontent.com/5759036/68699968-1706cc00-0552-11ea-86c4-8840f1628f85.png" width="400"> <img src="https://user-images.githubusercontent.com/5759036/68699996-238b2480-0552-11ea-9a04-00541382c2c0.png" width="400">
#### `rjmc-figures/`
* illustrative example of using RJMC to sample Gaussian mixture models ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/birth-death%20moves%20for%20RJMC%20of%20GMMs.ipynb))
#### `rjmc_animation_march30/`, `rjmc_animation_march30_running/`, `rjmc_animation_march30_weighted/`
* animated GIFs inspecting a long run of tree RJMC, plotting either raw RMSE, RMSE of running-median prediction, or uncertainty-weighted RMSE ([notebook](https://github.com/openforcefield/bayes-implicit-solvent/blob/master/notebooks/inspect-march-30-rjmc-result.ipynb))

**TODO in this section: describe `freesolvbest-case-rmse.png`, AIS results, AIS vs RJMC diagnostic tests**
---
] | null | null | null | ---
title: Gérer les modèles DTDL
titleSuffix: Azure Digital Twins
description: Découvrez comment gérer des modèles DTDL dans Azure Digital Twins, y compris comment les créer, les modifier et les supprimer.
author: baanders
ms.author: baanders
ms.date: 10/20/2021
ms.topic: how-to
ms.service: digital-twins
---
# <a name="manage-azure-digital-twins-models"></a>Manage Azure Digital Twins models
This article describes how to manage the [models](concepts-models.md) in your Azure Digital Twins instance. Management operations include uploading, validating, retrieving, and deleting models.
## <a name="prerequisites"></a>Prerequisites
[!INCLUDE [digital-twins-prereq-instance.md](../../includes/digital-twins-prereq-instance.md)]
[!INCLUDE [digital-twins-developer-interfaces.md](../../includes/digital-twins-developer-interfaces.md)]
[!INCLUDE [visualizing with Azure Digital Twins explorer](../../includes/digital-twins-visualization.md)]
:::image type="content" source="media/how-to-use-azure-digital-twins-explorer/model-graph-panel.png" alt-text="Screenshot of Azure Digital Twins Explorer showing a sample model graph" lightbox="media/how-to-use-azure-digital-twins-explorer/model-graph-panel.png":::
## <a name="create-models"></a>Create models
Models for Azure Digital Twins are written in DTDL and saved as .json files. There's also a [DTDL extension](https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.vscode-dtdl) available for [Visual Studio Code](https://code.visualstudio.com/). The extension provides syntax validation and other features to make it easier to write DTDL documents.
Consider an example in which a hospital wants to digitally represent the rooms in its building. Each room contains a smart soap dispenser for monitoring hand-washing, and sensors to monitor traffic through the room.
The first step toward the solution is to create models to represent each aspect of the hospital. A patient room in this scenario might be described like this:
:::code language="json" source="~/digital-twins-docs-samples/models/PatientRoom.json":::
> [!NOTE]
> This is an example body for a .json file in which a model is defined and saved, to be uploaded as part of a client project. The REST API call, on the other hand, takes an array of model definitions like the one above (which maps to an `IEnumerable<string>` in the .NET SDK). So to use this model directly in the REST API, surround it with brackets.
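For instance, a REST payload for the Room model would take the following shape (trimmed here to just the required top-level fields; the rest of the interface body from the file above would go inside the array element):

```json
[
  {
    "@id": "dtmi:com:contoso:PatientRoom;1",
    "@type": "Interface",
    "@context": "dtmi:dtdl:context;2"
  }
]
```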
This model defines a name and a unique ID for each patient room, along with properties to represent visitor count and hand-wash status. These counters are updated from motion sensors and smart soap dispensers, and are used together to calculate the *handWashPercentage* property. The model also defines a *hasDevices* relationship, which will be used to connect any [digital twins](concepts-twins-graph.md) based on this Room model to the actual devices.
Following this method, you can go on to define models for the entire hospital, or for specific areas within it.
### <a name="validate-syntax"></a>Validate syntax
[!INCLUDE [Azure Digital Twins: validate models info](../../includes/digital-twins-validate.md)]
## <a name="upload-models"></a>Upload models
Once models are created, you can upload them to the Azure Digital Twins instance.
When you're ready to upload a model, you can use the following code snippet for the [.NET SDK](/dotnet/api/overview/azure/digitaltwins/management?view=azure-dotnet&preserve-view=true):
:::code language="csharp" source="~/digital-twins-docs-samples/sdks/csharp/model_operations.cs" id="CreateModel":::
You can also upload multiple models in a single transaction.
If you're using the SDK, you can upload multiple model files with the `CreateModels` method like this:
:::code language="csharp" source="~/digital-twins-docs-samples/sdks/csharp/model_operations.cs" id="CreateModels_multi":::
If you're using the [REST APIs](/rest/api/azure-digitaltwins/) or the [Azure CLI](/cli/azure/dt?view=azure-cli-latest&preserve-view=true), you can also upload multiple models by placing several model definitions in a single JSON file to be uploaded together. In this case, the models should be placed in a JSON array within the file, as in the following example:
:::code language="json" source="~/digital-twins-docs-samples/models/Planet-Moon.json":::
During upload, model files are validated by the service.
## <a name="retrieve-models"></a>Retrieve models
You can list and retrieve models stored on your Azure Digital Twins instance.
Your options for doing so are:
* Retrieve a single model
* Retrieve all models
* Retrieve metadata and dependencies for models
Here are some example calls:
:::code language="csharp" source="~/digital-twins-docs-samples/sdks/csharp/model_operations.cs" id="GetModels":::
The SDK calls to retrieve models all return `DigitalTwinsModelData` objects. `DigitalTwinsModelData` contains metadata about the model stored in the Azure Digital Twins instance, such as the model's name, DTMI, and creation date. The `DigitalTwinsModelData` object can also include the model itself, so depending on parameters you can use the retrieval calls to fetch just the metadata (useful when you want to display a UI list of available tools, for example) or the entire model.
The `RetrieveModelWithDependencies` call returns not only the requested model, but also all models that the requested model depends on.
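As a hedged illustration of these retrieval options (the `client` variable is assumed to be an authenticated `DigitalTwinsClient`, and the model ID is a placeholder), metadata-only listing versus full retrieval with dependencies might look like this with the .NET SDK:

```csharp
using System;
using Azure.DigitalTwins.Core;

// Metadata only: DtdlModel stays empty, which is enough for a UI list.
await foreach (DigitalTwinsModelData md in client.GetModelsAsync())
{
    Console.WriteLine($"{md.Id} uploaded {md.UploadedOn}, decommissioned: {md.Decommissioned}");
}

// Full definitions, including everything the named model depends on.
var options = new GetModelsOptions
{
    IncludeModelDefinition = true,
    DependenciesFor = new[] { "dtmi:com:contoso:PatientRoom;1" }
};
await foreach (DigitalTwinsModelData md in client.GetModelsAsync(options))
{
    Console.WriteLine(md.DtdlModel); // the DTDL JSON itself
}
```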
Models aren't necessarily returned in exactly the same document form they were uploaded in. Azure Digital Twins only guarantees that the returned form is semantically equivalent.
## <a name="update-models"></a>Update models
This section describes considerations and strategies for updating models.
### <a name="before-updating-think-in-the-context-of-your-entire-solution"></a>Before updating: Think in the context of your entire solution
Before making updates to your models, it's recommended to think holistically about your entire solution and the impact of the model changes you're about to make. Models in an Azure Digital Twins solution are often interconnected, so it's important to be aware of cascading changes: when you update one model, you may need to update several other models as well. Updating models affects not only the twins that use the changed models, but can also impact ingress and processing code, client applications, and automated reports.
Here are some recommendations to help you manage your model transitions smoothly:
* Instead of thinking of models as separate entities, consider evolving your entire model set when appropriate, so that models and the relationships between them stay up to date with each other.
* Treat models like source code and manage them in source control. Work on your models and model changes with the same rigor and attention you apply to any other code in your solution.
When you're ready to proceed with updating your models, the rest of this section describes the update strategies you can implement.
### <a name="strategies-for-updating-models"></a>Strategies for updating models
Once a model is uploaded to your Azure Digital Twins instance, the model interface is immutable, which means there's no traditional "editing" of models. Azure Digital Twins also doesn't allow reupload of the exact same model while a matching model already exists in the instance.
To make changes to a model (such as updating `displayName` or `description`, or adding and removing properties), you can only do so by replacing the original model.
There are two strategies to choose from when replacing a model:
* [Strategy 1: Upload new model version](#strategy-1-upload-new-model-version): Upload the model, with a new version number, and update your twins to use that new model. Both the new and old versions of the model exist in your instance until you delete one of them.
    - **Use this strategy when** you want to update only some of the twins that use the model, or when you want to make sure twins stay conformant with their models and writable throughout the model transition.
* [Strategy 2: Delete old model and reupload](#strategy-2-delete-old-model-and-reupload): Delete the original model and upload the new model with the same name and ID (DTMI value) in its place. The old model is completely replaced by the new one.
    - **Use this strategy when** you want to update all twins that use this model at once, in addition to all the code reacting to the models. If your model update contains a breaking change, twins will be non-conformant with their models for a short time while you transition them from the old model to the new one, meaning they can't receive updates until the new model is uploaded and the twins conform to it.
>[!NOTE]
> Making breaking changes to your models is discouraged outside of active development.
Read the following sections for more detail about each strategy option.
### <a name="strategy-1-upload-new-model-version"></a>Strategy 1: Upload new model version
This option involves creating a new version of the model and uploading it to your instance.
This operation does **not** overwrite earlier versions of the model, so multiple versions of the model will coexist in your instance until you [remove them](#remove-models). Since the new model version and the old model version coexist, twins can use either one, meaning that uploading a new version of a model doesn't automatically affect existing twins. The existing twins remain as instances of the old model version, and you can update them to the new model version by patching them.
To use this strategy, follow the steps below.
#### <a name="1-create-and-upload-new-model-version"></a>1. Create and upload new model version
To create a new version of an existing model, start with the DTDL of the original model. Update, add, or remove the fields you want to change.
Next, mark this model as a newer version by updating its `id` field. The last section of the model ID, after the semicolon (`;`), represents the model number. To indicate that this model is now a more-updated version, increment the number at the end of the `id` value to any number greater than the current version number.
For example, if your previous model ID looked like this:
```json
"@id": "dtmi:com:contoso:PatientRoom;1",
```
Version 2 of this model might look like this:
```json
"@id": "dtmi:com:contoso:PatientRoom;2",
```
Then, [upload](#upload-models) the new version of the model to your instance.
This version of the model will then be available in your instance to use for digital twins. It does **not** overwrite earlier versions of the model, so multiple versions of the model now coexist in your instance.
#### <a name="2-update-graph-elements-as-needed"></a>2. Update graph elements as needed
Next, update the **twins and relationships** in your instance to use the new model version instead of the old one.
You can use the following instructions to [update twins](how-to-manage-twin.md#update-a-digital-twins-model) and [update relationships](how-to-manage-graph.md#update-relationships). The patch operation to update a twin's model looks something like this:
:::code language="json" source="~/digital-twins-docs-samples/models/patch-model-1.json":::
>[!IMPORTANT]
>When updating twins, use the **same patch** to update both the model ID (to the new model version) and any fields that must be altered on the twin to make it conform to the new model.
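As a sketch, a single JSON Patch document that moves a twin to version 2 of the model and fills in a property at the same time might look like this (the `occupancyLimit` property is an illustrative name, not part of the model above):

```json
[
  { "op": "replace", "path": "/$metadata/$model", "value": "dtmi:com:contoso:PatientRoom;2" },
  { "op": "add", "path": "/occupancyLimit", "value": 3 }
]
```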
You may also need to update **relationships** and other **models** in your instance that reference this model, to make them refer to the new model version. Doing so requires another model update operation, so return to the beginning of this section and repeat the process for any other models that need updating.
#### <a name="3-optional-decommission-or-delete-old-model-version"></a>3. (Optional) Decommission or delete old model version
If you're no longer using the old model version, you can [decommission](#decommissioning) it. The model keeps existing in the instance, but it can no longer be used to create new digital twins.
You can also [delete](#deletion) the old model completely if you don't want it in the instance anymore.
The sections linked above contain example code and information about decommissioning and deleting models.
### <a name="strategy-2-delete-old-model-and-reupload"></a>Strategy 2: Delete old model and reupload
Instead of incrementing the version of a model, you can delete a model completely and reupload an edited model to the instance.
Azure Digital Twins doesn't remember that the old model was ever uploaded, so this action is like uploading an entirely new model. Twins in the graph that use the model will automatically switch over to the new definition once it's available. Depending on how the new definition differs from the old one, these twins may have properties and relationships that match the deleted definition and aren't valid with the new one, so you may need to patch them to make sure they remain valid.
To use this strategy, follow the steps below.
### <a name="1-delete-old-model"></a>1. Delete old model
Since Azure Digital Twins doesn't allow two models with the same ID, start by deleting the original model from your instance.
>[!NOTE]
> If you have other models that depend on this model (through inheritance or components), you'll need to remove those references before you can delete the model. You can update those dependent models to temporarily remove the references, or delete the dependent models and reupload them in a later step.
Use the following instructions to [delete your original model](#deletion). This action leaves the twins that were using the model temporarily "orphaned," since they now use a model that no longer exists. This state will be repaired in the next step when you reupload the updated model.
### <a name="2-create-and-upload-new-model"></a>2. Create and upload new model
Start with the DTDL of the original model. Update, add, or remove the fields you want to change.
Then, [upload the model](#upload-models) to the instance, as though it were a brand-new model being uploaded for the first time.
### <a name="3-update-graph-elements-as-needed"></a>3. Update graph elements as needed
Now that your new model has been uploaded in place of the old one, the twins in your graph will automatically begin to use the new model definition once the caching in the instance has expired and reset. **This process may take 10-15 minutes or longer**, depending on the size of your graph. After that, the new and changed properties on your model will be accessible, while removed properties will no longer be accessible.
>[!NOTE]
> If you removed other dependent models earlier in order to delete the original model, reupload them now after the cache has reset. If you updated the dependent models to temporarily remove references to the original model, you can update them again to put the reference back.
Next, update the **twins and relationships** in your instance so their properties match the properties defined by the new model. Until that's done, twins that don't match their model can still be read, but can't be written to. For more on the state of twins without a valid model, see [Twins without models](#after-deletion-twins-without-models).
There are two ways to update the twins and relationships so that they become writable again with the new model:
* Patch the twins and relationships as needed so they conform to the new model. Use the following instructions to [update twins](how-to-manage-twin.md#update-a-digital-twin) and [update relationships](how-to-manage-graph.md#update-relationships).
    - **If you added properties**: Updating twins and relationships with the new values isn't required, since twins missing the new values remain valid. You can patch them however you want to add values for the new properties.
    - **If you removed properties**: It's required to patch twins to remove the properties that are now invalid with the new model.
    - **If you updated properties**: It's required to patch twins to update the values of changed properties so they're valid with the new model.
* Delete the twins and relationships that use the model, and recreate them. Use the following instructions to [delete twins](how-to-manage-twin.md#delete-a-digital-twin) and [recreate twins](how-to-manage-twin.md#create-a-digital-twin), and to [delete relationships](how-to-manage-graph.md#delete-relationships) and [recreate relationships](how-to-manage-graph.md#create-relationships).
    - You might want to do this if you're making lots of changes to the model and it would be difficult to update the existing twins to match. However, full recreation can be complicated if you have many twins that are interconnected by many relationships.
## <a name="remove-models"></a>Remove models
Models can be removed from the service in one of two ways:
* **Decommissioning**: Once a model is decommissioned, you can no longer use it to create new digital twins. Existing digital twins that already use this model aren't affected, so you can still update them with things like property changes and adding or deleting relationships.
* **Deletion**: This operation completely removes the model from the solution. Any twins that were using this model are no longer associated with any valid model, so they're treated as though they don't have a model at all. You can still read these twins, but you won't be able to make any updates on them until they're reassigned to a different model.
These are separate features and they don't impact each other, although they can be used together to remove a model gradually.
### <a name="decommissioning"></a>Decommissioning
To decommission a model, you can use the [DecommissionModel](/dotnet/api/azure.digitaltwins.core.digitaltwinsclient.decommissionmodel?view=azure-dotnet&preserve-view=true) method from the SDK:
:::code language="csharp" source="~/digital-twins-docs-samples/sdks/csharp/model_operations.cs" id="DecommissionModel":::
You can also decommission a model with the [DigitalTwinModels Update](/rest/api/digital-twins/dataplane/models/digitaltwinmodels_update) REST API call. The `decommissioned` property is the only property that can be replaced with this API call. The JSON Patch document looks something like this:
:::code language="json" source="~/digital-twins-docs-samples/models/patch-decommission-model.json":::
A model's decommissioning status is included in the `ModelData` records returned by the model retrieval APIs.
### <a name="deletion"></a>Deletion
You can delete all models in your instance at once, or you can delete them individually.
For an example of deleting all models at the same time, see the [Azure Digital Twins end-to-end samples](https://github.com/Azure-Samples/digital-twins-samples/blob/master/AdtSampleApp/SampleClientApp/CommandLoop.cs) repository on GitHub. The *CommandLoop.cs* file contains a `CommandDeleteAllModels` function with code that deletes all of the models in the instance.
Pour supprimer les modèles de façon individuelle, aidez-vous des instructions et des informations fournies dans la suite de cette section.
#### <a name="before-deletion-deletion-requirements"></a>Before deletion: Deletion requirements

Generally, models can be deleted at any time.

The exception is models that other models depend on, either with an `extends` relationship or as a component. For example, if a ConferenceRoom model extends a Room model, and has an ACUnit model as a component, you can't delete Room or ACUnit until ConferenceRoom removes those respective references.

You can do this by updating the dependent model to remove the dependencies, or by deleting the dependent model completely.
#### <a name="during-deletion-deletion-process"></a>During deletion: Deletion process

Even if a model meets the requirements for immediate deletion, you may want to go through a few preliminary steps to avoid unintended consequences for the digital twins left behind. Here are some steps that can help you manage the process:

1. First, decommission the model
2. Wait a few minutes, to make sure the service has processed any last-minute twin creation requests sent before the decommission
3. Query twins by model to see all the twins that are using the now-decommissioned model
4. Delete the twins if you no longer need them, or patch them to a new model if needed. You can also choose to leave the twins alone, in which case they'll become twins without models once the model is deleted. See the next section for the implications of this state.
5. Wait a few more minutes to make sure the changes have percolated through
6. Delete the model
To delete a model, you can use the [DeleteModel](/dotnet/api/azure.digitaltwins.core.digitaltwinsclient.deletemodel?view=azure-dotnet&preserve-view=true) SDK call:

:::code language="csharp" source="~/digital-twins-docs-samples/sdks/csharp/model_operations.cs" id="DeleteModel":::

You can also delete a model with the [DigitalTwinModels Delete](/rest/api/digital-twins/dataplane/models/digitaltwinmodels_delete) REST API call.
#### <a name="after-deletion-twins-without-models"></a>After deletion: Twins without models

Once a model is deleted, any digital twins that were using it are now considered to be without a model. There's no query that can give you a list of all the twins in this state, but you *can* still query the twins by the deleted model to see which twins are affected.
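As a sketch, such a query can be written in the Azure Digital Twins query language using `IS_OF_MODEL`; the DTMI below is a hypothetical example, not one taken from this article:

```sql
SELECT * FROM digitaltwins WHERE IS_OF_MODEL('dtmi:example:Room;1')
```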
Here's an overview of what you can and can't do with twins that don't have a model.

Things you **can** do:
* Query the twin
* Read properties
* Read outgoing relationships
* Add and delete incoming relationships (other twins can still form relationships *to* this twin)
  - The `target` in the relationship definition can still reflect the DTMI of the deleted model. A relationship with no defined target can also work here.
* Delete relationships
* Delete the twin

Things you **can't** do:
* Edit outgoing relationships (relationships *from* this twin to other twins)
* Edit properties
#### <a name="after-deletion-reuploading-a-model"></a>After deletion: Reuploading a model

After a model has been deleted, you may decide later to upload a new model with the same ID as the one you deleted. Here's what happens in that case.

* From the solution store's perspective, this operation is the same as uploading a completely new model. The service doesn't remember the old one was ever uploaded.
* If there are any remaining twins in the graph that referenced the deleted model, they're no longer orphaned, since that model ID is valid again with the new definition. However, if the new definition of the model is different from the definition of the model you deleted, these twins may have properties and relationships that match the deleted definition and aren't valid with the new one.

Azure Digital Twins doesn't prevent this state, so be careful to patch the twins appropriately so that they remain valid after the model definition switch.
## <a name="next-steps"></a>Next steps

See how to create and manage digital twins based on your models:

* [Manage digital twins](how-to-manage-twin.md)
Version: 0.2.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘httr’ ‘progress’
All declared Imports should be used.
```
# adegenet
Version: 2.1.1
## In both
* checking installed package size ... NOTE
```
installed size is 6.7Mb
sub-directories of 1Mb or more:
data 1.3Mb
files 1.7Mb
R 3.1Mb
```
# admixturegraph
Version: 1.0.2
## In both
* checking installed package size ... NOTE
```
installed size is 6.1Mb
sub-directories of 1Mb or more:
data 2.0Mb
R 3.1Mb
```
# ADPclust
Version: 0.7
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘fields’ ‘knitr’
All declared Imports should be used.
```
# AeRobiology
Version: 1.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘colorspace’ ‘devtools’ ‘httpuv’ ‘imager’ ‘purrr’
All declared Imports should be used.
```
# afex
Version: 0.23-0
## In both
* checking installed package size ... NOTE
```
installed size is 5.4Mb
sub-directories of 1Mb or more:
doc 2.7Mb
extdata 1.8Mb
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘ez’
```
# afmToolkit
Version: 0.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘assertthat’ ‘DBI’ ‘tibble’
All declared Imports should be used.
```
# AGread
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘reshape2’
All declared Imports should be used.
```
# ahpsurvey
Version: 0.4.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘knitr’ ‘randomNames’ ‘tidyr’
All declared Imports should be used.
```
# aire.zmvm
Version: 0.8.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 52 marked UTF-8 strings
```
# airportr
Version: 0.1.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 676 marked UTF-8 strings
```
# alphavantager
Version: 0.1.0
## Newly broken
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
2: stop(content, call. = F) at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/alphavantager/new/alphavantager.Rcheck/00_pkg_src/alphavantager/R/av_get.R:103
── 3. Error: call Technical Indicators (@test_av_get.R#57) ────────────────────
Thank you for using Alpha Vantage! Our standard API call frequency is 5 calls per minute and 500 calls per day. Please visit https://www.alphavantage.co/premium/ if you would like to target a higher API call frequency.. API parameters used: symbol=MSFT, function=SMA, interval=monthly, time_period=60, series_type=close, apikey=HIDDEN_FOR_YOUR_SAFETY
1: av_get(symbol, av_fun, interval = interval, time_period = time_period, series_type = series_type) at testthat/test_av_get.R:57
2: stop(content, call. = F) at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/alphavantager/new/alphavantager.Rcheck/00_pkg_src/alphavantager/R/av_get.R:103
══ testthat results ═══════════════════════════════════════════════════════════
OK: 7 SKIPPED: 0 FAILED: 3
1. Error: call TIMES_SERIES_INTRADAY (@test_av_get.R#22)
2. Error: call SECTOR (@test_av_get.R#38)
3. Error: call Technical Indicators (@test_av_get.R#57)
Error: testthat unit tests failed
Execution halted
```
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘devtools’
All declared Imports should be used.
```
# alternativeSplicingEvents.hg19
Version: 1.0.1
## In both
* checking extension type ... ERROR
```
Extensions with Type ‘Annotation package’ cannot be checked.
```
# ameco
Version: 0.2.9
## In both
* checking installed package size ... NOTE
```
installed size is 16.2Mb
sub-directories of 1Mb or more:
data 16.1Mb
```
# amplican
Version: 1.2.1
## In both
* checking installed package size ... NOTE
```
installed size is 13.8Mb
sub-directories of 1Mb or more:
doc 12.5Mb
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘CrispRVariants’
```
# amt
Version: 0.0.5.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘magrittr’ ‘Rcpp’
All declared Imports should be used.
```
# analysisPipelines
Version: 1.0.0
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘SparkR’
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘devtools’
All declared Imports should be used.
```
# annotatr
Version: 1.6.0
## In both
* checking examples ... ERROR
```
Running examples in ‘annotatr-Ex.R’ failed
The error most likely occurred in:
> ### Name: build_annotations
> ### Title: A function to build annotations from TxDb.* and AnnotationHub
> ### resources
> ### Aliases: build_annotations
>
> ### ** Examples
>
> # Example with hg19 gene promoters
> annots = c('hg19_genes_promoters')
> annots_gr = build_annotations(genome = 'hg19', annotations = annots)
Error in build_gene_annots(genome = genome, annotations = gene_annotations) :
The package TxDb.Hsapiens.UCSC.hg19.knownGene is not installed, please install it via Bioconductor.
Calls: build_annotations
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
snapshotDate(): 2018-04-30
Building annotation Gm12878 from AnnotationHub resource AH23256 ...
require("rtracklayer")
downloading 0 resources
loading from cache
'/Users/romain//.AnnotationHub/28684'
Quitting from lines 153-170 (annotatr-vignette.Rmd)
Error: processing vignette 'annotatr-vignette.Rmd' failed with diagnostics:
The package TxDb.Hsapiens.UCSC.hg19.knownGene is not installed, please install it via Bioconductor.
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘org.Dm.eg.db’ ‘org.Gg.eg.db’ ‘org.Hs.eg.db’ ‘org.Mm.eg.db’
‘org.Rn.eg.db’ ‘TxDb.Dmelanogaster.UCSC.dm3.ensGene’
‘TxDb.Dmelanogaster.UCSC.dm6.ensGene’
‘TxDb.Ggallus.UCSC.galGal5.refGene’
‘TxDb.Hsapiens.UCSC.hg19.knownGene’
‘TxDb.Hsapiens.UCSC.hg38.knownGene’
‘TxDb.Mmusculus.UCSC.mm9.knownGene’
‘TxDb.Mmusculus.UCSC.mm10.knownGene’
‘TxDb.Rnorvegicus.UCSC.rn4.ensGene’
‘TxDb.Rnorvegicus.UCSC.rn5.refGene’
‘TxDb.Rnorvegicus.UCSC.rn6.refGene’
```
* checking R code for possible problems ... NOTE
```
plot_coannotations: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/annotatr/new/annotatr.Rcheck/00_pkg_src/annotatr/R/visualize.R:176-178)
plot_numerical_coannotations: no visible binding for global variable
‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/annotatr/new/annotatr.Rcheck/00_pkg_src/annotatr/R/visualize.R:463-480)
plot_numerical_coannotations: no visible binding for global variable
‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/annotatr/new/annotatr.Rcheck/00_pkg_src/annotatr/R/visualize.R:466-471)
plot_numerical_coannotations: no visible binding for global variable
‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/annotatr/new/annotatr.Rcheck/00_pkg_src/annotatr/R/visualize.R:473-478)
Undefined global functions or variables:
.
```
# anomalize
Version: 0.1.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.5Mb
sub-directories of 1Mb or more:
help 4.7Mb
```
# archivist
Version: 2.3.2
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘archivist.github’
```
# areal
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘lwgeom’ ‘tibble’
All declared Imports should be used.
```
# arena2r
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘purrr’ ‘shinyBS’ ‘shinydashboard’ ‘shinyjs’
All declared Imports should be used.
```
# asremlPlus
Version: 4.1-22
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘asreml’
```
# atlantistools
Version: 0.4.3
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
── 4. Failure: test extraction for Small-spotted catshark (@test-growth-fishbase
df1$locality[1] not equivalent to "Central Adriatic, 100-200 m depth".
1/1 mismatches
x[1]: "northeast Mediterranean Sea"
y[1]: "Central Adriatic, 100-200 m depth"
══ testthat results ═══════════════════════════════════════════════════════════
OK: 193 SKIPPED: 1 FAILED: 4
1. Failure: test extraction for Small-spotted catshark (@test-growth-fishbase.R#6)
2. Failure: test extraction for Small-spotted catshark (@test-growth-fishbase.R#7)
3. Failure: test extraction for Small-spotted catshark (@test-growth-fishbase.R#8)
4. Failure: test extraction for Small-spotted catshark (@test-growth-fishbase.R#9)
Error: testthat unit tests failed
Execution halted
```
# auctestr
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tidyr’
All declared Imports should be used.
```
# augmentedRCBD
Version: 0.1.0
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘agricolae’
```
# auk
Version: 0.3.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 552 marked UTF-8 strings
```
# autocogs
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘broom’ ‘diptest’ ‘ggplot2’ ‘hexbin’ ‘MASS’ ‘moments’
All declared Imports should be used.
```
# BALCONY
Version: 0.2.10
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘base’
All declared Imports should be used.
```
# BANEScarparkinglite
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘zoo’
All declared Imports should be used.
```
# banR
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘stringr’
All declared Imports should be used.
```
# banter
Version: 0.9.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘ranger’
All declared Imports should be used.
```
# basecallQC
Version: 1.4.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
doc 1.8Mb
extdata 2.8Mb
```
# bayesCT
Version: 0.99.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘devtools’ ‘msm’ ‘parallel’ ‘tidyr’
All declared Imports should be used.
```
# bayesdfa
Version: 0.1.2
## In both
* checking installed package size ... NOTE
```
installed size is 5.6Mb
sub-directories of 1Mb or more:
libs 4.8Mb
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# bayesplot
Version: 1.6.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.1Mb
sub-directories of 1Mb or more:
doc 4.0Mb
R 2.5Mb
```
# baystability
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggfortify’ ‘ggplot2’ ‘matrixStats’ ‘reshape2’ ‘scales’
All declared Imports should be used.
```
# BgeeDB
Version: 2.6.2
## In both
* checking whether package ‘BgeeDB’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BgeeDB/new/BgeeDB.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘BgeeDB’ ...
** R
** data
** inst
** byte-compile and prepare package for lazy loading
Error : package ‘GO.db’ required by ‘topGO’ could not be found
ERROR: lazy loading failed for package ‘BgeeDB’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BgeeDB/new/BgeeDB.Rcheck/BgeeDB’
```
### CRAN
```
* installing *source* package ‘BgeeDB’ ...
** R
** data
** inst
** byte-compile and prepare package for lazy loading
Error : package ‘GO.db’ required by ‘topGO’ could not be found
ERROR: lazy loading failed for package ‘BgeeDB’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BgeeDB/old/BgeeDB.Rcheck/BgeeDB’
```
# billboard
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 660 marked UTF-8 strings
```
# binneR
Version: 2.0.10
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘metaboData’
```
# biobroom
Version: 1.12.1
## In both
* checking re-building of vignette outputs ... WARNING
```
...
The following object is masked from 'package:dplyr':
count
Loading required package: BiocParallel
Attaching package: 'DelayedArray'
The following objects are masked from 'package:matrixStats':
colMaxs, colMins, colRanges, rowMaxs, rowMins, rowRanges
The following objects are masked from 'package:base':
aperm, apply
Quitting from lines 134-139 (biobroom_vignette.Rmd)
Error: processing vignette 'biobroom_vignette.Rmd' failed with diagnostics:
there is no package called 'airway'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘airway’
```
* checking dependencies in R code ... NOTE
```
'library' or 'require' call to ‘DESeq2’ in package code.
Please use :: or requireNamespace() instead.
See section 'Suggested packages' in the 'Writing R Extensions' manual.
Missing or unexported object: ‘dplyr::tbl_dt’
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/biobroom/new/biobroom.Rcheck/00_pkg_src/biobroom/R/qvalue_tidiers.R:65-66)
tidy.RangedSummarizedExperiment: no visible binding for global variable
‘value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/biobroom/new/biobroom.Rcheck/00_pkg_src/biobroom/R/SummarizedExperiment_tidiers.R:43-45)
tidy.RangedSummarizedExperiment: no visible binding for global variable
‘gene’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/biobroom/new/biobroom.Rcheck/00_pkg_src/biobroom/R/SummarizedExperiment_tidiers.R:43-45)
tidy.RangedSummarizedExperiment: no visible global function definition
for ‘colData’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/biobroom/new/biobroom.Rcheck/00_pkg_src/biobroom/R/SummarizedExperiment_tidiers.R:48)
Undefined global functions or variables:
. calcNormFactors colData counts design DGEList end estimate
estimateSizeFactors exprs<- fData<- gene gr is lambda model.matrix
p.adjust pData pData<- pi0 protein rowRanges sample.id seqnames
setNames smoothed start tbl_dt term value voom voomWithQualityWeights
Consider adding
importFrom("methods", "is")
importFrom("stats", "end", "model.matrix", "p.adjust", "setNames",
"start")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# bioCancer
Version: 1.8.0
## In both
* checking package dependencies ... ERROR
```
Packages required but not available: ‘org.Hs.eg.db’ ‘reactome.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# BiocOncoTK
Version: 1.0.3
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 18-30 (BiocOncoTK.Rmd)
Error: processing vignette 'BiocOncoTK.Rmd' failed with diagnostics:
there is no package called 'org.Hs.eg.db'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘org.Hs.eg.db’ ‘TxDb.Hsapiens.UCSC.hg19.knownGene’
‘TxDb.Hsapiens.UCSC.hg18.knownGene’
```
* checking installed package size ... NOTE
```
installed size is 6.0Mb
sub-directories of 1Mb or more:
data 4.0Mb
doc 1.7Mb
```
* checking R code for possible problems ... NOTE
```
...
rainfall: no visible global function definition for ‘ylab’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:164-168)
rainfall: no visible global function definition for ‘xlab’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:164-168)
rainfall: no visible global function definition for ‘geom_vline’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:169-171)
rainfall: no visible global function definition for ‘aes’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:169-171)
rainfall: no visible global function definition for
‘scale_x_continuous’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:169-171)
rainfall: no visible global function definition for ‘ggtitle’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:172)
rainfall: no visible global function definition for ‘geom_text’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:173-174)
rainfall: no visible global function definition for ‘aes’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BiocOncoTK/new/BiocOncoTK.Rcheck/00_pkg_src/BiocOncoTK/R/rainfall3.R:173-174)
Undefined global functions or variables:
aes BiocFileCache element_blank genome geom_point geom_text
geom_vline ggplot ggtitle scale_x_continuous seqlengths theme xlab
ylab
```
# biotmle
Version: 1.4.0
## In both
* checking R code for possible problems ... NOTE
```
.biotmle: no visible global function definition for ‘new’
Undefined global functions or variables:
new
Consider adding
importFrom("methods", "new")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# blkbox
Version: 1.0
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘bigrf’
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘glmnet’ ‘gtools’ ‘knitr’ ‘nnet’ ‘parallel’ ‘reshape’ ‘rJava’
‘rmarkdown’ ‘shinyjs’
All declared Imports should be used.
Missing or unexported object: ‘xgboost::predict’
```
# BloodCancerMultiOmics2017
Version: 1.0.2
## In both
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:IRanges':
intersect, setdiff, union
The following objects are masked from 'package:S4Vectors':
intersect, setdiff, union
The following objects are masked from 'package:BiocGenerics':
intersect, setdiff, union
The following objects are masked from 'package:base':
intersect, setdiff, union
Quitting from lines 46-92 (BloodCancerMultiOmics2017.Rmd)
Error: processing vignette 'BloodCancerMultiOmics2017.Rmd' failed with diagnostics:
there is no package called 'org.Hs.eg.db'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘org.Hs.eg.db’
```
* checking installed package size ... NOTE
```
installed size is 115.7Mb
sub-directories of 1Mb or more:
data 80.0Mb
doc 26.5Mb
extdata 8.5Mb
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘vsn’
```
# bmlm
Version: 1.3.11
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# bodenmiller
Version: 0.1
## In both
* checking installed package size ... NOTE
```
installed size is 8.9Mb
sub-directories of 1Mb or more:
data 8.7Mb
```
# bootnet
Version: 1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘psych’
All declared Imports should be used.
```
# bossMaps
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘rgdal’ ‘tidyr’
All declared Imports should be used.
```
# bpbounds
Version: 0.1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘methods’
All declared Imports should be used.
```
# BrailleR
Version: 0.29.1
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘installr’
```
* checking installed package size ... NOTE
```
installed size is 5.6Mb
sub-directories of 1Mb or more:
doc 1.3Mb
R 2.1Mb
Sound 1.0Mb
```
# brazilmaps
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘sp’
All declared Imports should be used.
```
# breathtestcore
Version: 0.4.6
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘breathteststan’
```
# breathteststan
Version: 0.4.7
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# broom.mixed
Version: 0.2.4
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘glmmADMB’
```
# bsam
Version: 1.1.2
## In both
* checking whether package ‘bsam’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/bsam/new/bsam.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘bsam’ ...
** package ‘bsam’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rjags’:
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/bsam/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/bsam/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/bsam/rjags/libs/rjags.so
Reason: image not found
Error : package ‘rjags’ could not be loaded
ERROR: lazy loading failed for package ‘bsam’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/bsam/new/bsam.Rcheck/bsam’
```
### CRAN
```
* installing *source* package ‘bsam’ ...
** package ‘bsam’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rjags’:
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/bsam/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/bsam/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/bsam/rjags/libs/rjags.so
Reason: image not found
Error : package ‘rjags’ could not be loaded
ERROR: lazy loading failed for package ‘bsam’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/bsam/old/bsam.Rcheck/bsam’
```
# BubbleTree
Version: 2.10.0
## In both
* checking installed package size ... NOTE
```
installed size is 29.4Mb
sub-directories of 1Mb or more:
data 23.4Mb
doc 5.3Mb
```
* checking R code for possible problems ... NOTE
```
annoByOverlap,Annotate: no visible binding for global variable
'queryHits'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/BubbleTree/new/BubbleTree.Rcheck/00_pkg_src/BubbleTree/R/Annotate.R:107)
Undefined global functions or variables:
queryHits
```
# c14bazAAR
Version: 1.0.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 76 marked UTF-8 strings
```
# caffsim
Version: 0.2.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘markdown’
All declared Imports should be used.
```
# cansim
Version: 0.2.3
## In both
* checking examples ... ERROR
```
...
> ### Aliases: get_cansim_changed_tables
>
> ### ** Examples
>
> get_cansim_changed_tables("2018-08-01")
Error in get_cansim_changed_tables("2018-08-01") :
Problem downloading data, status code 503
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>503 Service Unavailable</title>
</head>
<body>
<h1>Service Unavailable</h1>
<p>The server is temporarily unable to service your
request due to maintenance downtime or capacity
problems. Please try again later.</p>
</body>
</html>
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 62-66 (cansim.Rmd)
Error: processing vignette 'cansim.Rmd' failed with diagnostics:
Problem downloading data, status code 503
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>503 Service Unavailable</title>
</head>
<body>
<h1>Service Unavailable</h1>
<p>The server is temporarily unable to service your
request due to maintenance downtime or capacity
problems. Please try again later.</p>
</body>
</html>
Execution halted
```
# canvasXpress
Version: 1.23.3
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
> library(testthat)
> library(canvasXpress)
>
> test_check("canvasXpress")
── 1. Failure: Incorrect Data Types (@test-other--BASE.R#44) ──────────────────
`canvasXpress(data = "'Test'")` threw an error with unexpected message.
Expected match: "[Couldn't|Could not] resolve.*"
Actual message: "Not a valid URL!"
══ testthat results ═══════════════════════════════════════════════════════════
OK: 745 SKIPPED: 0 FAILED: 1
1. Failure: Incorrect Data Types (@test-other--BASE.R#44)
Error: testthat unit tests failed
Execution halted
```
# capm
Version: 0.13.9
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 59 marked UTF-8 strings
```
# CARBayesST
Version: 3.0.1
## In both
* checking re-building of vignette outputs ... NOTE
```
...
x
Loading required package: sf
Linking to GEOS 3.6.1, GDAL 2.1.3, PROJ 4.9.3
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'CARBayesST.tex' failed.
LaTeX errors:
! LaTeX Error: File `multirow.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.13 \usepackage
{multicol}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
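A missing `multirow.sty` points at an incomplete TeX installation on the check machine rather than a bug in the package itself. With a TinyTeX-based setup, one way to supply the missing style file is (assuming the `tinytex` package is installed):

```
# Install the missing LaTeX package into the local TeX tree
tinytex::tlmgr_install("multirow")
```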
# caret
Version: 6.0-81
## In both
* checking installed package size ... NOTE
```
installed size is 9.5Mb
sub-directories of 1Mb or more:
data 1.5Mb
models 2.4Mb
R 4.1Mb
```
# cartools
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘animation’ ‘devtools’ ‘gapminder’ ‘knitr’ ‘rlist’ ‘rmarkdown’
‘roxygen2’ ‘sde’ ‘shiny’ ‘tidyverse’ ‘usethis’ ‘utils’
All declared Imports should be used.
```
# CaseBasedReasoning
Version: 0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘cowplot’ ‘dplyr’ ‘ranger’ ‘Rcpp’ ‘rms’ ‘survival’ ‘tidyverse’
All declared Imports should be used.
```
# casino
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘crayon’ ‘dplyr’ ‘R6’ ‘tidyr’
All declared Imports should be used.
```
# CATALYST
Version: 1.4.2
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking installed package size ... NOTE
```
installed size is 10.6Mb
sub-directories of 1Mb or more:
data 3.1Mb
doc 5.1Mb
R 2.0Mb
```
* checking R code for possible problems ... NOTE
```
plotDiffHeatmap,matrix-SummarizedExperiment: no visible binding for
global variable ‘cluster_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CATALYST/new/CATALYST.Rcheck/00_pkg_src/CATALYST/R/plotDiffHeatmap.R:136)
plotDiffHeatmap,matrix-SummarizedExperiment: no visible binding for
global variable ‘sample_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CATALYST/new/CATALYST.Rcheck/00_pkg_src/CATALYST/R/plotDiffHeatmap.R:136)
Undefined global functions or variables:
cluster_id sample_id
```
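"No visible binding for global variable" NOTEs for column names used in non-standard evaluation are commonly silenced by declaring the names at the package level, e.g. in a `zzz.R` file (a sketch, assuming `cluster_id` and `sample_id` really are data columns rather than genuine typos):

```
# Declare NSE column names so R CMD check does not flag them
utils::globalVariables(c("cluster_id", "sample_id"))
```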
# catenary
Version: 1.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tidyverse’
All declared Imports should be used.
```
# causaldrf
Version: 0.3
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'Using_causaldrf.tex' failed.
LaTeX errors:
! LaTeX Error: File `utf8x.def' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: def)
! Emergency stop.
<read *>
l.165 \endinput
^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# CausalImpact
Version: 1.2.3
## In both
* checking R code for possible problems ... NOTE
```
ConstructModel: warning in AddDynamicRegression(ss, formula, data =
data, sigma.mean.prior = sigma.mean.prior): partial argument match of
'sigma.mean.prior' to 'sigma.mean.prior.DEPRECATED'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CausalImpact/new/CausalImpact.Rcheck/00_pkg_src/CausalImpact/R/impact_model.R:232-233)
```
# ccrs
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘methods’
All declared Imports should be used.
```
# cellscape
Version: 1.4.0
## In both
* checking Rd \usage sections ... WARNING
```
Duplicated \argument entries in documentation object 'dfs_tree':
‘chrom_bounds’ ‘ncols’ ‘chrom_bounds’ ‘cnv_data’ ‘chrom_bounds’
‘n_bp_per_pixel’ ‘mut_data’ ‘width’ ‘height’ ‘mutations’ ‘height’
‘width’ ‘clonal_prev’ ‘tree_edges’ ‘alpha’ ‘clonal_prev’ ‘tree_edges’
‘genotype_position’ ‘clone_colours’ ‘perturbations’ ‘mutations’
‘tree_edges’ ‘clonal_prev’ ‘clonal_prev’ ‘tree_edges’ ‘clone_colours’
‘mutations’
Functions with \usage entries need to have the appropriate \alias
entries, and all their arguments documented.
The \usage entries must correspond to syntactically valid R code.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘plyr’
All declared Imports should be used.
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cellscape/new/cellscape.Rcheck/00_pkg_src/cellscape/R/cellscape.R:1134)
getMutOrder: no visible global function definition for ‘coef’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cellscape/new/cellscape.Rcheck/00_pkg_src/cellscape/R/cellscape.R:1135)
getTargetedHeatmapForEachSC: no visible binding for global variable
‘single_cell_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cellscape/new/cellscape.Rcheck/00_pkg_src/cellscape/R/cellscape.R:1156)
getTargetedHeatmapForEachSC: no visible binding for global variable
‘chr’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cellscape/new/cellscape.Rcheck/00_pkg_src/cellscape/R/cellscape.R:1156)
getTargetedHeatmapForEachSC: no visible binding for global variable
‘coord’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cellscape/new/cellscape.Rcheck/00_pkg_src/cellscape/R/cellscape.R:1156)
Undefined global functions or variables:
chr chrom_index coef combn coord copy_number cumsum_values dist
genotype hclust lm melt mode_cnv n n_gt na.omit px px_width sc_id
setNames show_warnings single_cell_id site timepoint VAF
Consider adding
importFrom("stats", "coef", "dist", "hclust", "lm", "na.omit",
"setNames")
importFrom("utils", "combn")
to your NAMESPACE file.
```
* checking for unstated dependencies in vignettes ... NOTE
```
'library' or 'require' call not declared from: ‘devtools’
```
# cepR
Version: 0.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 287 marked UTF-8 strings
```
# CGPfunctions
Version: 0.4
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘devtools’
All declared Imports should be used.
```
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘BSDA’, ‘janitor’
```
# childesr
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dbplyr’
All declared Imports should be used.
```
# childsds
Version: 0.7.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘gamlss.dist’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 24 marked UTF-8 strings
```
# chimeraviz
Version: 1.6.2
## In both
* checking package dependencies ... ERROR
```
Packages required but not available: ‘org.Hs.eg.db’ ‘org.Mm.eg.db’
Depends: includes the non-default packages:
‘Biostrings’ ‘GenomicRanges’ ‘IRanges’ ‘Gviz’ ‘S4Vectors’ ‘ensembldb’
‘AnnotationFilter’ ‘data.table’
Adding so many packages to the search path is excessive and importing
selectively is preferable.
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# ChIPexoQual
Version: 1.4.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Failed with error: 'package 'DelayedArray' could not be loaded'
Error in .requirePackage(package) :
unable to find required package 'ChIPexoQual'
Calls: <Anonymous> ... getClass -> getClassDef -> .classEnv -> .requirePackage
Execution halted
```
# ChIPseeker
Version: 1.16.1
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘TxDb.Hsapiens.UCSC.hg19.knownGene’
Package suggested but not available for checking: ‘org.Hs.eg.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# chorrrds
Version: 0.1.8
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 4004 marked UTF-8 strings
```
# chromer
Version: 0.1
## In both
* checking DESCRIPTION meta-information ... NOTE
```
Malformed Description field: should contain one or more complete sentences.
```
* checking R code for possible problems ... NOTE
```
parse_counts: no visible global function definition for ‘na.omit’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/chromer/new/chromer.Rcheck/00_pkg_src/chromer/R/clean-data.R:77)
Undefined global functions or variables:
na.omit
Consider adding
importFrom("stats", "na.omit")
to your NAMESPACE file.
```
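The suggestion in the check output maps directly onto a `NAMESPACE` directive, or equivalently a roxygen2 tag above any function that calls `na.omit()` (a sketch of both forms):

```
# NAMESPACE
importFrom(stats, na.omit)

# or, roxygen2 style, in the R source:
#' @importFrom stats na.omit
```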
# cimir
Version: 0.1-0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘readr’
All declared Imports should be used.
```
# CINdex
Version: 1.8.0
## In both
* checking re-building of vignette outputs ... WARNING
```
...
eval, evalq, Filter, Find, get, grep, grepl, intersect,
is.unsorted, lapply, lengths, Map, mapply, match, mget, order,
paste, pmax, pmax.int, pmin, pmin.int, Position, rank, rbind,
Reduce, rowMeans, rownames, rowSums, sapply, setdiff, sort,
table, tapply, union, unique, unsplit, which, which.max,
which.min
Loading required package: S4Vectors
Attaching package: 'S4Vectors'
The following object is masked from 'package:base':
expand.grid
Loading required package: IRanges
Loading required package: GenomeInfoDb
Quitting from lines 33-42 (PrepareInputData.Rmd)
Error: processing vignette 'PrepareInputData.Rmd' failed with diagnostics:
there is no package called 'pd.genomewidesnp.6'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘pd.genomewidesnp.6’ ‘org.Hs.eg.db’
‘TxDb.Hsapiens.UCSC.hg18.knownGene’ ‘Homo.sapiens’
```
* checking installed package size ... NOTE
```
installed size is 18.9Mb
sub-directories of 1Mb or more:
data 18.0Mb
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/process.probe.anno.R:21)
process.probe.anno: no visible binding for global variable ‘ID’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/process.probe.anno.R:31)
process.reference.genome: no visible binding for global variable
‘chrom’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/process.reference.genome.R:21-23)
process.reference.genome: no visible binding for global variable ‘name’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/process.reference.genome.R:21-23)
process.reference.genome: no visible binding for global variable
‘stain’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/process.reference.genome.R:21-23)
run.cin.chr: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/run.cin.chr.R:45-64)
run.cin.cyto: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CINdex/new/CINdex.Rcheck/00_pkg_src/CINdex/R/run.cin.cyto.R:53-84)
Undefined global functions or variables:
chrom dataMatrix ID is midpoint name stain
Consider adding
importFrom("methods", "is")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# circumplex
Version: 0.2.1
## In both
* checking whether package ‘circumplex’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/circumplex/new/circumplex.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘circumplex’ ...
** package ‘circumplex’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/circumplex/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/circumplex/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘circumplex’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/circumplex/new/circumplex.Rcheck/circumplex’
```
### CRAN
```
* installing *source* package ‘circumplex’ ...
** package ‘circumplex’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/circumplex/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/circumplex/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘circumplex’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/circumplex/old/circumplex.Rcheck/circumplex’
```
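`clang: error: unsupported option '-fopenmp'` is characteristic of Apple's bundled toolchain, which ships without OpenMP support; compilation can often be rescued by pointing R at an OpenMP-capable compiler in `~/.R/Makevars`. A sketch, assuming Homebrew's LLVM is installed at the path shown:

```
# ~/.R/Makevars — use Homebrew LLVM, which supports -fopenmp
CC=/usr/local/opt/llvm/bin/clang
CXX=/usr/local/opt/llvm/bin/clang++
```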
# civis
Version: 1.6.1
## In both
* checking installed package size ... NOTE
```
installed size is 6.2Mb
sub-directories of 1Mb or more:
help 2.4Mb
R 3.2Mb
```
# ClinReport
Version: 0.9.1.10
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘emmeans’ ‘utils’
All declared Imports should be used.
```
# clustermq
Version: 0.8.6
## In both
* R CMD check timed out
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘infuser’ ‘purrr’ ‘R6’
All declared Imports should be used.
```
# cna
Version: 2.1.1
## In both
* checking re-building of vignette outputs ... NOTE
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'cna_vignette.tex' failed.
LaTeX errors:
! LaTeX Error: File `thumbpdf.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.9 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# CNPBayes
Version: 1.10.0
## In both
* checking Rd \usage sections ... WARNING
```
Undocumented arguments in documentation object 'marginal_lik'
‘value’
Functions with \usage entries need to have the appropriate \alias
entries, and all their arguments documented.
The \usage entries must correspond to syntactically valid R code.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
```
* checking re-building of vignette outputs ... WARNING
```
...
fmtutil [INFO]: Total formats: 15
fmtutil [INFO]: exiting with status 0
tlmgr install fancyhdr
TeX Live 2018 is frozen forever and will no
longer be updated. This happens in preparation for a new release.
If you're interested in helping to pretest the new release (when
pretests are available), please read http://tug.org/texlive/pretest.html.
Otherwise, just wait, and the new release will be ready in due time.
tlmgr: Fundamental package texlive.infra not present, uh oh, goodbyeShould not happen, texlive.infra not found at /usr/local/bin/tlmgr line 7344.
tlmgr: package repository http://mirrors.standaloneinstaller.com/ctan/systems/texlive/tlnet (not verified: gpg unavailable)
tlmgr path add
! LaTeX Error: File `fancyhdr.sty' not found.
! Emergency stop.
<read *>
Error: processing vignette 'Convergence.Rmd' failed with diagnostics:
Failed to compile Convergence.tex. See Convergence.log for more info.
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 8.7Mb
sub-directories of 1Mb or more:
doc 3.4Mb
libs 1.3Mb
R 3.0Mb
```
* checking R code for possible problems ... NOTE
```
copyNumber,SingleBatchCopyNumber: no visible binding for global
variable ‘theta.star’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CNPBayes/new/CNPBayes.Rcheck/00_pkg_src/CNPBayes/R/copynumber-models.R:148-149)
copyNumber,SingleBatchCopyNumber: no visible binding for global
variable ‘theta.star’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CNPBayes/new/CNPBayes.Rcheck/00_pkg_src/CNPBayes/R/copynumber-models.R:150-151)
Undefined global functions or variables:
theta.star
```
# CNVScope
Version: 1.9.7
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘BSgenome.Hsapiens.UCSC.hg19’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# cocktailApp
Version: 0.2.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 14661 marked UTF-8 strings
```
# codebook
Version: 0.8.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘graphics’ ‘jsonlite’ ‘pander’ ‘rlang’
All declared Imports should be used.
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘mice’
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 65 marked UTF-8 strings
```
# codemetar
Version: 0.1.6
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
── 2. Error: (unknown) (@test-jsonld-compact.R#20) ────────────────────────────
jsonld.InvalidUrllist(code = "loading remote context failed", url = "http://purl.org/codemeta/2.0", cause = "Evaluation error: Download (HTTP 404): http://purl.org/codemeta/2.0.")
1: jsonld_compact(doc, "http://purl.org/codemeta/2.0") at testthat/test-jsonld-compact.R:20
2: structure(store_val(), class = "json")
3: store_val()
4: stop(out$err)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 80 SKIPPED: 10 FAILED: 2
1. Error: we can call crosswalk (@test-crosswalk.R#25)
2. Error: (unknown) (@test-jsonld-compact.R#20)
Error: testthat unit tests failed
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘memoise’
All declared Imports should be used.
```
# codified
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘methods’ ‘readr’
All declared Imports should be used.
```
# codingMatrices
Version: 0.3.2
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'codingMatrices.tex' failed.
LaTeX errors:
! LaTeX Error: File `mathtools.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.12 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# cofeatureR
Version: 1.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
# cogena
Version: 1.14.0
## In both
* checking whether package ‘cogena’ can be installed ... WARNING
```
Found the following significant warnings:
Warning: replacing previous import ‘class::somgrid’ by ‘kohonen::somgrid’ when loading ‘cogena’
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cogena/new/cogena.Rcheck/00install.out’ for details.
```
* checking installed package size ... NOTE
```
installed size is 6.5Mb
sub-directories of 1Mb or more:
doc 1.9Mb
extdata 3.1Mb
```
* checking R code for possible problems ... NOTE
```
...
‘legend’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cogena/new/cogena.Rcheck/00_pkg_src/cogena/R/heatmapCluster.R:151-153)
heatmapCluster,cogena: no visible global function definition for
‘legend’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cogena/new/cogena.Rcheck/00_pkg_src/cogena/R/heatmapCluster.R:155-157)
heatmapCluster,cogena: no visible global function definition for
‘legend’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/cogena/new/cogena.Rcheck/00_pkg_src/cogena/R/heatmapCluster.R:159-160)
Undefined global functions or variables:
abline as.dist axis cor data density dist hist image layout legend
lines median mtext order.dendrogram p.adjust par phyper plot.new
rainbow rect reorder sd text title topo.colors
Consider adding
importFrom("graphics", "abline", "axis", "hist", "image", "layout",
"legend", "lines", "mtext", "par", "plot.new", "rect",
"text", "title")
importFrom("grDevices", "rainbow", "topo.colors")
importFrom("stats", "as.dist", "cor", "density", "dist", "median",
"order.dendrogram", "p.adjust", "phyper", "reorder", "sd")
importFrom("utils", "data")
to your NAMESPACE file.
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘clValid’
```
# CollapsABEL
Version: 0.10.11
## In both
* checking whether package ‘CollapsABEL’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CollapsABEL/new/CollapsABEL.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘CollapsABEL’ ...
** package ‘CollapsABEL’ successfully unpacked and MD5 sums checked
** R
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rJava’:
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/CollapsABEL/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/CollapsABEL/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/CollapsABEL/rJava/libs/rJava.so
Reason: image not found
Error : package ‘rJava’ could not be loaded
ERROR: lazy loading failed for package ‘CollapsABEL’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CollapsABEL/new/CollapsABEL.Rcheck/CollapsABEL’
```
### CRAN
```
* installing *source* package ‘CollapsABEL’ ...
** package ‘CollapsABEL’ successfully unpacked and MD5 sums checked
** R
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rJava’:
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/CollapsABEL/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/CollapsABEL/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/CollapsABEL/rJava/libs/rJava.so
Reason: image not found
Error : package ‘rJava’ could not be loaded
ERROR: lazy loading failed for package ‘CollapsABEL’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/CollapsABEL/old/CollapsABEL.Rcheck/CollapsABEL’
```
# colorednoise
Version: 1.0.4
## In both
* checking whether package ‘colorednoise’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/colorednoise/new/colorednoise.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘colorednoise’ ...
** package ‘colorednoise’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/colorednoise/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/colorednoise/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘colorednoise’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/colorednoise/new/colorednoise.Rcheck/colorednoise’
```
### CRAN
```
* installing *source* package ‘colorednoise’ ...
** package ‘colorednoise’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/colorednoise/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/colorednoise/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘colorednoise’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/colorednoise/old/colorednoise.Rcheck/colorednoise’
```
# colorspace
Version: 1.4-0
## In both
* checking installed package size ... NOTE
```
installed size is 5.9Mb
sub-directories of 1Mb or more:
doc 2.0Mb
R 2.0Mb
```
# compareDF
Version: 1.7.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘magrittr’ ‘stringr’
All declared Imports should be used.
```
# COMPASS
Version: 1.18.1
## In both
* checking re-building of vignette outputs ... WARNING
```
...
fmtutil [INFO]: Total formats: 15
fmtutil [INFO]: exiting with status 0
tlmgr install fancyhdr
TeX Live 2018 is frozen forever and will no
longer be updated. This happens in preparation for a new release.
If you're interested in helping to pretest the new release (when
pretests are available), please read http://tug.org/texlive/pretest.html.
Otherwise, just wait, and the new release will be ready in due time.
tlmgr: Fundamental package texlive.infra not present, uh oh, goodbyeShould not happen, texlive.infra not found at /usr/local/bin/tlmgr line 7344.
tlmgr: package repository http://mirrors.standaloneinstaller.com/ctan/systems/texlive/tlnet (not verified: gpg unavailable)
tlmgr path add
! LaTeX Error: File `fancyhdr.sty' not found.
! Emergency stop.
<read *>
Error: processing vignette 'SimpleCOMPASS.Rmd' failed with diagnostics:
Failed to compile SimpleCOMPASS.tex. See SimpleCOMPASS.log for more info.
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘BiocStyle’ ‘rmarkdown’
All declared Imports should be used.
':::' call which should be '::': ‘flowWorkspace:::.getNodeInd’
See the note in ?`:::` about the use of this operator.
```
* checking R code for possible problems ... NOTE
```
COMPASSfitToCountsTable: no visible binding for global variable
‘population’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/COMPASS/new/COMPASS.Rcheck/00_pkg_src/COMPASS/R/utils.R:193)
COMPASSfitToCountsTable: no visible binding for global variable ‘Count’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/COMPASS/new/COMPASS.Rcheck/00_pkg_src/COMPASS/R/utils.R:193)
COMPASSfitToCountsTable: no visible binding for global variable
‘population’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/COMPASS/new/COMPASS.Rcheck/00_pkg_src/COMPASS/R/utils.R:194)
COMPASSfitToCountsTable: no visible binding for global variable ‘Count’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/COMPASS/new/COMPASS.Rcheck/00_pkg_src/COMPASS/R/utils.R:194)
COMPASSfitToCountsTable: no visible binding for global variable ‘id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/COMPASS/new/COMPASS.Rcheck/00_pkg_src/COMPASS/R/utils.R:200)
COMPASSfitToCountsTable: no visible binding for global variable ‘id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/COMPASS/new/COMPASS.Rcheck/00_pkg_src/COMPASS/R/utils.R:206)
Undefined global functions or variables:
Count id population
```
* checking for unstated dependencies in vignettes ... NOTE
```
'library' or 'require' calls not declared from:
‘ggplot2’ ‘readxl’
```
# condformat
Version: 0.8.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
> library(condformat)
>
> test_check("condformat")
-- 1. Error: condformat2excel generates a file (@test_rendering.R#42) ---------
Please install the xlsx package in order to export to excel
1: condformat2excel(condformat(head(iris, n = rows_to_write)), filename = filename) at testthat/test_rendering.R:42
2: require_xlsx() at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/condformat/new/condformat.Rcheck/00_pkg_src/condformat/R/render_xlsx.R:19
3: stop("Please install the xlsx package in order to export to excel") at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/condformat/new/condformat.Rcheck/00_pkg_src/condformat/R/render_xlsx.R:3
== testthat results ===========================================================
OK: 125 SKIPPED: 0 FAILED: 1
1. Error: condformat2excel generates a file (@test_rendering.R#42)
Error: testthat unit tests failed
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘graphics’
All declared Imports should be used.
```
# configural
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘cli’
All declared Imports should be used.
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘fungible’
```
# conflicted
Version: 1.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘memoise’
All declared Imports should be used.
```
# congressbr
Version: 0.2.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 1 marked UTF-8 string
```
# Countr
Version: 3.5.2
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'ComputationalPerformance.tex' failed.
LaTeX errors:
! LaTeX Error: File `pdfpages.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.5 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# countyfloods
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘maps’
All declared Imports should be used.
```
# countyweather
Version: 0.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 1 marked UTF-8 string
```
# coveffectsplot
Version: 0.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘colourpicker’ ‘dplyr’ ‘markdown’ ‘shinyjs’ ‘tidyr’
All declared Imports should be used.
```
# coxed
Version: 0.2.0
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘mediation’
```
# CPAT
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘grDevices’ ‘Rdpack’
All declared Imports should be used.
```
# CRANsearcher
Version: 1.0.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 11 marked Latin-1 strings
Note: found 57 marked UTF-8 strings
```
# crawl
Version: 2.2.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gdistance’ ‘raster’
All declared Imports should be used.
```
# CrossClustering
Version: 4.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘glue’
All declared Imports should be used.
```
# crosswalkr
Version: 0.2.4
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# crossword.r
Version: 0.3.6
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘R6’ ‘r6extended’
All declared Imports should be used.
```
# crsra
Version: 0.2.3
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 500 marked UTF-8 strings
```
# curatedMetagenomicData
Version: 1.10.2
## In both
* checking examples ... ERROR
```
...
> ### Title: Data from the SmitsSA_2017 study
> ### Aliases: SmitsSA_2017 SmitsSA_2017.genefamilies_relab.stool
> ### SmitsSA_2017.marker_abundance.stool
> ### SmitsSA_2017.marker_presence.stool
> ### SmitsSA_2017.metaphlan_bugs_list.stool
> ### SmitsSA_2017.pathabundance_relab.stool
> ### SmitsSA_2017.pathcoverage.stool
>
> ### ** Examples
>
> SmitsSA_2017.metaphlan_bugs_list.stool()
snapshotDate(): 2018-04-27
see ?curatedMetagenomicData and browseVignettes('curatedMetagenomicData') for documentation
downloading 0 resources
loading from cache
‘/Users/romain//.ExperimentHub/1338’
Error: failed to load resource
name: EH1338
title: 20180425.SmitsSA_2017.metaphlan_bugs_list.stool
reason: ReadItem: unknown type 115, perhaps written by later version of R
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
stop("failed to load resource", "\n name: ", names(x), "\n title: ", x$title,
"\n reason: ", conditionMessage(err), call. = FALSE)
})
14: tryCatchList(expr, classes, parentenv, handlers)
15: tryCatchOne(expr, names, parentenv, handlers[[1L]])
16: value[[3L]](cond)
17: stop("failed to load resource", "\n name: ", names(x), "\n title: ", x$title, "\n reason: ",
conditionMessage(err), call. = FALSE)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 32 SKIPPED: 0 FAILED: 1
1. Error: countries and studies align. (@test-mergeData.R#27)
Error: testthat unit tests failed
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.7Mb
sub-directories of 1Mb or more:
doc 1.4Mb
help 2.7Mb
```
# customsteps
Version: 0.7.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘rlang’ ‘tidyselect’
All declared Imports should be used.
```
# cutpointr
Version: 0.7.4
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
`print\(scp\)` does not match "accuracy_oob 0.8201".
Actual value: "Method: oc_youden_normal \\nPredictor: dsi \\nOutcome: suicide \\nDirection: >= \\nSubgroups: female, male \\nNr\. of bootstraps: 10 \\n\\nSubgroup: female \\n-------------------------------------------------------------------------------- \\n optimal_cutpoint accuracy acc sensitivity specificity AUC n_pos n_neg\\n 2\.4778 0\.8954 0\.8954 0\.8148 0\.9014 0\.9446 27 365\\n\\nCutpoint 2\.47775393352595:\\n observation\\nprediction yes no\\n yes 22 36\\n no 5 329\\n\\n\\nPredictor summary: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max\. SD\\n 0 0 0 0 0\.8393 1 5 10 1\.7452\\n\\nPredictor summary per class: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max SD\\nno 0 0\.0 0 0 0\.5479 0 4 10 1\.3181\\nyes 0 1\.3 4 5 4\.7778 6 7 9 2\.0444\\n\\nBootstrap summary: \\n# A tibble: 13 x 10\\n Variable Min\. `5%` `1st Qu\.` Median Mean `3rd Qu\.` `95%` Max\. SD\\n <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>\\n 1 optimal_cutp… 2\.18 2\.23 2\.33 2\.43 2\.47 2\.51 2\.83 2\.94 0\.218 \\n 2 AUC_b 0\.941 0\.943 0\.950 0\.964 0\.960 0\.967 0\.974 0\.976 0\.0119\\n 3 AUC_oob 0\.894 0\.894 0\.912 0\.924 0\.925 0\.939 0\.955 0\.956 0\.0222\\n 4 accuracy_b 0\.860 0\.871 0\.888 0\.908 0\.904 0\.923 0\.927 0\.929 0\.0226\\n 5 accuracy_oob 0\.820 0\.838 0\.873 0\.876 0\.880 0\.901 0\.912 0\.914 0\.0278\\n 6 acc_b 0\.860 0\.871 0\.888 0\.908 0\.904 0\.923 0\.927 0\.929 0\.0226\\n 7 acc_oob 0\.820 0\.838 0\.873 0\.876 0\.880 0\.901 0\.912 0\.914 0\.0278\\n 8 sensitivity_b 0\.708 0\.737 0\.779 0\.823 0\.826 0\.851 0\.940 0\.954 0\.0728\\n 9 sensitivity_… 0\.625 0\.644 0\.762 0\.809 0\.800 0\.872 0\.913 0\.923 0\.0971\\n10 specificity_b 0\.870 0\.875 0\.894 0\.915 0\.909 0\.927 0\.931 0\.932 0\.0223\\n11 specificity_… 0\.835 0\.845 0\.876 0\.880 0\.886 0\.912 0\.921 0\.922 0\.0283\\n12 kappa_b 0\.321 0\.329 0\.423 0\.509 0\.485 0\.562 0\.590 0\.610 0\.0995\\n13 kappa_oob 0\.305 0\.324 0\.368 0\.420 0\.444 0\.511 0\.608 0\.631 0\.106 \\n\\nSubgroup: male 
\\n-------------------------------------------------------------------------------- \\n optimal_cutpoint accuracy acc sensitivity specificity AUC n_pos n_neg\\n 3\.1723 0\.8643 0\.8643 0\.6667 0\.8779 0\.8617 9 131\\n\\nCutpoint 3\.17225507835137:\\n observation\\nprediction yes no\\n yes 6 16\\n no 3 115\\n\\n\\nPredictor summary: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max\. SD\\n 0 0 0 0 1\.15 1 6 11 2\.1151\\n\\nPredictor summary per class: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max SD\\nno 0 0\.0 0 0 0\.8702 1 5\.0 6 1\.6286\\nyes 0 0\.4 3 4 5\.2222 8 10\.6 11 3\.8333\\n\\nBootstrap summary: \\n# A tibble: 13 x 10\\n Variable Min\. `5%` `1st Qu\.` Median Mean `3rd Qu\.` `95%` Max\. SD\\n <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>\\n 1 optimal_cutp… 2\.82 2\.84 2\.92 3\.27 3\.26 3\.55 3\.82 3\.90 0\.387 \\n 2 AUC_b 0\.758 0\.787 0\.825 0\.879 0\.871 0\.904 0\.959 0\.968 0\.0641\\n 3 AUC_oob 0\.631 0\.691 0\.792 0\.885 0\.859 0\.943 0\.972 0\.977 0\.109 \\n 4 accuracy_b 0\.807 0\.814 0\.834 0\.864 0\.852 0\.871 0\.871 0\.871 0\.0243\\n 5 accuracy_oob 0\.822 0\.823 0\.839 0\.871 0\.866 0\.896 0\.905 0\.906 0\.0327\\n 6 acc_b 0\.807 0\.814 0\.834 0\.864 0\.852 0\.871 0\.871 0\.871 0\.0243\\n 7 acc_oob 0\.822 0\.823 0\.839 0\.871 0\.866 0\.896 0\.905 0\.906 0\.0327\\n 8 sensitivity_b 0\.556 0\.582 0\.667 0\.703 0\.735 0\.794 0\.936 1 0\.129 \\n 9 sensitivity_… 0\.333 0\.363 0\.5 0\.667 0\.707 1 1 1 0\.272 \\n10 specificity_b 0\.817 0\.825 0\.846 0\.867 0\.862 0\.875 0\.892 0\.898 0\.0246\\n11 specificity_… 0\.818 0\.826 0\.853 0\.887 0\.877 0\.898 0\.917 0\.918 0\.0342\\n12 kappa_b 0\.210 0\.220 0\.243 0\.338 0\.319 0\.380 0\.407 0\.411 0\.0757\\n13 kappa_oob 0\.118 0\.145 0\.208 0\.306 0\.310 0\.398 0\.497 0\.570 0\.139 "
── 3. Failure: summary is printed correctly (@test-cutpointr.R#1211) ──────────
`print\(scp\)` does not match "accuracy_oob 0.8163".
Actual value: "Method: oc_youden_normal \\nPredictor: x \\nOutcome: class \\nDirection: >= \\nSubgroups: female, male \\nNr\. of bootstraps: 10 \\n\\nSubgroup: female \\n-------------------------------------------------------------------------------- \\n optimal_cutpoint accuracy acc sensitivity specificity AUC n_pos n_neg\\n 2\.4778 0\.8954 0\.8954 0\.8148 0\.9014 0\.9446 27 365\\n\\nCutpoint 2\.47775393352595:\\n observation\\nprediction yes no\\n yes 22 36\\n no 5 329\\n\\n\\nPredictor summary: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max\. SD\\n 0 0 0 0 0\.8393 1 5 10 1\.7452\\n\\nPredictor summary per class: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max SD\\nno 0 0\.0 0 0 0\.5479 0 4 10 1\.3181\\nyes 0 1\.3 4 5 4\.7778 6 7 9 2\.0444\\n\\nBootstrap summary: \\n# A tibble: 13 x 10\\n Variable Min\. `5%` `1st Qu\.` Median Mean `3rd Qu\.` `95%` Max\. SD\\n <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>\\n 1 optimal_cutp… 2\.02 2\.12 2\.32 2\.40 2\.40 2\.54 2\.62 2\.66 0\.185 \\n 2 AUC_b 0\.907 0\.910 0\.92 0\.950 0\.940 0\.958 0\.965 0\.966 0\.0227\\n 3 AUC_oob 0\.898 0\.905 0\.931 0\.953 0\.947 0\.968 0\.978 0\.980 0\.0274\\n 4 accuracy_b 0\.878 0\.878 0\.895 0\.902 0\.900 0\.908 0\.916 0\.921 0\.0138\\n 5 accuracy_oob 0\.865 0\.868 0\.879 0\.888 0\.891 0\.906 0\.914 0\.917 0\.0176\\n 6 acc_b 0\.878 0\.878 0\.895 0\.902 0\.900 0\.908 0\.916 0\.921 0\.0138\\n 7 acc_oob 0\.865 0\.868 0\.879 0\.888 0\.891 0\.906 0\.914 0\.917 0\.0176\\n 8 sensitivity_b 0\.66 0\.689 0\.759 0\.786 0\.796 0\.849 0\.896 0\.917 0\.076 \\n 9 sensitivity_… 0\.7 0\.712 0\.8 0\.847 0\.861 0\.972 1 1 0\.112 \\n10 specificity_b 0\.878 0\.881 0\.901 0\.913 0\.910 0\.922 0\.934 0\.939 0\.019 \\n11 specificity_… 0\.864 0\.867 0\.882 0\.892 0\.895 0\.909 0\.925 0\.926 0\.0216\\n12 kappa_b 0\.362 0\.410 0\.475 0\.528 0\.514 0\.566 0\.582 0\.585 0\.0692\\n13 kappa_oob 0\.160 0\.214 0\.391 0\.420 0\.404 0\.475 0\.524 0\.539 0\.112 \\n\\nSubgroup: male 
\\n-------------------------------------------------------------------------------- \\n optimal_cutpoint accuracy acc sensitivity specificity AUC n_pos n_neg\\n 3\.1723 0\.8643 0\.8643 0\.6667 0\.8779 0\.8617 9 131\\n\\nCutpoint 3\.17225507835137:\\n observation\\nprediction yes no\\n yes 6 16\\n no 3 115\\n\\n\\nPredictor summary: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max\. SD\\n 0 0 0 0 1\.15 1 6 11 2\.1151\\n\\nPredictor summary per class: \\n Min\. 5% 1st Qu\. Median Mean 3rd Qu\. 95% Max SD\\nno 0 0\.0 0 0 0\.8702 1 5\.0 6 1\.6286\\nyes 0 0\.4 3 4 5\.2222 8 10\.6 11 3\.8333\\n\\nBootstrap summary: \\n# A tibble: 13 x 10\\n Variable Min\. `5%` `1st Qu\.` Median Mean `3rd Qu\.` `95%` Max\. SD\\n <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>\\n 1 optimal_cutp… 2\.14 2\.26 2\.93 3\.05 2\.97 3\.28 3\.35 3\.36 0\.403 \\n 2 AUC_b 0\.738 0\.760 0\.823 0\.848 0\.852 0\.904 0\.925 0\.929 0\.0611\\n 3 AUC_oob 0\.806 0\.815 0\.838 0\.901 0\.899 0\.958 0\.990 1 0\.0688\\n 4 accuracy_b 0\.8 0\.8 0\.848 0\.868 0\.854 0\.871 0\.875 0\.879 0\.0298\\n 5 accuracy_oob 0\.816 0\.820 0\.835 0\.87 0\.862 0\.877 0\.899 0\.917 0\.031 \\n 6 acc_b 0\.8 0\.8 0\.848 0\.868 0\.854 0\.871 0\.875 0\.879 0\.0298\\n 7 acc_oob 0\.816 0\.820 0\.835 0\.87 0\.862 0\.877 0\.899 0\.917 0\.031 \\n 8 sensitivity_b 0\.333 0\.376 0\.542 0\.690 0\.656 0\.744 0\.9 1 0\.192 \\n 9 sensitivity_… 0\.5 0\.545 0\.617 0\.8 0\.777 0\.95 1 1 0\.183 \\n10 specificity_b 0\.806 0\.807 0\.865 0\.876 0\.864 0\.879 0\.894 0\.903 0\.0316\\n11 specificity_… 0\.808 0\.823 0\.852 0\.874 0\.870 0\.886 0\.909 0\.909 0\.031 \\n12 kappa_b 0\.133 0\.135 0\.154 0\.264 0\.264 0\.364 0\.416 0\.436 0\.116 \\n13 kappa_oob 0\.140 0\.192 0\.318 0\.448 0\.405 0\.493 0\.575 0\.625 0\.143 "
══ testthat results ═══════════════════════════════════════════════════════════
OK: 369 SKIPPED: 0 FAILED: 3
1. Failure: summary is printed correctly (@test-cutpointr.R#1179)
2. Failure: summary is printed correctly (@test-cutpointr.R#1195)
3. Failure: summary is printed correctly (@test-cutpointr.R#1211)
Error: testthat unit tests failed
Execution halted
```
# d3r
Version: 0.8.5
## In both
* checking package dependencies ... NOTE
```
Packages which this enhances but not available for checking:
‘partykit’ ‘treemap’ ‘V8’
```
# d3Tree
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# dabestr
Version: 0.2.0
## In both
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Attaching package: 'cowplot'
The following object is masked from 'package:ggplot2':
ggsave
Warning: `data_frame()` is deprecated, use `tibble()`.
This warning is displayed once per session.
Loading required package: boot
Loading required package: magrittr
Warning: Some components of ... were not used: ..1
Quitting from lines 110-166 (robust-statistical-visualization.Rmd)
Error: processing vignette 'robust-statistical-visualization.Rmd' failed with diagnostics:
polygon edge not found
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 6.6Mb
sub-directories of 1Mb or more:
doc 5.8Mb
```
# dalmatian
Version: 0.3.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 36-71 (weights-1-simulate.Rmd)
Error: processing vignette 'weights-1-simulate.Rmd' failed with diagnostics:
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dalmatian/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dalmatian/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dalmatian/rjags/libs/rjags.so
Reason: image not found
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
doc 1.1Mb
Pied_Flycatchers_1 2.4Mb
Pied_Flycatchers_2 1.2Mb
```
# DAPAR
Version: 1.12.11
## In both
* checking whether package ‘DAPAR’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DAPAR/new/DAPAR.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘DAPAR’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Warning in fun(libname, pkgname) :
mzR has been built against a different Rcpp version (0.12.16)
than is installed on your system (1.0.0). This might lead to errors
when loading mzR. If you encounter such issues, please send a report,
including the output of sessionInfo() to the Bioc support forum at
https://support.bioconductor.org/. For details see also
https://github.com/sneumann/mzR/wiki/mzR-Rcpp-compiler-linker-issue.
Warning in fun(libname, pkgname) : couldn't connect to display ""
Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = vI[[j]]) :
there is no package called ‘DO.db’
ERROR: lazy loading failed for package ‘DAPAR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DAPAR/new/DAPAR.Rcheck/DAPAR’
```
### CRAN
```
* installing *source* package ‘DAPAR’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Warning in fun(libname, pkgname) :
mzR has been built against a different Rcpp version (0.12.16)
than is installed on your system (1.0.0). This might lead to errors
when loading mzR. If you encounter such issues, please send a report,
including the output of sessionInfo() to the Bioc support forum at
https://support.bioconductor.org/. For details see also
https://github.com/sneumann/mzR/wiki/mzR-Rcpp-compiler-linker-issue.
Warning in fun(libname, pkgname) : couldn't connect to display ""
Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = vI[[j]]) :
there is no package called ‘DO.db’
ERROR: lazy loading failed for package ‘DAPAR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DAPAR/old/DAPAR.Rcheck/DAPAR’
```
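Both the devel and CRAN installs above fail for the same reason: DO.db is a Bioconductor annotation package, so it is invisible to a plain `install.packages()` against CRAN. A sketch of pulling it in, assuming R >= 3.5 where BiocManager is the supported route:

```
if (!requireNamespace("BiocManager", quietly = TRUE))
  install.packages("BiocManager")
BiocManager::install("DO.db")
```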
# datadr
Version: 0.8.6.1
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘Rhipe’
```
# datasus
Version: 0.4.1
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Attaching package: 'dplyr'
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Quitting from lines 78-85 (Introduction_to_datasus.Rmd)
Error: processing vignette 'Introduction_to_datasus.Rmd' failed with diagnostics:
Timeout was reached: Connection timed out after 10009 milliseconds
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘RCurl’
All declared Imports should be used.
```
# dbparser
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘ggplot2’
All declared Imports should be used.
```
# ddpcr
Version: 1.11
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
sample_data 3.0Mb
```
# DeepBlueR
Version: 1.6.0
## In both
* checking Rd files ... NOTE
```
prepare_Rd: deepblue_enrich_regions_fast.Rd:35-38: Dropping empty section \examples
```
# DEGreport
Version: 1.16.0
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking DESCRIPTION meta-information ... NOTE
```
Package listed in more than one of Depends, Imports, Suggests, Enhances:
‘knitr’
A package should be listed in only one of these fields.
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/methods.R:274-282)
degMV: no visible binding for global variable ‘max_sd’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/methods.R:274-282)
degPatterns: no visible global function definition for ‘rowMedians’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/clustering.R:785-787)
degPatterns: no visible binding for global variable ‘genes’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/clustering.R:816-821)
degPlotWide : <anonymous>: no visible binding for global variable
‘count’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/genePlots.R:155-158)
significants,list : <anonymous>: no visible binding for global variable
‘gene’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/AllMethods.R:225)
significants,TopTags: no visible binding for global variable ‘FDR’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/AllMethods.R:147-151)
significants,TopTags: no visible binding for global variable ‘logFC’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DEGreport/new/DEGreport.Rcheck/00_pkg_src/DEGreport/R/AllMethods.R:147-151)
Undefined global functions or variables:
.x base_mean comp compare count counts covar enrichGO FDR gene genes
keys log2fc log2FoldChange logFC max_sd min_median ratios rowMedians
simplify
```
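The hidden-file NOTE on .travis.yml above is usually resolved by excluding CI configuration from the built tarball rather than deleting it. A sketch of the .Rbuildignore entry (Perl-style regular expressions, one per line):

```
# .Rbuildignore
^\.travis\.yml$
```

Assuming usethis is available, `usethis::use_build_ignore(".travis.yml")` adds the same line with the escaping done for you.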
# DeLorean
Version: 1.5.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.9Mb
sub-directories of 1Mb or more:
libs 4.9Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘lattice’
All declared Imports should be used.
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# DEP
Version: 1.2.0
## In both
* checking installed package size ... NOTE
```
installed size is 6.0Mb
sub-directories of 1Mb or more:
data 1.4Mb
doc 3.1Mb
R 1.2Mb
```
# DepthProc
Version: 2.0.4
## In both
* checking whether package ‘DepthProc’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DepthProc/new/DepthProc.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘DepthProc’ ...
** package ‘DepthProc’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DepthProc/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DepthProc/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c Depth.cpp -o Depth.o
clang: error: unsupported option '-fopenmp'
make: *** [Depth.o] Error 1
ERROR: compilation failed for package ‘DepthProc’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DepthProc/new/DepthProc.Rcheck/DepthProc’
```
### CRAN
```
* installing *source* package ‘DepthProc’ ...
** package ‘DepthProc’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DepthProc/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DepthProc/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c Depth.cpp -o Depth.o
clang: error: unsupported option '-fopenmp'
make: *** [Depth.o] Error 1
ERROR: compilation failed for package ‘DepthProc’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DepthProc/old/DepthProc.Rcheck/DepthProc’
```
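Both builds above die on `clang: error: unsupported option '-fopenmp'`: Apple's bundled clang ships without OpenMP support. One common workaround is a user-level Makevars pointing at a separately installed libomp — a sketch only, assuming a Homebrew libomp layout; the paths vary by machine:

```
# ~/.R/Makevars -- assumed Homebrew libomp location; adjust to your install
CPPFLAGS += -Xclang -fopenmp -I/usr/local/opt/libomp/include
LDFLAGS  += -L/usr/local/opt/libomp/lib -lomp
```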
# DescriptiveStats.OBeu
Version: 1.3.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 5764 marked UTF-8 strings
```
# desctable
Version: 0.1.4
## In both
* checking for code/documentation mismatches ... WARNING
```
Codoc mismatches from documentation object 'group_by':
group_by
Code: function(.data, ..., add = FALSE, .drop = group_drops(.data))
Docs: function(.data, ..., add = FALSE)
Argument names in code not in docs:
.drop
```
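The codoc WARNING above means dplyr's `group_by()` gained a `.drop` argument while desctable's Rd for its re-export still documents the old signature. A sketch of re-syncing the documentation with a roxygen2 fragment (the argument name mirrors the check output; the wording is an assumption, not desctable's actual docs):

```
#' @param .drop Passed through to dplyr's group_by(); controls whether
#'   grouping levels with no observations are dropped.
```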
# detrendr
Version: 0.6.0
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# dextergui
Version: 0.1.6
## In both
* checking dependencies in R code ... NOTE
```
Unexported objects imported by ':::' calls:
‘dexter:::get_resp_data’ ‘dexter:::qcolors’
See the note in ?`:::` about the use of this operator.
```
# DiagrammeR
Version: 1.0.0
## In both
* checking installed package size ... NOTE
```
installed size is 6.9Mb
sub-directories of 1Mb or more:
htmlwidgets 3.0Mb
R 3.1Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 1 marked UTF-8 string
```
# DiffBind
Version: 2.8.0
## In both
* checking whether package ‘DiffBind’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiffBind/new/DiffBind.Rcheck/00install.out’ for details.
```
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘XLConnect’
```
## Installation
### Devel
```
* installing *source* package ‘DiffBind’ ...
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c bamReader.cpp -o bamReader.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c bedReader.cpp -o bedReader.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c bitBucket.cpp -o bitBucket.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c croi_func.cpp -o croi_func.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c croi_main.cpp -o croi_main.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c densitySet.cpp -o densitySet.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c iBucket.cpp -o iBucket.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c interval.cpp -o interval.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalDensity.cpp -o intervalDensity.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalNode.cpp -o intervalNode.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalSet.cpp -o intervalSet.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalTree.cpp -o intervalTree.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c merge.cpp -o merge.o
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -Wall -g -O2 -c mergeOne.c -o mergeOne.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c nodeGroup.cpp -o nodeGroup.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c peakOrder.cpp -o peakOrder.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c reader.cpp -o reader.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c sequence.cpp -o sequence.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c util.cpp -o util.o
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o DiffBind.so RcppExports.o bamReader.o bedReader.o bitBucket.o croi_func.o croi_main.o densitySet.o iBucket.o interval.o intervalDensity.o intervalNode.o intervalSet.o intervalTree.o merge.o mergeOne.o nodeGroup.o peakOrder.o reader.o sequence.o util.o /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/usrlib//libbam.a /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/usrlib//libbcf.a /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/usrlib//libtabix.a -lz -pthread -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
installing to /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiffBind/new/DiffBind.Rcheck/DiffBind/libs
** R
** data
** inst
** byte-compile and prepare package for lazy loading
Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = vI[[j]]) :
there is no package called ‘GO.db’
ERROR: lazy loading failed for package ‘DiffBind’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiffBind/new/DiffBind.Rcheck/DiffBind’
```
### CRAN
```
* installing *source* package ‘DiffBind’ ...
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c bamReader.cpp -o bamReader.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c bedReader.cpp -o bedReader.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c bitBucket.cpp -o bitBucket.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c croi_func.cpp -o croi_func.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c croi_main.cpp -o croi_main.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c densitySet.cpp -o densitySet.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c iBucket.cpp -o iBucket.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c interval.cpp -o interval.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalDensity.cpp -o intervalDensity.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalNode.cpp -o intervalNode.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalSet.cpp -o intervalSet.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c intervalTree.cpp -o intervalTree.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c merge.cpp -o merge.o
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -Wall -g -O2 -c mergeOne.c -o mergeOne.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c nodeGroup.cpp -o nodeGroup.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c peakOrder.cpp -o peakOrder.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c reader.cpp -o reader.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c sequence.cpp -o sequence.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -D_USE_KNETFILE -DBGZF_CACHE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE64_SOURCE -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c util.cpp -o util.o
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o DiffBind.so RcppExports.o bamReader.o bedReader.o bitBucket.o croi_func.o croi_main.o densitySet.o iBucket.o interval.o intervalDensity.o intervalNode.o intervalSet.o intervalTree.o merge.o mergeOne.o nodeGroup.o peakOrder.o reader.o sequence.o util.o /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/usrlib//libbam.a /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/usrlib//libbcf.a /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiffBind/Rsamtools/usrlib//libtabix.a -lz -pthread -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
installing to /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiffBind/old/DiffBind.Rcheck/DiffBind/libs
** R
** data
** inst
** byte-compile and prepare package for lazy loading
Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = vI[[j]]) :
there is no package called ‘GO.db’
ERROR: lazy loading failed for package ‘DiffBind’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiffBind/old/DiffBind.Rcheck/DiffBind’
```
# diffcyt
Version: 1.0.10
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 127-144 (diffcyt_workflow.Rmd)
Error: processing vignette 'diffcyt_workflow.Rmd' failed with diagnostics:
there is no package called 'HDCytoData'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘HDCytoData’
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMedians.R:133-136)
calcMediansByClusterMarker: no visible binding for global variable
‘cluster_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMediansByClusterMarker.R:123-126)
calcMediansByClusterMarker: no visible binding for global variable
‘marker’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMediansByClusterMarker.R:123-126)
calcMediansByClusterMarker: no visible binding for global variable
‘value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMediansByClusterMarker.R:123-126)
calcMediansBySampleMarker: no visible binding for global variable
‘sample_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMediansBySampleMarker.R:119-122)
calcMediansBySampleMarker: no visible binding for global variable
‘marker’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMediansBySampleMarker.R:119-122)
calcMediansBySampleMarker: no visible binding for global variable
‘value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/diffcyt/new/diffcyt.Rcheck/00_pkg_src/diffcyt/R/calcMediansBySampleMarker.R:119-122)
Undefined global functions or variables:
cluster_id marker sample_id value
```
# diffloop
Version: 1.8.0
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
# DirectEffects
Version: 0.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘sandwich’
All declared Imports should be used.
```
# directlabels
Version: 2018.05.22
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘inlinedocs’
```
# dirichletprocess
Version: 0.2.2
## In both
* checking re-building of vignette outputs ... NOTE
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'dirichletprocess.tex' failed.
LaTeX errors:
! LaTeX Error: File `thumbpdf.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.11 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# DisImpact
Version: 0.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# disto
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘proxy’
All declared Imports should be used.
```
# DiversityOccupancy
Version: 1.0.6
## In both
* checking whether package ‘DiversityOccupancy’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiversityOccupancy/new/DiversityOccupancy.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘DiversityOccupancy’ ...
** package ‘DiversityOccupancy’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiversityOccupancy/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiversityOccupancy/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiversityOccupancy/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘DiversityOccupancy’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiversityOccupancy/new/DiversityOccupancy.Rcheck/DiversityOccupancy’
```
### CRAN
```
* installing *source* package ‘DiversityOccupancy’ ...
** package ‘DiversityOccupancy’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiversityOccupancy/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiversityOccupancy/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/DiversityOccupancy/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘DiversityOccupancy’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/DiversityOccupancy/old/DiversityOccupancy.Rcheck/DiversityOccupancy’
```
# DLMtool
Version: 5.3
## In both
* checking installed package size ... NOTE
```
installed size is 9.5Mb
sub-directories of 1Mb or more:
data 2.1Mb
R 6.0Mb
```
# dlookr
Version: 0.3.8
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
doc 4.1Mb
```
# doBy
Version: 4.6-2
## In both
* checking re-building of vignette outputs ... NOTE
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'linest-lsmeans.tex' failed.
LaTeX errors:
! LaTeX Error: File `a4wide.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.62 \usepackage
{boxedminipage,color}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# DSAIDE
Version: 0.7.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.4Mb
sub-directories of 1Mb or more:
media 2.2Mb
shinyapps 2.6Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘knitr’ ‘rmarkdown’ ‘utils’
All declared Imports should be used.
```
# DSAIRM
Version: 0.5.5
## In both
* checking installed package size ... NOTE
```
installed size is 12.0Mb
sub-directories of 1Mb or more:
appinformation 9.6Mb
media 1.1Mb
```
# dtwclust
Version: 5.5.2
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Distance matrix is not symmetric, and hierarchical clustering assumes it is (it ignores the upper triangular).
Loading required package: doParallel
Loading required package: foreach
Loading required package: iterators
Loading required package: clue
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'dtwclust.tex' failed.
LaTeX errors:
! LaTeX Error: File `placeins.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.80 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.7Mb
sub-directories of 1Mb or more:
doc 2.5Mb
R 2.0Mb
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# duawranglr
Version: 0.6.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘digest’ ‘dplyr’
All declared Imports should be used.
```
# dvmisc
Version: 1.1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘purrr’
All declared Imports should be used.
```
# dynfrail
Version: 0.5.2
## In both
* checking whether package ‘dynfrail’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/dynfrail/new/dynfrail.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘dynfrail’ ...
** package ‘dynfrail’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dynfrail/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dynfrail/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘dynfrail’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/dynfrail/new/dynfrail.Rcheck/dynfrail’
```
### CRAN
```
* installing *source* package ‘dynfrail’ ...
** package ‘dynfrail’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dynfrail/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/dynfrail/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘dynfrail’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/dynfrail/old/dynfrail.Rcheck/dynfrail’
```
# echarts4r
Version: 0.2.1
## Newly broken
* checking installed package size ... NOTE
```
installed size is 6.0Mb
sub-directories of 1Mb or more:
htmlwidgets 3.6Mb
R 2.0Mb
```
# echor
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘lubridate’
All declared Imports should be used.
```
# ecoengine
Version: 1.11.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# econet
Version: 0.1.81
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'econet.tex' failed with diagnostics:
Running 'texi2dvi' on 'econet.tex' failed.
LaTeX errors:
! LaTeX Error: File `thumbpdf.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.6 \usepackage
{framed}^^M
! ==> Fatal error occurred, no output PDF file produced!
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tnet’
All declared Imports should be used.
```
# eda4treeR
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dae’ ‘dplyr’
All declared Imports should be used.
```
# EFDR
Version: 0.1.1
## In both
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/EFDR/new/EFDR.Rcheck/00_pkg_src/EFDR/R/EFDR_functions.R:686)
.relist.dwt: no visible global function definition for ‘as’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/EFDR/new/EFDR.Rcheck/00_pkg_src/EFDR/R/EFDR_functions.R:686)
.std.wav.coeff : <anonymous>: no visible global function definition for
‘mad’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/EFDR/new/EFDR.Rcheck/00_pkg_src/EFDR/R/EFDR_functions.R:698)
regrid: no visible global function definition for ‘predict’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/EFDR/new/EFDR.Rcheck/00_pkg_src/EFDR/R/EFDR_functions.R:391-396)
regrid: no visible global function definition for ‘var’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/EFDR/new/EFDR.Rcheck/00_pkg_src/EFDR/R/EFDR_functions.R:406)
regrid: no visible global function definition for ‘medpolish’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/EFDR/new/EFDR.Rcheck/00_pkg_src/EFDR/R/EFDR_functions.R:427)
Undefined global functions or variables:
as mad medpolish pnorm predict relist rnorm var
Consider adding
importFrom("methods", "as")
importFrom("stats", "mad", "medpolish", "pnorm", "predict", "rnorm",
"var")
importFrom("utils", "relist")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# ELMER
Version: 2.4.4
## In both
* R CMD check timed out
* checking dependencies in R code ... WARNING
```
'::' or ':::' import not declared from: 'data.table'
':::' call which should be '::': 'TCGAbiolinks:::TCGAVisualize_volcano'
See the note in ?`:::` about the use of this operator.
Unexported objects imported by ':::' calls:
'TCGAbiolinks:::colDataPrepare' 'TCGAbiolinks:::get.GRCh.bioMart'
See the note in ?`:::` about the use of this operator.
```
* checking installed package size ... NOTE
```
installed size is 45.1Mb
sub-directories of 1Mb or more:
doc 44.1Mb
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/motif.TF.Plots.R:148-157)
scatter: no visible binding for global variable 'value'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/Scatter.plot.R:217-231)
scatter: no visible global function definition for 'cor.test'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/Scatter.plot.R:236-238)
scatter: no visible binding for global variable 'mae'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/Scatter.plot.R:250-267)
TF.rank.plot: no visible binding for global variable 'pvalue'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/motif.TF.Plots.R:291-304)
TF.rank.plot: no visible binding for global variable 'label'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/motif.TF.Plots.R:291-304)
TF.rank.plot: no visible binding for global variable 'Gene'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ELMER/new/ELMER.Rcheck/00_pkg_src/ELMER/R/motif.TF.Plots.R:308-320)
Undefined global functions or variables:
cor.test fisher.test Gene GeneID gr hm450.hg38.manifest Hugo_Symbol
label lowerOR mae motif OR precede Probe pvalue subsetByOverlaps
Target TF upperOR value write.table x y z
Consider adding
importFrom("stats", "cor.test", "fisher.test")
importFrom("utils", "write.table")
to your NAMESPACE file.
```
# ELMER.data
Version: 2.4.2
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking installed package size ... NOTE
```
installed size is 288.8Mb
sub-directories of 1Mb or more:
data 286.3Mb
doc 2.4Mb
```
* checking DESCRIPTION meta-information ... NOTE
```
Malformed Description field: should contain one or more complete sentences.
```
* checking data for non-ASCII characters ... NOTE
```
Error in .requirePackage(package) :
unable to find required package 'MultiAssayExperiment'
Calls: <Anonymous> ... .extendsForS3 -> extends -> getClassDef -> .requirePackage
Execution halted
```
# emuR
Version: 1.1.2
## In both
* checking installed package size ... NOTE
```
installed size is 7.1Mb
sub-directories of 1Mb or more:
doc 1.2Mb
extdata 1.5Mb
R 3.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘git2r’ ‘servr’
All declared Imports should be used.
```
# ENCODExplorer
Version: 2.6.0
## In both
* checking installed package size ... NOTE
```
installed size is 74.0Mb
sub-directories of 1Mb or more:
data 24.1Mb
doc 1.5Mb
extdata 48.0Mb
```
* checking R code for possible problems ... NOTE
```
...
step6_target: no visible binding for global variable ‘target’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ENCODExplorer/new/ENCODExplorer.Rcheck/00_pkg_src/ENCODExplorer/R/prepare_data.R:354-355)
step7: no visible binding for global variable ‘organism’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ENCODExplorer/new/ENCODExplorer.Rcheck/00_pkg_src/ENCODExplorer/R/prepare_data.R:424-425)
step8: no visible binding for global variable ‘investigated_as’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ENCODExplorer/new/ENCODExplorer.Rcheck/00_pkg_src/ENCODExplorer/R/prepare_data.R:436-437)
step8: no visible binding for global variable ‘target’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ENCODExplorer/new/ENCODExplorer.Rcheck/00_pkg_src/ENCODExplorer/R/prepare_data.R:439-440)
step9: no visible binding for global variable ‘organism’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ENCODExplorer/new/ENCODExplorer.Rcheck/00_pkg_src/ENCODExplorer/R/prepare_data.R:449-450)
Undefined global functions or variables:
. accession antibody_caption antibody_characterization
antibody_target assay biological_replicate_number biosample_name
biosample_type col_name controls data date_released download.file
encode_df Experiment file_accession file_format href investigated_as
lab nucleic_acid_term organism platform project replicate_antibody
replicate_library server status submitted_by target
technical_replicate_number treatment ui value Value
Consider adding
importFrom("utils", "data", "download.file")
to your NAMESPACE file.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 771 marked UTF-8 strings
```
# epicontacts
Version: 1.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘colorspace’
All declared Imports should be used.
```
# EpiReport
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘extrafont’ ‘graphics’ ‘knitr’ ‘rmarkdown’ ‘utils’
All declared Imports should be used.
```
# EpiSignalDetection
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘DT’ ‘ggplot2’ ‘knitr’ ‘pander’
All declared Imports should be used.
```
# epos
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘testthat’ ‘tidyr’
All declared Imports should be used.
```
# ergm
Version: 3.9.4
## In both
* checking re-building of vignette outputs ... WARNING
```
...
samplk2 (samplk) Longitudinal networks of positive affection
within a monastery as a "network" object
samplk3 (samplk) Longitudinal networks of positive affection
within a monastery as a "network" object
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'ergm.tex' failed.
LaTeX errors:
! LaTeX Error: File `multirow.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.57 \newcommand
{\myverb}[1]{\mbox{\texttt{#1}}}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘Rmpi’
```
* checking installed package size ... NOTE
```
installed size is 7.2Mb
sub-directories of 1Mb or more:
doc 1.7Mb
R 4.2Mb
```
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘tergm’, ‘ergm.count’, ‘networkDynamic’
```
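Several vignette-rebuild WARNINGs in this report are environment problems, not package bugs: texi2dvi aborts because a TeX style file is missing (here `multirow.sty`; elsewhere `subfig.sty`, `float.sty`, `bbm.sty`, `media9.sty`). On TeX Live each `.sty` usually ships in a package of the same name installable with `tlmgr`. A sketch that builds the install command (printed rather than executed, since `tlmgr` may need elevated rights or may not match the local TeX distribution):

```shell
# Sketch: map the missing .sty files reported by texi2dvi to TeX Live
# package names and assemble the tlmgr invocation. Run the printed
# command manually once you have confirmed the distribution is TeX Live.
missing="multirow subfig float bbm media9"
cmd="tlmgr install $missing"
echo "$cmd"
```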
# eurostat
Version: 3.3.1.3
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 595 marked UTF-8 strings
```
# EventStudy
Version: 0.35
## In both
* checking installed package size ... NOTE
```
installed size is 6.1Mb
sub-directories of 1Mb or more:
doc 5.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘curl’ ‘openxlsx’ ‘stringr’
All declared Imports should be used.
```
# ezpickr
Version: 1.0.3
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘BrailleR’
```
# ezplot
Version: 0.2.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# factorMerger
Version: 0.3.6
## In both
* checking PDF version of manual ... WARNING
```
LaTeX errors when creating PDF version.
This typically indicates Rd problems.
LaTeX errors found:
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘forcats’ ‘formula.tools’
All declared Imports should be used.
```
# fastLink
Version: 0.5.0
## In both
* checking whether package ‘fastLink’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/fastLink/new/fastLink.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘fastLink’ ...
** package ‘fastLink’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/fastLink/RcppArmadillo/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/fastLink/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/fastLink/RcppEigen/include" -I/usr/local/include -fopenmp -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘fastLink’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/fastLink/new/fastLink.Rcheck/fastLink’
```
### CRAN
```
* installing *source* package ‘fastLink’ ...
** package ‘fastLink’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/fastLink/RcppArmadillo/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/fastLink/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/fastLink/RcppEigen/include" -I/usr/local/include -fopenmp -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘fastLink’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/fastLink/old/fastLink.Rcheck/fastLink’
```
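The `clang: error: unsupported option '-fopenmp'` failures here (and for FSelectorRcpp below) come from Apple's bundled clang, which lacks OpenMP support. The usual workaround is to point R at an OpenMP-capable toolchain through `~/.R/Makevars`. A sketch, with the Homebrew LLVM paths below as assumptions and output written to a scratch directory instead of `$HOME/.R`:

```shell
# Sketch: a ~/.R/Makevars override so R builds packages with an
# OpenMP-capable compiler. The /usr/local/opt/llvm paths assume a
# Homebrew LLVM install; adjust for your system. Written to a temp
# dir here so the sketch is safe to run as-is.
makevars_dir="$(mktemp -d)"
cat > "$makevars_dir/Makevars" <<'EOF'
CC=/usr/local/opt/llvm/bin/clang
CXX=/usr/local/opt/llvm/bin/clang++
LDFLAGS=-L/usr/local/opt/llvm/lib
EOF
grep -c '^C' "$makevars_dir/Makevars"
```

With the file in place at `~/.R/Makevars`, reinstalling the package picks up the new compiler automatically.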
# fastR2
Version: 1.2.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.0Mb
sub-directories of 1Mb or more:
snippet 3.7Mb
```
# febr
Version: 1.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘knitr’
All declared Imports should be used.
```
# FedData
Version: 2.5.6
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘jsonlite’
All declared Imports should be used.
```
# fedregs
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘rvest’ ‘stringi’
All declared Imports should be used.
```
# finalfit
Version: 0.9.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘readr’
All declared Imports should be used.
```
# FindMyFriends
Version: 1.10.0
## In both
* checking for code/documentation mismatches ... WARNING
```
Functions or methods with usage in documentation object 'pgVirtual-class' but not in code:
as
```
* checking installed package size ... NOTE
```
installed size is 6.0Mb
sub-directories of 1Mb or more:
doc 1.5Mb
extdata 1.8Mb
R 2.1Mb
```
* checking dependencies in R code ... NOTE
```
Unexported objects imported by ':::' calls:
‘gtable:::insert.unit’ ‘gtable:::z_arrange_gtables’
See the note in ?`:::` about the use of this operator.
```
# fingertipscharts
Version: 0.0.4
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘curl’ ‘mapproj’
All declared Imports should be used.
```
# fingertipsR
Version: 0.2.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘purrr’
All declared Imports should be used.
```
# fitteR
Version: 0.1.0
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘ExtDist’
```
# flora
Version: 0.3.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.3Mb
sub-directories of 1Mb or more:
R 7.2Mb
```
# flowWorkspace
Version: 3.28.2
## In both
* checking installed package size ... NOTE
```
installed size is 33.8Mb
sub-directories of 1Mb or more:
doc 1.4Mb
lib 27.0Mb
libs 2.9Mb
R 2.1Mb
```
* checking DESCRIPTION meta-information ... NOTE
```
Versioned 'LinkingTo' values for
‘BH’ ‘cytolib’
are only usable in R >= 3.0.2
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘graphics’ ‘grDevices’ ‘RBGL’
All declared Imports should be used.
Unexported objects imported by ':::' calls:
‘flowCore:::.estimateLogicle’ ‘flowCore:::checkClass’
‘flowCore:::copyFlowSet’ ‘flowCore:::guid’
‘flowCore:::logicle_transform’ ‘flowCore:::updateTransformKeywords’
‘graph:::.makeEdgeKeys’ ‘lattice:::updateList’
‘ncdfFlow:::.isValidSamples’ ‘stats:::.splinefun’
See the note in ?`:::` about the use of this operator.
There are ::: calls to the package's namespace in its code. A package
almost never needs to use ::: for its own objects:
‘.cpp_setIndices’ ‘.getNodeInd’
```
* checking R code for possible problems ... NOTE
```
...
show,flowJoWorkspace: no visible binding for global variable
‘groupName’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/flowWorkspace/new/flowWorkspace.Rcheck/00_pkg_src/flowWorkspace/R/flowJoWorkspace_Methods.R:66)
transform,GatingSet: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/flowWorkspace/new/flowWorkspace.Rcheck/00_pkg_src/flowWorkspace/R/GatingSet_Methods.R:2291-2296)
transform,GatingSet: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/flowWorkspace/new/flowWorkspace.Rcheck/00_pkg_src/flowWorkspace/R/GatingSet_Methods.R:2298-2308)
transform,GatingSet : <anonymous>: no visible global function
definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/flowWorkspace/new/flowWorkspace.Rcheck/00_pkg_src/flowWorkspace/R/GatingSet_Methods.R:2301-2302)
Undefined global functions or variables:
. .hasSlot as as.formula callNextMethod desc extends gray groupName
IQR is median new node old openCyto.count parallel sampleName
selectMethod slot validObject xml.count
Consider adding
importFrom("grDevices", "gray")
importFrom("methods", ".hasSlot", "as", "callNextMethod", "extends",
"is", "new", "selectMethod", "slot", "validObject")
importFrom("stats", "as.formula", "IQR", "median")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
* checking compiled code ... NOTE
```
File ‘flowWorkspace/libs/flowWorkspace.so’:
Found ‘__ZNSt3__14coutE’, possibly from ‘std::cout’ (C++)
Object: ‘R_GatingSet.o’
Compiled code should not call entry points which might terminate R nor
write to stdout/stderr instead of to the console, nor use Fortran I/O
nor system RNGs.
See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
```
# fredr
Version: 1.0.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 1 marked UTF-8 string
```
# FRK
Version: 0.2.2
## In both
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Note: show_basis assumes spherical distance functions when plotting
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'FRK_intro.tex' failed.
LaTeX errors:
! LaTeX Error: File `subfig.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.58 \usepackage
{graphicx}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking: ‘dggrids’ ‘INLA’
```
* checking installed package size ... NOTE
```
installed size is 9.7Mb
sub-directories of 1Mb or more:
data 5.3Mb
doc 2.1Mb
R 2.1Mb
```
# FSA
Version: 0.8.22
## In both
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘alr4’, ‘prettyR’, ‘RMark’, ‘pgirmess’, ‘agricolae’
```
# FSelectorRcpp
Version: 0.3.0
## In both
* checking whether package ‘FSelectorRcpp’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/FSelectorRcpp/new/FSelectorRcpp.Rcheck/00install.out’ for details.
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘RTCGA.rnaseq’
```
## Installation
### Devel
```
* installing *source* package ‘FSelectorRcpp’ ...
** package ‘FSelectorRcpp’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/BH/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/RcppArmadillo/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/testthat/include" -I/usr/local/include -fopenmp -I../inst/include -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘FSelectorRcpp’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/FSelectorRcpp/new/FSelectorRcpp.Rcheck/FSelectorRcpp’
```
### CRAN
```
* installing *source* package ‘FSelectorRcpp’ ...
** package ‘FSelectorRcpp’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/BH/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/RcppArmadillo/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/FSelectorRcpp/testthat/include" -I/usr/local/include -fopenmp -I../inst/include -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘FSelectorRcpp’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/FSelectorRcpp/old/FSelectorRcpp.Rcheck/FSelectorRcpp’
```
# ftDK
Version: 1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 39 marked UTF-8 strings
```
# furniture
Version: 1.9.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# furrowSeg
Version: 1.8.0
## In both
* checking whether package ‘furrowSeg’ can be installed ... WARNING
```
Found the following significant warnings:
Warning: replacing previous import ‘EBImage::abind’ by ‘abind::abind’ when loading ‘furrowSeg’
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00install.out’ for details.
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Warning in has_utility("pdfcrop") :
pdfcrop not installed or not in PATH
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'exampleFurrowSegmentation.tex' failed.
LaTeX errors:
! LaTeX Error: File `float.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.30 \date
{}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 351.8Mb
sub-directories of 1Mb or more:
data 350.7Mb
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00_pkg_src/furrowSeg/R/featureAnalysis.R:76-77)
plotFeatureEvolution: no visible global function definition for
‘polygon’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00_pkg_src/furrowSeg/R/featureAnalysis.R:78-79)
plotFeatureEvolution: no visible global function definition for ‘rgb’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00_pkg_src/furrowSeg/R/featureAnalysis.R:78-79)
plotFeatureEvolution: no visible global function definition for ‘axis’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00_pkg_src/furrowSeg/R/featureAnalysis.R:80)
plotFeatureEvolution: no visible global function definition for ‘mtext’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00_pkg_src/furrowSeg/R/featureAnalysis.R:81)
plotFeatureEvolution: no visible global function definition for ‘title’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/furrowSeg/new/furrowSeg.Rcheck/00_pkg_src/furrowSeg/R/featureAnalysis.R:84)
Undefined global functions or variables:
abline axis median mtext par plot points polygon predict quantile rgb
title
Consider adding
importFrom("graphics", "abline", "axis", "mtext", "par", "plot",
"points", "polygon", "title")
importFrom("grDevices", "rgb")
importFrom("stats", "median", "predict", "quantile")
to your NAMESPACE file.
```
# fxtract
Version: 0.9.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘future’ ‘utils’
All declared Imports should be used.
```
# GA4GHclient
Version: 1.4.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
══ testthat results ═══════════════════════════════════════════════════════════
OK: 14 SKIPPED: 0 FAILED: 82
1. Error: getBiosample works (@test-getBiosample.R#6)
2. Error: getCallSet works (@test-getCallSet.R#6)
3. Error: getDataset works (@test-getDataset.R#6)
4. Error: getExpressionLevel works (@test-getExpressionLevel.R#6)
5. Error: getFeature works (@test-getFeature.R#6)
6. Error: getFeatureSet works (@test-getFeatureSet.R#6)
7. Error: getIndividual works (@test-getIndividual.R#6)
8. Error: getReadGroupSet works (@test-getReadGroupSet.R#6)
9. Error: getReference works (@test-getReference.R#6)
1. ...
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:base':
anyDuplicated, append, as.data.frame, basename, cbind, colMeans,
colnames, colSums, dirname, do.call, duplicated, eval, evalq,
Filter, Find, get, grep, grepl, intersect, is.unsorted, lapply,
lengths, Map, mapply, match, mget, order, paste, pmax, pmax.int,
pmin, pmin.int, Position, rank, rbind, Reduce, rowMeans,
rownames, rowSums, sapply, setdiff, sort, table, tapply, union,
unique, unsplit, which, which.max, which.min
Attaching package: 'S4Vectors'
The following object is masked from 'package:base':
expand.grid
Quitting from lines 129-133 (GA4GHclient.Rmd)
Error: processing vignette 'GA4GHclient.Rmd' failed with diagnostics:
there is no package called 'org.Hs.eg.db'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘org.Hs.eg.db’ ‘TxDb.Hsapiens.UCSC.hg19.knownGene’
```
# GA4GHshiny
Version: 1.2.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
2: stop(txt, domain = NA)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 2 SKIPPED: 0 FAILED: 8
1. Error: app works (@test-app.R#5)
2. Error: getGene works (@test-getGene.R#4)
3. Error: getGeneSymbols works (@test-getGeneSymbols.R#4)
4. Error: initializeReferences works (@test-initializeReferences.R#6)
5. Error: initializeVariantSet works (@test-initializeVariantSet.R#6)
6. Error: (unknown) (@test-searchVariantsByGeneSymbol.R#3)
7. Error: tidyVariants works with searchVariants output (@test-tidyVariants.R#6)
8. Error: tidyVariants works with searchVariantsByGeneSymbol output (@test-tidyVariants.R#16)
Error: testthat unit tests failed
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘org.Hs.eg.db’ ‘TxDb.Hsapiens.UCSC.hg19.knownGene’
```
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
# gaiah
Version: 0.0.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggplot2’ ‘maptools’ ‘rgeos’ ‘stringr’ ‘tidyr’
All declared Imports should be used.
```
# gastempt
Version: 0.4.4
## In both
* checking installed package size ... NOTE
```
installed size is 7.7Mb
sub-directories of 1Mb or more:
libs 7.2Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘methods’ ‘rstantools’
All declared Imports should be used.
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# geex
Version: 1.0.11
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘rootSolve’
All declared Imports should be used.
```
# genBaRcode
Version: 1.1.0
## In both
* checking examples ... ERROR
```
Running examples in ‘genBaRcode-Ex.R’ failed
The error most likely occurred in:
> ### Name: plotClusterGgTree
> ### Title: Plotting a Cluster ggTree
> ### Aliases: plotClusterGgTree
>
> ### ** Examples
>
> data(BC_dat)
> plotClusterGgTree(BC_dat, tree_est = "UPGMA", type = "circular")
Error: object ‘as_data_frame’ is not exported by 'namespace:tidytree'
Execution halted
```
# gender
Version: 0.5.2
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘genderdata’
```
# GENESIS
Version: 2.10.1
## In both
* checking examples ... ERROR
```
...
>
> # simulate some phenotype data
> data(pedigree)
> pedigree <- pedigree[match(seqGetData(gds, "sample.id"), pedigree$sample.id),]
> pedigree$outcome <- rnorm(nrow(pedigree))
>
> # construct a SeqVarData object
> seqData <- SeqVarData(gds, sampleData=AnnotatedDataFrame(pedigree))
>
> # fit the null model
> nullmod <- fitNullModel(seqData, outcome="outcome", covars="sex")
>
> # burden test - Range Iterator
> gr <- GRanges(seqnames=rep(1,3), ranges=IRanges(start=c(1e6, 2e6, 3e6), width=1e6))
> iterator <- SeqVarRangeIterator(seqData, variantRanges=gr)
# of selected variants: 3
> assoc <- assocTestAggregate(iterator, nullmod, test="Burden")
# of selected samples: 90
Error in n() : could not find function "n"
Calls: assocTestAggregate ... variantInfo -> .local -> mutate_ -> mutate_.tbl_df -> mutate_impl
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
══ testthat results ═══════════════════════════════════════════════════════════
OK: 222 SKIPPED: 0 FAILED: 15
1. Error: window (@test_assocTestAggregate.R#10)
2. Error: ranges (@test_assocTestAggregate.R#24)
3. Error: list (@test_assocTestAggregate.R#40)
4. Error: user weights (@test_assocTestAggregate.R#56)
5. Error: exclude monomorphs (@test_assocTestAggregate.R#71)
6. Error: exclude common (@test_assocTestAggregate.R#84)
7. Error: select alleles (@test_assocTestAggregate.R#92)
8. Error: select alleles with mismatch (@test_assocTestAggregate.R#135)
9. Error: select alleles with duplicate variants (@test_assocTestAggregate.R#162)
1. ...
Error: testthat unit tests failed
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘GWASdata’
```
* checking dependencies in R code ... NOTE
```
Unexported object imported by a ':::' call: ‘survey:::saddle’
See the note in ?`:::` about the use of this operator.
```
# geneXtendeR
Version: 1.6.0
## In both
* checking package dependencies ... ERROR
```
Packages required but not available:
‘GO.db’ ‘org.Rn.eg.db’ ‘org.Ag.eg.db’ ‘org.Bt.eg.db’ ‘org.Ce.eg.db’
‘org.Cf.eg.db’ ‘org.Dm.eg.db’ ‘org.Dr.eg.db’ ‘org.Gg.eg.db’
‘org.Hs.eg.db’ ‘org.Mm.eg.db’ ‘org.Mmu.eg.db’ ‘org.Pt.eg.db’
‘org.Sc.sgd.db’ ‘org.Ss.eg.db’ ‘org.Xl.eg.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# GenomicDataCommons
Version: 1.4.3
## In both
* checking Rd \usage sections ... WARNING
```
Undocumented arguments in documentation object '.htseq_importer'
‘fnames’
Functions with \usage entries need to have the appropriate \alias
entries, and all their arguments documented.
The \usage entries must correspond to syntactically valid R code.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
```
* checking R code for possible problems ... NOTE
```
default_fields.character: no visible binding for global variable
‘defaults’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GenomicDataCommons/new/GenomicDataCommons.Rcheck/00_pkg_src/GenomicDataCommons/R/fields.R:51)
gdc_rnaseq: no visible binding for global variable ‘case_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GenomicDataCommons/new/GenomicDataCommons.Rcheck/00_pkg_src/GenomicDataCommons/R/gdc_rnaseq.R:106-107)
gdc_rnaseq: no visible binding for global variable ‘file_id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GenomicDataCommons/new/GenomicDataCommons.Rcheck/00_pkg_src/GenomicDataCommons/R/gdc_rnaseq.R:106-107)
Undefined global functions or variables:
case_id defaults file_id
```
# GenomicInteractions
Version: 1.14.0
## In both
* checking installed package size ... NOTE
```
installed size is 11.5Mb
sub-directories of 1Mb or more:
doc 1.8Mb
extdata 7.9Mb
```
# GenomicMating
Version: 2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# GEOmetadb
Version: 1.42.0
## In both
* R CMD check timed out
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking R code for possible problems ... NOTE
```
getSQLiteFile: no visible global function definition for
‘download.file’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOmetadb/new/GEOmetadb.Rcheck/00_pkg_src/GEOmetadb/R/getSQLiteFile.R:6)
Undefined global functions or variables:
download.file
Consider adding
importFrom("utils", "download.file")
to your NAMESPACE file.
```
# GEOquery
Version: 2.48.0
## In both
* checking installed package size ... NOTE
```
installed size is 13.9Mb
sub-directories of 1Mb or more:
extdata 12.8Mb
```
* checking whether the namespace can be loaded with stated dependencies ... NOTE
```
Warning: no function found corresponding to methods exports from ‘GEOquery’ for: ‘show’
A namespace must be able to be loaded with just the base namespace
loaded: otherwise if the namespace gets loaded by a saved object, the
session will be unable to start.
Probably some imports need to be declared in the NAMESPACE file.
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘httr’
All declared Imports should be used.
Package in Depends field not imported from: ‘methods’
These packages need to be imported from (in the NAMESPACE file)
for when this namespace is loaded but not attached.
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:531-539)
parseGSEMatrix: no visible binding for global variable ‘accession’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:531-539)
parseGSEMatrix: no visible binding for global variable ‘accession’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:541-542)
parseGSEMatrix: no visible global function definition for ‘new’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:568)
parseGSEMatrix: no visible global function definition for ‘new’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:590)
parseGSEMatrix: no visible global function definition for ‘new’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:606-610)
parseGSEMatrix: no visible global function definition for ‘as’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/GEOquery/new/GEOquery.Rcheck/00_pkg_src/GEOquery/R/parseGEO.R:606-610)
Undefined global functions or variables:
. accession as characteristics k kvpair MA new read.delim read.table
v
Consider adding
importFrom("methods", "as", "new")
importFrom("utils", "read.delim", "read.table")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# geosample
Version: 0.2.1
## In both
* checking re-building of vignette outputs ... WARNING
```
...
If you have questions, suggestions, or comments regarding the 'maxLik' package, please use a forum or 'tracker' at maxLik's R-Forge site:
https://r-forge.r-project.org/projects/maxlik/
Loading required package: raster
Loading required package: pdist
Loading required package: Matrix
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'geosample-vignette.tex' failed.
LaTeX errors:
! LaTeX Error: File `bbm.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.63 \usepackage
{amsmath}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# geoSpectral
Version: 0.17.4
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
R 2.1Mb
test_data 2.9Mb
```
# GerminaR
Version: 1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘DT’ ‘shinydashboard’
All declared Imports should be used.
```
# gespeR
Version: 1.12.0
## In both
* checking for code/documentation mismatches ... WARNING
```
Codoc mismatches from documentation object 'c,Phenotypes-method':
\S4method{c}{Phenotypes}
Code: function(x, ...)
Docs: function(x, ..., recursive = FALSE)
Argument names in docs not in code:
recursive
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/gespeR-functions.R:42-43)
.select.model: no visible global function definition for ‘predict’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/gespeR-functions.R:236)
concordance: no visible global function definition for ‘cor’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/gespeR-concordance.R:65)
lasso.rand: no visible global function definition for ‘runif’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/gespeR-functions.R:137-145)
plot.gespeR: no visible global function definition for ‘hist’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/gespeR-class.R:218)
stability.selection: no visible global function definition for ‘lm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/gespeR-functions.R:214)
Phenotypes,character: no visible global function definition for
‘read.delim’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/gespeR/new/gespeR.Rcheck/00_pkg_src/gespeR/R/Phenotypes-class.R:75)
Undefined global functions or variables:
coef cor hist lm predict read.delim runif
Consider adding
importFrom("graphics", "hist")
importFrom("stats", "coef", "cor", "lm", "predict", "runif")
importFrom("utils", "read.delim")
to your NAMESPACE file.
```
# getProxy
Version: 1.12
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘bitops’ ‘data.table’ ‘dplyr’ ‘httr’
All declared Imports should be used.
```
# ggalt
Version: 0.4.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘plotly’
All declared Imports should be used.
```
# ggdag
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggforce’ ‘plyr’
All declared Imports should be used.
```
# ggdistribute
Version: 1.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# ggedit
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# ggeffects
Version: 0.8.0
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘ordinal’
```
# ggenealogy
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 2356 marked UTF-8 strings
```
* checking re-building of vignette outputs ... NOTE
```
...
conversion failure on 'Ondřej Chochola' in 'mbcsToSbcs': dot substituted for <99>
Warning in grid.Call.graphics(C_text, as.graphicsAnnot(x$label), x$x, x$y, :
conversion failure on 'Ondřej Chochola' in 'mbcsToSbcs': dot substituted for <c5>
Warning in grid.Call.graphics(C_text, as.graphicsAnnot(x$label), x$x, x$y, :
conversion failure on 'Ondřej Chochola' in 'mbcsToSbcs': dot substituted for <99>
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'ggenealogy.tex' failed.
LaTeX errors:
! LaTeX Error: File `media9.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.4 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# ggfan
Version: 0.1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘colorspace’ ‘grid’ ‘rstan’
All declared Imports should be used.
```
# ggformula
Version: 0.9.1
## In both
* checking installed package size ... NOTE
```
installed size is 6.1Mb
sub-directories of 1Mb or more:
doc 2.7Mb
R 2.1Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tidyr’
All declared Imports should be used.
```
# ggguitar
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gridExtra’ ‘lazyeval’ ‘readr’
All declared Imports should be used.
```
# gginnards
Version: 0.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘grid’ ‘tibble’
All declared Imports should be used.
```
# ggiraph
Version: 0.6.0
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘rvg’
```
# ggiraphExtra
Version: 0.2.9
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggforce’ ‘webshot’ ‘ztable’
All declared Imports should be used.
```
# gglogo
Version: 0.1.3
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Attaching package: 'gglogo'
The following object is masked from 'package:ggplot2':
fortify
Warning in grid.Call.graphics(C_text, as.graphicsAnnot(x$label), x$x, x$y, :
no font could be found for family "Garamond"
Warning in grid.Call.graphics(C_text, as.graphicsAnnot(x$label), x$x, x$y, :
no font could be found for family "Garamond"
Warning in grid.Call.graphics(C_text, as.graphicsAnnot(x$label), x$x, x$y, :
no font could be found for family "Garamond"
Quitting from lines 46-52 (gglogo-alphabet.Rmd)
Error: processing vignette 'gglogo-alphabet.Rmd' failed with diagnostics:
replacement has 1 row, data has 0
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘knitr’
All declared Imports should be used.
```
# ggmcmc
Version: 1.2
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'v70i09.tex' failed.
LaTeX errors:
! Package babel Error: Unknown option `english'. Either you misspelled it
(babel) or the language definition file english.ldf was not foun
d.
See the babel package documentation for explanation.
! LaTeX Error: File `thumbpdf.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.65 \usepackage
{lmodern}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# ggnormalviolin
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘grid’ ‘scales’
All declared Imports should be used.
```
# ggplot2
Version: 3.1.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.4Mb
sub-directories of 1Mb or more:
doc 1.8Mb
R 3.9Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘mgcv’ ‘reshape2’ ‘viridisLite’
All declared Imports should be used.
```
# ggplotAssist
Version: 0.1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gcookbook’ ‘ggthemes’ ‘moonBook’ ‘tidyverse’
All declared Imports should be used.
```
# ggpmisc
Version: 0.3.0
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘gginnards’
```
# ggpol
Version: 0.0.4
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘grDevices’
All declared Imports should be used.
```
# ggpubr
Version: 0.2
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘FactoMineR’
```
# ggQQunif
Version: 0.1.5
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# ggquickeda
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘colourpicker’ ‘dplyr’ ‘DT’ ‘Formula’ ‘ggpmisc’ ‘ggrepel’ ‘grDevices’
‘gridExtra’ ‘Hmisc’ ‘lazyeval’ ‘markdown’ ‘plotly’ ‘quantreg’ ‘rlang’
‘shinyjs’ ‘table1’ ‘tidyr’
All declared Imports should be used.
```
# ggRandomForests
Version: 2.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘randomForest’
All declared Imports should be used.
```
# ggraph
Version: 1.0.2
## In both
* checking installed package size ... NOTE
```
installed size is 5.5Mb
sub-directories of 1Mb or more:
doc 3.0Mb
R 2.0Mb
```
# ggridges
Version: 0.5.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 6242 marked UTF-8 strings
```
# ggspatial
Version: 1.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘reshape2’ ‘rosm’
All declared Imports should be used.
```
# ggstatsplot
Version: 0.0.9
## In both
* checking installed package size ... NOTE
```
installed size is 6.3Mb
sub-directories of 1Mb or more:
help 4.1Mb
R 2.0Mb
```
# ggthemes
Version: 4.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 138 marked UTF-8 strings
```
# ggtree
Version: 1.12.7
## In both
* checking whether package ‘ggtree’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ggtree/new/ggtree.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘ggtree’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘ggtree’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ggtree/new/ggtree.Rcheck/ggtree’
```
### CRAN
```
* installing *source* package ‘ggtree’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘ggtree’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ggtree/old/ggtree.Rcheck/ggtree’
```
# ggwordcloud
Version: 0.3.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 102 marked UTF-8 strings
```
# glmmfields
Version: 0.1.2
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# googlesheets
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
# gQTLstats
Version: 1.12.0
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘Homo.sapiens’
Packages suggested but not available for checking:
‘geuvPack’ ‘geuvStore2’ ‘org.Hs.eg.db’
‘TxDb.Hsapiens.UCSC.hg19.knownGene’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# graphicalVAR
Version: 0.2.2
## In both
* checking whether package ‘graphicalVAR’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/graphicalVAR/new/graphicalVAR.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘graphicalVAR’ ...
** package ‘graphicalVAR’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/graphicalVAR/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/graphicalVAR/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘graphicalVAR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/graphicalVAR/new/graphicalVAR.Rcheck/graphicalVAR’
```
### CRAN
```
* installing *source* package ‘graphicalVAR’ ...
** package ‘graphicalVAR’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/graphicalVAR/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/graphicalVAR/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘graphicalVAR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/graphicalVAR/old/graphicalVAR.Rcheck/graphicalVAR’
```
# graphTweets
Version: 0.5.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘utils’
All declared Imports should be used.
```
# grasp2db
Version: 1.1.0
## In both
* R CMD check timed out
* checking for missing documentation entries ... WARNING
```
Undocumented code objects:
‘checkAnti’ ‘getJoinCompatible’ ‘GRASP2’
Undocumented data sets:
‘mml10p_nox’ ‘uniqueGexNames2.0’ ‘uniquePPDnames2.0’
All user-level objects in a package should have documentation entries.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
```
* checking data for non-ASCII characters ... WARNING
```
Warning: found non-ASCII string
'Beh<e7>et's disease' in object 'uniquePPDnames2.0'
```
* checking data for ASCII and uncompressed saves ... WARNING
```
Note: significantly better compression could be obtained
by using R CMD build --resave-data
old_size new_size compress
mml10p_nox.rda 7.1Mb 2.8Mb xz
uniquePPDnames2.0.rda 17Kb 15Kb bzip2
```
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘AnnotationHubData’
```
* checking installed package size ... NOTE
```
installed size is 7.6Mb
sub-directories of 1Mb or more:
data 7.1Mb
```
* checking DESCRIPTION meta-information ... NOTE
```
License components with restrictions not permitted:
Artistic-2.0 + file LICENSE
```
* checking R code for possible problems ... NOTE
```
.grasp2ToAnnotationHub: no visible global function definition for
‘outputFile’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/grasp2db/new/grasp2db.Rcheck/00_pkg_src/grasp2db/R/db_AnnotationHub.R:39)
.grasp2ToAnnotationHub: no visible global function definition for
‘outputFile’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/grasp2db/new/grasp2db.Rcheck/00_pkg_src/grasp2db/R/db_AnnotationHub.R:40)
checkAnti: no visible binding for global variable ‘chr_hg19’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/grasp2db/new/grasp2db.Rcheck/00_pkg_src/grasp2db/R/doanti.R:19-20)
getJoinCompatible: no visible binding for global variable ‘gwrngs19’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/grasp2db/new/grasp2db.Rcheck/00_pkg_src/grasp2db/R/doanti.R:7)
Undefined global functions or variables:
chr_hg19 gwrngs19 outputFile
```
# grattan
Version: 1.7.0.0
## In both
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘taxstats’ ‘taxstats1516’
```
# Greg
Version: 1.3
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘rmeta’
```
# gutenbergr
Version: 0.1.4
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 13617 marked UTF-8 strings
```
# hansard
Version: 0.6.3
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘mnis’
```
# harrietr
Version: 0.2.3
## In both
* checking whether package ‘harrietr’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/harrietr/new/harrietr.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘harrietr’ ...
** package ‘harrietr’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘harrietr’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/harrietr/new/harrietr.Rcheck/harrietr’
```
### CRAN
```
* installing *source* package ‘harrietr’ ...
** package ‘harrietr’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘harrietr’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/harrietr/old/harrietr.Rcheck/harrietr’
```
# heatwaveR
Version: 0.3.6
## In both
* checking whether package ‘heatwaveR’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/heatwaveR/new/heatwaveR.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘heatwaveR’ ...
** package ‘heatwaveR’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/heatwaveR/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/heatwaveR/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘heatwaveR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/heatwaveR/new/heatwaveR.Rcheck/heatwaveR’
```
### CRAN
```
* installing *source* package ‘heatwaveR’ ...
** package ‘heatwaveR’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/heatwaveR/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/heatwaveR/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘heatwaveR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/heatwaveR/old/heatwaveR.Rcheck/heatwaveR’
```
# heemod
Version: 0.9.4
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
doc 1.6Mb
R 2.1Mb
tabular 1.2Mb
```
# hettx
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘purrr’ ‘tidyverse’
All declared Imports should be used.
```
# hiAnnotator
Version: 1.14.0
## In both
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:424-428)
makeGRanges: no visible global function definition for ‘seqlengths’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:439-440)
makeGRanges: no visible global function definition for ‘seqlevels<-’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:464)
makeGRanges: no visible global function definition for ‘sortSeqlevels’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:464)
makeGRanges: no visible global function definition for ‘seqlevelsInUse’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:464)
makeGRanges: no visible global function definition for ‘seqlengths<-’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:465)
makeGRanges: no visible global function definition for ‘seqlevels’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiAnnotator/new/hiAnnotator.Rcheck/00_pkg_src/hiAnnotator/R/hiAnnotator.R:465)
Undefined global functions or variables:
breakInChunks countQueryHits detectCores dist featureName IRanges
keepSeqlevels mid n overlapsAny qStrand queryHits seqlengths
seqlengths<- seqlevels seqlevels<- seqlevelsInUse sortSeqlevels
subjectHits
Consider adding
importFrom("stats", "dist")
to your NAMESPACE file.
```
* checking files in ‘vignettes’ ... NOTE
```
The following directory looks like a leftover from 'knitr':
‘figure’
Please remove from your package.
```
# HiCcompare
Version: 1.2.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.0Mb
sub-directories of 1Mb or more:
data 5.8Mb
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/total_sum.R:62)
volcano: no visible binding for global variable ‘A’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/volcano.R:2)
volcano: no visible binding for global variable ‘adj.IF1’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/volcano.R:2)
volcano: no visible binding for global variable ‘adj.IF2’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/volcano.R:2)
volcano: no visible binding for global variable ‘p.value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/volcano.R:5)
volcano: no visible binding for global variable ‘A’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/volcano.R:5)
volcano: no visible binding for global variable ‘D’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/HiCcompare/new/HiCcompare.Rcheck/00_pkg_src/HiCcompare/R/volcano.R:5)
Undefined global functions or variables:
A adj.IF1 adj.IF2 adj.M axis bias.slope bp centromere_locations chr1
chr2 count D fold.change i IF IF1 IF2 j M p.adj p.value pnorm region1
region2 start1 start2 Z
Consider adding
importFrom("graphics", "axis")
importFrom("stats", "D", "pnorm")
to your NAMESPACE file.
```
# highcharter
Version: 0.7.0
## In both
* checking installed package size ... NOTE
```
installed size is 9.2Mb
sub-directories of 1Mb or more:
doc 3.7Mb
htmlwidgets 4.0Mb
```
# hiReadsProcessor
Version: 1.16.0
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.BBSoptions
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking R code for possible problems ... NOTE
```
...
vpairwiseAlignSeqs: no visible global function definition for
‘runLength’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiReadsProcessor/new/hiReadsProcessor.Rcheck/00_pkg_src/hiReadsProcessor/R/hiReadsProcessor.R:1393-1396)
vpairwiseAlignSeqs: no visible global function definition for ‘Rle’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiReadsProcessor/new/hiReadsProcessor.Rcheck/00_pkg_src/hiReadsProcessor/R/hiReadsProcessor.R:1398)
vpairwiseAlignSeqs: no visible global function definition for
‘runLength’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiReadsProcessor/new/hiReadsProcessor.Rcheck/00_pkg_src/hiReadsProcessor/R/hiReadsProcessor.R:1399-1402)
vpairwiseAlignSeqs: no visible global function definition for
‘runValue’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiReadsProcessor/new/hiReadsProcessor.Rcheck/00_pkg_src/hiReadsProcessor/R/hiReadsProcessor.R:1400-1401)
vpairwiseAlignSeqs: no visible global function definition for
‘runLength’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiReadsProcessor/new/hiReadsProcessor.Rcheck/00_pkg_src/hiReadsProcessor/R/hiReadsProcessor.R:1400-1401)
vpairwiseAlignSeqs: no visible global function definition for ‘IRanges’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/hiReadsProcessor/new/hiReadsProcessor.Rcheck/00_pkg_src/hiReadsProcessor/R/hiReadsProcessor.R:1420-1424)
Undefined global functions or variables:
breakInChunks clusteredValue clusteredValue.freq DataFrame
detectCores fasta.info IRanges IRangesList matches mclapply metadata
metadata<- misMatches qBaseInsert queryHits Rle runLength runValue
scanBamFlag ScanBamParam SimpleList tBaseInsert
```
# HMP16SData
Version: 1.0.1
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Attaching package: 'dendextend'
The following object is masked from 'package:stats':
cutree
========================================
circlize version 0.4.5
CRAN page: https://cran.r-project.org/package=circlize
Github page: https://github.com/jokergoo/circlize
Documentation: http://jokergoo.github.io/circlize_book/book/
If you use it in published research, please cite:
Gu, Z. circlize implements and enhances circular visualization
in R. Bioinformatics 2014.
========================================
Quitting from lines 58-71 (HMP16SData.Rmd)
Error: processing vignette 'HMP16SData.Rmd' failed with diagnostics:
there is no package called 'curatedMetagenomicData'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘curatedMetagenomicData’
```
* checking installed package size ... NOTE
```
installed size is 19.1Mb
sub-directories of 1Mb or more:
doc 1.5Mb
extdata 17.4Mb
```
# hpiR
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘knitr’
All declared Imports should be used.
```
# HTSSIP
Version: 1.4.0
## In both
* R CMD check timed out
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘igraph’
All declared Imports should be used.
```
# hurricaneexposure
Version: 0.0.1
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘hurricaneexposuredata’
```
# huxtable
Version: 4.4.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
Message: Failed to compile /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/huxtable/new/huxtable.Rcheck/tests/testthat/temp-artefacts/bookdown-test.tex. See bookdown-test.log for more info.
Class: simpleError/error/condition
bookdown-test.Rmd
bookdown::pdf_book
══ testthat results ═══════════════════════════════════════════════════════════
OK: 878 SKIPPED: 57 FAILED: 5
1. Failure: huxreg copes with different models (@test-huxreg.R#31)
2. Error: quick_pdf works (@test-quick-output.R#41)
3. Error: quick_pdf works with height and width options (@test-quick-output.R#53)
4. Failure: echo = TRUE does not cause option clash (@test-yy-end-to-end.R#108)
5. Failure: Bookdown files (@test-yy-end-to-end.R#143)
Error: testthat unit tests failed
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
doc 2.9Mb
R 2.0Mb
```
# HydeNet
Version: 0.10.9
## In both
* checking examples ... ERROR
```
...
>
> ### ** Examples
>
> data(PE, package="HydeNet")
> Net <- HydeNetwork(~ wells +
+ pe | wells +
+ d.dimer | pregnant*pe +
+ angio | pe +
+ treat | d.dimer*angio +
+ death | pe*treat,
+ data = PE)
>
>
> compiledNet <- compileJagsModel(Net, n.chains=5)
Error: .onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/HydeNet/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/HydeNet/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/HydeNet/rjags/libs/rjags.so
Reason: image not found
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
iter 80 value 14.018282
iter 80 value 14.018282
iter 90 value 14.017126
final value 14.015374
converged
══ testthat results ═══════════════════════════════════════════════════════════
OK: 60 SKIPPED: 0 FAILED: 5
1. Error: compileDecisionModel (@test_compileDecisionModel.R#14)
2. Error: (unknown) (@test-bindPosterior.R#12)
3. Error: compileJagsModel returns an object of class 'compiledHydeNetwork' (@test-compileJagsModel.R#14)
4. Error: (unknown) (@test-HydePosterior.R#11)
5. Error: (unknown) (@test-print.HydePosterior.R#11)
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Loading required package: nnet
Quitting from lines 314-325 (DecisionNetworks.Rmd)
Error: processing vignette 'DecisionNetworks.Rmd' failed with diagnostics:
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/HydeNet/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/HydeNet/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/HydeNet/rjags/libs/rjags.so
Reason: image not found
Execution halted
```
# hydrolinks
Version: 0.10.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dbplyr’
All declared Imports should be used.
```
# ICD10gm
Version: 1.0.3
## In both
* checking installed package size ... NOTE
```
installed size is 7.9Mb
sub-directories of 1Mb or more:
data 7.0Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 252748 marked UTF-8 strings
```
# iCNV
Version: 1.0.0
## In both
* checking whether package ‘iCNV’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/iCNV/new/iCNV.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘iCNV’ ...
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : package ‘BSgenome.Hsapiens.UCSC.hg19’ required by ‘CODEX’ could not be found
ERROR: lazy loading failed for package ‘iCNV’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/iCNV/new/iCNV.Rcheck/iCNV’
```
### CRAN
```
* installing *source* package ‘iCNV’ ...
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : package ‘BSgenome.Hsapiens.UCSC.hg19’ required by ‘CODEX’ could not be found
ERROR: lazy loading failed for package ‘iCNV’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/iCNV/old/iCNV.Rcheck/iCNV’
```
# iCOBRA
Version: 1.8.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.7Mb
sub-directories of 1Mb or more:
doc 1.2Mb
extdata 1.7Mb
R 2.1Mb
```
# IDE
Version: 0.2.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'IDE_intro.tex' failed.
LaTeX errors:
! LaTeX Error: File `algorithm.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.58 \usepackage
{algpseudocode}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# ideal
Version: 1.4.0
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘GO.db’
Packages suggested but not available for checking:
‘airway’ ‘org.Hs.eg.db’ ‘TxDb.Hsapiens.UCSC.hg38.knownGene’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# idealstan
Version: 0.7.1
## In both
* checking installed package size ... NOTE
```
installed size is 6.8Mb
sub-directories of 1Mb or more:
libs 5.1Mb
R 1.0Mb
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# idefix
Version: 0.3.3
## In both
* checking whether package ‘idefix’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/idefix/new/idefix.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘idefix’ ...
** package ‘idefix’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/idefix/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/idefix/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c InfoDes_cpp.cpp -o InfoDes_cpp.o
clang: error: unsupported option '-fopenmp'
make: *** [InfoDes_cpp.o] Error 1
ERROR: compilation failed for package ‘idefix’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/idefix/new/idefix.Rcheck/idefix’
```
### CRAN
```
* installing *source* package ‘idefix’ ...
** package ‘idefix’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/idefix/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/idefix/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c InfoDes_cpp.cpp -o InfoDes_cpp.o
clang: error: unsupported option '-fopenmp'
make: *** [InfoDes_cpp.o] Error 1
ERROR: compilation failed for package ‘idefix’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/idefix/old/idefix.Rcheck/idefix’
```
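The `clang: error: unsupported option '-fopenmp'` failure above (and the identical ones for iRF and lpirfs further down) is characteristic of Apple's stock clang, which ships without OpenMP support. One common workaround — a sketch only, assuming Homebrew's LLVM toolchain is acceptable on the check machine, which this report does not confirm — is to point `~/.R/Makevars` at a compiler that understands `-fopenmp`:

```make
# ~/.R/Makevars — paths are hypothetical; adjust to the local Homebrew prefix
CC      = /usr/local/opt/llvm/bin/clang
CXX     = /usr/local/opt/llvm/bin/clang++
CXX11   = /usr/local/opt/llvm/bin/clang++
LDFLAGS += -L/usr/local/opt/llvm/lib
```

With this in place, `R CMD INSTALL` picks up the OpenMP-capable compiler for all source packages, so the idefix, iRF, and lpirfs installations would be retried under the same fix.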
# IHW
Version: 1.8.0
## In both
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:S4Vectors':
first, intersect, rename, setdiff, setequal, union
The following objects are masked from 'package:BiocGenerics':
combine, intersect, setdiff, union
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Quitting from lines 42-47 (introduction_to_ihw.Rmd)
Error: processing vignette 'introduction_to_ihw.Rmd' failed with diagnostics:
there is no package called 'airway'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘airway’
```
* checking installed package size ... NOTE
```
installed size is 6.2Mb
sub-directories of 1Mb or more:
doc 5.8Mb
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHW/new/IHW.Rcheck/00_pkg_src/IHW/R/plots.R:101)
plot_decisionboundary: no visible binding for global variable
‘covariate’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHW/new/IHW.Rcheck/00_pkg_src/IHW/R/plots.R:110-112)
plot_decisionboundary: no visible binding for global variable ‘pvalue’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHW/new/IHW.Rcheck/00_pkg_src/IHW/R/plots.R:110-112)
plot_decisionboundary: no visible binding for global variable ‘fold’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHW/new/IHW.Rcheck/00_pkg_src/IHW/R/plots.R:110-112)
thresholds_ihwResult: no visible global function definition for
‘na.exclude’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHW/new/IHW.Rcheck/00_pkg_src/IHW/R/ihw_class.R:96-97)
thresholds,ihwResult: no visible global function definition for
‘na.exclude’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHW/new/IHW.Rcheck/00_pkg_src/IHW/R/ihw_class.R:96-97)
Undefined global functions or variables:
covariate fold gurobi mcols mcols<- metadata metadata<- na.exclude
p.adjust pvalue runif str stratum
Consider adding
importFrom("stats", "na.exclude", "p.adjust", "runif")
importFrom("utils", "str")
to your NAMESPACE file.
```
# IHWpaper
Version: 1.7.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
[1] "LSL GBH"
[1] "TST GBH"
[1] "SBH"
[1] "Clfdr"
[1] "Greedy Indep. Filt."
[1] "IHW"
[1] "IHW-Bonferroni E3"
[1] "Bonferroni"
[1] "qvalue"
══ testthat results ═══════════════════════════════════════════════════════════
OK: 1 SKIPPED: 0 FAILED: 1
1. Error: (unknown) (@test_analyze_datasets.R#4)
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 29-65 (BH-explanation.Rmd)
Error: processing vignette 'BH-explanation.Rmd' failed with diagnostics:
Palette not found.
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘airway’
```
* checking installed package size ... NOTE
```
installed size is 24.0Mb
sub-directories of 1Mb or more:
doc 13.1Mb
extdata 9.8Mb
```
* checking R code for possible problems ... NOTE
```
scott_fdrreg: no visible global function definition for ‘FDRreg’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/covariate_methods.R:88)
scott_fdrreg: no visible global function definition for ‘getFDR’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/covariate_methods.R:97)
sim_fun_eval: no visible binding for global variable ‘fdr_method’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/benchmarking.R:61-63)
sim_fun_eval: no visible binding for global variable ‘fdr_pars’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/benchmarking.R:61-63)
sim_fun_eval: no visible binding for global variable ‘FDP’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/benchmarking.R:61-63)
sim_fun_eval: no visible binding for global variable ‘rj_ratio’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/benchmarking.R:61-63)
sim_fun_eval: no visible binding for global variable ‘FPR’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/benchmarking.R:61-63)
sim_fun_eval: no visible binding for global variable ‘FWER’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IHWpaper/new/IHWpaper.Rcheck/00_pkg_src/IHWpaper/R/benchmarking.R:61-63)
Undefined global functions or variables:
FDP fdr_method fdr_pars FDRreg FPR FWER getFDR rj_ratio
```
# ijtiff
Version: 1.5.0
## In both
* checking whether package ‘ijtiff’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ijtiff/new/ijtiff.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘ijtiff’ ...
** package ‘ijtiff’ successfully unpacked and MD5 sums checked
Package libtiff-4 was not found in the pkg-config search path.
Perhaps you should add the directory containing `libtiff-4.pc'
to the PKG_CONFIG_PATH environment variable
No package 'libtiff-4' found
Using PKG_CFLAGS=
Using PKG_LIBS=-ltiff -ljpeg -lz
------------------------- ANTICONF ERROR ---------------------------
Configuration failed because libtiff-4 was not found. Try installing:
* deb: libtiff-dev (Debian, Ubuntu, etc)
* rpm: libtiff-devel (Fedora, EPEL)
* brew: libtiff (OSX)
If libtiff-4 is already installed, check that 'pkg-config' is in your
PATH and PKG_CONFIG_PATH contains a libtiff-4.pc file. If pkg-config
is unavailable you can set INCLUDE_DIR and LIB_DIR manually via:
R CMD INSTALL --configure-vars='INCLUDE_DIR=... LIB_DIR=...'
--------------------------------------------------------------------
ERROR: configuration failed for package ‘ijtiff’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ijtiff/new/ijtiff.Rcheck/ijtiff’
```
### CRAN
```
* installing *source* package ‘ijtiff’ ...
** package ‘ijtiff’ successfully unpacked and MD5 sums checked
Package libtiff-4 was not found in the pkg-config search path.
Perhaps you should add the directory containing `libtiff-4.pc'
to the PKG_CONFIG_PATH environment variable
No package 'libtiff-4' found
Using PKG_CFLAGS=
Using PKG_LIBS=-ltiff -ljpeg -lz
------------------------- ANTICONF ERROR ---------------------------
Configuration failed because libtiff-4 was not found. Try installing:
* deb: libtiff-dev (Debian, Ubuntu, etc)
* rpm: libtiff-devel (Fedora, EPEL)
* brew: libtiff (OSX)
If libtiff-4 is already installed, check that 'pkg-config' is in your
PATH and PKG_CONFIG_PATH contains a libtiff-4.pc file. If pkg-config
is unavailable you can set INCLUDE_DIR and LIB_DIR manually via:
R CMD INSTALL --configure-vars='INCLUDE_DIR=... LIB_DIR=...'
--------------------------------------------------------------------
ERROR: configuration failed for package ‘ijtiff’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/ijtiff/old/ijtiff.Rcheck/ijtiff’
```
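The ANTICONF block above already names its own remedies; on the macOS machine used for these checks the corresponding commands would look roughly like the following (a sketch assuming Homebrew, with a hypothetical pkg-config path — neither is verified by this report):

```sh
brew install libtiff
# ensure pkg-config can find libtiff-4.pc (path is an assumption)
export PKG_CONFIG_PATH=/usr/local/opt/libtiff/lib/pkgconfig:$PKG_CONFIG_PATH
# or bypass pkg-config entirely, as the error message itself suggests:
R CMD INSTALL --configure-vars='INCLUDE_DIR=/usr/local/include LIB_DIR=/usr/local/lib' ijtiff
```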
# imager
Version: 0.41.2
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -Dcimg_r_mode -fpermissive -I/usr/X11R6/include -I/opt/X11/include -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/imager/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/imager/new/imager.Rcheck/imager/include" -I"/private/var/folders/r_/1b2gjtsd7j92jbbpz4t7ps340000gn/T/Rtmphc2M4h/sourceCpp-x86_64-apple-darwin15.6.0-1.0.0" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c file4fb421a7273e.cpp -o file4fb421a7273e.o
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o sourceCpp_2.so file4fb421a7273e.o -lX11 -L/usr/X11R6/lib -L/opt/X11/include -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
── 1. Error: cpp_plugin (@test_cpp_api.R#14) ──────────────────────────────────
Error 1 occurred building shared library.
1: cppFunction(foo.inline, depends = "imager") at testthat/test_cpp_api.R:14
2: sourceCpp(code = code, env = env, rebuild = rebuild, cacheDir = cacheDir, showOutput = showOutput,
verbose = verbose)
3: stop("Error ", status, " occurred building shared library.")
══ testthat results ═══════════════════════════════════════════════════════════
OK: 17 SKIPPED: 0 FAILED: 1
1. Error: cpp_plugin (@test_cpp_api.R#14)
Error: testthat unit tests failed
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 14.9Mb
sub-directories of 1Mb or more:
data 1.4Mb
doc 5.3Mb
include 2.8Mb
libs 3.1Mb
```
# implyr
Version: 0.2.4
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
> library(RJDBC)
Loading required package: rJava
Error: package or namespace load failed for 'rJava':
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/implyr/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/implyr/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/implyr/rJava/libs/rJava.so
Reason: image not found
Error: package 'rJava' could not be loaded
Execution halted
```
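The `dyn.load` failure above means the JVM path compiled into `rJava.so` (`jdk-11.0.1.jdk`) no longer exists on the machine, most likely after a JDK upgrade or removal. The usual repair — a sketch assuming macOS with a current JDK installed, not something this report verifies — is to re-run R's Java configuration and rebuild rJava from source:

```sh
sudo R CMD javareconf
Rscript -e 'install.packages("rJava", type = "source")'
```

The same broken-JVM signature recurs for lilikoi and LLSR below, so a single javareconf pass would address all three.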
# incadata
Version: 0.6.4
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 568 marked UTF-8 strings
```
# incgraph
Version: 1.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘testthat’
All declared Imports should be used.
```
# incR
Version: 1.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘rgeos’
All declared Imports should be used.
```
# INDperform
Version: 0.2.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.4Mb
sub-directories of 1Mb or more:
data 3.1Mb
help 1.1Mb
R 1.1Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘lazyeval’
All declared Imports should be used.
```
# inlabru
Version: 2.1.9
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘INLA’
```
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
data 1.1Mb
misc 1.8Mb
R 2.0Mb
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘INLA’
```
# interplot
Version: 0.2.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘gridExtra’
All declared Imports should be used.
```
# IONiseR
Version: 2.4.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.8Mb
sub-directories of 1Mb or more:
doc 3.7Mb
extdata 1.5Mb
```
* checking R code for possible problems ... NOTE
```
...
‘idx’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IONiseR/new/IONiseR.Rcheck/00_pkg_src/IONiseR/R/Methods-subsetting.R:19-21)
[,Fast5Summary-ANY-ANY-ANY: no visible binding for global variable
‘component’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IONiseR/new/IONiseR.Rcheck/00_pkg_src/IONiseR/R/Methods-subsetting.R:24-26)
[,Fast5Summary-ANY-ANY-ANY: no visible binding for global variable
‘idx’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IONiseR/new/IONiseR.Rcheck/00_pkg_src/IONiseR/R/Methods-subsetting.R:24-26)
show,Fast5Summary: no visible binding for global variable ‘full_2D’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IONiseR/new/IONiseR.Rcheck/00_pkg_src/IONiseR/R/classes.R:70-71)
show,Fast5Summary: no visible binding for global variable ‘pass’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IONiseR/new/IONiseR.Rcheck/00_pkg_src/IONiseR/R/classes.R:75)
show,Fast5Summary: no visible binding for global variable ‘pass’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/IONiseR/new/IONiseR.Rcheck/00_pkg_src/IONiseR/R/classes.R:77)
Undefined global functions or variables:
:= AAAAA accumulation baseCalledComplement baseCalledTemplate
bases_called category channel circleFun component duration error freq
full_2D group hour idx matrixCol matrixRow mean_value meanZValue
median_signal minute mux name nbases new_reads num_events oddEven
pass pentamer rbindlist readIDs seq_length start_time time_bin
time_group TTTTT x y zvalue
```
# iotables
Version: 0.4.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 53206 marked UTF-8 strings
```
# ipeadatar
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘curl’
All declared Imports should be used.
```
# ipumsr
Version: 0.4.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘R6’
All declared Imports should be used.
```
# iRF
Version: 2.0.0
## In both
* checking whether package ‘iRF’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/iRF/new/iRF.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘iRF’ ...
** package ‘iRF’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/iRF/Rcpp/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c ExportedFunctionsRIT.cpp -o ExportedFunctionsRIT.o
clang: error: unsupported option '-fopenmp'
make: *** [ExportedFunctionsRIT.o] Error 1
ERROR: compilation failed for package ‘iRF’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/iRF/new/iRF.Rcheck/iRF’
```
### CRAN
```
* installing *source* package ‘iRF’ ...
** package ‘iRF’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/iRF/Rcpp/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c ExportedFunctionsRIT.cpp -o ExportedFunctionsRIT.o
clang: error: unsupported option '-fopenmp'
make: *** [ExportedFunctionsRIT.o] Error 1
ERROR: compilation failed for package ‘iRF’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/iRF/old/iRF.Rcheck/iRF’
```
# IrisSpatialFeatures
Version: 1.3.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
! LaTeX Error: Unknown graphics extension: .png?raw=true.
Error: processing vignette 'IrisSpatialFeatures.Rmd' failed with diagnostics:
Failed to compile IrisSpatialFeatures.tex. See IrisSpatialFeatures.log for more info.
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
data 2.1Mb
extdata 1.9Mb
```
# iSEE
Version: 1.0.1
## In both
* checking examples ... ERROR
```
Running examples in ‘iSEE-Ex.R’ failed
The error most likely occurred in:
> ### Name: annotateEnsembl
> ### Title: Annotation via ENSEMBL database
> ### Aliases: annotateEnsembl
>
> ### ** Examples
>
> library(scRNAseq)
Error in library(scRNAseq) : there is no package called ‘scRNAseq’
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
The following objects are masked from 'package:base':
aperm, apply
Loading required package: SingleCellExperiment
>
> test_check("iSEE")
Loading required package: scRNAseq
Error in eval(exprs, env) : require(scRNAseq) is not TRUE
Calls: test_check ... source_dir -> lapply -> FUN -> eval -> eval -> stopifnot
In addition: Warning message:
In library(package, lib.loc = lib.loc, character.only = TRUE, logical.return = TRUE, :
there is no package called 'scRNAseq'
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:Biobase':
anyMissing, rowMedians
Loading required package: BiocParallel
Attaching package: 'DelayedArray'
The following objects are masked from 'package:matrixStats':
colMaxs, colMins, colRanges, rowMaxs, rowMins, rowRanges
The following objects are masked from 'package:base':
aperm, apply
Loading required package: SingleCellExperiment
Quitting from lines 89-98 (iSEE_vignette.Rmd)
Error: processing vignette 'iSEE_vignette.Rmd' failed with diagnostics:
there is no package called 'scRNAseq'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘scRNAseq’ ‘org.Mm.eg.db’
Package which this enhances but not available for checking: ‘ExperimentHub’
```
* checking dependencies in R code ... NOTE
```
Unexported object imported by a ':::' call: ‘S4Vectors:::selectSome’
See the note in ?`:::` about the use of this operator.
```
# isomiRs
Version: 1.8.0
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘targetscan.Hs.eg.db’
Package suggested but not available for checking: ‘org.Mm.eg.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# ITNr
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘comtradr’
All declared Imports should be used.
```
# jaccard
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘magrittr’ ‘Rcpp’
All declared Imports should be used.
```
# JacobiEigen
Version: 0.3-3
## In both
* checking re-building of vignette outputs ... WARNING
```
...
fmtutil [INFO]: Total formats: 15
fmtutil [INFO]: exiting with status 0
tlmgr install fouriernc
TeX Live 2018 is frozen forever and will no
longer be updated. This happens in preparation for a new release.
If you're interested in helping to pretest the new release (when
pretests are available), please read http://tug.org/texlive/pretest.html.
Otherwise, just wait, and the new release will be ready in due time.
tlmgr: Fundamental package texlive.infra not present, uh oh, goodbyeShould not happen, texlive.infra not found at /usr/local/bin/tlmgr line 7344.
tlmgr: package repository http://mirrors.standaloneinstaller.com/ctan/systems/texlive/tlnet (not verified: gpg unavailable)
tlmgr path add
! LaTeX Error: File `fouriernc.sty' not found.
! Emergency stop.
<read *>
Error: processing vignette 'JacobiEigen.Rmd' failed with diagnostics:
Failed to compile JacobiEigen.tex. See JacobiEigen.log for more info.
Execution halted
```
# janeaustenr
Version: 0.1.5
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 1 marked UTF-8 string
```
# jpndistrict
Version: 0.3.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 502 marked UTF-8 strings
```
# kableExtra
Version: 1.0.1
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Error in re-building vignettes:
...
Attaching package: 'dplyr'
The following object is masked from 'package:kableExtra':
group_rows
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Quitting from lines 327-331 (awesome_table_in_html.Rmd)
Error: processing vignette 'awesome_table_in_html.Rmd' failed with diagnostics:
unused arguments ("Group 1", 4, 7)
Execution halted
```
# kitagawa
Version: 2.2-2
## In both
* checking re-building of vignette outputs ... WARNING
```
...
/Library/Frameworks/R.framework/Resources/bin/Rscript -e "if (getRversion() < '3.0.0') knitr::knit2pdf('ResponseModels.Rnw') else tools::texi2pdf('ResponseModels.tex')"
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'ResponseModels.tex' failed.
LaTeX errors:
! LaTeX Error: File `float.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.71 \usepackage
{natbib}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: <Anonymous> -> texi2dvi
Execution halted
make: *** [ResponseModels.pdf] Error 1
Error in buildVignettes(dir = "/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/kitagawa/new/kitagawa.Rcheck/vign_test/kitagawa") :
running 'make' failed
Execution halted
```
# kokudosuuchi
Version: 0.4.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 52458 marked UTF-8 strings
```
# KraljicMatrix
Version: 0.2.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘tibble’
All declared Imports should be used.
```
# labelled
Version: 2.1.0
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘memisc’
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘memisc’
```
# Lahman
Version: 6.0-0
## In both
* checking installed package size ... NOTE
```
installed size is 7.6Mb
sub-directories of 1Mb or more:
data 7.4Mb
```
# landscapemetrics
Version: 0.3.1
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘stars’
```
# lilikoi
Version: 0.1.0
## In both
* checking whether package ‘lilikoi’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/lilikoi/new/lilikoi.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘lilikoi’ ...
** package ‘lilikoi’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lilikoi/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lilikoi/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lilikoi/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘lilikoi’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/lilikoi/new/lilikoi.Rcheck/lilikoi’
```
### CRAN
```
* installing *source* package ‘lilikoi’ ...
** package ‘lilikoi’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lilikoi/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lilikoi/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lilikoi/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘lilikoi’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/lilikoi/old/lilikoi.Rcheck/lilikoi’
```
# live
Version: 1.5.10
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘e1071’
All declared Imports should be used.
```
# LLSR
Version: 0.0.2.19
## In both
* checking whether package ‘LLSR’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LLSR/new/LLSR.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘LLSR’ ...
** package ‘LLSR’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/LLSR/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/LLSR/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/LLSR/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘LLSR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LLSR/new/LLSR.Rcheck/LLSR’
```
### CRAN
```
* installing *source* package ‘LLSR’ ...
** package ‘LLSR’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/LLSR/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/LLSR/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/LLSR/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘LLSR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LLSR/old/LLSR.Rcheck/LLSR’
```
# LocFDRPois
Version: 1.0.0
## In both
* checking R code for possible problems ... NOTE
```
AnalyticalOptim: no visible global function definition for ‘optim’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LocFDRPois/new/LocFDRPois.Rcheck/00_pkg_src/LocFDRPois/R/locfdr.R:84)
LLConstructor : LL: no visible global function definition for ‘dpois’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LocFDRPois/new/LocFDRPois.Rcheck/00_pkg_src/LocFDRPois/R/locfdr.R:60)
LLConstructor : LL: no visible global function definition for ‘dpois’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LocFDRPois/new/LocFDRPois.Rcheck/00_pkg_src/LocFDRPois/R/locfdr.R:63-64)
MixtureDensity: no visible global function definition for ‘glm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LocFDRPois/new/LocFDRPois.Rcheck/00_pkg_src/LocFDRPois/R/locfdr.R:35)
MixtureDensity : f_hat: no visible global function definition for
‘predict’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LocFDRPois/new/LocFDRPois.Rcheck/00_pkg_src/LocFDRPois/R/locfdr.R:42)
NullDensity : f0: no visible global function definition for ‘dpois’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LocFDRPois/new/LocFDRPois.Rcheck/00_pkg_src/LocFDRPois/R/locfdr.R:106)
Undefined global functions or variables:
dpois glm optim predict
Consider adding
importFrom("stats", "dpois", "glm", "optim", "predict")
to your NAMESPACE file.
```
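This NOTE spells out its own fix. In the package's NAMESPACE file (or via roxygen2 `@importFrom` tags that generate it) the missing declaration would be the fragment the checker suggests:

```
importFrom("stats", "dpois", "glm", "optim", "predict")
```

The analogous one-line `importFrom("stats", "setNames")` addition resolves the equivalent NOTE for loopr below.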
# loopr
Version: 1.0.1
## In both
* checking R code for possible problems ... NOTE
```
amendColumns: no visible global function definition for ‘setNames’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/loopr/new/loopr.Rcheck/00_pkg_src/loopr/R/loopr.R:96-104)
fillColumns: no visible global function definition for ‘setNames’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/loopr/new/loopr.Rcheck/00_pkg_src/loopr/R/loopr.R:126-136)
Undefined global functions or variables:
setNames
Consider adding
importFrom("stats", "setNames")
to your NAMESPACE file.
```
# loose.rock
Version: 1.0.9
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘futile.options’ ‘ggfortify’ ‘grDevices’ ‘stats’
All declared Imports should be used.
```
# lpirfs
Version: 0.1.4
## In both
* checking whether package ‘lpirfs’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/lpirfs/new/lpirfs.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘lpirfs’ ...
** package ‘lpirfs’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I../inst/include/ -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lpirfs/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lpirfs/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘lpirfs’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/lpirfs/new/lpirfs.Rcheck/lpirfs’
```
### CRAN
```
* installing *source* package ‘lpirfs’ ...
** package ‘lpirfs’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I../inst/include/ -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lpirfs/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lpirfs/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘lpirfs’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/lpirfs/old/lpirfs.Rcheck/lpirfs’
```
# lucid
Version: 1.7
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Loading required package: lucid
Loading required package: lattice
Loading required package: rjags
Loading required package: coda
Error: package or namespace load failed for 'rjags':
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lucid/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lucid/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/lucid/rjags/libs/rjags.so
Reason: image not found
Quitting from lines 242-266 (lucid_examples.Rmd)
Error: processing vignette 'lucid_examples.Rmd' failed with diagnostics:
could not find function "jags.model"
Execution halted
```
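
The `rjags` load failure above is a missing system dependency rather than a package bug: the JAGS shared library named in the `dlopen` error is not installed on the check machine. A minimal diagnostic sketch (the path is taken verbatim from the error message; install the JAGS system library and reinstall rjags if it is absent):

```
# Check whether the JAGS shared library that rjags links against is present.
JAGS_LIB=/usr/local/lib/libjags.4.dylib   # path from the dlopen error above
if [ -e "$JAGS_LIB" ]; then
  jags_status="libjags found"
else
  jags_status="libjags missing - install the JAGS system library, then reinstall rjags"
fi
echo "$jags_status"
```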
# LymphoSeq
Version: 1.8.0
## In both
* checking whether package ‘LymphoSeq’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LymphoSeq/new/LymphoSeq.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘LymphoSeq’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘LymphoSeq’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LymphoSeq/new/LymphoSeq.Rcheck/LymphoSeq’
```
### CRAN
```
* installing *source* package ‘LymphoSeq’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘LymphoSeq’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/LymphoSeq/old/LymphoSeq.Rcheck/LymphoSeq’
```
# madness
Version: 0.2.5
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Warning in madness(R) : no dimension given, turning val into a column
Warning in madness(R) : no dimension given, turning val into a column
Warning in madness(R) : no dimension given, turning val into a column
Warning in madness(R) : no dimension given, turning val into a column
Warning in madness(R) : no dimension given, turning val into a column
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'introducing_madness.tex' failed.
LaTeX errors:
! LaTeX Error: File `paralist.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.113 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# malariaAtlas
Version: 0.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘grid’
All declared Imports should be used.
```
# manifestoR
Version: 1.3.0
## In both
* checking R code for possible problems ... NOTE
```
mp_corpus: no visible binding for global variable ‘annotations’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/manifestoR/new/manifestoR.Rcheck/00_pkg_src/manifestoR/R/manifesto.R:456-457)
print.ManifestoAvailability: no visible binding for global variable
‘annotations’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/manifestoR/new/manifestoR.Rcheck/00_pkg_src/manifestoR/R/manifesto.R:371-374)
Undefined global functions or variables:
annotations
```
# manymodelr
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘Metrics’ ‘plyr’ ‘tidyr’
All declared Imports should be used.
```
# mapedit
Version: 0.4.3
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘geojsonio’
```
# mapview
Version: 2.6.3
## In both
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
extdata 1.0Mb
R 2.1Mb
```
# mason
Version: 0.2.6
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘pixiedust’
```
# matsbyname
Version: 0.4.10
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# mbgraphic
Version: 1.0.0
## In both
* checking whether package ‘mbgraphic’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/mbgraphic/new/mbgraphic.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘mbgraphic’ ...
** package ‘mbgraphic’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c cmasum.cpp -o cmasum.o
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -Wall -g -O2 -c mbgraphic_init.c -o mbgraphic_init.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c variableflip.cpp -o variableflip.o
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o mbgraphic.so RcppExports.o cmasum.o mbgraphic_init.o variableflip.o -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
installing to /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/mbgraphic/new/mbgraphic.Rcheck/mbgraphic/libs
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘mbgraphic’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/mbgraphic/new/mbgraphic.Rcheck/mbgraphic’
```
### CRAN
```
* installing *source* package ‘mbgraphic’ ...
** package ‘mbgraphic’ successfully unpacked and MD5 sums checked
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c cmasum.cpp -o cmasum.o
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -Wall -g -O2 -c mbgraphic_init.c -o mbgraphic_init.o
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c variableflip.cpp -o variableflip.o
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o mbgraphic.so RcppExports.o cmasum.o mbgraphic_init.o variableflip.o -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
installing to /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/mbgraphic/old/mbgraphic.Rcheck/mbgraphic/libs
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mbgraphic/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘mbgraphic’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/mbgraphic/old/mbgraphic.Rcheck/mbgraphic’
```
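
The `rJava` failures here (and for mleap below) share one cause: rJava was built against JDK 11.0.1, whose `libjvm.dylib` is no longer at the recorded path. A hedged diagnostic sketch (path copied from the `dlopen` error; the usual remedy is R's own `R CMD javareconf` followed by reinstalling rJava):

```
# Check whether the JVM library rJava was linked against still exists.
JVM_LIB="/Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib"
if [ -e "$JVM_LIB" ]; then
  rjava_status="libjvm found"
else
  # Typical fix: sudo R CMD javareconf, then reinstall rJava from source.
  rjava_status="libjvm missing - run R CMD javareconf and reinstall rJava"
fi
echo "$rjava_status"
```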
# MCbiclust
Version: 1.4.0
## In both
* checking package dependencies ... ERROR
```
Packages required but not available: ‘GO.db’ ‘org.Hs.eg.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# mdsr
Version: 0.1.6
## In both
* checking installed package size ... NOTE
```
installed size is 5.6Mb
sub-directories of 1Mb or more:
data 5.5Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 2694 marked UTF-8 strings
```
# memapp
Version: 2.12
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘DT’ ‘foreign’ ‘formattable’ ‘ggplot2’ ‘haven’ ‘mem’
‘openxlsx’ ‘plotly’ ‘RColorBrewer’ ‘readxl’ ‘RODBC’ ‘shinyBS’
‘shinydashboard’ ‘shinydashboardPlus’ ‘shinyjs’ ‘shinythemes’
‘stringi’ ‘stringr’ ‘tidyr’
All declared Imports should be used.
```
# metacoder
Version: 0.3.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggrepel’ ‘reshape’ ‘svglite’
All declared Imports should be used.
```
# MetaCyto
Version: 1.2.1
## In both
* checking R code for possible problems ... NOTE
```
...
collectData: no visible binding for global variable ‘value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/collectData.R:27)
panelSummary: no visible binding for global variable ‘antibodies’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/panelSummary.R:34)
panelSummary: no visible binding for global variable ‘value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/panelSummary.R:34)
plotGA: no visible binding for global variable ‘lower’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/plotGA.R:33-39)
plotGA: no visible binding for global variable ‘upper’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/plotGA.R:33-39)
searchCluster : <anonymous>: no visible binding for global variable
‘triS’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/searchCluster.R:102)
searchCluster : <anonymous>: no visible binding for global variable
‘triS’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/searchCluster.R:103)
searchCluster : <anonymous>: no visible binding for global variable
‘triS’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MetaCyto/new/MetaCyto.Rcheck/00_pkg_src/MetaCyto/R/searchCluster.R:104)
Undefined global functions or variables:
antibodies lower parameter_name triS upper value
```
# metagenomeFeatures
Version: 2.0.0
## In both
* checking examples ... ERROR
```
Running examples in ‘metagenomeFeatures-Ex.R’ failed
The error most likely occurred in:
> ### Name: MgDb-class
> ### Title: Metagenome Database class
> ### Aliases: MgDb-class mgdb
>
> ### ** Examples
>
> # example MgDb-class object, Greengenes 13.8 85% OTUs database.
> gg85 <- get_gg13.8_85MgDb()
Error in validObject(.Object) :
invalid class “MgDb” object: 1: invalid object for slot "taxa" in class "MgDb": got class "tbl_SQLiteConnection", should be or extend class "tbl_dbi"
invalid class “MgDb” object: 2: invalid object for slot "taxa" in class "MgDb": got class "tbl_dbi", should be or extend class "tbl_dbi"
invalid class “MgDb” object: 3: invalid object for slot "taxa" in class "MgDb": got class "tbl_sql", should be or extend class "tbl_dbi"
invalid class “MgDb” object: 4: invalid object for slot "taxa" in class "MgDb": got class "tbl_lazy", should be or extend class "tbl_dbi"
invalid class “MgDb” object: 5: invalid object for slot "taxa" in class "MgDb": got class "tbl", should be or extend class "tbl_dbi"
Calls: get_gg13.8_85MgDb ... newMgDb -> new -> initialize -> initialize -> validObject
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Error in validObject(.Object) :
invalid class "MgDb" object: 1: invalid object for slot "taxa" in class "MgDb": got class "tbl_SQLiteConnection", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 2: invalid object for slot "taxa" in class "MgDb": got class "tbl_dbi", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 3: invalid object for slot "taxa" in class "MgDb": got class "tbl_sql", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 4: invalid object for slot "taxa" in class "MgDb": got class "tbl_lazy", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 5: invalid object for slot "taxa" in class "MgDb": got class "tbl", should be or extend class "tbl_dbi"
Calls: test_check ... newMgDb -> new -> initialize -> initialize -> validObject
In addition: Warning messages:
1: replacing previous import 'lazyeval::is_formula' by 'purrr::is_formula' when loading 'metagenomeFeatures'
2: replacing previous import 'lazyeval::is_atomic' by 'purrr::is_atomic' when loading 'metagenomeFeatures'
Execution halted
```
* checking whether package ‘metagenomeFeatures’ can be installed ... WARNING
```
Found the following significant warnings:
Warning: subclass "QualityScaledDNAStringSet" of class "DNAStringSet" is not local and cannot be updated for new inheritance information; consider setClassUnion()
Warning: replacing previous import ‘lazyeval::is_formula’ by ‘purrr::is_formula’ when loading ‘metagenomeFeatures’
Warning: replacing previous import ‘lazyeval::is_atomic’ by ‘purrr::is_atomic’ when loading ‘metagenomeFeatures’
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/metagenomeFeatures/new/metagenomeFeatures.Rcheck/00install.out’ for details.
```
* checking for missing documentation entries ... WARNING
```
Undocumented S4 methods:
generic '[' and siglist 'mgFeatures'
All user-level objects in a package (including S4 classes and methods)
should have documentation entries.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
```
* checking re-building of vignette outputs ... WARNING
```
...
'browseVignettes()'. To cite Bioconductor, see
'citation("Biobase")', and for packages 'citation("pkgname")'.
── Attaching packages ────────────────────────────────── tidyverse 1.2.1 ──
✔ ggplot2 3.1.0 ✔ purrr 0.3.1
✔ tibble 2.0.1 ✔ dplyr 0.8.0.9006
✔ tidyr 0.8.3 ✔ stringr 1.4.0
✔ readr 1.3.1 ✔ forcats 0.4.0
── Conflicts ───────────────────────────────────── tidyverse_conflicts() ──
✖ dplyr::combine() masks Biobase::combine(), BiocGenerics::combine()
✖ dplyr::filter() masks stats::filter()
✖ dplyr::lag() masks stats::lag()
✖ ggplot2::Position() masks BiocGenerics::Position(), base::Position()
Quitting from lines 45-46 (database-explore.Rmd)
Error: processing vignette 'database-explore.Rmd' failed with diagnostics:
invalid class "MgDb" object: 1: invalid object for slot "taxa" in class "MgDb": got class "tbl_SQLiteConnection", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 2: invalid object for slot "taxa" in class "MgDb": got class "tbl_dbi", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 3: invalid object for slot "taxa" in class "MgDb": got class "tbl_sql", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 4: invalid object for slot "taxa" in class "MgDb": got class "tbl_lazy", should be or extend class "tbl_dbi"
invalid class "MgDb" object: 5: invalid object for slot "taxa" in class "MgDb": got class "tbl", should be or extend class "tbl_dbi"
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
extdata 3.5Mb
```
* checking R code for possible problems ... NOTE
```
.select: no visible binding for global variable ‘identifier’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/metagenomeFeatures/new/metagenomeFeatures.Rcheck/00_pkg_src/metagenomeFeatures/R/mgDb_method_select.R:96-97)
.select.taxa: no visible binding for global variable ‘Keys’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/metagenomeFeatures/new/metagenomeFeatures.Rcheck/00_pkg_src/metagenomeFeatures/R/mgDb_method_select.R:21)
.select.taxa: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/metagenomeFeatures/new/metagenomeFeatures.Rcheck/00_pkg_src/metagenomeFeatures/R/mgDb_method_select.R:21)
get_gg13.8_85MgDb: no visible binding for global variable ‘metadata’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/metagenomeFeatures/new/metagenomeFeatures.Rcheck/00_pkg_src/metagenomeFeatures/R/gg13.8_85MgDb.R:23-25)
Undefined global functions or variables:
. identifier Keys metadata
```
# MetaIntegrator
Version: 2.0.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.4Mb
sub-directories of 1Mb or more:
data 1.9Mb
doc 2.2Mb
R 1.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘DT’ ‘GEOmetadb’ ‘gplots’ ‘pheatmap’ ‘readr’ ‘RMySQL’ ‘RSQLite’
All declared Imports should be used.
```
# MetamapsDB
Version: 0.0.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘Matrix’ ‘shiny’
All declared Imports should be used.
```
# methyvim
Version: 1.2.0
## In both
* checking examples ... ERROR
```
Running examples in ‘methyvim-Ex.R’ failed
The error most likely occurred in:
> ### Name: methyheat
> ### Title: Heatmap for methytmle objects
> ### Aliases: methyheat
>
> ### ** Examples
>
> suppressMessages(library(SummarizedExperiment))
> library(methyvimData)
Error in library(methyvimData) :
there is no package called ‘methyvimData’
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
── 5. Error: (unknown) (@test-tmle_classic.R#5) ───────────────────────────────
there is no package called 'methyvimData'
1: library(methyvimData) at testthat/test-tmle_classic.R:5
2: stop(txt, domain = NA)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 15 SKIPPED: 0 FAILED: 5
1. Error: (unknown) (@test-cluster_sites.R#4)
2. Error: (unknown) (@test-methytmle_class.R#5)
3. Error: (unknown) (@test-methyvim.R#7)
4. Error: (unknown) (@test-screen_limma.R#4)
5. Error: (unknown) (@test-tmle_classic.R#5)
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 174-177 (using_methyvim.Rmd)
Error: processing vignette 'using_methyvim.Rmd' failed with diagnostics:
there is no package called 'methyvimData'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘minfiData’ ‘methyvimData’
```
# metR
Version: 0.2.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.4Mb
sub-directories of 1Mb or more:
data 1.1Mb
doc 1.5Mb
R 2.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘curl’
All declared Imports should be used.
```
# MIAmaxent
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘grDevices’
All declared Imports should be used.
```
# miceFast
Version: 0.2.3
## In both
* checking whether package ‘miceFast’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/miceFast/new/miceFast.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘miceFast’ ...
** package ‘miceFast’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/miceFast/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/miceFast/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c R_funs.cpp -o R_funs.o
clang: error: unsupported option '-fopenmp'
make: *** [R_funs.o] Error 1
ERROR: compilation failed for package ‘miceFast’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/miceFast/new/miceFast.Rcheck/miceFast’
```
### CRAN
```
* installing *source* package ‘miceFast’ ...
** package ‘miceFast’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/miceFast/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/miceFast/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c R_funs.cpp -o R_funs.o
clang: error: unsupported option '-fopenmp'
make: *** [R_funs.o] Error 1
ERROR: compilation failed for package ‘miceFast’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/miceFast/old/miceFast.Rcheck/miceFast’
```
# MlBayesOpt
Version: 0.3.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘data.table’ ‘foreach’
All declared Imports should be used.
```
# mlbgameday
Version: 0.1.4
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘doParallel’ ‘iterators’ ‘parallel’
All declared Imports should be used.
```
# mleap
Version: 0.1.3
## In both
* checking whether the package can be loaded ... ERROR
```
Loading this package had a fatal error status code 1
Loading log:
Error: package or namespace load failed for ‘mleap’:
.onLoad failed in loadNamespace() for 'mleap', details:
call: NULL
error: .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mleap/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mleap/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/mleap/rJava/libs/rJava.so
Reason: image not found
Execution halted
```
# MLZ
Version: 0.1.1
## In both
* checking installed package size ... NOTE
```
installed size is 14.3Mb
sub-directories of 1Mb or more:
libs 13.6Mb
```
# modelgrid
Version: 1.1.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggplot2’ ‘lattice’
All declared Imports should be used.
```
# modelr
Version: 0.1.4
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘rstanarm’
```
# momentuHMM
Version: 1.4.3
## In both
* checking re-building of vignette outputs ... WARNING
```
...
1: DM$angle = list(mean = ~state2(angleFormula(d, strength = w)),
2: concentration= ~1))
^
Warning in highr::hilight(x, format, prompt = options$prompt, markup = opts$markup) :
the syntax of the source code is invalid; the fallback mode is used
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'momentuHMM.tex' failed.
LaTeX errors:
! LaTeX Error: File `setspace.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.58 \usepackage
{natbib}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 6.6Mb
sub-directories of 1Mb or more:
data 1.2Mb
doc 1.8Mb
R 3.0Mb
```
# Momocs
Version: 1.2.9
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
R 3.1Mb
```
# MonetDBLite
Version: 0.6.0
## In both
* checking installed package size ... NOTE
```
installed size is 6.1Mb
sub-directories of 1Mb or more:
libs 5.4Mb
```
# monkeylearn
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘ratelimitr’
All declared Imports should be used.
```
# monocle
Version: 2.8.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'monocle-vignette.tex' failed.
LaTeX errors:
! LaTeX Error: File `xcolor.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.28 \RequirePackage
[a4paper,left=1.9cm,top=1.9cm,bottom=2.5cm,right=1.9cm,i...
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘biocViews’ ‘Rcpp’
All declared Imports should be used.
Missing or unexported object: ‘scater::newSCESet’
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:131)
get_next_node_id: no visible binding for global variable ‘next_node’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:132)
make_canonical: no visible global function definition for ‘nei’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:297)
make_canonical: no visible global function definition for ‘nei’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:298)
measure_diameter_path: no visible global function definition for ‘nei’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:470-481)
orderCells: no visible binding for '<<-' assignment to ‘next_node’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:1097)
plot_multiple_branches_pseudotime: no visible binding for global
variable ‘pseudocount’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/plotting.R:2740)
plot_multiple_branches_pseudotime: no visible binding for global
variable ‘Branch’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/plotting.R:2753)
project2MST: no visible global function definition for ‘nei’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/monocle/new/monocle.Rcheck/00_pkg_src/monocle/R/order_cells.R:1606)
Undefined global functions or variables:
Branch nei next_node pseudocount Size_Factor use_for_ordering
```
* checking files in ‘vignettes’ ... NOTE
```
The following directory looks like a leftover from 'knitr':
‘figure’
Please remove from your package.
```
# morse
Version: 3.2.2
## In both
* checking whether package ‘morse’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/morse/new/morse.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘morse’ ...
** package ‘morse’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/morse/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/morse/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/morse/rjags/libs/rjags.so
Reason: image not found
ERROR: lazy loading failed for package ‘morse’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/morse/new/morse.Rcheck/morse’
```
### CRAN
```
* installing *source* package ‘morse’ ...
** package ‘morse’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/morse/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/morse/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/morse/rjags/libs/rjags.so
Reason: image not found
ERROR: lazy loading failed for package ‘morse’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/morse/old/morse.Rcheck/morse’
```
# mosaic
Version: 1.5.0
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Using parallel package.
* Set seed with set.rseed().
* Disable this message with options(`mosaic:parallelMessage` = FALSE)
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'MinimalRgg.tex' failed.
LaTeX errors:
! LaTeX Error: File `xcolor.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.63 \usepackage
{hyperref}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘manipulate’
```
* checking installed package size ... NOTE
```
installed size is 6.7Mb
sub-directories of 1Mb or more:
doc 1.8Mb
R 4.3Mb
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘cubature’
```
# mosaicData
Version: 0.17.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 7 marked UTF-8 strings
```
# mosaicModel
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘caret’ ‘ggformula’ ‘knitr’ ‘MASS’ ‘testthat’ ‘tidyverse’
All declared Imports should be used.
```
# MPTmultiverse
Version: 0.1
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
3: getExportedValue(pkg, name)
4: asNamespace(ns)
5: getNamespace(ns)
6: tryCatch(loadNamespace(name), error = function(e) stop(e))
7: tryCatchList(expr, classes, parentenv, handlers)
8: tryCatchOne(expr, names, parentenv, handlers[[1L]])
9: value[[3L]](cond)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 0 SKIPPED: 3 FAILED: 2
1. Error: No-pooling approaches work (@test-mptinr.R#23)
2. Error: Complete-pooling approaches work (@test-mptinr.R#164)
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 57-80 (introduction-bayen_kuhlmann_2011.rmd)
Error: processing vignette 'introduction-bayen_kuhlmann_2011.rmd' failed with diagnostics:
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/MPTmultiverse/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/MPTmultiverse/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/MPTmultiverse/rjags/libs/rjags.so
Reason: image not found
Execution halted
```
# msigdbr
Version: 6.2.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
R 5.1Mb
```
# MSnID
Version: 1.14.0
## In both
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSnID/new/MSnID.Rcheck/00_pkg_src/MSnID/R/MSnID-methods.R:600)
infer_parsimonious_accessions,MSnID : infer_acc: no visible binding for
global variable ‘N’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSnID/new/MSnID.Rcheck/00_pkg_src/MSnID/R/MSnID-methods.R:600)
infer_parsimonious_accessions,MSnID : infer_acc: no visible binding for
global variable ‘accession’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSnID/new/MSnID.Rcheck/00_pkg_src/MSnID/R/MSnID-methods.R:601)
infer_parsimonious_accessions,MSnID : infer_acc: no visible binding for
global variable ‘pepSeq’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSnID/new/MSnID.Rcheck/00_pkg_src/MSnID/R/MSnID-methods.R:603)
recalibrate,MSnID: no visible global function definition for ‘median’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSnID/new/MSnID.Rcheck/00_pkg_src/MSnID/R/MSnID-methods.R:520)
recalibrate,MSnID: no visible global function definition for ‘density’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSnID/new/MSnID.Rcheck/00_pkg_src/MSnID/R/MSnID-methods.R:529)
Undefined global functions or variables:
accession DatabaseAccess DatabaseDescription DBseqLength density i
location mass median modification N name optim pepSeq quantile rnorm
spectrumID
Consider adding
importFrom("stats", "density", "median", "optim", "quantile", "rnorm")
to your NAMESPACE file.
```
* checking re-building of vignette outputs ... NOTE
```
...
The following object is masked from ‘package:base’:
trimws
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'msnid_vignette.tex' failed.
LaTeX errors:
! LaTeX Error: File `fancyhdr.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.171 \pagestyle
{fancy}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# MSstats
Version: 3.12.3
## In both
* checking R code for possible problems ... NOTE
```
...
SpectronauttoMSstatsFormat: no visible binding for global variable
‘missing.col’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSstats/new/MSstats.Rcheck/00_pkg_src/MSstats/R/SpectronauttoMSstatsFormat.R:46-47)
SpectronauttoMSstatsFormat: no visible binding for global variable
‘fea’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSstats/new/MSstats.Rcheck/00_pkg_src/MSstats/R/SpectronauttoMSstatsFormat.R:188)
SpectronauttoMSstatsFormat: no visible binding for global variable
‘Intensity’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSstats/new/MSstats.Rcheck/00_pkg_src/MSstats/R/SpectronauttoMSstatsFormat.R:188)
SpectronauttoMSstatsFormat: no visible binding for global variable
‘PeptideSequence’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSstats/new/MSstats.Rcheck/00_pkg_src/MSstats/R/SpectronauttoMSstatsFormat.R:214)
SpectronauttoMSstatsFormat: no visible binding for global variable
‘ProteinName’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/MSstats/new/MSstats.Rcheck/00_pkg_src/MSstats/R/SpectronauttoMSstatsFormat.R:214)
Undefined global functions or variables:
ABUNDANCE aggr_Fragment_Annotation aggr_Peak_Area analysis ciw
datafeature fea FEATURE FRACTION Intensity label LABEL logFC Mean
missing.col Name ncount ount PeptideSequence Protein Protein_number
ProteinName residual RUN Selected_fragments Selected_peptides shape
Train_size weight x y ymax ymin
```
# MSstatsQC
Version: 1.2.0
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘RforProteomics’
```
# multicolor
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘cowsay’
All declared Imports should be used.
```
# multistateutils
Version: 1.2.2
## In both
* checking examples ... ERROR
```
...
+ data=ebmt3,
+ trans=tmat,
+ keep=c('age', 'dissub'))
>
> # Fit parametric models
> models <- lapply(1:3, function(i) {
+ flexsurvreg(Surv(time, status) ~ age + dissub, data=long, dist='weibull')
+ })
>
> sim <- cohort_simulation(models, ebmt3, tmat)
*** caught illegal operation ***
address 0x11142fb50, cause 'illegal opcode'
Traceback:
1: desCpp(transitions, trans_mat, newdata_mat, start_times, start_states - 1, tcovs)
2: data.table::as.data.table(desCpp(transitions, trans_mat, newdata_mat, start_times, start_states - 1, tcovs))
3: run_sim(transition_list, attr_mat, trans_mat, tcovs, start_times, start_states)
4: state_occupancy(models, trans_mat, newdata, tcovs, start_time, start_state, ci, M, agelimit, agecol, agescale)
5: cohort_simulation(models, ebmt3, tmat)
An irrecoverable exception occurred. R is aborting now ...
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
26: tryCatchList(expr, classes, parentenv, handlers)
27: tryCatch(withCallingHandlers({ eval(code, test_env) if (!handled && !is.null(test)) { skip_empty() }}, expectation = handle_expectation, skip = handle_skip, warning = handle_warning, message = handle_message, error = handle_error), error = handle_fatal, skip = function(e) { })
28: test_code(NULL, exprs, env)
29: source_file(path, new.env(parent = env), chdir = TRUE, wrap = wrap)
30: force(code)
31: with_reporter(reporter = reporter, start_end_reporter = start_end_reporter, { lister$start_file(basename(path)) source_file(path, new.env(parent = env), chdir = TRUE, wrap = wrap) end_context() })
32: FUN(X[[i]], ...)
33: lapply(paths, test_file, env = env, reporter = current_reporter, start_end_reporter = FALSE, load_helpers = FALSE, wrap = wrap)
34: force(code)
35: with_reporter(reporter = current_reporter, results <- lapply(paths, test_file, env = env, reporter = current_reporter, start_end_reporter = FALSE, load_helpers = FALSE, wrap = wrap))
36: test_files(paths, reporter = reporter, env = env, stop_on_failure = stop_on_failure, stop_on_warning = stop_on_warning, wrap = wrap)
37: test_dir(path = test_path, reporter = reporter, env = env, filter = filter, ..., stop_on_failure = stop_on_failure, stop_on_warning = stop_on_warning, wrap = wrap)
38: test_package_dir(package = package, test_path = test_path, filter = filter, reporter = reporter, ..., stop_on_failure = stop_on_failure, stop_on_warning = stop_on_warning, wrap = wrap)
39: test_check("multistateutils")
An irrecoverable exception occurred. R is aborting now ...
```
* checking re-building of vignette outputs ... WARNING
```
...
11: timing_fn(handle(ev <- withCallingHandlers(withVisible(eval(expr, envir, enclos)), warning = wHandler, error = eHandler, message = mHandler)))
12: evaluate_call(expr, parsed$src[[i]], envir = envir, enclos = enclos, debug = debug, last = i == length(out), use_try = stop_on_error != 2L, keep_warning = keep_warning, keep_message = keep_message, output_handler = output_handler, include_timing = include_timing)
13: evaluate::evaluate(...)
14: evaluate(code, envir = env, new_device = FALSE, keep_warning = !isFALSE(options$warning), keep_message = !isFALSE(options$message), stop_on_error = if (options$error && options$include) 0L else 2L, output_handler = knit_handlers(options$render, options))
15: in_dir(input_dir(), evaluate(code, envir = env, new_device = FALSE, keep_warning = !isFALSE(options$warning), keep_message = !isFALSE(options$message), stop_on_error = if (options$error && options$include) 0L else 2L, output_handler = knit_handlers(options$render, options)))
16: block_exec(params)
17: call_block(x)
18: process_group.block(group)
19: process_group(group)
20: withCallingHandlers(if (tangle) process_tangle(group) else process_group(group), error = function(e) { setwd(wd) cat(res, sep = "\n", file = output %n% "") message("Quitting from lines ", paste(current_lines(i), collapse = "-"), " (", knit_concord$get("infile"), ") ") })
21: process_file(text, output)
22: knitr::knit(knit_input, knit_output, envir = envir, quiet = quiet, encoding = encoding)
23: rmarkdown::render(file, encoding = encoding, quiet = quiet, envir = globalenv(), ...)
24: vweave_rmarkdown(...)
25: engine$weave(file, quiet = quiet, encoding = enc)
26: doTryCatch(return(expr), name, parentenv, handler)
27: tryCatchOne(expr, names, parentenv, handlers[[1L]])
28: tryCatchList(expr, classes, parentenv, handlers)
29: tryCatch({ engine$weave(file, quiet = quiet, encoding = enc) setwd(startdir) find_vignette_product(name, by = "weave", engine = engine)}, error = function(e) { stop(gettextf("processing vignette '%s' failed with diagnostics:\n%s", file, conditionMessage(e)), domain = NA, call. = FALSE)})
30: buildVignettes(dir = "/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/multistateutils/new/multistateutils.Rcheck/vign_test/multistateutils")
An irrecoverable exception occurred. R is aborting now ...
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘webshot’
All declared Imports should be used.
```
# MXM
Version: 1.4.2
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'article.ltx' failed with diagnostics:
Running 'texi2dvi' on 'article.ltx' failed.
LaTeX errors:
! LaTeX Error: File `algorithm.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.11 \usepackage
{algpseudocode}^^M
! ==> Fatal error occurred, no output PDF file produced!
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 12.0Mb
sub-directories of 1Mb or more:
doc 1.3Mb
R 10.1Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘knitr’
All declared Imports should be used.
```
# myTAI
Version: 0.9.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.7Mb
sub-directories of 1Mb or more:
data 2.0Mb
doc 2.4Mb
```
# nandb
Version: 2.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘MASS’ ‘stats’
All declared Imports should be used.
```
# ncappc
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘bookdown’
All declared Imports should be used.
```
# NestedCategBayesImpute
Version: 1.2.1
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# neuropsychology
Version: 0.5.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘htmlTable’ ‘lme4’ ‘stringi’
All declared Imports should be used.
```
# newsanchor
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘devtools’ ‘xml2’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 318 marked UTF-8 strings
```
# NFP
Version: 0.99.2
## In both
* checking re-building of vignette outputs ... WARNING
```
...
pmin, pmin.int, Position, rank, rbind, Reduce, rowMeans, rownames,
rowSums, sapply, setdiff, sort, table, tapply, union, unique,
unsplit, which, which.max, which.min
Loading required package: graphite
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'NFP.tex' failed.
LaTeX errors:
! LaTeX Error: File `fancyhdr.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.171 \pagestyle
{fancy}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘NFPdata’
```
* checking installed package size ... NOTE
```
installed size is 8.6Mb
sub-directories of 1Mb or more:
data 8.1Mb
```
# nlmixr
Version: 1.0.0-7
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
libs 1.0Mb
R 3.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘numDeriv’ ‘PreciseSums’
All declared Imports should be used.
```
# noaastormevents
Version: 0.1.0
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘hurricaneexposuredata’
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘choroplethr’ ‘choroplethrMaps’ ‘data.table’ ‘forcats’
‘hurricaneexposure’ ‘plyr’ ‘RColorBrewer’ ‘XML’
All declared Imports should be used.
```
# nonet
Version: 0.4.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘e1071’ ‘pROC’ ‘purrr’ ‘randomForest’ ‘rlang’
All declared Imports should be used.
```
# nos
Version: 1.1.0
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘bipartite’
```
# nucleR
Version: 2.12.1
## In both
* checking re-building of vignette outputs ... WARNING
```
...
fmtutil [INFO]: Total formats: 15
fmtutil [INFO]: exiting with status 0
tlmgr install fancyhdr
TeX Live 2018 is frozen forever and will no
longer be updated. This happens in preparation for a new release.
If you're interested in helping to pretest the new release (when
pretests are available), please read http://tug.org/texlive/pretest.html.
Otherwise, just wait, and the new release will be ready in due time.
tlmgr: Fundamental package texlive.infra not present, uh oh, goodbyeShould not happen, texlive.infra not found at /usr/local/bin/tlmgr line 7344.
tlmgr: package repository http://mirrors.standaloneinstaller.com/ctan/systems/texlive/tlnet (not verified: gpg unavailable)
tlmgr path add
! LaTeX Error: File `fancyhdr.sty' not found.
! Emergency stop.
<read *>
Error: processing vignette 'nucleR.Rmd' failed with diagnostics:
Failed to compile nucleR.tex. See nucleR.log for more info.
Execution halted
```
# nullabor
Version: 0.3.5
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘forecast’ ‘rlang’ ‘tidyverse’ ‘tsibble’
All declared Imports should be used.
```
# nycflights13
Version: 1.0.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.1Mb
sub-directories of 1Mb or more:
data 7.0Mb
```
# nzelect
Version: 0.4.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.4Mb
sub-directories of 1Mb or more:
data 5.0Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 6409 marked UTF-8 strings
```
# observer
Version: 0.1.2
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘ensurer’
```
# oec
Version: 2.7.8
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘readr’
All declared Imports should be used.
```
# OncoSimulR
Version: 2.10.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.3Mb
sub-directories of 1Mb or more:
doc 5.4Mb
```
# openair
Version: 2.6-1
## In both
* checking installed package size ... NOTE
```
installed size is 6.0Mb
sub-directories of 1Mb or more:
R 4.0Mb
```
# opendotaR
Version: 0.1.4
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# openPrimeR
Version: 1.2.0
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking installed package size ... NOTE
```
installed size is 15.5Mb
sub-directories of 1Mb or more:
extdata 10.2Mb
R 4.1Mb
```
# Organism.dplyr
Version: 1.8.1
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
}) at testthat/test-src_organism-select.R:3
2: withCallingHandlers(expr, packageStartupMessage = function(c) invokeRestart("muffleMessage"))
3: library(TxDb.Hsapiens.UCSC.hg38.knownGene) at testthat/test-src_organism-select.R:4
4: stop(txt, domain = NA)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 32 SKIPPED: 0 FAILED: 3
1. Error: (unknown) (@test-GenomicFeatures-extractors.R#3)
2. Error: mouse (@test-src_organism-class.R#54)
3. Error: (unknown) (@test-src_organism-select.R#3)
Error: testthat unit tests failed
In addition: Warning message:
call dbDisconnect() when finished working with a connection
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘org.Hs.eg.db’ ‘TxDb.Hsapiens.UCSC.hg38.knownGene’ ‘org.Mm.eg.db’
‘TxDb.Mmusculus.UCSC.mm10.ensGene’
```
* checking dependencies in R code ... NOTE
```
Unexported objects imported by ':::' calls:
‘AnnotationDbi:::smartKeys’ ‘GenomicFeatures:::.exons_with_3utr’
‘GenomicFeatures:::.exons_with_5utr’
‘GenomicFeatures:::get_TxDb_seqinfo0’
‘S4Vectors:::extract_data_frame_rows’
See the note in ?`:::` about the use of this operator.
```
* checking R code for possible problems ... NOTE
```
.toGRanges: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Organism.dplyr/new/Organism.dplyr.Rcheck/00_pkg_src/Organism.dplyr/R/extractors.R:236)
intronsByTranscript,src_organism: no visible binding for global
variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Organism.dplyr/new/Organism.dplyr.Rcheck/00_pkg_src/Organism.dplyr/R/extractor-methods.R:254-255)
intronsByTranscript,src_organism: no visible binding for global
variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Organism.dplyr/new/Organism.dplyr.Rcheck/00_pkg_src/Organism.dplyr/R/extractor-methods.R:264-265)
orgPackageName,src_organism: no visible binding for global variable
‘name’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Organism.dplyr/new/Organism.dplyr.Rcheck/00_pkg_src/Organism.dplyr/R/src.R:432-433)
orgPackageName,src_organism: no visible binding for global variable
‘organism’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Organism.dplyr/new/Organism.dplyr.Rcheck/00_pkg_src/Organism.dplyr/R/src.R:434)
orgPackageName,src_organism: no visible binding for global variable
‘OrgDb’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Organism.dplyr/new/Organism.dplyr.Rcheck/00_pkg_src/Organism.dplyr/R/src.R:434)
Undefined global functions or variables:
. name organism OrgDb
```
# PakPC2017
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘stats’
All declared Imports should be used.
```
# parlitools
Version: 0.3.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 13 marked UTF-8 strings
```
# parsemsf
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dbplyr’
All declared Imports should be used.
```
# particles
Version: 0.2.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# PathwaySplice
Version: 1.4.0
## In both
* checking package dependencies ... ERROR
```
Packages required but not available: ‘org.Hs.eg.db’ ‘org.Mm.eg.db’ ‘GO.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# patternplot
Version: 0.2.1
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# PAutilities
Version: 0.1.2
## In both
* checking examples ... ERROR
```
Running examples in ‘PAutilities-Ex.R’ failed
The error most likely occurred in:
> ### Name: get_transition_info
> ### Title: Convert a set of predicted and actual activity transitions to an
> ### object that can be analyzed
> ### Aliases: get_transition_info
>
> ### ** Examples
>
> predictions <- sample(c(0,1), 100, TRUE, c(3, 1))
> references <- sample(c(0,1), 100, TRUE, c(4,1))
> get_transition_info(predictions, references, 10)
Error: .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/PAutilities/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/PAutilities/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/PAutilities/rJava/libs/rJava.so
Reason: image not found
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
3: matchingMarkets::hri at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/PAutilities/new/PAutilities.Rcheck/00_pkg_src/PAutilities/R/get_matchings.R:15
4: getExportedValue(pkg, name)
5: asNamespace(ns)
6: getNamespace(ns)
7: tryCatch(loadNamespace(name), error = function(e) stop(e))
8: tryCatchList(expr, classes, parentenv, handlers)
9: tryCatchOne(expr, names, parentenv, handlers[[1L]])
10: value[[3L]](cond)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 5 SKIPPED: 0 FAILED: 1
1. Error: Transition analyses produce expected output (@test_transitions.R#31)
Error: testthat unit tests failed
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘AGread’
All declared Imports should be used.
```
# PCRedux
Version: 0.2.6-4
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘caret’
All declared Imports should be used.
```
# pdp
Version: 0.7.0
## In both
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘mlbench’, ‘ICEbox’
```
# petro.One
Version: 0.2.3
## In both
* checking whether package ‘petro.One’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/petro.One/new/petro.One.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘petro.One’ ...
** package ‘petro.One’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/petro.One/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/petro.One/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/petro.One/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘petro.One’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/petro.One/new/petro.One.Rcheck/petro.One’
```
### CRAN
```
* installing *source* package ‘petro.One’ ...
** package ‘petro.One’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/petro.One/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/petro.One/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/petro.One/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘petro.One’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/petro.One/old/petro.One.Rcheck/petro.One’
```
# phase1PRMD
Version: 1.0.1
## In both
* checking whether package ‘phase1PRMD’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/phase1PRMD/new/phase1PRMD.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘phase1PRMD’ ...
** package ‘phase1PRMD’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phase1PRMD/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phase1PRMD/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phase1PRMD/rjags/libs/rjags.so
Reason: image not found
ERROR: lazy loading failed for package ‘phase1PRMD’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/phase1PRMD/new/phase1PRMD.Rcheck/phase1PRMD’
```
### CRAN
```
* installing *source* package ‘phase1PRMD’ ...
** package ‘phase1PRMD’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phase1PRMD/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phase1PRMD/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phase1PRMD/rjags/libs/rjags.so
Reason: image not found
ERROR: lazy loading failed for package ‘phase1PRMD’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/phase1PRMD/old/phase1PRMD.Rcheck/phase1PRMD’
```
# phenofit
Version: 0.2.0
## In both
* checking whether package ‘phenofit’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/phenofit/new/phenofit.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘phenofit’ ...
** package ‘phenofit’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phenofit/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phenofit/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘phenofit’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/phenofit/new/phenofit.Rcheck/phenofit’
```
### CRAN
```
* installing *source* package ‘phenofit’ ...
** package ‘phenofit’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phenofit/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/phenofit/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘phenofit’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/phenofit/old/phenofit.Rcheck/phenofit’
```
# phenopath
Version: 1.4.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 64-72 (introduction_to_phenopath.Rmd)
Error: processing vignette 'introduction_to_phenopath.Rmd' failed with diagnostics:
Columns 1, 2, 3, 4, 5, … (and 3 more) must be named.
Use .name_repair to specify repair.
Execution halted
```
# philr
Version: 1.6.0
## In both
* checking whether package ‘philr’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/philr/new/philr.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘philr’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘philr’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/philr/new/philr.Rcheck/philr’
```
### CRAN
```
* installing *source* package ‘philr’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : object ‘as_data_frame’ is not exported by 'namespace:tidytree'
ERROR: lazy loading failed for package ‘philr’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/philr/old/philr.Rcheck/philr’
```
# pitchRx
Version: 1.8.2
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘ggsubplot’
```
# pivot
Version: 18.4.17
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘odbc’
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘colorspace’ ‘lubridate’
All declared Imports should be used.
```
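This NOTE — which recurs for many packages below — means the package lists a dependency in the `Imports:` field of DESCRIPTION but never references it via `::` in code or an `importFrom()` directive in NAMESPACE. The fix is either to drop the unused entry or to import something from it explicitly; a sketch with illustrative field contents:

```
# DESCRIPTION — keep only packages the code actually uses
Imports:
    dplyr,
    rlang

# NAMESPACE — or make the usage visible to R CMD check
importFrom(lubridate, ymd)
```

Either change silences the NOTE; the first also trims the package's install-time dependency footprint.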
# pivottabler
Version: 1.1.0
## In both
* checking installed package size ... NOTE
```
installed size is 8.2Mb
sub-directories of 1Mb or more:
data 2.0Mb
doc 3.0Mb
R 3.1Mb
```
# pkggraph
Version: 0.2.3
## In both
* checking installed package size ... NOTE
```
installed size is 5.0Mb
sub-directories of 1Mb or more:
doc 4.3Mb
```
# PkgsFromFiles
Version: 0.5
## In both
* checking examples ... ERROR
```
Running examples in ‘PkgsFromFiles-Ex.R’ failed
The error most likely occurred in:
> ### Name: pff_check_install_pkgs
> ### Title: Checks and installs a single package
> ### Aliases: pff_check_install_pkgs
>
> ### ** Examples
>
> pff_check_install_pkgs('dplyr')
Installing dplyrError in pkg.in %in% my.available.packages :
argument "my.available.packages" is missing, with no default
Calls: pff_check_install_pkgs -> %in%
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘curl’ ‘readr’ ‘stringdist’ ‘XML’
All declared Imports should be used.
```
# PKPDmisc
Version: 2.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘purrr’
All declared Imports should be used.
```
# plethem
Version: 0.1.7
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘devtools’ ‘formatR’ ‘gdata’ ‘rhandsontable’ ‘shinythemes’ ‘sqldf’
‘V8’
All declared Imports should be used.
```
# plotly
Version: 4.8.0
## In both
* checking installed package size ... NOTE
```
installed size is 7.1Mb
sub-directories of 1Mb or more:
htmlwidgets 3.1Mb
R 2.3Mb
```
# plotrr
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘stats’
All declared Imports should be used.
```
# plyranges
Version: 1.0.3
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
e$handled <- TRUE
test_error <<- e
}, "could not find function \"WIGFile\"", quote(WIGFile(test_wig))) at testthat/test-io-wig.R:24
2: eval(code, test_env)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 271 SKIPPED: 0 FAILED: 5
1. Error: read_bed returns correct GRanges (@test-io-bed.R#67)
2. Error: read_bed_graph returns correct GRanges (@test-io-bedGraph.R#39)
3. Error: reading/ writing bigwig files returns correct GRanges (@test-io-bw.R#19)
4. Error: reading GFF files returns correct GRanges (@test-io-gff.R#87)
5. Error: reading WIG files (@test-io-wig.R#24)
Error: testthat unit tests failed
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘BSgenome.Hsapiens.UCSC.hg19’
```
# pmc
Version: 1.0.3
## In both
* R CMD check timed out
# pmpp
Version: 0.1.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
tests/testthat/test-summarise.r:1068:28: style: Commas should always have a space after.
tests/testthat/test-summarise.r:1103:48: style: Commas should never have a space before.
tests/testthat/test-summarise.r:1104:48: style: Commas should never have a space before.
tests/testthat/test-summarise.r:1128:24: style: Commas should never have a space before.
tests/testthat/test-summarise.r:1130:24: style: Commas should never have a space before.
tests/testthat/test-summarise.r:1131:24: style: Commas should never have a space before.
tests/testthat/test-utils.R:6:1: style: lines should not be more than 120 characters.
tests/testthat/test-utils.R:15:53: style: Commas should always have a space after.
══ testthat results ═══════════════════════════════════════════════════════════
OK: 64 SKIPPED: 0 FAILED: 1
1. Failure: Package Style (@test-lintr.R#32)
Error: testthat unit tests failed
Execution halted
```
# PogromcyDanych
Version: 1.5
## In both
* checking PDF version of manual ... WARNING
```
...
...
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
```
* checking installed package size ... NOTE
```
installed size is 7.2Mb
sub-directories of 1Mb or more:
data 7.0Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 7256 marked UTF-8 strings
```
# poio
Version: 0.0-3
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 8 marked UTF-8 strings
```
# politicaldata
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggplot2’ ‘tidyr’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 9 marked UTF-8 strings
```
# PopED
Version: 0.4.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tidyr’
All declared Imports should be used.
```
# poppr
Version: 2.8.1
## In both
* checking whether package ‘poppr’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/poppr/new/poppr.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘poppr’ ...
** package ‘poppr’ successfully unpacked and MD5 sums checked
** libs
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c adjust_missing.c -o adjust_missing.o
clang: error: unsupported option '-fopenmp'
make: *** [adjust_missing.o] Error 1
ERROR: compilation failed for package ‘poppr’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/poppr/new/poppr.Rcheck/poppr’
```
### CRAN
```
* installing *source* package ‘poppr’ ...
** package ‘poppr’ successfully unpacked and MD5 sums checked
** libs
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c adjust_missing.c -o adjust_missing.o
clang: error: unsupported option '-fopenmp'
make: *** [adjust_missing.o] Error 1
ERROR: compilation failed for package ‘poppr’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/poppr/old/poppr.Rcheck/poppr’
```
# predict3d
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘TH.data’
All declared Imports should be used.
```
# prisonbrief
Version: 0.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 2 marked UTF-8 strings
```
# processanimateR
Version: 1.0.0
## In both
* checking installed package size ... NOTE
```
installed size is 11.2Mb
sub-directories of 1Mb or more:
doc 6.5Mb
help 2.1Mb
htmlwidgets 2.5Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘zoo’
All declared Imports should be used.
```
# processR
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘jtools’ ‘modelr’ ‘prediction’ ‘rlang’ ‘TH.data’ ‘tidyr’
All declared Imports should be used.
```
# progeny
Version: 1.2.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 42-60 (progeny.Rmd)
Error: processing vignette 'progeny.Rmd' failed with diagnostics:
there is no package called 'airway'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘airway’
```
# pRoloc
Version: 1.20.2
## In both
* checking examples ... ERROR
```
...
> ### Title: Class '"ClustDist"'
> ### Aliases: ClustDist class:ClustDist ClustDist-class
> ### plot,ClustDist,MSnSet-method show,ClustDist-method
> ### Keywords: classes
>
> ### ** Examples
>
> showClass("ClustDist")
Class "ClustDist" [package "pRoloc"]
Slots:
Name: k dist term id nrow clustsz
Class: numeric list character character numeric list
Name: components fcol
Class: vector character
>
> library('pRolocdata')
Error in library("pRolocdata") : there is no package called ‘pRolocdata’
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
This is pRoloc version 1.20.2
Visit https://lgatto.github.io/pRoloc/ to get started.
Warning messages:
1: In fun(libname, pkgname) :
mzR has been built against a different Rcpp version (0.12.16)
than is installed on your system (1.0.0). This might lead to errors
when loading mzR. If you encounter such issues, please send a report,
including the output of sessionInfo() to the Bioc support forum at
https://support.bioconductor.org/. For details see also
https://github.com/sneumann/mzR/wiki/mzR-Rcpp-compiler-linker-issue.
2: replacing previous import 'BiocGenerics::var' by 'stats::var' when loading 'MLInterfaces'
> library("pRolocdata")
Error in library("pRolocdata") : there is no package called 'pRolocdata'
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
...
The following object is masked from 'package:tools':
toHTML
Attaching package: 'annotate'
The following object is masked from 'package:mzR':
nChrom
Loading required package: cluster
Warning: replacing previous import 'BiocGenerics::var' by 'stats::var' when loading 'MLInterfaces'
This is pRoloc version 1.20.2
Visit https://lgatto.github.io/pRoloc/ to get started.
Quitting from lines 87-93 (pRoloc-goannotations.Rmd)
Error: processing vignette 'pRoloc-goannotations.Rmd' failed with diagnostics:
there is no package called 'pRolocdata'
Execution halted
```
* checking PDF version of manual ... WARNING
```
...
LaTeX errors found:
! Please use \mathaccent for accents in math mode.
\add@accent ...@spacefactor \spacefactor }\accent
#1 #2\egroup \spacefactor ...
l.931 ...{}2}{} protein correlations.}{empPvalues}
! Missing { inserted.
<to be read again>
\egroup
l.931 ...{}2}{} protein correlations.}{empPvalues}
! You can't use `\spacefactor' in math mode.
\add@accent ...}\accent #1 #2\egroup \spacefactor
\accent@spacefactor
l.931 ...{}2}{} protein correlations.}{empPvalues}
! Missing } inserted.
<inserted text>
}
l.931 ...{}2}{} protein correlations.}{empPvalues}
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking: ‘pRolocdata’ ‘GO.db’
```
* checking installed package size ... NOTE
```
installed size is 14.0Mb
sub-directories of 1Mb or more:
doc 10.6Mb
R 2.1Mb
```
* checking dependencies in R code ... NOTE
```
Unexported objects imported by ':::' calls:
‘caret:::predict.plsda’ ‘MLInterfaces:::.macroF1’
‘MLInterfaces:::.precision’ ‘MLInterfaces:::.recall’
‘MLInterfaces:::es2df’
See the note in ?`:::` about the use of this operator.
There are ::: calls to the package's namespace in its code. A package
almost never needs to use ::: for its own objects:
‘opt’
```
* checking R code for possible problems ... NOTE
```
Found the following possibly unsafe calls:
File ‘pRoloc/R/annotation.R’:
unlockBinding("params", .pRolocEnv)
```
# pRolocGUI
Version: 1.14.0
## In both
* checking examples ... ERROR
```
Running examples in ‘pRolocGUI-Ex.R’ failed
The error most likely occurred in:
> ### Name: pRolocVis
> ### Title: Interactive visualisation of spatial proteomics data
> ### Aliases: pRolocVis pRolocVis_aggregate pRolocVis_classify
> ### pRolocVis_compare pRolocVis_pca
>
> ### ** Examples
>
> library("pRoloc")
> library("pRolocdata")
Error in library("pRolocdata") : there is no package called ‘pRolocdata’
Execution halted
```
* checking whether package ‘pRolocGUI’ can be installed ... WARNING
```
Found the following significant warnings:
Warning: namespace ‘dimRed’ is not available and has been replaced
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/pRolocGUI/new/pRolocGUI.Rcheck/00install.out’ for details.
```
* checking re-building of vignette outputs ... WARNING
```
...
Loading required package: cluster
Warning: replacing previous import 'BiocGenerics::var' by 'stats::var' when loading 'MLInterfaces'
Warning: namespace 'dimRed' is not available and has been replaced
by .GlobalEnv when processing object ''
Warning: namespace 'dimRed' is not available and has been replaced
by .GlobalEnv when processing object ''
Warning: namespace 'dimRed' is not available and has been replaced
by .GlobalEnv when processing object ''
Warning: namespace 'dimRed' is not available and has been replaced
by .GlobalEnv when processing object ''
This is pRoloc version 1.20.2
Visit https://lgatto.github.io/pRoloc/ to get started.
This is pRolocGUI version 1.14.0
Quitting from lines 77-79 (pRolocGUI.Rmd)
Error: processing vignette 'pRolocGUI.Rmd' failed with diagnostics:
there is no package called 'pRolocdata'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘pRolocdata’
```
* checking DESCRIPTION meta-information ... NOTE
```
Authors@R field gives more than one person with maintainer role:
Lisa Breckels <lms79@cam.ac.uk> [aut, cre]
Laurent Gatto <lg390@cam.ac.uk> [aut, cre]
```
* checking dependencies in R code ... NOTE
```
Unexported object imported by a ':::' call: ‘pRoloc:::remap’
See the note in ?`:::` about the use of this operator.
```
# proteoQC
Version: 1.16.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'proteoQC.Rmd' failed with diagnostics:
there is no package called ‘prettydoc’
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘RforProteomics’
```
* checking installed package size ... NOTE
```
installed size is 7.7Mb
sub-directories of 1Mb or more:
doc 2.5Mb
extdata 3.9Mb
```
* checking R code for possible problems ... NOTE
```
...
qcHist: no visible binding for global variable ‘techRep’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:406-416)
qcHist: no visible binding for global variable ‘bioRep’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:406-416)
qcHist2: no visible binding for global variable ‘error’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:357-365)
qcHist2: no visible binding for global variable ‘fractile’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:357-365)
qcHist2: no visible binding for global variable ‘fractile’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:367-369)
qcHist2: no visible binding for global variable ‘error’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:377-385)
qcHist2: no visible binding for global variable ‘fractile’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:377-385)
qcHist2: no visible binding for global variable ‘fractile’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/proteoQC/new/proteoQC.Rcheck/00_pkg_src/proteoQC/R/visualization.R:389-391)
Undefined global functions or variables:
..count.. bioRep curenv delta error exprs fractile fraction grid.draw
Intensity iTRAQ4 iTRAQ8 label MS1QC MS2QC peplength peptide_summary
precursorCharge quantify ratio readMgfData se Tag techRep TMT10 TMT6
V1 V2 V3 V4 V5 val x y
```
# proustr
Version: 0.4.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 12717 marked UTF-8 strings
```
# provSummarizeR
Version: 1.0
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘rdt’
```
# prozor
Version: 0.2.11
## In both
* checking installed package size ... NOTE
```
installed size is 5.7Mb
sub-directories of 1Mb or more:
data 1.7Mb
extdata 2.3Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘readr’
All declared Imports should be used.
```
# psichomics
Version: 1.6.2
## In both
* checking installed package size ... NOTE
```
installed size is 9.7Mb
sub-directories of 1Mb or more:
doc 5.6Mb
R 3.0Mb
```
* checking compiled code ... NOTE
```
File ‘psichomics/libs/psichomics.so’:
Found ‘___stdoutp’, possibly from ‘stdout’ (C)
Object: ‘psiFastCalc.o’
Found ‘_printf’, possibly from ‘printf’ (C)
Object: ‘psiFastCalc.o’
Found ‘_putchar’, possibly from ‘putchar’ (C)
Object: ‘psiFastCalc.o’
Compiled code should not call entry points which might terminate R nor
write to stdout/stderr instead of to the console, nor use Fortran I/O
nor system RNGs.
See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
```
# PSLM2015
Version: 0.2.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.0Mb
sub-directories of 1Mb or more:
data 4.9Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 26 marked Latin-1 strings
```
# psychmeta
Version: 2.3.2
## In both
* checking installed package size ... NOTE
```
installed size is 8.9Mb
sub-directories of 1Mb or more:
R 7.1Mb
```
# psycho
Version: 0.4.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.6Mb
sub-directories of 1Mb or more:
doc 4.3Mb
R 1.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘methods’
All declared Imports should be used.
```
# ptstem
Version: 0.0.4
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
dict 5.1Mb
```
# purrrlyr
Version: 0.0.4
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
# pysd2r
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘knitr’
All declared Imports should be used.
```
# qdap
Version: 2.3.2
## In both
* checking whether package ‘qdap’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/qdap/new/qdap.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘qdap’ ...
** package ‘qdap’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/qdap/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/qdap/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/qdap/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘qdap’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/qdap/new/qdap.Rcheck/qdap’
```
### CRAN
```
* installing *source* package ‘qdap’ ...
** package ‘qdap’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/qdap/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/qdap/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/qdap/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘qdap’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/qdap/old/qdap.Rcheck/qdap’
```
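The qdap failure above (and the identical Rdrools failure below) has a single root cause: rJava in this library was built against jdk-11.0.1, which is no longer installed, so `libjvm.dylib` cannot be loaded. The usual remedy — outside the scope of this report, and assuming a working JDK is present — is to re-run R's Java configuration against the current JDK and reinstall rJava from source:

```sh
# Re-detect the active JDK and rebuild R's Java configuration
R CMD javareconf

# Reinstall rJava from source so it links against the detected JVM
Rscript -e 'install.packages("rJava", type = "source")'
```

After that, the lazy-loading step in the qdap/Rdrools installs should succeed, since it only fails while loading rJava.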
# qqplotr
Version: 0.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘knitr’ ‘purrr’ ‘rmarkdown’
All declared Imports should be used.
```
# quanteda
Version: 1.4.1
## In both
* checking PDF version of manual ... WARNING
```
LaTeX errors when creating PDF version.
This typically indicates Rd problems.
LaTeX errors found:
! Please use \mathaccent for accents in math mode.
\add@accent ...@spacefactor \spacefactor }\accent
#1 #2\egroup \spacefactor ...
l.6264 ...mes 100 \times \frac{n_{conj}}{n_{w}}}{}
! You can't use `\spacefactor' in display math mode.
\add@accent ...}\accent #1 #2\egroup \spacefactor
\accent@spacefactor
l.6264 ...mes 100 \times \frac{n_{conj}}{n_{w}}}{}
```
* checking installed package size ... NOTE
```
installed size is 6.6Mb
sub-directories of 1Mb or more:
data 1.3Mb
libs 1.3Mb
R 3.0Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 71 marked UTF-8 strings
```
# QuaternaryProd
Version: 1.14.0
## In both
* checking installed package size ... NOTE
```
installed size is 16.9Mb
sub-directories of 1Mb or more:
extdata 16.2Mb
```
# questionr
Version: 0.7.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 4145 marked UTF-8 strings
```
# quickReg
Version: 1.5.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘psych’
All declared Imports should be used.
```
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘PredictABEL’
```
# quokar
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gridExtra’ ‘knitr’ ‘MCMCpack’
All declared Imports should be used.
```
# quRan
Version: 0.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 12928 marked UTF-8 strings
```
# r2glmm
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘data.table’ ‘dplyr’ ‘lmerTest’
All declared Imports should be used.
```
# r511
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# railtrails
Version: 0.1.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 1557 marked UTF-8 strings
```
# randomForestExplainer
Version: 0.9
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dtplyr’ ‘MASS’
All declared Imports should be used.
```
# raptr
Version: 0.1.3
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘gurobi’
```
* checking installed package size ... NOTE
```
installed size is 6.7Mb
sub-directories of 1Mb or more:
data 3.6Mb
doc 1.4Mb
```
# Rariant
Version: 1.16.0
## In both
* checking examples ... ERROR
```
...
> ### Aliases: tallyPlot
>
> ### ** Examples
>
> library(ggbio)
Loading required package: ggplot2
Need specific help about ggbio? try mailing
the maintainer or visit http://tengfei.github.com/ggbio/
Attaching package: 'ggbio'
The following objects are masked from 'package:ggplot2':
geom_bar, geom_rect, geom_segment, ggsave, stat_bin, stat_identity,
xlim
> library(GenomicRanges)
> library(BSgenome.Hsapiens.UCSC.hg19)
Error in library(BSgenome.Hsapiens.UCSC.hg19) :
there is no package called 'BSgenome.Hsapiens.UCSC.hg19'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘BSgenome.Hsapiens.UCSC.hg19’
```
* checking installed package size ... NOTE
```
installed size is 7.9Mb
sub-directories of 1Mb or more:
doc 2.3Mb
extdata 5.2Mb
```
* checking R code for possible problems ... NOTE
```
tallyBamRegion: no visible global function definition for 'PileupParam'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Rariant/new/Rariant.Rcheck/00_pkg_src/Rariant/R/tally.R:101-110)
tallyBamRegion: no visible global function definition for
'ScanBamParam'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Rariant/new/Rariant.Rcheck/00_pkg_src/Rariant/R/tally.R:112)
tallyBamRegion: no visible global function definition for 'pileup'
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Rariant/new/Rariant.Rcheck/00_pkg_src/Rariant/R/tally.R:114)
Undefined global functions or variables:
pileup PileupParam ScanBamParam
```
* checking installed files from ‘inst/doc’ ... NOTE
```
The following files should probably not be installed:
‘rariant-inspect-ci.png’, ‘rariant-inspect-shift.png’
Consider the use of a .Rinstignore file: see ‘Writing R Extensions’,
or move the vignette sources from ‘inst/doc’ to ‘vignettes’.
```
# rattle
Version: 5.2.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'rattle.tex' failed.
LaTeX errors:
! LaTeX Error: File `fancyhdr.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.12 \usepackage
{lastpage}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘arulesViz’ ‘cairoDevice’ ‘cba’ ‘ggraptR’ ‘gWidgetsRGtk2’ ‘playwith’
‘rggobi’ ‘RGtk2’ ‘wskm’ ‘RGtk2Extras’
```
* checking installed package size ... NOTE
```
installed size is 11.1Mb
sub-directories of 1Mb or more:
data 3.0Mb
etc 1.9Mb
po 1.2Mb
R 4.3Mb
```
# RBesT
Version: 1.3-7
## In both
* checking installed package size ... NOTE
```
installed size is 5.6Mb
sub-directories of 1Mb or more:
doc 1.9Mb
libs 2.2Mb
R 1.1Mb
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# rbin
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘utils’
All declared Imports should be used.
```
# rccmisc
Version: 0.3.7
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# rclimateca
Version: 1.0.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 24 marked UTF-8 strings
```
# RColetum
Version: 0.2.0
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
1. Error: Test GetAnswers on a single data frame in a very complex forms,
using complex group,relaitonal questions and question with N
answers. (@test-GetAnswersComplexForm.R#81944)
2. Failure: error by wrong token (@test-GetForms.R#38)
3. Failure: error by wrong token (@test-GetForms.R#42)
4. Error: get forms with no filter (@test-GetForms.R#74)
5. Error: get forms with the filters (@test-GetForms.R#81)
6. Failure: error by wrong token (@test-GetFormStructure.R#4)
7. Failure: error by wrong token (@test-GetFormStructure.R#8)
8. Failure: error by wrong idForm or nameForm (@test-GetFormStructure.R#15)
9. Failure: error by wrong idForm or nameForm (@test-GetFormStructure.R#20)
1. ...
Error: testthat unit tests failed
Execution halted
```
# rcongresso
Version: 0.4.6
## In both
* checking examples ... ERROR
```
Running examples in ‘rcongresso-Ex.R’ failed
The error most likely occurred in:
> ### Name: fetch_despesas_deputado
> ### Title: Fetches expenditures from deputy
> ### Aliases: fetch_despesas_deputado
>
> ### ** Examples
>
> gastos_abel_mesquita <- fetch_despesas_deputado(id = 178957)
Error: Falha na requisicao a API dos Dados Abertos. Erro 400 ao tentar acessar: https://dadosabertos.camara.leg.br/api/v2/deputados/178957/despesas?id=178957
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
9: doWithOneRestart(return(expr), restart)
── 3. Error: (unknown) (@test_votacoes.R#70) ──────────────────────────────────
argument "message" is missing, with no default
1: skip() at testthat/test_votacoes.R:70
2: structure(list(message = message), class = c("skip", "condition"))
══ testthat results ═══════════════════════════════════════════════════════════
OK: 13 SKIPPED: 0 FAILED: 3
1. Error: (unknown) (@test_deputados.R#81)
2. Error: (unknown) (@test_proposicoes.R#91)
3. Error: (unknown) (@test_votacoes.R#70)
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Attaching package: 'dplyr'
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Quitting from lines 36-38 (introducao-rcongresso.Rmd)
Error: processing vignette 'introducao-rcongresso.Rmd' failed with diagnostics:
could not find function "FUN1"
Execution halted
```
# rcv
Version: 0.2.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
data 5.0Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 6543 marked UTF-8 strings
```
# RDML
Version: 0.9-9
## In both
* checking installed package size ... NOTE
```
installed size is 5.5Mb
sub-directories of 1Mb or more:
doc 2.4Mb
R 2.1Mb
```
# Rdrools
Version: 1.1.1
## In both
* checking whether package ‘Rdrools’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Rdrools/new/Rdrools.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘Rdrools’ ...
** package ‘Rdrools’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rJava’:
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/Rdrools/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/Rdrools/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/Rdrools/rJava/libs/rJava.so
Reason: image not found
Error : package ‘rJava’ could not be loaded
ERROR: lazy loading failed for package ‘Rdrools’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Rdrools/new/Rdrools.Rcheck/Rdrools’
```
### CRAN
```
* installing *source* package ‘Rdrools’ ...
** package ‘Rdrools’ successfully unpacked and MD5 sums checked
** R
** data
*** moving datasets to lazyload DB
** inst
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rJava’:
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/Rdrools/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/Rdrools/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/Rdrools/rJava/libs/rJava.so
Reason: image not found
Error : package ‘rJava’ could not be loaded
ERROR: lazy loading failed for package ‘Rdrools’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Rdrools/old/Rdrools.Rcheck/Rdrools’
```
# rdrop2
Version: 0.8.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘digest’
All declared Imports should be used.
```
# readat
Version: 1.6.0
## In both
* checking R code for possible problems ... NOTE
```
sfread: no visible binding for global variable ‘header’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/readat/new/readat.Rcheck/00_pkg_src/readat/R/sfread.R:54)
sfread: no visible binding for global variable ‘nrows’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/readat/new/readat.Rcheck/00_pkg_src/readat/R/sfread.R:54)
Undefined global functions or variables:
header nrows
```
# recipes
Version: 0.1.4
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
14: dimRed::FastICA at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/recipes/new/recipes.Rcheck/00_pkg_src/recipes/R/ica.R:158
15: getExportedValue(pkg, name)
16: asNamespace(ns)
17: getNamespace(ns)
18: tryCatch(loadNamespace(name), error = function(e) stop(e))
19: tryCatchList(expr, classes, parentenv, handlers)
20: tryCatchOne(expr, names, parentenv, handlers[[1L]])
21: value[[3L]](cond)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 1118 SKIPPED: 9 FAILED: 1
1. Error: printing (@test_ica.R#127)
Error: testthat unit tests failed
Execution halted
```
* checking Rd cross-references ... WARNING
```
Unknown package ‘dimRed’ in Rd xrefs
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘dimRed’
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘RcppRoll’
All declared Imports should be used.
```
# regrrr
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘rlang’ ‘spatstat’
All declared Imports should be used.
```
# replyr
Version: 0.9.9
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘rquery’
```
# rerddap
Version: 0.5.0
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘taxize’
```
# rERR
Version: 0.1
## Newly broken
* checking re-building of vignette outputs ... WARNING
```
...
Please specify either 'title' or 'pagetitle' in the metadata.
Falling back to 'rERR.utf8'
Could not fetch http://mathurl.com/y7gp2qz5.png
HttpExceptionRequest Request {
host = "mathurl.com"
port = 80
secure = False
requestHeaders = []
path = "/y7gp2qz5.png"
queryString = ""
method = "GET"
proxy = Nothing
rawBody = False
redirectCount = 10
responseTimeout = ResponseTimeoutDefault
requestVersion = HTTP/1.1
}
ConnectionTimeout
Error: processing vignette 'rERR.Rmd' failed with diagnostics:
pandoc document conversion failed with error 61
Execution halted
```
# restfulSE
Version: 1.2.3
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘GO.db’
Packages suggested but not available for checking:
‘org.Mm.eg.db’ ‘org.Hs.eg.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# revengc
Version: 1.0.4
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'duchscherer-stewart-urban.ltx' failed with diagnostics:
Running 'texi2dvi' on 'duchscherer-stewart-urban.ltx' failed.
LaTeX errors:
! LaTeX Error: File `float.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.16 \usepackage
{graphicx}^^M
! ==> Fatal error occurred, no output PDF file produced!
Execution halted
```
# rfacebookstat
Version: 1.8.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘bitops’
All declared Imports should be used.
```
# rfbCNPJ
Version: 0.1.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 27 marked UTF-8 strings
```
# rfishbase
Version: 3.0.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 44 marked UTF-8 strings
```
# RGMQL
Version: 1.0.2
## In both
* checking whether package ‘RGMQL’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RGMQL/new/RGMQL.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘RGMQL’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RGMQL/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RGMQL/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RGMQL/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘RGMQL’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RGMQL/new/RGMQL.Rcheck/RGMQL’
```
### CRAN
```
* installing *source* package ‘RGMQL’ ...
** R
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RGMQL/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RGMQL/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RGMQL/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘RGMQL’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RGMQL/old/RGMQL.Rcheck/RGMQL’
```
# rhmmer
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# riingo
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘rlang’
All declared Imports should be used.
```
# RImmPort
Version: 1.8.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.9Mb
sub-directories of 1Mb or more:
extdata 3.8Mb
```
* checking R code for possible problems ... NOTE
```
buildNewSqliteDb: no visible global function definition for
‘dbListTables’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RImmPort/new/RImmPort.Rcheck/00_pkg_src/RImmPort/R/ImmPortSqlite.R:1890)
Undefined global functions or variables:
dbListTables
```
# riskclustr
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gtools’ ‘knitr’ ‘usethis’
All declared Imports should be used.
```
# rmapzen
Version: 0.4.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 31 marked UTF-8 strings
```
# rmcfs
Version: 1.2.15
## In both
* checking whether package ‘rmcfs’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/rmcfs/new/rmcfs.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘rmcfs’ ...
** package ‘rmcfs’ successfully unpacked and MD5 sums checked
** R
** inst
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rJava’:
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rmcfs/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rmcfs/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rmcfs/rJava/libs/rJava.so
Reason: image not found
Error : package ‘rJava’ could not be loaded
ERROR: lazy loading failed for package ‘rmcfs’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/rmcfs/new/rmcfs.Rcheck/rmcfs’
```
### CRAN
```
* installing *source* package ‘rmcfs’ ...
** package ‘rmcfs’ successfully unpacked and MD5 sums checked
** R
** inst
** byte-compile and prepare package for lazy loading
Error: package or namespace load failed for ‘rJava’:
.onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rmcfs/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rmcfs/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rmcfs/rJava/libs/rJava.so
Reason: image not found
Error : package ‘rJava’ could not be loaded
ERROR: lazy loading failed for package ‘rmcfs’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/rmcfs/old/rmcfs.Rcheck/rmcfs’
```
# RMCriteria
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# rmd
Version: 0.1.4
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘blogdown’ ‘bookdown’ ‘bookdownplus’ ‘citr’ ‘pagedown’ ‘rticles’
‘tinytex’ ‘xaringan’
All declared Imports should be used.
```
# RNeXML
Version: 2.3.0
## In both
* R CMD check timed out
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘taxadb’
```
# rnoaa
Version: 0.8.4
## In both
* checking installed package size ... NOTE
```
installed size is 5.0Mb
sub-directories of 1Mb or more:
vign 1.2Mb
```
# roahd
Version: 1.4.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
data 2.9Mb
doc 1.6Mb
```
# robotstxt
Version: 0.6.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘future’
All declared Imports should be used.
```
# rODE
Version: 0.99.6
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘data.table’
All declared Imports should be used.
```
# rolypoly
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘matrixcalc’
All declared Imports should be used.
```
# rpcdsearch
Version: 1.0
## In both
* checking whether package ‘rpcdsearch’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/rpcdsearch/new/rpcdsearch.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘rpcdsearch’ ...
** package ‘rpcdsearch’ successfully unpacked and MD5 sums checked
** R
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rpcdsearch/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rpcdsearch/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rpcdsearch/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘rpcdsearch’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/rpcdsearch/new/rpcdsearch.Rcheck/rpcdsearch’
```
### CRAN
```
* installing *source* package ‘rpcdsearch’ ...
** package ‘rpcdsearch’ successfully unpacked and MD5 sums checked
** R
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rpcdsearch/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rpcdsearch/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/rpcdsearch/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘rpcdsearch’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/rpcdsearch/old/rpcdsearch.Rcheck/rpcdsearch’
```
# rPref
Version: 1.3
## In both
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# Rraven
Version: 1.0.5
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘warbleR’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# rrr
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘Rcpp’
All declared Imports should be used.
```
# Rsconctdply
Version: 0.1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# rscorecard
Version: 0.11.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tidyselect’
All declared Imports should be used.
```
# RSDA
Version: 2.0.8
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘randomcoloR’
All declared Imports should be used.
```
# rsimsum
Version: 0.5.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
doc 1.0Mb
help 3.4Mb
```
# rsinaica
Version: 0.6.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 467 marked UTF-8 strings
```
# Rspotify
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘magrittr’
All declared Imports should be used.
```
# rstap
Version: 1.0.3
## Newly broken
* checking Rd cross-references ... WARNING
```
Unknown package ‘rstanarm’ in Rd xrefs
```
## Newly fixed
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘rstanarm’
```
## In both
* checking installed package size ... NOTE
```
installed size is 9.8Mb
sub-directories of 1Mb or more:
libs 7.1Mb
R 2.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘loo’
All declared Imports should be used.
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# RSwissMaps
Version: 0.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 18627 marked UTF-8 strings
```
# RTCGA
Version: 1.10.0
## In both
* checking examples ... ERROR
```
Running examples in ‘RTCGA-Ex.R’ failed
The error most likely occurred in:
> ### Name: boxplotTCGA
> ### Title: Create Boxplots for TCGA Datasets
> ### Aliases: boxplotTCGA
>
> ### ** Examples
>
> library(RTCGA.rnaseq)
Error in library(RTCGA.rnaseq) :
there is no package called ‘RTCGA.rnaseq’
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Complete output:
> library(testthat)
> library(RTCGA)
Welcome to the RTCGA (version: 1.10.0).
> library(RTCGA.rnaseq)
Error in library(RTCGA.rnaseq) :
there is no package called 'RTCGA.rnaseq'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘RTCGA.rnaseq’ ‘RTCGA.clinical’ ‘RTCGA.mutations’ ‘RTCGA.RPPA’
‘RTCGA.mRNA’ ‘RTCGA.miRNASeq’ ‘RTCGA.methylation’ ‘RTCGA.CNV’
‘RTCGA.PANCAN12’
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/ggbiplot.R:157-161)
ggbiplot: no visible binding for global variable ‘xvar’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/ggbiplot.R:157-161)
ggbiplot: no visible binding for global variable ‘yvar’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/ggbiplot.R:157-161)
ggbiplot: no visible binding for global variable ‘angle’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/ggbiplot.R:157-161)
ggbiplot: no visible binding for global variable ‘hjust’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/ggbiplot.R:157-161)
read.mutations: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/readTCGA.R:383)
read.mutations: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/readTCGA.R:386)
read.rnaseq: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/readTCGA.R:372-375)
survivalTCGA: no visible binding for global variable ‘times’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/survivalTCGA.R:101-137)
whichDateToUse: no visible binding for global variable ‘.’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RTCGA/new/RTCGA.Rcheck/00_pkg_src/RTCGA/R/downloadTCGA.R:167-168)
Undefined global functions or variables:
. angle hjust muted times varname xvar yvar
```
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘RTCGA.rnaseq’, ‘RTCGA.clinical’, ‘RTCGA.mutations’, ‘RTCGA.CNV’, ‘RTCGA.RPPA’, ‘RTCGA.mRNA’, ‘RTCGA.miRNASeq’, ‘RTCGA.methylation’
```
# RTD
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘openssl’
All declared Imports should be used.
```
# rtimicropem
Version: 1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘R6’
All declared Imports should be used.
```
# rtrek
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘memoise’ ‘tidyr’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 988 marked UTF-8 strings
```
# rtrends
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# RtutoR
Version: 1.2
## In both
* checking whether package ‘RtutoR’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RtutoR/new/RtutoR.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘RtutoR’ ...
** package ‘RtutoR’ successfully unpacked and MD5 sums checked
** R
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RtutoR/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RtutoR/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RtutoR/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘RtutoR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RtutoR/new/RtutoR.Rcheck/RtutoR’
```
### CRAN
```
* installing *source* package ‘RtutoR’ ...
** package ‘RtutoR’ successfully unpacked and MD5 sums checked
** R
** inst
** byte-compile and prepare package for lazy loading
Error : .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RtutoR/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RtutoR/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/RtutoR/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘RtutoR’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/RtutoR/old/RtutoR.Rcheck/RtutoR’
```
# rubias
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘ggplot2’
All declared Imports should be used.
```
# rwavelet
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# RxODE
Version: 0.8.0-9
## In both
* checking installed package size ... NOTE
```
installed size is 6.1Mb
sub-directories of 1Mb or more:
doc 1.6Mb
libs 2.1Mb
R 2.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘n1qn1’
All declared Imports should be used.
```
# rzeit2
Version: 0.2.3
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 841 marked UTF-8 strings
```
# safetyGraphics
Version: 0.7.3
## In both
* checking re-building of vignette outputs ... WARNING
```
...
Error in re-building vignettes:
...
Could not fetch https://user-images.githubusercontent.com/3680095/51569925-98219500-1e52-11e9-9992-0955ebef9bf4.png
HttpExceptionRequest Request {
host = "user-images.githubusercontent.com"
port = 443
secure = True
requestHeaders = []
path = "/3680095/51569925-98219500-1e52-11e9-9992-0955ebef9bf4.png"
queryString = ""
method = "GET"
proxy = Nothing
rawBody = False
redirectCount = 10
responseTimeout = ResponseTimeoutDefault
requestVersion = HTTP/1.1
}
(ConnectionFailure Network.Socket.getAddrInfo (called with preferred socket type/protocol: AddrInfo {addrFlags = [AI_ADDRCONFIG], addrFamily = AF_UNSPEC, addrSocketType = Stream, addrProtocol = 6, addrAddress = <assumed to be undefined>, addrCanonName = <assumed to be undefined>}, host name: Just "user-images.githubusercontent.com", service name: Just "443"): does not exist (nodename nor servname provided, or not known))
Error: processing vignette 'shinyUserGuide.Rmd' failed with diagnostics:
pandoc document conversion failed with error 61
Execution halted
```
# SanFranBeachWater
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
# SanzCircos
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘purrr’ ‘tidyr’
All declared Imports should be used.
```
# SAR
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘httr’ ‘jsonlite’
All declared Imports should be used.
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# scater
Version: 1.8.4
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
dims = ncomponents, check_duplicates = FALSE, ...) at /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/scater/new/scater.Rcheck/00_pkg_src/scater/R/runTSNE.R:90
9: Rtsne.default(vals, initial_dims = initial_dims, pca = pca, perplexity = perplexity,
dims = ncomponents, check_duplicates = FALSE, ...)
10: .check_tsne_params(nrow(X), dims = dims, perplexity = perplexity, theta = theta,
max_iter = max_iter, verbose = verbose, Y_init = Y_init, stop_lying_iter = stop_lying_iter,
mom_switch_iter = mom_switch_iter, momentum = momentum, final_momentum = final_momentum,
eta = eta, exaggeration_factor = exaggeration_factor)
11: stop("dims should be either 1, 2 or 3")
Collapsing expression to 500 features.
Kallisto log not provided - assuming all runs successful
══ testthat results ═══════════════════════════════════════════════════════════
OK: 1012 SKIPPED: 0 FAILED: 1
1. Error: we can produce TSNE plots (@test-plotting.R#330)
Error: testthat unit tests failed
Execution halted
```
* checking examples ... WARNING
```
Found the following significant warnings:
Warning: 'read10xResults' is deprecated.
Warning: 'downsampleCounts' is deprecated.
Warning: 'normalizeExprs' is deprecated.
Warning: 'normalizeExprs' is deprecated.
Warning: 'normalizeExprs' is deprecated.
Warning: 'normalizeExprs' is deprecated.
Warning: 'normalizeExprs' is deprecated.
Warning: 'normalizeExprs' is deprecated.
Warning: 'read10xResults' is deprecated.
Deprecated functions may be defunct as soon as of the next release of
R.
See ?Deprecated.
```
* checking installed package size ... NOTE
```
installed size is 15.4Mb
sub-directories of 1Mb or more:
doc 5.4Mb
extdata 2.9Mb
libs 4.8Mb
```
# scFeatureFilter
Version: 1.0.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'Introduction.Rmd' failed with diagnostics:
there is no package called ‘BiocStyle’
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘scRNAseq’
```
* checking installed package size ... NOTE
```
installed size is 6.5Mb
sub-directories of 1Mb or more:
data 3.7Mb
doc 2.4Mb
```
# scfind
Version: 1.2.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'scfind.Rmd' failed with diagnostics:
there is no package called ‘BiocStyle’
Execution halted
```
# scmap
Version: 1.2.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'scmap.Rmd' failed with diagnostics:
there is no package called ‘BiocStyle’
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘Biobase’
All declared Imports should be used.
```
# Sconify
Version: 1.0.4
## In both
* checking whether package ‘Sconify’ can be installed ... WARNING
```
Found the following significant warnings:
Warning: replacing previous import ‘flowCore::view’ by ‘tibble::view’ when loading ‘Sconify’
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/Sconify/new/Sconify.Rcheck/00install.out’ for details.
```
# scoper
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘shazam’
All declared Imports should be used.
```
# sdStaf
Version: 1.0.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘rgdal’ ‘rgeos’ ‘tidyr’
All declared Imports should be used.
```
# sejmRP
Version: 1.3.4
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘cluster’ ‘factoextra’ ‘tidyr’
All declared Imports should be used.
```
# semdrw
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘lavaan’ ‘psych’ ‘semPlot’ ‘semTools’ ‘shinyAce’
All declared Imports should be used.
```
# SeqVarTools
Version: 1.18.0
## In both
* checking examples ... ERROR
```
...
> ### Name: variantInfo
> ### Title: Variant info
> ### Aliases: variantInfo variantInfo,SeqVarGDSClass-method
> ### expandedVariantIndex expandedVariantIndex,SeqVarGDSClass-method
>
> ### ** Examples
>
> gds <- seqOpen(seqExampleFileName("gds"))
> seqSetFilter(gds, variant.sel=1323:1327)
# of selected variants: 5
> variantInfo(gds, alleles=TRUE)
variant.id chr pos ref alt
1 1323 21 44213462 C T,CT
2 1324 21 44214985 G A
3 1325 21 44215700 C T
4 1326 22 16042444 C G
5 1327 22 16042793 A G
> variantInfo(gds, alleles=TRUE, expanded=TRUE)
Error in n() : could not find function "n"
Calls: variantInfo ... variantInfo -> .local -> mutate_ -> mutate_.tbl_df -> mutate_impl
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/test.R’ failed.
Last 13 lines of output:
1 Test Suite :
SeqVarTools RUnit Tests - 133 test functions, 1 error, 0 failures
ERROR in test_variantInfo: Error in n() : could not find function "n"
Test files with failing tests
test_getData.R
test_variantInfo
Error in BiocGenerics:::testPackage("SeqVarTools") :
unit tests failed for package SeqVarTools
Execution halted
```
* checking re-building of vignette outputs ... NOTE
```
...
Vignettes contain introductory material; view with
'browseVignettes()'. To cite Bioconductor, see
'citation("Biobase")', and for packages 'citation("pkgname")'.
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'SeqVarTools.tex' failed.
LaTeX errors:
! LaTeX Error: File `fancyhdr.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.171 \pagestyle
{fancy}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# Seurat
Version: 2.3.4
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘loomR’
```
# sevenbridges
Version: 1.10.5
## In both
* checking installed package size ... NOTE
```
installed size is 7.7Mb
sub-directories of 1Mb or more:
doc 2.9Mb
R 4.1Mb
```
# sf
Version: 0.7-3
## In both
* checking whether package ‘sf’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sf/new/sf.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘sf’ ...
** package ‘sf’ successfully unpacked and MD5 sums checked
configure: CC: clang
configure: CXX: clang++ -std=gnu++11
checking for gdal-config... no
no
configure: error: gdal-config not found or not executable.
ERROR: configuration failed for package ‘sf’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sf/new/sf.Rcheck/sf’
```
### CRAN
```
* installing *source* package ‘sf’ ...
** package ‘sf’ successfully unpacked and MD5 sums checked
configure: CC: clang
configure: CXX: clang++ -std=gnu++11
checking for gdal-config... no
no
configure: error: gdal-config not found or not executable.
ERROR: configuration failed for package ‘sf’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sf/old/sf.Rcheck/sf’
```
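The sf failure is environmental as well: the configure script cannot find `gdal-config`, so GDAL must be installed (typically alongside GEOS and PROJ) before sf will build at all. A quick check sketch — the Homebrew hint assumes macOS, matching the paths elsewhere in this report:

```shell
# sf's configure step needs gdal-config on PATH; check for it and report.
if command -v gdal-config >/dev/null 2>&1; then
  echo "gdal-config: $(gdal-config --version)"
else
  echo "gdal-config missing (on macOS, e.g.: brew install gdal)"
fi
```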
# shiny.semantic
Version: 0.2.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘utils’
All declared Imports should be used.
```
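This "Namespace in Imports field not imported from" NOTE recurs throughout the report (shinyaframe, shinyHeatmaply, sicegar, sidrar, and many others). It means the package's DESCRIPTION declares a dependency that its code never touches. The fix is on the package author's side: either drop the entry from `Imports:`, or genuinely import and use it. A minimal sketch of the two files involved — the `utils` example matches this NOTE, but the retained import and function name are illustrative, not taken from shiny.semantic:

```
# DESCRIPTION — list only packages the code actually uses:
Imports:
    shiny

# NAMESPACE — or, if utils really is used, import it explicitly:
importFrom(utils, packageVersion)
```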
# shinyAce
Version: 0.3.3
## In both
* checking installed package size ... NOTE
```
installed size is 7.9Mb
sub-directories of 1Mb or more:
www 7.7Mb
```
# shinyaframe
Version: 1.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘shiny’
All declared Imports should be used.
```
# shinyHeatmaply
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘htmlwidgets’ ‘jsonlite’ ‘RColorBrewer’ ‘viridis’
All declared Imports should be used.
```
# SIBER
Version: 2.1.4
## In both
* checking examples ... ERROR
```
...
> ### ** Examples
>
> x <- stats::rnorm(50)
> y <- stats::rnorm(50)
> parms <- list()
> parms$n.iter <- 2 * 10^3
> parms$n.burnin <- 500
> parms$n.thin <- 2
> parms$n.chains <- 2
> priors <- list()
> priors$R <- 1 * diag(2)
> priors$k <- 2
> priors$tau.mu <- 1.0E-3
> fitEllipse(x, y, parms, priors)
Error: .onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/SIBER/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/SIBER/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/SIBER/rjags/libs/rjags.so
Reason: image not found
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 74-96 (Centroid-Vectors.Rmd)
Error: processing vignette 'Centroid-Vectors.Rmd' failed with diagnostics:
.onLoad failed in loadNamespace() for 'rjags', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/SIBER/rjags/libs/rjags.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/SIBER/rjags/libs/rjags.so, 10): Library not loaded: /usr/local/lib/libjags.4.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/SIBER/rjags/libs/rjags.so
Reason: image not found
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘coda’ ‘ellipse’ ‘viridis’
All declared Imports should be used.
```
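Both SIBER failures share one root cause: rjags was compiled against JAGS, but `/usr/local/lib/libjags.4.dylib` is absent on this machine, so `dyn.load()` fails the moment the namespace loads. A presence-check sketch — the dylib path is the macOS one quoted in the errors above; on Linux the library would be `libjags.so.4` found via `ldconfig`:

```shell
# rjags dynamically links libjags; verify the runtime library exists.
# If it is missing, install JAGS itself first, then reinstall rjags.
if [ -e /usr/local/lib/libjags.4.dylib ] || ldconfig -p 2>/dev/null | grep -q libjags; then
  echo "libjags: present"
else
  echo "libjags: missing (install JAGS, then reinstall rjags)"
fi
```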
# sicegar
Version: 0.2.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# sidrar
Version: 0.2.4
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# sigmajs
Version: 0.1.2
## In both
* checking installed package size ... NOTE
```
installed size is 7.0Mb
sub-directories of 1Mb or more:
doc 5.3Mb
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 28 marked UTF-8 strings
```
# SimDesign
Version: 1.13
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘doMPI’
```
# simputation
Version: 0.2.2
## In both
* checking whether package ‘simputation’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/simputation/new/simputation.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘simputation’ ...
** package ‘simputation’ successfully unpacked and MD5 sums checked
** libs
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c R_register_native.c -o R_register_native.o
clang: error: unsupported option '-fopenmp'
make: *** [R_register_native.o] Error 1
ERROR: compilation failed for package ‘simputation’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/simputation/new/simputation.Rcheck/simputation’
```
### CRAN
```
* installing *source* package ‘simputation’ ...
** package ‘simputation’ successfully unpacked and MD5 sums checked
** libs
clang -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c R_register_native.c -o R_register_native.o
clang: error: unsupported option '-fopenmp'
make: *** [R_register_native.o] Error 1
ERROR: compilation failed for package ‘simputation’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/simputation/old/simputation.Rcheck/simputation’
```
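Both installation failures are the same toolchain problem: Apple's bundled clang rejects `-fopenmp`, so any package whose build flags request OpenMP fails to compile on this machine. A common workaround is to point R at an OpenMP-capable compiler via `~/.R/Makevars`; a sketch assuming Homebrew's LLVM at its default prefix (adjust the paths to your installation):

```
# ~/.R/Makevars — use Homebrew LLVM's clang, which supports -fopenmp
CC = /usr/local/opt/llvm/bin/clang
CXX = /usr/local/opt/llvm/bin/clang++
LDFLAGS += -L/usr/local/opt/llvm/lib
```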
# SimRVPedigree
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# sjstats
Version: 0.17.3
## In both
* checking Rd cross-references ... NOTE
```
Package unavailable to check Rd xrefs: ‘arm’
```
# skynet
Version: 1.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘maps’
All declared Imports should be used.
```
# sophisthse
Version: 0.7.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 1320 marked UTF-8 strings
```
# sorvi
Version: 0.7.26
## In both
* checking re-building of vignette outputs ... WARNING
```
...
convert bootstrapped spaghettis to long format
Computing density estimates for each vertical cut ...
vertical cross-sectional density estimate
Tile approach
Build ggplot figure ...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'sorvi.tex' failed.
LaTeX errors:
! LaTeX Error: File `float.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.15 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sorvi/new/sorvi.Rcheck/00_pkg_src/sorvi/R/regression_plot.R:115)
regression_plot : <anonymous>: no visible global function definition
for ‘pnorm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sorvi/new/sorvi.Rcheck/00_pkg_src/sorvi/R/regression_plot.R:115)
regression_plot: no visible global function definition for
‘flush.console’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sorvi/new/sorvi.Rcheck/00_pkg_src/sorvi/R/regression_plot.R:138)
regression_plot: no visible global function definition for ‘density’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sorvi/new/sorvi.Rcheck/00_pkg_src/sorvi/R/regression_plot.R:147)
regression_plot: no visible global function definition for
‘flush.console’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/sorvi/new/sorvi.Rcheck/00_pkg_src/sorvi/R/regression_plot.R:194)
Undefined global functions or variables:
colorRampPalette density flush.console loess loess.control pnorm
predict quantile read.csv
Consider adding
importFrom("grDevices", "colorRampPalette")
importFrom("stats", "density", "loess", "loess.control", "pnorm",
"predict", "quantile")
importFrom("utils", "flush.console", "read.csv")
to your NAMESPACE file.
```
# sourceR
Version: 1.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gtools’ ‘hashmap’ ‘reshape2’
All declared Imports should be used.
```
# SpaDES.core
Version: 0.2.4
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/test-all.R’ failed.
Last 13 lines of output:
[9] -6.33 - -8.17 == 1.831
...
  Using cached copy of .inputObjects event in child6 module.
  Using memoised copy of .inputObjects event in child6 module
  Using memoised copy of .inputObjects event in child6 module
══ testthat results ═══════════════════════════════════════════════════════════════════════════
OK: 452 SKIPPED: 33 FAILED: 2
1. Failure: simulation runs with simInit and spades (@test-simulation.R#86)
2. Failure: simulation runs with simInit and spades (@test-simulation.R#87)
Error: testthat unit tests failed
In addition: Warning message:
In fun(libname, pkgname) : couldn't connect to display ""
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 5.6Mb
sub-directories of 1Mb or more:
doc 1.6Mb
R 3.1Mb
```
# sparklyr
Version: 1.0.0
## In both
* checking installed package size ... NOTE
```
installed size is 6.8Mb
sub-directories of 1Mb or more:
java 1.5Mb
R 4.1Mb
```
# sparseHessianFD
Version: 0.3.3.4
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'sparseHessianFD.tex' failed.
LaTeX errors:
! LaTeX Error: File `algorithm.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.27 \usepackage
{algorithmic}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# sparseMVN
Version: 0.2.1.1
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'sparseMVN.tex' failed.
LaTeX errors:
! LaTeX Error: File `placeins.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.30 \usepackage
{array}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# SpatialBall
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘lubridate’
All declared Imports should be used.
```
# SpatialEpiApp
Version: 0.3
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘INLA’
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘dygraphs’ ‘ggplot2’ ‘htmlwidgets’ ‘knitr’ ‘leaflet’
‘mapproj’ ‘maptools’ ‘RColorBrewer’ ‘rgdal’ ‘rgeos’ ‘rmarkdown’
‘shinyjs’ ‘SpatialEpi’ ‘spdep’ ‘xts’
All declared Imports should be used.
```
# sport
Version: 0.1.2
## In both
* checking PDF version of manual ... WARNING
```
LaTeX errors when creating PDF version.
This typically indicates Rd problems.
LaTeX errors found:
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
! LaTeX Error: Command \k unavailable in encoding OT1.
See the LaTeX manual or LaTeX Companion for explanation.
Type H <return> for immediate help.
...
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 6863 marked UTF-8 strings
```
# stacomiR
Version: 0.5.4.2
## In both
* checking package dependencies ... ERROR
```
Packages required but not available: ‘gWidgetsRGtk2’ ‘RGtk2’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# staRdom
Version: 1.0.12
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Attaching package: 'dplyr'
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Quitting from lines 25-63 (Basic_analysis_of_DOM_samples.Rmd)
Error: processing vignette 'Basic_analysis_of_DOM_samples.Rmd' failed with diagnostics:
Timeout was reached: Resolving timed out after 10000 milliseconds
Execution halted
```
# stars
Version: 0.3-0
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘starsdata’
```
* checking installed package size ... NOTE
```
installed size is 15.6Mb
sub-directories of 1Mb or more:
doc 10.3Mb
nc 3.5Mb
```
# statsDK
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘ggplot2’ ‘stringr’
All declared Imports should be used.
```
# stlcsb
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
# stminsights
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘huge’ ‘readr’ ‘scales’ ‘shinyjs’
All declared Imports should be used.
```
# StratigrapheR
Version: 0.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘hexbin’
All declared Imports should be used.
```
# STRMPS
Version: 0.5.8
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘STRaitRazoR’
```
# SubgrPlots
Version: 0.1.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.5Mb
sub-directories of 1Mb or more:
paper 2.3Mb
R 3.1Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘alluvial’ ‘geoR’ ‘gridBase’ ‘UpSetR’
All declared Imports should be used.
```
# subscreen
Version: 2.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘bsplus’ ‘colourpicker’ ‘dplyr’ ‘DT’ ‘graphics’ ‘grDevices’
‘jsonlite’ ‘shinyjs’ ‘V8’
All declared Imports should be used.
```
# subSeq
Version: 1.10.0
## In both
* checking R code for possible problems ... NOTE
```
...
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/subSeq/new/subSeq.Rcheck/00_pkg_src/subSeq/R/summary.subsamples.R:127-129)
summary.subsamples: no visible binding for global variable ‘percent’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/subSeq/new/subSeq.Rcheck/00_pkg_src/subSeq/R/summary.subsamples.R:127-129)
summary.subsamples: no visible binding for global variable ‘proportion’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/subSeq/new/subSeq.Rcheck/00_pkg_src/subSeq/R/summary.subsamples.R:127-129)
summary.subsamples: no visible binding for global variable ‘method’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/subSeq/new/subSeq.Rcheck/00_pkg_src/subSeq/R/summary.subsamples.R:127-129)
voomLimma: no visible global function definition for ‘model.matrix’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/subSeq/new/subSeq.Rcheck/00_pkg_src/subSeq/R/handlers.R:41)
Undefined global functions or variables:
. average.depth average.value coefficient cor count cov depth estFDP
ID method metric model.matrix o.coefficient o.lfdr o.padj p.adjust
padj percent plot proportion pvalue rbinom replication rFDP
selectMethod significant valid value var
Consider adding
importFrom("graphics", "plot")
importFrom("methods", "selectMethod")
importFrom("stats", "cor", "cov", "model.matrix", "p.adjust", "rbinom",
"var")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# SummarizedBenchmark
Version: 1.0.4
## In both
* checking re-building of vignette outputs ... WARNING
```
...
The following object is masked from 'package:SummarizedBenchmark':
plotROC
Attaching package: 'magrittr'
The following object is masked from 'package:rlang':
set_names
The following object is masked from 'package:tidyr':
extract
Loading required package: SingleCellExperiment
Quitting from lines 47-54 (SingleCellBenchmark.Rmd)
Error: processing vignette 'SingleCellBenchmark.Rmd' failed with diagnostics:
there is no package called 'scRNAseq'
Execution halted
```
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘scRNAseq’
Depends: includes the non-default packages:
‘tidyr’ ‘SummarizedExperiment’ ‘S4Vectors’ ‘BiocGenerics’ ‘UpSetR’
‘rlang’ ‘stringr’ ‘BiocParallel’ ‘ggplot2’ ‘mclust’ ‘dplyr’
Adding so many packages to the search path is excessive and importing
selectively is preferable.
```
* checking installed package size ... NOTE
```
installed size is 13.1Mb
sub-directories of 1Mb or more:
data 9.3Mb
doc 3.3Mb
```
* checking dependencies in R code ... NOTE
```
Unexported object imported by a ':::' call: ‘BiocGenerics:::replaceSlots’
See the note in ?`:::` about the use of this operator.
```
* checking R code for possible problems ... NOTE
```
.list2mat : <anonymous>: no visible binding for global variable
‘.method’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/SummarizedBenchmark/new/SummarizedBenchmark.Rcheck/00_pkg_src/SummarizedBenchmark/R/buildBench.R:275)
.list2mat : <anonymous>: no visible binding for global variable ‘.val’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/SummarizedBenchmark/new/SummarizedBenchmark.Rcheck/00_pkg_src/SummarizedBenchmark/R/buildBench.R:275)
.list2mat : <anonymous>: no visible binding for global variable ‘.id’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/SummarizedBenchmark/new/SummarizedBenchmark.Rcheck/00_pkg_src/SummarizedBenchmark/R/buildBench.R:276-277)
plotROC: no visible binding for global variable ‘FDR’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/SummarizedBenchmark/new/SummarizedBenchmark.Rcheck/00_pkg_src/SummarizedBenchmark/R/PlottingFunctions.R:81-82)
plotROC: no visible binding for global variable ‘TPR’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/SummarizedBenchmark/new/SummarizedBenchmark.Rcheck/00_pkg_src/SummarizedBenchmark/R/PlottingFunctions.R:81-82)
plotROC: no visible binding for global variable ‘method’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/SummarizedBenchmark/new/SummarizedBenchmark.Rcheck/00_pkg_src/SummarizedBenchmark/R/PlottingFunctions.R:81-82)
Undefined global functions or variables:
.id .method .val FDR method TPR
```
# summarytools
Version: 0.9.2
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 78 marked UTF-8 strings
```
# sunburstR
Version: 2.1.1
## In both
* checking package dependencies ... NOTE
```
Package which this enhances but not available for checking: ‘treemap’
```
# suropt
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘DiceOptim’ ‘GPareto’ ‘rgenoud’
All declared Imports should be used.
```
# survminer
Version: 0.4.3
## In both
* checking installed package size ... NOTE
```
installed size is 6.1Mb
sub-directories of 1Mb or more:
doc 5.1Mb
```
# SVMMaj
Version: 0.2.9
## In both
* checking re-building of vignette outputs ... NOTE
```
...
The following object is masked from ‘package:ggplot2’:
alpha
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'paper.tex' failed.
LaTeX errors:
! LaTeX Error: File `relsize.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.987 \ifthenelse
{\boolean{algocf@slide}}{\RequirePackage{color}}{}%^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# sweep
Version: 0.2.1.1
## Newly fixed
* checking re-building of vignette outputs ... WARNING
```
...
Version 0.4-0 included new data defaults. See ?getSymbols.
Loading required package: tidyverse
── Attaching packages ────────────────────────────────── tidyverse 1.2.1 ──
✔ ggplot2 3.1.0 ✔ purrr 0.3.1
✔ tibble 2.0.1 ✔ dplyr 0.8.0.1
✔ tidyr 0.8.3 ✔ stringr 1.4.0
✔ readr 1.3.1 ✔ forcats 0.4.0
── Conflicts ───────────────────────────────────── tidyverse_conflicts() ──
✖ lubridate::as.difftime() masks base::as.difftime()
✖ lubridate::date() masks base::date()
✖ dplyr::filter() masks stats::filter()
✖ dplyr::first() masks xts::first()
✖ lubridate::intersect() masks base::intersect()
✖ dplyr::lag() masks stats::lag()
✖ dplyr::last() masks xts::last()
✖ lubridate::setdiff() masks base::setdiff()
✖ lubridate::union() masks base::union()
Quitting from lines 68-76 (SW00_Introduction_to_sweep.Rmd)
Error: processing vignette 'SW00_Introduction_to_sweep.Rmd' failed with diagnostics:
`data` must be a data frame, or other object coercible by `fortify()`, not a logical vector
Execution halted
```
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘devtools’ ‘lazyeval’ ‘lubridate’ ‘tidyr’
All declared Imports should be used.
```
# swfdr
Version: 1.6.0
## In both
* checking re-building of vignette outputs ... WARNING
```
...
fmtutil [INFO]: Total formats: 15
fmtutil [INFO]: exiting with status 0
tlmgr install fancyhdr
TeX Live 2018 is frozen forever and will no
longer be updated. This happens in preparation for a new release.
If you're interested in helping to pretest the new release (when
pretests are available), please read http://tug.org/texlive/pretest.html.
Otherwise, just wait, and the new release will be ready in due time.
tlmgr: Fundamental package texlive.infra not present, uh oh, goodbyeShould not happen, texlive.infra not found at /usr/local/bin/tlmgr line 7344.
tlmgr: package repository http://mirrors.standaloneinstaller.com/ctan/systems/texlive/tlnet (not verified: gpg unavailable)
tlmgr path add
! LaTeX Error: File `fancyhdr.sty' not found.
! Emergency stop.
<read *>
Error: processing vignette 'swfdrTutorial.Rmd' failed with diagnostics:
Failed to compile swfdrTutorial.tex. See swfdrTutorial.log for more info.
Execution halted
```
* checking R code for possible problems ... NOTE
```
lm_pi0: no visible global function definition for ‘glm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/swfdr/new/swfdr.Rcheck/00_pkg_src/swfdr/R/lm_pi0.R:56)
lm_pi0: no visible binding for global variable ‘binomial’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/swfdr/new/swfdr.Rcheck/00_pkg_src/swfdr/R/lm_pi0.R:56)
Undefined global functions or variables:
binomial glm
Consider adding
importFrom("stats", "binomial", "glm")
to your NAMESPACE file.
```
# switchde
Version: 1.6.0
## In both
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.travis.yml
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
# SWMPrExtension
Version: 0.3.16
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘rgeos’
All declared Imports should be used.
```
# synlet
Version: 1.10.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Error: processing vignette 'synlet-vignette.Rmd' failed with diagnostics:
there is no package called ‘BiocStyle’
Execution halted
```
* checking R code for possible problems ... NOTE
```
...
zFactor: no visible binding for global variable ‘sd’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/synlet/new/synlet.Rcheck/00_pkg_src/synlet/R/zFactor.R:37-38)
zFactor: no visible binding for global variable ‘median’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/synlet/new/synlet.Rcheck/00_pkg_src/synlet/R/zFactor.R:37-38)
zFactor: no visible global function definition for ‘complete.cases’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/synlet/new/synlet.Rcheck/00_pkg_src/synlet/R/zFactor.R:50)
Undefined global functions or variables:
COL_NAME colorRampPalette complete.cases condition dev.off
EXPERIMENT_MODIFICATION EXPERIMENT_TYPE experiments is mad
MASTER_PLATE median medpolish p.adjust pdf phyper PLATE rainbow
READOUT ROW_NAME sd siRNA t.test value Var1 WELL_CONTENT_NAME
write.table
Consider adding
importFrom("grDevices", "colorRampPalette", "dev.off", "pdf",
"rainbow")
importFrom("methods", "is")
importFrom("stats", "complete.cases", "mad", "median", "medpolish",
"p.adjust", "phyper", "sd", "t.test")
importFrom("utils", "write.table")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# syuzhet
Version: 1.0.4
## In both
* checking installed package size ... NOTE
```
installed size is 5.8Mb
sub-directories of 1Mb or more:
extdata 3.1Mb
R 2.1Mb
```
# tabula
Version: 1.0.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘tidyr’
All declared Imports should be used.
```
# tabularaster
Version: 0.5.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘methods’
All declared Imports should be used.
```
# TAShiny
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘igraph’ ‘SnowballC’ ‘tm’ ‘wordcloud2’
All declared Imports should be used.
```
# taxa
Version: 0.3.2
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
data 1.1Mb
doc 1.7Mb
R 2.0Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘knitr’ ‘lazyeval’ ‘rlang’ ‘tidyr’
All declared Imports should be used.
```
# tbl2xts
Version: 0.1.2
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘PerformanceAnalytics’
All declared Imports should be used.
```
# TCGAbiolinks
Version: 2.8.4
## In both
* R CMD check timed out
* checking dependencies in R code ... WARNING
```
'::' or ':::' import not declared from: ‘tidyr’
```
* checking installed package size ... NOTE
```
installed size is 74.3Mb
sub-directories of 1Mb or more:
data 3.6Mb
doc 66.4Mb
R 4.1Mb
```
* checking R code for possible problems ... NOTE
```
...
TCGAtumor_purity: no visible binding for global variable ‘Tumor.purity’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TCGAbiolinks/new/TCGAbiolinks.Rcheck/00_pkg_src/TCGAbiolinks/R/clinical.R:639-640)
TCGAvisualize_oncoprint: no visible binding for global variable ‘value’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TCGAbiolinks/new/TCGAbiolinks.Rcheck/00_pkg_src/TCGAbiolinks/R/visualize.R:944)
TCGAvisualize_SurvivalCoxNET: no visible global function definition for
‘dNetInduce’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TCGAbiolinks/new/TCGAbiolinks.Rcheck/00_pkg_src/TCGAbiolinks/R/visualize.R:156-157)
TCGAvisualize_SurvivalCoxNET: no visible global function definition for
‘dNetPipeline’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TCGAbiolinks/new/TCGAbiolinks.Rcheck/00_pkg_src/TCGAbiolinks/R/visualize.R:161-162)
TCGAvisualize_SurvivalCoxNET: no visible global function definition for
‘dCommSignif’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TCGAbiolinks/new/TCGAbiolinks.Rcheck/00_pkg_src/TCGAbiolinks/R/visualize.R:174)
TCGAvisualize_SurvivalCoxNET: no visible global function definition for
‘visNet’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TCGAbiolinks/new/TCGAbiolinks.Rcheck/00_pkg_src/TCGAbiolinks/R/visualize.R:184-189)
Undefined global functions or variables:
barcode c3net clinical coordinates dCommSignif dNetInduce
dNetPipeline exon knnmi.cross limmacontrasts.fit limmamakeContrasts
minet portions rse_gene TabSubtypesCol_merged Tumor.purity value
visNet
```
# TCGAbiolinksGUI
Version: 1.6.1
## In both
* checking package dependencies ... ERROR
```
Packages required but not available:
‘IlluminaHumanMethylation450kanno.ilmn12.hg19’
‘IlluminaHumanMethylation450kmanifest’
‘IlluminaHumanMethylation27kmanifest’
‘IlluminaHumanMethylation27kanno.ilmn12.hg19’
‘IlluminaHumanMethylationEPICanno.ilm10b2.hg19’
‘IlluminaHumanMethylationEPICmanifest’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# TCGAbiolinksGUI.data
Version: 1.0.0
## In both
* checking installed package size ... NOTE
```
installed size is 19.7Mb
sub-directories of 1Mb or more:
data 18.6Mb
doc 1.0Mb
```
# Tcomp
Version: 1.0.1
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘Mcomp’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# tcR
Version: 2.2.3
## In both
* checking installed package size ... NOTE
```
installed size is 7.8Mb
sub-directories of 1Mb or more:
data 1.4Mb
doc 3.9Mb
R 2.1Mb
```
# tempcyclesdata
Version: 1.0.1
## In both
* checking installed package size ... NOTE
```
installed size is 6.2Mb
sub-directories of 1Mb or more:
data 6.1Mb
```
# textfeatures
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘utils’
All declared Imports should be used.
```
# TextForecast
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘doParallel’ ‘forecast’ ‘lars’ ‘parallel’ ‘tau’ ‘tsDyn’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 40 marked UTF-8 strings
```
# textmining
Version: 0.0.1
## In both
* checking whether package ‘textmining’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/textmining/new/textmining.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘textmining’ ...
** package ‘textmining’ successfully unpacked and MD5 sums checked
** R
** byte-compile and prepare package for lazy loading
Warning in fun(libname, pkgname) : couldn't connect to display ""
Error : .onLoad failed in loadNamespace() for 'mallet', details:
call: NULL
error: .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/textmining/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/textmining/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/textmining/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘textmining’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/textmining/new/textmining.Rcheck/textmining’
```
### CRAN
```
* installing *source* package ‘textmining’ ...
** package ‘textmining’ successfully unpacked and MD5 sums checked
** R
** byte-compile and prepare package for lazy loading
Warning in fun(libname, pkgname) : couldn't connect to display ""
Error : .onLoad failed in loadNamespace() for 'mallet', details:
call: NULL
error: .onLoad failed in loadNamespace() for 'rJava', details:
call: dyn.load(file, DLLpath = DLLpath, ...)
error: unable to load shared object '/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/textmining/rJava/libs/rJava.so':
dlopen(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/textmining/rJava/libs/rJava.so, 6): Library not loaded: /Library/Java/JavaVirtualMachines/jdk-11.0.1.jdk/Contents/Home/lib/server/libjvm.dylib
Referenced from: /Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/textmining/rJava/libs/rJava.so
Reason: image not found
ERROR: lazy loading failed for package ‘textmining’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/textmining/old/textmining.Rcheck/textmining’
```
# textrecipes
Version: 0.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘stringr’
All declared Imports should be used.
```
# textreuse
Version: 0.1.4
## In both
* checking Rd cross-references ... WARNING
```
Unknown package ‘tm’ in Rd xrefs
```
# TFEA.ChIP
Version: 1.0.0
## In both
* checking package dependencies ... ERROR
```
Packages required but not available:
‘TxDb.Hsapiens.UCSC.hg19.knownGene’ ‘org.Hs.eg.db’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# TFutils
Version: 1.0.0
## In both
* checking examples ... ERROR
```
Running examples in ‘TFutils-Ex.R’ failed
The error most likely occurred in:
> ### Name: genemodelDF
> ### Title: use EnsDb to generate an exon-level model of genes identified by
> ### symbol
> ### Aliases: genemodelDF
>
> ### ** Examples
>
> if (requireNamespace("EnsDb.Hsapiens.v75")) {
+ orm = genemodelDF("ORMDL3", EnsDb.Hsapiens.v75::EnsDb.Hsapiens.v75)
+ dim(orm)
+ }
Loading required namespace: EnsDb.Hsapiens.v75
Failed with error: 'there is no package called 'EnsDb.Hsapiens.v75''
> head(orm)
Error in head(orm) : object 'orm' not found
Execution halted
```
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
4: getExportedValue(pkg, name)
5: asNamespace(ns)
6: getNamespace(ns)
7: tryCatch(loadNamespace(name), error = function(e) stop(e))
8: tryCatchList(expr, classes, parentenv, handlers)
9: tryCatchOne(expr, names, parentenv, handlers[[1L]])
10: value[[3L]](cond)
Failed with error: 'there is no package called 'EnsDb.Hsapiens.v75''
══ testthat results ═══════════════════════════════════════════════════════════
OK: 3 SKIPPED: 0 FAILED: 1
1. Error: grabTab returns expected records (@test.R#6)
Error: testthat unit tests failed
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 26-40 (TFutils.Rmd)
Error: processing vignette 'TFutils.Rmd' failed with diagnostics:
there is no package called 'org.Hs.eg.db'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘Homo.sapiens’ ‘GO.db’ ‘org.Hs.eg.db’ ‘EnsDb.Hsapiens.v75’
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 62 marked UTF-8 strings
```
# TH.data
Version: 1.0-10
## In both
* checking installed package size ... NOTE
```
installed size is 8.5Mb
sub-directories of 1Mb or more:
data 1.1Mb
rda 7.1Mb
```
# theseus
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘gridExtra’ ‘splancs’ ‘tidyverse’
All declared Imports should be used.
```
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘DESeq2’, ‘dada2’
```
# tidybayes
Version: 1.0.4
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
modules = modules, factories = factories, jags = jags, call.setup = TRUE, method = method,
mutate = mutate)
10: setup.jags(model = outmodel, monitor = outmonitor, data = outdata, n.chains = n.chains,
inits = outinits, modules = modules, factories = factories, response = response,
fitted = fitted, residual = residual, jags = jags, method = method, mutate = mutate)
11: loadandcheckrjags()
12: stop("Loading the rjags package failed (diagnostics are given above this error message)",
call. = FALSE)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 224 SKIPPED: 43 FAILED: 1
1. Error: tidy_draws works with runjags (@test.tidy_draws.R#87)
Error: testthat unit tests failed
Execution halted
```
# tidymodels
Version: 0.0.2
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘broom’ ‘dials’ ‘parsnip’
All declared Imports should be used.
```
# tidyquant
Version: 0.5.5
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
doc 4.1Mb
```
# tidyqwi
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘plyr’
All declared Imports should be used.
```
# tidyr
Version: 0.8.3
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 23 marked UTF-8 strings
```
# tidyRSS
Version: 1.2.8
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘testthat’
All declared Imports should be used.
```
# tidystopwords
Version: 0.9.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 229801 marked UTF-8 strings
```
# tidytransit
Version: 0.3.8
## In both
* checking installed package size ... NOTE
```
installed size is 5.3Mb
sub-directories of 1Mb or more:
extdata 4.4Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘here’ ‘htmltools’ ‘scales’ ‘stringr’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 41 marked UTF-8 strings
```
# tidyverse
Version: 1.2.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dbplyr’ ‘reprex’ ‘rlang’
All declared Imports should be used.
```
# tidyxl
Version: 1.0.4
## In both
* checking compiled code ... WARNING
```
File ‘tidyxl/libs/tidyxl.so’:
Found ‘_abort’, possibly from ‘abort’ (C)
Object: ‘xlex.o’
Compiled code should not call entry points which might terminate R nor
write to stdout/stderr instead of to the console, nor use Fortran I/O
nor system RNGs.
See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
```
# tilegramsR
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘sp’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 341 marked UTF-8 strings
```
# timelineS
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘base’
All declared Imports should be used.
```
# TimerQuant
Version: 1.10.0
## In both
* checking re-building of vignette outputs ... WARNING
```
...
no non-missing arguments to min; returning Inf
Warning in min(x, na.rm = TRUE) :
no non-missing arguments to min; returning Inf
Warning in min(x, na.rm = TRUE) :
no non-missing arguments to min; returning Inf
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'genPaperFigures.tex' failed.
LaTeX errors:
! LaTeX Error: File `float.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.30 \date
{}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking R code for possible problems ... NOTE
```
...
plotPrimordiumProfile: no visible global function definition for
‘points’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TimerQuant/new/TimerQuant.Rcheck/00_pkg_src/TimerQuant/R/plotPrimordiumProfile.R:24)
plotPrimordiumProfile: no visible global function definition for
‘polygon’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TimerQuant/new/TimerQuant.Rcheck/00_pkg_src/TimerQuant/R/plotPrimordiumProfile.R:26-27)
plotPrimordiumProfile: no visible global function definition for ‘rgb’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TimerQuant/new/TimerQuant.Rcheck/00_pkg_src/TimerQuant/R/plotPrimordiumProfile.R:26-27)
simulatedRatio: no visible global function definition for ‘rnorm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TimerQuant/new/TimerQuant.Rcheck/00_pkg_src/TimerQuant/R/SAPSstochastic.R:4)
simulatedRatio: no visible global function definition for ‘rnorm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TimerQuant/new/TimerQuant.Rcheck/00_pkg_src/TimerQuant/R/SAPSstochastic.R:5)
Undefined global functions or variables:
approxfun axis mad median optimize par plot points polygon predict
rainbow rgb rnorm
Consider adding
importFrom("graphics", "axis", "par", "plot", "points", "polygon")
importFrom("grDevices", "rainbow", "rgb")
importFrom("stats", "approxfun", "mad", "median", "optimize",
"predict", "rnorm")
to your NAMESPACE file.
```
# timescape
Version: 1.4.0
## In both
* checking Rd \usage sections ... WARNING
```
Duplicated \argument entries in documentation object 'timescapeOutput':
‘width’ ‘height’ ‘mutations’ ‘height’ ‘width’ ‘clonal_prev’
‘tree_edges’ ‘alpha’ ‘clonal_prev’ ‘tree_edges’ ‘genotype_position’
‘clone_colours’ ‘perturbations’ ‘mutations’ ‘tree_edges’
‘clonal_prev’ ‘clonal_prev’ ‘tree_edges’ ‘clone_colours’ ‘mutations’
Functions with \usage entries need to have the appropriate \alias
entries, and all their arguments documented.
The \usage entries must correspond to syntactically valid R code.
See chapter ‘Writing R documentation files’ in the ‘Writing R
Extensions’ manual.
```
* checking for hidden files and directories ... NOTE
```
Found the following hidden files and directories:
.vscode
These were most likely included in error. See section ‘Package
structure’ in the ‘Writing R Extensions’ manual.
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘gtools’
All declared Imports should be used.
```
* checking R code for possible problems ... NOTE
```
getMutationsData: no visible binding for global variable
‘show_warnings’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/timescape/new/timescape.Rcheck/00_pkg_src/timescape/R/timescape.R:653-657)
Undefined global functions or variables:
show_warnings
```
# timetk
Version: 0.1.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘devtools’ ‘forecast’
All declared Imports should be used.
```
# TitanCNA
Version: 1.18.0
## In both
* checking whether package ‘TitanCNA’ can be installed ... WARNING
```
Found the following significant warnings:
Warning: replacing previous import ‘GenomicRanges::shift’ by ‘data.table::shift’ when loading ‘TitanCNA’
Warning: replacing previous import ‘IRanges::collapse’ by ‘dplyr::collapse’ when loading ‘TitanCNA’
Warning: replacing previous import ‘data.table::last’ by ‘dplyr::last’ when loading ‘TitanCNA’
Warning: replacing previous import ‘GenomicRanges::union’ by ‘dplyr::union’ when loading ‘TitanCNA’
Warning: replacing previous import ‘IRanges::slice’ by ‘dplyr::slice’ when loading ‘TitanCNA’
Warning: replacing previous import ‘GenomeInfoDb::intersect’ by ‘dplyr::intersect’ when loading ‘TitanCNA’
Warning: replacing previous import ‘GenomicRanges::setdiff’ by ‘dplyr::setdiff’ when loading ‘TitanCNA’
Warning: replacing previous import ‘data.table::first’ by ‘dplyr::first’ when loading ‘TitanCNA’
Warning: replacing previous import ‘IRanges::desc’ by ‘dplyr::desc’ when loading ‘TitanCNA’
Warning: replacing previous import ‘data.table::between’ by ‘dplyr::between’ when loading ‘TitanCNA’
Warning: replacing previous import ‘dplyr::select’ by ‘VariantAnnotation::select’ when loading ‘TitanCNA’
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TitanCNA/new/TitanCNA.Rcheck/00install.out’ for details.
```
* checking installed package size ... NOTE
```
installed size is 8.1Mb
sub-directories of 1Mb or more:
data 1.7Mb
extdata 4.9Mb
```
* checking R code for possible problems ... NOTE
```
...
filterByTargetedSequences haplotypeBin HaplotypeBinDepth.mean
HaplotypeBinDepth.sum HaplotypeDepth.mean
HaplotypeDepth.mean.symmetric HaplotypeDepth.sum
HaplotypeDepth.sum.symmetric HaplotypeFraction
HaplotypeFraction.symmetric HaplotypeRatio HaplotypeRatio.1
HaplotypeRatio.2 head keepChr Length.snp. lines loess
logR_Copy_Number LogRatio lowess MajorCN Median_logR Median_Ratio
MinorCN mtext na.omit nonRef par phasedAlleleFraction phasedCount
phasedCount.haploSymmetric phaseSet phaseSet.aggr plot points predict
queryHits read.delim rowRanges rowRanges<- Sample seq.info SNPs Start
Start_Position.bp. Start.snp Start.telo subjectHits tail TITAN_call
TITANcall TITANstate tumDepth uniroot unstrsplit write.table xtabs
Consider adding
importFrom("graphics", "abline", "axis", "lines", "mtext", "par",
"plot", "points")
importFrom("methods", "as")
importFrom("stats", "approxfun", "dunif", "loess", "lowess", "na.omit",
"predict", "uniroot", "xtabs")
importFrom("utils", "head", "read.delim", "tail", "write.table")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘HMMcopy’, ‘list’
```
# tmap
Version: 2.2
## In both
* checking installed package size ... NOTE
```
installed size is 6.9Mb
sub-directories of 1Mb or more:
data 1.4Mb
doc 1.4Mb
R 3.8Mb
```
# toxEval
Version: 1.0.3
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘shinyAce’ ‘shinycssloaders’ ‘shinydashboard’
All declared Imports should be used.
```
# TPP
Version: 3.8.5
## In both
* checking re-building of vignette outputs ... WARNING
```
...
done.
Creating QC plots to visualize normalization effects...
done.
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'NPARC_analysis_of_TPP_TR_data.tex' failed.
LaTeX errors:
! LaTeX Error: File `forloop.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.35 ^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
* checking installed package size ... NOTE
```
installed size is 15.1Mb
sub-directories of 1Mb or more:
data 1.9Mb
example_data 8.0Mb
R 2.1Mb
test_data 1.9Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘broom’
All declared Imports should be used.
Unexported objects imported by ':::' calls:
‘doParallel:::.options’ ‘mefa:::rep.data.frame’
See the note in ?`:::` about the use of this operator.
```
* checking R code for possible problems ... NOTE
```
File ‘TPP/R/TPP.R’:
.onLoad calls:
packageStartupMessage(msgText, "\n")
See section ‘Good practice’ in '?.onAttach'.
plot_fSta_distribution: no visible binding for global variable
‘..density..’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TPP/new/TPP.Rcheck/00_pkg_src/TPP/R/plot_fSta_distribution.R:19-28)
plot_pVal_distribution: no visible binding for global variable
‘..density..’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/TPP/new/TPP.Rcheck/00_pkg_src/TPP/R/plot_pVal_distribution.R:22-31)
Undefined global functions or variables:
..density..
```
# trackr
Version: 0.10.5
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Loading required package: histry
Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet = quiet, :
Running 'texi2dvi' on 'Extending-trackr.tex' failed.
LaTeX errors:
! LaTeX Error: File `times.sty' not found.
Type X to quit or <RETURN> to proceed,
or enter new name. (Default extension: sty)
! Emergency stop.
<read *>
l.60 \usepackage
{hyperref}^^M
! ==> Fatal error occurred, no output PDF file produced!
Calls: buildVignettes -> texi2pdf -> texi2dvi
Execution halted
```
# TrafficBDE
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘neuralnet’
All declared Imports should be used.
```
# treeio
Version: 1.4.3
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
2: getExportedValue(pkg, name)
3: asNamespace(ns)
4: getNamespace(ns)
5: tryCatch(loadNamespace(name), error = function(e) stop(e))
6: tryCatchList(expr, classes, parentenv, handlers)
7: tryCatchOne(expr, names, parentenv, handlers[[1L]])
8: value[[3L]](cond)
══ testthat results ═══════════════════════════════════════════════════════════
OK: 91 SKIPPED: 0 FAILED: 2
1. Error: (unknown) (@test-conversion.R#4)
2. Error: (unknown) (@test-treedata-accessor.R#34)
Error: testthat unit tests failed
Execution halted
```
# trialr
Version: 0.0.7
## In both
* checking installed package size ... NOTE
```
installed size is 8.1Mb
sub-directories of 1Mb or more:
libs 6.5Mb
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# tricolore
Version: 1.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 88 marked UTF-8 strings
```
# trread
Version: 0.2.7
## In both
* checking installed package size ... NOTE
```
installed size is 5.2Mb
sub-directories of 1Mb or more:
extdata 4.4Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘here’ ‘htmltools’ ‘scales’ ‘stringr’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 41 marked UTF-8 strings
```
# TSstudio
Version: 0.1.3
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘colormap’
All declared Imports should be used.
```
# ttestshiny
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘shinyAce’ ‘shinyjs’
All declared Imports should be used.
```
# turfR
Version: 0.8-7
## In both
* checking R code for possible problems ... NOTE
```
turf: no visible global function definition for ‘read.table’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/turfR/new/turfR.Rcheck/00_pkg_src/turfR/R/turfR_0.8-7.R:12)
turf: no visible global function definition for ‘flush.console’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/turfR/new/turfR.Rcheck/00_pkg_src/turfR/R/turfR_0.8-7.R:102)
turf.combos: no visible global function definition for ‘combn’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/turfR/new/turfR.Rcheck/00_pkg_src/turfR/R/turfR_0.8-7.R:158)
Undefined global functions or variables:
combn flush.console read.table
Consider adding
importFrom("utils", "combn", "flush.console", "read.table")
to your NAMESPACE file.
```
# ufs
Version: 0.2.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘viridis’
All declared Imports should be used.
```
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘userfriendlyscience’, ‘behaviorchange’, ‘MBESS’
```
# ukbtools
Version: 0.11.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘plyr’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 5 marked UTF-8 strings
```
# UKgrid
Version: 0.1.0
## In both
* checking installed package size ... NOTE
```
installed size is 12.5Mb
sub-directories of 1Mb or more:
data 4.1Mb
doc 8.3Mb
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘rlang’
All declared Imports should be used.
```
# unvotes
Version: 0.2.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 4494 marked UTF-8 strings
```
# USAboundaries
Version: 0.3.1
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘USAboundariesData’
```
# utilsIPEA
Version: 0.0.6
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘RCurl’ ‘stringdist’ ‘utils’
All declared Imports should be used.
```
# vaersNDvax
Version: 1.0.4
## In both
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking: ‘vaers’ ‘vaersND’
```
# vaersvax
Version: 1.0.5
## In both
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking: ‘vaers’ ‘vaersND’
```
# vapour
Version: 0.1.0
## In both
* checking whether package ‘vapour’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/vapour/new/vapour.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘vapour’ ...
** package ‘vapour’ successfully unpacked and MD5 sums checked
configure: CC: clang
configure: CXX: clang++
checking for gdal-config... no
no
configure: error: gdal-config not found or not executable.
ERROR: configuration failed for package ‘vapour’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/vapour/new/vapour.Rcheck/vapour’
```
### CRAN
```
* installing *source* package ‘vapour’ ...
** package ‘vapour’ successfully unpacked and MD5 sums checked
configure: CC: clang
configure: CXX: clang++
checking for gdal-config... no
no
configure: error: gdal-config not found or not executable.
ERROR: configuration failed for package ‘vapour’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/vapour/old/vapour.Rcheck/vapour’
```
# vdmR
Version: 0.2.6
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘maptools’ ‘Rdpack’ ‘rgeos’
All declared Imports should be used.
```
# VIM
Version: 4.8.0
## In both
* checking Rd cross-references ... NOTE
```
Packages unavailable to check Rd xrefs: ‘mvoutlier’, ‘StatDA’, ‘mi’, ‘tkrplot’
```
# vkR
Version: 0.1
## In both
* checking dependencies in R code ... NOTE
```
Missing or unexported object: ‘jsonlite::rbind.pages’
```
# vlad
Version: 0.2.0
## In both
* checking whether package ‘vlad’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/vlad/new/vlad.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘vlad’ ...
** package ‘vlad’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I../inst/include/ -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/vlad/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/vlad/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘vlad’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/vlad/new/vlad.Rcheck/vlad’
```
### CRAN
```
* installing *source* package ‘vlad’ ...
** package ‘vlad’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I../inst/include/ -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/vlad/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/vlad/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘vlad’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/vlad/old/vlad.Rcheck/vlad’
```
# volleystat
Version: 0.1.0
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 3584 marked UTF-8 strings
```
# vqtl
Version: 2.0.5
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘iterators’ ‘knitr’ ‘purrr’ ‘testthat’
All declared Imports should be used.
```
# vsn
Version: 3.48.1
## In both
* checking package dependencies ... NOTE
```
Package suggested but not available for checking: ‘affydata’
```
* checking re-building of vignette outputs ... NOTE
```
...
anyDuplicated, append, as.data.frame, basename, cbind, colMeans,
colnames, colSums, dirname, do.call, duplicated, eval, evalq,
Filter, Find, get, grep, grepl, intersect, is.unsorted, lapply,
lengths, Map, mapply, match, mget, order, paste, pmax, pmax.int,
pmin, pmin.int, Position, rank, rbind, Reduce, rowMeans,
rownames, rowSums, sapply, setdiff, sort, table, tapply, union,
unique, unsplit, which, which.max, which.min
Welcome to Bioconductor
Vignettes contain introductory material; view with
'browseVignettes()'. To cite Bioconductor, see
'citation("Biobase")', and for packages 'citation("pkgname")'.
Warning in has_utility("convert", "ImageMagick") :
ImageMagick not installed or not in PATH
Quitting from lines 256-259 (A-vsn.Rmd)
Error: processing vignette 'A-vsn.Rmd' failed with diagnostics:
there is no package called 'affydata'
Execution halted
```
# VWPre
Version: 1.1.0
## In both
* checking installed package size ... NOTE
```
installed size is 5.0Mb
sub-directories of 1Mb or more:
data 3.1Mb
doc 1.3Mb
```
# waccR
Version: 0.1.0
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 30-36 (Calculate_WACC_in_R.Rmd)
Error: processing vignette 'Calculate_WACC_in_R.Rmd' failed with diagnostics:
Timeout was reached: Resolving timed out after 10000 milliseconds
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘lubridate’ ‘tibble’
All declared Imports should be used.
```
# walker
Version: 0.2.5
## In both
* R CMD check timed out
* checking installed package size ... NOTE
```
installed size is 7.9Mb
sub-directories of 1Mb or more:
doc 1.8Mb
libs 5.6Mb
```
* checking for GNU extensions in Makefiles ... NOTE
```
GNU make is a SystemRequirements.
```
# wallace
Version: 1.0.6
## In both
* checking package dependencies ... ERROR
```
Package required but not available: ‘spThin’
See section ‘The DESCRIPTION file’ in the ‘Writing R Extensions’
manual.
```
# wand
Version: 0.2.0
## In both
* checking whether package ‘wand’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wand/new/wand.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘wand’ ...
** package ‘wand’ successfully unpacked and MD5 sums checked
Checking to see if libmagic is available...
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -L/usr/include -L/usr/local/include -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/wand/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: warning: argument unused during compilation: '-L/usr/include' [-Wunused-command-line-argument]
clang: warning: argument unused during compilation: '-L/usr/local/include' [-Wunused-command-line-argument]
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -L/usr/include -L/usr/local/include -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/wand/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c wand.cpp -o wand.o
clang: warning: argument unused during compilation: '-L/usr/include' [-Wunused-command-line-argument]
clang: warning: argument unused during compilation: '-L/usr/local/include' [-Wunused-command-line-argument]
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o wand.so RcppExports.o wand.o -L/usr/local/lib -L/usr/lib -lmagic -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
ld: library not found for -lmagic
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [wand.so] Error 1
ERROR: compilation failed for package ‘wand’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wand/new/wand.Rcheck/wand’
```
### CRAN
```
* installing *source* package ‘wand’ ...
** package ‘wand’ successfully unpacked and MD5 sums checked
Checking to see if libmagic is available...
** libs
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -L/usr/include -L/usr/local/include -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/wand/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c RcppExports.cpp -o RcppExports.o
clang: warning: argument unused during compilation: '-L/usr/include' [-Wunused-command-line-argument]
clang: warning: argument unused during compilation: '-L/usr/local/include' [-Wunused-command-line-argument]
clang++ -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -L/usr/include -L/usr/local/include -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/wand/Rcpp/include" -I/usr/local/include -fPIC -O3 -Wno-c++11-inline-namespace -c wand.cpp -o wand.o
clang: warning: argument unused during compilation: '-L/usr/include' [-Wunused-command-line-argument]
clang: warning: argument unused during compilation: '-L/usr/local/include' [-Wunused-command-line-argument]
clang++ -dynamiclib -Wl,-headerpad_max_install_names -undefined dynamic_lookup -single_module -multiply_defined suppress -L/Library/Frameworks/R.framework/Resources/lib -L/usr/local/lib -o wand.so RcppExports.o wand.o -L/usr/local/lib -L/usr/lib -lmagic -F/Library/Frameworks/R.framework/.. -framework R -Wl,-framework -Wl,CoreFoundation
ld: library not found for -lmagic
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make: *** [wand.so] Error 1
ERROR: compilation failed for package ‘wand’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wand/old/wand.Rcheck/wand’
```
# weathercan
Version: 0.2.8
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘xml2’
All declared Imports should be used.
```
* checking data for non-ASCII characters ... NOTE
```
Note: found 25 marked UTF-8 strings
```
# weibulltools
Version: 1.0.1
## In both
* checking whether package ‘weibulltools’ can be installed ... ERROR
```
Installation failed.
See ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/weibulltools/new/weibulltools.Rcheck/00install.out’ for details.
```
## Installation
### Devel
```
* installing *source* package ‘weibulltools’ ...
** package ‘weibulltools’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/weibulltools/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/weibulltools/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘weibulltools’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/weibulltools/new/weibulltools.Rcheck/weibulltools’
```
### CRAN
```
* installing *source* package ‘weibulltools’ ...
** package ‘weibulltools’ successfully unpacked and MD5 sums checked
** libs
clang++ -std=gnu++11 -I"/Library/Frameworks/R.framework/Resources/include" -DNDEBUG -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/weibulltools/Rcpp/include" -I"/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/library.noindex/weibulltools/RcppArmadillo/include" -I/usr/local/include -fopenmp -fPIC -Wall -g -O2 -c RcppExports.cpp -o RcppExports.o
clang: error: unsupported option '-fopenmp'
make: *** [RcppExports.o] Error 1
ERROR: compilation failed for package ‘weibulltools’
* removing ‘/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/weibulltools/old/weibulltools.Rcheck/weibulltools’
```
# whereport
Version: 0.1
## In both
* checking data for non-ASCII characters ... NOTE
```
Note: found 4102 marked UTF-8 strings
```
# wiggleplotr
Version: 1.4.0
## In both
* checking examples ... ERROR
```
...
Loading required package: IRanges
Attaching package: ‘IRanges’
The following objects are masked from ‘package:dplyr’:
collapse, desc, slice
Loading required package: GenomeInfoDb
> require("org.Hs.eg.db")
Loading required package: org.Hs.eg.db
Warning in library(package, lib.loc = lib.loc, character.only = TRUE, logical.return = TRUE, :
there is no package called ‘org.Hs.eg.db’
> require("TxDb.Hsapiens.UCSC.hg38.knownGene")
Loading required package: TxDb.Hsapiens.UCSC.hg38.knownGene
Warning in library(package, lib.loc = lib.loc, character.only = TRUE, logical.return = TRUE, :
there is no package called ‘TxDb.Hsapiens.UCSC.hg38.knownGene’
>
> orgdb = org.Hs.eg.db
Error: object 'org.Hs.eg.db' not found
Execution halted
```
* checking for code/documentation mismatches ... WARNING
```
Codoc mismatches from documentation object 'getGenotypePalette':
getGenotypePalette
Code: function(old = FALSE)
Docs: function()
Argument names in code not in docs:
old
```
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Quitting from lines 18-28 (wiggleplotr.Rmd)
Error: processing vignette 'wiggleplotr.Rmd' failed with diagnostics:
there is no package called 'EnsDb.Hsapiens.v86'
Execution halted
```
* checking package dependencies ... NOTE
```
Packages suggested but not available for checking:
‘EnsDb.Hsapiens.v86’ ‘org.Hs.eg.db’
‘TxDb.Hsapiens.UCSC.hg38.knownGene’
```
* checking R code for possible problems ... NOTE
```
plotCoverage: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wiggleplotr/new/wiggleplotr.Rcheck/00_pkg_src/wiggleplotr/R/wiggleplotr.R:184)
plotCoverage: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wiggleplotr/new/wiggleplotr.Rcheck/00_pkg_src/wiggleplotr/R/wiggleplotr.R:185)
plotTranscripts: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wiggleplotr/new/wiggleplotr.Rcheck/00_pkg_src/wiggleplotr/R/wiggleplotr.R:33)
plotTranscripts: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/wiggleplotr/new/wiggleplotr.Rcheck/00_pkg_src/wiggleplotr/R/wiggleplotr.R:34)
Undefined global functions or variables:
is
Consider adding
importFrom("methods", "is")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# wikipediatrend
Version: 1.1.14
## In both
* checking tests ...
```
ERROR
Running the tests in ‘tests/testthat.R’ failed.
Last 13 lines of output:
Loading required package: wikipediatrend
── 1. Failure: wp_linked_pages() is robust against no-data, sparse data (@test_a
`{ ... }` threw an error.
Message: Timeout was reached: Resolving timed out after 10000 milliseconds
Class: simpleError/error/condition
══ testthat results ═══════════════════════════════════════════════════════════
OK: 54 SKIPPED: 0 FAILED: 1
1. Failure: wp_linked_pages() is robust against no-data, sparse data (@test_aux.R#6)
Error: testthat unit tests failed
In addition: Warning message:
In .getClassesFromCache(Class) :
closing unused connection 3 (https://en.wikipedia.org/wiki/Sheerness_Lifeboat_Station)
Execution halted
```
# wikisourcer
Version: 0.1.2
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
Attaching package: 'dplyr'
The following objects are masked from 'package:stats':
filter, lag
The following objects are masked from 'package:base':
intersect, setdiff, setequal, union
Quitting from lines 78-81 (wikisourcer.Rmd)
Error: processing vignette 'wikisourcer.Rmd' failed with diagnostics:
HTTP error 404.
Execution halted
```
# wiseR
Version: 1.0.1
## In both
* checking installed package size ... NOTE
```
installed size is 7.2Mb
sub-directories of 1Mb or more:
bn 7.1Mb
```
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘arules’ ‘bnlearn’ ‘DescTools’ ‘dplyr’ ‘DT’ ‘graph’ ‘HydeNet’
‘igraph’ ‘linkcomm’ ‘missRanger’ ‘parallel’ ‘psych’ ‘RBGL’
‘Rgraphviz’ ‘rhandsontable’ ‘rintrojs’ ‘shinyalert’ ‘shinyBS’
‘shinycssloaders’ ‘shinydashboard’ ‘shinyWidgets’ ‘tools’
‘visNetwork’
All declared Imports should be used.
```
# wordbankr
Version: 0.3.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dbplyr’
All declared Imports should be used.
```
# XBSeq
Version: 1.12.0
## In both
* R CMD check timed out
* checking whether the namespace can be loaded with stated dependencies ... NOTE
```
Warning: no function found corresponding to methods exports from ‘XBSeq’ for: ‘conditions’, ‘conditions<-’, ‘dispTable’
A namespace must be able to be loaded with just the base namespace
loaded: otherwise if the namespace gets loaded by a saved object, the
session will be unable to start.
Probably some imports need to be declared in the NAMESPACE file.
```
* checking R code for possible problems ... NOTE
```
...
‘conditions’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/XBSeq/new/XBSeq.Rcheck/00_pkg_src/XBSeq/R/core_functions.R:106)
estimateSCV,XBSeqDataSet: no visible global function definition for
‘conditions’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/XBSeq/new/XBSeq.Rcheck/00_pkg_src/XBSeq/R/core_functions.R:107)
estimateSCV,XBSeqDataSet: no visible global function definition for
‘dispTable<-’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/XBSeq/new/XBSeq.Rcheck/00_pkg_src/XBSeq/R/core_functions.R:108)
Undefined global functions or variables:
..count.. assay assay<- assays baseMean coefficients complete.cases
conditions cor data DataFrame ddelap dispTable dispTable<- dnbinom
dpois formula Gamma glm Group log2FoldChange median optim p.adjust
pbeta predict qbeta quantile rnbinom Sample scvBiasCorrectionFits
SummarizedExperiment
Consider adding
importFrom("stats", "coefficients", "complete.cases", "cor", "dnbinom",
"dpois", "formula", "Gamma", "glm", "median", "optim",
"p.adjust", "pbeta", "predict", "qbeta", "quantile",
"rnbinom")
importFrom("utils", "data")
to your NAMESPACE file.
```
# XGR
Version: 1.1.4
## In both
* checking installed package size ... NOTE
```
installed size is 6.8Mb
sub-directories of 1Mb or more:
data 1.1Mb
R 4.0Mb
```
# XKCDdata
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘tibble’
All declared Imports should be used.
```
# xpose4
Version: 4.6.1
## In both
* checking installed package size ... NOTE
```
installed size is 5.1Mb
sub-directories of 1Mb or more:
R 4.0Mb
```
# xtractomatic
Version: 3.4.2
## In both
* checking re-building of vignette outputs ... WARNING
```
Error in re-building vignettes:
...
date lon lat lowLon higLon lowLat higLat
4/23/2003 203.899 19.664 203.899 203.899 19.664 19.664
4/24/2003 204.151 19.821 203.912597 204.389403 18.78051934 20.86148066
4/30/2003 203.919 20.351 203.6793669 204.1586331 18.79728188 21.90471812
5/1/2003 204.229 20.305 203.9943343 204.4636657 18.90440013 21.70559987
Quitting from lines 818-843 (Usingxtractomatic.Rmd)
Error: processing vignette 'Usingxtractomatic.Rmd' failed with diagnostics:
(converted from warning) Removed 4070 rows containing missing values (geom_raster).
Execution halted
```
* checking dependencies in R code ... NOTE
```
Namespace in Imports field not imported from: ‘dplyr’
All declared Imports should be used.
```
# Zelig
Version: 5.1.6
## In both
* checking installed package size ... NOTE
```
installed size is 7.1Mb
sub-directories of 1Mb or more:
R 6.0Mb
```
# zeligverse
Version: 0.1.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘Amelia’ ‘MatchIt’ ‘WhatIf’
All declared Imports should be used.
```
# zFPKM
Version: 1.2.0
## In both
* checking examples ... ERROR
```
...
> library(dplyr)
Attaching package: ‘dplyr’
The following objects are masked from ‘package:stats’:
filter, lag
The following objects are masked from ‘package:base’:
intersect, setdiff, setequal, union
> gse94802 <- "ftp://ftp.ncbi.nlm.nih.gov/geo/series/GSE94nnn/GSE94802/suppl/GSE94802_Minkina_etal_normalized_FPKM.csv.gz"
> temp <- tempfile()
> download.file(gse94802, temp)
trying URL 'ftp://ftp.ncbi.nlm.nih.gov/geo/series/GSE94nnn/GSE94802/suppl/GSE94802_Minkina_etal_normalized_FPKM.csv.gz'
Warning in download.file(gse94802, temp) :
URL 'ftp://ftp.ncbi.nlm.nih.gov/geo/series/GSE94nnn/GSE94802/suppl/GSE94802_Minkina_etal_normalized_FPKM.csv.gz': status was 'Couldn't resolve host name'
Error in download.file(gse94802, temp) :
cannot open URL 'ftp://ftp.ncbi.nlm.nih.gov/geo/series/GSE94nnn/GSE94802/suppl/GSE94802_Minkina_etal_normalized_FPKM.csv.gz'
Execution halted
```
* checking re-building of vignette outputs ... WARNING
```
...
The following objects are masked from 'package:matrixStats':
colMaxs, colMins, colRanges, rowMaxs, rowMins, rowRanges
The following objects are masked from 'package:base':
aperm, apply
Attaching package: 'tidyr'
The following object is masked from 'package:S4Vectors':
expand
trying URL 'ftp://ftp.ncbi.nlm.nih.gov/geo/series/GSE94nnn/GSE94802/suppl/GSE94802_Minkina_etal_normalized_FPKM.csv.gz'
Quitting from lines 34-70 (zFPKM.Rmd)
Error: processing vignette 'zFPKM.Rmd' failed with diagnostics:
cannot open URL 'ftp://ftp.ncbi.nlm.nih.gov/geo/series/GSE94nnn/GSE94802/suppl/GSE94802_Minkina_etal_normalized_FPKM.csv.gz'
Execution halted
```
* checking R code for possible problems ... NOTE
```
...
PlotGaussianFitDF: no visible binding for global variable ‘density’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:223)
PlotGaussianFitDF: no visible binding for global variable ‘log2fpkm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:223)
PlotGaussianFitDF: no visible binding for global variable ‘sample_name’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:223)
PlotGaussianFitDF: no visible binding for global variable ‘log2fpkm’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:227-233)
PlotGaussianFitDF: no visible binding for global variable ‘density’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:227-233)
zFPKMCalc: no visible global function definition for ‘density’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:162)
zFPKMTransform: no visible global function definition for ‘is’
(/Users/romain/git/tidyverse/dplyr-revdep/dplyr/revdep/checks.noindex/zFPKM/new/zFPKM.Rcheck/00_pkg_src/zFPKM/R/zfpkm.R:125-127)
Undefined global functions or variables:
density dnorm is log2fpkm sample_name
Consider adding
importFrom("methods", "is")
importFrom("stats", "density", "dnorm")
to your NAMESPACE file (and ensure that your DESCRIPTION Imports field
contains 'methods').
```
# ZipRadius
Version: 1.0.1
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘ggplot2’ ‘testthat’
All declared Imports should be used.
```
# ztype
Version: 0.1.0
## In both
* checking dependencies in R code ... NOTE
```
Namespaces in Imports field not imported from:
‘dplyr’ ‘ggplot2’ ‘lubridate’
All declared Imports should be used.
```
| 31.36196 | 4,502 | 0.673524 | eng_Latn | 0.7047 |
d38c0a1109b755b8c6ef4c651685993cde6cbf31 | 899 | md | Markdown | docs/1.0.0/docs/list_shadow-shadow.md | Legytma/LegytmaSchemas | 299290dde3a51d5032fc4246108c014cd0bc9312 | [
"Apache-2.0"
] | null | null | null | docs/1.0.0/docs/list_shadow-shadow.md | Legytma/LegytmaSchemas | 299290dde3a51d5032fc4246108c014cd0bc9312 | [
"Apache-2.0"
] | 6 | 2020-06-29T15:15:37.000Z | 2020-08-03T01:16:17.000Z | docs/1.1.1/docs/list_shadow-shadow.md | Legytma/LegytmaSchemas | 299290dde3a51d5032fc4246108c014cd0bc9312 | [
"Apache-2.0"
] | null | null | null | # Shadow Schema
```txt
https://legytma.com.br/schema/shadow.schema.json#/items
```
Shadow
> Used to identify parser. Every parser can permit only one type
>
| Abstract | Extensible | Status | Identifiable | Custom Properties | Additional Properties | Access Restrictions | Defined In |
| :------------------ | ---------- | -------------- | ----------------------- | :---------------- | --------------------- | ------------------- | ------------------------------------------------------------------------------------- |
| Can be instantiated | No | Unknown status | Unknown identifiability | Forbidden | Allowed | none | [list_shadow.schema.json\*](../schema/list_shadow.schema.json) |
## items Type
unknown ([Shadow](list_shadow-shadow.md))
| 44.95 | 233 | 0.420467 | eng_Latn | 0.535701 |
d38d22a23f54e0f9db78f54264f3225fdeb99170 | 154 | md | Markdown | content/tips/debian/01082.md | kaihendry/dabase.com | cc7bafea0ac032e283c31e6630a5b02808ffb777 | [
"W3C",
"MIT"
] | 5 | 2020-01-30T08:11:28.000Z | 2021-04-29T03:17:49.000Z | content/tips/debian/01082.md | kaihendry/dabase.com | cc7bafea0ac032e283c31e6630a5b02808ffb777 | [
"W3C",
"MIT"
] | 14 | 2020-02-22T02:35:21.000Z | 2021-03-09T15:08:14.000Z | content/tips/debian/01082.md | kaihendry/dabase.com | cc7bafea0ac032e283c31e6630a5b02808ffb777 | [
"W3C",
"MIT"
] | 12 | 2020-02-22T02:28:39.000Z | 2022-02-07T01:50:57.000Z | ---
date: 2007-12-26 20:59:43 +0000
url: /e/01082
title: Viewing a subversion dump file
---
from lord-helmchen on #svn
egrep '^Node-path: ' repos.dump
| 15.4 | 37 | 0.681818 | eng_Latn | 0.498177 |
d38ef99781d28fcc6dc477abf42062a22abaa6a3 | 3,201 | md | Markdown | docs/vs-2015/xml-tools/xml-snippets.md | hericlesme/visualstudio-docs.pt-br | 086d2f88af868af84582bc7f1d50ffc5ea14b11f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/xml-tools/xml-snippets.md | hericlesme/visualstudio-docs.pt-br | 086d2f88af868af84582bc7f1d50ffc5ea14b11f | [
"CC-BY-4.0",
"MIT"
] | null | null | null | docs/vs-2015/xml-tools/xml-snippets.md | hericlesme/visualstudio-docs.pt-br | 086d2f88af868af84582bc7f1d50ffc5ea14b11f | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-07-26T14:58:39.000Z | 2021-07-26T14:58:39.000Z | ---
title: Trechos de código XML | Microsoft Docs
ms.custom: ''
ms.date: 2018-06-30
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- vs-ide-general
ms.tgt_pltfrm: ''
ms.topic: article
ms.assetid: 348dbf64-3f09-4fff-b47a-a7ecdf3221cc
caps.latest.revision: 10
author: gewarren
ms.author: gewarren
manager: ghogen
ms.openlocfilehash: bae360d1aea43006138397b3bed2857a2b1dad59
ms.sourcegitcommit: 55f7ce2d5d2e458e35c45787f1935b237ee5c9f8
ms.translationtype: MT
ms.contentlocale: pt-BR
ms.lasthandoff: 08/22/2018
ms.locfileid: "47475599"
---
# <a name="xml-snippets"></a>Snippets XML
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
A versão mais recente deste tópico pode ser encontrada em [trechos XML](https://docs.microsoft.com/visualstudio/xml-tools/xml-snippets).
O Editor XML oferece um recurso, chamado *trechos XML*, que permite que você criar arquivos XML mais rapidamente. Você pode reutilizar XML inserindo snippets nos seus arquivos. Você também pode gerar os dados XML com base no esquema de linguagem de definição de esquema XML (XSD).
## <a name="reusable-xml-snippets"></a>Snippets reutilizáveis XML
O editor XML inclui muitos snippets que abrangem algumas tarefas comuns. Isso permite que você crie arquivos XML mais facilmente. Por exemplo, se você estivesse criando um esquema XML, usando o elemento de sequência tipo complexo” e “os snippets do elemento tipo simples” inserir o seguinte texto XML para o arquivo. Você alteraria o valor de `name` para atender às suas necessidades.
```
<xs:element name="name">
<xs:complexType>
<xs:sequence>
<xs:element name="name">
<xs:simpleType>
<xs:restriction base="xs:string"></xs:restriction>
</xs:simpleType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
```
Você pode inserir snippets de duas maneiras. O **Inserir trecho** comando insere o trecho XML a posição do cursor. O **envolver com** comando encapsula o trecho XML ao redor do texto selecionado. Ambos os comandos estão disponíveis do **IntelliSense** submenu sob o **editar** menu, ou no menu de atalho do editor.
Para obter mais informações, consulte [como: usar trechos de XML](../xml-tools/how-to-use-xml-snippets.md).
## <a name="schema-generated-xml-snippets"></a>Snippets Esquema- gerados XML
O editor XML também tem a capacidade de gerar um snippet de um esquema XML. Esse recurso permite que você popule um elemento com elementos XML gerados de informações de esquema para esse elemento.
Para obter mais informações, consulte [como: gerar um XML de um esquema XML do trecho](../xml-tools/how-to-generate-an-xml-snippet-from-an-xml-schema.md).
## <a name="create-new-xml-snippets"></a>Crie novos snippets XML
Além dos trechos que são incluídos com [!INCLUDE[msCoName](../includes/msconame-md.md)] Visual Studio por padrão, você também pode criar e usar seus próprios trechos XML.
Para obter mais informações, consulte [como: criar trechos de código XML](../xml-tools/how-to-create-xml-snippets.md).
## <a name="see-also"></a>Consulte também
[Editor de XML](../xml-tools/xml-editor.md)
| 47.073529 | 387 | 0.736645 | por_Latn | 0.98488 |
d38f1ea8d7c74a787dcc8fbe4a352693b8270fa2 | 123 | md | Markdown | designer/README.md | jason-fox/fogflow | e396ef0dee0125936954e381ab2862fd472e1774 | [
"BSD-3-Clause"
] | 102 | 2017-11-18T01:09:38.000Z | 2022-02-21T16:32:15.000Z | designer/README.md | Necuser1/fogflow | fc86761eef35f70c47ebc568c23cbbe8c9d06a87 | [
"BSD-4-Clause"
] | 169 | 2018-02-23T07:42:20.000Z | 2022-03-30T06:12:28.000Z | designer/README.md | Necuser1/fogflow | fc86761eef35f70c47ebc568c23cbbe8c9d06a87 | [
"BSD-4-Clause"
] | 68 | 2018-02-08T06:55:33.000Z | 2022-01-18T06:21:06.000Z | ## How to build each FogFlow component
A bash script is provided to build this docker image.
```console
./build.sh
```
| 13.666667 | 54 | 0.707317 | eng_Latn | 0.999404 |
d38f70459fab4897c556e66a209bead81809b63a | 203 | md | Markdown | src/pages/blog/2016/My-First-Try-to-Design-a-Web-App.md | shane13hsi/gatsby-starter-netlify-cms | d1a198f1be3214957518d56addd2700444a9a55c | [
"MIT"
] | null | null | null | src/pages/blog/2016/My-First-Try-to-Design-a-Web-App.md | shane13hsi/gatsby-starter-netlify-cms | d1a198f1be3214957518d56addd2700444a9a55c | [
"MIT"
] | null | null | null | src/pages/blog/2016/My-First-Try-to-Design-a-Web-App.md | shane13hsi/gatsby-starter-netlify-cms | d1a198f1be3214957518d56addd2700444a9a55c | [
"MIT"
] | null | null | null | ---
templateKey: blog-post
title: 第一次尝试设计一个 Web App
date: 2016-03-25 17:16:41
tags:
---
不是视觉设计,主要是工作方法论。这里留个标记,便于后期回顾。
这是 todo list。
{% asset_img of.png %}
这是我思考的过程之一。
{% asset_img section.png %}
| 10.684211 | 30 | 0.689655 | zho_Hans | 0.406854 |
d38faab1f1de97c5da64e6e22ee20167fce182d3 | 309 | md | Markdown | _posts/2021-07-26-pass-sanitaire.md | clairezed/3eme-rive | b3e035ec61fc8b121281c50a6c7934913085beac | [
"MIT"
] | null | null | null | _posts/2021-07-26-pass-sanitaire.md | clairezed/3eme-rive | b3e035ec61fc8b121281c50a6c7934913085beac | [
"MIT"
] | 3 | 2017-11-25T16:05:25.000Z | 2017-11-25T22:31:47.000Z | _posts/2021-07-26-pass-sanitaire.md | clairezed/3eme-rive | b3e035ec61fc8b121281c50a6c7934913085beac | [
"MIT"
] | 1 | 2019-07-03T15:03:27.000Z | 2019-07-03T15:03:27.000Z | ---
layout: post
title: Pass sanitaire
date: 2021-07-26T11:33:19.659Z
image_teaser: /images/uploads/capture-d’écran-2021-07-26-132910.png
image_01: /images/uploads/capture-d’écran-2021-07-26-133219.png
---
3e Rive vous accueille au local Ailleurs dans le respect du Pass sanitaire : les informations en image. | 38.625 | 103 | 0.779935 | fra_Latn | 0.533359 |
d3900c678f0d9d828540ade6f7120e61cf2a7720 | 1,728 | md | Markdown | README.md | julianjensen/ssa-form | 08683dd63255de4a94695bcc3f38b632bd7b676a | [
"MIT"
] | null | null | null | README.md | julianjensen/ssa-form | 08683dd63255de4a94695bcc3f38b632bd7b676a | [
"MIT"
] | null | null | null | README.md | julianjensen/ssa-form | 08683dd63255de4a94695bcc3f38b632bd7b676a | [
"MIT"
] | 1 | 2019-12-03T21:42:37.000Z | 2019-12-03T21:42:37.000Z | # ssa-form
[![Coveralls Status][coveralls-image]][coveralls-url]
[![Build Status][travis-image]][travis-url]
[![Dependency Status][depstat-image]][depstat-url]
[![npm version][npm-image]][npm-url]
[![License][license-image]][license-url]
[![Known Vulnerabilities][snyk-image]][snyk-url]
[![david-dm][david-dm-image]][david-dm-url]
> Creates an flow graph in SSA form based on an initial CFG. It can also apply data flow functions and will generate a live out set for each block as default for pruned SSA.
WARNING: Work in progress, not ready to be used yet.
## Install
```sh
npm i ssa-form
```
## Usage
```js
const
ssa = require( 'ssa-form' );
ssa() // true
```
## License
MIT © [Julian Jensen](https://github.com/julianjensen/ssa-form)
[coveralls-url]: https://coveralls.io/github/julianjensen/ssa-form?branch=master
[coveralls-image]: https://coveralls.io/repos/github/julianjensen/ssa-form/badge.svg?branch=master
[travis-url]: https://travis-ci.org/julianjensen/ssa-form
[travis-image]: http://img.shields.io/travis/julianjensen/ssa-form.svg
[depstat-url]: https://gemnasium.com/github.com/julianjensen/ssa-form
[depstat-image]: https://gemnasium.com/badges/github.com/julianjensen/ssa-form.svg
[npm-url]: https://badge.fury.io/js/ssa-form
[npm-image]: https://badge.fury.io/js/ssa-form.svg
[license-url]: https://github.com/julianjensen/ssa-form/blob/master/LICENSE
[license-image]: https://img.shields.io/badge/license-MIT-brightgreen.svg
[snyk-url]: https://snyk.io/test/github/julianjensen/ssa-form
[snyk-image]: https://snyk.io/test/github/julianjensen/ssa-form/badge.svg
[david-dm-url]: https://david-dm.org/julianjensen/ssa-form
[david-dm-image]: https://david-dm.org/julianjensen/ssa-form.svg
| 31.418182 | 173 | 0.734375 | kor_Hang | 0.14913 |
d3901d3e4e5748ceca5b4fa30e7636703fceb8d6 | 48 | md | Markdown | dynamics-nav-app/includes/nav_dev_shell_md.md | isabella232/nav-content.it-ch | 8d05cf5f45f5fa4a03136104d8c4513dda0a1a65 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2020-05-19T18:47:42.000Z | 2021-04-21T00:13:46.000Z | dynamics-nav-app/includes/nav_dev_shell_md.md | MicrosoftDocs/nav-content.it- | 8f0143e0d8207cb289cc21b34fced0e3971ce7db | [
"CC-BY-4.0",
"MIT"
] | 1 | 2021-11-06T07:44:34.000Z | 2021-11-06T07:44:34.000Z | dynamics-nav-app/includes/nav_dev_shell_md.md | isabella232/nav-content.it-ch | 8d05cf5f45f5fa4a03136104d8c4513dda0a1a65 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2019-10-14T19:40:59.000Z | 2021-11-05T11:04:38.000Z | Shell di sviluppo di Microsoft Dynamics NAV 2017 | 48 | 48 | 0.854167 | ita_Latn | 0.976575 |
d39061a194cf7079208a4ad246b973fb3946a518 | 16,251 | md | Markdown | release-notes/3.1/3.1.7/3.1.401-download.md | yygyjgmhje1987/core | d73a85454b1dd7f323e18e5e728ef0c95ff5a57a | [
"MIT"
] | 19,395 | 2015-01-02T20:41:47.000Z | 2022-03-31T20:10:11.000Z | release-notes/3.1/3.1.7/3.1.401-download.md | yygyjgmhje1987/core | d73a85454b1dd7f323e18e5e728ef0c95ff5a57a | [
"MIT"
] | 6,129 | 2015-01-22T15:19:50.000Z | 2022-03-31T18:47:06.000Z | release-notes/3.1/3.1.7/3.1.401-download.md | yygyjgmhje1987/core | d73a85454b1dd7f323e18e5e728ef0c95ff5a57a | [
"MIT"
] | 6,068 | 2015-01-05T18:03:07.000Z | 2022-03-31T08:08:49.000Z | # .NET Core 3.1.401 - August 11, 2020
This .NET Core SDK release includes the following released .NET Core and ASP.NET Core Runtimes.
* .NET Core SDK 3.1.401
* .NET Core Runtime 3.1.7
* ASP.NET Core 3.1.7
See the [Release Notes](https://github.com/dotnet/core/blob/main/release-notes/3.1/3.1.7/3.1.7.md) for details about what is included in this update.
## Downloads
| | SDK Installer<sup>1</sup> | SDK Binaries<sup>1</sup> | Runtime Installer | Runtime Binaries | ASP.NET Core Runtime | Windows Desktop Runtime |
| --------- | :------------------------------------------: | :----------------------: | :---------------------------: | :-------------------------: | :-----------------: |:-----------------: |
| Windows | [x86][dotnet-sdk-win-x86.exe] \| [x64][dotnet-sdk-win-x64.exe] | [x86][dotnet-sdk-win-x86.zip] \| [x64][dotnet-sdk-win-x64.zip] \| [ARM][dotnet-sdk-win-arm.zip] | [x86][dotnet-runtime-win-x86.exe] \| [x64][dotnet-runtime-win-x64.exe] | [x86][dotnet-runtime-win-x86.zip] \| [x64][dotnet-runtime-win-x64.zip] \| [ARM][dotnet-runtime-win-arm.zip] | [x86][aspnetcore-runtime-win-x86.exe] \| [x64][aspnetcore-runtime-win-x64.exe] \| [ARM][aspnetcore-runtime-win-arm.zip] \|<br> [Hosting Bundle][dotnet-hosting-win.exe]<sup>2</sup> | [x86][windowsdesktop-runtime-win-x86.exe] \| [x64][windowsdesktop-runtime-win-x64.exe] |
| macOS | [x64][dotnet-sdk-osx-x64.pkg] | [x64][dotnet-sdk-osx-x64.tar.gz] | [x64][dotnet-runtime-osx-x64.pkg] | [x64][dotnet-runtime-osx-x64.tar.gz] | [x64][aspnetcore-runtime-osx-x64.tar.gz]<sup>1</sup> | - |
| Linux | [Snap Install][snap-install] | [x64][dotnet-sdk-linux-x64.tar.gz] \| [ARM][dotnet-sdk-linux-arm.tar.gz] \| [ARM64][dotnet-sdk-linux-arm64.tar.gz] \| [x64 Alpine][dotnet-sdk-linux-musl-x64.tar.gz] | - | [x64][dotnet-runtime-linux-x64.tar.gz] \| [ARM][dotnet-runtime-linux-arm.tar.gz] \| [ARM64][dotnet-runtime-linux-arm64.tar.gz] \| [x64 Alpine][dotnet-runtime-linux-musl-x64.tar.gz] \| [ARM64 Alpine][dotnet-runtime-linux-musl-arm64.tar.gz] | [x64][aspnetcore-runtime-linux-x64.tar.gz]<sup>1</sup> \| [ARM][aspnetcore-runtime-linux-arm.tar.gz]<sup>1</sup> \| [ARM64][aspnetcore-runtime-linux-arm64.tar.gz]<sup>1</sup> \| [x64 Alpine][aspnetcore-runtime-linux-musl-x64.tar.gz] \| [ARM64 Alpine][aspnetcore-runtime-linux-musl-arm64.tar.gz] | - |
| RHEL6 | - | [x64][dotnet-sdk-rhel.6-x64.tar.gz] | - | [x64][dotnet-runtime-rhel.6-x64.tar.gz] | - |
| Checksums | [SDK][checksums-sdk] | - | [Runtime][checksums-runtime] | - | - | - |
1. Includes the .NET Core and ASP.NET Core Runtimes
2. For hosting stand-alone apps on Windows Servers. Includes the ASP.NET Core Module for IIS and can be installed separately on servers without installing .NET Core runtime.
## Visual Studio Compatibility
**Visual Studio compatibility:** .NET Core 3.1 requires Visual Studio 2019 16.4 or above to take full advantage of all its features. .NET Core 3.1 won't work properly in earlier versions of Visual Studio. See the following table to select the correct download.
| OS | Development Environment | .NET Core SDK |
| :-- | :-- | :--: |
| Windows | Visual Studio 2019 version 16.6 | [3.1.401](#downloads) |
| Windows | Visual Studio 2019 version 16.4 | [3.1.106](3.1.7.md) |
| MacOS | Visual Studio for Mac | [Visual Studio for Mac .NET Core Support](https://docs.microsoft.com/visualstudio/mac/net-core-support) |
## Docker
The [.NET Core Docker images](https://hub.docker.com/r/microsoft/dotnet/) have been updated for this release. Details on our Docker versioning and how to work with the images can be seen in ["Staying up-to-date with .NET Container Images"](https://devblogs.microsoft.com/dotnet/staying-up-to-date-with-net-container-images/).
## Installing .NET Core on Linux
### Install using Snap
Snap is a system which installs applications in an isolated environment and provides for automatic updates. Many distributions which are not directly supported by .NET Core can use Snaps to install. See the [list of distributions supported Snap](https://docs.snapcraft.io/installing-snapd/6735) for details.
After configuring Snap on your system, run the following command to install the latest .NET Core SDK.
`sudo snap install dotnet-sdk --channel 3.1/stable –-classic`
When .NET Core in installed using the Snap package, the default .NET Core command is `dotnet-sdk.dotnet`, as opposed to just `dotnet`. The benefit of the namespaced command is that it will not conflict with a globally installed .NET Core version you may have. This command can be aliased to `dotnet` with:
`sudo snap alias dotnet-sdk.dotnet dotnet`
**Note:** Some distros require an additional step to enable access to the SSL certificate. If you experience SSL errors when running `dotnet restore`, see [Linux Setup](https://github.com/dotnet/core/blob/main/Documentation/linux-setup.md) for a possible resolution.
### Install using a Package Manager
Before installing .NET, you will need to register the Microsoft key, register the product repository, and install required dependencies. This only needs to be done once per machine. Refer to [Setting up Linux for .NET Core][linux-setup] for the requirements.
The commands listed below do not specifically incude package managers to help with readability. Here are the package managers typically used by the Distros on which .NET Core is supported.
| Distro | Package Manager |
| --- | :----: |
| CentOS, Oracle | yum |
| Debian, Ubuntu | apt-get |
| Fedora | dnf |
| OpenSUSE, SLES | zypper |
## Develop applications
To develop applications using the .NET Core SDK, run the following command. The .NET Core runtime and ASP.NET Core runtime are included.
```bash
sudo [package manager] update or refresh
sudo [package manager] install dotnet-sdk-3.1
```
## Run applications
If you only need to run existing applications, run the following command. The .NET Core runtime and ASP.NET Core runtime are included.
```bash
sudo [package manager] update or refresh
sudo [package manager] install aspnetcore-runtime-3.1
```
### Installation from a binary archive
Installing from the packages detailed above is recommended or you can install from binary archive, if that better suits your needs. When using binary archives to install, the contents must be extracted to a user location such as `$HOME/dotnet`, a symbolic link created for `dotnet` and a few dependencies installed. Dependency requirements can be seen in the [Linux System Prerequisites](https://github.com/dotnet/core/blob/main/Documentation/linux-prereqs.md) document.
```bash
mkdir -p $HOME/dotnet && tar zxf dotnet.tar.gz -C $HOME/dotnet
export PATH=$PATH:$HOME/dotnet
```
## .NET Core Runtime-only installation
If only the .NET Core Runtime is needed, install `dotnet-runtime-3.1` using your package manager. If you also need ASP.NET Core functionality, installing `aspnetcore-runtime-3.1` will install both the ASP Runtime and .NET Core Runtime.
## Windows Server Hosting
If you are looking to host stand-alone apps on Servers, the following installer can be used on Windows systems.
### Windows
You can download the Windows Server Hosting installer and run the following command from an Administrator command prompt:
* [dotnet-hosting-3.1.7-win.exe][dotnet-hosting-win.exe]
This will install the ASP.NET Core Module for IIS.
[blob-runtime]: https://dotnetcli.blob.core.windows.net/dotnet/Runtime/
[blob-sdk]: https://dotnetcli.blob.core.windows.net/dotnet/Sdk/
[release-notes]: https://github.com/dotnet/core/blob/main/release-notes/3.1/3.1.7/3.1.401-download.md
[snap-install]: 3.1.7-install-instructions.md
[checksums-runtime]: https://dotnetcli.blob.core.windows.net/dotnet/checksums/3.1.7-sha.txt
[checksums-sdk]: https://dotnetcli.blob.core.windows.net/dotnet/checksums/3.1.7-sha.txt
[linux-install]: https://docs.microsoft.com/dotnet/core/install/linux
[linux-setup]: https://docs.microsoft.com/dotnet/core/install/
[dotnet-blog]: https://devblogs.microsoft.com/dotnet/net-core-march-2020/
[//]: # ( Runtime 3.1.7)
[dotnet-runtime-linux-arm.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/69984653-402e-442c-9588-eb92560d0fce/5ad7995a09334dd2ee56f00fb6dc0521/dotnet-runtime-3.1.7-linux-arm.tar.gz
[dotnet-runtime-linux-arm64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/e0982947-c956-4c44-b94a-3ecc13d7aa64/28f9a7f461d5aac85121492ba4513517/dotnet-runtime-3.1.7-linux-arm64.tar.gz
[dotnet-runtime-linux-musl-arm64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/d56c7d29-8122-4e5d-8b0a-030aa5290d3f/5810b5c16c76deface341885710d2980/dotnet-runtime-3.1.7-linux-musl-arm64.tar.gz
[dotnet-runtime-linux-musl-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/323a09ee-4171-4376-abcd-40bf12e20b1f/70ce498e556a40cd37774a083b73af5b/dotnet-runtime-3.1.7-linux-musl-x64.tar.gz
[dotnet-runtime-linux-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/e42ed5c3-d7a3-404d-a242-cfd10ef626ff/b723e456ffaf60b6df6c6d5b0a792aba/dotnet-runtime-3.1.7-linux-x64.tar.gz
[dotnet-runtime-osx-x64.pkg]: https://download.visualstudio.microsoft.com/download/pr/182b16ca-1334-40af-a1ca-8e4a9cb07c63/5368671138c576ad48c6e7715e929203/dotnet-runtime-3.1.7-osx-x64.pkg
[dotnet-runtime-osx-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/f4677b54-3e9d-4d23-9153-6f75db881e67/2ab1f6fe3a982f683a8c7aa163861af7/dotnet-runtime-3.1.7-osx-x64.tar.gz
[dotnet-runtime-rhel.6-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/bd7ec684-d8c2-4fa8-99f5-eb12321ad85b/8e593dc79ebfa2ffdc397620feeecb0f/dotnet-runtime-3.1.7-rhel.6-x64.tar.gz
[dotnet-runtime-win-arm.zip]: https://download.visualstudio.microsoft.com/download/pr/9cefa036-0ba8-4929-a0f8-6676dcdd8585/c23ffac90ddc2ada5dd94d9c3073db07/dotnet-runtime-3.1.7-win-arm.zip
[dotnet-runtime-win-x64.exe]: https://download.visualstudio.microsoft.com/download/pr/c9326fc1-401a-4957-8fc4-9594b141de91/fe32ec0c9f2974ef72af7c3e2c7232cb/dotnet-runtime-3.1.7-win-x64.exe
[dotnet-runtime-win-x64.zip]: https://download.visualstudio.microsoft.com/download/pr/8eaa2801-8ee8-49ee-9615-520164098473/e706f903c0f4df8752a70b06771a4bdf/dotnet-runtime-3.1.7-win-x64.zip
[dotnet-runtime-win-x86.exe]: https://download.visualstudio.microsoft.com/download/pr/8966b729-62af-4cf1-ac51-9ba6eb0a7c78/4f3c1603e41c0b8fb799837f55e6b3fd/dotnet-runtime-3.1.7-win-x86.exe
[dotnet-runtime-win-x86.zip]: https://download.visualstudio.microsoft.com/download/pr/d21813b5-cee7-499d-a9f0-583f66e8cef9/c3e2a452c7ed781aba9b17778b5ddae6/dotnet-runtime-3.1.7-win-x86.zip
[//]: # ( WindowsDesktop 3.1.7)
[windowsdesktop-runtime-win-x64.exe]: https://download.visualstudio.microsoft.com/download/pr/5e4695fb-da51-4fa8-a090-07a64480888c/65aa842670d2280b5d05b8a070a9f495/windowsdesktop-runtime-3.1.7-win-x64.exe
[windowsdesktop-runtime-win-x86.exe]: https://download.visualstudio.microsoft.com/download/pr/3e6c8a13-9d89-4991-b683-b6bb279bc096/d1c44ba0c34f2be8878c36d27287e1a5/windowsdesktop-runtime-3.1.7-win-x86.exe
[//]: # ( ASP 3.1.7)
[aspnetcore-runtime-linux-arm.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/5ed60e45-f93a-4a8b-ab92-4034fcf00618/cf2aafe9bc91f28bd4d7b7436c31e27e/aspnetcore-runtime-3.1.7-linux-arm.tar.gz
[aspnetcore-runtime-linux-arm64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/5d8bf507-759a-4cc6-92ae-8ef63478398a/6b298aad0f6ce04ebc09daa1007a4248/aspnetcore-runtime-3.1.7-linux-arm64.tar.gz
[aspnetcore-runtime-linux-musl-arm64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/261de71f-9189-4e0f-8da7-0d63d556b610/f149cc9d18e934ecb888dbebfc96c388/aspnetcore-runtime-3.1.7-linux-musl-arm64.tar.gz
[aspnetcore-runtime-linux-musl-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/5111d26b-6749-452d-a6b2-456161b6d29f/ed5f7a9d0b2903e028def142dd70ccd0/aspnetcore-runtime-3.1.7-linux-musl-x64.tar.gz
[aspnetcore-runtime-linux-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/e7d0601d-41b4-483f-b411-f2b42708054a/191b56b81e1830b413d0794728831eea/aspnetcore-runtime-3.1.7-linux-x64.tar.gz
[aspnetcore-runtime-osx-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/b0365f9c-270c-4454-9b92-1d455d402c72/c41415c12b649360a6ad20067b04c8f0/aspnetcore-runtime-3.1.7-osx-x64.tar.gz
[aspnetcore-runtime-win-arm.zip]: https://download.visualstudio.microsoft.com/download/pr/80863826-3ba1-40f5-898e-c71bb5190a48/0ae020ed49cf8fce8697f284f632c820/aspnetcore-runtime-3.1.7-win-arm.zip
[aspnetcore-runtime-win-x64.exe]: https://download.visualstudio.microsoft.com/download/pr/4957d824-3b3b-497a-b499-55022088ed93/b2ee157a32d7718897024d03b7126b59/aspnetcore-runtime-3.1.7-win-x64.exe
[aspnetcore-runtime-win-x64.zip]: https://download.visualstudio.microsoft.com/download/pr/7fb1dd11-760b-4f3c-ac98-2a708b713278/d8bd66ce86ebb551df553b3d6a2be3eb/aspnetcore-runtime-3.1.7-win-x64.zip
[aspnetcore-runtime-win-x86.exe]: https://download.visualstudio.microsoft.com/download/pr/367c9699-606d-4671-a3e8-d13f943d620a/d9873b044b80613cbfa642f28d6bec0f/aspnetcore-runtime-3.1.7-win-x86.exe
[aspnetcore-runtime-win-x86.zip]: https://download.visualstudio.microsoft.com/download/pr/1d8ee077-6f24-418d-9012-a727dba47ea0/88ae0a647770d08098713684940a0970/aspnetcore-runtime-3.1.7-win-x86.zip
[dotnet-hosting-win.exe]: https://download.visualstudio.microsoft.com/download/pr/21a5322f-cf9c-40e0-af41-4cdf14b3fb17/ff1390906525099bcd6b322279e09938/dotnet-hosting-3.1.7-win.exe
[//]: # ( SDK 3.1.401 )
[dotnet-sdk-linux-arm.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/a92a6358-52c3-472b-ad6d-d2d80abdcef4/37a7551a4e2c9e455caed5ef777a8983/dotnet-sdk-3.1.401-linux-arm.tar.gz
[dotnet-sdk-linux-arm64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/8c39349a-23d0-46b0-8206-8b573a404709/b42fd441c1911acc90aaddaa58d7103f/dotnet-sdk-3.1.401-linux-arm64.tar.gz
[dotnet-sdk-linux-musl-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/2d2a1e6f-3396-494f-9906-d44b8c860b90/0fa79dd0d0b6ba02d1dc203a04622233/dotnet-sdk-3.1.401-linux-musl-x64.tar.gz
[dotnet-sdk-linux-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/4f9b8a64-5e09-456c-a087-527cfc8b4cd2/15e14ec06eab947432de139f172f7a98/dotnet-sdk-3.1.401-linux-x64.tar.gz
[dotnet-sdk-osx-x64.pkg]: https://download.visualstudio.microsoft.com/download/pr/692921be-5cd6-42b5-8c52-0c17cb5ec580/1b0d95cd4950a58ac069095bdf976f6e/dotnet-sdk-3.1.401-osx-x64.pkg
[dotnet-sdk-osx-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/e1f6e8dc-833f-46aa-866b-40b9bc68ed0f/6540a60960a6489061a08a9ccd3935cd/dotnet-sdk-3.1.401-osx-x64.tar.gz
[dotnet-sdk-rhel.6-x64.tar.gz]: https://download.visualstudio.microsoft.com/download/pr/bcf2eb1b-dfd8-4471-bbc6-895ca4590f9f/45d432a3d203359c7c1e3b6a7344982e/dotnet-sdk-3.1.401-rhel.6-x64.tar.gz
[dotnet-sdk-win-arm.zip]: https://download.visualstudio.microsoft.com/download/pr/59e635e2-5294-4a04-a94d-2ff9e01fe66c/889a8a545c9a0e828177a69f478d7cfc/dotnet-sdk-3.1.401-win-arm.zip
[dotnet-sdk-win-x64.exe]: https://download.visualstudio.microsoft.com/download/pr/547f9f81-599a-4b58-9322-d1d158385df6/ebe3e02fd54c29487ac32409cb20d352/dotnet-sdk-3.1.401-win-x64.exe
[dotnet-sdk-win-x64.zip]: https://download.visualstudio.microsoft.com/download/pr/2749f31c-4745-4d71-b317-33a8f3087402/6c7868cd73427c8117563192615df66d/dotnet-sdk-3.1.401-win-x64.zip
[dotnet-sdk-win-x86.exe]: https://download.visualstudio.microsoft.com/download/pr/719cf74a-8a57-405d-a048-be8d94bbef37/1914f811ddbf10f7a2a45181b9cac714/dotnet-sdk-3.1.401-win-x86.exe
[dotnet-sdk-win-x86.zip]: https://download.visualstudio.microsoft.com/download/pr/e06cba9e-6dfe-4f24-b8d5-38038c1088d7/177c9f1cb89b2a0ece878a67b7b87136/dotnet-sdk-3.1.401-win-x86.zip
| 92.335227 | 760 | 0.751769 | eng_Latn | 0.19664 |
d39067bb8dbd6f5cc6120a9df8a6621d7efd8c38 | 696 | md | Markdown | _publications/2019-01-8-A-State-Encoding-Methodology-for-Side-Channel-Security-vs-Power-Trade-off-Exploration.md | mborowczak/mborowczak.github.io | 1bc5e5ee21ac345475af9243f18f9fcd776bab35 | [
"MIT"
] | null | null | null | _publications/2019-01-8-A-State-Encoding-Methodology-for-Side-Channel-Security-vs-Power-Trade-off-Exploration.md | mborowczak/mborowczak.github.io | 1bc5e5ee21ac345475af9243f18f9fcd776bab35 | [
"MIT"
] | null | null | null | _publications/2019-01-8-A-State-Encoding-Methodology-for-Side-Channel-Security-vs-Power-Trade-off-Exploration.md | mborowczak/mborowczak.github.io | 1bc5e5ee21ac345475af9243f18f9fcd776bab35 | [
"MIT"
] | null | null | null | ---
title: "A State Encoding Methodology for Side-Channel Security vs. Power Trade-off Exploration"
collection: publications
permalink: /publication/2019-01-8-A-State-Encoding-Methodology-for-Side-Channel-Security-vs-Power-Trade-off-Exploration
date: 2019-01-8
venue: 'In the proceedings of 2019 International Conference on VLSI Design (VLSID 2019)'
paperurl: 'https://goo.gl/4kDDec'
citation: ' Richa Agrawal, Mike Borowczak, Ranga Vemuri, "A State Encoding Methodology for Side-Channel Security vs. Power Trade-off Exploration." In the proceedings of 2019 International Conference on VLSI Design (VLSID 2019), 2019.'
---
[Access paper here](https://goo.gl/4kDDec){:target="_blank"}
| 63.272727 | 246 | 0.781609 | yue_Hant | 0.441617 |
d390d8a3ee3907fd27306d74c0e0f0fb7eb75e24 | 222 | md | Markdown | pages/Radio-Frequency-IDenditicaiton-RFID.md | dperret/rfhs-wiki | 2db38061c7e00ff7848d4c5a16e19410cfc446a3 | [
"BSD-2-Clause"
] | 36 | 2020-08-02T03:46:54.000Z | 2022-03-25T18:17:56.000Z | pages/Radio-Frequency-IDenditicaiton-RFID.md | GoodGooGleINS/rfhs-wiki | 06434a8be763fab35e29d0e0036398e5d3928d9b | [
"BSD-2-Clause"
] | 5 | 2021-08-07T02:23:52.000Z | 2022-03-24T03:36:50.000Z | pages/Radio-Frequency-IDenditicaiton-RFID.md | GoodGooGleINS/rfhs-wiki | 06434a8be763fab35e29d0e0036398e5d3928d9b | [
"BSD-2-Clause"
] | 12 | 2020-07-30T02:25:06.000Z | 2022-03-29T17:58:48.000Z |
# General Use
## Standards
## Education
## Certifications
## Whitepapers
## github repos
##
# Research
# Security Research
# Security Vulnerabilities
# Hacking
# Cracking
# Tools
| 3.894737 | 26 | 0.59009 | eng_Latn | 0.755584 |
d390da7dd9b356acefda399acd8c12309a338da9 | 1,018 | md | Markdown | aspnet/identity/overview/extensibility/index.md | terrajobst/AspNetDocs.cs-cz | 89957a3d61104043d6f0f0240d81e80c6dcb51ca | [
"CC-BY-4.0",
"MIT"
] | null | null | null | aspnet/identity/overview/extensibility/index.md | terrajobst/AspNetDocs.cs-cz | 89957a3d61104043d6f0f0240d81e80c6dcb51ca | [
"CC-BY-4.0",
"MIT"
] | null | null | null | aspnet/identity/overview/extensibility/index.md | terrajobst/AspNetDocs.cs-cz | 89957a3d61104043d6f0f0240d81e80c6dcb51ca | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
uid: identity/overview/extensibility/index
title: Rozšíření ASP.NET Identity – ASP.NET 4. x
author: rick-anderson
description: Rozšiřitelnost
ms.author: riande
ms.date: 10/02/2013
ms.custom: seoapril2019
ms.assetid: d1c6e7d0-ead9-4f08-a5b9-9d7a30be78e3
msc.legacyurl: /identity/overview/extensibility
msc.type: chapter
ms.openlocfilehash: 745f8685df098dcd62fc1893363719bbaee591a6
ms.sourcegitcommit: e7e91932a6e91a63e2e46417626f39d6b244a3ab
ms.translationtype: MT
ms.contentlocale: cs-CZ
ms.lasthandoff: 03/06/2020
ms.locfileid: "78616821"
---
# <a name="aspnet-identity-extensibility"></a>Rozšiřitelnost ASP.NET Identity
> Rozšiřitelnost
- [Přehled poskytovatelů vlastního úložiště pro ASP.NET Identity](overview-of-custom-storage-providers-for-aspnet-identity.md)
- [Implementace vlastního poskytovatele úložiště MySQL ASP.NET Identity](implementing-a-custom-mysql-aspnet-identity-storage-provider.md)
- [Změna primárního klíče uživatelů v ASP.NET Identity](change-primary-key-for-users-in-aspnet-identity.md)
| 39.153846 | 137 | 0.819253 | ces_Latn | 0.629155 |
d392d58c929a12eee31239b410a17ae3aa2da4eb | 345 | md | Markdown | recipes/Python/578190_ChainedList_ChainedListView_Exposing_Multiple/README.md | tdiprima/code | 61a74f5f93da087d27c70b2efe779ac6bd2a3b4f | [
"MIT"
] | 2,023 | 2017-07-29T09:34:46.000Z | 2022-03-24T08:00:45.000Z | recipes/Python/578190_ChainedList_ChainedListView_Exposing_Multiple/README.md | unhacker/code | 73b09edc1b9850c557a79296655f140ce5e853db | [
"MIT"
] | 32 | 2017-09-02T17:20:08.000Z | 2022-02-11T17:49:37.000Z | recipes/Python/578190_ChainedList_ChainedListView_Exposing_Multiple/README.md | unhacker/code | 73b09edc1b9850c557a79296655f140ce5e853db | [
"MIT"
] | 780 | 2017-07-28T19:23:28.000Z | 2022-03-25T20:39:41.000Z | ## ChainedList and ChainedListView: Exposing Multiple Lists as a Single Sequence
Originally published: 2012-07-03 21:00:02
Last updated: 2012-07-03 21:00:02
Author: Eric Snow
Handy for composing lists without losing the underlying indepedence. The "lists" don't actually have to be lists, though for ChainedList they must be mutable. | 57.5 | 159 | 0.776812 | eng_Latn | 0.986265 |
d3934f8a283754e6145832290a01d2faca39e89b | 883 | md | Markdown | pages/common/vim-select-text.md | bdlangton/tldr | 11b0fb9f3b6daeb57071aec2b3d3daed631a0d6c | [
"CC-BY-4.0"
] | null | null | null | pages/common/vim-select-text.md | bdlangton/tldr | 11b0fb9f3b6daeb57071aec2b3d3daed631a0d6c | [
"CC-BY-4.0"
] | null | null | null | pages/common/vim-select-text.md | bdlangton/tldr | 11b0fb9f3b6daeb57071aec2b3d3daed631a0d6c | [
"CC-BY-4.0"
] | null | null | null | # vim select cut copy paste text
> Selecting and cut/copy/paste text.
- Enter visual select by char
`v`
- Enter visual select by line
`V`
- Select block of code + conditional (example: entire 'if' block)
`Vj%`
- In visual: yank selected text
`y`
- Yank current line
`yy`
- Yank from cursor to beginning of next word
`yw`
- Yank line to system clipboard
`"+yy`
- Go to first line and yank to the last line (yank entire file) to system clipboard
`gg"+yG`
- Yank to register (typically "0, "1, etc)
`[register]y`
- Yank inside paragraph
`yip`
- Yank sentence
`yas`
- Yank inside symbol (can be [ { ( < " ')
`yi[symbol]`
- Yank around symbol (including symbol) (can be [ { ( < " ')
`ya[symbol]`
- Paste text
`p`
- Paste text before the cursor
`P`
- Paste from register (typically "0, "1, etc)
`[register]p`
- Insert mode: paste text
`Ctrl+r[register]`
| 12.263889 | 83 | 0.655719 | eng_Latn | 0.967953 |
d39364ec260adaf9c398e0e131b91023c244d237 | 2,929 | md | Markdown | README.md | PRCYCoin/DAPSCoin-New | 7410f74dac5ce369ee38d0d394d1ab88042e7b42 | [
"MIT"
] | 83 | 2019-10-01T00:16:05.000Z | 2022-03-29T22:43:23.000Z | README.md | PRCYCoin/DAPSCoin-New | 7410f74dac5ce369ee38d0d394d1ab88042e7b42 | [
"MIT"
] | 84 | 2019-10-01T19:48:43.000Z | 2022-03-21T21:15:50.000Z | README.md | PRCYCoin/DAPSCoin-New | 7410f74dac5ce369ee38d0d394d1ab88042e7b42 | [
"MIT"
] | 63 | 2019-10-01T05:32:39.000Z | 2022-03-25T01:39:06.000Z | 
Welcome to DAPS
=====================================
## Introduction
DAPScoin is a cutting edge cryptocurrency, with many features not available in most other cryptocurrencies.
- Anonymized transactions using Stealth addresses, RingCT and Bulletproofs.
- Masternode is secured with a collateral of 1,000,000 DAPS.
DAPS is a cryptocurrency designed for corporate entities, traders, and individuals to have the autonomy to transactions business freely around the world safely and securely without the exposure to malicious actors that threaten financial transactions through traceability.
DAPS is a privacy coin that aims to be the most private and secure privacy coin on the market in a hybrid chain of PoW, PoSv3 and PoA. We aim to set our protocol to become the new standard of privacy coins.
DAPS is the world's first coin to implement Bulletproofs and RingCT & Ring Signatures in a staking chain. With DAPS it is possible to stake, run masternodes and mine PoA blocks. In 2019 DAPS completed two successful testnets, sponsored Consensus 2019 NYC and Futurist Conference in Canada, achieved and kept top 200 coin status, passed security audit by Red4Sec, launched mainnet and achieved over 3000 hosted masternodes in the first month, staying in the top 10.
## About this Project
DAPS is a non-ICO community driven project. The project has funded itself to deliver ground-breaking technology in the privacy coin industry.
DAPS DAO
The community runs the project via DAPS DAO on DaoHaus. https://app.daohaus.club/dao/0x1/0xfc8eba8a52b21562f8b6843f545f4e744f0f0d41
## How we Compare to Other Privacy Projects

## How to Contribute to DAPS
We have an extensive [Contributing.md](https://github.com/DAPSCoin/DAPSCoin/blob/master/CONTRIBUTING.md) guide on how to contribute to DAPS source code.
Please have a look at this first before deciding to contribute to our codebase.
We welcome developers from all backgrounds to take a look at our code and welcome all suggestions and changes.
## Social
Facebook - (https://www.facebook.com/officialDAPScoin/)
Twitter - (https://twitter.com/DAPScoin)
LinkedIn - (https://www.linkedin.com/company/daps-coin/)
Telegram - (https://t.me/dapscoin)
Discord - (https://discord.gg/hxfmWpR)
More information at [officialdapscoin.com](https://officialdapscoin.com)
### Coin Specs
<table>
<tr><td>Algo</td><td>PoW-PoA-PoS</td></tr>
<tr><td>Block Time</td><td>60 Seconds</td></tr>
<tr><td>Difficulty Retargeting</td><td>Every Block</td></tr>
<tr><td>Max Coin Supply</td><td>70,000,000,000 DAPS</td></tr>
<tr><td>Premine</td><td>60 000 000 000 DAPS</td></tr>
</table>
## BootStrap
Our latest BootStrap is always available at https://bootstrap.dapscoin.com/
| 47.241935 | 464 | 0.771253 | eng_Latn | 0.891579 |
d393f1a79ab0454fe034400c045cba92a6aaad77 | 1,024 | md | Markdown | README.md | cjtapper/coinprices | b70a6de88e7e374684619c1acbe572869cedd954 | [
"MIT"
] | 1 | 2021-07-19T05:52:16.000Z | 2021-07-19T05:52:16.000Z | README.md | cjtapper/coinprices | b70a6de88e7e374684619c1acbe572869cedd954 | [
"MIT"
] | null | null | null | README.md | cjtapper/coinprices | b70a6de88e7e374684619c1acbe572869cedd954 | [
"MIT"
] | null | null | null | # Coin prices
Requests crypto prices from coinmarketcap.com and prints them to `stdout` in the
format required for a `ledger` price history database.
Perhaps somewhat confusingly, I'm refering to the ledger accounting software
(https://www.ledger-cli.org/), not the Ledger hardware wallet (https://www.ledger.com/), although I do use and recommend both!
At the moment, it's hardcoded to request quotes for ADA, BTC, LTC, XRP, and ETH
in AUD, because that's all that's relevant to me. Maybe I'll update it later on
to make it more flexible.
## Requirements
* Python 3.6+ (uses `f` strings, hence 3.6)
* pip
* CoinMarketCap API key in environment variable `COIN_MARKET_CAP_API_KEY`
## Installation
Clone the repo.
Install requirements:
```sh
pip install -r requirements.txt
```
## Usage:
```sh
python coinprices.py
```
I like to append the output to my price history. For example:
```sh
python coinprices.py >> ~/.pricedb
```
It's nice to set this up in a cronjob for whatever frequency works for you.
## License
MIT
| 26.25641 | 126 | 0.744141 | eng_Latn | 0.978129 |
d395e805ec14d583d346121059510a8184137ac3 | 2,969 | md | Markdown | translations/es-ES/data/reusables/actions/allow-specific-actions-intro.md | nyanthanya/Cuma_Info | d519c49504fc3818c1294f14e63ee944d2f4bd89 | [
"CC-BY-4.0",
"MIT"
] | 17 | 2021-01-05T16:29:05.000Z | 2022-02-26T09:08:44.000Z | translations/es-ES/data/reusables/actions/allow-specific-actions-intro.md | nyanthanya/Cuma_Info | d519c49504fc3818c1294f14e63ee944d2f4bd89 | [
"CC-BY-4.0",
"MIT"
] | 222 | 2021-04-08T20:13:34.000Z | 2022-03-18T22:37:27.000Z | translations/es-ES/data/reusables/actions/allow-specific-actions-intro.md | nyanthanya/Cuma_Info | d519c49504fc3818c1294f14e63ee944d2f4bd89 | [
"CC-BY-4.0",
"MIT"
] | 3 | 2021-08-31T03:18:06.000Z | 2021-10-30T17:49:09.000Z | Cuando eliges el **Permitir las acciones seleccionadas**, las acciones locales se permitirán y habrá opciones adicionales para permitir otras acciones específicas:
- **Permitir acciones que crea {% data variables.product.prodname_dotcom %}:** Puedes permitir que los flujos de trabajo utilicen todas las acciones que haya creado {% data variables.product.prodname_dotcom %}. Las acciones que crea {% data variables.product.prodname_dotcom %} se ubican en las organizaciones de `actions` y de `github`. Para obtener más información, consulta las organizaciones [`actions`](https://github.com/actions) y [`github`](https://github.com/github).
- **Permitir las acciones de Marketplace que tengan creadores verificados:** Puedes permitir que los flujos de trabajo utilicen todas las acciones de {% data variables.product.prodname_marketplace %} que tengan creadores verificados. Cuando GitHub haya verificado al creador de la acción como una organización asociada, se mostrará la insignia de {% octicon "verified" aria-label="The verified badge" %} junto a la acción en {% data variables.product.prodname_marketplace %}.
- **Permitir acciones especificadas:** Puedes restringir los flujos de trabajo para que utilicen las acciones que se encuentren en organizciones y repositorios específicos.
Para restringir el acceso a las etiquetas específicas o a los SHA de confirmación de una acción, puedes utilizar la misma sintaxis de `<OWNER>/<REPO>@<TAG OR SHA>` en el flujo de trabajo para seleccionar la acción. Por ejemplo, `actions/javascript-action@v1.0.1` para seleccionar una etiqueta o `actions/javascript-action@172239021f7ba04fe7327647b213799853a9eb89` para seleccionar un SHA. Para obtener más información, consulta la sección "[Encontrar y personalizar las acciones](/actions/learn-github-actions/finding-and-customizing-actions#using-release-management-for-your-custom-actions)".
Puedes utilizar el caracter de comodín `*` para empatar los patrones. Por ejemplo, para permitir todas las acciones en organizaciones que comiencen con `space-org`, puedes especificar `space-org*/*`. Para agregar todas las acciones en los repositorios que comiencen con octocat, puedes utilizar `*/octocat*@*`. Para obtener más información sobre cómo utilizar el comodín `*`, consulta la sección "[Sintaxis de flujo de trabajo para las GitHub Actions](/actions/reference/workflow-syntax-for-github-actions#filter-pattern-cheat-sheet)".
{% if currentVersion == "free-pro-team@latest" %}
{% note %}
**Nota:** La opción **Permitir las acciones especificadas** solo se encuentra disponible para los repositorios públicos con los planes de {% data variables.product.prodname_free_user %}, {% data variables.product.prodname_pro %}, {% data variables.product.prodname_free_team %} para organizaciones, o {% data variables.product.prodname_team %}.
{% endnote %}
{% endif %}
Este procedimiento ilustra cómo agregar acciones específicas a la lista de acciones permitidas.
| 148.45 | 595 | 0.788144 | spa_Latn | 0.988853 |
d39627f3da4444e6a6581ae70e3bfad5e41d3307 | 741 | md | Markdown | README.md | tnahs/anki-marker | c458c1a43964faa6572334ea8606f17ec7d1c056 | [
"MIT"
] | null | null | null | README.md | tnahs/anki-marker | c458c1a43964faa6572334ea8606f17ec7d1c056 | [
"MIT"
] | 1 | 2021-07-03T18:49:10.000Z | 2021-07-03T18:49:10.000Z | README.md | tnahs/anki-marker | c458c1a43964faa6572334ea8606f17ec7d1c056 | [
"MIT"
] | null | null | null | # anki-marker
## Installation
Download and run `bundle/anki-marker.ankiaddon`.
## Usage
TODO
``` css
/* [addon-dir]/user_files/markers.css */
marker.highlight {
/**
* name: highlight
* syntax: ==abc==
* html: <span class="highlight">abc</span>
*/
color: hsla(35, 100%, 45%, 1.0);
font-style: unset;
font-weight: unset;
text-decoration: unset;
background-color: hsla(45, 100%, 75%, 1.0);
}
```
``` json
// [addon-dir]/user_files/markers.json
{
"parent-classes": [],
"styles": [
{
"name": "Highlight",
"markup": "==",
"classes": ["highlight"]
},
]
}
```
## Development
```shell
export ANKI_ADDON_DEVELOPMENT=True
```
| 14.82 | 50 | 0.534413 | eng_Latn | 0.186792 |
d3965726ffe87f611323c4cd3bb0062eee0d2d23 | 85 | md | Markdown | src/Docs/Resources/v2/30-theme-guide/50-scss.md | jochenmanz/platform | f272b65d3a857cf3a88f483d42fc6b9f6ad9ea05 | [
"MIT"
] | null | null | null | src/Docs/Resources/v2/30-theme-guide/50-scss.md | jochenmanz/platform | f272b65d3a857cf3a88f483d42fc6b9f6ad9ea05 | [
"MIT"
] | null | null | null | src/Docs/Resources/v2/30-theme-guide/50-scss.md | jochenmanz/platform | f272b65d3a857cf3a88f483d42fc6b9f6ad9ea05 | [
"MIT"
] | null | null | null | [titleEn]: <>(SCSS and Styling)
* Entry point in theme.json
* Structure
* Bootstrap
| 14.166667 | 31 | 0.705882 | eng_Latn | 0.5776 |
d396584035675abc833006f3eb13d7932a0d7e3a | 378 | md | Markdown | src/runtime/nodejs/libs/insomnia-documenter.md | osvaldokalvaitir/project-settings | 94031b36bd96285275058123fbebe93aa5982a97 | [
"MIT"
] | 29 | 2018-12-20T13:16:27.000Z | 2020-08-26T20:34:03.000Z | src/runtime/nodejs/libs/insomnia-documenter.md | osvaldokalvaitir/project-settings | 94031b36bd96285275058123fbebe93aa5982a97 | [
"MIT"
] | null | null | null | src/runtime/nodejs/libs/insomnia-documenter.md | osvaldokalvaitir/project-settings | 94031b36bd96285275058123fbebe93aa5982a97 | [
"MIT"
] | 16 | 2018-12-20T13:16:21.000Z | 2020-10-15T19:54:41.000Z | # Insomnia Documenter
Ferramenta para criar páginas de documentação de API minimalistas e bonitas usando o arquivo de exportação da área de trabalho do Insomnia.
## Documentação
Clique [aqui](https://github.com/jozsefsallai/insomnia-documenter) para ver a documentação.
## Instalação
Clique [aqui](https://www.npmjs.com/package/insomnia-documenter) para fazer a instalação. | 34.363636 | 139 | 0.798942 | por_Latn | 0.991949 |
d3967be9e6e474b92265d213a67184f136c987f3 | 1,738 | md | Markdown | README.md | KaungZawHtet/XMwayLoon | 4dd014dc75a209c242bba5d2dc4333af63bcb405 | [
"Unlicense"
] | 6 | 2020-03-23T04:20:53.000Z | 2020-05-23T00:32:36.000Z | README.md | KaungZawHtet/XMwayLoon | 4dd014dc75a209c242bba5d2dc4333af63bcb405 | [
"Unlicense"
] | null | null | null | README.md | KaungZawHtet/XMwayLoon | 4dd014dc75a209c242bba5d2dc4333af63bcb405 | [
"Unlicense"
] | null | null | null |
# XMwayLoon
Portable GUI executable for parallel Myanmar data randomization.
## Platform
- Mac
- Ubuntu
- Windows (coming soon)
## Download:
download binary executable from https://github.com/KaungZawHtet/XMwayLoon/releases .
## Todo:
- Better performance
- More problem specific data structures
- More types
- Better UI/UX
- Better Build management
- More detailed documentation
- More customizability
- Windows build
## Possible issues
- (Mac) In case of unresponsive menu bar in Mac, switch the app to background and switch back again.
- (Ubuntu) In case of err at app launch, run "sudo apt-get install libgtk-3-dev".
## License
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
The licensing announcement may be evolved or changed in coming versions.
| 46.972973 | 460 | 0.788838 | eng_Latn | 0.656351 |
d3973f6858552e048baa51c1e258b37c591635d8 | 646 | md | Markdown | 2017/CVE-2017-5511.md | Ahmad141/CVEs | 967839a1f3dd2e43c3ca7af98749ae1712e69a04 | [
"MIT"
] | 3 | 2022-02-18T01:32:50.000Z | 2022-02-25T09:00:20.000Z | 2017/CVE-2017-5511.md | az7rb/cve | ea036e0c97bb9d05e18e7f1aea0a746fcb25d312 | [
"MIT"
] | null | null | null | 2017/CVE-2017-5511.md | az7rb/cve | ea036e0c97bb9d05e18e7f1aea0a746fcb25d312 | [
"MIT"
] | null | null | null | ### [CVE-2017-5511](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-5511)



### Description
coders/psd.c in ImageMagick allows remote attackers to have unspecified impact by leveraging an improper cast, which triggers a heap-based buffer overflow.
### POC
#### Reference
- https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=851374
#### Github
- https://github.com/cacad-ntu/CZ4062-assignment
| 35.888889 | 155 | 0.747678 | eng_Latn | 0.24366 |
d397baa1a62d68eddebbe129a175ecb763cf8c32 | 2,453 | md | Markdown | docs/web-service-reference/domainsettingerror-soap.md | MicrosoftDocs/office-developer-exchange-docs.es-ES | 95988b6726d9e62f0e7a45b9968258bab7d59744 | [
"CC-BY-4.0",
"MIT"
] | 1 | 2020-05-19T18:53:43.000Z | 2020-05-19T18:53:43.000Z | docs/web-service-reference/domainsettingerror-soap.md | MicrosoftDocs/office-developer-exchange-docs.es-ES | 95988b6726d9e62f0e7a45b9968258bab7d59744 | [
"CC-BY-4.0",
"MIT"
] | 2 | 2021-12-08T04:02:05.000Z | 2021-12-08T04:02:23.000Z | docs/web-service-reference/domainsettingerror-soap.md | MicrosoftDocs/office-developer-exchange-docs.es-ES | 95988b6726d9e62f0e7a45b9968258bab7d59744 | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: DomainSettingError (SOAP)
manager: sethgros
ms.date: 09/17/2015
ms.audience: Developer
ms.topic: reference
ms.localizationpriority: medium
api_type:
- schema
ms.assetid: 48c3f7b5-2ee0-42ce-97a1-a881e2f60327
description: El elemento DomainSettingError representa un error que se produjo al recuperar una configuración de dominio. Esto representa un error de una solicitud GetDomainSettings.
ms.openlocfilehash: d2d7e1fc1509ade88de0013cb9e4ff54712d0f56
ms.sourcegitcommit: 54f6cd5a704b36b76d110ee53a6d6c1c3e15f5a9
ms.translationtype: MT
ms.contentlocale: es-ES
ms.lasthandoff: 09/24/2021
ms.locfileid: "59530150"
---
# <a name="domainsettingerror-soap"></a>DomainSettingError (SOAP)
El **elemento DomainSettingError** representa un error que se produjo al recuperar una configuración de dominio. Esto representa un error de una **solicitud GetDomainSettings.**
```XML
<DomainSettingError>
<ErrorCode/>
<ErrorMessage/>
<SettingName/>
</DomainSettingError>
```
**DomainSettingError**
## <a name="attributes-and-elements"></a>Atributos y elementos
En las siguientes secciones se describen los atributos, elementos secundarios y elementos primarios.
### <a name="attributes"></a>Atributos
Ninguno.
### <a name="child-elements"></a>Elementos secundarios
|**Elemento**|**Descripción**|
|:-----|:-----|
|[ErrorCode (SOAP)](errorcode-soap.md) <br/> |Identifica el código de error asociado a la solicitud específica. <br/> |
|[ErrorMessage (SOAP)](errormessage-soap.md) <br/> |Contiene el mensaje de error asociado a la solicitud específica. <br/> |
|[SettingName (SOAP)](settingname-soap.md) <br/> |Representa el nombre de la configuración. <br/> |
### <a name="parent-elements"></a>Elementos principales
|**Elemento**|**Descripción**|
|:-----|:-----|
|[DomainSettingErrors (SOAP)](domainsettingerrors-soap.md) <br/> |Contiene información de error para la configuración que no se pudo devolver. <br/> |
## <a name="text-value"></a>Valor de texto
Ninguno.
## <a name="element-information"></a>Información del elemento
|||
|:-----|:-----|
|Namespace <br/> |https://schemas.microsoft.com/exchange/2010/Autodiscover <br/> |
|Schema name <br/> |Autodiscover schema <br/> |
|Validation file <br/> |Messages.xsd <br/> |
|Can be empty <br/> |True <br/> |
## <a name="see-also"></a>See also
- [GetDomainSettings operation (SOAP)](getdomainsettings-operation-soap.md)
| 34.549296 | 182 | 0.728903 | spa_Latn | 0.515801 |
d3988ff53ba7e68cb9d7d0a352f1cb37f7656605 | 1,310 | md | Markdown | CHANGELOG.md | tt-laboratories/ocman | c824a4b727299d6f11ca753aa5994d86ab4f1a89 | [
"MIT"
] | 3 | 2018-11-02T09:17:40.000Z | 2020-09-01T20:39:41.000Z | CHANGELOG.md | tt-laboratories/ocman | c824a4b727299d6f11ca753aa5994d86ab4f1a89 | [
"MIT"
] | 1 | 2017-08-11T10:25:57.000Z | 2017-09-01T15:16:00.000Z | CHANGELOG.md | tt-laboratories/ocman | c824a4b727299d6f11ca753aa5994d86ab4f1a89 | [
"MIT"
] | 3 | 2017-07-11T17:02:24.000Z | 2018-11-02T11:35:20.000Z | <a name="1.4.1"></a>
### 1.4.1 (2021-01-28)
#### Bug Fixes
* use ERB::Util.url_encode instead of CGI.escape ([3590197](/../commit/3590197))
#### Maintain
* add .DS_Store to gitignore ([9bc59f4](/../commit/9bc59f4))
<a name="1.4.0"></a>
### 1.4.0 (2020-12-22)
#### Features
* add recursive flag to create_folder (#7) ([28d76b1](/../commit/28d76b1))
<a name="1.3.1"></a>
### 1.3.1 (2020-02-25)
#### Bug Fixes
* use ERB::Util.url_encode instead of CGI.escape to emulate URI.escape's behaviour
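The reason for preferring `ERB::Util.url_encode` over `CGI.escape` (here and in 1.4.1) is that `CGI.escape` does form encoding, turning spaces into `+`, while `url_encode` percent-encodes them, which is what a URL path segment such as the DAV URI mentioned in 1.2.3 needs. A minimal sketch of the difference:

```ruby
require 'cgi'
require 'erb'

name = "annual report 2020.pdf"

# CGI.escape targets application/x-www-form-urlencoded bodies:
puts CGI.escape(name)           # => "annual+report+2020.pdf"

# ERB::Util.url_encode percent-encodes, as needed in a URL path:
puts ERB::Util.url_encode(name) # => "annual%20report%202020.pdf"
```

(`URI.escape`, which these calls replaced, had long been deprecated and was removed entirely in Ruby 3.0.)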
<a name="1.3.0"></a>
### 1.3.0 (2020-02-21)
#### Features
* make share permissions configurable
#### Bug Fixes
* avoid NoMethodError on delete_share when no share was found
#### Maintenance
* use github actions
#### Style
* fix rubocop offenses
<a name="1.2.3"></a>
### 1.2.3 (2020-01-24)
#### Bug Fixes
* escape filename in dav uri
#### Maintain
* check in Gemfile and ignore Gemfile.lock
<a name="1.2.2"></a>
### 1.2.2 (2016-12-30)
#### Other
* refactor Ocman::Share
* use multi_json instead of json
<a name="1.2.1"></a>
### 1.2.1 (2016.12.21)
#### Bug Fixes
* add `OCS-APIRequest: true` header https://github.com/nextcloud/server/issues/2753
<a name="1.1.0"></a>
### 1.1.0 (2016-09-12)
#### Bug Fixes
* Ocman now handles URI.encode and URI.decode automatically (breaking change)
| 17.012987 | 83 | 0.629008 | eng_Latn | 0.598536 |
d398c7d6f8443655a8cc42a0ed1e62c79d6dbd15 | 300 | md | Markdown | docs/issues.md | gpby/ASStoredProcedures | f3d9c36dfc6b5dc004c3e5e18aab31d344e9557c | [
"MIT"
] | 17 | 2017-10-21T21:19:11.000Z | 2021-05-24T13:31:29.000Z | docs/issues.md | gpby/ASStoredProcedures | f3d9c36dfc6b5dc004c3e5e18aab31d344e9557c | [
"MIT"
] | 9 | 2018-02-01T15:23:23.000Z | 2021-02-12T15:22:09.000Z | docs/issues.md | gpby/ASStoredProcedures | f3d9c36dfc6b5dc004c3e5e18aab31d344e9557c | [
"MIT"
] | 8 | 2017-11-08T14:29:43.000Z | 2021-06-05T02:29:55.000Z | ---
layout: page
title: Issues
---
{% if site.github.issues_url != '' %}
All issues and feature requests can be found on the [GitHub Issues]({{ site.github.issues_url }}) list.
Please check the existing issues before creating a new one to avoid creating duplicates.
{% else %}
No GitHub
{% endif %} | 25 | 104 | 0.703333 | eng_Latn | 0.979181 |
d399fca7196a8beb5b5fc74479430be55de59cce | 27,894 | md | Markdown | fabric/3471-3813/3685.md | hyperledger-gerrit-archive/fabric-gerrit | 188c6e69ccb2e4c4d609ae749a467fa7e289b262 | [
"Apache-2.0"
] | 2 | 2021-11-08T08:06:48.000Z | 2021-12-03T01:51:44.000Z | fabric/3471-3813/3685.md | cendhu/fabric-gerrit | 188c6e69ccb2e4c4d609ae749a467fa7e289b262 | [
"Apache-2.0"
] | null | null | null | fabric/3471-3813/3685.md | cendhu/fabric-gerrit | 188c6e69ccb2e4c4d609ae749a467fa7e289b262 | [
"Apache-2.0"
] | 4 | 2019-12-07T05:54:26.000Z | 2020-06-04T02:29:43.000Z | <strong>Project</strong>: fabric<br><strong>Branch</strong>: master<br><strong>ID</strong>: 3685<br><strong>Subject</strong>: [FAB-1523] Populate block metadata LastConfig<br><strong>Status</strong>: MERGED<br><strong>Owner</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Assignee</strong>:<br><strong>Created</strong>: 1/5/2017, 12:48:07 AM<br><strong>LastUpdated</strong>: 1/11/2017, 7:57:16 AM<br><strong>CommitMessage</strong>:<br><pre>[FAB-1523] Populate block metadata LastConfig
https://jira.hyperledger.org/browse/FAB-1523
This changeset adds new data types for Metadata and for
LastConfiguration which is the second field of the block metadata.
The Metadata type is intended to accomodate the other metadata fields as
well, but is currently only utilized for the LastConfiguration.
This changeset also populates the LastConfiguration field of the block
metadata with this new data structure.
Change-Id: I36e72f4c27b67dd7455aae2f423b4b14e54f9413
Signed-off-by: Jason Yellick <jyellick@us.ibm.com>
</pre><h1>Comments</h1><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/5/2017, 12:48:07 AM<br><strong>Message</strong>: <pre>Uploaded patch set 1.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 12:49:22 AM<br><strong>Message</strong>: <pre>Patch Set 1:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4647/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 12:52:29 AM<br><strong>Message</strong>: <pre>Patch Set 1: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4647/ : FAILURE</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/5/2017, 1:46:00 PM<br><strong>Message</strong>: <pre>Uploaded patch set 2.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 1:48:55 PM<br><strong>Message</strong>: <pre>Patch Set 2:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4704/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 1:52:45 PM<br><strong>Message</strong>: <pre>Patch Set 2: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4704/ : FAILURE</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/5/2017, 3:17:12 PM<br><strong>Message</strong>: <pre>Uploaded patch set 3.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 3:18:32 PM<br><strong>Message</strong>: <pre>Patch Set 3:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4707/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 3:27:44 PM<br><strong>Message</strong>: <pre>Patch Set 3: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4707/ : FAILURE</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/5/2017, 5:15:56 PM<br><strong>Message</strong>: <pre>Uploaded patch set 4.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 5:18:07 PM<br><strong>Message</strong>: <pre>Patch Set 4:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4713/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 7:18:26 PM<br><strong>Message</strong>: <pre>Patch Set 4: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4713/ : FAILURE</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/5/2017, 9:41:06 PM<br><strong>Message</strong>: <pre>Patch Set 4:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 9:42:43 PM<br><strong>Message</strong>: <pre>Patch Set 4: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4722/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/5/2017, 11:43:24 PM<br><strong>Message</strong>: <pre>Patch Set 4: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4722/ : FAILURE</pre><strong>Reviewer</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Reviewed</strong>: 1/6/2017, 3:20:28 AM<br><strong>Message</strong>: <pre>Patch Set 4:
According to the log:
ts
03:20:45 [33munit-tests_1 |[0m ok github.com/hyperledger/fabric/orderer/sbft/crypto 0.003s coverage: 76.9% of statements
04:44:32 Build timed out (after 120 minutes). Marking the build as failed.
The next package to run was orderer/sbft/main, which has the sbft network tests. Any idea why it timed out?</pre><strong>Reviewer</strong>: Gabor Hosszu - gabor@digitalasset.com<br><strong>Reviewed</strong>: 1/6/2017, 4:24:08 AM<br><strong>Message</strong>: <pre>Uploaded patch set 5.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/6/2017, 4:29:30 AM<br><strong>Message</strong>: <pre>Patch Set 5:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4734/</pre><strong>Reviewer</strong>: Gabor Hosszu - gabor@digitalasset.com<br><strong>Reviewed</strong>: 1/6/2017, 4:49:44 AM<br><strong>Message</strong>: <pre>Patch Set 6: Published edit on patch set 5.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/6/2017, 4:53:40 AM<br><strong>Message</strong>: <pre>Patch Set 6:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4736/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/6/2017, 5:00:43 AM<br><strong>Message</strong>: <pre>Patch Set 5: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4734/ : SUCCESS</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/6/2017, 5:30:25 AM<br><strong>Message</strong>: <pre>Patch Set 6: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4736/ : SUCCESS</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/6/2017, 10:00:05 AM<br><strong>Message</strong>: <pre>Uploaded patch set 7: Patch Set 6 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/6/2017, 10:02:33 AM<br><strong>Message</strong>: <pre>Patch Set 7:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4758/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/6/2017, 10:46:47 AM<br><strong>Message</strong>: <pre>Patch Set 7: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4758/ : SUCCESS</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/8/2017, 4:54:37 PM<br><strong>Message</strong>: <pre>Uploaded patch set 8: Patch Set 7 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/8/2017, 4:58:09 PM<br><strong>Message</strong>: <pre>Patch Set 8:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4821/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/8/2017, 6:18:59 PM<br><strong>Message</strong>: <pre>Patch Set 8: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4821/ : FAILURE</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/9/2017, 11:11:45 AM<br><strong>Message</strong>: <pre>Patch Set 8:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/9/2017, 11:13:46 AM<br><strong>Message</strong>: <pre>Patch Set 8: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4859/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/9/2017, 12:03:02 PM<br><strong>Message</strong>: <pre>Patch Set 8: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4859/ : SUCCESS</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/9/2017, 5:49:44 PM<br><strong>Message</strong>: <pre>Uploaded patch set 9: Patch Set 8 was rebased.</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/9/2017, 5:52:47 PM<br><strong>Message</strong>: <pre>Patch Set 9:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4891/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/9/2017, 6:57:18 PM<br><strong>Message</strong>: <pre>Patch Set 9: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4891/ : SUCCESS</pre><strong>Reviewer</strong>: Kostas Christidis - kostas@gmail.com<br><strong>Reviewed</strong>: 1/10/2017, 3:28:02 PM<br><strong>Message</strong>: <pre>Patch Set 9: Code-Review+1
LGTM</pre><strong>Reviewer</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 4:24:44 PM<br><strong>Message</strong>: <pre>Patch Set 9:
(4 comments)</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 4:31:01 PM<br><strong>Message</strong>: <pre>Patch Set 9:
(2 comments)</pre><strong>Reviewer</strong>: Kostas Christidis - kostas@gmail.com<br><strong>Reviewed</strong>: 1/10/2017, 4:39:34 PM<br><strong>Message</strong>: <pre>Patch Set 9:
(1 comment)</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 4:40:53 PM<br><strong>Message</strong>: <pre>Uploaded patch set 10.</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 4:41:56 PM<br><strong>Message</strong>: <pre>Uploaded patch set 11.</pre><strong>Reviewer</strong>: Kostas Christidis - kostas@gmail.com<br><strong>Reviewed</strong>: 1/10/2017, 4:42:10 PM<br><strong>Message</strong>: <pre>Patch Set 11: Code-Review+1</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 4:42:52 PM<br><strong>Message</strong>: <pre>Patch Set 9:
(3 comments)</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 4:43:34 PM<br><strong>Message</strong>: <pre>Patch Set 10:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4959/</pre><strong>Reviewer</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 4:43:52 PM<br><strong>Message</strong>: <pre>Patch Set 11: Code-Review+2</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 4:45:02 PM<br><strong>Message</strong>: <pre>Patch Set 11:
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4961/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 5:32:19 PM<br><strong>Message</strong>: <pre>Patch Set 10: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4959/ : FAILURE</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 5:47:07 PM<br><strong>Message</strong>: <pre>Patch Set 11: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4961/ : FAILURE</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 5:51:58 PM<br><strong>Message</strong>: <pre>Patch Set 11:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 5:54:38 PM<br><strong>Message</strong>: <pre>Patch Set 11: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4970/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 6:46:22 PM<br><strong>Message</strong>: <pre>Patch Set 11: Verified-1
Build Failed
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4970/ : FAILURE</pre><strong>Reviewer</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Reviewed</strong>: 1/10/2017, 9:41:34 PM<br><strong>Message</strong>: <pre>Patch Set 11: Code-Review+2</pre><strong>Reviewer</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Reviewed</strong>: 1/10/2017, 9:59:01 PM<br><strong>Message</strong>: <pre>Patch Set 11: Verified+1
CI failed multiple times today on this changeset, but it tested fine on my own local machine. I am manually +1</pre><strong>Reviewer</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Reviewed</strong>: 1/10/2017, 10:10:03 PM<br><strong>Message</strong>: <pre>Patch Set 11:
reverify</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 10:12:11 PM<br><strong>Message</strong>: <pre>Patch Set 11: -Verified
Build Started https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4978/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/10/2017, 10:49:34 PM<br><strong>Message</strong>: <pre>Patch Set 11: Verified+1
Build Successful
https://jenkins.hyperledger.org/job/fabric-verify-x86_64/4978/ : SUCCESS</pre><strong>Reviewer</strong>: Kostas Christidis - kostas@gmail.com<br><strong>Reviewed</strong>: 1/10/2017, 10:55:19 PM<br><strong>Message</strong>: <pre>Patch Set 11:
(FWIW, this now passes CI as well.)</pre><strong>Reviewer</strong>: Gerrit Code Review - gerrit@hyperledger.org<br><strong>Reviewed</strong>: 1/11/2017, 7:15:57 AM<br><strong>Message</strong>: <pre>Change has been successfully merged by Christopher Ferris</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/11/2017, 7:18:24 AM<br><strong>Message</strong>: <pre>Patch Set 11:
Build Started https://jenkins.hyperledger.org/job/fabric-merge-x86_64/715/</pre><strong>Reviewer</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Reviewed</strong>: 1/11/2017, 7:57:16 AM<br><strong>Message</strong>: <pre>Patch Set 11:
Build Successful
https://jenkins.hyperledger.org/job/fabric-merge-x86_64/715/ : SUCCESS</pre><h1>PatchSets</h1><h3>PatchSet Number: 1</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/5/2017, 12:48:07 AM<br><strong>UnmergedRevision</strong>: [d43306e7ba98ecf5ea7d64185ab8d7c42af6587c](https://github.com/hyperledger-gerrit-archive/fabric/commit/d43306e7ba98ecf5ea7d64185ab8d7c42af6587c)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/5/2017, 12:52:29 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 2</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/5/2017, 1:46:00 PM<br><strong>UnmergedRevision</strong>: [a667cdce16e39ad702dd083f30475b8454ebd876](https://github.com/hyperledger-gerrit-archive/fabric/commit/a667cdce16e39ad702dd083f30475b8454ebd876)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/5/2017, 1:52:45 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 3</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/5/2017, 3:17:12 PM<br><strong>UnmergedRevision</strong>: [0732392333fa3009b13be79bda29744a1e296691](https://github.com/hyperledger-gerrit-archive/fabric/commit/0732392333fa3009b13be79bda29744a1e296691)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/5/2017, 3:27:44 
PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 4</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/5/2017, 5:15:56 PM<br><strong>UnmergedRevision</strong>: [345d682d734c4c2fa3e2cf325264fca74bd2e895](https://github.com/hyperledger-gerrit-archive/fabric/commit/345d682d734c4c2fa3e2cf325264fca74bd2e895)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/5/2017, 11:43:24 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 5</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Gabor Hosszu - gabor@digitalasset.com<br><strong>Created</strong>: 1/6/2017, 4:24:08 AM<br><strong>UnmergedRevision</strong>: [6baba16da76a4732bd839a112e27802d08453dec](https://github.com/hyperledger-gerrit-archive/fabric/commit/6baba16da76a4732bd839a112e27802d08453dec)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/6/2017, 5:00:43 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 6</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/6/2017, 4:49:44 AM<br><strong>UnmergedRevision</strong>: [9c2c58fb81bba083ef4b6038aead0c9ea4df887f](https://github.com/hyperledger-gerrit-archive/fabric/commit/9c2c58fb81bba083ef4b6038aead0c9ea4df887f)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/6/2017, 5:30:25 
AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 7</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/6/2017, 10:00:05 AM<br><strong>UnmergedRevision</strong>: [306c3eb500867f953e6faf9be73371565ab285d1](https://github.com/hyperledger-gerrit-archive/fabric/commit/306c3eb500867f953e6faf9be73371565ab285d1)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/6/2017, 10:46:47 AM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 8</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/8/2017, 4:54:37 PM<br><strong>UnmergedRevision</strong>: [4c449563d15e17813c09585bced4608cde37f25b](https://github.com/hyperledger-gerrit-archive/fabric/commit/4c449563d15e17813c09585bced4608cde37f25b)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/9/2017, 12:03:02 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br></blockquote><h3>PatchSet Number: 9</h3><blockquote><strong>Type</strong>: TRIVIAL_REBASE<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/9/2017, 5:49:44 PM<br><strong>UnmergedRevision</strong>: [6f7784367b237adf69d899fdba541e83fa3df0af](https://github.com/hyperledger-gerrit-archive/fabric/commit/6f7784367b237adf69d899fdba541e83fa3df0af)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 
1/9/2017, 6:57:18 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Kostas Christidis - kostas@gmail.com<br><strong>Approved</strong>: 1/10/2017, 3:28:02 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><h2>Comments</h2><strong>Commenter</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>CommentLine</strong>: [common/configtx/manager.go#L46](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/common/configtx/manager.go#L46)<br><strong>Comment</strong>: <pre>This has nothing to do with the change set, but- what do you think about giving this a more meaningful name? like, Updater, or Mutator or something that resembles the role of the interface?</pre><strong>Commenter</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>CommentLine</strong>: [common/configtx/manager.go#L46](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/common/configtx/manager.go#L46)<br><strong>Comment</strong>: <pre>I'm not opposed, my original thought was "This thing manages the current configuration" but, as a name 'configtx.Manager' isn't really correct, something like 'configtx.Updater' probably does make more sense. 
(Though I would suggest a different changeset)</pre><strong>Commenter</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/configtx/configtx.go#L2](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/configtx/configtx.go#L2)<br><strong>Comment</strong>: <pre>2017</pre><strong>Commenter</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/configtx/configtx.go#L2](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/configtx/configtx.go#L2)<br><strong>Comment</strong>: <pre>Done</pre><strong>Commenter</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/configtx/configtx_test.go#L2](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/configtx/configtx_test.go#L2)<br><strong>Comment</strong>: <pre>2017</pre><strong>Commenter</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/configtx/configtx_test.go#L2](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/configtx/configtx_test.go#L2)<br><strong>Comment</strong>: <pre>Done</pre><strong>Commenter</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/multichain/multichain.go#L71](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/multichain/multichain.go#L71)<br><strong>Comment</strong>: <pre>who's using this variable - committers? Are you using this in a different implementation of the interface method?
If yes, we can do a _ here instead of variable name IMO.</pre><strong>Commenter</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/multichain/multichain.go#L71](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/multichain/multichain.go#L71)<br><strong>Comment</strong>: <pre>Right, this is the mock implementation, the real implementation actually uses the committers. If we think using an _ makes the naming more clear, happy to do it.</pre><strong>Commenter</strong>: Kostas Christidis - kostas@gmail.com<br><strong>CommentLine</strong>: [orderer/mocks/multichain/multichain.go#L71](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/multichain/multichain.go#L71)<br><strong>Comment</strong>: <pre>(An underscore here is a good call.)</pre><strong>Commenter</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>CommentLine</strong>: [orderer/mocks/multichain/multichain.go#L71](https://github.com/hyperledger-gerrit-archive/fabric/blob/6f7784367b237adf69d899fdba541e83fa3df0af/orderer/mocks/multichain/multichain.go#L71)<br><strong>Comment</strong>: <pre>Done</pre></blockquote><h3>PatchSet Number: 10</h3><blockquote><strong>Type</strong>: REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/10/2017, 4:40:53 PM<br><strong>UnmergedRevision</strong>: [0df50b87940315b8ea19a20671343673ea0452c0](https://github.com/hyperledger-gerrit-archive/fabric/commit/0df50b87940315b8ea19a20671343673ea0452c0)<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/10/2017, 5:32:19 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: -1<br><br></blockquote><h3>PatchSet Number: 11</h3><blockquote><strong>Type</strong>: 
REWORK<br><strong>Author</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Uploader</strong>: Jason Yellick - jyellick@us.ibm.com<br><strong>Created</strong>: 1/10/2017, 4:41:56 PM<br><strong>GitHubMergedRevision</strong>: [75909aafdf4ea51f349a098ec6eadfcc943e8c23](https://github.com/hyperledger-gerrit-archive/fabric/commit/75909aafdf4ea51f349a098ec6eadfcc943e8c23)<br><br><strong>MergedBy</strong>: Christopher Ferris<br><strong>Merged</strong>: 1/11/2017, 7:15:56 AM<br><br><strong>Approver</strong>: Hyperledger Jobbuilder - jobbuilder@jenkins.hyperledger.org<br><strong>Approved</strong>: 1/10/2017, 10:49:34 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Kostas Christidis - kostas@gmail.com<br><strong>Approved</strong>: 1/10/2017, 4:42:10 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Approved</strong>: 1/10/2017, 9:41:34 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Binh Nguyen - binh1010010110@gmail.com<br><strong>Approved</strong>: 1/10/2017, 9:59:01 PM<br><strong>Type</strong>: Verified<br><strong>Value</strong>: 1<br><br><strong>Approver</strong>: Yacov Manevich - yacovm@il.ibm.com<br><strong>Approved</strong>: 1/10/2017, 4:43:52 PM<br><strong>Type</strong>: Code-Review<br><strong>Value</strong>: 1<br><br></blockquote> | 197.829787 | 9,045 | 0.767262 | kor_Hang | 0.296057 |
d39a73075efbe6a37a9b43f5f6879f2267bc154c | 161 | md | Markdown | readme.md | urpaMaker/phaser-first | f53cfe5572b3838c055634ef84ba81733a9e0d75 | [
"MIT"
] | null | null | null | readme.md | urpaMaker/phaser-first | f53cfe5572b3838c055634ef84ba81733a9e0d75 | [
"MIT"
] | null | null | null | readme.md | urpaMaker/phaser-first | f53cfe5572b3838c055634ef84ba81733a9e0d75 | [
"MIT"
] | null | null | null | # First Phaser Project
## [StarterKit](https://github.com/ourcade/phaser3-parcel-template)
### To start:
- npm i -g parcel-bundler
- npm i
- npm run start
### Useful links
<hr />
<br />
https://github.com/ethereum/mist/wiki
https://www.parity.io/
https://metamask.io/
https://remix.ethereum.org/#optimize=false&version=soljson-v0.4.24+commit.e67f0147.js
http://truffleframework.com/
http://truffleframework.com/ganache/
https://github.com/ethereum/wiki/wiki/JavaScript-API
https://ipfs.io/
https://swarm-guide.readthedocs.io/en/latest/introduction.html
https://souptacular.gitbooks.io/ethereum-tutorials-and-tips-by-hudson/content/private-chain.html
https://www.npmjs.com/package/ganache-cli
<div itemscope itemtype="http://developers.google.com/ReferenceObject">
<meta itemprop="name" content="tfr.data.build_ranking_dataset" />
<meta itemprop="path" content="Stable" />
</div>
# tfr.data.build_ranking_dataset
<!-- Insert buttons -->
<table class="tfo-notebook-buttons tfo-api" align="left">
<td>
<a target="_blank" href="https://github.com/tensorflow/ranking/tree/master/tensorflow_ranking/python/data.py">
<img src="https://www.tensorflow.org/images/GitHub-Mark-32px.png" />
View source on GitHub
</a>
</td></table>
<!-- Start diff -->
Builds a ranking tf.dataset with a standard data format.
```python
tfr.data.build_ranking_dataset(
file_pattern,
data_format,
batch_size,
context_feature_spec,
example_feature_spec,
list_size=None,
**kwargs
)
```
<!-- Placeholder for "Used in" -->
#### Args:
* <b>`file_pattern`</b>: See `build_ranking_dataset_with_parsing_fn`.
* <b>`data_format`</b>: See `make_parsing_fn`.
* <b>`batch_size`</b>: See `build_ranking_dataset_with_parsing_fn`.
* <b>`context_feature_spec`</b>: See `make_parsing_fn`.
* <b>`example_feature_spec`</b>: See `make_parsing_fn`.
* <b>`list_size`</b>: See `make_parsing_fn`.
* <b>`**kwargs`</b>: The kwargs passed to
`build_ranking_dataset_with_parsing_fn`.
#### Returns:
See `build_ranking_dataset_with_parsing_fn`.
---
type: doc
layout: reference
category: "Syntax"
title: "Scope Functions"
---
# Scope Functions
The Kotlin standard library contains several functions whose sole purpose is to execute a block of code within the context of an object.
When you call such a function on an object with a [lambda expression](lambdas.html) provided, it forms a temporary scope.
In this scope, you can access the object without its name.
Such functions are called _scope functions_. There are five of them: `let`, `run`, `with`, `apply`, and `also`.
Basically, these functions all do the same thing: execute a block of code on an object.
What's different is how the object becomes available inside the block and what the result of the whole expression is.
Here's a typical usage of a scope function:
<div class="sample" markdown="1" theme="idea">
```kotlin
data class Person(var name: String, var age: Int, var city: String) {
fun moveTo(newCity: String) { city = newCity }
fun incrementAge() { age++ }
}
fun main() {
//sampleStart
Person("Alice", 20, "Amsterdam").let {
println(it)
it.moveTo("London")
it.incrementAge()
println(it)
}
//sampleEnd
}
```
</div>
If you write the same without `let`, you'll have to introduce a new variable and repeat its name whenever you use it.
<div class="sample" markdown="1" theme="idea">
```kotlin
data class Person(var name: String, var age: Int, var city: String) {
fun moveTo(newCity: String) { city = newCity }
fun incrementAge() { age++ }
}
fun main() {
//sampleStart
val alice = Person("Alice", 20, "Amsterdam")
println(alice)
alice.moveTo("London")
alice.incrementAge()
println(alice)
//sampleEnd
}
```
</div>
The scope functions do not introduce any new technical capabilities, but they can make your code more concise and readable.
Due to the similar nature of scope functions, choosing the right one for your case can be a bit tricky.
The choice mainly depends on your intent and the consistency of use in your project.
Below we'll provide detailed descriptions of the differences between scope functions and the conventions on their usage.
## Distinctions
Because the scope functions are all quite similar in nature, it's important to understand the differences between them. There are two main differences between them:
* The way to refer to the context object
* The return value.
### Context object: `this` or `it`
Inside the lambda of a scope function, the context object is available by a short reference instead of its actual name.
Each scope function uses one of two ways to refer to the context object:
as a lambda [receiver](lambdas.html#function-literals-with-receiver) (`this`),
or as a lambda argument (`it`).
Both provide the same capabilities, so we'll describe the pros and cons of each for different cases and provide recommendations on their use.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
val str = "Hello"
// using this
str.run {
println("The receiver string length: $length")
//println("The receiver string length: ${this.length}") // does the same as above
}
// using it
str.let {
println("The receiver string's length is ${it.length}")
}
}
```
</div>
#### this
`run`, `with`, and `apply` refer to the context object as a lambda receiver - by the keyword `this`.
Hence, in their lambdas the object is available just like it would be in ordinary class functions.
In most cases, you can omit `this` when accessing the members of the receiver object, making the code shorter.
On the other hand, if `this` is omitted, it can be hard to distinguish between the receiver members and external objects or functions.
So, having the context object as a receiver (`this`) is recommended for lambdas that mainly operate on the object's members: calling its functions or assigning its properties.
<div class="sample" markdown="1" theme="idea">
```kotlin
data class Person(var name: String, var age: Int = 0, var city: String = "")
fun main() {
//sampleStart
val adam = Person("Adam").apply {
age = 20 // same as this.age = 20 or adam.age = 20
city = "London"
}
println(adam)
//sampleEnd
}
```
</div>
#### it
In turn, `let` and `also` pass the context object as a lambda argument.
If the argument name is not specified, the object is accessed by the implicit default name `it`.
`it` is shorter than `this`, and expressions with `it` are usually easier to read.
However, you can't access the object's functions and properties implicitly the way you can with `this`.
Hence, having the context object as `it` is better when the object is mostly used as an argument in function calls.
`it` is also a better choice if your code block contains multiple variables.
<div class="sample" markdown="1" theme="idea">
```kotlin
import kotlin.random.Random
fun writeToLog(message: String) {
println("INFO: $message")
}
fun main() {
//sampleStart
fun getRandomInt(): Int {
return Random.nextInt(100).also {
writeToLog("getRandomInt() generated value $it")
}
}
val i = getRandomInt()
//sampleEnd
}
```
</div>
Additionally, when you pass the context object as an argument, you can provide a custom name for it inside the scope.
<div class="sample" markdown="1" theme="idea">
```kotlin
import kotlin.random.Random
fun writeToLog(message: String) {
println("INFO: $message")
}
fun main() {
//sampleStart
fun getRandomInt(): Int {
return Random.nextInt(100).also { value ->
writeToLog("getRandomInt() generated value $value")
}
}
val i = getRandomInt()
//sampleEnd
}
```
</div>
### Return value
The scope functions also differ by their return value:
* `apply` and `also` return the context object.
* `let`, `run`, and `with` return the lambda result.
These two options let you choose the proper function depending on what you do next in your code.
#### Context object
The return value of `apply` and `also` is the context object itself.
Hence, they can be included into call chains as _side steps_: you can continue chaining function calls on the same object after them.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numberList = mutableListOf<Double>()
numberList.also { println("Populating the list") }
.apply {
add(2.71)
add(3.14)
add(1.0)
}
.also { println("Sorting the list") }
.sort()
//sampleEnd
println(numberList)
}
```
</div>
They can also be used in return statements of functions returning the context object.
<div class="sample" markdown="1" theme="idea">
```kotlin
import kotlin.random.Random
fun writeToLog(message: String) {
println("INFO: $message")
}
fun main() {
//sampleStart
fun getRandomInt(): Int {
return Random.nextInt(100).also {
writeToLog("getRandomInt() generated value $it")
}
}
val i = getRandomInt()
//sampleEnd
}
```
</div>
#### Lambda result
`let`, `run`, and `with` return the lambda result.
So you can use them when assigning the result to a variable, chaining operations on the result, and so on.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three")
val countEndsWithE = numbers.run {
add("four")
add("five")
count { it.endsWith("e") }
}
println("There are $countEndsWithE elements that end with e.")
//sampleEnd
}
```
</div>
Additionally, you can ignore the return value and use a scope function just to create a temporary scope for variables.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three")
with(numbers) {
val firstItem = first()
val lastItem = last()
println("First item: $firstItem, last item: $lastItem")
}
//sampleEnd
}
```
</div>
## Functions
To help you choose the right scope function for your case, we'll describe them in detail and provide usage recommendations.
Technically, scope functions are interchangeable in many cases, so the examples show the conventional usage style.
### `let`
**The context object** is available as an argument (`it`). **The return value** is the lambda result.
`let` can be used to invoke one or more functions on results of call chains.
For example, the following code executes two operations on a collection and prints the result:
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three", "four", "five")
val resultList = numbers.map { it.length }.filter { it > 3 }
println(resultList)
//sampleEnd
}
```
</div>
With `let`, this code can be rewritten as:
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three", "four", "five")
numbers.map { it.length }.filter { it > 3 }.let {
println(it)
// and more function calls if needed
}
//sampleEnd
}
```
</div>
If the code block contains a single function with `it` as an argument, you can use the method reference (`::`) instead of the lambda:
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three", "four", "five")
numbers.map { it.length }.filter { it > 3 }.let(::println)
//sampleEnd
}
```
</div>
`let` is often used for executing a code block only with non-null values.
To perform actions on a non-null object, use the safe call operator `?.` on it and call `let` with the actions in its lambda.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun processNonNullString(str: String) {}
fun main() {
//sampleStart
val str: String? = "Hello"
//processNonNullString(str) // compilation error: str can be null
val length = str?.let {
println("let() called on $it")
processNonNullString(it) // OK: 'it' is not null inside '?.let { }'
it.length
}
//sampleEnd
}
```
</div>
Another case for using `let` is introducing local variables with a limited scope to improve code readability.
To define a new variable for the context object, provide its name as the lambda argument so that it can be used instead of the default `it`.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = listOf("one", "two", "three", "four")
val modifiedFirstItem = numbers.first().let { firstItem ->
println("The first item of the list is '$firstItem'")
if (firstItem.length >= 5) firstItem else "!" + firstItem + "!"
}.toUpperCase()
println("First item after modifications: '$modifiedFirstItem'")
//sampleEnd
}
```
</div>
### `with`
A non-extension function: **the context object** is passed as an argument, but inside the lambda, it's available as a receiver (`this`). **The return value** is the lambda result.
We recommend `with` for calling functions on the context object without using the lambda result.
In the code, `with` can be read as "_with this object, do the following._"
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three")
with(numbers) {
println("'with' is called with argument $this")
println("It contains $size elements")
}
//sampleEnd
}
```
</div>
Another use case for `with` is introducing a helper object whose properties or functions will be used for calculating a value.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three")
val firstAndLast = with(numbers) {
"The first element is ${first()}," +
" the last element is ${last()}"
}
println(firstAndLast)
//sampleEnd
}
```
</div>
### `run`
**The context object** is available as a receiver (`this`). **The return value** is the lambda result.
`run` does the same as `with` but is invoked like `let` - as an extension function of the context object.
`run` is useful when your lambda contains both the object initialization and the computation of the return value.
<div class="sample" markdown="1" theme="idea">
```kotlin
class MultiportService(var url: String, var port: Int) {
fun prepareRequest(): String = "Default request"
fun query(request: String): String = "Result for query '$request'"
}
fun main() {
//sampleStart
val service = MultiportService("https://example.kotlinlang.org", 80)
val result = service.run {
port = 8080
query(prepareRequest() + " to port $port")
}
// the same code written with let() function:
val letResult = service.let {
it.port = 8080
it.query(it.prepareRequest() + " to port ${it.port}")
}
//sampleEnd
println(result)
println(letResult)
}
```
</div>
Besides calling `run` on a receiver object, you can use it as a non-extension function.
Non-extension `run` lets you execute a block of several statements where an expression is required.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val hexNumberRegex = run {
val digits = "0-9"
val hexDigits = "A-Fa-f"
val sign = "+-"
Regex("[$sign]?[$digits$hexDigits]+")
}
for (match in hexNumberRegex.findAll("+1234 -FFFF not-a-number")) {
println(match.value)
}
//sampleEnd
}
```
</div>
### `apply`
**The context object** is available as a receiver (`this`). **The return value** is the object itself.
Use `apply` for code blocks that don't return a value and mainly operate on the members of the receiver object.
The most common case for `apply` is object configuration. Such calls can be read as "_apply the following assignments to the object._"
<div class="sample" markdown="1" theme="idea">
```kotlin
data class Person(var name: String, var age: Int = 0, var city: String = "")
fun main() {
//sampleStart
val adam = Person("Adam").apply {
age = 32
city = "London"
}
println(adam)
//sampleEnd
}
```
</div>
Having the receiver as the return value, you can easily include `apply` into call chains for more complex processing.
### `also`
**The context object** is available as an argument (`it`). **The return value** is the object itself.
`also` is good for performing some actions that take the context object as an argument.
Use `also` for actions that need a reference to the object itself rather than its properties and functions,
or when you don't want to shadow the `this` reference from an outer scope.
When you see `also` in the code, you can read it as "_and also do the following with the object._"
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val numbers = mutableListOf("one", "two", "three")
numbers
.also { println("The list elements before adding new one: $it") }
.add("four")
//sampleEnd
}
```
</div>
## Function selection
To help you choose the right scope function for your purpose, we provide this table of key differences between them.
|Function|Object reference|Return value|Is extension function|
|---|---|---|---|
|`let`|`it`|Lambda result|Yes|
|`run`|`this`|Lambda result|Yes|
|`run`|-|Lambda result|No: called without the context object|
|`with`|`this`|Lambda result|No: takes the context object as an argument.|
|`apply`|`this`|Context object|Yes|
|`also`|`it`|Context object|Yes|
Here is a short guide for choosing scope functions depending on the intended purpose:
* Executing a lambda on non-null objects: `let`
* Introducing an expression as a variable in local scope: `let`
* Object configuration: `apply`
* Object configuration and computing the result: `run`
* Running statements where an expression is required: non-extension `run`
* Additional effects: `also`
* Grouping function calls on an object: `with`
The use cases of different scope functions overlap, so you can choose between them based on the conventions used in your project or team.
Although the scope functions are a way of making the code more concise, avoid overusing them: it can decrease your code readability and lead to errors.
Avoid nesting scope functions and be careful when chaining them: it's easy to get confused about the current context object and the value of `this` or `it`.
## `takeIf` and `takeUnless`
In addition to scope functions, the standard library contains the functions `takeIf` and `takeUnless`.
These functions let you embed checks of the object state in call chains.
When called on an object with a predicate provided, `takeIf` returns this object if it matches the predicate. Otherwise, it returns `null`.
So, `takeIf` is a filtering function for a single object.
In turn, `takeUnless` returns the object if it doesn't match the predicate and `null` if it does.
The object is available as a lambda argument (`it`).
<div class="sample" markdown="1" theme="idea">
```kotlin
import kotlin.random.*
fun main() {
//sampleStart
val number = Random.nextInt(100)
val evenOrNull = number.takeIf { it % 2 == 0 }
val oddOrNull = number.takeUnless { it % 2 == 0 }
println("even: $evenOrNull, odd: $oddOrNull")
//sampleEnd
}
```
</div>
When chaining other functions after `takeIf` and `takeUnless`,
don't forget to perform a null check or use a safe member call (`?.`) because their return value is nullable.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
val str = "Hello"
val caps = str.takeIf { it.isNotEmpty() }?.toUpperCase()
//val caps = str.takeIf { it.isNotEmpty() }.toUpperCase() // compilation error here
println(caps)
//sampleEnd
}
```
</div>
`takeIf` and `takeUnless` are especially useful together with scope functions.
A good case is chaining them with `let` for running a code block on objects that match the given predicate.
To do this, call `takeIf` on the object and then call `let` with a safe call (`?.`).
For objects that don't match the predicate, `takeIf` returns `null` and `let` isn't invoked.
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
fun displaySubstringPosition(input: String, sub: String) {
input.indexOf(sub).takeIf { it >= 0 }?.let {
println("The substring $sub is found in $input.")
println("Its start position is $it.")
}
}
displaySubstringPosition("010000011", "11")
displaySubstringPosition("010000011", "12")
//sampleEnd
}
```
</div>
Compare how the same function looks without these standard library functions:
<div class="sample" markdown="1" theme="idea">
```kotlin
fun main() {
//sampleStart
fun displaySubstringPosition(input: String, sub: String) {
val index = input.indexOf(sub)
if (index >= 0) {
println("The substring $sub is found in $input.")
println("Its start position is $index.")
}
}
displaySubstringPosition("010000011", "11")
displaySubstringPosition("010000011", "12")
//sampleEnd
}
```
</div>
# Example Application Usage
This is an example for how you can use the various utilities provided in this module, including:
* Using the Governor to limit concurrent tasks across distributed systems
* Using publish/subscribe to send and receive messages across applications or instances
* Using resource locking for distributed locking of a shared resource
Run like so, replacing the Redis host and port with those of your test server:
```sh
REDIS_HOST=192.168.99.100 REDIS_PORT=6379 node docs/example-app/index.js
```
Replace the values for your test environment.
The output of the application should look something like this:
```text
Governor example
* Task 0 ran on worker 1
* Task 0 completed.
* Task 1 ran on worker 0
* Task 1 completed.
* Task 4 ran on worker 0
* Task 4 completed.
* Task 2 ran on worker 1
* Task 3 ran on worker 0
* Task 2 completed.
* Task 3 completed.
* Task 5 ran on worker 0
* Task 5 completed.
Pub/Sub example
* Subscribed to channel: my_channel_1
* Subscribed to channel: my_channel_2
* Message published to my_channel_1
* Message published to my_channel_2
* Got message in channel my_channel_1: Hello there!
* Got message in channel my_channel_2: Hello there!
* Unsubscribed from channel: my_channel_1
* Unsubscribed from channel: my_channel_2
Resource locking example
* Worker 1 locked resource
* Worker 3 locked resource
* Worker 2 locked resource
DONE
```
+++
title = "Tutorial 5: Query Results"
description = ""
weight = 6
+++
{{< lead >}}
<br/>
In this tutorial we will talk about dealing with the results of our query.
{{< /lead >}}
## Text Tutorial
<br/>
<iframe width="900" height="800" src="https://nbviewer.jupyter.org/github/intermine/intermine-ws-python-docs/blob/master/05-tutorial.ipynb" title="Python Tutorial 05">
</iframe>
## Video Tutorial
<br/>
<iframe width="560" height="315" src="https://www.youtube.com/embed/k9Bs44aLO7k" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<br/>
<br/>
You can see the script [here]({{< ref "/python-scripts/video05" >}})
## Try Live
<br/>
You can try live <a href="https://mybinder.org/v2/gh/intermine/intermine-ws-python-docs/master?filepath=05-tutorial.ipynb">here</a>
# resource-base-interface
[![Version][version]](https://www.npmjs.com/package/@restorecommerce/resource-base-interface)[![Build Status][build]](https://travis-ci.org/restorecommerce/resource-base-interface?branch=master)[![Dependencies][depend]](https://david-dm.org/restorecommerce/resource-base-interface)[![Coverage Status][cover]](https://coveralls.io/github/restorecommerce/resource-base-interface?branch=master)
[version]: http://img.shields.io/npm/v/@restorecommerce/resource-base-interface.svg?style=flat-square
[build]: http://img.shields.io/travis/restorecommerce/resource-base-interface/master.svg?style=flat-square
[depend]: https://img.shields.io/david/restorecommerce/resource-base-interface.svg?style=flat-square
[cover]: http://img.shields.io/coveralls/restorecommerce/resource-base-interface/master.svg?style=flat-square
The `resource-base-interface` describes resource CRUD operations which can be bound to a service. Such operations are described via a [gRPC](https://grpc.io/docs/) interface with the message structures therefore being defined using [Protocol Buffers](https://developers.google.com/protocol-buffers/). This interface can be bound with any protobuf definition as long as it contains the endpoints defined in the [resource-base.proto](https://github.com/restorecommerce/protos/blob/master/io/restorecommerce/resource_base.proto) file (note that any resource message structure can be defined).
The exposed gRPC methods are implemented by the `ServiceBase` object which uses a `ResourceAPI` instance to perform operations with a database provider. The exposed interface is therefore agnostic to a specific database implementation.
However, a valid database provider is required. A set of such providers is implemented in [chassis-srv](https://github.com/restorecommerce/chassis-srv/).
This interface emits resource-related messages to [Apache Kafka](https://kafka.apache.org) which can be enabled or disabled at the `ServiceBase`'s constructor.
Methods for managing and traversing graph databases are supported for the [`ArangoDB provider`](https://docs.arangodb.com/3.3/HTTP/Gharial/)
## gRPC Interface
This interface describes the following gRPC endpoints for a generic resource of type `Resource`.
`io.restorecommerce.resourcebase.Resource`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| id | string | required | identifier for the resource |
| meta | io.restorecommerce.meta.Meta meta | optional | Meta information common to all Restore Commerce resources |
| value | number | optional | value for the resource |
| text | string | optional | textual data for the resource |
### Create
This operation is used for inserting resources to the database.
Requests are performed by providing a list of resources which are returned in the response. A [`meta`](https://github.com/restorecommerce/protos/blob/master/io/restorecommerce/meta.proto) should be present, containing relevant resource ownership information. Timestamps for creation and modification are then appended automatically to this property upon a `Create` request.
The resource is stored as a normal collection document by default.
If there is a [graph configuration](test/cfg/config.json#L11) specified for the resource, it is stored as a vertex in a vertex collection, along with the edge definitions provided in the configuration.
`io.restorecommerce.resourcebase.ResourceList`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| items | [ ] `io.restorecommerce.resourcebase.Resource` | required | list of resources |
| total_count | number | optional | total number of resources |
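As a sketch, a `Create` call takes a `ResourceList`-shaped payload of the example `Resource` above. The following TypeScript mimics how creation and modification timestamps are appended to each item's `meta`; the `owner` attribute shape here is illustrative, not taken from the actual `meta.proto` definition.

```typescript
// Plain-object shapes following the `Resource` and `ResourceList` tables above.
interface Meta { created?: number; modified?: number; owner: { id: string; value: string }[]; }
interface Resource { id: string; meta: Meta; value?: number; text?: string; }

// Mimics how `Create` stamps creation/modification timestamps onto `meta`.
function stampMeta(items: Resource[], now: number = Date.now()): Resource[] {
  return items.map((item) => ({
    ...item,
    meta: { ...item.meta, created: item.meta.created ?? now, modified: now },
  }));
}

const created = stampMeta([
  {
    id: 'sample',
    text: 'some text',
    value: 42,
    meta: { owner: [{ id: 'owner-attribute', value: 'some-user-id' }] }, // illustrative ownership info
  },
]);
```

The returned list is what a `Create` response would echo back: the same items, now carrying `meta.created` and `meta.modified`.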
### Read
This operation returns resources based on provided filter and options.
Requests are performed using `io.restorecommerce.resourcebase.ReadRequest` and responses are a list of resources.
`google.protobuf.Struct`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| fields | map<string, Value> | optional | Unordered map of dynamically typed values. |
`io.restorecommerce.resourcebase.Sort`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| field | string | optional | field to be sorted upon |
| SortOrder | enum | optional | sorting order, `UNSORTED`, `ASCENDING` or `DESCENDING` |
`io.restorecommerce.resourcebase.FieldFilter`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| name | string | optional | field name |
| include | bool | optional | include or exclude field |
`io.restorecommerce.resourcebase.ScopeFilter`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| scope | string | optional | scope to operate on |
| instance | string | optional | value to compare |
`io.restorecommerce.resourcebase.ReadRequest`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| offset | number | optional | offset of the resource |
| limit | number | optional | limit, default value is `1000` |
| filter | google.protobuf.Struct | optional | filter based on field values, multiple filters can be combined with `AND` and `OR` operators |
| sort | [ ]`io.restorecommerce.resourcebase.Sort` | optional | sort the resources |
| field | [ ] `io.restorecommerce.resourcebase.FieldFilter` | optional | fields selector, list of fields to be included or excluded, by default we get all the fields |
| search | [ ]string | optional | word search, not yet implemented |
| locales_limiter | [ ]string | optional | querying based on locales, not yet implemented |
| scope | `io.restorecommerce.resourcebase.ScopeFilter` | optional | scope to operate on, not yet implemented |
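Put together, a `ReadRequest` is an ordinary message object. The sketch below shows one as a plain TypeScript object following the tables above; the exact wire names (e.g. `order` for the sort enum) and the operator syntax inside the struct-encoded `filter` are assumptions for illustration.

```typescript
// Plain-object shape of a ReadRequest, following the field tables above.
type SortOrder = 'UNSORTED' | 'ASCENDING' | 'DESCENDING';
interface FieldFilter { name: string; include: boolean; }
interface ReadRequest {
  offset?: number;
  limit?: number;
  filter?: Record<string, unknown>;          // google.protobuf.Struct on the wire
  sort?: { field: string; order: SortOrder }[];
  field?: FieldFilter[];
}

const request: ReadRequest = {
  offset: 0,
  limit: 100,
  filter: { value: { $gt: 40 } },            // struct-encoded condition; operator syntax is illustrative
  sort: [{ field: 'created', order: 'DESCENDING' }],
  field: [{ name: 'id', include: true }, { name: 'text', include: true }],
};
```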
### Update
This operation is used for updating resources in the database.
Requests are performed by providing a list of resources and all updated items are returned within the response. Note that the only required properties on each resource are its `id` and the properties which are meant to be modified.
It is possible to specify in the configuration multiple edge definitions for one vertex. These edges are automatically updated when vertex documents are updated.
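A minimal sketch of what such a partial update means for a stored document: each patch carries its `id` plus only the properties to modify, and unspecified properties keep their stored values (the resource type here is an example, not part of the interface).

```typescript
// Example resource shape; only `id` is required on an update patch.
interface ExampleResource { id: string; text?: string; value?: number; }

// Merges a partial patch over the stored document, as an `Update` would.
function applyUpdate(existing: ExampleResource, patch: ExampleResource): ExampleResource {
  if (existing.id !== patch.id) throw new Error('id mismatch');
  return { ...existing, ...patch };
}

const before: ExampleResource = { id: 'sample', text: 'old', value: 1 };
const after = applyUpdate(before, { id: 'sample', text: 'new' });
```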
### Upsert
This operation is used for updating resources in the database or creating them if they do not exist.
Requests are performed by providing a resource list, which is returned in the response.
### Delete
This operation is used for deleting resources in the database.
Requests are performed using `io.restorecommerce.resourcebase.DeleteRequest` and responses are `google.protobuf.Empty` messages.
If a graph vertex is deleted, all connected edges are also deleted.
`io.restorecommerce.resourcebase.DeleteRequest`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| collection | string | required | Name of the target collection |
| ids | [ ]string | optional | List of resource identifiers to be deleted; if empty or not provided, the whole collection is truncated |
### Traversal
This operation is used for traversing graph resources in the database.
Requests are performed using `io.restorecommerce.graph.TraversalRequest` and responses are `io.restorecommerce.graph.TraversalResponse` messages.
`io.restorecommerce.graph.TraversalRequest`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| start_vertex | string | required | this can be either the `_id` or the `_key` of a vertex in the collection |
| opts | `io.restorecommerce.graph.Options` | optional | List of options for graph traversal |
| collection_name | string | optional | starting vertex's Collection name |
| edge_name | string | optional | edge name for traversal |
| data | bool | optional | if set to `true` only the vertices data is returned |
| path | bool | optional | if set to `true` only the traversed paths are returned |
| aql | bool | optional | if set to `true` traversal is executed as an AQL query |
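An illustrative plain-object form of such a request, following the fields above; the collection, edge, and vertex names are made up for the example.

```typescript
// Sketch of a TraversalRequest as a plain object (names are examples).
const traversalRequest = {
  start_vertex: 'organizations/root',           // `_id` (or `_key`) of the start vertex
  collection_name: 'organizations',
  edge_name: 'org_has_unit',
  data: true,                                   // return the visited vertices' data
  path: false,                                  // do not return the traversed paths
  opts: {
    direction: 'outbound',
    max_depth: 3,
    filter: [{ vertex: 'organizations/archived' }], // skip this vertex during traversal
  },
};
```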
`io.restorecommerce.graph.Options`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| direction | string | optional | Graph traversal direction, possible values are `outbound` or `inbound`, if not provided by default it is `outbound` traversal |
| filter | [ ] `io.restorecommerce.graph.Filter` | optional | List Vertexes to be filtered out, i.e. these vertices are not traversed |
| expander | [ ] `io.restorecommerce.graph.Expander` | optional | List of edges to be included in the traversal, by default all edges are included in traversal |
| sort | string | optional | JS code of custom comparison function for the edges |
| min_depth | uint32 | optional | visits only vertices at least at the given depth |
| start_vertex | string | optional | id of the start vertex |
| visitor | string | optional | JS code of custom visitor function |
| init | string | optional | JS code of custom result initialization function |
| item_order | string | optional | item iteration order, possible values are either `forward` or `backward` |
| strategy | string | optional | traversal strategy, possible values are either `depthfirst` or `breadthfirst` |
| max_iterations | uint32 | optional | maximum number of iterations in each traversal, this is used to prevent endless loops in cyclic graphs |
| max_depth | uint32 | optional | visits nodes in at most the given depth |
| uniqueness | `io.restorecommerce.graph.Uniqueness` | optional | specify uniqueness for vertices and edges visited |
| order | string | optional | traversal order, possible values are `preorder`, `postorder` or `preorder-expander` |
| graph_name | string | optional | name of graph that contain the edges |
| edge_collection | string | optional | name of the collection that contains the edges |
`io.restorecommerce.graph.Filter`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| vertex | string | optional | vertex to be excluded from traversal |
`io.restorecommerce.graph.Expander`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| edge | string | optional | expand this edge |
| direction | string | optional | direction of traversal, either `outbound` or `inbound` |
`io.restorecommerce.graph.Uniqueness`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| vertices | string | optional | specifies uniqueness for vertices visited, possible values are `none`, `global` or `path` |
| edges | string | optional | specifies uniqueness for edges visited, possible values are `none`, `global` or `path` |
`io.restorecommerce.graph.TraversalResponse`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| vertex_fields | [ ] `io.restorecommerce.graph.VertexFields` | required | Object containing vertex metadata: `id`, `_id`, `_key` and `_rev` values |
| paths | `google.protobuf.Any` | required | buffered data, contains the list of visited paths |
| data | `google.protobuf.Any` | required | buffered data, contains all the data from the visited vertices |
`io.restorecommerce.graph.VertexFields`
| Field | Type | Label | Description |
| ----- | ---- | ----- | ----------- |
| _id | string | required | vertex document handle |
| _key | string | required | vertex document unique key |
| _rev | string | required | revision (ETag) of the vertex document |
| id | string | required | id of the vertex collection |
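Putting the messages above together, a traversal request body might look like the following sketch. Only the field names listed in the tables above come from this document; the nesting of `filter` and `expander` as repeated request fields, and all concrete values, are illustrative assumptions — check the service's proto files for the authoritative shape.
```json
{
  "graph_name": "resourceGraph",
  "edge_collection": "resource_edges",
  "max_depth": 3,
  "max_iterations": 1000,
  "order": "preorder",
  "uniqueness": { "vertices": "global", "edges": "path" },
  "filter": [{ "vertex": "archived_resources" }],
  "expander": [{ "edge": "belongs_to", "direction": "outbound" }]
}
```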
## Kafka Events
A Kafka [`Topic`](https://github.com/restorecommerce/kafka-client/blob/master/src/events/provider/kafka/index.ts) can be provided when instantiating a `ServiceBase`. If `enableEvents` is set to true, the following events are emitted to Kafka by this microservice for each document of each CRUD request:
- `<ResourceName>Created`
- `<ResourceName>Read`
- `<ResourceName>Modified`
- `<ResourceName>Deleted`
The events emitted to Kafka can be used to restore the system in case of failure by implementing a [command-interface](https://github.com/restorecommerce/chassis-srv/blob/master/command-interface.md) in the respective microservice. For usage details, please see the [command-interface tests](https://github.com/restorecommerce/chassis-srv/blob/master/test/command_test.ts).
## Fields Configuration
It is possible to pass a fields [`configuration object`](test/cfg/config.json#L235) to `ResourceAPI` in order to enable some special field handlers.
### Field Generators
The `strategies` property can be used to specify fields within each resource which should be generated automatically. This autogeneration feature currently covers UUIDs, timestamps and sequential counters. The latter is particularly useful for fields such as a customer or an item number, which follow a sequential logic. In these cases, a [Redis](https://redis.io/) database is used to generate and read these values efficiently.
### Buffer Fields
Buffer-encoded fields can be decoded before being stored in the database. The `bufferFields` property specifies which fields of each resource should be handled this way. The values are encoded into buffers again when read from the database.
### Required Fields
The `requiredFields` config specifies which fields are required for each document of each resource.
An `InvalidArgument` error is thrown if one of these fields is missing when attempting to store a document.
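As a rough illustration, a fields configuration combining the three features above might look like the sketch below. The property names `strategies`, `bufferFields` and `requiredFields` come from this document; the resource name, field names and per-field syntax are made-up assumptions — the linked [`configuration object`](test/cfg/config.json#L235) is the authoritative reference.
```json
{
  "requiredFields": { "address": ["id", "country_id"] },
  "bufferFields": { "address": ["geo_data"] },
  "strategies": { "address": { "id": { "strategy": "uuid" } } }
}
```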
## Development
### Tests
See [tests](test/). To execute the tests, a set of _backing services_ is needed.
Refer to the [System](https://github.com/restorecommerce/system) repository to start the backing services before running the tests.
- To run tests
```sh
npm run test
```
## Usage
- Install dependencies
```sh
npm install
```
- Build
```sh
# compile the code
npm run build
```
# getAndRemoveByIndex\<T\>(<br> index: number,<br> array: T[]<br>): T
Removes and returns 1 item (accessed by `index`) from `array`.
The `index` can be negative or positive.
## Examples
```js
let arr = [10,20,30,40,50];
getAndRemoveByIndex(2, arr);
// --> 30
// arr is now [10,20,40,50]
arr = [10,20,30,40,50];
getAndRemoveByIndex(-2, arr);
// --> 40
// arr is now [10,20,30,50]
```
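For intuition, the behavior above can be sketched in a few lines. This is not the package's actual source — just an equivalent implementation relying on `Array.prototype.splice`, which already supports negative indexes:
```javascript
// Equivalent sketch of getAndRemoveByIndex (not the package's real source).
// splice(index, 1) removes one element at `index` (negative counts back from
// the end) and returns the removed elements as an array; return its first item.
function getAndRemoveByIndex(index, array) {
  return array.splice(index, 1)[0];
}

let arr = [10, 20, 30, 40, 50];
getAndRemoveByIndex(-2, arr);
// --> 40
// arr is now [10,20,30,50]
```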
## Installation
`npm i @writetome51/array-get-and-remove-by-index`
## Loading
```js
import {getAndRemoveByIndex} from '@writetome51/array-get-and-remove-by-index';
```
# Filing bugs
[target package]: https://github.com/kubernetes-sigs/kustomize/tree/master/pkg/target
[example of a target test]: https://github.com/kubernetes-sigs/kustomize/blob/master/pkg/target/baseandoverlaysmall_test.go
File issues as desired, but
if you've found a problem with how
`kustomize build` works, consider the
following to improve response time.
## A good report specifies
* the output of `kustomize version`,
* the input (the content of `kustomization.yaml`
and any files it refers to),
* the expected YAML output.
## A great report is a bug reproduction test
kustomize has a simple test harness in the
[target package] for specifying a kustomization's
input and the expected output.
See this [example of a target test].
The pattern is
* call `NewKustTestHarness`
* specify kustomization input data (resources,
patches, etc.) as inline strings,
* call `makeKustTarget().MakeCustomizedResMap()`
* compare the actual output to expected output
In a bug reproduction test, the expected output
string initially contains the _wrong_ (unexpected)
output, thus unambiguously reproducing the bug.
Nearby comments should explain what the output
should be, and have a TODO pointing to the related
issue.
The person who fixes the bug then has a clear bug
reproduction and a test to modify when the bug is
fixed.
The bug reporter can then see the bug was fixed,
and has permanent regression coverage to prevent
its reintroduction.
# gmail_client.UsersApi
All URIs are relative to *https://www.googleapis.com/gmail/v1/users*
Method | HTTP request | Description
------------- | ------------- | -------------
[**gmail_users_drafts_create**](UsersApi.md#gmail_users_drafts_create) | **POST** /{userId}/drafts |
[**gmail_users_drafts_delete**](UsersApi.md#gmail_users_drafts_delete) | **DELETE** /{userId}/drafts/{id} |
[**gmail_users_drafts_get**](UsersApi.md#gmail_users_drafts_get) | **GET** /{userId}/drafts/{id} |
[**gmail_users_drafts_list**](UsersApi.md#gmail_users_drafts_list) | **GET** /{userId}/drafts |
[**gmail_users_drafts_send**](UsersApi.md#gmail_users_drafts_send) | **POST** /{userId}/drafts/send |
[**gmail_users_drafts_update**](UsersApi.md#gmail_users_drafts_update) | **PUT** /{userId}/drafts/{id} |
[**gmail_users_get_profile**](UsersApi.md#gmail_users_get_profile) | **GET** /{userId}/profile |
[**gmail_users_history_list**](UsersApi.md#gmail_users_history_list) | **GET** /{userId}/history |
[**gmail_users_labels_create**](UsersApi.md#gmail_users_labels_create) | **POST** /{userId}/labels |
[**gmail_users_labels_delete**](UsersApi.md#gmail_users_labels_delete) | **DELETE** /{userId}/labels/{id} |
[**gmail_users_labels_get**](UsersApi.md#gmail_users_labels_get) | **GET** /{userId}/labels/{id} |
[**gmail_users_labels_list**](UsersApi.md#gmail_users_labels_list) | **GET** /{userId}/labels |
[**gmail_users_labels_patch**](UsersApi.md#gmail_users_labels_patch) | **PATCH** /{userId}/labels/{id} |
[**gmail_users_labels_update**](UsersApi.md#gmail_users_labels_update) | **PUT** /{userId}/labels/{id} |
[**gmail_users_messages_attachments_get**](UsersApi.md#gmail_users_messages_attachments_get) | **GET** /{userId}/messages/{messageId}/attachments/{id} |
[**gmail_users_messages_batch_delete**](UsersApi.md#gmail_users_messages_batch_delete) | **POST** /{userId}/messages/batchDelete |
[**gmail_users_messages_batch_modify**](UsersApi.md#gmail_users_messages_batch_modify) | **POST** /{userId}/messages/batchModify |
[**gmail_users_messages_delete**](UsersApi.md#gmail_users_messages_delete) | **DELETE** /{userId}/messages/{id} |
[**gmail_users_messages_get**](UsersApi.md#gmail_users_messages_get) | **GET** /{userId}/messages/{id} |
[**gmail_users_messages_import**](UsersApi.md#gmail_users_messages_import) | **POST** /{userId}/messages/import |
[**gmail_users_messages_insert**](UsersApi.md#gmail_users_messages_insert) | **POST** /{userId}/messages |
[**gmail_users_messages_list**](UsersApi.md#gmail_users_messages_list) | **GET** /{userId}/messages |
[**gmail_users_messages_modify**](UsersApi.md#gmail_users_messages_modify) | **POST** /{userId}/messages/{id}/modify |
[**gmail_users_messages_send**](UsersApi.md#gmail_users_messages_send) | **POST** /{userId}/messages/send |
[**gmail_users_messages_trash**](UsersApi.md#gmail_users_messages_trash) | **POST** /{userId}/messages/{id}/trash |
[**gmail_users_messages_untrash**](UsersApi.md#gmail_users_messages_untrash) | **POST** /{userId}/messages/{id}/untrash |
[**gmail_users_settings_filters_create**](UsersApi.md#gmail_users_settings_filters_create) | **POST** /{userId}/settings/filters |
[**gmail_users_settings_filters_delete**](UsersApi.md#gmail_users_settings_filters_delete) | **DELETE** /{userId}/settings/filters/{id} |
[**gmail_users_settings_filters_get**](UsersApi.md#gmail_users_settings_filters_get) | **GET** /{userId}/settings/filters/{id} |
[**gmail_users_settings_filters_list**](UsersApi.md#gmail_users_settings_filters_list) | **GET** /{userId}/settings/filters |
[**gmail_users_settings_forwarding_addresses_create**](UsersApi.md#gmail_users_settings_forwarding_addresses_create) | **POST** /{userId}/settings/forwardingAddresses |
[**gmail_users_settings_forwarding_addresses_delete**](UsersApi.md#gmail_users_settings_forwarding_addresses_delete) | **DELETE** /{userId}/settings/forwardingAddresses/{forwardingEmail} |
[**gmail_users_settings_forwarding_addresses_get**](UsersApi.md#gmail_users_settings_forwarding_addresses_get) | **GET** /{userId}/settings/forwardingAddresses/{forwardingEmail} |
[**gmail_users_settings_forwarding_addresses_list**](UsersApi.md#gmail_users_settings_forwarding_addresses_list) | **GET** /{userId}/settings/forwardingAddresses |
[**gmail_users_settings_get_auto_forwarding**](UsersApi.md#gmail_users_settings_get_auto_forwarding) | **GET** /{userId}/settings/autoForwarding |
[**gmail_users_settings_get_imap**](UsersApi.md#gmail_users_settings_get_imap) | **GET** /{userId}/settings/imap |
[**gmail_users_settings_get_pop**](UsersApi.md#gmail_users_settings_get_pop) | **GET** /{userId}/settings/pop |
[**gmail_users_settings_get_vacation**](UsersApi.md#gmail_users_settings_get_vacation) | **GET** /{userId}/settings/vacation |
[**gmail_users_settings_send_as_create**](UsersApi.md#gmail_users_settings_send_as_create) | **POST** /{userId}/settings/sendAs |
[**gmail_users_settings_send_as_delete**](UsersApi.md#gmail_users_settings_send_as_delete) | **DELETE** /{userId}/settings/sendAs/{sendAsEmail} |
[**gmail_users_settings_send_as_get**](UsersApi.md#gmail_users_settings_send_as_get) | **GET** /{userId}/settings/sendAs/{sendAsEmail} |
[**gmail_users_settings_send_as_list**](UsersApi.md#gmail_users_settings_send_as_list) | **GET** /{userId}/settings/sendAs |
[**gmail_users_settings_send_as_patch**](UsersApi.md#gmail_users_settings_send_as_patch) | **PATCH** /{userId}/settings/sendAs/{sendAsEmail} |
[**gmail_users_settings_send_as_smime_info_delete**](UsersApi.md#gmail_users_settings_send_as_smime_info_delete) | **DELETE** /{userId}/settings/sendAs/{sendAsEmail}/smimeInfo/{id} |
[**gmail_users_settings_send_as_smime_info_get**](UsersApi.md#gmail_users_settings_send_as_smime_info_get) | **GET** /{userId}/settings/sendAs/{sendAsEmail}/smimeInfo/{id} |
[**gmail_users_settings_send_as_smime_info_insert**](UsersApi.md#gmail_users_settings_send_as_smime_info_insert) | **POST** /{userId}/settings/sendAs/{sendAsEmail}/smimeInfo |
[**gmail_users_settings_send_as_smime_info_list**](UsersApi.md#gmail_users_settings_send_as_smime_info_list) | **GET** /{userId}/settings/sendAs/{sendAsEmail}/smimeInfo |
[**gmail_users_settings_send_as_smime_info_set_default**](UsersApi.md#gmail_users_settings_send_as_smime_info_set_default) | **POST** /{userId}/settings/sendAs/{sendAsEmail}/smimeInfo/{id}/setDefault |
[**gmail_users_settings_send_as_update**](UsersApi.md#gmail_users_settings_send_as_update) | **PUT** /{userId}/settings/sendAs/{sendAsEmail} |
[**gmail_users_settings_send_as_verify**](UsersApi.md#gmail_users_settings_send_as_verify) | **POST** /{userId}/settings/sendAs/{sendAsEmail}/verify |
[**gmail_users_settings_update_auto_forwarding**](UsersApi.md#gmail_users_settings_update_auto_forwarding) | **PUT** /{userId}/settings/autoForwarding |
[**gmail_users_settings_update_imap**](UsersApi.md#gmail_users_settings_update_imap) | **PUT** /{userId}/settings/imap |
[**gmail_users_settings_update_pop**](UsersApi.md#gmail_users_settings_update_pop) | **PUT** /{userId}/settings/pop |
[**gmail_users_settings_update_vacation**](UsersApi.md#gmail_users_settings_update_vacation) | **PUT** /{userId}/settings/vacation |
[**gmail_users_stop**](UsersApi.md#gmail_users_stop) | **POST** /{userId}/stop |
[**gmail_users_threads_delete**](UsersApi.md#gmail_users_threads_delete) | **DELETE** /{userId}/threads/{id} |
[**gmail_users_threads_get**](UsersApi.md#gmail_users_threads_get) | **GET** /{userId}/threads/{id} |
[**gmail_users_threads_list**](UsersApi.md#gmail_users_threads_list) | **GET** /{userId}/threads |
[**gmail_users_threads_modify**](UsersApi.md#gmail_users_threads_modify) | **POST** /{userId}/threads/{id}/modify |
[**gmail_users_threads_trash**](UsersApi.md#gmail_users_threads_trash) | **POST** /{userId}/threads/{id}/trash |
[**gmail_users_threads_untrash**](UsersApi.md#gmail_users_threads_untrash) | **POST** /{userId}/threads/{id}/untrash |
[**gmail_users_watch**](UsersApi.md#gmail_users_watch) | **POST** /{userId}/watch |
# **gmail_users_drafts_create**
> Draft gmail_users_drafts_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Creates a new draft with the DRAFT label.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Draft() # Draft | (optional)
try:
api_response = api_instance.gmail_users_drafts_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_drafts_create: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Draft**](Draft.md)| | [optional]
### Return type
[**Draft**](Draft.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: message/rfc822
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_drafts_delete**
> gmail_users_drafts_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Immediately and permanently deletes the specified draft. Does not simply trash it.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the draft to delete.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_drafts_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_drafts_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the draft to delete. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_drafts_get**
> Draft gmail_users_drafts_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, format=format)
Gets the specified draft.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the draft to retrieve.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
format = 'full' # str | The format to return the draft in. (optional) (default to full)
try:
api_response = api_instance.gmail_users_drafts_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, format=format)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_drafts_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the draft to retrieve. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**format** | **str**| The format to return the draft in. | [optional] [default to full]
### Return type
[**Draft**](Draft.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_drafts_list**
> ListDraftsResponse gmail_users_drafts_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, include_spam_trash=include_spam_trash, max_results=max_results, page_token=page_token, q=q)
Lists the drafts in the user's mailbox.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
include_spam_trash = False # bool | Include drafts from SPAM and TRASH in the results. (optional) (default to false)
max_results = 100 # int | Maximum number of drafts to return. (optional) (default to 100)
page_token = 'page_token_example' # str | Page token to retrieve a specific page of results in the list. (optional)
q = 'q_example' # str | Only return draft messages matching the specified query. Supports the same query format as the Gmail search box. For example, \"from:someuser@example.com rfc822msgid: is:unread\". (optional)
try:
api_response = api_instance.gmail_users_drafts_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, include_spam_trash=include_spam_trash, max_results=max_results, page_token=page_token, q=q)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_drafts_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**include_spam_trash** | **bool**| Include drafts from SPAM and TRASH in the results. | [optional] [default to false]
**max_results** | **int**| Maximum number of drafts to return. | [optional] [default to 100]
**page_token** | **str**| Page token to retrieve a specific page of results in the list. | [optional]
**q** | **str**| Only return draft messages matching the specified query. Supports the same query format as the Gmail search box. For example, \"from:someuser@example.com rfc822msgid: is:unread\". | [optional]
### Return type
[**ListDraftsResponse**](ListDraftsResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_drafts_send**
> Message gmail_users_drafts_send(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Sends the specified, existing draft to the recipients in the To, Cc, and Bcc headers.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Draft() # Draft | (optional)
try:
    api_response = api_instance.gmail_users_drafts_send(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_drafts_send: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Draft**](Draft.md)| | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: message/rfc822
- **Accept**: Not defined
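Because the request Content-Type is `message/rfc822`, the draft being sent ultimately wraps a raw RFC 822 message, which the Gmail API expects base64url-encoded. A minimal stdlib-only sketch of building such a payload (the helper name `build_raw_message` is illustrative, not part of this client):

```python
import base64
from email.mime.text import MIMEText

def build_raw_message(sender, to, subject, body_text):
    """Build an RFC 822 message and base64url-encode it, the form the
    Gmail API expects for a message's 'raw' field."""
    msg = MIMEText(body_text)
    msg["From"] = sender
    msg["To"] = to
    msg["Subject"] = subject
    return base64.urlsafe_b64encode(msg.as_bytes()).decode("ascii")

raw = build_raw_message("me@example.com", "you@example.com", "Hi", "Hello!")
```

The resulting string would be placed in the message payload of the `Draft` body.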
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_drafts_update**
> Draft gmail_users_drafts_update(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Replaces a draft's content.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the draft to update.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Draft() # Draft | (optional)
try:
    api_response = api_instance.gmail_users_drafts_update(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_drafts_update: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the draft to update. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Draft**](Draft.md)| | [optional]
### Return type
[**Draft**](Draft.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: message/rfc822
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_get_profile**
> Profile gmail_users_get_profile(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the current user's Gmail profile.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_response = api_instance.gmail_users_get_profile(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_get_profile: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Profile**](Profile.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_history_list**
> ListHistoryResponse gmail_users_history_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, history_types=history_types, label_id=label_id, max_results=max_results, page_token=page_token, start_history_id=start_history_id)
Lists the history of all changes to the given mailbox. History results are returned in chronological order (increasing historyId).
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
history_types = ['history_types_example'] # list[str] | History types to be returned by the function (optional)
label_id = 'label_id_example' # str | Only return messages with a label matching the ID. (optional)
max_results = 100 # int | The maximum number of history records to return. (optional) (default to 100)
page_token = 'page_token_example' # str | Page token to retrieve a specific page of results in the list. (optional)
start_history_id = 'start_history_id_example' # str | Required. Returns history records after the specified startHistoryId. The supplied startHistoryId should be obtained from the historyId of a message, thread, or previous list response. History IDs increase chronologically but are not contiguous with random gaps in between valid IDs. Supplying an invalid or out of date startHistoryId typically returns an HTTP 404 error code. A historyId is typically valid for at least a week, but in some rare circumstances may be valid for only a few hours. If you receive an HTTP 404 error response, your application should perform a full sync. If you receive no nextPageToken in the response, there are no updates to retrieve and you can store the returned historyId for a future request. (optional)
try:
    api_response = api_instance.gmail_users_history_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, history_types=history_types, label_id=label_id, max_results=max_results, page_token=page_token, start_history_id=start_history_id)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_history_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**history_types** | [**list[str]**](str.md)| History types to be returned by the function | [optional]
**label_id** | **str**| Only return messages with a label matching the ID. | [optional]
**max_results** | **int**| The maximum number of history records to return. | [optional] [default to 100]
**page_token** | **str**| Page token to retrieve a specific page of results in the list. | [optional]
**start_history_id** | **str**| Required. Returns history records after the specified startHistoryId. The supplied startHistoryId should be obtained from the historyId of a message, thread, or previous list response. History IDs increase chronologically but are not contiguous with random gaps in between valid IDs. Supplying an invalid or out of date startHistoryId typically returns an HTTP 404 error code. A historyId is typically valid for at least a week, but in some rare circumstances may be valid for only a few hours. If you receive an HTTP 404 error response, your application should perform a full sync. If you receive no nextPageToken in the response, there are no updates to retrieve and you can store the returned historyId for a future request. | [optional]
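The pagination and full-sync fallback described for **start_history_id** can be sketched as a loop. This is a hypothetical helper, not part of the generated client; it assumes the client's exception exposes the HTTP code via a `status` attribute and that the response follows the `ListHistoryResponse` model (`history`, `next_page_token`):

```python
def sync_history(api, user_id, start_history_id):
    """Page through gmail_users_history_list results. Returns the
    accumulated history records, or None when startHistoryId is too
    old (HTTP 404), in which case the caller must do a full sync."""
    records, page_token = [], None
    while True:
        try:
            resp = api.gmail_users_history_list(
                user_id,
                start_history_id=start_history_id,
                page_token=page_token)
        except Exception as e:
            if getattr(e, "status", None) == 404:
                return None  # startHistoryId expired: full sync needed
            raise
        records.extend(resp.history or [])
        page_token = resp.next_page_token
        if not page_token:
            return records
```

When `None` comes back, re-list messages from scratch and store the new `historyId` for the next incremental sync.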
### Return type
[**ListHistoryResponse**](ListHistoryResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_labels_create**
> Label gmail_users_labels_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Creates a new label.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Label() # Label | (optional)
try:
    api_response = api_instance.gmail_users_labels_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_labels_create: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Label**](Label.md)| | [optional]
### Return type
[**Label**](Label.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_labels_delete**
> gmail_users_labels_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Immediately and permanently deletes the specified label and removes it from any messages and threads that it is applied to.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the label to delete.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_instance.gmail_users_labels_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_labels_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the label to delete. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
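Since deletion is immediate and permanent, and Gmail rejects deletion of system labels such as INBOX or SPAM, a defensive wrapper can check the label's `type` first. This is an illustrative sketch, not part of the generated client; it assumes the `Label` model's `type` field is `"user"` for user-created labels:

```python
def delete_user_label(api, user_id, label_id):
    """Delete a label only when it is user-created; returns False
    instead of attempting to delete a system label."""
    label = api.gmail_users_labels_get(user_id, label_id)
    if getattr(label, "type", None) != "user":
        return False  # refuse to touch system labels
    api.gmail_users_labels_delete(user_id, label_id)
    return True
```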
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_labels_get**
> Label gmail_users_labels_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the specified label.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the label to retrieve.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_response = api_instance.gmail_users_labels_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_labels_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the label to retrieve. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Label**](Label.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_labels_list**
> ListLabelsResponse gmail_users_labels_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Lists all labels in the user's mailbox.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_response = api_instance.gmail_users_labels_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_labels_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ListLabelsResponse**](ListLabelsResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_labels_patch**
> Label gmail_users_labels_patch(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates the specified label. This method supports patch semantics.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the label to update.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Label() # Label | (optional)
try:
    api_response = api_instance.gmail_users_labels_patch(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_labels_patch: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the label to update. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Label**](Label.md)| | [optional]
### Return type
[**Label**](Label.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_labels_update**
> Label gmail_users_labels_update(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates the specified label.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the label to update.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Label() # Label | (optional)
try:
api_response = api_instance.gmail_users_labels_update(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_labels_update: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the label to update. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Label**](Label.md)| | [optional]
### Return type
[**Label**](Label.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_attachments_get**
> MessagePartBody gmail_users_messages_attachments_get(user_id, message_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the specified message attachment.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
message_id = 'message_id_example' # str | The ID of the message containing the attachment.
id = 'id_example' # str | The ID of the attachment.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_messages_attachments_get(user_id, message_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_attachments_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**message_id** | **str**| The ID of the message containing the attachment. |
**id** | **str**| The ID of the attachment. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**MessagePartBody**](MessagePartBody.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_batch_delete**
> gmail_users_messages_batch_delete(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Deletes many messages by message ID. Provides no guarantees that messages were not already deleted or even existed at all.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.BatchDeleteMessagesRequest() # BatchDeleteMessagesRequest | (optional)
try:
api_instance.gmail_users_messages_batch_delete(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_batch_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**BatchDeleteMessagesRequest**](BatchDeleteMessagesRequest.md)| | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_batch_modify**
> gmail_users_messages_batch_modify(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Modifies the labels on the specified messages.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.BatchModifyMessagesRequest() # BatchModifyMessagesRequest | (optional)
try:
api_instance.gmail_users_messages_batch_modify(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_batch_modify: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**BatchModifyMessagesRequest**](BatchModifyMessagesRequest.md)| | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_delete**
> gmail_users_messages_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Immediately and permanently deletes the specified message. This operation cannot be undone. Prefer messages.trash instead.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the message to delete.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_messages_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the message to delete. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_get**
> Message gmail_users_messages_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, format=format, metadata_headers=metadata_headers)
Gets the specified message.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the message to retrieve.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
format = 'full' # str | The format to return the message in. (optional) (default to full)
metadata_headers = ['metadata_headers_example'] # list[str] | When given and format is METADATA, only include headers specified. (optional)
try:
api_response = api_instance.gmail_users_messages_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, format=format, metadata_headers=metadata_headers)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the message to retrieve. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**format** | **str**| The format to return the message in. | [optional] [default to full]
**metadata_headers** | [**list[str]**](str.md)| When given and format is METADATA, only include headers specified. | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_import**
> Message gmail_users_messages_import(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, deleted=deleted, internal_date_source=internal_date_source, never_mark_spam=never_mark_spam, process_for_calendar=process_for_calendar, body=body)
Imports a message into only this user's mailbox, with standard email delivery scanning and classification similar to receiving via SMTP. Does not send a message.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
deleted = False # bool | Mark the email as permanently deleted (not TRASH) and only visible in Google Vault to a Vault administrator. Only used for G Suite accounts. (optional) (default to false)
internal_date_source = 'dateHeader' # str | Source for Gmail's internal date of the message. (optional) (default to dateHeader)
never_mark_spam = False # bool | Ignore the Gmail spam classifier decision and never mark this email as SPAM in the mailbox. (optional) (default to false)
process_for_calendar = False # bool | Process calendar invites in the email and add any extracted meetings to the Google Calendar for this user. (optional) (default to false)
body = gmail_client.Message() # Message | (optional)
try:
api_response = api_instance.gmail_users_messages_import(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, deleted=deleted, internal_date_source=internal_date_source, never_mark_spam=never_mark_spam, process_for_calendar=process_for_calendar, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_import: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**deleted** | **bool**| Mark the email as permanently deleted (not TRASH) and only visible in Google Vault to a Vault administrator. Only used for G Suite accounts. | [optional] [default to false]
**internal_date_source** | **str**| Source for Gmail's internal date of the message. | [optional] [default to dateHeader]
**never_mark_spam** | **bool**| Ignore the Gmail spam classifier decision and never mark this email as SPAM in the mailbox. | [optional] [default to false]
**process_for_calendar** | **bool**| Process calendar invites in the email and add any extracted meetings to the Google Calendar for this user. | [optional] [default to false]
**body** | [**Message**](Message.md)| | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: message/rfc822
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_insert**
> Message gmail_users_messages_insert(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, deleted=deleted, internal_date_source=internal_date_source, body=body)
Directly inserts a message into only this user's mailbox similar to IMAP APPEND, bypassing most scanning and classification. Does not send a message.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
deleted = False # bool | Mark the email as permanently deleted (not TRASH) and only visible in Google Vault to a Vault administrator. Only used for G Suite accounts. (optional) (default to false)
internal_date_source = 'receivedTime' # str | Source for Gmail's internal date of the message. (optional) (default to receivedTime)
body = gmail_client.Message() # Message | (optional)
try:
api_response = api_instance.gmail_users_messages_insert(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, deleted=deleted, internal_date_source=internal_date_source, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_insert: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**deleted** | **bool**| Mark the email as permanently deleted (not TRASH) and only visible in Google Vault to a Vault administrator. Only used for G Suite accounts. | [optional] [default to false]
**internal_date_source** | **str**| Source for Gmail's internal date of the message. | [optional] [default to receivedTime]
**body** | [**Message**](Message.md)| | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: message/rfc822
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
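For insert (and send), the `Message` body carries the complete RFC 2822 message, base64url-encoded, in the resource's `raw` field. A minimal stdlib sketch of producing that value (passing it as `gmail_client.Message(raw=raw)` is an assumption based on the model's field name, not something shown above):

```python
import base64
from email.message import EmailMessage

# Compose an RFC 2822 message with the standard library.
msg = EmailMessage()
msg["To"] = "recipient@example.com"
msg["From"] = "me@example.com"
msg["Subject"] = "Hello"
msg.set_content("Plain-text body")

# The Gmail API expects the raw field as URL-safe base64 of the full
# serialized message bytes.
raw = base64.urlsafe_b64encode(msg.as_bytes()).decode("ascii")
```

The resulting string would then be used as the `raw` attribute of the `Message` body passed to `gmail_users_messages_insert` or `gmail_users_messages_send`.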
# **gmail_users_messages_list**
> ListMessagesResponse gmail_users_messages_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, include_spam_trash=include_spam_trash, label_ids=label_ids, max_results=max_results, page_token=page_token, q=q)
Lists the messages in the user's mailbox.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
include_spam_trash = False # bool | Include messages from SPAM and TRASH in the results. (optional) (default to false)
label_ids = ['label_ids_example'] # list[str] | Only return messages with labels that match all of the specified label IDs. (optional)
max_results = 100 # int | Maximum number of messages to return. (optional) (default to 100)
page_token = 'page_token_example' # str | Page token to retrieve a specific page of results in the list. (optional)
q = 'q_example' # str | Only return messages matching the specified query. Supports the same query format as the Gmail search box. For example, \"from:someuser@example.com rfc822msgid: is:unread\". Parameter cannot be used when accessing the api using the gmail.metadata scope. (optional)
try:
api_response = api_instance.gmail_users_messages_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, include_spam_trash=include_spam_trash, label_ids=label_ids, max_results=max_results, page_token=page_token, q=q)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**include_spam_trash** | **bool**| Include messages from SPAM and TRASH in the results. | [optional] [default to false]
**label_ids** | [**list[str]**](str.md)| Only return messages with labels that match all of the specified label IDs. | [optional]
**max_results** | **int**| Maximum number of messages to return. | [optional] [default to 100]
**page_token** | **str**| Page token to retrieve a specific page of results in the list. | [optional]
**q** | **str**| Only return messages matching the specified query. Supports the same query format as the Gmail search box. For example, \"from:someuser@example.com rfc822msgid: is:unread\". Parameter cannot be used when accessing the api using the gmail.metadata scope. | [optional]
### Return type
[**ListMessagesResponse**](ListMessagesResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
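Listing is paged: callers feed each response's next-page token back in as `page_token` until no token is returned. A sketch of that loop against a stand-in for the real call (the `messages` and `next_page_token` attribute names are assumptions based on this generated client's snake_case convention):

```python
from types import SimpleNamespace

def fetch_all_ids(list_page, user_id="me"):
    """Collect message IDs across all pages.

    list_page is any callable shaped like gmail_users_messages_list:
    it accepts page_token and returns an object with .messages and
    .next_page_token attributes.
    """
    ids, token = [], None
    while True:
        resp = list_page(user_id, page_token=token)
        ids.extend(m["id"] for m in (resp.messages or []))
        token = resp.next_page_token
        if not token:
            return ids

# Stand-in for the real API call, returning two pages of results.
pages = {
    None: SimpleNamespace(messages=[{"id": "a"}, {"id": "b"}],
                          next_page_token="t1"),
    "t1": SimpleNamespace(messages=[{"id": "c"}], next_page_token=None),
}
fake_list = lambda user_id, page_token=None: pages[page_token]
all_ids = fetch_all_ids(fake_list)  # ['a', 'b', 'c']
```

With the real client, `fake_list` would be replaced by a wrapper around `api_instance.gmail_users_messages_list`.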
# **gmail_users_messages_modify**
> Message gmail_users_messages_modify(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Modifies the labels on the specified message.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the message to modify.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.ModifyMessageRequest() # ModifyMessageRequest | (optional)
try:
api_response = api_instance.gmail_users_messages_modify(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_modify: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the message to modify. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**ModifyMessageRequest**](ModifyMessageRequest.md)| | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_send**
> Message gmail_users_messages_send(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Sends the specified message to the recipients in the To, Cc, and Bcc headers.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Message() # Message | (optional)
try:
api_response = api_instance.gmail_users_messages_send(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_send: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Message**](Message.md)| | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: message/rfc822
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_trash**
> Message gmail_users_messages_trash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Moves the specified message to the trash.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the message to Trash.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_messages_trash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_trash: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the message to Trash. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_messages_untrash**
> Message gmail_users_messages_untrash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Removes the specified message from the trash.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the message to remove from Trash.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_messages_untrash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_messages_untrash: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the message to remove from Trash. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Message**](Message.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_filters_create**
> Filter gmail_users_settings_filters_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Creates a filter.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.Filter() # Filter | (optional)
try:
api_response = api_instance.gmail_users_settings_filters_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_filters_create: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**Filter**](Filter.md)| | [optional]
### Return type
[**Filter**](Filter.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
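On the wire, a `Filter` pairs `criteria` (which incoming messages to match) with `action` (what to do with matches). A sketch of that payload shape, assuming the REST resource's `criteria`/`action` field names; with the generated client the equivalent would be passed through `gmail_client.Filter()`:

```python
# Hypothetical filter: label mail from a notifications address and
# archive it (remove it from the inbox).
filter_payload = {
    "criteria": {
        "from": "notifications@example.com",  # sender to match
    },
    "action": {
        "addLabelIds": ["Label_123"],   # label ID to apply (hypothetical)
        "removeLabelIds": ["INBOX"],    # archive the message
    },
}
```

Label IDs like `Label_123` come from the labels endpoints; the string shown here is a placeholder.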
# **gmail_users_settings_filters_delete**
> gmail_users_settings_filters_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Deletes a filter.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the filter to be deleted.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_settings_filters_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_filters_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**id** | **str**| The ID of the filter to be deleted. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_filters_get**
> Filter gmail_users_settings_filters_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets a filter.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the filter to be fetched.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_filters_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_filters_get: %s\n" % e)
```
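As noted above, `quota_user` must not exceed 40 characters. A small client-side check can catch this before the request is sent; the `validate_quota_user` helper below is a hypothetical sketch, not part of the generated client:

```python
def validate_quota_user(quota_user):
    """Validate a quota_user string before passing it to an API call.

    The API accepts any arbitrary string of at most 40 characters.
    """
    if not isinstance(quota_user, str):
        raise TypeError("quota_user must be a string")
    if len(quota_user) > 40:
        raise ValueError("quota_user must not exceed 40 characters")
    return quota_user

validate_quota_user('analytics-backend-user-123')  # passes
```

A failed check raises before any network traffic occurs, which is cheaper than waiting for the server to reject the request.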
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**id** | **str**| The ID of the filter to be fetched. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Filter**](Filter.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_filters_list**
> ListFiltersResponse gmail_users_settings_filters_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Lists the message filters of a Gmail user.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_filters_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_filters_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ListFiltersResponse**](ListFiltersResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_forwarding_addresses_create**
> ForwardingAddress gmail_users_settings_forwarding_addresses_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Creates a forwarding address. If ownership verification is required, a message will be sent to the recipient and the resource's verification status will be set to pending; otherwise, the resource will be created with verification status set to accepted. This method is only available to service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.ForwardingAddress() # ForwardingAddress | (optional)
try:
api_response = api_instance.gmail_users_settings_forwarding_addresses_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_forwarding_addresses_create: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**ForwardingAddress**](ForwardingAddress.md)| | [optional]
### Return type
[**ForwardingAddress**](ForwardingAddress.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_forwarding_addresses_delete**
> gmail_users_settings_forwarding_addresses_delete(user_id, forwarding_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Deletes the specified forwarding address and revokes any verification that may have been required. This method is only available to service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
forwarding_email = 'forwarding_email_example' # str | The forwarding address to be deleted.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_settings_forwarding_addresses_delete(user_id, forwarding_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_forwarding_addresses_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**forwarding_email** | **str**| The forwarding address to be deleted. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_forwarding_addresses_get**
> ForwardingAddress gmail_users_settings_forwarding_addresses_get(user_id, forwarding_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the specified forwarding address.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
forwarding_email = 'forwarding_email_example' # str | The forwarding address to be retrieved.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_forwarding_addresses_get(user_id, forwarding_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_forwarding_addresses_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**forwarding_email** | **str**| The forwarding address to be retrieved. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ForwardingAddress**](ForwardingAddress.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_forwarding_addresses_list**
> ListForwardingAddressesResponse gmail_users_settings_forwarding_addresses_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Lists the forwarding addresses for the specified account.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_forwarding_addresses_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_forwarding_addresses_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ListForwardingAddressesResponse**](ListForwardingAddressesResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_get_auto_forwarding**
> AutoForwarding gmail_users_settings_get_auto_forwarding(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the auto-forwarding setting for the specified account.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_get_auto_forwarding(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_get_auto_forwarding: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**AutoForwarding**](AutoForwarding.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_get_imap**
> ImapSettings gmail_users_settings_get_imap(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets IMAP settings.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_get_imap(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_get_imap: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ImapSettings**](ImapSettings.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_get_pop**
> PopSettings gmail_users_settings_get_pop(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets POP settings.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value "me" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_get_pop(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_get_pop: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **user_id** | **str**| User's email address. The special value "me" can be used to indicate the authenticated user. | 
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**PopSettings**](PopSettings.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_get_vacation**
> VacationSettings gmail_users_settings_get_vacation(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets vacation responder settings.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_get_vacation(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_get_vacation: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**VacationSettings**](VacationSettings.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_create**
> SendAs gmail_users_settings_send_as_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Creates a custom \"from\" send-as alias. If an SMTP MSA is specified, Gmail will attempt to connect to the SMTP service to validate the configuration before creating the alias. If ownership verification is required for the alias, a message will be sent to the email address and the resource's verification status will be set to pending; otherwise, the resource will be created with verification status set to accepted. If a signature is provided, Gmail will sanitize the HTML before saving it with the alias. This method is only available to service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.SendAs() # SendAs | (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_create(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_create: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**SendAs**](SendAs.md)| | [optional]
### Return type
[**SendAs**](SendAs.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_delete**
> gmail_users_settings_send_as_delete(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Deletes the specified send-as alias. Revokes any verification that may have been required for using it. This method is only available to service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The send-as alias to be deleted.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_settings_send_as_delete(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**send_as_email** | **str**| The send-as alias to be deleted. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_get**
> SendAs gmail_users_settings_send_as_get(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the specified send-as alias. Fails with an HTTP 404 error if the specified address is not a member of the collection.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The send-as alias to be retrieved.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_get(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**send_as_email** | **str**| The send-as alias to be retrieved. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**SendAs**](SendAs.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_list**
> ListSendAsResponse gmail_users_settings_send_as_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Lists the send-as aliases for the specified account. The result includes the primary send-as address associated with the account as well as any custom \"from\" aliases.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ListSendAsResponse**](ListSendAsResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_patch**
> SendAs gmail_users_settings_send_as_patch(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates a send-as alias. If a signature is provided, Gmail will sanitize the HTML before saving it with the alias. Addresses other than the primary address for the account can only be updated by service account clients that have been delegated domain-wide authority. This method supports patch semantics.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The send-as alias to be updated.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.SendAs() # SendAs | (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_patch(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_patch: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**send_as_email** | **str**| The send-as alias to be updated. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**SendAs**](SendAs.md)| | [optional]
### Return type
[**SendAs**](SendAs.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_smime_info_delete**
> gmail_users_settings_send_as_smime_info_delete(user_id, send_as_email, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Deletes the specified S/MIME config for the specified send-as alias.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The email address that appears in the \"From:\" header for mail sent using this alias.
id = 'id_example' # str | The immutable ID for the SmimeInfo.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_settings_send_as_smime_info_delete(user_id, send_as_email, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_smime_info_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**send_as_email** | **str**| The email address that appears in the \"From:\" header for mail sent using this alias. |
**id** | **str**| The immutable ID for the SmimeInfo. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_smime_info_get**
> SmimeInfo gmail_users_settings_send_as_smime_info_get(user_id, send_as_email, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Gets the specified S/MIME config for the specified send-as alias.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The email address that appears in the \"From:\" header for mail sent using this alias.
id = 'id_example' # str | The immutable ID for the SmimeInfo.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_smime_info_get(user_id, send_as_email, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_smime_info_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**send_as_email** | **str**| The email address that appears in the \"From:\" header for mail sent using this alias. |
**id** | **str**| The immutable ID for the SmimeInfo. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**SmimeInfo**](SmimeInfo.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_smime_info_insert**
> SmimeInfo gmail_users_settings_send_as_smime_info_insert(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Insert (upload) the given S/MIME config for the specified send-as alias. Note that pkcs12 format is required for the key.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The email address that appears in the \"From:\" header for mail sent using this alias.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.SmimeInfo() # SmimeInfo | (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_smime_info_insert(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_smime_info_insert: %s\n" % e)
```
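As noted above, the key must be supplied in pkcs12 format. The raw `.p12` bytes have to be base64-encoded before they go into the `SmimeInfo` body; a minimal helper sketch (the `pkcs12` field name comes from the underlying Gmail API resource, and URL-safe base64 is assumed here — check SmimeInfo.md for the generated attribute name and encoding):

```python
import base64

def pkcs12_to_b64(path):
    # Read the binary .p12 bundle and return it as URL-safe base64 text,
    # ready to be set on the SmimeInfo body's pkcs12 field.
    with open(path, "rb") as f:
        return base64.urlsafe_b64encode(f.read()).decode("ascii")
```

The returned string can then be assigned to the `SmimeInfo` body before calling the insert method.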
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**send_as_email** | **str**| The email address that appears in the \"From:\" header for mail sent using this alias. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**SmimeInfo**](SmimeInfo.md)| | [optional]
### Return type
[**SmimeInfo**](SmimeInfo.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_smime_info_list**
> ListSmimeInfoResponse gmail_users_settings_send_as_smime_info_list(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Lists S/MIME configs for the specified send-as alias.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The email address that appears in the \"From:\" header for mail sent using this alias.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_smime_info_list(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_smime_info_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**send_as_email** | **str**| The email address that appears in the \"From:\" header for mail sent using this alias. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**ListSmimeInfoResponse**](ListSmimeInfoResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_smime_info_set_default**
> gmail_users_settings_send_as_smime_info_set_default(user_id, send_as_email, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Sets the default S/MIME config for the specified send-as alias.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The email address that appears in the \"From:\" header for mail sent using this alias.
id = 'id_example' # str | The immutable ID for the SmimeInfo.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_settings_send_as_smime_info_set_default(user_id, send_as_email, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_smime_info_set_default: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**send_as_email** | **str**| The email address that appears in the \"From:\" header for mail sent using this alias. |
**id** | **str**| The immutable ID for the SmimeInfo. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_update**
> SendAs gmail_users_settings_send_as_update(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates a send-as alias. If a signature is provided, Gmail will sanitize the HTML before saving it with the alias. Addresses other than the primary address for the account can only be updated by service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The send-as alias to be updated.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.SendAs() # SendAs | (optional)
try:
api_response = api_instance.gmail_users_settings_send_as_update(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_update: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**send_as_email** | **str**| The send-as alias to be updated. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**SendAs**](SendAs.md)| | [optional]
### Return type
[**SendAs**](SendAs.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_send_as_verify**
> gmail_users_settings_send_as_verify(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Sends a verification email to the specified send-as alias address. The verification status must be pending. This method is only available to service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
send_as_email = 'send_as_email_example' # str | The send-as alias to be verified.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
api_instance.gmail_users_settings_send_as_verify(user_id, send_as_email, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_send_as_verify: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**send_as_email** | **str**| The send-as alias to be verified. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_update_auto_forwarding**
> AutoForwarding gmail_users_settings_update_auto_forwarding(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates the auto-forwarding setting for the specified account. A verified forwarding address must be specified when auto-forwarding is enabled. This method is only available to service account clients that have been delegated domain-wide authority.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.AutoForwarding() # AutoForwarding | (optional)
try:
api_response = api_instance.gmail_users_settings_update_auto_forwarding(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_update_auto_forwarding: %s\n" % e)
```
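When enabling auto-forwarding, the body needs an already-verified forwarding address and a disposition. A plain-dict sketch of the AutoForwarding resource shape (field names and the disposition values `leaveInInbox`, `archive`, `trash`, `markRead` follow the underlying Gmail API; the generated `gmail_client.AutoForwarding` model may expose them as snake_case attributes — see AutoForwarding.md):

```python
# AutoForwarding request body mirroring the API resource shape.
auto_forwarding = {
    "enabled": True,
    # Must already be one of the account's verified forwarding addresses.
    "emailAddress": "forward-to@example.com",
    # What Gmail does with forwarded copies:
    # leaveInInbox, archive, trash, or markRead.
    "disposition": "markRead",
}
```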
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**AutoForwarding**](AutoForwarding.md)| | [optional]
### Return type
[**AutoForwarding**](AutoForwarding.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_update_imap**
> ImapSettings gmail_users_settings_update_imap(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates IMAP settings.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.ImapSettings() # ImapSettings | (optional)
try:
api_response = api_instance.gmail_users_settings_update_imap(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_update_imap: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**ImapSettings**](ImapSettings.md)| | [optional]
### Return type
[**ImapSettings**](ImapSettings.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_update_pop**
> PopSettings gmail_users_settings_update_pop(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates POP settings.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.PopSettings() # PopSettings | (optional)
try:
api_response = api_instance.gmail_users_settings_update_pop(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
pprint(api_response)
except ApiException as e:
print("Exception when calling UsersApi->gmail_users_settings_update_pop: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**PopSettings**](PopSettings.md)| | [optional]
### Return type
[**PopSettings**](PopSettings.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_settings_update_vacation**
> VacationSettings gmail_users_settings_update_vacation(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Updates vacation responder settings.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | User's email address. The special value \"me\" can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.VacationSettings() # VacationSettings | (optional)
try:
    api_response = api_instance.gmail_users_settings_update_vacation(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_settings_update_vacation: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| User's email address. The special value \"me\" can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**VacationSettings**](VacationSettings.md)| | [optional]
### Return type
[**VacationSettings**](VacationSettings.md)
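In the underlying API, the vacation responder's `startTime`/`endTime` fields are epoch timestamps in milliseconds. A minimal helper for building such values from Python datetimes is sketched below; the exact attribute names on the generated `VacationSettings` model (e.g. `start_time` vs. `startTime`) are an assumption here — check `VacationSettings.md` before using them.

```python
from datetime import datetime, timezone

def epoch_millis(dt):
    """Convert an aware datetime to the epoch-millisecond integer
    expected by the vacation responder's start/end time fields."""
    return int(dt.timestamp() * 1000)

# Illustrative responder window (hypothetical dates):
start = epoch_millis(datetime(2024, 7, 1, tzinfo=timezone.utc))
end = epoch_millis(datetime(2024, 7, 15, tzinfo=timezone.utc))
```

These integers would then be assigned to the corresponding fields of the `VacationSettings` body before calling the endpoint.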
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_stop**
> gmail_users_stop(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Stop receiving push notifications for the given user mailbox.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_instance.gmail_users_stop(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_stop: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_threads_delete**
> gmail_users_threads_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Immediately and permanently deletes the specified thread. This operation cannot be undone. Prefer threads.trash instead.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | ID of the Thread to delete.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_instance.gmail_users_threads_delete(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_threads_delete: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| ID of the Thread to delete. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
void (empty response body)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_threads_get**
> Thread gmail_users_threads_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, format=format, metadata_headers=metadata_headers)
Gets the specified thread.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the thread to retrieve.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
format = 'full' # str | The format to return the messages in. (optional) (default to full)
metadata_headers = ['metadata_headers_example'] # list[str] | When given and format is METADATA, only include headers specified. (optional)
try:
    api_response = api_instance.gmail_users_threads_get(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, format=format, metadata_headers=metadata_headers)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_threads_get: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the thread to retrieve. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**format** | **str**| The format to return the messages in. | [optional] [default to full]
**metadata_headers** | [**list[str]**](str.md)| When given and format is METADATA, only include headers specified. | [optional]
### Return type
[**Thread**](Thread.md)
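When requesting `format='metadata'` with `metadata_headers`, each message in the returned thread carries its headers as a list of name/value pairs (under the message payload in the raw API response). A small case-insensitive lookup helper is often convenient; note this sketch assumes plain-dict access, while the generated model classes may instead expose these as object attributes.

```python
def header_value(headers, name):
    """Return the value of the first header whose name matches
    case-insensitively, or None if it is absent. `headers` is a
    list of {'name': ..., 'value': ...} pairs."""
    for h in headers or []:
        if h.get('name', '').lower() == name.lower():
            return h.get('value')
    return None

headers = [{'name': 'Subject', 'value': 'Quarterly report'},
           {'name': 'From', 'value': 'a@example.com'}]
print(header_value(headers, 'subject'))  # → Quarterly report
```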
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_threads_list**
> ListThreadsResponse gmail_users_threads_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, include_spam_trash=include_spam_trash, label_ids=label_ids, max_results=max_results, page_token=page_token, q=q)
Lists the threads in the user's mailbox.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
include_spam_trash = False # bool | Include threads from SPAM and TRASH in the results. (optional) (default to false)
label_ids = ['label_ids_example'] # list[str] | Only return threads with labels that match all of the specified label IDs. (optional)
max_results = 100 # int | Maximum number of threads to return. (optional) (default to 100)
page_token = 'page_token_example' # str | Page token to retrieve a specific page of results in the list. (optional)
q = 'q_example' # str | Only return threads matching the specified query. Supports the same query format as the Gmail search box. For example, "from:someuser@example.com rfc822msgid:<somemsgid@example.com> is:unread". Parameter cannot be used when accessing the api using the gmail.metadata scope. (optional)
try:
    api_response = api_instance.gmail_users_threads_list(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, include_spam_trash=include_spam_trash, label_ids=label_ids, max_results=max_results, page_token=page_token, q=q)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_threads_list: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**include_spam_trash** | **bool**| Include threads from SPAM and TRASH in the results. | [optional] [default to false]
**label_ids** | [**list[str]**](str.md)| Only return threads with labels that match all of the specified label IDs. | [optional]
**max_results** | **int**| Maximum number of threads to return. | [optional] [default to 100]
**page_token** | **str**| Page token to retrieve a specific page of results in the list. | [optional]
**q** | **str**| Only return threads matching the specified query. Supports the same query format as the Gmail search box. For example, \"from:someuser@example.com rfc822msgid: is:unread\". Parameter cannot be used when accessing the api using the gmail.metadata scope. | [optional]
### Return type
[**ListThreadsResponse**](ListThreadsResponse.md)
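Because results are paginated, fetching more than one page means feeding each response's next-page token back into `page_token` until it comes back empty. The loop below factors that logic out behind a callable so it can be shown self-contained; it assumes the response object exposes `threads` and `next_page_token` attributes, following the generator's usual snake_case convention for `ListThreadsResponse`.

```python
def list_all_threads(fetch_page):
    """Drain every page of a threads.list-style endpoint.
    `fetch_page` takes a page token (or None for the first page)
    and returns an object with `threads` and `next_page_token`."""
    threads, token = [], None
    while True:
        page = fetch_page(token)
        threads.extend(page.threads or [])
        token = page.next_page_token
        if not token:
            return threads
```

With the client above, it might be wired up as `list_all_threads(lambda tok: api_instance.gmail_users_threads_list('me', page_token=tok))`.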
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_threads_modify**
> Thread gmail_users_threads_modify(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Modifies the labels applied to the thread. This applies to all messages in the thread.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the thread to modify.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.ModifyThreadRequest() # ModifyThreadRequest | (optional)
try:
    api_response = api_instance.gmail_users_threads_modify(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_threads_modify: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the thread to modify. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**ModifyThreadRequest**](ModifyThreadRequest.md)| | [optional]
### Return type
[**Thread**](Thread.md)
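A modify request takes separate lists of label IDs to add and to remove. When you know the thread's current labels and the set you want it to end up with, the two lists are just set differences, as in this sketch (the `ModifyThreadRequest` constructor keywords shown in the trailing comment are an assumption — verify them against `ModifyThreadRequest.md`):

```python
def label_changes(current_ids, desired_ids):
    """Compute (add, remove) label-ID lists that transform the
    thread's current label set into the desired one."""
    current, desired = set(current_ids), set(desired_ids)
    return sorted(desired - current), sorted(current - desired)

add_ids, remove_ids = label_changes(['INBOX', 'UNREAD'], ['INBOX', 'STARRED'])
# add_ids == ['STARRED'], remove_ids == ['UNREAD']
# body = gmail_client.ModifyThreadRequest(add_label_ids=add_ids,
#                                         remove_label_ids=remove_ids)
```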
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_threads_trash**
> Thread gmail_users_threads_trash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Moves the specified thread to the trash.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the thread to Trash.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_response = api_instance.gmail_users_threads_trash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_threads_trash: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the thread to Trash. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Thread**](Thread.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_threads_untrash**
> Thread gmail_users_threads_untrash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
Removes the specified thread from the trash.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
id = 'id_example' # str | The ID of the thread to remove from Trash.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
try:
    api_response = api_instance.gmail_users_threads_untrash(user_id, id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_threads_untrash: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**id** | **str**| The ID of the thread to remove from Trash. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
### Return type
[**Thread**](Thread.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
# **gmail_users_watch**
> WatchResponse gmail_users_watch(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
Set up or update a push notification watch on the given user mailbox.
### Example
```python
from __future__ import print_function
import time
import gmail_client
from gmail_client.rest import ApiException
from pprint import pprint
# Configure OAuth2 access token for authorization: Oauth2
gmail_client.configuration.access_token = 'YOUR_ACCESS_TOKEN'
# create an instance of the API class
api_instance = gmail_client.UsersApi()
user_id = 'user_id_example' # str | The user's email address. The special value me can be used to indicate the authenticated user.
alt = 'json' # str | Data format for the response. (optional) (default to json)
fields = 'fields_example' # str | Selector specifying which fields to include in a partial response. (optional)
key = 'key_example' # str | API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. (optional)
oauth_token = 'oauth_token_example' # str | OAuth 2.0 token for the current user. (optional)
pretty_print = True # bool | Returns response with indentations and line breaks. (optional) (default to true)
quota_user = 'quota_user_example' # str | Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. (optional)
user_ip = 'user_ip_example' # str | IP address of the site where the request originates. Use this if you want to enforce per-user limits. (optional)
body = gmail_client.WatchRequest() # WatchRequest | (optional)
try:
    api_response = api_instance.gmail_users_watch(user_id, alt=alt, fields=fields, key=key, oauth_token=oauth_token, pretty_print=pretty_print, quota_user=quota_user, user_ip=user_ip, body=body)
    pprint(api_response)
except ApiException as e:
    print("Exception when calling UsersApi->gmail_users_watch: %s\n" % e)
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**user_id** | **str**| The user's email address. The special value me can be used to indicate the authenticated user. |
**alt** | **str**| Data format for the response. | [optional] [default to json]
**fields** | **str**| Selector specifying which fields to include in a partial response. | [optional]
**key** | **str**| API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token. | [optional]
**oauth_token** | **str**| OAuth 2.0 token for the current user. | [optional]
**pretty_print** | **bool**| Returns response with indentations and line breaks. | [optional] [default to true]
**quota_user** | **str**| Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters. Overrides userIp if both are provided. | [optional]
**user_ip** | **str**| IP address of the site where the request originates. Use this if you want to enforce per-user limits. | [optional]
**body** | [**WatchRequest**](WatchRequest.md)| | [optional]
### Return type
[**WatchResponse**](WatchResponse.md)
### Authorization
[Oauth2](../README.md#Oauth2)
### HTTP request headers
- **Content-Type**: Not defined
- **Accept**: Not defined
[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
| 60.736329 | 792 | 0.737564 | eng_Latn | 0.923185 |
d3a29c55f5291d8ea7e3dfd9167ddc0b45d2dea1 | 1,176 | md | Markdown | docs/_posts/2016-01-04-directives.md | thepieterdc/marp | 6d490100307a11ab03ee333c8a6d4dfdc66dee7b | [
"MIT"
] | 25 | 2016-12-04T21:44:51.000Z | 2021-03-24T03:59:03.000Z | docs/_posts/2016-01-04-directives.md | thepieterdc/marp | 6d490100307a11ab03ee333c8a6d4dfdc66dee7b | [
"MIT"
] | 2 | 2021-01-28T21:29:11.000Z | 2022-03-25T19:13:38.000Z | docs/_posts/2016-01-04-directives.md | thepieterdc/marp | 6d490100307a11ab03ee333c8a6d4dfdc66dee7b | [
"MIT"
] | 6 | 2017-11-12T14:47:16.000Z | 2020-03-31T19:33:14.000Z | ---
category: top
---
<div class="col-xs-12" markdown="1">
# Directives
Marp's Markdown has extended directives that affect slides. Insert an HTML comment such as `<!-- {directive_name}: {value} -->`
</div>
<div class="col-xs-12 col-sm-6" markdown="1">
## Pagination
You want pagination? Insert `<!-- page_number: true -->` at the top.
If you want to exclude the first page number, move the directive to after the first ruler.
```markdown
# First page
The page number `1` is not shown.
---
<!-- page_number: true -->
# Second page
The page number `2` is shown!
```
</div>
<div class="col-xs-12 col-sm-6" markdown="1">
## Resize slide
You can resize slides with the Global Directive `$size`.
Insert `<!-- $size: 16:9 -->` if you want to display slides on a 16:9 screen. That's all!
```html
<!-- $size: 16:9 -->
```
The `$size` directive supports `4:3`, `16:9`, `A0`-`A8`, `B0`-`B8`, and the `-portrait` suffix.
Marp also supports `$width` and `$height` directives to set a custom size.
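For example, a custom slide size could be set like this (the values below are illustrative, not defaults):

```html
<!-- $width: 12in -->
<!-- $height: 6.75in -->
```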
</div>
<div class="col-xs-12" markdown="1">
---
You want an example? Have a look at [example.md](https://raw.githubusercontent.com/yhatt/marp/master/example.md){:target="_blank"}.
</div> | 21 | 131 | 0.658163 | eng_Latn | 0.974878 |
d3a3922f1dfdb6027c3fc7aa5e568ecb964d2276 | 5,706 | md | Markdown | wdk-ddi-src/content/wdfdevice/nf-wdfdevice-wdfdevicequeryproperty.md | DeviceObject/windows-driver-docs-ddi | be6b8ddad4931e676fb6be20935b82aaaea3a8fb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | wdk-ddi-src/content/wdfdevice/nf-wdfdevice-wdfdevicequeryproperty.md | DeviceObject/windows-driver-docs-ddi | be6b8ddad4931e676fb6be20935b82aaaea3a8fb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | wdk-ddi-src/content/wdfdevice/nf-wdfdevice-wdfdevicequeryproperty.md | DeviceObject/windows-driver-docs-ddi | be6b8ddad4931e676fb6be20935b82aaaea3a8fb | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
UID: NF:wdfdevice.WdfDeviceQueryProperty
title: WdfDeviceQueryProperty function (wdfdevice.h)
description: The WdfDeviceQueryProperty method retrieves a specified device property.
old-location: wdf\wdfdevicequeryproperty.htm
tech.root: wdf
ms.assetid: be05a5b5-e895-402b-bf0a-cbdb75fdef1d
ms.date: 02/26/2018
keywords: ["WdfDeviceQueryProperty function"]
ms.keywords: DFDeviceObjectGeneralRef_e3f58989-ddd0-4402-94bf-418481869972.xml, WdfDeviceQueryProperty, WdfDeviceQueryProperty method, kmdf.wdfdevicequeryproperty, wdf.wdfdevicequeryproperty, wdfdevice/WdfDeviceQueryProperty
f1_keywords:
- "wdfdevice/WdfDeviceQueryProperty"
- "WdfDeviceQueryProperty"
req.header: wdfdevice.h
req.include-header: Wdf.h
req.target-type: Universal
req.target-min-winverclnt:
req.target-min-winversvr:
req.kmdf-ver: 1.0
req.umdf-ver: 2.0
req.ddi-compliance: DriverCreate, KmdfIrql, KmdfIrql2
req.unicode-ansi:
req.idl:
req.max-support:
req.namespace:
req.assembly:
req.type-library:
req.lib: Wdf01000.sys (KMDF); WUDFx02000.dll (UMDF)
req.dll:
req.irql: PASSIVE_LEVEL
topic_type:
- APIRef
- kbSyntax
api_type:
- LibDef
api_location:
- Wdf01000.sys
- Wdf01000.sys.dll
- WUDFx02000.dll
- WUDFx02000.dll.dll
api_name:
- WdfDeviceQueryProperty
targetos: Windows
req.typenames:
---
# WdfDeviceQueryProperty function
## -description
<p class="CCE_Message">[Applies to KMDF and UMDF]</p>
The <b>WdfDeviceQueryProperty</b> method retrieves a specified device property.
## -parameters
### -param Device [in]
A handle to a framework device object.
### -param DeviceProperty [in]
A <a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/wdm/ne-wdm-device_registry_property">DEVICE_REGISTRY_PROPERTY</a>-typed enumerator that identifies the device property to be retrieved.
### -param BufferLength [in]
The size, in bytes, of the buffer that is pointed to by <i>PropertyBuffer</i>.
### -param PropertyBuffer [out]
A caller-supplied pointer to a caller-allocated buffer that receives the requested information. The pointer can be <b>NULL</b> if the <i>BufferLength</i> parameter is zero.
### -param ResultLength [out]
A caller-supplied location that, on return, contains the size, in bytes, of the information that the method stored in <i>PropertyBuffer</i>. If the function's return value is STATUS_BUFFER_TOO_SMALL, this location receives the required buffer size.
## -returns
If the operation succeeds, <b>WdfDeviceQueryProperty</b> returns STATUS_SUCCESS. Additional return values include:
<table>
<tr>
<th>Return code</th>
<th>Description</th>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>STATUS_BUFFER_TOO_SMALL</b></dt>
</dl>
</td>
<td width="60%">
The supplied buffer is too small to receive the information.
</td>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>STATUS_INVALID_PARAMETER_2</b></dt>
</dl>
</td>
<td width="60%">
The specified <i>DeviceProperty</i> value is invalid.
</td>
</tr>
<tr>
<td width="40%">
<dl>
<dt><b>STATUS_INVALID_DEVICE_REQUEST</b></dt>
</dl>
</td>
<td width="60%">
The device's drivers have not yet reported the device's properties.
</td>
</tr>
</table>
The method might return other <a href="https://docs.microsoft.com/windows-hardware/drivers/kernel/ntstatus-values">NTSTATUS values</a>.
A bug check occurs if the driver supplies an invalid object handle.
## -remarks
Before receiving device property data, drivers typically call the <b>WdfDeviceQueryProperty</b> method just to obtain the required buffer size. For some properties, the data size can change between when the required size is returned and when the driver calls <b>WdfDeviceQueryProperty</b> again. Therefore, drivers should call <b>WdfDeviceQueryProperty</b> inside a loop that executes until the return status is not STATUS_BUFFER_TOO_SMALL.
It is best to use <b>WdfDeviceQueryProperty</b> only if the required buffer size is known and unchanging, because in that case the driver has to call <b>WdfDeviceQueryProperty</b> only once. If the required buffer size is unknown or varies, the driver should call <a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/wdfdevice/nf-wdfdevice-wdfdeviceallocandqueryproperty">WdfDeviceAllocAndQueryProperty</a>.
Alternatively, you can use <a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/wdfdevice/nf-wdfdevice-wdfdevicequerypropertyex">WdfDeviceQueryPropertyEx</a> to access device properties that are exposed through the Unified Property Model.
#### Examples
The following code example obtains a device's <b>DevicePropertyBusTypeGuid</b> property. The example calls <b>WdfDeviceQueryProperty</b> instead of <a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/wdfdevice/nf-wdfdevice-wdfdeviceallocandqueryproperty">WdfDeviceAllocAndQueryProperty</a> because the length of a GUID is known.
```cpp
GUID busTypeGuid;
ULONG resultLength = 0;
NTSTATUS status;
status = WdfDeviceQueryProperty(
device,
DevicePropertyBusTypeGuid,
sizeof(GUID),
(PVOID)&busTypeGuid,
&resultLength
);
```
## -see-also
<a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/wdfdevice/nf-wdfdevice-wdfdeviceallocandqueryproperty">WdfDeviceAllocAndQueryProperty</a>
<a href="https://docs.microsoft.com/windows-hardware/drivers/ddi/wdffdo/nf-wdffdo-wdffdoinitqueryproperty">WdfFdoInitQueryProperty</a>
| 30.513369 | 442 | 0.7238 | eng_Latn | 0.667109 |
d3a40ddd8b0f15564243fa6602c6b6fedcbcb182 | 515 | md | Markdown | packages/plugins/typescript/graphql-apollo/CHANGELOG.md | kirkeaton/graphql-code-generator | d25d1861ebaea08c9fcd0e3c066ba24e5ac0bb36 | [
"MIT"
] | 1 | 2021-10-30T20:19:40.000Z | 2021-10-30T20:19:40.000Z | packages/plugins/typescript/graphql-apollo/CHANGELOG.md | kirkeaton/graphql-code-generator | d25d1861ebaea08c9fcd0e3c066ba24e5ac0bb36 | [
"MIT"
] | 1 | 2021-11-05T14:02:37.000Z | 2021-11-05T14:03:29.000Z | packages/plugins/typescript/graphql-apollo/CHANGELOG.md | kirkeaton/graphql-code-generator | d25d1861ebaea08c9fcd0e3c066ba24e5ac0bb36 | [
"MIT"
] | null | null | null | # @graphql-codegen/typescript-graphql-apollo
## 1.1.0
### Minor Changes
- 97ddb487a: feat: GraphQL v16 compatibility
### Patch Changes
- Updated dependencies [97ddb487a]
- @graphql-codegen/visitor-plugin-common@2.5.0
- @graphql-codegen/plugin-helpers@2.3.0
## 1.0.0
### Major Changes
- 979bdc01e: add a new graphql-apollo plugin
### Patch Changes
- Updated dependencies [d6c2d4c09]
- Updated dependencies [feeae1c66]
- Updated dependencies [5086791ac]
- @graphql-codegen/visitor-plugin-common@2.2.0
| 19.074074 | 48 | 0.730097 | eng_Latn | 0.335588 |
d3a41dd4cc6814e86c661d484db0449355d16e16 | 1,098 | md | Markdown | biztalk/adapters-and-accelerators/accelerator-rosettanet/testing-the-solution.md | changeworld/biztalk-docs.zh-CN | 0ee8ca09b377aa26a13e0f200c75fca467cd519c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | biztalk/adapters-and-accelerators/accelerator-rosettanet/testing-the-solution.md | changeworld/biztalk-docs.zh-CN | 0ee8ca09b377aa26a13e0f200c75fca467cd519c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | biztalk/adapters-and-accelerators/accelerator-rosettanet/testing-the-solution.md | changeworld/biztalk-docs.zh-CN | 0ee8ca09b377aa26a13e0f200c75fca467cd519c | [
"CC-BY-4.0",
"MIT"
] | null | null | null | ---
title: Testing the Solution | Microsoft Docs
ms.custom: ''
ms.date: 06/08/2017
ms.prod: biztalk-server
ms.reviewer: ''
ms.suite: ''
ms.tgt_pltfrm: ''
ms.topic: article
helpviewer_keywords:
- testing solutions
- private process tutorial, testing solutions
ms.assetid: 90faf959-bac6-4695-8cb7-ecabe52baf1a
caps.latest.revision: 5
author: MandiOhlinger
ms.author: mandia
manager: anneta
ms.openlocfilehash: 7af2cab529344f499ff006a6cd99401ae63c4668
ms.sourcegitcommit: 3fc338e52d5dbca2c3ea1685a2faafc7582fe23a
ms.translationtype: MT
ms.contentlocale: zh-CN
ms.lasthandoff: 12/01/2017
ms.locfileid: "26005534"
---
# <a name="testing-the-solution"></a>Testing the Solution
 In this section, you test your completed solution. The LOBWebApplication tool created in the Fabrikam solution is used to submit a 3A2 PIP request to the Contoso LOB application. The Contoso private orchestration then creates a Contoso 3A2 price and availability request and submits it to the ERP system through the SQL adapter used by BizTalk Server. When a response is received from the ERP system, the orchestration calls the Business Rule Engine to enforce the urgency-related business policy that you created.
## <a name="in-this-section"></a>In This Section
- [Creating a Price and Availability Request with the Fabrikam Sample](../../adapters-and-accelerators/accelerator-rosettanet/creating-a-price-and-availability-request-with-the-fabrikam-sample.md) | 36.6 | 233 | 0.775046 | yue_Hant | 0.203765 |
d3a471cd1f52d147776eaa20f11eb34d32d49a71 | 1,012 | md | Markdown | paths/github-cli/15-push-and-open-pr.md | dumpsterfire/training-kit | c19266cd70029ceaa62b5f7f14de0d5960ab3335 | [
"CC-BY-4.0"
] | null | null | null | paths/github-cli/15-push-and-open-pr.md | dumpsterfire/training-kit | c19266cd70029ceaa62b5f7f14de0d5960ab3335 | [
"CC-BY-4.0"
] | null | null | null | paths/github-cli/15-push-and-open-pr.md | dumpsterfire/training-kit | c19266cd70029ceaa62b5f7f14de0d5960ab3335 | [
"CC-BY-4.0"
] | 1 | 2021-02-23T11:53:26.000Z | 2021-02-23T11:53:26.000Z | ---
layout: simple-class
header:
overlay_image: cover.jpeg
overlay_filter: rgba(46, 129, 200, 0.6)
title: Review Pushing and Opening Pull Requests
permalink: /github-cli/git-push-open-pull-request
next-page: /github-cli/collaborate-github-pull-requests
facilitator: false
sidebar:
nav: "github-cli"
main-content: |
Now that you have made some local commits, it is time to send your changes to the remote copy of your repository on GitHub.com and create a Pull Request.

1. Type `git push -u origin BRANCH-NAME` to push your commits to the remote, and set a tracking branch.
1. Enter your GitHub username and password, if prompted to do so.
1. Create a Pull Request on GitHub.
1. Fill out the body of the Pull Request with information about the changes you're introducing.
show-me-how:
refresh:
includes: refresh/github-CLI/pull-request.md
tell-me-why:
includes:
- tell-me-why/pull-request.md
---
| 32.645161 | 155 | 0.748024 | eng_Latn | 0.980556 |
d3a49f636daba1da6476e40b25a5f6e9a22967cd | 828 | markdown | Markdown | _posts/2021-01-24-first_posting.markdown | jwoonge/jwoonge.github.io | e2f22bde1864ac700a4e2ec84c3a47ccd0e7822c | [
"MIT"
] | null | null | null | _posts/2021-01-24-first_posting.markdown | jwoonge/jwoonge.github.io | e2f22bde1864ac700a4e2ec84c3a47ccd0e7822c | [
"MIT"
] | null | null | null | _posts/2021-01-24-first_posting.markdown | jwoonge/jwoonge.github.io | e2f22bde1864ac700a4e2ec84c3a47ccd0e7822c | [
"MIT"
] | null | null | null | ---
layout: post
title: "Wrapping Up Week 1 of BASECAMP"
date: 2021-01-24 23:16:30 +0900
image: 4.jpg
tags: [Rookie, NHN, Retrospective, BASECAMP]
categories: Retrospective
---
Three weeks into the job: the peaceful introductory training is over, and I have moved on to BASECAMP, the springboard toward the summit.
The first week of BASECAMP, which I spent working through the pre-assignment on my own, went by far too quickly.
There really haven't been many times in my life when I wished Friday wouldn't come, haha.
Since the assignment follows the book we were given, nothing was especially difficult or blocking. But I have to master it thoroughly and make it my own, and with quite a lot of material to cover, whenever I tried to pick up speed I would snap out of it and find myself just typing along blankly...
In the end I decided to focus on mastering the earlier parts rather than on the extra assignment implementing the later parts, and invested two days in review. Web development being new to me, I couldn't get a feel for it, so rather than the fine-grained syntax I tried to understand why the code works by studying the overall project structure of the example web application, but haha...
When it was graded, it rained red marks all over my answer sheet... that's the story.
What I took away:
- The world is wide and there is a lot to learn.
- I need to find a room to rent soon. (haha)
Starting tomorrow, the team project kicks off with my fellow trainees in the 루키사자 (Rookie Lion) TF.
Four people, full-time, on a nine-week project...
Unlike at university, with no other courses' lectures and exams constantly getting in the way, nine weeks should be enough to try something really fun, and I can't wait.
| 31.846154 | 160 | 0.675121 | kor_Hang | 1.00001 |
d3a6a7596bca15e24855d883d3315650b0c0b210 | 208 | md | Markdown | src/Pillar-ExporterMarkdown.package/PRGitlabMarkdownWriter.class/README.md | dupriezt/pillar | eade46feace88b8bb9dbd00c3cccbd36ab62c005 | [
"MIT"
] | null | null | null | src/Pillar-ExporterMarkdown.package/PRGitlabMarkdownWriter.class/README.md | dupriezt/pillar | eade46feace88b8bb9dbd00c3cccbd36ab62c005 | [
"MIT"
] | null | null | null | src/Pillar-ExporterMarkdown.package/PRGitlabMarkdownWriter.class/README.md | dupriezt/pillar | eade46feace88b8bb9dbd00c3cccbd36ab62c005 | [
"MIT"
] | null | null | null | I am a writer for GitLab Flavored Markdown
https://docs.gitlab.com/ee/user/markdown.html
Note that GitLab Flavored Markdown is not an extension of CommonMark; however, the syntax is similar for the most part. | 52 | 118 | 0.807692 | eng_Latn | 0.988806 |
d3a6ffdebfc651c6be9551ebd5955c690f08c19b | 90 | md | Markdown | C# aulas/aula57/ColeçãoListPT1.md | Thomaz-Peres/Estudos-Notes | 1d00c84d914d107006be5ccbdbf1b3189aaf2184 | [
"MIT"
] | 4 | 2020-07-23T15:27:51.000Z | 2022-02-02T01:55:29.000Z | C# aulas/aula57/ColeçãoListPT1.md | Thomaz-Peres/Estudos-Notes | 1d00c84d914d107006be5ccbdbf1b3189aaf2184 | [
"MIT"
] | 1 | 2020-08-26T16:39:25.000Z | 2021-04-16T02:54:18.000Z | C# aulas/aula57/ColeçãoListPT1.md | Thomaz-Peres/Estudos-Notes | 1d00c84d914d107006be5ccbdbf1b3189aaf2184 | [
"MIT"
] | null | null | null | ## Used to replace a traditional array.
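A minimal sketch of the idea (assuming .NET's `System.Collections.Generic.List<T>`; the names below are just for illustration):

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        // Unlike a fixed-size array, a List<T> grows and shrinks dynamically.
        var numbers = new List<int> { 1, 2, 3 };
        numbers.Add(4);           // no manual resizing needed
        numbers.Remove(2);        // removes the value 2
        Console.WriteLine(numbers.Count); // 3
    }
}
```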
I already know a few things HUAHUA, let's go, fam. | 30 | 46 | 0.8 | por_Latn | 0.997194 |
d3a7b0d30d01c95fa091bc480b040cbc585539df | 115 | md | Markdown | README.md | friskydingo/controlviewopener | b2e8ed255aed4d036982235bfb665d5aa7e4577e | [
"MIT"
] | null | null | null | README.md | friskydingo/controlviewopener | b2e8ed255aed4d036982235bfb665d5aa7e4577e | [
"MIT"
] | null | null | null | README.md | friskydingo/controlviewopener | b2e8ed255aed4d036982235bfb665d5aa7e4577e | [
"MIT"
] | null | null | null | # controlviewopener
A plugin for Sublime Text 3 that finds and opens a matching view when opening a controller file.
| 38.333333 | 94 | 0.817391 | eng_Latn | 0.99329 |
d3a82bf6176bde452e2f50dc2cde14a6e0eb3f7d | 1,883 | md | Markdown | README.md | benjamin-guibert/etherbeam | 0d12c179eace17acd4909965ceba4ed9c4b70d9e | [
"MIT"
] | 1 | 2021-01-15T18:36:30.000Z | 2021-01-15T18:36:30.000Z | README.md | benjamin-guibert/etherbeam | 0d12c179eace17acd4909965ceba4ed9c4b70d9e | [
"MIT"
] | 20 | 2021-01-15T19:26:04.000Z | 2021-03-01T13:01:01.000Z | README.md | benjamin-guibert/etherbeam | 0d12c179eace17acd4909965ceba4ed9c4b70d9e | [
"MIT"
] | null | null | null | # Etherbeam
> Ethereum cryptocurrency tracker



[![license-shield]](LICENSE)
## Table of Contents
- [Stack](#stack)
- [Scripts](#scripts)
- [Release History](#release-history)
- [Versioning](#versioning)
- [Authors](#authors)
- [License](#license)
## Stack
- [Server](server/README.md): Server side.
- [Ethereum Server](eth-server/README.md): Server dedicated to communication with the Ethereum blockchain.
- [Client](client/README.md): Client side.
## Scripts
- `yarn dev`: Run the stack in development mode.
- `yarn dev:server`: Run the server in development mode.
- `yarn dev:eth-server`: Run the Ethereum server in development mode.
- `yarn dev:client`: Run the client in development mode.
- `yarn format`: Format the code, applying any needed modifications.
- `yarn lint`: Check the code quality.
- `yarn test`: Test the code.
## Release History
Check the [`CHANGELOG.md`](CHANGELOG.md) file for the release history.
## Versioning
We use [SemVer](http://semver.org/) for versioning. For the versions available, see the [tags on this repository][tags-link].
## Authors
- **[Benjamin Guibert](https://github.com/benjamin-guibert)**: Creator & main contributor
See also the list of [contributors][contributors-link] who participated in this project.
## License
[![license-shield]](LICENSE)
This project is licensed under the MIT License - see the [`LICENSE`](LICENSE) file for details
[contributors-link]: https://github.com/benjamin-guibert/etherbeam/contributors
[tags-link]: https://github.com/benjamin-guibert/etherbeam/tags
[license-shield]: https://img.shields.io/github/license/benjamin-guibert/etherbeam.svg
| 33.035088 | 125 | 0.747212 | eng_Latn | 0.523915 |
d3a87e4d3058b449b07ad3853dde6d51ecfade3c | 26 | md | Markdown | README.md | meikunyuan6/doc | efffa03447138591d3da05352dcac197c1f9bed9 | [
"MIT"
] | null | null | null | README.md | meikunyuan6/doc | efffa03447138591d3da05352dcac197c1f9bed9 | [
"MIT"
] | null | null | null | README.md | meikunyuan6/doc | efffa03447138591d3da05352dcac197c1f9bed9 | [
"MIT"
] | null | null | null | # Daily Learning Notes
* Mind maps
* VNote notes
| 6.5 | 9 | 0.653846 | jpn_Jpan | 0.372561 |
d3a8c53db9c753ac6da50b04134910fa9b5200cc | 101 | md | Markdown | 会员版.md | zuihou/lamp-cloud | e53134ac49803d9e7e60a5c39d09574cc27caf82 | [
"Apache-2.0"
] | 1,244 | 2020-12-08T02:49:22.000Z | 2022-03-31T09:32:06.000Z | 会员版.md | zuihou/lamp-cloud | e53134ac49803d9e7e60a5c39d09574cc27caf82 | [
"Apache-2.0"
] | 55 | 2020-12-08T07:07:47.000Z | 2022-03-29T08:27:02.000Z | 会员版.md | zuihou/lamp-cloud | e53134ac49803d9e7e60a5c39d09574cc27caf82 | [
"Apache-2.0"
] | 319 | 2020-12-08T15:34:52.000Z | 2022-03-31T16:17:13.000Z | ## For details, see:
https://www.kancloud.cn/zuihou/zuihou-admin-cloud/2074547
## To purchase the Enterprise or Personal edition, contact on WeChat: tyh306479353
| 16.833333 | 57 | 0.752475 | kor_Hang | 0.097686 |
d3aa3042d86c30fc3eed351c4e4a55282f8ce386 | 194 | md | Markdown | _books/2007/Practical_Ruby_Projects.md | jobajuba/books | f50e64d0b8ccf47d367533c8e07dcfac9ed9ac11 | [
"CC0-1.0"
] | 31 | 2015-05-02T09:34:37.000Z | 2022-01-28T21:11:30.000Z | _books/2007/Practical_Ruby_Projects.md | jobajuba/books | f50e64d0b8ccf47d367533c8e07dcfac9ed9ac11 | [
"CC0-1.0"
] | 22 | 2015-01-03T06:16:38.000Z | 2020-12-22T16:05:07.000Z | _books/2007/Practical_Ruby_Projects.md | jobajuba/books | f50e64d0b8ccf47d367533c8e07dcfac9ed9ac11 | [
"CC0-1.0"
] | 13 | 2015-08-14T09:08:09.000Z | 2020-03-12T01:09:49.000Z | ---
title: Practical Ruby Projects
authors:
- Topher Cyll
year: 2007
categories:
- examples
prices: 'ebook: $35, paperback: $45'
editor: Apress
home_url: http://www.apress.com/9781590599112
---
| 16.166667 | 45 | 0.731959 | eng_Latn | 0.454436 |
d3aa61977817001b030a5c966499fe66bac0189e | 1,423 | md | Markdown | 2020/07/17/2020-07-17 20:45.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | 3 | 2020-07-14T14:54:15.000Z | 2020-08-21T06:48:24.000Z | 2020/07/17/2020-07-17 20:45.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | 2020/07/17/2020-07-17 20:45.md | zhzhzhy/WeiBoHot_history | 32ce4800e63f26384abb17d43e308452c537c902 | [
"MIT"
] | null | null | null | Data for July 17, 2020, 20:00
Status: 200
1. Du Hua cried
Weibo heat: 3023534
2. Hospital found fully responsible for the death of a Dalian woman after breast augmentation surgery
Weibo heat: 1749444
3. Debut single of the "national meme" internet celebrity released
Weibo heat: 1583966
4. Man who murdered his daughter's 9-year-old classmate executed
Weibo heat: 1335747
5. Wang Sicong
Weibo heat: 1304213
6. How hard female celebrities push themselves
Weibo heat: 1264040
7. Zhang Yuqi: "I wasn't marching with the same arm and leg"
Weibo heat: 1136823
8. Magnolia Awards shortlist
Weibo heat: 679525
9. Hua Chunying responds to US claim that China is trying to surpass the US
Weibo heat: 629353
10. Price of melt-blown fabric for masks plunges 95%
Weibo heat: 618157
11. Yi Nengjing teaches Jin Sha how to chat with boys
Weibo heat: 617610
12. Police recover 2 more children abducted in the "Aunt Mei" case
Weibo heat: 612436
13. Zhejiang's plum-rain season ends
Weibo heat: 557215
14. Sisters Who Make Waves (乘风破浪的姐姐)
Weibo heat: 451492
15. Wang Wenbin, the Foreign Ministry's new spokesperson
Weibo heat: 448327
16. Lan Yingying says she is a stiff, rigid person
Weibo heat: 438315
17. List of films for the cinema reopening re-releases
Weibo heat: 377373
18. Grandfather and two grandsons fall into a river; both grandsons drown
Weibo heat: 375120
19. Golden Eagle Awards candidate list
Weibo heat: 373957
20. Great Wall Broadband
Weibo heat: 372101
21. Ding Yuxi's glasses look
Weibo heat: 368574
22. Beijing enters its main flood season on July 21
Weibo heat: 359001
23. 炙热的我们 (We Are Blazing)
Weibo heat: 342829
24. Xinjiang outbreak
Weibo heat: 275760
25. Enshi rainstorm
Weibo heat: 236964
26. Li Jianqun passes away
Weibo heat: 233134
27. Yue Yunpeng and Sha Yi's free-meal version of "Priceless Sister"
Weibo heat: 231651
28. Haro's Lee Sin
Weibo heat: 229170
29. Buffett makes 40 billion US dollars from Apple in 4 months
Weibo heat: 224714
30. Top 10 uncivil behaviors in ride-hailing
Weibo heat: 216257
31. Wang Yibo dances to "Priceless Sister"
Weibo heat: 214572
32. Meng Jia ranks last in audience favorability
Weibo heat: 210083
33. How stunning god-tier fancams can be
Weibo heat: 205469
34. Northern China to enter its main flood season next week
Weibo heat: 196255
35. White House press secretary says science should not stand in the way of reopening schools
Weibo heat: 185539
36. Ah Kenn breaks down in tears
Weibo heat: 184122
37. Deputy director of the Jimo District Natural Resources Bureau in Qingdao under investigation
Weibo heat: 180974
38. New SNK hero in Honor of Kings
Weibo heat: 180193
39. Hukou Waterfall on the Yellow River enters its best viewing season
Weibo heat: 179708
40. The closest photographs of the Sun ever taken by humanity
Weibo heat: 165488
41. Casino tycoon Stanley Ho's coffin moved
Weibo heat: 164420
42. Xinjiang reports 5 new confirmed COVID-19 cases
Weibo heat: 164119
43. Street Dance of China trailer
Weibo heat: 163177
44. Primary school student treats parents to a meal with first paycheck
Weibo heat: 162999
45. Online rumor of a confirmed case in a Beijing residential compound is false
Weibo heat: 162861
46. White House press secretary insists schools must reopen
Weibo heat: 162850
47. State Taxation Administration halts back-tax collection from e-commerce sellers
Weibo heat: 162849
48. Footage of Taiwan military helicopter crash
Weibo heat: 151310
49. Primary school student hauls watermelons for rescue teams at a flood-fighting site
Weibo heat: 145437
50. Shincheonji chairman summoned by South Korean prosecutors
Weibo heat: 134253
| 6.97549 | 19 | 0.784259 | yue_Hant | 0.285593 |
d3ab1a42bad66f4b7ad1e41441d2fd9638abfcd5 | 1,029 | md | Markdown | README.md | guzba/globby | 9a94a52b567b5bae1a80e84d8cfd52ebcc3da8f6 | [
"MIT"
] | null | null | null | README.md | guzba/globby | 9a94a52b567b5bae1a80e84d8cfd52ebcc3da8f6 | [
"MIT"
] | null | null | null | README.md | guzba/globby | 9a94a52b567b5bae1a80e84d8cfd52ebcc3da8f6 | [
"MIT"
] | null | null | null | # Globby - Glob pattern matching for Nim.
This library is being actively developed and we'd be happy for you to use it.
`nimble install globby`

## Documentation
API reference: https://nimdocs.com/treeform/globby
## Supported patterns:
Done | Format | Example |
-- | ----------------- | --------------- |
✅ | Star | `foo*` |
✅ | Single Character | `foo??` |
✅ | Character Set | `foo[abs]` |
✅ | Character Range | `foo[a-z]` |
✅ | Star Path | `foo/*/bar` |
✅ | Double Star Path | `foo/**/bar` |
✅ | Root Path | `/foo/bar` |
✅ | Relative Path | `../foo/bar` |
## Example:
```nim
import globby, sequtils
var tree = GlobTree[int]()
tree.add("foo/bar/baz", 0)
tree.add("foo/bar/baz/1", 1)
tree.add("foo/bar/baz/2", 2)
tree.add("foo/bar/baz/z", 3)
tree.add("foo/bar/baz/z", 4)
assert toSeq(tree.findAll("foo/bar/baz/z"))[0] == 3
```
| 25.097561 | 89 | 0.552964 | eng_Latn | 0.51134 |
d3ab60b28193239e3f0940cb38ac974367a40db9 | 4,166 | md | Markdown | 201709/3.md | liondao/Blog | e0edaa819499731b971160de629002d2ea06a3f1 | [
"MIT"
] | 4 | 2019-11-12T07:30:41.000Z | 2020-11-12T05:14:23.000Z | 201709/3.md | stephen-mi/blog | e0edaa819499731b971160de629002d2ea06a3f1 | [
"MIT"
] | null | null | null | 201709/3.md | stephen-mi/blog | e0edaa819499731b971160de629002d2ea06a3f1 | [
"MIT"
] | 2 | 2018-07-14T11:51:02.000Z | 2019-02-26T08:18:46.000Z | # Cross-Page Communication: The Various Approaches
> About the author: nekron, Ant Financial Data Experience Technology team
If you compare cross-page communication to inter-process communication on a computer, there are really only a handful of approaches, and the technical schemes feasible on the web mostly follow one of the two principles below:
* Obtain a handle and communicate with it directly
* Share memory, combined with polling or event notification to complete the business logic
Since the second principle is better at decoupling business logic, it has a wider variety of concrete implementations. The concrete schemes are briefly introduced below, as a primer:
## 1. Obtaining a handle
### Approach
A child page opened by the parent via `window.open(url, name)` exposes a handle, through which postMessage can then satisfy the communication needs.
```js
// parent.html
const childPage = window.open('child.html', 'child')
childPage.onload = () => {
childPage.postMessage('hello', location.origin)
}
// child.html
window.onmessage = evt => {
// evt.data
}
```
### tips
1. When the second `name` argument of `window.open` is specified, calling `window.open('****', 'child')` again refreshes the already-opened child page with the same name
2. Due to the browser's security policy, calling `window.open` after an asynchronous request is blocked; however, you can achieve a similar effect by setting the child page's url through the handle
```js
// First, open a blank page
const tab = window.open('about:blank')
// After the request completes, set the blank page's url
fetch(/* ajax */).then(() => {
tab.location.href = '****'
})
```
### Pros and cons
The drawback is that a page can only communicate with pages it opened itself, so the applicability is fairly narrow; the advantage is that the scheme still works in cross-origin scenarios.
## 2. localStorage
### Approach
Write to a shared storage area; the write triggers a storage event.
```js
// A.html
localStorage.setItem('message', 'hello')
// B.html
window.onstorage = evt => {
// evt.key, evt.oldValue, evt.newValue
}
```
### tips
1. The **storage listener** on the page that performed the write is not triggered
2. The storage event only fires when the value actually changes, i.e. repeatedly setting the same value does not trigger the listener
3. localStorage values cannot be set in Safari private browsing mode
### Pros and cons
The API is simple and intuitive, and compatibility is good; apart from cross-origin scenarios, where it must be combined with other schemes, it has no real drawbacks.
## 3. BroadcastChannel
### Approach
Basically the same as the `localStorage` scheme, with extra initialization required:
```js
// A.html
const channel = new BroadcastChannel('tabs')
channel.onmessage = evt => {
// evt.data
}
// B.html
const channel = new BroadcastChannel('tabs')
channel.postMessage('hello')
```
### Pros and cons
Not much different from the `localStorage` scheme: both are same-origin with a simple API. `BroadcastChannel` has weaker compatibility (Chrome > 58), but a shorter lifecycle than `localStorage` (nothing is persisted), so it is comparatively cleaner.
## 4. SharedWorker
### Approach
`SharedWorker` itself was not designed for communication needs; its original intent is more like a central controller that hosts shared logic. But since it can also be used for communication, it is included here:
```js
// A.html
var sharedworker = new SharedWorker('worker.js')
sharedworker.port.start()
sharedworker.port.onmessage = evt => {
// evt.data
}
// B.html
var sharedworker = new SharedWorker('worker.js')
sharedworker.port.start()
sharedworker.port.postMessage('hello')
// worker.js
const ports = []
onconnect = e => {
const port = e.ports[0]
ports.push(port)
port.onmessage = evt => {
ports.filter(v => v !== port) // to stay close to the other schemes' behavior, exclude the sender itself
.forEach(p => p.postMessage(evt.data))
}
}
```
### Pros and cons
It has no advantage over the other schemes; moreover, the API is complex and debugging is inconvenient.
## 5. Cookie
### Approach
An ancient scheme, a bit like a degraded-compatibility version of `localStorage`, which I only discovered while putting this article together. The idea is to write values into `document.cookie`; since cookie changes come with no event notification, the only option is polling with dirty checking to implement the business logic.
The scheme is rather ugly and bound to be phased out, so here is the link to the original idea instead of a demo.
[communication between browser windows (and tabs too) using cookies](https://stackoverflow.com/questions/4079280/javascript-communication-between-browser-tabs-windows/4079423)
### Pros and cons
It has no advantage whatsoever over the other schemes; it only works same-origin, and polluting the cookie additionally bloats every AJAX request header.
## 6. Server
All the previous schemes are implemented purely on the front end and are inevitably limited by the browser: for example, they cannot do cross-browser messaging, and most of them cannot do cross-origin communication (extra postMessage logic is needed to achieve it). With the help of a server there are many enhanced schemes, described together below.
### Bare-bones version
No backend development needed: the front end saves the data periodically and re-fetches it whenever the tab is activated; checking a marker such as a hash can make the check cheaper.
```js
window.onvisibilitychange = () => {
if (document.visibilityState === 'visible') {
// AJAX
}
}
```
### Server-sent Events / Websocket
For small projects this kind of scheme works well: the backend maintains the connections itself, along with the subsequent push behavior.
#### SSE
```js
// Front end
const es = new EventSource('/notification')
es.onmessage = evt => {
// evt.data
}
es.addEventListener('close', () => {
es.close()
}, false)
// Backend, using Express as an example
const clients = []
app.get('/notification', (req, res) => {
res.setHeader('Content-Type', 'text/event-stream')
clients.push(res)
req.on('aborted', () => {
// clean up clients
})
})
app.get('/update', (req, res) => {
// broadcast the new data to clients
clients.forEach(client => {
client.write('data:hello\n\n')
setTimeout(() => {
client.write('event:close\ndata:close\n\n')
}, 500)
})
res.status(200).end()
})
```
#### Websocket
There are plenty of examples for `socket.io` and `sockjs`, so they are omitted here.
### Message queue
For large projects, a message-queue cluster is needed to maintain the long-lived connections over time and broadcast when necessary.
Many cloud vendors offer this kind of service, or you can self-host an open-source solution.
For example, an MQTT-based scheme (Alibaba Cloud offers one): the web client is still essentially websocket, and the cluster must support both the ws and mqtt protocols. Example:
```js
// Front end
// The client uses the open-source Paho library
// the port differs from the one used by the mqtt protocol channel
const client = new Paho.MQTT.Client(host, port, 'clientId')
client.onMessageArrived = message => {
// message.payloadString
}
client.connect({
onSuccess: () => {
client.subscribe('notification')
}
})
// Alternatively, use Flash (although it is nearly obsolete) to make the MQTT connection and subscribe to the channel, with Flash emitting messages through callbacks
// Backend
// Call the channel broadcast API provided by the service vendor
``` | 19.287037 | 175 | 0.716035 | yue_Hant | 0.321497 |
d3abb7f068ffbd5548fac171d4425c505da03abc | 2,541 | md | Markdown | EquipmentList.md | TechExeter/HackBoxExe | eaead24950095205b2ef91fe191c7fb37562b564 | [
"MIT"
] | null | null | null | EquipmentList.md | TechExeter/HackBoxExe | eaead24950095205b2ef91fe191c7fb37562b564 | [
"MIT"
] | null | null | null | EquipmentList.md | TechExeter/HackBoxExe | eaead24950095205b2ef91fe191c7fb37562b564 | [
"MIT"
] | null | null | null | # Equipment Listing
This is a rough starting guide at the moment.
## General equipment
Equipment that most hackathons need!
Links and prices are for indication only.
Item | Quantity | Link | Price
---- | ------- | ---- | -----
Temperature-controlled soldering irons | 2 minimum | http://www.maplin.co.uk/p/48w-mains-solder-station-n67ef | £25
Blu-Tack | lots
Prototyping breadboards | 1 per Arduino/Raspberry Pi | https://shop.pimoroni.com/products/solderless-breadboard-400-point | £4.50
Breadboard cable jumpers, connectors |
You'll also need some general tools, such as:
* Cutting tools (wire cutters, tin snips, Stanley knife, etc.)
* Gripping tools (pliers, long-nosed pliers)
* Workshop tools (clamps, helping hands, tape measures, metal rule, magnifying glasses)
* Soldering tools (solder suckers, solder)
* 4 gangs and extension cords
* Cordless drill
First aid kit :D
## Prototyping
* Lego Technic + Lego Mindstorms
* Makey Makey
* Card/Greyboard
* Hot glue gun
* Garden wire
* Fishing line
* String
* Thermoplastic beads
* Sugru
## Computing / IoT Devices
Most of these devices will need access to either a computer/laptop for programming (Arduino) or a Screen + Keyboard + Mouse (Raspberry Pi).
Item | Quantity | Link | Price
---- | ------- | ---- | -----
Raspberry Pi 3 | 20? | http://uk.rs-online.com/web/p/processor-microcontroller-development-kits/8968660/ | £40
Raspberry Pi zero W | 20? | https://shop.pimoroni.com/products/raspberry-pi-zero-w | £15
Misc equipment:
* Micro SD cards for all devices
Other ideas:
* Wearable computing kit
## Sensors / Interface Devices
Interacting with *humans* - lights, screens, audio, tactile switches, etc
Interacting and sensing *the environment* - infrared, cameras, temperature probes etc
Item | Quantity | Link | Price
---- | ------- | ---- | -----
34 Piece Sensor Kit | 2+ | https://www.modmypi.com/arduino/kits/keyes-37-piece-sensor-experiment-kit | £34
Raspberry Pi Camera Board v2 (Also comes in Infrared) | ? | https://www.modmypi.com/raspberry-pi/camera/camera-boards/raspberry-pi-camera-board-v2-8mp1080p | £24.50 + £3.50 cable
Bare Conductive Touch Board | | https://www.bareconductive.com/shop/touch-board-starter-kit/ |
Speakers with 3.5mm jack | | |
## Motors / Controllers
* Stepper motors
* High-current driver (e.g. Pololu DRV8825)
* M3 nuts and bolts and allen key/nut driver
* Power supply unit
* Linear actuators
* Flexinol muscle wire?
## Other kit
Other kit, such as all-in-one devices or specialist equipment, goes here.
* Amazon Alexa | 30.614458 | 179 | 0.724124 | eng_Latn | 0.759522 |
d3ac5e33dbf9c7d50cdc08b1553e37c85c376c3f | 497 | md | Markdown | .github/ISSUE_TEMPLATE/bug_report.md | ms-henglu/bicep | 61e4b9f4625287552ae606cd1367e48132cafbfd | ["MIT"] | 2,288 | 2020-08-29T11:39:18.000Z | 2022-03-31T15:20:05.000Z | .github/ISSUE_TEMPLATE/bug_report.md | ms-henglu/bicep | 61e4b9f4625287552ae606cd1367e48132cafbfd | ["MIT"] | 3,488 | 2020-08-28T23:19:31.000Z | 2022-03-31T23:52:17.000Z | .github/ISSUE_TEMPLATE/bug_report.md | ms-henglu/bicep | 61e4b9f4625287552ae606cd1367e48132cafbfd | ["MIT"] | 649 | 2020-08-29T00:57:13.000Z | 2022-03-30T13:59:10.000Z |
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Bicep version**
run `bicep --version` via the Bicep CLI, `az bicep version` via the AZ CLI or via VS code by navigating to the extensions tab and searching for Bicep
**Describe the bug**
A clear and concise description of what the bug is vs what you expected to happen
**To Reproduce**
Steps to reproduce the behavior:
**Additional context**
Add any other context about the problem here.
| 23.666667 | 149 | 0.734406 | eng_Latn | 0.995617 |
d3ac8afb59571891f885a77e11fe18bcd5952f02 | 821 | md | Markdown | examples/fieldArrays/src/FieldArrays.md | musicglue/redux-form | 51877b12393d49dd3015b2cee05c0aaf75110ffd | ["MIT"] | null | null | null | examples/fieldArrays/src/FieldArrays.md | musicglue/redux-form | 51877b12393d49dd3015b2cee05c0aaf75110ffd | ["MIT"] | null | null | null | examples/fieldArrays/src/FieldArrays.md | musicglue/redux-form | 51877b12393d49dd3015b2cee05c0aaf75110ffd | ["MIT"] | null | null | null |
# Field Arrays Example
This example demonstrates how to have arrays of fields, both an array of one field or of a group
of fields. In this form, each member of the club has a first name, last name, and a list of
hobbies. The following array manipulation actions are available, as raw action creators, as bound
actions to your form under the `this.props.array` object, and as actions bound to both the form
and array on the object provided by the `FieldArray` component: `insert`, `pop`, `push`, `remove`,
`shift`, `swap`, and `unshift`. More detail can be found under the
[`FieldArray` docs](http://redux-form.com/6.0.0-alpha.15/docs/api/FieldArray.md).
Notice that array-specific errors are available if set on the array structure itself under the
`_error` key. (Hint: Add more than five hobbies to see an error.)
| 63.153846 | 98 | 0.756395 | eng_Latn | 0.999549 |
d3acdc9dfc0fd36e1743184777780364eaf7ffcc | 271 | md | Markdown | README.md | kshewani/EthereumSupplyChain | 6428a340b3eaa559afa6151ffda2327c35921677 | ["MIT"] | null | null | null | README.md | kshewani/EthereumSupplyChain | 6428a340b3eaa559afa6151ffda2327c35921677 | ["MIT"] | null | null | null | README.md | kshewani/EthereumSupplyChain | 6428a340b3eaa559afa6151ffda2327c35921677 | ["MIT"] | null | null | null |
A basic implementation of supply chain smart contracts on an Ethereum blockchain network to track the status of an item from the time it's ordered to its delivery. The application also includes a React-based client UI for interacting with smart contracts on the blockchain network.
| 135.5 | 270 | 0.826568 | eng_Latn | 0.999521 |
d3ad2ddc0505b20da7f2b9b9da5362c170884237 | 229 | md | Markdown | wikipedia-en/README.md | gioxx/fxaddons | 886dd2e33c607228249e64a15d85ddde7f34be40 | ["MIT"] | 2 | 2020-07-24T21:35:45.000Z | 2021-04-03T13:13:43.000Z | wikipedia-en/README.md | gioxx/fxaddons | 886dd2e33c607228249e64a15d85ddde7f34be40 | ["MIT"] | 4 | 2020-04-09T13:35:06.000Z | 2021-08-31T10:38:38.000Z | wikipedia-en/README.md | gioxx/fxaddons | 886dd2e33c607228249e64a15d85ddde7f34be40 | ["MIT"] | null | null | null |
# Wikipedia-EN

- Available at: https://addons.mozilla.org/it/firefox/addon/gioxxorg-wikipedia-en/
- Search term: **wen**
- Notes: -
| 28.625 | 86 | 0.737991 | ita_Latn | 0.108407 |
d3ae90da3c530cc287abc895ce21ef37b451c17d | 1,682 | md | Markdown | aesop.md | BenBrewster/benbrewster.github.io | f3427d0b3c408c71e6e3c02c48c77d7ebefdbfff | ["CC0-1.0"] | null | null | null | aesop.md | BenBrewster/benbrewster.github.io | f3427d0b3c408c71e6e3c02c48c77d7ebefdbfff | ["CC0-1.0"] | null | null | null | aesop.md | BenBrewster/benbrewster.github.io | f3427d0b3c408c71e6e3c02c48c77d7ebefdbfff | ["CC0-1.0"] | null | null | null |
---
layout: default
---
[back](./)
# // AEsOP.
### Applied engagement for community participation.
With AEsOP (Applied Engagement for Community Participation) we set out with the intention of developing an educational game to raise awareness of community policing within communities. AEsOP provides the user with twelve scenarios, each of which focuses on a different type of criminality. The game puts the user in the shoes of various community actors, including the police, allowing them to play through a range of interactive stories with branching decision paths to reveal how various types of citizen and community participation can help prevent and reduce the impact of local crime issues. The game uses mechanics borrowed from the 2D adventure game genre and the narrative storytelling approach used in ‘TellTale’ style adventure games. Making use of rich hand-illustrated art, we hope to ensure AEsOP is approachable and suitable for all. AEsOP will be free and available to play online through a web browser and via the iOS and Android app stores.
For this project, I took overall responsibility for the design of the game itself, running a number of co-design workshops with community engagement practitioners to ensure its alignment with real-world needs. I also wrote and designed each of the game's scenarios, making sure the issues realised in the game are indicative of true-to-life community issues. We hope to have the game completed in Q1 2018, at which point I would like to trial its use as an education and research tool with community groups and schools in raising awareness of local policing issues and avenues for citizen engagement.
[back](./)
| 105.125 | 958 | 0.800832 | eng_Latn | 0.999796 |
d3aec497b4c32664130eeefaa6279df048631a95 | 89 | md | Markdown | README.md | hariraghav10/customnewtab | 171209544831a8645d623940179f186aee2691a0 | ["MIT"] | null | null | null | README.md | hariraghav10/customnewtab | 171209544831a8645d623940179f186aee2691a0 | ["MIT"] | null | null | null | README.md | hariraghav10/customnewtab | 171209544831a8645d623940179f186aee2691a0 | ["MIT"] | null | null | null |
# Custom New Tab
A completely customized new tab page for browsers, with a Google search feature
| 29.666667 | 71 | 0.820225 | eng_Latn | 0.994076 |
d3aee0064302beef913651450c5590bc5411ad94 | 516 | md | Markdown | CHANGELOG.md | eSpecialized/WunderTDD | 9d923f97e33c1318447bd0b2aaa4b3c9883d35e6 | ["MIT"] | null | null | null | CHANGELOG.md | eSpecialized/WunderTDD | 9d923f97e33c1318447bd0b2aaa4b3c9883d35e6 | ["MIT"] | null | null | null | CHANGELOG.md | eSpecialized/WunderTDD | 9d923f97e33c1318447bd0b2aaa4b3c9883d35e6 | ["MIT"] | null | null | null |
# Change Log
All changes will be documented here.
---
## [1.1] 2018-04-28
Fixed up the test suite; it was outdated.
Started using Travis CI to check that builds are passing.
Installed RxSwift, RxCocoa, and RealmSwift for future expansion.
WConfigurationViewController now uses RxCocoa/RxSwift.
WWeatherListViewController has gotten some upgrades with RxSwift/RxCocoa.
WunderAPI has gotten a new observer method for fetching the weather info reactively.
## [1.0]
First revision, built entirely in Swift 4.0.3.
| 30.352941 | 83 | 0.794574 | eng_Latn | 0.99335 |
d3aefb7fef7d65f1bb201cc744f4c8506b03b0fa | 34,856 | md | Markdown | articles/data-factory/connector-troubleshoot-guide.md | daixijun/mc-docs.zh-cn | 1303fd33ba04b2881651e18d9db6a28d907f99e3 | ["CC-BY-4.0", "MIT"] | null | null | null | articles/data-factory/connector-troubleshoot-guide.md | daixijun/mc-docs.zh-cn | 1303fd33ba04b2881651e18d9db6a28d907f99e3 | ["CC-BY-4.0", "MIT"] | null | null | null | articles/data-factory/connector-troubleshoot-guide.md | daixijun/mc-docs.zh-cn | 1303fd33ba04b2881651e18d9db6a28d907f99e3 | ["CC-BY-4.0", "MIT"] | null | null | null |
---
title: Troubleshoot Azure Data Factory connectors
description: Learn how to troubleshoot connector issues in Azure Data Factory.
services: data-factory
author: WenJason
ms.service: data-factory
ms.topic: troubleshooting
origin.date: 01/07/2021
ms.date: 01/25/2021
ms.author: v-jay
ms.reviewer: craigg
ms.custom: has-adal-ref
ms.openlocfilehash: 49ff1b3750263a5907a0e2f41ea930c5db29d2e5
ms.sourcegitcommit: e1edc6ef84dbbda1da4e0a42efa3fd62eee033d1
ms.translationtype: HT
ms.contentlocale: zh-CN
ms.lasthandoff: 01/18/2021
ms.locfileid: "98541876"
---
# <a name="troubleshoot-azure-data-factory-connectors"></a>Troubleshoot Azure Data Factory connectors
[!INCLUDE[appliesto-adf-asa-md](includes/appliesto-adf-asa-md.md)]
This article explores common troubleshooting methods for Azure Data Factory connectors.
## <a name="azure-blob-storage"></a>Azure Blob Storage
### <a name="error-code-azurebloboperationfailed"></a>Error code: AzureBlobOperationFailed
- Message: `Blob operation Failed. ContainerName: %containerName;, path: %path;.`
- **Cause:** The Blob storage operation hit a problem.
- **Recommendation:** Check the error details. Refer to the Blob help document: https://docs.microsoft.com/rest/api/storageservices/blob-service-error-codes. Contact the storage team if you need help.
### <a name="invalid-property-during-copy-activity"></a>Invalid property during copy activity
- Message: `Copy activity <Activity Name> has an invalid "source" property. The source type is not compatible with the dataset <Dataset Name> and its linked service <Linked Service Name>. Please verify your input against.`
- **Cause:** The type defined in the dataset is inconsistent with the source/sink type defined in the copy activity.
- **Resolution:** Edit the dataset or pipeline JSON definition to make the types consistent, and rerun the deployment.
## <a name="azure-cosmos-db"></a>Azure Cosmos DB
### <a name="error-message-request-size-is-too-large"></a>Error message: Request size is too large
- **Symptoms:** When you copy data into Azure Cosmos DB with the default write batch size, you hit the error "Request size is too large".
- **Cause:** Cosmos DB limits the size of a single request to 2 MB. The formula is: request size = single document size * write batch size. If your document size is large, the default behavior results in a request size that is too large. You can tune the write batch size.
- **Resolution:** In the copy activity sink, reduce the "write batch size" value (the default value is 10000).
### <a name="error-message-unique-index-constraint-violation"></a>Error message: Unique index constraint violation
- **Symptoms:** When you copy data into Cosmos DB, you hit the following error:
```
Message=Partition range id 0 | Failed to import mini-batch.
Exception was Message: {"Errors":["Encountered exception while executing function. Exception = Error: {\"Errors\":[\"Unique index constraint violation.\"]}...
```
- **Cause:** There are two possible causes:
- If you use "Insert" as the write behavior, this error means your source data has rows/objects with the same ID.
- If you use "Upsert" as the write behavior and you set another unique key on the container, this error means rows/objects in your source data have different IDs but the same value for the defined unique key.
- **Resolution:**
- For cause 1, set "Upsert" as the write behavior.
- For cause 2, make sure each document uses a different value for the defined unique key.
### <a name="error-message-request-rate-is-large"></a>Error message: Request rate is large
- **Symptoms:** When you copy data into Cosmos DB, you hit the following error:
```
Type=Microsoft.Azure.Documents.DocumentClientException,
Message=Message: {"Errors":["Request rate is large"]}
```
- **Cause:** The request units used are greater than the available RUs configured in Cosmos DB. Learn how Cosmos DB calculates RUs [here](../cosmos-db/request-units.md#request-unit-considerations).
- **Resolution:** Here are two solutions:
- **Increase the container RUs** to a value greater than what's configured in Cosmos DB. This improves copy activity performance, but incurs more cost in Cosmos DB.
- Decrease **writeBatchSize** to a smaller value (such as 1000) and set **parallelCopies** to a smaller value (such as 1). This makes the copy run performance worse than now, but doesn't incur more cost in Cosmos DB.
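As a hedged sketch, the two knobs for the second solution can be lowered in the copy activity JSON roughly as follows (the sink type name depends on your scenario, and the values shown are only examples):

```
"sink": {
    "type": "CosmosDbSqlApiSink",
    "writeBatchSize": 1000
},
"parallelCopies": 1
```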
### <a name="column-missing-in-column-mapping"></a>Column missing in column mapping
- **Symptoms:** When you import a schema for Cosmos DB for column mapping, some columns are missing.
- **Cause:** ADF infers the schema from the first 10 Cosmos DB documents. If some columns/properties don't have a value in those documents, they aren't detected by ADF and therefore aren't displayed.
- **Resolution:** You can tune the query as follows to force columns with empty values to show up in the result set (assume the "impossible" column is missing in the first 10 documents). Alternatively, you can manually add the columns to map.
```sql
select c.company, c.category, c.comments, (c.impossible??'') as impossible from c
```
### <a name="error-message-the-guidrepresentation-for-the-reader-is-csharplegacy"></a>Error message: The GuidRepresentation for the reader is CSharpLegacy
- **Symptoms:** When you copy data from Cosmos DB MongoAPI/MongoDB with a UUID field, you hit the following error:
```
Failed to read data via MongoDB client.,
Source=Microsoft.DataTransfer.Runtime.MongoDbV2Connector,Type=System.FormatException,
Message=The GuidRepresentation for the reader is CSharpLegacy which requires the binary sub type to be UuidLegacy not UuidStandard.,Source=MongoDB.Bson,’“,
```
- **Cause:** There are two ways to represent a UUID in BSON: UuidStandard and UuidLegacy. By default, UuidLegacy is used to read data. You'll hit this error if your UUID data in MongoDB is UuidStandard.
- **Resolution:** In the MongoDB connection string, add the option "**uuidRepresentation=standard**". For more information, see [MongoDB connection string](connector-mongodb.md#linked-service-properties).
## <a name="azure-cosmos-db-sql-api"></a>Azure Cosmos DB (SQL API)
### <a name="error-code--cosmosdbsqlapioperationfailed"></a>Error code: CosmosDbSqlApiOperationFailed
- Message: `CosmosDbSqlApi operation Failed. ErrorMessage: %msg;.`
- **Cause:** The CosmosDbSqlApi operation hit a problem.
- **Recommendation:** Check the error details. Refer to the [CosmosDb help document](/cosmos-db/troubleshoot-dot-net-sdk). Contact the CosmosDb team if you need help.
## <a name="azure-data-lake-storage-gen2"></a>Azure Data Lake Storage Gen2
### <a name="error-code-adlsgen2operationfailed"></a>Error code: ADLSGen2OperationFailed
- Message: `ADLS Gen2 operation failed for: %adlsGen2Message;.%exceptionData;.`
- **Cause:** ADLS Gen2 throws the error indicating the operation failed.
- **Recommendation:** Check the detailed error message thrown by ADLS Gen2. If it's caused by a transient failure, retry. If you need further help, contact Azure Storage support and provide the request ID in the error message.
- **Cause:** If the error message contains "Forbidden", the service principal or managed identity you use may not have sufficient permission to access ADLS Gen2.
- **Recommendation:** Refer to the help document: https://docs.azure.cn/data-factory/connector-azure-data-lake-storage#service-principal-authentication.
- **Cause:** When the error message contains "InternalServerError", the error is returned by ADLS Gen2.
- **Recommendation:** It may be caused by a transient failure; retry. If the issue persists, contact Azure Storage support and provide the request ID from the error message.
### <a name="request-to-adls-gen2-account-met-timeout-error"></a>Request to ADLS Gen2 account met a timeout error
- **Message**: Error Code = `UserErrorFailedBlobFSOperation`, Error Message = `BlobFS operation failed for: A task was canceled`.
- **Cause:** The issue is caused by an ADLS Gen2 sink timeout error, which mostly happens on self-hosted IR machines.
- **Recommendation:**
- Place your self-hosted IR machine and the target ADLS Gen2 account in the same region if possible. This can avoid random timeout errors and achieve better performance.
- Check whether there's any special network setting, such as ExpressRoute, and ensure the network has enough bandwidth. It's suggested to lower the self-hosted IR concurrent jobs setting when the overall bandwidth is low, which avoids network resource competition across multiple concurrent jobs.
- If the file size is moderate or small, use a smaller block size for non-binary copy to mitigate such timeout errors. Refer to [Blob Storage Put Block](https://docs.microsoft.com/rest/api/storageservices/put-block).
To specify the custom block size, you can edit this property in the .json editor:
```
"sink": {
"type": "DelimitedTextSink",
"storeSettings": {
"type": "AzureBlobFSWriteSettings",
"blockSizeInMB": 8
}
}
```
## <a name="azure-file-storage"></a>Azure File Storage
### <a name="error-code--azurefileoperationfailed"></a>Error code: AzureFileOperationFailed
- Message: `Azure File operation Failed. Path: %path;. ErrorMessage: %msg;.`
- **Cause:** The Azure File Storage operation hit a problem.
- **Recommendation:** Check the error details. Refer to the Azure Files help document: https://docs.microsoft.com/rest/api/storageservices/file-service-error-codes. Contact the storage team if you need help.
## <a name="azure-synapse-analyticsazure-sql-databasesql-server"></a>Azure Synapse Analytics/Azure SQL Database/SQL Server
### <a name="error-code--sqlfailedtoconnect"></a>Error code: SqlFailedToConnect
- Message: `Cannot connect to SQL Database: '%server;', Database: '%database;', User: '%user;'. Check the linked service configuration is correct, and make sure the SQL Database firewall allows the integration runtime to access.`
- **Cause:** Azure SQL: If the error message contains an SQL error code, such as "SqlErrorNumber=[errorcode]", refer to the Azure SQL troubleshooting guide.
- **Recommendation:** Learn more at https://docs.azure.cn/azure-sql/database/troubleshoot-common-errors-issues.
- **Cause:** Check whether port 1433 is in the firewall allowlist.
- **Recommendation:** Refer to this reference document: https://docs.microsoft.com/sql/sql-server/install/configure-the-windows-firewall-to-allow-sql-server-access#ports-used-by-.
- **Cause:** If the error message contains "SqlException", SQL Database throws the error indicating that some specific operation failed.
- **Recommendation:** For more details, search by the SQL error code in this reference document: https://docs.microsoft.com/sql/relational-databases/errors-events/database-engine-events-and-errors. If you need further help, contact Azure SQL support.
- **Cause:** If this is a transient issue (for example, an unstable network connection), add a retry in the activity policy to mitigate it.
- **Recommendation:** Refer to this reference document: https://docs.azure.cn/data-factory/concepts-pipelines-activities#activity-policy.
- **Cause**: If the error message contains "Client with IP address '...' is not allowed to access the server" and you're trying to connect to Azure SQL Database, this is usually caused by an Azure SQL Database firewall issue.
- **Recommendation:** In the Azure SQL Server firewall configuration, enable the "Allow Azure services and resources to access this server" option. Reference document: https://docs.azure.cn/sql-database/sql-database-firewall-configure.
### <a name="error-code--sqloperationfailed"></a>Error code: SqlOperationFailed
- Message: `A database operation failed. Please search error to get more details.`
- **Cause:** If the error message contains "SqlException", SQL Database throws the error indicating that some specific operation failed.
- **Recommendation:** If the SQL error isn't clear, try changing the database to the latest compatibility level, "150". It may throw the latest version of the SQL error. See the [detail document](https://docs.microsoft.com/sql/t-sql/statements/alter-database-transact-sql-compatibility-level#backwardCompat).
To troubleshoot a SQL error, search by the SQL error code in this reference document for more details: https://docs.microsoft.com/sql/relational-databases/errors-events/database-engine-events-and-errors. If you need further help, contact Azure SQL support.
- **Cause:** If the error message contains "PdwManagedToNativeInteropException", it's usually caused by a mismatch between the source and sink column sizes.
- **Recommendation:** Check the size of both the source and sink columns. If you need further help, contact Azure SQL support.
- **Cause:** If the error message contains "InvalidOperationException", it's usually caused by invalid input data.
- **Recommendation:** To identify which row has encountered the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to storage for further investigation. Reference document: https://docs.azure.cn/data-factory/copy-activity-fault-tolerance.
### <a name="error-code--sqlunauthorizedaccess"></a>Error code: SqlUnauthorizedAccess
- Message: `Cannot connect to '%connectorName;'. Detail Message: '%message;'`
- **Cause:** The credential is incorrect, or the login account can't access the SQL database.
- **Recommendation:** Check whether the login account has sufficient permission to access the SQL database.
### <a name="error-code--sqlopenconnectiontimeout"></a>Error code: SqlOpenConnectionTimeout
- Message: `Open connection to database timeout after '%timeoutValue;' seconds.`
- **Cause:** It could be a SQL database transient failure.
- **Recommendation:** Retry, and update the linked service connection string with a larger connection timeout value.
### <a name="error-code--sqlautocreatetabletypemapfailed"></a>Error code: SqlAutoCreateTableTypeMapFailed
- Message: `Type '%dataType;' in source side cannot be mapped to a type that supported by sink side(column name:'%columnName;') in autocreate table.`
- **Cause:** Auto-creating the table can't meet the source requirement.
- **Recommendation:** Update the column type in "mappings", or manually create the sink table in the target server.
### <a name="error-code--sqldatatypenotsupported"></a>Error code: SqlDataTypeNotSupported
- Message: `A database operation failed. Check the SQL errors.`
- **Cause:** If the issue occurs on the SQL source and the error is related to SqlDateTime overflow, the data value is outside the logic type range (1/1/1753 12:00:00 AM - 12/31/9999 11:59:59 PM).
- **Recommendation:** Cast the type to string in the source SQL query, or change the column type to "String" in the copy activity column mapping.
- **Cause:** If the issue occurs on the SQL sink and the error is related to SqlDateTime overflow, the data value is outside the allowed range in the sink table.
- **Recommendation:** Update the corresponding column type to the "datetime2" type in the sink table.
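For the source-side case, a minimal sketch of casting the problematic column to a string in the source query (the table and column names here are placeholders, not from your pipeline):

```sql
SELECT CAST(ModifiedDate AS VARCHAR(30)) AS ModifiedDate
FROM dbo.SourceTable;
```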
### <a name="error-code--sqlinvaliddbstoredprocedure"></a>Error code: SqlInvalidDbStoredProcedure
- Message: `The specified Stored Procedure is not valid. It could be caused by that the stored procedure doesn't return any data. Invalid Stored Procedure script: '%scriptName;'.`
- **Cause:** The specified stored procedure is invalid. It could be caused by the stored procedure not returning any data.
- **Recommendation:** Validate the stored procedure with SQL tools. Make sure the stored procedure can return data.
### <a name="error-code--sqlinvaliddbquerystring"></a>Error code: SqlInvalidDbQueryString
- Message: `The specified SQL Query is not valid. It could be caused by that the query doesn't return any data. Invalid query: '%query;'`
- **Cause:** The specified SQL query is invalid. It could be caused by the query not returning any data.
- **Recommendation:** Validate the SQL query with SQL tools. Make sure the query can return data.
### <a name="error-code--sqlinvalidcolumnname"></a>Error code: SqlInvalidColumnName
- Message: `Column '%column;' does not exist in the table '%tableName;', ServerName: '%serverName;', DatabaseName: '%dbName;'.`
- **Cause:** The column can't be found. There might be a wrong configuration.
- **Recommendation:** Verify the column in the query, the "structure" in the dataset, and the "mappings" in the activity.
### <a name="error-code--sqlbatchwritetimeout"></a>Error code: SqlBatchWriteTimeout
- Message: `Timeouts in SQL write operation.`
- **Cause:** It could be a SQL database transient failure.
- **Recommendation:** Retry. If the problem reoccurs, contact Azure SQL support.
### <a name="error-code--sqlbatchwritetransactionfailed"></a>Error code: SqlBatchWriteTransactionFailed
- Message: `SQL transaction commits failed`
- **Cause:** If the exception details constantly indicate a transaction timeout, the network latency between the integration runtime and the database is greater than the default threshold (30 seconds).
- **Recommendation:** Update the SQL linked service connection string with a "connection timeout" value of 120 or more, and rerun the activity.
- **Cause:** If the exception details intermittently indicate that the SQL connection is broken, it might just be a transient network failure or an issue on the SQL database side.
- **Recommendation:** Retry the activity and review the SQL database side metrics.
### <a name="error-code--sqlbulkcopyinvalidcolumnlength"></a>Error code: SqlBulkCopyInvalidColumnLength
- Message: `SQL Bulk Copy failed due to receive an invalid column length from the bcp client.`
- **Cause:** SQL Bulk Copy failed because it received an invalid column length from the bcp client.
- **Recommendation:** To identify which row has encountered the problem, enable the fault tolerance feature on the copy activity, which can redirect problematic rows to storage for further investigation. Reference document: https://docs.azure.cn/data-factory/copy-activity-fault-tolerance.
### <a name="error-code--sqlconnectionisclosed"></a>Error code: SqlConnectionIsClosed
- Message: `The connection is closed by SQL Database.`
- **Cause:** The SQL connection is closed by SQL Database when there are highly concurrent runs and the server terminates the connection.
- **Recommendation:** The connection is closed by the remote server. Retry. If the problem reoccurs, contact Azure SQL support.
### <a name="error-message-conversion-failed-when-converting-from-a-character-string-to-uniqueidentifier"></a>Error message: Conversion failed when converting from a character string to uniqueidentifier
- **Symptoms:** When you copy data from a tabular data source (such as SQL Server) into Azure Synapse Analytics using staged copy and PolyBase, you hit the following error:
```
ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Error happened when loading data into Azure Synapse Analytics.,
Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
Message=Conversion failed when converting from a character string to uniqueidentifier...
```
- **Cause:** Azure Synapse Analytics PolyBase can't convert an empty string to a GUID.
- **Resolution:** In the copy activity sink, under PolyBase settings, set the "use type default" option to false.
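As an illustrative fragment, the option can be switched off in the copy activity sink JSON roughly as follows (the exact sink type and surrounding properties depend on your pipeline):

```
"sink": {
    "type": "SqlDWSink",
    "allowPolyBase": true,
    "polyBaseSettings": {
        "useTypeDefault": false
    }
}
```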
### <a name="error-message-expected-data-type-decimalxx-offending-value"></a>Error message: Expected data type: DECIMAL(x,x), Offending value
- **Symptoms:** When you copy data from a tabular data source (such as SQL Server) into Azure Synapse Analytics using staged copy and PolyBase, you hit the following error:
```
ErrorCode=FailedDbOperation,Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Error happened when loading data into Azure Synapse Analytics.,
Source=Microsoft.DataTransfer.ClientLibrary,Type=System.Data.SqlClient.SqlException,
Message=Query aborted-- the maximum reject threshold (0 rows) was reached while reading from an external source: 1 rows rejected out of total 415 rows processed. (/file_name.txt)
Column ordinal: 18, Expected data type: DECIMAL(x,x), Offending value:..
```
- **Cause:** Azure Synapse Analytics PolyBase can't insert an empty string (null value) into a decimal column.
- **Resolution:** In the copy activity sink, under PolyBase settings, set the "use type default" option to false.
### <a name="error-message-java-exception-message-hdfsbridgecreaterecordreader"></a>Error message: Java exception message: HdfsBridge::CreateRecordReader
- **Symptoms:** When you copy data into Azure Synapse Analytics using PolyBase, you hit the following error:
```
Message=110802;An internal DMS error occurred that caused this operation to fail.
Details: Exception: Microsoft.SqlServer.DataWarehouse.DataMovement.Common.ExternalAccess.HdfsAccessException,
Message: Java exception raised on call to HdfsBridge_CreateRecordReader.
Java exception message:HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.: Error [HdfsBridge::CreateRecordReader - Unexpected error encountered creating the record reader.] occurred while accessing external file.....
```
- **Cause:** The possible cause is that the schema (total column width) is too large (larger than 1 MB). Check the schema of the target Azure Synapse Analytics table by adding the size of all columns:
- Int -> 4 bytes
- Bigint -> 8 bytes
- Varchar(n), char(n), binary(n), varbinary(n) -> n bytes
- Nvarchar(n), nchar(n) -> n*2 bytes
- Date -> 6 bytes
- Datetime/(2), smalldatetime -> 16 bytes
- Datetimeoffset -> 20 bytes
- Decimal -> 19 bytes
- Float -> 8 bytes
- Money -> 8 bytes
- Smallmoney -> 4 bytes
- Real -> 4 bytes
- Smallint -> 2 bytes
- Time -> 12 bytes
- Tinyint -> 1 byte
- **Resolution:**
- Reduce the column width to be less than 1 MB.
- Or use a bulk insert approach by disabling PolyBase.
### <a name="error-message-the-condition-specified-using-http-conditional-headers-is-not-met"></a>Error message: The condition specified using HTTP conditional header(s) is not met
- **Symptoms:** When you use a SQL query to pull data from Azure Synapse Analytics, you hit the following error:
```
...StorageException: The condition specified using HTTP conditional header(s) is not met...
```
- **Cause:** Azure Synapse Analytics hit an issue querying the external table in Azure Storage.
- **Resolution:** Run the same query in SSMS and check whether you see the same result. If so, open a support ticket to Azure Synapse Analytics and provide your Azure Synapse Analytics server and database name to troubleshoot further.
### <a name="performance-tier-is-low-and-leads-to-copy-failure"></a>Performance tier is low and leads to copy failure
- **Symptoms:** When you copy data into Azure SQL Database, you hit the following error message: `Database operation failed. Error message from database execution : ExecuteNonQuery requires an open and available Connection. The connection's current state is closed.`
- **Cause:** Azure SQL Database s1 is being used, and it hit the IO limits in this case.
- **Resolution:** Upgrade the Azure SQL Database performance tier to fix the issue.
### <a name="sql-table-cannot-be-found"></a>SQL table can't be found
- **Symptoms:** When you copy data from a hybrid environment into an on-premises SQL Server table, you hit the following error: `Cannot find the object "dbo.Contoso" because it does not exist or you do not have permissions.`
- **Cause:** The current SQL account doesn't have sufficient permission to execute requests issued by .NET SqlBulkCopy.WriteToServer.
- **Resolution:** Switch to a more privileged SQL account.
### <a name="error-message-string-or-binary-data-would-be-truncated"></a>Error message: String or binary data would be truncated
- **Symptoms:** You hit an error when copying data into an on-premises/Azure SQL Server table:
- **Cause:** The Cx SQL table schema definition has one or more columns with less length than expected.
- **Resolution:** Try the following steps to fix the issue:
1. Apply SQL sink [fault tolerance](/data-factory/copy-activity-fault-tolerance), especially "redirectIncompatibleRowSettings", to troubleshoot which rows have the problem.
> [!NOTE]
> Note that fault tolerance might introduce additional execution time, which could lead to higher cost.
2. Carefully double-check the redirected data against the SQL table schema column length to see which columns need to be updated.
3. Update the table schema accordingly.
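For step 3, a hedged example of widening a too-short column (the table, column, and new length are placeholders; pick a length that covers the redirected data):

```sql
ALTER TABLE dbo.SinkTable ALTER COLUMN Comment NVARCHAR(500);
```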
## <a name="azure-table-storage"></a>Azure Table storage
### <a name="error-code--azuretableduplicatecolumnsfromsource"></a>Error code: AzureTableDuplicateColumnsFromSource
- Message: `Duplicate columns with same name '%name;' are detected from source. This is NOT supported by Azure Table Storage sink`
- **Cause:** This is common for SQL queries with JOINs, or for unstructured CSV files.
- **Recommendation:** Double-check the source columns and fix them accordingly.
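When the duplicates come from a JOIN, aliasing the colliding columns in the source query avoids the error; a sketch with placeholder table and column names:

```sql
SELECT a.Name AS CustomerName, b.Name AS ProductName
FROM Customers a
JOIN Products b ON b.CustomerId = a.Id;
```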
## <a name="db2"></a>DB2
### <a name="error-code--db2driverrunfailed"></a>Error code: DB2DriverRunFailed
- Message: `Error thrown from driver. Sql code: '%code;'`
- **Cause:** If the error message contains "SQLSTATE=51002 SQLCODE=-805", refer to the tip in this document: https://docs.azure.cn/data-factory/connector-db2#linked-service-properties
- **Recommendation:** Try to set "NULLID" in the "packageCollection" property.
## <a name="delimited-text-format"></a>Delimited text format
### <a name="error-code--delimitedtextcolumnnamenotallownull"></a>Error code: DelimitedTextColumnNameNotAllowNull
- Message: `The name of column index %index; is empty. Make sure column name is properly specified in the header row.`
- **Cause:** When "firstRowAsHeader" is set in the activity, the first row is used as column names. This error means the first row contains an empty value, for example, "ColumnA, ColumnB".
- **Recommendation:** Check the first row, and fix the value if there's an empty value.
### <a name="error-code--delimitedtextmorecolumnsthandefined"></a>Error code: DelimitedTextMoreColumnsThanDefined
- Message: `Error found when processing '%function;' source '%name;' with row number %rowCount;: found more columns than expected column count: %expectedColumnCount;.`
- **Cause:** The column count of the problematic row is larger than the column count of the first row. It might be caused by a data issue or an incorrect column delimiter/quote char setting.
- **Recommendation:** Get the row count from the error message, check the row's columns, and fix the data.
- **Cause:** If the expected column count in the error message is "1", you might have specified wrong compression or format settings. As a result, ADF parsed your files incorrectly.
- **Recommendation:** Check the format settings to make sure they match your source files.
- **Cause:** If your source is a folder, it's possible that the files under the specified folder have different schemas.
- **Recommendation:** Make sure the files in the given folder have identical schemas.
## <a name="dynamics-365common-data-servicedynamics-crm"></a>Dynamics 365/Common Data Service/Dynamics CRM
### <a name="error-code--dynamicscreateserviceclienterror"></a>Error code: DynamicsCreateServiceClientError
- Message: `This is a transient issue on dynamics server side. Try to rerun the pipeline.`
- **Cause:** A transient issue occurred on the Dynamics server side.
- **Recommendation:** Rerun the pipeline. If it fails again, try to lower the parallelism. If it still fails, contact Dynamics support.
### <a name="columns-are-missing-when-previewingimporting-schema"></a>Columns are missing when previewing/importing the schema
- **Symptoms:** Some columns are missing when importing the schema or previewing the data. Error message: `The valid structure information (column name and type) are required for Dynamics source.`
- **Cause:** This issue is basically by design, because ADF is unable to show columns that contain no values in the first 10 records. Make sure the columns you've added are in the correct format.
- **Recommendation:** Manually add the columns in the mapping tab.
### <a name="error-code--dynamicsmissingtargetformultitargetlookupfield"></a>Error code: DynamicsMissingTargetForMultiTargetLookupField
- Message: `Cannot find the target column for multi-target lookup field: '%fieldName;'.`
- **Cause:** The target column doesn't exist in the source or in the column mapping.
- **Recommendation:** 1. Make sure the source contains the target column. 2. Add the target column in the column mapping. Ensure the sink column is in the format "{fieldName}@EntityReference".
### <a name="error-code--dynamicsinvalidtargetformultitargetlookupfield"></a>Error code: DynamicsInvalidTargetForMultiTargetLookupField
- Message: `The provided target: '%targetName;' is not a valid target of field: '%fieldName;'. Valid targets are: '%validTargetNames;"`
- **Cause:** A wrong entity name is provided as the target entity of a multi-target lookup field.
- **Recommendation:** Provide a valid entity name for the multi-target lookup field.
### <a name="error-code--dynamicsinvalidtypeformultitargetlookupfield"></a>Error code: DynamicsInvalidTypeForMultiTargetLookupField
- Message: `The provided target type is not a valid string. Field: '%fieldName;'.`
- **Cause:** The value in the target column isn't a string.
- **Recommendation:** Provide a valid string in the multi-target lookup target column.
### <a name="error-code--dynamicsfailedtorequetserver"></a>Error code: DynamicsFailedToRequetServer
- Message: `The dynamics server or the network is experiencing issues. Check network connectivity or check dynamics server log for more details.`
- **Cause:** The Dynamics server is unstable or inaccessible, or the network is experiencing issues.
- **Recommendation:** Check network connectivity, or check the Dynamics server log for more details. For further help, contact Dynamics support.
## <a name="ftp"></a>FTP
### <a name="error-code--ftpfailedtoconnecttoftpserver"></a>Error code: FtpFailedToConnectToFtpServer
- Message: `Failed to connect to FTP server. Please make sure the provided server informantion is correct, and try again.`
- **Cause:** A wrong linked service type might be used for the FTP server, such as using the SFTP linked service to connect to an FTP server.
- **Recommendation:** Check the port of the target server. By default, FTP uses port 21.
## <a name="http"></a>HTTP
### <a name="error-code--httpfilefailedtoread"></a>Error code: HttpFileFailedToRead
- Message: `Failed to read data from http server. Check the error from http server:%message;`
- **Cause:** This error occurs when Azure Data Factory talks to the HTTP server, but the HTTP request operation fails.
- **Recommendation:** Check the HTTP status code/message in the error message, and fix the remote server issue.
## <a name="oracle"></a>Oracle
### <a name="error-code-argumentoutofrangeexception"></a>Error code: ArgumentOutOfRangeException
- Message: `Hour, Minute, and Second parameters describe an un-representable DateTime.`
- **Cause:** In ADF, DateTime values are supported in the range from 0001-01-01 00:00:00 to 9999-12-31 23:59:59. However, Oracle supports a wider range of DateTime values (such as the BC century, or min/sec > 59), which leads to failure in ADF.
- **Recommendation:**
Run `select dump(<column name>)` to check whether the value in Oracle is within ADF's supported range.
If you want to know the byte sequence in the result, check https://stackoverflow.com/questions/13568193/how-are-dates-stored-in-oracle.
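A minimal sketch of the check, with a placeholder table and column name:

```sql
SELECT DUMP(hire_date) FROM employees;
```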
## <a name="orc-format"></a>ORC format
### <a name="error-code--orcjavainvocationexception"></a>Error code: OrcJavaInvocationException
- Message: `An error occurred when invoking java, message: %javaException;.`
- **Cause:** If the error message contains "java.lang.OutOfMemory", "Java heap space", and "doubleCapacity", it's usually a memory management issue in an old version of the integration runtime.
- **Recommendation:** If you're using the self-hosted integration runtime, we recommend upgrading to the latest version.
- **Cause:** If the error message contains "java.lang.OutOfMemory", the integration runtime doesn't have enough resources to process the files.
- **Recommendation:** Limit the concurrent runs on the integration runtime. For the self-hosted integration runtime, scale up to a powerful machine with memory equal to or larger than 8 GB.
- **Cause:** If the error message contains "NullPointerReference", the cause might be a transient error.
- **Recommendation:** Retry. If the problem persists, contact support.
- **Cause:** If the error message contains "BufferOverflowException", the cause might be a transient error.
- **Recommendation:** Retry. If the problem persists, contact support.
- **Cause:** If the error message contains "java.lang.ClassCastException:org.apache.hadoop.hive.serde2.io.HiveCharWritable cannot be cast to org.apache.hadoop.io.Text", this is a type conversion issue inside the Java runtime. It's usually caused by source data that can't be handled well in the Java runtime.
- **Recommendation:** This is a data issue. Try to use string instead of char/varchar in ORC format data.
### <a name="error-code--orcdatetimeexceedlimit"></a>Error code: OrcDateTimeExceedLimit
- Message: `The Ticks value '%ticks;' for the datetime column must be between valid datetime ticks range -621355968000000000 and 2534022144000000000.`
- **Cause:** If the datetime value is '0001-01-01 00:00:00', it could be caused by the difference between the Julian calendar and the Gregorian calendar. https://en.wikipedia.org/wiki/Proleptic_Gregorian_calendar#Difference_between_Julian_and_proleptic_Gregorian_calendar_dates.
- **Recommendation:** Check the ticks value and avoid using the datetime value '0001-01-01 00:00:00'.
## <a name="parquet-format"></a>Parquet 格式
### <a name="error-code--parquetjavainvocationexception"></a>错误代码:ParquetJavaInvocationException
- 消息:`An error occurred when invoking java, message: %javaException;.`
- **原因:** 如果错误消息包含“java.lang.OutOfMemory”、“Java 堆空间”和“doubleCapacity”,则原因通常与旧版集成运行时中的内存管理问题有关。
- **建议**:如果使用的是自承载集成运行时,且版本低于 3.20.7159.1,建议升级到最新版本。
- **原因:** 如果错误消息包含“java.lang.OutOfMemory”,则原因是集成运行时不能提供足够的资源来处理文件。
- **建议**:限制集成运行时中的并发运行。 对于自承载集成运行时,请扩展到具有 8 GB 或更大内存的强大计算机。
- **原因:** 如果错误消息包含“NullPointerReference”,则原因可能是出现了暂时性错误。
- **建议**:重试。 如果问题仍然出现,请联系支持人员。
### <a name="error-code--parquetinvalidfile"></a>错误代码:ParquetInvalidFile
- 消息:`File is not a valid Parquet file.`
- **原因:** Parquet 文件问题。
- **建议**:检查输入是否为有效的 Parquet 文件。
### <a name="error-code--parquetnotsupportedtype"></a>错误代码:ParquetNotSupportedType
- 消息:`Unsupported Parquet type. PrimitiveType: %primitiveType; OriginalType: %originalType;.`
- **原因:** Azure 数据工厂不支持 Parquet 格式。
- **建议**:反复检查源数据。 参阅文档: https://docs.azure.cn/data-factory/supported-file-formats-and-compression-codecs 。
### <a name="error-code--parquetmisseddecimalprecisionscale"></a>错误代码:ParquetMissedDecimalPrecisionScale
- 消息:`Decimal Precision or Scale information is not found in schema for column: %column;.`
- **原因:** 尝试分析数字的精度和小数位数,但系统未提供此类信息。
- **建议**:“Source”不会返回正确的精度和小数位数。 检查问题列的精度和小数位数。
### <a name="error-code--parquetinvaliddecimalprecisionscale"></a>错误代码:ParquetInvalidDecimalPrecisionScale
- 消息:`Invalid Decimal Precision or Scale. Precision: %precision; Scale: %scale;.`
- **原因:** 架构无效。
- **建议**:检查问题列的精度和小数位数。
### <a name="error-code--parquetcolumnnotfound"></a>错误代码:ParquetColumnNotFound
- 消息:`Column %column; does not exist in Parquet file.`
- **原因:** 源架构与接收器架构不匹配。
- **建议**:检查“activity”中的“mappings”。 确保源列可映射到正确的接收器列。
### <a name="error-code--parquetinvaliddataformat"></a>Error code: ParquetInvalidDataFormat
- **Message:** `Incorrect format of %srcValue; for converting to %dstType;.`
- **Cause:** The data can't be converted to the type specified in mappings.source.
- **Recommendation:** Double-check the source data, or specify the correct data type for this column in the copy activity column mapping. See the documentation: https://docs.azure.cn/data-factory/supported-file-formats-and-compression-codecs .
### <a name="error-code--parquetdatacountnotmatchcolumncount"></a>Error code: ParquetDataCountNotMatchColumnCount
- **Message:** `The data count in a row '%sourceColumnCount;' does not match the column count '%sinkColumnCount;' in given schema.`
- **Cause:** The source column count doesn't match the sink column count.
- **Recommendation:** Double-check that the source column count is the same as the sink column count in "mapping".
### <a name="error-code--parquetdatatypenotmatchcolumntype"></a>Error code: ParquetDataTypeNotMatchColumnType
- **Message:** The data type %srcType; does not match the given column type %dstType; for column '%columnIndex;'.
- **Cause:** The data from the source can't be converted to the type defined in the sink.
- **Recommendation:** Specify the correct type in mapping.sink.
### <a name="error-code--parquetbridgeinvaliddata"></a>Error code: ParquetBridgeInvalidData
- **Message:** `%message;`
- **Cause:** A data value exceeded its limit.
- **Recommendation:** Retry the operation. If the problem persists, contact us.
### <a name="error-code--parquetunsupportedinterpretation"></a>Error code: ParquetUnsupportedInterpretation
- **Message:** `The given interpretation '%interpretation;' of Parquet format is not supported.`
- **Cause:** The scenario is not supported.
- **Recommendation:** 'ParquetInterpretFor' should not be 'sparkSql'.
### <a name="error-code--parquetunsupportfilelevelcompressionoption"></a>Error code: ParquetUnsupportFileLevelCompressionOption
- **Message:** `File level compression is not supported for Parquet.`
- **Cause:** The scenario is not supported.
- **Recommendation:** Remove 'CompressionType' from the payload.
### <a name="error-code--usererrorjniexception"></a>Error code: UserErrorJniException
- **Message:** `Cannot create JVM: JNI return code [-6][JNI call failed: Invalid arguments.]`
- **Cause:** The JVM can't be created because some illegal (global) arguments are set.
- **Recommendation:** Log in to each machine that hosts a node of your self-hosted integration runtime. Check that the system variable is set correctly, as follows: `_JAVA_OPTIONS "-Xms256m -Xmx16g" with memory bigger than 8 G`. Restart all the IR nodes, and then rerun the pipeline.
### <a name="arithmetic-overflow"></a>Arithmetic overflow
- **Symptoms:** You see the following error message when copying Parquet files: `Message = Arithmetic Overflow., Source = Microsoft.DataTransfer.Common`
- **Cause:** When copying files from Oracle to Parquet, only decimal numbers with a precision <= 38 and an integer-part length <= 20 are currently supported.
- **Resolution:** As a workaround, you can convert any columns that have this problem to VARCHAR2.
### <a name="no-enum-constant"></a>No enum constant
- **Symptoms:** You see one of the following error messages when copying data into Parquet format: `java.lang.IllegalArgumentException:field ended by ';'` or `java.lang.IllegalArgumentException:No enum constant org.apache.parquet.schema.OriginalType.test`.
- **Cause:**
The issue can be caused by spaces or unsupported characters (such as ;{}()\n\t=) in column names, because Parquet doesn't support such a format.
For example, a column name like contoso(test) will parse the type in brackets in the [code](https://github.com/apache/parquet-mr/blob/master/parquet-column/src/main/java/org/apache/parquet/schema/MessageTypeParser.java) `Tokenizer st = new Tokenizer(schemaString, " ;{}()\n\t");`. An error is thrown because no such "test" type exists.
You can check the supported types [here](https://github.com/apache/parquet-mr/blob/master/parquet-column/src/main/java/org/apache/parquet/schema/OriginalType.java).
- **Resolution:**
- Double-check whether the sink column names contain spaces.
- Double-check whether a first row that contains spaces was used as the column names.
- Double-check whether the type OriginalType is supported. Try to avoid these special characters: `,;{}()\n\t=`.
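As a quick pre-flight check before writing Parquet, you can scan your column names for the delimiter characters above. This is a minimal illustrative sketch (the function name is ours; the character set is taken from the tokenizer string quoted above):

```python
# Characters the Parquet schema tokenizer treats as delimiters; column names
# containing them (or spaces) trigger the "No enum constant" style errors.
PARQUET_UNSAFE = set(" ,;{}()\n\t=")

def find_unsafe_columns(column_names):
    """Return the column names that contain characters Parquet cannot accept."""
    return [name for name in column_names
            if any(ch in PARQUET_UNSAFE for ch in name)]

print(find_unsafe_columns(["id", "contoso(test)", "unit price", "total"]))
# → ['contoso(test)', 'unit price']
```

Renaming the flagged columns in the source query or in the copy activity mapping avoids the failure entirely.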
## <a name="rest"></a>REST
### <a name="error-code--restsinkcallfailed"></a>Error code: RestSinkCallFailed
- **Message:** `Rest Endpoint responded with Failure from server. Check the error from server:%message;`
- **Cause:** This error occurs when Azure Data Factory talks to the REST endpoint over the HTTP protocol and the request operation fails.
- **Recommendation:** Check the HTTP status code or the message in the error message, and fix the remote server issue.
### <a name="unexpected-network-response-from-rest-connector"></a>Unexpected network response from the REST connector
- **Symptoms:** The endpoint sometimes receives an unexpected response (400/401/403/500) from the REST connector.
- **Cause:** The REST source connector uses the URL and the HTTP method/headers/body from the linked service/dataset/copy source as parameters when it constructs an HTTP request. The issue is most likely caused by an error in one or more of the specified parameters.
- **Resolution:**
- Use 'curl' in a cmd window to check whether a parameter is the cause (the Accept and User-Agent headers should always be included):
```
curl -i -X <HTTP method> -H <HTTP header1> -H <HTTP header2> -H "Accept: application/json" -H "User-Agent: azure-data-factory/2.0" -d '<HTTP body>' <URL>
```
If the command returns the same unexpected response, fix the parameters above with 'curl' until it returns the expected response.
You can also use 'curl --help' for more advanced usage of the command.
- If only the ADF REST connector returns an unexpected response, contact Azure support for further troubleshooting.
- Note that 'curl' might not be suitable for reproducing SSL certificate validation issues. In some scenarios, the 'curl' command succeeds without hitting any SSL certificate validation issue. But when the same URL is opened in a browser, no SSL certificate is actually returned up front for the client to establish trust with the server.
For cases like the above, tools such as Postman and Fiddler are recommended.
## <a name="sftp"></a>SFTP
### <a name="error-code--sftpoperationfail"></a>Error code: SftpOperationFail
- **Message:** `Failed to '%operation;'. Check detailed error from SFTP.`
- **Cause:** A problem occurred during the SFTP operation.
- **Recommendation:** Check the detailed error from SFTP.
### <a name="error-code--sftprenameoperationfail"></a>Error code: SftpRenameOperationFail
- **Message:** `Failed to rename the temp file. Your SFTP server doesn't support renaming temp file, please set "useTempFileRename" as false in copy sink to disable uploading to temp file.`
- **Cause:** The SFTP server doesn't support renaming the temp file.
- **Recommendation:** Set "useTempFileRename" to false in the copy sink to disable uploading to a temp file.
### <a name="error-code--sftpinvalidsftpcredential"></a>Error code: SftpInvalidSftpCredential
- **Message:** `Invalid Sftp credential provided for '%type;' authentication type.`
- **Cause:** The private key content was fetched from AKV or the SDK, but it isn't encoded correctly.
- **Recommendation:**
If the private key content comes from AKV, the original key file works if you upload it directly to the SFTP linked service.
See https://docs.azure.cn/data-factory/connector-sftp#using-ssh-public-key-authentication . The privateKey content is the Base64-encoded content of the SSH private key.
Encode the entire content of the original private key file with Base64 encoding, and store the encoded string in AKV. If you click "Upload from file", the original private key file works on the SFTP linked service.
Here are some samples for generating the string:
- Using C# code:
```
byte[] keyContentBytes = File.ReadAllBytes(@"{Private Key Path}");
string keyContent = Convert.ToBase64String(keyContentBytes, Base64FormattingOptions.None);
```
- Using Python code:
```
import base64
with open(r'{Private Key Path}', 'rb') as rfd:
    keyContent = rfd.read()
print(base64.b64encode(keyContent).decode('ascii'))
```
- Using a third-party Base64 conversion tool
A tool such as https://www.base64encode.org/ is recommended.
- **Cause:** The wrong key content format was chosen.
- **Recommendation:**
SSH private keys in PKCS#8 format (which begin with "-----BEGIN ENCRYPTED PRIVATE KEY-----") are currently not supported for accessing the SFTP server in ADF.
Run the following commands to convert the key to the traditional SSH key format (which begins with "-----BEGIN RSA PRIVATE KEY-----"):
```
openssl pkcs8 -in pkcs8_format_key_file -out traditional_format_key_file
chmod 600 traditional_format_key_file
ssh-keygen -f traditional_format_key_file -p
```
- **Cause:** The credential or the private key content is invalid.
- **Recommendation:** Double-check with a tool such as WinSCP to see whether the key file or the password is correct.
### <a name="sftp-copy-activity-failed"></a>SFTP copy activity failed
- **Symptoms:** Error code: UserErrorInvalidColumnMappingColumnNotFound. Error message: `Column 'AccMngr' specified in column mapping cannot be found in source data.`
- **Cause:** The source doesn't include a column named "AccMngr".
- **Resolution:** Double-check how your dataset is configured by mapping the destination dataset columns, to confirm whether such an "AccMngr" column exists.
### <a name="error-code--sftpfailedtoconnecttosftpserver"></a>Error code: SftpFailedToConnectToSftpServer
- **Message:** `Failed to connect to Sftp server '%server;'.`
- **Cause:** If the error message contains "Socket read operation has timed out after 30000 milliseconds", the SFTP server is probably being reached through the wrong linked service type, for example using an FTP linked service to connect to an SFTP server.
- **Recommendation:** Check the port of the target server. By default, SFTP uses port 22.
- **Cause:** If the error message contains "Server response does not contain SSH protocol identification", the SFTP server is probably throttling the connections. ADF creates multiple connections to download from the SFTP server in parallel, and it sometimes hits the SFTP server's limit. In practice, different servers return different errors when the limit is reached.
- **Recommendation:**
Specify the maximum number of concurrent connections of the SFTP dataset as 1 and rerun the copy. If it succeeds, you can be sure that throttling is the cause.
To promote throughput, contact the SFTP administrator to raise the concurrent connection limit, or add the following IPs to the allow list:
- If you're using a managed IR, add the [Azure datacenter IP ranges](https://www.microsoft.com/download/details.aspx?id=57062).
Alternatively, if you don't want to add a large list of IP ranges to the SFTP server's allow list, install a self-hosted IR instead.
- If you're using a self-hosted IR, add the IP of the machine on which the SHIR is installed to the allow list.
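For reference, limiting parallelism is a one-property change on the copy source. The fragment below is hypothetical — the property and structure follow the general copy activity schema, so verify the exact shape against your connector's documentation:

```json
{
    "source": {
        "type": "BinarySource",
        "storeSettings": {
            "type": "SftpReadSettings",
            "recursive": true,
            "maxConcurrentConnections": 1
        }
    }
}
```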
## <a name="sharepoint-online-list"></a>SharePoint Online list
### <a name="error-code--sharepointonlineauthfailed"></a>Error code: SharePointOnlineAuthFailed
- **Message:** `The access token generated failed, status code: %code;, error message: %message;.`
- **Cause:** The service principal ID and key might not be set correctly.
- **Recommendation:** Check that the registered application (service principal ID) and key are set correctly.
## <a name="xml-format"></a>XML format
### <a name="error-code--xmlsinknotsupported"></a>Error code: XmlSinkNotSupported
- **Message:** `Write data in xml format is not supported yet, please choose a different format!`
- **Cause:** An XML dataset was used as the sink dataset in the copy activity.
- **Recommendation:** Use a dataset in a different format as the copy sink.
### <a name="error-code--xmlattributecolumnnameconflict"></a>Error code: XmlAttributeColumnNameConflict
- **Message:** `Column names %attrNames;' for attributes of element '%element;' conflict with that for corresponding child elements, and the attribute prefix used is '%prefix;'.`
- **Cause:** An attribute prefix was used, and it caused the conflict.
- **Recommendation:** Set a different value for the "attributePrefix" property.
### <a name="error-code--xmlvaluecolumnnameconflict"></a>Error code: XmlValueColumnNameConflict
- **Message:** `Column name for the value of element '%element;' is '%columnName;' and it conflicts with the child element having the same name.`
- **Cause:** The name of a child element was used as the column name for the element value.
- **Recommendation:** Set a different value for the "valueColumn" property.
### <a name="error-code--xmlinvalid"></a>Error code: XmlInvalid
- **Message:** `Input XML file '%file;' is invalid with parsing error '%error;'.`
- **Cause:** The input XML file is not well formed.
- **Recommendation:** Correct the XML file to make it well formed.
## <a name="general-copy-activity-error"></a>General copy activity error
### <a name="error-code--jrenotfound"></a>Error code: JreNotFound
- **Message:** `Java Runtime Environment cannot be found on the Self-hosted Integration Runtime machine. It is required for parsing or writing to Parquet/ORC files. Make sure Java Runtime Environment has been installed on the Self-hosted Integration Runtime machine.`
- **Cause:** The self-hosted integration runtime can't find the Java Runtime, which is required for reading particular sources.
- **Recommendation:** Check your integration runtime environment; see the documentation: https://docs.azure.cn/data-factory/format-parquet#using-self-hosted-integration-runtime
### <a name="error-code--wildcardpathsinknotsupported"></a>Error code: WildcardPathSinkNotSupported
- **Message:** `Wildcard in path is not supported in sink dataset. Fix the path: '%setting;'.`
- **Cause:** The sink dataset doesn't support wildcards.
- **Recommendation:** Check the sink dataset, and fix the path so that it doesn't contain a wildcard value.
### <a name="fips-issue"></a>FIPS issue
- **Symptoms:** The copy activity fails on a FIPS-enabled self-hosted integration runtime machine with the following error message: `This implementation is not part of the Windows Platform FIPS validated cryptographic algorithms.`. This can happen when you copy data with connectors such as Azure Blob and SFTP.
- **Cause:** FIPS (Federal Information Processing Standards) defines a specific set of cryptographic algorithms that are allowed to be used. When FIPS mode is enabled on the machine, some cryptographic classes that the copy activity depends on are blocked in certain scenarios.
- **Resolution:** You can read about the current state of FIPS mode in Windows in [this article](https://techcommunity.microsoft.com/t5/microsoft-security-baselines/why-we-8217-re-not-recommending-8220-fips-mode-8221-anymore/ba-p/701037) and evaluate whether you can disable FIPS on the self-hosted integration runtime machine.
On the other hand, if you only want Azure Data Factory to bypass FIPS so that the activity runs succeed, take the following steps:
1. Open the folder where the self-hosted integration runtime is installed (usually under `C:\Program Files\Microsoft Integration Runtime\<IR version>\Shared`).
2. Open "diawp.exe.config" and add `<enforceFIPSPolicy enabled="false"/>` to the `<runtime>` section.
3. Restart the self-hosted integration runtime machine.
## <a name="next-steps"></a>Next steps
For more troubleshooting help, try these resources:
* [MSDN forum](https://social.msdn.microsoft.com/Forums/zh-CN/home)
# zakopane
> **NOTE**: This project is free software, a personal project by j39m.
> However, Google LLC owns the copyright on commits
> `677a32b167502f6d5092add7f95178e81acf4d5d` and newer. This does not
> impact your ability to use and to hack at this free software; I
> provide this notice only for attribution purposes.
`zakopane` is a tool that captures filesystem checksums.
The present implementation can be characterized as "`sha256sum`, but
recursive."
## Usage
```sh
# Captures checksums of all files under <directory>.
zakopane checksum <directory>
# Compares zakopane snapshots <before> and <after> using rules defined
# defined in <config>.
zakopane compare --config <config> <before> <after>
```
## "checksum" Subcommand
There are two noteworthy aspects of the behavior of `zakopane checksum`:
1. `zakopane` does not descend into directories whose names begin with
a dot.
1. `zakopane` does not attempt to stay within the same filesystem and
will happily cross over bind-mounted filesystem boundaries.
## "compare" Subcommand
`zakopane compare` accepts a config file in the form of a YAML document
comprising
* a default policy and
* more specific policies.
Both are optional; in fact, empty YAML documents and YAML dictionaries
with irrelevant keys will be treated as no-op (but valid) configs.
### Policy Values
1. `ignore` tells `zakopane` to do nothing with matching files. It's as
though they don't exist.
1. `noadd` tells `zakopane` to report added files.
1. `nomodify` tells `zakopane` to report modified files.
1. `nodelete` tells `zakopane` to report deleted files.
1. `immutable` is shorthand that means the same thing as
`noadd,nomodify,nodelete` all together.
Policies are joined together (without spaces) by a comma as in the
definition of the `immutable` policy. Order and repetition do not
matter.
### Default Policy
`zakopane compare` determines the default policy
1. by looking for it on the command-line (`--default-policy` or `-d`),
1. by looking for it in the config file (if given), and
1. finally by falling back to a hardcoded default of `immutable`.
## Appendix: Comparison Configurations
```yaml
# Anything not covered by a specific policy should be ignored.
default-policy: ignore
# We only care about paths spelling out prequel memes, it seems.
policies:
./Documents/hello/there: nomodify,nodelete
./Documents/general/kenobi: noadd,nodelete
```
In a `zakopane compare` config, the longest path match wins. Take the
following policies excerpt:
```yaml
policies:
./Documents/: nomodify
./Documents/you/are/a/bold/one/: ignore
```
Then a file named `./Documents/you/are/shorter-than-i-expected.txt` will
be subject to the former `nomodify` rule, while a file named
`./Documents/you/are/a/bold/one/poo/doo.txt` will be subject to the
latter `ignore` rule.
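For illustration, the longest-prefix rule can be sketched in a few lines of Python (this is an illustrative model of the matching behavior, not zakopane's actual implementation):

```python
def policy_for(path, policies, default="immutable"):
    """Longest matching path prefix wins; fall back to the default policy."""
    best = None
    for prefix in policies:
        if path.startswith(prefix) and (best is None or len(prefix) > len(best)):
            best = prefix
    return policies[best] if best is not None else default

policies = {
    "./Documents/": "nomodify",
    "./Documents/you/are/a/bold/one/": "ignore",
}
print(policy_for("./Documents/you/are/shorter-than-i-expected.txt", policies))  # → nomodify
print(policy_for("./Documents/you/are/a/bold/one/poo/doo.txt", policies))       # → ignore
```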
There is no concept of policy "strength;" the longest path match always
wins. Suppose the year is CE 2020, and I'm still actively adding family
photos to the directory of the same year. Here's an appropriate
policies excerpt:
```yaml
policies:
./family-pictures/: immutable
./family-pictures/2020/: nomodify,nodelete
```
The above policies excerpt specifies that new entities may appear under
`./family-pictures/2020`, but existing entities must never change or
disappear. All other entities under `./family-pictures/` must never
change in any way; `zakopane` will visually warn of addition, deletion,
or modification of these.
| 32.388889 | 72 | 0.753288 | eng_Latn | 0.997411 |
d3b0a6e013e368a144ec1c2b8f5c047383bd32d1 | 3,335 | md | Markdown | src/pages/index.md | kokakoola/gatsby-starter-netlify-cms | 278c26228a3034feb8965b635939408c98f3676b | [
"MIT"
] | null | null | null | src/pages/index.md | kokakoola/gatsby-starter-netlify-cms | 278c26228a3034feb8965b635939408c98f3676b | [
"MIT"
] | null | null | null | src/pages/index.md | kokakoola/gatsby-starter-netlify-cms | 278c26228a3034feb8965b635939408c98f3676b | [
"MIT"
] | null | null | null | ---
templateKey: index-page
title: Great coffee with a conscience is the best
image: /img/Frenemies Sõbraenlased.JPG
heading: Great companion who gives work for many years
subheading: Support sustainable farming while enjoying a cup
mainpitch:
title: What to expect from a rehomed Shiba Inu
description: >-
I got Hiro 2,5 years ago. He was a giveaway as his owner saw him unhappy and
understood it does not work out.
description: >-
I knew he is anxious and does not like traffic. But this multitude of
misbehaving was not what I expected: first the very special shiba
characteristic spiced with delayed socialising and missing basic training.
intro:
blurbs:
- image: /img/coffee.png
text: >-
Shibas are ancient native breed. Besides cuteness it means they have to
be socialised early, before 16 weeks. Their basic character how to cope
with stress during lifetime develops between 8-12 months.
My Hiro was adopted at 16 months, he is from a farm and lived in a big
cosy cage and never met a city and had little interaction with humans.
- image: /img/coffee-gear.png
text: >-
Anxiety is common disorder on shibas. Makes sense as they are more
"wild" than todays breeds. Their wild instincts are still there and
shibas are well known survivors in wilderness. They are born hunters,
not lap dogs. They are intelligent and mistrustful preferring to make
their own decisions as they usually see no reason to trust a human.
- image: /img/tutorials.png
text: >
Love a great cup of coffee, but never knew how to make one? Bought a
fancy new Chemex but have no clue how to use it? Don't worry, we’re here
to help. You can schedule a custom 1-on-1 consultation with our baristas
to learn anything you want to know about coffee roasting and brewing.
Email us or call the store for details.
- image: /img/meeting-space.png
text: >
We believe that good coffee has the power to bring people together.
That’s why we decided to turn a corner of our shop into a cozy meeting
space where you can hang out with fellow coffee lovers and learn about
coffee making techniques. All of the artwork on display there is for
sale. The full price you pay goes to the artist.
heading: What we offer
description: >
Kaldi is the ultimate spot for coffee lovers who want to learn about their
java’s origin and support the farmers that grew it. We take coffee
production, roasting and brewing seriously and we’re glad to pass that
knowledge to anyone. This is an edit via identity...
main:
heading: Great coffee with no compromises
description: >
We hold our coffee to the highest standards from the shrub to the cup.
That’s why we’re meticulous and transparent about each step of the coffee’s
journey. We personally visit each farm to make sure the conditions are
optimal for the plants, farmers and the local environment.
image1:
alt: A close-up of a paper filter filled with ground coffee
image: /img/products-grid3.jpg
image2:
alt: A green cup of a coffee on a wooden table
image: /img/products-grid2.jpg
image3:
alt: Coffee beans
image: /img/products-grid1.jpg
---
| 46.319444 | 80 | 0.716042 | eng_Latn | 0.99979 |
d3b0d9fe33ad07033e0830053a0d7b9acb9c3def | 222 | md | Markdown | _posts/2011-12-08-t144764401810026496.md | craigwmcclellan/craigmcclellan.github.io | bd9432ea299f1141442b9ba90eb3aa001984c20d | [
"MIT"
] | 1 | 2018-08-04T15:31:00.000Z | 2018-08-04T15:31:00.000Z | _posts/2011-12-08-t144764401810026496.md | craigwmcclellan/craigmcclellan.github.io | bd9432ea299f1141442b9ba90eb3aa001984c20d | [
"MIT"
] | null | null | null | _posts/2011-12-08-t144764401810026496.md | craigwmcclellan/craigmcclellan.github.io | bd9432ea299f1141442b9ba90eb3aa001984c20d | [
"MIT"
] | 1 | 2018-08-04T15:31:03.000Z | 2018-08-04T15:31:03.000Z | ---
layout: post
microblog: true
audio:
photo:
date: 2011-12-08 07:04:57 -0600
guid: http://craigmcclellan.micro.blog/2011/12/08/t144764401810026496.html
---
A new substituting first today: Field Trip. #nervous #excited
| 22.2 | 74 | 0.747748 | eng_Latn | 0.3254 |
d3b1da34d69632dee9800658855ff6213ba2204d | 10,584 | md | Markdown | api_tips.md | chipshort/rest-api-doc | 9421e6538a64a9b178ec570a6022cce5d28dbdac | [
"CC-BY-4.0"
] | null | null | null | api_tips.md | chipshort/rest-api-doc | 9421e6538a64a9b178ec570a6022cce5d28dbdac | [
"CC-BY-4.0"
] | null | null | null | api_tips.md | chipshort/rest-api-doc | 9421e6538a64a9b178ec570a6022cce5d28dbdac | [
"CC-BY-4.0"
] | null | null | null | ## Tips for using the Crossref REST API (aka- how to avoid being blocked)
The [REST API documentation](https://api.crossref.org) is useful, but it doesn't tell you much about best practice. It also doesn't tell you how you can best avoid getting blocked. With this document hope to remedy that.
Please read this entire document carefully- it includes some advice that may seem counter-intuitive.
It also includes some advice that might be obvious to professional programmers, but that might not be obvious to researchers or others who are just starting out with scripting or interactive research notebooks (e.g. Jupyter).
Instability in our APIs is almost always tracked down to a particular user who has created a script that either:
- performs needlessly complex and inefficient requests
- performs requests that repeatedly trigger server errors (sometimes related to above)
- polls the API too quickly (we rate limit at the IP level, but some users run distributed systems coming in on multiple IPs)
- performs redundant requests
These are not malicious actions. And they are easily correctable. And, in almost all cases, when we advise users that they need to fix their process, they do. And we are never concerned with them again. But, we can’t help everyone with every use case.
Our advice is split into three sections:
- Pick the right service.
- Understand the performance characteristics of the REST API.
- Optimise your requests and pay attention to errors.
### Pick the right service level.
Consider using our “[Polite](https://github.com/CrossRef/rest-api-doc#good-manners--more-reliable-service)” or “[Plus](https://www.crossref.org/services/metadata-retrieval/metadata-plus/)” versions of the REST API.
What does this mean?
There are three ways to access the REST API. In increasing levels of reliability and predictability- they are:
- Anonymously (aka Public)
- With self identification (aka Polite)
- With authentication. (aka Plus)
Why three ways? Because Crossref is committed to providing free, open and as-anonymous-as-possible access to scholarly metadata. We are committed to this because research can often involve controversial subjects. And what is considered “controversial” can vary widely across time and jurisdictions. It is extremely difficult to provide truly “anonymous” access to an internet service. We will always, for example, be able to tie a request to an IP address and we keep this IP information for 90 days. The best we can do is make sure that some people who have the need for extra precautions can access the API without needing to authenticate or explicitly identify themselves.
But this semi-anonymous access is also hard for us to manage. Because the “Public” version of the REST API is free, traffic patterns can vary wildly. Because the service is semi-anonymous- it makes it very hard for us to contact people who are causing problems on the system.
So we offer a compromise as well. If you do not have a requirement for anonymity, you can also self-identify by including contact information in your requests. The service is still open and free, but this way we can quickly get in touch with you if your scripts are causing problems. And in return for providing this contact information, we redirect these requests to a specific "Polite" pool of servers. These servers are generally more reliable because we are more easily able to protect them from misbehaving scripts.
Note that, in asking you to self-identify, we are not asking you to completely give up privacy. We do not sell (or give) this contact information to anybody else and we only use it to contact users who are causing problems. Also- any contact information that you provide in your requests will only stay in our logs for 90 days.
And finally, if you are using our REST API for a production service that requires high predictability- *you should really consider using our paid-for “Plus” service.* This service gets you an authentication token which, in turn, directs your request as a reserved pool of servers that are extremely predictable.
### Understand the performance characteristics of REST API queries.
If you are using the API for simple reference matching, and are not doing any post validation (e.g. your own ranking of the returned results), then just ask for the first two results (`rows=2`). This allows you to identify the best result and ignore any where there is a tie in score on the first two results (e.g. an inconclusive match). If you *are* analyzing and ranking the results yourself, then you can probably get away with just requesting five results (`rows=5`). Anything beyond that is very unlikely to be a match. In either case- restricting the number of rows returned will be more efficient for you and for the API.
<hr/>
For matching references (either complete or partial), use the `query.bibliographic` parameter and minimise the number of other parameters, filters and facets. Most additional parameters, filters and facets will make the query slower *and* less accurate. You might be surprised at this advice as it seems counterintuitive- but we assure you the advice is backed up by many millions of tests.
Specifically, do not construct your queries like this:
```
http://api.crossref.org/works?query.author="Josiah Carberry"&filter=from-pub-date:2008-08-13,until-pub-date:2008-08-13&query.container-title="Journal of Psychoceramics"&query="Toward a Unified Theory of High-Energy Metaphysics"&order=score&sort=desc
```
The above is a massively expensive and slow query. If it doesn’t time-out, you are likely to get a false negative anyway.
And also don’t do this:
```
http://api.crossref.org/works?query="Toward a Unified Theory of High-Energy Metaphysics, Josiah Carberry 2008-08-13"
```
Using the plain `query` parameter will search the entire record- including funder and other non bibliographic elements. This means that it will also match any record that includes the query text in these other elements- resulting in many, many false positives and distorted scores.
If you are trying to match references- the simplest approach is the best. Just use the `query.bibliographic` parameter. It restricts the matching to the bibliographic metadata and the default sort order and scoring mechanism will reliably list the best match first. Restricting the number of rows to `2` allows you to check to see if there is an ambiguous match (e.g. a “tie” in the scores of the first two items returned” (see above tip). So the best way to do the above queries is like this:
```
http://api.crossref.org/works?query.bibliographic="Toward a Unified Theory of High-Energy Metaphysics, Josiah Carberry 2008-08-13"&rows=2
```
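A sketch of the matching loop this implies is below. The function names are ours, not part of the API; the `mailto` parameter is one way to self-identify for the "Polite" pool. Tie detection works on the parsed `items` array from the response's `message`:

```python
from urllib.parse import urlencode

def build_match_url(reference, mailto=None):
    """Build a rows=2 bibliographic query; 'mailto' opts into the polite pool."""
    params = {"query.bibliographic": reference, "rows": 2}
    if mailto:
        params["mailto"] = mailto
    return "https://api.crossref.org/works?" + urlencode(params)

def best_match(items):
    """Return the top item's DOI, or None when the top two scores tie
    (an ambiguous match that should be discarded)."""
    if not items:
        return None
    if len(items) > 1 and items[0]["score"] == items[1]["score"]:
        return None
    return items[0]["DOI"]
```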
### Optimise your requests and pay attention to errors.
If you have an overall error (`4XX` + `5XX`) rate >= 10%, seriously- please *stop* your script and figure out what is going on. Don’t just leave it hammering the API and generating errors- you will just be making other users (and Crossref staff) miserable until you fix your script.
<hr/>
If you get a `404` (not found) when looking up a DOI, do not just endlessly poll Crossref to see if it ever resolves correctly. First check to make sure the DOI is a Crossref DOI. If it is not a Crossref DOI, you can stop checking it with us and try checking it with another registration agency’s API. You can check the registration agency to which a DOI belongs as follows:
```
https://api.crossref.org/works/{doi}/agency
```
<hr/>
Adhere to rate limits. We rate limit by IP- so *yes*, you can “get around” the rate limit by running your scripts on multiple machines with different IPs- but then all you are doing is being inconsiderate of other users. And that makes us grumpy. You won’t like us when we are grumpy. There can be other good reasons to run your scripts on multiple machines with different IPs- but if you do, please continue to respect the overall-rate limit by restricting each process to working at an appropriate sub-rate of the overall rate limit.
<hr/>
Check your errors and respond to them. If you get an error - particularly a timeout error, a rate limit error (`429`), or a server error (`5XX`)- do not just repeat the request or immediately move onto the next request, back-off your request rate. Ideally, back-off exponentially. There are lots of libraries that make this very easy. Since a lot of our API users seem to use Python, here are links to a few libraries that allow you to do this properly:
- [Backoff](https://pypi.org/project/backoff/)
- [Retry](https://pypi.org/project/retry/)
But there are similar libraries for Java, Javascript, R, Ruby, PHP, Clojure, Golang, Rust, etc.
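If you would rather not take a dependency, the same idea fits in a few lines of standard-library Python. This illustrative helper assumes the wrapped call returns an HTTP status and body:

```python
import time

def with_backoff(fn, retries=5, base_delay=1.0, retry_on=(429, 500, 502, 503, 504)):
    """Call fn() until it returns a status outside retry_on, doubling the
    wait between attempts (exponential back-off). fn returns (status, body)."""
    delay = base_delay
    for _ in range(retries):
        status, body = fn()
        if status not in retry_on:
            break
        time.sleep(delay)
        delay *= 2
    return status, body

# Example: succeed on the third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    return (429, "slow down") if attempts["n"] < 3 else (200, "ok")

print(with_backoff(flaky, base_delay=0.1))  # → (200, 'ok')
```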
<hr/>
Make sure you URL-encode DOIs. DOIs can contain lots of characters that need to be escaped properly. We see lots of errors that are simply the result of people not taking care to properly encode their requests. Don’t be one of those people.
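In Python, for example, `urllib.parse.quote` with `safe=""` handles this; the DOI below is made up just to show the troublesome characters:

```python
from urllib.parse import quote

def doi_to_works_url(doi):
    """Percent-encode every reserved character in the DOI -- including '/',
    which quote() would otherwise leave alone."""
    return "https://api.crossref.org/works/" + quote(doi, safe="")

print(doi_to_works_url("10.5555/abc<def>"))
# → https://api.crossref.org/works/10.5555%2Fabc%3Cdef%3E
```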
<hr/>
Cache the results of your requests. We know a lot of our users are extracting DOIs from references or other sources and then looking up their metadata. This means that, often, they will end up looking up metadata for the same DOI multiple times. We recommend that, at a minimum, you cache the results of your requests so that subsequent requests for the same resource don’t hit the API directly. Again, there are some very easy ways to do this using standard libraries. In Python, for example, the following libraries allow you to easily add caching to any function with just a single line of code:
- [Requests-cache](https://pypi.org/project/requests-cache/)
- [Diskcache](https://pypi.org/project/diskcache/)
- [Cachew](https://github.com/karlicoss/cachew#what-is-cachew)
There are similar libraries for other languages.
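Even without those libraries, the standard library's `functools.lru_cache` gives you an in-process cache. The sketch below fakes the HTTP call with a counter so the effect is visible; in real use the function body would perform the request:

```python
import functools

CALLS = {"n": 0}

@functools.lru_cache(maxsize=100_000)
def fetch_metadata(doi):
    """Stand-in for an HTTP GET; the real call would hit api.crossref.org."""
    CALLS["n"] += 1
    return {"DOI": doi}  # pretend this came from the API

fetch_metadata("10.5555/12345678")
fetch_metadata("10.5555/12345678")  # served from the cache; no second request
print(CALLS["n"])  # → 1
```

An in-process cache disappears when your script exits; for longer-running work, the disk-backed libraries above are the better fit.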
<hr/>
If you are using the Plus API, make sure that you are making intelligent use of the snapshots. Only use the API for requesting content that has changed since the start of the month- and use the metadata already in the snapshot for everything else.
<hr/>
Managing the snapshot can be cumbersome as it is inconveniently large-ish. Remember that you do *not have to uncompress and unarchive the snapshot in order to use it.* Most major programming languages have libraries that allow you to open and read files directly from a compressed archive. For example:
- [tarfile](https://docs.python.org/3/library/tarfile.html)
If you parallelize the process of reading data from the snapshot and loading it into your database, you should be able to scale the process linearly with the number of cores you are able to take advantage of.
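A sketch of that streaming pattern, assuming the archive contains newline-delimited JSON members (a real snapshot is far larger and its exact layout may differ, so adjust accordingly). The sample builds a tiny stand-in archive first so the read path is demonstrable:

```python
import io
import json
import os
import tarfile
import tempfile

def iter_records(snapshot_path):
    """Stream JSON records out of a .tar.gz without unpacking it to disk."""
    with tarfile.open(snapshot_path, "r:gz") as tar:
        for member in tar:
            if not member.isfile():
                continue
            fh = tar.extractfile(member)
            for line in io.TextIOWrapper(fh, encoding="utf-8"):
                yield json.loads(line)

# Build a tiny stand-in snapshot to demonstrate the reader.
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "snapshot.tar.gz")
with tarfile.open(path, "w:gz") as tar:
    data = b'{"DOI": "10.5555/12345678"}\n'
    info = tarfile.TarInfo("part-0.jsonl")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

print([r["DOI"] for r in iter_records(path)])  # → ['10.5555/12345678']
```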
# chain
[](https://pkg.go.dev/github.com/rbranson/chain)
A generic implementation of the chaining logic in the errors package in the
standard library. Who even knows what this is useful for tho?
---
title: "MonoTouch:Limitations"
lastmodified: '2009-09-04'
redirect_from:
- /MonoTouch%3ALimitations/
---
MonoTouch:Limitations
=====================
Since applications on the iPhone using MonoTouch are compiled to static code, it is not possible to use any facilities that require code generation at runtime.
These are the MonoTouch limitations compared to desktop Mono/Moonlight:
<table>
<col width="100%" />
<tbody>
<tr class="odd">
<td align="left"><h2>Table of contents</h2>
<ul>
<li><a href="#limited-debugging-support">1 Limited Debugging Support</a></li>
<li><a href="#limited-generics-support">2 Limited Generics Support</a>
<ul>
<li><a href="#generic-virtual-methods">2.1 Generic Virtual Methods</a></li>
<li><a href="#pinvokes-in-generic-types">2.2 P/Invokes in Generic Types</a></li>
<li><a href="#value-types-as-dictionary-keys">2.3 Value types as Dictionary Keys</a></li>
</ul></li>
<li><a href="#no-dynamic-code-generation">3 No Dynamic Code Generation</a>
<ul>
<li><a href="#systemreflectionemit">3.1 System.Reflection.Emit</a></li>
<li><a href="#reverse-callbacks">3.2 Reverse Callbacks</a></li>
</ul></li>
<li><a href="#no-remoting">4 No Remoting</a></li>
<li><a href="#runtime-disabled-features">5 Runtime Disabled Features</a></li>
<li><a href="#only-tested-with-iphoneos-30">6 Only tested with iPhoneOS 3.0</a></li>
</ul></td>
</tr>
</tbody>
</table>
Limited Debugging Support
=========================
There is no support for debugging in MonoTouch, except for low-level debugging with GDB of the runtime.
To learn how to use GDB to debug the runtime, you can read [Debugging](/Debugging).
Most of the debugging is done with Console.WriteLine.
Limited Generics Support
========================
Mono's [Full AOT](/docs/advanced/aot/#full-aot) support has the following limitations with respect to generics:
Generic Virtual Methods
-----------------------
Generic virtual methods aren't supported, as it isn't possible to determine statically what method will be called in all circumstances. (Which is why C++ doesn't support virtual template methods, either.)
``` csharp
class HasGenericVirtualMethod {
    public virtual void PrintValues<T> (params T[] values)
    {
        // ...
    }
}

// ...
var a = new HasGenericVirtualMethod ();
a.PrintValues (new[]{1, 2, 3, 4});
```
P/Invokes in Generic Types
--------------------------
P/Invokes declared within generic types aren't supported:
``` csharp
class GenericType<T> {
[DllImport ("System")]
public static extern int getpid ();
}
```
Value types as Dictionary Keys
------------------------------
Using a value type as a Dictionary\<TKey, TValue\> key is problematic: the default Dictionary constructor attempts to use EqualityComparer\<TKey\>.Default, which in turn attempts to use Reflection to instantiate a new type that implements the IEqualityComparer\<TKey\> interface.
This works for reference types (the reflection + create-a-new-type step is skipped), but for value types it crashes and burns rather quickly once you attempt to use it on the device.
**Workaround**: Manually implement the [IEqualityComparer\<TKey\>](http://docs.go-mono.com/index.aspx?link=T%3aSystem.Collections.Generic.IEqualityComparer%601) interface in a new type and provide an instance of that type to the [Dictionary\<TKey, TValue\>(IEqualityComparer\<TKey\>) constructor](http://docs.go-mono.com/monodoc.ashx?link=C%3aSystem.Collections.Generic.Dictionary%602(System.Collections.Generic.IEqualityComparer%7b%600%7d)).
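A minimal sketch of that workaround (the `Id` value type and `IdComparer` are hypothetical names for illustration, not part of any API):

``` csharp
using System.Collections.Generic;

struct Id {
    public int Value;
}

// A hand-written comparer sidesteps EqualityComparer<Id>.Default,
// and with it the reflection-based type instantiation that fails
// for value-type keys on the device.
class IdComparer : IEqualityComparer<Id> {
    public bool Equals (Id x, Id y)
    {
        return x.Value == y.Value;
    }

    public int GetHashCode (Id obj)
    {
        return obj.Value;
    }
}

// Pass the comparer explicitly to the Dictionary constructor:
var scores = new Dictionary<Id, string> (new IdComparer ());
```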
No Dynamic Code Generation
==========================
Since the iPhone's kernel prevents an application from generating code dynamically, Mono on the iPhone does not support any form of dynamic code generation. This means:
- System.Reflection.Emit is not available.
- No support for System.Runtime.Remoting.
- No support for creating types dynamically (no Type.GetType ("MyType")).
- Reverse callbacks must be registered with the runtime at compile time.
System.Reflection.Emit
----------------------
The lack of System.Reflection.Emit means that no code that depends on runtime code generation will work. This includes things like:
- The regular expression IL generation engine (used by RegexOptions.Compiled).
- The Dynamic Language Runtime.
- Any languages built on top of the Dynamic Language Runtime.
- Remoting's TransparentProxy or anything else that would cause the runtime to generate code dynamically.
Reverse Callbacks
-----------------
In standard Mono it is possible to pass C# delegate instances to unmanaged code in lieu of a function pointer. The runtime typically transforms those delegates into small thunks that allow unmanaged code to call back into managed code.
In Mono these bridges are implemented by the Just-in-Time compiler. When using the ahead-of-time compiler required by the iPhone, there are two important limitations at this point:
- You must flag all of your callback methods with the [MonoPInvokeCallbackAttribute](http://docs.go-mono.com/monodoc.ashx?tlink=20@ecma%3a1%23MonoPInvokeCallbackAttribute%2f).
- The methods have to be static; there is no support for instance methods (this limitation will be removed in the future).
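A minimal sketch of a compliant callback (the delegate type, method names, and the native `set_compare_handler` function are illustrative assumptions; only the MonoPInvokeCallbackAttribute itself is real):

``` csharp
using System;
using System.Runtime.InteropServices;

static class Callbacks {
    delegate int CompareDelegate (IntPtr a, IntPtr b);

    // Hypothetical native function that takes a function pointer.
    [DllImport ("System")]
    static extern void set_compare_handler (CompareDelegate handler);

    // The attribute lets the AOT compiler emit the native-to-managed
    // thunk at build time; note the method must be static.
    [MonoPInvokeCallback (typeof (CompareDelegate))]
    static int OnCompare (IntPtr a, IntPtr b)
    {
        return 0; // comparison logic goes here
    }

    public static void Register ()
    {
        set_compare_handler (OnCompare);
    }
}
```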
No Remoting
===========
The Remoting stack is not available on MonoTouch.
Runtime Disabled Features
=========================
The following features have been disabled in Mono's iPhone Runtime:
- Profiler
- Reflection.Emit
- Reflection.Emit.Save functionality
- COM bindings
- The JIT engine
- Metadata verifier (since there is no JIT)
Only tested with iPhoneOS 3.0
=============================
We have only tested MonoTouch with iPhoneOS 3.0, and we do not know whether it works with older versions.
# google_nav_bar
<img src="https://forthebadge.com/images/badges/built-with-love.svg" height="28px" /> <img src="https://img.shields.io/badge/license-MIT-green?style=for-the-badge" height="28px" /> <a href="https://pub.dev/packages/google_nav_bar"><img src="https://img.shields.io/pub/v/google_nav_bar.svg?style=for-the-badge" height="28px" /></a>
A modern Google-style navigation bar for Flutter.

GoogleNavBar is a Flutter widget designed by [Aurelien Salomon](https://dribbble.com/shots/5925052-Google-Bottom-Bar-Navigation-Pattern/) and developed by [sooxt98](https://www.instagram.com/sooxt98/).
## Getting Started
Add this to your package's `pubspec.yaml` file:
``` yaml
...
dependencies:
  google_nav_bar: ^4.0.2
```
Now in your Dart code, you can use:
``` dart
import 'package:google_nav_bar/google_nav_bar.dart';
```
## Usage
Style your tabs globally with GNav's attributes; if you wish to style a tab separately, use GButton's attributes.
``` dart
GNav(
rippleColor: Colors.grey[800], // tab button ripple color when pressed
hoverColor: Colors.grey[700], // tab button hover color
haptic: true, // haptic feedback
tabBorderRadius: 15,
tabActiveBorder: Border.all(color: Colors.black, width: 1), // tab button border
tabBorder: Border.all(color: Colors.grey, width: 1), // tab button border
tabShadow: [BoxShadow(color: Colors.grey.withOpacity(0.5), blurRadius: 8)], // tab button shadow
curve: Curves.easeOutExpo, // tab animation curves
duration: Duration(milliseconds: 900), // tab animation duration
gap: 8, // the tab button gap between icon and text
color: Colors.grey[800], // unselected icon color
activeColor: Colors.purple, // selected icon and text color
iconSize: 24, // tab button icon size
tabBackgroundColor: Colors.purple.withOpacity(0.1), // selected tab background color
padding: EdgeInsets.symmetric(horizontal: 20, vertical: 5), // navigation bar padding
tabs: [
GButton(
icon: LineIcons.home,
text: 'Home',
),
GButton(
icon: LineIcons.heart_o,
text: 'Likes',
),
GButton(
icon: LineIcons.search,
text: 'Search',
),
GButton(
icon: LineIcons.user,
text: 'Profile',
)
]
)
```
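A minimal sketch of wiring GNav into a page, assuming the `selectedIndex` and `onTabChange` parameters and using stock Material icons in place of the LineIcons package:

``` dart
import 'package:flutter/material.dart';
import 'package:google_nav_bar/google_nav_bar.dart';

class HomePage extends StatefulWidget {
  @override
  _HomePageState createState() => _HomePageState();
}

class _HomePageState extends State<HomePage> {
  int _selectedIndex = 0;

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(child: Text('Tab $_selectedIndex')),
      bottomNavigationBar: SafeArea(
        child: Padding(
          padding: EdgeInsets.symmetric(horizontal: 16, vertical: 8),
          child: GNav(
            gap: 8,
            selectedIndex: _selectedIndex, // keep the bar in sync with state
            onTabChange: (index) => setState(() => _selectedIndex = index),
            tabs: [
              GButton(icon: Icons.home, text: 'Home'),
              GButton(icon: Icons.favorite_border, text: 'Likes'),
              GButton(icon: Icons.search, text: 'Search'),
              GButton(icon: Icons.person, text: 'Profile'),
            ],
          ),
        ),
      ),
    );
  }
}
```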
View the example folder
There are 4 different use cases included in the /example directory, so go try them out!