Dataset columns:
- Instruction — string, 14 to 778 characters
- input_code — string, 0 to 4.24k characters
- output_code — string, 1 to 5.44k characters
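Each record pairs a commit-style instruction with the configuration text before (`input_code`) and after (`output_code`) the change. A minimal sketch of handling one record as a plain mapping, checking it against the length bounds stated above (the field names follow the columns; the `input_code`/`output_code` values here are excerpts, not the full strings):

```python
# Minimal sketch: one dataset record as a plain dict.
# Field names follow the three columns above; the sample is the first record,
# with the code fields shortened to excerpts for readability.
record = {
    "Instruction": "Bump ridedott/merge-me-action from 2.9.79 to 2.9.80",
    "input_code": "uses: ridedott/merge-me-action@v2.9.79",   # excerpt only
    "output_code": "uses: ridedott/merge-me-action@v2.9.80",  # excerpt only
}

# The column stats above give Instruction lengths of 14 to 778 characters.
assert 14 <= len(record["Instruction"]) <= 778
print("record ok:", record["Instruction"])
```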
Bump ridedott/merge-me-action from 2.9.79 to 2.9.80
# See https://github.com/ridedott/merge-me-action/ # This workflow automates merges from patches sent by Dependabot, and # only by dependabot, once the other CI workflows pass name: Auto-merge Dependabot PRs on: workflow_run: types: - completed workflows: - "Continuous Integration" jobs: merge-me: name: Auto-merge Dependabot PRs runs-on: ubuntu-latest steps: - name: Auto-Merge if: ${{ github.event.workflow_run.conclusion == 'success' }} uses: ridedott/merge-me-action@v2.9.79 with: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} MERGE_METHOD: MERGE
# See https://github.com/ridedott/merge-me-action/ # This workflow automates merges from patches sent by Dependabot, and # only by dependabot, once the other CI workflows pass name: Auto-merge Dependabot PRs on: workflow_run: types: - completed workflows: - "Continuous Integration" jobs: merge-me: name: Auto-merge Dependabot PRs runs-on: ubuntu-latest steps: - name: Auto-Merge if: ${{ github.event.workflow_run.conclusion == 'success' }} uses: ridedott/merge-me-action@v2.9.80 with: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} MERGE_METHOD: MERGE
Use actions/checkout@v3 for Gradle Wrapper Validation Action
name: "Validate Gradle Wrapper" on: [push, pull_request] permissions: contents: read jobs: validation: name: "Validation" runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - uses: gradle/wrapper-validation-action@v1
name: "Validate Gradle Wrapper" on: [push, pull_request] permissions: contents: read jobs: validation: name: "Validation" runs-on: ubuntu-latest steps: - uses: actions/checkout@v3 - uses: gradle/wrapper-validation-action@v1
Change database name to botnbot
development: database: 'twb' username: root password: root production: database: 'twb' username: root password: root test: database: 'twb' username: root password: root
development: database: 'botnbot' username: root password: root production: database: 'botnbot' username: root password: root test: database: 'botnbot' username: root password: root
Revert "upload artifact after build"
# This workflow will build a Java project with Maven, and cache/restore any dependencies to improve the workflow execution time # For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven name: Java CI with Maven on: [push] jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v3 - name: Set up JDK 16 uses: actions/setup-java@v3 with: java-version: '16' distribution: 'temurin' cache: maven - name: Build with Maven run: mvn -B package --file pom.xml echo "::set-output name=releaseName::`ls home/runner/work/SubTools/SubTools/SubSort/target/subsort-*.jar | awk -F '(/|.jar)' '{print $8}'`" - name: Upload jar if: success() uses: actions/upload-artifact@v3 with: name: ${{ steps.buildRelease.outputs.releaseName }} path: "home/runner/work/SubTools/SubTools/SubSort/target/subsort-*.jar"
# This workflow will build a Java project with Maven, and cache/restore any dependencies to improve the workflow execution time # For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven name: Java CI with Maven on: [push] jobs: build: runs-on: ubuntu-latest steps: - uses: actions/checkout@v3 - name: Set up JDK 16 uses: actions/setup-java@v3 with: java-version: '16' distribution: 'temurin' cache: maven - name: Build with Maven run: mvn -B package --file pom.xml
Make item factory services private.
parameters: darvin_menu.item_factory.abstract.class: Darvin\MenuBundle\Item\AbstractItemFactory darvin_menu.item_factory.menu_item.class: Darvin\MenuBundle\Item\MenuItemFactory darvin_menu.item_factory.slug_map_item.class: Darvin\MenuBundle\Item\SlugMapItemFactory darvin_menu.item_factory.slug_map_item.uri_route: darvin_content_content_show services: darvin_menu.item_factory.abstract: class: "%darvin_menu.item_factory.abstract.class%" abstract: true # public: false arguments: - "@doctrine.orm.entity_manager" - "@knp_menu.factory" darvin_menu.item_factory.menu_item: class: "%darvin_menu.item_factory.menu_item.class%" parent: darvin_menu.item_factory.abstract arguments: - "@darvin_menu.item_factory.slug_map_item" darvin_menu.item_factory.slug_map_item: class: "%darvin_menu.item_factory.slug_map_item.class%" parent: darvin_menu.item_factory.abstract arguments: - "@router" - "%darvin_menu.item_factory.slug_map_item.uri_route%"
parameters: darvin_menu.item_factory.abstract.class: Darvin\MenuBundle\Item\AbstractItemFactory darvin_menu.item_factory.menu_item.class: Darvin\MenuBundle\Item\MenuItemFactory darvin_menu.item_factory.slug_map_item.class: Darvin\MenuBundle\Item\SlugMapItemFactory darvin_menu.item_factory.slug_map_item.uri_route: darvin_content_content_show services: darvin_menu.item_factory.abstract: class: "%darvin_menu.item_factory.abstract.class%" abstract: true public: false arguments: - "@doctrine.orm.entity_manager" - "@knp_menu.factory" darvin_menu.item_factory.menu_item: class: "%darvin_menu.item_factory.menu_item.class%" parent: darvin_menu.item_factory.abstract arguments: - "@darvin_menu.item_factory.slug_map_item" darvin_menu.item_factory.slug_map_item: class: "%darvin_menu.item_factory.slug_map_item.class%" parent: darvin_menu.item_factory.abstract arguments: - "@router" - "%darvin_menu.item_factory.slug_map_item.uri_route%"
Use same port as faf-stack
faf-api: version: #faf-api.version# map: folder-zip-files: ${MAP_UPLOAD_PATH:/content/faf/vault/maps} folder-preview-path-small: ${MAP_PREVIEW_PATH_SMALL:/content/faf/vault/map_previews/small} folder-preview-path-large: ${MAP_PREVIEW_PATH_LARGE:/content/faf/vault/map_previews/large} spring: application: name: FAF Java API Prototype mvc: favicon: enabled: false datasource: type: com.zaxxer.hikari.HikariDataSource hikari: connection-test-query: SELECT 1 FROM DUAL minimum-idle: 2 maximum-pool-size: 8 jpa: hibernate: naming: physical-strategy: org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl properties: hibernate: current_session_context_class: org.springframework.orm.hibernate5.SpringSessionContext dialect: org.hibernate.dialect.MySQL5Dialect profiles: active: ${API_PROFILE:dev} server: port: ${API_PORT:8080} security: enable-csrf: true management: context-path: /management port: 8081 security: enabled: false
faf-api: version: #faf-api.version# map: folder-zip-files: ${MAP_UPLOAD_PATH:/content/faf/vault/maps} folder-preview-path-small: ${MAP_PREVIEW_PATH_SMALL:/content/faf/vault/map_previews/small} folder-preview-path-large: ${MAP_PREVIEW_PATH_LARGE:/content/faf/vault/map_previews/large} spring: application: name: FAF Java API Prototype mvc: favicon: enabled: false datasource: type: com.zaxxer.hikari.HikariDataSource hikari: connection-test-query: SELECT 1 FROM DUAL minimum-idle: 2 maximum-pool-size: 8 jpa: hibernate: naming: physical-strategy: org.hibernate.boot.model.naming.PhysicalNamingStrategyStandardImpl properties: hibernate: current_session_context_class: org.springframework.orm.hibernate5.SpringSessionContext dialect: org.hibernate.dialect.MySQL5Dialect profiles: active: ${API_PROFILE:dev} server: port: ${API_PORT:8010} security: enable-csrf: true management: context-path: /management port: 8081 security: enabled: false
Remove docker-compose only if it exists
language: python python: - "2.7" sudo: required services: - docker env: global: - DOCKER_VERSION=1.12.0-0~trusty - DOCKER_COMPOSE_VERSION=1.6.2 before_install: - apt-cache madison docker-engine - sudo apt-get -o Dpkg::Options::="--force-confnew" install -y docker-engine=${DOCKER_VERSION} - sudo rm /usr/local/bin/docker-compose - curl -L https://github.com/docker/compose/releases/download/${DOCKER_COMPOSE_VERSION}/docker-compose-`uname -s`-`uname -m` > docker-compose - chmod +x docker-compose - sudo mv docker-compose /usr/local/bin cache: directories: - $HOME/.cache/pip - /var/lib/docker before_script: - docker-compose up -d db - docker-compose run documentregister setuplocaldb script: - docker-compose run documentregister test
language: python python: - "2.7" sudo: required services: - docker env: global: - DOCKER_VERSION=1.12.0-0~trusty - DOCKER_COMPOSE_VERSION=1.6.2 before_install: - apt-cache madison docker-engine - sudo apt-get -o Dpkg::Options::="--force-confnew" install -y docker-engine=${DOCKER_VERSION} - sudo rm -f /usr/local/bin/docker-compose - curl -L https://github.com/docker/compose/releases/download/${DOCKER_COMPOSE_VERSION}/docker-compose-`uname -s`-`uname -m` > docker-compose - chmod +x docker-compose - sudo mv docker-compose /usr/local/bin cache: directories: - $HOME/.cache/pip - /var/lib/docker before_script: - docker-compose up -d db - docker-compose run documentregister setuplocaldb script: - docker-compose run documentregister test
Configure Travis for better build performance
language: ruby rvm: - 1.9.3 - 2.0.0 - 2.1.0 - 2.1.1 - 2.1.2 - jruby-19mode
sudo: false language: ruby cache: bundler rvm: - 1.9.3 - 2.0.0 - 2.1.0 - 2.1.1 - 2.1.2 - jruby-19mode
Use a notice instead of join/message.
language: clojure notifications: irc: "jcsi.ms#qualityclj"
language: clojure notifications: irc: channels: - "jcsi.ms#qualityclj" use_notice: true
Revert "Also test against openjdk7 and oraclejdk7 (in addition to oraclejdk8)"
# Configuration for CI build language: java jdk: - openjdk7 - oraclejdk7 - oraclejdk8
# Configuration for CI build language: java jdk: - oraclejdk8
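Records like the one above can be inspected programmatically: a word-level diff of `input_code` against `output_code` recovers exactly which tokens the edit removed. A minimal sketch using Python's `difflib`, with the revert record above pasted in verbatim:

```python
import difflib

# The "Revert ..." record above, with its flattened before/after strings.
record = {
    "Instruction": 'Revert "Also test against openjdk7 and oraclejdk7 (in addition to oraclejdk8)"',
    "input_code": "# Configuration for CI build language: java jdk: - openjdk7 - oraclejdk7 - oraclejdk8",
    "output_code": "# Configuration for CI build language: java jdk: - oraclejdk8",
}

# ndiff marks deleted tokens with a "- " prefix; collect them to see
# which JDK entries the revert dropped.
removed = [tok[2:] for tok in difflib.ndiff(
    record["input_code"].split(), record["output_code"].split()
) if tok.startswith("- ")]
print(removed)
```

This confirms the revert removed the openjdk7 and oraclejdk7 entries while keeping oraclejdk8.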
Fix Groovy script popup error in Kibana
network.host: 0.0.0.0 # this value is required because we set "network.host" # be sure to modify it appropriately for a production cluster deployment discovery.zen.minimum_master_nodes: 1
network.host: 0.0.0.0 # this value is required because we set "network.host" # be sure to modify it appropriately for a production cluster deployment discovery.zen.minimum_master_nodes: 1 script.inline: true
Add redis service with environment variables
box: pjvds/golang@1.0.1
box: pjvds/golang@1.0.1 services: - wercker/redis@0.0.8 build: steps: - script: name: set environment variables code: export REDIS_ADDR=$WERCKER_REDIS_URL
Update from Hackage at 2017-04-01T23:36:21Z
homepage: '' changelog-type: '' hash: 536eb5ccdb14b53963fd9fbc90f357f7c3e2f4da88c4d2c17d6a2e4123b9f877 test-bench-deps: base: ! '>=4.9.0.0 && <=4.9.1.0' constraints: -any mtl: -any random: -any DeepDarkFantasy: -any maintainer: lolisa@marisa.moe synopsis: A DSL for creating neural network. changelog: '' basic-deps: base: ! '>=4.9.0.0 && <=4.9.1.0' constraints: -any mtl: -any random: -any all-versions: - '0.0.1' - '0.0.1.1' - '0.2017.3.28' - '0.2017.3.30' author: '' latest: '0.2017.3.30' description-type: haddock description: Deep Dark Fantasy(DDF) is a domain specific language that allow one to automatically derive derivative of program in DDF. Hence, one can write neural network in DDF and use the derivative program for gradient descend. license-name: Apache
homepage: '' changelog-type: '' hash: 0dbd16368edcf4cf67ec98a506923cd4500d6ba2f72cca07b469a66a3abdefe4 test-bench-deps: base: ! '>=4.9.0.0 && <=4.9.1.0' constraints: -any mtl: -any random: -any DeepDarkFantasy: -any maintainer: lolisa@marisa.moe synopsis: A DSL for creating neural network. changelog: '' basic-deps: base: ! '>=4.9.0.0 && <=4.9.1.0' constraints: -any mtl: -any random: -any all-versions: - '0.0.1' - '0.0.1.1' - '0.2017.3.28' - '0.2017.3.30' - '0.2017.4.1' author: '' latest: '0.2017.4.1' description-type: haddock description: Deep Dark Fantasy(DDF) is a domain specific language that allow one to automatically derive derivative of program in DDF. Hence, one can write neural network in DDF and use the derivative program for gradient descend. license-name: Apache
Fix MR IID in changelog item
--- title: Impersonation no longer gets stuck on password change. merge_request: 2904 author: type: fixed
--- title: Impersonation no longer gets stuck on password change. merge_request: 15497 author: type: fixed
Update from Hackage at 2016-01-19T00:15:13+0000
homepage: http://github.com/mgsloan/th-reify-many changelog-type: '' hash: fe00ca5b122ab4faa12f8ba41c59b83dba5ce28573bfef0f92e051cfca6976d0 test-bench-deps: base: -any th-reify-many: -any template-haskell: -any maintainer: Michael Sloan <mgsloan at gmail> synopsis: Recurseively reify template haskell datatype info changelog: '' basic-deps: base: ! '>=4 && <5' containers: -any th-expand-syns: -any mtl: -any template-haskell: ! '>=2.5.0.0' safe: -any all-versions: - '0.1.1' - '0.1.2' - '0.1.3' author: Michael Sloan latest: '0.1.3' description-type: haddock description: ! '@th-reify-many@ provides functions for recursively reifying top level declarations. The main intended use case is for enumerating the names of datatypes reachable from an initial datatype, and passing these names to some function which generates instances.' license-name: BSD3
homepage: http://github.com/mgsloan/th-reify-many changelog-type: '' hash: 61020b3bbb609d80962494775bcbc2c2d02fde68f744a9794439c9e9c11168ef test-bench-deps: base: -any th-reify-many: -any template-haskell: -any maintainer: Michael Sloan <mgsloan at gmail> synopsis: Recurseively reify template haskell datatype info changelog: '' basic-deps: base: ! '>=4 && <5' containers: -any th-expand-syns: -any mtl: -any template-haskell: ! '>=2.5.0.0' safe: -any all-versions: - '0.1.1' - '0.1.2' - '0.1.3' - '0.1.4' author: Michael Sloan latest: '0.1.4' description-type: haddock description: ! '@th-reify-many@ provides functions for recursively reifying top level declarations. The main intended use case is for enumerating the names of datatypes reachable from an initial datatype, and passing these names to some function which generates instances.' license-name: BSD3
Use correct script in Circle
machine: node: version: 4.4.7 test: post: # Ensure steps leading up to publishing work. - node_modules/.bin/builder run transpile-dev
machine: node: version: 4.4.7 test: post: # Ensure steps leading up to publishing work. - node_modules/.bin/builder run build-dist
Fix missing mailers queue in Sidekiq
:concurrency: 20 :queues: - [default, 1] - [searchkick, 1]
:concurrency: 20 :queues: - [default, 1] - [searchkick, 1] - [mailers, 1]
Fix detection of tagged release
# This workflow will upload a Python Package using Twine when a release is created # For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries # This workflow uses actions that are not certified by GitHub. # They are provided by a third-party and are governed by # separate terms of service, privacy policy, and support # documentation. name: deploy on: release: types: [published] jobs: deploy: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - name: Set up Python uses: actions/setup-python@v2 with: python-version: '3.x' - name: Install dependencies run: | python -m pip install --upgrade pip pip install build - name: Build package run: python -m build --sdist --wheel --outdir dist/ - name: Publish package if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags') uses: pypa/gh-action-pypi-publish@27b31702a0e7fc50959f5ad993c78deac1bdfc29 with: user: __token__ password: ${{ secrets.PYPI_INTEGRATION }}
# This workflow will upload a Python Package using Twine when a release is created # For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries # This workflow uses actions that are not certified by GitHub. # They are provided by a third-party and are governed by # separate terms of service, privacy policy, and support # documentation. name: deploy on: release: types: [published] jobs: deploy: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - name: Set up Python uses: actions/setup-python@v2 with: python-version: '3.x' - name: Install dependencies run: | python -m pip install --upgrade pip pip install build - name: Build package run: python -m build --sdist --wheel --outdir dist/ - name: Publish package if: startsWith(github.ref, 'refs/tags') uses: pypa/gh-action-pypi-publish@27b31702a0e7fc50959f5ad993c78deac1bdfc29 with: user: __token__ password: ${{ secrets.PYPI_INTEGRATION }}
Upgrade 'kubernetes-dashboard' to version 1.6.3.
apiVersion: extensions/v1beta1 kind: Deployment metadata: labels: k8s-app: kubernetes-dashboard name: kubernetes-dashboard namespace: kube-system spec: replicas: 1 selector: matchLabels: k8s-app: kubernetes-dashboard template: metadata: annotations: scheduler.alpha.kubernetes.io/tolerations: | [ { "key": "dedicated", "operator": "Equal", "value": "master", "effect": "NoSchedule" } ] labels: k8s-app: kubernetes-dashboard spec: containers: - image: gcr.io/google_containers/kubernetes-dashboard-amd64:v1.6.2 imagePullPolicy: Always livenessProbe: httpGet: path: / port: 9090 initialDelaySeconds: 30 timeoutSeconds: 30 name: kubernetes-dashboard ports: - containerPort: 9090 protocol: TCP resources: limits: cpu: 100m memory: 50Mi requests: cpu: 100m memory: 50Mi
apiVersion: extensions/v1beta1 kind: Deployment metadata: labels: k8s-app: kubernetes-dashboard name: kubernetes-dashboard namespace: kube-system spec: replicas: 1 selector: matchLabels: k8s-app: kubernetes-dashboard template: metadata: annotations: scheduler.alpha.kubernetes.io/tolerations: | [ { "key": "dedicated", "operator": "Equal", "value": "master", "effect": "NoSchedule" } ] labels: k8s-app: kubernetes-dashboard spec: containers: - image: gcr.io/google_containers/kubernetes-dashboard-amd64:v1.6.3 imagePullPolicy: Always livenessProbe: httpGet: path: / port: 9090 initialDelaySeconds: 30 timeoutSeconds: 30 name: kubernetes-dashboard ports: - containerPort: 9090 protocol: TCP resources: limits: cpu: 100m memory: 50Mi requests: cpu: 100m memory: 50Mi
Bump ridedott/merge-me-action from v2.8.6 to v2.8.7
name: Merge me! on: check_suite: types: - completed jobs: merge-me: name: Merge me! runs-on: ubuntu-latest steps: - name: Merge me! uses: ridedott/merge-me-action@v2.8.6 with: # Depending on branch protection rules, a manually populated # `GITHUB_TOKEN_WORKAROUND` environment variable with permissions to # push to a protected branch must be used. This variable can have an # arbitrary name, as an example, this repository uses # `GITHUB_TOKEN_DOTTBOTT`. # # When using a custom token, it is recommended to leave the following # comment for other developers to be aware of the reasoning behind it: # # This must be used as GitHub Actions token does not support # pushing to protected branches. GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} MERGE_METHOD: MERGE
name: Merge me! on: check_suite: types: - completed jobs: merge-me: name: Merge me! runs-on: ubuntu-latest steps: - name: Merge me! uses: ridedott/merge-me-action@v2.8.7 with: # Depending on branch protection rules, a manually populated # `GITHUB_TOKEN_WORKAROUND` environment variable with permissions to # push to a protected branch must be used. This variable can have an # arbitrary name, as an example, this repository uses # `GITHUB_TOKEN_DOTTBOTT`. # # When using a custom token, it is recommended to leave the following # comment for other developers to be aware of the reasoning behind it: # # This must be used as GitHub Actions token does not support # pushing to protected branches. GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} MERGE_METHOD: MERGE
Convert flake8 linter output into GitHub annotations
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions # For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions name: Python package on: push: branches: [ master ] pull_request: workflow_dispatch: jobs: build: runs-on: ubuntu-latest strategy: matrix: python-version: [3.7, 3.8, 3.9] steps: - uses: actions/checkout@v2 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v2 with: python-version: ${{ matrix.python-version }} - name: Install dependencies run: | python -m pip install --upgrade pip python -m pip install flake8 pytest if [ -f requirements.txt ]; then pip install -r requirements.txt; fi - name: Lint with flake8 run: | make lint-ci - name: Test with pytest run: | make test
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions # For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions name: Python package on: push: branches: [ master ] pull_request: workflow_dispatch: jobs: build: runs-on: ubuntu-latest strategy: matrix: python-version: [3.7, 3.8, 3.9] steps: - uses: actions/checkout@v2 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v2 with: python-version: ${{ matrix.python-version }} - name: Install dependencies run: | python -m pip install --upgrade pip python -m pip install flake8 pytest if [ -f requirements.txt ]; then pip install -r requirements.txt; fi - name: Set up flake8 annotations uses: rbialon/flake8-annotations@v1 - name: Lint with flake8 run: | make lint-ci - name: Test with pytest run: | make test
Revert "install from setup.py", for some reason it broke the build
language: python python: - "2.7" - "3.2" - "3.3" - "3.4" env: - DJANGO=1.7 - DJANGO=1.8 install: - pip install -q Django==$DJANGO --use-mirrors - pip install coverage - pip install -e git://github.com/django-nose/django-nose.git#egg=django-nose - python setup.py install script: - python runtests.py
language: python python: - "2.7" - "3.2" - "3.3" - "3.4" env: - DJANGO=1.7 - DJANGO=1.8 install: - pip install -q Django==$DJANGO --use-mirrors - pip install coverage - pip install -e git://github.com/django-nose/django-nose.git#egg=django-nose - pip install -q -e . --use-mirrors script: - python runtests.py
Disable firefox testing due to webdriver issues.
os: osx language: node_js cache: - node_modules node_js: - lts/* branches: only: - gh-pages script: - npm test matrix: include: - name: "Chrome Stable" env: BROWSER=chrome addons: chrome: stable - name: "Chrome Beta" env: BROWSER=chrome addons: chrome: beta - name: "Firefox ESR" env: BROWSER=firefox addons: firefox: latest-esr - name: "Firefox Stable" env: BROWSER=firefox addons: firefox: latest - name: "Firefox Beta" env: BROWSER=firefox addons: firefox: latest-beta notifications: email: recipients: forward-webrtc-github@webrtc.org on_success: change on_failure: always
os: osx language: node_js cache: - node_modules node_js: - lts/* branches: only: - gh-pages script: - npm test matrix: include: - name: "Chrome Stable" env: BROWSER=chrome addons: chrome: stable - name: "Chrome Beta" env: BROWSER=chrome addons: chrome: beta # - name: "Firefox ESR" # env: BROWSER=firefox # addons: # firefox: latest-esr # - name: "Firefox Stable" # env: BROWSER=firefox # addons: # firefox: latest # - name: "Firefox Beta" # env: BROWSER=firefox # addons: # firefox: latest-beta notifications: email: recipients: forward-webrtc-github@webrtc.org on_success: change on_failure: always
Add workflow to do cherrypicks
name: Cherry pick commit(s) on: workflow_dispatch: inputs: commit: description: 'Commit to cherrypick' required: true branch: description: 'Branch to cherry pick to' required: true default: 'v5-09-XX' jobs: build: runs-on: ubuntu-latest steps: - name: Decide which branch to use run: | cat << EOF ::set-output name=branch::$(echo ${{ github.event.inputs.tag }}-patches | sed -e's/[a-z][a-z]*-patches$/-patches/') EOF id: decide_release_branch - uses: actions/checkout@v2 with: ref: "${{ steps.decide_release_branch.outputs.branch }}" - name: Update the branch run: | set -e git checkout ${{ steps.decide_release_branch.outputs.branch }} git cherry-pick ${{ github.event.inputs.commit }} git config --global user.email "alibuild@cern.ch" git config --global user.name "ALICE Action Bot" git push
name: Cherry pick commit(s) on: workflow_dispatch: inputs: commit: description: 'Commit to cherrypick' required: true branch: description: 'Branch to cherry pick to' required: true default: 'v5-09-XX' jobs: build: runs-on: ubuntu-latest steps: - name: Decide which branch to use run: | cat << EOF ::set-output name=branch::$(echo ${{ github.event.inputs.branch }}-patches | sed -e's/[a-z][a-z]*-patches$/-patches/') EOF id: decide_release_branch - uses: actions/checkout@v2 with: ref: "${{ steps.decide_release_branch.outputs.branch }}" - name: Update the branch run: | set -e git checkout ${{ steps.decide_release_branch.outputs.branch }} git cherry-pick ${{ github.event.inputs.commit }} git config --global user.email "alibuild@cern.ch" git config --global user.name "ALICE Action Bot" git push
Use new Travis container infrastructure.
language: ruby rvm: - 2.2.3 script: rubocop && rspec
sudo: false language: ruby rvm: - 2.2.3 script: rubocop && rspec
Use `ember try:one` instead of `ember try`.
--- language: node_js node_js: - "0.12" sudo: false cache: directories: - node_modules env: - EMBER_TRY_SCENARIO=default - EMBER_TRY_SCENARIO=ember-release - EMBER_TRY_SCENARIO=ember-beta - EMBER_TRY_SCENARIO=ember-canary matrix: fast_finish: true allow_failures: - env: EMBER_TRY_SCENARIO=ember-canary before_install: - export PATH=/usr/local/phantomjs-2.0.0/bin:$PATH - "npm config set spin false" - "npm install -g npm@^2" install: - npm install -g bower - npm install - bower install script: - ember try $EMBER_TRY_SCENARIO test - npm run test-node
--- language: node_js node_js: - "0.12" sudo: false cache: directories: - node_modules env: - EMBER_TRY_SCENARIO=default - EMBER_TRY_SCENARIO=ember-release - EMBER_TRY_SCENARIO=ember-beta - EMBER_TRY_SCENARIO=ember-canary matrix: fast_finish: true allow_failures: - env: EMBER_TRY_SCENARIO=ember-canary before_install: - export PATH=/usr/local/phantomjs-2.0.0/bin:$PATH - "npm config set spin false" - "npm install -g npm@^2" install: - npm install -g bower - npm install - bower install script: - ember try:one $EMBER_TRY_SCENARIO --- ember test - npm run test-node
Exclude config, test/helpers from CodeClimate
languages: Ruby: false JavaScript: true PHP: false Python: false exclude_paths: - "*.md" - "*.png" - "*.json" - "test/*"
languages: Ruby: false JavaScript: true PHP: false Python: false exclude_paths: - "*.md" - "*.png" - "*.json" - "config/*" - "test/*" - "test/helpers/*"
Use docker volume for postgresql
version: '3' services: db: image: postgres:9.6 web: build: . command: bundle exec rails s -p 3000 -b '0.0.0.0' volumes: - .:/myapp ports: - "3000:3000" depends_on: - db
version: '3' services: db: image: postgres:9.6 volumes: - postgresql-data:/var/lib/postgresql/data web: build: . command: bundle exec rails s -p 3000 -b '0.0.0.0' environment: RAILS_ENV: development volumes: - .:/myapp ports: - "3000:3000" depends_on: - db volumes: postgresql-data: driver: local
Update configuration for DCL Planning
--- site: dclg_planning whitehall_slug: department-for-communities-and-local-government homepage: https://www.gov.uk/government/collections/planning-practice-guidance tna_timestamp: 20140724121321 host: planningguidance.planningportal.gov.uk homepage_furl: www.gov.uk/dclg options: --query-string p aliases: - planningguidance.communities.gov.uk
--- site: dclg_planning whitehall_slug: department-for-communities-and-local-government homepage: https://www.gov.uk/government/collections/planning-practice-guidance homepage_title: Planning guidance tna_timestamp: 20140724121321 host: planningguidance.planningportal.gov.uk options: --query-string p aliases: - planningguidance.communities.gov.uk
Update Decisions to 1.12.2 (18)
Categories: - Sports & Health License: MIT AuthorName: Markus Fisch AuthorEmail: mf@markusfisch.de AuthorWebSite: https://www.markusfisch.de SourceCode: https://github.com/markusfisch/Libra IssueTracker: https://github.com/markusfisch/Libra/issues Changelog: https://github.com/markusfisch/Libra/releases AutoName: Decisions RepoType: git Repo: https://github.com/markusfisch/Libra Builds: - versionName: 1.10.0 versionCode: 14 commit: 1.10.0 subdir: app gradle: - yes - versionName: 1.12.0 versionCode: 16 commit: 1.12.0 subdir: app gradle: - yes - versionName: 1.12.1 versionCode: 17 commit: 1.12.1 subdir: app gradle: - yes AutoUpdateMode: Version %v UpdateCheckMode: Tags CurrentVersion: 1.12.1 CurrentVersionCode: 17
Categories: - Sports & Health License: MIT AuthorName: Markus Fisch AuthorEmail: mf@markusfisch.de AuthorWebSite: https://www.markusfisch.de SourceCode: https://github.com/markusfisch/Libra IssueTracker: https://github.com/markusfisch/Libra/issues Changelog: https://github.com/markusfisch/Libra/releases AutoName: Decisions RepoType: git Repo: https://github.com/markusfisch/Libra Builds: - versionName: 1.10.0 versionCode: 14 commit: 1.10.0 subdir: app gradle: - yes - versionName: 1.12.0 versionCode: 16 commit: 1.12.0 subdir: app gradle: - yes - versionName: 1.12.1 versionCode: 17 commit: 1.12.1 subdir: app gradle: - yes - versionName: 1.12.2 versionCode: 18 commit: e2fcee03b8ead10b592280d4f5ccffedcb4c49b7 subdir: app gradle: - yes AutoUpdateMode: Version %v UpdateCheckMode: Tags CurrentVersion: 1.12.2 CurrentVersionCode: 18
Fix Ansible 2.0 deprecation warning for ‘sudo:’ directive
# site.yml --- - hosts: srv001 sudo: true roles: [] - hosts: srv002 sudo: true roles: []
# site.yml --- - hosts: srv001 become: true roles: [] - hosts: srv002 become: true roles: []
Update base url on config
# For AppEngine safe: false port: 8081 source: site destination: appengine/build/static markdown: kramdown permalink: /articles/:title.html baseurl: /web fundamentals: /fundamentals url: /web highlighter: pygments pygments: true spotlights: false github: root: https://github.com/Google/WebFundamentals content: tree/master/src/site include_open_html: true custom: kramdown: toc_levels: "2" include: ['.htaccess'] exclude: ['config.rb'] # comment this for plain jekyll serve usage. langs_available: ['ko'] sample_link_base: "/web/fundamentals/resources/samples/"
# For AppEngine safe: false port: 8081 source: site destination: appengine/build/static markdown: kramdown permalink: /articles/:title.html baseurl: /web fundamentals: /web/fundamentals url: /web highlighter: pygments pygments: true spotlights: false github: root: https://github.com/Google/WebFundamentals content: tree/master/src/site include_open_html: true custom: kramdown: toc_levels: "2" include: ['.htaccess'] exclude: ['config.rb'] # comment this for plain jekyll serve usage. langs_available: ['ko'] sample_link_base: "/web/fundamentals/resources/samples/"
Update repository URL for Flake8
repos: - repo: https://github.com/psf/black rev: 20.8b1 hooks: - id: black args: [--line-length=80] - repo: https://gitlab.com/pycqa/flake8 rev: "3.8.4" hooks: - id: flake8 - repo: https://github.com/asottile/pyupgrade rev: v2.7.4 hooks: - id: pyupgrade args: [--py36-plus]
repos: - repo: https://github.com/psf/black rev: 20.8b1 hooks: - id: black args: [--line-length=80] - repo: https://github.com/PyCQA/flake8 rev: "3.8.4" hooks: - id: flake8 - repo: https://github.com/asottile/pyupgrade rev: v2.7.4 hooks: - id: pyupgrade args: [--py36-plus]
Update selenium and webdriver versions.
--- # defaults file for selenium selenium_install_dir: /opt selenium_version: "3.4.0" selenium_install_firefox: yes selenium_install_chrome: yes selenium_chromedriver: "2.30" selenium_display_id: "1" selenium_port: 4444 selenium_xvfb_args: "--server-args='-screen 0, 1920x1080x24'" selenium_user: www-admin
--- # defaults file for selenium selenium_install_dir: /opt selenium_version: "3.10.0" selenium_install_firefox: yes selenium_install_chrome: yes selenium_chromedriver: "2.36" selenium_display_id: "1" selenium_port: 4444 selenium_xvfb_args: "--server-args='-screen 0, 1920x1080x24'" selenium_user: www-admin
Test Node 5, and 0.12 only
language: node_js node_js: - "0.12" - "0.10" - "iojs" sudo: false cache: directories: - node_modules notifications: email: false
language: node_js
node_js:
  - "5"
  - "0.12"
sudo: false
cache:
  directories:
    - node_modules
notifications:
  email: false
Install pip in docker playbook
---
- name: Update apt-get once a day
  apt: update_cache=yes cache_valid_time=84600
  sudo: yes

- name: Ensure curl is installed
  apt: name=curl
  sudo: yes

- name: Install docker
  shell: curl -sSL https://get.docker.com/ubuntu/ | sudo sh
  args:
    creates: /usr/bin/docker
  sudo: yes

- name: Ensure docker is running
  service:
    name: docker
    state: running
  sudo: yes

- name: Add user to docker group
  user:
    name: "{{ docker_user }}"
    append: yes
    groups: docker
  sudo: yes

- name: Install docker-py
  pip: name=docker-py
  sudo: yes
---
- name: Update apt-get once a day
  apt: update_cache=yes cache_valid_time=84600
  sudo: yes

- name: Ensure curl is installed
  apt: name=curl
  sudo: yes

- name: Ensure python is installed
  apt: name={{ item }}
  with_items:
    - python
    - python-pip
  sudo: yes

- name: Install docker
  shell: curl -sSL https://get.docker.com/ubuntu/ | sudo sh
  args:
    creates: /usr/bin/docker
  sudo: yes

- name: Ensure docker is running
  service:
    name: docker
    state: running
  sudo: yes

- name: Add user to docker group
  user:
    name: "{{ docker_user }}"
    append: yes
    groups: docker
  sudo: yes

- name: Install docker-py
  pip: name=docker-py
  sudo: yes
Update from Hackage at 2016-11-04T13:24:13Z
homepage: https://github.com/mhwombat/gray-extended
changelog-type: ''
hash: 253345a95219b9eb2fcd743da2424d1eab9731fab974351947ac6c3817bacc8e
test-bench-deps:
  test-framework: ==0.8.*
  base: ==4.*
  test-framework-quickcheck2: ==0.3.*
  gray-extended: -any
  QuickCheck: ==2.7.*
maintainer: amy@nualeargais.ie
synopsis: Gray encoding schemes
changelog: ''
basic-deps:
  base: ==4.*
all-versions:
- '1.2'
- '1.3'
- '1.4'
- '1.5'
- '1.5.1'
author: Amy de Buitléir
latest: '1.5.1'
description-type: haddock
description: ! 'Gray codes satisfy the property that two successive values differ
  in only one digit. Usually the term \"Gray code\" refers to the Binary Reflected
  Gray code (BRGC), but non-binary Gray codes have also been discovered.'
license-name: BSD3
homepage: https://github.com/mhwombat/gray-extended#readme
changelog-type: ''
hash: 992d02d8d7e53e93dd52de2898dbfb886d2cb8f1f51caf07cbe59bfd6dab8115
test-bench-deps:
  test-framework: ==0.8.*
  base: -any
  test-framework-quickcheck2: ==0.3.*
  gray-extended: -any
  QuickCheck: ==2.7.* || ==2.8.* || ==2.9.*
maintainer: amy@nualeargais.ie
synopsis: Gray encoding schemes
changelog: ''
basic-deps:
  base: ! '>=4.7 && <5'
all-versions:
- '1.2'
- '1.3'
- '1.4'
- '1.5'
- '1.5.1'
- '1.5.2'
author: Amy de Buitléir
latest: '1.5.2'
description-type: haddock
description: ! 'Gray codes satisfy the property that two successive values differ
  in only one digit. Usually the term \"Gray code\" refers to the Binary Reflected
  Gray code (BRGC), but non-binary Gray codes have also been discovered.'
license-name: BSD3
Make Circle build the lib before test
# From https://circleci.com/docs/yarn/
machine:
  node:
    version: 9.4.0
dependencies:
  override:
    - yarn
  cache_directories:
    - ~/.cache/yarn
test:
  override:
    - yarn test
# From http://codereview.cc/harbormaster/step/edit/6/
notify:
  webhooks:
    - url: http://codereview.cc/harbormaster/hook/circleci/
# From https://circleci.com/docs/yarn/
machine:
  node:
    version: 9.4.0
dependencies:
  override:
    - yarn
  cache_directories:
    - ~/.cache/yarn
test:
  override:
    - yarn build && yarn test
# From http://codereview.cc/harbormaster/step/edit/6/
notify:
  webhooks:
    - url: http://codereview.cc/harbormaster/hook/circleci/
Update Better schedule to 1.5.1 (9)
AntiFeatures:
  - NonFreeNet
Categories:
  - Internet
  - Science & Education
License: GPL-3.0-or-later
AuthorName: Vít Skalický
AuthorEmail: vit.skalicky@email.cz
SourceCode: https://github.com/vitSkalicky/lepsi-rozvrh
IssueTracker: https://github.com/vitSkalicky/lepsi-rozvrh/issues
AutoName: Better schedule

RepoType: git
Repo: https://github.com/vitSkalicky/lepsi-rozvrh.git

Builds:
  - versionName: '1.0'
    versionCode: 1
    commit: v1.0
    subdir: app
    gradle:
      - yes
  - versionName: '1.1'
    versionCode: 2
    commit: v1.1
    subdir: app
    gradle:
      - yes
  - versionName: 1.3.2
    versionCode: 6
    commit: v1.3.2
    subdir: app
    gradle:
      - yes
    scanignore:
      - app/build.gradle
  - versionName: '1.5'
    versionCode: 8
    commit: v1.5
    subdir: app
    gradle:
      - yes

AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
UpdateCheckIgnore: beta preview
CurrentVersion: '1.5'
CurrentVersionCode: 8
AntiFeatures:
  - NonFreeNet
Categories:
  - Internet
  - Science & Education
License: GPL-3.0-or-later
AuthorName: Vít Skalický
AuthorEmail: vit.skalicky@email.cz
SourceCode: https://github.com/vitSkalicky/lepsi-rozvrh
IssueTracker: https://github.com/vitSkalicky/lepsi-rozvrh/issues
AutoName: Better schedule

RepoType: git
Repo: https://github.com/vitSkalicky/lepsi-rozvrh.git

Builds:
  - versionName: '1.0'
    versionCode: 1
    commit: v1.0
    subdir: app
    gradle:
      - yes
  - versionName: '1.1'
    versionCode: 2
    commit: v1.1
    subdir: app
    gradle:
      - yes
  - versionName: 1.3.2
    versionCode: 6
    commit: v1.3.2
    subdir: app
    gradle:
      - yes
    scanignore:
      - app/build.gradle
  - versionName: '1.5'
    versionCode: 8
    commit: v1.5
    subdir: app
    gradle:
      - yes
  - versionName: 1.5.1
    versionCode: 9
    commit: v1.5.1
    subdir: app
    gradle:
      - yes

AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
UpdateCheckIgnore: beta preview
CurrentVersion: 1.5.1
CurrentVersionCode: 9
Fix postgresql task include when using as galaxy dependency
---
- name: Install PostgreSQL on {{ ansible_distribution }}-{{ ansible_distribution_major_version }}
  include: roles/postgresql/tasks/postgresql_{{ ansible_pkg_mgr }}{{ ansible_distribution_major_version }}.yml
  tags: [db, postgresql]
  when: ansible_os_family == "RedHat"

- name: Install PostgreSQL on {{ ansible_distribution }}-{{ ansible_distribution_major_version }}
  include: roles/postgresql/tasks/postgresql_{{ ansible_pkg_mgr }}.yml
  tags: [db, postgresql]
  when: ansible_os_family == "Debian"
---
- name: Install PostgreSQL on {{ ansible_distribution }}-{{ ansible_distribution_major_version }}
  include: postgresql_{{ ansible_pkg_mgr }}{{ ansible_distribution_major_version }}.yml
  tags: [db, postgresql]
  when: ansible_os_family == "RedHat"

- name: Install PostgreSQL on {{ ansible_distribution }}-{{ ansible_distribution_major_version }}
  include: postgresql_{{ ansible_pkg_mgr }}.yml
  tags: [db, postgresql]
  when: ansible_os_family == "Debian"
Add mongo config settings in collector service templates
heat_template_version: 2016-04-08

description: >
  OpenStack Ceilometer Collector service configured with Puppet

parameters:
  ServiceNetMap:
    default: {}
    description: Mapping of service_name -> network name. Typically set
                 via parameter_defaults in the resource registry. This
                 mapping overrides those in ServiceNetMapDefaults.
    type: json
  DefaultPasswords:
    default: {}
    type: json
  EndpointMap:
    default: {}
    description: Mapping of service endpoint -> protocol. Typically set
                 via parameter_defaults in the resource registry.
    type: json
  MonitoringSubscriptionCeilometerCollector:
    default: 'overcloud-ceilometer-collector'
    type: string

resources:
  CeilometerServiceBase:
    type: ./ceilometer-base.yaml
    properties:
      ServiceNetMap: {get_param: ServiceNetMap}
      DefaultPasswords: {get_param: DefaultPasswords}
      EndpointMap: {get_param: EndpointMap}

outputs:
  role_data:
    description: Role data for the Ceilometer Collector role.
    value:
      service_name: ceilometer_collector
      monitoring_subscription: {get_param: MonitoringSubscriptionCeilometerCollector}
      config_settings:
        get_attr: [CeilometerServiceBase, role_data, config_settings]
      step_config: |
        include ::tripleo::profile::base::ceilometer::collector
heat_template_version: 2016-04-08

description: >
  OpenStack Ceilometer Collector service configured with Puppet

parameters:
  ServiceNetMap:
    default: {}
    description: Mapping of service_name -> network name. Typically set
                 via parameter_defaults in the resource registry. This
                 mapping overrides those in ServiceNetMapDefaults.
    type: json
  DefaultPasswords:
    default: {}
    type: json
  EndpointMap:
    default: {}
    description: Mapping of service endpoint -> protocol. Typically set
                 via parameter_defaults in the resource registry.
    type: json
  MonitoringSubscriptionCeilometerCollector:
    default: 'overcloud-ceilometer-collector'
    type: string

resources:
  CeilometerServiceBase:
    type: ./ceilometer-base.yaml
    properties:
      ServiceNetMap: {get_param: ServiceNetMap}
      DefaultPasswords: {get_param: DefaultPasswords}
      EndpointMap: {get_param: EndpointMap}

  MongoDbBase:
    type: ./database/mongodb-base.yaml
    properties:
      ServiceNetMap: {get_param: ServiceNetMap}
      DefaultPasswords: {get_param: DefaultPasswords}
      EndpointMap: {get_param: EndpointMap}

outputs:
  role_data:
    description: Role data for the Ceilometer Collector role.
    value:
      service_name: ceilometer_collector
      monitoring_subscription: {get_param: MonitoringSubscriptionCeilometerCollector}
      config_settings:
        map_merge:
          - get_attr: [MongoDbBase, role_data, config_settings]
          - get_attr: [CeilometerServiceBase, role_data, config_settings]
      step_config: |
        include ::tripleo::profile::base::ceilometer::collector
Switch to out of memory db:
# SQLite version 3.x
#   gem install sqlite3
#
#   Ensure the SQLite 3 gem is defined in your Gemfile
#   gem 'sqlite3'
development:
  adapter: sqlite3
  database: db/development.sqlite3
  pool: 5
  timeout: 5000

# Warning: The database defined as "test" will be erased and
# re-generated from your development database when you run "rake".
# Do not set this db to the same as development or production.
test:
  adapter: sqlite3
  database: ":memory:"
  timeout: 500

production:
  adapter: sqlite3
  database: db/production.sqlite3
  pool: 5
  timeout: 5000
# SQLite version 3.x
#   gem install sqlite3
#
#   Ensure the SQLite 3 gem is defined in your Gemfile
#   gem 'sqlite3'
development:
  adapter: sqlite3
  database: db/development.sqlite3
  pool: 5
  timeout: 5000

# Warning: The database defined as "test" will be erased and
# re-generated from your development database when you run "rake".
# Do not set this db to the same as development or production.
test:
  adapter: sqlite3
  database: db/test.sqlite3
  timeout: 500

production:
  adapter: sqlite3
  database: db/production.sqlite3
  pool: 5
  timeout: 5000
Drop duplicate python warnings argument in azure template
steps:
- script: |
    # Fix Git SSL errors
    git submodule sync && git submodule update --init --recursive
    pipenv run pytest --junitxml=test-results.xml
  displayName: Run integration tests
  env:
    PYTHONWARNINGS: 'ignore:DEPRECATION'
    PY_EXE: $(PY_EXE)
    GIT_SSL_CAINFO: $(GIT_SSL_CAINFO)
    LANG: $(LANG)
    PIP_PROCESS_DEPENDENCY_LINKS: $(PIP_PROCESS_DEPENDENCY_LINKS)
    PIPENV_DEFAULT_PYTHON_VERSION: $(PIPENV_DEFAULT_PYTHON_VERSION)
    PYTHONWARNINGS: ignore:DEPRECATION
    PIPENV_NOSPIN: '1'
steps:
- script: |
    # Fix Git SSL errors
    git submodule sync && git submodule update --init --recursive
    pipenv run pytest --junitxml=test-results.xml
  displayName: Run integration tests
  env:
    PY_EXE: $(PY_EXE)
    GIT_SSL_CAINFO: $(GIT_SSL_CAINFO)
    LANG: $(LANG)
    PIP_PROCESS_DEPENDENCY_LINKS: $(PIP_PROCESS_DEPENDENCY_LINKS)
    PIPENV_DEFAULT_PYTHON_VERSION: $(PIPENV_DEFAULT_PYTHON_VERSION)
    PYTHONWARNINGS: ignore:DEPRECATION
    PIPENV_NOSPIN: '1'
Remove the obsolete 'sudo:' setting
dist: trusty
sudo: false
group: beta
language: node_js
node_js:
  - node
addons:
  firefox: latest-esr
cache:
  directories:
    - node_modules
    - "$HOME/.cache/bower"
before_install:
  - "if [ -d node_modules ] && [ x$(cat node_modules/.last-node-version 2>/dev/null) != x$(node -e 'console.log(process.version)') ]; then npm rebuild --update-binary && node -e 'console.log(process.version)' > node_modules/.last-node-version; fi"
before_script:
  - npm install web-component-tester bower polymer-cli
  - $(npm bin)/bower install
  - $(npm bin)/polymer lint --rules polymer-2
script:
  - xvfb-run $(npm bin)/wct
dist: trusty
group: beta
language: node_js
node_js:
  - node
addons:
  firefox: latest-esr
cache:
  directories:
    - node_modules
    - "$HOME/.cache/bower"
before_install:
  - "if [ -d node_modules ] && [ x$(cat node_modules/.last-node-version 2>/dev/null) != x$(node -e 'console.log(process.version)') ]; then npm rebuild --update-binary && node -e 'console.log(process.version)' > node_modules/.last-node-version; fi"
before_script:
  - npm install web-component-tester bower polymer-cli
  - $(npm bin)/bower install
  - $(npm bin)/polymer lint --rules polymer-2
script:
  - xvfb-run $(npm bin)/wct
Allow failures on 7 until box is updated
# Use Docker environment
sudo: false

# Setup build matrix
language: php
php:
  - 5.4
  - 5.5
  - 5.6
  - 7.0
  - hhvm
env:
  matrix:
    - PREFER_LOWEST="--prefer-lowest"
    - PREFER_LOWEST=""

# Dependencies
before_install:
  - composer self-update
install:
  - travis_retry composer update --no-interaction --prefer-source $PREFER_LOWEST

script: composer test

# Cache dependencies
cache:
  directories:
    # - vendor
    - $HOME/.composer/cache

# Gitter notifications
notifications:
  webhooks:
    urls:
      - https://webhooks.gitter.im/e/c7e5d662086972567218
    on_success: change
    on_failure: always
    on_start: false
# Use Docker environment
sudo: false

# Setup build matrix
language: php
php:
  - 5.4
  - 5.5
  - 5.6
  - 7.0
  - hhvm
matrix:
  allow_failures:
    - php: 7.0
env:
  matrix:
    - PREFER_LOWEST="--prefer-lowest"
    - PREFER_LOWEST=""

# Dependencies
before_install:
  - composer self-update
install:
  - travis_retry composer update --no-interaction --prefer-source $PREFER_LOWEST

script: composer test

# Cache dependencies
cache:
  directories:
    # - vendor
    - $HOME/.composer/cache

# Gitter notifications
notifications:
  webhooks:
    urls:
      - https://webhooks.gitter.im/e/c7e5d662086972567218
    on_success: change
    on_failure: always
    on_start: false
Test workaround for xbuild problem
before_install:
  - sudo add-apt-repository ppa:directhex/monoxide -y
  - sudo apt-get update -qq -y
  - sudo apt-get install mono-devel -qq -y

script: ./build /p:Configuration=Release /p:Framework=$Framework "/t:DumpSettings;CleanAll;BuildFramework;TestFramework"

env:
  matrix:
    - Framework="net-2.0"
    - Framework="net-3.5"
    - Framework="net-4.0"
    - Framework="net-4.5"
before_install:
  - sudo add-apt-repository ppa:directhex/monoxide -y
  - sudo apt-get update -qq -y
  - sudo apt-get install mono-devel -qq -y

before_script: ./build /p:Configuration=Release /p:Framework=$Framework "/t:DumpSettings;CleanAll"

script: ./build /p:Configuration=Release /p:Framework=$Framework "/t:Build;Test"

env:
  matrix:
    - Framework="net-2.0"
    - Framework="net-3.5"
    - Framework="net-4.0"
    - Framework="net-4.5"
Add python and nodejs dependencies.
language: ruby
rvm:
  - 2.0.0
  - 1.9.3
  - 1.8.7
language: ruby
rvm:
  - 2.0.0
  - 1.9.3
  - 1.8.7
python:
  - "2.7"
node_js:
  - "0.6"
install:
  - pip install git+git://github.com/ipython/ipython.git
Install slycot before control in Travis build
language: python
python:
  - "2.7"
  - "3.3"
  - "3.4"

# install required system libraries
before_install:
  - export DISPLAY=:99.0
  - sh -e /etc/init.d/xvfb start
  # use miniconda to install numpy/scipy, to avoid lengthy build from source
  - if [[ "$TRAVIS_PYTHON_VERSION" == "2.7" ]]; then
      wget http://repo.continuum.io/miniconda/Miniconda-3.4.2-Linux-x86_64.sh -O miniconda.sh;
    else
      wget http://repo.continuum.io/miniconda/Miniconda3-3.4.2-Linux-x86_64.sh -O miniconda.sh;
    fi
  - bash miniconda.sh -b -p $HOME/miniconda
  - export PATH="$HOME/miniconda/bin:$PATH"
  - hash -r
  - conda config --set always_yes yes --set changeps1 no
  - conda update -q conda
  - conda install --yes python=$TRAVIS_PYTHON_VERSION conda-build pip coverage
  - conda config --add channels http://conda.binstar.org/cwrowley
  - conda info -a

# Install packages
install:
  - conda build conda-recipe
  - conda install control --use-local
  - conda install slycot
  - pip install coveralls

# command to run tests
script:
  - coverage run setup.py test

after_success:
  - coveralls
sudo: false
language: python
cache:
  apt: true
  pip: true
  directories:
    - $HOME/.cache/pip
    - $HOME/.local
python:
  - "2.7"
  - "3.3"
  - "3.4"

# install required system libraries
before_install:
  - export DISPLAY=:99.0
  - sh -e /etc/init.d/xvfb start
  # use miniconda to install numpy/scipy, to avoid lengthy build from source
  - if [[ "$TRAVIS_PYTHON_VERSION" == "2.7" ]]; then
      wget http://repo.continuum.io/miniconda/Miniconda-3.4.2-Linux-x86_64.sh -O miniconda.sh;
    else
      wget http://repo.continuum.io/miniconda/Miniconda3-3.4.2-Linux-x86_64.sh -O miniconda.sh;
    fi
  - bash miniconda.sh -b -p $HOME/miniconda
  - export PATH="$HOME/miniconda/bin:$PATH"
  - hash -r
  - conda config --set always_yes yes --set changeps1 no
  - conda update -q conda
  - conda install --yes python=$TRAVIS_PYTHON_VERSION conda-build pip coverage
  - conda config --add channels http://conda.binstar.org/cwrowley
  - conda info -a

# Install packages
install:
  - conda install slycot
  - conda build conda-recipe
  - conda install control --use-local
  - pip install coveralls

# command to run tests
script:
  - coverage run setup.py test

after_success:
  - coveralls
Remove test runners for PHP 5.4 + 5.5
language: php

php:
  - 5.4
  - 5.5
  - 5.6
  - 7.0
  - hhvm

cache:
  directories:
    - $HOME/.composer/cache

before_install:
  - phpenv config-rm xdebug.ini || true

before_script:
  - composer self-update
  - composer install --prefer-source --no-interaction --dev

script: vendor/bin/phpspec run --format=pretty

notifications:
  slack:
    secure: MT5eWBoLLafyRTdFVyig9A/kw68UuKxSiFwFfsd7RaJwJlnR31rfw3oTS6/74Fkr7UQNOclUGSxbX10iIkR9I0ktMayMSDrnDZXNLs3Zn5ciGRjo9zIQ+gw0eqmU1hHyVXvG7pSwoLjqihlkO9J6K/1bUhTaHT+OXvKXZOqJ3Os=
language: php

php:
  - 5.6
  - 7.0
  - hhvm

cache:
  directories:
    - $HOME/.composer/cache

before_install:
  - phpenv config-rm xdebug.ini || true

before_script:
  - composer self-update
  - composer install --prefer-source --no-interaction --dev

script: vendor/bin/phpspec run --format=pretty

notifications:
  slack:
    secure: MT5eWBoLLafyRTdFVyig9A/kw68UuKxSiFwFfsd7RaJwJlnR31rfw3oTS6/74Fkr7UQNOclUGSxbX10iIkR9I0ktMayMSDrnDZXNLs3Zn5ciGRjo9zIQ+gw0eqmU1hHyVXvG7pSwoLjqihlkO9J6K/1bUhTaHT+OXvKXZOqJ3Os=
Switch to 2023.1 Python3 unit tests and generic template name
- project:
    templates:
      - horizon-non-primary-django-jobs
      - check-requirements
      - openstack-python3-zed-jobs
- project:
    templates:
      - horizon-non-primary-django-jobs
      - check-requirements
      - openstack-python3-jobs
Check and install python on the remote host
---
- name: Prepare the enviorment of remote host for ansible
  hoats: all
  gather_factes: False
  pre_tasks:
    - name: Install python for Ansible
      raw: test -e /usr/bin/python || (apt -y update && apt install -y python-simplejson)
      changed_when: False

- name: Set up the development system by vagrant ansible provision
  hosts: all
  roles:
    - base
    - development
---
- name: Prepare the enviorment of remote host for ansible
  hosts: all
  gather_facts: False
  pre_tasks:
    - name: Install python for Ansible
      raw: test -e /usr/bin/python || (apt -y update && apt install -y python-simplejson)
      changed_when: False

- name: Set up the development system by vagrant ansible provision
  hosts: all
  roles:
    - base
    - development
Add pull_request as a trigger for the CI job
name: Node CI

on: [push]

jobs:
  build:
    timeout-minutes: 20
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [12.x]

    steps:
      - uses: actions/checkout@v2
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - name: Install dependencies
        run: yarn install
      - name: Build
        run: yarn build
      - name: Test
        run: yarn test
name: Node CI

on: [push, pull_request]

jobs:
  build:
    timeout-minutes: 20
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [12.x]

    steps:
      - uses: actions/checkout@v2
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - name: Install dependencies
        run: yarn install
      - name: Build
        run: yarn build
      - name: Test
        run: yarn test
Index the jenkins master log in Splunk
# Configure a Jenkins master instance for testeng
# This has the Jenkins Java app, but none of the requirements
# to run the tests.

- name: Configure instance(s)
  hosts: jenkins_master
  sudo: True
  gather_facts: True
  vars:
    COMMON_DATA_DIR: "/mnt"
    COMMON_ENABLE_DATADOG: True
    COMMON_ENABLE_SPLUNKFORWARDER: True
    SPLUNKFORWARDER_LOG_ITEMS:
      - source: '/var/lib/jenkins/jobs/*/builds/*/junitResult.xml'
        recursive: true
        index: 'testeng'
        sourcetype: junit
        followSymlink: false
        crcSalt: '<SOURCE>'
      - source: '/var/lib/jenkins/jobs/*/builds/*/build.xml'
        index: 'testeng'
        recursive: true
        sourcetype: build_result
        followSymlink: false
        crcSalt: '<SOURCE>'
      - source: '/var/lib/jenkins/jobs/*/builds/*/log'
        index: 'testeng'
        recursive: true
        sourcetype: build_log
        followSymlink: false
        crcSalt: '<SOURCE>'
  roles:
    - common
    - role: datadog
      when: COMMON_ENABLE_DATADOG
    - jenkins_master
    # run just the splunkforwarder role by using '--tags "splunkonly"'
    # e.g. ansible-playbook jenkins_testeng_master.yml -i inventory.ini --tags "splunkonly" -vvvv
    - role: splunkforwarder
      when: COMMON_ENABLE_SPLUNKFORWARDER
      tags: splunkonly
      sudo: True
# Configure a Jenkins master instance for testeng
# This has the Jenkins Java app, but none of the requirements
# to run the tests.

- name: Configure instance(s)
  hosts: jenkins_master
  sudo: True
  gather_facts: True
  vars:
    COMMON_DATA_DIR: "/mnt"
    COMMON_ENABLE_DATADOG: True
    COMMON_ENABLE_SPLUNKFORWARDER: True
    SPLUNKFORWARDER_LOG_ITEMS:
      - source: '/var/lib/jenkins/jobs/*/builds/*/junitResult.xml'
        recursive: true
        index: 'testeng'
        sourcetype: junit
        followSymlink: false
        crcSalt: '<SOURCE>'
      - source: '/var/lib/jenkins/jobs/*/builds/*/build.xml'
        index: 'testeng'
        recursive: true
        sourcetype: build_result
        followSymlink: false
        crcSalt: '<SOURCE>'
      - source: '/var/lib/jenkins/jobs/*/builds/*/log'
        index: 'testeng'
        recursive: true
        sourcetype: build_log
        followSymlink: false
        crcSalt: '<SOURCE>'
      - source: '/var/log/jenkins/jenkins.log'
        index: 'testeng'
        recursive: false
        followSymlink: false
  roles:
    - common
    - role: datadog
      when: COMMON_ENABLE_DATADOG
    - jenkins_master
    # run just the splunkforwarder role by using '--tags "splunkonly"'
    # e.g. ansible-playbook jenkins_testeng_master.yml -i inventory.ini --tags "splunkonly" -vvvv
    - role: splunkforwarder
      when: COMMON_ENABLE_SPLUNKFORWARDER
      tags: splunkonly
      sudo: True
Replace build with lint circle.ci dummy
# Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2

general:
  branches:
    ignore:
      - gh-pages

jobs:
  test:
    docker:
      - image: circleci/node:8.9.4
    steps:
      - checkout
      - run: npm install
      - run: npm test
# Javascript Node CircleCI 2.0 configuration file
#
# Check https://circleci.com/docs/2.0/language-javascript/ for more details
#
version: 2

general:
  branches:
    ignore:
      - gh-pages

jobs:
  test:
    docker:
      - image: circleci/node:8.9.4
    steps:
      - checkout
      - run: npm install
      - run: npm test
  build:
    docker:
      - image: circleci/node:8.9.4
    steps:
      - checkout
      - run: npm run lint

workflows:
  version: 2
  test_build:
    jobs:
      - build
      - test
Consolidate gitlab and master branches in to single dev job
gitlab-dev:
  tags:
    - meao
    - gcp
  only:
    - gitlab
  variables:
    NAMESPACE: nucleus-dev
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - CLUSTER_NAME=iowa-b bin/update-config.sh
    - CLUSTER_NAME=frankfurt bin/update-config.sh

master:
  tags:
    - meao
    - gcp
  only:
    - master
  variables:
    NAMESPACE: nucleus-dev
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - bin/update-config.sh

stage:
  tags:
    - meao
    - gcp
  only:
    - stage
  variables:
    NAMESPACE: nucleus-stage
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - bin/update-config.sh

prod:
  tags:
    - meao
    - gcp
  only:
    - prod
  variables:
    NAMESPACE: nucleus-prod
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - bin/update-config.sh
dev:
  tags:
    - meao
    - gcp
  only:
    - gitlab
    - master
  variables:
    NAMESPACE: nucleus-dev
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - CLUSTER_NAME=iowa-b bin/update-config.sh
    - CLUSTER_NAME=frankfurt bin/update-config.sh

stage:
  tags:
    - meao
    - gcp
  only:
    - stage
  variables:
    NAMESPACE: nucleus-stage
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - bin/update-config.sh

prod:
  tags:
    - meao
    - gcp
  only:
    - prod
  variables:
    NAMESPACE: nucleus-prod
  script:
    - docker/bin/build_images.sh
    - docker/bin/push2dockerhub.sh
    - bin/update-config.sh
Update Emacs link in AppVeyor
environment:
  EMACSBIN: emacs-24.4-bin%28i686-pc-mingw32%29.7z

matrix:
  fast_finish: true

install:
  - ps: Start-FileDownload "http://downloads.sourceforge.net/project/emacs-bin/releases/$env:EMACSBIN"
  - ps: 7z x $env:EMACSBIN -oemacs-local | FIND /V "ing "

build_script:
  - cmd: FSharp.AutoComplete\fake.cmd Test

test: off
environment:
  EMACSBIN: emacs-24.4-bin-i686-pc-mingw32.7z

matrix:
  fast_finish: true

install:
  - ps: Start-FileDownload "http://downloads.sourceforge.net/project/emacs-bin/releases/$env:EMACSBIN"
  - ps: 7z x $env:EMACSBIN -oemacs-local | FIND /V "ing "

build_script:
  - cmd: FSharp.AutoComplete\fake.cmd Test

test: off
Test against Ruby 2.7 on AppVeyor
version: 1.0.{build}-{branch}

environment:
  RUBYOPT: -Eutf-8
  matrix:
    - RUBY_VERSION: 200
    - RUBY_VERSION: 200-x64
    - RUBY_VERSION: 21
    - RUBY_VERSION: 21-x64
    - RUBY_VERSION: 22
    - RUBY_VERSION: 22-x64
    - RUBY_VERSION: 23
    - RUBY_VERSION: 23-x64
    - RUBY_VERSION: 24
    - RUBY_VERSION: 24-x64
    - RUBY_VERSION: 25
    - RUBY_VERSION: 25-x64
    - RUBY_VERSION: 26
    - RUBY_VERSION: 26-x64

install:
  - set PATH=C:\Ruby%RUBY_VERSION%\bin;%PATH%
  - bundle install

build: off

before_test:
  - ruby -v
  - gem -v
  - bundle -v

test_script:
  - bundle exec rake
version: 1.0.{build}-{branch}

environment:
  RUBYOPT: -Eutf-8
  matrix:
    - RUBY_VERSION: 200
    - RUBY_VERSION: 200-x64
    - RUBY_VERSION: 21
    - RUBY_VERSION: 21-x64
    - RUBY_VERSION: 22
    - RUBY_VERSION: 22-x64
    - RUBY_VERSION: 23
    - RUBY_VERSION: 23-x64
    - RUBY_VERSION: 24
    - RUBY_VERSION: 24-x64
    - RUBY_VERSION: 25
    - RUBY_VERSION: 25-x64
    - RUBY_VERSION: 26
    - RUBY_VERSION: 26-x64
    - RUBY_VERSION: 27
    - RUBY_VERSION: 27-x64

install:
  - set PATH=C:\Ruby%RUBY_VERSION%\bin;%PATH%
  - bundle install

build: off

before_test:
  - ruby -v
  - gem -v
  - bundle -v

test_script:
  - bundle exec rake
Remove dollar sign in Zephir code
namespace Stub;

class Functions
{
    /**
     * @issue https://github.com/phalcon/zephir/issues/658
     */
    public function filterVar1() -> bool
    {
        var ret;
        let ret = "0";
        return false === filter_var($ret, FILTER_VALIDATE_FLOAT, 20480);
    }

    /**
     * @issue https://github.com/phalcon/zephir/issues/658
     */
    public function filterVar2() -> bool
    {
        var ret;
        let ret = "0";
        return false == filter_var($ret, FILTER_VALIDATE_FLOAT, 20480);
    }
}
namespace Stub;

class Functions
{
    /**
     * @issue https://github.com/phalcon/zephir/issues/658
     */
    public function filterVar1() -> bool
    {
        var ret;
        let ret = "0";
        return false === filter_var(ret, FILTER_VALIDATE_FLOAT, 20480);
    }

    /**
     * @issue https://github.com/phalcon/zephir/issues/658
     */
    public function filterVar2() -> bool
    {
        var ret;
        let ret = "0";
        return false == filter_var(ret, FILTER_VALIDATE_FLOAT, 20480);
    }
}