Dataset columns: Instruction (string, 14–778 chars), input_code (string, 0–4.24k chars), output_code (string, 1–5.44k chars)
Add support for Manila Tempest plugin
---
test_dict:
    test_regex: manila_tempest_tests.tests.api
    whitelist: []
    blacklist: []
    plugins:
        manila_tests:
            repo: "https://github.com/openstack/manila.git"
            package: "python-manila-tests"
Update FIWARE docs with google analytics hook
site_name: FIWARE 3D UI (XML3D)
site_url: https://xml3d.readthedocs.org
repo_url: https://github.com/xml3d/xml3d.js.git
site_description: XML3D 3D UI - Documentation
docs_dir: doc
markdown_extensions: [toc,fenced_code]
use_directory_urls: false
theme: readthedocs
extra_css: ["https://fiware.org/style/fiware_readthedocs.css"]
pages:
  - Home: index.md
  - 'User & Programmers Manual': 'user_guide.md'
  - 'Installation & Administration Manual': 'installation_guide.md'
site_name: FIWARE 3D UI (XML3D)
site_url: https://xml3d.readthedocs.org
repo_url: https://github.com/xml3d/xml3d.js.git
google_analytics: ['UA-84207637-3', 'xml3d.readthedocs.io']
site_description: XML3D 3D UI - Documentation
docs_dir: doc
markdown_extensions: [toc,fenced_code]
use_directory_urls: false
theme: readthedocs
extra_css: ["https://fiware.org/style/fiware_readthedocs.css"]
pages:
  - Home: index.md
  - 'User & Programmers Manual': 'user_guide.md'
  - 'Installation & Administration Manual': 'installation_guide.md'
Add swagger spec for the complex example to its own repository
swagger: '2.0'
info:
  description: |
    Este es un ejemplo complejo para Interoperabilidad
  version: 0.0.1
  title: Servicio Complejo de Referencia
  contact:
    email: contacto@interoperabilidad.digital.gob.cl
  license:
    name: Apache 2.0
    url: http://www.apache.org/licenses/LICENSE-2.0.html
schemes:
  - http
host: complex-service-interop.herokuapp.com
basePath: /complex_example
paths:
  /personas:
    get:
      summary: Listado de personas
      operationId: peopleIndex
      consumes:
        - application/json
      produces:
        - application/json
      responses:
        "200":
          description: successful operation
          schema:
            type: array
            items:
              $ref: '#/definitions/people'
definitions:
  nombreyapellido:
    type: object
    required:
      - nombres
      - apellidos
    properties:
      nombres:
        type: string
      apellidos:
        type: string
  telefono:
    type: number
    minimum: 111111
  datos:
    type: object
    properties:
      telefonos:
        type: array
        items:
          $ref: '#/definitions/telefono'
      email:
        type: string
        format: email
  people:
    type: object
    required:
      - persona
    properties:
      persona:
        $ref: '#/definitions/nombreyapellido'
      datos:
        $ref: '#/definitions/datos'
Update from Hackage at 2022-09-03T17:12:36Z
homepage: https://github.com/TristanCacqueray/haskell-xstatic#readme
changelog-type: markdown
hash: 84b53d5d9af9ac2c8cf805605c6f8bb27a9246419b7cf38baad07b3f9f4fe689
test-bench-deps: {}
maintainer: tdecacqu@redhat.com
synopsis: Low-Fat static file packaging for Haskell projects
changelog: |
  # Changelog

  ## 0.1.0

  - Initial release
basic-deps:
  bytestring: -any
  wai: -any
  base: <5
  containers: -any
  binary: -any
  file-embed: -any
  http-types: -any
all-versions:
- 0.1.0
author: Tristan Cacqueray
latest: 0.1.0
description-type: haddock
description: Low-Fat static file packaging for Haskell projects.
license-name: BSD-3-Clause
Add validation of spec with CI test
os:
  - linux
language: node_js
node_js:
  - "6.11.0"
install:
  - npm install -g swagger-tools
script:
  - swagger-tools validate swagger.json
  - swagger-tools validate swagger_access_token.json
Update from Forestry.io - Updated Forestry configuration
---
new_page_extension: md
auto_deploy: false
admin_path: ''
webhook_url:
sections:
  - type: jekyll-pages
    label: Pages
    create: all
  - type: jekyll-posts
    label: Posts
    create: all
upload_dir: uploads
public_path: "/uploads"
front_matter_path: ''
use_front_matter_path: false
file_template: ":filename:"
instant_preview: true
build:
  preview_env:
    - JEKYLL_ENV=staging
  preview_output_directory: _site
  install_dependencies_command: gem install jekyll
  preview_docker_image: forestryio/build:latest
  mount_path: "/opt/buildhome/repo"
  instant_preview_command: jekyll serve --drafts --unpublished --future --port 8080 --host 0.0.0.0 -d _site
  preview_command: jekyll build --drafts --unpublished --future -d _site
Update from Hackage at 2016-12-16T21:31:09Z
homepage: https://bitbucket.org/tdammers/sprinkles
changelog-type: ''
hash: af5783dacb2acfed2ff3ce07abb7676f44b4bced98486db38b4c6c7577abe5cb
test-bench-deps:
  regex-pcre: -any
  heredoc: -any
  base: -any
  filepath: -any
  data-default: -any
  sprinkles: -any
  regex-base: -any
  tasty-quickcheck: -any
  classy-prelude: -any
  tasty-hunit: -any
  wai-extra: -any
  temporary: -any
  tasty: -any
  directory: -any
maintainer: tdammers@gmail.com
synopsis: JSON API to HTML website wrapper
changelog: ''
basic-deps:
  warp: -any
  regex-pcre: -any
  cereal: -any
  memcached-binary: -any
  bytestring: -any
  wai: -any
  case-insensitive: -any
  Cabal: -any
  base: ! '>=4.7 && <5'
  time: -any
  hsyslog: -any
  HDBC: -any
  aeson-pretty: -any
  unordered-containers: -any
  text: -any
  curl: -any
  ginger: ! '>=0.3.8.0 && <0.4'
  unix-compat: -any
  filepath: -any
  process: -any
  parsec: -any
  data-default: -any
  HDBC-postgresql: -any
  array: -any
  pandoc-types: -any
  random-shuffle: -any
  containers: -any
  pandoc: -any
  utf8-string: -any
  sprinkles: -any
  wai-handler-fastcgi: -any
  mime-types: -any
  HDBC-mysql: -any
  regex-base: -any
  network-uri: -any
  mtl: -any
  HDBC-sqlite3: -any
  classy-prelude: -any
  hashable: -any
  system-locale: -any
  HTTP: -any
  wai-extra: -any
  transformers: -any
  pandoc-creole: -any
  scientific: -any
  http-types: -any
  Glob: -any
  aeson: -any
  template-haskell: -any
  safe: -any
  yaml: -any
  vector: -any
  directory: -any
all-versions:
- '0.3.5.0'
author: Author name here
latest: '0.3.5.0'
description-type: haddock
description: Please see README.md
license-name: BSD3
Add workflow to build gem and store as artifact
name: Build Gem Artifact

on:
  workflow_dispatch:

jobs:
  build_gem:
    name: Build Gem Artifact
    runs-on: "ubuntu-latest"
    strategy:
      fail-fast: true
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v2
      - name: "Set up Ruby 2.7"
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: "2.7"
          bundler-cache: true
      - name: "Set up Node 12"
        uses: actions/setup-node@v2
        with:
          node-version: "12"
          cache: yarn
      - name: Install Frontend Dependencies
        run: yarn install
      - name: Test Backend
        run: bundle exec rspec
      - name: Test and Build Frontend
        run: bash script/build
      - name: Build Gem
        run: gem build jekyll-admin.gemspec
      - name: Stash Gem as Temporary Artifact
        uses: actions/upload-artifact@v2
        with:
          name: "jekyll-admin.gem"
          path: "${{ github.workspace }}/*.gem"
          retention-days: 15
Add a control file for Travis CI
language: c
compiler:
  - gcc
  - clang
install:
  - sudo apt-get update
  - sudo apt-get install libglib2.0-0 libglib2.0-dev gtk-doc-tools
script:
  - ./autogen.sh --disable-introspection --disable-gtk-doc && make && make check
Set up CI with Azure Pipelines
pool:
  vmImage: 'ubuntu-latest' # other options: 'macOS-10.13', 'vs2017-win2016'

steps:
- task: UseDotNet@2
  displayName: 'Install .net core 3.0 (preview)'
  inputs:
    packageType: sdk
    version: '3.1.100'
    installationPath: $(Agent.ToolsDirectory)/dotnet
- script: dotnet restore Thornless.UI.Web/Thornless.UI.Web.csproj
  displayName: "Restore .Net dependencies"
- script: dotnet build Thornless.sln
  displayName: "Build"
- script: dotnet test Thornless.sln
  displayName: "Unit Tests"
- script: dotnet publish Thornless.UI.Web/Thornless.UI.Web.csproj --output $(Build.ArtifactStagingDirectory) --configuration Release
  displayName: "Publish"
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'Hostwinds SSH'
    sourceFolder: '$(Build.ArtifactStagingDirectory)'
    contents: '**'
    targetFolder: '/var/www/thornless.weredev.com.new'
    cleanTargetFolder: true
    failOnEmptySource: true
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
- task: SSH@0
  inputs:
    sshEndpoint: 'Hostwinds SSH'
    runOptions: 'commands'
    commands: 'cd /var/cicd && ./update-site.sh thornless.weredev.com'
  condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))
Set up CI with Azure Pipelines
# Azure Pipelines documentation https://aka.ms/yaml

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
Set up CI with Azure Pipelines
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'
- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
Add basic, is gem installed unit-test
{% set name = "activesupport" %}
{% set version = "4.2.11.1" %}

package:
  name: rb-{{ name|lower }}
  version: {{ version }}

source:
  url: https://rubygems.org/downloads/{{ name }}-{{ version }}.gem
  sha256: 264782453d1912cf99f80b6bbf3f2e0e0592874e740e2c66f4dbad02903a78ed

build:
  noarch: generic
  number: 0
  script:
    - gem install -N -l -V --norc --ignore-dependencies {{ name }}-{{ version }}.gem
    - gem unpack {{ name }}-{{ version }}.gem
  skip: {{ win }}

requirements:
  host:
    - ruby
  run:
    - ruby
    - rb-i18n >=0.7,<1
    - rb-minitest >=5.1,<6
    - rb-thread_safe >=0.3.4,<1
    - rb-tzinfo >=1.1,<2

test:
  commands:
    - gem list -i {{ name }} -v {{ version }}
    - ruby -r {{ name }} -e 'exit 0'

about:
  home: https://rubygems.org/gems/{{ name }}
  license: MIT
  license_family: MIT
  license_file: {{ name }}-{{ version }}/MIT-LICENSE
  summary: A toolkit of support libraries and Ruby core extensions extracted from the Rails framework.
  doc_url: https://www.rubydoc.info/gems/{{ name }}

extra:
  recipe-maintainers:
    - sodre
Set up CI with Azure Pipelines
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: 'Ubuntu 16.04'

steps:
- task: AzureCLI@1
  inputs:
    azureSubscription: 1fbdb4bb-441e-4922-ba1f-ffd04f533ea1
    scriptLocation: inlineScript
    inlineScript: az acr login -g camino -n camino
  displayName: 'Login to ACR.'
- script: PRIVATE_REPO=camino.azurecr.io ./build.sh
  displayName: 'Build and push containers.'
Configure Mergify to automatically merge translations PR every week
pull_request_rules:
  - name: Automatically merge translations
    conditions:
      - "author=weblate"
      - "-conflict"
      - "current-day-of-week=Sat"
    actions:
      merge:
        method: squash
Update from Hackage at 2018-05-31T13:50:31Z
homepage: https://github.com/andrewthad/contiguous
changelog-type: ''
hash: 304410d7d1af70c7fd462a0e9305bd396f60d180fb7238be49df642db5b46c54
test-bench-deps: {}
maintainer: andrew.thaddeus@gmail.com
synopsis: Unified interface for primitive arrays
changelog: ''
basic-deps:
  base: ! '>=4.9 && <5'
  primitive: ! '>=0.6.4'
all-versions:
- '0.1.0.0'
author: Andrew Martin
latest: '0.1.0.0'
description-type: markdown
description: ! '# primitive-class'
license-name: BSD3
Fix Japanese translations (typo) in the 1.2.x branch
fos_user:
  username:
    already_used: γƒ¦γƒΌγ‚ΆγƒΌεγ―ζ—’γ«δ½Ώη”¨γ•γ‚Œγ¦γ„γΎγ™
    blank: ユーアー名をε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ¦γƒΌγ‚ΆγƒΌεγŒηŸ­γ™γŽγΎγ™
    long: γƒ¦γƒΌγ‚ΆγƒΌεγŒι•·γ™γŽγΎγ™
  email:
    already_used: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚Ήγ―ζ—’γ«δ½Ώη”¨γ•γ‚Œγ¦γ„γΎγ™
    blank: パールをドレスをε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚ΉγŒηŸ­γ™γŽγΎγ™γ€
    long: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚ΉγŒι•·γ™γŽγΎγ™
    invalid: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚ΉγŒζ­£γ—γγ‚γ‚ŠγΎγ›γ‚“
  password:
    blank: パスワードをε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ‘γ‚Ήγƒ―γƒΌγƒ‰γŒηŸ­γ™γŽγΎγ™
  new_password:
    blank: 新しいパスワードをε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ‘γ‚Ήγƒ―γƒΌγƒ‰γŒηŸ­γ™γŽγΎγ™
  current_password:
    invalid: 正しいパスワードをε…₯εŠ›γ—γ¦γγ γ•γ„
  group:
    blank: 名前をε…₯εŠ›γ—γ¦γγ γ•γ„
    short: εε‰γŒηŸ­γ™γŽγΎγ™
    long: εε‰γŒι•·γ™γŽγΎγ™
fos_user:
  username:
    already_used: γƒ¦γƒΌγ‚ΆγƒΌεγ―ζ—’γ«δ½Ώη”¨γ•γ‚Œγ¦γ„γΎγ™
    blank: ユーアー名をε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ¦γƒΌγ‚ΆγƒΌεγŒηŸ­γ™γŽγΎγ™
    long: γƒ¦γƒΌγ‚ΆγƒΌεγŒι•·γ™γŽγΎγ™
  email:
    already_used: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚Ήγ―ζ—’γ«δ½Ώη”¨γ•γ‚Œγ¦γ„γΎγ™
    blank: パールをドレスをε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚ΉγŒηŸ­γ™γŽγΎγ™
    long: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚ΉγŒι•·γ™γŽγΎγ™
    invalid: γƒ‘γƒΌγƒ«γ‚’γƒ‰γƒ¬γ‚ΉγŒζ­£γ—γγ‚γ‚ŠγΎγ›γ‚“
  password:
    blank: パスワードをε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ‘γ‚Ήγƒ―γƒΌγƒ‰γŒηŸ­γ™γŽγΎγ™
  new_password:
    blank: 新しいパスワードをε…₯εŠ›γ—γ¦γγ γ•γ„
    short: γƒ‘γ‚Ήγƒ―γƒΌγƒ‰γŒηŸ­γ™γŽγΎγ™
  current_password:
    invalid: 正しいパスワードをε…₯εŠ›γ—γ¦γγ γ•γ„
  group:
    blank: 名前をε…₯εŠ›γ—γ¦γγ γ•γ„
    short: εε‰γŒηŸ­γ™γŽγΎγ™
    long: εε‰γŒι•·γ™γŽγΎγ™
Update from Hackage at 2019-03-05T14:15:22Z
homepage: ''
changelog-type: markdown
hash: 96f675063849889457667d4c494a444010613d3612a2ab40f871b212232538a0
test-bench-deps: {}
maintainer: stephan.schiffels@mac.com
synopsis: A package with basic parsing utilities for several Bioinformatic data formats.
changelog: |
  V 1.1.4.1: First entry in the Changelog. Added Haddock documentation to all modules and prepare for releasing on Hackage.
basic-deps:
  MissingH: ! '>=1.4.0.1 && <1.5'
  exceptions: ! '>=0.8.3 && <0.9'
  bytestring: ! '>=0.10.8.2 && <0.11'
  pipes-text: ! '>=0.0.2.5 && <0.1'
  lens-family: ! '>=1.2.2 && <1.3'
  split: ! '>=0.2.3.3 && <0.3'
  base: ! '>=4.7 && <5'
  pipes-bytestring: ! '>=2.1.6 && <2.2'
  text: ! '>=1.2.2.2 && <1.3'
  turtle: ! '>=1.4.5 && <1.5'
  containers: ! '>=0.5.10.2 && <0.6'
  pipes: ! '>=4.3.7 && <4.4'
  hslogger: ! '>=1.2.10 && <1.3'
  pipes-attoparsec: ! '>=0.5.1.5 && <0.6'
  foldl: ! '>=1.3.7 && <1.4'
  data-memocombinators: ! '>=0.5.1 && <0.6'
  attoparsec: ! '>=0.13.2.2 && <0.14'
  transformers: ! '>=0.5.2.0 && <0.6'
  errors: ! '>=2.2.2 && <2.3'
  pipes-safe: ! '>=2.2.6 && <2.3'
  vector: ! '>=0.12.0.1 && <0.13'
all-versions:
- 1.1.4.1
author: Stephan Schiffels
latest: 1.1.4.1
description-type: markdown
description: |
  Sequence-Formats is a Haskell package to provide file parsers and writers for some commonly, and less commonly used file formats in Bioinformatics, speficially population genetics. The library makes heavy use of the [pipes library](http://hackage.haskell.org/package/pipes) to provide Producers and Consumers for reading and writing files.
license-name: GPL-3.0-only
Switch AppVeyor CI to use conda env
# CI on Windows via appveyor
# This file was based on Olivier Grisel's python-appveyor-demo

branches:
  except:
    - fix-docs

environment:
  matrix:
    - PYTHON: "C:\\Python27-conda32"
      PYTHON_VERSION: "2.7"
      PYTHON_ARCH: "32"
    - PYTHON: "C:\\Python36-conda64"
      PYTHON_VERSION: "3.6"
      PYTHON_ARCH: "64"

install:
  # Install miniconda Python
  - "powershell ./ci/install_python.ps1"

  # Prepend newly installed Python to the PATH of this build (this cannot be
  # done from inside the powershell script as it would require to restart
  # the parent CMD process).
  - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%"

  # Check that we have the expected version and architecture for Python
  - "python --version"
  - "python -c \"import struct; print(struct.calcsize('P') * 8)\""

  # install xarray and depenencies
  - "conda install --yes --quiet pip pytest numpy pandas scipy netCDF4 matplotlib dask"
  - "python setup.py install"

build: false

test_script:
  - "py.test xarray --verbose"
# CI on Windows via appveyor
# This file was based on Olivier Grisel's python-appveyor-demo

branches:
  except:
    - fix-docs

environment:
  matrix:
    - PYTHON: "C:\\Python27-conda32"
      PYTHON_VERSION: "2.7"
      PYTHON_ARCH: "32"
      CONDA_ENV: "py27-cdat+pynio"
    - PYTHON: "C:\\Python36-conda64"
      PYTHON_VERSION: "3.6"
      PYTHON_ARCH: "64"
      CONDA_ENV: "py36"

install:
  # Install miniconda Python
  - "powershell ./ci/install_python.ps1"

  # Prepend newly installed Python to the PATH of this build (this cannot be
  # done from inside the powershell script as it would require to restart
  # the parent CMD process).
  - "SET PATH=%PYTHON%;%PYTHON%\\Scripts;%PATH%"

  # Check that we have the expected version and architecture for Python
  - "python --version"
  - "python -c \"import struct; print(struct.calcsize('P') * 8)\""

  # install xarray and dependencies
  - "conda env create --file ./ci/requirements-%CONDA_ENV%.yml"
  - "activate test_env"
  - "python setup.py install"

build: false

test_script:
  - "py.test xarray --verbose"
Add github action to publish release asset
name: Upload release asset(s)

on:
  release:
    types: ["published"]

jobs:
  build:
    runs-on: ubuntu-latest
    timeout-minutes: 30
    steps:
      - name: Checkout repo
        uses: actions/checkout@v2
      - name: Set up JDK
        uses: actions/setup-java@v1
        with:
          java-version: 11
      - name: Decrypt secrets
        run: ./release/decrypt-secrets.sh
        env:
          ENCRYPT_KEY: ${{ secrets.ENCRYPT_KEY }}
      - name: Generate build number
        run: echo "BUILD_NUMBER=${{github.run_number}}" > app/build_number.properties
      - name: Build Release
        run: ./gradlew app:bundleRelease --stacktrace
      - name: Upload aab
        uses: actions/upload-release-asset@v1
        with:
          upload_url: ${{ github.event.release.upload_url }}
          asset_path: app/build/outputs/bundle/release/app-release.aab
          asset_name: pb-android-prod-release.aab
          asset_content_type: application/zip
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
Set up CI with Azure Pipelines
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

pool:
  vmImage: ubuntu-latest

strategy:
  matrix:
    Python36:
      python.version: '3.6'
    Python37:
      python.version: '3.7'
    Python38:
      python.version: '3.8'
    Python39:
      python.version: '3.9'
    Python310:
      python.version: '3.10'

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements_dev.txt
    pip install -r requirements.txt
    pip install -e .[dev,test,docs,ci]
  displayName: 'Install dependencies'

- script: |
    download_eniric_data.sh  # Some files from Dropbox
  displayName: 'Download the data for eniric and testing.'

- script: |
    make atmos
  displayName: 'Preparing atmosphere models.'

- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'

- script: |
    pip install pytest pytest-azurepipelines
    pytest --cov --durations=10
  displayName: 'pytest coverage'
Allow the server to start
language: python
dist: trusty
python:
  - "2.7"
sudo: required
cache:
  pip: true
  directories:
    - $HOME/buildout-cache
addons:
  apt:
    packages:
      - chromium-browser
      - chromium-chromedriver
virtualenv:
  system_site_packages: true
before_install:
  - wget http://chromedriver.storage.googleapis.com/2.21/chromedriver_linux64.zip
  - unzip chromedriver_linux64.zip
  - sudo chmod u+x chromedriver
  - sudo mv chromedriver /usr/bin/
  - export CHROME_BIN=chromium-browser
  - "export DISPLAY=:99.0"
  - "sh -e /etc/init.d/xvfb start"
  - sleep 3
install:
  - pip install selenium
script:
  - python -m SimpleHTTPServer 8000 &
  - python test.py
  - pkill -9 python
language: python
dist: trusty
python:
  - "2.7"
sudo: required
cache:
  pip: true
  directories:
    - $HOME/buildout-cache
addons:
  apt:
    packages:
      - chromium-browser
      - chromium-chromedriver
virtualenv:
  system_site_packages: true
before_install:
  - wget http://chromedriver.storage.googleapis.com/2.21/chromedriver_linux64.zip
  - unzip chromedriver_linux64.zip
  - sudo chmod u+x chromedriver
  - sudo mv chromedriver /usr/bin/
  - export CHROME_BIN=chromium-browser
  - "export DISPLAY=:99.0"
  - "sh -e /etc/init.d/xvfb start"
  - sleep 3
install:
  - pip install selenium
script:
  - python -m SimpleHTTPServer 8000 &
  - sleep 3
  - python test.py
  - pkill -9 python
Update consul to run in a multi-host swarm mode docker engine
version: "3"
services:
  #
  # Deploying a Consul cluster using this file assumes that overlay network 'voltha_net'
  # has already been created. To deploy the cluster, issue the command:
  #
  #     docker stack deploy -c docker-compose-consul-cluster.yml consul
  #
  # This command will create overlay network 'consul_net'.
  #
  consul:
    image: consul:latest
    # Deploy to all docker manager nodes
    deploy:
      mode: global
      placement:
        constraints:
          - node.role == manager
      restart_policy:
        condition: on-failure
    environment:
      CONSUL_LOCAL_CONFIG: "{disable_update_check: true}"
      CONSUL_BIND_INTERFACE: eth0
    entrypoint:
      - consul
      - agent
      - -server
      - -bootstrap-expect=3
      - -config-dir=/consul/config
      - -data-dir=/consul/data  # mandatory property
      - -bind={{ GetInterfaceIP "eth0" }}
      - -client=0.0.0.0
      - -ui
      - -retry-join=10.10.10.2
      - -retry-join=10.10.10.3
      - -retry-join=10.10.10.4
      - -retry-join=10.10.10.5
      - -retry-join=10.10.10.6
      - -retry-join=10.10.10.7
    networks:
      - net
      - voltha-net
    ports:
      - "8300:8300"
      - "8400:8400"
      - "8500:8500"
      - "8600:8600/udp"

networks:
  net:
    driver: overlay
    ipam:
      driver: default
      config:
        - subnet: 10.10.10.0/29
  voltha-net:
    external:
      name: voltha_net
Add the Elk standalone example cluster config
---
#--------------------------------------------
# ELK Standalone server.
#--------------------------------------------
name: ELKStandalone
environment: awstest_useast1
credentials_file: "/home/vagrant/.aws/credentials"
profile_name: "default"
public_key_loc: "/tmp/symphonykey.pub"
private_key_loc: "/tmp/symphonykey"
connection_info:
  username: "ec2-user"
clusters:
  elk:
    name: elk-cluster
    public_key_loc: "/tmp/symphonykey.pub"
    private_key_loc: "/tmp/symphonykey"
    cluster_size: 1
    cluster_template: basic_instance
    tags:
      Project: "ELK"
      ApplicationRole: "Test"
      Name: "ELK-${count.index}"
      Cluster: "ElkStandalone"
      Environment: "devtest"
      BusinessUnit: "ACME"
      Owner: "behzad_dastur"
      OwnerEmail: "behzad_dastur"
    connection_info:
      username: "ec2-user"
    services:
      logstash:
        logstash_listen_port_beats: 5044
        logstash_elasticsearch_hosts:
          - "http://{{ ansible_default_ipv4['address'] }}:9200"
        logstash_local_syslog_path: /var/log/syslog
        logstash_monitor_local_syslog: true
        logstash_enabled_on_boot: yes
Add changelog for deleting types from nuggets table
databaseChangeLog:
  - changeSet:
      id: 73
      author: Pontus Doverstav
      changes:
        - dropColumn:
            schemaName: contentschema
            tableName: nuggets
            columnName: type
Add starter TravisCI configuration. Will pick up Travis profile from new parent module when released.
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

language: java
sudo: false
jdk:
  - openjdk7
  - oraclejdk8
after_success:
  - mvn clean cobertura:cobertura coveralls:report
Set elastic search java and docker runtime memory limits
version: '2'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=saleor
      - POSTGRES_PASSWORD=saleor
    ports:
      - '127.0.0.1:5432:5432'
  redis:
    image: redis
    ports:
      - '127.0.0.1:6379:6379'
  search:
    image: elasticsearch:5.4
    ports:
      - '127.0.0.1:9200:9200'
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    environment:
      - DATABASE_URL=postgres://saleor:saleor@db/saleor
      - DEFAULT_FROM_EMAIL=noreply@example.com
      - ELASTICSEARCH_URL=http://search:9200
      - OPENEXCHANGERATES_API_KEY
      - REDIS_URL=redis://redis:6379/0
      - SECRET_KEY=changeme
    depends_on:
      - db
      - redis
      - search
    ports:
      - '8000:8000'
    volumes:
      - .:/app:Z
version: '2'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_USER=saleor
      - POSTGRES_PASSWORD=saleor
    ports:
      - '127.0.0.1:5432:5432'
  redis:
    image: redis
    ports:
      - '127.0.0.1:6379:6379'
  search:
    image: elasticsearch:5.4.3
    mem_limit: 1g
    environment:
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ports:
      - '127.0.0.1:9200:9200'
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    environment:
      - DATABASE_URL=postgres://saleor:saleor@db/saleor
      - DEFAULT_FROM_EMAIL=noreply@example.com
      - ELASTICSEARCH_URL=http://search:9200
      - OPENEXCHANGERATES_API_KEY
      - REDIS_URL=redis://redis:6379/0
      - SECRET_KEY=changeme
    depends_on:
      - db
      - redis
      - search
    ports:
      - '8000:8000'
    volumes:
      - .:/app:Z
Update from Hackage at 2017-06-03T16:56:07Z
homepage: ''
changelog-type: markdown
hash: 6c48593858404378006c9936b41f3627c9a524c5ce40e64d5c439f84207415f5
test-bench-deps: {}
maintainer: portnov84@rambler.ru
synopsis: Full-weight string formatting library, analog of Python's string.format
changelog: ! '# Revision history for text-format-heavy

  ## 0.1.0.0 -- YYYY-mm-dd

  * First version. Released on an unsuspecting world.'
basic-deps:
  base: ! '>=4.8 && <5'
  time: ! '>=1.5'
  text: ! '>=1.2 && <1.3'
  parsec: ! '>=3.1 && <3.2'
  data-default: ! '>=0.7'
  containers: ! '>=0.5'
all-versions:
- '0.1.0.0'
author: Ilya Portnov
latest: '0.1.0.0'
description-type: haddock
description: ! 'This package contains full-featured string formatting function, similar to Python''s string.format. Features include:

  * Automatically numbered variable placeholders;

  * Positional variable placeholders;

  * Named variable placeholders;

  * Placeholders can be used in any order; one variable can be used several times or not used at all.

  * Specific format can be used for each variable substitution.

  This package prefers functionality over "light weight" and (probably) performance. It also exposes all required interfaces to extend and customize it. For more details, please refer to <https://github.com/portnov/text-format-heavy/wiki Wiki>. See also the @examples/@ directory.'
license-name: BSD3
Add global, WordPress-specific list of variables
---
# WordPress version settings
wp_version: 4.5.3
wp_sha256sum: fd94288cd6fc657b2d8061737fcb121fc6acbe18acfcff80661e49fd2d3ee17c

# MySQL settings
mysqlservice: mysqld
mysql_port: 3306

# WordPress database settings
wp_db_name: wordpress
wp_db_user: wordpress
wp_db_password: ThisIsNotSecure

# Apache server configuration
apache_port: 80
server_hostname: derezzed.justinwflory.com

# Disable All Updates
# By default automatic updates are enabled, set this value to true to disable all automatic updates
auto_up_disable: false

# Define Core Update Level
# true  = Development, minor, and major updates are all enabled
# false = Development, minor, and major updates are all disabled
# minor = Minor updates are enabled, development, and major updates are disabled
core_update_level: true
Add a github workflow testing the outbox
name: ruby_event_store-outbox

on: [push]

jobs:
  test:
    runs-on: ubuntu-20.04
    env:
      GEM_NAME: ruby_event_store-outbox
      BUNDLE_GEMFILE: ${{ matrix.gemfile }}
      DATABASE_URL: ${{ matrix.database }}
      DATA_TYPE: ${{ matrix.datatype }}
      GEM_LOCATION: contrib/ruby_event_store-outbox
    strategy:
      fail-fast: false
      matrix:
        include:
          - ruby: ruby-2.7
            gemfile: Gemfile
            database: sqlite3:db.sqlite3
            datatype: binary
    steps:
      - uses: actions/checkout@v2
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: ${{ matrix.ruby }}
          bundler-cache: true
          working-directory: ${{ env.GEM_LOCATION }}
      - run: make test
        working-directory: ${{ env.GEM_LOCATION }}
Include CodeClimate config on github
version: "2" # required to adjust maintainability checks
checks:
  argument-count:
    config:
      threshold: 8
  complex-logic:
    config:
      threshold: 4
  file-lines:
    config:
      threshold: 250
  method-complexity:
    config:
      threshold: 15
  method-count:
    config:
      threshold: 20
  method-lines:
    config:
      threshold: 50
  nested-control-flow:
    config:
      threshold: 4
  return-statements:
    config:
      threshold: 4
  similar-code:
    enabled: false
  identical-code:
    enabled: false
plugins:
  sonar-java:
    enabled: true
    config:
      sonar.java.source: "7"
      tests_patterns:
        - Hyperrail/src/test/**
        - Hyperrail/src/androidTest/**
        - OpenTransport/src/test/**
        - OpenTransport/src/androidTest/**
Use github action for CI
name: Checks

on: pull_request

jobs:
  check:
    name: Checks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@master
      - uses: actions/setup-node@v1
        with:
          node-version: '17.6.0'
      - run: export CHROME_BIN=/usr/bin/google-chrome
      - run: export DISPLAY=:99.0
      - run: sh -e /etc/init.d/xvfb start
      - run: sudo apt-get update
      - run: sudo apt-get install -y libappindicator1 fonts-liberation
      - run: wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
      - run: sudo dpkg -i google-chrome*.deb
      - run: yarn
      - run: yarn add node-forge -P
Add bzip2 to satisfy bzlib req
# Jinja variables help maintain the recipe as you'll update the version only here.
{% set version = "1.1.13" %}
{% set major_version = "1.1" %}
{% set sha1 = "f61e65a7616a3492815d18689c202d0685fe167d" %}
{% set md5 = "4cb87da2dba05540afce162f34b3a9a6" %}

package:
  name: pbzip2
  version: {{ version }}

source:
  fn: pbzip2-{{ version }}.tar.gz
  url: https://launchpad.net/pbzip2/{{ major_version }}/{{ version }}/+download/pbzip2-{{ version }}.tar.gz
  sha1: {{ sha1 }}
  md5: {{ md5 }}

build:
  number: 0
  skip: True  # [win]
  script: 'make PREFIX=$PREFIX && make install PREFIX=$PREFIX'

test:
  commands:
    - pbzip2 --version

about:
  home: http://compression.ca/pbzip2/
  license: pbzip2 License
  license_file: COPYING
  summary: PBZIP2 is a parallel implementation of the bzip2 block-sorting file compressor

extra:
  recipe-maintainers:
    - shahin
{% set version = "1.1.13" %}
{% set major_version = "1.1" %}
{% set sha1 = "f61e65a7616a3492815d18689c202d0685fe167d" %}
{% set md5 = "4cb87da2dba05540afce162f34b3a9a6" %}

package:
  name: pbzip2
  version: {{ version }}

source:
  fn: pbzip2-{{ version }}.tar.gz
  url: https://launchpad.net/pbzip2/{{ major_version }}/{{ version }}/+download/pbzip2-{{ version }}.tar.gz
  sha1: {{ sha1 }}
  md5: {{ md5 }}

build:
  number: 0
  skip: True  # [win]
  script: 'make PREFIX=$PREFIX && make install PREFIX=$PREFIX'

requirements:
  build:
    - bzip2

test:
  commands:
    - pbzip2 --version

about:
  home: http://compression.ca/pbzip2/
  license: pbzip2 License
  license_file: COPYING
  summary: PBZIP2 is a parallel implementation of the bzip2 block-sorting file compressor

extra:
  recipe-maintainers:
    - shahin
Update from Hackage at 2022-04-18T14:21:37Z
homepage: ''
changelog-type: ''
hash: 6996a2370d20c3ab5131fa5a934f9fb6b2b2f90c8bb9fccb96f3655c28d5a31b
test-bench-deps:
  base: '>=4.7 && <5'
  time: -any
  text: ==1.*
  msgpack: '>=1.0 && <2'
  registry-hedgehog: -any
  protolude: ==0.3.*
  containers: '>=0.2 && <1'
  registry: ==0.2.*
  transformers: '>=0.5 && <2'
  tasty: -any
  template-haskell: '>=2.13 && <3.0'
  vector: '>=0.1 && <1'
maintainer: etorreborre@yahoo.com
synopsis: MessagePack encoders / decoders
changelog: ''
basic-deps:
  base: '>=4.7 && <5'
  text: ==1.*
  msgpack: '>=1.0 && <2'
  protolude: ==0.3.*
  containers: '>=0.2 && <1'
  registry: ==0.2.*
  transformers: '>=0.5 && <2'
  template-haskell: '>=2.13 && <3.0'
  vector: '>=0.1 && <1'
all-versions:
- 0.1.0.0
author: ''
latest: 0.1.0.0
description-type: haddock
description: This library provides encoders / decoders which can be easily customized for the MessagePack format.
license-name: MIT
Quit spamming IRC for build notifications
before_script:
  - sh -e /etc/init.d/xvfb start
  - export DISPLAY=:99.0
bundler_args: --without development production --quiet
env:
  - GEM=api DB=sqlite
  - GEM=api DB=mysql
  - GEM=api DB=postgres
  - GEM=backend DB=sqlite
  - GEM=backend DB=mysql
  - GEM=backend DB=postgres
  - GEM=core DB=sqlite
  - GEM=core DB=mysql
  - GEM=core DB=postgres
  - GEM=frontend DB=sqlite
  - GEM=frontend DB=mysql
  - GEM=frontend DB=postgres
before_install:
  - cd $GEM; export BUNDLE_GEMFILE="`pwd`/Gemfile"
script:
  - bundle exec rake test_app
  - bundle exec rake spec
notifications:
  email:
    - ryan@spreecommerce.com
  irc:
    use_notice: true
    skip_join: true
    channels:
      - "irc.freenode.org#spree"
rvm:
  - 1.9.3
  - 2.0.0
before_script:
  - sh -e /etc/init.d/xvfb start
  - export DISPLAY=:99.0
bundler_args: --without development production --quiet
env:
  - GEM=api DB=sqlite
  - GEM=api DB=mysql
  - GEM=api DB=postgres
  - GEM=backend DB=sqlite
  - GEM=backend DB=mysql
  - GEM=backend DB=postgres
  - GEM=core DB=sqlite
  - GEM=core DB=mysql
  - GEM=core DB=postgres
  - GEM=frontend DB=sqlite
  - GEM=frontend DB=mysql
  - GEM=frontend DB=postgres
before_install:
  - cd $GEM; export BUNDLE_GEMFILE="`pwd`/Gemfile"
script:
  - bundle exec rake test_app
  - bundle exec rake spec
notifications:
  email:
    - ryan@spreecommerce.com
rvm:
  - 1.9.3
  - 2.0.0
Add a before_script to Travis config file to install grunt-cli
language: node_js
node_js:
  - "0.11"
  - "0.10"
  - "0.8"
  - "0.6"
language: node_js
node_js:
  - "0.11"
  - "0.10"
  - "0.8"
  - "0.6"
before_script:
  - npm install -g grunt-cli
Add support for Github Actions
name: CI

on:
  push:
    branches: [ dev ]
  pull_request:
    branches: [ dev ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout latest code
        uses: actions/checkout@v2
      - name: Set up JDK 8
        uses: actions/setup-java@v1
        with:
          java-version: 8
      - name: Setup build cache
        uses: actions/cache@v1
        with:
          path: ~/.gradle/caches
          key: ${{ runner.os }}-gradle-${{ hashFiles('**/*.gradle') }}
          restore-keys: |
            ${{ runner.os }}-gradle-
      - name: Run tests
        run: ./gradlew test
Revert "st2chatops also may need proxies"
---
# Update proxy env vars in StackStorm service config files
- name: proxy | Configure StackStorm services
  become: yes
  lineinfile:
    dest: /etc/{{ 'default' if ansible_facts.pkg_mgr == 'apt' else 'sysconfig' }}/{{ item.0 }}
    create: yes
    regexp: '^{{ item.1 }}='
    line: "{{ item.1 }}={{ ansible_facts.env.get(item.1) }}"
    # NB: Empty ENV var cast to 'None' string in Ansible
    state: "{{ 'present' if ansible_facts.env.get(item.1, 'None') != 'None' else 'absent' }}"
  vars:
    _services: [st2api, st2actionrunner, st2chatops]
    _proxy_vars: [http_proxy, https_proxy, no_proxy]
  loop: '{{ _services|product(_proxy_vars)|list }}'
  notify:
    - restart st2actionrunner
    - restart st2api
---
# Update proxy env vars in StackStorm service config files
- name: proxy | Configure StackStorm services
  become: yes
  lineinfile:
    dest: /etc/{{ 'default' if ansible_facts.pkg_mgr == 'apt' else 'sysconfig' }}/{{ item.0 }}
    create: yes
    regexp: '^{{ item.1 }}='
    line: "{{ item.1 }}={{ ansible_facts.env.get(item.1) }}"
    # NB: Empty ENV var cast to 'None' string in Ansible
    state: "{{ 'present' if ansible_facts.env.get(item.1, 'None') != 'None' else 'absent' }}"
  vars:
    _services: [st2api, st2actionrunner]
    _proxy_vars: [http_proxy, https_proxy, no_proxy]
  loop: '{{ _services|product(_proxy_vars)|list }}'
  notify:
    - restart st2actionrunner
    - restart st2api
Update from Hackage at 2017-10-18T23:31:54Z
homepage: https://github.com/githubuser/ssh-tunnel#readme
changelog-type: ''
hash: aca3735c9b7a04409ba51e319d2e579fa45034256d109bc7d32e6961f918e4cf
test-bench-deps: {}
maintainer: ncrashed@gmail.com
synopsis: Proxy http-client via ssh tunnel.
changelog: ''
basic-deps:
  http-client: ! '>=0.4 && <0.6'
  base: ! '>=4.7 && <5'
  managed: ! '>=1.0 && <1.1'
  text: ! '>=1.2 && <1.3'
  uuid: ! '>=1.3 && <1.4'
  turtle: ! '>=1.2 && <1.4'
  foldl: ! '>=1.1 && <1.3'
  transformers: ! '>=0.4 && <0.6'
all-versions:
- '1.0.0.0'
author: ! 'Anton Gushcha <ncrashed@gmail.com> , Anatoliy Nardid <nazgul17@gmail.com>'
latest: '1.0.0.0'
description-type: markdown
description: ! "ssh-tunnel\n==========\n\nSmall library that allows to create SSH tunnel (SOCKS proxy) from Haskell and proxy\n[http-client](http://hackage.haskell.org/package/http-client).\n\nTunnel is created with:\n```\nssh -f -N -M -S <master-socket> -i <pemfile> -L <localport>:127.0.0.1:<remoteport> <user>@<host>\n```\n\nNote: that the tunnel is created in background (-f) without a shell on remote host (-N)\nand (-M -S <master-socket>) defines special socket that is used to terminate the tunnel.\n\nHow to use in your code:\n```haskell\nimport Control.Monad.Managed\nimport Control.SSH.Tunnel\nimport Network.HTTP.Client\nimport Network.HTTP.Client.TLS\n\nwith (openSshTunnel config tlsManagerSettings) $ \\settings -> do\n manager <- newManager settings\n -- do things with manager\n -- as soon you live the scope, tunnel will be down\n```\n\nAlso, you can find `addFingerprints` function useful for adding fingerprints to\nknown hosts on first connect. \n"
license-name: BSD3
Set up CI with Azure Pipelines
# ASP.NET Core
# Build and test ASP.NET Core projects targeting .NET Core.
# Add steps that run tests, create a NuGet package, deploy, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core

pool:
  vmImage: 'vs2017-win2016'

variables:
  buildConfiguration: 'Release'

steps:
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'dotnet build $(buildConfiguration)'
Add new GitHub workflow for tagged commits to a branch other than master
name: Deploy custom tag

on:
  push:
    branches:
      - "!master"
    tags:
      - "[0-9]+.[0-9]+.[0-9]+"

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Check out Git repository
        uses: actions/checkout@v2

      - name: Install Java and Maven
        uses: actions/setup-java@v1
        with:
          java-version: 15

      - name: Cache local Maven repository
        uses: actions/cache@v2
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-

      - name: Get tag version
        id: tag_version
        run: |
          echo "::set-output name=new_tag::${GITHUB_REF#refs/*/}"

      - name: Set maven version
        run: |
          echo "Setting Maven version to ${{ steps.tag_version.outputs.new_tag }}"
          mvn -B versions:set -DnewVersion=${{ steps.tag_version.outputs.new_tag }}

#      - name: Deploy to Maven Central
#        uses: samuelmeuli/action-maven-publish@v1
#        with:
#          gpg_private_key: ${{ secrets.gpg_private_key }}
#          gpg_passphrase: ${{ secrets.gpg_passphrase }}
#          nexus_username: ${{ secrets.nexus_username }}
#          nexus_password: ${{ secrets.nexus_password }}
#
#      - name: Create GitHub release
#        uses: softprops/action-gh-release@v1
#        with:
#          tag_name: ${{ steps.tag_version.outputs.new_version }}
#          name: Flux Capacitor ${{ steps.tag_version.outputs.new_version }}
#          body: ${{ steps.tag_version.outputs.changelog }}
#
#      - name: Login to docker hub
#        uses: actions-hub/docker/login@master
#        env:
#          DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
#          DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
#
#      - name: Docker build
#        run: |
#          cd test-server
#          docker build -t fluxcapacitorio/flux-capacitor-test:${{ steps.tag_version.outputs.new_version }} -t fluxcapacitorio/flux-capacitor-test:latest .
#          cd ..
#
#      - name: Push to docker hub
#        uses: actions-hub/docker@master
#        with:
#          args: push fluxcapacitorio/flux-capacitor-test
Update from Hackage at 2017-01-12T15:52:25Z
homepage: https://github.com/lovasko/goat
changelog-type: ''
hash: 5ed53a6b6887fc1c7335b3de27d6c119251bf9ed0aa551d718d4df3eaa9e5078
test-bench-deps:
  cereal: -any
  bytestring: -any
  base: ! '>=4.7 && <5'
  goat: -any
  QuickCheck: -any
  safe: -any
maintainer: Daniel Lovasko <daniel.lovasko@gmail.com>
synopsis: Time Series Compression
changelog: ''
basic-deps:
  cereal: -any
  bytestring: -any
  split: -any
  base: ! '>=4.7 && <5'
  floating-bits: -any
  safe: -any
all-versions:
- '1.0.0'
author: Daniel Lovasko <daniel.lovasko@gmail.com>
latest: '1.0.0'
description-type: haddock
description: ! 'Goat is a time series compression implementation heavily influenced by the Gorilla paper published by Facebook. It provides separate compression algorithms for both time and value points of a time series.'
license-name: OtherLicense
Enable CI using GitHub Actions
name: CI

on:
  push:
    branches:
      - main
  pull_request:

jobs:
  test:
    name: Test Rails
    uses: alphagov/govuk-infrastructure/.github/workflows/test-rails.yaml@main
    with:
      requiresJavaScript: true
      requiresMySQL: true
      requiresRedis: true
      mysqlUsername: contacts
      mysqlPassword: contacts
Test pb to just run docker storage setup on nodes
---
- hosts: localhost
  gather_facts: yes
  roles:
    - instance-groups
  tags:
    - always

- hosts: nodes
  roles:
    - role: docker-storage-setup
      docker_dev: '/dev/vdb'
  tags:
    - pre
    - storage
Add transition config for LLWR
---
site: nda_llwr
whitehall_slug: low-level-waste-repository-ltd
homepage: https://www.gov.uk/government/organisations/low-level-waste-repository-ltd
tna_timestamp: 20180313170246
host: llwrsite.com
aliases:
  - www.llwrsite.com
Add or update the Azure App Service build and deployment workflow config
# Docs for the Azure Web Apps Deploy action: https://github.com/Azure/webapps-deploy
# More GitHub Actions for Azure: https://github.com/Azure/actions

name: Build and deploy Node.js app to Azure Web App - vscode-uiflow-production

on:
  push:
    branches:
      - azure-ci
  workflow_dispatch:

jobs:
  build:
    runs-on: windows-latest

    steps:
      - uses: actions/checkout@v2

      - name: Set up Node.js version
        uses: actions/setup-node@v1
        with:
          node-version: '16.x'

      - name: npm install, build, and test
        run: |
          npm install
          npm run build --if-present
          npm run test --if-present

      - name: Upload artifact for deployment job
        uses: actions/upload-artifact@v2
        with:
          name: node-app
          path: .

  deploy:
    runs-on: ubuntu-latest
    needs: build
    environment:
      name: 'Production'
      url: ${{ steps.deploy-to-webapp.outputs.webapp-url }}

    steps:
      - name: Download artifact from build job
        uses: actions/download-artifact@v2
        with:
          name: node-app

      - name: 'Deploy to Azure Web App'
        uses: azure/webapps-deploy@v2
        id: deploy-to-webapp
        with:
          app-name: 'vscode-uiflow-production'
          slot-name: 'Production'
          publish-profile: ${{ secrets.AZUREAPPSERVICE_PUBLISHPROFILE_A17CFD50B300411A830C497827D1170D }}
          package: .
Add action to dedupe Dependabot PRs
name: Dedupe Dependabot PRs

on:
  push:
    branches: ['dependabot/**']

permissions:
  contents: write
  pull-requests: write
  repository-projects: write

jobs:
  dedupe:
    name: Dedupe Dependabot PRs
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Cache .yarn/cache
        uses: actions/cache@v3
        env:
          cache-name: yarn-cache
        with:
          path: .yarn/cache
          key: ${{ runner.os }}-${{ env.cache-name }}-${{ hashFiles('**/yarn.lock') }}
          restore-keys: |
            ${{ runner.os }}-${{ env.cache-name }}

      - name: Use Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '16'

      - name: Enable Corepack
        run: corepack enable

      - name: Configure Git
        run: |
          git config user.name 'github-actions[bot]'
          git config user.email 'github-actions[bot]@users.noreply.github.com'

      - name: Detect working directory
        run: |
          echo "WORKING_DIRECTORY=$(git log -1 --pretty=%B | sed -n 's/.* in \/\(.*\)$/\1/p')" >> $GITHUB_ENV

      - name: Dedupe dependencies
        run: yarn dedupe
        working-directory: ${{ env.WORKING_DIRECTORY }}
        env:
          HUSKY: 0

      - name: Commit changes
        run: |
          git add .
          git commit -m 'Dedupe dependencies' || true
        working-directory: ${{ env.WORKING_DIRECTORY }}

      - name: Push changes
        run: git push
        working-directory: ${{ env.WORKING_DIRECTORY }}
Add publish to PyPI workflow file
# This workflows will upload a Python Package using Twine when a release is created
# For more information see: https://help.github.com/en/actions/language-and-framework-guides/using-python-with-github-actions#publishing-to-package-registries

name: Upload Package to PyPI

on:
  release:
    types: [created]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.x'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install setuptools wheel twine
      - name: Build and publish
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
        run: |
          python setup.py sdist bdist_wheel
          twine upload dist/*
Use gitleaks for detecting hardcoded secrets
name: Security Scans

on: [push, pull_request]

jobs:
  cleanup_runs:
    name: Cancel old branch builds
    runs-on: ubuntu-latest
    if: "!startsWith(github.ref, 'refs/tags/') && github.ref != 'refs/heads/master'"
    steps:
      - name: Find and cancel old builds of this branch
        uses: rokroskar/workflow-run-cleanup-action@v0.2.2
        env:
          GITHUB_TOKEN: "${{ secrets.GITHUB_TOKEN }}"

  gitleaks:
    name: Detecting hardcoded secrets
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          # Fetch all history for all tags and branches
          fetch-depth: '0'
      - name: Gitleaks - detecting hardcoded secrets
        uses: zricethezav/gitleaks-action@v1.1.4
Add GitHub Action to automatically post links
on:
  push:
  workflow_dispatch:
  schedule:
    - cron: '30 * * * *' # every hour, at ~:30 https://crontab.guru/#30_*_*_*_*

jobs:
  update-news-releases:
    runs-on: ubuntu-latest
    steps:
      - name: Check out this repo
        uses: actions/checkout@v2
      - name: Set up Node
        uses: actions/setup-node@v2
        with:
          node-version: 16
      - name: Pull Node packages
        run: |-
          npm install
      - name: Fetch latest data
        run: |-
          node _scripts/generate-link-posts.mjs
      - name: Commit and push if there are new links
        run: |-
          git config user.name "Automated"
          git config user.email "actions@users.noreply.github.com"
          git add links/
          timestamp=$(date -u)
          git commit -m "Latest links from Pinboard: ${timestamp}" || exit 0
          git push
Update from Forestry.io - Updated Forestry configuration
---
label: Blog post
hide_body: false
fields:
  - type: text
    name: title
    label: title
  - type: field_group
    name: header
    label: header
    fields:
      - type: file
        name: image
        label: image
  - type: text
    name: layout
    label: layout
  - type: boolean
    name: author_profile
    label: author_profile
  - type: boolean
    name: read_time
    label: read_time
  - type: boolean
    name: comments
    label: comments
  - type: boolean
    name: share
    label: share
  - type: boolean
    name: related
    label: related
Add a really basic GA workflow to build/install.
name: Build and package

on:
  workflow_dispatch:
  push:
    branches:
      - cidev

jobs:
  mondo-build:
    name: Performs a mondo-build of all optional components
    runs-on: ubuntu-18.04
    env:
      BUILD_DIR: ${{ github.workspace }}/../iree-build
      INSTALL_DIR: ${{ github.workspace }}/../iree-install
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: true
      - uses: actions/setup-python@v2
        with:
          python-version: '3.8'
      - name: Report environment
        shell: bash
        run: |
          echo "GITHUB_RUN_ID=${GITHUB_RUN_ID}"
          echo "GITHUB_RUN_NUMBER=${GITHUB_RUN_NUMBER}"
          echo "GITHUB_WORKSPACE=${GITHUB_WORKSPACE}"
          echo "PWD=${PWD}"
          echo "PATH=${PATH}"
      - name: Configure
        shell: bash
        run: |
          cd "${GITHUB_WORKSPACE}"
          cmake -B "${BUILD_DIR}" "${GITHUB_WORKSPACE}" \
            -DCMAKE_INSTALL_PREFIX="${INSTALL_DIR}" \
            -DCMAKE_BUILD_TYPE=Release \
            -DIREE_BUILD_TENSORFLOW_COMPILER=ON \
            -DIREE_BUILD_XLA_COMPILER=ON \
            -DIREE_BUILD_TFLITE_COMPILER=ON \
            -DPython3_EXECUTABLE=$(which python)
      - name: Build
        shell: bash
        run: |
          cd "${BUILD_DIR}"
          make -j 4
          make install
          ls -lRh "${INSTALL_DIR}"
      - name: Upload Install Directory
        uses: actions/upload-artifact@v2
        with:
          name: iree-install-ubuntu-18.04
          path: ${{env.INSTALL_DIR}}/
          retention-days: 5
Configure CircleCI for deployment to GitHub pages
deployment:
  demo:
    tag: /(release-.*|demo)/
    commands:
      - git config --global user.email circleci@circleci
      - git config --global user.name CircleCI
      - gulp demo-copy
      - git checkout gh-pages
      - ls | grep -v build | grep -v node_modules | xargs rm -rf
      - mv build/* ./ && rm -r build
      - git status && git add --all .
      - cd build/ && git commit -m "Update (`date '+%F %T %Z'`) [ci skip]"
      - git push origin gh-pages
Add Circle CI style checks to the repo
version: 2.1

executors:
  chef:
    docker:
      - image: chef/chefdk:3.11.3

jobs:
  stylecheck:
    executor: chef
    steps:
      - checkout
      - run:
          name: Run style and unit tests
          command: |
            cookstyle .
            foodcritic .
            chef exec rspec

workflows:
  version: 2
  "Style and Unit Tests":
    jobs:
      - stylecheck
Add config for pre-commit tool
- repo: git://github.com/pre-commit/pre-commit-hooks
  sha: 96fb7fa10f2f4c11ed33482a9ad7474251e5e97f
  hooks:
    - id: trailing-whitespace
    - id: flake8
      args: [--max-line-length=120]
    - id: check-merge-conflict
    - id: double-quote-string-fixer
    - id: end-of-file-fixer
    - id: name-tests-test
    - id: debug-statements
    - id: check-added-large-files
    - id: check-ast
    - id: check-byte-order-marker
    - id: check-case-conflict
    - id: check-docstring-first
    - id: check-json
    - id: pretty-format-json
    - id: check-symlinks
    - id: check-yaml
      exclude: (vagga|feature_api/config)\.yaml
    - id: detect-private-key
    - id: requirements-txt-fixer
- repo: git://github.com/FalconSocial/pre-commit-mirrors-pep257
  sha: 149e61b7a717945143fe51f010fe1c576e729a9f
  hooks:
    - id: pep257
- repo: git://github.com/Lucas-C/pre-commit-hooks
  sha: c25201a00e6b0514370501050cf2a8538ac12270
  hooks:
    - id: remove-tabs
- repo: git://github.com/Lucas-C/pre-commit-hooks-lxml
  sha: master
  hooks:
    - id: forbid-html-img-without-alt-text
- repo: git://github.com/detailyang/pre-commit-shell
  sha: 1fdffa0434cde2b87f19ad258201d3e81481af5f
  hooks:
    - id: shell-lint
Add a 'one-off' task to run single test case on concourse
#
# A one-of task you can run on concourse to run a single maven test
#
# Run this with the following command from the ROOT of the sts4 repo:
#
#   fly -t tools execute -x --config concourse/tasks/run-manifest-yaml-test.yml

inputs:
  - name: sts4
platform: linux
image_resource:
  type: docker-image
  source:
    repository: kdvolder/sts4-build-env
run:
  dir: sts4/vscode-extensions/vscode-manifest-yaml
  path: ../mvnw
  args:
    - "-U"
    - "-f"
    - "../pom.xml"
    - "-pl"
    - "vscode-manifest-yaml"
    - "-am"
    - "-DfailIfNoTests=false"
    - "-Dtest=ManifestYamlEditorTest#reconcileShowsWarningOnUnknownService"
    - "clean"
    - "install"
Add a test for the originally reported failure.
# Test that joins between INFORMATION_SCHEMA tables work as expected.
---
- CreateTable: c (cid INT PRIMARY KEY NOT NULL, cname VARCHAR(128))
---
- Statement: SELECT column_name, data_type, type_category, type_bundle_name
    FROM information_schema.columns c
    LEFT OUTER JOIN information_schema.types t ON c.data_type = t.type_name
    WHERE table_schema = 'test' AND table_name = 'c'
- output: [['cid', 'INT', 'INTEGER', 'MCOMPAT'], ['cname', 'VARCHAR', 'STRING_CHAR', 'MCOMPAT']]
...
Add action workflow with lint & deploy
name: CI/CD Pipeline

on:
  push:
    branches:
      - develop
      - main
  pull_request:
    # Run workflow whenever a PR is opened, updated (synchronized), or marked ready for review.
    types: [ opened, synchronize, ready_for_review ]

jobs:
  lint:
    name: Lint
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Get npm cache directory
        id: npm-cache
        run: echo "::set-output name=dir::$(npm config get cache)"

      - name: Configure npm cache
        uses: actions/cache@v2
        with:
          path: ${{ steps.npm-cache.outputs.dir }}
          key: ${{ runner.os }}-npm-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-npm-

      - name: Install Node dependencies
        run: npm i
        env:
          CI: true

      - name: Detect coding standard violations
        run: npm run lint

  deploy:
    if: github.ref == 'refs/heads/develop'
    needs: [lint]
    name: Deploy
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Deploy to Firebase
        uses: FirebaseExtended/action-hosting-deploy@v0
        with:
          repoToken: "${{ secrets.GITHUB_TOKEN }}"
          firebaseServiceAccount: "${{ secrets.FIREBASE_SERVICE_ACCOUNT }}"
          projectId: web-dev-media
          channelId: live
Add Node.js 10 to Travis testing
language: node_js
sudo: false
node_js:
  - "6"
  - "7"
  - "8"
  - "9"
after_success:
  - npm run coverage
  - npm i codecov
  - codecov -f coverage/coverage.json
notifications:
  email:
    on_success: never
cache:
  directories:
    - node_modules
language: node_js
sudo: false
node_js:
  - "6"
  - "7"
  - "8"
  - "9"
  - "10"
after_success:
  - npm run coverage
  - npm i codecov
  - codecov -f coverage/coverage.json
notifications:
  email:
    on_success: never
cache:
  directories:
    - node_modules
Add conf file for P2PU KoomBook
---
- hosts: localhost
  roles:
    # Enable or disable to suit your needs
    # This role automatically send logs to central server each time the device gets an Internet connection
    - logs
    - role: zim_install
      zim_name: "wikipedia.en gutenberg.en wiktionary.en tedbusiness.en teddesign.en tedentertainment.en tedglobalissues.en tedscience.en tedtechnology.en"
    - role: kalite
      version: "0.15.0"
      lang: "en"
Add feedback bot to pull request
on:
  pull_request:
    types: [opened]

jobs:
  comment:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            github.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: `👋 Thanks for contributing to Viper! You are awesome!

            A maintainer will take a look at your pull request shortly. In the meantime:

            We are working on **Viper v2** and we would love to hear your thoughts about what you like or don't like about Viper, so we can improve or fix those issues.

            If you have a couple minutes, please take some time to give us some feedback: https://forms.gle/R6faU74qPRPAzchZ9

            If you've already given us your feedback, you can still help by spreading the news, either by sharing the above link or telling people about this on Twitter: https://twitter.com/sagikazarmark/status/1306904078967074816

            **Thank you!**
            `,
            })
Update TravisCI to Gitter webhook
language: go
sudo: false
go:
  - 1.4
  - 1.5
  - 1.6
  - 1.7
  - tip
script:
  - go test -v -covermode=count -coverprofile=coverage.out
after_success:
  - bash <(curl -s https://codecov.io/bash)
notifications:
  webhooks:
    urls:
      - https://webhooks.gitter.im/e/acc2c57482e94b44f557
    on_success: change  # options: [always|never|change] default: always
    on_failure: always  # options: [always|never|change] default: always
    on_start: false     # default: false
language: go
sudo: false
go:
  - 1.4
  - 1.5
  - 1.6
  - 1.7
  - tip
script:
  - go test -v -covermode=count -coverprofile=coverage.out
after_success:
  - bash <(curl -s https://codecov.io/bash)
notifications:
  webhooks:
    urls:
      - https://webhooks.gitter.im/e/7f95bf605c4d356372f4
    on_success: change  # options: [always|never|change] default: always
    on_failure: always  # options: [always|never|change] default: always
    on_start: false     # default: false
Add Prod environment with 3VM but no directories sync
---
# Variables used by Vagrantfile are defined here
servers:
  - name: frontend
    ip: 172.16.1.30
    hostname: frontend.irma.local
    memory: 2048
  - name: brain
    ip: 172.16.1.31
    hostname: brain.irma.local
    memory: 2048
  - name: probe
    ip: 172.16.1.32
    hostname: probe.irma.local
    memory: 2048
Add config for Travis CI
language: cpp
compiler:
  - gcc
install:
  - sudo apt-get install gcc-multilib g++-multilib cmake
before_script:
  - cmake . -DCMAKE_C_FLAGS=-m32 -DCMAKE_CXX_FLAGS=-m32
script:
  - make
  - make package
Configure Travis-CI for Linux testing
language: cpp
os: linux

# Use the latest supported distro, Xenial, released in April 2016…
dist: xenial

git:
  depth: 1

compiler:
  - clang
  - gcc

env:
  matrix:
    - BUILD_TYPE="Release"
    - BUILD_TYPE="Debug"

# Xenial only has outdated versions of the packages we could use, but we still
# install them as a way to get all their dependencies installed.
before_install:
  - sudo apt-get install -y libassimp-dev
  - sudo apt-get install -y libglm-dev
  - sudo apt-get install -y libglfw3-dev

script:
  - mkdir build || exit 1
  - pushd build && {
      cmake -DCMAKE_CXX_FLAGS="-Wall" \
            -DCMAKE_BUILD_TYPE="${BUILD_TYPE}" \
            -G Ninja ..;
      popd;
    } || exit 1
  - cmake --build build || exit 1
Add a Travis CI configuration file.
language: python
python:
  - "2.7"
before_install:
  - sudo apt-get update -qq
install:
  - sudo apt-get install -y python-opencv
  - pip install -r requirements.txt
script:
  - python setup.py build
Build feature branches in Travis, too.
language: ruby
script: 'rake db:create db:test:load default'
rvm:
  - 1.9.3
env:
notifications:
  email: false
bundler_args: --without development
branches:
  only:
    - master
language: ruby
script: 'rake db:create db:test:load default'
rvm:
  - 1.9.3
env:
notifications:
  email: false
bundler_args: --without development
Validate files with Travis CI
# Travis CI config for the WordPoints developer tools.

language: ruby

# Use the new infrustructure.
sudo: false

before_script:
  - gem install travis-lint

script:
  - xmllint --noout $(find . -type f \( -name '*.xml' -o -name '*.xml.dist' \))
  - find . -name '*.sh' -exec bash -n {} \;
  - find . -name '*.yml' -exec travis-lint {} \;

# EOF
Add Oracle Java 11 to Travis build.
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

language: java
sudo: false
jdk:
  - openjdk7
  - oraclejdk8
  - openjdk10
  - openjdk11
after_success:
  - mvn clean cobertura:cobertura coveralls:report
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

language: java
sudo: false
jdk:
  - openjdk7
  - openjdk10
  - openjdk11
  - oraclejdk8
  - oraclejdk11
after_success:
  - mvn clean cobertura:cobertura coveralls:report
Add a Travis config file.
language: php

php:
  - 5.3.3
  - 5.3
  - 5.4
  - 5.5
  - 5.6
  - hhvm
  - hhvm-nightly

sudo: false

before_script:
  - composer self-update
  - composer install --no-interaction --prefer-source --dev

script: ./vendor/bin/phpunit

notifications:
  email: false
  irc: "irc.freenode.org#phpunit"
Use Xcode 11 as CI env
language: swift
osx_image:
  - xcode10.1
  - xcode10.3
env:
  - DESTINATION='platform=macOS' SWIFT_VERSION=5.0
  - DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=5.0
  - DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=5.0
  - DESTINATION='platform=macOS' SWIFT_VERSION=4.2
  - DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=4.2
  - DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=4.2
  - DESTINATION='platform=macOS' SWIFT_VERSION=4.0
  - DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=4.0
  - DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=4.0
cache: bundler
before_install:
  - gem update --system
  - gem install bundler
matrix:
  exclude:
    - osx_image: xcode10.1
      env: DESTINATION='platform=macOS' SWIFT_VERSION=5.0
    - osx_image: xcode10.1
      env: DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=5.0
    - osx_image: xcode10.1
      env: DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=5.0
script:
  - bundle exec fastlane test_ci
language: swift
osx_image:
  - xcode10.3
  - xcode11.1
env:
  - DESTINATION='platform=macOS' SWIFT_VERSION=5.0
  - DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=5.0
  - DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=5.0
  - DESTINATION='platform=macOS' SWIFT_VERSION=4.2
  - DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=4.2
  - DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=4.2
  - DESTINATION='platform=macOS' SWIFT_VERSION=4.0
  - DESTINATION='platform=iOS Simulator,name=iPhone 8' SWIFT_VERSION=4.0
  - DESTINATION='platform=tvOS Simulator,name=Apple TV' SWIFT_VERSION=4.0
cache: bundler
before_install:
  - gem update --system
  - gem install bundler
script:
  - bundle exec fastlane test_ci
Revert "Now we have a cache; lets run the tests."
language: java
sudo: false
script: ant -Divy.logging=default -Djava.security.egd=file:/dev/./urandom test-no-reports
jdk:
  - oraclejdk8
cache:
  directories:
    - $HOME/.ivy2
notifications:
  email: false
branches:
  only:
    - develop
language: java
sudo: false
script: ant -Divy.logging=default -Djava.security.egd=file:/dev/./urandom jar
jdk:
  - oraclejdk8
cache:
  directories:
    - $HOME/.ivy2
notifications:
  email: false
branches:
  only:
    - develop
Add dev config file with mock API
#
# This config-file is only needed for development. Instead of changing the url
# everytime you work locally on the project, you start both config-files, overwriting
# the first one with the development variables needed.
#
# Start development with › $ jekyll serve --config _config.yml,_config_dev_with_mock_api.yml

url: "http://localhost:4000"
urlimg: "http://localhost:4000/images/"

laterpay:
  connector:
    api_root: 'http://localhost:3001/api/example'
Add another config example: mine
---
nick: Eidolos
server: NAO
write_keys: 1
write_normal_ttyrec: 1
write_interhack_ttyrec: 1
nhoptions:
  vikeys: 1
plugins:
  exclude:
    - Autologin
  MonsterColors:
    chromatic_nemeses: 1
  PriceID:
    short_names:
      scroll:
        60: [EW]
        80: [EA, RC]
      potion:
        300: [GA, GL, paral]
  Autoadjust:
    key: k
    unihorn: h
    athame: E
    stethoscope: s
    pickaxe: x
    bag: b
    Amulet: d
    blindfold: P
    lizard: L
    conflict: c
    whistle: w
    levitation: l
    lamp: j
    instrument: a
    trice: '^ye'
    '\bpotions?\b[^.]*?\b': "^q"
    '\bwand\b[^.]*?': "^z"
  Macros:
    '^E': "E- Elbereth\n"
    '^V': "E- Elbereth\nE- Elbereth\nE- Elbereth\n> "
    '^W': "aany\e"
Update from Hackage at 2018-05-17T21:21:38Z
homepage: https://github.com/ublubu/shapes#readme
changelog-type: ''
hash: 0bb1969bde0861eb061d883f2d9277b5f846e569f499b929244977ae83f69f42
test-bench-deps:
  either: -any
  base: ! '>=4.7 && <5'
  hspec: -any
  array: -any
  shapes-math: -any
  containers: -any
  shapes: -any
  lens: -any
  ghc-prim: -any
  linear: -any
  mtl: -any
  transformers: -any
  deepseq: -any
  vector-th-unbox: -any
  QuickCheck: -any
  vector: -any
maintainer: kynan.rilee@gmail.com
synopsis: physics engine and other tools for 2D shapes
changelog: ''
basic-deps:
  either: -any
  base: ! '>=4.7 && <5'
  criterion: -any
  array: -any
  shapes-math: -any
  containers: -any
  shapes: -any
  lens: -any
  ghc-prim: -any
  linear: -any
  mtl: -any
  transformers: -any
  deepseq: -any
  vector-th-unbox: -any
  vector: -any
all-versions:
- '0.1.0.0'
author: Kynan Rilee
latest: '0.1.0.0'
description-type: haddock
description: Please see the README on Github at <https://github.com/ublubu/shapes#readme>
license-name: BSD3
Update from Hackage at 2019-05-04T03:11:14Z
homepage: https://github.com/thomaseding/math-interpolate
changelog-type: ''
hash: 256da141677506ed8d6e62ab4b74b8d531b6519c2da13208cf097754ed53ff04
test-bench-deps: {}
maintainer: Thomas Eding
synopsis: Class for interpolation of values
changelog: ''
basic-deps:
  base: ^>=4.12.0.0
all-versions:
- 0.1.0.1
author: Thomas Eding
latest: 0.1.0.1
description-type: markdown
description: "interpolate\r\n"
license-name: BSD-3-Clause
Use stack setup so you can ignore most of the output
cache: - "c:\\sr" # stack root, short paths == less problems build: off before_test: - curl -ostack.zip -L --insecure http://www.stackage.org/stack/windows-i386 - 7z x stack.zip stack.exe clone_folder: "c:\\stack" environment: global: STACK_ROOT: "c:\\sr" test_script: # The ugly echo "" hack is to avoid complaints about 0 being an invalid file # descriptor - echo "" | stack --arch i386 --no-terminal --install-ghc test
cache: - "c:\\sr" # stack root, short paths == less problems build: off before_test: - curl -ostack.zip -L --insecure http://www.stackage.org/stack/windows-i386 - 7z x stack.zip stack.exe clone_folder: "c:\\stack" environment: global: STACK_ROOT: "c:\\sr" test_script: - stack setup > nul # The ugly echo "" hack is to avoid complaints about 0 being an invalid file # descriptor - echo "" | stack --no-terminal test
Update from Hackage at 2018-05-12T00:33:32Z
homepage: ''
changelog-type: ''
hash: 4bf2da55deca1ffb2a32b100b44c21346ae860e5f6c3fbe05f11a179defddf57
test-bench-deps: {}
maintainer: strake888@gmail.com
synopsis: Non-crashing `Enum` operations
changelog: ''
basic-deps:
  base: ! '>=4.7 && <5'
all-versions:
- '0.1.0.0'
author: M Farkas-Dyck
latest: '0.1.0.0'
description-type: markdown
description: ! '# Enum '
license-name: BSD3
Update from Hackage at 2018-05-12T07:37:39Z
homepage: https://github.com/buonuomo/Text.Pronounce
changelog-type: markdown
hash: 3d685fcadf3b278d2c289dfb7529c958fe63bc79b7bcd8cbe9a50e58ccca1428
test-bench-deps: {}
maintainer: ngoodman@uchicago.edu
synopsis: A Haskell library for interfacing with the CMU Pronouncing Dictionary
changelog: ! "# Revision history for pronounce\r\n\r\n## 0.1.0.0 -- YYYY-mm-dd\r\n\r\n* First version. Released on an unsuspecting world.\r\n"
basic-deps:
  base: ! '>=4.10 && <4.11'
  text: ! '>=1.2 && <1.3'
  filepath: ! '>=1.4 && <1.5'
  containers: ! '>=0.5 && <0.6'
  binary: ! '>=0.8.4'
  mtl: ! '>=2.2 && <2.3'
all-versions:
- '1.1.0.1'
author: Noah Goodman
latest: '1.1.0.1'
description-type: markdown
description: ! "# Text.Pronounce\nA pronunciation and rhyming library that uses the CMU Pronouncing Dictionary\n\nThis package is a basic interface for the Carnegie Mellon University Pronouncing Dictionary, based off of \n[Allison Parrish's Python API](https://github.com/aparrish/pronouncingpy), `pronouncing`. \n\n### Documentation\nComing soon..\n"
license-name: BSD3
Add a release note about storage/incoming split
---
features:
- The storage of new measures that ought to be processed by *metricd* can now
  be stored using different storage drivers. By default, the driver used is
  still the regular storage driver configured. See the `[incoming]` section in
  the configuration file.
Update from Forestry.io - Updated Forestry configuration
---
upload_path: "/uploads/:year:/:month:/:day:"
frontmatter_file_url_template: "/uploads/:year:/:month:/:day:"
body_file_url_template: "/uploads/:year:/:month:/:day:"
new_page_extension: md
auto_deploy: false
admin_path:
webhook_url:
collections:
Update from Hackage at 2017-02-12T06:28:35Z
homepage: https://github.com/rblaze/credential-store#readme
changelog-type: ''
hash: 31a5698f149ac2cdcd38f77f3670b06ae3599fe9c95ff31dc548f62217331faf
test-bench-deps:
  bytestring: -any
  base: -any
  credential-store: -any
  tasty-hunit: -any
  tasty: -any
maintainer: blaze@ruddy.ru
synopsis: Library to access secure credential storage providers
changelog: ''
basic-deps:
  bytestring: -any
  base: ! '>=4.7 && <5'
  safe-exceptions: -any
  memory: -any
  credential-store: -any
  containers: -any
  cryptonite: -any
  dbus: -any
all-versions:
- '0.1.0.0'
author: Andrey Sverdlichenko
latest: '0.1.0.0'
description-type: markdown
description: ! '# credential-store Windows and Linux credentials storage '
license-name: Apache-2.0
Add github workflow for automatic test
name: spglib test using conda

on:
  push:
    branches-ignore:
    - rc
    - master

jobs:
  build-linux:
    runs-on: ubuntu-latest
    strategy:
      max-parallel: 5

    steps:
    - uses: actions/checkout@v2
    - name: Set up Python 3.8
      uses: actions/setup-python@v2
      with:
        python-version: 3.8
    - name: Add conda to system path
      run: |
        # $CONDA is an environment variable pointing to the root of the miniconda directory
        echo $CONDA/bin >> $GITHUB_PATH
    - name: Install dependencies
      run: |
        conda install --yes -c conda-forge python=3.8
        conda install --yes -c conda-forge numpy compilers pip pytest codecov pytest-cov
    - name: Setup spglib
      working-directory: ./python
      run: |
        ./get_nanoversion.sh
        cat __nanoversion__.txt
        python setup.py build
        pip install -e .
    - name: Test with pytest
      working-directory: ./python
      run: |
        pytest --cov=./ --cov-report=xml
    - name: Upload coverage to Codecov
      working-directory: ./python
      uses: codecov/codecov-action@v1
      with:
        verbose: true
Add config for Cloud Builder
steps:
- name: 'gcr.io/cloud-builders/bazel'
  args: ['build', 'gce-containers-startup']
- name: 'gcr.io/cloud-builders/gsutil'
  args: ['cp', 'bazel-bin/gce-containers-startup/gce-containers-startup', 'docker/gce-containers-startup']
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-f', 'docker/Dockerfile', '-t', 'gcr.io/gce-containers/konlet:HEAD', '.']
images: ['gcr.io/gce-containers/konlet:HEAD']
Configure github actions to execute test on PR
# This workflow will build a Java project with Maven
# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven

name: Java CI with Maven

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v2
    - name: Set up JDK 8
      uses: actions/setup-java@v2
      with:
        java-version: '8'
        distribution: 'adopt'
    - name: Test with Maven
      run: mvn -B test --file pom.xml
Set up CI with Azure Pipelines
# .NET Desktop
# Build and run tests for .NET Desktop or Windows classic desktop solutions.
# Add steps that publish symbols, save build artifacts, and more:
# https://docs.microsoft.com/azure/devops/pipelines/apps/windows/dot-net

trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'

- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Update from Hackage at 2021-03-24T01:39:18Z
homepage: https://github.com/ekmett/codex/tree/master/parsnip#readme
changelog-type: markdown
hash: 78c588893fe7e09864c6b38c243c32bb2e7bbf1604172713011d62fff9dd019a
test-bench-deps: {}
maintainer: Edward Kmett <ekmett@gmail.com>
synopsis: A fast, minimal parser
changelog: |
  # 0

  * Split off from `engine`
basic-deps:
  bytestring: -any
  base: '>=4.15 && <5'
  data-default: -any
  containers: -any
  ghc-prim: -any
  attoparsec: -any
  primitive: -any
all-versions:
- '0'
author: Edward Kmett
latest: '0'
description-type: markdown
description: |+
  parsnip
  =====

  [![Hackage](https://img.shields.io/hackage/v/parsnip.svg)](https://hackage.haskell.org/package/parsnip)

  This is a rather minimal parsing library for applications that do not need nice
  error messages. Use with a library like `parsers` or `parser-combinators` to
  fill in the missing functionality.

  Contact Information
  -------------------

  Contributions and bug reports are welcome! Please feel free to contact me
  through github or on the #haskell IRC channel on irc.freenode.net.

  -Edward Kmett

license-name: (BSD-2-Clause OR Apache-2.0)
Add kustomize.yaml to the Tekton folder
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
- account.yaml
- build-push-ma-base-image.yaml
- publish.yaml
- release-pipeline-nightly.yaml
- release-pipeline.yaml
- resources.yaml
Update from Hackage at 2021-03-24T00:20:48Z
homepage: https://github.com/ekmett/codex/tree/master/atlas#readme
changelog-type: markdown
hash: 32c168b5b9fd4e843376ca9eb02ef33b233392cdbbcbf0b87c5f2cee00209861
test-bench-deps:
  base: '>=4.11 && <5'
  hspec: -any
  transformers: -any
  atlas: -any
  primitive: -any
maintainer: Edward Kmett <ekmett@gmail.com>
synopsis: Skyline rectangle packing
changelog: |
  # 0

  * Split off from `codex`
basic-deps:
  inline-c: -any
  base: '>=4.11 && <5'
  data-default: -any
  containers: -any
  lens: -any
  transformers: -any
  template-haskell: -any
  primitive: -any
all-versions:
- '0'
author: Edward Kmett
latest: '0'
description-type: markdown
description: |+
  atlas
  =====

  [![Hackage](https://img.shields.io/hackage/v/atlas.svg)](https://hackage.haskell.org/package/atlas)
  [![Build Status](https://secure.travis-ci.org/ekmett/atlas.png?branch=master)](http://travis-ci.org/ekmett/atlas)

  This package provides 2D skyline rectangle packing, which can be useful for
  building texture atlases and the like.

  Contact Information
  -------------------

  Contributions and bug reports are welcome! Please feel free to contact me
  through github or on the #haskell IRC channel on irc.freenode.net.

  -Edward Kmett

license-name: (BSD-2-Clause OR Apache-2.0)
Create role to copy the contents of the skeleton directory intelligently depending on our virtual architecture.
---
# This Ansible role copies the contents of the skeleton directory /etc/skel to
# the home directory of vagrant on a virtual machine, or to /root/ on a
# container.
#
# TYPE must either be "vm" or "container".

- name: Copy skeleton contents (/etc/skel/*) to root on containers.
  command: cp --recursive /etc/skel/. /root/
  when: TYPE == "container"

- name: Create temporary directory for skeleton transfer on vms.
  command: mktemp
  register: temp_creation
  when: TYPE == "vm"

- name: Copy skeleton contents (/etc/skel/*) to the temporary directory on vms.
  command: cp --recursive /etc/skel/. {{ temp_creation.stdout }}/
  when: TYPE == "vm"

- name: Set mode of transferred skeleton files.
  file:
    group: vagrant
    owner: vagrant
    path: {{ temp_creation.stdout }}
    recurse: yes
    state: directory
  when: TYPE == "vm"

- name: Copy temporary skeleton files to the vagrant home directory.
  command: cp --preserve --recursive {{ temp_creation.stdout }}/. /home/vagrant/
  when: TYPE == "vm"
Add new issues to a project board
name: Add issues to review project

on:
  issues:
    types:
    - opened

jobs:
  get-user-type:
    runs-on: ubuntu-latest
    outputs:
      ignored: ${{ steps.set-output.outputs.ignored }}
    steps:
    - name: Install dependencies
      run: |
        npm install node-fetch@2
    - uses: actions/github-script@v5
      name: "Determing HQ user or not"
      id: set-output
      with:
        script: |
          const fetch = require('node-fetch');
          const response = await fetch('https://collaboratorsv2.euwest01.umbraco.io/umbraco/api/users/IsIgnoredUser', {
            method: 'post',
            body: JSON.stringify('${{ github.event.issue.user.login }}'),
            headers: {
              'Authorization': 'Bearer ${{ secrets.OUR_BOT_API_TOKEN }}',
              'Content-Type': 'application/json'
            }
          });
          var isIgnoredUser = true;
          try {
            if(response.status === 200) {
              const data = await response.text();
              isIgnoredUser = data === "true";
            } else {
              console.log("Returned data not indicate success:", response.status);
            }
          } catch(error) {
            console.log(error);
          };
          core.setOutput("ignored", isIgnoredUser);
          console.log("Ignored is", isIgnoredUser);
  add-to-project:
    if: needs.get-user-type.outputs.ignored == 'false'
    runs-on: ubuntu-latest
    needs: [get-user-type]
    steps:
    - uses: actions/add-to-project@main
      with:
        project-url: https://github.com/orgs/${{ github.repository_owner }}/projects/21
        github-token: ${{ secrets.ADD_TO_PROJECT_PAT }}
Add SDL dev libs to Travis install config
language: rust
rust:
- stable
- beta
- nightly
matrix:
  allow_failures:
  - rust: nightly
language: rust
rust:
- stable
- beta
- nightly
matrix:
  allow_failures:
  - rust: nightly
install:
- wget https://www.libsdl.org/release/SDL2-2.0.4.tar.gz -O sdl2.tar.gz
- tar xzf sdl2.tar.gz
- pushd SDL2-2.0.4 && ./configure && make && sudo make install && popd
- wget -q http://www.libsdl.org/projects/SDL_ttf/release/SDL2_ttf-2.0.12.tar.gz
- wget -q http://www.libsdl.org/projects/SDL_image/release/SDL2_image-2.0.0.tar.gz
- wget -q http://internode.dl.sourceforge.net/project/sdl2gfx/SDL2_gfx-1.0.1.tar.gz
- tar xzf SDL2_ttf-*.tar.gz
- tar xzf SDL2_image-*.tar.gz
- tar xzf SDL2_gfx-*.tar.gz
- pushd SDL2_ttf-* && ./configure && make && sudo make install && popd
- pushd SDL2_image-* && ./configure && make && sudo make install && popd
- pushd SDL2_gfx-* && ./autogen.sh && ./configure && make && sudo make install && popd
Add release note file for next release
---
fixes:
- Prevent the git indexer to uselessly fetch same pack at each run. Bug seen on
  centos 7.
- Fix get stat by tag that was broken by the recent project schema change.
- Fix manual release definition that was broken by the recent project schema
  change.
Update from Hackage at 2019-08-19T05:06:57Z
homepage: https://github.com/3noch/postgresql-simple-interpolate
changelog-type: ''
hash: 33deb8eb397d24d77883f53ceb7123df2124d28d0e340767e9315e96771235b5
test-bench-deps: {}
maintainer: eacameron@gmail.com
synopsis: Interpolated SQL queries via quasiquotation
changelog: ''
basic-deps:
  base: ! '>=4.5 && <5'
  parsec: ==3.1.*
  postgresql-simple: -any
  mtl: ! '>=2.1 && <2.3'
  haskell-src-meta: ! '>=0.6 && <0.9'
  template-haskell: -any
all-versions:
- '0.1'
author: Elliot Cameron
latest: '0.1'
description-type: haddock
description: Interpolated SQL queries via quasiquotation
license-name: BSD-3-Clause
Update from Hackage at 2022-05-09T11:42:57Z
homepage: https://github.com/NorfairKing/linkcheck#readme
changelog-type: markdown
hash: 4b74a193ee39f1b39f2c51ab4e95025af736679182bb6a78c7bf79916779946a
test-bench-deps:
  base: '>=4.7 && <5'
  linkcheck: -any
maintainer: syd@cs-syd.eu
synopsis: Check for broken links in CI
changelog: |+
  # Changelog

  ## [0.1.0.0] - 2022-05-09

  ### Added

  * Linkcheck now caches requests when fragment checking is turned on.

basic-deps:
  http-client: -any
  bytestring: -any
  lrucache: -any
  unliftio: -any
  stm: -any
  path: -any
  base: '>=4.7 && <5'
  text: -any
  retry: -any
  conduit: -any
  linkcheck: -any
  containers: -any
  http-client-tls: -any
  network-uri: -any
  mtl: -any
  monad-logger: -any
  optparse-applicative: -any
  tagsoup: -any
  http-types: -any
  aeson: -any
  path-io: -any
all-versions:
- 0.1.0.0
author: Tom Sydney Kerckhove
latest: 0.1.0.0
description-type: haddock
description: ''
license-name: MIT
Update from Hackage at 2019-09-22T15:05:14Z
homepage: https://github.com/gojup/http-mock#readme
changelog-type: markdown
hash: c4c821e59bb16f3199cc81fadc0fa61dd3f4d538cb49a93d1d2ae565b3dd6dbf
test-bench-deps:
  base: ! '>=4.7 && <5'
  http-mock: -any
maintainer: f.rincon@protonmail.com
synopsis: HTTP mocking and expectations library for Haskell
changelog: |
  # Changelog for http-mock

  ## Unreleased changes
basic-deps:
  warp: -any
  http-client: -any
  wai: -any
  base: ! '>=4.7 && <5'
  filepath: -any
  network: -any
  async: -any
  random: -any
  directory: -any
all-versions:
- 0.1.0.0
author: Fernando Rincon Martin
latest: 0.1.0.0
description-type: markdown
description: "# http-mock\n\nHTTP mocking and expectations library for Haskell\n\n## Status\nThis project is aiming to provide a complete http mocking \nsystem for allow developers of http client libraries to test\ntheir clients.\n\nCurrently the project is very basic and is limited to allow\nconnect a http-client to a Wai `Application`. This is enough\nto build any kind of mock server.\n\n\n\n"
license-name: Apache-2.0
Add a role to deploy /etc/hosts for vpn names
# vim: set ft=ansible
---
- name: generate hosts entries
  lineinfile:
    dest: /etc/hosts
    regexp: "{{ item.name | replace('.', '\\.') }}"
    line: "{{ item.net | ipaddr('1') | ipaddr('address') }} {{ item.name }}"
  with_items: vpn.static_clients
Allow pull request builds in AppVeyor
version: '{build}'
branches:
  only:
  - master
  - /pull\/.+\/merge/
image: Visual Studio 2017
environment:
  MYGET_API_KEY:
    secure: 78qy8e6pKfJlQV7RAG5tJOWegzXpjASkUs3aFdVBoPYA5gi6+mWdjbuAmNa5OQPe
nuget:
  disable_publish_on_pr: true
assembly_info:
  patch: false
configuration:
- Release
before_build:
- cmd: nuget sources update -Name nuget.org -Source https://api.nuget.org/v3/index.json
- cmd: git remote set-url origin https://github.com/baunegaard/expressioncache.git
build_script:
- ps: .\build.ps1
test: off
deploy: off
version: '{build}'
branches:
  only:
  - master
  - /pull\/.+\/merge/
image: Visual Studio 2017
environment:
  MYGET_API_KEY:
    secure: 78qy8e6pKfJlQV7RAG5tJOWegzXpjASkUs3aFdVBoPYA5gi6+mWdjbuAmNa5OQPe
assembly_info:
  patch: false
configuration:
- Release
before_build:
- cmd: nuget sources update -Name nuget.org -Source https://api.nuget.org/v3/index.json
- cmd: git remote set-url origin https://github.com/baunegaard/expressioncache.git
build_script:
- ps: .\build.ps1
test: off
deploy: off
Make mkfs.xfs work by growing the loopback device
rjil::db::bind_address: "%{ipaddress_eth2}"
keystone::admin_bind_host: "%{ipaddress_eth2}"
keystone::public_bind_host: "%{ipaddress_eth2}"
glance::bind_host: "%{ipaddress_eth2}"
#glance::registry::bind_host: "%{ipaddress_eth0}"
rjil::jiocloud::etcd::addr: "%{ipaddress_eth2}:4001"
rjil::jiocloud::etcd::peer_addr: "%{ipaddress_eth2}:7001"
rjil::haproxy::openstack::keystone_ips: "%{lookup_array('services::keystone::eth2')}"
rjil::haproxy::openstack::keystone_internal_ips: "%{lookup_array('services::keystone::eth2')}"
rjil::haproxy::openstack::glance_ips: "%{lookup_array('services::glance::eth2')}"
rjil::keystone::public_address: "%{ipaddress_eth2}"
rjil::ceph::storage_cluster_if: eth2
rjil::ceph::public_if: eth2
rjil::ceph::osd::autogenerate: true
rjil::ceph::osd::autodisk_size: 50
rjil::ceph::osd::osd_journal_size: 2
rjil::db::bind_address: "%{ipaddress_eth2}"
keystone::admin_bind_host: "%{ipaddress_eth2}"
keystone::public_bind_host: "%{ipaddress_eth2}"
glance::bind_host: "%{ipaddress_eth2}"
#glance::registry::bind_host: "%{ipaddress_eth0}"
rjil::jiocloud::etcd::addr: "%{ipaddress_eth2}:4001"
rjil::jiocloud::etcd::peer_addr: "%{ipaddress_eth2}:7001"
rjil::haproxy::openstack::keystone_ips: "%{lookup_array('services::keystone::eth2')}"
rjil::haproxy::openstack::keystone_internal_ips: "%{lookup_array('services::keystone::eth2')}"
rjil::haproxy::openstack::glance_ips: "%{lookup_array('services::glance::eth2')}"
rjil::keystone::public_address: "%{ipaddress_eth2}"
rjil::ceph::storage_cluster_if: eth2
rjil::ceph::public_if: eth2
rjil::ceph::osd::autogenerate: true
rjil::ceph::osd::autodisk_size: 80
rjil::ceph::osd::osd_journal_size: 2
Add Korean version language file
index:
  home: 'ν™ˆ'
  search: '검색'
  archive: 'μ•„μΉ΄μ΄λΈŒ'
  category: 'μΉ΄ν…Œκ³ λ¦¬'
  uncategorized: 'λ―Έμ§€μ •'
  tag: 'νƒœκ·Έ'
nav:
  next: 'λ‹€μŒ'
  prev: '이전'
  older: '이전 κΈ€'
  newer: 'λ‹€μŒ κΈ€'
widget:
  recents: '졜근 κΈ€'
  archives: 'μ•„μΉ΄μ΄λΈŒ'
  categories: 'μΉ΄ν…Œκ³ λ¦¬'
  links: '링크'
  tags: 'νƒœκ·Έ'
  tag_cloud: 'νƒœκ·Έ ν΄λΌμš°λ“œ'
article:
  more: 'μžμ„Ένžˆ 보기'
  comments: 'λŒ“κΈ€'
  share: 'κ³΅μœ ν•˜κΈ°'
  catalogue: 'μΉ΄νƒˆλ‘œκ·Έ'
profile:
  follow: 'νŒ”λ‘œμš°'
  post: '포슀트'
  tag: 'νƒœκ·Έ'
  posts: '포슀트'
  tags: 'νƒœκ·Έ'