Dataset columns: Instruction (string, length 14–778), input_code (string, length 0–4.24k), output_code (string, length 1–5.44k).
Use Travis CI container architecture
language: ruby
rvm:
  - 1.9.3
  - 2.0.0
  - 2.1
  - 2.2
  - jruby-19mode
  - rbx-2
gemfile:
  - gemfiles/rails_3.2.gemfile
  - gemfiles/rails_4.0.gemfile
  - gemfiles/rails_4.1.gemfile
  - gemfiles/rails_4.2.gemfile
matrix:
  allow_failures:
    - rbx-2
language: ruby
rvm:
  - 1.9.3
  - 2.0.0
  - 2.1
  - 2.2
  - jruby-19mode
  - rbx-2
gemfile:
  - gemfiles/rails_3.2.gemfile
  - gemfiles/rails_4.0.gemfile
  - gemfiles/rails_4.1.gemfile
  - gemfiles/rails_4.2.gemfile
matrix:
  allow_failures:
    - rbx-2
sudo: false
Update from Hackage at 2022-05-02T15:29:34Z
homepage: https://github.com/unfoldml/jsonl
changelog-type: ''
hash: 68f5451721c5ebbccba027727e39cd601c8eb4b67f0315d8b55f9e563f5f3695
test-bench-deps: {}
maintainer: UnfoldML AB
synopsis: JSON Lines
changelog: ''
basic-deps:
  bytestring: -any
  base: '>=4.7 && <5'
  aeson: -any
all-versions:
  - 0.1.0.0
author: Marco Zocca
latest: 0.1.0.0
description-type: markdown
description: |
  # jsonl

  Adapter between 'aeson' and the JSONL format (https://jsonlines.org/)
license-name: BSD-3-Clause
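The jsonl package above adapts aeson to the JSON Lines format, which is language-neutral: one JSON document per line. A minimal Python sketch of the round trip (an illustration of the format, not part of the Haskell package):

```python
import json

def dumps_jsonl(records):
    """Serialize an iterable of objects to JSON Lines: one JSON doc per line."""
    return "".join(json.dumps(rec) + "\n" for rec in records)

def loads_jsonl(text):
    """Parse JSON Lines text back into a list of Python objects,
    skipping blank lines as the format allows."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]
```

Because each line is an independent JSON value, the format streams well: a consumer can parse line by line without loading the whole file.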
Update from Hackage at 2021-03-20T17:54:17Z
homepage: ''
changelog-type: markdown
hash: e0aee781d8f43c551b09b5918d3527c360068840c80308a03d45484c249c187b
test-bench-deps:
  streaming: -any
  base: '>=4.7 && <5'
  hspec: -any
  streaming-nonempty: -any
maintainer: paolo.veronelli@gmail.com
synopsis: Add support for non-empty streams to the Streaming lib
changelog: |
  # Changelog for streaming-nonempty

  ## Unreleased changes
basic-deps:
  streaming: -any
  base: '>=4.7 && <5'
all-versions:
  - 0.1.0.0
author: Paolo Veronelli
latest: 0.1.0.0
description-type: markdown
description: |
  # streaming-nonempty

  A correct _groupBy_ function is expected to produce only non-empty groups, which means we should be able to fold them up with only a Semigroup constraint on the values. This is not the case for `Data.List.groupBy`, nor for the Streaming lib counterpart.

  This package exports a _groupBy_ function that produces `f (Stream f m a)` groups, which are then guaranteed to have the functorial layer.

  The `NEStream (Of a) m r` newtype supports _sconcat_, which means we can define

  ```haskell
  groupSemigroupBy :: (Semigroup a, Monad m) => (a -> a -> Bool) -> Stream (Of a) m r -> Stream (Of a) m r
  groupSemigroupBy f = S.mapped sconcat . groupBy f
  ```

  with the expected behavior of collapsing groups using semigroup composition.

  In contrast, using the standard _groupBy_ we are stuck with

  ```haskell
  groupMonoidBy :: (Monoid a, Monad m) => (a -> a -> Bool) -> Stream (Of a) m r -> Stream (Of a) m r
  groupMonoidBy f = S.mapped mconcat . groupBy f
  ```

  It would be legitimate to use an `sconcatUnsafe` that panics on empty streams, because *we know* groups are not empty.
license-name: BSD-3-Clause
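The key observation in the description above — non-empty groups can be folded with only a combine operation, never needing an identity element — can be sketched in Python (an analogue of the `groupSemigroupBy` idea, not the package's actual code; `group_semigroup_by` is an illustrative name):

```python
def group_semigroup_by(same, xs, combine):
    """Group adjacent elements related by `same` (compared against the
    group's first element, like Haskell's groupBy) and collapse each
    group with a binary `combine`. Because every group is non-empty,
    no identity element (monoid) is ever required -- a semigroup-style
    combine suffices."""
    out = []  # list of (group_head, accumulated_value) pairs
    for x in xs:
        if out and same(out[-1][0], x):
            out[-1] = (out[-1][0], combine(out[-1][1], x))
        else:
            out.append((x, x))
    return [acc for _, acc in out]
```

For example, grouping equal adjacent numbers and combining with `+` sums each run, with no need for a zero value.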
Remove git submodules from build script as they are automatically updated by fdroid. Also decrement wrongly changed version numbers.
Categories:
  - Security
  - Multimedia
License: GPL-3.0-or-later
AuthorName: Thomas Nibler
AuthorEmail: cryptocam@tnibler.de
SourceCode: https://gitlab.com/tnibler/cryptocam/tree/HEAD
IssueTracker: https://gitlab.com/tnibler/cryptocam/issues
AutoName: Cryptocam
RepoType: git
Repo: https://gitlab.com/tnibler/cryptocam

Builds:
  - versionName: '1.0'
    versionCode: 1
    commit: v1.0.2
    submodules: true
    gradle:
      - yes
    output: app/build/outputs/apk/release/app-release-unsigned.apk
    build: ANDROID_SDK_HOME=$$SDK$$ ANDROID_SDK_ROOT=$$SDK$$ ANDROID_NDK_HOME=$$NDK$$ ./complete_build.sh
    ndk: r21d

AutoUpdateMode: Version v%v
UpdateCheckMode: Tags
Exclude auto-generated and 3rd party code from Code Climate
exclude_paths:
  # Auto-generated classes from the PEG parser generator
  - TYPO3.Eel/Classes/TYPO3/Eel/FlowQuery/FizzleParser.php
  - TYPO3.Eel/Classes/TYPO3/Eel/EelParser.php
  - TYPO3.Eel/Classes/TYPO3/Eel/AbstractParser.php
  # 3rd party code
  - TYPO3.Eel/Resources/Private/PHP/php-peg/**
  - TYPO3.Flow/Resources/Public/Security/JavaScript/JSBN/**
  - TYPO3.Flow/Resources/PHP/iSecurity/**
Add a Travis testing file.
language: ruby
rvm:
  - 1.9.3
  - 2.0.0
  - 2.1.0
env:
  global:
    - TESLA_EMAIL=elon.musk@teslamotors.com
    - TESLA_PASS=oilLOL
    - TESLA_CLIENT_ID=1
    - TESLA_CLIENT_SECRET=2
Convert old CI script to Travis.
language: ruby
rvm:
  - 1.9.3
  - 2.0.0
  - 2.1.2
  - jruby-1.6
  - jruby-1.7
env:
  - ACTIVESUPPORT_VERSION=3.0
  - ACTIVESUPPORT_VERSION=3.1
  - ACTIVESUPPORT_VERSION=3.2
matrix:
  include:
    - rvm: 1.8.7
      env: ACTIVESUPPORT_VERSION=2.3
install:
  - "bundle update"
script: "bundle exec rake ci:spec"
Add Travis CI yml file
language: java
jdk:
  - oraclejdk8
  - oraclejdk7
  - openjdk6
sudo: false # as per http://blog.travis-ci.com/2014-12-17-faster-builds-with-container-based-infrastructure/
Remove the C extensions default config to silence warnings
language: ruby sudo: false cache: bundler install: true script: - 'if [[ "$TRAVIS_RUBY_VERSION" =~ "jruby" ]]; then rvm get head && rvm reload && rvm use --install $TRAVIS_RUBY_VERSION; fi' - 'bundle install' - 'bundle exec rake test:coverage' rvm: - 2.0.0 - 2.1.0 - 2.1.1 - 2.1.2 - 2.1.3 - 2.1.4 - 2.1.5 - 2.1.6 - 2.1.7 - 2.2.0 - 2.2.1 - 2.2.2 - 2.2.3 - rbx-2 - jruby-9000 - jruby-head - ruby-head matrix: allow_failures: - rvm: rbx-2 - rvm: jruby-head - rvm: ruby-head
language: ruby sudo: false cache: bundler install: true env: global: - JRUBY_OPTS=--dev script: - 'if [[ "$TRAVIS_RUBY_VERSION" =~ "jruby" ]]; then rvm get head && rvm reload && rvm use --install $TRAVIS_RUBY_VERSION; fi' - 'bundle install' - 'bundle exec rake test:coverage' rvm: - 2.0.0 - 2.1.0 - 2.1.1 - 2.1.2 - 2.1.3 - 2.1.4 - 2.1.5 - 2.1.6 - 2.1.7 - 2.2.0 - 2.2.1 - 2.2.2 - 2.2.3 - rbx-2 - jruby-9000 - jruby-head - ruby-head matrix: allow_failures: - rvm: rbx-2 - rvm: jruby-head - rvm: ruby-head
Fix the integration tests by specifying which sln to build
version: 1.0.{build}
environment:
  matrix:
    - Configuration: Release
    - Configuration: Debug
before_build:
  - nuget restore
  - appveyor DownloadFile https://raw.githubusercontent.com/dbolkensteyn/pr-analysis/ondemand/msbuild-sonarqube-runner-begin.cmd && msbuild-sonarqube-runner-begin
after_build:
  - appveyor DownloadFile https://raw.githubusercontent.com/dbolkensteyn/pr-analysis/ondemand/msbuild-sonarqube-runner-end.cmd && msbuild-sonarqube-runner-end
notifications:
  - provider: HipChat
    room: 409390
    auth_token:
      secure: RW8+2GpOWo3PcoM3ehoI+mbfUr7h508RtTDyszpR6/E=
    on_build_success: false
    on_build_failure: true
    on_build_status_changed: false
version: 1.0.{build}
environment:
  matrix:
    - Configuration: Release
    - Configuration: Debug
before_build:
  - nuget restore
  - appveyor DownloadFile https://raw.githubusercontent.com/dbolkensteyn/pr-analysis/ondemand/msbuild-sonarqube-runner-begin.cmd && msbuild-sonarqube-runner-begin
build:
  verbosity: minimal
  project: SonarQube.MSBuild.Runner.sln
after_build:
  - appveyor DownloadFile https://raw.githubusercontent.com/dbolkensteyn/pr-analysis/ondemand/msbuild-sonarqube-runner-end.cmd && msbuild-sonarqube-runner-end
notifications:
  - provider: HipChat
    room: 409390
    auth_token:
      secure: RW8+2GpOWo3PcoM3ehoI+mbfUr7h508RtTDyszpR6/E=
    on_build_success: false
    on_build_failure: true
    on_build_status_changed: false
Add bob.db.casia_fasd recipe [skip appveyor]
{% set version = "2.0.5" %}

package:
  name: bob.db.casia_fasd
  version: {{ version }}

source:
  fn: bob.db.casia_fasd-{{ version }}.zip
  md5: 54bf2328f118b24df793a486d00c242a
  url: https://pypi.python.org/packages/source/b/bob.db.casia_fasd/bob.db.casia_fasd-{{ version }}.zip

build:
  number: 0
  skip: true  # [not linux]
  script: python -B setup.py install --single-version-externally-managed --record record.txt

requirements:
  build:
    - python
    - setuptools
    - six
    - bob.db.base
    - antispoofing.utils
  run:
    - python
    - six
    - bob.db.base
    - antispoofing.utils

test:
  commands:
    - nosetests -sv bob.db.casia_fasd
  imports:
    - bob
    - bob.db
    - bob.db.casia_fasd
  requires:
    - nose

about:
  home: http://pypi.python.org/pypi/xbob.db.casia_fasd
  license: GNU General Public License v3 (GPLv3)
  summary: CASIA Face Anti-Spoofing Database Access API for Bob

extra:
  recipe-maintainers:
    - 183amir
Update from Hackage at 2018-03-16T00:56:26Z
homepage: https://github.com/nomeata/containers-verified
changelog-type: markdown
hash: 18dbe519685935ef9c833a666278e48a1e4f77bb58676dc719eaf5e84f95d348
test-bench-deps: {}
maintainer: mail@joachim-breitner.de
synopsis: Formally verified drop-in replacement of containers
changelog: |
  # Revision history for containers-verified

  ## 0.5.11.0 -- 2018-03-15

  * First version.
basic-deps:
  containers: ==0.5.11.0
all-versions:
  - '0.5.11.0'
author: Joachim Breitner
latest: '0.5.11.0'
description-type: haddock
description: |
  In the context of the <https://deepspec.org/main DeepSpec project>, parts of the
  <http://hackage.haskell.org/package/containers containers> library were formally verified
  using <https://github.com/antalsz/hs-to-coq hs-to-coq> and the interactive theorem prover Coq.

  This package depends on precisely the verified version of containers and re-exports the
  verified parts of the API, with module names and function names unchanged. If you happen to
  use only the verified subset of the API, then you can simply change @containers@ to
  @containers-verified@ in your @.cabal@ file and earn bragging rights about using verified
  data structures in your project. Because the types from @containers@ are re-exported, you
  can still interface with other libraries that depend on @containers@ directly.

  If you happen to need additional modules or functions, you will have to depend on both
  @containers@ and @containers-verified@, and use
  <https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/glasgow_exts.html#package-qualified-imports package-qualified imports>
  to disambiguate.

  This package does not re-export any of the @….Internals@ modules. We cannot control which
  type class instances are re-exported; these therefore may give you access to unverified
  code. Also, the @containers@ code contains some CPP directives; these can enable different
  code on your machine than the code that we verified (e.g. different bit-widths).

  To learn more about what exactly has been verified, and how wide the formalization gap is,
  see the paper “Ready, Set, Verify! Applying hs-to-coq to non-trivial Haskell code” by
  Joachim Breitner, Antal Spector-Zabusky, Yao Li, Christine Rizkallah, John Wiegley and
  Stephanie Weirich.

  The long-term maintenance plan for this package is not fleshed out yet, and certainly
  depends on user demand. Let us know your needs! (And your technical or financial abilities
  to contribute...)
license-name: MIT
Add or update the App Service deployment workflow configuration from Azure Portal.
# Docs for the Azure Web Apps Deploy action: https://github.com/Azure/webapps-deploy # More GitHub Actions for Azure: https://github.com/Azure/actions name: Build and deploy ASP app to Azure Web App - chillin on: push: branches: - master workflow_dispatch: jobs: build: runs-on: 'windows-latest' steps: - uses: actions/checkout@v2 - name: Setup MSBuild path uses: microsoft/setup-msbuild@v1.0.2 - name: Setup NuGet uses: NuGet/setup-nuget@v1.0.5 - name: Restore NuGet packages run: nuget restore - name: Publish to folder run: msbuild /nologo /verbosity:m /t:Build /t:pipelinePreDeployCopyAllFilesToOneFolder /p:_PackageTempDir="\published\" - name: Upload artifact for deployment job uses: actions/upload-artifact@v2 with: name: ASP-app path: '/published/**' deploy: runs-on: 'windows-latest' needs: build environment: name: 'production' url: ${{ steps.deploy-to-webapp.outputs.webapp-url }} steps: - name: Download artifact from build job uses: actions/download-artifact@v2 with: name: ASP-app - name: Deploy to Azure Web App id: deploy-to-webapp uses: azure/webapps-deploy@v2 with: app-name: 'chillin' slot-name: 'production' publish-profile: ${{ secrets.AzureAppService_PublishProfile_f7fbc730e109473e9d38bf00151d2cab }} package: .
Leverage a GitHub workflow to build the Maven site on merges to main and on pull requests.
# This workflow will build a Java project with Maven
# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven

name: CI for Maven site

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up JDK 15
        uses: actions/setup-java@v1
        with:
          java-version: 15.0.2
      - name: Build Maven site
        run: mvn -B site --file pom.xml
Use Jasmine as the test runner for JavaScript from Python
# src_files # # Return an array of filepaths relative to src_dir to include before jasmine specs. # Default: [] # # EXAMPLE: # # src_files: # - lib/source1.js # - lib/source2.js # - dist/**/*.js # src_files: - js/jquery.min.js - js/jqm-config.js - js/jquery.mobile.min.js - js/jquery.notify.min.js - js/lodash.min.js - js/backbone.min.js - js/app-config.js - js/app-models.js - js/app-views.js - js/app-router.js # stylesheets # # Return an array of stylesheet filepaths relative to src_dir to include before jasmine specs. # Default: [] # # EXAMPLE: # # stylesheets: # - css/style.css # - stylesheets/*.css # stylesheets: # helpers # # Return an array of filepaths relative to spec_dir to include before jasmine specs. # Default: ["helpers/**/*.js"] # # EXAMPLE: # # helpers: # - helpers/**/*.js # helpers: - "helpers/**/*.js" # spec_files # # Return an array of filepaths relative to spec_dir to include. # Default: ["**/*[sS]pec.js"] # # EXAMPLE: # # spec_files: # - **/*[sS]pec.js # spec_files: - "**/*[Ss]pec.js" # src_dir # # Source directory path. Your src_files must be returned relative to this path. Will use root if left blank. # Default: project root # # EXAMPLE: # # src_dir: public # src_dir: "webfront" # spec_dir # # Spec directory path. Your spec_files must be returned relative to this path. # Default: spec/javascripts # # EXAMPLE: # # spec_dir: spec/javascripts # spec_dir: webfront/tests/spec
Set up CI with Azure Pipelines
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python

trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

strategy:
  matrix:
    Python27:
      python.version: '2.7'
    Python35:
      python.version: '3.5'
    Python36:
      python.version: '3.6'
    Python37:
      python.version: '3.7'

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'
    displayName: 'Use Python $(python.version)'

  - script: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
    displayName: 'Install dependencies'

  - script: |
      pip install pytest pytest-azurepipelines
      pytest
    displayName: 'pytest'
Add Istanbul config to set high watermarks
verbose: false
instrumentation:
  root: .
  default-excludes: true
  excludes: []
  embed-source: false
  variable: __coverage__
  compact: true
  preserve-comments: false
  complete-copy: false
  save-baseline: false
  baseline-file: ./coverage/coverage-baseline.json
reporting:
  print: summary
  reports:
    - lcov
  dir: ./coverage
  watermarks:
    statements: [80, 99]
    lines: [80, 99]
    functions: [80, 99]
    branches: [80, 99]
hooks:
  hook-run-in-context: false
  post-require-hook: null
Include BMI cfg file for gpp + lai study
config: /home/mapi8461/projects/ILAMB-experiments/2/MsTMIP-gpp-lai.cfg
ilamb_root: /nas/data/ilamb
model_root: MODELS/MsTMIP
regions:
models:
  - BIOME-BGC
  - CLASS-CTEM-N
  - CLM4
  - CLM4VIC
  - GTEC
  - ISAM
  - LPJ-wsl
  - ORCHIDEE-LSCE
  - SiB3
  - SiBCASA
confrontations:
Update from Hackage at 2018-01-25T17:16:37Z
homepage: https://github.com/chessai/acme-cuteboy
changelog-type: markdown
hash: 131d151f30d358ccced05d971a783baf8e97145959ce1ef922ca6bc73efc2601
test-bench-deps: {}
maintainer: chessai1996@gmail.com
synopsis: Maybe gives you a cute boy
changelog: |
  # Revision history for acme-cuteboy

  ## 0.1.0.0 -- YYYY-mm-dd

  * First version. Released on an unsuspecting world.
basic-deps:
  base: '>=3 && <6'
  acme-cuteboy: -any
all-versions:
  - '0.1.0.0'
author: chessai
latest: '0.1.0.0'
description-type: haddock
description: A package which exists solely to try and give the user a cute boy. Executable and library are both available.
license-name: PublicDomain
Update from Hackage at 2018-03-15T11:42:02Z
homepage: https://github.com/aiya000/hs-random-names
changelog-type: ''
hash: 022d001b1f814e5a66b764b09b54be0e773f0bbc371b79839bdbde20ffcc5828
test-bench-deps: {}
maintainer: aiya000 <aiya000.develop@gmail.com>
synopsis: Expose Random and Arbitrary instances
changelog: ''
basic-deps:
  base: '>=4.7 && <5'
  text: -any
  random: -any
  QuickCheck: -any
  safe: -any
all-versions:
  - '0.1.0.0'
author: aiya000
latest: '0.1.0.0'
description-type: haddock
description: Random, Arbitrary instances for naming cases (PascalCase, camelCase, etc)
license-name: MIT
Set up CI with Azure Pipelines
queue: Hosted VS2017

steps:
  - checkout: self
    clean: true

  - task: MSBuild@1
    displayName: Restore
    inputs:
      solution: src/Moq.sln
      msbuildArguments: /t:Restore /p:Configuration=Release /m

  - task: MSBuild@1
    displayName: Version
    inputs:
      solution: src/Moq/Moq.Package/Moq.Package.nuproj
      msbuildArguments: /t:Version

  - task: MSBuild@1
    displayName: Build
    inputs:
      solution: src/Moq.sln
      msbuildArguments: /bl:"$(Build.ArtifactStagingDirectory)\build.binlog" /p:PackageOutputPath=$(Build.ArtifactStagingDirectory) /p:Configuration=Release

  - task: VSTest@2
    displayName: Test
    inputs:
      testAssemblyVer2: src\*\*\bin\*\*.Tests.dll
      runInParallel: 'true'
      codeCoverageEnabled: 'true'
      publishRunAttachments: 'true'

  - task: PublishBuildArtifacts@1
    displayName: Publish Artifact
    inputs:
      PathtoPublish: $(Build.ArtifactStagingDirectory)
      ArtifactName: out
      ArtifactType: Container
    condition: always()
Add GitHub Actions for CI (run tests)
name: CI

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        ruby-version: ['2.7', '3.0', '3.1']
    steps:
      - uses: actions/checkout@v2
      - name: Set up Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: ${{ matrix.ruby-version }}
          bundler-cache: true
      - name: Run tests
        run: bundle exec rspec
Update from Hackage at 2016-08-12T20:52:44+0000
homepage: ''
changelog-type: ''
hash: ff56c3ad3dc6de241ae99b14493800689d9acc6a00c17208c01686a4eaab738e
test-bench-deps: {}
maintainer: Index Int <vlad.z.4096@gmail.com>
synopsis: Higher order versions of MTL classes
changelog: ''
basic-deps:
  base: '>=4.6 && <4.10'
  mtl: '>=2.2.1'
  transformers: '>=0.4.2'
all-versions:
  - '0.1'
author: Index Int
latest: '0.1'
description-type: haddock
description: |
  Higher order versions of MTL classes to ease programming with polymorphic recursion
  and reduce UndecidableInstances. See <http://flint.cs.yale.edu/trifonov/papers/sqcc.pdf>
  for further discussion of the approach taken here.
license-name: BSD3
Add length and pattern constraints
id: http://schemas.taskcluster.net/github/v1/github-pull-request-message.json#
$schema: http://json-schema.org/draft-04/schema#
title: "GitHub Pull Request Message"
description: |
  Message reporting that a GitHub pull request has occurred
type: object
properties:
  version: {$const: message-version}
  organizationName:
    description: |
      The GitHub `organizationName` which had an event.
    type: string
  repositoryName:
    description: |
      The GitHub `repositoryName` which had an event.
    type: string
  action:
    description: |
      The GitHub `action` which triggered an event.
    type: string
  details:
    type: object
    description: |
      Metadata describing the pull request.
additionalProperties: false
required:
  - version
  - organizationName
  - repositoryName
  - action
id: http://schemas.taskcluster.net/github/v1/github-pull-request-message.json#
$schema: http://json-schema.org/draft-04/schema#
title: "GitHub Pull Request Message"
description: |
  Message reporting that a GitHub pull request has occurred
type: object
properties:
  version: {$const: message-version}
  organizationName:
    description: |
      The GitHub `organizationName` which had an event.
    type: string
    minLength: {$const: identifier-min-length}
    maxLength: {$const: identifier-max-length}
    pattern: {$const: identifier-pattern}
  repositoryName:
    description: |
      The GitHub `repositoryName` which had an event.
    type: string
    minLength: {$const: identifier-min-length}
    maxLength: {$const: identifier-max-length}
    pattern: {$const: identifier-pattern}
  action:
    description: |
      The GitHub `action` which triggered an event.
    type: string
    minLength: {$const: identifier-min-length}
    maxLength: {$const: identifier-max-length}
    pattern: {$const: identifier-pattern}
  details:
    type: object
    description: |
      Metadata describing the pull request.
additionalProperties: false
required:
  - version
  - organizationName
  - repositoryName
  - action
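The added `minLength`/`maxLength`/`pattern` constraints express a single check: an anchored regex match plus length bounds. A Python sketch of that check (the constant values below are hypothetical stand-ins; the real `identifier-pattern` and length `$const` values are defined elsewhere in the taskcluster schemas):

```python
import re

# Hypothetical stand-ins for the schema's $const values
# (identifier-pattern, identifier-min-length, identifier-max-length).
IDENTIFIER_PATTERN = r"^[A-Za-z0-9._/-]*$"
MIN_LENGTH, MAX_LENGTH = 1, 100

def valid_identifier(s):
    """Apply the kind of validation the minLength/maxLength/pattern
    keywords express: length bounds plus a full anchored regex match."""
    if not (MIN_LENGTH <= len(s) <= MAX_LENGTH):
        return False
    return re.fullmatch(IDENTIFIER_PATTERN, s) is not None
```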
Add release prelude about changing policies
---
prelude: >
  This release leverages oslo.policy's policy-in-code feature to modify the default
  check strings and scope types for nearly all of keystone's API policies. These
  changes make the policies more precise than they were before, using the reader,
  member, and admin roles where previously only the admin role and a catch-all rule
  was available. The changes also take advantage of system, domain, and project
  scope, allowing you to create role assignments for your users that are appropriate
  to the actions they need to perform. Eventually this will allow you to set
  ``[oslo_policy]/enforce_scope=true`` in your keystone configuration, which
  simplifies access control management by ensuring that oslo.policy checks both the
  role and the scope on API requests. However, please be aware that not all policies
  have been converted in this release and some changes are still under development.
  During the transition phase, if you have not overridden a policy, the old default
  and the new default will be OR'd together. This means that, for example, where we
  have changed the policy rule from ``'rule:admin_required'`` to ``'role:reader and
  system_scope:all'``, both policy rules will be in effect. Please check your current
  policies and role assignments before upgrading to ensure the policies will not be
  too permissive for your deployment. To hide the deprecation warnings and opt into
  the less permissive rules, you can override the policy configuration to use the
  newer policy rule.
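The transition behavior described above — old and new defaults OR'd together — can be sketched schematically in Python (a simplified illustration, not oslo.policy's actual API; `old_default` and `new_default` are illustrative names):

```python
def deprecated_rule(old_check, new_check):
    """During the deprecation window a request is allowed if it passes
    either the old default rule or its replacement (they are OR'd)."""
    return lambda creds: old_check(creds) or new_check(creds)

def old_default(creds):
    # Old default, roughly 'rule:admin_required': any admin role passes.
    return "admin" in creds["roles"]

def new_default(creds):
    # New default, roughly 'role:reader and system_scope:all'.
    return "reader" in creds["roles"] and creds.get("system_scope") == "all"

check = deprecated_rule(old_default, new_default)
```

This is why an un-overridden policy stays as permissive as before during the transition: any request the old rule accepted still passes.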
Update from Hackage at 2017-09-03T21:13:43Z
homepage: https://github.com/mtesseract/io-throttle#readme changelog-type: '' hash: e8f3ec4a903046b04225f182a61c5d45c0ce329e647ccf42923984faab714b75 test-bench-deps: test-framework-hunit: -any bytestring: -any test-framework: -any stm: -any base: ! '>=4.7 && <5' text: -any clock: -any async: -any HUnit: -any throttle-io-stream: -any stm-chans: -any say: -any maintainer: mtesseract@silverratio.net synopsis: Throttler between arbitrary IO producer and consumer functions changelog: '' basic-deps: stm: -any base: ! '>=4.7 && <5' clock: -any async: -any stm-chans: -any all-versions: - '0.2.0.0' author: Moritz Schulte latest: '0.2.0.0' description-type: markdown description: ! "# throttle-io-stream [![Hackage version](https://img.shields.io/hackage/v/throttle-io-stream.svg?label=Hackage)](https://hackage.haskell.org/package/throttle-io-stream) [![Stackage version](https://www.stackage.org/package/throttle-io-stream/badge/lts?label=Stackage)](https://www.stackage.org/package/throttle-io-stream) [![Build Status](https://travis-ci.org/mtesseract/throttle-io-stream.svg?branch=master)](https://travis-ci.org/mtesseract/throttle-io-stream)\n\n### About\n\nThis packages provides throttling functionality for arbitrary IO\nproducers and consumers. The core function exported is the following:\n\n```haskell\nthrottle :: ThrottleConf a -- ^ Throttling configuration\n -> IO (Maybe a) -- ^ Input callback\n -> (Maybe a -> IO ()) -- ^ Output callback\n -> IO (Async ()) -- ^ Returns an async handler for this\n -- throttling process\n```\n\nThis will spawn asynchronous operations that\n\n1. consume data using the provided input callback and write it into an\n internal buffering queue and\n\n1. produce data from the buffering queue using the provided consumer\n \ callback.\n" license-name: BSD3
Add support for docker-based test execution.
version: '3.3'
services:
  test:
    image: outrigger/build:php56
    container_name: gdt
    environment:
      COMPOSER_CACHE_DIR: /root/.cache/composer
      NODE_VERSION: 6
      NPM_CONFIG_CACHE: /root/.cache/npm
      GDT_QUIET: 1
    entrypoint: ["/init", "npm", "test"]
    network_mode: bridge
    volumes:
      - .:/code
      # Persist the cache directories associated with various tools.
      # The first volume mount covers: npm, composer, bower, fontconfig, & yarn
      - /data/gdt/cache:/root/.cache
      - /data/gdt/cache/drush:/root/.drush/cache
      - /data/gdt/cache/behat_gherkin:/tmp/behat_gherkin_cache
    working_dir: /code
Add PMC Author Manuscript Collection
name: PMCAMC
type: remote
url:
  - ftp://ftp.ncbi.nlm.nih.gov/pub/pmc/manuscript/PMC002XXXXXX.xml.tar.gz
  - ftp://ftp.ncbi.nlm.nih.gov/pub/pmc/manuscript/PMC003XXXXXX.xml.tar.gz
  - ftp://ftp.ncbi.nlm.nih.gov/pub/pmc/manuscript/PMC004XXXXXX.xml.tar.gz
  - ftp://ftp.ncbi.nlm.nih.gov/pub/pmc/manuscript/PMC005XXXXXX.xml.tar.gz
format: pmcxml
filter: .xml
unzip: true
chunkSize: 2000
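The `chunkSize: 2000` setting batches the article stream into groups of at most 2000 documents. The batching itself amounts to a generic chunking loop (a sketch of the idea, not the ingestion tool's code; `chunked` is an illustrative name):

```python
def chunked(items, chunk_size):
    """Yield successive lists of at most chunk_size items from any iterable."""
    if chunk_size < 1:
        raise ValueError("chunk_size must be >= 1")
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) == chunk_size:
            yield batch
            batch = []
    if batch:  # emit the final, possibly short, batch
        yield batch
```

Working in fixed-size batches keeps memory bounded while still amortizing per-batch overhead (e.g. one index commit per chunk instead of per document).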
Configure Travis for test + PyPI deploy
sudo: false dist: trusty language: python python: - "3.5" - "3.6" install: - "nvm install node" # latest version of node - "nvm use node" - "npm install" - "pip install -r requirements.txt" script: - "gulp assets --env=deploy" - "pytest" deploy: provider: pypi user: sqre-admin password: secure: "csXR1Hq18Omj7nWHXP9iq9N6DxjjOoXCB2S3YSi/cK4neKV/p7+yxqBkWyA8WXpw7lFtlL8Umm0kQGMTK9GAAQnaxCoPPai3ifoUhbmvLn+x1uMkiKL+v9qJtAyjCfzjrnawXKrgyIU/VkgTKJ9C81bPJWP8Qhw2bJFY2usRyOqd7l44opP/njymoaDNdn4a6kHvNin9+eTiJXFrcisihsbzZV5outypyxftx3AJYfNOOjtaMz+QtAk/RZumV3gBLOq74pWcUiA4G3c439JhmBbQ6nERnN7ssznA/nQzmj42pqD2YWt5+F/afA8okMG8oDbMj428Ifmp027R3MJwJ65GC0G84ciKVQW08pVQt6e+MleDDGHSwYEMYLkSXmBUYF9YmRTt4KnEczEgzuIuqbLTgxgueYRm6NdE5TGD4M8pMDnRXp2b/Zk3z3gboXeocG3hvIBakoTzJLKpLYLgO0RzhDqy8A74sEbe4iZj3CJwh5ilYb/NH6KZqe+Y3UAbuJvVjGT/n194Dw2Lkqz2md1M9Gr3/DhGALrx+WEmr7m3wGM5Cxluk8A3TcNu7o0LldlGSEkeeSO1HnHOgpsm2DQfvGybfw8Zb4wSlE+Gh7VfWu4fj68KxHpCTdKcYjTSrGCdJJI3T3FaKgZciguOTZYzn5mUl7tYBUrZDgSxly0=" on: tags: true repo: "lsst-sqre/lander" skip_upload_docs: true distributions: sdist bdist_wheel
Add environment file to generate conda environment with required build and runtime packages
name: nighres
channels:
  - mrtrix3
  - simpleitk
  - itk
  - conda-forge
  - bioconda
  - defaults
dependencies:
  - python=3.9
  - pip
  - numpy
  - nibabel
  - psutil
  - jcc
  - pandas
  - pyyaml
  - scikit-image
  - scikit-learn
  - Nilearn
  - statsmodels
  - matplotlib
  - Pillow
  - plotly
  - charset-normalizer
  - chart-studio
  - idna
  - requests
  - retrying
  - urllib3
  - tenacity
  - webcolors
  - gcc_linux-64
  - gxx_linux-64
  - pip:
      - antspyx
Add an example Andy might use
#An example for Andy resources: - name: version type: semver source: driver: git file: fancy-bundle uri: git@github.com/kdvolder/versions-repo branch: master - name: my-git type: git source: uri: git@github.com/kdvolder/test branch: master - name: my-bundle type: s3 source: bucket: my.bucket.spring.io access_key_id: {{s3_key}} secret_access_key: {{s3_secret}} regexp: snapshots/fancy-bundle-(.*).zip jobs: - name: build-it plan: - get: my-git trigger: true - task: build-it config: platform: linux inputs: - name: my-git outputs: - name: target path: my-git/target image_resource: type: docker-image source: repository: kdvolder/my-build-env run: path: my-git/scripts/build-iy.sh - put: my-bundle params: file: my-git/target/fancy-bundle-*.zip - name: test-it plan: - get: my-git passed: - build-it - get: my-bundle passed: - build-it - task: run-tests config: platform: linux inputs: - name: my-bundle image_resource: type: docker-image source: repository: kdvolder/my-build-env run: path: my-git/scripts/run-tests.sh - name: publish-it plan: - get: my-git passed: - test-it - get: my-bundle passed: - test-it - task: publish config: platform: linux inputs: - name: my-bundle image_resource: type: docker-image source: repository: kdvolder/my-build-env run: path: my-git/scripts/publish.sh
Add Travis CI build script
language: csharp
sudo: required
dist: trusty
env:
  - CLI_VERSION=latest
addons:
  apt:
    packages:
      - gettext
      - libcurl4-openssl-dev
      - libicu-dev
      - libssl-dev
      - libunwind8
      - zlib1g
mono:
  - 4.4.2
os:
  - linux
branches:
  only:
    - master
install:
  - export DOTNET_INSTALL_DIR="$PWD/.dotnetcli"
  - curl -sSL https://raw.githubusercontent.com/dotnet/cli/rel/1.0.0/scripts/obtain/dotnet-install.sh | bash /dev/stdin --version "$CLI_VERSION" --install-dir "$DOTNET_INSTALL_DIR"
  - export PATH="$DOTNET_INSTALL_DIR:$PATH"
script:
  - ./build.sh
Configure the Highways Agency's Standards for Highways site
---
site: ha_standardsforhighways
whitehall_slug: highways-agency
host: www.standardsforhighways.co.uk
tna_timestamp: 20121206042904
homepage: https://www.gov.uk/government/organisations/highways-agency
Add etcd, use llparse image
k8s:
  image: llparse/kubernetes:latest
  environment:
    GLOG_v: 3
    K8SM_MESOS_MASTER: ${MESOS_MASTER}
    DEFAULT_DNS_NAME: ${DEFAULT_DNS_NAME}
    HOST: ${DOCKER_HOST_IP}
    MESOS_SANDBOX: /data
    DNS_SERVER_IP: 169.254.169.250
    DNS_DOMAIN: rancher.internal
    DNS_NAMESERVERS: 8.8.8.8:53,8.8.4.4:53
    DEBUG: true
k8s-scheduler:
  image: llparse/mesos-k8s:v1.2.4
  command: |
    k8sm-scheduler --etcd-servers http://etcd:2379
  links:
    - etcd
k8s:
  image: llparse/kubernetes:latest
  environment:
    GLOG_v: 3
    K8SM_MESOS_MASTER: ${MESOS_MASTER}
    DEFAULT_DNS_NAME: ${DEFAULT_DNS_NAME}
    HOST: ${DOCKER_HOST_IP}
    MESOS_SANDBOX: /data
    DNS_SERVER_IP: 169.254.169.250
    DNS_DOMAIN: rancher.internal
    DNS_NAMESERVERS: 8.8.8.8:53,8.8.4.4:53
    DEBUG: true
etcd:
  image: rancher/etcd:v2.3.6
  labels:
    io.rancher.scheduler.affinity:container_label_soft_ne: io.rancher.stack_service.name=$${stack_name}/$${service_name}
    io.rancher.sidekicks: data
    io.rancher.service.allocate.skip.serialize: 'true'
    io.rancher.container.start_once: 'true'
  environment:
    ETCD_DATA_DIR: /data
    ETCDCTL_ENDPOINT: http://etcd:2379
  links:
    - data
    - discovery
  volumes_from:
    - data
data:
  image: busybox
  command: /bin/true
  net: none
  volumes:
    - /data
  labels:
    io.rancher.container.start_once: 'true'
discovery:
  image: rancher/etcd:v2.3.6
  command: discovery_node
  labels:
    io.rancher.container.start_once: 'true'
    io.rancher.sidekicks: bootstrap
bootstrap:
  image: rancher/etcd:v2.3.6
  command: bootstrap
  links:
    - discovery
  labels:
    io.rancher.container.start_once: 'true'
  environment:
    ETCDCTL_ENDPOINT: http://etcd:2379
Update from Hackage at 2016-04-01T05:20:12+0000
homepage: https://github.com/micxjo/hs-luis-client
changelog-type: ''
hash: 2d86d9cc1afc6d0bc2bb5bde8033f33ffcda5b8e72a99b3297475a913445c743
test-bench-deps: {}
maintainer: Micxjo Funkcio <micxjo@fastmail.com>
synopsis: An unofficial client for the LUIS NLP service.
changelog: ''
basic-deps:
  http-client: '>=0.4.27 && <0.5'
  base: '>=4.7 && <5'
  text: '>=1.2.2.1 && <1.3'
  wreq: '>=0.4.1.0 && <0.5'
  lens: ==4.13.*
  aeson: '>=0.9.0.1 && <0.10'
  vector: '>=0.11.0.0 && <0.12'
all-versions:
  - '0.0.1'
author: Micxjo Funkcio <micxjo@fastmail.com>
latest: '0.0.1'
description-type: haddock
description: Please see README.md
license-name: BSD3
Add bob.db.voxforge recipe [skip appveyor]
{% set version = "2.0.3" %}

package:
  name: bob.db.voxforge
  version: {{ version }}

source:
  fn: bob.db.voxforge-{{ version }}.zip
  md5: a8cfa83b4c63966c0dfcfbf1ba0db1ce
  url: https://pypi.python.org/packages/source/b/bob.db.voxforge/bob.db.voxforge-{{ version }}.zip

build:
  number: 0
  skip: true  # [not linux]
  script: python -B setup.py install --single-version-externally-managed --record record.txt

requirements:
  build:
    - python
    - setuptools
    - bob.db.base
    - bob.db.verification.utils
    - bob.db.verification.filelist
  run:
    - python
    - bob.db.base
    - bob.db.verification.utils
    - bob.db.verification.filelist

test:
  commands:
    - nosetests -sv bob.db.voxforge
  imports:
    - bob
    - bob.db
    - bob.db.voxforge
  requires:
    - nose

about:
  home: http://pypi.python.org/pypi/bob.db.voxforge
  license: GNU General Public License v3 (GPLv3)
  summary: Speaker verification protocol on a subset of the VoxForge database

extra:
  recipe-maintainers:
    - 183amir
Add support for training data sync and backup
govuk_env_sync::tasks: "pull_content_data_admin_integration_daily": ensure: "present" hour: "0" minute: "18" action: "pull" dbms: "postgresql" storagebackend: "s3" database: "content_data_admin_production" temppath: "/tmp/content_data_admin_integration_pull" url: "govuk-integration-database-backups" path: "postgresql-backend" "push_content_data_admin_integration_daily": ensure: "present" hour: "1" minute: "18" action: "push" dbms: "postgresql" storagebackend: "s3" database: "content_data_admin_production" temppath: "/tmp/content_data_admin_integration_push" url: "govuk-training-database-backups" path: "postgresql-backend"
Remove legacy env variables from the registry deploymentconfig if present
--- - name: remove legacy environment variables from registry dc oc_env: kind: dc name: "{{ openshift_hosted_registry_name }}" namespace: "{{ openshift_hosted_registry_namespace }}" state: absent env_vars: BEARER_TOKEN: BEARER_TOKEN_FILE: KUBERNETES_MASTER: OPENSHIFT_CA_FILE: OPENSHIFT_CERT_FILE: OPENSHIFT_INSECURE: OPENSHIFT_KEY_FILE: OPENSHIFT_MASTER:
Update from Forestry.io - Updated Forestry configuration
--- hide_body: false is_partial: true fields: - type: text name: title label: Title description: e.g. Display Title config: required: true - type: textarea label: Body name: body description: e.g. Describe what's happening config: wysiwyg: true schema: format: markdown
Switch from teamocil to tmuxinator
# ~/.tmuxinator/pp.yml name: some_project root: /mnt/marge/e/dp/PP windows: - pp: root: /mnt/marge/e/dp/PP layout: tiled panes: - harpers-boys: - cd /mnt/marge/e/dp/PP/harpers-outdoor-book-for-boys - git shortlog - impending-crisis: - cd /mnt/marge/e/dp/PP/the-impending-crisis - git shortlog - saturday-magazine: - cd /mnt/marge/e/dp/PP/the-saturday-magazine-66 - git shortlog - passamaquoddy: - cd /mnt/marge/e/dp/PP/passamaquoddy-texts - git shortlog - ppimg: ~/pp/tools/repo/ppimg - dp2ppgen: ~/pp/tools/repo/dp2ppgen - ppgen: ~/pp/tools/repo/ppgen # Optional tmux socket # socket_name: foo # Runs before everything. Use it to start daemons etc. # pre: sudo /etc/rc.d/mysqld start # Runs in each window and pane before window/pane specific commands. Useful for setting up interpreter versions. # pre_window: rbenv shell 2.0.0-p247 # Pass command line options to tmux. Useful for specifying a different tmux.conf. # tmux_options: -f ~/.tmux.mac.conf # Change the command to call tmux. This can be used by derivatives/wrappers like byobu. # tmux_command: byobu # Specifies (by name or index) which window will be selected on project startup. If not set, the first window is used. # startup_window: logs #windows: # - editor: # layout: main-vertical # panes: # - vim # - guard # - server: bundle exec rails s # - logs: tail -f log/development.log
Add recipe for serialchemy [skip appveyor] [skip travis]
{% set name = "serialchemy" %} {% set version = "0.2.0" %} package: name: "{{ name|lower }}" version: "{{ version }}" source: url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/{{ name }}-{{ version }}.tar.gz sha256: 8ef00f71153cc076709ed58ba00731768dbabbf15c237cd423df3c808b24a988 build: noarch: python number: 0 script: "{{ PYTHON }} -m pip install . --no-deps -vv" requirements: host: - pip - python >=3.6 - setuptools_scm run: - python >=3.6 - sqlalchemy >=1.1 test: imports: - serialchemy - serialchemy._tests about: home: http://github.com/ESSS/serialchemy license: MIT license_family: MIT license_file: LICENSE summary: Serializers for SQLAlchemy models. doc_url: https://serialchemy.readthedocs.io/ dev_url: http://github.com/ESSS/serialchemy extra: recipe-maintainers: - lincrosenbach - kfasolin - nicoddemus
Add configuration file for github actions.
# This workflow will do a clean installation of node dependencies, cache/restore them, build the source code and run tests across different versions of node # For more information see: https://help.github.com/actions/language-and-framework-guides/using-nodejs-with-github-actions name: Node.js CI on: push: branches: [ "master" ] pull_request: branches: [ "master" ] jobs: build: runs-on: ubuntu-latest strategy: matrix: node-version: [14.x, 16.x, 18.x] # See supported Node.js release schedule at https://nodejs.org/en/about/releases/ steps: - uses: actions/checkout@v3 - name: Use Node.js ${{ matrix.node-version }} uses: actions/setup-node@v3 with: node-version: ${{ matrix.node-version }} cache: 'npm' - run: npm ci - run: npm test
Update from Hackage at 2015-12-03T15:16:40+0000
homepage: https://github.com/dmcclean/dimensional-codata changelog-type: markdown hash: ed371a9a2db52bcd589dbe913cc45f8875b6657327845f7f0b008d0cec80dedb test-bench-deps: {} maintainer: douglas.mcclean@gmail.com synopsis: CODATA Recommended Physical Constants with Dimensional Types changelog: ! "2014.0.0.0\r\n----------\r\n* Initial release with CODATA 2014 files.\r\n" basic-deps: base: ! '>=4.7 && <4.9' numtype-dk: ! '>=0.5 && <1' dimensional: ! '>=1.0' all-versions: - '2014.0.0.0' author: Douglas McClean latest: '2014.0.0.0' description-type: markdown description: ! "# dimensional-codata\r\nCODATA recommended values of fundamental physical constants, \r\nfor use with the [dimensional](https://github.com/bjornbm/dimensional) library.\r\n\r\nThis module offers a selection of fundamental physical constants with their values as defined or measured\r\nand published by the Committee on Data for Science and Technology of the International Council for Science.\r\n\r\nThese values are from the 2014 CODATA recommended values, by way of NIST.\r\n\r\nThe original document offers many, many more constants than are provided here. An effort has been made to narrow it\r\ndown to the most useful ones. If your work requires others or if you have another contribution or suggestion, please\r\nsubmit issues or pull requests to [the GitHub repository](https://github.com/dmcclean/dimensional-codata).\r\n" license-name: BSD3
Add GitHub Actions CI/CD test suite workflow
name: Test suite on: push: pull_request: schedule: - cron: 1 0 * * * # Run daily at 0:01 UTC jobs: tests: name: 👷 runs-on: ${{ matrix.os }} strategy: # max-parallel: 5 matrix: python-version: - 3.7 - 3.6 - 3.5 - 2.7 os: - ubuntu-18.04 - ubuntu-16.04 - macOS-10.14 - windows-2019 - windows-2016 env: - TOXENV: python steps: - uses: actions/checkout@master - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v1 with: version: ${{ matrix.python-version }} - name: Upgrade pip/setuptools/wheel run: >- python -m pip install --disable-pip-version-check --upgrade pip setuptools wheel - name: Install tox run: >- python -m pip install --upgrade tox tox-venv - name: Log installed dists run: >- python -m pip freeze --all - name: Log env vars run: >- env env: ${{ matrix.env }} - name: Update egg_info based on setup.py in checkout run: >- python -m bootstrap env: ${{ matrix.env }} - name: Verify that there's no cached Python modules in sources if: >- ! startsWith(matrix.os, 'windows-') run: >- ! grep pyc setuptools.egg-info/SOURCES.txt - name: 'Initialize tox envs: ${{ matrix.env.TOXENV }}' run: | python -m tox --parallel auto --notest env: ${{ matrix.env }} - name: Test with tox run: | ${{ startsWith(matrix.os, 'windows-') && 'setx NETWORK_REQUIRED ' || 'export NETWORK_REQUIRED=' }}1 python -m tox \ --parallel 0 \ -- \ --cov env: ${{ matrix.env }}
Rename to more common file extension (also 8.3!)
database:
user: mypa
pass: defaultpassword
name: mypa
type: mysql
host: localhost
sock: /var/run/mysqld/mysqld.sock
port: 3306
game:
name: MyPA
version: 1.0.0-devel
round: 1
motd: |
Have fun people!
mail:
from: mypa@localhost
replyto: mypa@localhost
setup:
gameopen: true
signupopen: true
havoc: false
gal_size: 7
cluster_size: 7
universe_size: 15
score_per_roid: 1500
end_of_round: 39250 # still 7 hours to go, but disable sleep/signup
sleep_period: 23
start:
roids: 3
metal: 5000
crystal: 5000
eonium: 5000
planet_m: 250
planet_c: 250
planet_e: 250
resource_min_per_roid: 150
resource_max_per_roid: 350
number_of_fleets: 4 # base + 3, has to be same as in ticker
normal:
high_prot: 0 # factor upwards
noob_prot: 3 # factor downwards
ticktime: 30 # needed for jscript
havoc:
high_prot: 0
noob_prot: 20
ticktime: 15
mysettings: 0 # default settings
Add the packaging metadata for the keybase snap
name: keybase version: master summary: Crypto for everyone! description: | Keybase service and command line client. grade: devel # must be 'stable' to release into candidate/stable channels confinement: strict apps: keybase: command: keybase plugs: [network, network-bind] parts: client-go: source: ../../go plugin: go go-importpath: github.com/keybase/client/go go-packages: [github.com/keybase/client/go/keybase] go-buildtags: [production] after: [go] go: source-tag: go1.7.5
Update from Hackage at 2022-05-03T10:01:10Z
homepage: https://github.com/unfoldml/jsonl-conduit changelog-type: '' hash: 3673327a0d56b6bde12ee5d7e5d89f4e25e7a6a11546181bab3961c81838c311 test-bench-deps: jsonl-conduit: -any bytestring: -any base: -any hspec: -any conduit: -any aeson: -any maintainer: UnfoldML AB synopsis: Conduit interface to JSONL-encoded data changelog: '' basic-deps: jsonl: '>=0.2' bytestring: -any base: '>=4.7 && <5' conduit: -any aeson: -any all-versions: - 0.1.0.0 - 0.1.0.1 author: Marco Zocca latest: 0.1.0.1 description-type: markdown description: | # jsonl-conduit [![Build Status](https://travis-ci.org/unfoldml/jsonl-conduit.png)](https://travis-ci.org/unfoldml/jsonl-conduit) TODO Description. license-name: BSD-3-Clause
Add mac build github action
name: ci on: [push] jobs: build: runs-on: macos-latest steps: - uses: actions/checkout@v2 - name: cmake run: cmake -DCMAKE_CXX_FLAGS=-Werror . - name: make run: make
Add docker image building action.
name: Build and push image to Docker Hub on: # Build and push to registry for all git pushes, mark as 'latest' only for # main branch. push: # For pull requests, build but don't push. pull_request: env: PUSH_TO_MASTER: ${{ github.event_name == 'push' && endsWith(github.ref, '/master') }} jobs: build: name: Build and optionally push the image to Docker Hub runs-on: ubuntu-latest steps: - name: Checkout uses: actions/checkout@v2 - name: Work out docker tags uses: crazy-max/ghaction-docker-meta@v1 id: docker_meta with: images: yacoob/aib2ofx tag-sha: true - name: Set up docker builder (buildx) uses: docker/setup-buildx-action@v1 - name: Log in to docker hub # Skip registry push for pull_requests. if: github.event_name != 'pull_request' uses: docker/login-action@v1 with: username: ${{ secrets.DOCKER_USERNAME }} password: ${{ secrets.DOCKER_TOKEN }} - name: Build and optionally push the docker image uses: docker/build-push-action@v2 with: target: aib2ofx context: . # Skip registry push for pull_requests. push: ${{ github.event_name != 'pull_request' }} labels: ${{ steps.docker_meta.outputs.labels }} tags: | ${{ steps.docker_meta.outputs.tags }} ${{ env.PUSH_TO_MASTER == 'true' && 'yacoob/aib2ofx:latest' }}
Set up CI with Azure Pipelines
# Starter pipeline # Start with a minimal pipeline that you can customize to build and deploy your code. # Add steps that build, run tests, deploy, and more: # https://aka.ms/yaml trigger: - master pool: vmImage: 'Ubuntu-16.04' steps: - script: echo Hello, world! displayName: 'Run a one-line script' - script: | echo Add other tasks to build, test, and deploy your project. echo See https://aka.ms/yaml displayName: 'Run a multi-line script'
Set up CI with Azure Pipelines
# Starter pipeline # Start with a minimal pipeline that you can customize to build and deploy your code. # Add steps that build, run tests, deploy, and more: # https://aka.ms/yaml pool: vmImage: 'Ubuntu 16.04' steps: - script: echo Hello, world! displayName: 'Run a one-line script' - script: | echo Add other tasks to build, test, and deploy your project. echo See https://aka.ms/yaml displayName: 'Run a multi-line script'
Add sample DLNA server Docker Compose manifest.
rclone-dlna-server: container_name: rclone-dlna-server image: rclone/rclone command: # Tweak here rclone's command line switches: # - "--config" # - "/path/to/mounted/rclone.conf" - "--verbose" - "serve" - "dlna" - "remote:/" - "--name" - "myDLNA server" - "--read-only" # - "--no-modtime" # - "--no-checksum" restart: unless-stopped # Use host networking for simplicity with DLNA broadcasts # and to avoid having to do port mapping. net: host # Here you have to map your host's rclone.conf directory to # container's /root/.config/rclone/ dir (R/O). # If you have any remote referencing local files, you have to # map them here, too. volumes: - ~/.config/rclone/:/root/.config/rclone/:ro
Set up CI with Azure Pipelines
# ASP.NET Core # Build and test ASP.NET Core projects targeting .NET Core. # Add steps that run tests, create a NuGet package, deploy, and more: # https://docs.microsoft.com/azure/devops/pipelines/languages/dotnet-core trigger: - master pool: vmImage: 'Ubuntu-16.04' variables: buildConfiguration: 'Release' steps: - script: dotnet build --configuration $(buildConfiguration) displayName: 'dotnet build $(buildConfiguration)'
Load environment variables from a JSON file in GCS
main: params: [input] steps: - load_env_details: call: read_env_from_gcs args: bucket: workflow_environment_info object: env-info.json result: env_details - call_service: call: http.get args: url: ${env_details.SERVICE_URL} result: service_result - return_result: return: ${service_result.body} read_env_from_gcs: params: [bucket, object] steps: - read_from_gcs: call: http.get args: url: ${"https://storage.googleapis.com/download/storage/v1/b/" + bucket + "/o/" + object} auth: type: OAuth2 query: alt: media result: env_file_json_content - return_content: return: ${env_file_json_content.body}
Add the no-op GitHub Actions workflow for Python releases.
name: Python on: workflow_dispatch: inputs: build: required: true type: number jobs: no-op: runs-on: ubuntu-latest steps: - run: echo "These aren't the droids you're looking for." shell: bash
Add playbook for OS X workstations
--- - name: Configure Mac mini workstation vars: ansible_ssh_pipelining: yes hosts: gui vars_files: - ../vars/global.yml - ../vars/secrets.yml roles: - shell - languages
Create configuration for Travis CI server
language: python python: 2.7 env: - TOX_ENV= virtualenv: system_site_packages: true install: - pip install -U tox coveralls script: - tox - coveralls --rcfile=.coveragerc
Add build definition for PR
# Build ASP.NET Core project on pull request variables: - name: BuildConfiguration value: Release - name: BuildPlatform value: Any CPU - name: BuildProjects value: PlanningPokerCore.sln trigger: none pr: - master jobs: - template: azure-pipelines-build.yml parameters: RunEnd2EndTests: true PublishArtifacts: false
Update from Hackage at 2019-03-02T19:36:14Z
homepage: https://github.com/chessai/fib changelog-type: markdown hash: 29347a6b1ca2d76a1235b3b001730f5ac8d83f9db07cd2f1ee91cb83edd11128 test-bench-deps: {} maintainer: chessai <chessai1996@gmail.com> synopsis: fibonacci algebra changelog: |- # Changelog `fib` uses [PVP Versioning][1]. The changelog is available [on GitHub][2]. 0.0.0 ===== * Initially created. [1]: https://pvp.haskell.org [2]: https://github.com/chessai/fib/releases basic-deps: base-noprelude: ! '>=4.10.1.0 && <4.13' integer-gmp: -any semirings: ! '>=0.3' all-versions: - '0.1' author: chessai latest: '0.1' description-type: markdown description: |- # fib [![Hackage](https://img.shields.io/hackage/v/fib.svg)](https://hackage.haskell.org/package/fib) [![BSD3 license](https://img.shields.io/badge/license-BSD3-blue.svg)](LICENSE) fibonacci algebra license-name: BSD-3-Clause
Add support for docker stack with mock
version: '3' services: meetup_loto: image: ghibourg/meetup_loto depends_on: - apimeetupcom networks: - meetup_loto_net ports: - "8888:8080" environment: - env=mock apimeetupcom: image: campanda/webserver-mock ports: - "443:443" environment: - ssl=true - custom_responses_config=/config/config.yml volumes: - ./tests/mock/meetup:/config networks: meetup_loto_net: aliases: - api.meetup.com networks: meetup_loto_net:
Update from Hackage at 2018-05-26T16:19:27Z
homepage: https://github.com/incertia/unity-testresult-parser#readme changelog-type: markdown hash: d71acac06574941bbfc8dca500e5cccde7230be84114e99fbf957a567385c5f6 test-bench-deps: {} maintainer: incertia@incertia.net synopsis: '' changelog: ! '# Changelog for unity-testresult-parser ## Unreleased changes ' basic-deps: ansi-wl-pprint: -any split: -any xml-conduit: -any base: ! '>=4.7 && <5' unordered-containers: -any text: -any containers: -any ansi-terminal: -any unity-testresult-parser: -any mtl: -any optparse-applicative: -any all-versions: - '0.1.0.0' author: Will Song latest: '0.1.0.0' description-type: markdown description: ! '# Unity TestResults Parser This is a tool to parse Unity test results, designed for use in CI/CD environments. ' license-name: BSD3
Update from Hackage at 2017-05-15T09:20:05Z
homepage: https://github.com/peti/logging-facade-syslog#readme changelog-type: '' hash: 8139c68b3893219fd9ebc40964421ae5eb8846337d7a0b8e34d02b1515e34f2d test-bench-deps: {} maintainer: Peter Simons <simons@cryp.to> synopsis: A logging back-end to syslog(3) for the logging-facade library changelog: '' basic-deps: logging-facade: -any base: <5 hsyslog: ==5.* all-versions: - '1' author: Peter Simons latest: '1' description-type: haddock description: A simple "System.Logging.Facade" back-end for @syslog(3)@ as specified in <http://pubs.opengroup.org/onlinepubs/9699919799/functions/syslog.html POSIX.1-2008>. license-name: BSD3
Automate update of IPs in nginx.conf
# Update the IPs for new Aspace instances # call by --limit=<current version> --extra-vars="previous_version=<previous version>" --- - hosts: aspace_instances vars: pre_tasks: - fail: msg='Need to set previous version by adding --extra-vars "previous_version=<previous-version i.e. v2.0.1>" to ansible-playbook call' when: previous_version == None - debug: var=previous_version tasks: - debug: var=client_name - debug: var: hostvars['{{ client_name }}-{{ previous_version}}']['ansible_ssh_host'] - set_fact: old_host: "{{ client_name }}-{{ previous_version}}" - set_fact: old_host_ip: "{{ hostvars[old_host]['ansible_ssh_host'] }}" - name: change the IP from previous to current in nginx.conf replace: dest: /etc/nginx/nginx.conf regexp: "{{ old_host_ip }}" replace: "{{ ansible_ssh_host }}" become: yes delegate_to: 127.0.0.1
Update from Hackage at 2016-05-23T03:19:19+0000
homepage: '' changelog-type: '' hash: c6bd373f3f690f3ddf7396f4063989522da587252408bc2bb23ce5e30ec815ab test-bench-deps: {} maintainer: winterland1989@gmail.com synopsis: A simple round-robin data type changelog: '' basic-deps: base: ! '>=4.6 && <5.0' all-versions: - '0.1.0.0' author: winterland1989 latest: '0.1.0.0' description-type: markdown description: ! "# A simple round-robin data type\n\nThis package provide a simple data type wrap a round-robin table. so you can select resources(host, connection...) using round-robin fashion.\n\n## Example\n\n```haskell\nimport qualified Date.RoundRobin as RR\nimport qualified Network.HTTP.Client as HTTP\n\nmain :: IO ()\nmain = do\n \ reqs <- mapM HTTP.parseUrl [\"http://foo.com\", \"http://bar.com\", \"http://qux.com\"]\n \ proxyTable <- RR.newRoundRobin reqs\n manager <- HTTP.newManager HTTP.defaultManagerSettings\n\n \ ...\n -- maybe now you're inside a server service(a forked thread)\n -- use select to choose a request in round-robin fashion\n req <- RR.select proxyTable\n res <- HTTP.httpLbs req manager\n ...\n```\n" license-name: MIT
Set up CI with Azure Pipelines
# Docker image # Build a Docker image to deploy, run, or push to a container registry. # Add steps that use Docker Compose, tag images, push to a registry, run an image, and more: # https://docs.microsoft.com/azure/devops/pipelines/languages/docker trigger: - master pool: vmImage: 'Ubuntu-16.04' variables: imageName: 'smalldave/weather:$(build.buildId)' steps: - script: docker build -f Dockerfile -t $(imageName) . displayName: 'docker build'
Update from Hackage at 2020-06-19T00:39:12Z
homepage: https://github.com/prsteele/math-programming-tests#readme changelog-type: markdown hash: 681a904b00587046e9f6cd0383f7336381a71154ea190fc4064783cd9ef88c26 test-bench-deps: {} maintainer: steele.pat@gmail.com synopsis: Utility functions for testing implementations of the math-programming library. changelog: | # Changelog for math-programming-api-tests ## Unreleased changes basic-deps: math-programming: '>=0.3.0 && <0.4' base: '>=4.12.0.0 && <4.13' text: '>=1.2.3.1 && <1.3' tasty-quickcheck: '>=0.10.1.1 && <0.11' tasty-hunit: '>=0.10.0.2 && <0.11' tasty: '>=1.2.3 && <1.3' all-versions: - 0.3.0 author: Patrick Steele latest: 0.3.0 description-type: markdown description: | # math-programming-tests This library provides utility functions for testing backends to the `math-programming` library. license-name: BSD-3-Clause
Add .gitlab-ci demo for TYPO3 unit and functional tests
stages: - tests phpunit:php7.1: stage: tests image: besseiit/webdev services: - mysql variables: # Configure mysql service (https://hub.docker.com/_/mysql/) MYSQL_ROOT_PASSWORD: typo3 TIMEZONE: "Europe/Vienna" typo3DatabaseName: "testing" typo3DatabaseUsername: "root" typo3DatabasePassword: "typo3" typo3DatabaseHost: "mysql" before_script: # Add SSH key so we can clone private repositories (and could be useful for deployment) - eval $(ssh-agent -s) - ssh-add <(echo "$SSH_PRIVATE_KEY") - mkdir -p ~/.ssh # Disable host key verification since docker can't know the remote host (first connect) - echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config # Add prestissimo for faster package download - composer global require hirak/prestissimo script: - composer install - vendor/bin/phpunit -c vendor/typo3/testing-framework/Resources/Core/Build/UnitTests.xml FOLDER_OF_UNIT_TESTS - vendor/bin/phpunit -c vendor/typo3/testing-framework/Resources/Core/Build/FunctionalTests.xml FOLDER_OF_FUNCTIONAL_TESTS
Allow anyone to tag/untag/close issues.
special: everyone: can: - say - tag - untag - close config: tag: only: - async/await - backported - help wanted - documentation - notebook - tab-completion - windows
Update from Hackage at 2020-10-04T18:13:59Z
homepage: https://github.com/srid/rib#readme changelog-type: '' hash: 3b837333314137f0b156617de1411396611d0c307b425e18aa05883676bb01e7 test-bench-deps: {} maintainer: srid@srid.ca synopsis: Static site generator based on Shake changelog: '' basic-deps: warp: -any modern-uri: -any shake: '>=0.18.5' exceptions: -any wai: '>=3.2.2 && <3.3' time: -any text: '>=1.2.3 && <1.3' safe-exceptions: -any megaparsec: '>=8.0' filepath: -any async: -any base-noprelude: '>=4.12 && <4.14' containers: '>=0.6.0 && <0.7' binary: '>=0.8.6 && <0.9' relude: '>=0.6 && <0.8' iso8601-time: -any mtl: '>=2.2.2 && <2.3' foldl: -any cmdargs: '>=0.10.20 && <0.11' optparse-applicative: '>=0.15' fsnotify: '>=0.3.0 && <0.4' wai-app-static: '>=3.1.6 && <3.2' aeson: '>=1.4.2 && <1.5' directory: '>=1.0 && <2.0' all-versions: - 1.0.0.0 author: Sridhar Ratnakumar latest: 1.0.0.0 description-type: haddock description: Haskell static site generator based on Shake, with a delightful development experience. license-name: BSD-3-Clause
Update from Hackage at 2017-09-16T14:35:39Z
homepage: https://github.com/daig/microgroove changelog-type: '' hash: 0d0f3ce69a6d1498b7cd28b1114a82802ab01c20900eec2d4c2e882885e2163e test-bench-deps: {} maintainer: dai@sodality.cc synopsis: Array-backed extensible records changelog: '' basic-deps: base: ! '>=4.7 && <5' primitive: -any vector: -any all-versions: - '0.1.0.0' author: Dai latest: '0.1.0.0' description-type: markdown description: ! '# microgroove Microgroove supports type-safe positional heterogeneous records similar to [vinyl](https://hackage.haskell.org/package/vinyl-0.6.0/docs/Data-Vinyl-Core.html#t:Rec) and [SOP](https://hackage.haskell.org/package/generics-sop-0.3.1.0/docs/Generics-SOP.html#t:NP). Unlike these record types which are represented by linked lists, `microgroove`''s `Rec` type is backed by arrays, and so support constant-time indexing and mutable updates via the associated `MRec` type. Microgroove can be used for lightweight statically specified polymorphic records just like `vinyl`, but also dynamic record types that are only provided at run-time, such as receiving an arbitrary JSON protocol. # build The recommended way to build microgroove is via [stack](https://www.haskellstack.org) with `stack build` # contribute microgroove is an early alpha, so please submit any bugs or feature requests on the issue tracker ' license-name: BSD3
Update from Hackage at 2015-07-11T22:19:10+0000
homepage: '' changelog-type: '' hash: 02499cbf549e2734c9ee85d2b3374eb23e7c5715344d14eb08d5bf61ff26e75f test-bench-deps: {} maintainer: Index Int <vlad.z.4096@gmail.com> synopsis: Ad-hoc type classes for lifting changelog: '' basic-deps: base: ! '>=4.6 && <4.9' transformers: ! '>=0.4.2' all-versions: - '0.1.0.0' author: Index Int latest: '0.1.0.0' description-type: haddock description: ! 'This simple and lightweight library provides type classes for lifting monad transformer operations.' license-name: BSD3
Use CodeQL to analyze code
# For most projects, this workflow file will not need changing; you simply need # to commit it to your repository. # # You may wish to alter this file to override the set of languages analyzed, # or to provide custom queries or build logic. # # ******** NOTE ******** # We have attempted to detect the languages in your repository. Please check # the `language` matrix defined below to confirm you have the correct set of # supported CodeQL languages. # name: "CodeQL" on: push: branches: [ "master" ] pull_request: # The branches below must be a subset of the branches above branches: [ "master" ] schedule: - cron: '36 8 * * 6' jobs: analyze: name: Analyze runs-on: ubuntu-latest permissions: actions: read contents: read security-events: write strategy: fail-fast: false matrix: language: [ 'java' ] # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ] # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support steps: - name: Checkout repository uses: actions/checkout@v3 # Initializes the CodeQL tools for scanning. - name: Initialize CodeQL uses: github/codeql-action/init@v2 with: languages: ${{ matrix.language }} # If you wish to specify custom queries, you can do so here or in a config file. # By default, queries listed here will override any specified in a config file. # Prefix the list here with "+" to use these queries and those in the config file. # Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs # queries: security-extended,security-and-quality - name: Set up JDK 8 uses: actions/setup-java@v3 with: java-version: '8' distribution: 'adopt' cache: maven - name: Build with Maven run: mvn -B verify - name: Perform CodeQL Analysis uses: github/codeql-action/analyze@v2
Update from Forestry.io - Updated Forestry configuration
--- new_page_extension: md auto_deploy: false admin_path: "/admin" webhook_url: sections: - type: directory path: projects label: Projects create: documents match: "**/*.md" templates: - project - type: directory path: journal label: Journal create: documents match: "**/*.md" templates: - journal - type: directory path: data/authors label: Authors create: documents match: "**/*.json" new_doc_ext: json templates: - author - type: document path: data/theme.json label: Theme config upload_dir: uploads public_path: "/uploads" front_matter_path: "../uploads" use_front_matter_path: true file_template: ":filename:" build: publish_command: gridsome build output_directory: dist
--- new_page_extension: md auto_deploy: false admin_path: "/admin" webhook_url: sections: - type: directory path: projects label: Projects create: documents match: "**/*.md" templates: - project - type: directory path: journal label: Journal create: documents match: "**/*.md" templates: - journal - type: directory path: data/authors label: Authors create: documents match: "**/*.json" new_doc_ext: json templates: - author - type: document path: data/theme.json label: Theme config upload_dir: uploads public_path: "/uploads" front_matter_path: "../static/uploads" use_front_matter_path: true file_template: ":filename:" build: publish_command: gridsome build output_directory: dist
Update from Hackage at 2020-12-08T16:50:19Z
homepage: ''
changelog-type: markdown
hash: e52ca1cc2bd0787e09bae5404be9092cab23d191ce8bb907f48ea5148b003069
test-bench-deps: {}
maintainer: dan.firth@homotopic.tech
synopsis: Prelude with algebraic constructs and polykinds on.
changelog: |
# Changelog for chassis

## 0.0.1.0

* Experimental prelude with many algebraic constructs and control structures, and polykinds.
basic-deps:
exceptions: -any
rio: -any
first-class-families: -any
either: -any
bytestring: -any
extra: -any
path: -any
base: '>=4.7 && <5'
composite-base: -any
comonad: -any
text: -any
distributive: -any
containers: -any
vinyl: -any
contravariant: -any
profunctors: -any
all-versions:
- 0.0.1.0
author: Daniel Firth
latest: 0.0.1.0
description-type: markdown
description: |
# chassis

Chassis is an experimental prelude that exports most of the standard Haskell
apparatus by default, common types and type classes, as well as lesser used
abstractions from category theory by default.

Chassis also re-exports some things from vinyl and composite, and keeps polykinds
on by default.

Extremely early prototype idea.
license-name: BSD-3-Clause
Update from Hackage at 2017-03-22T09:22:20Z
homepage: https://github.com/frasertweedal/hs-concise changelog-type: '' hash: e7a01aaac1538a529a753f7148626e162eacb3789fc6762c878f1ebf17c29f7a test-bench-deps: bytestring: -any base: ! '>=4 && <5' text: -any lens: -any quickcheck-instances: -any tasty-quickcheck: -any tasty: -any QuickCheck: -any concise: -any maintainer: frase@frase.id.au synopsis: Utilities for Control.Lens.Cons changelog: '' basic-deps: bytestring: -any base: ! '>=4 && <5' text: -any lens: -any all-versions: - '0.1.0.0' author: Fraser Tweedale latest: '0.1.0.0' description-type: text description: ! 'Utilities for ``Control.Lens.Cons`` =================================== This library provides a few utility types and functions for working with ``Control.Lens.Cons``. Rewrite rules are employed to make common conversions perform as well as the "native" but non-generic conversion (e.g. ``String`` to ``Text``, lazy to strict, etc). ' license-name: BSD3
Update from Hackage at 2018-04-27T16:02:55Z
homepage: https://github.com/ryota-ka/duration#readme changelog-type: '' hash: 370cd2adbdcc307ca2a4696b301d34ff35bdf17c12e025feefd9b20bc30a9d8e test-bench-deps: base: ! '>=4.7 && <5' time: -any hspec: -any parsec: -any doctest: -any duration: -any template-haskell: -any maintainer: kameoka.ryota@gmail.com synopsis: A tiny compile-time time utility library inspired by zeit/ms changelog: '' basic-deps: base: ! '>=4.7 && <5' time: -any parsec: -any template-haskell: -any all-versions: - '0.1.0.0' author: Ryota Kameoka latest: '0.1.0.0' description-type: markdown description: ! "# duration\n\nA tiny compile-time time utility library, inspired by [zeit/ms](https://github.com/zeit/ms).\n\n## Examples\n\n```haskell\n{-# LANGUAGE QuasiQuotes #-}\n{-# LANGUAGE TypeApplications #-}\n\nmodule Main where\n\nimport Data.Time.Clock (DiffTime, NominalDiffTime)\nimport Data.Time.Clock.Duration (t, s, ms, µs, ns, ps)\n\nmain :: IO ()\nmain = do\n print @DiffTime [t| 1 day |] -- 86400s\n print @DiffTime [t| 2h |] -- 7200s\n print @DiffTime [t| -1m |] -- -60s\n\n print @NominalDiffTime [s| 1 |] -- 1s\n print @NominalDiffTime [ms| 1 |] -- 0.001s\n print @NominalDiffTime [µs| 1 |] -- 0.000001s\n print @NominalDiffTime [ns| 1 |] -- 0.000000001s\n print @NominalDiffTime [ps| 1 |] -- 0.000000000001s\n\n print @Int [s| 2 days |] -- 172800\n print @Integer \ [ms| 5s |] -- 5000\n print @Double [µs| 2min |] -- 1.2e8\n print @Float [ns| -1 sec |] -- -1.0e9\n print @Rational [ps| 10ms |] -- 10000000000 % 1\n```\n" license-name: BSD3
Update from Hackage at 2016-03-19T13:40:39+0000
homepage: http://github.com/aelve/Spock-lucid changelog-type: markdown hash: 64f457072b9ffde02b144cf5aa5df0c1f92c22ae9e953bfca6090cf4b8244c07 test-bench-deps: {} maintainer: yom@artyom.me synopsis: Lucid support for Spock changelog: ! '# 0.1.0.0 First release. ' basic-deps: Spock: ! '>=0.9' lucid: ==2.* base: ! '>=4.7 && <4.10' transformers: -any all-versions: - '0.1.0.0' author: Artyom latest: '0.1.0.0' description-type: haddock description: Lucid support for Spock license-name: BSD3
Set up CI with Azure Pipelines
trigger: branches: include: - master - release - refs/tags/* pr: - master pool: vmImage: 'windows-2019' variables: BuildConfiguration: Release Projects: '**/*.csproj' steps: # Install required SDKs and tools - task: UseDotNet@2 displayName: 'Install .NET Core SDK' inputs: packageType: 'sdk' version: '2.2.203' - task: CmdLine@2 displayName: 'Install DNT' inputs: script: 'npm i -g dotnettools' # Patch project versions (on master only) - task: CmdLine@2 displayName: 'Update project version revision' condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master')) inputs: script: 'dnt bump-versions revision "$(Build.BuildId)"' - task: CmdLine@2 displayName: 'Patch project version preview' condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master')) inputs: script: 'dnt bump-versions preview "$(Build.BuildNumber)"' # Build and test - task: DotNetCoreCLI@2 displayName: 'Build solution' inputs: command: 'build' projects: '$(Projects)' arguments: '--configuration $(BuildConfiguration)' feedsToUse: 'select' versioningScheme: 'off' - task: DotNetCoreCLI@2 displayName: 'Run tests' inputs: command: 'test' projects: '$(Projects)' arguments: '--configuration $(BuildConfiguration)' feedsToUse: 'select' versioningScheme: 'off' # Publish artifacts - task: CopyFiles@2 displayName: 'Copy packages' # condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest')) inputs: Contents: '**/*.nupkg' TargetFolder: '$(Build.ArtifactStagingDirectory)' flattenFolders: true - task: PublishBuildArtifacts@1 displayName: 'Publish artifacts' # condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest')) inputs: PathtoPublish: '$(Build.ArtifactStagingDirectory)' ArtifactName: 'drop' publishLocation: 'Container'
Set up CI with Azure Pipelines
# Node.js with React # Build a Node.js project that uses React. # Add steps that analyze code, save build artifacts, deploy, and more: # https://docs.microsoft.com/azure/devops/pipelines/languages/javascript trigger: - master pool: vmImage: 'ubuntu-latest' steps: - task: NodeTool@0 inputs: versionSpec: '10.x' displayName: 'Install Node.js' - script: | npm install npm run build displayName: 'npm install and build'
Drop rbx/jruby on Travis as they use ancient versions
sudo: false language: ruby script: - bundle exec rspec rvm: - 2.0.0 - 2.1.10 - 2.2.6 - 2.3.3 - 2.4.1 - ruby-head - rubinius - jruby matrix: allow_failures: - rvm: rubinius - rvm: jruby
sudo: false language: ruby script: - bundle exec rspec rvm: - 2.0.0 - 2.1.10 - 2.2.6 - 2.3.3 - 2.4.1 - ruby-head
Update from Hackage at 2020-04-18T08:34:19Z
homepage: https://github.com/poscat0x04/cache-polysemy#readme changelog-type: markdown hash: 3f39ae3c9ae6e5083083a2c4ef270f09290d46f992208df0f8ed8e5710056686 test-bench-deps: polysemy-plugin: '>=0.2.5.0 && <0.3' base: '>=4.10 && <5' clock: '>=0.8 && <0.9' cache-polysemy: -any cache: '>=0.1.3.0 && <0.2' hashable: '>=1.3.0.0 && <1.4' polysemy: '>=1.3.0.0 && <1.4' maintainer: poscat@mail.poscat.moe synopsis: cached hashmap changelog: | # cache-polysemy ## 0.1.0 Initial release basic-deps: polysemy-plugin: '>=0.2.5.0 && <0.3' base: '>=4.10 && <5' clock: '>=0.8 && <0.9' cache: '>=0.1.3.0 && <0.2' hashable: '>=1.3.0.0 && <1.4' polysemy: '>=1.3.0.0 && <1.4' all-versions: - 0.1.0 author: Poscat latest: 0.1.0 description-type: markdown description: | # cache-polysemy A polysemy interface for a cached hashmap, with an interpreter implemented using [cache](https://hackage.haskell.org/package/cache) license-name: BSD-3-Clause
Update VBox extension pack to 5.1.18
--- virtualbox: extensions: - "http://download.virtualbox.org/virtualbox/5.1.18/Oracle_VM_VirtualBox_Extension_Pack-5.1.18.vbox-extpack"
Fix artifact path attempt 2
version: 1.0.{build} # build Configuration, i.e. Debug, Release, etc. configuration: Release before_build: - cmd: nuget restore build: project: NLog.Kafka.sln publish_nuget: false publish_nuget_symbols: false include_nuget_references: false verbosity: normal after_build: - nuget pack NLog.Targets.Kafka\NLog.Targets.Kafka.csproj artifacts: # pushing all *.nupkg files in directory - path: '.\NLog.Targets.Kafka\*.nupkg' deploy: provider: NuGet api_key: secure: exO8//WMx3fyPkwcZWlWHgjm5LdXUyoJUpWAyZgXSrijY8B+w/aRsSsJfHr8BAPy skip_symbols: true artifact: '*.nupkg'
version: 1.0.{build} # build Configuration, i.e. Debug, Release, etc. configuration: Release before_build: - cmd: nuget restore build: project: NLog.Kafka.sln publish_nuget: false publish_nuget_symbols: false include_nuget_references: false verbosity: normal after_build: - nuget pack NLog.Targets.Kafka\NLog.Targets.Kafka.csproj artifacts: # pushing all *.nupkg files in directory - path: '**/*.nupkg' deploy: provider: NuGet api_key: secure: exO8//WMx3fyPkwcZWlWHgjm5LdXUyoJUpWAyZgXSrijY8B+w/aRsSsJfHr8BAPy skip_symbols: true artifact: '**/*.nupkg'
Update from Hackage at 2020-10-11T22:37:49Z
homepage: '' changelog-type: markdown hash: ead616a9777b9cf6a390d87987943d1824eba9b1e2896daf5b40c5e6c09a3c22 test-bench-deps: stm: -any base: '>=4.10 && <4.15' time: -any async: -any doctest: -any containers: -any random: -any maintainer: lyndon@sordina.net synopsis: 'Churros: Channel/Arrow based streaming computation library.' changelog: | # Revision history for churros ## 0.1.0.0 -- YYYY-mm-dd * First version. Released on an unsuspecting world. basic-deps: {} all-versions: - 0.1.0.0 author: Lyndon Maydwell latest: 0.1.0.0 description-type: haddock description: |- The Churro library takes an opinionated approach to streaming by focusing on IO processes and allowing different transport options. license-name: MIT
Use latest gjslint version (2.3.10)
before_install: - "sudo pip install http://closure-linter.googlecode.com/files/closure_linter-2.3.9.tar.gz" - "git clone --depth=50 https://github.com/jsdoc3/jsdoc build/jsdoc" - "git clone https://code.google.com/p/glsl-unit/ build/glsl-unit" before_script: - "./build.py plovr" - "./build.py serve-integration-test &" - "rm src/ol/renderer/webgl/*shader.js" - "sleep 3" script: "./build.py JSDOC=build/jsdoc/jsdoc integration-test"
before_install: - "sudo pip install http://closure-linter.googlecode.com/files/closure_linter-latest.tar.gz" - "git clone --depth=50 https://github.com/jsdoc3/jsdoc build/jsdoc" - "git clone https://code.google.com/p/glsl-unit/ build/glsl-unit" before_script: - "./build.py plovr" - "./build.py serve-integration-test &" - "rm src/ol/renderer/webgl/*shader.js" - "sleep 3" script: "./build.py JSDOC=build/jsdoc/jsdoc integration-test"
Disable this test (we should fix the REST framework to disable based on Java versions)
# Integration tests for Repository S3 component # "S3 repository can be registered": - do: snapshot.create_repository: repository: test_repo_s3_1 verify: false body: type: s3 settings: bucket: "my_bucket_name" access_key: "AKVAIQBF2RECL7FJWGJQ" secret_key: "vExyMThREXeRMm/b/LRzEB8jWwvzQeXgjqMX+6br" # Get repository - do: snapshot.get_repository: repository: test_repo_s3_1 - is_true: test_repo_s3_1 - is_true: test_repo_s3_1.settings.bucket - is_false: test_repo_s3_1.settings.access_key - is_false: test_repo_s3_1.settings.secret_key
# Integration tests for Repository S3 component # "S3 repository can be registered": - skip: version: "all" reason: does not work on java9, see https://github.com/aws/aws-sdk-java/pull/432 - do: snapshot.create_repository: repository: test_repo_s3_1 verify: false body: type: s3 settings: bucket: "my_bucket_name" access_key: "AKVAIQBF2RECL7FJWGJQ" secret_key: "vExyMThREXeRMm/b/LRzEB8jWwvzQeXgjqMX+6br" # Get repository - do: snapshot.get_repository: repository: test_repo_s3_1 - is_true: test_repo_s3_1 - is_true: test_repo_s3_1.settings.bucket - is_false: test_repo_s3_1.settings.access_key - is_false: test_repo_s3_1.settings.secret_key
Use the pre-release builds of chefdk
# Use Travis's container-based infrastructure sudo: false addons: apt: sources: - chef-stable-precise packages: - chefdk # Don't `bundle install` install: echo "skip bundle install" branches: only: - master # Ensure we make ChefDK's Ruby the default before_script: - eval "$(/opt/chefdk/bin/chef shell-init bash)" # We have to install chef-sugar for ChefSpec - /opt/chefdk/embedded/bin/chef gem install chef-sugar script: - /opt/chefdk/embedded/bin/chef --version - /opt/chefdk/embedded/bin/rubocop --version - /opt/chefdk/embedded/bin/rubocop - /opt/chefdk/embedded/bin/foodcritic --version - /opt/chefdk/embedded/bin/foodcritic . --exclude spec - /opt/chefdk/embedded/bin/rspec spec
# Use Travis's container-based infrastructure sudo: false addons: apt: sources: - chef-current-precise packages: - chefdk # Don't `bundle install` install: echo "skip bundle install" branches: only: - master # Ensure we make ChefDK's Ruby the default before_script: - eval "$(/opt/chefdk/bin/chef shell-init bash)" # We have to install chef-sugar for ChefSpec - /opt/chefdk/embedded/bin/chef gem install chef-sugar script: - /opt/chefdk/embedded/bin/chef --version - /opt/chefdk/embedded/bin/rubocop --version - /opt/chefdk/embedded/bin/rubocop - /opt/chefdk/embedded/bin/foodcritic --version - /opt/chefdk/embedded/bin/foodcritic . --exclude spec - /opt/chefdk/embedded/bin/rspec spec
Add Travis CI config file
language: ruby rvm: - 2.2 before_install: - bundle install install: gem install jekyll html-proofer script: jekyll build && htmlproofer ./_site env: global: - NOKOGIRI_USE_SYSTEM_LIBRARIES=true # speeds up installation of html-proofer sudo: false
Add GitHub Actions CI workflow
name: CI on: push: branches: [master] pull_request: branches: [master] jobs: build: runs-on: ubuntu-latest strategy: matrix: go: ['1.16'] steps: - name: Setup uses: actions/setup-go@v2 with: go-version: ${{ matrix.go }} - name: Prerequisites run: | sudo apt-get -y install libvlc-dev vlc-plugin-base vlc-plugin-video-output vlc-plugin-access-extra - name: Checkout uses: actions/checkout@v2 - name: Dependencies run: | go version go get -v -t -d ./... go get -u golang.org/x/lint/golint - name: Lint run: golint -set_exit_status=1 ./... - name: Vet run: go vet ./... - name: Test run: go test -v -coverprofile=coverage.txt -covermode=atomic ./... - name: Coverage uses: codecov/codecov-action@v1
Update from Hackage at 2019-09-16T09:00:56Z
homepage: '' changelog-type: '' hash: 61f03a2f19238d83faf8ee492ab06be491a06371826acaae69880c1507fcfbff test-bench-deps: {} maintainer: arian.vanputten@gmail.com v.cacciarimiraldo@gmail.com synopsis: Reimplementation of the `gdiff` algorithm for `generics-mrsop` changelog: '' basic-deps: generics-mrsop: ! '>=2.0.0' base: <5 all-versions: - 0.0.0 author: Arian van Putten and Victor Miraldo latest: 0.0.0 description-type: haddock description: |- Here we port the `gdiff` algorithm and library to work over `generics-mrsop`, enabling code that relies on the latter library to access the `gdiff` algorithm. license-name: BSD-3-Clause
Set up Travis CI builds
# Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. language: java notifications: email: - oak-dev@jackrabbit.apache.org
Include Travis CI config file
language: android jdk: oraclejdk7 android: components: # The BuildTools version used by your project - build-tools-23.0.2 # The SDK version used to compile your project - android-23
Add initial Travis CI configuration
language: objective-c script: 'curl -s -H "Authorization: token $ATOM_ACCESS_TOKEN" -H "Accept: application/vnd.github.v3.raw" https://api.github.com/repos/atom/apm/contents/script/build-package | sh' env: global: secure: nv0HjvndSO99MhkDtcMinw3ZegEJlJued1SvMVuBpj4Wn+oHFnFAt4tWtcs+WXckFaLxn/bnKSvQMddSpochLidQNIzjJHT+YKw5DauFxBS8aL4FF3ra/0RX5wi7PX1mNU6llBOFwppALvFxLOauP1I7rBTiQZI3HmCPOHeB+Bs=
Add Travis Build Automation Configuration
language: java sudo: false install: true addons: sonarcloud: organization: "thanglequoc-github" token: secure: "QOSwtWz2OVldndV0fUBayNbe63g+Dhfkk4wra5MtXjtESLkfJ4yomlHsLc94qPZd+y54jPlcBY38YxAstcU5LJMu1H1H2VpFKLEk2udWycfWwRcFTHK0raibtBgJpNmwJJcAqJW2Sgsb1Ym8Q5+aR5aY9mVrXIwJbztCdKtQdgRdwAGBQjIHSc32r+1JPiRjsQAet+tu7abNbU3Cyj4MmCdpntKOwLj037IRfdWVNbN1eQOyCSJGl32ACBS80dN3xJ9DdWipa+zYbLP0h+r/dU0bmbl5cRrm2eYY8DddgVUEumog1Durb1JAFMBlhiGpKdl5+ZkhQ/Ez26p5ZWCqcZRvUxPCfRFDcbnF7ijYAsQPJDwmdyYQ79k7e+Q1lohV+XGKbSztcAy9X9FDvtOygil2qnhpw2Ix6QN4UijtSA057RmmUXeAERBufLqpLS7W1epItksB8HYvp/iVARNg3ywquKmEjtpJ1a7xkb8+GYjeiQOXOMNDy7cfAHVROR/32N9F2xsEXAC7UkpgonmMdxexruC0/R/0GDTKgFePCtJijJZ/jC0zv3UX5IwQ/BUrK8F0osBKd2oEbhuEKSM2tbF7RJM4cTg+OvJxTMEOpP2/xZ5pvqFjbRHD9VfSfUmEOniBu4+YX4OC95bsmUTIeuGSVIL2IQdvycL9vghEFvo=" jdk: - oraclejdk8 script: - mvn clean org.jacoco:jacoco-maven-plugin:prepare-agent package sonar:sonar cache: directories: - '$HOME/.m2/repository' - '$HOME/.sonar/cache'
Add Ruby 2.0 to Travis build
language: ruby script: "rake spec" rvm: - 1.8.7 - 1.9.2 - 1.9.3 - jruby-18mode - jruby-19mode - ree
language: ruby script: "rake spec" rvm: - 1.8.7 - 1.9.2 - 1.9.3 - 2.0.0 - jruby-18mode - jruby-19mode - ree
Set up CI with Azure Pipelines
# C/C++ with GCC # Build your C/C++ project with GCC using make. # Add steps that publish test results, save build artifacts, deploy, and more: # https://docs.microsoft.com/azure/devops/pipelines/apps/c-cpp/gcc trigger: - master pool: vmImage: 'Ubuntu-16.04' steps: - script: | cd samples make -j2 displayName: 'make'
Add Clojure and just use the existing Bash record
# Popular languages appear at the top of language dropdowns # # This file should only be edited by GitHub staff - ActionScript - Bash - C - C# - C++ - CSS - CoffeeScript - Common Lisp - Diff - Emacs Lisp - Erlang - HTML - Haskell - Java - JavaScript - Lua - Objective-C - PHP - Perl - Python - Ruby - SQL - Scala - Scheme - Shell
# Popular languages appear at the top of language dropdowns # # This file should only be edited by GitHub staff - ActionScript - Bash - C - C# - C++ - CSS - Clojure - CoffeeScript - Common Lisp - Diff - Emacs Lisp - Erlang - HTML - Haskell - Java - JavaScript - Lua - Objective-C - PHP - Perl - Python - Ruby - SQL - Scala - Scheme
Add a simple YAML IT for them.
# Test parser settings --- - Statement: SELECT true||false - output: [[truefalse]] --- - Statement: SET parserDoubleQuoted TO 'string' --- - Statement: SET parserInfixBit TO "true" --- - Statement: SET parserInfixLogical TO "true" --- - Statement: SELECT 1|2 - output: [[3]] --- - Statement: SELECT true||false - output: [[true]] --- # Return settings to normal - Statement: SET parserInfixBit TO DEFAULT --- - Statement: SET parserInfixLogical TO DEFAULT --- - Statement: SET parserDoubleQuoted TO DEFAULT ...
Add example pod that returns the pod name
apiVersion: v1 kind: Service metadata: name: hello spec: selector: app: hello ports: - protocol: TCP port: 8080 targetPort: 8080 #type: LoadBalancer type: NodePort --- apiVersion: apps/v1beta1 kind: Deployment metadata: name: hello spec: replicas: 4 template: metadata: labels: app: hello spec: containers: - name: hello image: rilleralle/node-hello-host ports: - containerPort: 8080
Update from Hackage at 2022-09-03T18:44:05Z
homepage: https://github.com/UnaryPlus/number-wall changelog-type: markdown hash: 3e6c658c771af864fe91e2e0b3077e2b3ee858cf64ac6aceacbaf31789deebfe test-bench-deps: base: -any number-wall: -any maintainer: ombspring@gmail.com synopsis: Create number walls and save them as images changelog: | # Revision history for number-wall ## 0.1.0.0 -- 2022-9-3 * First version. Released on an unsuspecting world. basic-deps: memoize: '>=0.2.0 && <1.2' JuicyPixels: '>=3.3 && <3.4' base: '>=4.14.1.0 && <5' mod: '>=0.1.1.0 && <0.2' semirings: '>=0.5.2 && <0.8' all-versions: - 0.1.0.0 author: Owen Bechtel latest: 0.1.0.0 description-type: markdown description: | # number-wall Create number walls and save them as images. Documentation is available on [Hackage](https://hackage.haskell.org/package/number-wall). license-name: MIT