| column | dtype | min | max |
|--------|-------|-----|-----|
| content | large_string (lengths) | 3 | 20.5k |
| url | large_string (lengths) | 54 | 193 |
| branch | large_string | 4 values | |
| source | large_string | 42 values | |
| embeddings | list (lengths) | 384 | 384 |
| score | float64 | -0.21 | 0.65 |
# mm2-gmk-migration Migrate to Google Managed Kafka using MirrorMaker2 [![Open in Cloud Shell](http://gstatic.com/cloudssh/images/open-btn.svg)](https://console.cloud.google.com/cloudshell/editor?cloudshell\_git\_repo=https%3A%2F%2Fgithub.com%2Fmandeeptrehan%2Fmm2-gmk-migration.git) ## Setup Terraform Workspace 1. Upda...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mm2-gmk-migration/README.md
main
gcp-professional-services
[ -0.0281795933842659, -0.09259697049856186, 0.03911697119474411, -0.0073614101856946945, -0.05205727368593216, -0.038899824023246765, -0.11738685518503189, -0.05260925367474556, -0.03315167501568794, 0.1317763477563858, -0.013080386444926262, -0.1308051198720932, -0.009032472036778927, -0.0...
-0.014351
# Hashpipeline ## Overview This solution creates a way to alert security teams when a file containing US Social Security Numbers (SSNs) is found. While the DLP API in GCP offers the ability to look for SSNs, it may not be accurate, especially if there are other items such as account numbers that lo...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-dlp-hash-pipeline/README.md
main
gcp-professional-services
[ -0.0741424709558487, 0.0221727192401886, -0.06395049393177032, -0.054168570786714554, -0.06622929871082306, -0.018757596611976624, 0.027738982811570168, -0.06167647987604141, 0.06955581158399582, -0.018902333453297615, -0.03460170701146126, 0.02144024334847927, 0.057062502950429916, -0.022...
-0.037324
a list of valid and random Social Security Numbers \* Store the plain text in `scripts/socials.txt` \* Hash the numbers (normalized without dashes) using HMAC-SHA256 and the key generated from `make create\_key` \* Store the hashed values in Firestore under the collection specified in the terraform variable: `firestore...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-dlp-hash-pipeline/README.md
main
gcp-professional-services
[ -0.03282053396105766, 0.07085782289505005, -0.07435569912195206, 0.013497263193130493, 0.007571707013994455, 0.0009643802768550813, -0.02010771632194519, -0.02215591073036194, 0.03858019411563873, 0.043426379561424255, -0.007814310491085052, 0.005435900762677193, 0.0947158932685852, -0.063...
-0.041117
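The dataflow-dlp-hash-pipeline row above describes normalizing SSNs (stripping dashes) and hashing them with HMAC-SHA256 using a generated key. A minimal sketch of that hashing step, assuming a `hash_ssn` helper name and inline key handling that are illustrative rather than the repo's actual code (the real key comes from `make create_key` and hashes are written to Firestore):

```python
import hmac
import hashlib

def hash_ssn(ssn: str, key: bytes) -> str:
    """Normalize an SSN by stripping dashes, then compute its
    HMAC-SHA256 hex digest with the given key. Function name and
    key handling are assumptions for illustration."""
    normalized = ssn.replace("-", "")
    return hmac.new(key, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# The same SSN hashes identically regardless of dash formatting,
# which is what makes exact-match lookup against the stored hashes work.
key = b"example-key"
assert hash_ssn("123-45-6789", key) == hash_ssn("123456789", key)
```

Keyed hashing (rather than a plain SHA-256) means an attacker who obtains the stored hashes cannot brute-force the small SSN space without also obtaining the key.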
# Data Generator This directory shows a series of pipelines used to generate data in GCS or BigQuery. The intention for these pipelines is to be a tool for partners, customers and SCEs who want to create a dummy dataset that looks like the schema of their actual data in order to run some queries in BigQuery. There are...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-data-generator/README.md
main
gcp-professional-services
[ -0.03840050846338272, -0.006651861127465963, -0.07306468486785889, 0.03582092002034187, -0.06575320661067963, -0.04747123271226883, -0.04344186931848526, -0.014201977290213108, -0.0511937253177166, -0.000032896004995564, 0.03613395243883133, -0.06058288738131523, 0.06676173955202103, -0.11...
0.083373
using the `--schema\_file` parameter with a file containing a list of json objects with `name`, `type`, `mode` and optionally `description` fields. This form follows the output of `bq show --format=json --schema`. This data generator now supports nested types like `RECORD`/`STRUCT`. Note that the approach taken was to...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-data-generator/README.md
main
gcp-professional-services
[ -0.06597451120615005, 0.03629276901483536, 0.03764444589614868, 0.02610612101852894, -0.08073974400758743, -0.005748828407377005, -0.0382661335170269, -0.0018743528053164482, -0.020285069942474365, -0.043007414788007736, -0.04264337942004204, -0.03460518270730972, 0.005612408742308617, 0.0...
-0.017094
use [`bq\_table\_resizer.py`](bigquery-scripts/bq\_table\_resizer.py) to copy the table into itself until it reaches the desired size. ``` --output\_bq\_table=project:dataset.table ``` #### Sparsity (optional) Data is seldom full for every record so you can specify the probability of a NULLABLE column being null with t...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-data-generator/README.md
main
gcp-professional-services
[ 0.013928405940532684, 0.04161693528294563, -0.07179169356822968, 0.01343106385320425, -0.02964864857494831, -0.043158821761608124, 0.0489763468503952, 0.05131696164608002, -0.10776819288730621, 0.026270179077982903, 0.0015059822471812367, -0.03670961409807205, 0.05703308433294296, -0.11084...
-0.096147
to `setup.py`. ### Generating Joinable tables Snowflake schema To generate multiple tables that join based on certain keys, start by generating the central fact table with the above described [`data\_generator\_pipeline.py`](data-generator-pipeline/data\_generator\_pipeline.py). Then use [`data\_generator\_joinable\_ta...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-data-generator/README.md
main
gcp-professional-services
[ -0.0009771800832822919, 0.024740761145949364, -0.00403068820014596, 0.027149077504873276, -0.0007650683401152492, -0.03167174756526947, 0.01536385528743267, 0.04597858712077141, -0.11525584012269974, -0.02711702138185501, -0.046231698244810104, -0.029409263283014297, 0.03945162519812584, -...
-0.07843
--dataset= \ --table= \ --partitioning\_column date \ --source\_file=files\_to\_load.txt ``` ### BigQuery Histogram Tool This script will create a BigQuery table containing the hashes of the key columns specified as a comma-separated list to the `--key\_cols` parameter and the frequency for which that group of key colu...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-data-generator/README.md
main
gcp-professional-services
[ 0.03954862058162689, 0.03211835026741028, -0.04045751318335533, -0.0072896359488368034, -0.04099714010953903, -0.030019298195838928, 0.0429144985973835, 0.06404799968004227, -0.11120475828647614, 0.04435361549258232, 0.01929827779531479, -0.018200740218162537, 0.043655894696712494, -0.1330...
-0.065825
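The BigQuery Histogram Tool row above describes hashing the values of a set of key columns and counting how often each hashed key group occurs. An illustrative in-memory analogue, assuming dict-shaped rows and a `|`-joined key encoding (both assumptions; the real tool operates on BigQuery tables via `--key_cols`):

```python
import hashlib
from collections import Counter

def key_histogram(rows, key_cols):
    """Hash the concatenated key-column values of each row and count
    the frequency of each hashed key group. The join delimiter and
    row representation are illustrative choices."""
    counts = Counter()
    for row in rows:
        key = "|".join(str(row[c]) for c in key_cols)
        digest = hashlib.sha256(key.encode("utf-8")).hexdigest()
        counts[digest] += 1
    return counts

rows = [
    {"user_id": 1, "date": "2024-01-01"},
    {"user_id": 1, "date": "2024-01-01"},
    {"user_id": 2, "date": "2024-01-02"},
]
hist = key_histogram(rows, ["user_id", "date"])
# Two rows share the key (1, "2024-01-01"), one row has (2, "2024-01-02").
```

Hashing the key columns lets the frequency distribution of generated data be compared against the real data's distribution without exposing the raw key values.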
# Cloud Composer Examples This example demonstrates how to test Airflow DAGs and then deploy them to Cloud Composer using Cloud Build. ## Run build using Cloud Build The included cloudbuild.yaml file has the following flow: 1. Build Airflow DAGs Builder: Builds a docker image with airflow dependencies included 2...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-cicd/README.md
main
gcp-professional-services
[ -0.039927516132593155, -0.008133147843182087, 0.022703522816300392, 0.03552880883216858, 0.018087521195411682, -0.03957688808441162, -0.05044447258114815, -0.034799084067344666, 0.01434397790580988, 0.017466288059949875, -0.0415244922041893, -0.09894216060638428, -0.01543989684432745, -0.0...
-0.11298
# iap-idp-connect ## Introduction This service programmatically connects IAP (Identity Aware Proxy) to IdP (Identity Platform) in Google Cloud Platform. With this program, you connect identity providers (including multi-tenants) with IAP backend services. For example, you connect SAML integrations d...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iap-idp-connect/README.md
main
gcp-professional-services
[ -0.0781773254275322, -0.02995501086115837, 0.029668956995010376, -0.05482974275946617, -0.06620022654533386, -0.03559781238436699, 0.060485970228910446, -0.014887174591422081, -0.004298059269785881, 0.016719257459044456, 0.015393131412565708, -0.01724088378250599, 0.05989851802587509, -0.0...
-0.002464
# Testing GCS Connector for Dataproc The Google Cloud Storage connector for Hadoop (HCFS) enables running [Apache Hadoop](http://hadoop.apache.org/) or [Apache Spark](http://spark.apache.org/) jobs directly on data in [GCS](https://cloud.google.com/storage) by implementing the Hadoop FileSystem interface. The connector...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-gcs-connector/README.md
main
gcp-professional-services
[ -0.11188250035047531, -0.07839424163103104, 0.016862642019987106, 0.0007271149661391973, 0.044894687831401825, -0.045413143932819366, -0.04695798084139824, -0.0207506213337183, -0.032408248633146286, 0.07001429796218872, 0.05573330447077751, 0.003934826236218214, 0.03473132848739624, -0.06...
-0.030148
directory of this project `gcs-connector-poc/`, the script will run the following commands to build the Dataproc cluster using the GCS connector. ```bash cd terraform terraform init terraform apply ``` ### 6. Test the Dataproc cluster The script `test\_gcs\_connector.sh` will test the GCS Connector on your Dataproc clu...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-gcs-connector/README.md
main
gcp-professional-services
[ -0.062031105160713196, -0.04532192274928093, -0.03031703270971775, 0.006414007395505905, -0.02114245295524597, -0.0405413918197155, -0.030352244153618813, -0.010683225467801094, -0.033007748425006866, 0.1194751188158989, -0.007094436790794134, -0.0823860913515091, 0.07454701513051987, -0.0...
-0.04879
## Inputs | Name | Description | Type | Default | Required | |------|-------------|------|---------|:-----:| | dataproc\\_cluster | Name for dataproc cluster | `any` | n/a | yes | | dataproc\\_subnet | Name for dataproc subnetwork to create | `any` | n/a | yes | | hadoop\\_version | Hadoop version for the GCS connector...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-gcs-connector/terraform/README.md
main
gcp-professional-services
[ -0.0587148554623127, -0.07360813021659851, -0.00162305252160877, 0.006275252439081669, -0.036844402551651, -0.010549348779022694, -0.039107926189899445, -0.049482762813568115, -0.042662832885980606, 0.08246061950922012, -0.021365897729992867, -0.10835859924554825, 0.04443613439798355, -0.0...
0.022145
# Custom Dataproc Scheduled Cluster Deletion This repository provides scripts for launching a Google Cloud Dataproc Cluster while specifying the maximum idle time after which the cluster will be deleted. The custom scripts will consider active SSH sessions and YARN based jobs in determining whether or not the cluster i...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-idle-shutdown/README.md
main
gcp-professional-services
[ -0.0867605209350586, -0.04353182390332222, 0.010719845071434975, 0.04608964920043945, 0.03748501092195511, -0.013849700801074505, 0.047495342791080475, -0.06400144845247269, -0.004238716792315245, 0.06755921989679337, 0.014604602940380573, -0.020618826150894165, 0.035619158297777176, -0.00...
0.065006
m, h, d” (seconds, minutes, hours, days, respectively). Examples: “30m” or “1d” (30 minutes or 1 day from when the cluster becomes idle). 4. [Optional] Specify, as the value of the metadata key “key\_process\_list”, a semicolon-separated list of process names (in addition to YARN jobs and active SSH connections) for w...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-idle-shutdown/README.md
main
gcp-professional-services
[ -0.008402340114116669, -0.00968314241617918, -0.055985115468502045, 0.04553182050585747, 0.06136763095855713, -0.022216033190488815, 0.02529115416109562, 0.026980433613061905, -0.020810887217521667, 0.036854103207588196, 0.02851616032421589, -0.04054223373532295, 0.048382554203271866, -0.0...
0.055782
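The dataproc-idle-shutdown row above describes two metadata values: an idle-time string with `s`/`m`/`h`/`d` units (e.g. “30m”, “1d”) and a semicolon-separated `key_process_list`. Minimal parsers for those formats, as a sketch only; the repository's actual scripts may parse them differently:

```python
# Seconds-per-unit table for the idle-time suffixes described above.
UNITS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

def parse_idle_time(value: str) -> int:
    """Convert an idle-time string like "30m" or "1d" into seconds."""
    return int(value[:-1]) * UNITS[value[-1]]

def parse_process_list(value: str) -> list:
    """Split a `key_process_list` metadata value on semicolons,
    dropping empty entries and surrounding whitespace."""
    return [p.strip() for p in value.split(";") if p.strip()]

assert parse_idle_time("30m") == 1800
assert parse_process_list("mysqld; jupyter") == ["mysqld", "jupyter"]
```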
Unit Tests === Run unit tests after installing development dependencies: ```bash pip install -r requirements-dev.txt pytest ``` Save and Replay VM Deletion Events === It is useful to replay VM deletion events to test out changes to the Background Function. See the [Replay Quickstart][replay-qs] for more information. 1. ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcf-pubsub-vm-delete-event-handler/DEVELOPER.md
main
gcp-professional-services
[ 0.029339581727981567, -0.04017949849367142, 0.0059486799873411655, -0.0638306513428688, 0.03489717096090317, -0.03741772845387459, 0.004672881681472063, -0.06305859982967377, 0.017968518659472466, 0.026437245309352875, 0.0012722954852506518, 0.011537676677107811, 0.04781258851289749, 0.015...
-0.047181
"ip\_address": "", I dns\_vm\_gc 578383257362746 2019-06-12 01:30:09.145 "operation": { I dns\_vm\_gc 578383257362746 2019-06-12 01:30:09.145 "id": "971500189857477422", I dns\_vm\_gc 578383257362746 2019-06-12 01:30:09.145 "name": "operation-1560300992590-58b15e267f2cc-e7529c4d-f1343924", I dns\_vm\_gc 578383257362746...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcf-pubsub-vm-delete-event-handler/DEVELOPER.md
main
gcp-professional-services
[ -0.031035928055644035, 0.06352248042821884, -0.015381867997348309, -0.05561297759413719, -0.09377782046794891, -0.10079735517501831, -0.0021555847488343716, -0.0560959167778492, 0.04943275824189186, 0.07953589409589767, 0.011443695053458214, -0.06758588552474976, -0.06365055590867996, -0.0...
0.049115
2019-06-12 01:30:20.438 "zone": "us-west1-a" I dns\_vm\_gc 578389534045377 2019-06-12 01:30:20.438 }, I dns\_vm\_gc 578389534045377 2019-06-12 01:30:20.438 "type": "gce\_instance" I dns\_vm\_gc 578389534045377 2019-06-12 01:30:20.438 }, I dns\_vm\_gc 578389534045377 2019-06-12 01:30:20.438 "severity": "INFO", I dns\_vm...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcf-pubsub-vm-delete-event-handler/DEVELOPER.md
main
gcp-professional-services
[ -0.013017968274652958, -0.007330212276428938, 0.014226799830794334, -0.08957739919424057, -0.050752732902765274, -0.11440486460924149, -0.019742976874113083, -0.032607369124889374, 0.02048529125750065, 0.045127447694540024, 0.0007654479704797268, -0.0851215124130249, -0.048945918679237366, ...
0.026565
VM DNS Garbage Collection === This folder contains a [Background Function][bg] which deletes DNS A records when a VM is deleted. \*\*Please note\*\* DNS record deletion is implemented but cannot be guaranteed. A race exists between the function obtaining the VM IP address and the `compute.instances.delete` operat...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcf-pubsub-vm-delete-event-handler/README.md
main
gcp-professional-services
[ -0.04251698777079582, 0.04649516940116882, 0.02818468026816845, -0.032154008746147156, 0.05632960796356201, -0.07035484910011292, 0.01526801660656929, -0.1412254124879837, 0.1310928761959076, 0.04322171211242676, 0.019138706848025322, 0.04227902367711067, 0.007251270115375519, 0.0058381888...
0.133858
delete DNS records in the host project. This role may be granted at the Shared VPC project level. Compute Viewer --- Grant the Compute Viewer role to the dns-vm-gc service account. Compute Viewer allows the DNS VM GC function to read the IP address of the VM, necessary to ensure the correct A record is deleted. This ro...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcf-pubsub-vm-delete-event-handler/README.md
main
gcp-professional-services
[ -0.043767694383859634, -0.029856376349925995, -0.005180039443075657, -0.0031168919522315264, -0.034129250794649124, -0.01439234334975481, 0.015049975365400314, -0.09718445688486099, 0.07262664288282394, 0.06904838979244232, -0.044621292501688004, -0.01430483441799879, 0.03203025832772255, ...
0.017687
to day reporting. The correlation is useful for the rare situation of complete end-to-end tracing. Reporting === Lost Race --- Periodic reporting should be performed to monitor for `NOT\_PROCESSED` results. In the event of a lost race, automatic DNS record deletion is not guaranteed. The following Stackdriver Advanced ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcf-pubsub-vm-delete-event-handler/README.md
main
gcp-professional-services
[ -0.11027620732784271, -0.012106003239750862, 0.016013165935873985, 0.06216007098555565, 0.016156909987330437, -0.0299880038946867, -0.0555608868598938, -0.11083529144525528, 0.0896940603852272, 0.002448177430778742, 0.008473297581076622, -0.014463807456195354, -0.012118076905608177, 0.0425...
0.040281
# Carbon Footprint Dashboard This example shows how to use the prebuilt templates for Carbon Footprint Estimates and create your own Carbon Footprint Dashboard by connecting to carbon reporting exports from BigQuery. # Using Looker Studio Users can use this Data Studio dashboard as-is, or use them as a starting point f...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/carbon-footprint-dashboard/README.md
main
gcp-professional-services
[ 0.008075103163719177, 0.0662723183631897, 0.012678729370236397, 0.0572996623814106, 0.06440812349319458, 0.025420354679226875, -0.0496845506131649, 0.06367488950490952, -0.06563208252191544, 0.05608353018760681, -0.039433203637599945, -0.12332364916801453, 0.04113270714879036, -0.008495993...
0.011907
Copyright 2023 Google. This software is provided as-is, without warranty or representation for any use or purpose. Your use of it is subject to your agreement with Google. # Twilio Conversation Integration with a Virtual Agent using Dialogflow This is an example of how to integrate Twilio Conversation Services with Virt...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-dialogflow-middleware/README.md
main
gcp-professional-services
[ -0.07956753671169281, -0.035168565809726715, -0.004395061172544956, -0.11300206184387207, -0.06138571351766586, -0.06459873169660568, 0.04500236362218857, 0.04946407303214073, 0.018293162807822227, 0.005099930334836245, -0.04527909308671951, -0.03884415328502655, 0.05528300628066063, -0.05...
0.149613
Set your [Google Application Default Credentials][application-default-credentials] by [initializing the Google Cloud SDK][cloud-sdk-init] with the command: ``` gcloud init ``` Generate a credentials file by running the [application-default login](https://cloud.google.com/sdk/gcloud/reference/auth/application-default/lo...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-dialogflow-middleware/README.md
main
gcp-professional-services
[ -0.09207271039485931, 0.007293469738215208, -0.03597346693277359, -0.12155744433403015, -0.05693589150905609, -0.04313342273235321, 0.029104121029376984, 0.009965893812477589, 0.010880663059651852, 0.033180806785821915, 0.0016349587822332978, -0.028965644538402557, 0.07646306604146957, -0....
-0.087877
to the request so our webhook gets invoked 5) Use the Twilio [Interaction API](https://www.twilio.com/docs/flex/developer/conversations/interactions-api) to invoke a handoff to the Flex UI ## How to run the initializer to programmatically create a Twilio conversation > [ConversationInitializer](./src/main/java/com/midd...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-dialogflow-middleware/README.md
main
gcp-professional-services
[ -0.09212331473827362, 0.034212175756692886, 0.0034580908250063658, -0.04583623260259628, -0.01050003431737423, -0.12713439762592316, 0.012608595192432404, 0.09782060980796814, 0.012623624876141548, -0.021733935922384262, -0.010104367509484291, -0.06221367046236992, -0.008083192631602287, 0...
0.11519
# Dataflow Streaming Schema Handler This package contains a set of components required to handle unanticipated incoming streaming data into BigQuery with schema mismatch. The code uses a schema enforcement and DLT (Dead Letter Table) approach to store schema incompatibilities. In case of schema incompatibility detected fr...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-streaming-schema-handler/README.md
main
gcp-professional-services
[ -0.02060059830546379, -0.0007240242557600141, 0.03545646369457245, -0.04439564049243927, 0.022497937083244324, -0.06772954761981964, -0.03193112835288048, -0.02868587151169777, -0.009001721628010273, 0.004495667293667793, 0.004642652813345194, -0.03319685533642769, 0.0005931277992203832, -...
0.037009
# CloudML Marketing (Classification) Model for Banking The goal of this notebook is to create a classification model using CloudML as an alternative to on-premise methods. Along the way you will learn how to store data into BigQuery, fetch and explore that data, understand how to properly partition your dataset, perfor...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-bank-marketing/README.md
main
gcp-professional-services
[ -0.038176607340574265, -0.06898278743028641, -0.0504416897892952, -0.01719662733376026, 0.06012919917702675, 0.04636712372303009, 0.025784846395254135, -0.022871823981404305, 0.03582395240664482, -0.03881213814020157, 0.044231515377759933, -0.06253388524055481, 0.016023755073547363, -0.088...
0.182387
# spanner-interleave-subquery This example contains the benchmark code to examine query efficiency gains of using Cloud Spanner interleaved tables with subqueries. ## Prerequisite Run the following command to create a Cloud Spanner database with [schema.sql](schema.sql). ```bash gcloud spanner databases create ${DATABA...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/spanner-interleave-subquery/README.md
main
gcp-professional-services
[ -0.005775157827883959, -0.04193267226219177, 0.008036796934902668, -0.004965894389897585, -0.03376276418566704, -0.10674393177032471, -0.036940544843673706, -0.03564298152923584, -0.04153730347752571, 0.06463675945997238, -0.003918803762644529, -0.08124624937772751, 0.0825534388422966, -0....
-0.111986
# Fixity Metadata for GCS 🗃 This script pulls metadata and checksums for file archives in Google Cloud Storage and stores them in a manifest file and in BigQuery to track changes over time. The script uses the [BagIt](https://tools.ietf.org/html/rfc8493) specification. ## Overview Each time this Fixity function is run...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-fixity-function/README.md
main
gcp-professional-services
[ -0.07042159140110016, 0.023232467472553253, 0.017297225072979927, -0.027471035718917847, 0.06845714151859283, -0.10746903717517853, 0.0446842685341835, -0.025086477398872375, 0.04143248870968819, 0.07223854213953018, -0.01518652681261301, 0.022564375773072243, 0.0015456213150173426, -0.102...
0.05681
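The Fixity row above describes pulling checksums for archives in GCS and storing them in a BagIt-style manifest to track changes over time. A minimal in-memory sketch of the manifest-building step, assuming SHA-256 (BagIt manifests support several algorithms) and a `{name: bytes}` input shape instead of real GCS reads:

```python
import hashlib

def file_checksum(data: bytes) -> str:
    """Hex digest used as a fixity checksum. SHA-256 is an
    assumption here, not necessarily the algorithm the repo uses."""
    return hashlib.sha256(data).hexdigest()

def build_manifest(files):
    """Map each object name to its checksum, similar in spirit to a
    BagIt `manifest-sha256.txt`. `files` is {name: bytes} for this
    sketch; the real function reads objects from GCS."""
    return {name: file_checksum(data) for name, data in files.items()}

manifest = build_manifest({"archive/a.txt": b"hello", "archive/b.txt": b"world"})
```

Re-running the function and diffing the new manifest against the stored one is what surfaces silent changes or corruption in the archived objects.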
# Setup Clone this repository and run locally, or use Cloud Shell to walk through the steps: [![Open in Cloud Shell](https://gstatic.com/cloudssh/images/open-btn.png)](https://ssh.cloud.google.com/cloudshell/open?page=shell&cloudshell\_git\_repo=https://github.com/GoogleCloudPlatform/professional-services&cloudshell\_t...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-fixity-function/docs/setup.md
main
gcp-professional-services
[ -0.02445610985159874, -0.057490233331918716, 0.011912900023162365, -0.038602810353040695, 0.00600008899345994, -0.028410371392965317, -0.04497884586453438, 0.014056479558348656, -0.007477901875972748, 0.13189545273780823, 0.02276723086833954, -0.07641435414552689, 0.06709278374910355, -0.0...
-0.087153
# Setup Clone this repository and run locally, or use Cloud Shell to walk through the steps: [![Open in Cloud Shell](https://gstatic.com/cloudssh/images/open-btn.png)](https://ssh.cloud.google.com/cloudshell/open?page=shell&cloudshell\_git\_repo=https://github.com/GoogleCloudPlatform/professional-services&cloudshell\_t...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-fixity-function/docs/remove.md
main
gcp-professional-services
[ -0.02379748225212097, -0.03957981616258621, 0.04526985064148903, -0.0454796627163887, 0.0009774519130587578, -0.04309573769569397, -0.03326983377337456, -0.0006604720838367939, -0.018461961299180984, 0.13211466372013092, 0.027568671852350235, -0.06203806772828102, 0.061559099704027176, -0....
-0.070764
# Terraform Config Validator Policy Library This repo contains a library of constraint templates and sample constraints to be used for Terraform resource change requests. If you're looking for the CAI variant, please see [Config Validator](https://github.com/lykaasegura/w-secteam-repo). Everything in this repository ha...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/README.md
main
gcp-professional-services
[ -0.08367093652486801, 0.026643402874469757, 0.046441853046417236, -0.00418784748762846, 0.02311522886157036, -0.0182906836271286, 0.020653100684285164, -0.09488620609045029, 0.0014811797300353646, 0.056059613823890686, -0.02227538451552391, -0.07439955323934555, 0.08456964045763016, 0.0310...
0.005188
# This text will be replaced #ENDINLINE ``` Replaced: ``` #INLINE("my\_rule.rego") #contents of my\_rule.rego #ENDINLINE ``` #### Linting Policies Config Validator provides a policy linter. You can invoke it as: ``` go get github.com/GoogleCloudPlatform/config-validator/cmd/policy-tool policy-tool --policies ./policies...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/README.md
main
gcp-professional-services
[ -0.08812500536441803, 0.03617151081562042, 0.05497997999191284, -0.0879971832036972, 0.029143016785383224, -0.03969632834196091, -0.018276693299412727, -0.0467427559196949, -0.05637824535369873, 0.06964663416147232, 0.04247809946537018, -0.046687278896570206, 0.02964792214334011, -0.069952...
-0.101932
## Config Validator | Setup & User Guide ### Go from setup to proof-of-concept in under 1 hour \*\*Table of Contents\*\* \* [Overview](#overview) \* [How to set up constraints with Policy Library](#how-to-set-up-constraints-with-policy-library) \* [Get started with the Policy Library repository](#get-started-with-the-p...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/user_guide.md
main
gcp-professional-services
[ -0.07541549950838089, -0.010816989466547966, -0.04300122708082199, -0.05388215556740761, 0.024271253496408463, 0.002351989271119237, 0.03493741899728775, -0.02091469056904316, -0.12738417088985443, 0.01318049244582653, 0.05168008431792259, -0.05100454017519951, 0.09564601629972458, 0.02521...
0.001399
wish to use, create constraint YAML files corresponding to those templates, and place them under `policies/constraints`. Commit the newly created constraint files to \*\*your\*\* Git repository. For example, assuming you have created a Git repository named "policy-library" under your GitHub account, you can use the fol...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/user_guide.md
main
gcp-professional-services
[ -0.004609712399542332, -0.02773446775972843, -0.026546046137809753, -0.0458003431558609, -0.009787403978407383, 0.059383049607276917, -0.012622518464922905, 0.026284709572792053, -0.0417330376803875, 0.06395780295133591, 0.0005631447420455515, -0.03272462636232376, 0.03432795777916908, -0....
-0.010991
sample IAM domain restriction constraint: ``` cp samples/constraints/iam\_service\_accounts\_only.yaml policies/constraints ``` Let's take a look at this constraint: ``` apiVersion: constraints.gatekeeper.sh/v1beta1 kind: TFGCPIAMAllowedPolicyMemberDomainsConstraintV2 metadata: name: service-accounts-only annotations: ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/user_guide.md
main
gcp-professional-services
[ -0.05955628678202629, -0.024526171386241913, 0.033617883920669556, -0.05748281255364418, 0.0028501616325229406, -0.05210351571440697, 0.054386526346206665, -0.11316812038421631, 0.01633746363222599, 0.10824912786483765, 0.0035217369440943003, -0.1259901374578476, 0.09568040072917938, 0.038...
-0.006619
# Functional Principles of the Constraint Framework You'll notice that this repository contains a handful of folders, each with different items. It's confusing at first, so let's dive into it! First, let's start with how the library is organized. ## Folder Structure The folder structure below contains a TL;DR explanati...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/functional_principles.md
main
gcp-professional-services
[ -0.05167834460735321, -0.006730480585247278, -0.06694655120372772, -0.05316251516342163, 0.08361507952213287, 0.03970588371157646, 0.0899895653128624, 0.0036390528548508883, -0.042015258222818375, 0.03220576420426369, 0.014059260487556458, 0.011907054111361504, 0.014565731398761272, 0.0315...
0.099545
domain(s) you've passed in. You can create multiple constraints for any given ConstraintTemplate; you just need to make sure that the rules don't conflict with one another. For example, any allowlist/denylist policy would be difficult to create multiple constraints for. The reason is that if one constraint is of type `...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/functional_principles.md
main
gcp-professional-services
[ -0.03798939287662506, -0.009243086911737919, 0.01490385178476572, -0.07287414371967316, 0.006133065093308687, 0.013536143116652966, 0.0634828433394432, -0.08341819792985916, -0.02610466443002224, 0.02066975086927414, -0.0393926277756691, -0.07961614429950714, 0.06997792422771454, 0.0386157...
-0.059814
"\*" } ``` There are two parts to this policy. We have a `violation` object, which contains line-by-line logic statements. The way rego works is that the policy will run line-by-line, and will `break` if any of the conditions don't pass. For instance, if the `resource.type` is \*\*not\*\* "google\_project\_iam\_binding...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/functional_principles.md
main
gcp-professional-services
[ -0.07401347905397415, 0.07451707869768143, -0.010502059943974018, -0.0024731378071010113, -0.054887108504772186, -0.00223120697773993, 0.1341153234243393, -0.05934140086174011, -0.07788950204849243, 0.03110680729150772, -0.00511779822409153, -0.0277834665030241, 0.047616783529520035, 0.027...
0.119793
`mode` and `roles`. If you look at the ConstraintTemplate, you can see that these two fields are defined and described. `Mode` is a string enumerable that \*must\* be either denylist or allowlist. `gcloud beta terraform vet` will actually error out if this is not upheld in the associated constraint(s). ## Additional Re...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-resource-change-policy-library/docs/functional_principles.md
main
gcp-professional-services
[ -0.06893104314804077, 0.05003847926855087, 0.06046844646334648, -0.011869418434798717, 0.03723585233092308, 0.002379043959081173, 0.04317258670926094, -0.09998626261949539, -0.07868068665266037, 0.04634565860033035, -0.0384625606238842, -0.041174646466970444, 0.07342555373907089, 0.0374046...
0.010686
# Cloud Support API (v2) Samples ## About the Support API The [Cloud Support API](https://cloud.google.com/support/docs/reference/rest) provides programmatic access to Google Cloud's Support interface. You can use the Cloud Support API to integrate Cloud Customer Care with your organization's customer relationship mana...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-support/appengine_python_example/README.md
main
gcp-professional-services
[ -0.14295049011707306, -0.035127364099025726, 0.04075104743242264, -0.025556661188602448, -0.030121196061372757, -0.011632217094302177, -0.05779070034623146, 0.014461216516792774, -0.046075526624917984, 0.027799149975180626, -0.0007192067569121718, 0.0030513727106153965, 0.09125176072120667, ...
0.032069
# java\_working\_app\_example This code is created by Eugene Enclona, If you have any questions about this code reach out to eenclona@google.com ## Overview The purpose of this code is to show how one can build a sample Java app to take advantage of the Cloud Support API. One functionality in this app is to modify the ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-support/java_working_app_example/README.md
main
gcp-professional-services
[ -0.07862231880426407, 0.012553559616208076, 0.08589577674865723, -0.09386363625526428, 0.02742445282638073, -0.059359360486269, -0.002264875452965498, 0.026888148859143257, -0.006272569764405489, 0.025236379355192184, -0.008956125006079674, -0.031509701162576675, 0.07144051045179367, -0.03...
0.043606
# Cloud Support API (v2) Samples The [Cloud Support API](https://cloud.google.com/support/docs/reference/rest) provides programmatic access to Google Cloud's Support interface. You can use the Cloud Support API to integrate Cloud Customer Care with your organization's customer relationship management (CRM) system. The ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-support/java_starter_example/README.md
main
gcp-professional-services
[ -0.10774070769548416, -0.05701761320233345, 0.018803458660840988, -0.006570742931216955, -0.027111602947115898, 0.03394756093621254, 0.005814963020384312, 0.01052988413721323, -0.03686006739735603, 0.08016616106033325, 0.00749612133949995, -0.027007795870304108, 0.09027278423309326, -0.034...
-0.04572
## Overview Application Framework to execute various operations on redis to evaluate key performance metrics such as CPU & Memory utilization, Bytes transferred, Time per command etc. For complete list of metrics refer [MemoryStore Redis Metrics](https://cloud.google.com/memorystore/docs/redis/supported-monitoring-metr...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/redis-benchmarks/redis-benchmarks/README.md
main
gcp-professional-services
[ -0.013214468024671078, -0.014734527096152306, -0.11106491833925247, -0.027392182499170303, -0.04017627239227295, -0.08519042283296585, -0.0033125837799161673, 0.027782579883933067, -0.004035237245261669, 0.02325667440891266, -0.04264377802610397, -0.01988036185503006, 0.024082796648144722, ...
0.202085
method in [WorkloadExecutor](./src/main/java/com/google/cloud/pso/benchmarks/redis/WorkloadExecutor.java) for reference. ## Disclaimer This project is not an official Google project. It is not supported by Google and disclaims all warranties as to its quality, merchantability, or fitness for a particular purpose.
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/redis-benchmarks/redis-benchmarks/README.md
main
gcp-professional-services
[ -0.07263834774494171, 0.000965493090916425, -0.04454900324344635, -0.027831392362713814, -0.060742784291505814, -0.0958072617650032, -0.03267129138112068, -0.005662828218191862, -0.030757570639252663, 0.009404248557984829, -0.032551735639572144, 0.03837579861283302, -0.000029719440135522746,...
0.145215
# BigQuery cross-project slots utilization monitoring This solution was written to help monitor slot utilization across multiple projects, while breaking down allocation per project. This is relevant for customers using [flat-rate pricing](https://cloud.google.com/bigquery/pricing#flat\_rate\_pricing). ## Background...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-cross-project-slot-monitoring/README.md
main
gcp-professional-services
[ 0.013849270530045033, -0.04509427025914192, -0.02051021344959736, 0.008189195767045021, -0.0025700926780700684, -0.01969592273235321, 0.06257719546556473, -0.053936153650283813, 0.01911991834640503, 0.03998879715800285, -0.06612762808799744, -0.06636008620262146, 0.007299912627786398, -0.0...
0.000858
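The arithmetic behind slot utilization can be sketched as follows. This is an assumed illustration, not the solution's actual code: BigQuery reports `total_slot_ms` per job, and dividing it by the job's elapsed milliseconds gives the average number of slots the job consumed:

```python
def avg_slots(total_slot_ms: float, elapsed_ms: float) -> float:
    """Average slots a BigQuery job consumed over its lifetime."""
    if elapsed_ms <= 0:
        raise ValueError("elapsed_ms must be positive")
    return total_slot_ms / elapsed_ms

# A job that burned 5,000,000 slot-ms over a 10,000 ms run
# used 500 slots on average.
print(avg_slots(5_000_000, 10_000))  # 500.0
```

Summing this figure across concurrent jobs, grouped by project, is what lets a flat-rate customer see how the shared slot pool is split.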
permissions: + Billing Viewer on the Billing Account ID ([documentation](https://cloud.google.com/billing/docs/how-to/billing-access#update\_billing\_permissions)). + Monitoring Editor on the project hosting the Stackdriver account. + Monitoring Viewer on all projects. It will be easier to apply this at the Folder / Org level. 6. Metr...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-cross-project-slot-monitoring/README.md
main
gcp-professional-services
[ 0.01303411740809679, -0.06112099811434746, -0.016662612557411194, 0.02302948571741581, -0.0023571604397147894, -0.007204737514257431, -0.03870640695095062, -0.016000812873244286, 0.03828296437859535, 0.039494991302490234, -0.05117752030491829, -0.05094793438911438, 0.037112098187208176, 0....
-0.057705
## python The python folder contains the EphemeralInstance module and associated usage examples. The EphemeralInstance module can be used to deploy and destroy projects, enable services, and bind principals to the deployed projects. Available methods: \* various get methods for project metadata \* deploy\_project() - c...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ephemeral-projects/README.md
main
gcp-professional-services
[ -0.0771193653345108, 0.012630315497517586, -0.026990052312612534, -0.040111906826496124, -0.0011527453316375613, -0.08865921944379807, 0.07192185521125793, 0.00865268800407648, -0.019748583436012268, 0.02626117318868637, 0.022359251976013184, -0.010565130040049553, 0.10191817581653595, -0....
0.114387
This is a collection of Cloud Build examples for common CI/CD tasks, to be used for applications whose repository is connected to Google Cloud Build. These `.yaml` examples can be copied to the root directory of applications, where their traditional Dockerfiles are, or you can choose to add them to a `cloudbuild/`, `...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudbuild-application-cicd/README.md
main
gcp-professional-services
[ -0.07228155434131622, -0.002831009216606617, 0.037552040070295334, -0.06294722110033035, 0.00426868349313736, -0.04599050432443619, 0.027870433405041695, -0.054925061762332916, 0.02727597951889038, 0.06055830419063568, 0.011706511490046978, -0.06345093995332718, 0.06585149466991425, -0.076...
0.097339
# Universal Application Containerizer with Push to Artifact Registry This example allows you to containerize an application and push it to a specific Artifact Registry Docker repository. Cloud Build will containerize the application based on the `Dockerfile` defined within the repository used. Two image tags will be creat...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudbuild-application-cicd/containerize/README.md
main
gcp-professional-services
[ -0.004073375836014748, 0.03594174236059189, 0.05389413237571716, -0.05492732301354408, -0.02219928614795208, -0.046723488718271255, -0.010163952596485615, 0.007435974664986134, -0.00979791209101677, 0.07549752295017242, -0.005023317411541939, -0.07476437836885452, 0.04780775308609009, -0.0...
-0.013523
# Getting started Copy the `deploy\_cloud\_run.yaml` file into your application's git repository. Modify the substitution default values as needed. # Using the Terraform Example: This Terraform is an example of Cloud Build Triggers and some example dependencies that allow you to deploy the latest version of your applica...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudbuild-application-cicd/deploy_to_cloud_run/README.md
main
gcp-professional-services
[ 0.014015263877809048, -0.031236357986927032, 0.036858148872852325, -0.06458502262830734, -0.06393203139305115, -0.0057793715968728065, -0.01947704702615738, 0.007156902924180031, 0.033718291670084, 0.10351760685443878, 0.0111978305503726, -0.08924902975559235, 0.04180603846907616, -0.05323...
-0.036917
# Internal HTTP Load Balancer Terraform Example This example shows how to deploy an internal HTTP load balancer using plain terraform (i.e. without using any external modules). This example will create all the required resources needed for a working internal load balancer except the GCP project. ## Design In this examp...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terraform-ilb/README.md
main
gcp-professional-services
[ -0.09860943257808685, -0.013367272913455963, 0.044117093086242676, -0.04692130908370018, -0.03666994348168373, -0.10323373973369598, -0.013253082521259785, -0.05708620324730873, 0.002553197555243969, 0.04668639600276947, -0.021843906491994858, -0.07149060070514679, 0.03038904257118702, -0....
0.024912
# Selective deployment Organizing code across multiple folders within a single version control repository such as GitHub is a very common practice; we refer to this as a multi-folder repository. The \*\*selective deployment\*\* approach lets you find the folders changed within your repository and only run the logic ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudbuild-selective-deployment/README.md
main
gcp-professional-services
[ -0.0661468654870987, -0.07823673635721207, -0.021724049001932144, -0.005662140902131796, -0.009926059283316135, -0.058936722576618195, 0.0208986047655344, -0.011385380290448666, 0.09267760068178177, 0.04996709153056145, 0.08347702026367188, -0.04839872196316719, 0.06577655673027039, -0.005...
0.087767
git clone. Cloud Build in its default behaviour uses a shallow copy of the repository (i.e. only the code associated with the commit that triggered the current build). A shallow copy prevents us from performing git operations like git diff. However, we can use the following step in Cloud Build to fetch unshallo...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudbuild-selective-deployment/README.md
main
gcp-professional-services
[ -0.07341022789478302, -0.01907547004520893, 0.09670638293027878, 0.03328930586576462, 0.03087613172829151, -0.09549976885318756, 0.020359307527542114, -0.10010924935340881, 0.09476249665021896, 0.03160242363810539, 0.06315718591213226, -0.0010350175434723496, 0.03345998004078865, -0.091622...
-0.061873
# ML Ops with Vertex AI for enterprises Enterprises frequently have specific requirements, especially around security and scale, that are often not addressed by other examples. In this example we demonstrate a machine learning use case implementation that respects typical security requirements, and that includes aut...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/README.md
main
gcp-professional-services
[ -0.05949186906218529, -0.011910852044820786, -0.012392749078571796, 0.009337657131254673, -0.02687683515250683, -0.05455399677157402, -0.06333710998296738, 0.028489060699939728, -0.04441137984395027, 0.021497538313269615, -0.05674333870410919, -0.030218567699193954, 0.042029839009046555, -...
0.173565
# MLOps with Vertex AI ## Set up the experimentation notebook Once the environment has been deployed, the first step is to open the Jupyter notebook available in the [Vertex Workbench section](https://console.cloud.google.com/vertex-ai/workbench/list/managed), under the specific region (e.g. `europe-west4`). Use the `O...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/doc/03-MLOPS.md
main
gcp-professional-services
[ -0.11906079202890396, -0.04111579433083534, -0.03610498085618019, -0.024792570620775223, -0.011320523917675018, -0.054852068424224854, 0.01063940767198801, 0.012984945438802242, -0.05091843008995056, 0.03663992881774902, -0.015543426387012005, -0.1074286699295044, 0.03739451989531517, -0.0...
0.103624
# MLOps with Vertex AI - Git integration with Cloud Build ## Accessing GitHub from Cloud Build via SSH keys Follow this procedure to create a private SSH key to be used for Github access from Cloud Build: https://cloud.google.com/build/docs/access-github-from-build ``` mkdir workingdir cd workingdir ssh-keygen -t rsa -...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/doc/02-GIT_SETUP.md
main
gcp-professional-services
[ -0.03995026648044586, -0.021102972328662872, -0.007164213340729475, -0.008527150377631187, -0.021957475692033768, -0.012396448291838169, -0.048207685351371765, -0.0188295841217041, 0.025532441213726997, 0.0670701190829277, 0.03392026573419571, -0.024948136880993843, 0.07263531535863876, -0...
-0.008838
# Considerations with VPC SC ## Cloud Build Use Cloud Build [private pools](https://cloud.google.com/build/docs/private-pools/using-vpc-service-controls) or create a VPC SC ingress rule or [access level](https://cloud.google.com/access-context-manager/docs/create-basic-access-level#members-example) adding the Cloud Bui...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/doc/VPC-SC.md
main
gcp-professional-services
[ -0.11382612586021423, -0.04837745428085327, -0.029033435508608818, 0.024569593369960785, -0.002758851507678628, 0.032796792685985565, 0.0467611625790596, -0.047588273882865906, -0.012779274955391884, 0.08015944808721542, -0.005861984565854073, -0.0885837972164154, 0.12557446956634521, -0.0...
-0.059347
# Issues when running Github actions: ``` ERROR: (gcloud.builds.submit) There was a problem refreshing your current auth tokens: ('Unable to acquire impersonated credentials', '{\n "error": {\n "code": 403,\n "message": "Permission \'iam.serviceAccounts.getAccessToken\' denied on resource (or it may not exist).",\n "st...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/doc/ISSUES.md
main
gcp-professional-services
[ -0.04364081099629402, -0.055701084434986115, -0.0009704979020170867, -0.05520404875278473, -0.0008477509836666286, -0.06763787567615509, -0.01994296908378601, -0.05367063730955124, 0.01691381260752678, 0.06770355999469757, 0.04032481089234352, -0.04619712755084038, 0.09405980259180069, -0....
-0.045234
# MLOps with Vertex AI - Infra setup ## Introduction This example implements the infrastructure required to deploy an end-to-end [MLOps process](https://services.google.com/fh/files/misc/practitioners\_guide\_to\_mlops\_whitepaper.pdf) using [Vertex AI](https://cloud.google.com/vertex-ai) platform. ## GCP resources A t...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/doc/01-ENVIRONMENTS.md
main
gcp-professional-services
[ -0.08988127112388611, -0.0596245601773262, 0.023187655955553055, -0.018637066707015038, 0.015264240093529224, -0.05208980292081833, 0.0018655509920790792, -0.037438735365867615, -0.08211781084537506, 0.05101047456264496, -0.05903083458542824, -0.07399693131446838, 0.042805932462215424, -0....
0.123442
should be an existing bucket that your user has access to. - Create a `terraform.tfvars` file and specify the required variables. You can use the `terraform.tfvars.sample` as a starting point ```tfm project\_create = { billing\_account\_id = "000000-123456-123456" parent = "folders/111111111111" } project\_id = "credi...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/doc/01-ENVIRONMENTS.md
main
gcp-professional-services
[ -0.050462231040000916, -0.012327136471867561, -0.020084917545318604, -0.03670801967382431, -0.06530917435884476, -0.03366313502192497, -0.00030076754046604037, -0.0036915920209139585, 0.053928717970848083, 0.1114123985171318, -0.005982538685202599, -0.15688617527484894, 0.05856730043888092, ...
-0.046471
# Reference KFP Pipeline We include here a reference KFP pipeline implementation, that follows best practices such as: \* Traceability of data by storing training, test and validation datasets as an intermediate artifact \* Splitting the input data into training, test and validation and giving the training step only ac...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/src/kfp_pipelines/README.md
main
gcp-professional-services
[ -0.017324209213256836, 0.011357964016497135, 0.0032667717896401882, -0.048247117549180984, -0.00709494249895215, -0.054821353405714035, -0.03558864817023277, 0.07732235640287399, -0.0543394535779953, 0.01605289988219738, -0.036286819726228714, -0.10796323418617249, 0.012987175025045872, 0....
-0.015676
that are not included anywhere, and these are to generate a [model card](https://medium.com/google-cloud/build-responsible-models-with-model-cards-and-vertex-ai-pipelines-8cbf451e7632) for our model. A model card, in this case, is an HTML document with any information that we may be required to provide about our model....
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_mlops_enterprise/src/kfp_pipelines/README.md
main
gcp-professional-services
[ -0.09728538244962692, 0.007066668476909399, 0.0030564237385988235, 0.01684119924902916, 0.024221068248152733, 0.05990701913833618, -0.0913613960146904, -0.003152090823277831, 0.034524958580732346, -0.05669841915369034, -0.02755780518054962, -0.04246125742793083, 0.0275136549025774, -0.0428...
0.152145
The Oracle DDL Migration Utility performs the following functions: 1. The script connects to the Oracle database through the oracle-python connector (oracledb). 2. The script uses the Oracle metadata table (all\_tab\_columns) to retrieve the table schema information. 3. The script produces the "create table" statement us...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-oracle-ddl-migration-utility/README.md
main
gcp-professional-services
[ -0.017839249223470688, -0.05622123181819916, -0.051566626876592636, -0.04157064110040665, 0.0427863746881485, -0.12432722002267838, -0.022787287831306458, 0.01276370044797659, -0.08106768876314163, 0.03496452420949936, 0.004162434954196215, -0.018440980464220047, -0.009242378175258636, -0....
-0.04225
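Step 3 above can be illustrated with a minimal sketch of turning `all_tab_columns`-style rows into a "create table" statement. The helper function and the sample rows here are illustrative assumptions, not the utility's actual code:

```python
def create_table_stmt(table_name, columns):
    """Render a CREATE TABLE statement from (column_name, data_type) pairs,
    as they would be fetched from Oracle's all_tab_columns view."""
    cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns)
    return f"CREATE TABLE {table_name} (\n  {cols}\n);"

rows = [("ID", "NUMBER"), ("NAME", "VARCHAR2(100)"), ("CREATED_AT", "DATE")]
print(create_table_stmt("EMPLOYEES", rows))
```

The resulting Oracle DDL is what the converter step later feeds to the BigQuery Migration API for translation into BigQuery DDL.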
The Oracle BQ Converter Script performs the following functions: 1. The script reads the Oracle DDL files from the specified GCS path (the output path of the oracle\_ddl\_extraction script). 2. The script calls the BigQuery Migration API, converts the DDL to BigQuery DDL, and places it in the specified GCS path. Below...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-oracle-ddl-migration-utility/DDL_Converter/Oracle BQ Converter.md
main
gcp-professional-services
[ -0.06528592109680176, -0.013530910946428776, -0.021373363211750984, -0.07523726671934128, 0.004832201171666384, -0.06497379392385483, -0.05414711683988571, 0.005332549102604389, -0.06089519336819649, 0.03424737975001335, -0.033388424664735794, -0.06236451491713524, 0.029764337465167046, -0...
-0.165603
The Oracle DDL Extraction Script performs the following functions: 1. The script connects to the Oracle database through the oracle-python connector (oracledb). 2. The script uses the Oracle metadata table (all\_tab\_columns) to retrieve the table schema information. 3. The script produces the "create table" statement using...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-oracle-ddl-migration-utility/DDL_Extractor/Oracle DDL Extraction.md
main
gcp-professional-services
[ -0.09113504737615585, -0.03392060846090317, -0.008647868409752846, -0.05136454850435257, 0.03023158758878708, -0.08497635275125504, 0.024006886407732964, -0.042526185512542725, -0.04162520915269852, 0.06353013962507248, 0.02470502257347107, -0.06636213511228561, 0.03966943547129631, -0.053...
-0.134654
The BQ Table Creator Script performs the following functions: 1. It reads the output SQL file created by the Oracle BQ Converter script. 2. The script creates the BigQuery tables in the specified target dataset. 3. The table structure will include source columns, metadata columns, and partitioning and clustering info. 4. The ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-oracle-ddl-migration-utility/BQ_Table_Creator/BQ Table Creator.md
main
gcp-professional-services
[ -0.012999078258872032, -0.039546508342027664, -0.03968287259340286, -0.032979566603899, -0.014427633956074715, -0.04609902948141098, -0.017934482544660568, -0.011147808283567429, -0.0926671102643013, 0.05081506446003914, -0.01124536618590355, -0.06824041157960892, 0.0416858084499836, -0.11...
-0.134633
# Using dbt and Cloud Composer for managing BigQuery example code dbt (data build tool) is a command-line tool that enables data analysts and engineers to transform data in their warehouses simply by writing select statements. Cloud Composer is a fully managed data workflow orchestration service that empowers you to...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dbt-on-cloud-composer/README.md
main
gcp-professional-services
[ -0.06989438831806183, -0.00676809623837471, -0.007109421771019697, -0.007800376508384943, -0.011187952943146229, -0.05422506481409073, -0.030221089720726013, 0.043222375214099884, -0.007292996626347303, 0.05826225131750107, -0.039084531366825104, -0.010539066977798939, 0.033385589718818665, ...
0.049119
dbt Here are the follow-up steps for running the code: 1. Push the code to the dbt-project repository and make sure the Cloud Build trigger fired and successfully created the docker image 2. In the Cloud Composer UI, run the DAG (e.g. dbt\_with\_kubernetes.py) 3. If successful, check the BigQuery console to verify the tables Wi...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dbt-on-cloud-composer/README.md
main
gcp-professional-services
[ 0.021345151588320732, -0.0024058856070041656, 0.003528670873492956, 0.008937790058553219, -0.005954978056252003, -0.04645317792892456, -0.036838140338659286, 0.04388270899653435, 0.02752566896378994, 0.037185780704021454, -0.059761423617601395, -0.07874153554439545, 0.055393580347299576, -...
-0.084817
create namespace NAMESPACE ``` 2) Create a Kubernetes service account for your application to use ```bash kubectl create serviceaccount KSA\_NAME \ --namespace NAMESPACE ``` 3) Assuming that the dbt-sa already exists and has the right permissions to trigger BigQuery jobs, the special binding has to be added to allow th...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dbt-on-cloud-composer/README.md
main
gcp-professional-services
[ -0.042858321219682693, -0.0217670239508152, -0.03229866176843643, -0.08455263078212738, -0.11534151434898376, 0.027678385376930237, 0.08395494520664215, -0.004057373385876417, 0.052521541714668274, 0.05940370261669159, -0.05386476591229439, -0.11769868433475494, 0.04122058302164078, -0.052...
0.00945
## Terraform template to set up the composer environment This terraform template serves as a starting point for the private VPC Composer setup. There are two versions of Cloud Composer configured. The recommended setup is Composer2, for its simplicity and its great autoscaling capacity. Create the `terraform.tf...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dbt-on-cloud-composer/terraform/README.md
main
gcp-professional-services
[ 0.08432813733816147, 0.008205846883356571, -0.05393150448799133, 0.021702270954847336, -0.059787627309560776, 0.040058668702840805, 0.0021569314412772655, 0.024139784276485443, 0.0007379815797321498, 0.09620439261198044, -0.041678037494421005, -0.11453983932733536, -0.008594082668423653, -...
-0.016169
# Kubeflow Fairing Examples `Kubeflow Fairing` is a Python package that streamlines the process of building, training, and deploying machine learning (ML) models in a hybrid cloud environment. By using Kubeflow Fairing and adding a few lines of code, you can run your ML training job locally or in the cloud, directly fr...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/kubeflow-fairing-example/README.md
main
gcp-professional-services
[ -0.06207532435655594, -0.09208277612924576, 0.04053217172622681, -0.006598112639039755, 0.03269166126847267, 0.002260026056319475, -0.04411839693784714, -0.039009109139442444, -0.027744632214307785, -0.03823411092162132, -0.06685033440589905, -0.09938183426856995, 0.024622896686196327, -0....
0.152639
tested on a notebook service outside the Kubeflow cluster as well, which means it could be - Notebook running on your personal computer - Notebook on AI Platform, Google Cloud Platform - Essentially a notebook in any environment outside the Kubeflow cluster For a notebook running inside the Kubeflow cluster, for example JupyterHub will be de...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/kubeflow-fairing-example/README.md
main
gcp-professional-services
[ -0.054857462644577026, -0.024033699184656143, 0.03798823058605194, -0.035935305058956146, 0.02279294654726982, -0.012824696488678455, -0.07502058148384094, -0.036429263651371, 0.016739225015044212, 0.023007487878203392, -0.048452455550432205, -0.07748236507177353, 0.0484725721180439, -0.06...
0.150617
# Overview The purpose of this walkthrough is to create [Custom Dataflow templates](https://cloud.google.com/dataflow/docs/concepts/dataflow-templates). The value of Custom Dataflow templates is that they allow us to execute Dataflow jobs without installing any code. This is useful to enable Dataflow execution using an ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/README.md
main
gcp-professional-services
[ -0.04191472381353378, -0.03653723746538162, 0.04869644716382027, -0.029895739629864693, -0.00638121273368597, 0.012534607201814651, -0.048360131680965424, -0.030033595860004425, -0.02815408445894718, 0.004279565531760454, -0.028871580958366394, -0.02724488265812397, 0.0056369188241660595, ...
-0.011619
proceed. ``` DIR=infrastructure/03.io terraform -chdir=$DIR init terraform -chdir=$DIR apply -var="project=$(gcloud config get-value project)" ``` ## 6. Provision the Dataflow template builder We will use [Cloud Build](https://cloud.google.com/build) to build the custom Dataflow template. There are advantages to using ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/README.md
main
gcp-professional-services
[ -0.04412246122956276, -0.006983994506299496, 0.07074005156755447, -0.04388427734375, -0.012317742221057415, -0.014292527921497822, -0.019434040412306786, -0.0442158505320549, 0.05457941070199013, 0.03408025950193405, -0.018803926184773445, -0.09236836433410645, 0.002496106084436178, -0.066...
-0.055138
terraform -chdir=$DIR destroy -var="project=$(gcloud config get-value project)" ```
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/README.md
main
gcp-professional-services
[ 0.027479924261569977, 0.1040320098400116, 0.050275739282369614, -0.029920386150479317, 0.03638402000069618, -0.06746193766593933, 0.07668347656726837, -0.029967887327075005, 0.07404603064060211, 0.06831680238246918, 0.0037351467180997133, -0.04878335818648338, 0.05719594657421112, -0.00476...
-0.064587
# Deploy the Custom Dataflow template by following these steps ## Overview The purpose of this walkthrough is to create [Custom Dataflow templates](https://cloud.google.com/dataflow/docs/concepts/dataflow-templates). The value of Custom Dataflow templates is that they allow us to execute Dataflow jobs without installing...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/cloud_shell_tutorial.md
main
gcp-professional-services
[ -0.08144344389438629, -0.036305949091911316, 0.057586416602134705, -0.06611857563257217, -0.06522766500711441, -0.010015307925641537, -0.02923930622637272, -0.026151133701205254, 0.003968065604567528, 0.03544865548610687, -0.0577029325067997, -0.056861571967601776, 0.006037034559994936, -0...
-0.057961
own GitHub organization or personal account In order to benefit from [Cloud Build](https://cloud.google.com/build), the service requires we own this repository; it will not work with any repository, even if it is public. Therefore, complete these steps before proceeding: 1) [Fork the repository](https://github.com/Go...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/cloud_shell_tutorial.md
main
gcp-professional-services
[ -0.04469761624932289, -0.05353666469454765, 0.06005609780550003, -0.061944689601659775, -0.08643040806055069, -0.02933746576309204, -0.007842032238841057, -0.03623452037572861, 0.0003859174612443894, 0.10235695540904999, -0.014092992059886456, -0.07794585824012756, 0.05349387973546982, -0....
-0.057416
# Overview This directory holds code to build an Apache Beam pipeline written in [python](https://www.python.org/downloads/). # Requirements - [Python version 3.6 or higher](https://www.python.org/downloads/) - [Pip version 7.0.0 or higher](https://pip.pypa.io/en/stable/installation/) # Usage ## Setup virtual environm...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/python/README.md
main
gcp-professional-services
[ 0.03334765508770943, -0.08154889941215515, -0.02244405634701252, -0.010039142332971096, -0.043090321123600006, -0.06064348295331001, -0.036696504801511765, -0.022633114829659462, -0.07221326977014542, -0.04605879262089729, -0.008789347484707832, -0.030783306807279587, -0.02091732807457447, ...
-0.013829
# Overview This directory holds code to build an Apache Beam pipeline written in [java](https://www.java.com/en/). # Requirements \*NOTE: installing gradle is not required\* - [Java version 11](https://www.oracle.com/java/technologies/downloads/#java11) # Usage ## Running Word Count Run the following command to execute...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/java/README.md
main
gcp-professional-services
[ 0.026076124981045723, -0.06694622337818146, 0.02137106843292713, -0.09526718407869339, -0.055630192160606384, -0.04347296059131622, 0.02079094760119915, -0.030675897374749184, -0.03552443906664848, -0.02726043201982975, -0.03596438094973564, -0.047415200620889664, 0.00811332929879427, -0.0...
-0.060239
# Overview This module is responsible for provisioning the builder that builds the custom Dataflow template. This module does not build the template itself but provisions the process, using [Cloud Build](https://cloud.google.com/build) that performs the build step. This module achieves the following: - Provision cloud ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-custom-templates/infrastructure/04.template/README.md
main
gcp-professional-services
[ -0.058699119836091995, -0.007691327948123217, 0.05586247891187668, -0.03133720904588699, -0.014267691411077976, -0.022031359374523163, -0.05025095120072365, -0.05741143971681595, 0.00900330115109682, 0.057839374989271164, -0.03762653097510338, -0.08257761597633362, 0.011214584112167358, -0...
-0.038015
# Deploying Redis Cluster on GKE These are sample K8s configuration files to deploy a Redis Cluster on GKE. ## Pre-requisites - Install Google Cloud SDK, kubectl, and git client. - Enable the Kubernetes Engine API. - Create or select a GCP project. Make sure that billing is enabled for your project. ## How to use 1. Pr...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/redis-cluster-gke/README.md
main
gcp-professional-services
[ -0.016301417723298073, -0.039653655141592026, 0.014919825829565525, -0.04066602885723114, -0.04989423602819443, -0.03609294816851616, -0.018404273316264153, -0.006249952595680952, 0.020398437976837158, 0.045646898448467255, -0.030815759673714638, -0.08700460940599442, 0.023384563624858856, ...
0.073307
# Entities creation and update for Dialogflow This module is an example of how to create and update entities for Dialogflow. ## Recommended Reading [Entities Options](https://cloud.google.com/dialogflow/docs/entities-options) ## Technology Stack 1. Cloud Storage 1. Cloud Functions 1. Dialogflow ## Programming Language Pyt...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dialogflow-entities-example/README.md
main
gcp-professional-services
[ -0.0638006404042244, -0.015411071479320526, -0.028114916756749153, -0.046763770282268524, -0.08683939278125763, -0.099791020154953, 0.036566998809576035, 0.01674061268568039, 0.04685910791158676, 0.023095015436410904, 0.016009755432605743, -0.07016101479530334, 0.018547648563981056, -0.039...
-0.020599
[ "@saving-account-types:saving-account-types" ] }, { "value": "@checking-account-types:checking-account-types", "synonyms": [ "@checking-account-types:checking-account-types" ] }, { "value": "@sys.date-period:date-period @saving-account-types:saving-account-types", "synonyms": [ "@sys.date-period:date-period @saving-a...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dialogflow-entities-example/README.md
main
gcp-professional-services
[ 0.05835264176130295, 0.04403286054730415, -0.050841957330703735, -0.0279708169400692, -0.03621083125472069, 0.02706311270594597, 0.05002199858427048, 0.005631620995700359, -0.050562649965286255, 0.09587211906909943, 0.04070791229605675, -0.07637813687324524, 0.019804785028100014, -0.010032...
0.008914
# Certificate Authority Service Demo This repository contains sample Terraform resource definitions for deploying several related Certificate Authority Service (CAS) resources to Google Cloud. It demonstrates several Certificate Authority Service features and provides examples of Terraform configuration for the fol...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/certificate-authority-service-hierarchy/README.md
main
gcp-professional-services
[ -0.03279026225209236, -0.004723486956208944, 0.06933802366256714, -0.03650869056582451, -0.048209547996520996, -0.056041419506073, -0.030021226033568382, -0.07436465471982956, 0.031350597739219666, 0.05635394901037216, 0.0028596443589776754, -0.11745044589042664, 0.096678726375103, 0.02746...
-0.050515
$CONCURRENCY $QPS $TIME ``` The test uses [Fortio](https://github.com/fortio/fortio) to call the CAS API concurrently over HTTPS to generate dummy certificates, simulating load on the CA Pool. The test will run for the time duration defined by the `TIME` environment variable. Check the outcome ``` 142.251.39.106:443: 3 172....
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/certificate-authority-service-hierarchy/README.md
main
gcp-professional-services
[ -0.05761158838868141, 0.06363371759653091, -0.05535636469721794, 0.028580661863088608, -0.06738027185201645, -0.10368165373802185, -0.07321157306432724, 0.028325246647000313, 0.09700652956962585, -0.00162770866882056, 0.025651654228568077, -0.11009708791971207, 0.049595318734645844, 0.0339...
0.111795
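Fortio's concurrency model — a fixed pool of workers issuing calls in parallel and tallying result codes — can be approximated in Python. This is a sketch of the pattern only: `issue_certificate` is a stand-in stub, not a real call to the CAS API.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def issue_certificate(i: int) -> int:
    # Stub standing in for an HTTPS certificate request to the CA Pool;
    # it always "succeeds" so the tally below is deterministic.
    return 200

def run_load(concurrency: int, total_calls: int) -> Counter:
    """Issue total_calls requests across `concurrency` workers and
    tally the response codes, Fortio-style."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        codes = pool.map(issue_certificate, range(total_calls))
    return Counter(codes)

print(run_load(concurrency=8, total_calls=120))  # Counter({200: 120})
```

Fortio adds QPS pacing and latency histograms on top of this same fan-out/tally structure.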
# Dataflow Streaming Benchmark When developing Dataflow pipelines, it's common to want to benchmark them at a specific QPS using fake or generated data. This pipeline takes a QPS parameter and a path to a schema file, and publishes fake JSON messages matching the schema to a Pub/Sub topic at the specified QPS. ## Pip...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-streaming-benchmark/README.md
main
gcp-professional-services
[ -0.08351849019527435, 0.01422634907066822, 0.0058512878604233265, -0.046281252056360245, -0.013957034796476364, -0.09048143029212952, -0.06588109582662582, -0.023686008527874947, -0.017340127378702164, -0.039174869656562805, -0.05788438767194748, -0.11416150629520416, 0.04117296263575554, ...
0.029314
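The core idea — generating fake JSON records that match a schema at a target QPS — can be sketched as follows. The field types, pacing loop, and names here are illustrative assumptions; the actual pipeline reads a schema file and publishes to Pub/Sub rather than returning a list.

```python
import json
import random
import string
import time

def fake_record(schema: dict) -> dict:
    """Produce one fake record whose fields match a {name: type} schema."""
    generators = {
        "string": lambda: "".join(random.choices(string.ascii_lowercase, k=8)),
        "int": lambda: random.randint(0, 1_000_000),
        "float": lambda: random.random(),
    }
    return {name: generators[ftype]() for name, ftype in schema.items()}

def generate(schema: dict, qps: int, seconds: int) -> list:
    """Emit qps * seconds JSON messages, sleeping to pace output at ~qps."""
    messages = []
    for _ in range(qps * seconds):
        messages.append(json.dumps(fake_record(schema)))
        time.sleep(1.0 / qps)  # naive pacing; a real pipeline paces via the runner
    return messages

msgs = generate({"user_id": "int", "event": "string"}, qps=50, seconds=1)
```

Each element of `msgs` is a JSON string whose keys match the schema, which is what the benchmark publishes as Pub/Sub message payloads.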
# GCE access to Google AdminSDK ## Introduction This package provides basic instructions and code snippets (in Python) to help users manage access to [Google's Admin SDK](https://developers.google.com/admin-sdk/) using GCE's [service account](https://cloud.google.com/compute/docs/access/create-enable-service-accounts-f...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gce-to-adminsdk/README.md
main
gcp-professional-services
[ -0.13123823702335358, 0.0052902307361364365, 0.02124406024813652, -0.06227664276957512, -0.05596054717898369, -0.0675131157040596, 0.0963236391544342, 0.005672913044691086, -0.006101807113736868, 0.0500892736017704, -0.0003553055867087096, -0.03882989287376404, 0.0684007853269577, -0.06065...
-0.024616
## TSOP Log Processor Currently, Transfer Service for On Prem writes transfer logs to GCS objects but not to Cloud Logging; it puts the logs in the same bucket to which it transfers objects. This Cloud Function reads the logs from the GCS objects and writes them to Cloud Logging. The Cloud Function is invoked via Pub/Sub as soon as logs are cr...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/tsop-log-processor/README.md
main
gcp-professional-services
[ -0.05311695486307144, -0.05043978616595268, 0.026208259165287018, 0.021045641973614693, 0.04996226727962494, -0.06487390398979187, 0.09064528346061707, -0.02251066267490387, 0.11056702584028244, 0.10948199033737183, -0.04449208453297615, -0.04606512188911438, 0.007729709148406982, 0.036113...
0.04591
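The function's transformation can be sketched as pure logic: given the GCS object notification that Pub/Sub delivers, build a structured entry suitable for Cloud Logging. The `bucket`, `name`, and `size` fields follow the GCS notification payload; the severity and log name here are illustrative assumptions, and the actual write to Cloud Logging is omitted.

```python
import json

def gcs_event_to_log_entry(message_data: str) -> dict:
    """Turn a GCS object notification (JSON string from a Pub/Sub message)
    into a structured log entry dict ready to be written to Cloud Logging."""
    event = json.loads(message_data)
    return {
        "severity": "INFO",                       # assumed fixed severity
        "logName": "transfer_service_on_prem",    # illustrative log name
        "resource": f"gs://{event['bucket']}/{event['name']}",
        "size_bytes": int(event.get("size", 0)),
    }

entry = gcs_event_to_log_entry(
    json.dumps({"bucket": "my-transfer-bucket",
                "name": "logs/transfer_log_001.txt",
                "size": "2048"})
)
print(entry["resource"])  # gs://my-transfer-bucket/logs/transfer_log_001.txt
```

The real function would additionally download the referenced object and write each log line via the Cloud Logging client.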
## Setup mTLS and TLS with GCP Application Load Balancer (EXTERNAL\_MANAGED) ### OVERVIEW: mTLS with GCP Application Load Balancer enhances security by requiring clients to authenticate themselves with certificates, in addition to the server authenticating itself to the client. This provides mutual authentication, ensu...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gclb-mtls-tls/README.md
main
gcp-professional-services
[ -0.11862635612487793, 0.01251271367073059, 0.07037681341171265, -0.018522251397371292, -0.10907220095396042, -0.01646329276263714, 0.021875478327274323, 0.0073002479039132595, 0.065305694937706, -0.023387836292386055, -0.07729823142290115, -0.08240620046854019, 0.1160939559340477, -0.02215...
-0.029467
The mTLS connection response headers above are populated automatically by the load balancer frontend and sent to the backend server after a successful mTLS connection is established.
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gclb-mtls-tls/README.md
main
gcp-professional-services
[ -0.11627938598394394, -0.02424107864499092, 0.000374761555576697, 0.09820530563592911, -0.052882954478263855, -0.08927354216575623, -0.012632980942726135, -0.03414236009120941, 0.10345735400915146, 0.0028190487064421177, -0.0661248043179512, 0.011408803053200245, 0.07956987619400024, -0.07...
0.014155
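A backend can use those load-balancer-populated headers to act on the mTLS result. A minimal sketch, assuming the backend service was configured to forward a custom header such as `X-Client-Cert-Present` — the header name is whatever you configure on the backend service, not a fixed value:

```python
def client_authenticated(headers: dict) -> bool:
    """Return True when the LB-populated header indicates the client
    presented a certificate during the mTLS handshake."""
    # Normalize header names for a case-insensitive lookup.
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("x-client-cert-present", "false") == "true"

print(client_authenticated({"X-Client-Cert-Present": "true"}))  # True
print(client_authenticated({}))                                 # False
```

Because the load balancer strips client-supplied copies of configured custom headers, the backend can trust the value without re-validating the certificate itself.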
# BigQuery to Spanner using Mutations ## Why create this repo? \* When we need to move data from BigQuery to Spanner using Apache Spark in Scala \* \*\*Use of Spanner Mutations instead of SQL\*\* \* Spanner has a limit on mutations per transaction, which particularly affects its efficiency in handling UPSERTS using [Mu...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-spanner/mutations/README.md
main
gcp-professional-services
[ -0.06991896033287048, -0.010546035133302212, -0.0054233623668551445, 0.012108157388865948, -0.0212083850055933, -0.08208756893873215, 0.0087061058729887, 0.002711963141337037, -0.05145695060491562, 0.0077806939370930195, 0.004920855164527893, 0.016984935849905014, -0.003013425273820758, -0...
-0.028572
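Because Spanner caps the number of mutations per commit — and each written cell counts toward the cap — rows must be grouped into batches that stay under it. A sketch of that batching logic (the limit is passed as a parameter here; check Spanner's quotas documentation for the current value, and note the names are illustrative):

```python
from typing import Iterable, Iterator, List

def batch_rows(rows: Iterable[dict], columns_per_row: int,
               mutation_limit: int) -> Iterator[List[dict]]:
    """Yield batches of rows such that rows_in_batch * columns_per_row
    never exceeds the per-commit mutation limit (each written cell counts)."""
    max_rows = max(1, mutation_limit // columns_per_row)
    batch: List[dict] = []
    for row in rows:
        batch.append(row)
        if len(batch) == max_rows:
            yield batch
            batch = []
    if batch:
        yield batch

batches = list(batch_rows(({"id": i} for i in range(25)),
                          columns_per_row=10, mutation_limit=100))
print([len(b) for b in batches])  # [10, 10, 5]
```

The Scala job applies the same arithmetic before committing each group of `InsertOrUpdate` mutations.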
spark --cluster ${CLUSTER\_NAME} \ --region=us-central1 \ --jar=${GCS\_BUCKET\_JARS}/${APP\_JAR\_NAME} \ --jars=${GCS\_BUCKET\_JARS}/google-cloud-spanner-6.45.1.jar,gs://test-dataproc-spanner/jars/google-cloud-spanner-jdbc-2.17.1-single-jar-with-dependencies.jar \ -- ${PROJECT\_ID} ${BQ\_DATASET} ${BQ\_TABLE} ${SPANNER...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-spanner/mutations/README.md
main
gcp-professional-services
[ -0.013493802398443222, -0.043487802147865295, 0.06602749973535538, -0.017801232635974884, 0.05547402426600456, -0.006458809599280357, 0.021303605288267136, -0.024028949439525604, -0.01415180042386055, 0.01691100373864174, 0.02408379316329956, -0.14184659719467163, 0.047025155276060104, -0....
-0.131802
## BigQuery to Spanner using Apache Spark in Scala ### Why create this repo? \* Google Cloud customers need to move data from BigQuery to Spanner using Apache Spark in Scala \* Dataproc [templates](https://github.com/GoogleCloudPlatform/dataproc-templates) cannot help here, as they are written in Python & Java \* The Apache...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-spanner/jdbc/README.md
main
gcp-professional-services
[ -0.04336041212081909, -0.04500246420502663, -0.01597699336707592, 0.010599535889923573, -0.08215165138244629, -0.036738209426403046, -0.030760277062654495, -0.007857142016291618, -0.0793009027838707, 0.00947425328195095, -0.007625633385032415, 0.022159012034535408, -0.006264841184020042, -...
-0.127085
list core/project --format="value(core.project)"``` |
| BQ\_DATASET | The BigQuery dataset with source data | my\_bq\_dataset |
| BQ\_TABLE | The BigQuery table with source data | my\_bq\_table |
| BQ\_TABLE\_PK | Primary key column name of the BigQuery table | my\_pk\_column\_name |
| SPANNER\_PROJECT\_ID | The ID of ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-spanner/jdbc/README.md
main
gcp-professional-services
[ 0.032204821705818176, 0.01919301226735115, -0.010703576728701591, 0.02663518115878105, 0.03864442557096481, -0.02708398923277855, 0.050223466008901596, -0.009552341885864735, 0.005400522146373987, 0.039867907762527466, 0.005622324999421835, -0.11592075973749161, 0.08689533919095993, -0.128...
-0.118677
# Webhook example This module is a webhook example for Dialogflow. An agent created in Dialogflow is connected to this webhook, which runs in a Cloud Function. The webhook also connects to Cloud Firestore to get the users' information used in the example. ## Recommended Reading [Dialogflow Fulfillment Overview](http...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dialogflow-webhook-example/README.md
main
gcp-professional-services
[ -0.0505196787416935, 0.029314668849110603, -0.08960781246423721, -0.054108865559101105, -0.0643438845872879, -0.07909462600946426, 0.04063906520605087, 0.007488461211323738, 0.023361576721072197, -0.041096486151218414, -0.019265610724687576, -0.048340942710638046, 0.028057200834155083, -0....
0.040574
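The webhook's contract is simple JSON in, JSON out. A minimal sketch of parsing a fulfillment request and returning a `fulfillmentText` response — the `queryResult`/`parameters`/`fulfillmentText` fields follow the Dialogflow ES fulfillment format, while the in-memory dict stands in for the Firestore lookup:

```python
# Stand-in for the Firestore users collection (illustrative data).
USERS = {"alice": {"balance": 42}}

def handle_webhook(request_json: dict) -> dict:
    """Build a Dialogflow ES fulfillment response for an incoming request."""
    params = request_json["queryResult"]["parameters"]
    user = USERS.get(params.get("username", ""), {})
    text = f"Your balance is {user.get('balance', 'unknown')}."
    return {"fulfillmentText": text}

resp = handle_webhook({"queryResult": {"parameters": {"username": "alice"}}})
print(resp["fulfillmentText"])  # Your balance is 42.
```

In the Cloud Function version, the same handler body runs behind an HTTP entry point and queries Firestore instead of the dict.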
a nice day! ### Running the sample from Dialogflow console In [Dialogflow's console](https://console.dialogflow.com), in the simulator on the right, query your Dialogflow agent with `I need assistance` and respond to the questions your Dialogflow agent asks. ### Running the sample using gcloud util Example: $ gcloud fu...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dialogflow-webhook-example/README.md
main
gcp-professional-services
[ -0.040326062589883804, 0.03339602053165436, 0.0057946802116930485, -0.028640301898121834, -0.050275418907403946, -0.06784936785697937, 0.05682097375392914, -0.005461480468511581, 0.06192215904593468, -0.044359199702739716, -0.05924602225422859, -0.10904912650585175, 0.0007645674631930888, ...
-0.007935
# Multi-regional Application Availability This demo project contains Google Cloud infrastructure components that illustrate use cases for enhancing the availability of Cloud Run or Google Compute Engine Managed Instance Group based applications. The application instances are redundantly deployed to two distinct regions. ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-load-balancing/README.md
main
gcp-professional-services
[ -0.07158535718917847, -0.039440352469682693, 0.04605768993496895, -0.06801900267601013, -0.034978628158569336, -0.04928114637732506, -0.04870926961302757, -0.06493991613388062, -0.019960926845669746, -0.000912768766283989, -0.03557746484875679, -0.05795455724000931, 0.03739691153168678, -0...
0.068474
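The failover idea behind DNS-based routing — serve from the preferred region while it is healthy, fall back to the next one otherwise — can be sketched as follows (region names and health inputs are illustrative, not tied to the demo's actual configuration):

```python
def pick_region(health: dict, preference: list) -> str:
    """Return the first healthy region in preference order,
    mimicking a DNS failover routing policy."""
    for region in preference:
        if health.get(region, False):
            return region
    raise RuntimeError("no healthy region available")

print(pick_region({"europe-west3": False, "europe-west4": True},
                  ["europe-west3", "europe-west4"]))  # europe-west4
```

Cloud DNS routing policies make this decision at resolution time, driven by health checks against each regional load balancer.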
and running load tests. ## GCE Managed Instance Groups 1. Check out the demo HTTP responder service container ``` git clone https://github.com/GoogleCloudPlatform/golang-samples.git cd golang-samples/run/hello-broken ``` 2. Build the container, tag it, and push it to Artifact Registry ``` docker build . -t eu.gcr.io/${PROJECT\_...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-load-balancing/README.md
main
gcp-professional-services
[ -0.07964427024126053, 0.000133897818159312, 0.06563036143779755, -0.039826881140470505, -0.0462997741997242, -0.0989154726266861, -0.020623991265892982, -0.056524407118558884, -0.03125838562846184, 0.011851820163428783, -0.01259892713278532, -0.12075185775756836, 0.022034667432308197, -0.0...
-0.036374
the end of the execution should look like the following ``` IP addresses distribution: 10.156.0.11:8080: 16 10.199.0.48:8080: 4 Code -1 : 12 (10.0 %) Code 200 : 108 (90.0 %) Response Header Sizes : count 258 avg 390 +/- 0 min 390 max 390 sum 100620 Response Body/Total Sizes : count 258 avg 7759.624 +/- 1.497 min 7758 max 7...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-load-balancing/README.md
main
gcp-professional-services
[ -0.030446624383330345, -0.024030013009905815, 0.0458671897649765, -0.024256331846117973, -0.0986022800207138, -0.04307815432548523, -0.04442184418439865, -0.10023613274097443, 0.027214577421545982, 0.09638401865959167, -0.01456498447805643, -0.026758018881082535, -0.0014435031916946173, -0...
-0.027313