| Column | Type | Range / cardinality |
| :--- | :--- | :--- |
| content | large_string | lengths 3 – 20.5k |
| url | large_string | lengths 54 – 193 |
| branch | large_string | 4 distinct values |
| source | large_string | 42 distinct values |
| embeddings | list | length 384 (fixed) |
| score | float64 | -0.21 – 0.65 |
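Each row below pairs a README excerpt with a fixed-length 384-dimensional embedding and a `score`. The dump does not state how the score is computed; as a minimal sketch, assuming the common approach of cosine similarity between a query embedding and a row's embedding (the vectors and function here are illustrative, not taken from the dataset):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity: dot product divided by the product of the norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional stand-ins for the 384-dimensional vectors in the rows below.
query = [0.1, -0.2, 0.3, 0.4]
row_embedding = [0.1, -0.2, 0.3, 0.4]

score = cosine_similarity(query, row_embedding)  # identical vectors -> 1.0
```

Cosine similarity ranges over [-1, 1], which is consistent with the observed score range of -0.21 to 0.65.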
**Table of Contents** - [Template Bash Script Module](#template-bash-script-module) - [Example Usage](#example-usage) - [Requirements](#requirements) - [Providers](#providers) - [Inputs](#inputs) - [Outputs](#outputs) # Template Bash Script Module This module is responsible for handling replacement of variables in ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/kerberized_data_lake/modules/template_bash_script/README.md
main
gcp-professional-services
[ -0.05084969103336334, 0.05487830191850662, 0.0339139848947525, -0.00783561822026968, -0.037299249321222305, 0.025701018050312996, 0.05329537019133568, -0.012380395084619522, 0.012363706715404987, 0.012841765768826008, -0.013863049447536469, -0.0772324800491333, 0.08362186700105667, 0.00476...
0.072463
# Terraform Google Cloud IAM Deny and Organization Policies This Terraform configuration demonstrates how to implement a series of security guardrails within a Google Cloud organization. It leverages IAM Deny Policies and Organization Policies to restrict high-privilege permissions and enforce organizational standards....
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iam-deny/README.md
main
gcp-professional-services
[ -0.04859980195760727, 0.03664405643939972, 0.02341904677450657, -0.028264053165912628, 0.024461714550852776, -0.012463713996112347, 0.0838848277926445, -0.15006333589553833, -0.036581769585609436, 0.11174927651882172, 0.024511851370334625, -0.06527402251958847, 0.0920400321483612, 0.023686...
0.069628
directory of this example. ## Installation and Deployment 1. **Clone Repository:** If you haven't already, clone the repository containing this example to your local machine. ```bash # Example: Replace with the actual repository URL git clone https://github.com/kevinschmidtG/professional-services.git cd examples/ia...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iam-deny/README.md
main
gcp-professional-services
[ -0.04445449262857437, 0.0284831915050745, 0.045584242790937424, -0.0379442423582077, -0.03859224170446396, -0.06987234950065613, 0.030001239851117134, -0.08872229605913162, 0.015287177637219429, 0.14023272693157196, 0.03859758377075195, -0.07408089935779572, 0.0848049595952034, -0.01285087...
0.066058
| List of principals exempt from the billing deny rule. | `list(string)` | `[]` | No | | `sec_exception_principals` | List of principals exempt from the security deny rule. | `list(string)` | `[]` | No | | `top_exception_principals` | List of principals exempt from the organization-level deny policy. | `list(string...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iam-deny/README.md
main
gcp-professional-services
[ -0.014204991981387138, 0.030781811103224754, -0.0020760183688253164, -0.06300541013479233, 0.004020210355520248, -0.02706149034202099, 0.041673269122838974, -0.09972638636827469, 0.006363718770444393, 0.10127794742584229, 0.06562840938568115, -0.09774420410394669, 0.07348749786615372, -0.0...
0.005361
# Data Format Description Language ([DFDL](https://en.wikipedia.org/wiki/Data_Format_Description_Language)) Processor Example This module is an example of how to process a binary file using a DFDL definition. The DFDL definitions are stored in a Bigtable database. The application sends a request with the binary to process to ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dfdl-bigtable-pubsub-example/README.md
main
gcp-professional-services
[ 0.012360905297100544, -0.04976893588900566, -0.12527287006378174, -0.03629706799983978, 0.0031816211994737387, -0.09690380841493607, -0.006935793906450272, 0.055016666650772095, -0.03169453516602516, 0.021419795230031013, -0.02213211916387081, -0.001638780813664198, 0.023638805374503136, -...
0.102147
needed to run this example. #### Topics To run this example, two topics need to be created: 1. A topic to publish the final JSON output: "data-output-json-topic" 2. A topic to publish the binary to be processed: "data-input-binary-topic" #### Subscription The following subscriptions need to be created: 1. A subscription...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dfdl-bigtable-pubsub-example/README.md
main
gcp-professional-services
[ 0.04453466087579727, -0.08520792424678802, -0.05044285207986832, -0.023704994469881058, -0.009525139816105366, -0.00843069888651371, -0.07765217125415802, -0.03212578222155571, -0.06597617268562317, 0.047808315604925156, 0.012759263627231121, -0.05356425791978836, 0.14107422530651093, 0.02...
-0.047077
# A 'state-scalable' project factory pattern with Terragrunt ## Overview Resolves the problem of state-volume explosion with a project factory. Terragrunt helps with that by: 1. Providing a dynamic way to configure [remote_state](https://terragrunt.gruntwork.io/docs/features/keep-your-remote-state-configuration-dry/#kee...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terragrunt-project-factory-gcp/README.md
main
gcp-professional-services
[ -0.07403895258903503, 0.013360635377466679, 0.026893801987171173, 0.044525470584630966, 0.004432467743754387, -0.015155253931879997, -0.07697955518960953, -0.034305743873119354, 0.0054654330015182495, 0.08802139014005661, -0.0031624718103557825, -0.0627087727189064, 0.0384712889790535, 0.0...
0.020813
-> terragrunt.hcl terraform { # Pull the terraform configuration from the local file system. Terragrunt will make a copy of the source folder in the # Terragrunt working directory (typically `.terragrunt-cache`). source = "../..//src" # Files to include from the Terraform src directory include_in_copy = [ "main.tf", ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terragrunt-project-factory-gcp/README.md
main
gcp-professional-services
[ -0.04787224158644676, -0.0268353633582592, -0.0382893942296505, 0.014518744312226772, -0.02903561294078827, 0.010241031646728516, -0.031594663858413696, -0.027104830369353294, 0.033235207200050354, 0.03931214287877083, 0.017941411584615707, -0.0755644142627716, 0.053129225969314575, -0.035...
0.034318
Copyright 2023 Google LLC Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed un...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/terragrunt-project-factory-gcp/src/README.md
main
gcp-professional-services
[ -0.07404069602489471, -0.05078043043613434, -0.03379674255847931, -0.07616318762302399, 0.017160683870315552, 0.0031281046103686094, 0.00703018344938755, -0.08708835393190384, -0.004665897693485022, -0.01652432233095169, 0.08337689936161041, 0.006790664047002792, 0.030634446069598198, -0.0...
-0.002184
## DAG-to-DAG dependency between multiple Composer clusters Dependencies between Airflow DAGs might be inevitable. Within a Composer instance, one can define a DAG-to-DAG dependency easily using operators like TriggerDagRunOperator and ExternalTaskSensor, for example, when you have a task in one DAG that triggers anoth...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/multicluster-dag-dependencies/Readme.md
main
gcp-professional-services
[ -0.04465825855731964, -0.05347662419080734, -0.0347299724817276, 0.041934262961149216, 0.014412721619009972, -0.0756768062710762, -0.010312208905816078, -0.02552253007888794, 0.02285802736878395, -0.021036198362708092, -0.0894559919834137, -0.0781617984175682, -0.02533886581659317, -0.0469...
0.087461
{ "environment_name": "composer-env-1", "dag_id": "dag_id_3_downstream" } ] } } ``` Each upstream DAG has an object in a GCS bucket. The mapping from environment name to Airflow web UI URL is stored in a configuration file (`composer-url-mapping-{env}.json`). Make sure to replace the `project_id` and `url` in the m...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/multicluster-dag-dependencies/Readme.md
main
gcp-professional-services
[ -0.05378493294119835, 0.04339195042848587, 0.02241607755422592, 0.02523002214729786, 0.07127035409212112, -0.07133161276578903, 0.0005312069552019238, 0.027467038482427597, 0.026148607954382896, -0.04924639314413071, -0.017966775223612785, -0.09397818148136139, -0.028263350948691368, -0.02...
0.013487
# Dataproc Persistent History Server This repo houses the example code for a blog post on using a persistent history server to view job history about your Spark / MapReduce jobs and aggregated YARN logs from short-lived clusters on GCS. ![Architecture Diagram](img/persistent-history-arch.png) ## Directory structure - `...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-persistent-history-server/README.md
main
gcp-professional-services
[ -0.060543447732925415, -0.014602607116103172, -0.05287950485944748, -0.004183890298008919, 0.053935062140226364, -0.01414225809276104, -0.08094336092472076, -0.03033611550927162, -0.00044959643855690956, 0.03530280664563179, -0.017765654250979424, 0.022325556725263596, -0.015434926375746727,...
0.018951
Compute zone. ``` cd workflow_templates sed -i 's/PROJECT/your-gcp-project-id/g' * sed -i 's/HISTORY_BUCKET/your-history-bucket/g' * sed -i 's/HISTORY_SERVER/your-history-server/g' * sed -i 's/REGION/us-central1/g' * sed -i 's/ZONE/us-central1-f/g' * sed -i 's/SUBNET/your-subnet-id/g' * cd cluster_templates s...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-persistent-history-server/README.md
main
gcp-professional-services
[ 0.034247953444719315, 0.03739048168063164, 0.07149995118379593, -0.0458102710545063, -0.0009370531188324094, -0.053353942930698395, -0.024307919666171074, -0.04015154391527176, -0.04510929808020592, 0.041893698275089264, -0.018971173092722893, -0.06740175932645798, -0.024519111961126328, -...
0.005515
# BigQuery Benchmark Repos Customers new to BigQuery often have questions on how best to utilize the platform with regard to performance. For example, a common question which has routinely resurfaced in this area is the performance of file loads into BigQuery, specifically the optimal file parameters (file type, # col...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ -0.020089298486709595, -0.007720554247498512, 0.005238876212388277, 0.06062912195920944, 0.031119506806135178, -0.07674720138311386, -0.02647274360060692, 0.02847987972199917, 0.014794224873185158, 0.026248814538121223, -0.07103781402111053, 0.05426234379410744, -0.0032395042944699526, -0....
-0.008082
will include the type of table, type of query, and the table properties. **Table Type**: * `BQ_MANAGED`: Tables located within and managed by BigQuery. * `EXTERNAL`: Data located in GCS files, which are used to create a temporary external table for querying. **Query Type**: * `SIMPLE_SELECT_*`: Select al...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ -0.043070998042821884, -0.0052539692260324955, -0.07799401879310608, 0.0719095766544342, -0.008506039157509804, -0.048313576728105545, 0.004966873675584793, 0.03401793912053108, -0.003282679244875908, 0.02931888960301876, -0.02384190633893013, 0.0016386661445721984, 0.024030843749642372, -...
0.095245
for both the File Loader Benchmark and the Federated Query Benchmark. They can be configured in the `FILE_PARAMETERS` dictionary in [`generic_benchmark_tools/file_parameters.py`](generic_benchmark_tools/file_parameters.py). Currently, no file parameters can be added to the dictionary, as this will cause errors. ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ -0.06813982129096985, -0.03556568920612335, -0.10608727484941483, 0.03934977576136589, -0.043334152549505234, -0.10075978934764862, -0.007156546227633953, 0.07180619239807129, -0.04419495165348053, 0.022410530596971512, 0.0023964405991137028, 0.025911973789334297, 0.07672043889760971, -0.0...
0.07198
in the resized staging dataset: `100_STRING_10_10MB`, `100_STRING_10_100MB`, `100_STRING_10_1GB`, `100_STRING_10_2GB`. To run the process of creating staging and resized staging tables, run the following command: ``` python bq_benchmark.py \ --create_staging_tables \ --bq_project_id= \ --staging_datas...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ 0.04320140928030014, -0.013916810043156147, -0.09429189562797546, 0.033848315477371216, -0.050914932042360306, -0.1033668965101242, 0.004430559463799, 0.029291924089193344, -0.06591779738664627, 0.07219577580690384, -0.006552266422659159, -0.10039037466049194, 0.06575767695903778, -0.03378...
-0.015227
combination of files, but where numFiles > 1. More specifically, it is copied 100 times for `numFiles`=100, 1000 times for `numFiles`=1000, and 10000 times for `numFiles`=10000. Copying is much faster than extracting each table tens of thousands of times. As an example, the files listed above are copied to create the f...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ 0.0010166208958253264, -0.02606360986828804, -0.08089925348758698, -0.012892769649624825, 0.10579729080200195, -0.06259038299322128, 0.0823257565498352, 0.014145797118544579, -0.01729011908173561, 0.10119561105966568, 0.04871310293674469, -0.012512377463281155, 0.006223912350833416, -0.027...
0.097537
Before the tables are deleted, the tables and their respective files can be used to run the Federated Query Benchmark. If running the two benchmarks independently, each file will be used to create a BigQuery table two different times. Running the two benchmarks at the same time can save time if results for both benchma...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ -0.01581253670156002, -0.029254266992211342, -0.06045500189065933, 0.0743413120508194, -0.017599502578377724, -0.08170375972986221, -0.053248051553964615, 0.015660623088479042, -0.03441479057073593, 0.0048593697138130665, -0.035910818725824356, -0.008129366673529148, 0.054232675582170486, ...
-0.037073
were loaded from before the tables are deleted. If results for both benchmarks are desired, this will save time when compared to running each benchmark independently, since the same tables needed for the File Loader Benchmark are needed for the Federated Query Benchmark. It has a value of `store_true`, so this flag wi...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ 0.005374578293412924, -0.034911513328552246, 0.006795056164264679, 0.06289409101009369, 0.014829754829406738, -0.08813003450632095, -0.0770474374294281, -0.02063675969839096, 0.009480469860136509, 0.03918420523405075, -0.028295699506998062, 0.013426416553556919, 0.012615724466741085, -0.07...
-0.088832
reason should be stored in a different bucket. `--results_table_name`: Name of the results table to hold relevant information about the benchmark loads. `--results_dataset_id`: Name of the dataset that holds the results table. `--bq_logs_dataset`: Name of the dataset holding the BQ logs table. This dataset must be in projec...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bq_benchmarks/README.md
main
gcp-professional-services
[ 0.012714519165456295, 0.013587156310677528, -0.07455228269100189, 0.043907057493925095, -0.005314864683896303, -0.08432666212320328, 0.03252997249364853, 0.0858331024646759, -0.018633006140589714, 0.022855184972286224, -0.022330956533551216, -0.03044900670647621, 0.029993783682584763, -0.0...
-0.009861
# BQ Translation Validator A utility to compare two SQL files and point out basic differences such as column names, table names, joins, and function names. ## Business Requirements 1. Validation of the SQL by comparing both the legacy source SQL and the BigQuery SQL 2. Ability to quickly identify any translation errors, right at th...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-translation-validator-utility/README.md
main
gcp-professional-services
[ 0.00970538891851902, -0.021947743371129036, -0.07663177698850632, -0.021618343889713287, 0.03408708795905113, -0.021361157298088074, 0.007267324719578028, 0.011733774095773697, -0.05807406082749367, -0.042139384895563126, -0.03362105414271355, -0.08157283067703247, 0.03878678381443024, -0....
-0.00434
in `./config/functions.csv`. 3. Run the utility: `python3 main.py -input-path=path/to/input/folder -output-path=path/to/output/folder` 4. Check the result in `validation-translation.xlsx` and the `log_files` folder. ### Run with Test Files in GCS The following packages are needed to run the script: pandas, sqlparse, XlsxWriter, googl...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-translation-validator-utility/README.md
main
gcp-professional-services
[ -0.020139489322900772, -0.01901869848370552, -0.07084108889102936, -0.01274718064814806, 0.010719461366534233, -0.03140147775411606, -0.03516259789466858, 0.03963673859834671, -0.008781437762081623, 0.01748395897448063, 0.0044944207184016705, -0.06971118599176407, 0.06423118710517883, -0.0...
-0.118693
# gRPC Example This example creates a gRPC server that connects to Spanner to find the name of a user for a given user ID. ## Application Project Structure ``` . └── grpc_example └── src └── main ├── java └── com.example.grpc ├── client └── ConnectClient # Example Client ├── server └── ConnectServer # Initializes gRPC Se...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/grpc_spanner_example/README.md
main
gcp-professional-services
[ -0.06944283097982407, -0.060689814388751984, 0.0003875581605825573, -0.08821330964565277, -0.059011414647102356, -0.060797084122896194, 0.03468708693981171, 0.003771950025111437, 0.03156076371669769, 0.024744758382439613, 0.019102025777101517, -0.032549697905778885, 0.053798336535692215, 0...
-0.085035
# Home Appliances’ Working Status Monitoring Using Gross Power Readings The popularization of IoT devices and the evolution of machine learning technologies have brought tremendous opportunities for new businesses. We demonstrate how home appliances’ (e.g. kettle and washing machine) working status (on/off) can be inf...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/e2e-home-appliance-status-monitoring/README.md
main
gcp-professional-services
[ -0.0944940447807312, -0.0019037446472793818, 0.0646829605102539, 0.041322872042655945, 0.05647227168083191, -0.07432164251804352, 0.0016077918699011207, -0.053533073514699936, -0.0902729481458664, -0.003425990929827094, -0.03773060813546181, -0.06798531860113144, 0.032450202852487564, -0.0...
0.104896
>> app.yaml echo " GOOGLE_CLOUD_PROJECT: '${GOOGLE_PROJECT_ID}'" >> app.yaml # deploy application engine, choose any region that suits and answer yes at the end. gcloud --project ${GOOGLE_PROJECT_ID} app deploy # create a pubsub topic "data" and a subscription in the topic. # this is the pubsub between IoT device...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/e2e-home-appliance-status-monitoring/README.md
main
gcp-professional-services
[ 0.020808398723602295, -0.02992648258805275, 0.03211081400513649, -0.05570976436138153, -0.007403283845633268, -0.05652284994721413, 0.032734464854002, -0.0397331528365612, 0.06502101570367813, 0.04118749871850014, -0.0009812549687922, -0.04605954512953758, 0.08222011476755142, -0.031172212...
-0.068602
# Workload Identity Federation This repository provides an example of creating a Workload Identity Federation (WIF) component that can be used for authenticating to Google Cloud from a GitLab CI/CD job using a JSON Web Token (JWT). This configuration generates on-demand, short-lived credentials without needing...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/workload_identity_federation/README.md
main
gcp-professional-services
[ -0.11949349194765091, -0.019702039659023285, 0.009459205903112888, -0.03489626944065094, -0.04792318865656853, -0.05099998787045479, 0.03589563071727753, -0.03763147071003914, 0.0003804391890298575, 0.04270361363887787, -0.012701510451734066, -0.07115140557289124, 0.08843720704317093, 0.00...
0.016856
## Ingesting CCAI-Insights Data to BigQuery using Cloud Composer [Contact Center AI (CCAI) Insights](https://cloud.google.com/solutions/ccai-insights?hl=en) can be used to generate insights from voice and chat conversations. This repo will walk you through how to leverage this tool to analyze call conversations and gen...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-insight-export-composer/Readme.md
main
gcp-professional-services
[ -0.0085145877674222, -0.04249752312898636, -0.012784269638359547, 0.043318383395671844, 0.04145422205328941, 0.025030996650457382, 0.03847040981054306, 0.013872814364731312, 0.06475422531366348, -0.04115784540772438, -0.0875956118106842, -0.01130184531211853, -0.026423614472150803, -0.0241...
0.116701
3:** Create a JSON variable for the ccai-insight configuration by navigating to Admin > Variables in the Airflow UI. Below is the screenshot. ![CCAI-Insights variable in Airflow](images/airflow_ccai_config.jpg) You can also add the variable using the gcloud command line. ``` gcloud composer environments run $COMPOSER_ENVIRONE...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-insight-export-composer/Readme.md
main
gcp-professional-services
[ -0.015901288017630577, -0.018411939963698387, -0.08233723789453506, 0.06773845106363297, -0.01794077642261982, -0.018255259841680527, 0.045491646975278854, 0.0004367089713923633, 0.03866393491625786, -0.02511473186314106, 0.05296030640602112, -0.10727386921644211, 0.020761433988809586, -0....
0.02305
# Conversational Hotel Booking Agent The Conversational Hotel Booking Agent is a custom Generative AI (GenAI) agent specialized in handling hotel bookings through a conversational user experience (UX). This repository contains the agent's source code and deployment instructions. The integrated solution features: * **...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/README.md
main
gcp-professional-services
[ -0.056359026581048965, -0.02494218572974205, -0.027490733191370964, 0.005856571719050407, -0.04878558591008186, -0.02885783091187477, 0.059308011084795, -0.05537048727273941, 0.022881586104631424, -0.015935329720377922, -0.020177800208330154, -0.013987678103148937, 0.03771615028381348, -0....
0.14418
table. ![BigQuery table with Hotel's data after interaction](/examples/mcptoolbox-bq-claude-slack-agent/images/bq2.png) This example demonstrates the power of a conversational UX implemented through a specialized Agent, MCP, and a well-defined toolset to accomplish complex tasks within a specific domain. The example da...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/README.md
main
gcp-professional-services
[ -0.006576135754585266, 0.002007972914725542, -0.014027873985469341, 0.025273997336626053, -0.03949849307537079, -0.05754280090332031, 0.02367190271615982, -0.007000506855547428, -0.02443385124206543, -0.0015834004152566195, -0.041253164410591125, -0.027972126379609108, 0.0560932382941246, ...
0.087357
Token; this token is typically handled by the Spring Boot Slack SDK and doesn't usually require manual configuration in this project's default setup. 4. **Configure Bot Token Scopes**: * Return to "OAuth & Permissions". * Scroll to "Scopes" and under "Bot Token Scopes", add the following: * `app_mentions:read`:...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/README.md
main
gcp-professional-services
[ -0.07777942717075348, -0.09438909590244293, 0.0442100428044796, -0.004430846311151981, 0.01539526041597128, -0.009760498069226742, 0.009826716035604477, 0.05843294411897659, 0.008018415421247482, 0.06140945479273796, 0.010204369202256203, -0.06015680357813835, 0.03129272535443306, 0.063478...
0.101012
Deploys the MCP Toolbox service to Cloud Run. - Deploys the Slackbot service to Google Cloud Run, configuring it with the MCP Toolbox URL for tool discovery. After the script completes successfully, the Slackbot will be deployed and running on Cloud Run. Once these steps are completed and the application is deployed wi...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/README.md
main
gcp-professional-services
[ -0.07367881387472153, -0.09813597053289413, 0.06819240003824234, -0.02394680492579937, -0.018553420901298523, -0.04747028276324272, -0.024578308686614037, -0.01227420661598444, -0.03684556111693382, 0.06610522419214249, 0.019941411912441254, -0.009719162248075008, 0.04371289908885956, -0.0...
0.047384
# Terraform Configuration for Spring Boot Cloud Run Deployment This directory contains Terraform code to provision the necessary Google Cloud infrastructure for deploying the Spring Boot application to Cloud Run, along with supporting services like BigQuery, MCP Toolbox, and the needed permissions between components. B...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/infra/README.md
main
gcp-professional-services
[ -0.013263888657093048, -0.026306796818971634, 0.013977687805891037, -0.0824880599975586, -0.026164276525378227, -0.006058243103325367, -0.030163351446390152, -0.05477508157491684, 0.0031980446074157953, 0.06952585279941559, -0.023634813725948334, -0.07586577534675598, 0.05777060613036156, ...
-0.02548
to assign specific values to the variables defined in `variables.tf`, such as your Google Cloud Project ID and preferred region. **It is crucial to copy this file to `terraform.tfvars` and customize it before running `terraform apply`.** - **`bigquery.tf`**: - **Purpose**: Manages all BigQuery-related resou...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/infra/README.md
main
gcp-professional-services
[ 0.016463739797472954, 0.03431222215294838, -0.008994263596832752, -0.012858662754297256, -0.05289597809314728, 0.0006441647419705987, 0.04432343319058418, -0.050204820930957794, -0.0002605539921205491, 0.10575028508901596, -0.05126233398914337, -0.09837348759174347, 0.047682374715805054, -...
-0.027313
- `artifact_registry_repository_id`: The ID of the Artifact Registry repository. - **Other Files in `infra/`**: - `infra/bigquery/data.csv`: Sample data loaded into the BigQuery table. - `infra/mcptoolbox/tools.yaml.tpl`: Template for the MCP Toolbox configuration, which is stored in Secret Manager. ## Cleaning ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/mcptoolbox-bq-claude-slack-agent/infra/README.md
main
gcp-professional-services
[ 0.0014301673509180546, 0.09215215593576431, 0.032511185854673386, -0.015200173482298851, 0.017257696017622948, -0.04721079021692276, -0.0512428879737854, -0.05386995151638985, 0.014658120460808277, 0.08856385946273804, -0.008153960108757019, -0.05375826731324196, 0.020020807161927223, -0.0...
0.023532
## Summary Time series data is either not being reported or failing to be ingested. ## Impact We are not confident in the state of the application. ## Triage ### Case 1: All time-series absent 1. Check for any deployments that might indicate a change in the reporting logic. 2. Check whether Stackdriver Monitoring is re...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/alert-absence-dedup/policy_doc.md
main
gcp-professional-services
[ -0.0009855326497927308, -0.05076649412512779, 0.005212962161749601, 0.06956671923398972, 0.054759301245212555, -0.07959268242120743, -0.005105925723910332, -0.025436850264668465, 0.0568569041788578, -0.0011338433250784874, -0.008655629120767117, -0.0783935934305191, 0.005940364673733711, -...
0.059041
# Stackdriver alerts for missing timeseries This example demonstrates creating alerts for missing monitoring data with Stackdriver in a way that duplicate alerts are not generated. For example, suppose that you have 100 timeseries and you want to find out when any one of them is missing. If one or two timeseries are mi...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/alert-absence-dedup/README.md
main
gcp-professional-services
[ -0.038993459194898605, -0.09916266053915024, 0.01670345664024353, 0.06844991445541382, 0.020226111635565758, -0.08682999759912491, 0.003673647064715624, -0.024335065856575966, 0.10423669219017029, -0.046904079616069794, 0.011165456846356392, -0.10451476275920868, 0.05494702234864235, -0.05...
0.011596
a few minutes the alert should resolve itself. Stop the process and restart it with only one partition: ```shell ./alert-absence-demo --labels "1" ``` Check that only one alert is fired. Stop the process and do not restart it. You should see an alert that indicates all time series are absent.
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/alert-absence-dedup/README.md
main
gcp-professional-services
[ 0.048631053417921066, -0.004974472802132368, -0.004195047076791525, 0.04766729101538658, 0.1381642371416092, -0.07310909777879715, 0.03358319401741028, -0.018481142818927765, 0.1367097944021225, -0.08291788399219513, 0.011562489904463291, -0.08917025476694107, -0.01912892796099186, -0.0342...
0.09295
# Running End-to-End test for Dataflow pipelines We all know testing Dataflow applications is important. But when building data ingestion and transformation pipelines, things sometimes get too technical, and stakeholders like analysts and data scientists have no visibility into which tests are being performed. In this sample code, w...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/e2e-test-dataflow/Readme.md
main
gcp-professional-services
[ -0.05795194208621979, -0.00033147886279039085, 0.05397126078605652, -0.0016202724073082209, 0.017221886664628983, -0.09618173539638519, -0.07018885016441345, -0.0005143328453414142, -0.01110094878822565, -0.019097115844488144, -0.05342848226428032, -0.14549396932125092, 0.04482152312994003, ...
0.053872
the Dataflow job and create buckets in GCS and a BigQuery dataset in the project. `export GOOGLE_APPLICATION_CREDENTIALS=<` `mvn test`
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/e2e-test-dataflow/Readme.md
main
gcp-professional-services
[ -0.035879477858543396, -0.026683388277888298, -0.02354416437447071, -0.0386553630232811, -0.05837591737508774, -0.03940121829509735, -0.03918682783842087, 0.0377226322889328, -0.042663972824811935, 0.046179261058568954, -0.03919699043035507, -0.08998113125562668, 0.049783386290073395, -0.1...
-0.056342
## Cloud Composer Examples This repo contains the following examples of using Cloud Composer, Google Cloud Platform's managed Apache Airflow service: 1. [Composer Dataflow Examples](composer_dataflow_examples/README.md) a. [Simple Load DAG](composer_dataflow_examples/simple_load_dag.py): provides a common patter...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-examples/README.md
main
gcp-professional-services
[ -0.09670466929674149, -0.0126967066898942, -0.006683800369501114, 0.023134825751185417, 0.02714516595005989, -0.058872561901807785, -0.02607911080121994, 0.0010025018127635121, 0.012015311978757381, 0.012947206385433674, -0.048967670649290085, -0.002738119103014469, -0.03623383492231369, -...
0.015412
## Cloud Composer: Ephemeral Dataproc Cluster for Spark Job ### Workflow Overview \*\*\* ![Alt text](../img/composer-http-post-arch.png "A diagram illustrating the workflow described below.") An HTTP POST to the airflow endpoint from an on-prem system is used as a trigger to initiate the workflow. At a high level the C...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-examples/composer_http_post_example/README.md
main
gcp-professional-services
[ -0.07750850170850754, 0.019329722970724106, -0.023899709805846214, 0.032593198120594025, 0.05483398586511612, -0.06955601274967194, -0.01690089702606201, -0.04395284131169319, 0.028475066646933556, 0.02458837255835533, 0.008982742205262184, -0.00697662215679884, -0.02824992872774601, -0.14...
0.050577
[these](https://cloud.google.com/composer/docs/quickstart) steps to create a Cloud Composer environment if needed (\*cloud-composer-env\*). We will set these variables in the composer environment. | Key | Value |Example | | :--------------------- |:---------------------------------------------- |:----------------------...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-examples/composer_http_post_example/README.md
main
gcp-professional-services
[ 0.03288105130195618, 0.0054140412248671055, -0.04100921377539635, -0.05264405533671379, 0.0013173260958865285, 0.04153509438037872, 0.007693604566156864, 0.03154871612787247, 0.017095642164349556, 0.07187134772539139, 0.020901935175061226, -0.13419277966022491, 0.024174954742193222, -0.093...
-0.102898
# Airflow Metadata Export This repo contains an example Airflow DAG Factory that exports data from a list of airflow tables into a BigQuery location. The goal of this example is to provide a common pattern to export data to BigQuery for auditing and reporting purposes. ## DAG Factory Overview At a high-level the factor...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-examples/airflow_metadata_export/README.md
main
gcp-professional-services
[ -0.0030572398100048304, 0.026571406051516533, -0.03780726343393326, 0.025061681866645813, 0.04735136777162552, -0.04036644101142883, -0.04049019142985344, -0.019387047737836838, 0.0014210318913683295, 0.030110085383057594, -0.02195616438984871, -0.0330263115465641, -0.05207638442516327, -0...
-0.09039
## Cloud Composer workflow using Cloud Dataflow ##### This repo contains an example Cloud Composer workflow that triggers Cloud Dataflow to transform, enrich and load a delimited text file into Cloud BigQuery. The goal of this example is to provide a common pattern to automatically trigger, via Google Cloud Function, a...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-examples/composer_dataflow_examples/README.md
main
gcp-professional-services
[ -0.052083682268857956, 0.06368589401245117, 0.03954440355300903, -0.013167206197977066, 0.03234662488102913, -0.02598986215889454, 0.02154475636780262, -0.014518027193844318, 0.0938655361533165, 0.038984980434179306, -0.011756553314626217, 0.005184185225516558, 0.010468332096934319, -0.143...
0.084098
| | input\_field\_names | \*comma-separated-field-names-for-delimited-file\*|state,gender,year,name,number,created\_date| | bq\_output\_table | \*bigquery-output-table\* |my\_dataset.usa\_names | | email | \*some-email@mycompany.com\* |some-email@mycompany.com | The variables can be set as follows: `gcloud composer env...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-examples/composer_dataflow_examples/README.md
main
gcp-professional-services
[ 0.03918715566396713, 0.031573936343193054, -0.050068534910678864, 0.03213335573673248, -0.020733650773763657, 0.021766163408756256, 0.018957117572426796, -0.004464332014322281, 0.016658825799822807, 0.04791083559393883, -0.01755431294441223, -0.08875616639852524, 0.03031548298895359, -0.11...
-0.057259
# bigquery-analyze-realtime-reddit-data ## Table Of Contents 1. [Use Case](#use-case) 2. [About](#about) 3. [Architecture](#architecture) 4. [Guide](#guide) 5. [Sample](#sample) ---- ## Use-case Simple deployment of a ([reddit](https://www.reddit.com)) social media data collection architecture on Google Cloud Platform....
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-analyze-realtime-reddit-data/README.md
main
gcp-professional-services
[ -0.08014721423387527, -0.02836531400680542, 0.02828381024301052, -0.03197244927287102, 0.011161250993609428, -0.08590925484895706, -0.012755866162478924, -0.03942679986357689, -0.01868511736392975, 0.05380460247397423, -0.014550586231052876, -0.010421224869787693, 0.0291344802826643, -0.05...
0.038981
# 🚀 GCC Creative Studio has moved! This project has moved to a new repository: 👉 \*\*[https://github.com/GoogleCloudPlatform/gcc-creative-studio](https://github.com/GoogleCloudPlatform/gcc-creative-studio)\*\* ## What this means for you If you have already deployed Creative Studio from this repository, please follow ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/creative-studio/README.md
main
gcp-professional-services
[ -0.009332076646387577, -0.05971663445234299, 0.09187771379947662, -0.0037289157044142485, 0.02269609086215496, -0.011558271013200283, -0.038413722068071365, -0.04920240864157677, -0.003741809166967869, 0.04851250723004341, -0.0002811687591020018, -0.05583420395851135, 0.016815202310681343, ...
-0.07613
# Using Batching in Cloud Pub/Sub Java client API. This example provides guidance on how to use [Pub/Sub's Java client API](https://cloud.google.com/pubsub/docs/reference/libraries) to batch records that are published to a Pub/Sub topic. Using [BatchingSettings](http://googleapis.github.io/gax-java/1.4.1/apidocs/com/go...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/pubsub-publish-avro-example/README.md
main
gcp-professional-services
[ -0.052423957735300064, 0.008670863695442677, -0.006822499912232161, -0.033786341547966, -0.0366513654589653, -0.059078361839056015, -0.0005102871800772846, -0.052191201597452164, -0.046858564019203186, 0.02589009888470173, -0.05298938229680061, 0.03894982859492302, 0.0892711654305458, -0.0...
0.015701
# Dataproc lifecycle management orchestrated by Composer Writing individual DAGs isn’t practical when you have multiple DAGs that run similar Dataproc jobs and want to share clusters efficiently, with just some parameters changing between them. Here it makes sense to dynamically generate DAGs. Using this project you can deploy multip...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-lifecycle-via-composer/README.md
main
gcp-professional-services
[ -0.10978114604949951, -0.04016409441828728, -0.041279181838035583, 0.054427310824394226, 0.04346834495663643, -0.06563366949558258, 0.03146575391292572, -0.0019854840356856585, -0.008548249490559101, -0.01954171434044838, -0.023016100749373436, -0.0027175205759704113, 0.008873123675584793, ...
0.102533
of Composer Environment provisioning, you should expect successful completion along with a list of the created resources. ## Running DAGs DAGs will run as per the \*\*Schedule\*\*, \*\*StartDate\*\*, and \*\*Catchup\*\* settings in the DAG configuration file, or can be triggered manually through the Airflow web console aft...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-lifecycle-via-composer/README.md
main
gcp-professional-services
[ -0.05855603888630867, -0.030523935332894325, -0.03702731803059578, 0.06721922010183334, 0.04099591076374054, -0.036984317004680634, 0.015543787740170956, -0.04449216276407242, -0.023742906749248505, -0.0048758224584162235, -0.016365591436624527, -0.07300595194101334, -0.019793346524238586, ...
-0.087712
# dataproc-job-optimization-guide ---- ## Table Of Contents 1. [About](#About) 2. [Guide](#Guide) 3. [Results](#Results) 4. [Next Steps](#Next-steps) ---- ## About This guide is designed to optimize performance and cost of applications running on Dataproc clusters. Because Dataproc supports many big data technologies -...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-job-optimization-guide/README.md
main
gcp-professional-services
[ -0.0737711638212204, -0.019104547798633575, -0.0040307133458554745, 0.07461889833211899, 0.02763928472995758, -0.05818743258714676, -0.027301684021949768, 0.011269645765423775, -0.13891638815402985, 0.04452948644757271, -0.055846959352493286, -0.014140764251351357, 0.019482336938381195, -0...
0.041002
storage rm --recursive gs://$BUCKET\_NAME/transformed-$TIMESTAMP gcloud dataproc jobs submit pyspark --region=$REGION --cluster=$CLUSTER\_NAME-testing-4x4-standard scripts/spark\_average\_speed.py -- gs://$BUCKET\_NAME/raw-$TIMESTAMP/ gs://$BUCKET\_NAME/transformed-$TIMESTAMP/ ``` \*\*2 x n2-standard-8 = 1 min 31 secon...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-job-optimization-guide/README.md
main
gcp-professional-services
[ 0.010403348132967949, -0.021471502259373665, -0.02061542682349682, -0.01309878472238779, 0.02406354621052742, -0.05639531463384628, -0.00834596622735262, 0.004975413903594017, -0.01679772324860096, 0.08610799163579941, 0.0266489889472723, -0.05302297696471214, 0.014292576350271702, -0.0362...
-0.063403
clusters (see step 6) to allow clusters to scale up, and delete them when the job or workflow is complete. Downscaling may not be necessary on ephemeral, job/workflow scoped clusters. - Ensure primary workers make up >50% of your cluster. Do not scale primary workers. - This does increase cost versus a smaller number o...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-job-optimization-guide/README.md
main
gcp-professional-services
[ -0.06201167404651642, -0.011588063091039658, 0.014725239016115665, -0.008381993509829044, 0.014065668918192387, -0.033664703369140625, -0.020353758707642555, -0.03977033495903015, -0.050705231726169586, 0.04118959978222847, -0.016243448480963707, -0.04320451244711876, 0.02496049739420414, ...
0.012634
# Churn Prediction with Survival Analysis This model uses Survival Analysis to classify customers into time-to-churn buckets. The model output can be used to calculate each user's churn score for different durations. The same methodology can be used to predict customers' total lifetime from their "birth" (initial ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-churn-prediction/README.md
main
gcp-professional-services
[ -0.10715402662754059, -0.04112360253930092, -0.06597026437520981, -0.011642706580460072, 0.07772063463926315, 0.007042732089757919, 0.027045197784900665, 0.059012334793806076, 0.03041238524019718, -0.05130413919687271, 0.04650251194834709, 0.010259168222546577, 0.026177480816841125, 0.0117...
0.060998
.. ``` ## Model Training Model training minimizes the negative of the log likelihood function for a statistical Survival Analysis model with discrete-time intervals. The loss function is based off the paper [A scalable discrete-time survival model for neural networks](https://peerj.com/articles/6257.pdf). For each reco...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-churn-prediction/README.md
main
gcp-professional-services
[ -0.048813045024871826, -0.06003871560096741, 0.04363466799259186, -0.020311662927269936, 0.10821349173784256, -0.01084604300558567, 0.0417223796248436, 0.1003473624587059, 0.019957222044467926, -0.06053604558110237, 0.03959970921278, -0.010928156785666943, 0.07367227226495743, -0.010702413...
0.053074
# Personal Workbench Notebooks Deployer Accurate usage logging and cost allocation are key when you have multiple analytics users sharing data products; that is why you want to ensure analytical users use their own credentials when querying and processing data. Automation in provisioning and decommissioning of analytics...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/personal-workbench-notebooks-deployer/README.md
main
gcp-professional-services
[ -0.05639578402042389, -0.056485578417778015, -0.02911735512316227, -0.032142043113708496, -0.041694194078445435, -0.0499911829829216, 0.03921978548169136, 0.05044669657945633, -0.01784423366189003, 0.08059561252593994, -0.049598392099142075, -0.03413429856300354, 0.03470318019390106, 0.012...
0.064979
secure URL to JupyterHub to the users. When a user picks a template and creates a cluster (or reuses an existing one), JupyterHub redirects the user to Dataproc Notebooks through the Component Gateway. Main components: - \*\*JupyterHub:\*\* UI+web server (GCP managed) for users to pick Jupyter notebook server templates a...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/personal-workbench-notebooks-deployer/README.md
main
gcp-professional-services
[ -0.018607066944241524, 0.030376659706234932, 0.016288038343191147, -0.022038886323571205, -0.06015874817967415, 0.022504935041069984, -0.02658981643617153, -0.004832208622246981, -0.02576286904513836, 0.009542330168187618, -0.009398994036018848, -0.042252931743860245, 0.11076988279819489, ...
-0.00462
# Personal Managed Workbench notebooks ## Example ```hcl module "sample-managed-module" { source = "./modules/personal-managed-notebook" notebook\_users\_list = ["@"] managed\_instance\_prefix = "" project\_id = "" } ``` ## Variables | name | description | type | required | default | |----------------------------------...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/personal-workbench-notebooks-deployer/modules/personal-managed-notebook/README.md
main
gcp-professional-services
[ -0.02474699541926384, 0.022886428982019424, -0.04881393536925316, 0.06674571335315704, -0.020274246111512184, 0.056691329926252365, 0.05831223726272583, 0.014494326896965504, -0.0024416795931756496, -0.03232521936297417, 0.020852206274867058, -0.12808677554130554, 0.08346300572156906, -0.0...
0.012244
# Personal User Managed Workbench notebooks ## Example ```hcl module "sample-user-managed-module" { source = "./modules/personal-user-managed-notebook" notebook\_users\_list = ["@"] dataproc\_yaml\_template\_file\_name = "" personal\_dataproc\_notebooks\_bucket\_name = "" user\_managed\_instance\_prefix = "" generated\...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/personal-workbench-notebooks-deployer/modules/personal-user-managed-notebook/README.md
main
gcp-professional-services
[ -0.007773126009851694, 0.016623087227344513, -0.03301703929901123, 0.023025687783956528, -0.043094947934150696, 0.021796349436044693, 0.057875413447618484, 0.023357994854450226, -0.013124886900186539, -0.0030927490442991257, 0.03157304227352142, -0.11301770061254501, 0.04556037113070488, -...
-0.038189
# BigQuery Group Sync For Row Level Access Sample code to synchronize group membership from G Suite/Cloud Identity into BigQuery and join that with your data to control access at row level. ## Source files \* `group\_sync.py` and `group\_sync\_test.py`: main code and unit tests. \* `auth\_util.py`: utility function to ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-row-access-groups/README.md
main
gcp-professional-services
[ -0.05810751020908356, -0.0013269283808767796, -0.036575257778167725, -0.026718731969594955, -0.05916796997189522, -0.05462173372507095, 0.007185847964137793, -0.0353197380900383, -0.062010757625103, 0.05783417075872421, -0.00654953345656395, 0.005299841519445181, 0.08226335048675537, -0.10...
-0.119457
# Data Format Description Language ([DFDL](https://en.wikipedia.org/wiki/Data\_Format\_Description\_Language)) Processor Example This module is an example of how to process a binary file using a DFDL definition. The DFDL definitions are stored in a Firestore database. The application sends a request with the binary to process to...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dfdl-firestore-pubsub-example/README.md
main
gcp-professional-services
[ -0.028155047446489334, -0.040710411965847015, -0.15607763826847076, -0.013011596165597439, 0.017038140445947647, -0.07588924467563629, 0.012804580852389336, 0.06607072055339813, -0.0017741358606144786, -0.031557898968458176, -0.03251776471734047, -0.017345773056149483, 0.025015031918883324, ...
0.139244
A topic to publish the final json output: "data-output-json-topic" 2. A topic to publish the binary to be processed: "data-input-binary-topic" #### Subscription The following subscriptions need to be created: 1. A subscription to pull the binary data: data-input-binary-sub ## Usage ### Initialize the application Refere...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dfdl-firestore-pubsub-example/README.md
main
gcp-professional-services
[ 0.015521501190960407, -0.039841603487730026, -0.06281284242868423, -0.03326781094074249, 0.023678157478570938, -0.02826092205941677, -0.06608615070581436, -0.04567345976829529, -0.042026132345199585, 0.06131885573267937, 0.025426851585507393, -0.020317595452070236, 0.11413692682981491, 0.0...
0.011162
# QAOA Examples for Max-SAT Problems These are examples of parsing a max-SAT problem in a proprietary format. Problems can be converted into QUBO form as described in [this article, appendix C](https://arxiv.org/pdf/1708.09780.pdf) to be later simulated with [cirq](https://github.com/quantumlib/Cirq/). The maxSAT probl...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/qaoa/README.md
main
gcp-professional-services
[ -0.064321368932724, -0.02970210276544094, -0.07149200141429901, -0.02235819771885872, -0.1463829129934311, -0.08392220735549927, -0.08336236327886581, 0.054986558854579926, -0.1065140813589096, 0.044940587133169174, -0.037936173379421234, -0.03420804440975189, 0.015518687665462494, 0.01913...
0.037332
## Scheduling Commands in GCP using Cloud Run and Cloud Scheduler There are some scenarios where running a command regularly in your environment is necessary without the need for an orchestration tool such as Cloud Composer. In this example we schedule a command using Cloud Run and Cloud Scheduler. One such example is running...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/schedule-cloud-run-jobs/Readme.md
main
gcp-professional-services
[ -0.10191646218299866, -0.04896773397922516, -0.011740042828023434, 0.029826732352375984, 0.0020447822753340006, -0.02662830986082554, -0.03262684866786003, -0.07289595156908035, 0.09431192278862, 0.037659596651792526, -0.012878308072686195, -0.05292429029941559, 0.04191475361585617, -0.047...
0.014827
The Snowflake DDL Migration Utility performs the following functions: The script connects to the Snowflake Database through the snowflake-python connector. The script uses the get\_ddl command to retrieve the table schema information. The script produces the "create table" statement using the schema information and stores t...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-snowflake-tables-migration-utility/README.md
main
gcp-professional-services
[ 0.008530732244253159, -0.04868076741695404, -0.05152475833892822, -0.005611220840364695, 0.0682315081357956, -0.08850234001874924, -0.05460323393344879, 0.008498851209878922, -0.06831206381320953, -0.017293713986873627, -0.06715776771306992, -0.02971029095351696, 0.018238931894302368, -0.1...
-0.105013
The Snowflake BQ Converter Script performs the following functions: 1. The script reads the Snowflake DDL files from the specified GCS path (the output path of the snowflake\_ddl\_extraction script) 2. The script calls the BigQuery Migration API, converts the DDL to BigQuery DDL, and places it in the specified GCS p...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-snowflake-tables-migration-utility/DDL_Converter/Snowflake BQ Converter.md
main
gcp-professional-services
[ -0.028750736266374588, -0.022765927016735077, -0.017575325444340706, -0.047852423042058945, 0.030127324163913727, -0.03704285994172096, -0.07472722232341766, 0.00011979842383880168, -0.03990694507956505, -0.01061776652932167, -0.07659932971000671, -0.04906975477933884, 0.013616763055324554, ...
-0.19069
The Snowflake DDL Extraction Script performs the following functions: The script connects to the Snowflake Database through the snowflake-python connector. The script uses the Snowflake get\_ddl command to retrieve the table schema information. The script produces the "create table" statement using the schema information an...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-snowflake-tables-migration-utility/DDL_Extractor/Snowflake DDL Extraction.md
main
gcp-professional-services
[ -0.028778981417417526, -0.04588690027594566, -0.020224308595061302, -0.009205296635627747, 0.06232541427016258, -0.06660662591457367, -0.028742985799908638, -0.030010294169187546, -0.039051495492458344, 0.009022938087582588, -0.048891689628362656, -0.06446383893489838, 0.03699372708797455, ...
-0.170456
The BQ Table Creator Script performs the following functions: It reads the output SQL file created by the Snowflake BQ Converter script. The script creates the BigQuery tables in the specified target dataset. The table structure will include source columns, metadata columns, and partitioning and clustering info. The final DDL...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-snowflake-tables-migration-utility/BQ_Table_Creator/BQ Table Creator.md
main
gcp-professional-services
[ 0.007933096028864384, -0.04760267585515976, -0.05230559781193733, -0.009910170920193195, -0.00814236979931593, -0.0168015006929636, -0.022807233035564423, -0.00479409983381629, -0.08308922499418259, 0.012993143871426582, -0.05501492694020271, -0.06690265238285065, 0.042042527347803116, -0....
-0.151493
The Archive DDL Script archives the DDL files created by the scripts (snowflake\_ddl\_extraction.py, snowflake\_bq\_converter.py and archive\_ddl.py) and places the files in the specified archive bucket. The below packages are needed to run the script: google-cloud-storage Steps to run this script: 1. Make sure the pre-requisit...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-snowflake-tables-migration-utility/DDL_Archiver/Archive DDL.md
main
gcp-professional-services
[ -0.066238172352314, -0.025619829073548317, 0.03619207441806793, -0.057005416601896286, 0.05853467807173729, -0.039220888167619705, -0.06971519440412521, -0.045471567660570145, -0.013150871731340885, -0.028344860300421715, -0.0581953264772892, -0.005835134070366621, 0.04297912120819092, -0....
-0.213921
# Cloud Function "Act As" Caller > tl;dr The Terraform project in this repository deploys a single simple Python Cloud Function that executes a simple action against the Google Cloud API using the identity of the function caller, illustrating the full flow described above. --- This example describes one possible solution for red...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-functions-act-as/README.md
main
gcp-professional-services
[ -0.08039003610610962, 0.009836256504058838, 0.04261599853634834, -0.04163346812129021, -0.02006690204143524, -0.05506032705307007, 0.032528363168239594, -0.07752387970685959, 0.029931845143437386, 0.07456401735544205, 0.011086422950029373, -0.05806051939725876, 0.028281327337026596, -0.007...
0.063204
GitHub Workflow defined in this source repository in the [.github/workflows/call-function.yml](./.github/workflows/call-function.yml) file. It authenticates to GCP as a Workload Identity using Workload Identity Federation set up by the Terraform project in this repository. The Service Account "mapped" to the Workload...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-functions-act-as/README.md
main
gcp-professional-services
[ -0.08380861580371857, -0.008474479429423809, -0.005888337269425392, -0.02876000851392746, 0.018045365810394287, -0.05134231224656105, 0.04757951945066452, -0.014610767364501953, 0.11035341769456863, 0.03258369863033295, -0.01634220965206623, 0.018194260075688362, 0.03647832199931145, -0.05...
0.119477
account \* Workload Identity Service Account – the GCP service account that represents the external GitHub Workload Identity. When the GitHub workflow authenticates to GCP, it is this service account's IAM permissions that the GitHub Workload Identity is granted. | Service Account | Role | Description | |-----------...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-functions-act-as/README.md
main
gcp-professional-services
[ -0.07536463439464569, -0.01901114545762539, -0.024025585502386093, -0.0035172789357602596, 0.03485853970050812, -0.01243998110294342, 0.0882207602262497, 0.007360086310654879, 0.049638789147138596, -0.00072033068863675, 0.018438026309013367, -0.057874321937561035, 0.057578038424253464, -0....
0.094797
GitHub [workflow](.github/workflows/call-function.yml) in this repository illustrates the way of calling the sample Cloud Function from a GitHub workflow. For the workflow to succeed, a dedicated service account `wi-sample-account` is mapped to the authenticated GitHub Workload Identity. It needs to have `cloudfunction...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-functions-act-as/README.md
main
gcp-professional-services
[ -0.03908099979162216, -0.06717586517333984, 0.0031652674078941345, -0.04679407924413681, -0.03744520992040634, -0.07367853820323944, 0.016607122495770454, -0.004142445977777243, 0.05205656960606575, 0.06705621629953384, -0.042339254170656204, -0.03090442344546318, 0.026102634146809578, -0....
0.04197
run the following command: terraform destroy To delete the project, do the following: 1. In the Cloud Console, go to the [Projects page](https://console.cloud.google.com/iam-admin/projects). 1. In the project list, select the project you want to delete and click \*\*Delete\*\*. 1. In the dialog, type the project ID, an...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-functions-act-as/README.md
main
gcp-professional-services
[ -0.04784600809216499, 0.053865984082221985, 0.02837361767888069, -0.03586621582508087, -0.020566029474139214, -0.054297808557748795, 0.005982874892652035, -0.11482716351747513, 0.07612621039152145, 0.10013376176357269, -0.04898911714553833, -0.012902893126010895, 0.07408025860786438, -0.04...
-0.011518
# Scikit-learn pipeline trainer for AI Platform This is an example of building a scikit-learn-based machine learning pipeline trainer that can be run on AI Platform, which is built on top of the [scikit-learn template](https://github.com/GoogleCloudPlatform/cloudml-samples/tree/master/sklearn/sklearn-template/template)...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-sklearn-pipeline/README.md
main
gcp-professional-services
[ -0.07690850645303726, -0.053795162588357925, 0.00615173764526844, -0.012034716084599495, 0.06103593856096268, -0.0644528865814209, -0.09608947485685349, -0.015697086229920387, -0.029424268752336502, -0.024417802691459656, -0.047355398535728455, -0.051458947360515594, -0.042140182107686996, ...
0.151545
sure the following API & Services are enabled. \* Cloud Storage \* Cloud Machine Learning Engine \* BigQuery API \* Cloud Build API (for CI/CD integration) \* Cloud Source Repositories API (for CI/CD integration) - Configure project id and bucket id as environment variable. ```bash $ export PROJECT\_ID=[your-google-pro...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-sklearn-pipeline/README.md
main
gcp-professional-services
[ -0.05992864444851875, -0.04639938473701477, -0.0018006999744102359, -0.010235465131700039, -0.017447087913751602, 0.010754356160759926, -0.03865449130535126, -0.03703182190656662, -0.06906402111053467, 0.03798360750079155, -0.01708468794822693, -0.031814832240343094, 0.036385223269462585, ...
-0.031127
AI Platform. - `hptuning\_config.yaml`: for running hyperparameter tuning job on AI Platform. The YAML files share some configuration parameters. In particular, `runtimeVersion` and `pythonVersion` should correspond in both files. Note that both Python 2.7 and Python 3.5 are supported, but Python 3.5 is the recommended...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-sklearn-pipeline/README.md
main
gcp-professional-services
[ -0.044549982994794846, -0.09280375391244888, -0.008823122829198837, 0.030002159997820854, 0.033007897436618805, -0.07850925624370575, -0.07848048210144043, -0.04415843263268471, -0.12481395155191422, -0.07936525344848633, -0.015402376651763916, -0.004514344036579132, -0.05595935881137848, ...
0.090907
processing function Args: num: (float) Returns: float """ return np.sqrt(num) def \_numeric\_sq\_sr(num): """Example function that takes a scalar and returns an array Args: num: (float) Returns: numpy.array """ return np.array([\_numeric\_square(num), \_numeric\_square\_root(num)]) def \_area(args): """Example function tha...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-sklearn-pipeline/README.md
main
gcp-professional-services
[ 0.04111538454890251, -0.014330253005027771, -0.04869825020432472, 0.03164612874388695, -0.060327641665935516, -0.13200215995311737, 0.04064016044139862, -0.061189375817775726, -0.08742552250623703, -0.04247613251209259, -0.06464120745658875, 0.06616592407226562, -0.041027240455150604, 0.06...
0.088746
# Cloud Storage to BigQuery using Cloud Composer An end-to-end sample that extracts data from Cloud Storage into BigQuery using Cloud Composer. \* Cloud Storage to BigQuery \* With Airflow operator [BigQueryInsertJobOperator](https://airflow.apache.org/docs/apache-airflow-providers-google/stable/\_modules/tests/syst...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-to-bq/README.md
main
gcp-professional-services
[ 0.03470029681921005, 0.03438059613108635, -0.0177856907248497, 0.034478042274713516, 0.00811988115310669, -0.00853531900793314, -0.018119901418685913, 0.0027198074385523796, 0.022501664236187935, 0.03464081138372421, 0.0016465865774080157, -0.05892711877822876, -0.008757402189075947, -0.08...
-0.122977
Introduction ============ Many consumer-facing applications allow creators to upload audio files as a part of the creative experience. If you’re running an application with a similar use case, you may want to extract the text from the audio file and then classify based on the content. For example, you may want to categ...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ml-audio-content-profiling/README.md
main
gcp-professional-services
[ -0.02259764075279236, -0.04286389797925949, -0.07836488634347916, -0.01745302602648735, 0.05780641362071037, 0.03747459873557091, 0.026769166812300682, -0.0411330908536911, -0.04000864923000336, -0.043701980262994766, -0.09475261718034744, -0.04258604720234871, 0.005747409537434578, 0.0399...
0.055434
deploy ```` You will be prompted to choose a region to serve the application from. You may pick any region, but you cannot change this value later. You can verify that the app was deployed correctly by navigating to https://`$PROJECT`.appspot.com. You should see the following UI: ![text](img/app\_home.png?raw=true) ##...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ml-audio-content-profiling/README.md
main
gcp-professional-services
[ 0.04625634849071503, -0.04064491018652916, 0.02187647670507431, -0.11067002266645432, -0.02390432357788086, -0.036500342190265656, -0.019879695028066635, -0.04548998549580574, 0.0025128540582954884, 0.05627107247710228, -0.008275406435132027, -0.04777829349040985, 0.06544143706560135, -0.0...
-0.071535
yes ```` All of the resources should be deployed. ### View Results ### Test it out 1. You can start by uploading an audio file to GCS, either with `gcloud storage` or in the UI under the \*\*staging bucket\*\*. This triggers `send\_stt\_api\_function`, which submits the request to the Speech API and pub...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ml-audio-content-profiling/README.md
main
gcp-professional-services
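The step above submits the uploaded audio to the Speech-to-Text API. A hedged sketch of the request body that a function like `send_stt_api_function` could build from the GCS event — the language and punctuation settings are illustrative assumptions, not taken from the example:

```python
def build_stt_request(bucket: str, name: str) -> dict:
    """Build a Speech-to-Text longrunningrecognize request body for a GCS
    object. The config values here are illustrative assumptions."""
    return {
        "config": {
            "languageCode": "en-US",  # assumed language
            "enableAutomaticPunctuation": True,
        },
        "audio": {"uri": f"gs://{bucket}/{name}"},
    }
```

The long-running operation returned by the API is what a downstream function would poll before publishing the transcript.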
[ -0.05997931957244873, -0.0783756822347641, -0.02345702238380909, -0.0436236672103405, 0.025415100157260895, 0.02352866716682911, -0.07915278524160385, -0.11295805871486664, 0.02231048047542572, -0.027468184009194374, -0.07945425063371658, 0.007249090354889631, -0.005442050285637379, -0.020...
-0.005665
# AI-Powered SLO Consultant > \*\*Automated Site Reliability Engineering (SRE) Agent\*\* > > Use your application code to identify deployment-ready Service Level Objectives (SLOs) through Terraform in minutes using Google Gemini. ## Overview The \*\*AI-Powered SLO Consultant\*\* is an intelligent workflow tool designe...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/slo-assistant/Readme.md
main
gcp-professional-services
[ -0.06817895174026489, -0.03798077255487442, 0.010440892539918423, 0.001187631394714117, 0.0009171385318040848, -0.10363385081291199, -0.013099437579512596, -0.015553029254078865, -0.09423985332250595, 0.004165066871792078, -0.055247459560632706, -0.04189520329236984, 0.06266728043556213, -...
0.189617
test ``` ### Formatting & Linting To ensure code quality and consistency, run `make format` (which uses `black` with a line length of 100) and `make lint` (which uses `pylint`): ```bash # Format code make format # Run linter make lint ``` ## Usage The application uses a decoupled architecture with a FastAPI backend and...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/slo-assistant/Readme.md
main
gcp-professional-services
[ -0.043634556233882904, -0.018421288579702377, -0.0810498297214508, -0.007385514676570892, -0.029798351228237152, -0.08288649469614029, -0.0772709921002388, -0.004458872601389885, -0.06282258033752441, -0.07115113735198975, 0.03933163359761238, -0.05111309513449669, -0.001961236586794257, -...
0.002021
# Cloud Composer in Shared VPC This repo uses Terraform to create the resources below in order to deploy a private Composer environment in a shared VPC. \* Two projects, one for the shared VPC and the other for the Composer environment \* One shared VPC and subnets in the host project \* Necessary IAM permissions and firewall rules in order...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/composer-shared-vpc/README.md
main
gcp-professional-services
[ -0.003597321454435587, 0.0064821429550647736, -0.05662255734205246, -0.032411109656095505, -0.009694804437458515, 0.005115197505801916, 0.017266465350985527, -0.06181468442082405, 0.038559023290872574, 0.10666867345571518, -0.04259112477302551, -0.1123422309756279, 0.050693463534116745, 0....
0.006317
## Requirements No requirements. ## Providers | Name | Version | |------|---------| | [google](#provider\\_google) | n/a | ## Modules | Name | Source | Version | |------|--------|---------| | [network](#module\\_network) | terraform-google-modules/network/google | ~> 3.4.0 | | [private-dns-zones-shared](#module\\_priva...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/composer-shared-vpc/shared/README.md
main
gcp-professional-services
[ -0.07070177048444748, -0.048891231417655945, 0.04709859564900398, -0.04163733124732971, -0.04694851487874985, -0.00925919134169817, -0.022980947047472, -0.09779613465070724, -0.0732889398932457, 0.04248461499810219, 0.014562729746103287, -0.08392111957073212, -0.001671752193942666, -0.0297...
-0.019943
## Requirements No requirements. ## Providers | Name | Version | |------|---------| | [google](#provider\\_google) | n/a | ## Modules | Name | Source | Version | |------|--------|---------| | [composer-v1-private](#module\\_composer-v1-private) | terraform-google-modules/composer/google//modules/create\_environment\_v1...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/composer-shared-vpc/composer_v1_pvt_shared_vpc/README.md
main
gcp-professional-services
[ -0.031278565526008606, -0.008907560259103775, 0.020507093518972397, -0.019490012899041176, 0.015773197636008263, -0.03371414542198181, -0.009463589638471603, -0.08377382904291153, -0.059502147138118744, 0.050208114087581635, 0.012938711792230606, -0.10060559213161469, 0.015075487084686756, ...
-0.024089
[subnetwork](#input\\_subnetwork) | The subnetwork to host the composer cluster. | `string` | n/a | yes | | [subnetwork\\_region](#input\\_subnetwork\\_region) | The subnetwork region of the shared VPC's host (for shared vpc support) | `string` | `""` | no | | [tags](#input\\_tags) | Tags applied to all nodes. Tags are...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/composer-shared-vpc/composer_v1_pvt_shared_vpc/README.md
main
gcp-professional-services
[ -0.04002523049712181, 0.051907408982515335, -0.1267370581626892, 0.03666665032505989, 0.09173561632633209, 0.00449303537607193, 0.007817128673195839, -0.04096155986189842, -0.04792405292391777, -0.03027717024087906, 0.025889864191412926, -0.009438625536859035, 0.021760886535048485, -0.0190...
0.077114
# dataflow-production-ready (Python) ## Usage ### Creating infrastructure components Prepare the infrastructure (e.g. datasets, tables, etc.) needed by the pipeline by referring to the [Terraform module](/terraform/README.MD). Note the BigQuery dataset name that you create for later steps. ### Creating Python Virtual Envir...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-production-ready/python/README.md
main
gcp-professional-services
[ -0.012697436846792698, -0.025480756536126137, 0.026878749951720238, -0.05192013829946518, -0.04876599833369255, -0.055365025997161865, -0.049587737768888474, 0.007720795925706625, -0.03405989706516266, 0.0582134909927845, -0.06469328701496124, -0.09583207964897156, 0.008024503476917744, -0...
-0.061916
to simplify the process of packaging the pipeline into a container we utilize [Google Cloud Build](https://cloud.google.com/cloud-build/). We preinstall all the dependencies needed to \*compile and execute\* the pipeline into a container using a custom [Dockerfile](ml\_preproc/Dockerfile). In this example, we are using...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-production-ready/python/README.md
main
gcp-professional-services
[ -0.06089973449707031, 0.011371045373380184, 0.051056452095508575, -0.026522237807512283, 0.03799043223261833, -0.04377530887722969, -0.026567641645669937, 0.01200034935027361, -0.02903021313250065, 0.012700075283646584, 0.005259444005787373, -0.04569384828209877, -0.0229005366563797, -0.05...
-0.100579
# Ingesting GCS files to BigQuery using Cloud Functions and Serverless Spark In this solution, we build an approach to ingest flat files (in GCS) into BigQuery using serverless technology. This solution might not be performant if you have frequent small files landing in GCS. We use [Daily Shelter Occupancy](http...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-to-bq-serverless-services/Readme.md
main
gcp-professional-services
[ -0.016253279522061348, 0.030266178771853447, 0.03324681147933006, 0.059302669018507004, 0.0771106407046318, -0.07042261213064194, 0.03424574062228203, 0.0041329399682581425, 0.018748147413134575, 0.06598100066184998, -0.025961171835660934, -0.019069598987698555, 0.018165064975619316, -0.01...
-0.044655
<> dataset= <> bq\_table = <> error\_topic=<> ``` - \*\*Step 5:\*\* The Cloud Function is triggered once the object is copied to the bucket, and it triggers the serverless Spark batch. Deploy the function: ``` cd trigger-serverless-spark-fxn gcloud functions deploy trigger-serverless-spark-fxn --entry-point \ invoke\...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-to-bq-serverless-services/Readme.md
main
gcp-professional-services
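Step 5 above wires a GCS-triggered Cloud Function to a serverless Spark batch. A minimal sketch of what such a trigger could assemble — the file URI from the storage event plus a Dataproc Serverless batch payload. The PySpark script path and its argument order are assumptions, and the real entry-point name is elided in the example:

```python
def build_batch_request(event: dict, project: str, dataset: str, table: str) -> dict:
    """Assemble a Dataproc Serverless (batches) payload from a GCS finalize
    event. The script location and its expected args are hypothetical."""
    gcs_uri = f"gs://{event['bucket']}/{event['name']}"
    return {
        "pyspark_batch": {
            "main_python_file_uri": "gs://<code-bucket>/load_to_bq.py",  # placeholder
            "args": [gcs_uri, f"{project}.{dataset}.{table}"],
        },
    }
```

A real entry point would pass this payload to the Dataproc batches API and publish any failure to the error topic configured in Step 4.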
[ -0.005270143039524555, -0.029472146183252335, -0.014632866717875004, -0.010126180946826935, 0.0401855930685997, -0.035105641931295395, 0.042892590165138245, -0.004121794831007719, 0.020056232810020447, 0.06546225398778915, 0.02096071094274521, -0.0817672535777092, 0.03898702934384346, -0.0...
-0.054397
# Left-Shift Validation at Pre-Commit Hook This pre-commit hook uses open-source tools to provide developers with a way to validate Kubernetes manifests before changes are committed and pushed to a repository. Policy checks are typically instantiated after code is pushed to the repository, as it goes through each envir...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/left-shift-validation-pre-commit-hook/README.md
main
gcp-professional-services
[ -0.03310298174619675, 0.014855539426207542, 0.07654502987861633, -0.029762186110019684, 0.024436095729470253, -0.00905556045472622, -0.017757104709744453, -0.018684621900320053, 0.09610577672719955, 0.0825076475739479, 0.03461987525224686, -0.04002949595451355, 0.009733369573950768, -0.014...
0.09015
continue the commit. --- ## Purpose Organizations that deploy applications on Kubernetes clusters often use tools like [Open Policy Agent](https://www.openpolicyagent.org/) (OPA) or [Gatekeeper](https://open-policy-agent.github.io/gatekeeper/website/docs) to enforce security and operational policies. These are often es...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/left-shift-validation-pre-commit-hook/README.md
main
gcp-professional-services
[ -0.021478116512298584, -0.022794853895902634, 0.0004896118771284819, 0.0014950836775824428, 0.0486275851726532, -0.050046611577272415, -0.013903586193919182, -0.0462634302675724, 0.10426028072834015, 0.05455399677157402, 0.02015800215303898, -0.038456521928310394, -0.0003186628455296159, -...
0.122036
- Resetting the default behavior of your pre-commit hook, if you make changes that break the code. - Describing new Constraints and/or ConstraintTemplates to use. We have a collection of samples from the [OPA Gatekeeper Library](https://github.com/open-policy-agent/gatekeeper-library) that you can use, but you can also...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/left-shift-validation-pre-commit-hook/README.md
main
gcp-professional-services
[ -0.039774972945451736, 0.014077422209084034, 0.001600376097485423, 0.011239234358072281, 0.008104757405817509, -0.052951544523239136, -0.05723036453127861, -0.025145623832941055, 0.01762288250029087, 0.06844247877597809, 0.018918227404356003, -0.000958066142629832, -0.005044391378760338, -...
-0.012781
# Monitoring GCP Cloud DNS public zone ## Overview The config files in this repo support the GCP blogpost on "Visualizing Cloud DNS public zone query data using log-based metrics and Cloud Monitoring". ### Config files included in this repo 1. [config.yaml](config.yaml) 2. [dashboard.json](dashboard.json) 3. [latency-...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-public-zone-dashboard/README.md
main
gcp-professional-services
[ -0.01617797091603279, -0.06899842619895935, 0.0324721597135067, -0.018475160002708435, -0.033470068126916885, -0.06595002114772797, 0.05082348361611366, -0.07798779755830765, 0.04734146595001221, 0.07425668090581894, -0.05632716417312622, -0.07009116560220718, 0.00015996149159036577, -0.00...
0.036657
but also result in ingestion errors. \*\*Example\*\* ``` labelExtractors: ProjectID: EXTRACT(resource.labels.project\_id) QueryName: EXTRACT(jsonPayload.queryName) QueryType: EXTRACT(jsonPayload.queryType) ResponseCode: EXTRACT(jsonPayload.responseCode) TargetName: EXTRACT(resource.labels.target\_name) SourceIP: EXTRAC...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-public-zone-dashboard/README.md
main
gcp-professional-services
[ -0.047856200486421585, 0.04661276564002037, 0.02366369403898716, 0.011220847256481647, 0.015332222916185856, -0.03039632923901081, 0.03793410584330559, -0.08463944494724274, 0.06614314764738083, 0.024384992197155952, -0.003653518622741103, -0.10500670224428177, -0.01610652357339859, 0.0173...
0.040218
# Getting user profile from IAP-enabled GAE application This example demonstrates how to retrieve the user profile (e.g. name, photo) from an IAP-enabled GAE application. ## Initial Setup This setup can be done from `Cloud Shell`. You need the `Project Owner` permission to run this, e.g. for creating the GAE app. The following set...
https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iap-user-profile/README.md
main
gcp-professional-services
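When IAP fronts the GAE app, the signed-in identity arrives in request headers, and the profile lookup builds on that email. A small sketch of reading IAP's identity header — the header name is the one IAP documents; everything else here is illustrative:

```python
def iap_user_email(headers: dict) -> str:
    """Extract the signed-in user's email from the header IAP sets on proxied
    requests. IAP prefixes the value with an identity-provider namespace,
    e.g. "accounts.google.com:user@example.com"."""
    raw = headers.get("X-Goog-Authenticated-User-Email", "")
    return raw.split(":", 1)[1] if ":" in raw else raw
```

In production the app should verify the signed `X-Goog-IAP-JWT-Assertion` header rather than trust the plain email header alone.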
[ -0.012698967009782791, -0.012424474582076073, 0.010357809253036976, -0.02892342209815979, -0.043644413352012634, -0.021654294803738594, 0.08669643849134445, 0.0612049400806427, -0.05354553833603859, 0.061903782188892365, 0.004300425760447979, -0.13456711173057556, 0.08104650676250458, -0.0...
-0.04431