| content<br>(large_string, lengths 3–20.5k) | url<br>(large_string, lengths 54–193) | branch<br>(large_string, 4 classes) | source<br>(large_string, 42 classes) | embeddings<br>(list, length 384) | score<br>(float64, -0.21 to 0.65) |
|---|---|---|---|---|---|
| of external application load balancer with serverless network endpoint groups (NEGs) # is configured with a TLS certificate for the Cloud DNS name resolving to the load balancer IP address, # then we can also omit the `-k` curl parameter and the client will verify the server TLS certificate properly: curl "http://metadata.... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-load-balancing/README.md | main | gcp-professional-services | [-0.0873, …] | -0.044643 |
| Useful Links \* [Multi-region failover using Cloud DNS Routing Policies and Health Checks for Internal TCP/UDP Load Balancer](https://codelabs.developers.google.com/clouddns-failover-policy-codelab#0) \* [AWS DNS load balancing example](https://docs.aws.amazon.com/whitepapers/latest/real-time-communication-on-aws/cross... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-dns-load-balancing/README.md | main | gcp-professional-services | [-0.0669, …] | 0.077391 |
| # dataflow-bigquery-to-alloydb We are going to be moving data from a public dataset stored in BigQuery into a table that will be created in AlloyDB. This is the BigQuery query that will generate the source data: ```sql SELECT from\_address, to\_address, CASE WHEN SAFE\_CAST(value AS NUMERIC) IS NULL THEN 0 ELSE SAFE\_C... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataflow-bigquery-to-alloydb/README.md | main | gcp-professional-services | [-0.0351, …] | 0.078617 |
| # SRE Risk Analysis Tool This project is a tool designed to help SREs (Site Reliability Engineers) assess and manage risks associated with critical user journeys in their applications. It consists of a React frontend and a Flask backend that interacts with a Firestore database. ## Features \* \*\*Application Management... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/risk-analysis-asset/README.md | main | gcp-professional-services | [-0.0797, …] | 0.178573 |
| # SRE Risk Analysis Tool - Backend This Flask-based backend provides APIs to manage applications, critical user journeys (CUJs), and risks, storing data in a Firestore database. ## Features \* \*\*Application Management:\*\* \* Add new applications with names and descriptions. \* List all existing applications. \* \*\*... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/risk-analysis-asset/backend/README.md | main | gcp-professional-services | [-0.0694, …] | 0.121032 |
| # Sentiment Analysis with Kubeflow Pipelines, Cloud Dataflow, and Cloud Natural Language API Included code will build a Kubeflow Pipelines component and pipeline. The pipeline uses Cloud Dataflow to do sentiment analysis on New York Times headlines. Cloud Dataflow uses Apache Beam (Java) to extract front page headlines... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/kubeflow-pipelines-sentiment-analysis/README.md | main | gcp-professional-services | [-0.0034, …] | 0.101727 |
| # gcs-hive-external-table-file-optimization Example solution to showcase impact of file count, file size, and file type on Hive external tables and query speeds ---- ## Table Of Contents 1. [About](#about) 2. [Use Case](#use-case) 3. [Architecture](#architecture) 4. [Guide](#guide) 5. [Sample Queries](#sample-queries) ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-hive-external-table-file-optimization/README.md | main | gcp-professional-services | [-0.0329, …] | 0.008824 |
| \| \| avro \| DEFLATE \| 1 \| 18.4 \| 9.20 \| \| avro \| none \| 1 \| 44.7 \| 15.59 \| \| json \| none \| 6851 \| 0.01 \| 476.52 \| comments = 6851 x 10kb file(s) ![Each-File-10kb](img/comments.png) comments\_json = 1 x 95.6mb file(s) ![One-Large-File](img/comments_json.png) comments\_json\_gz = 1 x 17.1mb file(s) ![Stack-Resour... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gcs-hive-external-table-file-optimization/README.md | main | gcp-professional-services | [-0.1138, …] | 0.092743 |
| # IoT Nirvana This solution was built with the purpose of demonstrating an end-to-end Internet of Things architecture running on Google Cloud Platform. The purpose of the solution is to simulate the collection of temperature measures from sensors distributed all over the world and to follow temperature evolution by cit... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iot-nirvana/README.md | main | gcp-professional-services | [-0.0594, …] | 0.09138 |
| pipeline's binary package will be stored \* \*\*[PUBSUB\_TOPIC]\*\* - the name of the PubSub topic created by the bootstrapping script, from which the Dataflow pipeline will read the temperature data; please note that this isn't the topic's canonical name, but instead the name relative to your project \* \*\*[BIGQUERY\... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iot-nirvana/README.md | main | gcp-professional-services | [0.0174, …] | -0.083617 |
| test the end to end solution, it is necessary first to start the temperature sensors simulation. Follow the steps below to achieve this: \* Go to the following address in your web browser, which will display the map of the Earth with 3 buttons at the bottom: \*\*Start\*\*, \*\*Update\*\*, \*\*Stop\*\* `https://[YOUR\_P... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/iot-nirvana/README.md | main | gcp-professional-services | [0.0307, …] | 0.003424 |
| - [Reusable Plugins](#reusable-plugins-for-cloud-data-fusion-cdf--cdap) - [Overview](#overview) - [CheckPointReadAction, CheckPointUpdateAction](#checkpointreadaction-checkpointupdateaction) - [Dependencies](#dependencies) - [Setting up Firestore](#setting-up-firestore) - [Set Runtime Arguments](#set-runtime-arguments)... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-datafusion-functions-plugins/README.md | main | gcp-professional-services | [-0.0213, …] | -0.038185 |
| duplicate records are not in the destination table. `CheckPointReadAction` - reads checkpoints in Firestore DB and provides the data during runtime as environment variable `CheckPointUpdateAction` - updates checkpoints in Firestore DB (i.e., creates a new document and stores maximum update date / time from BQ so the ne... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-datafusion-functions-plugins/README.md | main | gcp-professional-services | [-0.0135, …] | -0.071956 |
| - Project ID: GCP project ID. - Dataset: BigQuery dataset name. - Table Name: BigQuery table name. \*\*Please see the following screenshot for example configuration:\*\* ![PipelineImage](../images/pipeline.png) # Putting it all together into a Pipeline `CheckPointReadAction` → `TruncateTableAction` → Database ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-datafusion-functions-plugins/README.md | main | gcp-professional-services | [0.0164, …] | -0.032734 |
| # Terraform for Deploying a KAS agent in a GKE cluster This repository provides Terraform code for deploying a KAS agent in a GKE cluster, to connect it with a GitLab repository to automatically deploy, manage, and monitor your cloud-native solutions using GitOps practices. This creates resources in your cluster to dep... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gitlab-kas-gke/README.md | main | gcp-professional-services | [-0.0636, …] | 0.1252 |
| Kubernetes Service Account used by agent to manage deployments in product namespace - RBAC roles for KSA to read/write configMap of cluster - RBAC roles for KSA to read/write any k8s resources in product namespace ## Considerations - Setup remote backend in provider - Configure service account with appropriate permissi... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gitlab-kas-gke/README.md | main | gcp-professional-services | [-0.0034, …] | 0.103934 |
| ## Requirements \| Name \| Version \| \|------\|---------\| \| [gitlab](#requirement\\_gitlab) \| >= 3.18.0 \| \| [helm](#requirement\\_helm) \| >= 2.7.1 \| \| [kubernetes](#requirement\\_kubernetes) \| >= 2.15.0 \| ## Providers \| Name \| Version \| \|------\|---------\| \| [gitlab](#provider\\_gitlab) \| 15.8.0 \| \| [google](#provider\\_goo... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gitlab-kas-gke/terraform-docs.md | main | gcp-professional-services | [0.0125, …] | 0.042605 |
| # Running a GKE application on spot nodes with on-demand nodes as fallback Spot VMs are unused virtual machines offered for a significant discount which makes them great for cost saving. However, spot VMs come with a catch: they can be shut down and taken away from your project at any time. Sometimes, they can even be ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gke-ha-setup-using-spot-vms/README.md | main | gcp-professional-services | [-0.0247, …] | 0.081709 |
| VMs have to be shut down, a new node has to be spun up in the on-demand pool, and the pods have to be restarted on the new node. Especially spinning up a new node takes some time, which means your application might experience temporary performance issues in this scenario. To avoid that, consider having sufficient insta... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gke-ha-setup-using-spot-vms/README.md | main | gcp-professional-services | [0.0304, …] | 0.088006 |
| # Redacting Sensitive Data Using the DLP API This example illustrates how to use the DLP API in a Cloud Function to redact sensitive data from log exports. The scrubbed logs will then be posted to a Pub/Sub topic to be ingested elsewhere. ## Getting Started These instructions will walk you through setting up your envir... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dlp/cloud_function_example/README.md | main | gcp-professional-services | [-0.0209, …] | -0.121979 |
| the identification and replacement of a small number of data types (EMAIL\_ADDRESS, CREDIT\_CARD\_NUMBER, and DATE\_OF\_BIRTH), however, the DLP API supports many more which can be found [here](https://cloud.google.com/dlp/docs/infotypes-reference) | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dlp/cloud_function_example/README.md | main | gcp-professional-services | [-0.1026, …] | 0.034979 |
| # Contents - [Multi-Cluster ASM on Private Clusters](./infrastructure): Anthos Service Mesh (ASM) for multiple GKE clusters, using Terraform - [Twistlock PoC](./twistlock): Pod traffic security scanning, using ASM, Docker and Google Artifact Registry (GAR) - [Cloud SQL for PostgreSQL PoC](./postgres): Connecting GKE cl... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/README.md | main | gcp-professional-services | [-0.0258, …] | 0.135681 |
| creating clusters. For more information, see [Setting up clusters with Shared VPC](https://cloud.google.com/kubernetes-engine/docs/how-to/cluster-shared-vpc). In this sample, we create two private clusters in different subnets of the same VPC in the same project, and enable clusters to communicate to each other's API s... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/README.md | main | gcp-professional-services | [-0.0312, …] | 0.071835 |
| 3. Run install\_asm\_mesh ``` install\_asm\_mesh ``` or, you can run the commands in install\_asm\_mesh step by step manually ``` # Navigate to your working directory. Binaries will be downloaded to this directory. cd ${WORK\_DIR} # Set up K8s config and context set\_up\_credential ${CLUSTER1\_CLUSTER\_NAME} ${CLUSTER1... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/README.md | main | gcp-professional-services | [0.0246, …] | 0.094643 |
| # Twistlock Deployment Notes Because GCP ASM is built upon Istio service mesh for Kubernetes, Twistlock deployment on GCP ASM follows Twistlock's instruction of installation for Istio environment in general. \*\*NOTE:\*\* Installing Prisma Cloud (Twistlock) SaaS on Kubernetes requires the use of a bearer token. This ca... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/twistlock/README.md | main | gcp-professional-services | [-0.1109, …] | 0.343735 |
| is one daemon pod on each node of your cluster. ``` kubectl get pods -n twistlock ``` 2. Navigate back in Twistlock Cloud, and go to \_\_Compute\_\_ -> \_\_Radars\_\_ -> \_\_Hosts\_\_, you should see your cluster nodes are monitored there. 3. Still in Twistlock Cloud, go to \_\_Compute\_\_ -> \_\_Radars\_\_ -> \_\_Cont... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/twistlock/README.md | main | gcp-professional-services | [0.0058, …] | 0.088669 |
| # Auto Encrypt PostgreSQL SSL Connection Using Istio Proxy Sidecar ### Summary PostgreSQL uses application-level protocol negotiation for SSL connection. Istio Proxy currently uses TCP-level protocol negotiation, so Istio Proxy sidecar errors out during SSL handshake when it tries to auto encrypt connection with Postgr... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/postgres/Istio-Sidecar.md | main | gcp-professional-services | [-0.0379, …] | 0.191722 |
| be prompted for a password. You will see errors. #### Look into the Istio Proxy log You can read the logs in Cloud Logging. However, you may want to view sidecar log messages for the detailed network traffic information and errors with the following command. ``` kubectl logs deploy/postgres-istio -c istio-proxy -n ``` | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/postgres/Istio-Sidecar.md | main | gcp-professional-services | [0.0724, …] | 0.379182 |
| # PostgreSQL Auto SSL Connection Using Cloud SQL Proxy ## Summary PostgreSQL uses application-level protocol negotiation for SSL connection. Istio Proxy currently uses TCP-level protocol negotiation, so Istio Proxy sidecar errors out during SSL handshake when it tries to auto encrypt connection with PostgreSQL. Please ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/postgres/README.md | main | gcp-professional-services | [-0.0507, …] | 0.158919 |
| serviceaccount \ ksa-sqlproxy \ iam.gke.io/gcp-service-account="sql-client@${PROJECT\_ID}.iam.gserviceaccount.com" \ -n YOUR\_NAMESPACE ``` 5. Deploy PostgreSQL client with Cloud SQL Proxy sidecar Take a look at the deployment YAML file, [postgres-cloudproxy.yaml](./postgres-cloudproxy.yaml). Please note the following ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-service-mesh-multicluster/postgres/README.md | main | gcp-professional-services | [-0.0069, …] | -0.023261 |
| # Cost Optimization Dashboard This repo contains SQL scripts for analyzing GCP Billing, Recommendations data and also a guide to set up the Cost Optimization dashboard. For sample dashboard [see here](https://datastudio.google.com/c/u/0/reporting/6cf564a4-9c94-4cfd-becd-b9c770ee7aa2/page/r34iB). ## Introduction The Cost... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cost-optimization-dashboard/README.md | main | gcp-professional-services | [-0.0193, …] | -0.009451 |
| data analysis and aggregation #### Common Functions \* Compose a new query and copy the SQL at [common\_functions.sql](scripts/common\_functions.sql). \* Execute the query to create some required functions in the ```dashboard``` dataset. \* This is how the dataset will look after the above step. ![BigQuery D... ![BigQuery Dataset](images/Dashboard\_BQ\_Dataset.jpg) - [Create Cloud Function Doc](... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-ml-claudeintegrations/BQ_RemoteFunction_Sample/README.md | main | gcp-professional-services | [-0.0501, …] | -0.042088 |
| # BQ+Claude Art Sample Project This README provides instructions for setting up and running the BQ+Claude3 Art Sample project. ## Prerequisites - Access to Google Cloud Platform (GCP) - BigQuery enabled in your GCP project ## Setup Instructions ### Step 1: Load Raw Data 1. Locate the `object.csv` file containing the ra... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-ml-claudeintegrations/Python_Notebook_Sample/README.md | main | gcp-professional-services | [-0.0269, …] | -0.106237 |
| # BigQuery Audit Log Anomaly Detection BQ audit log anomaly detection is a tool which uses [BigQuery Audit Logs](https://cloud.google.com/bigquery/docs/reference/auditlogs), specifically in the `AuditData` format, for automated analysis of Big Data Cloud environments with a focus on BigQuery. The tool summarized and a... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-auditlog-anomaly-detection/README.md | main | gcp-professional-services | [0.0242, …] | 0.043153 |
| for outliers. For example, if you expect all emails in your bq environment to carry out roughly equal jobs, you will set a lower sigma value. 3. If the individual score is more than sigma from the standard deviation, the point will be flagged as an outlier. ### **Time Series Analysis** The algorithm to identify time ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-auditlog-anomaly-detection/README.md | main | gcp-professional-services | [-0.0474, …] | 0.08076 |
| # BQ Long Running Optimizer A utility that reads the entire SQL and provides a list of suggestions that would help to optimize the query and avoid the long running issues. ## Business Requirements One of the most frequently occurring issues in BigQuery are Long Running Issues. This is seen in 2 cases: 1. The Queries g... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-long-running-optimizer/Readme.md | main | gcp-professional-services | [0.0260, …] | -0.013712 |
| ## Use LIKE instead of REGEXP\_CONTAINS Recommendation In BigQuery, you can use the REGEXP\_CONTAINS function or the LIKE operator to compare strings. REGEXP\_CONTAINS provides more functionality, but also has a slower execution time. Using LIKE instead of REGEXP\_CONTAINS is faster, particularly if you don't need the ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-long-running-optimizer/optimization/regex_contains/regex_contains.md | main | gcp-professional-services | [-0.0167, …] | -0.052776 |
| The Generic DDL Migration Utility performs the following functions: 1. The script connects to a generic database (Oracle, Snowflake, MSSQL, Vertica, Netezza). 2. The script uses the metadata table (all\_tab\_columns) to retrieve the table schema information. 3. The script produces the "create table" statement using the... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-generic-ddl-migration-utility/README.md | main | gcp-professional-services | [0.0042, …] | -0.040834 |
| The Generic BQ Converter Script performs the following functions: 1. The script reads the generic ddl files from the specified gcs path (output path of the generic\_ddl\_extraction script) 2. The script calls the BigQuery Migration API, converts the ddl to the BigQuery DDL, and places it in the specified gcs path Be... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-generic-ddl-migration-utility/DDL_Converter/generic_bq_converter.md | main | gcp-professional-services | [-0.0409, …] | -0.165998 |
| The packages below are needed to run the script: google-cloud-secret-manager google-cloud-bigquery google-cloud-storage google-api-core sudo apt install unixodbc Steps to run this script: 1. Create the generic-ddl-extraction-config.json file and place it in the gcs bucket. 2. Create the object\_name\_mapping.json file and pl... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-generic-ddl-migration-utility/DDL_Extractor/generic_ddl_extraction.md | main | gcp-professional-services | [-0.0511, …] | -0.163036 |
| The BQ Table Creator Script performs the following functions: 1. Reads the output sql file created by the mssql/netezza/vertica bq converter script 2. The script creates the BigQuery tables in the specified target dataset. 3. The table structure will include source columns, metadata columns and partitioning and clusteri... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-generic-ddl-migration-utility/BQ_Table_Creator/generic_bq_table_creator.md | main | gcp-professional-services | [-0.0075, …] | -0.113019 |
The Archive DDL Script archive the DDL files created by the scripts (generic\_ddl\_extraction.py, generic\_bq\_converter.py and archive\_ddl.py) and place the files in the specified archive bucket. Below packages are need to run the script: google-cloud-storage Steps to run this script: 1. Make Sure the pre-requsitie o... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-generic-ddl-migration-utility/DDL_Archiver/generic_archive_ddl.md | main | gcp-professional-services | [
-0.08986543864011765,
-0.028830384835600853,
0.03194459527730942,
-0.08309690654277802,
0.03441179543733597,
-0.03496602177619934,
-0.06197838485240936,
-0.07071954756975174,
-0.017052944749593735,
0.010679759085178375,
-0.018966998904943466,
-0.0205532256513834,
0.04765082150697708,
-0.05... | -0.204457 |
# Rotate service account keys in secret manager The program helps to rotate the service account key in secret manager. - \*\*Step 1:\*\* Creata a Pub/Sub Topic. Secret manager will publish a message to this topic as the secret approaches the expiration time. ``` gcloud pubsub topics create "projects/PUBSUB\_PROJECT\_ID... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/rotate-service-account-keys/Readme.md | main | gcp-professional-services | [
-0.056778982281684875,
-0.038813516497612,
-0.06000599265098572,
-0.04282237961888313,
-0.019897010177373886,
-0.01585843227803707,
0.050082407891750336,
-0.045390792191028595,
0.0743514820933342,
0.05810888484120369,
0.03825439140200615,
0.01621607318520546,
0.14508774876594543,
0.0001119... | -0.042337 |
# Vertex AI Pipeline This repository demonstrates end-to-end [MLOps process](https://services.google.com/fh/files/misc/practitioners\_guide\_to\_mlops\_whitepaper.pdf) using [Vertex AI](https://cloud.google.com/vertex-ai) platform and [Smart Analytics](https://cloud.google.com/solutions/smart-analytics) technology capa... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_pipeline/README.md | main | gcp-professional-services | [-0.12110676616430283, ...] | 0.15399 |
custom serving image is not necessary if your chosen framework is supported by [pre-built-container](https://cloud.google.com/vertex-ai/docs/predictions/pre-built-containers), which are organized by machine learning (ML) framework and framework version and provide HTTP prediction servers that you can use to serve predict... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/vertex_pipeline/README.md | main | gcp-professional-services | [-0.05464429780840874, ...] | 0.042371 |
# Instrumenting Web Applications End-to-End with Stackdriver and OpenTelemetry This tutorial demonstrates instrumenting a web application end-to-end, from the browser to the backend application, including logging, monitoring, and tracing with OpenTelemetry and Stackdriver to run for a load test. It shows how to collect... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/web-instrumentation/README.md | main | gcp-professional-services | [-0.05815305933356285, ...] | 0.006152 |
if you want to live on the edge gcloud beta container clusters create $NAME \ --num-nodes 1 \ --enable-autoscaling --min-nodes 1 --max-nodes 4 \ --enable-basic-auth \ --issue-client-certificate \ --release-channel $CHANNEL \ --zone $ZONE \ --enable-stackdriver-kubernetes ``` Change the project id in file `k8s/ot-servic... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/web-instrumentation/README.md | main | gcp-professional-services | [-0.001295908004976809, ...] | 0.018549 |
send a request from the command line with cURL ```shell EXTERNAL\_IP=[from kubectl get ingress command] REQUEST\_ID=1234567889 # A random number # See W3C Trace Context for format TRACE=00-0af7651916cd43dd8448eb211c80319c-b7ad6b7169203331-01 MILLIS=`date +%s%N \| cut -b1-13` curl "http://$EXTERNAL\_IP/data/$REQUEST\_ID"... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/web-instrumentation/README.md | main | gcp-professional-services | [0.01440480723977089, ...] | 0.076281 |
Copyright 2024 Google. This software is provided as-is, without warranty or representation for any use or purpose. Your use of it is subject to your agreement with Google. # Five9 Voicestream Integration with Agent Assist This is a PoC to integrate Five9 Voicestream with Agent Assist. ## Project Structure ``` . ├── ass... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-agentassist-five9-grpc/README.md | main | gcp-professional-services | [-0.06538166850805283, ...] | 0.016372 |
is being guided by virtual agents. ### Local Development Set Up This application is designed to run on port 8080. Upon launch, the application will initialize and bind to port 8080, making it accessible for incoming connections. This can be changed in the .env file. #### Protocol Buffer Compiler: This implementation le... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/ccai-agentassist-five9-grpc/README.md | main | gcp-professional-services | [-0.06538166850805283, ...] | 0.002766 |
# TensorFlow Profiling Examples Before launching the training job, please copy the raw data and define the environment variables (the bucket for staging and the bucket where you are going to store data as well as the training job's outputs) `export BUCKET=YOUR\_BUCKET gcloud storage cp gs://cloud-training-demos/babyweight/pr... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/tensorflow-profiling-examples/README.md | main | gcp-professional-services | [-0.029027177020907402, ...] | -0.010083 |
dumps locally: `rm -rf /tmp/profiler mkdir -p /tmp/profiler gcloud storage cp --recursive $OUTDIR/$MODEL/profiler /tmp` 4. Launch the profiler with `python ui.py --profile\_context\_path=/tmp/profiler/$(ls /tmp/profiler/ \| head -1)` | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/tensorflow-profiling-examples/README.md | main | gcp-professional-services | [0.002804955467581749, ...] | -0.114073 |
# Modern CI/CD with Anthos: Demo Guide ## Overview This guide walks you through putting together a [modern CI/CD reference architecture with Anthos](https://cloud.google.com/solutions/modern-ci-cd-with-anthos). There are different permutations to leverage Anthos for your particular use case. The purpose of this guide i... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/README.md | main | gcp-professional-services | [-0.10121011734008789, ...] | 0.076752 |
# Prerequisites ## Google Cloud Platform This tutorial uses Anthos, which is on the Google Cloud Platform (GCP). If you don't have an account, you can [sign up](https://cloud.google.com/free/) for $300 in free credits. The estimated cost to run this tutorial is $9 per day. After signing up, [create](https://cloud.google.com... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/1-prerequisites.md | main | gcp-professional-services | [-0.0946783795952797, ...] | -0.006479 |
# Set up Anthos Config Management (ACM) [Anthos Config Management](https://cloud.google.com/anthos/config-management) (ACM) is a key component of Anthos that lets you define and enforce configs, including custom policies, and apply them across all your infrastructure both on-premises and in the cloud. With ACM, you can s... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/3-set-up-anthos-config-management.md | main | gcp-professional-services | [0.010623653419315815, ...] | 0.073841 |
configmanagement.gke.io/cluster-selector: dev-cluster-selector EOF cd .. mkdir stage cd stage cat > namespace.yaml << EOF apiVersion: v1 kind: Namespace metadata: name: stage annotations: configmanagement.gke.io/cluster-selector: dev-cluster-selector EOF cd .. mkdir prod cd prod cat > namespace.yaml << EOF apiVersion: ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/3-set-up-anthos-config-management.md | main | gcp-professional-services | [0.0017636261181905866, ...] | 0.101434 |
# Register Clusters with Anthos ## Create GKE clusters For this tutorial we'll create 2 GKE clusters called `dev` and `prod`. The dev cluster will be used for the dev and test environments while the prod cluster will be used for the prod environment. Create dev and prod clusters: ```bash for i in "dev" "prod"; do gclou... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/2-register-gke-clusters-with-anthos.md | main | gcp-professional-services | [0.049307748675346375, ...] | 0.064587 |
# CICD with Anthos In this section, we'll automate a CI/CD pipeline taking advantage of the features from Anthos. ## Create app Before creating a CICD pipeline, we need an application. For this tutorial, we'll use the popular hello kubernetes application created by paulbower but with a few modifications. Download hello-... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/4-cicd-with-anthos-and-gitlab.md | main | gcp-professional-services | [-0.052788566797971725, ...] | 0.133751 |
- namePattern: gcr.io/kaniko-project/\* - namePattern: gcr.io/cloud-solutions-images/kustomize:3.7 - namePattern: gcr.io/kpt-functions/gatekeeper-validate - namePattern: gcr.io/kpt-functions/read-yaml - namePattern: gcr.io/stackdriver-prometheus/\* - namePattern: gcr.io/$PROJECT\_ID/cloudbuild-attestor - namePattern: g... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/4-cicd-with-anthos-and-gitlab.md | main | gcp-professional-services | [-0.035164013504981995, ...] | 0.166529 |
image: gcr.io/cloud-solutions-images/kustomize:3.7 tags: - prod only: refs: - main script: - DIGEST=\$(cat images/digest.txt) # build out staging manifests - mkdir -p ./hydrated-manifests/ # stage - cd \${KUSTOMIZATION\_PATH\_NON\_PROD} - kustomize edit set image app=\${HOSTNAME}/\${PROJECT\_ID}/\${CONTAINER\_NAME}@\${... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/4-cicd-with-anthos-and-gitlab.md | main | gcp-professional-services | [0.03869766741991043, ...] | 0.066657 |
\${CI\_PIPELINE\_URL}" git push origin stage fi EOF ``` Push platform-admin remote: In GitLab, create a blank public project under the [$GROUP\_NAME](https://gitlab.com/dashboard/groups) group called `platform-admin` then run the following commands to push the `platform-admin` dir to GitLab ```bash cd ~/$GROUP\_NAME/platfo... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/4-cicd-with-anthos-and-gitlab.md | main | gcp-professional-services | [-0.010379143990576267, ...] | -0.069928 |
## Register GitLab runner [Gitlab runner](https://docs.gitlab.com/runner/) is what runs the jobs in the GitLab pipeline. In this tutorial, we'll install the GitLab runner application to our prod cluster and register it as our GitLab runner. Before registering the GitLab runner on our system, we'll enable [workload identity... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/4-cicd-with-anthos-and-gitlab.md | main | gcp-professional-services | [-0.07050399482250214, ...] | 0.001737 |
namespace.yaml << EOF apiVersion: v1 kind: Namespace metadata: name: gitlab EOF cd .. cat > service-account.yaml << EOF apiVersion: v1 kind: ServiceAccount metadata: name: gitlab-runner annotations: iam.gke.io/gcp-service-account: gitlab-sa@$PROJECT\_ID.iam.gserviceaccount.com EOF ``` Push changes to acm repo: ```bash ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/docs/4-cicd-with-anthos-and-gitlab.md | main | gcp-professional-services | [-0.022156229242682457, ...] | 0.034608 |
# Hello Kubernetes! This container image can be deployed on a Kubernetes cluster. When accessed via a web browser on port `8080`, it will display: - a default \*\*Hello world!\*\* message - the pod name - node OS information  The default "Hello world!... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/src/hello-kubernetes/README.md | main | gcp-professional-services | [0.008225617930293083, ...] | 0.041354 |
`app` folder in the VS Code Remote Containers terminal, you will be able to access the website on `http://localhost:8080`. You can change the port in the `.devcontainer\devcontainer.json` file under the `appPort` key. See [here](https://code.visualstudio.com/docs/remote/containers) for more details on working with this... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/anthos-cicd-with-gitlab/src/hello-kubernetes/README.md | main | gcp-professional-services | [0.036804862320423126, ...] | -0.047746 |
This is an example of running build steps with cloudbuild, having docker-compose running HTTPS\_PROXY in the background connected to a bastion host via Identity Aware Proxy using 'gcloud compute ssh'. This is one of the ways to restrict the outbound public IP address of the cloudbuild default pool. # Prerequisite: ## Setup ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudbuild-with-tcp-proxy/README.md | main | gcp-professional-services | [0.0038670306093990803, ...] | -0.125676 |
# Composer Dependency Management ### TL;DR: This repository presents a Cloud Composer workflow designed to orchestrate complex task dependencies within Apache Airflow. The solution specifically addresses the challenge of managing parent-child DAG relationships across varying temporal frequencies (yearly, monthly, weekl... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-dependency-management-example/README.md | main | gcp-professional-services | [-0.11363695561885834, ...] | 0.09717 |
a series of interconnected data pipelines, each designed to perform specific tasks and produce valuable insights. #### Yearly Refresh: Company Calendar Once a year, Symphony Goods executed a critical process known as ["Company\_cal\_refresh"](company\_cal\_refresh.py). This workflow ensured that the company's internal ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-dependency-management-example/README.md | main | gcp-professional-services | [-0.09713505953550339, ...] | 0.240293 |
to drive business growth. | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloud-composer-dependency-management-example/README.md | main | gcp-professional-services | [0.008070950396358967, ...] | 0.260868 |
# Cloud Run CRL Monitoring This project deploys a serverless solution to monitor Certificate Revocation Lists (CRLs) using Google Cloud Run, Cloud Scheduler, and Cloud Monitoring. It validates that CRLs are: 1. \*\*Accessible\*\*: Can be downloaded via HTTP/HTTPS. 2. \*\*Valid\*\*: Are in valid DER or PEM format. 3. \*... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudrun-crl-monitor/README.md | main | gcp-professional-services | [-0.11820987612009048, ...] | 0.002777 |
# Better Consumer Complaint and Support Request Handling With AI ## Contributors - Dimos Christopoulos (Google) - [Shane Kok](https://www.linkedin.com/in/shane-kok-b1970a82/) (shanekok9@gmail.com) - Andrew Leach (Google) - Anastasiia Manokhina (Google) - [Karan Palsani](https://www.linkedin.com/in/karanpalsani/) (karan... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-support-routing/README.md | main | gcp-professional-services | [-0.17664547264575958, ...] | 0.147021 |
tested these instructions in other environments. \*\*All commands, unless otherwise stated, should be run from the directory containing this README.\*\* ## Enable Required APIs in your Project These instructions have been tested in a fresh Google Cloud project without any organization constraints. You should be able to... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-support-routing/README.md | main | gcp-professional-services | [-0.0710861012339592, ...] | -0.069868 |
and if you aren't rerunning the model build you don't need to change `global.dataset\_display\_name` and `global.model\_display\_name`. If you need to change the default paths (because you are running somewhere besides an AI Platform Notebook, because your repo is in a different path, or because your AutoML service acc... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-support-routing/README.md | main | gcp-professional-services | [-0.0022974046878516674, ...] | 0.015557 |
default is 1, but you should expect an extra hour of spin-up time on top of the training budget. Upping the budget may improve model performance. ## Online Predictions The example pipeline makes batch predictions, but a common deployment pattern is to create an API endpoint that receives features and returns a predicti... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-support-routing/README.md | main | gcp-professional-services | [0.00652303546667099, ...] | 0.017098 |
# Storage Transfer Service (STS) Metrics [Google Cloud STS](https://cloud.google.com/storage-transfer/docs/overview) provides options to move data between buckets or from other cloud providers. Users can schedule recurring or one-time STS jobs to move data for data backup, synchronization, and replication. As a managed ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/sts-metrics/README.md | main | gcp-professional-services | [-0.06532861292362213, ...] | 0.13426 |
log like the following: > [INFO ] 2022-12-18 11:33:06.250 [pool-7-thread-1] StsJobHelper - Creating one time transfer job in STS: {description=eshen-test, name=transferJobs/a-eshen-1671391986249, notificationConfig={payloadFormat=JSON, pubsubTopic=projects/eshen-test-3/topics/eshen-sts-metrics-topic}, projectId=eshen-t... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/sts-metrics/README.md | main | gcp-professional-services | [-0.028077568858861923, ...] | 0.133807 |
# Energy Price Forecasting Example This repository contains example code to forecast energy prices. Specifically, given a historical time series of hourly spot prices and weather, it predicts future hourly spot prices multiple days into the future. The code takes in raw data from BigQuery, transforms and prepares the d... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudml-energy-price-forecasting/README.md | main | gcp-professional-services | [
-0.09146218746900558,
0.0002846652932930738,
0.053290147334337234,
0.08718924969434738,
0.04209108650684357,
-0.11878383904695511,
-0.02866741269826889,
0.0011475355131551623,
-0.058803439140319824,
0.02149982936680317,
-0.0681111216545105,
-0.05019056424498558,
-0.006225191988050938,
-0.0... | -0.024619 |
# Running a GKE cluster with control plane authority features In GKE, Google Cloud fully manages the security configuration of the control plane, including \*\*encryption of storage at rest\*\*, and configuring the keys and certificate authorities (CAs) that sign and verify credentials in your clusters. The control pla... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gke-control-plane-authority/README.md | main | gcp-professional-services | [
-0.04270615801215172,
-0.020749304443597794,
0.012946195900440216,
-0.005099076312035322,
-0.0025730656925588846,
-0.009234484285116196,
0.014460149221122265,
-0.09243721514940262,
-0.04616245627403259,
0.04210038110613823,
-0.0015164228389039636,
-0.039408616721630096,
0.06154752895236015,
... | 0.02922 |
the GKE cluster and KMS keys within the same GCP Project. \* Ensure the required variables are populated in a ```terraform.tfvars``` file or passed in as arguments in the apply command. Sample variables file: ``` project\_id = "my-gcp-project" org\_id = "1234567890" cluster\_name = "cpa-cluster" network\_project\_id = ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/gke-control-plane-authority/README.md | main | gcp-professional-services | [
-0.0017799873603507876,
-0.0018340253736823797,
0.023898344486951828,
-0.057900551706552505,
-0.06831854581832886,
-0.025834903120994568,
0.022520767524838448,
-0.03886587917804718,
0.03151363506913185,
0.070042185485363,
-0.01847095414996147,
-0.16127067804336548,
0.07601499557495117,
-0.... | 0.045595 |
You can serve your TensorFlow models on Google Kubernetes Engine with [TensorFlow Serving](https://www.tensorflow.org/tfx/guide/serving). This example illustrates how to automate deployment of your trained models to GKE. In production setup, it's also useful to load test your models to tune TensorFlow Serving configura... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/tf-load-testing/README.md | main | gcp-professional-services | [
0.0076932236552238464,
-0.09719964861869812,
0.03092939779162407,
-0.0417134165763855,
-0.05129023641347885,
-0.014966408722102642,
-0.0535297654569149,
-0.025750141590833664,
-0.039807114750146866,
-0.015888329595327377,
-0.06641191244125366,
-0.06841004639863968,
0.027153192088007927,
-0... | 0.068454 |
gcr.io/mogr-test-277422/tensorflow-app:latest env: - name: MODEL\_NAME value: regression ports: - containerPort: 8500 - containerPort: 8501 args: ["--model\_config\_file=/benchmark/models.config", "--tensorflow\_intra\_op\_parallelism=4", "--tensorflow\_inter\_op\_parallelism=4", "--batching\_parameters\_file=/benchmar... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/tf-load-testing/README.md | main | gcp-professional-services | [
-0.04180961474776268,
-0.03830833360552788,
-0.05385163798928261,
0.05502679944038391,
-0.01558363251388073,
-0.07346321642398834,
-0.047572407871484756,
0.0023050005547702312,
-0.050918467342853546,
-0.09730120003223419,
0.007087971083819866,
-0.029461940750479698,
-0.04424972087144852,
-... | 0.041601 |
## Dataflow pipeline to change the key of a Bigtable For an optimal performance of our requests to a Bigtable instance, [it is crucial to choose a good key for our records](https://cloud.google.com/bigtable/docs/schema-design), so that both read and writes are evenly distributed across the keys space. Although we have ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigtable-change-key/README.md | main | gcp-professional-services | [ -0.0004985698615200818, -0.02103939838707447, 0.0031292622443288565, -0.023044146597385406, -0.039743877947330475, -0.0923546627163887, -0.08262684941291809, 0.007920009084045887, -0.0509442575275898, 0.06322622299194336, 0.01569453999400139, -0.03319057077169418, 0.07455278933048248, -0.1... | -0.086164 |
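The row above concerns choosing a Bigtable row key that spreads reads and writes evenly across the key space. One common pattern the linked schema-design guide discusses is prefixing keys with a stable hash bucket; the sketch below is a minimal illustration of that idea (the function name and bucket count are assumptions, not from the example's pipeline):

```python
import hashlib

def salted_key(original_key: str, buckets: int = 16) -> str:
    # Prepend a deterministic hash bucket so monotonically increasing
    # keys (e.g. timestamps) spread across the Bigtable key space
    # instead of hotspotting a single tablet.
    digest = hashlib.sha256(original_key.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) % buckets
    return f"{bucket:02d}#{original_key}"
```

Reads must then fan out over all bucket prefixes, which is the usual trade-off of salted keys.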
triggering the pipeline. Then run: ```bash # Change if your location is different JAR\_LOC=target/bigtable-change-key-bundled-0.1-SNAPSHOT.jar PROJECT\_ID= REGION= TMP\_GS\_LOCATION= BIGTABLE\_INSTANCE= INPUT\_TABLE= OUTPUT\_TABLE= RUNNER=DataflowRunner java -cp ${JAR\_LOC} com.google.cloud.pso.pipeline.BigtableChangeK... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigtable-change-key/README.md | main | gcp-professional-services | [ 0.010090353898704052, -0.023623937740921974, 0.025479383766651154, -0.040665846318006516, 0.003129251068457961, -0.017830587923526764, -0.07592711597681046, 0.0040000611916184425, -0.0443824864923954, 0.02873940020799637, -0.03619897738099098, -0.06959950923919678, 0.012331841513514519, -0... | -0.051087 |
CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigtable-change-key/README.md | main | gcp-professional-services | [ -0.017406092956662178, 0.037016917020082474, -0.05546785518527031, -0.04463174566626549, -0.02692166157066822, 0.05144880339503288, 0.01092204824090004, -0.044430073350667953, 0.02712048403918743, -0.037726618349552155, 0.041487619280815125, -0.01703323982656002, 0.034683987498283386, 0.06... | 0.135405 |
# BigQuery Audit Log Dashboard This example shows you how to build a dashboard using Data Studio for visualization and a SQL script to query the back-end data source. The dashboard displays different metrics pertaining to BigQuery consumption. The purpose of the dashboard is to be used as a BigQuery auditing tool that ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/README.md | main | gcp-professional-services | [ -0.010753239504992962, 0.0266435407102108, -0.06762705743312836, 0.011837264522910118, 0.04838644340634346, -0.05588741600513458, 0.020632345229387283, 0.0003697971405927092, -0.011011547408998013, 0.04482031613588333, -0.08250890672206879, -0.009798262268304825, 0.020708724856376648, -0.0... | 0.046988 |
the data source created in the step 3 above. Click on create report. Rename the report (dashboard) to a name of your choice. | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/README.md | main | gcp-professional-services | [ -0.019336402416229248, -0.002470494480803609, -0.1058204397559166, 0.04388854652643204, 0.0178181491792202, 0.03563804551959038, -0.03913033753633499, 0.017952801659703255, 0.026335088536143303, 0.051410965621471405, 0.010829743929207325, -0.04915611818432808, 0.09240879863500595, -0.00166... | 0.017988 |
# Load Jobs Report This document outlines the Load Jobs report (page 2) of the dashboard and explains the various graphs and tables present on the page. #### Note: In all further sections, the "time", "week" or "day" is relative to the timeframe selected in the date filter in the Selection Bar at the top of the page ##... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/docs/load_jobs.md | main | gcp-professional-services | [ -0.062286537140607834, 0.04722762107849121, -0.020448723807930946, 0.09148065745830536, 0.05619033798575401, -0.0304995309561491, -0.009439361281692982, -0.004246730823069811, -0.036662034690380096, 0.011842516250908375, -0.05973313748836517, 0.004617944825440645, 0.0529865138232708, 0.034... | 0.065171 |
# Extract Jobs Report This document outlines the Extract Jobs report (page 3) of the dashboard and explains the various graphs and tables present on the page. #### Note: In all further sections, the "time", "week" or "day" is relative to the timeframe selected in the date filter in the Selection Bar at the top of the p... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/docs/extract_jobs.md | main | gcp-professional-services | [ -0.10802905261516571, 0.05910846218466759, -0.0150609090924263, 0.08195163309574127, 0.12467515468597412, 0.008117863908410072, 0.006303072907030582, -0.01411046739667654, -0.03542790189385414, 0.017140213400125504, -0.023435022681951523, -0.008724185638129711, 0.03889594227075577, 0.05059... | 0.074154 |
# Queries Report This document outlines the Queries report (page 5) of the dashboard and explains the various graphs and tables present on the page. #### Note: In all further sections, the "time", "week" or "day" is relative to the timeframe selected in the date filter in the Selection Bar at the top of the page ### Se... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/docs/query_jobs.md | main | gcp-professional-services | [ -0.07134354114532471, 0.06711261719465256, 0.0009291566093452275, 0.11018778383731842, 0.05369257926940918, -0.009928814135491848, 0.002052862197160721, 0.0014878205256536603, -0.02657218649983406, -0.006762745324522257, -0.05317261442542076, -0.041788775473833084, 0.07112617790699005, 0.0... | 0.063988 |
# Copy Jobs Report This document outlines the Copy Jobs report (page 4) of the dashboard and explains the various graphs and tables present on the page. #### Note: In all further sections, the "time", "week" or "day" is relative to the timeframe selected in the date filter in the Selection Bar at the top of the page ##... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/docs/copy_jobs.md | main | gcp-professional-services | [ -0.07601282745599747, 0.035654813051223755, -0.01971161738038063, 0.055513326078653336, 0.08771036565303802, 0.02422953024506569, 0.014051415026187897, -0.013728497549891472, -0.054297324270009995, 0.01065222080796957, -0.014722615480422974, -0.019787216559052467, 0.06767961382865906, 0.04... | 0.048559 |
# Overall Usage Report This document outlines the Overall Usage Report (page 1) of the dashboard and explains the various graphs and tables present on the page. #### Note: In all further sections, the "time", "week" or "day" is relative to the timeframe selected in the date filter present in the Selection Bar at the to... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/bigquery-audit-log/docs/overall_usage.md | main | gcp-professional-services | [ -0.011717593297362328, 0.032788511365652084, -0.03737654536962509, 0.11407103389501572, 0.04755398631095886, -0.04257592558860779, 0.009285488165915012, 0.023143094033002853, -0.017371509224176407, 0.027449117973446846, -0.05305952951312065, -0.007805262226611376, 0.005477153230458498, 0.0... | 0.091254 |
# Dataproc Running Notebooks ## Objective Orchestrator to run Notebooks on an Ephemeral Dataproc cluster via Cloud Composer ## File Directory Structure βββ composer\_input β βββ initialization\_scripts/ init\_pip\_gcsfuse.sh β βββ jobs/ wrapper\_papermill.py β βββ DAGs/ composer\_pyspark\_notebook.py βββ notebooks β ββ... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/dataproc-running-notebooks/README.md | main | gcp-professional-services | [ -0.10792769491672516, -0.018522504717111588, -0.023065418004989624, 0.022884661331772804, -0.0038127677980810404, -0.03178669139742851, -0.04766407608985901, 0.03949049115180969, -0.03585640713572502, 0.04898484796285629, 0.01118266861885786, -0.07197339087724686, 0.013539022766053677, -0.... | -0.034282 |
# CloudSQL Custom Metric This example demonstrates how to create a custom metric for Stackdriver Monitoring. This example estimates the number of IP's consumed in a CloudSQL private services subnet.  ## Component Description A Stackdriver log sink at the orga... | https://github.com/GoogleCloudPlatform/professional-services/blob/main//examples/cloudsql-custom-metric/README.md | main | gcp-professional-services | [ -0.010262669064104557, -0.06461530178785324, -0.004560625180602074, 0.07117240130901337, -0.0473022498190403, -0.034872300922870636, 0.032222334295511246, -0.03455427661538124, 0.1067686453461647, 0.04419415816664696, -0.05522829294204712, -0.09145299345254898, 0.05930899828672409, -0.0321... | 0.049035 |