Overview
This documentation provides a detailed walkthrough for sending Google Cloud SQL logs to SigNoz. By the end of this guide, you will have a setup that automatically ships your Cloud SQL logs to SigNoz.
Here's a quick summary of what we will be doing in this guide:
- Create and configure Cloud SQL
- Create Pub/Sub topic
- Create Log Router to route the Cloud SQL logs to SigNoz
- Create Compute Engine instance
- Create OTel Collector to route logs from Pub/Sub topic to SigNoz Cloud
- Send and Visualize the logs in SigNoz Cloud
Prerequisites
- Google Cloud account with administrative privilege or Cloud SQL Admin privilege.
- SigNoz Cloud account (we are using SigNoz Cloud for this demonstration, and we will also need the ingestion details. To get your Ingestion Key and Ingestion URL, sign in to your SigNoz Cloud account and go to Settings >> Ingestion Settings)
- Access to a project in GCP
Setup
Get started with Cloud SQL Configuration
Follow the steps mentioned in the Creating Cloud SQL document to create a Cloud SQL instance.
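If you prefer the CLI over the console, a Cloud SQL instance can also be created with gcloud. This is only an illustrative sketch; the instance name, database version, tier, and region below are placeholders that you should replace with your own values:
# Illustrative example: create a small MySQL Cloud SQL instance
# (instance name, version, tier, and region are placeholders)
gcloud sql instances create cloudsql-logs-demo \
  --database-version=MYSQL_8_0 \
  --tier=db-f1-micro \
  --region=us-central1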
Create PubSub Topic
Follow the steps mentioned in the Creating Pub/Sub Topic document to create the Pub/Sub topic.
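As a quick CLI reference, the topic and a pull subscription for the collector can also be created with gcloud; the names below are placeholders. The subscription name is what goes into the googlecloudpubsub receiver's subscription field later in this guide:
# Placeholder names: replace cloudsql-logs / cloudsql-logs-sub with your own
gcloud pubsub topics create cloudsql-logs
gcloud pubsub subscriptions create cloudsql-logs-sub --topic=cloudsql-logs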
Create Log Router to Pub/Sub Topic
Follow the steps mentioned in the Log Router Setup document to create the Log Router.
To ensure that only the Cloud SQL logs are routed, use the following filter condition:
resource.type="cloudsql_database"
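If you create the sink from the CLI instead of the console, the same filter can be passed with --log-filter. This is only a sketch; the sink name, project ID, and topic name are placeholders, and the sink's writer identity still needs the Pub/Sub Publisher role on the topic:
# Sketch: route Cloud SQL logs to the Pub/Sub topic (sink, project, and topic names are placeholders)
gcloud logging sinks create cloudsql-logs-sink \
  pubsub.googleapis.com/projects/<gcp-project-id>/topics/cloudsql-logs \
  --log-filter='resource.type="cloudsql_database"'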
Setup OTel Collector
Follow the steps mentioned in the Creating Compute Engine document to create another Compute Engine instance. We will install the OTel Collector on this instance.
Install OTel Collector as agent
First, establish authentication using the following commands:
- Initialize gcloud:
gcloud init
- Authenticate with GCP (a quick credentials check follows this list):
gcloud auth application-default login
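To confirm that Application Default Credentials were set up correctly, you can print an access token; the token itself is not needed, only that the command succeeds:
# Succeeds only if Application Default Credentials are configured
gcloud auth application-default print-access-token > /dev/null && echo "ADC configured"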
Let us now proceed to the OTel Collector installation:
Step 1: Download otel-collector tar.gz for your architecture
wget https://github.com/open-telemetry/opentelemetry-collector-releases/releases/download/v0.116.0/otelcol-contrib_0.116.0_linux_amd64.tar.gz
Step 2: Extract otel-collector tar.gz to the otelcol-contrib folder
mkdir otelcol-contrib && tar xvzf otelcol-contrib_0.116.0_linux_amd64.tar.gz -C otelcol-contrib
Step 3: Create config.yaml in the otelcol-contrib folder with the content below. Replace <region> with the appropriate SigNoz Cloud region, and replace <SigNoz-Key> with the ingestion key provided by SigNoz:
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
  googlecloudpubsub:
    project: <gcp-project-id>
    subscription: projects/<gcp-project-id>/subscriptions/<pubsub-topic's-subscription>
    encoding: raw_text
processors:
  batch: {}
exporters:
  otlp:
    endpoint: "ingest.<region>.signoz.cloud:443"
    tls:
      insecure: false
    headers:
      "signoz-ingestion-key": "<SigNoz-Key>"
service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    logs:
      receivers: [otlp, googlecloudpubsub]
      processors: [batch]
      exporters: [otlp]
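Before starting the collector, you can optionally check that the configuration parses correctly using the collector's built-in validate command:
# Validate the configuration without starting the collector
./otelcol-contrib validate --config ./config.yaml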
Step 4: Once the above configuration is in place, run the collector service. From the otelcol-contrib folder, run the following command:
./otelcol-contrib --config ./config.yaml
Run in background
If you want to run the OTel Collector process in the background:
./otelcol-contrib --config ./config.yaml &> otelcol-output.log & echo "$!" > otel-pid
The above command sends the output of the otel-collector to the otelcol-output.log file and prints the process ID of the background OTel Collector process to the otel-pid file.
To see the output of the background collector process, you can tail the log file:
tail -f -n 50 otelcol-output.log
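When you want to stop the background collector, the saved process ID can be used; this assumes the otel-pid file written by the earlier command:
# Stop the background OTel Collector using the saved process ID
kill "$(cat otel-pid)"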
Visualize the logs in SigNoz Cloud
You can now start seeing the Cloud SQL logs on SigNoz Cloud.

Cloud SQL Logs in SigNoz Cloud
Prerequisites
- Google Cloud account with administrative privilege or Cloud SQL Admin privilege.
- Access to a project in GCP
- Self-Hosted SigNoz
For more details on how to configure Self-Hosted SigNoz for logs, check the official Self-Hosted SigNoz documentation and navigate to the "Send Logs to Self-Hosted SigNoz" section.
Setup
Get started with Cloud SQL Configuration
Follow the steps mentioned in the Creating Cloud SQL document to create a Cloud SQL instance.
Create PubSub Topic
Follow the steps mentioned in the Creating Pub/Sub Topic document to create the Pub/Sub topic.
Create Log Router to Pub/Sub Topic
Follow the steps mentioned in the Log Router Setup document to create the Log Router.
To ensure that only the Cloud SQL logs are routed, use the following filter condition:
resource.type="cloudsql_database"
Setup OTel Collector
Follow the steps mentioned in the Creating Compute Engine document to create the Compute Engine instance.
Install OTel Collector as agent
First, establish authentication using the following commands:
- Initialize gcloud:
gcloud init
- Authenticate with GCP:
gcloud auth application-default login
Let us now proceed to the OTel Collector installation:
Step 1: Download otel-collector tar.gz for your architecture
wget https://github.com/open-telemetry/opentelemetry-collector-releases/releases/download/v0.116.0/otelcol-contrib_0.116.0_linux_amd64.tar.gz
Step 2: Extract otel-collector tar.gz to the otelcol-contrib folder
mkdir otelcol-contrib && tar xvzf otelcol-contrib_0.116.0_linux_amd64.tar.gz -C otelcol-contrib
Step 3: Create config.yaml in the otelcol-contrib folder with the content below. We are using the clickhouselogsexporter in this case.
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
  googlecloudpubsub:
    project: <gcp-project-id>
    subscription: projects/<gcp-project-id>/subscriptions/<pubsub-topic's-subscription>
    encoding: raw_text
processors:
  batch: {}
  resource/env:
    attributes:
      - key: deployment.environment
        value: prod
        action: upsert
exporters:
  clickhouselogsexporter:
    dsn: tcp://clickhouse:9000/
    timeout: 5s
    sending_queue:
      queue_size: 100
    retry_on_failure:
      enabled: true
      initial_interval: 5s
      max_interval: 30s
      max_elapsed_time: 300s
service:
  pipelines:
    logs:
      receivers: [otlp, googlecloudpubsub]
      processors: [batch, resource/env]
      exporters: [clickhouselogsexporter]
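The dsn above assumes the ClickHouse instance used by your Self-Hosted SigNoz deployment is reachable from this machine as clickhouse on port 9000; adjust the hostname to match your setup. A quick connectivity check (assuming netcat is installed) could look like:
# Check that the ClickHouse native port from the dsn is reachable
# (the "clickhouse" hostname is taken from the dsn above; replace it with yours)
nc -zv clickhouse 9000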
Step 4: Once the above configuration is in place, run the collector service. From the otelcol-contrib folder, run the following command:
./otelcol-contrib --config ./config.yaml
Run in background
If you want to run the OTel Collector process in the background:
./otelcol-contrib --config ./config.yaml &> otelcol-output.log & echo "$!" > otel-pid
The above command sends the output of the otel-collector to the otelcol-output.log file and prints the process ID of the background OTel Collector process to the otel-pid file.
To see the output of the background collector process, you can tail the log file:
tail -f -n 50 otelcol-output.log
Visualize the logs in SigNoz
You can now see the Cloud SQL logs in SigNoz.