I'm working on a project where we need to use BigQuery, Pub/Sub, Logs Explorer, and Cloud Functions.
The project:
Every time a certain event occurs (like a user accepting cookies), a system inserts a new row into BigQuery with a lot of columns (params) like: utm_source, utm_medium, consent_cookies, etc...
Once I have this new row in my table, I need to read the columns and get the values to use in a Cloud Function.
In the Cloud Function I want to use those values to make API calls.
What I manage to do so far:
I created a log routing sink that filters the new entries and sends the log to my Pub/Sub topic.
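For reference, a sink filter for BigQuery activity usually matches the audit-log entries. The exact methodName depends on how the rows are inserted (query job vs. streaming insert) and on which audit-log format your project emits, so treat the filter below as a starting point and verify it against a real entry in Logs Explorer first:

```
resource.type="bigquery_resource"
protoPayload.methodName="jobservice.jobcompleted"
```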
Where I'm stuck:
I want to create a Cloud Function that triggers every time a new log comes in. In that function I want to access the information contained in the log, such as utm_source, utm_medium, consent_cookies, etc., and use those values to make API calls.
Can anyone help me? Many, MANY thanks in advance!
I made a project to illustrate the flow:
1. Insert a row into the table.
2. From this insertion, a sink in Logging picks up the log entry (with filtering).
3. Now, every time I insert a new row, the log of that insert goes to Pub/Sub.
4. What I want to do is trigger a function on this topic and use the values I have in the row to do operations like calling APIs, etc...
So far I was able to write this code:
"use strict";

// Import the Google Cloud client library
const { BigQuery } = require("@google-cloud/bigquery");

async function queryDb() {
  const bigqueryClient = new BigQuery();

  // A template literal can't contain unescaped backticks, so use a
  // plain string for the quoted table name
  const sqlQuery = "SELECT * FROM `mydatatable`";

  const options = {
    query: sqlQuery,
    location: "europe-west3",
  };

  // Run the query
  const [rows] = await bigqueryClient.query(options);
  rows.forEach((row) => {
    const username = row.user_name;
    console.log(username);
  });
}

queryDb().catch(console.error);
Now I'm stuck again: I don't know how to get the correct row from the sink's log message and use its values to make my calls...