Do you want to move data from a Pub/Sub topic to a BigQuery table? I have noticed that most examples show how to do it with Dataflow, which is a bit of an overkill if there is no transformation or enrichment going on. For simply moving data around, Google Cloud Functions are a great solution.

The first step is to create a BigQuery table where your data will land. You need to figure out the schema first and then follow the steps from here to create the BQ table.
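If you prefer the command line, you can also create the table with the bq CLI. This is just a sketch; the dataset, table, and field names below are placeholders:

```shell
# Create a table with an explicit schema (placeholder names).
bq mk --table my_dataset.events \
  event_id:STRING,payload:STRING,created_at:TIMESTAMP
```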

If the schema is relatively simple and you already have rows of sample data, you can let BigQuery auto-create the schema. Save your sample data to a file and use it to initialize the BigQuery table. Both JSON and CSV work fine.
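For example, schema auto-detection can be triggered with a bq load; here the names are placeholders again and the sample file is assumed to be newline-delimited JSON:

```shell
# Let BigQuery infer the schema from sample data.
bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON \
  my_dataset.events sample.json
```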

The next thing is to write the function. Both Node.js and Python are supported, but in our experience Python functions execute much faster.

Here is an example of a function that decodes the message body from the input event, writes it to the log, and saves it in a BigQuery table. The dataset and table names are passed in as environment variables, so you can reuse the same function for different topics.
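A minimal sketch of what such a function might look like (a 1st-gen background function; the entry point name, the env var names, and the assumption that each message body is a single JSON object are mine):

```python
import base64
import json
import logging
import os

from google.cloud import bigquery

# Hypothetical env var names; set them at deploy time.
BQ_DATASET = os.environ["BQ_DATASET"]
BQ_TABLE = os.environ["BQ_TABLE"]

client = bigquery.Client()


def pubsub_to_bq(event, context):
    """Triggered by a Pub/Sub message; logs it and streams it into BigQuery."""
    # Pub/Sub delivers the message body base64-encoded in event["data"].
    body = base64.b64decode(event["data"]).decode("utf-8")
    logging.info("Received message: %s", body)

    # Assumes the body is a JSON object matching the table schema.
    row = json.loads(body)
    errors = client.insert_rows_json(f"{BQ_DATASET}.{BQ_TABLE}", [row])
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```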

Save the function to a file and deploy it using the gcloud CLI.
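The deploy command might look something like this (the function name, topic, runtime, and env var values are placeholders):

```shell
# Deploy as a Pub/Sub-triggered function (placeholder names).
gcloud functions deploy pubsub_to_bq \
  --runtime python39 \
  --trigger-topic my-topic \
  --entry-point pubsub_to_bq \
  --set-env-vars BQ_DATASET=my_dataset,BQ_TABLE=events
```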

You can also use the GCP web console to create the function. Here is what the form looks like:

Pick the correct topic and make sure to set the environment variables too:

To view the function execution logs, go to GCP Stackdriver logs and set this as the filter:
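A filter along these lines should work (the function name is whatever you used when deploying):

```
resource.type="cloud_function"
resource.labels.function_name="pubsub_to_bq"
```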

Good luck! Here are a few more references that might be useful: