BigQuery - old

Setting up tables

Data in BigQuery typically resides in tables. When sharing data with Toplyne, the best practice is to create separate tables for event data and profile data, each partitioned on a time column.

Event data table

This table will capture users' actions; it will help answer the question, “what has a user done on your product, and when?”


Each row in this table represents an event triggered by a user.


  • USER_ID (Required): Key identifying which user performed the event
  • ACCOUNT_ID (Required for account-level analytics): Key identifying which account the user belongs to
  • EVENT_NAME (Required): Name of the event
  • TIMESTAMP (Required): Timestamp at which the event occurred
  • UPDATED_AT (Required): UTC timestamp at which the row was added or last updated (not the event timestamp)
  • EVENT_PROPERTIES (Optional): Event properties, typically shared as JSON with key/value pairs


Partition this table on the TIMESTAMP column.
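As a sketch, the event table described above could be created with DDL along these lines (the project, dataset, and table names are placeholders; adjust column types to match your data):

```sql
-- Hypothetical names: replace your_project.toplyne_share.events with your own.
CREATE TABLE `your_project.toplyne_share.events`
(
  USER_ID          STRING    NOT NULL,  -- key identifying the user
  ACCOUNT_ID       STRING,              -- required for account-level analytics
  EVENT_NAME       STRING    NOT NULL,  -- name of the event
  TIMESTAMP        TIMESTAMP NOT NULL,  -- when the event occurred
  UPDATED_AT       TIMESTAMP NOT NULL,  -- UTC time the row was added/updated
  EVENT_PROPERTIES JSON                 -- key/value event properties
)
PARTITION BY DATE(TIMESTAMP);  -- partitioned on the event time column
```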

Sample Event Table

USER_ID | ACCOUNT_ID | EVENT_NAME        | TIMESTAMP           | UPDATED_AT          | EVENT_PROPERTIES
1298    | 23         | Payment_Initiated | 2022-10-26 07:08:34 | 2023-01-09 17:08:34 | {"paymentid": 123, "amount": 200}
1567    | 77         | Trial_Ended       | 2022-11-01 20:01:14 | 2023-01-09 17:08:34 | {"accountid": 77}

Profile data table

This table will capture any profile information a user or an account has; it will help answer the question, “who is this user or account?”


Each row in this table is a unique entity.


  • USER_ID/ACCOUNT_ID (Required): Key identifying which entity's properties are listed in the row.
  • USER_CREATION_TIME/ACCOUNT_CREATION_TIME (Optional): Identifies since when the user or account has been active.
  • UPDATED_AT (Required): UTC timestamp at which the row was added or last updated (not the entity's creation timestamp)
  • USER_PROPERTIES/ACCOUNT_PROPERTIES: Each subsequent column is a profile key, holding that key's value. For example, you may choose to include the email address, geographical location, account name, current billing plan, etc.


Partition this table on the UPDATED_AT column.
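As a sketch, a user profile table could be created like this (names are placeholders; an account profile table would follow the same pattern with ACCOUNT_ID, ACCOUNT_CREATION_TIME, and account-level property columns):

```sql
-- Hypothetical names; the property columns (EMAIL, COUNTRY, BILLING_PLAN)
-- are examples only -- include whatever profile keys you track.
CREATE TABLE `your_project.toplyne_share.user_profiles`
(
  USER_ID            STRING    NOT NULL,  -- key identifying the user
  USER_CREATION_TIME TIMESTAMP,           -- since when the user has been active
  UPDATED_AT         TIMESTAMP NOT NULL,  -- UTC time the row was added/updated
  EMAIL              STRING,
  COUNTRY            STRING,
  BILLING_PLAN       STRING
)
PARTITION BY DATE(UPDATED_AT);  -- partitioned on the update time column
```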

Sample User Table

USER_ID | USER_CREATION_TIME  | UPDATED_AT          | EMAIL           | COUNTRY | BILLING_PLAN
1296    | 2022-10-26 07:08:34 | 2022-11-26 07:08:34 | [email protected] | UK      | Annual Plan
2676    | 2022-11-01 20:01:14 | 2022-12-01 20:01:14 | [email protected] | UK      | Monthly Plan

Sample Account Table

ACCOUNT_ID | ACCOUNT_CREATION_TIME | UPDATED_AT          | ACCOUNT_NAME | COUNTRY | BILLING_PLAN
23         | 2022-10-26 07:08:34   | 2022-11-26 07:08:34 | Queen        | UK      | Annual Plan
56         | 2022-11-01 20:01:14   | 2022-11-26 07:08:34 | Beetles      | UK      | Monthly Plan

Step-by-step guide to sharing access

  1. Log in to your Google Cloud Platform account.
  2. Go to IAM & Admin -> click Roles.
  3. Click Create Role.
  4. Add the title Toplyne Session Read role and the ID Toplyne_Session_Read_role.
  5. Search for BigQuery Read Session User, select the option, and click OK.
  6. Select all the supported permissions and click ADD.
  7. Click Create to create the role.
  8. Go to IAM & Admin -> select Service Accounts.
  9. Click Create Service Account.
  10. Enter the name ToplyneServiceAccount and the service account ID toplyneserviceaccount -> click Create and Continue.
  11. In Role, add the BigQuery Job User role and click Add another role.
  12. In the role option, add the Toplyne Session Read role and click Continue, followed by Done.
  13. Go to BigQuery and select your dataset.
  14. Navigate to Sharing -> click Permissions.
  15. Click Add Principal -> enter the service account email address you created in Step 10 -> grant BigQuery Data Editor, BigQuery Data Owner, and BigQuery Data Viewer access -> click Save.

[email protected]

  16. Go back to IAM & Admin and select Service Accounts.
  17. Select the service account that you created in Step 10.
  18. Go to the Keys tab -> Add Key -> Create new key.
  19. Download the key and share it with Toplyne.