This document describes how to import offline data from the CRM (e.g. sales or lead statuses) into BigQuery using Webhooks (postbacks).

Imported offline data can be combined with other website data to improve ROI reporting and attribution: you can optimise not only based on online transactions but also on the final order/lead statuses in the CRM.

Getting started

  1. Inside the admin panel go to Google BigQuery ▸ Data Streams page and click Add data stream.
  2. Choose Webhook from the list.
  3. Define the Google BigQuery Destination table where the collected data will be stored.
  4. Copy the value from the Webhook endpoint field.
  5. Click Save and follow the further instructions to integrate the code.

Webhook implementation

Imagine you have the following data in your CRM:

createdAt    updatedAt    orderId  currency  total    status     userId
2020-01-10   2020-01-15   N1       USD       120.13   delivered  U1
2020-01-15   2020-01-20   N2       USD       96.12    refunded   U2
2020-01-20   2020-01-20   N3       USD       299.99   received   U3


  • createdAt - time of the order creation in the ISO 8601 format;
  • updatedAt - time of the latest order status update in ISO 8601;
  • orderId - unique identifier of the order in your CRM system or website database. For other businesses it might be leadId or any other unique record identifier;
  • currency - order currency code;
  • total - total cost of the order;
  • status - the most recent order status (e.g. received, shipped, delivered, refunded, etc.);
  • userId - unique identifier of the user who placed the order.

Note: here we demonstrate the minimal schema required for stitching online and offline orders, but you can add any other fields you need.
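As a sketch, the minimal record above could be represented and checked on your backend like this (the field names follow the table; the `validate` helper is hypothetical, not part of any SegmentStream API):

```python
from datetime import datetime

# Minimal set of fields required for stitching online and offline orders,
# mirroring the table above. Any extra fields can be added freely.
REQUIRED_FIELDS = {"createdAt", "updatedAt", "orderId",
                   "currency", "total", "status", "userId"}

order = {
    "createdAt": "2020-01-10T00:00:00+00:00",  # ISO 8601
    "updatedAt": "2020-01-15T00:00:00+00:00",  # ISO 8601
    "orderId": "N1",
    "currency": "USD",
    "total": 120.13,
    "status": "delivered",
    "userId": "U1",
}

def validate(record: dict) -> None:
    # Ensure all required fields are present and the timestamps parse.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    datetime.fromisoformat(record["createdAt"])
    datetime.fromisoformat(record["updatedAt"])

validate(order)  # raises if the record is incomplete or malformed
```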

Depending on the programming language used on the backend of your application, you should implement a webhook that triggers each time a new order or lead is created, or the status of an existing order or lead changes inside your CRM.

For example, the following cURL request should be triggered when the order status changes from shipped to delivered:

curl -d "orderId=N1&status=delivered&total=120.13&userId=U1" <WEBHOOK_ENDPOINT>

Make sure that <WEBHOOK_ENDPOINT> is replaced with the valid endpoint retrieved from the Webhook endpoint field of the data stream configuration inside the SegmentStream admin panel.
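The same request can be sketched in Python using only the standard library. The endpoint URL below is a placeholder: substitute the value from the Webhook endpoint field of your data stream configuration.

```python
from urllib import parse, request

# Placeholder: replace with the value from the Webhook endpoint field
# of your data stream configuration in the SegmentStream admin panel.
WEBHOOK_ENDPOINT = "https://example.com/webhook"

def build_payload(order: dict) -> bytes:
    # Form-encode the fields exactly as in the cURL example above.
    return parse.urlencode({
        "orderId": order["orderId"],
        "status": order["status"],
        "total": order["total"],
        "userId": order["userId"],
    }).encode()

def send_order_update(order: dict) -> int:
    # POST the form-encoded payload to the data stream endpoint
    # and return the HTTP status code.
    req = request.Request(WEBHOOK_ENDPOINT,
                          data=build_payload(order),
                          method="POST")
    with request.urlopen(req) as resp:
        return resp.status
```

Call `send_order_update(...)` from the place in your backend where the order status change is recorded.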

How webhook data import works

Each time you trigger a webhook on the data stream endpoint associated with a specific BigQuery table, a new row is created in that table.

The row created in the BigQuery destination table by the webhook described above may look like this:

hit_id                                payload                                             received_at  date
83cc2a6a-4829-4410-bab4-95b680c5b8d7  orderId=N1&status=delivered&total=120.13&userId=U1  1603131452   2020-10-19


  • hit_id - a unique request ID generated automatically when the webhook is received by the SegmentStream server;
  • payload - the data sent in the POST body;
  • received_at - the Unix timestamp of the webhook (in seconds since January 1, 1970 UTC);
  • date - the date of the webhook with the proper UTC offset applied.
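To illustrate how these columns relate, here is a small Python sketch that decodes the example row's payload and converts its received_at timestamp back to a date (the values are taken from the row above):

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

# The payload column stores the raw form-encoded POST body.
payload = "orderId=N1&status=delivered&total=120.13&userId=U1"
fields = {key: values[0] for key, values in parse_qs(payload).items()}
# fields == {"orderId": "N1", "status": "delivered",
#            "total": "120.13", "userId": "U1"}

# received_at is a Unix timestamp in seconds.
received_at = 1603131452
dt = datetime.fromtimestamp(received_at, tz=timezone.utc)
print(dt.date().isoformat())  # 2020-10-19, matching the date column
```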

This document was last updated on December 7, 2021. If you notice any gaps, outdated information or simply want to leave some feedback to help us improve our documentation, please let us know!