An attribution dashboard is one of a content marketer’s most valuable possessions. When it uses reliable, meaningful, well-structured data, it can highlight which content is moving the needle and what’s falling short of expectations—so you can focus your time and budget where it matters most.
But setting up this dashboard isn’t always easy. Data is often spread across tools, and joining it all together into something actionable takes careful planning and the proper setup.
That’s why we’re here to help. In this guide, we walk you through the entire process of building an attribution dashboard—from setting up your project in BigQuery and importing data to joining it and visualising the results in Looker Studio.

With this dashboard, you can:
- Quickly see which content drives results
- Make better decisions, faster, since all your key data is in one place
- Spot gaps in content and identify high-potential pieces
- Save time—no more jumping between tools
Let’s get started!
6 Steps to Building an Attribution Dashboard
1. Create a dedicated Google Cloud Project
See: Create a Google Cloud project | Google Workspace
If you don’t already have a Google Cloud Project for marketing analytics, create one.
Instructions:
- In the Google Cloud Platform Console, go to Menu > IAM & Admin > Create a Project.
- In the Project Name field, enter a descriptive name for your project—e.g., “Organic Attribution Reporting & Analytics.”
- (Optional) Edit the Project ID—this can’t be changed later. An ID like “organic-attribution-reporting-analytics” works well.
- In the Location field, click Browse to display potential locations for your project. Then, click Select.
- Click Create. The Google Cloud console will navigate to the Dashboard page, and your project will be created within a few minutes.
2. Enable BigQuery
Before starting with BigQuery, you must set up a billing account. (This requires the Billing Management administrator privilege.)
- In your Google Admin console, go to Menu > Billing > Subscriptions.
- At the top, click Start Paid Service or Set Up Billing.
- Follow the onscreen instructions.
Instructions:
- Navigate to Menu > BigQuery. It may be under Pinned Products, or you may need to select View All Products at the bottom. BigQuery will be under the Analytics heading.
- Enable the BigQuery API.
3. Grant access permissions
See: Control access to resources with IAM | BigQuery | Google Cloud
Instructions:
- Navigate to Menu > IAM & Admin.
- Select Grant Access from the VIEW BY PRINCIPALS table.

- Add relevant emails to the New principals field. Under Assign Roles, select BigQuery Admin.
- Click Save.
4. Import data to BigQuery
Once you’ve set up the Cloud Project, it’s time to get your data into BigQuery.
Below are instructions for importing data from GA4, Google Sheets, Hubspot, and Airtable, alongside general advice for other types of platforms.
Google Analytics 4 (GA4)
See: [GA4] Set up BigQuery Export - Analytics Help, particularly the permissions and service accounts sections.
GA4 includes a simple, native process for exporting data to BigQuery:
- Navigate to ⚙️Admin, scroll to Product Links, and click BigQuery Links.

- Click Link.
- Click Choose a BigQuery project to display a list of projects you can access.
- Select the project you created earlier, then click Confirm.
- Select a location for the data. You should use the same data location for GA4, BigQuery, and any projects you want to interact with.
- Click Next.
- Select Configure data streams and events to select which data streams to include with the export and specific events to exclude from the export.
You can exclude events by clicking Add to choose from a list of existing events, or click Specify event by name to enter event names directly, including events that have yet to be collected on the property.
- Click Done.
- Select Include advertising identifiers for mobile app streams if you want to include advertising identifiers.
- Select a Daily (once a day) export, a Streaming (continuous) export, or both. Streaming is more expensive and usually unnecessary; Daily will be sufficient in almost every case.
- Click Next.
- Review your settings, then click Submit.
Google Sheets
Many marketing teams—including Eleven Writing—use Google Sheets to manage content plans and calendars. However, getting this operational data into, say, GA4, isn’t very straightforward.
Without going through BigQuery, you’d essentially need to include all the same data in fields in your content management system (CMS) and pass this through to GA4 via the data layer. This is cumbersome and poses several problems, including the fact that data will be associated with live pages only from the time you set up this relationship. Aggregating the data through BigQuery provides greater control and flexibility.
You’ll start by creating an External Table that pulls data from Google Sheets. However, External Tables aren’t particularly fast, so you’ll also set up an Internal Table and create a Scheduled Query to move data from the External to the Internal Table.
External Table setup instructions
- Create a new dataset and name it "content_calendar" or similar.
- Click the three dots ⠇ next to your dataset and select Create Table.
- Under Source, select Google Drive.
- Paste in the URL of the sheet.
- Set File Format to Google Sheet.
- Set the Sheet Range to Export (or the name of the sheet you want to copy).
- Choose a name for the table, such as “content_calendar_export.”
- Under Schema, add the names of all headers you want to capture from your Google Sheet. You can also click Edit as text to enter the schema as comma-separated name:type pairs or as JSON. For the export sheet, keep all types as STRING. In the next step, you'll assign appropriate data types in the Internal Table.
- Under Advanced Options (click the ⌄ arrow to expand), set Header rows to skip to 1 (depending on the number of header rows in your sheet).
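If you're comfortable with SQL, you can create the same External Table with a DDL statement instead of the console form. The following is a minimal sketch, assuming the dataset and table names used above, a hypothetical sheet URL, and a handful of example columns; adjust all of these to match your own sheet.

```sql
-- Minimal sketch: External Table over a Google Sheet, created via DDL.
-- Every column stays a STRING here; types are assigned later in the Internal Table.
CREATE OR REPLACE EXTERNAL TABLE `your_project_name.content_calendar.content_calendar_export` (
  title STRING,
  keyword STRING,
  author STRING,
  promotion STRING,
  publication_date STRING
)
OPTIONS (
  format = 'GOOGLE_SHEETS',
  uris = ['https://docs.google.com/spreadsheets/d/your-sheet-id'],  -- hypothetical sheet URL
  sheet_range = 'Export',    -- the tab you want to copy
  skip_leading_rows = 1      -- matches "Header rows to skip"
);
```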

Internal Table setup instructions
- As above, click the three dots ⠇ next to your dataset and select Create Table.
- Leave Create table from as Empty table.
- Choose a name for the table, such as “content_calendar_import.”
- Under Schema, turn on Edit as text and paste the schema. This time, however, set the appropriate types.
Supported types are:
- STRING
- BYTES
- INTEGER
- FLOAT
- NUMERIC
- BIGNUMERIC
- BOOLEAN
- TIMESTAMP
- DATE
- TIME
- DATETIME
- GEOGRAPHY
- JSON
- RANGE
The most common types you’ll likely use are STRING, BOOLEAN, and DATE. See Google Cloud > BigQuery > Data types for all acceptable types.
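For reference, here is an equally minimal sketch of the Internal Table created via DDL, using the same example columns with proper types assigned (BigQuery's Standard SQL DDL uses BOOL, which the console's schema editor lists as BOOLEAN):

```sql
-- Minimal sketch: Internal Table with proper types assigned.
CREATE TABLE IF NOT EXISTS `your_project_name.content_calendar.content_calendar_import` (
  title STRING,
  keyword STRING,
  author STRING,
  promotion BOOL,           -- shown as BOOLEAN in the console schema editor
  publication_date DATE
);
```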
Scheduled Query setup instructions
- In the left navigation menu, select Scheduled Queries.
- Click + CREATE SCHEDULED QUERY IN EDITOR at the top of the page.
- Your query will perform a SELECT of the columns from your External Table and cast the types as necessary. Be sure to enter the correct address for the External Table (e.g., "your_project_name.content_calendar.content_calendar_export").
The basic format is:

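A minimal sketch of that format, using the example dataset and table names from above:

```sql
-- Select the raw STRING columns straight from the External Table.
SELECT
  title,
  keyword,
  author
FROM
  `your_project_name.content_calendar.content_calendar_export`
```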
And to assign a data type, such as BOOLEAN or DATE:

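A sketch of the same SELECT with casts applied (SAFE_CAST returns NULL instead of failing when a value can't be converted):

```sql
-- Cast STRING columns to the types you want in the Internal Table.
SELECT
  title,
  SAFE_CAST(promotion AS BOOL) AS promotion,
  SAFE_CAST(publication_date AS DATE) AS publication_date
FROM
  `your_project_name.content_calendar.content_calendar_export`
```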
For example, suppose you had the following fields in your content calendar that you’d like to capture for the purposes of attribution reporting:
- Title (string)
- Keyword (string)
- Meta Description (string)
- Author (string)
- Campaign (string)
- CTA (string)
- Promotion (boolean)
- USP/Product (string)
- Cluster (string)
- Audience (string)
- Publication Date (date)
Your scheduled query, from an external table named content_calendar_export, would look like this:

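Here is a sketch of that query, assuming snake_case column names in the External Table (yours may differ) and dates stored as YYYY-MM-DD strings:

```sql
-- Scheduled query: copy the content calendar from the External Table,
-- assigning proper types along the way.
SELECT
  title,
  keyword,
  meta_description,
  author,
  campaign,
  cta,
  SAFE_CAST(promotion AS BOOL) AS promotion,   -- works for TRUE/FALSE values; map other values explicitly if needed
  usp_product,
  cluster,
  audience,
  SAFE_CAST(publication_date AS DATE) AS publication_date
  -- Use PARSE_DATE('%d/%m/%Y', publication_date) instead if your sheet
  -- stores dates in another format.
FROM
  `your_project_name.content_calendar.content_calendar_export`
```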
- Click 🕔SCHEDULE.
- Assign a name (e.g., “hub_import_to_internal_with_types”).
- Set the frequency (e.g., every 12 hours).
- Under Destination for query results, select “☑️ Set a destination table for query results.”
- In the Dataset field, enter the name of the dataset you created above. From the dropdown, click that dataset.
- Enter the table ID of the internal table (e.g., "content_calendar_import").
- Select Overwrite table.
- Set the correct location (same as the dataset).
- Click SAVE.
HubSpot
If you're on HubSpot Enterprise, you can use the native BigQuery connector. Instructions are here: Connect HubSpot and Google BigQuery (BETA).
Otherwise, use a third-party connector (like Rivery or Airbyte) or a custom API connection.
Below, we offer general advice on importing data to BigQuery via API.
Any API connection (HubSpot, Airtable, Notion, etc.)
Every API is different, but there are some reliable ways to connect external data to BigQuery safely and efficiently:
- BigQuery Storage Write API. This allows for both data streaming and batch loading, meaning you can send data continuously or query your API on a schedule and write the resulting data to a table in one go.
Batch loading is often the simpler and more practical option for most use cases. (For attribution reporting, daily or hourly batches also give you more than enough granularity.) The process is relatively straightforward:
- CreateWriteStream creates one or more streams, which will house the data until you’re ready to commit it to BigQuery.
- AppendRows is called in a loop to write batches of records, followed by FinalizeWriteStream.
- BatchCommitWriteStreams commits all streams to BigQuery using a JSON object that specifies the destinationTable (“parent”) and an array of streams (“writeStreams”).
For instructions and example code, see: Batch load data using the Storage Write API | BigQuery | Google Cloud.
- Apps Script BigQuery Service. This enables you to read from and write to BigQuery via Google Apps Script. You can also manage projects, create new tables, and execute queries, making this tool useful for connecting external APIs to BigQuery in Google Workspace.
For instructions and example code, see: BigQuery Service | Apps Script | Google for Developers.
- BigQuery Data Transfer Service. This enables you to schedule and manage data movement into BigQuery from Cloud Storage and other supported data services (including Google Ads, Salesforce, YouTube, Google Play, and Campaign Manager).
Using this method, you would create a CSV file of the data to be loaded into BigQuery, save it in Cloud Storage or another service, and schedule a data transfer.
If the data source supports exporting data as a CSV to Cloud Storage, you can bypass using the API and set up the entire data transfer without writing a single line of code.
Otherwise, retrieve the data via API, create a CSV, and upload it to Cloud Storage, then invoke the Data Transfer Service. You can do this via Cloud Run functions on a schedule or in response to changes in Cloud Storage data.
For instructions and example code, see: What is BigQuery Data Transfer Service? | Google Cloud.
Recommendations
Here are some basic recommendations for dealing with data from an external API:
- Data formats. Most API responses will be served as JSON, which you can send to BigQuery in the same format via the Storage Write API (using, for instance, the Node.js library or the JsonStreamWriter object from the Java library). The Apps Script service also accepts data in the form of JSON. However, if you’re using the Data Transfer Service, you’ll likely need to convert your JSON to CSV.
- Rate limiting. Most APIs have rate limits that control how many calls you can make within a given timeframe and how much data you can transfer at once. Some APIs support batch processing, while others use pagination (breaking large datasets into smaller chunks). Consider your data volume to determine whether it can be transferred in one go or needs multiple calls.
- Data retrieval. There are many ways you can call an API and retrieve data. Google's Apps Script and Cloud Run functions offer two simple solutions you can schedule to run on a timer or in response to various triggers. Otherwise, you can use a local or third-party solution.
- Streamlining data. Standardise dates, timestamps, and other data types for easy ingestion into BigQuery. Dates can be especially finicky to work with, and not all APIs return them in the same format: some return Unix epoch timestamps (in seconds or milliseconds), while others return a string that needs to be parsed before being passed on (see the sketch after this list). You may also need to flatten JSON responses to simplify querying downstream.
- Overwrite or append rows. Depending upon the data source, you may overwrite the entire BigQuery table each time you retrieve new data from the API or simply append rows. Overwriting makes most sense when source data rows change (e.g., after you update a status in Airtable or edit a Notion table). Appending makes most sense when new rows (e.g., user activity or events) are added to the source data.
- Error handling. Access errors, rate limiting, and unexpected data types—errors abound when retrieving large amounts of data from an API. Ensure your process can handle errors as they occur, retrying as necessary and supplying appropriate logs.
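As a concrete illustration of the date point above, here is a minimal BigQuery sketch that normalises a Unix-epoch column and a text date column. The table and column names are hypothetical and exist only for illustration.

```sql
-- Normalise mixed date representations into proper TIMESTAMP/DATE columns.
-- `api_staging.raw_events`, `created_epoch_seconds`, and `published_text`
-- are hypothetical names.
SELECT
  TIMESTAMP_SECONDS(created_epoch_seconds) AS created_at,          -- Unix epoch (seconds) -> TIMESTAMP
  SAFE.PARSE_DATE('%d/%m/%Y', published_text) AS published_date    -- e.g. "31/12/2024" -> DATE, NULL if it can't be parsed
FROM
  `your_project_name.api_staging.raw_events`
```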
5. Join Data in BigQuery (With Example SQL Queries)
Once you’ve loaded data in BigQuery, the next step is to join it for visualisation and analysis in Looker Studio (or another tool).
This is often the trickiest part of the process and varies by project. It’s important to test your queries as you build them to make sure the results are accurate. With some platforms (especially GA4), you may need to fine-tune to get the expected figures.
The key to joining datasets is having a shared identifier. For organic content, this is often the page URL. For emails or newsletters, however, you’ll likely need a different approach.
At Eleven Writing, we assign a unique numerical ID to each content piece. That ID is included in metadata passed through platforms like email tools (e.g., within a campaign name or UTM parameter), allowing us to connect datasets later in BigQuery.
For the example project below, we joined data from two sources: Google Sheets (a content plan) and GA4 (event data). We aimed to see which content strategies and formats drove the most conversions and return on investment (ROI). We also wanted to explore attribution models and different lookback windows.
Given the size and complexity of the data, we couldn’t rely on Looker Studio’s Merge function, and integrating data via the GA4 data layer wasn’t practical. Ultimately, we chose to bring both datasets into BigQuery and joined them there using two scheduled queries:
- Event query. This grabbed new events from the GA4 stream, focusing on events of our choice: product sign-ups, demo form submissions, and paid plan upgrades. For each event, this query also collected page views in the same session up to the moment of conversion. We applied a hybrid attribution model (blending first-touch and linear models): the first page view in the session was marked as a “conversion,” and additional page views were marked as “assisted conversions.”
- Join query. This joined event data with data from the content tracking sheet, which included strategic metadata like category, subcategory, author, and products or services promoted. We also included the URL of the Google Doc draft and Jira ticket for easy reference in Looker Studio, alongside the cost of the article’s production, which we use to calculate ROI.
This approach gave us a flexible, reliable dataset to assess content performance, and the queries cost just a few dollars a month to run several times a day.
SQL queries
Query I: Append new events to event attribution table
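The exact query depends on your property and the events you track, so what follows is a simplified sketch rather than the production version. It assumes the standard GA4 export dataset (analytics_XXXXXX.events_*) and hypothetical conversion event names; a real version would also deduplicate against rows already appended to the destination table.

```sql
-- Simplified sketch: collect recent conversions and the page views that led to
-- them, labelling the first page view in the session as the "conversion" and
-- later ones as "assisted conversions". Runs as a scheduled query that appends
-- to the event attribution table.
WITH conversions AS (
  SELECT
    user_pseudo_id,
    (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS session_id,
    event_name,
    event_timestamp
  FROM `your_project_name.analytics_XXXXXX.events_*`
  WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
    AND event_name IN ('sign_up', 'demo_form_submit', 'plan_upgrade')  -- hypothetical event names
),
page_views AS (
  SELECT
    user_pseudo_id,
    (SELECT value.int_value FROM UNNEST(event_params) WHERE key = 'ga_session_id') AS session_id,
    (SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_location') AS page_location,
    event_timestamp
  FROM `your_project_name.analytics_XXXXXX.events_*`
  WHERE _TABLE_SUFFIX >= FORMAT_DATE('%Y%m%d', DATE_SUB(CURRENT_DATE(), INTERVAL 3 DAY))
    AND event_name = 'page_view'
)
SELECT
  c.event_name,
  TIMESTAMP_MICROS(c.event_timestamp) AS conversion_timestamp,
  p.page_location,
  CASE
    WHEN ROW_NUMBER() OVER (
      PARTITION BY c.user_pseudo_id, c.session_id, c.event_name, c.event_timestamp
      ORDER BY p.event_timestamp
    ) = 1 THEN 'conversion'
    ELSE 'assisted conversion'
  END AS attribution_type
FROM conversions AS c
JOIN page_views AS p
  ON p.user_pseudo_id = c.user_pseudo_id
  AND p.session_id = c.session_id
  AND p.event_timestamp <= c.event_timestamp
```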
Query II: Join with dimensions and metadata from content hub
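Again, this is a hedged sketch rather than the full production query. It assumes the content calendar table includes a column holding each article's URL path (called page_path here) plus a production_cost column; neither appears in the example schema above, so adapt the join key and columns to whatever shared identifier and metadata you use.

```sql
-- Simplified sketch: enrich attributed events with strategic metadata from the
-- content calendar, joining on the page URL path.
SELECT
  e.event_name,
  e.conversion_timestamp,
  e.attribution_type,
  e.page_location,
  c.title,
  c.cluster,
  c.audience,
  c.author,
  c.campaign,
  c.usp_product,
  c.publication_date,
  c.production_cost          -- hypothetical column, used for ROI calculations
FROM `your_project_name.attribution.event_attribution` AS e   -- destination table of Query I
LEFT JOIN `your_project_name.content_calendar.content_calendar_import` AS c
  -- Strip the protocol, domain, query string, and fragment before matching.
  ON REGEXP_EXTRACT(e.page_location, r'^https?://[^/]+(/[^?#]*)') = c.page_path  -- hypothetical column
```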
6. Import Data to Looker Studio
With all data aggregated into a single table, it’s time to import the relevant data into Looker Studio for visualisation and further analysis.
BigQuery does all the heavy lifting here, making the report significantly smoother and faster than using Google Analytics data directly or creating a Merge in Looker Studio.
Unsurprisingly, Looker Studio makes it easy to import data from BigQuery. Just click Add a Data Source and select BigQuery.

Then, select your Project, Dataset, and Table. If you don't see your project, ensure you're logged into the correct Google account with access to the BigQuery project.
You can now build tables, charts, and graphs with your data.
Level Up Your Reporting With Eleven Writing
With an attribution dashboard, you can clearly tie content to impact. It makes it easy to see which content efforts yield the best results, enabling you to communicate ROI to stakeholders and better distribute your resources in the future.
At Eleven Writing, attribution reporting and modelling are our bread and butter. We’ve helped many partners surface insights and prove the value of their content strategies.
Want a hand getting started or refining your attribution approach? Download our free eBook, Introduction to Multivariate Attribution for Content Marketing, or get in touch for a no-obligation consultation.