
Latest Updates on Google Data Analytics (March 2024)

The highlights of the updates on BigQuery, Looker Studio, Google Analytics (GA) & Google Tag Manager (GTM). By Alexander Junke

In this blog post, I want to summarize the new releases for the Google tools that we use daily at datadice. I will give an overview of the new features in BigQuery, Dataform, Looker Studio, Google Analytics and Google Tag Manager, focus on the releases that I consider the most important ones, and also mention some other changes that were made.

If you want to take a closer look, here you can find the Release Notes for BigQuery, Dataform, Looker Studio, Google Analytics & Google Tag Manager.

BigQuery

New Metadata Queries

There are two new Metadata queries available.

Cache Refresh: 

To refresh the metadata cache of BigLake or object tables, you can use the “BQ.REFRESH_EXTERNAL_METADATA_CACHE” system procedure. You run it with a CALL statement, as in the following example:

CALL BQ.REFRESH_EXTERNAL_METADATA_CACHE('project_name.dataset_name.table_name');

Storage Write API ingestion:

The new metadata views “INFORMATION_SCHEMA.WRITE_API_TIMELINE*” contain Storage Write API ingestion data at a per-minute level.

One row represents the ingestion into a table for one minute, broken down by stream type and error code. Example query:

SELECT
  start_timestamp,
  stream_type,
  SUM(total_requests) AS num_failed_requests
FROM `region-eu`.INFORMATION_SCHEMA.WRITE_API_TIMELINE
WHERE error_code != 'OK'
  AND start_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 30 MINUTE)
GROUP BY 1, 2
ORDER BY start_timestamp DESC

Quicker scheduled queries

A nice and easy change. Previously, the shortest interval for a scheduled query was every 15 minutes. Now it is possible to schedule updates every 5 minutes.

Help me code tool

We will publish a separate blog post about this tool in the next few days.

It is a new tool in the Gemini landscape. You describe in natural language what you want, and a corresponding query gets generated in BigQuery.

To use it, you first need to enable it by clicking on “Code generation tool”.

Then you can click on the pen symbol on the left side, and a prompt field will appear where you can enter your command.

A nice assistant. As with other AI tools like ChatGPT, it will be important to figure out how to write good prompts and how good the results actually are in the end. Whether you really save time with it, only time will tell.
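To give an idea of what this looks like, here is a purely illustrative sketch. The prompt, table and column names below are invented for this example and do not come from the tool itself: a prompt such as “Show the number of orders per day for the last 7 days” could generate a query roughly like this:

-- Hypothetical table and columns, only to illustrate the kind of output
SELECT
  DATE(order_timestamp) AS order_date,
  COUNT(*) AS num_orders
FROM `my_project.shop.orders`
WHERE order_timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY order_date
ORDER BY order_date DESC

You still have to review the generated SQL against your own schema before running it.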

Dataform

Repository access

It is now possible to give all Google-authenticated users access to a repository. The corresponding principal is “allAuthenticatedUsers”.

Example:
If you grant the Viewer role to this principal, all service accounts and users with a Google Account can access this repository.

An important change for publicly available code repositories.

New Beta version

There is a new beta version available with some bigger changes that require action on your side once the version goes live.

Some of the changes:

The workflow_settings.yaml replaces the dataform.json (a minimal sketch of such a file follows after this list)

  • The settings are strictly typed
  • The Dataform core version can be written in the workflow_settings.yaml and does not need to be in the package.json

Package installation with @dataform/cli

  • The installation is stateless, which means you do not need a “node_modules” folder in the project and you do not need to use NPM

Warehouse-agnostic compilation output

  • The compilation results from “@dataform/core” will no longer be inserted into the compiled graph; instead, they will be passed to the Dataform execution engine
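To illustrate the first point, here is a minimal sketch of what a workflow_settings.yaml could look like. The project, location and dataset names are placeholders, and the exact set of supported fields may differ once the version is live:

defaultProject: my-gcp-project          # GCP project, previously defaultDatabase in dataform.json
defaultLocation: EU                     # BigQuery location of your datasets
defaultDataset: dataform                # previously defaultSchema in dataform.json
defaultAssertionDataset: dataform_assertions
dataformCoreVersion: 3.0.0              # no longer has to live in package.json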


You should take a look into the changes so that you can react quickly when this version is live.

Looker Studio

No further releases for Looker Studio.

Google Analytics

No mandatory user ID for user-provided data

You can add data to your events at the user level. So far, you had to send a user ID to GA if you collected custom user-provided data. Now GA can receive hashed user-provided data without a user ID if you select the corresponding setting in Google Analytics.

To enable this, go to the “Admin” section of your GA4 property and click on “Data collection”. Then turn on “User-provided data collection” and enable “Collect automatically detected user-provided data”.

It is still recommended to actively send a user ID; only if that is technically not possible should you use the setting mentioned above.

Key events

There is a change in terminology to create a uniform definition across Google Ads and Google Analytics. GA now distinguishes between key events and conversions.

  • Key events: Events which are important for the business
  • Conversions: Events to measure your ad performance

It is a good change because now the meaning of conversions is the same in Google Analytics and Google Ads.

To be clear: if a key event is also used in Google Ads, that key event is a conversion as well.


Google Tag Manager

No further releases for Google Tag Manager.

Upcoming datadice blog posts for this month

  • Power BI  - Coming Soon
  • BigQuery Help me code tool - Coming Soon

Further Links

This post is part of the Google Data Analytics series from datadice and explains the newest features in BigQuery, Data Studio, Google Analytics and Google Tag Manager every month.

Check out our LinkedIn account to get insights into our daily work and important updates about BigQuery, Data Studio and marketing analytics.

We also started our own YouTube channel, where we talk about DWH, BigQuery, Data Studio and many more topics. Check out the channel here.

If you want to learn more about how to use Google Data Studio and take it to the next level in combination with BigQuery, check out our Udemy course here.

If you are looking for help to set up a modern and cost-efficient data warehouse or analytical dashboards, send us an email to hello@datadice.io and we will schedule a call.