Pub/Sub — Push Messages to Cloud Function Endpoint in a different GCP Project

Piyush Bajaj · Published in Searce · 4 min read · Jun 22, 2022


Introduction to Google Cloud Pub/Sub


Google Cloud Pub/Sub is a messaging service that allows apps and services to share event data. It enables safe and highly available communication between independently designed programs by decoupling senders and receivers. Developers typically utilise Google Cloud Pub/Sub to construct asynchronous workflows, distribute event notifications, and stream data from various processes or devices because it provides low-latency/durable messaging.

Your services can communicate with each other with latencies on the order of 100 milliseconds using Pub/Sub.

You can use Pub/Sub to establish systems of event producers and consumers, referred to as publishers and subscribers. Instead of using synchronous remote procedure calls (RPCs), publishers communicate with subscribers asynchronously by broadcasting events.

Publishers transmit events to Pub/Sub topics without regard to how or when they will be processed. Pub/Sub then delivers the events to all services that need to respond to them.

Fig 1: Pub/Sub architectural diagram

What exactly is a push subscription, and when should I use one?

With push subscriptions, Pub/Sub comes to you: it delivers messages as HTTP POST requests to a preconfigured HTTPS address, also known as a push endpoint.

Unlike a pull subscription, which requires you to actively ask Pub/Sub for messages, a push subscription allows you to define a push endpoint to which Pub/Sub will deliver messages in real time.
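Concretely, each push delivery is a small JSON envelope with the payload base64-encoded in `message.data`. The handler below is a minimal stdlib-only sketch of how an endpoint might decode such a request (the sample message ID and subscription name are made up for illustration):

```python
import base64
import json

def handle_push(request_body: str) -> str:
    """Parse a Pub/Sub push request body and return the decoded payload.

    Pub/Sub wraps each message in a JSON envelope; the payload itself
    arrives base64-encoded in the `message.data` field.
    """
    envelope = json.loads(request_body)
    message = envelope["message"]
    return base64.b64decode(message.get("data", "")).decode("utf-8")

# Simulate the POST body Pub/Sub would send to the push endpoint:
body = json.dumps({
    "message": {
        "data": base64.b64encode(b"hello from Project-A").decode("utf-8"),
        "messageId": "1234567890",
    },
    "subscription": "projects/project-a/subscriptions/push-to-demo-function",
})
print(handle_push(body))  # -> hello from Project-A
```

In a real Cloud Function, `request_body` would come from the incoming HTTP request object rather than being constructed by hand.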

Fig 2: Push and Pull subscription

Business Use-Case:

Consider a scenario where Pub/Sub topics are created, and messages are published, in Project-A. This project is managed by third-party vendors or has restricted access because of its security setup.

There is another project, Project-B, which needs this topic data in real time so that it can be made available in BigQuery for analytical use cases.

In this case, we have to establish cross-project communication between the Pub/Sub topic in Project-A and a Cloud Function in Project-B.

Solution Architecture:

Fig 3: Solution architecture diagram

For the above business use case, we adopted a strategy based on two service accounts, which lets us establish communication between two different GCP projects.

Data first arrives at the Cloud Function, which can then write it to warehouses such as BigQuery.

Note: Project-A includes Cloud Pub/Sub and Project-B includes Cloud Function.

Solution Steps:

1. In Project-A, create a service account named publish-test with Pub/Sub Publisher access.

2. In Project-B, grant the publish-test service account Cloud Functions Invoker access.

3. Deploy a Cloud Function named Demo_Function in Project-B that uses an HTTP trigger to receive the messages sent by the Pub/Sub topic in Project-A, and attach the default service account to it.

4. Create a Pub/Sub topic named Project_A_Topic in Project-A with a push subscription that uses the endpoint URL of Demo_Function (the Cloud Function) and has the publish-test service account attached to it.

5. To verify, publish a message to Project_A_Topic and check the Demo_Function logs to confirm that it receives the message.
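The steps above can be sketched as gcloud commands. The names publish-test, Demo_Function, and Project_A_Topic come from the steps; the project IDs (project-a, project-b), region, runtime, and subscription name are illustrative assumptions — adjust them to your setup:

```shell
# 1. In Project-A: create a service account with Pub/Sub Publisher access
gcloud iam service-accounts create publish-test --project=project-a
gcloud projects add-iam-policy-binding project-a \
  --member="serviceAccount:publish-test@project-a.iam.gserviceaccount.com" \
  --role="roles/pubsub.publisher"

# 2. In Project-B: grant the same service account Cloud Functions Invoker access
gcloud projects add-iam-policy-binding project-b \
  --member="serviceAccount:publish-test@project-a.iam.gserviceaccount.com" \
  --role="roles/cloudfunctions.invoker"

# 3. Deploy the HTTP-triggered function in Project-B (runtime is illustrative)
gcloud functions deploy Demo_Function \
  --project=project-b --region=us-central1 \
  --runtime=python310 --trigger-http --no-allow-unauthenticated

# 4. In Project-A: create the topic and an authenticated push subscription
gcloud pubsub topics create Project_A_Topic --project=project-a
gcloud pubsub subscriptions create push-to-demo-function \
  --project=project-a \
  --topic=Project_A_Topic \
  --push-endpoint="https://us-central1-project-b.cloudfunctions.net/Demo_Function" \
  --push-auth-service-account="publish-test@project-a.iam.gserviceaccount.com"

# 5. Verify: publish a message, then inspect the function's logs
gcloud pubsub topics publish Project_A_Topic --project=project-a \
  --message="hello from Project-A"
```

These commands require the Cloud Functions, Pub/Sub, and IAM APIs to be enabled in the respective projects.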

So, in this way, you can publish data from the client’s Pub/Sub topic (Project-A) to your own GCP Project (Project-B).

Conclusion:
In this blog, we explored how to push messages to a Cloud Function endpoint in a different GCP project.

Thank you so much for reading this blog, Happy deployments☁️😄.

I would like to give special thanks to Shreya Goel, Manasa Kallakuri, and Karan Kaushik for their contributions and for motivating me to write this blog :)
