The <TL;DR>

Currently, you can’t deploy a background Cloud Function that needs to be triggered by a Cloud Pub/Sub topic in another Google Cloud project. One option for working around this is a Cloud Dataflow streaming pipeline. It’s not as complicated as it sounds, I promise. Read on to see why not.

The nitty-gritty

You can find where this limitation is somewhat documented (I think the docs should make it clearer):

“Cloud Functions can be triggered by messages published to Cloud Pub/Sub topics in the *same project* as the function.”

It’s a bit of a silly limitation of the platform to be honest, and one which can trip developers up pretty quickly if they’re not aware of it. Why? Because my dear friends, in the enterprise it’s commonplace to use different Google Cloud projects across the organisation (e.g. finance, marketing, data-science-tomfoolery etc.), while still retaining the ability to broadcast notifications to other teams when something interesting has happened, and to have Cloud Functions trigger on those events.

Examples of this include downstream dependency workflows, or informing some other team(s) that something wonderful has happened, like a file has been processed from GCS, and loaded into BigQuery. As a side note, if only there was a way to have BigQuery events as a trigger to Cloud Functions, eh? ;)

There are a few workarounds to this problem, but using a Cloud Dataflow pipeline in streaming mode to bridge the gap between projects is by far the best of a bad bunch of hacks, in my honest opinion. No, really, it is. I promise.

See here and here in the Google docs for more info on this gnarly limitation.

Oh, won’t somebody please think of the enterprise!

I’ve been working on Google Cloud for almost seven years now. When I started, it was just App Engine and BigQuery. Yup, that’s all there was folks. They were gloriously simple times. These days, there’s a plethora of tools and services on the stack that you can use to build your solutions - and to get lost in. It’s a double-edged sword sometimes.

It’s no secret that working with Google Cloud in an enterprise environment can sometimes have you pulling your hair out (I’m looking in your direction, VPC service controls), and hurling your machine out of a 10th-floor window in a fit of rage. We’ve come across this particular limitation with Cloud Functions a few times now, and so I thought it was high time that I wrote a quick post about it, in the hope it will help others, and for posterity. It probably won’t, because I’m not sure anyone reads my drivel anyway.

As I’ve mentioned above in the <TL;DR>, you cannot currently trigger a Cloud Function from a Cloud Pub/Sub topic in another project. I do know the boffins over at Google are working on fixing this, and knowing my luck they’ll release it just after I publish this blog post, making me look stupid (once again), but until then you’re going to need some kind of workaround *cough*, hacks, *cough* to tide you over.

A couple of options

The first thing that developers naively think of doing is deploying the Cloud Function with an HTTP trigger instead, which gives them a public HTTP endpoint by default*. Then, instead of a pull subscription on the Pub/Sub topic, they create a push subscription, and paste their shiny new Cloud Function endpoint into its configuration. Now, although this will of course work, it’s not going to fly with most security teams.

_* see below :)_
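
For the curious, wiring that up is only a few lines with the google-cloud-pubsub Python client. This is just a rough sketch of the push-subscription approach; all the project, topic, subscription and endpoint names below are made up:

```python
# Rough sketch only: point a push subscription in the other team's project
# straight at an HTTP-triggered Cloud Function's public endpoint.
# All project/topic/function names are placeholders.
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()

topic_path = subscriber.topic_path("their-project", "interesting-events")
subscription_path = subscriber.subscription_path(
    "their-project", "push-to-our-function"
)

push_config = pubsub_v1.types.PushConfig(
    push_endpoint="https://europe-west1-our-project.cloudfunctions.net/handle_event"
)

subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "push_config": push_config,
    }
)
```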

That said, you can (just because you can, doesn’t mean you should) hand-roll some basic authorization around the Cloud Function to lock it down (see here and here), but that’s just getting into the realms of things we shouldn’t be wasting our time doing, or worrying about, as developers. It even brings CORS into play, and that’s something I want to avoid getting dragged into at all costs. Ewwww.
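
Just to show the kind of faff I mean, here’s a rough sketch of a hand-rolled check - a shared secret in a header - inside an HTTP-triggered Python function. The environment variable and entry point names are hypothetical, and please don’t read this as a recommendation:

```python
# Hand-rolled auth sketch for an HTTP-triggered Cloud Function (Python).
# PUSH_SECRET is a hypothetical environment variable set at deploy time.
import os

from flask import abort

SHARED_SECRET = os.environ.get("PUSH_SECRET", "")


def handle_event(request):
    """HTTP entry point; rejects callers without the shared secret."""
    token = request.headers.get("Authorization", "")
    if not SHARED_SECRET or token != f"Bearer {SHARED_SECRET}":
        abort(401)

    # Pub/Sub push deliveries wrap the message in a JSON envelope.
    envelope = request.get_json(silent=True) or {}
    message = envelope.get("message", {})
    print(f"Received message {message.get('messageId')}")
    # ... do something useful with the payload here ...
    return ("", 204)
```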

No my friends, using that approach probably isn’t the wisest option.

Update since I started writing this post: you can now lock down Cloud Functions using IAM. Also, after November 1, 2019, Google have said that newly created functions will be private by default, and will only be invocable by authorized clients unless you set a public IAM policy on the function. This ensures functions are not accidentally made public, and also means that unauthorized requests will be rejected without incurring costs. However, I just tested it, and it’s still publicly accessible by default! Also, having a public IP/URL may still give your security team the heebie-jeebies. See here for all the deets.

Another option we looked at was moving the Cloud Function into the source project where the Pub/Sub topic lived, and having the other team deploy it in their project instead. But that’s just super ick, and even worse than having a public Cloud Function, because we’ve now thrown our code over the wall to another team who don’t know anything about it, nor should they. We also looked at getting the source project team to deploy a simple Cloud Function in their project to hang off their topic, and simply push to a topic in our project. But again, it was unnecessarily forcing work onto another team, and tightly coupling them to the problem. That might fly in small orgs, but try doing that in the enterprise!

No, no, no. None of these options would do at all.

Another option

Another option, which we started teasing out over a beer, was using Cloud Dataflow to bridge the gap between the two projects. The idea was relatively simple: spin up a trivial Dataflow pipeline in streaming mode, and subscribe it to the topic in the project we needed to hook into. Unlike Cloud Functions, you can go cross-project with Dataflow. High five!

That Dataflow pipeline would then simply act as a proxy for Pub/Sub, and write the message(s) it receives from the originating topic to another topic in our project. Finally, for the last piece of the puzzle, we could deploy the Cloud Function as a private/background function triggered by the topic in our project. Here’s the conga line:

event/message -> Pub/Sub topic (their project) -> Cloud Dataflow -> Pub/Sub topic (our project) -> private Cloud Function
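
To make that a bit more concrete, here’s a minimal sketch of what the proxy pipeline could look like using the Apache Beam Python SDK. The project, topic, region, and bucket names are all placeholders, and you’d tweak the pipeline options to suit:

```python
# Minimal "Pub/Sub proxy" pipeline: read from their topic, write to ours.
# All project/topic/bucket names below are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


def run():
    options = PipelineOptions(
        project="our-project",
        runner="DataflowRunner",
        region="europe-west1",
        temp_location="gs://our-bucket/tmp",
        machine_type="n1-standard-1",  # it's only a proxy, keep it small
    )
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Subscribe (cross-project) to the other team's topic.
            | "ReadFromTheirTopic" >> beam.io.ReadFromPubSub(
                topic="projects/their-project/topics/interesting-events"
            )
            # Relay the raw message bytes to the topic in our project.
            | "WriteToOurTopic" >> beam.io.WriteToPubSub(
                topic="projects/our-project/topics/interesting-events"
            )
        )


if __name__ == "__main__":
    run()
```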

We tested it, and it worked well. Of course, the downside to this approach is that we’ve now introduced two new components into the conga line (Dataflow and a new Pub/Sub topic). But both are trivial, minimal code, and easy to maintain. Also, once Google allow cross-project subscriptions on Cloud Functions, the refactoring effort will be minimal, because we’ll simply need to remove Dataflow, delete the topic, and redeploy the Cloud Function to trigger from the cross-project topic. Easy!
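
For completeness, the receiving end is then just a bog-standard background function hanging off the topic in our own project (the entry point name below is a placeholder), deployed via gcloud functions deploy with a --trigger-topic flag, so no public endpoint ever gets exposed:

```python
# Background (private) Cloud Function triggered by the Pub/Sub topic in
# our own project. Entry point name is a placeholder.
import base64


def handle_event(event, context):
    """Pub/Sub-triggered entry point: event carries the message payload."""
    payload = ""
    if "data" in event:
        payload = base64.b64decode(event["data"]).decode("utf-8")
    print(f"Received message {context.event_id}: {payload}")
    # ... kick off the downstream workflow here ...
```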

The biggest disadvantage we see with this approach is cost. You need to run the Dataflow pipeline in streaming mode, which means running it 24/7. But, because it’s just acting as a proxy, you should only need a small instance (e.g. n1-standard-1), which keeps costs relatively low. The way we reasoned about this decision was to weigh how much of our own time we’d need to invest to get something working and deployed without Dataflow against the cost of running the pipeline, and ask whether the pipeline would be more or less expensive.

It was the latter.

Wrapping up

Hopefully this will be useful to others who stumble into the same pitfall. It’s not an ideal solution, but using a Cloud Dataflow pipeline to bridge the gap works well, and will also scale. I’m sure there are more ways of doing this, and I’d love to hear them. Ping me on Twitter if you have another idea.

Fingers crossed we’ll be able to deploy Cloud Functions to trigger off a Pub/Sub topic in another project soon. Otherwise, there will be more computers harmed by being flung out of windows by developers. And, nobody wants that.