Introduction
Cloudflare Pages (see their docs) is a deployment tool that you link to your GitHub or GitLab repository; it then automatically builds and deploys your website on every push.
🤷 Static Sites
Pages works reasonably well for static websites. In fact, this website (built using Docusaurus) is a great example of a good use of Pages: when a change is pushed to the main repository, the command `npm run build` generates a folder of static HTML, CSS, images, and so on, and all those assets are "deployed".
Behind the scenes, Pages is actually pushing all the assets up to KV, and deploying a Worker that serves those assets back out.
You could accomplish the same thing as Pages by setting up continuous integration on your repository (e.g. GitHub Actions workflows or similar) and using Workers Sites or your own hand-rolled tooling to sync assets to KV and serve them. But if you're just running a content site, like this one, Cloudflare Pages wraps all that up for you in a nice way.
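If you're curious what the DIY version looks like, here's a rough sketch of a Worker that serves pre-uploaded assets out of a KV namespace. The `SITE_ASSETS` binding name and the content-type table are my own placeholders (and the types assume `@cloudflare/workers-types`); you'd also need a CI step that actually syncs your build output into KV.

```ts
// A minimal sketch of serving a static site from KV, assuming a KV namespace
// bound as SITE_ASSETS whose keys are URL paths (e.g. "/index.html").
interface Env {
  SITE_ASSETS: KVNamespace;
}

// Naive content-type lookup; a real setup would cover more extensions.
const TYPES: Record<string, string> = {
  ".html": "text/html;charset=utf-8",
  ".css": "text/css",
  ".js": "application/javascript",
  ".png": "image/png",
  ".svg": "image/svg+xml",
};

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    // Treat directory-style paths ("/") as requests for index.html.
    const path = url.pathname.endsWith("/") ? url.pathname + "index.html" : url.pathname;

    const body = await env.SITE_ASSETS.get(path, { type: "arrayBuffer" });
    if (body === null) {
      return new Response("Not found", { status: 404 });
    }

    const ext = path.slice(path.lastIndexOf("."));
    return new Response(body, {
      headers: { "content-type": TYPES[ext] ?? "application/octet-stream" },
    });
  },
};
```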
There are two big problems with Pages:
The first is the same problem you hit with most automated deployment tools: when a deploy doesn't trigger, it's very hard to figure out why. Sometimes you push commits to your repository and Cloudflare never seems to start a build, and the only recourse is to push more commits and hope one of them triggers a redeploy.
The other problem is that every commit is deployed to a hash-based subdomain, and the old versions aren't automatically deleted. For example, the commit hash e7359143 is the Pages deploy from the initial Docusaurus install. Unless that deploy gets manually deleted, this version will be available to the internet at large forever. You'll need to decide if that matters to you, or build some tooling to use their API to automatically delete those deploys.
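If the lingering deploys bother you, the tooling doesn't have to be much. Here's a rough sketch of a cleanup script against what I believe are the v4 API's Pages deployment endpoints (list and delete). The project name, token variables, and "keep the five newest" policy are all placeholders, and the response field names are best-effort, so check them against the API docs before trusting it. It assumes Node 18+ for `fetch`.

```ts
// Rough sketch: delete all but the newest few deployments for a Pages project.
// Endpoint shapes and field names are best-effort, not verified against the docs.
const API = "https://api.cloudflare.com/client/v4";
const ACCOUNT_ID = process.env.CF_ACCOUNT_ID!;
const TOKEN = process.env.CF_API_TOKEN!;
const PROJECT = "my-pages-project"; // hypothetical project name
const KEEP = 5; // keep the five most recent deployments

const headers = { Authorization: `Bearer ${TOKEN}` };

async function main() {
  const listUrl = `${API}/accounts/${ACCOUNT_ID}/pages/projects/${PROJECT}/deployments`;
  const res = await fetch(listUrl, { headers });
  const { result } = (await res.json()) as {
    result: { id: string; created_on: string }[];
  };

  // Sort newest first, then delete everything past the cutoff.
  result.sort((a, b) => b.created_on.localeCompare(a.created_on));
  for (const deployment of result.slice(KEEP)) {
    await fetch(`${listUrl}/${deployment.id}`, { method: "DELETE", headers });
    console.log(`deleted ${deployment.id}`);
  }
}

main().catch(console.error);
```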
👎 Dynamic Sites
Pages lets you add dynamic endpoints using "Pages Functions" (currently in beta), which are essentially path-level overrides that get executed much the same way a Worker would to handle a request.
The general idea is that you add a route as a folder-plus-file path under a designated folder, like `functions/api/todos/[id].js` (TypeScript is supported), and then write something pretty much the same as Worker code to handle requests to `/api/todos/*`. These files are all bundled and built using some not-yet-published tooling (my best guess is that it uses Webpack behind the scenes), so you don't have to bundle things yourself.
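As a made-up example, a `functions/api/todos/[id].ts` file might look something like this. The `onRequestGet` export and `context.params` are the Pages Functions conventions as I understand them (types from `@cloudflare/workers-types`); the in-memory todo map is obviously a stand-in for a real data source.

```ts
// functions/api/todos/[id].ts — handles GET /api/todos/:id
// A minimal sketch; TODOS stands in for a real data source (KV, a database, etc.).
const TODOS: Record<string, { id: string; title: string; done: boolean }> = {
  "1": { id: "1", title: "Write the post", done: true },
};

export const onRequestGet: PagesFunction = async (context) => {
  // The [id] segment of the path shows up in context.params.
  const id = context.params.id as string;
  const todo = TODOS[id];

  if (!todo) {
    return new Response("Not found", { status: 404 });
  }
  return Response.json(todo);
};
```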
You can also bypass the bundling and routing and handle it yourself by creating a file `_worker.js` at the folder root, and that will be called like a normal Worker handler.
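Here's a minimal sketch of what that can look like (written in TypeScript as `_worker.ts`): handle the dynamic paths yourself and fall through to the static assets for everything else. The `ASSETS` binding is what I understand Pages to expose in this advanced mode, but verify that detail against the current docs.

```ts
// _worker.ts — advanced mode: one handler for every request to the site.
// Assumes Pages exposes the static build output via an ASSETS binding.
interface Env {
  ASSETS: { fetch: (request: Request) => Promise<Response> };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // Handle dynamic routes ourselves...
    if (url.pathname.startsWith("/api/")) {
      return Response.json({ path: url.pathname, time: Date.now() });
    }

    // ...and let everything else fall through to the static assets.
    return env.ASSETS.fetch(request);
  },
};
```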
The main problems with using Pages Functions in production are:
- Total lack of logging. If your dynamic endpoints ever run into issues, or you need to audit requests, you'll have to roll your own thing entirely and hope for the best (a rough sketch of one approach follows this list).
- The bundling tooling for code in the `functions` folder is not public or audited, so there are security concerns here. Cloudflare could make this public and auditable, but I'm not holding my breath.
- Since it's still in beta, there are bits and pieces that don't quite work as expected, and without logging you can waste a lot of time trying to track down the root cause of issues.
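For what it's worth, the "roll your own thing" for logging doesn't have to be elaborate. Here's a rough sketch of a `functions/_middleware.ts` that ships a log line to an external collector with `waitUntil` so it doesn't block the response; the collector URL and payload shape are entirely made up.

```ts
// functions/_middleware.ts — hypothetical request logging for all routes.
// LOG_ENDPOINT is a made-up collector URL; swap in whatever you actually run.
const LOG_ENDPOINT = "https://logs.example.com/ingest";

export const onRequest: PagesFunction = async (context) => {
  const started = Date.now();
  const response = await context.next(); // run the matched route or static asset

  // Ship a log line without delaying the response.
  context.waitUntil(
    fetch(LOG_ENDPOINT, {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify({
        url: context.request.url,
        method: context.request.method,
        status: response.status,
        ms: Date.now() - started,
      }),
    })
  );

  return response;
};
```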
If your site is primarily static content and the dynamic portion of the site is very small, using Pages Functions might still be a good choice, just know those issues going in.