VCL problem solving: collect edge data
Our second annual customer summit was a great day of talks and workshops, and we heard from brilliant speakers on the future of the edge, media 2.0, and clever tips for using Fastly’s custom Varnish Configuration Language (VCL).
In that vein, Andrew Betts of the Financial Times discussed using VCL to “solve anything” by pushing his team’s problems to the CDN layer. Andrew helped build the original HTML5 web app for the Financial Times and ran FT Labs for three years. He’s currently working with Nikkei (the FT’s parent company) to rebuild nikkei.com. Below is part one in a series recapping his talk (including a few additional Varnish tips from Doc for good measure). In this first post, we’ll cover the benefits of pushing as much as you can to the CDN layer, and address the problem of collecting valuable data at the edge.
Benefits of edge code
As many of you may know, we’re big fans of extending your application to the edge — by pushing logic and content closer to users, Fastly helps you create faster experiences for your customers. Edge code also helps the Financial Times solve problems more effectively, giving them:
Smarter routing (see the sketch after this list)
Faster authentication — moved higher up in the stack
Higher cache hit ratio, which leads to:
Reduced bandwidth bills
Improved performance for end users
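To make “smarter routing” concrete, here’s a minimal, hypothetical sketch of path-based routing in Fastly VCL, meant to be merged into the standard custom VCL boilerplate. The backend names F_api_origin and F_web_origin and the /api/ prefix are placeholders for illustration, not the FT’s actual configuration:

    sub vcl_recv {
    #FASTLY recv
      # Choose an origin at the edge based on the request path. The backends
      # are assumed to be defined elsewhere on the service.
      if (req.url ~ "^/api/") {
        set req.backend = F_api_origin;
      } else {
        set req.backend = F_web_origin;
      }
      return(lookup);
    }

Making this decision at the edge means the routing change ships with a VCL deploy rather than a change to every origin.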
Andrew’s first experience with coding at the edge was with Edge Side Includes (ESI). “For those of us coding on the web in the late 90s, it’s essentially a copy of server side includes, but moved out to the edge,” he explained. “Basically, ESI lets you serve pages with holes, and the holes would be filled by additional requests to origin for the fragments that would go into those holes.” This works “pretty well,” but you end up with an architecture that couples your back ends to your ESI processing layer so tightly that it’s essentially a big monolith. Although microservices aren’t for everyone, they make it easier to update parts of your application without overhauling the whole thing, and the FT finds that loose coupling works better for their architecture.
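As a rough, hypothetical illustration of those “pages with holes”: the origin serves HTML containing a tag such as <esi:include src="/fragments/header" />, and a line of VCL tells Fastly to fill each hole with a separate fragment request (the URL pattern here is made up):

    sub vcl_fetch {
    #FASTLY fetch
      # Process ESI tags in article pages; each include becomes an extra
      # request to origin for that fragment.
      if (req.url ~ "^/article/") {
        esi;
      }
      return(deliver);
    }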
Previously, the Financial Times used “a lot” of ESI, but now they do everything with VCL, because:
Request and response bodies are opaque, which simplifies things like debugging and caching and improves performance.
Everything at the edge happens in metadata: the bodies are the data itself, and the headers are the metadata.
It’s very restricted: no loops (making it difficult to mess up).
It’s extensible: useful Fastly extensions include GeoIP and crypto (see the sketch after this list).
It’s incredibly powerful when used creatively.
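As a small illustration of that extensibility, geolocation data and crypto functions are available directly in VCL. The header names and the secret below are placeholders for the example, not anything the FT uses:

    sub vcl_recv {
    #FASTLY recv
      # GeoIP: pass the client's country to the origin as a request header.
      set req.http.X-Country = client.geo.country_code;

      # Crypto: sign the URL path so the origin can verify the request came
      # through the CDN. "my-shared-secret" is a placeholder key.
      set req.http.X-Signature = digest.hmac_sha256("my-shared-secret", req.url.path);

      return(lookup);
    }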
Andrew discussed the following pain points his company faces, as well as the solutions enabled with the help of VCL.
Debug headers
Adding debug headers lets you collect request lifecycle information in a single HTTP response header, which is great if you:
Find it hard to understand what path the request is taking through your VCL
Have restarts in your VCL and need to see all the individual backend requests, not just the last one
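Here’s a minimal sketch of one way to build such a header (not necessarily the FT’s exact implementation): accumulate a trail in a request header, which persists across restarts on Fastly, and only copy it onto the response when the client sends a hypothetical X-Request-Debug header.

    sub vcl_deliver {
    #FASTLY deliver
      # Append one entry per pass through vcl_deliver. Because req headers
      # survive restarts, the trail records every attempt, not just the last.
      if (req.http.X-Debug-Trail) {
        set req.http.X-Debug-Trail = req.http.X-Debug-Trail "; ";
      } else {
        set req.http.X-Debug-Trail = "";
      }
      set req.http.X-Debug-Trail = req.http.X-Debug-Trail
        "restarts=" req.restarts " state=" fastly_info.state;

      # Only expose the trail when the client explicitly asks for it.
      if (req.http.X-Request-Debug) {
        set resp.http.X-Debug-Trail = req.http.X-Debug-Trail;
      }

      return(deliver);
    }

Because the response header is only added in vcl_deliver, it never gets stored with the cached object, so the debug data stays out of the cache.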