{"id":1399,"date":"2013-04-18T16:24:10","date_gmt":"2013-04-18T14:24:10","guid":{"rendered":"http:\/\/www.alkannoide.com\/?p=1399"},"modified":"2013-06-14T17:06:38","modified_gmt":"2013-06-14T15:06:38","slug":"mistserver-optimize-the-http-delivery-via-caching","status":"publish","type":"post","link":"https:\/\/www.alkannoide.com\/2013\/04\/18\/mistserver-optimize-the-http-delivery-via-caching\/","title":{"rendered":"Mistserver – Optimize the HTTP delivery via caching"},"content":{"rendered":"

In the previous post, we looked at the new features of MistServer (version 1.1).<\/a>
\n
\"varnish\"<\/a>Today, I present an idea suggested by my friend Nicolas Weil<\/a>. We talked about content caching for a MistServer-based architecture, especially for HTTP-based formats. We thought about Varnish<\/a>, an HTTP accelerator<\/a>. The idea is to keep in cache the different fragments generated by MistServer. This article will not cover the RTMP or TS parts, since these protocols are not HTTP-based.<\/p>\n

For this test, I run MistServer and Varnish on the same server. Here is the architecture.
\n
\"varnish-mist\"<\/a>
\n<\/p>\n

As you can see, my MistServer HTTP-based protocols listen on port 8081. The operating system is Ubuntu, and I followed this tutorial to install Varnish<\/a>. Then I configured Varnish and declared its backend. Open \/etc\/varnish\/default.vcl<\/em> and update those lines:<\/p>\n

backend default {\r\n    .host = \"127.0.0.1\";\r\n    .port = \"8081\";\r\n}\r\nsub vcl_recv {\r\n    set req.grace = 30s;\r\n    return (lookup);\r\n}\r\nsub vcl_pipe {\r\n    return (pipe);\r\n}\r\nsub vcl_pass {\r\n    return (pass);\r\n}\r\nsub vcl_init {\r\n    return (ok);\r\n}\r\nsub vcl_fini {\r\n    return (ok);\r\n}\r\nsub vcl_deliver {\r\n        if (obj.hits > 0) {\r\n                set resp.http.X-Cache = \"cached\";\r\n        } else {\r\n                set resp.http.X-Cache = \"uncached\";\r\n        }\r\n        return (deliver);\r\n}<\/pre>\n

To help with debugging, I add an X-Cache<\/em> header in vcl_deliver with the values cached – uncached<\/em>. Responses stay in cache for 30 seconds. You can change this value very easily.<\/p>\n

You can optimize the configuration further, but that is not the purpose of this article; you can find plenty of information on the web.<\/p>\n
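Once default.vcl<\/em> is saved, Varnish has to reload it before the change takes effect. A minimal sketch, assuming the standard Ubuntu init script installed by the tutorial above (the service name may differ on your setup; the helper name is mine):

```shell
# Reload the VCL so Varnish picks up the edited /etc/varnish/default.vcl.
# A full restart also works, but it empties the cache, while a reload
# keeps the cached objects.
reload_varnish() {
  sudo service varnish reload && echo "VCL reloaded"
}
```

Run reload_varnish<\/em> after each VCL edit, before testing with curl.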

To check that the configuration works, fetch a piece of content with the curl command line. Be careful: we are not calling MistServer directly but the Varnish cache, so we point to port 80, not 8081.<\/p>\n

$ curl -I http:\/\/ip_address\/dynamic\/mystream\/manifest.f4m<\/pre>\n

Check the HTTP headers:<\/p>\n

HTTP\/1.1 200 OK\r\nCache-Control: no-cache\r\nContent-Type: text\/xml\r\nX-UID: 0f4405cc8b4b43312a8f35d7ba0b043e_sintel_dynamic\r\nContent-Length: 815\r\nAccept-Ranges: bytes\r\nDate: Thu, 11 Apr 2013 14:30:19 GMT\r\nAge: 0\r\nConnection: keep-alive\r\nX-Cache: uncached<\/pre>\n

You can see the uncached value on the last line.
\nIf you make a new call before the 30-second cache lifetime expires, the content is served from the Varnish cache and MistServer is not called:<\/p>\n

HTTP\/1.1 200 OK\r\nCache-Control: no-cache\r\nContent-Type: text\/xml\r\nX-UID: 0f4405cc8b4b43312a8f35d7ba0b043e_sintel_dynamic\r\nContent-Length: 815\r\nAccept-Ranges: bytes\r\nDate: Thu, 11 Apr 2013 14:30:21 GMT\r\nAge: 2\r\nConnection: keep-alive\r\nX-Cache: cached<\/pre>\n

This is the magic of HTTP caching! \ud83d\ude09 MistServer is not hit when the content is cached by Varnish.<\/p>\n
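To avoid reading the full header dump by eye, the X-Cache value can be extracted directly. A small sketch; the helper name is mine, and it relies on the X-Cache header we added in vcl_deliver:

```shell
# extract_xcache: read raw HTTP response headers on stdin and print the
# value of the X-Cache header ("cached" or "uncached").
extract_xcache() {
  awk -F': ' 'tolower($1) == "x-cache" { print $2 }' | tr -d '\r'
}

# Usage against a live setup (ip_address and mystream are placeholders):
#   curl -sI http://ip_address/dynamic/mystream/manifest.f4m | extract_xcache
```

Calling it twice in a row within the cache lifetime should print uncached<\/em> then cached<\/em>.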

You can also use the varnishlog tool to see whether Varnish calls the backend or not.<\/p>\n
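varnishstat gives the aggregate view behind varnishlog's per-request output. A sketch that turns the Varnish 3 counter names cache_hit<\/em> and cache_miss<\/em> into a hit ratio (the helper name is mine):

```shell
# hit_ratio: read "varnishstat -1" style output on stdin and print the
# cache hit ratio, based on the Varnish 3 counters cache_hit / cache_miss.
hit_ratio() {
  awk '$1 == "cache_hit"  { h = $2 }
       $1 == "cache_miss" { m = $2 }
       END { if (h + m > 0) printf "%.2f\n", h / (h + m); else print "0.00" }'
}

# Usage on the Varnish host:
#   varnishstat -1 | hit_ratio
```

A ratio close to 1.00 means almost every request is served without touching MistServer.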

If you want to use this kind of configuration in production, you can apply the following architecture: multiple Varnish caches (called edges), on separate physical servers, in front of MistServer (called the origin).<\/p>\n

\"varnish_mist_multiple\"<\/a><\/p>\n

You can divide the load on the origin server by a large factor; Varnish was created precisely for this kind of usage.<\/p>\n
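On each edge, the only real change from the single-server setup is that the backend points to the origin instead of localhost. A sketch of an edge's default.vcl<\/em> backend section, in Varnish 3 syntax; the origin address and the probe URL are placeholders to adapt to your own setup:

```vcl
backend origin {
    .host = "origin_ip_address";   # the MistServer origin (placeholder)
    .port = "8081";
    .probe = {
        .url = "/crossdomain.xml"; # any cheap URL the origin answers (placeholder)
        .interval = 5s;
        .timeout = 1s;
        .window = 5;
        .threshold = 3;
    }
}
```

The health probe lets an edge stop sending traffic to an origin that goes down, which matters once several edges depend on a single origin.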

Finally, you can apply this kind of configuration to both VoD and live streaming. For VoD, the cache lifetime can exceed 10 minutes; for live, I advise a short cache lifetime, since live manifests and fragments change constantly.<\/p>\n
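In VCL, this split can be expressed by overriding the TTL per URL at fetch time. A sketch in Varnish 3 syntax; the \/vod\/ path pattern is hypothetical and must be adjusted to match your own stream URLs:

```vcl
sub vcl_fetch {
    # Hypothetical URL layout: adapt the pattern to your own streams.
    if (req.url ~ "^/vod/") {
        set beresp.ttl = 10m;  # VoD fragments do not change once generated
    } else {
        set beresp.ttl = 2s;   # live manifests and fragments change constantly
    }
    return (deliver);
}
```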

Have fun !<\/p>\n","protected":false},"excerpt":{"rendered":"

In the previous post, we looked at the new features of MistServer (version 1.1). Today, I present an idea suggested by my friend Nicolas Weil. We talked about content caching for a MistServer-based architecture, especially for HTTP-based formats. We thought about Varnish, an HTTP accelerator. The idea is to keep in cache […]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[39,69,46],"tags":[88,86,85,87],"_links":{"self":[{"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/posts\/1399"}],"collection":[{"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/comments?post=1399"}],"version-history":[{"count":28,"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/posts\/1399\/revisions"}],"predecessor-version":[{"id":1525,"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/posts\/1399\/revisions\/1525"}],"wp:attachment":[{"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/media?parent=1399"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/categories?post=1399"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.alkannoide.com\/wp-json\/wp\/v2\/tags?post=1399"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}