Web server configuration


Revision as of 17:50, 22 December 2017

Software

Our http(s) content is served using the following web servers:

  1. Nginx
  2. Varnish
  3. Apache

The traffic flows in order, i.e.: Internet -> nginx -> varnish -> apache

And then back out to the client in the reverse order: apache -> varnish -> nginx -> Internet
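The exact listening ports aren't documented on this page, so as a rough sketch only--the ports below are assumptions, not our actual configuration--the chain typically looks like:

```
Internet --https:443--> nginx   (TLS termination)
nginx    --http-------> varnish (cache, e.g. 127.0.0.1:6081)
varnish  --http-------> apache  (backend, e.g. 127.0.0.1:8080)
```

Only the first hop is encrypted; everything behind nginx stays on the loopback interface as plain http.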

Why??

OSE's principles aim for simplicity--so why aren't we simply using only Apache? Why Nginx as well? And why Varnish? Great question!

Before simplicity, OSE is radically committed to using FLOSS. We're also an ecologically aware, low-budget non-profit with limited financial & computational resources. Keeping this in mind, below are the reasons for the complexity described in this documentation:

  1. Varnish is a cache. It's an essential component that allows us to serve a very high volume of requests across many websites from a single server. Unfortunately, the free version of Varnish does not speak https.
  2. Our biggest site runs MediaWiki. In 2017, Wikipedia chose Varnish as their cache of choice, after experimenting with Squid & Nginx caching. If the biggest user of our biggest site's application backend is using Varnish, we should use it too. And I found good WordPress plugins that play nicely with Varnish as well.
  3. Nginx is our TLS terminator. It listens for our encrypted traffic over https & passes unencrypted http traffic internally on to Varnish.
  4. Nginx has great DoS protection and rate limiting built in.
  5. Because Nginx is distinct from Apache, we can serve a SITE_DOWN vhost to users for a specific domain while devs are still able to iterate & test changes on the backend Apache server.
  6. Most people who use https + Varnish specifically use Nginx to terminate their https. Therefore, there is better documentation & a larger user-support base for this architecture.
  7. History. I (Michael Altfield) came on board in 2017, when only Apache was running. I added https to protect our users' passwords, which were being sent in cleartext, and--in doing so--I had to abandon CloudFront so that a third party didn't hold our private keys. At the time, CF was our CDN cache, so I had to implement a self-hosted cache. I chose Varnish, and then had to add Nginx in front of it for https termination.
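Items 3-5 above can be sketched as a single Nginx server block. This is a minimal illustration under stated assumptions, not our production config: the domain, certificate paths, rate-limit numbers, and the Varnish address (127.0.0.1:6081) are all placeholders.

```nginx
# Hypothetical sketch -- not the production config.
# Assumed: Varnish listens on 127.0.0.1:6081; cert paths & domain are placeholders.

# Rate limiting: at most 10 requests/second per client IP (item 4)
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 443 ssl;
    server_name wiki.example.org;                        # placeholder domain

    ssl_certificate     /etc/ssl/example/fullchain.pem;  # placeholder path
    ssl_certificate_key /etc/ssl/example/privkey.pem;    # placeholder path

    limit_req zone=perip burst=20;

    location / {
        # TLS is terminated here; pass plain http on to Varnish (item 3)
        proxy_pass http://127.0.0.1:6081;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

For item 5, a second vhost serving a static SITE_DOWN page could be swapped in for a given server_name without touching the backend Apache server at all.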
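On the Varnish side (item 1), the piece that points the cache at Apache is a backend definition in VCL. The following is a sketch only--the backend address and port 8080 are assumptions, not taken from this page:

```vcl
# Hypothetical VCL sketch -- backend address/port are assumptions.
vcl 4.0;

backend apache {
    .host = "127.0.0.1";   # Apache on the same server, loopback only
    .port = "8080";        # assumed Apache port behind Varnish
}
```

Keeping Apache bound to loopback means only Varnish (and, in front of it, Nginx) is reachable from the Internet.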

Nginx

Varnish

Apache

See Also