
Update site-defaults.conf #5693

Open
ajnadox wants to merge 1 commit into staging
Conversation

@ajnadox commented Feb 1, 2024

updated security config


bersierj commented Feb 5, 2024

You have to add connect-src 'self' https://api.pwnedpasswords.com/range/ to the Content-Security-Policy so the check whether a password appears in pwnedpasswords still works.
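A minimal sketch of what that could look like in site-defaults.conf (the rest of the header value here is a placeholder, not the PR's actual policy):

#allow the browser-side password check to reach the HIBP range API
add_header Content-Security-Policy "default-src 'self'; connect-src 'self' https://api.pwnedpasswords.com/range/";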


index index.php index.html;

client_max_body_size 0;

gzip on;
#Disable gzip compression to avoid the BREACH attack
Contributor


Is there any source showing this is still relevant in 2024?

Contributor


https://nginx.org/en/docs/http/ngx_http_gzip_module.html:

When using the SSL/TLS protocol, compressed responses may be subject to BREACH attacks.


So it must be something else that mitigates that:

BREACH

The Browser Reconnaissance and Exfiltration via Adaptive Compression of Hypertext (BREACH) vulnerability is very similar to CRIME but BREACH targets HTTP compression, not TLS compression. This attack is possible even if TLS compression is turned off. An attacker forces the victim’s browser to connect to a TLS-enabled third-party website and monitors the traffic between the victim and the server using a man-in-the-middle attack. The BREACH vulnerability is registered in the NIST NVD database as CVE-2013-3587.

A vulnerable web application must satisfy the following conditions:
  • Be served from a server that uses HTTP-level compression
  • Reflect user input in HTTP response bodies
  • Reflect a secret (such as a CSRF token) in HTTP response bodies (therefore values in HTTP headers, such as cookies, are safe from this attack).
Prevention
  • Disable HTTP compression
  • Separate secrets from user input
  • Randomize secrets per request
  • Mask secrets (effectively randomize by XORing with a random secret per request)
  • Protect pages against CSRF
  • Hide the length (by adding random numbers of bytes to responses)
  • Limit the rate of requests

Source: https://www.acunetix.com/blog/articles/tls-vulnerabilities-attacks-final-part/


Maybe the CSRF token in /web is enough to mitigate it?
https://github.com/search?q=repo%3Amailcow%2Fmailcow-dockerized%20CSRF&type=code

But in my opinion the safer way is to turn gzip off, since saving bandwidth isn't the main reason for it anymore, so page loads should only get slightly slower (haven't done any testing).
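A minimal sketch of that change in site-defaults.conf, assuming compression is currently switched on there:

#turn off HTTP-level compression so the BREACH precondition no longer applies
gzip off;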

@milkmaker
Collaborator

This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.

@milkmaker added the "stale" label (Please update the issue with current status, unclear if it's still open/needed.) on Apr 26, 2024
@realizelol
Contributor

Before it gets closed, here's my opinion:

#only supporting strong ciphers
ssl_ciphers ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384;

Is there any source where you got that cipher list from? Another, more up-to-date method would be:

echo "ssl_ciphers \"$(openssl ciphers -s '!aNULL:!eNULL:!LOW:!3DES:!MD5:!EXP:!PSK:!DSS:!RC4:!SEED:!ECDSA:!ADH:!IDEA:!3DES:!CAMELLIA:!AESCCM:!ECDHE-RSA-CHACHA20-POLY1305:!DHE-RSA-CHACHA20-POLY1305:!RSA+AESGCM:!ARIA256-GCM-SHA384:!ARIA128-GCM-SHA256:!SHA1:!SHA256:!SHA384:HIGH@STRENGTH')\";"
ssl_ciphers "TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS_AES_128_GCM_SHA256:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-ARIA256-GCM-SHA384:DHE-RSA-ARIA256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES128-GCM-SHA256:ECDHE-ARIA128-GCM-SHA256:DHE-RSA-ARIA128-GCM-SHA256";

The -s flag selects only supported ciphers, the ! entries exclude vulnerable ones, only HIGH ciphers are kept, and @STRENGTH sorts them by strength.
[Hopefully there is nothing new to exclude, as this exclusion list is about 3 years old.]
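One caveat worth noting (my observation, not something from the PR): with OpenSSL 1.1.1+, the TLS_AES_*/TLS_CHACHA20_* names in that output are TLS 1.3 suites, which nginx's ssl_ciphers does not control; pinning them explicitly would presumably look something like:

#TLS 1.2 and below are set via ssl_ciphers; TLS 1.3 suites go through OpenSSL's Ciphersuites command
ssl_conf_command Ciphersuites TLS_AES_256_GCM_SHA384:TLS_CHACHA20_POLY1305_SHA256:TLS_AES_128_GCM_SHA256;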


#prioritize ChaCha ciphers
ssl_conf_command Options PrioritizeChaCha;

Note: configuring OpenSSL directly might result in unexpected behavior.

Source: https://nginx.org/en/docs/http/ngx_http_ssl_module.html#ssl_conf_command
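A minimal sketch of how that would sit in the config; PrioritizeChaCha only takes effect when the server's cipher order is honoured, so the sketch enables that as well:

#honour the server's cipher order, otherwise PrioritizeChaCha has no effect
ssl_prefer_server_ciphers on;
#prioritize ChaCha ciphers for clients that list them first (typically mobile devices without AES hardware)
ssl_conf_command Options PrioritizeChaCha;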


#includeSubDomains; and preload; if all of your services are using HTTPS
add_header Strict-Transport-Security "max-age=63072000; includeSubDomains; preload";

Set max-age to 31536000 seconds (one year), as that is the recommended value:
2. It is advisable to assign the max-age directive’s value to be greater than 10368000 seconds (120 days) and ideally to 31536000 (one year). Websites should aim to ramp up the max-age value to ensure heightened security for a long duration for the current domain and/or subdomains.
Source: https://blog.qualys.com/vulnerabilities-threat-research/2016/03/28/the-importance-of-a-proper-http-strict-transport-security-implementation-on-your-web-server#hsts-best-practices


includeSubDomains has its side effects, so remove that:

  1. It’s expensive to furnish every subdomain server with a valid CA-signed certificate. If HSTS is enabled, the browser prohibits users from using self-signed certificates for any subdomains.
  2. Insecure resources will fail to load and break pages that would have worked seamlessly over TLS when no HSTS policy was implemented.
    Source: https://blog.qualys.com/vulnerabilities-threat-research/2016/03/28/the-importance-of-a-proper-http-strict-transport-security-implementation-on-your-web-server#use-of-includesubdomains

preload should never be the default; as already mentioned, the domain will be added to a public list, which may increase the attack rate:
Warning: You should ensure that you fully understand the implications of HSTS preloading before you include the directive in your policy and before you submit. It means that your entire domain and all subdomains, including those managed or maintained by third parties, will only work with HTTPS. Preloading should be viewed as a one way ticket. Whilst it is possible to be removed, it can take a long time and you may not be removed from all browsers.
Source:
Example file from Google: https://chromium.googlesource.com/chromium/src/+/ea9dfef649a309a05c3b5c112150485836fbfcc7%5E%21/net/http/transport_security_state_static.json
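Putting those three points together, a sketch of the Strict-Transport-Security header I would set instead (the always flag is my addition so the header is also sent on error responses):

#one year max-age, without includeSubDomains and without preload
add_header Strict-Transport-Security "max-age=31536000" always;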


#X-XSS-Protection with Content Security instead
add_header X-XSS-Protection "0";

Seems to be fine, even though we need to set 'unsafe-inline' for script-src in the Content-Security-Policy to make SOGo work.
Source: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-XSS-Protection [Browsers which accepted the header: Chrome v4-77 + Edge v12-16]


#Cross-Origin Resource, Opener, and Embedder Policies
#add_header Cross-Origin-Resource-Policy same-origin;
#add_header Cross-Origin-Opener-Policy same-origin;
#if you do not use Gravatar with SOGo
#add_header Cross-Origin-Embedder-Policy require-corp;

It has to be:

add_header Cross-Origin-Embedder-Policy "unsafe-none" always;
add_header Cross-Origin-Opener-Policy "same-origin" always;
add_header Cross-Origin-Resource-Policy "same-origin" always;

otherwise you won't be able to load any external embedded graphics in mail.


I don't know how many users will use Gravatar, but in my opinion it shouldn't be enabled by default. So I'd use this as the Content-Security-Policy header:

add_header Content-Security-Policy "default-src 'none'; connect-src 'self' https://api.github.com; font-src 'self' https://fonts.gstatic.com; img-src 'self' data:; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com; frame-ancestors 'none'; upgrade-insecure-requests; block-all-mixed-content; base-uri 'none'";

And yes, gzip should be off; I've been running it that way for 2 months now without any noticeable delay.

@milkmaker removed the "stale" label on May 2, 2024