Configuring Apache and security settings

March 6, 2023

5.1 Integrating with an Apache HTTP Web Server

In some production setups, you will often need to map ports 8181 and 9443 to the default HTTP (80) and HTTPS (443) ports. To do so, you will need to set up an Apache HTTP web server in front of Apache Unomi.

Here is an example configuration using mod_proxy and the DNS name "":

In your Unomi package directory, in <cxs-install-dir>/etc/org.apache.unomi.cluster.cfg:


You will also need to change the contextserver.domain property in the <cxs-install-dir>/etc/org.apache.unomi.web.cfg file.
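For reference, the relevant properties could look something like the following sketch. The property names below are the ones commonly found in Apache Unomi 1.x setups, and the host names are placeholders; verify them against your Unomi version before use:

```
# <cxs-install-dir>/etc/org.apache.unomi.cluster.cfg
# (assumed property names - check your Unomi version)
contextserver.publicAddress=https://unomi.example.com:443
contextserver.internalAddress=http://localhost:8181

# <cxs-install-dir>/etc/org.apache.unomi.web.cfg
contextserver.domain=example.com
```

The public address is what browsers will use (through the Apache front end), while the internal address is used for server-to-server calls inside the firewall.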

Main virtual host config:

<VirtualHost *:80>
        Include /var/www/vhosts/
</VirtualHost>

<IfModule mod_ssl.c>
    <VirtualHost *:443>
        Include /var/www/vhosts/
        SSLEngine on
        SSLCertificateFile /var/www/vhosts/
        SSLCertificateKeyFile /var/www/vhosts/
        SSLCertificateChainFile /var/www/vhosts/
        <FilesMatch "\.(cgi|shtml|phtml|php)$">
            SSLOptions +StdEnvVars
        </FilesMatch>
        <Directory /usr/lib/cgi-bin>
            SSLOptions +StdEnvVars
        </Directory>
        BrowserMatch "MSIE [2-6]" \
            nokeepalive ssl-unclean-shutdown \
            downgrade-1.0 force-response-1.0
        BrowserMatch "MSIE [17-9]" ssl-unclean-shutdown
    </VirtualHost>
</IfModule>


DocumentRoot /var/www/vhosts/
CustomLog /var/log/apache2/ combined

<Directory />
    Options FollowSymLinks
    AllowOverride None
</Directory>

<Directory /var/www/vhosts/>
    Options FollowSymLinks MultiViews
    AllowOverride None
    Order allow,deny
    allow from all
</Directory>

<Location /cxs>
    Order deny,allow
    deny from all
    allow from <jahia-dx-ip-address>
</Location>

RewriteEngine On

# Reject TRACE/TRACK requests
RewriteCond %{REQUEST_METHOD} ^(TRACE|TRACK)
RewriteRule .* - [F]

ProxyPreserveHost On
ProxyPass /server-status !
ProxyPass /robots.txt !

RewriteCond %{HTTP_USER_AGENT} Googlebot [OR]
RewriteCond %{HTTP_USER_AGENT} msnbot [OR]
RewriteCond %{HTTP_USER_AGENT} Slurp
RewriteRule ^.* - [F,L]

ProxyPass / http://localhost:8181/ connectiontimeout=20 timeout=300 ttl=120
ProxyPassReverse / http://localhost:8181/
Warning: make sure the /cxs URL is protected from external access; only the DX server should be able to use it directly.

5.2 Security aspects

5.2.1 Administrator username and password

The Apache Unomi REST API is protected by JAAS authentication using Basic or Digest HTTP auth. By default, the login/password for full administrative access to the REST API is "karaf/karaf". It is strongly recommended that you change the default username and password as soon as possible. This can be done by modifying the following configuration file: <cxs-install-dir>/etc/

adminUserName = adminPassword,_g_:admingroup
_g_\:admingroup = group,admin,manager,viewer,webconsole
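For example, to replace the default credentials, the file could look like the following sketch. The values myadmin and s3cret are placeholders; substitute your own username and a strong password:

```
# Remove the default karaf/karaf entry and add your own credentials
# (myadmin / s3cret are placeholders)
myadmin = s3cret,_g_:admingroup
_g_\:admingroup = group,admin,manager,viewer,webconsole
```

Note that the backslash before the colon in the group entry is required, as the colon is otherwise a key/value separator in Java properties files.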

In-depth details about JAAS security in the CXS Karaf server can be found at:

Warning: do not leave the default karaf/karaf password in the file when putting a system in production!

5.2.2 SSL certificate

The Apache Unomi package is configured with a default SSL certificate. You can change it by following these steps:

1. Replace the existing keystore file <cxs-install-dir>/etc/keystore with your own certificate. See for details.
2. Update the keystore and certificate password in the <cxs-install-dir>/etc/ file:

 = true
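If you need to generate a keystore for testing, the standard JDK keytool utility can create a self-signed one. The following is an illustrative invocation only; the alias, passwords, validity and distinguished name are placeholders that you should adapt (and the store password must match what is configured in the Unomi SSL settings):

```
# Generate a self-signed keystore (all values below are placeholders)
keytool -genkeypair -alias unomi -keyalg RSA -keysize 2048 \
    -validity 365 -keystore keystore \
    -dname "CN=unomi.example.com" \
    -storepass changeit -keypass changeit
```

For production, import a certificate signed by a real CA into the keystore instead of using a self-signed one.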

You should now have SSL set up on Apache Unomi with your own certificate, and you can test it by trying to access Unomi on port 9443.

5.2.3 Securing a production environment

Before going live with a project, you should absolutely read the following sections, which will help you set up a properly secured environment for running Apache Unomi.

Install and configure a firewall (port numbers)

You should set up a firewall around your cluster of Apache Unomi and/or ElasticSearch nodes. If you have an application-level firewall, only the following URLs should be open to the whole world:

  • http://localhost:8181/context.js
  • http://localhost:8181/eventcollector
  • http://localhost:8181/client

All other ports and URLs should not be accessible to the world. For your Apache Unomi client applications (such as the DX Marketing Factory), you will need to make the following ports accessible to the client machine:

  • 8181 - Apache Unomi HTTP port
  • 9443 - Apache Unomi HTTPS port

For your Apache Unomi nodes and for any standalone ElasticSearch nodes, you will need to open the following ports for proper node-to-node communication:
  • 5700 - 5800 Hazelcast cluster protocol
  • 9200 - ElasticSearch REST API
  • 9300 - ElasticSearch TCP transport

Of course, the ports listed here are the default ports configured in each server; you may adjust them if needed in the "Network And HTTP" section of the <elasticsearch-install-dir>/config/elasticsearch.yml file or in the <cxs-install-dir>/etc property files.
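As a quick sanity check of your firewall rules, you can probe the ports above from a client machine. This small Python sketch (not part of Unomi; the host name in the example is a placeholder) simply attempts a TCP connection and reports whether the port is reachable:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds,
    False if it is refused, filtered, or times out."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo: bind a throwaway local listener so the check has something to hit
srv = socket.socket()
srv.bind(("127.0.0.1", 0))          # port 0 = pick any free port
srv.listen(1)
host, port = srv.getsockname()
print(port_open(host, port))        # True: something is listening
srv.close()
print(port_open("127.0.0.1", 1))    # False: nothing listens on port 1
```

From a client machine you would call it with your Unomi host and ports 8181/9443; from the open internet, only 80/443 on the Apache front end should answer.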

Note that if you need a temporary SSL certificate, for a pre-production environment for example, you can generate one using Certbot until a proper one is delivered by IT.

Secure Elasticsearch

When going live, it is recommended to restrict automatic index creation to only the indices created by Apache Unomi (and/or any other applications using indices in the same ES cluster). To do that, add the following line to your <es-install-dir>/config/elasticsearch.yml file:

action.auto_create_index: +context*,-*
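The value is a comma-separated list of index patterns, each optionally prefixed with + (allow) or - (deny), evaluated in order with the first match winning. So "+context*,-*" allows auto-creation of any index starting with "context" and denies everything else. The following Python sketch mimics that matching logic for illustration (the fall-through default when no pattern matches is an assumption here):

```python
from fnmatch import fnmatch

def auto_create_allowed(index_name, setting):
    """Mimic Elasticsearch's action.auto_create_index matching:
    comma-separated glob patterns, optional +/- prefix, first match wins.
    An unprefixed pattern counts as an allow rule."""
    for raw in setting.split(","):
        raw = raw.strip()
        allow = not raw.startswith("-")
        pattern = raw.lstrip("+-")
        if fnmatch(index_name, pattern):
            return allow
    return False  # assumption: deny when nothing matches

print(auto_create_allowed("context-event", "+context*,-*"))  # True
print(auto_create_allowed("logs-2023", "+context*,-*"))      # False
```

If other applications share the ES cluster, add their index prefixes as additional "+" patterns before the final "-*".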

We also recommend following industry best practices for securing ElasticSearch. You may find more valuable recommendations here:

Setup a (SSL) proxy

As an alternative to an application-level firewall, you could also route all traffic to Apache Unomi through a proxy and use it to filter any communication.

A relatively straightforward way to do this is to use an Apache HTTP server in front of the public-facing Apache Unomi endpoint. You could even set up this proxy connection to be SSL-only and proxy to the http://UNOMI_HOST:8181/ port. Configuration could look something like this:

Listen 443
<VirtualHost *:443>
    SSLEngine On
    # Set the path to the SSL certificate
    # Usage: SSLCertificateFile /path/to/cert.pem
    SSLCertificateFile /etc/apache2/ssl/file.pem
    ProxyPass / http://UNOMI_HOST:8181/
    ProxyPassReverse / http://UNOMI_HOST:8181/
</VirtualHost>

This way, all traffic to the endpoint will be secured using SSL and then proxied to Apache Unomi on the HTTP port 8181.

5.2.4 Search robots and crawlers

By default Apache Unomi includes a /robots.txt file with the following content:

User-agent: *
Disallow: /

meaning it disallows search bots and crawlers from accessing the content. On top of that, you can block known search bots on your front-end Apache HTTPD server using the following rewrite rules:

RewriteCond %{HTTP_USER_AGENT} Googlebot [OR]
RewriteCond %{HTTP_USER_AGENT} msnbot [OR]
RewriteCond %{HTTP_USER_AGENT} Slurp
RewriteRule ^.* - [F,L]
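To verify the rules are active, you can send a request with a bot User-Agent and check that it is rejected. The host name below is a placeholder for your front-end server:

```
curl -I -A "Googlebot" http://your-frontend-host/
# expect an HTTP 403 Forbidden response
```

A request with a regular browser User-Agent to the same URL should still be proxied through to Apache Unomi.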