Upgrading to jCustomer 2.x & jExperience 3.x

June 18, 2024

The goal of this document is to detail the process necessary for migrating from jCustomer 1.x to jCustomer 2.x. Before starting the migration, read through this document in its entirety and make sure you understand the multiple operations involved in the upgrade.

jCustomer 2.x introduces significant (breaking) changes, so please:

  • Read all of the instructions on this page carefully, before proceeding with the upgrade process
  • Upgrade jCustomer to 1.5.7+ before applying these instructions
  • Always upgrade to the latest version of jCustomer 2.x

Overview

The following diagram provides an overview of the key migration steps.

jcustomer2x-migration-overview.png

Stage 1: Prepare the migration

Before starting the data migration, a few preparatory steps are required for a successful migration.

During this stage, you will be working with a development (or pre-prod) instance of jCustomer containing a copy of your production data. This is necessary because you will be replaying all of your older events to make sure they are compatible with the new JSON Schemas. At the end of this stage, you will have created all JSON Schemas and all migration scripts, and will be ready to try out a migration to jCustomer 2.x.

jCustomer 2.x brings two major changes: validation of events and changes in the data model.

To prepare the migration, you will need to:

  • Identify if you are using a custom data model, and if you do, create the corresponding JSON schemas
  • Identify if you rely on event properties using dynamic keys and if you do, implement a custom migration script
  • Identify if your custom Kibana dashboards are impacted
  • Identify if your custom modules rely on jExperience, and if they do, update them to follow best practices

1.1 - Prepare your environment

To complete Stage 1 of the migration, aside from your regular development environment, you will need:

  • root access to a jCustomer 1.x instance containing a copy of your production data
  • root access to a jCustomer 2.x instance (freshly started, without data)
  • to be capable of running our custom event checker tool

Do not proceed to the next step until the above conditions are met.

1.2 - Using a custom data model?

With the introduction of JSON Schemas in jCustomer 2.x, any event submitted to a jCustomer 2.x environment must comply with a pre-existing JSON Schema to be accepted and processed. An event is considered part of a "custom data model" if it has either:

  • An event type that is not known to jExperience or Apache Unomi. By default, jCustomer and jExperience are provided with all the schemas needed to support their functionality, but it might be necessary to create additional schemas for any custom events you created. Known event types can be found in the jExperience codebase and the Apache Unomi codebase
  • An event property that is not known to jExperience or Apache Unomi

Note: Custom data models for profiles and sessions do not require any operation. Only events are concerned, as profiles and sessions cannot be created or modified from any public endpoint.

The next step consists of verifying whether your implementation uses a custom data model. Two methods are available to help you identify whether this is the case and whether you require additional JSON schemas or JSON schema extensions:

  • by looking at your code 
  • by looking at your existing data 

1.2.1 - In your code

You should review any integration code added since deploying your jCustomer/jExperience environment. This usually involves checking custom Jahia modules or jCustomer plugins, including any JavaScript code that was added to send events to jCustomer.

For example, if you are sending custom events to jCustomer using the wem.js library, you will need to create custom JSON schemas for those events. This is true even if you just added some custom parameters to existing events, such as the page view event.

1.2.2 - In your existing data

Although the migration process (from jCustomer 1.x to jCustomer 2.x) will handle the migration of existing events, it does not create additional JSON Schemas. Any event received by an environment after its migration will still require a valid JSON Schema to be properly accepted and processed by the jCustomer 2.x server (or else it will be rejected).

To help validate your readiness to migrate to jCustomer 2.x, we built a custom event checker tool that will automatically fetch all of your existing events (from the past 60 days by default), migrate them to the jCustomer 2.x data model, and validate them against the JSON Schemas deployed in jCustomer 2.x. By looking at the generated error messages, you will be able to update your JSON Schemas accordingly and iteratively modify, deploy, and test them until all your events validate properly. You can also run the custom event checker and share the output with our support team.

☁️ As a Cloud customer, we will generate a customized list of events specific to your current configuration using our custom event checker tool. This tool will be executed on your behalf.

1.3 - Create and deploy JSON Schemas

If you identified that you are currently using a custom data model, you will need to create and deploy JSON Schemas. For the time being, it is easier to work directly against the fresh jCustomer 2.x instance you prepared earlier.

At the end of this step, all events issued by your codebase should have valid JSON Schemas. This means that if you were to go to production with those today, with no consideration for your previous data, you would have all the JSON Schemas needed for operating jCustomer 2.x in production.

☁️ As a Cloud customer, if you identified that you are currently using a custom data model, you will also need to execute the curl command below. Replace "localhost" with your environment's domain name.

1.3.1 - Creating and deploying new schemas

Any event submitted to jCustomer 2.x public endpoints will be rejected unless it complies with an existing JSON Schema.

If you are new to JSON Schemas in jCustomer (or Apache Unomi) 2.x, please review the First steps with Apache Unomi page, which contains step-by-step instructions to create your first JSON Schema. Do not go to production until you have created all the JSON Schemas necessary to handle your events.

Here's an example of what a request to create a JSON schema looks like:

curl --location --request POST 'http://localhost:8181/cxs/jsonSchema' \
-u 'karaf:karaf' \
--header 'Content-Type: application/json' \
--data-raw '{
    "$id": "https://vendor.test.com/schemas/json/events/dummy/1-0-0",
    "$schema": "https://json-schema.org/draft/2019-09/schema",
    "self": {
        "vendor": "com.vendor.test",
        "name": "dummy",
        "format": "jsonschema",
        "target": "events",
        "version": "1-0-0"
    },
    "title": "DummyEvent",
    "type": "object",
    "allOf": [
        {
            "$ref": "https://unomi.apache.org/schemas/json/event/1-0-0"
        }
    ],
    "properties": {
        "properties": {
            "$ref": "https://vendor.test.com/schemas/json/events/dummy/properties/1-0-0"
        }
    },
    "unevaluatedProperties": false
}'
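The schema above references a second, separate schema for the event's properties object (the $ref pointing to .../dummy/properties/1-0-0). That companion schema must be created and deployed the same way, via the same /cxs/jsonSchema endpoint. Here is an illustrative sketch of what it could look like; the "workspace" property is only an example placeholder, not a required field:

```json
{
    "$id": "https://vendor.test.com/schemas/json/events/dummy/properties/1-0-0",
    "$schema": "https://json-schema.org/draft/2019-09/schema",
    "self": {
        "vendor": "com.vendor.test",
        "name": "dummyProperties",
        "format": "jsonschema",
        "target": "events",
        "version": "1-0-0"
    },
    "title": "DummyEventProperties",
    "type": "object",
    "properties": {
        "workspace": {
            "type": "string"
        }
    },
    "unevaluatedProperties": false
}
```

Declaring "unevaluatedProperties": false in both schemas is what makes unexpected keys cause a validation failure instead of being silently accepted.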

You can also find information on sending events via jExperience's tracker here.

1.3.2 - Testing your events on your schemas

As previously mentioned, it is advised to verify if your previous events have a matching JSON Schema by using jCustomer Custom Event Checker. In general, you should not proceed with the migration if there are rejected events.

You can also verify events individually by submitting a POST request to an admin endpoint dedicated to schema validation: /cxs/jsonSchema/validateEvent, which will provide details about schema errors (if the event does not match a schema). You can find more information about the new event validation endpoint in the Apache Unomi documentation.
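As a sketch, a validation request could look like the following; adjust the host, port, credentials, and the event body to your environment (the "dummy" event type and its properties are placeholders, not a schema shipped with the product):

```shell
curl --location --request POST 'http://localhost:8181/cxs/jsonSchema/validateEvent' \
-u 'karaf:karaf' \
--header 'Content-Type: application/json' \
--data-raw '{
    "eventType": "dummy",
    "scope": "myScope",
    "properties": {
        "workspace": "default"
    }
}'
```

If the event does not match a deployed schema, the response details which part of the payload failed validation.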

Note: when running your jCustomer instance in debug mode, rejected events will be logged and will point to the portion of the event object not matching the schema, see the "reviewing logs" section below. 

1.3.3 - Packaging your JSON schemas for production

It is possible to package your custom JSON schemas inside a Jahia module, which will then be sent to jCustomer for deployment upon module startup. You can find information on how to do this in our jExperience module packaging documentation.

1.4 - Using dynamic keys for event properties?

If you are using dynamic keys for event properties, you will have to create a migration script to migrate these properties before creating a new JSON Schema. The key of a property is its name; a key is considered dynamic when it can differ between several events of the same event type, as is the case for the form event.

Example: a first form contains a last name and a first name; when an event is sent with the data of this form, the last name and first name are sent through the event properties. A second form contains an address, an age, and a location; when an event is sent with the data of this form, the properties will be address, age, and location. In both cases, the event type will be "form". The migration of form events is already handled by jCustomer: the properties are moved to the new event field named flattenedProperties.

This change has been introduced to avoid mapping explosion in Elasticsearch. The mapping for flattenedProperties is not dynamic, so it avoids issues when there are many different event property keys. The properties field of events will continue to generate dynamic mappings, so if you want to add data in this field, a specific JSON Schema is required to limit the possible data.

If you have the same kind of event as form, you will have to create a migration script to move the dynamic-key properties to the flattenedProperties field. Once the migration is done, you will be able to create the JSON schema for your event type.
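To make the transformation concrete, here is a simplified sketch of a form event before and after migration. The "before"/"after" wrapper and the exact sub-structure under flattenedProperties are illustrative only, not the literal stored documents:

```json
{
    "before": {
        "eventType": "form",
        "properties": {
            "firstName": "Ada",
            "lastName": "Lovelace"
        }
    },
    "after": {
        "eventType": "form",
        "properties": {},
        "flattenedProperties": {
            "fields": {
                "firstName": "Ada",
                "lastName": "Lovelace"
            }
        }
    }
}
```

After migration, Elasticsearch no longer creates a new mapping entry per form field name, since the flattenedProperties mapping is not dynamic.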

1.5 - Create migration scripts

As jCustomer 2.x introduces breaking changes in its data model, it includes a set of migration scripts to transform the existing data from jCustomer 1.5.7+ data model to jCustomer 2.x data model.

However, since jCustomer can only migrate objects it knows about, some custom events you used might require creating migration scripts and injecting them into the migration process.

Before starting its migration, jCustomer will create a list of all of the migration scripts located in its internal codebase and scripts placed by users in the data/migration/scripts directory; it will then sort that list and execute the scripts sequentially. This allows you to specify, by following a naming pattern, in which order your scripts should be executed.
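Because the list is sorted before execution, a naming pattern with an embedded version and priority number controls when each custom script runs. The script names below are hypothetical examples, not files shipped with jCustomer:

```shell
# Hypothetical example: the version and priority number embedded in the
# file name determine execution order, since scripts run in sorted order.
mkdir -p data/migration/scripts
touch data/migration/scripts/migrate-2.0.0-05-myCustomEventType.groovy
touch data/migration/scripts/migrate-2.0.0-10-myOtherCustomEventType.groovy

# Lexicographic order == execution order:
ls data/migration/scripts | sort
```

Here the "-05-" script would run before the "-10-" one, letting you sequence your custom migrations relative to each other.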

The Apache Unomi 2.x documentation contains a dedicated section explaining how to create additional migration scripts.

You can learn more about what was changed in Apache Unomi 2 on the official website.

☁️ If you need to implement a custom migration script as a Cloud customer, please submit it to the Jahia Support Team. They will execute it on your behalf and provide you with the output.

1.5.1 - Implement a custom migration script

If you need to implement your custom migration script, the documentation on the Apache Unomi website is there to help you. Here is an example of a built-in migration Groovy script provided with Apache Unomi from the tools/shell-commands/src/main/resources/META-INF/cxs/migration directory:

MigrationContext context = migrationContext
String esAddress = context.getConfigString("esAddress")
String indexPrefix = context.getConfigString("indexPrefix")

String baseSettings = MigrationUtils.resourceAsString(bundleContext, "requestBody/2.0.0/base_index_mapping.json")
String mapping = MigrationUtils.extractMappingFromBundles(bundleContext, "profile.json")
String newIndexSettings = MigrationUtils.buildIndexCreationRequest(baseSettings, mapping, context, false)
MigrationUtils.reIndex(context.getHttpClient(), bundleContext, esAddress, indexPrefix + "-profile",
        newIndexSettings, MigrationUtils.getFileWithoutComments(bundleContext, "requestBody/2.0.0/profile_migrate.painless"), context)

This script references request body files that are available here. 

1.6 - Update your custom Kibana dashboards

Because of the change introduced regarding dynamic keys, it is not yet possible to build Kibana dashboards based on: 

  • URL parameters
  • Interests
  • Form events specific fields

These fields now also use flattenedProperties, and it is not possible to build visualizations based on them. If that causes any issue, please report it to the Jahia support team.

1.7 - Update your configuration & your custom modules to follow best practices

☁️  The following recommendation also applies to Jahia Cloud customers.

1.7.1 - For any custom modules relying on jExperience

If you have any dependencies on jExperience, you should update them as well when updating jExperience.

1.7.2 - New web tracker (wem.js)

As part of jExperience 3.3.0, we released a new version of the web tracker (wem.js).

This new major version of the tracker was an opportunity for some cleanup and removal of functions not used by jExperience.

The following functions were removed from the web tracker:

  • loadContent()
  • extends()
  • _createElementFromHTML()
  • _loadScript()
  • sendAjaxFormEvent()

The tracker remains extensible and if you still need these functions (or similar functions), you can easily integrate this logic into your own codebase.

1.7.3 - New methods to communicate with jCustomer

Since jExperience 2.8.0, methods have been added to ease communication between Jahia modules and jCustomer. If your modules make direct calls to jCustomer, you must update them to use the methods provided by jExperience.

The new methods are described in the "Calling jCustomer public endpoint" and "Calling jCustomer private / admin endpoint" sections of the Using jExperience java services page.

The new methods automatically format the payload of the requests sent to jCustomer, which ensures that a valid object is submitted. If you were using your own object mapper instance to format request payloads, this is no longer necessary: the new methods use a properly configured instance to format the data.
If you do not use these methods, you could face validation errors thrown by JSON schemas when sending your requests because of improperly formatted data.

1.7.4 - Use jExperience as a proxy to jCustomer

Since jExperience 2.7.2, it is possible to use jExperience as a proxy to jCustomer. In other words, jCustomer doesn't need to be exposed to the internet. This setting ensures that the calls to context.json and eventcollector are always sent to the same domain as the website; cookies are therefore always set as first party, which prevents browsers from blocking them. This setting is described in Installing Elasticsearch, jCustomer, and jExperience, in the section "Properties to configure in the jExperience settings file" => "jCustomer public URL".

☁️ For Jahia cloud customers, this setting will be shipped directly in a Jahia Cloud release automatically.

1.8 - Wrapping up stage 1

When reaching this stage, you should have the following:

  • a jCustomer 2.x instance containing all custom JSON Schemas needed for accepting events generated from now on

  • Migration scripts necessary to migrate your old events to the new jCustomer 2.x data model

Do not proceed until the above conditions are met.

Stage 2: Validate in pre-prod

In this next stage, we will take your dev (or pre-prod) environment and migrate it to jCustomer 2.x. The goal is to verify that you have met all the requirements for a successful migration. At the end of this stage, you will have all of the resources needed to begin migrating your production environment.

2.1 - Migrate pre-prod

In this step, we will take your current jCustomer 1.x instance (not production) and migrate it to jCustomer 2.x. The steps are very similar to those detailed in Stage 3.

Once the migration is complete:

  • Verify the logs to make sure no errors were reported during the migration. If you notice errors, adjust your custom migration scripts accordingly. If after careful review you still notice migration errors, you can reach out to our support team for assistance.
  • Use your pre-prod environment, simulate user visits, and watch for proper operation while reviewing the logs. These will highlight any steps that might have been missed in Stage 1.

If you did miss events, now is the time to update your JSON Schemas until no more errors are present in your logs as users are visiting the site.

☁️ As a Cloud customer, this action will be executed automatically upon your request. Jahia will perform the migration of non-production environments. This allows for testing and validation of the migration process. This action will also upgrade the jExperience module to 3.3.0.

2.1.1 - Reviewing logs

As jCustomer 2.x will reject any input that is not validated by a JSON schema, it can be useful to activate specific logging to understand why events are being rejected. This makes it much easier to understand what is going on, both when testing the system before going live and in production, as these logs may be activated and deactivated at any time. However, it is recommended to complete all checks before going live: these logs should never stay active for long on production systems, as they may cause slowdowns and eat up disk space very quickly (which is why they are deactivated by default).

You can find information about how to activate and read the log entries in the related section in the Apache Unomi documentation.

Note: it is also possible to activate the logs using an environment variable that is configured through the etc/custom.system.properties file:

org.apache.unomi.logs.jsonschema.level=${env:UNOMI_LOGS_JSONSCHEMA_LEVEL:-INFO} 

 Changing the above value to DEBUG may then be done like this:

export UNOMI_LOGS_JSONSCHEMA_LEVEL=DEBUG

☁️ As a Cloud customer, we have you covered with our partner Datadog. We have created a new monitor named "[jCustomer] "event rejection" error" for your benefit. This monitor alerts you when an event submitted to jCustomer 2.x is rejected due to non-compliance with an existing JSON Schema. You should check that all custom data models, schemas, and custom events work as expected in the new jCustomer 2.4.0 environment.

2.2 - Wrapping up stage 2

At the end of stage 2, you will have the following:

  • A freshly released version of your module embedding your custom JSON Schemas
  • A set of migration scripts needed to migrate older events

Do not proceed to Stage 3 until the above conditions are met.

Stage 3: Migrate production

3.0 - jExperience upgrade safety

To prevent accidental upgrades, a mechanism displays an error message if an administrator tries to upgrade to the next major version of the module (first-digit upgrade).

Once you are ready to perform the migration, the first step consists of removing that safety.

3.0.1 Safety removal via the filesystem

This safety can be removed by deleting the file: digital-data/karaf/etc/org.jahia.modules.modulemanager.configuration.constraints-jexperience.yaml.

The file will be automatically added back after the upgrade and will be configured to prevent an upgrade to jExperience 4.x.

3.0.2 Safety removal via a GraphQL call

Performing the following GraphQL call will remove the upgrade safety:

mutation {
  admin {
    jahia {
      configuration(
        pid: "org.jahia.modules.modulemanager.configuration.constraints", identifier: "jexperience"
      ) {
        remove(name: "moduleLifeCycleConstraints")
      }
    }
  }
}

The corresponding configuration file will be automatically added back after the upgrade and will be configured to prevent an upgrade to jExperience 4.x.

3.1 - Migrate jCustomer & update jExperience

☁️ As a Cloud customer, following the successful migration of the non-production environment and validation from you, the Cloud team will upgrade the production environment on your behalf. The date for the upgrade will be collaboratively planned with you.

Please note that during the migration, and as long as the jExperience module is started, personalizations and optimizations on the site will display their fallback variants.

To migrate your platform, please follow these steps carefully:

  1. Back up your jCustomer-1.5.7+ instance (if you run jCustomer in a cluster, back up the whole cluster)
    For a detailed description of how to back up jCustomer, please follow the section How to backup jCustomer on this page
  2. Stop jCustomer-1.5.7+
  3. Download jCustomer-2.4.0 from jExperience customer center and install it
  4. If you're handling custom events, make sure you created the corresponding migration scripts and placed those in jCustomer: data/migration/scripts
  5. Start a single jCustomer 2.4.0 migration node following the instructions from the Apache Unomi 2.0.0 documentation. This node will handle all of the data transformations; make sure to review all items in the "Checklist" section. The documentation details two approaches for triggering the migration (replace the version number with your source Unomi version):
    • Manually using the Karaf shell with the command: unomi:migrate 1.6.0
    • Or, with Docker using the environment variable: KARAF_OPTS="-Dunomi.autoMigrate=1.6.0" 
  6. Verify that the migration was successful by reviewing the file data/migration/history.json; all steps in this file must have the status "COMPLETED"
  7. Apply all your custom configurations to the new jCustomer-2.4.0 instance
  8. Upgrade and start jExperience in version 3.3.0 on your Jahia instance.
  9. Upgrade jExperience-dashboards to version 1.0.0 on your Jahia instance
  10. Start jCustomer-2.4.0 with the ./start command in the operating system shell.
  11. Connect to the jCustomer Karaf SSH shell using an SSH client such as in the following example (default password karaf):
    ssh -p 8102 karaf@localhost
  12. Start jCustomer (only needed when running jCustomer 2.4.0 for the first time).
    unomi:start
  13. Submit custom JSON Schemas via the API (if those were not embedded in your module)
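The verification in step 6 can be scripted. The sketch below assumes history.json is a JSON document in which each step carries a "status" field; it creates a toy file inline purely for illustration, and on a real instance you would point it at data/migration/history.json instead:

```shell
# Toy history file for illustration only; on a real instance, use
# data/migration/history.json as produced by the migration.
cat > history.json <<'EOF'
[
  {"step": "migrate-2.0.0-01-events", "status": "COMPLETED"},
  {"step": "migrate-2.0.0-02-profiles", "status": "COMPLETED"}
]
EOF

# Every step must report COMPLETED for the migration to be considered done.
total=$(grep -c '"status"' history.json)
completed=$(grep -c '"status": "COMPLETED"' history.json)
if [ "$total" -eq "$completed" ]; then
    echo "all migration steps COMPLETED"
else
    echo "incomplete migration steps found, review the logs" >&2
fi
```

A simple count comparison like this is enough to catch any step left in an error state before you continue with the remaining steps.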

3.3 - Wrapping up stage 3

Your production environment should now be fully migrated. As an extra measure of precaution, you can keep monitoring jCustomer logs for schema validation errors for a few weeks.

Remember that JSON schemas are also there to increase the robustness of your platform, and aim at rejecting events not conforming to a set schema; if your platform is subject to attacks, these would trigger legitimate schema validation errors.