TermWeb - Business Integration Message Adapter (BIMA) Manual

This document describes the message format for the TermWeb Business Integration Message Adapter (BIMA). BIMA is used to integrate TermWeb with other enterprise systems via message broker middleware such as WebSphere MQ, ActiveMQ, RabbitMQ, or Apache Kafka.
BIMA uses the Java Message Service (JMS) standard and Apache Kafka.

Introduction

Two TermWeb operations cause BIMA to generate messages: initial transfer (export of a term database) and update of term data.

When such an operation occurs, BIMA generates one or more messages and publishes them to the message broker's publish/subscribe topics defined in BIMA's configuration file.
Each message consists of several properties and a message body, which is an XML structure. The properties let subscribing applications filter messages without having to parse the XML body.

Installation

  • Copy the sample configuration file to [TERMWEB_HOME] and edit it as necessary

  • Edit [TERMWEB_HOME]/termweb.properties and add the following lines:

    adapter.class.1=com.termweb4.bima.BimaAdapter
    adapter.config.1=[TERMWEB_HOME]/BimaAdapter.properties
  • If you are going to use JMS, you may replace the ActiveMQ libraries with those of another provider, such as RabbitMQ or IBM MQ. If you use another MQ implementation, copy any broker-specific jar files that BIMA needs to create instances of TopicConnectionFactory and Topic to [TOMCAT_HOME]/webapps/ROOT/WEB-INF/lib. By default, TermWeb 4 ships with the ActiveMQ implementation files; do not forget to delete them to avoid conflicts:

    • activemq-client-*.jar

      • and its transitive dependencies:

        • org.apache.geronimo.specs:geronimo-jms_1.1_spec:1.1.1

        • org.apache.geronimo.specs:geronimo-j2ee-management_1.1_spec:1.0.1

        • org.fusesource.hawtbuf:hawtbuf:1.11

Configuration

Property

Description

Common properties

adapter.enabled

Enables or disables message publishing. Set the value to false to disable all messaging.

adapter.jms.enabled

Enables or disables message publishing for the JMS publisher only. Set the value to false to disable JMS messaging.

adapter.kafka.enabled

Enables or disables message publishing for the Kafka publisher only. Set the value to false to disable Kafka messaging.

adapter.maxMessageSize

Defines the maximum message size in bytes for publishing to the initial topic. A value of 0 indicates no maximum size.
Specify a value somewhat below the broker's allowed maximum, because it is not always possible to split a message exactly at the limit. For example, if your message store allows at most 1 048 576 bytes (1 MB), set this property to 921 600 (900 KB).

JMS properties

jms.topic.init.namePattern

Defines the name pattern for the initial export topic. See Name pattern below for a description of the pattern.

jms.topic.update.section.namePattern

Defines the name pattern for the section update topic. See Name pattern below for a description of the pattern.

jms.topic.update.concept.namePattern

Defines the name pattern for the concept update topic. See Name pattern below for a description of the pattern.

jms.topic.update.term.namePattern

Defines the name pattern for the term update topic. See Name pattern below for a description of the pattern.

jms.tcf.type

Defines the class name for the TopicConnectionFactory.
For example: org.apache.activemq.ActiveMQConnectionFactory or com.ibm.mq.jms.MQTopicConnectionFactory

jms.tcf.*

Defines several properties for the TopicConnectionFactory class. The names of the properties depend on the properties available in the class defined by jms.tcf.type. BIMA attempts to call a setter method for each defined property, e.g., the property jms.tcf.queueManager will call setQueueManager() with the value from the property (see the example configuration at the end of this table).

jms.topic.init.type

Defines the class name for the Topic used for initial export messages.
For example: org.apache.activemq.command.ActiveMQTopic or com.ibm.mq.jms.MQTopic

jms.topic.init.nameMethod

The name of the method in the Topic (sub)class used for setting the baseTopicName. Normally this is setPhysicalName or setBaseTopicName.

jms.topic.init.*

Other properties for the class defined by jms.topic.init.type. BIMA attempts to call the corresponding setter method for each property.

jms.topic.update.section.type

Defines the class name for the Topic used for section update messages.
For example: org.apache.activemq.command.ActiveMQTopic or com.ibm.mq.jms.MQTopic

jms.topic.update.section.nameMethod

The name of the method in the Topic (sub)class used for setting the baseTopicName. Normally this is setPhysicalName or setBaseTopicName.

jms.topic.update.section.*

Other properties for the class defined by jms.topic.update.section.type. BIMA attempts to call the corresponding setter method for each property.

jms.topic.update.concept.type

Defines the class name for the Topic used for concept update messages.
For example: org.apache.activemq.command.ActiveMQTopic or com.ibm.mq.jms.MQTopic

jms.topic.update.concept.nameMethod

The name of the method in the Topic (sub)class used for setting the baseTopicName. Normally this is setPhysicalName or setBaseTopicName.

jms.topic.update.concept.*

Other properties for the class defined by jms.topic.update.concept.type. BIMA attempts to call the corresponding setter method for each property.

jms.topic.update.term.type

Defines the class name for the Topic used for term update messages.
For example: org.apache.activemq.command.ActiveMQTopic or com.ibm.mq.jms.MQTopic

jms.topic.update.term.nameMethod

The name of the method in the Topic (sub)class used for setting the baseTopicName. Normally this is setPhysicalName or setBaseTopicName.

jms.topic.update.term.*

Other properties for the class defined by jms.topic.update.term.type. BIMA attempts to call the corresponding setter method for each property.

Kafka properties

kafka.bootstrap.servers

A list of host/port pairs to use for establishing the initial connection to the Kafka cluster. The client will make use of all servers irrespective of which servers are specified here for bootstrapping – this list only impacts the initial hosts used to discover the full set of servers. This list should be in the form host1:port1,host2:port2,...

kafka.client.id

An id string to pass to the server when making requests. The purpose of this is to be able to track the source of requests beyond just ip/port by allowing a logical application name to be included in server-side request logging. Default value is TermWebProducer.

kafka.client.dns.lookup

Controls how the client uses DNS lookups. If set to use_all_dns_ips, connect to each returned IP address in sequence until a successful connection is established. After a disconnection, the next IP is used. Once all IPs have been used once, the client resolves the IP(s) from the hostname again (both the JVM and the OS cache DNS name lookups, however). If set to resolve_canonical_bootstrap_servers_only, resolve each bootstrap address into a list of canonical names. After the bootstrap phase, this behaves the same as use_all_dns_ips.

kafka.include.jms.topic

Includes the JMS topic name as the Kafka record header JmsTopicName. When set to true, the header value is generated from the configured name patterns: jms.topic.update.section.namePattern, jms.topic.update.concept.namePattern and jms.topic.update.term.namePattern.

kafka.topic.init.name

Default topic name used for initial export messages sent to Kafka. The user may change it for each export later.

kafka.topic.update.section.name

Topic name used for section update messages sent to Kafka.

kafka.topic.update.concept.name

Topic name used for concept update messages sent to Kafka.

kafka.topic.update.term.name

Topic name used for term update messages sent to Kafka.

kafka.security.protocol

Protocol used to communicate with brokers. Valid values are: PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL. Default value is PLAINTEXT.

Kafka SASL configuration
TermWeb provides keys to configure most of the Kafka producer SASL properties (see section 3.3, Producer Configs, in the Apache Kafka documentation). Short descriptions of these properties follow.

kafka.sasl.mechanism

SASL mechanism used for client connections. This may be any mechanism for which a security provider is available. GSSAPI is the default mechanism.

kafka.sasl.login.callback.handler.class

The fully qualified name of a SASL login callback handler class that implements the AuthenticateCallbackHandler interface. For brokers, login callback handler config must be prefixed with listener prefix and SASL mechanism name in lower-case. For example, listener.name.sasl_ssl.scram-sha-256.sasl.login.callback.handler.class=com.example.CustomScramLoginCallbackHandler

kafka.sasl.login.connect.timeout.ms

The (optional) value in milliseconds for the external authentication provider connection timeout. Currently applies only to OAUTHBEARER.

kafka.sasl.oauthbearer.token.endpoint.url

The URL for the OAuth/OIDC identity provider. If the URL is HTTP(S)-based, it is the issuer’s token endpoint URL to which requests will be made to login based on the configuration in kafka.sasl.jaas.config. If the URL is file-based, it specifies a file containing an access token (in JWT serialized form) issued by the OAuth/OIDC identity provider to use for authorization.

kafka.sasl.oauthbearer.expected.audience

The (optional) comma-delimited setting for the broker to use to verify that the JWT was issued for one of the expected audiences. The JWT will be inspected for the standard OAuth “aud” claim and if this value is set, the broker will match the value from JWT’s “aud” claim to see if there is an exact match. If there is no match, the broker will reject the JWT and authentication will fail.

kafka.sasl.jaas.config

JAAS login context parameters for SASL connections, in the format used by JAAS configuration files (see the Java JAAS documentation for the file format). The format for the value is: loginModuleClass controlFlag (optionName=optionValue)*;.

kafka.ssl.truststore.location

The location of the trust store file. For example: /opt/iris-truststore.jks

kafka.ssl.truststore.password

The password for the trust store file. If a password is not set, trust store file configured will still be used, but integrity checking is disabled. Trust store password is not supported for PEM format.

Kafka producer configuration
TermWeb provides keys to configure most of the Kafka producer properties (see section 3.3, Producer Configs, in the Apache Kafka documentation). Short descriptions of these properties follow.

kafka.producer.delivery.timeout.ms

Default value is 600000 (10 minutes).

An upper bound on the time to report success or failure after a call to send() returns. This limits the total time that a record will be delayed prior to sending, the time to await acknowledgement from the broker (if expected), and the time allowed for retriable send failures. The producer may report failure to send a record earlier than this config if either an unrecoverable error is encountered, the retries have been exhausted, or the record is added to a batch which reached an earlier delivery expiration deadline. The value of this config should be greater than or equal to the sum of kafka.producer.request.timeout.ms and kafka.producer.linger.ms

kafka.producer.metadata.max.age.ms

The period of time in milliseconds after which we force a refresh of metadata even if we haven’t seen any partition leadership changes to proactively discover any new brokers or partitions.

kafka.producer.metadata.max.idle.ms

Controls how long the producer will cache metadata for a topic that’s idle. If the elapsed time since a topic was last produced to exceeds the metadata idle duration, then the topic’s metadata is forgotten and the next access to it will force a metadata fetch request.

kafka.producer.batch.size

The producer will attempt to batch records together into fewer requests whenever multiple records are being sent to the same partition. This helps performance on both the client and the server. This configuration controls the default batch size in bytes.

No attempt will be made to batch records larger than this size.

Note: This setting gives the upper bound of the batch size to be sent. If we have fewer than this many bytes accumulated for this partition, we will ‘linger’ for the kafka.producer.linger.ms time waiting for more records to show up. This kafka.producer.linger.ms setting defaults to 0, which means we’ll immediately send out a record even if the accumulated batch size is under this kafka.producer.batch.size setting.

kafka.producer.acks

The number of acknowledgments the producer requires the leader to have received before considering a request complete. This controls the durability of records that are sent. The following settings are allowed: [all, -1, 0, 1]

kafka.producer.linger.ms

The producer groups together any records that arrive in between request transmissions into a single batched request. Normally this occurs only under load when records arrive faster than they can be sent out. However, in some circumstances the client may want to reduce the number of requests even under moderate load. Valid Values: [0,...]

kafka.producer.request.timeout.ms

The configuration controls the maximum amount of time the client will wait for the response of a request. If the response is not received before the timeout elapses the client will resend the request if necessary or fail the request if retries are exhausted.

kafka.producer.max.request.size

The maximum size of a request in bytes. This setting will limit the number of record batches the producer will send in a single request to avoid sending huge requests. This is also effectively a cap on the maximum uncompressed record batch size. Note that the server has its own cap on the record batch size (after compression if compression is enabled) which may be different from this.

kafka.producer.reconnect.backoff.max.ms

The maximum amount of time in milliseconds to wait when reconnecting to a broker that has repeatedly failed to connect. If provided, the backoff per host will increase exponentially for each consecutive connection failure, up to this maximum.

kafka.producer.max.block.ms

The configuration controls how long the KafkaProducer’s send(), partitionsFor(), initTransactions(), sendOffsetsToTransaction(), commitTransaction() and abortTransaction() methods will block. For send() this timeout bounds the total time waiting for both metadata fetch and buffer allocation (blocking in the user-supplied serializers or partitioner is not counted against this timeout).

kafka.producer.buffer.memory

The total bytes of memory the producer can use to buffer records waiting to be sent to the server. If records are sent faster than they can be delivered to the server the producer will block for kafka.producer.max.block.ms after which it will throw an exception.

kafka.producer.retry.backoff.ms

The amount of time to wait before attempting to retry a failed request to a given topic partition. This avoids repeatedly sending requests in a tight loop under some failure scenarios.

kafka.producer.compression.type

The compression type for all data generated by the producer. The default is none (i.e. no compression). Valid values are none, gzip, snappy, lz4, or zstd. Compression is of full batches of data, so the efficacy of batching will also impact the compression ratio (more batching means better compression).

kafka.producer.retries

Setting a value greater than zero will cause the client to resend any record whose send fails with a potentially transient error. Note that this retry is no different than if the client resent the record upon receiving the error. Produce requests will be failed before the number of retries has been exhausted if the timeout configured by kafka.producer.delivery.timeout.ms expires first before successful acknowledgement. Users should generally prefer to leave this config unset and instead use kafka.producer.delivery.timeout.ms to control retry behavior.

kafka.producer.connections.max.idle.ms

Close idle connections after the number of milliseconds specified by this config.

kafka.producer.partitioner.ignore.keys

When set to true the producer won't use record keys to choose a partition. If false, producer would choose a partition based on a hash of the key when a key is present. Note: this setting has no effect if a custom partitioner is used.

kafka.producer.transaction.timeout.ms

The maximum amount of time in milliseconds that a transaction will remain open before the coordinator proactively aborts it. The start of the transaction is set at the time that the first partition is added to it. If this value is larger than the setting in the broker, the request will fail with an error.
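To illustrate how these properties fit together, here is a minimal configuration sketch for BimaAdapter.properties. It assumes a local ActiveMQ broker and a small Kafka cluster; the broker addresses and topic names are placeholders, and jms.tcf.brokerURL is only an example of the jms.tcf.* setter mechanism (it resolves to setBrokerURL() on ActiveMQConnectionFactory). The section and term JMS topics are configured in the same way as the concept topic and are omitted for brevity.

    # Common properties
    adapter.enabled=true
    adapter.jms.enabled=true
    adapter.kafka.enabled=true
    adapter.maxMessageSize=921600

    # JMS via ActiveMQ; jms.tcf.brokerURL is mapped to setBrokerURL() on the factory
    jms.tcf.type=org.apache.activemq.ActiveMQConnectionFactory
    jms.tcf.brokerURL=tcp://localhost:61616
    jms.topic.init.type=org.apache.activemq.command.ActiveMQTopic
    jms.topic.init.nameMethod=setPhysicalName
    jms.topic.init.namePattern=Subsystem/load/<TAG>
    jms.topic.update.concept.type=org.apache.activemq.command.ActiveMQTopic
    jms.topic.update.concept.nameMethod=setPhysicalName
    jms.topic.update.concept.namePattern=Subsystem/update/concept/Dict-<DICTIONARYID>/Cat-<SECTIONID>

    # Kafka
    kafka.bootstrap.servers=host1:9092,host2:9092
    kafka.client.id=TermWebProducer
    kafka.security.protocol=PLAINTEXT
    kafka.topic.update.section.name=Subsystem-update-section
    kafka.topic.update.concept.name=Subsystem-update-concept
    kafka.topic.update.term.name=Subsystem-update-term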

Name pattern

The name pattern for the different JMS topics is a text string with variables that will be replaced with values depending on the dictionary, section and language that are affected by the update message. The variables are:

  • <DICTIONARYID>

Is replaced by the oid of the dictionary.

  • <SECTIONID>

Is replaced by the oid of the section.

  • <LANGUAGE>

Is replaced by the two-letter ISO code of the language. Only applicable for term update messages.

  • <TAG>

Is replaced by the tag the user can enter when selecting an initial transmission. Thus, only applicable for initial transmission messages.

Example configuration.

Property

Value

jms.topic.init.namePattern

Subsystem/load/<TAG>

jms.topic.update.section.namePattern

Subsystem/update/category/Dict-<DICTIONARYID>/Cat-<SECTIONID>

jms.topic.update.concept.namePattern

Subsystem/update/concept/Dict-<DICTIONARYID>/Cat-<SECTIONID>

jms.topic.update.term.namePattern

Subsystem/update/term/Dict-<DICTIONARYID>/Cat-<SECTIONID>/<LANGUAGE>
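For example, with the patterns above, an update of an English term in dictionary 97qjc, section 97qjs, is published to the JMS topic Subsystem/update/term/Dict-97qjc/Cat-97qjs/en, as shown in the example messages later in this document.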

Adapter administration

When the adapter is installed, a BIMA administration panel is available in TermWeb. Log in as an administrator and go to AdminView → System → BIMA. The administration panel opens.

General panel

The general panel allows the adapter to be enabled and disabled. It also allows temporarily disabling either the JMS or the Kafka part. Please note that the setting in BimaAdapter.properties is not affected by this control; after restarting TermWeb, the adapter will be enabled or disabled according to the value in the properties file.

Information about current configuration is also displayed here.

Export panel

The export panel allows the administrator to do an initial transfer of all or part of the term data in a dictionary. The selection of concepts, terms and fields is based on the regular Export settings used for normal exports.

Select settings and topic

Start by selecting one of the available export settings. If no setting is appropriate for your needs, you can click the “Create export setting” link, which opens the Export settings panel and lets you define a new setting. Save the setting, then open the BIMA administration panel again under AdminView → System and select the newly created setting.

Continue by entering the name of the topic to which the data should be published, and choose the type of message broker configured in the adapter (JMS or Kafka).

Click on the “Continue” button.

Confirm your selection

Your selected export setting is now displayed in detail with languages, fields and total number of concepts that will be published when using this setting. By moving your mouse pointer over the language and field summary you will see a list of all languages and fields included.

If this is the data you expected, click the “Start export” button to start publishing the messages. Otherwise, click the “Reset” button to go back and select another export setting and/or topic name.

Export progress

The selected concepts are now retrieved from the database, converted to XML segments, and published to the selected topic. For details about the message format, see section Message structure below.

A progress bar indicates how much of the export is completed. The elapsed time and an estimate of the total time are also shown.

You can stop the export by pressing the Stop button. Please note that this may cause receiving systems to receive only part of the intended data.

Export summary

When the export is done, or if the export was stopped by the user, a summary is displayed with the total number of concepts published, together with elapsed time.

Message structure

Message properties

The following specific properties are defined in a JMS message or as headers in a Kafka record:

Property

Type

Description

Possible values

DictionaryId

String

Contains the oid of the dictionary that contains the object described in this message.

Any string

SectionId

String

Contains the oid of the section that contains the object described in this message.

Any string

NewSectionId

String

Contains the new section oid when the section of a concept was changed, otherwise null.

Any string

OldSectionId

String

Contains the old section oid when the section of a concept was changed, otherwise null.

Any string

DataType

String

Identifies the type of data entity that is contained in this message.

Is one of:

  • SECTION (only section data)

  • CONCEPT (a full concept including terms)

  • TERM (a single term)

DataAction

String

Identifies the type of action that was performed on the data.

Is one of:

  • CREATED (the data entity was created)

  • UPDATED (some field in the entity was modified)

  • DELETED (the entity was deleted)

  • UNMODIFIED (no field in the entity was modified. Only happens for terms in a concept that has generated a message with DataAction UPDATED)

TransmissionType

String

Identifies the type of transmission.

Is one of:

  • INITIAL (transfer of (a subset of) all concepts invoked by the user)

  • UPDATE (transfer of one or more concepts due to an update of the TermWeb database)

Restore

Boolean

Identifies if the message was generated in a restore operation in TermWeb.

true if it is a restore operation, otherwise false
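Because these values are available as JMS message properties (and Kafka record headers), subscribers can filter on them before parsing the XML body, as mentioned in the introduction. As a sketch, a JMS client could use a standard message selector such as:

    DataType = 'TERM' AND DataAction <> 'UNMODIFIED'

This would deliver only term messages that describe an actual change, skipping the UNMODIFIED notifications generated for untouched terms. The selector syntax is standard JMS; the property names are exactly those listed above.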

Initial transmission

During an initial transfer (described in Adapter administration above) BIMA will create a message for each section in the dictionary being exported (or several messages, if the data is larger than the maximum message size specified in the configuration file). The TransmissionType in the message is set to INITIAL. DictionaryId and SectionId are set to their respective values. All other properties are set to null.

The body of the message is an XML structure containing all the concepts being exported from the section, which could be none if the export setting's filter does not include any concept from this section.

Update transmissions

An update transmission occurs when term data is updated in TermWeb: when a section is renamed or deleted, when concepts are created, modified or deleted, or when a file is imported. BIMA then creates one or more messages and publishes them to the broker.

Section updates

Messages caused by section updates are published to the section update topic(s) defined by jms.topic.update.section.namePattern or by kafka.topic.update.section.name in the configuration file. All messages have DataType set to SECTION.

If a section is renamed, a message is published to the topic with DataAction set to UPDATED. The XML in the body consists only of the dictionary and section. The new name is found in the “name” attribute of the section tag.

If a section is deleted, a message with DataAction set to DELETED is published. The XML contains only the dictionary and section.

See Example messages for more detailed information.

Term data updates

For every change there will be a message for the entire concept together with a message for each affected term. There is therefore redundancy in the messages, since the data in the term messages is also included in the concept message.

Concept messages are published to the concept topic(s) defined by jms.topic.update.concept.namePattern or by kafka.topic.update.concept.name in the configuration file. Term update messages are published to the topic(s) defined by jms.topic.update.term.namePattern or by kafka.topic.update.term.name.
The subscribing client applications can thus select the most appropriate data structure.

In the group of messages, the concept message has DataType set to CONCEPT, an appropriate value set for DataAction and the body is the XML for the entire concept. Each term message has DataType set to TERM, a DataAction value as appropriate and the XML structure for the term.

When importing a file there will be a series of messages, just as if every concept had been created or modified manually. The messages are sent to the broker when the import of the file is completed.

Description of term data update transmissions

The messages generated from the different kinds of updates are as follows.

Creation of a new concept

One message for the entire concept, and one message for each term in the concept. All have DataAction set to CREATED.

Deletion of a concept

One message for the entire concept, and one message for each term in the concept. All have DataAction set to DELETED.

Modification of a concept

One message for the entire concept with DataAction set to UPDATED.

For every term that has been created, there is a message with the term data and DataAction set to CREATED.

For every term that has been deleted, there is a message with the term data and DataAction set to DELETED.

If any term field except the term name or language has been modified, there is a message with the term data and DataAction set to UPDATED.

If the term name or language has been changed, the old term is considered to have been deleted and a new one created. Therefore, this will create two messages: one DELETED and one CREATED.

If any field on the concept level has been modified, a message with DataAction UNMODIFIED is generated for every term that has not already generated a message with DataAction UPDATED, CREATED or DELETED. Subscribing systems can then determine if some action needs to be taken with the data.

If the section has been changed, the concept is considered to have been deleted and a new one created. The messages for the deletion are published to the old section's topic and the creation messages are published to the new section's topic.

XML structure

The XML structure for the messages is similar to the structure of a TermWeb XML export file.
Message example:

    <dictionary id="8" name="Automotive terms">
      <section id="16" name="Headings">
        <concept id="4116633" termno="430">
          <metadata>
            <createdOn>1999-05-12 03:12:54</createdOn>
            <createdBy>admin</createdBy>
            <changedOn>2004-10-04 13:53:27</changedOn>
            <changedBy>admin</changedBy>
          </metadata>
          <field name="Domain">All domains</field>
          <field name="Origin">Termlex</field>
          <field name="Term No.">430</field>
          ...
          <term>
            <metadata>
              <createdOn />
              <createdBy />
              <changedOn>2004-10-04 13:53:22</changedOn>
              <changedBy>admin</changedBy>
            </metadata>
            <field name="Term">Lubrication, injection pump</field>
            <field name="Language">en-GB</field>
            <field name="Region"/>
            ...
          </term>
          <term>
            ...
          </term>
          ...
        </concept>
        ...
      </section>
      ...
    </dictionary>

As previously stated, for every term data update there is one message containing the entire concept and one message for each affected term. The XML structure in a concept update message contains exactly one <concept> tag with corresponding content; for an initial transfer it may contain multiple concepts. For a term message, the body contains exactly one <term> tag with content.

The concept-level Status field is also included in every term message before the <term> tag (that is, in messages sent to the topic(s) defined by jms.topic.update.term.namePattern or by kafka.topic.update.term.name). By default, term messages in this topic do not contain concept fields, but Status is always present:
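The following is a minimal sketch of what such a term message body could look like. It assumes the same dictionary/section/concept wrapping as the export format shown above; the element order, the Status value and the omitted fields (marked with ...) are illustrative only.

    <dictionary id="8" name="Automotive terms">
      <section id="16" name="Headings">
        <concept id="4116633" termno="430">
          <field name="Status">Approved</field>
          <term>
            <field name="Term">Lubrication, injection pump</field>
            <field name="Language">en-GB</field>
            ...
          </term>
        </concept>
      </section>
    </dictionary>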

Kafka integration

The Kafka integration uses the same message structure. By default, all messages are serialized as JSON strings, and the XML part of the message is stored in the record's value.

For an initial transmission the record key has the form 1-100, where 1 and 100 are the sequential position numbers of the first and last element in the message.
Numbering starts at 1.

For an update transmission the key has the form S1234, C1234 or T1234, depending on the type of message: S1234 is a section id, C1234 a concept id and T1234 a term id. When the section of a concept is changed, BIMA creates two messages with the same key but different DataAction values: DELETED and CREATED.
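For example, in the messages shown below, an update of concept 1005635 produces a concept record with key C1005635 and term records with keys such as T2326754, while the first record of an initial export covering concepts 1 through 47 has key 1-47.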

Example messages

Term deleted from Concept

The concept has three terms: English, Swedish and German. The user deletes the German term. The following records will be sent:

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

UPDATED

Restore

false

Body:

A message for the German term, stating that it has been deleted:

Topic: Subsystem-update-term Key: T2326754

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/de

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

DELETED

Restore

false

Body:

A message for the English term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

A message for the Swedish term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326758

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/sv

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

Term added to Concept

The user adds a new Spanish term to a concept consisting of two terms: English and Swedish. The following records will be sent:

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

UPDATED

Restore

false

Body:

A message for the new Spanish term, stating that it has been added:

Topic: Subsystem-update-term Key: T2326754

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/es

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

CREATED

Restore

false

Body:

A message for the English term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

A message for the Swedish term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326758

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/sv

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

Term change in Concept

The user changes the Spanish term from unión aleada to unión aleado. The concept consists of two terms: English and Spanish. The following records will be sent:

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

UPDATED

Restore

false

Body:

A message for the old Spanish term, stating that it has been deleted:

Topic: Subsystem-update-term Key: T2327791

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/es

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

DELETED

Restore

false

Body:

A message for the new Spanish term, stating that it has been created:

Topic: Subsystem-update-term Key: T2327791

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/es

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

CREATED

Restore

false

Body:

A message for the English term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

Concept field update

The user changes the value of one of the concept fields. The concept has two terms: English and Swedish. The following records will be sent:

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

UPDATED

Restore

false

Body:

A message for the English term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

A message for the Swedish term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326758

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/sv

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

Restore of deleted Term

The user restores a deleted Swedish term in a concept that consisted of only one English term. The following records will be sent:

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

UPDATED

Restore

true

Body:

A message for the restored Swedish term:

Topic: Subsystem-update-term Key: T2326758

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/sv

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

CREATED

Restore

true

Body:

A message for the English term, stating that nothing has changed:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

TERM

DataAction

UNMODIFIED

Restore

false

Body:

Section change in Concept

The user has changed the Section field of a concept. The concept consisted of two terms: English and Swedish.

Because of the section change, TermWeb will create two concept messages: one stating the concept was deleted and one stating it was created.

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

NewSectionId

97qk5

DataType

CONCEPT

DataAction

DELETED

Restore

false

Body:

A message for the concept, created in the new section:

Topic: Subsystem-update-concept Key: C1005635

Kafka header

Value

JmsTopicName

Subsystem/update/concept/Dict-97qjc/Cat-97qk5

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qk5

OldSectionId

97qjs

DataType

CONCEPT

DataAction

CREATED

Restore

false

Body:

The same rule applies to terms: in total there will be four term messages, two about term deletion from the old section and two about term creation in the new section. For brevity, only two of them are shown here:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qjs/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

NewSectionId

97qk5

DataType

TERM

DataAction

DELETED

Restore

false

Body:

And the message for the term created in the new section:

Topic: Subsystem-update-term Key: T2326755

Kafka header

Value

JmsTopicName

Subsystem/update/term/Dict-97qjc/Cat-97qk5/en

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qk5

OldSectionId

97qjs

DataType

TERM

DataAction

CREATED

Restore

false

Body:

New Section created in Dictionary

When an administrator or a user with rights to edit the dictionary creates a new section, TermWeb produces the following record for the new section.

Topic: Subsystem-update-section Key: S1319

Kafka header

Value

JmsTopicName

Subsystem/update/category/Dict-97qjc/Cat-98lst

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

98lst

DataType

SECTION

DataAction

CREATED

Restore

false

Body:

Section name change in Dictionary

When an administrator or a user with rights to edit the dictionary changes a section name, TermWeb produces the following record with the new section name.

Topic: Subsystem-update-section Key: S1317

Kafka header

Value

JmsTopicName

Subsystem/update/category/Dict-97qjc/Cat-97qjs

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

97qjs

DataType

SECTION

DataAction

UPDATED

Restore

false

Body:

Section deletion from Dictionary

When an administrator or a user with rights to edit the dictionary deletes a section, TermWeb produces the following record with information about the deleted section.

Topic: Subsystem-update-section Key: S1319

Kafka header

Value

JmsTopicName

Subsystem/update/category/Dict-97qjc/Cat-98lst

TransmissionType

UPDATE

DictionaryId

97qjc

SectionId

98lst

DataType

SECTION

DataAction

DELETED

Restore

false

Body:

Initial export

The user launches an initial export via the Export panel. TermWeb creates a message for every concept and packs as many as possible into one record while keeping the record size below the limit defined by the adapter property adapter.maxMessageSize.

All messages are sent to the topic specified by the user in the export panel. The key represents the order numbers of the concepts in the export data; it consists of two numbers separated by a hyphen: the start and end positions covered by this message.

Topic: Subsystem-update-initial Key: 1-47

Kafka header

Value

JmsTopicName

Subsystem-update-initial

TransmissionType

INITIAL

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

CREATED

Restore

false

Body:

The next message has the same structure, only with new order numbers:

Topic: Subsystem-update-initial Key: 48-92

Kafka header

Value

JmsTopicName

Subsystem-update-initial

TransmissionType

INITIAL

DictionaryId

97qjc

SectionId

97qjs

DataType

CONCEPT

DataAction

CREATED

Restore

false

Body:

A single record cannot contain multiple <section> elements.

And so on, until the end of the export.