
Omma Data (Aqtiva)

All references to "Aqtiva" in this document refer to OMMA Data, the current name of the technology, which was still called Aqtiva when the integration plugin was created.

Integration model


The process consists of periodically requesting from Aqtiva the live rules that exist since a given date, together with their results. This information is sent to Anjana to update the datasets associated with the reported dataformats, as well as the rules associated with them.


For the process to work correctly, some predefined attributes must be created in the dataset template, and a new entity type AQTIVA_RULE must also be created with predefined attributes, in which the data related to the Aqtiva rules (metadata and results) will be stored.

Dataset Template

The attributes to be added to the dataset template are the following:


| Attribute Name | Type | Value |
|---|---|---|
| AqtivaIdDataFormat | INPUT_TEXT | Will contain the dataformat identifier |
| AqtivaDataFormatName | INPUT_TEXT | Will contain the dataformat name |
| AqtivaQRulesArray | ARRAY_ENTITY | Will contain an array of ARIs with the associated rules |
| AqtivaKPILastUpdateDate | INPUT_TEXT | Date of the last KPI execution of results at datasource level |
| AqtivaDataQualityDatasourceLevel | TEXT_AREA | Will contain the results at datasource level |

AQTIVA_RULE Template

To be able to store information related to the executed rules, it is necessary to create the native entity AQTIVA_RULE which will contain the metadata related to them along with the applied results.

To do this, it is necessary to create a template with the following attributes:


| Attribute Name | Type | Value |
|---|---|---|
| AqtivaIdProject | INPUT_TEXT | Project ID in Aqtiva |
| AqtivaProjectName | INPUT_TEXT | Project name in Aqtiva |
| AqtivaIdQualityPoint | INPUT_TEXT | Quality Point ID in Aqtiva |
| AqtivaQualityPointName | INPUT_TEXT | Quality Point name in Aqtiva |
| AqtivaIdDataSource | INPUT_TEXT | Datasource ID in Aqtiva |
| AqtivaDataSourceName | INPUT_TEXT | Datasource name in Aqtiva |
| AqtivaIdQualityRule | INPUT_TEXT | Rule ID in Aqtiva |
| physicalName | INPUT_TEXT | Composite rule name (Project, Quality Point, DataSource) |
| AqtivaIdDataFormat | INPUT_TEXT | Dataformat ID in Aqtiva |
| AqtivaAnjanaDatasets | ARRAY_ENTITY | Dataset(s) on which the rule is executed |
| AqtivaDataFormatSource | INPUT_TEXT | Dataset type |
| AqtivaQualityRuleActive | INPUT_CHECKBOX | Indicates whether the rule is active for simulation |
| AqtivaQualityRuleDimension | INPUT_TEXT | Dimension |
| AqtivaQualityRuleColumns | INPUT_TEXT | Columns to which the rule applies |
| AqtivaQualityRuleConditions | INPUT_TEXT | Rule conditions |
| AqtivaQualityRuleCreationDate | INPUT_DATE | Rule creation date in Aqtiva |
| AqtivaQualityRuleWarnThType | INPUT_TEXT | Warning threshold type |
| AqtivaQualityRuleWarnThValue | INPUT_DECIMAL | Warning threshold value |
| AqtivaQualityRuleErrType | INPUT_TEXT | Error threshold type |
| AqtivaQualityRuleErrThValue | INPUT_DECIMAL | Error threshold value |
| AqtivaRecordIdRecordLastExec | INPUT_TEXT | Aqtiva execution record ID |
| AqtivaLastResultSynchronizationDateAnjana | INPUT_DATE | Last synchronization date |
| AqtivaRuleExecutionResults | TEXT_AREA | Results in JSON representation (internal use) |
| AqtivaRuleExecutionResultsView | ENRICHED_TEXT_AREA | Display of the results of the AqtivaRuleExecutionResults attribute |
| description | TEXT_AREA | Rule description |
| expirationDate | INPUT_DATE | Expiration date for the rule |
| name | INPUT_TEXT | Rule name |

  • All the indicated attributes must be active attributes in the corresponding templates for their values to be updated correctly.

  • It is recommended to configure NOT EDITABLE validation on all the attributes mentioned above to prevent accidental edits, since they are updated automatically. The exception is AqtivaIdDataFormat in the dataset template, which must be edited manually to associate a dataset with an Aqtiva dataformat.

  • Despite being a native entity, it is recommended not to grant deprecation permission to any role, so that manual deprecation of AQTIVA_RULE objects cannot be performed. This avoids inconsistencies between the database and the data reported by Aqtiva.


The plugin has no defined ARI since extraction, sampling or active governance of structures will not be possible. The communication is triggered directly from the Plugin to Anjana with the data to be updated.

Required credentials

The credentials to access Aqtiva are stored in the configuration file in base64 and will be provided by Aqtiva. The plugin will look for the following configuration:

YAML
totplugin:
  connection:
    name: dev
    technology:
      auth:
        client-id: <clientId to be replaced>
        secret: <secret to be replaced>
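
Since the credentials are stored base64-encoded in the configuration file, the plugin has to decode them before use. A minimal Python sketch of that decoding step (the example value below is hypothetical; real credentials are provided by Aqtiva):

```python
import base64

def decode_credential(encoded: str) -> str:
    """Decode a base64-encoded credential from the plugin configuration."""
    return base64.b64decode(encoded).decode("utf-8")

# Hypothetical example value; real ones are provided by Aqtiva.
client_id_b64 = base64.b64encode(b"my-client-id").decode("ascii")
decoded = decode_credential(client_id_b64)
```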

Deployment

The generic plugin deployment manual must be followed. https://wiki.anjanadata.com/es/configuracion/25.2/tot-despliegue-de-plugins


Configuration

The Aqtiva certificate installation is required:

  • It is obtained from the URL configured in the Aqtiva API YAML.

  • It is uploaded to the machine hosting the plugin instance.

  • The uploaded certificate is registered with the "keytool" command, noting that the first parameter must specify the Java version being used (obtained from the plugin service descriptor):

Bash
keytool -trustcacerts -keystore "<path_to_jdk>/jre/lib/security/cacerts" -storepass changeit -importcert -alias <alias_to_be_stored> -file "<path_to_certificate>/certificate.crt"


The certificate is issued for a DNS name, not an IP address, so the "totplugin.aqtiva.auth.url" property set in the YAML must use a DNS name.

Tot

Tot persists the moment of the last successful execution in order to subsequently make a more targeted query of the Aqtiva data. This is done in the tot DB schema, in the plugin_execution table, whose fields are:

  • name as nvarchar(255) Primary Key

  • date as timestamp
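
The persistence logic amounts to an insert-or-update keyed by plugin name. The following Python sketch uses SQLite purely for illustration (tot's actual database engine and access layer may differ):

```python
import sqlite3
from datetime import datetime, timezone

# Illustrative re-creation of tot's plugin_execution table:
# name is the primary key, date holds the last successful execution.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plugin_execution (name TEXT PRIMARY KEY, date TEXT)")

def save_last_execution(conn, name: str, when: datetime) -> None:
    # Insert-or-update so each plugin keeps exactly one row.
    conn.execute(
        "INSERT INTO plugin_execution (name, date) VALUES (?, ?) "
        "ON CONFLICT(name) DO UPDATE SET date = excluded.date",
        (name, when.isoformat()),
    )

save_last_execution(conn, "aqtiva", datetime(2024, 1, 1, tzinfo=timezone.utc))
save_last_execution(conn, "aqtiva", datetime(2024, 6, 1, tzinfo=timezone.utc))
row = conn.execute(
    "SELECT date FROM plugin_execution WHERE name = 'aqtiva'"
).fetchone()
```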


Kerno

To display the value of the AqtivaRuleExecutionResultsView field in rules once they are executed in Aqtiva, a template must be configured in the kerno yml under the variable anjana.aqtiva.template_result_view. Since the target attribute is an ENRICHED_TEXT_AREA, plain text and HTML tags can be combined to format it as desired.


The value of variables must be enclosed in {{ }} as shown in the following example.

AqtivaRecordEnvirontmentLastExec: {{AqtivaRecordEnvirontmentLastExec}}<br>AqtivaRecordIdLastExec: {{AqtivaRecordIdLastExec}}<br>AqtivaRecordNumInLastExec: {{AqtivaRecordNumInLastExec}}<br>AqtivaRecordNumOutLastExec: {{AqtivaRecordNumOutLastExec}}<br>AqtivaRecordNumOkLastExec: {{AqtivaRecordNumOkLastExec}}<br>AqtivaRecordNumKoLastExec: {{AqtivaRecordNumKoLastExec}}<br>AqtivaDQILastExec: {{AqtivaDQILastExec}}<br>AqtivaRecordLevelLastExec: {{AqtivaRecordLevelLastExec}}<br>AqtivaRecordMessageLastExec: {{AqtivaRecordMessageLastExec}}<br>AqtivaRuleLastExecDate: {{AqtivaRuleLastExecDate}}
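
A minimal Python sketch of how such a {{ }} template could be rendered; the substitution mechanism shown here is illustrative, not Kerno's actual implementation:

```python
import re

def render_template(template: str, values: dict) -> str:
    """Substitute {{variable}} placeholders, leaving unknown ones untouched."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )

html = render_template(
    "AqtivaDQILastExec: {{AqtivaDQILastExec}}<br>"
    "AqtivaRuleLastExecDate: {{AqtivaRuleLastExecDate}}",
    {"AqtivaDQILastExec": "98.5", "AqtivaRuleLastExecDate": "2024-06-01"},
)
```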


The result will look as follows:

[Screenshot: rendered AqtivaRuleExecutionResultsView output]

Operation

Aqtiva Plugin


The plugin has a batch that will execute according to the specified configuration. The process it executes consists of 3 parts:

Last execution date

The process obtains the last execution date from which to make the request to Aqtiva. To do this, it calls a Tot endpoint that provides this information. If the Tot response is valid, the configured delta, if any, is applied to that date to obtain the date used for the request.

If the date returned by Tot is not correct, either because it has not been set yet or because an error has occurred, the resulting date will be the current date minus one year.
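
The date-resolution logic above can be sketched as follows. Note that the unit of the configured delta (days here) is an assumption for illustration, not confirmed by the source:

```python
from datetime import datetime, timedelta

def resolve_query_date(tot_date, delta_days, now):
    """Pick the date used to query Aqtiva: Tot's stored date minus an
    optional delta, falling back to 'now minus one year' when Tot has
    no valid date (never set, or an error occurred)."""
    if tot_date is None:
        return now - timedelta(days=365)
    if delta_days:
        return tot_date - timedelta(days=delta_days)
    return tot_date

now = datetime(2024, 6, 1)
# Tot returned a date and a 2-day delta is configured:
d1 = resolve_query_date(datetime(2024, 5, 30), 2, now)
# Tot has no stored date yet, so fall back to one year back:
d2 = resolve_query_date(None, 2, now)
```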

Metadata synchronization between Aqtiva and Anjana

It retrieves the live data from Aqtiva from a given date and all the Anjana data related to Aqtiva. This request date for the data must be an old date so that Aqtiva returns all existing live rules. A configuration property has been defined where this date can be set, and if it does not exist, 1-Jan-1970 will be used. Once obtained, it processes the data from both sources to calculate which rules need to be expired, created or modified.

Once all this information has been processed, it will be sent to Anjana for metadata synchronization. If no errors occur, the process moves on to results synchronization.
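
The diff step described above can be sketched as a comparison of two rule maps; the data shapes and field names here are illustrative:

```python
def diff_rules(aqtiva_live: dict, anjana: dict):
    """Classify rules by comparing Aqtiva's live rules with Anjana's.

    Both inputs map rule id -> metadata dict. Returns (to_create,
    to_update, to_expire) as lists of rule ids.
    """
    to_create = [rid for rid in aqtiva_live if rid not in anjana]
    to_update = [rid for rid in aqtiva_live
                 if rid in anjana and anjana[rid] != aqtiva_live[rid]]
    to_expire = [rid for rid in anjana if rid not in aqtiva_live]
    return to_create, to_update, to_expire

live = {"r1": {"name": "nulls"}, "r2": {"name": "range v2"}}
stored = {"r2": {"name": "range"}, "r3": {"name": "old"}}
created, updated, expired = diff_rules(live, stored)
```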

Results synchronization

Using the metadata to be synchronized to determine which rules need to be taken into account, the process requests the results from Aqtiva from the date obtained in the first step.

From the results to apply (excluding those rules that in the metadata synchronization have resulted in expiration), we generate 2 types of results:

  • Datasource results at dataformat level. Grouping by dataformat, datasource and qualityPoint, all rules will have the same value, so the one corresponding to the most recent rule in Aqtiva is synchronized in Anjana.

  • Results at rule level. We add the list of rules to synchronize.
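
The datasource-level grouping described in the first bullet can be sketched as follows (field names are illustrative):

```python
from collections import defaultdict

def datasource_level_results(rule_results):
    """Group rule results by (dataformat, datasource, qualityPoint) and
    keep the value of the most recently executed rule in each group."""
    groups = defaultdict(list)
    for r in rule_results:
        key = (r["dataformat"], r["datasource"], r["quality_point"])
        groups[key].append(r)
    return {key: max(rs, key=lambda r: r["exec_date"])["value"]
            for key, rs in groups.items()}

results = [
    {"dataformat": "df1", "datasource": "ds1", "quality_point": "qp1",
     "exec_date": "2024-05-01", "value": 90.0},
    {"dataformat": "df1", "datasource": "ds1", "quality_point": "qp1",
     "exec_date": "2024-06-01", "value": 95.0},
]
latest = datasource_level_results(results)
```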


If there is any problem, we can check the Aqtiva plugin logs, where the process described here can be clearly identified:

[Screenshot: Aqtiva plugin log output]

Available endpoints

  • POST /api/aqtiva/synchronize/rules. Responsible for carrying out the full Aqtiva metadata synchronization process: first it synchronizes the metadata, updating and expiring objects according to the Anjana and Aqtiva data, and then it synchronizes the results according to that updated data.

  • POST /synchronize/rules/metadata. Responsible for synchronizing the metadata results in Anjana according to the metadata synchronization payload passed.

  • GET /synchronize/metadata. Responsible for synchronizing the metadata in Anjana with the Aqtiva data according to the filter passed (process start date).

Tot

Tot will act as a proxy between the Aqtiva plugin and Kerno. Its only additional work is that, when the results request completes successfully (no exception occurred), it persists the current date and time in the database.

Available endpoints

  • GET /internal/v4/anjana/aqtiva/lastExecution. Responsible for obtaining the last execution of the Aqtiva plugin.

  • POST /internal/v4/anjana/aqtiva/synchronize. Responsible for obtaining the Aqtiva metadata stored in Anjana.

  • PATCH /internal/v4/anjana/aqtiva/synchronize. Responsible for directing the metadata update request to Kerno.

  • /internal/v4/anjana/aqtiva/result. Responsible for directing the request to Kerno to update the results of objects in Anjana related to Aqtiva. It is also responsible for saving the last update date when no error has occurred.

Kerno

Kerno will process the data sent by the plugin according to each process.

Metadata update

In this case it has two functions: on one hand, obtaining the existing metadata in Anjana that is related to some dataformat (that is, those datasets that have a dataformat attribute in their template with a value), and on the other, making that update effective by deprecating objects and creating necessary rules.

  • When datasources need to be expired, the related datasets will be obtained and the corresponding result will be removed from each dataset's result list.

  • When rules need to be expired, the associated datasets will be obtained and the attributes containing the relationship between datasets and rules will be updated:

    • AqtivaQRulesArray of the dataset - the rule to expire is removed

    • AqtivaAnjanaDatasets of the rule is left empty

  • When dataformats need to be expired, the associated datasets will be obtained and the values of certain dataset attributes will be deleted:

    • AqtivaIdDataFormat

    • AqtivaDataFormatName

    • AqtivaKPILastUpdateDate

    • AqtivaDataQualityDatasourceLevel

  • Rules to be expired are set to DEPRECATED status and the expirationDate attribute (expiration date) is assigned the current execution time so that it expires when due (next expiration batch). If this attribute is not assigned in the template, a log will be displayed on the console indicating the issue but the rule will be allowed to be saved as deprecated.

  • For rules to be created, it is checked whether they exist or not. If they exist, the data is updated with the information coming from the plugin; if they do not exist, a new rule is created associated with the corresponding dataset.
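
The upsert-and-expire step described in the bullets above can be sketched as follows (the field names are illustrative, not Anjana's real data model):

```python
from datetime import datetime

def apply_metadata_update(anjana_rules, incoming, to_expire, now):
    """Upsert incoming rules and deprecate the expired ones."""
    for rid, meta in incoming.items():
        if rid in anjana_rules:
            anjana_rules[rid].update(meta)   # existing rule: refresh its data
        else:
            anjana_rules[rid] = dict(meta)   # new rule: create it
    for rid in to_expire:
        rule = anjana_rules.get(rid)
        if rule is None:
            continue
        rule["status"] = "DEPRECATED"
        # expirationDate lets the next expiration batch pick the rule up.
        rule["expirationDate"] = now.isoformat()
    return anjana_rules

out = apply_metadata_update(
    {"r1": {"name": "old"}, "r3": {"name": "stale"}},
    {"r1": {"name": "new"}, "r2": {"name": "fresh"}},
    ["r3"],
    datetime(2024, 6, 1),
)
```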

Results update

  • Results are processed at datasource level and at rule level. For datasources, the datasets associated with the dataformats to be updated are obtained and the various values passed are added to the results attribute. This process adds or modifies, it does not need to delete any.

  • Results are processed at rule level. For each rule, the results are grouped by dataformat and rule identifier, showing the list of execution results for that rule. One particularity of these results is that one attribute holds the "technical" value of the results while another holds their visual representation. This process adds or modifies results; it does not need to delete any.


If there is any problem, we can check the Kerno logs where the process described here can be clearly identified:


[Screenshots: Kerno log output]

Available endpoints

  • POST /internal/v4/metadata/aqtiva/synchronize. Responsible for obtaining the metadata related to Aqtiva that exists in Anjana. A filter designed to support pagination can be passed, although it is not used in this version; it has been created as a base for future developments if needed.

  • PATCH /internal/v4/metadata/aqtiva/synchronize. Responsible for performing the synchronization of Aqtiva metadata in Anjana according to the values passed, creating the necessary rules, associating them with datasets and deprecating the necessary objects.

  • PATCH /internal/v4/metadata/aqtiva/result. Responsible for updating the results (at datasource level in datasets and at different execution levels in rules) of the existing Aqtiva metadata in Anjana according to the values passed.