Universal Data Adapter (UDA) in FDMEE 11.2.2
The Universal Data Adapter lets you integrate data directly from external source tables or views (for example, SQL tables) into FDMEE, bypassing the open interface table. It is useful whenever FDMEE has access to the underlying source database but no prepackaged adapter is available, or the prepackaged adapter does not provide the required data.
Here's the main difference between the UDA and the Open Interface Adapter:
- Open Interface Adapter: We use FDMEE table AIF_OPEN_INTERFACE as a staging table between our source DB and the FDMEE staging table (TDATASEG_T).
- Universal Data Adapter: We bypass the Open Interface table so we have direct integration between our source DB and TDATASEG_T (i.e. FDMEE).
Check the below diagram for the data flow:
UDA can be thought of as a way of extracting data from any data source whose underlying database FDMEE can access.
FDMEE uses Oracle Data Integrator (ODI) as its underlying ELT (Extract-Load-Transform) tool, and the UDA is no exception. The UDA is essentially a source adapter in FDMEE composed of:
- ODI: Universal data adapters depend on Oracle Data Integrator (ODI) to import, transform, validate and export data into target applications. As such, you need to specify how the data is seeded to FDMEE in ODI.
- Source columns (available in the import format)
- Source filters (used to filter source data)
- Drill-through URL (we can navigate to source system generating the data)
To set up the UDA, we have to perform a few configurations in ODI Topology in order to define the physical connection to our DB server/database. At a high level, we need to perform the following steps in ODI Topology:
- Create Data Server for our source technology.
- Create a Physical Schema for the new data server.
- Create a new Logical Schema for our source technology.
- Use Global context or create a new one to map the physical schema to the logical schema.
These can be done from both ODI Console and ODI Studio (ODI Console is web-based and is automatically deployed when FDMEE is installed; ODI Studio has to be installed manually). NOTE: You don't need a full ODI license for this. You can use the ODI Restricted Use License (RUL) for FDMEE purposes, including customization of source adapters.
Before we dive right in, below is how we map Database and ODI objects:
Also, below two concepts are very important in this topic:
Logical Schema: This is an ODI object which groups similar physical schemas. This is a key component of the UDA, and it MUST have the following names:
Context: Resolves logical schemas into physical schemas at run-time. By default, FDMEE uses only one context called GLOBAL. But you can define others as well.
Let's create a DATA SERVER now.
Login to ODI Console - Master Repository. Expand 'Topology' on the left panel under 'Browse'.
Expand 'Physical Architecture'.
Then expand 'Technologies'. Since I'm using an Oracle database, I will look for 'Oracle'.
If you're using MSSQL, then you need to look for 'Microsoft SQL Server'.
Back to Oracle now. Right click on 'Oracle' and select 'Create':
You'll get the below window:
Give a data server name and click on 'Next'.
You'll get the below screen:
JDBC URL: Please make sure you give the URLs in the formats below:
- Format for Oracle Database with SID: dbURL=jdbc:oracle:thin:@hostName:port:SID
- Format for Oracle Database with service name: dbURL=jdbc:oracle:thin:@hostName:port/serviceName
- Format for Microsoft SQL Server, using the name of the database you already created: dbURL=jdbc:weblogic:sqlserver://hostName:port;databaseName=databaseName
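To make the formats concrete, here is a small sketch that assembles the three URL shapes. The host, port, SID, service name, and database name are hypothetical placeholders — substitute your own connection details:

```python
# Hypothetical connection details for illustration only.
host = "fdmeehost.example.com"

# Oracle with SID (note the ':' before the SID)
oracle_sid_url = f"jdbc:oracle:thin:@{host}:1521:ORCL"

# Oracle with service name (note the '/' before the service name)
oracle_service_url = f"jdbc:oracle:thin:@{host}:1521/ORCLPDB1"

# Microsoft SQL Server (WebLogic driver URL shape shown in the doc)
mssql_url = f"jdbc:weblogic:sqlserver://{host}:1433;databaseName=SourceDB"

print(oracle_sid_url)
print(oracle_service_url)
print(mssql_url)
```

The only syntactic difference between the two Oracle forms is the separator before the SID (`:`) versus the service name (`/`) — an easy mistake that produces a connection failure in the agent test later on.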
USER/JDBC Password: Details of the source schema.
Click 'Next'.
Keep the defaults and click on 'Create'.
Data Server is now created.
Test connection with the agent:
Connection is successful:
Next, let's create a PHYSICAL SCHEMA.
Right click on 'FDMEE_UDA_ORACLE' and select 'Create'. You will get the below screen:
Give the schema name and leave all others as default. Click on 'Save'. Physical schema is created:
From the back-end table:
Next, we'll create a LOGICAL SCHEMA.
Go to 'Logical Architecture', expand 'Technologies':
Look for 'Oracle':
Right click on 'Oracle' and click on 'Create':
You'll get the below window:
Since my source is Oracle, I will name it as 'UDA_ORCL'.
Click on 'Save'. Logical schema is created:
Edit the logical schema. You'll get the below screen:
Select the physical schema that we created. I went ahead with 'Global' context.
Click on 'Save'.
From the back-end table:
Please make sure the below exists:
"Universal Data Adapter Model" folder: Check this in ODI work repository under 'Model Folders':
"Oracle Adapter Project": Check this in ODI work repository under 'Projects':
We are now ready to start configuring UDA in FDMEE.
Adding a SOURCE SYSTEM.
In the same way we do with the Open Interface Adapter, we need to configure a source system in FDMEE. Go to FDMEE, on the Setup tab, under Register, select Source System. Click Add.
Below are the available source system types:
ODI Context Code: Specify the context code defined in ODI for the connection to the specific instance. A context groups the source and target connection information (links our source system with the physical architecture defined in ODI). Technically, at run-time, ODI points to the data server/physical schema mapped to specific logical schema (based on our source type) through the context set in FDMEE.
If you have only one source system of each type, you would typically use the GLOBAL context, which is FDMEE's default. When data is sourced from another instance of the same seeded technology (Oracle, MSSQL, Teradata, DB2), register another source system of the same technology type with a different context.
Save it; the new source system should appear in the summary pane.
It's recorded in the AIF_SOURCE_SYSTEMS table:
Creating the SOURCE ADAPTER.
A source adapter is an integration framework in FDMEE that enables you to extract data from the source system in a flexible and customizable manner. When you configure Oracle Data Integrator with a universal source adapter, ODI extracts the data directly into the temporary table used for transforming data from source to target during a data rule execution (TDATASEG_T).
As another step in implementing the Universal Data Adapter feature, create a definition that specifies how data is populated in FDMEE from the source table through ODI.
On the Setup tab, under Register, select Source Adapter. Click Add.
Adapter Key: Enter a user defined identifier for the adapter.
Adapter Name: Enter a user defined name for the adapter.
Source System Type: Select the technology type. Available choices are:
Based on the source system type, the following fields are populated automatically:
ODI Package Name (ORCL Balances) & ODI Project Code (AIF_ORACLE)
Table Name: Specify the source table name.
Click Save. The source adapter will be saved:
Once we have created the new source adapter, we need to:
1. Bring table definition into FDMEE so we can configure filters and import format.
2. Generate the ODI package which will extract data from the table and import into TDATASEG_T.
Let's import the table definition.
On the Setup tab, under Register, select Source Adapter. Select the universal data adapter. Click Import Table Definition to return all column details from the source table.
When importing the columns, we need to select the source system name so ODI can go to the physical table and reverse-engineer it.
You should get a success message like this:
Once they are imported, we have to classify some columns and optionally type a user-friendly name for each column. This is the name that we will see in the import format configuration.
Note: When using a view as a source for the Universal Data Adapter, don't include column names that are reserved words for the selected technology. For example, the word "Year" is a reserved word for Teradata and should not be used in a view.
Based on the column type, from Classification, select the classification:
For example, select:
Amount (mandatory): So FDMEE knows which is the column imported as amount.
Year (optional): Used to filter data if DLR is configured with period type "explicit".
Period (optional): Used to filter data if DLR is configured with period type "explicit".
Period Number (optional): Used to filter data if DLR is configured with period type "explicit".
If we assign only Amount, then we can only use period type "None" in the data load rules. In other words, FDMEE assumes that the content of the table/view holds data only for the period being processed. If we also classify the period columns, then we can use period type "Explicit", which passes the values configured in the source period mappings as extract filters.
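The difference between the two period types can be sketched conceptually: with "None" no period predicate is added to the extract, while with "Explicit" the source period mapping values become filters on the classified columns. The column names below are hypothetical — the real names come from your own column classification:

```python
def build_period_filter(period_type, mapping=None):
    """Conceptual sketch of the extract filter per period mapping type.

    'None'     -> no filter; the table/view is assumed to hold only the
                  current period's data.
    'Explicit' -> the source period mapping values are passed as filters
                  on the classified Year/Period/Period Number columns.
    """
    if period_type == "None":
        return ""
    # Hypothetical column names for illustration; actual names come from
    # the columns you classified in the source adapter.
    return (f"WHERE YEAR = '{mapping['year']}' "
            f"AND PERIOD = '{mapping['period']}' "
            f"AND PERIOD_NUM = {mapping['period_num']}")

print(build_period_filter("None"))
print(build_period_filter("Explicit",
                          {"year": "2023", "period": "Jan", "period_num": 1}))
```

This is only a mental model of what the generated ODI interface does with the period mapping values, not FDMEE's actual implementation.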
The source adapter is stored in the AIF_SRC_ADAPTERS table:
💡 What actually happens when you hit the 'Import Table Definition' button?
If we navigate to ODI work repository and check the sessions, you will see that an ODI process was executed.
This process reverse-engineers the table, which means it creates a datastore in ODI for that table.
You can see the details in the session steps:
Then this datastore will be used in the ODI interface as a source datastore.
Let's define the parameters now.
Use the Parameters tab to specify the list of parameters (filter) for the universal data adapter.
In a typical data extract from a database we usually use filters. Consider a SQL query like this:
SELECT A, B FROM TABLE WHERE A = 'EPM'
We are using a filter on column A so only records having A with value "EPM" will be extracted.
The same applies to the UDA. We can define filters on any column of the source adapter. These parameters will be available as text boxes in the data load rule.
On the Setup tab, under Register, select Source Adapter. Select the universal data adapter. In the details section, select the Parameters tab. From the Parameters task bar, select Add. Entry fields are displayed for the Parameter Name, Parameter Data Type, Condition, Column Name, Default Value, and Parameter Prompt.
Fill the details as shown below:
Parameter Name: Enter the parameter name (in order to be aligned with other source adapters, I used the following naming convention for parameter names - 'p_period').
Parameter Data Type: Select the data type of the parameter.
Available data types:
Condition: Specify the type of the parameter:
- Explicit: You are prompted to provide an explicit value in the Data Rule, which is matched exactly in the source table to pull data. (WHERE COLUMN = 'x')
- Between: You are prompted to provide two values in the Data Rule, and this range of values is matched in the source table to pull data. (WHERE COLUMN BETWEEN 'x' AND 'y')
- In: You are prompted to provide multiple values in the Data Rule, and these values are matched in the source table to pull data. (WHERE COLUMN IN ('x','y'))
- Like: You are prompted to provide a string in the Data Rule. Values starting with the string are matched in the source table to pull data. (WHERE COLUMN LIKE 'x%')
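The four condition types above map one-to-one onto SQL predicates. As a quick sketch (the column names and values are just examples, not anything FDMEE generates verbatim):

```python
def where_fragment(condition, column, values):
    """Sketch of how each parameter condition type maps to a SQL predicate."""
    if condition == "Explicit":
        return f"{column} = '{values[0]}'"
    if condition == "Between":
        return f"{column} BETWEEN '{values[0]}' AND '{values[1]}'"
    if condition == "In":
        quoted = ",".join(f"'{v}'" for v in values)
        return f"{column} IN ({quoted})"
    if condition == "Like":
        return f"{column} LIKE '{values[0]}%'"
    raise ValueError(f"unknown condition: {condition}")

print(where_fragment("Explicit", "PERIOD", ["Jan"]))
print(where_fragment("Between", "PERIOD_NUM", ["1", "3"]))
print(where_fragment("In", "ENTITY", ["India", "Spain"]))
print(where_fragment("Like", "ACCOUNT", ["4"]))
```

Whatever you type in the Data Load Rule's filter text box ends up on the right-hand side of one of these predicates in the extract query.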
Column Name: Enter the column name to use as a filter.
In Default Value, enter the value to default on the Data Load Rule screen.
When entering an "Explicit" value, use the 'X','XX','XXX' format.
When entering a "Between" value, use the 'X' and 'XX' format.
Parameter Prompt: Enter the label to display for the parameter on the Data Rule screen.
When a new parameter is added and you are in a multi-language environment, then set the browser locale to the appropriate language and edit the prompts as needed in that language.
Click Save.
Here, I have created just one parameter.
It gets saved in the AIF_SRC_ADAPTER_PARAMETERS table:
Once ODI has the datastore for our table/view (imported after clicking "Import Table Definition"), it needs to generate the package with an interface to pull data from our source into FDMEE staging table TDATASEG_T.
Click Generate Template Package.
You should get a confirmation:
After the ODI objects have been generated we can see that the ODI package has been updated in the source adapter accordingly:
The package has been generated, and it contains the interface that pulls the data. The ODI scenario that we later generate from the import format is built from this package.
Let's get to Drill URL.
FDMEE provides a framework for using URLs for drill through. You click the hyperlink in the amount cell, and the source system is launched in a new EPM Workspace tab or a new window. Multiple drill through URLs are provided to drill to different pages in the source system based on the import format mapping.
On the Setup tab, under Register, select Source Adapter. In Source Adapter, select the Drill URL tab. On the Drill URL task bar, select Add. Entry fields are displayed for the Drill URL Name, Drill URL, Request Method, and Drill URL Prompt columns.
Drill URL Name: Enter a user defined name for the drill through URL.
Drill URL: Enter the URL used for the drill through.
Enter the URL without the server and port information. The URL must contain the parameter name and column name from the TDATASEG table enclosed in the symbol $.
For example, enter: ENTITY=$ATTR1$&VERSION=$ATTR2$.
In the above example the value of ATTR1 is passed as a value for the ENTITY parameter, and ATTR2 is passed as the value for the VERSION parameter. Parameters are separated by the "&" character.
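The $-token substitution described above can be sketched in a few lines. This is an illustration of the mechanism, not FDMEE's actual code; the row values are hypothetical:

```python
import re

def resolve_drill_url(template, row):
    """Replace each $COLUMN$ token in a drill URL template with the
    corresponding value from a TDATASEG row (here, a plain dict)."""
    return re.sub(r"\$(\w+)\$", lambda m: str(row[m.group(1)]), template)

# Hypothetical TDATASEG row values for illustration.
row = {"ATTR1": "India", "ATTR2": "Working"}
print(resolve_drill_url("ENTITY=$ATTR1$&VERSION=$ATTR2$", row))
# ENTITY=India&VERSION=Working
```

At drill time, FDMEE prepends the server and port registered for the source system, which is why the template itself omits them.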
To specify the request-response between a client and server for the drill URL format, enter either:
- GET: Form data is encoded into the URL. If no method is specified, then GET is the assumed request-response.
- POST: Form data is displayed in the message body.
Drill URL Prompt: Enter a user-defined prompt for the drill-through link. For example, enter Default.
Next is IMPORT FORMAT.
The import format is where we will link our source columns (imported from the source adapter page) with our target dimensions. Create a new import format.
Give a name & description to your import format. Select the relevant source, target and source adapter. Drill URL will be automatically populated once you select the source adapter. Click 'Save'.
Let's come to the import format mappings.
Source columns: Columns available in the UDA. To make the column names understandable for anyone, the import format shows the "Display Name" property set up in the source adapter.
Target Dimensions: By default, you will see the list of dimensions of your target application.
If you need to concatenate multiple source columns into one target dimension, you can add the target dimension again. Source columns assigned to the same target dimension are concatenated using the concatenation character set in the import format definition.
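The concatenation behavior is simple to picture: the values of every source column mapped to the same target dimension are joined with the import format's concatenation character. A minimal sketch, with hypothetical column names and an assumed "_" concatenation character:

```python
def concat_source_columns(row, columns, concat_char="_"):
    """Sketch: source columns mapped to the same target dimension are
    joined with the concatenation character from the import format."""
    return concat_char.join(str(row[c]) for c in columns)

# Hypothetical source row: two columns both mapped to the Entity dimension.
row = {"COMPANY": "ES", "COST_CENTER": "1000"}
print(concat_source_columns(row, ["COMPANY", "COST_CENTER"]))
# ES_1000
```

The concatenated value is what lands in the target dimension column of TDATASEG_T, so your data load mappings must be written against the joined string.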
Once mapping is done, we need to regenerate the ODI scenario.
Whenever we create a new import format or change the column mappings, we need to regenerate the ODI scenario (FDMEE executes this scenario when you run the DLR).
If everything runs fine, we will get a success message.
At times, you might get an error like this:
Check 'aif-WebApp.log' for details on the error. In fact, if the status shows a green tick, the ODI scenario was generated successfully despite the error message.
If you have a green tick but 'Regenerate ODI Scenario' still throws an error, try regenerating it again.
Next is creating a LOCATION.
Create a new location & assign the import format.
I can see it in the TPOVPARTITION table:
PERIOD MAPPING
The source period mappings define the values for the Period, Period Number, and Year columns. These values are passed as parameters when we set our DLR to the explicit period mapping type. We can define multiple calendars and configure adjustment periods if needed. Source period mappings are defined at the source system level; therefore, if we have more than one table for the same source system and they use different calendars, we have to configure multiple calendars. Calendars are then assigned to the data load rules.
Make sure you've selected the correct 'Source System'.
You should have your DATA LOAD MAPPING ready. Check my older blog for details on this section.
Next, we come to DATA LOAD RULE.
Create a new data load rule.
Make sure you have selected the correct location. Give a name and description to your DLR.
Period Mapping Type: Keep it as 'Explicit' since we will use the source period mapping for the selected calendar.
Calendar: Choose the calendar you defined in the period mapping.
In the Source Options, I can see the filter for Period. Default value is Jan, as I mentioned in the parameter in the source adapter.
Save it. Click on 'Execute'. In this step, I will be just importing the data from source to FDMEE.
Click on 'Run'.
Now, let's check Process Details. It shows as successful:
Below are the process steps:
Let's head over to WORKBENCH.
My source data has been successfully imported and filtered (showing only 8 rows).
Next, you can export the data by clicking on 'Export'.
Click on 'OK'.
Process details:
Process steps:
This was my source table: ORACLE_SOURCE_DATA
Please note: Without the filter, I would get all 16 rows imported:
💡 Let me show a different example of a filter. As you can see in the table above, 8 rows have ENTITY = India. Please see below:
I'll add a filter in the source adapter on ENTITY.
Save it & click on 'Generate Template Package'.
Next, go to Data Load Rule. You'll see the new filter in Source Options:
Execute the DLR as shown below:
Click on 'Run'.
Process details:
Process Steps:
In Workbench, we have the exact 8 rows as filtered in the source table:
That's all! Hope it helps.