
English Book on Master Data Management Available
Markus Ganser


In the MDM discussion forum on SDN, people have continuously been asking for additional information on SAP NetWeaver Master Data Management. The good news is that SAP Press has now released the English version of the SAP NetWeaver MDM book.

The book describes the "MDM technology, architecture, and solution landscape", provides a "detailed technical description of all three usage scenarios", and "includes highly-detailed, proven guidance from real-life customer examples".

Ordering from the US:
SAP NetWeaver Master Data Management
Ordering from Europe:
SAP NetWeaver Master Data Management

Best regards,

Markus

Markus Ganser joins the product management team of SAP Master Data Management


Strategic Data Services for SAP NetWeaver MDM
Markus Ganser

On behalf of the SAP field services and SAP development teams involved with SAP NetWeaver Master Data Management (MDM) I'd like to announce the availability of SAP Strategic Data Services (SDS) for SAP NetWeaver MDM. This offering comprises complementary high-quality customer services all along the life cycle of an MDM implementation project. SDS is staffed with MDM architects and SAP experts from consulting and development.

End-to-end implementation activities include:

  • Data-quality analysis
  • Project scoping, planning and estimation
  • Master-data modeling and taxonomy creation
  • Implementation of business rules and matching strategies to enable data consolidation
  • Development and roll-out of data, data-process governance and methodologies
  • Services in data cleansing, validation, enrichment and classification, including certified third party data-quality services
  • Definition, design and configuration of MDM system architecture
  • Tuning and optimization of MDM performance
  • Custom API and SOA development services

If you are interested in SAP NetWeaver MDM, or have already purchased it and would like to move on to the next steps, SDS could be a great opportunity to engage. The SDS front office is staffed with MDM-experienced consultants from SAP field services, while the SDS back office is part of the MDM development team. This combined approach makes SDS a service provider with vast, first-hand experience with MDM and data issues. In addition, SDS enables you to engage directly with MDM-certified business information providers and data-quality partners. The team is ready to help you make your MDM and data projects a sustained success.

See also the SDS overview graphic:

image

For detailed information, send an email to Ronen.Liper@SAP.com and Hermann.Reiter@sap.com.



OS/DB Migration
Naveen kumar.H

You have an SAP NW04 system with SP19 running in your landscape, and suddenly your manager requests that you set up one more system with the same release and SP level, needed for training purposes within 2 hours. You are perplexed! You know that installing a system and patching it would take at least 4-5 hours depending on your hardware. What else can you do?


SAP has an answer for this kind of situation, and in fact for more than the above scenario: a system copy, or OS/DB migration. It covers not only this case; if you have to install on another OS/DB combination, you can do that too, within 1-2 hours. Let us discover the various options SAP provides for this migration.


To begin with, let us be clear about some of the system copy terminology.

Source system
The SAP system containing the original database is called the source system.


Target system
The system into which the database is imported is called the target system.


Homogeneous system copy
If you use the same operating system and database platform in the target system as in the source system, the system copy is called a homogeneous system copy.


Heterogeneous system copy
If you change the database, the operating system, or both in the target system compared to the source, the system copy is called a heterogeneous system copy.


Export CD
When you perform a system copy on the source system, you are prompted for the location of the folder where SAPINST will dump the database and the SDMKIT.JAR file. You create this empty folder manually; after a successful export you will find the database and the SDM archived in it. You then copy this folder (now referred to as the export CD) to the desired location and start the installation on the target system. SAPINST will prompt you for the location of the export CD, and you provide the path where you placed it.


NOTE: SAP strongly recommends performing an OS/DB migration only if you have strong knowledge of the OS, the database, and other aspects. Perform a heterogeneous system copy only if you are a certified SAP consultant.


You can find all related documents on OS/DB migration at the following location:

http://service.sap.com/systemcopy


First of all, let me start with the NW04 OS/DB migration; I will concentrate mainly on the AS Java OS/DB migration.
SAP provides two different system copy procedures:

Database-specific system copy

This procedure can be used in both homogeneous and heterogeneous system copy scenarios. It means that you back up the database separately, while SAPINST archives only the SDMKIT.JAR. You then restore the database manually on the target system, and SAPINST restores only the SDM. To elucidate:
Assume you have to perform a system copy of a system running on Windows and SAPDB. First you take a backup of the database instance using the SAPDB manager, then you run SAPINST, which archives only the SDM.
If the target system is also Windows and SAPDB, you can restore the database and SAPINST will restore the SDM. But if the target system is a Windows and SQL Server combination, I am afraid the database cannot be restored. The possible solution in that case is the database-independent system copy.



Database-independent system copy

This procedure is used only in heterogeneous system copy scenarios. From the scenario above it is clear that if the database changes, a database-specific system copy cannot be restored. Hence SAP suggests this procedure, in which the database is archived irrespective of the database type.
If your source system runs on Windows and SAPDB and you choose the database-independent system copy, you need not perform a backup; instead SAPINST archives the database as well as the SDM using the Jload and R3load methods. So no matter whether your target system is Windows/Oracle or Linux/Oracle, you can perform your OS/DB migration.


Before you start the OS/DB migration, check the compatibility of the source and target systems. You can find the matrix at the link below:

http://help.sap.com/saphelp_erp2005vp/helpdata/en/21/692225b246b24f9202cb4433b5d691/frameset.htm


There is a slight difference between executing an NW04 system copy and an NW04s system copy. You can only execute an NW04 system copy by calling SAPINST with an additional parameter, i.e.

./sapinst product_cp.catalog


Kindly refer to the system copy guide for more information.

SAP insists that you use the patch DVDs for the NW04 system copy rather than the normal SP9 DVDs.


When you perform a heterogeneous system copy, you will be prompted to enter a migration key. To obtain the key, kindly raise a customer message under component BC-INS, providing the OS/DB details of the source and target systems as well as the SOURCE.PROPERTIES file that is created when you create the export CD.


When you want to perform an NW04s system copy, you just execute SAPINST, preferably using the latest SR2 DVDs.


Now let me discuss some known bugs during system copy and the related CSN messages and Notes.


To start with, before performing an NW04 system copy, note the prerequisites described in Notes 785848 and 711093. You can avoid many errors by referring to these notes. Believe me!


One common error while executing the NW04 system copy on AS Java + EP using the NW04 SP9 DVDs is “Portal version not found”. This occurs because SAPINST looks for the version in a folder structure where version.txt is not available, so you have to copy it manually from a different folder to the expected one. Kindly refer to Note 847491.


During an NW04 homogeneous system copy using the SP9 DVDs, SAPINST prompts for a migration key. This is also a bug, as the migration key is only needed for a heterogeneous system copy. Hence SAP insists on the patch DVDs, where most of these bugs have been fixed.


If you execute the NW04 system copy of portal components using the SP9 DVDs, you will encounter errors while installing on the target system. The common error occurs during the SDM installation step, where SAPINST errors out. Kindly refer to Note 755132.


After a successful installation on the target system, the Visual Administrator may not come up. Refer to Note 937708.


This one is the most interesting: the Java bug. After a successful installation on the target system, the server0 node does not come up; it keeps fluctuating between “starting” and “starting framework”. Don’t worry, refer to CSN message 1458040 2005, or to Sun’s bug report:

http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=2121269



These are the common bugs and fixes that I have encountered. I hope this blog solves most of your problems during OS/DB migration.

Naveen kumar.H is a budding NetWeaver XI technology consultant.


Data Modelling and Database Design in ABAP – Part 4
Tobias Trapp

Let’s Start an Experiment

In the last installments of this weblog series I dealt with semantic data models, SERM, and the SAP Data Modeler. Now I cover a completely different topic. Let’s start with an experiment and create a transparent table:

image

Then we use transaction SE16 to enter some data:

image

Now we add another column and activate the transparent table:

image

Now here’s an effect that might surprise you: every search in SE16 for a CONN_ID value won’t find anything:

image

But what went wrong? Let’s switch on the flag “Initial Values”:

image

And finally after activation we get the expected result:

image

What happened? When we appended the column, the new fields were filled with NULL values, which do not match the expression SELECT COUNT( * ) FROM ZTEST WHERE CONN_ID EQ SPACE. Let’s have a look at the F1 documentation of the “Initial Values” flag in detail:

Indicator that NOT NULL is forced for this field

Select this flag if a field to be inserted in the database is to be filled with initial values. The initial value used depends on the data type of the field. Please note that fields in the database for which this flag is not set can also be filled with initial values. When you create a table, all fields of the table can be defined as NOT NULL and filled with an initial value. The same applies when converting the table. Only when new fields are added or inserted are these filled with initial values. An exception is key fields; these are always filled automatically with initial values.

Restrictions and notes:

  • The initial value cannot be set for fields of data types LCHR, LRAW, and RAW. If the field length is greater than 32, the initial flag cannot be set for fields of data type NUMC.
  • If a new field is inserted in the table and the initial flag is set, the complete table is scanned on activation and an UPDATE is made to the new field. This can be very time-consuming.
  • If the initial flag is set for an included structure, this means that the attributes from the structure are transferred. That is, exactly those fields which are marked as initial in the definition have this attribute in the table as well.

What are NULL Values?

There are different semantics for NULL values in database tables:

  • unknown value (there is a value but we don’t know it),
  • not existing value (the attribute does not apply),
  • missing information (we don’t even know whether a value exists).

The SQL standard defines some rules for NULL values:

  1. You can’t insert a NULL value into a column that is defined NOT NULL.
  2. The result of a comparison between two NULL values is not true; you have to use the predicates IS NULL and IS NOT NULL instead.
  3. NULL values are ignored by the aggregate functions MAX, AVG, and SUM.
  4. When grouping with GROUP BY, NULL values are collected in a result row of their own.
  5. In joins, NULL values are subject to the special rules for outer joins.

In fact, the above rules apply to ABAP as well. I suggest reading the details in transaction ABAPHELP.
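The rules above are easy to verify against any SQL database. Here is a minimal sketch using Python's built-in sqlite3 module (not ABAP Open SQL, but the standard NULL semantics are the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE flights (carrid TEXT, price REAL)")
cur.executemany("INSERT INTO flights VALUES (?, ?)",
                [("LH", 400.0), ("AA", None), ("UA", 600.0)])

# Rule 2: a comparison with NULL is never true; use IS NULL instead
cur.execute("SELECT COUNT(*) FROM flights WHERE price = NULL")
print(cur.fetchone()[0])        # 0 -- '=' never matches a NULL
cur.execute("SELECT COUNT(*) FROM flights WHERE price IS NULL")
print(cur.fetchone()[0])        # 1

# Rule 3: aggregate functions skip NULL values
cur.execute("SELECT AVG(price) FROM flights")
print(cur.fetchone()[0])        # 500.0 -- average of 400 and 600 only
```

The AVG result shows that the NULL row is simply left out of the aggregation rather than treated as zero.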

What are NULL Values Used For?

We already saw how to create NULL values in a database table: append a column with the “Initial Values” flag switched off. But there is another possibility: insert a new row through a view that does not cover a column whose “Initial Values” flag is switched off. And even if you don’t have any NULL values in two transparent tables, it is easy to create NULL values with a left outer join, which is left as an easy exercise for the reader.

In fact, NULL values are not very useful, because there is no NULL in ABAP and it is difficult to set a database field to NULL. Usually we use work areas (or internal tables) to update a transparent table; but if we select a row into a work area, the NULL value is converted to an initial value, and after the update the NULL value is lost.

But there is one interesting application for NULL values: if we need post-processing after appending a column to a transparent table, to calculate values for the new fields, it is very useful to be able to distinguish between new fields (still NULL) and already calculated fields that happen to contain initial values.
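The effect from the experiment at the start of this weblog can be reproduced outside ABAP as well. A minimal sketch with Python's sqlite3 module, reusing the column names from the ZTEST example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE ztest (carrid TEXT PRIMARY KEY)")
cur.execute("INSERT INTO ztest VALUES ('LH')")

# Adding a column without a default fills existing rows with NULL,
# analogous to activating an appended field without "Initial Values"
cur.execute("ALTER TABLE ztest ADD COLUMN conn_id TEXT")

cur.execute("SELECT COUNT(*) FROM ztest WHERE conn_id = ''")
print(cur.fetchone()[0])   # 0 -- NULL does not equal the initial value ''
cur.execute("SELECT COUNT(*) FROM ztest WHERE conn_id IS NULL")
print(cur.fetchone()[0])   # 1 -- NULL marks the not-yet-processed row
```

A post-processing job could thus select exactly the rows WHERE conn_id IS NULL, calculate the new values, and update them.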

Tobias Trapp is an SAP Mentor and developer at AOK Systems GmbH.


Welcome to the SAP NetWeaver MDM Customer Survey
Markus Ganser

Welcome to the MDM Customer Survey 2007

image

We'd like to ask you, our SAP NetWeaver MDM customers and prospects, to take twenty minutes to complete our questionnaire asking specific questions about how you deploy and use MDM in your company. We'd like to get a clear picture of how you use or plan to use SAP NetWeaver MDM at your site now and in the future.

Your feedback will be analyzed by our product management team and will ultimately shape our product investment and development plans over the next few years.

The survey is open for MDM customers and prospects from March 16 - 30, 2007.

All participating MDM customers will have the chance to win an iPod and will be rewarded with 20 SDN points for their feedback.

We'd like to point out that the obtained survey data will not be used for any marketing activities or other external purposes.

Although the survey is self-explanatory, I'd like to provide some basic navigation hints:

  • To complete the questionnaire, click SAP NetWeaver MDM Customer Survey 2007.
  • Read the introduction first.
  • Then, click "OK" on the box with the introductory text, and complete the questions.
  • If you'd like to add a comment to a question, you may drag and drop the relevant feedback icon from the upper left onto the corresponding question and enter your point.
  • To scroll on the page, click the "More" button at the bottom.
  • To switch to the next page, click "Next" at the bottom of page 1.
  • When you have completed the questionnaire, click the "Next" button on page 2.
  • You'll be directed to an SDN page to be rewarded with 20 SDN points.

Thanks in advance for your contribution!

SAP NetWeaver MDM Product Management team



Data Federator Metadata Integrator for Metadata Manager XI Release 2
Fred Vanborre

The Metadata manager and Data Federator teams are proud to announce the release of the new Data Federator Metadata Integrator for Metadata Manager.

The Data Federator Metadata Integrator further enhances the Business Objects metadata management offering with the following Data Federator metadata:

  • Deployed versions of projects
  • Catalogs
  • Schemas
  • Datasources
  • Target tables
  • Mapping rules

This comprehensive support enables end-to-end data lineage and impact analysis features for enterprises deploying BI projects based on Data Federator multi-source data foundations.

Fred Vanborre


MDM Java API 2 - Matching Example
Lars Rueter SAP Employee

Description

You want to use the matching capabilities of the MDM Java API 2. The application presented here demonstrates how easy it is to select and execute matching strategies from your own applications using the MDM Java API 2. The included Java source code can be used as an example for further customer developments. Please use the link above to download the application and the source code.

NOTE: You need SAP NetWeaver MDM 5.5 SP05 for this application. SAP NetWeaver MDM 5.5 SP05 Patch 0 is available for customers and partners under the terms specified in SAP Note 1025897.

Disclaimer

SAP code or applications samples and tutorials are NOT FOR PRODUCTION USE unless specifically noted. You may not demonstrate, test, examine, evaluate or otherwise use them in a live operating environment or with data that has not been sufficiently backed up. You may not rent, lease, lend, or resell SAP code or application samples and tutorials.

Prerequisites

  • SAP NetWeaver 2004s
  • MDM Server 5.5 SP5
  • MDMJAVAAPI05_0.sca (MDM Java API, download from Service Marketplace)

Configuration

Only some basic steps are required to deploy and run the application:

  • Unarchive and mount the MDM repository Vendor_Matching_Example.a2a
  • Deploy MDMJAVAAPI05_0.sca (MDM Java API) with SDM
  • Deploy the Web Dynpro application sap.com~sap.mdm.matching.example.ear with SDM
  • Optional: Define additional matching rules and strategies.

The application can be configured to run with any MDM repository. Repository-specific connection properties are defined in the Visual Administrator. The configuration entries Field_* have to match a main-table field code in your MDM repository. The entries Field_*_lbl are human-readable display names for these fields.

For example, suppose your main table has a field with the field code Name. If you want to use this field for matching, set Field_1 to Name in the configuration. Also choose a display name for this field that will be shown above the input field in the Web Dynpro application; for instance, set Field_1_lbl to Company Name.
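As a purely hypothetical illustration (the entry names follow the Field_*/Field_*_lbl pattern described above; the key=value notation and the second field are assumptions based on the City field used in the example run below), the configuration might look like:

```
Field_1=Name
Field_1_lbl=Company Name
Field_2=City
Field_2_lbl=City
```

With entries like these, the Web Dynpro application would render input fields labeled Company Name and City, matched against the Name and City field codes of the main table.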


image

Run the application

http://localhost:53000/webdynpro/dispatcher/sap.com/sap.mdm.matching.example/Test

For example, enter Vishay Siliconix in the Name field and Shelton in the City field, then execute the strategy to find matching entries in the repository.


image

Lars Rueter is a Senior SAP NetWeaver RIG Consultant at SAP.


R/3-XI-MDM (Outbound Scenario)
SUBBAIAH GORLA BALA

This article gives a brief overview of how to integrate SAP R/3 and Master Data Management (MDM) using PI. We will discuss the outbound process, i.e. the flow outbound from SAP R/3.
image
Figure Outbound to R/3

MDM Configuration Steps:
Step One --> Create Client System
Follow the same steps as before to create the client system; you can also reuse an existing client system.
Step Two --> Create Port for Client System
Follow the same steps as before to create the inbound port.
The port name is SIEBEL_IB_CUS_SIEBELCUS01, the code is SIEBEL_IB_CUS_SIEBELCUS01, and the type is Inbound. See the figures below.
image
Figure Port for Inbound

MDM Server Side Configuration
In the SAP MDM server, look at the server folder structure for the inbound folder we created in the MDM Console.
image
Figure MDM Server Folder

FTP Server Side Configuration
Step Three --> FTP Server Configuration
Configure/specify the folder path from which files are pulled on the FTP server. This screen shows the FTP server configuration (WS_FTP Server).
image
Figure FTP Server

SAP XI / PI Side Configuration
Step Four --> SAP PI Receiver Communication Channel
image
Figure Receiver CC

Outbound Process Flow:
image
Figure Outbound Process Flow

Here I will discuss the outbound process; the figure above shows the entire flow.
Process step in R/3
The IDoc is triggered to the outbound port.
Process step in PI
The IDoc communication channel in the PI system picks up the IDoc and starts the PI process. In PI, transformation and conversion take place, and the message is sent to the file adapter, which pushes the file to the specified path in the MDM server folder.
Process step in MDM
The Import Manager picks up the file placed in the server folder and imports it into the MDM repository main table.

I hope this blog will be useful when you integrate these systems in a real project.



Integrating MDM Item-Details-iView into a WebDynpro Application
Steffen Ulmer

Do you want to access the MDM ItemDetails iView directly, without the MDM search capabilities?
Or do you want to write your own MDM Web Dynpro iView and use the existing ItemDetails iView within it?

If so, you should read this weblog.
The referenced article shows how to integrate the SAP MDM 5.5 Business Package iView functionality into an SAP Web Dynpro application. As of SAP MDM 5.5 SP4, you can position the ItemDetails iView on a Web Dynpro view via the IFrame UI element. As a result, you can use the existing, configurable iView within your self-programmed Web Dynpro application, which saves considerable cost and time during the implementation phase. The integration of the HTMLB-based iView and the Web Dynpro-based outer view is transparent, and to the user it feels like a single application.

Please open this article: Integrating MDM Item-Details-iView into a WebDynpro Application

Steffen Ulmer is an SAP NetWeaver technology consultant.


Master Data Management posting confirmation
Steffen Nesper SAP Employee

URL: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/master-data-management/SAP%20NetWeaver%20Master%20Data%20Management%20Posting%20Confirmation%20via%20Exchange%20Infrastructure.pdf

SAP Master Data Management 5.5 (MDM) distributes master data to SAP ERP Central Component (ECC). As MDM does not have a direct interface into SAP ECC, SAP Exchange Infrastructure (XI) is used to distribute the master data objects, which could be vendors, customers, banks, or any other master data object. In a central master data management scenario, the backend system provides no feedback on whether the distribution was successful. If such feedback is mandatory in your project, the following process might be considered: a process is developed with Business Process Management (BPM) that sends master data into SAP, queries the delivery status, and posts it back to MDM in case of success. The process also sends an email to the requestor of the master data change or creation in case of success, or to a general administrator email address in case of failure. In the described business case, a person called the “requestor” enters master data requests in a portal view via the SAP Enterprise Portal. Please see the URL above for the full article.

Steffen Nesper is a Consultant focusing on Process Integration.


Contextual overview of SAP NetWeaver MDM
Markus Ganser SAP Employee

URL: https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e0476a3d-557f-2910-169d-9e3608d3bcdf

SDN features a new article on SAP NetWeaver Master Data Management (MDM) emphasizing the business strategy and business implications involved.

This rather non-technical article sheds light on the overall business strategy followed when implementing MDM; it gives clear insight into the scenario approach, verticalization, and extensibility of the product, and finally stresses the importance of MDM as a pillar for companies seeking to operate on the basis of an enterprise service-oriented architecture.

Check it out by clicking the URL above; it's also worthwhile reading for techies.

Best wishes,

Markus



SAP MDM - One bite at a time
Dattatreya Kulkarni



SAP MDM: One bite at a time!

“We have a very complex way of managing master data that is spread over multiple systems. We developed this solution long ago, and it has evolved over the years. When we first implemented it, we did want to establish data governance models and the associated processes, in addition to making the IT solution available. However, reasons good or bad made us postpone this, since the implementation deadline was around the corner and we did not want the business to wait. We have been growing rapidly over the last 7 years and have acquired 3 companies in the process. Most of our energy and resources have been spent on integrating the new companies with us. We never got the time to go back to the pending matter of the data governance model. Today there are many users who can create and change data according to their own priorities in their own systems. We have no control over who is changing what, and we are not sure where and how to correct it.”

These are the words of customers who want to implement a complete master data management solution but are unable to visualize how they would manage it while keeping the business running in such a scenario.

While we all know that SAP MDM allows an incremental approach to implementing the IT scenarios, it is also useful to know that one can break this up further and start with just one master data object. For example, one can focus only on the material/product master and follow the incremental approach from content consolidation to central master data management. Further, one can choose to proceed business unit (BU) by business unit, and so on.

By staying focused on one master data object:

a. You can manage the data owners better (especially when you are a global company)

b. You can manage the change associated with the new solution effectively

c. You can implement the new processes and practices more easily in a smaller group

d. You can plan to spend the required time and resources to establish the data governance model, while keeping the overall project timeline and cost under control (as opposed to a large big-bang, all-objects-together approach)

e. You can leverage the success of this object to get faster buy-in from other data owners and senior management

f. You can make project management less complex and focus on the solution instead

g. You lower the risk of failure associated with the initiative

The next question to answer is: which master data object should we start with? I can think of two options.

Option one is to choose an object that is the least complex, has the lowest volumes, and is the least spread across IT systems. The chances are high that you complete a project initiative around this object fast and smoothly. You could then leverage this success to get the resources required to take up other objects. The flip side is that you may find it difficult to make a business case for investing in this initiative, since the benefits get postponed.

Option two is to take up an object that poses immediate and critical challenges to your business, irrespective of the complexities involved. This approach helps you deliver tangible business benefits more quickly. The chances of getting excellent support from the business are also high, since it has high stakes in the project, owing to its potential to create an immediate, positive impact!

An approach that works at one company may not work at another; it has to be arrived at after deliberations within the company, under the guidance of an experienced external partner. SAP MDM's flexibility enables you to adopt this approach: you can go object by object and focus on one repository at a time, from design to go-live. The ease with which you can integrate it into your IT landscape makes the case stronger. In SAP MDM lies a workable solution to concerns like the one I shared at the beginning of this note. Hence: take one bite at a time, and chew what you bit before taking the next bite, using SAP MDM.

I wanted to bring up this thought because many large companies face this challenge. However, this does not mean that a company cannot adopt a big-bang approach using SAP MDM. If business and IT are committed to making it successful, and if the business scale and market environment allow it, a big-bang approach will work and will have its own benefits.

Dattatreya Kulkarni heads the SAP Logistics and Supply Chain Management Practice at Wipro Technologies.



Testing and Monitoring an Interface Between MDM & XI
Harrison Holland


1. MDM

First we'll start with the syndication process in MDM, and making sure our settings are correct.

1.1 Check Configuration


1.1.1 Client-Side Settings
  • Open the MDM Console
  • Navigate to Ports in the repository to which your map is located.
    image
  • Verify that you have selected the correct map (built in Part I)
  • Select your processing mode as Automatic
    image
  • Open the MDM Syndicator (Select the repository and press OK)
  • Select File->Open
  • Select the remote system representing ECC
  • Select your map and press OK
  • Select the Map Properties tab
    image
  • Check Suppress Unchanged Records so we automatically update only changed records.
  • Close and save your map
    image
1.1.2 Server-Side Settings
  • Open your mdss.ini file on the MDM server
  • Verify that Auto Syndication Task Enabled=True
  • For testing purposes, change the Auto Syndication Task Delay (seconds) to something rather small, such as 30 or less. This way you don't have to wait a long time for syndication when testing.
    image
  • Verify that the service is started.
  • UNIX systems: ps -ef | grep mdss
  • WINDOWS systems: open services, and look for entry regarding syndication server
  • If service is not running, run command ./mdss (UNIX) or rightclick->start service (WINDOWS)
    image

1.2 Important Locations

I'd like to go over some of the important locations (directories) on your server that will come in handy when troubleshooting and testing. One of the trickiest parts of working with MDM is figuring out where things go and where to look. Because it is so different from the SAP software we are all used to, navigating the system is not as easy as running a transaction code. Also, MDM reacts to certain situations differently than you may expect, so it's important to know where to look when things aren't working properly. I'm working with MDM installed on HP-UX; however, I will try to address each topic as it would appear in Windows to the best of my knowledge.

1.2.1 Home

Log onto your MDM server and navigate to the home directory for the MDM application server. On the server I am working with (sandbox) it happens to be located on the opt filesystem, and the path looks like /opt/MDM. In this directory take note of several important directories:

    /opt/MDM/Distributions
    /opt/MDM/Logs
    /opt/MDM/bin

The Distributions folder is very important because this is where the port directories get created. When you create a port in the MDM Console for a particular repository, it creates a set of folders in the Distributions directory based on the repository the port was created in and whether the port is inbound or outbound. For example, in our particular scenario we may navigate to the path /opt/MDM/Distributions/install_specific_directory/Material/Outbound/. Here we will notice a folder entitled ECC which (if you followed the first part of this series) corresponds to the port that we created earlier. This directory was created as soon as the port was created in the MDM Console. We will focus more on the contents of our port directory shortly.

The Logs folder contains several important log files; however, most of them will not apply to our particular scenario, because the logs we will want to look at are specific to the syndication process and are located within the port directory. Nevertheless, I thought it was important to mention that in certain troubleshooting scenarios, don't forget that these log files also exist.

The bin directory is critical because that is where the executables that start the application servers are located: mds (the MDM server), mdss (the syndication server), and mdis (the import server).

1.2.2 Port

Your port directory is going to have the following format:
/MDM_HOME_DIRECTORY/Distributions/MDM_NAME/REPOSITORY/Outbound/REMOTE_SYSTEM/CODE/

For example, the one we created looks like this:
/opt/MDM/Distributions/SID.WORLD_ORCL/Material/Outbound/ECC/Out_ECC/

Here you should see the following directories:

    /Archive
    /Exception
    /Log
    /Ready
    /Status


The Archive directory is not as important during syndication as it is during import into MDM. It contains the processed data: for example, if you were to import an XML document containing material master data, the message would be placed in the Archive directory for later reference if you ever needed to check.

The Exception directory is very important because, oftentimes when an error occurs, a file is generated there that looks similar to the file the import server or syndication server was attempting to import or syndicate. In other words, let's say you were attempting to import an XML document containing material master data, but the map built in MDM has a logic error: the document will instead be placed in the Exception folder, and the status of the port will change to "Blocked" in the MDM Console.

The Log directory is important for the obvious reason. Logs are created each time the syndication server runs, so if your interval is 30 seconds, a log will be generated in this folder every 30 seconds. It gives you the details of the syndication process, which can ultimately be critical when troubleshooting.

The Ready folder is the most important folder in our scenario. When the Syndication Server polls at its interval and performs the syndication, the generated XML message appears in the Ready folder. So in our scenario, material master data will be exported to this directory, and ultimately Exchange Infrastructure will pick up the data and process it to ECC.

The Status directory contains XML files that hold certain information pertaining to the import / export of data. This information includes processing codes and timestamps.
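When troubleshooting, it can help to tally what is sitting in each of these folders without clicking around the server. Below is a minimal Python sketch of such a check; the folder names come from the listing above, while the script itself is my own illustration, not part of MDM.

```python
import os

# The five standard subfolders of an MDM port directory (see listing above)
SUBFOLDERS = ["Archive", "Exception", "Log", "Ready", "Status"]

def port_summary(port_dir):
    """Return {subfolder: file count} for a port directory.

    Missing subfolders are reported with a count of -1 so a broken
    port layout is easy to spot.
    """
    summary = {}
    for name in SUBFOLDERS:
        path = os.path.join(port_dir, name)
        if os.path.isdir(path):
            summary[name] = len(os.listdir(path))
        else:
            summary[name] = -1
    return summary
```

Run it against your own port path and compare the counts with what you see on the server; a growing Exception count or a Ready folder that never empties is usually the first sign of trouble.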

1.3 Testing

Now we are going to test our scenario and make sure that the export of materials works correctly. First things first: we need to create a new material in the MDM Data Manager. Make sure that your MDM Syndication Server is turned on! Remember, on UNIX we can start it by running ./mdss in the bin directory, and on Windows by simply starting the service.

1.3.1 MDM Data Manager
  • Start MDM Data Manager
  • Connect to Material repository.
    image
  • Add a new material via right-click.
    image
  • Fill in required fields to satisfy the map built in Part I.
    image
  • Verify the new product is saved by clicking elsewhere in the Records screen, and then back to the new Material.
    image


1.3.2 Check Syndication

We are now going to verify that the syndication process is taking place as it should, based on the settings in your mdss.ini file. If you have set the MDM Syndication Server to perform the syndication every 30 seconds, as I did for testing purposes, then by the time you log into your server the syndication should already have occurred. Let's check by logging onto the server and navigating to the Ready folder in our port directory.

/opt/MDM/Distributions/SID.WORLD_ORCL/Material/Outbound/ECC/Out_ECC/

If all went as planned your Ready folder may look something like this:

image

Those files are XML files containing the data for each material in your repository that has changed. In this case the only materials in my repository are the two I just added, so the MDM Syndication Server placed both new materials in the Ready folder. Now they are waiting for XI to pick them up and process them. Before we move over to the XI part, let's take a look at one of these files and verify that the data in them is correct. Keep in mind that if you have already configured XI to pick up the files from this directory and process them, you may not see them here, because they have already been deleted by XI (based on the settings in your communication channel).

1.3.3 Verify Data

Let's go ahead and open one of these files. I copied the file from the server to my local Windows machine to examine it, but of course you can read the file straight from the server if you prefer. If your mapping was done similarly to mine, your file should look like a MATMAS05 IDoc in XML structure. This makes it easier for XI to process, since we can export in this format from MDM without much difficulty.
image
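Since the syndicated file mirrors a MATMAS05 IDoc, you can also sanity-check it programmatically. The Python sketch below assumes the standard MATMAS05 segment names (E1MARAM with a MATNR field, and nested E1MAKTM segments with SPRAS and MAKTX); if your map renames fields, adjust the tags accordingly. The sample document is a hand-made, heavily trimmed illustration, not real MDM output.

```python
import xml.etree.ElementTree as ET

def read_material(xml_text):
    """Extract the material number and descriptions from a
    MATMAS05-style XML document (structure assumed; adjust the
    tag names to match your own syndication map)."""
    root = ET.fromstring(xml_text)
    material = root.find(".//E1MARAM")
    matnr = material.findtext("MATNR")
    # One E1MAKTM segment per language: SPRAS = language key, MAKTX = text
    descriptions = {
        seg.findtext("SPRAS"): seg.findtext("MAKTX")
        for seg in material.findall("E1MAKTM")
    }
    return matnr, descriptions

# Trimmed, hand-made sample for illustration only
sample = """
<MATMAS05>
  <IDOC>
    <E1MARAM SEGMENT="1">
      <MATNR>000000000000011696</MATNR>
      <E1MAKTM SEGMENT="1">
        <SPRAS>E</SPRAS>
        <MAKTX>Test material from MDM</MAKTX>
      </E1MAKTM>
    </E1MARAM>
  </IDOC>
</MATMAS05>
"""
print(read_material(sample))
```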

2. Exchange Infrastructure

Now we'll take a look at the second half of this scenario and test out our XI interface.

2.1 Check Configuration

The only configuration we are going to check is the outbound communication channel. This is what tells Exchange Infrastructure where to pick up which file (location, filename) and what to do with the file once it has been processed (processing mode, i.e. delete).

  • Start your Integration Directory (Integration Builder: Configuration).
  • Navigate to your outbound communication channel.
  • Examine your File Access Parameters.
    image

In my case, because this is a test scenario, I have a bash script picking up the file from the port directory and dropping it onto a drive that all of the SAP systems have access to, namely the /depot filesystem. As you can see, I made a temporary folder on that filesystem where the files for this interface are stored while waiting to be processed. Of course, the simplest way to do this would be to mount the port directory from your MDM machine on your XI machine. Next, take a look at your Processing Parameters and change the settings accordingly. For this particular scenario I have set the poll interval to 5 seconds for testing purposes. Also, notice that I am using delete as the processing parameter. This is so that I can verify that the file was processed, and so the folder doesn't get cluttered up with files.

image
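For readers curious what that pickup script might look like, here is a rough Python equivalent of the bash approach described above. The directory paths are placeholders, and the script is only a sketch; mounting the port directory on the XI host, as suggested, avoids this step entirely.

```python
import glob
import os
import shutil

def move_ready_files(ready_dir, staging_dir):
    """Move syndicated XML files from the MDM Ready folder to a
    staging directory that the XI file adapter polls.

    Both paths are examples; point them at your own port directory
    and shared filesystem (e.g. somewhere under /depot).
    """
    os.makedirs(staging_dir, exist_ok=True)
    moved = []
    # Only the syndicated XML messages, in a stable order
    for path in sorted(glob.glob(os.path.join(ready_dir, "*.xml"))):
        target = os.path.join(staging_dir, os.path.basename(path))
        shutil.move(path, target)
        moved.append(target)
    return moved
```

Scheduled from cron at a short interval, this mimics the bash script's behavior: the Ready folder is drained, and XI picks the files up from the staging directory.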

If everything is the way you want it, let's go ahead and take a look at some important locations that will come in handy for testing and debugging the interface.

2.2 Important Locations

2.2.1 Integration Repository - Map Testing

Start the Integration Repository (Integration Builder: Design) and navigate to the map that we built in Part II. Select the Test tab.

image

To test our map, we can actually use the XML document that MDM generated via the Syndication Server. Let's go ahead and try this.

  • Press the "Load Test Instance" button.
    image
  • Select the XML file MDM generated.
    image
  • Press the "Start Transformation" button.
    image

If everything went smoothly, you should see a pop-up screen that says "Executed successfully". Otherwise you will receive an error from which you can begin your debugging process.

image

2.2.2 Runtime Workbench - Component Monitoring

The runtime workbench is one of the most powerful and useful features of Exchange Infrastructure. Here we can get very detailed descriptions of errors that may occur with each component of XI. The component that we will want to pay particular attention to is the Adapter Engine.
  • Log into your runtime workbench and select Component Monitoring -> Display.
    image
  • Click the Adapter Engine link.
    image

Here you can view the status of the adapter. If there is an error in your configuration of a particular adapter it will show up here.

image

2.2.3 Runtime Workbench - Message Monitoring

Follow a similar procedure to display the Message Monitoring screen.

image

  • Select your time filter; in this case I will select the last hour.
  • Press Start.
    image

You can now see the list of messages that have been processed by the Adapter Engine over the last hour. On my system only one message has been processed in the last hour. You can press either Details or Versions to view more information about any particular message that was processed.

image
2.2.4 Integration Engine - Monitoring

This is a particularly useful component of Exchange Infrastructure that allows us to view various aspects of the messages that get processed. Let's start by logging into the XI system and taking a look.

  • Run transaction SXMB_MONI.
    image
  • Double-click Monitor for Processed XML Messages.
  • Press F8 or the Execute button.
    image
  • Select a message and press Display.
    image

You may notice that I have selected a message that contains an error and did not actually reach its destination. In Call Adapter -> SOAP Header, take a look at Error. If you double-click that button, a screen will appear on the right-hand side showing the details of the error.

image

This error tells us that something is wrong with the IDoc Adapter. It claims that transaction IDX1 contains errors, but in this case the error is actually in the configuration of our communication channel, where we referenced the wrong port. If you select Call Adapter -> Payloads, you can see the content of the XML message that came from MDM.

image

If you go back to SXMB_MONI, you may also want to take a look at the Processing Statistics program, which gives a good overview and can be helpful when testing your interface with thousands of materials.

image

3. Testing

Now we're going to go ahead and test the interface from end to end. I'm assuming that by now you have turned on the MDM Syndication Server and your XI interface is activated in the Integration Directory. Let's log into the MDM Data Manager and create a new material for testing purposes.

  • Right click -> Add
    image
  • Enter enough data to satisfy your interface requirements (i.e., which fields must be populated)
    image
  • Click on another material to save changes
  • Close the MDM Data Manager
  • Turn on your MDM Syndication Server (if it's not already turned on)
If your Syndication Server settings have been configured correctly, then because you added a new material in the Data Manager, it will syndicate as soon as your interval cycles through (set in the mdss.ini file on your server). Let's go ahead and move over to the Exchange Infrastructure Runtime Workbench to see if it has processed our message. Keep in mind that, depending on your interval time, it may take a few minutes. Hopefully you should see something like this:

image

If the Runtime Workbench shows the message transferred successfully, let's log into ECC and see if the IDoc was posted.
  • Log into ECC system
  • Run transaction WE02
    image
  • Press F8
  • In the left hand pane, select Inbound IDocs -> MATMAS
    image
  • In the right hand pane, select the IDoc that just transferred and double click on it
  • In the IDoc display, on the left hand side expand E1MARAM and select E1MAKTM
    image
  • Verify that the material data is correct
    image
  • Expand Status Records -> 53 and double click the only record available
    image
  • In the pop up window, copy the message number that was issued to the IDoc
  • Press Proceed
  • Paste the message number that you copied
    image
  • Press F8
    image

You may notice that my image says material 11696 created. This is because a modification was made to an ABAP program to create a material when an IDoc is processed with a certain code. The ABAP modification is out of scope for this blog, but if you are familiar with ALE, this process should be familiar as well. In any case, this is not a permanent solution, just a temporary one to finish our prototype. If we take that newly generated material number and run transaction MM02, we should be able to pull up the details of that material.

image

Press Select Views and select Basic Data and continue.

image

Hopefully if all went as planned, the material should have transferred smoothly, with no loss in data. This concludes the three part series on MDM and XI. Thanks for reading, hopefully it helps!

Harrison Holland is a systems integration and basis specialist.