
This is the second blog in my series on Agile PLM data migrations. My prior post, "Agile PLM Data Migration – Part 1: Anatomy of an Agile PLM Data Migration," covered the typical process I use while performing a migration; this post aims to set the stage for a sample migration of a document object with the assignment of a revision, requiring a change order. As mentioned in the first blog, there are plenty of other posts and books on migrations, but seeing one done is very different from reading about it from a theoretical point of view (or from someone who hasn't actually done one).

Imagine, if you will, that you have 10,000 documents to migrate, all of which were released in the legacy document management system on different dates. This is a great job for the DataLoad tool, an Oracle Consulting Services and Partner-only tool designed specifically for data migrations, because we can easily create all change orders in a released status and assign the desired revision to the imported documents. It can reproduce revision history and load most object types, not just Items. The purpose of this blog is not to provide a tutorial on the DataLoad tool, but rather to give a high-level overview of how an Agile PLM migration can be accomplished with a real example.


There are many books and blogs on data migration, several of which even pertain to Agile PLM, but most of what I have seen reads more like a sales presentation, or like material put together by someone who hasn't personally dug in to understand how to get the migration done. This blog is the first in a series to address HOW an Agile PLM migration is done from a technical perspective.

Validation requirements, data sampling sizes, and the like are all important topics for data migration, but I consider them more of the business unit's responsibility, as I have never been a fan of having developers perform quality control against the product of their own deliverable. Naturally, unit testing should be performed by the migration expert along the way. So what does a real-world data migration look like?

Friday, 28 August 2015 00:00

Examples CD ... Who Needs That?

I was helping troubleshoot an Agile 9.3.3 environment when I ran into an issue attaching a file.  The error was: ORA-29861: domain index is marked LOADING/FAILED/UNUSABLE. 
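For anyone who hits the same error, the usual remedy for ORA-29861 is to find the failed Oracle Text domain index and rebuild it. A sketch, assuming DBA privileges; the index name in the rebuild statement is a hypothetical placeholder, so identify the real one with the first query:

```sql
-- Locate domain indexes that are not in a VALID state
SELECT owner, index_name, domidx_status, domidx_opstatus
  FROM dba_indexes
 WHERE index_type = 'DOMAIN'
   AND (domidx_status <> 'VALID' OR domidx_opstatus <> 'VALID');

-- Rebuild the failed index (the name below is a placeholder)
ALTER INDEX AGILE.YOUR_FAILED_INDEX REBUILD;
```

If the rebuild fails again, the underlying cause (for example, a missing Oracle Text component) needs to be resolved first, which is where the Examples CD from the post title comes in.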

Oracle Index Error

Friday, 08 November 2013 00:00

Running SQL from a Groovy Event Handler

Have you ever run into a situation where you needed to run SQL from an event handler or PX but weren't quite sure how to do it? This blog will show you how to get a connection to the Agile PLM database and execute SQL without worrying about the connection details.

The Script

The script itself is pretty simple - just import the com.agile.util.sql package and leverage the ConnectionFactory class. Once you have the connection, you can proceed as you normally would, since it is a standard java.sql.Connection. If you are coding this externally, make sure you add agileclasses.jar to your classpath.

import com.agile.agileDSL.ScriptObj.*;
import com.agile.api.*;
import com.agile.util.sql.*;
import java.sql.*;

void invokeScript(IBaseObjectScriptObj object) {

	Connection connection = null;
	PreparedStatement statement = null;
	ResultSet rows = null;
	
	try {
		connection = ConnectionFactory.getFactory().getConnection();
		statement = connection.prepareStatement("select ifs_url from vault");
		rows = statement.executeQuery();

		while(rows.next()) {

			// log the SQL response
			object.logMonitor( "SQL Response: " + rows.getString("ifs_url") );
		}
		
	} catch (Exception e) { 
		object.logMonitor( e.getMessage() );
	} finally {
		// always release the JDBC resources, even on failure
		try { if (rows != null) rows.close(); } catch (Exception ignored) {}
		try { if (statement != null) statement.close(); } catch (Exception ignored) {}
		try { if (connection != null) connection.close(); } catch (Exception ignored) {}
	}
}

 

Agile 9.3.2 Keystore

If you are used to the pre-Agile 9.3.2 scripts to import and export a database then you probably noticed that things have changed.  Now you must include passwords to import and export dump files but you may not have realized there is more!  You must also reset your keystore and system critical passwords if loading a dump from another system.  At first this was pretty annoying, but after creating a script to reset them during the import, the inconvenience is a distant memory.  This blog will show you how to do the same.

Enterprise Data Quality for Product Data This is the fourth and final blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP), which walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition will focus on section 4 of the whitepaper.

This is the third blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP), which walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition will focus on section 3 of the whitepaper.

This is the second blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP), which walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition will focus on section 2 of the whitepaper.

For the introduction, see Discovery: Agile PLM 9.3.2 and EDQP Integration (Part 1)

In its latest release, Agile 9.3.2 offers an integration with Oracle's Enterprise Data Quality for Product Data (EDQP). For those seeking data cleansing, standardization, or completion of missing data, this is a welcome addition to the feature set. In this blog series, I will walk through the Agile PLM 9.3.2 and EDQP Integration White Paper and explore the integration's setup and capabilities.

Hint - Workflow statuses aren't list values...

Our help desk recently fielded a support ticket from a company looking for assistance with their extraction of PSR content. Specifically, they could not identify how to extract the PSR number, the workflow name, and the workflow state. It is easy to jump to the conclusion that workflow states are list items that should reside in the listentry table, but that is not the case.
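To make the hint concrete, the extraction joins the PSR record to its workflow definition and status rows rather than to listentry. The sketch below uses hypothetical table and column names purely for illustration; the actual Agile PLM schema names differ:

```sql
-- All table and column names here are hypothetical placeholders,
-- not the real Agile PLM schema.
SELECT p.psr_number,
       w.workflow_name,
       s.status_name
  FROM psr_table p
  JOIN workflow_table w ON w.id = p.workflow_id
  JOIN status_table   s ON s.id = p.status_id;
```

The point is simply that workflow names and states live in workflow/status tables, not in listentry.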
