Rodney McCabe

Sr. Technical Consultant and Solution Architect at Mazamo Corp
http://www.linkedin.com/in/rodneymccabe


This is the second blog on Agile PLM data migrations. My prior post, "Agile PLM Data Migration – Part 1: Anatomy of an Agile PLM Data Migration," covered the typical process I use while performing a migration; this post sets the stage for a sample migration of a document object, including the assignment of a revision through a change order. As mentioned in the first blog, there are plenty of other posts and books on migrations, but seeing one is very different from reading about it from a theoretical point of view (or from someone who has never actually done one).

Imagine, if you will, that you have 10,000 documents to migrate, all of which were released in the legacy document management system on different dates. This is a great job for the DataLoad tool, because it can create all of the change orders in a released status and assign the desired revision to each imported document. DataLoad is an Oracle Consulting Services and partner-only tool designed specifically for data migrations; it can reproduce revision history and load most object types, not just items. The purpose of this blog is not to provide a tutorial on the DataLoad tool, but rather to give a high-level overview of how an Agile PLM migration can be accomplished with a real example.
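
To make that concrete, here is a rough single-record sketch of what DataLoad automates in bulk, written against the standard Agile SDK since DataLoad itself isn't publicly available. The server URL, credentials, object numbers, and the "A" revision are all placeholders, and the sketch stops short of the hard part (backdating release dates to match the legacy system), which is precisely what the tool handles:

// Single-record sketch of what DataLoad automates in bulk; not the tool itself.
// URL, credentials, object numbers, and the "A" rev are placeholders.
import com.agile.api.*;

IAgileSession session = AgileSessionFactory
		.getInstance("http://plm.example.com/Agile")
		.createSession([(AgileSessionFactory.USERNAME): "admin",
		                (AgileSessionFactory.PASSWORD): "agile"]);

// Create the document and the change order that will carry its revision
IItem doc = (IItem) session.createObject(ItemConstants.CLASS_DOCUMENT, "DOC-10001");
IChange eco = (IChange) session.createObject(ChangeConstants.CLASS_ECO, "C-10001");

// Put the document on the ECO's Affected Items tab and assign the new rev
IRow row = eco.getTable(ChangeConstants.TABLE_AFFECTEDITEMS).createRow(doc);
row.setValue(ChangeConstants.ATT_AFFECTED_ITEMS_NEW_REV, "A");

// Releasing the ECO means advancing its workflow to the Released status via
// IRoutable.changeStatus(...); backdating that release to the legacy date is
// exactly the part the public API won't do and DataLoad will.
session.close();

Multiply that by 10,000 documents and their change orders and the appeal of a purpose-built loader becomes obvious.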


There are many books and blogs on data migration, several of which even pertain to Agile PLM, but most of what I have seen reads like a sales presentation or material put together by someone who hasn't personally dug in to understand how to get a migration done. This blog is the first in a series addressing HOW an Agile PLM migration is done from a technical perspective.

Validation requirements, data sampling sizes, and the like are all important topics for data migration, but I consider them more the business unit's responsibility; I have never been a fan of developers performing quality control on their own deliverables. Naturally, unit testing should be performed by the migration expert along the way. So what does a real-world data migration look like?

Friday, 28 August 2015 00:00

Examples CD ... Who Needs That?

I was helping troubleshoot an Agile 9.3.3 environment when I ran into an issue attaching a file.  The error was: ORA-29861: domain index is marked LOADING/FAILED/UNUSABLE. 

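Before chasing the root cause, it helps to see which domain index is actually in that state. Here is a hedged diagnostic sketch (not from the original post; the JDBC connect string and credentials are placeholders for your own database):

// Hedged diagnostic sketch: list Oracle domain indexes and their status
// flags to find the one ORA-29861 is complaining about. The JDBC URL and
// credentials below are placeholders for your environment.
import groovy.sql.Sql;

def db = Sql.newInstance(
		"jdbc:oracle:thin:@plmdb:1521:agile9",   // hypothetical connect string
		"agile", "password",                     // hypothetical schema owner login
		"oracle.jdbc.OracleDriver");

db.eachRow("""select index_name, domidx_status, domidx_opstatus
              from user_indexes
              where index_type = 'DOMAIN'""") { row ->
	println "${row.index_name}: ${row.domidx_status} / ${row.domidx_opstatus}";
}
db.close();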

Thursday, 02 January 2014 00:00

Getting a user's recently visited items

I had a comment posted on another blog (Agile PX - Creating an Agile PLM Process Extension) from Akshay, who wanted to know how to get items from the Recently Visited and Navigator panes in the web client. They had figured out the Recently Visited items but not the Navigator. The Navigator doesn't appear to be persisted, so there is little help I can offer there, but I thought readers might be interested in knowing how to get the Recent items.

The process flow is pretty simple:

  • Get the IUser of interest
  • Get the recently visited IFolder
  • Iterate over the IAgileObjects in the IFolder

 

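Here is a minimal sketch of those three steps. The server URL and credentials are placeholders, and the folder lookup is the part to treat with suspicion: I am assuming the recent items resolve as an IFolder at the path "/Recently Visited" for the logged-in user, so verify the folder name in your own environment:

// Minimal sketch of the three steps above. Server URL and credentials are
// placeholders; the "/Recently Visited" folder path is an assumption, so
// confirm the folder name in your own environment.
import com.agile.api.*;

IAgileSession session = AgileSessionFactory
		.getInstance("http://plm.example.com/Agile")
		.createSession([(AgileSessionFactory.USERNAME): "akshay",
		                (AgileSessionFactory.PASSWORD): "agile"]);

// 1. The user of interest (recent items belong to the logged-in user)
IUser user = session.getCurrentUser();

// 2. The user's recently visited folder (path is an assumption)
IFolder recent = (IFolder) session.getObject(IFolder.OBJECT_TYPE, "/Recently Visited");

// 3. Iterate over the IAgileObjects in the folder
for (IAgileObject obj : recent.getChildren()) {
	println obj.getName();
}
session.close();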

 

Friday, 08 November 2013 00:00

Running SQL from a Groovy Event Handler

Have you ever run into a situation where you needed to run SQL from an event handler or PX but weren't quite sure how to do it?  This blog will show you how to get a connection to the Agile PLM database and execute SQL without worrying about the connection details.

The Script

The script itself is pretty simple: import the com.agile.util.sql package and leverage the ConnectionFactory class. Once you have the connection, you can proceed as you normally would, since it is a standard JDBC Connection. If you are coding this externally, make sure you add agileclasses.jar to your classpath.

import com.agile.agileDSL.ScriptObj.*;
import com.agile.api.*;
import com.agile.util.sql.*;
import java.sql.*;

void invokeScript(IBaseObjectScriptObj object) {

	Connection connection = null;
	PreparedStatement statement = null;
	ResultSet rows = null;

	try {
		// ConnectionFactory hands back a connection to the Agile schema
		// without any URL, driver, or credential plumbing
		connection = ConnectionFactory.getFactory().getConnection();
		statement = connection.prepareStatement("select ifs_url from vault");
		rows = statement.executeQuery();

		while(rows.next()) {

			// log the SQL response to the event monitor
			object.logMonitor( "SQL Response: " + rows.getString("ifs_url") );
		}

	} catch (Exception e) {
		object.logMonitor( e.getMessage() );
	} finally {
		// always release the resources, even on failure
		if (rows != null) try { rows.close(); } catch (Exception ignore) {}
		if (statement != null) try { statement.close(); } catch (Exception ignore) {}
		if (connection != null) try { connection.close(); } catch (Exception ignore) {}
	}
}

 

Agile 9.3.2 Keystore

If you are used to the pre-9.3.2 scripts for importing and exporting an Agile database, then you have probably noticed that things have changed. Now you must supply passwords to import and export dump files, but you may not have realized there is more: you must also reset your keystore and system-critical passwords when loading a dump from another system. At first this was pretty annoying, but after creating a script to reset them during the import, the inconvenience is a distant memory. This blog will show you how to do the same.

Maybe you’ve just completed your work of art (the latest, greatest piece of software to extend your Agile PLM deployment), or maybe you’ve hit a volume your application has never had to process before, and at the magical moment of execution you’re rewarded with a stack trace that seems to have no end. Specifically, you receive a message similar to the following:

Error code : 60086
Error message : Call APIException.getRootCause() for details.
Root Cause exception : weblogic.rjvm.PeerGoneException: ; nested exception is:
       weblogic.socket.MaxMessageSizeExceededException: Incoming message of size: '10000080' bytes exceeds the configured maximum of: '10000000' bytes for protocol: 't3'
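
The payload has outgrown WebLogic's 10 MB default (10,000,000 bytes) by a mere 80 bytes. One common remedy, sketched here rather than prescribed, is to raise weblogic.MaxMessageSize on both sides of the t3 connection: add -Dweblogic.MaxMessageSize=20000000 (or a similarly generous value) to the server's JVM arguments, and set the matching property in your SDK client before it connects:

// Client side: raise the t3 message ceiling before any WebLogic class makes
// a connection; the server JVM needs a matching -Dweblogic.MaxMessageSize.
// 20000000 is an illustrative value sized to clear the 10000080-byte payload.
System.setProperty("weblogic.MaxMessageSize", "20000000");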

This is the fourth and final blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP), which walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition focuses on section 4 of the white paper.

This is the third blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP), which walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition focuses on section 3 of the white paper.

This is the second blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP), which walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition focuses on section 2 of the white paper.

For the introduction, see Discovery: Agile PLM 9.3.2 and EDQP Integration (Part 1)
