Documentum Composer and Eclipse Team

Now that we have Composer installed on Eclipse JEE we have an open tooling platform that we can extend.  One such extension is Eclipse Team, which provides a source control API that most source control vendors (CVS, which ships with Eclipse anyway, SVN, Perforce, StarTeam, etc.) have integrated with.  Right out of the box you can share your projects with other team members via most standard source control systems.  For this demonstration I’m going to use the Subversive SVN provider, but the others all work in a similar way.

In this scenario we will play the roles of two developers by switching between two workspaces.

Installing the SVN Provider

First things first though: we need to install the Subversive SVN team provider.  Go to Help->Install New Software…, select Helios in the “Work with” drop-down and type “svn” into the filter box.  Select the Subversive SVN Team Provider (Incubation) and install it.

You will be prompted for the usual restart so go ahead and do that. 

Configuring the Team Provider for Documentum

Next we need to tell the team provider what to check in.  You need to do this in the other workspace too, and I will remind you when.  Go to Window->Preferences…->Team->Ignored Resources. 

Click Add Pattern… and add each of the following three patterns in turn:-
*.artifact
*.dar
/Web Services/bin/*

This tells all the team providers to ignore built artifact files (*.artifact), Documentum ARchives (*.dar) and all web service build artifacts.  There is no point in checking these in.

Sharing a Project

We now assume the role of the first developer, who wants to share a project.  With the Documentum Artifacts perspective open, create a new Documentum project to share.  As this is the first Documentum project in your workspace, Composer will also create the core and the TCM reference projects for you.  As these two upstream projects are both referenced by your project, you need to check these in too.  This ensures your project’s references will be satisfied when others check them out.  The same principle holds for any other referenced projects that you may add. 

The core and TCM reference projects are hidden in the Documentum Artifacts perspective, so open the Navigator view to show all three projects: Window->Show View->Navigator.  Now that you can see all three projects, select the core project first, right-click and select Team->Share Projects…

This will bring up the SVN Share Projects Wizard.

Enter the URL for your SVN repository and the login details.  We can default everything else, so just click Finish. 

SVN will then pop up the Commit dialog, allowing you to enter a comment for the commit and to review the set of files that will be committed.  It is worth checking these to make sure there are no resources that should be ignored.

Click OK and the files will be added to the repository.  Once that’s done, repeat this process for the TCM reference project and lastly for your project.

Once all three projects have been shared your workspace should look something like this:-

Note the little orange repository icon in the bottom left hand corner of all your folders and files.  This is a visual indicator that these files are being managed with source control.

Import a Project

So now other developers are free to work on these projects too.  Let’s have a look at how they might do that. 

Switch to a new, empty workspace and show the Documentum Artifacts perspective.  Now set up the ignored resources as we did earlier.

Now import the projects that were shared by developer one.  Right-click in the Project Explorer and select Import->Import… to bring up the Import dialog.  Find and expand the SVN category and select Project from SVN.

Click Next to launch the SVN Import dialog.  On the first page enter the repository info as you did before.  Click Next to bring up the Select Resource dialog.

Select the Browse button.  Navigate to the DocumentumCoreProject and select it. 

Click OK to confirm the selection and return to the previous dialog.  Then click Finish to bring up the Check Out As dialog. 

Accept the defaults on this dialog and Click Finish to start the import.

Once the core project has been imported, repeat the same process for the TCM reference project and finally for your project.  Don’t forget the core and TCM reference projects won’t be visible in the Documentum Navigator, so open one of the other navigator views to see them.

Once all three projects have been imported you should see something like this:-

Again, note the little orange repository icons indicating these projects are under source control.

Updating a Project

Naturally your developers are now going to update the project by adding new artifacts, editing others and possibly deleting some.  This is all part of the natural life-cycle of the project.

So switch back to the first developer’s workspace and let’s make a couple of changes: add a new format artifact and edit the description of the dar folder install parameter.  Your workspace should now look something like this.

Note the arrow (>) next to items that have been modified, and note also that these track all the way back up the tree to the project itself.  The little question mark on the format artifact indicates that it is new.  If you want to learn more about the SVN icons, go to Window->Preferences->Team->SVN->Label Decorations:-

Now submit these changes.  The first thing we do is merge out; i.e. grab all the changes made by others whilst we have been working on our change.  This ensures we are compatible and not blindly forcing a breaking change into the repository.  Right-click on the project and select Team->Update.  Obviously in this case no one else has made any changes yet, but typically at this stage we would resolve any conflicts and re-test our changes to ensure they are still good.  Then we know we are fit to check in.  Right-click the project and select Team->Commit… 

Add a commit comment and click OK.  Note that the SVN modified indicators all disappear and your changes have been committed to the repository.

In the meantime the developer who imported the project is making changes too.  Let’s switch to his workspace and also edit the description of the dar folder install parameter.

Resolving a Conflicting Change

You’ll notice that both developers have now changed the dar folder install parameter artifact.  The second developer is going to be made aware of this when he attempts to check his changes in. 

So, still as the second developer, do your merge out: right-click on the project and select Team->Update.  SVN will attempt to update your workspace, grabbing any new, non-conflicting changes and attempting to resolve any conflicts.  It will update your project with the new format artifact.  However, it will be unable to resolve the conflict that has arisen with the dar folder install parameter, and you’ll be notified with the following dialog:-

Click OK to close the dialogs.  Now right-click on the project and select Team->Synchronize with Repository.  A dialog will pop up asking if you want to open the Synchronize perspective, which is a collection of views for helping with synchronization tasks.  Choose Yes and the perspective will open:-

Right-click on the dar folder install parameter, the artifact that has the conflict, and select Edit Conflict.  This will open the XMI versions of the local and repository files, highlighting the conflicts. 

The developer uses this editor to resolve the conflict in the normal way.  Once he is done he marks the file as merged by right-clicking on the artifact in the tree again and selecting Mark as Merged.  The artifact will then be removed from the Synchronize view:-

Lastly, he right-clicks on the project, selects Commit, and his changes, merged with the others, are committed to the source control system.

Conclusion

In today’s post we’ve taken a look at the Subversive SVN Team Provider for Eclipse and demonstrated how it can be used with Composer, unmodified, to share Documentum projects amongst a team of developers through an SVN source control system.

The same basic approach holds for the other providers too, give or take the subtle differences between their feature sets.  Over time I will try to demonstrate the same scenarios using CVS and Perforce.

In the meantime – Happy Composing!

Composer on Helios

I know that many of my readers like to run Composer on Eclipse, as opposed to running Composer the RCP product, which is still based on Eclipse 3.4.  Composer the RCP product is a closed box.  Composer on Eclipse, on the other hand, is an open platform that you can extend through software updates in order to leverage any of the multiple complementary tool-sets that are out there.

So, to that end, here is a quick set of instructions for getting Composer running on the latest version of Eclipse, 3.6, codenamed Helios.

Download and install the JEE version of Eclipse from here.  The JEE version already contains most of the dependencies that Composer requires to run, including EMF, GEF, etc.  Start it up and choose a workspace.  Once it has started, the only additional package you need is the EMF Validation Framework.  Go to Help->Install New Software…, choose “All Available Sites” or “Helios” in the “Work with” drop-down, type “emf validation framework” into the filter box and you should see something like this:-

Select the SDK.  Click Next to review the installation.  Click Next again and accept the licence agreement.  Then click Finish to install.  Once it has been downloaded and installed into your Eclipse run-time you will be asked to restart; go ahead and do this.  Once it has restarted the install is complete.  Now close Eclipse again, as you now need to add the Composer plugins.

Copy all folders and jars beginning with “com.emc.ide” from the features and plugins folders of your existing Composer 6.5 install into the dropins folder, under a /composer/eclipse path.  It should look like this:-
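For reference, here is a sketch of the resulting layout (the exact feature and plugin names will vary with your Composer build):-

<eclipse-install>/dropins/
   composer/
      eclipse/
         features/
            com.emc.ide...       (copied from your Composer 6.5 features folder)
         plugins/
            com.emc.ide...       (copied from your Composer 6.5 plugins folder)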

Now restart Eclipse one more time and navigate to Help->About Eclipse to confirm the Documentum logo is there as shown in the following image:-

And you’re done.  As always, the obligatory support statement: Composer isn’t officially supported in this configuration, although technically there is no difference.  Any issues you find would need to be reproduced on Composer RCP before Tech Support would accept them.

Lastly, a small request from me.  Both before I left on my LOA and now that I am back, I have been actively encouraging EMC to host and support Composer as a set of plugins on an update site, like the rest of the Eclipse community, as well as Composer as a product.  But we are just a few voices within EMC, and our opinion obviously gets weighed up against the cost of hosting the update site.  So if you would like to consume Composer from an update site, if you would like Composer to be an open platform, it is important that you let EMC know, either by telling them directly or by commenting here.

OK, that’s it.  In future posts I will show you how to leverage some of the complementary tool-sets I mentioned earlier, which you now have available to you.  And obviously feel free to experiment and let us know what works for you.

Happy Composing

Creating a DFS Services Client using Documentum Composer

One of my readers left a comment saying that my previous post left him high and dry when it came to creating the client project for testing his service.

So I put this very quick post together to describe it for him and hopefully for others too.

The first thing to note here is that there are lots of ways to create a DFS client, spanning both Java and .Net.  This is just one way, using Documentum Composer.

So where did the previous post end up?  Well, we had created a Documentum project and, within that, created a DFS service, exported it as a service archive and deployed it onto a suitable application server. 

So what next?

Create yourself another Documentum Project (File->New->Other->Documentum Project).

Next create yourself a lib folder and into it import (Import->File System) your service’s remote jar.  This will have been exported at the same time as your service archive.  If you called your service archive samples, for example, then you would look for samples-remote.jar.

Once imported, add it to your Java build path (<project>->Properties->Java Build Path->Libraries->Add JARs…).  Navigate to your remote jar within your client project and select it.  You should end up with the equivalent of this:-

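Under the covers this simply adds a library entry to the project’s .classpath file.  Assuming the jar was imported into the lib folder as samples-remote.jar, as above, the entry would look like this:-

<classpathentry kind="lib" path="lib/samples-remote.jar"/>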

You also want to make sure you have the right DFS client class path configured, so whilst you are on this dialog select the “DFS Services Library” build path entry and click Edit…  Select an appropriate library type for your scenario; I’ve chosen “DFS Remote Client Library with UCF”:-


Click Finish.

Finally you need to write some code to call your service.  There are lots of contexts that your code could execute within: a Main class, a Java application, an Eclipse application and lots of others too.  I’m a big fan of test-driven development, so I like to create JUnit tests, as these can be automated later and become part of the application’s test suite.  Let’s create one of these (File->New->Other->Java->Junit->Junit Test Case):-


Give your JUnit test a package and a name, and make sure you’ve chosen to create a constructor stub.  Don’t worry about the superclass warning; just click Finish and you should see this dialog:-


Obviously we do want to add the JUnit 3 library to the project’s class path, so choose to perform that action and click OK. 

Composer should do that for you and open your newly created JUnit test case in the Java editor.  We now need to add our service orchestration code.  Add a test method with the following signature:-

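Since this is a JUnit 3 test case, all that matters is that the method is public, returns void and has a name beginning with “test”; the exact name below is only an example:-

public void testHelloWorldService() throws Exception {
    // service orchestration code will go here
}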

Then add your orchestration code to that method.  If you’re not sure how to get started, a great place to look is the client code that ships with the DFS SDK: cut and paste in a sample remote client and re-purpose it for your needs by changing the service-related calls to use your service.  I’m actually using the DFS Hello World sample for this article.  As you are adding code, especially if you cut and paste a sample, you’ll end up with a bunch of problem markers:-


Because we already configured the project with a DFS class path, you can hover over the problem marker in the sidebar and use the quick fix to add the necessary DFS imports.
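If you just want a feel for the shape of the finished code, here is a minimal sketch of a remote invocation based on the Hello World sample.  The service interface, repository name, credentials, module name and context root are all assumptions that you will need to replace with your own values:-

// set up a service context carrying the repository identity
ContextFactory contextFactory = ContextFactory.getInstance();
IServiceContext context = contextFactory.newContext();

RepositoryIdentity identity = new RepositoryIdentity();
identity.setRepositoryName("myrepo");      // assumption: your repository
identity.setUserName("dmadmin");           // assumption: your credentials
identity.setPassword("password");
context.addIdentity(identity);

// these must match your services project settings (see the tip below)
String moduleName = "samples";
String contextRoot = "http://localhost:8080/services";

ServiceFactory serviceFactory = new ServiceFactory();
IHelloWorldService service = serviceFactory.getRemoteService(
               IHelloWorldService.class, context, moduleName, contextRoot);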

A good tip here: the moduleName and contextRoot (in the code sample above) must match the equivalent settings that you configured for your services project, which are also specified in the Export Service Archive dialog:-


That is pretty much it.  Make sure your services are deployed, that the server is running and that the WSDL for each service is available.
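By DFS convention the WSDL for a service lives at contextRoot/module/ServiceName?wsdl, so with the values assumed above you would expect to find the Hello World WSDL at something like http://localhost:8080/services/samples/HelloWorldService?wsdl.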

To run the JUnit test, right-click on it in the Project Explorer and select Run As->JUnit Test.  This will execute your test case using the JUnit test framework and display the results.

If you want to debug the code then set a break point in the code and select Debug As->Junit Test instead.

Alternative

Now, it is important to note that DFS supports two modes of invocation: remote, as we outlined above, and local. 

This provides us with an alternative way to test our service code with less fixture than the remote approach, i.e. without having to deploy it to a server first.  

First we need to add some of the service’s resources to the services project’s class path so that the DFS runtime can find them.  The Ear is configured in such a way that these same resources are available on the class path when it is deployed on an app server; we are just mimicking this configuration. 

So right-click on your services project and select Properties->Java Build Path->Libraries.  Click Add Class Folder.  Navigate down your services project, expand the Web Services folder and the bin folder, and check the gen-rsc folder.  Click OK.  You should see this folder added to the build path:-


That’s the services project configured.

Now create another Documentum client project.

This time round, make sure you add your services project to the Java Build Path (<client project>->Properties->Java Build Path->Projects):-


As before, edit the type of the DFS Library Path, but this time select “DFS Local Library”.

There is no need to add the remote jar as we will pick up the service classes directly from the project.

As before, add the JUnit test case, and if you get any compilation errors quick-fix them.  Note that when you add your services code you must call getLocalService instead of getRemoteService, AND you don’t need to register the service context with the ContextFactory ahead of time.  So your service invocation code simply becomes:-

// note: no need to register the context with the ContextFactory ahead of time
ContextFactory contextFactory = ContextFactory.getInstance();
IServiceContext context = contextFactory.newContext();

// getLocalService binds to the service classes directly from the class path,
// so no module name or context root is required
ServiceFactory serviceFactory = new ServiceFactory();
IHelloWorldService service = serviceFactory.getLocalService(
               IHelloWorldService.class, context);

And that’s it.  You are ready to go.  Right-click on the Junit class in the project explorer and select Run As (or Debug As)->Junit Test.

Conclusions

So we’ve walked through creating a remote DFS consumer and looked at an alternative, a local DFS consumer that leverages DFS’s local invocation facility.

As always I await your feedback.  In the meantime…

Happy Composing!

Documentum Development: Past, Present and Future


The Past

Using DAB, the mode du jour was to develop against a Docbase: live, connected, whatever you want to call it.  You nominated a "System of Record" Docbase.  You then went about the business of defining all the artifacts (types, lifecycles, workflows, relations and so on) that collectively made up your Docapp (a DOCbase APPlication).

Your application’s code would never execute against these particular artifacts in this particular Docbase, because they really were just definitions of the resources that your application requires to be present in a production Docbase in order to run correctly.  Meanwhile, the rest of your application artifacts (Java classes, servlets and JSPs, XML, etc.) were all managed out of a source control system.

To prepare a content-rich application for distribution, release engineering (releng) would check out all the application artifacts and build and package them as an installer, a war or an ear.  They would then also export the associated Docapp as a Docapp Archive and place it with the application package.  These resources were then placed on a download site where customers could find them.

To install the application the customer would use the installer or deploy the war or ear.  He would then use DAI to install the Docapp archive into one or more production Docbases.

So, that is how it was.  The System of Record for your source artifacts was the Docbase itself.  But what was so wrong with that, I hear some of you ask?  Well, a few things as it happens:-

1. Having different Systems of Record is just plain wrong, and what made us so special anyway?
2. Versioning semantics for Docbase artifacts are not always the same as they are for source control, complicating development, histories, labelling and snapshotting
3. Releng needed to have, and to know, Docbases, and to construct Docbase processes (not all of which were that automatable), which was hugely burdensome for them

With reference to the versioning semantics, it is important for us all to recognize that the Docbase is just plain bad at development versioning.  For example, types in a Docbase cannot be versioned, yet code certainly depends on particular versions of types.  Now, if we could reliably version artifacts then we could support explicit versioned dependencies, even when the target environment does not support them.  Explicit versioned dependencies allow us to bind code to specific versions of Docbase resources and have those dependencies validated.  The upsides of reliable versioning are hopefully evident.

The Present

Recognizing that content management would soon standardize through efforts like CMIS, and that this would foster an entire industry of organizations building content-rich applications, we knew we would have to address these shortcomings and move more mainstream.

So we built Composer: a system of linked, XML-based artifacts, plus an install process that can transform those XML-based artifacts into Docbase objects, preserving the links by converting them to Docbase links, i.e. OIDs.
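To illustrate, a type artifact is an XMI file in which references to other artifacts are held as logical links rather than OIDs.  The fragment below is heavily elided and purely illustrative (the real schema is richer, and the href value is abbreviated), but it shows the general shape:-

<Artifact:Artifact xmi:version="2.0" xmlns:xmi="http://www.omg.org/XMI"
    xmlns:Artifact="http://documentum.emc.com/infrastructure/12012007/artifact">
  <dataModel>
    <type name="soptype">
      <!-- logical link to the super type, resolved to a Docbase reference at install time -->
      <super_name href="...name=dm_document#..."/>
    </type>
  </dataModel>
</Artifact:Artifact>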

No longer is there a need for a "System of Record" Docbase.  Docbase artifacts can now be developed in the same Eclipse IDE, right alongside all your other application artifacts.  They can also be managed out of the same source control system, giving us reliable version semantics and greatly simplifying the releng process.

The Future

OK so we have ironed out that little wrinkle.  Docbase artifacts are now analogous to other source artifacts and therefore we are all now followers of the same development model.   In this respect we’ve become a bit more standard and hopefully removed a big barrier to entry for release engineering departments.  We believe this will help promote EMC|Documentum as the platform for developing content-rich applications.

So what’s next?

In a word "OSGi".

A phenomenon is sweeping the Java community in the form of OSGi, a dynamic module system for Java.  Most application servers are OSGi-based already, and this phenomenon will undoubtedly impact the ECM community too. 

OSGi is mature.  It is over 10 years old and grew out of the embedded systems world, so this is not new technology by any stretch; it may just be new to some of us.  It is also standards-based. 

OSGi promotes a world of modules (bundles, in OSGi parlance) that can be dynamically inserted into, or removed from, a live, running system with no interruption to the services it provides.  Each bundle has versioned dependencies that the system will honour and enforce, and a system can support multiple versions of a module at any one time.
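To make the versioned-dependency idea concrete, here is what an (entirely invented) bundle manifest looks like:-

Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.myapp.artifacts
Bundle-Version: 1.2.0
Import-Package: com.example.myapp.types;version="[1.1.0,2.0.0)"

The version range on the import is what allows the framework to validate that a compatible version of the dependency is present before the bundle is resolved.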

For all these reasons (and many others) I believe that OSGi is the perfect fundamental building block for next generation ECM solutions and solution frameworks.

It is also very important to recognize that in this future world of solutions and solution frameworks we may well see direct, application-hosted manipulation of our XML systems of record, as well as continued support for the more traditional offline development model; both will co-exist quite happily side by side.  This direct manipulation will be supported by a family of web-based tooling built on the central core of the Composer tooling platform.

But this is still a little way in the future.  Because OSGi will affect how applications are delivered, the most noticeable change you will see in the short term will be to our friend the dar (Documentum ARchive), which I believe is already set to become an OSGi bundle.  This will prepare the way for it to be just like every other piece of the solution: developed the same way, delivered the same way, depended upon and dependent upon others.  Really binding into the application as the vital constituent part that it is. 

Conclusions

So what conclusions should you draw from this? 

Well, if you are considering developing a content-rich application (or larger) on Documentum then you need to be following Composer’s lead and adopting model-driven development.  Making XML models a central tenet of your development practices is the right thing to do.  Leverage Composer, the tooling platform, and in particular its EMF-based modelling, and you’re not going to go too far wrong. 

Also, getting more involved in Composer, the tooling platform, by extending it for your own purposes with your own Docbase artifacts, would be an excellent way to introduce yourself to the wonderful world of OSGi.

Happy Composing

DarDocs

Besides the biggy of source control, another advantage of having an XML-based, offline system of record for our Documentum artefacts is that we can use standard XML tools on those artefacts.

For example, you could generate HTML docs from them; DarDocs, if you will.

Take a transform like this:-

<!-- XSL transform to generate HTML from Documentum type artefacts -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:lxslt="http://xml.apache.org/xslt"
    xmlns:xmi="http://www.omg.org/XMI"
    xmlns:Artifact="http://documentum.emc.com/infrastructure/12012007/artifact">
<xsl:template match="/">
  <html>
  <body> 
     <table border="1"> 
     <xsl:for-each select="Artifact:Artifact/dataModel/type">
        <tr> 
           <td>Document Type</td> 
           <td><xsl:value-of select="@name"/></td>
        </tr>
        <tr> 
           <td>Super Type</td> 
           <td><xsl:value-of select="substring-before(substring-after(super_name/@href,’name=’), ‘#’)"/></td>
        </tr> 
        <tr bgcolor="#9acd32">
           <th>Attribute Name</th>
           <th>Attribute Type</th>
           <th>Attribute length</th>
           <th>Attribute Label</th>
           <th>Help Text</th>
           <th>Comment</th> 
        </tr> 
      <xsl:for-each select="primaryElement/attributes"> 
        <tr> 
           <td><xsl:value-of select="@attr_name"/></td> 
           <td><xsl:value-of select="attr_type/@values"/></td> 
           <td><xsl:value-of select="@attr_length"/></td> 
           <td><xsl:value-of select="attrAnnotations/locales/@label_text"/></td> 
           <td><xsl:value-of select="attrAnnotations/locales/@help_text"/></td> 
           <td><xsl:value-of select="attrAnnotations/locales/@comment_text"/></td> 
        </tr> 
      </xsl:for-each> 
     </xsl:for-each>
     </table>
  </body>
  </html>
</xsl:template>

</xsl:stylesheet>

And a simple ant task like this:-

<?xml version="1.0" encoding="UTF-8"?>
<project default="generateDocs">

   <target description="Generate DarDocs" name="generateDocs"> 
      <xslt basedir="Artifacts" destdir="Documents"  style="style/DocTypes.xsl"> 
         <mapper type="glob" from="*.type" to="*.html"/> 
      </xslt> 
   </target>
</project>
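Assuming the build file is saved as build.xml in the project root, with your type artefacts under the Artifacts folder and the stylesheet at style/DocTypes.xsl as above, running it is just:

ant generateDocs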

Run it over a project containing a few types and you should get an HTML file for each type that looks something like this:-

Document Type    soptype
Super Type       dm_document

Attribute Name | Attribute Type | Attribute length | Attribute Label | Help Text                | Comment
attr1          | STRING         | 10               | attr1_label     | Some user help on attr1  | Developer comment for attr1
attr2          | INTEGER        | 10               | attr2 label     | Some user help for attr2 | Developer comment for attr2

It would be simple enough for a release engineer to add this to your releng build so that each build also produces a set of DarDocs for your Dars.

Hat tip to Eddie Doey from UK Consulting for providing this great example.

Happy Composing!

Migrating your projects between versions of Documentum Composer

As we release new versions of Documentum Composer you will need to migrate your projects to the new version.  But how do you do this?

It’s pretty simple actually.  The first time you start the new version of Composer, create yourself a new workspace.  Then re-import your projects into that new workspace using the Documentum-specific project import wizard: File->Import->Documentum->Existing Projects Into Workspace:-


Any model changes that need to be applied to your artifacts will occur during this import.

The astute will realize that this is an irrevocable operation so your team will all need to use the same version of Composer.

Happy Composing!

EMC Documentum Developer Edition

Today’s a first for EMC.

For the first time ever you can download a completely free version of EMC Documentum for development: the EMC Documentum Developer Edition.  The one-click install includes the Content Server, DFS, Composer, DA and Webtop; all the tools you need to develop enterprise content-centric applications.

To complement this we’ve also launched a new developer-oriented community.

Enjoy and as always happy composing!

Configuring BOF Hot Deployment in Composer

By way of a follow-up to Don’s post I wanted to show you how this surfaces in Composer.

As we have discussed before, a BOF module comprises Jardef artifacts and Java Library artifacts.

By design, implementation Jardefs are sandboxed and interface Jardefs are not, for all the reasons Don highlighted.  This means only your implementation Jardefs can be hot deployed.

Now, if we take a look at the Java Library editor, you will see immediately where to check to indicate whether or not you would like to sandbox each Java Library.  Checking this option means that the Java Library will support hot deployment.  Unchecking it means it won’t, but you can then share the Java Library across modules, which is sometimes useful.


Happy Composing