Documentum Development: Past, Present and Future


The Past

Using DAB (Documentum Application Builder), the mode du jour was to develop against a Docbase: live, connected, whatever you want to call it.  You nominated a "System of Record" Docbase.  You then went about the business of defining all the artifacts (types, lifecycles, workflows, relations, and so on and so forth) that collectively made up your Docapp (a DOCbase APPlication).

Your application code would never execute against these particular artifacts in this particular Docbase, because they really were just the definitions of resources that your application required to be present in a production Docbase in order to run correctly. Meanwhile, the rest of your application artifacts (Java classes, servlets and JSPs, XML, etc.) were all managed out of a source control system.

To prepare a content-rich application for distribution, release engineering (releng) would check out all the application artifacts and build and package them as an installer, a war or an ear.  They would then export the associated Docapp as a Docapp Archive and place it with the application package.  These resources were then placed on a download site where customers could find them.

To install the application, the customer would use the installer or deploy the war or ear. They would then use DAI (Documentum Application Installer) to install the Docapp Archive into one or more production Docbases.

So, that is how it was.  The System of Record for your source artifacts was the Docbase itself.  But what was so wrong with that, I hear some of you ask?  Well, a few things as it happens:-

1. Having different Systems of Record is just plain wrong, and what made us so special anyway?
2. Versioning semantics for Docbase artifacts are not always the same as they are for source control, complicating development, histories, labelling and snapshotting
3. Releng needed to own and understand Docbases, and to construct Docbase processes (not all of which were that automatable), which was hugely burdensome for them

With reference to the versioning semantics, it is important for us all to recognize that the Docbase is just plain bad at development versioning.  For example, types in a Docbase cannot be versioned, but code certainly depends on particular versions of types.  Now, if we could reliably version artifacts, then we could support explicit versioned dependencies, even when the target environment does not support them. Explicit versioned dependencies allow us to bind code to specific versions of Docbase resources and have those dependencies validated.  The upsides of reliable versioning are hopefully evident.

The Present

Recognizing that content management would soon standardize through efforts like CMIS, and that this would foster an entire industry of organizations building content-rich applications, we knew we would have to address these shortcomings and move more mainstream.

So we built Composer: a system of linked XML-based artifacts, plus an install process that can transform these XML-based artifacts into Docbase objects, preserving the links by converting them to Docbase links, i.e. OIDs.
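To make that concrete, here is a purely illustrative sketch of what such a linked artifact might look like.  The element names, namespace URI and URN are invented for illustration (the real artifacts use Composer's EMF-based schema); the point is that the super-type reference is a symbolic link, resolved to an OID only at install time:-

```xml
<!-- Hypothetical sketch only; not the real Composer artifact schema -->
<Artifact:Artifact xmlns:Artifact="urn:example:documentum:artifact"
                   urn="urn:com.example.app:type:soptype">
  <dataModel>
    <type name="soptype">
      <!-- A cross-artifact link, expressed symbolically rather than as an OID.
           The install process resolves it against the target Docbase. -->
      <super_name href="platform:/resource/CoreProject/Artifacts/dm_document.type?name=dm_document#/"/>
    </type>
  </dataModel>
</Artifact:Artifact>
```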

No longer is there a need for a "System of Record" Docbase.  Docbase artifacts can now be developed in the same Eclipse IDE right alongside all your other application artifacts.  They can also be managed out of the same source control system, giving us reliable versioning semantics and greatly simplifying the releng process.

The Future

OK so we have ironed out that little wrinkle.  Docbase artifacts are now analogous to other source artifacts and therefore we are all now followers of the same development model.   In this respect we’ve become a bit more standard and hopefully removed a big barrier to entry for release engineering departments.  We believe this will help promote EMC|Documentum as the platform for developing content-rich applications.

So what’s next?

In a word "OSGi".

A phenomenon is sweeping the Java community in the form of OSGi, a dynamic module system for Java. Most application servers are OSGi-based already.  And this phenomenon will undoubtedly impact the ECM community too.

OSGi is mature.  It is over 10 years old and has grown out of the embedded systems world.  So this is not new technology by any stretch; it may just be new to some of us.  It is also standards-based.

OSGi promotes a world of modules (bundles, in their parlance) that can be dynamically inserted into, or removed from, any live, running system with no interruption to the services it provides.  Each bundle has versioned dependencies that the system will honour and enforce.  And a system can support multiple versions of a module at any one time.
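To make the versioning point concrete, here is a minimal sketch of an OSGi bundle manifest.  The headers are standard OSGi; the bundle and package names are invented for illustration.  The version ranges are exactly the explicit versioned dependencies discussed earlier: the framework will refuse to resolve the bundle unless providers within the stated ranges are present.

```
Bundle-ManifestVersion: 2
Bundle-SymbolicName: com.example.claims.ui
Bundle-Version: 1.2.0
Import-Package: com.example.claims.model;version="[1.0.0,2.0.0)"
Require-Bundle: com.example.claims.core;bundle-version="[1.1.0,1.2.0)"
```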

For all these reasons (and many others) I believe that OSGi is the perfect fundamental building block for next generation ECM solutions and solution frameworks.

It is also very important to recognize that in this future world of solutions and solution frameworks we may well see direct, application-hosted manipulation of our XML systems of record, as well as continued support for the more traditional offline development model; both will co-exist quite happily side by side.  This direct manipulation will be supported by a family of web-based tooling built on the central core of the Composer tooling platform.

But this is still a little way in the future.  Because OSGi will affect how applications are delivered, the most noticeable change you will see in the short term will be to our friend the dar (Documentum ARchive), which I believe is already set to become an OSGi bundle.  This will prepare the way for it to be just like every other piece of the solution: developed the same way, delivered the same way, depended upon and dependent upon others.  Really binding into the application as the vital constituent part that it really is.


So what conclusions should you draw from this? 

Well, if you are considering developing a content-rich application (or larger) on Documentum, then you need to be following Composer's lead and adopting model-driven development.  Making XML models a central tenet of your development practices is the right thing to do.  Leverage Composer, the tooling platform, and in particular its EMF-based modelling, and you're not going to go too far wrong.

Also, getting more involved with Composer, the tooling platform, by extending it for your own purposes with your own Docbase artifacts, would be an excellent way to introduce yourself to the wonderful world of OSGi.

Happy Composing


DarDocs: Using Standard XML Tools on Your Artefacts

Besides the biggy of source control, another advantage of having an XML-based offline system of record for our Documentum artefacts is that we can use standard XML tools on those artefacts.

For example, you could generate HTML docs from them: DarDocs, if you will.

Take a transform like this:-

<!-- XSL transform to generate HTML from Documentum Type artefacts.
     Note: the xsl/xsi/xmi namespace URIs below are the standard ones; the
     Artifact namespace URI was lost in publishing and must match the
     namespace declared in your .type artefact files. -->
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:xmi="http://www.omg.org/XMI"
    xmlns:Artifact="urn:replace-with-artifact-namespace">
  <xsl:template match="/">
    <table border="1">
      <xsl:for-each select="Artifact:Artifact/dataModel/type">
        <tr>
          <td>Document Type</td>
          <td><xsl:value-of select="@name"/></td>
        </tr>
        <tr>
          <td>Super Type</td>
          <td><xsl:value-of select="substring-before(substring-after(super_name/@href,'name='),'#')"/></td>
        </tr>
        <tr bgcolor="#9acd32">
          <th>Attribute Name</th>
          <th>Attribute Type</th>
          <th>Attribute length</th>
          <th>Attribute Label</th>
          <th>Help Text</th>
          <th>Comment</th>
        </tr>
        <xsl:for-each select="primaryElement/attributes">
          <tr>
            <td><xsl:value-of select="@attr_name"/></td>
            <td><xsl:value-of select="attr_type/@values"/></td>
            <td><xsl:value-of select="@attr_length"/></td>
            <td><xsl:value-of select="attrAnnotations/locales/@label_text"/></td>
            <td><xsl:value-of select="attrAnnotations/locales/@help_text"/></td>
            <td><xsl:value-of select="attrAnnotations/locales/@comment_text"/></td>
          </tr>
        </xsl:for-each>
      </xsl:for-each>
    </table>
  </xsl:template>
</xsl:stylesheet>


And a simple ant task like this:-

<?xml version="1.0" encoding="UTF-8"?>
<project default="generateDocs">

   <target description="Generate DarDocs" name="generateDocs">
      <xslt basedir="Artifacts" destdir="Documents" style="style/DocTypes.xsl">
         <mapper type="glob" from="*.type" to="*.html"/>
      </xslt>
   </target>
</project>

And run it over a project with a few types in it, and you should get an HTML file for each type that looks something like this:-

Document Type: soptype
Super Type: dm_document

Attribute Name | Attribute Type | Attribute length | Attribute Label | Help Text | Comment
attr1 | STRING | 10 | attr1_label | Some user help on attr1 | Developer comment for attr1
attr2 | INTEGER | 10 | attr2 label | Some user help for attr2 | Developer comment for attr2

It would be simple enough for a release engineer to add this to your releng build so that each build also produces a set of DarDocs for your Dars.

Hat tip to Eddie Doey from UK Consulting for providing this great example.

Happy Composing!

Migrating your projects between versions of Documentum Composer

As we release new versions of Documentum Composer, you will need to migrate your projects to the new version.  But how do you do this?

It's pretty simple actually.  The first time you start the new version of Composer, create yourself a new workspace.  Then re-import your projects into that new workspace using the Documentum-specific project import wizard; File->Import->Documentum->Existing Projects Into Workspace:-


Any model changes that need to be applied to your artifacts will occur during this import.

The astute will realize that this is an irrevocable operation, so your team will all need to use the same version of Composer.

Happy Composing!

EMC Documentum Developer Edition

Today’s a first for EMC.

For the first time ever you can download a completely free version of the EMC Documentum platform for development: the EMC Documentum Developer Edition.  The one-click install includes the Content Server, DFS, Composer, DA and Webtop, all the tools you need to develop enterprise content-centric applications.

To complement this we've also launched a new developer-oriented community.

Enjoy and as always happy composing!

Configuring BOF Hot Deployment in Composer

By way of a follow-up to Don’s post I wanted to show you how this surfaces in Composer.

As we have discussed before, a BOF (Business Object Framework) module comprises Jardef artifacts:-


and Java Library artifacts:-


By design, implementation Jardefs are sandboxed; interface Jardefs are not, for all the reasons Don highlighted.  This means only your implementation Jardefs can be hot deployed.

Now if we take a look at the Java Library editor, you will immediately see the checkbox that indicates whether or not you would like to sandbox each Java Library.  Checking this option means that the Java Library will support hot deployment.  Unchecking it means it won't, but you can then share the Java Library across modules – which is sometimes useful.


Happy Composing

Troubleshooting BOF Development Mode

One of my readers had some trouble getting his local BOF development working. 

No matter what he did, he always got a NoClassDefFoundError when DFC attempted to load his service class locally.

It turns out that his issue was an Eclipse issue.  He had specified something like this as a VM argument in his JUnit launch config:-


And for some reason Eclipse was not resolving the variable “${project_loc}” properly.

After he re-specified this setting with a fully-qualified path, everything worked as expected.

A good tip is that DFC will report, during start-up, some basic status about BOF development mode, as far as it is concerned. 

Look out for:-

“Error during initialization of BOF Development mode”

if something catastrophic happens.  Or:-

“BOF Development Mode: can not find registry file {0}”

if DFC can't find the local registry file that it thinks it is looking for.  This was my reader's issue.  Or (hopefully):-

“******* BOF Development Mode is enabled *******”

if DFC is able to load the registry file.

I thought I would put this out there in case anyone else runs into the same problem.  And of course, if anyone knows why Eclipse is failing to resolve the variable, then we'd all love to know that too.

Happy Composing.

Documentum Composer: Smart Containers

My colleague David Louie, the Composer product manager, posted a nice demo on the EDN site demonstrating how to build and use a smart container.

One way to consider a smart container is as a composite application object model.  These can be defined in Composer.  WDK Webtop includes some generic runtime support, and together these can be used for rapid prototyping.

From this demo you can hopefully extrapolate either WDK Webtop customizations adding specific runtime support for the Police Case Folders (i.e. File->New->Police Case Folder, etc.) or even a purpose-built rich client, bound to DFS web services, generated from the smart container model!

Anyway, here is the EDN page.  I recommend you take a look.  The demo is the .exe link at the bottom of the page.

Hat tip to Dave for producing this.

Happy Composing!

Documentum Composer: What’s in an Install?

A couple of people have asked about, and commented on my blog about, the install process, and specifically about the dmc_dar type that Composer installs.  They all said that an overview of the install process would be helpful, so I will try to oblige.  I will keep this quite high-level at this stage, so if anything herein stimulates further questions then please do comment.

Composer was designed from the ground up as an offline system of record.  Each Docbase type typically has an artifact equivalent as its offline record.  Each artifact has a URN (a unique ID), and artifacts can be linked together via references to their URNs.  References can, and do, cross project boundaries.

A Project groups together a meaningful set of artifacts.  Because the artifacts it contains have dependencies, the Project in turn depends on the upstream Projects holding the referenced artifacts.  A Project goes through a build process to ensure its artifacts are valid for install.  The "validated & built" artifacts of a Project are exported into a dar file (Documentum ARchive) as a unit of distribution.  A Project can be installed, modified and then re-installed many times throughout its lifetime.  In many ways a Project is similar to an OSGi bundle (or an Eclipse plug-in, if you'd prefer).

So to the install.  Regardless of how it happens, through the UI or via ant, the install procedure is basically the same.  The installer will:-

  1. Check that all referenced projects are already installed in the target Docbase.  From their dmc_dar records it will also build a URN-to-Object-ID map, which is used later in the install process
  2. Attempt to resolve Parameters artifacts to real Docbase objects
  3. Run any pre-install (and afterwards post-install) procedures (really this is for backward compatibility with DAB, which often used pre & post install tasks to set up or finalize the Docbase)
  4. Perform a two-pass install.  The first pass handles attributes; the second pass handles references, using the above-mentioned URN->OID map
  5. Apply each artifact's install options: ACL assignment, folder linking and owner assignment
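The two-pass step can be sketched in a few lines.  To be clear, this is illustrative pseudo-logic and not Composer's actual implementation; the create_object and set_reference callbacks are hypothetical stand-ins for whatever the real installer does against the Docbase.  The reason for two passes is that an artifact may reference another artifact that appears later in the install queue, so references can only be fixed up once every object exists and the URN->OID map is complete:-

```python
# Illustrative pseudo-logic for a two-pass URN -> OID install.
# Not Composer's real code: create_object and set_reference are
# hypothetical stand-ins for the actual Docbase operations.

def install(artifacts, urn_to_oid, create_object, set_reference):
    """artifacts: list of dicts, each with a 'urn', plain 'attrs',
    and 'refs' mapping attribute names to target URNs.
    urn_to_oid: seeded from the dmc_dar records of referenced projects."""
    # Pass 1: create every object from its plain attributes and
    # record the new OID against the artifact's URN.
    for art in artifacts:
        urn_to_oid[art["urn"]] = create_object(art["attrs"])
    # Pass 2: every URN is now resolvable, so convert symbolic
    # references into real Docbase links (OIDs).
    for art in artifacts:
        for attr, target_urn in art["refs"].items():
            set_reference(urn_to_oid[art["urn"]], attr, urn_to_oid[target_urn])
```

Seeding urn_to_oid up front from the dmc_dar records of referenced projects (step 1 above) is what allows references to cross project boundaries.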

Now, as you will probably be aware, your Documentum project contains a couple of extra artifacts, hidden from normal view: the project's dardef artifact and the dmc_dar type definition artifact.

The dardef artifact (standing for dar definition) stores metadata about the dar which is used during installation.  It is system managed and should not be modified by hand.

The dmc_dar type defines a new "system" type.  It is the equivalent of DAB's dm_application type.  The artifact is built and added to your Project's dar and ultimately ends up in the install queue along with all your artifacts.  Its install intent is set to IGNORE, so it will be installed if required, and otherwise not.  At the end of the installation an instance of this type is created to record the installation.  Some metadata from the dardef is recorded in the instance's attributes and two pages of content are added.  Page 0 is actually the dar itself.  Page 1 is the URN->OID map.  This map will be used in subsequent "upgrade" installs, or by the install processes of other dependent Projects to help resolve their references.  Over time this type should become a standard system type, but initially we wanted to be masters of our own destiny and hence we decided to define it as a type in the Project.

I must emphasize at this point that I don't consider dmc_dar, its attributes or any of its content pages to be public at the moment.  So I would not advise writing scripts or other programs against them; at least, not any that you're not prepared to maintain.

Happy Composing!

Documentum Composer: What’s in a Documentum Project and what should I check in?

I have received a few comments asking what resources are in a Documentum Project, what needs to go into version control and what doesn't.  I think that because Composer contributes some "source" artifacts to the project, it is not immediately obvious what the set that needs source control actually is.  So, let's have a look.

Create a new Documentum Project and then open the Navigator view.  For those new to eclipse the Navigator view will give you an unfiltered view of the projects in your workspace.  You should see something like this:-


Let's start at the bottom and work up.  All the dot files hold metadata about the project; .project and .classpath are managed by Eclipse itself, .dmproject is managed by the Documentum artifact project system, and .dfsproject is managed by the DFS (Documentum Foundation Services) project system.  Please feel free to have a look inside these, but unless you know what you are doing I wouldn't recommend modifying them.  They are system managed, and modification (of the Documentum ones at least) by hand isn't a supported operation.

Artifacts is the artifact source folder.  It contains a default folder structure for you to store your source artifacts in.  Artifacts must be stored in this folder, but it is totally up to you how it is structured.  We chose to mirror DAB's folder structure so it would be familiar to migrating users.  At some point in the future I am hoping we will be able to add a project template facility that may offer other structures and include sample artifacts for a specific problem domain.

As the project is also a regular Java project, src is the registered Java source folder.  Java source can go in here if your project contains a module artifact, for example, as per this post and this one.

The resources in both of these folders get "built" into bin.  From here the set of built artifacts gets packaged into the dar in bin-dar, normally hidden.  Some artifacts can also have content, which is stored in content, again hidden.  The dar folder, also normally hidden, contains the source dardef artifact (i.e. the dar manifest, which is system managed) and the dmc_dar type, instances of which store information about the installed application (artifact URN->OID mappings, etc.).  This will be the subject of my next post, actually.

Installation Parameter Files is an optional place to store parameter/value maps for installing into different docbases.

Lastly, Web Services/src is the source folder for DFS web services (described in this post and this one).  Web Services/gen-src contains the Java source of the generated JAX-WS web service (a DFS web service produces a JAX-WS web service, which is then built), and Web Services/bin contains the built version of that web service along with all of its associated web resources.

So, to the check-in question.  A derived resource is any resource that is generated from source artifacts as part of the build process.  That also includes new source that is generated from your source, i.e. Web Services/gen-src.  These don't need to be checked in because they will be re-built.  So the set of resources that you do need to check in is highlighted in red:-


For those that asked I hope that helps clarify the situation.
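As a hedged sketch, the derived folders described above map naturally onto a version-control ignore list; folder names here are as per the default project layout shown earlier, so adjust to suit your own projects:

```
bin/
bin-dar/
Web Services/gen-src/
Web Services/bin/
```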

Happy Composing!