Archive

Archive for the ‘Uncategorized’ Category

EMC World-Momentum 2014 – The final view

May 15th, 2014 Comments off

Sitting in a plane going back home after four long days of bathing in Documentum information, discussions and fun, I am ready to form a final view of the IIG team’s vision, the challenges and opportunities I see, and everything else that was there.

First off, I have to thank the EMC and IIG marketing teams for organizing a very good venue. Overall everything went more than smoothly; only getting in and out of ballroom A with 12,000 people demanded a little patience. The party at TAO and Imagine Dragons are not easily forgotten.

The fact that (almost) all sessions for us Documentum geeks were on one level, with the perfectly located Momentum lounge in the middle and a separate area in the solution pavilion, made it almost seem as if the Core freaks were not there :-).

Now about the content. Read more…

xCP 2 Designer : Connecting the Dots

January 25th, 2013 Comments off

In November EMC introduced xCP 2.0, a complete overhaul of their Case Management solution. With xCP 2 comes xCP Designer, the new all-in-one configuration tool that is used to create xCP applications. xCP Designer is a great step forward from the days of Forms Builder, Process Builder, TaskSpace, Composer and the several other tools that were needed to create xCP 1 applications.

xCP Designer makes application creation much easier than before, but there is a learning curve. It is a powerful tool that can be used to create an object model, events, business processes, queries and the user interface. This can be a lot to take in at first, and since so much has changed from the previous version, the question arises: what goes where? And more importantly: how do you combine all these elements into a usable application?

One thing I learned while doing an xCP project is that when information needs to go from one component to another, Designer only allows you to specify inputs: you always go to the destination component and specify where it gets its input from. It is an information-pull model. Once you get your head around this, it becomes easier to connect the dots.

For instance, if you want to create a page to update account details for a customer, you may want to perform some validation on the inputs before allowing the user to press the Save button. Let’s say we want to check that a valid bank account number was entered. In xCP Designer this can be configured using a Stateless Process. The process would take a customer ID as input, perform some activities that check the account number in the bank’s CRM system, and produce a validation message as output.
The process could be as simple as this:

A few quick tips:

  • xCP Pages can only set and read the variables of a process. Packages cannot be used.
  • Don’t forget to set the ‘Stateless Service settings’ of your input variable to ‘Use for input only’ and for your output variable to ‘Use for output only’.
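To make the idea of such a validation activity concrete, here is a plain-Java sketch of the kind of check it might perform. The class and method names are invented for illustration, and the classic Dutch "elfproef" (11-check) stands in for the real lookup in the bank's CRM system:

```java
// Hypothetical helper, not part of xCP: illustrates the kind of check
// a validation activity in the stateless process might perform.
public class AccountNumberValidator {

    // Returns an empty message when the account number is valid, or an
    // explanation when it is not. An empty result maps nicely onto
    // "enable the Save button only when the validation message is empty".
    public static String validate(String accountNr) {
        if (accountNr == null || !accountNr.matches("\\d{9,10}")) {
            return "Account number must be 9 or 10 digits";
        }
        // Classic Dutch "elfproef": the weighted digit sum
        // (weights n..1 for an n-digit number) must be divisible by 11.
        int sum = 0;
        int weight = accountNr.length();
        for (char c : accountNr.toCharArray()) {
            sum += (c - '0') * weight--;
        }
        return (sum % 11 == 0) ? "" : "Account number fails the 11-check";
    }
}
```

In the real process this logic would live inside an activity; the page only ever sees the resulting message, which is exactly why the output variable must be marked ‘Use for output only’.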

Now that we have the process, how can a user interface Page make use of it?
First of all we need to add the Stateless Process as a Data Service Instance to the Page.

  • Open the Data Service Instances panel, click on the green + sign and add the process to the page
  • Select ‘Refresh when input value changes’ on the General tab of the process
  • On the Input tab, specify the widget whose value we wish to validate, in this case the Account number field.
  • Notice there is no Output tab, even though the process produces an output message

So where does the output message go?

  • Add a Value Display field to the form that will show the output message
  • For the value of the field you can select the output message of the Stateless Process

As an extra, you can configure the Save button so that it is only enabled when the validation message is empty. You can do that on the Behavior tab of the Save button.

As you can see, when you want to connect the dots in xCP Designer you must always specify on the Destination object where it gets its inputs.

I hope this post was useful to you. You can leave a comment if you have anything to add.

Categories: Documentum, Uncategorized, xCP Tags:

How the internet is changing economics

January 18th, 2012 Comments off

This post is a response to an article of the same name by Kas Thomas.

Kas noticed that in ‘the old days’ most markets were dominated by two or three brands. For instance, Coca-Cola and Pepsi dominate the soft-drink market. With the advent of the internet, some new markets have emerged and many existing markets are changing drastically. Kas observed that internet-based markets seem to be dominated by just one big player: Google dominates the search market, Amazon dominates the online bookstore market, Facebook dominates the social network market, etc.
Kas predicts that major shake-ups are due as markets move to the internet. Where there was room for several big players before, there will be just one left after the transition to an online market is complete.

To me, this sounds like a horror scenario. Like Kas, I am no economics major, but having just one dominant market player means a practical monopoly, and monopolies mean high prices or less-than-top-quality service.

Though this scenario will probably come true in some markets, I think there is hope. Though the internet is apparently a good place to create a monopoly, the opposite is also true: the internet can be a great unifier that provides chances to smaller companies. In the automobile industry, for instance, more and more cars are sold through the internet. There are some great search engines that let you search for your preferred make, model, etc. The result list will show cars offered by big companies, small firms and private sellers. This way of working prevents a monopoly and promotes competition, leading to lower prices and higher-quality service.
The same goes for electronic equipment. There are several sites that help you find the right equipment, show specs and reviews, and list the stores that sell the product, ordered cheapest first.

I also think that the monopoly scenario may be more relevant in the US than in other countries. Amazon, for instance, is a popular site in Holland, but the local company bol.com is doing just as well. The local company has an advantage: it knows the local market, culture and language.

Finally, I don’t think there will ever be just one operating system for our devices. Microsoft managed to create a monopoly 30 years ago, when the digital age was just starting, but no one will be able to repeat that now. With Apple, Google and Microsoft there is just too much money and power in play to let that happen. There are fashions in operating systems: one year iOS will be in vogue, the next year Android will be hip, or the newest Windows version will be the new must-have. I don’t see a monopoly in smartphone or tablet OSs any time soon.

Categories: Uncategorized Tags:

Testing Documentum Customizations without Documentum

January 5th, 2012 1 comment

Usually when developing customizations for Documentum I test them by running them against a Documentum repository. This is easy to do, but it has a number of drawbacks. First of all, it requires the availability of a Documentum server and a connection to a repository. Second, the repository must be set up in such a way that your code can be tested; this could mean that your code has to be installed in the repository before you can test it. And third, it requires that most of your code is in a testable state, i.e. all components can be compiled and run without apparent errors.

At times it is desirable to be able to test your code without actually running it against a Documentum content server. This may simply be the case if you do not have access to a Documentum content server to test against, or maybe your code is part of a continuous build process where having access to a content server is not desirable.

Writing unit tests that do not require actually running against a Documentum server means that we have to dive into the world of mock objects. Using mock objects we can simulate the desired behavior of objects and interfaces without using the “real” object. If we can use mock objects to simulate the behavior of Documentum DFC objects we should not need an actual Documentum server to run our unit tests. The big question of course is: can it be done?

First of all you need a framework for generating mock objects. I started out using EasyMock (www.easymock.org) but soon ran into its limitations: it cannot mock static classes or private methods. You can probably get quite far with EasyMock, but it will require that you develop your code around your unit tests.

If you are really serious about writing your Documentum unit tests you will need a framework like PowerMock (www.powermock.org). PowerMock can be used as an extension of EasyMock and gives you – amongst other things – the ability to mock static classes, mock private methods, create a partial mock of an object or even set a private variable of a class to a specific value.  With PowerMock you can write your test units around your code instead of the other way around.

Unit testing a DfSingleDocbaseModule

Now let’s get to work. Assume you have a custom object type called “my_document”, subtype of “dm_document”. This object type has a custom String attribute called “special_value”. The “my_document” object type has its own TBO that is accessed through the “IMyDocumentTbo” interface.  The “IMyDocumentTbo” interface contains the “getSpecialValue” method which retrieves the value of the “special_value” attribute.

We also have a workflow with a BOF Module activity that retrieves the “special_value” value of an object of type “my_document”. The BOF Module activity uses a custom module consisting of the “IMyModule” interface and the “MyModuleImpl” implementation class that extends the “DfSingleDocbaseModule” class.

The “IMyModule” interface looks like this:

public interface IMyModule {
    String getObjectSpecialValue(final String objectId) throws DfException;
}

The “MyModuleImpl” class typically looks something like this:

public class MyModuleImpl extends DfSingleDocbaseModule implements IMyModule {

    public final String getObjectSpecialValue(final String objectId)
            throws DfException {
        String specialValue = "";
        IDfSession session = null;
        IDfSessionManager mgr = null;
        try {
            mgr = this.getSessionManager();
            session = mgr.getSession(this.getDocbaseName());
            IMyDocumentTbo document = (IMyDocumentTbo) session
                .getObject(new DfId(objectId));
            specialValue = document.getSpecialValue();
        } catch (DfException e) {
            DfLogger.error(this, "getObjectSpecialValue::Error: ", null, e);
            throw e;
        } finally {
            if (session != null) {
                mgr.release(session);
            }
        }
        DfLogger.debug(this, "getObjectSpecialValue::specialValue=[{0}]",
            new String[] {specialValue}, null);
        return specialValue;
    }
}

Writing a unit test for the “MyModuleImpl” class has a number of challenges. The static class DfLogger is used for logging purposes. Since we do not really want to do any logging we need to mock this class. PowerMock supports this with the “PowerMock.mockStatic()” method. Furthermore the private methods “getSessionManager()” and “getDocbaseName()” of the “DfSingleDocbaseModule” class are used to get access to Documentum. Because we do not really want to access Documentum we need to mock the behavior of these methods. Fortunately PowerMock also supports the concept of a partial mock, which means that some methods of a class can be mocked while others are executed normally.
A PowerMock unit test typically consists of 4 parts:

  1. The definition of the objects to mock.
  2. The specification of expected behavior of the unit to test.
  3. Replaying the test.
  4. Verifying the result.

For the test unit we need to mock a number of objects:

  • the Documentum session manager
  • the Documentum session
  • the custom Documentum object (my_document)
  • the DfLogger

We also need to partially mock the module itself to be able to define the behavior of the “getSessionManager()” and “getDocbaseName()” methods. The list of methods to mock can be supplied to the “PowerMock.createPartialMock()” method as a String array.

So the definition of the objects to mock looks like this:

final String[] mockMethods = new String[] {"getSessionManager", "getDocbaseName"};
final IMyModule moduleMock = PowerMock.createPartialMock(MyModuleImpl.class, mockMethods);
final IDfSessionManager mgrMock = PowerMock.createMock(IDfSessionManager.class);
final IDfSession sessionMock = PowerMock.createMock(IDfSession.class);
final IMyDocumentTbo documentMock = PowerMock.createMock(IMyDocumentTbo.class);
PowerMock.mockStatic(DfLogger.class);

Now we have to define the expected behavior of the module method we want to test. In this example there are 3 ways in which behavior is expected:

  1. Public methods that return a value, i.e. an object of some kind. This behavior is expected using the regular "expect" method from EasyMock.
  2. Private methods that return a value. This behavior is expected using the "PowerMock.expectPrivate()" method from PowerMock.
  3. Public methods that do not return any value (void methods). These can simply be called on the mock without expecting a return value.

The trick is that when a return value is expected we can tell our test unit to return one of the mocked objects instead of a real object. With this mocked object we can then specify further expected behavior.

For “MyModuleImpl” the expected behavior looks like this:

PowerMock.expectPrivate(moduleMock, "getSessionManager").andReturn(mgrMock);
PowerMock.expectPrivate(moduleMock, "getDocbaseName").andReturn(REPOSITORY);

expect(mgrMock.getSession(REPOSITORY)).andReturn(sessionMock);
expect(sessionMock.getObject(anyObject(DfId.class))).andReturn(documentMock);
expect(documentMock.getSpecialValue()).andReturn(SPECIAL_VALUE_VALUE);

DfLogger.debug(anyObject(Object.class), anyObject(String.class),
    anyObject(String[].class), anyObject(Exception.class));
mgrMock.release(sessionMock);

The first 2 lines specify the expected behavior of the mocked private methods of the module, returning the session manager mock and the name of the repository. The following 3 lines specify the expected behavior of some public methods on the session manager, the session and the document mock objects. The final 2 lines specify expected behavior of methods that do not return any value.
To replay and verify the expected behavior we need to tell PowerMock which mocks to replay and which to verify. Fortunately PowerMock provides "replayAll" and "verifyAll" methods, so we do not need to list the separate mocks. Between replaying and verifying we perform the action we want to test and check whether the result is correct.

For “MyModuleImpl” replaying and verifying looks like this:

String specialValue;
try {
    PowerMock.replayAll();
    specialValue = moduleMock.getObjectSpecialValue("anyId");
    assertEquals(specialValue, SPECIAL_VALUE_VALUE);
} catch (DfException e) {
    fail("Unexpected DfException");
}
PowerMock.verifyAll();

Finally, to execute the test correctly, PowerMock requires a number of annotations on the unit test class:

  • @RunWith(PowerMockRunner.class) specifies that the unit test is run using PowerMock.
  • @PrepareForTest({…}) specifies an array of classes to use during the test. Unfortunately the PowerMock documentation is not quite clear on which classes to include. My advice is: include all static classes you are mocking and all custom classes.

For the “MyModuleImpl” unit test the following setup works:

@RunWith(PowerMockRunner.class)
@PrepareForTest( {MyModuleImpl.class, IMyDocumentTbo.class, DfLogger.class})

Putting it all together the entire unit test looks like this:

@RunWith(PowerMockRunner.class)
@PrepareForTest( {MyModuleImpl.class, IMyDocumentTbo.class, DfLogger.class})
public class TestModuleWithMock extends TestCase {

	private static final String REPOSITORY = "repository";
	private static final String SPECIAL_VALUE_VALUE = "This is your special value";

	@Test
	public final void testModule_getSpecialValue() throws Exception {
		final String[] mockMethods = new String[]
			{"getSessionManager", "getDocbaseName"};
		final IMyModule moduleMock = PowerMock.createPartialMock(MyModuleImpl.class,
			mockMethods);
		final IDfSessionManager mgrMock = PowerMock.createMock(IDfSessionManager.class);
		final IDfSession sessionMock = PowerMock.createMock(IDfSession.class);
		final IMyDocumentTbo documentMock = PowerMock.createMock(IMyDocumentTbo.class);
		PowerMock.mockStatic(DfLogger.class);

		PowerMock.expectPrivate(moduleMock, "getSessionManager").andReturn(mgrMock);
		PowerMock.expectPrivate(moduleMock, "getDocbaseName").andReturn(REPOSITORY);

		expect(mgrMock.getSession(REPOSITORY)).andReturn(sessionMock);
		expect(sessionMock.getObject(anyObject(DfId.class))).andReturn(documentMock);
		expect(documentMock.getSpecialValue()).andReturn(SPECIAL_VALUE_VALUE);

		DfLogger.debug(anyObject(Object.class), anyObject(String.class),
			anyObject(String[].class), anyObject(Exception.class));
		mgrMock.release(sessionMock);
		String specialValue;
		try {
			PowerMock.replayAll();
			specialValue = moduleMock.getObjectSpecialValue("anyId");
			assertEquals(specialValue, SPECIAL_VALUE_VALUE);
		} catch (DfException e) {
			fail("Unexpected DfException");
		}
		PowerMock.verifyAll();
	}
}

Once you have this test running correctly you can extend the test by including exception handling. For instance you could change the behavior of the “sessionMock.getObject()” to throw a DfException and verify that this is handled correctly.
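As a sketch of such a negative test, the expectations could look like the following. This reuses the mocks defined above; the no-argument DfException constructor is used for simplicity, and the exact expectations may need adjusting to your setup:

```java
PowerMock.expectPrivate(moduleMock, "getSessionManager").andReturn(mgrMock);
PowerMock.expectPrivate(moduleMock, "getDocbaseName").andReturn(REPOSITORY);
expect(mgrMock.getSession(REPOSITORY)).andReturn(sessionMock);

// Make getObject() fail instead of returning the document mock:
expect(sessionMock.getObject(anyObject(DfId.class)))
    .andThrow(new DfException());

// The catch block logs the error and the finally block still releases
// the session, so both calls must be expected:
DfLogger.error(anyObject(Object.class), anyObject(String.class),
    anyObject(String[].class), anyObject(Exception.class));
mgrMock.release(sessionMock);

PowerMock.replayAll();
try {
    moduleMock.getObjectSpecialValue("anyId");
    fail("Expected a DfException");
} catch (DfException e) {
    // Expected: the module rethrows the exception.
}
PowerMock.verifyAll();
```

Note that DfLogger.debug is no longer expected: on the exception path the module never reaches that call.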

To wrap it up
Using PowerMock you can get a long way towards writing unit tests that do not require access to Documentum. Still, there will be challenges ahead: unit testing TBOs, for instance, seems to pose some problems of its own. But if you are serious about unit testing this way, you can achieve a fair amount of code coverage.

But then, is it worth it? If you start writing this kind of unit test, realize that in some cases you will have to put at least as much effort into writing the unit test as you put into writing the unit itself. Be sure to mention this to your project manager in advance.

Jeroen Teeling
ECM Consultant

SharePoint Governance

December 9th, 2011 Comments off

It has been a while now since SharePoint Connections 2011 in Amsterdam (22/23 November), but I am still excited about the focus on SharePoint Governance. My company focuses on Enterprise Content Management, so I understand the importance of Governance. SharePoint has a reputation for being easy and out-of-the-box, and it has not always been simple to convince clients of the importance of Governance when using SharePoint. That is why it is exciting to see increasing awareness of its importance in the community.
During the SharePoint Connections conference Dan Holme spent the keynote and another session talking about SharePoint Governance. A few months back I also attended the SPGov+IA workshop by 21apps, and a common theme in both was that the definition of Governance is often unclear and different people interpret it differently.

What is Governance?

SharePoint Governance means having an answer to the following questions:
1. Where
2. When
3. What
4. How
5. Who
6. Why
…knowing where you are now, where you want to go and how you plan to get there.
Below I will detail a few highlights from Dan Holme’s keynote speech. For the whole story you can see his presentation at http://bit.ly/danholme1111spointnl.
I can also recommend the SPGov+IA workshop (http://www.spgovia.com/) to get familiar with tools that can help you plan and implement your SharePoint Governance and Information Architecture.
Where
It is important to make sure the company has a vision and understands how SharePoint fits into the enterprise strategy.
When
Architect SharePoint as a platform that happens to deliver a solution first, instead of only producing what is easy, cheap or quick for the current project.
What
Focus on the requirements to make sure you know what you are doing, why you are doing it, what is behind each requirement and how you can make sure it worked in the end.
How
Establish a process to define how you introduce and deploy solutions.
Who
Define roles & responsibilities.
Why
Make clear why you are doing the project. Involve your users in the change management process to help with user adoption.

It is our responsibility as SharePoint consultants to make sure these questions are asked and answered, and to explain their importance to the business. I hope that with the more widespread awareness that Governance is a key factor for a successful project, it will become a standard part of all projects.

Astrid Verhoef
ECM Consultant