Monthly Archives: October 2006

Ah – the last day in Norway

I’ve had a really fun class this week. Norway is a beautiful place and the people are also very intelligent, which makes for a very fun week of training. I want to thank Zsolt Ujvari, who arranged the training here for Compello Software.

BTW – if you are a reader of my blog you are probably interested in Windows Workflow Foundation – if you live in Norway and are interested in working on a next-generation system using Windows Workflow Foundation (WF) – send

Code snippet for SerializationSurrogate

If you read my blog you know I’ve talked about SerializationSurrogate before.

Teaching WF persistence today in Norway, one of my students (Fritz Lowrey) suggested a snippet for creating the surrogate and the SurrogateSelector. So here is my attempt (just put it in your C# snippets directory inside My Documents\Visual Studio 2005\Code Snippets\Visual C#):

<?xml version="1.0" encoding="utf-8"?>
<CodeSnippet Format="1.0.0" xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <Header>
    <Title>Serialization Surrogate Snippet</Title>
    <Author>Jon Flanders</Author>
    <Description>A snippet to create a SerializationSurrogate and SurrogateSelector for serializing a .NET type that isn't marked with the SerializableAttribute and doesn't implement ISerializable</Description>
  </Header>
  <Snippet>
    <Declarations>
      <Literal>
        <ID>type</ID>
        <ToolTip>The type to create a surrogate for</ToolTip>
        <Default>MyType</Default>
      </Literal>
    </Declarations>
    <Code Language="csharp">
      <![CDATA[
public class $type$SurrogateSelector : SurrogateSelector
{
    public override ISerializationSurrogate GetSurrogate(Type type, StreamingContext context, out ISurrogateSelector selector)
    {
        // only handle the target type - defer everything else to the base class
        if (type == typeof($type$))
        {
            selector = this;
            return new $type$SerializationSurrogate();
        }
        return base.GetSurrogate(type, context, out selector);
    }
}

public class $type$SerializationSurrogate : ISerializationSurrogate
{
    public class $type$ContractRef : IObjectReference
    {
        #region IObjectReference Members
        public object GetRealObject(StreamingContext context)
        {
            //TODO: Add code to create a new instance of the real object type
            return null;
        }
        #endregion

        //TODO: Add fields for the data you'd like serialized
    }

    #region ISerializationSurrogate Members
    public void GetObjectData(object obj, SerializationInfo info, StreamingContext context)
    {
        //TODO: Add data from the original type instance
        Type t = typeof($type$);
    }

    public object SetObjectData(object obj, SerializationInfo info, StreamingContext context, ISurrogateSelector selector)
    {
        return null;
    }
    #endregion
}
]]>
    </Code>
  </Snippet>
</CodeSnippet>

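To show how the generated types get used, here is a minimal, self-contained sketch of the snippet expanded for a hypothetical type called Person (the TODOs filled in with one field), passed to a BinaryFormatter via its SurrogateSelector property:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

// A hypothetical type that is NOT [Serializable] and doesn't implement ISerializable
public class Person
{
    public string Name;
}

// What the snippet would expand to for $type$ = Person, with the TODOs filled in
public class PersonSurrogateSelector : SurrogateSelector
{
    public override ISerializationSurrogate GetSurrogate(Type type,
        StreamingContext context, out ISurrogateSelector selector)
    {
        if (type == typeof(Person))
        {
            selector = this;
            return new PersonSerializationSurrogate();
        }
        return base.GetSurrogate(type, context, out selector);
    }
}

public class PersonSerializationSurrogate : ISerializationSurrogate
{
    public void GetObjectData(object obj, SerializationInfo info, StreamingContext context)
    {
        // copy the state we care about out of the real instance
        info.AddValue("Name", ((Person)obj).Name);
    }

    public object SetObjectData(object obj, SerializationInfo info,
        StreamingContext context, ISurrogateSelector selector)
    {
        // push the state back into the freshly created instance
        ((Person)obj).Name = info.GetString("Name");
        return obj;
    }
}

class Program
{
    static void Main()
    {
        BinaryFormatter formatter = new BinaryFormatter();
        // attaching the selector is what makes the non-serializable type work
        formatter.SurrogateSelector = new PersonSurrogateSelector();

        Person original = new Person();
        original.Name = "Fritz";

        using (MemoryStream stream = new MemoryStream())
        {
            formatter.Serialize(stream, original);   // works despite no [Serializable]
            stream.Position = 0;
            Person copy = (Person)formatter.Deserialize(stream);
            Console.WriteLine(copy.Name);            // prints "Fritz"
        }
    }
}
```

This is the same trick the WF persistence services rely on when a type in your workflow's state isn't serializable.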

One reason to create your own WorkflowLoaderService

Harry Pierson was in my WF/WCF course I taught a few weeks ago.  He posted on his blog about features he thought were cool in WF here.

One of the things I thought was interesting that he caught during my course was that the WorkflowLoaderService is actually a pluggable service (just like every other service except for the WorkflowQueuingService). So of course it came to my mind to show an example of why you might want to create a custom loader service.

The WorkflowLoaderService has a very simple API – two overloads of a method named CreateInstance, both of which return an instance of Activity. Every time a host calls WorkflowRuntime.CreateWorkflow, the WorkflowLoaderService is called to actually create the instance. One of the CreateInstance overloads is for compiled Activities (the one that takes a Type as its argument) and one is for XAML activation (the one that takes two XmlReaders as its arguments – one for the workflow and one for the rules).

The DefaultWorkflowLoaderService is fairly simple – the CreateInstance that takes a Type uses Activator.CreateInstance to create an instance of the Activity Type.  The one that takes XAML is slightly more complex – but essentially uses the WorkflowMarkupSerializer to turn the XAML into an Activity.
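Roughly, that default behavior looks like this – a sketch of my own (the class name SketchLoaderService is made up, and this is an approximation, not the actual framework source):

```csharp
using System;
using System.Workflow.ComponentModel;
using System.Workflow.ComponentModel.Serialization;
using System.Workflow.Runtime.Hosting;
using System.Xml;

// Approximation of what DefaultWorkflowLoaderService does (not the real source)
public class SketchLoaderService : WorkflowLoaderService
{
    protected override Activity CreateInstance(Type workflowType)
    {
        // compiled workflows: just instantiate the Activity type
        return (Activity)Activator.CreateInstance(workflowType);
    }

    protected override Activity CreateInstance(XmlReader workflowDefinitionReader,
        XmlReader rulesReader)
    {
        // XAML activation: deserialize the markup into an Activity tree
        WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();
        Activity root = (Activity)serializer.Deserialize(workflowDefinitionReader);
        // (the real service also deserializes rulesReader and attaches the
        // resulting RuleDefinitions to the root Activity)
        return root;
    }
}
```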

So why might you want to replace this service? Well – there are many scenarios, but one that always comes to mind for me revolves around rule loading.

One of the great features of WF is being able to model some of your logic in rules versus code. In Visual Studio, whenever you add rules, those rules are stored in a .rules file alongside your workflow files. These .rules files are then compiled into the assembly as resources. Whenever an Activity first needs a rule, there is an infrastructure that loads the .rules file into a RuleDefinitions type (containing both any RuleSets and RuleConditions) and stuffs the object into a well-known DependencyProperty on the root Activity.

One of the features of rules that is so useful is being able to replace them at runtime with a different set of rules – but with the DefaultWorkflowLoaderService, the only way you can do that is if you use XAML activation. But what if you want to replace rules on a compiled Activity type without having to recompile it? The default infrastructure doesn’t allow this.

But if you build your own WorkflowLoaderService, then when a compiled Activity type is requested you could read the rules from an alternate location (based on configuration or some other algorithm), dynamically create the RuleDefinitions, and stick it into the Activity using the well-known DependencyProperty. Here is the code that does this in a simulated way (note that you’d have to change the algorithm that loads the alternate rules to something useful):

public class DynamicRuleWorkflowLoader : DefaultWorkflowLoaderService
{
    protected override Activity CreateInstance(Type workflowType)
    {
        // let the default service create the Activity instance
        Activity a = base.CreateInstance(workflowType);

        // load the alternate rules (simulated here with a fixed file name)
        WorkflowMarkupSerializer s = new WorkflowMarkupSerializer();
        object o = s.Deserialize(XmlReader.Create("AlternateRules.xml"));

        // attach the RuleDefinitions via the well-known DependencyProperty
        a.SetValue(RuleDefinitions.RuleDefinitionsProperty, o);
        return a;
    }

    protected override Activity CreateInstance(XmlReader workflowDefinitionReader, XmlReader rulesReader)
    {
        return base.CreateInstance(workflowDefinitionReader, rulesReader);
    }
}
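To plug a custom loader in, you add it to the WorkflowRuntime before starting it – if no loader is registered, the runtime falls back to the default one. A minimal hosting sketch (MyWorkflow is a hypothetical compiled workflow type, and AlternateRules.xml would need to exist for the loader above to succeed):

```csharp
using System;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// hypothetical compiled workflow type, just for illustration
public class MyWorkflow : SequentialWorkflowActivity
{
}

class Host
{
    static void Main()
    {
        WorkflowRuntime runtime = new WorkflowRuntime();

        // register the custom loader before StartRuntime so it is used
        // instead of the DefaultWorkflowLoaderService
        runtime.AddService(new DynamicRuleWorkflowLoader());
        runtime.StartRuntime();

        // CreateWorkflow now routes through DynamicRuleWorkflowLoader.CreateInstance,
        // so the instance starts with the alternate rules attached
        WorkflowInstance instance = runtime.CreateWorkflow(typeof(MyWorkflow));
        instance.Start();
    }
}
```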

Upgraded my Atlas (OK, now ASP.NET AJAX) Workflow Monitor

Since they’ve released beta 1 of ASP.NET AJAX (formerly known as Atlas), I have to upgrade all my Atlas samples. The first one I decided to tackle was my Workflow Monitor, since it was totally based on the Atlas server-side model (no custom JavaScript). It literally took me about 10 minutes – thanks to the Migration Guide.

I’m a little surprised actually that xml-script isn’t making it into the base product (it’ll be supported as part of the community CTP). I guess I can understand the reasoning – but IMO it was one of the coolest things about Atlas in terms of hooking non-ASP.NET 2.0 devs.

Link is the same – AtlasWorkflowMonitor (318k) – note that this time you have to have ASP.NET AJAX installed (in earlier versions you could just have the Atlas dll in your bin directory – now it is loaded from the GAC).

On my way to Norway

To teach a Windows Workflow Foundation 5-day course.  Everyone says it will be beautiful.

BTW – I don’t plan on updating the Atlas Workflow Designer until they release the next version of Atlas – read Bertrand Le Roy’s blog about why – Link to Atlas and more. They are about to change from closures to prototypes. This will be a great chance for me to re-architect the whole thing. The next big feature will be HTML-based rule editing.

Mea Culpa

So pretty much the worst thing an instructor can do is tell a class something that is wrong.  I did that last week teaching a WF/WCF “combo” course in Kirkland, WA.  What is worse is that one of my students blogged about two of them, which has caused a little controversy.

Here is Harry’s first entry

Here is Paul’s response

And Harry’s follow-up. To back Harry up – I know from talking with him that he doesn’t think WF is a *toy*. He had some realistic reservations about a couple of pieces.

So here is what I got wrong – in earlier builds of WF, the WebServiceInput/WebServiceOutput activities, in combination with the ASMX hosting layer, kept the Workflow InstanceID in Session (although now I am doubting that – I very clearly remember seeing that code in Reflector). It now uses a “normal” HTTP cookie. The problem I proposed in class still holds true: the ASMX hosting layer doesn’t allow you to change the namespace URI (see Paul’s blog for a manual fix for this), and it still sends a “session”-based cookie. That means if the client closes (let’s say the client is a Windows Forms application), the cookie is lost and the workflow is “abandoned” on the server. Of course if the client is a workflow – and the client persists – the cookies will get persisted as well, so you can get long-running workflows, but only if the client is a workflow.

So if the cookie gets “lost” there isn’t any other way to “rematch” the workflow to a new client. That points to another limitation of the OOB WF/ASMX integration layer: there isn’t any way (OOB, that is) to share a workflow instance among different users. So one of the details I taught (Session versus custom cookies) was incorrect – but the general usage model of the OOB WF/ASMX integration was correct (although I am pretty sure I didn’t use the moniker Harry associated with it).
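One client-side mitigation is to keep the cookie yourself: ASMX proxies generated from SoapHttpClientProtocol expose a CookieContainer, and CookieContainer is serializable, so the client can save it at shutdown and restore it later instead of losing the workflow. A sketch of the save/restore half (the proxy class and file path are assumptions):

```csharp
using System;
using System.IO;
using System.Net;
using System.Runtime.Serialization.Formatters.Binary;

static class WorkflowCookieStore
{
    // save the proxy's cookies (including the workflow "session" cookie) to disk;
    // call this before the client application shuts down
    public static void Save(CookieContainer cookies, string path)
    {
        BinaryFormatter formatter = new BinaryFormatter();
        using (FileStream stream = File.Create(path))
        {
            formatter.Serialize(stream, cookies);
        }
    }

    // restore the cookies on the next run so the proxy presents the same
    // workflow "session" cookie and reaches the same workflow instance
    public static CookieContainer Load(string path)
    {
        if (!File.Exists(path))
        {
            return new CookieContainer();
        }
        BinaryFormatter formatter = new BinaryFormatter();
        using (FileStream stream = File.OpenRead(path))
        {
            return (CookieContainer)formatter.Deserialize(stream);
        }
    }
}
```

The client would assign the loaded container to the proxy’s CookieContainer property before making any calls – but note this only helps the same client come back; it still doesn’t let a different user pick up the instance.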

IMO – the OOB WF/ASMX integration layer will be useful in about 25% or so of cases where someone wants to expose a workflow as an ASMX Web Service. I think in the other 75% of cases people will build their own layer that has a little more flexibility.

On to the SqlWorkflowPersistenceService. Let me be clear – I have extreme respect for the people who wrote this code (I actually know them personally). First of all – here is what Harry blogged that is incorrect (and I am not sure whether I explained this incorrectly or not, but let’s assume that I did): the OOB persistence service doesn’t load *all* instances when one host starts – it loads all *running* workflows (so workflows that are idled stay idled and persisted). So if two hosts go down and start back up, all running instances will get loaded into the first host that comes back up (no load balancing).

Also to me – the biggest limitation of the OOB persistence and tracking services is that if I use them together and then add my own TransactionScope activity with database access code, I end up with a DTC transaction. For the WF applications I’ve worked on, this was enough to make us want to create a custom tracking and persistence service.

There are a few more reasons to write a custom persistence service.  I won’t go into all of them here  – but there are good reasons to write a custom persistence service with WF – which means the OOB persistence service won’t be used 100% of the time – which was really my point – I think it is probably usable in about 75% of cases.

So here is my final word on this subject (I hope :)) :

The OOB WF/ASMX integration is useful mostly when there are workflows on both ends- and when you don’t need to share workflows across users.

The OOB SqlWorkflowPersistenceService is a very usable service if you are doing one or two hosts and need robustness but don’t need load balancing.

WF is not a toy  – Harry never said WF overall was a toy – just the two features here – he got the wrong information from me and hopefully this blog post has cleared that up.  Anyone who reads this blog knows that I actually love WF and think it is by far the best and most interesting part of .NET 3.0.