# Generating a mock/stub WCF web service from a WSDL

When working on integration projects you sometimes need to build a stub or mock service to emulate the behavior of the targeted system in your dev environment.

Visual Studio’s Add Service Reference dialog provides an easy way to generate the client code from the WSDL of the service you are invoking. Unfortunately there is no equivalent dialog for generating a server-side stub/mock.

There are various approaches you can take here but using svcutil.exe has been the most pain-free for me.

Here’s an example of how to go about it:
svcutil /mc UserService.wsdl UserTypes.xsd

The /mc parameter generates a class file with all the data types defined in the .xsd as well as the interfaces for all the operations defined in the WSDL. It also provides you with a starter .config file that you’ll then need to tweak to define the port on which the service is going to be hosted.

Once you have these files, create a new WCF Service Library project and add the generated class files. Then create an implementation class that implements the generated interfaces. To keep things simple you might want to write code for just the operations that you’re calling from the client side.
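As a sketch, the implementation class can just return canned data. The contract and type names below (IUserService, GetUserRequest/GetUserResponse) are placeholders for whatever svcutil generated from your WSDL:

```csharp
// Hypothetical names: substitute the interface and message types
// that svcutil generated for your service.
public class UserServiceStub : IUserService
{
    public GetUserResponse GetUser(GetUserRequest request)
    {
        // Return canned data so the client can be exercised end-to-end.
        return new GetUserResponse { UserName = "Test User" };
    }

    // Operations the client never calls can simply throw.
    public void DeleteUser(DeleteUserRequest request)
    {
        throw new NotImplementedException();
    }
}
```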

Now here’s a gotcha for those stubbing out a service generated by Oracle WebLogic. The Action (SOAP Action) attribute on the operations is sometimes the same for all the operations in the interface. WCF doesn’t support this since it doesn’t conform to the WSDL specification. You’ll know you’ve hit this issue when you get the following exception while trying to host your stubbed service.

```
System.InvalidOperationException: The operations xxx and yyy have the same action (). Every operation must have a unique action value.
   at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ActionDemuxer.Add(String action, DispatchOperationRuntime operation)
   at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime..ctor(DispatchRuntime dispatch)
   at System.ServiceModel.Dispatcher.DispatchRuntime.GetRuntimeCore()
   at System.ServiceModel.Dispatcher.ChannelDispatcher.OnOpened()
   at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
   at System.ServiceModel.ServiceHostBase.OnOpen(TimeSpan timeout)
   at System.ServiceModel.Channels.CommunicationObject.Open(TimeSpan timeout)
   at Microsoft.Tools.SvcHost.ServiceHostHelper.OpenService(ServiceInfo info)
```

To overcome this you can create a custom Dispatch Behavior that uses an alternate algorithm to assign incoming messages to operations. The Dispatch by Body Element WCF sample comes with a sample implementation that works well.

All you need to do is add the two class files from the DispatchByBodyBehavior zip file (it’s the same code that ships with the WCF samples) to your project.

Next, open up the class generated by svcutil and add the DispatchByBodyElementBehavior attribute to the service contract interface. You should now be able to host the service in WCF without any issues.

```csharp
[ServiceContract(Namespace = "http://Microsoft.ServiceModel.Samples"),
 DispatchByBodyElementBehavior]
public interface IDispatchedByBody
{
    [OperationContract(ReplyAction = "*"),
     DispatchBodyElement("bodyA", "http://tempuri.org")]
    Message OperationForBodyA(Message msg);

    [OperationContract(ReplyAction = "*"),
     DispatchBodyElement("bodyB", "http://tempuri.org")]
    Message OperationForBodyB(Message msg);

    [OperationContract(Action = "*", ReplyAction = "*")]
    Message DefaultOperation(Message msg);
}
```

# Updating Extended Properties of a Database using SQL Server SMO

Updating the extended properties of a database using SQL Server’s excellent Server Management Objects (SMO) API is not as straightforward as setting the value and calling Update().

The database.Alter() method needs to be called both before and after updating the value. I had to look up the code of El Pluto‘s awesome SQL Server Extended Properties Quick Editor project on CodePlex to figure this out.

```csharp
using Microsoft.SqlServer.Management.Smo;

/// <summary>
/// Sets the extended property of a database.
/// </summary>
/// <param name="serverName">The name of the SQL Server.</param>
/// <param name="databaseName">The name of the database.</param>
/// <param name="propertyName">The name of the extended property.</param>
/// <param name="value">The value of the extended property.</param>
private void SetExtendedProperty(string serverName,
    string databaseName, string propertyName, string value)
{
    var server = new Server(serverName);
    var database = server.Databases[databaseName];

    database.Alter();
    if (!database.ExtendedProperties.Contains(propertyName))
    {
        database.ExtendedProperties.Add(
            new ExtendedProperty(database, propertyName, value));
    }
    else
    {
        database.ExtendedProperties[propertyName].Value = value;
    }
    database.Alter();
}
```


# Dynamically setting multiple activity destinations in K2 with ASP .NET

When building a typical workflow you usually know which user or group needs to perform an activity at design time. Sometimes though the workflow needs to be more dynamic.

The issue I had to resolve recently involved building a workflow where the end-user individually picks the users who will perform the next step. Here’s a view of the workflow design.

The scenario involved an application being submitted for review. The application would go to an individual who is responsible for assigning a group of users (destination users) to review the application. The twist was that an individual picked the users for each application; it wasn’t a fixed group or role. The screen mockup shows how they do it.

When the person hits the ‘Assign Reviewers’ button the form then needs to turn up as a work list item for each of the reviewers (destination users) who get to review the application in parallel.

Implementing this process using K2/InfoPath is quite straightforward and is well documented in many places including this post titled ‘Activity Destination Users based upon a Repeating XML element‘ in a K2 underground blog.

It’s not as well documented for ASP.NET, though. The post ‘How To: Use a web service for destinations in K2 blackpoint‘ is close to what we want, but it’s targeted at using a web service.

K2 lets you set multiple destination users in one of two ways:

1. Using a SmartObject method in a role

2. Using Xml as a destination set

Going the SmartObject route was a lot of work for my simple requirement, so I chose the XML method. The idea is to take the list of users from the Assign Reviewers form and store them in a process XML field. The destination set is then configured to read the XML field and create a slot for each user.

FYI: See page 4 of the Advanced Destinations whitepaper for a description of the two approaches. <rant>Why the K2 KB portal needs a login is beyond me.</rant>

Step 1: Write the code to create the xml containing the list of users

The code block below accepts a semicolon-delimited list of domain\username values (e.g. domain\johna) and creates an XML document containing the list of users. This is then assigned to the process instance XML field called Reviewers.

```csharp
using System.Xml;
using SourceCode.Workflow.Client; // K2 client API (WorklistItem)

private static void AssignReviewers(WorklistItem item, string listOfUsers)
{
    var doc = new XmlDocument();
    var root = doc.CreateElement("UserList");
    doc.AppendChild(root);

    foreach (string user in listOfUsers.Split(';'))
    {
        var userNode = doc.CreateElement("Users");
        userNode.InnerText = user;
        root.AppendChild(userNode);
    }
    item.ProcessInstance.XmlFields["Reviewers"].Value = doc.OuterXml;
}
```
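For example, passing "MYDOMAIN\johna;MYDOMAIN\janeb" as listOfUsers stores the following document in the Reviewers field (indented here for readability; OuterXml emits it on one line):

```xml
<UserList>
  <Users>MYDOMAIN\johna</Users>
  <Users>MYDOMAIN\janeb</Users>
</UserList>
```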

Step 2: Create an xml schema based on the user list xml

This proved to be the trickiest part for me. Using xsd.exe as documented in this article didn’t work. After a lot of anguish I worked out that K2 was happy with the schema generated by InfoPath, so I opened an empty form in InfoPath and added a repeating text field (screenshot).

Next I exported the form to get to the .xsd (in InfoPath 2010 it is File -> Publish -> Export Source Files). Cleaning out the my: namespace and a bit of tweaking should give you the following schema definition, which works with the XML produced by the code block above.

```xml
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="UserList">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="Users" minOccurs="0" maxOccurs="unbounded"/>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
```

Step 3: Create an xml field for the process

Armed with the xml schema, we’re now ready to configure the workflow.

Fire up the K2 process designer and add the process XML field named Reviewers that we referred to in our code in Step 1.

To do this, open the Process General Properties window and expand the right panel. From the list select the Process/Activity Data node, expand it to select the name of your process, then right-click it and choose Add… to get to the Add XML Field dialog.

Name the field ‘Reviewers’.

Switch to the XML Schema tab and browse to pick the xsd file created in Step 2.

When you hit OK you should now be able to drill down and see the Users node.

The key is to make sure that the node’s icon has a green overlay, which flags it as a repeating node. If it’s there you should be fine; if not, repeat the steps above until you get the green overlay icon. Without it, the worklist item will not be assigned to multiple users.

Step 4: Setup the destination to read from the xml field

Select the activity which needs to be executed in parallel and click on the Destination Users node. Switch to Advanced Mode if you are not already in it (select the checkbox on the first page). In the Destination Rule Options select Plan per destination -> All at once. This tells K2 that when the item goes to multiple users, they can open and work on it in parallel.

Select ‘Create a slot for each destination’; this way each destination user gets their own slot.

Click on the Edit button to configure the Destination sets.

Click on the ellipsis to open the Context Browser, drill down to the process XML field that we set up earlier and drag the Users repeating node (the one with the green icon) onto the Name column.

You should now be all set to test out your dynamic multiple destination users! Running through the workflow you will now see that the worklist item gets assigned to each of the reviewers in parallel.

# SharpSvn: A Primer

The SharpSvn library basically gives you a .NET interface to all the operations that you would normally perform through a tool like TortoiseSVN.

I found myself needing this exact library while writing a tool that changes files that have been checked out from SVN.

The problem with manipulating files that are under SVN is that you need to be careful about renaming files (and sometimes even deleting them). If you don’t do it through the SVN API you will end up with duplicate files/folders in SVN, since SVN thinks it’s a new file.

To solve this I finally got a chance to crack open the SharpSVN library which is used by my favourite AnkhSVN.

1. Download the latest library from http://sharpsvn.open.collab.net/. You have to pick between 1.5 and 1.6; I went with 1.6 and didn’t run into any issues. I believe this should match the version of the SVN server you’re connecting to.

2. In your Visual Studio project add a reference to the following assemblies:
- SharpSvn.dll
- SharpSvn.UI.dll (Only needed if you need the UI to prompt for login)

3. If, like me, you’re building on a 64-bit OS and you want your app to run on a 32-bit OS, make sure the project that references SharpSvn.dll is set to build for the x86 platform (Build –> Configuration Manager –> Solution Platform).

4. Write your code using the SvnClient object. Here are some samples from the SharpSvn Wiki and some that I wrote.

CheckOut

```csharp
public void CheckOut()
{
    using (SvnClient client = new SvnClient())
    {
        client.CheckOut(
            new Uri("http://svn.collab.net/repos/svn/trunk/contrib"),
            @"c:\wc");
    }
}
```

Add new files to the working copy

```csharp
using (SvnClient client = new SvnClient())
{
    // TODO: Pass in a SvnAddArgs instance if you need to set optional settings
    client.Add(@"c:\wc\newfile.txt");
}
```

Check if a given path is a valid SVN working copy

```csharp
public static bool IsWorkingCopy(string path)
{
    using (var client = GetSvnClient())
    {
        var uri = client.GetUriFromWorkingCopy(path);

        return uri != null;
    }
}
```

Find out if a particular folder/file has been marked for deletion.

```csharp
public static bool IsDeleted(string path)
{
    if (!IsWorkingCopy(path)) return false;

    bool isDeleted;
    using (var client = GetSvnClient())
    {
        Collection<SvnStatusEventArgs> args;
        client.GetStatus(path, out args);
        isDeleted = args.Count > 0 && args[0].LocalContentStatus == SvnStatus.Deleted;
    }
    return isDeleted;
}
```
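The GetSvnClient() helper used above is not part of SharpSvn; it’s just a small factory of my own so that client setup lives in one place. A minimal version, assuming the current user’s credentials are good enough, might look like this:

```csharp
private static SvnClient GetSvnClient()
{
    var client = new SvnClient();
    // The current user's credentials are used by default; wire up
    // client.Authentication handlers here if the repository needs a login.
    return client;
}
```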

What’s even more awesome is that the guys who wrote this library actively support it (even over Twitter, thanks http://twitter.com/srijken!).

And that was even before I found out that they have a ready-made .wxs file for integrating the .dlls into my WiX installer package. Awesome!

# Automate Build for a ClickOnce Application Hosted on CodePlex using MSBuild

This is what I wanted my automated build to do:

1. Get the latest source from CodePlex
2. Update the version number in AssemblyInfo.cs
3. Build the project
4. Check-in the updated AssemblyInfo.cs
5. Label the project with the version number
6. Publish the ClickOnce package to my webserver

In order to achieve this I used the CodePlex Source Control Client (cpc.exe) to perform the get-latest and check-ins. I was not able to complete #5 as the cpc client does not provide labelling. Maybe once SvnBridge supports it I can update this guide to use a Subversion client.

I also wrote a command line utility, SetVersion.exe, that updates the version number in an AssemblyInfo.cs or .vb file. The source for this is published as SetVersion on the MSDN Code Gallery.

So without further ado this is the MSBuild project file that performs the tasks.

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <PropertyGroup>
    <CodePlexProjectName>mycodeplexprojectname</CodePlexProjectName>
    <Version>1.$(CCNetNumericLabel).0.0</Version>
    <BuildFolder>C:\MyBuilds\$(Version)\</BuildFolder>
    <IISPublishFolder>C:\MyInstallLocation\</IISPublishFolder>
    <ProjectFolder>MySolution\Client\</ProjectFolder>
    <BuildPublishFolder>$(ProjectFolder)bin\Release\app.publish</BuildPublishFolder>
    <BuildProject>$(ProjectFolder)Client.vbproj</BuildProject>
    <ToolsFolder>C:\Projects\SolutionFolder\Build\</ToolsFolder>
    <CodePlexClient>$(ToolsFolder)cpc.exe</CodePlexClient>
    <SetVersion>$(ToolsFolder)SetVersion.exe</SetVersion>
    <CodePlexUser>codeplexusername</CodePlexUser>
    <CodePlexPassword>codeplexpassword</CodePlexPassword>
  </PropertyGroup>

  <Target Name="Build">
    <Message Text="##Building version: $(Version)" Importance="high"/>

    <Message Text="##Cleaning folder..." Importance="high"/>
    <Exec Command="md $(BuildFolder)"/>

    <Message Text="##Getting latest version" Importance="high"/>
    <Exec Command="$(CodePlexClient) checkout $(CodePlexProjectName)" WorkingDirectory="$(BuildFolder)"/>

    <Message Text="##Updating build number" Importance="high"/>
    <Exec Command="$(SetVersion) $(BuildFolder)$(ProjectFolder)AssemblyVersionInfo.vb $(Version)"/>

    <Message Text="##Building" Importance="high"/>
    <MSBuild Projects="$(BuildFolder)$(BuildProject)" Targets="Publish"
             Properties="Configuration=Release;ApplicationVersion=$(Version)" />

    <Message Text="##Committing version change" Importance="high"/>
    <Exec Command="$(CodePlexClient) commit /username $(CodePlexUser) /password $(CodePlexPassword) /message ---Automated-Build---$(Version)" WorkingDirectory="$(BuildFolder)"/>

    <Message Text="##Publishing" Importance="high"/>
    <Exec Command="rd $(IISPublishFolder) /s /q"/>
    <Exec Command="xcopy $(BuildFolder)$(BuildPublishFolder)\*.* $(IISPublishFolder) /e /i /y"/>
    <Exec Command="xcopy $(IISPublishFolder)..\default.htm $(IISPublishFolder)"/>
  </Target>
</Project>
```

Although I’ve worked with MSBuild files in the past, this was the very first time I wrote one, so a few things to remember. Items defined in the PropertyGroup node are like variable declarations; you can then use them anywhere in your script in the format $(VariableName).

What’s even better is that you can override the default values in the build file with values from the command line. For example, to force a particular version number I can perform the build like this:

```
MSBuild ReleaseBuild.proj /p:Version=1.6.0.0
```

When getting the latest version I opted to create a new folder for each build and pull the files into that folder. This way I didn’t have to worry about cleaning the bin folders. Every new build would start from an empty folder.

In my case, rather than building the solution file I opted to build just the Client project since that would compile all the dependent projects. The key part that helped me create the necessary files for ClickOnce publishing was the Targets="Publish" parameter; that, along with the ability to set the ClickOnce version through the properties, provided an elegant solution for the tricky problem of keeping the AssemblyVersion and the ClickOnce application version in sync.

```xml
<MSBuild Projects="$(BuildFolder)$(BuildProject)" Targets="Publish"
         Properties="Configuration=Release;ApplicationVersion=$(Version)" />
```

I use CruiseControl to kick off the build process as well as drive the version numbering. CruiseControl is not absolutely necessary though, as the build file can be run from the command line, as long as a version number is specified.

I’ve packaged the build file along with the SetVersion.exe, cpc.exe (codeplex client) and the ccnet.config for download here.

# WCF Performance Optimization Tips

I wrapped up work on my last project and thought I’d share some performance challenges we faced when the product went live.

Keep in mind though that optimization options heavily rely on your application design and its usage scenarios.

Usage Scenario

The usage scenario for which the following optimizations worked is as follows. The WCF services are hosted under IIS and currently only serve requests that come from the desktop client application over the intranet. Roughly 300 instances of the client application are used concurrently.

Long running calls
The application design used a BackgroundWorker when making calls to the server, but the actual WCF call was synchronous. Synchronous calls work well in most scenarios, but when your service performs a long-running task (>1 sec) it can block other clients from connecting to the server. Asynchronous calls make better sense in this scenario.
Ref: Synchronous and Asynchronous Operations
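For reference, the service-side asynchronous pattern splits an operation into a Begin/End pair on the contract. The IUserService/GetUser names below are illustrative, not from the project:

```csharp
// Illustrative sketch: the Begin/End pair implements one logical
// GetUser operation asynchronously on the service side.
[ServiceContract]
public interface IUserService
{
    [OperationContract(AsyncPattern = true)]
    IAsyncResult BeginGetUser(int userId, AsyncCallback callback, object state);

    // Paired with BeginGetUser by naming convention; no [OperationContract] here.
    string EndGetUser(IAsyncResult result);
}
```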

Bindings
When designing your application, pick the binding carefully as it also affects performance. The WCF services were initially designed to use WSHttpBinding. This binding hurts performance, especially when options such as security, reliable sessions and transaction flow are enabled.

As part of the performance tuning we switched to using BasicHttpBinding as none of the WS features were actually being used by the application. This dramatically improved performance because we cut down on all the acknowledgements.
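The change itself is just configuration; a minimal sketch (the contract name here is illustrative):

```xml
<!-- Before -->
<endpoint address="" binding="wsHttpBinding" contract="MyApp.IUserService" />
<!-- After: plain SOAP over HTTP, no WS-* handshakes -->
<endpoint address="" binding="basicHttpBinding" contract="MyApp.IUserService" />
```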

For comparison: a default WSHttpBinding call with reliable sessions enabled took 9 messages on the wire, versus 2 messages for BasicHttpBinding.

The security we lost when switching over to BasicHttpBinding will be enforced at the network level by locking down the machines that are allowed to make calls to the service.

In our case we could have squeezed out more performance if we had used NetTcpBinding but that would have required IIS7 and WAS.

Ref:
-    WCF Binding Comparison
-    BasicHttpBinding compared to WSHttpBinding at SOAP packet level [Note: Although this author recommends WS over Basic, Microsoft specifies that secure and reliable sessions be disabled for load-balanced environments, which basically brings it down to Basic].

Service Model Throttling
The service throttling parameters are another key element to look at when performance tuning services. The name is a little misleading, though, as it implies that you want to throttle your service, when in fact these are default limits that the Microsoft engineers put in place to protect your service against DoS attacks.

In our case, due to the nature of our application usage, these settings caused the server to queue new requests once the default limit of 10 concurrent sessions was reached. What effectively happened was that once a few long-running queries were being processed, other requests started getting queued up even though server memory and processor usage were very low.

The resolution for this was to increase the default values of these settings (the .NET 3.x defaults are shown below) to a few thousand.

```xml
<behaviors>
  <serviceBehaviors>
    <behavior name="DefaultThrottlingBehavior">
      <serviceThrottling maxConcurrentCalls="16"
                         maxConcurrentSessions="10"
                         maxConcurrentInstances="2147483647" />
    </behavior>
  </serviceBehaviors>
</behaviors>
```

General guidelines when tuning performance

Benchmarks

When tasked with tuning for performance, the first requirement is to establish benchmarks of the current service levels and the expected service levels once tuning is complete.

Identify the bottleneck

The next key point is to identify the area that is causing the bottleneck. For the WCF services we measured the time taken on the client side against the actual execution time of the service to rule out the network as the bottleneck.

We worked backwards from the database call to ensure that they completed within the specified time.

Instrumenting your application is a key part of the initial design. The Enterprise Library provides instrumentation, and open-source tools such as log4net are lightweight and effective as well.

Effectively testing any tuning option is only possible if you can reproduce the issue in your development environment. The built-in load tester in Visual Studio will be a key part of your load-testing armoury.

Code Profiling

If the bottleneck points to your code, code profiling tools will help you isolate the problem areas. At the time of this writing Red Gate ANTS Profiler and JetBrains dotTrace are capable solutions. The code profiler built into Visual Studio 2008 is not as effective as these tools.

Finally, a shout-out to WCF MVP Buddhike, who saved my day by quickly pointing me towards the binding and security modes as the perf culprits.

# Silverlight

Silverlight 2.0 just went live. If you are a .NET developer building ASP.NET or WinForms/WPF applications this is a HUGE deal. Your .NET code can now run in the browser and across platforms (including the Mac, and Linux via Moonlight) without requiring the .NET Framework to be installed.

I have never spent much time learning AJAX, but I see XAML/WPF being used heavily in the future. When Silverlight 1.0 was first released I was excited at the prospect of re-using WPF knowledge to build web applications, but was sorely disappointed with the lack of tooling and controls. Fast forward to today and you have rich tooling support in Visual Studio, a number of controls (including a grid view and date picker), plus more controls being released by the client controls team.

Start learning Silverlight today; you’re going to need it soon. Goodbye AJAX, I will avoid you whenever I can.

To get started on Silverlight, see ScottGu’s great tutorials.

# WCF Add Service Reference gotcha with Windows Server

We recently switched from developing in Vista to Windows Server 2003. Someone had the bright idea that we should develop in the same environment the application is going to be hosted on. Go figure.

What that meant is that you run into weird issues like this one. When trying to add a service reference to a WCF service hosted under IIS, you keep getting this 'Add Service Reference Error':

```
Metadata contains a reference that cannot be resolved: 'http://merill/Services.Host/ClientProfile.svc?wsdl'.
The WSDL document contains links that could not be resolved.
The underlying connection was closed: An unexpected error occurred on a receive.
Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
An existing connection was forcibly closed by the remote host
Metadata contains a reference that cannot be resolved: 'http://localhost/Services.Host/ClientProfile.svc'.
Metadata contains a reference that cannot be resolved: 'http://localhost/Services.Host/ClientProfile.svc'.
If the service is defined in the current solution, try building the solution and adding the service reference again.
```

The key part of this message is the failure to download the XSD. When I tried accessing the .svc URL in a browser it worked fine, but accessing .svc?xsd=xsd0 brought up the generic 'cannot display webpage' message.

When you unleash your weapon (Process Monitor) on the csc.exe process (this is the compiler generating the XSD serialization code) you'll realise that the IIS identity IIS_WPG does not have access to the Windows\Temp folder. Grant it enough rights to the folder and voilà, problem solved.

Happy WCF programming on Windows Server!

# Making use of the ‘??’ operator in C#

The ?? (null-coalescing) operator was introduced in C# 2.0 and I made a mental note to use it whenever possible.

Recently I had to do some tinkering with good ole Request.Form and Request.QueryString, and I kept trying to get the neurons to connect and figure out the shorter way of doing it. I knew there was a better way than writing all this:

```csharp
string filter = Request.QueryString["Filter"];
if (filter == null)
{
    filter = "";
}
```


And Google is no help when you have no keyword to search for. This is where ReSharper came in handy and prompted me to replace the above code with this succinct version:

```csharp
string filter = Request.QueryString["Filter"] ?? "";
```
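The operator also chains from left to right, which is handy when a value can come from more than one place, for example checking the form first and falling back to the query string (a sketch, assuming it runs inside a page or handler):

```csharp
// Falls back from Form to QueryString to an empty default.
string filter = Request.Form["Filter"]
             ?? Request.QueryString["Filter"]
             ?? "";
```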


BTW: If you are coding in VS 2008, check out the ReSharper 4.0 nightly builds; it's awesome.