Wednesday, November 24, 2010

Adventures with BizTalk: HTTP "GET" Part 1

[Note: This post is based upon an old blog post that I'm migrating for reference purposes, so some of the content might be a bit out of date. Still, hopefully it might help someone sometime...]

A while ago I was involved in a BizTalk project where we had a conceptually simple (and I would expect common) requirement: we would be receiving messages from an external party, and as part of the orchestration processing of those messages, we needed to use information in the message to dynamically retrieve one or more files from an external website based on a provided URL, and then write them out to a pre-configured location in the filesystem.
I'll point out at this stage that the problem I'm attempting to solve here is actually getting the remote file: the parts about extracting the URL from the original incoming message and also writing the file once retrieved out to the filesystem are trivial...
Anyway, it couldn't be that hard to retrieve a remote file, could it? After all, BizTalk is all about connectivity, and has a host of adapters that should be able to solve this problem!
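Indeed, stripped of the BizTalk and orchestration context, the raw retrieval itself is close to a one-liner in plain .NET. A minimal sketch (the URL and target path are made-up placeholders, and production code would want error handling and credentials):

```csharp
using System.Net;

class RemoteFileFetcher
{
    static void Main()
    {
        // Hypothetical URL and drop location, for illustration only.
        string url = "http://example.com/designs/drawing-1234.pdf";
        string targetPath = @"C:\Drop\drawing-1234.pdf";

        using (var client = new WebClient())
        {
            // Issues an HTTP GET and streams the response body to disk.
            client.DownloadFile(url, targetPath);
        }
    }
}
```

The interesting part, as you'll see, is doing the equivalent from inside BizTalk.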
Over the next few posts, I'll describe my adventures in devising an appropriate solution to this problem, and you'll see that it wasn't as easy as it sounds (or as it should be)!
Potential Solutions

To start off in this post I'll list each of the potential solutions I considered:
That's it for this time, next time I'll start making my way through each of the solutions, discarding them at will!!!

Getting started in the Cloud

I haven't posted in a while because I've just started with my new company, Chamonix IT Consulting. Chamonix covers a number of core competencies such as architecture, systems integration, BI, portals and collaboration, and traditional app dev. However, one of the most exciting competencies is our focus on cloud computing. In fact, Chamonix practices what it preaches, and all of our LOB systems are cloud-based - we don't have a single on-premise server to host our LOB systems.

Over the last few weeks I've been rapidly ascending into the cloud and getting my head around what it all means from an architectural and business perspective, what its strengths and weaknesses are, and when and how a client might consider a cloud solution as opposed to a traditional on-premise one. Having all of your LOB systems cloud-based certainly helps in this regard, as you experience the pleasure and the pain first-hand.

To get a deeper perspective on some of the challenges for developing in the cloud, we decided to prototype a relatively simple resource management application that we'd deploy to Microsoft's Windows Azure platform. It's certainly proved to be an eye-opening exercise with a number of challenges, some related to cloud technologies, some related to emerging technologies on the Microsoft platform that could be used on-premise or in the cloud.

Over the next few weeks I'll post about some of our experiences, some of the challenges and how we've overcome them, and my overall take on whether we were successful in what we set out to achieve or not. So you know what's in store, here's a summary of some of the technologies we've touched and I'll be mentioning:

  • SQL Azure
  • Windows Azure (Web Role)
  • Azure Platform AppFabric Access Control Service
  • OpenID & OData
  • ASP.NET 4.0
  • Entity Framework
  • WCF RIA Services
Until next time!

Friday, November 5, 2010

A few of my not so favourite things...

[Note: This post is based upon an old blog post that I'm migrating for reference purposes, so some of the content might be a bit out of date. Still, hopefully it might help someone sometime...]

ASP.NET LoginName control casing

The ASP.NET LoginName control displays the name of the logged-in user. Unfortunately, it displays the name using whatever the user typed into the log-in form... so, if we have a user named "David", and I type "daVID" into the log-in form, the LoginName control will faithfully display my name as "daVID", rather than retrieving and displaying my name using the casing it was defined with.

I had a client that was unhappy with this behaviour. There are a number of work-arounds, including deriving a custom LoginName control from the ASP.NET LoginName control. In the end, I went with an approach that handles the OnAuthenticate event of the Login control on the log-in form (ie, the control the user types their username and password into): validate the user's credentials, and if they're valid, retrieve the user from the membership store and set the Login control's UserName property to the retrieved value. The Login control then takes over and creates the FormsAuthentication cookie etc using the correctly cased user name, and the LoginName control uses that throughout the lifetime of the login...
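A sketch of that approach in the log-in page's code-behind (the control ID is illustrative, and this assumes the standard ASP.NET Membership provider is in play):

```csharp
// LoginControl is an <asp:Login> control on the log-in page.
protected void LoginControl_Authenticate(object sender, AuthenticateEventArgs e)
{
    if (Membership.ValidateUser(LoginControl.UserName, LoginControl.Password))
    {
        // Fetch the user from the membership store to get the canonical casing...
        MembershipUser user = Membership.GetUser(LoginControl.UserName);

        // ...and overwrite whatever the user typed, before the Login control
        // issues the FormsAuthentication cookie.
        LoginControl.UserName = user.UserName;
        e.Authenticated = true;
    }
    else
    {
        e.Authenticated = false;
    }
}
```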

A small but annoying "feature"...

IIS 5.1 MaxConnections

I was creating a test harness for conducting some performance benchmarking of a BizTalk solution using LoadGen. The development environment in this case was based on Windows XP, and hence IIS 5.1, whereas the actual test environment was based on Windows Server 2003, and hence IIS 6.0.

Whilst developing the test harness, however, I encountered "Access denied" errors back from IIS whenever I ramped up the number of messages I was sending to the WCF endpoint BizTalk was exposing...

After checking the obvious security-related bits, my first thought was that it was something BizTalk-specific that was causing message throttling to occur. It wasn't. Nor was it some WCF-level setting.

In the end it turned out to be an IIS 5.1 limit on the number of "active" connections it allows at any one time: 10, by default. Because LoadGen was spinning up multiple threads to load the endpoint, and the response from BizTalk was taking a while to be generated (it came from another system) while IIS held each connection open, I was running into this limit.

You can raise the limit, apparently to a maximum of 40 concurrent active connections, using the following command:

cscript adsutil.vbs SET w3svc/MaxConnections 40

IIS 6.0+ doesn't suffer from this limitation, as far as I've read.

MS DTC on Windows XP & Vista: Error Message 5: Access is Denied

If you receive something like this:

ERROR MESSAGE 5 - ERROR MESSAGE 5 - Access is Denied
Invoking RPC method on TURTLE86
Problem:fail to invoke remote RPC method
Error(0x5) at dtcping.cpp @303
-->RPC pinging exception
-->5(Access is denied.)

This error will only occur if the destination machine is a Windows XP or Windows Vista machine. It's caused by an additional security restriction in the RPC layer that is configured on these client operating systems. More details on this security aspect are described in the "RPC Interface Restriction" article on TechNet.

To get rid of this error just follow these steps to configure the registry key and REBOOT the machine:

1. Click Start, click Run, type Regedit, and then click OK.
2. Locate and then click the following registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT
3. On the Edit menu, point to New, and then click Key. Note: If the RPC registry key already exists, go to step 5.
4. Type RPC, and then press ENTER.
5. Click RPC.
6. On the Edit menu, point to New, and then click DWORD Value.
7. Type RestrictRemoteClients, and then press ENTER.
8. Click RestrictRemoteClients.
9. On the Edit menu, click Modify.
10. In the Value data box, type 0, and then click OK. Note: To enable the RestrictRemoteClients setting, type 1.
11. Close Registry Editor and restart the computer.
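Equivalently, the registry change can be captured in a .reg file that you import before rebooting (a sketch; a value of 0 disables the restriction, 1 re-enables it):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\RPC]
"RestrictRemoteClients"=dword:00000000
```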


Thursday, November 4, 2010

A few of my favourite things

[Note: This post is based upon an old blog post that I'm migrating for reference purposes, so some of the content might be a bit out of date. Still, hopefully it might help someone sometime...]

Deployment Framework for BizTalk

As if BizTalk development wasn't tricky enough, deploying BizTalk solutions can be a very painful exercise, particularly when you're doing it repeatedly.

The Deployment Framework for BizTalk is a blessing, providing a highly configurable MSBuild-based deployment framework that's integrated right into Visual Studio, making deployment as simple as clicking a toolbar button. On top of this, it also provides a suite of extra features over more manual BizTalk deployment techniques, including generation of server installation MSIs and SSO-based runtime configuration.

VB.NET XML Literals & Linq to XML

VB.NET seems to receive less attention than C# in many cases, but one case where it surpasses C# in .NET 3.5 is the ability to create XML literals. Using XML literals you can declare an implicitly typed variable, assign it an XML literal, and its type (XElement) will be inferred from the assignment.

Add to this Linq to XML, which brings the power of LINQ to querying and manipulating XML fragments, and you can do some pretty cool stuff.

The following example demonstrates both, creating an XML representation of a set of search criteria.

Assuming we have a variable called searchCriteria that is an array of SearchCriterion objects with Name and Value properties, it transposes this array into an XML representation (in my case, for logging purposes).

Dim searchCriteriaXml = <searchCriteria>
                            <%= From c In searchCriteria
                                Select <searchCriterion name=<%= c.Name %> value=<%= c.Value %>/> %>
                        </searchCriteria>

The result of this is something that looks like the following:

  <searchCriteria>
    <searchCriterion name="..." value="..."/>
    <searchCriterion name="..." value="..."/>
  </searchCriteria>

Resize a VHD

For ages I thought it wasn't possible to resize a virtual hard disk that had been created at a particular fixed size... I thought it was stuck that way. We had several virtual development environments that had been created from a pathetically small Windows XP virtual image, and they were, as far as I thought, marooned on a 7GB C: drive.

I'm not sure what caused me to take another look, but I'm glad I did... I won't post the exact process, but using information and the tools from the following URLs, I finally managed to resize these VHDs. Hurray!

Taking an ASP.NET application offline

Did you know you can take an ASP.NET application "offline" by placing a file named App_Offline.htm in the root of the virtual directory for the application? I didn't, until recently...
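The content of App_Offline.htm is entirely up to you; while the file exists, the runtime shuts down the application and serves it for every request. A minimal example might be:

```html
<html>
  <head><title>Down for maintenance</title></head>
  <body>
    <p>This application is temporarily offline for maintenance. Please check back shortly.</p>
    <!-- Keep the file over 512 bytes, or some browsers (notably IE) may
         substitute their own "friendly" error page for the response. -->
  </body>
</html>
```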

Entity Framework

I trialled EF 1.0 on a project recently. I'd been looking to try it out for a while to see how it compared to our MyGeneration-based DAL approach, but had struggled to find a good place for it. I'd done a fair bit of reading in the meantime (including Julia Lerman's outstanding Programming Entity Framework book), so I knew heading in that it's a huge topic in itself and, being a v1.0, not the finished product yet.

Given all that, and working within its limitations, I have to say it was actually on the whole a very pleasant experience to use, and it fit the bill in this case very nicely.

There's still a raft of issues and limitations with v1.0 that mean you need to evaluate whether it's the right fit for your situation (many of which are being addressed by v2.0, aka EF 4.0, released with .NET 4.0), but I have to say that in my case it was very "nifty" to use and to write Linq to Entities queries against the conceptual model, and it performed very well, especially if you optimise the model and the Linq to Entities queries.

So all in all, I'd say that in my experience it's not as "bad" as the rap it sometimes seems to get; you just need to know what to expect heading in, and evaluate whether it's really the right fit for your purpose.


Another favourite is an extension library for WCF that provides the following:
  • SOAP Header support for WCF
  • Adding WSDL Documentation from Source Code XML Comments
  • Override SOAP Address Location URL
  • Single WSDL file for better compatibility with older SOAP tools.
Of these, I've utilised the WSDL documentation from source code (this does, however, require that you deploy the VS-generated .xml documentation file along with your bin folder) and the single WSDL file option.

I really like the philosophy of developing the contract for WCF services (schemas and WSDL) first, and then generating the implementation code (.NET data / message / service contracts & interfaces) from it. It's kind of the reverse of the "traditional" code-first approach.

The download is an add-in for VS2008 that automates a fair bit of this for you. Again, I haven't had a chance to actually use this in anger, and I'm interested if any of these sorts of capabilities will be built into WCF 4 / VS 2010, but I like the idea...

XSLT Profiler

It's been around for a while now, but I only just had a reason to use it. Essentially, it does what the name suggests: it profiles the performance of your XSLT and gives you a raft of information on where it's running slow. I used it to compare two different XSLT approaches that produce the same result, to determine which was faster!

Microsoft Architecture Journal

Sometimes a bit dry, but usually filled with interesting articles that are less technically-focused than MSDNMag (which is also brilliant).

Wednesday, November 3, 2010

Technical Pain Points

[Note: This post is based upon an old blog post that I'm migrating for reference purposes, so some of the content might be a bit out of date. Still, hopefully it might help someone sometime...]

Some recent technical pain points...

Visual Studio 2008 Web Application Projects & Profile Properties

To cut a long story short, VS2008 Web Application Projects don't natively support ASP.NET Profile Properties. In VS2008 Web Site Projects (which, in case you hadn't heard me ranting previously, I loathe), the "Profile" class is dynamically generated behind the scenes when the Profile is set up in the config file. However, the same doesn't occur in Web Application Projects. You can utilise this tool [] to enable support for building the Profile class as part of the build process. Think hard about using Profile Properties though: should these properties really be "first-class citizens" of your underlying database schema, rather than "tacked-on"?
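For reference, profile properties themselves are declared in web.config along these lines (the property names here are made up) - it's this section that Web Site Projects use to generate the Profile class:

```xml
<system.web>
  <profile>
    <properties>
      <add name="DisplayName" type="System.String" />
      <add name="PostCode" type="System.String" />
    </properties>
  </profile>
</system.web>
```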

Using .NET TransactionScope with SQL Server 2000

Another goodie. TransactionScope, introduced in .NET 2.0, makes transactional .NET code a breeze! It uses a lightweight transaction in most cases, until the transaction requires escalation to a distributed transaction. Unfortunately, one of the cases where it doesn't use a lightweight transaction is when you're working against a SQL Server 2000 database. Yes, even if you're only accessing a single database on the SQL Server 2000 instance, you still start out with a distributed transaction: which means MS DTC becomes involved, and must be suitably configured on both the web server and database server.
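Usage itself really is a breeze - a sketch (the connection string and SQL are placeholders); the sting is that against SQL Server 2000 even this single-connection scope is a distributed transaction from the outset:

```csharp
using System.Data.SqlClient;
using System.Transactions;

// The connection enlists in the ambient transaction automatically.
using (var scope = new TransactionScope())
using (var connection = new SqlConnection("Server=...;Database=...;Integrated Security=SSPI"))
{
    connection.Open(); // Against SQL 2000, this enlistment already involves MS DTC.

    using (var command = new SqlCommand(
        "UPDATE Orders SET Status = 'Shipped' WHERE OrderId = @id", connection))
    {
        command.Parameters.AddWithValue("@id", 42);
        command.ExecuteNonQuery();
    }

    scope.Complete(); // Without this call, the transaction rolls back on Dispose.
}
```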

MS DTC Authentication between a Computer in a Domain and a Computer not in a Domain

Following on from the previous item...

Of course, in our situation we were deploying to an environment where the database server was a member of the corporate domain, and the web server was in the DMZ and not a member of the domain. There are 3 options for authentication between DTCs: Mutual Authentication (preferred), Incoming Caller Authentication, and No Authentication. Because in our scenario there's no common point of reference for authentication between the DTCs (and no trust can be established), we had to go with No Authentication. We then had to do what we could to further secure DTC communication between the web and database servers through firewall rules restricting ports and IPs.

SQL Server Collation Differences between the Server and the Database

SQL Server has a default collation for the server instance. By default, when you create a new database, it uses this collation. However, it's also possible to specify a different collation for the database (it's also possible to specify a different collation again on a column-by-column basis inside tables within the database, but that's just an aside).

Normally this isn't a problem (other than it's nice to decide on a collation and stick with it unless you really need a different collation). However, when you have stored procedures or functions in your database that create and use temporary tables, a difference in collation between the database and the server instance can be a problem, particularly if you're trying to join between tables in your database and the temporary tables you've created. In this case, you'll get a collation mismatch error.

The workaround is to specify the COLLATE DATABASE_DEFAULT statement for collation-aware columns when creating temporary tables in tempdb. This will ensure that the temporary table column collation matches that used in the database. Then you'll only have a problem if for some reason you've used yet another collation for the specific columns in your database you're joining on... Yay for collation!
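A sketch of the workaround (table and column names are made up):

```sql
-- Temp tables are created in tempdb, which uses the server collation; without
-- the COLLATE clause, the join below can fail with a collation-conflict error
-- when the database collation differs from the server's.
CREATE TABLE #Candidates
(
    CustomerCode varchar(20) COLLATE DATABASE_DEFAULT NOT NULL
);

INSERT INTO #Candidates (CustomerCode) VALUES ('ACME01');

SELECT c.CustomerName
FROM dbo.Customers AS c
INNER JOIN #Candidates AS t
    ON c.CustomerCode = t.CustomerCode;

DROP TABLE #Candidates;
```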

Tuesday, November 2, 2010

Back online

Hmmm... So, you may have noticed I haven't posted in a while. In fact, you probably won't have noticed, because you don't exist, but anyway...

I've been busy...

For the last few months I've been working on a large app dev and integration project for a Defence client. The project has focused on the provision of a web-based application to manage design data related to the electrical system for products that are assembled by the client. The most challenging (and hence interesting) part of the project has been that the design for the electrical system actually comes from an international design authority, so the majority of design data within the application, as well as the design drawing files, are actually authored in another country and need to be integrated into our application. I won't go into the details of the business, communication, and technical challenges (which were many), but in the end our solution has been based on a combination of system and human processes supported by technologies including SQL Server (database engine, integration services, and transactional replication), Oracle (database engine), .NET (ASP.NET, WCF) and BizTalk. The solution is now nearing system and acceptance testing, and initial results and feedback have been very positive.

Right in the middle of that project, I had the opportunity to start a new role with Chamonix IT Consulting. I've just started with Chamonix this month, and my focus here will be on enterprise architecture, integration, and the Cloud. It's a very exciting opportunity for me, particularly given that Chamonix is really just getting started - I'm looking forward to helping create the culture of a new IT consulting business, working with some great people, and with a particular focus on the Cloud.

So, I hope that explains in part why I haven't posted in a while.

What you should see in the next few days and weeks are quite a number of posts I've had in my backlog, so hopefully I'll be as good as my word and you'll see them soon. There's (I think) some pretty cool content to come, including some experiences I've had with BizTalk, WCF, ASP.NET, and Entity Framework over the last few months. I'll be back soon!