Friday, June 29, 2007

Promoting Context Properties

Sometimes a context property on a message in BizTalk is not promoted by default. If you want to route the message based on this property (say in a send port), then follow these steps:


1) In your orchestration, create a Correlation Type and a Correlation Set based on the property you want to route on.

2) On the Send shape that sends the particular message you want to route, set the Correlation Set as initialising. This is done through Orchestration Designer.


I found step one via this Addictive News post. Step two is there for clarification.

There is also a custom pipeline component available that allows a bit more dynamism.
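The core of such a component is a call to Promote on the message context. A minimal sketch of the Execute method (the property name and namespace below are hypothetical; use the ones from your own property schema):

```csharp
// Inside a custom pipeline component (IComponent.Execute),
// using the Microsoft.BizTalk.Message.Interop types.
public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
{
    pInMsg.Context.Promote(
        "MyProperty",                            // property name (hypothetical)
        "https://MySchemas.MyPropertySchema",    // property schema namespace (hypothetical)
        "the value to route on");
    return pInMsg;
}
```

Once promoted this way, the property is available to send port filters just like any natively promoted property.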

Thursday, June 28, 2007

Dropping Temp Tables in SQL Server

A typical task in some stored procedures is to check whether a table exists and, if so, do something with it, such as dropping it.

If it is a temp table, people are sometimes unsure how to go about this.

Here is some simple code to accomplish the task:


IF OBJECT_ID('tempdb..#tempT') IS NOT NULL
BEGIN
    DROP TABLE #tempT
END


Remember that all the temp tables you create actually get created in the tempdb database, which is a system database (not the database you are using!). So replace #tempT with the name of your temp table, but leave the tempdb in there.

Friday, June 22, 2007

SSIS Package Automation (DTutil)

I was putting together a deploy script for a project I have been working on that includes a series of SSIS packages developed using Business Intelligence Development Studio. After a bit of googling I discovered that the way to automate the deployment of these was with a command line utility called dtutil.

You can move SSIS packages from any sort of store to another. Here is a basic example that copies a package from a file location (perhaps your deploy folder?) to the default SQL Server instance on the machine you run the command from.

Note that the /Decrypt option is for password encrypted packages built using the technique described in a previous post.


dtutil /FILE MyPackage.dtsx /COPY SQL;MyPackage /Decrypt myEncryptionPassword
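dtutil can also check for and remove an existing package, which is handy for making a deploy script re-runnable. Something along these lines (the package name is made up):

```shell
rem does the package already exist in the default SQL Server instance?
dtutil /SQL MyPackage /EXISTS

rem delete it before re-deploying
dtutil /SQL MyPackage /DELETE
```

Check dtutil's exit code after /EXISTS to decide whether the delete step is needed.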

Automating SQL Scripts via Command Line (sqlcmd)

A typical deployment task in any systems integration project usually involves either deploying a SQL database or making some modifications to an existing one.

Prior to SQL Server 2005, we could use osql to automate this task. We still can, but SQL Server 2005 brings us a command line utility with a bit more grunt that is better suited to the task.

This is sqlcmd. Follow that link for the msdn article about it.

It lets us do anything we can do in T-SQL from the command line. All you need to do is put your SQL into a script file, and then you can run the script with something like this:


sqlcmd -S localhost -i d:\deploy\DoTasks.sql -U myusername -P mypassword


Note that the switches are case sensitive. You don't need to specify a user name and password; if you don't, sqlcmd will just use your current security context.
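For example, to run the same script under your current Windows credentials, use the -E (trusted connection) switch instead of -U and -P:

```shell
sqlcmd -S localhost -E -i d:\deploy\DoTasks.sql
```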

Wednesday, June 20, 2007

BizTalk Scripts

There is a fantastic resource for various vb scripts for BizTalk admin and development tasks that can be found at gotdotnet.

Some of the scripts could be very handy for prepping a prod/preprod/test server for deployment.

Tuesday, June 19, 2007

BizTalk MsgBox Log File

Attempting to master BizTalk means having to understand how a lot of different technologies work.

Having a strong knowledge in SQL Server is something that time and again comes in very handy. Whilst I might be able to whip up a decent stored procedure, or set up a decent batch job, when it comes to the more complicated DBA type work I am still very much a learner.

I recently left a solution deployed to the test environment running over the weekend to see how it would cope over a decent period of time. When I got in on Monday there were over 60,000 service instances in the dehydrated or ready-to-run state. This was not what I wanted to see, and definitely not in the expected or acceptable results ball park.

It turns out the cause was that there was no hard drive space left, which meant log files could not be written to, intermediate XML messages couldn't be created, and so on.

I thought perhaps the file output of the orchestrations had simply written too much data to the poor 20 GB hard drive in use. This turned out to be a false assumption, as the culprit was the BizTalk Message Box database's log file. It had expanded to 12 GB.

So in SQL Management Studio you can right-click on the BizTalkMsgBoxDb database and go to Tasks -> Shrink -> Files and select the log file. You'll then be able to resize the log file.



Just as important is the right-click Properties -> Files dialog. This lets you set auto-growth options for the log file.
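The same thing can be done in T-SQL. A sketch, assuming the default logical log file name BizTalkMsgBoxDb_log (check yours with sp_helpfile first), and 1 GB / 4 GB as made-up target sizes:

```sql
USE BizTalkMsgBoxDb
GO

-- shrink the log file down to roughly 1 GB
DBCC SHRINKFILE (BizTalkMsgBoxDb_log, 1024)
GO

-- cap the log at 4 GB and grow it in 10% increments
ALTER DATABASE BizTalkMsgBoxDb
MODIFY FILE (NAME = BizTalkMsgBoxDb_log, MAXSIZE = 4096MB, FILEGROWTH = 10%)
GO
```

Capping the log at least stops a runaway weekend from filling the whole drive.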


Friday, June 15, 2007

A Simple Yet Deceptive Trap

This is something that caught me today. It seems like a pretty simple thing, but had me frustrated for about fifteen minutes.

When compiling a BizTalk project with orchestrations enclosed you may occasionally get a "cannot find symbol" or "invalid reference" message. This seems like a simple enough problem: just reload your references and things should be fine, right?

Well, occasionally we make silly mistakes. In this instance it turns out that I did not put double quotes around the value property of an orchestration variable. The Initial Value property of an orchestration variable is a live property, in that it accepts references to objects and other variables. So if you want just a string literal, you need to explicitly make Visual Studio aware that what you are entering is a string (quotation marks).
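In other words, in the Properties window (the variable name here is made up):

```
Initial Value: MyString      <- the compiler looks for a symbol named MyString
Initial Value: "MyString"    <- a plain string literal
```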

Of course, it may be better to design your solution so that the initial value is pulled from somewhere else, perhaps an XML file or a database, instead of hardcoding it inside your orchestration!

Thursday, June 14, 2007

Tech Ed 2006 Retrospective, Windows Server 2008

It was still called longhorn back then, but essentially the same product.

So Tech Ed 2007 has just been and gone, which I sadly missed, but I'd like to share a little bit of belated news from last year's event.

One of the tidbits that the evangelists and gurus from Microsoft told us in Sydney was that apparently everything you can do in a GUI you will be able to do from the command line.

Take a moment to let that sink in.

Coming from a systems integration perspective, that is pretty great news. Some of the more recent projects I have been working on require some fairly complex deploy operations: SQL Server databases, DTS packages, various BizTalk artifacts, .NET assemblies to be GACed, and lots of security settings. This has the potential to simplify things greatly.

If anyone has any thoughts on this please leave a comment.

SSIS DTS Package Security Woes and Fixes

In a previous post I talked about how to get security settings to persist across servers in DTS packages made using Business Intelligence Development Studio for SSIS.

Well, it seems that sometimes, no matter how many rebuilds and how much hard coding of connection strings into the package you do, things just don't go as planned. The dreaded "AcquireConnection" message comes up and your package fails to execute.

There is hope though. From a maintenance perspective this may in fact be a better way of doing things, assuming access to SSIS is limited to the correct server team guys and gals.

In the "Run Package" dialog box you can click on the "Connection Managers" tab. This lists the various connections in your package. You can override the settings for these connections here.



In a future post I will let you know how this integrates with the SQL Server Agent jobs.

Wednesday, June 13, 2007

MMC cannot open the file ...

From time to time you may see an error message similar to this when trying to open any MMC snap-in:


Microsoft has this article about it. The solution suggested did not seem to work for me.

I was able to get around this problem by opening mmc directly (it's located within the scope of the path variable, so you can just go Run -> mmc). You can then add the appropriate snap-in.

Wednesday, June 06, 2007

Some Days

Some days are just not as productive as others!

I don't know if it was the cold weather (it got below 10 C today!), the foreboding morning traffic jam or the distinct lack of satisfaction from my morning corn flakes, but when I logged on I somehow knew something was amiss.

I immediately dived into some experimental rules engine stuff (which I will hopefully tell you all about in a future blog post) and thought I had shaken the willies. I was wrong. Not long after deployment I noticed something awry. My receive ports weren't receiving and my send ports weren't sending.

Somehow every folder on my local hard drive got the read only tag.

To add to my woes I could not log on locally to my SQL Server instance. Bizarrely, when I used mstsc to log onto a remote server with SQL Management Studio and connected back the other way, I could use my localhost instance fine.

So after implementing many well intentioned hot fixes and workarounds I am still stuck. A full system scan using the office standard Symantec Antivirus is in progress and my machine is at the mercy of the desktop support team.

I am expecting great things from tomorrow :)

Tuesday, June 05, 2007

BizTalk Large Posters from Microsoft

Microsoft have provided some great BizTalk posters to print out. The BizTalk Server Team Blog has the low down.

Free Download Manager

I have been using the Free Download Manager for about a month now. I have not really used download managers of any sort for quite a few years, probably GetRight was the last one I used.

I have found FDM quite unobtrusive in my workspace, but also very stable, and it pretty much does what a download manager should do. I'd recommend it.

Friday, June 01, 2007

Google Gears

Slashdot is reporting that Google has released a new technology called Google Gears. I headed over to check it out...

So currently it is released as a beta. The product boasts that it can:

1) Store and serve application resources locally.
2) Store data locally in a fully-searchable relational database.
3) Run asynchronous JavaScript to improve application responsiveness.

So it is touted as a "browser extension" that allows web sites a developer creates to be viewed offline.

So the database technology is actually just SQLite. The "Local Server" is seemingly the key piece of software that allows you to view your web sites offline (i.e. without network connectivity). This is potentially quite handy, even perhaps for security sensitive apps.
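To get a feel for the database piece, the Gears samples use an API along these lines from page JavaScript (this assumes gears_init.js is included on the page; the table name and values are made up):

```javascript
var db = google.gears.factory.create('beta.database', '1.0');
db.open('notes-store');
db.execute('create table if not exists Notes (Body text, Created int)');
db.execute('insert into Notes values (?, ?)', ['offline note', new Date().getTime()]);

var rs = db.execute('select Body from Notes');
while (rs.isValidRow()) {
  // rs.field(0) holds the Body column for the current row
  rs.next();
}
rs.close();
```

Everything here runs locally in the browser, which is the whole point: the data survives without network connectivity.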

I intend to investigate this a little more and will make another post after some decent hands-on exposure. Google rarely release something to the market that doesn't have broad innovative goals in mind. This one could easily be dismissed as unnecessary, but I have a feeling that after a bit of a play this might be something potentially powerful.

DTS Packages (SSIS) Running as Scheduled Jobs

A lot of people, including myself, have been confounded by some mildly complex security settings in relation to DTS tasks.

Having deployed a DTS package to SQL Server 2005 Integration Services and then run the package from SSIS, I received no errors. The package ran as expected. I needed to run this package every ten minutes, so I set up a single-step job via SQL Server Agent using SQL Server Management Studio. When I attempted to run the job there was no joy, simply a failed job.

I investigated the history logs as suggested by the very vague error message provided. You can do this by right-clicking on the job and selecting "View History". I discovered that the job was attempting to run as a domain service account. It turns out that when you run a job, it runs under the SQL Server Agent service context and not your user context. Running a package manually uses whatever user you logged on as.

So a bad way of fixing this problem would be to simply restart the SQL Server Agent Windows service using the relevant user credentials (the credentials used to deploy the package to SSIS). We obviously don't want to do this.

So we must create a Credential and a Proxy, and then set up the particular step of the job to run as the Proxy rather than the SQL Server Agent account. Details can be found here.
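For reference, the same setup can be scripted in T-SQL. A sketch with hypothetical names; the identity should be the Windows account that can successfully run the package:

```sql
-- a credential wrapping the Windows account
CREATE CREDENTIAL SsisRunCredential
WITH IDENTITY = 'MYDOMAIN\ssisUser', SECRET = 'thePassword'
GO

-- a SQL Server Agent proxy over that credential
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SsisPackageProxy',
    @credential_name = N'SsisRunCredential',
    @enabled = 1
GO

-- allow the proxy to run SSIS package job steps
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisPackageProxy',
    @subsystem_name = N'SSIS'
GO
```

Once the proxy exists, it appears in the "Run as" dropdown on the job step.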

These settings took me a bit of experimenting before I understood what was going on.