Monday, December 10, 2012

"Clever" uninstall of msi packages/applications using PowerShell

When I created my own PowerShell script library for BizTalk deployment automation I ran across the need to uninstall applications, both BizTalk applications and non-BizTalk ones, by only knowing their name.

At first, I solved it using a WMI query like so:
$name = "application name"
$product = Get-WmiObject -Class Win32_Product -Filter "name='$name'" -ComputerName "localhost"
[void]$product.Uninstall()

This is not a recommended way of doing it though. It is very slow, and it also spams the Windows event log, since the WMI query actually triggers a reconfiguration of ALL installed applications. That reconfiguration can cause other issues in some cases as well.

I then created another, more failsafe and secure, way of uninstalling applications by using msiexec with the uninstall flag. The tricky part was to get hold of the product code, since msiexec requires it in order to uninstall an application. The result can be found below.

The script function takes the application name as its argument. It then looks up the application settings via the registry (note that the path below is the Wow6432Node branch used on x64 systems, so change the path if you are running x86). If the application can be found by name (it should), we extract the product code, which is then used as a parameter to msiexec.

If the product code cannot be found (it happens), we instead try to read the uninstall string that is written when the application is installed. Windows runs this string when you choose to uninstall an application, so why don't we use it? If found, we extract the product code from it and make our msiexec call.

If all of this fails, we throw an exception to be caught by the calling part of the script.

This is the full script function:

Function Uninstall-Program([string]$name)
{
    $success = $false

    # Read installation information from the registry
    $registryLocation = Get-ChildItem "HKLM:\Software\Wow6432Node\Microsoft\Windows\CurrentVersion\Uninstall\"
    foreach ($registryItem in $registryLocation)
    {
        # If we get a match on the application name
        if ((Get-itemproperty $registryItem.PSPath).DisplayName -eq $name)
        {
            # Get the product code if possible
            $productCode = (Get-itemproperty $registryItem.PSPath).ProductCode
           
            # If a product code is available, uninstall using it
            if ([string]::IsNullOrEmpty($productCode) -eq $false)
            {
                Write-Host "Uninstalling $name, ProductCode:$code"
           
                $args="/uninstall $code"

                [diagnostics.process]::start("msiexec", $args).WaitForExit()
               
                $success = $true
            }
            # If there is no product code, try to read the uninstall string
            else
            {
                $uninstallString = (Get-itemproperty $registryItem.PSPath).UninstallString
               
                if ([string]::IsNullOrEmpty($uninstallString) -eq $false)
                {
                    # Grab the product code and create an argument string
                    $match = [RegEx]::Match($uninstallString, "{.*?}")
                    $msiArgs = "/x $($match.Value) /qb"

                    [Diagnostics.Process]::Start("msiexec", $msiArgs).WaitForExit()
                   
                    $success = $true
                }
                else { throw "Unable to uninstall $name" }
            }
        }
    }
   
    if ($success -eq $false)
    { throw "Unable to find application $name" }
}
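
Calling the function is then simply a matter of passing in the display name of the application to uninstall, wrapped in a try/catch to handle the exceptions thrown above (the application name below is just an example):

try
{
    Uninstall-Program "My BizTalk Application"
}
catch
{
    Write-Host "Uninstall failed: $_"
}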

Sunday, December 9, 2012

Running an Integration Competence Center for small businesses

A while ago I saw a post from an old colleague on LinkedIn where he linked to an old article called Too small for an Integration Competency Center? Think again. I was intrigued to read the article since I am in the middle of establishing an ICC at the organization I'm at, and we are quite small in that regard.

The four building blocks involved are:
  • Adopt common integration practices
  • Standardize on a common integration platform
  • Create a decentralized group that specializes in integration
  • Form a centralized ICC
This has all been done at the organization I am at, with the experience I have from larger organizations as a basis for the setup. In my case, with an IT department of a whopping four people, forming an ICC is no big challenge in that regard.

I have opted to include external parties in the ICC to form a team that is not bound to a strict organizational hierarchy, but rather be agile and flexible to the demands from the organization at stake.

The basis of the ICC is my role as integration architect/supervisor in the IT department. Coupled to this, I have included an external integration architect to act as a sounding board. Most, if not all, integration development work is carried out by consultants, whom the external integration architect is responsible for coordinating. As needed, I also include internal resources in the ICC to help shape it and drive the integration development forward. So basically, the ICC is in its simplest form just two persons, with several others coming and going as needed.

So far, this is working very well. The agility of the ICC is a strong factor to consider. In many organizations, people are included full time even if they are not needed full time. Since the external architect is contracted on an as-needed basis, costs can be kept down. Of course, that puts a lot of the burden on me.

However, I have strived to create a simple and effective process for the entire integration life cycle. Document templates are available for everything from requesting an integration (done by the organization) to detailed specifications and testing of the completed development. The development guidelines and delivery specifications are to be as standardized as possible, so that nothing is bound to a single person or organization (except our own). I aim to be able to replace both the consultants involved and myself without having to explain anything about the setup and behavior of the ICC and the parties involved.

As for the four building blocks above, I feel that (almost) all of them can be ticked off in our case. The platform is BizTalk, best practices are established through documentation and processes based on experience from me and the consultants, a decentralized group is formed, and the ICC is formed and handles most of the integration needs. What is left is the funding, which is still divided and handled by separate organizational units rather than taken care of completely by a central organization. This is currently the biggest issue to handle since it hampers a true value-for-money approach to integration.

But to summarize, in accordance with the article linked at the beginning of this post, no organization is too small for an ICC to be formed. By including resources on an as-needed basis, as well as external consultants where needed, it can be handled in a very effective way. The important part is to have a clear process for the entire life cycle as well as clear boundaries between the parties involved. My most important role in the ICC is to coordinate the tasks between the resources, and as long as that is done properly, everything flows smoothly. Having centralized funding of the work is not necessary, but very (very!) preferable.

Sunday, December 2, 2012

Simple dos and don'ts for versioning webservice interfaces (and similar integration interfaces)

Earlier this week I went to a user group gathering for the ERP system we utilize at work. The topic for the day was "integration". At the end of the day, during an open Q and A session, I approached the ERP representatives with questions on how they worked with the web service API exposed by the ERP system. I asked them whether the current web services were guaranteed to keep working without changes, and without the need for deep regression testing, when upgrading the system. I also asked how new functionality in existing services would be implemented, from an integration point of view.

The answer was that current implementations would require normal regression testing after upgrading, but that there could be changes in the backend making the services work a bit differently in the new versions, depending on the changes made in the business logic and GUI.

This made me a bit queasy and fidgety. In the best of worlds, I want a web service to be seen as a hard written contract between two parties, in this case the ERP system on one side and the integration platform on the other. By exposing a service, you should guarantee that the specification for the service is not to be changed and that the underlying logic is set in stone as well. That seemed not to be the case.

The discussion then evolved into the realm of versioning and planned changes of existing services. As of now, new functionality is added to the service without changing its name, version or namespace. Fields are simply added as optional.

This method is something I have several issues with, which I also expressed at the session (maybe a bit too harshly, but it is an important subject to me). First off, by adding new functionality as optional fields, you break one of the (many) benefits of web service contracts: their strong typing! By doing it this way, the developers make future maintenance of the services harder, both for themselves and for those consuming the services. Also, the clients will have to upgrade their integrations whether they want to or not as soon as the ERP system is upgraded and the services change.

I instead proposed that they should version their services and operations. When changing an operation, create a new version of it and, for as long as possible, keep the previous version(s) functional. If a service operation is to be removed from the system, mark it as obsolete and let the clients know which version of the ERP system it will be removed in, and preferably how to use other services and operations to achieve the needed functionality. I myself think this is a very obvious way of handling the dilemma. I have come to understand that versioning is very rarely used though, so I do my best to preach the statements above.
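
To make it concrete (the names below are made up for illustration): an operation GetCustomerOrders published in the namespace http://erp.example.com/sales/v1 is left untouched when new fields are needed. Instead, a new GetCustomerOrders is published under http://erp.example.com/sales/v2, the v1 operation keeps behaving exactly as before, and the documentation states in which ERP version v1 will be retired and which v2 operation replaces it.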

So please, make sure that you use some form of versioning strategy when building your interfaces. It will make my (and many others, I guarantee) life a lot easier.

Tuesday, November 6, 2012

BizTalk Server 2013 (formerly BizTalk Server 2010R2) is in beta

Late last night I saw some tweets about the new version of BizTalk getting to beta status. A quick look, and sure enough, BizTalk 2010R2 is renamed to BizTalk 2013 and a beta version is available for download.

I must say that this release is very intriguing. Among the more interesting new features for me are the REST capabilities, the new SharePoint adapter and the ESB Toolkit integration (especially since I loathe the way the first version had to be set up and handled).

The new dependency function for checking how artifacts depend on each other is also a nice addition.

Go check it out and hope for a quick release of the final product!

Thursday, September 6, 2012

The BizTalk File adapter, least permissions required for a Receive Location

Every now and then a question pops up regarding the BizTalk FILE adapter's permissions. If a Receive Location is set up to poll a folder where the account used (by default the BizTalk host account) does not have appropriate permissions, the following error messages will be written to the event log.
File transport does not have read/write privileges for receive location "D:\BizTalk\In\".
The Messaging Engine failed to add a receive location "Receive Location1" with URL "D:\BizTalk\In\*.xml" to the adapter "FILE". Reason: "File transport does not have read/write privileges for receive location "D:\BizTalk\In\". ".
The receive location "Receive Location1" with URL "D:\BizTalk\In\*.xml" is shutting down. Details:"The Messaging Engine failed while notifying an adapter of its configuration. ".
Usually, the solution is to just give the BizTalk host account full permissions to the folder and be done with it.

I myself try to never give full permissions to any account unless it is really needed. It is however more common than not to see applications and users all using the SA login to SQL Server, being local (or domain) admins, and being made members of all of the groups used in BizTalk. *shudder*

Anyhow, for the FILE adapter, the first thing a user usually tries, if not going the "full permission route", is to give the account read and write permissions like so:

[Screenshot: the folder's Security tab with Read and Write permissions granted to the BizTalk host account]

This will not work; it gives the same errors again. Neither will adding the "Modify" permission do any good. It is at this point that most users simply tick the last checkbox available, granting Full Control. And voila, it works!

What one rather should do is go to the Advanced permission settings and add the "Delete subfolders and files" permission, which in reality is what is missing. BizTalk has been able to read the file properly, but was unable to delete it, resulting in the error.

[Screenshot: Advanced Security Settings with "Delete subfolders and files" allowed, but "Delete" left unchecked]
Notice that the "Delete" permission is not set. Basically, the account used for reading the files in the folder most likely should NOT have the permission to delete the folder itself, but just the contents inside of it.

Be careful though! While experimenting, I noticed that if I had the correct permissions according to the screenshot above, but removed for example the "Write extended attributes" permission, the Receive Location would NOT read the files in the folder, nor would it give ANY kind of error or warning. Everything will in that case look perfectly fine, with the Receive Location up and running, but no files will be picked up and processed!

Wednesday, August 29, 2012

Using PowerShell to remove all old files in a folder structure

A common task when working with BizTalk installations is to clean out old data from folder structures, be it old log files, automatically saved messages from a send port, database backups or any other sort of data that becomes obsolete after a certain period of time.

When I started working with BizTalk, the norm was to use bat files or VBScript. These can still be seen in many places since a) they work just fine, and b) the developers/administrators haven't learned a new way to script these kinds of things, mostly because of a).

I myself, as mentioned earlier, prefer to use PowerShell nowadays.

In order to recursively delete old files and folders, one line of PowerShell script is all that is needed:

Get-ChildItem -Path 'c:\temp\test' -Recurse | Where-Object { $_.LastWriteTime -le [System.DateTime]::Now.AddDays(-30) } | Remove-Item -Recurse -Force

This script simply traverses the specified folder recursively (why not change it to take the folder and the age as parameters, as sketched below). All files and folders older than 30 days, in this case, are then deleted. The -Force parameter makes sure that hidden and read-only files do not cause errors.
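
A minimal parameterized variant could look something like this (the script name, parameter names and defaults are my own examples):

param(
    [string]$Path = 'c:\temp\test',
    [int]$MaxAgeInDays = 30
)

# Delete everything under $Path that has not been written to for $MaxAgeInDays days
Get-ChildItem -Path $Path -Recurse |
    Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-$MaxAgeInDays) } |
    Remove-Item -Recurse -Force

Save it as, say, Remove-OldFiles.ps1 and pass the folder and the number of days to keep as arguments.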

In order to have this run automatically on a server, it needs some adjustments. Add a Scheduled Task, set the Program/script to run to PowerShell, and then add the arguments like so:

-noninteractive -command "& {Get-ChildItem -Path 'c:\temp\test' -Recurse | Where-Object { $_.LastWriteTime -le [System.DateTime]::Now.AddDays(-30) } | Remove-Item -Recurse -Force}"

Set the security options for the task to "Run whether user is logged on or not" and also check the "Run with highest privileges" checkbox.

Monday, August 27, 2012

Why I created my own PowerShell scripts for BizTalk deployment

Most of us working with BizTalk have at one point or another started scripting our deployments instead of doing them manually. I have over the years seen many variants of automating the deployment process. At one place, the "old" way of using bat files calling BTSTASK was used extensively. At another, a custom .NET program with a GUI was developed that called the different APIs available. I myself resorted to using PowerShell to create a fully automatic deployment routine.

My scripts have two parts: one with the different functions that I wrote, and one that ties the functions together and executes them in the correct order. As input I create an XML file that specifies which .msi packages are to be deployed, which applications to remove/update/add, settings for the IIS virtual directories and so on. Since it is quite common during complicated deployments that something goes wrong, I added functions to test run the scripts in advance. The test run simply checks for dependencies in the BizTalk configuration, making sure that the applications can be modified as configured. It also checks that all files needed are provided in the setup package.
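
Just to give an idea of the approach (the element and attribute names below are made up for illustration, not the exact format of my file), reading such a configuration in PowerShell is straightforward:

# deployment.xml (illustrative):
# <Deployment>
#   <Application Name="MyApp" Action="Update" MsiPackage="MyApp-1.2.0.msi" />
# </Deployment>

[xml]$config = Get-Content 'deployment.xml'

foreach ($application in $config.Deployment.Application)
{
    Write-Host "$($application.Action) $($application.Name) using $($application.MsiPackage)"
}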

When creating the scripts, I briefly looked at using the BizTalk Deployment Framework and other extensions to PowerShell, but opted to develop it all in my own scripts. This was due to several reasons, each quite strong in my case:
  • I would learn more about PowerShell as well as the APIs and tools found in BizTalk
  • I would know exactly where things could go wrong and what to do about it.
  • It allowed me greater flexibility since nothing was hidden in DLL files or required installation. As long as PowerShell is installed on the servers, the scripts will work; nothing else is needed. This is also something the administrators liked. Some of them are a bit wary of installing "some stuff I found on the net". 
This post is the theoretical background to posts to follow that will showcase and explain my scripts in individual blocks. I have already started with the post Recycle IIS application pools using PowerShell, and more snippets are to come.

Wednesday, August 15, 2012

BizTalk Map Viewer: Graphical interface for the BizTalk Map Documenter

Quite a while ago, during an assignment, we needed a way to display the logic in BizTalk maps for developers who didn't have BizTalk installed on their workstations. We quickly turned to the BizTalk Map Documenter, which simply is an XSLT file built to take a standard BizTalk map file (.btm) and output it in a human-readable format as an HTML page.

In order to use the XSLT file you need to execute it, preferably with the utility msxsl.exe, which takes the .btm file and the XSLT file as arguments and outputs the result to be viewed by the user. However, this meant a bit too much command line hacking in our case, so I took an evening in the hotel bar to quickly whip together a GUI for it, and thus the BizTalk Map Viewer was born.
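
For reference, the command line route looks roughly like this (the file names are just examples):

msxsl.exe MyMap.btm BizTalkMapDocumenter.xslt -o MyMap.html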



The application is very simple and in its basic form simply allows you to open a BizTalk map file, which is then transformed by the BizTalk Map Documenter XSLT file, and the result is shown in the application window. I added functionality to save, print and copy the result. I also added functionality to register the application as a right-click option for .btm files. If enabled, you get the option "Open with BizTalk Map Viewer" when right-clicking on BizTalk map files, which opens the selected file in the application. Just be sure to have placed the application executable in a suitable folder before registering it, since the current path will be used.

Feel free to download the code and executable.

Use it as you please.

Friday, August 3, 2012

A take on documenting integration platforms using Visio

Earlier this year I wrote a post about having Visio update a field on a shape automatically when editing any other field, making it possible to always know when the metadata was last changed. In the comments I was asked to share the Visio sheet, which it is now time to do. This post is the result of that request, in some way documenting my attempt to use Visio to document an integration platform. Hopefully you can pick up some ideas to use in your own documentation endeavours.


In almost all of my encounters with different integration platforms at various companies, there has been a high level view of the entire integration setup showing the various message flows. This map has usually been constructed in Visio. There have also been two types of layouts; let's call them the structured layout and the unstructured layout.

The following is an example of the unstructured layout. It's common and does a fairly good job of describing the full environment. It is often seen where there has not been a deliberate approach to working with integrations.
Most of the time these maps are filled with objects, with varying descriptions, showing the entire network. You can see that two systems interact, but you do not know in which way. It is also often hard to see how everything is connected if you, as in the example, have an integration hub in the middle doing some message routing.

Hence we get the structured layout as is seen below. In this case, there is a very solid structure with the integration platform in the middle, and the integrated systems on both sides with arrows showing each and every message flowing through.

These maps are good at showing the message flows since every message is entered, but they don't give you the overall picture that the unstructured layout gives you, since a lot of the fluff is stripped away. Usually there has also been an improvement over the unstructured layout in that each message flow has been given an identity, which then maps to more detailed documentation in a file archive or document portal.


So, earlier this year I did a test with a variant of the structured layout, seeing if I could document the current state of integrations at my current job. The map I was given at the start followed the unstructured layout. A lot of the information was out of date: systems had been removed, others added, versions changed, message flows altered and so on. I had to create an up to date picture of the integrations before I could start working with them.

I took a brand new Visio document and started documenting. I created a "system" shape that had the following shape data:
  • System name
  • System name 2
  • System version
  • Server name
  • Database
  • Database version
In my Swedish Visio it looks like this:

To make it easier to update and change these properties/metadata for all shapes, I changed the ShapeSheet so that the form data window opens when double-clicking the shape. You can do this by editing the ShapeSheet cell "EventDblClick" under "Events" to read "=DOCMD(1312)".

I then did the same thing for the arrow shape adding metadata that I want to know for each message:
  • ID
  • Object name
  • Status (i.e. Planned, Under development, In use, To be removed)
  • Transport (i.e. Web service, File, FTP, WCF...)
  • Type (i.e. Xml, Flat file...)
  • Frequency
  • Automatic (i.e. a drop down giving the option of Yes or No, indicating whether the message transfer is completely automatic)
  • Sync/Async
  • One way/Request-response
  • From system
  • To system
  • Comment

Many of the fields are defined in the form data editor as dropdown menus so as to make the data more consistent and easier to edit. I also added "From system" and "To system" in order to get good documentation automatically by exporting the form data to Excel. As with the System shapes, I added the double-click event to the shape.

I also altered the shape so that the choices made in the form data are visible on the shape. Depending on the "Status", the arrow will be yellow (planned), green (under development), black (in use) or red (to be removed). This is done by editing the Data Graphic and mapping each value to a specific color.

Also, setting the "Automatic" to "No" will show a little man besides the ID and Object name. This can be set in the Shape Sheet as well under "Text fields" and "Value". In my case, the formula is: =IF(STRSAME(Prop.Automatic,"No",TRUE)," 웃","")&" "&Prop.ID&". "&Prop.Objectname&" ". But is of course dependent on the name you give each property in the form data.

In the same manner, I have a setting that will change the arrows on the shape so that I get an indication if it is a one way transfer or a request-response one, by adding a little unfilled arrow in the other direction of the data flow.

I have also adjusted where the text appears on the connector according to this post from the Visio Guy, so that the text is always at the end of the line instead of in the middle.





So, is this a good way of documenting message flows?

Well, in this case, it has been very helpful. I have a single document that shows me quite a lot of the current setup and state. It removes the job of manually formatting shapes to indicate whether a message flow is planned, request-response, a manual job and so on. Instead, I just select the desired property values and let Visio format the shape accordingly, giving me a neat view of the state.

After documenting all message flows in this Visio document, I will most likely strip it down. I will remove a lot of the metadata from the connection shapes and keep it solely in the integration specification document that I will create for each message. I do not like having duplicate information in my documentation, and I feel that that sort of data belongs better in the written documentation than in the Visio sheet. I will still keep the color coding based on Status, as well as some other visual cues that are helpful when looking at the overall view of the integration platform.

Monday, July 9, 2012

Book review: (MCTS): Microsoft BizTalk Server 2010 (70-595) Certification Guide

I got the opportunity to review the book (MCTS): Microsoft BizTalk Server 2010 (70-595) Certification Guide published by Packt Publishing, a company that is pretty well renowned for their good collection of IT books. The book itself is written by Johan Hedberg, Kent Weare and Morten la Cour.

I received the book just before the weekend and spent the spare time over the next few days reading it through and checking out the exercises that accompany the book.

As the book states on the cover, and as the writers themselves add in the first chapter, this is a guide to help you study and prepare for the latest BizTalk exam, 70-595, TS: Developing Business Process and Integration Solutions by Using Microsoft BizTalk Server 2010, which I wrote about last year but still haven't taken due to lack of time. Since the prerequisite for the exam is at least two years of experience working with BizTalk solutions, the book aims to be a quick guide to many areas of BizTalk development, focusing on the parts you need to pass the test.

The book is structured according to the areas covered by the exam, to give you a better chance of focusing on the areas you lack expertise in as well as knowing the structure of the test. Chapters 1 through 8 cover the technical bits of BizTalk, and do a very good job of it. Each chapter ends with a few questions to make sure the vital parts of the chapter have stuck, as is common with IT books. These chapters reflect the major topics the exam is composed of, making it a breeze to know which areas you need to study more thoroughly.

When I started reading the book, I immediately got the feeling that the writers didn't quite stick to the assumption that the reader has more than two years of BizTalk development behind them and hence doesn't need to run through simple tasks learned during the first hours of a basic BizTalk course. Pretty much the entirety of chapter one is like this, and parts of the other chapters as well. But that is a minor gripe, since most of the book contains just the right amount of information, or maybe a tad too little. Just as Mikael Sand mentions in his review, there are parts of the book that tell you how to do things, but not why. I also have to add that there are parts that, while describing a satisfactory way to solve a problem, do not adhere to best practice. In other places, the writers push you towards complying with the established informal best practices in a very good way that is seldom seen in books of this type (SOA Patterns with BizTalk Server 2009 by Richard Seroter is one of the few exceptions).

After the technical chapters, you get chapters 9 and 10, which give you more hands-on preparation for the actual test. Chapter 9 runs through tips on how to study for the test as well as how the actual test is carried out. Chapter 10 is a collection of further practice questions that mimic the real ones.

Now, bear in mind that I have yet to take the 70-595 test. I have however taken the 2006 version (70-235), and can say that based on my knowledge from that test, as well as my experience of BizTalk development and architecture, this book covers the major parts very well. I believe a good way of studying for the test would be to read this book through, have a BizTalk environment available for labs and practical work, and of course a browser with MSDN/TechNet open for all that in-depth knowledge hunting that I think is a must. It's simply not enough to know how to do something; you should also know why you should do it that way, and how it will affect your platform and solution. The book does a pretty good job here by every now and then redirecting the reader to MSDN for more information.


All in all, this book actually surprised me as soon as I got past chapter 1, and I can recommend it as a good study guide to aid you (alongside the vast resources found online) in successfully completing the 70-595 exam.


Friday, June 29, 2012

How to move the LiteSpeedCentral database

We are running LiteSpeed at work in order to compress our backups in SQL Server. I was tasked with moving the LiteSpeedCentral database from one SQL Server to another. A simple task, one might think.

A quick search turned up a support ticket saying that you simply do a backup/restore of the database and a reconfiguration of the LiteSpeed client on each server. Simple enough; I did that.

The reconfiguration could have been easier, i.e. swap "old server" for "new server" instead of having to reconfigure everything, but sure enough, it worked.

I then got a lot of warnings on my old SQL Server telling me that some service on the LiteSpeed application server was trying to connect to the database but failing.


I did some digging to find out what was causing this. The warnings came at fifteen minute intervals, which narrowed it down to the SQL Server Agent job LiteSpeed for SQL Server Update Native Backup statistics, and more specifically the query exec [LiteSpeedLocal]..[LiteSpeed_ImportNativeHistory]. When I ran this manually it executed without errors, but I still got the login failures logged on the old SQL Server.

No matter what I did to reconfigure the LiteSpeed application, I could not get rid of this warning. Finally I managed to find a discussion on the Quest support forum that pointed me to the registry, where I found entries that were still pointing to the old Central Database Server. They were located under HKLM\SOFTWARE\Imceda\SQLLiteSpeed\Engine\ActivityServers\[Database]\. Deleting these entries solved my issue.
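
If you want to check for (and remove) the leftover entries with PowerShell rather than regedit, something like this should work (assuming the registry path above; verify what the keys contain before deleting anything):

# List the leftover ActivityServers entries
Get-ChildItem 'HKLM:\SOFTWARE\Imceda\SQLLiteSpeed\Engine\ActivityServers'

# Remove a specific entry once you have verified that it points to the old server
# (replace YourDatabase with the name of the database key)
Remove-Item 'HKLM:\SOFTWARE\Imceda\SQLLiteSpeed\Engine\ActivityServers\YourDatabase' -Recurse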

So seriously, Quest: is it so hard to have a properly documented procedure for moving the database? It should be a pretty common task for customers to have to do.

Wednesday, June 13, 2012

Script updates of contacts in Outlook

I wanted to have all of my colleagues' contact information on my iPhone, so I thought about the simplest way to handle it. I quickly decided that the best way would be to simply add the appropriate domain accounts as contacts in Outlook, which in a few seconds of work gave me about 240 new contacts with the correct information entered, just as they appear in our Active Directory.

I then realised that they all had the internal speed dial phone number entered as their work number. This is a four digit number which is the same as the last four digits of the externally used number. Hence I needed a simple way to automatically add a prefix to the number for each contact, but only for those that had four digits in the field, since my address book is filled with external contacts as well.
After fiddling briefly with macros, csv-files and other means to get this to work, I had an epiphany and thought of PowerShell.

Three lines of PowerShell code, written, tested and executed in a few minutes, solved my problem:

# Connect to Outlook and get the default Contacts folder (olFolderContacts = 10)
$outlook = New-Object -ComObject Outlook.Application
$contacts = $outlook.Session.GetDefaultFolder(10)

# Prefix the work number on every contact that only has the four digit internal number
$contacts.Items | % { if ($_.BusinessTelephoneNumber.Length -eq 4) { $_.BusinessTelephoneNumber = "123-45" + $_.BusinessTelephoneNumber; $_.Save() } }


Using PowerShell to automatically update lots and lots of data will most likely be the first thing I think of the next time a similar task appears. This was almost ridiculously easy.

Tuesday, February 28, 2012

Having Visio update shape data automatically

Part of my job is to keep up to date documentation of all the integrations we have. I am used to using Visio for this, and have at my previous clients kept it quite simple, with manual formatting to visualize whether a message is under construction, to be deleted, etc.

At the moment, I am trying out having "a lot" of documentation gathered in the Visio document. Besides having shapes for "Systems" and connections between these, each of which is equivalent to a "Message", I have added custom data fields that allow me to document metadata on each message flow. Part of the metadata I keep on a message is Id, Name, Trigger, Status, Transport type and so on. Some of these fields are then used to dynamically alter the formatting of the connection to give a visual cue to the underlying data. If the Status is "Planned" I give the connection a green color; if the connection is marked as "Request/Response" I visualize this with a symbol indicating so.

This all works fine and gives me a single document for the overview of the integrations. I then thought of implementing a change list so that I could follow up on revisions of the diagram. Having a normal text based change list on a separate sheet would be possible, but it would require manual work to update and would not be mandatory (meaning that it would be forgotten and hence useless). Instead, I opted for an extra custom data field on the shapes indicating the last time the shape was updated (data wise, not if it was moved). This is handled automatically by code that triggers when the shape data is updated.

I have one class module called "UpdateShapeTimeStamp":

' Hook into the Visio Application events
Dim WithEvents appObj As Visio.Application

' Fires whenever a cell changes in the document
Private Sub appObj_CellChanged(ByVal Cell As IVCell)
    ' Only touch shapes that have the custom "Updated" field
    If (Cell.Shape.CellExists("Prop.Updated", 0)) Then
        Cell.Shape.CellsU("Prop.Updated").FormulaU = Chr(34) & Now() & Chr(34)
    End If
End Sub


Private Sub Class_Initialize()
    Set appObj = Application
End Sub


Private Sub Class_Terminate()
    Set appObj = Nothing
End Sub


and the Document code to enable this code:

Dim UpdateShapeTimeStampClass As UpdateShapeTimeStamp

' Release the event hook when the document closes
Private Sub Document_BeforeDocumentClose(ByVal Doc As IVDocument)
    Set UpdateShapeTimeStampClass = Nothing
End Sub


' Create the event hook when the document opens
Private Sub Document_DocumentOpened(ByVal Doc As IVDocument)
    Set UpdateShapeTimeStampClass = New UpdateShapeTimeStamp
End Sub


Now when changing the shape data on any shape, the event will trigger and check if the custom field "Updated" is available (meaning that it is one of my custom shapes) and if so, set it to the current datetime.

This allows me to have an automatic tracking of all changes in the diagram for each shape. Neat!

Tuesday, January 24, 2012

Outlook 2010 macro to copy item links to the Windows clipboard

I have used OneNote for a few years to handle all my notes regarding different projects and all information I gather during work. It works very well, and I have during this time looked at how to use Outlook in a more advanced way. Of course, I want to tie the two products together.

In its basic form, it is possible to create OneNote note taking pages from an email or appointment in Outlook. This is nice, since I can simply bring up the context menu and choose "OneNote" to create a place for all my notes from a specific meeting (and also write down things to bring up before the meeting so that I am prepared). I have however often run into the issue that a lot of scheduled meetings are accompanied by separate emails containing relevant information. In some way I'd like to group these separate items together, preferably on the OneNote page that I have created for the meeting.

I looked at how OneNote creates a page for an email, which it does in the same way as for an appointment. The page brings with it a few pieces of information and also a link back to the email item in Outlook. This link is what I am interested in.

I looked into different ways of creating it. Most promising was a utility called Linker that can copy the internal ID of an Outlook item and put it in the clipboard. The flaw was that it prefixes the ID with "outlook:". This works natively for Outlook 2003. Outlook 2007 can handle it with a registry hack. Outlook 2010 will not handle it at all.

I at least got a bit closer to a solution.

I then found out that a proper prefix in Outlook 2010 is "onenote:outlook?folder=Contacts&entryid=". I looked into how to get the ID from the Outlook item via VBA code, and it proved quite easy. I then hacked together a small macro that grabs the ID from the currently selected item, creates the correct HTML link with the prefix above, and puts it in the clipboard. This did of course not work.

The clipboard is an intricate piece of Windows that can handle a lot of different data. Putting raw text into it is simple; putting an HTML link into it is a bit more difficult. I turned to Google and found a support article titled "How to add HTML code to the clipboard by using Visual Basic".

This helped me create the more intricate string used to identify HTML in Windows. Using this, I had it working.

After a while I noticed that special characters (å, ä and ö specifically, since I'm in Sweden) did not work and got encoded the wrong way. After trying different ways of encoding the string, I finally resorted to putting the HTML entity (for example &auml; for ä) in the string, and this worked! It is HTML I'm working with, after all. A quick Google search gave me this nice piece of code to encode special characters in a string, and now I have it all working flawlessly.

My workflow now is as follows:
An appointment is made in Outlook, by me or someone else. I right-click it and add a OneNote page with information regarding the meeting under a tab for the specific project. The emails that are of interest for the meeting are then linked on the OneNote page using my macro. I simply click the email in Outlook, hit my shortcut button to copy the link, and then paste it in OneNote. I have created the macro so that the link description is fetched from the subject of the email. Perfect!

I also noticed that the macro will create links to pretty much anything in Outlook: emails, appointments, contacts and so on. This way I can link all items together on one OneNote page so I don't have to browse or search in Outlook for whatever I need.

Here is the complete code for the macro if you are interested:
Private Declare Function CloseClipboard Lib "user32" () As Long
Private Declare Function OpenClipboard Lib "user32" (ByVal hWnd As Long) _
   As Long
Private Declare Function GlobalAlloc Lib "kernel32" ( _
   ByVal wFlags As Long, ByVal dwBytes As Long) As Long
Private Declare Function SetClipboardData Lib "user32" ( _
   ByVal wFormat As Long, ByVal hMem As Long) As Long
Private Declare Function EmptyClipboard Lib "user32" () As Long
Private Declare Function RegisterClipboardFormat Lib "user32" Alias _
   "RegisterClipboardFormatA" (ByVal lpString As String) As Long
Private Declare Function GlobalLock Lib "kernel32" (ByVal hMem As Long) _
   As Long
Private Declare Function GlobalUnlock Lib "kernel32" ( _
   ByVal hMem As Long) As Long
Private Declare Sub CopyMemory Lib "kernel32" Alias "RtlMoveMemory" ( _
   pDest As Any, pSource As Any, ByVal cbLength As Long)
Private Declare Function GetClipboardData Lib "user32" ( _
   ByVal wFormat As Long) As Long
Private Declare Function lstrlen Lib "kernel32" Alias "lstrlenA" ( _
   ByVal lpData As Long) As Long
Private Const m_sDescription = _
                  "Version:1.0" & vbCrLf & _
                  "StartHTML:aaaaaaaaaa" & vbCrLf & _
                  "EndHTML:bbbbbbbbbb" & vbCrLf & _
                  "StartFragment:cccccccccc" & vbCrLf & _
                  "EndFragment:dddddddddd" & vbCrLf
                 
Private m_cfHTMLClipFormat As Long


Function RegisterCF() As Long
   'Register the HTML clipboard format
   If (m_cfHTMLClipFormat = 0) Then
      m_cfHTMLClipFormat = RegisterClipboardFormat("HTML Format")
   End If
   RegisterCF = m_cfHTMLClipFormat
End Function


Public Sub PutHTMLClipboard(sHtmlFragment As String, _
   Optional sContextStart As String = "", _
   Optional sContextEnd As String = "")
  
   Dim sData As String
  
   If RegisterCF = 0 Then Exit Sub
  
   'Add the <!--StartFragment--> and <!--EndFragment--> markers around the fragment
   sContextStart = sContextStart & "<!--StartFragment-->"
   sContextEnd = "<!--EndFragment-->" & sContextEnd
  
   'Build the HTML given the description, the fragment and the context.
   'And, replace the offset place holders in the description with values
   'for the offsets of StartHMTL, EndHTML, StartFragment and EndFragment.
   sData = m_sDescription & sContextStart & sHtmlFragment & sContextEnd
   sData = Replace(sData, "aaaaaaaaaa", _
                   Format(Len(m_sDescription), "0000000000"))
   sData = Replace(sData, "bbbbbbbbbb", Format(Len(sData), "0000000000"))
   sData = Replace(sData, "cccccccccc", Format(Len(m_sDescription & _
                   sContextStart), "0000000000"))
   sData = Replace(sData, "dddddddddd", Format(Len(m_sDescription & _
                   sContextStart & sHtmlFragment), "0000000000"))
   'Add the HTML code to the clipboard
   If CBool(OpenClipboard(0)) Then
  
      Dim hMemHandle As Long, lpData As Long
     
      hMemHandle = GlobalAlloc(0, Len(sData) + 10)
     
      If CBool(hMemHandle) Then
         lpData = GlobalLock(hMemHandle)
         If lpData <> 0 Then
           
            CopyMemory ByVal lpData, ByVal sData, Len(sData)
            GlobalUnlock hMemHandle
            EmptyClipboard
            SetClipboardData m_cfHTMLClipFormat, hMemHandle
         End If
      End If
  
      Call CloseClipboard
   End If
End Sub


' Add the currently selected item to the clipboard as an HTML link
Sub AddOutlookItemAsClipboardLink()
    Dim linkString As String
    ' The link uses the onenote:outlook prefix with the item's EntryID, and the subject as the link text
    linkString = "<a href=""onenote:outlook?folder=Contacts&entryid={0}"">{1}</a>"
    linkString = Replace(linkString, "{0}", ActiveExplorer.Selection.Item(1).EntryID)
    linkString = Replace(linkString, "{1}", EncodeString(ActiveExplorer.Selection.Item(1).Subject()))
       
    PutHTMLClipboard (linkString)
End Sub


' Encodes special characters to their entity equivalent
Function EncodeString(ByVal strOriginal) As String
    Dim currChar, i, sOut, CharList
    CharList = "óáéíúÁÉÍÓÚ¡¢£¤¥¦§¨©ª«¬®¯°±²³´µ¶·¸¹º»¼½¾¿×÷ÀÂÃÄÅÆÇÈÊËÌÎÏÐÑÒÔÕÖØÙÛÜÝÞßàâãäåæçèêëìîïðñòôõöøùûüýþÿ"
    sOut = strOriginal
    For i = 1 To Len(CharList)
        currChar = Mid(CharList, i, 1)
        sOut = Replace(sOut, currChar, "&#x" & Hex(AscW(currChar)) & ";")
    Next
    EncodeString = sOut
End Function