Getting Dynamics CRM ObjectTypeCode values via SQL

Lots of tables in Dynamics CRM use the ObjectTypeCode to identify the entity. Sure, we all remember that Account is ObjectTypeCode 1 and Contact is ObjectTypeCode 2… But what about your custom entities, which start at 10,000 and are not guaranteed to be the same in different organizations, even if you have the same solutions installed — how will we remember their ObjectTypeCode values?

You don’t remember them; you query them when you need them. It’s quick and easy if you are on premises and have access to the SQL database. (If you don’t, ask your DBA to create a view with this query and give you rights to it.)

select coalesce(OriginalLocalizedName,name) as DisplayName, Name as SchemaName, ObjectTypeCode
from EntityLogicalView
order by ObjectTypeCode
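If you only care about one entity, the same view can be filtered by schema name. This is just a sketch using the same EntityLogicalView columns as above; the entity name in the WHERE clause is a hypothetical custom entity — substitute your own:

```sql
-- Look up the ObjectTypeCode for a single entity by its schema name
select coalesce(OriginalLocalizedName, Name) as DisplayName,
       Name as SchemaName,
       ObjectTypeCode
from EntityLogicalView
where Name = 'new_contract'  -- hypothetical custom entity schema name
```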

Much easier than memorizing, and avoids the problems of remembering the wrong number for the wrong organization or deployment. Oops!

How entity relationships can kill performance in Dynamics CRM 2011

Microsoft Dynamics CRM 2011 allows us to specify relationships between entities, and gives us some configuration as to how those relationships work. One of those relationship types is “parental” — which effectively means, “anything that happens to the parent happens to the child.” Therein lies a lot of power — and a lot of risk.

In the environment where I work, we have a lot of entities. Two of those entities are a “Contract” entity (not the out-of-the-box Contract entity, which we renamed to a more appropriate “Service Contract”) and a “Contract Volume” entity. Intuitively, since you can not have Contract Volume without a Contract, we set the relationship type to parental, with Contract being the parent to Contract Volume.

We have thousands of Contract records, and hundreds of thousands of Contract Volume records. The data for these comes from a separate proprietary application. Data is inserted/updated into CRM from this other system using Scribe Insight. The entities are read-only in CRM, so we don’t have to worry about bi-directional synchronization of data. All was well, and we moved data for a couple of weeks without issue.

In a recent release, we expanded the functionality of CRM and found the need to change the owner of Contract records to one of three different CRM teams. So, we did what seemed sensible: update the Scribe package to set the owner based on the criteria we came up with. Unfortunately, running this package took down our CRM system within a minute. The database server was overloaded, and SQL blocks were everywhere.

The problem was our relationship. In a parental relationship, any deleting, sharing, or assigning of the parent will affect all of its children. The effects are:

  • Deleting a parent will delete all child records.
  • Sharing or un-sharing a parent will perform an identical share or un-share on all child records.
  • Assigning (changing the owner of) a parent will assign (change the owner of) all child records.

Think about the potential impact of this. If a single record has 10,000 child records, and you assign this record to another user, CRM must perform the same operation against all 10,000 child records — as one operation. Add on top of that CRM’s built-in overhead (auditing, principal object access, logging, etc.), and you’ve effectively killed your system, which likely cannot keep up with the scope of your request while dealing with other people’s requests.

A better solution for us was to do the following:

  • Change the Parental relationship to a Referential relationship, where there is no cascading; a delete will simply remove the parent reference, leaving the child an orphan with no parent.
  • Create a cleanup job to delete all the orphaned child records. This can be done via a Scribe job, a recurring bulk delete job, or manually.
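The selection criteria for that cleanup job can be sketched as a query against the child entity’s base table. The entity and attribute names below are hypothetical — substitute the schema names of your own custom entities:

```sql
-- Find orphaned child records whose parent reference was cleared
-- when the parent was deleted (referential relationship, no cascade)
select cv.new_contractvolumeId
from new_contractvolumeBase cv
where cv.new_contractId is null
```

Feed the resulting IDs to whatever deletes them (Scribe, bulk delete, or by hand).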

Dynamics CRM has a lot of power, but you have to think about the effect of the choices made, because they very easily can cause unintended consequences — as they did for us recently!

First day without coffee… or not?

I’m going to make an announcement that may come as a shock to anyone who knows me…

I’ve voluntarily decided to give up coffee.

Those that know me will probably react like the first person who saw me walking around today with a cup of water instead of a cup of coffee: they laughed. I was a heavy coffee drinker, about 5 cups a day. I enjoyed my coffee. I blogged about the wonders of coffee. I identified myself with a cup of coffee. I bought coffee for co-workers, and they bought it for me. I went out for coffee breaks when the smokers went out for cigarette breaks. I looked forward to drinking coffee…

So why the change? Ultimately, it came down to health and stress. I drink too much coffee (caffeine) and not enough water, and I’ve been too stressed lately. Coffee (caffeine) is not helping either of those.

After writing that, I decided to look up the negative effects of coffee. While typing into Google, the instant search instead brought me to this page: 7 Reasons Why Coffee Is Good For You (Popular Science). Not what I was looking for. Perhaps a more balanced review is here: 10 Reasons Coffee is Both the Best and the Worst Beverage Ever Invented (io9). Ugh. Why can’t this be easy?

Coffee isn’t a magic elixir, and it isn’t a doomsday device. Like most things in life, moderation is key. When it comes to coffee, I’ve tipped the moderation scale so far that it fell over. Time to recalibrate. I’ll be off the sauce for a while, and we’ll see where it goes from there…

Getting rid of nameless Contacts when Leads are qualified in Dynamics CRM

Some would wonder why such a question is asked… but in my Microsoft Dynamics CRM 2011 environment, we do not require a First Name or Last Name on a Lead. When a Lead is qualified, users often click the buttons to create an Account and a Contact, even when they haven’t specified any contact-specific information (such as first and last name). What results is often a Contact with no name, and that contact is the Primary Contact for the Account. Ouch!

To fix this, we created a workflow on the Contact entity, which would deactivate the Contact if it had a Qualifying Lead and a blank First Name or Last Name. This took care of the Contact, but there was still a problem: the Account’s Primary Contact still pointed to the nameless Contact we had just deactivated!

Looking at the Audit History on the Account, we were able to figure out that, when qualifying a lead, CRM creates and updates records as follows:

  1. A new Account record is created. The Primary Contact field is left blank.
  2. A new Contact record is created. The Contact’s Account is set to the Account created in step #1.
  3. The Account record created in step #1 is updated. The Primary Contact field is set to the Contact created in step #2.

To solve the deactivated Primary Contact problem, we created a workflow on the Account entity, which would clear the Primary Contact field if the Primary Contact was set to a deactivated Contact.

Problem solved! Users can continue trying to create Contacts with no names, and we’ll make sure they never see them.
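If you want to audit how many of these already exist (say, before the workflows were in place), a sketch of a query against the on-premises filtered views would look like this — assuming the standard FilteredAccount and FilteredContact views, where statecode 1 means Inactive:

```sql
-- Find Accounts whose Primary Contact is a deactivated Contact
select a.name as AccountName, c.fullname as PrimaryContactName
from FilteredAccount a
join FilteredContact c on a.primarycontactid = c.contactid
where c.statecode = 1
```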

Batch file to compress and copy a folder

Recently, I needed to compress all contents of a folder (including subfolders) to an archive file, and copy it to a remote (network) location, all from a command line. In other words, I wanted to do this:

compressAndCopyFolder <sourceFolder> <destinationFolder> <archiveFileName>

Here’s how each parameter would work:

  • sourceFolder is the folder, along with all subfolders, to be added to the archive.
  • destinationFolder is the folder where the archive would be created.
  • archiveFileName is the file name of the archive

I prefer 7zip for archiving, so I could have done this simply using one command:

"c:\Program Files\7-Zip\7z.exe" a -r "%destinationFolder%\%archiveFileName%.7z" "%sourceFolder%\*"

The problem with this is that it is inefficient to work with a large archive file over the network — there’s a lot of chatter going on to add files to an archive file. Much faster to create the archive locally, then copy the final archive file to the network. So, instead of one command, we have two:

@echo off
if "%3"=="" (
	echo usage: 7zfolder ^<sourcefolder^> ^<destinationfolder^> ^<destinationfilename^>
	echo Note: The suffix .7z will be added at the end of destinationFilename.
	goto :eof
)
if not exist "%1" (
	echo ERROR: Source folder does not exist: %1
	goto :eof
)
if not exist "%2" (
	echo ERROR: Destination folder does not exist: %2
	goto :eof
)
if exist "%2\%3.7z" (
	echo ERROR: Destination file already exists in destination folder: %2\%3.7z
	goto :eof
)

"c:\Program Files\7-Zip\7z.exe" a -r "%temp%\%3.7z" "%1\*"
copy "%temp%\%3.7z" "%2\%3.7z" /z

Now, from a command line, I just do this…

7zfolder c:\mysource \\myserver\mydest archive

… and all files in c:\mysource will be added to an archive \\myserver\mydest\archive.7z.

Don’t you just love batch files?

Sync or synch as short for synchronize?

Today at work, someone wrote the following as a work item summary: “Synch production and test data”.

Reading this, I thought, “Which is correct – sync or synch – when shortening synchronize?”

The winner is sync, and here’s why:

  • We pronounce either variant as sink.
  • c alone is often pronounced k (cake, panic)
  • ch alone is often pronounced ch (church, match)
  • Sync = “sink”
  • Synch = “sinch”
  • One less keystroke for sync

I was happy with that until reading an article on The Language Lover’s Blog, “Sync or Synch?“, which made me conscious of other common variants, such as psych as short for psychology.

Ultimately, my search was put to bed when I read this comment:

you guys are such nerds!!! the world really doesnt care about the voiceles velar fricative converting to a velar plosive. blogging should be eliminated from the internet.

Guilty as charged…

The Runner’s Dilemma: A Story

Author’s Note: I am not a runner, and I probably never will be. My body isn’t made for running. But running is a good metaphor for journeys, and like many others, though I may not run, I still take many journeys in life, some with my body, some with my mind, and some with my heart. The below came to me this morning while driving to work, a journey in itself, as I reflected not on that journey, but on the greater journey of life, and how we get through it: one step at a time.

He had prepared for this for so long, so precisely; yet nothing was going as planned.

For years, he trained on a predictable track, its curves coming at regular intervals, its flatness something he didn’t have to think about. He trained on the perfect days, not too hot, not too cold, just the right amount of moisture in the air to stay refreshed without feeling oppressed. He was so prepared for the marathon. He knew it was coming, and he spent what seemed a lifetime getting ready for it.

Now the day of the marathon was here, and nothing was going according to plan. The course was changed at the last minute. The predictable curves were now unknown turns, the hills and valleys mysteries. The course was scattered with potholes, as if God himself dropped them randomly from the sky. The weather was oppressively hot, with a harsh wind blowing loose dirt and debris, a humidity that made the air itself heavy to run through and breathe.

He wanted to give up, yet he found himself standing there, at the starting line, with countless others. His mind told him that this is crazy, that it can’t be done. But when the gun went off to mark the start of the race, his body reacted, putting one foot in front, then the next, as he found his stride. His mind told him to stop, unconvincingly, as his body kept moving, perhaps out of muscle memory, perhaps just to keep up with the mass around him.

As he moved forward, he saw people around him, some struggling, some unfazed by the changing events. The weather shifted as the course shifted, each turn bringing a new unexpected challenge. His mind kept thinking, this is crazy, I can’t do this, yet his body kept moving.

This went on for some time, the debate between mind and body, until his body started complaining. It was getting to be too much, the fatigue was beginning to set in. His body was not prepared for what he was doing.

He was about to give up when he looked around and his mind realized something: I’ve already come so far. At that point, things changed. His mind found inspiration, and it took control of the body, telling it not to focus on the entirety of the journey, but to focus on the next step… and the next… and the next…

And the body obeyed. It put one foot in front, then the next, continuing its stride, as the team of mind and body continued on to the finish line.

Is there gender inequality in education (or anywhere else)?

I stumbled across this on the web today, courtesy of Mark Perry / Carpe Diem, and had to share it, because it does a good job of illustrating the perceived gender inequality in education.

57% of students in postsecondary education are men.

52% of students in gifted and talented education programs in 2009 were boys.

In 2009-10, men received 62% of all associate’s degrees, 57% of all bachelor’s degrees, 62% of all master’s degrees, and 53% of all doctorate degrees.

One would read that and likely say it is conclusive evidence that women are under-represented in education, and there is gender inequality against them.

However, that would be a lie. In copying the above from the source, I changed all references from women to men. The actual facts should read as follows:

57% of students in postsecondary education are women.

52% of students in gifted and talented education programs in 2009 were girls.

In 2009-10, women received 62% of all associate’s degrees, 57% of all bachelor’s degrees, 62% of all master’s degrees, and 53% of all doctorate degrees.

To show the same in a table:

                                          # of Men   # of Women
  Postsecondary Education                     10         13
  Gifted and Talented Education Programs      10         11
  Associate’s Degrees                         10         16
  Bachelor’s Degrees                          10         13
  Master’s Degrees                            10         16
  Doctorate Degrees                           10         11

Obviously, there are many reasons for this.

What does this mean? To me, it means this: men and women are interested in different things, have different goals, and make different choices in life. Men make choices such as military service (or crime) more than women; they choose to be firefighters (98% male) and engineers (89% male), whereas women choose to be teachers (about 85% female), social workers (81% female), and registered nurses (94% female).

So is there a “gender gap”? No more than there is a “pet acceptance gap.” Consider dog and cat acceptance in U.S. households, where there are 10% more cats than dogs. Why? Because cats and dogs are different, so people accept them differently, and in different numbers.

The bottom line: There are gender differences; the gaps are created by those differences.

Disclaimer: Obviously, there are specific instances where men and/or women are unfairly treated in education, or in certain professions, or in certain social circles. This commentary is not to focus on the individual experience, but on the experience as a whole. Though there may be injustice in life at times, it does not mean that all of life is unjust. And, of course, correlation does not imply causality.

Solved: ClickOnce publish failing with a valid certificate

For the past 22 months, we’ve been publishing using ClickOnce to the same path, using the same certificate, with no problems. This week, problems found us:

C:\Windows\Microsoft.NET\Framework\v3.5\Microsoft.Common.targets (3652,9): error MSB3482: An error occurred while signing: Failed to sign . SignTool Error: The signer’s certificate is not valid for signing. SignTool Error: An error occurred while attempting to sign:

The usual problem here is an expired certificate, but the certificate we’re using was created in December 2010 and expired in year 2039, so we knew expiration wasn’t a problem. The certificate file itself was also known to be valid; it was stored in our Subversion repository and hadn’t changed since the day it was created.

We called Microsoft (free support using our MSDN subscription). We spent about four hours working with them, creating test certificates, creating new certificates, trying to publish, with mixed and inconsistent results. We were told to use a new certificate because our current certificate was deemed corrupt. (When I asked what corrupt means, I was told it means the certificate file is bad. When I said that the MD5 hash of the certificate file was identical to the MD5 hash of the file created nearly two years ago and that it was stored in source control, I was told it was “bad” not corrupt. Semantics. Whatever.)

I seem to have fixed the problem this morning. It wasn’t the certificate. It wasn’t the path. Frankly, I don’t know what it was. I only know what steps I took before the point at which everything started publishing normally, as it had for the past 22 months.

Here’s what I did: I verified that every project in the solution had the same target platform (x86), the same .NET Framework version (3.5), and correct assembly information (some had default copyright information, and some had autogenerated version numbers where we expected specific ones). After doing that and rebuilding, the ClickOnce publish worked as it did before, with the same old certificate, and clients behaved as expected (upgrading according to ClickOnce settings).

It makes no sense, because none of those settings were changed from our last good ClickOnce publish. Either way, it fixed the problem. Hopefully, it’ll help fix yours, if and when this annoying problem comes up.

Finding out which tables in a SQL database have no rows

How many times have you looked at a production database and found a table that has no rows in it? Then, when questioning the developers, you are told, “Oh, that table was never used.” It happens to me often enough that I wanted to come up with a way to easily find all the tables in a SQL database that have no rows.

One line of SQL (which actually runs the embedded statement once for each table in the database) is all you need:

exec sp_MSforeachtable 'declare @count int; select @count = count(*) from ?; if @count = 0 print ''?'';'

Run that, and the output will list the schema and table name of every table in the current database that has no rows.
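If you’d rather avoid the undocumented sp_MSforeachtable procedure, a sketch of an alternative uses the catalog views (available in SQL Server 2005 and later) to get the same list from metadata, without counting each table:

```sql
-- List every table whose total row count is zero, using catalog metadata
select s.name as SchemaName, t.name as TableName
from sys.tables t
join sys.schemas s on s.schema_id = t.schema_id
join sys.partitions p on p.object_id = t.object_id
                     and p.index_id in (0, 1)  -- heap (0) or clustered index (1)
group by s.name, t.name
having sum(p.rows) = 0
order by s.name, t.name
```

Note that the row counts in sys.partitions are maintained as metadata and can occasionally drift slightly, so treat this as a fast first pass rather than a guarantee.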