Thursday, 9 December 2010

Stability issues with AsyncRemoveCompletedJobs

CRM 4.0 UR 3 introduced a useful feature: the ability to configure the CRM Asynchronous Service to automatically delete records from completed asynchronous operations, and hence keep the asyncoperationbase SQL table at a reasonable size. This behaviour is configured by the registry values AsyncRemoveCompletedJobs and AsyncRemoveCompletedWorkflows.

However, I recently encountered an issue with this behaviour, where the CRM Asynchronous Service appears to get into a state where all it does is delete completed jobs, to the exclusion of all other activity. This can leave the CRM Asynchronous Service effectively hung (not responding to service control requests, nor polling for new jobs to process), and it may not process any new jobs for a considerable period of time (in one environment, several hours).

The main symptoms are:

  • No jobs being processed for a considerable period of time
  • The CRM Asynchronous Service not responding to service control requests (i.e. you cannot stop it through the Services console, so you have to kill the process)
  • No values reported for most performance counters (e.g. 'Total Operations Outstanding', 'Threads in use')
  • If you do restart the service, you see a burst of activity (including performance counters) whilst outstanding jobs are processed, then it reverts to the same behaviour as above
  • If you look at the SQL requests submitted by the CRM Asynchronous Service (I use the SQL dynamic management views sys.dm_exec_requests and sys.dm_exec_sessions), you see just one DELETE request and no other SQL activity
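For reference, a query along the following lines can be used to inspect what the service is submitting to SQL Server. This is a sketch only; the program_name filter is an assumption and may need adjusting for your environment.

```sql
-- Sketch: list active SQL requests from the CRM Asynchronous Service.
-- The program_name filter is an assumption - adjust to match your environment.
SELECT s.session_id, s.program_name, r.command, r.status, r.wait_type, t.text
FROM sys.dm_exec_sessions s
JOIN sys.dm_exec_requests r ON r.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE s.program_name LIKE '%Async%'
```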

At the moment, the only workaround I have is to remove the registry values, and to use a scheduled SQL job to periodically clear out the asyncoperationbase table. Here is an example of such a script.
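As a sketch of what such a scheduled job might look like (the state and status codes are assumptions for completed system jobs - verify them, test against a backup first, and see KB 968520 for the Microsoft-supplied script):

```sql
-- Sketch: delete completed asynchronous operations in batches, to limit
-- transaction size and locking.
-- Assumed values: StateCode 3 = Completed; StatusCode 30 = Succeeded,
-- 32 = Canceled. Verify these before use.
declare @rows int
set @rows = 1
while @rows > 0
begin
    delete top (2000) from AsyncOperationBase
    where StateCode = 3 and StatusCode in (30, 32)
    set @rows = @@ROWCOUNT
end
```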

Wednesday, 17 November 2010

How to use impersonation in an ASP .Net page using IFD in CRM 4.0

This is a common requirement, and I've never found what I consider to be a suitable explanation of what needs to be done. This post is not intended to be exhaustive, but is intended to cover the essentials in one place.

The fundamental requirement is to create a custom ASP .Net page that is accessible both internally (via AD authentication) and over the Internet (via IFD authentication), and where the code will access CRM data under the context of the user accessing the page. To do this, you need to deploy and configure your code as follows:


  1. Deploy the ASP .Net page within the CRM Web Site (the only supported place is within the ISV directory). If you don't do this, then IFD authentication will not apply to your page
  2. Run the ASP .Net page within the CrmAppPool, and do not create an IIS application for it. If you don't do this, then you won't be able to identify the authenticated user
  3. Ensure that the CRM HttpModules MapOrg and CrmAuthentication are enabled. This will happen by default by inheritance of the settings from the root web.config file in the CRM web site, but I'm mentioning it here as there are some circumstances (when you don't need IFD) in which it is appropriate to disable these HttpModules. Again, if the HttpModules aren't enabled, then you won't be able to identify the authenticated user
  4. As your code is in a virtual directory (rather than a separate IIS application), ASP .Net will look for your assemblies in the [webroot]\bin folder, so that is where you should put them (or in the GAC). The initial release documentation for CRM 4.0 stated that it was unsupported to put files in [webroot]\bin folder of the CRM web site, but this restriction has been lifted

You also need to follow certain coding patterns within your code. An example of these can be found here. Note that 'CRM web services' below refers to both the CrmService and the MetadataService:

  1. Ensure you can identify the organisation name. The example code shows how to parse this from the Request.Url property, though I prefer to pass this on the querystring (which the example also supports)
  2. Use the CrmImpersonator class. All access to the Crm web services needs to be wrapped within the using (new CrmImpersonator()) block. If you don't do this you will probably get 401 errors, often when accessing the page internally via AD authentication (see later for a brief explanation)
  3. Use the ExtractCrmAuthenticationToken static method. This is necessary to get the context of the calling user (which is stored in the CallerId property)
  4. Use CredentialCache.DefaultCredentials to pass AD credentials to the Crm web services. If you don't do this, then you will probably get 401 errors as you'd be trying to access the web service anonymously (IIS would throw these 401 errors)
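Putting the points above together, the pattern looks broadly like this. This is a sketch only: the page class, the querystring handling and the service URL are illustrative, not prescriptive.

```csharp
// Sketch of the IFD-compatible impersonation pattern (illustrative only)
using System;
using System.Net;
using Microsoft.Crm.Sdk;
using Microsoft.Crm.SdkTypeProxy;

public partial class MyPage : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // 1. Identify the organisation - here assumed to be on the querystring
        string orgName = Request.QueryString["orgname"];

        // 2. CrmImpersonator reverts any IIS / ASP .Net impersonation
        //    for the duration of the using block
        using (new CrmImpersonator())
        {
            // 3. Get the calling user's context (sets CallerId)
            CrmAuthenticationToken token =
                CrmAuthenticationToken.ExtractCrmAuthenticationToken(Context, orgName);
            token.OrganizationName = orgName;

            CrmService service = new CrmService();
            service.Url = "http://localhost/mscrmservices/2007/crmservice.asmx";
            service.CrmAuthenticationTokenValue = token;

            // 4. Pass the execution account's AD credentials
            service.Credentials = CredentialCache.DefaultCredentials;

            // ... call the service here; it runs under the caller's CRM context
        }
    }
}
```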

That should be all that you need on the server side. The final piece of the puzzle is to ensure that you provide the correct Url when accessing the page, which again needs a little consideration:

When accessing the page from an internal address, the Url should be of the form:
http://[server]/[orgname]/ISV/MyFolder/MyPage.aspx

When accessing the page from an external address, the Url should be of the form:
http://[orgname].[serverFQDN]/ISV/MyFolder/MyPage.aspx

This is relatively easy to achieve when opening the page from within CRM (i.e. in an IFrame, via an ISV.config button or in client script). In each case you can use the PrependOrgName global function in client script - e.g.

var u = PrependOrgName('/ISV/MyFolder/MyPage.aspx');

This function will determine correctly whether to add the organisation name to the Url. Note also that I've provided a relative Url, which will ensure the first part of the Url is always correct. As this uses a JavaScript function, you will always need to use a small piece of code to access the page, and cannot rely on statically providing the Url in the source of an IFrame, or in the Url attribute of an ISV.Config button. Any relative Urls in SiteMap should automatically get the organisation name applied correctly. Remember to also pass the organisation name on the querystring if the server code expects this (you can get the organisation name from the ORG_UNIQUE_NAME global variable)
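For example, to point an IFrame at the page from form client script (a sketch: the IFrame name and the orgname querystring parameter are illustrative):

```javascript
// Illustrative only: IFRAME_custom and the orgname parameter are assumptions
var u = PrependOrgName('/ISV/MyFolder/MyPage.aspx') + '?orgname=' + ORG_UNIQUE_NAME;
crmForm.all.IFRAME_custom.src = u;
```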

Earlier I promised an explanation of what the server code does. This is not as complete an explanation as it could be, but the basics are:

  1. The HttpModules identify the correct CRM organisation (MapOrg) from the Url provided, and place information about the authenticated calling user in the HttpContext (CrmAuthentication)
  2. The ExtractCrmAuthenticationToken method reads the user context from the HttpContext, and puts the user's systemuserid in the CallerId property of the CrmAuthenticationToken
  3. Because the CallerId is set, the call to CRM is necessarily using CRM impersonation. For this to be permitted, the execution account (see Dave Berry's blog for a definition) must be a member of the AD group PrivUserGroup. The execution account is the AD account that is returned by CredentialCache.DefaultCredentials. This is where things get a little interesting
  4. If the request comes via the external network and IFD authentication is used, CRM handles the authentication outside of IIS and no IIS / ASP .Net impersonation occurs. Therefore CredentialCache.DefaultCredentials will return the AD identity of the process, which is the identity of the CrmAppPool, which necessarily is a member of PrivUserGroup
  5. However, if the request comes via the internal network, AD authentication is used and IIS / ASP .Net impersonation does occur (through the setting in web.config). This impersonation will change the execution context of the thread, and CredentialCache.DefaultCredentials would then return the AD context of the caller. This is fine in a pure AD authentication scenario, but the use of the ExtractCrmAuthenticationToken method means that CRM impersonation is necessarily expected; this will only work if the execution account is a member of PrivUserGroup, and CRM users should not be members of PrivUserGroup. This is where the CrmImpersonator class comes in: its constructor reverts the thread's execution context to that of the process (i.e. it undoes the IIS / ASP .Net impersonation), so that CredentialCache.DefaultCredentials will now return the identity of the CrmAppPool, and the CRM platform will permit CRM impersonation

To finish off, here are a few other points to note:

  • IFD impersonation only applies when accessing the CRM platform. If you use IFD authentication, there is no way of impersonating the caller when accessing non-CRM resources (e.g. SQL databases, SharePoint, the file system); it cannot be done, so save yourself the effort and don't even try (though, for completeness, SQL impersonation is possible using EXECUTE AS, but that's it)
  • If you want to use impersonation, do not use the CrmDiscoveryService. The CrmDiscoveryService can only be used with IFD if you know the user's username and password, and you won't know these unless you prompt the user, which kind of defeats the point of impersonation

Thursday, 11 November 2010

When is the Default Organisation not the Default Organisation?

In CRM 4.0, an organisation can be designated as the Default Organisation in the Deployment Manager tool:



However, this does not always do what some people expect. CRM has 2 distinct concepts of a 'Default Organisation', and only one of them can be set using Deployment Manager. The 'Default Organisation' that you set in Deployment Manager is a system-wide setting whose primary use is to define which organisation database is used by code that accesses CRM 4.0 through the CRM 3.0 web services (CRM 3.0 did not support multiple organisations, so the web services had no way to specify the organisation to connect to).

The other type of 'Default Organisation' applies to users. Each user has a Default Organisation, which is the organisation that they are connected to if browsing to a Url that does not contain the organisation name. For example, if a user browses to http://crm/Excitation/loader.aspx, then the user will necessarily be taken to the Excitation organisation, but if they browse to http://crm/loader.aspx, they will be taken to their default organisation, which has no relationship to the 'Default Organisation' that is set in Deployment Manager. Each user's default organisation will be the first organisation in which their CRM user record was created.

One issue that can arise is if a user connects using a Url that does not contain the organisation name, and either their default organisation has been disabled, or their user account in their default organisation has been disabled. In this scenario, the user would receive either the error 'The specified organization is disabled' or 'The specified user is either disabled or is not a member of any business unit'. The simplest solution would be to specify an appropriate organisation name in the Url; however if this is not possible, the rest of this post describes an unsupported alternative.

Unfortunately, none of the MSCRM tools will display a user's default organisation, nor is there a supported way to change this (though there is an unsupported way - see below). All the information is stored in the MSCRM_Config database, in the SystemUser, Organization and SystemUserOrganizations tables. The following output shows some of the relevant data from these tables:


SystemUser table


Organization table


SystemUserOrganizations table

The DefaultOrganizationId field in SystemUser defines each user's default organisation, and this can be joined to the Id field in the Organization table.

The fields that define the user are a little more complex: The Id field in SystemUser is unique to each CRM user. You can join the SystemUser table to the SystemUserOrganizations table using the SystemUser.Id and the SystemUserOrganizations.UserId fields. The CrmUserId field in SystemUserOrganizations can be joined to the systemuserid field in the systemuserbase table in each CRM organisation database. Note that the same user will have a different systemuserid in each organisation database. The following query illustrates these joins, taking the user's name from one of the organisation databases (it's not stored in MSCRM_Config):

select o.UniqueName, u.FullName
from Organization o
join SystemUser su on o.Id = su.DefaultOrganizationId
join SystemUserOrganizations suo on su.Id = suo.UserId
join Excitation_MSCRM..systemuser u on suo.CrmUserId = u.systemuserid


So, that's how it fits together. As this is all SQL data, it is not difficult to modify this, but be aware that to do so is completely unsupported, and could break your CRM implementation. If you were to make any changes, make sure you backup the relevant databases (especially MSCRM_Config) before you do so.

If you did want to change a user's default organisation, please heed the warning in the preceding paragraph and backup the MSCRM_Config database. The following SQL update statement will change the default organisation of a given user, based on their systemuserid in one organisation database. The reason for writing the query this way is to ensure that a user's default organisation can only be set to an organisation that they exist in, and this query should only ever modify one record. If it modifies 0 records, then check the @systemuserid value, and if it modifies more than one record then your MSCRM_Config database is probably corrupt, and you should reinstall CRM and reimport your organisation databases (I was serious about my warnings).

declare @systemuserid uniqueidentifier
set @systemuserid = '25E1DC1D-BEC2-449B-AAD8-4A6309122AE1' -- replace this
update SystemUser
set DefaultOrganizationId = suo.OrganizationId
from SystemUserOrganizations suo
where suo.UserId = SystemUser.Id
and suo.CrmUserId = @systemuserid

One final point: CRM caches some of the data in MSCRM_Config, so you'd need to recycle the CRM application pool to be sure any changes have taken effect.

Tuesday, 9 November 2010

The CRM 4.0 Reporting Services Connector - how it works

The Dynamics CRM Connector for Reporting Services is a very useful component that avoids the need to configure Active Directory delegation (aka double-hop authentication) when multiple servers are used for the CRM, Reporting Services and SQL Server roles in a CRM implementation. In general, it is easy to install and use, but I'm always interested in how these things work.

How the Connector is installed and invoked
The connector is installed as a Data Processing Extension with SSRS. These extensions are registered within the rsreportserver.config file on the Reporting Server, as per the following snippet:

<Data>
  <Extension Name="SQL" Type="Microsoft.ReportingServices.DataExtensions.SqlConnectionWrapper,Microsoft.ReportingServices.DataExtensions" />
  <Extension Name="OLEDB" Type="Microsoft.ReportingServices.DataExtensions.OleDbConnectionWrapper,Microsoft.ReportingServices.DataExtensions" />
  <Extension Name="MSCRM" Type="Microsoft.Crm.Reporting.DataConnector.SrsExtConnection,Microsoft.Crm.Reporting.DataConnector" />
</Data>

All CRM reports with SSRS are configured to use a specific SSRS Data Source. When the Connector is installed, the Data Source is changed to use the MSCRM Data Processing Extension, instead of the default SQL Server extension. See images below:


Report properties showing the MSCRM Data Source


MSCRM Data Source using the SQL Server extension before the connector is installed


MSCRM Data Source using the CRM extension after the connector is installed.

There are 3 differences between these configurations:

  1. The Connection Type, which specifies the extension
  2. The Connection String is absent with the CRM connector. This is because the connector reads some of the database information from registry values that were created during its installation, and some from data passed to it when the report is run (see below)
  3. The Credentials. With the SQL Server connector, standard Windows Integrated security is used - i.e. the user's AD credentials are used to connect to SQL Server. With the CRM connector, separate 'credentials' are passed to SSRS (again, see below)

What happens when a report is run
If you try to run a CRM report with the CRM connector installed, the connector will require some 'credentials', as per point no.3 above. This image shows what happens if you try to run a report from Report Manager:


Running a CRM report from Report Manager when the CRM connector is installed

These 'credentials' are not what they seem; rather they are a cunning way for the CRM platform to pass information about the current user to the CRM connector. The CRM connector expects the current user's systemuserid (a Guid) to be passed into the Log In Name box, and the organizationid (another Guid) to be passed into the Password box. These are not your login name and password.

As the report uses a data source that uses the CRM connector, the RS Report Server code calls the CRM connector code (the SrsExtConnection class in the Microsoft.Crm.Reporting.DataConnector assembly, as per the rsreportserver.config data above). The code will then:
  1. Check that it is permitted to impersonate a CRM user. This checks that the identity the code is running under (which is the identity of the ReportServer application pool, or of the Reporting Services service, depending on the version of Reporting Services) belongs to the AD group PrivReportingGroup
  2. Connect to the MSCRM_Config database to determine the correct MSCRM organization database, based on the organizationid that was passed in the 'credentials'
  3. Connect to the relevant MSCRM organization database. Note that this is done (as was the previous step) using integrated security under the AD identity as per step 1 above
  4. Use the SQL statement SET Context_Info to pass the calling CRM user's systemuserid into the Context_Info
  5. Execute the SQL statement(s) within the report definition. The definitions of all filtered views use the fn_FindUserGuid function to read the systemuserid from the Context_Info
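Steps 4 and 5 can be illustrated in SQL. This is a sketch of the mechanism; the exact statements the connector issues may differ.

```sql
-- Sketch: how the connector passes the calling user to the filtered views
declare @systemuserid uniqueidentifier
set @systemuserid = '25E1DC1D-BEC2-449B-AAD8-4A6309122AE1' -- calling user

declare @context varbinary(128)
set @context = cast(@systemuserid as varbinary(16))
set context_info @context

-- fn_FindUserGuid, used by every filtered view, reads the value back
select dbo.fn_FindUserGuid()
```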

What can you do with this information
One use is for troubleshooting. Checking the rsreportserver.config is a quick way to see if the connector is installed, and checking the configuration of the MSCRM Data Source will tell you if the connector is in use. Changing the MSCRM Data Source is a quick way to turn the connector on or off for test purposes.

You can also run the reports directly, rather than from CRM. Again, when troubleshooting I find it useful to run a report directly from the Report Manager web interface. To do this with the connector, you need to enter the systemuserid and organizationid when prompted (see image above). These values can be read from the filteredsystemuser and filteredorganization views respectively in the MSCRM database.
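To find the values to enter at the prompt (a sketch; substitute your own domain login):

```sql
-- 'Log In Name' box: the user's systemuserid
select systemuserid from filteredsystemuser where domainname = 'MYDOMAIN\myuser'

-- 'Password' box: the organizationid
select organizationid from filteredorganization
```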

A further option is to run the reports via other means, such as Url Access, as described here (that article was written for CRM 3, see here for an update for CRM 4). To do this with the connector installed, you will also have to pass the systemuserid and organizationid on the query string. This is done using the following syntax:

&dsu:CRM=<systemuserid>&dsp:CRM=<organizationid>

Friday, 29 October 2010

Web.config settings - e.g. ViewState, Session

A common issue raised in the CRM Development forum is of custom web pages that work correctly in a development environment, but then fail to work when deployed into the CRM web site. The most common reason for this is that the CRM web.config overrides some of the default ASP.Net configuration settings.

The relevant entries in the CRM web.config (in the root of the CRM web site) are:

<pages buffer="true" enableSessionState="false" enableViewState="false" validateRequest="false" />
<sessionState mode="Off"/>

These have 2 main consequences:
  1. Session State is disabled. This issue is relatively easy to diagnose, as you tend to get a clear error message if trying to use session state when it is disabled
  2. ViewState is disabled. This can be a more subtle effect, as developers often rely on viewState without necessarily being aware of it. ViewState is what allows ASP.Net web controls to maintain property values across web requests; if it is disabled then it leads to symptoms such as values not being retained, or list boxes losing their contents

The solution for viewState is straightforward. You can re-enable viewState for your application either in the web.config in your application directory, or at the page level within the <%@ Page %> directive. These look like the following:

web.config:
<pages enableViewState="true" />

Page directive:
<%@ Page EnableViewState="true" %>

Session state is a bit more complex, as this is configured at the web application level. Personally, I've never seen any reason to use session state within custom code in the CRM web site; CRM doesn't use this, and I find it best to mimic CRM behaviour wherever possible.

And one final point about best practice: as this post demonstrates, it is best not to rely on the default ASP .Net configuration settings; rather, I find it best to always explicitly enable or disable settings in the local web.config.
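For example, a local web.config that states the settings explicitly might look like this (a sketch; adjust the values to what your page actually needs):

```xml
<configuration>
  <system.web>
    <!-- State each setting explicitly rather than relying on defaults -->
    <pages enableViewState="true" enableSessionState="false" validateRequest="true" />
  </system.web>
</configuration>
```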

Thursday, 28 October 2010

SDK assemblies: Versions and Processor Architecture

The download of the Dynamics CRM SDK includes the SDK assemblies (microsoft.crm.sdk.dll, microsoft.crm.sdktypeproxy.dll etc.). There are 4 sets of these assemblies: one in the bin directory, another in bin\64bit, and two more in the respective online subdirectories.


Given the directory naming, I'd always assumed that the assemblies in the bin directory were built for 32bit (x86) systems only, and those in bin\64bit were built for 64bit systems. I found this a little annoying, as I generally prefer to build assemblies as MSIL (AnyCPU) to avoid the need for different 32 and 64 bit deployment packages.

However, it turns out that the assemblies in the bin\64bit directory are actually built as MSIL (AnyCPU), rather than specifically as 64 bit assemblies (apparently this was an earlier deployment mix-up which it's now too late to correct). This gives me what I want, so I now always use the assemblies that, bizarrely, come in the bin\64bit directory of the SDK.

Thursday, 26 August 2010

What takes up most space in an MSCRM database

A frequent question is 'What is taking up most space in my MSCRM database?'. My first step is to check which tables take up most space, for which I use the SQL script at the end of this post. The most common large tables, and the reasons for them, are:

1. AsyncOperationBase: This stores all asynchronous operations, and can get very large if completed ones aren't cleared out. You have to explicitly set the registry values AsyncRemoveCompletedWorkflows and AsyncRemoveCompletedJobs. These 2 values are covered in separate KB articles (http://support.microsoft.com/kb/968755 and http://support.microsoft.com/kb/957871). To clear existing records, use the SQL script in http://support.microsoft.com/kb/968520/
2. ActivityMimeAttachment: This stores email attachments
3. AnnotationBase: This stores other files (those associated with Notes)
4. ActivityPointerBase and/or ActivityPartyBase: these store activities and the associated parties respectively. Although individual record sizes aren't typically large, these tables typically have the largest number of records

For nos. 2 to 4 I've not given a way of clearing out space, as this would involve deleting business data.

SQL Script
I normally use the following SQL script to find what objects take up most space in a SQL database. It doesn't take account of all possible storage in all SQL databases, but works fine for an MSCRM database.

select OBJECT_NAME(object_id) as TableName, SUM(reserved_page_count) / 128 as ReservedMB, SUM(used_page_count) / 128 as UsedMB, MAX(row_count) as RowCnt
from sys.dm_db_partition_stats
group by object_id
order by SUM(reserved_page_count) desc

Friday, 6 August 2010

A view of records owned by the user's business unit

Bubbling something up from an answer I gave on the Dynamics CRM forums. Although it may not seem obvious within Advanced Find, it is possible to create a view that displays records owned by a user's business unit.

The following gives the contacts owned by the current user's BU - strictly it's composed as 'contacts owned by a business unit that contains the current user'

Contacts
  Owning Business Unit
    Users (Business Unit)
      User Equals Current User
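The same query expressed as FetchXml looks broadly like this. This is a sketch reconstructed from the criteria above, not output captured from Advanced Find:

```xml
<fetch mapping="logical">
  <entity name="contact">
    <attribute name="fullname" />
    <!-- join contact to its owning business unit... -->
    <link-entity name="businessunit" from="businessunitid" to="owningbusinessunit">
      <!-- ...then to the users in that business unit... -->
      <link-entity name="systemuser" from="businessunitid" to="businessunitid">
        <filter>
          <!-- ...restricted to the current user -->
          <condition attribute="systemuserid" operator="eq-userid" />
        </filter>
      </link-entity>
    </link-entity>
  </entity>
</fetch>
```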

Wednesday, 12 May 2010

Advanced Developer Extensions - an update

Just a quick note to say that I've updated my post on the Advanced Developer Extensions. Shan McArthur of AdxStudio (who developed these extensions) was kind enough to provide extra information about these extensions, which I've now incorporated into the original post

Tuesday, 11 May 2010

CRM SDK 4.0.12 and the Advanced Developer Extensions

The CRM 4.0.12 SDK has recently been released. Normally an SDK update is not particularly significant, but in this case it includes some major enhancements, which come under the banner of 'Advanced Developer Extensions'. David Yack has already posted a quick how-to on the CRM Team Blog; rather than duplicate that, this post is intended to cover the scope and limitations of the new extensions as I see them, and how they differ from the original CRM programming model.

What are the new extensions ?
It seems to make sense to split the new features into 2: the 'Advanced Developer Extensions' (which has been helpfully shortened to Microsoft xRM), and the Portal Accelerator (aka Portal Developer). The Portal Accelerator uses Microsoft xRM, but I think it is otherwise best treated separately. So, for this post I'll concentrate on Microsoft xRM.

Architecture of Microsoft xRM
Although Microsoft xRM appears to provide a whole new programming model, it can be considered as essentially a (rather big) wrapper around the existing SDK assemblies (microsoft.crm.sdk and microsoft.crm.sdktypeproxy). So, although your code would not directly use the classes in the SDK assemblies, the communication with CRM is still ultimately done via the CRM web services, and is subject to the same limitations (e.g. limitations of the FetchXml query syntax). Another consequence of this is that you do not need to change any of your network configuration to use these extensions.

Changes to the Programming Model
This is where it gets interesting; the new extensions provide a whole new programming model, which can affect pretty well all of the code you use to communicate with CRM. The major changes as I see it are:

  1. You can use strongly-typed classes for the CRM entities. Although you can do this with the existing SOAP CRM web services, up till now you needed to use the DynamicEntity class with the SDK assemblies
  2. Native .Net types are used for attributes - e.g. int? rather than CrmNumber. Note that nullable types (e.g. int?, rather than int) are used
  3. The connection information is covered within a DataContext class, which effectively replaces the use of the CrmService instance
  4. Data modifications can be cached locally, then submitted as a batch, using the DataContext.SaveChanges method
  5. The extensions provide additional methods to update lookup values. These can work off either the lookup attribute name (e.g. customerid) or the relationship name (e.g. contact_customer_accounts). The extensions also provide methods to retrieve related records, which avoids the need to use QueryExpression or QueryByAttribute
  6. You can use LINQ queries to retrieve data from CRM, rather than building QueryExpression instances
  7. The DataContext and LINQ technologies allow direct data-binding with .Net user interface controls

Use of Strongly-Typed classes
As with the SOAP web services, you can use strongly-typed classes for system and custom entities. Superficially the process for setting this up differs from the SOAP web services, although the underlying idea is pretty similar. With these extensions, you use a supplied tool (CrmSvcUtil.exe) to connect to a CRM server. This tool will generate the class definitions for all CRM entities into one output code file which you'll add into your .Net project. Ultimately, this process is very similar to what happens behind the scenes when you create a Web Reference to the SOAP web services. The main internal difference is that the generated classes with these extensions map down to the DynamicEntity class, but this is hidden from you.

You can still use a generic class with these extensions rather than strongly-typed classes. With the extensions it is ICrmEntity, rather than DynamicEntity.

Native .Net Types
Native .Net Types are used instead of Crm-specific types. The extensions use the nullable versions of the types (e.g. int?, rather than int) so that you can still identify null values (which was one of the main original reasons for the Crm-specific types). For picklist and status attributes the extensions provide an additional Label attribute (e.g. customertypecodeLabel) with the appropriate text, whereas for lookup attributes you can get the related entity via a property that has the relationship name (e.g. price_level_accounts).

DataContext class
This replaces the need for the CrmService instance, and handles the selection of authentication type, passing credentials and management of the CrmAuthentication token. All connection-related information (server URL and port, organisation name, authentication type and credentials) can be specified in one connection string. The extension code includes logic to read connection string information from the application config file. Overall, this should greatly simplify deployment across different environments and authentication types.
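The connection string lives in the application config file and looks broadly like this. This is a sketch; check the SDK documentation for the full set of supported keywords:

```xml
<connectionStrings>
  <!-- Sketch: server URL, organisation and authentication type in one string -->
  <add name="Crm"
       connectionString="Authentication Type=AD; Server=http://crmserver/MyOrg" />
</connectionStrings>
```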

The DataContext exposes an IOrganizationService instance, which looks to combine the underlying ICrmService and IMetadataService instances. This allows use of any of the more specific CrmService messages (e.g. InstantiateTemplate), or MetadataService messages.

Interestingly the constructor for the DataContext can take an IContextService or IWorkflowContext instance, but not an IPluginExecutionContext instance. This implies that the extensions can work within a custom workflow activity, but are of limited use within plugins. See below for more on this.

Batch Updates
It looks to me as though any changes submitted via the DataContext will be submitted as a batch via the SaveChanges method. This won't provide any transactional support, as this is not possible within the Crm Web Services. Overall, I think this approach is due to the design patterns used in the Microsoft data-related classes, and I'm pretty neutral as to whether this offers more benefits or drawbacks. If you do develop with these extensions, I'd bear the following in mind:

  • I could imagine developers forgetting to call the SaveChanges method, and I can't see (though I've not tested this) a way that the extension code would throw an error if pending changes were discarded
  • There seems to be no means to control the behaviour if an individual data operation fails, and the documentation doesn't currently describe the default behaviour here. To get such control you need to call SaveChanges for each individual data modification

Handling lookups and relationships
I've not looked at this in great detail, but I see the greatest benefit as the simplicity of retrieving related records with methods such as GetRelatedEntities to get child and many-to-many entities, and GetRelatedEntity to get a parent entity.

Use of LINQ queries
Again, not an area I've spent much time with. In the first instance I see this as most useful for developers that are already familiar with the LINQ query syntax, but I generally encourage the use of standard technologies (such as LINQ) in preference to application-specific technologies (such as QueryExpression). My expectation is that you should expect developer productivity gains with use of LINQ instead of QueryExpression, but I've not spent enough time on this to get good metrics. It should be emphasised that the LINQ query ultimately maps down to a QueryExpression, and hence you are as limited in the scope of queries as you currently are.

Data Binding
This really comes as a direct benefit of the use of standard .Net data classes, and LINQ. You can directly bind UI components like the ASP.Net GridView control to the results of a LINQ query. Oddly, there doesn't seem to be an example of this in the main crmsdk4.chm help file, but the following code extract from the 'advanced_developer_extensions_-_walkthrough_webapp.docx' document in the SDK shows how easy and powerful this can be:

var crm = new XrmDataContext("Crm");
ContactsGrid.DataSource = crm.contacts.Where(c => c.emailaddress1.EndsWith("@example.com"));
ContactsGrid.DataBind();

Limitations of the Advanced Developer Extensions
There's a lot of very good stuff in the areas mentioned above, which could have significant benefits for developer productivity. However, I don't think you can immediately replace all existing .Net code (even if you wanted to), as there are some areas that I don't think these extensions reach (yet?). Note that the following list is based on my investigations so far, and may include features that do exist but that I've missed in my brief analysis.

  1. Plugins. As mentioned above, the constructor for the XrmDataContext class has no overload that takes an IPluginExecutionContext. I also can't see a way to manipulate a DynamicEntity (e.g. from InputParameters) within these extensions, so I don't think they are viable for use in plugin code
  2. IFD (aka SPLA) impersonation. AD impersonation is supported, but I can't see the equivalent of the CrmImpersonator class. I need to do some further tests on this to see if IFD impersonation can work with these extensions; if not it would restrict the deployment scenarios for web extensions

It's also worth pointing out that the major benefits of the extension code relate to standard data operations (Create, Retrieve, Update, Delete). You can still use the additional messages (e.g. InstantiateTemplate, Merge) of the CrmService and MetadataService through these extensions, but you'll need to use the standard SDK types.

My understanding is that these extensions originated as a basis for developing portal-style applications against CRM, so in that context it is not at all surprising that there are areas where the code can't reach. It'll be interesting to see how this changes, both with CRM 4 and CRM 5.

What Next ?
What next, indeed. The timing of this release is interesting, coming relatively late in the life of CRM 4 as the most recent version of CRM. It's still a little too early to know what will happen with CRM 5, but it would be logical to expect that these extensions will continue to work with future versions of CRM. The more interesting question is whether we will continue to have more than one programming model for CRM 5 (e.g. a native interface, and extensions such as these which form a wrapper around the native interface), or whether the two models will start to merge together.

Tuesday, 13 April 2010

CRM 4.0 UR 10

It's that time again; a new Update Rollup is out. See
http://support.microsoft.com/?kbid=979347
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=5da4d352-1d3f-4426-9c17-b256c30afdb9

I think the most significant fix in this rollup is the fix for metadata in new Organisation databases, as described at http://support.microsoft.com/kb/980627/ . This fixes a bug that started in UR 7, in which a newly created organisation (i.e. one created after rollup 7 or higher had been installed) was missing some SQL objects and permissions. The two symptoms of this that I'd met were:
  1. Missing stored procedures for cascading relationships. This gives the error: SqlException: Could not find stored procedure 'dbo.p_CascadeCollectAssign'
  2. Missing permissions on the FilteredActivityPointer view. This gives the error: the SELECT permission was denied on the object 'FilteredActivityPointer', and is described here

Saturday, 13 March 2010

A bit of SQL - finding CRM attributes with no data

Over time I've been asked to review various existing CRM implementations. One thing I often like to check is whether there are any attributes that are not in use. To this end I created the stored procedure at the end of this post.

The stored procedure lists the attributes in an entity along with how many records contain data in that attribute. The parameters in the procedure allow you to limit the attributes to custom attributes, and to specify a maximum count of values (above which the attribute data won't be output).

Notes:
  1. The procedure as written would be created in the MSCRM organisation database. You could create it in a different database, and add the databasename.schemaname. prefixes if you want
  2. The procedure uses the supported route of querying the filtered view for an entity. If you have SQL dbo permission, and don't mind deleted records being included in the counts, then you can run the query against the base view, which is considerably faster. To do this, remove the string 'Filtered' from the procedure definition

Use the following SQL to create the stored procedure:

create proc pExcCheckColumnValues @entityName nvarchar(64), @customOnly bit = 0, @maxThreshold int = null
as
declare @attrName nvarchar(50), @sql nvarchar(1024)

-- Cursor over the non-logical attributes of the entity, optionally custom attributes only
declare cur cursor fast_forward for
select a.name from attribute a
join entity e on a.entityid = e.entityid
where e.name = @entityName and a.IsLogical = 0
and ((@customOnly = 1 and a.IsCustomField = 1) or isnull(@customOnly, 0) = 0)

create table #tmp (EntityName nvarchar(64), AttributeName nvarchar(50), ValueCount int)

open cur
fetch next from cur into @attrName
while @@fetch_status = 0
begin
	-- Count the records with a value in this attribute, querying the filtered view.
	-- The attribute name is bracketed in case it clashes with a reserved word
	set @sql = 'insert #tmp select ''' + @entityName + ''', ''' + @attrName + ''', count(*) from Filtered' + @entityName + ' where [' + @attrName + '] is not null'
	 + case when @maxThreshold is not null then ' having count(*) <= ' + cast(@maxThreshold as nvarchar(10)) else '' end
	exec (@sql)
	fetch next from cur into @attrName
end
close cur
deallocate cur

select * from #tmp
go

To give an example of how to use the procedure, the following will return all custom attributes of the account, where no more than 10 account records have a value in this field:

exec pExcCheckColumnValues @entityName = 'account', @customOnly = 1, @maxThreshold = 10

Thursday, 11 March 2010

Service Principal Names and Security

I've recently been working with CRM and Reporting Services in environments where AD delegation has been required, and hence it's been important to configure Service Principal Names (SPNs).

A good overview of SPNs and how they are set can be found in this document for CRM 3. Although some of the steps in that document have been superseded by the use of the CRM 4.0 Reporting Services connector, the document is still a good reference for the main aspects of setting up SPNs.

There are two further considerations that are not covered in the above document though; ensuring Kerberos is used, and getting permission to modify SPNs.

Ensure Kerberos is used
There are two Windows authentication mechanisms: NTLM (aka Challenge/Response) and Kerberos. Only Kerberos supports delegation. By default, Kerberos will be used if the client supports it (this setting is known as 'Negotiate'), but this behaviour can be changed.

For web applications hosted in IIS (such as CRM and Reporting Services 2005), this is controlled by the NTAuthenticationProviders setting in the IIS metabase. Note that this can be set at a few different levels, and you'll need to check them all to determine the current configuration. When you install CRM, the Environment Diagnostics Wizard requires that Negotiate is the default setting, though it could have been changed after install.

Reporting Services 2008 does not use IIS, and the authentication settings are configured in the rsreportserver.config file. Information about these settings can be found here in SQL Server 2008 Books Online.
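For example, to offer Negotiate (and hence Kerberos) in Reporting Services 2008, the AuthenticationTypes element in rsreportserver.config would contain something like the fragment below. This is only a sketch of the relevant element; check Books Online for your version before editing the file:

```xml
<Authentication>
  <AuthenticationTypes>
    <RSWindowsNegotiate />
  </AuthenticationTypes>
</Authentication>
```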

Permissions required to modify SPNs
It's all very well knowing what SPNs need to be created or modified, but you may not have permission to do this, especially in high security environments. SPN data is stored in Active Directory, hence you need appropriate permissions in Active Directory. There are two approaches that you can take to get the SPNs setup if you don't have permission:
  1. Ask a Domain Administrator to grant you the permissions. The second half of this technet article describes the permissions you need, and how they are assigned
  2. The individual service accounts can be granted permission to create their own SPNs. This is more applicable to SQL Server SPNs, but can be used by any code that can register its own SPN. This is well described here
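Once the permissions are in place, the SPNs themselves are listed and registered with the setspn command-line tool. The server and account names below are hypothetical:

```
rem List the SPNs currently registered against the CRM service account
setspn -L MYDOMAIN\crmserviceacct

rem Register HTTP SPNs (NetBIOS name and FQDN) for the CRM web site against that account
setspn -A HTTP/crmserver MYDOMAIN\crmserviceacct
setspn -A HTTP/crmserver.mydomain.local MYDOMAIN\crmserviceacct
```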

Friday, 12 February 2010

Update Rollup 9

It's that time of every other month again. A new update rollup is available; number 9 is available here. There don't seem to be any particularly important fixes this time, unless you're in one of the time zones affected by this bug, which is now fixed.