Wednesday, August 5, 2009

ADO.NET Entity Framework and RIA Services

I want to share some thoughts on the ADO.NET Entity Framework (EF) and RIA Services (RIA). Before I get too deep into the subject, please understand that while I have some experience leveraging EF, I haven't been able to play enough with RIA Services to forge a definitive opinion (I did a very simple project to test its capabilities). Still, at first sight the added benefit of using RIA seems very low, and for each advantage there is, in my opinion, a better path to take. I'll get back to that in a minute.

As far as the ADO.NET Entity Framework goes, and setting aside the minor kinks I found along the way (which I was able to work around, sometimes easily and sometimes not), I think it would be a sound business decision to seriously consider it as the foundation of a whole data architecture, head to head with NHibernate.

Entity Framework:

EF presents some great characteristics for which I am a proponent. My main drivers are:
- It is at the center of Microsoft's data architecture vision.
- Microsoft announced that EF is one of the technologies behind Oslo, and I expect many future tools to interconnect the two. http://www.microsoft.com/soa/products/oslo.aspx
- Microsoft made it clear that EF will be compatible with future versions of SQL Server Reporting Services, SQL Server Analysis Services and Windows Workflow Foundation. Sorry, no link; I don't remember where I saw this.
- A rosier future if we move to SOA in the Windows Azure cloud, since ADO.NET Data Services leverages EF. http://msdn.microsoft.com/en-us/data/bb931106.aspx
- Compatible with LINQ out of the box, the language-integrated query feature introduced with .NET 3.5. LINQ is phenomenal. (NHibernate is compatible now too, through a contrib library.)
- A visual DSL inside Visual Studio to map entities to the database. I've seen better (little kinks here and there), but I've seen worse too.
- The API is great, fresh and best of breed in my opinion.

Some of the drawbacks I found are:
- Early technology; it presents minor kinks you have to find ways around.
- The entities are not reusable: they are regenerated each time you create a new model containing the same entity... When you define the same entity twice in the same assembly, you have to put the copies in separate namespaces.
- Entities are bloated with code, and the classes are generated in one big file per model, which is pretty annoying.
- You have to develop against the SQL Server version you will deploy on, or else your EDMX won't be compatible without recompiling. I.e. if you build the EDMX on SQL 2008, you have to run the software on SQL 2008, or you have to change your EDMX and recompile... Worse, it will use datetime2 by default and you'll have to redo your model.
- If you expose these entities in your public web service, the WSDL is definitely going to be kludgy.
- Version one is not compatible with POCO classes without significant contortions.

In my view, best-of-breed software development processes today start by crafting a domain object model by hand (something we also refer to as a canonical data model, since we would also like to reuse it in integration scenarios): cherry-picking the attributes on each entity, defining them in real-world business terms, and then adapting this model to the O/R mapping software (EF or NHibernate, for example). This is a bit more work up front, but it pays off handsomely once you take integration scenarios and new technologies into account. I prefer to adhere to the KISS principle on the entity tier: I urge my clients to define entities with as few "hard links" between them as possible, so that they can easily be reused in several different scenarios. If you generate entities with a technology like EF directly from your database and put them on the wire for public consumption, you're getting away from the KISS principle, at least for integration scenarios, and you introduce dangerous and subtle coupling between the middle tier and the database engine.
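To make the idea concrete, here is a minimal sketch of what I mean by a hand-crafted, KISS-style entity; all type and property names here are illustrative, not from any real model:

```csharp
// Hand-crafted canonical entity: a plain class defined in business terms.
// No base class from the mapper, no generated code, no object-graph
// "hard links" -- related entities are referenced by identifier only, so
// the same class can travel across integration boundaries untouched.
public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }

    // Reference by id rather than holding an Address object, keeping the
    // entity decoupled from the rest of the model (and from the mapper).
    public int PrimaryAddressId { get; set; }
}
```

The mapper (EF or NHibernate) is then adapted to fit this model, instead of the model being generated from the database.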

RIA Services:
The essence of RIA is to share logic between the services and a Silverlight/Ajax client, generate proxy code to talk to the web services, and provide some facilities like authentication, validation, etc. With or without RIA, your Silverlight clients will reach the same level of richness.

On the positive side:
- It's built by Microsoft.
- It's easy, fast and gets the job done.
- It promotes the use of the full .NET Framework on the server side and generates Silverlight code for the client.
- It's extensible, so whatever we don't like can be added and probably modified too.
- It could be applied to certain key scenarios and would jive with a mixed architecture.
- People have been juggling with the Model-View-ViewModel pattern and it seems to work. http://wildermuth.com/2009/08/05/RIA_Services_Silverlight_and_MVVM
- Flexibility has been demonstrated by tying it to NHibernate and ASP.NET MVC: http://www.chrisvandesteeg.nl/

RIA drawbacks in my mind:
- The Visual Studio code generation is an annoying bit for me, at least until VS 2010, where we'll have great control over the T4 template generation process, even for this stuff.
- It's still just a Technology Preview, not a released product.
- Attribute-based validation is useless in enterprise scenarios.
- It's geared toward servicing the UI; what gets generated doesn't adhere to KISS in terms of integration scenarios. I'd rather have my own services, my own entities, my own simple schema, defined once and used in all the different scenarios.
- Code generation is used to share application logic. This *could* be fine if it weren't for the fact that you have to develop the UI and the server at the same time. Let me be clear on that one: code sharing between projects is, in my opinion, not a good practice; been there, done that, and stopped... This is a real concern to me on large enterprise-scale projects. It seems harder to develop using a modularized approach, although I would have to invest some more time to make sure.
- One annoying fact is that, out of the box, the domain service object doesn't use any interfaces whatsoever, which makes it hard to write unit tests and fully embrace an IoC kernel where you inject mock services into your UI kernel instead of the real connected implementations. Honestly, I just don't understand how the lead engineer got away with releasing these bits without actually interfacing the LinqToEntitiesDomainService base class.
  public class YourDomainService : LinqToEntitiesDomainService
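Since the base class isn't interfaced, one workaround is to put your own interface in front of the domain service; here is a rough sketch of what I have in mind (ICatalogService, Product and the operation names are all hypothetical):

```csharp
// Hypothetical sketch: the UI and the unit tests depend on this
// interface instead of the concrete, EF-backed domain service.
public interface ICatalogService
{
    IQueryable<Product> GetProducts();
}

// Production implementation delegates to the generated domain service
// (YourDomainService and GetProducts are illustrative names).
public class CatalogService : ICatalogService
{
    private readonly YourDomainService _service = new YourDomainService();

    public IQueryable<Product> GetProducts()
    {
        return _service.GetProducts();
    }
}

// In unit tests, register a MockCatalogService : ICatalogService in the
// IoC kernel; it returns in-memory data and never touches the database.
```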

That's it, I hope you liked it and that I didn't bore you too much ! :)

Thursday, March 19, 2009

Finally, an easy fix in the .edmx file...

I finally decided to check the .EDMX file after the conversion to SQL 2000, before my commit to SVN. I figured out that the target engine is driven by an attribute on the store schema definition. For SQL 2008, I had:

ProviderManifestToken="2008" xmlns:store="http://schemas.microsoft.com/ado/2007/12/edm/EntityStoreSchemaGenerator" xmlns="http://schemas.microsoft.com/ado/2006/04/edm/ssdl">

For SQL 2000:

ProviderManifestToken="2000" xmlns:store="http://schemas.microsoft.com/ado/2007/12/edm/EntityStoreSchemaGenerator" xmlns="http://schemas.microsoft.com/ado/2006/04/edm/ssdl">
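For context, this attribute sits on the root Schema element of the SSDL section inside the .edmx; here is a minimal sketch of where it lives (the Namespace value is illustrative):

```xml
<!-- SSDL fragment from the .edmx; ProviderManifestToken selects the
     target SQL Server version the generated SQL must be compatible with. -->
<Schema Namespace="MyModel.Store" Alias="Self"
        Provider="System.Data.SqlClient"
        ProviderManifestToken="2000"
        xmlns:store="http://schemas.microsoft.com/ado/2007/12/edm/EntityStoreSchemaGenerator"
        xmlns="http://schemas.microsoft.com/ado/2006/04/edm/ssdl">
  <!-- ... store entity types and entity sets ... -->
</Schema>
```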


I guess I wasn't using 2005 when I generated my model either; I was using 2008... :)

ADO.NET Entity Framework SQL2000 vs SQL2005

I have been venturing into upgrading a simple FTP service to the ADO.NET Entity Framework. As usual with Microsoft, things just don't work out of the box.

First, let me tell you that I started by generating my model from a SQL 2005 database. I had to learn the hard way that if you try to use the model with SQL 2000, the SQL generated by the Entity Framework is going to be incompatible in a few cases. This code, which works fine with SQL 2005, didn't work out of the box with SQL 2000:

' Entity SQL query with a named parameter (@p) against the object context.
Dim paramReference As ObjectParameter = New ObjectParameter("p", ThreadContext.Properties()("ActionType"))
Dim actionType As Harmony.FTPService.ActionType = context.ActionType.Where("It.ActionTypeId = @p", paramReference).First()

So then I decided to try regenerating my model against SQL 2000, on a different computer where I have SQL 2000 installed, to see if that would actually help. So, guess what? After adding my new SQL 2000 connection in the ADO.NET Entity wizard, I got this error message:

---------------------------
Microsoft Visual Studio
---------------------------
Could not load file or assembly 'Microsoft.SqlServer.Management.Sdk.Sfc, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.
---------------------------
OK
---------------------------

Ok, thanks to the internet, I found a link telling me that DLL Hell is not a thing of the past and that I had to download and install the following SQL Server 2008 features on top of my SQL 2005/SQL 2000 installation:

Install the following from the Microsoft SQL Server 2008 Feature Pack RC0, June 2008:
- Microsoft SQL Server 2008 Native Client
- Microsoft SQL Server System CLR Types
- Microsoft SQL Server 2008 Management Objects

These features can be downloaded from:
http://www.microsoft.com/downloads/details.aspx?FamilyId=089A9DAD-E2DF-43E9-9CD8-C06320520B40&displaylang=en


WAIT, WHAT? I don't have SQL 2008!!! Oh well, I did install the 3 MSIs on my machine anyway...

Back in business!!!

I was able to generate my new model based on SQL 2000... At first glance, the only difference is in the connection string: MultipleActiveResultSets=False instead of MultipleActiveResultSets=True... I am not going to venture into the EDMX XML file; it may be different, but I don't care at this point in time. I prefer to keep it as a black box, since I have just barely started my venture into the ADO.NET Entity Framework.

Guess what? Generating against SQL 2000 fixes the issue!

I hope this post can help anyone else who has SQL 2000 vs SQL 2005 issues with the ADO.NET Entity Framework.

Monday, March 9, 2009

Welcome to my Blog

Hi all!

I have created my personal blog where I am going to share my current findings around technology and the like.

Hope you will enjoy it!