Feb 03 2012
 

In Part 1: Out-of-the-box Features, I went through some of the great new features in Enterprise Library 5 Data Access, including accessors and mappers. Before version 5, most of my EntLib extension code existed to provide these same features (not as elegantly, of course). I have become attached to a few of the extensions I put in place over the years, so I will keep my extensions around now for only a few reasons: 1) customized database exceptions, 2) IDataReader usability enhancements, and 3) a reduced mapping footprint.

Extensions

I typically have a Utils.dll that I import in every project. For data/resource access projects, I also include my Utils.Data.dll. Utils.Data started its career as a data access application block similar to SqlHelper from the pre-EntLib days. Today, Utils.Data is a set of extensions that merely makes Enterprise Library more fun to be with.

IDataReaderExtensions

Out of the box, System.Data.IDataRecord only gives you the ability to access fields by their integer index value. As an architect who does not have supervisory control over the database or the objects within it, I find this scary. Any addition or re-ordering of the output fields will surely cause your index-based mapping to blow up. You could solve this with a call to .GetOrdinal(fieldName) first to get the index, but that is twice the code (not to mention boring plumbing code). My extensions do nothing novel. They simply provide string-based overloads like .GetInt32(string name) that do the retrieval and casting for you. I also added a few frequently used extensions like .GetNullableInt(string name) to keep my result mapping as clean and concise as possible.

Reader use with built-in features:

jeep = new Jeep()
{
	ID = reader.GetInt32(0),
	Name = reader.GetString(1),
	Description = reader.GetString(2),
	Status = reader.GetBoolean(3)
};

Reader use with extensions:

jeep = new Jeep()
{
	ID = reader.GetInt32("JeepID"),
	Name = reader.GetString("Name"),
	Description = reader.GetString("Description"),
	Status = reader.GetBoolean("Status")
};
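Under the hood, these extensions are little more than wrappers around GetOrdinal. Here is a minimal sketch; the method names mirror my examples above, but the exact implementation shown is illustrative:

```csharp
using System.Data;

// String-based reader extensions: resolve the column name once with
// GetOrdinal, then do the typed retrieval. Implementation is a sketch.
public static class DataRecordExtensions
{
    public static int GetInt32(this IDataRecord record, string name)
    {
        return record.GetInt32(record.GetOrdinal(name));
    }

    public static string GetString(this IDataRecord record, string name)
    {
        return record.GetString(record.GetOrdinal(name));
    }

    public static bool GetBoolean(this IDataRecord record, string name)
    {
        return record.GetBoolean(record.GetOrdinal(name));
    }

    // Handles nullable columns without the usual IsDBNull plumbing
    // at every call site.
    public static int? GetNullableInt(this IDataRecord record, string name)
    {
        int ordinal = record.GetOrdinal(name);
        return record.IsDBNull(ordinal) ? (int?)null : record.GetInt32(ordinal);
    }
}
```

GetOrdinal resolves the name on every call; if profiling ever showed those lookups mattered, the ordinals could be cached per result set.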

I advise that you never use string literals in data access code. Data access code is hit hard, so take your performance improvements when you can. I prefer having const strings locally in my data access class or having an internal static class with const strings to share with all classes in my data access project. The attached solution has examples.
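For example, the shared constants might look like this; the names here are illustrative stand-ins for the ones in the attached solution:

```csharp
// Field-name and procedure-name constants shared across the data access
// project, so each string literal lives in exactly one place.
internal static class Fields
{
    internal const string JeepID = "JeepID";
    internal const string JeepName = "Name";
    internal const string JeepDescription = "Description";
    internal const string JeepStatus = "Status";
}

internal static class StoredProcedures
{
    internal const string GetJeepByID = "GetJeepByID";
}
```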

Parameter and Result/Row Mapping

The now built-in ParameterMapper, RowMapper, and ResultSetMapper are beautiful. Sometimes you need a little sumpin’ special to make your code easier to read and work consistently when getting one or ten entities in a database call. Similar to how ExecuteSprocAccessor works with row and result set mappers, CreateObject and CreateCollection support generics and build an object or collection of the specified type. Instead of deriving a new class from a base mapper class, I chose to have one delegate method that generates a single object from a reader. This delegate is used by both CreateObject and CreateCollection. Let’s look at the differences with code.

Creating an object with EntLib5 features:

public Jeep GetJeepByID(int id)
{
	Database db = DatabaseFactory.CreateDatabase();
	IParameterMapper jeepParameterMapper = new JeepParameterMapper();
	IRowMapper<Jeep> jeepRowMapper = new JeepRowMapper();
	IEnumerable<Jeep> jeeps = db.ExecuteSprocAccessor<Jeep>(StoredProcedures.GetJeepByID, jeepParameterMapper, jeepRowMapper, id);
	return jeeps.First();
}

internal class JeepRowMapper : IRowMapper<Jeep>
{
	public Jeep MapRow(System.Data.IDataRecord row)
	{
		return new Jeep()
		{
			ID = row.GetInt32(0),
			Name = row.GetString(1),
			Description = row.GetString(2),
			Status = row.GetBoolean(3)
		};
	}
}

Creating an object with extensions:

public Jeep GetJeepByID(int id)
{
	Database db = DatabaseFactory.CreateDatabase();
	DbCommand cmd = db.GetStoredProcCommand(StoredProcedures.GetJeepByID, id);
	Jeep jeep = db.CreateObject(cmd, GenerateJeepFromReader);
	return jeep;
}

private Jeep GenerateJeepFromReader(IDataReader reader)
{
	Jeep jeep = null;
	if (reader.Read())
	{
		jeep = new Jeep()
		{
			ID = reader.GetInt32(Fields.JeepID),
			Name = reader.GetString(Fields.JeepName),
			Description = reader.GetString(Fields.JeepDescription),
			Status = reader.GetBoolean(Fields.JeepStatus),
		};
	}
	return jeep;
}
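For reference, the core of CreateObject and CreateCollection can be sketched as follows. This is a simplified illustration written directly against IDataReader; the real versions are Database extension methods that also execute the command and apply the customized exception handling:

```csharp
using System.Collections.Generic;
using System.Data;

// The single delegate shared by CreateObject and CreateCollection: one
// method that builds an entity from the reader, calling reader.Read()
// itself and returning null when no row remains (as in
// GenerateJeepFromReader above).
public delegate T GenerateObjectFromReader<T>(IDataReader reader);

public static class DatabaseExtensionsSketch
{
    public static T CreateObject<T>(IDataReader reader, GenerateObjectFromReader<T> generate)
        where T : class
    {
        // One entity: a single invocation of the generator.
        return generate(reader);
    }

    public static List<T> CreateCollection<T>(IDataReader reader, GenerateObjectFromReader<T> generate)
        where T : class
    {
        // Many entities: keep invoking the same generator until it
        // reports that the reader is exhausted.
        List<T> items = new List<T>();
        T item;
        while ((item = generate(reader)) != null)
        {
            items.Add(item);
        }
        return items;
    }
}
```

The point of the shape is that one delegate method serves both the single-object and the collection case, instead of deriving a mapper class per entity.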

One more thing to note is that my CreateObject, CreateCollection, and their GetAccessor equivalents have my customized exception handling logic that makes use of the StoredProcedureException. We’ll go through that now.

Customized and Standardized Exceptions

Logging exceptions has value only if your entire system logs exceptions and other messages in a consistent and meaningful manner. If error messages are logged as "ERROR!" or "All bets are off!!!" then you shouldn't bother logging. In the real world, few developers, architects, or support staff have access to production databases. Having meaningful and detailed error messages is key to troubleshooting an issue and meeting your SLAs. I created a simple StoredProcedureException that provides the executed (or attempted) command as part of the stack trace.

WARNING: You should never, ever, ever show the stack trace in your application or let your users see the real error messages.
Log the real message and stack trace, then show "Data access exception" to your users. Please!
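The exception itself can be sketched like this. The real version builds its message from the DbCommand and its parameters; the DescribeCommand helper below is my own illustration to keep the sketch self-contained:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// A sketch of StoredProcedureException: the message carries the command
// that was executing, so it lands in the logged stack trace alongside
// the original SqlException (kept as InnerException).
public class StoredProcedureException : Exception
{
    public StoredProcedureException(string commandDescription, Exception innerException)
        : base("[Stored procedure executed: " + commandDescription + "]", innerException)
    {
    }

    // Illustrative helper: flatten a procedure name and its parameters
    // into the bracketed description shown in the sample trace.
    public static string DescribeCommand(string procedureName, IDictionary<string, object> parameters)
    {
        StringBuilder sb = new StringBuilder(procedureName);
        foreach (KeyValuePair<string, object> p in parameters)
        {
            sb.Append(' ').Append(p.Key).Append('=').Append(p.Value);
        }
        return sb.ToString();
    }
}
```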

 

In the attached code samples, you'll see two data access methods that call "ExceptionStoredProcedure", which does nothing other than RAISERROR('This is an exception', 16, 1). With the built-in features, you can expect a SqlException and a stack trace that looks like this:

at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) 
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) 
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning() 
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) 
   at System.Data.SqlClient.SqlDataReader.ConsumeMetaData() 
   at System.Data.SqlClient.SqlDataReader.get_MetaData() 
   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString) 
   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async) 
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result) 
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method) 
   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method) 
   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior) 
   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior) 
   at Microsoft.Practices.EnterpriseLibrary.Data.Database.DoExecuteReader(DbCommand command, CommandBehavior cmdBehavior) 
      in e:\Builds\EntLib\Latest\Source\Blocks\Data\Src\Data\Database.cs:line 460 
   at Microsoft.Practices.EnterpriseLibrary.Data.Database.ExecuteReader(DbCommand command) 
      in e:\Builds\EntLib\Latest\Source\Blocks\Data\Src\Data\Database.cs:line 846 
   at Microsoft.Practices.EnterpriseLibrary.Data.CommandAccessor`1.d__0.MoveNext() 
      in e:\Builds\EntLib\Latest\Source\Blocks\Data\Src\Data\CommandAccessor.cs:line 68 
   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source) 
   at DataAccess.JeepDataAccess.GetJeepByIDShowingException(Int32 id) 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\DataAccess\JeepDataAccess.cs:line 58 
   at Client.Program.TestExceptionGetWithEntLib5Only() 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\Client\Program.cs:line 58 
   at Client.Program.Main(String[] args) 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\Client\Program.cs:line 22

With my extensions, you can expect a StoredProcedureException that includes the text of the full stored procedure being executed at the time. This has saved me countless times as my log table stores the full stack trace and I can reproduce exactly what happened without guessing. The InnerException of the StoredProcedureException will be the same SqlException seen above. The customized stack trace will look like this:

[Stored procedure executed: ExceptionStoredProcedure @RETURN_VALUE=-6, @JeepID=1]
   at Soalutions.Utilities.Data.DatabaseExtensions.CreateObject[T](Database db, DbCommand cmd, GenerateObjectFromReader`1 gofr) 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\EntLibExtensions\DatabaseExtensions.cs:line 49
   at DataAccess.JeepDataAccess.GetJeepByIDShowingExceptionWithExtensions(Int32 id) 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\DataAccess\JeepDataAccess.cs:line 65
   at Client.Program.TestExceptionGetWithExtensions() 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\Client\Program.cs:line 66
   at Client.Program.Main(String[] args) 
      in C:\Dev\Cookbook\Utilities\EntLibExtensions\5.0\EntLibExtensions\Client\Program.cs:line 23

So that’s really it. There is some other hidden goodness in there, but it’s not really worth talking about in this post.

Download sample solution: EntLibExtensions.zip – 140 KB (143,360 bytes)

Dec 07 2010
 

In a previous post about my extensions for Enterprise Library before version 5, there was quite a bit of customized logic to create custom entities from a result set. Enterprise Library 5 now takes care of almost all of my customizations with the advent of accessors, row mappers, result set mappers, and parameter mappers. In this post I'll show a few different ways to use out-of-the-box Enterprise Library 5 features to access data. In Part 2, I'll show a few of my own extensions that simply extend Enterprise Library and reduce repetitive code in my data access layer.

Out-of-the-box Features

The simplest scenario is when your database queries bring back results with column names exactly matching the property names. This is by far the easiest code to write with Enterprise Library, and requires far less code than all previous versions. Here is a sample showing the default mapping of input parameters and result set columns/values using the new database extension method ExecuteSprocAccessor. You simply pass in the stored procedure name and the params, getting back an IEnumerable of your custom entity (in this case, a Jeep object).

public Jeep GetJeepByID(int id)
{
    Database db = DatabaseFactory.CreateDatabase();
    IEnumerable<Jeep> jeeps = db.ExecuteSprocAccessor<Jeep>("GetJeepByID", id);
    return jeeps.First();
}

You can only use this method if all public properties of the custom entity can be mapped to a result set column/value. If any public property values cannot be mapped, you will receive a System.InvalidOperationException stating that the column was not found on the IDataRecord being evaluated. If your parameter or result set mapping becomes more complicated, you can specify a parameter mapper, row mapper, result set mapper, or a combination thereof to customize how your procedure is called, and how the results are interpreted. Here is an example of a custom parameter mapper and row mapper used to replicate the default mapping performed in the first example:

internal class JeepParameterMapper : IParameterMapper
{
    public void AssignParameters(DbCommand command, object[] parameterValues)
    {
        DbParameter parameter = command.CreateParameter();
        parameter.ParameterName = "@JeepID";
        parameter.Value = parameterValues[0];
        command.Parameters.Add(parameter);
    }
}

internal class JeepRowMapper : IRowMapper<Jeep>
{
    public Jeep MapRow(System.Data.IDataRecord row)
    {
        return new Jeep()
        {
            ID = row.GetInt32(0),
            Name = row.GetString(1),
            Description = row.GetString(2),
            Status = row.GetBoolean(3)
        };
    }
}

Below you will see the same task being performed in the first example, but this time with our custom mappers.

public Jeep GetJeepByIDWithMappers(int id)
{
    IParameterMapper jeepParameterMapper = new JeepParameterMapper();
    IRowMapper<Jeep> jeepRowMapper = new JeepRowMapper();

    Database db = DatabaseFactory.CreateDatabase();
    IEnumerable<Jeep> jeeps = db.ExecuteSprocAccessor<Jeep>("GetJeepByID", jeepParameterMapper, jeepRowMapper, id);
    return jeeps.First();
}

ResultSetMappers can be used to map more complex result sets to custom entities with deeper object graphs. Consider a stored procedure that returns multiple result sets similar to that seen in the following image. The first result set contains the custom entity details, and the second result set is some collection of child objects. In this case, we see an article with a child collection of article images.

[Image: article stored procedure results: the article details result set, followed by the article images result set]

You would have a hard time building up your custom entity without using an IDataReader and iterating through the result sets with .NextResult. ResultSetMappers allow you to code for this scenario. Below we’ll create a custom result set mapper for articles that will map all of the relevant result sets to the Article object.

internal class ArticleResultSetMapper : IResultSetMapper<Article>
{
    public IEnumerable<Article> MapSet(System.Data.IDataReader reader)
    {
        Dictionary<int, Article> articles = new Dictionary<int, Article>();

        Article article;
        ArticleImage articleImage;
        while (reader.Read())
        {
            article = new Article
            {
                ID = reader.GetInt32(0),
                Title = reader.GetString(1),
                Description = reader.GetString(2),
                Images = new Collection<ArticleImage>()
            };
            articles.Add(article.ID, article);
        }
        if (reader.NextResult())
        {
            while (reader.Read())
            {
                int articleID = reader.GetInt32(0);
                if (articles.ContainsKey(articleID))
                {
                    articleImage = new ArticleImage
                    {
                        DisplayOrder = reader.GetInt32(1),
                        Url = reader.GetString(2),
                        Caption = reader.GetString(3)
                    };
                    articles[articleID].Images.Add(articleImage);
                }
            }
        }

        return articles.Select(a => a.Value);
    }
}

Below you will see the code used to create a new IEnumerable<Article> using our ArticleResultSetMapper:

public Article GetArticleByID(int id)
{
    ArticleResultSetMapper articleResultSetMapper = new ArticleResultSetMapper();

    Database db = DatabaseFactory.CreateDatabase();
    IEnumerable<Article> articles = db.ExecuteSprocAccessor<Article>("GetArticleByID", articleResultSetMapper, id);
    return articles.First();
}

As you can probably tell, Enterprise Library 5 gives you more power and control over the mapping and generation of your custom entities. The previous version of my Enterprise Library extensions focused primarily on performing just the types of mappings that are now built into the product. After seeing just a few examples, you should be ready to jump into Enterprise Library 5 Data Access head first. In the next post, we'll walk through usage scenarios for a few of my Enterprise Library extensions that make these routine tasks easier to read, maintain, and train.

 Posted by at 4:42 am
Jul 27 2010
 

In .NET 1.1, I tried the original MS Data Access Application Block's SqlHelper (you can still download it here). It was great for most of the common uses, but was lacking in some areas. The consuming code looked sloppy and encouraged blind faith that database objects never changed. It also didn't support transactions as I would have liked, and didn't support my obsession with custom entities. I started out writing an extension library that wrapped SqlHelper, but wrapping the ADO.NET wrapper (SqlHelper) felt very wrong. I ended up writing my own version of SqlHelper called SqlHelper (nice name, eh?). You see, at this time I was getting over a bad relationship with a series of ORM products that had a negative effect on my productivity. I decided to revolt with good ol' fashioned data access methods that have never let us down.

The only thing worse than my ORM experience was the disgusting over-use of DataSet and DataTable. For my dollar, DataReader is where it’s at. I agree that using the reader is slightly more dangerous in the hands of an inexperienced or inattentive developer (did you know you have to close the reader when you’re done with it?). Nothing can compare with the speed and flexibility of the reader, which is why DataSet and DataAdapter use it at their core. If you are working with custom entities, instead of DataSets and DataTables, you would be crazy to not use the DataReader.

My SqlHelper worked in conjunction with my DataAccessLayer class, which defined a few delegates that made reader-to-object mapping a simple task. Once the mapping methods were written to be used with the delegates, which returned object or System.Collections.CollectionBase because we did not yet have generics (can you imagine?), you simply called the SqlHelper to do all of the hard work. SqlHelper did not implement all of the craziness that the original version contained. It was a short 450 lines of code that did nothing but access data in a safe and reliable way. In the example below, we have the GenerateDocumentFromReader method that is used by the GenerateObjectFromReader delegate. When SqlHelper.ExecuteReaderCmd is called, the delegate is passed in to map the reader results to my object, in this case a Document.

// Object generation method
private static object GenerateDocumentFromReader(IDataReader returnData)
{
     Document document = new Document();
     if (returnData.Read())
     {
         document = new Document(
             (int)returnData["DocumentId"],
             (byte[])returnData["DocumentBinary"],
             returnData["FileName"].ToString(),
             returnData["Description"].ToString(),
             returnData["ContentType"].ToString(),
             (int)returnData["FileSize"],
             returnData["MD5Sum"].ToString(),
             (bool) returnData["EnabledInd"],
             (int)returnData["CreatorEmpId"],
             Convert.ToDateTime(returnData["CreateDt"]),
             (int)returnData["LastUpdateEmpId"],
             Convert.ToDateTime(returnData["LastUpdateDt"]));
     }
     return document;
}
public static Document GetDocumentByDocumentId(int documentId)
{
     SqlCommand sqlCmd = new SqlCommand();
     SqlHelper.SetCommandArguments(sqlCmd, CommandType.StoredProcedure, "usp_Document_GetDocumentByDocumentId");
     SqlHelper.AddParameterToSqlCommand(sqlCmd, "@DocumentId", SqlDbType.Int, 0, ParameterDirection.Input, documentId);
     DataAccessLayer.GenerateObjectFromReader gofr = new DataAccessLayer.GenerateObjectFromReader(GenerateDocumentFromReader);
     Document document = SqlHelper.ExecuteReaderCmd(sqlCmd, gofr) as Document;
     return document;
}
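For context, the delegate shape referenced above can be sketched like this. Only the mapping call is shown; the real SqlHelper also handled connection setup, command execution, and disposal:

```csharp
using System.Data;

// The pre-generics shape described above: the delegate returns object,
// and callers cast the result back to their entity type.
public class DataAccessLayer
{
    public delegate object GenerateObjectFromReader(IDataReader returnData);
}

public static class SqlHelperSketch
{
    // Sketch of ExecuteReaderCmd's final step: hand the open reader to
    // the mapping delegate and return whatever it builds.
    public static object ExecuteReaderCmd(IDataReader returnData, DataAccessLayer.GenerateObjectFromReader gofr)
    {
        return gofr(returnData);
    }
}
```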

This worked wonderfully for years. After converting, I couldn’t imagine a project that used ORM, DataSets, or DataTables again. I’ve been on many 1.1 projects since writing my SqlHelper in 2004, and I have successfully converted them all. In early 2006, MS graced us with .NET 2.0. Generics, System.Transactions, and partial classes changed my life. In my first few exposures to generics, like Vinay “the Generic Guy” Ahuja’s 2005 Jax Code Camp presentation and Juval “My Hero” Lowy’s MSDN article “An Introduction to Generics”, I listened/read and pondered the millions of uses of generics. I adapted my SqlHelper heavily to use these new technologies and morphed it into something else that closely represented the newest version of the DAAB, Enterprise Library 3.

By this point, I wanted to convert to Enterprise Library. It was far better than the simple SqlHelper. It had better transaction support, though I don't know if that included System.Transactions. I could have put my object generation extensions on top of it, and it would have worked well for years. On home projects I had already converted to EntLib. At work I was not so lucky. The deep stack trace when something went wrong scared everyone, and that is still a fear for those starting out in EntLib today. To ease the fears, I created my replacement for SqlHelper: the Database class.

I used a lot of the same naming conventions as Enterprise Library. In fact, much of the consuming code was nearly identical (except for the fact that it did not implement the provider pattern and worked only with SQL Server). This was in anticipation of a quick adoption of Enterprise Library 3 in the workplace. Kind of a “see? not so bad” move on my part. Just like EntLib, you created a Database class using the DatabaseFactory that used your default connection string key. Commands and parameters were created and added with methods off of the Database class. Aside from the SqlCommand/DbCommand, everything looked and felt the same, but came in a small file with only 490 lines of code instead of 5 or more projects with 490 files. Using it felt the same, too. Only my object/collection generation extensions looked different from the standard reader, scalar, dataset routines. Below is the same code from above using the Database class and related classes to create a Document from a reader.

// Object generation method
private static Document GenerateDocumentFromReader(IDataReader returnData)
{
     Document document = new Document();
     if (returnData.Read())
     {
         document = new Document(
             GetIntFromReader(returnData, "DocumentId"),
             GetIntFromReader(returnData, "DocumentTypeId"),
             GetStringFromReader(returnData, "DocumentTypeName"),
             GetByteArrayFromReader(returnData, "DocumentBinary"),
             GetStringFromReader(returnData, "FileName"),
             GetStringFromReader(returnData, "Description"),
             GetStringFromReader(returnData, "ContentType"),
             GetIntFromReader(returnData, "FileSize"),
             GetStringFromReader(returnData, "MD5Sum"),
             GetStringFromReader(returnData, "CreatorEmpID"),
             GetDateTimeFromReader(returnData, "CreateDt"),
             GetStringFromReader(returnData, "LastUpdateEmpID"),
             GetDateTimeFromReader(returnData, "LastUpdateDt"));
     }
     return document;
}
public static Document GetDocumentByDocumentId(int documentId)
{
     Database db = DatabaseFactory.CreateDatabase(AppSettings.ConnectionStringKey);
     SqlCommand sqlCmd = db.GetStoredProcCommand("usp_Document_GetDocumentByDocumentId");
     db.AddInParameter(sqlCmd, "DocumentId", SqlDbType.Int, documentId);
     GenerateObjectFromReader<Document> gofr = new GenerateObjectFromReader<Document>(GenerateDocumentFromReader);
     Document document = CreateObjectFromDatabase<Document>(db, sqlCmd, gofr);
     return document;
}

This, too, worked great for years. Other than a brief period in 2007 when I tried to wrap all of my data access code with WCF services, .NET 3.0 came and went with no changes to my data access methodology. In late 2007, I had lost all love of my SqlHelper and my Database/DataAccessLayer classes. With .NET 3.5 and Enterprise Library 4.0, I no longer felt the need to roll my own. .NET now had extension methods for me to extend Enterprise Library however I pleased. Enterprise Library supported System.Transactions making its use a dream if behind a WCF service that allowed transaction flow. With a succinct 190 lines of extension code, I had it made in the shade with Enterprise Library 4.0. In fact, I haven’t used anything since.

The consuming code was almost exactly the same. You’ll notice the SqlCommand has changed to DbCommand. The SqlDbType has changed to DbType. Other than that, it feels and works the same.

// Object generation method
private static Document GenerateDocumentFromReader(IDataReader returnData)
{
     Document document = new Document();
     if (returnData.Read())
     {
         document = new Document(
             returnData.GetInt32("DocumentId"),
             returnData.GetInt32("DocumentTypeId"),
             returnData.GetString("DocumentTypeName"),
             returnData.GetByteArray("DocumentBinary"),
             returnData.GetString("FileName"),
             returnData.GetString("Description"),
             returnData.GetString("ContentType"),
             returnData.GetInt32("FileSize"),
             returnData.GetString("MD5Sum"),
             returnData.GetString("CreatorEmpID"),
             returnData.GetDateTime("CreateDt"),
             returnData.GetString("LastUpdateEmpID"),
             returnData.GetDateTime("LastUpdateDt"));
     }
     return document;
}
public static Document GetDocumentByDocumentID(int documentId)
{
     Database db = DatabaseFactory.CreateDatabase();
     DbCommand cmd = db.GetStoredProcCommand("usp_Document_GetDocumentByDocumentId");
     db.AddInParameter(cmd, "DocumentID", DbType.Int32, documentId);
     GenerateObjectFromReader<Document> gofr = new GenerateObjectFromReader<Document>(GenerateDocumentFromReader);
     Document document = db.CreateObject<Document>(cmd, gofr);
     return document;
}

With a full suite of unit test projects available for download with the Enterprise Library source files, the fear should be abated for the remaining holdouts. Getting started is as easy as including two DLL references, and adding 5 lines of config. You can’t beat that!

I downloaded Enterprise Library 5 last week. I’ve been making use of new features such as result set mapping (eliminating the need for my object generation extensions), parameter mapping, and accessors that bring them all together. There’s a bunch of inversion of control features in place as well. I think I’ll be quite comfortable in my new EntLib5 home.

 Posted by at 3:32 am
Dec 07 2008
 

It’s so easy! Start downloading Enterprise Library 4.1 now while you read this. The data access application block syntax has not changed much since the first version. The most notable change was allowing us to use System.Data.Common.DbCommand when version 3.0 was released. I understand the uneasy feeling some developers have using Enterprise Library. My team at my previous employer decided not to use it, thinking it would add complexity and would not give us the flexibility we needed if we had to change something. This is typical of groups that do not have an established data access library.

Your Data Access Library should be one of the most highly tested libraries in your application. If there is a problem there, you will have issues everywhere. Enterprise Library not only comes with the source code, but also includes the full suite of unit tests for each of the application blocks. You should feel at ease when you decide to migrate to Enterprise Library. Run it through your full battery of tests before you commit the team to it. If you find any problems, check the forums, request changes/enhancements from the MS Patterns & Practices team, or fix it yourself.

The steps to achieve EntLib goodness:

  1. Download Enterprise Library
  2. Add reference to “Enterprise Library Data Access Application Block” and “Enterprise Library Shared Library”
  3. Change your app.config or web.config
  4. Write some much more readable data access code

I’ll start at step 3, as steps 1 and 2 are self-explanatory. Your connection string needs to be in your app’s config file, the machine.config file, or in a connectionStrings.config file referenced by those config files. You can start using it just by adding the <configSections> block and the <dataConfiguration> node. This will allow you to have one default database for all commands you will execute.

<?xml version="1.0" encoding="utf-8"?>
<configuration>
    <configSections>
        <section name="dataConfiguration" type="Microsoft.Practices.EnterpriseLibrary.Data.Configuration.DatabaseSettings, Microsoft.Practices.EnterpriseLibrary.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />
    </configSections>
    <dataConfiguration defaultDatabase="Testing" />
    <connectionStrings>
        <add name="Testing" connectionString="server=Server_Name;database=DB_Name;Integrated Security=true;"
             providerName="System.Data.SqlClient" />
    </connectionStrings>
</configuration>

 

By the time you get to step 4, you have all of the infrastructure in place. Painless so far; let’s see how steep the learning curve is.

With ADO.NET, you would write:

  116 string connectionString = ConfigurationManager.ConnectionStrings["Testing"].ConnectionString;
  117 using (SqlConnection con = new SqlConnection(connectionString))
  118 using (SqlCommand cmd = new SqlCommand("usp_ErrorLog_Insert", con))
  119 {
  120     cmd.CommandType = System.Data.CommandType.StoredProcedure;
  121     cmd.Parameters.AddWithValue("Message", "Testing 1");
  122     cmd.Parameters.AddWithValue("UserID", 5150);
  123     try
  124     {
  125         con.Open();
  126         cmd.ExecuteNonQuery();
  127     }
  128     finally
  129     {
  130         con.Close();
  131     }
  132 }

With Enterprise Library, you write:

  170 Database db = DatabaseFactory.CreateDatabase();
  171 DbCommand cmd = db.GetStoredProcCommand("usp_ErrorLog_Insert");
  172 db.AddInParameter(cmd, "Message", System.Data.DbType.String, "Testing 1");
  173 db.AddInParameter(cmd, "UserID", System.Data.DbType.Int32, 5150);
  174 db.ExecuteNonQuery(cmd);

 

Line 170 creates the database object. This is the hardest thing to get used to: everything goes through the Database object. In ADO.NET, we are used to creating a connection, adding the connection to a command, and using the command in an adapter. Here you’ll always use the Database object to create a command, add parameters to the command, execute the command, fill a DataSet, and so on. It is definitely less code to write, and it is also more readable and elegant.

If you have a database to execute commands against other than the defaultDatabase specified in the config file, then the first line changes to:

  170 Database db = DatabaseFactory.CreateDatabase("OtherConnectionStringKey");

 

That’s it. The patterns & practices team has really done a nice job making it painless to use Enterprise Library. Take the time to try it out again if you reviewed a previous version. I reviewed 2.0, and chose not to use it. When 3.0 came out, I was hooked.

 Posted by at 4:09 am