Data


Most of today’s business applications deal with relational databases, despite a movement among developers who feel we should abandon some of the most mature software technology in the industry and move to object databases. If you are using SQL Server as your relational database you may run into a scenario where you want to execute a parameterized SQL statement, but need to return the value of an identity column after it is generated. A simple way to do this is to use an output parameter and execute an extra statement after the insert that captures the identity value using the SCOPE_IDENTITY function.

/// <summary>
/// Insert. Insert a row in the Data Store.
/// </summary>
/// <param name="dataSourceCommand">The command object</param>
/// <param name="item">The item to save</param>
private static void InsertRow(DbCommand dataSourceCommand, ItemClass item)
{
    // SET @sak=SCOPE_IDENTITY() copies the generated identity value
    // into the output parameter once the insert completes.
    const string SQL_INSERT =
        @"INSERT INTO Table(name,lastChangeDate)
          VALUES (@name,@lastChangeDate) SET @sak=SCOPE_IDENTITY()";

    dataSourceCommand.Parameters.Clear();

    string nameParameterName = string.Format("@{0}", _nameColumnName);
    DbParameter nameParm = dataSourceCommand.CreateParameter();
    nameParm.ParameterName = nameParameterName;
    nameParm.DbType = System.Data.DbType.String;
    nameParm.Value = item.Name;
    dataSourceCommand.Parameters.Add(nameParm);

    string lastChangeDateParameterName =
        string.Format("@{0}", _lastChangeDateColumnName);
    DbParameter lastChangeDateParm = dataSourceCommand.CreateParameter();
    lastChangeDateParm.ParameterName = lastChangeDateParameterName;
    lastChangeDateParm.DbType = System.Data.DbType.DateTime;
    lastChangeDateParm.Value = item.LastChangeDate;
    dataSourceCommand.Parameters.Add(lastChangeDateParm);

    // The identity parameter is declared as Output so ADO.NET reads its
    // value back from the server after execution.
    string identityParameterName = string.Format("@{0}", _identityName);
    DbParameter identityParm = dataSourceCommand.CreateParameter();
    identityParm.ParameterName = identityParameterName;
    identityParm.DbType = System.Data.DbType.Int32;
    identityParm.Direction = System.Data.ParameterDirection.Output;
    dataSourceCommand.Parameters.Add(identityParm);

    dataSourceCommand.CommandText = SQL_INSERT;

    dataSourceCommand.ExecuteNonQuery();
    item.Identity = (int)identityParm.Value;
}
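Calling the method above only requires a live DbCommand whose parameter names line up with the statement. Here is a minimal, hypothetical usage sketch; the connection string, the `_nameColumnName`/`_lastChangeDateColumnName`/`_identityName` field values ("name", "lastChangeDate", "sak"), and the `ItemClass` shape are all assumptions, not part of the original post:

```csharp
using System;
using System.Data.Common;
using System.Data.SqlClient;

// Assumed connection string — replace with your own.
string connectionString = @"Server=.;Database=Sample;Integrated Security=true";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (DbCommand command = connection.CreateCommand())
    {
        // Hypothetical item; ItemClass is assumed to expose
        // Name, LastChangeDate, and a writable Identity property.
        var item = new ItemClass
        {
            Name = "example",
            LastChangeDate = DateTime.Now
        };

        InsertRow(command, item);

        // After the call, item.Identity holds the value SQL Server
        // generated for the identity column.
        Console.WriteLine(item.Identity);
    }
}
```

Note that SCOPE_IDENTITY is scoped to the current batch, so this pattern stays correct even if triggers on the table perform their own inserts.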


I watched a good bit of the PDC 2009 webcasts last night and found there are some interesting technologies coming out. I don’t think it comes as much of a surprise that Microsoft is cramming Azure and Silverlight down developers’ throats. There were also talks of IE 9 (don’t worry, they are just starting development) and Visual Studio 2010. In my opinion Visual Studio 2010 is a game changer in development if they can get the performance up and memory requirements down. The top feature, in my opinion, is the historical debugging capability in IntelliTrace, which allows a snapshot of the call stack to be taken in a different environment and then allows a developer to open that snapshot and step through the source code right on their desktop.

All that aside, there seems to be an increased focus on data this year, along with a multitude of code-name and product-name changes. I want to provide an overview of my understanding of the recent items discussed at PDC 2009 relating to data.

  • Windows Azure Tables and Blobs are a way to store data in the cloud. When that wasn’t enough Microsoft moved to Azure SQL Data Services. This technology allowed users to store data in SQL tables and was extremely scalable. Apparently the public doesn’t care about scale because they want relational. So Microsoft has renamed the offering to SQL Azure. The primary differences are that now you can actually use SQL Server Management Studio to connect using the Tabular Data Stream (TDS) protocol and you can use relational queries.
  • Oslo was a metadata framework for managing data and has now been boiled down to M and Quadrant and now falls under a larger umbrella of SQL Server Modeling Services. Honestly I can’t quite see the compelling argument for this yet unless you are a DSL (domain specific language) designer.
  • Astoria was a REST-based data access service. It was renamed to ADO.Net Data Services and recently changed to WCF Data Services. The new name change also introduced OData, which sounds like the new ADO.Net (yes, yet another data access technology).
  • Dallas is a new data service from Microsoft intended to create a market place for data, the new commodity (I guess this is Microsoft’s answer to Apple’s iTunes and App Store). There are some public data sets available already for free. It seems the intention here is to enable innovation using data.
  • Pivot from Microsoft labs was one of the more compelling things I saw. It is essentially a new Internet browser that has an extension for understanding another new data format called Collection XML (cxml). The viewer uses Silverlight and DeepZoom to visualize data.
  • LINQ to SQL was a compelling technology that abstracted developers from writing T-SQL code by using an ORM (object-relational mapper). However, it seems tying this technology to SQL Server was not popular with the real world, and Microsoft has been strongly advising against using it. In the 4.0 version of the .Net Framework Microsoft placed this technology within the ADO.Net Entity Framework (EF) and calls it “Direct Mapping”. The EF also supports mapping at a more abstract level using an Entity Data Model (EDM) and a raw schema, should that be desired.
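To make the ORM point above concrete, here is a small, hypothetical LINQ to SQL sketch. The `NorthwindDataContext` class and the `Orders` table are assumed examples of a designer-generated context, not something from the talks themselves:

```csharp
using System;
using System.Linq;

// Assumes a designer-generated DataContext named NorthwindDataContext
// with an Orders table mapped to entity objects.
using (var db = new NorthwindDataContext())
{
    // The query is expressed in C#; the ORM translates it to T-SQL,
    // so the developer never writes the SQL by hand.
    var recentOrders = from o in db.Orders
                       where o.OrderDate > DateTime.Now.AddDays(-30)
                       orderby o.OrderDate descending
                       select o;

    foreach (var order in recentOrders)
        Console.WriteLine(order.OrderID);
}
```

The same query shape carries over to Entity Framework’s “Direct Mapping” style, which is part of why the migration path Microsoft recommends is relatively painless.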
Copyright © Scott P. Rudy 2009 All Rights Reserved