Although I have some personal thoughts on Edison and his tactics when it came to giving credit to those who assisted him (e.g. Nikola Tesla), the 1929 recording originally done on a pallophotophone and the work behind restoring it (see: gereports.com and retrothing.com) are still pretty impressive. It is not what I thought Thomas Edison would have sounded like, but of course he was 82 at the time.

Most of today’s business applications deal with relational databases, despite the apparent movement of many developers who feel we should abandon some of the most mature software technology in the industry and move to object databases. If you are using SQL Server as your relational database you may run into a scenario where you want to execute a parameterized SQL statement, but need to return the value of an identity column after it is generated. A simple way to do this is to use an output parameter and execute an extra statement after the insert to get the identity column using the SCOPE_IDENTITY function.

/// <summary>
/// Insert. Inserts a row into the data store and captures the generated identity value.
/// </summary>
/// <param name="dataSourceCommand">The command object</param>
/// <param name="item">The item to save</param>
private static void InsertRow(DbCommand dataSourceCommand, ItemClass item)
{
    // The batch inserts the row, then copies the generated identity
    // into the @sak output parameter using SCOPE_IDENTITY.
    const string SQL_INSERT =
@"INSERT INTO Table(name,lastChangeDate)
VALUES (@name,@lastChangeDate);
SET @sak = SCOPE_IDENTITY();";

    dataSourceCommand.Parameters.Clear();

    DbParameter nameParm = dataSourceCommand.CreateParameter();
    nameParm.ParameterName = "@name";
    nameParm.DbType = System.Data.DbType.String;
    nameParm.Value = item.Name;
    dataSourceCommand.Parameters.Add(nameParm);

    DbParameter lastChangeDateParm = dataSourceCommand.CreateParameter();
    lastChangeDateParm.ParameterName = "@lastChangeDate";
    lastChangeDateParm.DbType = System.Data.DbType.DateTime;
    lastChangeDateParm.Value = item.LastChangeDate;
    dataSourceCommand.Parameters.Add(lastChangeDateParm);

    // The output parameter receives the value assigned by SET @sak above.
    DbParameter identityParm = dataSourceCommand.CreateParameter();
    identityParm.ParameterName = "@sak";
    identityParm.DbType = System.Data.DbType.Int32;
    identityParm.Direction = System.Data.ParameterDirection.Output;
    dataSourceCommand.Parameters.Add(identityParm);

    dataSourceCommand.CommandText = SQL_INSERT;
    dataSourceCommand.ExecuteNonQuery();

    // SCOPE_IDENTITY returns a decimal, so convert rather than cast directly.
    item.Identity = Convert.ToInt32(identityParm.Value);
}

I went running this morning for the first time in a while. I brought along my phone, since my Zune doesn’t come with GPS. My phone has GPS, but doesn’t auto sync with my Zune software. Not to mention, the lack of basic functional engineering at HTC has produced phones without a full-size headphone jack. Aside from marketing, it’s no wonder Apple is crushing Microsoft in sales. I can only hope Windows Phone 7 engineering specs will fix a lot of this. Okay, no more digressing.

The real point of this was to write about SportyPal. It is exercise software that tracks where you went and, aside from the cars you passed, just about everything else about the trip. You can actually insert an embedded view of what you did, but the free version of WordPress apparently won’t let you do that, so here is a link to my SportyPal workout.

I watched a good bit of the PDC 2009 webcasts last night and found there are some interesting technologies coming out. I don’t think it comes as much of a surprise that Microsoft is cramming Azure and Silverlight down developers’ throats. There were also talks of IE 9 (don’t worry, they are just starting development) and Visual Studio 2010. In my opinion Visual Studio 2010 is a game changer in development if they can get the performance up and memory requirements down. The top feature, in my opinion, is the historical debugging capability in IntelliTrace, which allows a snapshot call stack to be taken in a different environment and then allows a developer to open that snapshot and step through the source code right on their desktop.

All that aside, there seems to be an increased focus on data this year and a multitude of code name and product name changes. I want to provide an overview of my understanding of the recent items that were discussed at PDC 2009 relating to data.

  • Windows Azure Tables and Blobs are a way to store data in the cloud. When that wasn’t enough Microsoft moved to Azure SQL Data Services. This technology allowed users to store data in SQL tables and was extremely scalable. Apparently the public doesn’t care about scale because they want relational. So Microsoft has renamed the offering to SQL Azure. The primary differences are that now you can actually use SQL Server Management Studio to connect using the Tabular Data Stream (TDS) protocol and you can use relational queries.
  • Oslo was a metadata framework for managing data and has now been boiled down to M and Quadrant and now falls under a larger umbrella of SQL Server Modeling Services. Honestly I can’t quite see the compelling argument for this yet unless you are a DSL (domain specific language) designer.
  • Astoria was a REST-based data access service. It was renamed to ADO.Net Data Services and recently changed to WCF Data Services. The new name change also introduced OData, which sounds like the new ADO.Net (yes, yet another data access technology).
  • Dallas is a new data service from Microsoft intended to create a marketplace for data, the new commodity (I guess this is Microsoft’s answer to Apple’s iTunes and App Store). There are some public data sets available already for free. It seems the intention here is to enable innovation using data.
  • Pivot from Microsoft labs was one of the more compelling things I saw. It is essentially a new Internet browser that has an extension for understanding another new data format called Collection XML (cxml). The viewer uses Silverlight and DeepZoom to visualize data.
  • LINQ to SQL was a compelling technology that abstracted developers from writing T-SQL code by using an ORM (object-relational mapper). However, it seems tying this technology to SQL Server was not popular with the real world and Microsoft has been strongly advising against using it. In the 4.0 version of the .Net framework Microsoft placed this technology within the ADO.Net Entity Framework (EF) and calls it “Direct Mapping”. The EF also supports mapping at a more abstract level using an Entity Data Model (EDM) and a raw schema, should that be desired.
Copyright © Scott P. Rudy 2009 All Rights Reserved

I see a cycle coming. I think Azure is a fairly compelling idea, but it makes me wonder what the company I work for will do going forward. We have obviously seen economies of scale work to a corporation’s advantage. So will large corporations like mine be better off negotiating cloud rates with Microsoft based on a certain amount of volume? If so what would this do to developer productivity?

Instead of developers simply being able to create a new environment, I can see these companies putting processes in place to provision the new environment in their configured way. Thus it would raise the maturity level, but reduce productivity. Imagine having an “Azure Development Workspace” catalog item that has to go through approval chains and get lots of people involved.

It may sound cynical, but it seems every time something comes out that is supposed to simplify life, we add process (as opposed to guidelines) for formal structure and thus make life more complicated and expensive.


Earlier this month I wrote about how you can mount a VHD using the command line utility DiskPart. This is an extremely useful utility, but sometimes you just want the simplicity of the Windows GUI. Well, in Windows 7 the Disk Management MMC snap-in makes this a breeze.

The easiest way to load the snap-in is to open Computer Management by right-clicking on the “Computer” link in the Windows start menu and choosing “Manage”. This action will open a whole host of utilities you can use to manage your computer, but for this exercise you will only need to use the Disk Management link located in the left pane under Computer Management\Storage. Once you click on Disk Management it may take Windows a few seconds to find all of your storage devices and display them in the right pane.

Once the storage devices have loaded you can click on the Action menu at the top of the window. You now have the “Attach VHD” menu option. Simply click that option and select the VHD file you want to attach.

Detaching the file is a little more confusing. To detach it you have to navigate to the disk in the bottom pane of the Disk Management utility. The leftmost column in that pane is a series of buttons with the disk name shown. Right-click on the disk name (e.g. Disk 2 in the image below) and select “Detach VHD”. Ensure you don’t check the delete checkbox, unless you don’t want any data on the VHD anymore.

[image: detaching a VHD in Disk Management]


I can almost say I told you so, as it looks like Nokia released a device that resembles a netbook/notebook hybrid running, yep, you guessed it, Microsoft Windows. The Booklet 3G is a fanless device that comes with pretty much everything you would need: keyboard, touchpad, monitor, Wi-Fi, mobile broadband (3G HSPA), assisted GPS, Bluetooth, web cam, card reader and HDMI out (for use with the HD ready capability). The best part, however, is the apparent 12 hours of battery life this thing is supposed to provide. The fact that it all fits neatly into a 2.75 pound device doesn’t hurt either.

The release date and asking price won’t be revealed until next week, but I have seen rumors of a price tag greater than 700 USD surfacing. If this is indeed the case I think Nokia may have trouble selling the device without using long term contracts with wireless providers to subsidize the cost. The device does come with a SIM card reader as well for connectivity to telecommunication networks.

The use of this device in the enterprise remains to be seen, but as with most netbooks, the absence of a TPM will leave corporate data at risk and more than likely stall adoption.


Today Microsoft and Nokia announced they would be forming a global alliance. I have to wonder what this could bring to the future of mobile computing for the enterprise. The general consumer seems to love the iPhone (or the marketing around it), but the restrictions placed on the device and service leave a lot to be desired for enterprise customers. In my opinion vendor lock-in has always been the biggest obstacle to more widespread adoption of Apple’s products.

Now I could consider Blackberry and Google Android to be the only other real competitors here, but I think the Nokia and Microsoft partnership opens up some bigger possibilities, like competition with HP, Asus, Acer and Lenovo (oh, I guess Dell is still around too).

Let’s face it, the features of a netbook are practically already built into the Nokia N97, albeit at a slightly larger price tag. If Nokia plays this right, they could be the only provider to have their own device and operating system that runs Microsoft Office at an affordable price. Why is this important? Well, Microsoft Office generally runs most of the Fortune 1000. Nokia could become a serious player in the enterprise market overnight.


If you are running the Business, Enterprise or Ultimate version of Windows Vista or Windows 7 you can use the Windows Backup and Restore Center to conduct full hard drive backups of your computer. All you need is enough storage media that can accommodate the used portion of your hard drive. While you could use CD or DVD storage, the simplicity of using a large external USB hard drive is much more convenient, especially with the size of some of the hard drives shipped out today.

This feature has been long overdue in the Windows world, but it now provides some great reassurance. Full backups simply place your files into VHD files. However, what happens when you back up your machine and then reinstall an OS from scratch? How do you get the files off of your VHD? Well, if you are running Vista you need to use a tool like VHDMount that ships inside Virtual Server 2005 R2 SP1, unless you have a Windows 7 install CD handy and you swap out the boot loader by looking at one of my previous posts.

You can’t install VHDMount on Windows 7, but you don’t need to either. The built-in DISKPART utility allows you to mount VHD files as a drive letter just like VHDMount. Simply locate your VHD file and use the following commands in a command prompt with elevated privileges.

Microsoft Windows [Version 6.1.7600]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.
C:\users\scottrudy\>DISKPART

Microsoft DiskPart version 6.1.7600
Copyright (c) 1999-2008 Microsoft Corporation.
On computer: SCOTTRUDY

DISKPART> SEL VDISK file="E:\…\1c44c446-91c0-11dd-8709-806e6f6e6963.vhd"
DiskPart successfully opened the virtual disk file.

DISKPART> ATTACH VDISK
DiskPart successfully attached the virtual disk file.

When you are finished simply type DETACH VDISK to unmount the file.
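If you find yourself doing this often, the same steps can be run non-interactively by saving the commands to a text file and passing it to DiskPart with the /S switch (the file name and VHD path below are just placeholders):

```
REM attach-vhd.txt -- run with: DISKPART /S attach-vhd.txt
SEL VDISK file="E:\backups\mybackup.vhd"
ATTACH VDISK
```

A matching script with DETACH VDISK in place of ATTACH VDISK will unmount it again.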


When applications display ASP.NET runtime error messages to the end user, the root cause is insufficient error handling within the application. To gain the most granular control over errors, ASP.NET page events should be wrapped in a try…catch block so that proper error handling can be performed. However, the runtime does not provide this by default, so it must be done manually. Any time something must be done manually, the potential exists for a human to forget. Fortunately there are other features provided by ASP.NET that can provide a higher level of control.

Note – While it is also possible to implement a custom IHttpModule to perform error handling, it is out of scope for this post.

The ASP.NET framework essentially adds three additional ways to perform error handling: page events, application events and custom error handling in the web.config file. The first of these three, page events, allows the developer to use the Page_Error event handler to process any unhandled exceptions by subscribing to the Error event inherited from the System.Web.UI.TemplateControl class. This can be done on each page to allow for fine-grained control over the error handling. However, as with any event, you must be sure to subscribe manually, or use the AutoEventWireup feature. Developers that use this method must also consider whether or not the error should be bubbled up to the application level. In order to prevent the propagation from happening the handler must make a call to Server.ClearError() once the exception is properly handled. Applications that use a custom base class for all pages can use this feature to catch all unhandled exceptions. However, since inheritance sometimes must be broken, ASP.NET provides a simpler way to do this.
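As a sketch of the page-level approach (the page class and redirect target are hypothetical names, not from a real application):

```csharp
// Code-behind for a hypothetical page; with AutoEventWireup="true"
// the runtime binds Page_Error to the Error event by name.
public partial class OrdersPage : System.Web.UI.Page
{
    protected void Page_Error(object sender, EventArgs e)
    {
        // Retrieve the unhandled exception for this request.
        Exception ex = Server.GetLastError();

        // Log or otherwise handle the error here.

        // Clearing the error stops it from bubbling up to
        // Application_Error and the <customErrors /> redirect.
        Server.ClearError();
        Response.Redirect("~/FriendlyError.aspx");
    }
}
```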

The second error handling method provided by the ASP.NET framework, application events, performs nearly the same function as the page event handler. The difference is that the Application_Error handler will catch any unhandled exceptions within an application’s scope. This handler must subscribe to the System.Web.HttpApplication Error event in the application’s global.asax file. Note that the AutoEventWireup and Server.ClearError() concerns from the Page_Error description above also apply to the Application_Error event handler. However, once the error is handled at the application level, it might be advisable to let the user know something happened so they can notify someone. If the Server.ClearError() method is not called then another feature of ASP.NET can be put to use.
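At the application level the handler lives in global.asax and might look like this (the logging call is a placeholder for whatever your application uses):

```csharp
// In global.asax; AutoEventWireup binds this handler by name.
protected void Application_Error(object sender, EventArgs e)
{
    // The last unhandled exception for the current request.
    Exception ex = Server.GetLastError();

    // Log the exception here so someone can be notified.

    // Deliberately NOT calling Server.ClearError() lets the
    // runtime fall through to the <customErrors /> configuration.
}
```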

The last handling method provided by the ASP.NET framework is the <customErrors /> element in the web.config file. This element allows an application to specify a default page to go to when there is an unhandled exception by using the defaultRedirect attribute. Simply set the value of the attribute to a relative or fully qualified URL that the application should use. In order for the application to use this page the value of the mode attribute must also be set to “On” or “RemoteOnly”. Setting the value to “RemoteOnly” allows a system administrator to log on to the machine where the error is occurring and allow the ASP.NET runtime error message to display in the browser. For development environments this value is often set to “Off” so that the developers can see the error messages that are occurring. This is extremely helpful when an application writes messages to the event log, but the developers don’t have the necessary privileges to view the event log. The customErrors element also allows a developer to tailor how various HTTP status codes are handled by nesting an <error /> element inside.
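Putting the attributes together, a web.config fragment might look like the following (the page names are hypothetical):

```xml
<configuration>
  <system.web>
    <!-- RemoteOnly shows full errors on the server console only. -->
    <customErrors mode="RemoteOnly" defaultRedirect="~/Error.aspx">
      <!-- Route a specific HTTP status code to its own page. -->
      <error statusCode="404" redirect="~/NotFound.aspx" />
    </customErrors>
  </system.web>
</configuration>
```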
