Blog

A catalogue of my discoveries in software development and related subjects that I think might be of use or interest to everyone else, or to me when I forget what I did!

SQL ARITHABORT Plan Problems

June 07, 2012

In SQL Server Management Studio the default value of the Query Execution 'ARITHABORT' setting is ON. In ADO.NET the default is OFF. This can cause problems when trying to optimise your slow-running queries, as the execution plans used by .NET and SSMS can be different. This can show up as very different durations for the RPC:Completed and SP:Completed events in SQL Profiler, and it can cause your ADO.NET based queries to run slower if the plan was optimised for the SSMS based query. To avoid this problem, I would recommend setting the option to OFF in SSMS: Tools > Options > Query Execution > SQL Server > Advanced. If you have already come across this problem, it may also be worth clearing your plan cache after changing the setting:
DBCC FREEPROCCACHE
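Alternatively, if you would rather leave SSMS alone, you can align the ADO.NET side instead by issuing SET ARITHABORT ON on the connection before running the query. A minimal sketch of that approach (the connection string and query text here are placeholders):
using System.Data.SqlClient;

public static class ArithAbortHelper
{
    //run a command with ARITHABORT aligned to the SSMS default, so both sessions can share the same plan
    public static void ExecuteWithArithAbortOn(string connectionString, string sql)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            using (SqlCommand setOption = new SqlCommand("SET ARITHABORT ON;", conn))
            {
                setOption.ExecuteNonQuery();
            }

            using (SqlCommand cmd = new SqlCommand(sql, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}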
Permalink: SQL ARITHABORT Plan Problems

Remote Import with Progress using WCF

April 28, 2012

Following on from my chunked file uploader, I needed a way to initiate a server-side import of a file and track the progress on the Silverlight client side. To do this, I first wrote my import logic, which is performed on the server. The important factor with the import logic is that it raises an event whenever a line of data has been imported. I then wrote a WCF service wrapper for this, which uses a separate thread to perform the import so that the service call can return immediately after starting it. The service comprises three functions: begin the import, check the progress, and get the results. The service interface and implementation are shown below:
[ServiceContract]
    public interface IExampleImporter
    {
       
        [OperationContract]
        void BeginImportUploadedExampleFile(string uploadFileToken, bool hasHeaderRow);

        [OperationContract]
        bool CheckImportUploadedExampleFileComplete(string uploadFileToken, out int rowsImported);

        [OperationContract]
        Response<ImportResult> GetImportUploadedExampleFileResults(string uploadFileToken);

    }
public partial class ImportService : IExampleImporter
    {
        //create some static dictionaries for tracking the import progress and results across service calls
        private static Dictionary<string, int> _fileRowsImported = new Dictionary<string,int>();
        private static Dictionary<string, Response<ImportResult>> _fileResults = new Dictionary<string, Response<ImportResult>>();

        //we will need a class to define the params to the import thread
        private class ImportUploadedExampleFileThreadParams
        {
            public HttpContext HttpContext { get; set; }

            public string UploadFileToken { get; set; }
            public bool HasHeaderRow { get; set; }
        }
               
        //this is the service endpoint to begin a new import
        public void BeginImportUploadedExampleFile(string uploadFileToken, bool hasHeaderRow)
        {
            //you can only import a file which isn't already being imported
            //(lock the shared progress dictionary, as the other service methods do)
            lock (_fileRowsImported)
            {
                if (_fileResults.ContainsKey(uploadFileToken) || _fileRowsImported.ContainsKey(uploadFileToken))
                    throw new Exception("This file is already being imported");

                //initialise a place in the rows imported list
                _fileRowsImported.Add(uploadFileToken, 0);
            }

            //we will run the import in another thread to avoid service timeouts - the client can then poll 'CheckImportUploadedExampleFileComplete'
            System.Threading.Thread importThread = new System.Threading.Thread(new System.Threading.ParameterizedThreadStart(ImportUploadedExampleFileThreadEntry));
            importThread.IsBackground = true;

            importThread.Start(new ImportUploadedExampleFileThreadParams()
            {
                HttpContext = HttpContext.Current,
                
                UploadFileToken = uploadFileToken,
                HasHeaderRow = hasHeaderRow
                
            });

        }

        public bool CheckImportUploadedExampleFileComplete(string uploadFileToken, out int rowsImported)
        {
            //does this import exist?
            lock (_fileRowsImported)
            {
                if (!_fileRowsImported.ContainsKey(uploadFileToken))
                    throw new Exception("No current import found for the given token");

                rowsImported = _fileRowsImported[uploadFileToken];
            }

            //is it complete?
            return _fileResults.ContainsKey(uploadFileToken);
            
        }

        public Response<ImportResult> GetImportUploadedExampleFileResults(string uploadFileToken)
        {
            Response<ImportResult> res;

            //does this import exist and complete?
            lock (_fileResults)
            {
                if (!_fileResults.ContainsKey(uploadFileToken))
                    throw new Exception("Import with given token does not exist or was not yet complete");

                res = _fileResults[uploadFileToken];
                
                //now remove the result
                _fileResults.Remove(uploadFileToken);
            }

            //also remove the progress since this import is now done
            lock (_fileRowsImported)
            {
                _fileRowsImported.Remove(uploadFileToken);
            }

            return res;
        }

        private void ImportUploadedExampleFileThreadEntry(object threadParam)
        {
            //parse the thread params
            ImportUploadedExampleFileThreadParams p = (ImportUploadedExampleFileThreadParams)threadParam;

            Response<ImportResult> importResponse = new Response<ImportResult>();

            //ensure the file exists
            string fullFilePath = p.HttpContext.Server.MapPath("UploadFiles\\" + p.UploadFileToken);

            if (System.IO.File.Exists(fullFilePath))
            {
                 //create an instance of the import helper to perform the import
                 ExampleImporterHelper helper = new ExampleImporterHelper();
                 helper.RowImported += new Action<string, bool>(helper_RowImported);

                 //invoke the importer and get the result
                 importResponse.Item = helper.Import(fullFilePath, p.HasHeaderRow);
                 importResponse.IsSuccess = true;

            }
            else
            {
                importResponse.IsSuccess = false;
                importResponse.MessageKey = "File not found";
            }

            lock (_fileResults)
            {
                _fileResults.Add(p.UploadFileToken, importResponse);
            }
        }

        void helper_RowImported(string filename, bool successfulRow)
        {
            //parse the token from the filename
            string token = System.IO.Path.GetFileName(filename);

            //update the rows imported
            lock (_fileRowsImported)
            {
                if (_fileRowsImported.ContainsKey(token))
                    _fileRowsImported[token] += 1;
            }

        }

    }
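For reference, the ExampleImporterHelper used inside the import thread is not shown in this post; a minimal sketch of its shape, based purely on how the service calls it (the per-line import logic is just a placeholder), might be:
public class ExampleImporterHelper
    {
        //raised once per line processed: (filename, whether the row imported successfully)
        public event Action<string, bool> RowImported;

        public ImportResult Import(string fullFilePath, bool hasHeaderRow)
        {
            ImportResult result = new ImportResult();

            string[] lines = System.IO.File.ReadAllLines(fullFilePath);

            for (int i = hasHeaderRow ? 1 : 0; i < lines.Length; i++)
            {
                try
                {
                    //the real per-line import logic (parsing, validation, saving) would go here

                    result.SuccessCount++;

                    if (RowImported != null)
                        RowImported(fullFilePath, true);
                }
                catch (Exception ex)
                {
                    result.FailCount++;
                    result.Exceptions.Add(new ImportException(i + 1, ex));

                    if (RowImported != null)
                        RowImported(fullFilePath, false);
                }
            }

            return result;
        }
    }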
The way this works is: after a file has been uploaded to the server, the client calls 'BeginImport..' passing the filename/token, which initiates the import. The service creates a new thread which runs the actual import code, passing in the required parameters. The service maintains two dictionaries: one which stores the current number of lines imported per filename (by counting the RowImported events) and one which stores the eventual 'ImportResult', which exists once the import is complete. Calling the 'is complete' function returns true or false, and additionally returns the number of lines imported through the 'out' parameter. This enables the client to determine whether an upload has finished and, if not, how many lines are complete so far - which can be reflected in the UI based on the total number of lines. My particular example relies on the following data contracts to transmit the results:
[DataContract]
    public class ImportResult
    {
        [DataMember]
        public int SuccessCount { get; set; }

        [DataMember]
        public int FailCount { get; set; }

        private List<ImportException> _Exceptions = new List<ImportException>();

        [DataMember]
        public List<ImportException> Exceptions
        {
            get { return _Exceptions; }
            set { _Exceptions = value; }
        }


    }
    [DataContract]
    public class ImportException
    {
        [DataMember]
        public int LineNumber { get; set; }

        public Exception Exception { get; set; }

        [DataMember]
        public string ErrorMessage
        {
            get
            {
                return Exception.Message;
            }
            private set { }
        }

        public ImportException(int lineNumber, Exception ex)
        {
            this.LineNumber = lineNumber;
            this.Exception = ex;
        }
    }
An example of calling the service in Silverlight (using an MVVM view model) is as follows:
public void ImportUploadedFile(string serverToken)
        {

            ImportProgressText = string.Format("[{0}] Starting Import. (this may take a while)\r\n", DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"));

            //in part 4 we actually import the file and get the results of the import
            IsImporting = true;

            //Start the Example importer
            ServiceInerfaceClient svc = ServiceUtility.GetExampleClient();
            svc.BeginImportUploadedExampleFileCompleted += new EventHandler<AsyncCompletedEventArgs>(svc_BeginImportUploadedExampleFileCompleted);
            svc.BeginImportUploadedExampleFileAsync(serverToken, HasHeaderRow);
        }

        void svc_BeginImportUploadedExampleFileCompleted(object sender, AsyncCompletedEventArgs e)
        {
            ((ServiceInerfaceClient)sender).BeginImportUploadedExampleFileCompleted -= svc_BeginImportUploadedExampleFileCompleted;

            //check no errors launching import
            if (e.Error == null)
            {
                //now that the import has successfully started we can begin polling for the results
                PollForImportResults();
            }
            else
            {
                IsImporting = false;
                ShowBusy(false);

                if (e.Error != null)
                    ImportProgressText += string.Format("[{0}] ERROR: {1}", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"), e.Error.Message);
                else
                    ImportProgressText += string.Format("[{0}] ERROR: Unknown Error Importing File", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"));
            }
        }

        private void PollForImportResults()
        {
            //set up a client and the event handler
            ServiceInerfaceClient svc = ServiceUtility.GetExampleClient();
            svc.CheckImportUploadedExampleFileCompleteCompleted += new EventHandler<CheckImportUploadedExampleFileCompleteCompletedEventArgs>(svc_CheckImportUploadedExampleFileCompleteCompleted);

            //start the recursion on another thread (prevent freezing the UI)
            System.Threading.Thread pollThread = new System.Threading.Thread(new System.Threading.ParameterizedThreadStart(PollForImportResults_Recursion));
            pollThread.IsBackground = true;
            pollThread.Start(svc);

        }

        private void PollForImportResults_Recursion(object ServiceInerfaceClientInstance)
        {
            //wait 5 secs
            System.Threading.Thread.Sleep(5000);

            //call the service and wait for the poll result
            ServiceInerfaceClient svc = (ServiceInerfaceClient)ServiceInerfaceClientInstance;
            svc.CheckImportUploadedExampleFileCompleteAsync(SelectedFile.ServerToken);
        }

        void svc_CheckImportUploadedExampleFileCompleteCompleted(object sender, CheckImportUploadedExampleFileCompleteCompletedEventArgs e)
        {
            //make sure no errors reading polling service
            if (e.Error == null)
            {
                //update the rows imported counter
                //update the UI via the dispatcher thread
                Deployment.Current.Dispatcher.BeginInvoke(new Action<ExampleAdminViewModel>((vm) =>
                    {
                        vm.RowsProcessed = e.rowsImported;
                    }), this);

                //was the import complete?
                if (e.Result == true)
                {
                    //my job now done, remove the handler
                    ((ServiceInerfaceClient)sender).CheckImportUploadedExampleFileCompleteCompleted -= svc_CheckImportUploadedExampleFileCompleteCompleted;

                    //update the UI via the dispatcher thread
                    Deployment.Current.Dispatcher.BeginInvoke(new Action<ExampleAdminViewModel>((vm) =>
                        {
                            vm.ImportProgressText += string.Format("[{0}] Import Complete, retrieving results.\r\n", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"));
                        }), this);

                    //finish the import to get the results
                    GetImportResults();
                }
                else
                {
                    //still not finished, continue with the recursion
                    PollForImportResults_Recursion((ServiceInerfaceClient)sender);
                }
            }
            else
            {
                //display errors using dispatcher (UI) thread
                Deployment.Current.Dispatcher.BeginInvoke(new Action<ExampleAdminViewModel, CheckImportUploadedExampleFileCompleteCompletedEventArgs>((vm, UIe) =>
                {
                    vm.IsImporting = false;
                    vm.ShowBusy(false);

                    if (UIe.Error != null)
                        vm.ImportProgressText += string.Format("[{0}] ERROR: {1}", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"), UIe.Error.Message);
                    else
                        vm.ImportProgressText += string.Format("[{0}] ERROR: Unknown Error Importing File", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"));
                }), this, e);
               
            }
        }

        private void GetImportResults()
        {
            ServiceInerfaceClient svc = ServiceUtility.GetExampleClient();

            svc.GetImportUploadedExampleFileResultsCompleted += new EventHandler<GetImportUploadedExampleFileResultsCompletedEventArgs>(svc_GetImportUploadedExampleFileResultsCompleted);
            svc.GetImportUploadedExampleFileResultsAsync(SelectedFile.ServerToken);
        }

        void svc_GetImportUploadedExampleFileResultsCompleted(object sender, GetImportUploadedExampleFileResultsCompletedEventArgs e)
        {
            
            if (e.Error == null && e.Result != null && e.Result.IsSuccess)
            {
                ImportResult res = e.Result.Item;

                //file has finished importing, calculate a string to represent the results
                string importResultsText = string.Format(@"[{0}] Results:

----------------------------------------
Successful rows: {1}
Failed rows: {2}

Failure breakdown:
---------------------------------------

",
                                System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"),
                                res.SuccessCount,
                                res.FailCount);

                foreach (ImportException iEx in res.Exceptions)
                {
                    importResultsText += string.Format("Line: {0} - {1}\r\n", iEx.LineNumber, iEx.ErrorMessage);
                }

                //display results using dispatcher (UI) thread
                Deployment.Current.Dispatcher.BeginInvoke(new Action<ExampleAdminViewModel>((vm) =>
                {
                    vm.ImportProgressText += importResultsText;
                    vm.IsImporting = false;
                    vm.ShowBusy(false);
                }), this);
            }
            else
            {

                //display errors using dispatcher (UI) thread
                Deployment.Current.Dispatcher.BeginInvoke(new Action<ExampleAdminViewModel, GetImportUploadedExampleFileResultsCompletedEventArgs>((vm, UIe) =>
                {
                    vm.IsImporting = false;
                    vm.ShowBusy(false);

                    if (UIe.Result != null)
                        vm.ImportProgressText += string.Format("[{0}] ERROR: {1}", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"), UIe.Result.MessageKey);
                    else if (UIe.Error != null)
                        vm.ImportProgressText += string.Format("[{0}] ERROR: {1}", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"), UIe.Error.Message);
                    else
                        vm.ImportProgressText += string.Format("[{0}] ERROR: Unknown Error Importing File", System.DateTime.Now.ToString("dd-MMM-yyyy hh:mm:ss"));
                }), this, e);

               
            }

        }
This again uses a background thread to perform the polling of the server, so that calls to Thread.Sleep can be made on the polling thread without freezing the UI. I have used 'BeginInvoke' on the 'Dispatcher' when setting view model properties, as this indirectly updates the UI (through the bindings) and so must take place on the main UI thread.
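One thing not shown above is the Response/Response<T> wrapper returned by the services (it is also used by the chunked uploader below). A minimal sketch, with the property names inferred from the calling code, could be:
[DataContract]
    public class Response
    {
        [DataMember]
        public bool IsSuccess { get; set; }

        [DataMember]
        public string MessageKey { get; set; }
    }

    [DataContract]
    public class Response<T> : Response
    {
        [DataMember]
        public T Item { get; set; }
    }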
Permalink: Remote Import with Progress using WCF

Chunked File Uploader in Silverlight and WCF

April 27, 2012

In order to upload a file from a Silverlight application to your web server you generally provide a WCF service that accepts the file stream. However, accepting a large file as a single chunk can sometimes lead to problems with service timeouts, request size restrictions etc., so I took the time to write a chunked file uploader service, which allows you to send the file in several parts. This works much like a normal file stream: you open a target file in exchange for a token/handle, write to the file using that token/handle, and then close the file when you are finished. In addition, because we are dealing with async operations over HTTP, we have to ensure the file parts are committed in the order they were intended to be sent (i.e. the chunks must be sequenced). Firstly, the interface for IChunkedFileUpload is as follows:
[ServiceContract]
    public interface IChunkedFileUpload
    {
        [OperationContract]
        string OpenChunkedUploadFile();

        [OperationContract]
        Response WriteToChunkedUploadFile(string fileToken, int seq, byte[] data);

        [OperationContract]
        Response CloseChunkedUploadFile(string fileToken, int totalPartsSent);
    }
The 'Response' class simply wraps two properties { bool, string } for the success flag and any error message. The implementation of this interface in my WCF layer looks as follows:
public class FileUploadService : IChunkedFileUpload
    {
        /* In order to allow the chunking of upload data in Silverlight we will write a light sequenced data wrapper */
        private class FilePart
        {
            public int Seq { get; set; }
            public byte[] Data { get; set; }
        }
        private static Dictionary<string, List<FilePart>> _FileParts = new Dictionary<string, List<FilePart>>();

        public string OpenChunkedUploadFile()
        {
            string fileToken = "";


                //in case of simultaneous calls, lock the static variable
            lock (_FileParts)
            {
                //find a unique name for the file
                while (fileToken == "" || _FileParts.ContainsKey(fileToken))
                {
                    fileToken = DateTime.Now.Ticks.ToString() + ".ccf";
                }

                //create the "file"
                _FileParts.Add(fileToken, new List<FilePart>());
            }

            //return the token (filename)
            return fileToken;
        }

        public Response CloseChunkedUploadFile(string fileToken, int totalPartsSent)
        {

            try
            {

                //check it's a valid token, otherwise it's 'file not found'
                if (_FileParts.ContainsKey(fileToken))
                {
                    //check the number of parts received matches the 'total parts' count passed in by the client (to ensure all parts were received, including the final part)
                    if (_FileParts[fileToken].Count == totalPartsSent)
                    {
                        List<byte> fileData = new List<byte>();

                        //reconstruct the file data from the file parts,
                        //ensuring the sequences are valid (no gaps, duplicates etc.)
                        for (int i = 0; i <= _FileParts[fileToken].Max(p => p.Seq); i++)
                        {
                            IEnumerable<FilePart> seqPart = _FileParts[fileToken].Where(p => p.Seq == i);

                            //there should be exactly 1 match
                            if (seqPart.Count() == 1)
                                fileData.AddRange(seqPart.First().Data);
                            else
                                throw new System.IO.InvalidDataException("Data was invalid, not all sequences were present or were duplicated");
                        }

                        //if we reach this point then the total number of parts was correct and the sequences were correctly sent/received
                        //we can now write the bytes to disk
                        string fullFilePath = HttpContext.Current.Server.MapPath("UploadFiles\\" + fileToken);
                        System.IO.File.WriteAllBytes(fullFilePath, fileData.ToArray());

                        //and now kill the version in memory
                        _FileParts[fileToken].Clear();
                        _FileParts.Remove(fileToken);

                        //success!
                        return new Response() { IsSuccess = true };
                    }
                    else
                    {
                        throw new System.IO.InvalidDataException("Data was invalid, total parts check failed.");
                    }
                }
                else
                {
                    throw new System.IO.FileNotFoundException();
                }
            }
            catch (Exception ex)
            {
                return new Response<string>() { IsSuccess = false, MessageKey = ex.Message };
            }
        }

        public Response WriteToChunkedUploadFile(string fileToken, int seq, byte[] data)
        {
            try
            {
                //check it's a valid token, otherwise it's 'file not found'
                if (_FileParts.ContainsKey(fileToken))
                {
                    _FileParts[fileToken].Add(new FilePart() { Data = data, Seq = seq });
                    return new Response() { IsSuccess = true };
                }
                else
                {
                    throw new System.IO.FileNotFoundException();
                }
            }
            catch (Exception ex)
            {
                return new Response() { IsSuccess = false, MessageKey = ex.Message };
            }
        }

    }
Basically, what this does is create a shared 'file system' on the server, backed by a Dictionary<string, List<FilePart>> - the token/handle and the file parts received. In Silverlight, or any other client, you call 'OpenChunkedUploadFile' to receive a unique token which you will use to send your file parts. This initialises a new item in the dictionary of files. The token will also be used as the final filename, so that you can calculate the remote URL of the eventual file. You then loop through your target file bytes, sending chunks of your desired size to the 'WriteToChunkedUploadFile' function. You mark each file part with a sequence number to ensure the file is rebuilt in the correct order. At this point all the service is doing is adding to the List for the current file in the Dictionary. When all of your bytes are sent, you call 'CloseChunkedUploadFile' passing your token and the count of total file parts (to ensure all parts were received). After performing some checks on the file parts (e.g. number of parts, sequence contiguity) the parts are arranged in sequence order to recreate the full file bytes, which are then committed to the physical disk on the server (using a pre-defined folder and the token as the file name). I have written an upload helper class which shows this:
public class ChunkedFileUploadHelper
    {
        #region "Fields"
        private int _FileChunkSize;
        private int _fileUploadSequence = 0;
        private int _fileUploadPartSuccessCount = 0;

        private ChunkedUploadFileInfo _fileInfo;
        #endregion

        #region "Events"
        public delegate void UploadAsyncCompletedEventHandler(ChunkedFileUploadHelper sender, string serverFileToken);
        public event UploadAsyncCompletedEventHandler UploadAsyncCompleted;
        protected void OnUploadAsyncCompleted()
        {
            if (UploadAsyncCompleted != null)
                UploadAsyncCompleted(this, _fileInfo.ServerToken);
        }

        #endregion

        /// <summary>
        /// Create an instance of the helper to be used with the provided file data instance
        /// </summary>
        /// <param name="fileInfo">The instance of chunked file data to be used</param>
        public ChunkedFileUploadHelper(ChunkedUploadFileInfo fileInfo) : this(fileInfo, 512000) //default chunk size 512KB
        { }

        /// <summary>
        /// Create an instance of the helper to be used with the provided file data instance, specifying a custom chunk size
        /// </summary>
        /// <param name="fileInfo">The instance of chunked file data to be used</param>
        /// <param name="chunkSize">Custom chunk size, default is 512KB</param>
        public ChunkedFileUploadHelper(ChunkedUploadFileInfo fileInfo, int chunkSize)
        {
            if (fileInfo == null)
                throw new ArgumentNullException("fileInfo", "fileInfo cannot be NULL");

            _fileInfo = fileInfo;
            _FileChunkSize = chunkSize;
        }

        /// <summary>
        /// Asynchronously uploads a file using the chunked service and calls the callback passing in the server file token
        /// </summary>
        public void UploadAsync()
        {
            if (_fileInfo != null && _fileInfo.FileData.Length > 0)
            {
                //reset the file upload data
                _fileUploadSequence = 0;
                _fileUploadPartSuccessCount = 0;
                _fileInfo.BytesUploaded = 0;

                //begin with making the initial service call
                ServiceInerfaceClient svc = ServiceUtility.GetChunkedFileUploadClient();

                //first thing to do is to create a new empty file on the server
                svc.OpenChunkedUploadFileCompleted += new EventHandler<OpenChunkedUploadFileCompletedEventArgs>(svc_OpenChunkedUploadFileCompleted);
                svc.OpenChunkedUploadFileAsync();
            }
            else
            {
                throw new InvalidOperationException("Cannot start upload as there was no file data to upload");
            }
        }

        void svc_OpenChunkedUploadFileCompleted(object sender, OpenChunkedUploadFileCompletedEventArgs e)
        {
            ((ServiceInerfaceClient)sender).OpenChunkedUploadFileCompleted -= svc_OpenChunkedUploadFileCompleted;

            if (e.Error == null && !string.IsNullOrEmpty(e.Result))
            {
                //we have been given a filename/token
                _fileInfo.ServerToken = e.Result;

                //now can move on to part 2
                Upload_part2();
            }
            else
            {
                if (e.Error != null)
                    throw e.Error;
                else
                    throw new Exception("Did not receive valid token from the server");
            }
        }

        private void Upload_part2()
        {
            //create a service client and add the handle for the part uploaded event
            ServiceInerfaceClient svc = ServiceUtility.GetChunkedFileUploadClient();
            svc.WriteToChunkedUploadFileCompleted += new EventHandler<WriteToChunkedUploadFileCompletedEventArgs>(svc_WriteToChunkedUploadFileCompleted);

            //start the recursion
            Upload_part2_recursion(svc);
        }



        private void Upload_part2_recursion(ServiceInerfaceClient svc)
        {
            //now that we have a file name we can send chunks of data
            byte[] bytesToSend = _fileInfo.FileData.Skip(_fileUploadSequence * _FileChunkSize).Take(_FileChunkSize).ToArray();

            if (bytesToSend.Length > 0)
            {
                svc.WriteToChunkedUploadFileAsync(_fileInfo.ServerToken, _fileUploadSequence, bytesToSend, bytesToSend.Length);
                _fileUploadSequence++;
            }
        }

        void svc_WriteToChunkedUploadFileCompleted(object sender, WriteToChunkedUploadFileCompletedEventArgs e)
        {
            if (e.Error == null && e.Result != null && e.Result.IsSuccess)
            {
                //bytes were successfully uploaded
                _fileInfo.BytesUploaded += (int)e.UserState;
                _fileUploadPartSuccessCount++;

                //were all bytes accounted for and successfully received?
                if (_fileInfo.BytesUploaded == _fileInfo.FileData.Length && _fileUploadPartSuccessCount == _fileUploadSequence)
                {
                    //we can now remove the handler
                    ((ServiceInerfaceClient)sender).WriteToChunkedUploadFileCompleted -= svc_WriteToChunkedUploadFileCompleted;

                    //file is completely uploaded, we can now go to step 3
                    Upload_part3();
                }
                else
                {
                    //keep sending the parts 
                    Upload_part2_recursion(((ServiceInerfaceClient)sender));
                }

            }
            else
            {
                if (e.Error != null)
                    throw e.Error;
                else if (e.Result != null)
                    throw new Exception(e.Result.MessageKey);
                else
                    throw new Exception("Unknown Error Uploading File to Server");
            }
        }

        private void Upload_part3()
        {
            ServiceInerfaceClient svc = ServiceUtility.GetChunkedFileUploadClient();

            //now we want to tell the server to close the file and commit to disk
            svc.CloseChunkedUploadFileCompleted += new EventHandler<CloseChunkedUploadFileCompletedEventArgs>(svc_CloseChunkedUploadFileCompleted);
            svc.CloseChunkedUploadFileAsync(_fileInfo.ServerToken, _fileUploadSequence);
        }

        void svc_CloseChunkedUploadFileCompleted(object sender, CloseChunkedUploadFileCompletedEventArgs e)
        {
            if (e.Error == null && e.Result != null && e.Result.IsSuccess)
            {
                //the file is now on the server
                OnUploadAsyncCompleted();
            }
            else
            {
                if (e.Error != null)
                    throw e.Error;
                else if (e.Result != null)
                    throw new Exception(e.Result.MessageKey);
                else
                    throw new Exception("Unknown Error Committing File to Server");
            }
        }

    }
The 'ChunkedUploadFileInfo' class is as follows:
public class ChunkedUploadFileInfo : INotifyPropertyChanged
    {
        #region "Client Side File Info"
        private string _OriginalFilename;
        public string OriginalFilename
        {
            get { return _OriginalFilename; }
            set { _OriginalFilename = value; OnPropertyChanged("OriginalFilename"); }
        }

        private byte[] _fileData;
        public byte[] FileData
        {
            get { return _fileData; }
            set { _fileData = value; OnPropertyChanged("FileData"); }
        }
        #endregion

        #region "Upload FIle Info"
        private string _ServerToken;
        public string ServerToken
        {
            get { return _ServerToken; }
            set { _ServerToken = value; OnPropertyChanged("ServerToken"); }
        }

        private int _bytesUploaded;
        public int BytesUploaded
        {
            get { return _bytesUploaded; }
            set { _bytesUploaded = value; OnPropertyChanged("BytesUploaded"); }
        }
        #endregion

        #region "Notify Property Changed"
        public event PropertyChangedEventHandler PropertyChanged;

        private void OnPropertyChanged(string propertyName)
        {
            if (PropertyChanged != null)
                PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
        }
        #endregion
    }
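To tie the pieces together, here is a sketch of how the helper might be driven from a Silverlight view model once the user has picked a file (the OpenFileDialog handling and the SelectedFileToken property are assumptions for illustration):
public void UploadSelectedFile()
    {
        //Silverlight's OpenFileDialog must be shown from a user-initiated event (e.g. a button click)
        OpenFileDialog dlg = new OpenFileDialog();
        if (dlg.ShowDialog() == true)
        {
            //read the chosen file into the info wrapper
            ChunkedUploadFileInfo fileInfo = new ChunkedUploadFileInfo();
            fileInfo.OriginalFilename = dlg.File.Name;

            using (System.IO.Stream stream = dlg.File.OpenRead())
            {
                byte[] data = new byte[stream.Length];
                stream.Read(data, 0, data.Length);
                fileInfo.FileData = data;
            }

            //upload using the default chunk size and capture the server token when done
            ChunkedFileUploadHelper helper = new ChunkedFileUploadHelper(fileInfo);
            helper.UploadAsyncCompleted += (uploader, serverToken) =>
            {
                //the token is also the filename on the server, so keep it for later (e.g. to start an import)
                SelectedFileToken = serverToken;
            };
            helper.UploadAsync();
        }
    }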
Permalink: Chunked File Uploader in Silverlight and WCF

Fixing Dodgy Characters in SQL

December 08, 2011

When dealing with Unicode/nvarchar data in SQL, unwanted characters can sometimes sneak into the database, which may break things or look wrong if you're not expecting them. For example, if you use an XML serializer on data containing character 0x0B, it will throw an exception. To find and replace this data you need to use a binary collation, as SQL Server will not match characters outside of the normal set using the standard collations. An example of replacing 0x0B:
UPDATE MyTable
SET MyField = REPLACE(MyField COLLATE Latin1_General_BIN, NCHAR(11), '')
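If the data originates in your own application, you can also guard against the problem at the point of saving rather than repairing it afterwards. A small sketch (using XmlConvert.IsXmlChar, available from .NET 4) that strips characters which are invalid in XML 1.0, such as 0x0B, before a value is persisted:
public static string StripInvalidXmlChars(string input)
{
    if (string.IsNullOrEmpty(input))
        return input;

    System.Text.StringBuilder sb = new System.Text.StringBuilder(input.Length);

    foreach (char c in input)
    {
        //keep only characters that are legal in XML 1.0
        if (System.Xml.XmlConvert.IsXmlChar(c))
            sb.Append(c);
    }

    return sb.ToString();
}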
Permalink: Fixing Dodgy Characters in SQL

Creating a Multi-Select Drop Down List in Silverlight

August 24, 2011

This is quite a common requirement: a drop down/combo interface with checkable options inside. Silverlight is quite good at this due to its data-bound approach to displaying and manipulating data. As a preface, let's assume you have wrapped your IEnumerable data in a wrapper class that contains the data item and a 'Selected' property, for example:
public class SelectableObject : System.ComponentModel.INotifyPropertyChanged
{

    private bool _selected;
    public bool Selected
    {
        get { return _selected; }
        set
        {
            _selected = value;

            OnPropertyChanged("Selected");
        }
    }

    public object DataItem { get; set; }

    public event PropertyChangedEventHandler PropertyChanged;
    protected void OnPropertyChanged(string propertyName)
    {
        if (PropertyChanged != null)
            PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }
}
Now that you have a data source with the properties needed to bind a selectable list, you can display the data in a modified combo box designed to work with this kind of data. I based my combo box on the RadComboBox (Prism version), but this can likely be translated to any other base implementation easily enough. XAML:
<UserControl x:Class="FQNS.MultiSelectDropDown"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
    xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
    mc:Ignorable="d"
             xmlns:my="clr-namespace:ExternalControls.TelerikForPrism;assembly=ExternalControls.TelerikForPrism"
    d:DesignHeight="300" d:DesignWidth="400">
    
    <Grid x:Name="LayoutRoot">
        <my:PrismRadComboBox ItemsSource="{Binding}"  Name="cmbComboBox" EmptyText="-All-" Margin="0,0,0,2" SelectionChanged="cmbComboBox_SelectionChanged" DropDownClosed="cmbComboBox_DropDownClosed">
            <!-- Data Template is defined in code behind-->
        </my:PrismRadComboBox>
    </Grid>
</UserControl>
Code-behind:
public partial class MultiSelectDropDown : UserControl
{
    public MultiSelectDropDown()
    {
        InitializeComponent();

    }

    private string _displayMemberPath;
    public string DisplayMemberPath
    {
        get { return _displayMemberPath; }
        set
        {
            _displayMemberPath = value;
            cmbComboBox.ItemTemplate = (DataTemplate)XamlReader.Load(@"
            <DataTemplate xmlns=""http://schemas.microsoft.com/client/2007"">
                <StackPanel Orientation=""Horizontal"">
                    <CheckBox IsChecked=""{Binding Selected, Mode=TwoWay}""></CheckBox>
                    <TextBlock Text=""{Binding Path=" + DisplayMemberPath + @"}""></TextBlock>
                </StackPanel>
            </DataTemplate>");
        }
    }

    private void cmbComboBox_SelectionChanged(object sender, Telerik.Windows.Controls.SelectionChangedEventArgs e)
    {
        //we actually don't want 'selections' to be made, so always select -1 but tick the selected item
        SetItemSelectedProperty(cmbComboBox.SelectedItem, true);

        cmbComboBox.SelectedIndex = -1;
    }

    private void SetComboText()
    {
        //at this point the data context should be at least IEnumerable
        IEnumerable objectList = DataContext as IEnumerable;
        if (objectList != null)
        {
            List<object> selectedItems = (from object o in objectList where GetItemSelectedProperty(o) == true select o).ToList();
            switch (selectedItems.Count)
            {
                case 0:
                    cmbComboBox.EmptyText = "-All-";
                    break;
                case 1:
                    cmbComboBox.EmptyText = GetItemDisplayProperty(selectedItems[0]);
                    break;
                default:
                    cmbComboBox.EmptyText = "Multiple Selections..";
                    break;
            }

        }
    }


    private void SetItemSelectedProperty(object dataItem, bool value)
    {
        if (dataItem != null)
        {
            //get the 'Selected' property and confirm it's the correct type
            PropertyInfo selectedProp = dataItem.GetType().GetProperty("Selected");
            if (selectedProp != null && selectedProp.PropertyType == value.GetType())
            {
                selectedProp.SetValue(dataItem, value, null);
            }
        }
    }

    private bool GetItemSelectedProperty(object dataItem)
    {
        if (dataItem != null)
        {
            //get the 'Selected' property and confirm it's a boolean type
            PropertyInfo selectedProp = dataItem.GetType().GetProperty("Selected");
            if (selectedProp != null && selectedProp.PropertyType == typeof(bool))
            {
                return (bool)selectedProp.GetValue(dataItem, null);
            }
        }

        //default to not selected
        return false;
    }

    private string GetItemDisplayProperty(object dataItem)
    {
        if (dataItem == null)
            throw new ArgumentNullException("dataItem", "Data Item cannot be NULL");

        //get the 'DisplayMemberPath' property
        if (DisplayMemberPath.Contains("."))
        {
            //child property - iterate the tree
            Type currentType = dataItem.GetType();
            object currentValue = dataItem;

            string[] props = DisplayMemberPath.Split('.');
            foreach (string p in props)
            {
                PropertyInfo thisProp = currentType.GetProperty(p);
                if (thisProp != null)
                {
                    currentType = thisProp.PropertyType;
                    currentValue = thisProp.GetValue(currentValue, null);
                }
                else
                {
                    break;
                }
            }

            return (currentValue != null ? currentValue.ToString() : dataItem.ToString());
        }
        else
        {
            //direct property
            PropertyInfo displayProp = dataItem.GetType().GetProperty(DisplayMemberPath);

            if (displayProp != null)
            {
                object propVal = displayProp.GetValue(dataItem, null);
                if (propVal != null)
                    return propVal.ToString();
            }
        }
    //default to returning the object itself
        return dataItem.ToString();
    }


    private void cmbComboBox_DropDownClosed(object sender, EventArgs e)
    {
            SetComboText();
    }

}
Example Usage:
<my:MultiSelectDropDown DataContext="{Binding MySelectableDataSource, Mode=TwoWay}" DisplayMemberPath="DataItem.Name"></my:MultiSelectDropDown>
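For completeness, a sketch of how the bound collection might be built in the view model (the Customer type and its Name property here are just placeholders):
private ObservableCollection<SelectableObject> _MySelectableDataSource = new ObservableCollection<SelectableObject>();
public ObservableCollection<SelectableObject> MySelectableDataSource
{
    get { return _MySelectableDataSource; }
}

public void LoadSelectableCustomers(IEnumerable<Customer> customers)
{
    _MySelectableDataSource.Clear();

    //wrap each data item so the drop down has a 'Selected' flag to bind to
    foreach (Customer c in customers)
    {
        _MySelectableDataSource.Add(new SelectableObject() { DataItem = c, Selected = false });
    }
}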
Permalink: Creating a Multi-Select Drop Down List in Silverlight

Dynamic OrderBy using LINQ to SQL

August 22, 2011

Following on from my last post on Virtual Paging with Silverlight/WCF Services, the next topic to go hand in hand with this is dynamically ordering your data when using LINQ to SQL as a backend. Ordinarily I use my own data architecture, so I hadn't run into the problem of passing an order by clause from the UI through LINQ to SQL before; but recently I was working on a project which does use LINQ to SQL (actually it's Entity Framework, but these things fit together), so this came up after having implemented proper server-side paging. When you are paging data you want to page it based on a certain order, and that order is seldom hard coded - you want the user to define at runtime how the data should be ordered, e.g. by selecting a column in a grid view. In LINQ to SQL this isn't as simple as it should be, unless you download the 'Dynamic LINQ Library' that Microsoft have published in some of their samples. Not being one for including code libraries unnecessarily, I decided to tackle it with a little extra Googling and understanding. What I came across was basically the use of LINQ expressions in conjunction with reflection to dynamically build the parameters to the LINQ order by clause at runtime. I only needed this to be very simple, i.e. one column at a time, so I defined a simple string 'sort expression' of one column and one direction. Here is the code for my solution:
private IQueryable<YouDataObject> SearchYourDataQuery(
EFDBConnection conn, 
//some search criteria here,
string orderBy)
{

//default sort order is start date asc
if (string.IsNullOrEmpty(orderBy)) orderBy = "StartDate";
orderBy=orderBy.Trim();

//parse the order by statement 
string sortExprRegex = @"^(?<orderBy>.+?)\s?(?<direction>ASC|DESC)?$";
string sortDirection = "ASC";

if (Regex.IsMatch(orderBy, sortExprRegex, RegexOptions.IgnoreCase))
{
    Match orderByParts = Regex.Match(orderBy, sortExprRegex, RegexOptions.IgnoreCase);
    orderBy = orderByParts.Groups["orderBy"].Value;
    if(!string.IsNullOrEmpty(orderByParts.Groups["direction"].Value)) sortDirection = orderByParts.Groups["direction"].Value;
}          

//generate the query to get the filtered records
IQueryable<YouDataObject> dataQry = (from a in conn.YourDatas
                                            //put your joins, where clauses etc
                                            select a);
            
//build an order by statement based on the column name passed to us
ParameterExpression tableParam = Expression.Parameter(typeof(YouDataObject), "d");

//always start property access from the top table
Expression expr = tableParam;

//if it has dots then its a child property, i.e of another type
if (orderBy.Contains("."))
{
    string[] props = orderBy.Split('.');

    Type type = typeof(YouDataObject);
    foreach (string prop in props)
    {
        // use reflection (not ComponentModel) to mirror LINQ
        PropertyInfo pi = type.GetProperty(prop);
        expr = Expression.Property(expr, pi);
        type = pi.PropertyType;
    }

}
else
{
    //no dots means a direct property accessor of the top level table
    expr = Expression.Property(tableParam, orderBy);
}

//branch on the data type - DateTime gets a strongly typed sort key, the default is to use 'object'
//(value types other than DateTime would need their own typed branch, as the expression cannot return them as 'object' without an explicit Convert)
if (expr.Type == typeof(DateTime))
{
    if (sortDirection.ToUpper() == "ASC")
        dataQry = dataQry.OrderBy(Expression.Lambda<Func<YouDataObject, DateTime>>(expr, tableParam));
    else
        dataQry = dataQry.OrderByDescending(Expression.Lambda<Func<YouDataObject, DateTime>>(expr, tableParam));
}
else
{
    if (sortDirection.ToUpper() == "ASC")
        dataQry = dataQry.OrderBy(Expression.Lambda<Func<YouDataObject, object>>(expr, tableParam));
    else
        dataQry = dataQry.OrderByDescending(Expression.Lambda<Func<YouDataObject, object>>(expr, tableParam));
}
            
//return the resulting query
return dataQry;

}
There are probably ways to neaten up the code and make it more reusable, but hopefully it is a helpful starting point.
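For example, assuming the method above lives in your data layer class, the sort expression can then be passed straight down from the UI (the disposable EFDBConnection usage and the 'Customer.Name' child property are placeholders):
private List<YouDataObject> GetOrderedData(string sortExpression)
{
    //e.g. sortExpression = "StartDate DESC" or "Customer.Name" (a child property)
    using (EFDBConnection conn = new EFDBConnection())
    {
        return SearchYourDataQuery(conn, sortExpression).ToList();
    }
}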
Permalink: Dynamic OrderBy using LINQ to SQL

Virtual Paging with Silverlight/WCF Services

August 18, 2011

When you want to page a collection in Silverlight, all the online documentation points you to the 'PagedCollectionView' class, which offers a paging wrapper around an IEnumerable. This needs the full IEnumerable of data up front, which is fine for small datasets, but most of the time you want to page a database resultset of many thousands of rows down to maybe 10 at a time. Your Silverlight application probably gets its data from a WCF service, either directly or via an RIA Services link, so you want to pass your paging request from Silverlight to the service layer, where it can be translated through your architecture into (more than likely) a SQL query, so that only one page of data is transported at a time. There are hints on Google that you should implement IPagedCollectionView in order to achieve this. Below is my version of an IPagedCollectionView which can then be used for 'virtual' paging of data in Silverlight:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.ComponentModel;
using System.Collections.Specialized;
using System.Collections;

/// <summary>
/// A class which can be used in Silverlight to enable server side (virtual) paging code to store its view state
/// </summary>
public class PagedVirtualCollectionView : IPagedCollectionView, IEnumerable, INotifyCollectionChanged, INotifyPropertyChanged
{
    /// <summary>
    /// Constructor takes initial page of data as its source
    /// </summary>
    /// <param name="source">A page of source data</param>
    public PagedVirtualCollectionView(IEnumerable source)
    {
        _sourceCollection = source;
    }

    private IEnumerable _sourceCollection;
    /// <summary>
    /// The underlying page of source data
    /// </summary>
    public IEnumerable SourceCollection
    {
        get { return _sourceCollection; }
        set { 
            _sourceCollection = value; 
            OnPropertyChanged("SourceCollection"); 
            OnCollectionChanged(new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset)); 
        }
    }

    private int _VirtualItemCount;
    /// <summary>
    /// Get or set this value, which is the total number of records in the database
    /// </summary>
    public int VirtualItemCount
    {
        get { return _VirtualItemCount; }
        set { 
            _VirtualItemCount = value; 
            OnPropertyChanged("VirtualItemCount"); 
            OnPropertyChanged("ItemCount"); 
            OnPropertyChanged("TotalItemCount"); 
        }
    }

    public int VirtualPageCount
    {
        get { return (int)Math.Ceiling((double)VirtualItemCount / (double)PageSize); }
    }

    private int _pageIndex = 0;
    private int _pageSize;

    #region "IEnumerable"
    /// <summary>
    /// For IEnumerable interface, passes the call to the underlying SourceCollection
    /// </summary>
    /// <returns></returns>
    public IEnumerator GetEnumerator()
    {
        return _sourceCollection.GetEnumerator();
    }
    #endregion

    #region "IPagedCollectionView"
    public bool CanChangePage
    {
        get { return !_isPageChanging; }
    }

    private bool _isPageChanging = false;
    /// <summary>
    /// Return true between states of page changing and page changed, otherwise false
    /// </summary>
    public bool IsPageChanging
    {
        get { return _isPageChanging; }
        set { _isPageChanging = value; OnPropertyChanged("IsPageChanging"); }
    }

    /// <summary>
    /// This will be the same as the virtual item count, for use in data paging controls
    /// </summary>
    public int ItemCount
    {
        get { return VirtualItemCount; }
    }

    public bool MoveToFirstPage()
    {
        return MoveToPage(0);
    }

    public bool MoveToLastPage()
    {
        return MoveToPage(VirtualPageCount - 1);
    }

    public bool MoveToNextPage()
    {
        return MoveToPage(PageIndex + 1);
    }

    public bool MoveToPage(int pageIndex)
    {
        if (pageIndex >= 0 && pageIndex <= (VirtualPageCount - 1))
        {
                

            //fire the page changing event so the call can be made to load the next page of data
            PageChangingEventArgs pcea = new PageChangingEventArgs(pageIndex);
            OnPageChanging(pcea);

            if (!pcea.Cancel)
            {
                //let outside world know we are changing pages
                IsPageChanging = true;

                //event handlers have run, we should now be on the new page
                _pageIndex = pageIndex;

                //let the outside world know we are no longer changing pages
                IsPageChanging = false;

                OnPropertyChanged("PageIndex");

                //raise an event to signal the completed change of page
                OnPageChanged();
                return true;
            }
            else
            {
                //event handler cancelled the page change
                return false;
            }
                
        }
        else
        {
            //page index out of bounds, or was busy changing pages
            return false;
        }
    }

    public bool MoveToPreviousPage()
    {
        return MoveToPage(PageIndex - 1);
    }

    public event EventHandler<EventArgs> PageChanged;
    protected void OnPageChanged()
    {
        if (PageChanged != null)
            PageChanged(this, EventArgs.Empty);
    }

    public event EventHandler<PageChangingEventArgs> PageChanging;
    protected void OnPageChanging(PageChangingEventArgs e)
    {
        if (PageChanging != null)
            PageChanging(this, e);
    }

    /// <summary>
    /// Read-only version of page index, use the helper functions to move through the pages
    /// </summary>
    public int PageIndex
    {
        get { return _pageIndex; }
    }

    /// <summary>
    /// Get/set the page size, which can then be used by the external paging calls
    /// </summary>
    public int PageSize
    {
        get
        {
            return _pageSize;
        }
        set
        {
            _pageSize = value;
        }
    }

    public int TotalItemCount
    {
        //this will always be the same as virtual item count
        get { return VirtualItemCount; }
    }
    #endregion

    #region "INotifyPropertyChanged"
    public event PropertyChangedEventHandler PropertyChanged;
    protected void OnPropertyChanged(string propertyName)
    {
        if (PropertyChanged != null)
            PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }
    #endregion

    #region"INotifyCollectionChanged"
    public event NotifyCollectionChangedEventHandler CollectionChanged;
    protected void OnCollectionChanged(NotifyCollectionChangedEventArgs e)
    {
        if (CollectionChanged != null)
            CollectionChanged(this, e);
    }
    #endregion
}
Basically it works by allowing you to specify a 'VirtualItemCount' as well as your 'SourceCollection', so that you can dictate how many items there are in total (before paging has been applied). Whether you are using MVVM or a traditional DataContext, you should bind your grid and pager to the instance of the PagedVirtualCollectionView and handle the 'PageChanging' event. You can then replace the 'SourceCollection' with the current page of data and set the 'VirtualItemCount' to the total count of records. E.g.
private PagedVirtualCollectionView _SearchResults = new PagedVirtualCollectionView(new List<SearchResult>());
public PagedVirtualCollectionView SearchResults
{
    get { return _SearchResults; }
}

public MyViewModel()
{
    _SearchResults.PageChanging += new EventHandler<PageChangingEventArgs>(SearchResultsPageChangingHandler);
}

protected void SearchResultsPageChangingHandler(object sender, PageChangingEventArgs e)
{
    UpdateSearchResults(e.NewPageIndex);
}

public void UpdateSearchResults(int pageIndex)
{
    //CALL YOUR SERVICE PASSING pageIndex and SearchResults.PageSize
}

void UpdateSearchResultsCompleted(object sender, SearchCompletedEventArgs e)
{          
    if (e.Error == null)
    {
        SearchResults.SourceCollection = e.Result;
        SearchResults.VirtualItemCount = e.totalRecords; //totalRecords is defined as an 'out' parameter to the WCF service
    }
}
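The service side of that call isn't shown here, but it typically boils down to a Skip/Take over an ordered query plus a total count. A rough sketch of such an operation (the data context, entity and member names are placeholders):
public List<SearchResult> Search(string criteria, int pageIndex, int pageSize, out int totalRecords)
{
    using (MyDataContext db = new MyDataContext())
    {
        //build the filtered, ordered query first
        IQueryable<SearchResult> qry = db.SearchResults
            .Where(r => r.Name.Contains(criteria))
            .OrderBy(r => r.Name);

        //count before paging - the client binds this to VirtualItemCount
        totalRecords = qry.Count();

        //return only the requested page
        return qry.Skip(pageIndex * pageSize).Take(pageSize).ToList();
    }
}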
Permalink: Virtual Paging with Silverlight/WCF Services

IIS 6 Rewriting Problem with .NET 4

August 03, 2011

Recently I had a problem with a URL rewriting regular expression not picking up URL requests, resulting in 404 not found errors. The expression was designed to pick up anything consisting of a string containing no dots (extensionless files), because I know these files exist only in a database and have a handler. It worked fine until one day it just didn't. After enabling logging in Helicon's rewriter, I could see the original URL was not actually being passed through to the Helicon ISAPI filter and in fact had already been modified to include '/eurl.axd/{hash}' on the end - hence the regex was not making a positive match. A quick Google revealed a change in ASP.NET 4 that handles extensionless file requests slightly differently than before. It looks like .NET 4 is getting in early (ASP.NET never used to fire for non-mapped files), picking up the extensionless file and (maybe because it's IIS 6) rewriting the URL, as above, with the '/eurl.axd/{hash}' part suffixed. This was now causing the 404. To fix it I simply added a second regular expression that picks up these requests for extensionless files with the aforementioned 'eurl.axd' suffix, so that they also pass through to the database file handler.
Permalink: IIS 6 Rewriting Problem with .NET 4

Silverlight Designer Crashing Visual Studio 2010

July 08, 2011

After the Silverlight runtime update to 4.0.60531.0, opening a XAML file in design view was crashing Visual Studio. There was also a problem opening the Toolbox, or any designer that loads the toolbox, with Visual Studio crashing when "Loading toolbox content from package 'Microsoft.VisualStudio.IDE.ToolboxControlsInstaller.ToolboxInstallerPackage' {2C298B35-07DA-45F1-96A3-BE55D91C8D7A}". From what I found, this probably only affects systems where 'System.Windows.dll' has previously been registered in the GAC. The solution is to remove System.Windows from the GAC using the following command from the Visual Studio Command Prompt: gacutil -u System.Windows
Permalink: Silverlight Designer Crashing Visual Studio 2010

Finding a Child Control by Type in Silverlight

June 07, 2011

Recently I needed to traverse the visual tree of a Silverlight dependency object (UI element) in order to find a child object of a particular type. As I didn't know how deeply nested the child would be, I wrote a recursive helper function which scans all child elements and returns the first instance it finds. I then extended this with an overload allowing the name of the object to be specified as well. I thought it might be useful elsewhere, so here it is:
private T FindControlByType<T>(DependencyObject container) where T : DependencyObject
{
    return FindControlByType<T>(container, null);
}

private T FindControlByType<T>(DependencyObject container, string name) where T : DependencyObject
{
    T foundControl = null;

    //for each child object in the container
    for (int i = 0; i < VisualTreeHelper.GetChildrenCount(container); i++)
    {
        //is the object of the type we are looking for?
        if (VisualTreeHelper.GetChild(container, i) is T && (VisualTreeHelper.GetChild(container, i).GetValue(FrameworkElement.NameProperty).Equals(name) || name == null))
        {
            foundControl = (T)VisualTreeHelper.GetChild(container, i);
            break;
        }
        //if not, does it have children?
        else if (VisualTreeHelper.GetChildrenCount(VisualTreeHelper.GetChild(container, i)) > 0)
        {
            //recursively look at its children
            foundControl = FindControlByType<T>(VisualTreeHelper.GetChild(container, i), name);
            if (foundControl != null)
                break;
        }
    }

    return foundControl;
}
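Usage is then a one-liner. For example, to find the first ScrollViewer somewhere below a page's layout root, or a specifically named element within a control template (the names here are hypothetical):
//first ScrollViewer anywhere below the layout root
ScrollViewer viewer = FindControlByType<ScrollViewer>(LayoutRoot);

//or only the child element named "PART_ContentScroller"
ScrollViewer namedViewer = FindControlByType<ScrollViewer>(LayoutRoot, "PART_ContentScroller");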
You can tweak the code to accept more parameters if you need more comparisons to match your object.
Permalink: Finding a Child Control by Type in Silverlight