Performance of .Net reflection on mobile devices (via Xamarin and C#)

.Net reflection can be very handy to automate development tasks involving objects, without typing (or even knowing) the names of the properties.

For instance, if you need to copy properties between different types of objects (think of domain-object-to-view-model copies, and vice versa, in MVC projects), it is difficult, today, to resist the comforts of frameworks like AutoMapper, EmitMapper, ValueInjecter, BLToolkit and others.
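With AutoMapper, for instance, the whole copy boils down to a couple of lines. A minimal sketch using AutoMapper's classic static API (DomainProduct, ProductViewModel and domainProduct are hypothetical names used for illustration):

// AutoMapper's classic static API (pre-5.0); the types are hypothetical:
Mapper.CreateMap<DomainProduct, ProductViewModel>(); // configured once at startup
ProductViewModel viewModel = Mapper.Map<ProductViewModel>(domainProduct);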

But how does reflection perform on mobile with Xamarin? In some cases, it can’t perform at all, because Apple doesn’t allow dynamic code creation (think System.Reflection.Emit). In other cases, it performs reasonably well, but only if we don’t ask the simulator (and, even more so, the device) to crunch very large numbers of objects.

We created a little project to test how copying C# objects via reflection performs on mobile, built around a class (“Narcissus Copier”) that uses reflection to copy properties between objects.

With Narcissus, we can do two things:

  • Copy values between two objects based on common property names and types (the common properties are computed every time we copy two objects);
  • Copy values between two objects based on equal property names and types, with the premise that the corresponding properties are “registered” beforehand in the utility (the common properties are computed only once in the app; a sketch of this variant follows the copy method below).

This is a link to the overall solution that includes both the “Narcissus” copier and the iOS project:

https://github.com/RickCSharp/NarcissusCopieriOStest

This is the method that copies the properties of a “source” object onto a “destination” object.

// This method allows you to copy an object onto another object, based on the common properties they have.
// Syntax:
// NarcissusCopier<TypeOfSourceObject, TypeOfDestinationObject>.CopyAnyObject(SourceObjectInstance, DestinationObjectInstance);
// To improve performance when the same pair of types is copied more than once,
// the method pair RegisterObjectProperties / CopyRegisteredObject is preferable.
public static void CopyAnyObject(TSource source, TDestination destination)
{
    var propertiesOfSource = source.GetType().GetProperties();
    var propertiesOfDestination = destination.GetType().GetProperties();
    var propertiesInCommon =
        from a in propertiesOfSource
        join b in propertiesOfDestination
            on new { a.Name, a.PropertyType } equals new { b.Name, b.PropertyType }
        select a;
    foreach (var propertyInCommon in propertiesInCommon)
    {
        var valueOfPropertyToCopy = propertyInCommon.GetValue(source, null);
        PropertyInfo propertyOfDestinationToReplace =
            destination.GetType().GetProperty(propertyInCommon.Name);
        propertyOfDestinationToReplace.SetValue(
            destination,
            Convert.ChangeType(valueOfPropertyToCopy, propertyOfDestinationToReplace.PropertyType),
            null);
    }
}
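For comparison, here is a minimal sketch of what the “registered” pair of methods could look like, assuming a static cache of the common properties per type pair (the names match the method pair mentioned above; the bodies are an assumption, the real implementation is in the repo):

// A sketch of the "registered" variant; requires: using System.Linq; using System.Reflection;
public static class NarcissusCopier<TSource, TDestination>
{
    // Computed once per (TSource, TDestination) pair:
    private static PropertyInfo[] registeredProperties;

    public static void RegisterObjectProperties()
    {
        registeredProperties =
            (from a in typeof(TSource).GetProperties()
             join b in typeof(TDestination).GetProperties()
                 on new { a.Name, a.PropertyType } equals new { b.Name, b.PropertyType }
             select a).ToArray();
    }

    public static void CopyRegisteredObject(TSource source, TDestination destination)
    {
        // Assumes RegisterObjectProperties has run once before the first copy.
        foreach (PropertyInfo property in registeredProperties)
        {
            typeof(TDestination).GetProperty(property.Name)
                .SetValue(destination, property.GetValue(source, null), null);
        }
    }
}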

The idea for testing how reflection copy performs is:

  • We create 1,000, 10K, 100K, 200K and 500K instances of two pairs of mildly compatible complex classes (i.e. each pair has some properties in common, and some that are not);
  • We copy the values of the properties of the instances of the first type of object onto the properties of the second type;
  • First we do it without reflection (“direct” copy);
  • Then we do it with reflection, but without “registering” the object pairs (that is to say, the common properties are evaluated every time we ask the method to perform a copy);
  • Last, we do the copy with reflection “prepared”, that is to say by first “registering” the object pairs (whenever we ask the method to perform a copy, the common properties are already known).
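In code, each measurement is just a Stopwatch around a copy loop. A minimal sketch of one measurement, where SourceThing and DestinationThing are stand-ins for a “mildly compatible” pair (not the repo’s real types):

// Stand-in types for one "mildly compatible" pair (names are assumptions):
class SourceThing { public string Name { get; set; } public int Size { get; set; } }
class DestinationThing { public string Name { get; set; } public int Size { get; set; } }

static long TimeCopies(int howMany)
{
    var watch = System.Diagnostics.Stopwatch.StartNew();
    for (int i = 0; i < howMany; i++)
    {
        var source = new SourceThing { Name = "cat " + i, Size = i };
        var destination = new DestinationThing();
        // swap in the strategy under test:
        //   reflection: NarcissusCopier<SourceThing, DestinationThing>.CopyAnyObject(source, destination);
        //   registered: NarcissusCopier<SourceThing, DestinationThing>.CopyRegisteredObject(source, destination);
        destination.Name = source.Name; // "direct" copy, the baseline
        destination.Size = source.Size;
    }
    watch.Stop();
    return watch.ElapsedMilliseconds;
}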

 

We will take advantage of this test to also see how differently the iOS simulator and the actual device perform (or don’t perform, in one case).

Performance in the simulator (real device tests are below)

Here are the results of the direct copy / reflection copy on an iPhone 6 Plus (iOS 9.3) simulator (this is important to underline, because on the real device it will be a totally different story) running on a MacBook Pro with an i7 and 8 GB RAM:

Test 1:
1,000 object copies in the simulator. Not a lot of difference between “direct” copy and copy via reflection.


Test 2:
10,000 object copies in the simulator. The difference begins to be significant (5x slower in the best reflection case).


Test 3:
100,000 object copies in the simulator. The non-reflection copy continues to perform well: the runtime reuses that code very efficiently. The reflection code, not so much.


Test 4:
200,000 object copies in the simulator. Performance degrades in the reflection part of the copy.


Test 5:
500,000 object copies in the simulator. This is an extreme case (it wouldn’t be a good idea to modify 500,000 objects in a mobile app, would it?), but this example does show some pitfalls of reflection.


Performance on a real device (iPhone 6 Plus)

Comparing the simulator to the real device is interesting because there is one case (the 500,000 objects) that the device cannot handle at all, because of insufficient memory. This reminds us once again that simulator results must be taken with a grain of salt.

Test 1 on iPhone:
1,000 object copies on the real device. Not a lot of difference between “direct” copy and copy via reflection, nor between the simulator and the real device.


Test 2 on iPhone:
10,000 object copies on the real device. The performance of the copy without reflection is still good; the performance of the copy via reflection is comparable to that of the simulator.


Test 3 on iPhone:
100,000 object copies on the real device. The copy without reflection still performs well. The reflection copy, however, begins to degrade, both compared to the “simple” copy and compared to the simulator.


Test 4 on iPhone:
200,000 object copies on the real device. The copy without reflection still performs well, but on the reflection copy the device is three times slower than the simulator.


Test 5 on iPhone:
…and last, 500,000 object copies on the real device. The performance cannot be shown: creating 500,000 complex objects on an iPhone 6 Plus crashes the app.

Syncfusion’s free “Succinctly” eBooks

When we discuss technologies on this blog, we generally try to present different commercial alternatives for reaching a certain goal. This time we’ll make an exception and present a set of books offered (for free, or better: in exchange for your willingness to subscribe to their newsletter) by one specific company: Syncfusion, to which we’re not affiliated in any way.

Why do we do this? Because they have given us for free (as they have everyone else) a lot of good eBooks that give you the lowdown on interesting development and sysadmin topics. These books are professional and well written. Everyone can get them for free at www.syncfusion.com.

In particular, I liked the “Git Succinctly” eBook. I had only used Git in the past via dubious GUIs: Visual Studio (which had no “staging” concept until VS 2013; this may change with VS 2015 Update 2, if I’m not mistaken) and Xamarin Studio.

This little book, written very concisely by Ryan Hodson (“Succinctly” books are rarely more than 100 pages), tells you everything you need to know to hit the ground running with Git (even professionally) via the command line, something I hadn’t done before.


And… it’s free.

Other cool titles:

  • NancyFX Succinctly (for microservice fans)
  • Cryptography in .NET Succinctly

What does Syncfusion actually sell? Components for web and mobile development, with an interesting business model that really rewards independent developers. You may want to check them out.

Renewing an SSL certificate for a website hosted in Azure

Managing resources in Azure has become easier (well, at least the interface looks better) since Microsoft launched the new portal (the one at portal.azure.com).

Let us see today how to upload, in the new portal, a renewed IP-based SSL certificate for your Azure web app.

Prerequisites

  1. Needless to say, to upload a renewed certificate to Azure you need to have a renewed certificate. You don’t have to wait for the old certificate to expire before installing the new one, though: you can buy the new certificate in advance (one or two months is a pretty safe choice) and use it immediately. However, watch out: some third parties (for example, the bank that processes your eCommerce payments) may need to install the intermediate certificates of your new certificate in their “certificate store” before you replace the certificate on your web server. Check with them if this is the case.
  2. To renew an SSL certificate, you can talk to the issuer of the existing certificate. There are also DNS providers that issue SSL certificates for you via a Certification Authority they trust, so you don’t have to speak to another party.
  3. The new certificate must be in the .pfx format (password-protected) to get along with IIS (Azure also runs Apache, actually, but I think most Azure websites run on IIS presently; I may be wrong already, and I will definitely be wrong in the future).
    I explained how to create the .pfx certificate in this post. However, if your Certification Authority or DNS provider is very kind, you won’t have to go through any of that: they will create a .pfx for you, thank you very much. For instance, DNSimple has an interface that creates the .pfx for you when you buy a certificate through them (they buy it at Comodo’s). DNSimple also provides a matching password that you will have to use in Azure in conjunction with the certificate:

Download a .pfx certificate and its password from DNSimple, or any provider that is “IIS-friendly”

The actual work

  1. Go to portal.azure.com
  2. Choose the blade (new portal terminology for a dynamic window) corresponding to your web app
  3. In the app’s settings, choose “Custom domains and SSL”
  4. Choose “Upload certificate”. Don’t be scared if you’re doing this ahead of time: nothing changes in the configuration until you bind the certificate to your site. Plus, as we said, you can use the renewed certificate before the old one expires, unless a third party needs the intermediate certificates first.

upload renewed pfx certificate in Azure

5. Once you upload the new certificate, the list of available certificates grows by one (see the “Certificates” section in the screenshot below: there is a “2017” certificate below the “2016” one).

As you can see in the "certificates" section, I have a new one

As you can see in the “certificates” section, I have a new one

6. Now you may be tempted to ADD a new binding between your hostname and the new certificate, in the SSL bindings configuration (see “SSL bindings” in the screenshot above). Azure will let you do that; however, once you save and re-enter the blade, you will see that the old certificate still holds the binding to the hostname.

7. This is why you don’t ADD a new binding between the hostname and the new certificate: you update the existing binding. In the row corresponding to the existing binding, select the new certificate you just uploaded, replacing the old one, as you see below:


Choose the new certificate in the SSL binding

8. If your SSL binding is already IP-based, you won’t have to set the IP binding again: the old configuration is kept.

9. However, to check that the new certificate chain is working, you can use an online tool like SSL Shopper’s SSL checker.

Just make sure that you are seeing the latest, non-cached situation in the tool!


Check your SSL certificate in Azure via an SSL checker

Encrypting a SQL Azure Database to comply with the EU data protection laws

Back in 2014, Microsoft’s president and chief legal officer Brad Smith wrote a note in the company’s blog (you can read it here: http://blogs.microsoft.com/blog/2014/04/10/privacy-authorities-across-europe-approve-microsofts-cloud-commitments/) stating that Azure was the only cloud provider meeting the renewed data protection regulations of the European Union. This recognition stemmed from policies that were already in place and others that Microsoft committed to implementing in the future.

It has to be noted that “data protection” does not refer only to possible hackers stealing customer data, but also, as Microsoft puts it, to “protecting customer data from government snooping” (read here: http://blogs.microsoft.com/blog/2013/12/04/protecting-customer-data-from-government-snooping/).

The Court of Justice of the European Union, on October 6, 2015, ruled that the “Safe Harbour” decision of year 2000 (which affirmed that data were by definition protected when exchanged between EU countries and the US) is invalid. The ruling followed a complaint by an Austrian Facebook user who affirmed that the company does not protect his data from the US authorities: http://curia.europa.eu/jcms/upload/docs/application/pdf/2015-10/cp150117en.pdf.

This is how Amazon interprets data protection for their very widely used Web Services cloud: https://aws.amazon.com/compliance/eu-data-protection/

Many other cloud service providers have addressed the EU’s regulations as well; here is a good white paper by Cloudera: https://www.cloudera.com/content/dam/cloudera/Resources/PDF/solution-briefs/eu-data-protection-directive-compliance-solution.pdf

And here is how Rackspace comments on the Safe Harbour ruling: http://blog.rackspace.com/eu-ruling-on-safe-harbor-rackspace-stands-prepared/

If your company is in a European country and you want to use cloud storage, chances are you will want to ask your cloud provider to keep your data in EU datacenters. Most providers allow you to choose where your database is located and replicated.

However, this is not enough. One of the requirements of data protection is encryption. Customer data should be encrypted not only when it leaves the EU for backup and geo-replication: you must also assure a certain level of security if you allow your customers to store personal data in a database.

Since October 2015, if you have a SQL Azure database, you can take advantage of Transparent Data Encryption. What does that mean? It means your customer data is encrypted with a server-level cryptographic key, and you don’t have to change your SQL queries at all.

I will try to show now how simple this is. I will replicate the same info you find here: https://msdn.microsoft.com/library/dn948096, but with a real-case scenario.

Before encrypting: Back up the DB

SQL Azure DBs are automatically backed up (the frequency is set by you: watch out for Azure bandwidth costs!), but it is good practice to back up your data before any important DDL operation. You can back up your DB to an Azure container.

To do so, from the Azure portal choose “SQL Server”, then select the server that contains the DB you want to back up before encrypting.

Then, choose the “export” feature.

Exporting a SQL Azure DB to an Azure container

You have to choose the Azure “blob” where your backup file (.bacpac) will be stored. You also need to provide your server’s username and password (by “server” I mean a DB server: since this is a DB “as a service”, it is not a real server).

Configuring an Azure container to export a SQL Azure DB

Encrypt the DB

The cool DDL command to encrypt the DB is:

ALTER DATABASE [MYDatabaseName] SET ENCRYPTION ON;
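The command returns quickly while encryption proceeds in the background. If you want to verify the state of the DB afterwards, sys.databases exposes an is_encrypted flag:

SELECT name, is_encrypted FROM sys.databases;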

If you are not cool and don’t like writing command lines (I absolutely don’t), you can achieve the same result via the portal (see the screenshots below):

  1. Select the Server
  2. Select the DB
  3. Choose “all settings”
  4. “Transparent data encryption”
  5. “ON”
  6. “Save”

  7. Wait for some seconds (depending on the size of the DB, it could also take minutes)
  8. You are done.

Keep on querying

Encryption is totally transparent. Keep on querying your DB!

Calling WebServices Asynchronously in .Net: Unique state object is required for multiple asynchronous simultaneous operations

When you call a web service asynchronously, you may want to make sure that responses arrive in the same order in which they were requested.

Is this always the case? Not necessarily. At times you just don’t care if the responses to requests A and B come in order B and A. Other times, order is crucial. One could argue that in those cases synchronous methods are better than asynchronous ones, but you cannot always choose to “go synchronous”, especially when you are using third-party web services or particular technologies (PCLs are an example).

.Net helps you enforce the correct request/response order by throwing the error that is the subject of this post. If a client starts multiple async calls in a short period of time, .Net tries to prevent inconsistencies by complaining thusly:

ERROR: System.ArgumentException: There was an error during asynchronous processing. Unique state object is required for multiple asynchronous simultaneous operations to be outstanding. ---> System.ArgumentException: Item has already been added. Key in dictionary: 'System.Object' Key being added: 'System.Object' at System.Collections.Hashtable.Insert(Object key, Object nvalue, Boolean add)

When does this happen? Is .Net mean to us?

It happens when you call an async web service method before the previous call has received its “completed” event.

Let us reproduce the issue. Let us imagine we have a web service that we reference in a Console project with a “myWSRef” reference, and that the service exposes an async method called getProductDataAsync(manufactorCode, productId).

Our client repeatedly calls the async service in a while-loop, like this:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace testwcf
{
    class Program
    {
       static void Main(string[] args)
        {
            string manufactorCode="blahblahCode";
            string productId = "blahblahCode2";

            // we define the client for the WS
            myWSRef.RemoteWS wsClient = new myWSRef.RemoteWS ();

            // we attach a handler to the "Completed" event
            wsClient.getProductDataResponseCompleted += callToProductDataCompleted;
            
            int prodNumb=0;
            while (prodNumb<=100)
            {
                try
                {
                    prodNumb++;
                    string artificialSuffix=new Random().Next().ToString();
                    wsClient.getProductDataAsync(manufactorCode,productId+artificialSuffix);
                }
                catch (Exception e)
                {
                    Console.WriteLine(e);
                }
            }
            Console.ReadLine();
        }

        public static void callToProductDataCompleted(object sender, myWSRef.GetProductDataCompletedEventArgs ea)
        {
            //this is the handler to the webserver reply

            if (ea.Error != null){
                Console.WriteLine("ERROR: " + ea.Error);
                Debug.WriteLine("ERROR: " + ea.Error);
             }
            else {
                Console.WriteLine("Call OK: ");
                Console.WriteLine(ea.Result);
            }
        }
    }
}

What happens if we run this? .Net will throw the runtime error that is the subject of this post.

Let us try to space out the calls by asking the thread to sleep for one second:

try
{
    prodNumb++;
    string artificialSuffix = new Random().Next().ToString();
    wsClient.getProductDataAsync(manufactorCode, productId + artificialSuffix);
    Thread.Sleep(1000);
}

What happens? The error temporarily goes away. In fact, one second is enough for the web service response to reach the callToProductDataCompleted routine, and we’re safe… unless the next call takes three seconds rather than one. And we’re back to square one.

How do we solve this issue for good? Many suggest giving every call a unique GUID.

Stack Overflow offers this suggestion: every call passes its own Guid to the service.
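With proxies generated in the event-based asynchronous pattern, this usually means using the overload of the async method that accepts a userState object; the value then comes back in the Completed event’s UserState property. A sketch against our hypothetical myWSRef proxy:

// Each call carries its own state object, so .Net can keep the outstanding
// operations apart in its internal dictionary:
wsClient.getProductDataAsync(manufactorCode, productId + artificialSuffix, Guid.NewGuid());

// In callToProductDataCompleted, ea.UserState then tells you which call the response belongs to.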

What about when the web service is written by someone else and you cannot pass it any unique ID?

One way to solve this is with a semaphore: while the response to one web service call has not yet been received, you cannot start another one.

This is the code, with the IsBusy semaphore implemented:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
namespace testwcf
{
    class Program
    {

        static Boolean IsBusy = false;  // this will be our semaphore
        static void Main(string[] args)
        {
            string manufactorCode="blahblahCode";
            string productId = "blahblahCode2";

            // we define the client for the WS
            myWSRef.RemoteWS wsClient = new myWSRef.RemoteWS ();

            // we attach a handler to the "Completed" event
            wsClient.getProductDataResponseCompleted += callToProductDataCompleted;
            
            int prodNumb = 0;
            while (prodNumb <= 100)
            {
                // if IsBusy, it means we are in the middle of a call
                if (IsBusy)
                {
                    continue; // wait until the pending call has received its response
                }
                try
                {
                    prodNumb++;
                    string artificialSuffix = new Random().Next().ToString();
                    IsBusy = true; // take the "semaphore" before calling
                    wsClient.getProductDataAsync(manufactorCode, productId + artificialSuffix);
                }
                catch (Exception e)
                {
                    Console.WriteLine(e);
                    // treat the exception, then
                    IsBusy = false; // free the "semaphore" for another call
                }
            }
            Console.ReadLine();
        }

        public static void callToProductDataCompleted(object sender, myWSRef.GetProductDataCompletedEventArgs ea)
        {
            //this is the handler to the webserver reply
             IsBusy = false; // when the WS call has been dutifully served, we "free" the loop to serve another one
            if (ea.Error != null){
                Console.WriteLine("ERROR: " + ea.Error);
                Debug.WriteLine("ERROR: " + ea.Error);
             }
            else {
                Console.WriteLine("Call OK: ");
                Console.WriteLine(ea.Result);

            }
        }
    }
}

There are other ways to avoid overlapping requests, of course, such as issuing the next call from inside the Completed callback (iteration). Just remember that the callToProductDataCompleted method is called even when the web server returns an error (e.g. 400 Bad Request), so that method is the place to handle those errors. By contrast, client-side errors (such as a timeout or a network error) will be caught by the catch block around the getProductDataAsync call.
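A sketch of that iteration pattern, assuming wsClient, prodNumb, manufactorCode and productId are promoted to static fields: the Completed handler issues the next call, so at most one request is ever outstanding:

public static void callToProductDataCompleted(object sender, myWSRef.GetProductDataCompletedEventArgs ea)
{
    if (ea.Error != null)
    {
        Console.WriteLine("ERROR: " + ea.Error); // server-side errors (e.g. 400 Bad Request) land here
    }
    else
    {
        Console.WriteLine("Call OK: " + ea.Result);
    }
    prodNumb++;
    if (prodNumb <= 100)
    {
        // only now that the previous call has completed do we start the next one
        wsClient.getProductDataAsync(manufactorCode, productId + new Random().Next().ToString());
    }
}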

Controller versus view in MVC .net: is the code in the view as fast as that in the controller? Is it slower?

One of the basic rules of MVC is that views should be only, and exactly, views: objects that present to the user something that has already been worked out and calculated.

They should perform little calculation, if any at all. All the significant code should be in the controllers. This allows better testability and maintainability.

Is this, in Microsoft’s interpretation of MVC, also justified by performance?

We tested this with a very simple piece of code that:

– creates 200,000 “cat” objects and adds them to a List

– creates 200,000 “owner” objects and adds them to a List

– creates 200,000 “catowner” objects (the many-to-many relation between cats and owners) and adds them to a List

– navigates through each cat, finds his/her owner, and removes the owner from the list of owners (we don’t know if the cats really wanted this, but their freedom in code suits our purposes).

We’ve run this code in a controller and in a Razor view.

The results seem to suggest that code in views runs just as fast as in controllers, even if we don’t pre-compile the views (the compilation time in our test is negligible).

The average result for the code with the logic in the controller is 18.261 seconds.

The average result for the code with the logic in the view is 18.621 seconds.

The performance seems therefore very similar.

Here is how we got to this result.

Case 1: Calculations are in the CONTROLLER

Models:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace WebPageTest.Models
{
    public class Owner
    {
        public string Name { get; set; }
        public DateTime DOB { get; set; }
        public virtual CatOwner CatOwner { get; set; }
    }
    public class Cat
    {
        public string Name { get; set; }
        public DateTime DOB { get; set; }
        public virtual CatOwner CatOwner { get; set; }
    }
    public class CatOwner
    {
        public virtual Cat Cat { get; set; }
        public virtual Owner Owner { get; set; }
    }
}

Controller:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Web;
using System.Web.Mvc;
using WebPageTest.Models;

namespace WebPageTest.Controllers
{
    public class HomeController : Controller
    {
        public ActionResult Index()
        {
            Stopwatch howLongWillItTake = new Stopwatch();
            howLongWillItTake.Start();
            List<Owner> allOwners = new List<Owner>();
            List<Cat> allCats = new List<Cat>();
            List<CatOwner> allCatOwners = new List<CatOwner>();
            // create lists with 200000 cats, 200000 owners, 200000 relations
            for (int i = 0; i < 200000; i++)
            {
                // Cat
                Cat CatX = new Cat();
                CatX.Name = "Cat " + i.ToString();
                CatX.DOB = DateTime.Now.AddDays(i / 10);
                // Owner
                Owner OwnerX = new Owner();
                OwnerX.Name = "Owner " + i.ToString();
                OwnerX.DOB = DateTime.Now.AddDays(-i / 10);
                // Relationship "table"
                CatOwner CatOwnerXX = new CatOwner();
                CatOwnerXX.Cat = CatX;
                // Relations
                CatOwnerXX.Owner = OwnerX;
                CatX.CatOwner = CatOwnerXX;
                OwnerX.CatOwner = CatOwnerXX;
                // add to lists
                allCats.Add(CatX);
                allOwners.Add(OwnerX);
                allCatOwners.Add(CatOwnerXX);
            }
            // now I remove all the owners
            foreach (Cat CatToDelete in allCats)
            {
                Owner OwnerToRemove = CatToDelete.CatOwner.Owner;
                allOwners.Remove(OwnerToRemove);
            }
            // now all cats are free
            int numberOfCats = allCats.Count();
            int numberOfOwners = allOwners.Count();
            howLongWillItTake.Stop();
            long elapsedTime = howLongWillItTake.ElapsedMilliseconds;
            // give info to the view
            ViewBag.numberOfCats = numberOfCats;
            ViewBag.numberOfOwners = numberOfOwners;
            ViewBag.elapsedTime = elapsedTime;
            return View();
        }
    }
}

View:

<div class="row">
    <div class="col-md-12">
        <hr />
        <b>Results</b>
        <br />
        Cats: @ViewBag.numberOfCats
        <br />
        Owners: @ViewBag.numberOfOwners
        <br />
        ElapsedTime in milliseconds: @ViewBag.elapsedTime
        <hr />
    </div>
</div>

Case 2: Calculations are in the VIEW (pre-compiled)

Models: same as above

Controller:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Mvc;

namespace WebPageTest.Controllers
{
    public class HomeBisController : Controller
    {
        public ActionResult Index()
        {
            return View();
        }
    }
}

View:

@using System;
@using System.Collections.Generic;
@using System.Diagnostics;
@using System.Linq;
@using System.Web;
@using WebPageTest.Models;
@using System.Web.Mvc;
@{
    Stopwatch howLongWillItTake = new Stopwatch();
    howLongWillItTake.Start();
    List<Owner> allOwners = new List<Owner>();
    List<Cat> allCats = new List<Cat>();
    List<CatOwner> allCatOwners = new List<CatOwner>();
    // create lists with 200000 cats, 200000 owners, 200000 relations
    for (int i = 0; i < 200000; i++)
    {
        // Cat
        Cat CatX = new Cat();
        CatX.Name = "Cat " + i.ToString();
        CatX.DOB = DateTime.Now.AddDays(i / 10);
        // Owner
        Owner OwnerX = new Owner();
        OwnerX.Name = "Owner " + i.ToString();
        OwnerX.DOB = DateTime.Now.AddDays(-i / 10);
        // Relationship "table"
        CatOwner CatOwnerXX = new CatOwner();
        CatOwnerXX.Cat = CatX;
        // Relations
        CatOwnerXX.Owner = OwnerX;
        CatX.CatOwner = CatOwnerXX;
        OwnerX.CatOwner = CatOwnerXX;
        // add to lists
        allCats.Add(CatX);
        allOwners.Add(OwnerX);
        allCatOwners.Add(CatOwnerXX);
    }
    // now I remove all the owners
    foreach (Cat CatToDelete in allCats)
    {
        Owner OwnerToRemove = CatToDelete.CatOwner.Owner;
        allOwners.Remove(OwnerToRemove);
    }
    // now all cats are free
    int numberOfCats = allCats.Count();
    int numberOfOwners = allOwners.Count();
    howLongWillItTake.Stop();
    long elapsedTime = howLongWillItTake.ElapsedMilliseconds;
}
<div class="row">
    <div class="col-md-12">
        <hr />
        <b>Results</b>
        <br />
        Cats: @numberOfCats
        <br />
        Owners: @numberOfOwners
        <br />
        ElapsedTime in milliseconds: @elapsedTime
        <hr />
    </div>
</div>

How fast is classic ADO.net compared to Entity Framework?

Or maybe I should ask: how much slower is Entity Framework compared to ADO.Net?

By Entity Framework I mean Microsoft’s open-source package that lets you manage DB objects via strongly-typed classes and collections.

By ADO.Net I mean peeking into the DB using the old ADO objects: SqlConnection, SqlCommand, SqlParameter.

This is the little test (note that it is a very peculiar test, because in real life you will rarely insert, update and delete objects one by one: more massive operations are more likely):

– we create two tables, Books and Authors, related via an Author_Id column on the Books table;

– we insert 1,000 authors and 1,000 books;

– we update the 1,000 books with a new title (one by one);

– we delete the 1,000 books (one by one);

– the DB is SQL Server version 11, running on a quad-core i5 @ 1.9 GHz with Windows 8;

– the server is a Windows 8 machine with 8 GB RAM.

The code for Entity Framework?

Book Model

namespace FastEF.Models
{
    public class Book
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public Author Author { get; set; }
    }
}

Author Model

using System.Collections.Generic;

namespace FastEF.Models
{
    public class Author
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public string Address { get; set; }
        public ICollection<Book> Books { get; set; }
    }
}

DbContext

using System.Data.Entity;

namespace FastEF.Models
{
    public class DBCreator : DbContext
    {
        public DbSet<Book> Books { get; set; }
        public DbSet<Author> Authors { get; set; }
    }
}

Then, the controller action for the Entity Framework test, which:

– inserts 1,000 authors and 1,000 books related to the authors

– updates the 1,000 books

– deletes the 1,000 books


        public ActionResult EF()
        {
            // thisDB is the EF context, assumed to be declared on the controller, e.g.:
            // private DBCreator thisDB = new DBCreator();
            Book bookToCreate = new Book();
            Author authorToCreate = new Author();
            Stopwatch tellTime = new Stopwatch();
            long insertingTime = 0;
            long updatingTime = 0;
            long deletingTime = 0;
            List<int> generatedBookIds = new List<int>();

            // let us delete table contents
            try
            {
                var objCtx = ((System.Data.Entity.Infrastructure.IObjectContextAdapter)thisDB).ObjectContext;
                objCtx.ExecuteStoreCommand("DELETE FROM Books");
                objCtx.ExecuteStoreCommand("DELETE FROM Authors");

            }


            catch (Exception e)
            {
                // write exception. Maybe it's the first time we run this and have no tables
                Debug.Write("Error in truncating tables: {0}", e.Message);

            }

            // let us start the watch
            tellTime.Start();

            // INSERTING!
            // we create 1000 authors with name="John Doe nr: " + a GUID
            // and address ="5th Avenue nr: " + a GUID
            // we create a book called "The Chronicles of: " + a GUID and attach it to the author
            // we save the book, so the author is also automatically created

            for (int i = 0; i < 1000; i++)
            {

                // creating author
                authorToCreate = new Author();
                authorToCreate.Name = "John Doe nr. " + Guid.NewGuid();
                authorToCreate.Address = "5th Avenue nr. " + Guid.NewGuid();

                //creating book and linking it to the author
                bookToCreate = new Book();
                bookToCreate.Title = "The Chronicles of: " + Guid.NewGuid();
                bookToCreate.Author = authorToCreate;

                //saving the book. Automatically, the author is saved
                thisDB.Books.Add(bookToCreate);
                thisDB.SaveChanges();
                generatedBookIds.Add(bookToCreate.Id);
            }

            insertingTime = tellTime.ElapsedMilliseconds; // how did I do with inserting?

            tellTime.Restart(); // restart timer

            // We update the 1000 books by changing their title
            foreach (int bookId in generatedBookIds)
            {

                Book bookToUpdate = thisDB.Books.Find(bookId);
                bookToUpdate.Title = "New chronicles of: " + Guid.NewGuid();

                thisDB.SaveChanges();

            }

            updatingTime = tellTime.ElapsedMilliseconds; // how did I do with updating?
            tellTime.Restart(); // restart timer

            // We delete the 1000 books, one by one
            foreach (int bookId in generatedBookIds)
            {

                Book bookToDelete = thisDB.Books.Find(bookId);
                thisDB.Books.Remove(bookToDelete);
                thisDB.SaveChanges(); // without this, the deletes would never reach the DB

            }

            deletingTime = tellTime.ElapsedMilliseconds; // how did I do with deleting?
            tellTime.Stop(); // stop timer


            // printing the results
            string returnedMessage = "Results with Entity Framework 6.1: ";
            returnedMessage += "<br/>1000 Insert operations in ms.: " + insertingTime.ToString();
            returnedMessage += "<br/>1000 Update operations in ms.: " + updatingTime.ToString();
            returnedMessage += "<br/>1000 Delete operations in ms.: " + deletingTime.ToString();
            return Content(returnedMessage);
        }

The code for ADO.Net?

 public ActionResult SQLClient()
        {

            string insertAuthorSQL = "INSERT INTO Authors (Name, Address) VALUES (@name, @address)";
            string insertBookSQL = "INSERT INTO Books(Title, Author_Id) VALUES (@Title, @Author_Id)";
            string updateBookSQL = "UPDATE Books Set Title=@Title where Id=@Id";
            string deleteBookSQL = "DELETE Books where Id=@Id";

            Book bookToCreate = new Book();
            Author authorToCreate = new Author();
            Stopwatch tellTime = new Stopwatch();

            // SQL Objects we will use
            SqlConnection connAntiEF = new SqlConnection(WebConfigurationManager.ConnectionStrings["DefaultConnection"].ToString());
            SqlCommand cmdAntiEF = new SqlCommand();

            // Open Connection
            connAntiEF.Open();

            long insertingTime = 0;
            long updatingTime = 0;
            long deletingTime = 0;
            List<int> generatedBookIds = new List<int>();

            // let us delete table contents
            try
            {
                cmdAntiEF = new SqlCommand("DELETE FROM Books", connAntiEF);
                cmdAntiEF.ExecuteNonQuery();
                cmdAntiEF = new SqlCommand("DELETE FROM Authors", connAntiEF);
                cmdAntiEF.ExecuteNonQuery();
            }


            catch (Exception e)
            {
                // write exception. 
                Debug.Write("Error in truncating tables: {0}", e.Message);

            }

            // let us start the watch
            tellTime.Start();

            // INSERTING!
            // we create 1000 authors with name="John Doe nr: " + a GUID
            // and address ="5th Avenue nr: " + a GUID
            // we create a book called "The Chronicles of: " + a GUID and attach it to the author
            // (here the author is inserted first, then the book that references it)

            for (int i = 0; i < 1000; i++)
            {

                // creating author
                authorToCreate = new Author();
                authorToCreate.Name = "John Doe nr. " + Guid.NewGuid();
                authorToCreate.Address = "5th Avenue nr. " + Guid.NewGuid();

                //creating book and linking it to the author
                bookToCreate = new Book();
                bookToCreate.Title = "The Chronicles of: " + Guid.NewGuid();
                bookToCreate.Author = authorToCreate;

                // INSERT the author with SQL and get its Id


                SqlParameter parmName = new SqlParameter("Name", authorToCreate.Name);
                SqlParameter parmAddress = new SqlParameter("Address", authorToCreate.Address);
                cmdAntiEF.CommandText = insertAuthorSQL;
                cmdAntiEF.Parameters.Add(parmName);
                cmdAntiEF.Parameters.Add(parmAddress);
                cmdAntiEF.ExecuteNonQuery();

                cmdAntiEF.Parameters.Clear();
                cmdAntiEF.CommandText = "SELECT @@IDENTITY";

                int insertedAuthorID = Convert.ToInt32(cmdAntiEF.ExecuteScalar());

                // INSERT book with SQL and get its Id


                parmName = new SqlParameter("title", bookToCreate.Title);
                parmAddress = new SqlParameter("author_id", insertedAuthorID);

                cmdAntiEF.CommandText = insertBookSQL;
                cmdAntiEF.Parameters.Add(parmName);
                cmdAntiEF.Parameters.Add(parmAddress);
                cmdAntiEF.ExecuteNonQuery();

                // we need the book's Id to iterate through the Ids later
                cmdAntiEF.CommandText = "SELECT @@IDENTITY";
                int insertedBookID = Convert.ToInt32(cmdAntiEF.ExecuteScalar());
                generatedBookIds.Add(insertedBookID);


                parmName = null;
                parmAddress = null;
                cmdAntiEF.Parameters.Clear();

            }


            insertingTime = tellTime.ElapsedMilliseconds; // how did I do with inserting?

            tellTime.Restart(); // restart timer

            // We update 1000 books by changing their title
            cmdAntiEF.CommandText = updateBookSQL;
            foreach (int bookId in generatedBookIds)
            {

                //parameters are loaded with the book's new data
                SqlParameter parmTitle = new SqlParameter("Title", "New chronicles of: " + Guid.NewGuid());
                SqlParameter parmId = new SqlParameter("Id", bookId);
                cmdAntiEF.Parameters.Add(parmTitle);
                cmdAntiEF.Parameters.Add(parmId);

                cmdAntiEF.ExecuteNonQuery();
                parmTitle = null;
                cmdAntiEF.Parameters.Clear();

            }

            updatingTime = tellTime.ElapsedMilliseconds; // how did I do with updating?
            tellTime.Restart(); // restart timer

            // We delete 1000 books one by one
            cmdAntiEF.CommandText = deleteBookSQL;
            foreach (int bookId in generatedBookIds)
            {
                SqlParameter parmId = new SqlParameter("Id", bookId);
                cmdAntiEF.Parameters.Add(parmId);
                cmdAntiEF.ExecuteNonQuery();
                parmId = null;
                cmdAntiEF.Parameters.Clear();
            }

            connAntiEF.Close();

            deletingTime = tellTime.ElapsedMilliseconds; // how did I do with deleting?
            tellTime.Stop(); // stop timer

            // printing the results
            string returnedMessage = "Results with SQL Connection: ";
            returnedMessage += "<br/>1000 Insert operations in ms.: " + insertingTime.ToString();
            returnedMessage += "<br/>1000 Update operations in ms.: " + updatingTime.ToString();
            returnedMessage += "<br/>1000 Delete operations in ms.: " + deletingTime.ToString();
            return Content(returnedMessage);
        }

How did they do?

Entity Framework

Results with Entity Framework 6.1:
1000 Insert operations in ms.: 11355
1000 Update operations in ms.: 20833
1000 Delete operations in ms.: 18117


Adding, updating, deleting 1000 SQL Server objects via EF

CPU average use: 35%

Memory average use: 65%

ADO.Net

Results with SQL Connection:
1000 Insert operations in ms.: 921
1000 Update operations in ms.: 309
1000 Delete operations in ms.: 311


Inserting, updating, deleting SQL Server objects via ADO.Net

How should we interpret the results?

The two numbers are not directly comparable, because using EF means working with strongly-typed objects and change tracking rather than untyped records.

So, I keep on thinking ORMs are the way to go.

However, if one day I were asked to speed up the data-access parts of a slow application, I would know where to look for possible improvements.
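For instance, in the EF test above, much of the time likely goes into calling SaveChanges once per entity. A first thing to try would be batching the work: add everything in memory, then hit the DB once (a sketch, reusing the models above):

// Add all 1000 books (and, through the relation, their authors) in memory first...
for (int i = 0; i < 1000; i++)
{
    thisDB.Books.Add(new Book
    {
        Title = "The Chronicles of: " + Guid.NewGuid(),
        Author = new Author
        {
            Name = "John Doe nr. " + Guid.NewGuid(),
            Address = "5th Avenue nr. " + Guid.NewGuid()
        }
    });
}
// ...then call SaveChanges once: a single change-detection pass and a single
// transaction, instead of one per book.
thisDB.SaveChanges();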