.NET generics performance

Thu, Oct 26, 2006 3-minute read

You may wonder if the nice features of the generic List come at a cost in performance.

I wondered the same, so I decided to make a small and not very thorough test.

            ArrayList items = new ArrayList();
            for (int x = 0; x < 1000000;x++ )
            {
                items.Add("string" + x.ToString());
            }

            List<string> list = new List<string>();
            for (int x = 0; x < 1000000; x++)
            {
                list.Add("string" + x.ToString());
            }

            DateTime before = DateTime.Now;
            ArrayList wanted = new ArrayList();
            foreach (string str in items)
            {
                if (str.ToLower().EndsWith("0"))
                {
                    wanted.Add(str);
                }
            }
            DateTime after = DateTime.Now;
            Console.WriteLine("Time elapsed: {0} milliseconds", after.Subtract(before).TotalMilliseconds);

            before = DateTime.Now;
            List<string> wanteds = list.FindAll(delegate(string str) { return str.ToLower().EndsWith("0"); });
            after = DateTime.Now;
            Console.WriteLine("Time elapsed: {0} milliseconds", after.Subtract(before).TotalMilliseconds);
            Console.WriteLine("Press enter to continue");
            Console.ReadLine();

As you can see, I try to do the same thing twice. The first time, I loop through an ArrayList with 1,000,000 strings in it to find those that end in a 0. The strings that match are added to a new ArrayList.

The second time I do the same, only this time I am using the generic List and its FindAll method.

The performance difference is not great, but it is there, and surprisingly it is in favour of the generic List.

On my computer, the first example gives the output:
Time elapsed: 890,625 milliseconds

and the second one:
Time elapsed: 765,625 milliseconds

which is approximately a 15% increase in performance, but that is only with one million items in the lists.
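A caveat about the numbers above: DateTime.Now has a fairly coarse resolution (often 10-15 milliseconds), which is why the small runs show 0. System.Diagnostics.Stopwatch, also available in .NET 2.0, uses a high-resolution timer where the hardware supports one. A sketch of the same measurement using Stopwatch (my suggestion, not what the original test used):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class StopwatchDemo
{
    static void Main()
    {
        List<string> list = new List<string>();
        for (int x = 0; x < 1000000; x++)
        {
            list.Add("string" + x.ToString());
        }

        // Stopwatch gives sub-millisecond resolution on most machines,
        // unlike DateTime.Now.
        Stopwatch sw = Stopwatch.StartNew();
        List<string> wanted = list.FindAll(delegate(string str) { return str.ToLower().EndsWith("0"); });
        sw.Stop();

        Console.WriteLine("Time elapsed: {0} milliseconds, found {1} items",
            sw.Elapsed.TotalMilliseconds, wanted.Count);
    }
}
```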

I summed it up in the following table. I could not go above 1 million items on my computer, since I only have 2 GB of RAM :)

And it seems like the 15% speed increase continues. 

Number of items  ArrayList  List<string>
1000             0          0
10000            15,625     0,000
100000           93,750     78,125
1000000          890,625    765,625

But what happens if the objects are a bit more complex than just strings?
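The Servant class and the SexFlag enum are not shown in the post; a minimal version with just the members the code below uses might look like this (public fields keep the sketch short; properties would work just as well):

```csharp
public enum SexFlag
{
    Female,
    Male
}

public class Servant
{
    // Only the members the benchmark below touches.
    public string Name;
    public int Age;
    public SexFlag Sex;
}
```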

            int count = 2000000;
            ArrayList items = new ArrayList(count);
            for (int x = 0; x < count; x++)
            {
                Servant s = new Servant();
                s.Name = "Jane"+x.ToString();
                s.Age = x % 2 == 0 ? 19:18;
                s.Sex = x % 2 == 0 ? SexFlag.Female : SexFlag.Male;
                items.Add(s);
            }

            List<Servant> list = new List<Servant>(count);
            for (int x = 0; x < count; x++)
            {
                Servant s = new Servant();
                s.Name = "Jane" + x.ToString();
                s.Age = x % 2 == 0 ? 19:18;
                s.Sex = x % 2 == 0 ? SexFlag.Female : SexFlag.Male;
                list.Add(s);
            }

            DateTime before = DateTime.Now;
            ArrayList wanted = new ArrayList();
            foreach (Servant s in items)
            {
                if (s.Sex == SexFlag.Female && s.Age > 18 && s.Age < 25 && s.Name.ToLower().EndsWith("0"))
                {
                    wanted.Add(s);
                }

            }
            DateTime after = DateTime.Now;
            Console.WriteLine("Time elapsed ArrayList: {0} milliseconds, found {1} items", after.Subtract(before).TotalMilliseconds, wanted.Count);

            before = DateTime.Now;
            List<Servant> wanteds = list.FindAll(delegate(Servant s) { return s.Sex == SexFlag.Female && s.Age > 18 && s.Age < 25 && s.Name.ToLower().EndsWith("0"); });
            after = DateTime.Now;
            Console.WriteLine("Time elapsed List<Servant>: {0} milliseconds, found {1} items", after.Subtract(before).TotalMilliseconds, wanteds.Count);

And the results:

Number of items  ArrayList  List<Servant>
1000             0,000      0,000
10000            0,000      0,000
100000           46,875     46,875
1000000          484,375    390,625
2000000          968,750    812,500

That is all nice; it seems like generics have an advantage over the plain old way of doing things. But what about CPU usage? How does that compare?

I ran the test with the more complex objects a few times and monitored the CPU usage. My result was simple: they don't seem to differ much. But then again, my testing tool is my eyes, so I could be greatly wrong :)

But considering my tests, it seems like a safe assumption that generics have no negative performance impact whatsoever; on the contrary, they give a performance boost when your lists are very large.
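One more thing worth noting (my own observation, not part of the tests above): the elements here are strings and Servant objects, which are reference types, so ArrayList stores them without boxing. With value types such as int, ArrayList has to box every element into an object and unbox it again on the way out, while List<int> stores them directly, so the gap can be much larger there. A minimal sketch of that case:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;

class BoxingDemo
{
    static void Main()
    {
        int count = 1000000;

        Stopwatch sw = Stopwatch.StartNew();
        ArrayList boxed = new ArrayList(count);
        for (int x = 0; x < count; x++)
        {
            boxed.Add(x);          // each int is boxed to object here
        }
        long sum1 = 0;
        foreach (int i in boxed)   // and unboxed again here
        {
            sum1 += i;
        }
        sw.Stop();
        Console.WriteLine("ArrayList: {0} milliseconds", sw.Elapsed.TotalMilliseconds);

        sw = Stopwatch.StartNew();
        List<int> plain = new List<int>(count);
        for (int x = 0; x < count; x++)
        {
            plain.Add(x);          // stored directly, no boxing
        }
        long sum2 = 0;
        foreach (int i in plain)
        {
            sum2 += i;
        }
        sw.Stop();
        Console.WriteLine("List<int>: {0} milliseconds", sw.Elapsed.TotalMilliseconds);
    }
}
```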