How I Reduced the Load Time of My Application

Introduction

Let me tell you a story about how performance problems can surface late in development, and why it pays to be prepared for them much earlier.

I was developing an information system for a metal manufacturing company, and the application performed just fine on my local computer. Let’s be honest, I am not some kind of performance guru, and I was not paying much attention to the shape of my queries.

Everything looked good and tested, and because we had already missed the deadline, my project manager decided that it was time for data migration. Let me be clear: we weren’t working on a brand new application, but on an upgrade of an old one. That explains why we had data ready to migrate into the new database.

Data migration

Work began. It took us a few days to prepare a “bridge” between the old database and the new one. Long story short, the data transfer was successful.

Now it was time to turn on the app and try the real user experience. Honestly, I was looking forward to it, because it was my first contact with the real world after a year of work. The application booted up, and I just wanted to scream “It’s alive!” like Dr. Frankenstein.


Struggle

But then it began. We tried to open one of the main module windows, and the application froze for a minute or even more. I thought: “Okay, one window with a long load time. I’ll fix it. No big deal.” Then my colleague tried to open other module windows, and we noticed more and more loading-time problems. We decided it was time to look at the amount of data the application was working with.

The data did not look especially big, so we turned on SQL Profiler and started tracking the communication between the application and the database server. All queries looked small and fast. I didn’t understand what was going on. But then we spotted the real problem: my queries were fast, but there were far too many of them, and the Trace window of Profiler just grew and grew.


It was time for a deep dive into the code to find out what was going on. While analyzing the trace file, I kept thinking about what could cause such behavior, and it seemed like it had to be something in a for loop. While debugging the code, I found the problem. It turns out that issuing even a small query in every iteration of a for loop that runs more than 1,000 times is a big deal: at a few milliseconds per database round trip, those iterations add up to seconds of pure latency. Now we had to come up with a better way.

public async Task Process(IEnumerable<int> ids)
{
    foreach (var id in ids)
    {
        // One database round trip per iteration:
        // with 1,000+ ids, that means 1,000+ separate queries.
        var entity = await repository.GetById(id);
        DoMoreStuffWithEntity(entity);
    }
}

As you can see, in the body of the loop I request a single record from the table by its ID in every iteration. This had to change. But how?

Consultation

I went to my colleague’s office to discuss the situation, and he came up with a really nice and clear solution.

“Why don’t you load all the data you need into memory first, and then fetch each record by its ID from the in-memory collection?”

Yes, it was a lack of experience on my part. After a few changes, the code looked different.

public async Task Process(IEnumerable<int> ids)
{
    // A single round trip: fetch all the required records in one query.
    var entities = await repository.GetByIds(ids);
    foreach (var entity in entities)
    {
        DoMoreStuffWithEntity(entity);
    }
}
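The article never shows how GetByIds is implemented, so here is a minimal sketch of one possible version. It assumes Entity Framework Core, and the entity and context types are hypothetical; the key point is that a Contains call over a list translates to a single SELECT … WHERE Id IN (…) query.

using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Hypothetical entity and context; the original post doesn't show them.
public class MyEntity
{
    public int Id { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<MyEntity> MyEntities => Set<MyEntity>();
}

public class EntityRepository
{
    private readonly AppDbContext context;

    public EntityRepository(AppDbContext context) => this.context = context;

    // One SELECT ... WHERE Id IN (...) instead of one SELECT per id.
    public async Task<List<MyEntity>> GetByIds(IEnumerable<int> ids)
    {
        var idList = ids.ToList();
        return await context.MyEntities
            .Where(e => idList.Contains(e.Id))
            .ToListAsync();
    }
}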

I made the changes, started the application in debug mode, and turned on tracking in SQL Profiler. Now we could see one big query that finished in milliseconds, and the application’s response was instantaneous.
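Incidentally, when the processing needs random access by ID rather than simple iteration, my colleague’s suggestion can be followed literally: one query, then lookups from an in-memory dictionary. A sketch in the same style as the snippets above, reusing the assumed GetByIds:

public async Task Process(IEnumerable<int> ids)
{
    var idList = ids.ToList();

    // Still a single database round trip.
    var entities = await repository.GetByIds(idList);

    // Index the results in memory for O(1) lookup by ID.
    var byId = entities.ToDictionary(e => e.Id);

    foreach (var id in idList)
    {
        // TryGetValue guards against ids with no matching row.
        if (byId.TryGetValue(id, out var entity))
        {
            DoMoreStuffWithEntity(entity);
        }
    }
}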

Conclusion

Great. We managed to solve that problem really quickly, but it turned out that I had been making the same mistake over and over again in my algorithms. It took me maybe a whole week to make all the necessary changes, and I couldn’t use the same recipe everywhere, but where it applies, it is a game-changer.

Every now and then, give yourself a few minutes, look at the behavior of your application through your favorite monitoring tools, and ask yourself one question: how is it going to behave with bigger data than your few dummy records?
