Monday 20 October 2014

Classic developer mistakes that make applications slow down

So you write a system, or inherit one from a previous incumbent who has long since fled, and things are doing fine. The system runs, and there are a few small issues but nothing serious. But as the system beds in and ages, things start to slow down. The test system is fine, but the live system gets slower and slower until it starts to break. Managers are in a rage, users are muttering, and you are pulling your hair out. Servers get upgraded and things are fine again, for a while. Then things start to slow once more, and you begin a cycle of server upgrades.

Below is a list of common issues that make code go slow.


VLO: Very Large Objects

VLOs are quite common in my experience: objects (in C#) that are thousands of lines long, full of unused data and oodles of business logic. Keep your data away from your business logic and keep your movable objects light. Serializing a VLO with lots of embedded objects, state and logic, and passing it between layers, is just daft.

See a previous post on Light Weight Object Pattern
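As a minimal sketch of the idea (the class names here are invented for illustration, not from the post above): the heavy entity stays where it is loaded, and a slim summary object is what actually travels between layers.

```csharp
using System.Collections.Generic;

// The VLO smell: data, large blobs and business logic all in one class
// that gets serialized and passed between layers.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<string> OrderHistory { get; set; } = new List<string>();
    public byte[] ScannedContract { get; set; }      // big blob dragged everywhere

    public decimal CalculateLoyaltyDiscount() => 0m; // logic living in the object
}

// The lightweight object: only the fields the next layer actually needs,
// cheap to serialize and free of behaviour.
public class CustomerSummary
{
    public int Id { get; set; }
    public string Name { get; set; }

    public static CustomerSummary From(Customer c) =>
        new CustomerSummary { Id = c.Id, Name = c.Name };
}
```

The summary carries a couple of fields instead of order history, scanned documents and discount logic, so serializing it between layers costs next to nothing.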


Reporting On Live Systems

Managers like reports, and quite a few like the same report every 15 minutes, as if this will improve their team's performance or ease their paranoia. Having 100 people run the same report at 9am every morning is also not good, as your user base is trying to sign on at the same time. Replicating the database and feeding reporting from the replicated data is a good start; for very common reports, email them out once daily, or generate them once and place them behind a link for users. A poor reporting setup can kill your application.


Silly Loops

Look at the GetCustomer method below.

private Customer GetCustomer(int customerId)
{
   Customer myCustomer = null;

   var allCustomers = dalLayer.GetCustomers();

   foreach (var customer in allCustomers)
   {
      if (customer.Id == customerId)
         myCustomer = customer;
         // or
         // return customer;
   }

   return myCustomer;
}

You might very well laugh, but I've seen this in a few systems in a few different combinations. For every customer lookup you cycle through all customers until you find the one you are looking for. This problem does not go away with LINQ and lambda expressions; it just gets hidden a bit better. This also applies to internal collections, big and small.
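A corrected sketch (the repository shape here is hypothetical): either ask the data layer for the single customer you want, or, if the collection really must live in memory, index it once in a dictionary so repeated lookups stop being linear scans.

```csharp
using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerService
{
    private readonly List<Customer> allCustomers;      // stand-in for the data layer
    private Dictionary<int, Customer> customersById;   // built once, reused

    public CustomerService(List<Customer> allCustomers) =>
        this.allCustomers = allCustomers;

    // Against a real database, prefer a targeted fetch instead
    // (SELECT ... WHERE Id = @id) so only one row ever comes back.
    public Customer GetCustomer(int customerId)
    {
        if (customersById == null)
            customersById = allCustomers.ToDictionary(c => c.Id);  // O(n), once

        customersById.TryGetValue(customerId, out var customer);   // O(1) per call
        return customer;
    }
}
```

The dictionary is built on first use and every lookup after that is constant time, instead of walking the whole list per customer.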

Not Fetching Efficiently From Storage

This is getting much more common with the advent of ORMs (Castle, Entity Framework etc.), where data fetches are inefficient; developers are now so separated from the data layer that they don't understand the power of the database. Code First development also has a part to play here, as it further separates the developer from storage and encourages the belief that storage is something they don't have to worry about. But in older systems running classic ADO code this is still a problem, where badly formed queries or badly designed data can make fetches slow.

Go back to old-fashioned normalization of your data layer (it exists for good reason) and look at the queries run on the database. That means examining the queries generated by the ORM to ensure they run as expected and produce the SQL you'd expect.
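To see the shape of the problem, here is a sketch (the repository and its row counter are invented for illustration): the first call drags every row across before filtering, the second pushes the filter down to storage. With an ORM, the equivalent check is making sure your Where clause ends up in the generated SQL rather than running in memory after a ToList().

```csharp
using System.Collections.Generic;
using System.Linq;

public class Invoice
{
    public int Id { get; set; }
    public bool Overdue { get; set; }
}

// Fake repository that counts rows served, standing in for the database.
public class InvoiceRepository
{
    public int RowsServed;

    private readonly List<Invoice> table =
        Enumerable.Range(1, 1000)
                  .Select(i => new Invoice { Id = i, Overdue = i % 100 == 0 })
                  .ToList();

    // Inefficient shape: hand back every row and let the caller filter.
    public List<Invoice> GetAllInvoices()
    {
        RowsServed += table.Count;
        return table.ToList();
    }

    // Efficient shape: filter at the source (SELECT ... WHERE Overdue = 1).
    public List<Invoice> GetOverdueInvoices()
    {
        var result = table.Where(i => i.Overdue).ToList();
        RowsServed += result.Count;
        return result;
    }
}
```

Both shapes return the same ten overdue invoices, but one pulls a thousand rows over the wire to do it.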


Use the Power of your Database

Back when I was a lad (.NET 1.1 to 2.0 days) a lot of business logic was driven by the database; at the time this was an acceptable method and strongly encouraged. Now it is out of favour and the database is seen as storage only. I had a discussion with a developer at the time who was having performance problems with a system that processed a large amount of data via objects. He'd load thousands of objects from the database into code, process them, and save them back to the database. The issue was that it was very slow, and the data volumes were only going to get larger over time. I suggested he process the data on the database server rather than in code. He tried this and was shocked at how much things sped up, as the database could crunch his data a lot quicker than the code version.

Databases are very good at processing large amounts of data quickly and efficiently; they are, after all, designed for this task. So consider taking your data-processing pain points and seeing if they could be done on the database.
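The anecdote above, in miniature (the Account class, schema and rate are invented for illustration): the row-by-row version materialises every object in code, while the set-based version is a single statement the server runs over every row in one pass.

```csharp
using System.Collections.Generic;

public class Account
{
    public int Id { get; set; }
    public decimal Balance { get; set; }
}

public static class InterestRun
{
    // Row-by-row in code: load N objects, loop over them, save N objects back.
    // In real life each step is a round trip to the database.
    public static void ApplyInterestInCode(List<Account> accounts, decimal rate)
    {
        foreach (var account in accounts)
            account.Balance += account.Balance * rate;
    }

    // Set-based on the server: one statement, no objects materialised.
    // (Parameterised; @rate is supplied via your data-access layer.)
    public const string ApplyInterestSql =
        "UPDATE Accounts SET Balance = Balance * (1 + @rate)";
}
```

For thousands of rows the set-based statement usually wins by a wide margin, because the work never leaves the database engine.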

I've heard arguments that all applications should be database agnostic and that "we could switch the layer at any time"; this is very noble and sounds great, but unless you are selling a product targeting multiple databases, the database layer almost never changes. It does get switched, due to cost or performance, but this is quite rare in my experience.

Fail To Cache Common Items

This is likely one of the most common performance issues, and it is in almost every system.

Take, for example, titles on forms (Mr, Miss, Mrs etc.): these are usually stored in a database, are used on different forms and appear in almost every transaction. It is very common that every time the Title collection (i.e. from a database table) is used, it is loaded from the database. Other similar collections (Country, Area, City, Occupation etc.) can also add unwanted load.

If you have common lists/collections, it is better to cache them at application level to reduce the toll on the database and the calling layers.
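One simple way to do this (a sketch; the database call here is a stand-in that also counts loads, so you can see the cache working) is a lazily initialised static, which the framework guarantees is loaded once, thread-safely, on first use:

```csharp
using System;
using System.Collections.Generic;

public static class TitleCache
{
    // Loaded once per application, thread-safely, on first access.
    private static readonly Lazy<IReadOnlyList<string>> titles =
        new Lazy<IReadOnlyList<string>>(LoadTitlesFromDatabase);

    public static IReadOnlyList<string> Titles => titles.Value;

    // Counts loads so the caching behaviour is visible; in real code
    // this method would query the Titles table.
    public static int DatabaseLoads;

    private static IReadOnlyList<string> LoadTitlesFromDatabase()
    {
        DatabaseLoads++;
        return new List<string> { "Mr", "Mrs", "Miss", "Ms", "Dr" };
    }
}
```

In a web application you might prefer MemoryCache with an expiry instead, so reference data can still refresh without an application restart.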




tbc..   I'll add more to this later








