Monday, 20 April 2015

Telerik RadSpell Stop Spell Dialogue from Moving

Problem

This control (and others in the Telerik control set) has a quirk: on dialogue drag it enables all other iframes. So if you have a hidden iframe or dialogue, it will pop up as visible.

What it should really do is only enable items that were not hidden before the drag started.

The drag is also not constrained, so the dialogue can be dragged right off your application.



Stop the Drag

To stop the drag, disable the dialogue's title bar:


  // Disable the RadWindow title bar so the dialogue can no longer be dragged
  $('.rwTitlebar').prop("disabled", true);




Monday, 23 February 2015

Raspberry Pi Snappy Ubuntu Username and Password


OK, what is it???


I'm sure it's not a big secret but it seems that way when you look for it.

Username : ubuntu
Password : ubuntu


It is not pi / raspberry; that's for Raspbian.

I'm really only waiting for the Windows 10 core to be released....

Monday, 20 October 2014

Classic Developer mistakes that make applications slow down

So you write a system, or inherit one from a previous incumbent who has long since fled, and things are doing fine. The system runs, there are a few small issues but nothing serious; but as the system beds in and ages, things start to slow down. The test system is fine but the live system gets slower and slower until it starts to break. Managers are in a rage, users are muttering and you are pulling your hair out. Servers get upgraded and things are fine again, for a while. Then things start to slow once more and you enter a cycle of server upgrades.

Below is a list of common issues that make code go slow.


VLO :  Very Large Objects

VLOs are quite common in my experience: objects (C# classes) can be thousands of lines long, carrying all sorts of unused data and oodles of business logic. Keep your data away from your business logic and keep your movable objects light. Serializing a VLO with lots of embedded objects, state and logic and passing it between layers is just daft.

See a previous post on Light Weight Object Pattern


Reporting On Live Systems

Managers like reports, and quite a few like the same reports every 15 minutes, as if this will increase their team's performance or ease their paranoia. Having 100 people run the same report every morning at 9am, just as your user base is trying to sign on, is also not good. Replicating the database and feeding reports from the replicated data is a good start; for very common reports, email them out once daily or place them behind a download link for users. A poor reporting setup can kill your application.


Silly Loops

Look at the GetCustomer method below.

private Customer GetCustomer(int customerid)
{
    Customer mycustomer = null;

    // Fetches EVERY customer just to find one
    var getallcustomers = dallayer.getcustomers();

    foreach (var customer in getallcustomers)
    {
        if (customer.id == customerid)
            mycustomer = customer;
            // or
            // return customer;
    }

    return mycustomer;
}

You might very well laugh but I've seen this in a few systems in a few different combinations. So for every customer you cycle through all customers until you find the one you are looking for. This problem does not go away with LINQ and lambda expressions, it just gets hidden a bit better. This also applies to internal collections big and small.
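The fix is either to ask the data layer for the single record you need, or, where repeated in-memory lookups are unavoidable, to index the collection once and look up by key. A minimal sketch of the latter (the Customer class here is a cut-down illustration, not the system's real one):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerLookup
{
    // Build the index once: O(n). Every lookup afterwards is O(1),
    // instead of another full scan per customer.
    public static Dictionary<int, Customer> BuildIndex(IEnumerable<Customer> customers)
        => customers.ToDictionary(c => c.Id);
}
```

Building the dictionary is a one-off cost; after that, `index[customerid]` replaces the whole foreach loop.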

Not Fetching Efficiently From Storage

This is getting much more common with the advent of ORMs (Castle, Entity Framework etc.) where data fetches are inefficient; developers are now so separated from the data layer that they don't understand the power of the database. Code First development also has a part to play here, as it further separates storage from the developer, who starts to believe that storage is something they don't have to worry about. But in older systems running classic ADO code this is still a problem, where badly formed queries or badly designed data can make fetches slow.

Go back to old-fashioned normalization of your data layer, it was used for good reason, and look at the queries run on the database. That means inspecting the queries generated by the ORM to ensure they run as expected and produce the SQL you'd expect.
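As a hedged illustration of how the shape of a LINQ query decides what the ORM fetches (all names here are hypothetical; an ORM's `db.Customers` would be the `IQueryable<Customer>` passed in):

```csharp
using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public bool Active { get; set; }
    public string Email { get; set; }
}

public static class CustomerQueries
{
    // Wasteful: ToList() materialises every entity first, so the whole
    // table crosses the wire and the filtering happens in memory.
    public static List<string> EmailsWasteful(IQueryable<Customer> customers)
        => customers.ToList().Where(c => c.Active).Select(c => c.Email).ToList();

    // Better: Where/Select stay on IQueryable, so an ORM provider can
    // translate them into roughly "SELECT Email FROM Customers WHERE Active = 1"
    // and only the needed column for matching rows is fetched.
    public static List<string> EmailsEfficient(IQueryable<Customer> customers)
        => customers.Where(c => c.Active).Select(c => c.Email).ToList();
}
```

Both return the same result; only the generated SQL (and the load on the database) differs.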


Use the Power of your Database

Back when I was a lad (.NET 1.1~2.0 days) a lot of business logic was driven by the database; at the time this was an acceptable method and strongly encouraged. Now it is out of favour and the database is seen as storage only. I had a discussion with a developer at the time who was having performance problems with a system processing a large amount of data via objects. He'd load thousands of objects from the database into code, process the objects and save them back to the database. The issue was it was very slow, and the data volumes were only going to get larger over time. I suggested that he process the data on the database server rather than in code. He tried this and was shocked at how much things sped up, as the database could crunch his data a lot quicker than the code version.

Databases are very good at processing large amounts of data quickly and efficiently; they are, after all, designed for this task. So consider taking your data-processing pain points and seeing if they could be done on the database.
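A sketch of the idea (the Accounts table, column names and connection string are all hypothetical, and this assumes SQL Server via classic ADO.NET): instead of loading thousands of rows into objects, changing them and saving each one back, issue one set-based statement and let the engine do the crunching.

```csharp
using System.Data.SqlClient;

public static class InterestRoller
{
    public static int ApplyInterest(string connectionString, decimal rate)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "UPDATE Accounts SET Balance = Balance * (1 + @rate)", conn))
        {
            cmd.Parameters.AddWithValue("@rate", rate);
            conn.Open();
            // One round trip; the rows are crunched inside the database engine
            // instead of being loaded, modified and saved one object at a time.
            return cmd.ExecuteNonQuery();
        }
    }
}
```

The same applies to aggregations and bulk deletes: one statement on the server almost always beats an object-at-a-time loop in code.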

I've heard arguments that all applications should be database agnostic and that "we could switch the layer at any time"; this is very noble and sounds great, but unless you are selling a product targeting multiple databases, the database layer almost never changes. It does get switched due to cost or performance, but this is quite rare in my experience.

Fail To Cache Common Items

This is likely one of the most common performance issues and is in almost every system.

Take for example Titles for forms (Mr, Miss, Mrs etc.); these are usually stored in a database, are used on different forms and feature in almost every transaction. It is very common that every time the Title collection is needed it is loaded from the database again. Other similar collections (Country, Area, City, Occupation etc.) can also add unwanted load.

If you have common lists/collections it is better to cache them at application level to reduce the toll on the database and calling layers.
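A minimal sketch of application-level caching for such lists, using Lazy&lt;T&gt; so the load happens exactly once, on first use, even under concurrent access. LoadTitlesFromDatabase is a stand-in for the real DAL call:

```csharp
using System;
using System.Collections.Generic;

public static class ReferenceData
{
    // Lazy<T> is thread-safe by default: the factory runs once for the
    // whole application, on the first access to Titles.
    private static readonly Lazy<IReadOnlyList<string>> _titles =
        new Lazy<IReadOnlyList<string>>(LoadTitlesFromDatabase);

    public static IReadOnlyList<string> Titles => _titles.Value;

    private static IReadOnlyList<string> LoadTitlesFromDatabase()
    {
        // Real code would hit the database here; hardcoded for the sketch.
        return new[] { "Mr", "Mrs", "Miss", "Ms", "Dr" };
    }
}
```

Every form then reads `ReferenceData.Titles` and the database is hit once per application lifetime rather than once per transaction.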




tbc..   I'll add more to this later









Design Pattern : Light Weight Object

Introduction

This design pattern is one I use a lot and have done so for a long time; it might be similar to other patterns, and if so please tell me so I can reference them. Although this may look like "yeah.. how else would you do this", take a look at your objects for passing data and see if they contain bloat that could be reduced.


Description

The Light Weight Object design pattern takes an object and reduces it to the absolute minimum for its intended purpose, to increase speed and performance. This means you only pass the bare object essentials rather than an object bloated with extra data and functionality that you might need (but most of the time don't). Passing large objects around your application can be bad for performance; if you are passing hundreds or even thousands of these between layers, or they are subject to serialization, the effect can be very adverse. The bloat comes about by design, by function creep, or a mix of both.

Design

Large objects by design are very common, as all the functionality and methods supporting the data (properties) are kept in a single place, and it's fairly easy to create large, function-rich objects that serve your business requirements well. Large objects are self-descriptive, look impressive and have all the functionality required over the object life-cycle.

Function Creep

You start with an object; business or functionality changes come in and additional code is added. After a couple of major changes the object starts to bloat: functions are added to make the object easier for callers to use, and old methods are often not removed. The object bloats further and further as new functions and products are added to the application.

Classic Design 

One thing I've seen a lot in my development career is large objects, really large objects: C# classes with hundreds and even thousands of lines which carry a lot of data and internal functionality.

Object - 1,000+ lines
    |______   Properties
    |______   Methods
                          |______ Data Layer
                          |______ Serialization
                          |______ Business Logic
                          |______ Other Stuff...
         
   

I've seen large objects like the one above with a mix, or all, of the items mentioned (and some with even more than you could imagine). This is extra baggage carried and supported by the system: functionality that might never be used, or is only used at initialization or rarely.


Light Object Pattern

Keep the object you are passing around as small as possible and use service classes or layers to get data or perform functions as you need it.

Light Object --  50 lines

Data Layer  -- 300 lines
Business Logic -- 300 lines
Other Stuff --    nnn lines


Rules


  • Keep the object as small as possible
  • Remove methods to service layers
  • Only have a single use for the object (split if dual functions)
  • Remove unused  properties


When To Use

Use this when you need to keep objects small and light (you really should be doing this by default as good practice), when they are going to be placed in collections, or used in large volumes. If you are holding very large volumes of data in memory this can make a huge difference to your capacity and the performance of your systems. Although, know your scale: will I have 100 users or 1 million users on this system?

Pros


  • Lots of objects take up less memory
  • Quicker processing
  • Better OOP :  encapsulation, single use,
  • Better Testing 


Cons


  • More complex code
  • More re-factoring required
  • Objects are not as self descriptive
  • Developer has to know how to process the object


Example


Bob has a user object, shown below, which is created at login and held in session with the user. This is fairly standard and not as bloated as the very large objects we think of. Bob has noticed that each user only takes up a few KB (kilobytes) of memory; that is not really much per user, but the number of users is starting to hurt.

The user object

public class User
{
    public string SessionId { get; set; }
    public int UserId { get; set; }
    public string UserName { get; set; }
    public string FullName { get; set; }
    public string Email { get; set; }
    public string Password { get; set; }
    public Address CurrentAddress { get; set; }
    public DateTime LastLogin { get; set; }
    public DateTime DateJoined { get; set; }
    public bool AccountActive { get; set; }
    public bool LoggedIn { get; set; }
    public Address[] PreviousAddress { get; set; }
    public string SecretQuestion { get; set; }
    public string SecretAnswer { get; set; }
    public void Log(string message) { /* ... */ }
    public void Dispose() { /* ... */ }
    public bool Validate() { /* ... */ }
    public void Logout() { /* ... */ }
    private Log _log;
}


This looks like a standard object, not too big, with a few standard methods you would expect.

But Bob has a problem: his user base is getting close to 250K users and his cloud bill seems to be going up a lot faster than the users on the system. So Bob starts with the user object, which is held in session for every user.

What can go or stay and why
  1. SessionId                 delete: copy of the system session field
  2. UserId                    keep: used in code
  3. UserName                  keep: used for display
  4. FullName                  delete: used rarely
  5. Email                     delete: used rarely
  6. Password                  delete: not required in this object
  7. CurrentAddress            delete: used rarely
  8. LastLogin                 delete: not used
  9. DateJoined                delete: not used
  10. AccountActive            delete: if not active then this object can't exist in session
  11. LoggedIn                 delete: if not logged in then the object is not in session
  12. PreviousAddress          delete: never used
  13. SecretQuestion           delete: never used after login, only on lost password
  14. SecretAnswer             delete: never used after login, only on lost password
  15. Method : Log             move to service class
  16. Method : Dispose         move to service class; can be handled by session end or logout
  17. Method : Validate        move to service class
  18. Method : Logout          move to service class
  19. Method : Other           move to service class
  20. Object : Log             remove embedded logging object (NLog, log4net etc.)

As can be seen you can reduce the user object for session storage to something like this

public class LightUser
{
    public int UserId { get; set; }
    public string UserName { get; set; }
}


Although this does not represent the full user object/details, it covers what the user object is actually used for in 99% of cases.

Now Bob can squeeze more session user objects into the same memory space. He may have to make more calls to the service layers, but this costs a lot less than storing and servicing the larger object footprint.
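A sketch of the service-layer half of the pattern (all names are illustrative, and LightUser is redeclared here only to keep the sample self-contained): the session keeps only the LightUser, and a service class fetches the rarely-needed details on demand. The dictionary stands in for the database.

```csharp
using System;
using System.Collections.Generic;

public class LightUser
{
    public int UserId { get; set; }
    public string UserName { get; set; }
}

public class UserDetails
{
    public int UserId { get; set; }
    public string FullName { get; set; }
    public string Email { get; set; }
}

public class UserService
{
    // Stand-in for the database; the real system would make a DAL call here.
    private readonly Dictionary<int, UserDetails> _store = new Dictionary<int, UserDetails>();

    public void Save(UserDetails details) => _store[details.UserId] = details;

    // Called only on the rare occasions the full details are needed,
    // so the heavy data never lives in session.
    public UserDetails GetDetails(LightUser user) => _store[user.UserId];
}
```

The trade-off is exactly the one described above: an extra service call when full details are needed, in exchange for a tiny per-user session footprint.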


Sunday, 5 October 2014

C# Simple DAL Layer with MongoDB


Introduction

See Post 1 in the series
http://rundevrun.blogspot.co.uk/2012/11/setup-mongodb-on-windows-8.html




MongoDB is a NoSQL database that stores "documents". The documents are in effect JSON-serialised objects. These objects can (and often will) be compositions of other objects, that is, objects with nested objects inside them.

This DAL will focus on saving and retrieving C# objects to and from a MongoDB database. MongoDB's underlying storage of JSON documents will be ignored, as we are only interested in C# objects.

Example of Customer with Embedded Address object (which is a collection).

public class Customer
{
    public ObjectId Id { get; set; }
    public int CustomerId { get; set; }
    public string CustomerShortName { get; set; }
    public string CustomerLongName { get; set; }
    public List<Address> Addresses { get; set; }
    public int HomeAddress { get; set; }
    public int WorkAddress { get; set; }
    public int[] PreviousAddress { get; set; }
}

public class Address
{
    public int AddressIndex { get; set; }
    public string Address1 { get; set; }
    public string Address2 { get; set; }
    public string Address3 { get; set; }
    public string Address4 { get; set; }
    public string Address5 { get; set; }
    public string Address6 { get; set; }
}
 

Why Create a DAL

Well, this could be a long conversation (or rant), but basically you may well want to encapsulate database access in a single place within your code. Also you may not want the calling layers (your website, your logic) to know or care which database is used. It is not uncommon for products to support multiple databases, and having a distinct layer removes a lot of pain when you want to replace the database used.

In traditional (SQL type) databases we would have to create a table for customers and a table for addresses, then link them via database foreign keys or with code.

Creating A Simple DAL (Data Access Layer)

The DAL we will create will service the Customer class shown above.
First we will create an empty class and add methods to it.

public class MongoDal
{

}

 

Create Single Database 

Create a database connection that can be used by the whole class. This will return the same MongoDatabase object with the same parameters each time. The connection string and database name should live in a configuration file, but here they are hardcoded for simplicity. This is a private property, as we don't want to expose it to the calling layer; there is no point creating methods that can be bypassed.

 
private MongoDatabase Database
{
    get
    {
        string connectionString = "mongodb://localhost";
        MongoClient client = new MongoClient(connectionString);
        MongoServer server = client.GetServer();
        MongoDatabase database = server.GetDatabase("test");
        return database;
    }
}
 
 

Create Customer Method

SQL Equivalent :  Insert

Create a method that inserts a customer into the database. Note that the collection name is hardcoded to "Customers"; this could be supplied to the DAL at runtime or from a configuration file.

public void CreateCustomer(Customer customer)
{
    var collection = Database.GetCollection("Customers");
    collection.Insert(customer);
}

 

Get Customer Method

SQL Equivalent : Select

This method will get a customer object by the supplied ObjectId.

1. Define Query
2. Get Collection (customers)
3. FindOne (single) customer
4. Get the Bson document (json)
5. Deserialize back to a Customer object


public Customer GetCustomer(ObjectId objectid)
{
    // Query: Customer.Id == supplied objectid
    var query = Query<Customer>.EQ(e => e.Id, objectid);
    var collection = Database.GetCollection("Customers");
    BsonDocument bsonCustomer = collection.FindOne(query);
    Customer customer = BsonSerializer.Deserialize<Customer>(bsonCustomer);
    return customer;
}


This is the core pattern for getting objects from MongoDB. In this example the query to the database is fixed rather than supplied dynamically by the calling layer; we will touch on this later.

 

Delete a Customer

SQL Equivalent : Delete

You might at some point want to delete a customer from the database. This follows the same core method as getting a customer: a delete query is created and run against the collection. This is very powerful and needs careful consideration.


public void DeleteCustomer(ObjectId objectId)
{
    var query = Query<Customer>.EQ(e => e.Id, objectId);
    var collection = Database.GetCollection("Customers");
    collection.Remove(query);
}
 
 

Get All Customers

SQL Equivalent : Select  *

This will get all customers from the collection. If you have a very large database with millions of customers this is not likely to make you friends with operations or your boss, but for smaller collections it is quite useful.

public List<Customer> GetAllCustomer()
{
    var collection = Database.GetCollection("Customers");
    MongoCursor<BsonDocument> Bsoncustomers = collection.FindAll();
    List<Customer> customers = new List<Customer>();
    foreach (var bsoncustomer in Bsoncustomers)
    {
        var cus = BsonSerializer.Deserialize<Customer>(bsoncustomer);
        customers.Add(cus);
    }
    return customers;
}

You could, in the example above, use a LINQ expression in place of the foreach; I've shown this below.

return Bsoncustomers.Select(bsoncustomer => BsonSerializer.Deserialize<Customer>(bsoncustomer)).ToList();

 

A Wee Bit More Advanced

You may have noticed that the previous examples have a lot of very similar code. Below we have two methods, GetByAddress6 and GetByQuery.

GetByQuery will get a collection of Customers by the query supplied, this query could be anything from getting by name or something more complex.

GetByAddress6 forms a query and passes it to GetByQuery to execute.



public List<Customer> GetByAddress6(string address)
{
    IMongoQuery query = Query<Customer>.EQ(x => x.Addresses[0].Address6, address);
    return GetByQuery(query);
}

public List<Customer> GetByQuery(IMongoQuery query)
{
    var collection = Database.GetCollection("Customers");
    MongoCursor<BsonDocument> Bsoncustomers = collection.Find(query);
    List<Customer> customers = new List<Customer>();
    foreach (var bsoncustomer in Bsoncustomers)
    {
        var cus = BsonSerializer.Deserialize<Customer>(bsoncustomer);
        customers.Add(cus);
    }
    return customers;
}


Why do this?

It makes the code a lot more maintainable, reduces the time needed to add to the code later and concentrates the shared code in a single place.

 

Using The Code

Example of creating an instance of the DAL and adding a customer.

private void Sample()
{
    Address addy = new Address()
    {
        Address1 = "123 High Street",
        Address2 = "North",
        Address3 = "Any Town",
        Address4 = "",
        Address5 = "",
        Address6 = "123 123"
    };

    Customer customer = new Customer();
    customer.Id = ObjectId.GenerateNewId();
    customer.Addresses = new List<Address> { addy };
    customer.HomeAddress = 0;
    customer.CustomerId = 12345;
    customer.CustomerShortName = "Tommy";
    customer.CustomerLongName = "Mr Thomous Smith";
    var id = customer.Id;

    //Create dal instance
    MongoDal dal = new MongoDal();

    //Create customer
    dal.CreateCustomer(customer);

    //Get All customers
    List<Customer> allcustomer = dal.GetAllCustomer();

    //Get Customer
    Customer getCustomer = dal.GetCustomer(id);

    //Delete Customer
    dal.DeleteCustomer(id);
}

 

Full Code Example

Here's the full thing:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Bson.Serialization;
using MongoDB.Driver;
using MongoDB.Driver.Builders;

namespace SampleCode
{
 
public class MongoDal
{
 
private MongoDatabase Database
{
 get 
 {
   string connectionString = "mongodb://localhost";
   MongoClient client = new MongoClient(connectionString);
   MongoServer server = client.GetServer();
   MongoDatabase database = server.GetDatabase("test"); //name of the database

   return database;
  }
}

 
 
public void CreateCustomer(Customer customer)
{
    var collection = Database.GetCollection("Customers");
    collection.Insert(customer);
}

 
public Customer GetCustomer(ObjectId objectid)
{
    //Query: Customer.Id == supplied objectid
    IMongoQuery query = Query<Customer>.EQ(e => e.Id, objectid);
    var collection = Database.GetCollection("Customers");
    BsonDocument bsonCustomer = collection.FindOne(query);
    Customer customer = BsonSerializer.Deserialize<Customer>(bsonCustomer);
    return customer;
}

 
 
public void DeleteCustomer(ObjectId objectId)
{
 var query = Query<Customer>.EQ(e => e.Id, objectId);
 var collection = Database.GetCollection("Customers");

 collection.Remove(query);

}

 
public List<Customer> GetAllCustomer()
{
    var collection = Database.GetCollection("Customers");
    MongoCursor<BsonDocument> Bsoncustomers = collection.FindAll();
    List<Customer> customers = new List<Customer>();
    foreach (var bsoncustomer in Bsoncustomers)
    {
        var cus = BsonSerializer.Deserialize<Customer>(bsoncustomer);
        customers.Add(cus);
    }
    return customers;
}

 
public List<Customer> GetByAddress6(string address)
{
    IMongoQuery query = Query<Customer>.EQ(x => x.Addresses[0].Address6, address);
    return GetByQuery(query);
}

 
public List<Customer> GetByQuery(IMongoQuery query)
{
    var collection = Database.GetCollection("Customers");
    MongoCursor<BsonDocument> Bsoncustomers = collection.Find(query);
    List<Customer> customers = new List<Customer>();
    foreach (var bsoncustomer in Bsoncustomers)
    {
        var cus = BsonSerializer.Deserialize<Customer>(bsoncustomer);
        customers.Add(cus);
    }
    return customers;
}
}
}