This article introduces the concepts of lazy evaluation and fair scheduling in C#.


Introduction

This article briefly introduces the concepts of lazy evaluation and fair scheduling in C#.

Lazy Evaluation

Lazy evaluation is an evaluation strategy that delays the evaluation of an expression until its value is needed and that also avoids repeated evaluations by sharing the result. This sharing can reduce the running time of certain functions by an exponential factor compared with other non-strict evaluation strategies, such as call-by-name.

Listed below are the benefits of lazy evaluation:

Performance increases by avoiding needless calculations and by avoiding error conditions when evaluating compound expressions

The ability to construct potentially infinite data structures

The ability to define control flow (structures) as abstractions instead of primitives
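To give a taste of the second point, iterator blocks in C# already behave lazily: the sequence below is conceptually infinite, yet only the elements actually requested are ever computed. (This is a minimal illustration added for clarity; Naturals is an illustrative name, not part of any library.)

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class NaturalsDemo
{
    // A conceptually infinite sequence; each element is produced on demand.
    static IEnumerable<int> Naturals()
    {
        for (int i = 0; ; i++)
            yield return i;
    }

    static void Main()
    {
        // Only the first five elements are ever computed.
        foreach (int n in Naturals().Take(5))
            Console.Write(n + " "); // 0 1 2 3 4
        Console.WriteLine();
    }
}
```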

The .NET Framework provides Lazy<T> in order to manage lazy evaluation. A lazy expression can be initialized with a lambda:

var lazyNumber = new Lazy<int>(() => 42);
Console.WriteLine(lazyNumber.Value);

In the sample below, we create two lazy expressions and try to add them. Adding the Lazy<int> objects directly does not compile, because no + operator is defined for Lazy<int>; we must add their Value properties instead:

var x = new Lazy<int>(() => 42);
var y = new Lazy<int>(() => 24);
// Console.WriteLine(x + y);       // compile error: operator + is not defined for Lazy<int>
Console.WriteLine(x.Value + y.Value); // 66

In the sample below, we create a lazy expression that performs a side effect (printing to the console), and we will see that its result is frozen (cached) after the first evaluation.

var p = new Lazy<object>(() => { Console.WriteLine("Boo"); return null; });
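To see the freezing in action, a short self-contained check (added for illustration): the factory runs exactly once, on the first access to Value, and subsequent accesses return the cached result without re-running the side effect.

```csharp
using System;

class LazyCacheDemo
{
    static void Main()
    {
        var p = new Lazy<object>(() => { Console.WriteLine("Boo"); return null; });

        // Nothing has been printed yet: the factory has not run.
        object first  = p.Value; // prints "Boo" exactly once
        object second = p.Value; // cached: prints nothing

        Console.WriteLine(p.IsValueCreated); // True
    }
}
```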

Let's try to create our own Lazy<T> :

public class MyLazy<T>
{
    private readonly Func<T> _f;
    private bool _hasValue;
    private T _value;

    public MyLazy(Func<T> f)
    {
        _f = f;
    }

    public static implicit operator T(MyLazy<T> lazy)
    {
        if (!lazy._hasValue)
        {
            lazy._value = lazy._f();
            lazy._hasValue = true;
        }
        return lazy._value;
    }
}

MyLazy<T> is a generic class that contains the following fields:

_f : A function for lazy evaluation that returns a value of type T

_value : A value of type T (the frozen, i.e., cached, value)

_hasValue : A boolean that indicates whether the value has been calculated or not

In order to use objects of type MyLazy<T> as objects of type T , the implicit keyword is used. The evaluation is performed when the conversion to T occurs.

Below is a sample usage of MyLazy<T> .

var myLazyRandom = new MyLazy<double>(GetRandomNumber);
double myRandom = myLazyRandom; // the implicit conversion triggers the evaluation
Console.WriteLine("Random with MyLazy<double>: {0}", myRandom);

where GetRandomNumber returns a random double as follows:

static double GetRandomNumber()
{
    Random r = new Random();
    return r.NextDouble();
}

Fair Scheduling

Fair scheduling is a scheduling method in which all jobs get an equal share of resources over time, so that CPU usage is distributed equally across the system.

Assume that we want to create the following fair scheduling system: a central system manages a processor and allocates it to a set of threads so that each thread gets a fair share of the processing time.

A thread will be modelled as code executed by a processor and cut into small pieces. Each of these pieces has a cost in time. The processor, which handles multiple threads simultaneously, schedules the threads and lets them run piece by piece. To choose the next thread to execute, the processor selects the one that has consumed the least working time so far.

A thread will have an integer called ThreadId that represents the thread identifier and an integer Workload that counts the amount of work already done. The work pieces of the thread will be represented through a Worker attribute of type IEnumerable<int> .

Thus:

The value Worker.Current represents the cost of the last piece done

The operation Worker.MoveNext() makes the thread do a piece of work

For simplicity, the thread will have a method Work() which makes the thread do a piece of work and adds its cost to Workload .

In the sample code below, we are creating a list of 10 threads.

using System;
using System.Linq;
using System.Collections.Generic;

class Program
{
    public class Thread
    {
        public int Workload { get; set; }
        public int ThreadId { get; private set; }
        private IEnumerator<int> Worker;

        public Thread()
        {
            Workload = 0;
            ThreadId = rng.Next() % 100 + 100;
            Worker = CreateWorker(this).GetEnumerator();
        }

        public int Work()
        {
            // Advance the iterator first: Current is undefined (default(int) == 0)
            // before the first call to MoveNext().
            Worker.MoveNext();
            int load = Worker.Current;
            Workload += load;
            return load;
        }

        static Random rng = new Random();

        static IEnumerable<int> CreateWorker(Thread p)
        {
            while (true)
            {
                int load = rng.Next() % 9 + 1;
                Console.WriteLine("{0} : working for {1} hours (total load : {2})",
                                  p.ThreadId, load, p.Workload);
                yield return load;
            }
        }
    }

    static void Main(string[] args)
    {
        Console.WriteLine("Yield!");
        var ListOfThreads = new List<Thread>();
        for (int i = 0; i < 10; i++)
        {
            var processor = new Thread();
            ListOfThreads.Add(processor);
        }

        Thread p;
        for (int i = 0; i < 500; i++)
        {
            // Pick the thread with the smallest workload so far.
            p = ListOfThreads.Aggregate((curp, next) =>
                next.Workload < curp.Workload ? next : curp);
            p.Work();
        }

        Console.WriteLine("Press any key to exit.");
        Console.ReadKey();
    }
}

In the method Main , first and foremost, a list of 10 threads is created. Then the threads are scheduled, knowing that the thread that should work next is the one with the smallest workload.

Fair Thread Pool

It is possible to create a thread pool allowing scheduling of jobs in a fair way. All jobs will get an equal share of resources over time. Thus, the CPU usage will be equally distributed across the system.

Jobs will be enqueued in the pool associated with a tag. To pick a job to run, the threads in the pool cycle through the tags in round-robin order. Thus, it is guaranteed that jobs associated with one tag are never blocked by a bunch of jobs associated with another tag: the pool alternates between tags when picking jobs from its queue.
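The tag-based round-robin idea can be sketched as follows. This is a minimal, single-threaded illustration of the queueing policy only (no worker threads, no locking); FairQueue, Enqueue and TryDequeue are illustrative names, not an existing API. Even when one tag floods the queue, a job from another tag is served on the very next pick.

```csharp
using System;
using System.Collections.Generic;

// Sketch of fair (round-robin by tag) job selection.
class FairQueue
{
    private readonly Dictionary<string, Queue<Action>> _queues =
        new Dictionary<string, Queue<Action>>();
    private readonly List<string> _tags = new List<string>();
    private int _next; // index of the tag to serve next

    public void Enqueue(string tag, Action job)
    {
        if (!_queues.ContainsKey(tag))
        {
            _queues[tag] = new Queue<Action>();
            _tags.Add(tag);
        }
        _queues[tag].Enqueue(job);
    }

    // Cycles through the tags so one busy tag cannot starve the others.
    public bool TryDequeue(out Action job)
    {
        for (int i = 0; i < _tags.Count; i++)
        {
            string tag = _tags[(_next + i) % _tags.Count];
            if (_queues[tag].Count > 0)
            {
                _next = (_next + i + 1) % _tags.Count;
                job = _queues[tag].Dequeue();
                return true;
            }
        }
        job = null;
        return false;
    }
}

class Demo
{
    static void Main()
    {
        var pool = new FairQueue();
        // Tag "A" floods the queue; tag "B" enqueues a single job.
        for (int i = 0; i < 3; i++)
        {
            int n = i;
            pool.Enqueue("A", () => Console.WriteLine("A" + n));
        }
        pool.Enqueue("B", () => Console.WriteLine("B0"));

        Action job;
        while (pool.TryDequeue(out job))
            job(); // prints A0, B0, A1, A2 -- B0 is not stuck behind all of A
    }
}
```

A real pool would wrap TryDequeue in a lock and have worker threads block on a semaphore when the queue is empty; the fairness argument is unchanged.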

That's it! I hope you enjoyed reading.
