I have the console application below to measure the time taken for a specific operation with BigInteger in C#.
using System.Diagnostics;
using System.Numerics;
Stopwatch timer = new Stopwatch();
for (int j = 0; j < 10; j++)
{
    Console.WriteLine("Loop Started {0}", j);
    timer.Start();
    BigInteger bitFields = 0;
    byte value = 0;
    for (int i = 0; i < 65400; ++i)
    {
        var bb = new BigInteger(value);
        var result = (bb << (i * 8));
        bitFields = bitFields ^ result;
    }
    Console.WriteLine("Elapsed duration: {0}", timer.Elapsed.ToString());
    timer.Stop();
    Console.WriteLine("Loop Completed");
    Thread.Sleep(2000);
}
Console.ReadLine();
Even though I expect approximately the same time for each loop, the results differ: the reported time increases gradually with each iteration.
Is there any reason for this behaviour?

This happens because you start/resume the same timer with `Stopwatch.Start`. `Start` does not reset the elapsed time; it resumes measurement from wherever the stopwatch left off, so `Elapsed` accumulates across iterations and each loop appears slower than the last. You can switch to `Stopwatch.Restart`, which resets the elapsed time to zero and then starts timing again. Also note that for benchmarking it is better to use specialized tools like BenchmarkDotNet, which handle warm-up, JIT effects, and statistical analysis for you.
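A minimal corrected sketch of your loop (same work as in the question, but with `Restart` at the top of each iteration and `Stop` moved before the `Elapsed` read, so each iteration reports only its own duration):

```csharp
using System.Diagnostics;
using System.Numerics;

Stopwatch timer = new Stopwatch();
for (int j = 0; j < 10; j++)
{
    Console.WriteLine("Loop Started {0}", j);
    timer.Restart();  // resets Elapsed to zero, then starts timing

    BigInteger bitFields = 0;
    byte value = 0;
    for (int i = 0; i < 65400; ++i)
    {
        var bb = new BigInteger(value);
        bitFields = bitFields ^ (bb << (i * 8));
    }

    timer.Stop();     // stop before reading Elapsed
    Console.WriteLine("Elapsed duration: {0}", timer.Elapsed);
    Console.WriteLine("Loop Completed");
    Thread.Sleep(2000);
}
Console.ReadLine();
```

With this change each iteration should report a roughly similar duration, since the stopwatch no longer carries time over from previous loops.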