OpenNETCF.Timers.Timer2 and System.Threading.Timer fire the elapsed event at the wrong interval after 7-8 hours


I am using OpenNETCF Timer2 (and previously System.Threading.Timer) in the example below. The timer behaves very oddly: after around 7-8 hours it makes a few calls to the elapsed event handler at an interval of 2 seconds instead of 1 second, and then behaves normally for the next 7-8 hours. I have also tried System.Threading.Timer, but that appears to fire the event 2 seconds apart instead of 1 second roughly every 2 minutes. How can I fix this problem?

public partial class Form1 : Form
{
    private OpenNETCF.Timers.Timer2 mTimer2;
    private DateTime mLastSampleTime;
    private Int32 Interval;
    private Double ActualInterval;
    private long ActualTickInterval;
    StreamWriter file;
    Boolean firstRun = true;

    //private Timer sysTimer;

    public Form1()
    {
        InitializeComponent();
        // Timer object for the Windows CE environment.

        // This text is always added, making the file longer over time
        // if it is not deleted.
        file = System.IO.File.AppendText("TimerLog.txt");

        mTimer2 = new OpenNETCF.Timers.Timer2(1000); // 1000 ms interval for the actual test
        mTimer2.AutoReset = true;
        mTimer2.Elapsed += Timer2Event;
        mTimer2.Start();
        file.WriteLine("Timer started");

    }


    private void Timer2Event(object sender,
        OpenNETCF.Timers.ElapsedEventArgs e)
    {
        try
        {
            DateTime now = DateTime.UtcNow;

            if (!firstRun) // do not calculate a difference on the first run
            {
                ActualInterval = (now - mLastSampleTime).TotalMilliseconds; // difference between previous and current timestamp (milliseconds)
                ActualTickInterval = (now - mLastSampleTime).Ticks; // difference between previous and current timestamp (ticks)
                Interval = Convert.ToInt32((now - mLastSampleTime).TotalSeconds); // interval as used in the LAMP code

                if (Interval >= 2) file.Write("-------->"); // if the interval is 2 seconds or more, mark it
                file.Write("UTC Now: " + now.ToString() + " Current Tick: " + now.Ticks.ToString() +
                    " Actual Interval: " + ActualInterval + " Actual Tick Difference: " + ActualTickInterval.ToString() +
                    " Deviation (Ideal=0) " + (ActualTickInterval - 10000000).ToString() + "\r\n");
                file.Flush();
            }
            else
            {
                firstRun = false;
            }

            mLastSampleTime = now;
        }
        catch (Exception exp)
        {
            file.Write("Logging Failed " + DateTime.UtcNow.ToString() + " exception: " + exp.Message + "\r\n");
        }
    }
}

In these specific cases the tick difference jumps directly from 10,000,000 to 20,000,000, and the subsequent tick difference is 0, i.e. two rows in the log file have the same timestamp.

Answer by ctacke:

There's actually a problem in your test logic that might account for the behavior you're seeing.

Under Windows CE it's very rare - I mean super-ultra-rare in my experience - that the system clock has resolution finer than 1 full second. In fact, it's very often tied to a hardware RTC that only gives 1-second resolution. What that means is that your math using DateTime.Ticks isn't useful, and a difference between, say, 1:00.999 and 1:02.001 will show up as a 2-second gap, not 1.002 seconds.
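
To make that concrete, here is a purely illustrative snippet (the timestamps are made up for the example) showing how whole-second truncation turns a 1.002-second gap into an apparent 2-second gap:

// Illustrative only: with a clock that reports whole seconds,
// an actual gap of 1.002 s is logged as 2 s.
DateTime a = new DateTime(2020, 1, 1, 1, 0, 0, 999); // real time 1:00.999
DateTime b = new DateTime(2020, 1, 1, 1, 0, 2, 1);   // real time 1:02.001

// What a 1-second-resolution clock hands back (milliseconds dropped):
DateTime aSeen = new DateTime(2020, 1, 1, 1, 0, 0);  // 1:00
DateTime bSeen = new DateTime(2020, 1, 1, 1, 0, 2);  // 1:02

Console.WriteLine((b - a).TotalSeconds);         // 1.002 - what actually elapsed
Console.WriteLine((bSeen - aSeen).TotalSeconds); // 2     - what the Ticks-based math reports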

Change your logic to use Environment.TickCount to measure the elapsed time between ticks; you should see numbers right around 1000 every time, and it eliminates the potential for the artifact you're seeing.
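
A minimal sketch of that change, reusing the question's file and firstRun fields but replacing the DateTime math with the millisecond tick counter (the mLastTickCount field name is illustrative, not from the original code):

    private int mLastTickCount;

    private void Timer2Event(object sender,
        OpenNETCF.Timers.ElapsedEventArgs e)
    {
        int now = Environment.TickCount; // milliseconds since boot

        if (!firstRun)
        {
            // unchecked subtraction still gives the right result if the counter wraps
            int elapsedMs = unchecked(now - mLastTickCount);

            if (elapsedMs >= 2000) file.Write("-------->");
            file.Write("Elapsed ms: " + elapsedMs +
                " Deviation (Ideal=0): " + (elapsedMs - 1000) + "\r\n");
            file.Flush();
        }
        else
        {
            firstRun = false;
        }

        mLastTickCount = now;
    }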

EDIT

The Timer2 object, like the built-in Forms Timer, is based on the Windows message pump tick and is not intended to be very precise. 1000 ms +/- a few tens of milliseconds is not at all surprising to me.

If you need more precision, you should use a Threading timer or a high-precision timer (available in the SDF under the OpenNETCF.Timers namespace). Even with those, however, you must be aware that you are running in a managed, garbage-collecting environment so there's absolutely no guarantee you'll get consistent exact 1000ms ticks. If you need absolutely deterministic behavior, you need to write the portion that requires it in C, and know how to achieve determinism even there.