| Summary: | digital clock plasmoid is slow by about 3/4 of a second | | |
|---|---|---|---|
| Product: | [Unmaintained] plasma4 | Reporter: | Hal V. Engel <hvengel> |
| Component: | general | Assignee: | Plasma Bugs List <plasma-bugs> |
| Status: | RESOLVED INTENTIONAL | | |
| Severity: | normal | CC: | aseigo |
| Priority: | NOR | | |
| Version: | unspecified | | |
| Target Milestone: | --- | | |
| Platform: | Gentoo Packages | | |
| OS: | Linux | | |
| Latest Commit: | | Version Fixed In: | |
| Sentry Crash Report: | | | |
Description
Hal V. Engel
2008-09-28 23:00:18 UTC
> we will work on accuracy and what not, but microsecond accuracy of timing devices isn't a design goal for plasma.

I guess I didn't communicate the issue very well. I merely pointed out that my system clock has near-microsecond accuracy*. I stated that your goal should be to get the Plasma clocks to change the displayed time within 250 milliseconds (1/4 second) of the system clock's actual second transition. This is a far cry from microsecond accuracy (by more than 4 orders of magnitude) and is a fairly loose requirement as timekeeping requirements go. I did not ask for the Plasma clocks to have microsecond-accurate displays. The Qt timer functions only have millisecond resolution, and I think this puts a lower bound on how accurate these Plasma clocks can be.

In other words, I only want the Plasma clocks to have a low enough latency that there is no apparent lag between the Plasma clock and the system clock. Of course, if you can get the Plasma clock synced more tightly to the system clock (for example, if you were to get the latency below 50 milliseconds), that would be a good thing.

The US NIST, which is considered the world's expert on time-related issues, says the following in "WWVB Radio Controlled Clocks: Recommended Practices for Manufacturers and Consumers" (see http://tf.nist.gov/general/pdf/1976.pdf, page 5):

"3. RECOMMENDED PRACTICES FOR CLOCK ACCURACY, CLOCK DISPLAY AND CONTROLS
All RCC products should display time accurate to at least within ±0.5 s, so that when the time is rounded to the nearest second, the seconds value is always correct. Tighter synchronization (to within ±0.2 s) is desirable. This prevents the human eye from detecting any errors when checking an RCC display against another independent time reference, whereas a 0.5 s error could be noticeable."

As you can see, NIST states that latencies much greater than 0.2 seconds can result in the human eye detecting the offset. I am only asking that the latency of the Plasma clocks be low enough that it "prevents the human eye from detecting any errors". The current Plasma clocks have a latency that appears to be close to 750 milliseconds, which is well above the maximum error recommended by NIST and is very apparent to human observers. My main goal was to give you some concrete information about this so that you would put in place a requirement to reduce the latency to the point that it "prevents the human eye from detecting any errors", and to give you some guidance on what latencies will meet that requirement (i.e. under 200 milliseconds).

* This is admittedly an unusual setup, since it requires a patched kernel and special hardware; there are probably only a few hundred Linux machines in existence configured this way. But a commonly used and fairly simple setup using ntp and Internet-based time servers can easily keep the local computer clock synced to within 10 milliseconds of the actual time, and with careful selection of time servers this type of setup can get the clock offset into the 1 to 2 millisecond range. Even for this simple timekeeping setup, the Plasma latency error is over 2 orders of magnitude greater than the local clock error.
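For illustration only (this is not a patch and not the actual Plasma digital clock code): a minimal Qt/C++ sketch of the kind of scheduling the report asks for, where a single-shot QTimer is re-armed against the system clock so each display update fires a few milliseconds after the second rolls over, instead of drifting by a large fixed offset. The standalone QLabel widget, the 5 ms margin, and the time format are illustrative choices, not values taken from Plasma.

```cpp
// Minimal sketch: keep a displayed clock aligned with the system clock's
// second boundary so the visible update lag stays well under 200 ms.
#include <QApplication>
#include <QLabel>
#include <QTime>
#include <QTimer>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QLabel clockLabel;
    clockLabel.show();

    QTimer timer;
    timer.setSingleShot(true);

    // Re-arm the timer relative to the system clock every tick, rather than
    // using a free-running 1000 ms interval that slowly drifts off-boundary.
    auto scheduleNextTick = [&]() {
        // Milliseconds remaining until the next whole second, plus a small
        // margin so we sample the time just after the rollover (assumed 5 ms).
        int msToNextSecond = 1000 - QTime::currentTime().msec();
        timer.start(msToNextSecond + 5);
    };

    QObject::connect(&timer, &QTimer::timeout, [&]() {
        clockLabel.setText(QTime::currentTime().toString("hh:mm:ss"));
        scheduleNextTick();
    });

    clockLabel.setText(QTime::currentTime().toString("hh:mm:ss"));
    scheduleNextTick();

    return app.exec();
}
```

Re-arming from QTime::currentTime() on every tick keeps the update pinned to the second boundary even when an individual timeout is delivered late, which is what keeps the displayed time inside the sub-200 ms window the NIST guidance quoted above recommends.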