Do you know how many microseconds there are in a nanosecond? You thought it was 10^-3^, eh? Heh, think twice - and no, I have not gone nuts. At least, not yet. Read on.

More seriously, what am I ranting about here? Time precision in Java - that is the topic.

Today, I needed to profile a process which takes less than a millisecond (10^-3^ seconds). Up to and including JDK 1.4, the precision of Java time was on the order of a millisecond, and for anything more precise you were pointed to JNI - which is a nasty place to be pointed to, let me say. Fortunately, JDK 5 added the System.nanoTime() method, and Sun has honestly implemented it in its JDK. Thanks a lot to Sun for that - but hold the excitement for a second.
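To see what that millisecond-era precision looks like in practice, here is a small sketch (the MillisGranularity class name and its structure are mine, for illustration) that measures the length of one observable tick of System.currentTimeMillis(), the clock you were stuck with before JDK 5:

```java
public class MillisGranularity {

  // Measure the length of one observable tick of System.currentTimeMillis()
  // by spinning until the reported value changes.
  static long measureTickMillis() {
    long start = System.currentTimeMillis();
    long end = start;
    // Align to a tick boundary first...
    while (end == start) end = System.currentTimeMillis();
    start = end;
    // ...then span exactly one full tick.
    while (end == start) end = System.currentTimeMillis();
    return end - start;
  }

  public static void main(String[] args) {
    System.out.println("currentTimeMillis tick = " + measureTickMillis() + " ms");
  }
}
```

On modern systems this usually reports 1 ms, but some older Windows kernels were known to update this clock in 10-15 ms steps - hence the push towards JNI for anything finer.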

Does this method really give the precision of a nanosecond? Let's check it (using a tuned version of the code from Vladimir Roubtsov):

public class Test {

  public static void main(String[] args) {

    // Let the JVM warm up and do whatever optimizations it wants:
    for (int i = 0; i < 10000; i++) System.nanoTime();

    long start = System.nanoTime(), end = start;
    long accumulated = 0;
    int iterations = 200;

    for (int i = 0; i < iterations; i++) {
      // Wait till the system time changes:
      while (end == start)
        end = System.nanoTime();

      accumulated += (end - start);
      start = end = System.nanoTime();
    }

    System.out.println("delta = " + accumulated / iterations + " nanoseconds");
  }
}

Following are the results of running this program on different platforms:

Windows: delta = 1265 nanoseconds
Linux: delta = 1015 nanoseconds

Let me also note that the Windows machine was a 1.7GHz IBM ThinkPad laptop, whereas the Linux box was a dual-Xeon server. The results are close enough to ignore any hardware and software differences and declare that:
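The practical upshot for my original sub-millisecond profiling task: with a timer granularity of roughly a microsecond, a single measurement is too noisy, so it pays to time many iterations and divide. Here is a minimal sketch of that idea (the SubMillisProfiler class and averageNanos helper are my own names, not from any library):

```java
import java.util.concurrent.TimeUnit;

public class SubMillisProfiler {

  // Time many iterations of a task and report the average per-call cost,
  // so the ~1 microsecond timer granularity is amortized away.
  static long averageNanos(Runnable task, int iterations) {
    long start = System.nanoTime();
    for (int i = 0; i < iterations; i++) task.run();
    return (System.nanoTime() - start) / iterations;
  }

  public static void main(String[] args) {
    long nanos = averageNanos(() -> Math.sqrt(42.0), 100_000);
    System.out.println("avg = " + nanos + " ns ("
        + TimeUnit.NANOSECONDS.toMicros(nanos) + " us)");
  }
}
```

TimeUnit.NANOSECONDS.toMicros() is handy here for converting the raw deltas into units a human can read.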

There are approximately 1-1.3 microseconds in a nanosecond... at least, in Java ;-)