In the example above we always used the server's clock to measure time, which adds noise to the observation. A more precise measurement can be obtained by using the client's clock instead. The following example uses JavaScript to measure the time between showing a page and submitting a form; this result is then compared with the time difference observed on the server.
<HTML>
<SCRIPT>
var timeA, timeB;
// Record the time at which the page is loaded.
function startClock() {
    timeA = new Date();
    return true;
}
// On submit, compute the elapsed time and store it in the hidden field.
function stopClock() {
    timeB = new Date();
    document.timeForm.tdiff.value = timeB - timeA;
    return true;
}
</SCRIPT>
<?php
// Current server time in milliseconds.
$time = explode(" ", microtime());
$time = round(($time[0] + $time[1]) * 1000);
// Read the values posted by the form (the original relied on register_globals).
$tdiff      = isset($_POST['tdiff'])      ? (int) $_POST['tdiff']      : 0;
$serverTime = isset($_POST['serverTime']) ? (int) $_POST['serverTime'] : 0;
$serverDiff   = $time - $serverTime;
$networkDelay = $serverDiff - $tdiff;
echo "<BODY onLoad=startClock()>";
if ($tdiff) echo "The time difference measured by the client is $tdiff milliseconds<P>
The time difference measured by the server is $serverDiff milliseconds<P>
The network delay is, hence, $networkDelay milliseconds<P>";
echo "<FORM NAME=timeForm METHOD=POST ACTION={$_SERVER['SCRIPT_NAME']} onSubmit=stopClock()>
<INPUT TYPE=HIDDEN NAME=tdiff>
<INPUT TYPE=HIDDEN NAME=serverTime VALUE=$time>
<INPUT TYPE=SUBMIT VALUE=STOP>
</FORM>";
?>
</BODY>
</HTML>
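Note that $serverDiff spans the full round trip: the page travels to the client, the user waits $tdiff milliseconds before pressing STOP, and the form travels back to the server. Subtracting the client-measured $tdiff therefore leaves $networkDelay, which is roughly one network round-trip time. For example, if the server observes a difference of 1500 milliseconds while the client reports 1420 milliseconds, the network delay is about 80 milliseconds.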