Question:
It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about $0.02 \mathrm{~s}$. What does this imply for the accuracy of the standard cesium clock in measuring a time interval of $1 \mathrm{~s}$?
Solution:
Error in 100 years $=0.02 \mathrm{~s}$
$\therefore \quad$ Fractional error in measuring $1 \mathrm{~s}$:
$$
\begin{aligned}
\frac{\Delta t}{t} &= \frac{0.02}{100 \times 365.25 \times 24 \times 60 \times 60}=\frac{0.02}{3.15576 \times 10^9} \\
&= 0.0063 \times 10^{-9} = 0.63 \times 10^{-11}
\end{aligned}
$$
Hence, in measuring a time interval of $1 \mathrm{~s}$, the cesium clock has an accuracy of about $10^{-11} \mathrm{~s}$, i.e. it is accurate to roughly 1 part in $10^{11}$.
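As a quick numerical cross-check (not part of the original solution), here is a minimal Python sketch that recomputes the fractional error; the constants simply restate the figures used above:

```python
# Cross-check: fractional error of a cesium clock that drifts 0.02 s in 100 years.

SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60     # one year in seconds (Julian year)

drift_s = 0.02                               # claimed drift over 100 years
interval_s = 100 * SECONDS_PER_YEAR          # 100 years expressed in seconds

fractional_error = drift_s / interval_s
print(f"100 years = {interval_s:.5e} s")         # ~3.15576e+09 s
print(f"delta_t/t = {fractional_error:.2e}")     # ~6.34e-12, i.e. ~1 part in 10^11
```

This reproduces the result quoted above: an accuracy of about $10^{-11}$ in measuring $1 \mathrm{~s}$.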
