The actual execution time of a program can vary for many reasons, such as the share of CPU resources it is given and the thread or process scheduling of the system. In addition, many other operations, such as system input/output and memory allocation, are affected by the overall task load of your PC, which makes the execution time look "random". That is to say, when n is really small, many unpredictable factors affect the execution time. However, when we study algorithms, we do not care about this "extra execution time"; we care about how the execution time grows as the data scale grows.
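You can see this variability yourself with a minimal sketch (the function `work` below is just a placeholder workload, not anything from a specific library): timing the same call several times gives slightly different results because of scheduling, caching, and other system activity.

```python
import time

def work(n):
    # a simple loop whose cost grows linearly with n
    total = 0
    for i in range(n):
        total += i
    return total

# Time the exact same call several times; the measured times differ
# slightly from run to run due to system-level noise.
for trial in range(5):
    start = time.perf_counter()
    work(10_000)
    elapsed = time.perf_counter() - start
    print(f"trial {trial}: {elapsed:.6f} s")
```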
For example, a program that runs in 5000 + n steps would be better than a program that runs in 200 + n² steps, even though the second program takes less time when n is small.
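A quick sketch makes the comparison concrete (the two cost functions below are the hypothetical step counts from the example, not measurements of real programs): for small n the quadratic program is cheaper, but the linear one wins once n passes the crossover point, which here is around n = 70.

```python
# Compare the two hypothetical cost functions 5000 + n and 200 + n^2.
def cost_linear(n):
    return 5000 + n

def cost_quadratic(n):
    return 200 + n * n

for n in (1, 10, 69, 70, 100, 1000):
    print(f"n = {n:5d}: 5000 + n = {cost_linear(n):8d}, 200 + n^2 = {cost_quadratic(n):8d}")

# At n = 69 the quadratic cost (4961) is still below the linear cost (5069),
# but from n = 70 onward (5100 vs 5070) the linear program is faster,
# and the gap keeps growing as n increases.
```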