By measuring the amount of time it takes a component of a product to move from one workstation to the next, an engineer has estimated that the standard deviation is 6 seconds.

(a) How many measurements should be made to be 90% certain that the maximum error of estimation will not exceed 1 second?

(b) What sample size is required for a maximum error of 2 seconds?

So far I have $E = 1.65\,(6/\sqrt{1})$.
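
I think the general relation is the usual normal-approximation one, $E = z_{\alpha/2}\,\sigma/\sqrt{n}$, which solved for $n$ (rather than plugging the error in under the square root) would give

$$n = \left(\frac{z_{\alpha/2}\,\sigma}{E}\right)^{2}.$$

If that setup is right, then with $z_{0.05} \approx 1.645$, $\sigma = 6$, and $E = 1$ for part (a), this works out to $n = (1.645 \cdot 6 / 1)^2 \approx 97.4$, rounded up to about 98 measurements, but I'm not certain this is the correct approach.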