
Question

1) A microwave has a typical wavelength of about 0.1 meter. What is the comparable wavelength of a radio wave?

A) less than 0.1 meter

B) greater than 0.1 meter

C) equal to 0.1 meter

D) it depends on the frequency
3 years ago

Answers

oobleck
Higher frequency means shorter wavelength, and radio waves have a lower frequency than microwaves.
So, which is which?

Just look at any illustration of the electromagnetic spectrum.
3 years ago
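To make the relation concrete, here's a quick Python sketch of the formula wavelength = c / frequency. The specific frequencies below (a ~3 GHz microwave and a ~100 MHz FM radio signal) are just illustrative example values, not figures from the question:

```python
# Wavelength from frequency: lambda = c / f.

C = 3.0e8  # speed of light in m/s (rounded)

def wavelength(frequency_hz):
    """Return the wavelength in meters of an EM wave with the given frequency in Hz."""
    return C / frequency_hz

# Example (assumed) frequencies:
microwave = wavelength(3.0e9)  # ~3 GHz microwave -> 0.1 m
fm_radio = wavelength(1.0e8)   # ~100 MHz FM radio -> 3.0 m

print(f"microwave: {microwave} m")
print(f"FM radio:  {fm_radio} m")
```

Since the radio frequency is lower, its computed wavelength (about 3 m) comes out longer than the microwave's 0.1 m, which points to answer B.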
