If the distance from an antenna on Earth to a geosynchronous communications satellite is 26,000 miles, given that there are 1.61 kilometers per mile and that radio waves travel at the speed of light (3.0 × 10^8 meters/sec), how many milliseconds does it take for a signal from the antenna to reach the satellite?

An Introduction to Physical Science, 14th Edition
ISBN: 9781305079137
Authors: James Shipman, Jerry D. Wilson, Charles A. Higgins, Omar Torres
Publisher: Cengage Learning
Chapter 2: Motion
Section 2.2: Speed and Velocity
Problem 2.2CE: A communications satellite is in a circular orbit about the Earth at an altitude of 3.56 × 10^4 km...
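
A sketch of the one-way calculation, using only the distance, unit conversion, and signal speed stated in the question:

\[
d = 26{,}000~\text{mi} \times 1.61~\tfrac{\text{km}}{\text{mi}} \approx 4.19 \times 10^{4}~\text{km} = 4.19 \times 10^{7}~\text{m}
\]
\[
t = \frac{d}{c} = \frac{4.19 \times 10^{7}~\text{m}}{3.0 \times 10^{8}~\text{m/s}} \approx 0.14~\text{s} = 1.4 \times 10^{2}~\text{ms}
\]

So the one-way delay is roughly 140 milliseconds; a round trip from the antenna to the satellite and back would take about twice that.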
