A dog chased a rabbit. The dog was originally 300 feet away from the rabbit. The dog's speed is 100 feet per minute and the rabbit's speed is 50 feet per minute. Will the dog reach the rabbit? If yes, how long will it take?
Answer
Let x be the number of minutes both animals run:
100x − 50x = 300
50x = 300
x = 300/50 = 6 minutes
So yes, the dog will catch the rabbit after running for 6 minutes.
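The same answer falls out of a quick relative-speed computation; here is a minimal sketch using the values from the problem (variable names are my own):

```python
# Chase problem: the dog closes the gap at (dog_speed - rabbit_speed) ft/min.
dog_speed = 100      # feet per minute
rabbit_speed = 50    # feet per minute
head_start = 300     # feet between them at the start

closing_speed = dog_speed - rabbit_speed   # 50 ft gained each minute
time_to_catch = head_start / closing_speed

print(time_to_catch)  # 6.0 minutes
```

Since the dog is strictly faster, the closing speed is positive and the catch is guaranteed.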