Suppose we want to write a program that simulates the flight of a cannonball (or any other projectile such as a bullet, baseball or shot put). We are particularly interested in finding out how far the cannonball will travel when fired at various launch angles and initial velocities. The input to the program will be the launch angle (in degrees), the initial velocity (in meters per second) and the initial height (in meters). The output will be the distance that the projectile travels before striking the ground (in meters).
If we ignore the effects of wind resistance and assume that the cannonball stays close to the earth's surface (i.e., we're not trying to put it into orbit), this is a relatively simple classical physics problem. The acceleration of gravity near the earth's surface is about 9.8 meters per second per second. That means if an object is thrown upward at a speed of 20 meters per second, after one second has passed, its upward speed will have slowed to 20 - 9.8 = 10.2 meters per second. After another second, the speed will be only 0.4 meters per second, and shortly thereafter it will start coming back down.
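The second-by-second slowdown described above can be sketched as a short loop. This is just an illustration of the arithmetic, not part of the final program; the variable names are my own:

```python
g = 9.8      # acceleration of gravity (m/s per second)
v = 20.0     # initial upward speed (m/s)

# Apply one second of gravitational deceleration at a time.
for t in range(1, 3):
    v = v - g
    print(f"After {t} second(s): {v:.1f} m/s upward")
```

Running this prints 10.2 m/s after one second and 0.4 m/s after two, matching the values worked out in the text.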
For those who know a little bit of calculus, it's not hard to derive a formula that gives the position of our cannonball at any given moment in its flight. Rather than take the calculus approach, however, our program will use simulation to track the cannonball moment by moment. Using just a bit of simple trigonometry to get started, along with the obvious relationship that the distance an object travels in a given amount of time is equal to its rate times the amount of time (d = rt), we can solve this problem algorithmically.
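As a preview of the approach, here is a minimal sketch of that moment-by-moment simulation. It uses trigonometry to split the initial velocity into horizontal and vertical components, then repeatedly applies d = rt over a small time step; the function name, the time-step parameter `dt`, and the simple update rule are my own assumptions, not the text's final design:

```python
import math

def flight_distance(angle, velocity, height, dt=0.01):
    """Return the horizontal distance (m) a projectile travels.

    angle    -- launch angle in degrees
    velocity -- initial velocity in m/s
    height   -- initial height in m
    dt       -- simulation time step in seconds
    """
    theta = math.radians(angle)
    xvel = velocity * math.cos(theta)   # horizontal speed (constant)
    yvel = velocity * math.sin(theta)   # vertical speed (changes with gravity)
    xpos, ypos = 0.0, height

    # Advance the projectile in small time steps until it hits the ground.
    while ypos >= 0:
        xpos = xpos + xvel * dt         # d = rt, horizontally
        yvel = yvel - 9.8 * dt          # gravity slows the vertical speed
        ypos = ypos + yvel * dt         # d = rt, vertically
    return xpos
```

For example, `flight_distance(45, 20, 0)` comes out close to the roughly 40.8 m predicted by the calculus formula (v² sin 2θ / g), with a small error that shrinks as `dt` shrinks.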