From Steps to Smooth Curves: Speed as a Derivative


In Section 8-3 of The Feynman Lectures on Physics, Richard Feynman introduces one of the central ideas of modern science and mathematics: the concept of speed as a derivative. This chapter is not merely about mechanics; it is a gateway into calculus itself – the language that allows us to describe motion with precision.

Feynman begins with a simple problem: how do we measure the speed of an object when its motion is not uniform? If something moves 100 metres in 10 seconds, its average speed is clear – 10 m/s. But what if the motion is irregular, changing from moment to moment? The average speed over a whole interval tells us little about the exact speed at a given instant.

To resolve this, Feynman introduces the notation of increments:

• Δt: a small change in time,

• Δs: the corresponding small change in distance.

The ratio Δs / Δt is the average speed over a very small interval. As that interval becomes ever smaller, we approach the instantaneous speed.

Feynman then shifts from the finite differences Δs and Δt to the infinitesimals ds and dt. These are not ordinary quantities but differentials, symbols which represent the tiniest changes imaginable in distance and time.

This notation keeps track of what is changing and highlights the central idea: velocity is the derivative of distance with respect to time.
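
Written out as a formula in the same notation, the velocity v is

v = ds/dt = the limit of Δs / Δt as Δt → 0,

that is, the value that the ratio of increments settles down to as the time interval is made vanishingly small.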

Feynman points out a subtle but important distinction – you cannot “cancel” the Δ’s (or d’s) as if they were factors. Just as sin θ is not the product of the letters s, i, and n multiplied by θ, Δs and Δt must each be read as a single, meaningful symbol.

To make the procedure concrete, Feynman considers a particle whose position is given by a quadratic expression:

s = At^2 + Bt + C,

where A, B, and C are constants.
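
Carrying the procedure through for this particular expression shows exactly how the derivative emerges; the following is a sketch of the steps, using the increments defined above.

A short time Δt later, the position is

s + Δs = A(t + Δt)^2 + B(t + Δt) + C = At^2 + 2At·Δt + A(Δt)^2 + Bt + B·Δt + C.

Subtracting the original s = At^2 + Bt + C leaves

Δs = 2At·Δt + A(Δt)^2 + B·Δt,

so the average speed over the interval is

Δs / Δt = 2At + B + A·Δt.

As Δt is made smaller and smaller, the last term fades away, and what remains is the instantaneous speed:

ds/dt = 2At + B.

For readers who like to watch the limit appear numerically, here is a minimal Python sketch of the same idea; the values chosen for A, B, C and for the instant t are arbitrary illustrations, not numbers taken from Feynman's text.

```python
# Numerical illustration: the average speed Δs/Δt over a shrinking
# interval Δt approaches the derivative ds/dt = 2At + B.
# A, B, C and t below are arbitrary illustrative values.

A, B, C = 3.0, 2.0, 5.0      # constants in s(t) = A*t**2 + B*t + C
t = 4.0                      # the instant at which we want the speed

def s(time):
    """Position of the particle at the given time."""
    return A * time**2 + B * time + C

exact = 2 * A * t + B        # the result of the algebra above: 2At + B

for dt in (1.0, 0.1, 0.01, 0.001, 0.0001):
    average = (s(t + dt) - s(t)) / dt    # Δs/Δt over the interval dt
    print(f"Δt = {dt:<8} Δs/Δt = {average:.6f}")

print(f"exact ds/dt = {exact:.6f}")
```

Each successive line of output brings the ratio closer to the exact value, which is the whole content of the limiting step.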

The procedure Feynman explains so elegantly is the fruit of centuries of struggle. Ancient Greek mathematicians, such as Archimedes, had already anticipated the idea of a limit, using geometric arguments – the method of exhaustion – to approximate areas and volumes. But the true breakthrough came in the 17th century.

Isaac Newton, working on the motion of planets and falling apples alike, needed a systematic way to describe changing quantities. He called his version of the calculus the method of fluxions.

Gottfried Wilhelm Leibniz, working independently, developed his own notation of differentials – dx and dy – which is the very language physicists still use today.

The rivalry between Newton and Leibniz over the credit for inventing calculus became one of the great intellectual disputes of the early modern age. Yet out of that dispute came a shared legacy: a tool powerful enough to describe everything from planetary orbits to quantum fields.

In this chapter, Feynman is not just teaching mechanics; he is showing us how physics and mathematics intertwine. The act of taking a derivative is more than a trick of symbols – it is a way of capturing the essence of change.

With calculus, we no longer settle for average values or crude approximations. We can say with precision how fast something is moving at this instant, how rapidly it is accelerating, and how forces shape trajectories. Without this tool, modern science – from Einstein’s relativity to Schrödinger’s wave mechanics – would be unthinkable.

Starting from an intuitive problem, Feynman guides the reader through the birth of calculus, showing step by step how velocity emerges as the derivative of distance with respect to time. The story he tells is both mathematical and historical: a reminder that the greatest ideas in science often arise from the simplest of questions – “How fast am I moving right now?”
