I had to get a boot stretched the other day, specifically the right boot I bought in France in October 2024, when my rain shoes proved not to be rain resistant but rain receptive. The left boot fit great, but the right one was like a torture device for my little toe, so stretching it was required.
All this got me thinking about shoe sizes and how they are determined, which led me down a deep rabbit hole about measurements in general.
Shoe sizes are based on barleycorns. In this case barleycorn isn’t some cute physics-related joke measurement (like a “barn”, “outhouse” or “shed”, which all describe how big a target a particle presents in a collider – as in “couldn’t hit the broad side of a…”), no, I’m talking about actual physical barleycorns.
The Anglo-Saxons used barleycorns as a practical unit of measurement. English King Edward II (1284-1327) formalized the barleycorn as a unit equal to 1/3 of an inch; the barleycorn had to be dry, round and taken from the middle of the ear.
English shoemakers adopted the barleycorn, and each shoe size was one barleycorn longer than the one before it.
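If you want to see that arithmetic in action, here’s a minimal sketch in Python, assuming the commonly cited formula for UK adult sizes (size = 3 × last length in inches − 25; the 25 is just the conventional offset, not anything from the statute):

```python
from fractions import Fraction

BARLEYCORN = Fraction(1, 3)  # Edward II's barleycorn: 1/3 of an inch

def uk_adult_size(last_length_inches: Fraction) -> Fraction:
    """Commonly cited UK adult sizing: size = last length / barleycorn - 25,
    i.e. three barleycorns to the inch, one size per barleycorn."""
    return last_length_inches / BARLEYCORN - 25

print(uk_adult_size(Fraction(11)))               # 8
print(uk_adult_size(Fraction(11) + BARLEYCORN))  # 9: one barleycorn longer, one size up
```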
Over the centuries numerous different systems have come into play for measuring shoe sizes. The shoe length can be measured based on a median foot, the shoe cavity, or the last (which is a template of a foot). Then there is the Mondopoint system, which includes width (normally used for ski boots), and the Paris point system, used primarily in France, Germany, Italy and Spain.
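For contrast, here’s the same kind of sketch for the Paris point system, assuming the standard definition of one Paris point as 2/3 of a centimetre:

```python
from fractions import Fraction

PARIS_POINT = Fraction(2, 3)  # one Paris point = 2/3 of a centimetre

def eu_size(last_length_cm: Fraction) -> Fraction:
    """Paris point (EU) sizing: consecutive sizes differ by 2/3 cm,
    a finer step than the barleycorn's 1/3 inch (about 0.85 cm)."""
    return last_length_cm / PARIS_POINT

print(eu_size(Fraction(28)))  # 42: a 28 cm last is an EU 42
```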
Women’s shoe sizes (in the US system, at least) generally run 1.5 sizes larger than the equivalent men’s size, so a men’s 10 would be a women’s 11 1/2. Why? Men’s feet tend to be longer and wider than women’s and have a different arch shape.
Back to the barleycorn inch.
The word inch comes from the Latin uncia, which was 1/12 of a Roman foot (a pes; the plural is pedes). An uncia was also 1/12 of a Roman pound.
A mille passus (“a thousand paces”) was one Roman mile; a passus was five pedes, so the mile came to 5,000 pedes. English gets the word mile from mille passus. This distance worked out to about 5,000 English feet (give or take), so why isn’t a mile 5,000 feet? Why is it 5,280?
Blame English Queen Elizabeth I (1558-1603) and the Anglo-Saxons (the same people who gave the world barleycorns).
Queen Bess wanted to standardize the measurement system (a favourite pastime of many a ruler in Europe over the centuries, and one that generally only lasted as long as that ruler was alive – see French metric time as an example).
The Anglo-Saxons had a measure called a furrow-long (furlong), which was the distance a team of two oxen could plow before they needed a rest, generally about 660 feet.
When Elizabeth was Queen there were multiple measuring systems in use and multiple ways to measure a mile. To create uniformity, she decreed that a mile should be 5,280 feet (rather than the 5,000 Roman feet). 5,280 is divisible by 1, 2, 3, 4, 5, 6, 8, 10, 11 and so on; it has 48 divisors in all, and most crucially one of them is 660, so the furlong could still be used as a measure on farms, a furlong now being exactly 1/8 of a mile.
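You can check Queen Bess’s arithmetic yourself; a quick sketch:

```python
# How many divisors does 5,280 have, and is the 660-foot furlong one of them?
def divisors(n: int) -> list[int]:
    return [d for d in range(1, n + 1) if n % d == 0]

print(len(divisors(5280)))    # 48
print(660 in divisors(5280))  # True
print(5280 // 660)            # 8, so a furlong is exactly 1/8 of a mile
print(len(divisors(5000)))    # 20: the Roman mile divides far less neatly
```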
Feet were still divided into 12 parts, called inches, the word coming directly from the Latin uncia into English as inch.
More fun to come, though. Ounce, also from uncia and meaning 1/12 of a pound, didn’t go directly from Latin into Old English; it came via French. And why, if ounce means 1/12, is it used in a pound that has 16 ounces?
This time blame the French.
Sometime in the 1300s the English pound was settled at 16 ounces (this is called the avoirdupois system, from the French for “goods of weight”). 16 was probably chosen because it was easier to divide into smaller units than 12, but no one is really sure. The French, though, were using a 12-ounce system, the troy (still used today for precious metals). Through trade the word ounce ended up in English, but the measurement didn’t.
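One way to see the “easier to divide” argument (a sketch of the usual halving story, not settled history):

```python
# 16 halves cleanly all the way down to 1; 12 gets stuck at 3 after two halvings.
def halvings(n: int) -> list[int]:
    steps = [n]
    while steps[-1] % 2 == 0:
        steps.append(steps[-1] // 2)
    return steps

print(halvings(16))  # [16, 8, 4, 2, 1]
print(halvings(12))  # [12, 6, 3]
```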
Fun fact – the short form for the pound weight is lb and the symbol for the pound currency is £, both from the Latin libra pondo. Pondo gives English the word pound, and libra gives the lb and £ symbols. Ounce, on the other hand, is shortened to oz., from the Italian onza (which, of course, originally comes from uncia).
All this because my right shoe was too tight.