2 "Geometry" Posts

How Large Language Models (LLMs) Think: Turning Meaning into Math

When you enter a sentence into a Large Language Model (LLM) such as ChatGPT or Claude, the model does not process words as language. It represents them as numbers.

Each word, phrase, and code token becomes a vector — a list of real-valued coordinates within a high-dimensional space. Relationships between meanings are captured not by grammar or logic but by geometry. The closer two vectors lie, the more similar their semantic roles appear to the model.

This is the mathematical foundation of large language models: linear algebra. Matrix multiplication, vector projection, cosine similarity, and normalization define how the model navigates this vast space of meaning. What feels like understanding is actually the alignment of high-dimensional vectors governed by probability and geometry.
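The cosine-similarity measure mentioned above can be sketched in a few lines. This is a toy illustration with made-up four-dimensional "embeddings" (real models use hundreds or thousands of dimensions, learned from data):

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: u·v / (|u| |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy embeddings, purely for illustration
king  = [0.9, 0.70, 0.10, 0.30]
queen = [0.8, 0.75, 0.15, 0.35]
car   = [0.1, 0.20, 0.90, 0.80]

print(cosine_similarity(king, queen))  # near 1.0: vectors point the same way
print(cosine_similarity(king, car))    # much smaller: dissimilar directions
```

Two vectors pointing in nearly the same direction score close to 1.0, which is exactly how "closer in the space" becomes "more similar in meaning."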


“Linear algebra and geometry do more than support AI; they create its language of meaning.”


Read more →

From Ice Shows to Algorithms: Cracking the Truck-Packing Problem

My first full-time programming job was for Holiday on Ice, an international ice show. While I focused mainly on back-office systems such as accounting, itineraries, and box-office reporting, I knew that one of the biggest technical challenges faced by the show's crew was efficiently loading trucks for the next city.

“Given the dimensions of a truck and a list of containers (with their dimensions and weight), in what order, position, and orientation should you pack the truck?”


One day, the controller asked me if I could code a system that took, as input, the trucks’ 3D dimensions and the 3D dimensions (and weight) of every object to be packed. Back in the Turbo Pascal era, exploring 3D packing was painful. Today, with Python and AI-assisted scaffolding, it’s surprisingly approachable.
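To give a flavor of how approachable a first pass is today, here is a deliberately simplified sketch: a first-fit-decreasing heuristic that packs by volume alone, ignoring shape, orientation, and weight limits (the parts that make the real 3D problem hard). The container names and numbers are hypothetical.

```python
def pack_trucks(truck_volume, containers):
    """First-fit-decreasing by volume: sort items largest-first, then
    place each into the first truck with enough remaining volume.
    A toy simplification of the real 3D packing problem."""
    trucks = []  # each truck: remaining volume and its list of items
    for name, vol in sorted(containers, key=lambda c: c[1], reverse=True):
        for truck in trucks:
            if truck["free"] >= vol:
                truck["free"] -= vol
                truck["load"].append(name)
                break
        else:  # no existing truck fits this item: open a new one
            trucks.append({"free": truck_volume - vol, "load": [name]})
    return trucks

# Hypothetical cargo list: (label, volume in cubic meters)
cargo = [("rink panels", 40), ("lighting rig", 25), ("costumes", 10),
         ("sound desk", 15), ("props", 20)]
for i, t in enumerate(pack_trucks(60, cargo), 1):
    print(f"Truck {i}: {t['load']} (free: {t['free']} m³)")
```

A real solution would track 3D positions and orientations, but even this volume-only heuristic shows why ordering matters: packing the largest items first leaves the small ones to fill the gaps.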

Read more →