Language models open up some spectacular new opportunities for discovery. While working on some ideas, it occurred to me to explore the relationships between concepts spatially: interpolate across the latent space between concepts, then map each interpolated vector to its nearest token. I arranged this formulation as a form of tessellation so it covers the n-dimensional volume efficiently. The result is a work in progress I call the Tessellator. Since it's an experiment, open-sourcing it just made sense, and a new update is due in the next week or two.
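
To make the interpolate-then-snap idea concrete, here is a minimal sketch (not the Tessellator itself) of walking a straight line between two concept embeddings and mapping each point along the way to its nearest token. The embedding matrix is a random stand-in; in practice it would be a real model's token-embedding table.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 1000, 64
# Toy token-embedding table, unit-normalized so dot product == cosine similarity.
embeddings = rng.standard_normal((vocab_size, dim))
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

def nearest_token(vec: np.ndarray) -> int:
    """Return the index of the token whose embedding is closest by cosine similarity."""
    v = vec / np.linalg.norm(vec)
    return int(np.argmax(embeddings @ v))

def interpolate_tokens(a: np.ndarray, b: np.ndarray, steps: int = 10) -> list[int]:
    """Walk the straight line from embedding a to b, snapping each step to a token."""
    return [nearest_token((1 - t) * a + t * b) for t in np.linspace(0.0, 1.0, steps)]

# Example: interpolate between two arbitrary "concept" vectors (here, two token rows).
path = interpolate_tokens(embeddings[42], embeddings[7], steps=8)
print(path)
```

The tessellation part of the project is about choosing the interpolation points so they tile the volume between concepts rather than just tracing a single line, but the snap-to-nearest-token step is the same.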