Nerdy internals of an Apple text editor (19 minute read)
This article discusses the engineering details behind Paper, a text editor for Apple platforms. While Paper is built on the older TextKit 1 framework, the concepts, abstractions, and principles it covers still apply to TextKit 2, either unchanged or under an improved API. The article covers the TextView class, syntax highlighting, styling, typing, performance, sharing, and much more.
Training great LLMs entirely from ground zero in the wilderness as a startup (15 minute read)
Training AI models requires compute, but there is significant instability and wide variance in the quality of clusters, accelerators, and connectivity across compute providers. This article discusses the differences between training models at Google and training models 'in the wild', outside of Google, where data scientists have to use GPUs instead of Google's TPUs. It compares Google's infrastructure and tools with their equivalents in the outside world. With the right amount of brute technical strength, it is possible to train models from scratch that match Gemini Pro/GPT-3.5.