4/5/26

I finished my batch last week, and this is what I have been up to in the past week.

  • Used llama.cpp to host some local models (GLM 4.7 Flash, Qwen3-32B) and built a web-browser MCP tool.

    • Initially, my grad school friend and I wanted to test out OpenClaw together, but we realized it had a lot of security issues, so we are now pivoting to building some autonomous agents ourselves using deep agents, and maybe sandboxing them with greyhaven.

    • And with the latest supply chain attacks on LiteLLM and Axios, I feel like I need to think about cybersecurity more. Sandboxing, in turn, led me to look into the seven layers of the OSI networking model.
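A quick way to make two of those layers concrete is with the standard library: the TCP socket below is the transport layer (layer 4), and the little text protocol spoken over it is the application layer (layer 7); the OS and loopback interface handle everything below. This is just an illustrative sketch, not related to any specific tool mentioned above.

```python
import socket
import threading

def run_echo_server(server_sock):
    # Accept one connection and speak a tiny application-layer protocol.
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)          # bytes delivered by TCP (layer 4)
        conn.sendall(b"ECHO " + data)   # our application protocol (layer 7)

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))           # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_echo_server, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
print(reply.decode())  # -> ECHO hello
```

Everything below layer 4 (IP routing, framing, the physical link) never appears in the code, which is exactly the point of the layered model.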

  • GPU programming and the OpenAI parameter golf challenge with @Patrick Hulin (he) (SP1'26)

    • And somehow meandered into also implementing quicksort in place (Hoare's partition scheme)

    • Also learned that the backward pass of a neural network generally takes about 2x the FLOPs of the forward pass, because each layer's backward step needs two matrix multiplications -- one to propagate the error back to the layer's input, and one that combines the error with the layer's input to compute the weight gradients.
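The in-place quicksort mentioned above can be sketched like this. In Hoare's partition scheme, two indices sweep inward from both ends and swap out-of-place elements; the partition returns an index j such that everything in a[lo..j] is <= pivot and everything in a[j+1..hi] is >= pivot (note the left recursion includes index j, unlike Lomuto's scheme).

```python
def hoare_partition(a, lo, hi):
    """Partition a[lo..hi] around pivot a[lo]; return the split index j."""
    pivot = a[lo]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while a[i] < pivot:   # scan right past elements already in place
            i += 1
        j -= 1
        while a[j] > pivot:   # scan left past elements already in place
            j -= 1
        if i >= j:
            return j
        a[i], a[j] = a[j], a[i]  # swap the out-of-place pair

def quicksort(a, lo=0, hi=None):
    """Sort the list a in place."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = hoare_partition(a, lo, hi)
        quicksort(a, lo, p)      # p is included on the left side
        quicksort(a, p + 1, hi)

xs = [5, 2, 9, 1, 5, 6]
quicksort(xs)
print(xs)  # -> [1, 2, 5, 5, 6, 9]
```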
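The 2x FLOP claim can be checked by counting multiply-adds for a single linear layer y = x @ W. The forward pass is one matmul; the backward pass is two (grad of the input, grad of the weights), each the same size of work. A tiny pure-Python sketch:

```python
def matmul(A, B):
    """Multiply matrices (lists of rows); return (result, flop count)."""
    n, k, m = len(A), len(B), len(B[0])
    C = [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(m)]
         for i in range(n)]
    return C, 2 * n * k * m  # one multiply + one add per inner-loop step

def transpose(M):
    return [list(col) for col in zip(*M)]

x = [[1.0, 2.0], [3.0, 4.0]]             # input,  shape (batch=2, d_in=2)
W = [[0.5, -1.0, 2.0], [1.5, 0.0, 1.0]]  # weights, shape (d_in=2, d_out=3)

y, fwd_flops = matmul(x, W)              # forward: ONE matmul
gy = [[1.0] * 3 for _ in range(2)]       # pretend upstream gradient dL/dy

gx, f1 = matmul(gy, transpose(W))        # backward 1: error -> input gradient
gW, f2 = matmul(transpose(x), gy)        # backward 2: error x input -> weight grads

print((f1 + f2) / fwd_flops)  # -> 2.0
```

All three matmuls share the same three dimensions (batch, d_in, d_out), so each costs the same 2 * batch * d_in * d_out FLOPs, and the backward pass totals exactly twice the forward pass.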

Next

End of W6