About me
I am a fourth-year PhD student at UC Berkeley. I am fortunate to be advised by Jelani Nelson and Avishay Tal. Previously, I did my undergrad in the Yao Class at Tsinghua University. Even before that, in high school, I was a competitive programmer.
I spent the first three years of my PhD trying to understand the role of memory in computation from a TCS standpoint. This includes fun & specific questions like:
- How many memory accesses are needed to decide whether n numbers are distinct?
- How to approximate the median when you can only look at each data point once?
- Can you solve a set of equations when you can only look at each equation once? Twice? $O(1)$ times?
It is amazing how these very specific and simple-looking questions can lead to very deep lines of research and many beautiful ideas.
Recently, my interests have shifted to the ongoing LLM revolution. This summer, I'm interning at Microsoft Research, mentored by Janardhan Kulkarni, working on LLM reasoning and self-improvement.
Nowadays, with only noisy human data and compute, LLMs are able to solve competition-level math and algorithm problems. It is foreseeable that with new training recipes and more engineering, data, architectural, and algorithmic innovation, they will be able to solve not only the specific problems above, but also grand open problems that I have no clue how one might attack:
- Is P separable from L? (Can every time-efficient algorithm also be made very memory-efficient?)
- Is RL equal to L? (Is randomness useless for very memory-efficient algorithms?)
LLMs are going to fundamentally change how theoreticians work. For me, doing theory has been an enjoyable journey, and I have worked with very kind and inspiring professors to whom I am always grateful:
- Prof. Ryan Williams
- Prof. Zhihao Gavin Tang
- Prof. Hu Fu
- Prof. Mikkel Thorup
- Prof. Ran Duan