We previously launched Auto Wiki (https://news.ycombinator.com/item?id=38915999) in Jan 2024 and broke ground for AI-generated wikis that explain your code. The product has now been rebuilt by the same team, along with others, and launched as part of Google. Hope you enjoy.
Although I've recently moved on to working on Gemini and AI research, I'm still involved as an advisor and founder emeritus of sorts. This team moves extremely fast, and while we don't have full availability yet, we're working hard on addressing some early feedback before we make it more widely available, including for private repos. Personally, I think the NotebookLM integration is a nice touch and a distinguishing factor that we could only pull off as Google.
If they choose to publish no-truth-value garbage about my life's work I will f**ing shred them with words.
This project and the whole philosophy behind it is just dripping with disrespect so you won't find me in line to be polite to the people who made it. If they're going to walk over and chunder their lukewarm slop onto me they can expect a verbal fistfight.
There is no data carried over from Mutable, if that's what you're referring to. Please refer to the final email sent to users, which I no longer have access to.
Ok, so it was acquired and merged into that Google offering? Does this mean we lost the open source nature and the ability to preserve and protect our data?
Interesting direction. We also have a codebase chat (example here: https://wiki.mutable.ai/ollama/ollama) that HN might find appealing. It uses the wiki as a living artifact owned by your team to power the chat, which gives us increased context length and reasoning capabilities. We didn't really like the results we got with embeddings. We've been pretty thrilled with the results on Q&A, search, and even codegen (more on that soon).
Can you share some basic insight into how your system processes an open source repo to generate a wiki with a hierarchy and structure that maps to the same or similar hierarchy and structure of the codebase?
I can understand the value of marrying that wiki to the codebase and how that would help LLMs better "understand" the codebase.
What I'm lost on is how you can auto-generate that wiki in the first place (in a high-quality way). I presume it's not perfect, but it's a very interesting problem space, and I'd love to hear what you've learned and how you are trying to accomplish this feat!
Thanks for posting btw, this HN comments section has been INCREDIBLE.
Thanks! It's a multi-stage process where we summarize the code and structure it into articles. We also do some self-verification, for example checking for dead links.
Do you build a syntax tree of the code, then loop the tree to auto-write an article for each node (or the larger or more material nodes) and then also reference the tree to pull in related nodes/pieces/modules/whatnot of the codebase when auto-writing a documentation article for each node?
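The approach the commenter is proposing — build a syntax tree, then loop over its larger nodes to generate an article stub per node — can be sketched in a few lines. To be clear, this is purely illustrative of the commenter's question, not the product's actual pipeline, and `outline` is a hypothetical helper; it uses Python's `ast` module on a toy source string:

```python
import ast
import textwrap

# Toy codebase standing in for a real repo file.
SOURCE = textwrap.dedent('''
    def greet(name):
        """Return a greeting for name."""
        return f"Hello, {name}!"

    class Repo:
        """A toy repository wrapper."""
        def clone(self):
            pass
''')

def outline(source: str) -> list[dict]:
    """Walk the syntax tree and emit one article stub per top-level node.

    Each stub carries a title (the node name), the node kind, and a seed
    summary (here just the docstring) that an LLM pass could expand into
    a full article.
    """
    tree = ast.parse(source)
    stubs = []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            stubs.append({
                "title": node.name,
                "kind": type(node).__name__,
                "summary": ast.get_docstring(node) or "(no docstring)",
            })
    return stubs

stubs = outline(SOURCE)
for s in stubs:
    print(f"{s['kind']}: {s['title']} - {s['summary']}")
```

A real system would presumably also follow cross-references between nodes (imports, call sites) to pull related modules into each article, which is the "reference the tree" part of the question.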
Greatly appreciate the suggestion, but I recently joined another YC-backed Gen AI startup in the fine-tuning space (OpenPipe) :-D
Speaking of which, there's a good chance fine-tuned models will be a component of your fully optimized codebase -> wiki automation process at some point, likely to increase the consistency and reliability of LLM responses as clear patterns emerge in the process. If y'all decide to layer that on, or even just explore it as an optimization strategy, hit us (or me directly) up. We love collaborating with engineers working on problems at the edge like this; aside from how engaging the problems themselves are, it helps us build our best possible product too!
Very excited to follow your journey! Just sent you a LinkedIn request.
It's funny to see this, because from my perspective we've done this on a pretty scrappy budget. That said, we're thinking of adding open access for select repos. Why is logging in so terrible? We want to get to know our users.
Thank you, Omar (formerly Founder/CEO, MutableAI)