This writing makes more sense when you mentally add back in the LLM's em-dashes that were removed to make it look less like a nonhuman-generated marketing article.
Ideas for your feature pipeline: geographic filtering (e.g., learn to identify plant/bird species in southern Florida); temporal filtering (explore extinct species --> currently endangered species); audio to learn calls/songs.
Thanks for the feedback! Really appreciate it. I am working on specific locations. And actually the "endangered" filter is a great idea. Thanks so much!
*Edit: Wanted to mention that thousands more species are planned as content.
One approach might be to set up two adversarial summarizers and a judge (like common-law litigation). Instead of using one model to identify and resolve all claims, a first model (plaintiff) seeks out the most supportive arguments for, and evidence of, claims across all nodes; then a second model (defendant) antagonizes the first by seeking out only disconfirming evidence and the best counter-arguments; the parties may get one or more replies or sur-replies; and then a third model (judge) evaluates the previous two against one another. The idea could be extended to incorporate appellate models that ensure compliance with the rules and propose changes to the rules, including adding or removing them. Appellate decisions could be maintained in a separate directory and accessed by the adversarial models.
More promising food for thought: developing and employing rules of evidence and procedure. For example, evidence may be taken only from immutable files; the first-level nodes present issues, not summaries or syntheses; each issue has separate plaintiff, defendant, judge, and appellate nodes, from which a summary or explanation is ultimately created.
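A minimal sketch of the control flow this implies. Everything here is hypothetical: `call_model` stands in for whatever LLM API you'd actually use (stubbed below so the loop can run), and the role prompts, round count, and record format are all assumptions, not a definitive design.

```python
def call_model(role: str, prompt: str) -> str:
    """Placeholder for a real LLM call; returns a canned string per role."""
    canned = {
        "plaintiff": "Strongest supporting evidence and arguments: ...",
        "defendant": "Strongest disconfirming evidence and counter-arguments: ...",
        "judge": "Ruling: claim partially supported.",
    }
    return canned[role]

def adjudicate_claim(claim: str, rounds: int = 2) -> dict:
    """Run plaintiff/defendant replies and sur-replies, then hand the record to a judge."""
    record = [f"Claim under review: {claim}"]
    for _ in range(rounds):
        # Plaintiff argues for the claim, seeing the full record so far.
        record.append("PLAINTIFF: " + call_model("plaintiff", "\n".join(record)))
        # Defendant seeks only disconfirming evidence and counter-arguments.
        record.append("DEFENDANT: " + call_model("defendant", "\n".join(record)))
    # Judge evaluates the two parties against one another.
    verdict = call_model("judge", "\n".join(record))
    return {"claim": claim, "record": record, "verdict": verdict}

result = adjudicate_claim("Node 14 supports the summary's causal claim")
```

The appellate layer would then consume `record` and `verdict` from a separate directory, checking them against the rules of procedure rather than re-litigating the evidence.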
I never use LLMs for research or drafting. But I do use them to roll my own local semantic search; to whip up reusable regexes; to create small deterministic programs that, say, convert my paraphrased facts' citations from discrete documents into citations to the compiled appellate record; and to quickly code a Google Docs plugin that will automate away the repeated corrections to my co-counsel's bad typing and citation style (she'll never change).
For these uses, LLMs are wonderful — and Kagi Assistant is plenty good enough.
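For the curious, "rolling your own local semantic search" can be as small as TF-IDF vectors plus cosine similarity, all stdlib. This is a hedged sketch, not the commenter's actual setup: a real pipeline would likely use embedding vectors instead, and the smoothed-IDF formula here is just one common choice.

```python
import math
from collections import Counter

def build_index(docs):
    """Precompute smoothed IDF weights and sparse TF-IDF vectors for a doc list."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    df = Counter(t for toks in tokenized for t in set(toks))
    idf = {t: math.log((1 + n) / (1 + c)) + 1 for t, c in df.items()}
    vectors = [{t: f * idf[t] for t, f in Counter(toks).items()} for toks in tokenized]
    return idf, vectors

def cosine(a, b):
    """Cosine similarity between two sparse dict vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query, docs):
    """Rank docs by similarity to the query; returns (score, doc) pairs, best first."""
    idf, vectors = build_index(docs)
    q = {t: f * idf.get(t, 0.0) for t, f in Counter(query.lower().split()).items()}
    return sorted(((cosine(q, v), d) for v, d in zip(vectors, docs)), reverse=True)

docs = [
    "appellate record citation format",
    "google docs plugin for typing fixes",
    "semantic similarity over legal briefs",
]
top_score, top_doc = search("citation to the appellate record", docs)[0]
```

Whole-word matching is crude, but for a private corpus of filings it is deterministic, offline, and easy to debug, which is the point of rolling your own.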