Hacker News
awei | 5 months ago | on: Solving a million-step LLM task with zero errors
One issue I see is when steps in a plan depend on one another: you cannot know all the next steps exactly before seeing the results of the previous ones, and you may sometimes have to backtrack.
beefnugs | 5 months ago
This is actually good insight and worded in a simple way that clicked in my brain, thanks!