
Not sure what the original commenter was looking for but I can give my thoughts:

- stackoverflow's UI actually serves well to provide a sort of "ambient" information that rapidly indicates not just the best answers, but the best most-recent answers. Often, especially in rapidly evolving languages/frameworks, what was the best answer a few months ago may no longer be the best answer, and the ability to quickly scan the comments that would indicate this is valuable.

- in addition, those stackoverflow comments and the links within them can point to extra info that saves the dev time (potentially revealing that the dev misidentified the problem: "don't do this, this is the real issue <link>").

I think with the traditional google->stackoverflow or google->[some documentation site, forum, etc] user flow you actually get layers of ambient cues about relevance, recency, and quality that we've grown accustomed to. Even if your product ultimately serves better answers, I'd worry that lacking these cues would make a user like me feel as though I'm blindly trusting an answer that seems to have come from the ether (sort of like github copilot).

As low-hanging fruit, maybe adding level meters beside each result that indicate these dimensions could help (like npmjs.com does with package results in its UI).

I love the product idea and it looks like a strong start! Good luck!



I agree that software documentation is constantly changing and a mechanism for evaluating the "freshness" of an answer could be very useful. Right now we provide an easy way to find the source of a code snippet but source attribution/evaluation is something we're actively looking into, especially for the natural language answer. A quantitative score could be interesting too. Thanks for the feedback!
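To make the "quantitative score" idea concrete, here's a minimal sketch of one way a freshness score could work. Everything here is an assumption for illustration: the exponential-decay function, the half-life, and the `freshness_score` name are all hypothetical, not anything the product actually implements.

```python
from datetime import datetime, timezone

def freshness_score(answered_at: datetime, half_life_days: float = 180.0) -> float:
    """Hypothetical freshness metric: 1.0 for a brand-new answer,
    0.5 after one half-life (here assumed to be ~6 months),
    decaying toward 0 as the answer ages."""
    age_days = (datetime.now(timezone.utc) - answered_at).total_seconds() / 86400.0
    return 0.5 ** (max(age_days, 0.0) / half_life_days)
```

A score like this could drive the level-meter idea from the parent comment: map the 0-to-1 value onto a few bars next to each result, with the half-life tuned per language/framework since some ecosystems go stale much faster than others.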



