When given a query, what makes the LLM say “That’s good. I’ve said enough. I think I’ll stop here.” instead of just stringing together endless tokens of information?
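In broad terms, generation stops when the model samples a special end-of-sequence (EOS) token it learned to emit during training, or when a hard max-token cap set by the serving code kicks in. A minimal toy sketch of that decoding loop (the model here is a hypothetical stand-in, `toy_next_token`, not a real LLM):

```python
# Toy sketch of an LLM decoding loop's two stopping criteria:
# 1) the model emits a special EOS token, 2) a hard max-token cap.

EOS = "<eos>"
MAX_NEW_TOKENS = 50  # serving-side safety cap, independent of the model

def toy_next_token(generated):
    # Hypothetical stand-in for a real model's sample/argmax step:
    # after a few tokens it puts its probability mass on EOS.
    canned = ["That's", "good.", "I've", "said", "enough.", EOS]
    return canned[min(len(generated), len(canned) - 1)]

def generate(prompt_tokens):
    generated = []
    for _ in range(MAX_NEW_TOKENS):   # criterion 2: max-token cap
        tok = toy_next_token(generated)
        if tok == EOS:                # criterion 1: model chose to stop
            break
        generated.append(tok)
    return generated

print(generate([]))  # stops after five tokens because the toy model emits EOS
```

In a real model the EOS decision is not a hard-coded rule: the network assigns EOS a probability at every step, and that probability rises when the training data suggests an answer of this shape would end here.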
Originally posted by u/Fabulous_Analysis885 on r/ArtificialInteligence
