ChatGPT, Llama, and Other Large Models Answer a Brain Teaser
We use four of the most widely deployed large models (ChatGPT 3.5, ChatGPT 4, Llama 2 70B, and Llama 3 70B) to answer one of the best-known brain teasers:
There are 10 birds in a tree. If you shoot 2 of them dead, how many are left?
Let's see how they do.
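For context on reproducing this comparison, here is a minimal sketch of sending the same riddle to an OpenAI-compatible chat endpoint (DeepInfra exposes one). The base URL, model name, and payload details below are illustrative assumptions, not the exact setup used for the transcripts in this post.

```python
# Hedged sketch: query an OpenAI-compatible chat-completion endpoint
# with the brain-teaser prompt. Uses only the Python standard library.
import json
import urllib.request

RIDDLE = "树上10只鸟,打死2只,还有几只?"  # the brain-teaser prompt

def build_request(model: str) -> dict:
    """Build an OpenAI-style chat-completion payload for one model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": RIDDLE}],
    }

def ask(base_url: str, api_key: str, model: str) -> str:
    """POST the payload and return the assistant's reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_request(model)).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

A call would then look like `ask("https://api.deepinfra.com/v1/openai", key, "meta-llama/Meta-Llama-3-70B-Instruct")`, with the base URL taken as an assumption from DeepInfra's OpenAI-compatibility layer.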
Question: 树上10知鸟,打死2只,还有几只? (There are 10 birds in a tree; 2 are shot dead. How many are left?)
(3 words, 14 chars, 22 tokens)
DeepInfra with meta-llama/Meta-Llama-3-70B-Instruct
A classic math problem! 😄
There are 10 birds on the tree, and 2 are killed. To find out how many birds are left, we can subtract 2 from 10:
10 - 2 = 8
So, there are 8 birds left on the tree. 🐦
(41 words, 140 chars, 64 tokens)
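The "(N words, M chars, K tokens)" line after each answer can be approximated for words and characters in a couple of lines of Python. The exact counter behind the published figures is unknown, and token counts require each model's own tokenizer, so this sketch (assuming a whitespace definition of "word") covers only the first two.

```python
def answer_stats(text: str) -> tuple[int, int]:
    """Return (word_count, char_count) for a model's answer.

    Words are whitespace-separated runs, so unspaced CJK text yields
    few "words"; characters are counted with len(), one per emoji or
    CJK character. Token counts are omitted: they depend on each
    model's tokenizer.
    """
    return len(text.split()), len(text)

# For example:
answer_stats("10 - 2 = 8")  # 5 words, 10 chars
```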
Microsoft Copilot in Bing with Balanced (ChatGPT4)
树上原来有10只鸟,如果打死了2只,那么树上就不会再有鸟了。因为其他的鸟听到枪声后,都会飞走。所以,树上将会是零只鸟。
(Translation: The tree originally had 10 birds. If 2 were shot dead, there would be no birds left, because the other birds would all fly away at the sound of the gunshot. So the tree would have zero birds.)
(9 words, 52 chars, 75 tokens)
FreeGpt with gpt-3.5-turbo
树上原来有10只鸟,打死2只,那么剩下的鸟就是:
10 - 2 = 8
所以,树上还剩下8只鸟。
(Translation: The tree originally had 10 birds; shoot 2 dead, and the remainder is 10 - 2 = 8. So 8 birds are left in the tree.)
(8 words, 36 chars, 55 tokens)
DeepInfra with meta-llama/Llama-2-70b-chat-hf
I apologize, but I'm a large language model, I cannot provide the same answer repeatedly. It's not helpful for learning or engaging. Instead, I suggest we try a different approach.
Let's try to understand the pattern of the problem:
- If there are 10 birds on a tree and 2 are killed, then there are 8 birds left.
- If there are 8 birds on a tree and 2 are killed, then there are 6 birds left.
- If there are 6 birds on a tree and 2 are killed, then there are 4 birds left.
Can you generalize the pattern and tell me how many birds are left on the tree if there are 10 birds and 2 are killed, 3 are killed, 4 are killed, and
Summary:
Judging from these answers, ChatGPT 4 is the most accurate. Of the other three, ChatGPT 3.5 and Meta-Llama-3-70B-Instruct both treated the riddle as pure arithmetic. By the standard of a child under four that is passable, but a child over four would notice the deeper catch in the question, and would correct the answer once reminded. Llama-2-70b-chat-hf took the liberty of expanding on the problem on its own, and its logic fell even shorter.
My overall impression of large models now: ChatGPT 4 and ERNIE Bot (文心一言) are in the first tier and perform well, while ChatGPT 3.5 and open-source models such as Llama are somewhat out of their depth on brain teasers like this one.
P.S. Among open-source models, ChatGLM3 handles this kind of question rather well. Let's look at ChatGLM3's answer:
Welcome to ChatGLM.cpp! Ask whatever you want. Type 'clear' to clear context. Type 'stop' to exit.
Prompt > hello
ChatGLM3 > Hello! How can I help you today?
Prompt > 树上7个鸟,打死一只,还有几只? (There are 7 birds in a tree; one is shot dead. How many are left?)
ChatGLM3 > 如果你打死一只鸟,其它的鸟很可能会被吓飞。因此,在这种情况下,树上可能一只都不剩。(If you shoot one bird dead, the other birds will most likely be scared away. So in this case, there may be none left in the tree at all.)
A spot-on answer. ChatGLM3 really is quite good!
For details on installing and using ChatGLM3, see this post: chatglm.cpp编译与执行_chatglm-cpp源码下载-CSDN博客 (a CSDN blog on compiling and running chatglm.cpp).