It's not that the AI is stupid. It's that you, as a human being, literally cannot comprehend how this AI will interpret its goal. The Paperclip Maximizer scenario merely states one easily understandable disaster and says "we cannot say for certain that this won't end up happening". But there are infinitely many other ways it could go wrong as well.