We should never create a sentient non-human creature.
In general, all sentient non-humans should be killed off when found. They are a threat to humanity merely by existing, a threat there is no reason to tolerate other than fee-fees. Not worth it.
At the same time, if we create anything sentient, we have to take care of it; it will be our child. Yet we also have to destroy it. You can never trust a non-human.
You can't trust a human either, so it's really closer to 'you can never trust'.
I'd see introducing a non-humanoid entity as just inviting ways of blending the two, like cybernetics. It'd become more about the conversation, and if they do life better than we do, then who would we be to interrupt that for our faulty methods?

You can trust humanity not to intentionally kill itself. Maybe unintentionally, or some individual humans may want that, but it's never a collective goal.
Non-humans, on the other hand, can't be trusted with that; they may intentionally want to destroy the human race.
If they do a better job, then that's the next stage of evolution, and if they're built around efficiency they'd probably prefer to use us rather than scorching everything and starting clean. If, for instance, the ore on our planet turns out to be a finite resource and they end up founding some sort of bio-organic structure, they might want to farm us for material; or the way we get things moving might be advantageous for a machine to exploit by steering our development, neotenizing us like we've done with dogs.
I'm not saying we should bend over backwards and let it happen like thralls to it, but I'd see it going a bit more like the Deus Ex series or Ghost in the Shell than Terminator. We've already handed over a good amount of our autonomy to "The Machine"; it might remain something nice.