We humans love to throw around words whose meanings we never bother to define. Take "consciousness": to the best of my knowledge, there is no agreed universal definition. Arguing over whether a computer program can become "conscious" is therefore like arguing about how many angels can dance on the head of a pin. Moreover, the evidence from postwar studies of human cognition points clearly to the conclusion that we humans have, at best, occasional partial consciousness. Are we expecting more of our machines than of ourselves? In the end, we come back not to the limitations of computer programs but to the manifest limitations of people. Since we are evidently mostly incapable of coherent thought, it is hardly surprising that debates over machine consciousness are riddled with logical incoherence.