Question

I'm working on a driver for reading smart cards (PC/SC), and I've been reading the data in a forced 8-bit manner, even if the card itself might have a 16-bit chip. I have two questions: first, how would I tell whether the card conforms to a 16-bit or an 8-bit architecture, and second, would there be a performance boost from treating a 16-bit card as 16-bit?


Solution

Would there be a performance boost to treating the 16-bit system as 16-bit?

No.

The CPU inside the card may be 8, 16 or even 32 bit internally, but all current processor cards communicate over either an ISO 7816-3 (contact) or ISO 14443 (contactless) interface. It is this interface that determines the communication speed, not the CPU. The interface is driven by the external clock supplied by the reader; traditionally the CPU ran off that clock as well, but the latest smart cards use an internal clock that runs at much higher speeds.

As long as the interfaces themselves are not updated, the "choice" between 8 and 16 bit doesn't matter a single bit, let alone 8 or 16. I've put "choice" in quotes because I don't see where you have any choice in this anyway.
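To make that concrete: at the PC/SC layer your driver only ever exchanges byte buffers (APDUs) with the card, so there is nothing 16-bit to opt into. Below is a minimal sketch using the standard PC/SC API (winscard/pcsclite); the SELECT-by-AID APDU and the use of the first listed reader are placeholder assumptions, not part of the original question.

```c
/*
 * Minimal PC/SC sketch: send one APDU and print the status word.
 * Note that the send and receive buffers are plain byte arrays --
 * the host side never sees the card CPU's internal word size.
 */
#include <stdio.h>
#ifdef _WIN32
#include <winscard.h>        /* Windows Smart Card API */
#else
#include <PCSC/winscard.h>   /* pcsclite on Linux/macOS */
#endif

int main(void)
{
    SCARDCONTEXT ctx;
    SCARDHANDLE card;
    DWORD proto;
    char readers[1024];
    DWORD readersLen = sizeof(readers);

    if (SCardEstablishContext(SCARD_SCOPE_SYSTEM, NULL, NULL, &ctx) != SCARD_S_SUCCESS)
        return 1;

    /* Use the first reader from the multi-string PC/SC returns. */
    if (SCardListReaders(ctx, NULL, readers, &readersLen) != SCARD_S_SUCCESS)
        return 1;

    if (SCardConnect(ctx, readers, SCARD_SHARE_SHARED,
                     SCARD_PROTOCOL_T0 | SCARD_PROTOCOL_T1,
                     &card, &proto) != SCARD_S_SUCCESS)
        return 1;

    /* Hypothetical SELECT-by-AID APDU: CLA INS P1 P2 Lc followed by the AID. */
    BYTE apdu[] = { 0x00, 0xA4, 0x04, 0x00, 0x05,
                    0xA0, 0x00, 0x00, 0x00, 0x01 };
    BYTE resp[258];
    DWORD respLen = sizeof(resp);

    const SCARD_IO_REQUEST *pci =
        (proto == SCARD_PROTOCOL_T1) ? SCARD_PCI_T1 : SCARD_PCI_T0;

    if (SCardTransmit(card, pci, apdu, sizeof(apdu),
                      NULL, resp, &respLen) == SCARD_S_SUCCESS && respLen >= 2)
        printf("Received %lu bytes, SW = %02X %02X\n",
               (unsigned long)respLen, resp[respLen - 2], resp[respLen - 1]);

    SCardDisconnect(card, SCARD_LEAVE_CARD);
    SCardReleaseContext(ctx);
    return 0;
}
```

Whatever chip sits behind the contacts, this is the granularity you get: the T=0/T=1 protocol frames everything as bytes, and any performance difference comes from the interface parameters negotiated in the ATR, not from how the card's CPU is treated.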
