Question

If we define an "int" variable on a PIC microcontroller, will it be the same size as the same "int" variable defined on an Atmel microcontroller, or can the sizes differ?

This question came up in an embedded systems interview. What should the answer be?

I'm a little confused!

Does it depend on the microcontroller or on the programming language? Is a given type, such as integer, the same size in all programming languages?

It's not the same as the general question, since things are a little different on embedded controllers.

Was it helpful?

Solution

The answer to the interview question should be something like:

Possibly, where it matters one should use the types defined in stdint.h, or otherwise consult the compiler documentation or inspect the definitions in limits.h.

The interviewer is unlikely to be asking for a yes/no answer and probably would not appreciate such terseness in an interview situation in any case - the questions are intended to get you talking until you have said something useful or interesting about yourself or your abilities and knowledge. What he is perhaps looking for is whether you are aware of the fact that standard type sizes in C are a compiler/architecture dependency and how you might handle the potential variability in portable code.

It is quite possible that int differs between one PIC and another, or between one Atmel and another, let alone between a PIC and an Atmel. An Atmel AVR32, for example, will certainly differ from an 8-bit AVR, and similarly the MIPS-based PIC32 differs from the "classic" PICs.

Also, the size of built-in types is strictly a "compiler implementation" issue, so it is possible that two different compilers for the same processor will differ (although it is highly improbable, since no compiler vendor would sensibly go out of their way to be that perverse!).

Languages other than C and C++ (and assembler, of course) are less common on small microcontrollers, because C and C++ are systems-level languages with minimal runtime environment requirements; but certainly the sizes of types may vary depending on the language definition.

Other tips

The problem is that standard C types will tend to vary from implementation to implementation. Using the types found in stdint.h will allow you to specify how many bits you want.

It also depends on the architecture and the compiler's data model. On a typical 32-bit system, your integer would be 32 bits:

for a signed 32-bit integer:

values between -2,147,483,648 and 2,147,483,647

On a 64-bit system int may be 64 bits (although under the common LP64 model used by most 64-bit Unix-like systems, int actually stays at 32 bits):

for a signed 64-bit integer: values between -9,223,372,036,854,775,808 and 9,223,372,036,854,775,807

So, to answer your question: an integer can have different sizes depending on the architecture you are using.

TIP: If your code assumes that a specific type has a specific size, you can verify that assumption at compile time:

#define C_ASSERT_CAT_(a, b) a##b
#define C_ASSERT_CAT(a, b)  C_ASSERT_CAT_(a, b)
#define C_ASSERT(cond) char C_ASSERT_CAT(c_assert_var_, __LINE__)[(cond) ? 1 : -1]
C_ASSERT(sizeof(int) == 4);

(The extra level of indirection is needed so that __LINE__ is expanded before token pasting; a direct c_assert_var_##__LINE__ paste would produce the literal name c_assert_var___LINE__ every time.)

During compilation, this will expand to the following code (assuming the assertion is on line 350):

char c_assert_var_350[(sizeof(int) == 4) ? 1 : -1];

which will not compile if sizeof(int) != 4, because an array cannot have a negative size.

It depends on many things; I can't give a flat yes or no, but my answer leans towards no.

int is guaranteed to be at least 16 bits. On many later architectures int is a 32-bit number, and that doesn't break any rules. As far as I know, on Atmel's 8-bit microcontrollers int is 16-bit; I'm not sure about PIC.

Anyway, my suggestion would be to use the fixed-width types. I don't know which compiler you are using, but I'm using AVR Studio. Its toolchain provides types such as:

uint8_t
int8_t
uint16_t
...
int64_t

These types are guaranteed to have the same size on every processor; you just need to do a little research on your compiler.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow