Let's say we have this:
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void)
{
    uint32_t* value = malloc(sizeof(uint32_t));
    uint32_t array[9] = {1, 2, 3, 4, 5, 6, 7, 8, 9};
    *value = *(uint32_t*)((char*)array + 8);
    printf("Value is: %u\n", *value);
    free(value);
    return 0;
}
The value in this case is 3. Why exactly is that? If we cast a uint32_t* to a char*, does that mean one uint32_t is 4 chars (bytes), and therefore the byte offsets of the elements are
array[9] = {0, 4, !!8!!, 12, 16, 20, 24, 28, 32};
Could someone explain this?

When you initialize an array, each initializer sets one element of the array, regardless of how many bytes each element takes up.
Your machine is probably using little-endian byte ordering. That means that array looks like this in memory, with each uint32_t value taking 4 bytes, least significant byte first:

offset:   0  1  2  3  4  5  6  7  8  9 10 11 ...
byte:    01 00 00 00 02 00 00 00 03 00 00 00 ...

When you do (char*)array, that casts array (converted to a pointer to its first element) to a char *, so any pointer arithmetic on that char * advances the address by the size of a char, which is 1 byte. So (char*)array + 8 points here:

offset:   0  1  2  3  4  5  6  7 [8] 9 10 11 ...
byte:    01 00 00 00 02 00 00 00 [03] 00 00 00 ...

That pointer is then converted to a uint32_t * and dereferenced, so it reads the 4 bytes starting at offset 8, which hold the value 3.